Sample records for technology LSST program

  1. LSST (Hoop/Column) Maypole Antenna Development Program, phase 1, part 2

    NASA Technical Reports Server (NTRS)

    Sullivan, M. R.

    1982-01-01

    Cable technology is discussed. Manufacturing flow and philosophy are considered. Acceptance, qualification, and flight tests are discussed. Fifteen-meter and fifty-meter models are considered. An economic assessment is included.

  2. LSST (Hoop/Column) Maypole Antenna Development Program, phase 1, part 2

    NASA Astrophysics Data System (ADS)

    Sullivan, M. R.

    1982-06-01

    Cable technology is discussed. Manufacturing flow and philosophy are considered. Acceptance, qualification, and flight tests are discussed. Fifteen-meter and fifty-meter models are considered. An economic assessment is included.

  3. The LSST Camera 500 watt -130 degC Mixed Refrigerant Cooling System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowden, Gordon B.; Langton, Brian J. (SLAC)

    2014-05-28

    The LSST Camera has a higher cryogenic heat load than previous CCD telescope cameras due to its large size (634 mm diameter focal plane, 3.2 gigapixels) and its close-coupled front-end electronics operating at low temperature inside the cryostat. Various refrigeration technologies were considered for this telescope/camera environment. MMR-Technology’s Mixed Refrigerant technology was chosen, and a collaboration with that company was started in 2009. The system, based on a cluster of Joule-Thomson refrigerators running a special blend of mixed refrigerants, is described. Both the advantages and problems of applying this technology to telescope camera refrigeration are discussed. Test results from a prototype refrigerator running in a realistic telescope configuration are reported. Current and future stages of the development program are described.

  4. The Large Synoptic Survey Telescope Science Requirements

    NASA Astrophysics Data System (ADS)

    Tyson, J. A.; LSST Collaboration

    2004-12-01

    The Large Synoptic Survey Telescope (LSST) is a wide-field telescope facility that will add a qualitatively new capability in astronomy and will address some of the most pressing open questions in astronomy and fundamental physics. The 8.4-meter telescope and 3 billion pixel camera covering ten square degrees will image the sky to faint limits in less than 10 seconds in each of 5-6 optical bands. This is enabled by advances in microelectronics, software, and large optics fabrication. The unprecedented optical throughput drives LSST's ability to go faint-wide-fast. The LSST will produce time-lapse digital imaging of faint astronomical objects across the entire visible sky with good resolution. For example, the LSST will provide unprecedented 3-dimensional maps of the mass distribution in the Universe, in addition to the traditional images of luminous stars and galaxies. These weak lensing data can be used to better understand the nature of Dark Energy. The LSST will also provide a comprehensive census of our solar system. By surveying the entire accessible sky deeply every few nights, the LSST will provide large samples of events which we now only rarely observe, and will create substantial potential for new discoveries. The LSST will produce the largest non-proprietary data set in the world. Several key science drivers are representative of the LSST system capabilities: Precision Characterization of Dark Energy, Solar System Map, Optical Transients, and a map of our Galaxy and its environs. In addition to enabling all four of these major scientific initiatives, LSST will make it possible to pursue many other research programs. The community has suggested a number of exciting programs using these data, and the long-lived data archives of the LSST will have the astrometric and photometric precision needed to support entirely new research directions which will inevitably develop during the next several decades.

  5. Technology for large space systems: A special bibliography with indexes

    NASA Technical Reports Server (NTRS)

    1979-01-01

    This bibliography lists 460 reports, articles, and other documents introduced into the NASA scientific and technical information system between January 1, 1968 and December 31, 1978. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design in the area of the Large Space Systems Technology (LSST) Program. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, and flight experiments.

  6. Technology for large space systems: A special bibliography with indexes (supplement 01)

    NASA Technical Reports Server (NTRS)

    1979-01-01

    This bibliography lists 180 reports, articles, and other documents introduced into the NASA scientific and technical information system between January 1, 1979 and June 30, 1979. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design in the area of the Large Space Systems Technology (LSST) Program. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, and flight experiments.

  7. The Effects of Commercial Airline Traffic on LSST Observing Efficiency

    NASA Astrophysics Data System (ADS)

    Gibson, Rose; Claver, Charles; Stubbs, Christopher

    2016-01-01

    The Large Synoptic Survey Telescope (LSST) is a ten-year survey that will map the southern sky in six different filters 800 times before the end of its run. In this paper, we explore the primary effect of airline traffic on scheduling the LSST observations, in addition to the secondary effect of condensation trails, or contrails, created by the presence of the aircraft. The large national investment being made in LSST implies that small improvements in observing efficiency through aircraft and contrail avoidance can result in a significant improvement in the quality of the survey and its science. We have used the Automatic Dependent Surveillance-Broadcast (ADS-B) signals received from commercial aircraft to monitor and record activity over the LSST site. We installed an ADS-B ground station on Cerro Pachón, Chile, consisting of a 1090 MHz antenna on the Andes Lidar Observatory feeding an RTL2832U software-defined radio. We used dump1090 to convert the received ADS-B telemetry into Basestation format, and found that during the busiest time of the night only 4 signals per minute were received on average, which will have a very small direct effect, if any, on the LSST observing scheduler. As part of future studies we will examine the effects of contrails on LSST observations. Gibson was supported by the NOAO/KPNO Research Experiences for Undergraduates (REU) Program, which is funded by the National Science Foundation Research Experience for Undergraduates Program (AST-1262829).
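    The rate analysis described above (dump1090 emitting Basestation-format telemetry, counted per minute) can be sketched in a few lines. This is a minimal illustration, assuming the standard SBS-1 field layout in which field 7 carries the generation date and field 8 the generation time; the sample lines are invented, not real telemetry from the Cerro Pachón station.

```python
from collections import Counter

def messages_per_minute(sbs_lines):
    """Count Basestation (SBS-1) messages per minute of generation time.

    Assumes the standard SBS-1 field order, where fields[6] is the
    generation date and fields[7] the generation time (HH:MM:SS.mmm).
    """
    counts = Counter()
    for line in sbs_lines:
        fields = line.strip().split(",")
        if len(fields) < 8 or fields[0] != "MSG":
            continue  # skip malformed or non-message lines
        date, time = fields[6], fields[7]
        minute = f"{date} {time[:5]}"  # truncate HH:MM:SS.mmm to HH:MM
        counts[minute] += 1
    return counts

# Hypothetical sample lines (made-up hex codes, dates, and positions).
sample = [
    "MSG,3,1,1,E48F12,1,2016/01/15,03:12:07.412,2016/01/15,03:12:07.412,,11000,,,-30.2,-70.7,,,,,,",
    "MSG,3,1,1,E48F12,1,2016/01/15,03:12:41.004,2016/01/15,03:12:41.004,,11050,,,-30.2,-70.7,,,,,,",
    "MSG,3,1,1,A1B2C3,1,2016/01/15,03:13:02.118,2016/01/15,03:13:02.118,,36000,,,-30.3,-70.6,,,,,,",
]
per_minute = messages_per_minute(sample)
print(per_minute)
```

A scheduler-impact study would feed real logged telemetry through the same bucketing and look at the peak per-minute counts during observing hours.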

  8. Designing a Multi-Petabyte Database for LSST

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Becla, Jacek; Hanushevsky, Andrew; Nikolaev, Sergei

    2007-01-10

    The 3.2 giga-pixel LSST camera will produce approximately half a petabyte of archive images every month. These data need to be reduced in under a minute to produce real-time transient alerts, and then added to the cumulative catalog for further analysis. The catalog is expected to grow by about three hundred terabytes per year. The data volume, the real-time transient alerting requirements of the LSST, and its spatio-temporal aspects require innovative techniques to build an efficient data access system at reasonable cost. As currently envisioned, the system will rely on a database for catalogs and metadata. Several database systems are being evaluated to understand how they perform at these data rates, data volumes, and access patterns. This paper describes the LSST requirements, the challenges they impose, the data access philosophy, results to date from evaluating available database technologies against LSST requirements, and the proposed database architecture to meet the data challenges.
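    The access pattern the abstract emphasizes — spatio-temporal selections over a huge catalog — can be illustrated with a toy query. The sketch below uses SQLite and an invented schema (`source` with `ra`, `decl`, `mjd` columns); it shows the shape of the query, not the actual LSST database design, which the paper evaluates across several systems.

```python
import sqlite3

# Toy stand-in for a survey catalog; column names are illustrative only.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE source (
    objectId INTEGER, ra REAL, decl REAL, mjd REAL, mag REAL)""")
con.executemany("INSERT INTO source VALUES (?,?,?,?,?)", [
    (1, 10.01, -30.02, 53371.1, 21.3),
    (2, 10.02, -30.01, 53372.2, 22.0),
    (3, 55.00,  12.00, 53371.5, 19.8),
])

# Spatio-temporal access pattern: a bounding box in (ra, decl)
# combined with an observation-time window.
rows = con.execute("""
    SELECT objectId FROM source
    WHERE ra BETWEEN 9.9 AND 10.1
      AND decl BETWEEN -30.1 AND -29.9
      AND mjd BETWEEN 53371.0 AND 53372.0
""").fetchall()
print(rows)  # only object 1 falls inside both the box and the window
```

At LSST scale the same logical query has to run against hundreds of terabytes, which is exactly why the paper evaluates partitioning and indexing strategies rather than a single-node database.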

  9. Technology for Large Space Systems: A Special Bibliography with Indexes (Supplement 2)

    NASA Technical Reports Server (NTRS)

    1980-01-01

    This bibliography lists 258 reports, articles, and other documents introduced into the NASA scientific and technical information system between July 1, 1979 and December 31, 1979. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design in the area of the Large Space Systems Technology (LSST) Program. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, solar power satellite systems, and flight experiments.

  10. Technology for large space systems: A special bibliography with indexes (supplement 05)

    NASA Technical Reports Server (NTRS)

    1981-01-01

    This bibliography lists 298 reports, articles, and other documents introduced into the NASA scientific and technical information system between January 1, 1981 and June 30, 1981. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design in the area of the Large Space Systems Technology (LSST) Program. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, solar power satellite systems, and flight experiments.

  11. Technology for large space systems: A special bibliography with indexes (supplement 06)

    NASA Technical Reports Server (NTRS)

    1982-01-01

    This bibliography lists 220 reports, articles, and other documents introduced into the NASA scientific and technical information system between July 1, 1981 and December 31, 1981. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design in the area of the Large Space Systems Technology (LSST) Program. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, solar power satellite systems, and flight experiments.

  12. LSST and the Physics of the Dark Universe

    ScienceCinema

    Tyson, Anthony [UC Davis, California, United States]

    2017-12-09

    The physics that underlies the accelerating cosmic expansion is unknown. This 'dark energy' and the equally mysterious 'dark matter' comprise most of the mass-energy of the universe and are outside the standard model. Recent advances in optics, detectors, and information technology have led to the design of a facility that will repeatedly image an unprecedented volume of the universe: LSST. For the first time, the sky will be surveyed wide, deep and fast. The history of astronomy has taught us repeatedly that there are surprises whenever we view the sky in a new way. I will review the technology of LSST, and focus on several independent probes of the nature of dark energy and dark matter. These new investigations will rely on the statistical precision obtainable with billions of galaxies.

  13. Cosmology with the Large Synoptic Survey Telescope: an overview

    NASA Astrophysics Data System (ADS)

    Zhan, Hu; Tyson, J. Anthony

    2018-06-01

    The Large Synoptic Survey Telescope (LSST) is a high-étendue imaging facility that is being constructed atop Cerro Pachón in northern Chile. It is scheduled to begin science operations in 2022. With an 8.4 m (6.7 m effective) aperture, a novel three-mirror design achieving a seeing-limited 9.6 deg² field of view, and a 3.2 gigapixel camera, the LSST has the deep-wide-fast imaging capability necessary to carry out an 18,000 deg² survey in six passbands (ugrizy) to a coadded depth of r ~ 27.5 over 10 years using 90% of its observational time. The remaining 10% of the time will be devoted to considerably deeper and faster time-domain observations and smaller surveys. In total, each patch of the sky in the main survey will receive 800 visits allocated across the six passbands with 2 × 15 s exposure visits. The huge volume of high-quality LSST data will provide a wide range of science opportunities and, in particular, open a new era of precision cosmology with unprecedented statistical power and tight control of systematic errors. In this review, we give a brief account of the LSST cosmology program with an emphasis on dark energy investigations. The LSST will address dark energy physics and cosmology in general by exploiting diverse precision probes including large-scale structure, weak lensing, type Ia supernovae, galaxy clusters, and strong lensing. Combined with the cosmic microwave background data, these probes form interlocking tests of the cosmological model and the nature of dark energy in the presence of various systematics. The LSST data products will be made available to the US and Chilean scientific communities and to international partners with no proprietary period. Close collaborations with contemporaneous imaging and spectroscopy surveys observing at a variety of wavelengths, resolutions, depths, and timescales will be a vital part of the LSST science program, which will not only enhance specific studies but, more importantly, also allow a more complete understanding of the Universe through different windows.

  14. It’s about time: How do sky surveys manage uncertainty about scientific needs many years into the future?

    NASA Astrophysics Data System (ADS)

    Darch, Peter T.; Sands, Ashley E.

    2016-06-01

    Sky surveys, such as the Sloan Digital Sky Survey (SDSS) and the Large Synoptic Survey Telescope (LSST), generate data on an unprecedented scale. While many scientific projects span a few years from conception to completion, sky surveys are typically on the scale of decades. This paper focuses on critical challenges arising from long timescales, and how sky surveys address these challenges. We present findings from a study of LSST, comprising interviews (n=58) and observation. Conceived in the 1990s, the LSST Corporation was formed in 2003, and construction began in 2014. LSST will commence data collection operations in 2022 for ten years. One challenge arising from this long timescale is uncertainty about the future needs of the astronomers who will use these data many years hence. Sources of uncertainty include the scientific questions to be posed, the astronomical phenomena to be studied, and the tools and practices these astronomers will have at their disposal. These uncertainties are magnified by the rapid technological and scientific developments anticipated between now and the start of LSST operations. LSST is implementing a range of strategies to address these challenges. Some strategies involve delaying resolution of uncertainty, placing this resolution in the hands of future data users. Other strategies aim to reduce uncertainty by shaping astronomers’ data analysis practices so that these practices will integrate well with LSST once operations begin. One approach that exemplifies both types of strategy is the decision to make LSST data management software open source, even now as it is being developed. This policy will enable future data users to adapt this software to evolving needs. In addition, LSST intends for astronomers to start using this software well in advance of 2022, thereby embedding LSST software and data analysis approaches in the practices of astronomers. These findings strengthen arguments for making the software supporting sky surveys available as open source. Such arguments usually focus on the reuse potential of software and enhancing the replicability of analyses. In this case, however, open source software also promises to mitigate the critical challenge of anticipating the needs of future data users.

  15. LSST Operations Simulator

    NASA Astrophysics Data System (ADS)

    Cook, K. H.; Delgado, F.; Miller, M.; Saha, A.; Allsman, R.; Pinto, P.; Gee, P. A.

    2005-12-01

    We have developed an operations simulator for LSST and used it to explore the design and operations parameter space for this large-étendue telescope and its ten-year survey mission. The design is modular, with separate science programs coded in separate modules. There is a sophisticated telescope module with all motions parametrized for ease of testing different telescope capabilities, e.g. the effect of the acceleration capabilities of various motors on science output. Sky brightness is calculated as a function of moon phase and separation. A sophisticated exposure time calculator has been developed for LSST and is being incorporated into the simulator to allow specification of S/N requirements. All important parameters for the telescope, the site and the science programs are easily accessible in configuration files. Seeing and cloud data from the three candidate LSST sites are used for our simulations. The simulator has two broad categories of science proposals: sky coverage and transient events. Sky coverage proposals base their observing priorities on a required number of observations for each field in a particular filter under specified conditions (maximum seeing, sky brightness, etc.); one is used for a weak lensing investigation. Transient proposals are highly configurable. A transient proposal can require sequential, multiple exposures in various filters with a specified sequence of filters, and require a particular cadence for multiple revisits to complete an observation sequence. Each science proposal ranks potential observations based upon the internal logic of that proposal. We present the results of a variety of mixed science program observing simulations, showing how varied programs can be carried out simultaneously, with many observations serving multiple science goals. The simulator has shown that LSST can carry out its multiple missions under a variety of conditions. KHC's work was performed under the auspices of the US DOE, NNSA by the Univ. of California, LLNL under contract No. W-7405-Eng-48.
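    The exposure time calculator mentioned above is not described in detail; as a sketch of the underlying idea, the textbook CCD signal-to-noise equation can be inverted numerically to find the exposure time that meets an S/N requirement. Every rate and noise figure below is an illustrative assumption, not an LSST design value.

```python
import math

def point_source_snr(obj_rate, sky_rate_per_pix, read_noise, npix, exptime):
    """Textbook CCD signal-to-noise for a point source:
    S/N = S / sqrt(S + npix * (sky + RN^2)),
    with S = obj_rate * exptime and sky = sky_rate_per_pix * exptime.
    """
    signal = obj_rate * exptime
    noise = math.sqrt(signal + npix * (sky_rate_per_pix * exptime + read_noise ** 2))
    return signal / noise

def exptime_for_snr(target_snr, **kw):
    """Minimal bisection for the exposure time reaching a target S/N
    (S/N is monotonically increasing in exposure time)."""
    lo, hi = 1e-3, 1e5
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if point_source_snr(exptime=mid, **kw) < target_snr:
            lo = mid
        else:
            hi = mid
    return hi

# Made-up source/sky rates (e-/s), read noise (e-), and aperture size.
t = exptime_for_snr(5.0, obj_rate=10.0, sky_rate_per_pix=50.0,
                    read_noise=5.0, npix=30)
print(round(t, 1))  # ≈ 378 s with these invented rates
```

A simulator would run this in the opposite direction as well: given the fixed visit exposure time, check whether the predicted S/N under the current seeing and sky brightness satisfies a proposal's requirement.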

  16. The Large Synoptic Survey Telescope project management control system

    NASA Astrophysics Data System (ADS)

    Kantor, Jeffrey P.

    2012-09-01

    The Large Synoptic Survey Telescope (LSST) program is jointly funded by the NSF, the DOE, and private institutions and donors. From an NSF funding standpoint, the LSST is a Major Research Equipment and Facilities Construction (MREFC) project. The NSF funding process requires proposals and D&D reviews to include activity-based budgets and schedules; documented basis of estimates; risk-based contingency analysis; and cost escalation and categorization. Out of the box, the commercial tool Primavera P6 contains approximately 90% of the planning and estimating capability needed to satisfy R&D phase requirements, and it is customizable/configurable for the remainder with relatively little effort. We describe the customization/configuration and use of Primavera for the LSST Project Management Control System (PMCS), assess our experience to date, and describe future directions. Examples in this paper are drawn from the LSST Data Management System (DMS), which is one of three main subsystems of the LSST and is funded by the NSF. By astronomy standards the LSST DMS is a large data management project, processing and archiving over 70 petabytes of image data, producing over 20 petabytes of catalogs annually, and generating 2 million transient alerts per night. Over the 6-year construction and commissioning phase, the DM project is estimated to require 600,000 hours of engineering effort. In total, the DMS cost is approximately 60% hardware/system software and 40% labor.

  17. Cosmology with the Large Synoptic Survey Telescope: an overview.

    PubMed

    Zhan, Hu; Tyson, J. Anthony

    2018-06-01

    The Large Synoptic Survey Telescope (LSST) is a high-étendue imaging facility that is being constructed atop Cerro Pachón in northern Chile. It is scheduled to begin science operations in 2022. With an 8.4 m (6.7 m effective) aperture, a novel three-mirror design achieving a seeing-limited 9.6 deg² field of view, and a 3.2 gigapixel camera, the LSST has the deep-wide-fast imaging capability necessary to carry out an 18,000 deg² survey in six passbands (ugrizy) to a coadded depth of r ~ 27.5 over 10 years using 90% of its observational time. The remaining 10% of the time will be devoted to considerably deeper and faster time-domain observations and smaller surveys. In total, each patch of the sky in the main survey will receive 800 visits allocated across the six passbands with 2 × 15 s exposure visits. The huge volume of high-quality LSST data will provide a wide range of science opportunities and, in particular, open a new era of precision cosmology with unprecedented statistical power and tight control of systematic errors. In this review, we give a brief account of the LSST cosmology program with an emphasis on dark energy investigations. The LSST will address dark energy physics and cosmology in general by exploiting diverse precision probes including large-scale structure, weak lensing, type Ia supernovae, galaxy clusters, and strong lensing. Combined with the cosmic microwave background data, these probes form interlocking tests of the cosmological model and the nature of dark energy in the presence of various systematics. The LSST data products will be made available to the US and Chilean scientific communities and to international partners with no proprietary period. Close collaborations with contemporaneous imaging and spectroscopy surveys observing at a variety of wavelengths, resolutions, depths, and timescales will be a vital part of the LSST science program, which will not only enhance specific studies but, more importantly, also allow a more complete understanding of the Universe through different windows.

  18. A rigid and thermally stable all ceramic optical support bench assembly for the LSST Camera

    NASA Astrophysics Data System (ADS)

    Kroedel, Matthias; Langton, J. Brian; Wahl, Bill

    2017-09-01

    This paper presents the ceramic design, fabrication and metrology results, and assembly plan of the LSST camera optical bench structure, which uses the unique manufacturing features of HB-Cesic technology. The optical bench assembly consists of a rigid "Grid" structure supporting individual raft plates that mount sensor assemblies by way of a rigid kinematic support system, to meet extremely stringent requirements for focal plane planarity and stability.

  19. The LSST operations simulator

    NASA Astrophysics Data System (ADS)

    Delgado, Francisco; Saha, Abhijit; Chandrasekharan, Srinivasan; Cook, Kem; Petry, Catherine; Ridgway, Stephen

    2014-08-01

    The Operations Simulator for the Large Synoptic Survey Telescope (LSST; http://www.lsst.org) allows the planning of LSST observations that obey explicit science-driven observing specifications, patterns, schema, and priorities, while optimizing against the constraints placed by the design-specific opto-mechanical system performance of the telescope facility, site-specific conditions, and additional scheduled and unscheduled downtime. It has a detailed model to simulate the external conditions with real weather history data from the site, a fully parameterized kinematic model for the internal conditions of the telescope, camera and dome, and serves as a prototype for an automatic scheduler for real-time survey operations with LSST. The Simulator has been a critical tool since very early in the project, helping to validate the design parameters of the observatory against the science requirements and the goals of specific science programs. A simulation run records the characteristics of all observations (e.g., epoch, sky position, seeing, sky brightness) in a MySQL database, which can be queried for any desired purpose. Derivative information digests of the observing history are made with an analysis package called Simulation Survey Tools for Analysis and Reporting (SSTAR). Merit functions and metrics have been designed to examine how suitable a specific simulation run is for several different science applications. Software to efficiently compare the efficacy of different survey strategies for a wide variety of science applications using such a growing set of metrics is under development.
A recent restructuring of the code allows us to a) use "look-ahead" strategies that avoid cadence sequences that cannot be completed due to observing constraints; and b) examine alternate optimization strategies, so that the most efficient scheduling algorithm(s) can be identified and used: even few-percent efficiency gains will create substantive scientific opportunity. The enhanced simulator is being used to assess the feasibility of desired observing cadences, study the impact of changing science program priorities and assist with performance margin investigations of the LSST system.
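    A merit function of the kind described can be as simple as a statistic computed over the recorded visit history. The sketch below is a hypothetical example — the field IDs, dates, and the metric itself are invented for illustration, not an actual SSTAR metric — but it shows the shape of the analysis: query visit rows, group by field, score the cadence.

```python
from statistics import median

# Toy observation history: (fieldId, mjd) pairs, stand-ins for rows
# queried from the simulator's output database.
history = [
    (101, 53371.10), (101, 53374.15), (101, 53380.05),
    (202, 53371.30), (202, 53372.30),
]

def median_revisit_gap(history):
    """A minimal cadence metric: the median gap (in days) between
    successive visits to each field, pooled over all fields."""
    by_field = {}
    for field, mjd in history:
        by_field.setdefault(field, []).append(mjd)
    gaps = []
    for mjds in by_field.values():
        mjds.sort()
        gaps += [b - a for a, b in zip(mjds, mjds[1:])]
    return median(gaps)

print(round(median_revisit_gap(history), 2))  # → 3.05
```

Comparing survey strategies then reduces to evaluating a battery of such metrics over each simulated run's database and weighing the trade-offs per science case.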

  20. LSST: Cadence Design and Simulation

    NASA Astrophysics Data System (ADS)

    Cook, Kem H.; Pinto, P. A.; Delgado, F.; Miller, M.; Petry, C.; Saha, A.; Gee, P. A.; Tyson, J. A.; Ivezic, Z.; Jones, L.; LSST Collaboration

    2009-01-01

    The LSST Project has developed an operations simulator to investigate how best to observe the sky to achieve its multiple science goals. The simulator has a sophisticated model of the telescope and dome to properly constrain potential observing cadences. This model has also proven useful for investigating various engineering issues, ranging from the sizing of slew motors to the design of cryogen lines to the camera. The simulator is capable of balancing cadence goals from multiple science programs, and attempts to minimize time spent slewing as it carries out these goals. The operations simulator has been used to demonstrate a 'universal' cadence which delivers the science requirements for a deep cosmology survey, a Near Earth Object survey, and good sampling in the time domain. We will present the results of simulating 10 years of LSST operations using realistic seeing distributions, historical weather data, scheduled engineering downtime and current telescope and camera parameters. These simulations demonstrate the capability of the LSST to deliver a 25,000 square degree survey probing the time domain, including 20,000 square degrees for a uniform deep, wide, fast survey, while effectively surveying for NEOs over the same area. We will also present our plans for future development of the simulator: better global minimization of slew time and an eventual transition to a scheduler for the real LSST.

  21. NOAO and LSST: Illuminating the Path to LSST for All Users

    NASA Astrophysics Data System (ADS)

    Olsen, Knut A.; Matheson, T.; Ridgway, S. T.; Saha, A.; Lauer, T. R.; NOAO LSST Science Working Group

    2013-01-01

    As LSST moves toward construction and survey definition, the burden on the user community to begin planning and preparing for the massive data stream grows. In light of the significant challenge and opportunity that LSST now brings, a critical role for a National Observatory will be to advocate for, respond to, and advise the U.S. community on its use of LSST. NOAO intends to establish an LSST Community Science Center to meet these common needs. Such a Center builds on NOAO's leadership in offering survey-style instruments, proposal opportunities, and data management software over the last decade. This leadership has enabled high-impact scientific results, as evidenced by the award of the 2011 Nobel Prize in Physics for the discovery of Dark Energy, which stemmed directly from survey-style observations taken at NOAO. As steps towards creating an LSST Community Science Center, NOAO is 1) supporting the LSST Science Collaborations through membership calls and collaboration meetings; 2) developing the LSST operations simulator, the tool by which the community's scientific goals are tested against the reality of what LSST's cadence can deliver; 3) embarking on a project to establish metrics for science data quality assessment, which will be critical for establishing confidence in LSST results; 4) developing a roadmap and proposal to host and support the capability to help the community manage the expected flood of automated alerts from LSST; and 5) starting a serious discussion of the system capabilities needed for photometric and spectroscopic follow-up of LSST observations. The fundamental goal is to enable productive, world-class research with LSST by the entire US community-at-large in tight collaboration with the LSST Project, the LSST Science Collaborations, and the funding agencies.

  22. Giga-z: A 100,000 Object Superconducting Spectrophotometer for LSST Follow-up

    NASA Astrophysics Data System (ADS)

    Marsden, Danica W.; Mazin, Benjamin A.; O'Brien, Kieran; Hirata, Chris

    2013-09-01

    We simulate the performance of a new type of instrument, a Superconducting Multi-Object Spectrograph (SuperMOS), that uses microwave kinetic inductance detectors (MKIDs). MKIDs, a new detector technology, feature good quantum efficiency in the UVOIR, can count individual photons with microsecond timing accuracy, and, like X-ray calorimeters, determine their energy to several percent. The performance of Giga-z, a SuperMOS designed for wide-field imaging follow-up observations, is evaluated using simulated observations of the COSMOS mock catalog with an array of 100,000 MKID pixels with energy resolution R(423 nm) = E/ΔE = 30. We compare our results against a simultaneous simulation of LSST observations. In 3 yr on a dedicated 4 m class telescope, Giga-z could observe ≈2 billion galaxies, yielding a low-resolution spectral energy distribution spanning 350-1350 nm for each; 1000 times the number measured with any currently proposed LSST spectroscopic follow-up, at a fraction of the cost and time. Giga-z would provide redshifts for galaxies up to z ≈ 6 with magnitudes m_i ≲ 25, with accuracy σ_Δz/(1 + z) ≈ 0.03 for the whole sample, and σ_Δz/(1 + z) ≈ 0.007 for a select subset. We also find catastrophic failure rates and biases that are consistently lower than for LSST. The added constraint on dark energy parameters for WL + CMB by Giga-z using the FoMSWG default model is equivalent to multiplying the LSST Fisher matrix by a factor of α = 1.27 (w_p), 1.53 (w_a), or 1.98 (Δγ). This is equivalent to multiplying both the LSST coverage area and the training sets by α and reducing all systematics by a factor of 1/√α, advantages that are robust to even more extreme models of intrinsic alignment.
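    The quoted α factors rest on standard Fisher-matrix bookkeeping: multiplying a Fisher matrix by α shrinks every marginalized 1σ error by 1/√α, which is why boosting the coverage area and training sets by α is equivalent to cutting systematics by that factor. A worked sketch with a made-up two-parameter matrix (not the actual FoMSWG Fisher matrix):

```python
import math

# Invented 2x2 Fisher matrix for two dark energy parameters (w_p, w_a);
# any positive-definite matrix exhibits the same scaling.
F = [[40.0, -12.0],
     [-12.0, 5.0]]

def marginalized_sigmas(F):
    """1-sigma marginalized errors: square roots of the diagonal of
    F^{-1}, written out explicitly for a 2x2 matrix."""
    a, b = F[0]
    c, d = F[1]
    det = a * d - b * c
    return math.sqrt(d / det), math.sqrt(a / det)

alpha = 1.53  # the quoted Giga-z gain for w_a
s0 = marginalized_sigmas(F)
s1 = marginalized_sigmas([[alpha * x for x in row] for row in F])
print(s1[0] / s0[0], s1[1] / s0[1])  # both equal 1/sqrt(1.53) ≈ 0.808
```

Because the scaling acts on the whole matrix, the improvement is uniform across parameters; the paper's per-parameter α values come from comparing full forecast matrices, not from a single global rescaling.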

  23. University of Arizona High Energy Physics Program at the Cosmic Frontier 2014-2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abate, Alex; Cheu, Elliott

    This is the final technical report from the University of Arizona High Energy Physics program at the Cosmic Frontier covering the period 2014-2016. The work aims to advance the understanding of dark energy using the Large Synoptic Survey Telescope (LSST). Progress on the engineering design of the power supplies for the LSST camera is discussed. A variety of contributions to photometric redshift measurement uncertainties were studied. The effect of the intergalactic medium on the photometric redshift of very distant galaxies was evaluated. Computer code was developed realizing the full chain of calculations needed to accurately and efficiently run large-scale simulations.

  4. LSST: Education and Public Outreach

    NASA Astrophysics Data System (ADS)

    Bauer, Amanda; Herrold, Ardis; LSST Education and Public Outreach Team

    2018-01-01

    The Large Synoptic Survey Telescope (LSST) will conduct a 10-year wide, fast, and deep survey of the night sky starting in 2022. LSST Education and Public Outreach (EPO) will enable public access to a subset of LSST data so anyone can explore the universe and be part of the discovery process. LSST EPO aims to facilitate a pathway from entry-level exploration of astronomical imagery to more sophisticated interaction with LSST data using tools similar to what professional astronomers use. To deliver data to the public, LSST EPO is creating an online Portal to serve as the main hub to EPO activities. The Portal will host an interactive Skyviewer, access to LSST data for educators and the public through online Jupyter notebooks, original multimedia for informal science centers and planetariums, and feature citizen science projects that use LSST data. LSST EPO will engage with the Chilean community through Spanish-language components of the Portal and will partner with organizations serving underrepresented groups in STEM.

  5. Effect of elastic band-based high-speed power training on cognitive function, physical performance and muscle strength in older women with mild cognitive impairment.

    PubMed

    Yoon, Dong Hyun; Kang, Dongheon; Kim, Hee-Jae; Kim, Jin-Soo; Song, Han Sol; Song, Wook

    2017-05-01

    The effectiveness of resistance training in improving cognitive function in older adults is well demonstrated. In particular, unconventional high-speed resistance training can improve muscle power development. In the present study, the effectiveness of 12 weeks of elastic band-based high-speed power training (HSPT) was examined. Participants were randomly assigned to a HSPT group (n = 14, age 75.0 ± 0.9 years), a low-speed strength training (LSST) group (n = 9, age 76.0 ± 1.3 years) or a control group (CON; n = 7, age 78.0 ± 1.0 years). A 1-h exercise program was provided twice a week for 12 weeks for the HSPT and LSST groups, and balance and tone exercises were carried out by the CON group. Significant increases in levels of cognitive function, physical function, and muscle strength were observed in both the HSPT and LSST groups. In cognitive function, significant improvements in the Mini-Mental State Examination and Montreal Cognitive Assessment were seen in both the HSPT and LSST groups compared with the CON group. In physical function, Short Physical Performance Battery scores increased significantly in the HSPT and LSST groups compared with the CON group. Over the 12 weeks of elastic band-based training, the HSPT group showed greater improvements than the LSST group in older women with mild cognitive impairment, although both regimens were effective in improving cognitive function, physical function, and muscle strength. We conclude that elastic band-based HSPT, as compared with LSST, is more efficient in helping older women with mild cognitive impairment to improve cognitive function, physical performance, and muscle strength. Geriatr Gerontol Int 2017; 17: 765-772. © 2016 Japan Geriatrics Society.

  6. Advancing the LSST Operations Simulator

    NASA Astrophysics Data System (ADS)

    Saha, Abhijit; Ridgway, S. T.; Cook, K. H.; Delgado, F.; Chandrasekharan, S.; Petry, C. E.; Operations Simulator Group

    2013-01-01

    The Operations Simulator for the Large Synoptic Survey Telescope (LSST; http://lsst.org) allows the planning of LSST observations that obey explicit science-driven observing specifications, patterns, schema, and priorities, while optimizing against the constraints placed by the design-specific opto-mechanical system performance of the telescope facility, site-specific conditions (including weather and seeing), and additional scheduled and unscheduled downtime. A simulation run records the characteristics of all observations (e.g., epoch, sky position, seeing, sky brightness) in a MySQL database, which can be queried for any desired purpose. Derivative information digests of the observing history database are made with an analysis package called Simulation Survey Tools for Analysis and Reporting (SSTAR). Merit functions and metrics have been designed to examine how suitable a specific simulation run is for several different science applications. This poster reports recent work which has focused on an architectural restructuring of the code that will allow us to a) use "look-ahead" strategies that avoid cadence sequences that cannot be completed due to observing constraints; and b) examine alternate optimization strategies, so that the most efficient scheduling algorithm(s) can be identified and used: even few-percent efficiency gains will create substantive scientific opportunity. The enhanced simulator will be used to assess the feasibility of desired observing cadences, study the impact of changing science program priorities, and assist with performance margin investigations of the LSST system.

  7. Scientific Synergy between LSST and Euclid

    NASA Astrophysics Data System (ADS)

    Rhodes, Jason; Nichol, Robert C.; Aubourg, Éric; Bean, Rachel; Boutigny, Dominique; Bremer, Malcolm N.; Capak, Peter; Cardone, Vincenzo; Carry, Benoît; Conselice, Christopher J.; Connolly, Andrew J.; Cuillandre, Jean-Charles; Hatch, N. A.; Helou, George; Hemmati, Shoubaneh; Hildebrandt, Hendrik; Hložek, Renée; Jones, Lynne; Kahn, Steven; Kiessling, Alina; Kitching, Thomas; Lupton, Robert; Mandelbaum, Rachel; Markovic, Katarina; Marshall, Phil; Massey, Richard; Maughan, Ben J.; Melchior, Peter; Mellier, Yannick; Newman, Jeffrey A.; Robertson, Brant; Sauvage, Marc; Schrabback, Tim; Smith, Graham P.; Strauss, Michael A.; Taylor, Andy; Von Der Linden, Anja

    2017-12-01

    Euclid and the Large Synoptic Survey Telescope (LSST) are poised to dramatically change the astronomy landscape early in the next decade. The combination of high-cadence, deep, wide-field optical photometry from LSST with high-resolution, wide-field optical photometry and near-infrared photometry and spectroscopy from Euclid will be powerful for addressing a wide range of astrophysical questions. We explore Euclid/LSST synergy, ignoring the political issues associated with data access to focus on the scientific, technical, and financial benefits of coordination. We focus primarily on dark energy cosmology, but also discuss galaxy evolution, transient objects, solar system science, and galaxy cluster studies. We concentrate on synergies that require coordination in cadence or survey overlap, or would benefit from pixel-level co-processing that is beyond the scope of what is currently planned, rather than scientific programs that could be accomplished only at the catalog level without coordination in data processing or survey strategies. We provide two quantitative examples of scientific synergies: the decrease in photo-z errors (benefiting many science cases) when high-resolution Euclid data are used for LSST photo-z determination, and the resulting increase in weak-lensing signal-to-noise ratio from smaller photo-z errors. We briefly discuss other areas of coordination, including high-performance computing resources and calibration data. Finally, we address concerns about the loss of independence and potential cross-checks between the two missions and the potential consequences of not collaborating.

  8. Formal Education with LSST

    NASA Astrophysics Data System (ADS)

    Herrold, Ardis; Bauer, Amanda, Dr.; Peterson, J. Matt; Large Synoptic Survey Telescope Education and Public Outreach Team

    2018-01-01

    The Large Synoptic Survey Telescope will usher in a new age of astronomical data exploration for science educators and students. LSST data sets will be large, deep, and dynamic, and will establish a time-domain record that will extend over a decade. They will be used to provide engaging, relevant learning experiences. The EPO Team will develop online investigations using authentic LSST data that offer varying levels of challenge and depth by the start of telescope operations, slated to begin in 2022. The topics will cover common introductory astronomy concepts, and will align with the four science domains of LSST: the Milky Way, the changing sky (transients), solar system (moving) objects, and dark matter and dark energy. Online Jupyter notebooks will make LSST data easy for students at the advanced middle school through college levels to access and analyze. Using online notebooks will circumvent common obstacles caused by firewalls, bandwidth issues, and the need to download software, as they will be accessible from any computer or tablet with internet access. Although the LSST EPO Jupyter notebooks are Python-based, knowledge of programming will not be required to use them. Each topical investigation will include teacher and student versions of Jupyter notebooks, instructional videos, and access to a suite of support materials including a forum, and professional development training and tutorial videos. Jupyter notebooks will contain embedded widgets to process data, eliminating the need to use external spreadsheets and plotting software. Students will be able to analyze data using some of the existing modules already developed for professional astronomers. This will shorten the time needed to conduct investigations and will shift the emphasis to understanding the underlying science themes, which is often lost with novice learners.

  9. The LSST Scheduler from design to construction

    NASA Astrophysics Data System (ADS)

    Delgado, Francisco; Reuter, Michael A.

    2016-07-01

    The Large Synoptic Survey Telescope (LSST) will be a highly robotic facility, demanding a very high efficiency during its operation. To achieve this, the LSST Scheduler has been envisioned as an autonomous software component of the Observatory Control System (OCS), that selects the sequence of targets in real time. The Scheduler will drive the survey using optimization of a dynamic cost function of more than 200 parameters. Multiple science programs produce thousands of candidate targets for each observation, and multiple telemetry measurements are received to evaluate the external and the internal conditions of the observatory. The design of the LSST Scheduler started early in the project supported by Model Based Systems Engineering, detailed prototyping and scientific validation of the survey capabilities required. In order to build such a critical component, an agile development path in incremental releases is presented, integrated to the development plan of the Operations Simulator (OpSim) to allow constant testing, integration and validation in a simulated OCS environment. The final product is a Scheduler that is also capable of running 2000 times faster than real time in simulation mode for survey studies and scientific validation during commissioning and operations.
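    The rank-and-select scheduling described above can be illustrated with a minimal sketch. The feature names, weights, and candidate targets below are invented for demonstration only; the real LSST Scheduler optimizes a dynamic cost function of more than 200 parameters fed by live telemetry.

    ```python
    # Hypothetical sketch of cost-function target selection. The features and
    # weights are illustrative assumptions, not the LSST Scheduler's actual model.

    # Negative weights penalize slew time and airmass; a positive weight
    # rewards science priority.
    WEIGHTS = {"slew_time_s": -0.5, "airmass": -1.0, "science_priority": 2.0}

    def cost(target):
        """Weighted sum of target features; higher = more desirable next visit."""
        return sum(WEIGHTS[k] * target[k] for k in WEIGHTS)

    def pick_next(candidates):
        """Select the candidate target that maximizes the cost function."""
        return max(candidates, key=cost)

    candidates = [
        {"name": "A", "slew_time_s": 10.0, "airmass": 1.2, "science_priority": 3.0},
        {"name": "B", "slew_time_s": 2.0, "airmass": 1.1, "science_priority": 3.0},
        {"name": "C", "slew_time_s": 2.0, "airmass": 2.0, "science_priority": 1.0},
    ]

    print(pick_next(candidates)["name"])  # "B": low slew, low airmass, high priority
    ```

    In a real scheduler this selection would run inside a loop, re-evaluating the cost of every candidate as conditions and telemetry change before each visit.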

  10. Scientific Synergy between LSST and Euclid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rhodes, Jason; Nichol, Robert C.; Aubourg, Éric

    We report that Euclid and the Large Synoptic Survey Telescope (LSST) are poised to dramatically change the astronomy landscape early in the next decade. The combination of high-cadence, deep, wide-field optical photometry from LSST with high-resolution, wide-field optical photometry and near-infrared photometry and spectroscopy from Euclid will be powerful for addressing a wide range of astrophysical questions. We explore Euclid/LSST synergy, ignoring the political issues associated with data access to focus on the scientific, technical, and financial benefits of coordination. We focus primarily on dark energy cosmology, but also discuss galaxy evolution, transient objects, solar system science, and galaxy cluster studies. We concentrate on synergies that require coordination in cadence or survey overlap, or would benefit from pixel-level co-processing that is beyond the scope of what is currently planned, rather than scientific programs that could be accomplished only at the catalog level without coordination in data processing or survey strategies. Also, we provide two quantitative examples of scientific synergies: the decrease in photo-z errors (benefiting many science cases) when high-resolution Euclid data are used for LSST photo-z determination, and the resulting increase in weak-lensing signal-to-noise ratio from smaller photo-z errors. We briefly discuss other areas of coordination, including high-performance computing resources and calibration data. Finally, we address concerns about the loss of independence and potential cross-checks between the two missions and the potential consequences of not collaborating.

  11. Scientific Synergy between LSST and Euclid

    DOE PAGES

    Rhodes, Jason; Nichol, Robert C.; Aubourg, Éric; ...

    2017-12-07

    We report that Euclid and the Large Synoptic Survey Telescope (LSST) are poised to dramatically change the astronomy landscape early in the next decade. The combination of high-cadence, deep, wide-field optical photometry from LSST with high-resolution, wide-field optical photometry and near-infrared photometry and spectroscopy from Euclid will be powerful for addressing a wide range of astrophysical questions. We explore Euclid/LSST synergy, ignoring the political issues associated with data access to focus on the scientific, technical, and financial benefits of coordination. We focus primarily on dark energy cosmology, but also discuss galaxy evolution, transient objects, solar system science, and galaxy cluster studies. We concentrate on synergies that require coordination in cadence or survey overlap, or would benefit from pixel-level co-processing that is beyond the scope of what is currently planned, rather than scientific programs that could be accomplished only at the catalog level without coordination in data processing or survey strategies. Also, we provide two quantitative examples of scientific synergies: the decrease in photo-z errors (benefiting many science cases) when high-resolution Euclid data are used for LSST photo-z determination, and the resulting increase in weak-lensing signal-to-noise ratio from smaller photo-z errors. We briefly discuss other areas of coordination, including high-performance computing resources and calibration data. Finally, we address concerns about the loss of independence and potential cross-checks between the two missions and the potential consequences of not collaborating.

  12. The European perspective for LSST

    NASA Astrophysics Data System (ADS)

    Gangler, Emmanuel

    2017-06-01

    LSST is a next generation telescope that will produce an unprecedented data flow. The project's goal is to deliver data products such as images and catalogs, thus enabling scientific analysis for a wide community of users. As a large-scale survey, LSST data will be complementary to data from other facilities, including ESA and ESO, across a wide range of scientific domains. European countries have invested in LSST since 2007, in the construction of the camera as well as in the computing effort. The latter will be instrumental in designing the next step: how to distribute LSST data to Europe. The astroinformatics challenges for LSST thus include not only the analysis of LSST big data, but also the practical efficiency of data access.

  13. The LSST OCS scheduler design

    NASA Astrophysics Data System (ADS)

    Delgado, Francisco; Schumacher, German

    2014-08-01

    The Large Synoptic Survey Telescope (LSST) is a complex system of systems with demanding performance and operational requirements. The nature of its scientific goals requires a special Observatory Control System (OCS) and particularly a very specialized automatic Scheduler. The OCS Scheduler is an autonomous software component that drives the survey, selecting the detailed sequence of visits in real time, taking into account multiple science programs, the current external and internal conditions, and the history of observations. We have developed a SysML model for the OCS Scheduler that fits coherently in the OCS and LSST integrated model. We have also developed a prototype of the Scheduler that implements the scheduling algorithms in the simulation environment provided by the Operations Simulator, where the environment and the observatory are modeled with real weather data and detailed kinematics parameters. This paper expands on the Scheduler architecture and the proposed algorithms to achieve the survey goals.

  14. Giga-z: A 100,000 OBJECT SUPERCONDUCTING SPECTROPHOTOMETER FOR LSST FOLLOW-UP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marsden, Danica W.; Mazin, Benjamin A.; O'Brien, Kieran

    2013-09-15

    We simulate the performance of a new type of instrument, a Superconducting Multi-Object Spectrograph (SuperMOS), that uses microwave kinetic inductance detectors (MKIDs). MKIDs, a new detector technology, feature good quantum efficiency in the UVOIR, can count individual photons with microsecond timing accuracy, and, like X-ray calorimeters, determine their energy to several percent. The performance of Giga-z, a SuperMOS designed for wide field imaging follow-up observations, is evaluated using simulated observations of the COSMOS mock catalog with an array of 100,000 R(423 nm) = E/ΔE = 30 MKID pixels. We compare our results against a simultaneous simulation of LSST observations. In 3 yr on a dedicated 4 m class telescope, Giga-z could observe ≈2 billion galaxies, yielding a low-resolution spectral energy distribution spanning 350-1350 nm for each; 1000 times the number measured with any currently proposed LSST spectroscopic follow-up, at a fraction of the cost and time. Giga-z would provide redshifts for galaxies up to z ≈ 6 with magnitudes m_i ≲ 25, with accuracy σΔz/(1+z) ≈ 0.03 for the whole sample, and σΔz/(1+z) ≈ 0.007 for a select subset. We also find catastrophic failure rates and biases that are consistently lower than for LSST. The added constraint on dark energy parameters for WL + CMB by Giga-z using the FoMSWG default model is equivalent to multiplying the LSST Fisher matrix by a factor of α = 1.27 (w_p), 1.53 (w_a), or 1.98 (Δγ). This is equivalent to multiplying both the LSST coverage area and the training sets by α and reducing all systematics by a factor of 1/√α, advantages that are robust to even more extreme models of intrinsic alignment.

  15. The Large Synoptic Survey Telescope (LSST) Camera

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Ranked as the top ground-based national priority for the field for the current decade, LSST is currently under construction in Chile. The U.S. Department of Energy’s SLAC National Accelerator Laboratory is leading the construction of the LSST camera – the largest digital camera ever built for astronomy. SLAC Professor Steven M. Kahn is the overall Director of the LSST project, and SLAC personnel are also participating in the data management. The National Science Foundation is the lead agency for construction of the LSST. Additional financial support comes from the Department of Energy and private funding raised by the LSST Corporation.

  16. Transient Alerts in LSST

    NASA Astrophysics Data System (ADS)

    Kantor, J.

    During LSST observing, transient events will be detected and alerts generated at the LSST Archive Center at NCSA in Champaign, Illinois. As a very high rate of alerts is expected, approaching ~10 million per night, we plan for VOEvent-compliant Distributor/Brokers (http://voevent.org) to be the primary end-points of the full LSST alert streams. End users will then use these Distributor/Brokers to classify and filter events on the stream for those fitting their science goals. These Distributor/Brokers are envisioned to be operated as a community service by third parties who will have signed MOUs with LSST. The exact identification of Distributor/Brokers to receive alerts will be determined as LSST approaches full operations and may change over time, but it is in our interest to identify and coordinate with them as early as possible. LSST will also operate a limited Distributor/Broker with a filtering capability at the Archive Center, to allow alerts to be sent directly to a limited number of entities that for some reason need a more direct connection to LSST. This might include, for example, observatories with significant follow-up capabilities whose observing may temporarily be more directly tied to LSST observing. It will let astronomers create simple filters that limit which alerts are ultimately forwarded to them. It will be possible to specify these user-defined filters using an SQL-like declarative language, or short snippets of (likely Python) code. We emphasize that this LSST-provided capability will be limited, and is not intended to satisfy the wide variety of use cases that a full-fledged public Event Distributor/Broker could. End users will not be able to subscribe to full, unfiltered alert streams coming directly from LSST. In this session, we will discuss anticipated LSST data rates, and capabilities for alert processing and distribution/brokering. We will clarify what the LSST Observatory will provide versus what we anticipate will be a community effort.
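    A user-defined Python filter snippet of the kind described above might look like the following minimal sketch. The alert fields, the filter signature, and the toy in-memory "stream" are assumptions for illustration; the actual LSST broker interface and alert packet schema are not specified here.

    ```python
    # Hypothetical user-defined alert filter. Field names ("band", "mag",
    # "prev_mag") and the dict-per-alert representation are assumptions,
    # not LSST's actual alert packet schema.

    def magnitude_jump_filter(alert, min_jump=1.0, band="g"):
        """Pass only alerts in the chosen band whose brightness changed
        by at least min_jump magnitudes since the previous measurement."""
        if alert.get("band") != band:
            return False
        prev, curr = alert.get("prev_mag"), alert.get("mag")
        if prev is None or curr is None:
            return False
        return abs(prev - curr) >= min_jump

    # A toy list standing in for a broker alert stream subscription.
    stream = [
        {"id": 1, "band": "g", "prev_mag": 21.4, "mag": 19.9},  # 1.5 mag jump
        {"id": 2, "band": "g", "prev_mag": 20.1, "mag": 20.0},  # change too small
        {"id": 3, "band": "r", "prev_mag": 22.0, "mag": 19.0},  # wrong band
    ]

    selected = [a["id"] for a in stream if magnitude_jump_filter(a)]
    print(selected)  # [1]
    ```

    An equivalent SQL-like declarative form of the same filter would be something like `WHERE band = 'g' AND ABS(prev_mag - mag) >= 1.0`, which is the style of predicate the limited Archive Center broker is expected to accept.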

  17. The Large Synoptic Survey Telescope (LSST) Camera

    ScienceCinema

    None

    2018-06-13

    Ranked as the top ground-based national priority for the field for the current decade, LSST is currently under construction in Chile. The U.S. Department of Energy’s SLAC National Accelerator Laboratory is leading the construction of the LSST camera – the largest digital camera ever built for astronomy. SLAC Professor Steven M. Kahn is the overall Director of the LSST project, and SLAC personnel are also participating in the data management. The National Science Foundation is the lead agency for construction of the LSST. Additional financial support comes from the Department of Energy and private funding raised by the LSST Corporation.

  18. FY-79 - development of fiber optics connector technology for large space systems

    NASA Technical Reports Server (NTRS)

    Campbell, T. G.

    1980-01-01

    The development of physical concepts for integrating fiber optic connectors and cables with structural concepts proposed for the LSST is discussed. Emphasis is placed on remote connections using integrated cables.

  19. Optimizing the LSST Dither Pattern for Survey Uniformity

    NASA Astrophysics Data System (ADS)

    Awan, Humna; Gawiser, Eric J.; Kurczynski, Peter; Carroll, Christopher M.; LSST Dark Energy Science Collaboration

    2015-01-01

    The Large Synoptic Survey Telescope (LSST) will gather detailed data of the southern sky, enabling unprecedented study of Baryonic Acoustic Oscillations, which are an important probe of dark energy. These studies require a survey with highly uniform depth, and we aim to find an observation strategy that optimizes this uniformity. We have shown that in the absence of dithering (large telescope-pointing offsets), the LSST survey will vary significantly in depth. Hence, we implemented various dithering strategies, including random and repulsive random pointing offsets and spiral patterns with the spiral reaching completion in either a few months or the entire ten-year run. We employed three different implementations of dithering strategies: a single offset assigned to all fields observed on each night, offsets assigned to each field independently whenever the field is observed, and offsets assigned to each field only when the field is observed on a new night. Our analysis reveals that large dithers are crucial to guarantee survey uniformity and that assigning dithers to each field independently whenever the field is observed significantly increases this uniformity. These results suggest paths towards an optimal observation strategy that will enable LSST to achieve its science goals. We gratefully acknowledge support from the National Science Foundation REU program at Rutgers, PHY-1263280, and the Department of Energy, DE-SC0011636.
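    The per-field random dithering strategy described above can be sketched in a few lines. The offset scale, the uniform sampling within a disc, and the flat treatment of (RA, Dec) offsets (ignoring the cos(Dec) projection) are simplifying assumptions for illustration only.

    ```python
    import math
    import random

    # Illustrative sketch of random dithering with an independent offset
    # assigned to each field on each night. Offsets here are flat angular
    # shifts in degrees; a real implementation would project onto the sky.

    FIELD_RADIUS_DEG = 1.75  # half of LSST's ~3.5 degree field of view

    def random_dither(max_offset=FIELD_RADIUS_DEG, rng=random):
        """Draw a pointing offset uniformly within a disc of radius max_offset."""
        r = max_offset * math.sqrt(rng.random())  # sqrt => uniform over area
        theta = 2 * math.pi * rng.random()
        return r * math.cos(theta), r * math.sin(theta)

    def dithered_pointings(field_centers, nights, rng=random):
        """Assign an independent random offset to each field on each night."""
        pointings = []
        for night in range(nights):
            for ra, dec in field_centers:
                dra, ddec = random_dither(rng=rng)
                pointings.append((night, ra + dra, dec + ddec))
        return pointings

    fields = [(30.0, -10.0), (32.5, -10.0)]  # two toy field centers (RA, Dec)
    schedule = dithered_pointings(fields, nights=3)
    print(len(schedule))  # 6 pointings: 2 fields x 3 nights
    ```

    Re-drawing the offset at every visit, rather than once per night for all fields, is what the analysis above finds most effective at smoothing out depth variations across the survey footprint.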

  20. Examining the Potential of LSST to Contribute to Exoplanet Discovery

    NASA Astrophysics Data System (ADS)

    Lund, Michael B.; Pepper, Joshua; Jacklin, Savannah; Stassun, Keivan G.

    2018-01-01

    The Large Synoptic Survey Telescope (LSST), currently under construction in Chile with scheduled first light in 2019, will be one of the major sources of data in the next decade and is one of the top priorities expressed in the last Decadal Survey. Because LSST is intended to address a range of science questions, the LSST community is still working on optimizing the observing strategy of the survey. With a survey area that will cover half the sky in 6 bands, providing photometric data on billions of stars from 16th to 24th magnitude, LSST can be leveraged to contribute to exoplanet science. In particular, LSST has the potential to detect exoplanets around stellar populations that are not normally included in transiting exoplanet searches. This includes searching for exoplanets around red and white dwarfs, stars in the galactic plane and bulge, stellar clusters, and potentially even the Magellanic Clouds. In probing these varied stellar populations, relative exoplanet frequency can be examined, and in turn, LSST may be able to provide fresh insight into how stellar environment plays a role in planetary formation rates. Our initial work on this project has been to demonstrate that, even with the limitations of the LSST cadence, exoplanets would be recoverable and detectable in the LSST photometry, and to show that exoplanets are indeed worth including in discussions of variable sources to which LSST can contribute. We have continued to expand this work to examine exoplanets around stars belonging to various stellar populations, both to show the types of systems that LSST is capable of discovering and to determine the potential exoplanet yields using standard algorithms that have already been implemented in transiting exoplanet searches, as well as how changes to LSST's observing schedule may impact both of these results.

  1. Science Education with the LSST

    NASA Astrophysics Data System (ADS)

    Jacoby, S. H.; Khandro, L. M.; Larson, A. M.; McCarthy, D. W.; Pompea, S. M.; Shara, M. M.

    2004-12-01

    LSST will create the first true celestial cinematography - a revolution in public access to the changing universe. The challenge will be to take advantage of the unique capabilities of the LSST while presenting the data in ways that are manageable, engaging, and supportive of national science education goals. To prepare for this opportunity for exploration, tools and displays will be developed using current deep-sky multi-color imaging data. Education professionals from LSST partners invite input from interested members of the community. Initial LSST science education priorities include: - Fostering authentic student-teacher research projects at all levels, - Exploring methods of visualizing the large and changing datasets in science centers, - Defining Web-based interfaces and tools for access and interaction with the data, - Delivering online instructional materials, and - Developing meaningful interactions between LSST scientists and the public.

  2. LSST (Hoop/Column) Maypole Antenna Development Program, phase 1, part 1

    NASA Technical Reports Server (NTRS)

    Sullivan, M. R.

    1982-01-01

    The first of a two-phase program was performed to develop the technology necessary to evaluate, design, manufacture, package, transport, and deploy the hoop/column deployable antenna reflector by means of a ground-based program. The hoop/column concept consists of a cable-stiffened large-diameter hoop and central column structure that supports and contours a radio frequency reflective mesh surface. Mission scenarios for communications, radiometry, and radio astronomy were studied. Data were provided to establish the technology drivers that resulted in the specification of a point design. The point design is a multiple-beam quad-aperture offset antenna system which provides four separate offset areas of illumination on a 100 meter diameter symmetrical parent reflector. The periphery of the reflector is a hoop having 48 segments that articulate into a small stowed volume around a central extendable column. The hoop and column are structurally connected by graphite and quartz cables. The prominence of cables in the design resulted in the development of advanced cable technology. Design verification models were built of the hoop, column, and surface stowage subassemblies. Model designs were generated for a half-scale sector of the surface and a 1/6-scale model of the complete deployable reflector.

  3. LSST active optics system software architecture

    NASA Astrophysics Data System (ADS)

    Thomas, Sandrine J.; Chandrasekharan, Srinivasan; Lotz, Paul; Xin, Bo; Claver, Charles; Angeli, George; Sebag, Jacques; Dubois-Felsmann, Gregory P.

    2016-08-01

    The Large Synoptic Survey Telescope (LSST) is an 8-meter-class wide-field telescope now under construction on Cerro Pachon, near La Serena, Chile. This ground-based telescope is designed to conduct a decade-long time-domain survey of the optical sky. In order to achieve the LSST scientific goals, the telescope must deliver seeing-limited image quality over the 3.5 degree field of view. Like many telescopes, LSST will use an Active Optics System (AOS) to correct in near real-time the system aberrations primarily introduced by gravity and temperature gradients. The LSST AOS uses a combination of 4 curvature wavefront sensors (CWS) located on the outside of the LSST field of view. The information coming from the 4 CWS is combined to calculate the appropriate corrections to be sent to the 3 different mirrors composing LSST. The AOS software incorporates a wavefront sensor estimation pipeline (WEP) and an active optics control system (AOCS). The WEP estimates the wavefront residual error from the CWS images. The AOCS determines the correction to be sent to the different degrees of freedom every 30 seconds. In this paper, we describe the design and implementation of the AOS. More particularly, we focus on the software architecture as well as the AOS interactions with the various subsystems within LSST.

  4. Geostationary platform systems concepts definition follow-on study. Volume 2A: Technical Task 2 LSST special emphasis

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The results of the Large Space Systems Technology special emphasis task are presented. The task was an analysis of structural requirements deriving from the initial Phase A Operational Geostationary Platform study.

  5. LSST Survey Data: Models for EPO Interaction

    NASA Astrophysics Data System (ADS)

    Olsen, J. K.; Borne, K. D.

    2007-12-01

    The potential for education and public outreach with the Large Synoptic Survey Telescope is as far reaching as the telescope itself. LSST data will be available to the public, giving anyone with a web browser a movie-like window on the Universe. The LSST project is unique in designing its data management and data access systems with the public and community users in mind. The enormous volume of data to be generated by LSST is staggering: 30 Terabytes per night, 10 Petabytes per year. The final database of extracted science parameters from the images will also be enormous -- 50-100 Petabytes -- a rich gold mine for data mining and scientific discovery potential. LSST will also generate 100,000 astronomical alerts per night, for 10 years. The LSST EPO team is examining models for EPO interaction with the survey data, particularly in how the community (amateurs, teachers, students, and general public) can participate in the discovery process. We will outline some of our models of community interaction for inquiry-based science using the LSST survey data, and we invite discussion on these topics.

  6. Satellite Power Systems (SPS). LSST systems and integration task for SPS flight test article

    NASA Technical Reports Server (NTRS)

    Greenberg, H. S.

    1981-01-01

    This research activity emphasizes the systems definition and resulting structural requirements for the primary structure of two potential SPS large space structure test articles. These test articles represent potential steps in the SPS research and technology development.

  7. Big Software for Big Data: Scaling Up Photometry for LSST (Abstract)

    NASA Astrophysics Data System (ADS)

    Rawls, M.

    2017-06-01

    (Abstract only) The Large Synoptic Survey Telescope (LSST) will capture mosaics of the sky every few nights, each containing more data than your computer's hard drive can store. As a result, the software to process these images is as critical to the science as the telescope and the camera. I discuss the algorithms and software being developed by the LSST Data Management team to handle such a large volume of data. All of our work is open source and available to the community. Once LSST comes online, our software will produce catalogs of objects and a stream of alerts. These will bring exciting new opportunities for follow-up observations and collaborations with LSST scientists.

  8. Asteroid Discovery and Characterization with the Large Synoptic Survey Telescope

    NASA Astrophysics Data System (ADS)

    Jones, R. Lynne; Jurić, Mario; Ivezić, Željko

    2016-01-01

    The Large Synoptic Survey Telescope (LSST) will be a ground-based, optical, all-sky, rapid cadence survey project with tremendous potential for discovering and characterizing asteroids. With LSST's large 6.5m effective-diameter primary mirror, a 3.2 Gigapixel camera with a wide 9.6 square degree field of view, and rapid observational cadence, LSST will discover more than 5 million asteroids over its ten year survey lifetime. With a single visit limiting magnitude of 24.5 in r band, LSST will be able to detect asteroids in the Main Belt down to sub-kilometer sizes. The current strawman for the LSST survey strategy is to obtain two visits (each `visit' being a pair of back-to-back 15s exposures) per field, separated by about 30 minutes, covering the entire visible sky every 3-4 days throughout the observing season, for ten years. The catalogs generated by LSST will increase the known number of small bodies in the Solar System by a factor of 10-100, among all populations. The median number of observations for Main Belt asteroids will be on the order of 200-300, with Near Earth Objects receiving a median of 90 observations. These observations will be spread among ugrizy bandpasses, providing photometric colors and allowing sparse lightcurve inversion to determine rotation periods, spin axes, and shape information. These catalogs will be created using automated detection software, the LSST Moving Object Processing System (MOPS), that will take advantage of the carefully characterized LSST optical system, cosmetically clean camera, and recent improvements in difference imaging. Tests with the prototype MOPS software indicate that linking detections (and thus `discovery') will be possible at LSST depths with our working model for the survey strategy, but evaluation of MOPS and improvements in the survey strategy will continue. All data products and software created by LSST will be publicly available.
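The pair-of-visits cadence exists so that detections taken roughly 30 minutes apart can be linked into candidate "tracklets", the first step of moving-object discovery. The sketch below is a heavily simplified, flat-sky version of that linking step with invented coordinates and rate limits; the real MOPS works on the sphere with uncertainty-aware matching.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    mjd: float   # time of observation (days)
    ra: float    # degrees
    dec: float   # degrees

def make_tracklets(visit1, visit2, max_rate=1.0):
    """Pair detections from two visits ~30 minutes apart into candidate
    tracklets, keeping pairs whose implied sky motion stays below
    max_rate (deg/day) -- a toy version of the MOPS linking step."""
    tracklets = []
    for a in visit1:
        for b in visit2:
            dt = b.mjd - a.mjd
            if dt <= 0:
                continue
            # small-angle, flat-sky approximation for the toy example
            dist = ((b.ra - a.ra) ** 2 + (b.dec - a.dec) ** 2) ** 0.5
            if dist / dt <= max_rate:
                tracklets.append((a, b))
    return tracklets

v1 = [Detection(60000.00, 150.000, -20.000)]
v2 = [Detection(60000.02, 150.005, -20.003),   # ~0.3 deg/day: kept
      Detection(60000.02, 151.000, -20.000)]   # ~50 deg/day: rejected
print(len(make_tracklets(v1, v2)))  # 1
```

The rate cut is what separates Main Belt-like motion from chance alignments of unrelated transients; tightening or loosening `max_rate` trades completeness against the false-tracklet rate.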

  9. Investigating the Bright End of LSST Photometry

    NASA Astrophysics Data System (ADS)

    Ojala, Elle; Pepper, Joshua; LSST Collaboration

    2018-01-01

    The Large Synoptic Survey Telescope (LSST) will begin operations in 2022, conducting a wide-field, synoptic multiband survey of the southern sky. Some fraction of objects at the bright end of the magnitude regime observed by LSST will overlap with other wide-sky surveys, allowing for calibration and cross-checking between surveys. The LSST is optimized for observations of very faint objects, so much of this data overlap will consist of saturated images. This project provides the first in-depth analysis of saturation in LSST images. Using the PhoSim package to create simulated LSST images, we evaluate saturation properties of several types of stars to determine the brightness limitations of LSST. We also collect metadata from many wide-field photometric surveys to provide cross-survey accounting and comparison. Additionally, we evaluate the accuracy of the PhoSim modeling parameters to determine the reliability of the software. These efforts will allow us to determine the expected usable data overlap between bright-end LSST images and faint-end images in other wide-sky surveys. Our next steps are developing methods to extract photometry from saturated images. This material is based upon work supported in part by the National Science Foundation through Cooperative Agreement 1258333 managed by the Association of Universities for Research in Astronomy (AURA), and the Department of Energy under Contract No. DE-AC02-76SF00515 with the SLAC National Accelerator Laboratory. Additional LSST funding comes from private donations, grants to universities, and in-kind support from LSSTC Institutional Members. Thanks to NSF grant PHY-135195 and the 2017 LSSTC Grant Award #2017-UG06 for making this project possible.

  10. Investigating interoperability of the LSST data management software stack with Astropy

    NASA Astrophysics Data System (ADS)

    Jenness, Tim; Bosch, James; Owen, Russell; Parejko, John; Sick, Jonathan; Swinbank, John; de Val-Borro, Miguel; Dubois-Felsmann, Gregory; Lim, K.-T.; Lupton, Robert H.; Schellart, Pim; Krughoff, K. S.; Tollerud, Erik J.

    2016-07-01

    The Large Synoptic Survey Telescope (LSST) will be an 8.4m optical survey telescope sited in Chile and capable of imaging the entire sky twice a week. The data rate of approximately 15TB per night and the requirements to both issue alerts on transient sources within 60 seconds of observing and create annual data releases mean that automated data management systems and data processing pipelines are a key deliverable of the LSST construction project. The LSST data management software has been in development since 2004 and is based on a C++ core with a Python control layer. The software consists of nearly a quarter of a million lines of code covering the system from fundamental WCS and table libraries to pipeline environments and distributed process execution. The Astropy project began in 2011 as an attempt to bring together disparate open source Python projects and build a core standard infrastructure that can be used and built upon by the astronomy community. This project has been phenomenally successful in the years since it began and has grown to be the de facto standard for Python software in astronomy. Astropy brings with it considerable expectations from the community on how astronomy Python software should be developed, and it is clear that by the time LSST is fully operational in the 2020s many of the prospective users of the LSST software stack will expect it to be fully interoperable with Astropy. In this paper we describe the overlap between the LSST science pipeline software and Astropy software and investigate areas where the LSST software provides new functionality. We also discuss the possibilities of re-engineering the LSST science pipeline software to build upon Astropy, including the option of contributing affiliated packages.

  11. The Large Synoptic Survey Telescope as a Near-Earth Object discovery machine

    NASA Astrophysics Data System (ADS)

    Jones, R. Lynne; Slater, Colin T.; Moeyens, Joachim; Allen, Lori; Axelrod, Tim; Cook, Kem; Ivezić, Željko; Jurić, Mario; Myers, Jonathan; Petry, Catherine E.

    2018-03-01

    Using the most recent prototypes, design, and as-built system information, we test and quantify the capability of the Large Synoptic Survey Telescope (LSST) to discover Potentially Hazardous Asteroids (PHAs) and Near-Earth Objects (NEOs). We empirically estimate an expected upper limit to the false detection rate in LSST image differencing, using measurements on DECam data and prototype LSST software, and find it to be about 450 deg⁻². We show that this rate is already tractable with the current prototype of the LSST Moving Object Processing System (MOPS) by processing a 30-day simulation consistent with the measured false detection rates. We proceed to evaluate the performance of the LSST baseline survey strategy for PHAs and NEOs using a high-fidelity simulated survey pointing history. We find that LSST alone, using its baseline survey strategy, will detect 66% of the PHA and 61% of the NEO population objects brighter than H = 22, with an uncertainty in the estimate of ±5 percentage points. By generating and examining variations on the baseline survey strategy, we show it is possible to further improve the discovery yields. In particular, we find that extending the LSST survey by two additional years and doubling the MOPS search window increases the completeness for PHAs to 86% (including those discovered by contemporaneous surveys) without jeopardizing other LSST science goals (77% for NEOs). This equates to reducing the undiscovered population of PHAs by an additional 26% (15% for NEOs), relative to the baseline survey.

  12. The LSST: A System of Systems

    NASA Astrophysics Data System (ADS)

    Claver, Chuck F.; Debois-Felsmann, G. P.; Delgado, F.; Hascall, P.; Marshall, S.; Nordby, M.; Schumacher, G.; Sebag, J.; LSST Collaboration

    2011-01-01

    The Large Synoptic Survey Telescope (LSST) is a complete observing system that acquires and archives images, processes and analyzes them, and publishes reduced images and catalogs of sources and objects. The LSST will operate over a ten year period producing a survey of 20,000 square degrees over the entire southern sky in 6 filters (ugrizy) with each field having been visited several hundred times, enabling a wide spectrum of science from fast transients to exploration of dark matter and dark energy. The LSST itself is a complex system of systems consisting of the 8.4m 3-mirror telescope, a 3.2 billion pixel camera, and a peta-scale data management system. The LSST project uses a Model Based Systems Engineering (MBSE) methodology to ensure an integrated approach to system design and rigorous definition of system interfaces and specifications. The MBSE methodology is applied through modeling of the LSST's systems with the System Modeling Language (SysML). The SysML modeling recursively establishes the threefold relationship between requirements, logical & physical functional decomposition and definition, and system and component behavior at successively deeper levels of abstraction and detail. The LSST modeling includes analyzing and documenting the flow of command and control information and data between the suite of systems in the LSST observatory that are needed to carry out the activities of the survey. The MBSE approach is applied throughout all stages of the project from design, to validation and verification, through to commissioning.

  13. Integration and verification testing of the Large Synoptic Survey Telescope camera

    NASA Astrophysics Data System (ADS)

    Lange, Travis; Bond, Tim; Chiang, James; Gilmore, Kirk; Digel, Seth; Dubois, Richard; Glanzman, Tom; Johnson, Tony; Lopez, Margaux; Newbry, Scott P.; Nordby, Martin E.; Rasmussen, Andrew P.; Reil, Kevin A.; Roodman, Aaron J.

    2016-08-01

    We present an overview of the Integration and Verification Testing activities of the Large Synoptic Survey Telescope (LSST) Camera at the SLAC National Accelerator Lab (SLAC). The LSST Camera, the sole instrument for LSST and now under construction, comprises a 3.2 Giga-pixel imager and a three-element corrector with a 3.5 degree diameter field of view. LSST Camera Integration and Test will take place over the next four years, with final delivery to the LSST observatory anticipated in early 2020. We outline the planning for Integration and Test, describe some of the key verification hardware systems being developed, and identify some of the more complicated assembly/integration activities. Specific details of integration and verification hardware systems will be discussed, highlighting some of the technical challenges anticipated.

  14. A Prototype External Event Broker for LSST

    NASA Astrophysics Data System (ADS)

    Elan Alvarez, Gabriella; Stassun, Keivan; Burger, Dan; Siverd, Robert; Cox, Donald

    2015-01-01

    LSST plans to have an alerts system that will automatically identify various types of "events" appearing in the LSST data stream. These events will include supernovae, moving objects, and many other types, and it is expected that there will be millions of events nightly. To help the LSST community parse and take full advantage of the LSST alerts stream, we are working to design an external "events alert broker" that will generate real-time notification of LSST events to users and/or robotic telescope facilities based on user-specified criteria. For example, users will be able to specify that they wish to be notified immediately via text message of urgent events, such as GRB counterparts, or notified only occasionally in digest form of less time-sensitive events, such as eclipsing binaries. This poster summarizes results from a survey of scientists on the most important features that such an alerts notification service needs to provide, and presents a preliminary design for our external event broker.
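One way to picture the user-specified-criteria model is a subscription table matched against the alert stream, with each match routed to immediate delivery or a nightly digest. The field names, event types, and delivery modes below are invented for illustration; they are not the poster's actual design.

```python
# Toy filter-and-dispatch sketch for a user-configurable alert broker.
# Users register criteria (event types, delivery mode); matching alerts
# are either sent immediately or queued for a nightly digest.

def route_alerts(alerts, subscriptions):
    immediate, digest = [], []
    for alert in alerts:
        for sub in subscriptions:
            if alert["type"] in sub["types"]:
                target = immediate if sub["delivery"] == "immediate" else digest
                target.append((sub["user"], alert))
    return immediate, digest

alerts = [
    {"id": 1, "type": "grb_counterpart"},
    {"id": 2, "type": "eclipsing_binary"},
]
subs = [
    {"user": "alice", "types": {"grb_counterpart"}, "delivery": "immediate"},
    {"user": "bob", "types": {"eclipsing_binary"}, "delivery": "digest"},
]
immediate, digest = route_alerts(alerts, subs)
print(len(immediate), len(digest))  # 1 1
```

At millions of events per night the linear scan over subscriptions would be replaced by an indexed matcher, but the routing contract — alert in, per-user deliveries out — stays the same.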

  15. Astrometry with LSST: Objectives and Challenges

    NASA Astrophysics Data System (ADS)

    Casetti-Dinescu, D. I.; Girard, T. M.; Méndez, R. A.; Petronchak, R. M.

    2018-01-01

    The forthcoming Large Synoptic Survey Telescope (LSST) is an optical telescope with an effective aperture of 6.4 m, and a field of view of 9.6 square degrees. Thus, LSST will have an étendue larger than any other optical telescope, performing wide-field, deep imaging of the sky. There are four broad categories of science objectives: 1) dark energy and dark matter, 2) transients, 3) the Milky Way and its neighbours, and 4) the Solar System. In particular, for the Milky-Way science case, astrometry will make a critical contribution; therefore, special attention must be devoted to extracting the maximum amount of astrometric information from the LSST data. Here, we outline the astrometric challenges posed by such a massive survey. We also present some current examples of ground-based, wide-field, deep imagers used for astrometry, as precursors of the LSST.

  16. Data Mining Research with the LSST

    NASA Astrophysics Data System (ADS)

    Borne, Kirk D.; Strauss, M. A.; Tyson, J. A.

    2007-12-01

    The LSST catalog database will exceed 10 petabytes, comprising several hundred attributes for 5 billion galaxies, 10 billion stars, and over 1 billion variable sources (optical variables, transients, or moving objects), extracted from over 20,000 square degrees of deep imaging in 5 passbands with thorough time domain coverage: 1000 visits over the 10-year LSST survey lifetime. The opportunities are enormous for novel scientific discoveries within this rich time-domain ultra-deep multi-band survey database. Data Mining, Machine Learning, and Knowledge Discovery research opportunities with the LSST are now under study, with the potential for new collaborations to contribute to these investigations. We will describe features of the LSST science database that are amenable to scientific data mining, object classification, outlier identification, anomaly detection, image quality assurance, and survey science validation. We also give some illustrative examples of current scientific data mining research in astronomy, and point out where new research is needed. In particular, the data mining research community will need to address several issues in the coming years as we prepare for the LSST data deluge. The data mining research agenda includes: scalability (at petabyte scales) of existing machine learning and data mining algorithms; development of grid-enabled parallel data mining algorithms; designing a robust system for brokering classifications from the LSST event pipeline (which may produce 10,000 or more event alerts per night); multi-resolution methods for exploration of petascale databases; visual data mining algorithms for visual exploration of the data; indexing of multi-attribute multi-dimensional astronomical databases (beyond RA-Dec spatial indexing) for rapid querying of petabyte databases; and more. Finally, we will identify opportunities for synergistic collaboration between the data mining research group and the LSST Data Management and Science Collaboration teams.
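As a concrete instance of the outlier identification mentioned in the research agenda, a robust z-score cut in color space flags objects far from the bulk population. This is a minimal stand-in with simulated colors, not an LSST pipeline algorithm; the 5-sigma threshold and column meanings are illustrative choices.

```python
import numpy as np

def flag_outliers(features, k=5.0):
    """Flag rows whose robust z-score (median/MAD based) exceeds k in
    any feature column -- a minimal outlier-detection stand-in."""
    med = np.median(features, axis=0)
    mad = np.median(np.abs(features - med), axis=0) * 1.4826  # MAD -> sigma
    z = np.abs(features - med) / mad
    return np.any(z > k, axis=1)

rng = np.random.default_rng(0)
colors = rng.normal(size=(1000, 3))   # e.g. g-r, r-i, i-z colors (toy)
colors[0] = [8.0, 0.0, 0.0]           # implanted anomaly
mask = flag_outliers(colors)
print(bool(mask[0]), int(mask.sum()))
```

Median/MAD statistics are preferred over mean/standard deviation here precisely because the anomalies being hunted would otherwise inflate the scale estimate and hide themselves.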

  17. The Stellar Populations of the Milky Way and Nearby Galaxies with LSST

    NASA Astrophysics Data System (ADS)

    Olsen, Knut A.; Covey, K.; Saha, A.; Beers, T. C.; Bochanski, J.; Boeshaar, P.; Cargile, P.; Catelan, M.; Burgasser, A.; Cook, K.; Dhital, S.; Figer, D.; Ivezic, Z.; Kalirai, J.; McGehee, P.; Minniti, D.; Pepper, J.; Prsa, A.; Sarajedini, A.; Silva, D.; Smith, J. A.; Stassun, K.; Thorman, P.; Williams, B.; LSST Stellar Populations Collaboration

    2011-01-01

    The LSST will produce a multi-color map and photometric object catalog of half the sky to r=27.6 (AB mag; 5-sigma) when observations at the individual epochs of the standard cadence are stacked. Analyzing the ten years of independent measurements in each field will allow variability, proper motion and parallax measurements to be derived for objects brighter than r=24.5. These photometric, astrometric, and variability data will enable the construction of a detailed and robust map of the stellar populations of the Milky Way, its satellites and its nearest extra-galactic neighbors--allowing exploration of their star formation, chemical enrichment, and accretion histories on a grand scale. For example, with geometric parallax accuracy of 1 milli-arc-sec, comparable to HIPPARCOS but reaching more than 10 magnitudes fainter, LSST will allow a complete census of all stars above the hydrogen-burning limit that are closer than 500 pc, including thousands of predicted L and T dwarfs. The LSST time sampling will identify and characterize variable stars of all types, from time scales of 1 hr to several years, a feast for variable star astrophysics; LSST's projected impact on the study of several variable star classes, including eclipsing binaries, is discussed here. We also describe the ongoing efforts of the collaboration to optimize the LSST system for stellar populations science. We are currently investigating the trade-offs associated with the exact wavelength boundaries of the LSST filters, identifying the most scientifically valuable locations for fields that will receive enhanced temporal coverage compared to the standard cadence, and analyzing synthetic LSST outputs to verify that the system's performance will be sufficient to achieve our highest priority science goals.

  18. Firefly: embracing future web technologies

    NASA Astrophysics Data System (ADS)

    Roby, W.; Wu, X.; Goldina, T.; Joliet, E.; Ly, L.; Mi, W.; Wang, C.; Zhang, Lijun; Ciardi, D.; Dubois-Felsmann, G.

    2016-07-01

    At IPAC/Caltech, we have developed the Firefly web archive and visualization system. Used in production for the last eight years in many missions, Firefly gives the scientist significant capabilities to study data. Firefly provided the first completely web-based FITS viewer as well as a growing set of tabular and plotting visualizers. Further, it will be used for the science user interface of the LSST telescope, which goes online in 2021. Firefly must meet the needs of archive access and visualization for the 2021 LSST telescope and must serve astronomers beyond the year 2030. Recently, our team has faced the fact that the technology behind the Firefly software was becoming obsolete. We were searching for ways to utilize the current breakthroughs in maintaining stability, testability, speed, and reliability of large web applications, which Firefly exemplifies. In the last year, we have ported Firefly to cutting-edge web technologies. Embarking on this massive overhaul is no small feat. Choosing the technologies that will maintain a forward trajectory in a future development project is always hard and often overwhelming. When a team must port 150,000 lines of code for a production-level product, there is little room to make poor choices. This paper will give an overview of the most modern web technologies and lessons learned in our conversion from a GWT-based system to a React/Redux-based system.

  19. The LSST: A System of Systems

    NASA Astrophysics Data System (ADS)

    Claver, Chuck F.; Dubois-Felsmann, G. P.; Delgado, F.; Hascall, P.; Horn, D.; Marshall, S.; Nordby, M.; Schalk, T. L.; Schumacher, G.; Sebag, J.; LSST Project Team

    2010-01-01

    The LSST is a complete observing system that acquires and archives images, processes and analyzes them, and publishes reduced images and catalogs of sources and objects. The LSST will operate over a ten year period producing a survey of 20,000 square degrees over the entire southern sky in 6 filters (ugrizy) with each field having been visited several hundred times, enabling a wide spectrum of science from fast transients to exploration of dark matter and dark energy. The LSST itself is a complex system of systems consisting of the 8.4m three mirror telescope, a 3.2 billion pixel camera, and a peta-scale data management system. The LSST project uses a Model Based Systems Engineering (MBSE) methodology to ensure an integrated approach to system design and rigorous definition of system interfaces and specifications. The MBSE methodology is applied through modeling of the LSST's systems with the System Modeling Language (SysML). The SysML modeling recursively establishes the threefold relationship between requirements, logical & physical functional decomposition and definition, and system and component behavior at successively deeper levels of abstraction and detail. The MBSE approach is applied throughout all stages of the project from design, to validation and verification, through to commissioning.

  20. LSST camera grid structure made out of ceramic composite material, HB-Cesic

    NASA Astrophysics Data System (ADS)

    Kroedel, Matthias R.; Langton, J. Bryan

    2016-08-01

    In this paper we present the ceramic design and fabrication of the camera grid structure, which uses the unique manufacturing features of the HB-Cesic technology together with a dedicated metrology device in order to ensure the challenging flatness requirement of 4 microns over the full array.

  1. Designing a multi-petabyte database for LSST

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Becla, J; Hanushevsky, A

    2005-12-21

    The 3.2 giga-pixel LSST camera will produce over half a petabyte of raw images every month. This data needs to be reduced in under a minute to produce real-time transient alerts, and then cataloged and indexed to allow efficient access and simplify further analysis. The indexed catalogs alone are expected to grow at a rate of about 600 terabytes per year. The sheer volume of data, the real-time transient alerting requirements of the LSST, and its spatio-temporal aspects require cutting-edge techniques to build an efficient data access system at reasonable cost. As currently envisioned, the system will rely on a database for catalogs and metadata. Several database systems are being evaluated to understand how they will scale and perform at these data volumes in anticipated LSST access patterns. This paper describes the LSST requirements, the challenges they impose, the data access philosophy, and the database architecture that is expected to be adopted in order to meet the data challenges.
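The spatio-temporal access pattern the paper refers to can be illustrated with a toy catalog query: a sky box plus a time-range predicate served by a composite index. The schema, column names, and values below are invented, and a plain (ra, dec) B-tree is only a stand-in — production-scale designs use dedicated spherical indexing schemes rather than this.

```python
import sqlite3

# Toy source catalog with a composite (ra, dec) index; a real survey
# database would use a spherical index (e.g. HTM-style) and partitioning.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE source (ra REAL, dec REAL, mjd REAL, flux REAL)")
con.execute("CREATE INDEX idx_radec ON source (ra, dec)")
con.executemany(
    "INSERT INTO source VALUES (?, ?, ?, ?)",
    [(150.1, -20.2, 60000.1, 1.0),
     (150.2, -20.3, 60001.1, 1.1),
     (10.0,   40.0, 60000.5, 2.0)],
)
# Spatial box + time range: the canonical catalog access pattern.
rows = con.execute(
    """SELECT ra, dec FROM source
       WHERE ra BETWEEN 150.0 AND 150.5
         AND dec BETWEEN -20.5 AND -20.0
         AND mjd < 60001.0""").fetchall()
print(rows)  # [(150.1, -20.2)]
```

The point of the sketch is the shape of the predicate, not the engine: the scaling question the paper studies is precisely how such box-plus-time queries behave at hundreds of terabytes per year.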

  2. TRANSITING PLANETS WITH LSST. II. PERIOD DETECTION OF PLANETS ORBITING 1 M⊙ HOSTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacklin, Savannah; Lund, Michael B.; Stassun, Keivan G.

    2015-07-15

    The Large Synoptic Survey Telescope (LSST) will photometrically monitor ~10⁹ stars for 10 years. The resulting light curves can be used to detect transiting exoplanets. In particular, as demonstrated by Lund et al., LSST will probe stellar populations currently undersampled in most exoplanet transit surveys, including out to extragalactic distances. In this paper we test the efficiency of the box-fitting least-squares (BLS) algorithm for accurately recovering the periods of transiting exoplanets using simulated LSST data. We model planets with a range of radii orbiting a solar-mass star at a distance of 7 kpc, with orbital periods ranging from 0.5 to 20 days. We find that standard-cadence LSST observations will be able to reliably recover the periods of Hot Jupiters with periods shorter than ~3 days; however, it will remain a challenge to confidently distinguish these transiting planets from false positives. At the same time, we find that the LSST deep-drilling cadence is extremely powerful: the BLS algorithm successfully recovers at least 30% of sub-Saturn-size exoplanets with orbital periods as long as 20 days, and a simple BLS power criterion robustly distinguishes ~98% of these from photometric (i.e., statistical) false positives.
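The box-fitting idea is easy to demonstrate on a toy light curve: phase-fold at trial periods and score how far the deepest phase bin dips below the mean flux. This is a deliberately simplified stand-in for the full BLS algorithm (no box-width optimization, no noise weighting), and all numbers — 180-day season, 2 mmag noise, 1% transit depth — are invented for the demonstration.

```python
import numpy as np

def bls_power(t, flux, periods, n_bins=100):
    """Simplified BLS-style scan: phase-fold at each trial period and
    score the depth of the deepest phase bin below the mean flux."""
    power = np.empty(len(periods))
    for i, p in enumerate(periods):
        phase = (t % p) / p
        bins = np.minimum((phase * n_bins).astype(int), n_bins - 1)
        counts = np.bincount(bins, minlength=n_bins)
        sums = np.bincount(bins, weights=flux, minlength=n_bins)
        # empty bins (rare here) fall back to the mean, contributing 0 depth
        means = np.where(counts > 0, sums / np.maximum(counts, 1), flux.mean())
        power[i] = flux.mean() - means.min()
    return power

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 180, 3000))             # 180-day season, days
flux = 1.0 + rng.normal(scale=0.002, size=t.size)  # 2 mmag noise
flux[(t % 2.5) / 2.5 < 0.02] -= 0.01               # P=2.5 d, 1%-deep box

periods = np.arange(0.5, 20.0, 0.0025)             # grid includes 2.5
power = bls_power(t, flux, periods)
best = periods[np.argmax(power)]
print(round(float(best), 2))                       # 2.5 or a multiple of it
```

A box signal also produces power at integer multiples of the true period, which is why period-recovery studies like this one must count harmonic detections carefully when quoting completeness.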

  3. LSST and the Epoch of Reionization Experiments

    NASA Astrophysics Data System (ADS)

    Ivezić, Željko

    2018-05-01

    The Large Synoptic Survey Telescope (LSST), a next generation astronomical survey, sited on Cerro Pachon in Chile, will provide an unprecedented amount of imaging data for studies of the faint optical sky. The LSST system includes an 8.4m (6.7m effective) primary mirror and a 3.2 Gigapixel camera with a 9.6 sq. deg. field of view. This system will enable about 10,000 sq. deg. of sky to be covered twice per night, every three to four nights on average, with typical 5-sigma depth for point sources of r = 24.5 (AB). With over 800 observations in the ugrizy bands over a 10-year period, these data will enable coadded images reaching r = 27.5 (about 5 magnitudes deeper than SDSS) as well as studies of faint time-domain astronomy. The measured properties of newly discovered and known astrometric and photometric transients will be publicly reported within 60 sec after closing the shutter. The resulting hundreds of petabytes of imaging data for about 40 billion objects will be used for scientific investigations ranging from the properties of near-Earth asteroids to characterizations of dark matter and dark energy. For example, simulations estimate that LSST will discover about 1,000 quasars at redshifts exceeding 7; this sample will place tight constraints on the cosmic environment at the end of the reionization epoch. In addition to a brief introduction to LSST, I review the value of LSST data in support of epoch of reionization experiments and discuss how international participants can join LSST.

  4. Photometric Redshifts with the LSST: Evaluating Survey Observing Strategies

    NASA Astrophysics Data System (ADS)

    Graham, Melissa L.; Connolly, Andrew J.; Ivezić, Željko; Schmidt, Samuel J.; Jones, R. Lynne; Jurić, Mario; Daniel, Scott F.; Yoachim, Peter

    2018-01-01

    In this paper we present and characterize a nearest-neighbors color-matching photometric redshift estimator that features a direct relationship between the precision and accuracy of the input magnitudes and the output photometric redshifts. This aspect makes our estimator an ideal tool for evaluating the impact of changes to LSST survey parameters that affect the measurement errors of the photometry, which is the main motivation of our work (i.e., it is not intended to provide the “best” photometric redshifts for LSST data). We show how the photometric redshifts will improve with time over the 10 year LSST survey and confirm that the nominal distribution of visits per filter provides the most accurate photo-z results. The LSST survey strategy naturally produces observations over a range of airmass, which offers the opportunity of using an SED- and z-dependent atmospheric effect on the observed photometry as a color-independent redshift indicator. We show that measuring this airmass effect and including it as a prior has the potential to improve the photometric redshifts and can ameliorate extreme outliers, but that it will only be adequately measured for the brightest galaxies, which limits its overall impact on LSST photometric redshifts. We furthermore demonstrate how this airmass effect can induce a bias in the photo-z results, and caution against survey strategies that prioritize high-airmass observations for the purpose of improving this prior. Ultimately, we intend for this work to serve as a guide for the expectations and preparations of the LSST science community with regard to the minimum quality of photo-z as the survey progresses.
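The estimator's key property — photo-z quality tracking the quality of the input photometry — is easiest to see in a bare-bones nearest-neighbor color matcher. This sketch is not the authors' code: the brute-force distance computation, the k=10 choice, and the toy monotonic color-redshift relation are all invented for illustration.

```python
import numpy as np

def knn_photoz(train_colors, train_z, query_colors, k=10):
    """Nearest-neighbor color-matching photo-z sketch: estimate each
    query galaxy's redshift as the mean redshift of its k nearest
    neighbors in color space (brute-force distances for clarity)."""
    d2 = ((query_colors[:, None, :] - train_colors[None, :, :]) ** 2).sum(-1)
    nn = np.argsort(d2, axis=1)[:, :k]
    return train_z[nn].mean(axis=1)

# Toy universe where color is a clean, monotonic function of redshift.
rng = np.random.default_rng(3)
z_train = rng.uniform(0, 2, 2000)
colors_train = np.column_stack([2.0 * z_train, -1.0 * z_train])
colors_train += rng.normal(scale=0.02, size=colors_train.shape)  # photometric noise

z_query = np.array([0.5, 1.5])
colors_query = np.column_stack([2.0 * z_query, -1.0 * z_query])
z_phot = knn_photoz(colors_train, z_train, colors_query)
print(np.round(z_phot, 2))  # close to [0.5, 1.5]
```

Raising the `scale` of the injected color noise directly degrades the recovered redshifts, which is exactly the photometry-to-photo-z coupling that makes this family of estimators useful for survey-strategy studies.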

  5. The Search for Transients and Variables in the LSST Pathfinder Survey

    NASA Astrophysics Data System (ADS)

    Gorsuch, Mary Katherine; Kotulla, Ralf

    2018-01-01

    This research was completed during participation in the NSF-REU program at the University of Wisconsin-Madison. Two fields of a few square degrees, close to the galactic plane, were imaged on the WIYN 3.5 meter telescope during the commissioning of the One Degree Imager (ODI) focal plane. These images were taken as repeated, shorter exposures in order to model an LSST-like cadence, with the goal of identifying transient and variable light sources. We used Source Extractor to generate a catalog of all sources in each exposure and inserted these data into a larger photometry database comprising all exposures for each field. A Python code was developed to analyze the data and isolate sources of interest from the large data set. We found some discrepancies in the data, which led to some interesting results that we are investigating further. Variable and transient sources, while relatively well understood, are not numerous in current cataloging systems. Cataloging them will be a major undertaking of the Large Synoptic Survey Telescope (LSST), to which this project is a precursor. Locating these sources may give us a better understanding of where they are located and how they impact their surroundings.
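A standard first cut for isolating variables in a multi-epoch photometry database like the one described is a reduced chi-square test: flag sources whose scatter across exposures exceeds what their quoted errors predict. The sketch below uses simulated magnitudes and an invented threshold, not the project's actual pipeline.

```python
import numpy as np

def find_variables(mags, mag_errs, threshold=3.0):
    """Flag sources whose multi-epoch scatter exceeds the quoted
    photometric errors: reduced chi^2 about the weighted mean."""
    w = 1.0 / mag_errs**2
    mean = (w * mags).sum(axis=1) / w.sum(axis=1)
    chi2 = (w * (mags - mean[:, None]) ** 2).sum(axis=1)
    n_epochs = mags.shape[1]
    return chi2 / (n_epochs - 1) > threshold

rng = np.random.default_rng(7)
n_src, n_epochs = 500, 40
errs = np.full((n_src, n_epochs), 0.02)             # 20 mmag per epoch
mags = 18.0 + rng.normal(scale=0.02, size=(n_src, n_epochs))
# Implant a sinusoidal variable of 0.2 mag amplitude in source 0.
mags[0] += 0.2 * np.sin(np.linspace(0, 6 * np.pi, n_epochs))
mask = find_variables(mags, errs)
print(bool(mask[0]), int(mask.sum()))
```

Sources passing this cut would then go to period-finding or transient classification; the chi-square stage exists only to shrink the candidate list cheaply.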

  6. The LSSTC Data Science Fellowship Program

    NASA Astrophysics Data System (ADS)

    Miller, Adam; Walkowicz, Lucianne; LSSTC DSFP Leadership Council

    2017-01-01

    The Large Synoptic Survey Telescope Corporation (LSSTC) Data Science Fellowship Program (DSFP) is a unique professional development program for astronomy graduate students. DSFP students complete a series of six, one-week long training sessions over the course of two years. The sessions are cumulative, each building on the last, to allow an in-depth exploration of the topics covered: data science basics, statistics, image processing, machine learning, scalable software, data visualization, time-series analysis, and science communication. The first session was held in Aug 2016 at Northwestern University, with all materials and lectures publicly available via github and YouTube. Each session focuses on a series of technical problems which are written in iPython notebooks. The initial class of fellows includes 16 students selected from across the globe, while an additional 14 fellows will be added to the program in year 2. Future sessions of the DSFP will be hosted by a rotating cast of LSSTC member institutions. The DSFP is designed to supplement graduate education in astronomy by teaching the essential skills necessary for dealing with big data, serving as a resource for all in the LSST era. The LSSTC DSFP is made possible by the generous support of the LSST Corporation, the Data Science Initiative (DSI) at Northwestern, and CIERA.

  7. LSST Resources for the Community

    NASA Astrophysics Data System (ADS)

    Jones, R. Lynne

    2011-01-01

    LSST will generate 100 petabytes of images and 20 petabytes of catalogs, covering 18,000-20,000 square degrees sampled every few days over a total of ten years -- all publicly available and exquisitely calibrated. The primary access to this data will be through Data Access Centers (DACs). DACs will provide access to catalogs of sources (single detections from individual images) and objects (associations of sources from multiple images). Simple user interfaces or direct SQL queries at the DAC can return user-specified portions of data from catalogs or images. More complex manipulations of the data, such as calculating multi-point correlation functions or creating alternative photo-z measurements on terabyte-scale data, can be completed with the DAC's own resources. Even more data-intensive computations requiring access to large numbers of image pixels at the petabyte scale could also be conducted at the DAC, using compute resources allocated in a manner similar to a TAC. DAC resources will be available to all individuals in member countries or institutes and LSST science collaborations. DACs will also assist investigators with requests for allocations at national facilities such as the Petascale Computing Facility, TeraGrid, and Open Science Grid. Using data on this scale requires new approaches to accessibility and analysis, which are being developed through interactions with the LSST Science Collaborations. We are producing simulated images (as might be acquired by LSST) based on models of the universe and generating catalogs from these images (as well as from the base model) using the LSST data management framework in a series of data challenges. The resulting images and catalogs are being made available to the science collaborations to verify the algorithms and develop user interfaces. All LSST software is open source and available online, including preliminary catalog formats. We encourage feedback from the community.
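    For illustration, a toy version of the kind of catalog query the DACs are described as supporting, run against an in-memory SQLite database. The table and column names here are hypothetical stand-ins, not the actual LSST schema.

```python
import sqlite3

# Build a tiny stand-in "object" catalog in memory.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE object (objectId INTEGER, ra REAL, dec REAL, rMag REAL)")
con.executemany("INSERT INTO object VALUES (?, ?, ?, ?)", [
    (1, 150.02, 2.21, 22.3),
    (2, 150.10, 2.19, 19.8),
    (3, 210.55, -11.4, 21.1),
])

# A user-specified portion of the catalog: a rectangular sky cut plus a
# magnitude limit, expressed as an ordinary parameterized SQL query.
rows = con.execute(
    "SELECT objectId, rMag FROM object "
    "WHERE ra BETWEEN ? AND ? AND dec BETWEEN ? AND ? AND rMag < ?",
    (149.9, 150.2, 2.1, 2.3, 23.0),
).fetchall()
```

    A real DAC query would use the survey's published catalog schema and spatial indexing rather than a plain BETWEEN cut, but the access pattern is the same: declarative SQL returning only the requested slice of the catalog.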

  8. Measuring the Growth Rate of Structure with Type IA Supernovae from LSST

    NASA Astrophysics Data System (ADS)

    Howlett, Cullan; Robotham, Aaron S. G.; Lagos, Claudia D. P.; Kim, Alex G.

    2017-10-01

    We investigate the peculiar motions of galaxies up to z = 0.5 using Type Ia supernovae (SNe Ia) from the Large Synoptic Survey Telescope (LSST) and predict the subsequent constraints on the growth rate of structure. We consider two cases. Our first is based on measurements of the volumetric SNe Ia rate and assumes we can obtain spectroscopic redshifts and light curves for varying fractions of objects that are detected pre-peak luminosity by LSST (some of which may be obtained by LSST itself, and others that would require additional follow-up observations). We find that these measurements could produce growth rate constraints at z < 0.5 that significantly outperform those found using Redshift Space Distortions (RSD) with DESI or 4MOST, even though there are ˜4× fewer objects. For our second case, we use semi-analytic simulations and a prescription for the SNe Ia rate as a function of stellar mass and star-formation rate to predict the number of LSST SNe Ia whose host redshifts may already have been obtained with the Taipan+WALLABY surveys or with a future multi-object spectroscopic survey. We find ˜18,000 and ˜160,000 SNe Ia with host redshifts for these cases, respectively. While this is only a fraction of the total LSST-detected SNe Ia, they could be used to significantly augment and improve the growth rate constraints compared to RSD alone. Ultimately, we find that combining LSST SNe Ia with large numbers of galaxy redshifts will provide the most powerful probe of large-scale gravity in the z < 0.5 regime over the coming decades.

  9. Control system design for the large space systems technology reference platform

    NASA Technical Reports Server (NTRS)

    Edmunds, R. S.

    1982-01-01

    Structural models and classical frequency-domain control system designs were developed for the large space systems technology (LSST) reference platform, which consists of a central bus structure, solar panels, and platform arms on which a variety of experiments may be mounted. It is shown that operation of multiple independently articulated payloads on a single platform presents major problems when sub-arcsecond pointing stability is required. Experiment compatibility will be an important operational consideration for systems of this type.

  10. From Science To Design: Systems Engineering For The Lsst

    NASA Astrophysics Data System (ADS)

    Claver, Chuck F.; Axelrod, T.; Fouts, K.; Kantor, J.; Nordby, M.; Sebag, J.; LSST Collaboration

    2009-01-01

    The LSST is a universal-purpose survey telescope that will address scores of scientific missions. To assist the technical teams in converging on a specific engineering design, the LSST Science Requirements Document (SRD) selects four stressing principal science missions: 1) Constraining Dark Matter and Dark Energy; 2) taking an Inventory of the Solar System; 3) Exploring the Transient Optical Sky; and 4) mapping the Milky Way. From these four missions the SRD specifies the requirements for single images and the full 10-year survey that enable a wide range of science beyond the four principal missions. Through optical design and analysis, operations simulation, and throughput modeling, the systems engineering effort in the LSST has largely focused on taking the SRD specifications and deriving system functional requirements that define the system design. A Model Based Systems Engineering approach with SysML is used to manage the flow-down of requirements from science to system function to subsystem. The rigor of requirements flow and management helps the LSST keep the overall scope, and hence budget and schedule, under control.

  11. Strong Gravitational Lensing with LSST

    NASA Astrophysics Data System (ADS)

    Marshall, Philip J.; Bradac, M.; Chartas, G.; Dobler, G.; Eliasdottir, A.; Falco, E.; Fassnacht, C. D.; Jee, M. J.; Keeton, C. R.; Oguri, M.; Tyson, J. A.; LSST Strong Lensing Science Collaboration

    2010-01-01

    LSST will find more strong gravitational lensing events than any other survey preceding it, and will monitor them all at a cadence of a few days to a few weeks. We can expect the biggest advances in strong lensing science made with LSST to be in those areas that benefit most from the large volume, and the high accuracy multi-filter time series: studies of, and using, several thousand lensed quasars and several hundred supernovae. However, the high quality imaging will allow us to detect and measure large numbers of background galaxies multiply-imaged by galaxies, groups and clusters. In this poster we give an overview of the strong lensing science enabled by LSST, and highlight the particular associated technical challenges that will have to be faced when working with the survey.

  12. Expanding the user base beyond HEP for the Ganga distributed analysis user interface

    NASA Astrophysics Data System (ADS)

    Currie, R.; Egede, U.; Richards, A.; Slater, M.; Williams, M.

    2017-10-01

    This document presents the results of recent developments within the Ganga [1] project to support users from new communities outside of HEP. In particular, I examine the case of users from the Large Synoptic Survey Telescope (LSST) group looking to use resources provided by the UK-based GridPP [2][3] DIRAC [4][5] instance. An example use case is work performed with users from the LSST Virtual Organisation (VO) to distribute the workflow used for galaxy-shape identification analyses. This work highlighted some LSST-specific challenges that could be well solved by common tools within the HEP community. As a result of this work, the LSST community was able to take advantage of GridPP resources to perform large computing tasks within the UK.

  13. Big Data Science Cafés: High School Students Experiencing Real Research with Scientists

    NASA Astrophysics Data System (ADS)

    Walker, C. E.; Pompea, S. M.

    2017-12-01

    The Education and Public Outreach group at the National Optical Astronomy Observatory has designed an outside-of-school education program to excite the interest of talented youth in future projects like the Large Synoptic Survey Telescope (LSST) and the NOAO (archival) Data Lab -- their data approaches and key science projects. Originally funded by the LSST Corporation, the program cultivates talented youth to enter STEM disciplines and serves as a model to disseminate to the 40+ institutions involved in LSST. One Saturday a month during the academic year, high school students have the opportunity to interact with expert astronomers who work with large astronomical data sets in their scientific work. Students learn about killer asteroids, the birth and death of stars, colliding galaxies, the structure of the universe, gravitational waves, dark energy, dark matter, and more. The format for the Saturday science cafés has been a short presentation, discussion (plus food), a computer lab activity, and more discussion. The cafés last about 2.5 hours and are planned by a group of interested local high school students, an undergraduate student coordinator, the presenting astronomers, the program director, and an evaluator. High school youth leaders help ensure an enjoyable and successful program for fellow students: they assist with the activities and help evaluate how well each science café went, and their remarks shape the next café and improve the program. The experience offers youth leaders ownership of the program and opportunities to take on responsibilities and learn leadership and communication skills, as well as fostering their continued interest in STEM. The prototype Big Data Science Academy was implemented successfully in spring 2017 and engaged almost 40 teens from greater Tucson in the fundamentals of astronomy concepts and research. As with any first implementation there were bumps; however, staff, scientists, and student leaders all stepped up to make the program a success. The project achieved many of its goals with a relatively small budget, providing value not only to the student leaders and student attendees, but to the scientists and staff as well. Staff learned what worked and what needed more fine-tuning to successfully launch and run a big data academy for teens in the years that follow.

  14. LSST Astroinformatics And Astrostatistics: Data-oriented Astronomical Research

    NASA Astrophysics Data System (ADS)

    Borne, Kirk D.; Stassun, K.; Brunner, R. J.; Djorgovski, S. G.; Graham, M.; Hakkila, J.; Mahabal, A.; Paegert, M.; Pesenson, M.; Ptak, A.; Scargle, J.; LSST Informatics and Statistics Team

    2011-01-01

    The LSST Informatics and Statistics Science Collaboration (ISSC) focuses on research and scientific discovery challenges posed by the very large and complex data collection that LSST will generate. Application areas include astroinformatics, machine learning, data mining, astrostatistics, visualization, scientific data semantics, time series analysis, and advanced signal processing. Research problems to be addressed with these methodologies include transient event characterization and classification, rare class discovery, correlation mining, outlier/anomaly/surprise detection, improved estimators (e.g., for photometric redshift or early-onset supernova classification), exploration of highly dimensional (multivariate) data catalogs, and more. We present sample science results from these data-oriented approaches to large-data astronomical research, including results from LSST ISSC team members: the EB (Eclipsing Binary) Factory, environmental variations in the fundamental plane of elliptical galaxies, and outlier detection in multivariate catalogs.

  15. Evaluation of Potential LSST Spatial Indexing Strategies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nikolaev, S; Abdulla, G; Matzke, R

    2006-10-13

    The LSST requirement for producing alerts in near real-time, and the fact that generating an alert depends on knowing the history of light variations for a given sky position, together imply that the clustering information for all detections must be available at any time during the survey. Therefore, any data structure describing the clustering of detections in LSST needs to be continuously updated, even as new detections arrive from the pipeline. We call this use case ''incremental clustering'', to reflect this continuous updating of clustering information. This document describes the evaluation results for several potential LSST incremental clustering strategies, using: (1) a Neighbors table and zone optimization to store spatial clusters (a.k.a. Jim Gray's, or SDSS, algorithm); (2) the MySQL built-in R-tree implementation; (3) an external spatial index library that supports a query interface.
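    A minimal sketch of the declination-zone idea behind strategy (1), with illustrative constants; a real implementation tunes the zone height to the match radius and follows the zone cut with an exact angular-distance test.

```python
ZONES_PER_DEG = 20  # 0.05-degree zones (an assumed, illustrative height)

def zone_id(dec_deg):
    """Map a declination in degrees to an integer zone index
    (the Gray/SDSS zone trick: bucket the sky into declination strips)."""
    return int((dec_deg + 90.0) * ZONES_PER_DEG)

def candidate_zones(dec_deg, radius_deg):
    """Contiguous zone range a cone search of the given radius can touch.
    Candidates from these zones still need an exact distance filter."""
    lo = zone_id(max(dec_deg - radius_deg, -90.0))
    hi = zone_id(min(dec_deg + radius_deg, 90.0))
    return list(range(lo, hi + 1))
```

    The payoff is that a neighbor search becomes an indexed range scan over a small number of zones instead of a full-table spatial comparison, which is what makes the incremental-update use case tractable.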

  16. Photometric classification and redshift estimation of LSST Supernovae

    NASA Astrophysics Data System (ADS)

    Dai, Mi; Kuhlmann, Steve; Wang, Yun; Kovacs, Eve

    2018-07-01

    Supernova (SN) classification and redshift estimation using photometric data only have become very important for the Large Synoptic Survey Telescope (LSST), given the large number of SNe that LSST will observe and the impossibility of spectroscopically following up all the SNe. We investigate the performance of an SN classifier that uses SN colours to classify LSST SNe with the Random Forest classification algorithm. Our classifier results in an area-under-the-curve of 0.98, which represents excellent classification. We are able to obtain a photometric SN sample containing 99 per cent SNe Ia by choosing a probability threshold. We estimate the photometric redshifts (photo-z) of SNe in our sample by fitting the SN light curves using the SALT2 model with nested sampling. We obtain a mean bias (⟨zphot - zspec⟩) of 0.012 with σ((z_phot - z_spec)/(1 + z_spec)) = 0.0294 without using a host-galaxy photo-z prior, and a mean bias (⟨zphot - zspec⟩) of 0.0017 with σ((z_phot - z_spec)/(1 + z_spec)) = 0.0116 using a host-galaxy photo-z prior. Assuming a flat ΛCDM model with Ωm = 0.3, we obtain Ωm of 0.305 ± 0.008 (statistical errors only), using the simulated LSST sample of photometric SNe Ia (with intrinsic scatter σint = 0.11) derived using our methodology without using host-galaxy photo-z prior. Our method will help boost the power of SNe from the LSST as cosmological probes.
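    The bias and scatter statistics quoted above can be computed as follows; this is a generic sketch with toy numbers, not the authors' code.

```python
import statistics

def photoz_metrics(z_phot, z_spec):
    """Mean bias <z_phot - z_spec> and the scatter of the scaled residual
    (z_phot - z_spec) / (1 + z_spec)."""
    residuals = [p - s for p, s in zip(z_phot, z_spec)]
    scaled = [(p - s) / (1.0 + s) for p, s in zip(z_phot, z_spec)]
    bias = statistics.fmean(residuals)
    sigma = statistics.pstdev(scaled)
    return bias, sigma

# Toy photometric vs. spectroscopic redshifts.
zp = [0.31, 0.52, 0.18, 0.44]
zs = [0.30, 0.50, 0.20, 0.45]
bias, sigma = photoz_metrics(zp, zs)
```

    The 1/(1 + z_spec) scaling is the conventional normalization for photo-z residuals, so that the scatter is comparable across the full redshift range.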

  17. Systems engineering in the Large Synoptic Survey Telescope project: an application of model based systems engineering

    NASA Astrophysics Data System (ADS)

    Claver, C. F.; Selvy, Brian M.; Angeli, George; Delgado, Francisco; Dubois-Felsmann, Gregory; Hascall, Patrick; Lotz, Paul; Marshall, Stuart; Schumacher, German; Sebag, Jacques

    2014-08-01

    The Large Synoptic Survey Telescope project was an early adopter of SysML and Model Based Systems Engineering (MBSE) practices. The LSST project began using MBSE for requirements engineering in 2006, shortly after the initial release of the first SysML standard. From this early work the LSST's MBSE effort has grown to include system requirements, operational use cases, physical system definition, interfaces, and system states, along with behavior sequences and activities. In this paper we describe our approach and methodology for cross-linking these system elements over the three classical systems engineering domains -- requirements, functional and physical -- into the LSST System Architecture model. We also show how this model is used as the central element of the overall project systems engineering effort. More recently we have begun to use the cross-linked modeled system architecture to develop and plan the system verification and test process. In presenting this work we also describe "lessons learned" from several missteps the project has had with MBSE. Lastly, we conclude by summarizing the overall status of the LSST's System Architecture model and our plans for the future as the LSST heads toward construction.

  18. Commentary: Learning About the Sky Through Simulations. Chapter 34

    NASA Technical Reports Server (NTRS)

    Way, Michael J.

    2012-01-01

    The Large Synoptic Survey Telescope (LSST) simulator being built by Andy Connolly and collaborators is an impressive undertaking and should make working with LSST in the beginning stages far easier than it was initially with the Sloan Digital Sky Survey (SDSS). However, I would like to focus on an equally important problem that has not yet been discussed here, but that the community will need to address in the coming years: can we deal with the flood of data from LSST, and will we need to rethink the way we work?

  19. LSST Painting Risk Evaluation Memo

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolfe, Justin E.

    The optics subsystem is required to paint the edges of optics black where possible. Due to the risks in applying the paint LSST requests a review of the impact of removing this requirement for the filters and L3.

  20. Predicting Constraints on Ultra-Light Axion Parameters due to LSST Observations

    NASA Astrophysics Data System (ADS)

    Given, Gabriel; Grin, Daniel

    2018-01-01

    Ultra-light axions (ULAs) are a type of dark matter or dark energy candidate (depending on the mass) that are predicted to have a mass between $10^{-33}$ and $10^{-18}$ eV. The Large Synoptic Survey Telescope (LSST) is expected to provide a large number of weak lensing observations, which will lower the statistical uncertainty on the convergence power spectrum. I began work with Daniel Grin to predict how accurately the data from the LSST will be able to constrain ULA properties. I wrote Python code that takes a matter power spectrum calculated by axionCAMB and converts it to a convergence power spectrum. My code then takes derivatives of the convergence power spectrum with respect to several cosmological parameters; these derivatives will be used in Fisher Matrix analysis to determine the sensitivity of LSST observations to axion parameters.
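    A hedged sketch of the Fisher-matrix step described above. The model spectrum here is a hypothetical stand-in; the actual analysis takes the spectrum from axionCAMB plus a lensing projection. The structure of the computation, central-difference derivatives feeding F_ij = Σ_ell (∂C_ell/∂p_i)(∂C_ell/∂p_j)/Var(C_ell), is generic.

```python
def model_spectrum(ells, params):
    """Hypothetical stand-in for a convergence power spectrum C_ell(params).
    Here: a simple power law with an amplitude and a tilt."""
    amp, tilt = params
    return [amp * (ell / 100.0) ** tilt for ell in ells]

def fisher_matrix(ells, params, variances, step=1e-4):
    """Fisher matrix from central finite-difference derivatives:
    F_ij = sum_ell dC_ell/dp_i * dC_ell/dp_j / Var(C_ell)."""
    n = len(params)
    derivs = []
    for i in range(n):
        up = list(params); up[i] += step
        dn = list(params); dn[i] -= step
        hi, lo = model_spectrum(ells, up), model_spectrum(ells, dn)
        derivs.append([(h - l) / (2 * step) for h, l in zip(hi, lo)])
    return [[sum(derivs[i][k] * derivs[j][k] / variances[k]
                 for k in range(len(ells)))
             for j in range(n)] for i in range(n)]

F = fisher_matrix([100.0, 200.0, 400.0], (1.0, 0.5), [0.01, 0.01, 0.01])
```

    Inverting F then gives the forecast parameter covariance; marginalized 1-sigma errors are the square roots of the diagonal of the inverse.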

  1. Curvature wavefront sensing performance evaluation for active correction of the Large Synoptic Survey Telescope (LSST).

    PubMed

    Manuel, Anastacia M; Phillion, Donald W; Olivier, Scot S; Baker, Kevin L; Cannon, Brice

    2010-01-18

    The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror, modified Paul-Baker design, with an 8.4-m primary mirror, a 3.4-m secondary, and a 5.0-m tertiary, along with three refractive corrector lenses to produce a flat focal plane with a field of view of 9.6 square degrees. In order to maintain image quality during operation, the deformations and rigid body motions of the three large mirrors must be actively controlled to minimize optical aberrations, which arise primarily from forces due to gravity and thermal expansion. We describe the methodology for measuring the telescope aberrations using a set of curvature wavefront sensors located in the four corners of the LSST camera focal plane. We present a comprehensive analysis of the wavefront sensing system, including the availability of reference stars, demonstrating that this system will perform to the specifications required to meet the LSST performance goals.

  2. LSST Data Management

    NASA Astrophysics Data System (ADS)

    O'Mullane, William; LSST Data Management Team

    2018-01-01

    The Large Synoptic Survey Telescope (LSST) is an 8-m optical ground-based telescope being constructed on Cerro Pachon in Chile. LSST will survey half the sky every few nights in six optical bands. The data will be transferred to the data center in North America, and within 60 seconds it will be reduced using difference imaging and an alert list generated for the community. Additionally, annual data releases will be constructed from all the data taken during the 10-year mission, producing catalogs and deep co-added images with unprecedented time resolution for such a large region of sky. In this paper we present the current status of the LSST stack, including the data processing components, the Qserv database, and the data visualization software; describe how to obtain it; and provide a summary of the development road map. We also discuss the move to Python 3 and the timeline for dropping Python 2.

  3. LSST summit facility construction progress report: reacting to design refinements and field conditions

    NASA Astrophysics Data System (ADS)

    Barr, Jeffrey D.; Gressler, William; Sebag, Jacques; Seriche, Jaime; Serrano, Eduardo

    2016-07-01

    The civil work, site infrastructure and buildings for the summit facility of the Large Synoptic Survey Telescope (LSST) are among the first major elements that need to be designed, bid and constructed to support the subsequent integration of the dome, telescope, optics, camera and supporting systems. As the contracts for those other major subsystems now move forward under the management of the LSST Telescope and Site (T and S) team, there has been inevitable and beneficial evolution in their designs, which has resulted in significant modifications to the facility and infrastructure. The earliest design requirements for the LSST summit facility were first documented in 2005, its contracted full design was initiated in 2010, and construction began in January, 2015. During that entire development period, and extending now roughly halfway through construction, there continue to be necessary modifications to the facility design resulting from the refinement of interfaces to other major elements of the LSST project and now, during construction, due to unanticipated field conditions. Changes from evolving interfaces have principally involved the telescope mount, the dome and mirror handling/coating facilities which have included significant variations in mass, dimensions, heat loads and anchorage conditions. Modifications related to field conditions have included specifying and testing alternative methods of excavation and contending with the lack of competent rock substrate where it was predicted to be. While these and other necessary changes are somewhat specific to the LSST project and site, they also exemplify inherent challenges related to the typical timeline for the design and construction of astronomical observatory support facilities relative to the overall development of the project.

  4. Photometric classification and redshift estimation of LSST Supernovae

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Mi; Kuhlmann, Steve; Wang, Yun

    Supernova (SN) classification and redshift estimation using photometric data only have become very important for the Large Synoptic Survey Telescope (LSST), given the large number of SNe that LSST will observe and the impossibility of spectroscopically following up all the SNe. We investigate the performance of an SN classifier that uses SN colours to classify LSST SNe with the Random Forest classification algorithm. Our classifier results in an area-under-the-curve of 0.98, which represents excellent classification. We are able to obtain a photometric SN sample containing 99 percent SNe Ia by choosing a probability threshold. We estimate the photometric redshifts (photo-z) of SNe in our sample by fitting the SN light curves using the SALT2 model with nested sampling. We obtain a mean bias (⟨zphot - zspec⟩) of 0.012 with σ((z_phot - z_spec)/(1 + z_spec)) = 0.0294 without using a host-galaxy photo-z prior, and a mean bias (⟨zphot - zspec⟩) of 0.0017 with σ((z_phot - z_spec)/(1 + z_spec)) = 0.0116 using a host-galaxy photo-z prior. Assuming a flat ΛCDM model with Ωm = 0.3, we obtain Ωm of 0.305 ± 0.008 (statistical errors only), using the simulated LSST sample of photometric SNe Ia (with intrinsic scatter σint = 0.11) derived using our methodology without using host-galaxy photo-z prior. Our method will help boost the power of SNe from the LSST as cosmological probes.

  5. The LSST Data Mining Research Agenda

    NASA Astrophysics Data System (ADS)

    Borne, K.; Becla, J.; Davidson, I.; Szalay, A.; Tyson, J. A.

    2008-12-01

    We describe features of the LSST science database that are amenable to scientific data mining, object classification, outlier identification, anomaly detection, image quality assurance, and survey science validation. The data mining research agenda includes: scalability (at petabyte scales) of existing machine learning and data mining algorithms; development of grid-enabled parallel data mining algorithms; designing a robust system for brokering classifications from the LSST event pipeline (which may produce 10,000 or more event alerts per night); multi-resolution methods for exploration of petascale databases; indexing of multi-attribute, multi-dimensional astronomical databases (beyond spatial indexing) for rapid querying of petabyte databases; and more.

  6. On the Detectability of Planet X with LSST

    NASA Astrophysics Data System (ADS)

    Trilling, David E.; Bellm, Eric C.; Malhotra, Renu

    2018-06-01

    Two planetary-mass objects in the far outer solar system—collectively referred to here as Planet X—have recently been hypothesized to explain the orbital distribution of distant Kuiper Belt Objects. Neither planet is thought to be exceptionally faint, but the sky locations of these putative planets are poorly constrained. Therefore, a wide-area survey is needed to detect these possible planets. The Large Synoptic Survey Telescope (LSST) will carry out an unbiased, large-area (around 18,000 deg²), deep (limiting magnitude of individual frames of 24.5) survey (the “wide-fast-deep (WFD)” survey) of the southern sky beginning in 2022, and it will therefore be an important tool in searching for these hypothesized planets. Here, we explore the effectiveness of LSST as a search platform for these possible planets. Assuming the current baseline cadence (which includes the WFD survey plus additional coverage), we estimate that LSST will confidently detect or rule out the existence of Planet X in 61% of the entire sky. At orbital distances up to ∼75 au, Planet X could simply be found in the normal nightly moving-object processing; at larger distances, it will require custom data processing. We also discuss the implications of a nondetection of Planet X in LSST data.

  7. Machine Learning-based Transient Brokers for Real-time Classification of the LSST Alert Stream

    NASA Astrophysics Data System (ADS)

    Narayan, Gautham; Zaidi, Tayeb; Soraisam, Monika; ANTARES Collaboration

    2018-01-01

    The number of transient events discovered by wide-field time-domain surveys already far outstrips the combined follow-up resources of the astronomical community. This number will only increase as we progress towards the commissioning of the Large Synoptic Survey Telescope (LSST), breaking the community's current follow-up paradigm. Transient brokers -- software to sift through, characterize, annotate and prioritize events for follow-up -- will be a critical tool for managing alert streams in the LSST era. Developing the algorithms that underlie the brokers, and obtaining simulated LSST-like datasets to train and test those algorithms prior to LSST commissioning, are formidable, though not insurmountable, challenges. The Arizona-NOAO Temporal Analysis and Response to Events System (ANTARES) is a joint project of the National Optical Astronomy Observatory and the Department of Computer Science at the University of Arizona. We have been developing completely automated methods to characterize and classify variable and transient events from their multiband optical photometry. We describe the hierarchical ensemble machine learning algorithm we are developing, test its performance on sparse, unevenly sampled, heteroskedastic data from various existing observational campaigns, and report our progress towards incorporating these methods into a real-time event broker working on live alert streams from time-domain surveys.

  8. The Large Synoptic Survey Telescope: Projected Near-Earth Object Discovery Performance

    NASA Technical Reports Server (NTRS)

    Chesley, Steven R.; Veres, Peter

    2016-01-01

    The Large Synoptic Survey Telescope (LSST) is a large-aperture, wide-field survey that has the potential to detect millions of asteroids. LSST is under construction, with survey operations slated to begin in 2022. We describe an independent study to assess the performance of LSST for detecting and cataloging near-Earth objects (NEOs). A significant component of the study is to assess the survey's ability to link observations of a single object from among the large numbers of false detections and detections of other objects. We will also explore the survey's basic performance in terms of the fraction of NEOs discovered and cataloged, both for the planned baseline survey and for enhanced surveys that are more carefully tuned for NEO search, generally at the expense of other science drivers. Preliminary results indicate that, with successful linkage, under the current baseline survey LSST would discover approximately 65% of NEOs with absolute magnitude H < 22, which corresponds approximately to a 140-m diameter.

  9. An optical to IR sky brightness model for the LSST

    NASA Astrophysics Data System (ADS)

    Yoachim, Peter; Coughlin, Michael; Angeli, George Z.; Claver, Charles F.; Connolly, Andrew J.; Cook, Kem; Daniel, Scott; Ivezić, Željko; Jones, R. Lynne; Petry, Catherine; Reuter, Michael; Stubbs, Christopher; Xin, Bo

    2016-07-01

    To optimize the observing strategy of a large survey such as the LSST, one needs an accurate model of the night-sky emission spectrum across a range of atmospheric conditions and from the near-UV to the near-IR. We have used the ESO SkyCalc Sky Model Calculator [1, 2] to construct a library of template spectra for the Chilean night sky. The ESO model includes emission from the upper and lower atmosphere, scattered starlight, scattered moonlight, and zodiacal light. We have then extended the ESO templates with an empirical fit to the twilight sky emission as measured by a Canon all-sky camera installed at the LSST site. With the ESO templates and our twilight model we can quickly interpolate to any arbitrary sky position and date and return the full sky spectrum or surface brightness magnitudes in the LSST filter system. Comparing our model to all-sky observations, we find typical residual RMS values of ±0.2-0.3 magnitudes per square arcsecond.
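    A toy illustration of the template-interpolation step: the grid values and the single airmass axis here are invented for illustration, whereas the actual model interpolates full spectra over several sky coordinates and observing conditions.

```python
def interp_sky_mag(airmass, grid):
    """Linearly interpolate a sky surface-brightness magnitude from a small
    template grid keyed by airmass (a stand-in for the spectral library)."""
    xs = sorted(grid)
    if airmass <= xs[0]:
        return grid[xs[0]]  # clamp below the grid
    if airmass >= xs[-1]:
        return grid[xs[-1]]  # clamp above the grid
    for lo, hi in zip(xs, xs[1:]):
        if lo <= airmass <= hi:
            frac = (airmass - lo) / (hi - lo)
            return grid[lo] + frac * (grid[hi] - grid[lo])

# Toy r-band sky brightness values in mag/arcsec^2 (illustrative only).
grid = {1.0: 21.2, 1.5: 20.9, 2.0: 20.5}
```

    The real model returns either a full interpolated spectrum or per-filter surface-brightness magnitudes; the lookup-then-interpolate pattern is what makes the evaluation fast enough for scheduler optimization.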

  10. LSST Camera Optics Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riot, V J; Olivier, S; Bauman, B

    2012-05-24

    The Large Synoptic Survey Telescope (LSST) uses a novel three-mirror telescope design feeding a camera system that includes a set of broad-band filters and three refractive corrector lenses to produce a flat field at the focal plane with a wide field of view. The optical design of the camera lenses and filters is integrated with the optical design of the telescope mirrors to optimize performance. We discuss the rationale for the LSST camera optics design, describe the methodology for fabricating, coating, mounting and testing the lenses and filters, and present the results of detailed analyses demonstrating that the camera optics will meet their performance goals.

  11. Probing LSST's Ability to Detect Planets Around White Dwarfs

    NASA Astrophysics Data System (ADS)

    Cortes, Jorge; Kipping, David

    2018-01-01

    Over the last four years more than 2,000 planets outside our solar system have been discovered, motivating us to search for and characterize potentially habitable worlds. Most planets orbit Sun-like stars, but more exotic stars can also host planets. Debris disks and disintegrating planetary bodies have been detected around white dwarf stars, the inert, Earth-sized cores of once-thriving stars like our Sun. These detections are clues that planets may exist around white dwarfs. Due to the faintness of white dwarfs and the potential rarity of planets around them, a vast survey is required to have a chance at detecting these planetary systems. The Large Synoptic Survey Telescope (LSST), scheduled to commence operations in 2023, will image the entire southern sky every few nights for 10 years, providing our first real opportunity to detect planets around white dwarfs. We characterized LSST’s ability to detect planets around white dwarfs through simulations that incorporate realistic models for LSST’s observing strategy and the white dwarf distribution within the Milky Way galaxy. This was done through the use of LSST's Operations Simulator (OpSim) and Catalog Simulator (CatSim). Our preliminary results indicate that, if all white dwarfs were to possess a planet, LSST would yield a detection for every 100 observed white dwarfs. In the future, a larger set of ongoing simulations will help us quantify the number of planets LSST could potentially find.

  12. OCTOCAM: A Workhorse Instrument for the Gemini Telescopes During the Era of LSST

    NASA Astrophysics Data System (ADS)

    Roming, Peter; van der Horst, Alexander; OCTOCAM Team

    2018-01-01

    The 2020s are planned to be an era of large surveys and giant telescopes. A trademark of this era will be the large number of interesting objects observed daily by high-cadence surveys, such as the LSST. Because of the sheer numbers, only a very small fraction of these interesting objects will be observed with extremely large telescopes. The follow-up workhorses during this era will be the 8-meter-class telescopes and the instruments prepared to pursue these interesting objects. One such workhorse instrument is OCTOCAM, a highly efficient instrument designed to probe the time-domain window with simultaneous broad-wavelength coverage. OCTOCAM optimizes the use of Gemini for broadband imaging and spectroscopic single-target observations. The instrument is designed for high temporal resolution, broad spectral coverage, and moderate spectral resolution. OCTOCAM was selected as part of the Gemini instrumentation program in early 2017. Here we provide a description of the science cases to be addressed, the overall instrument design, and its current status.

  13. Mapping Near-Earth Hazards

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2016-06-01

    How can we hunt down all the near-Earth asteroids that are capable of posing a threat to us? A new study looks at whether the upcoming Large Synoptic Survey Telescope (LSST) is up to the job.

    Charting Nearby Threats

    LSST is an 8.4-m wide-survey telescope currently being built in Chile. When it goes online in 2022, it will spend the next ten years surveying our sky, mapping tens of billions of stars and galaxies, searching for signatures of dark energy and dark matter, and hunting for transient optical events like novae and supernovae. But in its scanning, LSST will also be looking for asteroids that approach near Earth.

    [Figure: Cumulative number of near-Earth asteroids discovered over time, as of June 16, 2016. NASA/JPL/Chamberlin]

    Near-Earth objects (NEOs) have the potential to be hazardous if they cross Earth's path and are large enough to do significant damage when they impact Earth. Earth's history is riddled with dangerous asteroid encounters, including the recent Chelyabinsk airburst in 2013, the encounter that carved the kilometer-sized Meteor Crater in Arizona, and the impact thought to have contributed to the extinction of the dinosaurs.

    Recognizing the potential danger that NEOs can pose to Earth, Congress has tasked NASA with tracking down 90% of NEOs larger than 140 meters in diameter. With our current survey capabilities, we believe we've discovered roughly 25% of these NEOs thus far. Now a new study led by Tommy Grav (Planetary Science Institute) examines whether LSST will be able to complete this task.

    [Figure: Absolute magnitude, H, of a synthetic NEO population. Though these NEOs are all larger than 140 m, they have a large spread in albedos. Grav et al. 2016]

    Can LSST Help?

    Based on previous observations of NEOs and resulting predictions for NEO properties and orbits, Grav and collaborators simulate a synthetic population of NEOs, all larger than 140 m in size. With these improved population models, they demonstrate that the common tactic of using an asteroid's absolute magnitude as a proxy for its size is a poor approximation, due to asteroids' large spread in albedos. The authors show that roughly 23% of NEOs larger than 140 m have absolute magnitudes fainter than H = 22 mag, the value usually assumed as the default absolute magnitude of a 140 m NEO.

    [Figure: Fraction of NEOs we've detected as a function of time, based on the authors' simulations of the current surveys (red), LSST plus the current surveys (black), NEOCam plus the current surveys (blue), and the combined result for all surveys (green). Grav et al. 2016]

    Taking this into account, Grav and collaborators then use information about the planned LSST survey strategies and detection limits to test what fraction of this synthetic NEO population LSST will be able to detect in its proposed 10-year mission. The authors find that, within 10 years, LSST will likely be able to detect only 63% of NEOs larger than 140 m. Luckily, LSST may not have to work alone; in addition to the current surveys in operation, a proposed infrared space-based survey mission called NEOCam is planned for launch in 2021. If NEOCam is funded, it will complement LSST's discovery capabilities, potentially allowing the two surveys to jointly achieve the 90% detection goal within a decade.

    Citation: T. Grav et al. 2016, AJ, 151, 172. doi:10.3847/0004-6256/151/6/172
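The albedo degeneracy highlighted in the study follows directly from the standard conversion between an asteroid's absolute magnitude H and its diameter, D = (1329 km / sqrt(p_V)) * 10^(-H/5). A minimal Python sketch of that relation (the formula is the standard one; the sample albedo values are illustrative only):

```python
import math

def neo_diameter_km(H, albedo):
    """Standard asteroid size relation: D = (1329 km / sqrt(p_V)) * 10^(-H/5)."""
    return 1329.0 / math.sqrt(albedo) * 10 ** (-H / 5.0)

# At the commonly assumed H = 22 mag, the implied size depends strongly on albedo:
for p in (0.03, 0.14, 0.50):  # dark, "typical", and bright surfaces (illustrative)
    print(f"p_V = {p:.2f}: D = {neo_diameter_km(22, p) * 1000:.0f} m")
```

At H = 22, plausible albedos imply diameters from roughly 75 m (bright) to over 300 m (dark), which is why H alone is a poor size proxy.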

  14. Final Technical Report for DE-SC0012297

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dell'Antonio, Ian

    This is the final report on the work performed in award DE-SC0012297, Cosmic Frontier work in support of the LSST Dark Energy Science Collaboration's effort to develop algorithms, simulations, and statistical tests that ensure optimal extraction of dark energy properties from galaxy clusters observed with LSST. This work focused on effects that could produce a systematic error in the measurement of cluster masses (which will be used to probe the effects of dark energy on the growth of structure). These effects stem from deviations from pure ellipticity of the gravitational lensing signal and from the blending of light of neighboring galaxies. Both effects are expected to be more significant for LSST than for Stage III experiments such as the Dark Energy Survey. We calculate the magnitude of the mass error (or bias) for the first time and demonstrate that it can be treated as a multiplicative correction and calibrated out, allowing cluster mass measurements from gravitational lensing to meet the requirements of LSST's dark energy investigation.

  15. Using SysML for MBSE analysis of the LSST system

    NASA Astrophysics Data System (ADS)

    Claver, Charles F.; Dubois-Felsmann, Gregory; Delgado, Francisco; Hascall, Pat; Marshall, Stuart; Nordby, Martin; Schalk, Terry; Schumacher, German; Sebag, Jacques

    2010-07-01

    The Large Synoptic Survey Telescope is a complex hardware and software system of systems, making up a highly automated observatory in the form of an 8.4 m wide-field telescope, a 3.2-billion-pixel camera, and a peta-scale data processing and archiving system. As a project, the LSST is using model-based systems engineering (MBSE) methodology for developing the overall system architecture, expressed in the Systems Modeling Language (SysML). With SysML we use a recursive process to establish three-fold relationships between requirements, logical and physical structural component definitions, and overall behavior (activities and sequences) at successively deeper levels of abstraction and detail. Using this process we have analyzed and refined the LSST system design, ensuring the consistency and completeness of the full set of requirements and their match to associated system structure and behavior. As the recursion proceeds to deeper levels, we derive more detailed requirements and specifications and ensure their traceability. We also expose, define, and specify critical system interfaces and physical and information flows, and clarify the logic and control flows governing system behavior. The resulting integrated model database is used to generate documentation and specifications, and will evolve to support activities from construction through final integration, test, and commissioning, serving as a living representation of the LSST as designed and built. We discuss the methodology and present several examples of its application to specific systems engineering challenges in the LSST design.

  16. Astroinformatics in the Age of LSST: Analyzing the Summer 2012 Data Release

    NASA Astrophysics Data System (ADS)

    Borne, Kirk D.; De Lee, N. M.; Stassun, K.; Paegert, M.; Cargile, P.; Burger, D.; Bloom, J. S.; Richards, J.

    2013-01-01

    The Large Synoptic Survey Telescope (LSST) will image the visible southern sky every three nights. This multi-band, multi-epoch survey will produce a torrent of data that traditional methods of object-by-object analysis will not be able to accommodate; hence the need for new astroinformatics tools to visualize, simulate, mine, and analyze this quantity of data. The Berkeley Center for Time-Domain Informatics (CTDI) is building the informatics infrastructure for generic light curve classification, including the innovation of new algorithms for feature generation and machine learning. The CTDI portal (http://dotastro.org) contains one of the largest collections of public light curves, with visualization and exploration tools. The group has also published the first calibrated probabilistic classification catalog of 50k variable stars, along with a data exploration portal at http://bigmacc.info. Twice a year, the LSST collaboration releases simulated LSST data in order to aid software development. This poster also showcases a suite of new tools from the Vanderbilt Initiative in Data-Intensive Astrophysics (VIDA), designed to take advantage of these large data sets. VIDA's Filtergraph interactive web tool allows one to instantly create an interactive data portal for fast, real-time visualization of large data sets. Filtergraph enables quick selection of interesting objects by easily filtering on many different columns, 2-D and 3-D representations, and on-the-fly arithmetic calculations on the data. It also makes sharing the data and the tool with collaborators easy. The EB/RRL Factory is a neural-network-based variable star classifier designed to quickly identify variable stars of several classes from LSST light curve data (currently tuned to eclipsing binaries and RR Lyrae stars) and to provide likelihood-based orbital elements or stellar parameters as appropriate. Finally, the LCsimulator software allows one to create simulated light curves of multiple types of variable stars based on an LSST cadence.

  17. Stellar Populations with the LSST

    NASA Astrophysics Data System (ADS)

    Saha, Abhijit; Olsen, K.; LSST Stellar Populations Collaboration

    2006-12-01

    The LSST will produce a multi-color map and photometric object catalog of half the sky to g ~ 27.5 (5σ). Strategically cadenced time-space sampling of each field spanning ten years will allow variability, proper motion, and parallax measurements for objects brighter than g ~ 25. As part of providing an unprecedented map of the Galaxy, the accurate multi-band photometry will permit photometric parallaxes, chemical abundances, and a handle on ages via colors at turn-off for main-sequence stars at all distances within the Galaxy, permitting a comprehensive study of star formation histories (SFH) and chemical evolution for field stars. With a geometric parallax accuracy of 1 mas, LSST will produce a robust, complete sample of solar-neighborhood stars. While delivering parallax accuracy comparable to HIPPARCOS, LSST will extend the catalog to a limit more than 10 magnitudes fainter, and will be complete to MV ~ 15. In the Magellanic Clouds too, the photometry will reach MV ~ +8, allowing the SFH and chemical signatures in the expansive outer extremities to be gleaned from their main-sequence stars. This in turn will trace the detailed interaction of the Clouds with the Galactic halo. The LSST time sampling will identify and characterize variable stars of all types, on time scales from 1 hr to several years, a feast for variable star astrophysics. Cepheids and LPVs in all galaxies in the Sculptor, M83, and Cen-A groups are obvious data products: comparative studies will reveal systematic differences with galaxy properties and help fine-tune the rungs of the distance ladder. Dwarf galaxies within 10 Mpc that are too faint to find from surface brightness enhancements will be revealed via over-densities of their red giants: this systematic census will extend the luminosity function of galaxies to the faint limit. Novae discovered by LSST time sampling will trace intergalactic stars out to the Virgo and Fornax clusters.

  18. The LSST Metrics Analysis Framework (MAF)

    NASA Astrophysics Data System (ADS)

    Jones, R. Lynne; Yoachim, Peter; Chandrasekharan, Srinivasan; Connolly, Andrew J.; Cook, Kem H.; Ivezic, Zeljko; Krughoff, K. Simon; Petry, Catherine E.; Ridgway, Stephen T.

    2015-01-01

    Studying potential observing strategies or cadences for the Large Synoptic Survey Telescope (LSST) is a complicated but important problem. To address this, LSST has created an Operations Simulator (OpSim) to create simulated surveys, including realistic weather and sky conditions. Analyzing the results of these simulated surveys for the wide variety of science cases to be considered for LSST is, however, difficult. We have created the Metrics Analysis Framework (MAF), an open-source Python framework, to be a user-friendly, customizable, and easily extensible tool to help analyze the outputs of OpSim. MAF reads the pointing history of the LSST generated by OpSim, then enables the subdivision of these pointings based on position on the sky (RA/Dec, etc.) or the characteristics of the observations (e.g., airmass or sky brightness), and a calculation of how well these observations meet a specified science objective (or metric). A simple example metric could be the mean single-visit limiting magnitude for each position in the sky; a more complex metric might be the expected astrometric precision. The output of these metrics can be generated for a full survey, for specified time intervals, or for regions of the sky, and can be easily visualized using a web interface. An important goal for MAF is to facilitate analysis of the OpSim outputs for a wide variety of science cases. A user can often write a new metric to evaluate OpSim for new science goals in less than a day once they are familiar with the framework. Some of these new metrics are illustrated in the accompanying poster, "Analyzing Simulated LSST Survey Performance With MAF". While MAF has been developed primarily for application to OpSim outputs, it can be applied to any dataset; the most obvious examples are pointing histories of other survey projects or telescopes, such as CFHT.
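The workflow described above (read pointings, subdivide them by sky position, evaluate a metric per subdivision) can be sketched in a few lines of plain Python. This is a simplified illustration of the pattern only, not the actual MAF API; the visit records and their layout are invented:

```python
import statistics
from collections import defaultdict

# Toy visit records: (sky cell index, 5-sigma limiting magnitude of that visit).
# Field layout and values are illustrative, not the real OpSim schema.
visits = [
    (0, 24.3), (0, 24.6),
    (1, 23.9), (1, 24.1), (1, 24.4),
    (2, 24.8),
]

def run_metric(visits, metric):
    """Group visits by sky cell, then apply the metric to each cell's values."""
    cells = defaultdict(list)
    for cell, m5 in visits:
        cells[cell].append(m5)
    return {cell: metric(vals) for cell, vals in cells.items()}

# The "mean single-visit limiting magnitude" example metric from the abstract.
depth_map = run_metric(visits, statistics.mean)
print(depth_map)  # per-cell mean depth, ready to plot as a sky map
```

Swapping in a different `metric` callable (e.g. `max`, or a custom astrometric-precision estimate) is what makes the framework extensible.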

  19. LSST Probes of Dark Energy: New Energy vs New Gravity

    NASA Astrophysics Data System (ADS)

    Bradshaw, Andrew; Tyson, A.; Jee, M. J.; Zhan, H.; Bard, D.; Bean, R.; Bosch, J.; Chang, C.; Clowe, D.; Dell'Antonio, I.; Gawiser, E.; Jain, B.; Jarvis, M.; Kahn, S.; Knox, L.; Newman, J.; Wittman, D.; Weak Lensing, LSST; LSS Science Collaborations

    2012-01-01

    Is the late-time acceleration of the universe due to new physics in the form of stress-energy, or to a departure from General Relativity? LSST will measure the shape, magnitude, and color of 4×10^9 galaxies to high S/N over 18,000 square degrees. These data will be used to separately measure the gravitational growth of mass structure and distance vs. redshift to unprecedented precision by combining multiple probes in a joint analysis. Of the five LSST probes of dark energy, the weak gravitational lensing (WL) and baryon acoustic oscillation (BAO) probes are particularly effective in combination. By measuring the 2-D BAO scale in ugrizy-band photometric-redshift-selected samples, LSST will determine the angular diameter distance to a dozen redshifts with sub-percent-level errors. Reconstruction of the WL shear power spectrum on linear and weakly non-linear scales, and of the cross-correlation of shear measured in different photometric redshift bins, provides a constraint on the evolution of dark energy that is complementary to the purely geometric measures provided by supernovae and BAO. Cross-correlation of the WL shear and BAO signal within redshift shells minimizes the sensitivity to systematics. LSST will also detect shear peaks, providing independent constraints. Tomographic study of the shear of background galaxies as a function of redshift allows a geometric test of dark energy. To extract the dark energy signal and distinguish between the two forms of new physics, LSST will rely on accurate stellar point-spread functions (PSF) and unbiased reconstruction of galaxy image shapes from hundreds of exposures. Although a weighted co-added deep image has high S/N, it is a form of lossy compression; Bayesian forward-modeling algorithms can in principle use all the information. We explore systematic effects on shape measurements and present tests of an algorithm called Multi-Fit, which appears to avoid PSF-induced shear systematics in a computationally efficient way.

  20. Responding to the Event Deluge

    NASA Technical Reports Server (NTRS)

    Williams, Roy D.; Barthelmy, Scott D.; Denny, Robert B.; Graham, Matthew J.; Swinbank, John

    2012-01-01

    We present the VOEventNet infrastructure for large-scale rapid follow-up of astronomical events, including selection, annotation, machine intelligence, and coordination of observations. The VOEvent standard is central to this vision, with distributed and replicated services rather than centralized facilities. We also describe some of the event brokers, services, and software that are connected to the network. These technologies will become more important in the coming years, with new event streams from Gaia, LOFAR, LIGO, LSST, and many others.

  1. Mechanical Design of the LSST Camera

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nordby, Martin; Bowden, Gordon; Foss, Mike

    2008-06-13

    The LSST camera is a tightly packaged, hermetically sealed system that is cantilevered into the main beam of the LSST telescope. It comprises three refractive lenses, on-board storage for five large filters, a high-precision shutter, and a cryostat that houses the 3.2-gigapixel CCD focal plane along with its support electronics. The physically large optics and focal plane demand large structural elements to support them, but the overall size of the camera and its components must be minimized to reduce impact on image stability. Also, focal plane and optics motions must be minimized to reduce systematic errors in image reconstruction. Design and analysis for the camera body and cryostat are detailed.

  2. Earth's Minimoons: Opportunities for Science and Technology.

    NASA Astrophysics Data System (ADS)

    Jedicke, Robert; Bolin, Bryce T.; Bottke, William F.; Chyba, Monique; Fedorets, Grigori; Granvik, Mikael; Jones, Lynne; Urrutxua, Hodei

    2018-05-01

    Twelve years ago the Catalina Sky Survey discovered Earth's first known natural geocentric object other than the Moon, a few-meter-diameter asteroid designated 2006 RH120. Despite significant improvements in ground-based asteroid surveying technology in the past decade, no other temporarily-captured orbiter (TCO; colloquially known as a minimoon) has been discovered, but the all-sky fireball system operated in the Czech Republic as part of the European Fireball Network detected a bright natural meteor that was almost certainly in a geocentric orbit before it struck Earth's atmosphere. Within a few years the Large Synoptic Survey Telescope (LSST) will either begin to regularly detect TCOs or force a re-analysis of the creation and dynamical evolution of small asteroids in the inner solar system. The first studies of the provenance, properties, and dynamics of Earth's minimoons suggested that there should be a steady-state population with about one 1- to 2-meter-diameter captured object at any time, with the number of captured meteoroids increasing exponentially toward smaller sizes. That model was then improved and extended to include the population of temporarily-captured flybys (TCFs), objects that fail to make an entire revolution around Earth while energetically bound to the Earth-Moon system. Several different techniques for discovering TCOs have been considered, but their small diameters, proximity, and rapid motion make them challenging targets for existing ground-based optical, meteor, and radar surveys. However, the LSST's tremendous light-gathering power and short exposure times could allow it to detect and discover many minimoons. We expect that if the TCO population is confirmed, and new objects are frequently discovered, they can provide new opportunities for 1) studying the dynamics of the Earth-Moon system, 2) testing models of the production and dynamical evolution of small asteroids from the asteroid belt, 3) rapid and frequent low delta-v missions to multiple minimoons, and 4) evaluating in-situ resource utilization techniques on asteroidal material. Here we review the past decade of minimoon studies in preparation for capitalizing on the scientific and commercial opportunities of TCOs in the first decade of LSST operations.

  3. Designing for Peta-Scale in the LSST Database

    NASA Astrophysics Data System (ADS)

    Kantor, J.; Axelrod, T.; Becla, J.; Cook, K.; Nikolaev, S.; Gray, J.; Plante, R.; Nieto-Santisteban, M.; Szalay, A.; Thakar, A.

    2007-10-01

    The Large Synoptic Survey Telescope (LSST), a proposed ground-based 8.4 m telescope with a 10 deg^2 field of view, will generate 15 TB of raw images every observing night. When calibration and processed data are added, the image archive, catalogs, and meta-data will grow by 15 PB yr^{-1} on average. The LSST Data Management System (DMS) must capture, process, store, index, replicate, and provide open access to this data. Alerts must be triggered within 30 s of data acquisition. Doing this in real time at these data volumes will require advances in data management, database, and file system techniques. This paper describes the design of the LSST DMS and emphasizes features for peta-scale data. The LSST DMS will employ a combination of distributed database and file systems, with schema, partitioning, and indexing oriented for parallel operations. Image files are stored in a distributed file system, with references to, and meta-data from, each file stored in the databases. The schema design supports pipeline processing, rapid ingest, and efficient query. Vertical partitioning reduces disk input/output requirements; horizontal partitioning allows parallel data access using arrays of servers and disks. Indexing is extensive, utilizing both conventional RAM-resident indexes and column-narrow, row-deep tag tables/covering indices that are extracted from tables containing many more attributes. The DMS Data Access Framework is encapsulated in a middleware framework to provide a uniform service interface to all framework capabilities. This framework will provide the automated work-flow, replication, and data analysis capabilities necessary to make data processing and data quality analysis feasible at this scale.
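The tag-table idea above (column-narrow, row-deep tables extracted from wide catalog tables so that common filters scan far fewer bytes) can be illustrated with a toy example. This sketch uses SQLite for brevity, with an invented two-table schema; the real DMS uses distributed databases and a far richer schema:

```python
import sqlite3

con = sqlite3.connect(":memory:")

# A "wide" catalog table: each row carries many attributes (stand-in column).
con.execute("""CREATE TABLE object_wide (
    id INTEGER PRIMARY KEY, ra REAL, dec REAL, mag_r REAL, extra TEXT)""")
con.executemany(
    "INSERT INTO object_wide VALUES (?, ?, ?, ?, ?)",
    [(1, 10.0, -30.0, 23.5, "many more attributes..."),
     (2, 11.0, -31.0, 24.9, "many more attributes..."),
     (3, 12.0, -32.0, 22.1, "many more attributes...")])

# Column-narrow, row-deep "tag" table: only the attributes queries filter on,
# acting like a covering index for common scans.
con.execute("CREATE TABLE object_tag AS SELECT id, ra, dec, mag_r FROM object_wide")
con.execute("CREATE INDEX tag_mag ON object_tag (mag_r)")

# Filter on the narrow table first, then fetch full rows only for survivors.
ids = [r[0] for r in con.execute("SELECT id FROM object_tag WHERE mag_r < 24.0")]
placeholders = ",".join("?" * len(ids))
rows = con.execute(
    f"SELECT * FROM object_wide WHERE id IN ({placeholders})", ids).fetchall()
print([r[0] for r in rows])  # ids of objects brighter than r = 24
```

The payoff grows with table width: the filtering scan touches only four columns regardless of how many attributes the wide table carries.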

  4. LSST telescope and site status

    NASA Astrophysics Data System (ADS)

    Gressler, William J.

    2016-07-01

    The Large Synoptic Survey Telescope (LSST) Project received its construction authorization from the National Science Foundation in August 2014. The Telescope and Site (T and S) group has made considerable progress toward completion of the subsystems required to support the scope of the LSST science mission. The LSST goal is to conduct a wide, fast, deep survey via a 3-mirror wide-field-of-view optical design, a 3.2-Gpixel camera, and an automated data processing system. The summit facility is currently under construction on Cerro Pachón in Chile, with major vendor subsystem deliveries and integration planned over the next several years. This paper summarizes the status of the activities of the T and S group, tasked with design, analysis, and construction of the summit and base facilities and the infrastructure necessary to control the survey, capture the light, and calibrate the data. All major telescope work package procurements have been awarded to vendors and are in varying stages of design and fabrication maturity and completion. The unique M1M3 primary/tertiary mirror polishing effort is complete, and the mirror now resides in storage awaiting future testing. Significant progress has been achieved on all the major telescope subsystems, including the summit facility, telescope mount assembly, dome, hexapod and rotator systems, coating plant, base facility, and the calibration telescope. In parallel, in-house efforts, including the software needed to control the observatory such as the scheduler and the active optics control, have also seen substantial advancement. The progress and status of these subsystems and future LSST plans during this construction phase are presented.

  5. Visits to Tololo | CTIO

    Science.gov Websites

    Visits are limited to two groups of 40 people. One group meets at the gatehouse at 9 AM and the other at 1 PM.

  6. Solar System science with the Large Synoptic Survey Telescope

    NASA Astrophysics Data System (ADS)

    Jones, Lynne; Brown, Mike; Ivezić, Zeljko; Jurić, Mario; Malhotra, Renu; Trilling, David

    2015-11-01

    The Large Synoptic Survey Telescope (LSST; http://lsst.org) will be a large-aperture, wide-field, ground-based telescope that will survey half the sky every few nights in six optical bands from 320 to 1050 nm. It will explore a wide range of astrophysical questions, from performing a census of the Solar System to examining the nature of dark energy. It is currently in construction, slated for first light in 2019 and full operations by 2022. The LSST will survey over 20,000 square degrees with a rapid observational cadence, to typical limiting magnitudes of r ~ 24.5 in each visit (9.6-square-degree field of view). Automated software will link the individual detections into orbits; these orbits, as well as precisely calibrated astrometry (~50 mas) and photometry (~0.01-0.02 mag) in multiple bandpasses, will be available as LSST data products. The resulting data set will have tremendous potential for planetary astronomy: multi-color catalogs of hundreds of thousands of NEOs and Jupiter Trojans, millions of asteroids, tens of thousands of TNOs, as well as thousands of other objects such as comets and irregular satellites of the major planets. LSST catalogs will increase the sample size of objects with well-known orbits 10-100 times for small-body populations throughout the Solar System, enabling a major increase in the completeness level of the inventory of most dynamical classes of small bodies and generating new insights into planetary formation and evolution. Precision multi-color photometry will allow determination of lightcurves and colors, as well as spin state and shape modeling through sparse lightcurve inversion. LSST is currently investigating survey strategies to optimize science return across a broad range of goals. To aid in this investigation, we are making a series of realistic simulated survey pointing histories available, together with a Python software package to model and evaluate survey detections for a user-defined input population. Preliminary metrics from these simulations are shown here; the community is invited to provide further input.

  7. How Many Kilonovae Can Be Found in Past, Present, and Future Survey Data Sets?

    NASA Astrophysics Data System (ADS)

    Scolnic, D.; Kessler, R.; Brout, D.; Cowperthwaite, P. S.; Soares-Santos, M.; Annis, J.; Herner, K.; Chen, H.-Y.; Sako, M.; Doctor, Z.; Butler, R. E.; Palmese, A.; Diehl, H. T.; Frieman, J.; Holz, D. E.; Berger, E.; Chornock, R.; Villar, V. A.; Nicholl, M.; Biswas, R.; Hounsell, R.; Foley, R. J.; Metzger, J.; Rest, A.; García-Bellido, J.; Möller, A.; Nugent, P.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Bechtol, K.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Castander, F. J.; Cunha, C. E.; D’Andrea, C. B.; da Costa, L. N.; Davis, C.; Doel, P.; Drlica-Wagner, A.; Eifler, T. F.; Flaugher, B.; Fosalba, P.; Gaztanaga, E.; Gerdes, D. W.; Gruen, D.; Gruendl, R. A.; Gschwend, J.; Gutierrez, G.; Hartley, W. G.; Honscheid, K.; James, D. J.; Johnson, M. W. G.; Johnson, M. D.; Krause, E.; Kuehn, K.; Kuhlmann, S.; Lahav, O.; Li, T. S.; Lima, M.; Maia, M. A. G.; March, M.; Marshall, J. L.; Menanteau, F.; Miquel, R.; Neilsen, E.; Plazas, A. A.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, M.; Smith, R. C.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, R. C.; Tucker, D. L.; Walker, A. R.; DES Collaboration

    2018-01-01

    The discovery of a kilonova (KN) associated with the Advanced LIGO (aLIGO)/Virgo event GW170817 opens up new avenues of multi-messenger astrophysics. Here, using realistic simulations, we provide estimates of the number of KNe that could be found in data from past, present, and future surveys without a gravitational-wave trigger. For the simulation, we construct a spectral time-series model based on the DES-GW multi-band light curve from the single known KN event, and we use an average of BNS rates from past studies of 10^3 Gpc^-3 yr^-1, consistent with the one event found so far. Examining past and current data sets from transient surveys, the number of KNe we expect to find for ASAS-SN, SDSS, PS1, SNLS, DES, and SMT is between 0 and 0.3. We predict the number of detections per future survey to be 8.3 from ATLAS, 10.6 from ZTF, 5.5/69 from LSST (the Deep Drilling/Wide Fast Deep), and 16.0 from WFIRST. The maximum redshift of KNe discovered for each survey is z = 0.8 for WFIRST, z = 0.25 for LSST, and z = 0.04 for ZTF and ATLAS. This maximum redshift for WFIRST is well beyond the sensitivity of aLIGO and some future GW missions. For the LSST survey, we also provide contamination estimates from Type Ia and core-collapse supernovae: after light curve and template-matching requirements, we estimate a background of just two events. More broadly, we stress that future transient surveys should consider how to optimize their search strategies to improve their detection efficiency and to consider similar analyses for GW follow-up programs.
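A volumetric rate in Gpc^-3 yr^-1 combines with a survey volume and duration to give an event budget. The order-of-magnitude sketch below (not from the paper) pairs the quoted BNS rate with a low-redshift Euclidean volume estimate out to LSST's KN horizon; it ignores detection efficiency, cadence, and cosmological corrections, so it bounds the number of mergers occurring, not the number detected. The footprint and H0 values are assumptions:

```python
import math

RATE = 1.0e3                   # BNS mergers per Gpc^3 per yr (rate quoted above)
Z_MAX = 0.25                   # LSST KN horizon from the abstract
SKY_FRACTION = 18000 / 41253   # assumed wide-fast-deep footprint in deg^2
YEARS = 10                     # survey duration

D_H = 299792.458 / 70.0 / 1000.0   # Hubble distance in Gpc (H0 = 70 assumed)
d_max = Z_MAX * D_H                # low-z approximation: d ~ z * c/H0
volume = 4.0 / 3.0 * math.pi * d_max ** 3 * SKY_FRACTION  # surveyed Gpc^3
mergers = RATE * volume * YEARS
print(f"~{mergers:.0f} BNS mergers inside z < {Z_MAX} over {YEARS} yr")
```

The result (tens of thousands of mergers versus the 69 predicted Wide Fast Deep detections) shows how strongly cadence and light curve selection cut the raw event budget.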

  8. How Many Kilonovae Can Be Found in Past, Present, and Future Survey Data Sets?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scolnic, D.; Kessler, R.; Brout, D.

    The discovery of a kilonova (KN) associated with the Advanced LIGO (aLIGO)/Virgo event GW170817 opens up new avenues of multi-messenger astrophysics. Here, using realistic simulations, we provide estimates of the number of KNe that could be found in data from past, present, and future surveys without a gravitational-wave trigger. For the simulation, we construct a spectral time-series model based on the DES-GW multi-band light curve from the single known KN event, and we use an average of BNS rates from past studies of 10^3 Gpc^-3 yr^-1, consistent with the one event found so far. Examining past and current data sets from transient surveys, the number of KNe we expect to find for ASAS-SN, SDSS, PS1, SNLS, DES, and SMT is between 0 and 0.3. We predict the number of detections per future survey to be 8.3 from ATLAS, 10.6 from ZTF, 5.5/69 from LSST (the Deep Drilling/Wide Fast Deep), and 16.0 from WFIRST. The maximum redshift of KNe discovered for each survey is z = 0.8 for WFIRST, z = 0.25 for LSST, and z = 0.04 for ZTF and ATLAS. This maximum redshift for WFIRST is well beyond the sensitivity of aLIGO and some future GW missions. For the LSST survey, we also provide contamination estimates from Type Ia and core-collapse supernovae: after light curve and template-matching requirements, we estimate a background of just two events. Finally, more broadly, we stress that future transient surveys should consider how to optimize their search strategies to improve their detection efficiency and to consider similar analyses for GW follow-up programs.

  9. How Many Kilonovae Can Be Found in Past, Present, and Future Survey Data Sets?

    DOE PAGES

    Scolnic, D.; Kessler, R.; Brout, D.; ...

    2017-12-22

The discovery of a kilonova (KN) associated with the Advanced LIGO (aLIGO)/Virgo event GW170817 opens up new avenues of multi-messenger astrophysics. Here, using realistic simulations, we provide estimates of the number of KNe that could be found in data from past, present, and future surveys without a gravitational-wave trigger. For the simulation, we construct a spectral time-series model based on the DES-GW multi-band light curve from the single known KN event, and we use an average of BNS rates from past studies of 10^3 Gpc^-3 yr^-1, consistent with the one event found so far. Examining past and current data sets from transient surveys, the number of KNe we expect to find for ASAS-SN, SDSS, PS1, SNLS, DES, and SMT is between 0 and 0.3. We predict the number of detections per future survey to be 8.3 from ATLAS, 10.6 from ZTF, 5.5/69 from LSST (the Deep Drilling/Wide Fast Deep), and 16.0 from WFIRST. The maximum redshift of KNe discovered for each survey is z=0.8 for WFIRST, z=0.25 for LSST, and z=0.04 for ZTF and ATLAS. This maximum redshift for WFIRST is well beyond the sensitivity of aLIGO and some future GW missions. For the LSST survey, we also provide contamination estimates from Type Ia and core-collapse supernovae: after light curve and template-matching requirements, we estimate a background of just two events. Finally, more broadly, we stress that future transient surveys should consider how to optimize their search strategies to improve their detection efficiency and to consider similar analyses for GW follow-up programs.

  10. Surveying the Inner Solar System with an Infrared Space Telescope

    NASA Astrophysics Data System (ADS)

    Buie, Marc W.; Reitsema, Harold J.; Linfield, Roger P.

    2016-11-01

We present an analysis of surveying the inner solar system for objects that may pose some threat to Earth. Most of the analysis is based on understanding the capability provided by Sentinel, a concept for an infrared space-based telescope placed in a heliocentric orbit near the distance of Venus. From this analysis, we show that (1) the size range being targeted can affect the survey design, (2) the orbit distribution of the target sample can affect the survey design, (3) minimum observational arc length during the survey is an important metric of survey performance, and (4) surveys must consider objects as small as D = 15-30 m to meet the goal of identifying objects that have the potential to cause damage on Earth in the next 100 yr. Sentinel will be able to find 50% of all impactors larger than 40 m in a 6.5 yr survey. The Sentinel mission concept is shown to be as effective as any survey in finding objects bigger than D = 140 m but is more effective when applied to finding smaller objects on Earth-impacting orbits. Sentinel is also more effective at finding objects of interest for human exploration that benefit from lower propulsion requirements. To explore the interaction between space and ground search programs, we also study a case where Sentinel is combined with the Large Synoptic Survey Telescope (LSST) and show the benefit of placing a space-based observatory in an orbit that reduces the overlap in search regions with a ground-based telescope. In this case, Sentinel+LSST can find more than 70% of the impactors larger than 40 m assuming a 6.5 yr lifetime for Sentinel and 10 yr for LSST.

  11. Wavelength-Dependent PSFs and their Impact on Weak Lensing Measurements

    NASA Astrophysics Data System (ADS)

    Carlsten, S. G.; Strauss, Michael A.; Lupton, Robert H.; Meyers, Joshua E.; Miyazaki, Satoshi

    2018-06-01

We measure and model the wavelength dependence of the point spread function (PSF) in the Hyper Suprime-Cam Subaru Strategic Program survey. We find that PSF chromaticity is present in that redder stars appear smaller than bluer stars in the g, r, and i-bands at the 1-2 per cent level and in the z and y-bands at the 0.1-0.2 per cent level. From the color dependence of the PSF, we fit a model between the monochromatic PSF size based on weighted second moments, R, and wavelength of the form R(λ) ∝ λ^(-b). We find values of b between 0.2 and 0.5, depending on the epoch and filter. This is consistent with the expectations of a turbulent atmosphere with an outer scale length of ~10-100 m, indicating that the atmosphere is dominating the chromaticity. In the best seeing data, we find that the optical system and detector also contribute some wavelength dependence. Meyers & Burchat (2015b) showed that b must be measured to an accuracy of ~0.02 not to dominate the systematic error budget of the Large Synoptic Survey Telescope (LSST) weak lensing (WL) survey. Using simple image simulations, we find that b can be inferred with this accuracy in the r and i-bands for all positions in the LSST focal plane, assuming a stellar density of 1 star arcmin^-2 and that the optical component of the PSF can be accurately modeled. Therefore, it is possible to correct for most, if not all, of the bias that the wavelength-dependent PSF will introduce into an LSST-like WL survey.
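The power-law model R(λ) ∝ λ^(-b) can be inverted directly from PSF sizes measured at two wavelengths, since b = -ln(R1/R2)/ln(λ1/λ2). A small sketch under that assumption; the band centres and sizes here are synthetic, not the survey's measurements:

```python
import math

def psf_slope_b(lam1, r1, lam2, r2):
    """Infer the chromatic exponent b in R(lambda) ∝ lambda^-b
    from monochromatic PSF sizes at two wavelengths."""
    return -math.log(r1 / r2) / math.log(lam1 / lam2)

# Synthetic check: generate sizes with a known b = 0.3 and recover it.
b_true = 0.3
lam_g, lam_i = 475.0, 775.0          # nm; rough band centres (illustrative)
r_g = 1.0 * lam_g ** (-b_true)
r_i = 1.0 * lam_i ** (-b_true)
b_est = psf_slope_b(lam_g, r_g, lam_i, r_i)   # recovers 0.3
```

In practice the fit uses many stars of different colors per filter rather than two monochromatic measurements, but the slope being inferred is the same quantity.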

  12. Synthesizing Planetary Nebulae for Large Scale Surveys: Predictions for LSST

    NASA Astrophysics Data System (ADS)

    Vejar, George; Montez, Rodolfo; Morris, Margaret; Stassun, Keivan G.

    2017-01-01

The short-lived planetary nebula (PN) phase of stellar evolution is characterized by a hot central star and a bright, ionized nebula. The PN phase forms after a low- to intermediate-mass star stops burning hydrogen in its core, ascends the asymptotic giant branch, and expels its outer layers of material into space. The exposed hot core produces ionizing UV photons and a fast stellar wind that sweeps up the surrounding material into a dense shell of ionized gas known as the PN. This fleeting stage of stellar evolution provides insight into rare atomic processes and the nucleosynthesis of elements in stars. The inherent brightness of PNe allows them to be used to obtain distances to nearby stellar systems via the PN luminosity function and as kinematic tracers in other galaxies. However, the prevalence of non-spherical morphologies of PNe challenges the current paradigm of PN formation. The role of binarity in the shaping of PNe has recently gained traction, ultimately suggesting that single stars might not form PNe. Searches for binary central stars have increased the known binary fraction, but the current PN sample is incomplete. Future wide-field, multi-epoch surveys like the Large Synoptic Survey Telescope (LSST) can impact studies of PNe and improve our understanding of their origin and formation. Using a suite of Cloudy radiative transfer calculations, we study the detectability of PNe in the proposed LSST multiband observations. We compare our synthetic PNe to common sources (stars, galaxies, quasars) and establish discrimination techniques. Finally, we discuss follow-up strategies to verify new LSST-discovered PNe and use limiting distances to estimate the potential sample of PNe enabled by LSST.

  13. Precise Time Delays from Strongly Gravitationally Lensed Type Ia Supernovae with Chromatically Microlensed Images

    NASA Astrophysics Data System (ADS)

    Goldstein, Daniel A.; Nugent, Peter E.; Kasen, Daniel N.; Collett, Thomas E.

    2018-03-01

    Time delays between the multiple images of strongly gravitationally lensed Type Ia supernovae (glSNe Ia) have the potential to deliver precise cosmological constraints, but the effects of microlensing on time delay extraction have not been studied in detail. Here we quantify the effect of microlensing on the glSN Ia yield of the Large Synoptic Survey Telescope (LSST) and the effect of microlensing on the precision and accuracy of time delays that can be extracted from LSST glSNe Ia. Microlensing has a negligible effect on the LSST glSN Ia yield, but it can be increased by a factor of ∼2 over previous predictions to 930 systems using a novel photometric identification technique based on spectral template fitting. Crucially, the microlensing of glSNe Ia is achromatic until three rest-frame weeks after the explosion, making the early-time color curves microlensing-insensitive time delay indicators. By fitting simulated flux and color observations of microlensed glSNe Ia with their underlying, unlensed spectral templates, we forecast the distribution of absolute time delay error due to microlensing for LSST, which is unbiased at the sub-percent level and peaked at 1% for color curve observations in the achromatic phase, while for light-curve observations it is comparable to state-of-the-art mass modeling uncertainties (4%). About 70% of LSST glSN Ia images should be discovered during the achromatic phase, indicating that microlensing time delay uncertainties can be minimized if prompt multicolor follow-up observations are obtained. Accounting for microlensing, the 1–2 day time delay on the recently discovered glSN Ia iPTF16geu can be measured to 40% precision, limiting its cosmological utility.
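The core measurement above is a time delay between two light curves of the same event. As a toy illustration of delay extraction between two lensed images (a simple grid-search least-squares estimator on synthetic Gaussian light curves, not the spectral-template fitting the authors use):

```python
import numpy as np

def estimate_delay(t, flux1, flux2, trial_delays):
    """Grid-search delay estimator: shift image 2 by each trial delay and
    pick the delay minimizing the summed squared residual against image 1."""
    best, best_cost = None, np.inf
    for d in trial_delays:
        shifted = np.interp(t, t - d, flux2)       # flux2 evaluated at t + d
        cost = np.sum((flux1 - shifted) ** 2)
        if cost < best_cost:
            best, best_cost = d, cost
    return best

# Synthetic test: image 2 is image 1 delayed by 1.5 days (toy light curves).
t = np.linspace(0.0, 30.0, 301)
image1 = np.exp(-0.5 * ((t - 10.0) / 3.0) ** 2)
image2 = np.exp(-0.5 * ((t - 11.5) / 3.0) ** 2)
d_est = estimate_delay(t, image1, image2, np.arange(0.0, 3.01, 0.1))
```

Real glSN Ia analyses must additionally model microlensing distortions of each image, which is why the abstract emphasizes the achromatic early phase where color curves are unaffected.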

  14. Precise Time Delays from Strongly Gravitationally Lensed Type Ia Supernovae with Chromatically Microlensed Images

    DOE PAGES

    Goldstein, Daniel A.; Nugent, Peter E.; Kasen, Daniel N.; ...

    2018-03-01

Time delays between the multiple images of strongly gravitationally lensed Type Ia supernovae (glSNe Ia) have the potential to deliver precise cosmological constraints, but the effects of microlensing on time delay extraction have not been studied in detail. Here we quantify the effect of microlensing on the glSN Ia yield of the Large Synoptic Survey Telescope (LSST) and the effect of microlensing on the precision and accuracy of time delays that can be extracted from LSST glSNe Ia. Microlensing has a negligible effect on the LSST glSN Ia yield, but it can be increased by a factor of ~2 over previous predictions to 930 systems using a novel photometric identification technique based on spectral template fitting. Crucially, the microlensing of glSNe Ia is achromatic until three rest-frame weeks after the explosion, making the early-time color curves microlensing-insensitive time delay indicators. By fitting simulated flux and color observations of microlensed glSNe Ia with their underlying, unlensed spectral templates, we forecast the distribution of absolute time delay error due to microlensing for LSST, which is unbiased at the sub-percent level and peaked at 1% for color curve observations in the achromatic phase, while for light-curve observations it is comparable to state-of-the-art mass modeling uncertainties (4%). About 70% of LSST glSN Ia images should be discovered during the achromatic phase, indicating that microlensing time delay uncertainties can be minimized if prompt multicolor follow-up observations are obtained. Lastly, accounting for microlensing, the 1-2 day time delay on the recently discovered glSN Ia iPTF16geu can be measured to 40% precision, limiting its cosmological utility.

  15. Precise Time Delays from Strongly Gravitationally Lensed Type Ia Supernovae with Chromatically Microlensed Images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldstein, Daniel A.; Nugent, Peter E.; Kasen, Daniel N.

Time delays between the multiple images of strongly gravitationally lensed Type Ia supernovae (glSNe Ia) have the potential to deliver precise cosmological constraints, but the effects of microlensing on time delay extraction have not been studied in detail. Here we quantify the effect of microlensing on the glSN Ia yield of the Large Synoptic Survey Telescope (LSST) and the effect of microlensing on the precision and accuracy of time delays that can be extracted from LSST glSNe Ia. Microlensing has a negligible effect on the LSST glSN Ia yield, but it can be increased by a factor of ~2 over previous predictions to 930 systems using a novel photometric identification technique based on spectral template fitting. Crucially, the microlensing of glSNe Ia is achromatic until three rest-frame weeks after the explosion, making the early-time color curves microlensing-insensitive time delay indicators. By fitting simulated flux and color observations of microlensed glSNe Ia with their underlying, unlensed spectral templates, we forecast the distribution of absolute time delay error due to microlensing for LSST, which is unbiased at the sub-percent level and peaked at 1% for color curve observations in the achromatic phase, while for light-curve observations it is comparable to state-of-the-art mass modeling uncertainties (4%). About 70% of LSST glSN Ia images should be discovered during the achromatic phase, indicating that microlensing time delay uncertainties can be minimized if prompt multicolor follow-up observations are obtained. Lastly, accounting for microlensing, the 1-2 day time delay on the recently discovered glSN Ia iPTF16geu can be measured to 40% precision, limiting its cosmological utility.

  16. Mapping the Solar System with LSST

    NASA Astrophysics Data System (ADS)

    Ivezic, Z.; Juric, M.; Lupton, R.; Connolly, A.; Kubica, J.; Moore, A.; Harris, A.; Bowell, T.; Bernstein, G.; Stubbs, C.; LSST Collaboration

    2004-12-01

    The currently considered LSST cadence, based on two 10 sec exposures, may result in orbital parameters, light curves and accurate colors for over a million main-belt asteroids (MBA), and about 20,000 trans-Neptunian objects (TNO). Compared to the current state-of-the-art, this sample would represent a factor of 5 increase in the number of MBAs with known orbits, a factor of 20 increase in the number of MBAs with known orbits and accurate color measurements, and a factor of 100 increase in the number of MBAs with measured variability properties. The corresponding sample increase for TNOs is 10, 100, and 1000, respectively. The LSST MBA and TNO samples will enable detailed studies of the dynamical and chemical history of the solar system. For example, they will constrain the MBA size distribution for objects larger than 100 m, and TNO size distribution for objects larger than 100 km, their physical state through variability measurements (solid body vs. a rubble pile), as well as their surface chemistry through color measurements. A proposed deep TNO survey, based on 1 hour exposures, may result in a sample of about 100,000 TNOs, while spending only 10% of the LSST observing time. Such a deep TNO survey would be capable of discovering Sedna-like objects at distances beyond 150 AU, thereby increasing the observable Solar System volume by about a factor of 7. The increase in data volume associated with LSST asteroid science will present many computational challenges to how we might extract tracks and orbits of asteroids from the underlying clutter. Tree-based algorithms for multihypothesis testing of asteroid tracks can help solve these challenges by providing the necessary 1000-fold speed-ups over current approaches while recovering 95% of the underlying asteroid populations.

  17. Management evolution in the LSST project

    NASA Astrophysics Data System (ADS)

    Sweeney, Donald; Claver, Charles; Jacoby, Suzanne; Kantor, Jeffrey; Krabbendam, Victor; Kurita, Nadine

    2010-07-01

The Large Synoptic Survey Telescope (LSST) project has evolved from just a few staff members in 2003 to about 100 in 2010; the affiliation of four founding institutions has grown to 32 universities, government laboratories, and industrial partners. The public-private collaboration aims to complete the estimated $450 M observatory in the 2017 timeframe. During the design phase of the project, from 2003 to the present, the management structure has been remarkably stable. At the same time, the funding levels, staffing levels, and scientific community participation have grown dramatically. The LSSTC has introduced the project controls and tools required to manage the LSST's complex funding model, technical structure, and distributed work force. Project controls have been configured to comply with the requirements of federal funding agencies. Some of these tools, for risk management, configuration control, and resource-loaded scheduling, have been effective and others have not. Technical tasks associated with building the LSST are distributed into three subsystems: Telescope & Site, Camera, and Data Management. Each subsystem has its own experienced Project Manager and System Scientist. Delegation of authority is enabling and effective; it encourages a strong sense of ownership within the project. At the project level, subsystem management follows the principle that there is one Board of Directors, Director, and Project Manager who have overall authority.

  18. Delta Doping High Purity CCDs and CMOS for LSST

    NASA Technical Reports Server (NTRS)

    Blacksberg, Jordana; Nikzad, Shouleh; Hoenk, Michael; Elliott, S. Tom; Bebek, Chris; Holland, Steve; Kolbe, Bill

    2006-01-01

A viewgraph presentation describing delta doping of high purity CCDs and CMOS for LSST is shown. The topics include: 1) Overview of JPL's versatile back-surface process for CCDs and CMOS; 2) Application to SNAP and ORION missions; 3) Delta doping as a back-surface electrode for fully depleted LBNL CCDs; 4) Delta doping high purity CCDs for SNAP and ORION; 5) JPL CMP thinning process development; and 6) Antireflection coating process development.

  19. LSST Astrometric Science

    NASA Astrophysics Data System (ADS)

    Saha, A.; Monet, D.

    2005-12-01

Continued acquisition and analysis of short-exposure observations supports the preliminary conclusion presented by Monet et al. (BAAS v36, p1531, 2004) that a 10-second exposure in 1.0-arcsecond seeing can provide a differential astrometric accuracy of about 10 milliarcseconds. A single solution for mapping coefficients appears to be valid over spatial scales of up to 10 arcminutes, and this suggests that numerical processing can proceed on a per-sensor basis without the need to further divide the individual fields of view into several astrometric patches. Data from the Subaru public archive as well as from the LSST Cerro Pachon 2005 observing campaign and various CTIO and NOAO 4-meter engineering runs have been considered. Should these results be confirmed, the expected astrometric accuracy after 10 years of LSST observations should be around 1.0 milliarcseconds for parallax and 0.2 milliarcseconds/year for proper motions.
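The quoted end-of-survey numbers are roughly what idealized sqrt(N) averaging of independent single-epoch errors would give. A sketch of that scaling, where the epoch count is an assumption for illustration, not a figure from the abstract:

```python
import math

def end_of_survey_precision(sigma_single_mas, n_epochs):
    """Idealized precision after averaging n_epochs independent
    measurements, each with single-epoch error sigma_single_mas."""
    return sigma_single_mas / math.sqrt(n_epochs)

# ~100 well-placed 10 mas epochs would average down to ~1 mas, the order
# of the quoted 10-year parallax accuracy (epoch count is hypothetical).
sigma_parallax = end_of_survey_precision(10.0, 100)   # 1.0 mas
```

Real parallax and proper-motion errors also depend on the time distribution of the epochs and the parallax-factor geometry, so sqrt(N) is only the leading-order intuition.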

  20. The LSST metrics analysis framework (MAF)

    NASA Astrophysics Data System (ADS)

    Jones, R. L.; Yoachim, Peter; Chandrasekharan, Srinivasan; Connolly, Andrew J.; Cook, Kem H.; Ivezic, Željko; Krughoff, K. S.; Petry, Catherine; Ridgway, Stephen T.

    2014-07-01

We describe the Metrics Analysis Framework (MAF), an open-source Python framework developed to provide a user-friendly, customizable, easily extensible set of tools for analyzing data sets. MAF is part of the Large Synoptic Survey Telescope (LSST) Simulations effort. Its initial goal is to provide a tool to evaluate LSST Operations Simulation (OpSim) simulated surveys to help understand the effects of telescope scheduling on survey performance; however, MAF can be applied to a much wider range of datasets. The building blocks of the framework are Metrics (algorithms to analyze a given quantity of data), Slicers (subdividing the overall data set into smaller data slices as relevant for each Metric), and Database classes (to access the dataset and read data into memory). We describe how these building blocks work together, and provide an example of using MAF to evaluate different dithering strategies. We also outline how users can write their own custom Metrics and use these within the framework.
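The Metric/Slicer decomposition described above can be illustrated with a toy version of the pattern. This is not the real MAF API, only its shape: a Slicer partitions the data, and a Metric reduces each slice to a number.

```python
class MeanMetric:
    """Toy Metric: reduce a data slice to its mean value."""
    def run(self, values):
        return sum(values) / len(values)

class OneDSlicer:
    """Toy Slicer: bin observations on a single key, yielding data slices."""
    def __init__(self, key, edges):
        self.key, self.edges = key, edges
    def slices(self, data):
        for lo, hi in zip(self.edges[:-1], self.edges[1:]):
            yield [row["value"] for row in data if lo <= row[self.key] < hi]

# Evaluate a metric per slice, mimicking MAF's metric x slicer combination.
data = [{"night": n, "value": v} for n, v in [(1, 2.0), (1, 4.0), (2, 6.0)]]
slicer = OneDSlicer("night", [1, 2, 3])
results = [MeanMetric().run(s) for s in slicer.slices(data)]   # [3.0, 6.0]
```

The appeal of the design is that any Metric can be combined with any Slicer (per-night, per-sky-pixel, per-filter) without either class knowing about the other.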

  1. Detection of Double White Dwarf Binaries with Gaia, LSST and eLISA

    NASA Astrophysics Data System (ADS)

    Korol, V.; Rossi, E. M.; Groot, P. J.

    2017-03-01

According to simulations, around 10^8 double degenerate white dwarf (DWD) binaries are expected to be present in the Milky Way. Due to their intrinsic faintness, the detection of these systems is a challenge, and the total number of detected sources so far amounts to only a few tens. This will change in the next two decades with the advent of Gaia, LSST, and eLISA. We present an estimate of how many compact DWDs with orbital periods of less than a few hours we will be able to detect 1) through electromagnetic radiation with Gaia and LSST and 2) through gravitational wave radiation with eLISA. We find that the sample of simultaneous electromagnetic and gravitational wave detections is expected to be substantial, and will provide us with a powerful tool for probing white dwarf astrophysics and the structure of the Milky Way, ushering in the era of multi-messenger astronomy for these sources.

  2. LSST camera control system

    NASA Astrophysics Data System (ADS)

    Marshall, Stuart; Thaler, Jon; Schalk, Terry; Huffer, Michael

    2006-06-01

The LSST Camera Control System (CCS) will manage the activities of the various camera subsystems and coordinate those activities with the LSST Observatory Control System (OCS). The CCS comprises a set of modules (nominally implemented in software) which are each responsible for managing one camera subsystem. Generally, a control module will be a long-lived "server" process running on an embedded computer in the subsystem. Multiple control modules may run on a single computer, or a module may be implemented in "firmware" on a subsystem. In any case, control modules must exchange messages and status data with a master control module (MCM). The main features of this approach are: (1) control is distributed to the local subsystem level; (2) the systems follow a "Master/Slave" strategy; (3) coordination will be achieved by the exchange of messages through the interfaces between the CCS and its subsystems. The interface between the camera data acquisition system and its downstream clients is also presented.

  3. The Large Synoptic Survey Telescope

    NASA Astrophysics Data System (ADS)

    Axelrod, T. S.

    2006-07-01

The Large Synoptic Survey Telescope (LSST) is an 8.4 meter telescope with a 10 square degree field of view and a 3 Gigapixel imager, planned to be on-sky in 2012. It is a dedicated all-sky survey instrument, with several complementary science missions. These include understanding dark energy through weak lensing and supernovae; exploring transients and variable objects; creating and maintaining a solar system map, with particular emphasis on potentially hazardous objects; and increasing the precision with which we understand the structure of the Milky Way. The instrument operates continuously at a rapid cadence, repetitively scanning the visible sky every few nights. The data flow rates from LSST are larger than those from current surveys by roughly a factor of 1000: a few GB/night are typical today, while LSST will deliver a few TB/night. From a computing hardware perspective, this factor of 1000 can be dealt with easily in 2012. The major issues in designing the LSST data management system arise from the fact that the number of people available to critically examine the data will not grow from current levels. This has a number of implications. For example, every large imaging survey today is resigned to the fact that its image reduction pipeline fails at some significant rate. Many of these failures are dealt with by rerunning the reduction pipeline under human supervision, with carefully ``tweaked'' parameters to deal with the original problem. For LSST, this will no longer be feasible. The problem is compounded by the fact that the processing must of necessity occur on clusters with large numbers of CPUs and disk drives, and with some components connected by long-haul networks. This inevitably results in a significant rate of hardware component failures, which can easily lead to further software failures. Both hardware and software failures must be seen as a routine fact of life rather than rare exceptions to normality.

  4. Atmospheric Dispersion Effects in Weak Lensing Measurements

    DOE PAGES

    Plazas, Andrés Alejandro; Bernstein, Gary

    2012-10-01

The wavelength dependence of atmospheric refraction causes elongation of finite-bandwidth images along the elevation vector, which produces spurious signals in weak gravitational lensing shear measurements unless this atmospheric dispersion is calibrated and removed to high precision. Because astrometric solutions and PSF characteristics are typically calibrated from stellar images, differences between the reference stars' spectra and the galaxies' spectra will leave residual errors in both the astrometric positions (dr) and in the second moment (width) of the wavelength-averaged PSF (dv) for galaxies. We estimate the level of dv that will induce spurious weak lensing signals in PSF-corrected galaxy shapes that exceed the statistical errors of the DES and the LSST cosmic-shear experiments. We also estimate the dr signals that will produce unacceptable spurious distortions after stacking of exposures taken at different airmasses and hour angles. We also calculate the errors in the griz bands, and find that dispersion systematics, uncorrected, are up to 6 and 2 times larger in the g and r bands, respectively, than the requirements for the DES error budget, but can be safely ignored in the i and z bands. For the LSST requirements, the factors are about 30, 10, and 3 in the g, r, and i bands, respectively. We find that a simple correction linear in galaxy color is accurate enough to reduce dispersion shear systematics to insignificant levels in the r band for DES and the i band for LSST, but is still as much as 5 times the requirements for LSST r-band observations. More complex corrections will likely be able to reduce the systematic cosmic-shear errors below statistical errors for the LSST r band. But g-band effects remain large enough that it seems likely that induced systematics will dominate the statistical errors of both surveys, and cosmic-shear measurements should rely on the redder bands.
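The "simple correction linear in galaxy color" can be sketched as an ordinary least-squares fit of the PSF-width residual against color. The data and coefficient values below are synthetic placeholders, not the paper's measurements:

```python
import numpy as np

def fit_linear_color_correction(color, dv):
    """Least-squares fit of dv ≈ a + b*color; returns (a, b)."""
    A = np.vstack([np.ones_like(color), color]).T
    coeffs, *_ = np.linalg.lstsq(A, dv, rcond=None)
    return coeffs

# Synthetic galaxies whose PSF-width residual is exactly linear in color.
rng = np.random.default_rng(0)
color = rng.uniform(0.0, 2.0, 200)            # e.g. a g-i color (toy units)
dv = 0.001 + 0.004 * color                    # assumed linear dispersion residual
a, b = fit_linear_color_correction(color, dv)
residual = dv - (a + b * color)               # ~0 after the linear correction
```

When the true residual is not exactly linear in color, what remains after this correction is the systematic floor the abstract quantifies band by band.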

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peterson, John Russell

    This grant funded the development and dissemination of the Photon Simulator (PhoSim) for the purpose of studying dark energy at high precision with the upcoming Large Synoptic Survey Telescope (LSST) astronomical survey. The work was in collaboration with the LSST Dark Energy Science Collaboration (DESC). Several detailed physics improvements were made in the optics, atmosphere, and sensor, a number of validation studies were performed, and a significant number of usability features were implemented. Future work in DESC will use PhoSim as the image simulation tool for data challenges used by the analysis groups.

  6. The Whole is Greater than the Sum of the Parts: Optimizing the Joint Science Return from LSST, Euclid and WFIRST

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jain, B.; Spergel, D.; Connolly, A.

    2015-02-02

The scientific opportunity offered by the combination of data from LSST, WFIRST and Euclid goes well beyond the science enabled by any one of the data sets alone. The range in wavelength, angular resolution and redshift coverage that these missions jointly span is remarkable. With major investments in LSST and WFIRST, and partnership with ESA in Euclid, the US has an outstanding scientific opportunity to carry out a combined analysis of these data sets. It is imperative for us to seize it and, together with our European colleagues, prepare for the defining cosmological pursuit of the 21st century. The main argument for conducting a single, high-quality reference co-analysis exercise and carefully documenting the results is the complexity and subtlety of systematics that define this co-analysis. Falling back on many small efforts by different teams in selected fields and for narrow goals will be inefficient, leading to significant duplication of effort.

  7. The Emerging Infrastructure of Autonomous Astronomy

    NASA Astrophysics Data System (ADS)

    Seaman, R.; Allan, A.; Axelrod, T.; Cook, K.; White, R.; Williams, R.

    2007-10-01

Advances in the understanding of cosmic processes demand that sky transient events be confronted with statistical techniques honed on static phenomena. Time domain data sets require vast surveys such as LSST {http://www.lsst.org/lsst_home.shtml} and Pan-STARRS {http://www.pan-starrs.ifa.hawaii.edu}. A new autonomous infrastructure must close the loop from the scheduling of survey observations, through data archiving and pipeline processing, to the publication of transient event alerts and automated follow-up, and to the easy analysis of resulting data. The IVOA VOEvent {http://voevent.org} working group leads efforts to characterize sky transient alerts published through VOEventNet {http://voeventnet.org}. The Heterogeneous Telescope Networks (HTN {http://www.telescope-networks.org}) consortium comprises observatories and robotic telescope projects seeking interoperability, with a long-term goal of creating an e-market for telescope time. Two projects relying on VOEvent and HTN are eSTAR {http://www.estar.org.uk} and the Thinking Telescope {http://www.thinkingtelescopes.lanl.gov} Project.

  8. LIMB-DARKENING COEFFICIENTS FOR ECLIPSING WHITE DWARFS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gianninas, A.; Strickland, B. D.; Kilic, Mukremin

    2013-03-20

We present extensive calculations of linear and nonlinear limb-darkening coefficients as well as complete intensity profiles appropriate for modeling the light curves of eclipsing white dwarfs. We compute limb-darkening coefficients in the Johnson-Kron-Cousins UBVRI photometric system as well as the Large Synoptic Survey Telescope (LSST) ugrizy system using the most up-to-date model atmospheres available. In all, we provide the coefficients for seven different limb-darkening laws. We describe the variations of these coefficients as a function of the atmospheric parameters, including the effects of convection at low effective temperatures. Finally, we discuss the importance of having readily available limb-darkening coefficients in the context of present and future photometric surveys like the LSST, the Palomar Transient Factory, and the Panoramic Survey Telescope and Rapid Response System (Pan-STARRS). The LSST, for example, may find ~10^5 eclipsing white dwarfs. The limb-darkening calculations presented here will be an essential part of the detailed analysis of all of these systems.

  9. The production of ultrathin polyimide films for the solar sail program and Large Space Structures Technology (LSST): A feasibility study

    NASA Technical Reports Server (NTRS)

    Forester, R. H.

    1978-01-01

Polyimide membranes in a thickness range from under 0.01 micrometer to greater than 1 micrometer can be produced at an estimated cost of 50 cents per sq m (plus the cost of the polymer). The polymer of interest is dissolved in a solvent which is soluble in water. The polymer or casting solution is allowed to flow down an inclined ramp onto a water surface, where a pool of floating polymer develops. The solvent dissolves into the water, lowering the surface tension of the water. Consequently, the contact angle of the polymer pool is very low and the edge of the pool is very thin. The solvent dissolves from this thin region too rapidly to be replenished from the bulk of the pool, and a solid polymer film forms. Film formation is rapid and spontaneous, and the film spreads out unaided, many feet from the leading edge of the pool. The driving force for this process is the exothermic solution of the organic solvent from the polymer solution into the water.

  10. Is flat fielding safe for precision CCD astronomy?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baumer, Michael; Davis, Christopher P.; Roodman, Aaron

    The ambitious goals of precision cosmology with wide-field optical surveys such as the Dark Energy Survey (DES) and the Large Synoptic Survey Telescope (LSST) demand precision CCD astronomy as their foundation. This in turn requires an understanding of previously uncharacterized sources of systematic error in CCD sensors, many of which manifest themselves as static effective variations in pixel area. Such variation renders a critical assumption behind the traditional procedure of flat fielding—that a sensor's pixels comprise a uniform grid—invalid. In this work, we present a method to infer a curl-free model of a sensor's underlying pixel grid from flat-field images, incorporating the superposition of all electrostatic sensor effects—both known and unknown—present in flat-field data. We use these pixel grid models to estimate the overall impact of sensor systematics on photometry, astrometry, and PSF shape measurements in a representative sensor from the Dark Energy Camera (DECam) and a prototype LSST sensor. Applying the method to DECam data recovers known significant sensor effects for which corrections are currently being developed within DES. For an LSST prototype CCD with pixel-response non-uniformity (PRNU) of 0.4%, we find the impact of "improper" flat fielding on these observables is negligible in nominal 0.7'' seeing conditions. Furthermore, these errors scale linearly with the PRNU, so for future LSST production sensors, which may have larger PRNU, our method provides a way to assess whether pixel-level calibration beyond flat fielding will be required.
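The core failure mode described here, pixel-area variation being "corrected" as if it were sensitivity variation, can be illustrated with a toy one-dimensional sensor. This is a hedged sketch of the effect only, not the paper's curl-free inference method; the 0.4% PRNU figure is borrowed from the abstract, and everything else is invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy 1-D "sensor": true sensitivity is uniform, but pixel areas vary.
n_pix = 100
pixel_area = 1.0 + 0.004 * rng.standard_normal(n_pix)  # ~0.4% "PRNU"

# A flat field sees a uniform sky, so each pixel collects flux
# proportional to its area.
flat = pixel_area / pixel_area.mean()

# A point source deposits its flux into one pixel regardless of that
# pixel's area.
star_flux = 1000.0
raw = np.zeros(n_pix)
raw[50] = star_flux

# Traditional flat fielding divides by the flat, wrongly treating a
# geometric (area) effect as a sensitivity effect.
corrected = raw / flat
photometric_error = corrected[50] / star_flux - 1.0
print(f"fractional photometry bias: {photometric_error:+.4f}")
```

Because the bias is simply 1/flat at the source pixel, it scales linearly with the size of the area variation, consistent with the linear PRNU scaling the abstract reports.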

  11. Is flat fielding safe for precision CCD astronomy?

    DOE PAGES

    Baumer, Michael; Davis, Christopher P.; Roodman, Aaron

    2017-07-06

    The ambitious goals of precision cosmology with wide-field optical surveys such as the Dark Energy Survey (DES) and the Large Synoptic Survey Telescope (LSST) demand precision CCD astronomy as their foundation. This in turn requires an understanding of previously uncharacterized sources of systematic error in CCD sensors, many of which manifest themselves as static effective variations in pixel area. Such variation renders a critical assumption behind the traditional procedure of flat fielding—that a sensor's pixels comprise a uniform grid—invalid. In this work, we present a method to infer a curl-free model of a sensor's underlying pixel grid from flat-field images, incorporating the superposition of all electrostatic sensor effects—both known and unknown—present in flat-field data. We use these pixel grid models to estimate the overall impact of sensor systematics on photometry, astrometry, and PSF shape measurements in a representative sensor from the Dark Energy Camera (DECam) and a prototype LSST sensor. Applying the method to DECam data recovers known significant sensor effects for which corrections are currently being developed within DES. For an LSST prototype CCD with pixel-response non-uniformity (PRNU) of 0.4%, we find the impact of "improper" flat fielding on these observables is negligible in nominal 0.7'' seeing conditions. Furthermore, these errors scale linearly with the PRNU, so for future LSST production sensors, which may have larger PRNU, our method provides a way to assess whether pixel-level calibration beyond flat fielding will be required.

  12. A daytime measurement of the lunar contribution to the night sky brightness in LSST's ugrizy bands-initial results

    NASA Astrophysics Data System (ADS)

    Coughlin, Michael; Stubbs, Christopher; Claver, Chuck

    2016-06-01

    We report measurements from which we determine the spatial structure of the lunar contribution to night sky brightness, taken at the LSST site on Cerro Pachon in Chile. We use an array of six photodiodes with filters that approximate the Large Synoptic Survey Telescope's u, g, r, i, z, and y bands. We use the sun as a proxy for the moon, and measure sky brightness as a function of zenith angle of the point on sky, zenith angle of the sun, and angular distance between the sun and the point on sky. We make a correction for the difference between the illumination spectrum of the sun and the moon. Since scattered sunlight totally dominates the daytime sky brightness, this technique allows us to cleanly determine the contribution to the (cloudless) night sky from backscattered moonlight, without contamination from other sources of night sky brightness. We estimate our uncertainty in the relative lunar night sky brightness vs. zenith and lunar angle to be between 0.3 and 0.7 mag depending on the passband. This information is useful in planning the optimal execution of the LSST survey, and perhaps for other astronomical observations as well. Although our primary objective is to map out the angular structure and spectrum of the scattered light from the atmosphere and particulates, we also make an estimate of the expected number of scattered lunar photons per pixel per second in LSST, and find values that are in overall agreement with previous estimates.

  13. Probing the Solar System with LSST

    NASA Astrophysics Data System (ADS)

    Harris, A.; Ivezic, Z.; Juric, M.; Lupton, R.; Connolly, A.; Kubica, J.; Moore, A.; Bowell, E.; Bernstein, G.; Cook, K.; Stubbs, C.

    2005-12-01

    LSST will catalog small Potentially Hazardous Asteroids (PHAs), survey the main belt asteroid (MBA) population to extraordinarily small sizes, discover comets far from the sun where their nuclear properties can be discerned without coma, and survey the Centaur and Trans-Neptunian Object (TNO) populations. The present planned observing strategy is to "visit" each field (9.6 deg²) with two back-to-back exposures of ~15 sec, reaching to at least V magnitude 24.5. An intra-night revisit time on the order of half an hour will distinguish stationary transients from even very distant (~70 AU) solar system bodies. In order to link observations and determine orbits, each sky area will be visited several times during a month, spaced by about a week. This cadence will result in orbital parameters for several million MBAs and about 20,000 TNOs, with light curves and colorimetry for the brighter 10% or so of each population. Compared to the current data available, this would represent a factor of 10 to 100 increase in the numbers of orbits, colors, and variability measurements for the two classes of objects. The LSST MBA and TNO samples will enable detailed studies of the dynamical and chemical history of the solar system. The increase in data volume associated with LSST asteroid science will present many computational challenges to how we might extract tracks and orbits of asteroids from the underlying clutter. Tree-based algorithms for multihypothesis testing of asteroid tracks can help solve these challenges by providing the necessary 1000-fold speed-ups over current approaches while recovering 95% of the underlying moving objects.
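The tree-based linking mentioned above depends on fast spatial indexing of detections. A minimal sketch of that indexing step with a kd-tree (the coordinates, motion model, and matching radius are invented for illustration; this is not the actual LSST moving-object pipeline):

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)

# Detections from two visits of the same field: (ra, dec) in degrees.
visit1 = rng.uniform(0, 1, size=(500, 2))
# Assume each object shifts by ~1 arcsec between the two visits.
visit2 = visit1 + rng.normal(0, 1.0 / 3600.0, size=visit1.shape)

# Index visit-2 detections, then link each visit-1 detection to its
# nearest neighbor within an assumed 5-arcsec matching radius.
tree = cKDTree(visit2)
dist, idx = tree.query(visit1, distance_upper_bound=5.0 / 3600.0)
linked = np.isfinite(dist)  # unmatched queries come back with dist = inf
print(f"linked {linked.sum()} of {len(visit1)} detections")
```

A real linker tests many motion hypotheses against such an index instead of a single nearest-neighbor pass, which is where the tree's logarithmic query cost delivers the speed-ups the abstract describes.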

  14. Scheduling Algorithm for the Large Synoptic Survey Telescope

    NASA Astrophysics Data System (ADS)

    Ichharam, Jaimal; Stubbs, Christopher

    2015-01-01

    The Large Synoptic Survey Telescope (LSST) is a wide-field telescope currently under construction and scheduled to be deployed in Chile by 2022 and operate for a ten-year survey. As a ground-based telescope with the largest etendue ever constructed, and the ability to take images approximately once every eighteen seconds, the LSST will be able to capture the entirety of the observable sky every few nights in six different band passes. With these remarkable features, LSST is primed to provide the scientific community with invaluable data in numerous areas of astronomy, including the observation of near-Earth asteroids, the detection of transient optical events such as supernovae, and the study of dark matter and dark energy through weak gravitational lensing. In order to maximize the utility that LSST will provide toward achieving these scientific objectives, it proves necessary to develop a flexible scheduling algorithm for the telescope which both optimizes its observational efficiency and allows for adjustment based on the evolving needs of the astronomical community. This work defines a merit function that incorporates the urgency of observing a particular field in the sky as a function of time elapsed since last observed, dynamic viewing conditions (in particular transparency and sky brightness), and a measure of scientific interest in the field. The problem of maximizing this merit function, summed across the entire observable sky, is then reduced to a classic variant of the dynamic traveling salesman problem. We introduce a new approximation technique that appears particularly well suited for this situation. We analyze its effectiveness in resolving this problem, obtaining some promising initial results.
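A greedy toy version of such a merit function can be sketched as follows. The functional form, weights, and field data are invented placeholders, and the paper's traveling-salesman approximation goes well beyond picking the single best field:

```python
import math

def merit(field, now, w_urgency=1.0, w_cond=1.0, w_sci=1.0):
    """Score a field: urgency grows with time since the last visit;
    observing conditions and science interest are normalized to [0, 1]."""
    urgency = 1.0 - math.exp(-(now - field["last_visit"]) / 3.0)  # days
    return (w_urgency * urgency
            + w_cond * field["conditions"]
            + w_sci * field["science_weight"])

# Invented example fields; times are in days.
fields = [
    {"name": "A", "last_visit": 0.0, "conditions": 0.9, "science_weight": 0.5},
    {"name": "B", "last_visit": 2.5, "conditions": 0.6, "science_weight": 0.8},
    {"name": "C", "last_visit": 2.9, "conditions": 0.4, "science_weight": 0.3},
]

now = 3.0
best = max(fields, key=lambda f: merit(f, now))
print("next field:", best["name"])
```

Field A wins here because its urgency term dominates: it has gone unvisited the longest and also enjoys the best conditions.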

  15. Looking through the same lens: Shear calibration for LSST, Euclid, and WFIRST with stage 4 CMB lensing

    NASA Astrophysics Data System (ADS)

    Schaan, Emmanuel; Krause, Elisabeth; Eifler, Tim; Doré, Olivier; Miyatake, Hironao; Rhodes, Jason; Spergel, David N.

    2017-06-01

    The next-generation weak lensing surveys (i.e., LSST, Euclid, and WFIRST) will require exquisite control over systematic effects. In this paper, we address shear calibration and present the most realistic forecast to date for LSST/Euclid/WFIRST and CMB lensing from a stage 4 CMB experiment ("CMB S4"). We use the cosmolike code to simulate a joint analysis of all the two-point functions of galaxy density, galaxy shear, and CMB lensing convergence. We include the full Gaussian and non-Gaussian covariances and explore the resulting joint likelihood with Monte Carlo Markov chains. We constrain shear calibration biases while simultaneously varying cosmological parameters, galaxy biases, and photometric redshift uncertainties. We find that CMB lensing from CMB S4 enables the calibration of the shear biases down to 0.2%-3% in ten tomographic bins for LSST (below the ˜0.5 % requirements in most tomographic bins), down to 0.4%-2.4% in ten bins for Euclid, and 0.6%-3.2% in ten bins for WFIRST. For a given lensing survey, the method works best at high redshift where shear calibration is otherwise most challenging. This self-calibration is robust to Gaussian photometric redshift uncertainties and to a reasonable level of intrinsic alignment. It is also robust to changes in the beam and the effectiveness of the component separation of the CMB experiment, and slowly dependent on its depth, making it possible with third-generation CMB experiments such as AdvACT and SPT-3G, as well as the Simons Observatory.

  16. Optical Design of the LSST Camera

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olivier, S S; Seppala, L; Gilmore, K

    2008-07-16

    The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror, modified Paul-Baker design, with an 8.4-meter primary mirror, a 3.4-m secondary, and a 5.0-m tertiary feeding a camera system that includes a set of broad-band filters and refractive corrector lenses to produce a flat focal plane with a field of view of 9.6 square degrees. Optical design of the camera lenses and filters is integrated with optical design of telescope mirrors to optimize performance, resulting in excellent image quality over the entire field from ultra-violet to near infra-red wavelengths. The LSST camera optics design consists of three refractive lenses with clear aperture diameters of 1.55 m, 1.10 m and 0.69 m and six interchangeable, broad-band, filters with clear aperture diameters of 0.75 m. We describe the methodology for fabricating, coating, mounting and testing these lenses and filters, and we present the results of detailed tolerance analyses, demonstrating that the camera optics will perform to the specifications required to meet their performance goals.

  17. On the accuracy of modelling the dynamics of large space structures

    NASA Technical Reports Server (NTRS)

    Diarra, C. M.; Bainum, P. M.

    1985-01-01

    Proposed space missions will require large-scale, lightweight, space-based structural systems. Large space structure technology (LSST) systems will have to accommodate (among others): ocean data systems; electronic mail systems; large multibeam antenna systems; and space-based solar power systems. The structures are to be delivered into orbit by the space shuttle. Because of their inherent size, modelling techniques and scaling algorithms must be developed so that system performance can be predicted accurately prior to launch and assembly. When the size and weight-to-area ratio of proposed LSST systems dictate that the entire system be considered flexible, there are two basic modelling methods which can be used. The first is a continuum approach, a mathematical formulation for predicting the motion of a general orbiting flexible body, in which elastic deformations are considered small compared with characteristic body dimensions. This approach is based on an a priori knowledge of the frequencies and shape functions of all modes included within the system model. Alternatively, finite element techniques can be used to model the entire structure as a system of lumped masses connected by a series of (restoring) springs and possibly dampers. In addition, a computational algorithm was developed to evaluate the coefficients of the various coupling terms in the equations of motion as applied to the finite element model of the Hoop/Column.

  18. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    NASA Astrophysics Data System (ADS)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.

  19. Photometric Redshift Calibration Strategy for WFIRST Cosmology

    NASA Astrophysics Data System (ADS)

    Hemmati, Shoubaneh; WFIRST, WFIRST-HLS-COSMOLOGY

    2018-01-01

    In order for WFIRST and other Stage IV dark energy experiments (e.g. LSST, Euclid) to infer cosmological parameters not limited by systematic errors, accurate redshift measurements are needed. This accuracy can only be met using spectroscopic subsamples to calibrate the full sample. In this poster, we employ the machine-learning, SOM-based spectroscopic sampling technique developed in Masters et al. 2015, using the empirical color-redshift relation among galaxies to find the minimum number of spectra required for the WFIRST weak lensing calibration. We use galaxies from the CANDELS survey to build the LSST+WFIRST lensing analog sample of ~36k objects and train the LSST+WFIRST SOM. We show that 26% of the WFIRST lensing sample consists of sources fainter than the Euclid depth in the optical, 91% of which live in color cells already occupied by brighter galaxies. We demonstrate the similarity between faint and bright galaxies as well as the feasibility of redshift measurements at different brightness levels. 4% of SOM cells are, however, only occupied by faint galaxies, for which we recommend extra spectroscopy of ~200 new sources. Acquiring the spectra of these sources will enable the comprehensive calibration of the WFIRST color-redshift relation.
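The SOM idea, quantizing color space into cells and calibrating redshift per cell, can be sketched from scratch on synthetic data. This is a generic self-organizing map, not the Masters et al. implementation, and every number below is made up:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy "galaxies": three colors that vary smoothly with a hidden redshift.
z = rng.uniform(0, 2, 1000)
colors = np.column_stack([z + 0.05 * rng.standard_normal(1000)
                          for _ in range(3)])

# Train a small self-organizing map on the color space.
grid_w, grid_h, dim = 8, 8, 3
weights = rng.uniform(0, 2, (grid_w * grid_h, dim))
coords = np.array([(i, j) for i in range(grid_w) for j in range(grid_h)], float)

for epoch in range(5):
    lr = 0.5 * (0.5 ** epoch)      # decaying learning rate
    sigma = 3.0 * (0.7 ** epoch)   # shrinking neighborhood radius
    for x in colors:
        bmu = np.argmin(((weights - x) ** 2).sum(axis=1))  # best-matching unit
        d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
        h = np.exp(-d2 / (2 * sigma ** 2))                 # neighborhood kernel
        weights += lr * h[:, None] * (x - weights)

# Each cell's mean member redshift gives an empirical color-redshift relation.
cells = np.argmin(((colors[:, None, :] - weights[None]) ** 2).sum(-1), axis=1)
cell_z = np.array([z[cells == c].mean() if (cells == c).any() else np.nan
                   for c in range(grid_w * grid_h)])
```

In the calibration strategy above, cells occupied only by sources without spectroscopy would be flagged for targeted follow-up, mirroring the 4% of cells identified in the abstract.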

  20. New Self-lensing Models of the Small Magellanic Cloud: Can Gravitational Microlensing Detect Extragalactic Exoplanets?

    NASA Astrophysics Data System (ADS)

    Mróz, Przemek; Poleski, Radosław

    2018-04-01

    We use three-dimensional distributions of classical Cepheids and RR Lyrae stars in the Small Magellanic Cloud (SMC) to model the stellar density distribution of the young and old stellar populations in that galaxy. We use these models to estimate the microlensing self-lensing optical depth to the SMC, which is in excellent agreement with the observations. Our models are consistent with a total stellar mass of the SMC of about 1.0 × 10⁹ M☉ under the assumption that all microlensing events toward this galaxy are caused by self-lensing. We also calculate the expected event rates and estimate that future large-scale surveys, like the Large Synoptic Survey Telescope (LSST), will be able to detect up to a few dozen microlensing events in the SMC annually. If the planet frequency in the SMC is similar to that in the Milky Way, a few extragalactic planets can be detected over the course of the LSST survey, provided significant changes in the SMC observing strategy are devised. A relatively small investment of LSST resources can give us a unique probe of the population of extragalactic exoplanets.
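The standard relation connecting optical depth to event rate, Γ = (2/π) τ/⟨t_E⟩, underlies rate estimates of this kind. A sketch with placeholder inputs (none of these values are taken from the paper):

```python
import math

# Assumed, illustrative inputs -- not values from the paper.
tau = 1.5e-7           # self-lensing optical depth toward the SMC
t_E = 100 / 365.25     # mean Einstein-crossing timescale in years (100 days)
n_monitored = 5e7      # number of monitored source stars

# Standard relation between optical depth and event rate per star,
# assuming a single characteristic timescale.
gamma = (2.0 / math.pi) * tau / t_E       # events per star per year
events_per_year = gamma * n_monitored
print(f"expected events per year: {events_per_year:.1f}")
```

With these placeholder inputs the expected yield lands in the tens of events per year, the same order as the "few dozen annually" quoted in the abstract.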

  1. The Hyper Suprime-Cam software pipeline

    NASA Astrophysics Data System (ADS)

    Bosch, James; Armstrong, Robert; Bickerton, Steven; Furusawa, Hisanori; Ikeda, Hiroyuki; Koike, Michitaro; Lupton, Robert; Mineo, Sogo; Price, Paul; Takata, Tadafumi; Tanaka, Masayuki; Yasuda, Naoki; AlSayyad, Yusra; Becker, Andrew C.; Coulton, William; Coupon, Jean; Garmilla, Jose; Huang, Song; Krughoff, K. Simon; Lang, Dustin; Leauthaud, Alexie; Lim, Kian-Tat; Lust, Nate B.; MacArthur, Lauren A.; Mandelbaum, Rachel; Miyatake, Hironao; Miyazaki, Satoshi; Murata, Ryoma; More, Surhud; Okura, Yuki; Owen, Russell; Swinbank, John D.; Strauss, Michael A.; Yamada, Yoshihiko; Yamanoi, Hitomi

    2018-01-01

    In this paper, we describe the optical imaging data processing pipeline developed for the Subaru Telescope's Hyper Suprime-Cam (HSC) instrument. The HSC Pipeline builds on the prototype pipeline being developed by the Large Synoptic Survey Telescope's Data Management system, adding customizations for HSC, large-scale processing capabilities, and novel algorithms that have since been reincorporated into the LSST codebase. While designed primarily to reduce HSC Subaru Strategic Program (SSP) data, it is also the recommended pipeline for reducing general-observer HSC data. The HSC pipeline includes high-level processing steps that generate coadded images and science-ready catalogs as well as low-level detrending and image characterizations.

  2. An automated system to measure the quantum efficiency of CCDs for astronomy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coles, R.; Chiang, J.; Cinabro, D.

    We describe a system to measure the Quantum Efficiency in the wavelength range of 300 nm to 1100 nm of 40 × 40 mm n-channel CCD sensors for the construction of the 3.2 gigapixel LSST focal plane. The technique uses a series of instruments to create a very uniform flux of photons of controllable intensity in the wavelength range of interest across the face of the sensor. This allows the absolute Quantum Efficiency to be measured with an accuracy in the 1% range. Finally, this system will be part of a production facility at Brookhaven National Lab for the basic component of the LSST camera.

  3. An automated system to measure the quantum efficiency of CCDs for astronomy

    DOE PAGES

    Coles, R.; Chiang, J.; Cinabro, D.; ...

    2017-04-18

    We describe a system to measure the Quantum Efficiency in the wavelength range of 300 nm to 1100 nm of 40 × 40 mm n-channel CCD sensors for the construction of the 3.2 gigapixel LSST focal plane. The technique uses a series of instruments to create a very uniform flux of photons of controllable intensity in the wavelength range of interest across the face of the sensor. This allows the absolute Quantum Efficiency to be measured with an accuracy in the 1% range. Finally, this system will be part of a production facility at Brookhaven National Lab for the basic component of the LSST camera.

  4. Fast force actuators for LSST primary/tertiary mirror

    NASA Astrophysics Data System (ADS)

    Hileman, Edward; Warner, Michael; Wiecha, Oliver

    2010-07-01

    The very short slew times and resulting high inertial loads imposed upon the Large Synoptic Survey Telescope (LSST) create new challenges to the primary mirror support actuators. Traditionally large borosilicate mirrors are supported by pneumatic systems, which is also the case for the LSST. These force based actuators bear the weight of the mirror and provide active figure correction, but do not define the mirror position. A set of six locating actuators (hardpoints) arranged in a hexapod fashion serve to locate the mirror. The stringent dynamic requirements demand that the force actuators must be able to counteract in real time for dynamic forces on the hardpoints during slewing to prevent excessive hardpoint loads. The support actuators must also maintain the prescribed forces accurately during tracking to maintain acceptable mirror figure. To meet these requirements, candidate pneumatic cylinders incorporating force feedback control and high speed servo valves are being tested using custom instrumentation with automatic data recording. Comparative charts are produced showing details of friction, hysteresis cycles, operating bandwidth, and temperature dependency. Extremely low power actuator controllers are being developed to avoid heat dissipation in critical portions of the mirror and also to allow for increased control capabilities at the actuator level, thus improving safety, performance, and the flexibility of the support system.

  5. Machine-learning-based Brokers for Real-time Classification of the LSST Alert Stream

    NASA Astrophysics Data System (ADS)

    Narayan, Gautham; Zaidi, Tayeb; Soraisam, Monika D.; Wang, Zhe; Lochner, Michelle; Matheson, Thomas; Saha, Abhijit; Yang, Shuo; Zhao, Zhenge; Kececioglu, John; Scheidegger, Carlos; Snodgrass, Richard T.; Axelrod, Tim; Jenness, Tim; Maier, Robert S.; Ridgway, Stephen T.; Seaman, Robert L.; Evans, Eric Michael; Singh, Navdeep; Taylor, Clark; Toeniskoetter, Jackson; Welch, Eric; Zhu, Songzhe; The ANTARES Collaboration

    2018-05-01

    The unprecedented volume and rate of transient events that will be discovered by the Large Synoptic Survey Telescope (LSST) demand that the astronomical community update its follow-up paradigm. Alert-brokers—automated software systems to sift through, characterize, annotate, and prioritize events for follow-up—will be critical tools for managing alert streams in the LSST era. The Arizona-NOAO Temporal Analysis and Response to Events System (ANTARES) is one such broker. In this work, we develop a machine learning pipeline to characterize and classify variable and transient sources only using the available multiband optical photometry. We describe three illustrative stages of the pipeline, serving the three goals of early, intermediate, and retrospective classification of alerts. The first takes the form of variable versus transient categorization, the second a multiclass typing of the combined variable and transient data set, and the third a purity-driven subtyping of a transient class. Although several similar algorithms have proven themselves in simulations, we validate their performance on real observations for the first time. We quantitatively evaluate our pipeline on sparse, unevenly sampled, heteroskedastic data from various existing observational campaigns, and demonstrate very competitive classification performance. We describe our progress toward adapting the pipeline developed in this work into a real-time broker working on live alert streams from time-domain surveys.
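The staged design, first separating variables from transients and then typing within a branch, can be sketched with a generic classifier on synthetic features; the feature set, class list, and numbers are invented for illustration, not taken from ANTARES:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)

# Synthetic features: (amplitude in mag, timescale in days) per source type.
def make(n, amp, tau, label):
    X = np.column_stack([amp + 0.1 * rng.standard_normal(n),
                         tau + 2.0 * rng.standard_normal(n)])
    return X, np.full(n, label)

X_rr, y_rr = make(200, amp=0.5, tau=0.5, label="RR Lyrae")    # variable
X_cep, y_cep = make(200, amp=0.8, tau=10.0, label="Cepheid")  # variable
X_sn, y_sn = make(200, amp=3.0, tau=30.0, label="SN Ia")      # transient

X = np.vstack([X_rr, X_cep, X_sn])
y = np.concatenate([y_rr, y_cep, y_sn])
is_transient = (y == "SN Ia")

# Stage 1: variable vs transient.  Stage 2: typing within the variable branch.
stage1 = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, is_transient)
stage2 = RandomForestClassifier(n_estimators=50, random_state=0).fit(
    X[~is_transient], y[~is_transient])

probe = np.array([[0.8, 11.0]])  # modest amplitude, ~10-day timescale
if stage1.predict(probe)[0]:
    print("transient")
else:
    print("variable:", stage2.predict(probe)[0])
```

Staging keeps each model's decision simple and lets the early classifier run on fewer observations than the later, finer-grained ones.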

  6. Gamma Ray Bursts as Cosmological Probes with EXIST

    NASA Astrophysics Data System (ADS)

    Hartmann, Dieter; EXIST Team

    2006-12-01

    The EXIST mission, studied as a Black Hole Finder Probe within NASA's Beyond Einstein Program, would, in its current design, trigger on 1000 Gamma Ray Bursts (GRBs) per year (Grindlay et al, this meeting). The redshift distribution of these GRBs, using results from Swift as a guide, would probe the z > 7 epoch at an event rate of > 50 per year. These bursts trace early cosmic star formation history, point to a first generation of stellar objects that reionize the universe, and provide bright beacons for absorption line studies with ground- and space-based observatories. We discuss how EXIST, in conjunction with other space missions and future large survey programs such as LSST, can be utilized to advance our understanding of cosmic chemical evolution, the structure and evolution of the baryonic cosmic web, and the formation of stars in low metallicity environments.

  7. Ground/bonding for Large Space System Technology (LSST). [of metallic and nonmetallic structures]

    NASA Technical Reports Server (NTRS)

    Dunbar, W. G.

    1980-01-01

    The influence of the environment and extravehicular activity/remote assembly operations on the grounding and bonding of metallic and nonmetallic structures is discussed. Grounding and bonding philosophy is outlined for the electrical systems and electronic compartments which contain high-voltage, high-power electrical and electronic equipment. The influence of plasma and particulates on the system is analyzed and the effects of static buildup on the spacecraft electrical system discussed. Conceptual grounding/bonding designs are assessed for capability to withstand high-current arcs to ground from a high-voltage conductor and electromagnetic interference. Also shown are the extravehicular activities required of the space station and/or supply spacecraft crew members to join and inspect the ground system using manual or remote assembly construction.

  8. Cables and connectors for Large Space System Technology (LSST)

    NASA Technical Reports Server (NTRS)

    Dunbar, W. G.

    1980-01-01

    The effect of the environment and extravehicular activity/remote assembly operations on the cables and connectors for spacecraft with metallic and/or nonmetallic structures was examined. Cable and connector philosophy was outlined for the electrical systems and electronic compartments which contain high-voltage, high-power electrical and electronic equipment. The influence of plasma and particulates on the system is analyzed and the effect of static buildup on the spacecraft electrical system discussed. Conceptual cable and connector designs are assessed for capability to withstand high current and high voltage without danger of arcs and electromagnetic interference. The extravehicular activities required of the space station and/or supply spacecraft crew members to join and inspect the electrical system, using manual or remote assembly construction, are also considered.

  9. Ground/bonding for Large Space System Technology (LSST)

    NASA Astrophysics Data System (ADS)

    Dunbar, W. G.

    1980-04-01

    The influence of the environment and extravehicular activity/remote assembly operations on the grounding and bonding of metallic and nonmetallic structures is discussed. Grounding and bonding philosophy is outlined for the electrical systems and electronic compartments which contain high-voltage, high-power electrical and electronic equipment. The influence of plasma and particulates on the system is analyzed and the effects of static buildup on the spacecraft electrical system discussed. Conceptual grounding/bonding designs are assessed for capability to withstand high-current arcs to ground from a high-voltage conductor and electromagnetic interference. Also shown are the extravehicular activities required of the space station and/or supply spacecraft crew members to join and inspect the ground system using manual or remote assembly construction.

  10. IAC level "O" program development

    NASA Technical Reports Server (NTRS)

    Vos, R. G.

    1982-01-01

    The current status of the IAC development activity is summarized. The listed prototype software and documentation were delivered, and details were planned for development of the level 1 operational system. The planned end product IAC is required to support LSST design analysis and performance evaluation, with emphasis on the coupling of required technical disciplines. The long term IAC effectively provides two distinct features: a specific set of analysis modules (thermal, structural, controls, antenna radiation performance and instrument optical performance) that will function together with the IAC supporting software in an integrated and user friendly manner; and a general framework whereby new analysis modules can readily be incorporated into IAC or be allowed to communicate with it.

  11. The Hyper Suprime-Cam software pipeline

    DOE PAGES

    Bosch, James; Armstrong, Robert; Bickerton, Steven; ...

    2017-10-12

    In this article, we describe the optical imaging data processing pipeline developed for the Subaru Telescope’s Hyper Suprime-Cam (HSC) instrument. The HSC Pipeline builds on the prototype pipeline being developed by the Large Synoptic Survey Telescope’s Data Management system, adding customizations for HSC, large-scale processing capabilities, and novel algorithms that have since been reincorporated into the LSST codebase. While designed primarily to reduce HSC Subaru Strategic Program (SSP) data, it is also the recommended pipeline for reducing general-observer HSC data. The HSC pipeline includes high-level processing steps that generate coadded images and science-ready catalogs as well as low-level detrending and image characterizations.

  12. The Hyper Suprime-Cam software pipeline

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bosch, James; Armstrong, Robert; Bickerton, Steven

    In this article, we describe the optical imaging data processing pipeline developed for the Subaru Telescope’s Hyper Suprime-Cam (HSC) instrument. The HSC Pipeline builds on the prototype pipeline being developed by the Large Synoptic Survey Telescope’s Data Management system, adding customizations for HSC, large-scale processing capabilities, and novel algorithms that have since been reincorporated into the LSST codebase. While designed primarily to reduce HSC Subaru Strategic Program (SSP) data, it is also the recommended pipeline for reducing general-observer HSC data. The HSC pipeline includes high-level processing steps that generate coadded images and science-ready catalogs as well as low-level detrending and image characterizations.

  13. Robust Period Estimation Using Mutual Information for Multiband Light Curves in the Synoptic Survey Era

    NASA Astrophysics Data System (ADS)

    Huijse, Pablo; Estévez, Pablo A.; Förster, Francisco; Daniel, Scott F.; Connolly, Andrew J.; Protopapas, Pavlos; Carrasco, Rodrigo; Príncipe, José C.

    2018-05-01

    The Large Synoptic Survey Telescope (LSST) will produce an unprecedented amount of light curves using six optical bands. Robust and efficient methods that can aggregate data from multidimensional sparsely sampled time series are needed. In this paper we present a new method for light curve period estimation based on quadratic mutual information (QMI). The proposed method does not assume a particular model for the light curve nor its underlying probability density, and it is robust to non-Gaussian noise and outliers. By combining the QMI from several bands the true period can be estimated even when no single-band QMI yields the period. Period recovery performance as a function of average magnitude and sample size is measured using 30,000 synthetic multiband light curves of RR Lyrae and Cepheid variables generated by the LSST Operations and Catalog simulators. The results show that aggregating information from several bands is highly beneficial in LSST sparsely sampled time series, obtaining an absolute increase in period recovery rate of up to 50%. We also show that the QMI is more robust to noise and light curve length (sample size) than the multiband generalizations of the Lomb–Scargle and AoV periodograms, recovering the true period in 10%–30% more cases than its competitors. A Python package containing efficient Cython implementations of the QMI and other methods is provided.
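    The period-search task this abstract addresses can be illustrated with a much simpler statistic than the paper's quadratic mutual information: phase dispersion minimization, which likewise folds a sparsely sampled light curve at trial periods and scores how well each fold organizes the data. The code below is a sketch under that substitution, not the QMI method itself.

```python
import numpy as np

def phase_dispersion(t, y, period, nbins=10):
    """Score a trial period by the variance of y within phase bins; a good
    period folds the data into a tight curve, giving a low score. This is a
    classic PDM-style statistic, used here only as a simple stand-in for the
    paper's quadratic-mutual-information estimator."""
    phase = (t / period) % 1.0
    bins = np.floor(phase * nbins).astype(int)
    total = 0.0
    for b in range(nbins):
        vals = y[bins == b]
        if len(vals) > 1:
            total += np.var(vals) * len(vals)
    return total / len(y)

def best_period(t, y, trial_periods):
    scores = [phase_dispersion(t, y, p) for p in trial_periods]
    return trial_periods[int(np.argmin(scores))]

# Sparse, irregular sampling of a sinusoid with true period 0.75 days.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 30.0, 120))
y = np.sin(2 * np.pi * t / 0.75)
p_best = best_period(t, y, np.linspace(0.5, 1.0, 501))
```

Because the sampling is irregular, the folded data only line up near the true period, which is where the dispersion score reaches its minimum.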

  14. A Euclid, LSST and WFIRST Joint Processing Study

    NASA Astrophysics Data System (ADS)

    Chary, Ranga-Ram; Joint Processing Working Group

    2018-01-01

    Euclid, LSST and WFIRST are the flagship cosmological projects of the next decade. By mapping several thousand square degrees of sky and covering the electromagnetic spectrum from the optical to the NIR with (sub-)arcsec resolution, these projects will provide exciting new constraints on the nature of dark energy and dark matter. The ultimate cosmological, astrophysical and time-domain science yield from these missions, which will detect several billion sources, requires joint processing at the pixel level. Three U.S. agencies (DOE, NASA and NSF) are supporting an 18-month study which aims to 1) assess the optimal techniques to combine these and ancillary data sets at the pixel level; 2) investigate options for an interface that will enable community access to the joint data products; and 3) identify the computing and networking infrastructure to properly handle and manipulate these large datasets together. A Joint Processing Working Group (JPWG) is carrying out this study and consists of US-based members from the community and science/data processing centers of each of these projects. Coordination with European partners is envisioned in the future and European Euclid members are involved in the JPWG as observers. The JPWG will scope the effort and resources required to build up the capabilities to support scientific investigations using joint processing in time for the start of science surveys by LSST and Euclid.

  15. Solar System Science with LSST

    NASA Astrophysics Data System (ADS)

    Jones, R. L.; Chesley, S. R.; Connolly, A. J.; Harris, A. W.; Ivezic, Z.; Knezevic, Z.; Kubica, J.; Milani, A.; Trilling, D. E.

    2008-09-01

    The Large Synoptic Survey Telescope (LSST) will provide a unique tool to study moving objects throughout the solar system, creating massive catalogs of Near Earth Objects (NEOs), asteroids, Trojans, TransNeptunian Objects (TNOs), comets and planetary satellites with well-measured orbits and high quality, multi-color photometry accurate to 0.005 magnitudes for the brightest objects. In the baseline LSST observing plan, back-to-back 15-second images will reach a limiting magnitude as faint as r=24.7 in each 9.6 square degree image, twice per night; a total of approximately 15,000 square degrees of the sky will be imaged in multiple filters every 3 nights. This time sampling will continue throughout each lunation, creating a huge database of observations. [Fig. 1: Sky coverage of LSST over 10 years, with separate panels for each of the six LSST filters; color bars indicate the number of observations per filter.] The catalogs will include more than 80% of the potentially hazardous asteroids larger than 140m in diameter within the first 10 years of LSST operation, millions of main-belt asteroids and perhaps 20,000 Trans-Neptunian Objects. Objects with diameters as small as 100m in the Main Belt and <100km in the Kuiper Belt can be detected in individual images. Specialized `deep drilling' observing sequences will detect KBOs down to 10s of kilometers in diameter. Long period comets will be detected at larger distances than previously possible, constraining models of the Oort cloud. With the large number of objects expected in the catalogs, it may be possible to observe a pristine comet start outgassing on its first journey into the inner solar system. By observing fields over a wide range of ecliptic longitudes and latitudes, including large separations from the ecliptic plane, not only will these catalogs greatly increase the numbers of known objects, the characterization of the inclination distributions of these populations will be much improved.
Derivation of proper elements for main belt and Trojan asteroids will allow ever finer resolution of asteroid families and their size-frequency distribution, as well as the study of the long-term dynamics of the individual asteroids and the asteroid belt as a whole. [Fig. 2: Orbital parameters of Main Belt Asteroids, color-coded according to ugriz colors measured by SDSS; the left panel shows osculating elements, the right panel proper elements - note the asteroid families visible as clumps in parameter space [1].] By obtaining multi-color ugrizy data for a substantial fraction of objects, relationships between color and dynamical history can be established. This will also enable taxonomic classification of asteroids, provide further links between diverse populations such as irregular satellites and TNOs or planetary Trojans, and enable estimates of asteroid diameter with rms uncertainty of 30%. With the addition of light-curve information, rotation periods and phase curves can be measured for large fractions of each population, leading to new insight on physical characteristics. Photometric variability information, together with sparse lightcurve inversion, will allow spin state and shape estimation for up to two orders of magnitude more objects than presently known. This will leverage physical studies of asteroids by constraining the size-strength relationship, which has important implications for the internal structure (solid, fractured, rubble pile) and in turn the collisional evolution of the asteroid belt. Similar information can be gained for other solar system bodies. [1] Parker, A., Ivezic
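    The 140 m threshold for potentially hazardous asteroids quoted above, and the abstract's 30% diameter uncertainties, rest on the standard conversion between absolute magnitude H, geometric albedo p_V, and diameter, D = (1329 km / sqrt(p_V)) * 10^(-H/5). A minimal sketch, assuming a representative albedo of 0.14:

```python
import math

def asteroid_diameter_km(H, albedo=0.14):
    """Standard relation between absolute magnitude H, geometric albedo p_V,
    and diameter: D = 1329 km / sqrt(p_V) * 10^(-H/5)."""
    return 1329.0 / math.sqrt(albedo) * 10 ** (-H / 5.0)

# H = 22 with a typical albedo corresponds to roughly the 140 m PHA threshold.
d = asteroid_diameter_km(22.0, albedo=0.14)
```

Since D scales as 1/sqrt(p_V), the factor-of-several spread in asteroid albedos is what drives the quoted ~30% diameter uncertainty.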

  16. Preparing for LSST with the LCOGT NEO Follow-up Network

    NASA Astrophysics Data System (ADS)

    Greenstreet, Sarah; Lister, Tim; Gomez, Edward

    2016-10-01

    The Las Cumbres Observatory Global Telescope Network (LCOGT) provides an ideal platform for follow-up and characterization of Solar System objects (e.g. asteroids, Kuiper Belt Objects, comets, Near-Earth Objects (NEOs)) and ultimately for the discovery of new objects. The LCOGT NEO Follow-up Network is using the LCOGT telescope network in addition to a web-based system developed to perform prioritized target selection, scheduling, and data reduction to confirm NEO candidates and characterize radar-targeted known NEOs. In order to determine how to maximize our NEO follow-up efforts, we must first define our goals for the LCOGT NEO Follow-up Network. This means answering the following questions. Should we follow up all objects brighter than some magnitude limit? Should we only focus on the brightest objects, or push to the limits of our capabilities by observing the faintest objects we think we can see and risk not finding the objects in our data? Do we (and how do we) prioritize objects somewhere in the middle of our observable magnitude range? If we want to push to faint objects, how do we minimize the amount of data in which the signal-to-noise ratio is too low to see the object? And how do we find a balance between performing follow-up and characterization observations? To help answer these questions, we have developed a LCOGT NEO Follow-up Network simulator that allows us to test our prioritization algorithms for target selection, confirm signal-to-noise predictions, and determine ideal block lengths and exposure times for observing NEO candidates. We will present our results from the simulator and progress on our NEO follow-up efforts. In the era of LSST, developing and utilizing infrastructure capable of handling the large number of detections expected to be produced on a daily basis by LSST, such as the LCOGT NEO Follow-up Network and our web-based platform for selecting, scheduling, and reducing NEO observations, will be critical to follow-up efforts.
We hope our work can act as an example and tool for the community as together we prepare for the age of LSST.
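    The exposure-time and signal-to-noise trade-offs discussed above can be illustrated with the background-limited scaling SNR ∝ flux × sqrt(t), with flux ∝ 10^(-0.4 m). The function and reference numbers below are illustrative assumptions, not the simulator's actual model.

```python
import math

def exposure_time_for_snr(mag, t_ref, mag_ref, snr_ref, snr_target):
    """Background-limited scaling: SNR ∝ flux * sqrt(t), flux ∝ 10^(-0.4 m).
    Given one reference measurement, estimate the exposure time needed to
    reach a target SNR at another magnitude. Illustrative scaling only."""
    flux_ratio = 10 ** (-0.4 * (mag - mag_ref))
    return t_ref * (snr_target / (snr_ref * flux_ratio)) ** 2

# If a mag-20 NEO gives SNR 20 in 60 s, how long for SNR 10 at mag 21.5?
t_needed = exposure_time_for_snr(21.5, t_ref=60.0, mag_ref=20.0,
                                 snr_ref=20.0, snr_target=10.0)
```

The quadratic dependence on target SNR is what makes "push to the faintest objects" so expensive: each 1.5 mag of depth at fixed SNR costs roughly a factor of 16 in exposure time.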

  17. Toroid Joining Gun. [thermoplastic welding system using induction heating

    NASA Technical Reports Server (NTRS)

    Buckley, J. D.; Fox, R. L.; Swaim, R. J.

    1985-01-01

    The Toroid Joining Gun is a low cost, self-contained, portable, low-powered (100-400 watts) thermoplastic welding system developed at Langley Research Center for joining plastic and composite parts using an induction heating technique. The device, developed for use in the fabrication of large space structures (LSST Program), can be used in any atmosphere or in a vacuum. Components can be joined in situ, whether on Earth or on a space platform. The expanded application of this welding gun is in the joining of thermoplastic composites, thermosetting composites, metals, and combinations of these materials. Its low-power requirements, light weight, rapid response, low cost, portability, and effective joining make it a candidate for solving many varied and unique bonding tasks.

  18. In Pursuit of LSST Science Requirements: A Comparison of Photometry Algorithms

    NASA Astrophysics Data System (ADS)

    Becker, Andrew C.; Silvestri, Nicole M.; Owen, Russell E.; Ivezić, Željko; Lupton, Robert H.

    2007-12-01

    We have developed an end-to-end photometric data-processing pipeline to compare current photometric algorithms commonly used on ground-based imaging data. This test bed is exceedingly adaptable and enables us to perform many research and development tasks, including image subtraction and co-addition, object detection and measurements, the production of photometric catalogs, and the creation and stocking of database tables with time-series information. This testing has been undertaken to evaluate existing photometry algorithms for consideration by a next-generation image-processing pipeline for the Large Synoptic Survey Telescope (LSST). We outline the results of our tests for four packages: the Sloan Digital Sky Survey's Photo package, DAOPHOT and ALLFRAME, DOPHOT, and two versions of Source Extractor (SExtractor). The ability of these algorithms to perform point-source photometry, astrometry, shape measurements, and star-galaxy separation and to measure objects at low signal-to-noise ratio is quantified. We also perform a detailed crowded-field comparison of DAOPHOT and ALLFRAME, and profile the speed and memory requirements in detail for SExtractor. We find that both DAOPHOT and Photo are able to perform aperture photometry to high enough precision to meet LSST's science requirements, but less adequately at PSF-fitting photometry. Photo performs the best at simultaneous point- and extended-source shape and brightness measurements. SExtractor is the fastest algorithm, and recent upgrades in the software yield high-quality centroid and shape measurements with little bias toward faint magnitudes. ALLFRAME yields the best photometric results in crowded fields.
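    As a reference point for the comparison above, the core of aperture photometry, which the tested packages implement in far more sophisticated forms, can be sketched in a few lines: sum the counts in a circular aperture and subtract a sky level estimated in a surrounding annulus.

```python
import numpy as np

def aperture_photometry(image, x0, y0, r_ap, r_in, r_out):
    """Sum counts in a circular aperture of radius r_ap, subtracting the
    median sky level estimated in an annulus [r_in, r_out]. A simplified
    version of what packages like DAOPHOT and SExtractor do."""
    yy, xx = np.indices(image.shape)
    r = np.hypot(xx - x0, yy - y0)
    in_aperture = r <= r_ap
    sky = image[(r >= r_in) & (r <= r_out)]
    return image[in_aperture].sum() - np.median(sky) * in_aperture.sum()

img = np.full((64, 64), 100.0)   # flat sky background of 100 counts/pixel
img[32, 32] += 5000.0            # synthetic point source in one pixel
flux = aperture_photometry(img, 32, 32, r_ap=5, r_in=10, r_out=15)
```

PSF-fitting photometry, where the packages differ most in the study above, replaces the fixed aperture with a fit of a point-spread-function model to the pixel values.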

  19. The variable sky of deep synoptic surveys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ridgway, Stephen T.; Matheson, Thomas; Mighell, Kenneth J.

    2014-11-20

    The discovery of variable and transient sources is an essential product of synoptic surveys. The alert stream will require filtering for personalized criteria—a process managed by a functionality commonly described as a Broker. In order to understand quantitatively the magnitude of the alert generation and Broker tasks, we have undertaken an analysis of the most numerous types of variable targets in the sky—Galactic stars, quasi-stellar objects (QSOs), active galactic nuclei (AGNs), and asteroids. It is found that the Large Synoptic Survey Telescope (LSST) will be capable of discovering ∼10^5 high latitude (|b| > 20°) variable stars per night at the beginning of the survey. (The corresponding number for |b| < 20° is orders of magnitude larger, but subject to caveats concerning extinction and crowding.) However, the number of new discoveries may well drop below 100 per night within less than one year. The same analysis applied to GAIA clarifies the complementarity of the GAIA and LSST surveys. Discoveries of AGNs and QSOs are each predicted to begin at ∼3000 per night and decrease by 50 times over four years. Supernovae are expected at ∼1100 per night, and after several survey years will dominate the new variable discovery rate. LSST asteroid discoveries will start at >10^5 per night, and if orbital determination has a 50% success rate per epoch, they will drop below 1000 per night within two years.
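    The Broker function described above reduces, at its core, to applying per-user predicates to an alert stream. A toy sketch with hypothetical alert fields (not an actual LSST alert schema):

```python
# Toy sketch of a Broker: route each alert to every user whose filter it
# passes. Field names ("kind", "mag", "gal_lat") are illustrative only.
def broker(alerts, filters):
    """Yield (user, alert) pairs for every alert passing a user's predicate."""
    for alert in alerts:
        for user, predicate in filters.items():
            if predicate(alert):
                yield user, alert

alerts = [
    {"kind": "variable_star", "mag": 19.2, "gal_lat": 35.0},
    {"kind": "asteroid", "mag": 22.5, "gal_lat": 2.0},
]
filters = {
    "stellar": lambda a: a["kind"] == "variable_star" and abs(a["gal_lat"]) > 20,
    "neo": lambda a: a["kind"] == "asteroid",
}
matched = list(broker(alerts, filters))
```

At the alert rates quoted in the abstract, the real engineering challenge is running such predicates at ∼10^5 to 10^6 alerts per night with low latency.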

  20. Projected Near-Earth Object Discovery Performance of the Large Synoptic Survey Telescope

    NASA Technical Reports Server (NTRS)

    Chesley, Steven R.; Veres, Peter

    2017-01-01

    This report describes the methodology and results of an assessment study of the performance of the Large Synoptic Survey Telescope (LSST) in its planned efforts to detect and catalog near-Earth objects (NEOs).

  1. Searching for modified growth patterns with tomographic surveys

    NASA Astrophysics Data System (ADS)

    Zhao, Gong-Bo; Pogosian, Levon; Silvestri, Alessandra; Zylberberg, Joel

    2009-04-01

    In alternative theories of gravity, designed to produce cosmic acceleration at the current epoch, the growth of large scale structure can be modified. We study the potential of upcoming and future tomographic surveys such as the Dark Energy Survey (DES) and the Large Synoptic Survey Telescope (LSST), with the aid of cosmic microwave background (CMB) and supernovae data, to detect departures from the growth of cosmic structure expected within general relativity. We employ parametric forms to quantify the potential time- and scale-dependent variation of the effective gravitational constant and the differences between the two Newtonian potentials. We then apply the Fisher matrix technique to forecast the errors on the modified growth parameters from galaxy clustering, weak lensing, CMB, and their cross correlations across multiple photometric redshift bins. We find that even with conservative assumptions about the data, DES will produce nontrivial constraints on modified growth and that LSST will do significantly better.
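    The Fisher matrix technique used in this forecast has a compact general form: for a Gaussian likelihood with independent data points, F_ij = Σ_k (∂μ_k/∂p_i)(∂μ_k/∂p_j)/σ_k², and the marginalized 1σ errors are the square roots of the diagonal of F⁻¹. A minimal sketch on a toy two-parameter linear model (not the paper's observables):

```python
import numpy as np

def fisher_matrix(derivs, sigma):
    """F_ij = sum_k (d mu_k/d p_i)(d mu_k/d p_j) / sigma_k^2 for a Gaussian
    likelihood with independent data; derivs has shape (nparams, ndata)."""
    d = np.asarray(derivs) / np.asarray(sigma)
    return d @ d.T

# Toy model mu_k = p0 + p1 * x_k, observed at 11 points with sigma = 0.1.
x = np.linspace(0.0, 1.0, 11)
derivs = [np.ones_like(x), x]            # d mu/d p0 and d mu/d p1
F = fisher_matrix(derivs, sigma=np.full_like(x, 0.1))
errors = np.sqrt(np.diag(np.linalg.inv(F)))   # marginalized 1-sigma errors
```

Inverting the full matrix before taking the diagonal is what marginalizes over parameter degeneracies; using 1/sqrt(F_ii) instead would give the (smaller) errors with all other parameters held fixed.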

  2. Baseline design and requirements for the LSST rotating enclosure (dome)

    NASA Astrophysics Data System (ADS)

    Neill, D. R.; DeVries, J.; Hileman, E.; Sebag, J.; Gressler, W.; Wiecha, O.; Andrew, J.; Schoening, W.

    2014-07-01

    The Large Synoptic Survey Telescope (LSST) is a large (8.4 meter) wide-field (3.5 degree) survey telescope, which will be located on the Cerro Pachón summit in Chile. As a result of the wide field of view, its optical system is unusually susceptible to stray light; consequently, besides protecting the telescope from the environment, the rotating enclosure (Dome) also provides indispensable light baffling. All dome vents are covered with light baffles which simultaneously provide both essential dome flushing and stray light attenuation. The wind screen also (and primarily) functions as a light screen, providing only a minimum clear aperture. Since the dome must operate continuously and the drives produce significant heat, they are located on the fixed lower enclosure to facilitate glycol water cooling. To accommodate daytime thermal control, a duct system channels cooling air provided by the facility when the dome is in its parked position.

  3. LSST communications middleware implementation

    NASA Astrophysics Data System (ADS)

    Mills, Dave; Schumacher, German; Lotz, Paul

    2016-07-01

    The LSST communications middleware is based on a set of software abstractions which provide standard interfaces for common communications services. The observatory requires communication between diverse subsystems, implemented by different contractors, and comprehensive archiving of subsystem status data. The Service Abstraction Layer (SAL) is implemented using open source packages that implement open standards: DDS (Data Distribution Service) for data communication and SQL (Structured Query Language) for database access. For every subsystem, abstractions for each of the Telemetry datastreams, along with Command/Response and Events, have been agreed with the appropriate component vendor (such as Dome, TMA, Hexapod) and captured in ICDs (Interface Control Documents). The OpenSplice (PrismTech) Community Edition of DDS provides an LGPL-licensed distribution which may be freely redistributed. The availability of the full source code provides assurance that the project will be able to maintain it over the full 10-year survey, independent of the fortunes of the original providers.
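    The topic-based publish/subscribe pattern that SAL builds on top of DDS can be sketched in-process. The class and topic names below are illustrative only, not actual SAL or DDS interfaces.

```python
from collections import defaultdict

class MessageBus:
    """Toy topic-based pub/sub: subscribers register callbacks per topic,
    publishers push samples to a topic without knowing who is listening."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, sample):
        # Deliver the sample to every callback registered on this topic.
        for callback in self._subscribers[topic]:
            callback(sample)


bus = MessageBus()
received = []
# Hypothetical telemetry topic name in the Subsystem_Kind_field style.
bus.subscribe("Dome_Telemetry_position", received.append)
bus.publish("Dome_Telemetry_position", {"azimuth_deg": 123.4})
```

Decoupling publishers from subscribers is what lets independently contracted subsystems interoperate through nothing more than agreed topic schemas, which is exactly the role the ICDs play in the abstract above.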

  4. LSST camera readout chip ASPIC: test tools

    NASA Astrophysics Data System (ADS)

    Antilogus, P.; Bailly, Ph; Jeglot, J.; Juramy, C.; Lebbolo, H.; Martin, D.; Moniez, M.; Tocut, V.; Wicek, F.

    2012-02-01

    The LSST camera will have more than 3000 video-processing channels. The readout of this large focal plane requires a very compact readout chain. The correlated double sampling technique, which is generally used for the signal readout of CCDs, is also adopted for this application and implemented with the so-called "dual slope integrator" method. We have designed and implemented an ASIC for LSST: the Analog Signal Processing asIC (ASPIC). The goal is to amplify the signal close to the output, in order to maximize the signal-to-noise ratio, and to send differential outputs to the digitization stage. Other requirements are that each chip should process the output of half a CCD, that is, 8 channels, and should operate at 173 K. A specific back-end board has been designed especially for lab test purposes. It manages the clock signals, digitizes the analog differential outputs of the ASPIC, and stores data into a memory. It contains 8 ADCs (18 bits), 512 kwords of memory, and a USB interface. An FPGA manages all signals from/to all components on board and generates the timing sequence for the ASPIC. Its firmware is written in the Verilog and VHDL languages. Internal registers define the various test parameters of the ASPIC. A LabVIEW GUI allows these registers to be loaded or updated and proper operation to be checked. Several series of tests, including linearity, noise, and crosstalk, have been performed over the past year to characterize the ASPIC at room and cold temperatures. At present, the ASPIC, back-end board, and CCD detectors are being integrated to perform a characterization of the whole readout chain.
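    The dual-slope-integrator form of correlated double sampling amounts to integrating the CCD video waveform over the reset (baseline) window and over the signal window, then differencing the two, which cancels slowly varying offsets and reset noise. A simplified numerical sketch with a synthetic waveform:

```python
import numpy as np

def dual_slope_cds(samples, reset_window, signal_window):
    """Correlated double sampling by integration: average the video waveform
    over the reset (baseline) window and over the signal window, and return
    the difference. A simplified model of a dual-slope integrator."""
    baseline = np.mean(samples[reset_window[0]:reset_window[1]])
    level = np.mean(samples[signal_window[0]:signal_window[1]])
    return baseline - level   # the CCD video level drops with signal charge

# Synthetic video waveform: 1.0 V baseline, then 0.7 V after charge dump.
wave = np.concatenate([np.full(50, 1.0), np.full(50, 0.7)])
signal = dual_slope_cds(wave, reset_window=(0, 50), signal_window=(50, 100))
```

Averaging over a window, rather than taking two single samples, is what suppresses high-frequency noise; any offset common to both windows cancels in the subtraction.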

  5. LSST Telescope and Optics Status

    NASA Astrophysics Data System (ADS)

    Krabbendam, Victor; Gressler, W. J.; Andrew, J. R.; Barr, J. D.; DeVries, J.; Hileman, E.; Liang, M.; Neill, D. R.; Sebag, J.; Wiecha, O.; LSST Collaboration

    2011-01-01

    The LSST Project continues to advance the design and development of an observatory system capable of capturing 20,000 deg2 of the sky in six wavebands over ten years. Optical fabrication of the unique M1/M3 monolithic mirror has entered final front-surface optical processing. After substantial grinding to remove 5 tons of excess glass above the M3 surface, a remnant of the single spin casting, both distinct optical surfaces are now clearly evident. Loose abrasive grinding has begun; polishing will occur during 2011, and final optical testing is planned for early 2012. The M1/M3 telescope cell and internal component designs have matured to support on-telescope operational requirements and off-telescope coating needs. The mirror position system (hardpoint actuators) and mirror support system (figure actuators) designs have developed through internal laboratory analysis and testing. A review of thermal requirements has assisted with the definition of a thermal conditioning and control system. Pre-cooling the M1/M3 substrate will enable productive observing during the large temperature swing often seen at twilight. The M2 ULE™ substrate is complete and lies in storage awaiting additional funding to enable final optical polishing. This 3.5 m diameter, 100 mm thick meniscus substrate has been ground to within 40 microns of its final figure. A detailed design of the telescope mount, including subflooring, has been developed. Finally, substantial progress has been achieved on the facility design. In early 2010, LSST contracted with ARCADIS Geotecnica Consultores, a Santiago-based engineering firm, to lead the formal architectural design effort for the summit facility.

  6. Consequences of CCD imperfections for cosmology determined by weak lensing surveys: from laboratory measurements to cosmological parameter bias

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Okura, Yuki; Petri, Andrea; May, Morgan

    Weak gravitational lensing causes subtle changes in the apparent shapes of galaxies due to the bending of light by the gravity of foreground masses. By measuring the shapes of large numbers of galaxies (millions in recent surveys, up to tens of billions in future surveys) we can infer the parameters that determine cosmology. Imperfections in the detectors used to record images of the sky can introduce changes in the apparent shape of galaxies, which in turn can bias the inferred cosmological parameters. We consider the effect of two widely discussed sensor imperfections: tree rings, due to impurity gradients which cause transverse electric fields in the charge-coupled devices (CCDs), and pixel-size variation, due to periodic CCD fabrication errors. These imperfections can be observed when the detectors are subject to uniform illumination (flat-field images). We develop methods to determine the spurious shear and convergence (due to the imperfections) from the flat-field images. We calculate how the spurious shear, when added to the lensing shear, will bias the determination of cosmological parameters. We apply our methods to candidate sensors of the Large Synoptic Survey Telescope (LSST) as a timely and important example, analyzing flat-field images recorded with LSST prototype CCDs in the laboratory. We find that the tree rings and periodic pixel-size variation present in the LSST CCDs will introduce negligible bias to cosmological parameters determined from the lensing power spectrum, specifically w, Ω_m, and σ_8.
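    The paper's starting point, that periodic pixel-size variation shows up as a periodic brightness modulation in flat fields (larger pixels collect more light under uniform illumination), can be sketched with synthetic data and an FFT. The period, amplitude, and noise level below are illustrative only, not the LSST sensor measurements.

```python
import numpy as np

# Synthetic flat field: 1% periodic response modulation (period 16 columns)
# plus per-pixel noise, standing in for a periodic pixel-size variation.
rng = np.random.default_rng(1)
nrows, ncols, period, amp = 256, 512, 16.0, 0.01
cols = np.arange(ncols)
flat = (1.0 + amp * np.sin(2 * np.pi * cols / period)
        + rng.normal(0.0, 0.002, (nrows, ncols)))

# Column-averaged fractional response; averaging rows beats down the noise.
profile = flat.mean(axis=0) - 1.0
power = np.abs(np.fft.rfft(profile)) ** 2
freqs = np.fft.rfftfreq(ncols)                  # cycles per pixel
peak_period = 1.0 / freqs[np.argmax(power[1:]) + 1]   # skip the DC bin
```

The recovered period of the flat-field modulation then feeds a model of the implied astrometric displacement, from which a spurious shear pattern can be computed.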

  7. Consequences of CCD imperfections for cosmology determined by weak lensing surveys: from laboratory measurements to cosmological parameter bias

    DOE PAGES

    Okura, Yuki; Petri, Andrea; May, Morgan; ...

    2016-06-27

    Weak gravitational lensing causes subtle changes in the apparent shapes of galaxies due to the bending of light by the gravity of foreground masses. By measuring the shapes of large numbers of galaxies (millions in recent surveys, up to tens of billions in future surveys) we can infer the parameters that determine cosmology. Imperfections in the detectors used to record images of the sky can introduce changes in the apparent shape of galaxies, which in turn can bias the inferred cosmological parameters. We consider the effect of two widely discussed sensor imperfections: tree rings, due to impurity gradients which cause transverse electric fields in the charge-coupled devices (CCDs), and pixel-size variation, due to periodic CCD fabrication errors. These imperfections can be observed when the detectors are subject to uniform illumination (flat-field images). We develop methods to determine the spurious shear and convergence (due to the imperfections) from the flat-field images. We calculate how the spurious shear, when added to the lensing shear, will bias the determination of cosmological parameters. We apply our methods to candidate sensors of the Large Synoptic Survey Telescope (LSST) as a timely and important example, analyzing flat-field images recorded with LSST prototype CCDs in the laboratory. We find that the tree rings and periodic pixel-size variation present in the LSST CCDs will introduce negligible bias to cosmological parameters determined from the lensing power spectrum, specifically w, Ω_m, and σ_8.

  8. Near-Earth Object Orbit Linking with the Large Synoptic Survey Telescope

    NASA Astrophysics Data System (ADS)

    Vereš, Peter; Chesley, Steven R.

    2017-07-01

    We have conducted a detailed simulation of the ability of the Large Synoptic Survey Telescope (LSST) to link near-Earth and main belt asteroid detections into orbits. The key elements of the study were a high-fidelity detection model and the presence of false detections in the form of both statistical noise and difference image artifacts. We employed the Moving Object Processing System (MOPS) to generate tracklets, tracks, and orbits with a realistic detection density for one month of the LSST survey. The main goals of the study were to understand whether (a) the linking of near-Earth objects (NEOs) into orbits can succeed in a realistic survey, (b) the number of false tracks and orbits will be manageable, and (c) the accuracy of linked orbits would be sufficient for automated processing of discoveries and attributions. We found that the overall density of asteroids was more than 5000 per LSST field near opposition on the ecliptic, plus up to 3000 false detections per field in good seeing. We achieved 93.6% NEO linking efficiency for H < 22 on tracks composed of tracklets from at least three distinct nights within a 12-day interval. The derived NEO catalog comprised 96% correct linkages. Less than 0.1% of orbits included false detections, and the remainder of false linkages stemmed from main belt confusion, which was an artifact of the short time span of the simulation. The MOPS linking efficiency can be improved by refined attribution of detections to known objects and by improved tuning of the internal kd-tree linking algorithms.
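    The first MOPS stage mentioned above, tracklet generation, pairs detections from the same night whose implied sky motion is plausible for an asteroid. A brute-force sketch of that idea (MOPS itself uses kd-trees to avoid the quadratic pair search; the thresholds here are illustrative):

```python
import numpy as np

def make_tracklets(dets, max_rate_deg_per_day=2.0, max_dt_days=0.1):
    """dets: list of (t_days, ra_deg, dec_deg) detections. Returns index
    pairs whose on-sky motion rate is consistent with an asteroid.
    Brute-force O(n^2) sketch; real pipelines use spatial indexing."""
    tracklets = []
    for i in range(len(dets)):
        for j in range(i + 1, len(dets)):
            t1, ra1, dec1 = dets[i]
            t2, ra2, dec2 = dets[j]
            dt = abs(t2 - t1)
            if 0 < dt <= max_dt_days:
                # Angular separation with the cos(dec) metric factor on RA.
                sep = np.hypot((ra2 - ra1) * np.cos(np.radians(dec1)),
                               dec2 - dec1)
                if sep / dt <= max_rate_deg_per_day:
                    tracklets.append((i, j))
    return tracklets

dets = [(0.00, 150.000, 10.000),   # detection A
        (0.03, 150.010, 10.005),   # same object ~40 arcsec later: pairs with A
        (0.03, 155.000, 12.000)]   # unrelated detection, far too fast a rate
pairs = make_tracklets(dets)
```

Tracklets from different nights are then linked into tracks, and only tracks that yield a consistent orbit fit survive, which is how the false-detection load described above is beaten down.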

  9. Multi-Wavelength Spectroscopy of Tidal Disruption Flares: A Legacy Sample for the LSST Era

    NASA Astrophysics Data System (ADS)

    Cenko, Stephen

    2017-08-01

    When a star passes within the sphere of disruption of a massive black hole, tidal forces will overcome self-gravity and unbind the star. While approximately half of the stellar debris is ejected at high velocities, the remaining material stays bound to the black hole and accretes, resulting in a luminous, long-lived transient known as a tidal disruption flare (TDF). In addition to serving as unique laboratories for accretion physics, TDFs offer the hope of measuring black hole masses in galaxies much too distant for resolved kinematic studies. In order to realize this potential, we must better understand the detailed processes by which the bound debris circularizes and forms an accretion disk. Spectroscopy is critical to this effort, as emission and absorption line diagnostics provide insight into the location and physical state (velocity, density, composition) of the emitting gas (in analogy with quasars). UV spectra are particularly critical, as most strong atomic features fall in this bandpass, and high-redshift TDF discoveries from LSST will sample rest-frame UV wavelengths. Here we propose to obtain a sequence of UV (HST) and optical (Gemini/GMOS) spectra for a sample of 5 TDFs discovered by the Zwicky Transient Facility, doubling the number of TDFs with UV spectra. Our observations will directly test models for the generation of the UV/optical emission (circularization vs. reprocessing) by searching for outflows and measuring densities, temperatures, and composition as a function of time. This effort is critical to developing the framework by which we can infer black hole properties (e.g., mass) from LSST TDF discoveries.

  10. Comparison of virtual reality exergaming and home exercise programs in patients with subacromial impingement syndrome and scapular dyskinesis: Short term effect.

    PubMed

    Pekyavas, Nihan Ozunlu; Ergun, Nevin

    2017-05-01

    The aim of this study was to compare the short-term effects of a home exercise program and virtual reality exergaming in patients with subacromial impingement syndrome (SAIS). A total of 30 patients with SAIS were randomized into two groups: a Home Exercise Program group (EX Group) (mean age: 40.6 ± 11.7 years) and a Virtual Reality Exergaming Program group (WII Group) (mean age: 40.33 ± 13.2 years). Subjects were assessed at the first session, at the end of the treatment (6 weeks) and at 1-month follow-up. The groups were assessed and compared with the Visual Analogue Scale (based on rest, activity and night pain), Neer and Hawkins Tests, Scapular Retraction Test (SRT), Scapular Assistance Test (SAT), Lateral Scapular Slide Test (LSST) and shoulder disability (Shoulder Pain and Disability Index (SPADI)). Intensity of pain was significantly decreased in both groups with the treatment (p < 0.05). The WII Group had significantly better results on the Neer test, SRT and SAT than the EX Group (p < 0.05). Virtual reality exergaming programs were found to be more effective than home exercise programs in the short term in subjects with SAIS. Level I, Therapeutic study.

  11. A warm Spitzer survey of the LSST/DES 'Deep drilling' fields

    NASA Astrophysics Data System (ADS)

    Lacy, Mark; Farrah, Duncan; Brandt, Niel; Sako, Masao; Richards, Gordon; Norris, Ray; Ridgway, Susan; Afonso, Jose; Brunner, Robert; Clements, Dave; Cooray, Asantha; Covone, Giovanni; D'Andrea, Chris; Dickinson, Mark; Ferguson, Harry; Frieman, Joshua; Gupta, Ravi; Hatziminaoglou, Evanthia; Jarvis, Matt; Kimball, Amy; Lubin, Lori; Mao, Minnie; Marchetti, Lucia; Mauduit, Jean-Christophe; Mei, Simona; Newman, Jeffrey; Nichol, Robert; Oliver, Seb; Perez-Fournon, Ismael; Pierre, Marguerite; Rottgering, Huub; Seymour, Nick; Smail, Ian; Surace, Jason; Thorman, Paul; Vaccari, Mattia; Verma, Aprajita; Wilson, Gillian; Wood-Vasey, Michael; Cane, Rachel; Wechsler, Risa; Martini, Paul; Evrard, August; McMahon, Richard; Borne, Kirk; Capozzi, Diego; Huang, Jiashang; Lagos, Claudia; Lidman, Chris; Maraston, Claudia; Pforr, Janine; Sajina, Anna; Somerville, Rachel; Strauss, Michael; Jones, Kristen; Barkhouse, Wayne; Cooper, Michael; Ballantyne, David; Jagannathan, Preshanth; Murphy, Eric; Pradoni, Isabella; Suntzeff, Nicholas; Covarrubias, Ricardo; Spitler, Lee

    2014-12-01

    We propose a warm Spitzer survey to microJy depth of the four predefined Deep Drilling Fields (DDFs) for the Large Synoptic Survey Telescope (LSST), three of which are also deep drilling fields for the Dark Energy Survey (DES). Imaging these fields with warm Spitzer is a key component of the overall success of these projects, which address the 'Physics of the Universe' theme of the Astro2010 decadal survey. With deep, accurate near-infrared photometry from Spitzer in the DDFs, we will generate photometric redshift distributions to apply to the surveys as a whole. The DDFs are also the areas where the supernova searches of DES and LSST are concentrated, and deep Spitzer data are essential to obtain photometric redshifts, stellar masses, and constraints on ages and metallicities for the >10000 supernova host galaxies these surveys will find. This 'DEEPDRILL' survey will also address the 'Cosmic Dawn' goal of Astro2010 by being deep enough to find all the >10^11 solar mass galaxies within the survey area out to z~6. DEEPDRILL will complete the final 24.4 square degrees of imaging in the DDFs, which, when added to the 14 square degrees already imaged to this depth, will map a volume of 1 Gpc^3 at z>2. It will find ~100 galaxies of >10^11 solar masses at z~5 and ~40 protoclusters at z>2, providing targets for JWST that can be found in no other way. The Spitzer data, in conjunction with the multiwavelength surveys in these fields, ranging from X-ray through far-infrared and cm-radio, will comprise a unique legacy dataset for studies of galaxy evolution.

  12. Stellar Populations and Nearby Galaxies with the LSST

    NASA Astrophysics Data System (ADS)

    Saha, Abhijit; Olsen, K.; Monet, D. G.; LSST Stellar Populations Collaboration

    2009-01-01

    The LSST will produce a multi-color map and photometric object catalog of half the sky to r=27.6 (AB mag; 5-sigma). Time-space sampling of each field spanning ten years will allow variability, proper motion, and parallax measurements for objects brighter than r=24.7. As part of providing an unprecedented map of the Galaxy, the accurate multi-band photometry will permit photometric parallaxes, chemical abundances, and a handle on ages via colors at turn-off for main-sequence (MS) stars at all distances within the Galaxy, as well as in the Magellanic Clouds and the dwarf satellites of the Milky Way. This will support comprehensive studies of star formation histories and chemical evolution for field stars. The structures of the Clouds and dwarf spheroidals will be traced with the MS stars, to equivalent surface densities fainter than 35 mag/square arc-second. With geometric parallax accuracy of 1 milli-arc-sec, comparable to HIPPARCOS but reaching more than 10 magnitudes fainter, a robust, complete sample of solar neighborhood stars will be obtained. The LSST time sampling will identify and characterize variable stars of all types, on time scales from 1 hr to several years, a feast for variable star astrophysics. The combination of wide coverage, multi-band photometry, time sampling, and parallax taken together will address several key problems: e.g., fine-tuning the extragalactic distance scale by examining properties of RR Lyraes and Cepheids as a function of parent population, extending the faint end of the galaxy luminosity function by discovering new dwarf galaxies through star-count density enhancements on degree scales, and identifying intergalactic stars through novae and Long Period Variables.
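    The parallax-to-distance arithmetic implied by the abstract can be sketched as follows; the function names and the 5-sigma selection cut are illustrative assumptions, not LSST pipeline code:

    ```python
    # Illustrative sketch (not LSST pipeline code): converting trigonometric
    # parallaxes with the quoted ~1 mas accuracy into distances, and keeping
    # only stars whose parallax is measured at high significance.
    import numpy as np

    SIGMA_PLX_MAS = 1.0  # assumed per-star parallax accuracy (milliarcsec)

    def parallax_to_distance_pc(plx_mas):
        """Distance in parsecs from a parallax in milliarcseconds."""
        return 1000.0 / plx_mas

    def robust_sample(plx_mas, min_snr=5.0):
        """Select stars whose parallax significance exceeds min_snr."""
        plx = np.asarray(plx_mas, dtype=float)
        return plx[plx / SIGMA_PLX_MAS >= min_snr]
    ```

    For example, a star at 100 pc has a 10 mas parallax, a 10-sigma detection at 1 mas accuracy, so the "robust complete sample" is effectively volume-limited to roughly a few hundred parsecs.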

  13. Transient Go: A Mobile App for Transient Astronomy Outreach

    NASA Astrophysics Data System (ADS)

    Crichton, D.; Mahabal, A.; Djorgovski, S. G.; Drake, A.; Early, J.; Ivezic, Z.; Jacoby, S.; Kanbur, S.

    2016-12-01

    Augmented Reality (AR) is set to revolutionize human interaction with the real world, as demonstrated by the phenomenal success of `Pokemon Go'. That very technology can be used to rekindle interest in science at the school level. We are in the process of developing a prototype app based on sky maps that will use AR to introduce different classes of astronomical transients to students as they are discovered, i.e., in real time. This will involve transient streams from surveys such as the Catalina Real-time Transient Survey (CRTS) today and the Large Synoptic Survey Telescope (LSST) in the near future. The transient streams will be combined with archival and latest image cut-outs and other auxiliary data, as well as historical and statistical perspectives on each of the transient types being served. Such an app could easily be adapted to work with various NASA missions and NSF projects to enrich the student experience.

  14. Data Service: Distributed Data Capture and Replication

    NASA Astrophysics Data System (ADS)

    Warner, P. B.; Pietrowicz, S. R.

    2007-10-01

    Data Service is a critical component of the NOAO Data Management and Science Support (DMaSS) Solutions Platform, which is based on a service-oriented architecture, and is to replace the current NOAO Data Transport System. Its responsibilities include capturing data from NOAO and partner telescopes and instruments and replicating the data across multiple (currently six) storage sites. Java 5 was chosen as the implementation language, and Java EE as the underlying enterprise framework. Application metadata persistence is performed using EJB and Hibernate on the JBoss Application Server, with PostgreSQL as the persistence back-end. Although potentially any underlying mass storage system may be used as the Data Service file persistence technology, DTS deployments and Data Service test deployments currently use the Storage Resource Broker from SDSC. This paper presents an overview and high-level design of the Data Service, including aspects of deployment, e.g., for the LSST Data Challenge at the NCSA computing facilities.

  15. Properties of tree rings in LSST sensors

    DOE PAGES

    Park, H. Y.; Nomerotski, A.; Tsybychev, D.

    2017-05-30

    Images of uniformly illuminated sensors for the Large Synoptic Survey Telescope show circular periodic patterns with an appearance similar to tree rings. These patterns are caused by circularly symmetric variations of the dopant concentration in the monocrystalline silicon boule, induced by the manufacturing process. The non-uniform charge density produces a parasitic electric field inside the silicon sensor, which may distort the shapes of astronomical sources. Here, we analyzed data from fifteen LSST sensors produced by ITL to determine the main parameters of the tree rings: their amplitude and period, and their variability across the sensors tested at Brookhaven National Laboratory. The tree ring pattern has a weak dependence on wavelength, with the ring amplitude decreasing at longer wavelengths, which penetrate deeper into the silicon. The ring amplitude also grows toward the outer part of the wafer, from 0.1 to 1.0%, indicating that the resistivity variation is larger at larger radii.
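    A minimal sketch of how a ring amplitude like the 0.1-1.0% figure above could be extracted from a flat-field image, assuming the boule center is known; the function names and binning choices are illustrative, not the authors' actual analysis code:

    ```python
    # Hedged sketch: measure tree-ring amplitude by azimuthally averaging a
    # flat-field image around the (assumed known) wafer center.
    import numpy as np

    def radial_profile(flat, center, nbins=100):
        """Azimuthal mean of a flat-field image vs. radius from the boule center."""
        ny, nx = flat.shape
        y, x = np.indices((ny, nx))
        r = np.hypot(x - center[0], y - center[1])
        bins = np.linspace(0.0, r.max(), nbins + 1)
        idx = np.digitize(r.ravel(), bins) - 1
        prof = np.bincount(idx, weights=flat.ravel(), minlength=nbins)
        n = np.bincount(idx, minlength=nbins)
        centers = 0.5 * (bins[:-1] + bins[1:])
        return centers, prof[:nbins] / np.maximum(n[:nbins], 1)

    def ring_amplitude(profile):
        """Half peak-to-valley fractional variation of the normalized profile."""
        norm = profile / np.median(profile)
        return 0.5 * (norm.max() - norm.min())
    ```

    On a synthetic flat with a 1% sinusoidal ring pattern, `ring_amplitude` recovers roughly 0.01, i.e., the percent-level variation quoted in the abstract.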

  16. Fringing in MonoCam Y4 filter images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brooks, J.; Fisher-Levine, M.; Nomerotski, A.

    Here, we study the fringing patterns observed in MonoCam, a camera with a single Large Synoptic Survey Telescope (LSST) CCD sensor. Images were taken at the U.S. Naval Observatory in Flagstaff, Arizona (NOFS) using its 1.3 m telescope and an LSST y4 filter. Fringing occurs when infrared light (700 nm or longer) reflects from the bottom surface of the CCD and interferes constructively or destructively with the incident light, producing a net "fringe" pattern superimposed on all images taken. Emission lines from the atmosphere, dominated by hydroxyl (OH) spectra, can change in their relative intensities over the course of the night, producing different fringe patterns in the images taken. We found through several methods that the general shape of the fringe patterns remained constant, with only slight changes in the amplitude and phase of the fringes. Finally, we found that a superposition of fringes from two monochromatic lines taken in the lab offered a reasonable description of the sky data.

  17. scarlet: Source separation in multi-band images by Constrained Matrix Factorization

    NASA Astrophysics Data System (ADS)

    Melchior, Peter; Moolekamp, Fred; Jerdee, Maximilian; Armstrong, Robert; Sun, Ai-Lei; Bosch, James; Lupton, Robert

    2018-03-01

    SCARLET performs source separation (aka "deblending") on multi-band images. It is geared towards optical astronomy, where scenes are composed of stars and galaxies, but it is straightforward to apply it to other imaging data. Separation is achieved through a constrained matrix factorization, which models each source with a Spectral Energy Distribution (SED) and a non-parametric morphology, or with multiple such components per source. The code performs forced photometry (with PSF matching if needed) using an optimal weight function given by the signal-to-noise weighted morphology across bands. The approach works well if the sources in the scene have different colors and can be further strengthened by imposing various additional constraints/priors on each source. Because of its generic utility, this package provides a stand-alone implementation that contains the core components of the source separation algorithm. However, the development of this package is part of the LSST Science Pipelines; the meas_deblender package contains a wrapper to implement the algorithms here for the LSST stack.
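    The constrained matrix factorization at the heart of the method can be illustrated with a plain nonnegative matrix factorization, here solved with Lee-Seung multiplicative updates as a stand-in for scarlet's proximal solver (scarlet itself adds further morphology constraints such as monotonicity and symmetry):

    ```python
    # Minimal sketch of the deblending idea: model a multi-band image cube
    # Y (bands x pixels) as A @ S, where each column of A is a source SED and
    # each row of S a nonnegative source morphology. This is generic NMF, not
    # the scarlet package's own solver.
    import numpy as np

    def nmf_deblend(Y, n_sources, n_iter=1000, eps=1e-9):
        rng = np.random.default_rng(0)
        n_bands, n_pix = Y.shape
        A = rng.random((n_bands, n_sources))   # per-source SEDs
        S = rng.random((n_sources, n_pix))     # per-source morphologies
        for _ in range(n_iter):
            # Lee-Seung multiplicative updates preserve nonnegativity.
            S *= (A.T @ Y) / (A.T @ A @ S + eps)
            A *= (Y @ S.T) / (A @ S @ S.T + eps)
        return A, S
    ```

    The nonnegativity constraint is what lets distinct SEDs separate overlapping sources, matching the abstract's remark that the approach works best when sources have different colors.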

  18. Properties of tree rings in LSST sensors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, H. Y.; Nomerotski, A.; Tsybychev, D.

    Images of uniformly illuminated sensors for the Large Synoptic Survey Telescope show circular periodic patterns with an appearance similar to tree rings. These patterns are caused by circularly symmetric variations of the dopant concentration in the monocrystalline silicon boule, induced by the manufacturing process. The non-uniform charge density produces a parasitic electric field inside the silicon sensor, which may distort the shapes of astronomical sources. Here, we analyzed data from fifteen LSST sensors produced by ITL to determine the main parameters of the tree rings: their amplitude and period, and their variability across the sensors tested at Brookhaven National Laboratory. The tree ring pattern has a weak dependence on wavelength, with the ring amplitude decreasing at longer wavelengths, which penetrate deeper into the silicon. The ring amplitude also grows toward the outer part of the wafer, from 0.1 to 1.0%, indicating that the resistivity variation is larger at larger radii.

  19. A study of astrometric distortions due to “tree rings” in CCD sensors using LSST Photon Simulator

    DOE PAGES

    Beamer, Benjamin; Nomerotski, Andrei; Tsybychev, Dmitri

    2015-05-22

    Imperfections in the production process of thick CCDs lead to circularly symmetric dopant concentration variations, which in turn produce electric fields transverse to the surface of the fully depleted CCD that displace the photogenerated charges. We use PhoSim, a Monte Carlo photon simulator, to explore the likely impact these dopant concentration variations will have on astrometric measurements with LSST. The scale and behavior of both the astrometric shifts imparted to point sources and the intensity variations in flat-field images that result from these doping imperfections are similar to those previously observed in Dark Energy Camera CCDs, giving initial confirmation of PhoSim's model for these effects. In addition, organized shape distortions were observed as a result of the symmetric nature of the dopant variations: nominally round sources acquire a measurable ellipticity either aligned with or transverse to the radial direction of the dopant variation pattern.

  20. Fringing in MonoCam Y4 filter images

    DOE PAGES

    Brooks, J.; Fisher-Levine, M.; Nomerotski, A.

    2017-05-05

    Here, we study the fringing patterns observed in MonoCam, a camera with a single Large Synoptic Survey Telescope (LSST) CCD sensor. Images were taken at the U.S. Naval Observatory in Flagstaff, Arizona (NOFS) using its 1.3 m telescope and an LSST y4 filter. Fringing occurs when infrared light (700 nm or longer) reflects from the bottom surface of the CCD and interferes constructively or destructively with the incident light, producing a net "fringe" pattern superimposed on all images taken. Emission lines from the atmosphere, dominated by hydroxyl (OH) spectra, can change in their relative intensities over the course of the night, producing different fringe patterns in the images taken. We found through several methods that the general shape of the fringe patterns remained constant, with only slight changes in the amplitude and phase of the fringes. Finally, we found that a superposition of fringes from two monochromatic lines taken in the lab offered a reasonable description of the sky data.
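    The superposition picture described in the abstract can be sketched as a sum of thin-film interference patterns, one per emission line; the silicon refractive index and fringe contrast below are assumed round numbers for illustration, not measured values from the paper:

    ```python
    # Illustrative sketch: each narrow OH emission line produces a thin-film
    # fringe pattern ~ cos(4*pi*n*d/lambda) set by the local CCD thickness d;
    # the observed fringe is the intensity-weighted sum over lines.
    import numpy as np

    N_SI = 3.6  # assumed refractive index of silicon in the y band

    def fringe(thickness_um, wavelength_um, contrast=0.02):
        """Fractional fringe modulation for one monochromatic line."""
        phase = 4.0 * np.pi * N_SI * np.asarray(thickness_um) / wavelength_um
        return 1.0 + contrast * np.cos(phase)

    def combined_fringe(thickness_um, lines):
        """lines: iterable of (wavelength_um, relative_intensity) pairs."""
        weights = np.array([w for _, w in lines], dtype=float)
        patterns = np.array([fringe(thickness_um, lam) for lam, _ in lines])
        return weights @ patterns / weights.sum()
    ```

    Changing the relative line intensities (the weights) reshuffles amplitude and phase of the combined pattern while its overall shape, set by the thickness map, stays fixed, consistent with what the paper reports.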

  1. The Large Synoptic Survey Telescope OCS and TCS models

    NASA Astrophysics Data System (ADS)

    Schumacher, German; Delgado, Francisco

    2010-07-01

    The Large Synoptic Survey Telescope (LSST) is a project envisioned as a system of systems with demanding science, technical, and operational requirements, that must perform as a fully integrated unit. The design and implementation of such a system poses big engineering challenges when performing requirements analysis, detailed interface definitions, operational modes and control strategy studies. The OMG System Modeling Language (SysML) has been selected as the framework for the systems engineering analysis and documentation for the LSST. Models for the overall system architecture and different observatory subsystems have been built describing requirements, structure, interfaces and behavior. In this paper we show the models for the Observatory Control System (OCS) and the Telescope Control System (TCS), and how this methodology has helped in the clarification of the design and requirements. In one common language, the relationships of the OCS, TCS, Camera and Data management subsystems are captured with models of the structure, behavior, requirements and the traceability between them.

  2. “Big Data” Teen Astronomy Cafes at NOAO

    NASA Astrophysics Data System (ADS)

    Pompea, Stephen; Walker, Constance E.

    2018-01-01

    The National Optical Astronomy Observatory has designed and implemented a prototype educational program to test and understand best practices for promoting an understanding of modern astronomy research, with its emphasis on large data sets, data tools, and visualization tools, among high school students. This program, designed to cultivate the interest of talented youth in astronomy, is based on the teen science café model developed at Los Alamos as Café Scientifique New Mexico. In our program, we provide a free, fun way for teens to explore current research topics in astronomy on Saturday mornings at the NOAO headquarters. The program encourages stimulating conversations with astronomers in an informal and relaxed setting, with free food, of course. The café is organized through a leadership team of local high school students and recruits students from all parts of the greater Tucson area. The high school students who attend have the opportunity to interact with expert astronomers working with large astronomical data sets on topics such as killer asteroids, the birth and death of stars, colliding galaxies, the structure of the universe, gravitational waves, gravitational lensing, dark energy, and dark matter. The students also have the opportunity to explore astronomical data sets and data tools using computers provided by the program. The program may serve as a model for educational outreach for the 40+ institutions involved in the LSST.

  3. chroma: Chromatic effects for LSST weak lensing

    NASA Astrophysics Data System (ADS)

    Meyers, Joshua E.; Burchat, Patricia R.

    2018-04-01

    Chroma investigates biases originating from two chromatic effects in the atmosphere: differential chromatic refraction (DCR), and wavelength dependence of seeing. These biases arise when using the point spread function (PSF) measured with stars to estimate the shapes of galaxies with different spectral energy distributions (SEDs) than the stars.
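    The DCR bias arises because atmospheric refraction is wavelength dependent; a toy calculation, using an illustrative dry-air dispersion formula rather than chroma's own atmosphere model, shows the scale of the effect:

    ```python
    # Hedged sketch of differential chromatic refraction: the atmosphere
    # refracts blue light more than red, shifting a source's apparent position
    # along the zenith direction by roughly R(lambda) ~ (n(lambda) - 1) * tan(z).
    import numpy as np

    def refractivity(wavelength_nm):
        """Approximate (n - 1) of dry air via a simple Cauchy-style formula."""
        lam_um = wavelength_nm / 1000.0
        return 1e-6 * (287.6 + 1.629 / lam_um**2 + 0.0136 / lam_um**4)

    def dcr_arcsec(wavelength_nm, ref_wavelength_nm, zenith_deg):
        """Refraction offset relative to a reference wavelength, in arcseconds."""
        tanz = np.tan(np.radians(zenith_deg))
        dn = refractivity(wavelength_nm) - refractivity(ref_wavelength_nm)
        return np.degrees(dn * tanz) * 3600.0
    ```

    At 45 degrees from zenith, 400 nm light lands roughly an arcsecond away from 600 nm light under this model, which is why a stellar PSF is a biased template for a galaxy with a different SED.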

  4. Near-Field Cosmology with Resolved Stellar Populations Around Local Volume LMC Stellar-Mass Galaxies

    NASA Astrophysics Data System (ADS)

    Carlin, Jeffrey L.; Sand, David J.; Willman, Beth; Brodie, Jean P.; Crnojevic, Denija; Forbes, Duncan; Hargis, Jonathan R.; Peter, Annika; Pucha, Ragadeepika; Romanowsky, Aaron J.; Spekkens, Kristine; Strader, Jay

    2018-06-01

    We discuss our ongoing observational program to comprehensively map the entire virial volumes of galaxies of roughly LMC stellar mass at distances of ~2-4 Mpc. The MADCASH (Magellanic Analog Dwarf Companions And Stellar Halos) survey will deliver the first census of the dwarf satellite populations and stellar halo properties within LMC-like environments in the Local Volume. Our results will inform our understanding of the recent DES discoveries of dwarf satellites tentatively affiliated with the LMC/SMC system. This program has already yielded the discovery of the faintest known dwarf galaxy satellite of an LMC stellar-mass host beyond the Local Group, based on deep Subaru+HyperSuprimeCam imaging reaching ~2 magnitudes below its TRGB, and at least two additional candidate satellites. We will summarize the survey results and status to date, highlighting some challenges encountered and lessons learned as we process the data for this program through a prototype LSST pipeline. Our program will examine whether LMC stellar-mass dwarfs have extended stellar halos, allowing us to assess the relative contributions of in-situ stars vs. merger debris to their stellar populations and halo density profiles. We outline the constraints on galaxy formation models that will be provided by our observations of low-mass galaxy halos and their satellites.

  5. Wood-Vasey DOE #SC0011834 Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood-Vasey, William Michael

    During the past reporting period (Year 3), this grant has provided partial support for graduate students Daniel Perrefort and Kara Ponder. They have been exploring different aspects of the technical work needed to take full advantage of the potential for cosmological inference using Type Ia supernovae (SNe Ia) with LSST.

  6. Dwarf Hosts of Low-z Supernovae

    NASA Astrophysics Data System (ADS)

    Pyotr Kolobow, Craig; Perlman, Eric S.; Strolger, Louis

    2018-01-01

    Hostless supernovae (SNe), or SNe in dwarf galaxies, may serve as excellent beacons for probing the spatial density of dwarf galaxies (M < 10^8 M⊙), which themselves are scarcely detected beyond only a few Mpc. Depending on the assumed model for the stellar-mass to halo-mass relation for these galaxies, LSST might see thousands of SNe (of all types) from dwarf galaxies alone. Conversely, one can take the measured rates of these SNe and test the model predictions for the density of dwarf galaxies in the local universe. Current "all-sky" surveys, like PanSTARRS and ASAS-SN, are now finding hostless SNe in numbers sufficient to measure their rate. What is missing is the appropriate weighting of their host luminosities. Here we seek to continue a successful program to recover the luminosities of these hostless SNe, to z = 0.15, and to use their rate to constrain the faint-end slope of the low-z galaxy luminosity function.

  7. Cosmic Visions Dark Energy: Small Projects Portfolio

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dawson, Kyle; Frieman, Josh; Heitmann, Katrin

    Understanding cosmic acceleration is one of the key science drivers for astrophysics and high-energy physics in the coming decade (2014 P5 Report). With the Large Synoptic Survey Telescope (LSST), the Dark Energy Spectroscopic Instrument (DESI), and other new facilities beginning operations soon, we are entering an exciting phase during which we expect an order of magnitude improvement in constraints on dark energy and the physics of the accelerating Universe. This is a key moment for a matching Small Projects portfolio that can (1) greatly enhance the science reach of these flagship projects, (2) have immediate scientific impact, and (3) lay the groundwork for the next stages of the Cosmic Frontier Dark Energy program. In this White Paper, we outline a balanced portfolio that can accomplish these goals through a combination of observational, experimental, and theory and simulation efforts.

  8. Rochester scientist discovers new comet with Dark Energy Camera (DECam) at

    Science.gov Websites

    David Cameron, a visiting scientist in Eric Mamajek's research group in the Department of

  9. Deriving photometric redshifts using fuzzy archetypes and self-organizing maps - II. Implementation

    NASA Astrophysics Data System (ADS)

    Speagle, Joshua S.; Eisenstein, Daniel J.

    2017-07-01

    With an eye towards the computational requirements of future large-scale surveys such as Euclid and the Large Synoptic Survey Telescope (LSST), which will require photometric redshifts (photo-z's) for ≳10^9 objects, we investigate a variety of ways that 'fuzzy archetypes' can be used to improve photometric redshifts and explore their respective statistical interpretations. We characterize their relative performance using an idealized LSST ugrizY and Euclid YJH mock catalogue of 10,000 objects spanning z = 0-6 at Y = 24 mag. We find most schemes are able to robustly identify redshift probability distribution functions that are multimodal and/or poorly constrained. Once these objects are flagged and removed, the results are generally in good agreement with the strict accuracy requirements necessary to meet Euclid weak lensing goals for most redshifts between 0.8 ≲ z ≲ 2. These results demonstrate the statistical robustness and flexibility that can be gained by combining template-fitting and machine-learning methods and provide useful insights into how astronomers can further exploit the colour-redshift relation.
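    The "flag and remove" step for multimodal or poorly constrained redshift PDFs could be sketched as follows; the thresholds and the function name are hypothetical, not those used in the paper:

    ```python
    # Illustrative sketch (names and cuts are hypothetical): flag photo-z
    # probability distributions that are multimodal or too broad before they
    # enter a weak-lensing analysis.
    import numpy as np

    def flag_pdf(z_grid, pdf, width_max=0.3, peak_frac=0.2):
        """Return True if the PDF is multimodal or poorly constrained."""
        dz = z_grid[1] - z_grid[0]          # assumes a uniform redshift grid
        pdf = pdf / (pdf.sum() * dz)        # normalize to unit integral
        # local maxima above a fraction of the global peak count as modes
        interior = pdf[1:-1]
        is_peak = (interior > pdf[:-2]) & (interior > pdf[2:])
        n_modes = int(np.sum(is_peak & (interior > peak_frac * pdf.max())))
        mean = np.sum(z_grid * pdf) * dz
        std = np.sqrt(np.sum((z_grid - mean) ** 2 * pdf) * dz)
        return n_modes > 1 or std > width_max
    ```

    A narrow single-peaked PDF passes, while a bimodal one is flagged; the cleaned sample is what the abstract compares against the Euclid accuracy requirements.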

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rasmussen, Andrew P.; Hale, Layton; Kim, Peter

    Meeting the science goals for the Large Synoptic Survey Telescope (LSST) translates into a demanding set of imaging performance requirements for the optical system over a wide (3.5°) field of view. In turn, meeting those imaging requirements necessitates maintaining precise control of the focal plane surface (10 μm P-V) over the entire field of view (640 mm diameter) at the operating temperature (T ≈ -100 °C) and over the operational elevation angle range. We briefly describe the hierarchical design approach for the LSST Camera focal plane and the baseline design for assembling the flat focal plane at room temperature. Preliminary results of gravity load and thermal distortion calculations are provided, and early metrological verification of candidate materials under cold thermal conditions is presented. A detailed, generalized method for stitching together sparse metrology data originating from differential, non-contact metrological data acquisition spanning the multiple (non-continuous) sensor surfaces making up the focal plane is described and demonstrated. Finally, we describe some in situ alignment verification alternatives, some of which may be integrated into the camera's focal plane.

  11. Using model based systems engineering for the development of the Large Synoptic Survey Telescope's operational plan

    NASA Astrophysics Data System (ADS)

    Selvy, Brian M.; Claver, Charles; Willman, Beth; Petravick, Don; Johnson, Margaret; Reil, Kevin; Marshall, Stuart; Thomas, Sandrine; Lotz, Paul; Schumacher, German; Lim, Kian-Tat; Jenness, Tim; Jacoby, Suzanne; Emmons, Ben; Axelrod, Tim

    2016-08-01

    We provide an overview of the Model Based Systems Engineering (MBSE) language, tool, and methodology being used in our development of the Operational Plan for Large Synoptic Survey Telescope (LSST) operations. LSST's Systems Engineering (SE) team is using a model-based approach to operational plan development to: 1) capture the top-down stakeholders' needs and functional allocations defining the scope, required tasks, and personnel needed for operations, and 2) capture the bottom-up operations and maintenance activities required to conduct the LSST survey across its distributed operations sites for the full ten-year survey duration. To accomplish these complementary goals and ensure that they produce self-consistent results, we have developed a holistic approach using the Sparx Enterprise Architect modeling tool and the Systems Modeling Language (SysML). This approach utilizes SysML Use Cases, Actors, associated relationships, and Activity Diagrams to document and refine all of the major operations and maintenance activities that will be required to successfully operate the observatory and meet stakeholder expectations. We have developed several customized extensions of the SysML language, including a custom stereotyped Use Case element with unique tagged values, as well as unique association connectors and Actor stereotypes. We demonstrate that this customized MBSE methodology enables us to define: 1) the roles each human Actor must take on to successfully carry out the activities associated with the Use Cases; 2) the skills each Actor must possess; 3) the functional allocation of all required stakeholder activities and Use Cases to the organizational entities tasked with carrying them out; and 4) the organization structure required to successfully execute the operational survey. Our approach allows for continual refinement, utilizing the systems engineering spiral method to expose finer levels of detail as necessary.
    For example, the bottom-up, Use Case-driven approach will be deployed in the future to develop the detailed work procedures required to successfully execute each operational activity.

  12. Lower Boundary Forcing related to the Occurrence of Rain in the Tropical Western Pacific

    NASA Astrophysics Data System (ADS)

    Li, Y.; Carbone, R. E.

    2013-12-01

    Global weather and climate models have a long and somewhat tortured history with respect to simulation and prediction of tropical rainfall in the relative absence of balanced flow in the geostrophic sense. An important correlate with tropical rainfall is sea surface temperature (SST). The introduction of SST information into convective rainfall parameterization in global models has improved model climatologies of tropical oceanic rainfall. Nevertheless, large systematic errors have persisted, several of which are common to most atmospheric models. Models have evolved to the point where increased spatial resolution demands representation of the SST field at compatible temporal and spatial scales, leading to common usage of monthly SST fields at scales of 10-100 km. While large systematic errors persist, significant skill has been realized from various atmospheric and coupled ocean models, including assimilation of weekly or even daily SST fields, as tested by the European Centre for Medium-Range Weather Forecasts. A few investigators have explored the role of SST gradients in relation to the occurrence of precipitation. Some of this research has focused on large-scale gradients, mainly associated with surface ocean-atmosphere climatology. These studies conclude that lower-boundary atmospheric convergence, under some conditions, could be substantially enhanced over SST gradients, destabilizing the atmosphere and thereby enabling moist convection. While the concept has a firm theoretical foundation, it has not gained a sizeable following beyond the realm of western boundary currents. Li and Carbone (2012) examined the role of transient mesoscale (~100 km) SST gradients in the western Pacific warm pool by means of GHRSST and CMORPH rainfall data. They found that excitation of deep moist convection was strongly associated with the Laplacian of SST (LSST).
    Specifically, -LSST is associated with rainfall onset in 75% of 10,000 events over 4 years, whereas the background ocean is symmetric about zero Laplacian. This finding is fully consistent with theory for gradients of order ~1 °C in low mean wind conditions, capable of inducing atmospheric convergence of N × 10^-5 s^-1. We will present new findings resulting from the application of a Madden-Julian oscillation (MJO) passband filter to GHRSST/CMORPH data. It shows that the -LSST field organizes at scales of 1000-2000 km and can persist for periods of two weeks to three months. Such -LSST anomalies are in quadrature with MJO rainfall, tracking and leading the wet phase of the MJO by 10-14 days, from the Indian Ocean to the dateline. More generally, an evaluation of SST structure in rainfall production will be presented, which represents a decidedly alternative view to conventional wisdom. Li, Yanping, and R.E. Carbone, 2012: Excitation of Rainfall over the Tropical Western Pacific, J. Atmos. Sci., 69, 2983-2994.
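    The -LSST diagnostic (here, the Laplacian of SST, not the telescope) amounts to a five-point finite-difference Laplacian on a gridded SST field; a minimal sketch, with an illustrative grid spacing:

    ```python
    # Sketch of the diagnostic described above: a five-point finite-difference
    # Laplacian of a gridded SST field ("LSST" in this abstract's sense).
    # The grid spacing is an illustrative assumption.
    import numpy as np

    def laplacian_sst(sst, dx_km=25.0):
        """Five-point Laplacian of SST (deg C per km^2) on a uniform grid."""
        lap = (np.roll(sst, 1, axis=0) + np.roll(sst, -1, axis=0) +
               np.roll(sst, 1, axis=1) + np.roll(sst, -1, axis=1) - 4.0 * sst)
        return lap / dx_km**2

    def negative_lsst_mask(sst, threshold=0.0):
        """Regions of negative Laplacian (local SST maxima), which the study
        associates with rainfall onset."""
        return laplacian_sst(sst) < threshold
    ```

    Note that np.roll wraps at the edges, so in practice the boundary rows and columns of the result would be masked or handled with one-sided differences.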

  13. LSST system analysis and integration task for an advanced science and application space platform

    NASA Technical Reports Server (NTRS)

    1980-01-01

    To support the development of an advanced science and application space platform (ASASP), requirements were derived for a representative set of payloads requiring large separation distances, selected from the Science and Applications Space Platform data base. These payloads were a 100 meter diameter atmospheric gravity wave antenna, a 100 meter by 100 meter particle beam injection experiment, a 2 meter diameter, 18 meter long astrometric telescope, and a 15 meter diameter, 35 meter long large ambient deployable IR telescope. A low earth orbit at 500 km altitude and 56 deg inclination was selected as the best compromise for meeting payload requirements. Platform subsystems were defined which would support the payload requirements, and a physical platform concept was developed. Structural system requirements were developed, including utilities accommodation, interface requirements, and platform strength and stiffness requirements. An attitude control system concept was also described. The resultant ASASP concept was analyzed, and technological developments deemed necessary in the area of large space systems were recommended.

  14. Augmenting the Funding Sources for Space Science and the ASTRO-1 Space Telescope

    NASA Astrophysics Data System (ADS)

    Morse, Jon

    2015-08-01

    The BoldlyGo Institute was formed in 2013 to augment the planned space science portfolio through philanthropically funded robotic space missions, similar to how some U.S. medical institutes and ground-based telescopes are funded. I introduce BoldlyGo's two current projects: the SCIM mission to Mars and the ASTRO-1 space telescope. In particular, ASTRO-1 is a 1.8-meter off-axis (unobscured) ultraviolet-visible space observatory to be located in a Lagrange point or heliocentric orbit with a wide-field panchromatic camera, medium- and high-resolution spectrograph, and high-contrast imaging coronagraph and/or an accompanying starshade/occulter. It is intended for the post-Hubble Space Telescope era in the 2020s, enabling unique measurements of a broad range of celestial targets, while providing vital complementary capabilities to other ground- and space-based facilities such as the JWST, ALMA, WFIRST-AFTA, LSST, TESS, Euclid, and PLATO. The ASTRO-1 architecture simultaneously wields great scientific power while being technically viable and affordable. A wide variety of scientific programs can be accomplished, addressing topics across space astronomy, astrophysics, fundamental physics, and solar system science, as well as being technologically informative to future large-aperture programs. ASTRO-1 is intended to be a new-generation research facility serving a broad national and international community, as well as a vessel for impactful public engagement. Traditional institutional partnerships and consortia, such as are common with private ground-based observatories, may play a role in the support and governance of ASTRO-1; we are currently engaging interested international organizations. In addition to our planned open guest observer program and accessible data archive, we intend to provide a mechanism whereby individual scientists can buy in to a fraction of the guaranteed observing time.
Our next step in ASTRO-1 development is to form the ASTRO-1 Requirements Team (ART), to which international scientists are invited to apply. The ART will be tasked with anchoring the science case, optimizing the observatory design, and constructing a design reference mission during late-2015 and 2016.

  15. DESCQA: Synthetic Sky Catalog Validation Framework

    NASA Astrophysics Data System (ADS)

    Mao, Yao-Yuan; Uram, Thomas D.; Zhou, Rongpu; Kovacs, Eve; Ricker, Paul M.; Kalmbach, J. Bryce; Padilla, Nelson; Lanusse, François; Zu, Ying; Tenneti, Ananth; Vikraman, Vinu; DeRose, Joseph

    2018-04-01

    The DESCQA framework provides rigorous validation protocols for assessing the quality of simulated sky catalogs in a straightforward and comprehensive way. DESCQA enables the inspection, validation, and comparison of an inhomogeneous set of synthetic catalogs via the provision of a common interface within an automated framework. An interactive web interface is also available at portal.nersc.gov/project/lsst/descqa.
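    The common-interface idea can be sketched in a few lines: every validation check implements the same method, so any catalog exposing the expected quantities can be run through the whole suite and the results tabulated uniformly. The following is an illustrative sketch only; the class and test names are invented for the example and do not reflect DESCQA's actual API.

```python
# Hypothetical sketch of a DESCQA-style validation protocol. Each test
# implements a common interface so heterogeneous synthetic catalogs can
# be checked uniformly. Names here are illustrative, not DESCQA's API.

class ValidationTest:
    """A check that any synthetic catalog must pass."""
    name = "base"

    def run(self, catalog):
        raise NotImplementedError

class StellarMassRangeTest(ValidationTest):
    """Passes if all stellar masses fall within a plausible range."""
    name = "stellar_mass_range"

    def __init__(self, lo=1e7, hi=1e13):
        self.lo, self.hi = lo, hi

    def run(self, catalog):
        masses = catalog["stellar_mass"]
        ok = all(self.lo <= m <= self.hi for m in masses)
        return {"test": self.name, "passed": ok}

def validate(catalogs, tests):
    """Run every test against every catalog and tabulate the results."""
    return {cat_name: [t.run(cat) for t in tests]
            for cat_name, cat in catalogs.items()}

# Usage: two mock "catalogs" expressed as plain dicts.
catalogs = {
    "mock_a": {"stellar_mass": [1e8, 5e10, 2e11]},
    "mock_b": {"stellar_mass": [1e5, 5e10]},  # contains an out-of-range entry
}
results = validate(catalogs, [StellarMassRangeTest()])
```

    The design point is that tests, not catalogs, own the pass/fail logic, which is what lets an inhomogeneous set of catalogs be compared side by side.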

  16. Final acceptance testing of the LSST monolithic primary/tertiary mirror

    NASA Astrophysics Data System (ADS)

    Tuell, Michael T.; Burge, James H.; Cuerden, Brian; Gressler, William; Martin, Hubert M.; West, Steven C.; Zhao, Chunyu

    2014-07-01

    The Large Synoptic Survey Telescope (LSST) is a three-mirror wide-field survey telescope with the primary and tertiary mirrors on one monolithic substrate. This substrate is made of Ohara E6 borosilicate glass in a honeycomb sandwich, spin cast at the Steward Observatory Mirror Lab at The University of Arizona. Each surface is aspheric, with the specification in terms of conic constant error, maximum active bending forces, and finally a structure function specification on the residual errors. There are high-order deformation terms, but they carry no tolerance of their own; any error is considered a surface error and is included in the structure function. The radii of curvature are very different, requiring two independent test stations, each with instantaneous phase-shifting interferometers with null correctors. The primary null corrector is a standard two-element Offner null lens. The tertiary null corrector is a phase-etched computer-generated hologram (CGH). This paper details the two optical systems and their tolerances, showing that the uncertainty in measuring the figure is a small fraction of the structure function specification. Additional metrology includes the radii of curvature, optical axis locations, and relative surface tilts. The methods for measuring these will also be described along with their tolerances.

  17. On the Detectability of Interstellar Objects Like 1I/'Oumuamua

    NASA Astrophysics Data System (ADS)

    Ragozzine, Darin

    2018-04-01

    Almost since Oort's 1950 hypothesis of a tenuously bound cloud of comets, planetary formation theorists have realized that the process of planet formation must have ejected very large numbers of planetesimals into interstellar space. Unfortunately, these objects are distributed over galactic volumes, while they are only likely to be detectable if they pass within a few AU of Earth, resulting in an incredibly sparse detectable population. Furthermore, hypotheses for the formation and distribution of these bodies allow for uncertainties of orders of magnitude in the expected detection rate: our analysis suggested LSST would discover 0.01-100 objects during its lifetime (Cook et al. 2016). The discovery of 1I/'Oumuamua by a survey less powerful than LSST indicates either a low-probability event and/or that the properties of this population are on the more favorable end of the spectrum. We revisit the detailed detection analysis of Cook et al. 2016 in light of the detection of 1I/'Oumuamua. We use these results to better understand 1I/'Oumuamua and to update our assessment of future detections of interstellar objects. We highlight some key questions that can be answered only by additional discoveries.

  18. The Zooniverse

    NASA Astrophysics Data System (ADS)

    Borne, K. D.; Fortson, L.; Gay, P.; Lintott, C.; Raddick, M. J.; Wallin, J.

    2009-12-01

    The remarkable success of Galaxy Zoo as a citizen science project for galaxy classification within a terascale astronomy data collection has led to the development of a broader collaboration, known as the Zooniverse. Activities will include astronomy, lunar science, solar science, and digital humanities. Some features of our program include development of a unified framework for citizen science projects, development of a common set of user-based research tools, engagement of the machine learning community to apply machine learning algorithms on the rich training data provided by citizen scientists, and extension across multiple research disciplines. The Zooniverse collaboration is just getting started, but already we are implementing a scientifically deep follow-on to Galaxy Zoo. This project, tentatively named Galaxy Merger Zoo, will engage users in running numerical simulations, whose input parameter space is voluminous and therefore demands a clever solution, such as allowing the citizen scientists to select their own sets of parameters, which then trigger new simulations of colliding galaxies. The user interface design has many of the engaging features that retain users, including rapid feedback, visually appealing graphics, and the sense of playing a competitive game for the benefit of science. We will discuss these topics. In addition, we will also describe applications of Citizen Science that are being considered for the petascale science project LSST (Large Synoptic Survey Telescope). LSST will produce a scientific data system that consists of a massive image archive (nearly 100 petabytes) and a similarly massive scientific parameter database (20-40 petabytes). Applications of Citizen Science for such an enormous data collection will enable greater scientific return in at least two ways. 
First, citizen scientists work with real data and perform authentic research tasks of value to the advancement of the science, providing "human computation" capabilities and resources to review, annotate, and explore aspects of the data that are too overwhelming for the science team. Second, citizen scientists' inputs (in the form of rich training data and class labels) can be used to improve the classifiers that the project team uses to classify and prioritize new events detected in the petascale data stream. This talk will review these topics and provide an update on the Zooniverse project.

  19. Liverpool Telescope and Liverpool Telescope 2

    NASA Astrophysics Data System (ADS)

    Copperwheat, C. M.; Steele, I. A.; Barnsley, R. M.; Bates, S. D.; Clay, N. R.; Jermak, H.; Marchant, J. M.; Mottram, C. J.; Piascik, A.; Smith, R. J.

    2016-12-01

    The Liverpool Telescope is a fully robotic optical/near-infrared telescope with a 2-metre clear aperture, located at the Observatorio del Roque de los Muchachos on the Canary Island of La Palma. The telescope is owned and operated by Liverpool John Moores University, with financial support from the UK's Science and Technology Facilities Council. The telescope began routine science operations in 2004 and is a common-user facility with time available through a variety of committees via an open, peer reviewed process. Seven simultaneously mounted instruments support a broad science programme, with a focus on transient follow-up and other time domain topics well suited to the characteristics of robotic observing. Development has also begun on a successor facility, with the working title `Liverpool Telescope 2', to capitalise on the new era of time domain astronomy which will be brought about by the next generation of survey facilities such as LSST. The fully robotic Liverpool Telescope 2 will have a 4-metre aperture and an improved response time. In this paper we provide an overview of the current status of both facilities.

  20. News and Views: LSST mirror blank; More and better maths; Free telescopes; The hurricane season is starting again Get ready: IYA2009 UK website up and running

    NASA Astrophysics Data System (ADS)

    2008-10-01

    As floods and hurricanes disrupt the lives of people round the world, a new generation of scientific tools are supporting both storm preparedness and recovery. As International Year of Astronomy 2009 approaches, the UK website is developing more features that make it easier to see what's planned for this science extravaganza.

  1. The Impact of Microlensing on the Standardisation of Strongly Lensed Type Ia Supernovae

    NASA Astrophysics Data System (ADS)

    Foxley-Marrable, Max; Collett, Thomas E.; Vernardos, Georgios; Goldstein, Daniel A.; Bacon, David

    2018-05-01

    We investigate the effect of microlensing on the standardisation of strongly lensed Type Ia supernovae (GLSNe Ia). We present predictions for the amount of scatter induced by microlensing across a range of plausible strong lens macromodels. We find that lensed images in regions of low convergence, shear and stellar density are standardisable, where the microlensing scatter is ≲ 0.15 magnitudes, comparable to the intrinsic dispersion of a typical SN Ia. These standardisable configurations correspond to asymmetric lenses with an image located far outside the Einstein radius of the lens. Symmetric and small Einstein radius lenses (≲ 0.5 arcsec) are not standardisable. We apply our model to the recently discovered GLSN Ia iPTF16geu and find that the large discrepancy between the observed flux and the macromodel predictions from More et al. (2017) cannot be explained by microlensing alone. Using the mock GLSNe Ia catalogue of Goldstein et al. (2017), we predict that ~22% of GLSNe Ia discovered by LSST will be standardisable, with a median Einstein radius of 0.9 arcseconds and a median time-delay of 41 days. By breaking the mass-sheet degeneracy, the full LSST GLSNe Ia sample will be able to detect systematics in H0 at the 0.5% level.

  2. Optical Variability and Classification of High Redshift (3.5 < z < 5.5) Quasars on SDSS Stripe 82

    NASA Astrophysics Data System (ADS)

    AlSayyad, Yusra; McGreer, Ian D.; Fan, Xiaohui; Connolly, Andrew J.; Ivezic, Zeljko; Becker, Andrew C.

    2015-01-01

    Recent studies have shown promise in combining optical colors with variability to efficiently select and estimate the redshifts of low- to mid-redshift quasars in upcoming ground-based time-domain surveys. We extend these studies to fainter and less abundant high-redshift quasars using light curves from 235 sq. deg. and 10 years of Stripe 82 imaging reprocessed with the prototype LSST data management stack. Sources are detected on the i-band co-adds (5σ: i ~ 24) but measured on the single-epoch (ugriz) images, generating complete and unbiased lightcurves for sources fainter than the single-epoch detection threshold. Using these forced photometry lightcurves, we explore optical variability characteristics of high redshift quasars and validate classification methods with particular attention to the low signal limit. In this low SNR limit, we quantify the degradation of the uncertainties and biases on variability parameters using simulated light curves. Completeness/efficiency and redshift accuracy are verified with new spectroscopic observations on the MMT and APO 3.5m. These preliminary results are part of a survey to measure the z~4 luminosity function for quasars (i < 23) on Stripe 82 and to validate purely photometric classification techniques for high redshift quasars in LSST.

  3. Prospects for Determining the Mass Distributions of Galaxy Clusters on Large Scales Using Weak Gravitational Lensing

    NASA Astrophysics Data System (ADS)

    Fong, M.; Bowyer, R.; Whitehead, A.; Lee, B.; King, L.; Applegate, D.; McCarthy, I.

    2018-05-01

    For more than two decades, the Navarro, Frenk, and White (NFW) model has stood the test of time; it has been used to describe the distribution of mass in galaxy clusters out to their outskirts. Stacked weak lensing measurements of clusters are now revealing the distribution of mass out to and beyond their virial radii, where the NFW model is no longer applicable. In this study we assess how well the parameterised Diemer & Kravtsov (DK) density profile describes the characteristic mass distribution of galaxy clusters extracted from cosmological simulations. This is determined from stacked synthetic lensing measurements of the 50 most massive clusters extracted from the Cosmo-OWLS simulations, using the Dark Matter Only run and also the run that most closely matches observations. The characteristics of the data reflect the Weighing the Giants survey and data from the future Large Synoptic Survey Telescope (LSST). In comparison with the NFW model, the DK model is favored by the stacked data, in particular for the future LSST data, where the number density of background galaxies is higher. The DK profile depends on the accretion history of clusters, which is specified in the current study. Eventually, however, subsamples of galaxy clusters with qualities indicative of disparate accretion histories could be studied.

  4. Exploring Two Approaches for an End-to-End Scientific Analysis Workflow

    NASA Astrophysics Data System (ADS)

    Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; Paterno, Marc; Sehrish, Saba

    2015-12-01

    The scientific discovery process can be advanced by the integration of independently-developed programs run on disparate computing facilities into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components to these workflows, and to execute different components on resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC); the first one is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second one is based on a programming language, Python. In this paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.

  5. New characterization techniques for LSST sensors

    DOE PAGES

    Nomerotski, A.

    2015-06-18

    Fully depleted, thick CCDs with extended infra-red response have become the sensor of choice for modern sky surveys. The charge transport effects in the silicon and associated astrometric distortions could make mapping between the sky coordinates and sensor coordinates non-trivial, and limit the ultimate precision achievable with these sensors. Two new characterization techniques for the CCDs, which both could probe these issues, are discussed: x-ray flat fielding and imaging of pinhole arrays.

  6. SPHEREx: Playing Nicely with Other Missions

    NASA Astrophysics Data System (ADS)

    Werner, Michael; SPHEREx Science Team

    2018-01-01

    SPHEREx, a mission in NASA's Medium Explorer (MIDEX) program that was selected for a competitive Phase A study in August 2017, is an all-sky survey satellite designed to address all three science goals of NASA's Astrophysics Division. SPHEREx is a wide-field spectral imager, and it would produce the first all-sky near-infrared spectral survey, using a passively cooled telescope with a wide field-of-view for large mapping speed. The SPHEREx spectra would have resolving power R=41 at wavelengths from 0.75 to 4.2um, and R=135 from 4.2 to 5um. The spectral resolution is provided by Linear Variable Filters placed directly over the four SPHEREx H2RG detector arrays. SPHEREx would be sensitive enough to obtain spectra of essentially all near-infrared sources from the WISE survey. During its two-year mission, SPHEREx, to be launched in 2022, would produce four complete all-sky spectral maps that would serve as a rich archive for the astronomy community. SPHEREx would be tremendously synergistic with numerous other missions and facilities [NASA and non-NASA] which will be operating in the coming decade. SPHEREx observations could pick out the most promising and exciting targets for investigation by JWST. From the opposite perspective, SPHEREx statistical samples could be used to refine the conclusions derived from JWST's in-depth studies of a few members of an interesting class of objects. SPHEREx and GAIA spectrophotometry, incorporating photometry from WISE and GALEX as well as GAIA astrometry, could lead to the determination of the radii of main sequence stars, and their transiting exoplanets discovered by TESS, with 1% accuracy. SPHEREx low redshift spectra of millions of galaxies could be used to validate and calibrate the photometric redshift scale being adopted by WFIRST and Euclid, improving the precision of the dark energy measures being returned by those missions. 
The poster will briefly address SPHEREx synergisms with these and other missions ranging from LSST to eROSITA. The research described here was carried out in part at the Jet Propulsion Laboratory, California Institute of Technology, operated by the California Institute of Technology under a contract with NASA.

  7. [Galaxy/quasar classification based on nearest neighbor method].

    PubMed

    Li, Xiang-Ru; Lu, Yu; Zhou, Jian-Ming; Wang, Yong-Jun

    2011-09-01

    With the wide application of high-quality CCD in celestial spectrum imagery and the implementation of many large sky survey programs (e.g., Sloan Digital Sky Survey (SDSS), Two-degree-Field Galaxy Redshift Survey (2dF), Spectroscopic Survey Telescope (SST), Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST) program and Large Synoptic Survey Telescope (LSST) program, etc.), celestial observational data are coming into the world like torrential rain. Therefore, to utilize them effectively and fully, research on automated processing methods for celestial data is imperative. In the present work, we investigated how to recognize galaxies and quasars from spectra based on the nearest neighbor method. Galaxies and quasars are extragalactic objects, they are far away from Earth, and their spectra are usually contaminated by various sources of noise. Therefore, recognizing these two types of spectra is a typical problem in automatic spectra classification. Furthermore, the utilized method, nearest neighbor, is one of the most typical, classic, mature algorithms in pattern recognition and data mining, and is often used as a benchmark in developing novel algorithms. For applicability in practice, it is shown that the recognition ratio of the nearest neighbor method (NN) is comparable to the best results reported in the literature based on more complicated methods, and the superiority of NN is that this method does not need to be trained, which is useful in incremental learning and parallel computation in mass spectral data processing. In conclusion, the results in this work are helpful for studying galaxy and quasar spectra classification.
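    The "no training phase" property the abstract highlights is easy to see in a minimal 1-nearest-neighbor sketch: classification is just a distance query against the labeled set, so new labeled spectra can be appended at any time. The toy "spectra" (short flux vectors) and labels below are invented for illustration; real pipelines operate on full calibrated spectra.

```python
# Minimal 1-NN classifier over toy "spectra" (flux vectors), pure stdlib.
import math

def distance(a, b):
    """Euclidean distance between two equal-length flux vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def nn_classify(spectrum, training):
    """Return the label of the closest training spectrum (1-NN).
    No model is fit beforehand, which is why NN suits incremental
    learning: extending `training` is the entire 'update' step."""
    return min(training, key=lambda item: distance(spectrum, item[0]))[1]

# Toy labeled set: (flux vector, class label).
training = [
    ([1.0, 0.8, 0.6, 0.4], "galaxy"),
    ([0.2, 0.9, 0.3, 0.8], "quasar"),
]
label = nn_classify([0.9, 0.7, 0.6, 0.5], training)
```

    Parallelization follows the same shape: the distance computations over the training set are independent and can be partitioned across workers.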

  8. Wide-Field InfraRed Survey Telescope WFIRST

    NASA Technical Reports Server (NTRS)

    Green, J.; Schechter, P.; Baltay, C.; Bean, R.; Bennett, D.; Brown, R.; Conselice, C.; Donahue, M.; Fan, X.; Rauscher, B.; hide

    2012-01-01

    In December 2010, NASA created a Science Definition Team (SDT) for WFIRST, the Wide Field Infra-Red Survey Telescope, recommended by the Astro 2010 Decadal Survey as the highest priority for a large space mission. The SDT was chartered to work with the WFIRST Project Office at GSFC and the Program Office at JPL to produce a Design Reference Mission (DRM) for WFIRST. Part of the original charge was to produce an interim design reference mission by mid-2011. That document was delivered to NASA and widely circulated within the astronomical community. In late 2011 the Astrophysics Division augmented its original charge, asking for two design reference missions. The first of these, DRM1, was to be a finalized version of the interim DRM, reducing overall mission costs where possible. The second of these, DRM2, was to identify and eliminate capabilities that overlapped with those of NASA's James Webb Space Telescope (henceforth JWST), ESA's Euclid mission, and the NSF's ground-based Large Synoptic Survey Telescope (henceforth LSST), and again to reduce overall mission cost, while staying faithful to NWNH. This report presents both DRM1 and DRM2.

  9. Machine-assisted discovery of relationships in astronomy

    NASA Astrophysics Data System (ADS)

    Graham, Matthew J.; Djorgovski, S. G.; Mahabal, Ashish A.; Donalek, Ciro; Drake, Andrew J.

    2013-05-01

    High-volume feature-rich data sets are becoming the bread-and-butter of 21st century astronomy but present significant challenges to scientific discovery. In particular, identifying scientifically significant relationships between sets of parameters is non-trivial. Similar problems in biological and geosciences have led to the development of systems which can explore large parameter spaces and identify potentially interesting sets of associations. In this paper, we describe the application of automated discovery systems of relationships to astronomical data sets, focusing on an evolutionary programming technique and an information-theory technique. We demonstrate their use with classical astronomical relationships - the Hertzsprung-Russell diagram and the Fundamental Plane of elliptical galaxies. We also show how they work with the issue of binary classification which is relevant to the next generation of large synoptic sky surveys, such as the Large Synoptic Survey Telescope (LSST). We find that comparable results to more familiar techniques, such as decision trees, are achievable. Finally, we consider the reality of the relationships discovered and how this can be used for feature selection and extraction.

  10. Synergistic Effects of Phase Folding and Wavelet Denoising with Applications in Light Curve Analysis

    DTIC Science & Technology

    2016-09-15

    future research. II. Astrostatistics Historically, astronomy has been a data-driven science. Larger and more precise data sets have led to the...forthcoming Large Synoptic Survey Telescope (LSST), the human-centric approach to astronomy is becoming strained [13, 24, 25, 63]. More than ever...process. One use of the filtering process is to remove artifacts from the data set. In the context of time domain astronomy, an artifact is an error in
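    The phase-folding step named in this report's title can be sketched minimally: each observation time is mapped to a phase in [0, 1) for a trial period, so that a periodic signal lines up before denoising is applied. The function below is an illustrative sketch, not code from the report.

```python
# Fold observation times at a trial period: identical phases mean the
# samples land at the same point in the periodic cycle.
def phase_fold(times, period, epoch=0.0):
    """Map each time to a phase in [0, 1) relative to `epoch`."""
    return [((t - epoch) % period) / period for t in times]

# Usage: four times folded at a 5-day period; pairs of samples separated
# by exact multiples of the period collapse onto the same phase.
phases = phase_fold([0.0, 2.5, 5.0, 7.5], period=5.0)
```

    After folding, a denoiser (wavelet or otherwise) operates on the phase-ordered samples rather than the raw time series, which is the synergy the title refers to.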

  11. Large Synoptic Survey Telescope: From Science Drivers to Reference Design

    DTIC Science & Technology

    2008-01-01

    faint time domain. The LSST design is driven by four main science themes: constraining dark energy and dark matter, taking an inventory of the Solar...Energy and Dark Matter (2) Taking an Inventory of the Solar System (3) Exploring the Transient Optical Sky (4) Mapping the Milky Way Each of these four...Constraining Dark Energy and Dark Matter Current models of cosmology require the existence of both dark matter and dark energy to match observational

  12. Variable classification in the LSST era: exploring a model for quasi-periodic light curves

    NASA Astrophysics Data System (ADS)

    Zinn, J. C.; Kochanek, C. S.; Kozłowski, S.; Udalski, A.; Szymański, M. K.; Soszyński, I.; Wyrzykowski, Ł.; Ulaczyk, K.; Poleski, R.; Pietrukowicz, P.; Skowron, J.; Mróz, P.; Pawlak, M.

    2017-06-01

    The Large Synoptic Survey Telescope (LSST) is expected to yield ~10^7 light curves over the course of its mission, which will require a concerted effort in automated classification. Stochastic processes provide one means of quantitatively describing variability with the potential advantage over simple light-curve statistics that the parameters may be physically meaningful. Here, we survey a large sample of periodic, quasi-periodic and stochastic Optical Gravitational Lensing Experiment-III variables using the damped random walk (DRW; CARMA(1,0)) and quasi-periodic oscillation (QPO; CARMA(2,1)) stochastic process models. The QPO model is described by an amplitude, a period and a coherence time-scale, while the DRW has only an amplitude and a time-scale. We find that the periodic and quasi-periodic stellar variables are generally better described by a QPO than a DRW, while quasars are better described by the DRW model. There are ambiguities in interpreting the QPO coherence time due to non-sinusoidal light-curve shapes, signal-to-noise ratio, error mischaracterizations and cadence. Higher order implementations of the QPO model that better capture light-curve shapes are necessary for the coherence time to have its implied physical meaning. Independent of physical meaning, the extra parameter of the QPO model successfully distinguishes most of the classes of periodic and quasi-periodic variables we consider.
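    To make the DRW's two parameters concrete, the sketch below simulates a DRW light curve: an Ornstein-Uhlenbeck process with stationary amplitude sigma and damping time-scale tau, advanced between samples with the exact AR(1) update. This is the generic textbook construction, not the authors' fitting code, and the parameter values are arbitrary.

```python
# Simulate a damped random walk (DRW, i.e. CARMA(1,0)) light curve.
import math
import random

def simulate_drw(times, sigma, tau, seed=0):
    """Exact DRW sampling at arbitrary (sorted) times.
    sigma: stationary standard deviation (the amplitude parameter).
    tau:   damping time-scale; correlation decays as exp(-dt/tau)."""
    rng = random.Random(seed)
    # First sample drawn from the stationary distribution N(0, sigma^2).
    x = [rng.gauss(0.0, sigma)]
    for t_prev, t_next in zip(times, times[1:]):
        r = math.exp(-(t_next - t_prev) / tau)  # correlation over the gap
        # Conditional mean decays toward zero; the innovation "tops up"
        # the variance so the marginal stays N(0, sigma^2).
        x.append(r * x[-1] + rng.gauss(0.0, sigma * math.sqrt(1.0 - r * r)))
    return x

# Usage: 100 nightly samples with a 20-day damping time-scale.
lc = simulate_drw(times=list(range(100)), sigma=0.2, tau=20.0)
```

    The QPO (CARMA(2,1)) model adds an oscillatory term with a period and coherence time; its extra parameter is what the abstract credits with separating periodic from stochastic variables.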

  13. Transient survey rates for orphan afterglows from compact merger jets

    NASA Astrophysics Data System (ADS)

    Lamb, Gavin P.; Tanaka, Masaomi; Kobayashi, Shiho

    2018-06-01

    Orphan afterglows from short γ-ray bursts (GRBs) are potential candidates for electromagnetic (EM) counterpart searches to gravitational wave (GW) detected neutron star or neutron star black hole mergers. Various jet dynamical and structure models have been proposed that can be tested by the detection of a large sample of GW-EM counterparts. We make predictions for the expected rate of optical transients from these jet models for future survey telescopes, without a GW or GRB trigger. A sample of merger jets is generated in the redshift limits 0 ≤ z ≤ 3.0, and the expected peak r-band flux and time-scale above the Large Synoptic Survey Telescope (LSST) or Zwicky Transient Facility (ZTF) detection threshold, mr = 24.5 and 20.4, respectively, is calculated. General all-sky rates are shown for mr ≤ 26.0 and mr ≤ 21.0. The detected orphan and GRB afterglow rate depends on jet model, typically 16 ≲ R ≲ 76 yr^-1 for the LSST, and 2 ≲ R ≲ 8 yr^-1 for ZTF. An excess in the rate of orphan afterglows for a survey to a depth of mr ≤ 26 would indicate that merger jets have a dominant low-Lorentz-factor population, or the jets exhibit intrinsic jet structure. Careful filtering of transients is required to successfully identify orphan afterglows from either short- or long-GRB progenitors.

  14. Artificial Intelligence and the Brave New World of Eclipsing Binaries

    NASA Astrophysics Data System (ADS)

    Devinney, E.; Guinan, E.; Bradstreet, D.; DeGeorge, M.; Giammarco, J.; Alcock, C.; Engle, S.

    2005-12-01

    The explosive growth of observational capabilities and information technology over the past decade has brought astronomy to a tipping point - we are going to be deluged by a virtual fire hose (more like Niagara Falls!) of data. An important component of this deluge will be newly discovered eclipsing binary stars (EBs) and other valuable variable stars. As exploration of the Local Group Galaxies grows via current and new ground-based and satellite programs, the number of EBs is expected to grow explosively from some 10,000 today to 8 million as GAIA comes online. These observational advances will present a unique opportunity to study the properties of EBs formed in galaxies with vastly different dynamical, star formation, and chemical histories than our home Galaxy. Thus the study of these binaries (e.g., from light curve analyses) is expected to provide clues about the star formation rates and dynamics of their host galaxies as well as the possible effects of varying chemical abundance on stellar evolution and structure. Additionally, minimal-assumption-based distances to Local Group objects (and possibly 3-D mapping within these objects) will be returned. These huge datasets of binary stars will provide tests of current theories (or suggest new theories) regarding binary star formation and evolution. However, this enormous volume of data will far exceed the capacity for analysis by human examination. To meet the daunting challenge of successfully mining this vast potential of EBs and variable stars for astrophysical results with minimum human intervention, we are developing new data processing techniques and methodologies. Faced with an overwhelming volume of data, our goal is to integrate technologies of Machine Learning and Pattern Processing (Artificial Intelligence [AI]) into the data processing pipelines of the major current and future ground- and space-based observational programs. 
Data pipelines of the future will have to carry us from observations to astrophysics with minimal human intervention. While there has been some recognition of this need (e.g. the LSST project drawing on the experience of MACHO/OGLE), few steps have been taken to address this crucial issue. Fortunately, advances in AI have created the opportunity to make significant progress in this direction. Here we discuss our plans to develop an Intelligent Data Pipeline (IDP) that can operate autonomously on large observational datasets to produce results of astrophysical value. Plans and initial results are discussed. This research is supported by NSF/RUI Grant AST05-07542 which we gratefully acknowledge.

  15. Extracting meaning from astronomical telegrams

    NASA Astrophysics Data System (ADS)

    Graham, Matthew; Conwill, L.; Djorgovski, S. G.; Mahabal, A.; Donalek, C.; Drake, A.

    2011-01-01

    The rapidly emerging field of time domain astronomy is one of the most exciting and vibrant new research frontiers, ranging in scientific scope from studies of the Solar System to extreme relativistic astrophysics and cosmology. It is being enabled by a new generation of large synoptic digital sky surveys - LSST, Pan-STARRS, CRTS - that cover large areas of sky repeatedly, looking for transient objects and phenomena. One of the biggest challenges facing these is the automated classification of transient events, a process that needs machine-processible astronomical knowledge. Semantic technologies enable the formal representation of concepts and relations within a particular domain. ATELs (http://www.astronomerstelegram.org) are a commonly-used means for reporting and commenting upon new astronomical observations of transient sources (supernovae, stellar outbursts, blazar flares, etc.). However, they are loose and unstructured and employ scientific natural language for description: this makes automated processing of them - a necessity within the next decade with petascale data rates - a challenge. Nevertheless they represent a potentially rich corpus of information that could lead to new and valuable insights into transient phenomena. This project lies in the cutting-edge field of astrosemantics, a branch of astroinformatics, which applies semantic technologies to astronomy. The ATELs have been used to develop an appropriate concept scheme - a representation of the information they contain - for transient astronomy using aspects of natural language processing. We demonstrate that it is possible to infer the subject of an ATEL from the vocabulary used and to identify previously unassociated reports.
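    To make the vocabulary-based inference concrete, here is a toy keyword-overlap classifier: score a report's words against per-subject keyword sets and pick the best match. The subject classes and keyword lists are invented for this example; the project itself builds a proper concept scheme with natural language processing rather than hand-picked lists.

```python
# Toy subject inference for a transient report from its vocabulary.
# KEYWORDS is a hypothetical hand-built scheme, purely illustrative.
KEYWORDS = {
    "supernova": {"supernova", "sn", "ia", "explosion"},
    "blazar": {"blazar", "flare", "gamma-ray", "jet"},
}

def infer_subject(text):
    """Return the subject whose keyword set overlaps the report most."""
    words = set(text.lower().split())
    scores = {subject: len(words & kws) for subject, kws in KEYWORDS.items()}
    return max(scores, key=scores.get)

subject = infer_subject("Spectroscopic classification of a new type Ia supernova")
```

    Real ATEL text needs tokenization, stemming, and disambiguation beyond simple word overlap, which is where the semantic-technology machinery described above comes in.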

  16. Exploring Two Approaches for an End-to-End Scientific Analysis Workflow

    DOE PAGES

    Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; ...

    2015-12-23

    The advance of the scientific discovery process is accomplished by the integration of independently-developed programs run on disparate computing facilities into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components to these workflows, and to execute different components on resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC); the first one is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second one is based on a programming language, Python. In our paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.

  17. RoboTAP: Target priorities for robotic microlensing observations

    NASA Astrophysics Data System (ADS)

    Hundertmark, M.; Street, R. A.; Tsapras, Y.; Bachelet, E.; Dominik, M.; Horne, K.; Bozza, V.; Bramich, D. M.; Cassan, A.; D'Ago, G.; Figuera Jaimes, R.; Kains, N.; Ranc, C.; Schmidt, R. W.; Snodgrass, C.; Wambsganss, J.; Steele, I. A.; Mao, S.; Ment, K.; Menzies, J.; Li, Z.; Cross, S.; Maoz, D.; Shvartzvald, Y.

    2018-01-01

    Context. The ability to automatically select scientifically important transient events from an alert stream of many such events, and to conduct follow-up observations in response, will become increasingly important in astronomy. With wide-angle time domain surveys pushing to fainter limiting magnitudes, the stream of transient alerts far exceeds the capacity of our follow-up telescope resources, and effective target prioritization becomes essential. The RoboNet-II microlensing program is a pathfinder project, which has developed an automated target selection process (RoboTAP) for gravitational microlensing events, which are observed in real time using the Las Cumbres Observatory telescope network. Aims: Follow-up telescopes typically have a much smaller field of view than surveys; therefore, the most promising microlensing events must be automatically selected at any given time from an annual sample exceeding 2000 events. The main challenge is to select between events with a high planet detection sensitivity, with the aim of detecting many planets and characterizing planetary anomalies. Methods: Our target selection algorithm is a hybrid system based on estimates of the planet detection zones around a microlens. It follows automatic anomaly alerts and respects the expected survey coverage of specific events. Results: We introduce the RoboTAP algorithm, whose purpose is to select and prioritize microlensing events with high sensitivity to planetary companions. In this work, we determine the planet sensitivity of the RoboNet follow-up program and provide a working example of how a broker can be designed for a real-life transient science program conducting follow-up observations in response to alerts; we explore the issues that will confront similar programs being developed for the Large Synoptic Survey Telescope (LSST) and other time domain surveys.
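
    The detection-zone prioritization idea can be caricatured in a few lines. The scoring formula below (a detection-zone proxy that grows with current magnification and is penalized by dense survey coverage) is an illustrative assumption, not the published RoboTAP algorithm, and the event names are invented.

```python
# Toy target-prioritization sketch (illustrative assumption, not RoboTAP):
# score each microlensing event by a planet-detection-zone proxy that
# grows with magnification and shrinks when the survey already covers it.
import math

def priority(magnification, survey_visits_per_day):
    zone_area = math.sqrt(max(magnification, 1.0))   # detection-zone proxy
    coverage_penalty = 1.0 / (1.0 + survey_visits_per_day)
    return zone_area * coverage_penalty

# Hypothetical event names and states: (current magnification, survey cadence)
events = {
    "OB180001": (25.0, 1.0),   # highly magnified, sparse survey coverage
    "OB180002": (3.0, 1.0),    # low magnification
    "OB180003": (25.0, 10.0),  # highly magnified but densely surveyed
}
ranked = sorted(events, key=lambda k: priority(*events[k]), reverse=True)
```

The highly magnified, poorly covered event ranks first; the well-covered one drops to the bottom despite its magnification, mirroring the "respects expected survey coverage" behavior described above.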

  18. The Complete Calibration of the Color–Redshift Relation (C3R2) Survey: Survey Overview and Data Release 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Masters, Daniel C.; Stern, Daniel K.; Rhodes, Jason D.

    A key goal of the Stage IV dark energy experiments Euclid, LSST, and WFIRST is to measure the growth of structure with cosmic time from weak lensing analysis over large regions of the sky. Weak lensing cosmology will be challenging: in addition to highly accurate galaxy shape measurements, statistically robust and accurate photometric redshift (photo-z) estimates for billions of faint galaxies will be needed in order to reconstruct the three-dimensional matter distribution. Here we present an overview of and initial results from the Complete Calibration of the Color–Redshift Relation (C3R2) survey, which is designed specifically to calibrate the empirical galaxy color–redshift relation to the Euclid depth. These redshifts will also be important for the calibrations of LSST and WFIRST. The C3R2 survey is obtaining multiplexed observations with Keck (DEIMOS, LRIS, and MOSFIRE), the Gran Telescopio Canarias (GTC; OSIRIS), and the Very Large Telescope (VLT; FORS2 and KMOS) of a targeted sample of galaxies that are most important for the redshift calibration. We focus spectroscopic efforts on undersampled regions of galaxy color space identified in previous work in order to minimize the number of spectroscopic redshifts needed to map the color–redshift relation to the required accuracy. We present the C3R2 survey strategy and initial results, including the 1283 high-confidence redshifts obtained in the 2016A semester and released as Data Release 1.
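
    The strategy of targeting undersampled regions of color space can be sketched with a simple 2D binning: flag color cells that contain photometric galaxies but no spectroscopic redshifts yet. C3R2 itself maps a higher-dimensional color space (with a self-organizing map), so the grid, colors, and random samples below are illustrative assumptions only.

```python
# Illustrative sketch of finding undersampled color-space cells
# (synthetic data; C3R2 uses a self-organizing map over more colors).
import numpy as np

rng = np.random.default_rng(0)
phot_colors = rng.uniform(0.0, 2.0, size=(1000, 2))   # e.g. (g-r, r-i) of photometric sample
spec_colors = rng.uniform(0.0, 1.0, size=(50, 2))     # colors of galaxies with existing spec-z

bins = [np.linspace(0.0, 2.0, 11)] * 2                # 10x10 grid over color space
phot_counts, _, _ = np.histogram2d(phot_colors[:, 0], phot_colors[:, 1], bins=bins)
spec_counts, _, _ = np.histogram2d(spec_colors[:, 0], spec_colors[:, 1], bins=bins)

# Cells occupied photometrically but with no spectroscopy: targets to prioritize.
undersampled = (phot_counts > 0) & (spec_counts == 0)
n_target_cells = int(undersampled.sum())
```

Spectroscopic targets drawn from these cells fill in the color–redshift relation with the fewest redshifts, which is the economy the survey design aims for.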

  19. The LSST Dome final design

    NASA Astrophysics Data System (ADS)

    DeVries, J.; Neill, D. R.; Barr, J.; De Lorenzi, Simone; Marchiori, Gianpietro

    2016-07-01

    The Large Synoptic Survey Telescope (LSST) is a large (8.4 meter) wide-field (3.5 degree) survey telescope, which will be located on the Cerro Pachón summit in Chile. As a result of the telescope's wide field of view, the optical system is unusually susceptible to stray light. In addition, balancing the effect of wind-induced telescope vibrations with Dome seeing is crucial. The rotating enclosure system (Dome) includes a moving wind screen and light baffle system. All of the Dome vents include hinged light baffles, which provide exceptional Dome flushing and stray light attenuation, and allow for vent maintenance access from inside the Dome. The wind screen also functions as a light screen, and helps define a clear optical aperture for the Telescope. The Dome must operate continuously without rotational travel limits to accommodate the Telescope cadence and travel. Consequently, the Azimuth drives are located on the fixed lower enclosure to accommodate glycol water cooling without the need for a utility cable wrap. An air duct system aligns when the Dome is in its parked position, and this provides air cooling for temperature conditioning of the Dome during the daytime. A bridge crane and a series of ladders, stairs and platforms provide for the inspection, maintenance and repair of all of the Dome mechanical systems. The contract to build the Dome was awarded to European Industrial Engineering in Mestre, Italy in May 2015. In this paper, we present the final design of this Telescope and Site sub-system.

  20. New Surveys of the Universe with the Jansky Very Large Array (VLA) and the Very Long Baseline Array (VLBA)

    NASA Astrophysics Data System (ADS)

    Myers, Steven T.

    2013-01-01

    The Jansky Very Large Array is a recently completed upgrade to the VLA that has significantly expanded its capabilities through replacement of the receivers, electronics, signal paths, and correlator with cutting-edge technology. This enhancement provides significantly increased continuum sensitivity and spectral survey speeds (by factors of 100 or more in select cases) from 1-50 GHz and in key bands below 1 GHz. Concurrently, we are greatly enhancing the sensitivity of the Very Long Baseline Array. A suite of ever more ambitious radio sky survey programs undertaken with these new instruments addresses science goals central to answering the questions posed by Astro2010, and will undoubtedly incite new inquiries. The science themes of the Jansky VLA and the VLBA are: illuminating the obscured, probing the magnetic, sounding the transient, and charting the evolving Universe. New observations will allow us to image young stars and massive black holes in dust-enshrouded environments, measure the strength and topology of the cosmic magnetic field, follow the rapid evolution of energetic phenomena, and study the formation and evolution of stars, galaxies, AGN, and the Universe itself. We can follow the evolution of gas and galaxies and particles and fields through cosmic time to bridge the eras from cosmic dawn to the dawn of new worlds. I will describe the salient features of the Jansky VLA and the VLBA for cosmological survey work, and summarize the multi-wavelength synergies with ALMA and next-generation optical, infrared, X-ray and gamma-ray telescopes. Example data taken from Jansky VLA and upgraded VLBA commissioning tests and early science will illustrate these features. I also describe the evolution of the VLA and VLBA and their capabilities for future surveys that will lead towards the next decade, into the era of the LSST and the SKA.

  1. Time-Resolved Surveys of Stellar Clusters

    NASA Astrophysics Data System (ADS)

    Eyer, Laurent; Eggenberger, Patrick; Greco, Claudia; Saesen, Sophie; Anderson, Richard I.; Mowlavi, Nami

    We describe the information that can be gained when a survey is carried out over multiple epochs, and its particular impact on open cluster research. We first explain the irreplaceable information that multi-epoch observations provide in astrometry, photometry and spectroscopy. Then we give three examples of results on open clusters from multi-epoch surveys, namely, the distance to the Pleiades, the angular momentum evolution of low mass stars and asteroseismology. Finally we mention several very large surveys, which are ongoing or planned for the future: Gaia, JASMINE, LSST, and VVV.

  2. EmpiriciSN: Re-sampling Observed Supernova/Host Galaxy Populations Using an XD Gaussian Mixture Model

    NASA Astrophysics Data System (ADS)

    Holoien, Thomas W.-S.; Marshall, Philip J.; Wechsler, Risa H.

    2017-06-01

    We describe two new open-source tools written in Python for performing extreme deconvolution Gaussian mixture modeling (XDGMM) and using a conditioned model to re-sample observed supernova and host galaxy populations. XDGMM is a new program that uses Gaussian mixtures to perform density estimation of noisy data using extreme deconvolution (XD) algorithms. Additionally, it has functionality not available in other XD tools. It allows the user to select between the AstroML and Bovy et al. fitting methods and is compatible with scikit-learn machine learning algorithms. Most crucially, it allows the user to condition a model based on the known values of a subset of parameters. This gives the user the ability to produce a tool that can predict unknown parameters based on a model that is conditioned on known values of other parameters. EmpiriciSN is an exemplary application of this functionality, which can be used to fit an XDGMM model to observed supernova/host data sets and predict likely supernova parameters using a model conditioned on observed host properties. It is primarily intended to simulate realistic supernovae for LSST data simulations based on empirical galaxy properties.
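
    The conditioning step described above follows from the standard conditional-Gaussian formulas applied per mixture component: reweight each component by its likelihood at the observed value, and shift its mean along the covariance. The numpy sketch below shows the idea for a 2D toy mixture (one observed, one predicted variable); it is not the XDGMM API and omits the measurement-error deconvolution.

```python
# Conditioning a 2D Gaussian mixture p(x, y) on x = x_obs to get p(y | x).
# Toy sketch of the idea EmpiriciSN uses; not the actual XDGMM interface.
import numpy as np

def condition_gmm(weights, means, covs, x_obs):
    """Return the weights and component means of p(y | x = x_obs)."""
    new_w, new_mu = [], []
    for w, mu, c in zip(weights, means, covs):
        sxx, sxy = c[0, 0], c[0, 1]
        # Reweight each component by w_k * N(x_obs | mu_x, sxx)
        like = np.exp(-0.5 * (x_obs - mu[0]) ** 2 / sxx) / np.sqrt(2 * np.pi * sxx)
        new_w.append(w * like)
        # Standard conditional-Gaussian mean: mu_y + sxy/sxx * (x_obs - mu_x)
        new_mu.append(mu[1] + sxy / sxx * (x_obs - mu[0]))
    new_w = np.array(new_w)
    return new_w / new_w.sum(), np.array(new_mu)

weights = [0.5, 0.5]
means = [np.array([0.0, 0.0]), np.array([5.0, 5.0])]
covs = [np.eye(2) * 0.5, np.eye(2) * 0.5]
w_cond, mu_cond = condition_gmm(weights, means, covs, x_obs=5.0)
y_pred = float(w_cond @ mu_cond)   # expected y given x = 5
```

Observing x near the second component's mean concentrates the conditional weight there, so the prediction for y collapses to that component, which is how observed host properties pin down likely supernova parameters.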

  3. SPHEREx: Probing the Physics of Inflation with an All-Sky Spectroscopic Galaxy Survey

    NASA Astrophysics Data System (ADS)

    Dore, Olivier; SPHEREx Science Team

    2018-01-01

    SPHEREx, a mission in NASA's Medium Explorer (MIDEX) program that was selected for Phase A in August 2017, is an all-sky survey satellite designed to address all three science goals in NASA’s astrophysics division: probe the origin and destiny of our Universe; explore whether planets around other stars could harbor life; and explore the origin and evolution of galaxies. These themes are addressed by a single survey, with a single instrument. In this poster, we describe how SPHEREx can probe the physics of inflationary non-Gaussianity by measuring large-scale structure with galaxy redshifts over a large cosmological volume at low redshifts, complementing high-redshift surveys optimized to constrain dark energy. SPHEREx will be the first all-sky near-infrared spectral survey, creating a legacy archive of spectra. In particular, it will measure the redshifts of over 500 million galaxies of all types, an unprecedented dataset. Using this catalog, SPHEREx will reduce the uncertainty in fNL -- a parameter describing the inflationary initial conditions -- by a factor of more than 10 compared with CMB measurements. At the same time, this catalog will enable strong scientific synergies with Euclid, WFIRST and LSST.

  4. EmpiriciSN: Re-sampling Observed Supernova/Host Galaxy Populations Using an XD Gaussian Mixture Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holoien, Thomas W. -S.; Marshall, Philip J.; Wechsler, Risa H.

    We describe two new open-source tools written in Python for performing extreme deconvolution Gaussian mixture modeling (XDGMM) and using a conditioned model to re-sample observed supernova and host galaxy populations. XDGMM is a new program that uses Gaussian mixtures to perform density estimation of noisy data using extreme deconvolution (XD) algorithms. Additionally, it has functionality not available in other XD tools. It allows the user to select between the AstroML and Bovy et al. fitting methods and is compatible with scikit-learn machine learning algorithms. Most crucially, it allows the user to condition a model based on the known values of a subset of parameters. This gives the user the ability to produce a tool that can predict unknown parameters based on a model that is conditioned on known values of other parameters. EmpiriciSN is an exemplary application of this functionality, which can be used to fit an XDGMM model to observed supernova/host data sets and predict likely supernova parameters using a model conditioned on observed host properties. It is primarily intended to simulate realistic supernovae for LSST data simulations based on empirical galaxy properties.

  5. Constructing Concept Schemes From Astronomical Telegrams Via Natural Language Clustering

    NASA Astrophysics Data System (ADS)

    Graham, Matthew; Zhang, M.; Djorgovski, S. G.; Donalek, C.; Drake, A. J.; Mahabal, A.

    2012-01-01

    The rapidly emerging field of time domain astronomy is one of the most exciting and vibrant new research frontiers, ranging in scientific scope from studies of the Solar System to extreme relativistic astrophysics and cosmology. It is being enabled by a new generation of large synoptic digital sky surveys - LSST, PanStarrs, CRTS - that cover large areas of sky repeatedly, looking for transient objects and phenomena. One of the biggest challenges facing these is the automated classification of transient events, a process that needs machine-processible astronomical knowledge. Semantic technologies enable the formal representation of concepts and relations within a particular domain. ATELs (http://www.astronomerstelegram.org) are a commonly-used means for reporting and commenting upon new astronomical observations of transient sources (supernovae, stellar outbursts, blazar flares, etc). However, they are loose and unstructured and employ scientific natural language for description: this makes automated processing of them - a necessity within the next decade with petascale data rates - a challenge. Nevertheless they represent a potentially rich corpus of information that could lead to new and valuable insights into transient phenomena. This project lies in the cutting-edge field of astrosemantics, a branch of astroinformatics, which applies semantic technologies to astronomy. The ATELs have been used to develop an appropriate concept scheme - a representation of the information they contain - for transient astronomy using hierarchical clustering of processed natural language. This allows us to automatically organize ATELs based on the vocabulary used. We conclude that we can use simple algorithms to process and extract meaning from astronomical textual data.
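
    A toy version of the vocabulary-based grouping described above: represent each telegram as a bag of words and merge reports whose cosine similarity exceeds a threshold. The telegram texts, tokenization, and threshold below are illustrative assumptions, far simpler than real ATel processing.

```python
# Bag-of-words grouping of toy telegram texts by vocabulary overlap.
# Illustrative sketch only; real ATel clustering uses much richer NLP.
import math
from collections import Counter

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a if w in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

atels = [  # invented example texts
    "spectroscopic classification of supernova in ngc 1234",
    "supernova candidate classification spectrum obtained",
    "blazar gamma-ray flare detected increased flux",
]
bags = [Counter(t.split()) for t in atels]

# Greedy single-link grouping: join a telegram to the first group containing
# a report whose cosine similarity exceeds 0.2 (arbitrary threshold).
groups = []
for i, bag in enumerate(bags):
    for g in groups:
        if any(cosine(bag, bags[j]) > 0.2 for j in g):
            g.append(i)
            break
    else:
        groups.append([i])
```

The two supernova reports share "supernova" and "classification" and land in one group; the blazar flare, with no overlapping vocabulary, forms its own, which is the sense in which the subject of a telegram can be inferred from the words it uses.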

  6. CMU DeepLens: deep learning for automatic image-based galaxy-galaxy strong lens finding

    NASA Astrophysics Data System (ADS)

    Lanusse, François; Ma, Quanbin; Li, Nan; Collett, Thomas E.; Li, Chun-Liang; Ravanbakhsh, Siamak; Mandelbaum, Rachel; Póczos, Barnabás

    2018-01-01

    Galaxy-scale strong gravitational lensing provides not only a valuable probe of the dark matter distribution of massive galaxies, but also valuable cosmological constraints, either by studying the population of strong lenses or by measuring time delays in lensed quasars. Due to the rarity of galaxy-scale strongly lensed systems, fast and reliable automated lens finding methods will be essential in the era of large surveys such as the Large Synoptic Survey Telescope (LSST), Euclid and the Wide-Field Infrared Survey Telescope (WFIRST). To tackle this challenge, we introduce CMU DeepLens, a new fully automated galaxy-galaxy lens finding method based on deep learning. This supervised machine learning approach does not require any tuning after the training step, which only requires realistic image simulations of strongly lensed systems. We train and validate our model on a set of 20 000 LSST-like mock observations including a range of lensed systems of various sizes and signal-to-noise ratios (S/N). We find on our simulated data set that for a rejection rate of non-lenses of 99 per cent, a completeness of 90 per cent can be achieved for lenses with Einstein radii larger than 1.4 arcsec and S/N larger than 20 on individual g-band LSST exposures. Finally, we emphasize the importance of realistically complex simulations for training such machine learning methods by demonstrating that the performance of models of significantly different complexities cannot be distinguished on simpler simulations. We make our code publicly available at https://github.com/McWilliamsCenter/CMUDeepLens.

  7. Precision matrix expansion - efficient use of numerical simulations in estimating errors on cosmological parameters

    NASA Astrophysics Data System (ADS)

    Friedrich, Oliver; Eifler, Tim

    2018-01-01

    Computing the inverse covariance matrix (or precision matrix) of large data vectors is crucial in weak lensing (and multiprobe) analyses of the large-scale structure of the Universe. Analytically computed covariances are noise-free and hence straightforward to invert; however, the model approximations might be insufficient for the statistical precision of future cosmological data. Estimating covariances from numerical simulations improves on these approximations, but the sample covariance estimator is inherently noisy, which introduces uncertainties in the error bars on cosmological parameters and also additional scatter in their best-fitting values. For future surveys, reducing both effects to an acceptable level requires an unfeasibly large number of simulations. In this paper we describe a way to expand the precision matrix around a covariance model and show how to estimate the leading order terms of this expansion from simulations. This is especially powerful if the covariance matrix is the sum of two contributions, C = A+B, where A is well understood analytically and can be turned off in simulations (e.g. shape noise for cosmic shear) to yield a direct estimate of B. We test our method in mock experiments resembling tomographic weak lensing data vectors from the Dark Energy Survey (DES) and the Large Synoptic Survey Telescope (LSST). For DES we find that 400 N-body simulations are sufficient to achieve negligible statistical uncertainties on parameter constraints. For LSST this is achieved with 2400 simulations. The standard covariance estimator would require >10^5 simulations to reach a similar precision. We extend our analysis to a DES multiprobe case finding a similar performance.
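
    The expansion can be checked numerically in a few lines: with C = A + B, truncating C^{-1} = A^{-1} - A^{-1}BA^{-1} + A^{-1}BA^{-1}BA^{-1} - ... after two or three terms is accurate whenever B is small relative to A. The toy matrices below are arbitrary stand-ins for the analytic (e.g. shape-noise) and simulated contributions.

```python
# Numerical check of the precision-matrix expansion around an analytic
# model: (A + B)^-1 ≈ A^-1 - A^-1 B A^-1 + A^-1 B A^-1 B A^-1.
# Toy matrices; in the paper A is analytic and B comes from simulations.
import numpy as np

rng = np.random.default_rng(1)
n = 5
A = np.diag(rng.uniform(1.0, 2.0, n))       # well-understood analytic part
M = 0.05 * rng.standard_normal((n, n))
B = M @ M.T                                 # small positive semi-definite correction

Ainv = np.linalg.inv(A)
first_order = Ainv - Ainv @ B @ Ainv
second_order = first_order + Ainv @ B @ Ainv @ B @ Ainv

exact = np.linalg.inv(A + B)
err1 = np.abs(first_order - exact).max()    # first-order truncation error
err2 = np.abs(second_order - exact).max()   # second-order truncation error
```

Each extra term shrinks the error by roughly the size of A^{-1}B, so only the low-order terms (and hence far fewer simulations) are needed when B is a small perturbation.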

  8. Strong Lens Time Delay Challenge. I. Experimental Design

    NASA Astrophysics Data System (ADS)

    Dobler, Gregory; Fassnacht, Christopher D.; Treu, Tommaso; Marshall, Phil; Liao, Kai; Hojjati, Alireza; Linder, Eric; Rumbaugh, Nicholas

    2015-02-01

    The time delays between point-like images in gravitational lens systems can be used to measure cosmological parameters. The number of lenses with measured time delays is growing rapidly; the upcoming Large Synoptic Survey Telescope (LSST) will monitor ~10^3 strongly lensed quasars. In an effort to assess the present capabilities of the community to accurately measure the time delays, and to provide input to dedicated monitoring campaigns and future LSST cosmology feasibility studies, we have invited the community to take part in a "Time Delay Challenge" (TDC). The challenge is organized as a set of "ladders," each containing a group of simulated data sets to be analyzed blindly by participating teams. Each rung on a ladder consists of a set of realistic mock observed lensed quasar light curves, with the rungs' data sets increasing in complexity and realism. The initial challenge described here has two ladders, TDC0 and TDC1. TDC0 has a small number of data sets, and is designed to be used as a practice set by the participating teams. The (non-mandatory) deadline for completion of TDC0 was the TDC1 launch date, 2013 December 1. The TDC1 deadline was 2014 July 1. Here we give an overview of the challenge, we introduce a set of metrics that will be used to quantify the goodness of fit, efficiency, precision, and accuracy of the algorithms, and we present the results of TDC0. Thirteen teams participated in TDC0 using 47 different methods. Seven of those teams qualified for TDC1, which is described in the companion paper.
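
    The core measurement task of the challenge, recovering a shift between two light curves, can be sketched by chi-square minimization over trial delays. This toy uses clean, regularly sampled curves; real TDC data add irregular sampling, noise, and microlensing variability.

```python
# Toy time-delay recovery: shift one light curve over a grid of trial
# delays and pick the shift minimizing the mean squared mismatch.
# Clean synthetic curves; real challenge data are far messier.
import numpy as np

t = np.linspace(0.0, 100.0, 501)

def signal(x):
    return np.sin(2 * np.pi * x / 30.0) + 0.3 * np.sin(2 * np.pi * x / 7.0)

true_delay = 12.4
curve_a = signal(t)
curve_b = signal(t - true_delay)          # image B lags image A

trial_delays = np.arange(0.0, 30.0, 0.2)
chi2 = []
for d in trial_delays:
    shifted = np.interp(t, t + d, curve_a)  # curve A delayed by the trial value
    mask = t >= d                           # compare only where the shift is defined
    chi2.append(np.mean((shifted[mask] - curve_b[mask]) ** 2))
best = trial_delays[int(np.argmin(chi2))]
```

The mismatch has a sharp minimum at the true delay; the TDC metrics then score how precisely and how reliably methods locate that minimum under realistic conditions.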

  9. Fabrication of the LSST monolithic primary-tertiary mirror

    NASA Astrophysics Data System (ADS)

    Tuell, Michael T.; Martin, Hubert M.; Burge, James H.; Ketelsen, Dean A.; Law, Kevin; Gressler, William J.; Zhao, Chunyu

    2012-09-01

    As previously reported (at the SPIE Astronomical Instrumentation conference of 2010 in San Diego), the Large Synoptic Survey Telescope (LSST) utilizes a three-mirror design in which the primary (M1) and tertiary (M3) mirrors are two concentric aspheric surfaces on one monolithic substrate. The substrate material is Ohara E6 borosilicate glass, in a honeycomb sandwich configuration, currently in production at The University of Arizona’s Steward Observatory Mirror Lab. We will provide an update to the status of the mirrors and metrology systems, which have advanced from concepts to hardware in the past two years. In addition to the normal requirements for smooth surfaces of the appropriate prescriptions, the alignment of the two surfaces must be accurately measured and controlled in the production lab, reducing the degrees of freedom needed to be controlled in the telescope. The surface specification is described as a structure function, related to seeing in excellent conditions. Both the pointing and centration of the two optical axes are important parameters, in addition to the axial spacing of the two vertices. This paper details the manufacturing process and metrology systems for each surface, including the alignment of the two surfaces. M1 is a hyperboloid and can utilize a standard Offner null corrector, whereas M3 is an oblate ellipsoid, so it has positive spherical aberration. The null corrector is a phase-etched computer-generated hologram (CGH) between the mirror surface and the center-of-curvature. Laser trackers are relied upon to measure the alignment and spacing as well as rough-surface metrology during loose-abrasive grinding.

  10. PERIODOGRAMS FOR MULTIBAND ASTRONOMICAL TIME SERIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    VanderPlas, Jacob T.; Ivezic, Željko

    This paper introduces the multiband periodogram, a general extension of the well-known Lomb–Scargle approach for detecting periodic signals in time-domain data. In addition to advantages of the Lomb–Scargle method such as treatment of non-uniform sampling and heteroscedastic errors, the multiband periodogram significantly improves period finding for randomly sampled multiband light curves (e.g., Pan-STARRS, DES, and LSST). The light curves in each band are modeled as arbitrary truncated Fourier series, with the period and phase shared across all bands. The key aspect is the use of Tikhonov regularization which drives most of the variability into the so-called base model common to all bands, while fits for individual bands describe residuals relative to the base model and typically require lower-order Fourier series. This decrease in the effective model complexity is the main reason for improved performance. After a pedagogical development of the formalism of least-squares spectral analysis, which motivates the essential features of the multiband model, we use simulated light curves and randomly subsampled SDSS Stripe 82 data to demonstrate the superiority of this method compared to other methods from the literature and find that this method will be able to efficiently determine the correct period in the majority of LSST’s bright RR Lyrae stars with as little as six months of LSST data, a vast improvement over the years of data reported to be required by previous studies. A Python implementation of this method, along with code to fully reproduce the results reported here, is available on GitHub.
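
    A bare-bones version of the shared-period idea: fit a truncated Fourier series to each band with the period held common across bands, and pick the trial period minimizing the total residual. This omits the Tikhonov regularization and base-model machinery that distinguishes the full multiband periodogram; the light curves are synthetic.

```python
# Shared-period least-squares search across two synthetic bands.
# Simplified sketch of the multiband idea; no regularization or base model.
import numpy as np

rng = np.random.default_rng(2)
true_period = 0.73
bands = {}
for offset in (0.0, 0.4):                       # two bands with different phases
    t = np.sort(rng.uniform(0.0, 20.0, 60))     # random (non-uniform) sampling
    y = np.sin(2 * np.pi * t / true_period + offset) + 0.05 * rng.standard_normal(60)
    bands[offset] = (t, y)

def total_residual(period):
    resid = 0.0
    for t, y in bands.values():
        phase = 2 * np.pi * t / period
        # First-order Fourier model per band; the period is shared, the
        # amplitudes and mean level are fit independently in each band.
        X = np.column_stack([np.ones_like(t), np.sin(phase), np.cos(phase)])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid += np.sum((y - X @ coef) ** 2)
    return resid

periods = np.linspace(0.5, 1.0, 2001)
best_period = periods[int(np.argmin([total_residual(p) for p in periods]))]
```

Because every band contributes residuals at the same trial period, sparse per-band sampling still yields a well-determined period, which is the gain the multiband method formalizes.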

  11. The Santiago-Harvard-Edinburgh-Durham void comparison - I. SHEDding light on chameleon gravity tests

    NASA Astrophysics Data System (ADS)

    Cautun, Marius; Paillas, Enrique; Cai, Yan-Chuan; Bose, Sownak; Armijo, Joaquin; Li, Baojiu; Padilla, Nelson

    2018-05-01

    We present a systematic comparison of several existing and new void-finding algorithms, focusing on their potential power to test a particular class of modified gravity models - chameleon f(R) gravity. These models deviate from standard general relativity (GR) more strongly in low-density regions and thus voids are a promising venue to test them. We use halo occupation distribution (HOD) prescriptions to populate haloes with galaxies, and tune the HOD parameters such that the galaxy two-point correlation functions are the same in both f(R) and GR models. We identify both three-dimensional (3D) voids and two-dimensional (2D) underdensities in the plane of the sky to find the same void abundance and void galaxy number density profiles across all models, which suggests that they do not contain much information beyond galaxy clustering. However, the underlying void dark matter density profiles are significantly different, with f(R) voids being more underdense than GR ones, which leads to f(R) voids having a larger tangential shear signal than their GR analogues. We investigate the potential of each void finder to test f(R) models with near-future lensing surveys such as EUCLID and LSST. The 2D voids have the largest power to probe f(R) gravity, with an LSST analysis of tunnel (which is a new type of 2D underdensity introduced here) lensing distinguishing at 80 and 11σ (statistical error) f(R) models with parameters |fR0| = 10^-5 and 10^-6, from GR.

  12. Lighting the Fire for 25 years: The Nature and Legacy of Astronomy Camp

    NASA Astrophysics Data System (ADS)

    McCarthy, Donald W.; Hooper, E.; Benecchi, S. D.; Henry, T. J.; Kirkpatrick, J. D.; Kulesa, C.; Oey, M. S.; Regester, J.; Schlingman, W. M.; Camp Staff, Astronomy

    2013-01-01

    In 1988, Astronomy Camp began in an era when science was entirely the realm of professionals, astronomical observatories were off-limits to the public at night, and scientists were not encouraged to spend time in science education. Since then we have grown a dynamic science education program that immerses individuals (ages 11-80), educators, schools, and Girl Scout Leaders in authentic science at Arizona’s research observatories in the Catalina mountains and at Kitt Peak. Often labeled “life changing,” these residential programs have engaged thousands of people from 49 U.S. states and 20 foreign countries. Female enrollment has increased steadily, and women now generally outnumber men in our teenage programs. Graduate students have played a major creative role and many have gone on to become educators and research leaders around the world. By involving a wide range of ages, the Camps have helped strengthen the STEM-pipeline. Many of our alumni remain in touch via social and professional networks and have developed not only into professional astronomers but also into leaders throughout society, parents, and educators. Our emphasis on age-appropriate research helped inspire today’s concepts of research-based science education and Citizen Science. An accompanying paper (E. Hooper et al.) discusses our approach to project-oriented astronomical research. Scientific discoveries include Near-Earth Objects, supernova classification, and lightcurves of Kuiper Belt Objects. The Camps have also contributed to educational research involving informal science education, youth perceptions, and student identities. Ironically, the Camps have leveraged new initiatives in both research and education at NOAO, LSST, and JWST. Here we review the philosophy, conduct, and content of Astronomy Camp and summarize the unexpected nature of its ongoing legacy. We remain grateful to The University of Arizona Alumni Association for its long-term encouragement and support.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Hugh H.; Balasubramanian, V.; Bernstein, G.

    The University of Pennsylvania elementary particle physics/particle cosmology group, funded by the Department of Energy Office of Science, participates in research in high energy physics and particle cosmology that addresses some of the most important unanswered questions in science. The research is divided into five areas. Energy Frontier - We participate in the study of proton-proton collisions at the Large Hadron Collider in Geneva, Switzerland using the ATLAS detector. The University of Pennsylvania group was responsible for the design, installation, and commissioning of the front-end electronics for the Transition Radiation Tracker (TRT) and plays the primary role in its maintenance and operation. We play an important role in the triggering of ATLAS, and we have made large contributions to the TRT performance and to the study and identification of electrons, photons, and taus. We have been actively involved in searches for the Higgs boson and for SUSY and other exotic particles. We have made significant contributions to measurement of Standard Model processes such as inclusive photon production and WW pair production. We also have participated significantly in R&D for upgrades to the ATLAS detector. Cosmic Frontier - The Dark Energy Survey (DES) telescope will be used to elucidate the nature of dark energy and the distribution of dark matter. Penn has played a leading role both in the use of weak gravitational lensing of distant galaxies and the discovery of large numbers of distant supernovae. The techniques and forecasts developed at Penn are also guiding the development of the proposed Large Synoptic Survey Telescope (LSST). We are also developing a new detector, MiniClean, to search for direct detection of dark matter particles. Intensity Frontier - We are participating in the design and R&D of detectors for the Long Baseline Neutrino Experiment (now DUNE), a new experiment to study the properties of neutrinos.
Advanced Technology R&D - We have an extensive involvement in electronics required for sophisticated new detectors at the LHC and are developing electronics for the LSST camera. Theoretical Physics - We are carrying out a broad program studying the fundamental forces of nature and early universe cosmology and mathematical physics. Our activities span the range from model building, formal field theory, and string theory to new paradigms for cosmology and the interface of string theory with mathematics. Our effort combines extensive development of the formal aspects of string theory with a focus on real phenomena in particle physics, cosmology and gravity.

  14. The Amateurs' Love Affair with Large Datasets

    NASA Astrophysics Data System (ADS)

    Price, Aaron; Jacoby, S. H.; Henden, A.

    2006-12-01

    Amateur astronomers are professionals in other areas. They bring expertise from such varied and technical careers as computer science, mathematics, engineering, and marketing. These skills, coupled with an enthusiasm for astronomy, can be used to help manage the large data sets coming online in the next decade. We will show specific examples where teams of amateurs have been involved in mining large, online data sets and have authored and published their own papers in peer-reviewed astronomical journals. Using the proposed LSST database as an example, we will outline a framework for involving amateurs in data analysis and education with large astronomical surveys.

  15. Data Management challenges in Astronomy and Astroparticle Physics

    NASA Astrophysics Data System (ADS)

    Lamanna, Giovanni

    2015-12-01

    The Astronomy and Astroparticle Physics domains are experiencing a deluge of data with the next generation of facilities prioritised in the European Strategy Forum on Research Infrastructures (ESFRI), such as SKA, CTA, and KM3NeT, and with other world-class projects such as LSST, EUCLID, and EGO. The new ASTERICS-H2020 project brings the concerned scientific communities in Europe together to find common solutions to their Big Data challenges, their interoperability, and their data access. The presentation will highlight these new challenges and the work being undertaken in cooperation with e-infrastructures in Europe.

  16. The Cosmic Evolution Through UV Spectroscopy (CETUS) Probe Mission Concept

    NASA Astrophysics Data System (ADS)

    Danchi, William; Heap, Sara; Woodruff, Robert; Hull, Anthony; Kendrick, Stephen E.; Purves, Lloyd; McCandliss, Stephan; Dodson, Kelly; Mehle, Greg; Burge, James; Valente, Martin; Rhee, Michael; Smith, Walter; Choi, Michael; Stoneking, Eric

    2018-01-01

    CETUS is a mission concept for an all-UV telescope with 3 scientific instruments: a wide-field camera, a wide-field multi-object spectrograph, and a point-source high- and medium-resolution spectrograph. It is primarily intended to work with other survey telescopes of the 2020s (e.g., E-ROSITA (X-ray), LSST, Subaru, WFIRST (optical-near-IR), and SKA (radio)) to solve major outstanding problems in astrophysics. In this poster presentation, we give an overview of the CETUS key science goals and a progress report on the CETUS mission and instrument design.

  17. On determination of charge transfer efficiency of thick, fully depleted CCDs with 55Fe x-rays

    DOE PAGES

    Yates, D.; Kotov, I.; Nomerotski, A.

    2017-07-01

    Charge transfer efficiency (CTE) is one of the most important CCD characteristics. Our paper examines ways to optimize the algorithms used to analyze 55Fe x-ray data on the CCDs, as well as explores new types of observables for CTE determination that can be used for testing LSST CCDs. Furthermore, the observables are modeled employing simple Monte Carlo simulations to determine how the charge diffusion in thick, fully depleted silicon affects the measurement. The data is compared to the simulations for one of the observables, integral flux of the x-ray hit.
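    The classical 55Fe observable the paper builds on can be sketched in a few lines: each 5.9 keV Mn Kα photon liberates a fixed ~1620 e⁻ in silicon, so the measured charge of isolated hits falls (nearly) linearly with the number of transfers, and the fractional loss per transfer is the charge-transfer inefficiency (CTI), with CTE = 1 - CTI. Below is a minimal pure-Python sketch on synthetic data; the yield figure and the linear-loss model are textbook assumptions, not taken from the paper.

```python
import random

def cte_from_xray_hits(hits):
    """Estimate CTE from isolated single-pixel 55Fe x-ray events.

    hits: list of (n_transfers, measured_charge_e) pairs. A least-squares
    line is fit to charge vs. transfer count; CTI = -slope/intercept
    and CTE = 1 - CTI.
    """
    n = len(hits)
    sx = sum(x for x, _ in hits)
    sy = sum(y for _, y in hits)
    sxx = sum(x * x for x, _ in hits)
    sxy = sum(x * y for x, y in hits)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return 1.0 + slope / intercept  # = 1 - CTI

# Synthetic sensor: 1620 e- per hit, true CTE = 0.999999, up to 2000 transfers.
random.seed(42)
true_cte = 0.999999
hits = [(n, 1620.0 * true_cte ** n) for n in random.sample(range(2000), 300)]
est = cte_from_xray_hits(hits)
```

On real data the hit selection (isolated, single-pixel events) and the serial/parallel split do most of the work; the paper's point is precisely that these choices, and charge diffusion in thick silicon, shift the answer.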

  18. Supporting Shared Resource Usage for a Diverse User Community: the OSG Experience and Lessons Learned

    NASA Astrophysics Data System (ADS)

    Garzoglio, Gabriele; Levshina, Tanya; Rynge, Mats; Sehgal, Chander; Slyz, Marko

    2012-12-01

    The Open Science Grid (OSG) supports a diverse community of new and existing users in adopting and making effective use of the Distributed High Throughput Computing (DHTC) model. The LHC user community has deep local support within the experiments. For other smaller communities and individual users, the OSG provides consulting and technical services through the User Support area. We describe these sometimes successful and sometimes not so successful experiences and analyze lessons learned that are helping us improve our services. The services offered include forums to enable shared learning and mutual support, tutorials and documentation for new technology, and troubleshooting of problematic or systemic failure modes. For new communities and users, we bootstrap their use of the distributed high throughput computing technologies and resources available on the OSG by following a phased approach. We first adapt the application and run a small production campaign on a subset of “friendly” sites. Only then do we move the user to run full production campaigns across the many remote sites on the OSG, adding to the community resources up to hundreds of thousands of CPU hours per day. This scaling up generates new challenges, such as nondeterministic job-completion times and diverse errors arising from heterogeneous site configurations and environments, so careful attention is needed to get good results. We cover recent experiences with image simulation for the Large Synoptic Survey Telescope (LSST), small-file large-volume data movement for the Dark Energy Survey (DES), civil engineering simulation with the Network for Earthquake Engineering Simulation (NEES), and accelerator modeling with the Electron Ion Collider group at BNL. We categorize and analyze the use cases and describe how our processes are evolving based on lessons learned.

  19. Near-Earth Object Survey Simulation Software

    NASA Astrophysics Data System (ADS)

    Naidu, Shantanu P.; Chesley, Steven R.; Farnocchia, Davide

    2017-10-01

    There is a significant interest in Near-Earth objects (NEOs) because they pose an impact threat to Earth, offer valuable scientific information, and are potential targets for robotic and human exploration. The number of NEO discoveries has been rising rapidly over the last two decades, with over 1800 discovered last year, bringing the total number of known NEOs to >16000. Pan-STARRS and the Catalina Sky Survey are currently the most prolific NEO surveys, having discovered >1600 NEOs between them in 2016. As next-generation surveys such as the Large Synoptic Survey Telescope (LSST) and the proposed Near-Earth Object Camera (NEOCam) become operational in the next decade, the discovery rate is expected to increase tremendously. Coordination between various survey telescopes will be necessary in order to optimize NEO discoveries and create a unified global NEO discovery network. We are collaborating on a community-based, open-source software project to simulate asteroid surveys to facilitate such coordination and develop strategies for improving discovery efficiency. Our effort so far has focused on development of a fast and efficient tool capable of accepting user-defined asteroid population models and telescope parameters, such as a list of pointing angles and the camera field of view, and generating an output list of detectable asteroids. The software takes advantage of the widely used and tested SPICE library and architecture developed by NASA’s Navigation and Ancillary Information Facility (Acton, 1996) for saving and retrieving asteroid trajectories and camera pointing. Orbit propagation is done using OpenOrb (Granvik et al. 2009), but future versions will allow the user to plug in a propagator of their choice. The software allows the simulation of both ground-based and space-based surveys. Performance is being tested using the Grav et al. (2011) asteroid population model and the LSST simulated survey “enigma_1189”.
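    The core of the detectability test described above reduces to geometry: is the asteroid's sky position inside the camera footprint for a given pointing? A sketch of that check alone, assuming a circular field of view (the real tool also applies magnitude and trailing-loss cuts; all names here are illustrative):

```python
import math

def in_field(ra_bore, dec_bore, ra_obj, dec_obj, fov_deg):
    """True if an object lies within a circular footprint of diameter
    fov_deg centred on the boresight. All angles in degrees; the
    angular separation uses the numerically stable haversine form."""
    rb, db, ro, do = map(math.radians, (ra_bore, dec_bore, ra_obj, dec_obj))
    h = (math.sin((do - db) / 2) ** 2
         + math.cos(db) * math.cos(do) * math.sin((ro - rb) / 2) ** 2)
    sep = 2 * math.asin(min(1.0, math.sqrt(h)))
    return math.degrees(sep) <= fov_deg / 2
```

For an LSST-like 3.5-degree-wide field, an object 1 degree off axis passes the check and one 5 degrees off axis does not.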

  20. Mental Fatigue Impairs Soccer-Specific Physical and Technical Performance.

    PubMed

    Smith, Mitchell R; Coutts, Aaron J; Merlini, Michele; Deprez, Dieter; Lenoir, Matthieu; Marcora, Samuele M

    2016-02-01

    To investigate the effects of mental fatigue on soccer-specific physical and technical performance. This investigation consisted of two separate studies. Study 1 assessed the soccer-specific physical performance of 12 moderately trained soccer players using the Yo-Yo Intermittent Recovery Test, Level 1 (Yo-Yo IR1). Study 2 assessed the soccer-specific technical performance of 14 experienced soccer players using the Loughborough Soccer Passing and Shooting Tests (LSPT, LSST). Each test was performed on two occasions and preceded, in a randomized, counterbalanced order, by 30 min of the Stroop task (mentally fatiguing treatment) or 30 min of reading magazines (control treatment). Subjective ratings of mental fatigue were measured before and after treatment, and mental effort and motivation were measured after treatment. Distance run, heart rate, and ratings of perceived exertion were recorded during the Yo-Yo IR1. LSPT performance time was calculated as original time plus penalty time. LSST performance was assessed using shot speed, shot accuracy, and shot sequence time. Subjective ratings of mental fatigue and effort were higher after the Stroop task in both studies (P < 0.001), whereas motivation was similar between conditions. This mental fatigue significantly reduced running distance in the Yo-Yo IR1 (P < 0.001). No difference in heart rate existed between conditions, whereas ratings of perceived exertion were significantly higher at iso-time in the mental fatigue condition (P < 0.01). LSPT original time and performance time were not different between conditions; however, penalty time significantly increased in the mental fatigue condition (P = 0.015). Mental fatigue also impaired shot speed (P = 0.024) and accuracy (P < 0.01), whereas shot sequence time was similar between conditions. Mental fatigue impairs soccer-specific running, passing, and shooting performance.

  1. Strong Gravitational Lensing as a Probe of Gravity, Dark-Matter and Super-Massive Black Holes

    NASA Astrophysics Data System (ADS)

    Koopmans, L.V.E.; Barnabe, M.; Bolton, A.; Bradac, M.; Ciotti, L.; Congdon, A.; Czoske, O.; Dye, S.; Dutton, A.; Elliasdottir, A.; Evans, E.; Fassnacht, C.D.; Jackson, N.; Keeton, C.; Lasio, J.; Moustakas, L.; Meneghetti, M.; Myers, S.; Nipoti, C.; Suyu, S.; van de Ven, G.; Vegetti, S.; Wucknitz, O.; Zhao, H.-S.

    While considerable effort has been devoted to understanding the properties of galaxies, a full physical picture, connecting their baryonic and dark-matter content, super-massive black holes, and (metric) theories of gravity, is still ill-defined. Strong gravitational lensing furnishes a powerful method to probe gravity in the central regions of galaxies. It can (1) provide a unique detection channel for dark-matter substructure beyond the local galaxy group, (2) constrain dark-matter physics, complementary to direct-detection experiments, as well as metric theories of gravity, (3) probe central super-massive black holes, and (4) provide crucial insight into galaxy formation processes from the dark matter point of view, independently of the nature and state of dark matter. To seriously address these questions, a considerable increase in the number of strong gravitational-lens systems is required. In the timeframe 2010-2020, a staged approach with radio (e.g. EVLA, e-MERLIN, LOFAR, SKA phase-I) and optical (e.g. LSST and JDEM) instruments can provide 10^(2-4) new lenses, and up to 10^(4-6) new lens systems from SKA/LSST/JDEM all-sky surveys around ~2020. Follow-up imaging of (radio) lenses is necessary with moderate ground/space-based optical-IR telescopes, and 30-50m telescopes are needed for spectroscopy (e.g. TMT, GMT, ELT). To answer these fundamental questions through strong gravitational lensing, a strong investment in large radio and optical-IR facilities is therefore critical in the coming decade. In particular, only large-scale radio lens surveys (e.g. with SKA) provide the large numbers of high-resolution and high-fidelity images of lenses needed for SMBH and flux-ratio anomaly studies.

  2. STRONG LENS TIME DELAY CHALLENGE. I. EXPERIMENTAL DESIGN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dobler, Gregory; Fassnacht, Christopher D.; Rumbaugh, Nicholas

    2015-02-01

    The time delays between point-like images in gravitational lens systems can be used to measure cosmological parameters. The number of lenses with measured time delays is growing rapidly; the upcoming Large Synoptic Survey Telescope (LSST) will monitor ~10³ strongly lensed quasars. In an effort to assess the present capabilities of the community to accurately measure the time delays, and to provide input to dedicated monitoring campaigns and future LSST cosmology feasibility studies, we have invited the community to take part in a "Time Delay Challenge" (TDC). The challenge is organized as a set of "ladders", each containing a group of simulated data sets to be analyzed blindly by participating teams. Each rung on a ladder consists of a set of realistic mock observed lensed quasar light curves, with the rungs' data sets increasing in complexity and realism. The initial challenge described here has two ladders, TDC0 and TDC1. TDC0 has a small number of data sets and is designed to be used as a practice set by the participating teams. The (non-mandatory) deadline for completion of TDC0 was the TDC1 launch date, 2013 December 1. The TDC1 deadline was 2014 July 1. Here we give an overview of the challenge, introduce a set of metrics that will be used to quantify the goodness of fit, efficiency, precision, and accuracy of the algorithms, and present the results of TDC0. Thirteen teams participated in TDC0 using 47 different methods. Seven of those teams qualified for TDC1, which is described in the companion paper.

  3. STRONG LENS TIME DELAY CHALLENGE. II. RESULTS OF TDC1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liao, Kai; Treu, Tommaso; Marshall, Phil

    2015-02-10

    We present the results of the first strong lens time delay challenge. The motivation, experimental design, and entry-level challenge are described in a companion paper. This paper presents the main challenge, TDC1, which consisted of analyzing thousands of simulated light curves blindly. The observational properties of the light curves cover the range in quality obtained for current targeted efforts (e.g., COSMOGRAIL) and expected from future synoptic surveys (e.g., LSST), and include simulated systematic errors. Seven teams participated in TDC1, submitting results from 78 different method variants. After describing each method, we compute and analyze basic statistics measuring accuracy (or bias) A, goodness of fit χ², precision P, and success rate f. For some methods we identify outliers as an important issue. Other methods show that outliers can be controlled via visual inspection or conservative quality control. Several methods are competitive, i.e., give |A| < 0.03, P < 0.03, and χ² < 1.5, with some of the methods already reaching sub-percent accuracy. The fraction of light curves yielding a time delay measurement is typically in the range f = 20%-40%. It depends strongly on the quality of the data: COSMOGRAIL-quality cadence and light curve lengths yield significantly higher f than does sparser sampling. Taking the results of TDC1 at face value, we estimate that LSST should provide around 400 robust time-delay measurements, each with P < 0.03 and |A| < 0.01, comparable to current lens modeling uncertainties. In terms of observing strategies, we find that A and f depend mostly on season length, while P depends mostly on cadence and campaign duration.
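    The four statistics quoted above can be written down compactly. In the sketch below (Python, with illustrative normalisations; the challenge papers define the exact forms), a team submits delay estimates with uncertainties for a subset of curves, and f, χ², P, and A are the success fraction, mean chi-square, mean fractional precision, and mean fractional bias:

```python
def tdc_metrics(truths, estimates):
    """Summary statistics in the spirit of the Time Delay Challenge.

    truths: dict {id: true_delay}; estimates: dict {id: (delay, err)}
    for the subset of curves a team submitted. Returns (f, chi2, P, A).
    Treat the normalisations as assumptions based on the abstract.
    """
    n_sub = len(estimates)
    f = n_sub / len(truths)                     # success fraction
    chi2 = sum(((dt - truths[i]) / err) ** 2    # goodness of fit
               for i, (dt, err) in estimates.items()) / n_sub
    P = sum(err / abs(truths[i])                # claimed precision
            for i, (dt, err) in estimates.items()) / n_sub
    A = sum((dt - truths[i]) / truths[i]        # accuracy / bias
            for i, (dt, err) in estimates.items()) / n_sub
    return f, chi2, P, A

truths = {1: 20.0, 2: 35.0, 3: 50.0, 4: 80.0}   # true delays (days)
estimates = {1: (20.4, 0.5), 2: (34.5, 0.6)}    # team measured 2 of 4 curves
f, chi2, P, A = tdc_metrics(truths, estimates)
```

With these toy numbers the team would be "competitive" on P and χ² but has measured only half the curves, illustrating the trade-off between f and the other metrics that the text describes.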

  4. PROFIT: Bayesian profile fitting of galaxy images

    NASA Astrophysics Data System (ADS)

    Robotham, A. S. G.; Taranu, D. S.; Tobar, R.; Moffett, A.; Driver, S. P.

    2017-04-01

    We present PROFIT, a new code for Bayesian two-dimensional photometric galaxy profile modelling. PROFIT consists of a low-level C++ library (libprofit), accessible via a command-line interface and documented API, along with high-level R (PROFIT) and PYTHON (PyProFit) interfaces (available at github.com/ICRAR/libprofit, github.com/ICRAR/ProFit, and github.com/ICRAR/pyprofit, respectively). R PROFIT is also available pre-built from CRAN; however, this version will be slightly behind the latest GitHub version. libprofit offers fast and accurate two-dimensional integration for a useful number of profiles, including Sérsic, Core-Sérsic, broken-exponential, Ferrer, Moffat, empirical King, point-source, and sky, with a simple mechanism for adding new profiles. We show detailed comparisons between libprofit and GALFIT. libprofit is both faster and more accurate than GALFIT at integrating the ubiquitous Sérsic profile for the most common values of the Sérsic index n (0.5 < n < 8). The high-level fitting code PROFIT is tested on a sample of galaxies with both SDSS and deeper KiDS imaging. We find good agreement in the fit parameters, with larger scatter in best-fitting parameters from fitting images from different sources (SDSS versus KiDS) than from using different codes (PROFIT versus GALFIT). A large suite of Monte Carlo-simulated images are used to assess prospects for automated bulge-disc decomposition with PROFIT on SDSS, KiDS, and future LSST imaging. We find that the biggest increases in fit quality come from moving from SDSS- to KiDS-quality data, with less significant gains moving from KiDS to LSST.
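    The Sérsic profile whose integration accuracy is benchmarked above has a simple closed form, I(r) = I_e · exp(-b_n · ((r/r_e)^(1/n) - 1)); the hard part PROFIT addresses is integrating it accurately over pixels near steep centres. Evaluating the profile itself is short (b_n here from the standard Ciotti & Bertin asymptotic expansion, a common approximation rather than anything specific to libprofit):

```python
import math

def sersic(r, r_e, n, I_e=1.0):
    """Sersic surface-brightness profile.

    I(r) = I_e * exp(-b_n * ((r/r_e)**(1/n) - 1)), with b_n from the
    Ciotti & Bertin (1999) expansion, accurate across the index range
    quoted in the abstract (0.5 < n < 8).
    """
    b = 2 * n - 1 / 3 + 4 / (405 * n) + 46 / (25515 * n ** 2)
    return I_e * math.exp(-b * ((r / r_e) ** (1 / n) - 1))
```

By construction I(r_e) = I_e for any index n, which is what makes r_e the half-light radius; the steep inner cusp at high n is exactly where careful pixel integration matters.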

  5. Unveiling the population of orphan γ-ray bursts

    NASA Astrophysics Data System (ADS)

    Ghirlanda, G.; Salvaterra, R.; Campana, S.; Vergani, S. D.; Japelj, J.; Bernardini, M. G.; Burlon, D.; D'Avanzo, P.; Melandri, A.; Gomboc, A.; Nappo, F.; Paladini, R.; Pescalli, A.; Salafia, O. S.; Tagliaferri, G.

    2015-06-01

    Gamma-ray bursts (GRBs) are detectable in the γ-ray band if their jets are oriented toward the observer. However, for each GRB with a typical jet half-opening angle θ_jet, there should be ~2/θ_jet² bursts whose emission cone is oriented elsewhere in space. These off-axis bursts can eventually be detected when, due to the deceleration of their relativistic jets, the beaming angle becomes comparable to the viewing angle. Orphan afterglows (OAs) should outnumber the current population of bursts detected in the γ-ray band, even though they have not been conclusively observed so far at any frequency. We compute the expected flux of the population of orphan afterglows in the mm, optical, and X-ray bands through a population synthesis code of GRBs and the standard afterglow emission model. We estimate the detection rate of OAs with ongoing and forthcoming surveys. The average duration of OAs as transients above a given limiting flux is derived and described with analytical expressions: in general, OAs should appear as daily transients in optical surveys and as monthly/yearly transients in the mm/radio band. We find that ~2 OAs yr⁻¹ could already be detected by Gaia and up to 20 OAs yr⁻¹ could be observed by the ZTF survey. A larger number of 50 OAs yr⁻¹ should be detected by LSST in the optical band. In the X-ray band, ~26 OAs yr⁻¹ could be detected by eROSITA. For the large population of OAs detectable by LSST, X-ray and optical follow-up of the light curve (for the brightest cases) and/or extensive follow-up of their emission in the mm and radio bands could be the key to disentangling their GRB nature from that of other extragalactic transients of comparable flux density.
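    The ~2/θ_jet² factor follows from solid angles: a two-sided jet of half-opening angle θ_jet illuminates a fraction 1 - cos θ_jet of the sky, so the ratio of off-axis to on-axis bursts is 1/(1 - cos θ_jet) - 1, which tends to 2/θ_jet² for small angles. A quick numerical check:

```python
import math

def orphan_ratio(theta_jet):
    """Off-axis bursts per on-axis (detectable) burst for a two-sided
    jet of half-opening angle theta_jet (radians). The beamed sky
    fraction is 2 * 2*pi*(1 - cos(theta)) / (4*pi) = 1 - cos(theta)."""
    f_beamed = 1.0 - math.cos(theta_jet)
    return 1.0 / f_beamed - 1.0

ratio = orphan_ratio(0.1)  # theta_jet = 0.1 rad, about 6 degrees
```

For θ_jet = 0.1 rad this gives roughly 200 orphan bursts per detected GRB, which is why the intrinsic population should be dominated by OAs.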

  6. Optical testing of the LSST combined primary/tertiary mirror

    NASA Astrophysics Data System (ADS)

    Tuell, Michael T.; Martin, Hubert M.; Burge, James H.; Gressler, William J.; Zhao, Chunyu

    2010-07-01

    The Large Synoptic Survey Telescope (LSST) utilizes a three-mirror design in which the primary (M1) and tertiary (M3) mirrors are two concentric aspheric surfaces on one monolithic substrate. The substrate material is Ohara E6 borosilicate glass, in a honeycomb sandwich configuration, currently in production at The University of Arizona's Steward Observatory Mirror Lab. In addition to the normal requirements for smooth surfaces of the appropriate prescriptions, the alignment of the two surfaces must be accurately measured and controlled in the production lab. Both the pointing and centration of the two optical axes are important parameters, in addition to the axial spacing of the two vertices. This paper describes the basic metrology systems for each surface, with particular attention to the alignment of the two surfaces. These surfaces are aspheric enough to require null correctors for each wavefront. Both M1 and M3 are concave surfaces with both non-zero conic constants and higher-order terms (6th order for M1 and both 6th and 8th orders for M3). M1 is hyperboloidal and can utilize a standard Offner null corrector. M3 is an oblate ellipsoid, so has positive spherical aberration. We have chosen to place a phase-etched computer-generated hologram (CGH) between the mirror surface and the center-of-curvature (CoC), whereas the M1 null lens is beyond the CoC. One relatively new metrology tool is the laser tracker, which is relied upon to measure the alignment and spacings. A separate laser tracker system will be used to measure both surfaces during loose abrasive grinding and initial polishing.

  7. Computer analysis of digital sky surveys using citizen science and manual classification

    NASA Astrophysics Data System (ADS)

    Kuminski, Evan; Shamir, Lior

    2015-01-01

    As current and future digital sky surveys such as SDSS, LSST, DES, Pan-STARRS and Gaia create increasingly massive databases containing millions of galaxies, there is a growing need to be able to efficiently analyze these data. An effective way to do this is through manual analysis; however, this may be insufficient considering the extremely vast pipelines of astronomical images generated by present and future surveys. Some efforts have been made to use citizen science to classify galaxies by their morphology on a larger scale than individual or small groups of scientists can. While citizen science efforts such as Zooniverse have helped obtain reasonably accurate morphological information about large numbers of galaxies, they cannot scale to provide complete analysis of the billions of galaxy images that will be collected by future ventures such as LSST. Since current forms of manual classification cannot scale to the masses of data collected by digital sky surveys, some form of automated data analysis will be required, working either independently or in combination with human analysis such as citizen science. Here we describe a computer vision method that can automatically analyze galaxy images and deduce galaxy morphology. Experiments using Galaxy Zoo 2 data show that the performance of the method increases as the degree of agreement between the citizen scientists gets higher, providing a cleaner dataset. For several morphological features, such as the spirality of the galaxy, the algorithm agreed with the citizen scientists on around 95% of the samples. However, the method failed on some morphological features, such as the number of spiral arms, providing an accuracy of just ~36%.

  8. Sub-percent Photometry: Faint DA White Dwarf Spectrophotometric Standards for Astrophysical Observatories

    NASA Astrophysics Data System (ADS)

    Narayan, Gautham; Axelrod, Tim; Calamida, Annalisa; Saha, Abhijit; Matheson, Thomas; Olszewski, Edward; Holberg, Jay; Bohlin, Ralph; Stubbs, Christopher W.; Rest, Armin; Deustua, Susana; Sabbi, Elena; MacKenty, John W.; Points, Sean D.; Hubeny, Ivan

    2018-01-01

    We have established a network of faint (16.5 < V < 19) hot DA white dwarfs as spectrophotometric standards for present and future wide-field observatories. Our standards are accessible from both hemispheres and suitable for ground- and space-based observatories, covering the UV to the near-IR. The network is tied directly to the most precise astrophysical reference presently available, the CALSPEC standards, through a multi-cycle imaging program using the Wide Field Camera 3 (WFC3) on the Hubble Space Telescope (HST). We have developed two independent analyses to forward model all the observed photometry and ground-based spectroscopy and infer a spectral energy distribution for each source using a non-local-thermodynamic-equilibrium (NLTE) DA white dwarf atmosphere extincted by interstellar dust. The models are in excellent agreement with each other and agree with the observations to better than 0.01 mag in all passbands, and better than 0.005 mag in the optical. The high precision of these faint sources, tied directly to the most accurate flux standards presently available, makes our network of standards ideally suited for any experiments that have very stringent requirements on absolute flux calibration, such as studies of dark energy using the Large Synoptic Survey Telescope (LSST) and the Wide-Field Infrared Survey Telescope (WFIRST).

  9. voevent-parse: Parse, manipulate, and generate VOEvent XML packets

    NASA Astrophysics Data System (ADS)

    Staley, Tim D.

    2014-11-01

    voevent-parse, written in Python, parses, manipulates, and generates VOEvent XML packets; it is built atop lxml.objectify. Details of transients detected by many projects, including Fermi, Swift, and the Catalina Sky Survey, are currently made available as VOEvents, which will also be the standard alert format for future facilities such as LSST and SKA. However, working with XML and adhering to the sometimes lengthy VOEvent schema can be a tricky process. voevent-parse provides convenience routines for common tasks, while allowing the user to utilise the full power of the lxml library when required. An earlier version of voevent-parse was part of the pysovo (ascl:1411.002) library.
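    To give a feel for what such a packet looks like, here is a minimal VOEvent 2.0 fragment parsed with only the Python standard library. The element and attribute names follow the VOEvent 2.0 schema, but this sketch deliberately does not use voevent-parse itself, which wraps the same structure in lxml.objectify conveniences:

```python
import xml.etree.ElementTree as ET

# A hand-written minimal packet; real packets also carry Who, WhereWhen, etc.
packet = """<?xml version='1.0'?>
<voe:VOEvent xmlns:voe="http://www.ivoa.net/xml/VOEvent/v2.0"
             ivorn="ivo://example.org/exercise#1" role="test" version="2.0">
  <What>
    <Param name="mag" value="19.2" ucd="phot.mag"/>
  </What>
</voe:VOEvent>"""

root = ET.fromstring(packet)
# The 'role' attribute matters operationally: "test" packets must never
# trigger real follow-up observations.
role = root.get("role")
params = {p.get("name"): p.get("value") for p in root.findall("./What/Param")}
```

Even this toy example shows why convenience routines help: the namespaced root, the role-handling convention, and free-form Param lists all have to be dealt with before any science logic runs.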

  10. The Follow-up Crisis: Optimizing Science in an Opportunity Rich Environment

    NASA Astrophysics Data System (ADS)

    Vestrand, T.

    Rapid follow-up tasking for robotic telescopes has been dominated by a one-dimensional, uncoordinated response strategy developed for gamma-ray burst studies. However, this second-grade soccer approach is increasingly showing its limitations even when there are only a few events per night, and it will certainly fail when faced with the denial-of-service attack generated by the nightly flood of new transients from massive variability surveys like LSST. We discuss approaches for optimizing the scientific return from autonomous robotic telescopes in the high event-rate limit and explore the potential of a coordinated telescope ecosystem employing heterogeneous telescopes.

  11. DEVELOPMENT OF EMERGING TECHNOLOGIES WITHIN THE SITE PROGRAM

    EPA Science Inventory

    The SITE Program comprises five research programs: the Demonstration Program, the Emerging Technology Program, the Measurement and Monitoring Technology Development Program, the Innovative Technology Program, and the Technology Transfer Program. The Emerging Technology (ET) P...

  12. Initiating the 2002 Mars Science Laboratory (MSL) Technology Program

    NASA Technical Reports Server (NTRS)

    Caffrey, Robert T.; Udomkesmalee, Gabriel; Hayati, Samad A.; Henderson, Rebecca

    2004-01-01

    The Mars Science Laboratory (MSL) Project is an aggressive mission launching in 2009 to investigate the Martian environment, and it requires new capabilities that are not currently available. The MSL Technology Program is developing a wide range of technologies needed for this mission and potentially other space missions. The MSL Technology Program reports to both the MSL Project and the Mars Technology Program (MTP). The dual reporting process creates a challenging management situation, but ensures the new technology meets both the specific MSL requirements and the broader Mars Program requirements. MTP is a NASA-wide technology development program managed by JPL and is divided into a Focused Program and a Base Program. The MSL Technology Program is under the Focused Program and is tightly coupled to MSL's mission milestones and deliverables. The technology budget is separate from the flight Project budget, but the technology's requirements and the development process are tightly coordinated with the Project. The MSL Technology Program combines the proven management techniques of flight projects with the commercial technology management strategies of industry and academia, to create a technology management program that meets the short-term requirements of MSL and the long-term requirements of MTP. This paper examines the initiation of the 2002 MSL Technology Program. Areas discussed include technology definition, task selection, technology management, and technology assessment. This paper also provides an update on the 2003 MSL Technology Program and examines some of the drivers that changed the program from its initiation.

  13. Overview of Mars Technology Program

    NASA Technical Reports Server (NTRS)

    Hayati, Samad A.

    2006-01-01

    This viewgraph presentation reviews the development of a technology program leading to Mars missions. The presentation includes: the goals of the technology program, the elements of the technology program, program metrics, major accomplishments, examples, and information about the Mars Technology Program.

  14. Protecting Dark Skies in Chile

    NASA Astrophysics Data System (ADS)

    Smith, R. Chris; Sanhueza, Pedro; Phillips, Mark

    2018-01-01

    Current projections indicate that Chile will host approximately 70% of the astronomical collecting area on Earth by 2030, augmenting the enormous area of ALMA with that of three next-generation optical telescopes: LSST, GMTO, and E-ELT. These cutting-edge facilities represent billions of dollars of investment in the astronomical facilities hosted in Chile. The Chilean government, Chilean astronomical community, and the international observatories in Chile have recognized that these investments are threatened by light pollution, and have formed a strong collaboration to work at managing the threats. We will provide an update on the work being done in Chile, ranging from training municipalities about new lighting regulations to exploring international recognition of the dark sky sites of Northern Chile.

  15. Machine Learning for Zwicky Transient Facility

    NASA Astrophysics Data System (ADS)

    Mahabal, Ashish; Zwicky Transient Facility, Catalina Real-Time Transient Survey

    2018-01-01

    The Zwicky Transient Facility (ZTF) will operate from 2018 to 2020, covering the accessible sky with its large 47-square-degree camera. The transient detection rate is expected to be about a million per night. ZTF is thus a perfect LSST prototype. The big difference is that all of the ZTF transients can be followed up by 4- to 8-m class telescopes. Given the large numbers, using human scanners to separate the genuine transients from artifacts is out of the question. That first step, as well as classifying the transients with minimal follow-up, requires machine learning. We describe the tools and plans to take on this task using follow-up facilities and knowledge gained from archival datasets.

  16. Agile software development in an earned value world: a survival guide

    NASA Astrophysics Data System (ADS)

    Kantor, Jeffrey; Long, Kevin; Becla, Jacek; Economou, Frossie; Gelman, Margaret; Juric, Mario; Lambert, Ron; Krughoff, Simon; Swinbank, John D.; Wu, Xiuqin

    2016-08-01

    Agile methodologies are current best practice in software development. They are favored for, among other reasons, preventing premature optimization by taking a somewhat short-term focus, and allowing frequent replans/reprioritizations of upcoming development work based on recent results and the current backlog. At the same time, funding agencies prescribe earned value management accounting for large projects which, these days, inevitably include substantial software components. Earned value approaches emphasize a more comprehensive and typically longer-range plan, and tend to characterize frequent replans and reprioritizations as indicative of problems. Here we describe the planning, execution, and reporting framework used by the LSST Data Management team, which navigates these opposing tensions.

  17. Characterising CCDs with cosmic rays

    DOE PAGES

    Fisher-Levine, M.; Nomerotski, A.

    2015-08-06

    The properties of cosmic ray muons make them a useful probe for measuring the properties of thick, fully depleted CCD sensors. The known energy deposition per unit length allows measurement of the gain of the sensor's amplifiers, whilst the straightness of the tracks allows for a crude assessment of the static lateral electric fields at the sensor's edges. The small volume in which the muons deposit their energy allows measurement of the contribution to the PSF from the diffusion of charge as it drifts across the sensor. In this work we present a validation of the cosmic ray gain measurement technique by comparing with radioisotope gain measurements, and we calculate the charge diffusion coefficient for prototype LSST sensors.
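    The gain measurement exploits the fact that a muon's energy deposition per unit path length is known. A back-of-the-envelope sketch: the ~80 e⁻/μm most-probable yield for a minimum-ionising particle in silicon is a textbook value used here as an assumption, and the real analysis works per pixel along the track rather than on a simple track sum:

```python
import math

def gain_from_muon(adu_sum, thickness_um, incidence_deg, e_per_um=80.0):
    """System gain (electrons per ADU) from one straight cosmic-ray track.

    A minimum-ionising muon liberates roughly e_per_um electron-hole
    pairs per micron of silicon; the path length through a sensor of
    the given thickness scales as 1/cos(incidence angle)."""
    path_um = thickness_um / math.cos(math.radians(incidence_deg))
    expected_electrons = e_per_um * path_um
    return expected_electrons / adu_sum

# A vertical muon through a 100 um sensor whose pixels sum to 5000 ADU:
g = gain_from_muon(5000.0, 100.0, 0.0)
```

A vertical track through 100 μm summing to 5000 ADU implies a gain of 1.6 e⁻/ADU; at 60° incidence the path length, and hence the inferred charge, doubles, which is why the track angle must be measured before the gain can be extracted.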

  18. Multi-Wavelength Photometric Identification of Quenching Galaxies in ZFOURGE

    NASA Astrophysics Data System (ADS)

    Forrest, Ben; Tran, Kim-Vy; ZFOURGE Collaboration

    2018-01-01

    In the new millennium, multi-wavelength photometric surveys of thousands of galaxies, such as SDSS, CANDELS, NMBS, and ZFOURGE, have become the standard for analyzing large populations. With ongoing surveys such as DES, and upcoming programs with LSST and JWST, finding ways to leverage large amounts of data will continue to be an area of important research. Many diagnostics have been used to classify these galaxies, most notably the rest-frame UVJ color-color diagram, which splits galaxies into star-forming and quiescent populations. With the plethora of data probing wavelengths outside of the optical, however, we can do better. In this talk I present a scheme for classifying galaxies using composite SEDs that clearly reveals rare populations such as extreme emission line galaxies and post-starburst galaxies. We use a sample of ~8000 galaxies from ZFOURGE which have SNR_Ks > 20, observations from 0.3-8 microns, and are at 1

  19. Initiating the 2002 Mars Science Laboratory (MSL) Focused Technology Program

    NASA Technical Reports Server (NTRS)

    Caffrey, Robert T.; Udomkesmalee, Gabriel; Hayati, Samad A.

    2004-01-01

    The Mars Science Laboratory (MSL) Project is an aggressive mission launching in 2009 to deliver a new generation of rover safely to the surface of Mars and conduct comprehensive in situ investigations using a new generation of instruments. This system will be designed to land with precision and be capable of operating over a large percentage of the surface of Mars. It will have capabilities that will support NASA's scientific goals into the next decade of exploration. The MSL Technology Program is developing a wide range of technologies needed for this mission and potentially other space missions. The MSL Technology Program reports to both the MSL Project and the Mars Technology Program (MTP). The dual reporting process creates a challenging management situation, but ensures the new technology meets both the specific MSL requirements and the broader Mars Program requirements. MTP is a NASA-wide technology development program managed by the Jet Propulsion Laboratory (JPL) and is divided into a Focused Program and a Base Program. The Focused Technology Program addresses technologies that are specific and critical to near-term missions, while the Base Technology Program addresses those technologies that are applicable to multiple missions and which can be characterized as longer term, higher risk, and high payoff technologies. The MSL Technology Program is under the Focused Program and is tightly coupled to MSL's mission milestones and deliverables. The technology budget is separate from the flight Project budget, but the technology requirements and the development process are tightly coordinated with the Project. The Technology Program combines proven management techniques of flight projects with commercial and academic technology management strategies, to create a technology management program that meets the near-term requirements of MSL and the long-term requirements of MTP. This paper examines the initiation of the 2002 MSL Technology Program. 
Some of the areas discussed in this paper include technology definition, task selection, technology management, and technology assessment.

  20. Advances in Telescope and Detector Technologies - Impacts on the Study and Understanding of Binary Star and Exoplanet Systems

    NASA Astrophysics Data System (ADS)

    Guinan, Edward F.; Engle, Scott; Devinney, Edward J.

    2012-04-01

    Current and planned telescope systems (both on the ground and in space) as well as new technologies will be discussed with emphasis on their impact on the studies of binary star and exoplanet systems. Although no telescopes or space missions are primarily designed to study binary stars (what a pity!), several are available (or will be shortly) to study exoplanet systems. Nonetheless those telescopes and instruments can also be powerful tools for studying binary and variable stars. For example, early microlensing missions (mid-1990s) such as EROS, MACHO and OGLE were initially designed for probing dark matter in the halos of galaxies but, serendipitously, these programs turned out to be a bonanza for the studies of eclipsing binaries and variable stars in the Magellanic Clouds and in the Galactic Bulge. A more recent example of this kind of serendipity is the Kepler Mission. Although Kepler was designed to discover exoplanet transits (and so far has been very successful, returning many planetary candidates), Kepler is turning out to be a ``stealth'' stellar astrophysics mission returning fundamentally important and new information on eclipsing binaries, variable stars and, in particular, providing a treasure trove of data of all types of pulsating stars suitable for detailed asteroseismology studies. With this in mind, current and planned telescopes and networks, new instruments and techniques (including interferometers) are discussed that can play important roles in our understanding of both binary star and exoplanet systems. Recent advances in detectors (e.g. laser frequency comb spectrographs), telescope networks (both small and large - e.g. SuperWASP, HATNet, RoboNet, the Las Cumbres Observatory Global Telescope (LCOGT) Network), wide field (panoramic) telescope systems (e.g. the Large Synoptic Survey Telescope (LSST) and Pan-STARRS), huge telescopes (e.g. 
    the Thirty Meter Telescope (TMT), the Overwhelmingly Large Telescope (OWL) and the Extremely Large Telescope (ELT)), and space missions, such as the James Webb Space Telescope (JWST), the possible NASA Explorer Transiting Exoplanet Survey Satellite (TESS - recently approved for further study) and Gaia (due for launch during 2013) will all be discussed. Also highlighted are advances in interferometers (both on the ground and from space) and imaging now possible at sub-millimeter wavelengths from the Expanded Very Large Array (EVLA) and the Atacama Large Millimeter Array (ALMA). High precision Doppler spectroscopy, for example with HARPS, HIRES and more recently the Carnegie Planet Finder Spectrograph, is currently returning RVs typically better than ~2 m/s for some brighter exoplanet systems. But soon it should be possible to measure Doppler shifts as small as ~10 cm/s - sufficiently sensitive for detecting Earth-size planets. Also briefly discussed is the impact these instruments will have on the study of eclipsing binaries, along with future possibilities of utilizing methods from the emerging field of Astroinformatics, including the Virtual Observatory (VO) and the possibilities of analyzing these huge datasets using Neural Network (NN) and Artificial Intelligence (AI) technologies.

  1. Constraining neutrino masses with the integrated-Sachs-Wolfe-galaxy correlation function

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lesgourgues, Julien; Valkenburg, Wessel; Gaztanaga, Enrique

    2008-03-15

    Temperature anisotropies in the cosmic microwave background (CMB) are affected by the late integrated Sachs-Wolfe (lISW) effect caused by any time variation of the gravitational potential on linear scales. Dark energy is not the only source of lISW, since massive neutrinos induce a small decay of the potential on small scales during both matter and dark energy domination. In this work, we study the prospect of using the cross correlation between CMB and galaxy-density maps as a tool for constraining the neutrino mass. On the one hand massive neutrinos reduce the cross-correlation spectrum because free-streaming slows down structure formation; on the other hand, they enhance it through their change in the effective linear growth. We show that in the observable range of scales and redshifts, the first effect dominates, but the second one is not negligible. We carry out an error forecast analysis by fitting some mock data inspired by the Planck satellite, the Dark Energy Survey (DES) and the Large Synoptic Survey Telescope (LSST). The inclusion of the cross correlation data from Planck and LSST increases the sensitivity to the neutrino mass m_ν by 38% (and to the dark energy equation of state w by 83%) with respect to Planck alone. The correlation between Planck and DES brings a far less significant improvement. This method is not potentially as good for detecting m_ν as the measurement of galaxy, cluster, or cosmic shear power spectra, but since it is independent and affected by different systematics, it remains potentially interesting if the total neutrino mass is of the order of 0.2 eV; if instead it is close to the lower bound from atmospheric oscillations, m_ν ≈ 0.05 eV, we do not expect the ISW-galaxy correlation to be ever sensitive to m_ν.
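    Error forecasts of the kind described (fitting mock Planck/DES/LSST data) are conventionally done with a Fisher matrix. A toy sketch with made-up band-power derivatives, not the paper's actual spectra:

    ```python
    # Toy Fisher-matrix forecast: parameters are (m_nu, w), the data are a
    # few band powers C_l with Gaussian errors sigma_l. The derivative
    # values below are invented purely for illustration.
    import numpy as np

    def fisher(derivs, sigma):
        """F_ij = sum_l (dC_l/dtheta_i)(dC_l/dtheta_j) / sigma_l^2."""
        d = np.asarray(derivs)            # shape (n_params, n_ell)
        w = 1.0 / np.asarray(sigma) ** 2  # inverse variances, shape (n_ell,)
        return (d * w) @ d.T

    # Hypothetical response of 5 band powers to m_nu and w:
    dC_dmnu = np.array([0.8, 0.6, 0.4, 0.2, 0.1])
    dC_dw   = np.array([0.1, 0.2, 0.4, 0.6, 0.8])
    sigma_l = np.full(5, 0.5)

    F = fisher([dC_dmnu, dC_dw], sigma_l)
    errors = np.sqrt(np.diag(np.linalg.inv(F)))  # marginalized 1-sigma errors
    ```

    Adding an independent data set (e.g. the ISW-galaxy cross spectrum) simply adds its own term to F, which is how the quoted 38% sensitivity improvement over Planck alone would be computed.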

  2. The Future of Astrometric Education

    NASA Astrophysics Data System (ADS)

    van Altena, W.; Stavinschi, M.

    2005-10-01

    Astrometry is poised to enter an era of unparalleled growth and relevance due to the wealth of highly accurate data expected from the SIM and GAIA space missions. Innovative ground-based telescopes, such as the LSST, are planned which will provide less precise data, but for many more stars. The potential for studies of the structure, kinematics and dynamics of our Galaxy as well as for the physical nature of stars and the cosmological distance scale is without equal in the history of astronomy. It is therefore ironic that in two years not one course in astrometry will be taught in the US, leaving all astrometric education to Europe, China and Latin America. Who will ensure the astrometric quality control for the JWST, SIM, GAIA, LSST, to say nothing of the current large ground-based facilities, such as the VLT, Gemini, Keck, NOAO, Magellan, LBT, etc.? Hipparcos and the HST were astrometric successes due only to the dedicated work of specialists in astrometry who fought to maintain the astrometric characteristics of those satellites and their data pipelines. We propose a renewal of astrometric education in the universities to prepare qualified scientists so that the scientific returns from the investment of billions of dollars in these unique facilities will be maximized. The funding agencies are providing outstanding facilities. The universities, national and international observatories and agencies should acknowledge their responsibility to hire qualified full-time astrometric scientists to teach students, and to supervise existing and planned astronomical facilities so that quality data will be obtained and analyzed. A temporary solution to this problem is proposed in the form of a series of international summer schools in Astrometry. The Michelson Science Center of the SIM project has offered to hold an astrometry summer school in 2005 to begin this process. 
A one-semester syllabus is suggested as a means of meeting the needs of Astronomy by educating students in astrometric techniques that might be most valuable for careers associated with modern astrophysics.

  3. Managing Astronomy Research Data: Case Studies of Big and Small Research Projects

    NASA Astrophysics Data System (ADS)

    Sands, Ashley E.

    2015-01-01

    Astronomy data management refers to all actions taken upon data over the course of the entire research process. It includes activities involving the collection, organization, analysis, release, storage, archiving, preservation, and curation of research data. Astronomers have cultivated data management tools, infrastructures, and local practices to ensure the use and future reuse of their data. However, new sky surveys will soon amass petabytes of data requiring new data management strategies. The goal of this dissertation, to be completed in 2015, is to identify and understand data management practices and the infrastructure and expertise required to support best practices. This will benefit the astronomy community in efforts toward an integrated scholarly communication framework. This dissertation employs qualitative, social science research methods (including interviews, observations, and document analysis) to conduct case studies of data management practices, covering the entire data lifecycle, amongst three populations: Sloan Digital Sky Survey (SDSS) collaboration team members; individual and small-group users of SDSS data; and Large Synoptic Survey Telescope (LSST) collaboration team members. I have been observing the collection, release, and archiving of data by the SDSS collaboration, the data practices of individuals and small groups using SDSS data in journal articles, and the LSST collaboration's planning and building of infrastructure to produce data. Preliminary results demonstrate that current data management practices in astronomy are complex, situational, and heterogeneous. Astronomers often have different management repertoires for working on sky surveys and for their own data collections, varying their data practices as they move between projects. 
    The multitude of practices complicates coordinated efforts to maintain data. While astronomy expertise proves critical to managing astronomy data in the short, medium, and long term, the larger astronomy data workforce encompasses a greater breadth of educational backgrounds. Results show that teams of individuals with distinct expertise are key to ensuring the long-term preservation and usability of astronomy datasets.

  4. Color Me Intrigued: The Discovery of iPTF 16fnm, an SN 2002cx-like Object

    NASA Astrophysics Data System (ADS)

    Miller, A. A.; Kasliwal, M. M.; Cao, Y.; Adams, S. M.; Goobar, A.; Knežević, S.; Laher, R. R.; Lunnan, R.; Masci, F. J.; Nugent, P. E.; Perley, D. A.; Petrushevska, T.; Quimby, R. M.; Rebbapragada, U. D.; Sollerman, J.; Taddia, F.; Kulkarni, S. R.

    2017-10-01

    Modern wide-field, optical time-domain surveys must solve a basic optimization problem: maximize the number of transient discoveries or minimize the follow-up needed for the new discoveries. Here, we describe the Color Me Intrigued experiment, the first from the intermediate Palomar Transient Factory (iPTF) to search for transients simultaneously in the g_PTF and R_PTF bands. During the course of this experiment, we discovered iPTF 16fnm, a new member of the 02cx-like subclass of Type Ia supernovae (SNe). iPTF 16fnm peaked at M_gPTF = −15.09 ± 0.17 mag, making it the second-least-luminous known SN Ia. iPTF 16fnm exhibits all the hallmarks of the 02cx-like class: (I) low luminosity at peak, (II) low ejecta velocities, and (III) a non-nebular spectrum several months after peak. Spectroscopically, iPTF 16fnm exhibits a striking resemblance to two other low-luminosity 02cx-like SNe: SN 2007qd and SN 2010ae. iPTF 16fnm and SN 2005hk decline at nearly the same rate, despite a 3 mag difference in brightness at peak. When considering the full subclass of 02cx-like SNe, we do not find evidence for a tight correlation between peak luminosity and decline rate in either the g′ or r′ band. We measure the relative rate of 02cx-like SNe to normal SNe Ia and find N_02cx/N_Ia = 33 (+158/−25)%. We further examine the g′ − r′ evolution of 02cx-like SNe and find that their unique color evolution can be used to separate them from 91bg-like and normal SNe Ia. This selection function will be especially important in the spectroscopically incomplete Zwicky Transient Facility/Large Synoptic Survey Telescope (LSST) era. Finally, we close by recommending that LSST periodically evaluate, and possibly update, its observing cadence to maximize transient science.

  5. Liverpool telescope 2: a new robotic facility for rapid transient follow-up

    NASA Astrophysics Data System (ADS)

    Copperwheat, C. M.; Steele, I. A.; Barnsley, R. M.; Bates, S. D.; Bersier, D.; Bode, M. F.; Carter, D.; Clay, N. R.; Collins, C. A.; Darnley, M. J.; Davis, C. J.; Gutierrez, C. M.; Harman, D. J.; James, P. A.; Knapen, J. H.; Kobayashi, S.; Marchant, J. M.; Mazzali, P. A.; Mottram, C. J.; Mundell, C. G.; Newsam, A.; Oscoz, A.; Palle, E.; Piascik, A.; Rebolo, R.; Smith, R. J.

    2015-03-01

    The Liverpool Telescope is one of the world's premier facilities for time domain astronomy. The time domain landscape is set to radically change in the coming decade, with synoptic all-sky surveys such as LSST providing huge numbers of transient detections on a nightly basis; transient detections across the electromagnetic spectrum from other major facilities such as SVOM, SKA and CTA; and the era of `multi-messenger astronomy', wherein astrophysical events are detected via non-electromagnetic means, such as neutrino or gravitational wave emission. We describe here our plans for the Liverpool Telescope 2: a new robotic telescope designed to capitalise on this new era of time domain astronomy. LT2 will be a 4-metre class facility co-located with the Liverpool Telescope at the Observatorio del Roque de los Muchachos on the Canary island of La Palma. The telescope will be designed for extremely rapid response: the aim is that the telescope will take data within 30 seconds of the receipt of a trigger from another facility. The motivation for this is twofold: firstly it will make it a world-leading facility for the study of fast fading transients and explosive phenomena discovered at early times. Secondly, it will enable large-scale programmes of low-to-intermediate resolution spectral classification of transients to be performed with great efficiency. In the target-rich environment of the LSST era, minimising acquisition overheads will be key to maximising the science gains from any follow-up programme. The telescope will have a diverse instrument suite which is simultaneously mounted for automatic changes, but it is envisaged that the primary instrument will be an intermediate resolution, optical/infrared spectrograph for scientific exploitation of transients discovered with the next generation of synoptic survey facilities. In this paper we outline the core science drivers for the telescope, and the requirements for the optical and mechanical design.

  6. Prime Focus Spectrograph: A very wide-field, massively multiplexed, optical & near-infrared spectrograph for Subaru Telescope

    NASA Astrophysics Data System (ADS)

    TAMURA, NAOYUKI

    2015-08-01

    PFS (Prime Focus Spectrograph), a next generation facility instrument on Subaru, is a very wide-field, massively-multiplexed, optical & near-infrared spectrograph. Exploiting the Subaru prime focus, 2400 reconfigurable fibers will be distributed in the 1.3 degree field. The spectrograph will have 3 arms of blue, red, and near-infrared cameras to simultaneously observe spectra from 380nm to 1260nm in one exposure. The development of this instrument has been undertaken by an international collaboration at the initiative of Kavli IPMU. The project is now going into the construction phase, aiming at system integration and on-sky commissioning in 2017-2018, and science operation in 2019. In parallel, the survey design has also been developed, envisioning a Subaru Strategic Program (SSP) that spans roughly 300 nights over 5 years. The major science areas are threefold: cosmology, galaxy/AGN evolution, and Galactic archaeology (GA). The cosmology program will constrain the nature of dark energy via a survey of emission line galaxies over a comoving volume of ~10 Gpc^3 in the redshift range of 0.8 < z < 2.4. In the GA program, radial velocities and chemical abundances of stars in the Milky Way, dwarf spheroidal galaxies, and M31 will be used to understand the past assembly histories of those galaxies and the structures of their dark matter halos. Spectra will be taken for ~1 million stars as faint as V = 22, therefore out to large distances from the Sun. For the extragalactic program, our simulations suggest the wide wavelength coverage of PFS will be particularly powerful in probing the galaxy populations and their clustering properties over a wide redshift range. We will conduct a survey of color-selected 1 < z < 2 galaxies and AGN over 20 square degrees down to J = 23.4, yielding a fair sample of galaxies with stellar masses above ~10^10 solar masses. 
Further, PFS will also provide unique spectroscopic opportunities even in the era of Euclid, LSST, WFIRST and TMT. In this presentation, an overview of the instrument, current project status and path forward will be given.

  7. Large-Scale Overlays and Trends: Visually Mining, Panning and Zooming the Observable Universe.

    PubMed

    Luciani, Timothy Basil; Cherinka, Brian; Oliphant, Daniel; Myers, Sean; Wood-Vasey, W Michael; Labrinidis, Alexandros; Marai, G Elisabeta

    2014-07-01

    We introduce a web-based computing infrastructure to assist the visual integration, mining and interactive navigation of large-scale astronomy observations. Following an analysis of the application domain, we design a client-server architecture to fetch distributed image data and to partition local data into a spatial index structure that allows prefix-matching of spatial objects. In conjunction with hardware-accelerated pixel-based overlays and an online cross-registration pipeline, this approach allows the fetching, displaying, panning and zooming of gigabit panoramas of the sky in real time. To further facilitate the integration and mining of spatial and non-spatial data, we introduce interactive trend images: compact visual representations for identifying outlier objects and for studying trends within large collections of spatial objects of a given class. In a demonstration, images from three sky surveys (SDSS, FIRST and simulated LSST results) are cross-registered and integrated as overlays, allowing cross-spectrum analysis of astronomy observations. Trend images are interactively generated from catalog data and used to visually mine astronomy observations of similar type. The front-end of the infrastructure uses the web technologies WebGL and HTML5 to enable cross-platform, web-based functionality. Our approach attains interactive rendering frame rates; its power and flexibility enable it to serve the needs of the astronomy community. Evaluation on three case studies, as well as feedback from domain experts, emphasizes the benefits of this visual approach to the observational astronomy field, and its potential benefits to large-scale geospatial visualization in general.
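    A spatial index that "allows prefix-matching of spatial objects" can be illustrated with a quadtree-style key, where objects in the same sky tile share a key prefix and can be retrieved with a sorted-range scan. The key depth and the coordinate normalization below are assumptions for illustration, not the paper's implementation:

    ```python
    # Quadtree-key prefix index sketch: interleave RA/Dec subdivisions into
    # a base-4 string key, then answer tile queries by prefix range scans.
    from bisect import bisect_left, bisect_right

    def quad_key(ra, dec, depth=8):
        """Quadtree key: one base-4 digit per level of spatial subdivision."""
        x, y = ra / 360.0, (dec + 90.0) / 180.0   # normalize to [0, 1)
        key = ""
        for _ in range(depth):
            x, y = x * 2, y * 2
            qx, qy = int(x), int(y)     # which half along each axis
            key += str(qx + 2 * qy)     # quadrant digit 0..3
            x, y = x - qx, y - qy
        return key

    class PrefixIndex:
        def __init__(self, objects):    # objects: [(ra, dec, name), ...]
            self.entries = sorted((quad_key(ra, dec), name)
                                  for ra, dec, name in objects)
            self.keys = [k for k, _ in self.entries]

        def match(self, prefix):
            """All objects whose key starts with `prefix` (one sky tile)."""
            lo = bisect_left(self.keys, prefix)
            hi = bisect_right(self.keys, prefix + "\x7f")
            return [name for _, name in self.entries[lo:hi]]

    idx = PrefixIndex([(10.0, 20.0, "A"), (10.01, 20.01, "B"),
                       (200.0, -50.0, "C")])
    nearby = idx.match(quad_key(10.0, 20.0, depth=4))  # objects in one tile
    ```

    Shorter prefixes select coarser tiles, which is what makes the structure convenient for zoom-level-dependent fetching of overlay data.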

  8. Next Generation Search Interfaces

    NASA Astrophysics Data System (ADS)

    Roby, W.; Wu, X.; Ly, L.; Goldina, T.

    2015-09-01

    Astronomers are constantly looking for easier ways to access multiple data sets. While much effort is spent on the VO, little thought is given to the types of user interfaces we need to effectively search this sort of data. For instance, an astronomer might need to search the Spitzer, WISE, and 2MASS catalogs and images, then see the results presented together in one UI. Moving seamlessly between data sets is key to presenting integrated results. Results need to be viewed using first-class, web-based, integrated FITS viewers, XY plots, and advanced table display tools. These components should be able to handle very large datasets. Making a powerful web-based UI that can manage and present multiple searches to the user requires taking advantage of many HTML5 features. AJAX is used to start searches and present results. Push notifications (Server-Sent Events) monitor background jobs. Canvas is required for advanced result displays. Lesser-known CSS3 technologies make it all flow seamlessly together. At IPAC, we have been developing our Firefly toolkit for several years. We are now using it to solve this multiple data set, multiple queries, and integrated presentation problem to create a powerful research experience. Firefly was created in IRSA, the NASA/IPAC Infrared Science Archive (http://irsa.ipac.caltech.edu). Firefly is the core for applications serving many project archives, including Spitzer, Planck, WISE, PTF, LSST and others. It is also used in IRSA's new Finder Chart and catalog and image displays.
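    The push notifications mentioned here use the Server-Sent Events wire format, in which a server streams `text/event-stream` messages over a long-lived HTTP response. A minimal sketch of serializing one event; the "job-progress" event name and payload are illustrative, not Firefly's actual protocol:

    ```python
    # Serialize one Server-Sent Events message: optional id/event lines,
    # then a data line, terminated by a blank line.
    import json

    def sse_message(data, event=None, event_id=None):
        """Format one SSE message for a text/event-stream response."""
        lines = []
        if event_id is not None:
            lines.append(f"id: {event_id}")      # lets clients resume streams
        if event is not None:
            lines.append(f"event: {event}")      # named event type
        lines.append(f"data: {json.dumps(data)}")
        return "\n".join(lines) + "\n\n"         # blank line ends the event

    msg = sse_message({"job": 42, "state": "done"},
                      event="job-progress", event_id=7)
    ```

    On the browser side, an `EventSource` listening for the named event would receive the JSON payload, which is how a UI can track long-running background searches without polling.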

  9. The e-ASTROGAM mission. Exploring the extreme Universe with gamma rays in the MeV - GeV range

    NASA Astrophysics Data System (ADS)

    De Angelis, A.; Tatischeff, V.; Tavani, M.; Oberlack, U.; Grenier, I.; Hanlon, L.; Walter, R.; Argan, A.; von Ballmoos, P.; Bulgarelli, A.; Donnarumma, I.; Hernanz, M.; Kuvvetli, I.; Pearce, M.; Zdziarski, A.; Aboudan, A.; Ajello, M.; Ambrosi, G.; Bernard, D.; Bernardini, E.; Bonvicini, V.; Brogna, A.; Branchesi, M.; Budtz-Jorgensen, C.; Bykov, A.; Campana, R.; Cardillo, M.; Coppi, P.; De Martino, D.; Diehl, R.; Doro, M.; Fioretti, V.; Funk, S.; Ghisellini, G.; Grove, E.; Hamadache, C.; Hartmann, D. H.; Hayashida, M.; Isern, J.; Kanbach, G.; Kiener, J.; Knödlseder, J.; Labanti, C.; Laurent, P.; Limousin, O.; Longo, F.; Mannheim, K.; Marisaldi, M.; Martinez, M.; Mazziotta, M. N.; McEnery, J.; Mereghetti, S.; Minervini, G.; Moiseev, A.; Morselli, A.; Nakazawa, K.; Orleanski, P.; Paredes, J. M.; Patricelli, B.; Peyré, J.; Piano, G.; Pohl, M.; Ramarijaona, H.; Rando, R.; Reichardt, I.; Roncadelli, M.; Silva, R.; Tavecchio, F.; Thompson, D. J.; Turolla, R.; Ulyanov, A.; Vacchi, A.; Wu, X.; Zoglauer, A.

    2017-10-01

    e-ASTROGAM (`enhanced ASTROGAM') is a breakthrough observatory space mission, with a detector composed of a silicon tracker, a calorimeter, and an anticoincidence system, dedicated to the study of the non-thermal Universe in the photon energy range from 0.3 MeV to 3 GeV - the lower energy limit can be pushed to energies as low as 150 keV, albeit with rapidly degrading angular resolution, for the tracker, and to 30 keV for calorimetric detection. The mission is based on an advanced space-proven detector technology, with unprecedented sensitivity, angular and energy resolution, combined with polarimetric capability. Thanks to its performance in the MeV-GeV domain, substantially improving on its predecessors, e-ASTROGAM will open a new window on the non-thermal Universe, making pioneering observations of the most powerful Galactic and extragalactic sources, elucidating the nature of their relativistic outflows and their effects on the surroundings. With a line sensitivity in the MeV energy range one to two orders of magnitude better than previous generation instruments, e-ASTROGAM will determine the origin of key isotopes fundamental for the understanding of supernova explosions and the chemical evolution of our Galaxy. The mission will provide unique data of significant interest to a broad astronomical community, complementary to powerful observatories such as LIGO-Virgo-GEO600-KAGRA, SKA, ALMA, E-ELT, TMT, LSST, JWST, Athena, CTA, IceCube, KM3NeT, and the promise of eLISA.

  10. The Mars Technology Program

    NASA Technical Reports Server (NTRS)

    Hayati, Samad A.

    2002-01-01

    Future Mars missions require new capabilities that currently are not available. The Mars Technology Program (MTP) is an integral part of the Mars Exploration Program (MEP). Its sole purpose is to assure that required technologies are developed in time to enable the baselined and future missions. The MTP is a NASA-wide technology development program managed by JPL. It is divided into a Focused Program and a Base Program. The Focused Program is tightly tied to the proposed Mars Program mission milestones. It involves time-critical deliverables that must be developed in time for infusion into the proposed Mars 2005 and 2009 missions. In addition, a technology demonstration mission by AFRL will test a LIDAR as part of a joint NASA/AFRL experiment. This program bridges the gap between technology and projects by vertically integrating the technology work with pre-project development in a project-like environment with critical dates for technology infusion. A Base Technology Program attacks higher-risk/higher-payoff technologies not in the critical path of missions.

  11. 75 FR 3791 - Broadband Technology Opportunities Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-22

    ... Department of Agriculture Rural Utilities Service Broadband Technology Opportunities Program; Notices ... 0660-ZA28 Broadband Technology Opportunities Program AGENCY: National Telecommunications and... for the Broadband Technology Opportunities Program (BTOP or Program) that the agency established...

  12. Automated software configuration in the MONSOON system

    NASA Astrophysics Data System (ADS)

    Daly, Philip N.; Buchholz, Nick C.; Moore, Peter C.

    2004-09-01

    MONSOON is the next generation OUV-IR controller project being developed at NOAO. The design is flexible, emphasizing code re-use, maintainability and scalability as key factors. The software needs to support widely divergent detector systems ranging from multi-chip mosaics (for LSST, QUOTA, ODI and NEWFIRM) down to large single or multi-detector laboratory development systems. In order for this flexibility to be effective and safe, the software must be able to configure itself to the requirements of the attached detector system at startup. The basic building block of all MONSOON systems is the PAN-DHE pair which make up a single data acquisition node. In this paper we discuss the software solutions used in the automatic PAN configuration system.

  13. The Dynamics of the Local Group in the Era of Precision Astrometry

    NASA Astrophysics Data System (ADS)

    Besla, Gurtina; Garavito-Camargo, Nicolas; Patel, Ekta

    2018-06-01

    Our understanding of the dynamics of our Local Group of galaxies has changed dramatically over the past few years owing to significant advancements in astrometry and our theoretical understanding of galaxy structure. New surveys now enable us to map the 3D structure of our Milky Way and the dynamics of tracers of its dark matter distribution, like globular clusters, satellite galaxies and streams, with unprecedented precision. Some results have met with controversy, challenging preconceived notions of the orbital dynamics of key components of the Local Group. I will provide an overview of this evolving picture of our Local Group and outline how we can test the cold dark matter paradigm in the era of Gaia, LSST and JWST.

  14. Site Remediation Technology InfoBase: A Guide to Federal Programs, Information Resources, and Publications on Contaminated Site Cleanup Technologies. First Edition

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Table of Contents: Federal Cleanup Programs; Federal Site Remediation Technology Development Assistance Programs; Federal Site Remediation Technology Development Electronic Data Bases; Federal Electronic Resources for Site Remediation Technology Information; Other Electronic Resources for Site Remediation Technology Information; Selected Bibliography: Federal Publication on Alternative and Innovative Site Remediation; and Appendix: Technology Program Contacts.

  15. Mississippi Curriculum Framework for Drafting and Design Technology (Program CIP: 48.0102--Architectural Drafting Technology) (Program CIP: 48.0101--General Drafting). Postsecondary Programs.

    ERIC Educational Resources Information Center

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the two course sequences of the state's postsecondary-level drafting and design technology program: architectural drafting technology and drafting and design technology. Presented first are a program description and…

  16. Space and nuclear research and technology

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A fact sheet is presented on the space and nuclear research and technology program consisting of a research and technology base, system studies, system technology programs, entry systems technology, and experimental programs.

  17. Humanities Perspectives on Technology Program: Science, Technology & Society Program. Lehigh University, 1977-80.

    ERIC Educational Resources Information Center

    Cutcliffe, Stephen H., Ed.

    Newsletter issues pertaining to Lehigh University's Humanities Perspectives on Technology (HPT) Program, which was renamed the Science, Technology and Society Program, are presented. Additionally, a newsletter article excerpt entitled "Elements of Technology in a Liberal Education" is included. Two 1977 issues of "HRP News,"…

  18. Fuel Cell and Hydrogen Technologies Program | Hydrogen and Fuel Cells |

    Science.gov Websites

    Through its Fuel Cell and Hydrogen Technologies Program, NREL researches, develops, analyzes, and validates fuel cell and hydrogen production, delivery, and storage technologies for transportation

  19. 75 FR 27984 - Broadband Technology Opportunities Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-19

    ....: 0907141137-0222-10] RIN 0660-ZA28 Broadband Technology Opportunities Program AGENCY: National...; Reopening of Application Filing Window for Broadband Technology Opportunities Program Comprehensive... filing window for the Broadband Technology Opportunities Program (BTOP) that the agency established...

  20. 75 FR 20038 - Railroad Safety Technology Grant Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-16

    ...] Railroad Safety Technology Grant Program AGENCY: Federal Railroad Administration, Department of Transportation. ACTION: Notice of Funds Availability, Railroad Safety Technology Program-Correction of Grant... Railroad Safety Technology Program, in the section, ``Requirements and Conditions for Grant Applications...

  1. NASA's Microgravity Technology Report, 1996: Summary of Activities

    NASA Technical Reports Server (NTRS)

    Kierk, Isabella

    1996-01-01

    This report covers technology development and technology transfer activities within the Microgravity Science Research Programs during FY 1996. It also describes the recent major tasks under the Advanced Technology Development (ATD) Program and identifies current technology requirements. This document is consistent with NASA's Enterprise for the Human Exploration and Development of Space (HEDS) Strategic Plan. This annual update reflects changes in the Microgravity Science Research Program's new technology activities and requirements. Appendix A: FY 1996 Advanced Technology Development Program and Project Descriptions. Appendix B: Technology Development.

  2. 34 CFR 403.1 - What is the State Vocational and Applied Technology Education Program?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 3 2010-07-01 2010-07-01 false What is the State Vocational and Applied Technology... TECHNOLOGY EDUCATION PROGRAM General § 403.1 What is the State Vocational and Applied Technology Education Program? (a) Under the State Vocational and Applied Technology Education Program, the Secretary makes...

  3. Mississippi Curriculum Framework for Computer Information Systems Technology. Computer Information Systems Technology (Program CIP: 52.1201--Management Information Systems & Business Data). Computer Programming (Program CIP: 52.1201). Network Support (Program CIP: 52.1290--Computer Network Support Technology). Postsecondary Programs.

    ERIC Educational Resources Information Center

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for two programs in the state's postsecondary-level computer information systems technology cluster: computer programming and network support. Presented in the introduction are program descriptions and suggested course…

  4. CSTI Earth-to-orbit propulsion research and technology program overview

    NASA Technical Reports Server (NTRS)

    Gentz, Steven J.

    1993-01-01

    NASA supports a vigorous Earth-to-orbit (ETO) research and technology program as part of its Civil Space Technology Initiative. The purpose of this program is to provide an up-to-date technology base to support future space transportation needs for a new generation of lower cost, operationally efficient, long-lived and highly reliable ETO propulsion systems by enhancing the knowledge, understanding and design methodology applicable to advanced oxygen/hydrogen and oxygen/hydrocarbon ETO propulsion systems. Program areas of interest include analytical models, advanced component technology, instrumentation, and validation/verification testing. Organizationally, the program is divided into two areas: (1) technology acquisition and (2) technology verification.

  5. 76 FR 18166 - Technology Innovation Program Advisory Board

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-01

    ... DEPARTMENT OF COMMERCE National Institute of Standards and Technology Technology Innovation Program Advisory Board AGENCY: National Institute of Standards and Technology, Department of Commerce. ACTION: Notice of public meeting. SUMMARY: The Technology Innovation Program Advisory Board, National...

  6. 75 FR 62369 - Technology Innovation Program Advisory Board

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-08

    ... DEPARTMENT OF COMMERCE National Institute of Standards and Technology Technology Innovation Program Advisory Board AGENCY: National Institute of Standards and Technology, Department of Commerce. ACTION: Notice of public meeting. SUMMARY: The Technology Innovation Program Advisory Board, National...

  7. 75 FR 22553 - Technology Innovation Program Advisory Board

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-29

    ... DEPARTMENT OF COMMERCE National Institute of Standards and Technology Technology Innovation Program Advisory Board AGENCY: National Institute of Standards and Technology, Department of Commerce. ACTION: Notice of public meeting. SUMMARY: The Technology Innovation Program Advisory Board, National...

  8. 34 CFR 400.1 - What is the purpose of the Vocational and Applied Technology Education Programs?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Technology Education Programs? 400.1 Section 400.1 Education Regulations of the Offices of the Department of... APPLIED TECHNOLOGY EDUCATION PROGRAMS-GENERAL PROVISIONS § 400.1 What is the purpose of the Vocational and Applied Technology Education Programs? (a) The purpose of the Vocational and Applied Technology Education...

  9. 76 FR 70970 - Technology Innovation Program Advisory Board

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-16

    ... DEPARTMENT OF COMMERCE National Institute of Standards and Technology Technology Innovation Program Advisory Board AGENCY: National Institute of Standards and Technology, Department of Commerce. ACTION: Notice of public meeting. SUMMARY: The Technology Innovation Program (TIP) Advisory Board will...

  10. Teaching Machines, Programming, Computers, and Instructional Technology: The Roots of Performance Technology.

    ERIC Educational Resources Information Center

    Deutsch, William

    1992-01-01

    Reviews the history of the development of the field of performance technology. Highlights include early teaching machines, instructional technology, learning theory, programmed instruction, the systems approach, needs assessment, branching versus linear program formats, programming languages, and computer-assisted instruction. (LRW)

  11. SUPERFUND INNOVATIVE TECHNOLOGY EVALUATION PROGRAM - TECHNOLOGY PROFILES 4th Edition

    EPA Science Inventory

    The Superfund Innovative Technology Evaluation (SITE) Program evaluates new and promising treatment technologies for cleanup of hazardous waste sites. The program was created to encourage the development and routine use of innovative treatment technologies. As a result, the SI...

  12. Evidence-Based Medicine and State Health Care Coverage: The Washington Health Technology Assessment Program.

    PubMed

    Rothman, David J; Blackwood, Kristy L; Adair, Whitney; Rothman, Sheila M

    2018-04-01

    To evaluate the Washington State Health Technology Assessment Program (WHTAP). Washington State Health Technology Assessment Program proceedings in Seattle, Washington. We assessed the program through observation of its proceedings over a 5-year period, 2009-2014. We conducted detailed analyses of the documents it produced and reviewed relevant literature. Washington State Health Technology Assessment Program is unique compared to other state and federal programs. It has successfully applied evidence-based medicine to health care decision making, limited by the strength of available data. It claims cost savings, but they are not substantiated. Washington State Health Technology Assessment Program is a useful model for other states considering implementation of technology assessment programs. We provide key lessons for improving WHTAP's process. © Health Research and Educational Trust.

  13. Supernovae and cosmology with future European facilities.

    PubMed

    Hook, I M

    2013-06-13

    Prospects for future supernova surveys are discussed, focusing on the European Space Agency's Euclid mission and the European Extremely Large Telescope (E-ELT), both expected to be in operation around the turn of the decade. Euclid is a 1.2 m space survey telescope that will operate at visible and near-infrared wavelengths, and has the potential to find and obtain multi-band lightcurves for thousands of distant supernovae. The E-ELT is a planned, general-purpose ground-based, 40-m-class optical-infrared telescope with adaptive optics built in, which will be capable of obtaining spectra of type Ia supernovae to redshifts of at least four. The contribution to supernova cosmology with these facilities will be discussed in the context of other future supernova programmes such as those proposed for DES, JWST, LSST and WFIRST.

  14. Using Deep Learning to Analyze the Voices of Stars.

    NASA Astrophysics Data System (ADS)

    Boudreaux, Thomas Macaulay

    2018-01-01

    With several new large-scale surveys on the horizon, including LSST, TESS, ZTF, and Evryscope, faster and more accurate analysis methods will be required to adequately process the enormous amount of data produced. Deep learning, used in industry for years now, allows for advanced feature detection in minimally prepared datasets at very high speeds; however, despite the advantages of this method, its application to astrophysics has not yet been extensively explored. This dearth may be due to a lack of training data available to researchers. Here we generate synthetic data loosely mimicking the properties of acoustic mode pulsating stars and compare the performance of different deep learning algorithms, including Artificial Neural Networks and Convolutional Neural Networks, in classifying these synthetic data sets as either pulsators or stars not observed to vary.

  15. The Maunakea Spectroscopic Explorer: Status and System Overview

    NASA Astrophysics Data System (ADS)

    Mignot, S.; Murowinski, R.; Szeto, K.; Blin, A.; Caillier, P.

    2017-12-01

    The Maunakea Spectroscopic Explorer (MSE) project explores the possibility of upgrading the existing CFHT telescope and collaboration to turn it into the most powerful spectroscopic facility available in the 2020s. Its 10-meter aperture and 1.5-square-degree hexagonal field of view will allow both large and deep surveys, complementing current (Gaia, eROSITA, LOFAR) and future (Euclid, WFIRST, SKA, LSST) imaging surveys, as well as providing candidate targets for the TMT or the E-ELT. In line with INSU's 2015-2020 prospective roadmap, besides being well represented on MSE's science team (23 of 105 members), France is also a major contributor to the Conceptual Design studies, with CRAL developing a concept for the low- and moderate-resolution spectrographs, DT INSU for the prime focus environment, and GEPI for systems engineering.

  16. Center for development technology and program in technology and human affairs. [emphasizing technology-based networks

    NASA Technical Reports Server (NTRS)

    Wong, M. D.

    1974-01-01

    The role of technology in nontraditional higher education, with particular emphasis on technology-based networks, is analyzed. Nontraditional programs, institutions, and consortia are briefly reviewed, and nontraditional programs that utilize technology are studied. Technology-based networks are surveyed and analyzed with regard to kinds of students, learning locations, technology utilization, interinstitutional relationships, cost aspects, problems, and future outlook.

  17. 34 CFR 400.9 - What additional requirements govern the Vocational and Applied Technology Education Programs?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Applied Technology Education Programs? 400.9 Section 400.9 Education Regulations of the Offices of the... VOCATIONAL AND APPLIED TECHNOLOGY EDUCATION PROGRAMS-GENERAL PROVISIONS § 400.9 What additional requirements govern the Vocational and Applied Technology Education Programs? In addition to the Act, applicable...

  18. Strengthening 4-H Program Communication through Technology

    ERIC Educational Resources Information Center

    Robideau, Kari; Santl, Karyn

    2011-01-01

    Advances in technology are transforming how youth and parents interact with programs. The Strengthening 4-H Communication through Technology project was implemented in eight county 4-H programs in Northwest Minnesota. This article outlines the intentional process used to effectively implement technology in program planning. The project includes:…

  19. 34 CFR 400.9 - What additional requirements govern the Vocational and Applied Technology Education Programs?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Applied Technology Education Programs? 400.9 Section 400.9 Education Regulations of the Offices of the... VOCATIONAL AND APPLIED TECHNOLOGY EDUCATION PROGRAMS-GENERAL PROVISIONS § 400.9 What additional requirements govern the Vocational and Applied Technology Education Programs? In addition to the Act, applicable...

  20. 34 CFR 400.9 - What additional requirements govern the Vocational and Applied Technology Education Programs?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Applied Technology Education Programs? 400.9 Section 400.9 Education Regulations of the Offices of the... VOCATIONAL AND APPLIED TECHNOLOGY EDUCATION PROGRAMS-GENERAL PROVISIONS § 400.9 What additional requirements govern the Vocational and Applied Technology Education Programs? In addition to the Act, applicable...

  1. 34 CFR 400.9 - What additional requirements govern the Vocational and Applied Technology Education Programs?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Applied Technology Education Programs? 400.9 Section 400.9 Education Regulations of the Offices of the... VOCATIONAL AND APPLIED TECHNOLOGY EDUCATION PROGRAMS-GENERAL PROVISIONS § 400.9 What additional requirements govern the Vocational and Applied Technology Education Programs? In addition to the Act, applicable...

  2. 34 CFR 400.9 - What additional requirements govern the Vocational and Applied Technology Education Programs?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Applied Technology Education Programs? 400.9 Section 400.9 Education Regulations of the Offices of the... VOCATIONAL AND APPLIED TECHNOLOGY EDUCATION PROGRAMS-GENERAL PROVISIONS § 400.9 What additional requirements govern the Vocational and Applied Technology Education Programs? In addition to the Act, applicable...

  3. Clean Coal Technology Demonstration Program: Program Update 1998

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Assistant Secretary for Fossil Energy

    1999-03-01

    Annual report on the Clean Coal Technology Demonstration Program (CCT Program). The report addresses the role of the CCT Program, implementation, funding and costs, accomplishments, project descriptions, legislative history, program history, environmental aspects, and project contacts. The project descriptions describe the technology and provide a brief summary of the demonstration results.

  4. 10 CFR 611.202 - Advanced Technology Vehicle Manufacturing Facility Award Program.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 4 2014-01-01 2014-01-01 false Advanced Technology Vehicle Manufacturing Facility Award... TECHNOLOGY VEHICLES MANUFACTURER ASSISTANCE PROGRAM Facility/Funding Awards § 611.202 Advanced Technology Vehicle Manufacturing Facility Award Program. DOE may issue, under the Advanced Technology Vehicle...

  5. 10 CFR 611.202 - Advanced Technology Vehicle Manufacturing Facility Award Program.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 4 2012-01-01 2012-01-01 false Advanced Technology Vehicle Manufacturing Facility Award... TECHNOLOGY VEHICLES MANUFACTURER ASSISTANCE PROGRAM Facility/Funding Awards § 611.202 Advanced Technology Vehicle Manufacturing Facility Award Program. DOE may issue, under the Advanced Technology Vehicle...

  6. 10 CFR 611.202 - Advanced Technology Vehicle Manufacturing Facility Award Program.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 4 2011-01-01 2011-01-01 false Advanced Technology Vehicle Manufacturing Facility Award... TECHNOLOGY VEHICLES MANUFACTURER ASSISTANCE PROGRAM Facility/Funding Awards § 611.202 Advanced Technology Vehicle Manufacturing Facility Award Program. DOE may issue, under the Advanced Technology Vehicle...

  7. 10 CFR 611.202 - Advanced Technology Vehicle Manufacturing Facility Award Program.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 4 2013-01-01 2013-01-01 false Advanced Technology Vehicle Manufacturing Facility Award... TECHNOLOGY VEHICLES MANUFACTURER ASSISTANCE PROGRAM Facility/Funding Awards § 611.202 Advanced Technology Vehicle Manufacturing Facility Award Program. DOE may issue, under the Advanced Technology Vehicle...

  8. 10 CFR 611.202 - Advanced Technology Vehicle Manufacturing Facility Award Program.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Advanced Technology Vehicle Manufacturing Facility Award... TECHNOLOGY VEHICLES MANUFACTURER ASSISTANCE PROGRAM Facility/Funding Awards § 611.202 Advanced Technology Vehicle Manufacturing Facility Award Program. DOE may issue, under the Advanced Technology Vehicle...

  9. NASA Funding Opportunities for Optical Fabrication and Testing Technology Development

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip

    2013-01-01

    Technologies to fabricate and test optical components are required for NASA to accomplish its highest priority science missions. For example, the NRC ASTRO2010 Decadal Survey states that an advanced large-aperture UVOIR telescope is required to enable the next generation of compelling astrophysics and exo-planet science; and that present technology is not mature enough to affordably build and launch any potential UVOIR mission concept. The NRC 2012 NASA Space Technology Roadmaps and Priorities report states that the highest priority technology in which NASA should invest to 'Expand our understanding of Earth and the universe' is a new generation of astronomical telescopes. And, each of the Astrophysics division Program Office Annual Technology Reports (PATR) identifies specific technology needs. NASA has a variety of programs to fund enabling technology development: SBIR (Small Business Innovative Research); the ROSES APRA and SAT programs (Research Opportunities in Space and Earth Science; Astrophysics Research and Analysis program; Strategic Astrophysics Technology program); and several Office of the Chief Technologist (OCT) technology development programs.

  10. Research and Technology: 2003 Annual Report of the John F. Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    2003-01-01

    The John F. Kennedy Space Center (KSC) is America's Spaceport Technology Center. The KSC technology development program encompasses the efforts of the entire KSC team, consisting of Government and contractor personnel, working in partnership with academic institutions and commercial industry. KSC's assigned mission areas are space launch operations and spaceport and range technologies. KSC's technology development customers include current space transportation programs, future space transportation programs / initiatives, and enabling technical programs. The KSC Research and Technology 2003 Annual Report encompasses the efforts of contributors to the KSC advanced technology development program and KSC technology transfer activities. Dr. Dave Bartine, KSC Chief Technologist, (321) 867-7069, is responsible for publication of this report and should be contacted for any desired information regarding KSC's research and technology development activities.

  11. Aeronautics Research and Technology Program and specific objectives, fiscal year 1982

    NASA Technical Reports Server (NTRS)

    Olstad, W. B.

    1981-01-01

    The Aeronautics Research and Technology program is broken down into two program areas (research and technology base, and systems technology programs), which are further broken down into successively more detailed activities to form a work breakdown structure for the aeronautics program: program area, program/discipline objective, specific objective, and research and technology objective and plan (RTOP). A detailed view of this work breakdown structure down to the specific objective level is provided, and goals or objectives at each of these levels are set forth. What is to be accomplished and why are addressed, but not how; the latter falls within the domain of the RTOP.

  12. Impact of emerging technologies on future combat aircraft agility

    NASA Technical Reports Server (NTRS)

    Nguyen, Luat T.; Gilert, William P.

    1990-01-01

    The foreseeable character of future within-visual-range air combat entails a degree of agility which calls for the integration of high-alpha aerodynamics, thrust vectoring, intimate pilot/vehicle interfaces, and advanced weapons/avionics suites, in prospective configurations. The primary technology-development programs currently contributing to these goals are presently discussed; they encompass the F-15 Short Takeoff and Landing/Maneuver Technology Demonstrator Program, the Enhanced Fighter Maneuverability Program, the High Angle-of-Attack Technology Program, and the X-29 Technology Demonstrator Program.

  13. Assessing the Impact and Effectiveness of the Advanced Technological Education (ATE) Program. Survey Results 2004. Volume III: Status of ATE Projects and Articulation Partnerships

    ERIC Educational Resources Information Center

    Coryn, Chris L.; Gullickson, Arlen R.; Hanssen, Carl E.

    2004-01-01

    The Advanced Technological Education (ATE) program is a federally funded program designed to educate technicians for the high-technology disciplines that drive the United States' economy. As stated in the ATE program guidelines, this program promotes improvement in technological education at the undergraduate and secondary school levels by…

  14. Analysis of Engineering Content within Technology Education Programs

    ERIC Educational Resources Information Center

    Fantz, Todd D.; Katsioloudis, Petros J.

    2011-01-01

    In order to effectively teach engineering, technology teachers need to be taught engineering content, concepts, and related pedagogy. Some researchers posit that technology education programs may not have enough content to prepare technology teachers to teach engineering design. Certain technology teacher education programs have responded by…

  15. Women in Technology: The Evolution of a Simple Program That Works.

    ERIC Educational Resources Information Center

    Crumb, Jean Marie; Fenton, Ray

    Three papers present views on women in technology programs and occupations, and on Corning Community College's (CCC's) program to encourage women to enter technological fields in which they have been historically underrepresented. First, Edward F. Herman presents the historical background to the development of CCC's Women in Technology program,…

  16. 15 CFR 296.33 - Annual report.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST EXTRAMURAL PROGRAMS TECHNOLOGY INNOVATION PROGRAM... House of Representatives a report describing the Technology Innovation Program's activities, including a... for management of programs to stimulate high-risk, high-reward research. ...

  17. NASA Technology Applications Team

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The contributions of NASA to the advancement of the level of the technology base of the United States are highlighted. Technological transfer from preflight programs, the Viking program, the Apollo program, and the Shuttle and Skylab programs is reported.

  18. 34 CFR 400.2 - What programs are governed by these regulations?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... VOCATIONAL AND ADULT EDUCATION, DEPARTMENT OF EDUCATION VOCATIONAL AND APPLIED TECHNOLOGY EDUCATION PROGRAMS... apply to the Vocational and Applied Technology Education Programs as follows: (a) State-administered programs. (1) State Vocational and Applied Technology Education Program (34 CFR part 403). (2) State...

  19. Audiovisual Programming. Technology Learning Activity. Teacher Edition. Technology Education Series.

    ERIC Educational Resources Information Center

    Oklahoma State Dept. of Vocational and Technical Education, Stillwater. Curriculum and Instructional Materials Center.

    This packet of technology learning activity (TLA) materials on audiovisual programming for students in grades 6-10 consists of a technology education overview, information on use, and the instructor's and student's sections. The overview discusses the technology education program and materials. Components of the instructor's and student's sections…

  20. Identifying new technologies that save energy and reduce costs to the Federal sector: The New Technology Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunt, W.D.M.; Conover, D.R.; Stockmeyer, M.K.

    1995-11-01

    In 1990 the New Technology Demonstration Program (formerly the Test Bed Demonstration Program) was initiated by the US Department of Energy's (DOE's) Office of Federal Energy Management Programs with the purpose of accelerating the introduction of new technologies into the Federal sector. The program has since expanded into a multi-laboratory collaborative effort that evaluates new technologies and shares the results with the Federal design and procurement communities. These evaluations are performed on a collaborative basis, typically including technology manufacturers, Federal facilities, utilities, trade associations, research institutes, and others in partnership with DOE. The end result is a range of effective technology transfer tools that provide operations and performance data on new technologies to Federal designers, building managers, and procurement officials. These tools assist in accelerating a technology's Federal application and realizing reductions in energy consumption and costs.

  1. Free piston space Stirling technology program

    NASA Technical Reports Server (NTRS)

    Dochat, G. R.; Dhar, M.

    1989-01-01

    MTI recently completed an initial technology feasibility program for NASA by designing, fabricating and testing a space power demonstrator engine (SPDE). This program, which confirms the potential of free-piston Stirling engines, provided the major impetus to initiate a free-piston Stirling space engine (SSE) technology program. The accomplishments of the SPDE program are reviewed, and an overview of the SSE technology program and technical status to date is provided. Progress in both programs continues to confirm the potential of free-piston Stirling technology for either nuclear or solar space power missions.

  2. NASA funding opportunities for optical fabrication and testing technology development

    NASA Astrophysics Data System (ADS)

    Stahl, H. Philip

    2013-09-01

    NASA requires technologies to fabricate and test optical components to accomplish its highest priority science missions. The NRC ASTRO2010 Decadal Survey states that an advanced large-aperture UVOIR telescope is required to enable the next generation of compelling astrophysics and exo-planet science; and, that present technology is not mature enough to affordably build and launch any potential UVOIR mission concept. The NRC 2012 NASA Space Technology Roadmaps and Priorities Report states that the highest priority technology in which NASA should invest to 'Expand our understanding of Earth and the universe' is next generation X-ray and UVOIR telescopes. Each of the Astrophysics division Program Office Annual Technology Reports (PATR) identifies specific technology needs. NASA has a variety of programs to fund enabling technology development: SBIR (Small Business Innovative Research); the ROSES APRA and SAT programs (Research Opportunities in Space and Earth Science; Astrophysics Research and Analysis program; Strategic Astrophysics Technology program); and several Office of the Chief Technologist (OCT) programs.

  3. Review of V/STOL lift/cruise fan technology

    NASA Technical Reports Server (NTRS)

    Rolls, L. S.; Quigley, H. C.; Perkins, R. G., Jr.

    1976-01-01

    This paper presents an overview of supporting technology programs conducted to reduce the risk in the joint NASA/Navy Lift/Cruise Fan Research and Technology Aircraft Program. The aeronautical community has endeavored to combine the low-speed and lifting capabilities of the helicopter with the high-speed capabilities of the jet aircraft; recent developments have indicated that a lift/cruise fan propulsion system may provide these desired characteristics. NASA and the Navy have formulated a program that will provide a research and technology aircraft to demonstrate the viability of the lift/cruise fan aircraft through flight experience and obtain data on designs for future naval and civil V/STOL aircraft. The supporting technology programs discussed include: (1) design studies for operational aircraft, a research and technology aircraft, and associated propulsion systems; (2) wind-tunnel tests of several configurations; (3) propulsion-system thrust vectoring tests; and (4) simulation. These supporting technology programs have indicated that a satisfactory research and technology aircraft program can be accomplished within the current level of technology.

  4. NASA Funding Opportunities for Optical Fabrication and Testing Technology Development

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip

    2013-01-01

    NASA requires technologies to fabricate and test optical components to accomplish its highest priority science missions. The NRC ASTRO2010 Decadal Survey states that an advanced large-aperture UVOIR telescope is required to enable the next generation of compelling astrophysics and exo-planet science; and, that present technology is not mature enough to affordably build and launch any potential UVOIR mission concept. The NRC 2012 NASA Space Technology Roadmaps and Priorities Report states that the highest priority technology in which NASA should invest to 'Expand our understanding of Earth and the universe' is next generation X-ray and UVOIR telescopes. Each of the Astrophysics division Program Office Annual Technology Reports (PATR) identifies specific technology needs. NASA has a variety of programs to fund enabling technology development: SBIR (Small Business Innovative Research); the ROSES APRA and SAT programs (Research Opportunities in Space and Earth Science; Astrophysics Research and Analysis program; Strategic Astrophysics Technology program); and several Office of the Chief Technologist (OCT) programs.

  5. SUPERFUND INNOVATIVE TECHNOLOGY EVALUATION PROGRAM TECHNOLOGY PROFILES: SIXTH EDITION

    EPA Science Inventory

    The Superfund Innovative Technology Evaluation (SITE) Program evaluates new and promising treatment and monitoring and measurement technologies for cleanup of hazardous waste sites. The program was created to encourage the development and routine use of innovative treatment techn...

  6. SUPERFUND INNOVATIVE TECHNOLOGY EVALUATION PROGRAM - TECHNOLOGY PROFILES - SEVENTH EDITION

    EPA Science Inventory

    The Superfund Innovative Technology Evaluation (SITE) Program evaluates new and promising treatment and monitoring and measurement technologies for cleanup of hazardous waste sites. The program was created to encourage the development and routine use of innovative treatment techn...

  7. 75 FR 1591 - Green Technology Pilot Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-12

    ... DEPARTMENT OF COMMERCE Patent and Trademark Office Green Technology Pilot Program ACTION: Proposed... methods: E-mail: [email protected] . Include A0651-0062 Green Technology Pilot Program [email protected] in... green technologies, including greenhouse gas reduction.

  8. NASA's In Space Propulsion Technology Program Accomplishments and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Johnson, Les C.; Harris, David

    2008-01-01

    NASA's In-Space Propulsion Technology (ISPT) Program was managed for 5 years at NASA MSFC, and significant strides were made in the advancement of key transportation technologies that will enable or enhance future robotic science and deep space exploration missions. At the program's inception, a set of technology investment priorities was established using a NASA-wide, mission-driven prioritization process and, for the most part, these priorities changed little - thus allowing a consistent framework in which to fund and manage technology development. Technologies in the portfolio included aerocapture, advanced chemical propulsion, solar electric propulsion, solar sail propulsion, electrodynamic and momentum transfer tethers, and various very advanced propulsion technologies with significantly lower technology readiness. The program invested in technologies that have the potential to revolutionize the robotic exploration of deep space. For robotic exploration and science missions, increased efficiencies of future propulsion systems are critical to reduce overall life-cycle costs and, in some cases, enable missions previously considered impossible. Continued reliance on conventional chemical propulsion alone will not enable the robust exploration of deep space - the maximum theoretical efficiencies have almost been reached, and they are insufficient to meet the needs of many ambitious science missions currently being considered. By developing the capability to support mid-term robotic mission needs, the program was to lay the technological foundation for travel to nearby interstellar space. The ambitious goals of the program at its inception included supporting the development of technologies that could support all of NASA's missions, both human and robotic.
As time went on and budgets were never as high as planned, the scope of the program was reduced almost every year, forcing the elimination not only of the broader goals of the initial program but also of funding for over half of the technologies in the original portfolio. In addition, the frequency at which the application requirements for the program changed exceeded the development time required to mature technologies, forcing sometimes radical rescoping of research efforts already halfway (or more) to completion. At the end of its fifth year, both the scope and funding of the program were at a minimum despite the program successfully meeting all of its initial high-priority objectives. This paper will describe the program, its requirements, technology portfolio, and technology maturation processes. Also discussed will be the major technology milestones achieved and the lessons learned from managing a $100M+ technology program.

  9. NASA/Goddard Thermal Technology Overview 2014

    NASA Technical Reports Server (NTRS)

    Butler, Daniel; Swanson, Theodore D.

    2014-01-01

    This presentation summarizes the current plans and efforts at NASA Goddard to develop new thermal control technology for anticipated future missions. It will also address some of the programmatic developments currently underway at NASA, especially with respect to the Technology Development Program at NASA. While funding for basic technology development is still scarce, significant efforts are being made in direct support of flight programs. New technology development continues to be driven by the needs of future missions, and applications of these technologies to current Goddard programs will be addressed. Many of these technologies also have broad applicability to DOD, DOE, and commercial programs. Partnerships have been developed with the Air Force, Navy, and various universities to promote technology development. In addition, technology development activities supported by the internal research and development (IRAD) program, the Small Business Innovative Research (SBIR) program, and the NASA Engineering and Safety Center (NESC) are reviewed in this presentation. Specific technologies addressed include two-phase systems applications and issues on NASA missions, the latest developments of electro-hydrodynamically pumped systems, development of high electrical conductivity coatings, and various other research activities.
    - New Technology program underway at NASA, although funding is limited
    - NASA/GSFC's primary mission of science satellite development is healthy and vibrant, although new missions are scarce - now have people on overhead working new missions and proposals
    - Future mission applications promise to be thermally challenging
    - Direct technology funding is still very restricted
      - Projects are the best source for direct application of technology
      - SBIR thermal subtopic resurrected in FY 14
      - Limited technology development underway via IRAD, NESC, other sources
    - Administrator pushing to revive technology and educational programs at NASA - new HQ directorate established

  10. ATP Interior Noise Technology and Flight Demonstration Program

    NASA Technical Reports Server (NTRS)

    Stephens, David G.; Powell, Clemans A.

    1988-01-01

    The paper provides an overview of the ATP (Advanced Turboprop Program) acoustics program with emphasis on the NASA technology program and the recent NASA/Industry demonstration programs aimed at understanding and controlling passenger cabin noise. Technology developments in propeller (source) noise, cabin noise transmission, and subjective acoustics are described. Finally, an overview of the industry demonstrator programs is presented.

  11. Video and Computer Technologies for Extended-Campus Programming.

    ERIC Educational Resources Information Center

    Sagan, Edgar L.; And Others

    This paper discusses video and computer technologies for extended-campus programming (courses and programs at off-campus sites). The first section provides an overview of the distance education program at the University of Kentucky (UK), and highlights the improved access to graduate and professional programs, advances in technology, funding,…

  12. Spinoff, 1991

    NASA Technical Reports Server (NTRS)

    Haggerty, James J.

    1991-01-01

    This is an instrument of the Technology Utilization Program and is designed to heighten awareness of the technology available for transfer and its potential for public benefit. NASA's mainline programs, whose objectives require development of new technology and therefore expand the bank of technology available for transfer in future years, are summarized. Focus is on the representative sampling of spinoffs (spinoff, in this context, means products and processes developed as secondary applications of existing NASA technology) that resulted from NASA's mainline programs. The various mechanisms NASA employs to stimulate technology transfer are described and contact sources are listed in the appendix for further information about the Technology Utilization Program.

  13. Environmental Technology Verification Program Fact Sheet

    EPA Science Inventory

    This is a Fact Sheet for the ETV Program. The EPA Environmental Technology Verification Program (ETV) develops test protocols and verifies the performance of innovative technologies that have the potential to improve protection of human health and the environment. The program ...

  14. The SUPERFUND INNOVATIVE TECHNOLOGY EVALUATION program - Technology Profiles

    EPA Science Inventory

    The Superfund Innovative Technology Evaluation (SITE) program was created to evaluate new and promising treatment technologies for cleanup at hazardous waste sites. The mission of the SITE program is to encourage the development and routine use of innovative treatment technologie...

  15. Control Robotics Programming Technology. Technology Learning Activity. Teacher Edition.

    ERIC Educational Resources Information Center

    Oklahoma State Dept. of Vocational and Technical Education, Stillwater. Curriculum and Instructional Materials Center.

    This Technology Learning Activity (TLA) for control robotics programming technology in grades 6-10 is designed to teach students to construct and program computer-controlled devices using a LEGO DACTA set and computer interface and to help them understand how control technology and robotics affect them and their lifestyle. The suggested time for…

  16. Research and technology annual report, FY 1990

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Given here is the annual report of the John C. Stennis Space Center (SSC), a NASA center responsible for testing NASA's large propulsion systems, developing supporting test technologies, conducting research in a variety of earth science disciplines, and facilitating the commercial uses of NASA-developed technologies. Described here are activities of the Earth Sciences Research Program, the Technology Development Program, commercial programs, the Technology Utilization Program, and the Information Systems Program. Work is described in such areas as forest ecosystems, land-sea interface, wetland biochemical flux, thermal imaging of crops, gas detectors, plume analysis, synthetic aperture radar, forest resource management, applications engineering, and the Earth Observations Commercial Applications Program.

  17. 2012 DOE Vehicle Technologies Program Annual Merit Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    The 2012 DOE Hydrogen Program and Vehicle Technologies Program Annual Merit Review and Peer Evaluation Meeting was held May 14-18, 2012 in Crystal City, Virginia. The review encompassed all of the work done by the Hydrogen Program and the Vehicle Technologies Program: a total of 309 individual activities were reviewed for Vehicle Technologies by 189 reviewers. A total of 1,473 individual review responses were received for the technical reviews.

  18. Theme: Emerging Technologies.

    ERIC Educational Resources Information Center

    Malpiedi, Barbara J.; And Others

    1989-01-01

    Consists of six articles discussing the effect of emerging technologies on agriculture. Specific topics include (1) agriscience programs, (2) the National Conference on Agriscience and Emerging Occupations and Technologies, (3) biotechnology, (4) program improvement through technology, (5) the Agriscience Teacher of the Year program, and (6)…

  19. 75 FR 64692 - Green Technology Pilot Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-20

    ... DEPARTMENT OF COMMERCE Patent and Trademark Office Green Technology Pilot Program ACTION: Proposed...- 0062 Green Technology Pilot Program comment'' in the subject line of the message. Fax: 571-273-0112... permits patent applications pertaining to green technologies, including greenhouse gas reduction, to be...

  20. Defense Science Board Task Force on The Manufacturing Technology Program: A Key to Affordably Equipping the Future Force

    DTIC Science & Technology

    2006-02-01

    technology for cost and risk reduction of products, software, and processes; long-term, multi-Service needs; and disruptive technologies, both...initiatives and for disruptive technologies, the Office of the Secretary of Defense (OSD) can better promote the importance and value of the program...multi-Service programs, research in "disruptive" technologies, and SBIR programs. Balance current, near-term, and future needs as well as small and

  1. State investments in high-technology job growth.

    PubMed

    Leicht, Kevin T; Jenkins, J Craig

    2017-07-01

    Since the early 1970s, state and local governments have launched an array of economic development programs designed to promote high-technology development. The question our analysis addresses is whether these programs promote long-term high-technology employment growth net of state location and agglomeration advantages. Proponents talk about an infrastructure strategy that promotes investment in public research and specialized infrastructure to attract and grow new high technology industries in specific locations, and a more decentralized entrepreneurial strategy that reinforces local agglomeration capacities by investing in new enterprises and products, promoting the development of local networks and partnerships. Our results support the entrepreneurial strategy, suggesting that state governments can accelerate high technology development by adopting market-supportive programs that complement private sector initiatives. In addition to positive direct benefits of technology deployment/transfer programs and SBIR programs, entrepreneurial programs affect change in high-technology employment in concert with existing locational and agglomeration advantages. Rural (i.e., low population density) states tend to benefit from technology development programs. Infrastructure strategy programs also facilitate high technology job growth in places where local advantages already exist. Our results suggest that critics of industrial policy are correct that high technology growth is organic and endogenous, yet state governments are able to "pick winners and losers" in ways that grow their local economy. Copyright © 2017 Elsevier Inc. All rights reserved.

  2. Small engine technology programs

    NASA Technical Reports Server (NTRS)

    Niedzwiecki, Richard W.

    1990-01-01

    Described here is the small engine technology program being sponsored at the Lewis Research Center. Small gas turbine research is aimed at general aviation, commuter aircraft, rotorcraft, and cruise missile applications. The Rotary Engine program is aimed at supplying fuel flexible, fuel efficient technology to the general aviation industry, but also has applications to other missions. The Automotive Gas Turbine (AGT) and Heavy-Duty Diesel Transport Technology (HDTT) programs are sponsored by DOE. The Compound Cycle Engine program is sponsored by the Army. All of the programs are aimed towards highly efficient engine cycles, very efficient components, and the use of high temperature structural ceramics. This research tends to be generic in nature and has broad applications. The HDTT, rotary technology, and the compound cycle programs are all examining approaches to minimum heat rejection, or 'adiabatic' systems employing advanced materials. The AGT program is also directed towards ceramics application to gas turbine hot section components. Turbomachinery advances in the gas turbine programs will benefit advanced turbochargers and turbocompounders for the intermittent combustion systems, and the fundamental understandings and analytical codes developed in the research and technology programs will be directly applicable to the system projects.

  3. Evaluation of the Advanced Subsonic Technology Program Noise Reduction Benefits

    NASA Technical Reports Server (NTRS)

    Golub, Robert A.; Rawls, John W., Jr.; Russell, James W.

    2005-01-01

    This report presents a detailed evaluation of the aircraft noise reduction technology concepts developed during the course of the NASA/FAA Advanced Subsonic Technology (AST) Noise Reduction Program. In 1992, NASA and the FAA initiated a cosponsored, multi-year program with the U.S. aircraft industry focused on achieving significant advances in aircraft noise reduction. The program achieved success through a systematic development and validation of noise reduction technology. Using the NASA Aircraft Noise Prediction Program, the noise reduction benefit of the technologies that reached a NASA technology readiness level of 5 or 6 were applied to each of four classes of aircraft which included a large four engine aircraft, a large twin engine aircraft, a small twin engine aircraft and a business jet. Total aircraft noise reductions resulting from the implementation of the appropriate technologies for each class of aircraft are presented and compared to the AST program goals.

  4. Study of Federal technology transfer activities in areas of interest to NASA Office of Space and Terrestrial Applications

    NASA Technical Reports Server (NTRS)

    Madigan, J. A.; Earhart, R. W.

    1978-01-01

    Forty-three ongoing technology transfer programs in Federal agencies other than NASA were selected from over 200 current Federal technology transfer activities, based on the specific technology transfer mechanisms utilized. Detailed information was obtained on the selected programs by reviewing published literature and conducting telephone interviews with each program manager. Specific information collected on each program includes technology areas, user groups, mechanisms employed, duration of program, and level of effort. Twenty-four distinct mechanisms are currently employed in Federal technology transfer activities totaling $260 million per year. Typical applications of each mechanism were reviewed, and caveats on evaluating program effectiveness were discussed. A review of recent federally funded research in technology transfer to state and local governments was made utilizing the Smithsonian Science Information Exchange, and abstracts of interest to NASA were selected for further reference.

  5. Physics of the Cosmos Program Annual Technology Report

    NASA Technical Reports Server (NTRS)

    Pham, Bruce Thai; Cardiff, Ann H.

    2015-01-01

    What's in this Report? What's New? This fifth Program Annual Technology Report (PATR) summarizes the Program's technology development activities for fiscal year (FY) 2015. The PATR serves four purposes:
    1. Summarize the technology gaps identified by the astrophysics community;
    2. Present the results of this year's technology gap prioritization by the PCOS Technology Management Board (TMB);
    3. Report on newly funded PCOS Strategic Astrophysics Technology (SAT) projects; and
    4. Detail progress, current status, and activities planned for the coming year for all technologies supported by PCOS Supporting Research and Technology (SRT) funding in FY 2015.

  6. Effects of a Technology-Friendly Education Program on Pre-Service Teachers' Perceptions and Learning Styles

    ERIC Educational Resources Information Center

    Kim, Dong-Joong; Choi, Sang-Ho

    2016-01-01

    A technology-friendly teacher education program can make pre-service teachers more comfortable with using technology from laggard to innovator and change their learning styles in which they prefer the use of technology in teaching. It is investigated how a technology-friendly mathematics education program, which provided 49 pre-service teachers an…

  7. SUPERFUND INNOVATIVE TECHNOLOGY EVALUATION PROGRAM: PROGRESS AND ACCOMPLISHMENTS - FISCAL YEAR 1991

    EPA Science Inventory

    The Superfund Innovative Technology Evaluation (SITE) program was the first major program for demonstrating and evaluating full-scale innovative treatment technologies at hazardous waste sites. Having concluded its fifth year, the SITE program is recognized as a leading advocate ...

  8. Dental Laboratory Technology Program Guide.

    ERIC Educational Resources Information Center

    Georgia Univ., Athens. Dept. of Vocational Education.

    This program guide contains the standard dental laboratory technology curriculum for both diploma programs and associate degree programs in technical institutes in Georgia. The curriculum encompasses the minimum competencies required for entry-level workers in the dental laboratory technology field. The general information section contains the…

  9. U.S. ENVIRONMENTAL PROTECTION AGENCY'S SITE EMERGING TECHNOLOGY PROGRAM: 1991 UPDATE

    EPA Science Inventory

    The Emerging Technology Program (ETP) supports the development of technologies successfully tested at the bench- and pilot-scale level. The ETP is part of the Superfund Innovative Technology Evaluation (SITE) Program which was established in 1986 under the Superfund Amendments an...

  10. 2013 Building Technologies Office Program Peer Review Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    none,

    2013-11-01

    The 2013 Building Technologies Office Program Peer Review Report summarizes the results of the 2013 Building Technologies Office (BTO) peer review, which was held in Washington, D.C., on April 2–4, 2013. The review was attended by over 300 participants and included presentations on 59 BTO-funded projects: 29 from BTO’s Emerging Technologies Program, 20 from the Commercial Buildings Integration Program, 6 from the Residential Buildings Integration Program, and 4 from the Building Energy Codes Program. This report summarizes the scores and comments provided by the independent reviewers for each project.

  11. Technology advancements for the U.S. manned Space Station - An overview

    NASA Technical Reports Server (NTRS)

    Simon, William E.

    1987-01-01

    The structure and methodology of the Johnson Space Center (JSC) advanced development program is described. An overview of the program is given, and the technology transfer process to other disciplines is described. The test bed and flight experiment programs are described, as is the technology assessment which was performed at the end of the Phase B program. The technology program within each discipline is summarized, and the coordination and integration of the JSC program with the activities of other NASA centers and with work package contractors are discussed.

  12. Controls-Structures Interaction (CSI) technology program summary. Earth orbiting platforms program area of the space platforms technology program

    NASA Technical Reports Server (NTRS)

    Newsom, Jerry R.

    1991-01-01

    Control-Structures Interaction (CSI) technology embraces the understanding of the interaction between the spacecraft structure and the control system, and the creation and validation of concepts, techniques, and tools for enabling the interdisciplinary design of an integrated structure and control system, rather than the integration of a structural design and a control system design. The goal of this program is to develop validated CSI technology for integrated design/analysis and qualification of large flexible space systems and precision space structures. A description of the CSI technology program is presented.

  13. The NASA technology push towards future space mission systems

    NASA Technical Reports Server (NTRS)

    Sadin, Stanley R.; Povinelli, Frederick P.; Rosen, Robert

    1988-01-01

    As a result of the new Space Policy, the NASA technology program has been called upon to provide a solid base of national capabilities and talent to serve NASA's civil space program, commercial, and other space sector interests. This paper describes the new technology program structure and its characteristics, traces its origin and evolution, and projects the likely near- and far-term strategic steps. It addresses the alternative 'push-pull' approaches to technology development, the readiness levels to which the technology needs to be developed for effective technology transfer, and the focused technology programs currently being implemented to satisfy the needs of future space systems.

  14. User needs as a basis for advanced technology. [U.S. civil space program

    NASA Technical Reports Server (NTRS)

    Mankins, John C.; Reck, Gregory M.

    1992-01-01

    The NASA Integrated Technology Plan (ITP) is described with treatment given to the identification of U.S. technology needs, space research and technology programs, and some ITP implementations. The ITP is based on the development and transfer of technologies relevant to the space program that also have significant implications for general technological research. Among the areas of technological research identified are: astrophysics, earth sciences, microgravity, and space physics. The Office of Space Science and Applications prioritizes the technology needs in three classes; the highest priority is given to submm and microwave technologies for earth sciences and astrophysics study. Other government and commercial needs are outlined that include cryogenic technologies, low-cost engines, advanced data/signal processing, and low-cost ELVs. It is demonstrated that by identifying and addressing these areas of user technology needs NASA's research and technology program can enhance U.S. trade and industrial competitiveness.

  15. Component technology for stirling power converters

    NASA Technical Reports Server (NTRS)

    Thieme, Lanny G.

    1991-01-01

    NASA Lewis Research Center has organized a component technology program as part of the efforts to develop Stirling converter technology for space power applications. The Stirling Space Power Program is part of the NASA High Capacity Power Project of the Civil Space Technology Initiative (CSTI). NASA Lewis is also providing technical management for the DOE/Sandia program to develop Stirling converters for solar terrestrial power producing electricity for the utility grid. The primary contractors for the space power and solar terrestrial programs develop component technologies directly related to their goals. This Lewis component technology effort, while coordinated with the main programs, aims at longer term issues, advanced technologies, and independent assessments. An overview of work on linear alternators, engine/alternator/load interactions and controls, heat exchangers, materials, life and reliability, and bearings is presented.

  16. NASA Goddard Thermal Technology Overview 2018

    NASA Technical Reports Server (NTRS)

    Butler, Dan; Swanson, Ted

    2018-01-01

    This presentation summarizes the current plans and efforts at NASA/Goddard to develop new thermal control technology for anticipated future missions. It will also address some of the programmatic developments currently underway at NASA, especially with respect to the NASA Technology Development Program. The effects of the recently submitted NASA budget will also be addressed. While funding for basic technology development is still tight, significant efforts are being made in direct support of flight programs. Thermal technology implementation on current flight programs will be reviewed, and the recent push for CubeSat mission development will also be addressed. Many of these technologies also have broad applicability to DOD, DOE, and commercial programs. Partnerships have been developed with the Air Force, Navy, and various universities to promote technology development. In addition, technology development activities supported by the internal research and development (IRAD) program and the Small Business Innovative Research (SBIR) program are reviewed in this presentation. Specific technologies addressed include two-phase systems applications and issues on NASA missions, the latest developments of thermal control coatings, Atomic Layer Deposition (ALD), micro-scale heat transfer, and various other research activities.

  17. Research and technology operating plan summary: Fiscal year 1975 research and technology program. [space programs, energy technology, and aerospace sciences

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Summaries are presented of Research and Technology Operating Plans currently in progress throughout NASA. Citations and abstracts of the operating plans are presented along with a subject index, technical monitor index, and responsible NASA organization index. Research programs presented include those carried out in the Office of Aeronautics and Space Technology, Office of Energy Programs, Office of Applications, Office of Space Sciences, Office of Tracking and Data Acquisition, and the Office of Manned Space Flight.

  18. ESTIMATING INNOVATIVE TECHNOLOGY COSTS FOR THE SITE PROGRAM

    EPA Science Inventory

    Among the objectives of the EPA's Superfund Innovative Technology Evaluation (SITE) Program are two which pertain to the issue of economics: 1) That the program will provide a projected cost for each treatment technology demonstrated. 2) That the program will attempt to identify ...

  19. 30 CFR 402.7 - Water-Resources Technology Development Program.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 2 2011-07-01 2011-07-01 false Water-Resources Technology Development Program. 402.7 Section 402.7 Mineral Resources GEOLOGICAL SURVEY, DEPARTMENT OF THE INTERIOR WATER-RESOURCES RESEARCH PROGRAM AND THE WATER-RESOURCES TECHNOLOGY DEVELOPMENT PROGRAM Description of Water-Resources...

  20. 30 CFR 402.7 - Water-Resources Technology Development Program.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 30 Mineral Resources 2 2014-07-01 2014-07-01 false Water-Resources Technology Development Program. 402.7 Section 402.7 Mineral Resources GEOLOGICAL SURVEY, DEPARTMENT OF THE INTERIOR WATER-RESOURCES RESEARCH PROGRAM AND THE WATER-RESOURCES TECHNOLOGY DEVELOPMENT PROGRAM Description of Water-Resources...

  1. 30 CFR 402.7 - Water-Resources Technology Development Program.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 2 2010-07-01 2010-07-01 false Water-Resources Technology Development Program. 402.7 Section 402.7 Mineral Resources GEOLOGICAL SURVEY, DEPARTMENT OF THE INTERIOR WATER-RESOURCES RESEARCH PROGRAM AND THE WATER-RESOURCES TECHNOLOGY DEVELOPMENT PROGRAM Description of Water-Resources...

  2. 30 CFR 402.7 - Water-Resources Technology Development Program.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 30 Mineral Resources 2 2013-07-01 2013-07-01 false Water-Resources Technology Development Program. 402.7 Section 402.7 Mineral Resources GEOLOGICAL SURVEY, DEPARTMENT OF THE INTERIOR WATER-RESOURCES RESEARCH PROGRAM AND THE WATER-RESOURCES TECHNOLOGY DEVELOPMENT PROGRAM Description of Water-Resources...

  3. 30 CFR 402.7 - Water-Resources Technology Development Program.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 30 Mineral Resources 2 2012-07-01 2012-07-01 false Water-Resources Technology Development Program. 402.7 Section 402.7 Mineral Resources GEOLOGICAL SURVEY, DEPARTMENT OF THE INTERIOR WATER-RESOURCES RESEARCH PROGRAM AND THE WATER-RESOURCES TECHNOLOGY DEVELOPMENT PROGRAM Description of Water-Resources...

  4. Transferring new technologies within the federal sector: The New Technology Demonstration Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Conover, D.R.; Hunt, D.M.

    1994-08-01

    The federal sector is the largest consumer of products in the United States and annually purchases almost 1.5 quads of energy measured at the building site at a cost of almost $10 billion (U.S. Department of Energy 1991). A review of design, construction, and procurement practices in the federal sector, as well as discussions with manufacturers and vendors, indicated that new technologies are not utilized in as timely a manner as possible. As a consequence of this technology transfer lag, the federal sector loses valuable energy and environmental benefits that can be derived through the application of new technologies. In addition, opportunities are lost to reduce federal energy expenditures and spur U.S. economic growth through the procurement of such technologies. In 1990, under the direction of the U.S. Department of Energy (DOE) Federal Energy Management Program, the Pacific Northwest Laboratory began the design of a program to accelerate the introduction of new U.S. technologies into the federal sector. Designated first as the Test Bed Demonstration Program and more recently the New Technology Demonstration Program, it sought to shorten the acceptance period of new technologies within the federal sector. By installing and evaluating various new technologies at federal facilities, the Program attempts to increase the acceptance of those new technologies through the results of "real-world" federal installations. Since that time, the Program has conducted new technology demonstrations and evaluations, evolved to address the need for more timely information transfer, and explored collaborative opportunities with other DOE offices and laboratories. This paper explains the processes by which a new technology demonstration project is implemented and presents a general description of the Program results to date.

  5. COST EVALUATION STRATEGIES FOR TECHNOLOGIES TESTED UNDER THE ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    This document provides a general set of guidelines that may be consistently applied for collecting, evaluating, and reporting the costs of technologies tested under the ETV Program. Because of the diverse nature of the technologies and industries covered in this program, each ETV...

  6. 77 FR 46909 - Small Business Innovation Research (SBIR) Program and Small Business Technology Transfer (STTR...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-06

    ... Technology Transfer (STTR) Program Policy Directives AGENCY: U.S. Small Business Administration. ACTION...) and Small Business Technology Transfer Program (STTR) Policy Directives. These amendments implement... to Edsel Brown, Assistant Director, Office of Technology, U.S. Small Business Administrator, 409...

  7. Bridging the Technology Readiness "Valley of Death" Utilizing Nanosats

    NASA Technical Reports Server (NTRS)

    Bauer, Robert A.; Millar, Pamela S.; Norton, Charles D.

    2015-01-01

    Incorporating new technology is a hallmark of space missions. Missions demand ever-improving tools and techniques to allow them to meet the mission science requirements. In Earth Science, these technologies are normally expressed in new instrument capabilities that can enable new measurement concepts, extend the capabilities of existing measurement techniques, or provide totally new detection capabilities, as well as information systems technologies that can enhance data analysis or enable new data analyses to advance modeling and prediction capabilities. Incorporating new technologies has never been easy. There is a large development step beyond demonstration in a laboratory or on an airborne platform to the eventual space environment, sometimes referred to as the technology "valley of death." Studies have shown that non-validated technology is a primary cause of NASA and DoD mission delays and cost overruns. With the demise of the New Millennium Program within NASA, opportunities for demonstrating technologies in space have been rare. Many technologies are suitable for a flight project after only ground testing. However, some require validation in a relevant or a space flight environment, which cannot be fully tested on the ground or in airborne systems. NASA's Earth Science Technology Program has initiated a nimble program to provide fairly rapid turn-around of space-validated technologies and thereby reduce future mission risk in incorporating new technologies. The program, called In-Space Validation of Earth Science Technology (InVEST), now has five tasks in development. Each is a 3U CubeSat targeted for launch opportunities in the 2016 time period. Prior to formalizing an InVEST program, the technology program office was asked to demonstrate how the program would work and what sort of technologies could benefit from space validation. Three projects were developed and launched, and have demonstrated the technologies that they set out to validate. This paper will provide a brief status of the pre-InVEST CubeSats and discuss the development and status of the InVEST program.

  8. A fringe projector-based study of the Brighter-Fatter Effect in LSST CCDs

    DOE PAGES

    Gilbertson, W.; Nomerotski, A.; Takacs, P.

    2017-09-07

    Achieving the goals of the Large Synoptic Survey Telescope for Dark Energy science requires a detailed understanding of CCD sensor effects. One such sensor effect is the Point Spread Function (PSF) increasing with flux, alternatively called the 'Brighter-Fatter Effect.' Here a novel approach was tested to perform PSF measurements in the context of the Brighter-Fatter Effect, employing a Michelson interferometer to project a sinusoidal fringe pattern onto the CCD. The Brighter-Fatter Effect predicts that the fringe pattern should become asymmetric in the intensity pattern, as the brighter peaks corresponding to a larger flux are smeared by a larger PSF. By fitting the data with a model that allows for a changing PSF, the strength of the Brighter-Fatter Effect can be evaluated.
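
    The flux-dependent smearing described in the abstract can be sketched numerically. The model below is an illustration only, not the authors' code: it assumes a sinusoidal fringe convolved with a Gaussian PSF whose width grows linearly with the local flux, with `sigma0` and `alpha` as hypothetical fit parameters.

```python
import numpy as np

def fringe(x, i0, k, phase=0.0):
    """Ideal sinusoidal fringe pattern (photo-electrons per pixel)."""
    return 0.5 * i0 * (1.0 + np.cos(k * x + phase))

def smeared_fringe(x, i0, k, sigma0, alpha, phase=0.0):
    """Fringe convolved with a Gaussian PSF whose width grows with the
    local flux, sigma(x) = sigma0 + alpha * I(x): brighter peaks are
    smeared by a wider kernel, which skews the observed pattern
    (the Brighter-Fatter signature)."""
    ideal = fringe(x, i0, k, phase)
    sigma = sigma0 + alpha * ideal                # per-pixel PSF width
    dx = x[:, None] - x[None, :]                  # pairwise pixel separations
    kern = np.exp(-0.5 * (dx / sigma[None, :]) ** 2)
    kern /= kern.sum(axis=0, keepdims=True)       # each source pixel conserves flux
    return kern @ ideal

x = np.arange(200.0)
ideal = fringe(x, i0=5e4, k=2 * np.pi / 40)
obs = smeared_fringe(x, i0=5e4, k=2 * np.pi / 40, sigma0=1.5, alpha=2e-5)
```

    Fitting `sigma0` and `alpha` of such a forward model to the observed fringe would then quantify the strength of the effect.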

  9. A fringe projector-based study of the Brighter-Fatter Effect in LSST CCDs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbertson, W.; Nomerotski, A.; Takacs, P.

    Achieving the goals of the Large Synoptic Survey Telescope for Dark Energy science requires a detailed understanding of CCD sensor effects. One such sensor effect is the Point Spread Function (PSF) increasing with flux, alternatively called the 'Brighter-Fatter Effect.' Here a novel approach was tested to perform PSF measurements in the context of the Brighter-Fatter Effect, employing a Michelson interferometer to project a sinusoidal fringe pattern onto the CCD. The Brighter-Fatter Effect predicts that the fringe pattern should become asymmetric in the intensity pattern, as the brighter peaks corresponding to a larger flux are smeared by a larger PSF. By fitting the data with a model that allows for a changing PSF, the strength of the Brighter-Fatter Effect can be evaluated.

  10. Architectural Implications for Spatial Object Association Algorithms*

    PubMed Central

    Kumar, Vijay S.; Kurc, Tahsin; Saltz, Joel; Abdulla, Ghaleb; Kohn, Scott R.; Matarazzo, Celeste

    2013-01-01

    Spatial object association, also referred to as crossmatch of spatial datasets, is the problem of identifying and comparing objects in two or more datasets based on their positions in a common spatial coordinate system. In this work, we evaluate two crossmatch algorithms that are used for astronomical sky surveys, on the following database system architecture configurations: (1) Netezza Performance Server®, a parallel database system with active disk style processing capabilities, (2) MySQL Cluster, a high-throughput network database system, and (3) a hybrid configuration consisting of a collection of independent database system instances with data replication support. Our evaluation provides insights about how architectural characteristics of these systems affect the performance of the spatial crossmatch algorithms. We conducted our study using real use-case scenarios borrowed from a large-scale astronomy application known as the Large Synoptic Survey Telescope (LSST). PMID:25692244
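
    The crossmatch problem evaluated in the abstract can be illustrated with a minimal positional match by angular separation. The function names and the match radius below are assumptions for illustration; production crossmatch (as in the systems studied) uses spatial indexing such as zones or HEALPix rather than the brute-force O(N·M) loop shown here.

```python
import math

def ang_sep_deg(ra1, dec1, ra2, dec2):
    """Angular separation in degrees (haversine formula)."""
    r1, d1, r2, d2 = map(math.radians, (ra1, dec1, ra2, dec2))
    a = (math.sin((d2 - d1) / 2) ** 2
         + math.cos(d1) * math.cos(d2) * math.sin((r2 - r1) / 2) ** 2)
    return math.degrees(2 * math.asin(math.sqrt(a)))

def crossmatch(cat_a, cat_b, radius_deg):
    """Match each (ra, dec) in cat_a to its nearest neighbour in cat_b
    within radius_deg; returns (index_a, index_b, separation) triples."""
    matches = []
    for i, (ra_a, dec_a) in enumerate(cat_a):
        best_j, best_d = None, radius_deg
        for j, (ra_b, dec_b) in enumerate(cat_b):
            d = ang_sep_deg(ra_a, dec_a, ra_b, dec_b)
            if d <= best_d:
                best_j, best_d = j, d
        if best_j is not None:
            matches.append((i, best_j, best_d))
    return matches

# One source matches within ~1.4 arcsec; the other has no counterpart.
matches = crossmatch([(10.0, 20.0), (180.0, -45.0)],
                     [(10.0, 20.0004), (50.0, -10.0)],
                     radius_deg=0.001)
```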

  11. Centroid Position as a Function of Total Counts in a Windowed CMOS Image of a Point Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wurtz, R E; Olivier, S; Riot, V

    2010-05-27

    We obtained 960,200 22-by-22-pixel windowed images of a pinhole spot using the Teledyne H2RG CMOS detector with un-cooled SIDECAR readout. We performed an analysis to determine the precision we might expect in the position error signals to a telescope's guider system. We find that, under non-optimized operating conditions, the error in the computed centroid is strongly dependent on the total counts in the point image only below a certain threshold, approximately 50,000 photo-electrons. The LSST guider camera specification currently requires a 0.04 arcsecond error at 10 Hertz. Given the performance measured here, this specification can be delivered with a single star at 14th to 18th magnitude, depending on the passband.
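
    The centroid underlying such a guider error signal is typically a flux-weighted first moment over the windowed image. The following is a generic sketch, not the analysis code used in the study; the spot parameters are made up for the demonstration.

```python
import numpy as np

def centroid(img):
    """Flux-weighted first-moment centroid (y, x) of a windowed image."""
    img = np.asarray(img, dtype=float)
    total = img.sum()
    ys, xs = np.indices(img.shape)
    return (ys * img).sum() / total, (xs * img).sum() / total

# A symmetric Gaussian spot centred in a 22x22 window (the window size above).
yy, xx = np.indices((22, 22))
spot = np.exp(-0.5 * (((yy - 10.5) / 2.0) ** 2 + ((xx - 10.5) / 2.0) ** 2))
cy, cx = centroid(spot)
```

    In practice the shot noise on `total` is what couples centroid error to total counts, consistent with the threshold behaviour reported above.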

  12. Managing the Big Data Avalanche in Astronomy - Data Mining the Galaxy Zoo Classification Database

    NASA Astrophysics Data System (ADS)

    Borne, Kirk D.

    2014-01-01

    We will summarize a variety of data mining experiments that have been applied to the Galaxy Zoo database of galaxy classifications, which were provided by volunteer citizen scientists. The goal of these exercises is to learn new and improved classification rules for diverse populations of galaxies, which can then be applied to much larger sky surveys of the future, such as the LSST (Large Synoptic Survey Telescope), which is proposed to obtain detailed photometric data for approximately 20 billion galaxies. The massive Big Data that astronomy projects will generate in the future demand greater application of data mining and data science algorithms, as well as greater training of astronomy students in the skills of data mining and data science. The project described here has involved several graduate and undergraduate research assistants at George Mason University.

  13. Connecting the time domain community with the Virtual Astronomical Observatory

    NASA Astrophysics Data System (ADS)

    Graham, Matthew J.; Djorgovski, S. G.; Donalek, Ciro; Drake, Andrew J.; Mahabal, Ashish A.; Plante, Raymond L.; Kantor, Jeffrey; Good, John C.

    2012-09-01

    The time domain has been identified as one of the most important areas of astronomical research for the next decade. The Virtual Observatory is in the vanguard with dedicated tools and services that enable and facilitate the discovery, dissemination and analysis of time domain data. These range in scope from rapid notifications of time-critical astronomical transients to annotating long-term variables with the latest modelling results. In this paper, we will review the prior art in these areas and focus on the capabilities that the VAO is bringing to bear in support of time domain science. In particular, we will focus on the issues involved with the heterogeneous collections of (ancillary) data associated with astronomical transients, and the time series characterization and classification tools required by the next generation of sky surveys, such as LSST and SKA.

  14. Two Inseparable Facets of Technology Integration Programs: Technology and Theoretical Framework

    ERIC Educational Resources Information Center

    Demir, Servet

    2011-01-01

    This paper considers the process of program development aiming at technology integration for teachers. For this consideration, the paper focused on an integration program that was recently developed as part of a larger project. The participants of this program were 45 in-service teachers. The program continued for four weeks, and the conduct of the…

  15. Balanced program plan: analysis for biomedical and environmental research. Volume 5. Oil shale technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1976-06-01

    Oil shale technology has been divided into two sub-technologies: surface processing and in-situ processing. Definition of the research programs is essentially an amplification of the five King-Muir categories: (A) pollutants: characterization, measurement, and monitoring; (B) physical and chemical processes and effects; (C) health effects; (D) ecological processes and effects; and (E) integrated assessment. Twenty-three biomedical and environmental research projects are described as to program title, scope, milestones, technology time frame, program unit priority, and estimated program unit cost.

  16. Next Generation Launch Technology Program Lessons Learned

    NASA Technical Reports Server (NTRS)

    Cook, Stephen; Tyson, Richard

    2005-01-01

    In November 2002, NASA revised its Integrated Space Transportation Plan (ISTP) to evolve the Space Launch Initiative (SLI) to serve as a theme for two emerging programs. The first of these, the Orbital Space Plane (OSP), was intended to provide crew-escape and crew-transfer functions for the ISS. The second, the NGLT Program, developed technologies needed for safe, routine space access for scientific exploration, commerce, and national defense. The NGLT Program comprised 12 projects, ranging from fundamental high-temperature materials research to full-scale engine system developments (turbine and rocket) to scramjet flight test. The Program included technology advancement activities with a broad range of objectives, ultimate applications/timeframes, and technology maturity levels. An over-arching Systems Engineering and Analysis (SE&A) approach was employed to focus technology advancements according to a common set of requirements. Investments were categorized into three segments of technology maturation: propulsion technologies, launch systems technologies, and SE&A.

  17. Surgical Technology Program Guide.

    ERIC Educational Resources Information Center

    Georgia Univ., Athens. Dept. of Vocational Education.

    This surgical technology program guide presents the standard curriculum for technical institutes in Georgia. The curriculum addresses the minimum competencies for a surgical technology program. The program guide is designed to relate primarily to the development of those skills needed by individuals in the field to provide services in the…

  18. Variable Cycle Engine Technology Program Planning and Definition Study

    NASA Technical Reports Server (NTRS)

    Westmoreland, J. S.; Stern, A. M.

    1978-01-01

    The variable stream control engine, VSCE-502B, was selected as the base engine, with the inverted flow engine concept selected as a backup. Critical component technologies were identified, and technology programs were formulated. Several engine configurations were defined on a preliminary basis to serve as demonstration vehicles for the various technologies. The different configurations present compromises in cost, technical risk, and technology return. Plans for possible variable cycle engine technology programs were formulated by synthesizing the technology requirements with the different demonstrator configurations.

  19. 5 CFR 2641.207 - One-year restriction on any former private sector assignee under the Information Technology...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... private sector assignee under the Information Technology Exchange Program representing, aiding, counseling... assignee under the Information Technology Exchange Program representing, aiding, counseling or assisting in... the Information Technology Exchange Program, 5 U.S.C. chapter 37, no former assignee shall knowingly...

  20. 5 CFR 2641.207 - One-year restriction on any former private sector assignee under the Information Technology...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... private sector assignee under the Information Technology Exchange Program representing, aiding, counseling... assignee under the Information Technology Exchange Program representing, aiding, counseling or assisting in... the Information Technology Exchange Program, 5 U.S.C. chapter 37, no former assignee shall knowingly...

  1. 5 CFR 2641.207 - One-year restriction on any former private sector assignee under the Information Technology...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... private sector assignee under the Information Technology Exchange Program representing, aiding, counseling... assignee under the Information Technology Exchange Program representing, aiding, counseling or assisting in... the Information Technology Exchange Program, 5 U.S.C. chapter 37, no former assignee shall knowingly...

  2. 5 CFR 2641.207 - One-year restriction on any former private sector assignee under the Information Technology...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... private sector assignee under the Information Technology Exchange Program representing, aiding, counseling... assignee under the Information Technology Exchange Program representing, aiding, counseling or assisting in... the Information Technology Exchange Program, 5 U.S.C. chapter 37, no former assignee shall knowingly...

  3. 5 CFR 2641.207 - One-year restriction on any former private sector assignee under the Information Technology...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... private sector assignee under the Information Technology Exchange Program representing, aiding, counseling... assignee under the Information Technology Exchange Program representing, aiding, counseling or assisting in... the Information Technology Exchange Program, 5 U.S.C. chapter 37, no former assignee shall knowingly...

  4. Using Technology to Enhance an Automotive Program

    ERIC Educational Resources Information Center

    Ashton, Denis

    2009-01-01

    Denis Ashton uses technology in his automotive technology program at East Valley Institute of Technology (EVIT) to positively impact student outcomes. Ashton, the department chair for the automotive programs at EVIT, in Mesa, Arizona, says that using an interactive PowerPoint curriculum makes learning fun for students and provides immediate…

  5. Using Technology To Promote Your Guidance and Counseling Program among Stake Holders.

    ERIC Educational Resources Information Center

    Sabella, Russell A.; Booker, Beverly L.

    2003-01-01

    Focuses on the use of technology to promote guidance and counseling programs among stakeholders. Details the significance of promoting comprehensive guidance and counseling programs and examples of technology tools used for persuasive communication. Discusses the advantages and disadvantages of using technology to promote a school counseling…

  6. EERE Wind and Hydropower Technologies Program Technology Review (Deep Dive) for Under Secretaries Johnson and Koonin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2009-09-01

    September 4, 2009 presentation highlighting the Wind and Hydropower Program, addressing program goals and objectives, budgets, technology pathways, breakthroughs, and DOE solutions to market barriers.

  7. EERE Wind and Hydropower Technologies Program Technology Review (Deep Dive) for Under Secretaries Johnson and Koonin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCluer, Megan

    2009-09-04

    September 4, 2009 presentation highlighting the Wind and Hydropower Program, addressing program goals and objectives, budgets, technology pathways, breakthroughs, and DOE solutions to market barriers.

  8. MLS student active learning within a "cloud" technology program.

    PubMed

    Tille, Patricia M; Hall, Heather

    2011-01-01

    In November 2009, the MLS program in a large public university serving a geographically large, sparsely populated state instituted an initiative for the integration of technology enhanced teaching and learning within the curriculum. This paper is intended to provide an introduction to the system requirements and sample instructional exercises used to create an active learning technology-based classroom. Discussion includes the following: 1.) define active learning and the essential components, 2.) summarize teaching methods, technology and exercises utilized within a "cloud" technology program, 3.) describe a "cloud" enhanced classroom and programming, 4.) identify active learning tools and exercises that can be implemented into laboratory science programs, and 5.) describe the evaluation and assessment of curriculum changes and student outcomes. The integration of technology in the MLS program is a continual process and is intended to provide student-driven active learning experiences.

  9. Space technology research plans

    NASA Technical Reports Server (NTRS)

    Hook, W. Ray

    1992-01-01

    Development of new technologies is the primary purpose of the Office of Aeronautics and Space Technology (OAST). OAST's mission includes the following two goals: (1) to conduct research to provide fundamental understanding, develop advanced technology and promote technology transfer to assure U.S. preeminence in aeronautics and to enhance and/or enable future civil space missions; and (2) to provide unique facilities and technical expertise to support national aerospace needs. OAST includes both NASA Headquarters operations as well as programmatic and institutional management of the Ames Research Center, the Langley Research Center and the Lewis Research Center. In addition, a considerable portion of OAST's Space R&T Program is conducted through the flight and science program field centers of NASA. Within OAST, the Space Technology Directorate is responsible for the planning and implementation of the NASA Space Research and Technology Program. The Space Technology Directorate's mission is 'to assure that OAST shall provide technology for future civil space missions and provide a base of research and technology capabilities to serve all national space goals.' Accomplishing this mission entails the following objectives: identify, develop, validate and transfer technology to (1) increase mission safety and reliability, (2) reduce flight program development and operations costs, (3) enhance mission performance, and (4) enable new missions; and provide the capability to (1) advance technology in critical disciplines and (2) respond to unanticipated mission needs. In-space experiments are an integral part of OAST's program and provide for experimental studies, development and support for in-space flight research and validation of advanced space technologies. Conducting technology experiments in space is a valuable and cost effective way to introduce advanced technologies into flight programs. These flight experiments support both the R&T base and the focused programs within OAST.

  10. Transportation technology program: Strategic plan

    NASA Astrophysics Data System (ADS)

    1991-09-01

    The purpose of this report is to define the technology program required to meet the transportation technology needs for current and future civil space missions. It is a part of an integrated plan, prepared by NASA in part in response to the Augustine Committee recommendations, to describe and advocate expanded and more aggressive efforts in the development of advanced space technologies. This expanded program will provide a technology basis for future space missions to which the U.S. aspires, and will help to regain technology leadership for the U.S. on a broader front. The six aspects of this integrated program/plan deal with focused technologies to support space sciences, exploration, transportation, platforms, and operations as well as provide a Research and Technology Base Program. This volume describes the technologies needed to support transportation systems, e.g., technologies needed for upgrades to current transportation systems and to provide reliable and efficient transportation for future space missions. The Office of Aeronautics, Exploration, and Technology (OAET) solicited technology needs from the major agency technology users and the aerospace industry community and formed a transportation technology team (appendix A) to develop a technology program to respond to those needs related to transportation technologies. This report addresses the results of that team activity. It is a strategic plan intended for use as a planning document rather than as a project management tool. It is anticipated that this document will be primarily utilized by research & technology (R&T) management at the various NASA Centers as well as by officials at NASA Headquarters and by industry in planning their corporate Independent Research and Development (IR&D) investments.

  11. Transportation technology program: Strategic plan

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The purpose of this report is to define the technology program required to meet the transportation technology needs for current and future civil space missions. It is a part of an integrated plan, prepared by NASA in part in response to the Augustine Committee recommendations, to describe and advocate expanded and more aggressive efforts in the development of advanced space technologies. This expanded program will provide a technology basis for future space missions to which the U.S. aspires, and will help to regain technology leadership for the U.S. on a broader front. The six aspects of this integrated program/plan deal with focused technologies to support space sciences, exploration, transportation, platforms, and operations as well as provide a Research and Technology Base Program. This volume describes the technologies needed to support transportation systems, e.g., technologies needed for upgrades to current transportation systems and to provide reliable and efficient transportation for future space missions. The Office of Aeronautics, Exploration, and Technology (OAET) solicited technology needs from the major agency technology users and the aerospace industry community and formed a transportation technology team (appendix A) to develop a technology program to respond to those needs related to transportation technologies. This report addresses the results of that team activity. It is a strategic plan intended for use as a planning document rather than as a project management tool. It is anticipated that this document will be primarily utilized by research & technology (R&T) management at the various NASA Centers as well as by officials at NASA Headquarters and by industry in planning their corporate Independent Research and Development (IR&D) investments.

  12. Healthcare technologies, quality improvement programs and hospital organizational culture in Canadian hospitals

    PubMed Central

    2013-01-01

    Background Healthcare technology and quality improvement programs have been identified as a means to influence healthcare costs and healthcare quality in Canada. This study seeks to identify whether the ability to implement healthcare technology by a hospital was related to usage of quality improvement programs within the hospital and whether the culture within a hospital plays a role in the adoption of quality improvement programs. Methods A cross-sectional study of Canadian hospitals was conducted in 2010. The sample consisted of hospital administrators that were selected by provincial review boards. The questionnaire consisted of 3 sections: 20 healthcare technology items, 16 quality improvement program items and 63 culture items. Results Rasch model analysis revealed that a hierarchy existed among the healthcare technologies based upon the difficulty of implementation. The results also showed a significant relationship existed between the ability to implement healthcare technologies and the number of quality improvement programs adopted. In addition, culture within a hospital served a mediating role in the adoption of quality improvement programs. Conclusions Healthcare technologies each have different levels of difficulty. As a consequence, hospitals need to understand their current level of capability before selecting a particular technology in order to assess the level of resources needed. Further, the usage of quality improvement programs is related to the ability to implement technology and the culture within a hospital. PMID:24119419

  13. NASA's Physics of the Cosmos and Cosmic Origins programs manage Strategic Astrophysics Technology (SAT) development

    NASA Astrophysics Data System (ADS)

    Pham, Thai; Thronson, Harley; Seery, Bernard; Ganel, Opher

    2016-07-01

    The strategic astrophysics missions of the coming decades will help answer the questions "How did our universe begin and evolve?" "How did galaxies, stars, and planets come to be?" and "Are we alone?" Enabling these missions requires advances in key technologies far beyond the current state of the art. NASA's Physics of the Cosmos (PCOS), Cosmic Origins (COR), and Exoplanet Exploration Program (ExEP) Program Offices manage technology maturation projects funded through the Strategic Astrophysics Technology (SAT) program to accomplish such advances. The PCOS and COR Program Offices, residing at the NASA Goddard Space Flight Center (GSFC), were established in 2011, and serve as the implementation arm for the Astrophysics Division at NASA Headquarters. We present an overview of the Programs' technology development activities and the current technology investment portfolio of 23 technology advancements. We discuss the process for addressing community-provided technology gaps and Technology Management Board (TMB)-vetted prioritization and investment recommendations that inform the SAT program. The process improves the transparency and relevance of our technology investments, provides the community a voice in the process, and promotes targeted external technology investments by defining needs and identifying customers. The Programs' priorities are driven by strategic direction from the Astrophysics Division, which is informed by the National Research Council's (NRC) "New Worlds, New Horizons in Astronomy and Astrophysics" (NWNH) 2010 Decadal Survey report [1], the Astrophysics Implementation Plan (AIP) [2] as updated, and the Astrophysics Roadmap "Enduring Quests, Daring Visions" [3]. These priorities include technology development for missions to study dark energy, gravitational waves, X-ray and inflation probe science, and large far-infrared (IR) and ultraviolet (UV)/optical/IR telescopes to conduct imaging and spectroscopy studies.
The SAT program is the Astrophysics Division's main investment method to mature technologies that will be identified by study teams set up to inform the 2020 Decadal Survey process on several large astrophysics mission concepts.

  14. ADDENDUM TO SUPERFUND INNOVATIVE TECHNOLOGY EVALUATION PROGRAM TECHNOLOGY PROFILES, TENTH EDITION, VOLUME 1 - DEMONSTRATION PROGRAM

    EPA Science Inventory

    The Superfund Innovative Technology Evaluation (SITE) Program, now in its thirteenth year, is an integral part of EPA's research into alternative cleanup methods for hazardous waste sites around the nation. The SITE Program was created to encourage the development and routine us...

  15. Electronic Engineering Technology Program Exit Examination as an ABET and Self-Assessment Tool

    ERIC Educational Resources Information Center

    Thomas, Gary; Darayan, Shahryar

    2018-01-01

    Every engineering, computing, and engineering technology program accredited by the Accreditation Board for Engineering and Technology (ABET) has formulated many and varied self-assessment methods. Methods used to assess a program for ABET accreditation and continuous improvement are for keeping programs current with academic and industrial…

  16. THE SUPERFUND INNOVATIVE TECHNOLOGY EVALUATION PROGRAM: PROGRESS AND ACCOMPLISHMENTS - FISCAL YEAR 1990 A FOURTH REPORT TO CONGRESS

    EPA Science Inventory

    The SITE Program was the first major program for demonstrating and evaluating full-scale innovative treatment technologies at hazardous waste sites. Having concluded its fourth year, the SITE Program is recognized as a leading advocate of innovative technology development and comm...

  17. Demographic Survey of Female Faculty in Technology Education Programs.

    ERIC Educational Resources Information Center

    Heidari, Farzin

    A study was conducted to determine the general program information and the demographic status of female faculty in four-year technology education programs in the United States. Information was gathered through a literature review and a questionnaire mailed to all 70 technology education programs listed in the 1994 International Technology…

  18. Successful Examples of Instructional Technology in Higher Education.

    ERIC Educational Resources Information Center

    Hortin, John A.

    College programs considered to be successful in their use of instructional technology are described. The definition of instructional technology used to judge the media programs is as follows: a systematic approach to improve learning through media management, educational program development, and learning resources. Programs include the following:…

  19. 78 FR 8992 - Energy Conservation Program: Test Procedures for Residential Clothes Dryers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-07

    ... Energy, Building Technologies Program, Mailstop EE-2J, 1000 Independence Avenue SW., Washington, DC 20585... Technologies Program, 6th Floor, 950 L'Enfant Plaza SW., Washington, DC 20024. Telephone: (202) 586-2945. If... and Renewable Energy, Building Technologies Program, EE-2J, 1000 Independence Avenue SW., Washington...

  20. Raising awareness of assistive technology in older adults through a community-based, cooperative extension program.

    PubMed

    Sellers, Debra M; Markham, Melinda Stafford

    2012-01-01

    The Fashion an Easier Lifestyle with Assistive Technology (FELAT) curriculum was developed as a needs-based, community educational program provided through a state Cooperative Extension Service. The overall goal for participants was to raise awareness of assistive technology. Program evaluation included a postassessment and subsequent interview to determine short-term knowledge gain and longer-term behavior change. The sample consisted mainly of older, married females. The FELAT program was effective at raising awareness and increasing knowledge of assistive technology, and for many participants the program acted as a catalyst for planning or taking action related to assistive technology.

  1. ODIN system technology module library, 1972 - 1973

    NASA Technical Reports Server (NTRS)

    Hague, D. S.; Watson, D. A.; Glatt, C. R.; Jones, R. T.; Galipeau, J.; Phoa, Y. T.; White, R. J.

    1978-01-01

    ODIN/RLV is a digital computing system for the synthesis and optimization of reusable launch vehicle preliminary designs. The system consists of a library of technology modules in the form of independent computer programs and an executive program, ODINEX, which operates on the technology modules. The technology module library contains programs for estimating all major military flight vehicle system characteristics, for example, geometry, aerodynamics, economics, propulsion, inertia and volumetric properties, trajectories and missions, steady state aeroelasticity and flutter, and stability and control. A general system optimization module, a computer graphics module, and a program precompiler are available as user aids in the ODIN/RLV program technology module library.

  2. Research and technology: 1994 annual report of the John F. Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    1994-01-01

    As the NASA Center responsible for assembly, checkout, servicing, launch, recovery, and operational support of Space Transportation System elements and payloads, the John F. Kennedy Space Center is placing increasing emphasis on its advanced technology development program. This program encompasses the efforts of the Engineering Development Directorate laboratories, most of the KSC operations contractors, academia, and selected commercial industries - all working in a team effort within their own areas of expertise. This edition of the Kennedy Space Center Research and Technology 1994 Annual Report covers efforts of all these contributors to the KSC advanced technology development program, as well as our technology transfer activities. The Technology Programs and Commercialization Office (DE-TPO), (407) 867-3017, is responsible for publication of this report and should be contacted for any desired information regarding the advanced technology program.

  3. CSTI high capacity power. [Civil Space Technology Initiative

    NASA Technical Reports Server (NTRS)

    Winter, Jerry M.

    1989-01-01

    In FY-88, the Advanced Technology Program was incorporated into NASA's Civil Space Technology Initiative (CSTI). The CSTI Program was established to provide the foundation for technology development in automation and robotics, information, propulsion, and power. The CSTI High Capacity Power Program builds on the technology efforts of the SP-100 program, incorporates the previous NASA SP-100 Advanced Technology project, and provides a bridge to NASA Project Pathfinder. The elements of CSTI High Capacity Power development include Conversion Systems, Thermal Management, Power Management, System Diagnostics, and Environmental Interactions. Technology advancement in all areas, including materials, is required to assure the high reliability and 7 to 10 year lifetime demanded for future space nuclear power systems.

  4. Urban Rail Supporting Technology Program Fiscal Year 1975 - Year End Summary

    DOT National Transportation Integrated Search

    1975-12-01

    The Urban Rail Supporting Technology Program is described for the 1975 fiscal year period. Important areas include program management, technical support and applications engineering, facilities development, test and evaluation, and technology develop...

  5. 75 FR 6627 - Broadband Technology Opportunities Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-10

    ... DEPARTMENT OF COMMERCE National Telecommunications and Information Administration [Docket No. 0907141137-0079-07] RIN 0660-ZA28 Broadband Technology Opportunities Program AGENCY: National... policy and application procedures for the Broadband Technology Opportunities Program (BTOP) established...

  6. Superfund and Technology Liaison Program Fact Sheet

    EPA Pesticide Factsheets

    The Superfund and Technology Liaison (STL) Program was established to facilitate regional access to ORD laboratories, provide technical support, and assist with the integration of science and technology into decision-making for hazardous waste programs.

  7. Mississippi Curriculum Framework for Emergency Medical Technology--Basic (Program CIP: 51.0904). Emergency Medical Technology--Paramedic (Program CIP: 51.0904). Postsecondary Programs.

    ERIC Educational Resources Information Center

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the emergency medical technology (EMT) programs cluster. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline…

  8. Let's get technical: Enhancing program evaluation through the use and integration of internet and mobile technologies.

    PubMed

    Materia, Frank T; Miller, Elizabeth A; Runion, Megan C; Chesnut, Ryan P; Irvin, Jamie B; Richardson, Cameron B; Perkins, Daniel F

    2016-06-01

    Program evaluation has become increasingly important, and information on program performance often drives funding decisions. Technology use and integration can help ease the burdens associated with program evaluation by reducing the resources needed (e.g., time, money, staff) and increasing evaluation efficiency. This paper reviews how program evaluators, across disciplines, can apply internet and mobile technologies to key aspects of program evaluation: participant registration, participant tracking and retention, process evaluation (e.g., fidelity, assignment completion), and outcome evaluation (e.g., behavior change, knowledge gain). In addition, the paper considers ease of use, relative cost, and fit with target populations. How these tools can be integrated to enhance data collection and program evaluation is discussed. Important limitations of and considerations for technology integration, including the level of technical skill and cost needed to integrate various technologies, data management strategies, and ethical considerations, are highlighted. Lastly, a case study of technology use in an evaluation conducted by the Clearinghouse for Military Family Readiness at Penn State illustrates how technology integration can enhance program evaluation.

  9. Astronomy development in Serbia in view of the IAU Strategic Plan

    NASA Astrophysics Data System (ADS)

    Atanacković, Olga

    2015-03-01

    An overview of astronomy development in Serbia in view of the goals envisaged by the IAU Strategic Plan is given. Due attention is paid to the recent reform of education at all levels. In the primary schools, several extra topics in astronomy have been introduced in the physics course. Attempts are being made to reintroduce astronomy as a separate subject in the secondary schools. Special emphasis is placed on the role and activities of the Petnica Science Center, the biggest center for informal education in SE Europe, and on the successful participation of the Serbian team in International Astronomy Olympiads. Astronomy topics are taught at all five state universities in Serbia. At the Universities of Belgrade and Novi Sad, students can enroll in astronomy from the first study year. The students train at the Ondrejov Observatory (Czech Republic) and at the astronomical station on the mountain Vidojevica in southern Serbia. Astronomy research in Serbia is performed at the Astronomical Observatory, Belgrade, and at the Department of Astronomy, Faculty of Mathematics, University of Belgrade. There are about 70 researchers in astronomy in Serbia (and about as many abroad) who participate in eight projects financed by the Ministry of Education and Science and in several international collaborations and projects: SREAC, VAMDC, Belissima (recruitment of experienced expatriate researchers), Astromundus (a 2-year joint master program with four other European universities), and LSST. One of the goals in the near future is twinning between universities in the SEE region and worldwide. The ever-increasing activities of 20 amateur astronomical societies are also described.

  10. A Conceptual Methodology for Assessing Acquisition Requirements Robustness against Technology Uncertainties

    NASA Astrophysics Data System (ADS)

    Chou, Shuo-Ju

    2011-12-01

    In recent years the United States has shifted from a threat-based acquisition policy that developed systems to counter specific threats to a capabilities-based strategy that emphasizes the acquisition of systems providing critical national defense capabilities. This shift in policy, in theory, allows for the creation of an "optimal force" that is robust against current and future threats regardless of the tactics and scenarios involved. In broad terms, robustness can be defined as the insensitivity of an outcome to "noise," or non-controlled variables. Within this context, the outcome is the successful achievement of defense strategies, and the noise variables are the tactics and scenarios associated with current and future enemies. Unfortunately, a lack of system capability, budget, and schedule robustness against technology performance and development uncertainties has led to major setbacks in recent acquisition programs. This lack of robustness stems from the fact that immature technologies carry uncertainties in their expected performance, development cost, and schedule that cause variations in system effectiveness and in program development budget and schedule requirements. The Technology Readiness Assessment (TRA) process currently used by acquisition program managers and decision-makers to measure technology uncertainty at critical program decision junctures does not adequately capture the impact of technology performance and development uncertainty on program capability and development metrics. The Technology Readiness Level (TRL) metric employed by the TRA to describe the uncertainties of program technology elements provides only a qualitative, nondescript estimate of those uncertainties. In order to assess program robustness, specifically requirements robustness, against technology performance and development uncertainties, a new process is needed.
This process should provide acquisition program managers and decision-makers with the ability to assess or measure the robustness of program requirements against such uncertainties. A literature review of techniques for forecasting technology performance and development uncertainties, and their subsequent impacts on capability, budget, and schedule requirements, led to the conclusion that an analysis process coupling a probabilistic technique such as Monte Carlo simulation with quantitative, parametric models of technology performance impact and of technology development time and cost requirements would allow the probabilities of meeting specific constraints on these requirements to be established. These probability-of-success metrics can then be used as a quantitative, probabilistic measure of program requirements robustness against technology uncertainties. Combined with a Multi-Objective Genetic Algorithm optimization process and a computer-based Decision Support System, critical information regarding requirements robustness against technology uncertainties can be captured and quantified for acquisition decision-makers. This results in a more informed and justifiable selection of program technologies during initial program definition, as well as in the formulation of program development and risk management strategies. To meet the stated research objective, the ENhanced TEchnology Robustness Prediction and RISk Evaluation (ENTERPRISE) methodology was formulated to provide a structured and transparent process for integrating these enabling techniques into a probabilistic and quantitative assessment of acquisition program requirements robustness against technology performance and development uncertainties.
To demonstrate the capabilities of the ENTERPRISE method and test the research hypotheses, a demonstration application of the method was performed on a notional program for acquiring carrier-based Suppression of Enemy Air Defenses (SEAD) capability using Unmanned Combat Aircraft Systems (UCAS) and their enabling technologies. The results of this implementation provided valuable insights regarding the benefits and inner workings of the methodology, as well as limitations that should be addressed in the future to narrow the gap between the current state and the desired state.
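    The coupling of Monte Carlo simulation with requirement constraints described in this record can be illustrated with a minimal sketch. The distributions, constraint caps, and function name below are illustrative assumptions, not values from the dissertation:

    ```python
    import random

    def prob_requirements_met(n_samples=100_000, cost_cap=950.0, sched_cap=60.0, seed=42):
        """Monte Carlo estimate of the probability that a program's development
        cost ($M) and schedule (months) both stay within their constraints.
        Triangular distributions stand in for immature-technology uncertainty;
        all parameters here are hypothetical, for illustration only."""
        rng = random.Random(seed)
        hits = 0
        for _ in range(n_samples):
            cost = rng.triangular(700.0, 1200.0, 850.0)   # low, high, mode ($M)
            sched = rng.triangular(40.0, 80.0, 55.0)      # low, high, mode (months)
            if cost <= cost_cap and sched <= sched_cap:
                hits += 1
        return hits / n_samples
    ```

    The resulting probability-of-success value is the kind of quantitative robustness metric the record contrasts with the qualitative TRL scale.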

  11. Summary of Pellet Technology Program Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gebhart, III, Gerald E.; Baylor, Larry R.; Bell, Gary L.

    This report summarizes the activities and budget information of ORNL’s pellet technology program from the start of FY2014 through FY2017. Cost summaries are broken down by year and spending category. Milestone activities are outlined and described by year and further described in the project narrative. The project narrative outlines the main pellet injection technology advances enabled by the pellet technology program. A list of published research products is included, along with biographies of personnel involved. This document was prepared in support of the April 24, 2018, review of the pellet technology program at ORNL.

  12. Final Report: Fire Prevention, Detection, and Suppression Project, Exploration Technology Development Program

    NASA Technical Reports Server (NTRS)

    Ruff, Gary A.

    2011-01-01

    The Fire Prevention, Detection, and Suppression (FPDS) project is a technology development effort within the Exploration Technology Development Program of the Exploration Systems Mission Directorate (ESMD) that addresses all aspects of fire safety aboard manned exploration systems. The overarching goal for work in the FPDS area is to develop technologies that will ensure crew health and safety on exploration missions by reducing the likelihood of a fire, or, if one does occur, minimizing the risk to the crew, mission, or system. This is accomplished by addressing the areas of (1) fire prevention and material flammability, (2) fire signatures and detection, and (3) fire suppression and response. This report describes the outcomes of this project from the formation of the Exploration Technology Development Program (ETDP) in October 2005 to September 30, 2010, when the Exploration Technology Development Program was replaced by the Enabling Technology Development and Demonstration Program. NASA's fire safety work will continue under this new program and will build upon the accomplishments described herein.

  13. Making Technology Ready: Integrated Systems Health Management

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Oliver, Patrick J.

    2007-01-01

    This paper identifies work needed by developers to make integrated system health management (ISHM) technology ready and by programs to make mission infrastructure ready for this technology. This paper examines perceptions of ISHM technologies and experience in legacy programs. Study methods included literature review and interviews with representatives of stakeholder groups. Recommendations address 1) development of ISHM technology, 2) development of ISHM engineering processes and methods, and 3) program organization and infrastructure for ISHM technology evolution, infusion and migration.

  14. NASA'S information technology activities for the 90's

    NASA Technical Reports Server (NTRS)

    Holcomb, Lee; Erickson, Dan

    1991-01-01

    The Office of Aeronautics, Exploration and Technology (OAET) is completing an extensive assessment of its nearly five hundred million dollars of proposed space technology development work. The budget is divided into four segments which are as follows: (1) the base research and technology program; (2) the Civil Space Technology Initiative (CSTI); (3) the Exploration Technology Program (ETP); and (4) the High Performance Computing Initiative (HPCI). The programs are briefly discussed in the context of Astrotech 21.

  15. Implementing a Computer/Technology Endorsement in a Classroom Technology Master's Program.

    ERIC Educational Resources Information Center

    Brownell, Gregg; O'Bannon, Blanche; Brownell, Nancy

    In the spring of 1998, the Master's program in Classroom Technology at Bowling Green State University (Ohio) was granted conditional approval to grant, as part of the program, the new State of Ohio Department of Education computer/technology endorsement. This paper briefly describes Ohio's change from certification to licensure, the removal of…

  16. Pathways to Commercial Success. Technologies and Products Supported by the Fuel Cell Technologies Program - 2012

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    none,

    This FY 2012 report updates the results of an effort to identify and characterize commercial and near-commercial (emerging) technologies and products that benefited from the support of the Fuel Cell Technologies Program and its predecessor programs within DOE's Office of Energy Efficiency and Renewable Energy.

  17. Pathways to Commercial Success. Technologies and Products Supported by the Fuel Cell Technologies Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    none,

    This FY 2011 report updates the results of an effort to identify and characterize commercial and near-commercial (emerging) technologies and products that benefited from the support of the Fuel Cell Technologies Program and its predecessor programs within DOE's Office of Energy Efficiency and Renewable Energy.

  18. Improving Outcome Assessment in Information Technology Program Accreditation

    ERIC Educational Resources Information Center

    Goda, Bryan S.; Reynolds, Charles

    2010-01-01

    As of March 2010, there were fourteen Information Technology programs accredited by the Accreditation Board for Engineering and Technology, known as ABET, Inc (ABET Inc. 2009). ABET Inc. is the only recognized institution for the accreditation of engineering, computing, and technology programs in the U.S. There are currently over 128 U.S. schools…

  19. Technology and Literacy: 21st Century Library Programming for Children and Teens

    ERIC Educational Resources Information Center

    Nelson, Jennifer; Braafladt, Keith

    2012-01-01

    Technology may not be a magic wand, but innovative technology programming can genuinely help children become adept at navigating our increasingly wired world while also helping them develop deductive reasoning, math, and other vital literacy skills. One of the simplest and most powerful tools for technology-based public library programming is…

  20. Biomedical Technology. Innovations: The Social Consequences of Science and Technology Program.

    ERIC Educational Resources Information Center

    McInerney, Joseph D.; And Others

    This module is part of an interdisciplinary program designed to educate the general citizenry regarding the issues of science/technology/society that have important consequences for both present and future social policies. Specifically, the program provides an opportunity for students to assess the effects of selected technological innovations in…

  1. A Model for Integrating New Technologies into Pre-Service Teacher Training Programs Ajman University (A Case Study)

    ERIC Educational Resources Information Center

    Shaqour, Ali Zuhdi H.

    2005-01-01

    This study introduces a "Technology Integration Model" for a learning environment utilizing constructivist learning principles and integrating new technologies namely computers and the Internet into pre-service teacher training programs. The technology integrated programs and learning environments may assist learners to gain experiences…

  2. The Impact of an Online Collaborative Learning Program on Students' Attitude towards Technology

    ERIC Educational Resources Information Center

    Magen-Nagar, Noga; Shonfeld, Miri

    2018-01-01

    This quantitative research examined the contribution of an Online Collaborative Learning (OCL) program on attitudes towards technology in terms of technological anxiety, self-confidence and technology orientation among M.Ed. students. The advanced online collaborative program was implemented at two teacher training colleges in Israel for a period…

  3. Environmental restoration and waste management: Robotics technology development program: Robotics 5-year program plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This plan covers robotics Research, Development, Demonstration, Testing, and Evaluation activities in the Program for the next five years. These activities range from bench-scale R&D to full-scale hot demonstrations at DOE sites. This plan outlines applications of existing technology to near-term needs, the development and application of enhanced technology for longer-term needs, and initiation of advanced technology development to meet those needs beyond the five-year plan. The objective of the Robotics Technology Development Program (RTDP) is to develop and apply robotics technologies that will enable Environmental Restoration and Waste Management (ER&WM) operations at DOE sites to be safer, faster, and cheaper. Five priority DOE sites were visited in March 1990 to identify needs for robotics technology in ER&WM operations. This 5-Year Program Plan for the RTDP details annual plans for robotics technology development based on identified needs. In July 1990 a forum was held announcing the robotics program. Over 60 organizations (industrial, university, and federal laboratory) made presentations on their robotics capabilities. To stimulate early interactions with the ER&WM activities at DOE sites, as well as with the robotics community, the RTDP sponsored four technology demonstrations related to ER&WM needs. These demonstrations integrated commercial technology with robotics technology developed by DOE in support of areas such as nuclear reactor maintenance and the civilian reactor waste program. 2 figs.

  4. Marine Propulsion Technology Program Meets the Demand

    ERIC Educational Resources Information Center

    Fowler, Howard G.

    1974-01-01

    The marine technology program cluster at Florida Keys Community College is described. Technicians are trained to maintain and repair engines and selected marine accessories through a marine propulsion technology curriculum (certificate program and associate in science degree). (EA)

  5. Educational Leadership.

    ERIC Educational Resources Information Center

    Tollett, John R., Ed.

    This document contains the following papers on educational leadership programs and technology: (1) "Technology Standards for School Administrators: Implications for Administrator Preparation Programs" (Warren C. Hope, Bernadette Kelley, and Janet A. Guyden); (2) "Information Technology and the Transformation of Leadership Preparation Programs: A…

  6. 75 FR 10464 - Broadband Technology Opportunities Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-08

    ... DEPARTMENT OF COMMERCE National Telecommunications and Information Administration [Docket Number 0907141137-0119-08] RIN 0660-ZA28 Broadband Technology Opportunities Program AGENCY: National... Infrastructure (CCI) projects under the Broadband Technology Opportunities Program (BTOP) is extended until 5:00...

  7. 75 FR 14131 - Broadband Technology Opportunities Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-24

    ... DEPARTMENT OF COMMERCE National Telecommunications and Information Administration [Docket Number: 0907141137-0154-09] RIN 0660-ZA28 Broadband Technology Opportunities Program AGENCY: National... under the Broadband Technology Opportunities Program (BTOP) is extended until 10 p.m. Eastern Daylight...

  8. Implementing Technology with Industrial Community: The SBIR Example

    NASA Technical Reports Server (NTRS)

    Ghuman, Parminder

    2005-01-01

    The Earth-Sun System Technology Office (ESTO) works with the Small Business Innovation Research (SBIR) and Small Business Technology Transfer (STTR) programs to supplement its own technology development program. The SBIR/STTR program is a highly competitive program that encourages small businesses to explore their technological potential to fulfill technology needs identified by ESTO. The SBIR program has three phases. Phase 1 contracts last for 6 months with a maximum funding of $70,000, and Phase 2 contracts last for 24 months with a maximum funding of $600,000. For Phase 3, the small business must find funding in the private sector or from other non-SBIR federal agency sources. During this phase, ESTO evaluates Phase 2 graduates, selects those that should be further developed for airborne or spaceflight demonstration, and provides funding. This paper discusses all three phases and the role of ESTO in this program.

  9. NASA Goddard Thermal Technology Overview 2017

    NASA Technical Reports Server (NTRS)

    Butler, Dan; Swanson, Ted

    2017-01-01

    This presentation summarizes the current plans and efforts at NASA Goddard to develop new thermal control technology for anticipated future missions. It will also address some of the programmatic developments currently underway at NASA, especially with respect to the NASA Technology Development Program. The effects of the recently enacted FY 17 NASA budget, which includes a sizeable increase, will also be addressed. While funding for basic technology development is still tight, significant efforts are being made in direct support of flight programs. Thermal technology implementation on current flight programs will be reviewed, and the recent push for CubeSat mission development will also be addressed. Many of these technologies also have broad applicability to DOD (Dept. of Defense), DOE (Dept. of Energy), and commercial programs. Partnerships have been developed with the Air Force, Navy, and various universities to promote technology development. In addition, technology development activities supported by the internal research and development (IRAD) program and the Small Business Innovation Research (SBIR) program are reviewed in this presentation. Specific technologies addressed include: two-phase system applications and issues on NASA missions, the latest developments in electro-hydrodynamically pumped systems, Atomic Layer Deposition (ALD), micro-scale heat transfer, and various other research activities.

  10. NASA Goddard Thermal Technology Overview 2016

    NASA Technical Reports Server (NTRS)

    Butler, Dan; Swanson, Ted

    2016-01-01

    This presentation summarizes the current plans and efforts at NASA Goddard to develop new thermal control technology for anticipated future missions. It will also address some of the programmatic developments currently underway at NASA, especially with respect to the NASA Technology Development Program. The effects of the recently enacted FY 16 NASA budget, which includes a sizeable increase, will also be addressed. While funding for basic technology development is still tight, significant efforts are being made in direct support of flight programs. Thermal technology implementation on current flight programs will be reviewed, and the recent push for CubeSat mission development will also be addressed. Many of these technologies also have broad applicability to DOD, DOE, and commercial programs. Partnerships have been developed with the Air Force, Navy, and various universities to promote technology development. In addition, technology development activities supported by the internal research and development (IRAD) program and the Small Business Innovation Research (SBIR) program are reviewed in this presentation. Specific technologies addressed include: two-phase system applications and issues on NASA missions, the latest developments in electro-hydrodynamically pumped systems, Atomic Layer Deposition (ALD), micro-scale heat transfer, and various other research activities.

  11. NASA's Commercial Communication Technology Program

    NASA Technical Reports Server (NTRS)

    Bagwell, James W.

    1998-01-01

    Various issues associated with "NASA's Commercial Communication Technology Program" are presented in viewgraph form. Specific topics include: 1) Coordination/Integration of government program; 2) Achievement of seamless interoperable satellite and terrestrial networks; 3) Establishment of program to enhance Satcom professional and technical workforce; 4) Precompetitive technology development; and 5) Effective utilization of spectrum and orbit assets.

  12. Development of a Curriculum in Laser Technology. Final Report.

    ERIC Educational Resources Information Center

    Wasserman, William J.

    A Seattle Central Community College project visited existing programs, surveyed need, and developed a curriculum for a future program in Laser-Electro-Optics (LEO) Technology. To establish contacts and view successful programs, project staff made visits to LEO technology programs at San Jose City College and Texas State Technical Institute, Center…

  13. Report on High Technology Programs in Illinois Public Community Colleges.

    ERIC Educational Resources Information Center

    Illinois Community Coll. Board, Springfield.

    Survey results are presented from a study of the steps being taken by the 52 Illinois public community colleges to develop and provide programs in high technology fields. First, high technology programs are defined as those occupational programs that educate and train individuals to operate, maintain, and/or repair micro-electronic or computerized…

  14. Parental Decision Making about Technology and Quality in Child Care Programs

    ERIC Educational Resources Information Center

    Rose, Katherine K.; Vittrup, Brigitte; Leveridge, Tinney

    2013-01-01

    Background: This study investigated parental decision making about non-parental child care programs based on the technological and quality components of the program, both child-focused and parent-focused. Child-focused variables related to children's access to technology such as computers, educational television programming, and the internet.…

  15. 34 CFR 403.30 - What documents must a State submit to receive a grant?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) OFFICE OF VOCATIONAL AND ADULT EDUCATION, DEPARTMENT OF EDUCATION STATE VOCATIONAL AND APPLIED TECHNOLOGY... Technology Education Program on the basis of program years that coincide with program years under section 104... Technology Education Program. (Approved by the Office of Management and Budget under Control No. 1830-0029...

  16. Vehicle Technologies and Fuel Cell Technologies Program: Prospective Benefits Assessment Report for Fiscal Year 2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stephens, T. S.; Taylor, C. H.; Moore, J. S.

Under a diverse set of programs, the Vehicle Technologies and Fuel Cell Technologies offices of DOE’s Office of Energy Efficiency and Renewable Energy invest in research, development, demonstration, and deployment of advanced vehicle, hydrogen production, delivery and storage, and fuel cell technologies. This report estimates the benefits of successfully developing and deploying these technologies (a “Program Success” case) relative to a base case (the “No Program” case). The Program Success case represents the future with completely successful deployment of Vehicle Technologies Office (VTO) and Fuel Cell Technologies Office (FCTO) technologies. The No Program case represents a future in which there is no contribution after FY 2016 by the VTO or FCTO to these technologies. The benefits of advanced vehicle, hydrogen production, delivery and storage, and fuel cell technologies were estimated on the basis of differences in fuel use, primary energy use, and greenhouse gas (GHG) emissions from light-, medium- and heavy-duty vehicles, including energy and emissions from fuel production, between the base case and the Program Success case. Improvements in fuel economy of various vehicle types, growth in the stock of fuel cell vehicles and other advanced technology vehicles, and decreased GHG intensity of hydrogen production and delivery in the Program Success case over the No Program case were projected to result in savings in petroleum use and GHG emissions. Benefits were disaggregated by individual program technology areas, which included the FCTO program and the VTO subprograms of batteries and electric drives; advanced combustion engines; fuels and lubricants; materials (for reduction in vehicle mass, or “lightweighting”); and, for medium- and heavy-duty vehicles, reduction in rolling and aerodynamic resistance.
Projections for the Program Success case indicate that by 2035, the average fuel economy of on-road, light-duty vehicle stock could be 47% to 76% higher than in the No Program case. The average fuel economy of on-road medium- and heavy-duty vehicle stock could be as much as 39% higher. The resulting petroleum savings in 2035 were estimated to be as high as 3.1 million barrels per day, and reductions in GHG emissions were estimated to be as high as 500 million metric tons of CO2 equivalent per year. The benefits of continuing to invest government resources in advanced vehicle and fuel cell technologies would have significant economic value in the U.S. transportation sector and reduce its dependency on oil and its vulnerability to oil price shocks.

  17. Mass Balance, Beneficial Use Products, and Cost Comparisons of Four Sediment Treatment Technologies Near Commercialization

    DTIC Science & Technology

    2011-03-01

    Superfund Innovative Technology Evaluation Program (SITE) • Assessment and Remediation of Contaminated Sediment Program (ARCS) • Contaminated Sediment...Agency (USEPA). 1994. Assessment and Remediation of Contaminated Sediments (ARCS) Program Remediation Guidance Document. EPA/905/R-94/003. Chicago, IL...conducted under one of the four following programs: • Superfund Innovative Technology Evaluation (SITE) Program • Assessment and Remediation of

  18. Funding and Strategic Alignment Guidance for Infusing Small Business Innovation Research Technology into NASA Programs Associated with the Aeronautics Research Mission Directorate

    NASA Technical Reports Server (NTRS)

    Nguyen, Hung D.; Steele, Gynelle C.

    2015-01-01

    This report is intended to help NASA program and project managers incorporate Small Business Innovation Research/Small Business Technology Transfer (SBIR/STTR) technologies that have gone through Phase II of the SBIR program into NASA Aeronautics and Mission Directorate (ARMD) programs. Other Government and commercial program managers can also find this information useful.

  19. DOE Solar Energy Technologies Program FY 2005 Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

The DOE Solar Energy Technologies Program FY 2005 Annual Report chronicles the R&D results of the U.S. Department of Energy Solar Energy Technologies Program for Fiscal Year 2005. In particular, the report describes R&D performed by the Program's national laboratories (National Renewable Energy Laboratory, Sandia National Laboratories, Oak Ridge National Laboratory, and Brookhaven National Laboratory) and university and industry partners.

  20. Innovative quantum technologies for microgravity fundamental physics and biological research

    NASA Technical Reports Server (NTRS)

    Kierk, I.; Israelsson, U.; Lee, M.

    2001-01-01

    This paper presents a new technology program, within the fundamental physics research program, focusing on four quantum technology areas: quantum atomics, quantum optics, space superconductivity and quantum sensor technology, and quantum fluid based sensor and modeling technology.

  1. 45 CFR 170.490 - Sunset of the temporary certification program.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ....490 Section 170.490 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES HEALTH INFORMATION TECHNOLOGY HEALTH INFORMATION TECHNOLOGY STANDARDS, IMPLEMENTATION SPECIFICATIONS, AND CERTIFICATION CRITERIA AND CERTIFICATION PROGRAMS FOR HEALTH INFORMATION TECHNOLOGY Temporary Certification Program for HIT...

  2. 45 CFR 170.490 - Sunset of the temporary certification program.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ....490 Section 170.490 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES HEALTH INFORMATION TECHNOLOGY HEALTH INFORMATION TECHNOLOGY STANDARDS, IMPLEMENTATION SPECIFICATIONS, AND CERTIFICATION CRITERIA AND CERTIFICATION PROGRAMS FOR HEALTH INFORMATION TECHNOLOGY Temporary Certification Program for HIT...

  3. 45 CFR 170.490 - Sunset of the temporary certification program.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ....490 Section 170.490 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES HEALTH INFORMATION TECHNOLOGY HEALTH INFORMATION TECHNOLOGY STANDARDS, IMPLEMENTATION SPECIFICATIONS, AND CERTIFICATION CRITERIA AND CERTIFICATION PROGRAMS FOR HEALTH INFORMATION TECHNOLOGY Temporary Certification Program for HIT...

  4. 45 CFR 170.490 - Sunset of the temporary certification program.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ....490 Section 170.490 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES HEALTH INFORMATION TECHNOLOGY HEALTH INFORMATION TECHNOLOGY STANDARDS, IMPLEMENTATION SPECIFICATIONS, AND CERTIFICATION CRITERIA AND CERTIFICATION PROGRAMS FOR HEALTH INFORMATION TECHNOLOGY Temporary Certification Program for HIT...

  5. 45 CFR 170.490 - Sunset of the temporary certification program.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ....490 Section 170.490 Public Welfare Department of Health and Human Services HEALTH INFORMATION TECHNOLOGY HEALTH INFORMATION TECHNOLOGY STANDARDS, IMPLEMENTATION SPECIFICATIONS, AND CERTIFICATION CRITERIA AND CERTIFICATION PROGRAMS FOR HEALTH INFORMATION TECHNOLOGY Temporary Certification Program for HIT...

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM CASE STUDIES: DEMONSTRATING PROGRAM OUTCOMES

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This bookle...

  7. Sensors 2000! Program: Advanced Biosensor and Measurement Systems Technologies for Spaceflight Research and Concurrent, Earth-Based Applications

    NASA Technical Reports Server (NTRS)

    Hines, J.

    1999-01-01

Sensors 2000! (S2K!) is a specialized, integrated projects team organized to provide focused, directed, advanced biosensor and bioinstrumentation systems technology support to NASA's spaceflight and ground-based research and development programs. Specific technology thrusts include telemetry-based sensor systems, chemical/biological sensors, medical and physiological sensors, miniaturized instrumentation architectures, and data and signal processing systems. A concurrent objective is to promote the mutual use, application, and transition of developed technology by collaborating in academic-commercial-government leveraging, joint research, technology utilization and commercialization, and strategic partnering alliances. Sensors 2000! is organized around three primary program elements: Technology and Product Development, Technology Infusion and Applications, and Collaborative Activities. Technology and Product Development involves development and demonstration of biosensor and biotelemetry systems for application to NASA Space Life Sciences Programs; production of fully certified spaceflight hardware and payload elements; and sensor/measurement systems development for NASA research and development activities. Technology Infusion and Applications provides technology and program agent support to identify available and applicable technologies from multiple sources for insertion into NASA's strategic enterprises and initiatives. Collaborative Activities involve leveraging of NASA technologies with those of other government agencies, academia, and industry to concurrently provide technology solutions and products of mutual benefit to participating members.

  8. Spinoff 1998

    NASA Technical Reports Server (NTRS)

    1998-01-01

In 1958, a Congressional Mandate directed the National Aeronautics and Space Administration to ensure the widest possible dissemination of its research and development results. Thus, the Scientific and Technical Information (STI) Program was born. While this program addressed mostly the timely dissemination of information to NASA, NASA contractors, other government agencies, and the public, technologies were identified that were clearly transferable and applicable to industry for additional use in the development of commercial products and services. Such considerations spun off the Technology Utilization Program. The very successful program went through several name changes and is today called the NASA Commercial Technology Program. The changes that have occurred over time are not only name changes, but program changes that have dramatically altered the philosophy, mission, and goal of the program. It has been identified that a more intense and proactive outreach effort within the program is necessary in order to make the newest and latest technologies available to industry now-at the time the technology is actually developed. The NASA Commercial Technology Network (NCTN), its interaction with industry at all levels through a large network of organizations and offices, is contributing to the success of small, medium, and large U.S. businesses to remain globally competitive. At the same time, new products and services derived from the transfer and application of NASA technology benefit everyone. This publication includes the following: Aerospace research and development - NASA headquarters and centers. Technology transfer and commercialization. Commercial benefits - spinoffs. NASA success and education. NASA commercial technology network.

  9. Rover and Telerobotics Technology Program

    NASA Technical Reports Server (NTRS)

    Weisbin, Charles R.

    1998-01-01

The Jet Propulsion Laboratory's (JPL's) Rover and Telerobotics Technology Program, sponsored by the National Aeronautics and Space Administration (NASA), responds to opportunities presented by NASA space missions and systems, and seeds commercial applications of the emerging robotics technology. The scope of the JPL Rover and Telerobotics Technology Program comprises three major segments of activity: NASA robotic systems for planetary exploration, robotic technology and terrestrial spin-offs, and technology for non-NASA sponsors. Significant technical achievements have been reached in each of these areas, including complete telerobotic system prototypes that have been built and tested in realistic scenarios relevant to prospective users. In addition, the program has conducted complementary basic research and created innovative technology and terrestrial applications, as well as enabled a variety of commercial spin-offs.

  10. The NASA controls-structures interaction technology program

    NASA Technical Reports Server (NTRS)

    Newsom, Jerry R.; Layman, W. E.; Waites, H. B.; Hayduk, R. J.

    1990-01-01

    The interaction between a flexible spacecraft structure and its control system is commonly referred to as controls-structures interaction (CSI). The CSI technology program is developing the capability and confidence to integrate the structure and control system, so as to avoid interactions that cause problems and to exploit interactions to increase spacecraft capability. A NASA program has been initiated to advance CSI technology to a point where it can be used in spacecraft design for future missions. The CSI technology program is a multicenter program utilizing the resources of the NASA Langley Research Center (LaRC), the NASA Marshall Space Flight Center (MSFC), and the NASA Jet Propulsion Laboratory (JPL). The purpose is to describe the current activities, results to date, and future activities of the NASA CSI technology program.

  11. Energy Efficiency and Renewable Energy Program. Bibliography, 1993 edition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vaughan, K.H.

    1993-06-01

The Bibliography contains listings of publicly available reports, journal articles, and published conference papers sponsored by the DOE Office of Energy Efficiency and Renewable Energy and published between 1987 and mid-1993. The topics of the Bibliography include: analysis and evaluation; building equipment research; building thermal envelope systems and materials; district heating; residential and commercial conservation program; weatherization assistance program; existing buildings research program; ceramic technology project; alternative fuels and propulsion technology; microemulsion fuels; industrial chemical heat pumps; materials for advanced industrial heat exchangers; advanced industrial materials; tribology; energy-related inventions program; electric energy systems; superconducting technology program for electric energy systems; thermal energy storage; biofuels feedstock development; biotechnology; continuous chromatography in multicomponent separations; sensors for electrolytic cells; hydropower environmental mitigation; environmental control technology; continuous fiber ceramic composite technology.

  12. 76 FR 1261 - Establishment of the Permanent Certification Program for Health Information Technology

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-07

    ...This final rule establishes a permanent certification program for the purpose of certifying health information technology (HIT). This final rule is issued pursuant to the authority granted to the National Coordinator for Health Information Technology (the National Coordinator) by section 3001(c)(5) of the Public Health Service Act (PHSA), as added by the Health Information Technology for Economic and Clinical Health (HITECH) Act. The permanent certification program will eventually replace the temporary certification program that was previously established by a final rule. The National Coordinator will use the permanent certification program to authorize organizations to certify electronic health record (EHR) technology, such as Complete EHRs and/or EHR Modules. The permanent certification program could also be expanded to include the certification of other types of HIT.

  13. Innovative Partnerships Program Accomplishments: 2009-2010 at NASA's Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Makufka, David

    2010-01-01

    This document reports on the accomplishments of the Innovative Partnerships Program during the two years of 2009 and 2010. The mission of the Innovative Partnerships Program is to provide leveraged technology alternatives for mission directorates, programs, and projects through joint partnerships with industry, academia, government agencies, and national laboratories. As outlined in this accomplishments summary, the IPP at NASA's Kennedy Space Center achieves this mission via two interdependent goals: (1) Infusion: Bringing external technologies and expertise into Kennedy to benefit NASA missions, programs, and projects (2) Technology Transfer: Spinning out space program technologies to increase the benefits for the nation's economy and humanity

  14. Space Station Engineering and Technology Development. Proceedings of the Panel on Program Performance and Onboard Mission Control

    NASA Technical Reports Server (NTRS)

    1985-01-01

    An ad-hoc committee was asked to review the following questions relevant to the space station program: (1) onboard maintainability and repair; (2) in-space research and technology program and facility plans; (3) solar thermodynamic research and technology development program planning; (4) program performance (cost estimating, management, and cost avoidance); (5) onboard versus ground-based mission control; and (6) technology development road maps from IOC to the growth station. The objective of these new assignments is to provide NASA with advice on ways and means for improving the content, performance, and/or effectiveness of these elements of the space station program.

  15. Evaluation of spacecraft technology programs (effects on communication satellite business ventures), volume 1

    NASA Technical Reports Server (NTRS)

    Greenburg, J. S.; Gaelick, C.; Kaplan, M.; Fishman, J.; Hopkins, C.

    1985-01-01

Commercial organizations as well as government agencies invest in spacecraft (S/C) technology programs that are aimed at increasing the performance of communications satellites. The value of these programs must be measured in terms of their impacts on the financial performance of the business ventures that may ultimately utilize the communications satellites. An economic evaluation and planning capability was developed and used to assess the impact of NASA on-orbit propulsion and space power programs on typical fixed satellite service (FSS) and direct broadcast service (DBS) communications satellite business ventures. Typical FSS and DBS spin and three-axis stabilized spacecraft were configured in the absence of NASA technology programs. These spacecraft were reconfigured taking into account the anticipated results of NASA-specified on-orbit propulsion and space power programs. In general, the NASA technology programs resulted in spacecraft with increased capability. The developed methodology for assessing the value of spacecraft technology programs in terms of their impact on the financial performance of communication satellite business ventures is described. Results of the assessment of NASA-specified on-orbit propulsion and space power technology programs are presented for typical FSS and DBS business ventures.

  16. Evaluation of spacecraft technology programs (effects on communication satellite business ventures), volume 1

    NASA Astrophysics Data System (ADS)

    Greenburg, J. S.; Gaelick, C.; Kaplan, M.; Fishman, J.; Hopkins, C.

    1985-09-01

Commercial organizations as well as government agencies invest in spacecraft (S/C) technology programs that are aimed at increasing the performance of communications satellites. The value of these programs must be measured in terms of their impacts on the financial performance of the business ventures that may ultimately utilize the communications satellites. An economic evaluation and planning capability was developed and used to assess the impact of NASA on-orbit propulsion and space power programs on typical fixed satellite service (FSS) and direct broadcast service (DBS) communications satellite business ventures. Typical FSS and DBS spin and three-axis stabilized spacecraft were configured in the absence of NASA technology programs. These spacecraft were reconfigured taking into account the anticipated results of NASA-specified on-orbit propulsion and space power programs. In general, the NASA technology programs resulted in spacecraft with increased capability. The developed methodology for assessing the value of spacecraft technology programs in terms of their impact on the financial performance of communication satellite business ventures is described. Results of the assessment of NASA-specified on-orbit propulsion and space power technology programs are presented for typical FSS and DBS business ventures.

  17. On-line Machine Learning and Event Detection in Petascale Data Streams

    NASA Astrophysics Data System (ADS)

    Thompson, David R.; Wagstaff, K. L.

    2012-01-01

Traditional statistical data mining involves off-line analysis in which all data are available and equally accessible. However, petascale datasets have challenged this premise since it is often impossible to store, let alone analyze, the relevant observations. This has led the machine learning community to investigate adaptive processing chains where data mining is a continuous process. Here pattern recognition permits triage and follow-up decisions at multiple stages of a processing pipeline. Such techniques can also benefit new astronomical instruments such as the Large Synoptic Survey Telescope (LSST) and Square Kilometre Array (SKA) that will generate petascale data volumes. We summarize some machine learning perspectives on real-time data mining, with representative cases of astronomical applications and event detection in high-volume data streams. The first is a "supervised classification" approach currently used for transient event detection at the Very Long Baseline Array (VLBA). It injects known signals of interest - faint single-pulse anomalies - and tunes system parameters to recover these events. This permits meaningful event detection for diverse instrument configurations and observing conditions whose noise cannot be well-characterized in advance. Second, "semi-supervised novelty detection" finds novel events based on statistical deviations from previous patterns. It detects outlier signals of interest while considering known examples of false alarm interference. Applied to data from the Parkes pulsar survey, the approach identifies anomalous "peryton" phenomena that do not match previous event models. Finally, we consider online light curve classification that can trigger adaptive follow-up measurements of candidate events. Classifier performance analyses suggest optimal survey strategies, and permit principled follow-up decisions from incomplete data.
These examples trace a broad range of algorithm possibilities available for online astronomical data mining. This talk describes research performed at the Jet Propulsion Laboratory, California Institute of Technology. Copyright 2012, All Rights Reserved. U.S. Government support acknowledged.
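The "semi-supervised novelty detection" idea summarized in this abstract can be sketched as a statistical-deviation test against a background model, with candidates that resemble known false-alarm examples screened out. This is a hypothetical illustration of the general technique only, not the actual VLBA or Parkes pipeline; the function name `novelty_scores`, the feature layout, and the threshold `k` are assumptions for the sketch.

```python
import numpy as np

def novelty_scores(events, background, false_alarms, k=3.0):
    """Score each event by its maximum per-feature z-score against a
    background model; flag as novel only events that are both far from
    the background AND far from every known false-alarm example."""
    mu = background.mean(axis=0)
    sigma = background.std(axis=0) + 1e-9          # guard against zero variance
    z = np.abs((events - mu) / sigma).max(axis=1)  # deviation from background model
    # Euclidean distance from each event to its nearest known false-alarm example
    d = np.linalg.norm(events[:, None, :] - false_alarms[None, :, :], axis=2).min(axis=1)
    is_novel = (z > k) & (d > k * sigma.mean())
    return z, is_novel

# Toy data: 2-feature events against well-characterized Gaussian noise
rng = np.random.default_rng(0)
background = rng.normal(0.0, 1.0, size=(500, 2))
events = np.array([[0.1, -0.2],    # ordinary event, consistent with noise
                   [8.0, 8.0],     # genuine anomaly
                   [20.0, 20.0]])  # matches a known interference pattern
false_alarms = np.array([[20.0, 20.0]])
z, is_novel = novelty_scores(events, background, false_alarms)
```

The key design point is the two-sided test: a large deviation alone is not enough, because known interference (the "peryton"-like case above) also deviates strongly; only deviations that fail to match any false-alarm template are reported as novel.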

  18. General Aviation Propulsion

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Programs exploring and demonstrating new technologies in general aviation propulsion are considered. These programs are the quiet, clean, general aviation turbofan (QCGAT) program; the general aviation turbine engine (GATE) study program; the general aviation propeller technology program; and the advanced rotary, diesel, and reciprocating engine programs.

  19. Human Reproduction: Social and Technological Aspects. Innovations: The Social Consequences of Science and Technology Program.

    ERIC Educational Resources Information Center

    McConnell, Mary C.; And Others

    This module is part of an interdisciplinary program designed to educate the general citizenry regarding the issues of science/technology/society that have important consequences for both present and future social policies. Specifically, the program provides an opportunity for students to assess the effects of selected technological innovations in…

  20. One-to-One Laptop Teacher Education: Does Involvement Affect Candidate Technology Skills and Dispositions?

    ERIC Educational Resources Information Center

    Donovan, Loretta; Green, Tim; Hansen, Laurie E.

    2012-01-01

    This study compares teacher candidates' initial and changed beliefs, dispositions, and uses of technology in two credential program models: a one-to-one laptop program with ubiquitous technology use and a traditional credential program in which students are expected to have specific technology experiences and requirements in each course (a model…

  1. High area rate reconnaissance (HARR) and mine reconnaissance/hunter (MR/H) exploratory development programs

    NASA Astrophysics Data System (ADS)

    Lathrop, John D.

    1995-06-01

    This paper describes the sea mine countermeasures developmental context, technology goals, and progress to date of the two principal Office of Naval Research exploratory development programs addressing sea mine reconnaissance and minehunting technology development. The first of these programs, High Area Rate Reconnaissance, is developing toroidal volume search sonar technology, sidelooking sonar technology, and associated signal processing technologies (motion compensation, beamforming, and computer-aided detection and classification) for reconnaissance and hunting against volume mines and proud bottom mines from 21-inch diameter vehicles operating in deeper waters. The second of these programs, Amphibious Operation Area Mine Reconnaissance/Hunter, is developing a suite of sensor technologies (synthetic aperture sonar, ahead-looking sonar, superconducting magnetic field gradiometer, and electro-optic sensor) and associated signal processing technologies for reconnaissance and hunting against all mine types (including buried mines) in shallow water and very shallow water from 21-inch diameter vehicles. The technologies under development by these two programs must provide excellent capabilities for mine detection, mine classification, and discrimination against false targets.

  2. Integrating the Department of Defense Military Services’ Technology Development Programs to Improve Time, Cost, and Technical Quality Parameters

    DTIC Science & Technology

    2007-03-01

the DoD in general and across the SR, DD(X), and FCS programs in particular. The findings of this study show that through careful planning and coordinated technology transition, DoD acquisition programs can indeed leverage the technology development efforts of the three

  3. NASA's Physics of the Cosmos and Cosmic Origins technology development programs

    NASA Astrophysics Data System (ADS)

    Clampin, Mark; Pham, Thai

    2014-07-01

    NASA's Physics of the Cosmos (PCOS) and Cosmic Origins (COR) Program Offices, established in 2011, reside at the NASA Goddard Space Flight Center (GSFC). The offices serve as the implementation arm for the Astrophysics Division at NASA Headquarters. We present an overview of the programs' technology development activities and technology investment portfolio, funded by NASA's Strategic Astrophysics Technology (SAT) program. We currently fund 19 technology advancements to enable future PCOS and COR missions to help answer the questions "How did our universe begin and evolve?" and "How did galaxies, stars, and planets come to be?" We discuss the process for addressing community-provided technology gaps and Technology Management Board (TMB)-vetted prioritization and investment recommendations that inform the SAT program. The process improves the transparency and relevance of our technology investments, provides the community a voice in the process, and promotes targeted external technology investments by defining needs and identifying customers. The programs' goal is to promote and support technology development needed to enable missions envisioned by the National Research Council's (NRC) "New Worlds, New Horizons in Astronomy and Astrophysics" (NWNH) Decadal Survey report [1] and the Astrophysics Implementation Plan (AIP) [2]. These include technology development for dark energy, gravitational waves, X-ray and inflation probe science, and a 4m-class UV/optical telescope to conduct imaging and spectroscopy studies, as a post-Hubble observatory with significantly improved sensitivity and capability.

  4. NASA's Physics of the Cosmos and Cosmic Origins Technology Development Programs

    NASA Technical Reports Server (NTRS)

    Clampin, Mark; Pham, Thai

    2014-01-01

    NASA's Physics of the Cosmos (PCOS) and Cosmic Origins (COR) Program Offices, established in 2011, reside at the NASA Goddard Space Flight Center (GSFC). The offices serve as the implementation arm for the Astrophysics Division at NASA Headquarters. We present an overview of the programs' technology development activities and technology investment portfolio, funded by NASA's Strategic Astrophysics Technology (SAT) program. We currently fund 19 technology advancements to enable future PCOS and COR missions to help answer the questions "How did our universe begin and evolve?" and "How did galaxies, stars, and planets come to be?" We discuss the process for addressing community-provided technology gaps and Technology Management Board (TMB)-vetted prioritization and investment recommendations that inform the SAT program. The process improves the transparency and relevance of our technology investments, provides the community a voice in the process, and promotes targeted external technology investments by defining needs and identifying customers. The programs' goal is to promote and support technology development needed to enable missions envisioned by the National Research Council's (NRC) "New Worlds, New Horizons in Astronomy and Astrophysics" (NWNH) Decadal Survey report [1] and the Astrophysics Implementation Plan (AIP) [2]. These include technology development for dark energy, gravitational waves, X-ray and inflation probe science, and a 4m-class UV/optical telescope to conduct imaging and spectroscopy studies, as a post-Hubble observatory with significantly improved sensitivity and capability.

  5. NASA's Physics of the Cosmos and Cosmic Origins Technology Development Programs

    NASA Technical Reports Server (NTRS)

    Pham, Thai; Seery, Bernard; Ganel, Opher

    2016-01-01

    The strategic astrophysics missions of the coming decades will help answer the questions "How did our universe begin and evolve?" and "How did galaxies, stars, and planets come to be?" Enabling these missions requires advances in key technologies far beyond the current state of the art. NASA's Physics of the Cosmos (PCOS) and Cosmic Origins (COR) Program Offices manage technology maturation projects funded through the Strategic Astrophysics Technology (SAT) program to accomplish such advances. The PCOS and COR Program Offices, residing at the NASA Goddard Space Flight Center (GSFC), were established in 2011, and serve as the implementation arm for the Astrophysics Division at NASA Headquarters. We present an overview of the Programs' technology development activities and the current technology investment portfolio of 23 technology advancements. We discuss the process for addressing community-provided technology gaps and Technology Management Board (TMB)-vetted prioritization and investment recommendations that inform the SAT program. The process improves the transparency and relevance of our technology investments, provides the community a voice in the process, and promotes targeted external technology investments by defining needs and identifying customers. The Programs' priorities are driven by strategic direction from the Astrophysics Division, which is informed by the National Research Council's (NRC) "New Worlds, New Horizons in Astronomy and Astrophysics" (NWNH) 2010 Decadal Survey report [1], the Astrophysics Implementation Plan (AIP) [2] as updated, and the Astrophysics Roadmap "Enduring Quests, Daring Visions" [3]. These priorities include technology development for missions to study dark energy, gravitational waves, X-ray and inflation probe science, and large far-infrared (IR) and ultraviolet (UV)/optical/IR telescopes to conduct imaging and spectroscopy studies. 
The SAT program is the Astrophysics Division's main investment method to mature technologies that will be identified by study teams set up to inform the 2020 Decadal Survey process on several large astrophysics mission concepts.

  6. 76 FR 22673 - Technology Innovation Program Advisory Board

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-22

    ... DEPARTMENT OF COMMERCE National Institute of Standards and Technology Technology Innovation Program Advisory Board AGENCY: National Institute of Standards and Technology, Department of Commerce... Standards and Technology (NIST) published a notice in the Federal Register announcing an open meeting for...

  7. Fossil Energy Program Annual Progress Report for the Period April 1, 2000 through March 31, 2001

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Judkins, RR

    This report covers progress made at Oak Ridge National Laboratory (ORNL) on research and development projects that contribute to the advancement of fossil energy technologies. Projects on the ORNL Fossil Energy Program are supported by the U.S. Department of Energy (DOE) Office of Fossil Energy, the DOE National Energy Technology Laboratory (NETL), the DOE Fossil Energy Clean Coal Technology (CCT) Program, the DOE National Petroleum Technology Office, and the DOE Fossil Energy Office of Strategic Petroleum Reserve (SPR). The ORNL Fossil Energy Program research and development activities cover the areas of coal, clean coal technology, gas, petroleum, and support to the SPR. An important part of the Fossil Energy Program is technical management of all activities on the DOE Fossil Energy Advanced Research (AR) Materials Program. The AR Materials Program involves research at other DOE and government laboratories, at universities, and at industrial organizations.

  8. Energy and technology review

    NASA Astrophysics Data System (ADS)

    Johnson, K. C.

    1991-04-01

    This issue of Energy and Technology Review discusses the various educational programs in which Lawrence Livermore National Laboratory (LLNL) participates or sponsors. LLNL has a long history of fostering educational programs for students from kindergarten through graduate school. A goal is to enhance the teaching of science, mathematics, and technology and thereby assist educational institutions to increase the pool of scientists, engineers, and technicians. LLNL programs described include: (1) contributions to the improvement of U.S. science education; (2) the LESSON program; (3) collaborations with Bay Area Science and Technology Education; (4) project HOPES; (5) lasers and fusion energy education; (6) a curriculum on global climate change; (7) computer and technology instruction at LLNL's Science Education Center; (8) the National Education Supercomputer Program; (9) project STAR; (10) the American Indian Program; (11) LLNL programs with historically Black colleges and Universities; (12) the Undergraduate Summer Institute on Contemporary Topics in Applied Science; (13) the National Physical Science Consortium: A Fellowship Program for Minorities and Women; (14) LLNL's participation with AWU; (15) the apprenticeship programs at LLNL; and (16) the future of LLNL's educational programs. An appendix lists all of LLNL's educational programs and activities. Contacts and their respective telephone numbers are given for all these programs and activities.

  9. SUPERFUND INNOVATIVE TECHNOLOGY EVALUATION PROGRAM: INNOVATION MAKING A DIFFERENCE

    EPA Science Inventory

    The Superfund Innovative Technology Evaluation (SITE) Program encourages commercialization of innovative technologies for characterizing and remediating hazardous waste site contamination through four components: Demonstration, Emerging Technology, and Monitoring & Measurement Pr...

  10. Technology transfer from the viewpoint of a NASA prime contractor

    NASA Technical Reports Server (NTRS)

    Dyer, Gordon

    1992-01-01

    Viewgraphs on technology transfer from the viewpoint of a NASA prime contractor are provided. Technology Transfer Program for Manned Space Systems and the Technology Transfer Program status are addressed.

  11. Environmental Technology Verification (ETV) Quality Program (Poster)

    EPA Science Inventory

    This is a poster created for the ETV Quality Program. The EPA Environmental Technology Verification Program (ETV) develops test protocols and verifies the performance of innovative technologies that have the potential to improve protection of human health and the environment. The...

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION AND INDOOR AIR

    EPA Science Inventory

    The paper discusses environmental technology verification and indoor air. RTI has responsibility for a pilot program for indoor air products as part of the U.S. EPA's Environmental Technology Verification (ETV) program. The program objective is to further the development of sel...

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM CASE STUDIES: DEMONSTRATING PROGRAM OUTCOMES, VOLUME II

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This bookle...

  14. Environmental Technology Verification Program - ETV - Case Studies: Demonstrating Program Outcomes

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This cd con...

  15. 7 CFR 90.2 - General terms defined.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... the Science and Technology program of the Agricultural Marketing Service agency, or any officer or... be delegated, to act. Laboratories. Science and Technology laboratories performing the official analyses described in this subchapter. Program. The Science and Technology (S&T) program of the...

  16. 7 CFR 90.2 - General terms defined.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... the Science and Technology program of the Agricultural Marketing Service agency, or any officer or... be delegated, to act. Laboratories. Science and Technology laboratories performing the official analyses described in this subchapter. Program. The Science and Technology (S&T) program of the...

  17. 7 CFR 90.2 - General terms defined.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... the Science and Technology program of the Agricultural Marketing Service agency, or any officer or... be delegated, to act. Laboratories. Science and Technology laboratories performing the official analyses described in this subchapter. Program. The Science and Technology (S&T) program of the...

  18. 75 FR 70578 - Competitive and Noncompetitive Nonformula Federal Assistance Programs-Administrative Provisions...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-18

    ..., extension, and education programs on technology development and integrated research, extension, and education programs on technology implementation, in accordance with the purpose and priorities as described... the development, distribution, and implementation of biobased energy technologies; to promote...

  19. SUPERFUND INNOVATIVE TECHNOLOGY EVALUATION PROGRAM ANNUAL REPORT TO CONGRESS FY 2002

    EPA Science Inventory

    This report details the Fiscal Year 2002 activities of the Superfund Innovative Technology Evaluation (SITE) Program. The Program focused on the remediation needs of the hazardous waste remediation community through demonstration and evaluation of innovative technologies for reme...

  20. SUPERFUND INNOVATIVE TECHNOLOGY EVALUATION PROGRAM ANNUAL REPORT TO CONGRESS FY 2001

    EPA Science Inventory

    This report details the fiscal year 2001 activities of the Superfund Innovative Technology Evaluation (SITE) Program. The Program focuses on the remediation needs of the hazardous waste remediation community through demonstration and evaluation of innovative technologies for re...

  1. SUPERFUND INNOVATIVE TECHNOLOGY EVALUATION PROGRAM ANNUAL REPORT TO CONGRESS FY 1995

    EPA Science Inventory

    The Superfund Innovative Technology Evaluation (SITE) Program was established more than nine years ago to encourage the development and implementation of innovative treatment technologies for hazardous waste site remediation. Development of this program was in direct response to ...

  2. 15 CFR 290.3 - Program description.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST EXTRAMURAL PROGRAMS REGIONAL CENTERS FOR THE... subject of research in NIST's Automated Manufacturing Research Facility (AMRF). The core of AMRF research... manufacturing technology. (b) Program objective. The objective of the NIST Manufacturing Technology Centers is...

  3. 15 CFR 290.3 - Program description.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST EXTRAMURAL PROGRAMS REGIONAL CENTERS FOR THE... subject of research in NIST's Automated Manufacturing Research Facility (AMRF). The core of AMRF research... manufacturing technology. (b) Program objective. The objective of the NIST Manufacturing Technology Centers is...

  4. 15 CFR 290.3 - Program description.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST EXTRAMURAL PROGRAMS REGIONAL CENTERS FOR THE... subject of research in NIST's Automated Manufacturing Research Facility (AMRF). The core of AMRF research... manufacturing technology. (b) Program objective. The objective of the NIST Manufacturing Technology Centers is...

  5. USEPA SITE PROGRAM APPROACH TO TECHNOLOGY TRANSFER AND REGULATORY ACCEPTANCE

    EPA Science Inventory

    The SITE Program was created to meet the increased demand for innovative technologies for hazardous waste treatment. To accomplish this mission, the program seeks to advance the development, implementation and commercialization of innovative technologies for hazardous waste chara...

  6. 78 FR 31535 - Assistive Technology Alternative Financing Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-24

    ... DEPARTMENT OF EDUCATION Assistive Technology Alternative Financing Program AGENCY: Office of Special Education and Rehabilitative Services, Department of Education. ACTION: Notice. Catalog of Federal... developed for the Assistive Technology (AT) Alternative Financing Program (AFP) in fiscal year (FY) 2012 to...

  7. Technology Transition a Model for Infusion and Commercialization

    NASA Technical Reports Server (NTRS)

    McMillan, Vernotto C.

    2006-01-01

    The National Aeronautics and Space Administration has as part of its charter the mission of transferring technologies developed for the space program into the private sector, affording the American people the economic and quality-of-life benefits associated with those technologies. In recent years, considerable effort has been made to use this program not only to transition technologies out of the NASA Mission Directorate Programs, but also to transfer technologies into the Mission Directorate Programs and leverage the impact of government and private-sector innovation. The objective of this paper is to outline an approach and the creation of a model that brings together industry, government, and commercialization strategies. When these elements are integrated, the probability of successful technology development, technology infusion into the Mission Programs, and commercialization into the private sector is increased. This model primarily addresses technology readiness levels between TRL 3 and TRL 6. This gap area, commonly known as the "valley of death," is too immature for commercial entities to invest in heavily, yet not developed enough for major programs to actively pursue. The model has shown promise for increasing the probability of TRL advancement to a level at which NASA programs and/or commercial entities can justify large investments toward either commercialization or infusion.

  8. NASA space research and technology overview (ITP)

    NASA Technical Reports Server (NTRS)

    Reck, Gregory M.

    1992-01-01

    A series of viewgraphs summarizing NASA space research and technology is presented. Some of the specific topics covered include the organization and goals of the Office of Aeronautics and Space Technology, technology maturation strategy, integrated technology plan for the Civil Space Program, program selection and investment prioritization, and space technology benefits.

  9. Programming and Technology for Accessibility in Geoscience

    NASA Astrophysics Data System (ADS)

    Sevre, E.; Lee, S.

    2013-12-01

    Many people, students and professors alike, shy away from learning to program because it is often believed to be something scary or unattainable. However, integration of programming into geoscience education can be a valuable tool for increasing the accessibility of content for all who are interested. It is my goal to dispel these myths and convince people that: 1) Students with disabilities can use programming to increase their role in the classroom, 2) Everyone can learn to write programs to simplify daily tasks, 3) With a deep understanding of the task, anyone can write a program to do a complex task, 4) Technology can be combined with programming to create an inclusive environment for all students of geoscience, and 5) More advanced knowledge of programming and technology can lead geoscientists to create software that serves as assistive technology in the classroom. I share my experiences using technology to enhance the classroom experience as a way of addressing these issues. Through this experience, I have found that programming skills can be learned by all and included in courses to enhance their content without detracting from the curriculum. I hope that geoscience courses can thereby become more accessible for people with disabilities by including programming and technology to the benefit of all involved.

  10. Proposed Social Spending Innovation Research (SSIR) Program: Harnessing American Entrepreneurial Talent to Solve Major U.S. Social Problems

    ERIC Educational Resources Information Center

    Coalition for Evidence-Based Policy, 2015

    2015-01-01

    The Social Spending Innovation Research (SSIR) proposal seeks to replicate, in social spending, the great success of the Small Business Innovation Research (SBIR) program in technology development. The SBIR program funds technology development by entrepreneurial small companies. The program has spawned breakthrough technologies in diverse areas…

  11. The Computer Experience Microvan Program: A Cooperative Endeavor to Improve University-Public School Relations through Technology.

    ERIC Educational Resources Information Center

    Amodeo, Luiza B.; Martin, Jeanette

    To a large extent the Southwest can be described as a rural area. Under these circumstances, programs for public understanding of technology become, first of all, exercises in logistics. In 1982, New Mexico State University introduced a program to inform teachers about computer technology. This program takes microcomputers into rural classrooms…

  12. 23 CFR 420.207 - What are the requirements for research, development, and technology transfer work programs?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ..., Development and Technology Transfer Program Management § 420.207 What are the requirements for research, development, and technology transfer work programs? (a) The State DOT's RD&T work program must, as a minimum... 23 Highways 1 2013-04-01 2013-04-01 false What are the requirements for research, development, and...

  13. 23 CFR 420.207 - What are the requirements for research, development, and technology transfer work programs?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ..., Development and Technology Transfer Program Management § 420.207 What are the requirements for research, development, and technology transfer work programs? (a) The State DOT's RD&T work program must, as a minimum... 23 Highways 1 2010-04-01 2010-04-01 false What are the requirements for research, development, and...

  14. 23 CFR 420.207 - What are the requirements for research, development, and technology transfer work programs?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ..., Development and Technology Transfer Program Management § 420.207 What are the requirements for research, development, and technology transfer work programs? (a) The State DOT's RD&T work program must, as a minimum... 23 Highways 1 2014-04-01 2014-04-01 false What are the requirements for research, development, and...

  15. 23 CFR 420.207 - What are the requirements for research, development, and technology transfer work programs?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ..., Development and Technology Transfer Program Management § 420.207 What are the requirements for research, development, and technology transfer work programs? (a) The State DOT's RD&T work program must, as a minimum... 23 Highways 1 2012-04-01 2012-04-01 false What are the requirements for research, development, and...

  16. 23 CFR 420.207 - What are the requirements for research, development, and technology transfer work programs?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ..., Development and Technology Transfer Program Management § 420.207 What are the requirements for research, development, and technology transfer work programs? (a) The State DOT's RD&T work program must, as a minimum... 23 Highways 1 2011-04-01 2011-04-01 false What are the requirements for research, development, and...

  17. 34 CFR 614.1 - What is the purpose of the Preparing Tomorrow's Teachers to Use Technology program?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 3 2010-07-01 2010-07-01 false What is the purpose of the Preparing Tomorrow's Teachers to Use Technology program? 614.1 Section 614.1 Education Regulations of the Offices of the... Use Technology program? (a) This program provides grants to help future teachers become proficient in...

  18. Mississippi Curriculum Framework for Forestry Technology (Program CIP: 03.0401--Forest Harvesting and Production Technology). Postsecondary Programs.

    ERIC Educational Resources Information Center

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the forestry technology program cluster. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies for the…

  19. Mississippi Curriculum Framework for Medical Radiologic Technology (Radiography) (CIP: 51.0907--Medical Radiologic Technology). Postsecondary Programs.

    ERIC Educational Resources Information Center

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the radiologic technology program. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies for the program,…

  20. Programs for Middle School Math: An Inventory of Existing Technology. Working Paper

    ERIC Educational Resources Information Center

    Saultz, Andrew

    2012-01-01

    In this working paper, Andrew Saultz of Michigan State University inventories the current landscape of technology programs available for middle school math. The working paper is not intended as a "consumers' guide" to technology programs, and the descriptions of some specific programs are not fully accurate or current. Readers who are interested…
