Sample records for systems technology LSST

  1. Designing a Multi-Petabyte Database for LSST

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Becla, Jacek; Hanushevsky, Andrew; Nikolaev, Sergei

    2007-01-10

    The 3.2 giga-pixel LSST camera will produce approximately half a petabyte of archive images every month. These data need to be reduced in under a minute to produce real-time transient alerts, and then added to the cumulative catalog for further analysis. The catalog is expected to grow by about three hundred terabytes per year. The data volume, the real-time transient alerting requirements of the LSST, and its spatio-temporal aspects require innovative techniques to build an efficient data access system at reasonable cost. As currently envisioned, the system will rely on a database for catalogs and metadata. Several database systems are being evaluated to understand how they perform at these data rates, data volumes, and access patterns. This paper describes the LSST requirements, the challenges they impose, the data access philosophy, results to date from evaluating available database technologies against LSST requirements, and the proposed database architecture to meet the data challenges.
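
    The quoted rates are consistent at the order-of-magnitude level. As a quick sanity check (my own back-of-envelope arithmetic, not from the record; the bytes-per-pixel and exposures-per-night figures are assumptions):

```python
# Back-of-envelope check of the ~0.5 PB/month archive-image rate quoted above.
# Assumed figures (not from the record): 2 bytes per raw pixel,
# ~2000 exposures per night, 30 observing nights per month.
PIXELS = 3.2e9              # 3.2 giga-pixel camera
BYTES_PER_PIXEL = 2
EXPOSURES_PER_NIGHT = 2000
NIGHTS_PER_MONTH = 30

bytes_per_exposure = PIXELS * BYTES_PER_PIXEL                   # ~6.4 GB per exposure
monthly_bytes = bytes_per_exposure * EXPOSURES_PER_NIGHT * NIGHTS_PER_MONTH
monthly_pb = monthly_bytes / 1e15
print(f"raw pixel data per month: {monthly_pb:.2f} PB")
```

    Raw pixels alone account for a few tenths of a petabyte per month; with calibration frames and processed copies included, the archived volume plausibly reaches the half-petabyte figure.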

  2. A rigid and thermally stable all ceramic optical support bench assembly for the LSST Camera

    NASA Astrophysics Data System (ADS)

    Kroedel, Matthias; Langton, J. Brian; Wahl, Bill

    2017-09-01

    This paper presents the ceramic design, fabrication and metrology results, and assembly plan of the LSST camera optical bench structure, which uses the unique manufacturing features of the HB-Cesic technology. The optical bench assembly consists of a rigid "Grid" fabrication supporting individual raft plates that mount sensor assemblies by way of a rigid kinematic support system, meeting extremely stringent requirements for focal plane planarity and stability.

  3. The LSST Camera 500 watt -130 degC Mixed Refrigerant Cooling System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowden, Gordon B.; Langton, Brian J.; /SLAC

    2014-05-28

    The LSST Camera has a higher cryogenic heat load than previous CCD telescope cameras due to its large size (634 mm diameter focal plane, 3.2 Giga pixels) and its close-coupled front-end electronics operating at low temperature inside the cryostat. Various refrigeration technologies were considered for this telescope/camera environment. MMR-Technology's Mixed Refrigerant technology was chosen, and a collaboration with that company was started in 2009. The system, based on a cluster of Joule-Thomson refrigerators running a special blend of mixed refrigerants, is described. Both the advantages and problems of applying this technology to telescope camera refrigeration are discussed. Test results from a prototype refrigerator running in a realistic telescope configuration are reported. Current and future stages of the development program are described.

  4. The LSST: A System of Systems

    NASA Astrophysics Data System (ADS)

    Claver, Chuck F.; Debois-Felsmann, G. P.; Delgado, F.; Hascall, P.; Marshall, S.; Nordby, M.; Schumacher, G.; Sebag, J.; LSST Collaboration

    2011-01-01

    The Large Synoptic Survey Telescope (LSST) is a complete observing system that acquires and archives images, processes and analyzes them, and publishes reduced images and catalogs of sources and objects. The LSST will operate over a ten-year period producing a survey of 20,000 square degrees over the entire southern sky in 6 filters (ugrizy), with each field having been visited several hundred times, enabling a wide spectrum of science from fast transients to exploration of dark matter and dark energy. The LSST itself is a complex system of systems consisting of the 8.4m 3-mirror telescope, a 3.2 billion pixel camera, and a peta-scale data management system. The LSST project uses a Model Based Systems Engineering (MBSE) methodology to ensure an integrated approach to system design and rigorous definition of system interfaces and specifications. The MBSE methodology is applied through modeling of the LSST's systems with the System Modeling Language (SysML). The SysML modeling recursively establishes the threefold relationship between requirements, logical & physical functional decomposition and definition, and system and component behavior at successively deeper levels of abstraction and detail. The LSST modeling includes analyzing and documenting the flow of command and control information and data among the suite of systems in the LSST observatory that are needed to carry out the activities of the survey. The MBSE approach is applied throughout all stages of the project, from design, to validation and verification, through to commissioning.

  5. Satellite Power Systems (SPS). LSST systems and integration task for SPS flight test article

    NASA Technical Reports Server (NTRS)

    Greenberg, H. S.

    1981-01-01

    This research activity emphasizes the systems definition and resulting structural requirements for the primary structure of two potential SPS large space structure test articles. These test articles represent potential steps in the SPS research and technology development.

  6. Geostationary platform systems concepts definition follow-on study. Volume 2A: Technical Task 2 LSST special emphasis

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The results of the Large Space Systems Technology special emphasis task are presented. The task was an analysis of structural requirements deriving from the initial Phase A Operational Geostationary Platform study.

  7. LSST active optics system software architecture

    NASA Astrophysics Data System (ADS)

    Thomas, Sandrine J.; Chandrasekharan, Srinivasan; Lotz, Paul; Xin, Bo; Claver, Charles; Angeli, George; Sebag, Jacques; Dubois-Felsmann, Gregory P.

    2016-08-01

    The Large Synoptic Survey Telescope (LSST) is an 8-meter-class wide-field telescope now under construction on Cerro Pachon, near La Serena, Chile. This ground-based telescope is designed to conduct a decade-long time-domain survey of the optical sky. In order to achieve the LSST scientific goals, the telescope must deliver seeing-limited image quality over its 3.5 degree field of view. Like many telescopes, LSST will use an Active Optics System (AOS) to correct in near real-time the system aberrations, primarily those introduced by gravity and temperature gradients. The LSST AOS uses a combination of 4 curvature wavefront sensors (CWS) located at the periphery of the LSST field of view. The information coming from the 4 CWS is combined to calculate the appropriate corrections to be sent to the 3 different mirrors composing LSST. The AOS software incorporates a wavefront sensor estimation pipeline (WEP) and an active optics control system (AOCS). The WEP estimates the wavefront residual error from the CWS images. The AOCS determines the correction to be sent to the different degrees of freedom every 30 seconds. In this paper, we describe the design and implementation of the AOS. In particular, we focus on the software architecture as well as the AOS interactions with the various subsystems within LSST.
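
    The WEP/AOCS split described in the abstract maps naturally onto a two-stage loop: combine the four corner-sensor wavefront estimates, then solve for actuator corrections. The sketch below is purely illustrative and is not LSST code; the sensitivity matrix, dimensions, and noise levels are invented, and the solve is a plain least-squares pseudo-inverse.

```python
import numpy as np

# Illustrative sketch of one 30 s active-optics cycle (not the actual WEP/AOCS).
# Assumptions: 19 wavefront (Zernike-like) terms, 10 rigid-body/bending degrees
# of freedom, and a linear sensitivity matrix A mapping state -> wavefront.
rng = np.random.default_rng(0)
n_terms, n_dof = 19, 10
A = rng.normal(size=(n_terms, n_dof))        # assumed sensitivity matrix

def aocs_step(cws_estimates):
    """cws_estimates: (4, n_terms) wavefront residuals from the 4 corner CWS."""
    wavefront = cws_estimates.mean(axis=0)   # combine the 4 sensor estimates
    # least-squares correction that cancels the measured residual
    correction = -np.linalg.pinv(A) @ wavefront
    return correction, wavefront

true_state = rng.normal(size=n_dof)          # hypothetical misalignment state
wavefront = A @ true_state                   # wavefront error it produces
estimates = wavefront + 0.01 * rng.normal(size=(4, n_terms))  # noisy CWS readings
correction, wf_est = aocs_step(estimates)
residual = A @ (true_state + correction)     # wavefront left after correction
print(f"relative residual after one cycle: "
      f"{np.linalg.norm(residual) / np.linalg.norm(wf_est):.4f}")
```

    In the real system the correction would be rate-limited and distributed across the mirrors and hexapods, but the estimate-combine-solve structure per 30-second cycle is the same.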

  8. FY-79 - development of fiber optics connector technology for large space systems

    NASA Technical Reports Server (NTRS)

    Campbell, T. G.

    1980-01-01

    The development of physical concepts for integrating fiber optic connectors and cables with structural concepts proposed for the LSST is discussed. Emphasis is placed on remote connections using integrated cables.

  9. The LSST: A System of Systems

    NASA Astrophysics Data System (ADS)

    Claver, Chuck F.; Dubois-Felsmann, G. P.; Delgado, F.; Hascall, P.; Horn, D.; Marshall, S.; Nordby, M.; Schalk, T. L.; Schumacher, G.; Sebag, J.; LSST Project Team

    2010-01-01

    The LSST is a complete observing system that acquires and archives images, processes and analyzes them, and publishes reduced images and catalogs of sources and objects. The LSST will operate over a ten year period producing a survey of 20,000 square degrees over the entire southern sky in 6 filters (ugrizy) with each field having been visited several hundred times enabling a wide spectrum of science from fast transients to exploration of dark matter and dark energy. The LSST itself is a complex system of systems consisting of the 8.4m three mirror telescope, a 3.2 billion pixel camera, and a peta-scale data management system. The LSST project uses a Model Based Systems Engineering (MBSE) methodology to ensure an integrated approach to system design and rigorous definition of system interfaces and specifications. The MBSE methodology is applied through modeling of the LSST's systems with the System Modeling Language (SysML). The SysML modeling recursively establishes the threefold relationship between requirements, logical & physical functional decomposition and definition, and system and component behavior at successively deeper levels of abstraction and detail. The MBSE approach is applied throughout all stages of the project from design, to validation and verification, through to commissioning.

  10. Technology for large space systems: A special bibliography with indexes

    NASA Technical Reports Server (NTRS)

    1979-01-01

    This bibliography lists 460 reports, articles, and other documents introduced into the NASA scientific and technical information system between January 1, 1968 and December 31, 1978. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design in the area of the Large Space Systems Technology (LSST) Program. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, and flight experiments.

  11. Technology for large space systems: A special bibliography with indexes (supplement 01)

    NASA Technical Reports Server (NTRS)

    1979-01-01

    This bibliography lists 180 reports, articles, and other documents introduced into the NASA scientific and technical information system between January 1, 1979 and June 30, 1979. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design in the area of the Large Space Systems Technology (LSST) Program. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, and flight experiments.

  12. Technology for Large Space Systems: A Special Bibliography with Indexes (Supplement 2)

    NASA Technical Reports Server (NTRS)

    1980-01-01

    This bibliography lists 258 reports, articles, and other documents introduced into the NASA scientific and technical information system between July 1, 1979 and December 31, 1979. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design in the area of the Large Space Systems Technology (LSST) Program. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, solar power satellite systems, and flight experiments.

  13. Technology for large space systems: A special bibliography with indexes (supplement 05)

    NASA Technical Reports Server (NTRS)

    1981-01-01

    This bibliography lists 298 reports, articles, and other documents introduced into the NASA scientific and technical information system between January 1, 1981 and June 30, 1981. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design in the area of the Large Space Systems Technology (LSST) Program. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, solar power satellite systems, and flight experiments.

  14. Technology for large space systems: A special bibliography with indexes (supplement 06)

    NASA Technical Reports Server (NTRS)

    1982-01-01

    This bibliography lists 220 reports, articles and other documents introduced into the NASA scientific and technical information system between July 1, 1981 and December 31, 1981. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design in the area of the Large Space Systems Technology (LSST) Program. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, solar power satellite systems, and flight experiments.

  15. NOAO and LSST: Illuminating the Path to LSST for All Users

    NASA Astrophysics Data System (ADS)

    Olsen, Knut A.; Matheson, T.; Ridgway, S. T.; Saha, A.; Lauer, T. R.; NOAO LSST Science Working Group

    2013-01-01

    As LSST moves toward construction and survey definition, the burden on the user community to begin planning and preparing for the massive data stream grows. In light of the significant challenge and opportunity that LSST now brings, a critical role for a National Observatory will be to advocate for, respond to, and advise the U.S. community on its use of LSST. NOAO intends to establish an LSST Community Science Center to meet these common needs. Such a Center builds on NOAO's leadership in offering survey-style instruments, proposal opportunities, and data management software over the last decade. This leadership has enabled high-impact scientific results, as evidenced by the award of the 2011 Nobel Prize in Physics for the discovery of Dark Energy, which stemmed directly from survey-style observations taken at NOAO. As steps towards creating an LSST Community Science Center, NOAO is 1) supporting the LSST Science Collaborations through membership calls and collaboration meetings; 2) developing the LSST operations simulator, the tool by which the community's scientific goals are tested against the reality of what LSST's cadence can deliver; 3) embarking on a project to establish metrics for science data quality assessment, which will be critical for establishing confidence in LSST results; 4) developing a roadmap and proposal to host and support the capability to help the community manage the expected flood of automated alerts from LSST; and 5) starting a serious discussion of the system capabilities needed for photometric and spectroscopic follow-up of LSST observations. The fundamental goal is to enable productive, world-class research with LSST by the entire US community-at-large in tight collaboration with the LSST Project, LSST Science Collaborations, and the funding agencies.

  16. Control system design for the large space systems technology reference platform

    NASA Technical Reports Server (NTRS)

    Edmunds, R. S.

    1982-01-01

    Structural models and classical frequency-domain control system designs were developed for the Large Space Systems Technology (LSST) reference platform, which consists of a central bus structure, solar panels, and platform arms on which a variety of experiments may be mounted. It is shown that operation of multiple independently articulated payloads on a single platform presents major problems when sub-arcsecond pointing stability is required. Experiment compatibility will be an important operational consideration for systems of this type.

  17. From Science To Design: Systems Engineering For The Lsst

    NASA Astrophysics Data System (ADS)

    Claver, Chuck F.; Axelrod, T.; Fouts, K.; Kantor, J.; Nordby, M.; Sebag, J.; LSST Collaboration

    2009-01-01

    The LSST is a general-purpose survey telescope that will address scores of scientific missions. To help the technical teams converge on a specific engineering design, the LSST Science Requirements Document (SRD) selects four stressing principal science missions: 1) constraining dark matter and dark energy; 2) taking an inventory of the Solar System; 3) exploring the transient optical sky; and 4) mapping the Milky Way. From these four missions the SRD specifies the requirements, for single images and for the full 10-year survey, that enable a wide range of science beyond the four principal missions. Through optical design and analysis, operations simulation, and throughput modeling, the systems engineering effort in the LSST has largely focused on taking the SRD specifications and deriving system functional requirements that define the system design. A Model Based Systems Engineering approach with SysML is used to manage the flow-down of requirements from science to system function to sub-system. The rigor of requirements flow and management helps the LSST keep the overall scope, and hence budget and schedule, under control.

  18. LSST and the Physics of the Dark Universe

    ScienceCinema

    Tyson, Anthony [UC Davis, California, United States]

    2017-12-09

    The physics that underlies the accelerating cosmic expansion is unknown. This 'dark energy' and the equally mysterious 'dark matter' comprise most of the mass-energy of the universe and lie outside the standard model. Recent advances in optics, detectors, and information technology have led to the design of a facility that will repeatedly image an unprecedented volume of the universe: LSST. For the first time, the sky will be surveyed wide, deep, and fast. The history of astronomy has taught us repeatedly that there are surprises whenever we view the sky in a new way. I will review the technology of LSST and focus on several independent probes of the nature of dark energy and dark matter. These new investigations will rely on the statistical precision obtainable with billions of galaxies.

  19. Systems engineering in the Large Synoptic Survey Telescope project: an application of model based systems engineering

    NASA Astrophysics Data System (ADS)

    Claver, C. F.; Selvy, Brian M.; Angeli, George; Delgado, Francisco; Dubois-Felsmann, Gregory; Hascall, Patrick; Lotz, Paul; Marshall, Stuart; Schumacher, German; Sebag, Jacques

    2014-08-01

    The Large Synoptic Survey Telescope project was an early adopter of SysML and Model Based Systems Engineering practices. The LSST project began using MBSE for requirements engineering beginning in 2006 shortly after the initial release of the first SysML standard. Out of this early work the LSST's MBSE effort has grown to include system requirements, operational use cases, physical system definition, interfaces, and system states along with behavior sequences and activities. In this paper we describe our approach and methodology for cross-linking these system elements over the three classical systems engineering domains - requirement, functional and physical - into the LSST System Architecture model. We also show how this model is used as the central element to the overall project systems engineering effort. More recently we have begun to use the cross-linked modeled system architecture to develop and plan the system verification and test process. In presenting this work we also describe "lessons learned" from several missteps the project has had with MBSE. Lastly, we conclude by summarizing the overall status of the LSST's System Architecture model and our plans for the future as the LSST heads toward construction.

  20. Integration and verification testing of the Large Synoptic Survey Telescope camera

    NASA Astrophysics Data System (ADS)

    Lange, Travis; Bond, Tim; Chiang, James; Gilmore, Kirk; Digel, Seth; Dubois, Richard; Glanzman, Tom; Johnson, Tony; Lopez, Margaux; Newbry, Scott P.; Nordby, Martin E.; Rasmussen, Andrew P.; Reil, Kevin A.; Roodman, Aaron J.

    2016-08-01

    We present an overview of the Integration and Verification Testing activities of the Large Synoptic Survey Telescope (LSST) Camera at the SLAC National Accelerator Laboratory (SLAC). The LSST Camera, the sole instrument for LSST and now under construction, comprises a 3.2 Giga-pixel imager and a three-element corrector with a 3.5 degree diameter field of view. LSST Camera Integration and Test will take place over the next four years, with final delivery to the LSST observatory anticipated in early 2020. We outline the planning for Integration and Test, describe some of the key verification hardware systems being developed, and identify some of the more complicated assembly/integration activities. Specific details of integration and verification hardware systems are discussed, highlighting some of the technical challenges anticipated.

  1. Asteroid Discovery and Characterization with the Large Synoptic Survey Telescope

    NASA Astrophysics Data System (ADS)

    Jones, R. Lynne; Jurić, Mario; Ivezić, Željko

    2016-01-01

    The Large Synoptic Survey Telescope (LSST) will be a ground-based, optical, all-sky, rapid-cadence survey project with tremendous potential for discovering and characterizing asteroids. With LSST's large primary mirror (6.5m effective diameter), a 3.2 Gigapixel camera with a wide 9.6 square degree field of view, and rapid observational cadence, LSST will discover more than 5 million asteroids over its ten-year survey lifetime. With a single-visit limiting magnitude of 24.5 in r band, LSST will be able to detect asteroids in the Main Belt down to sub-kilometer sizes. The current strawman for the LSST survey strategy is to obtain two visits (each 'visit' being a pair of back-to-back 15s exposures) per field, separated by about 30 minutes, covering the entire visible sky every 3-4 days throughout the observing season, for ten years. The catalogs generated by LSST will increase the known number of small bodies in the Solar System by a factor of 10-100, among all populations. The median number of observations for Main Belt asteroids will be on the order of 200-300, with Near Earth Objects receiving a median of 90 observations. These observations will be spread among the ugrizy bandpasses, providing photometric colors and allowing sparse lightcurve inversion to determine rotation periods, spin axes, and shape information. These catalogs will be created using automated detection software, the LSST Moving Object Processing System (MOPS), which will take advantage of the carefully characterized LSST optical system, cosmetically clean camera, and recent improvements in difference imaging. Tests with the prototype MOPS software indicate that linking detections (and thus 'discovery') will be possible at LSST depths with our working model for the survey strategy, but evaluation of MOPS and improvements in the survey strategy will continue. All data products and software created by LSST will be publicly available.
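
    The cadence figures above can be cross-checked with simple arithmetic. This is my own estimate, not from the record: the usable night length and per-visit overhead are assumptions, and only part of the 20,000 square degrees is actually observable on any given night.

```python
# Rough consistency check that a 9.6 deg^2 field of view, visited twice per
# field, can sweep the survey footprint every few days.
SKY_AREA = 20000.0         # deg^2, total survey footprint
FOV = 9.6                  # deg^2 per pointing
VISIT_S = 2 * 15 + 4       # two 15 s exposures plus an assumed ~4 s overhead
NIGHT_S = 8 * 3600         # assumed ~8 h of usable night

fields = SKY_AREA / FOV                  # ~2080 pointings
visits_per_night = NIGHT_S / VISIT_S     # ~850 visits per night
# two visits per field per pass over the sky
nights_per_pass = 2 * fields / visits_per_night
print(f"nights to cover the sky twice: {nights_per_pass:.1f}")  # ~4.9 nights
```

    The result lands in the quoted 3-4 day range once one accounts for the fact that only the visible fraction of the footprint needs covering in a single pass.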

  2. The Large Synoptic Survey Telescope Science Requirements

    NASA Astrophysics Data System (ADS)

    Tyson, J. A.; LSST Collaboration

    2004-12-01

    The Large Synoptic Survey Telescope (LSST) is a wide-field telescope facility that will add a qualitatively new capability in astronomy and will address some of the most pressing open questions in astronomy and fundamental physics. The 8.4-meter telescope and 3 billion pixel camera covering ten square degrees will reach sky in less than 10 seconds in each of 5-6 optical bands. This is enabled by advances in microelectronics, software, and large optics fabrication. The unprecedented optical throughput drives LSST's ability to go faint-wide-fast. The LSST will produce time-lapse digital imaging of faint astronomical objects across the entire visible sky with good resolution. For example, the LSST will provide unprecedented 3-dimensional maps of the mass distribution in the Universe, in addition to the traditional images of luminous stars and galaxies. These weak lensing data can be used to better understand the nature of Dark Energy. The LSST will also provide a comprehensive census of our solar system. By surveying deeply the entire accessible sky every few nights, the LSST will provide large samples of events which we now only rarely observe, and will create substantial potential for new discoveries. The LSST will produce the largest non-proprietary data set in the world. Several key science drivers are representative of the LSST system capabilities: Precision Characterization of Dark Energy, Solar System Map, Optical Transients, and a map of our Galaxy and its environs. In addition to enabling all four of these major scientific initiatives, LSST will make it possible to pursue many other research programs. The community has suggested a number of exciting programs using these data, and the long-lived data archives of the LSST will have the astrometric and photometric precision needed to support entirely new research directions which will inevitably develop during the next several decades.

  3. Firefly: embracing future web technologies

    NASA Astrophysics Data System (ADS)

    Roby, W.; Wu, X.; Goldina, T.; Joliet, E.; Ly, L.; Mi, W.; Wang, C.; Zhang, Lijun; Ciardi, D.; Dubois-Felsmann, G.

    2016-07-01

    At IPAC/Caltech, we have developed the Firefly web archive and visualization system. Used in production for the last eight years in many missions, Firefly gives the scientist significant capabilities to study data. Firefly provided the first completely web-based FITS viewer, as well as a growing set of tabular and plotting visualizers. Further, it will be used for the science user interface of the LSST telescope, which goes online in 2021. Firefly must meet the needs of archive access and visualization for the 2021 LSST telescope and must serve astronomers beyond the year 2030. Recently, our team has faced the fact that the technology behind the Firefly software was becoming obsolete. We were searching for ways to utilize the current breakthroughs in maintaining stability, testability, speed, and reliability of large web applications, which Firefly exemplifies. In the last year, we have ported Firefly to cutting-edge web technologies. Embarking on this massive overhaul is no small feat, to say the least. Choosing the technologies that will maintain a forward trajectory in a future development project is always hard and often overwhelming. When a team must port 150,000 lines of code for a production-level product, there is little room to make poor choices. This paper gives an overview of the most modern web technologies and lessons learned in our conversion from a GWT-based system to a React/Redux-based system.

  4. Designing a multi-petabyte database for LSST

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Becla, J; Hanushevsky, A

    2005-12-21

    The 3.2 giga-pixel LSST camera will produce over half a petabyte of raw images every month. These data need to be reduced in under a minute to produce real-time transient alerts, and then cataloged and indexed to allow efficient access and simplify further analysis. The indexed catalogs alone are expected to grow at a rate of about 600 terabytes per year. The sheer volume of data, the real-time transient alerting requirements of the LSST, and its spatio-temporal aspects require cutting-edge techniques to build an efficient data access system at reasonable cost. As currently envisioned, the system will rely on a database for catalogs and metadata. Several database systems are being evaluated to understand how they will scale and perform at these data volumes and under anticipated LSST access patterns. This paper describes the LSST requirements, the challenges they impose, the data access philosophy, and the database architecture that is expected to be adopted in order to meet the data challenges.

  5. The Large Synoptic Survey Telescope as a Near-Earth Object discovery machine

    NASA Astrophysics Data System (ADS)

    Jones, R. Lynne; Slater, Colin T.; Moeyens, Joachim; Allen, Lori; Axelrod, Tim; Cook, Kem; Ivezić, Željko; Jurić, Mario; Myers, Jonathan; Petry, Catherine E.

    2018-03-01

    Using the most recent prototypes, design, and as-built system information, we test and quantify the capability of the Large Synoptic Survey Telescope (LSST) to discover Potentially Hazardous Asteroids (PHAs) and Near-Earth Objects (NEOs). We empirically estimate an expected upper limit to the false detection rate in LSST image differencing, using measurements on DECam data and prototype LSST software, and find it to be about 450 deg⁻². We show that this rate is already tractable with the current prototype of the LSST Moving Object Processing System (MOPS) by processing a 30-day simulation consistent with measured false detection rates. We proceed to evaluate the performance of the LSST baseline survey strategy for PHAs and NEOs using a high-fidelity simulated survey pointing history. We find that LSST alone, using its baseline survey strategy, will detect 66% of the PHA and 61% of the NEO population objects brighter than H = 22, with an uncertainty of ±5 percentage points. By generating and examining variations on the baseline survey strategy, we show it is possible to further improve the discovery yields. In particular, we find that extending the LSST survey by two additional years and doubling the MOPS search window increases the completeness for PHAs to 86% (including those discovered by contemporaneous surveys) without jeopardizing other LSST science goals (77% for NEOs). This equates to reducing the undiscovered population of PHAs by an additional 26% (15% for NEOs), relative to the baseline survey.

  6. Investigating interoperability of the LSST data management software stack with Astropy

    NASA Astrophysics Data System (ADS)

    Jenness, Tim; Bosch, James; Owen, Russell; Parejko, John; Sick, Jonathan; Swinbank, John; de Val-Borro, Miguel; Dubois-Felsmann, Gregory; Lim, K.-T.; Lupton, Robert H.; Schellart, Pim; Krughoff, K. S.; Tollerud, Erik J.

    2016-07-01

    The Large Synoptic Survey Telescope (LSST) will be an 8.4m optical survey telescope sited in Chile and capable of imaging the entire sky twice a week. The data rate of approximately 15TB per night and the requirements both to issue alerts on transient sources within 60 seconds of observing and to create annual data releases mean that automated data management systems and data processing pipelines are a key deliverable of the LSST construction project. The LSST data management software has been in development since 2004 and is based on a C++ core with a Python control layer. The software consists of nearly a quarter of a million lines of code covering the system from fundamental WCS and table libraries to pipeline environments and distributed process execution. The Astropy project began in 2011 as an attempt to bring together disparate open-source Python projects and build a core standard infrastructure that can be used and built upon by the astronomy community. This project has been phenomenally successful in the years since it began and has grown to be the de facto standard for Python software in astronomy. Astropy brings with it considerable expectations from the community on how astronomy Python software should be developed, and it is clear that by the time LSST is fully operational in the 2020s many of the prospective users of the LSST software stack will expect it to be fully interoperable with Astropy. In this paper we describe the overlap between the LSST science pipeline software and Astropy software and investigate areas where the LSST software provides new functionality. We also discuss the possibilities of re-engineering the LSST science pipeline software to build upon Astropy, including the option of contributing affiliated packages.

  7. The Large Synoptic Survey Telescope project management control system

    NASA Astrophysics Data System (ADS)

    Kantor, Jeffrey P.

    2012-09-01

The Large Synoptic Survey Telescope (LSST) program is jointly funded by the NSF, the DOE, and private institutions and donors. From an NSF funding standpoint, the LSST is a Major Research Equipment and Facilities Construction (MREFC) project. The NSF funding process requires proposals and design reviews to include activity-based budgets and schedules; documented bases of estimates; risk-based contingency analysis; and cost escalation and categorization. Out of the box, the commercial tool Primavera P6 contains approximately 90% of the planning and estimating capability needed to satisfy R&D phase requirements, and it is customizable/configurable for the remainder with relatively little effort. We describe the customization/configuration and use of Primavera for the LSST Project Management Control System (PMCS), assess our experience to date, and describe future directions. Examples in this paper are drawn from the LSST Data Management System (DMS), which is one of three main subsystems of the LSST and is funded by the NSF. By astronomy standards the LSST DMS is a large data management project, processing and archiving over 70 petabytes of image data, producing over 20 petabytes of catalogs annually, and generating 2 million transient alerts per night. Over the 6-year construction and commissioning phase, the DM project is estimated to require 600,000 hours of engineering effort. In total, the DMS cost is approximately 60% hardware/system software and 40% labor.

  8. It’s about time: How do sky surveys manage uncertainty about scientific needs many years into the future?

    NASA Astrophysics Data System (ADS)

    Darch, Peter T.; Sands, Ashley E.

    2016-06-01

    Sky surveys, such as the Sloan Digital Sky Survey (SDSS) and the Large Synoptic Survey Telescope (LSST), generate data on an unprecedented scale. While many scientific projects span a few years from conception to completion, sky surveys are typically on the scale of decades. This paper focuses on critical challenges arising from long timescales, and how sky surveys address these challenges. We present findings from a study of LSST, comprising interviews (n=58) and observation. Conceived in the 1990s, the LSST Corporation was formed in 2003, and construction began in 2014. LSST will commence data collection operations in 2022 for ten years. One challenge arising from this long timescale is uncertainty about the future needs of the astronomers who will use these data many years hence. Sources of uncertainty include the scientific questions to be posed, the astronomical phenomena to be studied, and the tools and practices these astronomers will have at their disposal. These uncertainties are magnified by the rapid technological and scientific developments anticipated between now and the start of LSST operations. LSST is implementing a range of strategies to address these challenges. Some strategies involve delaying the resolution of uncertainty, placing this resolution in the hands of future data users. Other strategies aim to reduce uncertainty by shaping astronomers' data analysis practices so that these practices will integrate well with LSST once operations begin. One approach that exemplifies both types of strategy is the decision to make the LSST data management software open source, even now as it is being developed. This policy will enable future data users to adapt the software to evolving needs. In addition, LSST intends for astronomers to start using this software well in advance of 2022, thereby embedding LSST software and data analysis approaches in the practices of astronomers. These findings strengthen arguments for making the software supporting sky surveys available as open source. Such arguments usually focus on the reuse potential of software and on enhancing the replicability of analyses. In this case, however, open source software also promises to mitigate the critical challenge of anticipating the needs of future data users.

  9. LSST Survey Data: Models for EPO Interaction

    NASA Astrophysics Data System (ADS)

    Olsen, J. K.; Borne, K. D.

    2007-12-01

    The potential for education and public outreach with the Large Synoptic Survey Telescope is as far reaching as the telescope itself. LSST data will be available to the public, giving anyone with a web browser a movie-like window on the Universe. The LSST project is unique in designing its data management and data access systems with the public and community users in mind. The enormous volume of data to be generated by LSST is staggering: 30 Terabytes per night, 10 Petabytes per year. The final database of extracted science parameters from the images will also be enormous -- 50-100 Petabytes -- a rich gold mine for data mining and scientific discovery potential. LSST will also generate 100,000 astronomical alerts per night, for 10 years. The LSST EPO team is examining models for EPO interaction with the survey data, particularly in how the community (amateurs, teachers, students, and general public) can participate in the discovery process. We will outline some of our models of community interaction for inquiry-based science using the LSST survey data, and we invite discussion on these topics.

  10. Examining the Potential of LSST to Contribute to Exoplanet Discovery

    NASA Astrophysics Data System (ADS)

    Lund, Michael B.; Pepper, Joshua; Jacklin, Savannah; Stassun, Keivan G.

    2018-01-01

    The Large Synoptic Survey Telescope (LSST), currently under construction in Chile with scheduled first light in 2019, will be one of the major sources of data in the next decade and is one of the top priorities expressed in the last Decadal Survey. LSST is intended to cover a range of science questions, and so the LSST community is still working on optimizing the observing strategy of the survey. With a survey area that will cover half the sky in 6 bands, providing photometric data on billions of stars from 16th to 24th magnitude, LSST can be leveraged to help contribute to exoplanet science. In particular, LSST has the potential to detect exoplanets around stellar populations that are not usually included in transiting exoplanet searches. These include red and white dwarfs, stars in the galactic plane and bulge, stellar clusters, and potentially even the Magellanic Clouds. In probing these varied stellar populations, relative exoplanet frequency can be examined, and in turn LSST may be able to provide fresh insight into how stellar environment plays a role in planetary formation rates. Our initial work on this project has been to demonstrate that even with the limitations of the LSST cadence, exoplanets would be recoverable and detectable in the LSST photometry, and to show that exoplanets are indeed worth including in discussions of the variable sources to which LSST can contribute. We have continued to expand this work to examine exoplanets around stars belonging to various stellar populations, both to show the types of systems that LSST is capable of discovering and to determine the potential exoplanet yields using standard algorithms that have already been implemented in transiting exoplanet searches, as well as how changes to LSST's observing schedule may impact both of these results.

  11. Using SysML for MBSE analysis of the LSST system

    NASA Astrophysics Data System (ADS)

    Claver, Charles F.; Dubois-Felsmann, Gregory; Delgado, Francisco; Hascall, Pat; Marshall, Stuart; Nordby, Martin; Schalk, Terry; Schumacher, German; Sebag, Jacques

    2010-07-01

    The Large Synoptic Survey Telescope is a complex hardware - software system of systems, making up a highly automated observatory in the form of an 8.4m wide-field telescope, a 3.2 billion pixel camera, and a peta-scale data processing and archiving system. As a project, the LSST is using model based systems engineering (MBSE) methodology for developing the overall system architecture coded with the Systems Modeling Language (SysML). With SysML we use a recursive process to establish three-fold relationships between requirements, logical & physical structural component definitions, and overall behavior (activities and sequences) at successively deeper levels of abstraction and detail. Using this process we have analyzed and refined the LSST system design, ensuring the consistency and completeness of the full set of requirements and their match to associated system structure and behavior. As the recursion process proceeds to deeper levels we derive more detailed requirements and specifications, and ensure their traceability. We also expose, define, and specify critical system interfaces, physical and information flows, and clarify the logic and control flows governing system behavior. The resulting integrated model database is used to generate documentation and specifications and will evolve to support activities from construction through final integration, test, and commissioning, serving as a living representation of the LSST as designed and built. We discuss the methodology and present several examples of its application to specific systems engineering challenges in the LSST design.

  12. The Stellar Populations of the Milky Way and Nearby Galaxies with LSST

    NASA Astrophysics Data System (ADS)

    Olsen, Knut A.; Covey, K.; Saha, A.; Beers, T. C.; Bochanski, J.; Boeshaar, P.; Cargile, P.; Catelan, M.; Burgasser, A.; Cook, K.; Dhital, S.; Figer, D.; Ivezic, Z.; Kalirai, J.; McGehee, P.; Minniti, D.; Pepper, J.; Prsa, A.; Sarajedini, A.; Silva, D.; Smith, J. A.; Stassun, K.; Thorman, P.; Williams, B.; LSST Stellar Populations Collaboration

    2011-01-01

    The LSST will produce a multi-color map and photometric object catalog of half the sky to r=27.6 (AB mag; 5-sigma) when observations at the individual epochs of the standard cadence are stacked. Analyzing the ten years of independent measurements in each field will allow variability, proper motion and parallax measurements to be derived for objects brighter than r=24.5. These photometric, astrometric, and variability data will enable the construction of a detailed and robust map of the stellar populations of the Milky Way, its satellites and its nearest extra-galactic neighbors--allowing exploration of their star formation, chemical enrichment, and accretion histories on a grand scale. For example, with geometric parallax accuracy of 1 milli-arc-sec, comparable to HIPPARCOS but reaching more than 10 magnitudes fainter, LSST will allow a complete census of all stars above the hydrogen-burning limit that are closer than 500 pc, including thousands of predicted L and T dwarfs. The LSST time sampling will identify and characterize variable stars of all types, from time scales of 1 hr to several years, a feast for variable star astrophysics; LSST's projected impact on the study of several variable star classes, including eclipsing binaries, are discussed here. We also describe the ongoing efforts of the collaboration to optimize the LSST system for stellar populations science. We are currently investigating the trade-offs associated with the exact wavelength boundaries of the LSST filters, identifying the most scientifically valuable locations for fields that will receive enhanced temporal coverage compared to the standard cadence, and analyzing synthetic LSST outputs to verify that the system's performance will be sufficient to achieve our highest priority science goals.

  13. Astrometry with LSST: Objectives and Challenges

    NASA Astrophysics Data System (ADS)

    Casetti-Dinescu, D. I.; Girard, T. M.; Méndez, R. A.; Petronchak, R. M.

    2018-01-01

    The forthcoming Large Synoptic Survey Telescope (LSST) is an optical telescope with an effective aperture of 6.4 m and a field of view of 9.6 square degrees. Thus, LSST will have an étendue larger than any other optical telescope, performing wide-field, deep imaging of the sky. There are four broad categories of science objectives: 1) dark energy and dark matter, 2) transients, 3) the Milky Way and its neighbours, and 4) the Solar System. In particular, for the Milky Way science case, astrometry will make a critical contribution; therefore, special attention must be devoted to extracting the maximum amount of astrometric information from the LSST data. Here, we outline the astrometric challenges posed by such a massive survey. We also present some current examples of ground-based, wide-field, deep imagers used for astrometry, as precursors of the LSST.

  14. LSST and the Epoch of Reionization Experiments

    NASA Astrophysics Data System (ADS)

    Ivezić, Željko

    2018-05-01

    The Large Synoptic Survey Telescope (LSST), a next generation astronomical survey, sited on Cerro Pachon in Chile, will provide an unprecedented amount of imaging data for studies of the faint optical sky. The LSST system includes an 8.4m (6.7m effective) primary mirror and a 3.2 Gigapixel camera with a 9.6 sq. deg. field of view. This system will enable about 10,000 sq. deg. of sky to be covered twice per night, every three to four nights on average, with typical 5-sigma depth for point sources of r = 24.5 (AB). With over 800 observations in the ugrizy bands over a 10-year period, these data will enable coadded images reaching r = 27.5 (about 5 magnitudes deeper than SDSS) as well as studies of faint time-domain astronomy. The measured properties of newly discovered and known astrometric and photometric transients will be publicly reported within 60 sec after closing the shutter. The resulting hundreds of petabytes of imaging data for about 40 billion objects will be used for scientific investigations ranging from the properties of near-Earth asteroids to characterizations of dark matter and dark energy. For example, simulations estimate that LSST will discover about 1,000 quasars at redshifts exceeding 7; this sample will place tight constraints on the cosmic environment at the end of the reionization epoch. In addition to a brief introduction to LSST, I review the value of LSST data in support of epoch of reionization experiments and discuss how international participants can join LSST.

  15. A Prototype External Event Broker for LSST

    NASA Astrophysics Data System (ADS)

    Elan Alvarez, Gabriella; Stassun, Keivan; Burger, Dan; Siverd, Robert; Cox, Donald

    2015-01-01

    LSST plans to have an alerts system that will automatically identify various types of "events" appearing in the LSST data stream. These events will include supernovae, moving objects, and many other types, and it is expected that there may be tens of millions of events each night. To help the LSST community parse and take full advantage of the LSST alerts stream, we are working to design an external "events alert broker" that will generate real-time notification of LSST events to users and/or robotic telescope facilities based on user-specified criteria. For example, users will be able to specify that they wish to be notified immediately via text message of urgent events, such as GRB counterparts, or notified only occasionally in digest form of less time-sensitive events, such as eclipsing binaries. This poster will summarize results from a survey of scientists for the most important features that such an alerts notification service needs to provide, and will present a preliminary design for our external event broker.
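The criteria-driven routing the abstract describes can be illustrated with a minimal sketch: subscribers register a filter predicate plus an urgency flag, urgent matches trigger immediate notices, and everything else is queued for a periodic digest. All class and field names here are hypothetical, not part of any LSST design.

```python
from dataclasses import dataclass, field
from typing import Callable, List

# Hypothetical alert record; field names are illustrative only.
@dataclass
class Alert:
    event_type: str   # e.g. "supernova", "moving_object", "grb_counterpart"
    magnitude: float
    ra: float
    dec: float

# A user subscription: a filter predicate, an urgency flag,
# and a queue of matches awaiting the next digest.
@dataclass
class Subscription:
    predicate: Callable[[Alert], bool]
    urgent: bool
    digest: List[Alert] = field(default_factory=list)

def dispatch(alert: Alert, subs: List[Subscription]) -> List[str]:
    """Route one alert: immediate notices for urgent matches,
    queue non-urgent matches for the periodic digest."""
    notices = []
    for sub in subs:
        if sub.predicate(alert):
            if sub.urgent:
                notices.append(f"NOTIFY NOW: {alert.event_type} at "
                               f"({alert.ra:.3f}, {alert.dec:.3f})")
            else:
                sub.digest.append(alert)
    return notices

subs = [
    Subscription(lambda a: a.event_type == "grb_counterpart", urgent=True),
    Subscription(lambda a: a.event_type == "eclipsing_binary", urgent=False),
]
msgs = dispatch(Alert("grb_counterpart", 19.2, 150.1, -30.5), subs)
```

A production broker would of course persist subscriptions and deliver over real channels (SMS, VOEvent, etc.); the sketch only shows the urgent-versus-digest split described above.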

  16. Curvature wavefront sensing performance evaluation for active correction of the Large Synoptic Survey Telescope (LSST).

    PubMed

    Manuel, Anastacia M; Phillion, Donald W; Olivier, Scot S; Baker, Kevin L; Cannon, Brice

    2010-01-18

    The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror, modified Paul-Baker design, with an 8.4 m primary mirror, a 3.4 m secondary, and a 5.0 m tertiary, along with three refractive corrector lenses to produce a flat focal plane with a field of view of 9.6 square degrees. In order to maintain image quality during operation, the deformations and rigid body motions of the three large mirrors must be actively controlled to minimize optical aberrations, which arise primarily from forces due to gravity and thermal expansion. We describe the methodology for measuring the telescope aberrations using a set of curvature wavefront sensors located in the four corners of the LSST camera focal plane. We present a comprehensive analysis of the wavefront sensing system, including the availability of reference stars, demonstrating that this system will perform to the specifications required to meet the LSST performance goals.

  17. Giga-z: A 100,000 Object Superconducting Spectrophotometer for LSST Follow-up

    NASA Astrophysics Data System (ADS)

    Marsden, Danica W.; Mazin, Benjamin A.; O'Brien, Kieran; Hirata, Chris

    2013-09-01

    We simulate the performance of a new type of instrument, a Superconducting Multi-Object Spectrograph (SuperMOS), that uses microwave kinetic inductance detectors (MKIDs). MKIDs, a new detector technology, feature good quantum efficiency in the UVOIR, can count individual photons with microsecond timing accuracy, and, like X-ray calorimeters, determine their energy to several percent. The performance of Giga-z, a SuperMOS designed for wide-field imaging follow-up observations, is evaluated using simulated observations of the COSMOS mock catalog with an array of 100,000 MKID pixels with spectral resolution R = E/ΔE = 30 at 423 nm. We compare our results against a simultaneous simulation of LSST observations. In 3 yr on a dedicated 4 m class telescope, Giga-z could observe ≈2 billion galaxies, yielding a low-resolution spectral energy distribution spanning 350-1350 nm for each; 1000 times the number measured with any currently proposed LSST spectroscopic follow-up, at a fraction of the cost and time. Giga-z would provide redshifts for galaxies up to z ≈ 6 with magnitudes m_i ≲ 25, with accuracy σΔz/(1 + z) ≈ 0.03 for the whole sample, and σΔz/(1 + z) ≈ 0.007 for a select subset. We also find catastrophic failure rates and biases that are consistently lower than for LSST. The added constraint on dark energy parameters for WL + CMB by Giga-z using the FoMSWG default model is equivalent to multiplying the LSST Fisher matrix by a factor of α = 1.27 (wp), 1.53 (wa), or 1.98 (Δγ). This is equivalent to multiplying both the LSST coverage area and the training sets by α and reducing all systematics by a factor of 1/√α, advantages that are robust to even more extreme models of intrinsic alignment.

  18. Data Mining Research with the LSST

    NASA Astrophysics Data System (ADS)

    Borne, Kirk D.; Strauss, M. A.; Tyson, J. A.

    2007-12-01

    The LSST catalog database will exceed 10 petabytes, comprising several hundred attributes for 5 billion galaxies, 10 billion stars, and over 1 billion variable sources (optical variables, transients, or moving objects), extracted from over 20,000 square degrees of deep imaging in 5 passbands with thorough time domain coverage: 1000 visits over the 10-year LSST survey lifetime. The opportunities are enormous for novel scientific discoveries within this rich time-domain ultra-deep multi-band survey database. Data Mining, Machine Learning, and Knowledge Discovery research opportunities with the LSST are now under study, with the potential for new collaborations to contribute to these investigations. We will describe features of the LSST science database that are amenable to scientific data mining, object classification, outlier identification, anomaly detection, image quality assurance, and survey science validation. We also give some illustrative examples of current scientific data mining research in astronomy, and point out where new research is needed. In particular, the data mining research community will need to address several issues in the coming years as we prepare for the LSST data deluge. The data mining research agenda includes: scalability (at petabyte scales) of existing machine learning and data mining algorithms; development of grid-enabled parallel data mining algorithms; designing a robust system for brokering classifications from the LSST event pipeline (which may produce 10,000 or more event alerts per night); multi-resolution methods for exploration of petascale databases; visual data mining algorithms for visual exploration of the data; indexing of multi-attribute multi-dimensional astronomical databases (beyond RA-Dec spatial indexing) for rapid querying of petabyte databases; and more.
Finally, we will identify opportunities for synergistic collaboration between the data mining research group and the LSST Data Management and Science Collaboration teams.

  19. LSST: Education and Public Outreach

    NASA Astrophysics Data System (ADS)

    Bauer, Amanda; Herrold, Ardis; LSST Education and Public Outreach Team

    2018-01-01

    The Large Synoptic Survey Telescope (LSST) will conduct a 10-year wide, fast, and deep survey of the night sky starting in 2022. LSST Education and Public Outreach (EPO) will enable public access to a subset of LSST data so anyone can explore the universe and be part of the discovery process. LSST EPO aims to facilitate a pathway from entry-level exploration of astronomical imagery to more sophisticated interaction with LSST data using tools similar to what professional astronomers use. To deliver data to the public, LSST EPO is creating an online Portal to serve as the main hub to EPO activities. The Portal will host an interactive Skyviewer, access to LSST data for educators and the public through online Jupyter notebooks, original multimedia for informal science centers and planetariums, and feature citizen science projects that use LSST data. LSST EPO will engage with the Chilean community through Spanish-language components of the Portal and will partner with organizations serving underrepresented groups in STEM.

  20. LSST camera control system

    NASA Astrophysics Data System (ADS)

    Marshall, Stuart; Thaler, Jon; Schalk, Terry; Huffer, Michael

    2006-06-01

    The LSST Camera Control System (CCS) will manage the activities of the various camera subsystems and coordinate those activities with the LSST Observatory Control System (OCS). The CCS comprises a set of modules (nominally implemented in software) which are each responsible for managing one camera subsystem. Generally, a control module will be a long lived "server" process running on an embedded computer in the subsystem. Multiple control modules may run on a single computer or a module may be implemented in "firmware" on a subsystem. In any case control modules must exchange messages and status data with a master control module (MCM). The main features of this approach are: (1) control is distributed to the local subsystem level; (2) the systems follow a "Master/Slave" strategy; (3) coordination will be achieved by the exchange of messages through the interfaces between the CCS and its subsystems. The interface between the camera data acquisition system and its downstream clients is also presented.

  1. Designing for Peta-Scale in the LSST Database

    NASA Astrophysics Data System (ADS)

    Kantor, J.; Axelrod, T.; Becla, J.; Cook, K.; Nikolaev, S.; Gray, J.; Plante, R.; Nieto-Santisteban, M.; Szalay, A.; Thakar, A.

    2007-10-01

    The Large Synoptic Survey Telescope (LSST), a proposed ground-based 8.4 m telescope with a 10 deg^2 field of view, will generate 15 TB of raw images every observing night. When calibration and processed data are added, the image archive, catalogs, and meta-data will grow by 15 PB yr^{-1} on average. The LSST Data Management System (DMS) must capture, process, store, index, replicate, and provide open access to this data. Alerts must be triggered within 30 s of data acquisition. To do this in real-time at these data volumes will require advances in data management, database, and file system techniques. This paper describes the design of the LSST DMS and emphasizes features for peta-scale data. The LSST DMS will employ a combination of distributed database and file systems, with schema, partitioning, and indexing oriented for parallel operations. Image files are stored in a distributed file system with references to, and meta-data from, each file stored in the databases. The schema design supports pipeline processing, rapid ingest, and efficient query. Vertical partitioning reduces disk input/output requirements, horizontal partitioning allows parallel data access using arrays of servers and disks. Indexing is extensive, utilizing both conventional RAM-resident indexes and column-narrow, row-deep tag tables/covering indices that are extracted from tables that contain many more attributes. The DMS Data Access Framework is encapsulated in a middleware framework to provide a uniform service interface to all framework capabilities. This framework will provide the automated work-flow, replication, and data analysis capabilities necessary to make data processing and data quality analysis feasible at this scale.
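The horizontal partitioning mentioned above can be sketched as a sky-chunking scheme: declination stripes are split into RA chunks, and each chunk is assigned to a server so that a spatial query touches only the nodes owning the matching chunks. The parameters and chunk-numbering here are purely illustrative, not the actual LSST partitioning scheme.

```python
import math

def chunk_id(ra_deg: float, dec_deg: float, n_stripes: int = 18) -> int:
    """Map a sky position to a chunk: declination stripes, each split
    into RA chunks, with fewer RA chunks near the poles so chunks
    cover roughly similar sky area. Hypothetical scheme."""
    stripe_height = 180.0 / n_stripes
    stripe = min(int((dec_deg + 90.0) / stripe_height), n_stripes - 1)
    dec_mid = -90.0 + (stripe + 0.5) * stripe_height
    # shrink the RA chunk count toward the poles (cosine of mid-declination)
    n_ra = max(1, int(round(2 * n_stripes * math.cos(math.radians(dec_mid)))))
    ra_chunk = min(int(ra_deg / 360.0 * n_ra), n_ra - 1)
    return stripe * 1000 + ra_chunk  # assumes < 1000 RA chunks per stripe

def node_for_chunk(cid: int, n_nodes: int = 8) -> int:
    """Spread chunks across an array of database servers."""
    return cid % n_nodes
```

Co-locating all objects of a chunk on one node is what lets cone searches and neighbor joins run in parallel across the server array, at the cost of some cross-chunk handling near boundaries.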

  2. The LSST Data Mining Research Agenda

    NASA Astrophysics Data System (ADS)

    Borne, K.; Becla, J.; Davidson, I.; Szalay, A.; Tyson, J. A.

    2008-12-01

    We describe features of the LSST science database that are amenable to scientific data mining, object classification, outlier identification, anomaly detection, image quality assurance, and survey science validation. The data mining research agenda includes: scalability (at petabyte scales) of existing machine learning and data mining algorithms; development of grid-enabled parallel data mining algorithms; designing a robust system for brokering classifications from the LSST event pipeline (which may produce 10,000 or more event alerts per night); multi-resolution methods for exploration of petascale databases; indexing of multi-attribute multi-dimensional astronomical databases (beyond spatial indexing) for rapid querying of petabyte databases; and more.

  3. Earth's Minimoons: Opportunities for Science and Technology.

    NASA Astrophysics Data System (ADS)

    Jedicke, Robert; Bolin, Bryce T.; Bottke, William F.; Chyba, Monique; Fedorets, Grigori; Granvik, Mikael; Jones, Lynne; Urrutxua, Hodei

    2018-05-01

    Twelve years ago the Catalina Sky Survey discovered Earth's first known natural geocentric object other than the Moon, a few-meter diameter asteroid designated 2006 RH120. Despite significant improvements in ground-based asteroid surveying technology in the past decade, no other temporarily-captured orbiter (TCO; colloquially known as a minimoon) has been discovered, but the all-sky fireball system operated in the Czech Republic as part of the European Fireball Network detected a bright natural meteor that was almost certainly in a geocentric orbit before it struck Earth's atmosphere. Within a few years the Large Synoptic Survey Telescope (LSST) will either begin to regularly detect TCOs or force a re-analysis of the creation and dynamical evolution of small asteroids in the inner solar system. The first studies of the provenance, properties, and dynamics of Earth's minimoons suggested that there should be a steady-state population with about one 1- to 2-meter diameter captured object at any time, with the number of captured meteoroids increasing exponentially for smaller sizes. That model was then improved and extended to include the population of temporarily-captured flybys (TCFs), objects that fail to make an entire revolution around Earth while energetically bound to the Earth-Moon system. Several different techniques for discovering TCOs have been considered, but their small diameters, proximity, and rapid motion make them challenging targets for existing ground-based optical, meteor, and radar surveys. However, the LSST's tremendous light gathering power and short exposure times could allow it to detect and discover many minimoons. 
We expect that if the TCO population is confirmed, and new objects are frequently discovered, they can provide new opportunities for 1) studying the dynamics of the Earth-Moon system, 2) testing models of the production and dynamical evolution of small asteroids from the asteroid belt, 3) rapid and frequent low delta-v missions to multiple minimoons, and 4) evaluating in-situ resource utilization techniques on asteroidal material. Here we review the past decade of minimoon studies in preparation for capitalizing on the scientific and commercial opportunities of TCOs in the first decade of LSST operations.

  4. Advancing the LSST Operations Simulator

    NASA Astrophysics Data System (ADS)

    Saha, Abhijit; Ridgway, S. T.; Cook, K. H.; Delgado, F.; Chandrasekharan, S.; Petry, C. E.; Operations Simulator Group

    2013-01-01

    The Operations Simulator for the Large Synoptic Survey Telescope (LSST; http://lsst.org) allows the planning of LSST observations that obey explicit science driven observing specifications, patterns, schema, and priorities, while optimizing against the constraints placed by design-specific opto-mechanical system performance of the telescope facility, site specific conditions (including weather and seeing), as well as additional scheduled and unscheduled downtime. A simulation run records the characteristics of all observations (e.g., epoch, sky position, seeing, sky brightness) in a MySQL database, which can be queried for any desired purpose. Derivative information digests of the observing history database are made with an analysis package called Simulation Survey Tools for Analysis and Reporting (SSTAR). Merit functions and metrics have been designed to examine how suitable a specific simulation run is for several different science applications. This poster reports recent work which has focussed on an architectural restructuring of the code that will allow us to a) use "look-ahead" strategies that avoid cadence sequences that cannot be completed due to observing constraints; and b) examine alternate optimization strategies, so that the most efficient scheduling algorithm(s) can be identified and used: even few-percent efficiency gains will create substantive scientific opportunity. The enhanced simulator will be used to assess the feasibility of desired observing cadences, study the impact of changing science program priorities, and assist with performance margin investigations of the LSST system.
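The merit functions the simulator uses to rank candidate observations can be illustrated with a minimal greedy scheduling step: score each observable field on cadence urgency, slew cost, and sky brightness, then pick the highest-scoring one. The weights, field attributes, and thresholds below are hypothetical, not the Operations Simulator's actual ranking.

```python
def merit(field: dict, now: float) -> float:
    """Hypothetical merit function: reward fields overdue for a revisit,
    penalize long slews and bright sky."""
    slew_cost = field["slew_sec"] / 120.0            # prefer short slews
    sky_penalty = max(0.0, field["sky_mag"] - 21.0)  # prefer dark sky (faint mag)
    urgency = (now - field["last_visit"]) / field["cadence_goal"]
    return 2.0 * urgency - slew_cost - 0.5 * sky_penalty

def next_field(fields: list, now: float):
    """Greedy step: pick the observable field with the highest merit."""
    observable = [f for f in fields if f["airmass"] < 1.5]
    return max(observable, key=lambda f: merit(f, now), default=None)

fields = [
    {"name": "A", "slew_sec": 10, "sky_mag": 21.5, "last_visit": 0.0,
     "cadence_goal": 3.0, "airmass": 1.2},
    {"name": "B", "slew_sec": 60, "sky_mag": 22.0, "last_visit": 2.0,
     "cadence_goal": 3.0, "airmass": 1.1},
    {"name": "C", "slew_sec": 5, "sky_mag": 20.0, "last_visit": 1.0,
     "cadence_goal": 3.0, "airmass": 2.0},  # excluded: airmass too high
]
choice = next_field(fields, now=3.0)  # field "A" is most overdue
```

A purely greedy step like this is exactly what the look-ahead work mentioned above improves on: looking only one visit ahead can start a cadence sequence that constraints later make impossible to finish.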

  5. Probing LSST's Ability to Detect Planets Around White Dwarfs

    NASA Astrophysics Data System (ADS)

    Cortes, Jorge; Kipping, David

    2018-01-01

    Over the last four years more than 2,000 planets outside our solar system have been discovered, motivating us to search for and characterize potentially habitable worlds. Most planets orbit Sun-like stars, but more exotic stars can also host planets. Debris disks and disintegrating planetary bodies have been detected around white dwarf stars, the inert, Earth-sized cores of once-thriving stars like our Sun. These detections are clues that planets may exist around white dwarfs. Due to the faintness of white dwarfs and the potential rarity of planets around them, a vast survey is required to have a chance at detecting these planetary systems. The Large Synoptic Survey Telescope (LSST), scheduled to commence operations in 2023, will image the entire southern sky every few nights for 10 years, providing our first real opportunity to detect planets around white dwarfs. We characterized LSST’s ability to detect planets around white dwarfs through simulations that incorporate realistic models for LSST’s observing strategy and the white dwarf distribution within the Milky Way galaxy. This was done through the use of LSST's Operations Simulator (OpSim) and Catalog Simulator (CatSim). Our preliminary results indicate that, if all white dwarfs were to possess a planet, LSST would yield a detection for every 100 observed white dwarfs. In the future, a larger set of ongoing simulations will help us quantify the number of planets LSST could potentially find.

  6. The European perspective for LSST

    NASA Astrophysics Data System (ADS)

    Gangler, Emmanuel

    2017-06-01

    LSST is a next generation telescope that will produce an unprecedented data flow. The project goal is to deliver data products such as images and catalogs, thus enabling scientific analysis for a wide community of users. As a large-scale survey, LSST data will be complementary with other facilities in a wide range of scientific domains, including data from ESA or ESO. European countries have invested in LSST since 2007, in the construction of the camera as well as in the computing effort. The latter will be instrumental in designing the next step: how to distribute LSST data to Europe. Astroinformatics challenges for LSST indeed include not only the analysis of LSST big data, but also the practical efficiency of the data access.

  7. The LSST OCS scheduler design

    NASA Astrophysics Data System (ADS)

    Delgado, Francisco; Schumacher, German

    2014-08-01

    The Large Synoptic Survey Telescope (LSST) is a complex system of systems with demanding performance and operational requirements. The nature of its scientific goals requires a special Observatory Control System (OCS) and particularly a very specialized automatic Scheduler. The OCS Scheduler is an autonomous software component that drives the survey, selecting the detailed sequence of visits in real time, taking into account multiple science programs, the current external and internal conditions, and the history of observations. We have developed a SysML model for the OCS Scheduler that fits coherently in the OCS and LSST integrated model. We have also developed a prototype of the Scheduler that implements the scheduling algorithms in the simulation environment provided by the Operations Simulator, where the environment and the observatory are modeled with real weather data and detailed kinematics parameters. This paper expands on the Scheduler architecture and the proposed algorithms to achieve the survey goals.

  8. LSST Camera Optics Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riot, V J; Olivier, S; Bauman, B

    2012-05-24

    The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror telescope design feeding a camera system that includes a set of broad-band filters and three refractive corrector lenses to produce a flat field at the focal plane with a wide field of view. Optical design of the camera lenses and filters is integrated with the optical design of the telescope mirrors to optimize performance. We discuss the rationale for the LSST camera optics design, describe the methodology for fabricating, coating, mounting and testing the lenses and filters, and present the results of detailed analyses demonstrating that the camera optics will meet their performance goals.

  9. Giga-z: A 100,000 OBJECT SUPERCONDUCTING SPECTROPHOTOMETER FOR LSST FOLLOW-UP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marsden, Danica W.; Mazin, Benjamin A.; O'Brien, Kieran

    2013-09-15

    We simulate the performance of a new type of instrument, a Superconducting Multi-Object Spectrograph (SuperMOS), that uses microwave kinetic inductance detectors (MKIDs). MKIDs, a new detector technology, feature good quantum efficiency in the UVOIR, can count individual photons with microsecond timing accuracy, and, like X-ray calorimeters, determine their energy to several percent. The performance of Giga-z, a SuperMOS designed for wide field imaging follow-up observations, is evaluated using simulated observations of the COSMOS mock catalog with an array of 100,000 R(423 nm) = E/ΔE = 30 MKID pixels. We compare our results against a simultaneous simulation of LSST observations. In 3 yr on a dedicated 4 m class telescope, Giga-z could observe ≈2 billion galaxies, yielding a low-resolution spectral energy distribution spanning 350-1350 nm for each; 1000 times the number measured with any currently proposed LSST spectroscopic follow-up, at a fraction of the cost and time. Giga-z would provide redshifts for galaxies up to z ≈ 6 with magnitudes m_i ≲ 25, with accuracy σ_Δz/(1+z) ≈ 0.03 for the whole sample, and σ_Δz/(1+z) ≈ 0.007 for a select subset. We also find catastrophic failure rates and biases that are consistently lower than for LSST. The added constraint on dark energy parameters for WL + CMB by Giga-z using the FoMSWG default model is equivalent to multiplying the LSST Fisher matrix by a factor of α = 1.27 (w_p), 1.53 (w_a), or 1.98 (Δγ). This is equivalent to multiplying both the LSST coverage area and the training sets by α and reducing all systematics by a factor of 1/√α, advantages that are robust to even more extreme models of intrinsic alignment.

  10. The Large Synoptic Survey Telescope (LSST) Camera

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Ranked as the top ground-based national priority for the field for the current decade, LSST is currently under construction in Chile. The U.S. Department of Energy’s SLAC National Accelerator Laboratory is leading the construction of the LSST camera – the largest digital camera ever built for astronomy. SLAC Professor Steven M. Kahn is the overall Director of the LSST project, and SLAC personnel are also participating in the data management. The National Science Foundation is the lead agency for construction of the LSST. Additional financial support comes from the Department of Energy and private funding raised by the LSST Corporation.

  11. On the accuracy of modelling the dynamics of large space structures

    NASA Technical Reports Server (NTRS)

    Diarra, C. M.; Bainum, P. M.

    1985-01-01

    Proposed space missions will require large scale, lightweight, space-based structural systems. Large space structure technology (LSST) systems will have to accommodate (among others): ocean data systems; electronic mail systems; large multibeam antenna systems; and space-based solar power systems. The structures are to be delivered into orbit by the space shuttle. Because of their inherent size, modeling techniques and scaling algorithms must be developed so that system performance can be predicted accurately prior to launch and assembly. When the size and weight-to-area ratio of proposed LSST systems dictate that the entire system be considered flexible, there are two basic modeling methods which can be used. The first is a continuum approach, a mathematical formulation for predicting the motion of a general orbiting flexible body, in which elastic deformations are considered small compared with characteristic body dimensions. This approach is based on an a priori knowledge of the frequencies and shape functions of all modes included within the system model. Alternatively, finite element techniques can be used to model the entire structure as a system of lumped masses connected by a series of (restoring) springs and possibly dampers. In addition, a computational algorithm was developed to evaluate the coefficients of the various coupling terms in the equations of motion as applied to the finite element model of the Hoop/Column.

  12. Transient Alerts in LSST

    NASA Astrophysics Data System (ADS)

    Kantor, J.

    During LSST observing, transient events will be detected and alerts generated at the LSST Archive Center at NCSA in Champaign, Illinois. As a very high rate of alerts is expected, approaching ~10 million per night, we plan for VOEvent-compliant Distributor/Brokers (http://voevent.org) to be the primary end-points of the full LSST alert streams. End users will then use these Distributor/Brokers to classify and filter events on the stream for those fitting their science goals. These Distributor/Brokers are envisioned to be operated as a community service by third parties who will have signed MOUs with LSST. The exact identification of Distributor/Brokers to receive alerts will be determined as LSST approaches full operations and may change over time, but it is in our interest to identify and coordinate with them as early as possible. LSST will also operate a limited Distributor/Broker with a filtering capability at the Archive Center, to allow alerts to be sent directly to a limited number of entities that for some reason need to have a more direct connection to LSST. This might include, for example, observatories with significant follow-up capabilities whose observing may temporarily be more directly tied to LSST observing. It will let astronomers create simple filters that limit which alerts are ultimately forwarded to them. These user-defined filters can be specified using an SQL-like declarative language, or as short snippets of (likely Python) code. We emphasize that this LSST-provided capability will be limited, and is not intended to satisfy the wide variety of use cases that a full-fledged public Event Distributor/Broker could. End users will not be able to subscribe to full, unfiltered alert streams coming directly from LSST. In this session, we will discuss anticipated LSST data rates, and capabilities for alert processing and distribution/brokering.
We will clarify what the LSST Observatory will provide versus what we anticipate will be a community effort.
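As a sketch of what such a user-defined Python filter might look like, here is a toy predicate over alert packets; the field names (diaSource, magpsf, prvDiaSources) are assumptions loosely modeled on ZTF-style alert packets, not the final LSST schema.

```python
# Hypothetical user-defined alert filter of the kind described above.
# All field names are illustrative assumptions, not the LSST alert schema.
def bright_rising_transient(alert):
    """Keep alerts that are bright and brightening since the last visit."""
    src = alert["diaSource"]
    history = alert.get("prvDiaSources", [])
    if src["magpsf"] > 20.0:      # too faint for this user's follow-up
        return False
    if not history:               # no prior detection to compare against
        return False
    # Smaller magnitude means brighter: require brightening.
    return src["magpsf"] < history[-1]["magpsf"]

# A tiny mock alert stream to run the filter over.
stream = [
    {"diaSource": {"magpsf": 19.2}, "prvDiaSources": [{"magpsf": 19.8}]},
    {"diaSource": {"magpsf": 21.0}, "prvDiaSources": [{"magpsf": 21.5}]},
    {"diaSource": {"magpsf": 18.9}, "prvDiaSources": []},
]
selected = [a for a in stream if bright_rising_transient(a)]
print(len(selected), "alert(s) forwarded")
```

A production broker would run many such predicates server-side so that only the matching subset of the ~10-million-per-night stream reaches each subscriber.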

  13. LSST (Hoop/Column) Maypole Antenna Development Program, phase 1, part 2

    NASA Technical Reports Server (NTRS)

    Sullivan, M. R.

    1982-01-01

    Cable technology is discussed. Manufacturing flow and philosophy are considered. Acceptance, qualification and flight tests are discussed. Fifteen-meter and fifty-meter models are considered. An economic assessment is included.

  14. LSST (Hoop/Column) Maypole Antenna Development Program, phase 1, part 2

    NASA Astrophysics Data System (ADS)

    Sullivan, M. R.

    1982-06-01

    Cable technology is discussed. Manufacturing flow and philosophy are considered. Acceptance, qualification and flight tests are discussed. Fifteen-meter and fifty-meter models are considered. An economic assessment is included.

  15. The Large Synoptic Survey Telescope (LSST) Camera

    ScienceCinema

    None

    2018-06-13

    Ranked as the top ground-based national priority for the field for the current decade, LSST is currently under construction in Chile. The U.S. Department of Energy’s SLAC National Accelerator Laboratory is leading the construction of the LSST camera – the largest digital camera ever built for astronomy. SLAC Professor Steven M. Kahn is the overall Director of the LSST project, and SLAC personnel are also participating in the data management. The National Science Foundation is the lead agency for construction of the LSST. Additional financial support comes from the Department of Energy and private funding raised by the LSST Corporation.

  16. LIMB-DARKENING COEFFICIENTS FOR ECLIPSING WHITE DWARFS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gianninas, A.; Strickland, B. D.; Kilic, Mukremin

    2013-03-20

    We present extensive calculations of linear and nonlinear limb-darkening coefficients as well as complete intensity profiles appropriate for modeling the light curves of eclipsing white dwarfs. We compute limb-darkening coefficients in the Johnson-Kron-Cousins UBVRI photometric system as well as the Large Synoptic Survey Telescope (LSST) ugrizy system using the most up-to-date model atmospheres available. In all, we provide the coefficients for seven different limb-darkening laws. We describe the variations of these coefficients as a function of the atmospheric parameters, including the effects of convection at low effective temperatures. Finally, we discuss the importance of having readily available limb-darkening coefficients in the context of present and future photometric surveys like the LSST, Palomar Transient Factory, and the Panoramic Survey Telescope and Rapid Response System (Pan-STARRS). The LSST, for example, may find ~10^5 eclipsing white dwarfs. The limb-darkening calculations presented here will be an essential part of the detailed analysis of all of these systems.
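Two of the standard limb-darkening laws for which such coefficients are tabulated can be written down directly. In the sketch below, the coefficient values are arbitrary placeholders, not the paper's tabulated values; mu = cos(theta) is the cosine of the angle from the disk center.

```python
# Standard limb-darkening laws (the coefficients u, a, b here are made-up
# placeholders; the paper tabulates real values per bandpass and atmosphere).

def linear_law(mu, u):
    """Linear law: I(mu)/I(1) = 1 - u*(1 - mu)."""
    return 1.0 - u * (1.0 - mu)

def quadratic_law(mu, a, b):
    """Quadratic law: I(mu)/I(1) = 1 - a*(1 - mu) - b*(1 - mu)**2."""
    return 1.0 - a * (1.0 - mu) - b * (1.0 - mu) ** 2

# Relative intensity halfway to the limb for illustrative coefficients.
print(linear_law(0.5, u=0.3))
print(quadratic_law(0.5, a=0.2, b=0.1))
```

A light-curve model of an eclipsing white dwarf would evaluate one of these profiles over the occulted disk at each phase, which is why having the coefficients precomputed per filter matters.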

  17. The LSST Scheduler from design to construction

    NASA Astrophysics Data System (ADS)

    Delgado, Francisco; Reuter, Michael A.

    2016-07-01

    The Large Synoptic Survey Telescope (LSST) will be a highly robotic facility, demanding a very high efficiency during its operation. To achieve this, the LSST Scheduler has been envisioned as an autonomous software component of the Observatory Control System (OCS) that selects the sequence of targets in real time. The Scheduler will drive the survey using optimization of a dynamic cost function of more than 200 parameters. Multiple science programs produce thousands of candidate targets for each observation, and multiple telemetry measurements are received to evaluate the external and the internal conditions of the observatory. The design of the LSST Scheduler started early in the project, supported by Model Based Systems Engineering, detailed prototyping and scientific validation of the required survey capabilities. In order to build such a critical component, an agile development path in incremental releases is presented, integrated with the development plan of the Operations Simulator (OpSim) to allow constant testing, integration and validation in a simulated OCS environment. The final product is a Scheduler that is also capable of running 2000 times faster than real time in simulation mode, for survey studies and scientific validation during commissioning and operations.
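The cost-function idea can be sketched as a weighted ranking of candidate targets. The terms, weights, and field names below are invented for illustration; the real Scheduler balances more than 200 parameters and live telemetry.

```python
# Toy cost-function ranking: each candidate target is scored by a weighted
# combination of invented terms (slew time penalized, dark sky and cadence
# urgency rewarded). Weights and fields are illustrative assumptions only.
WEIGHTS = {"slew_s": -0.1, "sky_mag": 0.5, "cadence_need": 2.0}

def score(target):
    """Weighted linear score; higher is better."""
    return sum(WEIGHTS[key] * target[key] for key in WEIGHTS)

candidates = [
    {"name": "field_a", "slew_s": 5.0,  "sky_mag": 21.0, "cadence_need": 0.2},
    {"name": "field_b", "slew_s": 60.0, "sky_mag": 21.5, "cadence_need": 0.9},
    {"name": "field_c", "slew_s": 12.0, "sky_mag": 19.0, "cadence_need": 0.1},
]
best = max(candidates, key=score)
print(best["name"])
```

In operation, the scores would be recomputed every visit as telemetry (seeing, clouds, telescope position) and the observation history change.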

  18. Mapping the Solar System with LSST

    NASA Astrophysics Data System (ADS)

    Ivezic, Z.; Juric, M.; Lupton, R.; Connolly, A.; Kubica, J.; Moore, A.; Harris, A.; Bowell, T.; Bernstein, G.; Stubbs, C.; LSST Collaboration

    2004-12-01

    The currently considered LSST cadence, based on two 10 sec exposures, may result in orbital parameters, light curves and accurate colors for over a million main-belt asteroids (MBA), and about 20,000 trans-Neptunian objects (TNO). Compared to the current state-of-the-art, this sample would represent a factor of 5 increase in the number of MBAs with known orbits, a factor of 20 increase in the number of MBAs with known orbits and accurate color measurements, and a factor of 100 increase in the number of MBAs with measured variability properties. The corresponding sample increase for TNOs is 10, 100, and 1000, respectively. The LSST MBA and TNO samples will enable detailed studies of the dynamical and chemical history of the solar system. For example, they will constrain the MBA size distribution for objects larger than 100 m, and TNO size distribution for objects larger than 100 km, their physical state through variability measurements (solid body vs. a rubble pile), as well as their surface chemistry through color measurements. A proposed deep TNO survey, based on 1 hour exposures, may result in a sample of about 100,000 TNOs, while spending only 10% of the LSST observing time. Such a deep TNO survey would be capable of discovering Sedna-like objects at distances beyond 150 AU, thereby increasing the observable Solar System volume by about a factor of 7. The increase in data volume associated with LSST asteroid science will present many computational challenges to how we might extract tracks and orbits of asteroids from the underlying clutter. Tree-based algorithms for multihypothesis testing of asteroid tracks can help solve these challenges by providing the necessary 1000-fold speed-ups over current approaches while recovering 95% of the underlying asteroid populations.

  19. An optical to IR sky brightness model for the LSST

    NASA Astrophysics Data System (ADS)

    Yoachim, Peter; Coughlin, Michael; Angeli, George Z.; Claver, Charles F.; Connolly, Andrew J.; Cook, Kem; Daniel, Scott; Ivezić, Željko; Jones, R. Lynne; Petry, Catherine; Reuter, Michael; Stubbs, Christopher; Xin, Bo

    2016-07-01

    To optimize the observing strategy of a large survey such as the LSST, one needs an accurate model of the night sky emission spectrum across a range of atmospheric conditions and from the near-UV to the near-IR. We have used the ESO SkyCalc Sky Model Calculator to construct a library of template spectra for the Chilean night sky. The ESO model includes emission from the upper and lower atmosphere, scattered starlight, scattered moonlight, and zodiacal light. We have then extended the ESO templates with an empirical fit to the twilight sky emission as measured by a Canon all-sky camera installed at the LSST site. With the ESO templates and our twilight model we can quickly interpolate to any arbitrary sky position and date and return the full sky spectrum or surface brightness magnitudes in the LSST filter system. Comparing our model to all-sky observations, we find typical residual RMS values of ±0.2-0.3 magnitudes per square arcsecond.
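The interpolation step can be sketched in miniature: precomputed sky-brightness values on a grid of conditions (here just airmass, with made-up values) are linearly interpolated to an arbitrary pointing, standing in for the full spectral template library.

```python
import bisect

# Illustrative stand-in for the template library: r-band sky brightness
# (mag / arcsec^2) tabulated on a small airmass grid. Values are invented,
# not the ESO model's output.
airmass_grid = [1.0, 1.5, 2.0]
r_band_sky = [21.2, 21.0, 20.7]

def sky_brightness(airmass):
    """Linearly interpolate the grid to the requested airmass, clamping
    outside the grid range."""
    i = bisect.bisect_left(airmass_grid, airmass)
    if i == 0:
        return r_band_sky[0]
    if i == len(airmass_grid):
        return r_band_sky[-1]
    x0, x1 = airmass_grid[i - 1], airmass_grid[i]
    y0, y1 = r_band_sky[i - 1], r_band_sky[i]
    return y0 + (y1 - y0) * (airmass - x0) / (x1 - x0)

print(sky_brightness(1.25))  # halfway between the first two grid points
```

The production model interpolates full template spectra over several condition axes (moon phase, solar altitude, pointing) before synthesizing magnitudes in the LSST filters.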

  20. Machine Learning-based Transient Brokers for Real-time Classification of the LSST Alert Stream

    NASA Astrophysics Data System (ADS)

    Narayan, Gautham; Zaidi, Tayeb; Soraisam, Monika; ANTARES Collaboration

    2018-01-01

    The number of transient events discovered by wide-field time-domain surveys already far outstrips the combined followup resources of the astronomical community. This number will only increase as we progress towards the commissioning of the Large Synoptic Survey Telescope (LSST), breaking the community's current followup paradigm. Transient brokers - software to sift through, characterize, annotate and prioritize events for followup - will be a critical tool for managing alert streams in the LSST era. Developing the algorithms that underlie the brokers, and obtaining simulated LSST-like datasets prior to LSST commissioning, to train and test these algorithms are formidable, though not insurmountable challenges. The Arizona-NOAO Temporal Analysis and Response to Events System (ANTARES) is a joint project of the National Optical Astronomy Observatory and the Department of Computer Science at the University of Arizona. We have been developing completely automated methods to characterize and classify variable and transient events from their multiband optical photometry. We describe the hierarchical ensemble machine learning algorithm we are developing, and test its performance on sparse, unevenly sampled, heteroskedastic data from various existing observational campaigns, as well as our progress towards incorporating these into a real-time event broker working on live alert streams from time-domain surveys.
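The feature-extraction-plus-ensemble pattern described above can be illustrated with a toy classifier. The features, rules, and thresholds below are invented for illustration; the actual ANTARES hierarchical ensemble is far richer and is trained on real survey data.

```python
import statistics

# Toy illustration only: summarize an unevenly sampled light curve with a
# few features, then let several weak rule-based classifiers vote.
def features(mags):
    return {
        "amplitude": max(mags) - min(mags),
        "stdev": statistics.stdev(mags),
        "median": statistics.median(mags),
    }

def ensemble_classify(mags):
    f = features(mags)
    votes = [
        "variable" if f["amplitude"] > 0.5 else "constant",
        "variable" if f["stdev"] > 0.2 else "constant",
        "variable" if max(mags) - f["median"] > 0.3 else "constant",
    ]
    return max(set(votes), key=votes.count)  # simple majority vote

quiet = [18.00, 18.02, 17.99, 18.01, 18.00]
flaring = [18.0, 17.9, 17.2, 16.8, 17.5]
print(ensemble_classify(quiet), ensemble_classify(flaring))
```

Replacing the hand-written rules with trained models (one per level of a class hierarchy) and the three features with dozens of light-curve statistics gives the general shape of a machine-learned transient broker.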

  1. The LSST operations simulator

    NASA Astrophysics Data System (ADS)

    Delgado, Francisco; Saha, Abhijit; Chandrasekharan, Srinivasan; Cook, Kem; Petry, Catherine; Ridgway, Stephen

    2014-08-01

    The Operations Simulator for the Large Synoptic Survey Telescope (LSST; http://www.lsst.org) allows the planning of LSST observations that obey explicit science driven observing specifications, patterns, schema, and priorities, while optimizing against the constraints placed by design-specific opto-mechanical system performance of the telescope facility, site specific conditions as well as additional scheduled and unscheduled downtime. It has a detailed model to simulate the external conditions with real weather history data from the site, a fully parameterized kinematic model for the internal conditions of the telescope, camera and dome, and serves as a prototype for an automatic scheduler for the real time survey operations with LSST. The Simulator is a critical tool that has been key since very early in the project, to help validate the design parameters of the observatory against the science requirements and the goals from specific science programs. A simulation run records the characteristics of all observations (e.g., epoch, sky position, seeing, sky brightness) in a MySQL database, which can be queried for any desired purpose. Derivative information digests of the observing history are made with an analysis package called Simulation Survey Tools for Analysis and Reporting (SSTAR). Merit functions and metrics have been designed to examine how suitable a specific simulation run is for several different science applications. Software to efficiently compare the efficacy of different survey strategies for a wide variety of science applications using such a growing set of metrics is under development. 
A recent restructuring of the code allows us to a) use "look-ahead" strategies that avoid cadence sequences that cannot be completed due to observing constraints; and b) examine alternate optimization strategies, so that the most efficient scheduling algorithm(s) can be identified and used: even few-percent efficiency gains will create substantive scientific opportunity. The enhanced simulator is being used to assess the feasibility of desired observing cadences, study the impact of changing science program priorities and assist with performance margin investigations of the LSST system.

  2. Solar System science with the Large Synoptic Survey Telescope

    NASA Astrophysics Data System (ADS)

    Jones, Lynne; Brown, Mike; Ivezić, Zeljko; Jurić, Mario; Malhotra, Renu; Trilling, David

    2015-11-01

    The Large Synoptic Survey Telescope (LSST; http://lsst.org) will be a large-aperture, wide-field, ground-based telescope that will survey half the sky every few nights in six optical bands from 320 to 1050 nm. It will explore a wide range of astrophysical questions, ranging from performing a census of the Solar System, to examining the nature of dark energy. It is currently in construction, slated for first light in 2019 and full operations by 2022. The LSST will survey over 20,000 square degrees with a rapid observational cadence, to typical limiting magnitudes of r~24.5 in each visit (9.6 square degree field of view). Automated software will link the individual detections into orbits; these orbits, as well as precisely calibrated astrometry (~50 mas) and photometry (~0.01-0.02 mag) in multiple bandpasses, will be available as LSST data products. The resulting data set will have tremendous potential for planetary astronomy: multi-color catalogs of hundreds of thousands of NEOs and Jupiter Trojans, millions of asteroids, tens of thousands of TNOs, as well as thousands of other objects such as comets and irregular satellites of the major planets. LSST catalogs will increase the sample size of objects with well-known orbits 10-100 times for small body populations throughout the Solar System, enabling a major increase in the completeness level of the inventory of most dynamical classes of small bodies and generating new insights into planetary formation and evolution. Precision multi-color photometry will allow determination of lightcurves and colors, as well as spin state and shape modeling through sparse lightcurve inversion. LSST is currently investigating survey strategies to optimize science return across a broad range of goals. To aid in this investigation, we are making a series of realistic simulated survey pointing histories available together with a Python software package to model and evaluate survey detections for a user-defined input population.
Preliminary metrics from these simulations are shown here; the community is invited to provide further input.

  3. LSST telescope and site status

    NASA Astrophysics Data System (ADS)

    Gressler, William J.

    2016-07-01

    The Large Synoptic Survey Telescope (LSST) Project received its construction authorization from the National Science Foundation in August 2014. The Telescope and Site (T and S) group has made considerable progress towards completion in subsystems required to support the scope of the LSST science mission. The LSST goal is to conduct a wide, fast, deep survey via a 3-mirror wide field of view optical design, a 3.2-Gpixel camera, and an automated data processing system. The summit facility is currently under construction on Cerro Pachón in Chile, with major vendor subsystem deliveries and integration planned over the next several years. This paper summarizes the status of the activities of the T and S group, tasked with design, analysis, and construction of the summit and base facilities and infrastructure necessary to control the survey, capture the light, and calibrate the data. All major telescope work package procurements have been awarded to vendors and are in varying stages of design and fabrication maturity and completion. The unique M1M3 primary/tertiary mirror polishing effort is completed and the mirror now resides in storage awaiting future testing. Significant progress has been achieved on all the major telescope subsystems including the summit facility, telescope mount assembly, dome, hexapod and rotator systems, coating plant, base facility, and the calibration telescope. In parallel, in-house efforts, including the software needed to control the observatory such as the scheduler and the active optics control, have also seen substantial advancement. The progress and status of these subsystems and future LSST plans during this construction phase are presented.

  4. LSST summit facility construction progress report: reacting to design refinements and field conditions

    NASA Astrophysics Data System (ADS)

    Barr, Jeffrey D.; Gressler, William; Sebag, Jacques; Seriche, Jaime; Serrano, Eduardo

    2016-07-01

    The civil work, site infrastructure and buildings for the summit facility of the Large Synoptic Survey Telescope (LSST) are among the first major elements that need to be designed, bid and constructed to support the subsequent integration of the dome, telescope, optics, camera and supporting systems. As the contracts for those other major subsystems now move forward under the management of the LSST Telescope and Site (T and S) team, there has been inevitable and beneficial evolution in their designs, which has resulted in significant modifications to the facility and infrastructure. The earliest design requirements for the LSST summit facility were first documented in 2005, its contracted full design was initiated in 2010, and construction began in January, 2015. During that entire development period, and extending now roughly halfway through construction, there continue to be necessary modifications to the facility design resulting from the refinement of interfaces to other major elements of the LSST project and now, during construction, due to unanticipated field conditions. Changes from evolving interfaces have principally involved the telescope mount, the dome and mirror handling/coating facilities which have included significant variations in mass, dimensions, heat loads and anchorage conditions. Modifications related to field conditions have included specifying and testing alternative methods of excavation and contending with the lack of competent rock substrate where it was predicted to be. While these and other necessary changes are somewhat specific to the LSST project and site, they also exemplify inherent challenges related to the typical timeline for the design and construction of astronomical observatory support facilities relative to the overall development of the project.

  5. Mechanical Design of the LSST Camera

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nordby, Martin; Bowden, Gordon; Foss, Mike

    2008-06-13

    The LSST camera is a tightly packaged, hermetically-sealed system that is cantilevered into the main beam of the LSST telescope. It comprises three refractive lenses, on-board storage for five large filters, a high-precision shutter, and a cryostat that houses the 3.2 giga-pixel CCD focal plane along with its support electronics. The physically large optics and focal plane demand large structural elements to support them, but the overall size of the camera and its components must be minimized to reduce impact on the image stability. Also, focal plane and optics motions must be minimized to reduce systematic errors in image reconstruction. Design and analysis for the camera body and cryostat will be detailed.

  6. Science Education with the LSST

    NASA Astrophysics Data System (ADS)

    Jacoby, S. H.; Khandro, L. M.; Larson, A. M.; McCarthy, D. W.; Pompea, S. M.; Shara, M. M.

    2004-12-01

    LSST will create the first true celestial cinematography - a revolution in public access to the changing universe. The challenge will be to take advantage of the unique capabilities of the LSST while presenting the data in ways that are manageable, engaging, and supportive of national science education goals. To prepare for this opportunity for exploration, tools and displays will be developed using current deep-sky multi-color imaging data. Education professionals from LSST partners invite input from interested members of the community. Initial LSST science education priorities include: - Fostering authentic student-teacher research projects at all levels, - Exploring methods of visualizing the large and changing datasets in science centers, - Defining Web-based interfaces and tools for access and interaction with the data, - Delivering online instructional materials, and - Developing meaningful interactions between LSST scientists and the public.

  7. Management evolution in the LSST project

    NASA Astrophysics Data System (ADS)

    Sweeney, Donald; Claver, Charles; Jacoby, Suzanne; Kantor, Jeffrey; Krabbendam, Victor; Kurita, Nadine

    2010-07-01

    The Large Synoptic Survey Telescope (LSST) project has evolved from just a few staff members in 2003 to about 100 in 2010; the affiliation of four founding institutions has grown to 32 universities, government laboratories, and industry. The public-private collaboration aims to complete the estimated $450 M observatory in the 2017 timeframe. During the design phase of the project, from 2003 to the present, the management structure has been remarkably stable. At the same time, the funding levels, staffing levels and scientific community participation have grown dramatically. The LSSTC has introduced project controls and tools required to manage the LSST's complex funding model, technical structure and distributed work force. Project controls have been configured to comply with the requirements of federal funding agencies. Some of these tools for risk management, configuration control and resource-loaded schedule have been effective and others have not. Technical tasks associated with building the LSST are distributed into three subsystems: Telescope & Site, Camera, and Data Management. Each subsystem has its own experienced Project Manager and System Scientist. Delegation of authority is enabling and effective; it encourages a strong sense of ownership within the project. At the project level, subsystem management follows the principle that there is one Board of Directors, Director, and Project Manager who have overall authority.

  8. Scientific Synergy between LSST and Euclid

    NASA Astrophysics Data System (ADS)

    Rhodes, Jason; Nichol, Robert C.; Aubourg, Éric; Bean, Rachel; Boutigny, Dominique; Bremer, Malcolm N.; Capak, Peter; Cardone, Vincenzo; Carry, Benoît; Conselice, Christopher J.; Connolly, Andrew J.; Cuillandre, Jean-Charles; Hatch, N. A.; Helou, George; Hemmati, Shoubaneh; Hildebrandt, Hendrik; Hložek, Renée; Jones, Lynne; Kahn, Steven; Kiessling, Alina; Kitching, Thomas; Lupton, Robert; Mandelbaum, Rachel; Markovic, Katarina; Marshall, Phil; Massey, Richard; Maughan, Ben J.; Melchior, Peter; Mellier, Yannick; Newman, Jeffrey A.; Robertson, Brant; Sauvage, Marc; Schrabback, Tim; Smith, Graham P.; Strauss, Michael A.; Taylor, Andy; Von Der Linden, Anja

    2017-12-01

    Euclid and the Large Synoptic Survey Telescope (LSST) are poised to dramatically change the astronomy landscape early in the next decade. The combination of high-cadence, deep, wide-field optical photometry from LSST with high-resolution, wide-field optical photometry, and near-infrared photometry and spectroscopy from Euclid will be powerful for addressing a wide range of astrophysical questions. We explore Euclid/LSST synergy, ignoring the political issues associated with data access to focus on the scientific, technical, and financial benefits of coordination. We focus primarily on dark energy cosmology, but also discuss galaxy evolution, transient objects, solar system science, and galaxy cluster studies. We concentrate on synergies that require coordination in cadence or survey overlap, or would benefit from pixel-level co-processing that is beyond the scope of what is currently planned, rather than scientific programs that could be accomplished only at the catalog level without coordination in data processing or survey strategies. We provide two quantitative examples of scientific synergies: the decrease in photo-z errors (benefiting many science cases) when high-resolution Euclid data are used for LSST photo-z determination, and the resulting increase in weak-lensing signal-to-noise ratio from smaller photo-z errors. We briefly discuss other areas of coordination, including high-performance computing resources and calibration data. Finally, we address concerns about the loss of independence and potential cross-checks between the two missions and the potential consequences of not collaborating.

  9. Formal Education with LSST

    NASA Astrophysics Data System (ADS)

    Herrold, Ardis; Bauer, Amanda, Dr.; Peterson, J. Matt; Large Synoptic Survey Telescope Education and Public Outreach Team

    2018-01-01

    The Large Synoptic Survey Telescope will usher in a new age of astronomical data exploration for science educators and students. LSST data sets will be large, deep, and dynamic, and will establish a time-domain record extending over a decade. They will be used to provide engaging, relevant learning experiences. The EPO Team will develop online investigations using authentic LSST data that offer varying levels of challenge and depth by the start of telescope operations, slated to begin in 2022. The topics will cover common introductory astronomy concepts and will align with the four science domains of LSST: the Milky Way, the changing sky (transients), solar system (moving) objects, and dark matter and dark energy. Online Jupyter notebooks will make LSST data easy to access and analyze for students at the advanced middle school through college levels. Using online notebooks will circumvent common obstacles caused by firewalls, bandwidth issues, and the need to download software, as they will be accessible from any computer or tablet with internet access. Although the LSST EPO Jupyter notebooks are Python-based, no knowledge of programming will be required to use them. Each topical investigation will include teacher and student versions of the Jupyter notebooks, instructional videos, and access to a suite of support materials, including a forum, professional development training, and tutorial videos. The notebooks will contain embedded widgets to process data, eliminating the need for external spreadsheets and plotting software. Students will be able to analyze data using some of the modules already developed for professional astronomers. This will shorten the time needed to conduct investigations and shift the emphasis to understanding the underlying science themes, which is often lost with novice learners.

  10. Big Software for Big Data: Scaling Up Photometry for LSST (Abstract)

    NASA Astrophysics Data System (ADS)

    Rawls, M.

    2017-06-01

    (Abstract only) The Large Synoptic Survey Telescope (LSST) will capture mosaics of the sky every few nights, each containing more data than your computer's hard drive can store. As a result, the software to process these images is as critical to the science as the telescope and the camera. I discuss the algorithms and software being developed by the LSST Data Management team to handle such a large volume of data. All of our work is open source and available to the community. Once LSST comes online, our software will produce catalogs of objects and a stream of alerts. These will bring exciting new opportunities for follow-up observations and collaborations with LSST scientists.

  11. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    NASA Astrophysics Data System (ADS)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.
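
    The verification-plan structure the abstract enumerates (Verification Requirement, Success Criteria, Method, Level, Owner, with Activities grouped into Events) can be sketched as plain data classes. This is an illustrative model only; the names and example values below are assumptions, not the actual SysML model maintained in Enterprise Architect.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class VerificationActivity:
    name: str
    method: str  # e.g. "Test", "Analysis", "Inspection", "Demonstration"

@dataclass
class VerificationPlan:
    requirement: str
    verification_requirement: str
    success_criteria: str
    level: str   # e.g. "System", "Subsystem"
    owner: str
    activities: List[VerificationActivity] = field(default_factory=list)

@dataclass
class VerificationEvent:
    """A collection of activities that can be executed concurrently."""
    name: str
    activities: List[VerificationActivity] = field(default_factory=list)

# Hypothetical example: one requirement verified by two methods, both
# grouped into a single verification event.
plan = VerificationPlan(
    requirement="REQ-001",
    verification_requirement="Verify delivered image quality budget",
    success_criteria="PSF FWHM within the allocated budget",
    level="System",
    owner="Systems Engineering",
    activities=[VerificationActivity("IQ analysis", "Analysis"),
                VerificationActivity("On-sky IQ test", "Test")],
)
event = VerificationEvent("Commissioning Event 1", plan.activities)
print(len(event.activities))
```

    Each activity would then map onto a step in a Primavera P6 planned activity, giving the requirement-to-schedule traceability the paper describes.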

  12. Scientific Synergy between LSST and Euclid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rhodes, Jason; Nichol, Robert C.; Aubourg, Éric

    We report that Euclid and the Large Synoptic Survey Telescope (LSST) are poised to dramatically change the astronomy landscape early in the next decade. The combination of high-cadence, deep, wide-field optical photometry from LSST with high-resolution, wide-field optical photometry, and near-infrared photometry and spectroscopy from Euclid will be powerful for addressing a wide range of astrophysical questions. We explore Euclid/LSST synergy, ignoring the political issues associated with data access to focus on the scientific, technical, and financial benefits of coordination. We focus primarily on dark energy cosmology, but also discuss galaxy evolution, transient objects, solar system science, and galaxy cluster studies. We concentrate on synergies that require coordination in cadence or survey overlap, or would benefit from pixel-level co-processing that is beyond the scope of what is currently planned, rather than scientific programs that could be accomplished only at the catalog level without coordination in data processing or survey strategies. Also, we provide two quantitative examples of scientific synergies: the decrease in photo-z errors (benefiting many science cases) when high-resolution Euclid data are used for LSST photo-z determination, and the resulting increase in weak-lensing signal-to-noise ratio from smaller photo-z errors. We briefly discuss other areas of coordination, including high-performance computing resources and calibration data. Finally, we address concerns about the loss of independence and potential cross-checks between the two missions and the potential consequences of not collaborating.

  13. Scientific Synergy between LSST and Euclid

    DOE PAGES

    Rhodes, Jason; Nichol, Robert C.; Aubourg, Éric; ...

    2017-12-07

    We report that Euclid and the Large Synoptic Survey Telescope (LSST) are poised to dramatically change the astronomy landscape early in the next decade. The combination of high-cadence, deep, wide-field optical photometry from LSST with high-resolution, wide-field optical photometry, and near-infrared photometry and spectroscopy from Euclid will be powerful for addressing a wide range of astrophysical questions. We explore Euclid/LSST synergy, ignoring the political issues associated with data access to focus on the scientific, technical, and financial benefits of coordination. We focus primarily on dark energy cosmology, but also discuss galaxy evolution, transient objects, solar system science, and galaxy cluster studies. We concentrate on synergies that require coordination in cadence or survey overlap, or would benefit from pixel-level co-processing that is beyond the scope of what is currently planned, rather than scientific programs that could be accomplished only at the catalog level without coordination in data processing or survey strategies. Also, we provide two quantitative examples of scientific synergies: the decrease in photo-z errors (benefiting many science cases) when high-resolution Euclid data are used for LSST photo-z determination, and the resulting increase in weak-lensing signal-to-noise ratio from smaller photo-z errors. We briefly discuss other areas of coordination, including high-performance computing resources and calibration data. Finally, we address concerns about the loss of independence and potential cross-checks between the two missions and the potential consequences of not collaborating.

  14. Investigating the Bright End of LSST Photometry

    NASA Astrophysics Data System (ADS)

    Ojala, Elle; Pepper, Joshua; LSST Collaboration

    2018-01-01

    The Large Synoptic Survey Telescope (LSST) will begin operations in 2022, conducting a wide-field, synoptic multiband survey of the southern sky. Some fraction of objects at the bright end of the magnitude regime observed by LSST will overlap with other wide-sky surveys, allowing for calibration and cross-checking between surveys. Because LSST is optimized for observations of very faint objects, much of this data overlap will consist of saturated images. This project provides the first in-depth analysis of saturation in LSST images. Using the PhoSim package to create simulated LSST images, we evaluate the saturation properties of several types of stars to determine the brightness limitations of LSST. We also collect metadata from many wide-field photometric surveys to provide cross-survey accounting and comparison. Additionally, we evaluate the accuracy of the PhoSim modeling parameters to determine the reliability of the software. These efforts will allow us to determine the expected usable data overlap between bright-end LSST images and faint-end images in other wide-sky surveys. Our next steps are developing methods to extract photometry from saturated images. This material is based upon work supported in part by the National Science Foundation through Cooperative Agreement 1258333 managed by the Association of Universities for Research in Astronomy (AURA), and the Department of Energy under Contract No. DE-AC02-76SF00515 with the SLAC National Accelerator Laboratory. Additional LSST funding comes from private donations, grants to universities, and in-kind support from LSSTC Institutional Members. Thanks to NSF grant PHY-135195 and the 2017 LSSTC Grant Award #2017-UG06 for making this project possible.

  15. Ground/bonding for Large Space System Technology (LSST). [of metallic and nonmetallic structures

    NASA Technical Reports Server (NTRS)

    Dunbar, W. G.

    1980-01-01

    The influence of the environment and of extravehicular-activity/remote-assembly operations on the grounding and bonding of metallic and nonmetallic structures is discussed. A grounding and bonding philosophy is outlined for the electrical systems and electronic compartments which contain high-voltage, high-power electrical and electronic equipment. The influence of plasma and particulates on the system is analyzed, and the effects of static buildup on the spacecraft electrical system are discussed. Conceptual grounding/bonding designs are assessed for their capability to withstand high-current arcs to ground from a high-voltage conductor and electromagnetic interference. Also described are the extravehicular activities required of the space station and/or supply spacecraft crew members to join and inspect the ground system using manual or remote assembly construction.

  16. Cables and connectors for Large Space System Technology (LSST)

    NASA Technical Reports Server (NTRS)

    Dunbar, W. G.

    1980-01-01

    The effect of the environment and extravehicular activity/remote assembly operations on the cables and connectors for spacecraft with metallic and/or nonmetallic structures was examined. Cable and connector philosophy was outlined for the electrical systems and electronic compartments which contain high-voltage, high-power electrical and electronic equipment. The influence of plasma and particulates on the system is analyzed, and the effect of static buildup on the spacecraft electrical system is discussed. Conceptual cable and connector designs are assessed for their capability to withstand high current and high voltage without danger of arcs and electromagnetic interference. The extravehicular activities required of the space station and/or supply spacecraft crew members to join and inspect the electrical system, using manual or remote assembly construction, are also considered.

  17. Ground/bonding for Large Space System Technology (LSST)

    NASA Astrophysics Data System (ADS)

    Dunbar, W. G.

    1980-04-01

    The influence of the environment and of extravehicular-activity/remote-assembly operations on the grounding and bonding of metallic and nonmetallic structures is discussed. A grounding and bonding philosophy is outlined for the electrical systems and electronic compartments which contain high-voltage, high-power electrical and electronic equipment. The influence of plasma and particulates on the system is analyzed, and the effects of static buildup on the spacecraft electrical system are discussed. Conceptual grounding/bonding designs are assessed for their capability to withstand high-current arcs to ground from a high-voltage conductor and electromagnetic interference. Also described are the extravehicular activities required of the space station and/or supply spacecraft crew members to join and inspect the ground system using manual or remote assembly construction.

  18. Probing the Solar System with LSST

    NASA Astrophysics Data System (ADS)

    Harris, A.; Ivezic, Z.; Juric, M.; Lupton, R.; Connolly, A.; Kubica, J.; Moore, A.; Bowell, E.; Bernstein, G.; Cook, K.; Stubbs, C.

    2005-12-01

    LSST will catalog small Potentially Hazardous Asteroids (PHAs), survey the main belt asteroid (MBA) population to extraordinarily small sizes, discover comets far from the sun where their nuclear properties can be discerned without coma, and survey the Centaur and Trans-Neptunian Object (TNO) populations. The present planned observing strategy is to ``visit'' each field (9.6 deg²) with two back-to-back exposures of ˜ 15 sec, reaching to at least V magnitude 24.5. An intra-night revisit time of order half an hour will distinguish stationary transients from even very distant ( ˜ 70 AU) solar system bodies. In order to link observations and determine orbits, each sky area will be visited several times during a month, spaced by about a week. This cadence will result in orbital parameters for several million MBAs and about 20,000 TNOs, with light curves and colorimetry for the brighter 10% or so of each population. Compared to the data currently available, this would represent a factor of 10 to 100 increase in the numbers of orbits, colors, and variability measurements for the two classes of objects. The LSST MBA and TNO samples will enable detailed studies of the dynamical and chemical history of the solar system. The increase in data volume associated with LSST asteroid science will present many computational challenges in extracting tracks and orbits of asteroids from the underlying clutter. Tree-based algorithms for multihypothesis testing of asteroid tracks can help solve these challenges by providing the necessary 1000-fold speed-ups over current approaches while recovering 95% of the underlying moving objects.
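
    The linking step the abstract alludes to can be illustrated with a deliberately naive sketch: pair same-night detections whose implied angular rate is plausible for a solar-system object. This is not the tree-based multihypothesis code the authors describe; the rate cutoff, field values, and revisit window are assumptions chosen for illustration.

```python
import math

MAX_RATE_DEG_PER_DAY = 10.0   # assumed upper bound on apparent motion

def angular_sep_deg(ra1, dec1, ra2, dec2):
    """Small-angle separation in degrees (adequate for short intra-night arcs)."""
    dra = (ra2 - ra1) * math.cos(math.radians(0.5 * (dec1 + dec2)))
    ddec = dec2 - dec1
    return math.hypot(dra, ddec)

def link_pairs(detections):
    """detections: list of (mjd, ra_deg, dec_deg). Returns linked index pairs."""
    pairs = []
    for i in range(len(detections)):
        for j in range(i + 1, len(detections)):
            t1, ra1, dec1 = detections[i]
            t2, ra2, dec2 = detections[j]
            dt = abs(t2 - t1)
            if 0 < dt < 0.1:  # same-night revisit window (< ~2.4 h)
                rate = angular_sep_deg(ra1, dec1, ra2, dec2) / dt
                if rate <= MAX_RATE_DEG_PER_DAY:
                    pairs.append((i, j))
    return pairs

dets = [(57000.00, 150.000, -20.000),   # first visit
        (57000.02, 150.010, -20.002),   # ~30 min later, slow mover -> linked
        (57000.02, 152.000, -20.000)]   # implied motion too fast -> rejected
print(link_pairs(dets))  # → [(0, 1)]
```

    The quadratic pair loop is exactly what the tree-based algorithms avoid: a kd-tree over positions prunes candidate partners before any rate test, which is where the quoted 1000-fold speed-up comes from.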

  19. An automated system to measure the quantum efficiency of CCDs for astronomy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coles, R.; Chiang, J.; Cinabro, D.

    We describe a system to measure the quantum efficiency, over the wavelength range 300 nm to 1100 nm, of the 40 × 40 mm n-channel CCD sensors to be used in the construction of the 3.2 gigapixel LSST focal plane. The technique uses a series of instruments to create a very uniform flux of photons of controllable intensity, in the wavelength range of interest, across the face of the sensor. This allows the absolute quantum efficiency to be measured with an accuracy in the 1% range. Finally, this system will be part of a production facility at Brookhaven National Lab for the basic component of the LSST camera.
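
    The arithmetic behind such a measurement is simple: QE(λ) is the ratio of electrons collected by the CCD to photons incident on it, with the incident photon count derived from a calibrated monitor photodiode. The sketch below shows that conversion; all numerical values are illustrative assumptions, not LSST test-stand data.

```python
PLANCK_H = 6.62607015e-34   # Planck constant, J s
SPEED_C = 2.99792458e8      # speed of light, m/s

def photons_incident(diode_current_a, exposure_s, responsivity_a_per_w,
                     wavelength_m):
    """Photon count implied by a calibrated photodiode's photocurrent."""
    power_w = diode_current_a / responsivity_a_per_w
    energy_per_photon_j = PLANCK_H * SPEED_C / wavelength_m
    return power_w * exposure_s / energy_per_photon_j

def quantum_efficiency(signal_adu, gain_e_per_adu, n_photons):
    """QE = collected electrons / incident photons."""
    return signal_adu * gain_e_per_adu / n_photons

# Hypothetical 500 nm measurement: 1 nA of diode current for 10 s at a
# responsivity of 0.2 A/W implies ~1.26e11 incident photons.
n_ph = photons_incident(1e-9, 10.0, 0.2, 500e-9)
qe = quantum_efficiency(signal_adu=5.6e10, gain_e_per_adu=2.0, n_photons=n_ph)
print(f"{n_ph:.3e} photons, QE = {qe:.2f}")
```

    Reaching 1% absolute accuracy is then a matter of controlling the photodiode calibration, flat-field uniformity, and gain measurement that enter this ratio.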

  20. An automated system to measure the quantum efficiency of CCDs for astronomy

    DOE PAGES

    Coles, R.; Chiang, J.; Cinabro, D.; ...

    2017-04-18

    We describe a system to measure the quantum efficiency, over the wavelength range 300 nm to 1100 nm, of the 40 × 40 mm n-channel CCD sensors to be used in the construction of the 3.2 gigapixel LSST focal plane. The technique uses a series of instruments to create a very uniform flux of photons of controllable intensity, in the wavelength range of interest, across the face of the sensor. This allows the absolute quantum efficiency to be measured with an accuracy in the 1% range. Finally, this system will be part of a production facility at Brookhaven National Lab for the basic component of the LSST camera.

  1. Detection of Double White Dwarf Binaries with Gaia, LSST and eLISA

    NASA Astrophysics Data System (ADS)

    Korol, V.; Rossi, E. M.; Groot, P. J.

    2017-03-01

    According to simulations, around 10⁸ double degenerate white dwarf binaries (DWDs) are expected to be present in the Milky Way. Due to their intrinsic faintness, the detection of these systems is a challenge, and the total number of detected sources so far amounts to only a few tens. This will change in the next two decades with the advent of Gaia, the LSST, and eLISA. We present an estimate of how many compact DWDs with orbital periods of less than a few hours we will be able to detect (1) through electromagnetic radiation with Gaia and the LSST and (2) through gravitational wave radiation with eLISA. We find that the sample of simultaneous electromagnetic and gravitational wave detections is expected to be substantial, and will provide us with a powerful tool for probing white dwarf astrophysics and the structure of the Milky Way, ushering in the era of multi-messenger astronomy for these sources.
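
    The period cut in the abstract follows from the fact that a circular binary radiates gravitational waves at twice its orbital frequency, f_GW = 2/P_orb, so only systems with periods of hours or less fall into the eLISA/LISA band. The band edges below (roughly 10⁻⁴ to 1 Hz) are an approximation; the real sensitivity curve is more nuanced.

```python
def gw_frequency_hz(orbital_period_s):
    """Dominant GW frequency of a circular binary: twice the orbital frequency."""
    return 2.0 / orbital_period_s

def in_lisa_band(f_hz, lo=1e-4, hi=1.0):
    """Crude check against approximate eLISA/LISA band edges."""
    return lo <= f_hz <= hi

for period_hours in (0.5, 1.0, 5.0, 24.0):
    f = gw_frequency_hz(period_hours * 3600.0)
    print(f"P = {period_hours:4.1f} h -> f_GW = {f:.2e} Hz, "
          f"in band: {in_lisa_band(f)}")
```

    A 1-hour DWD radiates near 0.56 mHz, comfortably in band, while a 1-day binary at ~23 μHz falls below it, which is why the electromagnetic/gravitational-wave overlap sample is restricted to the shortest-period systems.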

  2. The Large Synoptic Survey Telescope

    NASA Astrophysics Data System (ADS)

    Axelrod, T. S.

    2006-07-01

    The Large Synoptic Survey Telescope (LSST) is an 8.4 meter telescope with a 10 square degree field of view and a 3 Gigapixel imager, planned to be on-sky in 2012. It is a dedicated all-sky survey instrument with several complementary science missions. These include understanding dark energy through weak lensing and supernovae; exploring transients and variable objects; creating and maintaining a solar system map, with particular emphasis on potentially hazardous objects; and increasing the precision with which we understand the structure of the Milky Way. The instrument operates continuously at a rapid cadence, repetitively scanning the visible sky every few nights. The data flow rates from LSST are larger than those from current surveys by roughly a factor of 1000: a few GB/night are typical today, while LSST will deliver a few TB/night. From a computing hardware perspective, this factor of 1000 can be dealt with easily in 2012. The major issues in designing the LSST data management system arise from the fact that the number of people available to critically examine the data will not grow from current levels. This has a number of implications. For example, every large imaging survey today is resigned to the fact that its image reduction pipelines fail at some significant rate. Many of these failures are dealt with by rerunning the reduction pipeline under human supervision, with carefully ``tweaked'' parameters to deal with the original problem. For LSST, this will no longer be feasible. The problem is compounded by the fact that the processing must of necessity occur on clusters with large numbers of CPUs and disk drives, with some components connected by long-haul networks. This inevitably results in a significant rate of hardware component failures, which can easily lead to further software failures. Both hardware and software failures must be seen as a routine fact of life rather than rare exceptions to normality.

  3. LSST camera grid structure made out of ceramic composite material, HB-Cesic

    NASA Astrophysics Data System (ADS)

    Kroedel, Matthias R.; Langton, J. Bryan

    2016-08-01

    In this paper we present the ceramic design and fabrication of the camera grid structure, which uses the unique manufacturing features of the HB-Cesic technology together with a dedicated metrology device in order to ensure the challenging flatness requirement of 4 microns over the full array.

  4. The Effects of Commercial Airline Traffic on LSST Observing Efficiency

    NASA Astrophysics Data System (ADS)

    Gibson, Rose; Claver, Charles; Stubbs, Christopher

    2016-01-01

    The Large Synoptic Survey Telescope (LSST) is a ten-year survey that will map the southern sky in six different filters 800 times before the end of its run. In this paper, we explore the primary effect of airline traffic on scheduling the LSST observations, in addition to the secondary effect of the condensation trails, or contrails, created by the presence of aircraft. The large national investment being made in LSST implies that even small improvements in observing efficiency through aircraft and contrail avoidance can result in a significant improvement in the quality of the survey and its science. We have used the Automatic Dependent Surveillance-Broadcast (ADS-B) signals received from commercial aircraft to monitor and record activity over the LSST site. We installed an ADS-B ground station on Cerro Pachón, Chile, consisting of a 1090 MHz antenna on the Andes Lidar Observatory feeding an RTL2832U software-defined radio. We used dump1090 to convert the received ADS-B telemetry into Basestation format, and found that during the busiest time of the night only 4 signals were being received each minute on average, which will have a very small direct effect, if any, on the LSST observing scheduler. In future studies we will examine the effects of contrails on LSST observations. Gibson was supported by the NOAO/KPNO Research Experiences for Undergraduates (REU) Program, which is funded by the National Science Foundation Research Experience for Undergraduates Program (AST-1262829).
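
    A per-minute traffic count of the kind quoted above can be sketched from dump1090's Basestation-style CSV output. The field positions below (hex ID in field 4, generated time "HH:MM:SS.mmm" in field 7) follow the commonly documented SBS-1/BaseStation layout and should be treated as an assumption; the sample lines are fabricated for illustration.

```python
from collections import defaultdict

def aircraft_per_minute(basestation_lines):
    """Count distinct aircraft hex IDs seen in each 'HH:MM' minute."""
    seen = defaultdict(set)   # "HH:MM" -> set of ICAO hex IDs
    for line in basestation_lines:
        fields = line.strip().split(",")
        if len(fields) > 7 and fields[0] == "MSG":
            hexident, time_gen = fields[4], fields[7]
            seen[time_gen[:5]].add(hexident)
    return {minute: len(ids) for minute, ids in sorted(seen.items())}

sample = [
    "MSG,3,1,1,A1B2C3,1,2016/01/01,03:10:05.000,2016/01/01,03:10:05.000",
    "MSG,3,1,1,A1B2C3,1,2016/01/01,03:10:25.000,2016/01/01,03:10:25.000",
    "MSG,3,1,1,D4E5F6,1,2016/01/01,03:10:40.000,2016/01/01,03:10:40.000",
    "MSG,3,1,1,A1B2C3,1,2016/01/01,03:11:10.000,2016/01/01,03:11:10.000",
]
print(aircraft_per_minute(sample))  # → {'03:10': 2, '03:11': 1}
```

    Counting distinct aircraft rather than raw messages matters here, since one aircraft emits many ADS-B messages per minute.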

  5. Synthesizing Planetary Nebulae for Large Scale Surveys: Predictions for LSST

    NASA Astrophysics Data System (ADS)

    Vejar, George; Montez, Rodolfo; Morris, Margaret; Stassun, Keivan G.

    2017-01-01

    The short-lived planetary nebula (PN) phase of stellar evolution is characterized by a hot central star and a bright, ionized nebula. The PN phase forms after a low- to intermediate-mass star stops burning hydrogen in its core, ascends the asymptotic giant branch, and expels its outer layers of material into space. The exposed hot core produces ionizing UV photons and a fast stellar wind that sweeps up the surrounding material into a dense shell of ionized gas known as the PN. This fleeting stage of stellar evolution provides insight into rare atomic processes and the nucleosynthesis of elements in stars. The inherent brightness of PNe allows them to be used to obtain distances to nearby stellar systems via the PN luminosity function and as kinematic tracers in other galaxies. However, the prevalence of non-spherical morphologies among PNe challenges the current paradigm of PN formation. The role of binarity in shaping the PN has recently gained traction, ultimately suggesting that single stars might not form PNe. Searches for binary central stars have increased the known binary fraction, but the current PN sample is incomplete. Future wide-field, multi-epoch surveys like the Large Synoptic Survey Telescope (LSST) can impact studies of PNe and improve our understanding of their origin and formation. Using a suite of Cloudy radiative transfer calculations, we study the detectability of PNe in the proposed LSST multiband observations. We compare our synthetic PNe to common sources (stars, galaxies, quasars) and establish discrimination techniques. Finally, we discuss follow-up strategies to verify new LSST-discovered PNe and use limiting distances to estimate the potential sample of PNe enabled by LSST.

  6. Precise Time Delays from Strongly Gravitationally Lensed Type Ia Supernovae with Chromatically Microlensed Images

    NASA Astrophysics Data System (ADS)

    Goldstein, Daniel A.; Nugent, Peter E.; Kasen, Daniel N.; Collett, Thomas E.

    2018-03-01

    Time delays between the multiple images of strongly gravitationally lensed Type Ia supernovae (glSNe Ia) have the potential to deliver precise cosmological constraints, but the effects of microlensing on time delay extraction have not been studied in detail. Here we quantify the effect of microlensing on the glSN Ia yield of the Large Synoptic Survey Telescope (LSST) and the effect of microlensing on the precision and accuracy of time delays that can be extracted from LSST glSNe Ia. Microlensing has a negligible effect on the LSST glSN Ia yield, but it can be increased by a factor of ∼2 over previous predictions to 930 systems using a novel photometric identification technique based on spectral template fitting. Crucially, the microlensing of glSNe Ia is achromatic until three rest-frame weeks after the explosion, making the early-time color curves microlensing-insensitive time delay indicators. By fitting simulated flux and color observations of microlensed glSNe Ia with their underlying, unlensed spectral templates, we forecast the distribution of absolute time delay error due to microlensing for LSST, which is unbiased at the sub-percent level and peaked at 1% for color curve observations in the achromatic phase, while for light-curve observations it is comparable to state-of-the-art mass modeling uncertainties (4%). About 70% of LSST glSN Ia images should be discovered during the achromatic phase, indicating that microlensing time delay uncertainties can be minimized if prompt multicolor follow-up observations are obtained. Accounting for microlensing, the 1–2 day time delay on the recently discovered glSN Ia iPTF16geu can be measured to 40% precision, limiting its cosmological utility.

  7. Precise Time Delays from Strongly Gravitationally Lensed Type Ia Supernovae with Chromatically Microlensed Images

    DOE PAGES

    Goldstein, Daniel A.; Nugent, Peter E.; Kasen, Daniel N.; ...

    2018-03-01

    Time delays between the multiple images of strongly gravitationally lensed Type Ia supernovae (glSNe Ia) have the potential to deliver precise cosmological constraints, but the effects of microlensing on time delay extraction have not been studied in detail. Here we quantify the effect of microlensing on the glSN Ia yield of the Large Synoptic Survey Telescope (LSST) and the effect of microlensing on the precision and accuracy of time delays that can be extracted from LSST glSNe Ia. Microlensing has a negligible effect on the LSST glSN Ia yield, but it can be increased by a factor of ~2 over previous predictions to 930 systems using a novel photometric identification technique based on spectral template fitting. Crucially, the microlensing of glSNe Ia is achromatic until three rest-frame weeks after the explosion, making the early-time color curves microlensing-insensitive time delay indicators. By fitting simulated flux and color observations of microlensed glSNe Ia with their underlying, unlensed spectral templates, we forecast the distribution of absolute time delay error due to microlensing for LSST, which is unbiased at the sub-percent level and peaked at 1% for color curve observations in the achromatic phase, while for light-curve observations it is comparable to state-of-the-art mass modeling uncertainties (4%). About 70% of LSST glSN Ia images should be discovered during the achromatic phase, indicating that microlensing time delay uncertainties can be minimized if prompt multicolor follow-up observations are obtained. Lastly, accounting for microlensing, the 1-2 day time delay on the recently discovered glSN Ia iPTF16geu can be measured to 40% precision, limiting its cosmological utility.

  8. Precise Time Delays from Strongly Gravitationally Lensed Type Ia Supernovae with Chromatically Microlensed Images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldstein, Daniel A.; Nugent, Peter E.; Kasen, Daniel N.

    Time delays between the multiple images of strongly gravitationally lensed Type Ia supernovae (glSNe Ia) have the potential to deliver precise cosmological constraints, but the effects of microlensing on time delay extraction have not been studied in detail. Here we quantify the effect of microlensing on the glSN Ia yield of the Large Synoptic Survey Telescope (LSST) and the effect of microlensing on the precision and accuracy of time delays that can be extracted from LSST glSNe Ia. Microlensing has a negligible effect on the LSST glSN Ia yield, but it can be increased by a factor of ~2 over previous predictions to 930 systems using a novel photometric identification technique based on spectral template fitting. Crucially, the microlensing of glSNe Ia is achromatic until three rest-frame weeks after the explosion, making the early-time color curves microlensing-insensitive time delay indicators. By fitting simulated flux and color observations of microlensed glSNe Ia with their underlying, unlensed spectral templates, we forecast the distribution of absolute time delay error due to microlensing for LSST, which is unbiased at the sub-percent level and peaked at 1% for color curve observations in the achromatic phase, while for light-curve observations it is comparable to state-of-the-art mass modeling uncertainties (4%). About 70% of LSST glSN Ia images should be discovered during the achromatic phase, indicating that microlensing time delay uncertainties can be minimized if prompt multicolor follow-up observations are obtained. Lastly, accounting for microlensing, the 1-2 day time delay on the recently discovered glSN Ia iPTF16geu can be measured to 40% precision, limiting its cosmological utility.

  9. TRANSITING PLANETS WITH LSST. II. PERIOD DETECTION OF PLANETS ORBITING 1 M⊙ HOSTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacklin, Savannah; Lund, Michael B.; Stassun, Keivan G.

    2015-07-15

    The Large Synoptic Survey Telescope (LSST) will photometrically monitor ∼10⁹ stars for 10 years. The resulting light curves can be used to detect transiting exoplanets. In particular, as demonstrated by Lund et al., LSST will probe stellar populations currently undersampled in most exoplanet transit surveys, including out to extragalactic distances. In this paper we test the efficiency of the box-fitting least-squares (BLS) algorithm for accurately recovering the periods of transiting exoplanets using simulated LSST data. We model planets with a range of radii orbiting a solar-mass star at a distance of 7 kpc, with orbital periods ranging from 0.5 to 20 days. We find that standard-cadence LSST observations will be able to reliably recover the periods of Hot Jupiters with periods shorter than ∼3 days; however, it will remain a challenge to confidently distinguish these transiting planets from false positives. At the same time, we find that the LSST deep-drilling cadence is extremely powerful: the BLS algorithm successfully recovers at least 30% of sub-Saturn-size exoplanets with orbital periods as long as 20 days, and a simple BLS power criterion robustly distinguishes ∼98% of these from photometric (i.e., statistical) false positives.
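
    The box-fitting idea can be sketched in a few lines: phase-fold the light curve at each trial period and score how well a single box-shaped dip explains the folded data. This is a deliberately simplified stand-in for the full BLS algorithm (fixed box width, no phase wrap-around, no per-point weights), not the implementation used in the paper:

```python
import numpy as np

def bls_power(t, y, periods, duration_frac=0.05):
    """Score each trial period by phase-folding the light curve and finding
    the contiguous box of fixed width that explains the most variance."""
    power = np.empty(len(periods))
    resid = y - y.mean()
    n = len(y)
    w = max(1, int(duration_frac * n))      # box width in data points
    r = w / n
    for i, p in enumerate(periods):
        order = np.argsort((t % p) / p)     # sort points by orbital phase
        yf = resid[order]
        csum = np.concatenate(([0.0], np.cumsum(yf)))
        box = csum[w:] - csum[:-w]          # sum inside every contiguous box
        power[i] = np.max(box ** 2) / (r * (1.0 - r)) / n
    return power
```

    Applied to a synthetic light curve with a 1% transit every 3 days, the peak of the power spectrum lands at the injected period.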

  10. Photometric Redshifts with the LSST: Evaluating Survey Observing Strategies

    NASA Astrophysics Data System (ADS)

    Graham, Melissa L.; Connolly, Andrew J.; Ivezić, Željko; Schmidt, Samuel J.; Jones, R. Lynne; Jurić, Mario; Daniel, Scott F.; Yoachim, Peter

    2018-01-01

    In this paper we present and characterize a nearest-neighbors color-matching photometric redshift estimator that features a direct relationship between the precision and accuracy of the input magnitudes and the output photometric redshifts. This aspect makes our estimator an ideal tool for evaluating the impact of changes to LSST survey parameters that affect the measurement errors of the photometry, which is the main motivation of our work (i.e., it is not intended to provide the “best” photometric redshifts for LSST data). We show how the photometric redshifts will improve with time over the 10 year LSST survey and confirm that the nominal distribution of visits per filter provides the most accurate photo-z results. The LSST survey strategy naturally produces observations over a range of airmass, which offers the opportunity to use an SED- and z-dependent atmospheric effect on the observed photometry as a color-independent redshift indicator. We show that measuring this airmass effect and including it as a prior has the potential to improve the photometric redshifts and can ameliorate extreme outliers, but that it will only be adequately measured for the brightest galaxies, which limits its overall impact on LSST photometric redshifts. We furthermore demonstrate how this airmass effect can induce a bias in the photo-z results, and caution against survey strategies that prioritize high-airmass observations for the purpose of improving this prior. Ultimately, we intend for this work to serve as a guide for the expectations and preparations of the LSST science community with regard to the minimum quality of photo-z as the survey progresses.
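
    A bare-bones version of a nearest-neighbors color-matching estimator can be written as below. This sketch ignores the propagation of magnitude errors that is central to the paper's estimator, and the function name and fixed `k` are invented for illustration:

```python
import numpy as np

def knn_photoz(train_colors, train_z, test_colors, k=10):
    """Minimal color-matching photo-z: for each test galaxy, average the
    spectroscopic redshifts of its k nearest neighbors in color space."""
    train_colors = np.asarray(train_colors, dtype=float)
    train_z = np.asarray(train_z, dtype=float)
    zs = np.empty(len(test_colors))
    for i, c in enumerate(np.asarray(test_colors, dtype=float)):
        d2 = np.sum((train_colors - c) ** 2, axis=1)  # squared color distance
        nearest = np.argpartition(d2, k)[:k]          # indices of k closest
        zs[i] = train_z[nearest].mean()
    return zs
```

    On a synthetic training set where color tracks redshift monotonically, the estimator recovers the test redshifts to a few percent.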

  11. LSST Resources for the Community

    NASA Astrophysics Data System (ADS)

    Jones, R. Lynne

    2011-01-01

    LSST will generate 100 petabytes of images and 20 petabytes of catalogs, covering 18,000-20,000 square degrees of area sampled every few days, over a total of ten years -- all publicly available and exquisitely calibrated. The primary access to this data will be through Data Access Centers (DACs). DACs will provide access to catalogs of sources (single detections from individual images) and objects (associations of sources from multiple images). Simple user interfaces or direct SQL queries at the DAC can return user-specified portions of data from catalogs or images. More complex manipulations of the data, such as calculating multi-point correlation functions or creating alternative photo-z measurements on terabyte-scale data, can be completed with the DAC's own resources. Even more data-intensive computations requiring access to large numbers of image pixels at the petabyte scale could also be conducted at the DAC, using compute resources allocated in a similar manner to a TAC. DAC resources will be available to all individuals in member countries or institutes and LSST science collaborations. DACs will also assist investigators with requests for allocations at national facilities such as the Petascale Computing Facility, TeraGrid, and Open Science Grid. Using data on this scale requires new approaches to accessibility and analysis which are being developed through interactions with the LSST Science Collaborations. We are producing simulated images (as might be acquired by LSST) based on models of the universe and generating catalogs from these images (as well as from the base model) using the LSST data management framework in a series of data challenges. The resulting images and catalogs are being made available to the science collaborations to verify the algorithms and develop user interfaces. All LSST software is open source and available online, including preliminary catalog formats. We encourage feedback from the community.
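
    The kind of user-specified source/object catalog query described above can be made concrete with a toy relational schema. The table and column names here (`Object`, `Source`, `rMag`, etc.) are invented for illustration and are not the actual LSST baseline schema; the sketch uses an in-memory SQLite database as a stand-in for a DAC's database engine:

```python
import sqlite3

# Hypothetical, minimal stand-ins for DAC-style Object/Source tables;
# the real LSST schema is far richer than this illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Object (objectId INTEGER PRIMARY KEY, ra REAL, decl REAL, rMag REAL);
CREATE TABLE Source (sourceId INTEGER PRIMARY KEY, objectId INTEGER,
                     mjd REAL, rFlux REAL,
                     FOREIGN KEY (objectId) REFERENCES Object(objectId));
""")
conn.executemany("INSERT INTO Object VALUES (?,?,?,?)",
                 [(1, 10.1, -5.2, 21.3), (2, 10.4, -5.0, 19.8),
                  (3, 200.0, 40.0, 22.5)])
conn.executemany("INSERT INTO Source VALUES (?,?,?,?)",
                 [(100, 1, 59000.1, 5.1), (101, 1, 59003.2, 5.3),
                  (102, 2, 59000.1, 20.0)])

# A user query in the spirit of the abstract: all detections (sources)
# of bright objects inside a small rectangular sky region.
rows = conn.execute("""
SELECT o.objectId, s.mjd, s.rFlux
FROM Object o JOIN Source s USING (objectId)
WHERE o.ra BETWEEN 10.0 AND 10.5
  AND o.decl BETWEEN -5.5 AND -4.5
  AND o.rMag < 21.0
ORDER BY s.mjd
""").fetchall()
# only object 2 passes the region and magnitude cuts
```

    The same pattern (region cut on the object table, join to the per-detection source table) is what a time-series query against a DAC catalog would look like, whatever the engine behind it.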

  12. Measuring the Growth Rate of Structure with Type IA Supernovae from LSST

    NASA Astrophysics Data System (ADS)

    Howlett, Cullan; Robotham, Aaron S. G.; Lagos, Claudia D. P.; Kim, Alex G.

    2017-10-01

    We investigate the peculiar motions of galaxies up to z = 0.5 using Type Ia supernovae (SNe Ia) from the Large Synoptic Survey Telescope (LSST) and predict the subsequent constraints on the growth rate of structure. We consider two cases. Our first is based on measurements of the volumetric SNe Ia rate and assumes we can obtain spectroscopic redshifts and light curves for varying fractions of objects that are detected pre-peak luminosity by LSST (some of which may be obtained by LSST itself, and others that would require additional follow-up observations). We find that these measurements could produce growth rate constraints at z < 0.5 that significantly outperform those found using Redshift Space Distortions (RSD) with DESI or 4MOST, even though there are ~4× fewer objects. For our second case, we use semi-analytic simulations and a prescription for the SNe Ia rate as a function of stellar mass and star-formation rate to predict the number of LSST SNe Ia whose host redshifts may already have been obtained with the Taipan+WALLABY surveys or with a future multi-object spectroscopic survey. We find ~18,000 and ~160,000 SNe Ia with host redshifts for these cases, respectively. While this is only a fraction of the total LSST-detected SNe Ia, they could be used to significantly augment and improve the growth rate constraints compared to RSD alone. Ultimately, we find that combining LSST SNe Ia with large numbers of galaxy redshifts will provide the most powerful probe of large-scale gravity in the z < 0.5 regime over the coming decades.

  13. Strong Gravitational Lensing with LSST

    NASA Astrophysics Data System (ADS)

    Marshall, Philip J.; Bradac, M.; Chartas, G.; Dobler, G.; Eliasdottir, A.; Falco, E.; Fassnacht, C. D.; Jee, M. J.; Keeton, C. R.; Oguri, M.; Tyson, J. A.; LSST Strong Lensing Science Collaboration

    2010-01-01

    LSST will find more strong gravitational lensing events than any other survey preceding it, and will monitor them all at a cadence of a few days to a few weeks. We can expect the biggest advances in strong lensing science made with LSST to be in those areas that benefit most from the large volume, and the high accuracy multi-filter time series: studies of, and using, several thousand lensed quasars and several hundred supernovae. However, the high quality imaging will allow us to detect and measure large numbers of background galaxies multiply-imaged by galaxies, groups and clusters. In this poster we give an overview of the strong lensing science enabled by LSST, and highlight the particular associated technical challenges that will have to be faced when working with the survey.

  14. Expanding the user base beyond HEP for the Ganga distributed analysis user interface

    NASA Astrophysics Data System (ADS)

    Currie, R.; Egede, U.; Richards, A.; Slater, M.; Williams, M.

    2017-10-01

    This document presents the results of recent developments within the Ganga[1] project to support users from new communities outside of HEP. In particular, I examine the case of users from the Large Synoptic Survey Telescope (LSST) group looking to use resources provided by the UK-based GridPP[2][3] DIRAC[4][5] instance. An example use case is work performed with users from the LSST Virtual Organisation (VO) to distribute the workflow used for galaxy shape identification analyses. This work highlighted some LSST-specific challenges that could be solved well by common tools within the HEP community. As a result of this work, the LSST community was able to take advantage of GridPP resources to perform large computing tasks within the UK.

  15. The Large Synoptic Survey Telescope OCS and TCS models

    NASA Astrophysics Data System (ADS)

    Schumacher, German; Delgado, Francisco

    2010-07-01

    The Large Synoptic Survey Telescope (LSST) is a project envisioned as a system of systems with demanding science, technical, and operational requirements, that must perform as a fully integrated unit. The design and implementation of such a system poses big engineering challenges when performing requirements analysis, detailed interface definitions, operational modes and control strategy studies. The OMG System Modeling Language (SysML) has been selected as the framework for the systems engineering analysis and documentation for the LSST. Models for the overall system architecture and different observatory subsystems have been built describing requirements, structure, interfaces and behavior. In this paper we show the models for the Observatory Control System (OCS) and the Telescope Control System (TCS), and how this methodology has helped in the clarification of the design and requirements. In one common language, the relationships of the OCS, TCS, Camera and Data management subsystems are captured with models of the structure, behavior, requirements and the traceability between them.

  16. Fast force actuators for LSST primary/tertiary mirror

    NASA Astrophysics Data System (ADS)

    Hileman, Edward; Warner, Michael; Wiecha, Oliver

    2010-07-01

    The very short slew times and resulting high inertial loads imposed upon the Large Synoptic Survey Telescope (LSST) create new challenges to the primary mirror support actuators. Traditionally, large borosilicate mirrors are supported by pneumatic systems, as is the case for the LSST. These force based actuators bear the weight of the mirror and provide active figure correction, but do not define the mirror position. A set of six locating actuators (hardpoints) arranged in a hexapod fashion serve to locate the mirror. The stringent dynamic requirements demand that the force actuators compensate in real time for dynamic forces on the hardpoints during slewing to prevent excessive hardpoint loads. The support actuators must also hold the prescribed forces accurately during tracking to maintain acceptable mirror figure. To meet these requirements, candidate pneumatic cylinders incorporating force feedback control and high speed servo valves are being tested using custom instrumentation with automatic data recording. Comparative charts are produced showing details of friction, hysteresis cycles, operating bandwidth, and temperature dependency. Extremely low power actuator controllers are being developed to avoid heat dissipation in critical portions of the mirror and also to allow for increased control capabilities at the actuator level, thus improving safety, performance, and the flexibility of the support system.

  17. Machine-learning-based Brokers for Real-time Classification of the LSST Alert Stream

    NASA Astrophysics Data System (ADS)

    Narayan, Gautham; Zaidi, Tayeb; Soraisam, Monika D.; Wang, Zhe; Lochner, Michelle; Matheson, Thomas; Saha, Abhijit; Yang, Shuo; Zhao, Zhenge; Kececioglu, John; Scheidegger, Carlos; Snodgrass, Richard T.; Axelrod, Tim; Jenness, Tim; Maier, Robert S.; Ridgway, Stephen T.; Seaman, Robert L.; Evans, Eric Michael; Singh, Navdeep; Taylor, Clark; Toeniskoetter, Jackson; Welch, Eric; Zhu, Songzhe; The ANTARES Collaboration

    2018-05-01

    The unprecedented volume and rate of transient events that will be discovered by the Large Synoptic Survey Telescope (LSST) demand that the astronomical community update its follow-up paradigm. Alert-brokers—automated software systems to sift through, characterize, annotate, and prioritize events for follow-up—will be critical tools for managing alert streams in the LSST era. The Arizona-NOAO Temporal Analysis and Response to Events System (ANTARES) is one such broker. In this work, we develop a machine learning pipeline to characterize and classify variable and transient sources using only the available multiband optical photometry. We describe three illustrative stages of the pipeline, serving the three goals of early, intermediate, and retrospective classification of alerts. The first takes the form of variable versus transient categorization, the second a multiclass typing of the combined variable and transient data set, and the third a purity-driven subtyping of a transient class. Although several similar algorithms have proven themselves in simulations, we validate their performance on real observations for the first time. We quantitatively evaluate our pipeline on sparse, unevenly sampled, heteroskedastic data from various existing observational campaigns, and demonstrate very competitive classification performance. We describe our progress toward adapting the pipeline developed in this work into a real-time broker working on live alert streams from time-domain surveys.

  18. Optical Design of the LSST Camera

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olivier, S S; Seppala, L; Gilmore, K

    2008-07-16

    The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror, modified Paul-Baker design, with an 8.4-meter primary mirror, a 3.4-m secondary, and a 5.0-m tertiary feeding a camera system that includes a set of broad-band filters and refractive corrector lenses to produce a flat focal plane with a field of view of 9.6 square degrees. Optical design of the camera lenses and filters is integrated with optical design of telescope mirrors to optimize performance, resulting in excellent image quality over the entire field from ultra-violet to near infra-red wavelengths. The LSST camera optics design consists of three refractive lenses with clear aperture diameters of 1.55 m, 1.10 m and 0.69 m and six interchangeable, broad-band, filters with clear aperture diameters of 0.75 m. We describe the methodology for fabricating, coating, mounting and testing these lenses and filters, and we present the results of detailed tolerance analyses, demonstrating that the camera optics will perform to the specifications required to meet their performance goals.

  19. LSST Astroinformatics And Astrostatistics: Data-oriented Astronomical Research

    NASA Astrophysics Data System (ADS)

    Borne, Kirk D.; Stassun, K.; Brunner, R. J.; Djorgovski, S. G.; Graham, M.; Hakkila, J.; Mahabal, A.; Paegert, M.; Pesenson, M.; Ptak, A.; Scargle, J.; Informatics, LSST; Statistics Team

    2011-01-01

    The LSST Informatics and Statistics Science Collaboration (ISSC) focuses on research and scientific discovery challenges posed by the very large and complex data collection that LSST will generate. Application areas include astroinformatics, machine learning, data mining, astrostatistics, visualization, scientific data semantics, time series analysis, and advanced signal processing. Research problems to be addressed with these methodologies include transient event characterization and classification, rare class discovery, correlation mining, outlier/anomaly/surprise detection, improved estimators (e.g., for photometric redshift or early onset supernova classification), exploration of highly dimensional (multivariate) data catalogs, and more. We present sample science results from these data-oriented approaches to large-data astronomical research, drawn from the work of LSST ISSC team members: the EB (Eclipsing Binary) Factory, environmental variations in the fundamental plane of elliptical galaxies, and outlier detection in multivariate catalogs.

  20. Evaluation of Potential LSST Spatial Indexing Strategies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nikolaev, S; Abdulla, G; Matzke, R

    2006-10-13

    The LSST requirement for producing alerts in near real-time, and the fact that generating an alert depends on knowing the history of light variations for a given sky position, both imply that the clustering information for all detections must be available at any time during the survey. Therefore, any data structure describing clustering of detections in LSST needs to be continuously updated, even as new detections are arriving from the pipeline. We call this use case "incremental clustering", to reflect this continuous updating of clustering information. This document describes the evaluation results for several potential LSST incremental clustering strategies, using: (1) a Neighbors table and zone optimization to store spatial clusters (a.k.a. Jim Gray's, or SDSS, algorithm); (2) the MySQL built-in R-tree implementation; (3) an external spatial index library which supports a query interface.
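
    Strategy (1), the zone algorithm, can be sketched as follows. This is a toy in-memory illustration of the idea only (bucket detections by declination zone so a neighbor query scans just the few zones overlapping the search radius), not the SQL implementation evaluated in the document; the class, zone height, and flat-sky distance approximation are all choices made here for brevity:

```python
import math
from collections import defaultdict

ZONE_HEIGHT = 0.05  # declination zone height in degrees (one possible choice)

def zone_id(decl):
    """Map a declination to an integer zone number."""
    return int(math.floor((decl + 90.0) / ZONE_HEIGHT))

class ZoneIndex:
    """Toy incremental zone index: detections are appended to declination
    buckets as they arrive, and neighbor queries scan only the zones that
    can overlap the search radius."""
    def __init__(self):
        self.zones = defaultdict(list)  # zoneId -> [(ra, decl, detId), ...]

    def add(self, ra, decl, det_id):
        self.zones[zone_id(decl)].append((ra, decl, det_id))

    def neighbors(self, ra, decl, radius):
        out = []
        for z in range(zone_id(decl - radius), zone_id(decl + radius) + 1):
            for r, d, i in self.zones[z]:
                # small-angle flat-sky distance; adequate for tiny radii
                # away from the poles (no RA wrap handling in this sketch)
                dra = (r - ra) * math.cos(math.radians(decl))
                if dra * dra + (d - decl) ** 2 <= radius * radius:
                    out.append(i)
        return out
```

    New detections can be added at any time without rebuilding the index, which is the property the "incremental clustering" use case requires.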

  1. Photometric classification and redshift estimation of LSST Supernovae

    NASA Astrophysics Data System (ADS)

    Dai, Mi; Kuhlmann, Steve; Wang, Yun; Kovacs, Eve

    2018-07-01

    Supernova (SN) classification and redshift estimation using photometric data only have become very important for the Large Synoptic Survey Telescope (LSST), given the large number of SNe that LSST will observe and the impossibility of spectroscopically following up all the SNe. We investigate the performance of an SN classifier that uses SN colours to classify LSST SNe with the Random Forest classification algorithm. Our classifier results in an area-under-the-curve of 0.98 which represents excellent classification. We are able to obtain a photometric SN sample containing 99 per cent SNe Ia by choosing a probability threshold. We estimate the photometric redshifts (photo-z) of SNe in our sample by fitting the SN light curves using the SALT2 model with nested sampling. We obtain a mean bias (⟨zphot - zspec⟩) of 0.012 with σ((z_phot - z_spec)/(1 + z_spec)) = 0.0294 without using a host-galaxy photo-z prior, and a mean bias (⟨zphot - zspec⟩) of 0.0017 with σ((z_phot - z_spec)/(1 + z_spec)) = 0.0116 using a host-galaxy photo-z prior. Assuming a flat ΛCDM model with Ωm = 0.3, we obtain Ωm of 0.305 ± 0.008 (statistical errors only), using the simulated LSST sample of photometric SNe Ia (with intrinsic scatter σint = 0.11) derived using our methodology without using host-galaxy photo-z prior. Our method will help boost the power of SNe from the LSST as cosmological probes.
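
    The probability-threshold step can be made concrete with a small sketch: given each object's classifier probability of being a SN Ia and (simulated) true labels, scan thresholds from the top down and keep the lowest cut whose selected sample still meets the target purity. This is an illustrative routine on synthetic probabilities, not the authors' Random Forest pipeline:

```python
import numpy as np

def purity_threshold(p_ia, is_ia, target_purity=0.99):
    """Return the lowest probability cut whose selected sample still has a
    fraction of true Ia >= target_purity, scanning thresholds downward."""
    p_ia = np.asarray(p_ia, dtype=float)
    is_ia = np.asarray(is_ia, dtype=bool)
    best = None
    for thr in np.unique(p_ia)[::-1]:
        selected = p_ia >= thr
        if is_ia[selected].mean() >= target_purity:
            best = thr          # purity still holds; try a lower cut
        else:
            break               # stop at the first dip below target purity
    return best
```

    Lowering the cut trades purity for sample size, which is exactly the trade the abstract's "99 per cent" sample makes.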

  2. Commentary: Learning About the Sky Through Simulations. Chapter 34

    NASA Technical Reports Server (NTRS)

    Way, Michael J.

    2012-01-01

    The Large Synoptic Survey Telescope (LSST) simulator being built by Andy Connolly and collaborators is an impressive undertaking and should make working with LSST in the beginning stages far easier than it initially was with the Sloan Digital Sky Survey (SDSS). However, I would like to focus on an equally important problem that has not yet been discussed here, but which the community will need to address in the coming years: can we deal with the flood of data from LSST, and will we need to rethink the way we work?

  3. Cosmology with the Large Synoptic Survey Telescope: an overview

    NASA Astrophysics Data System (ADS)

    Zhan, Hu; Tyson, J. Anthony

    2018-06-01

    The Large Synoptic Survey Telescope (LSST) is a high étendue imaging facility that is being constructed atop Cerro Pachón in northern Chile. It is scheduled to begin science operations in 2022. With its large effective aperture, a novel three-mirror design achieving a seeing-limited field of view, and a 3.2 gigapixel camera, the LSST has the deep-wide-fast imaging capability necessary to carry out a wide-area survey in six passbands (ugrizy) to a deep coadded depth over 10 years, using the majority of its observational time. The remaining time will be devoted to considerably deeper and faster time-domain observations and smaller surveys. In total, each patch of the sky in the main survey will receive 800 visits allocated across the six passbands. The huge volume of high-quality LSST data will provide a wide range of science opportunities and, in particular, open a new era of precision cosmology with unprecedented statistical power and tight control of systematic errors. In this review, we give a brief account of the LSST cosmology program with an emphasis on dark energy investigations. The LSST will address dark energy physics and cosmology in general by exploiting diverse precision probes including large-scale structure, weak lensing, type Ia supernovae, galaxy clusters, and strong lensing. Combined with the cosmic microwave background data, these probes form interlocking tests on the cosmological model and the nature of dark energy in the presence of various systematics. The LSST data products will be made available to the US and Chilean scientific communities and to international partners with no proprietary period.
Close collaborations with contemporaneous imaging and spectroscopy surveys observing at a variety of wavelengths, resolutions, depths, and timescales will be a vital part of the LSST science program, which will not only enhance specific studies but, more importantly, also allow a more complete understanding of the Universe through different windows.

  4. Effect of elastic band-based high-speed power training on cognitive function, physical performance and muscle strength in older women with mild cognitive impairment.

    PubMed

    Yoon, Dong Hyun; Kang, Dongheon; Kim, Hee-Jae; Kim, Jin-Soo; Song, Han Sol; Song, Wook

    2017-05-01

    The effectiveness of resistance training in improving cognitive function in older adults is well demonstrated. In particular, unconventional high-speed resistance training can improve muscle power development. In the present study, the effectiveness of 12 weeks of elastic band-based high-speed power training (HSPT) was examined. Participants were randomly assigned into a HSPT group (n = 14, age 75.0 ± 0.9 years), a low-speed strength training (LSST) group (n = 9, age 76.0 ± 1.3 years) and a control group (CON; n = 7, age 78.0 ± 1.0 years). A 1-h exercise program was provided twice a week for 12 weeks for the HSPT and LSST groups, and balance and tone exercises were carried out by the CON group. Significant increases in levels of cognitive function, physical function, and muscle strength were observed in both the HSPT and LSST groups. In cognitive function, significant improvements in the Mini-Mental State Examination and Montreal Cognitive Assessment were seen in both the HSPT and LSST groups compared with the CON group. In physical functions, Short Physical Performance Battery scores were increased significantly in the HSPT and LSST groups compared with the CON group. Over the 12 weeks of elastic band-based training, the HSPT group showed greater improvements than the LSST group, although both regimens were effective in improving cognitive function, physical function and muscle strength. We conclude that elastic band-based HSPT, as compared with LSST, is more efficient in helping older women with mild cognitive impairment to improve cognitive function, physical performance and muscle strength. Geriatr Gerontol Int 2017; 17: 765-772. © 2016 Japan Geriatrics Society.

  5. LSST Painting Risk Evaluation Memo

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolfe, Justin E.

    The optics subsystem is required to paint the edges of optics black where possible. Due to the risks in applying the paint, LSST requests a review of the impact of removing this requirement for the filters and L3.

  6. Predicting Constraints on Ultra-Light Axion Parameters due to LSST Observations

    NASA Astrophysics Data System (ADS)

    Given, Gabriel; Grin, Daniel

    2018-01-01

    Ultra-light axions (ULAs) are dark matter or dark energy candidates (depending on the mass) predicted to have masses between $10^{-33}$ and $10^{-18}$ eV. The Large Synoptic Survey Telescope (LSST) is expected to provide a large number of weak lensing observations, which will lower the statistical uncertainty on the convergence power spectrum. I began work with Daniel Grin to predict how accurately data from the LSST will be able to constrain ULA properties. I wrote Python code that takes a matter power spectrum calculated by axionCAMB and converts it to a convergence power spectrum. My code then takes derivatives of the convergence power spectrum with respect to several cosmological parameters; these derivatives will be used in Fisher Matrix analysis to determine the sensitivity of LSST observations to axion parameters.
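
    The Fisher-matrix step described above can be written compactly: given the partial derivatives of the binned convergence power spectrum with respect to the parameters, and a covariance for the band powers, the forecast marginalized errors are the square roots of the diagonal of the inverse Fisher matrix. A minimal sketch with a toy diagonal covariance (the function names and the power-law test spectrum below are illustrative, not the project's code):

```python
import numpy as np

def fisher_matrix(derivs, cov):
    """F_ij = sum_ab (dC_a/dtheta_i) Cov^{-1}_ab (dC_b/dtheta_j),
    where derivs has shape (n_params, n_bins)."""
    inv_cov = np.linalg.inv(cov)
    return derivs @ inv_cov @ derivs.T

def forecast_errors(derivs, cov):
    """Marginalized 1-sigma parameter forecasts: sqrt(diag(F^{-1}))."""
    return np.sqrt(np.diag(np.linalg.inv(fisher_matrix(derivs, cov))))
```

    For a toy power-law spectrum C_l = A l^{-n}, the derivative rows are l^{-n} (with respect to A) and -A ln(l) l^{-n} (with respect to n); marginalized errors are always at least as large as the conditional errors 1/sqrt(F_ii).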

  7. LSST Data Management

    NASA Astrophysics Data System (ADS)

    O'Mullane, William; LSST Data Management Team

    2018-01-01

    The Large Synoptic Survey Telescope (LSST) is an 8-m optical ground-based telescope being constructed on Cerro Pachon in Chile. LSST will survey half the sky every few nights in six optical bands. The data will be transferred to the data center in North America, and within 60 seconds it will be reduced using difference imaging and an alert list will be generated for the community. Additionally, annual data releases will be constructed from all the data during the 10-year mission, producing catalogs and deep co-added images with unprecedented time resolution for such a large region of sky. In this paper we present the current status of the LSST stack including the data processing components, Qserv database and data visualization software, describe how to obtain it, and provide a summary of the development road map. We also discuss the move to Python 3 and the timeline for dropping Python 2.
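
    The alert step rests on difference imaging: subtract a template (reference) image from the new science image and flag significant residuals. A deliberately minimal sketch of the idea follows; real pipelines must first match the point-spread functions and photometric scales of the two images, which is skipped here, and the function name is invented for illustration:

```python
import numpy as np

def difference_alerts(science, template, noise_sigma, threshold=5.0):
    """Flag pixels whose (science - template) residual exceeds
    threshold * noise_sigma; returns their (row, col) positions."""
    diff = science - template
    rows, cols = np.where(np.abs(diff) > threshold * noise_sigma)
    return list(zip(rows.tolist(), cols.tolist()))
```

    Planting a bright transient in an otherwise unchanged noisy image, the subtraction isolates it cleanly, which is what makes sub-minute alert generation feasible.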

  8. Photometric classification and redshift estimation of LSST Supernovae

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Mi; Kuhlmann, Steve; Wang, Yun

    Supernova (SN) classification and redshift estimation using photometric data only have become very important for the Large Synoptic Survey Telescope (LSST), given the large number of SNe that LSST will observe and the impossibility of spectroscopically following up all the SNe. We investigate the performance of an SN classifier that uses SN colours to classify LSST SNe with the Random Forest classification algorithm. Our classifier results in an area-under-the-curve of 0.98 which represents excellent classification. We are able to obtain a photometric SN sample containing 99 percent SNe Ia by choosing a probability threshold. We estimate the photometric redshifts (photo-z) of SNe in our sample by fitting the SN light curves using the SALT2 model with nested sampling. We obtain a mean bias (⟨zphot - zspec⟩) of 0.012 with σ((z_phot - z_spec)/(1 + z_spec)) = 0.0294 without using a host-galaxy photo-z prior, and a mean bias (⟨zphot - zspec⟩) of 0.0017 with σ((z_phot - z_spec)/(1 + z_spec)) = 0.0116 using a host-galaxy photo-z prior. Assuming a flat ΛCDM model with Ωm = 0.3, we obtain Ωm of 0.305 ± 0.008 (statistical errors only), using the simulated LSST sample of photometric SNe Ia (with intrinsic scatter σint = 0.11) derived using our methodology without using host-galaxy photo-z prior. Our method will help boost the power of SNe from the LSST as cosmological probes.

  9. LSST Telescope and Optics Status

    NASA Astrophysics Data System (ADS)

    Krabbendam, Victor; Gressler, W. J.; Andrew, J. R.; Barr, J. D.; DeVries, J.; Hileman, E.; Liang, M.; Neill, D. R.; Sebag, J.; Wiecha, O.; LSST Collaboration

    2011-01-01

    The LSST Project continues to advance the design and development of an observatory system capable of capturing 20,000 deg2 of the sky in six wavebands over ten years. Optical fabrication of the unique M1/M3 monolithic mirror has entered final front surface optical processing. After substantial grinding to remove 5 tons of excess glass above the M3 surface, a residual of the single spin casting, both distinct optical surfaces are now clearly evident. Loose abrasive grinding has begun; polishing will occur during 2011, and final optical testing is planned for early 2012. The M1/M3 telescope cell and internal component designs have matured to support on-telescope operational requirements and off-telescope coating needs. The mirror position system (hardpoint actuators) and mirror support system (figure actuator) designs have developed through internal laboratory analysis and testing. Review of thermal requirements has assisted with definition of a thermal conditioning and control system. Pre-cooling the M1/M3 substrate will enable productive observing during the large temperature swing often seen at twilight. The M2 ULE™ substrate is complete and lies in storage awaiting additional funding to enable final optical polishing. This 3.5m diameter, 100mm thick meniscus substrate has been ground to within 40 microns of final figure. Detailed design of the telescope mount, including subflooring, has been developed. Finally, substantial progress has been achieved on the facility design. In early 2010, LSST contracted with ARCADIS Geotecnica Consultores, a Santiago-based engineering firm, to lead the formal architectural design effort for the summit facility.

  10. On the Detectability of Planet X with LSST

    NASA Astrophysics Data System (ADS)

    Trilling, David E.; Bellm, Eric C.; Malhotra, Renu

    2018-06-01

    Two planetary mass objects in the far outer solar system—collectively referred to here as Planet X—have recently been hypothesized to explain the orbital distribution of distant Kuiper Belt Objects. Neither planet is thought to be exceptionally faint, but the sky locations of these putative planets are poorly constrained. Therefore, a wide area survey is needed to detect these possible planets. The Large Synoptic Survey Telescope (LSST) will carry out an unbiased, large area (around 18,000 deg2), deep (limiting magnitude of individual frames of 24.5) survey (the “wide-fast-deep (WFD)” survey) of the southern sky beginning in 2022, and it will therefore be an important tool in searching for these hypothesized planets. Here, we explore the effectiveness of LSST as a search platform for these possible planets. Assuming the current baseline cadence (which includes the WFD survey plus additional coverage), we estimate that LSST will confidently detect or rule out the existence of Planet X in 61% of the entire sky. At orbital distances up to ∼75 au, Planet X could simply be found in the normal nightly moving object processing; at larger distances, it will require custom data processing. We also discuss the implications of a nondetection of Planet X in LSST data.

  11. LSST: Cadence Design and Simulation

    NASA Astrophysics Data System (ADS)

    Cook, Kem H.; Pinto, P. A.; Delgado, F.; Miller, M.; Petry, C.; Saha, A.; Gee, P. A.; Tyson, J. A.; Ivezic, Z.; Jones, L.; LSST Collaboration

    2009-01-01

The LSST Project has developed an operations simulator to investigate how best to observe the sky to achieve its multiple science goals. The simulator has a sophisticated model of the telescope and dome to properly constrain potential observing cadences. This model has also proven useful for investigating various engineering issues, ranging from sizing of slew motors to design of cryogen lines to the camera. The simulator is capable of balancing cadence goals from multiple science programs, and it attempts to minimize time spent slewing as it carries out these goals. The operations simulator has been used to demonstrate a 'universal' cadence that delivers the science requirements for a deep cosmology survey, a near-Earth object survey, and good sampling in the time domain. We will present the results of simulating 10 years of LSST operations using realistic seeing distributions, historical weather data, scheduled engineering downtime, and current telescope and camera parameters. These simulations demonstrate the capability of the LSST to deliver a 25,000 square degree survey probing the time domain, including 20,000 square degrees for a uniform deep, wide, fast survey, while effectively surveying for NEOs over the same area. We will also present our plans for future development of the simulator: better global minimization of slew time and an eventual transition to a scheduler for the real LSST.
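The slew-minimization idea can be illustrated with a toy greedy scheduler that always moves to the nearest unobserved field. The real simulator balances many more constraints (filter changes, sky brightness, cadence goals), so this is only a sketch of the general technique, and the field coordinates are made up:

```python
import math

def ang_sep(a, b):
    """Great-circle separation (radians) between two (ra, dec) tuples in radians."""
    (ra1, dec1), (ra2, dec2) = a, b
    c = (math.sin(dec1) * math.sin(dec2) +
         math.cos(dec1) * math.cos(dec2) * math.cos(ra1 - ra2))
    return math.acos(max(-1.0, min(1.0, c)))  # clamp against rounding error

def greedy_order(fields, start):
    """Visit fields in nearest-neighbor order, using angular separation
    as a crude proxy for slew time."""
    remaining, order, current = list(fields), [], start
    while remaining:
        nxt = min(remaining, key=lambda f: ang_sep(current, f))
        remaining.remove(nxt)
        order.append(nxt)
        current = nxt
    return order

# Three fields along the equator, starting at the origin:
print(greedy_order([(0.5, 0.0), (0.1, 0.0), (0.3, 0.0)], (0.0, 0.0)))
```

A true scheduler would replace the greedy step with a global optimization, which is exactly the future development the abstract mentions.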

  12. Solar System Science with LSST

    NASA Astrophysics Data System (ADS)

    Jones, R. L.; Chesley, S. R.; Connolly, A. J.; Harris, A. W.; Ivezic, Z.; Knezevic, Z.; Kubica, J.; Milani, A.; Trilling, D. E.

    2008-09-01

The Large Synoptic Survey Telescope (LSST) will provide a unique tool to study moving objects throughout the solar system, creating massive catalogs of near-Earth objects (NEOs), asteroids, Trojans, trans-Neptunian objects (TNOs), comets, and planetary satellites with well-measured orbits and high-quality, multi-color photometry accurate to 0.005 magnitudes for the brightest objects. In the baseline LSST observing plan, back-to-back 15-second images will reach a limiting magnitude as faint as r = 24.7 in each 9.6 square degree image, twice per night; a total of approximately 15,000 square degrees of the sky will be imaged in multiple filters every 3 nights. This time sampling will continue throughout each lunation, creating a huge database of observations.

[Fig. 1: Sky coverage of LSST over 10 years; separate panels for each of the 6 LSST filters. Color bars indicate the number of observations in each filter.]

The catalogs will include more than 80% of the potentially hazardous asteroids larger than 140 m in diameter within the first 10 years of LSST operation, millions of main-belt asteroids, and perhaps 20,000 trans-Neptunian objects. Objects with diameters as small as 100 m in the Main Belt and <100 km in the Kuiper Belt can be detected in individual images. Specialized 'deep drilling' observing sequences will detect KBOs down to tens of kilometers in diameter. Long-period comets will be detected at larger distances than previously possible, constraining models of the Oort cloud. With the large number of objects expected in the catalogs, it may be possible to observe a pristine comet start outgassing on its first journey into the inner solar system. By observing fields over a wide range of ecliptic longitudes and latitudes, including large separations from the ecliptic plane, not only will these catalogs greatly increase the number of known objects, but the characterization of the inclination distributions of these populations will also be much improved. Derivation of proper elements for main-belt and Trojan asteroids will allow ever finer resolution of asteroid families and their size-frequency distributions, as well as the study of the long-term dynamics of individual asteroids and the asteroid belt as a whole.

[Fig. 2: Orbital parameters of main-belt asteroids, color-coded according to ugriz colors measured by SDSS. The figure to the left shows osculating elements; the figure to the right shows proper elements. Note the asteroid families visible as clumps in parameter space [1].]

By obtaining multi-color ugrizy data for a substantial fraction of objects, relationships between color and dynamical history can be established. This will also enable taxonomic classification of asteroids, provide further links between diverse populations such as irregular satellites and TNOs or planetary Trojans, and enable estimates of asteroid diameter with rms uncertainty of 30%. With the addition of light-curve information, rotation periods and phase curves can be measured for large fractions of each population, leading to new insight on physical characteristics. Photometric variability information, together with sparse lightcurve inversion, will allow spin state and shape estimation for up to two orders of magnitude more objects than presently known. This will leverage physical studies of asteroids by constraining the size-strength relationship, which has important implications for internal structure (solid, fractured, rubble pile) and in turn the collisional evolution of the asteroid belt. Similar information can be gained for other solar system bodies. [1] Parker, A., Ivezic

  13. The Large Synoptic Survey Telescope: Projected Near-Earth Object Discovery Performance

    NASA Technical Reports Server (NTRS)

    Chesley, Steven R.; Veres, Peter

    2016-01-01

The Large Synoptic Survey Telescope (LSST) is a large-aperture, wide-field survey that has the potential to detect millions of asteroids. LSST is under construction, with survey operations slated to begin in 2022. We describe an independent study to assess the performance of LSST for detecting and cataloging near-Earth objects (NEOs). A significant component of the study will be to assess the survey's ability to link observations of a single object from among the large numbers of false detections and detections of other objects. We will also explore the survey's basic performance in terms of the fraction of NEOs discovered and cataloged, both for the planned baseline survey and for enhanced surveys that are more carefully tuned for NEO search, generally at the expense of other science drivers. Preliminary results indicate that, with successful linkage, under the current baseline survey LSST would discover approximately 65% of NEOs with absolute magnitude H < 22, which corresponds approximately to 140 m diameter.
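The H < 22 threshold maps to roughly 140 m through the standard asteroid absolute-magnitude/diameter relation, given an assumed visual albedo (the 0.14 used here is a conventional assumed value, not a number from the paper):

```python
import math

def diameter_km(H, albedo):
    """Standard asteroid relation: D [km] = 1329 / sqrt(p_V) * 10**(-H/5)."""
    return 1329.0 / math.sqrt(albedo) * 10.0 ** (-H / 5.0)

# H = 22 with an assumed albedo of 0.14 gives a diameter of about 0.14 km:
print(diameter_km(22.0, 0.14))
```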

  14. Baseline design and requirements for the LSST rotating enclosure (dome)

    NASA Astrophysics Data System (ADS)

    Neill, D. R.; DeVries, J.; Hileman, E.; Sebag, J.; Gressler, W.; Wiecha, O.; Andrew, J.; Schoening, W.

    2014-07-01

The Large Synoptic Survey Telescope (LSST) is a large (8.4 meter), wide-field (3.5 degree) survey telescope, which will be located on the Cerro Pachón summit in Chile. As a result of the wide field of view, its optical system is unusually susceptible to stray light; consequently, besides protecting the telescope from the environment, the rotating enclosure (dome) also provides indispensable light baffling. All dome vents are covered with light baffles which simultaneously provide both essential dome flushing and stray light attenuation. The wind screen also (and primarily) functions as a light screen, providing only a minimum clear aperture. Since the dome must operate continuously and the drives produce significant heat, the drives are located on the fixed lower enclosure to facilitate glycol water cooling. To accommodate daytime thermal control, a duct system channels cooling air provided by the facility when the dome is in its parked position.

  15. Cosmology with the Large Synoptic Survey Telescope: an overview.

    PubMed

    Zhan, Hu; Anthony Tyson, J

    2018-06-01

The Large Synoptic Survey Telescope (LSST) is a high-étendue imaging facility that is being constructed atop Cerro Pachón in northern Chile. It is scheduled to begin science operations in 2022. With an 8.4 m (6.7 m effective) aperture, a novel three-mirror design achieving a seeing-limited 9.6 deg2 field of view, and a 3.2 gigapixel camera, the LSST has the deep-wide-fast imaging capability necessary to carry out an 18,000 deg2 survey in six passbands (ugrizy) to a coadded depth of r ∼ 27.5 over 10 years using about 90% of its observational time. The remaining 10% of the time will be devoted to considerably deeper and faster time-domain observations and smaller surveys. In total, each patch of the sky in the main survey will receive 800 visits allocated across the six passbands with 30 s exposure visits. The huge volume of high-quality LSST data will provide a wide range of science opportunities and, in particular, open a new era of precision cosmology with unprecedented statistical power and tight control of systematic errors. In this review, we give a brief account of the LSST cosmology program with an emphasis on dark energy investigations. The LSST will address dark energy physics and cosmology in general by exploiting diverse precision probes including large-scale structure, weak lensing, type Ia supernovae, galaxy clusters, and strong lensing. Combined with the cosmic microwave background data, these probes form interlocking tests on the cosmological model and the nature of dark energy in the presence of various systematics. The LSST data products will be made available to the US and Chilean scientific communities and to international partners with no proprietary period.
Close collaborations with contemporaneous imaging and spectroscopy surveys observing at a variety of wavelengths, resolutions, depths, and timescales will be a vital part of the LSST science program, which will not only enhance specific studies but, more importantly, also allow a more complete understanding of the Universe through different windows.

  16. Mapping Near-Earth Hazards

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2016-06-01

How can we hunt down all the near-Earth asteroids that are capable of posing a threat to us? A new study looks at whether the upcoming Large Synoptic Survey Telescope (LSST) is up to the job.

Charting Nearby Threats

LSST is an 8.4-m wide-survey telescope currently being built in Chile. When it goes online in 2022, it will spend the next ten years surveying our sky, mapping tens of billions of stars and galaxies, searching for signatures of dark energy and dark matter, and hunting for transient optical events like novae and supernovae. But in its scanning, LSST will also be looking for asteroids that approach near Earth.

[Figure: Cumulative number of near-Earth asteroids discovered over time, as of June 16, 2016. NASA/JPL/Chamberlin]

Near-Earth objects (NEOs) have the potential to be hazardous if they cross Earth's path and are large enough to do significant damage when they impact Earth. Earth's history is riddled with dangerous asteroid encounters, including the recent Chelyabinsk airburst in 2013, the encounter that caused the kilometer-sized Meteor Crater in Arizona, and the impact thought to have contributed to the extinction of the dinosaurs.

Recognizing the potential danger that NEOs can pose to Earth, Congress has tasked NASA with tracking down 90% of NEOs larger than 140 meters in diameter. With our current survey capabilities, we believe we've discovered roughly 25% of these NEOs thus far. Now a new study led by Tommy Grav (Planetary Science Institute) examines whether LSST will be able to complete this task.

[Figure: Absolute magnitude, H, of a synthetic NEO population. Though these NEOs are all larger than 140 m, they have a large spread in albedos. Grav et al. 2016]

Can LSST Help?

Based on previous observations of NEOs and resulting predictions for NEO properties and orbits, Grav and collaborators simulate a synthetic population of NEOs, all above 140 m in size. With these improved population models, they demonstrate that the common tactic of using an asteroid's absolute magnitude as a proxy for its size is a poor approximation, due to asteroids' large spread in albedos. The authors show that roughly 23% of NEOs larger than 140 m have absolute magnitudes fainter than H = 22 mag, the value usually assumed as the default absolute magnitude of a 140 m NEO.

[Figure: Fraction of NEOs we've detected as a function of time, based on the authors' simulations of the current surveys (red), LSST plus the current surveys (black), NEOCam plus the current surveys (blue), and the combined result for all surveys (green). Grav et al. 2016]

Taking this into account, Grav and collaborators then use information about the planned LSST survey strategies and detection limits to test what fraction of this synthetic NEO population LSST will be able to detect in its proposed 10-year mission.

The authors find that, within 10 years, LSST will likely be able to detect only 63% of NEOs larger than 140 m. Luckily, LSST may not have to work alone; in addition to the current surveys in operation, a proposed infrared space-based survey mission called NEOCam is planned for launch in 2021. If NEOCam is funded, it will complement LSST's discovery capabilities, potentially allowing the two surveys to jointly achieve the 90% detection goal within a decade.

Citation: T. Grav et al. 2016, AJ, 151, 172. doi:10.3847/0004-6256/151/6/172
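The albedo effect described here can be made concrete by inverting the standard diameter relation D [km] = 1329/sqrt(p_V) × 10^(−H/5): the absolute magnitude of a 140 m body depends on its (unknown) albedo, so a dark one falls below an H = 22 cut. The albedo values below are illustrative endpoints, not values from the study:

```python
import math

def abs_mag(d_km, albedo):
    """H for a given diameter, inverting D [km] = 1329/sqrt(p_V)*10**(-H/5)."""
    return 5.0 * math.log10(1329.0 / (d_km * math.sqrt(albedo)))

# The same 140 m body straddles the H = 22 cut depending on its albedo:
print(abs_mag(0.14, 0.05))  # dark surface: fainter than H = 22
print(abs_mag(0.14, 0.25))  # bright surface: brighter than H = 22
```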

  17. The Search for Transients and Variables in the LSST Pathfinder Survey

    NASA Astrophysics Data System (ADS)

    Gorsuch, Mary Katherine; Kotulla, Ralf

    2018-01-01

This research was completed during participation in the NSF-REU program at the University of Wisconsin-Madison. Two fields of a few square degrees, close to the galactic plane, were imaged with the WIYN 3.5-meter telescope during commissioning of the One Degree Imager (ODI) focal plane. These images were taken with repeated, shorter exposures in order to model an LSST-like cadence, with the goal of identifying transient and variable light sources. This was done by using Source Extractor to generate a catalog of all sources in each exposure and inserting these data into a larger photometry database composed of all exposures for each field. A Python code was developed to analyze the data and isolate sources of interest from the large data set. We found some discrepancies in the data, which led to some interesting results that we are investigating further. Variable and transient sources, while relatively well understood, are not numerous in current catalogs; cataloging them will be a major undertaking of the Large Synoptic Survey Telescope (LSST), to which this project is a precursor. Locating these sources may give us a better understanding of where they are located and how they impact their surroundings.
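A common way to isolate variable candidates from such a repeat-photometry database is a reduced chi-square test against a constant-brightness model. This sketch illustrates the general technique only; it is not the project's actual code, and the threshold value is an assumption:

```python
def reduced_chi2(mags, errs):
    """Reduced chi-square of repeated magnitude measurements against
    the weighted-mean (constant-source) model."""
    weights = [1.0 / e ** 2 for e in errs]
    mean = sum(w * m for w, m in zip(weights, mags)) / sum(weights)
    chi2 = sum(((m - mean) / e) ** 2 for m, e in zip(mags, errs))
    return chi2 / (len(mags) - 1)

def looks_variable(mags, errs, threshold=3.0):
    """Flag a source whose scatter greatly exceeds its photometric errors."""
    return reduced_chi2(mags, errs) > threshold

# A steady source vs. one varying by 0.5 mag, both with 0.02 mag errors:
print(looks_variable([18.0, 18.01, 17.99, 18.0], [0.02] * 4))
print(looks_variable([18.0, 18.5, 18.0, 18.5], [0.02] * 4))
```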

  18. Final Technical Report for DE-SC0012297

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dell'Antonio, Ian

This is the final report on the work performed under award DE-SC0012297, Cosmic Frontier work in support of the LSST Dark Energy Science Collaboration's effort to develop algorithms, simulations, and statistical tests to ensure optimal extraction of the dark energy properties from galaxy clusters observed with LSST. This work focused on effects that could produce a systematic error in the measurement of cluster masses (which will be used to probe the effects of dark energy on the growth of structure). These effects stem from deviations from pure ellipticity of the gravitational lensing signal and from the blending of light of neighboring galaxies. Both effects are expected to be more significant for LSST than for Stage III experiments such as the Dark Energy Survey. We calculate the magnitude of the mass error (or bias) for the first time and demonstrate that it can be treated as a multiplicative correction and calibrated out, allowing mass measurements of clusters from gravitational lensing to meet the requirements of LSST's dark energy investigation.

  19. Astroinformatics in the Age of LSST: Analyzing the Summer 2012 Data Release

    NASA Astrophysics Data System (ADS)

    Borne, Kirk D.; De Lee, N. M.; Stassun, K.; Paegert, M.; Cargile, P.; Burger, D.; Bloom, J. S.; Richards, J.

    2013-01-01

The Large Synoptic Survey Telescope (LSST) will image the visible southern sky every three nights. This multi-band, multi-epoch survey will produce a torrent of data that traditional methods of object-by-object analysis will not be able to accommodate; hence the need for new astroinformatics tools to visualize, simulate, mine, and analyze data at this scale. The Berkeley Center for Time-Domain Informatics (CTDI) is building the informatics infrastructure for generic light curve classification, including new algorithms for feature generation and machine learning. The CTDI portal (http://dotastro.org) contains one of the largest collections of public light curves, with visualization and exploration tools. The group has also published the first calibrated probabilistic classification catalog of 50k variable stars, along with a data exploration portal, http://bigmacc.info. Twice a year, the LSST collaboration releases simulated LSST data in order to aid software development. This poster also showcases a suite of new tools from the Vanderbilt Initiative in Data-intensive Astrophysics (VIDA), designed to take advantage of these large data sets. VIDA's Filtergraph interactive web tool allows one to instantly create an interactive data portal for fast, real-time visualization of large data sets. Filtergraph enables quick selection of interesting objects through easy filtering on many different columns, 2-D and 3-D representations, and on-the-fly arithmetic calculations on the data; it also makes sharing the data and the tool with collaborators easy. The EB/RRL Factory is a neural-network-based variable star classifier designed to quickly identify variable stars of various classes in LSST light curve data (currently tuned to Eclipsing Binaries and RR Lyrae stars) and to provide likelihood-based orbital elements or stellar parameters as appropriate. Finally, the LCsimulator software allows one to create simulated light curves of multiple types of variable stars based on an LSST cadence.
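Feature generation for light-curve classification typically reduces each light curve to a handful of summary statistics before a classifier sees it. A toy illustration of the idea (these three features are generic examples, not the CTDI feature set):

```python
import statistics

def lc_features(mags):
    """Reduce a light curve (a list of magnitudes) to a tiny feature vector:
    half peak-to-peak amplitude, scatter, and the fraction of points
    lying more than one standard deviation from the mean."""
    mean = statistics.fmean(mags)
    std = statistics.pstdev(mags)
    beyond = sum(1 for m in mags if abs(m - mean) > std) / len(mags)
    return {"amplitude": (max(mags) - min(mags)) / 2.0,
            "std": std,
            "beyond1std": beyond}

print(lc_features([10.0, 11.0, 10.0, 11.0]))
```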

  20. Stellar Populations with the LSST

    NASA Astrophysics Data System (ADS)

    Saha, Abhijit; Olsen, K.; LSST Stellar Populations Collaboration

    2006-12-01

The LSST will produce a multi-color map and photometric object catalog of half the sky to g ∼ 27.5 (5σ). Strategically cadenced time-space sampling of each field spanning ten years will allow variability, proper motion, and parallax measurements for objects brighter than g ∼ 25. As part of providing an unprecedented map of the Galaxy, the accurate multi-band photometry will permit photometric parallaxes, chemical abundances, and a handle on ages via colors at turn-off for main-sequence stars at all distances within the Galaxy, permitting a comprehensive study of star formation histories (SFH) and chemical evolution for field stars. With a geometric parallax accuracy of 1 mas, LSST will produce a robust, complete sample of solar neighborhood stars. While delivering parallax accuracy comparable to HIPPARCOS, LSST will extend the catalog to a limit more than 10 magnitudes fainter and will be complete to MV ∼ 15. In the Magellanic Clouds too, the photometry will reach MV ∼ +8, allowing the SFH and chemical signatures in the expansive outer extremities to be gleaned from their main-sequence stars. This in turn will trace the detailed interaction of the Clouds with the Galaxy halo. The LSST time sampling will identify and characterize variable stars of all types, on time scales from 1 hr to several years: a feast for variable star astrophysics. Cepheids and LPVs in all galaxies in the Sculptor, M83, and Cen-A groups are obvious data products; comparative studies will reveal systematic differences with galaxy properties and help fine-tune the rungs of the distance ladder. Dwarf galaxies within 10 Mpc that are too faint to find from surface brightness enhancements will be revealed via over-densities of their red giants; this systematic census will extend the luminosity function of galaxies to the faint limit. Novae discovered by LSST time sampling will trace intergalactic stars out to the Virgo and Fornax clusters.
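Photometric parallax rests on the distance modulus: estimate a main-sequence star's absolute magnitude M from its color, then solve m − M = 5 log10(d / 10 pc) for the distance. A minimal sketch; the example star's magnitudes are hypothetical:

```python
def photometric_distance_pc(m, M):
    """Distance in parsecs from the distance modulus m - M = 5*log10(d/10 pc)."""
    return 10.0 ** ((m - M + 5.0) / 5.0)

# A star with a color-estimated absolute magnitude M = 5 observed at m = 20
# lies at a distance modulus of 15, i.e. 10 kpc:
print(photometric_distance_pc(20.0, 5.0))
```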

  1. The LSST Metrics Analysis Framework (MAF)

    NASA Astrophysics Data System (ADS)

    Jones, R. Lynne; Yoachim, Peter; Chandrasekharan, Srinivasan; Connolly, Andrew J.; Cook, Kem H.; Ivezic, Zeljko; Krughoff, K. Simon; Petry, Catherine E.; Ridgway, Stephen T.

    2015-01-01

Studying potential observing strategies or cadences for the Large Synoptic Survey Telescope (LSST) is a complicated but important problem. To address this, LSST has created an Operations Simulator (OpSim) to create simulated surveys, including realistic weather and sky conditions. Analyzing the results of these simulated surveys for the wide variety of science cases to be considered for LSST is, however, difficult. We have created a Metric Analysis Framework (MAF), an open-source python framework, to be a user-friendly, customizable and easily extensible tool to help analyze the outputs of the OpSim.

MAF reads the pointing history of the LSST generated by the OpSim, then enables the subdivision of these pointings based on position on the sky (RA/Dec, etc.) or the characteristics of the observations (e.g. airmass or sky brightness) and a calculation of how well these observations meet a specified science objective (or metric). An example simple metric could be the mean single-visit limiting magnitude for each position in the sky; a more complex metric might be the expected astrometric precision. The output of these metrics can be generated for a full survey, for specified time intervals, or for regions of the sky, and can be easily visualized using a web interface.

An important goal for MAF is to facilitate analysis of the OpSim outputs for a wide variety of science cases. A user can often write a new metric to evaluate OpSim for new science goals in less than a day once they are familiar with the framework. Some of these new metrics are illustrated in the accompanying poster, "Analyzing Simulated LSST Survey Performance With MAF".

While MAF has been developed primarily for application to OpSim outputs, it can be applied to any dataset. The most obvious examples are examining pointing histories of other survey projects or telescopes, such as CFHT.
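The metric idea can be sketched generically: slice the pointing history by sky position, then reduce each slice's visits to a single number. This only mirrors the MAF concept loosely; the real framework has its own metric classes and slicers, and the `fiveSigmaDepth` key below is an assumed column name for illustration:

```python
def mean_depth_metric(visits):
    """Toy metric: mean single-visit 5-sigma limiting magnitude of the
    visits covering one sky position."""
    depths = [v["fiveSigmaDepth"] for v in visits]
    return sum(depths) / len(depths)

def run_metric(metric, sky_slices):
    """Apply a metric to each spatial slice (e.g. sky-pixel id -> visit list)."""
    return {pix: metric(visits) for pix, visits in sky_slices.items()}

# One sky pixel covered by two visits of different depth:
print(run_metric(mean_depth_metric,
                 {0: [{"fiveSigmaDepth": 24.0}, {"fiveSigmaDepth": 25.0}]}))
```

Swapping in a different reduction function is all it takes to evaluate a different science case, which is the extensibility the abstract emphasizes.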

  2. LSST Probes of Dark Energy: New Energy vs New Gravity

    NASA Astrophysics Data System (ADS)

    Bradshaw, Andrew; Tyson, A.; Jee, M. J.; Zhan, H.; Bard, D.; Bean, R.; Bosch, J.; Chang, C.; Clowe, D.; Dell'Antonio, I.; Gawiser, E.; Jain, B.; Jarvis, M.; Kahn, S.; Knox, L.; Newman, J.; Wittman, D.; Weak Lensing, LSST; LSS Science Collaborations

    2012-01-01

Is the late-time acceleration of the universe due to new physics in the form of stress-energy, or a departure from General Relativity? LSST will measure the shape, magnitude, and color of 4×10^9 galaxies to high S/N over 18,000 square degrees. These data will be used to separately measure the gravitational growth of mass structure and distance vs. redshift to unprecedented precision by combining multiple probes in a joint analysis. Of the five LSST probes of dark energy, the weak gravitational lensing (WL) and baryon acoustic oscillation (BAO) probes are particularly effective in combination. By measuring the 2-D BAO scale in ugrizy-band photometric-redshift-selected samples, LSST will determine the angular diameter distance to a dozen redshifts with sub-percent-level errors. Reconstruction of the WL shear power spectrum on linear and weakly non-linear scales, and of the cross-correlation of shear measured in different photometric redshift bins, provides a constraint on the evolution of dark energy that is complementary to the purely geometric measures provided by supernovae and BAO. Cross-correlation of the WL shear and BAO signal within redshift shells minimizes the sensitivity to systematics. LSST will also detect shear peaks, providing independent constraints. Tomographic study of the shear of background galaxies as a function of redshift allows a geometric test of dark energy. To extract the dark energy signal and distinguish between the two forms of new physics, LSST will rely on accurate stellar point-spread functions (PSF) and unbiased reconstruction of galaxy image shapes from hundreds of exposures. Although a weighted co-added deep image has high S/N, it is a form of lossy compression; Bayesian forward-modeling algorithms can in principle use all the information. We explore systematic effects on shape measurements and present tests of an algorithm called Multi-Fit, which appears to avoid PSF-induced shear systematics in a computationally efficient way.

  3. Responding to the Event Deluge

    NASA Technical Reports Server (NTRS)

    Williams, Roy D.; Barthelmy, Scott D.; Denny, Robert B.; Graham, Matthew J.; Swinbank, John

    2012-01-01

We present the VOEventNet infrastructure for large-scale rapid follow-up of astronomical events, including selection, annotation, machine intelligence, and coordination of observations. The VOEvent standard is central to this vision, with distributed and replicated services rather than centralized facilities. We also describe some of the event brokers, services, and software that are connected to the network. These technologies will become more important in the coming years, with new event streams from Gaia, LOFAR, LIGO, LSST, and many others.

  4. Data Service: Distributed Data Capture and Replication

    NASA Astrophysics Data System (ADS)

    Warner, P. B.; Pietrowicz, S. R.

    2007-10-01

    Data Service is a critical component of the NOAO Data Management and Science Support (DMaSS) Solutions Platform, which is based on a service-oriented architecture, and is to replace the current NOAO Data Transport System. Its responsibilities include capturing data from NOAO and partner telescopes and instruments and replicating the data across multiple (currently six) storage sites. Java 5 was chosen as the implementation language, and Java EE as the underlying enterprise framework. Application metadata persistence is performed using EJB and Hibernate on the JBoss Application Server, with PostgreSQL as the persistence back-end. Although potentially any underlying mass storage system may be used as the Data Service file persistence technology, DTS deployments and Data Service test deployments currently use the Storage Resource Broker from SDSC. This paper presents an overview and high-level design of the Data Service, including aspects of deployment, e.g., for the LSST Data Challenge at the NCSA computing facilities.

  5. LSST system analysis and integration task for an advanced science and application space platform

    NASA Technical Reports Server (NTRS)

    1980-01-01

To support the development of an advanced science and application space platform (ASASP), requirements were derived from a representative set of payloads requiring large separation distances, selected from the Science and Applications Space Platform database. These payloads were a 100 meter diameter atmospheric gravity wave antenna, a 100 meter by 100 meter particle beam injection experiment, a 2 meter diameter, 18 meter long astrometric telescope, and a 15 meter diameter, 35 meter long large ambient deployable IR telescope. A low Earth orbit at 500 km altitude and 56 deg inclination was selected as the best compromise for meeting payload requirements. Platform subsystems that would support the payload requirements were defined, and a physical platform concept was developed. Structural system requirements were developed, including utilities accommodation, interface requirements, and platform strength and stiffness requirements. An attitude control system concept was also described. The resultant ASASP concept was analyzed, and technological developments deemed necessary in the area of large space systems were recommended.

  6. Preparing for LSST with the LCOGT NEO Follow-up Network

    NASA Astrophysics Data System (ADS)

    Greenstreet, Sarah; Lister, Tim; Gomez, Edward

    2016-10-01

The Las Cumbres Observatory Global Telescope Network (LCOGT) provides an ideal platform for follow-up and characterization of Solar System objects (e.g. asteroids, Kuiper Belt Objects, comets, near-Earth objects (NEOs)) and ultimately for the discovery of new objects. The LCOGT NEO Follow-up Network uses the LCOGT telescope network, together with a web-based system developed to perform prioritized target selection, scheduling, and data reduction, to confirm NEO candidates and characterize radar-targeted known NEOs.

In order to determine how to maximize our NEO follow-up efforts, we must first define the goals of the LCOGT NEO Follow-up Network. This means answering the following questions. Should we follow up all objects brighter than some magnitude limit? Should we focus only on the brightest objects, or push to the limits of our capabilities by observing the faintest objects we think we can see and risk not finding the objects in our data? Do we (and how do we) prioritize objects somewhere in the middle of our observable magnitude range? If we want to push to faint objects, how do we minimize the amount of data in which the signal-to-noise ratio is too low to see the object? And how do we find a balance between performing follow-up and characterization observations?

To help answer these questions, we have developed an LCOGT NEO Follow-up Network simulator that allows us to test our prioritization algorithms for target selection, confirm signal-to-noise predictions, and determine ideal block lengths and exposure times for observing NEO candidates. We will present our results from the simulator and progress on our NEO follow-up efforts.

In the era of LSST, developing and utilizing infrastructure capable of handling the large number of detections expected to be produced daily by LSST, such as the LCOGT NEO Follow-up Network and our web-based platform for selecting, scheduling, and reducing NEO observations, will be critical to follow-up efforts. 
We hope our work can act as an example and tool for the community as together we prepare for the age of LSST.
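The signal-to-noise predictions behind such exposure-time planning follow simple background-limited scalings: SNR grows as the square root of exposure time, and each magnitude of faintness costs a factor of 10^0.8 ≈ 6.3 in time. A sketch of those scalings (the reference numbers in the example are invented):

```python
import math

def exposure_for_snr(snr_target, snr_ref, t_ref_s):
    """Background-limited regime: SNR scales as sqrt(t), so the required
    exposure time scales as SNR**2."""
    return t_ref_s * (snr_target / snr_ref) ** 2

def exposure_scale_per_mag(delta_mag):
    """Time factor needed to hold SNR fixed for a target delta_mag fainter
    (flux ratio 10**(-0.4*dm), SNR ~ flux*sqrt(t) => t ~ 10**(0.8*dm))."""
    return 10.0 ** (0.8 * delta_mag)

# If a 60 s exposure yields SNR = 5 on some target, reaching SNR = 10
# requires four times the exposure:
print(exposure_for_snr(10.0, 5.0, 60.0))
```

This is why pushing follow-up to the faintest candidates is so expensive in telescope time, the trade-off the abstract's questions revolve around.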

  7. Using model based systems engineering for the development of the Large Synoptic Survey Telescope's operational plan

    NASA Astrophysics Data System (ADS)

    Selvy, Brian M.; Claver, Charles; Willman, Beth; Petravick, Don; Johnson, Margaret; Reil, Kevin; Marshall, Stuart; Thomas, Sandrine; Lotz, Paul; Schumacher, German; Lim, Kian-Tat; Jenness, Tim; Jacoby, Suzanne; Emmons, Ben; Axelrod, Tim

    2016-08-01

We provide an overview of the Model Based Systems Engineering (MBSE) language, tool, and methodology being used in our development of the Operational Plan for Large Synoptic Survey Telescope (LSST) operations. LSST's Systems Engineering (SE) team is using a model-based approach to operational plan development to: 1) capture the top-down stakeholders' needs and functional allocations defining the scope, required tasks, and personnel needed for operations, and 2) capture the bottom-up operations and maintenance activities required to conduct the LSST survey across its distributed operations sites for the full ten-year survey duration. To accomplish these complementary goals and ensure that they produce self-consistent results, we have developed a holistic approach using the Sparx Enterprise Architect modeling tool and the Systems Modeling Language (SysML). This approach utilizes SysML Use Cases, Actors, associated relationships, and Activity Diagrams to document and refine all of the major operations and maintenance activities that will be required to successfully operate the observatory and meet stakeholder expectations. We have developed several customized extensions of the SysML language, including a custom stereotyped Use Case element with unique tagged values, as well as unique association connectors and Actor stereotypes. We demonstrate that this customized MBSE methodology enables us to define: 1) the roles each human Actor must take on to successfully carry out the activities associated with the Use Cases; 2) the skills each Actor must possess; 3) the functional allocation of all required stakeholder activities and Use Cases to the organizational entities tasked with carrying them out; and 4) the organizational structure required to successfully execute the operational survey. Our approach allows for continual refinement, utilizing the systems engineering spiral method to expose finer levels of detail as necessary. 
For example, the bottom-up, Use Case-driven approach will be deployed in the future to develop the detailed work procedures required to successfully execute each operational activity.

  9. Optimizing the LSST Dither Pattern for Survey Uniformity

    NASA Astrophysics Data System (ADS)

    Awan, Humna; Gawiser, Eric J.; Kurczynski, Peter; Carroll, Christopher M.; LSST Dark Energy Science Collaboration

    2015-01-01

    The Large Synoptic Survey Telescope (LSST) will gather detailed data of the southern sky, enabling unprecedented study of Baryonic Acoustic Oscillations, which are an important probe of dark energy. These studies require a survey with highly uniform depth, and we aim to find an observation strategy that optimizes this uniformity. We have shown that in the absence of dithering (large telescope-pointing offsets), the LSST survey will vary significantly in depth. Hence, we implemented various dithering strategies, including random and repulsive random pointing offsets and spiral patterns with the spiral reaching completion in either a few months or the entire ten-year run. We employed three different implementations of dithering strategies: a single offset assigned to all fields observed on each night, offsets assigned to each field independently whenever the field is observed, and offsets assigned to each field only when the field is observed on a new night. Our analysis reveals that large dithers are crucial to guarantee survey uniformity and that assigning dithers to each field independently whenever the field is observed significantly increases this uniformity. These results suggest paths towards an optimal observation strategy that will enable LSST to achieve its science goals. We gratefully acknowledge support from the National Science Foundation REU program at Rutgers, PHY-1263280, and the Department of Energy, DE-SC0011636.
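    The three offset-assignment schemes compared in this abstract can be sketched as follows. The square offset window, its amplitude, and the toy visit list are illustrative stand-ins, not LSST survey parameters.

```python
import random

MAX_DITHER_DEG = 1.75  # illustrative amplitude, roughly a field radius

def random_offset(rng):
    """Draw a uniform random (dx, dy) pointing offset in degrees."""
    return (rng.uniform(-MAX_DITHER_DEG, MAX_DITHER_DEG),
            rng.uniform(-MAX_DITHER_DEG, MAX_DITHER_DEG))

def assign_dithers(visits, scheme, seed=0):
    """visits: ordered list of (night, field_id) observations.
    Returns one (dx, dy) offset per visit under the chosen scheme:
      'per_night'       -> one offset shared by every field observed that night
      'per_visit'       -> a fresh offset every time a field is observed
      'per_field_night' -> a fresh offset only on a field's first visit of a night
    """
    rng = random.Random(seed)
    night_offset, field_night_offset, out = {}, {}, []
    for night, field in visits:
        if scheme == 'per_night':
            if night not in night_offset:
                night_offset[night] = random_offset(rng)
            out.append(night_offset[night])
        elif scheme == 'per_visit':
            out.append(random_offset(rng))
        else:  # 'per_field_night'
            key = (night, field)
            if key not in field_night_offset:
                field_night_offset[key] = random_offset(rng)
            out.append(field_night_offset[key])
    return out
```

    The abstract's preferred strategy corresponds to 'per_visit' (independent offsets whenever a field is observed).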

  10. Surveying the Inner Solar System with an Infrared Space Telescope

    NASA Astrophysics Data System (ADS)

    Buie, Marc W.; Reitsema, Harold J.; Linfield, Roger P.

    2016-11-01

    We present an analysis of surveying the inner solar system for objects that may pose some threat to Earth. Most of the analysis is based on understanding the capability provided by Sentinel, a concept for an infrared space-based telescope placed in a heliocentric orbit near the distance of Venus. From this analysis, we show that (1) the size range being targeted can affect the survey design, (2) the orbit distribution of the target sample can affect the survey design, (3) minimum observational arc length during the survey is an important metric of survey performance, and (4) surveys must consider objects as small as D = 15–30 m to meet the goal of identifying objects that have the potential to cause damage on Earth in the next 100 yr. Sentinel will be able to find 50% of all impactors larger than 40 m in a 6.5 yr survey. The Sentinel mission concept is shown to be as effective as any survey in finding objects bigger than D = 140 m but is more effective when applied to finding smaller objects on Earth-impacting orbits. Sentinel is also more effective at finding objects of interest for human exploration that benefit from lower propulsion requirements. To explore the interaction between space and ground search programs, we also study a case where Sentinel is combined with the Large Synoptic Survey Telescope (LSST) and show the benefit of placing a space-based observatory in an orbit that reduces the overlap in search regions with a ground-based telescope. In this case, Sentinel+LSST can find more than 70% of the impactors larger than 40 m assuming a 6.5 yr lifetime for Sentinel and 10 yr for LSST.

  11. Delta Doping High Purity CCDs and CMOS for LSST

    NASA Technical Reports Server (NTRS)

    Blacksberg, Jordana; Nikzad, Shouleh; Hoenk, Michael; Elliott, S. Tom; Bebek, Chris; Holland, Steve; Kolbe, Bill

    2006-01-01

    A viewgraph presentation describing delta doping high purity CCDs and CMOS for LSST is shown. The topics include: 1) Overview of JPL's versatile back-surface process for CCDs and CMOS; 2) Application to SNAP and ORION missions; 3) Delta doping as a back-surface electrode for fully depleted LBNL CCDs; 4) Delta doping high purity CCDs for SNAP and ORION; 5) JPL CMP thinning process development; and 6) Antireflection coating process development.

  12. LSST Astrometric Science

    NASA Astrophysics Data System (ADS)

    Saha, A.; Monet, D.

    2005-12-01

    Continued acquisition and analysis of short-exposure observations support the preliminary conclusion presented by Monet et al. (BAAS v36, p1531, 2004) that a 10-second exposure in 1.0-arcsecond seeing can provide a differential astrometric accuracy of about 10 milliarcseconds. A single solution for mapping coefficients appears to be valid over spatial scales of up to 10 arcminutes, and this suggests that numerical processing can proceed on a per-sensor basis without the need to further divide the individual fields of view into several astrometric patches. Data from the Subaru public archive, as well as from the LSST Cerro Pachon 2005 observing campaign and various CTIO and NOAO 4-meter engineering runs, have been considered. Should these results be confirmed, the expected astrometric accuracy after 10 years of LSST observations should be around 1.0 milliarcseconds for parallax and 0.2 milliarcseconds/year for proper motions.

  13. The LSST metrics analysis framework (MAF)

    NASA Astrophysics Data System (ADS)

    Jones, R. L.; Yoachim, Peter; Chandrasekharan, Srinivasan; Connolly, Andrew J.; Cook, Kem H.; Ivezic, Željko; Krughoff, K. S.; Petry, Catherine; Ridgway, Stephen T.

    2014-07-01

    We describe the Metrics Analysis Framework (MAF), an open-source Python framework developed to provide a user-friendly, customizable, easily extensible set of tools for analyzing data sets. MAF is part of the Large Synoptic Survey Telescope (LSST) Simulations effort. Its initial goal is to provide a tool to evaluate LSST Operations Simulation (OpSim) simulated surveys to help understand the effects of telescope scheduling on survey performance; however, MAF can be applied to a much wider range of datasets. The building blocks of the framework are Metrics (algorithms to analyze a given quantity of data), Slicers (subdividing the overall data set into smaller data slices as relevant for each Metric), and Database classes (to access the dataset and read data into memory). We describe how these building blocks work together, and provide an example of using MAF to evaluate different dithering strategies. We also outline how users can write their own custom Metrics and use these within the framework.
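    The Metric/Slicer decomposition described above can be illustrated with a minimal stand-in. The class names, method signatures, and the toy "seeing" table below are invented for this sketch and are not the actual MAF API.

```python
import numpy as np

class MeanSeeingMetric:
    """Metric: an algorithm run on one slice of data (here: mean seeing)."""
    def run(self, data_slice):
        return float(np.mean(data_slice['seeing']))

class FieldSlicer:
    """Slicer: subdivides the full data set into per-field slices."""
    def __init__(self, field_ids):
        self.field_ids = field_ids
    def slices(self, data):
        for fid in np.unique(self.field_ids):
            yield int(fid), data[self.field_ids == fid]

def run_metric(metric, slicer, data):
    """Drive the Metric over every slice, mimicking the framework loop."""
    return {fid: metric.run(data_slice) for fid, data_slice in slicer.slices(data)}

# Toy OpSim-like table: two fields with different seeing histories.
data = np.array([(1, 0.7), (1, 0.9), (2, 1.1)],
                dtype=[('field', 'i4'), ('seeing', 'f8')])
results = run_metric(MeanSeeingMetric(), FieldSlicer(data['field']), data)
```

    Swapping in a different Metric or Slicer changes the analysis without touching the driver loop, which is the extensibility the abstract describes.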

  14. LSST (Hoop/Column) Maypole Antenna Development Program, phase 1, part 1

    NASA Technical Reports Server (NTRS)

    Sullivan, M. R.

    1982-01-01

    The first of a two-phase program was performed to develop the technology necessary to evaluate, design, manufacture, package, transport and deploy the hoop/column deployable antenna reflector by means of a ground-based program. The hoop/column concept consists of a cable-stiffened, large-diameter hoop and a central column structure that supports and contours a radio-frequency reflective mesh surface. Mission scenarios for communications, radiometry, and radio astronomy were studied. Data were provided to establish technology drivers, resulting in the specification of a point design. The point design is a multiple-beam quad-aperture offset antenna system which provides four separate offset areas of illumination on a 100-meter-diameter symmetrical parent reflector. The periphery of the reflector is a hoop having 48 segments that articulate into a small stowed volume around a center extendable column. The hoop and column are structurally connected by graphite and quartz cables. The prominence of cables in the design resulted in the development of advanced cable technology. Design verification models were built of the hoop, column, and surface stowage subassemblies. Model designs were generated for a half-scale sector of the surface and a 1/6-scale model of the complete deployable reflector.

  15. LSST Operations Simulator

    NASA Astrophysics Data System (ADS)

    Cook, K. H.; Delgado, F.; Miller, M.; Saha, A.; Allsman, R.; Pinto, P.; Gee, P. A.

    2005-12-01

    We have developed an operations simulator for LSST and used it to explore design and operations parameter space for this large-etendue telescope and its ten-year survey mission. The design is modular, with separate science programs coded in separate modules. There is a sophisticated telescope module with all motions parametrized for ease of testing different telescope capabilities, e.g., the effect of the acceleration capabilities of various motors on science output. Sky brightness is calculated as a function of moon phase and separation. A sophisticated exposure time calculator has been developed for LSST and is being incorporated into the simulator to allow specification of S/N requirements. All important parameters for the telescope, the site and the science programs are easily accessible in configuration files. Seeing and cloud data from the three candidate LSST sites are used for our simulations. The simulator has two broad categories of science proposals: sky coverage and transient events. Sky coverage proposals base their observing priorities on a required number of observations for each field in a particular filter with specified conditions (maximum seeing, sky brightness, etc.); one such proposal is used for a weak lensing investigation. Transient proposals are highly configurable. A transient proposal can require sequential, multiple exposures in various filters with a specified sequence of filters, and require a particular cadence for multiple revisits to complete an observation sequence. Each science proposal ranks potential observations based upon the internal logic of that proposal. We present the results of a variety of mixed science program observing simulations, showing how varied programs can be carried out simultaneously, with many observations serving multiple science goals. The simulator has shown that LSST can carry out its multiple missions under a variety of conditions. KHC's work was performed under the auspices of the US DOE, NNSA by the Univ. of California, LLNL under contract No. W-7405-Eng-48.

  16. Atmospheric Dispersion Effects in Weak Lensing Measurements

    DOE PAGES

    Plazas, Andrés Alejandro; Bernstein, Gary

    2012-10-01

    The wavelength dependence of atmospheric refraction causes elongation of finite-bandwidth images along the elevation vector, which produces spurious signals in weak gravitational lensing shear measurements unless this atmospheric dispersion is calibrated and removed to high precision. Because astrometric solutions and PSF characteristics are typically calibrated from stellar images, differences between the reference stars' spectra and the galaxies' spectra will leave residual errors in both the astrometric positions (dr) and in the second moment (width) of the wavelength-averaged PSF (dv) for galaxies. We estimate the level of dv that will induce spurious weak lensing signals in PSF-corrected galaxy shapes that exceed the statistical errors of the DES and the LSST cosmic-shear experiments. We also estimate the dr signals that will produce unacceptable spurious distortions after stacking of exposures taken at different airmasses and hour angles. We also calculate the errors in the griz bands, and find that dispersion systematics, uncorrected, are up to 6 and 2 times larger in g and r bands, respectively, than the requirements for the DES error budget, but can be safely ignored in i and z bands. For the LSST requirements, the factors are about 30, 10, and 3 in g, r, and i bands, respectively. We find that a simple correction linear in galaxy color is accurate enough to reduce dispersion shear systematics to insignificant levels in the r band for DES and the i band for LSST, but still as much as 5 times larger than the requirements for LSST r-band observations. More complex corrections will likely be able to reduce the systematic cosmic-shear errors below statistical errors for LSST r band. But g-band effects remain large enough that it seems likely that induced systematics will dominate the statistical errors of both surveys, and cosmic-shear measurements should rely on the redder bands.
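    The "correction linear in galaxy color" mentioned above can be sketched on synthetic data: fit the PSF-width shift dv as a linear function of color, then subtract the trend. The coefficient and noise level here are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy data: the dispersion-induced PSF-width shift dv grows linearly with
# galaxy color relative to the calibration stars. The 0.004 coefficient and
# the noise level are invented for this sketch.
color = rng.uniform(-0.5, 1.5, 500)             # e.g. g - r color, arbitrary units
dv_true = 0.004 * color                          # linear chromatic bias in dv
dv_obs = dv_true + rng.normal(0.0, 1e-4, 500)    # plus measurement noise

# Correction linear in galaxy color: fit dv(color) on a calibration sample,
# then subtract the fitted trend from every galaxy.
slope, intercept = np.polyfit(color, dv_obs, 1)
dv_residual = dv_obs - (slope * color + intercept)
# After correction, the remaining scatter is set by the noise floor rather
# than by the chromatic term.
```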

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peterson, John Russell

    This grant funded the development and dissemination of the Photon Simulator (PhoSim) for the purpose of studying dark energy at high precision with the upcoming Large Synoptic Survey Telescope (LSST) astronomical survey. The work was in collaboration with the LSST Dark Energy Science Collaboration (DESC). Several detailed physics improvements were made in the optics, atmosphere, and sensor, a number of validation studies were performed, and a significant number of usability features were implemented. Future work in DESC will use PhoSim as the image simulation tool for data challenges used by the analysis groups.

  18. The Whole is Greater than the Sum of the Parts: Optimizing the Joint Science Return from LSST, Euclid and WFIRST

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jain, B.; Spergel, D.; Connolly, A.

    2015-02-02

    The scientific opportunity offered by the combination of data from LSST, WFIRST and Euclid goes well beyond the science enabled by any one of the data sets alone. The range in wavelength, angular resolution and redshift coverage that these missions jointly span is remarkable. With major investments in LSST and WFIRST, and partnership with ESA in Euclid, the US has an outstanding scientific opportunity to carry out a combined analysis of these data sets. It is imperative for us to seize it and, together with our European colleagues, prepare for the defining cosmological pursuit of the 21st century. The main argument for conducting a single, high-quality reference co-analysis exercise and carefully documenting the results is the complexity and subtlety of the systematics that define this co-analysis. Falling back on many small efforts by different teams in selected fields and for narrow goals will be inefficient, leading to significant duplication of effort.

  19. The Emerging Infrastructure of Autonomous Astronomy

    NASA Astrophysics Data System (ADS)

    Seaman, R.; Allan, A.; Axelrod, T.; Cook, K.; White, R.; Williams, R.

    2007-10-01

    Advances in the understanding of cosmic processes demand that sky transient events be confronted with statistical techniques honed on static phenomena. Time domain data sets require vast surveys such as LSST {http://www.lsst.org/lsst_home.shtml} and Pan-STARRS {http://www.pan-starrs.ifa.hawaii.edu}. A new autonomous infrastructure must close the loop from the scheduling of survey observations, through data archiving and pipeline processing, to the publication of transient event alerts and automated follow-up, and to the easy analysis of resulting data. The IVOA VOEvent {http://voevent.org} working group leads efforts to characterize sky transient alerts published through VOEventNet {http://voeventnet.org}. The Heterogeneous Telescope Networks (HTN {http://www.telescope-networks.org}) consortium is a group of observatories and robotic telescope projects seeking interoperability, with a long-term goal of creating an e-market for telescope time. Two projects relying on VOEvent and HTN are eSTAR {http://www.estar.org.uk} and the Thinking Telescope {http://www.thinkingtelescopes.lanl.gov} Project.

  20. Near-Earth Object Orbit Linking with the Large Synoptic Survey Telescope

    NASA Astrophysics Data System (ADS)

    Vereš, Peter; Chesley, Steven R.

    2017-07-01

    We have conducted a detailed simulation of the ability of the Large Synoptic Survey Telescope (LSST) to link near-Earth and main belt asteroid detections into orbits. The key elements of the study were a high-fidelity detection model and the presence of false detections in the form of both statistical noise and difference image artifacts. We employed the Moving Object Processing System (MOPS) to generate tracklets, tracks, and orbits with a realistic detection density for one month of the LSST survey. The main goals of the study were to understand whether (a) the linking of near-Earth objects (NEOs) into orbits can succeed in a realistic survey, (b) the number of false tracks and orbits will be manageable, and (c) the accuracy of linked orbits would be sufficient for automated processing of discoveries and attributions. We found that the overall density of asteroids was more than 5000 per LSST field near opposition on the ecliptic, plus up to 3000 false detections per field in good seeing. We achieved 93.6% NEO linking efficiency for H < 22 on tracks composed of tracklets from at least three distinct nights within a 12 day interval. The derived NEO catalog consisted of 96% correct linkages. Less than 0.1% of orbits included false detections, and the remainder of false linkages stemmed from main belt confusion, which was an artifact of the short time span of the simulation. The MOPS linking efficiency can be improved by refined attribution of detections to known objects and by improved tuning of the internal kd-tree linking algorithms.
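    The tracklet-and-track hierarchy described above (intra-night pairs, then tracks from at least three distinct nights within a 12-day window) can be illustrated with a brute-force toy linker. The velocity cut is an arbitrary placeholder, and real MOPS uses kd-tree searches rather than exhaustive combinations.

```python
import itertools

MAX_RATE_DEG_PER_DAY = 2.0   # illustrative sky-motion cut, not the MOPS value

def make_tracklets(detections):
    """Pair same-night detections whose implied sky motion is plausible.
    detections: list of (t_days, ra_deg, dec_deg) tuples."""
    tracklets = []
    for a, b in itertools.combinations(detections, 2):
        if int(a[0]) != int(b[0]) or a[0] == b[0]:   # same night, distinct times
            continue
        dt = abs(b[0] - a[0])
        rate = ((b[1] - a[1]) ** 2 + (b[2] - a[2]) ** 2) ** 0.5 / dt
        if rate <= MAX_RATE_DEG_PER_DAY:
            tracklets.append((a, b))
    return tracklets

def make_tracks(tracklets, min_nights=3, window_days=12):
    """Combine tracklets from at least min_nights distinct nights that all
    fall inside a window_days interval (brute force, for illustration)."""
    tracks = []
    for combo in itertools.combinations(tracklets, min_nights):
        nights = {int(tracklet[0][0]) for tracklet in combo}
        times = [det[0] for tracklet in combo for det in tracklet]
        if len(nights) >= min_nights and max(times) - min(times) <= window_days:
            tracks.append(combo)
    return tracks
```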

  2. Is flat fielding safe for precision CCD astronomy?

    DOE PAGES

    Baumer, Michael; Davis, Christopher P.; Roodman, Aaron

    2017-07-06

    The ambitious goals of precision cosmology with wide-field optical surveys such as the Dark Energy Survey (DES) and the Large Synoptic Survey Telescope (LSST) demand precision CCD astronomy as their foundation. This in turn requires an understanding of previously uncharacterized sources of systematic error in CCD sensors, many of which manifest themselves as static effective variations in pixel area. Such variation renders a critical assumption behind the traditional procedure of flat fielding—that a sensor's pixels comprise a uniform grid—invalid. In this work, we present a method to infer a curl-free model of a sensor's underlying pixel grid from flat-field images, incorporating the superposition of all electrostatic sensor effects—both known and unknown—present in flat-field data. We use these pixel grid models to estimate the overall impact of sensor systematics on photometry, astrometry, and PSF shape measurements in a representative sensor from the Dark Energy Camera (DECam) and a prototype LSST sensor. Applying the method to DECam data recovers known significant sensor effects for which corrections are currently being developed within DES. For an LSST prototype CCD with pixel-response non-uniformity (PRNU) of 0.4%, we find the impact of "improper" flat fielding on these observables is negligible in nominal 0.7'' seeing conditions. Furthermore, these errors scale linearly with the PRNU, so for future LSST production sensors, which may have larger PRNU, our method provides a way to assess whether pixel-level calibration beyond flat fielding will be required.

  3. The LSST Dome final design

    NASA Astrophysics Data System (ADS)

    DeVries, J.; Neill, D. R.; Barr, J.; De Lorenzi, Simone; Marchiori, Gianpietro

    2016-07-01

    The Large Synoptic Survey Telescope (LSST) is a large (8.4-meter), wide-field (3.5-degree) survey telescope that will be located on the Cerro Pachón summit in Chile. As a result of the Telescope's wide field of view, the optical system is unusually susceptible to stray light. In addition, balancing the effect of wind-induced telescope vibrations with Dome seeing is crucial. The rotating enclosure system (Dome) includes a moving wind screen and light baffle system. All of the Dome vents include hinged light baffles, which provide exceptional Dome flushing and stray light attenuation, and allow for vent maintenance access from inside the Dome. The wind screen also functions as a light screen and helps define a clear optical aperture for the Telescope. The Dome must operate continuously without rotational travel limits to accommodate the Telescope's cadence and travel. Consequently, the Azimuth drives are located on the fixed lower enclosure to accommodate glycol water cooling without the need for a utility cable wrap. An air duct system aligns when the Dome is in its parked position, providing air cooling for temperature conditioning of the Dome during the daytime. A bridge crane and a series of ladders, stairs, and platforms provide for the inspection, maintenance, and repair of all of the Dome mechanical systems. The contract to build the Dome was awarded to European Industrial Engineering in Mestre, Italy in May 2015. In this paper, we present the final design of this telescope and site sub-system.

  4. A daytime measurement of the lunar contribution to the night sky brightness in LSST's ugrizy bands-initial results

    NASA Astrophysics Data System (ADS)

    Coughlin, Michael; Stubbs, Christopher; Claver, Chuck

    2016-06-01

    We report measurements from which we determine the spatial structure of the lunar contribution to night sky brightness, taken at the LSST site on Cerro Pachon in Chile. We use an array of six photodiodes with filters that approximate the Large Synoptic Survey Telescope's u, g, r, i, z, and y bands. We use the sun as a proxy for the moon, and measure sky brightness as a function of zenith angle of the point on sky, zenith angle of the sun, and angular distance between the sun and the point on sky. We make a correction for the difference between the illumination spectrum of the sun and the moon. Since scattered sunlight totally dominates the daytime sky brightness, this technique allows us to cleanly determine the contribution to the (cloudless) night sky from backscattered moonlight, without contamination from other sources of night sky brightness. We estimate our uncertainty in the relative lunar night sky brightness vs. zenith and lunar angle to be between 0.3 and 0.7 mag, depending on the passband. This information is useful in planning the optimal execution of the LSST survey, and perhaps for other astronomical observations as well. Although our primary objective is to map out the angular structure and spectrum of the scattered light from the atmosphere and particulates, we also make an estimate of the expected number of scattered lunar photons per pixel per second in LSST, and find values that are in overall agreement with previous estimates.
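    The three-angle parametrization used for these brightness maps (zenith angle of the sky point, zenith angle of the sun or moon, and their angular separation) reduces to standard spherical trigonometry. The helper below is a generic sketch, not the instrument's analysis code.

```python
import math

def angular_separation_deg(az1, alt1, az2, alt2):
    """Great-circle separation between two (azimuth, altitude) directions,
    all angles in degrees (spherical law of cosines)."""
    az1, alt1, az2, alt2 = map(math.radians, (az1, alt1, az2, alt2))
    cos_sep = (math.sin(alt1) * math.sin(alt2)
               + math.cos(alt1) * math.cos(alt2) * math.cos(az1 - az2))
    # Clamp to [-1, 1] to guard against floating-point round-off in acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_sep))))

def scattering_geometry(point_az, point_alt, moon_az, moon_alt):
    """Return the three angles the brightness maps are tabulated against:
    zenith angle of the sky point, zenith angle of the moon (or the sun as
    its proxy), and the angular distance between the two."""
    return (90.0 - point_alt,
            90.0 - moon_alt,
            angular_separation_deg(point_az, point_alt, moon_az, moon_alt))
```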

  5. Scheduling Algorithm for the Large Synoptic Survey Telescope

    NASA Astrophysics Data System (ADS)

    Ichharam, Jaimal; Stubbs, Christopher

    2015-01-01

    The Large Synoptic Survey Telescope (LSST) is a wide-field telescope currently under construction and scheduled to be deployed in Chile by 2022 and operate for a ten-year survey. As a ground-based telescope with the largest etendue ever constructed, and the ability to take images approximately once every eighteen seconds, the LSST will be able to capture the entirety of the observable sky every few nights in six different band passes. With these remarkable features, LSST is primed to provide the scientific community with invaluable data in numerous areas of astronomy, including the observation of near-Earth asteroids, the detection of transient optical events such as supernovae, and the study of dark matter and energy through weak gravitational lensing. In order to maximize the utility that LSST will provide toward achieving these scientific objectives, it proves necessary to develop a flexible scheduling algorithm for the telescope which both optimizes its observational efficiency and allows for adjustment based on the evolving needs of the astronomical community. This work defines a merit function that incorporates the urgency of observing a particular field in the sky as a function of time elapsed since last observed, dynamic viewing conditions (in particular transparency and sky brightness), and a measure of scientific interest in the field. The problem of maximizing this merit function, summed across the entire observable sky, is then reduced to a classic variant of the dynamic traveling salesman problem. We introduce a new approximation technique that appears particularly well suited for this situation. We analyze its effectiveness in resolving this problem, obtaining some promising initial results.
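    A merit function combining urgency, viewing conditions, and scientific weight can be sketched as follows. The functional form, the 3-night urgency timescale, and the greedy selection (a stand-in for the abstract's dynamic-traveling-salesman treatment) are all invented for illustration.

```python
import math

def merit(field, now):
    """Toy merit: urgency saturates with time since the field was last
    observed, scaled by current transparency and scientific weight. The
    functional form and 3-night timescale are illustrative only."""
    urgency = 1.0 - math.exp(-(now - field['last_seen']) / 3.0)  # nights
    return field['weight'] * field['transparency'] * urgency

def pick_next(fields, now):
    """Greedy stand-in for the dynamic-traveling-salesman step: simply
    observe the highest-merit field next."""
    return max(fields, key=lambda f: merit(f, now))
```

    A field unobserved for many nights in good conditions outranks a recently visited or poorly placed one, which is the qualitative behavior the merit function is meant to encode.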

  6. Looking through the same lens: Shear calibration for LSST, Euclid, and WFIRST with stage 4 CMB lensing

    NASA Astrophysics Data System (ADS)

    Schaan, Emmanuel; Krause, Elisabeth; Eifler, Tim; Doré, Olivier; Miyatake, Hironao; Rhodes, Jason; Spergel, David N.

    2017-06-01

    The next-generation weak lensing surveys (i.e., LSST, Euclid, and WFIRST) will require exquisite control over systematic effects. In this paper, we address shear calibration and present the most realistic forecast to date for LSST/Euclid/WFIRST and CMB lensing from a stage 4 CMB experiment ("CMB S4"). We use the CosmoLike code to simulate a joint analysis of all the two-point functions of galaxy density, galaxy shear, and CMB lensing convergence. We include the full Gaussian and non-Gaussian covariances and explore the resulting joint likelihood with Monte Carlo Markov chains. We constrain shear calibration biases while simultaneously varying cosmological parameters, galaxy biases, and photometric redshift uncertainties. We find that CMB lensing from CMB S4 enables the calibration of the shear biases down to 0.2%-3% in ten tomographic bins for LSST (below the ~0.5% requirements in most tomographic bins), down to 0.4%-2.4% in ten bins for Euclid, and 0.6%-3.2% in ten bins for WFIRST. For a given lensing survey, the method works best at high redshift, where shear calibration is otherwise most challenging. This self-calibration is robust to Gaussian photometric redshift uncertainties and to a reasonable level of intrinsic alignment. It is also robust to changes in the beam and the effectiveness of the component separation of the CMB experiment, and depends only slowly on its depth, making it possible with third-generation CMB experiments such as AdvACT and SPT-3G, as well as the Simons Observatory.

  7. Wavelength-Dependent PSFs and their Impact on Weak Lensing Measurements

    NASA Astrophysics Data System (ADS)

    Carlsten, S. G.; Strauss, Michael A.; Lupton, Robert H.; Meyers, Joshua E.; Miyazaki, Satoshi

    2018-06-01

    We measure and model the wavelength dependence of the point spread function (PSF) in the Hyper Suprime-Cam Subaru Strategic Program survey. We find that PSF chromaticity is present in that redder stars appear smaller than bluer stars in the g, r, and i bands at the 1-2 per cent level and in the z and y bands at the 0.1-0.2 per cent level. From the color dependence of the PSF, we fit a model between the monochromatic PSF size based on weighted second moments, R, and wavelength of the form R(λ) ∝ λ^(-b). We find values of b between 0.2 and 0.5, depending on the epoch and filter. This is consistent with the expectations of a turbulent atmosphere with an outer scale length of ~10-100 m, indicating that the atmosphere is dominating the chromaticity. In the best seeing data, we find that the optical system and detector also contribute some wavelength dependence. Meyers & Burchat (2015b) showed that b must be measured to an accuracy of ~0.02 not to dominate the systematic error budget of the Large Synoptic Survey Telescope (LSST) weak lensing (WL) survey. Using simple image simulations, we find that b can be inferred with this accuracy in the r and i bands for all positions in the LSST focal plane, assuming a stellar density of 1 star arcmin^-2 and that the optical component of the PSF can be accurately modeled. Therefore, it is possible to correct for most, if not all, of the bias that the wavelength-dependent PSF will introduce into an LSST-like WL survey.
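    Fitting the chromatic exponent b from per-band PSF sizes amounts to a straight-line fit in log-log space. A sketch with simulated sizes follows; the band centers, the b = 0.3 input value, and the noise level are illustrative numbers, not the survey's measurements.

```python
import numpy as np

# Simulated per-band PSF sizes following R(lambda) ∝ lambda^(-b) with b = 0.3,
# a value inside the 0.2-0.5 range reported above. Band centers (nm) and the
# noise level are rough, illustrative choices.
rng = np.random.default_rng(1)
wavelengths = np.array([480.0, 620.0, 770.0, 890.0, 1000.0])   # ~g, r, i, z, y
b_true = 0.3
sizes = 0.7 * (wavelengths / 620.0) ** (-b_true) * (1.0 + rng.normal(0.0, 1e-3, 5))

# A power law is a straight line in log-log space:
#   log R = const - b * log(lambda),
# so minus the least-squares slope estimates b.
slope, intercept = np.polyfit(np.log(wavelengths), np.log(sizes), 1)
b_fit = -slope
```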

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rasmussen, Andrew P.; Hale, Layton; Kim, Peter

    Meeting the science goals for the Large Synoptic Survey Telescope (LSST) translates into a demanding set of imaging performance requirements for the optical system over a wide (3.5°) field of view. In turn, meeting those imaging requirements necessitates maintaining precise control of the focal plane surface (10 µm P-V) over the entire field of view (640 mm diameter) at the operating temperature (T ≈ -100 °C) and over the operational elevation angle range. We briefly describe the hierarchical design approach for the LSST Camera focal plane and the baseline design for assembling the flat focal plane at room temperature. Preliminary results of gravity load and thermal distortion calculations are provided, and early metrological verification of candidate materials under cold thermal conditions is presented. A detailed, generalized method for stitching together sparse metrology data originating from differential, non-contact metrological data acquisition spanning multiple (non-continuous) sensor surfaces making up the focal plane is described and demonstrated. Finally, we describe some in situ alignment verification alternatives, some of which may be integrated into the camera's focal plane.

  9. University of Arizona High Energy Physics Program at the Cosmic Frontier 2014-2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abate, Alex; Cheu, Elliott

    This is the final technical report from the University of Arizona High Energy Physics program at the Cosmic Frontier covering the period 2014-2016. The work aims to advance the understanding of dark energy using the Large Synoptic Survey Telescope (LSST). Progress on the engineering design of the power supplies for the LSST camera is discussed. A variety of contributions to photometric redshift measurement uncertainties were studied. The effect of the intergalactic medium on the photometric redshift of very distant galaxies was evaluated. Computer code was developed realizing the full chain of calculations needed to accurately and efficiently run large-scale simulations.

  10. Photometric Redshift Calibration Strategy for WFIRST Cosmology

    NASA Astrophysics Data System (ADS)

    Hemmati, Shoubaneh; WFIRST, WFIRST-HLS-COSMOLOGY

    2018-01-01

    In order for WFIRST and other Stage IV dark energy experiments (e.g. LSST, Euclid) to infer cosmological parameters not limited by systematic errors, accurate redshift measurements are needed. This accuracy can only be met using spectroscopic subsamples to calibrate the full sample. In this poster, we employ the machine-learning, SOM-based spectroscopic sampling technique developed in Masters et al. 2015, which uses the empirical color-redshift relation among galaxies, to find the minimum spectroscopic sample required for the WFIRST weak lensing calibration. We use galaxies from the CANDELS survey to build the LSST+WFIRST lensing analog sample of ~36k objects and train the LSST+WFIRST SOM. We show that 26% of the WFIRST lensing sample consists of sources fainter than the Euclid depth in the optical, 91% of which live in color cells already occupied by brighter galaxies. We demonstrate the similarity between faint and bright galaxies as well as the feasibility of redshift measurements at different brightness levels. However, 4% of the SOM cells are occupied only by faint galaxies, for which we recommend extra spectroscopy of ~200 new sources. Acquiring the spectra of these sources will enable the comprehensive calibration of the WFIRST color-redshift relation.
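The SOM-based technique above works by clustering galaxies in color space so that each map cell corresponds to a narrow slice of the color-redshift relation, after which cell occupancy by bright versus faint sources can be compared. As a rough illustration only (not the authors' pipeline; the grid size, learning schedule, and mock "colors" are all invented here), a minimal numpy-only self-organizing map:

```python
import numpy as np

rng = np.random.default_rng(0)

def train_som(data, grid=(8, 8), iters=2000, lr0=0.5, sigma0=3.0):
    """Train a tiny rectangular SOM with numpy only."""
    h, w = grid
    weights = rng.normal(size=(h * w, data.shape[1]))
    # Grid coordinates of each cell, used by the neighborhood function.
    yy, xx = np.divmod(np.arange(h * w), w)
    coords = np.stack([yy, xx], axis=1).astype(float)
    for t in range(iters):
        x = data[rng.integers(len(data))]
        bmu = np.argmin(((weights - x) ** 2).sum(axis=1))  # best-matching unit
        frac = t / iters
        lr = lr0 * (1 - frac)                  # decaying learning rate
        sigma = sigma0 * (1 - frac) + 0.5      # shrinking neighborhood
        d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
        nb = np.exp(-d2 / (2 * sigma ** 2))    # Gaussian neighborhood weights
        weights += lr * nb[:, None] * (x - weights)
    return weights

def map_to_cells(data, weights):
    """Assign each object to its best-matching SOM cell."""
    d2 = ((data[:, None, :] - weights[None, :, :]) ** 2).sum(axis=2)
    return np.argmin(d2, axis=1)

# Mock "colors": three correlated features standing in for g-r, r-i, etc.
colors = rng.normal(size=(1000, 3))
weights = train_som(colors)
cells = map_to_cells(colors, weights)
```

In a calibration analysis like the one described, one would then compare which cells are populated by the bright (spectroscopic) versus faint subsamples.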

  11. New Self-lensing Models of the Small Magellanic Cloud: Can Gravitational Microlensing Detect Extragalactic Exoplanets?

    NASA Astrophysics Data System (ADS)

    Mróz, Przemek; Poleski, Radosław

    2018-04-01

    We use three-dimensional distributions of classical Cepheids and RR Lyrae stars in the Small Magellanic Cloud (SMC) to model the stellar density distribution of a young and old stellar population in that galaxy. We use these models to estimate the microlensing self-lensing optical depth to the SMC, which is in excellent agreement with the observations. Our models are consistent with the total stellar mass of the SMC of about 1.0 × 10⁹ M⊙ under the assumption that all microlensing events toward this galaxy are caused by self-lensing. We also calculate the expected event rates and estimate that future large-scale surveys, like the Large Synoptic Survey Telescope (LSST), will be able to detect up to a few dozen microlensing events in the SMC annually. If the planet frequency in the SMC is similar to that in the Milky Way, a few extragalactic planets can be detected over the course of the LSST survey, provided significant changes in the SMC observing strategy are devised. A relatively small investment of LSST resources can give us a unique probe of the population of extragalactic exoplanets.

  12. Final acceptance testing of the LSST monolithic primary/tertiary mirror

    NASA Astrophysics Data System (ADS)

    Tuell, Michael T.; Burge, James H.; Cuerden, Brian; Gressler, William; Martin, Hubert M.; West, Steven C.; Zhao, Chunyu

    2014-07-01

    The Large Synoptic Survey Telescope (LSST) is a three-mirror wide-field survey telescope with the primary and tertiary mirrors on one monolithic substrate. This substrate is made of Ohara E6 borosilicate glass in a honeycomb sandwich, spin cast at the Steward Observatory Mirror Lab at The University of Arizona. Each surface is aspheric, with the specification expressed in terms of conic constant error, maximum active bending forces and, finally, a structure function specification on the residual errors. There are high-order deformation terms with no separate tolerance; any error is treated as a surface error and is included in the structure function. The radii of curvature are very different, requiring two independent test stations, each with an instantaneous phase-shifting interferometer and a null corrector. The primary null corrector is a standard two-element Offner null lens. The tertiary null corrector is a phase-etched computer-generated hologram (CGH). This paper details the two optical systems and their tolerances, showing that the uncertainty in measuring the figure is a small fraction of the structure function specification. Additional metrology includes the radii of curvature, optical axis locations, and relative surface tilts. The methods for measuring these are also described along with their tolerances.
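The structure function specification mentioned above bounds the mean squared surface difference as a function of pixel separation. A minimal sketch of how such a statistic can be estimated from a surface height map using Monte-Carlo pixel pairs (an illustrative method with invented bin counts, not the Mirror Lab's actual analysis):

```python
import numpy as np

def structure_function(surface, pixel_scale, n_bins=20, n_samples=100000, seed=1):
    """Monte-Carlo estimate of SF(d) = <(z(x1) - z(x2))^2>, binned by the
    separation d of randomly drawn pixel pairs."""
    rng = np.random.default_rng(seed)
    h, w = surface.shape
    i1 = rng.integers(0, h, n_samples); j1 = rng.integers(0, w, n_samples)
    i2 = rng.integers(0, h, n_samples); j2 = rng.integers(0, w, n_samples)
    sep = np.hypot(i1 - i2, j1 - j2) * pixel_scale
    dz2 = (surface[i1, j1] - surface[i2, j2]) ** 2
    bins = np.linspace(0, sep.max(), n_bins)
    idx = np.digitize(sep, bins)
    sf = np.array([dz2[idx == k].mean() if np.any(idx == k) else np.nan
                   for k in range(1, n_bins)])
    mids = 0.5 * (bins[1:] + bins[:-1])
    return mids, sf

# Demo: for an uncorrelated (white-noise) surface of rms sigma, SF(d) -> 2 sigma^2.
surface = np.random.default_rng(0).normal(size=(100, 100))
mids, sf = structure_function(surface, pixel_scale=1.0)
```

For a real mirror map the interesting content is how SF(d) grows with d relative to the atmospheric-seeing-based specification curve.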

  13. Large Synoptic Survey Telescope: From Science Drivers to Reference Design

    DTIC Science & Technology

    2008-01-01

    The LSST design is driven by four main science themes: (1) constraining dark energy and dark matter, (2) taking an inventory of the Solar System, (3) exploring the transient optical sky, and (4) mapping the Milky Way. Each of these four themes probes the faint time domain. Current models of cosmology require the existence of both dark matter and dark energy to match observational data.

  14. CMU DeepLens: deep learning for automatic image-based galaxy-galaxy strong lens finding

    NASA Astrophysics Data System (ADS)

    Lanusse, François; Ma, Quanbin; Li, Nan; Collett, Thomas E.; Li, Chun-Liang; Ravanbakhsh, Siamak; Mandelbaum, Rachel; Póczos, Barnabás

    2018-01-01

    Galaxy-scale strong gravitational lensing can not only provide a valuable probe of the dark matter distribution of massive galaxies, but also provide valuable cosmological constraints, either by studying the population of strong lenses or by measuring time delays in lensed quasars. Due to the rarity of galaxy-scale strongly lensed systems, fast and reliable automated lens finding methods will be essential in the era of large surveys such as Large Synoptic Survey Telescope, Euclid and Wide-Field Infrared Survey Telescope. To tackle this challenge, we introduce CMU DeepLens, a new fully automated galaxy-galaxy lens finding method based on deep learning. This supervised machine learning approach does not require any tuning after the training step which only requires realistic image simulations of strongly lensed systems. We train and validate our model on a set of 20 000 LSST-like mock observations including a range of lensed systems of various sizes and signal-to-noise ratios (S/N). We find on our simulated data set that for a rejection rate of non-lenses of 99 per cent, a completeness of 90 per cent can be achieved for lenses with Einstein radii larger than 1.4 arcsec and S/N larger than 20 on individual g-band LSST exposures. Finally, we emphasize the importance of realistically complex simulations for training such machine learning methods by demonstrating that the performance of models of significantly different complexities cannot be distinguished on simpler simulations. We make our code publicly available at https://github.com/McWilliamsCenter/CMUDeepLens.

  15. Automated software configuration in the MONSOON system

    NASA Astrophysics Data System (ADS)

    Daly, Philip N.; Buchholz, Nick C.; Moore, Peter C.

    2004-09-01

    MONSOON is the next generation OUV-IR controller project being developed at NOAO. The design is flexible, emphasizing code re-use, maintainability and scalability as key factors. The software needs to support widely divergent detector systems ranging from multi-chip mosaics (for LSST, QUOTA, ODI and NEWFIRM) down to large single or multi-detector laboratory development systems. In order for this flexibility to be effective and safe, the software must be able to configure itself to the requirements of the attached detector system at startup. The basic building block of all MONSOON systems is the PAN-DHE pair which make up a single data acquisition node. In this paper we discuss the software solutions used in the automatic PAN configuration system.

  16. Robust Period Estimation Using Mutual Information for Multiband Light Curves in the Synoptic Survey Era

    NASA Astrophysics Data System (ADS)

    Huijse, Pablo; Estévez, Pablo A.; Förster, Francisco; Daniel, Scott F.; Connolly, Andrew J.; Protopapas, Pavlos; Carrasco, Rodrigo; Príncipe, José C.

    2018-05-01

    The Large Synoptic Survey Telescope (LSST) will produce an unprecedented amount of light curves using six optical bands. Robust and efficient methods that can aggregate data from multidimensional sparsely sampled time-series are needed. In this paper we present a new method for light curve period estimation based on quadratic mutual information (QMI). The proposed method does not assume a particular model for the light curve nor its underlying probability density and it is robust to non-Gaussian noise and outliers. By combining the QMI from several bands the true period can be estimated even when no single-band QMI yields the period. Period recovery performance as a function of average magnitude and sample size is measured using 30,000 synthetic multiband light curves of RR Lyrae and Cepheid variables generated by the LSST Operations and Catalog simulators. The results show that aggregating information from several bands is highly beneficial in LSST sparsely sampled time-series, obtaining an absolute increase in period recovery rate up to 50%. We also show that the QMI is more robust to noise and light curve length (sample size) than the multiband generalizations of the Lomb–Scargle and AoV periodograms, recovering the true period in 10%–30% more cases than its competitors. A python package containing efficient Cython implementations of the QMI and other methods is provided.
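The QMI approach scores a trial period by the statistical dependence between folded phase and magnitude, summed over bands, so no light-curve model is assumed. A simplified sketch using a Cauchy-Schwarz QMI with Gaussian Parzen windows (an illustrative stand-in for the paper's estimator; the kernel widths, trial grid, and synthetic data are arbitrary choices here):

```python
import numpy as np

def cs_qmi(x, y, sx=0.05, sy=0.3):
    """Cauchy-Schwarz quadratic mutual information between samples x and y,
    estimated with Gaussian Parzen windows (normalization constants dropped;
    they are common to all trial periods and do not affect the argmax)."""
    def gram(v, s):
        return np.exp(-((v[:, None] - v[None, :]) ** 2) / (4 * s ** 2))
    gx, gy = gram(x, sx), gram(y, sy)
    vj = (gx * gy).mean()                               # joint term
    vm = gx.mean() * gy.mean()                          # marginal term
    vc = (gx.mean(axis=1) * gy.mean(axis=1)).mean()     # cross term
    return np.log(vj * vm / vc ** 2)

def multiband_qmi_period(times, mags, trial_periods):
    """Score each trial period by summing per-band QMI between phase and magnitude."""
    scores = []
    for p in trial_periods:
        s = 0.0
        for t, m in zip(times, mags):
            phase = (t / p) % 1.0
            s += cs_qmi(phase, (m - m.mean()) / m.std())
        scores.append(s)
    return trial_periods[int(np.argmax(scores))], np.array(scores)

# Synthetic 3-band sinusoidal variable, irregularly sampled, true period 2.5 d.
rng = np.random.default_rng(2)
true_p = 2.5
bands_t = [np.sort(rng.uniform(0, 100, 120)) for _ in range(3)]
bands_m = [np.sin(2 * np.pi * t / true_p + ph) + 0.1 * rng.normal(size=t.size)
           for t, ph in zip(bands_t, (0.0, 0.3, 0.6))]
trials = np.array([1.3, 1.9, 2.5, 3.3, 4.1])
best, scores = multiband_qmi_period(bands_t, bands_m, trials)
```

At a wrong trial period the folded phases decorrelate from the magnitudes and the QMI drops, which is the mechanism that lets several weak single-band signals combine into one strong multiband detection.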

  17. A Euclid, LSST and WFIRST Joint Processing Study

    NASA Astrophysics Data System (ADS)

    Chary, Ranga-Ram; Joint Processing Working Group

    2018-01-01

    Euclid, LSST and WFIRST are the flagship cosmological projects of the next decade. By mapping several thousand square degrees of sky and covering the electromagnetic spectrum from the optical to the NIR with (sub-)arcsec resolution, these projects will provide exciting new constraints on the nature of dark energy and dark matter. The ultimate cosmological, astrophysical and time-domain science yield from these missions, which will detect several billions of sources, requires joint processing at the pixel level. Three U.S. agencies (DOE, NASA and NSF) are supporting an 18-month study which aims to 1) assess the optimal techniques to combine these and ancillary data sets at the pixel level; 2) investigate options for an interface that will enable community access to the joint data products; and 3) identify the computing and networking infrastructure to properly handle and manipulate these large datasets together. A Joint Processing Working Group (JPWG) is carrying out this study and consists of US-based members from the community and science/data processing centers of each of these projects. Coordination with European partners is envisioned in the future and European Euclid members are involved in the JPWG as observers. The JPWG will scope the effort and resources required to build up the capabilities to support scientific investigations using joint processing in time for the start of science surveys by LSST and Euclid.

  18. In Pursuit of LSST Science Requirements: A Comparison of Photometry Algorithms

    NASA Astrophysics Data System (ADS)

    Becker, Andrew C.; Silvestri, Nicole M.; Owen, Russell E.; Ivezić, Željko; Lupton, Robert H.

    2007-12-01

    We have developed an end-to-end photometric data-processing pipeline to compare current photometric algorithms commonly used on ground-based imaging data. This test bed is exceedingly adaptable and enables us to perform many research and development tasks, including image subtraction and co-addition, object detection and measurements, the production of photometric catalogs, and the creation and stocking of database tables with time-series information. This testing has been undertaken to evaluate existing photometry algorithms for consideration by a next-generation image-processing pipeline for the Large Synoptic Survey Telescope (LSST). We outline the results of our tests for four packages: the Sloan Digital Sky Survey's Photo package, DAOPHOT and ALLFRAME, DOPHOT, and two versions of Source Extractor (SExtractor). The ability of these algorithms to perform point-source photometry, astrometry, shape measurements, and star-galaxy separation and to measure objects at low signal-to-noise ratio is quantified. We also perform a detailed crowded-field comparison of DAOPHOT and ALLFRAME, and profile the speed and memory requirements in detail for SExtractor. We find that both DAOPHOT and Photo are able to perform aperture photometry to high enough precision to meet LSST's science requirements, but they perform less well at PSF-fitting photometry. Photo performs the best at simultaneous point- and extended-source shape and brightness measurements. SExtractor is the fastest algorithm, and recent upgrades in the software yield high-quality centroid and shape measurements with little bias toward faint magnitudes. ALLFRAME yields the best photometric results in crowded fields.
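All of the compared packages implement aperture photometry in some form: sum the counts inside a circular aperture and subtract a sky level estimated in a surrounding annulus. A minimal illustration on a synthetic star (this sketch is not drawn from any of the packages above; the PSF width, aperture radii, and sky level are invented):

```python
import numpy as np

def aperture_photometry(image, x0, y0, r_ap, r_in, r_out):
    """Sum the flux in a circular aperture of radius r_ap, subtracting the
    median sky measured in the annulus r_in <= r < r_out."""
    yy, xx = np.indices(image.shape)
    r = np.hypot(xx - x0, yy - y0)
    sky = np.median(image[(r >= r_in) & (r < r_out)])
    ap = r < r_ap
    return image[ap].sum() - sky * ap.sum()

# Synthetic star: Gaussian PSF of total flux 1000 on a flat sky of 5 counts/pixel.
yy, xx = np.indices((64, 64))
psf = np.exp(-((xx - 32) ** 2 + (yy - 32) ** 2) / (2 * 2.0 ** 2))
image = 5.0 + 1000.0 * psf / psf.sum()
flux = aperture_photometry(image, 32, 32, r_ap=8, r_in=12, r_out=20)
```

Real pipelines refine every step shown here (centroiding, weighted apertures, aperture corrections, robust sky estimation), which is exactly where the packages in the comparison differ.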

  19. The variable sky of deep synoptic surveys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ridgway, Stephen T.; Matheson, Thomas; Mighell, Kenneth J.

    2014-11-20

    The discovery of variable and transient sources is an essential product of synoptic surveys. The alert stream will require filtering for personalized criteria—a process managed by a functionality commonly described as a Broker. In order to understand quantitatively the magnitude of the alert generation and Broker tasks, we have undertaken an analysis of the most numerous types of variable targets in the sky—Galactic stars, quasi-stellar objects (QSOs), active galactic nuclei (AGNs), and asteroids. It is found that the Large Synoptic Survey Telescope (LSST) will be capable of discovering ∼10^5 high-latitude (|b| > 20°) variable stars per night at the beginning of the survey. (The corresponding number for |b| < 20° is orders of magnitude larger, but subject to caveats concerning extinction and crowding.) However, the number of new discoveries may well drop below 100 per night within less than one year. The same analysis applied to Gaia clarifies the complementarity of the Gaia and LSST surveys. Discoveries of AGNs and QSOs are each predicted to begin at ∼3000 per night and decrease 50-fold over four years. Supernovae are expected at ∼1100 per night, and after several survey years will dominate the new variable discovery rate. LSST asteroid discoveries will start at >10^5 per night, and if orbital determination has a 50% success rate per epoch, they will drop below 1000 per night within two years.

  20. Fabrication of the LSST monolithic primary-tertiary mirror

    NASA Astrophysics Data System (ADS)

    Tuell, Michael T.; Martin, Hubert M.; Burge, James H.; Ketelsen, Dean A.; Law, Kevin; Gressler, William J.; Zhao, Chunyu

    2012-09-01

    As previously reported (at the SPIE Astronomical Instrumentation conference of 2010 in San Diego), the Large Synoptic Survey Telescope (LSST) utilizes a three-mirror design in which the primary (M1) and tertiary (M3) mirrors are two concentric aspheric surfaces on one monolithic substrate. The substrate material is Ohara E6 borosilicate glass, in a honeycomb sandwich configuration, currently in production at The University of Arizona's Steward Observatory Mirror Lab. We provide an update on the status of the mirrors and metrology systems, which have advanced from concepts to hardware in the past two years. In addition to the normal requirements for smooth surfaces of the appropriate prescriptions, the alignment of the two surfaces must be accurately measured and controlled in the production lab, reducing the degrees of freedom that need to be controlled in the telescope. The surface specification is described as a structure function, related to seeing in excellent conditions. Both the pointing and centration of the two optical axes are important parameters, in addition to the axial spacing of the two vertices. This paper details the manufacturing process and metrology systems for each surface, including the alignment of the two surfaces. M1 is a hyperboloid and can utilize a standard Offner null corrector, whereas M3 is an oblate ellipsoid, so it has positive spherical aberration; its null corrector is a phase-etched computer-generated hologram (CGH) between the mirror surface and the center of curvature. Laser trackers are relied upon to measure the alignment and spacing, as well as for rough-surface metrology during loose-abrasive grinding.

  1. Architectural Implications for Spatial Object Association Algorithms*

    PubMed Central

    Kumar, Vijay S.; Kurc, Tahsin; Saltz, Joel; Abdulla, Ghaleb; Kohn, Scott R.; Matarazzo, Celeste

    2013-01-01

    Spatial object association, also referred to as crossmatch of spatial datasets, is the problem of identifying and comparing objects in two or more datasets based on their positions in a common spatial coordinate system. In this work, we evaluate two crossmatch algorithms that are used for astronomical sky surveys, on the following database system architecture configurations: (1) Netezza Performance Server®, a parallel database system with active disk style processing capabilities, (2) MySQL Cluster, a high-throughput network database system, and (3) a hybrid configuration consisting of a collection of independent database system instances with data replication support. Our evaluation provides insights about how architectural characteristics of these systems affect the performance of the spatial crossmatch algorithms. We conducted our study using real use-case scenarios borrowed from a large-scale astronomy application known as the Large Synoptic Survey Telescope (LSST). PMID:25692244
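A spatial crossmatch of the kind evaluated above reduces to a nearest-neighbor search within a matching radius, and the database architectures differ mainly in how they partition and index that search. A minimal flat-sky sketch using a uniform grid index (illustrative only; production crossmatch engines use sky-aware partitioning such as zones or hierarchical triangular meshes, and proper spherical distances):

```python
from collections import defaultdict
import numpy as np

def crossmatch(ra1, dec1, ra2, dec2, radius_deg):
    """Match each source in catalog 1 to its nearest catalog-2 source within
    radius_deg, using a grid whose cell size equals the match radius so that
    only the 3x3 neighborhood of cells must be searched."""
    cell = radius_deg
    grid = defaultdict(list)
    for j, (r, d) in enumerate(zip(ra2, dec2)):
        grid[(int(r // cell), int(d // cell))].append(j)
    matches = []
    for i, (r, d) in enumerate(zip(ra1, dec1)):
        ci, cj = int(r // cell), int(d // cell)
        best, best_d = -1, radius_deg
        for di in (-1, 0, 1):
            for dj in (-1, 0, 1):
                for j in grid.get((ci + di, cj + dj), ()):
                    # Flat-sky separation with the cos(dec) RA compression.
                    sep = np.hypot((ra2[j] - r) * np.cos(np.radians(d)),
                                   dec2[j] - d)
                    if sep < best_d:
                        best, best_d = j, sep
        if best >= 0:
            matches.append((i, best))
    return matches

# Demo: catalog 2 is catalog 1 shifted by 0.36 arcsec; radius 36 arcsec.
ra1 = np.array([10.0, 10.5, 11.2])
dec1 = np.array([0.0, 0.2, -0.3])
matches = crossmatch(ra1, dec1, ra1 + 1e-4, dec1 + 1e-4, radius_deg=0.01)
```

The grid here plays the role that data partitioning plays in the parallel database configurations studied in the paper: it bounds how much of the second catalog each lookup must touch.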

  2. Projected Near-Earth Object Discovery Performance of the Large Synoptic Survey Telescope

    NASA Technical Reports Server (NTRS)

    Chesley, Steven R.; Veres, Peter

    2017-01-01

    This report describes the methodology and results of an assessment study of the performance of the Large Synoptic Survey Telescope (LSST) in its planned efforts to detect and catalog near-Earth objects (NEOs).

  3. Searching for modified growth patterns with tomographic surveys

    NASA Astrophysics Data System (ADS)

    Zhao, Gong-Bo; Pogosian, Levon; Silvestri, Alessandra; Zylberberg, Joel

    2009-04-01

    In alternative theories of gravity, designed to produce cosmic acceleration at the current epoch, the growth of large scale structure can be modified. We study the potential of upcoming and future tomographic surveys such as Dark Energy Survey (DES) and Large Synoptic Survey Telescope (LSST), with the aid of cosmic microwave background (CMB) and supernovae data, to detect departures from the growth of cosmic structure expected within general relativity. We employ parametric forms to quantify the potential time- and scale-dependent variation of the effective gravitational constant and the differences between the two Newtonian potentials. We then apply the Fisher matrix technique to forecast the errors on the modified growth parameters from galaxy clustering, weak lensing, CMB, and their cross correlations across multiple photometric redshift bins. We find that even with conservative assumptions about the data, DES will produce nontrivial constraints on modified growth and that LSST will do significantly better.
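The Fisher matrix technique referenced above forecasts parameter errors from the derivatives of the observables with respect to the parameters and the data covariance, without simulating actual data. A minimal sketch on a toy linear model (the survey-specific observables, modified-growth parameters, and covariances are of course far richer than this):

```python
import numpy as np

def fisher_matrix(derivs, cov):
    """F_ij = d_i^T C^-1 d_j for a Gaussian likelihood with a
    parameter-independent covariance; derivs has shape (n_params, n_obs)."""
    cinv = np.linalg.inv(cov)
    return derivs @ cinv @ derivs.T

def forecast_errors(fisher):
    """Marginalized 1-sigma errors: sqrt of the diagonal of F^-1."""
    return np.sqrt(np.diag(np.linalg.inv(fisher)))

# Toy observable: a line y = a + b*x measured at 10 points with sigma = 0.1.
x = np.linspace(0, 1, 10)
derivs = np.vstack([np.ones_like(x), x])   # dy/da and dy/db
cov = 0.1 ** 2 * np.eye(x.size)
errs = forecast_errors(fisher_matrix(derivs, cov))
```

Combining probes (galaxy clustering, lensing, CMB) corresponds to summing their Fisher matrices before inverting, which is how the cross-correlation gains in the forecast arise.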

  4. LSST communications middleware implementation

    NASA Astrophysics Data System (ADS)

    Mills, Dave; Schumacher, German; Lotz, Paul

    2016-07-01

    The LSST communications middleware is based on a set of software abstractions that provide standard interfaces for common communications services. The observatory requires communication between diverse subsystems, implemented by different contractors, and comprehensive archiving of subsystem status data. The Service Abstraction Layer (SAL) is implemented using open-source packages that implement the open standards of DDS (Data Distribution Service) for data communication and SQL (Structured Query Language) for database access. For every subsystem, abstractions for each of the telemetry data streams, along with commands/responses and events, have been agreed with the appropriate component vendor (such as Dome, TMA, Hexapod) and captured in Interface Control Documents (ICDs). The OpenSplice (PrismTech) Community Edition of DDS provides an LGPL-licensed distribution which may be freely redistributed. The availability of the full source code provides assurance that the project will be able to maintain it over the full 10-year survey, independent of the fortunes of the original providers.
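The SAL pattern described above (per-subsystem telemetry streams, commands with acknowledgements, and events) can be illustrated with an in-process publish/subscribe stub. This is a hypothetical stand-in written for this note, not the real SAL or DDS API; in the real system these topics are DDS topics whose schemas come from the ICDs:

```python
from collections import defaultdict

class SalStub:
    """Tiny in-process pub/sub bus sketching the SAL topic pattern."""

    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, payload):
        for cb in self._subs[topic]:
            cb(payload)

    def command(self, subsystem, name, **params):
        """Publish a command and collect the acknowledgements it triggers."""
        acks = []
        self.subscribe(f"{subsystem}.ack", acks.append)
        self.publish(f"{subsystem}.cmd.{name}", params)
        return acks

sal = SalStub()
telemetry = []
sal.subscribe("Dome.telemetry.position", telemetry.append)

def dome_handler(params):
    # A mock Dome subsystem: report new position, then acknowledge the command.
    sal.publish("Dome.telemetry.position", {"az": params["az"]})
    sal.publish("Dome.ack", {"cmd": "moveAz", "result": "complete"})

sal.subscribe("Dome.cmd.moveAz", dome_handler)
acks = sal.command("Dome", "moveAz", az=120.0)
```

The value of the abstraction layer is that subsystem code is written against this interface while the transport (DDS) and archiving (SQL) live behind it.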

  5. LSST camera readout chip ASPIC: test tools

    NASA Astrophysics Data System (ADS)

    Antilogus, P.; Bailly, Ph; Jeglot, J.; Juramy, C.; Lebbolo, H.; Martin, D.; Moniez, M.; Tocut, V.; Wicek, F.

    2012-02-01

    The LSST camera will have more than 3000 video-processing channels. The readout of this large focal plane requires a very compact readout chain. The correlated double sampling (CDS) technique, generally used for the signal readout of CCDs, is also adopted for this application and implemented with the so-called dual-slope integrator method. We have designed and implemented an ASIC for LSST: the Analog Signal Processing asIC (ASPIC). The goal is to amplify the signal close to the output, in order to maximize the signal-to-noise ratio, and to send differential outputs for digitization. Other requirements are that each chip should process the output of half a CCD, that is, 8 channels, and should operate at 173 K. A specific back-end board has been designed especially for lab test purposes. It manages the clock signals, digitizes the differential analog outputs of the ASPIC, and stores the data in memory. It contains 8 ADCs (18 bits), 512 kwords of memory and a USB interface. An FPGA manages all signals from/to the components on board and generates the timing sequence for the ASPIC. Its firmware is written in the Verilog and VHDL languages. Internal registers define the various test parameters of the ASPIC. A LabVIEW GUI allows the user to load or update these registers and to check for proper operation. Several series of tests, including linearity, noise and crosstalk, have been performed over the past year to characterize the ASPIC at room and cold temperatures. At present, the ASPIC, back-end board and CCD detectors are being integrated to characterize the whole readout chain.
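The dual-slope integrator realizes correlated double sampling by integrating the video waveform during the reset (baseline) window and subtracting the integral over the signal window, so that common-mode offsets and slow drifts cancel in the difference. A minimal numerical sketch (the window positions, levels, and noise figures are invented for illustration):

```python
import numpy as np

def dual_slope_cds(video, reset_win, signal_win):
    """Dual-slope CDS: average (integrate) the video over the reset window,
    subtract the average over the signal window. The common baseline cancels."""
    base = video[reset_win].mean()
    sig = video[signal_win].mean()
    return base - sig    # CCD video swings negative with signal, so base - sig > 0

# Simulated pixel video: a 250-unit baseline with noise, then a 40-unit charge dump.
rng = np.random.default_rng(4)
n = 100
video = 250.0 + rng.normal(0, 0.5, n)
video[60:] -= 40.0                    # signal level after the charge dump
reset_win = np.arange(10, 50)
signal_win = np.arange(60, 100)
est = dual_slope_cds(video, reset_win, signal_win)
```

Because both windows see the same baseline, the estimate recovers the 40-unit signal while averaging down the per-sample noise, which is why CDS suppresses reset (kTC) noise in real CCD readout.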

  6. Consequences of CCD imperfections for cosmology determined by weak lensing surveys: from laboratory measurements to cosmological parameter bias

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Okura, Yuki; Petri, Andrea; May, Morgan

    Weak gravitational lensing causes subtle changes in the apparent shapes of galaxies due to the bending of light by the gravity of foreground masses. By measuring the shapes of large numbers of galaxies (millions in recent surveys, up to tens of billions in future surveys) we can infer the parameters that determine cosmology. Imperfections in the detectors used to record images of the sky can introduce changes in the apparent shapes of galaxies, which in turn can bias the inferred cosmological parameters. In this paper we consider the effect of two widely discussed sensor imperfections: tree rings, due to impurity gradients which cause transverse electric fields in the charge-coupled devices (CCDs), and pixel-size variation, due to periodic CCD fabrication errors. These imperfections can be observed when the detectors are subject to uniform illumination (flat-field images). We develop methods to determine the spurious shear and convergence (due to the imperfections) from the flat-field images. We calculate how the spurious shear, when added to the lensing shear, will bias the determination of cosmological parameters. We apply our methods to candidate sensors of the Large Synoptic Survey Telescope (LSST) as a timely and important example, analyzing flat-field images recorded with LSST prototype CCDs in the laboratory. We find that the tree rings and periodic pixel-size variation present in the LSST CCDs will introduce negligible bias to cosmological parameters determined from the lensing power spectrum, specifically w, Ω_m and σ_8.
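Tree rings appear in flat fields as low-amplitude circular ripples centered on the silicon wafer center, so a first diagnostic is the azimuthally averaged fractional flat-field deviation versus radius. A minimal sketch on a synthetic flat (the ring amplitude, period, and center location are invented; deriving the actual spurious shear and convergence from such profiles requires the further modeling the paper describes):

```python
import numpy as np

def radial_profile(flat, center, n_bins=50):
    """Azimuthally averaged fractional flat-field deviation versus distance
    from `center` (the wafer center, which may lie outside the CCD)."""
    yy, xx = np.indices(flat.shape)
    r = np.hypot(xx - center[0], yy - center[1])
    frac = flat / flat.mean() - 1.0
    bins = np.linspace(0, r.max(), n_bins + 1)
    idx = np.digitize(r, bins)
    prof = np.array([frac[idx == k].mean() if np.any(idx == k) else 0.0
                     for k in range(1, n_bins + 1)])
    return 0.5 * (bins[:-1] + bins[1:]), prof

# Synthetic flat with a 0.2% ring pattern whose center sits off the sensor corner.
yy, xx = np.indices((200, 200))
r = np.hypot(xx + 50, yy + 50)
flat = 1000.0 * (1.0 + 0.002 * np.sin(2 * np.pi * r / 30.0))
mids, prof = radial_profile(flat, center=(-50, -50))
```

The recovered profile oscillates at the ring period; in the laboratory analysis, gradients of such patterns feed the spurious shear and convergence estimates.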

  7. Consequences of CCD imperfections for cosmology determined by weak lensing surveys: from laboratory measurements to cosmological parameter bias

    DOE PAGES

    Okura, Yuki; Petri, Andrea; May, Morgan; ...

    2016-06-27

    Weak gravitational lensing causes subtle changes in the apparent shapes of galaxies due to the bending of light by the gravity of foreground masses. By measuring the shapes of large numbers of galaxies (millions in recent surveys, up to tens of billions in future surveys) we can infer the parameters that determine cosmology. Imperfections in the detectors used to record images of the sky can introduce changes in the apparent shapes of galaxies, which in turn can bias the inferred cosmological parameters. In this paper we consider the effect of two widely discussed sensor imperfections: tree rings, due to impurity gradients which cause transverse electric fields in the charge-coupled devices (CCDs), and pixel-size variation, due to periodic CCD fabrication errors. These imperfections can be observed when the detectors are subject to uniform illumination (flat-field images). We develop methods to determine the spurious shear and convergence (due to the imperfections) from the flat-field images. We calculate how the spurious shear, when added to the lensing shear, will bias the determination of cosmological parameters. We apply our methods to candidate sensors of the Large Synoptic Survey Telescope (LSST) as a timely and important example, analyzing flat-field images recorded with LSST prototype CCDs in the laboratory. We find that the tree rings and periodic pixel-size variation present in the LSST CCDs will introduce negligible bias to cosmological parameters determined from the lensing power spectrum, specifically w, Ω_m and σ_8.

  8. Multi-Wavelength Spectroscopy of Tidal Disruption Flares: A Legacy Sample for the LSST Era

    NASA Astrophysics Data System (ADS)

    Cenko, Stephen

    2017-08-01

    When a star passes within the sphere of disruption of a massive black hole, tidal forces will overcome self-gravity and unbind the star. While approximately half of the stellar debris is ejected at high velocities, the remaining material stays bound to the black hole and accretes, resulting in a luminous, long-lived transient known as a tidal disruption flare (TDF). In addition to serving as unique laboratories for accretion physics, TDFs offer the hope of measuring black hole masses in galaxies much too distant for resolved kinematic studies. In order to realize this potential, we must better understand the detailed processes by which the bound debris circularizes and forms an accretion disk. Spectroscopy is critical to this effort, as emission and absorption line diagnostics provide insight into the location and physical state (velocity, density, composition) of the emitting gas (in analogy with quasars). UV spectra are particularly critical, as most strong atomic features fall in this bandpass, and high-redshift TDF discoveries from LSST will sample rest-frame UV wavelengths. Here we propose to obtain a sequence of UV (HST) and optical (Gemini/GMOS) spectra for a sample of 5 TDFs discovered by the Zwicky Transient Facility, doubling the number of TDFs with UV spectra. Our observations will directly test models for the generation of the UV/optical emission (circularization vs. reprocessing) by searching for outflows and measuring densities, temperatures, and composition as a function of time. This effort is critical to developing the framework by which we can infer black hole properties (e.g., mass) from LSST TDF discoveries.

  9. Extracting meaning from astronomical telegrams

    NASA Astrophysics Data System (ADS)

    Graham, Matthew; Conwill, L.; Djorgovski, S. G.; Mahabal, A.; Donalek, C.; Drake, A.

    2011-01-01

    The rapidly emerging field of time domain astronomy is one of the most exciting and vibrant new research frontiers, ranging in scientific scope from studies of the Solar System to extreme relativistic astrophysics and cosmology. It is being enabled by a new generation of large synoptic digital sky surveys - LSST, Pan-STARRS, CRTS - that cover large areas of sky repeatedly, looking for transient objects and phenomena. One of the biggest challenges facing these is the automated classification of transient events, a process that needs machine-processible astronomical knowledge. Semantic technologies enable the formal representation of concepts and relations within a particular domain. ATELs (http://www.astronomerstelegram.org) are a commonly-used means for reporting and commenting upon new astronomical observations of transient sources (supernovae, stellar outbursts, blazar flares, etc.). However, they are loose and unstructured and employ scientific natural language for description: this makes automated processing of them - a necessity within the next decade with petascale data rates - a challenge. Nevertheless they represent a potentially rich corpus of information that could lead to new and valuable insights into transient phenomena. This project lies in the cutting-edge field of astrosemantics, a branch of astroinformatics, which applies semantic technologies to astronomy. The ATELs have been used to develop an appropriate concept scheme - a representation of the information they contain - for transient astronomy using aspects of natural language processing. We demonstrate that it is possible to infer the subject of an ATEL from the vocabulary used and to identify previously unassociated reports.

  10. Strong Gravitational Lensing as a Probe of Gravity, Dark-Matter and Super-Massive Black Holes

    NASA Astrophysics Data System (ADS)

    Koopmans, L.V.E.; Barnabe, M.; Bolton, A.; Bradac, M.; Ciotti, L.; Congdon, A.; Czoske, O.; Dye, S.; Dutton, A.; Elliasdottir, A.; Evans, E.; Fassnacht, C.D.; Jackson, N.; Keeton, C.; Lasio, J.; Moustakas, L.; Meneghetti, M.; Myers, S.; Nipoti, C.; Suyu, S.; van de Ven, G.; Vegetti, S.; Wucknitz, O.; Zhao, H.-S.

    Whereas considerable effort has been afforded in understanding the properties of galaxies, a full physical picture, connecting their baryonic and dark-matter content, super-massive black holes, and (metric) theories of gravity, is still ill-defined. Strong gravitational lensing furnishes a powerful method to probe gravity in the central regions of galaxies. It can (1) provide a unique detection-channel of dark-matter substructure beyond the local galaxy group, (2) constrain dark-matter physics, complementary to direct-detection experiments, as well as metric theories of gravity, (3) probe central super-massive black holes, and (4) provide crucial insight into galaxy formation processes from the dark matter point of view, independently of the nature and state of dark matter. To seriously address the above questions, a considerable increase in the number of strong gravitational-lens systems is required. In the timeframe 2010-2020, a staged approach with radio (e.g. EVLA, e-MERLIN, LOFAR, SKA phase-I) and optical (e.g. LSST and JDEM) instruments can provide 10^(2-4) new lenses, and up to 10^(4-6) new lens systems from SKA/LSST/JDEM all-sky surveys around ~2020. Follow-up imaging of (radio) lenses is necessary with moderate ground/space-based optical-IR telescopes and with 30-50m telescopes for spectroscopy (e.g. TMT, GMT, ELT). To answer these fundamental questions through strong gravitational lensing, a strong investment in large radio and optical-IR facilities is therefore critical in the coming decade. In particular, only large-scale radio lens surveys (e.g. with SKA) provide the large numbers of high-resolution and high-fidelity images of lenses needed for SMBH and flux-ratio anomaly studies.

  11. Optical testing of the LSST combined primary/tertiary mirror

    NASA Astrophysics Data System (ADS)

    Tuell, Michael T.; Martin, Hubert M.; Burge, James H.; Gressler, William J.; Zhao, Chunyu

    2010-07-01

    The Large Synoptic Survey Telescope (LSST) utilizes a three-mirror design in which the primary (M1) and tertiary (M3) mirrors are two concentric aspheric surfaces on one monolithic substrate. The substrate material is Ohara E6 borosilicate glass, in a honeycomb sandwich configuration, currently in production at The University of Arizona's Steward Observatory Mirror Lab. In addition to the normal requirements for smooth surfaces of the appropriate prescriptions, the alignment of the two surfaces must be accurately measured and controlled in the production lab. Both the pointing and centration of the two optical axes are important parameters, in addition to the axial spacing of the two vertices. This paper describes the basic metrology systems for each surface, with particular attention to the alignment of the two surfaces. These surfaces are aspheric enough to require null correctors for each wavefront. Both M1 and M3 are concave surfaces with both non-zero conic constants and higher-order terms (6th order for M1 and both 6th and 8th orders for M3). M1 is hyperboloidal and can utilize a standard Offner null corrector. M3 is an oblate ellipsoid, so has positive spherical aberration. We have chosen to place a phase-etched computer-generated hologram (CGH) between the mirror surface and the center-of-curvature (CoC), whereas the M1 null lens is beyond the CoC. One relatively new metrology tool is the laser tracker, which is relied upon to measure the alignment and spacings. A separate laser tracker system will be used to measure both surfaces during loose abrasive grinding and initial polishing.

  12. Architectural Implications for Spatial Object Association Algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, V S; Kurc, T; Saltz, J

    2009-01-29

    Spatial object association, also referred to as cross-match of spatial datasets, is the problem of identifying and comparing objects in two or more datasets based on their positions in a common spatial coordinate system. In this work, we evaluate two cross-match algorithms that are used for astronomical sky surveys, on the following database system architecture configurations: (1) Netezza Performance Server, a parallel database system with active disk style processing capabilities, (2) MySQL Cluster, a high-throughput network database system, and (3) a hybrid configuration consisting of a collection of independent database system instances with data replication support. Our evaluation provides insights about how architectural characteristics of these systems affect the performance of the spatial cross-match algorithms. We conducted our study using real use-case scenarios borrowed from a large-scale astronomy application known as the Large Synoptic Survey Telescope (LSST).
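
    The cross-match operation evaluated above can be illustrated with a minimal brute-force sketch. This is not the paper's algorithms or data: the catalogs, the 1-arcsecond radius, and the O(N*M) double loop are purely illustrative; production systems index one catalog spatially (e.g. zones or HTM) so that only nearby candidates are compared.

```python
import math

def angular_sep_deg(ra1, dec1, ra2, dec2):
    """Great-circle separation in degrees (haversine form)."""
    r1, d1, r2, d2 = map(math.radians, (ra1, dec1, ra2, dec2))
    a = (math.sin((d2 - d1) / 2) ** 2
         + math.cos(d1) * math.cos(d2) * math.sin((r2 - r1) / 2) ** 2)
    return math.degrees(2 * math.asin(math.sqrt(a)))

def crossmatch(cat_a, cat_b, radius_deg):
    """Return (i, j) index pairs whose positions agree within radius_deg."""
    return [(i, j)
            for i, (ra_a, dec_a) in enumerate(cat_a)
            for j, (ra_b, dec_b) in enumerate(cat_b)
            if angular_sep_deg(ra_a, dec_a, ra_b, dec_b) <= radius_deg]

# Two tiny mock catalogs of (RA, Dec) in degrees; only the first objects coincide.
cat_a = [(10.0, -5.0), (120.5, 30.2)]
cat_b = [(10.0001, -5.0001), (200.0, 10.0)]
print(crossmatch(cat_a, cat_b, radius_deg=1.0 / 3600.0))  # → [(0, 0)]
```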

  13. Toroid Joining Gun. [thermoplastic welding system using induction heating

    NASA Technical Reports Server (NTRS)

    Buckley, J. D.; Fox, R. L.; Swaim, R. J.

    1985-01-01

    The Toroid Joining Gun is a low-cost, self-contained, portable, low-powered (100-400 watts) thermoplastic welding system developed at Langley Research Center for joining plastic and composite parts using an induction heating technique. The device, developed for use in the fabrication of large space structures (LSST Program), can be used in any atmosphere or in a vacuum. Components can be joined in situ, whether on Earth or on a space platform. The expanded application of this welding gun is in the joining of thermoplastic composites, thermosetting composites, metals, and combinations of these materials. Its low-power requirements, light weight, rapid response, low cost, portability, and effective joining make it a candidate for solving many varied and unique bonding tasks.

  14. How Many Kilonovae Can Be Found in Past, Present, and Future Survey Data Sets?

    NASA Astrophysics Data System (ADS)

    Scolnic, D.; Kessler, R.; Brout, D.; Cowperthwaite, P. S.; Soares-Santos, M.; Annis, J.; Herner, K.; Chen, H.-Y.; Sako, M.; Doctor, Z.; Butler, R. E.; Palmese, A.; Diehl, H. T.; Frieman, J.; Holz, D. E.; Berger, E.; Chornock, R.; Villar, V. A.; Nicholl, M.; Biswas, R.; Hounsell, R.; Foley, R. J.; Metzger, J.; Rest, A.; García-Bellido, J.; Möller, A.; Nugent, P.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Bechtol, K.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Castander, F. J.; Cunha, C. E.; D’Andrea, C. B.; da Costa, L. N.; Davis, C.; Doel, P.; Drlica-Wagner, A.; Eifler, T. F.; Flaugher, B.; Fosalba, P.; Gaztanaga, E.; Gerdes, D. W.; Gruen, D.; Gruendl, R. A.; Gschwend, J.; Gutierrez, G.; Hartley, W. G.; Honscheid, K.; James, D. J.; Johnson, M. W. G.; Johnson, M. D.; Krause, E.; Kuehn, K.; Kuhlmann, S.; Lahav, O.; Li, T. S.; Lima, M.; Maia, M. A. G.; March, M.; Marshall, J. L.; Menanteau, F.; Miquel, R.; Neilsen, E.; Plazas, A. A.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, M.; Smith, R. C.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, R. C.; Tucker, D. L.; Walker, A. R.; DES Collaboration

    2018-01-01

    The discovery of a kilonova (KN) associated with the Advanced LIGO (aLIGO)/Virgo event GW170817 opens up new avenues of multi-messenger astrophysics. Here, using realistic simulations, we provide estimates of the number of KNe that could be found in data from past, present, and future surveys without a gravitational-wave trigger. For the simulation, we construct a spectral time-series model based on the DES-GW multi-band light curve from the single known KN event, and we use an average of BNS rates from past studies of 10^3 Gpc^-3 yr^-1, consistent with the one event found so far. Examining past and current data sets from transient surveys, the number of KNe we expect to find for ASAS-SN, SDSS, PS1, SNLS, DES, and SMT is between 0 and 0.3. We predict the number of detections per future survey to be 8.3 from ATLAS, 10.6 from ZTF, 5.5/69 from LSST (the Deep Drilling/Wide Fast Deep), and 16.0 from WFIRST. The maximum redshift of KNe discovered for each survey is z=0.8 for WFIRST, z=0.25 for LSST, and z=0.04 for ZTF and ATLAS. This maximum redshift for WFIRST is well beyond the sensitivity of aLIGO and some future GW missions. For the LSST survey, we also provide contamination estimates from Type Ia and core-collapse supernovae: after light curve and template-matching requirements, we estimate a background of just two events. More broadly, we stress that future transient surveys should consider how to optimize their search strategies to improve their detection efficiency and to consider similar analyses for GW follow-up programs.
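
    The back-of-the-envelope logic behind such predictions (volumetric rate x surveyed volume x duration x recovery efficiency) can be sketched as follows. This is a crude low-redshift approximation with assumed round numbers, not the paper's detailed simulations; the efficiency and cosmology values are invented for illustration.

```python
import math

def expected_kne(rate_gpc3_yr, z_max, duration_yr, efficiency):
    """Rough expected kilonova count: rate x volume x time x efficiency.

    Uses a Euclidean volume with d ~ c*z/H0, adequate only for z << 1;
    H0 = 70 km/s/Mpc is an assumed round value.
    """
    c_km_s, h0 = 299792.458, 70.0
    d_gpc = c_km_s * z_max / h0 / 1000.0          # Hubble-law distance in Gpc
    volume_gpc3 = 4.0 / 3.0 * math.pi * d_gpc ** 3
    return rate_gpc3_yr * volume_gpc3 * duration_yr * efficiency

# A ZTF-like toy survey: rate 10^3 Gpc^-3 yr^-1, z_max ~ 0.04, 3 yr,
# a few-percent effective recovery efficiency (all assumed).
print(round(expected_kne(1e3, 0.04, 3.0, 0.02), 1))  # → 1.3
```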

  15. How Many Kilonovae Can Be Found in Past, Present, and Future Survey Data Sets?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scolnic, D.; Kessler, R.; Brout, D.

    The discovery of a kilonova (KN) associated with the Advanced LIGO (aLIGO)/Virgo event GW170817 opens up new avenues of multi-messenger astrophysics. Here, using realistic simulations, we provide estimates of the number of KNe that could be found in data from past, present, and future surveys without a gravitational-wave trigger. For the simulation, we construct a spectral time-series model based on the DES-GW multi-band light curve from the single known KN event, and we use an average of BNS rates from past studies of 10^3 Gpc^-3 yr^-1, consistent with the one event found so far. Examining past and current data sets from transient surveys, the number of KNe we expect to find for ASAS-SN, SDSS, PS1, SNLS, DES, and SMT is between 0 and 0.3. We predict the number of detections per future survey to be 8.3 from ATLAS, 10.6 from ZTF, 5.5/69 from LSST (the Deep Drilling/Wide Fast Deep), and 16.0 from WFIRST. The maximum redshift of KNe discovered for each survey is z=0.8 for WFIRST, z=0.25 for LSST, and z=0.04 for ZTF and ATLAS. This maximum redshift for WFIRST is well beyond the sensitivity of aLIGO and some future GW missions. For the LSST survey, we also provide contamination estimates from Type Ia and core-collapse supernovae: after light curve and template-matching requirements, we estimate a background of just two events. Finally, more broadly, we stress that future transient surveys should consider how to optimize their search strategies to improve their detection efficiency and to consider similar analyses for GW follow-up programs.

  16. How Many Kilonovae Can Be Found in Past, Present, and Future Survey Data Sets?

    DOE PAGES

    Scolnic, D.; Kessler, R.; Brout, D.; ...

    2017-12-22

    The discovery of a kilonova (KN) associated with the Advanced LIGO (aLIGO)/Virgo event GW170817 opens up new avenues of multi-messenger astrophysics. Here, using realistic simulations, we provide estimates of the number of KNe that could be found in data from past, present, and future surveys without a gravitational-wave trigger. For the simulation, we construct a spectral time-series model based on the DES-GW multi-band light curve from the single known KN event, and we use an average of BNS rates from past studies of 10^3 Gpc^-3 yr^-1, consistent with the one event found so far. Examining past and current data sets from transient surveys, the number of KNe we expect to find for ASAS-SN, SDSS, PS1, SNLS, DES, and SMT is between 0 and 0.3. We predict the number of detections per future survey to be 8.3 from ATLAS, 10.6 from ZTF, 5.5/69 from LSST (the Deep Drilling/Wide Fast Deep), and 16.0 from WFIRST. The maximum redshift of KNe discovered for each survey is z=0.8 for WFIRST, z=0.25 for LSST, and z=0.04 for ZTF and ATLAS. This maximum redshift for WFIRST is well beyond the sensitivity of aLIGO and some future GW missions. For the LSST survey, we also provide contamination estimates from Type Ia and core-collapse supernovae: after light curve and template-matching requirements, we estimate a background of just two events. Finally, more broadly, we stress that future transient surveys should consider how to optimize their search strategies to improve their detection efficiency and to consider similar analyses for GW follow-up programs.

  17. A warm Spitzer survey of the LSST/DES 'Deep drilling' fields

    NASA Astrophysics Data System (ADS)

    Lacy, Mark; Farrah, Duncan; Brandt, Niel; Sako, Masao; Richards, Gordon; Norris, Ray; Ridgway, Susan; Afonso, Jose; Brunner, Robert; Clements, Dave; Cooray, Asantha; Covone, Giovanni; D'Andrea, Chris; Dickinson, Mark; Ferguson, Harry; Frieman, Joshua; Gupta, Ravi; Hatziminaoglou, Evanthia; Jarvis, Matt; Kimball, Amy; Lubin, Lori; Mao, Minnie; Marchetti, Lucia; Mauduit, Jean-Christophe; Mei, Simona; Newman, Jeffrey; Nichol, Robert; Oliver, Seb; Perez-Fournon, Ismael; Pierre, Marguerite; Rottgering, Huub; Seymour, Nick; Smail, Ian; Surace, Jason; Thorman, Paul; Vaccari, Mattia; Verma, Aprajita; Wilson, Gillian; Wood-Vasey, Michael; Cane, Rachel; Wechsler, Risa; Martini, Paul; Evrard, August; McMahon, Richard; Borne, Kirk; Capozzi, Diego; Huang, Jiashang; Lagos, Claudia; Lidman, Chris; Maraston, Claudia; Pforr, Janine; Sajina, Anna; Somerville, Rachel; Strauss, Michael; Jones, Kristen; Barkhouse, Wayne; Cooper, Michael; Ballantyne, David; Jagannathan, Preshanth; Murphy, Eric; Pradoni, Isabella; Suntzeff, Nicholas; Covarrubias, Ricardo; Spitler, Lee

    2014-12-01

    We propose a warm Spitzer survey to microJy depth of the four predefined Deep Drilling Fields (DDFs) for the Large Synoptic Survey Telescope (LSST) (three of which are also deep drilling fields for the Dark Energy Survey (DES)). Imaging these fields with warm Spitzer is a key component of the overall success of these projects, which address the 'Physics of the Universe' theme of the Astro2010 decadal survey. With deep, accurate, near-infrared photometry from Spitzer in the DDFs, we will generate photometric redshift distributions to apply to the surveys as a whole. The DDFs are also the areas where the supernova searches of DES and LSST are concentrated, and deep Spitzer data are essential to obtain photometric redshifts, stellar masses and constraints on ages and metallicities for the >10000 supernova host galaxies these surveys will find. This 'DEEPDRILL' survey will also address the 'Cosmic Dawn' goal of Astro2010 through being deep enough to find all the >10^11 solar mass galaxies within the survey area out to z~6. DEEPDRILL will complete the final 24.4 square degrees of imaging in the DDFs, which, when added to the 14 square degrees already imaged to this depth, will map a volume of 1 Gpc^3 at z>2. It will find ~100 galaxies of >10^11 solar masses at z~5 and ~40 protoclusters at z>2, providing targets for JWST that can be found in no other way. The Spitzer data, in conjunction with the multiwavelength surveys in these fields, ranging from X-ray through far-infrared and cm-radio, will comprise a unique legacy dataset for studies of galaxy evolution.

  18. Stellar Populations and Nearby Galaxies with the LSST

    NASA Astrophysics Data System (ADS)

    Saha, Abhijit; Olsen, K.; Monet, D. G.; LSST Stellar Populations Collaboration

    2009-01-01

    The LSST will produce a multi-color map and photometric object catalog of half the sky to r=27.6 (AB mag; 5-sigma). Time-space sampling of each field spanning ten years will allow variability, proper motion and parallax measurements for objects brighter than r=24.7. As part of providing an unprecedented map of the Galaxy, the accurate multi-band photometry will permit photometric parallaxes, chemical abundances and a handle on ages via colors at turn-off for main-sequence (MS) stars at all distances within the Galaxy as well as in the Magellanic Clouds, and dwarf satellites of the Milky Way. This will support comprehensive studies of star formation histories and chemical evolution for field stars. The structures of the Clouds and dwarf spheroidals will be traced with the MS stars, to equivalent surface densities fainter than 35 mag/square arc-second. With geometric parallax accuracy of 1 milli-arc-sec, comparable to HIPPARCOS but reaching more than 10 magnitudes fainter, a robust complete sample of solar neighborhood stars will be obtained. The LSST time sampling will identify and characterize variable stars of all types, from time scales of 1 hr to several years, a feast for variable star astrophysics. The combination of wide coverage, multi-band photometry, time sampling and parallax taken together will address several key problems: e.g. fine-tuning the extragalactic distance scale by examining properties of RR Lyraes and Cepheids as a function of parent populations, extending the faint end of the galaxy luminosity function by discovering faint galaxies via star-count density enhancements on degree scales, and identifying intergalactic stars through novae and Long Period Variables.
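
    The two distance handles mentioned above, photometric parallax via the distance modulus and geometric (trigonometric) parallax, reduce to one-line formulas. The absolute magnitude used below is a rough illustrative value for a solar-type main-sequence star, not a number from the abstract.

```python
def photometric_distance_pc(m, abs_mag):
    """Distance in pc from the distance modulus m - M = 5 log10(d / 10 pc)."""
    return 10.0 ** ((m - abs_mag + 5.0) / 5.0)

def parallax_distance_pc(parallax_mas):
    """Distance in pc from a trigonometric parallax in milliarcseconds."""
    return 1000.0 / parallax_mas

# A main-sequence star with M_r ~ 5 observed at LSST's r ~ 24.7 limit for
# astrometry sits at roughly 87 kpc; a 1 mas parallax corresponds to 1 kpc.
print(photometric_distance_pc(24.7, 5.0))   # ~8.7e4 pc
print(parallax_distance_pc(1.0))            # → 1000.0
```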

  19. Transient Go: A Mobile App for Transient Astronomy Outreach

    NASA Astrophysics Data System (ADS)

    Crichton, D.; Mahabal, A.; Djorgovski, S. G.; Drake, A.; Early, J.; Ivezic, Z.; Jacoby, S.; Kanbur, S.

    2016-12-01

    Augmented Reality (AR) is set to revolutionize human interaction with the real world as demonstrated by the phenomenal success of `Pokemon Go'. That very technology can be used to rekindle the interest in science at the school level. We are in the process of developing a prototype app based on sky maps that will use AR to introduce different classes of astronomical transients to students as they are discovered i.e. in real-time. This will involve transient streams from surveys such as the Catalina Real-time Transient Survey (CRTS) today and the Large Synoptic Survey Telescope (LSST) in the near future. The transient streams will be combined with archival and latest image cut-outs and other auxiliary data as well as historical and statistical perspectives on each of the transient types being served. Such an app could easily be adapted to work with various NASA missions and NSF projects to enrich the student experience.

  20. The Hyper Suprime-Cam software pipeline

    NASA Astrophysics Data System (ADS)

    Bosch, James; Armstrong, Robert; Bickerton, Steven; Furusawa, Hisanori; Ikeda, Hiroyuki; Koike, Michitaro; Lupton, Robert; Mineo, Sogo; Price, Paul; Takata, Tadafumi; Tanaka, Masayuki; Yasuda, Naoki; AlSayyad, Yusra; Becker, Andrew C.; Coulton, William; Coupon, Jean; Garmilla, Jose; Huang, Song; Krughoff, K. Simon; Lang, Dustin; Leauthaud, Alexie; Lim, Kian-Tat; Lust, Nate B.; MacArthur, Lauren A.; Mandelbaum, Rachel; Miyatake, Hironao; Miyazaki, Satoshi; Murata, Ryoma; More, Surhud; Okura, Yuki; Owen, Russell; Swinbank, John D.; Strauss, Michael A.; Yamada, Yoshihiko; Yamanoi, Hitomi

    2018-01-01

    In this paper, we describe the optical imaging data processing pipeline developed for the Subaru Telescope's Hyper Suprime-Cam (HSC) instrument. The HSC Pipeline builds on the prototype pipeline being developed by the Large Synoptic Survey Telescope's Data Management system, adding customizations for HSC, large-scale processing capabilities, and novel algorithms that have since been reincorporated into the LSST codebase. While designed primarily to reduce HSC Subaru Strategic Program (SSP) data, it is also the recommended pipeline for reducing general-observer HSC data. The HSC pipeline includes high-level processing steps that generate coadded images and science-ready catalogs as well as low-level detrending and image characterizations.

  1. Properties of tree rings in LSST sensors

    DOE PAGES

    Park, H. Y.; Nomerotski, A.; Tsybychev, D.

    2017-05-30

    Images of uniformly illuminated sensors for the Large Synoptic Survey Telescope have circular periodic patterns with an appearance similar to tree rings. These patterns are caused by circularly symmetric variations of the dopant concentration in the monocrystal silicon boule induced by the manufacturing process. Non-uniform charge density results in a parasitic electric field inside the silicon sensor, which may distort the shapes of astronomical sources. Here, we analyzed data from fifteen LSST sensors produced by ITL to determine the main parameters of the tree rings: amplitude and period, as well as variability across the sensors tested at Brookhaven National Laboratory. The tree ring pattern has a weak dependence on wavelength: the ring amplitude gets smaller as the wavelength gets longer, since longer wavelengths penetrate deeper into the silicon. The tree ring amplitude grows toward the outer part of the wafer, from 0.1 to 1.0%, indicating that the resistivity variation is larger at larger radii.

  2. Fringing in MonoCam Y4 filter images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brooks, J.; Fisher-Levine, M.; Nomerotski, A.

    Here, we study the fringing patterns observed in MonoCam, a camera with a single Large Synoptic Survey Telescope (LSST) CCD sensor. Images were taken at the U.S. Naval Observatory in Flagstaff, Arizona (NOFS) employing its 1.3 m telescope and an LSST y4 filter. Fringing occurs due to the reflection of infrared light (700 nm or longer) from the bottom surface of the CCD, which constructively or destructively interferes with the incident light to produce a net "fringe" pattern superimposed on all images taken. Emission lines from the atmosphere, dominated by hydroxyl (OH) spectra, can change in their relative intensities as the night goes on, producing different fringe patterns in the images taken. We found through several methods that the general shape of the fringe patterns remained constant, though with slight changes in the amplitude and phase of the fringes. We also found that a superposition of fringes from two monochromatic lines taken in the lab offered a reasonable description of the sky data.
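
    The superposition of monochromatic fringes described above can be sketched with a simplified two-beam interference model. The silicon index, CCD thickness, line wavelengths, and line weights below are all assumed for illustration, not fitted to the MonoCam data.

```python
import math

def fringe(thickness_um, wavelength_nm, n_si=3.6):
    """Relative intensity for one line: incident light interferes with the
    reflection from the CCD's bottom surface after a round trip through
    the silicon (two-beam model with unit visibility)."""
    path_nm = 2.0 * n_si * thickness_um * 1000.0
    return 1.0 + math.cos(2.0 * math.pi * path_nm / wavelength_nm)

def sky_fringe(thickness_um, lines):
    """Superpose fringes from several (wavelength_nm, weight) sky lines."""
    total = sum(w for _, w in lines)
    return sum(w * fringe(thickness_um, lam) for lam, w in lines) / total

# Two OH-like lines whose intensity ratio changes during the night: the net
# pattern keeps its shape (set by the thickness map) but the local amplitude
# and phase shift as the line weights change.
early = [(1000.0, 1.0), (1010.0, 0.5)]
late = [(1000.0, 0.5), (1010.0, 1.0)]
print(sky_fringe(100.0, early), sky_fringe(100.0, late))
```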

  3. scarlet: Source separation in multi-band images by Constrained Matrix Factorization

    NASA Astrophysics Data System (ADS)

    Melchior, Peter; Moolekamp, Fred; Jerdee, Maximilian; Armstrong, Robert; Sun, Ai-Lei; Bosch, James; Lupton, Robert

    2018-03-01

    SCARLET performs source separation (aka "deblending") on multi-band images. It is geared towards optical astronomy, where scenes are composed of stars and galaxies, but it is straightforward to apply it to other imaging data. Separation is achieved through a constrained matrix factorization, which models each source with a Spectral Energy Distribution (SED) and a non-parametric morphology, or multiple such components per source. The code performs forced photometry (with PSF matching if needed) using an optimal weight function given by the signal-to-noise weighted morphology across bands. The approach works well if the sources in the scene have different colors and can be further strengthened by imposing various additional constraints/priors on each source. Because of its generic utility, this package provides a stand-alone implementation that contains the core components of the source separation algorithm. However, the development of this package is part of the LSST Science Pipeline; the meas_deblender package contains a wrapper to implement the algorithms here for the LSST stack.
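
    The constrained matrix factorization at the heart of this approach can be shown in a toy form: model the multi-band scene Y as A @ S, where the columns of A are per-source SEDs and the rows of S are per-source morphologies, both nonnegative. The sketch below uses plain Lee-Seung multiplicative updates and deliberately omits scarlet's additional constraints (monotonicity, symmetry, PSF matching); all sizes and data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy scene: 2 sources, 3 bands, 25 pixels (flattened), no noise.
A_true = np.array([[1.0, 0.1],
                   [0.5, 0.5],
                   [0.1, 1.0]])              # bands x sources (SEDs)
S_true = rng.random((2, 25))                 # sources x pixels (morphologies)
Y = A_true @ S_true                          # observed multi-band image

# Nonnegative matrix factorization via Lee-Seung multiplicative updates,
# which keep A and S nonnegative while decreasing the Frobenius error.
A = rng.random((3, 2)) + 0.1
S = rng.random((2, 25)) + 0.1
for _ in range(500):
    S *= (A.T @ Y) / (A.T @ A @ S + 1e-12)
    A *= (Y @ S.T) / (A @ S @ S.T + 1e-12)

rel_err = np.linalg.norm(Y - A @ S) / np.linalg.norm(Y)
print(rel_err)  # small residual: the factorization reconstructs the scene
```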

  4. Properties of tree rings in LSST sensors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, H. Y.; Nomerotski, A.; Tsybychev, D.

    Images of uniformly illuminated sensors for the Large Synoptic Survey Telescope have circular periodic patterns with an appearance similar to tree rings. These patterns are caused by circularly symmetric variations of the dopant concentration in the monocrystal silicon boule induced by the manufacturing process. Non-uniform charge density results in a parasitic electric field inside the silicon sensor, which may distort the shapes of astronomical sources. Here, we analyzed data from fifteen LSST sensors produced by ITL to determine the main parameters of the tree rings: amplitude and period, as well as variability across the sensors tested at Brookhaven National Laboratory. The tree ring pattern has a weak dependence on wavelength: the ring amplitude gets smaller as the wavelength gets longer, since longer wavelengths penetrate deeper into the silicon. The tree ring amplitude grows toward the outer part of the wafer, from 0.1 to 1.0%, indicating that the resistivity variation is larger at larger radii.

  5. A study of astrometric distortions due to “tree rings” in CCD sensors using LSST Photon Simulator

    DOE PAGES

    Beamer, Benjamin; Nomerotski, Andrei; Tsybychev, Dmitri

    2015-05-22

    Imperfections in the production process of thick CCDs lead to circularly symmetric dopant concentration variations, which in turn produce electric fields transverse to the surface of the fully depleted CCD that displace the photogenerated charges. We use PhoSim, a Monte Carlo photon simulator, to explore and examine the likely impacts these dopant concentration variations will have on astrometric measurements in LSST. The scale and behavior of both the astrometric shifts imparted to point sources and the intensity variations in flat field images that result from these doping imperfections are similar to those previously observed in Dark Energy Camera CCDs, giving initial confirmation of PhoSim's model for these effects. In addition, organized shape distortions were observed as a result of the symmetric nature of these dopant variations: nominally round sources acquire a measurable ellipticity either aligned with or transverse to the radial direction of the dopant variation pattern.

  6. Fringing in MonoCam Y4 filter images

    DOE PAGES

    Brooks, J.; Fisher-Levine, M.; Nomerotski, A.

    2017-05-05

    Here, we study the fringing patterns observed in MonoCam, a camera with a single Large Synoptic Survey Telescope (LSST) CCD sensor. Images were taken at the U.S. Naval Observatory in Flagstaff, Arizona (NOFS) employing its 1.3 m telescope and an LSST y4 filter. Fringing occurs due to the reflection of infrared light (700 nm or longer) from the bottom surface of the CCD, which constructively or destructively interferes with the incident light to produce a net "fringe" pattern superimposed on all images taken. Emission lines from the atmosphere, dominated by hydroxyl (OH) spectra, can change in their relative intensities as the night goes on, producing different fringe patterns in the images taken. We found through several methods that the general shape of the fringe patterns remained constant, though with slight changes in the amplitude and phase of the fringes. We also found that a superposition of fringes from two monochromatic lines taken in the lab offered a reasonable description of the sky data.

  7. OCTOCAM: A Workhorse Instrument for the Gemini Telescopes During the Era of LSST

    NASA Astrophysics Data System (ADS)

    Roming, Peter; van der Horst, Alexander; OCTOCAM Team

    2018-01-01

    The decade of the 2020s is planned to be an era of large surveys and giant telescopes. A trademark of this era will be the large number of interesting objects observed daily by high-cadence surveys, such as the LSST. Because of the sheer numbers, only a very small fraction of these interesting objects will be observed with extremely large telescopes. The follow-up workhorses during this era will be the 8-meter class telescopes and corresponding instruments that are prepared to pursue these interesting objects. One such workhorse instrument is OCTOCAM, a highly efficient instrument designed to probe the time-domain window with simultaneous broad-wavelength coverage. OCTOCAM optimizes the use of Gemini for broadband imaging and spectroscopic single-target observations. The instrument is designed for high temporal resolution, broad spectral coverage, and moderate spectral resolution. OCTOCAM was selected as part of the Gemini instrumentation program in early 2017. Here we provide a description of the science cases to be addressed, the overall instrument design, and the current status.

  8. chroma: Chromatic effects for LSST weak lensing

    NASA Astrophysics Data System (ADS)

    Meyers, Joshua E.; Burchat, Patricia R.

    2018-04-01

    Chroma investigates biases originating from two chromatic effects in the atmosphere: differential chromatic refraction (DCR), and wavelength dependence of seeing. These biases arise when using the point spread function (PSF) measured with stars to estimate the shapes of galaxies with different spectral energy distributions (SEDs) than the stars.
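
    The first of the two effects, differential chromatic refraction, follows from the wavelength dependence of the refractive index of air: to first order the refraction angle is R ~ (n(lambda) - 1) tan(z). The dispersion coefficients below are a crude assumed fit chosen only to give plausible magnitudes; they are not chroma's atmosphere model.

```python
import math

def refraction_arcsec(wavelength_nm, zenith_deg):
    """First-order refraction R ~ (n - 1) tan(z), with a Cauchy-style
    dispersion law for air; the coefficients are assumed, illustrative values."""
    n_minus_1 = 2.88e-4 * (1.0 + 5.67e3 / wavelength_nm ** 2)
    return math.degrees(n_minus_1 * math.tan(math.radians(zenith_deg))) * 3600.0

def dcr_arcsec(blue_nm, red_nm, zenith_deg):
    """Differential chromatic refraction between two wavelengths."""
    return refraction_arcsec(blue_nm, zenith_deg) - refraction_arcsec(red_nm, zenith_deg)

# Blue light refracts more than red, so a star and a redder galaxy within
# one passband are displaced (and their PSFs elongated) by different amounts.
print(dcr_arcsec(400.0, 700.0, 45.0))  # positive; ~1.4 arcsec with these numbers
```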

  9. The Maunakea Spectroscopic Explorer: Status and System overview

    NASA Astrophysics Data System (ADS)

    Mignot, S.; Murowinski, R.; Szeto, K.; Blin, A.; Caillier, P.

    2017-12-01

    The Maunakea Spectroscopic Explorer (MSE) project explores the possibility of upgrading the existing CFHT telescope and collaboration to turn it into the most powerful spectroscopic facility available in the 2020s. Its 10-meter aperture and its 1.5 square-degree hexagonal field of view will allow both large and deep surveys, as complements to current (Gaia, eRosita, LOFAR) and future (Euclid, WFIRST, SKA, LSST) imaging surveys, and will also provide tentative targets to the TMT or the E-ELT. In perfect agreement with INSU's 2015-2020 prospective, besides being well represented in MSE's science team (23/105 members), France is also a major contributor to the Conceptual Design studies, with CRAL developing a concept for the low- and moderate-resolution spectrographs, DT INSU for the prime focus environment, and GEPI for systems engineering.

  10. Constructing Concept Schemes From Astronomical Telegrams Via Natural Language Clustering

    NASA Astrophysics Data System (ADS)

    Graham, Matthew; Zhang, M.; Djorgovski, S. G.; Donalek, C.; Drake, A. J.; Mahabal, A.

    2012-01-01

    The rapidly emerging field of time domain astronomy is one of the most exciting and vibrant new research frontiers, ranging in scientific scope from studies of the Solar System to extreme relativistic astrophysics and cosmology. It is being enabled by a new generation of large synoptic digital sky surveys - LSST, PanStarrs, CRTS - that cover large areas of sky repeatedly, looking for transient objects and phenomena. One of the biggest challenges facing these surveys is the automated classification of transient events, a process that needs machine-processible astronomical knowledge. Semantic technologies enable the formal representation of concepts and relations within a particular domain. ATELs (http://www.astronomerstelegram.org) are a commonly used means for reporting and commenting upon new astronomical observations of transient sources (supernovae, stellar outbursts, blazar flares, etc). However, they are loose and unstructured and employ scientific natural language for description: this makes automated processing of them - a necessity within the next decade with petascale data rates - a challenge. Nevertheless they represent a potentially rich corpus of information that could lead to new and valuable insights into transient phenomena. This project lies in the cutting-edge field of astrosemantics, a branch of astroinformatics, which applies semantic technologies to astronomy. The ATELs have been used to develop an appropriate concept scheme - a representation of the information they contain - for transient astronomy using hierarchical clustering of processed natural language. This allows us to automatically organize ATELs based on the vocabulary used. We conclude that we can use simple algorithms to process and extract meaning from astronomical textual data.
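
    A toy version of the clustering step (bag-of-words vectors, cosine similarity, single-linkage agglomeration) can be sketched as below. The threshold and the example "telegrams" are invented for illustration; the actual project applies proper natural-language preprocessing to the real ATEL corpus.

```python
import math
from collections import Counter

def bow(text):
    """Bag-of-words vector as a Counter of lowercase tokens."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[w] * b[w] for w in a if w in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def agglomerate(texts, threshold=0.3):
    """Single-linkage agglomerative clustering: repeatedly merge the most
    similar pair of clusters until no pair exceeds the threshold."""
    vecs = [bow(t) for t in texts]
    clusters = [[i] for i in range(len(texts))]
    merged = True
    while merged and len(clusters) > 1:
        merged, best, pair = False, threshold, None
        for x in range(len(clusters)):
            for y in range(x + 1, len(clusters)):
                sim = max(cosine(vecs[i], vecs[j])
                          for i in clusters[x] for j in clusters[y])
                if sim >= best:
                    best, pair = sim, (x, y)
        if pair:
            x, y = pair
            clusters[x] += clusters.pop(y)
            merged = True
    return clusters

telegrams = [
    "supernova discovered in spiral galaxy rising light curve",
    "type ia supernova spectrum confirms discovery in galaxy",
    "blazar flare detected gamma ray brightening",
]
print(agglomerate(telegrams))  # → [[0, 1], [2]]
```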

  11. Wood-Vasey DOE #SC0011834 Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood-Vasey, William Michael

    During the past reporting period (Year 3), this grant has provided partial support for graduate students Daniel Perrefort and Kara Ponder. They have been exploring different aspects of the technical work needed to take full advantage of the potential for cosmological inference using Type Ia supernovae (SNe Ia) with LSST.

  12. Strong Lens Time Delay Challenge. I. Experimental Design

    NASA Astrophysics Data System (ADS)

    Dobler, Gregory; Fassnacht, Christopher D.; Treu, Tommaso; Marshall, Phil; Liao, Kai; Hojjati, Alireza; Linder, Eric; Rumbaugh, Nicholas

    2015-02-01

    The time delays between point-like images in gravitational lens systems can be used to measure cosmological parameters. The number of lenses with measured time delays is growing rapidly; the upcoming Large Synoptic Survey Telescope (LSST) will monitor ~10^3 strongly lensed quasars. In an effort to assess the present capabilities of the community to accurately measure time delays, and to provide input to dedicated monitoring campaigns and future LSST cosmology feasibility studies, we have invited the community to take part in a "Time Delay Challenge" (TDC). The challenge is organized as a set of "ladders," each containing a group of simulated data sets to be analyzed blindly by participating teams. Each rung on a ladder consists of a set of realistic mock observed lensed quasar light curves, with the rungs' data sets increasing in complexity and realism. The initial challenge described here has two ladders, TDC0 and TDC1. TDC0 has a small number of data sets, and is designed to be used as a practice set by the participating teams. The (non-mandatory) deadline for completion of TDC0 was the TDC1 launch date, 2013 December 1. The TDC1 deadline was 2014 July 1. Here we give an overview of the challenge, we introduce a set of metrics that will be used to quantify the goodness of fit, efficiency, precision, and accuracy of the algorithms, and we present the results of TDC0. Thirteen teams participated in TDC0 using 47 different methods. Seven of those teams qualified for TDC1, which is described in the companion paper.
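
    The quality metrics for a submission can be sketched as averages over the analyzed systems. The definitions below follow the spirit of the challenge metrics (chi-squared goodness of fit, precision from the reported errors, accuracy as fractional bias) but are simplified, and the example numbers are invented.

```python
def tdc_metrics(estimates, truths):
    """Simplified challenge-style metrics for (delay, error) estimates
    against true delays: chi2 goodness of fit, claimed precision, and
    accuracy (mean fractional bias), each averaged over systems."""
    n = len(estimates)
    chi2 = sum(((dt - t) / err) ** 2 for (dt, err), t in zip(estimates, truths)) / n
    precision = sum(err / abs(t) for (_, err), t in zip(estimates, truths)) / n
    accuracy = sum((dt - t) / t for (dt, _), t in zip(estimates, truths)) / n
    return chi2, precision, accuracy

# Two mock systems: measured delays (days) with errors, and the true delays.
est = [(10.5, 1.0), (-20.2, 2.0)]
tru = [10.0, -20.0]
print(tdc_metrics(est, tru))  # → (0.13, 0.1, 0.03) up to float rounding
```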

  13. IAC level "O" program development

    NASA Technical Reports Server (NTRS)

    Vos, R. G.

    1982-01-01

    The current status of the IAC development activity is summarized. The listed prototype software and documentation were delivered, and details were planned for development of the level 1 operational system. The planned end product, IAC, is required to support LSST design analysis and performance evaluation, with emphasis on the coupling of the required technical disciplines. The long-term IAC effectively provides two distinct features: a specific set of analysis modules (thermal, structural, controls, antenna radiation performance, and instrument optical performance) that will function together with the IAC supporting software in an integrated and user-friendly manner; and a general framework whereby new analysis modules can readily be incorporated into IAC or be allowed to communicate with it.

  14. The Hyper Suprime-Cam software pipeline

    DOE PAGES

    Bosch, James; Armstrong, Robert; Bickerton, Steven; ...

    2017-10-12

    In this article we describe the optical imaging data processing pipeline developed for the Subaru Telescope’s Hyper Suprime-Cam (HSC) instrument. The HSC Pipeline builds on the prototype pipeline being developed by the Large Synoptic Survey Telescope’s Data Management system, adding customizations for HSC, large-scale processing capabilities, and novel algorithms that have since been reincorporated into the LSST codebase. While designed primarily to reduce HSC Subaru Strategic Program (SSP) data, it is also the recommended pipeline for reducing general-observer HSC data. The HSC pipeline includes high-level processing steps that generate coadded images and science-ready catalogs as well as low-level detrending and image characterizations.

  16. Rochester scientist discovers new comet with Dark Energy Camera (DECam) at

    Science.gov Websites

    David Cameron, a visiting scientist in Eric Mamajek's research group in the Department of

  17. Deriving photometric redshifts using fuzzy archetypes and self-organizing maps - II. Implementation

    NASA Astrophysics Data System (ADS)

    Speagle, Joshua S.; Eisenstein, Daniel J.

    2017-07-01

    With an eye towards the computational requirements of future large-scale surveys such as Euclid and the Large Synoptic Survey Telescope (LSST) that will require photometric redshifts (photo-z's) for ≳10^9 objects, we investigate a variety of ways that 'fuzzy archetypes' can be used to improve photometric redshifts and explore their respective statistical interpretations. We characterize their relative performance using an idealized LSST ugrizY and Euclid YJH mock catalogue of 10 000 objects spanning z = 0-6 at Y = 24 mag. We find most schemes are able to robustly identify redshift probability distribution functions that are multimodal and/or poorly constrained. Once these objects are flagged and removed, the results are generally in good agreement with the strict accuracy requirements necessary to meet Euclid weak lensing goals for most redshifts in the range 0.8 ≲ z ≲ 2. These results demonstrate the statistical robustness and flexibility that can be gained by combining template-fitting and machine-learning methods and provide useful insights into how astronomers can further exploit the colour-redshift relation.
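
    The flagging of multimodal or poorly constrained redshift PDFs can be illustrated with a toy check on a gridded PDF (the thresholds, the peak-counting rule, and the function name are assumptions for illustration, not the paper's method):

```python
# Toy quality flag for a gridded redshift PDF: flag it if it has more
# than one significant local peak, or if its ~68% highest-density
# credible region is wider than some tolerance. Illustrative only.

def flag_pdf(z_grid, pdf, peak_frac=0.1, max_width=0.5):
    total = sum(pdf)
    p = [v / total for v in pdf]                 # normalize to unit sum
    peak = max(p)
    # count interior local maxima above peak_frac * global peak
    n_peaks = sum(
        1 for i in range(1, len(p) - 1)
        if p[i] > p[i - 1] and p[i] >= p[i + 1] and p[i] > peak_frac * peak
    )
    # approximate the 68% highest-density region by accumulating grid
    # cells in decreasing order of probability
    order = sorted(range(len(p)), key=lambda i: -p[i])
    acc, kept = 0.0, []
    for i in order:
        acc += p[i]
        kept.append(z_grid[i])
        if acc >= 0.68:
            break
    width = max(kept) - min(kept)
    return n_peaks > 1 or width > max_width
```

    A narrow single-peaked PDF passes, while a bimodal one (e.g., a degenerate low-z/high-z solution) is flagged for removal, mirroring the selection described above.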

  18. Lower Boundary Forcing related to the Occurrence of Rain in the Tropical Western Pacific

    NASA Astrophysics Data System (ADS)

    Li, Y.; Carbone, R. E.

    2013-12-01

    Global weather and climate models have a long and somewhat tortured history with respect to simulation and prediction of tropical rainfall in the relative absence of balanced flow in the geostrophic sense. An important correlate with tropical rainfall is sea surface temperature (SST). The introduction of SST information to convective rainfall parameterization in global models has improved model climatologies of tropical oceanic rainfall. Nevertheless, large systematic errors have persisted, several of which are common to most atmospheric models. Models have evolved to the point where increased spatial resolution demands representation of the SST field at compatible temporal and spatial scales, leading to common usage of monthly SST fields at scales of 10-100 km. While large systematic errors persist, significant skill has been realized from various atmospheric and coupled ocean models, including assimilation of weekly or even daily SST fields, as tested by the European Center for Medium Range Weather Forecasting. A few investigators have explored the role of SST gradients in relation to the occurrence of precipitation. Some of this research has focused on large scale gradients, mainly associated with surface ocean-atmosphere climatology. These studies conclude that lower boundary atmospheric convergence, under some conditions, could be substantially enhanced over SST gradients, destabilizing the atmosphere, and thereby enabling moist convection. While the concept has a firm theoretical foundation, it has not gained a sizeable following far beyond the realm of western boundary currents. Li and Carbone 2012 examined the role of transient mesoscale (~ 100 km) SST gradients in the western Pacific warm pool by means of GHRSST and CMORPH rainfall data. They found that excitation of deep moist convection was strongly associated with the Laplacian of SST (LSST). 
    Specifically, -LSST is associated with rainfall onset in 75% of 10,000 events over 4 years, whereas the background ocean is symmetric about zero Laplacian. This finding is fully consistent with theory for gradients of order ~1 °C in low mean wind conditions, capable of inducing atmospheric convergence of order N × 10^-5 s^-1. We will present new findings resulting from the application of a Madden-Julian oscillation (MJO) passband filter to GHRSST/CMORPH data. It shows that the -LSST field organizes at scales of 1000-2000 km and can persist for periods of two weeks to three months. Such -LSST anomalies are in quadrature with MJO rainfall, tracking and leading the wet phase of the MJO by 10-14 days, from the Indian Ocean to the dateline. More generally, an evaluation of SST structure in rainfall production will be presented, which represents a decidedly alternative view to conventional wisdom. Li, Yanping, and R. E. Carbone, 2012: Excitation of Rainfall over the Tropical Western Pacific, J. Atmos. Sci., 69, 2983-2994.
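
    The Laplacian-of-SST diagnostic can be sketched with a standard five-point finite-difference operator on a gridded field (a toy illustration; the study's actual processing of GHRSST data is more involved):

```python
# Toy five-point discrete Laplacian of a gridded SST field.
# Grid points where the Laplacian is negative (-LSST > 0, i.e. local
# warm "bumps") are the ones the study associates with rainfall onset.

def laplacian(sst, dx=1.0):
    ny, nx = len(sst), len(sst[0])
    lap = [[0.0] * nx for _ in range(ny)]
    for j in range(1, ny - 1):          # interior points only
        for i in range(1, nx - 1):
            lap[j][i] = (sst[j][i - 1] + sst[j][i + 1]
                         + sst[j - 1][i] + sst[j + 1][i]
                         - 4.0 * sst[j][i]) / dx ** 2
    return lap
```

    For example, a single warm anomaly in an otherwise uniform field yields a strongly negative Laplacian at the anomaly itself and positive values in the surrounding ring, which is the sign structure used to compose the -LSST field above.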

  19. Supporting Shared Resource Usage for a Diverse User Community: the OSG Experience and Lessons Learned

    NASA Astrophysics Data System (ADS)

    Garzoglio, Gabriele; Levshina, Tanya; Rynge, Mats; Sehgal, Chander; Slyz, Marko

    2012-12-01

    The Open Science Grid (OSG) supports a diverse community of new and existing users in adopting and making effective use of the Distributed High Throughput Computing (DHTC) model. The LHC user community has deep local support within the experiments. For other smaller communities and individual users the OSG provides consulting and technical services through the User Support area. We describe these sometimes successful and sometimes not so successful experiences and analyze lessons learned that are helping us improve our services. The services offered include forums to enable shared learning and mutual support, tutorials and documentation for new technology, and troubleshooting of problematic or systemic failure modes. For new communities and users, we bootstrap their use of the distributed high throughput computing technologies and resources available on the OSG by following a phased approach. We first adapt the application and run a small production campaign on a subset of “friendly” sites. Only then do we move the user to run full production campaigns across the many remote sites on the OSG, adding to the community resources up to hundreds of thousands of CPU hours per day. This scaling up generates new challenges, such as nondeterministic time to job completion and diverse errors arising from the heterogeneity of configurations and environments, so some attention is needed to get good results. We cover recent experiences with image simulation for the Large Synoptic Survey Telescope (LSST), small-file large volume data movement for the Dark Energy Survey (DES), civil engineering simulation with the Network for Earthquake Engineering Simulation (NEES), and accelerator modeling with the Electron Ion Collider group at BNL. We will categorize and analyze the use cases and describe how our processes are evolving based on lessons learned.

  20. STRONG LENS TIME DELAY CHALLENGE. I. EXPERIMENTAL DESIGN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dobler, Gregory; Fassnacht, Christopher D.; Rumbaugh, Nicholas

    2015-02-01

    The time delays between point-like images in gravitational lens systems can be used to measure cosmological parameters. The number of lenses with measured time delays is growing rapidly; the upcoming Large Synoptic Survey Telescope (LSST) will monitor ~10^3 strongly lensed quasars. In an effort to assess the present capabilities of the community to accurately measure the time delays, and to provide input to dedicated monitoring campaigns and future LSST cosmology feasibility studies, we have invited the community to take part in a "Time Delay Challenge" (TDC). The challenge is organized as a set of "ladders", each containing a group of simulated data sets to be analyzed blindly by participating teams. Each rung on a ladder consists of a set of realistic mock observed lensed quasar light curves, with the rungs' data sets increasing in complexity and realism. The initial challenge described here has two ladders, TDC0 and TDC1. TDC0 has a small number of data sets, and is designed to be used as a practice set by the participating teams. The (non-mandatory) deadline for completion of TDC0 was the TDC1 launch date, 2013 December 1. The TDC1 deadline was 2014 July 1. Here we give an overview of the challenge, we introduce a set of metrics that will be used to quantify the goodness of fit, efficiency, precision, and accuracy of the algorithms, and we present the results of TDC0. Thirteen teams participated in TDC0 using 47 different methods. Seven of those teams qualified for TDC1, which is described in the companion paper.

  1. Summary of LSST systems analysis and integration task for SPS flight test articles

    NASA Astrophysics Data System (ADS)

    Greenberg, H. S.

    1981-02-01

    The structural and equipment requirements for two solar power satellite (SPS) test articles are defined. The first SPS concept uses a hexagonal frame structure to stabilize the array of primary tension cables configured to support a Mills Cross antenna containing 17,925 subarrays composed of dipole radiating elements and solid state power amplifier modules. The second test article consists of a microwave antenna and its power source, a 20 by 200 m array of solar cell blankets, both of which are supported by the solar blanket array support structure. The test article structure, a ladder, comprises two longitudinal beams (215 m long) spaced 10 m apart and interconnected by six lateral beams. The system control module structure and bridge fitting provide bending and torsional stiffness, and supplement the in-plane Vierendeel structural behavior. Mission descriptions, construction, and structure interfaces are addressed.

  2. DESCQA: Synthetic Sky Catalog Validation Framework

    NASA Astrophysics Data System (ADS)

    Mao, Yao-Yuan; Uram, Thomas D.; Zhou, Rongpu; Kovacs, Eve; Ricker, Paul M.; Kalmbach, J. Bryce; Padilla, Nelson; Lanusse, François; Zu, Ying; Tenneti, Ananth; Vikraman, Vinu; DeRose, Joseph

    2018-04-01

    The DESCQA framework provides rigorous validation protocols for assessing the quality of simulated sky catalogs in a straightforward and comprehensive way. DESCQA enables the inspection, validation, and comparison of an inhomogeneous set of synthetic catalogs via the provision of a common interface within an automated framework. An interactive web interface is also available at portal.nersc.gov/project/lsst/descqa.

  3. On the Detectability of Interstellar Objects Like 1I/'Oumuamua

    NASA Astrophysics Data System (ADS)

    Ragozzine, Darin

    2018-04-01

    Almost since Oort's 1950 hypothesis of a tenuously bound cloud of comets, planetary formation theorists have realized that the process of planet formation must have ejected very large numbers of planetesimals into interstellar space. Unfortunately, these objects are distributed over galactic volumes, while they are only likely to be detectable if they pass within a few AU of Earth, resulting in an incredibly sparse detectable population. Furthermore, hypotheses for the formation and distribution of these bodies allow for uncertainties of orders of magnitude in the expected detection rate: our analysis suggested LSST would discover 0.01-100 objects during its lifetime (Cook et al. 2016). The discovery of 1I/'Oumuamua by a survey less powerful than LSST indicates either a low probability event and/or that the properties of this population are on the more favorable end of the spectrum. We revisit the detailed detection analysis of Cook et al. 2016 in light of the detection of 1I/'Oumuamua. We use these results to better understand 1I/'Oumuamua and to update our assessment of future detections of interstellar objects. We highlight some key questions that can be answered only by additional discoveries.

  4. The e-ASTROGAM mission. Exploring the extreme Universe with gamma rays in the MeV - GeV range

    NASA Astrophysics Data System (ADS)

    De Angelis, A.; Tatischeff, V.; Tavani, M.; Oberlack, U.; Grenier, I.; Hanlon, L.; Walter, R.; Argan, A.; von Ballmoos, P.; Bulgarelli, A.; Donnarumma, I.; Hernanz, M.; Kuvvetli, I.; Pearce, M.; Zdziarski, A.; Aboudan, A.; Ajello, M.; Ambrosi, G.; Bernard, D.; Bernardini, E.; Bonvicini, V.; Brogna, A.; Branchesi, M.; Budtz-Jorgensen, C.; Bykov, A.; Campana, R.; Cardillo, M.; Coppi, P.; De Martino, D.; Diehl, R.; Doro, M.; Fioretti, V.; Funk, S.; Ghisellini, G.; Grove, E.; Hamadache, C.; Hartmann, D. H.; Hayashida, M.; Isern, J.; Kanbach, G.; Kiener, J.; Knödlseder, J.; Labanti, C.; Laurent, P.; Limousin, O.; Longo, F.; Mannheim, K.; Marisaldi, M.; Martinez, M.; Mazziotta, M. N.; McEnery, J.; Mereghetti, S.; Minervini, G.; Moiseev, A.; Morselli, A.; Nakazawa, K.; Orleanski, P.; Paredes, J. M.; Patricelli, B.; Peyré, J.; Piano, G.; Pohl, M.; Ramarijaona, H.; Rando, R.; Reichardt, I.; Roncadelli, M.; Silva, R.; Tavecchio, F.; Thompson, D. J.; Turolla, R.; Ulyanov, A.; Vacchi, A.; Wu, X.; Zoglauer, A.

    2017-10-01

    e-ASTROGAM (`enhanced ASTROGAM') is a breakthrough Observatory space mission, with a detector composed of a Silicon tracker, a calorimeter, and an anticoincidence system, dedicated to the study of the non-thermal Universe in the photon energy range from 0.3 MeV to 3 GeV; the lower energy limit can be pushed to energies as low as 150 keV, albeit with rapidly degrading angular resolution, for the tracker, and to 30 keV for calorimetric detection. The mission is based on an advanced space-proven detector technology, with unprecedented sensitivity, angular and energy resolution, combined with polarimetric capability. Thanks to its performance in the MeV-GeV domain, substantially improving on that of its predecessors, e-ASTROGAM will open a new window on the non-thermal Universe, making pioneering observations of the most powerful Galactic and extragalactic sources, elucidating the nature of their relativistic outflows and their effects on the surroundings. With a line sensitivity in the MeV energy range one to two orders of magnitude better than that of previous-generation instruments, e-ASTROGAM will determine the origin of key isotopes fundamental for the understanding of supernova explosions and the chemical evolution of our Galaxy. The mission will provide unique data of significant interest to a broad astronomical community, complementary to powerful observatories such as LIGO-Virgo-GEO600-KAGRA, SKA, ALMA, E-ELT, TMT, LSST, JWST, Athena, CTA, IceCube, KM3NeT, and the promise of eLISA.

  5. The production of ultrathin polyimide films for the solar sail program and Large Space Structures Technology (LSST): A feasibility study

    NASA Technical Reports Server (NTRS)

    Forester, R. H.

    1978-01-01

    Polyimide membranes in a thickness range from under 0.01 μm to greater than 1 μm can be produced at an estimated cost of 50 cents per sq m (plus the cost of the polymer). The polymer of interest is dissolved in a solvent which is itself soluble in water. The polymer (casting) solution is allowed to flow down an inclined ramp onto a water surface, where a pool of floating polymer develops. The solvent dissolves into the water, lowering the surface tension of the water. Consequently, the contact angle of the polymer pool is very low and the edge of the pool is very thin. The solvent dissolves from this thin region too rapidly to be replenished from the bulk of the pool, and a solid polymer film forms. Film formation is rapid and spontaneous, and the film spreads out unaided, many feet from the leading edge of the pool. The driving force for this process is the exothermic solution of the organic solvent from the polymer solution into the water.

  6. Liverpool Telescope and Liverpool Telescope 2

    NASA Astrophysics Data System (ADS)

    Copperwheat, C. M.; Steele, I. A.; Barnsley, R. M.; Bates, S. D.; Clay, N. R.; Jermak, H.; Marchant, J. M.; Mottram, C. J.; Piascik, A.; Smith, R. J.

    2016-12-01

    The Liverpool Telescope is a fully robotic optical/near-infrared telescope with a 2-metre clear aperture, located at the Observatorio del Roque de los Muchachos on the Canary Island of La Palma. The telescope is owned and operated by Liverpool John Moores University, with financial support from the UK's Science and Technology Facilities Council. The telescope began routine science operations in 2004 and is a common-user facility with time available through a variety of committees via an open, peer reviewed process. Seven simultaneously mounted instruments support a broad science programme, with a focus on transient follow-up and other time domain topics well suited to the characteristics of robotic observing. Development has also begun on a successor facility, with the working title `Liverpool Telescope 2', to capitalise on the new era of time domain astronomy which will be brought about by the next generation of survey facilities such as LSST. The fully robotic Liverpool Telescope 2 will have a 4-metre aperture and an improved response time. In this paper we provide an overview of the current status of both facilities.

  7. News and Views: LSST mirror blank; More and better maths; Free telescopes; The hurricane season is starting again Get ready: IYA2009 UK website up and running

    NASA Astrophysics Data System (ADS)

    2008-10-01

    As floods and hurricanes disrupt the lives of people round the world, a new generation of scientific tools are supporting both storm preparedness and recovery. As International Year of Astronomy 2009 approaches, the UK website is developing more features that make it easier to see what's planned for this science extravaganza.

  8. The Impact of Microlensing on the Standardisation of Strongly Lensed Type Ia Supernovae

    NASA Astrophysics Data System (ADS)

    Foxley-Marrable, Max; Collett, Thomas E.; Vernardos, Georgios; Goldstein, Daniel A.; Bacon, David

    2018-05-01

    We investigate the effect of microlensing on the standardisation of strongly lensed Type Ia supernovae (GLSNe Ia). We present predictions for the amount of scatter induced by microlensing across a range of plausible strong lens macromodels. We find that lensed images in regions of low convergence, shear and stellar density are standardisable, where the microlensing scatter is ≲ 0.15 magnitudes, comparable to the intrinsic dispersion of a typical SN Ia. These standardisable configurations correspond to asymmetric lenses with an image located far outside the Einstein radius of the lens. Symmetric and small Einstein radius lenses (≲ 0.5 arcsec) are not standardisable. We apply our model to the recently discovered GLSN Ia iPTF16geu and find that the large discrepancy between the observed flux and the macromodel predictions from More et al. (2017) cannot be explained by microlensing alone. Using the mock GLSNe Ia catalogue of Goldstein et al. (2017), we predict that ~22% of GLSNe Ia discovered by LSST will be standardisable, with a median Einstein radius of 0.9 arcseconds and a median time delay of 41 days. By breaking the mass-sheet degeneracy, the full LSST GLSNe Ia sample will be able to detect systematics in H0 at the 0.5% level.

  9. The LSSTC Data Science Fellowship Program

    NASA Astrophysics Data System (ADS)

    Miller, Adam; Walkowicz, Lucianne; LSSTC DSFP Leadership Council

    2017-01-01

    The Large Synoptic Survey Telescope Corporation (LSSTC) Data Science Fellowship Program (DSFP) is a unique professional development program for astronomy graduate students. DSFP students complete a series of six one-week-long training sessions over the course of two years. The sessions are cumulative, each building on the last, to allow an in-depth exploration of the topics covered: data science basics, statistics, image processing, machine learning, scalable software, data visualization, time-series analysis, and science communication. The first session was held in Aug 2016 at Northwestern University, with all materials and lectures publicly available via GitHub and YouTube. Each session focuses on a series of technical problems which are written in IPython notebooks. The initial class of fellows includes 16 students selected from across the globe, while an additional 14 fellows will be added to the program in year 2. Future sessions of the DSFP will be hosted by a rotating cast of LSSTC member institutions. The DSFP is designed to supplement graduate education in astronomy by teaching the essential skills necessary for dealing with big data, serving as a resource for all in the LSST era. The LSSTC DSFP is made possible by the generous support of the LSST Corporation, the Data Science Initiative (DSI) at Northwestern, and CIERA.

  10. Optical Variability and Classification of High Redshift (3.5 < z < 5.5) Quasars on SDSS Stripe 82

    NASA Astrophysics Data System (ADS)

    AlSayyad, Yusra; McGreer, Ian D.; Fan, Xiaohui; Connolly, Andrew J.; Ivezic, Zeljko; Becker, Andrew C.

    2015-01-01

    Recent studies have shown promise in combining optical colors with variability to efficiently select and estimate the redshifts of low- to mid-redshift quasars in upcoming ground-based time-domain surveys. We extend these studies to fainter and less abundant high-redshift quasars using light curves from 235 sq. deg. and 10 years of Stripe 82 imaging reprocessed with the prototype LSST data management stack. Sources are detected on the i-band co-adds (5σ: i ~ 24) but measured on the single-epoch (ugriz) images, generating complete and unbiased lightcurves for sources fainter than the single-epoch detection threshold. Using these forced photometry lightcurves, we explore optical variability characteristics of high redshift quasars and validate classification methods with particular attention to the low signal limit. In this low SNR limit, we quantify the degradation of the uncertainties and biases on variability parameters using simulated light curves. Completeness/efficiency and redshift accuracy are verified with new spectroscopic observations on the MMT and APO 3.5m. These preliminary results are part of a survey to measure the z~4 luminosity function for quasars (i < 23) on Stripe 82 and to validate purely photometric classification techniques for high redshift quasars in LSST.

  11. Prospects for Determining the Mass Distributions of Galaxy Clusters on Large Scales Using Weak Gravitational Lensing

    NASA Astrophysics Data System (ADS)

    Fong, M.; Bowyer, R.; Whitehead, A.; Lee, B.; King, L.; Applegate, D.; McCarthy, I.

    2018-05-01

    For more than two decades, the Navarro, Frenk, and White (NFW) model has stood the test of time; it has been used to describe the distribution of mass in galaxy clusters out to their outskirts. Stacked weak lensing measurements of clusters are now revealing the distribution of mass out to and beyond their virial radii, where the NFW model is no longer applicable. In this study we assess how well the parameterised Diemer & Kravtsov (DK) density profile describes the characteristic mass distribution of galaxy clusters extracted from cosmological simulations. This is determined from stacked synthetic lensing measurements of the 50 most massive clusters extracted from the Cosmo-OWLS simulations, using the Dark Matter Only run and also the run that most closely matches observations. The characteristics of the data reflect the Weighing the Giants survey and data from the future Large Synoptic Survey Telescope (LSST). In comparison with the NFW model, the DK model is favored by the stacked data, in particular for the future LSST data, where the number density of background galaxies is higher. The DK profile depends on the accretion history of clusters, which is specified in the current study. Eventually, however, subsamples of galaxy clusters with qualities indicative of disparate accretion histories could be studied.
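
    For reference, the NFW profile discussed above has the standard two-parameter form ρ(r) = ρ_s / [(r/r_s)(1 + r/r_s)^2], with an analytic enclosed mass. A minimal sketch (the DK profile's additional outer-steepening term is omitted here):

```python
import math

# Standard NFW density profile and its analytic enclosed mass:
#   rho(r) = rho_s / ((r/r_s) * (1 + r/r_s)**2)
#   M(<r)  = 4 pi rho_s r_s^3 [ln(1 + r/r_s) - (r/r_s)/(1 + r/r_s)]
# Units are left to the caller (e.g. Msun/Mpc^3 and Mpc).

def nfw_density(r, rho_s, r_s):
    x = r / r_s
    return rho_s / (x * (1.0 + x) ** 2)

def nfw_mass(r, rho_s, r_s):
    x = r / r_s
    return 4.0 * math.pi * rho_s * r_s ** 3 * (math.log(1.0 + x) - x / (1.0 + x))
```

    The ρ ∝ r^-3 outer behaviour of this form is precisely what fails beyond the virial radius, motivating the DK profile's extra steepening term in the cluster outskirts probed by the stacked lensing data.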

  12. Centroid Position as a Function of Total Counts in a Windowed CMOS Image of a Point Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wurtz, R E; Olivier, S; Riot, V

    2010-05-27

    We obtained 960,200 22-by-22-pixel windowed images of a pinhole spot using the Teledyne H2RG CMOS detector with un-cooled SIDECAR readout. We performed an analysis to determine the precision we might expect in the position error signals to a telescope's guider system. We find that, under non-optimized operating conditions, the error in the computed centroid is strongly dependent on the total counts in the point image only below a certain threshold, approximately 50,000 photo-electrons. The LSST guider camera specification currently requires a 0.04 arcsecond error at 10 Hertz. Given the performance measured here, this specification can be delivered with a single star at 14th to 18th magnitude, depending on the passband.
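
    A guider error signal of this kind is typically derived from an intensity-weighted centroid of the windowed image. A minimal sketch (not the actual LSST guider algorithm; the constant background subtraction is a simplification):

```python
# Intensity-weighted (center-of-mass) centroid over a 2D pixel window,
# after subtracting a constant background estimate. Returns (x, y) in
# pixel coordinates, with x the column index and y the row index.

def centroid(window, background=0.0):
    total = cx = cy = 0.0
    for y, row in enumerate(window):
        for x, v in enumerate(row):
            w = max(v - background, 0.0)   # clip negative residuals
            total += w
            cx += w * x
            cy += w * y
    if total == 0.0:
        raise ValueError("no flux above background")
    return cx / total, cy / total
```

    The photon-noise scatter of such a centroid shrinks roughly as 1/sqrt(total counts), consistent with the counts threshold behaviour reported above.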

  13. Primary results from the Pan-STARRS-1 Outer Solar System Key Project

    NASA Astrophysics Data System (ADS)

    Holman, Matthew J.; Chen, Ying-Tung; Lackner, Michael; Payne, Matthew John; Lin, Hsing-Wen; Fraser, Wesley Cristopher; Lacerda, Pedro; Pan-STARRS 1 Science Consortium

    2016-10-01

    We have completed a search for slow moving bodies in the data obtained by the Pan-STARRS-1 (PS1) Science Consortium from 2010 to 2014. The data set covers the full sky north of -30 degrees declination, in the PS1 g, r, i, z, y, and w (g+r+i) filters. Our novel distance-based search is effective at detecting and linking very slow moving objects with sparsely sampled observations, even if observations are widely separated in RA, Dec and time, which is relevant to the future LSST solar system searches. In particular, our search is sensitive to objects at heliocentric distances of 25-2000 AU with magnitudes brighter than approximately r=22.5, without limits on the inclination of the object. We recover hundreds of known TNOs and Centaurs and discover hundreds of new objects, measuring phase and color information for many of them. Other highlights include the discovery of a second retrograde TNO, a number of Neptune Trojans, and large numbers of distant resonant TNOs.
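
    A rule of thumb behind such distance-based searches: near opposition, the apparent motion of a very distant body is dominated by Earth's parallactic (reflex) motion, at an angular rate of roughly Earth's orbital speed divided by the observer-target distance. A sketch under the simplifying assumptions of a circular Earth orbit and a target at rest (the function name and this approximation are illustrative, not the survey's actual linking algorithm):

```python
import math

# Approximate parallactic (retrograde) motion rate, in arcsec/hour,
# for an outer solar system object at heliocentric distance d_au (AU)
# observed near opposition. Assumes a circular Earth orbit and neglects
# the target's own orbital motion, so it is only a rough guide for
# distant, slow-moving bodies.

AU_KM = 1.495978707e8      # astronomical unit in km
V_EARTH_KM_S = 29.78       # mean Earth orbital speed

def parallactic_rate_arcsec_per_hour(d_au):
    # geocentric distance at opposition is roughly (d_au - 1) AU
    rad_per_s = V_EARTH_KM_S / ((d_au - 1.0) * AU_KM)
    return rad_per_s * 3600.0 * math.degrees(1.0) * 3600.0
```

    This inverse-distance scaling (a few arcsec/hour at 25 AU, falling well below an arcsec/hour by a few hundred AU) is why linking sparsely sampled detections by assumed distance, rather than by fitted orbit, is effective for the slowest movers.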

  14. New characterization techniques for LSST sensors

    DOE PAGES

    Nomerotski, A.

    2015-06-18

    Fully depleted, thick CCDs with extended infra-red response have become the sensor of choice for modern sky surveys. The charge transport effects in the silicon and associated astrometric distortions could make mapping between the sky coordinates and sensor coordinates non-trivial, and limit the ultimate precision achievable with these sensors. Two new characterization techniques for the CCDs, which both could probe these issues, are discussed: x-ray flat fielding and imaging of pinhole arrays.

  15. The dynamics and control of large flexible space structures-V

    NASA Technical Reports Server (NTRS)

    Bainum, P. M.; Reddy, A. S. S. R.; Diarra, C. M.; Kumar, V. K.

    1982-01-01

    A general survey of the progress made in the areas of mathematical modelling of the system dynamics, structural analysis, development of control algorithms, and simulation of environmental disturbances is presented. Graph theory techniques are employed to examine the effects of inherent damping associated with LSST systems on the number and locations of the required control actuators. A mathematical model of the forces and moments induced on a flexible orbiting beam due to solar radiation pressure is developed, and typical steady-state open-loop responses are obtained for the case when rotations and vibrations are limited to occur within the orbit plane. A preliminary controls analysis based on a truncated (13-mode) finite element model of the 122-m Hoop/Column antenna indicates that a minimum of six appropriately placed actuators is required for controllability. An algorithm to evaluate the coefficients which describe coupling between the rigid rotational and flexible modes, as well as intramodal coupling, was developed; numerical evaluation based on the finite element model of the Hoop/Column system is currently in progress.

  16. Detection technique for artificially illuminated objects in the outer solar system and beyond.

    PubMed

    Loeb, Abraham; Turner, Edwin L

    2012-04-01

    Existing and planned optical telescopes and surveys can detect artificially illuminated objects, comparable in total brightness to a major terrestrial city, at the outskirts of the Solar System. Orbital parameters of Kuiper belt objects (KBOs) are routinely measured to exquisite precisions of <10^-3. Here, we propose to measure the variation of the observed flux F from such objects as a function of their changing orbital distances D. Sunlight-illuminated objects will show a logarithmic slope α ≡ d log F/d log D = -4, whereas artificially illuminated objects should exhibit α = -2. The proposed Large Synoptic Survey Telescope (LSST) and other planned surveys will provide superb data and allow measurement of α for thousands of KBOs. If objects with α = -2 are found, follow-up observations could measure their spectra to determine whether they are illuminated by artificial lighting. The search can be extended beyond the Solar System with future generations of telescopes on the ground and in space that would have the capacity to detect phase modulation due to very strong artificial illumination on the nightside of planets as they orbit their parent stars.
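
    The slope test described above amounts to a least-squares fit in log-log space: sunlight is attenuated going out and coming back (F ∝ D^-2 × D^-2 = D^-4), while an intrinsic source is attenuated only on the way to the observer (F ∝ D^-2). A minimal sketch (the function name is assumed for illustration):

```python
import math

# Fit alpha = d(log F)/d(log D) by ordinary least squares in log-log
# space. Reflected sunlight gives alpha ~ -4; intrinsic (e.g.
# artificial) illumination gives alpha ~ -2.

def log_slope(distances, fluxes):
    xs = [math.log10(d) for d in distances]
    ys = [math.log10(f) for f in fluxes]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return sxy / sxx

# A reflecting KBO sampled at several orbital distances: F proportional to D**-4
d = [30.0, 32.0, 35.0, 40.0]
print(round(log_slope(d, [x ** -4 for x in d]), 3))   # -4.0
```

    With real survey photometry the fit would of course be weighted by the flux uncertainties and corrected for phase-angle effects, but the discriminating statistic is this single slope.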

  17. Solar system science with ESA Euclid

    NASA Astrophysics Data System (ADS)

    Carry, B.

    2018-01-01

    Context. The ESA Euclid mission has been designed to map the geometry of the dark Universe. Scheduled for launch in 2020, it will conduct a six-year visible and near-infrared imaging and spectroscopic survey over 15 000 deg² down to V_AB ~ 24.5. Although the survey will avoid ecliptic latitudes below 15°, the survey pattern in repeated sequences of four broadband filters seems well-adapted to detect and characterize solar system objects (SSOs). Aims: We aim to evaluate the capability of Euclid to discover SSOs and to measure their position, apparent magnitude, and spectral energy distribution. We also investigate how the SSO orbits, morphology (activity and multiplicity), physical properties (rotation period, spin orientation, and 3D shape), and surface composition can be determined based on these measurements. Methods: We used the current census of SSOs to extrapolate the total number of SSOs that will be detectable by Euclid, that is, objects within the survey area and brighter than the limiting magnitude. For each population of SSOs, from neighboring near-Earth asteroids to distant Kuiper-belt objects (KBOs), and including comets, we compared the expected Euclid astrometry, photometry, and spectroscopy with the SSO properties to estimate how Euclid will constrain their dynamical, physical, and compositional properties. Results: With the current survey design, about 150 000 SSOs, mainly from the asteroid main belt, should be observable by Euclid. These objects will all have high inclination, in contrast to many SSO surveys that focus on the ecliptic plane. Euclid may be able to discover several 10⁴ SSOs, in particular distant KBOs at high declination. The Euclid observations will consist of a suite of four sequences of four measurements and will refine the spectral classification of SSOs by extending the spectral coverage provided by Gaia and the LSST to, for instance, 2 microns. Combined with sparse photometry such as that measured by Gaia and the LSST, the time-resolved photometry will contribute to determining the SSO rotation period, spin orientation, and 3D shape model. The sharp and stable point-spread function of Euclid will also allow us to resolve binary systems in the Kuiper belt and detect activity around Centaurs. Conclusions: The depth of the Euclid survey (V_AB ~ 24.5), its spectral coverage (0.5 to 2.0 μm), and its observation cadence have great potential for solar system research. A dedicated processing chain for SSOs is being set up within the Euclid consortium to produce astrometry catalogs, multicolor and time-resolved photometry, and spectral classification of some 10⁵ SSOs, which will be delivered as Legacy Science.

  18. Big Data Science Cafés: High School Students Experiencing Real Research with Scientists

    NASA Astrophysics Data System (ADS)

    Walker, C. E.; Pompea, S. M.

    2017-12-01

    The Education and Public Outreach group at the National Optical Astronomy Observatory has designed an outside-of-school education program to excite the interest of talented youth in future projects like the Large Synoptic Survey Telescope (LSST) and the NOAO (archival) Data Lab - their data approaches and key science projects. Originally funded by the LSST Corporation, the program cultivates talented youth to enter STEM disciplines and serves as a model to disseminate to the 40+ institutions involved in LSST. One Saturday a month during the academic year, high school students have the opportunity to interact with expert astronomers who work with large astronomical data sets in their scientific work. Students learn about killer asteroids, the birth and death of stars, colliding galaxies, the structure of the universe, gravitational waves, dark energy, dark matter, and more. The format for the Saturday science cafés has been a short presentation, discussion (plus food), a computer lab activity, and more discussion. They last about 2.5 hours and have been planned by a group of interested local high school students, an undergraduate student coordinator, the presenting astronomers, the program director and an evaluator. High school youth leaders help ensure an enjoyable and successful program for fellow students. They help their fellow students with the activities and help evaluate how well the science café went. Their remarks shape the next science café and improve the program. The experience offers youth leaders ownership of the program, opportunities to take on responsibilities and learn leadership and communication skills, as well as fostering their continued interests in STEM. The prototype Big Data Science Academy was implemented successfully in spring 2017 and engaged almost 40 teens from greater Tucson in the fundamentals of astronomy concepts and research. As with any first implementation there were bumps.
However, staff, scientists, and student leaders all stepped up to make the program a success. The project achieved many of its goals with a relatively small budget, providing value not only to the student leaders and student attendees, but to the scientists and staff as well. Staff learned what worked and what needed more fine-tuning to successfully launch and run a big data academy for teens in the years that follow.

  19. Synergistic Effects of Phase Folding and Wavelet Denoising with Applications in Light Curve Analysis

    DTIC Science & Technology

    2016-09-15

    future research. 3 II. Astrostatistics Historically, astronomy has been a data-driven science. Larger and more precise data sets have led to the ... forthcoming Large Synoptic Survey Telescope (LSST), the human-centric approach to astronomy is becoming strained [13, 24, 25, 63]. More than ever ... process. One use of the filtering process is to remove artifacts from the data set. In the context of time domain astronomy, an artifact is an error in

  20. Variable classification in the LSST era: exploring a model for quasi-periodic light curves

    NASA Astrophysics Data System (ADS)

    Zinn, J. C.; Kochanek, C. S.; Kozłowski, S.; Udalski, A.; Szymański, M. K.; Soszyński, I.; Wyrzykowski, Ł.; Ulaczyk, K.; Poleski, R.; Pietrukowicz, P.; Skowron, J.; Mróz, P.; Pawlak, M.

    2017-06-01

    The Large Synoptic Survey Telescope (LSST) is expected to yield ~10⁷ light curves over the course of its mission, which will require a concerted effort in automated classification. Stochastic processes provide one means of quantitatively describing variability, with the potential advantage over simple light-curve statistics that the parameters may be physically meaningful. Here, we survey a large sample of periodic, quasi-periodic and stochastic Optical Gravitational Lensing Experiment-III variables using the damped random walk (DRW; CARMA(1,0)) and quasi-periodic oscillation (QPO; CARMA(2,1)) stochastic process models. The QPO model is described by an amplitude, a period and a coherence time-scale, while the DRW has only an amplitude and a time-scale. We find that the periodic and quasi-periodic stellar variables are generally better described by a QPO than a DRW, while quasars are better described by the DRW model. There are ambiguities in interpreting the QPO coherence time due to non-sinusoidal light-curve shapes, signal-to-noise ratio, error mischaracterizations and cadence. Higher order implementations of the QPO model that better capture light-curve shapes are necessary for the coherence time to have its implied physical meaning. Independent of physical meaning, the extra parameter of the QPO model successfully distinguishes most of the classes of periodic and quasi-periodic variables we consider.
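For intuition, the DRW (CARMA(1,0)) model above is the Ornstein-Uhlenbeck process, which can be sampled exactly on an irregular time grid. The sketch below uses a standard parametrization (asymptotic amplitude σ, damping time-scale τ) and a synthetic cadence; it is not the OGLE data or the paper's fitting code:

```python
# Exact simulation of a damped random walk (OU process) at arbitrary times.
import numpy as np

def simulate_drw(t, tau, sigma, mean=0.0, seed=1):
    """Sample a stationary OU process at sorted observation times t."""
    rng = np.random.default_rng(seed)
    x = np.empty(t.size)
    x[0] = mean + sigma * rng.standard_normal()        # draw from the stationary state
    for i in range(1, t.size):
        r = np.exp(-(t[i] - t[i - 1]) / tau)           # correlation across the gap
        x[i] = mean + r * (x[i - 1] - mean) \
             + sigma * np.sqrt(1 - r**2) * rng.standard_normal()
    return x

# Irregular survey-like cadence over ten years (days), then one light curve.
t = np.sort(np.random.default_rng(2).uniform(0, 3650, 500))
lc = simulate_drw(t, tau=200.0, sigma=0.2)
print(lc.std())   # sample scatter, of order sigma
```

Because the update is the exact conditional distribution of the OU process, gaps of any length are handled without sub-stepping, which is what makes the DRW attractive for irregularly sampled survey light curves.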

  1. Transient survey rates for orphan afterglows from compact merger jets

    NASA Astrophysics Data System (ADS)

    Lamb, Gavin P.; Tanaka, Masaomi; Kobayashi, Shiho

    2018-06-01

    Orphan afterglows from short γ-ray bursts (GRBs) are potential candidates for electromagnetic (EM) counterpart searches to gravitational wave (GW) detected neutron star or neutron star-black hole mergers. Various jet dynamical and structure models have been proposed that can be tested by the detection of a large sample of GW-EM counterparts. We make predictions for the expected rate of optical transients from these jet models for future survey telescopes, without a GW or GRB trigger. A sample of merger jets is generated in the redshift limits 0 ≤ z ≤ 3.0, and the expected peak r-band flux and time-scale above the Large Synoptic Survey Telescope (LSST) or Zwicky Transient Facility (ZTF) detection threshold, m_r = 24.5 and 20.4, respectively, is calculated. General all-sky rates are shown for m_r ≤ 26.0 and m_r ≤ 21.0. The detected orphan and GRB afterglow rate depends on the jet model, typically 16 ≲ R ≲ 76 yr⁻¹ for the LSST and 2 ≲ R ≲ 8 yr⁻¹ for ZTF. An excess in the rate of orphan afterglows for a survey to a depth of m_r ≤ 26 would indicate that merger jets have a dominant low-Lorentz-factor population, or that the jets exhibit intrinsic structure. Careful filtering of transients is required to successfully identify orphan afterglows from either short- or long-GRB progenitors.

  2. Advances in Telescope and Detector Technologies - Impacts on the Study and Understanding of Binary Star and Exoplanet Systems

    NASA Astrophysics Data System (ADS)

    Guinan, Edward F.; Engle, Scott; Devinney, Edward J.

    2012-04-01

    Current and planned telescope systems (both on the ground and in space) as well as new technologies will be discussed with emphasis on their impact on the studies of binary star and exoplanet systems. Although no telescopes or space missions are primarily designed to study binary stars (what a pity!), several are available (or will be shortly) to study exoplanet systems. Nonetheless those telescopes and instruments can also be powerful tools for studying binary and variable stars. For example, early microlensing missions (mid-1990s) such as EROS, MACHO and OGLE were initially designed for probing dark matter in the halos of galaxies but, serendipitously, these programs turned out to be a bonanza for the studies of eclipsing binaries and variable stars in the Magellanic Clouds and in the Galactic Bulge. A more recent example of this kind of serendipity is the Kepler Mission. Although Kepler was designed to discover exoplanet transits (and so far has been very successful, returning many planetary candidates), Kepler is turning out to be a ``stealth'' stellar astrophysics mission, returning fundamentally important new information on eclipsing binaries and variable stars and, in particular, providing a treasure trove of data on all types of pulsating stars suitable for detailed asteroseismology studies. With this in mind, current and planned telescopes and networks, new instruments and techniques (including interferometers) are discussed that can play important roles in our understanding of both binary star and exoplanet systems. Recent advances are discussed in detectors (e.g. laser frequency comb spectrographs), telescope networks (both small and large - e.g. SuperWASP, HATNet, RoboNet, the Las Cumbres Observatory Global Telescope (LCOGT) Network), wide-field (panoramic) telescope systems (e.g. the Large Synoptic Survey Telescope (LSST) and Pan-STARRS), huge telescopes (e.g. the Thirty Meter Telescope (TMT), the Overwhelmingly Large Telescope (OWL) and the Extremely Large Telescope (ELT)), and space missions such as the James Webb Space Telescope (JWST), the possible NASA Explorer Transiting Exoplanet Survey Satellite (TESS - recently approved for further study) and Gaia (due for launch during 2013). Also highlighted are advances in interferometers (both on the ground and in space) and the imaging now possible at sub-millimeter wavelengths from the Expanded Very Large Array (EVLA) and the Atacama Large Millimeter Array (ALMA). High-precision Doppler spectroscopy, for example with HARPS, HIRES and, more recently, the Carnegie Planet Finder Spectrograph, is currently returning RVs typically better than ~2 m/s for some brighter exoplanet systems. But soon it should be possible to measure Doppler shifts as small as ~10 cm/s - sufficiently sensitive for detecting Earth-size planets. Also briefly discussed is the impact these instruments will have on the study of eclipsing binaries, along with future possibilities of utilizing methods from the emerging field of Astroinformatics, including the Virtual Observatory (VO) and the possibilities of analyzing these huge datasets using Neural Network (NN) and Artificial Intelligence (AI) technologies.

  3. Machine-assisted discovery of relationships in astronomy

    NASA Astrophysics Data System (ADS)

    Graham, Matthew J.; Djorgovski, S. G.; Mahabal, Ashish A.; Donalek, Ciro; Drake, Andrew J.

    2013-05-01

    High-volume feature-rich data sets are becoming the bread-and-butter of 21st century astronomy but present significant challenges to scientific discovery. In particular, identifying scientifically significant relationships between sets of parameters is non-trivial. Similar problems in biological and geosciences have led to the development of systems which can explore large parameter spaces and identify potentially interesting sets of associations. In this paper, we describe the application of automated discovery systems of relationships to astronomical data sets, focusing on an evolutionary programming technique and an information-theory technique. We demonstrate their use with classical astronomical relationships - the Hertzsprung-Russell diagram and the Fundamental Plane of elliptical galaxies. We also show how they work with the issue of binary classification which is relevant to the next generation of large synoptic sky surveys, such as the Large Synoptic Survey Telescope (LSST). We find that comparable results to more familiar techniques, such as decision trees, are achievable. Finally, we consider the reality of the relationships discovered and how this can be used for feature selection and extraction.

  4. The Complete Calibration of the Color–Redshift Relation (C3R2) Survey: Survey Overview and Data Release 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Masters, Daniel C.; Stern, Daniel K.; Rhodes, Jason D.

    A key goal of the Stage IV dark energy experiments Euclid, LSST, and WFIRST is to measure the growth of structure with cosmic time from weak lensing analysis over large regions of the sky. Weak lensing cosmology will be challenging: in addition to highly accurate galaxy shape measurements, statistically robust and accurate photometric redshift (photo-z) estimates for billions of faint galaxies will be needed in order to reconstruct the three-dimensional matter distribution. Here we present an overview of and initial results from the Complete Calibration of the Color–Redshift Relation (C3R2) survey, which is designed specifically to calibrate the empirical galaxy color–redshift relation to the Euclid depth. These redshifts will also be important for the calibrations of LSST and WFIRST. The C3R2 survey is obtaining multiplexed observations with Keck (DEIMOS, LRIS, and MOSFIRE), the Gran Telescopio Canarias (GTC; OSIRIS), and the Very Large Telescope (VLT; FORS2 and KMOS) of a targeted sample of galaxies that are most important for the redshift calibration. We focus spectroscopic efforts on undersampled regions of galaxy color space identified in previous work in order to minimize the number of spectroscopic redshifts needed to map the color–redshift relation to the required accuracy. We present the C3R2 survey strategy and initial results, including the 1283 high-confidence redshifts obtained in the 2016A semester and released as Data Release 1.

  5. The Zooniverse

    NASA Astrophysics Data System (ADS)

    Borne, K. D.; Fortson, L.; Gay, P.; Lintott, C.; Raddick, M. J.; Wallin, J.

    2009-12-01

    The remarkable success of Galaxy Zoo as a citizen science project for galaxy classification within a terascale astronomy data collection has led to the development of a broader collaboration, known as the Zooniverse. Activities will include astronomy, lunar science, solar science, and digital humanities. Some features of our program include development of a unified framework for citizen science projects, development of a common set of user-based research tools, engagement of the machine learning community to apply machine learning algorithms on the rich training data provided by citizen scientists, and extension across multiple research disciplines. The Zooniverse collaboration is just getting started, but already we are implementing a scientifically deep follow-on to Galaxy Zoo. This project, tentatively named Galaxy Merger Zoo, will engage users in running numerical simulations, whose input parameter space is voluminous and therefore demands a clever solution, such as allowing the citizen scientists to select their own sets of parameters, which then trigger new simulations of colliding galaxies. The user interface design has many of the engaging features that retain users, including rapid feedback, visually appealing graphics, and the sense of playing a competitive game for the benefit of science. We will discuss these topics. In addition, we will also describe applications of Citizen Science that are being considered for the petascale science project LSST (Large Synoptic Survey Telescope). LSST will produce a scientific data system that consists of a massive image archive (nearly 100 petabytes) and a similarly massive scientific parameter database (20-40 petabytes). Applications of Citizen Science for such an enormous data collection will enable greater scientific return in at least two ways. 
First, citizen scientists work with real data and perform authentic research tasks of value to the advancement of the science, providing "human computation" capabilities and resources to review, annotate, and explore aspects of the data that are too overwhelming for the science team. Second, citizen scientists' inputs (in the form of rich training data and class labels) can be used to improve the classifiers that the project team uses to classify and prioritize new events detected in the petascale data stream. This talk will review these topics and provide an update on the Zooniverse project.

  6. Time-Resolved Surveys of Stellar Clusters

    NASA Astrophysics Data System (ADS)

    Eyer, Laurent; Eggenberger, Patrick; Greco, Claudia; Saesen, Sophie; Anderson, Richard I.; Mowlavi, Nami

    We describe the information that can be gained when a survey is carried out at multiple epochs, and its particular impact on open cluster research. We first explain the irreplaceable information that multi-epoch observations provide in astrometry, photometry and spectroscopy. Then we give three examples of results on open clusters from multi-epoch surveys, namely the distance to the Pleiades, the angular momentum evolution of low-mass stars, and asteroseismology. Finally we mention several very large surveys, ongoing or planned for the future: Gaia, JASMINE, LSST, and VVV.

  7. On-line Machine Learning and Event Detection in Petascale Data Streams

    NASA Astrophysics Data System (ADS)

    Thompson, David R.; Wagstaff, K. L.

    2012-01-01

    Traditional statistical data mining involves off-line analysis in which all data are available and equally accessible. However, petascale datasets have challenged this premise, since it is often impossible to store, let alone analyze, the relevant observations. This has led the machine learning community to investigate adaptive processing chains in which data mining is a continuous process. Here pattern recognition permits triage and follow-up decisions at multiple stages of a processing pipeline. Such techniques can also benefit new astronomical instruments such as the Large Synoptic Survey Telescope (LSST) and Square Kilometre Array (SKA) that will generate petascale data volumes. We summarize some machine learning perspectives on real-time data mining, with representative cases of astronomical applications and event detection in high-volume data streams. The first is a "supervised classification" approach currently used for transient event detection at the Very Long Baseline Array (VLBA). It injects known signals of interest - faint single-pulse anomalies - and tunes system parameters to recover these events. This permits meaningful event detection for diverse instrument configurations and observing conditions whose noise cannot be well-characterized in advance. Second, "semi-supervised novelty detection" finds novel events based on statistical deviations from previous patterns. It detects outlier signals of interest while considering known examples of false alarm interference. Applied to data from the Parkes pulsar survey, the approach identifies anomalous "peryton" phenomena that do not match previous event models. Finally, we consider online light curve classification that can trigger adaptive follow-up measurements of candidate events. Classifier performance analyses suggest optimal survey strategies and permit principled follow-up decisions from incomplete data.
These examples trace a broad range of algorithm possibilities available for online astronomical data mining. This talk describes research performed at the Jet Propulsion Laboratory, California Institute of Technology. Copyright 2012, All Rights Reserved. U.S. Government support acknowledged.
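The "statistical deviation from previous patterns" idea behind novelty detection can be caricatured with a streaming detector that maintains running moments (Welford's algorithm) and flags samples many standard deviations from everything seen so far. The class name, warm-up length and threshold below are illustrative choices, not taken from the VLBA or Parkes pipelines:

```python
# Online novelty detection on a data stream via running mean/variance.
class OnlineNoveltyDetector:
    def __init__(self, threshold=5.0):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0
        self.threshold = threshold

    def update(self, x):
        """Return True if x is novel relative to the history, then absorb it."""
        novel = False
        if self.n >= 10:                                  # warm-up before flagging
            std = (self.m2 / (self.n - 1)) ** 0.5
            novel = abs(x - self.mean) > self.threshold * std
        # Welford update: absorb x into the running moments.
        self.n += 1
        d = x - self.mean
        self.mean += d / self.n
        self.m2 += d * (x - self.mean)
        return novel

det = OnlineNoveltyDetector()
stream = [0.1, -0.2, 0.05, 0.0, 0.15, -0.1, 0.02, -0.05, 0.08, -0.12,
          0.03, 9.0, 0.01]                                # one burst hidden in noise
flags = [det.update(x) for x in stream]
print([i for i, f in enumerate(flags) if f])              # flags the burst at index 11
```

Because the moments are updated in O(1) per sample, the same pattern scales to petascale streams; a production system would add the false-alarm modeling described in the abstract.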

  8. Variability Analysis: Detection and Classification

    NASA Astrophysics Data System (ADS)

    Eyer, L.

    2005-01-01

    The Gaia mission will offer an exceptional opportunity to perform variability studies. The data homogeneity, its optimised photometric systems, composed of 11 medium and 4-5 broad bands, the high photometric precision in the G band of one milli-mag for V = 13-15, the radial velocity measurements and the exquisite astrometric precision for one billion stars will permit a detailed description of variable objects like stars, quasars and asteroids. However, the time sampling and the total number of measurements change from one object to another because of the satellite scanning law. The data analysis is a challenge because of the huge amount of data, the complexity of the observed objects and the peculiarities of the satellite, and needs thorough preparation. Experience can be gained by the study of past and present survey analyses and results, and Gaia should be put in perspective with the future large-scale surveys, like Pan-STARRS or LSST. We present the activities of the Variable Star Working Group and a general plan to digest this unprecedented data set, focusing here on the photometry.

  9. Precision matrix expansion - efficient use of numerical simulations in estimating errors on cosmological parameters

    NASA Astrophysics Data System (ADS)

    Friedrich, Oliver; Eifler, Tim

    2018-01-01

    Computing the inverse covariance matrix (or precision matrix) of large data vectors is crucial in weak lensing (and multiprobe) analyses of the large-scale structure of the Universe. Analytically computed covariances are noise-free and hence straightforward to invert; however, the model approximations might be insufficient for the statistical precision of future cosmological data. Estimating covariances from numerical simulations improves on these approximations, but the sample covariance estimator is inherently noisy, which introduces uncertainties in the error bars on cosmological parameters and also additional scatter in their best-fitting values. For future surveys, reducing both effects to an acceptable level requires an unfeasibly large number of simulations. In this paper we describe a way to expand the precision matrix around a covariance model and show how to estimate the leading order terms of this expansion from simulations. This is especially powerful if the covariance matrix is the sum of two contributions, C = A + B, where A is well understood analytically and can be turned off in simulations (e.g. shape noise for cosmic shear) to yield a direct estimate of B. We test our method in mock experiments resembling tomographic weak lensing data vectors from the Dark Energy Survey (DES) and the Large Synoptic Survey Telescope (LSST). For DES we find that 400 N-body simulations are sufficient to achieve negligible statistical uncertainties on parameter constraints. For LSST this is achieved with 2400 simulations. The standard covariance estimator would require >10⁵ simulations to reach a similar precision. We extend our analysis to a DES multiprobe case, finding a similar performance.
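The algebra underlying such an expansion is the Neumann series for C⁻¹ = (A + B)⁻¹: when A⁻¹B is small, C⁻¹ ≈ A⁻¹ − A⁻¹BA⁻¹ + A⁻¹BA⁻¹BA⁻¹. A toy numerical check of this behaviour (synthetic matrices, not the paper's DES/LSST covariances; in practice B would itself be a noisy simulation estimate):

```python
# Precision-matrix expansion around an analytic part A, with C = A + B.
import numpy as np

rng = np.random.default_rng(3)
n = 20
A = np.diag(np.linspace(1.0, 2.0, n))          # stands in for an analytic term (e.g. shape noise)
M = rng.normal(0, 0.02, (n, n))
B = M @ M.T                                    # small positive-definite correction
C = A + B

Ainv = np.linalg.inv(A)
order1 = Ainv - Ainv @ B @ Ainv                # first-order expansion
order2 = order1 + Ainv @ B @ Ainv @ B @ Ainv   # second-order expansion

err = lambda P: np.abs(P - np.linalg.inv(C)).max()
print(err(Ainv), err(order1), err(order2))     # errors shrink with each order
```

Each added order suppresses the residual by another factor of ‖A⁻¹B‖, which is why only the leading terms need to be estimated from simulations when B is a small correction.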

  10. PERIODOGRAMS FOR MULTIBAND ASTRONOMICAL TIME SERIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    VanderPlas, Jacob T.; Ivezic, Željko

    This paper introduces the multiband periodogram, a general extension of the well-known Lomb–Scargle approach for detecting periodic signals in time-domain data. In addition to advantages of the Lomb–Scargle method such as treatment of non-uniform sampling and heteroscedastic errors, the multiband periodogram significantly improves period finding for randomly sampled multiband light curves (e.g., Pan-STARRS, DES, and LSST). The light curves in each band are modeled as arbitrary truncated Fourier series, with the period and phase shared across all bands. The key aspect is the use of Tikhonov regularization, which drives most of the variability into the so-called base model common to all bands, while fits for individual bands describe residuals relative to the base model and typically require lower-order Fourier series. This decrease in the effective model complexity is the main reason for improved performance. After a pedagogical development of the formalism of least-squares spectral analysis, which motivates the essential features of the multiband model, we use simulated light curves and randomly subsampled SDSS Stripe 82 data to demonstrate the superiority of this method compared to other methods from the literature, and find that this method will be able to efficiently determine the correct period in the majority of LSST’s bright RR Lyrae stars with as little as six months of LSST data, a vast improvement over the years of data reported to be required by previous studies. A Python implementation of this method, along with code to fully reproduce the results reported here, is available on GitHub.
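A stripped-down version of the shared-period idea fits, at each trial frequency, per-band constant offsets plus one sinusoidal base model common to all bands, scoring the χ² reduction. This omits the truncated Fourier series and the Tikhonov regularization that are central to the published method (whose reference implementation is the authors' code on GitHub), so treat it only as an illustrative sketch on synthetic data:

```python
# Minimal shared-period multiband periodogram via least squares.
import numpy as np

def multiband_power(t, y, band, freqs):
    bands = np.unique(band)
    power = np.empty(freqs.size)
    for j, f in enumerate(freqs):
        w = 2 * np.pi * f
        X = np.column_stack(
            [(band == b).astype(float) for b in bands]   # per-band offsets
            + [np.sin(w * t), np.cos(w * t)]             # shared base model
        )
        resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
        power[j] = 1 - (resid**2).sum() / ((y - y.mean())**2).sum()
    return power

rng = np.random.default_rng(4)
t = np.sort(rng.uniform(0, 180, 300))          # random cadence over 180 days
band = rng.integers(0, 3, t.size)              # three bands, one per visit
true_period = 0.6
y = band * 0.5 + 0.3 * np.sin(2 * np.pi * t / true_period) + rng.normal(0, 0.05, t.size)

freqs = np.linspace(0.5, 3.0, 4000)
best = 1 / freqs[np.argmax(multiband_power(t, y, band, freqs))]
print(best)   # close to the injected 0.6 d period
```

Because the period and phase are shared, every band contributes to the same peak even though each visit observes only one band; this pooling is the essence of the multiband gain.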

  11. The Santiago-Harvard-Edinburgh-Durham void comparison - I. SHEDding light on chameleon gravity tests

    NASA Astrophysics Data System (ADS)

    Cautun, Marius; Paillas, Enrique; Cai, Yan-Chuan; Bose, Sownak; Armijo, Joaquin; Li, Baojiu; Padilla, Nelson

    2018-05-01

    We present a systematic comparison of several existing and new void-finding algorithms, focusing on their potential power to test a particular class of modified gravity models - chameleon f(R) gravity. These models deviate from standard general relativity (GR) more strongly in low-density regions, and thus voids are a promising venue to test them. We use halo occupation distribution (HOD) prescriptions to populate haloes with galaxies, and tune the HOD parameters such that the galaxy two-point correlation functions are the same in both f(R) and GR models. We identify both three-dimensional (3D) voids and two-dimensional (2D) underdensities in the plane of the sky, and find the same void abundance and void galaxy number density profiles across all models, which suggests that they do not contain much information beyond galaxy clustering. However, the underlying void dark matter density profiles are significantly different, with f(R) voids being more underdense than GR ones, which leads to f(R) voids having a larger tangential shear signal than their GR analogues. We investigate the potential of each void finder to test f(R) models with near-future lensing surveys such as EUCLID and LSST. The 2D voids have the largest power to probe f(R) gravity: an LSST analysis of tunnel lensing (tunnels are a new type of 2D underdensity introduced here) would distinguish f(R) models with parameters |f_R0| = 10⁻⁵ and 10⁻⁶ from GR at 80σ and 11σ (statistical error), respectively.

  12. ATLAS: Big Data in a Small Package

    NASA Astrophysics Data System (ADS)

    Denneau, Larry; Tonry, John

    2015-08-01

    For even small telescope projects, the petabyte scale is now upon us. The Asteroid Terrestrial-impact Last Alert System (ATLAS; Tonry 2011) will robotically survey the entire visible sky from Hawaii multiple times per night to search for near-Earth asteroids (NEAs) on impact trajectories. While the ATLAS optical system is modest by modern astronomical standards -- two 0.5 m F/2.0 telescopes -- each year the ATLAS system will obtain ~10³ measurements of 10⁹ astronomical sources to a photometric accuracy of <5%. This ever-growing dataset must be searched in real time for moving objects, then archived for further analysis, with alerts for newly discovered NEAs disseminated within tens of minutes of detection. ATLAS's all-sky coverage ensures it will discover many ``rifle shot'' near-misses moving rapidly on the sky as they shoot past the Earth, so the system will need software to automatically detect highly trailed sources and discriminate them from the thousands of satellites and pieces of space junk that ATLAS will see each night. Additional interrogation will identify interesting phenomena from beyond the solar system occurring over millions of transient sources per night. The data processing and storage requirements for ATLAS demand a ``big data'' approach typical of commercial Internet enterprises. We describe our approach to deploying a nimble, scalable and reliable data processing infrastructure, and promote ATLAS as a stepping stone to eventual processing scales in the era of LSST.

  13. Detection Technique for Artificially Illuminated Objects in the Outer Solar System and Beyond

    PubMed Central

    Loeb, Abraham

    2012-01-01

    Abstract Existing and planned optical telescopes and surveys can detect artificially illuminated objects, comparable in total brightness to a major terrestrial city, at the outskirts of the Solar System. Orbital parameters of Kuiper belt objects (KBOs) are routinely measured to exquisite precisions of <10⁻³. Here, we propose to measure the variation of the observed flux F from such objects as a function of their changing orbital distance D. Sunlight-illuminated objects will show a logarithmic slope α ≡ d log F/d log D = −4, whereas artificially illuminated objects should exhibit α = −2. The proposed Large Synoptic Survey Telescope (LSST) and other planned surveys will provide superb data and allow measurement of α for thousands of KBOs. If objects with α = −2 are found, follow-up observations could measure their spectra to determine whether they are illuminated by artificial lighting. The search can be extended beyond the Solar System with future generations of telescopes on the ground and in space that would have the capacity to detect phase modulation due to very strong artificial illumination on the nightside of planets as they orbit their parent stars. Key Words: Astrobiology—SETI—Kuiper belt objects—Artificial illumination. Astrobiology 12, 290–294. PMID:22490065

  14. The Palomar Transient Factory: High Quality Realtime Data Processing in a Cost-Constrained Environment

    NASA Astrophysics Data System (ADS)

    Surace, J.; Laher, R.; Masci, F.; Grillmair, C.; Helou, G.

    2015-09-01

    The Palomar Transient Factory (PTF) is a synoptic sky survey in operation since 2009. PTF utilizes a 7.1 square degree camera on the Palomar 48-inch Schmidt telescope to survey the sky, primarily at a single wavelength (R-band), at a rate of 1000-3000 square degrees a night. The data are used to detect and study transient and moving objects such as gamma-ray bursts, supernovae and asteroids, as well as variable phenomena such as quasars and Galactic stars. The data processing system at IPAC handles real-time processing and detection of transients, solar system object processing, high photometric precision processing and light curve generation, and long-term archiving and curation. This was developed under an extremely limited budget profile in an unusually agile development environment. Here we discuss the mechanics of this system and our overall development approach. Although a significant scientific installation in and of itself, PTF also serves as the prototype for our next-generation project, the Zwicky Transient Facility (ZTF). Beginning operations in 2017, ZTF will feature a 50 square degree camera which will enable scanning of the entire northern visible sky every night. ZTF in turn will serve as a stepping stone to the Large Synoptic Survey Telescope (LSST), a major NSF facility scheduled to begin operations in the early 2020s.

  15. The Amateurs' Love Affair with Large Datasets

    NASA Astrophysics Data System (ADS)

    Price, Aaron; Jacoby, S. H.; Henden, A.

    2006-12-01

    Amateur astronomers are professionals in other areas. They bring expertise from such varied and technical careers as computer science, mathematics, engineering, and marketing. These skills, coupled with an enthusiasm for astronomy, can be used to help manage the large data sets coming online in the next decade. We will show specific examples where teams of amateurs have been involved in mining large, online data sets and have authored and published their own papers in peer-reviewed astronomical journals. Using the proposed LSST database as an example, we will outline a framework for involving amateurs in data analysis and education with large astronomical surveys.

  16. Data Management challenges in Astronomy and Astroparticle Physics

    NASA Astrophysics Data System (ADS)

    Lamanna, Giovanni

    2015-12-01

    Astronomy and Astroparticle Physics domains are experiencing a deluge of data with the next generation of facilities prioritised in the European Strategy Forum on Research Infrastructures (ESFRI), such as SKA, CTA, KM3Net and with other world-class projects, namely LSST, EUCLID, EGO, etc. The new ASTERICS-H2020 project brings together the concerned scientific communities in Europe to work together to find common solutions to their Big Data challenges, their interoperability, and their data access. The presentation will highlight these new challenges and the work being undertaken also in cooperation with e-infrastructures in Europe.

  17. The Cosmic Evolution Through UV Spectroscopy (CETUS) Probe Mission Concept

    NASA Astrophysics Data System (ADS)

    Danchi, William; Heap, Sara; Woodruff, Robert; Hull, Anthony; Kendrick, Stephen E.; Purves, Lloyd; McCandliss, Stephan; Dodson, Kelly; Mehle, Greg; Burge, James; Valente, Martin; Rhee, Michael; Smith, Walter; Choi, Michael; Stoneking, Eric

    2018-01-01

    CETUS is a mission concept for an all-UV telescope with 3 scientific instruments: a wide-field camera, a wide-field multi-object spectrograph, and a point-source high-resolution and medium-resolution spectrograph. It is primarily intended to work with other survey telescopes in the 2020s (e.g. E-ROSITA (X-ray), LSST, Subaru, WFIRST (optical-near-IR), SKA (radio)) to solve major, outstanding problems in astrophysics. In this poster presentation, we give an overview of CETUS key science goals and a progress report on the CETUS mission and instrument design.

  18. On determination of charge transfer efficiency of thick, fully depleted CCDs with 55Fe x-rays

    DOE PAGES

    Yates, D.; Kotov, I.; Nomerotski, A.

    2017-07-01

    Charge transfer efficiency (CTE) is one of the most important CCD characteristics. Our paper examines ways to optimize the algorithms used to analyze 55Fe x-ray data on the CCDs, as well as explores new types of observables for CTE determination that can be used for testing LSST CCDs. Furthermore, the observables are modeled employing simple Monte Carlo simulations to determine how the charge diffusion in thick, fully depleted silicon affects the measurement. The data is compared to the simulations for one of the observables, integral flux of the x-ray hit.

  19. Near-Earth Object Survey Simulation Software

    NASA Astrophysics Data System (ADS)

    Naidu, Shantanu P.; Chesley, Steven R.; Farnocchia, Davide

    2017-10-01

    There is significant interest in Near-Earth objects (NEOs) because they pose an impact threat to Earth, offer valuable scientific information, and are potential targets for robotic and human exploration. The number of NEO discoveries has been rising rapidly over the last two decades, with over 1800 discovered last year, making the total number of known NEOs >16000. Pan-STARRS and the Catalina Sky Survey are currently the most prolific NEO surveys, having discovered >1600 NEOs between them in 2016. As next-generation surveys such as the Large Synoptic Survey Telescope (LSST) and the proposed Near-Earth Object Camera (NEOCam) become operational in the next decade, the discovery rate is expected to increase tremendously. Coordination between various survey telescopes will be necessary in order to optimize NEO discoveries and create a unified global NEO discovery network. We are collaborating on a community-based, open-source software project to simulate asteroid surveys to facilitate such coordination and develop strategies for improving discovery efficiency. Our effort so far has focused on development of a fast and efficient tool capable of accepting user-defined asteroid population models and telescope parameters such as a list of pointing angles and camera field-of-view, and generating an output list of detectable asteroids. The software takes advantage of the widely used and tested SPICE library and architecture developed by NASA’s Navigation and Ancillary Information Facility (Acton, 1996) for saving and retrieving asteroid trajectories and camera pointing. Orbit propagation is done using OpenOrb (Granvik et al. 2009), but future versions will allow the user to plug in a propagator of their choice. The software allows the simulation of both ground-based and space-based surveys. Performance is being tested using the Grav et al. (2011) asteroid population model and the LSST simulated survey “enigma_1189”.
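    The core geometric test inside such a simulator, stripped of SPICE kernels and orbit propagation, is an angular-separation cut of each asteroid's sky position against each exposure's footprint. A toy sketch (circular field approximation; designations, coordinates, and field radius are made up for illustration):

```python
import math

def ang_sep_deg(ra1, dec1, ra2, dec2):
    """Great-circle separation between two RA/Dec positions, in degrees."""
    r1, d1, r2, d2 = map(math.radians, (ra1, dec1, ra2, dec2))
    cos_s = (math.sin(d1) * math.sin(d2)
             + math.cos(d1) * math.cos(d2) * math.cos(r1 - r2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_s))))

def detectable(pointings, asteroids, fov_radius_deg=1.75):
    """Return names of asteroids landing inside any pointing's field of view."""
    hits = []
    for pra, pdec in pointings:
        for name, ara, adec in asteroids:
            if ang_sep_deg(pra, pdec, ara, adec) <= fov_radius_deg:
                hits.append(name)
    return hits

print(detectable([(10.0, -5.0)],
                 [("2017 AB", 10.5, -4.8), ("2017 CD", 40.0, 10.0)]))
# ['2017 AB']
```

A real tool would also apply a limiting-magnitude cut and trailing losses before declaring a detection.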

  20. Mental Fatigue Impairs Soccer-Specific Physical and Technical Performance.

    PubMed

    Smith, Mitchell R; Coutts, Aaron J; Merlini, Michele; Deprez, Dieter; Lenoir, Matthieu; Marcora, Samuele M

    2016-02-01

    To investigate the effects of mental fatigue on soccer-specific physical and technical performance. This investigation consisted of two separate studies. Study 1 assessed the soccer-specific physical performance of 12 moderately trained soccer players using the Yo-Yo Intermittent Recovery Test, Level 1 (Yo-Yo IR1). Study 2 assessed the soccer-specific technical performance of 14 experienced soccer players using the Loughborough Soccer Passing and Shooting Tests (LSPT, LSST). Each test was performed on two occasions and preceded, in a randomized, counterbalanced order, by 30 min of the Stroop task (mentally fatiguing treatment) or 30 min of reading magazines (control treatment). Subjective ratings of mental fatigue were measured before and after treatment, and mental effort and motivation were measured after treatment. Distance run, heart rate, and ratings of perceived exertion were recorded during the Yo-Yo IR1. LSPT performance time was calculated as original time plus penalty time. LSST performance was assessed using shot speed, shot accuracy, and shot sequence time. Subjective ratings of mental fatigue and effort were higher after the Stroop task in both studies (P < 0.001), whereas motivation was similar between conditions. This mental fatigue significantly reduced running distance in the Yo-Yo IR1 (P < 0.001). No difference in heart rate existed between conditions, whereas ratings of perceived exertion were significantly higher at iso-time in the mental fatigue condition (P < 0.01). LSPT original time and performance time were not different between conditions; however, penalty time significantly increased in the mental fatigue condition (P = 0.015). Mental fatigue also impaired shot speed (P = 0.024) and accuracy (P < 0.01), whereas shot sequence time was similar between conditions. Mental fatigue impairs soccer-specific running, passing, and shooting performance.

  1. STRONG LENS TIME DELAY CHALLENGE. II. RESULTS OF TDC1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liao, Kai; Treu, Tommaso; Marshall, Phil

    2015-02-10

    We present the results of the first strong lens time delay challenge. The motivation, experimental design, and entry level challenge are described in a companion paper. This paper presents the main challenge, TDC1, which consisted of analyzing thousands of simulated light curves blindly. The observational properties of the light curves cover the range in quality obtained for current targeted efforts (e.g., COSMOGRAIL) and expected from future synoptic surveys (e.g., LSST), and include simulated systematic errors. Seven teams participated in TDC1, submitting results from 78 different method variants. After describing each method, we compute and analyze basic statistics measuring accuracy (or bias) A, goodness of fit χ², precision P, and success rate f. For some methods we identify outliers as an important issue. Other methods show that outliers can be controlled via visual inspection or conservative quality control. Several methods are competitive, i.e., give |A| < 0.03, P < 0.03, and χ² < 1.5, with some of the methods already reaching sub-percent accuracy. The fraction of light curves yielding a time delay measurement is typically in the range f = 20%-40%. It depends strongly on the quality of the data: COSMOGRAIL-quality cadence and light curve lengths yield significantly higher f than does sparser sampling. Taking the results of TDC1 at face value, we estimate that LSST should provide around 400 robust time-delay measurements, each with P < 0.03 and |A| < 0.01, comparable to current lens modeling uncertainties. In terms of observing strategies, we find that A and f depend mostly on season length, while P depends mostly on cadence and campaign duration.
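    The four summary statistics quoted above (A, χ², P, f) can be computed for any set of submitted delays. A small sketch using my reading of the challenge's per-curve-averaged definitions; the paper's exact conventions may differ in detail:

```python
import numpy as np

def tdc_metrics(dt_true, dt_est, dt_err, n_total=None):
    """Accuracy A, goodness of fit chi2, precision P, and success rate f
    for measured delays dt_est +/- dt_err against true delays dt_true."""
    dt_true, dt_est, dt_err = map(np.asarray, (dt_true, dt_est, dt_err))
    chi2 = np.mean(((dt_est - dt_true) / dt_err) ** 2)  # goodness of fit
    P = np.mean(dt_err / np.abs(dt_true))               # claimed precision
    A = np.mean((dt_est - dt_true) / dt_true)           # fractional bias
    f = len(dt_true) / (n_total or len(dt_true))        # fraction measured
    return A, chi2, P, f

# 4 delays measured out of 10 curves, 2% biased, with 3% quoted errors:
dt_true = np.array([12.0, -30.0, 45.0, 8.0])
A, chi2, P, f = tdc_metrics(dt_true, dt_true * 1.02,
                            np.abs(dt_true) * 0.03, n_total=10)
print(round(A, 3), round(P, 3), f)   # 0.02 0.03 0.4
```

Under the abstract's thresholds, this hypothetical submission would count as "competitive" (|A| < 0.03, P < 0.03, χ² < 1.5) but with a low success rate.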

  2. PROFIT: Bayesian profile fitting of galaxy images

    NASA Astrophysics Data System (ADS)

    Robotham, A. S. G.; Taranu, D. S.; Tobar, R.; Moffett, A.; Driver, S. P.

    2017-04-01

    We present PROFIT, a new code for Bayesian two-dimensional photometric galaxy profile modelling. PROFIT consists of a low-level C++ library (libprofit), accessible via a command-line interface and documented API, along with high-level R (PROFIT) and PYTHON (PyProFit) interfaces (available at github.com/ICRAR/libprofit, github.com/ICRAR/ProFit, and github.com/ICRAR/pyprofit, respectively). R PROFIT is also available pre-built from CRAN; however, this version will be slightly behind the latest GitHub version. libprofit offers fast and accurate two-dimensional integration for a useful number of profiles, including Sérsic, Core-Sérsic, broken-exponential, Ferrer, Moffat, empirical King, point-source, and sky, with a simple mechanism for adding new profiles. We show detailed comparisons between libprofit and GALFIT. libprofit is both faster and more accurate than GALFIT at integrating the ubiquitous Sérsic profile for the most common values of the Sérsic index n (0.5 < n < 8). The high-level fitting code PROFIT is tested on a sample of galaxies with both SDSS and deeper KiDS imaging. We find good agreement in the fit parameters, with larger scatter in best-fitting parameters from fitting images from different sources (SDSS versus KiDS) than from using different codes (PROFIT versus GALFIT). A large suite of Monte Carlo-simulated images are used to assess prospects for automated bulge-disc decomposition with PROFIT on SDSS, KiDS, and future LSST imaging. We find that the biggest increases in fit quality come from moving from SDSS- to KiDS-quality data, with less significant gains moving from KiDS to LSST.
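    Profile fitting of this kind boils down to integrating I(r) = I_e exp(−b_n[(r/r_e)^(1/n) − 1]) over pixels. A rough, dependency-light sketch (midpoint pixel sampling, not libprofit's adaptive integrator) that renders an n = 1 image and checks its summed flux against the standard analytic total L = 2πn I_e r_e² e^(b_n) Γ(2n)/b_n^(2n):

```python
import math
import numpy as np

def b_n(n):
    # Ciotti & Bertin (1999) asymptotic approximation to the Sersic b_n
    return 2 * n - 1.0 / 3.0 + 4.0 / (405.0 * n)

def sersic(r, n, r_e, I_e=1.0):
    """Sersic surface brightness at radius r."""
    return I_e * np.exp(-b_n(n) * ((r / r_e) ** (1.0 / n) - 1.0))

def total_flux(n, r_e, I_e=1.0):
    """Analytic total flux of a Sersic profile (Graham & Driver 2005 form)."""
    bn = b_n(n)
    return (2 * math.pi * n * I_e * r_e ** 2
            * math.exp(bn) * math.gamma(2 * n) / bn ** (2 * n))

# Midpoint-rule rendering of an exponential (n = 1) profile on a pixel grid:
y, x = np.mgrid[-200:200, -200:200] + 0.5
img = sersic(np.hypot(x, y), n=1.0, r_e=10.0)
ratio = img.sum() / total_flux(1.0, 10.0)
print(round(ratio, 3))   # close to 1
```

For large n the profile becomes strongly cusped at the center and naive midpoint sampling breaks down, which is exactly why codes like libprofit and GALFIT do careful sub-pixel integration there.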

  3. Unveiling the population of orphan γ-ray bursts

    NASA Astrophysics Data System (ADS)

    Ghirlanda, G.; Salvaterra, R.; Campana, S.; Vergani, S. D.; Japelj, J.; Bernardini, M. G.; Burlon, D.; D'Avanzo, P.; Melandri, A.; Gomboc, A.; Nappo, F.; Paladini, R.; Pescalli, A.; Salafia, O. S.; Tagliaferri, G.

    2015-06-01

    Gamma-ray bursts (GRBs) are detectable in the γ-ray band if their jets are oriented toward the observer. However, for each GRB with a typical jet opening angle θjet, there should be ~2/θ²jet bursts whose emission cone is oriented elsewhere in space. These off-axis bursts can eventually be detected when, due to the deceleration of their relativistic jets, the beaming angle becomes comparable to the viewing angle. Orphan afterglows (OAs) should outnumber the current population of bursts detected in the γ-ray band even if they have not been conclusively observed so far at any frequency. We compute the expected flux of the population of orphan afterglows in the mm, optical, and X-ray bands through a population synthesis code of GRBs and the standard afterglow emission model. We estimate the detection rate of OAs with ongoing and forthcoming surveys. The average duration of OAs as transients above a given limiting flux is derived and described with analytical expressions: in general OAs should appear as daily transients in optical surveys and as monthly/yearly transients in the mm/radio band. We find that ~2 OA yr-1 could already be detected by Gaia and up to 20 OA yr-1 could be observed by the ZTF survey. A larger number, ~50 OA yr-1, should be detected by LSST in the optical band. For the X-ray band, ~26 OA yr-1 could be detected by eROSITA. For the large population of OAs detectable by LSST, the X-ray and optical follow-up of the light curve (for the brightest cases) and/or the extensive follow-up of their emission in the mm and radio bands could be the key to disentangling their GRB nature from other extragalactic transients of comparable flux density.
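    The ~2/θ²jet bookkeeping in the opening sentences is just solid-angle arithmetic: a bipolar jet of half-opening angle θ covers a fraction 2 · 2π(1 − cos θ)/4π = 1 − cos θ ≈ θ²/2 of the sky, so a random observer is on-axis with that probability. A sketch (the 5° jet is an illustrative value, not taken from the paper):

```python
import math

def off_axis_per_detected(theta_jet_rad):
    """Expected number of misaligned bursts per on-axis detection."""
    p_on_axis = 1.0 - math.cos(theta_jet_rad)  # both cones combined
    return 1.0 / p_on_axis - 1.0

theta = math.radians(5.0)                       # a typical ~5 degree jet
print(round(off_axis_per_detected(theta)))      # ~ 2/theta^2, i.e. a few hundred
```

This is the factor that makes the (so far undetected) orphan-afterglow population so much larger than the γ-ray-selected one.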

  4. Computer analysis of digital sky surveys using citizen science and manual classification

    NASA Astrophysics Data System (ADS)

    Kuminski, Evan; Shamir, Lior

    2015-01-01

    As current and future digital sky surveys such as SDSS, LSST, DES, Pan-STARRS and Gaia create increasingly massive databases containing millions of galaxies, there is a growing need to be able to efficiently analyze these data. An effective way to do this is through manual analysis, however, this may be insufficient considering the extremely vast pipelines of astronomical images generated by the present and future surveys. Some efforts have been made to use citizen science to classify galaxies by their morphology on a larger scale than individual or small groups of scientists can. While these citizen science efforts such as Zooniverse have helped obtain reasonably accurate morphological information about large numbers of galaxies, they cannot scale to provide complete analysis of billions of galaxy images that will be collected by future ventures such as LSST. Since current forms of manual classification cannot scale to the masses of data collected by digital sky surveys, it is clear that in order to keep up with the growing databases some form of automation of the data analysis will be required, and will work either independently or in combination with human analysis such as citizen science. Here we describe a computer vision method that can automatically analyze galaxy images and deduce galaxy morphology. Experiments using Galaxy Zoo 2 data show that the performance of the method increases as the degree of agreement between the citizen scientists gets higher, providing a cleaner dataset. For several morphological features, such as the spirality of the galaxy, the algorithm agreed with the citizen scientists on around 95% of the samples. However, the method failed to analyze some of the morphological features such as the number of spiral arms, and provided accuracy of just ~36%.

  5. ATLAS: Big Data in a Small Package?

    NASA Astrophysics Data System (ADS)

    Denneau, Larry

    2016-01-01

    For even small astronomy projects, the petabyte scale is now upon us. The Asteroid Terrestrial-impact Last Alert System (Tonry 2011) will survey the entire visible sky from Hawaii multiple times per night to search for near-Earth asteroids on impact trajectories. While the ATLAS optical system is modest by modern astronomical standards - two 0.5 m F/2.0 telescopes - each night the ATLAS system will measure nearly 10⁹ astronomical sources to a photometric accuracy of <5%, totaling 10¹² individual observations over its initial 3-year mission. This ever-growing dataset must be searched in real-time for moving objects and transients, then archived for further analysis, with alerts for newly discovered near-Earth asteroids (NEAs) disseminated within tens of minutes of detection. ATLAS's all-sky coverage ensures it will discover many `rifle shot' near-misses moving rapidly on the sky as they shoot past the Earth, so the system will need software to automatically detect highly-trailed sources and discriminate them from the thousands of low-Earth orbit (LEO) and geosynchronous orbit (GEO) satellites ATLAS will see each night. Additional interrogation will identify interesting phenomena from millions of transient sources per night beyond the solar system. The data processing and storage requirements for ATLAS demand a `big data' approach typical of commercial internet enterprises. We describe our experience in deploying a nimble, scalable and reliable data processing infrastructure, and suggest ATLAS as a stepping stone to the data processing capability needed as we enter the era of LSST.

  6. voevent-parse: Parse, manipulate, and generate VOEvent XML packets

    NASA Astrophysics Data System (ADS)

    Staley, Tim D.

    2014-11-01

    voevent-parse, written in Python, parses, manipulates, and generates VOEvent XML packets; it is built atop lxml.objectify. Details of transients detected by many projects, including Fermi, Swift, and the Catalina Sky Survey, are currently made available as VOEvents; VOEvent is also the standard alert format adopted by future facilities such as LSST and SKA. However, working with XML and adhering to the sometimes lengthy VOEvent schema can be a tricky process. voevent-parse provides convenience routines for common tasks, while allowing the user to utilise the full power of the lxml library when required. An earlier version of voevent-parse was part of the pysovo (ascl:1411.002) library.
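    For readers unfamiliar with the format: a VOEvent packet is plain XML, so even the standard library can pull fields out of one. A toy sketch with a hypothetical minimal packet (voevent-parse layers lxml.objectify attribute access and schema-aware helpers on top of exactly this kind of document):

```python
import xml.etree.ElementTree as ET

# A minimal, made-up VOEvent 2.0 packet (real packets carry far more metadata).
PACKET = """\
<voe:VOEvent xmlns:voe="http://www.ivoa.net/xml/VOEvent/v2.0"
             ivorn="ivo://example.org/test#1" role="test" version="2.0">
  <What>
    <Param name="mag" value="18.3" ucd="phot.mag"/>
  </What>
</voe:VOEvent>
"""

root = ET.fromstring(PACKET)
ivorn = root.attrib["ivorn"]                       # packet identifier
param = root.find("./What/Param[@name='mag']")     # child elements are unprefixed
mag = float(param.attrib["value"])
print(ivorn, mag)   # ivo://example.org/test#1 18.3
```

The value of a dedicated library shows up once packets must also validate against the VOEvent schema and round-trip without losing namespace details.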

  7. The Follow-up Crisis: Optimizing Science in an Opportunity Rich Environment

    NASA Astrophysics Data System (ADS)

    Vestrand, T.

    Rapid follow-up tasking for robotic telescopes has been dominated by a one-dimensional uncoordinated response strategy developed for gamma-ray burst studies. However, this second-grade soccer approach is increasingly showing its limitations even when there are only a few events per night. And it will certainly fail when faced with the denial-of-service attack generated by the nightly flood of new transients from massive variability surveys like LSST. We discuss approaches for optimizing the scientific return from autonomous robotic telescopes in the high event-rate limit and explore the potential of a coordinated telescope ecosystem employing heterogeneous telescopes.

  8. The Maunakea Spectroscopic Explorer

    NASA Astrophysics Data System (ADS)

    Venn, Kim; Starkenburg, Else; Martin, Nicolas; Kielty, Collin; Youakim, Kris; Arentsen, Anke

    2018-06-01

    The Maunakea Spectroscopic Explorer (MSE) is an ambitious project to transform the Canada-France-Hawaii 3.6-metre telescope into an 11.25-metre facility dedicated to wide field multi-object spectroscopy. Following a successful conceptual design review of ten subsystems and the systems-level review in January 2018, MSE is preparing to move into the Preliminary Design Phase. MSE will simultaneously deploy over 3000 fibers that feed low/medium resolution spectrometers and 1000 fibers that feed high-resolution (R~40,000) spectrometers. This design is expected to revolutionize astrophysical studies requiring large spectroscopic datasets: i.e., reconstructing the Milky Way's formation history through the chemical tagging of stars, searches for the effects of dark matter on stellar streams, determination of environmental influences on galaxy formation since cosmic noon, measuring black hole masses through repeat spectroscopy of quasars, follow-up of large samples identified in other surveys (Gaia, LSST, SKA, etc.), and more. MSE will reuse a large fraction of CFHT’s existing facilities while tripling the diameter of the telescope’s primary mirror and increasing the height of the enclosure by only 10%. I will discuss the progress to date and opportunities for partnerships.

  9. Status of mirror segment production for the Giant Magellan Telescope

    NASA Astrophysics Data System (ADS)

    Martin, H. M.; Burge, J. H.; Davis, J. M.; Kim, D. W.; Kingsley, J. S.; Law, K.; Loeff, A.; Lutz, R. D.; Merrill, C.; Strittmatter, P. A.; Tuell, M. T.; Weinberger, S. N.; West, S. C.

    2016-07-01

    The Richard F. Caris Mirror Lab at the University of Arizona is responsible for production of the eight 8.4 m segments for the primary mirror of the Giant Magellan Telescope, including one spare off-axis segment. We report on the successful casting of Segment 4, the center segment. Prior to generating the optical surface of Segment 2, we carried out a major upgrade of our 8.4 m Large Optical Generator. The upgrade includes new hardware and software to improve accuracy, safety, reliability and ease of use. We are currently carrying out an upgrade of our 8.4 m polishing machine that includes improved orbital polishing capabilities. We added and modified several components of the optical tests during the manufacture of Segment 1, and we have continued to improve the systems in preparation for Segments 2-8. We completed two projects that were prior commitments before GMT Segment 2: casting and polishing the combined primary and tertiary mirrors for the LSST, and casting and generating a 6.5 m mirror for the Tokyo Atacama Observatory.

  10. Test of Parameterized Post-Newtonian Gravity with Galaxy-scale Strong Lensing Systems

    NASA Astrophysics Data System (ADS)

    Cao, Shuo; Li, Xiaolei; Biesiada, Marek; Xu, Tengpeng; Cai, Yongzhi; Zhu, Zong-Hong

    2017-01-01

    Based on a mass-selected sample of galaxy-scale strong gravitational lenses from the SLACS, BELLS, LSD, and SL2S surveys and using a well-motivated fiducial set of lens-galaxy parameters, we tested the weak-field metric on kiloparsec scales and found a constraint on the post-Newtonian parameter γ = 0.995 (+0.037/−0.047) under the assumption of a flat ΛCDM universe with parameters taken from Planck observations. General relativity (GR) predicts exactly γ = 1. Uncertainties concerning the total mass density profile, anisotropy of the velocity dispersion, and the shape of the light profile combine to systematic uncertainties of ~25%. By applying a cosmological model-independent method to the simulated future LSST data, we found a significant degeneracy between the PPN γ parameter and the spatial curvature of the universe. Setting a prior on the cosmic curvature parameter −0.007 < Ωk < 0.006, we obtained the constraint on the PPN parameter γ = 1.000 (+0.0023/−0.0025). We conclude that strong lensing systems with measured stellar velocity dispersions may serve as another important probe to investigate the validity of GR, if the mass-dynamical structure of the lensing galaxies is accurately constrained in future lens surveys.
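    The link between γ and lensing observables comes from the PPN light-deflection formula, α = (1+γ)/2 · 4GM/(c²b), with GR at γ = 1. A schematic sketch (point lens, crude flat-space angular-diameter distances, illustrative mass and distances) showing how a shift in γ rescales the Einstein radius, which is why lens modelling constrains γ:

```python
import math

G, c, M_SUN, MPC = 6.674e-11, 2.998e8, 1.989e30, 3.086e22  # SI units

def einstein_radius_rad(M, D_l, D_s, gamma=1.0):
    """Point-lens Einstein radius with the PPN (1+gamma)/2 deflection factor."""
    D_ls = D_s - D_l   # crude flat-space stand-in for the lens-source distance
    return math.sqrt((1 + gamma) / 2 * 4 * G * M / c ** 2 * D_ls / (D_l * D_s))

M = 1e11 * M_SUN
th_gr = einstein_radius_rad(M, 500 * MPC, 1000 * MPC, gamma=1.0)
th_alt = einstein_radius_rad(M, 500 * MPC, 1000 * MPC, gamma=0.995)
print(th_alt / th_gr)   # sqrt(1.995/2), about 0.9987
```

The effect is at the 0.1% level for the γ quoted above, which is why percent-level systematics on the lens mass profile dominate the error budget.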

  11. Exploring Two Approaches for an End-to-End Scientific Analysis Workflow

    NASA Astrophysics Data System (ADS)

    Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; Paterno, Marc; Sehrish, Saba

    2015-12-01

    The scientific discovery process can be advanced by the integration of independently-developed programs run on disparate computing facilities into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components into these workflows, and to execute different components on resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC); the first one is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second one is based on a programming language, Python. In this paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.

  12. Protecting Dark Skies in Chile

    NASA Astrophysics Data System (ADS)

    Smith, R. Chris; Sanhueza, Pedro; Phillips, Mark

    2018-01-01

    Current projections indicate that Chile will host approximately 70% of the astronomical collecting area on Earth by 2030, augmenting the enormous area of ALMA with that of three next-generation optical telescopes: LSST, GMTO, and E-ELT. These cutting-edge facilities represent billions of dollars of investment in the astronomical facilities hosted in Chile. The Chilean government, Chilean astronomical community, and the international observatories in Chile have recognized that these investments are threatened by light pollution, and have formed a strong collaboration to work at managing the threats. We will provide an update on the work being done in Chile, ranging from training municipalities about new lighting regulations to exploring international recognition of the dark sky sites of Northern Chile.

  13. Machine Learning for Zwicky Transient Facility

    NASA Astrophysics Data System (ADS)

    Mahabal, Ashish; Zwicky Transient Facility, Catalina Real-Time Transient Survey

    2018-01-01

    The Zwicky Transient Facility (ZTF) will operate from 2018 to 2020, covering the accessible sky with its large 47 square degree camera. The transient detection rate is expected to be about a million per night. ZTF is thus a perfect LSST prototype. The big difference is that all of the ZTF transients can be followed up by 4- to 8-m class telescopes. Given the large numbers, using human scanners to separate the genuine transients from artifacts is out of the question. Both that first step and classifying the transients with minimal follow-up require machine learning. We describe the tools and plans to take on this task using follow-up facilities and knowledge gained from archival datasets.

  14. Agile software development in an earned value world: a survival guide

    NASA Astrophysics Data System (ADS)

    Kantor, Jeffrey; Long, Kevin; Becla, Jacek; Economou, Frossie; Gelman, Margaret; Juric, Mario; Lambert, Ron; Krughoff, Simon; Swinbank, John D.; Wu, Xiuqin

    2016-08-01

    Agile methodologies are current best practice in software development. They are favored for, among other reasons, preventing premature optimization by taking a somewhat short-term focus, and allowing frequent replans/reprioritizations of upcoming development work based on recent results and current backlog. At the same time, funding agencies prescribe earned value management accounting for large projects which, these days, inevitably include substantial software components. Earned Value approaches emphasize a more comprehensive and typically longer-range plan, and tend to characterize frequent replans and reprioritizations as indicative of problems. Here we describe the planning, execution, and reporting framework used by the LSST Data Management team that navigates these opposing tensions.

  15. Characterising CCDs with cosmic rays

    DOE PAGES

    Fisher-Levine, M.; Nomerotski, A.

    2015-08-06

    The properties of cosmic ray muons make them a useful probe for measuring the properties of thick, fully depleted CCD sensors. The known energy deposition per unit length allows measurement of the gain of the sensor's amplifiers, whilst the straightness of the tracks allows for a crude assessment of the static lateral electric fields at the sensor's edges. The small volume in which the muons deposit their energy allows measurement of the contribution to the PSF from the diffusion of charge as it drifts across the sensor. In this work we present a validation of the cosmic ray gain measurement technique by comparing with radioisotope gain measurements, and calculate the charge diffusion coefficient for prototype LSST sensors.
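    The gain measurement described above rests on one physical number: a minimum-ionizing muon liberates a roughly known charge per unit path length in silicon (~80 e−/μm as the most probable value is a commonly quoted figure). A back-of-the-envelope sketch with illustrative numbers; the actual analysis fits the distribution of charge along many track segments rather than a single track:

```python
# Most probable charge yield for a minimum-ionizing particle in silicon,
# in electrons per micron of path (approximate, assumed for illustration).
E_PER_MICRON = 80.0

def gain_e_per_adu(path_length_um, signal_adu):
    """Amplifier gain: predicted electrons along the track / measured ADU."""
    return E_PER_MICRON * path_length_um / signal_adu

# e.g. a vertical track through a 100 um thick sensor reading 1600 ADU:
print(gain_e_per_adu(100.0, 1600.0))   # 5.0 e-/ADU
```

Cross-checking this against an ⁵⁵Fe radioisotope measurement (1620 electrons per K-alpha photon) is exactly the validation the abstract reports.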

  16. Constraining neutrino masses with the integrated-Sachs-Wolfe-galaxy correlation function

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lesgourgues, Julien; Valkenburg, Wessel; Gaztanaga, Enrique

    2008-03-15

    Temperature anisotropies in the cosmic microwave background (CMB) are affected by the late integrated Sachs-Wolfe (lISW) effect caused by any time variation of the gravitational potential on linear scales. Dark energy is not the only source of lISW, since massive neutrinos induce a small decay of the potential on small scales during both matter and dark energy domination. In this work, we study the prospect of using the cross correlation between CMB and galaxy-density maps as a tool for constraining the neutrino mass. On the one hand massive neutrinos reduce the cross-correlation spectrum because free-streaming slows down structure formation; on the other hand, they enhance it through their change in the effective linear growth. We show that in the observable range of scales and redshifts, the first effect dominates, but the second one is not negligible. We carry out an error forecast analysis by fitting some mock data inspired by the Planck satellite, Dark Energy Survey (DES) and Large Synoptic Survey Telescope (LSST). The inclusion of the cross correlation data from Planck and LSST increases the sensitivity to the neutrino mass mν by 38% (and to the dark energy equation of state w by 83%) with respect to Planck alone. The correlation between Planck and DES brings a far less significant improvement. This method is not potentially as good for detecting mν as the measurement of galaxy, cluster, or cosmic shear power spectra, but since it is independent and affected by different systematics, it remains potentially interesting if the total neutrino mass is of the order of 0.2 eV; if instead it is close to the lower bound from atmospheric oscillations, mν ≈ 0.05 eV, we do not expect the ISW-galaxy correlation to be ever sensitive to mν.

  17. The Future of Astrometric Education

    NASA Astrophysics Data System (ADS)

    van Altena, W.; Stavinschi, M.

    2005-10-01

    Astrometry is poised to enter an era of unparalleled growth and relevance due to the wealth of highly accurate data expected from the SIM and GAIA space missions. Innovative ground-based telescopes, such as the LSST, are planned which will provide less precise data, but for many more stars. The potential for studies of the structure, kinematics and dynamics of our Galaxy as well as for the physical nature of stars and the cosmological distance scale is without equal in the history of astronomy. It is therefore ironic that in two years not one course in astrometry will be taught in the US, leaving all astrometric education to Europe, China and Latin America. Who will ensure the astrometric quality control for the JWST, SIM, GAIA, LSST, to say nothing about the current large ground-based facilities, such as the VLT, Gemini, Keck, NOAO, Magellan, LBT, etc.? Hipparcos and the HST were astrometric successes due only to the dedicated work of specialists in astrometry who fought to maintain the astrometric characteristics of those satellites and their data pipelines. We propose a renewal of astrometric education in the universities to prepare qualified scientists so that the scientific returns from the investment of billions of dollars in these unique facilities will be maximized. The funding agencies are providing outstanding facilities. The universities, national and international observatories and agencies should acknowledge their responsibility to hire qualified full-time astrometric scientists to teach students, and to supervise existing and planned astronomical facilities so that quality data will be obtained and analyzed. A temporary solution to this problem is proposed in the form of a series of international summer schools in Astrometry. The Michelson Science Center of the SIM project has offered to hold an astrometry summer school in 2005 to begin this process.
A one-semester syllabus is suggested as a means of meeting the needs of Astronomy by educating students in astrometric techniques that might be most valuable for careers associated with modern astrophysics.

  18. Managing Astronomy Research Data: Case Studies of Big and Small Research Projects

    NASA Astrophysics Data System (ADS)

    Sands, Ashley E.

    2015-01-01

    Astronomy data management refers to all actions taken upon data over the course of the entire research process. It includes activities involving the collection, organization, analysis, release, storage, archiving, preservation, and curation of research data. Astronomers have cultivated data management tools, infrastructures, and local practices to ensure the use and future reuse of their data. However, new sky surveys will soon amass petabytes of data requiring new data management strategies. The goal of this dissertation, to be completed in 2015, is to identify and understand data management practices and the infrastructure and expertise required to support best practices. This will benefit the astronomy community in efforts toward an integrated scholarly communication framework. This dissertation employs qualitative, social science research methods (including interviews, observations, and document analysis) to conduct case studies of data management practices, covering the entire data lifecycle, amongst three populations: Sloan Digital Sky Survey (SDSS) collaboration team members; individual and small-group users of SDSS data; and Large Synoptic Survey Telescope (LSST) collaboration team members. I have been observing the collection, release, and archiving of data by the SDSS collaboration, the data practices of individuals and small groups using SDSS data in journal articles, and the LSST collaboration's planning and building of infrastructure to produce data. Preliminary results demonstrate that current data management practices in astronomy are complex, situational, and heterogeneous. Astronomers often have different management repertoires for working on sky surveys and for their own data collections, varying their data practices as they move between projects. The multitude of practices complicates coordinated efforts to maintain data. While astronomy expertise proves critical to managing astronomy data in the short, medium, and long term, the larger astronomy data workforce encompasses a greater breadth of educational backgrounds. Results show that teams of individuals with distinct expertise are key to ensuring the long-term preservation and usability of astronomy datasets.

  19. Color Me Intrigued: The Discovery of iPTF 16fnm, an SN 2002cx-like Object

    NASA Astrophysics Data System (ADS)

    Miller, A. A.; Kasliwal, M. M.; Cao, Y.; Adams, S. M.; Goobar, A.; Knežević, S.; Laher, R. R.; Lunnan, R.; Masci, F. J.; Nugent, P. E.; Perley, D. A.; Petrushevska, T.; Quimby, R. M.; Rebbapragada, U. D.; Sollerman, J.; Taddia, F.; Kulkarni, S. R.

    2017-10-01

    Modern wide-field, optical time-domain surveys must solve a basic optimization problem: maximize the number of transient discoveries or minimize the follow-up needed for the new discoveries. Here, we describe the Color Me Intrigued experiment, the first from the intermediate Palomar Transient Factory (iPTF) to search for transients simultaneously in the g_PTF and R_PTF bands. During the course of this experiment, we discovered iPTF 16fnm, a new member of the 02cx-like subclass of Type Ia supernovae (SNe). iPTF 16fnm peaked at M_g,PTF = -15.09 ± 0.17 mag, making it the second-least-luminous known SN Ia. iPTF 16fnm exhibits all the hallmarks of the 02cx-like class: (I) low luminosity at peak, (II) low ejecta velocities, and (III) a non-nebular spectrum several months after peak. Spectroscopically, iPTF 16fnm exhibits a striking resemblance to two other low-luminosity 02cx-like SNe: SN 2007qd and SN 2010ae. iPTF 16fnm and SN 2005hk decline at nearly the same rate, despite a 3 mag difference in brightness at peak. When considering the full subclass of 02cx-like SNe, we do not find evidence for a tight correlation between peak luminosity and decline rate in either the g′ or r′ band. We measure the relative rate of 02cx-like SNe to normal SNe Ia and find N_02cx/N_Ia = 33 (+158, -25)%. We further examine the g′ - r′ evolution of 02cx-like SNe and find that their unique color evolution can be used to separate them from 91bg-like and normal SNe Ia. This selection function will be especially important in the spectroscopically incomplete Zwicky Transient Facility/Large Synoptic Survey Telescope (LSST) era. Finally, we close by recommending that LSST periodically evaluate, and possibly update, its observing cadence to maximize transient science.

  20. Liverpool telescope 2: a new robotic facility for rapid transient follow-up

    NASA Astrophysics Data System (ADS)

    Copperwheat, C. M.; Steele, I. A.; Barnsley, R. M.; Bates, S. D.; Bersier, D.; Bode, M. F.; Carter, D.; Clay, N. R.; Collins, C. A.; Darnley, M. J.; Davis, C. J.; Gutierrez, C. M.; Harman, D. J.; James, P. A.; Knapen, J. H.; Kobayashi, S.; Marchant, J. M.; Mazzali, P. A.; Mottram, C. J.; Mundell, C. G.; Newsam, A.; Oscoz, A.; Palle, E.; Piascik, A.; Rebolo, R.; Smith, R. J.

    2015-03-01

    The Liverpool Telescope is one of the world's premier facilities for time domain astronomy. The time domain landscape is set to radically change in the coming decade, with synoptic all-sky surveys such as LSST providing huge numbers of transient detections on a nightly basis; transient detections across the electromagnetic spectrum from other major facilities such as SVOM, SKA and CTA; and the era of `multi-messenger astronomy', wherein astrophysical events are detected via non-electromagnetic means, such as neutrino or gravitational wave emission. We describe here our plans for the Liverpool Telescope 2: a new robotic telescope designed to capitalise on this new era of time domain astronomy. LT2 will be a 4-metre class facility co-located with the Liverpool Telescope at the Observatorio del Roque de Los Muchachos on the Canary island of La Palma. The telescope will be designed for extremely rapid response: the aim is that the telescope will take data within 30 seconds of the receipt of a trigger from another facility. The motivation for this is twofold: firstly it will make it a world-leading facility for the study of fast fading transients and explosive phenomena discovered at early times. Secondly, it will enable large-scale programmes of low-to-intermediate resolution spectral classification of transients to be performed with great efficiency. In the target-rich environment of the LSST era, minimising acquisition overheads will be key to maximising the science gains from any follow-up programme. The telescope will have a diverse instrument suite which is simultaneously mounted for automatic changes, but it is envisaged that the primary instrument will be an intermediate resolution, optical/infrared spectrograph for scientific exploitation of transients discovered with the next generation of synoptic survey facilities. In this paper we outline the core science drivers for the telescope, and the requirements for the optical and mechanical design.

  1. Large-Scale Overlays and Trends: Visually Mining, Panning and Zooming the Observable Universe.

    PubMed

    Luciani, Timothy Basil; Cherinka, Brian; Oliphant, Daniel; Myers, Sean; Wood-Vasey, W Michael; Labrinidis, Alexandros; Marai, G Elisabeta

    2014-07-01

    We introduce a web-based computing infrastructure to assist the visual integration, mining and interactive navigation of large-scale astronomy observations. Following an analysis of the application domain, we design a client-server architecture to fetch distributed image data and to partition local data into a spatial index structure that allows prefix-matching of spatial objects. In conjunction with hardware-accelerated pixel-based overlays and an online cross-registration pipeline, this approach allows the fetching, displaying, panning and zooming of gigabit panoramas of the sky in real time. To further facilitate the integration and mining of spatial and non-spatial data, we introduce interactive trend images: compact visual representations for identifying outlier objects and for studying trends within large collections of spatial objects of a given class. In a demonstration, images from three sky surveys (SDSS, FIRST and simulated LSST results) are cross-registered and integrated as overlays, allowing cross-spectrum analysis of astronomy observations. Trend images are interactively generated from catalog data and used to visually mine astronomy observations of similar type. The front-end of the infrastructure uses the web technologies WebGL and HTML5 to enable cross-platform, web-based functionality. Our approach attains interactive rendering frame rates; its power and flexibility enables it to serve the needs of the astronomy community. Evaluation on three case studies, as well as feedback from domain experts, emphasizes the benefits of this visual approach to the observational astronomy field, and its potential benefits to large-scale geospatial visualization in general.
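    The paper does not spell out its index structure here, but the idea of a spatial index that "allows prefix-matching of spatial objects" can be sketched with a quadtree-style cell key, in which objects in the same sky cell share a key prefix. The `quad_key` function, its depth, and its bin boundaries below are illustrative assumptions, not the authors' implementation:

    ```python
    # Sketch of a quadtree-style spatial key, where a shared key prefix
    # implies spatial containment -- the property a prefix-matching index
    # exploits. (Illustrative only; not the paper's actual structure.)

    def quad_key(ra, dec, depth=8):
        """Encode an (ra, dec) position in degrees as a quadtree digit string."""
        x0, x1, y0, y1 = 0.0, 360.0, -90.0, 90.0
        key = ""
        for _ in range(depth):
            xm, ym = (x0 + x1) / 2, (y0 + y1) / 2
            quadrant = (2 if dec >= ym else 0) + (1 if ra >= xm else 0)
            key += str(quadrant)
            x0, x1 = (xm, x1) if ra >= xm else (x0, xm)
            y0, y1 = (ym, y1) if dec >= ym else (y0, ym)
        return key

    # Objects whose keys share a prefix lie in the same sky cell, so a
    # region query becomes a prefix scan over a sorted list of keys.
    a = quad_key(150.1, 2.2)
    b = quad_key(150.1001, 2.2001)   # nearby object: long shared prefix
    c = quad_key(30.0, -45.0)        # distant object: prefixes diverge early
    ```

    Sorting objects by such keys clusters spatially nearby objects together on disk, which is one common way a database can partition sky data for fast region queries.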

  2. Next Generation Search Interfaces

    NASA Astrophysics Data System (ADS)

    Roby, W.; Wu, X.; Ly, L.; Goldina, T.

    2015-09-01

    Astronomers are constantly looking for easier ways to access multiple data sets. While much effort is spent on VO, little thought is given to the types of User Interfaces we need to effectively search this sort of data. For instance, an astronomer might need to search Spitzer, WISE, and 2MASS catalogs and images then see the results presented together in one UI. Moving seamlessly between data sets is key to presenting integrated results. Results need to be viewed using first class, web based, integrated FITS viewers, XY Plots, and advanced table display tools. These components should be able to handle very large datasets. To make a powerful Web based UI that can manage and present multiple searches to the user requires taking advantage of many HTML5 features. AJAX is used to start searches and present results. Push notifications (Server-Sent Events) monitor background jobs. Canvas is required for advanced result displays. Lesser-known CSS3 technologies make it all flow seamlessly together. At IPAC, we have been developing our Firefly toolkit for several years. We are now using it to solve this multiple data set, multiple queries, and integrated presentation problem to create a powerful research experience. Firefly was created in IRSA, the NASA/IPAC Infrared Science Archive (http://irsa.ipac.caltech.edu). Firefly is the core for applications serving many project archives, including Spitzer, Planck, WISE, PTF, LSST and others. It is also used in IRSA's new Finder Chart and catalog and image displays.

  3. Using Firefly Tools to Enhance Archive Web Pages

    NASA Astrophysics Data System (ADS)

    Roby, W.; Wu, X.; Ly, L.; Goldina, T.

    2013-10-01

    Astronomy web developers are looking for fast and powerful HTML 5/AJAX tools to enhance their web archives. We are exploring ways to make this easier for the developer. How could you have a full FITS visualizer or a Web 2.0 table that supports paging, sorting, and filtering in your web page in 10 minutes? Can it be done without even installing any software or maintaining a server? Firefly is a powerful, configurable system for building web-based user interfaces to access astronomy science archives. It has been in production for the past three years. Recently, we have made some of the advanced components available through very simple JavaScript calls. This allows a web developer, without any significant knowledge of Firefly, to have FITS visualizers, advanced table display, and spectrum plots on their web pages with a minimal learning curve. Because we use cross-site JSONP, installing a server is not necessary. Web sites that use these tools can be created in minutes. Firefly was created in IRSA, the NASA/IPAC Infrared Science Archive (http://irsa.ipac.caltech.edu). We are using Firefly to serve many projects including Spitzer, Planck, WISE, PTF, LSST and others.

  4. Large aperture wide field multi-object spectroscopy for the 2020s: the science and status of the Maunakea Spectroscopic Explorer.

    NASA Astrophysics Data System (ADS)

    Devost, Daniel; McConnachie, Alan; Chambers, Kenneth; Gallagher, Sarah; Maunakea Spectroscopic Explorer Project office, MSE Science Advisory group, MSE Science Team

    2018-01-01

    Numerous international reports have recently highlighted the need for fully dedicated, large aperture, highly multiplexed spectroscopy at a range of spectral resolutions in the OIR wavelength range. Such a facility is the most obvious missing link in the emerging network of international multi-wavelength astronomy facilities, and enables science from reverberation mapping of black holes to the nucleosynthetic history of the Galaxy, and will follow up discoveries from the optical through to the radio with facilities such as LSST. The only fully dedicated large aperture MOS facility that is in the design phase is the Maunakea Spectroscopic Explorer (MSE), an 11.4m segmented mirror prime focus telescope with a 1.5 square degree field of view that has 3200 fibers at low (R~2500) and moderate (R~6000) resolution, and 1000 fibers at high (R = 20,000/40,000) resolution. I will provide an overview of MSE, describing the science drivers and the current design status, as well as the international partnership, and the results of multiple, newly completed external reviews for the system and subsystems. The anticipated cost and timeline to first light will also be presented.

  5. Generative adversarial networks recover features in astrophysical images of galaxies beyond the deconvolution limit

    NASA Astrophysics Data System (ADS)

    Schawinski, Kevin; Zhang, Ce; Zhang, Hantian; Fowler, Lucas; Santhanam, Gokula Krishnan

    2017-05-01

    Observations of astrophysical objects such as galaxies are limited by various sources of random and systematic noise from the sky background, the optical system of the telescope and the detector used to record the data. Conventional deconvolution techniques are limited in their ability to recover features in imaging data by the Shannon-Nyquist sampling theorem. Here, we train a generative adversarial network (GAN) on a sample of 4550 images of nearby galaxies at 0.01 < z < 0.02 from the Sloan Digital Sky Survey and conduct 10× cross-validation to evaluate the results. We present a method using a GAN trained on galaxy images that can recover features from artificially degraded images with worse seeing and higher noise than the original with a performance that far exceeds simple deconvolution. The ability to better recover detailed features such as galaxy morphology from low signal-to-noise and low angular resolution imaging data significantly increases our ability to study existing data sets of astrophysical objects as well as future observations with observatories such as the Large Synoptic Survey Telescope (LSST) and the Hubble and James Webb space telescopes.
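    The data-preparation step described here, artificially degrading images with worse seeing and higher noise, can be sketched as a Gaussian-PSF blur plus additive noise. The PSF width and noise level below are illustrative assumptions, not the values used in the paper:

    ```python
    import numpy as np

    # Sketch of the "artificial degradation" step: blur an image with a
    # wider Gaussian PSF (worse seeing) and add Gaussian pixel noise.
    def degrade(img, seeing_sigma=2.0, noise_sigma=0.05, rng=None):
        rng = np.random.default_rng(rng)
        ny, nx = img.shape
        y, x = np.mgrid[:ny, :nx]
        cy, cx = ny // 2, nx // 2
        psf = np.exp(-((y - cy) ** 2 + (x - cx) ** 2) / (2 * seeing_sigma ** 2))
        psf /= psf.sum()                      # normalize so flux is conserved
        # FFT convolution; ifftshift moves the kernel center to the origin.
        blurred = np.real(np.fft.ifft2(np.fft.fft2(img)
                                       * np.fft.fft2(np.fft.ifftshift(psf))))
        return blurred + rng.normal(0.0, noise_sigma, img.shape)

    galaxy = np.zeros((64, 64))
    galaxy[30:34, 30:34] = 1.0                # toy "galaxy" stand-in
    low_res = degrade(galaxy, seeing_sigma=3.0, noise_sigma=0.02, rng=0)
    ```

    Pairs of `(low_res, galaxy)` images are then what a GAN of this kind would be trained on: the network sees the degraded version and learns to reproduce the original.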

  6. Learning a Novel Detection Metric for the Detection of O’Connell Effect Eclipsing Binaries

    NASA Astrophysics Data System (ADS)

    Johnston, Kyle; Haber, Rana; Knote, Matthew; Caballero-Nieves, Saida Maria; Peter, Adrian; Petit, Véronique

    2018-01-01

    With the advent of digital astronomy, new benefits and new challenges have been presented to the modern day astronomer. No longer can the astronomer rely on manual processing; instead, the profession as a whole has begun to adopt more advanced computational means. Here we focus on the construction and application of a novel time-domain signature extraction methodology and the development of a supporting supervised pattern detection algorithm for the targeted identification of eclipsing binaries which demonstrate a feature known as the O’Connell Effect. A methodology for the reduction of stellar variable observations (time-domain data) into Distribution Fields (DF) is presented. Push-Pull metric learning, a variant of LMNN learning, is used to generate a learned distance metric for the specific detection problem proposed. The metric will be trained on a set of labelled Kepler eclipsing binary data, in particular systems showing the O’Connell effect. Performance estimates will be presented, as well as the results of the detector applied to an unlabeled Kepler EB data set; this work is a crucial step in the upcoming era of big data from the next generation of big telescopes, such as LSST.
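    As a rough illustration of the signature-extraction idea (not the authors' exact DF formulation), a folded light curve can be reduced to a fixed-size 2D occupancy grid over phase and normalized magnitude, which a learned distance metric can then compare across stars. The bin counts, normalization, and toy signal below are assumptions:

    ```python
    import numpy as np

    # Reduce a folded light curve to a fixed-size 2D histogram over
    # (phase, normalized magnitude) -- a Distribution-Field-like signature.
    def distribution_field(time, mag, period, phase_bins=16, mag_bins=16):
        phase = (time / period) % 1.0
        m = (mag - mag.min()) / (np.ptp(mag) + 1e-12)   # scale mags to [0, 1]
        H, _, _ = np.histogram2d(phase, m, bins=[phase_bins, mag_bins],
                                 range=[[0, 1], [0, 1]])
        col = H.sum(axis=1, keepdims=True)              # normalize per phase bin
        return H / np.where(col > 0, col, 1)

    rng = np.random.default_rng(1)
    t = np.sort(rng.uniform(0, 10, 500))                # irregular sampling
    mag = 12.0 + 0.3 * np.sin(2 * np.pi * t / 0.7)      # toy periodic signal
    df = distribution_field(t, mag, period=0.7)
    ```

    Because every star maps to the same fixed-size grid regardless of its sampling, plain vector distances (or a learned metric such as LMNN) can be applied directly to these signatures.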

  7. The Feasibility and Benefits of In Situ Exploration of `Oumuamua-like objects

    NASA Astrophysics Data System (ADS)

    Seligman, Darryl; Laughlin, Gregory

    2018-04-01

    A rapid accumulation of observations and interpretation has followed in the wake of 1I `Oumuamua's passage through the inner Solar System. We outline the consequences that this first detection of an interstellar asteroid implies for the planet-forming process, and we assess the near-term prospects for detecting and observing (both remotely and in situ) future Solar System visitors of this type. Drawing on detailed heat-transfer calculations that take both `Oumuamua's unusual shape and its chaotic tumbling into account, we affirm that the lack of a detectable coma in deep images of the object very likely arises from the presence of a radiation-modified coating of high molecular weight material (rather than a refractory bulk composition). Assuming that `Oumuamua is a typical representative of a larger population with a kinematic distribution similar to Population I stars in the local galactic neighborhood, we calculate expected arrival rates, impact parameters and velocities of similar objects and assess their prospects for detection using operational and forthcoming facilities. Using `Oumuamua as a proof-of-concept, we assess the prospects for missions that intercept ISOs using conventional chemical propulsion. Using a "launch on detection" paradigm, we estimate wait times of order a year between favorable mission opportunities with the detection capabilities of the Large Synoptic Survey Telescope (LSST), a figure that will be refined as the population of interstellar asteroids becomes observationally better constrained.

  8. Exploring Two Approaches for an End-to-End Scientific Analysis Workflow

    DOE PAGES

    Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; ...

    2015-12-23

    The advance of the scientific discovery process is accomplished by the integration of independently-developed programs run on disparate computing facilities into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components to these workflows, and to execute different components on resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC): the first is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second is based on a programming language, Python. In this paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.

  9. Accretion Disks Around Binary Black Holes of Unequal Mass: GRMHD Simulations Near Decoupling

    NASA Technical Reports Server (NTRS)

    Gold, Roman; Paschalidis, Vasileios; Etienne, Zachariah B.; Shapiro, Stuart L.; Pfeiffer, Harald, P.

    2013-01-01

    We report on general relativistic simulations of magnetized disks accreting onto black hole binaries. We vary the binary mass ratio from 1:1 to 1:10 and evolve the systems when they orbit near the binary disk decoupling radius. We compare (surface) density profiles, accretion rates (relative to a single, non-spinning black hole), variability, effective alpha-stress levels and luminosities as functions of the mass ratio. We treat the disks in two limiting regimes: rapid radiative cooling and no radiative cooling. The magnetic field lines clearly reveal jets emerging from both black hole horizons and merging into one common jet at large distances. The magnetic fields give rise to much stronger shock heating than the pure hydrodynamic flows, completely alter the disk structure, and boost accretion rates and luminosities. Accretion streams near the horizons are among the densest structures; in fact, the 1:10 no-cooling evolution results in a refilling of the cavity. The typical effective temperature in the bulk of the disk is ~10^5 (M/10^8 M_sun)^(-1/4) (L/L_Edd)^(1/4) K, yielding characteristic thermal frequencies ~10^15 (M/10^8 M_sun)^(-1/4) (L/L_Edd)^(1/4) (1+z)^(-1) Hz. These systems are thus promising targets for many extragalactic optical surveys, such as LSST, WFIRST, and Pan-STARRS.
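    For scale, the quoted relations can be evaluated for a fiducial system; the choice of M = 10^8 M_sun, L = 0.1 L_Edd, and z = 1 below is illustrative, not from the paper:

    ```python
    # Evaluate the scaling relations above for a fiducial binary
    # (M = 10^8 M_sun, L = 0.1 L_Edd, z = 1; illustrative numbers).
    M_over_1e8 = 1.0      # binary mass in units of 10^8 solar masses
    L_over_Ledd = 0.1     # Eddington ratio
    z = 1.0               # redshift

    T_eff = 1e5 * M_over_1e8 ** -0.25 * L_over_Ledd ** 0.25              # K
    nu_obs = 1e15 * M_over_1e8 ** -0.25 * L_over_Ledd ** 0.25 / (1 + z)  # Hz

    print(f"T_eff ~ {T_eff:.3g} K, observed thermal frequency ~ {nu_obs:.3g} Hz")
    ```

    This gives T_eff of a few times 10^4 K and an observed thermal peak in the near-UV/optical, which is why the text points to optical surveys as promising for detecting such systems.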

  10. The Dynamics of the Local Group in the Era of Precision Astrometry

    NASA Astrophysics Data System (ADS)

    Besla, Gurtina; Garavito-Camargo, Nicolas; Patel, Ekta

    2018-06-01

    Our understanding of the dynamics of our Local Group of galaxies has changed dramatically over the past few years owing to significant advancements in astrometry and our theoretical understanding of galaxy structure. New surveys now enable us to map the 3D structure of our Milky Way and the dynamics of tracers of its dark matter distribution, like globular clusters, satellite galaxies and streams, with unprecedented precision. Some results have met with controversy, challenging preconceived notions of the orbital dynamics of key components of the Local Group. I will provide an overview of this evolving picture of our Local Group and outline how we can test the cold dark matter paradigm in the era of Gaia, LSST and JWST.

  11. Supernovae and cosmology with future European facilities.

    PubMed

    Hook, I M

    2013-06-13

    Prospects for future supernova surveys are discussed, focusing on the European Space Agency's Euclid mission and the European Extremely Large Telescope (E-ELT), both expected to be in operation around the turn of the decade. Euclid is a 1.2 m space survey telescope that will operate at visible and near-infrared wavelengths, and has the potential to find and obtain multi-band lightcurves for thousands of distant supernovae. The E-ELT is a planned, general-purpose ground-based, 40-m-class optical-infrared telescope with adaptive optics built in, which will be capable of obtaining spectra of type Ia supernovae to redshifts of at least four. The contribution to supernova cosmology with these facilities will be discussed in the context of other future supernova programmes such as those proposed for DES, JWST, LSST and WFIRST.

  12. Using Deep Learning to Analyze the Voices of Stars.

    NASA Astrophysics Data System (ADS)

    Boudreaux, Thomas Macaulay

    2018-01-01

    With several new large-scale surveys on the horizon, including LSST, TESS, ZTF, and Evryscope, faster and more accurate analysis methods will be required to adequately process the enormous amount of data produced. Deep learning, used in industry for years now, allows for advanced feature detection in minimally prepared datasets at very high speeds; however, despite the advantages of this method, its application to astrophysics has not yet been extensively explored. This dearth may be due to a lack of training data available to researchers. Here we generate synthetic data loosely mimicking the properties of acoustic mode pulsating stars and compare the performance of different deep learning algorithms, including Artificial Neural Networks and Convolutional Neural Networks, in classifying these synthetic data sets as either pulsators or stars not observed to vary.
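    A minimal sketch of the kind of synthetic training data described: light curves that are either a noisy multi-mode sinusoid ("pulsator") or pure noise ("not observed to vary"). The mode counts, amplitudes, and frequency ranges are assumptions for illustration, not the authors' generator:

    ```python
    import numpy as np

    def make_light_curve(pulsator, n=1024, rng=None):
        """Return (time, flux) for a toy pulsator or a non-variable star."""
        rng = np.random.default_rng(rng)
        t = np.linspace(0.0, 10.0, n)
        flux = rng.normal(0.0, 1.0, n)                 # photometric noise
        if pulsator:
            for _ in range(rng.integers(1, 4)):        # 1-3 acoustic modes
                freq = rng.uniform(1.0, 10.0)          # cycles per unit time
                amp = rng.uniform(0.5, 2.0)
                flux += amp * np.sin(2 * np.pi * freq * t
                                     + rng.uniform(0, 2 * np.pi))
        return t, flux

    # Build a small labelled set ready to feed any classifier.
    X = np.stack([make_light_curve(i % 2 == 0, rng=i)[1] for i in range(100)])
    y = np.array([i % 2 == 0 for i in range(100)])
    ```

    Because the labels are known by construction, arbitrarily large training sets can be generated, which is the workaround for the training-data scarcity the abstract describes.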

  13. The Strong Lensing Time Delay Challenge (2014)

    NASA Astrophysics Data System (ADS)

    Liao, Kai; Dobler, G.; Fassnacht, C. D.; Treu, T.; Marshall, P. J.; Rumbaugh, N.; Linder, E.; Hojjati, A.

    2014-01-01

    Time delays between multiple images in strong lensing systems are a powerful probe of cosmology. At the moment the application of this technique is limited by the number of lensed quasars with measured time delays. However, the number of such systems is expected to increase dramatically in the next few years. Hundreds of such systems are expected within this decade, while the Large Synoptic Survey Telescope (LSST) is expected to deliver of order 1000 time delays in the 2020 decade. In order to exploit this bounty of lenses we need to make sure the time delay determination algorithms have sufficiently high precision and accuracy. As a first step to test current algorithms and identify potential areas for improvement we have started a "Time Delay Challenge" (TDC). An "evil" team has created realistic simulated light curves, to be analyzed blindly by "good" teams. The challenge is open to all interested parties. The initial challenge consists of two steps (TDC0 and TDC1). TDC0 consists of a small number of datasets to be used as a training template. The non-mandatory deadline is December 1, 2013. The "good" teams that complete TDC0 will be given access to TDC1. TDC1 consists of thousands of lightcurves, a number sufficient to test precision and accuracy at the subpercent level, necessary for time-delay cosmography. The deadline for responding to TDC1 is July 1, 2014. Submissions will be analyzed and compared in terms of predefined metrics to establish the goodness-of-fit, efficiency, precision and accuracy of current algorithms. This poster describes the challenge in detail and gives instructions for participation.
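    A toy version of the measurement task the challenge poses can be written as a cross-correlation of the two image light curves. Real TDC light curves are irregularly sampled, noisy, and contaminated by microlensing, so this uniform-grid estimator is only a sketch of the basic idea, with an invented test signal:

    ```python
    import numpy as np

    def estimate_delay(t, f1, f2):
        """Delay of f2 relative to f1 (same units as t), assuming a uniform grid."""
        a = f1 - f1.mean()
        b = f2 - f2.mean()
        corr = np.correlate(b, a, mode="full")        # peak marks best alignment
        lags = np.arange(-len(t) + 1, len(t)) * (t[1] - t[0])
        return lags[np.argmax(corr)]

    t = np.arange(0.0, 200.0, 1.0)                    # observation epochs, days
    intrinsic = np.sin(0.8 * t) + 0.5 * np.sin(0.23 * t + 1.0)
    true_delay = 12.0                                 # days
    f1 = intrinsic
    f2 = np.interp(t - true_delay, t, intrinsic)      # image 2: delayed copy
    print(estimate_delay(t, f1, f2))                  # recovers ~12 days
    ```

    The challenge metrics then compare such estimates to the (hidden) true delays over thousands of light curves to quantify each algorithm's precision and accuracy.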

  14. Design of the SLAC RCE Platform: A General Purpose ATCA Based Data Acquisition System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herbst, R.; Claus, R.; Freytag, M.

    2015-01-23

    The SLAC RCE platform is a general purpose clustered data acquisition system implemented on a custom ATCA compliant blade, called the Cluster On Board (COB). The core of the system is the Reconfigurable Cluster Element (RCE), which is a system-on-chip design based upon the Xilinx Zynq family of FPGAs, mounted on custom COB daughter-boards. The Zynq architecture couples a dual core ARM Cortex A9 based processor with a high performance 28nm FPGA. The RCE has 12 external general purpose bi-directional high speed links, each supporting serial rates of up to 12 Gbps. 8 RCE nodes are included on a COB, each with a 10 Gbps connection to an on-board 24-port Ethernet switch integrated circuit. The COB is designed to be used with a standard full-mesh ATCA backplane allowing multiple RCE nodes to be tightly interconnected with minimal interconnect latency. Multiple shelves can be clustered using the front panel 10 Gbps connections. The COB also supports local and inter-blade timing and trigger distribution. An experiment specific Rear Transition Module adapts the 96 high speed serial links to specific experiments and allows an experiment-specific timing and busy feedback connection. This coupling of processors with a high performance FPGA fabric in a low latency, multiple node cluster allows high speed data processing that can be easily adapted to any physics experiment. RTEMS and Linux are both ported to the module. The RCE has been used or is the baseline for several current and proposed experiments (LCLS, HPS, LSST, ATLAS-CSC, LBNE, DarkSide, ILC-SiD, etc.).

  15. Augmenting the Funding Sources for Space Science and the ASTRO-1 Space Telescope

    NASA Astrophysics Data System (ADS)

    Morse, Jon

    2015-08-01

    The BoldlyGo Institute was formed in 2013 to augment the planned space science portfolio through philanthropically funded robotic space missions, similar to how some U.S. medical institutes and ground-based telescopes are funded. I introduce BoldlyGo's two current projects: the SCIM mission to Mars and the ASTRO-1 space telescope. In particular, ASTRO-1 is a 1.8-meter off-axis (unobscured) ultraviolet-visible space observatory to be located at a Lagrange point or in heliocentric orbit with a wide-field panchromatic camera, medium- and high-resolution spectrograph, and high-contrast imaging coronagraph and/or an accompanying starshade/occulter. It is intended for the post-Hubble Space Telescope era in the 2020s, enabling unique measurements of a broad range of celestial targets, while providing vital complementary capabilities to other ground- and space-based facilities such as the JWST, ALMA, WFIRST-AFTA, LSST, TESS, Euclid, and PLATO. The ASTRO-1 architecture simultaneously wields great scientific power while being technically viable and affordable. A wide variety of scientific programs can be accomplished, addressing topics across space astronomy, astrophysics, fundamental physics, and solar system science, as well as being technologically informative to future large-aperture programs. ASTRO-1 is intended to be a new-generation research facility serving a broad national and international community, as well as a vessel for impactful public engagement. Traditional institutional partnerships and consortia, such as are common with private ground-based observatories, may play a role in the support and governance of ASTRO-1; we are currently engaging interested international organizations. In addition to our planned open guest observer program and accessible data archive, we intend to provide a mechanism whereby individual scientists can buy in to a fraction of the guaranteed observing time.
Our next step in ASTRO-1 development is to form the ASTRO-1 Requirements Team (ART), to which international scientists are invited to apply. The ART will be tasked with anchoring the science case, optimizing the observatory design, and constructing a design reference mission during late-2015 and 2016.

  16. Near-Field Cosmology with Resolved Stellar Populations Around Local Volume LMC Stellar-Mass Galaxies

    NASA Astrophysics Data System (ADS)

    Carlin, Jeffrey L.; Sand, David J.; Willman, Beth; Brodie, Jean P.; Crnojevic, Denija; Forbes, Duncan; Hargis, Jonathan R.; Peter, Annika; Pucha, Ragadeepika; Romanowsky, Aaron J.; Spekkens, Kristine; Strader, Jay

    2018-06-01

    We discuss our ongoing observational program to comprehensively map the entire virial volumes of galaxies of roughly LMC stellar mass at distances of ~2-4 Mpc. The MADCASH (Magellanic Analog Dwarf Companions And Stellar Halos) survey will deliver the first census of the dwarf satellite populations and stellar halo properties within LMC-like environments in the Local Volume. Our results will inform our understanding of the recent DES discoveries of dwarf satellites tentatively affiliated with the LMC/SMC system. This program has already yielded the discovery of the faintest known dwarf galaxy satellite of an LMC stellar-mass host beyond the Local Group, based on deep Subaru+Hyper Suprime-Cam imaging reaching ~2 magnitudes below its TRGB, and at least two additional candidate satellites. We will summarize the survey results and status to date, highlighting some challenges encountered and lessons learned as we process the data for this program through a prototype LSST pipeline. Our program will examine whether LMC stellar mass dwarfs have extended stellar halos, allowing us to assess the relative contributions of in-situ stars vs. merger debris to their stellar populations and halo density profiles. We outline the constraints on galaxy formation models that will be provided by our observations of low-mass galaxy halos and their satellites.

  17. The 4MOST instrument concept overview

    NASA Astrophysics Data System (ADS)

    Haynes, Roger; Barden, Samuel; de Jong, Roelof; Schnurr, Olivier; Bellido, Olga; Walcher, Jakob; Haynes, Dionne; Winkler, Roland; Bauer, Svend-Marian; Dionies, Frank; Saviauk, Allar; Chiappini, Cristina; Schwope, Axel; Brynnel, Joar; Steinmetz, Matthias; McMahon, Richard; Feltzing, Sofia; Francois, Patrick; Trager, Scott; Parry, Ian; Irwin, Mike; Walton, Nicholas; King, David; Sun, David; Gonzalez-Solares, Eduaro; Tosh, Ian; Dalton, Gavin; Middleton, Kevin; Bonifacio, Piercarlo; Jagourel, Pascal; Mignot, Shan; Cohen, Mathieu; Amans, Jean-Philippe; Royer, Frederic; Sartoretti, Paola; Pragt, Johan; Gerlofsma, Gerrit; Roelfsema, Ronald; Navarro, Ramon; Thimm, Guido; Seifert, Walter; Christlieb, Norbert; Mandel, Holger; Trifonov, Trifon; Xu, Wenli; Lang-Bardl, Florian; Muschielok, Bernard; Schlichter, Jörg; Hess, Hans-Joachim; Grupp, Frank; Boehringer, Hans; Boller, Thomas; Dwelly, Tom; Bender, Ralf; Rosati, Piero; Iwert, Olaf; Finger, Gert; Lizon L'Allemand, Jean-Louis; Saunders, Will; Sheinis, Andrew; Frost, Gabriella; Farrell, Tony; Waller, Lewis; Depagne, Eric; Laurent, Florence; Caillier, Patrick; Kosmalski, Johan; Richard, Johan; Bacon, Roland; Ansorge, Wolfgang

    2014-07-01

    The 4MOST[1] instrument is a concept for a wide-field, fibre-fed, high-multiplex spectroscopic instrument facility on the ESO VISTA telescope designed to perform a massive (initially >25×10^6 spectra in 5 years) combined all-sky public survey. The main science drivers are: Gaia follow-up of the chemo-dynamical structure of the Milky Way (stellar radial velocities, parameters and abundances, chemical tagging); eROSITA follow-up of cosmology with X-ray clusters of galaxies, X-ray AGN/galaxy evolution to z~5, Galactic X-ray sources, and resolving the Galactic edge; and Euclid/LSST/SKA and other survey follow-up of dark energy, galaxy evolution, and transients. The surveys will be undertaken simultaneously, requiring highly advanced targeting and scheduling software as well as comprehensive data reduction and analysis tools to produce high-level data products. The instrument will allow simultaneous observations of ~1600 targets at R~5,000 from 390-900nm and ~800 targets at R>18,000 in three channels between ~395-675nm (channel bandwidths: 45nm blue, 57nm green, and 69nm red) over a hexagonal field of view of ~4.1 degrees. The initial 5-year 4MOST survey is currently expected to start in 2020. We provide an overview of the 4MOST systems (optomechanical, control, data management, and operations concepts) and initial performance estimates.

  18. A fringe projector-based study of the Brighter-Fatter Effect in LSST CCDs

    DOE PAGES

    Gilbertson, W.; Nomerotski, A.; Takacs, P.

    2017-09-07

    Achieving the Dark Energy science goals of the Large Synoptic Survey Telescope requires a detailed understanding of CCD sensor effects. One such sensor effect is the increase of the Point Spread Function (PSF) with flux, commonly called the `Brighter-Fatter Effect.' Here a novel approach to PSF measurement in the context of the Brighter-Fatter Effect was tested, employing a Michelson interferometer to project a sinusoidal fringe pattern onto the CCD. The Brighter-Fatter Effect predicts that the intensity pattern of the fringes should become asymmetric, as the brighter peaks, corresponding to a larger flux, are smeared by a larger PSF. By fitting the data with a model that allows for a changing PSF, the strength of the Brighter-Fatter Effect can be evaluated.

  20. Managing the Big Data Avalanche in Astronomy - Data Mining the Galaxy Zoo Classification Database

    NASA Astrophysics Data System (ADS)

    Borne, Kirk D.

    2014-01-01

    We will summarize a variety of data mining experiments that have been applied to the Galaxy Zoo database of galaxy classifications, which were provided by the volunteer citizen scientists. The goal of these exercises is to learn new and improved classification rules for diverse populations of galaxies, which can then be applied to much larger sky surveys of the future, such as the LSST (Large Synoptic Survey Telescope), which is proposed to obtain detailed photometric data for approximately 20 billion galaxies. The massive Big Data that astronomy projects will generate in the future demand greater application of data mining and data science algorithms, as well as greater training of astronomy students in the skills of data mining and data science. The project described here has involved several graduate and undergraduate research assistants at George Mason University.

  1. Gamma Ray Bursts as Cosmological Probes with EXIST

    NASA Astrophysics Data System (ADS)

    Hartmann, Dieter; EXIST Team

    2006-12-01

    The EXIST mission, studied as a Black Hole Finder Probe within NASA's Beyond Einstein Program, would, in its current design, trigger on 1000 Gamma Ray Bursts (GRBs) per year (Grindlay et al, this meeting). The redshift distribution of these GRBs, using results from Swift as a guide, would probe the z > 7 epoch at an event rate of > 50 per year. These bursts trace early cosmic star formation history, point to a first generation of stellar objects that reionize the universe, and provide bright beacons for absorption line studies with ground- and space-based observatories. We discuss how EXIST, in conjunction with other space missions and future large survey programs such as LSST, can be utilized to advance our understanding of cosmic chemical evolution, the structure and evolution of the baryonic cosmic web, and the formation of stars in low metallicity environments.

  2. Connecting the time domain community with the Virtual Astronomical Observatory

    NASA Astrophysics Data System (ADS)

    Graham, Matthew J.; Djorgovski, S. G.; Donalek, Ciro; Drake, Andrew J.; Mahabal, Ashish A.; Plante, Raymond L.; Kantor, Jeffrey; Good, John C.

    2012-09-01

    The time domain has been identified as one of the most important areas of astronomical research for the next decade. The Virtual Observatory is in the vanguard with dedicated tools and services that enable and facilitate the discovery, dissemination and analysis of time domain data. These range in scope from rapid notifications of time-critical astronomical transients to annotating long-term variables with the latest modelling results. In this paper, we will review the prior art in these areas and focus on the capabilities that the VAO is bringing to bear in support of time domain science. In particular, we will focus on the issues involved with the heterogeneous collections of (ancillary) data associated with astronomical transients, and the time series characterization and classification tools required by the next generation of sky surveys, such as LSST and SKA.

  3. Mining the Kilo-Degree Survey for solar system objects

    NASA Astrophysics Data System (ADS)

    Mahlke, M.; Bouy, H.; Altieri, B.; Verdoes Kleijn, G.; Carry, B.; Bertin, E.; de Jong, J. T. A.; Kuijken, K.; McFarland, J.; Valentijn, E.

    2018-02-01

    Context. The search for minor bodies in the solar system promises insights into its formation history. Wide imaging surveys offer the opportunity to serendipitously discover and identify these traces of planetary formation and evolution. Aims: We present a method to acquire position, photometry, and proper motion measurements of solar system objects (SSOs) in surveys using dithered image sequences, and demonstrate its application to the Kilo-Degree Survey (KiDS). Methods: Optical images of 346 deg2 fields of the sky are searched in up to four filters using the AstrOmatic software suite to reduce the pixel data to catalogs. The SSOs within the acquired sources are selected based on a set of criteria depending on their number of observations, motion, and size. The Virtual Observatory SkyBoT tool is used to identify known objects. Results: We observed 20 221 SSO candidates, with an estimated false-positive content of less than 0.05%. Of these SSO candidates, 53.4% are identified by SkyBoT. KiDS can detect previously unknown SSOs because of its depth and coverage at high ecliptic latitude, including parts of the Southern Hemisphere. Thus we expect a large fraction of the 46.6% of unidentified objects to be genuinely new SSOs. Conclusions: Our method is applicable to a variety of dithered surveys such as DES, LSST, and Euclid. It offers a quick and easy-to-implement search for SSOs. SkyBoT can then be used to estimate the completeness of the recovered sample. The tables of raw data are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/610/A21
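
A minimal sketch of this kind of selection, with invented stand-in thresholds (the actual KiDS cuts on number of observations, motion, and size are in the paper): fit a linear motion to the linked detections from the dithered sequence and keep sources that move faster than the astrometric noise.

```python
import numpy as np

def is_sso_candidate(t_hours, ra_deg, dec_deg,
                     min_rate=0.5 / 3600.0, max_rms=0.3 / 3600.0):
    """Flag a series of linked detections from a dithered image sequence as
    a solar-system-object candidate. The cuts below (>= 3 detections, a
    best-fit linear motion above min_rate deg/hr, fit residuals below
    max_rms deg) are illustrative stand-ins for the survey's criteria."""
    t = np.asarray(t_hours, dtype=float)
    if t.size < 3:
        return False
    rates, sq_resid = [], 0.0
    for coord in (ra_deg, dec_deg):
        c = np.asarray(coord, dtype=float)
        coeffs, resid, *_ = np.polyfit(t, c, 1, full=True)
        rates.append(coeffs[0])                       # deg/hr in this axis
        sq_resid += float(resid[0]) if resid.size else 0.0
    rate = float(np.hypot(*rates))
    return rate > min_rate and float(np.sqrt(sq_resid / t.size)) < max_rms

# A source drifting at ~2"/hr is flagged; a stationary star is not.
moving = is_sso_candidate([0.0, 1.0, 2.0],
                          [10.0, 10.0 + 2 / 3600, 10.0 + 4 / 3600],
                          [-30.0, -30.0, -30.0])
fixed = is_sso_candidate([0.0, 1.0, 2.0], [10.0] * 3, [-30.0] * 3)
```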

  4. SPHEREx: Playing Nicely with Other Missions

    NASA Astrophysics Data System (ADS)

    Werner, Michael; SPHEREx Science Team

    2018-01-01

    SPHEREx, a mission in NASA's Medium Explorer (MIDEX) program that was selected for a competitive Phase A study in August 2017, is an all-sky survey satellite designed to address all three science goals of NASA's Astrophysics Division. SPHEREx is a wide-field spectral imager, and it would produce the first all-sky near-infrared spectral survey, using a passively cooled telescope with a wide field-of-view for large mapping speed. The SPHEREx spectra would have resolving power R=41 at wavelengths from 0.75 to 4.2um, and R=135 from 4.2 to 5um. The spectral resolution is provided by Linear Variable Filters placed directly over the four SPHEREx H2RG detector arrays. SPHEREx would be sensitive enough to obtain spectra of essentially all near-infrared sources from the WISE survey. During its two-year mission, SPHEREx, to be launched in 2022, would produce four complete all-sky spectral maps that would serve as a rich archive for the astronomy community. SPHEREx would be tremendously synergistic with numerous other missions and facilities (NASA and non-NASA) that will be operating in the coming decade. SPHEREx observations could pick out the most promising and exciting targets for investigation with JWST. From the opposite perspective, SPHEREx statistical samples could be used to refine the conclusions derived from JWST's in-depth studies of a few members of an interesting class of objects. SPHEREx and GAIA spectrophotometry, incorporating photometry from WISE and GALEX as well as GAIA astrometry, could lead to the determination of the radii of main sequence stars, and of their transiting exoplanets discovered by TESS, with 1% accuracy. SPHEREx low-redshift spectra of millions of galaxies could be used to validate and calibrate the photometric redshift scale being adopted by WFIRST and Euclid, improving the precision of the dark energy measures being returned by those missions.
The poster will briefly address SPHEREx synergisms with these and other missions ranging from LSST to eROSITA. The research described here was carried out in part at the Jet Propulsion Laboratory, California Institute of Technology, operated by the California Institute of Technology under a contract with NASA.

  5. The LSST Optical System

    NASA Astrophysics Data System (ADS)

    Liang, M.; Seppala, L.; Sweeney, D.; LSST Project Team

    2005-12-01

    The 8.4m Large Synoptic Survey Telescope facility will digitally survey the entire visible sky. It will explore the nature of dark matter and dark energy, open the faint optical transient time window, and catalog earth-crossing asteroids > 300m in diameter. We present the design of an f/1.25 modified Paul-Baker or Laux telescope with etendue (AΩ product) of >318 m2 deg2, >50× beyond any existing facility. The optical design, over a 3.5-degree diameter field of view (9.62 deg2), delivers superb ~0.2 arcsec FWHM images over 6 spectral bands covering 325-1000 nm. The flat focal surface has a plate scale of 51 microns/arcsec, matching the 10 micron pixels of a large 0.65 m diameter mosaic digital detector. The f/1.17 primary can be made using polishing techniques and metrology methods pioneered at the University of Arizona Mirror Lab for the 8.4 m f/1.1 Large Binocular Telescope primaries. The 3.4 m convex secondary is twice the size of the largest convex secondary yet manufactured, the 1.7 m MMT f/5 secondary. We show a fabrication and testing plan for this optic, which has less than 40 microns of asphericity from the best-fit sphere. Five separate null or alignment tests are built in as part of the optimization of the entire telescope: the three lenses separately, the combination of the first two lenses, and the three-mirror telescope system without the camera corrector optics. All five tests help to ensure a practicable telescope design.

  6. Storage Intensive Supercomputing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cohen, J; Dossa, D; Gokhale, M

    Critical data science applications requiring frequent access to storage perform poorly on today's computing architectures. This project addresses efficient computation of data-intensive problems in national security and basic science by exploring, advancing, and applying a new form of computing called storage-intensive supercomputing (SISC). Our goal is to enable applications that simply cannot run on current systems, and, for a broad range of data-intensive problems, to deliver an order of magnitude improvement in price/performance over today's data-intensive architectures. This technical report documents much of the work done under LDRD 07-ERD-063, Storage Intensive Supercomputing, during the period 05/07-09/07. The following chapters describe: (1) a new file I/O monitoring tool, iotrace, developed to capture the dynamic I/O profiles of Linux processes; (2) an out-of-core graph benchmark for level-set expansion of scale-free graphs; (3) an entity extraction benchmark consisting of a pipeline of eight components; and (4) an image resampling benchmark drawn from the SWarp program in the LSST data processing pipeline. The performance of the graph and entity extraction benchmarks was measured in three different scenarios: data sets residing on the NFS file server and accessed over the network; data sets stored on local disk; and data sets stored on the Fusion I/O parallel NAND Flash array. The image resampling benchmark compared software-only performance with GPU-accelerated performance. In addition to the work reported here, a text processing application was developed that used an FPGA to accelerate n-gram profiling for language classification. The n-gram application will be presented at SC07 at the High Performance Reconfigurable Computing Technologies and Applications Workshop. The graph and entity extraction benchmarks were run on a Supermicro server housing the 40GB Fusion-io parallel NAND Flash disk array.
The Fusion system specs are as follows: SuperMicro X7DBE Xeon dual-socket Blackford server motherboard; 2 Intel Xeon dual-core 2.66 GHz processors; 1 GB DDR2 PC2-5300 RAM (2 x 512); 80GB hard drive (Seagate SATA II Barracuda). The Fusion board is presently capable of 4X in a PCIe slot. The image resampling benchmark was run on a dual Xeon workstation with an NVIDIA graphics card (see Chapter 5 for the full specification). An XtremeData Opteron+FPGA system was used for the language classification application. We observed that these benchmarks are not uniformly I/O intensive. The only benchmark that spent more than 50% of its time in I/O was the graph algorithm when it accessed data files over NFS. When local disk was used, the graph benchmark spent at most 40% of its time in I/O. The other benchmarks were CPU dominated. The image resampling and language classification benchmarks showed order-of-magnitude speedups over software by using co-processor technology to offload the CPU-intensive kernels. Our experiments to date suggest that emerging hardware technologies offer significant benefit in boosting the performance of data-intensive algorithms. Using GPU and FPGA co-processors, we were able to improve performance by more than an order of magnitude on the benchmark algorithms, eliminating the processor bottleneck of CPU-bound tasks. Experiments with a prototype solid-state nonvolatile memory available today show 10X better throughput on random reads than disk, with a 2X speedup on a graph processing benchmark compared to the use of local SATA disk.

  7. A Photometric (griz) Metallicity Calibration for Cool Stars

    NASA Astrophysics Data System (ADS)

    West, Andrew A.; Davenport, James R. A.; Dhital, Saurav; Mann, Andrew; Massey, Angela P

    2014-06-01

    We present results from a study that uses wide pairs as tools for estimating and constraining the metal content of cool stars from their spectra and broad band colors. Specifically, we will present results that optimize the Mann et al. M dwarf metallicity calibrations (derived using wide binaries) for the optical regime covered by SDSS spectra. We will demonstrate the robustness of the new calibrations using a sample of wide, low-mass binaries for which both components have an SDSS spectrum. Using these new spectroscopic metallicity calibrations, we will present relations between the metallicities (from optical spectra) and the Sloan colors derived using more than 20,000 M dwarfs in the SDSS DR7 spectroscopic catalog. These relations have important ramifications for studies of Galactic chemical evolution, the search for exoplanets and subdwarfs, and are essential for surveys such as Pan-STARRS and LSST, which use griz photometry but have no spectroscopic component.
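
The calibration step described above can be sketched as a least-squares fit of metallicity against broadband colors on mock data; the coefficients, color ranges, and scatter below are invented for illustration and are not the published relations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Mock sample standing in for the SDSS M dwarfs: assume (purely for
# illustration) a linear dependence of [Fe/H] on the g-r and r-z colors.
n = 500
g_r = rng.uniform(1.2, 1.6, n)
r_z = rng.uniform(1.5, 3.0, n)
true_coef = np.array([-0.10, 0.80, -0.50])        # assumed, not from the paper
feh = true_coef[0] + true_coef[1] * g_r + true_coef[2] * r_z
feh += rng.normal(0.0, 0.05, n)                   # spectroscopic scatter

# Least-squares photometric calibration: [Fe/H] = a + b(g-r) + c(r-z)
A = np.column_stack([np.ones(n), g_r, r_z])
coef, *_ = np.linalg.lstsq(A, feh, rcond=None)
```

Once fitted, the coefficient vector can be applied to photometry-only catalogs, which is the sense in which such relations serve griz surveys that lack spectroscopy.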

  8. Dwarf Hosts of Low-z Supernovae

    NASA Astrophysics Data System (ADS)

    Pyotr Kolobow, Craig; Perlman, Eric S.; Strolger, Louis

    2018-01-01

    Hostless supernovae (SNe), or SNe in dwarf galaxies, may serve as excellent beacons for probing the spatial density of dwarf galaxies (M < 10^8 M⊙), which themselves are scarcely detected beyond only a few Mpc. Depending on the assumed model for the stellar-mass to halo-mass relation for these galaxies, LSST might see 1000s of SNe (of all types) from dwarf galaxies alone. Conversely, one can take the measured rates of these SNe and test the model predictions for the density of dwarf galaxies in the local universe. Current “all-sky” surveys, like PanSTARRS and ASAS-SN, are now finding hostless SNe in numbers sufficient to measure their rate. What is missing is the appropriate weighting of their host luminosities. Here we seek to continue a successful program to recover the luminosities of these hostless SNe, to z = 0.15, and to use their rate to constrain the faint-end slope of the low-z galaxy luminosity function.
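
The link between the hostless-SN fraction and the faint-end slope can be illustrated with a toy luminosity-weighted Schechter model; the functional form is the standard Schechter function, but the luminosity cut and slopes below are illustrative, not the program's fit.

```python
import numpy as np

def integrate(y, x):
    """Trapezoidal rule (numpy-only helper)."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def sn_fraction_below(L_cut, alpha, L_star=1.0):
    """Fraction of SNe hosted by galaxies fainter than L_cut, assuming the
    SN rate traces host luminosity and hosts follow a Schechter luminosity
    function phi(L) ~ (L/L*)^alpha exp(-L/L*). A toy model."""
    L = np.logspace(-8, 2, 200_000) * L_star
    w = (L / L_star) ** (alpha + 1.0) * np.exp(-L / L_star)   # L * phi(L)
    mask = L <= L_cut
    return integrate(w[mask], L[mask]) / integrate(w, L)

# A steeper faint-end slope alpha shifts more SNe into dwarf hosts:
f_shallow = sn_fraction_below(0.01, alpha=-1.0)
f_steep = sn_fraction_below(0.01, alpha=-1.5)
```

Inverting this relation (measured hostless fraction in, slope out) is the sense in which hostless-SN rates constrain the faint end of the luminosity function.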

  9. Probing Neutrino Hierarchy and Chirality via Wakes.

    PubMed

    Zhu, Hong-Ming; Pen, Ue-Li; Chen, Xuelei; Inman, Derek

    2016-04-08

    The relic neutrinos are expected to acquire a bulk relative velocity with respect to the dark matter at low redshifts, and neutrino wakes are expected to develop downstream of the dark matter halos. We propose a method of measuring the neutrino mass based on this mechanism. This neutrino wake will cause a dipole distortion of the galaxy-galaxy lensing pattern. This effect could be detected by combining upcoming lensing surveys with a low redshift galaxy survey or a 21 cm intensity mapping survey, which can map the neutrino flow field. The data obtained with LSST and Euclid should enable us to make a positive detection if the three neutrino masses are quasidegenerate with each neutrino mass of ∼0.1 eV, and a future high precision 21 cm lensing survey would allow the normal hierarchy and inverted hierarchy cases to be distinguished, and even the right-handed Dirac neutrinos may be detectable.

  10. Cosmic Visions Dark Energy: Small Projects Portfolio

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dawson, Kyle; Frieman, Josh; Heitmann, Katrin

    Understanding cosmic acceleration is one of the key science drivers for astrophysics and high-energy physics in the coming decade (2014 P5 Report). With the Large Synoptic Survey Telescope (LSST) and the Dark Energy Spectroscopic Instrument (DESI) and other new facilities beginning operations soon, we are entering an exciting phase during which we expect an order of magnitude improvement in constraints on dark energy and the physics of the accelerating Universe. This is a key moment for a matching Small Projects portfolio that can (1) greatly enhance the science reach of these flagship projects, (2) have immediate scientific impact, and (3) lay the groundwork for the next stages of the Cosmic Frontier Dark Energy program. In this White Paper, we outline a balanced portfolio that can accomplish these goals through a combination of observational, experimental, and theory and simulation efforts.

  11. Implications from XMM and Chandra Source Catalogs for Future Studies with Lynx

    NASA Astrophysics Data System (ADS)

    Ptak, Andrew

    2018-01-01

    Lynx will perform extremely sensitive X-ray surveys by combining very high-resolution imaging over a large field of view with a high effective area. These will include deep planned surveys and serendipitous source surveys. Here we discuss implications that can be gleaned from current Chandra and XMM-Newton serendipitous source surveys. These current surveys have discovered novel sources such as tidal disruption events, binary AGN, and ULX pulsars. In addition these surveys have detected large samples of normal galaxies, low-luminosity AGN, and quasars due to the wide-area coverage of the Chandra and XMM-Newton source catalogs, allowing the evolution of these phenomena to be explored. The wide area Lynx surveys will probe down further in flux and will be coupled with very sensitive wide-area surveys such as LSST and SKA, allowing for detailed modeling of their SEDs and the discovery of rare, exotic sources and transient events.

  12. Cosmic Evolution Through UV Spectroscopy (CETUS): A NASA Probe-Class Mission Concept

    NASA Astrophysics Data System (ADS)

    Heap, Sara R.; CETUS Team

    2017-01-01

    CETUS is a probe-class mission concept proposed for study to NASA in November 2016. Its overarching objective is to provide access to the ultraviolet (~100-400 nm) after Hubble has died. CETUS will be a major player in the emerging global network of powerful, new telescopes such as eROSITA, DESI, Subaru/PFS, GMT, LSST, WFIRST, JWST, and SKA. The CETUS mission concept provisionally features a 1.5-m telescope with a suite of instruments including a near-UV multi-object spectrograph (200-400 nm) complementing Subaru/PFS observations, wide-field far-UV and near-UV cameras, and far-UV and near-UV spectrographs that can be operated in either high-resolution or low-resolution mode. We have derived the scope and specific science requirements for CETUS for understanding the evolutionary history of galaxies, stars, and dust, but other applications are possible.

  13. The science enabled by the Maunakea Spectroscopic Explorer

    NASA Astrophysics Data System (ADS)

    Martin, N. F.; Babusiaux, C.

    2017-12-01

    With its unique wide-field, multi-object, and dedicated spectroscopic capabilities, the Maunakea Spectroscopic Explorer (MSE) is a powerful facility to shed light on the faint Universe. Built around an upgrade of the Canada-France Hawaii Telescope (CFHT) to an 11.25-meter telescope with a dedicated ~1.5 deg^2, 4,000-fiber wide-field spectrograph that covers the optical and near-infrared wavelengths at resolutions between 2,500 and 40,000, the MSE is the essential follow-up complement to the current and next generations of multi-wavelength imaging surveys, such as the LSST, Gaia, Euclid, eROSITA, SKA, and WFIRST, and is an ideal feeder facility for the extremely large telescopes that are currently being built (E-ELT, GMT, and TMT). The science enabled by the MSE is vast and would have an impact on almost all aspects of astronomy research.

  14. Estimating explosion properties of normal hydrogen-rich core-collapse supernovae

    NASA Astrophysics Data System (ADS)

    Pejcha, Ondrej

    2017-08-01

    Recent parameterized 1D explosion models of hundreds of core-collapse supernova progenitors suggest that success and failure are intertwined in a complex pattern that is not a simple function of the progenitor initial mass. This rugged landscape is present also in other explosion properties, allowing for quantitative tests of the neutrino mechanism from observations of the hundreds of supernovae discovered every year. We present a new self-consistent and versatile method that derives photospheric radius and temperature variations of normal hydrogen-rich core-collapse supernovae based on their photometric measurements and expansion velocities. We construct SEDs and bolometric light curves, and determine explosion energies, ejecta masses, and nickel masses while taking into account all uncertainties and covariances of the model. We describe our efforts to compare these inferences to the predictions of the neutrino mechanism. The model can be adapted to include more physical assumptions in order to utilize primarily photometric data coming from surveys such as LSST.
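
The core step of relating expansion velocities and photometry to photospheric properties can be sketched under the standard assumptions of homologous expansion (R = v_phot · t) and blackbody emission (L = 4πR²σT⁴); the input values below are illustrative, not fitted quantities from the method.

```python
import numpy as np

SIGMA_SB = 5.670374419e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
DAY = 86400.0                # seconds per day

def photosphere(t_days, t0_days, v_phot_kms, L_bol_watts):
    """Photospheric radius from homologous expansion, R = v_phot (t - t0),
    and temperature from the Stefan-Boltzmann law, L = 4 pi R^2 sigma T^4."""
    R = v_phot_kms * 1.0e3 * (t_days - t0_days) * DAY          # meters
    T = (L_bol_watts / (4.0 * np.pi * SIGMA_SB * R ** 2)) ** 0.25
    return R, T

# Illustrative Type IIP-like numbers: 20 days after explosion,
# v_phot = 8000 km/s, L_bol = 1e42 erg/s = 1e35 W
R, T = photosphere(20.0, 0.0, 8000.0, 1.0e35)
```

In the actual method these relations are inverted statistically (light curves and velocity curves in, explosion parameters and their covariances out), but the physics per epoch reduces to the two lines above.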

  15. The applications of deep neural networks to sdBV classification

    NASA Astrophysics Data System (ADS)

    Boudreaux, Thomas M.

    2017-12-01

    With several new large-scale surveys on the horizon, including LSST, TESS, ZTF, and Evryscope, faster and more accurate analysis methods will be required to adequately process the enormous amount of data produced. Deep learning, used in industry for years now, allows for advanced feature detection in minimally prepared datasets at very high speeds; however, despite the advantages of this method, its application to astrophysics has not yet been extensively explored. This dearth may be due to a lack of training data available to researchers. Here we generate synthetic data loosely mimicking the properties of acoustic mode pulsating stars, and we show that two separate paradigms of deep learning - the artificial neural network and the convolutional neural network - can both be used to classify this synthetic data effectively, and that this classification can be performed at relatively high accuracy with minimal time spent tuning network hyperparameters.
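
A numpy-only sketch of the first paradigm, under invented assumptions: a small artificial neural network trained on synthetic light curves (sinusoid-plus-noise "pulsators" versus pure noise). The data generation, architecture, and hyperparameters are all illustrative, not those of the study.

```python
import numpy as np

rng = np.random.default_rng(42)
N, T = 400, 64                      # number of light curves, samples each

# Synthetic classes: 1 = sinusoidal "pulsator" plus noise, 0 = pure noise
t = np.linspace(0.0, 1.0, T)
y = rng.integers(0, 2, N)
freq = rng.uniform(5.0, 15.0, N)
phase = rng.uniform(0.0, 2.0 * np.pi, N)
X = rng.normal(0.0, 1.0, (N, T))
X[y == 1] += 3.0 * np.sin(2.0 * np.pi * freq[y == 1, None] * t + phase[y == 1, None])

# Feed the amplitude spectrum to the network
F = np.abs(np.fft.rfft(X, axis=1))
F /= F.std()

# One-hidden-layer network trained by full-batch gradient descent
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
W1 = rng.normal(0.0, 0.1, (F.shape[1], 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.1, 16); b2 = 0.0
lr = 0.5
for _ in range(600):
    H = np.tanh(F @ W1 + b1)                 # hidden activations
    p = sigmoid(H @ W2 + b2)                 # class-1 probabilities
    g = (p - y) / N                          # dLoss/dlogit for cross-entropy
    dH = np.outer(g, W2) * (1.0 - H ** 2)    # backprop through tanh
    W2 -= lr * (H.T @ g); b2 -= lr * g.sum()
    W1 -= lr * (F.T @ dH); b1 -= lr * dH.sum(0)

H = np.tanh(F @ W1 + b1)
acc = float(((sigmoid(H @ W2 + b2) > 0.5) == y).mean())   # training accuracy
```

The convolutional variant replaces the fixed FFT front end with learned filters; the training loop is otherwise the same shape.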

  16. LensFlow: A Convolutional Neural Network in Search of Strong Gravitational Lenses

    NASA Astrophysics Data System (ADS)

    Pourrahmani, Milad; Nayyeri, Hooshang; Cooray, Asantha

    2018-03-01

    In this work, we present our machine learning classification algorithm for identifying strong gravitational lenses from wide-area surveys using convolutional neural networks: LENSFLOW. We train and test the algorithm using a wide variety of strong gravitational lens configurations from simulations of lensing events. Images are processed through multiple convolutional layers that extract the feature maps necessary to assign a lens probability to each image. LENSFLOW provides a ranking scheme for all sources that could be used to identify potential gravitational lens candidates by significantly reducing the number of images that have to be visually inspected. We apply our algorithm to the HST/ACS i-band observations of the COSMOS field and present our sample of identified lensing candidates. The developed machine learning algorithm is more computationally efficient than, and complementary to, classical lens identification algorithms, and is ideal for discovering such events across wide areas from current and future surveys such as LSST and WFIRST.

  17. HREXI prototype for 4piXIO

    NASA Astrophysics Data System (ADS)

    Grindlay, Jonathan

    We propose to complete our development of the High Resolution Energetic X-ray Imager (HREXI) and to build and test a full Engineering Model of a detector and telescope system for a 12U Cubesat that will be proposed for a test flight. This will enable a future SMEX (or MIDEX) proposal for a 4piXIO mission: a constellation of Cubesats (or Smallsats) that would dramatically increase the sensitivity, source location precision and especially number of Gamma Ray Bursts (GRBs) to explore the Early Universe. Over the past two years of our current APRA grant, we have developed the world's first (to our knowledge) readout of a high-level imaging detector that is entirely three dimensional so that imaging detectors can then be tiled in close-packed arrays of arbitrary total area. This important new technology is achieved by replacing the external lateral readout of an ASIC, which reads out data from (for example) a 2 x 2 cm imaging detector through "wire bonds" to external circuits in the same plane but beyond the detector, with a vertical readout through the ASIC itself to external circuits directly below. This new technology greatly simplifies the assembly of the large area, tiled arrays of such detectors and their readout ASICs used for coded aperture wide-field telescopes that are uniquely able to discover and study X-ray (and low energy gamma-ray) transients and bursts that are key to understanding the physics and evolution of black holes. The first actual fabrication of such 3D-readout of close-tiled HREXI imaging detectors is underway and will be demonstrated in this third and final year of the current APRA grant. This proposal takes the HREXI detector concept a major step further. 
By incorporating this technology into the design and fabrication of a complete Engineering Model of a HREXI detector and coded aperture telescope that would fit, with comfortable margins, in a 12U Cubesat, it opens the way for a future low-cost constellation of 25 such 12U Cubesats to achieve the first full-sky, full-time imaging survey for Gamma-ray Bursts (GRBs) and transients. The full-sky/time coverage immediately increases GRB detections by factors of 6, a significant increase in the search for GRBs from the Early Universe. The proposal will also extend the development of smaller pixel size for the required ASIC chips which will significantly improve angular resolution and make the low-cost Cubesat mission even more compelling. The science goals that a multi-satellite mission enabled by HREXI detectors for high resolution imaging over the full sky include using GRBs to trace star formation back to the very first (Pop III) stars and using flares from quasars to track the growth and evolution of supermassive black holes. Both are key NASA and PCOS science objectives. This is achieved by combining coordinated optical and IR data from a 4piXIO mission with LSST ground-based optical data as well as optical/IR spectra from a future optical-IR spectroscopy telescope in space, such as the proposed TSO probe-class mission.

  18. The Palomar Transient Factory: Introduction and Data Release

    NASA Astrophysics Data System (ADS)

    Surace, Jason Anthony

    2015-08-01

    The Palomar Transient Factory (PTF) is a synoptic sky survey in operation since 2009. PTF utilizes a 7.1 square degree camera on the Palomar 48-inch Schmidt telescope to survey the sky primarily at a single wavelength (R-band) at a rate of 1000-3000 square degrees a night, to a depth of roughly 20.5. The data are used to detect and study transient and moving objects such as gamma ray bursts, supernovae and asteroids, as well as variable phenomena such as quasars and Galactic stars. The data processing system handles real-time processing and detection of transients, solar system object processing, high photometric precision processing and light curve generation, and long-term archiving and curation. Although a significant scientific installation in and of itself, PTF also serves as the prototype for our next generation project, the Zwicky Transient Facility (ZTF). Beginning operations in 2017, ZTF will feature a 50 square degree camera which will enable scanning of the entire northern visible sky every night. ZTF in turn will serve as a stepping stone to the Large Synoptic Survey Telescope (LSST). We announce the availability of the second PTF public data release, which includes epochal images and catalogs, as well as deep (coadded) reference images and associated catalogs, for the majority of the northern sky. The epochal data span the time period from 2009 through 2012, with various cadences and coverages, typically in the tens or hundreds for most points on the sky. The data are available through both a GUI and a software API portal at the Infrared Processing and Analysis Center at Caltech. The PTF and current iPTF projects are multi-partner, multi-national collaborations.

  19. Morphology-based Query for Galaxy Image Databases

    NASA Astrophysics Data System (ADS)

    Shamir, Lior

    2017-02-01

    Galaxies of rare morphology are of paramount scientific interest, as they carry important information about the past, present, and future Universe. Once a rare galaxy is identified, studying it more effectively requires a set of galaxies of similar morphology, allowing generalization and statistical analysis that cannot be done when N=1. Databases generated by digital sky surveys can contain a very large number of galaxy images, and therefore once a rare galaxy of interest is identified it is possible that more instances of the same morphology are also present in the database. However, when a researcher identifies a certain galaxy of rare morphology in the database, it is virtually impossible to mine the database manually in the search for galaxies of similar morphology. Here we propose a computer method that can automatically search databases of galaxy images and identify galaxies that are morphologically similar to a certain user-defined query galaxy. That is, the researcher provides an image of a galaxy of interest, and the pattern recognition system automatically returns a list of galaxies that are visually similar to the target galaxy. The algorithm uses a comprehensive set of descriptors, allowing it to support different types of galaxies, and it is not limited to a finite set of known morphologies. While the list of returned galaxies is neither clean nor complete, it contains a far higher frequency of galaxies of the morphology of interest, providing a substantial reduction of the data. Such algorithms can be integrated into data management systems of autonomous digital sky surveys such as the Large Synoptic Survey Telescope (LSST), where the number of galaxies in the database is extremely large. The source code of the method is available at http://vfacstaff.ltu.edu/lshamir/downloads/udat.
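The query-by-example approach described above reduces to nearest-neighbour search in a numeric descriptor space. The sketch below (not the authors' actual descriptor set, which is far more comprehensive) ranks a toy catalog of galaxies by Euclidean distance to a query feature vector; the galaxy names and three-element vectors are purely illustrative:

```python
import math

def similarity_rank(query_features, database, k=3):
    """Rank database galaxies by Euclidean distance in descriptor space.

    `database` maps a galaxy identifier to its feature vector; the k
    entries closest to the query vector are returned, nearest first.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    ranked = sorted(database, key=lambda name: dist(database[name], query_features))
    return ranked[:k]

# Toy descriptor vectors (e.g. spirality, ellipticity, edge measures).
catalog = {
    "ring_a":   [0.90, 0.10, 0.40],
    "spiral_b": [0.20, 0.80, 0.50],
    "ring_c":   [0.85, 0.15, 0.45],
}
print(similarity_rank([0.88, 0.12, 0.42], catalog, k=2))  # → ['ring_a', 'ring_c']
```

As in the paper's method, the returned list is a candidate set for inspection rather than a clean classification.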

  20. Precision cosmology with time delay lenses: High resolution imaging requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meng, Xiao -Lei; Treu, Tommaso; Agnello, Adriano

    Lens time delays are a powerful probe of cosmology, provided that the gravitational potential of the main deflector can be modeled with sufficient precision. Recent work has shown that this can be achieved by detailed modeling of the host galaxies of lensed quasars, which appear as ``Einstein Rings'' in high resolution images. The distortion of these arcs and counter-arcs, as measured over a large number of pixels, provides tight constraints on the difference in the gravitational potential between the quasar image positions, and thus on cosmology in combination with the measured time delay. We carry out a systematic exploration of the high resolution imaging required to exploit the thousands of lensed quasars that will be discovered by current and upcoming surveys within the next decade. Specifically, we simulate realistic lens systems as imaged by the Hubble Space Telescope (HST), the James Webb Space Telescope (JWST), and ground-based adaptive optics on Keck or the Thirty Meter Telescope (TMT). We compare the performance of these pointed observations with that of images taken by the Euclid (VIS), Wide-Field Infrared Survey Telescope (WFIRST), and Large Synoptic Survey Telescope (LSST) surveys. We use as our metric the precision with which the slope γ' of the total mass density profile ρ_tot ∝ r^(−γ') for the main deflector can be measured. Ideally, we require that the statistical error on γ' be less than 0.02, such that it is subdominant to other sources of random and systematic uncertainties. We find that survey data will likely have sufficient depth and resolution to meet the target only for the brighter gravitational lens systems, comparable to those discovered by the SDSS survey. For fainter systems, which will be discovered by current and future surveys, targeted follow-up will be required. Furthermore, the exposure time required with upcoming facilities such as JWST, the Keck Next Generation Adaptive Optics System, and TMT will only be of order a few minutes per system, thus making the follow-up of hundreds of systems a practical and efficient cosmological probe.

  1. Precision cosmology with time delay lenses: high resolution imaging requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meng, Xiao-Lei; Liao, Kai; Treu, Tommaso

    Lens time delays are a powerful probe of cosmology, provided that the gravitational potential of the main deflector can be modeled with sufficient precision. Recent work has shown that this can be achieved by detailed modeling of the host galaxies of lensed quasars, which appear as ''Einstein Rings'' in high resolution images. The distortion of these arcs and counter-arcs, as measured over a large number of pixels, provides tight constraints on the difference in the gravitational potential between the quasar image positions, and thus on cosmology in combination with the measured time delay. We carry out a systematic exploration of the high resolution imaging required to exploit the thousands of lensed quasars that will be discovered by current and upcoming surveys within the next decade. Specifically, we simulate realistic lens systems as imaged by the Hubble Space Telescope (HST), the James Webb Space Telescope (JWST), and ground-based adaptive optics on Keck or the Thirty Meter Telescope (TMT). We compare the performance of these pointed observations with that of images taken by the Euclid (VIS), Wide-Field Infrared Survey Telescope (WFIRST), and Large Synoptic Survey Telescope (LSST) surveys. We use as our metric the precision with which the slope γ' of the total mass density profile ρ_tot ∝ r^(−γ') for the main deflector can be measured. Ideally, we require that the statistical error on γ' be less than 0.02, such that it is subdominant to other sources of random and systematic uncertainties. We find that survey data will likely have sufficient depth and resolution to meet the target only for the brighter gravitational lens systems, comparable to those discovered by the SDSS survey. For fainter systems, which will be discovered by current and future surveys, targeted follow-up will be required. However, the exposure time required with upcoming facilities such as JWST, the Keck Next Generation Adaptive Optics System, and TMT will only be of order a few minutes per system, thus making the follow-up of hundreds of systems a practical and efficient cosmological probe.

  2. Connecting Variability and Metals in White Dwarfs

    NASA Astrophysics Data System (ADS)

    Kilic, Mukremin

    2016-10-01

    The Kepler and K2 missions have revealed that about half of the observed white dwarfs with sufficient signal-to-noise ratio light curves show low-level photometric variations on hour-to-day timescales. Potential explanations for the observed variability include the relativistic beaming effect, ellipsoidal variations, eclipses, and reflection from giant planets in close orbits. However, these are all rare events. Roughly 10% of white dwarfs are magnetic, and magnetic fields can explain part of this puzzle; the high incidence (50%) of variability, however, is currently unexplained. HST COS spectroscopy of nearby white dwarfs shows that about half of them have metals on their surface. Hence, we propose that the observed variability is due to the rotation of the star coupled with an inhomogeneous surface distribution of accreted metals. We have recently discovered an ideal system to test this hypothesis. J1529 is an apparently non-magnetic white dwarf that shows 5.9% photometric dips in the optical every 38 min. We propose to obtain COS TIME-TAG spectroscopy of J1529 over 4 orbits to search for surface abundance differences throughout the orbit and to look for the flux redistribution effect in the optical. These observations will confirm or rule out the idea that inhomogeneous metal accretion on white dwarfs can explain the high incidence of variability. We predict that the LSST will identify 100,000 variable white dwarfs. Hence, understanding the source of variability in white dwarfs has implications for current and future transient surveys.

  3. Interactive (statistical) visualisation and exploration of a billion objects with vaex

    NASA Astrophysics Data System (ADS)

    Breddels, M. A.

    2017-06-01

    With new catalogues such as Gaia DR1 arriving, containing more than a billion objects, new methods of handling and visualizing these data volumes are needed. We show that by calculating statistics on a regular (N-dimensional) grid, visualizations of a billion objects can be done within a second on a modern desktop computer. This is achieved using memory mapping of HDF5 files together with a simple binning algorithm, both part of a Python library called vaex. This enables efficient interactive exploration of large datasets, making science exploration of large catalogues feasible. Vaex is both a Python library and an application that allows interactive exploration and visualization. The motivation for developing vaex is the catalogue of the Gaia satellite; however, vaex can also be used on SPH or N-body simulations, any other (future) catalogues such as SDSS, Pan-STARRS, and LSST, or other tabular data. The homepage for vaex is http://vaex.astro.rug.nl.
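The core trick exploited here, computing statistics on a regular grid rather than handling individual points, can be illustrated with a tiny pure-Python 2-D counting histogram (vaex itself vectorizes this over memory-mapped HDF5 columns; this sketch shows only the binning logic):

```python
def bin_count_2d(xs, ys, xlim, ylim, nx, ny):
    """Count points on a regular nx-by-ny grid, as in grid-based binned stats.

    xlim and ylim are (low, high) tuples; points outside are ignored.
    Returns a nested list grid[j][i] of counts (j indexes y, i indexes x).
    """
    grid = [[0] * nx for _ in range(ny)]
    dx = (xlim[1] - xlim[0]) / nx
    dy = (ylim[1] - ylim[0]) / ny
    for x, y in zip(xs, ys):
        i = int((x - xlim[0]) / dx)
        j = int((y - ylim[0]) / dy)
        if 0 <= i < nx and 0 <= j < ny:
            grid[j][i] += 1
    return grid

print(bin_count_2d([0.1, 0.9, 0.6], [0.1, 0.9, 0.6], (0.0, 1.0), (0.0, 1.0), 2, 2))
# → [[1, 0], [0, 2]]
```

Because the grid size is fixed regardless of the number of input points, rendering cost after binning is independent of catalogue size; the same sums/means can replace counts for other statistics.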

  4. Ganalyzer: A tool for automatic galaxy image analysis

    NASA Astrophysics Data System (ADS)

    Shamir, Lior

    2011-05-01

    Ganalyzer is a model-based tool that automatically analyzes and classifies galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large datasets of galaxy images collected by autonomous sky surveys such as SDSS, LSST or DES.
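A minimal sketch of the first step of such a radial-intensity analysis, computing the mean pixel value in concentric annuli around an assumed galaxy centre, might look like the following (the peak detection and slope measurement that Ganalyzer then performs on the resulting plot are omitted):

```python
import math

def radial_profile(image, cx, cy, nbins, rmax):
    """Mean pixel intensity in nbins concentric annuli about (cx, cy).

    `image` is a nested list of pixel values; pixels beyond rmax are ignored.
    """
    sums = [0.0] * nbins
    counts = [0] * nbins
    for y, row in enumerate(image):
        for x, value in enumerate(row):
            r = math.hypot(x - cx, y - cy)
            b = int(r / rmax * nbins)   # annulus index for this pixel
            if b < nbins:
                sums[b] += value
                counts[b] += 1
    return [s / c if c else 0.0 for s, c in zip(sums, counts)]

# A 3x3 toy "galaxy" with a bright centre: the inner bin dominates.
print(radial_profile([[0, 0, 0], [0, 9, 0], [0, 0, 0]], 1, 1, 2, 2.0))  # → [9.0, 0.0]
```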

  5. Cosmological parameter constraints with the Deep Lens Survey using galaxy-shear correlations and galaxy clustering properties

    NASA Astrophysics Data System (ADS)

    Yoon, Mijin; Jee, Myungkook James; Tyson, Tony

    2018-01-01

    The Deep Lens Survey (DLS), a precursor to the Large Synoptic Survey Telescope (LSST), is a 20 sq. deg survey carried out with NOAO’s Blanco and Mayall telescopes. The strength of the survey lies in its depth, reaching down to ~27th mag in the BVRz bands. This enables a broad redshift baseline study and allows us to investigate the cosmological evolution of large-scale structure. In this poster, we present the first cosmological analysis from the DLS using galaxy-shear correlations and galaxy clustering signals. Our DLS shear calibration accuracy has been validated through the most recent public weak-lensing data challenge. Photometric redshift systematic errors are tested by performing lens-source flip tests. Instead of real-space correlations, we reconstruct band-limited power spectra for cosmological parameter constraints. Our analysis puts a tight constraint on the matter density and power spectrum normalization parameters. Our results are highly consistent with our previous cosmic shear analysis and with the Planck CMB results.

  6. zBEAMS: a unified solution for supernova cosmology with redshift uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, Ethan; Lochner, Michelle; Bassett, Bruce A.

    Supernova cosmology without spectra will be an important component of future surveys such as LSST. This lack of supernova spectra results in uncertainty in the redshifts which, if ignored, leads to significantly biased estimates of cosmological parameters. Here we present a hierarchical Bayesian formalism, zBEAMS, that addresses this problem by marginalising over the unknown or uncertain supernova redshifts to produce unbiased cosmological estimates that are competitive with supernova data with spectroscopically confirmed redshifts. zBEAMS provides a unified treatment of both photometric redshifts and host galaxy misidentification (occurring due to chance galaxy alignments or faint hosts), effectively correcting the inevitable contamination in the Hubble diagram. Like its predecessor BEAMS, our formalism also takes care of non-Ia supernova contamination by marginalising over the unknown supernova type. We illustrate this technique with simulations of supernovae with photometric redshifts and host galaxy misidentification. A novel feature of the photometric redshift case is the important role played by the redshift distribution of the supernovae.
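The marginalisation at the heart of this kind of formalism can be sketched numerically. The hedged toy example below integrates a Gaussian likelihood for an observed distance modulus over a Gaussian photometric-redshift prior, using a simple low-redshift Hubble-law distance modulus in place of a full cosmological model (the actual zBEAMS likelihood is richer, adding supernova-type and host-misidentification terms):

```python
import math

C_KMS = 299792.458  # speed of light [km/s]

def mu_model(z, h0=70.0):
    """Low-redshift distance modulus from the Hubble law (illustrative only)."""
    return 5.0 * math.log10(C_KMS * z / h0) + 25.0

def gauss(x, mean, sigma):
    return math.exp(-0.5 * ((x - mean) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def marginal_likelihood(mu_obs, sigma_mu, z_phot, sigma_z, nstep=200):
    """L(mu_obs) = ∫ N(mu_obs | mu(z), sigma_mu) N(z | z_phot, sigma_z) dz,
    approximated by a midpoint sum over ±5 sigma of the redshift prior."""
    zlo = max(1e-4, z_phot - 5 * sigma_z)
    zhi = z_phot + 5 * sigma_z
    dz = (zhi - zlo) / nstep
    total = 0.0
    for k in range(nstep):
        z = zlo + (k + 0.5) * dz
        total += gauss(mu_obs, mu_model(z), sigma_mu) * gauss(z, z_phot, sigma_z) * dz
    return total
```

A distance modulus consistent with the photometric redshift yields a much larger marginal likelihood than a strongly offset one, which is what lets the uncertain redshift contribute to the fit without biasing it.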

  7. Unveiling the Low Surface Brightness Stellar Peripheries of Galaxies

    NASA Astrophysics Data System (ADS)

    Ferguson, Annette M. N.

    2018-01-01

    The low surface brightness peripheral regions of galaxies contain a gold mine of information about how minor mergers and accretions have influenced their evolution over cosmic time. Enormous stellar envelopes and copious amounts of faint tidal debris are natural outcomes of the hierarchical assembly process and the search for and study of these features, albeit highly challenging, offers the potential for unrivalled insight into the mechanisms of galaxy growth. Over the last two decades, there has been burgeoning interest in probing galaxy outskirts using resolved stellar populations. Wide-field surveys have uncovered vast tidal debris features and new populations of very remote globular clusters, while deep Hubble Space Telescope photometry has provided exquisite star formation histories back to the earliest epochs. I will highlight some recent results from studies within and beyond the Local Group and conclude by briefly discussing the great potential of future facilities, such as JWST, Euclid, LSST and WFIRST, for major breakthroughs in low surface brightness galaxy periphery science.

  8. Systematic Serendipity: A Method to Discover the Anomalous

    NASA Astrophysics Data System (ADS)

    Giles, Daniel; Walkowicz, Lucianne

    2018-01-01

    One of the challenges in the era of big-data astronomical surveys is identifying anomalous data, data that exhibit as-yet-unobserved behavior. These data may result from systematic errors, extreme (or rare) forms of known phenomena, or, most interestingly, truly novel phenomena that have historically required a trained eye and often fortuitous circumstance to identify. We describe a method that uses machine-learning clustering techniques to discover anomalous data in Kepler light curves, as a step towards systematizing the detection of novel phenomena in the era of LSST. As a proof of concept, we apply our anomaly detection method to Kepler data including Boyajian's Star (KIC 8462852). We examine quarters 4, 8, 11, and 16 of the Kepler data, which contain Boyajian's Star acting normally (quarters 4 and 11) and anomalously (quarters 8 and 16). We demonstrate that our method is capable of identifying Boyajian's Star's anomalous behavior in the quarters of interest, and we further identify other anomalous light curves that exhibit a range of interesting variability.
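One common way to turn clustering into an anomaly score, which may or may not match the authors' exact procedure, is to cluster light-curve feature vectors and flag points that are far from every cluster centre. A self-contained sketch with plain Lloyd's k-means:

```python
import math

def lloyd_kmeans(points, k, iters=10):
    """Plain Lloyd's algorithm; the first k points seed the centroids."""
    centroids = [list(p) for p in points[:k]]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda i: math.dist(p, centroids[i]))
            groups[j].append(p)
        for i, g in enumerate(groups):
            if g:  # keep empty clusters at their previous centroid
                centroids[i] = [sum(col) / len(g) for col in zip(*g)]
    return centroids

def anomaly_scores(points, centroids):
    """Score each point by its distance to the nearest cluster centre."""
    return [min(math.dist(p, c) for c in centroids) for p in points]

# Two tight clusters plus one outlier: the outlier gets the largest score.
pts = [(0.0, 0.0), (5.0, 5.0), (0.1, 0.0), (0.0, 0.1), (5.1, 5.0), (9.0, 0.0)]
scores = anomaly_scores(pts, lloyd_kmeans(pts, k=2))
print(scores.index(max(scores)))  # → 5, the index of (9.0, 0.0)
```

In a survey setting, points whose score exceeds some threshold become candidates for human or automated follow-up.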

  9. A model to forecast data centre infrastructure costs.

    NASA Astrophysics Data System (ADS)

    Vernet, R.

    2015-12-01

    The computing needs in the HEP community are increasing steadily, but the current funding situation in many countries is tight. As a consequence experiments, data centres, and funding agencies have to rationalize resource usage and expenditures. CC-IN2P3 (Lyon, France) provides computing resources to many experiments including LHC, and is a major partner for astroparticle projects like LSST, CTA or Euclid. The financial cost to accommodate all these experiments is substantial and has to be planned well in advance for funding and strategic reasons. In that perspective, leveraging infrastructure expenses, electric power cost and hardware performance observed in our site over the last years, we have built a model that integrates these data and provides estimates of the investments that would be required to cater to the experiments for the mid-term future. We present how our model is built and the expenditure forecast it produces, taking into account the experiment roadmaps. We also examine the resource growth predicted by our model over the next years assuming a flat-budget scenario.

  10. Recovering Galaxy Properties Using Gaussian Process SED Fitting

    NASA Astrophysics Data System (ADS)

    Iyer, Kartheik; Awan, Humna

    2018-01-01

    Information about physical quantities such as stellar masses, star formation rates, and ages of distant galaxies is contained in their spectral energy distributions (SEDs), obtained through photometric surveys like SDSS, CANDELS, and LSST. However, noise in the photometric observations is often a problem, and using naive machine learning methods to estimate physical quantities can result in overfitting the noise, or converging on solutions that lie outside the physical regime of parameter space. We use Gaussian Process regression trained on a sample of SEDs corresponding to galaxies from a semi-analytic model (Somerville+15a) to estimate their stellar masses, and compare its performance to a variety of different methods, including simple linear regression, Random Forests, and k-Nearest Neighbours. We find that the Gaussian Process method is robust to noise and predicts not only stellar masses but also their uncertainties. The method is also robust in cases where the distribution of the training data is not identical to that of the target data, which can be extremely useful when generalized to more subtle galaxy properties.

  11. The Planetary Archive

    NASA Astrophysics Data System (ADS)

    Penteado, Paulo F.; Trilling, David; Szalay, Alexander; Budavári, Tamás; Fuentes, César

    2014-11-01

    We are building the first system that will allow efficient data mining of the astronomical archives for observations of Solar System bodies. While the Virtual Observatory has enabled data-intensive research making use of large collections of observations across multiple archives, Planetary Science has largely been denied this opportunity: most astronomical data services are built around fixed sky positions, and moving objects are often filtered out. To identify serendipitous observations of Solar System objects, we ingest the archive metadata. The coverage of each image in an archive is a volume in a 3D space (RA, Dec, time), which we can represent efficiently through a hierarchical triangular mesh (HTM) for the spatial dimensions, plus a contiguous time interval. In this space, an asteroid occupies a curve, which we determine by integrating its orbit into the past. Thus, when an asteroid trajectory intercepts the volume of an archived image, we have a possible observation of that body. Our pipeline then looks in the archive's catalog for a source with the corresponding coordinates, to retrieve its photometry. All these matches are stored in a database, which can be queried by object identifier. This database consists of archived observations of known Solar System objects, which means that it grows not only through the ingestion of new images, but also through the growth in the number of known objects. As new bodies are discovered, our pipeline can find archived observations where they could have been recorded, providing colors for these newly found objects. This growth becomes more relevant with the new generation of wide-field surveys, particularly LSST. We also present one use case of our prototype archive: after ingesting the metadata for SDSS, 2MASS and GALEX, we were able to identify serendipitous observations of Solar System bodies in these three archives. Cross-matching these occurrences provided us with colors from the UV to the IR, a much wider spectral range than that commonly used for asteroid taxonomy. We present here archive-derived spectrophotometry from searching for 440 thousand asteroids, from 0.3 to 3 µm. In the future we will expand to other archives, including HST, Spitzer, WISE and Pan-STARRS.
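Ignoring the HTM indexing that makes the search scale, the core spatio-temporal match can be sketched as a brute-force test of ephemeris points against image footprints (circular footprints and a flat-sky distance are simplifying assumptions here, not the pipeline's actual geometry):

```python
def image_matches(trajectory, images, radius):
    """Return (time, image_id) pairs where an ephemeris point falls inside
    an image footprint during that image's exposure window.

    trajectory: list of (t, ra, dec) ephemeris points (degrees);
    images: list of (image_id, t_start, t_end, ra_center, dec_center)
    with a circular footprint of the given radius (flat-sky approximation).
    """
    hits = []
    for t, ra, dec in trajectory:
        for image_id, t0, t1, ra_c, dec_c in images:
            inside = (ra - ra_c) ** 2 + (dec - dec_c) ** 2 <= radius ** 2
            if t0 <= t <= t1 and inside:
                hits.append((t, image_id))
    return hits

traj = [(1.0, 10.0, 5.0), (2.0, 10.5, 5.2)]
imgs = [("img1", 0.5, 1.5, 10.1, 5.0), ("img2", 1.8, 2.2, 50.0, 5.0)]
print(image_matches(traj, imgs, radius=0.5))  # → [(1.0, 'img1')]
```

In the real system the spatial test is replaced by an HTM-trixel lookup, so each trajectory point probes only the few images whose mesh cells it touches.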

  12. Transformative Small Body Science Enabled with Pan-STARRS Survey Data

    NASA Astrophysics Data System (ADS)

    Meech, Karen J.; Kleyna, Jan T.; Keane, Jacqueline V.; Hainaut, Olivier R.; Micheli, Marco

    2018-01-01

    In the first 5 Myr of Solar System formation, gas imprinted a local chemical signature on the planetesimals, which were subsequently redistributed during planet formation. Decades-long ground- and space-based studies have tried to map our solar system’s protoplanetary disk chemistry using volatiles in comets. We now know that comet volatiles (H2O, CO, CO2 and organics) fall into distinct chemical classes. These data contradict traditional ideas that all volatile-rich bodies formed in the outer disk. In-situ comet space missions have suggested, however, that comets preserve their pristine volatile inventory, and perhaps even their heritage of ices predating the protoplanetary disk. Recently, a profusion of dynamical models has been developed that can reproduce some of the key characteristics of today’s solar system. Some models require significant giant planet migration, while others do not. The UH-led Pan-STARRS1 survey (PS1) can offer transformative insight into small bodies and the early solar system, providing a preview of LSST. In 2013 PS1 discovered an asteroidal object on a long-period comet orbit, the first of a class of tailless objects informally called Manxes. The second Manx discovered had a surface composition similar to inner solar system rocky S-type material, suggesting the intriguing possibility that we are looking at fresh inner solar system Earth-forming material, preserved for billions of years in the Oort cloud. Currently 10-15 of these objects are discovered each year, with PS1 dominating the discoveries. The number of rocky inner solar system Manx objects can be used to constrain solar system formation models. PS1 is also very good at discovering faint active objects at large distances, including the remarkable discovery of a comet active beyond 16 au from the Sun. By searching the PS1 database once these discoveries are made, it is possible to extend the orbit arc backwards in time, allowing us to model the activity and understand the chemistry and physics of ices and activity in the outer solar system. These discoveries will help us tie together chemistry and dynamics in our solar system with newly resolved ALMA observations of protoplanetary disks. Supported by NSF grants AST-1617015 and AST-1413736.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Hugh H.; Balasubramanian, V.; Bernstein, G.

    The University of Pennsylvania elementary particle physics/particle cosmology group, funded by the Department of Energy Office of Science, participates in research in high energy physics and particle cosmology that addresses some of the most important unanswered questions in science. The research is divided into five areas. Energy Frontier - We participate in the study of proton-proton collisions at the Large Hadron Collider in Geneva, Switzerland using the ATLAS detector. The University of Pennsylvania group was responsible for the design, installation, and commissioning of the front-end electronics for the Transition Radiation Tracker (TRT) and plays the primary role in its maintenance and operation. We play an important role in the triggering of ATLAS, and we have made large contributions to the TRT performance and to the study and identification of electrons, photons, and taus. We have been actively involved in searches for the Higgs boson and for SUSY and other exotic particles. We have made significant contributions to the measurement of Standard Model processes such as inclusive photon production and WW pair production. We have also participated significantly in R&D for upgrades to the ATLAS detector. Cosmic Frontier - The Dark Energy Survey (DES) telescope will be used to elucidate the nature of dark energy and the distribution of dark matter. Penn has played a leading role both in the use of weak gravitational lensing of distant galaxies and in the discovery of large numbers of distant supernovae. The techniques and forecasts developed at Penn are also guiding the development of the proposed Large Synoptic Survey Telescope (LSST). We are also developing a new detector, MiniClean, to search for direct detection of dark matter particles. Intensity Frontier - We are participating in the design and R&D of detectors for the Long Baseline Neutrino Experiment (now DUNE), a new experiment to study the properties of neutrinos. Advanced Technology R&D - We have extensive involvement in the electronics required for sophisticated new detectors at the LHC and are developing electronics for the LSST camera. Theoretical Physics - We are carrying out a broad program studying the fundamental forces of nature, early universe cosmology, and mathematical physics. Our activities span the range from model building, formal field theory, and string theory to new paradigms for cosmology and the interface of string theory with mathematics. Our effort combines extensive development of the formal aspects of string theory with a focus on real phenomena in particle physics, cosmology, and gravity.

  14. Eight new luminous z ≥ 6 quasars discovered via SED model fitting of VISTA, WISE and Dark Energy Survey Year 1 observations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reed, S. L.; McMahon, R. G.; Martini, P.

    Here, we present the discovery and spectroscopic confirmation, with the European Southern Observatory New Technology Telescope (NTT) and Gemini South telescopes, of eight new, and the rediscovery of two previously known, 6.0 < z < 6.5 quasars with z_AB < 21.0. These quasars were photometrically selected, without any morphological criteria, from 1533 deg² using spectral energy distribution (SED) model fitting to photometric data from the Dark Energy Survey (g, r, i, z, Y), VISTA Hemisphere Survey (J, H, K) and Wide-field Infrared Survey Explorer (W1, W2). The photometric data were fitted with a grid of quasar model SEDs with redshift-dependent Lyα forest absorption and a range of intrinsic reddening, as well as a series of low-mass cool star models. Candidates were ranked using an SED-model-based χ²-statistic, which is extendable to other future imaging surveys (e.g. LSST and Euclid). Our spectral confirmation success rate is 100 per cent without the need for the follow-up photometric observations used in other studies of this type. Combined with automatic removal of the main types of non-astrophysical contaminants, the method allows large data sets to be processed without human intervention and without being overrun by spurious false candidates. We also present a robust parametric redshift estimator that gives comparable accuracy to Mg ii and CO-based redshift estimators. We find two z ~ 6.2 quasars with H ii near-zone sizes ≤3 proper Mpc, which could indicate that these quasars are young, with ages ≲ 10^6-10^7 years, or lie in overdense regions of the IGM. The z = 6.5 quasar VDES J0224–4711 has J_AB = 19.75 and is the second most luminous quasar known with z ≥ 6.5.

  15. Eight new luminous z ≥ 6 quasars discovered via SED model fitting of VISTA, WISE and Dark Energy Survey Year 1 observations

    DOE PAGES

    Reed, S. L.; McMahon, R. G.; Martini, P.; ...

    2017-03-24

    Here, we present the discovery and spectroscopic confirmation, with the European Southern Observatory New Technology Telescope (NTT) and Gemini South telescopes, of eight new, and the rediscovery of two previously known, 6.0 < z < 6.5 quasars with z_AB < 21.0. These quasars were photometrically selected, without any morphological criteria, from 1533 deg² using spectral energy distribution (SED) model fitting to photometric data from the Dark Energy Survey (g, r, i, z, Y), VISTA Hemisphere Survey (J, H, K) and Wide-field Infrared Survey Explorer (W1, W2). The photometric data were fitted with a grid of quasar model SEDs with redshift-dependent Lyα forest absorption and a range of intrinsic reddening, as well as a series of low-mass cool star models. Candidates were ranked using an SED-model-based χ²-statistic, which is extendable to other future imaging surveys (e.g. LSST and Euclid). Our spectral confirmation success rate is 100 per cent without the need for the follow-up photometric observations used in other studies of this type. Combined with automatic removal of the main types of non-astrophysical contaminants, the method allows large data sets to be processed without human intervention and without being overrun by spurious false candidates. We also present a robust parametric redshift estimator that gives comparable accuracy to Mg ii and CO-based redshift estimators. We find two z ~ 6.2 quasars with H ii near-zone sizes ≤3 proper Mpc, which could indicate that these quasars are young, with ages ≲ 10^6-10^7 years, or lie in overdense regions of the IGM. The z = 6.5 quasar VDES J0224–4711 has J_AB = 19.75 and is the second most luminous quasar known with z ≥ 6.5.

  16. What does it mean to manage sky survey data? A model to facilitate stakeholder conversations

    NASA Astrophysics Data System (ADS)

    Sands, Ashley E.; Darch, Peter T.

    2016-06-01

    Astronomy sky surveys, while of great scientific value independently, can be deployed even more effectively when multiple sources of data are combined. Integrating discrete datasets is a non-trivial exercise despite investments in standard data formats and tools. Creating and maintaining data and associated infrastructures requires investments in technology and expertise. Combining data from multiple sources necessitates a common understanding of data, structures, and goals amongst relevant stakeholders. We present a model of Astronomy Stakeholder Perspectives on Data. The model is based on 80 semi-structured interviews with astronomers, computational astronomers, computer scientists, and others involved in the building or use of the Sloan Digital Sky Survey (SDSS) and Large Synoptic Survey Telescope (LSST). Interviewees were selected to ensure a range of roles, institutional affiliations, career stages, and levels of astronomy education. Interviewee explanations of data were analyzed to understand how perspectives on astronomy data varied by stakeholder. Interviewees described sky survey data either intrinsically or extrinsically. “Intrinsic” descriptions of data refer to data as an object in and of itself. Respondents with intrinsic perspectives view data management in one of three ways: (1) “Medium” - securing the zeros and ones from bit rot; (2) “Scale” - assuring that changes in state are documented; or (3) “Content” - ensuring the scientific validity of the images, spectra, and catalogs. “Extrinsic” definitions, in contrast, define data in relation to other forms of information. Respondents with extrinsic perspectives view data management in one of three ways: (1) “Source” - supporting the integrity of the instruments and documentation; (2) “Relationship” - retaining relationships between data and their analytical byproducts; or (3) “Use” - ensuring that data remain scientifically usable. This model shows how data management can mean different things to different stakeholders at different times. The model is valuable to those who build and maintain infrastructures because it can be used as a tool to facilitate recognition, understanding, and thus communication between relevant astronomy data stakeholders.

  17. Facilitating Science Discoveries from NED Today and in the 2020s

    NASA Astrophysics Data System (ADS)

    Mazzarella, Joseph M.; NED Team

    2018-06-01

    I will review recent developments, work in progress, and major challenges that lie ahead as we enhance the capabilities of the NASA/IPAC Extragalactic Database (NED) to facilitate and accelerate multi-wavelength research on objects beyond our Milky Way galaxy. The recent fusion of data for over 470 million sources from the 2MASS Point Source Catalog and approximately 750 million sources from the AllWISE Source Catalog (next up) with redshifts from the SDSS and other data in NED is increasing the holdings to over a billion distinct objects with cross-identifications, providing a rich resource for multi-wavelength research. Combining data across such large surveys, as well as integrating data from over 110,000 smaller but scientifically important catalogs and journal articles, presents many challenges, including the need to update the computing infrastructure and re-tool production and operations on a regular basis. Integration of the Firefly toolkit into the new user interface is ushering in a new phase of interactive data visualization in NED, with features and capabilities familiar to users of IRSA and the emerging LSST science user interface. Graphical characterizations of NED content and estimates of completeness in different sky and spectral regions are also being developed. A newly implemented service that follows the Table Access Protocol (TAP) enables astronomers to issue queries to the NED object directory using the Astronomical Data Query Language (ADQL), a standard shared in common with the NASA mission archives and other virtual observatories around the world. A brief review will be given of new science capabilities under development and planned for 2019-2020, as well as initiatives underway involving deployment of a parallel database, cloud technologies, machine learning, and first steps in bringing analysis capabilities close to the database in collaboration with IRSA. 
I will close with some questions for the community to consider in helping us plan future science capabilities and directions for NED in the 2020s.
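    The TAP service described above accepts ADQL queries over HTTP, following the IVOA Table Access Protocol. Below is a minimal sketch of how a synchronous query against the NED object directory could be assembled; the endpoint path and the table name `objdir` are assumptions for illustration, not details taken from the abstract.

```python
from urllib.parse import urlencode

def build_tap_sync_request(base_url, adql):
    """Build the URL and form body for a synchronous TAP query.

    Per the IVOA Table Access Protocol, a doQuery request with
    LANG=ADQL is submitted to the service's /sync endpoint.
    """
    params = {
        "REQUEST": "doQuery",
        "LANG": "ADQL",
        "FORMAT": "votable",
        "QUERY": adql,
    }
    return base_url.rstrip("/") + "/sync", urlencode(params)

# Illustrative cone search (table and column names are assumptions)
adql = ("SELECT TOP 10 prefname, ra, dec, z FROM objdir "
        "WHERE CONTAINS(POINT('ICRS', ra, dec), "
        "CIRCLE('ICRS', 204.25, -29.87, 0.1)) = 1")
url, body = build_tap_sync_request("https://ned.ipac.caltech.edu/tap", adql)
print(url)
```

In practice the same request can be issued through a generic TAP client rather than hand-built HTTP parameters; the point here is only the shape of the protocol.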

  18. Eight new luminous z ≥ 6 quasars discovered via SED model fitting of VISTA, WISE and Dark Energy Survey Year 1 observations

    NASA Astrophysics Data System (ADS)

    Reed, S. L.; McMahon, R. G.; Martini, P.; Banerji, M.; Auger, M.; Hewett, P. C.; Koposov, S. E.; Gibbons, S. L. J.; Gonzalez-Solares, E.; Ostrovski, F.; Tie, S. S.; Abdalla, F. B.; Allam, S.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; da Costa, L. N.; DePoy, D. L.; Desai, S.; Diehl, H. T.; Doel, P.; Evrard, A. E.; Finley, D. A.; Flaugher, B.; Fosalba, P.; Frieman, J.; García-Bellido, J.; Gaztanaga, E.; Goldstein, D. A.; Gruen, D.; Gruendl, R. A.; Gutierrez, G.; James, D. J.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Lima, M.; Maia, M. A. G.; Marshall, J. L.; Melchior, P.; Miller, C. J.; Miquel, R.; Nord, B.; Ogando, R.; Plazas, A. A.; Romer, A. K.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, R. C.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Tucker, D. L.; Walker, A. R.; Wester, W.

    2017-07-01

    We present the discovery and spectroscopic confirmation with the European Southern Observatory New Technology Telescope (NTT) and Gemini South telescopes of eight new, and the rediscovery of two previously known, 6.0 < z < 6.5 quasars with zAB < 21.0. These quasars were photometrically selected without any morphological criteria from 1533 deg2 using spectral energy distribution (SED) model fitting to photometric data from the Dark Energy Survey (g, r, i, z, Y), VISTA Hemisphere Survey (J, H, K) and Wide-field Infrared Survey Explorer (W1, W2). The photometric data were fitted with a grid of quasar model SEDs with redshift-dependent Ly α forest absorption and a range of intrinsic reddening, as well as a series of low-mass cool star models. Candidates were ranked using an SED-model-based χ2-statistic, which is extendable to other future imaging surveys (e.g. LSST and Euclid). Our spectral confirmation success rate is 100 per cent without the need for follow-up photometric observations as used in other studies of this type. Combined with automatic removal of the main types of non-astrophysical contaminants, the method allows large data sets to be processed without human intervention and without being overrun by spurious false candidates. We also present a robust parametric redshift estimator that gives comparable accuracy to Mg II and CO-based redshift estimators. We find two z ˜ 6.2 quasars with H II near-zone sizes ≤3 proper Mpc, which could indicate that these quasars are young, with ages ≲ 10^6-10^7 years, or lie in overdense regions of the IGM. The z = 6.5 quasar VDES J0224-4711 has JAB = 19.75 and is the second most luminous quasar known with z ≥ 6.5.
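    The χ2 ranking described above can be sketched as follows: for each model SED the overall flux amplitude has a closed-form best fit, A = Σ(f·m/σ²)/Σ(m²/σ²), and a candidate is ranked by the difference between its best star-model and best quasar-model χ2. The band set, model grids, and fluxes below are invented for illustration and are not the paper's actual models.

```python
import numpy as np

def chi2_best_amplitude(flux, err, model):
    """Chi-square of a model SED fitted to photometry, with the overall
    amplitude A chosen analytically to minimise chi^2."""
    w = 1.0 / err**2
    A = np.sum(flux * model * w) / np.sum(model**2 * w)
    return np.sum(w * (flux - A * model) ** 2)

def classify(flux, err, quasar_grid, star_grid):
    """Positive margin: quasar models fit better than cool-star models."""
    chi2_q = min(chi2_best_amplitude(flux, err, m) for m in quasar_grid)
    chi2_s = min(chi2_best_amplitude(flux, err, m) for m in star_grid)
    return chi2_s - chi2_q

# Toy "dropout" SED: no flux blueward of Ly-alpha, vs a smooth cool star
quasar_grid = [np.array([0.0, 1.0, 0.9, 0.8, 0.7])]
star_grid = [np.array([0.3, 0.5, 0.8, 1.0, 1.1])]
obs = np.array([0.01, 0.52, 0.47, 0.40, 0.35])   # resembles the quasar shape
err = np.full(5, 0.05)
margin = classify(obs, err, quasar_grid, star_grid)
print(margin)
```

A real pipeline would marginalise over redshift and reddening grids; here a single template per class suffices to show the ranking statistic.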

  19. New Surveys of the Universe with the Jansky Very Large Array (VLA) and the Very Long Baseline Array (VLBA)

    NASA Astrophysics Data System (ADS)

    Myers, Steven T.

    2013-01-01

    The Jansky Very Large Array is a recently completed upgrade to the VLA that has significantly expanded its capabilities through replacement of the receivers, electronics, signal paths, and correlator with cutting-edge technology. This enhancement provides significantly increased continuum sensitivity and spectral survey speeds (by factors of 100 or more in select cases) from 1-50 GHz and in key bands below 1 GHz. Concurrently, we are greatly enhancing the sensitivity of the Very Long Baseline Array. A suite of ever more ambitious radio sky survey programs undertaken with these new instruments address science goals central to answering the questions posed by Astro2010, and will undoubtedly incite new inquiries. The science themes of the Jansky VLA and the VLBA are: illuminating the obscured, probing the magnetic, sounding the transient, and charting the evolving Universe. New observations will allow us to image young stars and massive black holes in dust-enshrouded environments, measure the strength and topology of the cosmic magnetic field, follow the rapid evolution of energetic phenomena, and study the formation and evolution of stars, galaxies, AGN, and the Universe itself. We can follow the evolution of gas, galaxies, particles, and fields through cosmic time to bridge the eras from cosmic dawn to the dawn of new worlds. I will describe the salient features of the Jansky VLA and the VLBA for cosmological survey work, and summarize their multi-wavelength synergies with ALMA and next-generation optical, infrared, X-ray and gamma-ray telescopes. Example data from Jansky VLA and upgraded VLBA commissioning tests and early science will illustrate these features. I also describe the evolution of the VLA and VLBA and their capabilities for future surveys leading into the next decade, the era of the LSST and the SKA.

  20. Ultracool Dwarfs and their companions

    NASA Astrophysics Data System (ADS)

    Blake, Cullen H.

    This thesis explores new techniques for making precise measurements of low-mass stars and brown dwarfs, collectively known as Ultracool Dwarfs (UCDs). These new techniques are directly applicable to the search for extrasolar planets and efforts to test theoretical models of stellar structure and evolution at the bottom of the main sequence. The first three chapters of this thesis describe the development and application of a new technique for making radial velocity measurements of UCDs at near infrared (NIR) wavelengths. The first chapter describes a pilot study that demonstrates a significant improvement over previous work on Doppler measurements in the NIR. Using this technique we have carried out a Doppler survey of 65 L dwarfs. The second chapter describes the discovery of a new spectroscopic binary that may be one of the most important for constraining theoretical models of UCDs. The third chapter describes the Doppler survey in detail and presents measurements of a new spectroscopic binary system that is an excellent candidate for a giant planetary companion to a mid-L dwarf. This chapter also includes a discussion of the rotation, space motions, and binarity of the L dwarfs in the survey sample. The fourth chapter describes efforts to obtain precise photometric measurements of UCDs with the Peters Automated Infrared Imaging Telescope (PAIRITEL). Using software scheduling and data reduction systems designed in part by the author, PAIRITEL gathered more than 10^6 seconds of observations of a sample of 20 UCDs. We investigate the limitations to ground-based infrared photometry and characterize the ability of a system like PAIRITEL to detect transits of UCDs by Earth-like planets. The fifth chapter explores the potential impact of future synoptic surveys on studies of UCDs. Surveys like Pan-STARRS and LSST will obtain a small number of high-quality observations of a large number of UCDs. 
    Using data from the Sloan Digital Sky Survey, we demonstrate that such data can be used to reliably detect low-mass eclipsing binary stars. We present the discovery of a double-lined eclipsing binary system that allows us to directly measure the masses and radii of two M dwarfs.
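    One simple way eclipses can be flagged in the sparse, high-quality photometry described above is to look for epochs significantly fainter than a star's median magnitude. This is a toy sketch, not the thesis's method; the threshold, cadence, and synthetic light curves are illustrative assumptions.

```python
import numpy as np

def eclipse_candidate(mags, errs, n_sigma=5.0, min_points=2):
    """Flag a sparse light curve as an eclipsing-binary candidate if at
    least `min_points` epochs are fainter than the median by n_sigma
    times the photometric error (fainter = larger magnitude)."""
    med = np.median(mags)
    faint = (mags - med) > n_sigma * errs
    return int(np.sum(faint)) >= min_points

rng = np.random.default_rng(0)
n = 60                        # a "small number of high-quality observations"
errs = np.full(n, 0.01)       # 10 mmag photometry
quiet = 15.0 + rng.normal(0.0, 0.01, n)
eclipsing = quiet.copy()
eclipsing[[10, 37]] += 0.3    # two in-eclipse epochs, 0.3 mag deep
print(eclipse_candidate(quiet, errs), eclipse_candidate(eclipsing, errs))
```

Requiring multiple faint epochs guards against a single bad measurement masquerading as an eclipse.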

  1. The Smallest Galaxies in the Universe: Investigating the Origins of Ultra-faint Galaxies

    NASA Astrophysics Data System (ADS)

    Qi, Yuewen; Graus, Andrew; Bullock, James

    2018-01-01

    One outstanding question in cosmology is: what are the smallest galaxies that can form? The answer can tell us much about galaxy formation, and even about the properties of dark matter itself. A candidate for the smallest galaxies that can form are the ultrafaint galaxies. The star formation of ultrafaints appears to have been shut off during the epoch of reionization, when radiation from the first stars ionized all the free hydrogen in the universe. This would imply that ultrafaints should exist everywhere in the universe. However, due to their low brightness, we can currently observe ultrafaints only as satellites of the Milky Way. This will change with the next generation of telescopes such as the Large Synoptic Survey Telescope (LSST). The focus of this work is to predict the number of ultrafaints that should be seen with future surveys. To that end, we use the ELVIS suite, which contains 14 dark-matter-only simulations of Local Group-like systems, each containing a Milky Way-like and an Andromeda-like galaxy and the substructure out to around 1 Mpc of the barycenter. We mock observe the simulations to mimic current surveys such as the Sloan Digital Sky Survey (SDSS) and the Dark Energy Survey (DES), and use the population of galaxies found by those surveys to project the population of dwarf galaxies out beyond the virial radius of either galaxy. This number depends sensitively on the formation mechanism of ultrafaint dwarfs, and comparisons of future surveys to this work could help rule out certain formation scenarios.
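    The mock-observation step described above can be caricatured with a magnitude-limited cut. Real dwarf-galaxy detectability depends on resolved star counts and surface brightness, so the integrated-magnitude thresholds and survey limits below are toy assumptions only, meant to show why a deeper survey sees more of the same simulated population.

```python
import numpy as np

def detectable(M_V, dist_kpc, maglim):
    """A satellite is 'mock observed' if its apparent magnitude
    m = M_V + 5 log10(d / 10 pc) is brighter than the survey limit."""
    m = M_V + 5.0 * np.log10(dist_kpc * 1000.0 / 10.0)
    return m < maglim

rng = np.random.default_rng(1)
n = 500
M_V = rng.uniform(-8.0, -1.0, n)      # ultra-faint luminosity range (toy)
dist = rng.uniform(20.0, 1000.0, n)   # kpc, out to ~1 Mpc of the barycenter

sdss_like = detectable(M_V, dist, maglim=17.0)   # illustrative limits only
lsst_like = detectable(M_V, dist, maglim=21.0)
print(int(sdss_like.sum()), int(lsst_like.sum()))
```

Because the deeper limit strictly contains the shallower one, every "SDSS-detected" satellite is also "LSST-detected", and the deeper survey's count sets an upper envelope on the projection.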

  2. Using "Big Data" in a Classroom Setting for Student-Developed Projects

    NASA Astrophysics Data System (ADS)

    Hayes-Gehrke, Melissa; Vogel, Stuart N.

    2018-01-01

    The advances in exploration of the optical transient sky anticipated with major facilities such as the Zwicky Transient Facility (ZTF) and Large Synoptic Survey Telescope (LSST) provide an opportunity to integrate large public research datasets into the undergraduate classroom. As a step in this direction, the NSF PIRE-funded GROWTH (Global Relay of Observatories Watching Transients Happen) collaboration provided funding for curriculum development using data from the precursor to ZTF, the Intermediate Palomar Transient Factory (iPTF). One of the iPTF portals, the PTF Variable Marshal, was used by 56 Astronomy majors in the fall 2016 and 2017 semesters of the required Observational Astronomy course at the University of Maryland. Student teams learned about the iPTF survey and how to use the PTF Variable Marshal and then developed their own hypotheses about variable stars to test using data they gathered from the Variable Marshal. Through this project, students gained experience in how to develop scientific questions that can be explored using large datasets and became aware of the limitations and difficulties of such projects. This work was supported in part by NSF award OISE-1545949.

  3. Astrometric surveys in the Gaia era

    NASA Astrophysics Data System (ADS)

    Zacharias, Norbert

    2018-04-01

    The Gaia first data release (DR1) already provides an almost error-free optical reference frame at the milli-arcsecond (mas) level, allowing significantly better calibration of ground-based astrometric data than ever before. Gaia DR1 provides positions, proper motions and trigonometric parallaxes for just over 2 million stars in the Tycho-2 catalog. For over 1.1 billion additional stars, DR1 gives positions. Proper motions for these mainly fainter stars (G >= 11.5) are currently provided by several new projects that combine earlier-epoch ground-based observations with Gaia DR1 positions. These data are very helpful in the interim period but will become obsolete with the second Gaia data release (DR2), expected in April 2018. The era of traditional, ground-based, wide-field astrometry with the goal of providing accurate reference stars has come to an end. Future ground-based astrometry will fill in some gaps (very bright stars, observations needed at many or specific epochs) and mainly will go fainter than the Gaia limit, like the Pan-STARRS and upcoming LSST surveys.

  4. Wide-Field InfraRed Survey Telescope WFIRST

    NASA Technical Reports Server (NTRS)

    Green, J.; Schechter, P.; Baltay, C.; Bean, R.; Bennett, D.; Brown, R.; Conselice, C.; Donahue, M.; Fan, X.; Rauscher, B.; hide

    2012-01-01

    In December 2010, NASA created a Science Definition Team (SDT) for WFIRST, the Wide Field Infra-Red Survey Telescope, recommended by the Astro 2010 Decadal Survey as the highest priority for a large space mission. The SDT was chartered to work with the WFIRST Project Office at GSFC and the Program Office at JPL to produce a Design Reference Mission (DRM) for WFIRST. Part of the original charge was to produce an interim design reference mission by mid-2011. That document was delivered to NASA and widely circulated within the astronomical community. In late 2011 the Astrophysics Division augmented its original charge, asking for two design reference missions. The first of these, DRM1, was to be a finalized version of the interim DRM, reducing overall mission costs where possible. The second of these, DRM2, was to identify and eliminate capabilities that overlapped with those of NASA's James Webb Space Telescope (henceforth JWST), ESA's Euclid mission, and the NSF's ground-based Large Synoptic Survey Telescope (henceforth LSST), and again to reduce overall mission cost, while staying faithful to NWNH. This report presents both DRM1 and DRM2.

  5. Liverpool Telescope 2: beginning the design phase

    NASA Astrophysics Data System (ADS)

    Copperwheat, Christopher M.; Steele, Iain A.; Barnsley, Robert M.; Bates, Stuart D.; Bode, Mike F.; Clay, Neil R.; Collins, Chris A.; Jermak, Helen E.; Knapen, Johan H.; Marchant, Jon M.; Mottram, Chris J.; Piascik, Andrzej S.; Smith, Robert J.

    2016-07-01

    The Liverpool Telescope is a fully robotic 2-metre telescope located at the Observatorio del Roque de los Muchachos on the Canary Island of La Palma. The telescope began routine science operations in 2004, and currently seven simultaneously mounted instruments support a broad science programme, with a focus on transient followup and other time domain topics well suited to the characteristics of robotic observing. Work has begun on a successor facility with the working title `Liverpool Telescope 2'. We are entering a new era of time domain astronomy with new discovery facilities across the electromagnetic spectrum, and the next generation of optical survey facilities such as LSST are set to revolutionise the field of transient science in particular. The fully robotic Liverpool Telescope 2 will have a 4-metre aperture and an improved response time, and will be designed to meet the challenges of this new era. Following a conceptual design phase, we are about to begin the detailed design which will lead towards the start of construction in 2018, for first light ˜2022. In this paper we provide an overview of the facility and an update on progress.

  6. Supernova Cosmology in the Big Data Era

    NASA Astrophysics Data System (ADS)

    Kessler, Richard

    Here we describe large "Big Data" Supernova (SN) Ia surveys, past and present, used to make precision measurements of the cosmological parameters that describe the expansion history of the universe. In particular, we focus on surveys designed to measure the dark energy equation-of-state parameter w and its dependence on cosmic time. These large surveys have at least four photometric bands, and they use a rolling search strategy in which the same instrument is used for both discovery and photometric follow-up observations. These surveys include the Supernova Legacy Survey (SNLS), Sloan Digital Sky Survey II (SDSS-II), Pan-STARRS 1 (PS1), Dark Energy Survey (DES), and Large Synoptic Survey Telescope (LSST). We discuss how the evaluation of systematic uncertainties has developed, and how methods to reduce them play a major role in designing new surveys. The key systematic effects that we discuss are (1) calibration, measuring the telescope efficiency in each filter band; (2) biases from a magnitude-limited survey and from the analysis; and (3) photometric SN classification, for current surveys that don't have enough resources to spectroscopically confirm each SN candidate.
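    The magnitude-limit bias in point (2) — Malmquist-type selection bias — can be demonstrated with a short toy simulation: standard candles with scatter, observed through a magnitude cut, appear brighter on average than they truly are. All numbers below are illustrative, not any survey's actual analysis.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
M_true = -19.3                 # standard-candle absolute magnitude (typical SN Ia)
scatter = 0.15                 # intrinsic + measurement scatter, mag

# Toy uniform-in-volume distances out to 1000 Mpc
d = 1000.0 * rng.uniform(0.0, 1.0, n) ** (1.0 / 3.0)
mu = 5.0 * np.log10(d * 1e6 / 10.0)          # distance modulus
m_obs = M_true + mu + rng.normal(0.0, scatter, n)

maglim = 20.0                  # magnitude-limited survey cut
sel = m_obs < maglim
M_inferred = m_obs[sel] - mu[sel]            # absolute mags of the selected sample
bias = M_inferred.mean() - M_true
print(f"selected {int(sel.sum())} of {n}; mean bias {bias:+.3f} mag")
```

Near the survey limit only up-scattered (bright) objects survive the cut, so the selected sample's mean inferred luminosity is biased bright (negative magnitude bias), which propagates into distance and w if uncorrected.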

  7. Prompt emission from the counter jet of a short gamma-ray burst

    NASA Astrophysics Data System (ADS)

    Yamazaki, Ryo; Ioka, Kunihito; Nakamura, Takashi

    2018-03-01

    The counter jet of a short gamma-ray burst (sGRB) has not yet been observed, while recent discoveries of gravitational waves (GWs) from the binary neutron star merger GW170817 and the associated sGRB 170817A have demonstrated that off-axis sGRB jets are detectable. We calculate the prompt emission from the counter jet of an sGRB and show that it is typically 23-26 mag in the optical-infrared band 10-10^3 s after the GWs for an sGRB 170817A-like event, which is brighter than the early macronova (or kilonova) emission and detectable by LSST in the near future. We also propose a new method to constrain the unknown jet properties, such as the Lorentz factor, opening angle, emission radii, and jet launch time, by observing both the forward and counter jets. To scrutinize counter jets, space-based GW detectors like DECIGO will be powerful: they can forecast the merger time (to ≲ 1 s) and position (to ≲ 1 arcmin) about a week before the merger.

  8. The Zwicky Transient Facility: Overview and Commissioning Activities

    NASA Astrophysics Data System (ADS)

    Graham, Matthew; Zwicky Transient Facility (ZTF) Project Team

    2018-01-01

    The Zwicky Transient Facility (ZTF) is the first of a new generation of LSST-scale sky surveys to be realized. It will employ a 47 square degree field-of-view camera mounted on the Samuel Oschin 48-inch Schmidt telescope at Palomar Observatory to scan more than 3750 square degrees an hour to a depth of 20.5 – 21 mag. This will lead to unprecedented discovery rates for transients – a young supernova less than 24 hours after its explosion each night as well as rarer and more exotic sources. Repeated imaging of the Northern sky (including the Galactic Plane) will produce a photometric variability catalog with nearly 300 observations each year, ideal for studies of variable stars, binaries, AGN, and asteroids. ZTF represents a significant increase in scale relative to previous surveys in terms of both data volume and data complexity. It will be the first survey to produce one million alerts a night and the first to have a trillion-row data archive. We will present an overview of the survey and its challenges and describe recent commissioning activities.

  9. Priming the search for cosmic superstrings using GADGET2 simulations

    NASA Astrophysics Data System (ADS)

    Cousins, Bryce; Jia, Hewei; Braverman, William; Chernoff, David

    2018-01-01

    String theory is an extensive mathematical framework which, despite its broad explanatory power, still lacks empirical support. Cosmology may change this: according to string theory, “cosmic superstrings” were stretched to cosmic scales in the early Universe and may now be detectable via microlensing or gravitational radiation. Negative results from prior surveys have placed some limits on superstring properties, so to investigate the parameter space more effectively we ask: where should we expect to find cosmic superstrings, and how many should we predict? This research investigates these questions by simulating cosmic string behavior during structure formation in the universe using GADGET2. The sizes and locations of superstring clusters are assessed using kernel density estimation and radial correlation functions. So far, only preliminary small-scale simulations have been performed, producing superstring clustering with low sensitivity. Future simulations of greater magnitude will offer far higher resolution, allowing us to track superstring behavior within structures more precisely. Such results will guide future searches, most imminently those made possible by LSST and WFIRST.
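    Kernel density estimation of the kind mentioned above can be sketched with SciPy's Gaussian KDE on toy 3D positions. The cluster geometry below is invented for illustration; a real analysis would run on GADGET2 snapshot coordinates.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(3)
# Toy "snapshot": points clustered around two halos plus a diffuse field
halo_a = rng.normal([10.0, 10.0, 10.0], 1.0, (300, 3))
halo_b = rng.normal([40.0, 35.0, 20.0], 1.5, (200, 3))
field = rng.uniform(0.0, 50.0, (100, 3))
pos = np.vstack([halo_a, halo_b, field])

kde = gaussian_kde(pos.T)     # gaussian_kde expects shape (ndim, npoints)
# Density at a halo centre vs an empty corner of the box
rho_halo = kde([[10.0], [10.0], [10.0]])[0]
rho_empty = kde([[49.0], [2.0], [49.0]])[0]
print(rho_halo / rho_empty)
```

The density contrast between cluster centres and the field is what a radial correlation function would then quantify as a function of separation.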

  10. SKA weak lensing - I. Cosmological forecasts and the power of radio-optical cross-correlations

    NASA Astrophysics Data System (ADS)

    Harrison, Ian; Camera, Stefano; Zuntz, Joe; Brown, Michael L.

    2016-12-01

    We construct forecasts for cosmological parameter constraints from weak gravitational lensing surveys involving the Square Kilometre Array (SKA). Considering matter content, dark energy and modified gravity parameters, we show that the first phase of the SKA (SKA1) can be competitive with other Stage III experiments such as the Dark Energy Survey, and that the full SKA (SKA2) can potentially deliver tighter constraints than Stage IV optical weak lensing experiments, such as those that will be conducted with LSST, WFIRST-AFTA or Euclid-like facilities. Using weak lensing alone, going from SKA1 to SKA2 represents improvements by factors of ˜10 in matter, ˜10 in dark energy and ˜5 in modified gravity parameters. We also show, for the first time, the powerful result that comparably tight constraints (within ˜5 per cent) for both Stage III and Stage IV experiments can be gained from cross-correlating shear maps between the optical and radio wavebands, a process which can also eliminate a number of potential sources of systematic error that might otherwise limit the utility of weak lensing cosmology.

  11. Science capabilities of the Maunakea Spectroscopic Explorer

    NASA Astrophysics Data System (ADS)

    Devost, Daniel; McConnachie, Alan; Flagey, Nicolas; Cote, Patrick; Balogh, Michael; Driver, Simon P.; Venn, Kim

    2017-01-01

    The Maunakea Spectroscopic Explorer (MSE) project will transform the CFHT 3.6m optical telescope into a 10m class dedicated multiobject spectroscopic facility, with an ability to simultaneously measure thousands of objects with a spectral resolution range spanning 2,000 to 20,000. The project is currently in design phase, with full science operations nominally starting in 2025. MSE will enable transformational science in areas as diverse as exoplanetary host characterization; stellar monitoring campaigns; tomographic mapping of the interstellar and intergalactic media; the in-situ chemical tagging of the distant Galaxy; connecting galaxies to the large scale structure of the Universe; measuring the mass functions of cold dark matter sub-halos in galaxy and cluster-scale hosts; reverberation mapping of supermassive black holes in quasars. MSE is an essential follow-up facility to current and next generations of multi-wavelength imaging surveys, including LSST, Gaia, Euclid, eROSITA, SKA, and WFIRST, and is an ideal feeder facility for E-ELT, TMT and GMT. I will give an update on the status of the project and review some of the most exciting scientific capabilities of the observatory.

  12. How the cosmic web induces intrinsic alignments of galaxies

    NASA Astrophysics Data System (ADS)

    Codis, S.; Dubois, Y.; Pichon, C.; Devriendt, J.; Slyz, A.

    2016-10-01

    Intrinsic alignments are believed to be a major source of systematics for future generations of weak gravitational lensing surveys like Euclid or LSST. Direct measurements of the alignment of the projected light distribution of galaxies in wide-field imaging data seem to agree on a contamination at the level of a few per cent of the shear correlation functions, although the amplitude of the effect depends on the population of galaxies considered. Given this dependency, it is difficult to use dark matter-only simulations as the sole resource to predict and control intrinsic alignments. We report here estimates of the level of intrinsic alignment in the cosmological hydrodynamical simulation Horizon-AGN that could be a major source of systematic errors in weak gravitational lensing measurements. In particular, assuming that the spin of galaxies is a good proxy for their ellipticity, we show how those spins are spatially correlated and how they couple to the tidal field in which they are embedded. We will also present theoretical calculations that illustrate and qualitatively explain the observed signals.

  13. Making The Most Of Flaring M Dwarfs

    NASA Astrophysics Data System (ADS)

    Hunt-Walker, Nicholas; Hilton, E.; Kowalski, A.; Hawley, S.; Matthews, J.; Holtzman, J.

    2011-01-01

    We present observations of flare activity using the Microvariability and Oscillations of Stars (MOST) satellite in conjunction with simultaneous spectroscopic and photometric observations from the ARC 3.5-meter, NMSU 1.0-meter, and ARCSAT 0.5-meter telescopes at the Apache Point Observatory. The MOST observations enable unprecedented completeness with regard to observing frequent, low-energy flares on the well-known dMe flare star AD Leo with broadband photometry. The observations span approximately one week with a 60-second cadence and are sensitive to flares as small as 0.01-magnitudes. The time-resolved, ground-based spectroscopy gives measurements of Hα and other important chromospheric emission lines, whereas the Johnson U-, SDSS u-, and SDSS g-band photometry provide color information during the flare events and allow us to relate the MOST observations to decades of previous broadband observations. Understanding the rates and energetics of flare events on M dwarfs will help characterize this source of variability in large time-domain surveys such as LSST and Pan-STARRS. Flare rates are also of interest to astrobiology, since flares affect the habitability of exoplanets orbiting M dwarfs.
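    A simple version of flare detection in high-cadence photometry like MOST's is to flag runs of consecutive points significantly above a robust baseline. The thresholds and the synthetic 60-second-cadence light curve below are assumptions for illustration, not the authors' pipeline.

```python
import numpy as np

def find_flares(flux, n_sigma=3.0, n_consecutive=2):
    """Return start indices of flare candidates: runs of at least
    `n_consecutive` points more than n_sigma above the median,
    using a robust MAD-based noise estimate."""
    med = np.median(flux)
    mad = np.median(np.abs(flux - med))
    sigma = 1.4826 * mad                 # MAD -> Gaussian-equivalent sigma
    hot = flux > med + n_sigma * sigma
    flares, run = [], 0
    for i, h in enumerate(hot):
        run = run + 1 if h else 0
        if run == n_consecutive:
            flares.append(i - n_consecutive + 1)
    return flares

rng = np.random.default_rng(4)
flux = 1.0 + rng.normal(0.0, 0.002, 600)   # ~10 hours at 60 s cadence, quiet
flux[200:205] += 0.02                      # a ~2 per cent (~0.02 mag) flare
print(find_flares(flux))
```

Requiring consecutive hot points suppresses single-point noise excursions, which matters when hunting flares only a few times the photometric noise, as in the 0.01-mag sensitivity quoted above.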

  14. Exploration-driven NEO Detection Requirements

    NASA Astrophysics Data System (ADS)

    Head, J. N.; Sykes, M. V.

    2005-12-01

    The Vision for Space Exploration calls for use of in situ resources to support human solar system exploration goals. Focus has been on potential lunar polar ice, Martian subsurface water and resource extraction from Phobos. Near-earth objects (NEOs) offer easily accessible targets that may represent a critical component to achieving sustainable human operations, in particular small, newly discovered asteroids within a specified dynamical range having requisite composition and frequency. A minimum size requirement is estimated assuming a concept of operations (CONOPS) with an NEO harvester on station at L1. When the NEO launch window opens, the vehicle departs, rendezvousing within 30 days. Mining and processing operations (~60 days) produce dirty water for the return trip (~30 days) to L1 for final refinement into propellants. A market for propellant at L1 is estimated to be 700 mT/year: 250 mT for Mars missions, 100 mT for GTO services (Blair et al. 2002), 50 mT for L1 to lunar surface services, and 300 mT for bringing NEO-derived propellants to L1. Assuming an appropriate NEO has 5% recoverable water, exploited with 50% efficiency, 23,000 mT/year must be processed. At 1500 kg/m^3, this corresponds to one object per year with a radius of 15 meters, or two 5 m radius objects per month; an estimated 10,000 such objects have delta-v < 4.2 km/s, with ~200/year available for short round-trip missions to meet resource requirements (Jones et al. 2002). The importance of these potential resource objects should drive a requirement that next-generation NEO detection systems (e.g., Pan-STARRS/LSST) be capable by 2010 of detecting dark NEOs fainter than V=24, allowing for identification 3 months before closest approach. Blair et al. 2002. Final Report to NASA Exploration Team, December 20, 2002. Jones et al. 2002. ASP Conf. Series Vol. 202 (M. Sykes, Ed.), pp. 141-154.
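    The sizing in the abstract can be sanity-checked with a few lines of arithmetic: processing 23,000 mT/year of material at a bulk density of 1500 kg/m^3 implies a single spherical object of roughly 15 m radius, and the alternative two-small-objects cadence delivers the same order of magnitude.

```python
import math

recoverable = 0.05            # recoverable water mass fraction
efficiency = 0.50             # extraction efficiency
processed = 23_000.0          # mT of raw NEO material per year
density = 1500.0              # bulk density, kg/m^3

water = processed * recoverable * efficiency          # mT of water per year
mass_kg = processed * 1000.0                          # one object per year
radius = (3.0 * mass_kg / (4.0 * math.pi * density)) ** (1.0 / 3.0)

# Alternative cadence from the abstract: two 5 m radius objects per month
alt = 24 * (4.0 / 3.0) * math.pi * 5.0**3 * density / 1000.0  # mT/year
print(f"{water:.0f} mT water/yr, radius {radius:.1f} m, alt {alt:.0f} mT/yr")
```

The single-object radius works out to ~15 m, matching the abstract, and the two-per-month cadence gives ~19,000 mT/year, consistent with 23,000 mT/year at the level of rounding used in the abstract.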

  15. Astronomical Image Processing with Hadoop

    NASA Astrophysics Data System (ADS)

    Wiley, K.; Connolly, A.; Krughoff, S.; Gardner, J.; Balazinska, M.; Howe, B.; Kwon, Y.; Bu, Y.

    2011-07-01

    In the coming decade astronomical surveys of the sky will generate tens of terabytes of images and detect hundreds of millions of sources every night. With a requirement that these images be analyzed in real time to identify moving sources such as potentially hazardous asteroids or transient objects such as supernovae, these data streams present many computational challenges. In the commercial world, new techniques that utilize cloud computing have been developed to handle massive data streams. In this paper we describe how cloud computing, and in particular the map-reduce paradigm, can be used in astronomical data processing. We will focus on our experience implementing a scalable image-processing pipeline for the SDSS database using Hadoop (http://hadoop.apache.org). This multi-terabyte imaging dataset approximates future surveys such as those which will be conducted with the LSST. Our pipeline performs image coaddition in which multiple partially overlapping images are registered, integrated and stitched into a single overarching image. We will first present our initial implementation, then describe several critical optimizations that have enabled us to achieve high performance, and finally describe how we are incorporating a large in-house existing image processing library into our Hadoop system. The optimizations involve prefiltering of the input to remove irrelevant images from consideration, grouping individual FITS files into larger, more efficient indexed files, and a hybrid system in which a relational database is used to determine the input images relevant to the task. The incorporation of an existing image processing library, written in C++, presented difficult challenges since Hadoop is programmed primarily in Java. We will describe how we achieved this integration and the sophisticated image processing routines that were made feasible as a result. 
We will end by briefly describing the longer term goals of our work, namely detection and classification of transient objects and automated object classification.
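    The map-reduce coaddition idea can be reduced to a toy in-memory sketch: the map phase keys every pixel by its position in a common output frame, and the reduce phase averages all values landing on the same output pixel. No Hadoop here — plain Python stands in for both phases, and the images are assumed already registered onto a common integer pixel grid (the real pipeline's registration and resampling steps are omitted).

```python
import numpy as np
from collections import defaultdict

def map_phase(images):
    """Map: each (partially overlapping) image emits (output-pixel, value)
    pairs keyed by its position in the common output frame."""
    for (oy, ox), data in images:
        for (y, x), v in np.ndenumerate(data):
            yield (oy + y, ox + x), v

def reduce_phase(pairs):
    """Reduce: average every value that landed on the same output pixel."""
    acc = defaultdict(list)
    for key, v in pairs:
        acc[key].append(v)
    return {key: float(np.mean(vs)) for key, vs in acc.items()}

# Two 2x2 "images" offset by one column, overlapping in the output frame
img_a = ((0, 0), np.array([[1.0, 2.0], [3.0, 4.0]]))
img_b = ((0, 1), np.array([[4.0, 5.0], [6.0, 7.0]]))
coadd = reduce_phase(map_phase([img_a, img_b]))
print(coadd[(0, 1)])
```

In Hadoop the shuffle between the two phases is what distributes the grouping-by-pixel across machines; the dictionary above plays that role in miniature.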

  16. Simulating the Performance of Ground-Based Optical Asteroid Surveys

    NASA Astrophysics Data System (ADS)

    Christensen, Eric J.; Shelly, Frank C.; Gibbs, Alex R.; Grauer, Albert D.; Hill, Richard E.; Johnson, Jess A.; Kowalski, Richard A.; Larson, Stephen M.

    2014-11-01

    We are developing a set of asteroid survey simulation tools in order to estimate the capability of existing and planned ground-based optical surveys, and to test a variety of possible survey cadences and strategies. The survey simulator is composed of several layers, including a model population of solar system objects and an orbital integrator, a site-specific atmospheric model (including inputs for seeing, haze and seasonal cloud cover), a model telescope (with a complete optical path to estimate throughput), a model camera (including FOV, pixel scale, and focal plane fill factor) and model source extraction and moving object detection layers with tunable detection requirements. We have also developed a flexible survey cadence planning tool to automatically generate nightly survey plans. Inputs to the cadence planner include camera properties (FOV, readout time), telescope limits (horizon, declination, hour angle, lunar and zenithal avoidance), preferred and restricted survey regions in RA/Dec, ecliptic, and Galactic coordinate systems, and recent coverage by other asteroid surveys. Simulated surveys are created for a subset of current and previous NEO surveys (LINEAR, Pan-STARRS and the three Catalina Sky Survey telescopes), and compared against the actual performance of these surveys in order to validate the model’s performance. The simulator tracks objects within the FOV of any pointing that were not discovered (e.g. too few observations, too trailed, focal plane array gaps, too fast or slow), thus dividing the population into “discoverable” and “discovered” subsets, to inform possible survey design changes. Ongoing and future work includes generating a realistic “known” subset of the model NEO population, running multiple independent simulated surveys in coordinated and uncoordinated modes, and testing various cadences to find optimal strategies for detecting NEO sub-populations. 
These tools can also assist in quantifying the efficiency of novel yet unverified survey cadences (e.g. the baseline LSST cadence) that sparsely spread the observations required for detection over several days or weeks.
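The division of the simulated population into "discovered" and "discoverable" subsets can be sketched as a simple filter over tunable detection requirements. The thresholds and field names below are assumptions for illustration, not the simulator's actual interface.

```python
# Partition a simulated asteroid population into "discovered" (met all
# detection requirements) and "discoverable" (in the FOV but failed one,
# e.g. too few observations or too trailed). Thresholds are illustrative.

MIN_DETECTIONS = 3        # minimum observations of an object in one night
MAX_TRAIL_ARCSEC = 10.0   # beyond this trail length a detection is rejected

def classify(objects):
    discovered, discoverable = [], []
    for obj in objects:
        in_fov = obj["detections"] > 0
        enough = obj["detections"] >= MIN_DETECTIONS
        untrailed = obj["trail_arcsec"] <= MAX_TRAIL_ARCSEC
        if in_fov and enough and untrailed:
            discovered.append(obj["name"])
        elif in_fov:
            discoverable.append(obj["name"])  # seen, but failed a requirement
    return discovered, discoverable

pop = [{"name": "A", "detections": 4, "trail_arcsec": 2.0},
       {"name": "B", "detections": 2, "trail_arcsec": 1.0},   # too few obs
       {"name": "C", "detections": 5, "trail_arcsec": 30.0}]  # too trailed
disc, able = classify(pop)
```

Tracking *why* each discoverable object was missed, as the simulator does, is what lets survey designers decide which requirement to relax.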

  17. Artificial Intelligence and the Brave New World of Eclipsing Binaries

    NASA Astrophysics Data System (ADS)

    Devinney, E.; Guinan, E.; Bradstreet, D.; DeGeorge, M.; Giammarco, J.; Alcock, C.; Engle, S.

    2005-12-01

The explosive growth of observational capabilities and information technology over the past decade has brought astronomy to a tipping point - we are going to be deluged by a virtual fire hose (more like Niagara Falls!) of data. An important component of this deluge will be newly discovered eclipsing binary stars (EBs) and other valuable variable stars. As exploration of the Local Group Galaxies grows via current and new ground-based and satellite programs, the number of EBs is expected to grow explosively from some 10,000 today to 8 million as GAIA comes online. These observational advances will present a unique opportunity to study the properties of EBs formed in galaxies with vastly different dynamical, star formation, and chemical histories than our home Galaxy. Thus the study of these binaries (e.g., from light curve analyses) is expected to provide clues about the star formation rates and dynamics of their host galaxies as well as the possible effects of varying chemical abundance on stellar evolution and structure. Additionally, minimal-assumption-based distances to Local Group objects (and possibly 3-D mapping within these objects) will also be returned. These huge datasets of binary stars will provide tests of current theories (or suggest new theories) regarding binary star formation and evolution. However, these enormous datasets will far exceed the capacity for analysis by human examination. To meet the daunting challenge of successfully mining this vast potential of EBs and variable stars for astrophysical results with minimum human intervention, we are developing new data processing techniques and methodologies. Faced with an overwhelming volume of data, our goal is to integrate technologies of Machine Learning and Pattern Processing (Artificial Intelligence [AI]) into the data processing pipelines of the major current and future ground- and space-based observational programs.
Data pipelines of the future will have to carry us from observations to astrophysics with minimal human intervention. While there has been some recognition of this need (e.g. the LSST project drawing on the experience of MACHO/OGLE), few steps have been taken to address this crucial issue. Fortunately, advances in AI have created the opportunity to make significant progress in this direction. Here we discuss our plans to develop an Intelligent Data Pipeline (IDP) that can operate autonomously on large observational datasets to produce results of astrophysical value. Plans and initial results are discussed. This research is supported by NSF/RUI Grant AST05-07542 which we gratefully acknowledge.

  18. The Path from VLITE to ngLOBO: A Roadmap for Evolving a Low Frequency Commensal System from the JVLA to the ngVLA

    NASA Astrophysics Data System (ADS)

    Kassim, Namir E.; Clarke, Tracy; Giacintucci, Simona; Helmboldt, Joseph; Ray, Paul S.; Peters, Wendy; Polisensky, Emil; hicks, Brian C.; Brisken, Walter; hyman, Scott D.; Deneva, Julia; Kerr, Matthew T.; Taylor, Gregory; Dowell, Jayce; Schinzel, Frank K.

    2018-01-01

The VLA Low-band Ionosphere and Transient Experiment (VLITE) is a commensal observing system on the NRAO Karl G. Jansky Very Large Array (VLA). The separate optical path of the prime-focus sub-GHz dipole feeds and the Cassegrain-focus GHz feeds provided an opportunity to expand the simultaneous frequency operation of the VLA through joint observations across both systems. Sixteen VLA antennas are outfitted with dedicated samplers and use spare fibers to transport the 320-384 MHz band to the VLITE CPU-based correlator. Initial goals included exploring the scientific potential of a commensal low frequency system for ionospheric remote sensing, astrophysics and transients. VLITE operates at nearly 70% wall time, with roughly 6200 hours of VLA time recorded each year. Several papers at this meeting review VLITE science and early results. Here we consider how the project could evolve in the future. Over the next 10 years, a straightforward evolutionary path calls for an expansion of VLITE to all 27 VLA antennas and to the maximum available low band receiver bandwidth (224-480 MHz). The GPU-based correlator for this LOw Band Observatory (LOBO) would also incorporate lower frequency signals from the new VLA 74 MHz system, including from VLA dishes (60-80 MHz) and standalone Long Wavelength Array (LWA) aperture array stations (20-80 MHz). In the longer term, we look towards leveraging the vast infrastructure of the ngVLA to include a commensal low frequency capability, called ngLOBO. As described in our community white paper (Taylor et al. 2018; arXiv:1708.00090), ngLOBO has three primary scientific missions: (1) Radio Large Synoptic Survey Telescope (Radio-LSST): one naturally wide beam, commensal with ngVLA, will conduct a continuous synoptic survey of large swaths of the sky for both slow and fast transients; (2) this same commensal beam will provide complementary low frequency images of all ngVLA targets when such data enhance their value.
(3) Independent beams from the ngLOBO-Low aperture array will conduct research in astrophysics, Earth science and space weather applications, engaging new communities and attracting independent resources. We describe our developing ngLOBO technical concept.

  19. Power Systems for Future Missions: Appendices A-L

    NASA Technical Reports Server (NTRS)

    Gill, S. P.; Frye, P. E.; Littman, Franklin D.; Meisl, C. J.

    1994-01-01

    Selection of power system technology for space applications is typically based on mass, readiness of a particular technology to meet specific mission requirements, and life cycle costs (LCC). The LCC is typically used as a discriminator between competing technologies for a single mission application. All other future applications for a given technology are usually ignored. As a result, development cost of a technology becomes a dominant factor in the LCC comparison. Therefore, it is common for technologies such as DIPS and LMR-CBC to be potentially applicable to a wide range of missions and still lose out in the initial LCC comparison due to high development costs. This collection of appendices (A through L) contains the following power systems technology plans: CBC DIPS Technology Roadmap; PEM PFC Technology Roadmap; NAS Battery Technology Roadmap; PV/RFC Power System Technology Roadmap; PV/NAS Battery Technology Roadmap; Thermionic Reactor Power System Technology Roadmap; SP-100 Power System Technology Roadmap; Dynamic SP-100 Power System Technology Roadmap; Near-Term Solar Dynamic Power System Technology Roadmap; Advanced Solar Dynamic Power System Technology Roadmap; Advanced Stirling Cycle Dynamic Isotope Power System Technology Roadmap; and the ESPPRS (Evolutionary Space Power and Propulsion Requirements System) User's Guide.

  20. OAST system technology planning

    NASA Technical Reports Server (NTRS)

    Sadin, S. R.

    1978-01-01

The NASA Office of Aeronautics and Space Technology developed a planning model for space technology consisting of a space systems technology model, technology forecasts and technology surveys. The technology model describes candidate space missions through the year 2000 and identifies their technology requirements. The technology surveys and technology forecasts provide, respectively, data on the current status and estimates of the projected status of relevant technologies. These tools are used to further the understanding of the activities and resources required to ensure the timely development of technological capabilities. Technology forecasts in the areas of information systems, spacecraft systems, transportation systems, and power systems are discussed.

  1. Serendipitous discovery of a strong-lensed galaxy in integral field spectroscopy from MUSE

    NASA Astrophysics Data System (ADS)

    Galbany, Lluís; Collett, Thomas E.; Méndez-Abreu, Jairo; Sánchez, Sebastián F.; Anderson, Joseph P.; Kuncarayakti, Hanindyo

    2018-06-01

2MASX J04035024-0239275 is a bright red elliptical galaxy at redshift 0.0661 that presents two extended sources at 2″ to the north-east and 1″ to the south-west. The sizes and surface brightnesses of the two blue sources are consistent with a gravitationally-lensed background galaxy. In this paper we present MUSE observations of this galaxy from the All-weather MUse Supernova Integral-field Nearby Galaxies (AMUSING) survey, and report the discovery of a background lensed galaxy at redshift 0.1915, together with 15 other background galaxies at redshifts ranging from 0.09 to 0.9 that are not multiply imaged. We have extracted aperture spectra of the lens and all the sources and fit the stellar continuum with STARLIGHT to estimate their stellar and emission line properties. A trace of past merger and active nucleus activity is found in the lensing galaxy, while the background lensed galaxy is found to be star-forming. Modeling the lensing potential with a singular isothermal ellipsoid, we find an Einstein radius of 1.″45 ± 0.″04, which corresponds to 1.9 kpc at the redshift of the lens and is much smaller than its effective radius (reff ≈ 9″). Comparing the Einstein mass and the STARLIGHT stellar mass within the same aperture yields a dark matter fraction of 18% ± 8% within the Einstein radius. The advent of large surveys such as the Large Synoptic Survey Telescope (LSST) will discover a number of strongly lensed systems, and here we demonstrate how wide-field integral field spectroscopy offers an excellent approach to study them and to precisely model lensing effects.

  2. RoboTAP: Target priorities for robotic microlensing observations

    NASA Astrophysics Data System (ADS)

    Hundertmark, M.; Street, R. A.; Tsapras, Y.; Bachelet, E.; Dominik, M.; Horne, K.; Bozza, V.; Bramich, D. M.; Cassan, A.; D'Ago, G.; Figuera Jaimes, R.; Kains, N.; Ranc, C.; Schmidt, R. W.; Snodgrass, C.; Wambsganss, J.; Steele, I. A.; Mao, S.; Ment, K.; Menzies, J.; Li, Z.; Cross, S.; Maoz, D.; Shvartzvald, Y.

    2018-01-01

    Context. The ability to automatically select scientifically-important transient events from an alert stream of many such events, and to conduct follow-up observations in response, will become increasingly important in astronomy. With wide-angle time domain surveys pushing to fainter limiting magnitudes, the capability to follow-up on transient alerts far exceeds our follow-up telescope resources, and effective target prioritization becomes essential. The RoboNet-II microlensing program is a pathfinder project, which has developed an automated target selection process (RoboTAP) for gravitational microlensing events, which are observed in real time using the Las Cumbres Observatory telescope network. Aims: Follow-up telescopes typically have a much smaller field of view compared to surveys, therefore the most promising microlensing events must be automatically selected at any given time from an annual sample exceeding 2000 events. The main challenge is to select between events with a high planet detection sensitivity, with the aim of detecting many planets and characterizing planetary anomalies. Methods: Our target selection algorithm is a hybrid system based on estimates of the planet detection zones around a microlens. It follows automatic anomaly alerts and respects the expected survey coverage of specific events. Results: We introduce the RoboTAP algorithm, whose purpose is to select and prioritize microlensing events with high sensitivity to planetary companions. In this work, we determine the planet sensitivity of the RoboNet follow-up program and provide a working example of how a broker can be designed for a real-life transient science program conducting follow-up observations in response to alerts; we explore the issues that will confront similar programs being developed for the Large Synoptic Survey Telescope (LSST) and other time domain surveys.
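The prioritization step described in this record can be sketched as a scoring function over the live event list. The scoring formula, field names, and cadence cutoff below are invented for illustration; they are not the actual RoboTAP algorithm, which uses detection-zone estimates around the microlens.

```python
# Toy follow-up prioritiser in the spirit of RoboTAP: rank microlensing
# events by a crude planet-sensitivity proxy (higher magnification and a
# brighter baseline score higher), and down-weight events the surveys will
# already characterise on their own. All numbers are illustrative.

def priority(event):
    score = event["magnification"] / event["baseline_mag"]
    if event["survey_cadence_per_day"] >= 10:
        score *= 0.2   # dense survey coverage: follow-up adds little
    return score

events = [
    {"name": "evt1", "magnification": 50.0, "baseline_mag": 18.0,
     "survey_cadence_per_day": 2},
    {"name": "evt2", "magnification": 80.0, "baseline_mag": 18.0,
     "survey_cadence_per_day": 20},   # high-mag but survey-saturated
    {"name": "evt3", "magnification": 5.0, "baseline_mag": 19.5,
     "survey_cadence_per_day": 1},
]
ranked = sorted(events, key=priority, reverse=True)
```

The essential design point survives even in this toy: the broker must weigh intrinsic sensitivity against what the surveys will deliver anyway, since follow-up time is the scarce resource.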

  3. Identifying the progenitors of present-day early-type galaxies in observational surveys: correcting `progenitor bias' using the Horizon-AGN simulation

    NASA Astrophysics Data System (ADS)

    Martin, G.; Kaviraj, S.; Devriendt, J. E. G.; Dubois, Y.; Pichon, C.; Laigle, C.

    2018-03-01

    As endpoints of the hierarchical mass-assembly process, the stellar populations of local early-type galaxies encode the assembly history of galaxies over cosmic time. We use Horizon-AGN, a cosmological hydrodynamical simulation, to study the merger histories of local early-type galaxies and track how the morphological mix of their progenitors evolves over time. We provide a framework for alleviating `progenitor bias' - the bias that occurs if one uses only early-type galaxies to study the progenitor population. Early types attain their final morphology at relatively early epochs - by z ˜ 1, around 60 per cent of today's early types have had their last significant merger. At all redshifts, the majority of mergers have one late-type progenitor, with late-late mergers dominating at z > 1.5 and early-early mergers becoming significant only at z < 0.5. Progenitor bias is severe at all but the lowest redshifts - e.g. at z ˜ 0.6, less than 50 per cent of the stellar mass in today's early types is actually in progenitors with early-type morphology, while, at z ˜ 2, studying only early types misses almost all (80 per cent) of the stellar mass that eventually ends up in local early-type systems. At high redshift, almost all massive late-type galaxies, regardless of their local environment or star formation rate, are progenitors of local early-type galaxies, as are lower mass (M⋆ < 1010.5 M_{⊙}) late-types as long as they reside in high-density environments. In this new era of large observational surveys (e.g. LSST, JWST), this study provides a framework for studying how today's early-type galaxies have been built up over cosmic time.

  4. Searching for Exoplanets using Artificial Intelligence

    NASA Astrophysics Data System (ADS)

    Pearson, Kyle Alexander; Palafox, Leon; Griffith, Caitlin Ann

    2017-10-01

In the last decade, over a million stars were monitored to detect transiting planets. The large volume of data obtained from current and future missions (e.g. Kepler, K2, TESS and LSST) requires automated methods to detect the signature of a planet. Manual interpretation of potential exoplanet candidates is labor intensive and subject to human error, the results of which are difficult to quantify. Here we present a new method of detecting exoplanet candidates in large planetary search projects which, unlike current methods, uses a neural network. Neural networks, also called ``deep learning'' or ``deep nets'', are a state-of-the-art machine learning technique designed to give a computer perception into a specific problem by training it to recognize patterns. Unlike past transit detection algorithms, the deep net learns to characterize the data instead of relying on hand-coded metrics that humans perceive as the most representative. Exoplanet transits have different shapes, as a result of, e.g., the planetary and stellar atmospheres and the transit geometry. Thus, a simple template does not suffice to capture the subtle details, especially if the signal is below the noise or strong systematics are present. Current false-positive rates from the Kepler data are estimated around 12.3% for Earth-like planets and there has been no study of the false negative rates. It is therefore important to ask how the properties of current algorithms exactly affect the results of the Kepler mission and future missions such as TESS, which flies next year. These uncertainties affect the fundamental research derived from missions, such as the discovery of habitable planets, estimates of their occurrence rates and our understanding about the nature and evolution of planetary systems.

  5. Ganalyzer: A Tool for Automatic Galaxy Image Analysis

    NASA Astrophysics Data System (ADS)

    Shamir, Lior

    2011-08-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.

  6. Status of LOFAR Data in HDF5 Format

    NASA Astrophysics Data System (ADS)

    Alexov, A.; Schellart, P.; ter Veen, S.; van der Akker, M.; Bähren, L.; Greissmeier, J.-M.; Hessels, J. W. T.; Mol, J. D.; Renting, G. A.; Swinbank, J.; Wise, M.

    2012-09-01

    The Hierarchical Data Format, version 5 (HDF5) is a data model, library, and file format for storing and managing data. It is designed for flexible and efficient I/O and for high volume, complex data. The Low Frequency Array (LOFAR) project is solving the challenge of data size and complexity using HDF5. Most of LOFAR's standard data products will be stored using HDF5; the beam-formed time-series data and transient buffer board data have already transitioned from project-specific binary format to HDF5. We report on our effort to pave the way towards new astronomical data encapsulation using HDF5, which can be used by future ground and space projects. The LOFAR project has formed a collaboration with NRAO, the Virtual Astronomical Observatory (VAO) and the HDF Group to obtain funding for a full-time staff member to work on documenting and developing standards for astronomical data written in HDF5. We hope our effort will enhance HDF5 visibility and usage within the community, specifically for LSST, the SKA pathfinders (ASKAP, MeerKAT, MWA, LWA), and other major new radio telescopes such as EVLA, ALMA, and eMERLIN.

  7. Constraints on inflation with LSS surveys: features in the primordial power spectrum

    NASA Astrophysics Data System (ADS)

    Palma, Gonzalo A.; Sapone, Domenico; Sypsas, Spyros

    2018-06-01

We analyse the efficiency of future large scale structure surveys to unveil the presence of scale dependent features in the primordial spectrum, resulting from cosmic inflation, imprinted in the distribution of galaxies. Features may appear as a consequence of non-trivial dynamics during cosmic inflation, in which one or more background quantities experienced small but rapid deviations from their characteristic slow-roll evolution. We consider two families of features: localised features and oscillatory extended features. To characterise them we employ various possible templates parametrising their scale dependence and provide forecasts on the constraints on these parametrisations for LSST-like surveys. We perform a Fisher matrix analysis for three observables: cosmic microwave background (CMB), galaxy clustering and weak lensing. We find that the combined data set of these observables will be able to limit the presence of features down to levels that are more restrictive than current constraints coming from CMB observations only. In particular, we address the possibility of gaining information on currently known deviations from scale invariance inferred from CMB data, such as the feature appearing at the l ~ 20 multipole (which is the main contribution to the low-l deficit) and another one around l ~ 800.
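A Fisher forecast of the kind described above reduces to a standard recipe: differentiate the observable with respect to each template parameter, weight by the measurement errors, and invert. The toy oscillatory template, fiducial values, and error bars below are assumptions for illustration, not the paper's templates.

```python
import numpy as np

# Minimal Fisher-matrix forecast. For an observable O(k; A, w) measured in
# independent bins with Gaussian error sigma, the Fisher matrix is
#   F_ij = sum_k (dO/dp_i)(dO/dp_j) / sigma^2,
# and the forecast 1-sigma error on p_i is sqrt((F^-1)_ii).

k = np.linspace(0.01, 0.2, 100)   # wavenumber bins probed by the survey
sigma = 0.05                       # per-bin measurement error (toy value)
A, w = 0.1, 30.0                   # fiducial feature amplitude and frequency

dO_dA = np.sin(w * k)              # dO/dA for the toy template A*sin(w*k)
dO_dw = A * k * np.cos(w * k)      # dO/dw

derivs = np.vstack([dO_dA, dO_dw])
F = derivs @ derivs.T / sigma**2   # 2x2 Fisher matrix
errors = np.sqrt(np.diag(np.linalg.inv(F)))
```

Combining probes (CMB, clustering, weak lensing), as the paper does, amounts to summing their Fisher matrices before inverting.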

  8. EmpiriciSN: Re-sampling Observed Supernova/Host Galaxy Populations Using an XD Gaussian Mixture Model

    NASA Astrophysics Data System (ADS)

    Holoien, Thomas W.-S.; Marshall, Philip J.; Wechsler, Risa H.

    2017-06-01

We describe two new open-source tools written in Python for performing extreme deconvolution Gaussian mixture modeling (XDGMM) and using a conditioned model to re-sample observed supernova and host galaxy populations. XDGMM is a new program that uses Gaussian mixtures to perform density estimation of noisy data using extreme deconvolution (XD) algorithms. Additionally, it has functionality not available in other XD tools. It allows the user to select between the AstroML and Bovy et al. fitting methods and is compatible with scikit-learn machine learning algorithms. Most crucially, it allows the user to condition a model based on the known values of a subset of parameters. This gives the user the ability to produce a tool that can predict unknown parameters based on a model that is conditioned on known values of other parameters. EmpiriciSN is an exemplary application of this functionality, which can be used to fit an XDGMM model to observed supernova/host data sets and predict likely supernova parameters using a model conditioned on observed host properties. It is primarily intended to simulate realistic supernovae for LSST data simulations based on empirical galaxy properties.
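The conditioning operation that this record highlights has a closed form for a single multivariate Gaussian; a mixture applies it per component and reweights the component amplitudes. A one-component sketch (not the XDGMM API) follows; the example numbers are invented.

```python
import numpy as np

# Condition a multivariate Gaussian N(mu, cov) on known values of a subset
# of its dimensions, returning the mean and covariance of the rest:
#   mu_u|k  = mu_u + C_uk C_kk^-1 (x_k - mu_k)
#   cov_u|k = C_uu - C_uk C_kk^-1 C_ku

def condition_gaussian(mu, cov, known_idx, known_vals):
    idx = np.arange(len(mu))
    u = np.setdiff1d(idx, known_idx)     # unknown dimensions
    k = np.asarray(known_idx)
    cov_uu = cov[np.ix_(u, u)]
    cov_uk = cov[np.ix_(u, k)]
    cov_kk = cov[np.ix_(k, k)]
    diff = np.asarray(known_vals) - mu[k]
    mu_cond = mu[u] + cov_uk @ np.linalg.solve(cov_kk, diff)
    cov_cond = cov_uu - cov_uk @ np.linalg.solve(cov_kk, cov_uk.T)
    return mu_cond, cov_cond

# Example: predict a "supernova" parameter (index 1) from a correlated
# "host" parameter (index 0). Values are made up for illustration.
mu = np.array([0.0, 0.0])
cov = np.array([[1.0, 0.8],
                [0.8, 1.0]])
m, c = condition_gaussian(mu, cov, known_idx=[0], known_vals=[1.0])
```

Sampling from the conditioned distribution is then what lets EmpiriciSN draw plausible supernova parameters for a given observed host.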

  9. Supernovae anisotropy power spectrum

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghodsi, Hoda; Baghram, Shant; Habibi, Farhang, E-mail: h.ghodsi@mehr.sharif.ir, E-mail: baghram@sharif.edu, E-mail: habibi@lal.in2p3.fr

    2017-10-01

We contribute another anisotropy study to this field of research using Type Ia supernovae (SNe Ia). In this work, we utilise the power spectrum calculation method and apply it to both the current SNe Ia data and simulation. Using the Union2.1 data set at all redshifts, we compare the spectrum of the residuals of the observed distance moduli to that expected from an isotropic universe affected by the Union2.1 observational uncertainties at low multipoles. Through this comparison we find a dipolar anisotropy with a tension of less than 2σ towards l = 171° ± 21° and b = −26° ± 28°, which is mainly induced by the anisotropic spatial distribution of the SNe with z > 0.2 rather than being a cosmic effect. Furthermore, we find a tension of ∼ 4σ at ℓ = 4 between the two spectra. Our simulations are constructed with the characteristics of the upcoming surveys like the Large Synoptic Survey Telescope (LSST), which shall bring us the largest SNe Ia collection to date. We make predictions for the amplitude of a possible dipolar anisotropy that would be detectable by future SNe Ia surveys.
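The lowest multipole in such an analysis, the dipole, can be estimated directly from the residuals: model each supernova's distance-modulus residual as the projection of a dipole vector onto its sky direction and solve by least squares. The sketch below uses synthetic data with an injected dipole; it is not the Union2.1 pipeline.

```python
import numpy as np

# Dipole fit to distance-modulus residuals: resid_i = d . n_i + noise,
# solved for the dipole vector d by linear least squares.

rng = np.random.default_rng(0)
n = rng.normal(size=(500, 3))
n /= np.linalg.norm(n, axis=1, keepdims=True)    # unit sky directions
d_true = np.array([0.02, -0.01, 0.005])           # injected dipole (mag)
resid = n @ d_true + rng.normal(scale=0.001, size=500)

# Least squares: minimise |resid - n @ d|^2 over d.
d_fit, *_ = np.linalg.lstsq(n, resid, rcond=None)
```

An anisotropic sky distribution of the supernovae makes the columns of `n` correlated, which is precisely how survey geometry can induce an apparent dipole, the effect the abstract attributes to the z > 0.2 sample.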

  10. Detailed Quantitative Classifications of Galaxy Morphology

    NASA Astrophysics Data System (ADS)

    Nair, Preethi

    2018-01-01

    Understanding the physical processes responsible for the growth of galaxies is one of the key challenges in extragalactic astronomy. The assembly history of a galaxy is imprinted in a galaxy’s detailed morphology. The bulge-to-total ratio of galaxies, the presence or absence of bars, rings, spiral arms, tidal tails etc, all have implications for the past merger, star formation, and feedback history of a galaxy. However, current quantitative galaxy classification schemes are only useful for broad binning. They cannot classify or exploit the wide variety of galaxy structures seen in nature. Therefore, comparisons of observations with theoretical predictions of secular structure formation have only been conducted on small samples of visually classified galaxies. However large samples are needed to disentangle the complex physical processes of galaxy formation. With the advent of large surveys, like the Sloan Digital Sky Survey (SDSS) and the upcoming Large Synoptic Survey Telescope (LSST) and WFIRST, the problem of statistics will be resolved. However, the need for a robust quantitative classification scheme will still remain. Here I will present early results on promising machine learning algorithms that are providing detailed classifications, identifying bars, rings, multi-armed spiral galaxies, and Hubble type.

  11. The SED Machine: A Robotic Spectrograph for Fast Transient Classification

    NASA Astrophysics Data System (ADS)

    Blagorodnova, Nadejda; Neill, James D.; Walters, Richard; Kulkarni, Shrinivas R.; Fremling, Christoffer; Ben-Ami, Sagi; Dekany, Richard G.; Fucik, Jason R.; Konidaris, Nick; Nash, Reston; Ngeow, Chow-Choong; Ofek, Eran O.; O’ Sullivan, Donal; Quimby, Robert; Ritter, Andreas; Vyhmeister, Karl E.

    2018-03-01

Current time domain facilities are finding several hundreds of transient astronomical events a year. The discovery rate is expected to increase in the future as new surveys such as the Zwicky Transient Facility (ZTF) and the Large Synoptic Survey Telescope (LSST) come online. Presently, the rate at which transients are classified is approximately one order of magnitude lower than the discovery rate, leading to an increasing “follow-up drought”. Existing telescopes with moderate aperture can help address this deficit when equipped with spectrographs optimized for spectral classification. Here, we provide an overview of the design, operations and first results of the Spectral Energy Distribution Machine (SEDM), operating on the Palomar 60-inch telescope (P60). The instrument is optimized for classification and high observing efficiency. It combines a low-resolution (R ∼ 100) integral field unit (IFU) spectrograph with the “Rainbow Camera” (RC), a multi-band field acquisition camera which also serves as a multi-band (ugri) photometer. The SEDM was commissioned during the operation of the intermediate Palomar Transient Factory (iPTF) and has already lived up to its promise. The success of the SEDM demonstrates the value of spectrographs optimized for spectral classification.

  12. Synthetic Survey of the Kepler Field

    NASA Astrophysics Data System (ADS)

    Wells, Mark; Prša, Andrej

    2018-01-01

    In the era of large scale surveys, including LSST and Gaia, binary population studies will flourish due to the large influx of data. In addition to probing binary populations as a function of galactic latitude, under-sampled groups such as low mass binaries will be observed at an unprecedented rate. To prepare for these missions, binary population simulations need to be carried out at high fidelity. These simulations will enable the creation of simulated data and, through comparison with real data, will allow the underlying binary parameter distributions to be explored. In order for the simulations to be considered robust, they should reproduce observed distributions accurately. To this end we have developed a simulator which takes input models and creates a synthetic population of eclipsing binaries. Starting from a galactic single star model, implemented using Galaxia, a code by Sharma et al. (2011), and applying observed multiplicity, mass-ratio, period, and eccentricity distributions, as reported by Raghavan et al. (2010), Duchêne & Kraus (2013), and Moe & Di Stefano (2017), we are able to generate synthetic binary surveys that correspond to any survey cadences. In order to calibrate our input models we compare the results of our synthesized eclipsing binary survey to the Kepler Eclipsing Binary catalog.
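The population-synthesis step described above can be illustrated in miniature: draw orbital periods from a log-normal (Raghavan-like) distribution, draw isotropic orientations, and flag the systems that eclipse geometrically. The distribution parameters and stellar values below are placeholders, not the calibrated models from the cited works.

```python
import numpy as np

# Toy eclipsing-binary population synthesis: a binary with separation a and
# summed radii R1 + R2 eclipses when cos(i) < (R1 + R2) / a for isotropic
# inclinations. Solar-mass, solar-radius components assumed throughout.

rng = np.random.default_rng(42)
N = 100_000
logP = rng.normal(loc=2.0, scale=1.5, size=N)    # log10(P/days), illustrative
P = 10.0 ** logP
a = (P / 365.25) ** (2.0 / 3.0)                   # AU, Kepler's law, ~1 Msun
cos_i = rng.uniform(0.0, 1.0, size=N)             # isotropic orientations

R_sum_au = 2 * 0.00465                            # two solar radii in AU
eclipses = cos_i < R_sum_au / a
frac = eclipses.mean()
```

Layering a survey cadence on top, i.e. asking which eclipses actually fall in observing windows, is what turns this geometric population into a synthetic survey comparable to the Kepler Eclipsing Binary catalog.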

  13. SPHEREx: Probing the Physics of Inflation with an All-Sky Spectroscopic Galaxy Survey

    NASA Astrophysics Data System (ADS)

    Dore, Olivier; SPHEREx Science Team

    2018-01-01

SPHEREx, a mission in NASA's Medium Explorer (MIDEX) program that was selected for Phase A in August 2017, is an all-sky survey satellite designed to address all three science goals in NASA’s astrophysics division: probe the origin and destiny of our Universe; explore whether planets around other stars could harbor life; and explore the origin and evolution of galaxies. These themes are addressed by a single survey, with a single instrument. In this poster, we describe how SPHEREx can probe the physics of inflationary non-Gaussianity by measuring large-scale structure with galaxy redshifts over a large cosmological volume at low redshifts, complementing high-redshift surveys optimized to constrain dark energy. SPHEREx will be the first all-sky near-infrared spectral survey, creating a legacy archive of spectra. In particular, it will measure the redshifts of over 500 million galaxies of all types, an unprecedented dataset. Using this catalog, SPHEREx will reduce the uncertainty in fNL, a parameter describing the inflationary initial conditions, by a factor of more than 10 compared with CMB measurements. At the same time, this catalog will enable strong scientific synergies with Euclid, WFIRST and LSST.

  14. Halo models of HI selected galaxies

    NASA Astrophysics Data System (ADS)

    Paul, Niladri; Choudhury, Tirthankar Roy; Paranjape, Aseem

    2018-06-01

Modelling the distribution of neutral hydrogen (HI) in dark matter halos is important for studying galaxy evolution in the cosmological context. We use a novel approach to infer the HI-dark matter connection at the massive end (mHI > 10^9.8 M⊙) from radio HI emission surveys, using optical properties of low-redshift galaxies as an intermediary. In particular, we use a previously calibrated optical HOD (halo occupation distribution) describing the luminosity- and colour-dependent clustering of SDSS galaxies and describe the HI content using a statistical scaling relation between the optical properties and HI mass. This allows us to compute the abundance and clustering properties of HI-selected galaxies and compare with data from the ALFALFA survey. We apply an MCMC-based statistical analysis to constrain the free parameters related to the scaling relation. The resulting best-fit scaling relation identifies massive HI galaxies primarily with optically faint blue centrals, consistent with expectations from galaxy formation models. We compare the HI-stellar mass relation predicted by our model with independent observations from matched HI-optical galaxy samples, finding reasonable agreement. As a further application, we make some preliminary forecasts for future observations of HI and optical galaxies in the expected overlap volume of SKA and Euclid/LSST.

  15. EmpiriciSN: Re-sampling Observed Supernova/Host Galaxy Populations Using an XD Gaussian Mixture Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holoien, Thomas W. -S.; Marshall, Philip J.; Wechsler, Risa H.

We describe two new open-source tools written in Python for performing extreme deconvolution Gaussian mixture modeling (XDGMM) and using a conditioned model to re-sample observed supernova and host galaxy populations. XDGMM is a new program that uses Gaussian mixtures to perform density estimation of noisy data using extreme deconvolution (XD) algorithms. Additionally, it has functionality not available in other XD tools: it allows the user to select between the AstroML and Bovy et al. fitting methods and is compatible with scikit-learn machine learning algorithms. Most crucially, it allows the user to condition a model based on the known values of a subset of parameters, giving the user the ability to produce a tool that can predict unknown parameters from a model conditioned on known values of other parameters. EmpiriciSN is an exemplary application of this functionality, which can be used to fit an XDGMM model to observed supernova/host data sets and predict likely supernova parameters using a model conditioned on observed host properties. It is primarily intended to simulate realistic supernovae for LSST data simulations based on empirical galaxy properties.
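The conditioning step can be illustrated with the standard multivariate-normal conditioning formulas, applied per mixture component and re-weighting each component by how well it explains the observed value. This toy is not the XDGMM API; the two-component mixture below is hypothetical:

```python
import numpy as np

def condition_gmm(weights, means, covs, x_known, known_idx=0):
    """Condition a 2-D Gaussian mixture on an observed value of one variable.

    Applies the usual Gaussian conditioning formulas per component and
    re-weights each component by its likelihood of x_known. Returns the
    conditional weights, means, and variances of the remaining variable.
    """
    free_idx = 1 - known_idx
    new_w, new_mu, new_var = [], [], []
    for w, mu, S in zip(weights, means, covs):
        s_kk = S[known_idx, known_idx]
        s_fk = S[free_idx, known_idx]
        # Conditional mean and variance of the free variable
        mu_c = mu[free_idx] + s_fk / s_kk * (x_known - mu[known_idx])
        var_c = S[free_idx, free_idx] - s_fk ** 2 / s_kk
        # Component responsibility: weight times N(x_known | mu_k, s_kk)
        like = np.exp(-0.5 * (x_known - mu[known_idx]) ** 2 / s_kk) \
            / np.sqrt(2 * np.pi * s_kk)
        new_w.append(w * like)
        new_mu.append(mu_c)
        new_var.append(var_c)
    new_w = np.array(new_w)
    return new_w / new_w.sum(), np.array(new_mu), np.array(new_var)

# Toy mixture: two correlated 2-D Gaussians (say, host property vs SN parameter)
weights = [0.5, 0.5]
means = [np.array([0.0, 0.0]), np.array([4.0, 2.0])]
covs = [np.array([[1.0, 0.8], [0.8, 1.0]])] * 2
w_c, mu_c, var_c = condition_gmm(weights, means, covs, x_known=4.0)
# Conditioning near the second component's mean concentrates weight there
```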

  16. The Dark Energy Spectroscopic Instrument (DESI)

    NASA Astrophysics Data System (ADS)

    Flaugher, Brenna; Bebek, Chris

    2014-07-01

    The Dark Energy Spectroscopic Instrument (DESI) is a Stage IV ground-based dark energy experiment that will study baryon acoustic oscillations (BAO) and the growth of structure through redshift-space distortions with a wide-area galaxy and quasar spectroscopic redshift survey. The DESI instrument consists of a new wide-field (3.2 deg. linear field of view) corrector plus a multi-object spectrometer with up to 5000 robotically positioned optical fibers and will be installed at prime focus on the Mayall 4m telescope at Kitt Peak, Arizona. The fibers feed 10 three-arm spectrographs producing spectra that cover a wavelength range from 360-980 nm and have resolution of 2000-5500 depending on the wavelength. The DESI instrument is designed for a 14,000 sq. deg. multi-year survey of targets that trace the evolution of dark energy out to redshift 3.5 using the redshifts of luminous red galaxies (LRGs), emission line galaxies (ELGs) and quasars. DESI is the successor to the successful Stage-III BOSS spectroscopic redshift survey and complements imaging surveys such as the Stage-III Dark Energy Survey (DES, currently operating) and the Stage-IV Large Synoptic Survey Telescope (LSST, planned start early in the next decade).

  17. Cluster cosmology with next-generation surveys.

    NASA Astrophysics Data System (ADS)

    Ascaso, B.

    2017-03-01

The advent of next-generation surveys will provide a large number of cluster detections that will serve as the basis for constraining cosmological parameters using cluster counts. The two main observational ingredients needed are the cluster selection function and the calibration of the mass-observable relation. In this talk, we present the methodology designed to obtain robust predictions of both ingredients based on realistic cosmological simulations mimicking the following next-generation surveys: J-PAS, LSST and Euclid. We display recent results on the selection functions for these surveys together with others coming from other next-generation surveys such as eROSITA, ACTpol and SPTpol. We notice that the optical and IR surveys will reach the lowest masses between 0.3

  18. Applying Technology Ranking and Systems Engineering in Advanced Life Support

    NASA Technical Reports Server (NTRS)

    Jones, Harry; Luna, Bernadette (Technical Monitor)

    2000-01-01

According to the Advanced Life Support (ALS) Program Plan, the Systems Modeling and Analysis Project (SMAP) has two important tasks: 1) prioritizing investments in ALS Research and Technology Development (R&TD), and 2) guiding the evolution of ALS systems. Investments could be prioritized simply by independently ranking different technologies, but we should also consider a technology's impact on system design. Guiding future ALS systems will require SMAP to consider many aspects of systems engineering. R&TD investments can be prioritized using familiar methods for ranking technology. The first step is gathering data on technology performance, safety, readiness level, and cost. Then the technologies are ranked using metrics or by decision analysis using net present economic value. The R&TD portfolio can be optimized to provide the maximum expected payoff in the face of uncertain future events. But more is needed. The optimum ALS system cannot be designed simply by selecting the best technology for each predefined subsystem. Incorporating a new technology, such as food plants, can change the specifications of other subsystems, such as air regeneration. Systems must be designed top-down starting from system objectives, not bottom-up from selected technologies. The familiar top-down systems engineering process includes defining mission objectives, mission design, system specification, technology analysis, preliminary design, and detail design. Technology selection is only one part of systems analysis and engineering, and it is strongly related to the subsystem definitions. ALS systems should be designed using top-down systems engineering. R&TD technology selection should consider how the technology affects ALS system design. Technology ranking is useful, but it is only a small part of systems engineering.
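The ranking-by-net-present-value step mentioned above can be sketched as follows; the discount rate and the two option cash-flow profiles are hypothetical:

```python
def npv(cash_flows, rate):
    """Net present value of yearly cash flows discounted at `rate`.
    cash_flows[0] occurs now; cash_flows[t] occurs t years out."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical R&TD options: upfront cost now, payoffs in later years
options = {
    "tech_A": [-100, 30, 40, 60],   # slower, larger payoff
    "tech_B": [-100, 60, 50, 10],   # faster, smaller payoff
}
# Rank options by NPV at a 10% discount rate, best first
ranked = sorted(options, key=lambda k: npv(options[k], rate=0.10), reverse=True)
```

At a 10% rate the slower-but-larger payoff of `tech_A` still wins; a higher discount rate would favor the faster payoff, which is exactly the kind of trade-off such a ranking exposes.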

  19. High resolution optical surface metrology with the slope measuring portable optical test system

    NASA Astrophysics Data System (ADS)

    Maldonado, Alejandro V.

New optical designs strive to achieve extreme performance, and continually increase the complexity of prescribed optical shapes, which often require wide dynamic range and high resolution. SCOTS, or the Software Configurable Optical Test System, can measure a wide range of optical surfaces with high sensitivity using surface slope. This dissertation introduces a high resolution version of SCOTS called SPOTS, or the Slope measuring Portable Optical Test System. SPOTS improves the metrology of surface features on the order of sub-millimeter to decimeter spatial scales and nanometer to micrometer level height scales. Currently there is no optical surface metrology instrument with the same utility. SCOTS uses a computer controlled display (such as an LCD monitor) and camera to measure surface slopes over the entire surface of a mirror. SPOTS differs in that an additional lens is placed near the surface under test. A small prototype system is discussed in general, providing the support for the design of future SPOTS devices. Then the SCOTS instrument transfer function is addressed, which defines the way the system filters surface heights. Lastly, the calibration and performance of a larger SPOTS device is analyzed with example measurements of the 8.4-m diameter aspheric Large Synoptic Survey Telescope (LSST) primary mirror. In general, optical systems have a transfer function, which filters data. In the case of optical imaging systems the instrument transfer function (ITF) follows the modulation transfer function (MTF), which causes a reduction of contrast as a function of increasing spatial frequency due to diffraction. In SCOTS, the ITF is shown to decrease the measured height of surface features as their spatial frequency increases, and thus the SCOTS and SPOTS ITF is proportional to their camera system's MTF. Theory and simulations are supported by a SCOTS measurement of a test piece with a set of lithographically written sinusoidal surface topographies.
In addition, an example of a simple inverse filtering technique is provided. The success of a small SPOTS proof of concept instrument paved the way for a new larger prototype system, which is intended to measure subaperture regions on large optical mirrors. On large optics, the prototype SPOTS is lightweight and rests on the surface being tested. One advantage of this SPOTS is its stability in maintaining calibration over time. Thus the optician can simply place SPOTS on the mirror, perform a simple alignment, collect measurement data, then pick the system up and repeat at a new location. The entire process takes approximately 5 to 10 minutes, of which 3 minutes is spent collecting data. SPOTS' simplicity of design, light weight, robustness, wide dynamic range, and high sensitivity make it a useful tool for optical shop use during the fabrication and testing of large and small optics.
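The inverse-filtering idea mentioned above can be sketched in a few lines: synthesize a sinusoidal surface, attenuate it with an assumed ITF, then divide the measured spectrum by the ITF wherever it is comfortably above zero. The Gaussian ITF shape and the cutoff below are illustrative, not the dissertation's measured values:

```python
import numpy as np

n, L = 1024, 10.0                 # samples, test-piece length (arbitrary units)
x = np.linspace(0, L, n, endpoint=False)
freqs = np.fft.rfftfreq(n, d=L / n)

# "True" surface: a sinusoidal topography of known amplitude
amp, f0 = 1.0, 5.0
surface = amp * np.sin(2 * np.pi * f0 * x)

# Hypothetical instrument transfer function: Gaussian roll-off with frequency
itf = np.exp(-(freqs / 12.0) ** 2)

# Measurement = true surface filtered by the ITF (heights attenuated)
measured = np.fft.irfft(np.fft.rfft(surface) * itf, n)

# Naive inverse filter: divide by the ITF where it is not too small,
# to avoid amplifying noise at high spatial frequencies
spec = np.fft.rfft(measured)
safe = itf > 0.1
spec[safe] /= itf[safe]
recovered = np.fft.irfft(spec, n)

measured_amp = measured.max()     # attenuated peak height
recovered_amp = recovered.max()   # close to the true amplitude again
```

Real measurements contain noise, so the cutoff (or a Wiener-style regularizer) is what keeps the inverse filter from blowing up where the ITF is near zero.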

  20. Towards G2G: Systems of Technology Database Systems

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Bell, David

    2005-01-01

We present an approach and methodology for developing Government-to-Government (G2G) Systems of Technology Database Systems. G2G will deliver technologies for distributed and remote integration of technology data for internal use in analysis and planning as well as for external communications. G2G enables NASA managers, engineers, operational teams and information systems to "compose" technology roadmaps and plans by selecting, combining, extending, specializing and modifying components of technology database systems. G2G will interoperate information and knowledge distributed across the organizational entities involved, an approach well suited to NASA's future Exploration Enterprise. Key contributions of the G2G system will include the creation of an integrated approach to sustain effective management of technology investments that supports the ability of various technology database systems to be independently managed. The integration technology will comply with emerging open standards. Applications can thus be customized for local needs while enabling an integrated management-of-technology approach that serves the global needs of NASA. The G2G capabilities will use NASA's breakthrough in database "composition" and integration technology, will use and advance emerging open standards, and will use commercial information technologies to enable effective Systems of Technology Database Systems.

  1. Large Space Systems Technology, Part 2, 1981

    NASA Technical Reports Server (NTRS)

    Boyer, W. J. (Compiler)

    1982-01-01

    Four major areas of interest are covered: technology pertinent to large antenna systems; technology related to the control of large space systems; basic technology concerning structures, materials, and analyses; and flight technology experiments. Large antenna systems and flight technology experiments are described. Design studies, structural testing results, and theoretical applications are presented with accompanying validation data. These research studies represent state-of-the art technology that is necessary for the development of large space systems. A total systems approach including structures, analyses, controls, and antennas is presented as a cohesive, programmatic plan for large space systems.

  2. Large space systems technology, 1980, volume 1

    NASA Technical Reports Server (NTRS)

    Kopriver, F., III (Compiler)

    1981-01-01

The technological and developmental efforts in support of the large space systems technology are described. Three major areas of interest are emphasized: (1) technology pertinent to large antenna systems; (2) technology related to large space platform systems; and (3) activities that support both antenna and platform systems.

  3. The 1980 Large space systems technology. Volume 2: Base technology

    NASA Technical Reports Server (NTRS)

    Kopriver, F., III (Compiler)

    1981-01-01

    Technology pertinent to large antenna systems, technology related to large space platform systems, and base technology applicable to both antenna and platform systems are discussed. Design studies, structural testing results, and theoretical applications are presented with accompanying validation data. A total systems approach including controls, platforms, and antennas is presented as a cohesive, programmatic plan for large space systems.

  4. Space and nuclear research and technology

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A fact sheet is presented on the space and nuclear research and technology program consisting of a research and technology base, system studies, system technology programs, entry systems technology, and experimental programs.

  5. A review on technologies and their usage in solid waste monitoring and management systems: Issues and challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hannan, M.A., E-mail: hannan@eng.ukm.my; Abdulla Al Mamun, Md., E-mail: md.abdulla@siswa.ukm.edu.my; Hussain, Aini, E-mail: aini@eng.ukm.my

Highlights: • Classification of available technologies for SWM systems in four core categories. • Organization of technology-based SWM systems in three main groups. • Summary of SWM systems with target application, methodology and functional domain. • Issues and challenges are highlighted for further design of a sustainable system. - Abstract: Against the backdrop of rapid advancement, information and communication technology (ICT) has become an inevitable part of planning and designing modern solid waste management (SWM) systems. This study presents a critical review of the existing ICTs and their usage in SWM systems to unfold the issues and challenges of using integrated technology-based systems. To plan, monitor, collect and manage solid waste, the ICTs are divided into four categories: spatial technologies, identification technologies, data acquisition technologies and data communication technologies. The ICT-based SWM systems classified in this paper are based on the first three technologies, while the fourth one is employed by almost every system. This review may guide the reader through the basics of available ICTs and their application in SWM to facilitate the planning and design of a sustainable new system.

  6. 1998 IEEE Aerospace Conference. Proceedings.

    NASA Astrophysics Data System (ADS)

    The following topics were covered: science frontiers and aerospace; flight systems technologies; spacecraft attitude determination and control; space power systems; smart structures and dynamics; military avionics; electronic packaging; MEMS; hyperspectral remote sensing for GVP; space laser technology; pointing, control, tracking and stabilization technologies; payload support technologies; protection technologies; 21st century space mission management and design; aircraft flight testing; aerospace test and evaluation; small satellites and enabling technologies; systems design optimisation; advanced launch vehicles; GPS applications and technologies; antennas and radar; software and systems engineering; scalable systems; communications; target tracking applications; remote sensing; advanced sensors; and optoelectronics.

  7. Projecting technology change to improve space technology planning and systems management

    NASA Astrophysics Data System (ADS)

    Walk, Steven Robert

    2011-04-01

    Projecting technology performance evolution has been improving over the years. Reliable quantitative forecasting methods have been developed that project the growth, diffusion, and performance of technology in time, including projecting technology substitutions, saturation levels, and performance improvements. These forecasts can be applied at the early stages of space technology planning to better predict available future technology performance, assure the successful selection of technology, and improve technology systems management strategy. Often what is published as a technology forecast is simply scenario planning, usually made by extrapolating current trends into the future, with perhaps some subjective insight added. Typically, the accuracy of such predictions falls rapidly with distance in time. Quantitative technology forecasting (QTF), on the other hand, includes the study of historic data to identify one of or a combination of several recognized universal technology diffusion or substitution patterns. In the same manner that quantitative models of physical phenomena provide excellent predictions of system behavior, so do QTF models provide reliable technological performance trajectories. In practice, a quantitative technology forecast is completed to ascertain with confidence when the projected performance of a technology or system of technologies will occur. Such projections provide reliable time-referenced information when considering cost and performance trade-offs in maintaining, replacing, or migrating a technology, component, or system. This paper introduces various quantitative technology forecasting techniques and illustrates their practical application in space technology and technology systems management.
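A minimal instance of quantitative technology forecasting is fitting a logistic diffusion curve to historic data and reading off projected future performance. The sketch below assumes the saturation limit is known and linearizes the fit with the logit transform; all data are synthetic:

```python
import numpy as np

def fit_logistic(t, y, limit):
    """Fit y(t) = limit / (1 + exp(-k*(t - t0))) by linearizing with the
    logit transform (z = k*t - k*t0) and solving ordinary least squares."""
    z = np.log(y / (limit - y))
    k, c = np.polyfit(t, z, 1)
    return k, -c / k                      # growth rate, midpoint year

# Synthetic adoption data following a known logistic (limit=100, k=0.4, t0=2010)
t = np.arange(2000, 2021)
y = 100 / (1 + np.exp(-0.4 * (t - 2010)))
k, t0 = fit_logistic(t, y, limit=100)

# Forecast: performance projected at a future date from the fitted curve
y_2025 = 100 / (1 + np.exp(-k * (2025 - t0)))
```

With noisy data or an unknown saturation level, a nonlinear fit (e.g. `scipy.optimize.curve_fit`) over all three parameters replaces the linearized step, but the forecasting logic is unchanged.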

  8. The Dairy Technology System in Venezuela. Summary of Research 79.

    ERIC Educational Resources Information Center

    Nieto, Ruben D.; Henderson, Janet L.

    A study examined the agricultural technology system in Venezuela with emphasis on the dairy industry. An analytical framework was used to identify the strengths and weaknesses of the following components of Venezuela's agricultural technology system: policy, technology development, technology transfer, and technology use. Selected government…

  9. Applying systems engineering methodologies to the micro- and nanoscale realm

    NASA Astrophysics Data System (ADS)

    Garrison Darrin, M. Ann

    2012-06-01

Microscale and nanoscale technology developments have the potential to revolutionize smart and small systems. The application of systems engineering methodologies that integrate standalone, small-scale technologies and interface them with macro technologies to build useful systems is critical to realizing the potential of these technologies. This paper covers the expanding knowledge base on systems engineering principles for micro and nano technology integration, starting with a discussion of the drivers for applying a systems approach. Technology development on the micro and nano scale has transitioned from laboratory curiosity to the realization of products in the health, automotive, aerospace, communication, and numerous other arenas. This paper focuses on the maturity (or lack thereof) of the field of nanosystems, which is emerging in a third generation, having transitioned from creating active structures to creating systems. The emphasis on applying a systems approach reflects the lack of maturity of current nanoscale systems. The discussion therefore covers enabling roles such as product systems engineering and technology development; classical roles such as acquisition systems engineering are not covered. The results are targeted towards small-scale technology developers, who need to apply systems engineering processes such as requirements definition, verification and validation, interface management, and risk management in the concept phase of technology development to maximize the likelihood of cost-effective micro and nano technology that increases the capability of emerging deployed systems and supports long-term growth and profits.

  10. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, NOX CONTROL TECHNOLOGIES, CATALYTICA COMBUSTION SYSTEMS, INC., XONON FLAMELESS COMBUSTION SYSTEM

    EPA Science Inventory

    The Environmental Technology Verification report discusses the technology and performance of the Xonon Cool Combustion System manufactured by Catalytica Energy Systems, Inc., formerly Catalytica Combustion Systems, Inc., to control NOx emissions from gas turbines that operate wit...

  11. The Status of Spacecraft Bus and Platform Technology Development Under the NASA ISPT Program

    NASA Technical Reports Server (NTRS)

    Anderson, David; Munk, Michelle M.; Pencil, Eric; Dankanich, John; Glaab, Louis; Peterson, Todd

    2014-01-01

The In-Space Propulsion Technology (ISPT) program is developing spacecraft bus and platform technologies that will enable or enhance NASA robotic science missions. The ISPT program is currently developing technology in three areas that include Propulsion System Technologies, Entry Vehicle Technologies, and Systems Mission Analysis. ISPT's propulsion technologies include: 1) NASA's Evolutionary Xenon Thruster (NEXT) ion propulsion system, a 0.6-7 kW throttle-able gridded ion system; 2) a Hall-effect electric propulsion (HEP) system for sample return and low cost missions; 3) the Advanced Xenon Flow Control System (AXFS); ultra-lightweight propellant tank technologies (ULTT); and propulsion technologies for a Mars Ascent Vehicle (MAV). The AXFS and ULTT are two component technologies being developed with nearer-term flight infusion in mind, whereas NEXT and the HEP are being developed as EP systems. ISPT's entry vehicle technologies are: 1) Aerocapture technology development with investments in a family of thermal protection system (TPS) materials and structures; guidance, navigation, and control (GNC) models of blunt-body rigid aeroshells; and aerothermal effect models; and 2) Multi-mission technologies for Earth Entry Vehicles (MMEEV) for sample return missions. The Systems Mission Analysis area is focused on developing tools and assessing the application of propulsion, entry vehicle, and spacecraft bus technologies to a wide variety of mission concepts. Several of the ISPT technologies are related to sample return missions and other spacecraft bus technology needs like: MAV propulsion, MMEEV, and electric propulsion. These technologies, as well as Aerocapture, are more vehicle and mission-focused, and present a different set of technology development challenges. These in-space propulsion technologies are applicable, and potentially enabling, for future NASA Discovery, New Frontiers, Flagship and sample return missions currently under consideration.
This paper provides a brief overview of the ISPT program, describing the development status and technology infusion readiness.

  12. Trend of Autonomous Decentralized System Technologies and Their Application in IC Card Ticket System

    NASA Astrophysics Data System (ADS)

    Mori, Kinji; Shiibashi, Akio

The advancement of technology is ensured by step-by-step innovation and its implementation into society. Autonomous Decentralized Systems (ADSs) have been growing since first proposed in 1977. Since then, the ADS technologies and their implementations have interacted with the evolving markets, sciences, and technologies. The ADS concept was proposed by analogy with biological systems, and its technologies have advanced according to changing and expanding requirements. These technologies are now categorized into six generations on the basis of requirements and system structures, but the ADS concept and its system architecture have not changed. The requirements for the system can be divided into operation-oriented, mass service-oriented, and personal service-oriented categories. Moreover, these technologies have been realized in homogeneous system structures and, as the next step, in heterogeneous system structures. These technologies have been widely applied in manufacturing, telecommunications, information provision/utilization, data centers, transportation, and so on. They have been operating successfully throughout the world. In particular, ADS technologies have been applied in Suica, the IC card ticket system (ICCTS) for fare collection and e-commerce. This system is not only expanding in size and functionality; its components are also being modified almost every day without stopping its operation. This system and its technologies are presented here. Finally, the future direction of ADS is discussed, and one of its technologies is presented.

  13. Large Space Antenna Systems Technology, 1984

    NASA Technical Reports Server (NTRS)

    Boyer, W. J. (Compiler)

    1985-01-01

    Mission applications for large space antenna systems; large space antenna structural systems; materials and structures technology; structural dynamics and control technology, electromagnetics technology, large space antenna systems and the Space Station; and flight test and evaluation were examined.

  14. 49 CFR 232.503 - Process to introduce new brake system technology.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 4 2013-10-01 2013-10-01 false Process to introduce new brake system technology... Technology § 232.503 Process to introduce new brake system technology. (a) Pursuant to the procedures... brake system technology, prior to implementing the plan. (b) Each railroad shall complete a pre-revenue...

  15. 49 CFR 232.503 - Process to introduce new brake system technology.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 4 2014-10-01 2014-10-01 false Process to introduce new brake system technology... Technology § 232.503 Process to introduce new brake system technology. (a) Pursuant to the procedures... brake system technology, prior to implementing the plan. (b) Each railroad shall complete a pre-revenue...

  16. 49 CFR 232.503 - Process to introduce new brake system technology.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 4 2012-10-01 2012-10-01 false Process to introduce new brake system technology... Technology § 232.503 Process to introduce new brake system technology. (a) Pursuant to the procedures... brake system technology, prior to implementing the plan. (b) Each railroad shall complete a pre-revenue...

  17. 49 CFR 232.503 - Process to introduce new brake system technology.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 4 2011-10-01 2011-10-01 false Process to introduce new brake system technology... Technology § 232.503 Process to introduce new brake system technology. (a) Pursuant to the procedures... brake system technology, prior to implementing the plan. (b) Each railroad shall complete a pre-revenue...

  18. 49 CFR 232.503 - Process to introduce new brake system technology.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false Process to introduce new brake system technology... Technology § 232.503 Process to introduce new brake system technology. (a) Pursuant to the procedures... brake system technology, prior to implementing the plan. (b) Each railroad shall complete a pre-revenue...

  19. Large Space Antenna Systems Technology, 1984

    NASA Technical Reports Server (NTRS)

    Boyer, W. J. (Compiler)

    1985-01-01

    Papers are presented which provide a comprehensive review of space missions requiring large antenna systems and of the status of key technologies required to enable these missions. Topic areas include mission applications for large space antenna systems, large space antenna structural systems, materials and structures technology, structural dynamics and control technology, electromagnetics technology, large space antenna systems and the space station, and flight test and evaluation.

  20. System-of-Systems Technology-Portfolio-Analysis Tool

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel; Mankins, John; Feingold, Harvey; Johnson, Wayne

    2012-01-01

    Advanced Technology Life-cycle Analysis System (ATLAS) is a system-of-systems technology-portfolio-analysis software tool. ATLAS affords capabilities to (1) compare estimates of the mass and cost of an engineering system based on competing technological concepts; (2) estimate life-cycle costs of an outer-space-exploration architecture for a specified technology portfolio; (3) collect data on state-of-the-art and forecasted technology performance, and on operations and programs; and (4) calculate an index of the relative programmatic value of a technology portfolio. ATLAS facilitates analysis by providing a library of analytical spreadsheet models for a variety of systems. A single analyst can assemble a representation of a system of systems from the models and build a technology portfolio. Each system model estimates mass, and life-cycle costs are estimated by a common set of cost models. Other components of ATLAS include graphical-user-interface (GUI) software, algorithms for calculating the aforementioned index, a technology database, a report generator, and a form generator for creating the GUI for the system models. At the time of this reporting, ATLAS is a prototype, embodied in Microsoft Excel and several thousand lines of Visual Basic for Applications that run on both Windows and Macintosh computers.
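The "index of the relative programmatic value of a technology portfolio" can be illustrated as a weighted figure of merit over normalized metrics, scaled to the best entry. The metric names, weights, and scoring scheme below are hypothetical, not ATLAS's actual algorithm:

```python
def portfolio_index(technologies, weights):
    """Relative programmatic value: weighted sum of normalized metrics,
    scaled so the best-scoring portfolio entry defines 1.0.
    (Illustrative scheme only; not the published ATLAS index.)"""
    scores = {}
    for name, metrics in technologies.items():
        scores[name] = sum(weights[m] * v for m, v in metrics.items())
    top = max(scores.values())
    return {name: s / top for name, s in scores.items()}

# Hypothetical technologies with metrics already normalized to [0, 1]
techs = {
    "solar_electric": {"mass_saving": 0.8, "cost_saving": 0.4, "readiness": 0.6},
    "cryo_storage":   {"mass_saving": 0.6, "cost_saving": 0.7, "readiness": 0.9},
}
weights = {"mass_saving": 0.5, "cost_saving": 0.3, "readiness": 0.2}
index = portfolio_index(techs, weights)
```

Normalizing to the best entry makes the index comparative rather than absolute, which matches the tool's stated purpose of comparing competing technological concepts.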

  1. An Assessment of Integrated Flywheel System Technology

    NASA Technical Reports Server (NTRS)

    Keckler, C. R. (Editor); Bechtel, R. T. (Editor); Groom, N. J. (Editor)

    1984-01-01

The current state of the technology in flywheel storage systems and ancillary components, shortfalls in that technology in light of future requirements, and the technology development needed to rectify those shortfalls were identified. Technology efforts conducted in Europe and in the United States were reviewed. Results of developments in composite material rotors, magnetic suspension systems, motor/generators and electronics, and system dynamics and control were presented. The technology issues for the various disciplines and technology enhancement scenarios are discussed. A summary of the workshop, and conclusions and recommendations, are presented.

  2. SSTAC/ARTS review of the draft Integrated Technology Plan (ITP). Volume 5: Human Support

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Viewgraphs of briefings from the Space Systems and Technology Advisory Committee (SSTAC)/ARTS review of the draft integrated technology plan (ITP) on human support are included. Topics covered include: human support program; human factors; life support technology; fire safety; medical support technology; advanced refrigeration technology; EVA suit system; advanced PLSS technology; and ARC-EVA systems research program.

  3. Design and realization of intelligent tourism service system based on voice interaction

    NASA Astrophysics Data System (ADS)

    Hu, Lei-di; Long, Yi; Qian, Cheng-yang; Zhang, Ling; Lv, Guo-nian

    2008-10-01

Voice technology is one of the important means of improving the intelligence and humanization of tourism service systems. Combining voice technology with application needs and system composition, the paper presents an overall framework for an intelligent tourism service system consisting of a presentation layer, a Web services layer, and a tourism application service layer. On this basis, the paper further elaborates the implementation of the system and its key technologies, including intelligent voice interaction technology, seamless integration technology for multiple data sources, location-perception-based guide services technology, and tourism safety control technology. Finally, based on the situation of Nanjing tourism, a prototype of the tourism services system is realized.

  4. Impact of digital systems technology on man-vehicle systems research

    NASA Technical Reports Server (NTRS)

    Bretoi, R. N.

    1983-01-01

    The present study, based on a NASA technology assessment, examines the effect of new technologies on trends in crew-systems design and their implications from the vantage point of man-vehicle systems research. Those technologies that are most relevant to future trends in crew-systems design are considered along with problems associated with the introduction of rapidly changing technologies and systems concepts from a human-factors point of view. The technologies discussed include information processing, displays and controls, flight and propulsion control, flight and systems management, air traffic control, training and simulation, and flight and resource management. The historical evolution of cockpit systems design is used to illustrate past and possible future trends in man-vehicle systems research.

  5. 2 CFR 200.58 - Information technology systems.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 2 Grants and Agreements 1 2014-01-01 2014-01-01 false Information technology systems. 200.58..., AND AUDIT REQUIREMENTS FOR FEDERAL AWARDS Acronyms and Definitions Acronyms § 200.58 Information technology systems. Information technology systems means computing devices, ancillary equipment, software...

  6. Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    Doyle, Monica; O'Neil, Daniel A.; Christensen, Carissa B.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS) is a decision support tool designed to aid program managers and strategic planners in determining how to invest technology research and development dollars. It is an Excel-based modeling package that allows a user to build complex space architectures and evaluate the impact of various technology choices. ATLAS contains system models, cost and operations models, a campaign timeline and a centralized technology database. Technology data for all system models is drawn from a common database, the ATLAS Technology Tool Box (TTB). The TTB provides a comprehensive, architecture-independent technology database that is keyed to current and future timeframes.

  7. The Search for Lensed Supernovae

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2017-01-01

    Type Ia supernovae that have multiple images due to gravitational lensing can provide us with a wealth of information both about the supernovae themselves and about our surrounding universe. But how can we find these rare explosions?

    Clues from Multiple Images

    When light from a distant object passes by a massive foreground galaxy, the galaxy's strong gravitational pull can bend the light, distorting our view of the background object. In severe cases, this process can cause multiple images of the distant object to appear in the foreground lensing galaxy. [Figure caption: An illustration of gravitational lensing. Light from the distant supernova is bent as it passes through a giant elliptical galaxy in the foreground, causing multiple images of the supernova to appear to be hosted by the elliptical galaxy. Adapted from an image by NASA/ESA/A. Feild (STScI).]

    Observations of multiply-imaged Type Ia supernovae (explosions that occur when white dwarfs in binary systems exceed their maximum allowed mass) could answer a number of astronomical questions. Because Type Ia supernovae are standard candles, distant, lensed Type Ia supernovae can be used to extend the Hubble diagram to high redshifts. Furthermore, the lensing time delays from the multiply-imaged explosion can provide high-precision constraints on cosmological parameters.

    The catch? So far, we've only found one multiply-imaged Type Ia supernova: iPTF16geu, discovered late last year. We're going to need a lot more of them to develop a useful sample! So how do we identify the multiply-imaged Type Ias among the many billions of fleeting events discovered in current and future surveys of transients?

    Searching for Anomalies

    [Figure caption: Absolute magnitudes for Type Ia supernovae in elliptical galaxies. None are expected to be brighter than -20 in the B band, so if we infer an absolute magnitude for a Type Ia supernova brighter than this, it's probably not hosted by the galaxy we think it is! Goldstein & Nugent 2017.]

    Two scientists from the University of California, Berkeley and Lawrence Berkeley National Laboratory have a plan. In a recent publication, Daniel Goldstein and Peter Nugent propose the following clever procedure to apply to data from transient surveys:

    1. From the data, select only the supernova candidates that appear to be hosted by quiescent elliptical galaxies.
    2. Use the host galaxies' photometric redshifts to calculate absolute magnitudes for the supernovae in this sample.
    3. Select from this only the supernovae above the maximum absolute magnitude expected for Type Ia supernovae.

    Supernovae selected in this way are likely tricking us: their apparent hosts are probably not their hosts at all! Instead, the supernova is likely behind the galaxy, and the galaxy is just lensing its light. Using this strategy therefore allows us to select supernova candidates that are most likely to be distant, gravitationally lensed Type Ia supernovae. [Figure caption: Redshift distribution of the multiply-imaged Type Ia supernovae the authors estimate will be detectable by ZTF and LSST in their respective 3- and 10-year survey durations. Goldstein & Nugent 2017.]

    A convenient aspect of Goldstein and Nugent's technique is that we don't need to be able to resolve the lensed multiple images for discovery. This is useful, because ground-based optical surveys don't have the resolution to see the separate images, yet they'll still be useful for discovering multiply-imaged supernovae.

    Future Prospects

    How useful? Goldstein and Nugent use Monte Carlo simulations to estimate how many multiply-imaged Type Ia supernovae will be discoverable with future survey projects. They find that the Zwicky Transient Facility (ZTF), which will begin operating this year, should be able to find up to 10 using this technique in a 3-year search. The Large Synoptic Survey Telescope (LSST), which should start operating in 2022, will be able to find around 500 multiply-imaged Type Ia supernovae in a 10-year survey.

    Citation

    Daniel A. Goldstein and Peter E. Nugent 2017 ApJL 834 L5. doi:10.3847/2041-8213/834/1/L5
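    The selection cut described above amounts to a simple photometric calculation. Below is a minimal sketch of it in Python; this is an illustrative reconstruction, not Goldstein and Nugent's actual pipeline, and the flat-ΛCDM cosmology, the -20 threshold, and all function names are assumptions.

```python
import math

# Hedged sketch of the magnitude-based lensed-SN cut. The cosmology,
# threshold, and names here are illustrative assumptions.
H0 = 70.0               # Hubble constant, km/s/Mpc
OMEGA_M = 0.3           # matter density parameter
C_KM_S = 299792.458     # speed of light, km/s
MAX_IA_ABS_MAG = -20.0  # brightest B-band magnitude expected for an unlensed SN Ia

def luminosity_distance_mpc(z, steps=1000):
    """Luminosity distance in Mpc (flat LCDM, trapezoidal integration)."""
    def inv_e(zp):
        return 1.0 / math.sqrt(OMEGA_M * (1.0 + zp) ** 3 + (1.0 - OMEGA_M))
    dz = z / steps
    integral = 0.5 * (inv_e(0.0) + inv_e(z))
    for i in range(1, steps):
        integral += inv_e(i * dz)
    comoving_mpc = (C_KM_S / H0) * integral * dz
    return (1.0 + z) * comoving_mpc

def absolute_magnitude(apparent_mag, z):
    """Absolute magnitude inferred from the apparent host's photometric redshift."""
    distance_modulus = 5.0 * math.log10(luminosity_distance_mpc(z) * 1e6 / 10.0)
    return apparent_mag - distance_modulus

def is_lensed_candidate(apparent_mag, photo_z):
    """True if the SN appears too luminous for an unlensed Type Ia at the
    apparent host's redshift -- a hint it is lensed from behind the host."""
    return absolute_magnitude(apparent_mag, photo_z) < MAX_IA_ABS_MAG
```

    Under these assumptions, a supernova of apparent magnitude 21 apparently hosted by an elliptical at photo-z = 0.4 is flagged (inferred M ≈ -20.7), while one of magnitude 22.5 at the same redshift is not.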

  8. A review on technologies and their usage in solid waste monitoring and management systems: Issues and challenges.

    PubMed

    Hannan, M A; Abdulla Al Mamun, Md; Hussain, Aini; Basri, Hassan; Begum, R A

    2015-09-01

    Against the backdrop of rapid advancement, information and communication technology (ICT) has become an inevitable part of planning and designing modern solid waste management (SWM) systems. This study presents a critical review of the existing ICTs and their usage in SWM systems to unfold the issues and challenges of moving toward an integrated, technology-based system. To plan, monitor, collect and manage solid waste, the ICTs are divided into four categories: spatial technologies, identification technologies, data acquisition technologies and data communication technologies. The ICT-based SWM systems classified in this paper are based on the first three technologies, while the fourth is employed by almost every system. This review may guide the reader through the basics of available ICTs and their application in SWM to facilitate the planning and design of a sustainable new system. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Space station systems technology study (add-on task). Volume 2: Trade study and technology selection

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The current Space Station Systems Technology Study add-on task was an outgrowth of the Advanced Platform Systems Technology Study (APSTS) completed in April 1983 and the subsequent Space Station Systems Technology Study completed in April 1984. The first APSTS proceeded from the identification of 106 technology topics to the selection of five for detailed trade studies. During the advanced platform study, the technical issues and options were evaluated through detailed trade processes, individual consideration was given to costs and benefits for the technologies identified for advancement, and advancement plans were developed. A similar approach was used in the subsequent study, with emphasis on system definition in four specific technology areas to facilitate a more in-depth analysis of technology issues.

  10. The Status of Spacecraft Bus and Platform Technology Development under the NASA ISPT Program

    NASA Technical Reports Server (NTRS)

    Anderson, David J.; Munk, Michelle M.; Pencil, Eric; Dankanich, John; Glaab, Louis; Peterson, Todd

    2013-01-01

    The In-Space Propulsion Technology (ISPT) program is developing spacecraft bus and platform technologies that will enable or enhance NASA robotic science missions. The ISPT program is currently developing technology in four areas that include Propulsion System Technologies (electric and chemical), Entry Vehicle Technologies (aerocapture and Earth entry vehicles), Spacecraft Bus and Sample Return Propulsion Technologies (components and ascent vehicles), and Systems/Mission Analysis. Three technologies are ready for near-term flight infusion: 1) the high-temperature Advanced Material Bipropellant Rocket (AMBR) engine providing higher performance; 2) NASA's Evolutionary Xenon Thruster (NEXT) ion propulsion system, a 0.6-7 kW throttle-able gridded ion system; and 3) Aerocapture technology development with investments in a family of thermal protection system (TPS) materials and structures; guidance, navigation, and control (GN&C) models of blunt-body rigid aeroshells; and aerothermal effect models. Two component technologies being developed with flight infusion in mind are the Advanced Xenon Flow Control System and ultralightweight propellant tank technologies. Future directions for ISPT are technologies that relate to sample return missions and other spacecraft bus technology needs like: 1) Mars Ascent Vehicles (MAV); 2) multi-mission technologies for Earth Entry Vehicles (MMEEV); and 3) electric propulsion. These technologies are more vehicle- and mission-focused, and present a different set of technology development and infusion steps beyond those previously implemented. The Systems/Mission Analysis area is focused on developing tools and assessing the application of propulsion and spacecraft bus technologies to a wide variety of mission concepts. 
These in-space propulsion technologies are applicable, and potentially enabling, for future NASA Discovery, New Frontiers, and sample return missions currently under consideration, as well as having broad applicability to potential Flagship missions. This paper provides a brief overview of the ISPT program, describing the development status and technology infusion readiness of in-space propulsion technologies in the areas of electric propulsion, aerocapture, Earth entry vehicles, propulsion components, Mars ascent vehicle, and mission/systems analysis.

  11. The status of spacecraft bus and platform technology development under the NASA ISPT program

    NASA Astrophysics Data System (ADS)

    Anderson, D. J.; Munk, M. M.; Pencil, E.; Dankanich, J.; Glaab, L.; Peterson, T.

    The In-Space Propulsion Technology (ISPT) program is developing spacecraft bus and platform technologies that will enable or enhance NASA robotic science missions. The ISPT program is currently developing technology in four areas that include Propulsion System Technologies (electric and chemical), Entry Vehicle Technologies (aerocapture and Earth entry vehicles), Spacecraft Bus and Sample Return Propulsion Technologies (components and ascent vehicles), and Systems/Mission Analysis. Three technologies are ready for near-term flight infusion: 1) the high-temperature Advanced Material Bipropellant Rocket (AMBR) engine providing higher performance; 2) NASA's Evolutionary Xenon Thruster (NEXT) ion propulsion system, a 0.6-7 kW throttle-able gridded ion system; and 3) Aerocapture technology development with investments in a family of thermal protection system (TPS) materials and structures; guidance, navigation, and control (GN&C) models of blunt-body rigid aeroshells; and aerothermal effect models. Two component technologies being developed with flight infusion in mind are the Advanced Xenon Flow Control System and ultra-lightweight propellant tank technologies. Future directions for ISPT are technologies that relate to sample return missions and other spacecraft bus technology needs like: 1) Mars Ascent Vehicles (MAV); 2) multi-mission technologies for Earth Entry Vehicles (MMEEV); and 3) electric propulsion. These technologies are more vehicle- and mission-focused, and present a different set of technology development and infusion steps beyond those previously implemented. The Systems/Mission Analysis area is focused on developing tools and assessing the application of propulsion and spacecraft bus technologies to a wide variety of mission concepts. 
These in-space propulsion technologies are applicable, and potentially enabling, for future NASA Discovery, New Frontiers, and sample return missions currently under consideration, as well as having broad applicability to potential Flagship missions. This paper provides a brief overview of the ISPT program, describing the development status and technology infusion readiness of in-space propulsion technologies in the areas of electric propulsion, aerocapture, Earth entry vehicles, propulsion components, Mars ascent vehicle, and mission/systems analysis.

  12. The Status of Spacecraft Bus and Platform Technology Development Under the NASA ISPT Program

    NASA Technical Reports Server (NTRS)

    Anderson, David J.; Munk, Michelle M.; Pencil, Eric J.; Dankanich, John; Glaab, Louis J.

    2013-01-01

    The In-Space Propulsion Technology (ISPT) program is developing spacecraft bus and platform technologies that will enable or enhance NASA robotic science missions. The ISPT program is currently developing technology in four areas that include Propulsion System Technologies (electric and chemical), Entry Vehicle Technologies (aerocapture and Earth entry vehicles), Spacecraft Bus and Sample Return Propulsion Technologies (components and ascent vehicles), and Systems/Mission Analysis. Three technologies are ready for near-term flight infusion: 1) the high-temperature Advanced Material Bipropellant Rocket (AMBR) engine providing higher performance; 2) NASA's Evolutionary Xenon Thruster (NEXT) ion propulsion system, a 0.6-7 kW throttle-able gridded ion system; and 3) Aerocapture technology development with investments in a family of thermal protection system (TPS) materials and structures; guidance, navigation, and control (GN&C) models of blunt-body rigid aeroshells; and aerothermal effect models. Two component technologies being developed with flight infusion in mind are the Advanced Xenon Flow Control System and ultra-lightweight propellant tank technologies. Future directions for ISPT are technologies that relate to sample return missions and other spacecraft bus technology needs like: 1) Mars Ascent Vehicles (MAV); 2) multi-mission technologies for Earth Entry Vehicles (MMEEV); and 3) electric propulsion. These technologies are more vehicle- and mission-focused, and present a different set of technology development and infusion steps beyond those previously implemented. The Systems/Mission Analysis area is focused on developing tools and assessing the application of propulsion and spacecraft bus technologies to a wide variety of mission concepts. 
These in-space propulsion technologies are applicable, and potentially enabling for future NASA Discovery, New Frontiers, and sample return missions currently under consideration, as well as having broad applicability to potential Flagship missions. This paper provides a brief overview of the ISPT program, describing the development status and technology infusion readiness of in-space propulsion technologies in the areas of electric propulsion, Aerocapture, Earth entry vehicles, propulsion components, Mars ascent vehicle, and mission/systems analysis.

  13. System Study: Technology Assessment and Prioritizing

    NASA Technical Reports Server (NTRS)

    2005-01-01

    The objective of this NASA-funded project is to assess and prioritize advanced technologies required to achieve the goals for an "Intelligent Propulsion System" through collaboration among GEAE, NASA, and Georgia Tech. Key GEAE deliverables are parametric response surface equations (RSEs) relating technology features to system benefits (sfc, weight, fuel burn, design range, acoustics, emissions, etc.) and a Technology Impact Matrix (TIM) listing benefits, debits, and approximate readiness status. The TIM has been completed for GEAE- and NASA-proposed technologies; the combined GEAE and NASA TIM input requirement is shown in Table 1. In the course of building the RSEs and the TIM, significant improvements were made in parametric technology modeling and RSE accuracy. GEAE has also done a preliminary ranking of the technologies using technology evaluation tools developed by Georgia Tech/GEAE USA. System-level impact was assessed by combining beneficial technologies with minimum conflict among the various system figures of merit to evaluate their overall benefit to the system. The shortfalls and issues with modeling the proposed technologies are identified, and recommendations for future work are proposed.
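    A response surface equation of the kind described above is typically a low-order polynomial in the technology features. A minimal sketch of evaluating one follows; all coefficients, units, and names here are purely hypothetical illustrations, not GEAE or NASA values.

```python
# Hedged sketch of evaluating a parametric response surface equation (RSE):
# a quadratic polynomial mapping normalized technology features x_i to a
# system benefit (e.g., delta-sfc in percent). Coefficients are hypothetical.

def rse_value(x, beta0, beta, beta2):
    """y = beta0 + sum_i beta[i]*x[i] + sum_{(i,j)} beta2[(i,j)]*x[i]*x[j]."""
    y = beta0
    for i, b in enumerate(beta):
        y += b * x[i]                 # linear contribution of feature i
    for (i, j), b in beta2.items():
        y += b * x[i] * x[j]          # quadratic / interaction contribution
    return y

# Hypothetical fitted coefficients for two technology features
BETA0 = -1.0                                      # baseline delta-sfc (%)
BETA = [-0.8, -0.5]                               # linear terms
BETA2 = {(0, 0): 0.1, (0, 1): 0.05, (1, 1): 0.2}  # quadratic/interaction terms
```

    Evaluating at x = [0, 0] returns the baseline -1.0; at x = [1, 1] it returns -1.95, i.e. the combined benefit is slightly sub-additive because of the positive interaction terms.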

  14. Avionics systems integration technology

    NASA Technical Reports Server (NTRS)

    Stech, George; Williams, James R.

    1988-01-01

    A very dramatic and continuing explosion in digital electronics technology has been taking place in the last decade. The prudent and timely application of this technology will provide Army aviation the capability to prevail against a numerically superior enemy threat. The Army and NASA have exploited this technology explosion in the development and application of avionics systems integration technology for new and future aviation systems. A few selected Army avionics integration technology base efforts are discussed. Also discussed is the Avionics Integration Research Laboratory (AIRLAB) that NASA has established at Langley for research into the integration and validation of avionics systems, and evaluation of advanced technology in a total systems context.

  15. Astrophysics space systems critical technology needs

    NASA Technical Reports Server (NTRS)

    Gartrell, C. F.

    1982-01-01

    This paper addresses an independent assessment of space system technology needs for future astrophysics flight programs contained within the NASA Space Systems Technology Model. The critical examination of the system needs for the approximately 30 flight programs in the model are compared to independent technology forecasts and possible technology deficits are discussed. These deficits impact the developments needed for spacecraft propulsion, power, materials, structures, navigation, guidance and control, sensors, communications and data processing. There are also associated impacts upon in-orbit assembly technology and space transportation systems. A number of under-utilized technologies are highlighted which could be exploited to reduce cost and enhance scientific return.

  16. Advanced-technology space station study: Summary of systems and pacing technologies

    NASA Technical Reports Server (NTRS)

    Butterfield, A. J.; Garn, P. A.; King, C. B.; Queijo, M. J.

    1990-01-01

    The principal system features defined for the Advanced Technology Space Station are summarized and the 21 pacing technologies identified during the course of the study are described. The descriptions of system configurations were extracted from four previous study reports. The technological areas focus on those systems particular to all large spacecraft which generate artificial gravity by rotation. The summary includes a listing of the functions, crew requirements and electrical power demand that led to the studied configuration. The pacing technologies include the benefits of advanced materials, in-orbit assembly requirements, stationkeeping, evaluations of electrical power generation alternatives, and life support systems. The descriptions of systems show the potential for synergies and identify the beneficial interactions that can result from technological advances.

  17. Large Space Antenna Systems Technology, part 1

    NASA Technical Reports Server (NTRS)

    Lightner, E. B. (Compiler)

    1983-01-01

    A compilation of the unclassified papers presented at the NASA Conference on Large Space Antenna Systems Technology covers the following areas: systems, structures technology, control technology, electromagnetics, and space flight test and evaluation.

  18. System Architecture Modeling for Technology Portfolio Management using ATLAS

    NASA Technical Reports Server (NTRS)

    Thompson, Robert W.; O'Neil, Daniel A.

    2006-01-01

    Strategic planners and technology portfolio managers have traditionally relied on consensus-based tools, such as the Analytical Hierarchy Process (AHP) and Quality Function Deployment (QFD), in planning the funding of technology development. While useful to a certain extent, these tools are limited in their ability to fully quantify the impact of a technology choice on system mass, system reliability, project schedule, and lifecycle cost. The Advanced Technology Lifecycle Analysis System (ATLAS) aims to provide strategic planners a decision support tool for analyzing technology selections within a Space Exploration Architecture (SEA). Using ATLAS, strategic planners can select physics-based system models from a library, configure the systems with technologies and performance parameters, and plan the deployment of a SEA. Key parameters for current and future technologies have been collected from subject-matter experts and other documented sources in the Technology Tool Box (TTB). ATLAS can be used to compare the technical feasibility and economic viability of a set of technology choices for one SEA, and compare it against another set of technology choices or another SEA. System architecture modeling in ATLAS is a multi-step process. First, the modeler defines the system-level requirements. Second, the modeler identifies the technologies of interest and their potential impact on the SEA. Third, the system modeling team creates models of architecture elements (e.g. launch vehicles, in-space transfer vehicles, crew vehicles) if they are not already in the model library. Finally, the architecture modeler develops a script for the ATLAS tool to run, and the results for comparison are generated.
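    The modeling flow described above (pick system models, configure them with technologies, compare architectures) can be sketched in a few lines. This is a hypothetical illustration of the idea only; the class names, fields, and multiplier scheme are assumptions, not the actual ATLAS data model.

```python
from dataclasses import dataclass, field

# Hedged sketch of an ATLAS-style comparison: architecture elements carry
# baseline mass/cost, technologies modify them, and two configurations can
# then be compared. All names and numbers are illustrative assumptions.

@dataclass
class Technology:
    name: str
    mass_factor: float   # multiplier applied to an element's mass
    cost_factor: float   # multiplier applied to an element's cost

@dataclass
class Element:
    name: str
    base_mass_kg: float
    base_cost_musd: float
    technologies: list = field(default_factory=list)

    def mass_kg(self):
        m = self.base_mass_kg
        for t in self.technologies:
            m *= t.mass_factor
        return m

    def cost_musd(self):
        c = self.base_cost_musd
        for t in self.technologies:
            c *= t.cost_factor
        return c

def architecture_totals(elements):
    """Roll element masses and costs up to architecture level."""
    return (sum(e.mass_kg() for e in elements),
            sum(e.cost_musd() for e in elements))
```

    With a hypothetical 1000 kg, $200M lander element and a "composite tanks" technology (mass factor 0.9, cost factor 1.1), the totals come out to 900 kg and $220M; swapping technology sets and re-running the roll-up is the comparison step.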

  19. Systems thinking for assistive technology: a commentary on the GREAT summit.

    PubMed

    MacLachlan, Malcolm; Scherer, Marcia J

    2018-07-01

    The area of assistive technology has a long history of technological ingenuity and innovation. In order to ensure that the benefits of assistive technology are equitably distributed across the population and life course, it is necessary to adopt a systemic approach to the area. We describe examples of systems thinking and non-systems thinking across 10 Ps. These Ps are People (or users, as the primary beneficiaries of assistive technology), Policy, Products, Personnel, Provision (as key strategic drivers at systems level); and Procurement, Place, Pace, Promotion and Partnership (as key situational factors for systems). Together these Ps should constitute a framework for an "open" system that can evolve and adapt, that empowers users, inter-connects key components and locates these in the reality of differing contexts. The adoption of a stronger systems thinking perspective within the assistive technology field should allow for more equitable, more resilient and more sustainable assistive technology across high-, middle- and low-income contexts and countries. Implications for Rehabilitation: The progress of assistive technology provision has been hampered by disconnected initiatives and activities, and this needs to be corrected. Systems thinking is a way of thinking about the connections between things and how these are influenced by contextual and other factors. By encouraging the providers and users of assistive technology to think more systemically, we can provide a more cohesive and resilient system. The user experience is the central component of systems thinking in assistive technologies.

  20. NASA In-Space Propulsion Technologies and Their Infusion Potential

    NASA Technical Reports Server (NTRS)

    Anderson, David J.; Pencil, Eric J.; Peterson, Todd; Vento, Daniel; Munk, Michelle M.; Glaab, Louis J.; Dankanich, John W.

    2012-01-01

    The In-Space Propulsion Technology (ISPT) program has been developing in-space propulsion technologies that will enable or enhance NASA robotic science missions. The ISPT program is currently developing technology in four areas that include Propulsion System Technologies (electric and chemical), Entry Vehicle Technologies (aerocapture and Earth entry vehicles), Spacecraft Bus and Sample Return Propulsion Technologies (components and ascent vehicles), and Systems/Mission Analysis. Three technologies are ready for flight infusion: 1) the high-temperature Advanced Material Bipropellant Rocket (AMBR) engine providing higher performance; 2) NASA's Evolutionary Xenon Thruster (NEXT) ion propulsion system, a 0.6-7 kW throttle-able gridded ion system; and 3) Aerocapture technology development with investments in a family of thermal protection system (TPS) materials and structures; guidance, navigation, and control (GN&C) models of blunt-body rigid aeroshells; and aerothermal effect models. Two component technologies that will be ready for flight infusion in the near future are the Advanced Xenon Flow Control System and ultra-lightweight propellant tank technologies. Future focuses for ISPT are sample return missions and other spacecraft bus technologies like: 1) Mars Ascent Vehicles (MAV); 2) multi-mission technologies for Earth Entry Vehicles (MMEEV) for sample return missions; and 3) electric propulsion for sample return and low-cost missions. These technologies are more vehicle-focused, and present a different set of technology infusion challenges. Meanwhile, the Systems/Mission Analysis area is focused on developing tools and assessing the application of propulsion technologies to a wide variety of mission concepts. These in-space propulsion technologies are applicable, and potentially enabling, for future NASA Discovery, New Frontiers, and sample return missions currently under consideration, as well as having broad applicability to potential Flagship missions. 
This paper provides a brief overview of the ISPT program, describing the development status and technology infusion readiness of in-space propulsion technologies in the areas of electric propulsion, aerocapture, Earth entry vehicles, propulsion components, Mars ascent vehicle, and mission/systems analysis.

  1. 32 CFR 147.15 - Guideline M-Misuse of Information technology systems.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 32 National Defense 1 2012-07-01 2012-07-01 false Guideline M-Misuse of Information technology... CLASSIFIED INFORMATION Adjudication § 147.15 Guideline M—Misuse of Information technology systems. (a) The... ability to properly protect classified systems, networks, and information. Information Technology Systems...

  2. 32 CFR 147.15 - Guideline M-Misuse of Information technology systems.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 32 National Defense 1 2014-07-01 2014-07-01 false Guideline M-Misuse of Information technology... CLASSIFIED INFORMATION Adjudication § 147.15 Guideline M—Misuse of Information technology systems. (a) The... ability to properly protect classified systems, networks, and information. Information Technology Systems...

  3. 32 CFR 147.15 - Guideline M-Misuse of Information technology systems.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 32 National Defense 1 2013-07-01 2013-07-01 false Guideline M-Misuse of Information technology... CLASSIFIED INFORMATION Adjudication § 147.15 Guideline M—Misuse of Information technology systems. (a) The... ability to properly protect classified systems, networks, and information. Information Technology Systems...

  4. 32 CFR 147.15 - Guideline M-Misuse of Information technology systems.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 1 2011-07-01 2011-07-01 false Guideline M-Misuse of Information technology... CLASSIFIED INFORMATION Adjudication § 147.15 Guideline M—Misuse of Information technology systems. (a) The... ability to properly protect classified systems, networks, and information. Information Technology Systems...

  5. Dual-Use Space Technology Transfer Conference and Exhibition. Volume 2

    NASA Technical Reports Server (NTRS)

    Krishen, Kumar (Compiler)

    1994-01-01

    This is the second volume of papers presented at the Dual-Use Space Technology Transfer Conference and Exhibition held at the Johnson Space Center February 1-3, 1994. Possible technology transfers covered during the conference were in the areas of information access; innovative microwave and optical applications; materials and structures; marketing and barriers; intelligent systems; human factors and habitation; communications and data systems; business process and technology transfer; software engineering; biotechnology and advanced bioinstrumentation; communications signal processing and analysis; medical care; applications derived from control center data systems; human performance evaluation; technology transfer methods; mathematics, modeling, and simulation; propulsion; software analysis and decision tools; systems/processes in human support technology; networks, control centers, and distributed systems; power; rapid development; perception and vision technologies; integrated vehicle health management; automation technologies; advanced avionics; and robotics technologies.

  6. Technology for Space Station Evolution. Volume 2: Data Management System/Environmental Control and Life Support Systems

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA's Office of Aeronautics and Space Technology conducted a workshop on technology for space station evolution 16-19 Jan. 1990. The purpose of the workshop was to collect and clarify Space Station Freedom technology requirements for evolution and to describe technologies that can potentially fill those requirements. These proceedings are organized into an Executive Summary and Overview and five volumes containing the Technology Discipline Presentations. Volume 2 consists of the technology discipline sections for the Data Management System and the Environmental Control and Life Support Systems. For each technology discipline, there is a Level 3 subsystem description, along with the invited papers.

  7. Technology for Space Station Evolution. Volume 3: EVA/Manned Systems/Fluid Management System

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA's Office of Aeronautics and Space Technology (OAST) conducted a workshop on technology for space station evolution 16-19 Jan. 1990 in Dallas, Texas. The purpose of this workshop was to collect and clarify Space Station Freedom technology requirements for evolution and to describe technologies that can potentially fill those requirements. These proceedings are organized into an Executive Summary and Overview and five volumes containing the Technology Discipline Presentations. Volume 3 consists of the technology discipline sections for Extravehicular Activity/Manned Systems and the Fluid Management System. For each technology discipline, there is a Level 3 subsystem description, along with the papers.

  8. 75 FR 1446 - Future Systems Technology Advisory Panel Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-11

    ... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA-2010-0001] Future Systems Technology Advisory Panel... systems technology and electronic services at the agency five to ten years into the future. The Panel will recommend a road map to aid SSA in determining what future systems technologies may be developed to assist...

  9. A Brief Overview of NASA Glenn Research Center Sensor and Electronics Activities

    NASA Technical Reports Server (NTRS)

    Hunter, Gary W.

    2012-01-01

    Aerospace applications require a range of sensing technologies. A range of sensor and sensor-system technologies is being developed using microfabrication and micromachining technology to form smart sensor systems and intelligent microsystems, driving system intelligence to the local (sensor) level through distributed smart sensor systems. Sensor and sensor system development examples include: (1) thin-film physical sensors; (2) high-temperature electronics and wireless; and (3) "lick and stick" technology. NASA GRC is a world leader in aerospace sensor technology with a broad range of development and application experience, and its core microsystems technology is applicable to a range of application environments.

  10. Cross-correlating 2D and 3D galaxy surveys

    DOE PAGES

    Passaglia, Samuel; Manzotti, Alessandro; Dodelson, Scott

    2017-06-08

    Galaxy surveys probe both structure formation and the expansion rate, making them promising avenues for understanding the dark universe. Photometric surveys accurately map the 2D distribution of galaxy positions and shapes in a given redshift range, while spectroscopic surveys provide sparser 3D maps of the galaxy distribution. We present a way to analyse overlapping 2D and 3D maps jointly and without loss of information. We represent 3D maps using spherical Fourier-Bessel (sFB) modes, which preserve radial coverage while accounting for the spherical sky geometry, and we decompose 2D maps in a spherical harmonic basis. In these bases, a simple expression exists for the cross-correlation of the two fields. One very powerful application is the ability to simultaneously constrain the redshift distribution of the photometric sample, the sample biases, and cosmological parameters. We use our framework to show that combined analysis of DESI and LSST can improve cosmological constraints by factors of ~1.2 to ~1.8 on the region where they overlap, relative to identically sized disjoint regions. We also show that in the overlap of DES and SDSS-III in Stripe 82, cross-correlating improves photo-z parameter constraints by factors of ~2 to ~12 over internal photo-z reconstructions.
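
    The harmonic-space cross-correlation described above can be sketched in a few lines; the toy radial profile, multipole, and wavenumber below are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.special import spherical_jn

# Toy cross-spectrum estimator: for one multipole ell, given the spherical
# harmonic coefficients a_lm of a 2D map and the sFB coefficients b_lm(k) of
# a 3D map at a fixed radial wavenumber k, the cross-correlation is an
# average over the 2*ell+1 values of m.
def cross_cl(a_lm, b_lm):
    return np.mean(a_lm * np.conj(b_lm)).real

# sFB radial projection of a sampled density field (toy profile, not from
# the paper): f_ell(k) = integral of f(r) * j_ell(k r) * r^2 dr
r = np.linspace(0.0, 1.0, 512)
f = np.exp(-(r - 0.5) ** 2 / 0.02)      # illustrative radial profile
ell, k = 2, 10.0
dr = r[1] - r[0]
f_ell_k = np.sum(f * spherical_jn(ell, k * r) * r**2) * dr
```

    In a real analysis the coefficients would come from full-sky harmonic transforms of the galaxy maps; the sketch only shows the shape of the estimator.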

  11. WFIRST: Science from Deep Field Surveys

    NASA Astrophysics Data System (ADS)

    Koekemoer, Anton M.; Foley, Ryan; WFIRST Deep Field Working Group

    2018-06-01

    WFIRST will enable deep field imaging across much larger areas than those previously obtained with Hubble, opening up completely new areas of parameter space for extragalactic deep fields including cosmology, supernova and galaxy evolution science. The instantaneous field of view of the Wide Field Instrument (WFI) is about 0.3 square degrees, which would for example yield an Ultra Deep Field (UDF) reaching similar depths at visible and near-infrared wavelengths to that obtained with Hubble, over an area about 100-200 times larger, for a comparable investment in time. Moreover, wider fields on scales of 10-20 square degrees could achieve depths comparable to large HST surveys at medium depths such as GOODS and CANDELS, and would enable multi-epoch supernova science that could be matched in area to LSST Deep Drilling fields or other large survey areas. Such fields may benefit from being placed on locations in the sky that have ancillary multi-band imaging or spectroscopy from other facilities, from the ground or in space. The WFIRST Deep Fields Working Group has been examining the science considerations for various types of deep fields that may be obtained with WFIRST, and present here a summary of the various properties of different locations in the sky that may be considered for future deep fields with WFIRST.

  12. The Crucial Role of Amateur-Professional Networks in the Golden Age of Large Surveys (Abstract)

    NASA Astrophysics Data System (ADS)

    Rodriguez, J. E.

    2017-06-01

    (Abstract only) With ongoing projects such as HATNet, SuperWASP, KELT, MEarth, and the CoRoT and Kepler/K2 missions, we are in a golden era of large photometric surveys. In addition, LSST and TESS will be coming online in the next three to five years. The combination of all these projects will increase the number of photometrically monitored stars by orders of magnitude. It is expected that these surveys will enhance our knowledge of circumstellar architecture and the early stages of stellar and planetary formation, while providing a better understanding of exoplanet demographics. However, the success of these surveys will depend on simultaneous and continued follow-up by large networks. With federal scientific funding reduced over the past few years, the availability of astronomical observations has been directly affected. Fortunately, ground-based amateur-professional networks like the AAVSO and the KELT Follow-up Network (KELT-FUN) are already providing access to an international, independent resource for professional-grade astronomical observations. These networks have both multi-band photometric and spectroscopic capabilities. I provide an overview of the ongoing and future surveys, highlight past and current contributions by amateur-professional networks to scientific discovery, and discuss the role of these networks in upcoming projects.

  13. Host Galaxy Spectra of Stripped SN from the Palomar Transient Factory: SN Progenitor Diagnostics and the SN-GRB Connection

    NASA Astrophysics Data System (ADS)

    Modjaz, Maryam; Gal-Yam, Avishay; Arcavi, Iair

    2012-02-01

    Stripped core-collapse supernovae (stripped SNe) are powerful cosmic engines that energize and enrich the ISM and sometimes accompany GRBs, but the exact mass and metallicity range of their massive progenitors is not known, nor is the detailed physics of the explosion. We propose to continue conducting the first uniform and statistically significant study of host galaxies of 60 stripped SNe from the same innovative, homogeneous, and galaxy-unbiased survey, the Palomar Transient Factory, in order to determine the environmental conditions that influence the various kinds of massive stellar deaths. By obtaining spectra of the immediate host environments of our sample of stripped SNe, we will (1) measure local abundances in order to differentiate between the two progenitor scenarios for stripped SNe and (2) derive stellar population ages, masses, and star formation histories via detailed stellar population synthesis models. Moreover, we will test whether natal chemical abundance affects basic SN characteristics, such as peak luminosity. Any observed trends will have ramifications for SN and GRB explosion models and imply important demographic SN considerations. Our dataset will provide a crucial complementary set to host galaxy studies of long-duration GRBs and pave the way for host studies of transients and SNe found via upcoming surveys such as LSST.

  14. Host Galaxy Spectra of Stripped SN from the Palomar Transient Factory: SN Progenitor Diagnostics and the SN-GRB Connection

    NASA Astrophysics Data System (ADS)

    Modjaz, Maryam; Gal-Yam, Avishay; Arcavi, Iair

    2011-02-01

    Stripped core-collapse supernovae (stripped SNe) are powerful cosmic engines that energize and enrich the ISM and sometimes accompany GRBs, but the exact mass and metallicity range of their massive progenitors is not known, nor is the detailed physics of the explosion. With the harvest of 50 stripped SNe from the innovative survey Palomar Transient Factory, we propose to conduct the first uniform and statistically significant study with SNe from the same homogeneous and galaxy-unbiased survey, in order to determine the environmental conditions that influence the various kinds of massive stellar deaths. By obtaining spectra of the immediate host environments of our sample of stripped SNe, we will (1) measure local abundances in order to differentiate between the two progenitor scenarios for stripped SNe and (2) derive stellar population ages, masses, and star formation histories via detailed stellar population synthesis models. Moreover, we will test whether natal chemical abundance affects basic SN characteristics, such as peak luminosity. Any observed trends will have ramifications for SN and GRB explosion models and imply important demographic SN considerations. Our dataset will provide a crucial complementary set to host galaxy studies of long-duration GRBs and pave the way for host studies of transients and SNe found via upcoming surveys such as LSST.

  15. Sub-percent Photometry: Faint DA White Dwarf Spectrophotometric Standards for Astrophysical Observatories

    NASA Astrophysics Data System (ADS)

    Narayan, Gautham; Axelrod, Tim; Calamida, Annalisa; Saha, Abhijit; Matheson, Thomas; Olszewski, Edward; Holberg, Jay; Bohlin, Ralph; Stubbs, Christopher W.; Rest, Armin; Deustua, Susana; Sabbi, Elena; MacKenty, John W.; Points, Sean D.; Hubeny, Ivan

    2018-01-01

    We have established a network of faint (16.5 < V < 19) hot DA white dwarfs as spectrophotometric standards for present and future wide-field observatories. Our standards are accessible from both hemispheres and suitable for ground- and space-based observatories covering the UV to the near-IR. The network is tied directly to the most precise astrophysical reference presently available, the CALSPEC standards, through a multi-cycle imaging program using the Wide Field Camera 3 (WFC3) on the Hubble Space Telescope (HST). We have developed two independent analyses to forward model all the observed photometry and ground-based spectroscopy and infer a spectral energy distribution for each source using a non-local-thermodynamic-equilibrium (NLTE) DA white dwarf atmosphere extincted by interstellar dust. The models are in excellent agreement with each other, and agree with the observations to better than 0.01 mag in all passbands and better than 0.005 mag in the optical. The high precision of these faint sources, tied directly to the most accurate flux standards presently available, makes our network of standards ideally suited for any experiments with very stringent requirements on absolute flux calibration, such as studies of dark energy using the Large Synoptic Survey Telescope (LSST) and the Wide-Field Infrared Survey Telescope (WFIRST).

  16. The Turn of the Millennium, CTIO and AURA-O, 1993-2007

    NASA Astrophysics Data System (ADS)

    Smith, M. G.

    2015-05-01

    The decade 1993-2003 was a particularly exciting time to be at Cerro Tololo - for most staff members. Tough times at the beginning involved significant staff reductions - the worst moment of my professional career. In return, the observatory was granted a period of stability in which it was possible to rebuild morale, which in turn helped the research climate to recover rapidly. Emphasis was given to improving the image quality at the Blanco 4m (led by Jack Baldwin), while quietly supporting efficient operation of the smaller telescopes (led locally by Alistair Walker) via the development of the SMARTS consortium. In 1998 came the high point - the announcement of the discovery of Dark Energy that showed us all how much we still have to learn about the universe; this work depended crucially on the work of staff members at CTIO and staff and students from Chilean and Argentinian universities (as described elsewhere at this conference). The Inter-American nature of CTIO also laid the crucial groundwork (in which Joao Steiner was a key player) for Brazil's involvement in SOAR and Gemini. Work during the period to minimize the advance of light pollution over Cerro Tololo (and the rest of northern Chile) helped create a credible case for siting the LSST on Cerro Pachón.

  17. SOURCE EXPLORER: Towards Web Browser Based Tools for Astronomical Source Visualization and Analysis

    NASA Astrophysics Data System (ADS)

    Young, M. D.; Hayashi, S.; Gopu, A.

    2014-05-01

    As a new generation of large-format, high-resolution imagers comes online (ODI, DECam, LSST, etc.), we are faced with the daunting prospect of astronomical images containing upwards of hundreds of thousands of identifiable sources. Visualizing and interacting with such large datasets using traditional astronomical tools appears infeasible, and a new approach is required. We present here a method for the display and analysis of arbitrarily large source datasets using dynamically scaling levels of detail, enabling scientists to rapidly move from large-scale spatial overviews down to the level of individual sources and everything in between. Based on the recognized standards of HTML5+JavaScript, we enable observers and archival users to interact with their images and sources from any modern computer without having to install specialized software. We demonstrate the ability to produce large-scale source lists from the images themselves, as well as to overlay data from publicly available source catalogs (2MASS, GALEX, SDSS, etc.) or user-provided source lists. A high-availability cluster of computational nodes allows us to produce these source maps on demand, customized based on user input. User-generated source lists and maps are persistent across sessions and are available for further plotting, analysis, refinement, and culling.
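
    The dynamically scaling level-of-detail idea can be sketched as a simple catalog-thinning step; the tile size and per-tile source cap are illustrative parameters, not values from SOURCE EXPLORER.

```python
import numpy as np

# Level-of-detail thinning: at a given zoom level, keep only the brightest
# sources per spatial tile, bounding how many points the client must render.
def thin_catalog(ra, dec, mag, tile_deg, per_tile):
    ix = np.floor(ra / tile_deg).astype(int)
    iy = np.floor(dec / tile_deg).astype(int)
    keep = []
    for tile in set(zip(ix.tolist(), iy.tolist())):
        idx = np.where((ix == tile[0]) & (iy == tile[1]))[0]
        keep.extend(idx[np.argsort(mag[idx])][:per_tile])  # brightest first
    return np.sort(np.asarray(keep))

rng = np.random.default_rng(1)
ra, dec = rng.uniform(0, 1, 10000), rng.uniform(0, 1, 10000)
mag = rng.uniform(15, 25, 10000)
subset = thin_catalog(ra, dec, mag, tile_deg=0.25, per_tile=50)  # <= 16 tiles x 50
```

    Zooming in shrinks the tile size, so progressively fainter sources appear while the rendered point count stays bounded.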

  18. WFIRST: Science from Deep Field Surveys

    NASA Astrophysics Data System (ADS)

    Koekemoer, Anton; Foley, Ryan; WFIRST Deep Field Working Group

    2018-01-01

    WFIRST will enable deep field imaging across much larger areas than those previously obtained with Hubble, opening up completely new areas of parameter space for extragalactic deep fields including cosmology, supernova and galaxy evolution science. The instantaneous field of view of the Wide Field Instrument (WFI) is about 0.3 square degrees, which would for example yield an Ultra Deep Field (UDF) reaching similar depths at visible and near-infrared wavelengths to that obtained with Hubble, over an area about 100-200 times larger, for a comparable investment in time. Moreover, wider fields on scales of 10-20 square degrees could achieve depths comparable to large HST surveys at medium depths such as GOODS and CANDELS, and would enable multi-epoch supernova science that could be matched in area to LSST Deep Drilling fields or other large survey areas. Such fields may benefit from being placed on locations in the sky that have ancillary multi-band imaging or spectroscopy from other facilities, from the ground or in space. The WFIRST Deep Fields Working Group has been examining the science considerations for various types of deep fields that may be obtained with WFIRST, and present here a summary of the various properties of different locations in the sky that may be considered for future deep fields with WFIRST.

  19. Focusing cosmic telescopes: systematics of strong lens modeling

    NASA Astrophysics Data System (ADS)

    Johnson, Traci Lin; Sharon, Keren

    2018-01-01

    The use of strong gravitational lensing by galaxy clusters has become a popular method for studying the high-redshift universe. While diverse in computational methods, lens modeling techniques have established means of determining statistical errors on cluster masses and magnifications. However, the systematic errors, arising from the number of constraints, the availability of spectroscopic redshifts, and the various types of image configurations, have yet to be quantified. I will present my dissertation work on quantifying systematic errors in parametric strong lensing techniques. I have participated in the Hubble Frontier Fields lens model comparison project, using simulated clusters to compare the accuracy of various modeling techniques, and have extended this project to understanding how changing the quantity of constraints affects the mass and magnification. I will also present my recent work extending these studies to clusters in the Outer Rim simulation. These clusters are typical of those found in wide-field surveys in mass and lensing cross-section; they have fewer constraints than the HFF clusters and are thus more susceptible to systematic errors. With the wealth of strong lensing clusters discovered in surveys such as SDSS, SPT, DES, and, in the future, LSST, this work will be influential in guiding lens modeling efforts and follow-up spectroscopic campaigns.

  20. Efficient simulations of large-scale structure in modified gravity cosmologies with comoving Lagrangian acceleration

    NASA Astrophysics Data System (ADS)

    Valogiannis, Georgios; Bean, Rachel

    2017-05-01

    We implement an adaptation of the COLA approach, a hybrid scheme that combines Lagrangian perturbation theory with an N-body approach, to model nonlinear collapse in chameleon and symmetron modified gravity models. Gravitational screening is modeled effectively through the attachment of a suppression factor to the linearized Klein-Gordon equations. The adapted COLA approach is benchmarked against an N-body code, both for the Λ cold dark matter (ΛCDM) scenario and for the modified gravity theories. It is found to perform well in the estimation of the dark matter power spectra, with consistency of 1% to k ~ 2.5 h/Mpc. Redshift-space distortions are shown to be effectively modeled through a Lorentzian parametrization with a velocity dispersion fit to the data. We find that COLA performs less well in predicting the halo mass functions, but has consistency, within the 1σ uncertainties of our simulations, in the relative changes to the mass function induced by the modified gravity models relative to ΛCDM. The results demonstrate that COLA, proposed to enable accurate and efficient nonlinear predictions for ΛCDM, can be effectively applied to a wider set of cosmological scenarios, with intriguing properties, for which clustering behavior needs to be understood for upcoming surveys such as LSST, DESI, Euclid, and WFIRST.
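
    The screening treatment described above, a suppression factor attached to the linearized fifth-force term, can be sketched for a chameleon-like coupling; the coupling strength, field mass, and suppression value below are illustrative assumptions, not the paper's calibrated values.

```python
import numpy as np

# Linearized chameleon-like modification: the fifth force enhances gravity by
# 2*beta^2 * k^2 / (k^2 + a^2 m^2), and screening in dense regions is modeled
# by an overall suppression factor S in [0, 1] multiplying that term.
def G_eff_over_G(k, a, beta=0.5, m_phi=1.0, S=1.0):
    return 1.0 + S * 2.0 * beta**2 * k**2 / (k**2 + (a * m_phi) ** 2)

k = np.logspace(-3, 1, 200)                    # wavenumbers, h/Mpc
unscreened = G_eff_over_G(k, a=1.0, S=1.0)     # linearized fifth force
screened = G_eff_over_G(k, a=1.0, S=0.1)       # dense region: force suppressed
```

    On large scales (k → 0) the coupling reduces to the GR value, so the modification only alters clustering below the field's Compton wavelength.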

  1. The Advanced X-ray Imaging Satellite (AXIS)

    NASA Astrophysics Data System (ADS)

    Reynolds, Christopher S.; Mushotzky, Richard

    2017-08-01

    The Advanced X-ray Imaging Satellite (AXIS) will follow in the footsteps of the spectacularly successful Chandra X-ray Observatory with similar or higher angular resolution and an order of magnitude more collecting area in the 0.3-10 keV band. These capabilities will enable major advances in many of the most active areas of astrophysics, including (i) mapping event-horizon-scale structure in AGN accretion disks and the determination of supermassive black hole (SMBH) spins through monitoring of gravitationally-microlensed quasars; (ii) dramatically deepening our understanding of AGN feedback in galaxies and galaxy clusters out to high-z through the direct imaging of AGN winds and the interaction of jets with the hot interstellar/intracluster medium; (iii) understanding the fueling of AGN by probing hot flows inside of the SMBH sphere of influence; (iv) obtaining geometric distance measurements using dust scattering halos. With a nominal 2028 launch, AXIS will be enormously synergistic with LSST, ALMA, WFIRST and ATHENA, and will be a valuable precursor to Lynx. AXIS is enabled by breakthroughs in the construction of light-weight X-ray optics from mono-crystalline silicon blocks, building on recent developments in the semiconductor industry. Here, we describe the straw-man concept for AXIS, some of the high-profile science that this observatory will address, and how you can become involved.

  2. Finding the first cosmic explosions. IV. 90-140 M⊙ pair-instability supernovae

    DOE PAGES

    Smidt, Joseph; Whalen, Daniel J.; Chatzopoulos, E.; ...

    2015-05-19

    Population III stars that die as pair-instability supernovae are usually thought to fall in the mass range of 140-260 M⊙. However, several lines of work have now shown that rotation can build up the He cores needed to encounter the pair instability at stellar masses as low as 90 M⊙. Depending on the slope of the initial mass function of Population III stars, there could be 4-5 times as many stars from 90-140 M⊙ in the primordial universe as in the usually accepted range. We present numerical simulations of the pair-instability explosions of such stars performed with the MESA, FLASH and RAGE codes. We find that they will be visible to supernova factories such as Pan-STARRS and LSST in the optical out to z ~ 1-2 and to JWST and the 30 m-class telescopes in the NIR out to z ~ 7-10. Such explosions will thus probe the stellar populations of the first galaxies and cosmic star formation rates in the era of cosmological reionization. These supernovae are also easily distinguished from more massive pair-instability explosions, underscoring the fact that there is far greater variety to the light curves of these events than previously understood.

  3. The Palomar-Quest Synoptic Sky Survey

    NASA Astrophysics Data System (ADS)

    Mahabal, A.; Djorgovski, S. G.; Graham, M.; Williams, R.; Granett, B.; Bogosavljevic, M.; Baltay, C.; Rabinowitz, D.; Bauer, A.; Andrews, P.; Morgan, N.; Snyder, J.; Ellman, N.; Brunner, R.; Rengstorf, A. W.; Musser, J.; Gebhard, M.; Mufson, S.

    2003-12-01

    Exploration of the time domain is rapidly becoming one of the most exciting areas of astronomy. The Palomar-Quest synoptic sky survey has recently started producing a steady stream of data. In driftscan mode the survey covers Declination strips 4.6 deg wide, between -25 and +30 deg, at least twice in each of the two filter sets, one Johnson-Cousins UBRI and one SDSS r'i'z'z', at a rate of about 500 square degrees per night. The scans are separated by time baselines of days to months, and we anticipate that they will extend to multi-year time scales over the next 3 to 5 years or beyond. The unprecedented amount of data makes this the largest synoptic survey of its kind, both in terms of area covered and depth. We will search for both variable and transient objects, including supernovae, variable AGN, GRB orphan afterglows, cataclysmic variables, interesting stellar flares, novae, other types of variable stars, and possibly even entirely new types of objects or phenomena. We are in the process of designing a real-time data reduction pipeline which would enable rapid discovery and spectroscopic follow-up of transients and other interesting objects. This survey can be seen as a precursor for the even larger synoptic sky surveys with LSST and Pan-STARRS.

  4. Cross-correlating 2D and 3D galaxy surveys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Passaglia, Samuel; Manzotti, Alessandro; Dodelson, Scott

    Galaxy surveys probe both structure formation and the expansion rate, making them promising avenues for understanding the dark universe. Photometric surveys accurately map the 2D distribution of galaxy positions and shapes in a given redshift range, while spectroscopic surveys provide sparser 3D maps of the galaxy distribution. We present a way to analyse overlapping 2D and 3D maps jointly and without loss of information. We represent 3D maps using spherical Fourier-Bessel (sFB) modes, which preserve radial coverage while accounting for the spherical sky geometry, and we decompose 2D maps in a spherical harmonic basis. In these bases, a simple expression exists for the cross-correlation of the two fields. One very powerful application is the ability to simultaneously constrain the redshift distribution of the photometric sample, the sample biases, and cosmological parameters. We use our framework to show that combined analysis of DESI and LSST can improve cosmological constraints by factors of ~1.2 to ~1.8 on the region where they overlap, relative to identically sized disjoint regions. We also show that in the overlap of DES and SDSS-III in Stripe 82, cross-correlating improves photo-z parameter constraints by factors of ~2 to ~12 over internal photo-z reconstructions.

  5. Measuring the scale dependence of intrinsic alignments using multiple shear estimates

    NASA Astrophysics Data System (ADS)

    Leonard, C. Danielle; Mandelbaum, Rachel

    2018-06-01

    We present a new method for measuring the scale dependence of the intrinsic alignment (IA) contamination to the galaxy-galaxy lensing signal, which takes advantage of multiple shear estimation methods applied to the same source galaxy sample. By exploiting the resulting correlation of both shape noise and cosmic variance, our method can provide an increase in the signal-to-noise of the measured IA signal as compared to methods which rely on the difference of the lensing signal from multiple photometric redshift bins. For a galaxy-galaxy lensing measurement which uses LSST sources and DESI lenses, the signal-to-noise on the IA signal from our method is predicted to improve by a factor of ~2 relative to the method of Blazek et al. (2012), for pairs of shear estimates which yield substantially different measured IA amplitudes and highly correlated shape noise terms. We show that statistical error necessarily dominates the measurement of intrinsic alignments using our method. We also consider a physically motivated extension of the Blazek et al. (2012) method which assumes that all nearby galaxy pairs, rather than only excess pairs, are subject to IA. In this case, the signal-to-noise of the method of Blazek et al. (2012) is improved.
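
    The benefit of correlated shape noise can be seen from the variance of the difference of two estimators; the amplitudes, noise levels, and correlation below are illustrative numbers, not the paper's forecasts.

```python
import numpy as np

# Two shear estimators applied to the same sources measure IA amplitudes a1,
# a2 with noise sigma1, sigma2 and noise correlation rho. The IA signal is
# extracted from their difference, whose variance shrinks as rho -> 1:
#   Var(x1 - x2) = sigma1^2 + sigma2^2 - 2*rho*sigma1*sigma2
def snr_of_difference(a1, a2, sigma1, sigma2, rho):
    var = sigma1**2 + sigma2**2 - 2.0 * rho * sigma1 * sigma2
    return abs(a1 - a2) / np.sqrt(var)

low = snr_of_difference(1.0, 0.6, 0.5, 0.5, rho=0.0)   # uncorrelated noise
high = snr_of_difference(1.0, 0.6, 0.5, 0.5, rho=0.9)  # shared shape noise
```

    With strongly correlated noise the difference cancels most of the shape noise while preserving the difference in measured IA amplitude, which is the effect the method exploits.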

  6. Fundamental physics from future weak-lensing calibrated Sunyaev-Zel'dovich galaxy cluster counts

    NASA Astrophysics Data System (ADS)

    Madhavacheril, Mathew S.; Battaglia, Nicholas; Miyatake, Hironao

    2017-11-01

    Future high-resolution measurements of the cosmic microwave background (CMB) will produce catalogs of tens of thousands of galaxy clusters through the thermal Sunyaev-Zel'dovich (tSZ) effect. We forecast how well different configurations of a CMB Stage-4 experiment can constrain cosmological parameters, in particular the amplitude of structure as a function of redshift σ8(z), the sum of neutrino masses Σmν, and the dark energy equation of state w(z). A key element of this effort is calibrating the tSZ scaling relation by measuring the lensing signal around clusters. We examine how the mass calibration from future optical surveys like the Large Synoptic Survey Telescope (LSST) compares with a purely internal calibration using lensing of the CMB itself. We find that, due to its high-redshift leverage, internal calibration gives constraints on cosmological parameters comparable to the optical calibration, and can be used as a cross-check of systematics in the optical measurement. We also show that, in contrast to the constraints using the CMB lensing power spectrum, lensing-calibrated tSZ cluster counts can detect a minimal Σmν at the 3-5σ level even when the dark energy equation of state is freed up.
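
    Forecasts like these typically rest on a Fisher-matrix calculation over binned cluster counts; a minimal Poisson-statistics sketch follows, with toy bin counts and parameter derivatives that are placeholders, not the paper's values.

```python
import numpy as np

# Poisson Fisher matrix for binned cluster counts:
#   F_ij = sum over bins b of (dN_b/dtheta_i) * (dN_b/dtheta_j) / N_b
def fisher_counts(N, dN):
    """N: (nbins,) expected counts; dN: (nparams, nbins) derivatives."""
    return np.einsum('ib,jb->ij', dN / N, dN)

N = np.array([1000.0, 400.0, 80.0])       # toy counts in redshift bins
dN = np.array([[50.0, 30.0, 10.0],        # toy dN/d(sigma8)
               [20.0, 25.0, 12.0]])       # toy dN/d(sum m_nu)
F = fisher_counts(N, dN)
errors = np.sqrt(np.diag(np.linalg.inv(F)))  # marginalized 1-sigma errors
```

    Inverting the Fisher matrix marginalizes each parameter over the others, which is how degeneracies between, e.g., Σmν and w(z) degrade the single-parameter constraints.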

  7. Configurable technology development for reusable control and monitor ground systems

    NASA Technical Reports Server (NTRS)

    Uhrlaub, David R.

    1994-01-01

    The control monitor unit (CMU) uses configurable software technology for real-time mission command and control, telemetry processing, simulation, data acquisition, data archiving, and ground operations automation. The base technology is currently planned for the following control and monitor systems: portable Space Station checkout systems; ecological life support systems; Space Station logistics carrier system; and the ground system of the Delta Clipper (SX-2) in the Single-Stage Rocket Technology program. The CMU makes extensive use of commercial technology to increase capability and reduce development and life-cycle costs. The concepts and technology are being developed by McDonnell Douglas Space and Defense Systems for the Real-Time Systems Laboratory at NASA's Kennedy Space Center under the Payload Ground Operations Contract. A second function of the Real-Time Systems Laboratory is development and utilization of advanced software development practices.

  8. Sensors 2000! Program: Advanced Biosensor and Measurement Systems Technologies for Spaceflight Research and Concurrent, Earth-Based Applications

    NASA Technical Reports Server (NTRS)

    Hines, J.

    1999-01-01

    Sensors 2000! (S2K!) is a specialized, integrated projects team organized to provide focused, directed, advanced biosensor and bioinstrumentation systems technology support to NASA's spaceflight and ground-based research and development programs. Specific technology thrusts include telemetry-based sensor systems, chemical/biological sensors, medical and physiological sensors, miniaturized instrumentation architectures, and data and signal processing systems. A concurrent objective is to promote the mutual use, application, and transition of developed technology by collaborating in academic-commercial-government leveraging, joint research, technology utilization and commercialization, and strategic partnering alliances. Sensors 2000! is organized around three primary program elements: Technology and Product Development, Technology Infusion and Applications, and Collaborative Activities. Technology and Product Development involves development and demonstration of biosensor and biotelemetry systems for application to NASA Space Life Sciences Programs; production of fully certified spaceflight hardware and payload elements; and sensor/measurement systems development for NASA research and development activities. Technology Infusion and Applications provides technology and program agent support to identify available and applicable technologies from multiple sources for insertion into NASA's strategic enterprises and initiatives. Collaborative Activities involve leveraging of NASA technologies with those of other government agencies, academia, and industry to concurrently provide technology solutions and products of mutual benefit to participating members.

  9. Vehicle/Guideway Interaction in Maglev Systems

    DTIC Science & Technology

    1992-03-01

    Vehicle/Guideway Interaction in Maglev Systems, by Y. Cai, S. S. Chen, and D. M. ..., Materials and Components Technology Division; issued in the Transportation Systems Reports series (UC-330). Contents include sections on guideway surface irregularities and on vehicle/guideway interaction in the Transrapid maglev system.

  10. Complex of technologies and prototype systems for eco-friendly shutdown of the power-generating, process, capacitive, and transport equipment

    NASA Astrophysics Data System (ADS)

    Smorodin, A. I.; Red'kin, V. V.; Frolov, Y. D.; Korobkov, A. A.; Kemaev, O. V.; Kulik, M. V.; Shabalin, O. V.

    2015-07-01

    A set of technologies and prototype systems for eco-friendly shutdown of power-generating, process, capacitive, and transport equipment is offered. The following are regarded as the core technologies of the complex: cryogenic nitrogen technology for displacement of hydrogen from the cooling circuit of turbine generators, cryo-blasting of power units with carbon dioxide granules, preservation of shutdown power units with dehydrated air, and dismantling and severing of equipment and structural materials of power units. Four prototype systems for eco-friendly shutdown of power units may be built on the basis of the selected technologies: a multimode nitrogen cryogenic system with four subsystems; a cryo-blasting system with CO2 granules for the thermal-mechanical and electrical equipment of power units; compressionless air-drainage systems for drying and storage of shutdown power units; and a cryo-gas system for general severing of steam-turbine power units. Results of research and of pilot and demonstration tests of the operational units of the considered technological systems support applying the proposed technologies and systems in prototype systems for shutdown of power-generating, process, capacitive, and transport equipment.

  11. Technology Transfer Program (TTP). Quality Assurance System. Volume 2. Appendices

    DTIC Science & Technology

    1980-03-03

    LSCo Report No. 2X23-5.1-4-I: Technology Transfer Program (TTP) Final Report, Quality Assurance System, Volume 2 (Appendices). Appendix A: Accuracy Control System. Prepared by Livingston Shipbuilding Company, Orange, Texas, March 3, 1980.

  12. A survey on barcode RFID and NFC

    NASA Astrophysics Data System (ADS)

    Thanapal, P.; Prabhu, J.; Jakhar, Mridula

    2017-11-01

In recent years, many industries have begun implementing new technologies for tracing and tracking their products. These technologies are a boon to their management systems, but technology and management must work in parallel to avoid loopholes in the system. Many technologies are available, and the most difficult and important task is to choose the best among them. The key consideration when choosing a technology is to ensure that it integrates properly with the other components of the management system. An industry management system consists of several levels: initial, intermediate, final, and tracking. Tracking a product from its initial stage is becoming a trend; to keep pace with this trend and with company demand, products are being integrated with barcodes, RFID tags, NFC tags, or other traceable technologies. Many supply chain management systems are also adopting these techniques.

  13. Large Space Systems Technology, 1979. [antenna and space platform systems conference

    NASA Technical Reports Server (NTRS)

    Ward, J. C., Jr. (Compiler)

    1980-01-01

    Items of technology and developmental efforts in support of the large space systems technology programs are described. The major areas of interest are large antennas systems, large space platform systems, and activities that support both antennas and platform systems.

  14. Status of Propulsion Technology Development Under the NASA In-space Propulsion Technology Program

    NASA Technical Reports Server (NTRS)

    Anderson, David; Kamhawi, Hani; Patterson, Mike; Dankanich, John; Pencil, Eric; Pinero, Luis

    2014-01-01

Since 2001, the In-Space Propulsion Technology (ISPT) program has been developing and delivering in-space propulsion technologies for NASA's Science Mission Directorate (SMD). These in-space propulsion technologies are applicable to, and potentially enabling for, future NASA Discovery, New Frontiers, Flagship, and sample return missions currently under consideration. The ISPT program is currently developing technology in three areas: Propulsion System Technologies, Entry Vehicle Technologies, and Systems/Mission Analysis. ISPT's propulsion technologies include: 1) the 0.6-7 kW NASA's Evolutionary Xenon Thruster (NEXT) gridded ion propulsion system; 2) a 0.3-3.9 kW Hall-effect electric propulsion (HEP) system for low-cost and sample return missions; 3) the Xenon Flow Control Module (XFCM); 4) ultra-lightweight propellant tank technologies (ULTT); and 5) propulsion technologies for a Mars Ascent Vehicle (MAV). The HEP system is composed of the High Voltage Hall Accelerator (HiVHAc) thruster, a power processing unit (PPU), and the XFCM. NEXT and the HiVHAc are throttle-able electric propulsion systems for planetary science missions. The XFCM and ULTT are two component technologies being developed with nearer-term flight infusion in mind. Several of the ISPT technologies are related to sample return mission needs, such as MAV propulsion and electric propulsion. Finally, one focus of the Systems/Mission Analysis area is developing tools that aid the application or operation of these technologies on a wide variety of mission concepts. This paper provides a brief overview of the ISPT program, describing the development status and technology infusion readiness.

  15. The NASA Redox Storage System Development project, 1980

    NASA Technical Reports Server (NTRS)

    1982-01-01

The technical accomplishments pertaining to the development of Redox systems and related technology are outlined in terms of the task elements: prototype systems development, application analyses, and supporting technology. Prototype systems development provides for a major procurement to develop an industrial capability to take the current NASA Lewis technology through to the design, development, and commercialization of iron-chromium Redox storage systems. Application analyses provide for the definition of application concepts and technology requirements, specific definition studies, and the identification of market sectors and their penetration potential. Supporting technology includes both in-house and contractual efforts that encompass implementation of technology improvements in membranes, electrodes, reactant processing, and system design. The status of all elements is discussed.

  16. The NASA Redox Storage System Development project, 1980

    NASA Astrophysics Data System (ADS)

    1982-12-01

The technical accomplishments pertaining to the development of Redox systems and related technology are outlined in terms of the task elements: prototype systems development, application analyses, and supporting technology. Prototype systems development provides for a major procurement to develop an industrial capability to take the current NASA Lewis technology through to the design, development, and commercialization of iron-chromium Redox storage systems. Application analyses provide for the definition of application concepts and technology requirements, specific definition studies, and the identification of market sectors and their penetration potential. Supporting technology includes both in-house and contractual efforts that encompass implementation of technology improvements in membranes, electrodes, reactant processing, and system design. The status of all elements is discussed.

  17. Visible Parts, Invisible Whole: Swedish Technology Student Teachers' Conceptions about Technological Systems

    ERIC Educational Resources Information Center

    Hallström, Jonas; Klasander, Claes

    2017-01-01

    Technological systems are included as a component of national technology curricula and standards for primary and secondary education as well as corresponding teacher education around the world. Little is known, however, of how pupils, students, and teachers conceive of technological systems. In this article we report on a study investigating…

  18. Dual-Use Space Technology Transfer Conference and Exhibition. Volume 1

    NASA Technical Reports Server (NTRS)

    Krishen, Kumar (Compiler)

    1994-01-01

This document contains papers presented at the Dual-Use Space Technology Transfer Conference and Exhibition held at the Johnson Space Center February 1-3, 1994. Possible technology transfers covered during the conference were in the areas of information access; innovative microwave and optical applications; materials and structures; marketing and barriers; intelligent systems; human factors and habitation; communications and data systems; business process and technology transfer; software engineering; biotechnology and advanced bioinstrumentation; communications signal processing and analysis; new ways of doing business; medical care; applications derived from control center data systems; human performance evaluation; technology transfer methods; mathematics, modeling, and simulation; propulsion; software analysis and decision tools; systems/processes in human support technology; networks, control centers, and distributed systems; power; rapid development; perception and vision technologies; integrated vehicle health management; automation technologies; advanced avionics; and robotics technologies. More than 77 papers, 20 presentations, and 20 exhibits covering various disciplines were presented by experts from NASA, universities, and industry.

  19. Two men with multiple disabilities carry out an assembly work activity with the support of a technology system.

    PubMed

    Lancioni, Giulio E; Singh, Nirbhay N; O'Reilly, Mark F; Green, Vanessa A; Oliva, Doretta; Campodonico, Francesca

    2013-10-01

    To assess whether two persons with multiple disabilities could learn a work activity (i.e., assembling trolley wheels) with the support of a technology system. After an initial baseline, the study compared the effects of intervention sessions relying on the technology system (which called the participants to the different workstations and provided feedback and final stimulation) with the effects of intervention sessions carried out without technology. The two types of intervention sessions were conducted according to an alternating treatments design. Eventually, only intervention sessions relying on the technology system were used. Both participants managed to assemble wheels independently during intervention sessions relying on the technology system while they failed during sessions without the system. Their performance was strengthened during the final part of the study, in which only sessions with the system occurred. Technology may be critical in helping persons with multiple disabilities manage multi-step work activities.

  20. Spacecraft Bus and Platform Technology Development under the NASA ISPT Program

    NASA Technical Reports Server (NTRS)

    Anderson, David J.; Munk, Michelle M.; Pencil, Eric; Dankanich, John; Glaab, Louis; Peterson, Todd

    2013-01-01

The In-Space Propulsion Technology (ISPT) program is developing spacecraft bus and platform technologies that will enable or enhance NASA robotic science missions. The ISPT program is currently developing technology in four areas: Propulsion System Technologies (electric and chemical), Entry Vehicle Technologies (aerocapture and Earth entry vehicles), Spacecraft Bus and Sample Return Propulsion Technologies (components and ascent vehicles), and Systems/Mission Analysis. Three technologies are ready for near-term flight infusion: 1) the high-temperature Advanced Material Bipropellant Rocket (AMBR) engine, providing higher performance; 2) NASA's Evolutionary Xenon Thruster (NEXT) ion propulsion system, a 0.6-7 kW throttle-able gridded ion system; and 3) Aerocapture technology development, with investments in a family of thermal protection system (TPS) materials and structures; guidance, navigation, and control (GN&C) models of blunt-body rigid aeroshells; and aerothermal effect models. Two component technologies being developed with flight infusion in mind are the Advanced Xenon Flow Control System and ultra-lightweight propellant tank technologies. Future directions for ISPT are technologies that relate to sample return missions and other spacecraft bus technology needs: 1) Mars Ascent Vehicles (MAV); 2) multi-mission technologies for Earth Entry Vehicles (MMEEV) for sample return missions; and 3) electric propulsion for sample return and low-cost missions. These technologies are more vehicle- and mission-focused, and present a different set of technology development and infusion steps beyond those previously implemented. The Systems/Mission Analysis area is focused on developing tools and assessing the application of propulsion and spacecraft bus technologies to a wide variety of mission concepts. 
These in-space propulsion technologies are applicable to, and potentially enabling for, future NASA Discovery, New Frontiers, and sample return missions currently under consideration, and have broad applicability to potential Flagship missions. This paper provides a brief overview of the ISPT program, describing the development status and technology infusion readiness of in-space propulsion technologies in the areas of electric propulsion, aerocapture, Earth entry vehicles, propulsion components, Mars ascent vehicles, and mission/systems analysis.

  2. 78 FR 17187 - Notice of Intent To Grant Exclusive Patent License; Fiber Optic Sensor Systems Technology...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-20

    ...; Fiber Optic Sensor Systems Technology Corporation AGENCY: Department of the Navy, DoD. ACTION: Notice..., 2012, announcing an intent to grant to Fiber Optic Sensor Systems Technology Corporation, a revocable... the Navy hereby gives notice of its intent to grant to Fiber Optic Sensor Systems Technology...

  3. Modernizing Systems and Software: How Evolving Trends in Systems and Software Technology Bode Well for Advancing the Precision of Technology

    DTIC Science & Technology

    2009-04-23

of Code Need for increased functionality will be a forcing function to bring the fields of software and systems engineering... of Software-Intensive Systems is Increasing... How Evolving Trends in Systems and Software Technologies Bode Well for Advancing the Precision of ...Engineering in Continued Partnership

  4. Technology Area Roadmap for In Space Propulsion Technologies

    NASA Technical Reports Server (NTRS)

    Johnson, Les; Meyer, Mike; Coote, David; Goebel, Dan; Palaszewski, Bryan; White, Sonny

    2010-01-01

This slide presentation reviews the technology area (TA) roadmap for developing propulsion technologies that will enable further exploration of the solar system and beyond. It is hoped that development of the technologies within this TA will yield technical solutions that improve thrust levels, specific impulse, power, specific mass, volume, system mass, system complexity, operational complexity, commonality with other spacecraft systems, manufacturability, and durability. Propulsion technologies reviewed include chemical and non-chemical propulsion, and advanced propulsion (i.e., concepts with a Technology Readiness Level below 3). Examples of these advanced technologies include beamed energy, electric sails, fusion, high-energy-density materials, antimatter, advanced fission, and breakthrough propulsion technologies. Timeframes for developing some of these propulsion technologies are presented, and the top technical challenges are discussed. This roadmap describes a portfolio of in-space propulsion technologies that can meet future space science and exploration needs.
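Where the roadmap cites specific impulse as a figure of merit, its leverage can be illustrated with the Tsiolkovsky rocket equation; the delta-v budget and Isp values below are generic illustrative assumptions, not figures from the roadmap.

```python
import math

G0 = 9.80665  # standard gravity, m/s^2

def propellant_fraction(delta_v_ms: float, isp_s: float) -> float:
    """Propellant mass fraction m_p/m_0 from the Tsiolkovsky rocket equation."""
    return 1.0 - math.exp(-delta_v_ms / (isp_s * G0))

# Assumed, illustrative values: a 5 km/s deep-space delta-v budget.
for label, isp in [("chemical bipropellant (~320 s)", 320.0),
                   ("gridded ion thruster (~4000 s)", 4000.0)]:
    print(f"{label}: propellant fraction = {propellant_fraction(5000.0, isp):.2f}")
```

Under these assumed numbers the chemical stage must be roughly 80% propellant by mass, while the high-Isp electric system needs only about 12%, which is why specific impulse appears alongside thrust and mass as a headline metric.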

  5. Sandia National Laboratories: National Security Missions: Defense Systems

    Science.gov Websites


  6. Research and Development of Fully Automatic Alien Smoke Stack and Packaging System

    NASA Astrophysics Data System (ADS)

    Yang, Xudong; Ge, Qingkuan; Peng, Tao; Zuo, Ping; Dong, Weifu

    2017-12-01

To address the low efficiency of manual sorting and packaging at current tobacco distribution centers, a safe, efficient, fully automatic stacking and packaging system for irregularly shaped ("alien") cigarette cartons was developed. The system's functions are implemented with PLC control technology, servo control technology, robotics, image recognition, and human-computer interaction technology. The characteristics, principles, control process, and key technologies of the system are discussed in detail. Installation and commissioning show that the fully automatic stacking and packaging system performs well and meets the requirements for shaped cigarettes.

  7. Intelligent systems technology infrastructure for integrated systems

    NASA Technical Reports Server (NTRS)

    Lum, Henry, Jr.

    1991-01-01

    Significant advances have occurred during the last decade in intelligent systems technologies (a.k.a. knowledge-based systems, KBS) including research, feasibility demonstrations, and technology implementations in operational environments. Evaluation and simulation data obtained to date in real-time operational environments suggest that cost-effective utilization of intelligent systems technologies can be realized for Automated Rendezvous and Capture applications. The successful implementation of these technologies involve a complex system infrastructure integrating the requirements of transportation, vehicle checkout and health management, and communication systems without compromise to systems reliability and performance. The resources that must be invoked to accomplish these tasks include remote ground operations and control, built-in system fault management and control, and intelligent robotics. To ensure long-term evolution and integration of new validated technologies over the lifetime of the vehicle, system interfaces must also be addressed and integrated into the overall system interface requirements. An approach for defining and evaluating the system infrastructures including the testbed currently being used to support the on-going evaluations for the evolutionary Space Station Freedom Data Management System is presented and discussed. Intelligent system technologies discussed include artificial intelligence (real-time replanning and scheduling), high performance computational elements (parallel processors, photonic processors, and neural networks), real-time fault management and control, and system software development tools for rapid prototyping capabilities.

  8. Multiple access techniques and spectrum utilization of the GLOBALSTAR mobile satellite system

    NASA Astrophysics Data System (ADS)

    Louie, Ming; Cohen, Michel; Rouffet, Denis; Gilhousen, Klein S.

The GLOBALSTAR System is a Low Earth Orbit (LEO) satellite-based mobile communications system that is interoperable with current and future Public Land Mobile Networks (PLMN). The GLOBALSTAR System concept is based upon technological advancement in two key areas: (1) LEO satellite technology, and (2) cellular telephone technology, including the commercial applications of Code Division Multiple Access (CDMA) technologies and the most recent progress in Time Division Multiple Access (TDMA) technologies. The GLOBALSTAR System uses elements of CDMA, Frequency Division Multiple Access (FDMA), and TDMA technology, combined with satellite Multiple Beam Antenna (MBA) technology, to arrive at one of the most efficient modulation and multiple access schemes ever proposed for a satellite communications system. The technology used in GLOBALSTAR exploits the following techniques to obtain high spectral efficiency and affordable cost per channel with minimum coordination among different systems: open- and closed-loop power control; voice activation; spot-beam satellite antennas for frequency reuse; weighted satellite antenna gain; multiple-satellite coverage; and handoff between satellites. The GLOBALSTAR system design will use the following frequency bands: 1610-1626.5 MHz for the up-link and 2483.5-2500 MHz for the down-link.
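Spread-spectrum sharing of the kind described above is often quantified by the CDMA processing gain, the ratio of spreading (chip) rate to user data rate; the rates in this sketch are generic illustrative assumptions, not GLOBALSTAR air-interface parameters.

```python
import math

def processing_gain_db(chip_rate_hz: float, bit_rate_hz: float) -> float:
    """Processing gain of a direct-sequence CDMA link, in dB."""
    return 10 * math.log10(chip_rate_hz / bit_rate_hz)

# Illustrative (assumed) values: 1.2288 Mcps spreading over a 9.6 kbps voice channel.
gp = processing_gain_db(1.2288e6, 9600.0)
print(f"processing gain = {gp:.1f} dB")  # 21.1 dB
```

The larger this gain, the more interference from co-channel users the receiver can suppress after despreading, which is what lets techniques like power control and voice activation translate directly into extra system capacity.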

  9. Concept of JINR Corporate Information System

    NASA Astrophysics Data System (ADS)

    Filozova, I. A.; Bashashin, M. V.; Korenkov, V. V.; Kuniaev, S. V.; Musulmanbekov, G.; Semenov, R. N.; Shestakova, G. V.; Strizh, T. A.; Ustenko, P. V.; Zaikina, T. N.

    2016-09-01

The article presents the concept of the JINR Corporate Information System (JINR CIS). Special attention is given to the information support of scientific research, with a Current Research Information System as part of the corporate information system. The objectives of such a system focus on ensuring effective implementation of and support for research through modern information technology, computer technology, and automation, and on the creation, development, and integration of digital resources on a common conceptual framework. The project assumes continuous system development and the introduction of new information technologies to keep the system technologically relevant.

  10. NASA Technology Area 1: Launch Propulsion Systems

    NASA Technical Reports Server (NTRS)

    McConnaughey, Paul; Femminineo, Mark; Koelfgen, Syri; Lepsch, Roger; Ryan, Richard M.; Taylor, Steven A.

    2011-01-01

This slide presentation reviews the technology advancement plans for NASA Technology Area 1, the Launch Propulsion Systems Technology Area (LPSTA). The draft roadmap reviews various propulsion system technologies that will be developed during the next 25+ years. This roadmap will be reviewed by the National Research Council, which will issue a final report that includes findings and recommendations.

  11. Structures Technology for Future Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Venneri, Samuel L.; Paul, Donald B.; Hopkins, Mark A.

    2000-01-01

    An overview of structures technology for future aerospace systems is given. Discussion focuses on developments in component technologies that will improve the vehicle performance, advance the technology exploitation process, and reduce system life-cycle costs. The component technologies described are smart materials and structures, multifunctional materials and structures, affordable composite structures, extreme environment structures, flexible load bearing structures, and computational methods and simulation-based design. The trends in each of the component technologies are discussed and the applicability of these technologies to future aerospace vehicles is described.

  12. Transformative Learning: Patterns of Psychophysiologic Response and Technology-Enabled Learning and Intervention Systems

    DTIC Science & Technology

    2008-09-01

Transformative Learning: Patterns of Psychophysiologic Response and Technology-Enabled Learning and Intervention Systems. PRINCIPAL INVESTIGATOR: Leigh W. Jerome, Ph.D...project entitled "Transformative Learning: Patterns of Psychophysiologic Response in Technology-Enabled Learning and Intervention Systems."

  13. Progress towards autonomous, intelligent systems

    NASA Technical Reports Server (NTRS)

    Lum, Henry; Heer, Ewald

    1987-01-01

An aggressive program has been initiated to develop, integrate, and implement autonomous systems technologies, starting with today's expert systems and evolving to autonomous, intelligent systems by the end of the 1990s. This program includes core technology developments and demonstration projects for technology evaluation and validation. This paper discusses key operational frameworks in the context of systems autonomy applications and then identifies major technological challenges, primarily in artificial intelligence areas. Program content and progress toward the critical technologies and demonstrations initiated to achieve the required capabilities in the year-2000 era are discussed.

  14. Study of multi-megawatt technology needs for photovoltaic space power systems, volume 2

    NASA Technical Reports Server (NTRS)

    Peterson, D. M.; Pleasant, R. L.

    1981-01-01

Possible missions requiring multimegawatt photovoltaic space power systems in the 1990s time frame, and the power system technology needs associated with these missions, are examined. Four specific task areas were considered: (1) missions requiring power in the 1-10 megawatt average power region; (2) alternative power systems and component technologies; (3) technology goals and sensitivity trades and analyses; and (4) technology recommendations. Specific concepts considered for photovoltaic power were planar arrays, concentrating arrays, hybrid systems using Rankine engines, and thermophotovoltaic approaches, all with various photovoltaic cell component technologies. Various AC/DC power management approaches and battery, fuel cell, and flywheel energy storage concepts are evaluated. Interactions with the electric ion engine injection and stationkeeping system are also considered.

  15. System approach to modeling of industrial technologies

    NASA Astrophysics Data System (ADS)

    Toropov, V. S.; Toropov, E. S.

    2018-03-01

The authors present a system of methods for modeling and improving industrial technologies. The system consists of an information part and a software part. The information part is structured information about industrial technologies; the structure follows a template with several essential categories used to improve the technological process and eliminate weaknesses in the process chain. The base category is the physical effect that takes place as the technical process proceeds. The software part of the system can apply various methods of creative search to the content stored in the information part, paying particular attention to energy transformations in the technological process. Applying the system makes it possible to systematize the approach to improving technologies and to obtain new technical solutions.

  16. Conceptual design study: Forest Fire Advanced System Technology (FFAST)

    NASA Technical Reports Server (NTRS)

    Nichols, J. D.; Warren, J. R.

    1986-01-01

An integrated forest fire detection and mapping system based upon technology available in the 1990s was defined. Uncertainties in emerging and advanced technologies related to the conceptual design were identified, and those technologies were recommended for inclusion as preferred system components. System component technologies identified for an end-to-end system include thermal infrared linear-array detectors, automatic georeferencing and signal processing, geosynchronous satellite communication links, and advanced data integration and display. Potential system configuration options were developed and examined for possible inclusion in the preferred system configuration, which will provide increased performance and be cost effective relative to the system currently in use. Forest fire management user requirements and the emerging component technologies were the basis for the configuration design. The study defined a preferred system configuration that warrants continued refinement and development, examined economic aspects of the current and preferred systems, and provided preliminary cost estimates for follow-on system prototype development.

  17. Status of Propulsion Technology Development Under the NASA In-Space Propulsion Technology Program

    NASA Technical Reports Server (NTRS)

    Anderson, David; Kamhawi, Hani; Patterson, Mike; Pencil, Eric; Pinero, Luis; Falck, Robert; Dankanich, John

    2014-01-01

Since 2001, the In-Space Propulsion Technology (ISPT) program has been developing and delivering in-space propulsion technologies for NASA's Science Mission Directorate (SMD). These in-space propulsion technologies are applicable to, and potentially enabling for, future NASA Discovery, New Frontiers, Flagship, and sample return missions currently under consideration. The ISPT program is currently developing technology in three areas: Propulsion System Technologies, Entry Vehicle Technologies, and Systems/Mission Analysis. ISPT's propulsion technologies include: 1) the 0.6-7 kW NASA's Evolutionary Xenon Thruster (NEXT) gridded ion propulsion system; 2) a 0.3-3.9 kW Hall-effect electric propulsion (HEP) system for low-cost and sample return missions; 3) the Xenon Flow Control Module (XFCM); 4) ultra-lightweight propellant tank technologies (ULTT); and 5) propulsion technologies for a Mars Ascent Vehicle (MAV). The NEXT Long Duration Test (LDT) recently exceeded 50,000 hours of operation and 900 kg of propellant throughput, corresponding to 34.8 MN-s of total impulse delivered. The HEP system is composed of the High Voltage Hall Accelerator (HiVHAc) thruster, a power processing unit (PPU), and the XFCM. NEXT and the HiVHAc are throttle-able electric propulsion systems for planetary science missions. The XFCM and ULTT are two component technologies being developed with nearer-term flight infusion in mind. Several of the ISPT technologies are related to sample return mission needs: MAV propulsion and electric propulsion. Finally, one focus of the Systems/Mission Analysis area is developing tools that aid the application or operation of these technologies on a wide variety of mission concepts. This paper provides a brief overview of the ISPT program, describing the development status and technology infusion readiness.
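The quoted Long Duration Test totals can be cross-checked against each other: total impulse divided by propellant throughput gives the effective exhaust velocity, and hence an average effective specific impulse. A quick sketch using only the figures quoted above:

```python
G0 = 9.80665  # standard gravity, m/s^2

total_impulse_ns = 34.8e6   # 34.8 MN-s of total impulse, as quoted
throughput_kg = 900.0       # propellant throughput, as quoted

v_eff = total_impulse_ns / throughput_kg  # effective exhaust velocity, m/s
isp_eff = v_eff / G0                      # average effective specific impulse, s
print(f"v_eff = {v_eff:.0f} m/s, effective Isp = {isp_eff:.0f} s")
```

The result, roughly 3900 s, is a throughput-weighted average over the whole test and is consistent with a gridded ion thruster operated across its throttle range.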

  18. Synoptic Sky Surveys: Lessons Learned and Challenges Ahead

    NASA Astrophysics Data System (ADS)

    Djorgovski, Stanislav G.; CRTS Team

    2014-01-01

A new generation of synoptic sky surveys is now opening the time domain to systematic exploration, presenting both great new scientific opportunities and new challenges. These surveys touch essentially all subfields of astronomy, producing large statistical samples of the known types of objects and events (e.g., SNe, AGN, variable stars of many kinds), and have already uncovered previously unknown subtypes of these (e.g., rare or peculiar types of SNe). They are generating new science now, and paving the way for even larger surveys to come, e.g., the LSST. Our ability to fully exploit such forthcoming facilities depends critically on the science, methodology, and experience that are being accumulated now. Among the outstanding challenges, the foremost is our ability to conduct an effective follow-up of the interesting events discovered by the surveys in any wavelength regime. The follow-up resources, especially spectroscopy, are already severely limited, and this problem will grow by orders of magnitude. This requires an intelligent down-selection of the most astrophysically interesting events to follow. The first step in that process is an automated, real-time, iterative classification of transient events that incorporates heterogeneous data from the surveys themselves, archival information (spatial, temporal, and multiwavelength), and the incoming follow-up observations. The second step is an optimal automated event prioritization and allocation of the available follow-up resources, which themselves change in time. Both of these challenges are highly non-trivial, and require a strong cyber-infrastructure based on the Virtual Observatory data grid and the various astroinformatics efforts now under way. This is inherently an astronomy of telescope-computational systems that increasingly depends on novel machine learning and artificial intelligence tools.
Another arena with a strong potential for discovery is an archival, non-time-critical exploration of the time domain, with the time dimension adding complexity to an already challenging problem of data mining of highly dimensional data parameter spaces.
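The two-step process described above (classify events, then allocate scarce follow-up time) can be caricatured in a few lines. Everything here is a toy sketch: the event names, probabilities, and observing costs are invented, and real pipelines use far richer classifiers and schedulers.

```python
# Toy sketch of follow-up prioritization: rank candidate events by
# scientific value per hour of telescope time, then greedily fill a
# fixed follow-up budget. All numbers below are illustrative.

def prioritize(events, budget_hours):
    """Greedy allocation under a time budget."""
    ranked = sorted(events,
                    key=lambda e: e["p_interesting"] / e["cost_hours"],
                    reverse=True)
    selected, used = [], 0.0
    for e in ranked:
        if used + e["cost_hours"] <= budget_hours:
            selected.append(e["id"])
            used += e["cost_hours"]
    return selected, used

events = [
    {"id": "evt-1", "p_interesting": 0.90, "cost_hours": 2.0},  # likely peculiar SN
    {"id": "evt-2", "p_interesting": 0.20, "cost_hours": 0.5},  # probable variable star
    {"id": "evt-3", "p_interesting": 0.60, "cost_hours": 3.0},  # possible AGN flare
]

chosen, hours = prioritize(events, budget_hours=3.0)
print(chosen, hours)  # evt-1 and evt-2 fit the budget; evt-3 does not
```

A greedy value-per-hour ranking is the simplest possible allocator; as the abstract notes, real systems must also cope with follow-up resources that change in time.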

  19. Two More Candidate AM Canum Venaticorum (am CVn) Binaries from the Sloan Digital Sky Survey

    NASA Astrophysics Data System (ADS)

    Anderson, Scott F.; Becker, Andrew C.; Haggard, Daryl; Prieto, Jose Luis; Knapp, Gillian R.; Sako, Masao; Halford, Kelly E.; Jha, Saurabh; Martin, Blake; Holtzman, Jon; Frieman, Joshua A.; Garnavich, Peter M.; Hayward, Suzanne; Ivezić, Željko; Mukadam, Anjum S.; Sesar, Branimir; Szkody, Paula; Malanushenko, Viktor; Richmond, Michael W.; Schneider, Donald P.; York, Donald G.

    2008-06-01

AM CVn systems are a select group of ultracompact binaries with the shortest orbital periods of any known binary subclass; mass transfer is likely from a low-mass (partially-)degenerate secondary onto a white dwarf primary, driven by gravitational radiation. In the past few years, the Sloan Digital Sky Survey (SDSS) has provided five new AM CVns. Here we report on two further candidates selected from more recent SDSS data. SDSS J1208+3550 is similar to the earlier SDSS discoveries, recognized as an AM CVn via its distinctive spectrum, which is dominated by helium emission. From the expanded SDSS Data Release 6 (DR6) spectroscopic area, we provide an updated surface density estimate for such AM CVns of order 10^-3.1 to 10^-2.5 deg^-2 for 15 < g < 20.5. In addition, we present another new candidate AM CVn, SDSS J2047+0008, which was discovered in the course of follow-up of SDSS-II supernova candidates. It shows nova-like outbursts in multi-epoch imaging data; in contrast to the other SDSS AM CVn discoveries, its (outburst) spectrum is dominated by helium absorption lines, reminiscent of KL Dra and 2003aw. The variability selection of SDSS J2047+0008 from the 300 deg2 of SDSS Stripe 82 presages further AM CVn discoveries in future deep, multicolor, and time-domain surveys such as the Large Synoptic Survey Telescope (LSST). The new additions bring the total SDSS yield to seven AM CVns thus far, a substantial contribution to this rare subclass, versus the dozen previously known. Includes optical observations obtained with the Sloan Digital Sky Survey I and II and the Apache Point Observatory (APO) 3.5 m telescope which is owned and operated by the Astrophysical Research Consortium (ARC), and the WIYN Observatory which is a joint facility of the University of Wisconsin, Indiana University, Yale University, and NOAO.
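As a quick consistency check on the figures quoted above, one can multiply the quoted surface density range by the Stripe 82 area; only numbers stated in the abstract are used here.

```python
# Expected number of AM CVn systems with 15 < g < 20.5 in the ~300 deg^2
# of SDSS Stripe 82, given a surface density of order 10^-3.1 to 10^-2.5
# systems per square degree.
area_deg2 = 300.0
lo, hi = 10 ** -3.1, 10 ** -2.5
expected_lo, expected_hi = area_deg2 * lo, area_deg2 * hi
print(f"{expected_lo:.2f} to {expected_hi:.2f} systems expected")  # → 0.24 to 0.95
```

An expectation of under one system at that depth is consistent with the single variability-selected discovery (SDSS J2047+0008) reported in Stripe 82.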

  20. A direct measurement of the high-mass end of the velocity dispersion function at z ~ 0.55 from SDSS-III/BOSS

    DOE PAGES

    Montero-Dorta, Antonio D.; Bolton, Adam S.; Shu, Yiping

    2017-02-24

When two galaxies that are distant from one another (and also distant from Earth) happen to lie along a single line of sight in the sky, the resulting phenomenon is known as a “gravitational lens.” The gravity of the nearer galaxy warps the image of the more distant galaxy into multiple images or complete rings (known as “Einstein rings,” since the quantitative description of the gravitational lensing effect relies on Einstein’s theory of gravity). Strong gravitational lens systems have multiple scientific applications. If the more distant galaxy happens to contain a time-varying quasar (bright emission powered by a supermassive black hole at the galaxy’s center) or supernova explosion, the time delay between multiple images can be used as a probe of the expansion rate of the universe (and other cosmological parameters). Forecasting the incidence of gravitational lenses in future large-scale sky surveys relies on quantifying the population of potential lens galaxies in the universe in terms of their abundance and their lensing efficiency. The lensing efficiency is most directly correlated with the galaxy’s “velocity dispersion”: the characteristic speed with which stars in the galaxy are orbiting under the influence of the galaxy’s overall gravitational field. This paper uses previous results quantifying the combined demographics of galaxies in brightness and velocity dispersion to compute the demographics of massive “elliptical” galaxies in velocity dispersion alone, thereby providing the essential ingredient for forecasting the expected incidence of strong gravitational lensing by these types of galaxies in future sky surveys such as DESI and LSST. These results are also applicable to the association of massive galaxies with their associated dark-matter “halos,” which is an essential ingredient for the most accurate and informative extraction of cosmological parameters from the data sets produced by large-scale surveys of the universe.
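The core computation described above, going from joint brightness-and-dispersion demographics to demographics in velocity dispersion alone, amounts to marginalizing a joint density over luminosity, phi(sigma) = integral of phi(L, sigma) dL. The sketch below is not the paper's actual model: the Schechter-like luminosity term, the toy L-sigma relation, and every parameter value are assumptions for illustration only.

```python
# Illustrative marginalization of a toy joint luminosity-dispersion
# density over luminosity to obtain a velocity dispersion function.
import math

def phi_joint(L, sigma, L_star=1.0, alpha=-1.0, scatter=0.1):
    # Schechter-like term in luminosity (assumed form)
    schechter = (L / L_star) ** alpha * math.exp(-L / L_star)
    # Log-normal in sigma about a toy L-sigma trend (assumed form)
    sigma_med = 200.0 * (L / L_star) ** 0.25   # km/s
    x = math.log10(sigma / sigma_med)
    lognorm = math.exp(-0.5 * (x / scatter) ** 2) / (scatter * math.sqrt(2 * math.pi))
    return schechter * lognorm

def phi_sigma(sigma, n=2000):
    # Trapezoidal integration over a luminosity grid
    Ls = [0.01 + i * (10.0 - 0.01) / n for i in range(n + 1)]
    vals = [phi_joint(L, sigma) for L in Ls]
    h = Ls[1] - Ls[0]
    return h * (sum(vals) - 0.5 * (vals[0] + vals[-1]))

print(phi_sigma(150.0), phi_sigma(300.0))
```

Even this toy model reproduces the qualitative behavior that matters for lens forecasting: the abundance falls off steeply toward the high-dispersion end, which is the regime the paper measures directly.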

  1. Current and Future X-ray Studies of High-Redshift AGNs and the First Supermassive Black Holes

    NASA Astrophysics Data System (ADS)

    Brandt, Niel

    2016-01-01

X-ray observations of high-redshift AGNs at z = 4-7 have played a critical role in understanding the physical processes at work in these objects as well as their basic demographics. Since 2000, Chandra and XMM-Newton have provided new X-ray detections for more than 120 such objects, and well-defined samples of z > 4 AGNs now allow reliable X-ray population studies. Once luminosity effects are considered, the basic X-ray continuum properties of most high-redshift AGNs appear remarkably similar to those of local AGNs, although there are some notable apparent exceptions (e.g., highly radio-loud quasars). Furthermore, the X-ray absorption found in some objects has been used as a diagnostic of outflowing winds and circumnuclear material. Demographically, the X-ray data now support an exponential decline in the number density of luminous AGNs above z ~ 3, and quantitative space-density comparisons for optically selected and X-ray selected quasars indicate basic statistical agreement. The current X-ray discoveries point the way toward the future breakthroughs that will be possible with, e.g., Athena and the X-ray Surveyor. These missions will execute powerful blank-field surveys to elucidate the demographics of the first growing supermassive black holes (SMBHs), including highly obscured systems, up to z ~ 10. They will also carry out complementary X-ray spectroscopic and variability investigations of high-redshift AGNs by targeting the most-luminous z = 7-10 quasars found in wide-field surveys by, e.g., Euclid, LSST, and WFIRST. X-ray spectroscopic and variability studies of the X-ray continuum and reflection signatures will help determine Eddington ratios and disk/corona properties; measuring these will clarify how the first quasars grew so quickly. Furthermore, absorption line/edge studies will reveal how outflows from the first SMBHs influenced the growth of the first galaxies. I will suggest some efficient observational strategies for Athena and the X-ray Surveyor.

  2. Space power systems technology enablement study. [for the space transportation system]

    NASA Technical Reports Server (NTRS)

    Smith, L. D.; Stearns, J. W.

    1978-01-01

Power system technologies that enable or enhance future space missions requiring a few kilowatts or less and using the Space Shuttle were assessed. The advances in space power systems necessary for supporting the capabilities of the space transportation system were systematically determined, and benefit/cost/risk analyses were used to identify high-payoff technologies and technological priorities. The missions that are enhanced by each development are discussed.

  3. Patient source of learning about health technologies and ratings of trust in technologies used in their care

    PubMed Central

    Montague, Enid

    2011-01-01

In order to design effective health technologies and systems, it is important to understand how patients learn and make decisions about health technologies used in their care. The objective of this study was to examine patients' sources of learning about technologies used in their care and how the source related to their trust in the technology used. Individual face-to-face and telephone interviews were conducted with 24 patients. Thirteen unique sources of information about technology were identified, and three major themes emerged: outside of the work system versus inside the work system, when the health information was provided, and the medium used. Patients used multiple sources outside of the health care work system to learn about technologies that will be used in their care. Results showed a relationship between learning about technologies from web sources and trust in technologies, but no relationship between learning about technologies from health care providers and trust in technologies. PMID:20967654

  4. On determining specifications and selections of alternative technologies for airport checked-baggage security screening.

    PubMed

    Feng, Qianmei

    2007-10-01

Federal law mandates that every checked bag at all commercial airports be screened by explosive detection systems (EDS), explosive trace detection systems (ETD), or alternative technologies. These technologies serve as critical components of airport security systems that strive to reduce security risks at both national and global levels. To improve operational efficiency and airport security, emerging image-based technologies have been developed, such as dual-energy X-ray (DX), backscatter X-ray (BX), and multiview tomography (MVT). These technologies differ widely in purchasing cost, maintenance cost, operating cost, processing rate, and accuracy. Based on a mathematical framework that takes into account all these factors, this article investigates two critical issues for operating screening devices: setting specifications for continuous security responses by different technologies; and selecting a technology or combination of technologies for efficient 100% baggage screening. For continuous security responses, specifications or thresholds are used to separate threat items from nonthreat items. By investigating the setting of specifications on system security responses, this article assesses the risk and cost effectiveness of various technologies for both single-device and two-device systems. The findings provide the best selection of image-based technologies for both single-device and two-device systems. Our study suggests that two-device systems outperform single-device systems in terms of both cost effectiveness and accuracy. The model can be readily extended to evaluate risk and cost effectiveness of multiple-device systems for airport checked-baggage security screening.
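The threshold-setting idea in this abstract can be sketched with a simple expected-cost model. All probabilities and costs below are made-up illustrations, not data from the article; the serial-confirmation rule (a bag is flagged only if both devices alarm) is one common way to combine two devices.

```python
# Hedged sketch of threshold trade-offs in baggage screening. A device's
# threshold trades detection probability against false-alarm probability;
# a serial two-device system re-screens only bags that alarm on device 1.

def expected_cost(p_threat, detect, false_alarm,
                  c_miss=1e6, c_false=50.0):
    """Expected cost per bag: missed threats are very expensive,
    false alarms cost manual-inspection time. Illustrative costs."""
    return (p_threat * (1 - detect) * c_miss
            + (1 - p_threat) * false_alarm * c_false)

# Single device at some threshold setting (assumed operating point):
d1, f1 = 0.95, 0.10
# Second device re-screening the alarms (assumed operating point):
d2, f2 = 0.93, 0.08

# Serial confirmation: both must alarm to declare a threat.
d_sys, f_sys = d1 * d2, f1 * f2
print(d_sys, f_sys)  # detection drops slightly; false alarms drop ~10x
```

This illustrates why the article finds two-device systems attractive: serial confirmation cuts false alarms by an order of magnitude at a modest detection penalty, a trade the full framework weighs against purchase and operating costs.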

  5. A Probabilistic System Analysis of Intelligent Propulsion System Technologies

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.

    2007-01-01

NASA's Intelligent Propulsion System Technology (Propulsion 21) project focuses on developing adaptive technologies that will enable commercial gas turbine engines to produce fewer emissions and less noise while increasing reliability. It features adaptive technologies that have included active tip-clearance control for turbine and compressor, active combustion control, turbine aero-thermal and flow control, and enabling technologies such as sensors which are reliable at high operating temperatures and are minimally intrusive. A probabilistic system analysis is performed to evaluate the impact of these technologies on aircraft CO2 (directly proportional to fuel burn) and LTO (landing and takeoff) NO(x) reductions. A 300-passenger aircraft, with two 396-kN thrust (85,000-pound) engines, is chosen for the study. The results show that NASA's Intelligent Propulsion System technologies have the potential to significantly reduce the CO2 and NO(x) emissions. The results are used to support informed decision-making on the development of the intelligent propulsion system technology portfolio for CO2 and NO(x) reductions.
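A probabilistic system analysis of the kind described above can be sketched as a Monte Carlo over uncertain per-technology benefits. The technology names echo the abstract, but every benefit range below is invented for illustration; the actual study's distributions and engine model are far more detailed.

```python
# Minimal Monte Carlo sketch: sample each technology's uncertain
# fuel-burn benefit, sum them, and summarize the resulting distribution
# of CO2-proportional fuel-burn reduction.
import random

random.seed(42)

# (low, high) fractional fuel-burn reduction per technology -- assumptions
techs = {
    "active tip-clearance control": (0.005, 0.015),
    "active combustion control":    (0.002, 0.010),
    "turbine flow control":         (0.003, 0.012),
}

def sample_total_reduction():
    return sum(random.uniform(lo, hi) for lo, hi in techs.values())

samples = sorted(sample_total_reduction() for _ in range(10_000))
p05, p50, p95 = (samples[int(q * len(samples))] for q in (0.05, 0.50, 0.95))
print(f"median reduction {p50:.3%} (90% band {p05:.3%} to {p95:.3%})")
```

The output of such an analysis is a distribution rather than a point estimate, which is what supports the "informed decision-making" the abstract mentions.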

  6. Life support systems analysis and technical trades for a lunar outpost

    NASA Technical Reports Server (NTRS)

    Ferrall, J. F.; Ganapathi, G. B.; Rohatgi, N. K.; Seshan, P. K.

    1994-01-01

The NASA/JPL Life Support Systems Analysis (LiSSA) software tool was used to perform life support system analysis and technology trades for a Lunar Outpost. The life support system was modeled using a chemical process simulation program on a steady-state, one-person, daily basis. Inputs to the LiSSA model include metabolic balance load data, hygiene load data, technology selection, process operational assumptions, and mission parameter assumptions. A baseline set of technologies was used, against which comparisons were made by running twenty-two cases with technology substitutions. System, subsystem, and technology weights and powers are compared for a crew of 4 and missions of 90 and 600 days. By assigning a weight value to power, equivalent system weights are compared. Several less-developed technologies show potential advantages over the baseline. Solid waste treatment technologies show weight and power disadvantages, but one could have benefits associated with the reduction of hazardous wastes on very long missions. Technology development toward reducing the weight of resupplies and lighter materials of construction was recommended. It was also recommended that, as technologies are funded for development, contractors be required to generate and report data useful for quantitative technology comparisons.
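The "equivalent system weight" comparison mentioned above can be sketched as: convert each technology's power draw into an equivalent mass via a mass-per-kilowatt penalty, then add hardware mass and mission-length-scaled resupply mass. The penalty value and both technology cases below are illustrative assumptions, not LiSSA data.

```python
# Toy equivalent-system-weight (ESW) comparison for a life support trade.
POWER_PENALTY_KG_PER_KW = 250.0   # assumed mass cost of providing 1 kW

def equivalent_system_weight(hardware_kg, power_kw,
                             resupply_kg_per_day, mission_days):
    return (hardware_kg
            + power_kw * POWER_PENALTY_KG_PER_KW
            + resupply_kg_per_day * mission_days)

# Baseline vs. a less-developed alternative, 90-day mission (toy numbers):
baseline = equivalent_system_weight(1200.0, 8.0, 10.0, 90)    # 4100 kg
candidate = equivalent_system_weight(1500.0, 6.0, 6.0, 90)    # 3540 kg
print(baseline, candidate)
```

Note how the comparison can flip with mission length: a heavier but more regenerative technology wins once the resupply term dominates, which is why the study runs both 90- and 600-day missions.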

  7. The Innovation Deficit in Urban Water: The Need for an Integrated Perspective on Institutions, Organizations, and Technology.

    PubMed

    Kiparsky, Michael; Sedlak, David L; Thompson, Barton H; Truffer, Bernhard

    2013-08-01

    Interaction between institutional change and technological change poses important constraints on transitions of urban water systems to a state that can meet future needs. Research on urban water and other technology-dependent systems provides insights that are valuable to technology researchers interested in assuring that their efforts will have an impact. In the context of research on institutional change, innovation is the development, application, diffusion, and utilization of new knowledge and technology. This definition is intentionally inclusive: technological innovation will play a key role in reinvention of urban water systems, but is only part of what is necessary. Innovation usually depends on context, such that major changes to infrastructure include not only the technological inventions that drive greater efficiencies and physical transformations of water treatment and delivery systems, but also the political, cultural, social, and economic factors that hinder and enable such changes. On the basis of past and present changes in urban water systems, institutional innovation will be of similar importance to technological innovation in urban water reinvention. To solve current urban water infrastructure challenges, technology-focused researchers need to recognize the intertwined nature of technologies and institutions and the social systems that control change.

  8. NASA's Launch Propulsion Systems Technology Roadmap

    NASA Technical Reports Server (NTRS)

    McConnaughey, Paul K.; Femminineo, Mark G.; Koelfgen, Syri J.; Lepsch, Roger A; Ryan, Richard M.; Taylor, Steven A.

    2012-01-01

Safe, reliable, and affordable access to low-Earth orbit (LEO) is necessary for all of the United States (US) space endeavors. In 2010, NASA's Office of the Chief Technologist commissioned 14 teams to develop technology roadmaps that could be used to guide the Agency's and US technology investment decisions for the next few decades. The Launch Propulsion Systems Technology Area (LPSTA) team was tasked to address the propulsion technology challenges for access to LEO. The developed LPSTA roadmap addresses technologies that enhance existing solid or liquid propulsion technologies and their related ancillary systems, or significantly advance the technology readiness level (TRL) of less mature systems like airbreathing, unconventional, and other launch technologies. In developing this roadmap, the LPSTA team consulted previous NASA, military, and industry studies as well as subject matter experts to develop their assessment of this field, which has fundamental technological and strategic impacts on US space capabilities.

  9. Cost-effective implementation of intelligent systems

    NASA Technical Reports Server (NTRS)

    Lum, Henry, Jr.; Heer, Ewald

    1990-01-01

Significant advances have occurred during the last decade in knowledge-based engineering research and knowledge-based system (KBS) demonstrations and evaluations using integrated intelligent system technologies. Performance and simulation data obtained to date in real-time operational environments suggest that cost-effective utilization of intelligent system technologies can be realized. In this paper, the rationale and potential benefits are discussed for typical examples of application projects that demonstrate an increase in productivity through the use of intelligent system technologies. These demonstration projects have provided insight into additional technology needs and cultural barriers which are currently impeding the transition of the technology into operational environments. Proposed methods that address technology evolution and implementation are also discussed.

  10. Using IoT Device Technology in Spacecraft Checkout Systems

    NASA Astrophysics Data System (ADS)

    Plummer, Chris

    2015-09-01

    The Internet of Things (IoT) has become a common theme in both the technical and popular press in recent years because many of the enabling technologies that are required to make IoT a reality have now matured. Those technologies are revolutionising the way industrial systems and products are developed because they offer significant advantages over older technologies. This paper looks at how IoT device technology can be used in spacecraft checkout systems to achieve smaller, more capable, and more scalable solutions than are currently available. It covers the use of IoT device technology for classical spacecraft test systems as well as for hardware-in-the-loop simulation systems used to support spacecraft checkout.

  11. Use of Ubiquitous Technologies in Military Logistic System in Iran

    NASA Astrophysics Data System (ADS)

    Jafari, P.; Sadeghi-Niaraki, A.

    2013-09-01

This study concerns the integration and evaluation of RFID and ubiquitous technologies in military logistic system management. Firstly, supply chain management and the necessity of a revolution in logistic systems, especially in the military area, are explained. Secondly, RFID and ubiquitous technologies and the advantages of their use in supply chain management are introduced. Lastly, a system based on these technologies for controlling and increasing the speed and accuracy of the military logistic system in Iran, with its unique properties, is presented. The system is based on full control of military logistics (supplies) from the time of deployment to replenishment, using sensor network, ubiquitous, and RFID technologies.

  12. Enhanced technologies for unattended ground sensor systems

    NASA Astrophysics Data System (ADS)

    Hartup, David C.

    2010-04-01

    Progress in several technical areas is being leveraged to advantage in Unattended Ground Sensor (UGS) systems. This paper discusses advanced technologies that are appropriate for use in UGS systems. While some technologies provide evolutionary improvements, other technologies result in revolutionary performance advancements for UGS systems. Some specific technologies discussed include wireless cameras and viewers, commercial PDA-based system programmers and monitors, new materials and techniques for packaging improvements, low power cueing sensor radios, advanced long-haul terrestrial and SATCOM radios, and networked communications. Other technologies covered include advanced target detection algorithms, high pixel count cameras for license plate and facial recognition, small cameras that provide large stand-off distances, video transmissions of target activity instead of still images, sensor fusion algorithms, and control center hardware. The impact of each technology on the overall UGS system architecture is discussed, along with the advantages provided to UGS system users. Areas of analysis include required camera parameters as a function of stand-off distance for license plate and facial recognition applications, power consumption for wireless cameras and viewers, sensor fusion communication requirements, and requirements to practically implement video transmission through UGS systems. Examples of devices that have already been fielded using technology from several of these areas are given.
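One of the analyses mentioned above, required camera parameters as a function of stand-off distance for license plate recognition, reduces to simple imaging geometry: the ground sample distance (GSD) at range R is R x pixel_pitch / focal_length, and the pixel count across a plate is plate_width / GSD. The sensor parameters below are illustrative assumptions, not values from any fielded UGS camera.

```python
# Pixels across a license plate vs. stand-off distance (toy sensor).

def pixels_on_target(range_m, target_width_m,
                     pixel_pitch_m=3.45e-6, focal_length_m=0.10):
    gsd = range_m * pixel_pitch_m / focal_length_m   # metres per pixel at range
    return target_width_m / gsd

plate_width = 0.30   # m, approximate license plate width
for r in (25, 50, 100):
    print(r, round(pixels_on_target(r, plate_width)))
```

Pixel count falls inversely with range, so doubling the stand-off distance halves the pixels available for character recognition; this is the trade that drives the high-pixel-count cameras and long focal lengths discussed in the paper.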

  13. Technical analysis of US Army Weapons Systems and related advanced technologies of military interest. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1991-06-14

This report summarizes the activities and accomplishments of a US Army technology security project designed to identify and develop effective policy guidelines for militarily critical technologies in specific Army systems and in broad generic technology areas of military interest. Individual systems analyses are documented in separate Weapons Systems Technical Assessments (WSTAs), and the general generic technology areas are evaluated in the Advanced Technology Assessment Reports (ATARs). However, specific details of these assessments are not addressed here; only recommendations regarding aspects of the defined approach, methodology, and format are provided and discussed.

  14. Trust in technology-mediated collaborative health encounters: constructing trust in passive user interactions with technologies.

    PubMed

    Montague, Enid; Asan, Onur

    2012-01-01

    The present study investigated factors that explain patient trust in health technology and the relationship between patient trust in technology and trust in their care provider. Sociotechnical systems theory states that changes in one part of the system are likely related to other parts of the system. Therefore, attitudes about technologies, like trust, are likely related to other aspects of the system. Contributing to appropriate trust at the technological, interpersonal, and system levels can potentially lead to positive health outcomes. The study described in this manuscript used data collected from 101 patients with a Trust in Medical Technology instrument. The instrument measured patients' trust in (1) their providers, (2) the technology, and (3) how their providers used the technology. Measure 3 was positively associated with measures 1 and 2, while measures 1 and 2 were not positively or negatively associated with one another. These results may indicate that patient assessments of the trustworthiness of care providers and technologies are based on their observations of how providers use technologies. Though patients are not active users of technologies in health care, the results of this study show that their perceptions of how providers use technology are related to their trust in both technology and the care provider. Study findings have implications for how trust is conceptualised and measured in interpersonal relationships and in technologies.

  15. A Systems Definition of Educational Technology in Society

    ERIC Educational Resources Information Center

    Luppicini, Rocci

    2005-01-01

    Conceptual development in the field of Educational Technology provides crucial theoretical grounding for ongoing research and practice. This essay draws from theoretical developments both within and external to the field of Educational Technology to articulate a systems definition of Educational Technology in Society. A systems definition of…

  16. 32 CFR 147.15 - Guideline M-Misuse of Information technology systems.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 1 2010-07-01 2010-07-01 false Guideline M-Misuse of Information technology... CLASSIFIED INFORMATION Adjudication § 147.15 Guideline M—Misuse of Information technology systems. (a) The... technology systems may raise security concerns about an individual's trustworthiness, willingness, and...

  17. An Approach to Establishing System Benefits for Technology in NASA's Hypersonics Investment Area

    NASA Technical Reports Server (NTRS)

    Hueter, Uwe; Pannell, Bill; Cook, Stephen (Technical Monitor)

    2001-01-01

NASA has established long-term goals for access to space. The third-generation launch systems are to be fully reusable and operational around 2025. The goals for the third-generation launch system are to significantly reduce cost and improve safety over current systems. The Advanced Space Transportation Program (ASTP) Office at NASA's Marshall Space Flight Center in Huntsville, AL, has the agency lead to develop space transportation technologies. Within ASTP, under the Hypersonics Investment Area, third-generation technologies are being pursued. The Hypersonics Investment Area's primary objective is to mature vehicle technologies to enable substantial increases in the design and operating margins of third-generation RLVs (the current Space Shuttle is considered the first-generation RLV) by incorporating advanced propulsion systems, materials, structures, thermal protection systems, power, and avionics technologies. The paper describes the system process, tools, and concepts used to determine the technology benefits. Preliminary results are presented along with the current technology investments being made by ASTP's Hypersonics Investment Area.

  18. Technology Innovations from NASA's Next Generation Launch Technology Program

    NASA Technical Reports Server (NTRS)

    Cook, Stephen A.; Morris, Charles E. K., Jr.; Tyson, Richard W.

    2004-01-01

    NASA's Next Generation Launch Technology Program has been on the cutting edge of technology, improving the safety, affordability, and reliability of future space-launch-transportation systems. The array of projects focused on propulsion, airframe, and other vehicle systems. Achievements range from building miniature fuel/oxygen sensors to hot-firings of major rocket-engine systems as well as extreme thermo-mechanical testing of large-scale structures. Results to date have significantly advanced technology readiness for future space-launch systems using either airbreathing or rocket propulsion.

  19. Application of data mining in science and technology management information system based on WebGIS

    NASA Astrophysics Data System (ADS)

    Wu, Xiaofang; Xu, Zhiyong; Bao, Shitai; Chen, Feixiang

    2009-10-01

With the rapid development of science and technology and the quick increase of information, a great deal of data is accumulated in the management departments of science and technology. Much knowledge and many rules are contained and concealed in these data. Therefore, how to excavate and use this knowledge fully is very important in the management of science and technology. It will help to examine and approve science and technology projects more scientifically and make it easier to transform achievements into realistic productive forces. Therefore, data mining technology is researched in this paper and applied to the science and technology management information system to find and excavate knowledge. After analyzing the disadvantages of traditional science and technology management information systems, database technology, data mining, and web geographic information systems (WebGIS) technology are introduced to develop and construct a science and technology management information system based on WebGIS. The key problems, such as data mining and statistical analysis, are researched in detail. Moreover, a prototype system was developed and validated based on the project data of the National Natural Science Foundation Committee. Spatial data mining is performed along the axes of time, space, and other factors, and a variety of knowledge and rules are excavated using data mining technology, which helps to provide effective support for decision-making.

  20. Tools and technologies for expert systems: A human factors perspective

    NASA Technical Reports Server (NTRS)

    Rajaram, Navaratna S.

    1987-01-01

It is widely recognized that technologies based on artificial intelligence (AI), especially expert systems, can make significant contributions to the productivity and effectiveness of operations of information- and knowledge-intensive organizations such as NASA. At the same time, these being relatively new technologies, there is the problem of transferring technology to key personnel of such organizations. The problems of examining the potential of expert systems and of technology transfer are addressed in the context of human factors applications. One of the topics of interest was the investigation of the potential use of expert system building tools, particularly NEXPERT, as a technology transfer medium. Two basic conclusions were reached in this regard. First, NEXPERT is an excellent tool for rapid prototyping of experimental expert systems, but not ideal as a delivery vehicle. Therefore, it is not a substitute for general purpose system implementation languages such as LISP or C. This assertion probably holds for nearly all such tools on the market today. Second, an effective technology transfer mechanism is to formulate and implement expert systems for problems which members of the organization in question can relate to. For this purpose, the LIghting EnGineering Expert (LIEGE) was implemented using NEXPERT as the tool for technology transfer and to illustrate the value of expert systems to the activities of the Man-System Division.

  1. Applying Sustainable Systems Development Approach to Educational Technology Systems

    ERIC Educational Resources Information Center

    Huang, Albert

    2012-01-01

    Information technology (IT) is an essential part of modern education. The roles and contributions of technology to education have been thoroughly documented in academic and professional literature. Despite the benefits, the use of educational technology systems (ETS) also creates a significant impact on the environment, primarily due to energy…

  2. 75 FR 67804 - Future Systems Technology Advisory Panel Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-03

    ... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA-2010-0071] Future Systems Technology Advisory Panel... recommendations on the future of systems technology and electronic services at the agency five to ten years into... technologies may be developed to assist in carrying out its statutory mission. Advice and recommendations can...

  3. 1991 NASA Life Support Systems Analysis workshop

    NASA Technical Reports Server (NTRS)

    Evanich, Peggy L.; Crabb, Thomas M.; Gartrell, Charles F.

    1992-01-01

    The 1991 Life Support Systems Analysis Workshop was sponsored by NASA Headquarters' Office of Aeronautics and Space Technology (OAST) to foster communication among NASA, industrial, and academic specialists, and to integrate their inputs and disseminate information to them. The overall objective of systems analysis within the Life Support Technology Program of OAST is to identify, guide the development of, and verify designs which will increase the performance of the life support systems on component, subsystem, and system levels for future human space missions. The specific goals of this workshop were to report on the status of systems analysis capabilities, to integrate the chemical processing industry technologies, and to integrate recommendations for future technology developments related to systems analysis for life support systems. The workshop included technical presentations, discussions, and interactive planning, with time allocated for discussion of both technology status and time-phased technology development recommendations. Key personnel from NASA, industry, and academia delivered inputs and presentations on the status and priorities of current and future systems analysis methods and requirements.

  4. Technology for the Future: In-Space Technology Experiments Program, part 2

    NASA Technical Reports Server (NTRS)

    Breckenridge, Roger A. (Compiler); Clark, Lenwood G. (Compiler); Willshire, Kelli F. (Compiler); Beck, Sherwin M. (Compiler); Collier, Lisa D. (Compiler)

    1991-01-01

    The purpose of the Office of Aeronautics and Space Technology (OAST) In-Space Technology Experiments Program In-STEP 1988 Workshop was to identify and prioritize technologies that are critical for future national space programs and require validation in the space environment, and review current NASA (In-Reach) and industry/ university (Out-Reach) experiments. A prioritized list of the critical technology needs was developed for the following eight disciplines: structures; environmental effects; power systems and thermal management; fluid management and propulsion systems; automation and robotics; sensors and information systems; in-space systems; and humans in space. This is part two of two parts and contains the critical technology presentations for the eight theme elements and a summary listing of critical space technology needs for each theme.

  5. A case for Sandia investment in complex adaptive systems science and technology.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colbaugh, Richard; Tsao, Jeffrey Yeenien; Johnson, Curtis Martin

    2012-05-01

    This white paper makes a case for Sandia National Laboratories investments in complex adaptive systems science and technology (S&T) -- investments that could enable higher-value-added and more-robustly-engineered solutions to challenges of importance to Sandia's national security mission and to the nation. Complex adaptive systems are ubiquitous in Sandia's national security mission areas. We often ignore the adaptive complexity of these systems by narrowing our 'aperture of concern' to systems or subsystems with a limited range of function exposed to a limited range of environments over limited periods of time. But by widening our aperture of concern we could increase our impact considerably. To do so, the science and technology of complex adaptive systems must mature considerably. Despite an explosion of interest outside of Sandia, however, that science and technology is still in its youth. What has been missing is contact with real (rather than model) systems and real domain-area detail. With its center of gravity as an engineering laboratory, Sandia has made considerable progress applying existing science and technology to real complex adaptive systems. It has focused much less, however, on advancing the science and technology itself. But its close contact with real systems and real domain-area detail represents a powerful strength with which to help complex adaptive systems science and technology mature. Sandia is thus both a prime beneficiary of and potentially a prime contributor to complex adaptive systems science and technology.
Building a productive program in complex adaptive systems science and technology at Sandia will not be trivial, but a credible path can be envisioned: in the short run, continue to apply existing science and technology to real domain-area complex adaptive systems; in the medium run, jump-start the creation of new science and technology capability through Sandia's Laboratory Directed Research and Development program; and in the long run, inculcate an awareness at the Department of Energy of the importance of supporting complex adaptive systems science through its Office of Science.

  6. Beyond Our Boundaries: Research and Technology

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Topics considered include: Propulsion and Fluid Management; Structures and Dynamics; Materials and Manufacturing Processes; Sensor Technology; Software Technology; Optical Systems; Microgravity Science; Earth System Science; Astrophysics; Solar Physics; and Technology Transfer.

  7. An Approach to Establishing System Benefits for Technologies In NASA's Spaceliner Investment Area

    NASA Technical Reports Server (NTRS)

    Hueter, Uwe; Pannell, Bill; Lyles, Garry M. (Technical Monitor)

    2001-01-01

    NASA has established long-term goals for access to space. The third-generation launch systems are to be fully reusable and operational around 2025. The goals for the third-generation launch system are to significantly reduce cost and improve safety over current systems. The Advanced Space Transportation Program Office (ASTP) at NASA's Marshall Space Flight Center in Huntsville, AL has the agency lead to develop space transportation technologies. Within ASTP, under the Spaceliner Investment Area, third-generation technologies are being pursued. The Spaceliner Investment Area's primary objective is to mature vehicle technologies to enable substantial increases in the design and operating margins of third-generation RLVs (the current Space Shuttle is considered the first-generation RLV) by incorporating advanced propulsion systems, materials, structures, thermal protection systems, power, and avionics technologies. Advancements in design tools and better characterization of the operational environment will result in reduced design and operational variabilities, leading to improvements in margins. Improvements in operational efficiencies will be obtained through the introduction of integrated vehicle health management, operations, and range technologies. Investments in these technologies will enable the reduction of the high operational costs associated with today's vehicles by allowing components to operate well below their design points, resulting in improved component operating life, reliability, and safety, which in turn reduces both maintenance and refurbishment costs. The introduction of advanced technologies may enable horizontal takeoff by significantly reducing the takeoff weight and allowing use of existing infrastructure. This would be a major step toward the goal of airline-like operation. These factors, in conjunction with increased flight rates resulting from reductions in transportation costs, will lead to significant improvements in future vehicles. 
The real-world problem is that resources are limited and technologies need to be prioritized to assure the resources are spent on technologies that provide the highest system-level benefits. Toward that end, a systems approach is being taken to determine the benefits of technologies for the Spaceliner Investment Area. Technologies identified as enabling will be funded; the other technologies will be funded based on their system-level benefits. Since the final launch system concept will not be decided for many years, several vehicle concepts are being evaluated to determine technology benefits. Not only performance, but also cost and operability are being assessed. This will become an annual process to assess these technologies against their goals and their benefits to various launch system concepts. The paper describes the system process, tools, and concepts used to determine the technology benefits. Preliminary results will be presented along with the current technology investments that are being made by ASTP's Spaceliner Investment Area.

  8. Space Transportation Materials and Structures Technology Workshop. Volume 2; Proceedings

    NASA Technical Reports Server (NTRS)

    Cazier, Frank W., Jr. (Compiler); Gardner, James E. (Compiler)

    1993-01-01

    The Space Transportation Materials and Structures Technology Workshop was held on September 23-26, 1991, in Newport News, Virginia. The workshop, sponsored by the NASA Office of Space Flight and the NASA Office of Aeronautics and Space Technology, was held to provide a forum for communication within the space materials and structures technology developer and user communities. Workshop participants were organized into a Vehicle Technology Requirements session and three working panels: Materials and Structures Technologies for Vehicle Systems, Propulsion Systems, and Entry Systems.

  9. Systems Security Engineering

    DTIC Science & Technology

    2010-08-22

    Commission (IEC). "Information technology — Security techniques — Code of practice for information security management (ISO/IEC 27002) ... Information technology — Security techniques — Information security management systems — Requirements (ISO/IEC 27002)," "Information technology — Security ... was a draft ISO standard on Systems and software engineering, Systems and software assurance [18]. Created by systems engineers for systems

  10. Operationally Efficient Propulsion System Study (OEPSS) data book. Volume 3: Operations technology

    NASA Technical Reports Server (NTRS)

    Vilja, John O.

    1990-01-01

    The study was initiated to identify operational problems and cost drivers for current propulsion systems and to identify technology and design approaches to increase the operational efficiency and reduce operations costs for future propulsion systems. To provide readily usable data for the Advanced Launch System (ALS) program, the results of the OEPSS study were organized into a series of OEPSS Data Books. This volume describes operations technologies that will enhance operational efficiency of propulsion systems. A total of 15 operations technologies were identified that will eliminate or mitigate operations problems described in Volume 2. A recommended development plan is presented for eight promising technologies that will simplify the propulsion system and reduce operational requirements.

  11. Discerning Technological Systems Related to Everyday Objects: Mapping the Variation in Pupils' Experience

    ERIC Educational Resources Information Center

    Svensson, Maria; Ingerman, Ake

    2010-01-01

    Understanding technology today implies more than being able to use the technological objects present in our everyday lives. Our society is increasingly integrated with technological systems, of which technological objects, and their function, form a part. Technological literacy in that context implies understanding how knowledge is constituted in…

  12. Technology Alignment and Portfolio Prioritization (TAPP): Advanced Methods in Strategic Analysis, Technology Forecasting and Long Term Planning for Human Exploration and Operations, Advanced Exploration Systems and Advanced Concepts

    NASA Technical Reports Server (NTRS)

    Funaro, Gregory V.; Alexander, Reginald A.

    2015-01-01

    The Advanced Concepts Office (ACO) at NASA, Marshall Space Flight Center is expanding its current technology assessment methodologies. ACO is developing a framework called TAPP that uses a variety of methods, such as association mining and rule learning from data mining, structure development using a Technological Innovation System (TIS), and social network modeling to measure structural relationships. The role of ACO is to 1) produce a broad spectrum of ideas and alternatives for a variety of NASA's missions, 2) determine mission architecture feasibility and appropriateness to NASA's strategic plans, and 3) define a project in enough detail to establish an initial baseline capable of meeting mission objectives. ACO's role supports the decision-making process associated with the maturation of concepts for traveling through, living in, and understanding space. ACO performs concept studies and technology assessments to determine the degree of alignment between mission objectives and new technologies. The first step in technology assessment is to identify the current technology maturity in terms of a technology readiness level (TRL). The second step is to determine the difficulty associated with advancing a technology from one state to the next state. NASA has used TRLs since 1970 and formalized them in 1995. The DoD, ESA, Oil & Gas, and DoE have adopted TRLs as a means to assess technology maturity. However, "with the emergence of more complex systems and system of systems, it has been increasingly recognized that TRL assessments have limitations, especially when considering [the] integration of complex systems." When performing the second step in a technology assessment, NASA requires that an Advancement Degree of Difficulty (AD2) method be utilized. 
NASA has developed or used a variety of methods to perform this step: Expert Opinion or Delphi Approach, Value Engineering or Value Stream, Analytical Hierarchy Process (AHP), Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS), and other multi-criteria decision-making methods. These methods can be labor-intensive, often contain cognitive or parochial bias, and do not consider the competing prioritization between mission architectures. Strategic Decision-Making (SDM) processes cannot be properly understood unless the context of the technology is understood. This makes assessing technological change particularly challenging due to the relationships "between incumbent technology and the incumbent (innovation) system in relation to the emerging technology and the emerging innovation system." The central idea in technology dynamics is to consider all activities that contribute to the development, diffusion, and use of innovations as system functions. Bergek defines system functions within a TIS to address what is actually happening and has a direct influence on the ultimate performance of the system and technology development. ACO uses similar metrics and is expanding them to account for the structure and context of the technology. At NASA, technology and strategy are strongly interrelated. NASA's Strategic Space Technology Investment Plan (SSTIP) prioritizes those technologies essential to the pursuit of NASA's missions and national interests. The SSTIP is strongly coupled with NASA's Technology Roadmaps to provide investment guidance during the next four years, within a twenty-year horizon. This paper discusses the methods ACO is currently developing to better perform technology assessments while taking into consideration Strategic Alignment, Technology Forecasting, and Long Term Planning.

  13. The Innovation Deficit in Urban Water: The Need for an Integrated Perspective on Institutions, Organizations, and Technology

    PubMed Central

    Kiparsky, Michael; Sedlak, David L.; Thompson, Barton H.; Truffer, Bernhard

    2013-01-01

    Interaction between institutional change and technological change poses important constraints on transitions of urban water systems to a state that can meet future needs. Research on urban water and other technology-dependent systems provides insights that are valuable to technology researchers interested in assuring that their efforts will have an impact. In the context of research on institutional change, innovation is the development, application, diffusion, and utilization of new knowledge and technology. This definition is intentionally inclusive: technological innovation will play a key role in reinvention of urban water systems, but is only part of what is necessary. Innovation usually depends on context, such that major changes to infrastructure include not only the technological inventions that drive greater efficiencies and physical transformations of water treatment and delivery systems, but also the political, cultural, social, and economic factors that hinder and enable such changes. On the basis of past and present changes in urban water systems, institutional innovation will be of similar importance to technological innovation in urban water reinvention. To solve current urban water infrastructure challenges, technology-focused researchers need to recognize the intertwined nature of technologies and institutions and the social systems that control change. PMID:23983450

  14. Advances in gallium arsenide monolithic microwave integrated-circuit technology for space communications systems

    NASA Technical Reports Server (NTRS)

    Bhasin, K. B.; Connolly, D. J.

    1986-01-01

    Future communications satellites are likely to use gallium arsenide (GaAs) monolithic microwave integrated-circuit (MMIC) technology in most, if not all, communications payload subsystems. Multiple-scanning-beam antenna systems are expected to use GaAs MMIC's to increase functional capability, to reduce volume, weight, and cost, and to greatly improve system reliability. RF and IF matrix switch technology based on GaAs MMIC's is also being developed for these reasons. MMIC technology, including gigabit-rate GaAs digital integrated circuits, offers substantial advantages in power consumption and weight over silicon technologies for high-throughput, on-board baseband processor systems. In this paper, current developments in GaAs MMIC technology are described, and the status and prospects of the technology are assessed.

  15. NASA technology transfer network communications and information system: TUNS user survey

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Applied Expertise surveyed users of the deployed Technology Utilization Network System (TUNS) and prospective new users in order to gather background information for developing the Concept Document of the system that will upgrade and replace TUNS. Survey participants broadly agree that automated mechanisms for acquiring, managing, and disseminating new technology and spinoff benefits information can and should play an important role in meeting NASA technology utilization goals. However, TUNS does not meet this need for most users. The survey describes a number of systematic improvements that will make it easier to use the technology transfer mechanism, and thus expedite the collection and dissemination of technology information. The survey identified 26 suggestions for enhancing the technology transfer system and related processes.

  16. NASA Office of Aeronautics and Space Technology Summer Workshop. Executive summary. [in-space research using the Space Transportation System

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Research and technology investigations are identified in eleven discipline technologies which require, or which could significantly benefit from, an in-space experiment, systems demonstration, or component test using the Space Transportation System. Synopses of the eleven technology panels' reports are presented.

  17. Seeing the System: Dynamics and Complexity of Technology Integration in Secondary Schools

    ERIC Educational Resources Information Center

    Howard, Sarah K.; Thompson, Kate

    2016-01-01

    This paper introduces system dynamics modeling to understand, visualize and explore technology integration in schools, through the development of a theoretical model of technology-related change in teachers' practice. Technology integration is a dynamic social practice, within the social system of education. It is difficult, if not nearly…

  18. Early Childhood Technology Integrated Instructional System (EC-TIIS) Phase 1: A Final Report

    ERIC Educational Resources Information Center

    Hutinger, Patricia; Robinson, Linda; Schneider, Carol

    2004-01-01

    The Early Childhood Technology Integrated Instructional System (EC-TIIS), a Steppingstones of Technology Innovation Phase 1--Development project, was developed by the Center for Best Practices in Early Childhood (the Center) at Western Illinois University as an online instructional system. EC-TIIS' ultimate goal was to improve technology services…

  19. A feasibility study: California Department of Forestry and Fire Protection utilization of infrared technologies for wildland fire suppression and management

    NASA Technical Reports Server (NTRS)

    Nichols, J. D.; Britten, R. A.; Parks, G. S.; Voss, J. M.

    1990-01-01

    NASA's JPL has completed a feasibility study using infrared technologies for wildland fire suppression and management. The study surveyed user needs, examined available technologies, matched the user needs with technologies, and defined an integrated infrared wildland fire mapping concept system configuration. System component trade-offs were presented for evaluation in the concept system configuration. The economic benefits of using infrared technologies in fire suppression and management were examined. Follow-on concept system configuration development and implementation were proposed.

  20. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, MARIAH ENERGY CORPORATION HEAT PLUS POWER SYSTEM

    EPA Science Inventory

    The Greenhouse Gas Technology Center (GHG Center) has recently evaluated the performance of the Heat Plus Power(TM) System (Mariah CDP System), which integrates microturbine technology with a heat recovery system. Electric power is generated with a Capstone MicroTurbine(TM) Model ...
