The Large Synoptic Survey Telescope Science Requirements
NASA Astrophysics Data System (ADS)
Tyson, J. A.; LSST Collaboration
2004-12-01
The Large Synoptic Survey Telescope (LSST) is a wide-field telescope facility that will add a qualitatively new capability in astronomy and will address some of the most pressing open questions in astronomy and fundamental physics. The 8.4-meter telescope and 3 billion pixel camera covering ten square degrees will reach faint limits across the sky in less than 10 seconds in each of 5-6 optical bands. This is enabled by advances in microelectronics, software, and large optics fabrication. The unprecedented optical throughput drives LSST's ability to go faint-wide-fast. The LSST will produce time-lapse digital imaging of faint astronomical objects across the entire visible sky with good resolution. For example, the LSST will provide unprecedented 3-dimensional maps of the mass distribution in the Universe, in addition to the traditional images of luminous stars and galaxies. These weak lensing data can be used to better understand the nature of Dark Energy. The LSST will also provide a comprehensive census of our solar system. By surveying the entire accessible sky deeply every few nights, the LSST will provide large samples of events which we now only rarely observe, and will create substantial potential for new discoveries. The LSST will produce the largest non-proprietary data set in the world. Several key science drivers are representative of the LSST system capabilities: Precision Characterization of Dark Energy, Solar System Map, Optical Transients, and a map of our Galaxy and its environs. In addition to enabling all four of these major scientific initiatives, LSST will make it possible to pursue many other research programs. The community has suggested a number of exciting programs using these data, and the long-lived data archives of the LSST will have the astrometric and photometric precision needed to support entirely new research directions which will inevitably develop during the next several decades.
The Effects of Commercial Airline Traffic on LSST Observing Efficiency
NASA Astrophysics Data System (ADS)
Gibson, Rose; Claver, Charles; Stubbs, Christopher
2016-01-01
The Large Synoptic Survey Telescope (LSST) is a ten-year survey that will map the southern sky in six different filters 800 times before the end of its run. In this paper, we explore the primary effect of airline traffic on scheduling the LSST observations in addition to the secondary effect of condensation trails, or contrails, created by the presence of the aircraft. The large national investment being made in LSST implies that even small improvements in observing efficiency through aircraft and contrail avoidance can result in a significant improvement in the quality of the survey and its science. We have used the Automatic Dependent Surveillance-Broadcast (ADS-B) signals received from commercial aircraft to monitor and record activity over the LSST site. We installed an ADS-B ground station on Cerro Pachón, Chile, consisting of a 1090 MHz antenna on the Andes Lidar Observatory feeding an RTL2832U software defined radio. We used dump1090 to convert the received ADS-B telemetry into BaseStation format, and found that during the busiest time of the night only 4 signals were being received each minute on average, which will have very little direct effect, if any, on the LSST observing scheduler. As part of future studies we will examine the effects of contrails on LSST observations. Gibson was supported by the NOAO/KPNO Research Experiences for Undergraduates (REU) Program which is funded by the National Science Foundation Research Experience for Undergraduates Program (AST-1262829).
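To make the data flow concrete, the following is a minimal sketch, in Python, of the kind of message counting described above: it connects to a dump1090 BaseStation (SBS) feed and tallies messages per minute. The host, port (30003 is dump1090's conventional SBS output), and run duration are assumptions for illustration, not details taken from the study.

```python
# Hypothetical sketch: count ADS-B messages per minute from a dump1090
# BaseStation (SBS) feed. Host, port, and binning are assumptions.
import socket
import time
from collections import Counter

def count_messages_per_minute(host="localhost", port=30003, duration_s=3600):
    """Tally SBS messages received in each one-minute bin."""
    counts = Counter()
    start = time.time()
    with socket.create_connection((host, port)) as sock:
        buffer = b""
        while time.time() - start < duration_s:
            data = sock.recv(4096)
            if not data:
                break
            buffer += data
            while b"\n" in buffer:
                line, buffer = buffer.split(b"\n", 1)
                if line.startswith(b"MSG"):      # SBS transmission message
                    minute = int((time.time() - start) // 60)
                    counts[minute] += 1
    return counts

if __name__ == "__main__":
    per_minute = count_messages_per_minute(duration_s=600)
    if per_minute:
        print("mean messages/min:", sum(per_minute.values()) / len(per_minute))
```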
Cosmology with the Large Synoptic Survey Telescope: an overview
NASA Astrophysics Data System (ADS)
Zhan, Hu; Tyson, J. Anthony
2018-06-01
The Large Synoptic Survey Telescope (LSST) is a high étendue imaging facility that is being constructed atop Cerro Pachón in northern Chile. It is scheduled to begin science operations in 2022. With an 8.4 m (6.7 m effective) aperture, a novel three-mirror design achieving a seeing-limited 9.6 deg² field of view, and a 3.2 gigapixel camera, the LSST has the deep-wide-fast imaging capability necessary to carry out an 18,000 deg² survey in six passbands (ugrizy) to a coadded depth of r ~ 27.5 over 10 years using 90% of its observational time. The remaining 10% of the time will be devoted to considerably deeper and faster time-domain observations and smaller surveys. In total, each patch of the sky in the main survey will receive 800 visits allocated across the six passbands with 30 s exposure visits. The huge volume of high-quality LSST data will provide a wide range of science opportunities and, in particular, open a new era of precision cosmology with unprecedented statistical power and tight control of systematic errors. In this review, we give a brief account of the LSST cosmology program with an emphasis on dark energy investigations. The LSST will address dark energy physics and cosmology in general by exploiting diverse precision probes including large-scale structure, weak lensing, type Ia supernovae, galaxy clusters, and strong lensing. Combined with the cosmic microwave background data, these probes form interlocking tests on the cosmological model and the nature of dark energy in the presence of various systematics. The LSST data products will be made available to the US and Chilean scientific communities and to international partners with no proprietary period. Close collaborations with contemporaneous imaging and spectroscopy surveys observing at a variety of wavelengths, resolutions, depths, and timescales will be a vital part of the LSST science program, which will not only enhance specific studies but, more importantly, also allow a more complete understanding of the Universe through different windows.
NASA Astrophysics Data System (ADS)
Cook, K. H.; Delgado, F.; Miller, M.; Saha, A.; Allsman, R.; Pinto, P.; Gee, P. A.
2005-12-01
We have developed an operations simulator for LSST and used it to explore design and operations parameter space for this large etendue telescope and its ten year survey mission. The design is modular, with separate science programs coded in separate modules. There is a sophisticated telescope module with all motions parametrized for ease of testing different telescope capabilities, e.g. effect of acceleration capabilities of various motors on science output. Sky brightness is calculated as a function of moon phase and separation. A sophisticated exposure time calculator has been developed for LSST which is being incorporated into the simulator to allow specification of S/N requirements. All important parameters for the telescope, the site and the science programs are easily accessible in configuration files. Seeing and cloud data from the three candidate LSST sites are used for our simulations. The simulator has two broad categories of science proposals: sky coverage and transient events. Sky coverage proposals base their observing priorities on a required number of observations for each field in a particular filter with specified conditions (maximum seeing, sky brightness, etc) and one is used for a weak lensing investigation. Transient proposals are highly configurable. A transient proposal can require sequential, multiple exposures in various filters with a specified sequence of filters, and require a particular cadence for multiple revisits to complete an observation sequence. Each science proposal ranks potential observations based upon the internal logic of that proposal. We present the results of a variety of mixed science program observing simulations, showing how varied programs can be carried out simultaneously, with many observations serving multiple science goals. The simulator has shown that LSST can carry out its multiple missions under a variety of conditions. KHC's work was performed under the auspices of the US DOE, NNSA by the Univ. of California, LLNL under contract No. W-7405-Eng-48.
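As an illustration of the signal-to-noise specification such an exposure time calculator supports, here is a minimal sketch using the standard CCD noise equation. All rates, the read-noise figure, and the aperture size are placeholder values, not LSST or simulator parameters.

```python
# A bare-bones point-source exposure-time estimate of the kind an exposure time
# calculator performs. All numbers below are illustrative placeholders.
import math

def snr(exposure_s, source_rate, sky_rate_per_pix, npix, read_noise=5.0):
    """Classic CCD equation: signal over the noise from source, sky, and read noise."""
    signal = source_rate * exposure_s
    noise = math.sqrt(signal + npix * (sky_rate_per_pix * exposure_s + read_noise**2))
    return signal / noise

def exposure_for_snr(target_snr, **kwargs):
    """Find the shortest exposure (in whole seconds) reaching the requested S/N."""
    t = 1.0
    while snr(t, **kwargs) < target_snr:
        t += 1.0
    return t

# Example: seconds needed to reach S/N = 5 with assumed source and sky rates.
print(exposure_for_snr(5.0, source_rate=20.0, sky_rate_per_pix=50.0, npix=30))
```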
Technology for large space systems: A special bibliography with indexes
NASA Technical Reports Server (NTRS)
1979-01-01
This bibliography lists 460 reports, articles, and other documents introduced into the NASA scientific and technical information system between January 1, 1968 and December 31, 1978. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design in the area of the Large Space Systems Technology (LSST) Program. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, and flight experiments.
Technology for large space systems: A special bibliography with indexes (supplement 01)
NASA Technical Reports Server (NTRS)
1979-01-01
This bibliography lists 180 reports, articles, and other documents introduced into the NASA scientific and technical information system between January 1, 1979 and June 30, 1979. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design in the area of the Large Space Systems Technology (LSST) Program. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, and flight experiments.
The Large Synoptic Survey Telescope project management control system
NASA Astrophysics Data System (ADS)
Kantor, Jeffrey P.
2012-09-01
The Large Synoptic Survey Telescope (LSST) program is jointly funded by the NSF, the DOE, and private institutions and donors. From an NSF funding standpoint, the LSST is a Major Research Equipment and Facilities Construction (MREFC) project. The NSF funding process requires proposals and D&D reviews to include activity-based budgets and schedules; documented basis of estimates; risk-based contingency analysis; and cost escalation and categorization. "Out-of-the-box," the commercial tool Primavera P6 contains approximately 90% of the planning and estimating capability needed to satisfy R&D phase requirements, and it is customizable/configurable for the remainder with relatively little effort. We describe the customization/configuration and use of Primavera for the LSST Project Management Control System (PMCS), assess our experience to date, and describe future directions. Examples in this paper are drawn from the LSST Data Management System (DMS), which is one of three main subsystems of the LSST and is funded by the NSF. By astronomy standards the LSST DMS is a large data management project, processing and archiving over 70 petabytes of image data, producing over 20 petabytes of catalogs annually, and generating 2 million transient alerts per night. Over the 6-year construction and commissioning phase, the DM project is estimated to require 600,000 hours of engineering effort. In total, the DMS cost is approximately 60% hardware/system software and 40% labor.
Cosmology with the Large Synoptic Survey Telescope: an overview.
Zhan, Hu; Anthony Tyson, J
2018-06-01
The Large Synoptic Survey Telescope (LSST) is a high étendue imaging facility that is being constructed atop Cerro Pachón in northern Chile. It is scheduled to begin science operations in 2022. With an 8.4 m (6.7 m effective) aperture, a novel three-mirror design achieving a seeing-limited 9.6 deg² field of view, and a 3.2 gigapixel camera, the LSST has the deep-wide-fast imaging capability necessary to carry out an 18,000 deg² survey in six passbands (ugrizy) to a coadded depth of r ~ 27.5 over 10 years using 90% of its observational time. The remaining 10% of the time will be devoted to considerably deeper and faster time-domain observations and smaller surveys. In total, each patch of the sky in the main survey will receive 800 visits allocated across the six passbands with 30 s exposure visits. The huge volume of high-quality LSST data will provide a wide range of science opportunities and, in particular, open a new era of precision cosmology with unprecedented statistical power and tight control of systematic errors. In this review, we give a brief account of the LSST cosmology program with an emphasis on dark energy investigations. The LSST will address dark energy physics and cosmology in general by exploiting diverse precision probes including large-scale structure, weak lensing, type Ia supernovae, galaxy clusters, and strong lensing. Combined with the cosmic microwave background data, these probes form interlocking tests on the cosmological model and the nature of dark energy in the presence of various systematics. The LSST data products will be made available to the US and Chilean scientific communities and to international partners with no proprietary period. Close collaborations with contemporaneous imaging and spectroscopy surveys observing at a variety of wavelengths, resolutions, depths, and timescales will be a vital part of the LSST science program, which will not only enhance specific studies but, more importantly, also allow a more complete understanding of the Universe through different windows.
NASA Astrophysics Data System (ADS)
Delgado, Francisco; Saha, Abhijit; Chandrasekharan, Srinivasan; Cook, Kem; Petry, Catherine; Ridgway, Stephen
2014-08-01
The Operations Simulator for the Large Synoptic Survey Telescope (LSST; http://www.lsst.org) allows the planning of LSST observations that obey explicit science driven observing specifications, patterns, schema, and priorities, while optimizing against the constraints placed by design-specific opto-mechanical system performance of the telescope facility, site specific conditions as well as additional scheduled and unscheduled downtime. It has a detailed model to simulate the external conditions with real weather history data from the site, a fully parameterized kinematic model for the internal conditions of the telescope, camera and dome, and serves as a prototype for an automatic scheduler for the real time survey operations with LSST. The Simulator is a critical tool that has been key since very early in the project, to help validate the design parameters of the observatory against the science requirements and the goals from specific science programs. A simulation run records the characteristics of all observations (e.g., epoch, sky position, seeing, sky brightness) in a MySQL database, which can be queried for any desired purpose. Derivative information digests of the observing history are made with an analysis package called Simulation Survey Tools for Analysis and Reporting (SSTAR). Merit functions and metrics have been designed to examine how suitable a specific simulation run is for several different science applications. Software to efficiently compare the efficacy of different survey strategies for a wide variety of science applications using such a growing set of metrics is under development. A recent restructuring of the code allows us to a) use "look-ahead" strategies that avoid cadence sequences that cannot be completed due to observing constraints; and b) examine alternate optimization strategies, so that the most efficient scheduling algorithm(s) can be identified and used: even few-percent efficiency gains will create substantive scientific opportunity. The enhanced simulator is being used to assess the feasibility of desired observing cadences, study the impact of changing science program priorities and assist with performance margin investigations of the LSST system.
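The kind of post-run query the abstract describes might look like the following sketch. The table and column names (obs_history, filter, seeing) are hypothetical, and SQLite stands in here for the MySQL database used by the simulator.

```python
# Illustrative only: query a simulated observation-history database for
# per-filter visit counts and median delivered seeing. Schema names are assumed.
import sqlite3
import statistics

def per_filter_summary(db_path):
    query = "SELECT filter, seeing FROM obs_history"
    with sqlite3.connect(db_path) as conn:
        rows = conn.execute(query).fetchall()
    summary = {}
    for band, seeing in rows:
        summary.setdefault(band, []).append(seeing)
    return {band: (len(vals), statistics.median(vals))
            for band, vals in summary.items()}

# Example usage (hypothetical database file):
# for band, (n, med) in per_filter_summary("opsim_run.db").items():
#     print(f"{band}: {n} visits, median seeing {med:.2f} arcsec")
```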
LSST: Cadence Design and Simulation
NASA Astrophysics Data System (ADS)
Cook, Kem H.; Pinto, P. A.; Delgado, F.; Miller, M.; Petry, C.; Saha, A.; Gee, P. A.; Tyson, J. A.; Ivezic, Z.; Jones, L.; LSST Collaboration
2009-01-01
The LSST Project has developed an operations simulator to investigate how best to observe the sky to achieve its multiple science goals. The simulator has a sophisticated model of the telescope and dome to properly constrain potential observing cadences. This model has also proven useful for investigating various engineering issues ranging from sizing of slew motors, to design of cryogen lines to the camera. The simulator is capable of balancing cadence goals from multiple science programs, and attempts to minimize time spent slewing as it carries out these goals. The operations simulator has been used to demonstrate a 'universal' cadence which delivers the science requirements for a deep cosmology survey, a Near Earth Object Survey and good sampling in the time domain. We will present the results of simulating 10 years of LSST operations using realistic seeing distributions, historical weather data, scheduled engineering downtime and current telescope and camera parameters. These simulations demonstrate the capability of the LSST to deliver a 25,000 square degree survey probing the time domain including 20,000 square degrees for a uniform deep, wide, fast survey, while effectively surveying for NEOs over the same area. We will also present our plans for future development of the simulator--better global minimization of slew time and eventual transition to a scheduler for the real LSST.
Technology for Large Space Systems: A Special Bibliography with Indexes (Supplement 2)
NASA Technical Reports Server (NTRS)
1980-01-01
This bibliography lists 258 reports, articles, and other documents introduced into the NASA scientific and technical information system between July 1, 1979 and December 31, 1979. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design in the area of the Large Space Systems Technology (LSST) Program. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, solar power satellite systems, and flight experiments.
Technology for large space systems: A special bibliography with indexes (supplement 05)
NASA Technical Reports Server (NTRS)
1981-01-01
This bibliography lists 298 reports, articles, and other documents introduced into the NASA scientific and technical information system between January 1, 1981 and June 30, 1981. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design in the area of the Large Space Systems Technology (LSST) Program. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, solar power satellite systems, and flight experiments.
Technology for large space systems: A special bibliography with indexes (supplement 06)
NASA Technical Reports Server (NTRS)
1982-01-01
This bibliography lists 220 reports, articles and other documents introduced into the NASA scientific and technical information system between July 1, 1981 and December 31, 1981. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design in the area of the Large Space Systems Technology (LSST) Program. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, solar power satellite systems, and flight experiments.
NOAO and LSST: Illuminating the Path to LSST for All Users
NASA Astrophysics Data System (ADS)
Olsen, Knut A.; Matheson, T.; Ridgway, S. T.; Saha, A.; Lauer, T. R.; NOAO LSST Science Working Group
2013-01-01
As LSST moves toward construction and survey definition, the burden on the user community to begin planning and preparing for the massive data stream grows. In light of the significant challenge and opportunity that LSST now brings, a critical role for a National Observatory will be to advocate for, respond to, and advise the U.S. community on its use of LSST. NOAO intends to establish an LSST Community Science Center to meet these common needs. Such a Center builds on NOAO's leadership in offering survey-style instruments, proposal opportunities, and data management software over the last decade. This leadership has enabled high-impact scientific results, as evidenced by the award of the 2011 Nobel Prize in Physics for the discovery of Dark Energy, which stemmed directly from survey-style observations taken at NOAO. As steps towards creating an LSST Community Science Center, NOAO is 1) supporting the LSST Science Collaborations through membership calls and collaboration meetings; 2) developing the LSST operations simulator, the tool by which the community's scientific goals are tested against the reality of what LSST's cadence can deliver; 3) embarking on a project to establish metrics for science data quality assessment, which will be critical for establishing faith in LSST results; 4) developing a roadmap and proposal to host and support the capability to help the community manage the expected flood of automated alerts from LSST; and 5) starting a serious discussion of the system capabilities needed for photometric and spectroscopic follow-up of LSST observations. The fundamental goal is to enable productive, world-class research with LSST by the entire US community-at-large in tight collaboration with the LSST Project, LSST Science Collaborations, and the funding agencies.
University of Arizona High Energy Physics Program at the Cosmic Frontier 2014-2016
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abate, Alex; Cheu, Elliott
This is the final technical report from the University of Arizona High Energy Physics program at the Cosmic Frontier covering the period 2014-2016. The work aims to advance the understanding of dark energy using the Large Synoptic Survey Telescope (LSST). Progress on the engineering design of the power supplies for the LSST camera is discussed. A variety of contributions to photometric redshift measurement uncertainties were studied. The effect of the intergalactic medium on the photometric redshift of very distant galaxies was evaluated. Computer code was developed realizing the full chain of calculations needed to accurately and efficiently run large-scale simulations.
LSST: Education and Public Outreach
NASA Astrophysics Data System (ADS)
Bauer, Amanda; Herrold, Ardis; LSST Education and Public Outreach Team
2018-01-01
The Large Synoptic Survey Telescope (LSST) will conduct a 10-year wide, fast, and deep survey of the night sky starting in 2022. LSST Education and Public Outreach (EPO) will enable public access to a subset of LSST data so anyone can explore the universe and be part of the discovery process. LSST EPO aims to facilitate a pathway from entry-level exploration of astronomical imagery to more sophisticated interaction with LSST data using tools similar to what professional astronomers use. To deliver data to the public, LSST EPO is creating an online Portal to serve as the main hub to EPO activities. The Portal will host an interactive Skyviewer, access to LSST data for educators and the public through online Jupyter notebooks, original multimedia for informal science centers and planetariums, and feature citizen science projects that use LSST data. LSST EPO will engage with the Chilean community through Spanish-language components of the Portal and will partner with organizations serving underrepresented groups in STEM.
Yoon, Dong Hyun; Kang, Dongheon; Kim, Hee-Jae; Kim, Jin-Soo; Song, Han Sol; Song, Wook
2017-05-01
The effectiveness of resistance training in improving cognitive function in older adults is well demonstrated. In particular, unconventional high-speed resistance training can improve muscle power development. In the present study, the effectiveness of 12 weeks of elastic band-based high-speed power training (HSPT) was examined. Participants were randomly assigned into a HSPT group (n = 14, age 75.0 ± 0.9 years), a low-speed strength training (LSST) group (n = 9, age 76.0 ± 1.3 years) and a control group (CON; n = 7, age 78.0 ± 1.0 years). A 1-h exercise program was provided twice a week for 12 weeks for the HSPT and LSST groups, and balance and tone exercises were carried out by the CON group. Significant increases in levels of cognitive function, physical function, and muscle strength were observed in both the HSPT and LSST groups. In cognitive function, significant improvements in the Mini-Mental State Examination and Montreal Cognitive Assessment were seen in both the HSPT and LSST groups compared with the CON group. In physical functions, Short Physical Performance Battery scores were increased significantly in the HSPT and LSST groups compared with the CON group. In the 12 weeks of elastic band-based training, the HSPT group showed greater improvements in older women with mild cognitive impairment than the LSST group, although both regimens were effective in improving cognitive function, physical function and muscle strength. We conclude that elastic band-based HSPT, as compared with LSST, is more efficient in helping older women with mild cognitive impairment to improve cognitive function, physical performance and muscle strength. Geriatr Gerontol Int 2017; 17: 765-772. © 2016 Japan Geriatrics Society.
Advancing the LSST Operations Simulator
NASA Astrophysics Data System (ADS)
Saha, Abhijit; Ridgway, S. T.; Cook, K. H.; Delgado, F.; Chandrasekharan, S.; Petry, C. E.; Operations Simulator Group
2013-01-01
The Operations Simulator for the Large Synoptic Survey Telescope (LSST; http://lsst.org) allows the planning of LSST observations that obey explicit science driven observing specifications, patterns, schema, and priorities, while optimizing against the constraints placed by design-specific opto-mechanical system performance of the telescope facility, site specific conditions (including weather and seeing), as well as additional scheduled and unscheduled downtime. A simulation run records the characteristics of all observations (e.g., epoch, sky position, seeing, sky brightness) in a MySQL database, which can be queried for any desired purpose. Derivative information digests of the observing history database are made with an analysis package called Simulation Survey Tools for Analysis and Reporting (SSTAR). Merit functions and metrics have been designed to examine how suitable a specific simulation run is for several different science applications. This poster reports recent work which has focussed on an architectural restructuring of the code that will allow us to a) use "look-ahead" strategies that avoid cadence sequences that cannot be completed due to observing constraints; and b) examine alternate optimization strategies, so that the most efficient scheduling algorithm(s) can be identified and used: even few-percent efficiency gains will create substantive scientific opportunity. The enhanced simulator will be used to assess the feasibility of desired observing cadences, study the impact of changing science program priorities, and assist with performance margin investigations of the LSST system.
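A merit function of the sort mentioned above could be as simple as the following sketch, which scores survey uniformity from per-field visit counts; the input format and the scoring formula are illustrative assumptions, not SSTAR code.

```python
# Minimal sketch of a uniformity merit function: 1.0 means every field received
# the same number of visits; the score drops as the visit distribution spreads out.
import statistics

def uniformity_merit(visits_per_field):
    """visits_per_field: mapping of field ID to visit count (assumed input format)."""
    counts = list(visits_per_field.values())
    mean = statistics.mean(counts)
    if mean == 0:
        return 0.0
    spread = statistics.pstdev(counts) / mean   # coefficient of variation
    return max(0.0, 1.0 - spread)

# Example: one badly under-observed field pulls the score down.
print(uniformity_merit({1: 820, 2: 790, 3: 805, 4: 60}))
```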
Scientific Synergy between LSST and Euclid
NASA Astrophysics Data System (ADS)
Rhodes, Jason; Nichol, Robert C.; Aubourg, Éric; Bean, Rachel; Boutigny, Dominique; Bremer, Malcolm N.; Capak, Peter; Cardone, Vincenzo; Carry, Benoît; Conselice, Christopher J.; Connolly, Andrew J.; Cuillandre, Jean-Charles; Hatch, N. A.; Helou, George; Hemmati, Shoubaneh; Hildebrandt, Hendrik; Hložek, Renée; Jones, Lynne; Kahn, Steven; Kiessling, Alina; Kitching, Thomas; Lupton, Robert; Mandelbaum, Rachel; Markovic, Katarina; Marshall, Phil; Massey, Richard; Maughan, Ben J.; Melchior, Peter; Mellier, Yannick; Newman, Jeffrey A.; Robertson, Brant; Sauvage, Marc; Schrabback, Tim; Smith, Graham P.; Strauss, Michael A.; Taylor, Andy; Von Der Linden, Anja
2017-12-01
Euclid and the Large Synoptic Survey Telescope (LSST) are poised to dramatically change the astronomy landscape early in the next decade. The combination of high-cadence, deep, wide-field optical photometry from LSST with high-resolution, wide-field optical photometry, and near-infrared photometry and spectroscopy from Euclid will be powerful for addressing a wide range of astrophysical questions. We explore Euclid/LSST synergy, ignoring the political issues associated with data access to focus on the scientific, technical, and financial benefits of coordination. We focus primarily on dark energy cosmology, but also discuss galaxy evolution, transient objects, solar system science, and galaxy cluster studies. We concentrate on synergies that require coordination in cadence or survey overlap, or would benefit from pixel-level co-processing that is beyond the scope of what is currently planned, rather than scientific programs that could be accomplished only at the catalog level without coordination in data processing or survey strategies. We provide two quantitative examples of scientific synergies: the decrease in photo-z errors (benefiting many science cases) when high-resolution Euclid data are used for LSST photo-z determination, and the resulting increase in weak-lensing signal-to-noise ratio from smaller photo-z errors. We briefly discuss other areas of coordination, including high-performance computing resources and calibration data. Finally, we address concerns about the loss of independence and potential cross-checks between the two missions and the potential consequences of not collaborating.
NASA Astrophysics Data System (ADS)
Herrold, Ardis; Bauer, Amanda, Dr.; Peterson, J. Matt; Large Synoptic Survey Telescope Education and Public Outreach Team
2018-01-01
The Large Synoptic Survey Telescope will usher in a new age of astronomical data exploration for science educators and students. LSST data sets will be large, deep, and dynamic, and will establish a time-domain record that will extend over a decade. They will be used to provide engaging, relevant learning experiences. The EPO Team will develop online investigations using authentic LSST data that offer varying levels of challenge and depth by the start of telescope operations, slated to begin in 2022. The topics will cover common introductory astronomy concepts, and will align with the four science domains of LSST: the Milky Way, the changing sky (transients), solar system (moving) objects, and dark matter and dark energy. Online Jupyter notebooks will make LSST data easy to access and analyze for students at the advanced middle school through college levels. Using online notebooks will circumvent common obstacles caused by firewalls, bandwidth issues, and the need to download software, as they will be accessible from any computer or tablet with internet access. Although the LSST EPO Jupyter notebooks are Python-based, a knowledge of programming will not be required to use them. Each topical investigation will include teacher and student versions of Jupyter notebooks, instructional videos, and access to a suite of support materials including a forum, professional development training, and tutorial videos. Jupyter notebooks will contain embedded widgets to process data, eliminating the need to use external spreadsheets and plotting software. Students will be able to analyze data by using some of the existing modules already developed for professional astronomers. This will shorten the time needed to conduct investigations and will shift the emphasis to understanding the underlying science themes, which is often lost with novice learners.
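A notebook cell in the spirit of the widget-driven analysis described above might look like the sketch below: an ipywidgets slider that phase-folds and re-plots a light curve. The data here are synthetic; in the actual EPO notebooks they would come from LSST photometry, and the widget layout would differ.

```python
# Illustrative notebook cell, not an actual EPO notebook: a slider re-plots a
# synthetic light curve phase-folded at a user-chosen period.
import numpy as np
import matplotlib.pyplot as plt
from ipywidgets import interact, FloatSlider

rng = np.random.default_rng(42)
mjd = np.sort(rng.uniform(0, 365, 200))               # synthetic visit times (days)
true_period = 2.5
mag = 18.0 + 0.3 * np.sin(2 * np.pi * mjd / true_period) + rng.normal(0, 0.02, mjd.size)

def fold_and_plot(period=1.0):
    phase = (mjd % period) / period
    plt.scatter(phase, mag, s=10)
    plt.gca().invert_yaxis()                          # brighter is up
    plt.xlabel("phase"); plt.ylabel("magnitude")
    plt.show()

interact(fold_and_plot, period=FloatSlider(min=0.5, max=5.0, step=0.01, value=1.0))
```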
The LSST Scheduler from design to construction
NASA Astrophysics Data System (ADS)
Delgado, Francisco; Reuter, Michael A.
2016-07-01
The Large Synoptic Survey Telescope (LSST) will be a highly robotic facility, demanding a very high efficiency during its operation. To achieve this, the LSST Scheduler has been envisioned as an autonomous software component of the Observatory Control System (OCS), that selects the sequence of targets in real time. The Scheduler will drive the survey using optimization of a dynamic cost function of more than 200 parameters. Multiple science programs produce thousands of candidate targets for each observation, and multiple telemetry measurements are received to evaluate the external and the internal conditions of the observatory. The design of the LSST Scheduler started early in the project supported by Model Based Systems Engineering, detailed prototyping and scientific validation of the survey capabilities required. In order to build such a critical component, an agile development path in incremental releases is presented, integrated to the development plan of the Operations Simulator (OpSim) to allow constant testing, integration and validation in a simulated OCS environment. The final product is a Scheduler that is also capable of running 2000 times faster than real time in simulation mode for survey studies and scientific validation during commissioning and operations.
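A greatly simplified sketch of rank-by-cost target selection is shown below. The handful of terms and weights stand in for the roughly 200-parameter cost function described above; the field names, weights, and telemetry inputs are invented for illustration and are not the OCS Scheduler's actual model.

```python
# Conceptual sketch only: score each candidate target with a small weighted cost
# and pick the minimum. Lower cost is better.
from dataclasses import dataclass

@dataclass
class Candidate:
    field_id: int
    slew_time_s: float        # time to slew from the current pointing
    airmass: float            # current airmass of the field
    science_need: float       # 0-1 priority from the proposing science program

def cost(c: Candidate, w_slew=0.02, w_airmass=0.5, w_need=1.0):
    # Penalize long slews and high airmass; reward science need. Weights are invented.
    return w_slew * c.slew_time_s + w_airmass * (c.airmass - 1.0) - w_need * c.science_need

def pick_next(candidates):
    return min(candidates, key=cost)

best = pick_next([Candidate(1, 5.0, 1.1, 0.9), Candidate(2, 30.0, 1.0, 0.95)])
print("next field:", best.field_id)
```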
Scientific Synergy between LSST and Euclid
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rhodes, Jason; Nichol, Robert C.; Aubourg, Éric
We report that Euclid and the Large Synoptic Survey Telescope (LSST) are poised to dramatically change the astronomy landscape early in the next decade. The combination of high-cadence, deep, wide-field optical photometry from LSST with high-resolution, wide-field optical photometry, and near-infrared photometry and spectroscopy from Euclid will be powerful for addressing a wide range of astrophysical questions. We explore Euclid/LSST synergy, ignoring the political issues associated with data access to focus on the scientific, technical, and financial benefits of coordination. We focus primarily on dark energy cosmology, but also discuss galaxy evolution, transient objects, solar system science, and galaxy cluster studies. We concentrate on synergies that require coordination in cadence or survey overlap, or would benefit from pixel-level co-processing that is beyond the scope of what is currently planned, rather than scientific programs that could be accomplished only at the catalog level without coordination in data processing or survey strategies. Also, we provide two quantitative examples of scientific synergies: the decrease in photo-z errors (benefiting many science cases) when high-resolution Euclid data are used for LSST photo-z determination, and the resulting increase in weak-lensing signal-to-noise ratio from smaller photo-z errors. We briefly discuss other areas of coordination, including high-performance computing resources and calibration data. Finally, we address concerns about the loss of independence and potential cross-checks between the two missions and the potential consequences of not collaborating.
Scientific Synergy between LSST and Euclid
Rhodes, Jason; Nichol, Robert C.; Aubourg, Éric; ...
2017-12-07
We report that Euclid and the Large Synoptic Survey Telescope (LSST) are poised to dramatically change the astronomy landscape early in the next decade. The combination of high-cadence, deep, wide-field optical photometry from LSST with high-resolution, wide-field optical photometry, and near-infrared photometry and spectroscopy from Euclid will be powerful for addressing a wide range of astrophysical questions. We explore Euclid/LSST synergy, ignoring the political issues associated with data access to focus on the scientific, technical, and financial benefits of coordination. We focus primarily on dark energy cosmology, but also discuss galaxy evolution, transient objects, solar system science, and galaxy cluster studies. We concentrate on synergies that require coordination in cadence or survey overlap, or would benefit from pixel-level co-processing that is beyond the scope of what is currently planned, rather than scientific programs that could be accomplished only at the catalog level without coordination in data processing or survey strategies. Also, we provide two quantitative examples of scientific synergies: the decrease in photo-z errors (benefiting many science cases) when high-resolution Euclid data are used for LSST photo-z determination, and the resulting increase in weak-lensing signal-to-noise ratio from smaller photo-z errors. We briefly discuss other areas of coordination, including high-performance computing resources and calibration data. Finally, we address concerns about the loss of independence and potential cross-checks between the two missions and the potential consequences of not collaborating.
The European perspective for LSST
NASA Astrophysics Data System (ADS)
Gangler, Emmanuel
2017-06-01
LSST is a next generation telescope that will produce an unprecedented data flow. The project goal is to deliver data products such as images and catalogs, thus enabling scientific analysis for a wide community of users. As a large scale survey, LSST data will be complementary with other facilities in a wide range of scientific domains, including data from ESA or ESO. European countries have invested in LSST since 2007, in the construction of the camera as well as in the computing effort. The latter will be instrumental in designing the next step: how to distribute LSST data to Europe. Astroinformatics challenges for LSST indeed include not only the analysis of LSST big data, but also the practical efficiency of data access.
NASA Astrophysics Data System (ADS)
Delgado, Francisco; Schumacher, German
2014-08-01
The Large Synoptic Survey Telescope (LSST) is a complex system of systems with demanding performance and operational requirements. The nature of its scientific goals requires a special Observatory Control System (OCS) and particularly a very specialized automatic Scheduler. The OCS Scheduler is an autonomous software component that drives the survey, selecting the detailed sequence of visits in real time, taking into account multiple science programs, the current external and internal conditions, and the history of observations. We have developed a SysML model for the OCS Scheduler that fits coherently in the OCS and LSST integrated model. We have also developed a prototype of the Scheduler that implements the scheduling algorithms in the simulation environment provided by the Operations Simulator, where the environment and the observatory are modeled with real weather data and detailed kinematics parameters. This paper expands on the Scheduler architecture and the proposed algorithms to achieve the survey goals.
The Large Synoptic Survey Telescope (LSST) Camera
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Ranked as the top ground-based national priority for the field for the current decade, LSST is currently under construction in Chile. The U.S. Department of Energy’s SLAC National Accelerator Laboratory is leading the construction of the LSST camera – the largest digital camera ever built for astronomy. SLAC Professor Steven M. Kahn is the overall Director of the LSST project, and SLAC personnel are also participating in the data management. The National Science Foundation is the lead agency for construction of the LSST. Additional financial support comes from the Department of Energy and private funding raised by the LSST Corporation.
NASA Astrophysics Data System (ADS)
Kantor, J.
During LSST observing, transient events will be detected and alerts generated at the LSST Archive Center at NCSA in Champaign, Illinois. As a very high rate of alerts is expected, approaching ~10 million per night, we plan for VOEvent-compliant Distributor/Brokers (http://voevent.org) to be the primary end-points of the full LSST alert streams. End users will then use these Distributor/Brokers to classify and filter events on the stream for those fitting their science goals. These Distributor/Brokers are envisioned to be operated as a community service by third parties who will have signed MOUs with LSST. The exact identification of Distributor/Brokers to receive alerts will be determined as LSST approaches full operations and may change over time, but it is in our interest to identify and coordinate with them as early as possible. LSST will also operate a limited Distributor/Broker with a filtering capability at the Archive Center, to allow alerts to be sent directly to a limited number of entities that for some reason need to have a more direct connection to LSST. This might include, for example, observatories with significant follow-up capabilities whose observing may temporarily be more directly tied to LSST observing. It will let astronomers create simple filters that limit what alerts are ultimately forwarded to them. It will be possible to specify these user-defined filters using an SQL-like declarative language, or short snippets of (likely Python) code. We emphasize that this LSST-provided capability will be limited, and is not intended to satisfy the wide variety of use cases that a full-fledged public Event Distributor/Broker could. End users will not be able to subscribe to full, unfiltered, alert streams coming directly from LSST. In this session, we will discuss anticipated LSST data rates, and capabilities for alert processing and distribution/brokering. We will clarify what the LSST Observatory will provide versus what we anticipate will be a community effort.
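A toy example of the short Python filter snippets mentioned above might look like the following. The alert packet layout (a dict with a 'diaSource' block) and the thresholds are guesses for illustration, not the actual LSST alert schema or filtering API.

```python
# Hypothetical user filter: keep only bright, high-significance alerts in g or r.
# The packet structure and field names here are assumptions, not the LSST schema.
def wants_alert(alert):
    src = alert["diaSource"]
    return (src["band"] in ("g", "r")
            and src["magnitude"] < 20.0          # bright enough for small-telescope follow-up
            and src["snr"] > 10.0)

sample = {"diaSource": {"band": "r", "magnitude": 19.2, "snr": 14.3}}
print(wants_alert(sample))   # True
```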
The Large Synoptic Survey Telescope (LSST) Camera
None
2018-06-13
Ranked as the top ground-based national priority for the field for the current decade, LSST is currently under construction in Chile. The U.S. Department of Energy's SLAC National Accelerator Laboratory is leading the construction of the LSST camera – the largest digital camera ever built for astronomy. SLAC Professor Steven M. Kahn is the overall Director of the LSST project, and SLAC personnel are also participating in the data management. The National Science Foundation is the lead agency for construction of the LSST. Additional financial support comes from the Department of Energy and private funding raised by the LSST Corporation.
Optimizing the LSST Dither Pattern for Survey Uniformity
NASA Astrophysics Data System (ADS)
Awan, Humna; Gawiser, Eric J.; Kurczynski, Peter; Carroll, Christopher M.; LSST Dark Energy Science Collaboration
2015-01-01
The Large Synoptic Survey Telescope (LSST) will gather detailed data of the southern sky, enabling unprecedented study of Baryonic Acoustic Oscillations, which are an important probe of dark energy. These studies require a survey with highly uniform depth, and we aim to find an observation strategy that optimizes this uniformity. We have shown that in the absence of dithering (large telescope-pointing offsets), the LSST survey will vary significantly in depth. Hence, we implemented various dithering strategies, including random and repulsive random pointing offsets and spiral patterns with the spiral reaching completion in either a few months or the entire ten-year run. We employed three different implementations of dithering strategies: a single offset assigned to all fields observed on each night, offsets assigned to each field independently whenever the field is observed, and offsets assigned to each field only when the field is observed on a new night. Our analysis reveals that large dithers are crucial to guarantee survey uniformity and that assigning dithers to each field independently whenever the field is observed significantly increases this uniformity. These results suggest paths towards an optimal observation strategy that will enable LSST to achieve its science goals. We gratefully acknowledge support from the National Science Foundation REU program at Rutgers, PHY-1263280, and the Department of Energy, DE-SC0011636.
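One of the strategies described above, a single random offset shared by all fields observed on a given night, can be sketched as follows. The 1.75 degree maximum radius (half the 3.5 degree field of view) is a representative choice, not a project requirement.

```python
# Minimal sketch of a per-night random dither: one offset per night, drawn
# uniformly over a disc of assumed maximum radius.
import numpy as np

def nightly_random_dither(night, max_radius_deg=1.75):
    rng = np.random.default_rng(night)              # seed by night: all fields share the offset
    r = max_radius_deg * np.sqrt(rng.uniform())     # uniform over the disc's area
    theta = rng.uniform(0, 2 * np.pi)
    return r * np.cos(theta), r * np.sin(theta)     # (dRA, dDec) offsets in degrees

for night in range(3):
    print(night, nightly_random_dither(night))
```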
Examining the Potential of LSST to Contribute to Exoplanet Discovery
NASA Astrophysics Data System (ADS)
Lund, Michael B.; Pepper, Joshua; Jacklin, Savannah; Stassun, Keivan G.
2018-01-01
The Large Synoptic Survey Telescope (LSST), currently under construction in Chile with scheduled first light in 2019, will be one of the major sources of data in the next decade and is one of the top priorities expressed in the last Decadal Survey. As LSST is intended to cover a range of science questions, the LSST community is still working on optimizing the observing strategy of the survey. With a survey area that will cover half the sky in 6 bands, providing photometric data on billions of stars from 16th to 24th magnitude, LSST can be leveraged to help contribute to exoplanet science. In particular, LSST has the potential to detect exoplanets around stellar populations that are not normally included in transiting exoplanet searches. This includes searching for exoplanets around red and white dwarfs and stars in the galactic plane and bulge, stellar clusters, and potentially even the Magellanic Clouds. In probing these varied stellar populations, relative exoplanet frequency can be examined, and in turn, LSST may be able to provide fresh insight into how stellar environment can play a role in planetary formation rates. Our initial work on this project has been to demonstrate that even with the limitations of the LSST cadence, exoplanets would be recoverable and detectable in the LSST photometry, and to show that exoplanets are indeed worth including in discussions of variable sources that LSST can contribute to. We have continued to expand this work to examine exoplanets around stars belonging to various stellar populations, both to show the types of systems that LSST is capable of discovering, and to determine the potential exoplanet yields using standard algorithms that have already been implemented in transiting exoplanet searches, as well as how changes to LSST's observing schedule may impact both of these results.
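A simple recovery test in the spirit of this work is sketched below: a box-shaped transit is injected into sparse, unevenly sampled photometry and searched for with Astropy's BoxLeastSquares. The cadence, depth, and noise level are invented for illustration and are not LSST specifications or the authors' actual simulation setup.

```python
# Inject a transit into sparse synthetic photometry and recover its period with
# a box-least-squares search. All numbers are illustrative assumptions.
import numpy as np
from astropy.timeseries import BoxLeastSquares

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 3650, 900))           # ~900 sparse visits over 10 years (days)
period, depth, duration = 3.7, 0.01, 0.12        # days, relative flux, days
in_transit = ((t % period) / period) < (duration / period)
flux = 1.0 - depth * in_transit + rng.normal(0, 0.003, t.size)

bls = BoxLeastSquares(t, flux)
result = bls.autopower(duration, minimum_period=1.0, maximum_period=10.0)
best = result.period[np.argmax(result.power)]
print(f"injected period {period} d, best-fit period {best:.2f} d")
```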
Science Education with the LSST
NASA Astrophysics Data System (ADS)
Jacoby, S. H.; Khandro, L. M.; Larson, A. M.; McCarthy, D. W.; Pompea, S. M.; Shara, M. M.
2004-12-01
LSST will create the first true celestial cinematography - a revolution in public access to the changing universe. The challenge will be to take advantage of the unique capabilities of the LSST while presenting the data in ways that are manageable, engaging, and supportive of national science education goals. To prepare for this opportunity for exploration, tools and displays will be developed using current deep-sky multi-color imaging data. Education professionals from LSST partners invite input from interested members of the community. Initial LSST science education priorities include: - Fostering authentic student-teacher research projects at all levels, - Exploring methods of visualizing the large and changing datasets in science centers, - Defining Web-based interfaces and tools for access and interaction with the data, - Delivering online instructional materials, and - Developing meaningful interactions between LSST scientists and the public.
LSST (Hoop/Column) Maypole Antenna Development Program, phase 1, part 2
NASA Technical Reports Server (NTRS)
Sullivan, M. R.
1982-01-01
Cable technology is discussed. Manufacturing flow and philosophy are considered. Acceptance, qualification, and flight tests are discussed. Fifteen-meter and fifty-meter models are considered. An economic assessment is included.
LSST (Hoop/Column) Maypole Antenna Development Program, phase 1, part 2
NASA Astrophysics Data System (ADS)
Sullivan, M. R.
1982-06-01
Cable technology is discussed. Manufacturing flow and philosophy are considered. Acceptance, qualification, and flight tests are discussed. Fifteen-meter and fifty-meter models are considered. An economic assessment is included.
LSST active optics system software architecture
NASA Astrophysics Data System (ADS)
Thomas, Sandrine J.; Chandrasekharan, Srinivasan; Lotz, Paul; Xin, Bo; Claver, Charles; Angeli, George; Sebag, Jacques; Dubois-Felsmann, Gregory P.
2016-08-01
The Large Synoptic Survey Telescope (LSST) is an 8-meter class wide-field telescope now under construction on Cerro Pachon, near La Serena, Chile. This ground-based telescope is designed to conduct a decade-long time domain survey of the optical sky. In order to achieve the LSST scientific goals, the telescope must deliver seeing-limited image quality over the 3.5 degree field-of-view. Like many telescopes, LSST will use an Active Optics System (AOS) to correct, in near real time, the system aberrations primarily introduced by gravity and temperature gradients. The LSST AOS uses a combination of 4 curvature wavefront sensors (CWS) located on the outside of the LSST field-of-view. The information coming from the 4 CWS is combined to calculate the appropriate corrections to be sent to the 3 different mirrors composing LSST. The AOS software incorporates a wavefront sensor estimation pipeline (WEP) and an active optics control system (AOCS). The WEP estimates the wavefront residual error from the CWS images. The AOCS determines the correction to be sent to the different degrees of freedom every 30 seconds. In this paper, we describe the design and implementation of the AOS. More particularly, we will focus on the software architecture as well as the AOS interactions with the various subsystems within LSST.
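The estimate-then-correct loop described above can be illustrated schematically as follows; this is not the LSST AOS implementation. Wavefront residuals (for example, Zernike coefficients estimated by the WEP from the four curvature sensors) are mapped to degree-of-freedom corrections through the pseudoinverse of an assumed sensitivity matrix, applied with a damping gain each 30-second cycle.

```python
# Schematic active-optics step: map wavefront residuals to mirror corrections.
# The sensitivity matrix, gain, and dimensions below are toy assumptions.
import numpy as np

def aocs_step(residual_zernikes, sensitivity, gain=0.5):
    """One control step: return the correction to send to the degrees of freedom."""
    # sensitivity[i, j] = change in wavefront mode i per unit move of degree of freedom j
    return -gain * np.linalg.pinv(sensitivity) @ residual_zernikes

# Toy numbers: 4 wavefront modes controlled by 3 degrees of freedom.
S = np.array([[1.0, 0.0, 0.2],
              [0.0, 1.0, 0.1],
              [0.3, 0.0, 1.0],
              [0.0, 0.2, 0.0]])
residual = np.array([0.10, -0.05, 0.02, 0.01])   # e.g. microns of wavefront error
print(aocs_step(residual, S))
```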
LSST Survey Data: Models for EPO Interaction
NASA Astrophysics Data System (ADS)
Olsen, J. K.; Borne, K. D.
2007-12-01
The potential for education and public outreach with the Large Synoptic Survey Telescope is as far reaching as the telescope itself. LSST data will be available to the public, giving anyone with a web browser a movie-like window on the Universe. The LSST project is unique in designing its data management and data access systems with the public and community users in mind. The enormous volume of data to be generated by LSST is staggering: 30 Terabytes per night, 10 Petabytes per year. The final database of extracted science parameters from the images will also be enormous -- 50-100 Petabytes -- a rich gold mine for data mining and scientific discovery potential. LSST will also generate 100,000 astronomical alerts per night, for 10 years. The LSST EPO team is examining models for EPO interaction with the survey data, particularly in how the community (amateurs, teachers, students, and general public) can participate in the discovery process. We will outline some of our models of community interaction for inquiry-based science using the LSST survey data, and we invite discussion on these topics.
Big Software for Big Data: Scaling Up Photometry for LSST (Abstract)
NASA Astrophysics Data System (ADS)
Rawls, M.
2017-06-01
(Abstract only) The Large Synoptic Survey Telescope (LSST) will capture mosaics of the sky every few nights, each containing more data than your computer's hard drive can store. As a result, the software to process these images is as critical to the science as the telescope and the camera. I discuss the algorithms and software being developed by the LSST Data Management team to handle such a large volume of data. All of our work is open source and available to the community. Once LSST comes online, our software will produce catalogs of objects and a stream of alerts. These will bring exciting new opportunities for follow-up observations and collaborations with LSST scientists.
Asteroid Discovery and Characterization with the Large Synoptic Survey Telescope
NASA Astrophysics Data System (ADS)
Jones, R. Lynne; Jurić, Mario; Ivezić, Željko
2016-01-01
The Large Synoptic Survey Telescope (LSST) will be a ground-based, optical, all-sky, rapid cadence survey project with tremendous potential for discovering and characterizing asteroids. With LSST's large 6.5 m effective diameter primary mirror, a wide 9.6 square degree field-of-view, 3.2 Gigapixel camera, and rapid observational cadence, LSST will discover more than 5 million asteroids over its ten year survey lifetime. With a single visit limiting magnitude of 24.5 in r band, LSST will be able to detect asteroids in the Main Belt down to sub-kilometer sizes. The current strawman for the LSST survey strategy is to obtain two visits (each 'visit' being a pair of back-to-back 15s exposures) per field, separated by about 30 minutes, covering the entire visible sky every 3-4 days throughout the observing season, for ten years. The catalogs generated by LSST will increase the known number of small bodies in the Solar System by a factor of 10-100 times, among all populations. The median number of observations for Main Belt asteroids will be on the order of 200-300, with Near Earth Objects receiving a median of 90 observations. These observations will be spread among ugrizy bandpasses, providing photometric colors and allowing sparse lightcurve inversion to determine rotation periods, spin axes, and shape information. These catalogs will be created using automated detection software, the LSST Moving Object Processing System (MOPS), which will take advantage of the carefully characterized LSST optical system, cosmetically clean camera, and recent improvements in difference imaging. Tests with the prototype MOPS software indicate that linking detections (and thus 'discovery') will be possible at LSST depths with our working model for the survey strategy, but evaluation of MOPS and improvements in the survey strategy will continue. All data products and software created by LSST will be publicly available.
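A toy version of the first stage of detection linking mentioned above is sketched below: same-night detections are paired into tracklets when they are close enough on the sky and consistent with a plausible rate of motion. The thresholds are illustrative, not MOPS configuration values.

```python
# Pair detections into intra-night "tracklets" by sky motion rate.
# Rate and time thresholds are invented for illustration.
import numpy as np

def make_tracklets(detections, max_rate_deg_per_day=2.0, min_dt_days=0.01):
    """detections: list of (mjd, ra_deg, dec_deg, det_id); returns linked id pairs."""
    tracklets = []
    for i, (t1, ra1, dec1, id1) in enumerate(detections):
        for t2, ra2, dec2, id2 in detections[i + 1:]:
            dt = t2 - t1
            if dt < min_dt_days:
                continue
            sep = np.hypot((ra2 - ra1) * np.cos(np.radians(dec1)), dec2 - dec1)
            if sep / dt <= max_rate_deg_per_day:
                tracklets.append((id1, id2))
    return tracklets

dets = [(60000.00, 150.000, -20.000, "a"),
        (60000.02, 150.010, -20.002, "b"),   # same object ~30 min later
        (60000.02, 151.500, -21.000, "c")]   # too far to link at this rate
print(make_tracklets(dets))                  # [('a', 'b')]
```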
Investigating the Bright End of LSST Photometry
NASA Astrophysics Data System (ADS)
Ojala, Elle; Pepper, Joshua; LSST Collaboration
2018-01-01
The Large Synoptic Survey Telescope (LSST) will begin operations in 2022, conducting a wide-field, synoptic multiband survey of the southern sky. Some fraction of objects at the bright end of the magnitude regime observed by LSST will overlap with other wide-sky surveys, allowing for calibration and cross-checking between surveys. The LSST is optimized for observations of very faint objects, so much of this data overlap will consist of saturated images. This project provides the first in-depth analysis of saturation in LSST images. Using the PhoSim package to create simulated LSST images, we evaluate saturation properties of several types of stars to determine the brightness limitations of LSST. We also collect metadata from many wide-field photometric surveys to provide cross-survey accounting and comparison. Additionally, we evaluate the accuracy of the PhoSim modeling parameters to determine the reliability of the software. These efforts will allow us to determine the expected useable data overlap between bright-end LSST images and faint-end images in other wide-sky surveys. Our next steps are developing methods to extract photometry from saturated images. This material is based upon work supported in part by the National Science Foundation through Cooperative Agreement 1258333 managed by the Association of Universities for Research in Astronomy (AURA), and the Department of Energy under Contract No. DE-AC02-76SF00515 with the SLAC National Accelerator Laboratory. Additional LSST funding comes from private donations, grants to universities, and in-kind support from LSSTC Institutional Members. Thanks to NSF grant PHY-135195 and the 2017 LSSTC Grant Award #2017-UG06 for making this project possible.
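A back-of-the-envelope saturation estimate in the spirit of this study (but not a PhoSim computation) is shown below. Every number is an assumed placeholder: a zero point defined so a star of that magnitude yields 1 electron per second, a 15 s exposure, a 100,000-electron pixel full well, and roughly 5% of the star's light landing in the brightest pixel.

```python
# Rough saturation-magnitude estimate; all inputs are assumed placeholder values.
import math

def saturation_magnitude(zero_point=28.0, exposure_s=15.0,
                         full_well_e=1.0e5, peak_pixel_fraction=0.05):
    # A star saturates when peak_pixel_fraction * total electrons reaches the full well.
    total_e_at_saturation = full_well_e / peak_pixel_fraction
    rate_e_per_s = total_e_at_saturation / exposure_s
    return zero_point - 2.5 * math.log10(rate_e_per_s)

print(f"stars brighter than ~{saturation_magnitude():.1f} mag would saturate")
```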
Investigating interoperability of the LSST data management software stack with Astropy
NASA Astrophysics Data System (ADS)
Jenness, Tim; Bosch, James; Owen, Russell; Parejko, John; Sick, Jonathan; Swinbank, John; de Val-Borro, Miguel; Dubois-Felsmann, Gregory; Lim, K.-T.; Lupton, Robert H.; Schellart, Pim; Krughoff, K. S.; Tollerud, Erik J.
2016-07-01
The Large Synoptic Survey Telescope (LSST) will be an 8.4m optical survey telescope sited in Chile and capable of imaging the entire sky twice a week. The data rate of approximately 15 TB per night and the requirements to both issue alerts on transient sources within 60 seconds of observing and create annual data releases mean that automated data management systems and data processing pipelines are a key deliverable of the LSST construction project. The LSST data management software has been in development since 2004 and is based on a C++ core with a Python control layer. The software consists of nearly a quarter of a million lines of code covering the system from fundamental WCS and table libraries to pipeline environments and distributed process execution. The Astropy project began in 2011 as an attempt to bring together disparate open source Python projects and build a core standard infrastructure that can be used and built upon by the astronomy community. This project has been phenomenally successful in the years since it began and has grown to be the de facto standard for Python software in astronomy. Astropy brings with it considerable expectations from the community on how astronomy Python software should be developed, and it is clear that by the time LSST is fully operational in the 2020s many of the prospective users of the LSST software stack will expect it to be fully interoperable with Astropy. In this paper we describe the overlap between the LSST science pipeline software and Astropy software and investigate areas where the LSST software provides new functionality. We also discuss the possibilities of re-engineering the LSST science pipeline software to build upon Astropy, including the option of contributing affiliated packages.
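An interoperability sketch under stated assumptions: suppose a pipeline step returns plain arrays of source measurements plus a simple TAN projection (the LSST stack's own table and WCS classes are not used here). Converting to Astropy objects then exposes the community-standard interfaces; the function and column names below are hypothetical.

```python
# Wrap hypothetical pipeline outputs in astropy objects so downstream users can
# work with community-standard Table and WCS interfaces.
import numpy as np
from astropy.table import Table
from astropy.wcs import WCS

# Build a minimal TAN WCS (values are arbitrary placeholders).
wcs = WCS(naxis=2)
wcs.wcs.ctype = ["RA---TAN", "DEC--TAN"]
wcs.wcs.crval = [150.0, -20.0]      # reference sky position (deg)
wcs.wcs.crpix = [100.0, 100.0]      # reference pixel
wcs.wcs.cdelt = [-5.5e-5, 5.5e-5]   # ~0.2 arcsec pixels, in degrees

def to_astropy_catalog(x, y, flux, wcs):
    """Return an astropy Table of sources with sky coordinates attached."""
    ra, dec = wcs.all_pix2world(x, y, 0)
    return Table({"x": x, "y": y, "flux": flux, "ra": ra, "dec": dec})

catalog = to_astropy_catalog(np.array([10.0, 200.5]),
                             np.array([15.2, 180.0]),
                             np.array([1200.0, 340.0]), wcs)
print(catalog)
```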
The Large Synoptic Survey Telescope as a Near-Earth Object discovery machine
NASA Astrophysics Data System (ADS)
Jones, R. Lynne; Slater, Colin T.; Moeyens, Joachim; Allen, Lori; Axelrod, Tim; Cook, Kem; Ivezić, Željko; Jurić, Mario; Myers, Jonathan; Petry, Catherine E.
2018-03-01
Using the most recent prototypes, design, and as-built system information, we test and quantify the capability of the Large Synoptic Survey Telescope (LSST) to discover Potentially Hazardous Asteroids (PHAs) and Near-Earth Objects (NEOs). We empirically estimate an expected upper limit to the false detection rate in LSST image differencing, using measurements on DECam data and prototype LSST software, and find it to be about 450 per square degree. We show that this rate is already tractable with the current prototype of the LSST Moving Object Processing System (MOPS) by processing a 30-day simulation consistent with measured false detection rates. We proceed to evaluate the performance of the LSST baseline survey strategy for PHAs and NEOs using a high-fidelity simulated survey pointing history. We find that LSST alone, using its baseline survey strategy, will detect 66% of the PHA and 61% of the NEO population objects brighter than H = 22, with an uncertainty in the estimate of ±5 percentage points. By generating and examining variations on the baseline survey strategy, we show it is possible to further improve the discovery yields. In particular, we find that extending the LSST survey by two additional years and doubling the MOPS search window increases the completeness for PHAs to 86% (including those discovered by contemporaneous surveys) without jeopardizing other LSST science goals (77% for NEOs). This equates to reducing the undiscovered population of PHAs by an additional 26% (15% for NEOs), relative to the baseline survey.
NASA Astrophysics Data System (ADS)
Claver, Chuck F.; Dubois-Felsmann, G. P.; Delgado, F.; Hascall, P.; Marshall, S.; Nordby, M.; Schumacher, G.; Sebag, J.; LSST Collaboration
2011-01-01
The Large Synoptic Survey Telescope (LSST) is a complete observing system that acquires and archives images, processes and analyzes them, and publishes reduced images and catalogs of sources and objects. The LSST will operate over a ten year period producing a survey of 20,000 square degrees over the entire southern sky in 6 filters (ugrizy), with each field having been visited several hundred times, enabling a wide spectrum of science from fast transients to exploration of dark matter and dark energy. The LSST itself is a complex system of systems consisting of the 8.4m 3-mirror telescope, a 3.2 billion pixel camera, and a peta-scale data management system. The LSST project uses a Model Based Systems Engineering (MBSE) methodology to ensure an integrated approach to system design and rigorous definition of system interfaces and specifications. The MBSE methodology is applied through modeling of the LSST's systems with the System Modeling Language (SysML). The SysML modeling recursively establishes the threefold relationship between requirements, logical & physical functional decomposition and definition, and system and component behavior at successively deeper levels of abstraction and detail. The LSST modeling includes analyzing and documenting the flow of command and control information and data among the suite of systems in the LSST observatory that are needed to carry out the activities of the survey. The MBSE approach is applied throughout all stages of the project, from design, to validation and verification, through to commissioning.
Integration and verification testing of the Large Synoptic Survey Telescope camera
NASA Astrophysics Data System (ADS)
Lange, Travis; Bond, Tim; Chiang, James; Gilmore, Kirk; Digel, Seth; Dubois, Richard; Glanzman, Tom; Johnson, Tony; Lopez, Margaux; Newbry, Scott P.; Nordby, Martin E.; Rasmussen, Andrew P.; Reil, Kevin A.; Roodman, Aaron J.
2016-08-01
We present an overview of the Integration and Verification Testing activities of the Large Synoptic Survey Telescope (LSST) Camera at the SLAC National Accelerator Lab (SLAC). The LSST Camera, the sole instrument for LSST and currently under construction, comprises a 3.2-gigapixel imager and a three-element corrector with a 3.5 degree diameter field of view. LSST Camera Integration and Test will be taking place over the next four years, with final delivery to the LSST observatory anticipated in early 2020. We outline the planning for Integration and Test, describe some of the key verification hardware systems being developed, and identify some of the more complicated assembly/integration activities. Specific details of integration and verification hardware systems will be discussed, highlighting some of the technical challenges anticipated.
NASA Astrophysics Data System (ADS)
Darch, Peter T.; Sands, Ashley E.
2016-06-01
Sky surveys, such as the Sloan Digital Sky Survey (SDSS) and the Large Synoptic Survey Telescope (LSST), generate data on an unprecedented scale. While many scientific projects span a few years from conception to completion, sky surveys are typically on the scale of decades. This paper focuses on critical challenges arising from long timescales, and how sky surveys address these challenges. We present findings from a study of LSST, comprising interviews (n=58) and observation. Conceived in the 1990s, the LSST Corporation was formed in 2003, and construction began in 2014. LSST will commence data collection operations in 2022 for ten years. One challenge arising from this long timescale is uncertainty about future needs of the astronomers who will use these data many years hence. Sources of uncertainty include scientific questions to be posed, astronomical phenomena to be studied, and tools and practices these astronomers will have at their disposal. These uncertainties are magnified by the rapid technological and scientific developments anticipated between now and the start of LSST operations. LSST is implementing a range of strategies to address these challenges. Some strategies involve delaying resolution of uncertainty, placing this resolution in the hands of future data users. Other strategies aim to reduce uncertainty by shaping astronomers’ data analysis practices so that these practices will integrate well with LSST once operations begin. One approach that exemplifies both types of strategy is the decision to make LSST data management software open source, even now as it is being developed. This policy will enable future data users to adapt this software to evolving needs. In addition, LSST intends for astronomers to start using this software well in advance of 2022, thereby embedding LSST software and data analysis approaches in the practices of astronomers. These findings strengthen arguments for making the software supporting sky surveys available as open source. Such arguments usually focus on reuse potential of software, and enhancing replicability of analyses. In this case, however, open source software also promises to mitigate the critical challenge of anticipating the needs of future data users.
A Prototype External Event Broker for LSST
NASA Astrophysics Data System (ADS)
Elan Alvarez, Gabriella; Stassun, Keivan; Burger, Dan; Siverd, Robert; Cox, Donald
2015-01-01
LSST plans to have an alerts system that will automatically identify various types of "events" appearing in the LSST data stream. These events will include supernovae, moving objects, and many other types, and it is expected that there will be millions to tens of millions of events each night. To help the LSST community parse and take full advantage of the LSST alert stream, we are working to design an external "event alert broker" that will generate real-time notifications of LSST events to users and/or robotic telescope facilities based on user-specified criteria. For example, users will be able to specify that they wish to be notified immediately via text message of urgent events, such as GRB counterparts, or notified only occasionally in digest form of less time-sensitive events, such as eclipsing binaries. This poster will summarize results from a survey of scientists for the most important features that such an alert notification service needs to provide, and will present a preliminary design for our external event broker.
Astrometry with LSST: Objectives and Challenges
NASA Astrophysics Data System (ADS)
Casetti-Dinescu, D. I.; Girard, T. M.; Méndez, R. A.; Petronchak, R. M.
2018-01-01
The forthcoming Large Synoptic Survey Telescope (LSST) is an optical telescope with an effective aperture of 6.4 m and a field of view of 9.6 square degrees. Thus, LSST will have an étendue larger than any other optical telescope, performing wide-field, deep imaging of the sky. There are four broad categories of science objectives: 1) dark energy and dark matter, 2) transients, 3) the Milky Way and its neighbours, and 4) the Solar System. In particular, for the Milky Way science case, astrometry will make a critical contribution; therefore, special attention must be devoted to extracting the maximum amount of astrometric information from the LSST data. Here, we outline the astrometric challenges posed by such a massive survey. We also present some current examples of ground-based, wide-field, deep imagers used for astrometry, as precursors of the LSST.
Data Mining Research with the LSST
NASA Astrophysics Data System (ADS)
Borne, Kirk D.; Strauss, M. A.; Tyson, J. A.
2007-12-01
The LSST catalog database will exceed 10 petabytes, comprising several hundred attributes for 5 billion galaxies, 10 billion stars, and over 1 billion variable sources (optical variables, transients, or moving objects), extracted from over 20,000 square degrees of deep imaging in 5 passbands with thorough time domain coverage: 1000 visits over the 10-year LSST survey lifetime. The opportunities are enormous for novel scientific discoveries within this rich time-domain ultra-deep multi-band survey database. Data Mining, Machine Learning, and Knowledge Discovery research opportunities with the LSST are now under study, with the potential for new collaborations to develop and contribute to these investigations. We will describe features of the LSST science database that are amenable to scientific data mining, object classification, outlier identification, anomaly detection, image quality assurance, and survey science validation. We also give some illustrative examples of current scientific data mining research in astronomy, and point out where new research is needed. In particular, the data mining research community will need to address several issues in the coming years as we prepare for the LSST data deluge. The data mining research agenda includes: scalability (at petabyte scales) of existing machine learning and data mining algorithms; development of grid-enabled parallel data mining algorithms; designing a robust system for brokering classifications from the LSST event pipeline (which may produce 10,000 or more event alerts per night); multi-resolution methods for exploration of petascale databases; visual data mining algorithms for exploration of the data; indexing of multi-attribute multi-dimensional astronomical databases (beyond RA-Dec spatial indexing) for rapid querying of petabyte databases; and more. Finally, we will identify opportunities for synergistic collaboration between the data mining research group and the LSST Data Management and Science Collaboration teams.
The Stellar Populations of the Milky Way and Nearby Galaxies with LSST
NASA Astrophysics Data System (ADS)
Olsen, Knut A.; Covey, K.; Saha, A.; Beers, T. C.; Bochanski, J.; Boeshaar, P.; Cargile, P.; Catelan, M.; Burgasser, A.; Cook, K.; Dhital, S.; Figer, D.; Ivezic, Z.; Kalirai, J.; McGehee, P.; Minniti, D.; Pepper, J.; Prsa, A.; Sarajedini, A.; Silva, D.; Smith, J. A.; Stassun, K.; Thorman, P.; Williams, B.; LSST Stellar Populations Collaboration
2011-01-01
The LSST will produce a multi-color map and photometric object catalog of half the sky to r=27.6 (AB mag; 5-sigma) when observations at the individual epochs of the standard cadence are stacked. Analyzing the ten years of independent measurements in each field will allow variability, proper motion and parallax measurements to be derived for objects brighter than r=24.5. These photometric, astrometric, and variability data will enable the construction of a detailed and robust map of the stellar populations of the Milky Way, its satellites and its nearest extra-galactic neighbors--allowing exploration of their star formation, chemical enrichment, and accretion histories on a grand scale. For example, with geometric parallax accuracy of 1 milli-arc-sec, comparable to HIPPARCOS but reaching more than 10 magnitudes fainter, LSST will allow a complete census of all stars above the hydrogen-burning limit that are closer than 500 pc, including thousands of predicted L and T dwarfs. The LSST time sampling will identify and characterize variable stars of all types, from time scales of 1 hr to several years, a feast for variable star astrophysics; LSST's projected impact on the study of several variable star classes, including eclipsing binaries, is discussed here. We also describe the ongoing efforts of the collaboration to optimize the LSST system for stellar populations science. We are currently investigating the trade-offs associated with the exact wavelength boundaries of the LSST filters, identifying the most scientifically valuable locations for fields that will receive enhanced temporal coverage compared to the standard cadence, and analyzing synthetic LSST outputs to verify that the system's performance will be sufficient to achieve our highest priority science goals.
Designing a Multi-Petabyte Database for LSST
DOE Office of Scientific and Technical Information (OSTI.GOV)
Becla, Jacek; Hanushevsky, Andrew; Nikolaev, Sergei
2007-01-10
The 3.2 giga-pixel LSST camera will produce approximately half a petabyte of archive images every month. These data need to be reduced in under a minute to produce real-time transient alerts, and then added to the cumulative catalog for further analysis. The catalog is expected to grow by about three hundred terabytes per year. The data volume, the real-time transient alerting requirements of the LSST, and its spatio-temporal aspects require innovative techniques to build an efficient data access system at reasonable cost. As currently envisioned, the system will rely on a database for catalogs and metadata. Several database systems are being evaluated to understand how they perform at these data rates, data volumes, and access patterns. This paper describes the LSST requirements, the challenges they impose, the data access philosophy, results to date from evaluating available database technologies against LSST requirements, and the proposed database architecture to meet the data challenges.
NASA Astrophysics Data System (ADS)
Claver, Chuck F.; Dubois-Felsmann, G. P.; Delgado, F.; Hascall, P.; Horn, D.; Marshall, S.; Nordby, M.; Schalk, T. L.; Schumacher, G.; Sebag, J.; LSST Project Team
2010-01-01
The LSST is a complete observing system that acquires and archives images, processes and analyzes them, and publishes reduced images and catalogs of sources and objects. The LSST will operate over a ten year period producing a survey of 20,000 square degrees over the entire southern sky in 6 filters (ugrizy), with each field having been visited several hundred times, enabling a wide spectrum of science from fast transients to exploration of dark matter and dark energy. The LSST itself is a complex system of systems consisting of the 8.4m three mirror telescope, a 3.2 billion pixel camera, and a peta-scale data management system. The LSST project uses a Model Based Systems Engineering (MBSE) methodology to ensure an integrated approach to system design and rigorous definition of system interfaces and specifications. The MBSE methodology is applied through modeling of the LSST's systems with the System Modeling Language (SysML). The SysML modeling recursively establishes the threefold relationship between requirements, logical & physical functional decomposition and definition, and system and component behavior at successively deeper levels of abstraction and detail. The MBSE approach is applied throughout all stages of the project, from design, to validation and verification, through to commissioning.
Designing a multi-petabyte database for LSST
DOE Office of Scientific and Technical Information (OSTI.GOV)
Becla, J; Hanushevsky, A
2005-12-21
The 3.2 giga-pixel LSST camera will produce over half a petabyte of raw images every month. These data need to be reduced in under a minute to produce real-time transient alerts, and then cataloged and indexed to allow efficient access and simplify further analysis. The indexed catalogs alone are expected to grow at a rate of about 600 terabytes per year. The sheer volume of data, the real-time transient alerting requirements of the LSST, and its spatio-temporal aspects require cutting-edge techniques to build an efficient data access system at reasonable cost. As currently envisioned, the system will rely on a database for catalogs and metadata. Several database systems are being evaluated to understand how they will scale and perform at these data volumes in anticipated LSST access patterns. This paper describes the LSST requirements, the challenges they impose, the data access philosophy, and the database architecture that is expected to be adopted in order to meet the data challenges.
Transiting Planets with LSST. II. Period Detection of Planets Orbiting 1 M⊙ Hosts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacklin, Savannah; Lund, Michael B.; Stassun, Keivan G.
2015-07-15
The Large Synoptic Survey Telescope (LSST) will photometrically monitor ∼10^9 stars for 10 years. The resulting light curves can be used to detect transiting exoplanets. In particular, as demonstrated by Lund et al., LSST will probe stellar populations currently undersampled in most exoplanet transit surveys, including out to extragalactic distances. In this paper we test the efficiency of the box-fitting least-squares (BLS) algorithm for accurately recovering the periods of transiting exoplanets using simulated LSST data. We model planets with a range of radii orbiting a solar-mass star at a distance of 7 kpc, with orbital periods ranging from 0.5 to 20 days. We find that standard-cadence LSST observations will be able to reliably recover the periods of Hot Jupiters with periods shorter than ∼3 days; however, it will remain a challenge to confidently distinguish these transiting planets from false positives. At the same time, we find that the LSST deep-drilling cadence is extremely powerful: the BLS algorithm successfully recovers at least 30% of sub-Saturn-size exoplanets with orbital periods as long as 20 days, and a simple BLS power criterion robustly distinguishes ∼98% of these from photometric (i.e., statistical) false positives.
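As a sketch of the kind of period search described above, the snippet below injects a box-shaped transit into an irregularly sampled toy light curve and recovers its period with the box-fitting least-squares implementation in astropy. The cadence, noise level, and transit parameters are arbitrary illustrative choices, not the values used in the paper.

```python
import numpy as np
from astropy.timeseries import BoxLeastSquares

rng = np.random.default_rng(42)

# Toy stand-in for a sparse, unevenly sampled light curve (~900 epochs over 10 yr).
t = np.sort(rng.uniform(0.0, 3650.0, 900))            # days
period, duration, depth = 2.8, 0.12, 0.02              # days, days, relative flux
in_transit = ((t / period) % 1.0) * period < duration
flux = 1.0 - depth * in_transit + rng.normal(0.0, 0.01, t.size)

# Box-fitting least-squares period search over 0.5-20 day periods.
bls = BoxLeastSquares(t, flux, dy=0.01)
result = bls.autopower(duration, minimum_period=0.5, maximum_period=20.0)
best_period = result.period[np.argmax(result.power)]
print(f"Injected period: {period} d, recovered: {best_period:.3f} d")
```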
LSST and the Epoch of Reionization Experiments
NASA Astrophysics Data System (ADS)
Ivezić, Željko
2018-05-01
The Large Synoptic Survey Telescope (LSST), a next generation astronomical survey, sited on Cerro Pachon in Chile, will provide an unprecedented amount of imaging data for studies of the faint optical sky. The LSST system includes an 8.4m (6.7m effective) primary mirror and a 3.2 Gigapixel camera with a 9.6 sq. deg. field of view. This system will enable about 10,000 sq. deg. of sky to be covered twice per night, every three to four nights on average, with typical 5-sigma depth for point sources of r = 24.5 (AB). With over 800 observations in the ugrizy bands over a 10-year period, these data will enable coadded images reaching r = 27.5 (about 5 magnitudes deeper than SDSS) as well as studies of faint time-domain astronomy. The measured properties of newly discovered and known astrometric and photometric transients will be publicly reported within 60 sec after closing the shutter. The resulting hundreds of petabytes of imaging data for about 40 billion objects will be used for scientific investigations ranging from the properties of near-Earth asteroids to characterizations of dark matter and dark energy. For example, simulations estimate that LSST will discover about 1,000 quasars at redshifts exceeding 7; this sample will place tight constraints on the cosmic environment at the end of the reionization epoch. In addition to a brief introduction to LSST, I review the value of LSST data in support of epoch of reionization experiments and discuss how international participants can join LSST.
Photometric Redshifts with the LSST: Evaluating Survey Observing Strategies
NASA Astrophysics Data System (ADS)
Graham, Melissa L.; Connolly, Andrew J.; Ivezić, Željko; Schmidt, Samuel J.; Jones, R. Lynne; Jurić, Mario; Daniel, Scott F.; Yoachim, Peter
2018-01-01
In this paper we present and characterize a nearest-neighbors color-matching photometric redshift estimator that features a direct relationship between the precision and accuracy of the input magnitudes and the output photometric redshifts. This aspect makes our estimator an ideal tool for evaluating the impact of changes to LSST survey parameters that affect the measurement errors of the photometry, which is the main motivation of our work (i.e., it is not intended to provide the “best” photometric redshifts for LSST data). We show how the photometric redshifts will improve with time over the 10 year LSST survey and confirm that the nominal distribution of visits per filter provides the most accurate photo-z results. The LSST survey strategy naturally produces observations over a range of airmass, which offers the opportunity of using an SED- and z-dependent atmospheric effect on the observed photometry as a color-independent redshift indicator. We show that measuring this airmass effect and including it as a prior has the potential to improve the photometric redshifts and can ameliorate extreme outliers, but that it will only be adequately measured for the brightest galaxies, which limits its overall impact on LSST photometric redshifts. We furthermore demonstrate how this airmass effect can induce a bias in the photo-z results, and caution against survey strategies that prioritize high-airmass observations for the purpose of improving this prior. Ultimately, we intend for this work to serve as a guide for the expectations and preparations of the LSST science community with regard to the minimum quality of photo-z as the survey progresses.
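The sketch below illustrates the general idea of a nearest-neighbours colour-matching photo-z estimator: a target galaxy is assigned a redshift from training galaxies with similar colours. It uses a generic scikit-learn k-NN regressor and is not the estimator of this paper; the band ordering and neighbour count are assumptions.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

def train_colour_photoz(train_mags, train_z, n_neighbors=10):
    """Fit a k-NN colour-matching photo-z estimator.

    train_mags : (N, 6) ugrizy magnitudes of galaxies with known redshifts
    train_z    : (N,) redshifts of those galaxies
    """
    colours = np.diff(train_mags, axis=1)  # adjacent-band colours (5 per galaxy)
    knn = KNeighborsRegressor(n_neighbors=n_neighbors, weights="distance")
    knn.fit(colours, train_z)
    return knn

def estimate_photoz(knn, target_mags):
    """Photo-z for one or more target galaxies given their ugrizy magnitudes."""
    return knn.predict(np.diff(np.atleast_2d(target_mags), axis=1))
```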
The Search for Transients and Variables in the LSST Pathfinder Survey
NASA Astrophysics Data System (ADS)
Gorsuch, Mary Katherine; Kotulla, Ralf
2018-01-01
This research was completed during participation in the NSF-REU program at the University of Wisconsin-Madison. Two fields of a few square degrees, close to the galactic plane, were imaged on the WIYN 3.5 meter telescope during the commissioning of the One Degree Imager (ODI) focal plane. These images were taken with repeated, shorter exposures in order to model an LSST-like cadence. These data were taken in order to identify transient and variable light sources. This was done by using Source Extractor to generate a catalog of all sources in each exposure, and inserting this data into a larger photometry database composed of all exposures for each field. A Python code was developed to analyze the data and isolate sources of interest from the large data set. We found that there were some discrepancies in the data, which led to some interesting results that we are looking into further. Variable and transient sources, while relatively well understood, are not numerous in current cataloging systems. Cataloging them will be a major undertaking of the Large Synoptic Survey Telescope (LSST), to which this project is a precursor. Locating these sources may give us a better understanding of where they are located and how they impact their surroundings.
The LSSTC Data Science Fellowship Program
NASA Astrophysics Data System (ADS)
Miller, Adam; Walkowicz, Lucianne; LSSTC DSFP Leadership Council
2017-01-01
The Large Synoptic Survey Telescope Corporation (LSSTC) Data Science Fellowship Program (DSFP) is a unique professional development program for astronomy graduate students. DSFP students complete a series of six, one-week long training sessions over the course of two years. The sessions are cumulative, each building on the last, to allow an in-depth exploration of the topics covered: data science basics, statistics, image processing, machine learning, scalable software, data visualization, time-series analysis, and science communication. The first session was held in Aug 2016 at Northwestern University, with all materials and lectures publicly available via github and YouTube. Each session focuses on a series of technical problems which are written in iPython notebooks. The initial class of fellows includes 16 students selected from across the globe, while an additional 14 fellows will be added to the program in year 2. Future sessions of the DSFP will be hosted by a rotating cast of LSSTC member institutions. The DSFP is designed to supplement graduate education in astronomy by teaching the essential skills necessary for dealing with big data, serving as a resource for all in the LSST era. The LSSTC DSFP is made possible by the generous support of the LSST Corporation, the Data Science Initiative (DSI) at Northwestern, and CIERA.
The LSST Camera 500 W, -130 °C Mixed Refrigerant Cooling System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowden, Gordon B.; Langton, Brian J.
2014-05-28
The LSST Camera has a higher cryogenic heat load than previous CCD telescope cameras due to its large size (634 mm diameter focal plane, 3.2 Giga pixels) and its close coupled front-end electronics operating at low temperature inside the cryostat. Various refrigeration technologies are considered for this telescope/camera environment. MMR-Technology's Mixed Refrigerant technology was chosen. A collaboration with that company was started in 2009. The system, based on a cluster of Joule-Thomson refrigerators running a special blend of mixed refrigerants, is described. Both the advantages and problems of applying this technology to telescope camera refrigeration are discussed. Test results from a prototype refrigerator running in a realistic telescope configuration are reported. Current and future stages of the development program are described.
LSST Resources for the Community
NASA Astrophysics Data System (ADS)
Jones, R. Lynne
2011-01-01
LSST will generate 100 petabytes of images and 20 petabytes of catalogs, covering 18,000-20,000 square degrees sampled every few days over a total of ten years -- all publicly available and exquisitely calibrated. The primary access to these data will be through Data Access Centers (DACs). DACs will provide access to catalogs of sources (single detections from individual images) and objects (associations of sources from multiple images). Simple user interfaces or direct SQL queries at the DAC can return user-specified portions of data from catalogs or images. More complex manipulations of the data, such as calculating multi-point correlation functions or creating alternative photo-z measurements on terabyte-scale data, can be completed with the DAC's own resources. Even more data-intensive computations requiring access to large numbers of image pixels on the petabyte scale could also be conducted at the DAC, using compute resources allocated in a manner similar to a TAC. DAC resources will be available to all individuals in member countries or institutes and to LSST science collaborations. DACs will also assist investigators with requests for allocations at national facilities such as the Petascale Computing Facility, TeraGrid, and Open Science Grid. Using data on this scale requires new approaches to accessibility and analysis, which are being developed through interactions with the LSST Science Collaborations. We are producing simulated images (as might be acquired by LSST) based on models of the universe and generating catalogs from these images (as well as from the base model) using the LSST data management framework in a series of data challenges. The resulting images and catalogs are being made available to the science collaborations to verify the algorithms and develop user interfaces. All LSST software is open source and available online, including preliminary catalog formats. We encourage feedback from the community.
Measuring the Growth Rate of Structure with Type IA Supernovae from LSST
NASA Astrophysics Data System (ADS)
Howlett, Cullan; Robotham, Aaron S. G.; Lagos, Claudia D. P.; Kim, Alex G.
2017-10-01
We investigate the peculiar motions of galaxies up to z = 0.5 using Type Ia supernovae (SNe Ia) from the Large Synoptic Survey Telescope (LSST) and predict the subsequent constraints on the growth rate of structure. We consider two cases. Our first is based on measurements of the volumetric SNe Ia rate and assumes we can obtain spectroscopic redshifts and light curves for varying fractions of objects that are detected pre-peak luminosity by LSST (some of which may be obtained by LSST itself, and others that would require additional follow-up observations). We find that these measurements could produce growth rate constraints at z < 0.5 that significantly outperform those found using Redshift Space Distortions (RSD) with DESI or 4MOST, even though there are ˜ 4× fewer objects. For our second case, we use semi-analytic simulations and a prescription for the SNe Ia rate as a function of stellar mass and star-formation rate to predict the number of LSST SNe Ia whose host redshifts may already have been obtained with the Taipan+WALLABY surveys or with a future multi-object spectroscopic survey. We find ˜18,000 and ˜160,000 SNe Ia with host redshifts for these cases, respectively. While this is only a fraction of the total LSST-detected SNe Ia, they could be used to significantly augment and improve the growth rate constraints compared to only RSD. Ultimately, we find that combining LSST SNe Ia with large numbers of galaxy redshifts will provide the most powerful probe of large-scale gravity in the z < 0.5 regime over the coming decades.
From Science To Design: Systems Engineering For The Lsst
NASA Astrophysics Data System (ADS)
Claver, Chuck F.; Axelrod, T.; Fouts, K.; Kantor, J.; Nordby, M.; Sebag, J.; LSST Collaboration
2009-01-01
The LSST is a universal-purpose survey telescope that will address scores of scientific missions. To help the technical teams converge on a specific engineering design, the LSST Science Requirements Document (SRD) selects four stressing principal science missions: 1) Constraining Dark Matter and Dark Energy; 2) Taking an Inventory of the Solar System; 3) Exploring the Transient Optical Sky; and 4) Mapping the Milky Way. From these 4 missions the SRD specifies the requirements for single images and the full 10-year survey that enable a wide range of science beyond the 4 principal missions. Through optical design and analysis, operations simulation, and throughput modeling, the systems engineering effort in the LSST has largely focused on taking the SRD specifications and deriving system functional requirements that define the system design. A Model Based Systems Engineering approach with SysML is used to manage the flow-down of requirements from science to system function to sub-system. The rigor of requirements flow and management assists the LSST in keeping the overall scope, hence budget and schedule, under control.
Strong Gravitational Lensing with LSST
NASA Astrophysics Data System (ADS)
Marshall, Philip J.; Bradac, M.; Chartas, G.; Dobler, G.; Eliasdottir, A.; Falco, E.; Fassnacht, C. D.; Jee, M. J.; Keeton, C. R.; Oguri, M.; Tyson, J. A.; LSST Strong Lensing Science Collaboration
2010-01-01
LSST will find more strong gravitational lensing events than any other survey preceding it, and will monitor them all at a cadence of a few days to a few weeks. We can expect the biggest advances in strong lensing science made with LSST to be in those areas that benefit most from the large volume, and the high accuracy multi-filter time series: studies of, and using, several thousand lensed quasars and several hundred supernovae. However, the high quality imaging will allow us to detect and measure large numbers of background galaxies multiply-imaged by galaxies, groups and clusters. In this poster we give an overview of the strong lensing science enabled by LSST, and highlight the particular associated technical challenges that will have to be faced when working with the survey.
Expanding the user base beyond HEP for the Ganga distributed analysis user interface
NASA Astrophysics Data System (ADS)
Currie, R.; Egede, U.; Richards, A.; Slater, M.; Williams, M.
2017-10-01
This document presents the result of recent developments within the Ganga[1] project to support users from new communities outside of HEP. In particular I will examine the case of users from the Large Synoptic Survey Telescope (LSST) group looking to use resources provided by the UK-based GridPP[2][3] DIRAC[4][5] instance. An example use case is work performed with users from the LSST Virtual Organisation (VO) to distribute the workflow used for galaxy shape identification analyses. This work highlighted some LSST-specific challenges which could be well solved by common tools within the HEP community. As a result of this work the LSST community was able to take advantage of GridPP[2][3] resources to perform large computing tasks within the UK.
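For orientation, the sketch below shows what a split, DIRAC-backed submission might look like from inside a Ganga session, assuming the GangaDirac plugin and a GridPP DIRAC configuration are available. The script name, arguments, and splitter values are hypothetical, and the exact options would follow the Ganga documentation rather than this sketch.

```python
# Run inside a Ganga session (e.g. `ganga submit_shapes.py`), where Job,
# Executable, File, Dirac and GenericSplitter are provided by Ganga itself.
j = Job(name="lsst-galaxy-shapes")
j.application = Executable(exe=File("run_shapes.sh"), args=["tract_0000"])
j.backend = Dirac()  # route the job to the DIRAC-managed grid resources
# Fan the workflow out over many sky tracts (hypothetical argument values).
j.splitter = GenericSplitter(attribute="application.args",
                             values=[["tract_%04d" % i] for i in range(100)])
j.submit()
```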
Big Data Science Cafés: High School Students Experiencing Real Research with Scientists
NASA Astrophysics Data System (ADS)
Walker, C. E.; Pompea, S. M.
2017-12-01
The Education and Public Outreach group at the National Optical Astronomy Observatory has designed an outside-of-school education program to excite the interest of talented youth in future projects like the Large Synoptic Survey Telescope (LSST) and the NOAO (archival) Data Lab - their data approaches and key science projects. Originally funded by the LSST Corporation, the program cultivates talented youth to enter STEM disciplines and serves as a model to disseminate to the 40+ institutions involved in LSST. One Saturday a month during the academic year, high school students have the opportunity to interact with expert astronomers who work with large astronomical data sets in their scientific work. Students learn about killer asteroids, the birth and death of stars, colliding galaxies, the structure of the universe, gravitational waves, dark energy, dark matter, and more. The format for the Saturday science cafés has been a short presentation, discussion (plus food), computer lab activity and more discussion. They last about 2.5 hours and have been planned by a group of interested local high school students, an undergraduate student coordinator, the presenting astronomers, the program director and an evaluator. High school youth leaders help ensure an enjoyable and successful program for fellow students. They help their fellow students with the activities and help evaluate how well the science café went. Their remarks shape the next science café and improve the program. The experience offers youth leaders ownership of the program, opportunities to take on responsibilities and learn leadership and communication skills, as well as foster their continued interests in STEM. The prototype Big Data Science Academy was implemented successfully in the Spring 2017 and engaged almost 40 teens from greater Tucson in the fundamentals of astronomy concepts and research. As with any first implementation there were bumps. However, staff, scientists, and student leaders all stepped up to make the program a success. The project achieved many of its goals with a relatively small budget, providing value not only to the student leaders and student attendees, but to the scientists and staff as well. Staff learned what worked and what needed more fine-tuning to successfully launch and run a big data academy for teens in the years that follow.
LSST Astroinformatics And Astrostatistics: Data-oriented Astronomical Research
NASA Astrophysics Data System (ADS)
Borne, Kirk D.; Stassun, K.; Brunner, R. J.; Djorgovski, S. G.; Graham, M.; Hakkila, J.; Mahabal, A.; Paegert, M.; Pesenson, M.; Ptak, A.; Scargle, J.; LSST Informatics and Statistics Team
2011-01-01
The LSST Informatics and Statistics Science Collaboration (ISSC) focuses on research and scientific discovery challenges posed by the very large and complex data collection that LSST will generate. Application areas include astroinformatics, machine learning, data mining, astrostatistics, visualization, scientific data semantics, time series analysis, and advanced signal processing. Research problems to be addressed with these methodologies include transient event characterization and classification, rare class discovery, correlation mining, outlier/anomaly/surprise detection, improved estimators (e.g., for photometric redshift or early onset supernova classification), exploration of highly dimensional (multivariate) data catalogs, and more. We present sample science results from these data-oriented approaches to large-data astronomical research. We present results from LSST ISSC team members, including the EB (Eclipsing Binary) Factory, the environmental variations in the fundamental plane of elliptical galaxies, and outlier detection in multivariate catalogs.
Evaluation of Potential LSST Spatial Indexing Strategies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nikolaev, S; Abdulla, G; Matzke, R
2006-10-13
The LSST requirement for producing alerts in near real-time, and the fact that generating an alert depends on knowing the history of light variations for a given sky position, both imply that the clustering information for all detections is available at any time during the survey. Therefore, any data structure describing clustering of detections in LSST needs to be continuously updated, even as new detections are arriving from the pipeline. We call this use case "incremental clustering", to reflect this continuous updating of clustering information. This document describes the evaluation results for several potential LSST incremental clustering strategies, using: (1) a Neighbors table and zone optimization to store spatial clusters (a.k.a. Jim Gray's, or SDSS, algorithm); (2) the MySQL built-in R-tree implementation; (3) an external spatial index library which supports a query interface.
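To make the zone idea concrete, the sketch below shows the core of a Gray-style zone optimization: detections are bucketed into declination zones so that a neighbour search only scans a narrow declination slice instead of the whole catalogue. It is a generic illustration with an assumed zone height, not the LSST or SDSS implementation.

```python
import numpy as np

ZONE_HEIGHT_DEG = 0.05  # zone height in declination; an illustrative choice

def zone_id(dec_deg):
    """Assign a declination zone index to each detection."""
    return np.floor((np.asarray(dec_deg) + 90.0) / ZONE_HEIGHT_DEG).astype(int)

def candidate_neighbors(ra, dec, cat_ra, cat_dec, cat_zone, radius_deg):
    """Indices of catalogue detections within radius_deg of (ra, dec).

    Only the zones that can contain matches are scanned, which is the point
    of the zone optimization: the angular-distance test runs on a thin
    declination slice rather than on the whole detection table.
    """
    zmin, zmax = zone_id(dec - radius_deg), zone_id(dec + radius_deg)
    idx = np.where((cat_zone >= zmin) & (cat_zone <= zmax))[0]
    # Small-angle (flat-sky) separation, adequate for arcsecond-scale radii.
    dra = (cat_ra[idx] - ra) * np.cos(np.radians(dec))
    ddec = cat_dec[idx] - dec
    return idx[np.hypot(dra, ddec) <= radius_deg]

# Example: precompute zones once, then match each new detection incrementally.
cat_ra, cat_dec = np.array([10.0, 10.001, 50.0]), np.array([-30.0, -30.0004, 10.0])
cat_zone = zone_id(cat_dec)
print(candidate_neighbors(10.0005, -30.0002, cat_ra, cat_dec, cat_zone, 2.0 / 3600.0))
```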
Photometric classification and redshift estimation of LSST Supernovae
NASA Astrophysics Data System (ADS)
Dai, Mi; Kuhlmann, Steve; Wang, Yun; Kovacs, Eve
2018-07-01
Supernova (SN) classification and redshift estimation using photometric data only have become very important for the Large Synoptic Survey Telescope (LSST), given the large number of SNe that LSST will observe and the impossibility of spectroscopically following up all the SNe. We investigate the performance of an SN classifier that uses SN colours to classify LSST SNe with the Random Forest classification algorithm. Our classifier results in an area-under-the-curve of 0.98 which represents excellent classification. We are able to obtain a photometric SN sample containing 99 per cent SNe Ia by choosing a probability threshold. We estimate the photometric redshifts (photo-z) of SNe in our sample by fitting the SN light curves using the SALT2 model with nested sampling. We obtain a mean bias (⟨z_phot - z_spec⟩) of 0.012 with σ((z_phot - z_spec)/(1 + z_spec)) = 0.0294 without using a host-galaxy photo-z prior, and a mean bias (⟨z_phot - z_spec⟩) of 0.0017 with σ((z_phot - z_spec)/(1 + z_spec)) = 0.0116 using a host-galaxy photo-z prior. Assuming a flat ΛCDM model with Ωm = 0.3, we obtain Ωm of 0.305 ± 0.008 (statistical errors only), using the simulated LSST sample of photometric SNe Ia (with intrinsic scatter σint = 0.11) derived using our methodology without using a host-galaxy photo-z prior. Our method will help boost the power of SNe from the LSST as cosmological probes.
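The snippet below sketches the general pattern of colour-based Random Forest classification with a purity threshold, using scikit-learn. The feature set, hyper-parameters, and threshold are illustrative assumptions and are not taken from this paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

def classify_sne(colour_features, is_ia, ia_probability_threshold=0.9):
    """Train a Random Forest on SN colour features and select a pure Ia sample.

    colour_features : (N, n_colours) per-SN colours measured from light curves
    is_ia           : (N,) boolean labels from a simulation (True = Type Ia)
    """
    X_train, X_test, y_train, y_test = train_test_split(
        colour_features, is_ia, test_size=0.3, random_state=0)
    clf = RandomForestClassifier(n_estimators=500, random_state=0)
    clf.fit(X_train, y_train)
    p_ia = clf.predict_proba(X_test)[:, 1]            # probability of being a Ia
    auc = roc_auc_score(y_test, p_ia)                 # area under the ROC curve
    photometric_ia = p_ia > ia_probability_threshold  # purity vs completeness
    return auc, photometric_ia
```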
NASA Astrophysics Data System (ADS)
Claver, C. F.; Selvy, Brian M.; Angeli, George; Delgado, Francisco; Dubois-Felsmann, Gregory; Hascall, Patrick; Lotz, Paul; Marshall, Stuart; Schumacher, German; Sebag, Jacques
2014-08-01
The Large Synoptic Survey Telescope project was an early adopter of SysML and Model Based Systems Engineering practices. The LSST project began using MBSE for requirements engineering in 2006, shortly after the initial release of the first SysML standard. Out of this early work the LSST's MBSE effort has grown to include system requirements, operational use cases, physical system definition, interfaces, and system states along with behavior sequences and activities. In this paper we describe our approach and methodology for cross-linking these system elements over the three classical systems engineering domains - requirements, functional and physical - into the LSST System Architecture model. We also show how this model is used as the central element of the overall project systems engineering effort. More recently we have begun to use the cross-linked modeled system architecture to develop and plan the system verification and test process. In presenting this work we also describe "lessons learned" from several missteps the project has had with MBSE. We conclude by summarizing the overall status of the LSST's System Architecture model and our plans for the future as the LSST heads toward construction.
The variable sky of deep synoptic surveys
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ridgway, Stephen T.; Matheson, Thomas; Mighell, Kenneth J.
2014-11-20
The discovery of variable and transient sources is an essential product of synoptic surveys. The alert stream will require filtering for personalized criteria—a process managed by a functionality commonly described as a Broker. In order to understand quantitatively the magnitude of the alert generation and Broker tasks, we have undertaken an analysis of the most numerous types of variable targets in the sky—Galactic stars, quasi-stellar objects (QSOs), active galactic nuclei (AGNs), and asteroids. It is found that the Large Synoptic Survey Telescope (LSST) will be capable of discovering ∼10^5 high-latitude (|b| > 20°) variable stars per night at the beginning of the survey. (The corresponding number for |b| < 20° is orders of magnitude larger, but subject to caveats concerning extinction and crowding.) However, the number of new discoveries may well drop below 100 per night within less than one year. The same analysis applied to GAIA clarifies the complementarity of the GAIA and LSST surveys. Discoveries of AGNs and QSOs are each predicted to begin at ∼3000 per night and decrease by 50 times over four years. Supernovae are expected at ∼1100 per night, and after several survey years will dominate the new variable discovery rate. LSST asteroid discoveries will start at >10^5 per night, and if orbital determination has a 50% success rate per epoch, they will drop below 1000 per night within two years.
Commentary: Learning About the Sky Through Simulations. Chapter 34
NASA Technical Reports Server (NTRS)
Way, Michael J.
2012-01-01
The Large Synoptic Survey Telescope (LSST) simulator being built by Andy Connolly and collaborators is an impressive undertaking and should make working with LSST in the beginning stages far easier than it initially was with the Sloan Digital Sky Survey (SDSS). However, I would like to focus on an equally important problem that has not yet been discussed here, but that the community will need to address in the coming years: can we deal with the flood of data from LSST, and will we need to rethink the way we work?
LSST Painting Risk Evaluation Memo
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wolfe, Justin E.
The optics subsystem is required to paint the edges of optics black where possible. Due to the risks involved in applying the paint, LSST requests a review of the impact of removing this requirement for the filters and L3.
Predicting Constraints on Ultra-Light Axion Parameters due to LSST Observations
NASA Astrophysics Data System (ADS)
Given, Gabriel; Grin, Daniel
2018-01-01
Ultra-light axions (ULAs) are a class of dark matter or dark energy candidate (depending on the mass), predicted to have masses between $10^{-33}$ and $10^{-18}$ eV. The Large Synoptic Survey Telescope (LSST) is expected to provide a large number of weak lensing observations, which will lower the statistical uncertainty on the convergence power spectrum. I began work with Daniel Grin to predict how accurately the data from the LSST will be able to constrain ULA properties. I wrote Python code that takes a matter power spectrum calculated by axionCAMB and converts it to a convergence power spectrum. My code then takes derivatives of the convergence power spectrum with respect to several cosmological parameters; these derivatives will be used in Fisher matrix analysis to determine the sensitivity of LSST observations to axion parameters.
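The final step described above reduces to assembling a Fisher matrix from the numerical derivatives of the convergence power spectrum. The sketch below shows that assembly for a Gaussian likelihood; it assumes the derivatives and the C_ell covariance have already been computed (e.g. from axionCAMB outputs) and is not the author's actual code.

```python
import numpy as np

def fisher_matrix(dCl_dtheta, cov_Cl):
    """Fisher matrix from convergence power spectrum derivatives.

    dCl_dtheta : (n_params, n_ell) array of dC_ell / dtheta_i
    cov_Cl     : (n_ell, n_ell) covariance matrix of the measured C_ell

    Gaussian-likelihood form: F_ij = (dC/dtheta_i)^T Cov^{-1} (dC/dtheta_j).
    """
    inv_cov = np.linalg.inv(cov_Cl)
    return dCl_dtheta @ inv_cov @ dCl_dtheta.T

def marginalised_errors(fisher):
    """Forecast 1-sigma marginalised uncertainties on each parameter."""
    return np.sqrt(np.diag(np.linalg.inv(fisher)))
```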
Manuel, Anastacia M.; Phillion, Donald W.; Olivier, Scot S.; Baker, Kevin L.; Cannon, Brice
2010-01-18
The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror, modified Paul-Baker design, with an 8.4-meter primary mirror, a 3.4-m secondary, and a 5.0-m tertiary, along with three refractive corrector lenses to produce a flat focal plane with a field of view of 9.6 square degrees. In order to maintain image quality during operation, the deformations and rigid body motions of the three large mirrors must be actively controlled to minimize optical aberrations, which arise primarily from forces due to gravity and thermal expansion. We describe the methodology for measuring the telescope aberrations using a set of curvature wavefront sensors located in the four corners of the LSST camera focal plane. We present a comprehensive analysis of the wavefront sensing system, including the availability of reference stars, demonstrating that this system will perform to the specifications required to meet the LSST performance goals.
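The control step implied here, turning measured wavefront aberrations into mirror corrections, is in essence a damped linear least-squares solve. The sketch below shows that generic reconstructor; the sensitivity matrix and gain are assumptions and this is not the LSST algorithm itself.

```python
import numpy as np

def wavefront_correction(sensitivity_matrix, measured_zernikes, gain=0.5):
    """One step of a generic linear active-optics control loop.

    sensitivity_matrix : (n_zernike, n_dof) change in measured wavefront Zernike
                         coefficients per unit move of each controlled degree of
                         freedom (bending modes and rigid-body motions)
    measured_zernikes  : (n_zernike,) aberrations estimated from the corner
                         curvature wavefront sensors
    gain               : loop gain < 1, kept small for stability
    """
    # Least-squares solve for the moves that cancel the measured aberrations.
    correction = -np.linalg.pinv(sensitivity_matrix) @ measured_zernikes
    return gain * correction
```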
NASA Astrophysics Data System (ADS)
O'Mullane, William; LSST Data Management Team
2018-01-01
The Large Synoptic Survey Telescope (LSST) is an 8-m optical ground-based telescope being constructed on Cerro Pachon in Chile. LSST will survey half the sky every few nights in six optical bands. The data will be transferred to the data center in North America, and within 60 seconds it will be reduced using difference imaging and an alert list generated for the community. Additionally, annual data releases will be constructed from all the data during the 10-year mission, producing catalogs and deep co-added images with unprecedented time resolution for such a large region of sky. In this paper we present the current status of the LSST stack, including the data processing components, the Qserv database and data visualization software, describe how to obtain it, and provide a summary of the development road map. We also discuss the move to Python 3 and the timeline for dropping Python 2.
NASA Astrophysics Data System (ADS)
Barr, Jeffrey D.; Gressler, William; Sebag, Jacques; Seriche, Jaime; Serrano, Eduardo
2016-07-01
The civil work, site infrastructure and buildings for the summit facility of the Large Synoptic Survey Telescope (LSST) are among the first major elements that need to be designed, bid and constructed to support the subsequent integration of the dome, telescope, optics, camera and supporting systems. As the contracts for those other major subsystems now move forward under the management of the LSST Telescope and Site (T and S) team, there has been inevitable and beneficial evolution in their designs, which has resulted in significant modifications to the facility and infrastructure. The earliest design requirements for the LSST summit facility were first documented in 2005, its contracted full design was initiated in 2010, and construction began in January, 2015. During that entire development period, and extending now roughly halfway through construction, there continue to be necessary modifications to the facility design resulting from the refinement of interfaces to other major elements of the LSST project and now, during construction, due to unanticipated field conditions. Changes from evolving interfaces have principally involved the telescope mount, the dome and mirror handling/coating facilities which have included significant variations in mass, dimensions, heat loads and anchorage conditions. Modifications related to field conditions have included specifying and testing alternative methods of excavation and contending with the lack of competent rock substrate where it was predicted to be. While these and other necessary changes are somewhat specific to the LSST project and site, they also exemplify inherent challenges related to the typical timeline for the design and construction of astronomical observatory support facilities relative to the overall development of the project.
Photometric classification and redshift estimation of LSST Supernovae
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dai, Mi; Kuhlmann, Steve; Wang, Yun
Supernova (SN) classification and redshift estimation using photometric data only have become very important for the Large Synoptic Survey Telescope (LSST), given the large number of SNe that LSST will observe and the impossibility of spectroscopically following up all the SNe. We investigate the performance of an SN classifier that uses SN colours to classify LSST SNe with the Random Forest classification algorithm. Our classifier results in an area-under-the-curve of 0.98 which represents excellent classification. We are able to obtain a photometric SN sample containing 99 percent SNe Ia by choosing a probability threshold. We estimate the photometric redshifts (photo-z) of SNe in our sample by fitting the SN light curves using the SALT2 model with nested sampling. We obtain a mean bias (⟨z_phot - z_spec⟩) of 0.012 with σ((z_phot - z_spec)/(1 + z_spec)) = 0.0294 without using a host-galaxy photo-z prior, and a mean bias (⟨z_phot - z_spec⟩) of 0.0017 with σ((z_phot - z_spec)/(1 + z_spec)) = 0.0116 using a host-galaxy photo-z prior. Assuming a flat ΛCDM model with Ωm = 0.3, we obtain Ωm of 0.305 ± 0.008 (statistical errors only), using the simulated LSST sample of photometric SNe Ia (with intrinsic scatter σint = 0.11) derived using our methodology without using a host-galaxy photo-z prior. Our method will help boost the power of SNe from the LSST as cosmological probes.
The LSST Data Mining Research Agenda
NASA Astrophysics Data System (ADS)
Borne, K.; Becla, J.; Davidson, I.; Szalay, A.; Tyson, J. A.
2008-12-01
We describe features of the LSST science database that are amenable to scientific data mining, object classification, outlier identification, anomaly detection, image quality assurance, and survey science validation. The data mining research agenda includes: scalability (at petabyte scales) of existing machine learning and data mining algorithms; development of grid-enabled parallel data mining algorithms; designing a robust system for brokering classifications from the LSST event pipeline (which may produce 10,000 or more event alerts per night); multi-resolution methods for exploration of petascale databases; indexing of multi-attribute multi-dimensional astronomical databases (beyond spatial indexing) for rapid querying of petabyte databases; and more.
On the Detectability of Planet X with LSST
NASA Astrophysics Data System (ADS)
Trilling, David E.; Bellm, Eric C.; Malhotra, Renu
2018-06-01
Two planetary mass objects in the far outer solar system—collectively referred to here as Planet X—have recently been hypothesized to explain the orbital distribution of distant Kuiper Belt Objects. Neither planet is thought to be exceptionally faint, but the sky locations of these putative planets are poorly constrained. Therefore, a wide area survey is needed to detect these possible planets. The Large Synoptic Survey Telescope (LSST) will carry out an unbiased, large area (around 18,000 square degrees), deep (limiting magnitude of individual frames of 24.5) survey (the “wide-fast-deep (WFD)” survey) of the southern sky beginning in 2022, and it will therefore be an important tool in searching for these hypothesized planets. Here, we explore the effectiveness of LSST as a search platform for these possible planets. Assuming the current baseline cadence (which includes the WFD survey plus additional coverage), we estimate that LSST will confidently detect or rule out the existence of Planet X in 61% of the entire sky. At orbital distances up to ∼75 au, Planet X could simply be found in the normal nightly moving object processing; at larger distances, it will require custom data processing. We also discuss the implications of a nondetection of Planet X in LSST data.
Machine Learning-based Transient Brokers for Real-time Classification of the LSST Alert Stream
NASA Astrophysics Data System (ADS)
Narayan, Gautham; Zaidi, Tayeb; Soraisam, Monika; ANTARES Collaboration
2018-01-01
The number of transient events discovered by wide-field time-domain surveys already far outstrips the combined followup resources of the astronomical community. This number will only increase as we progress towards the commissioning of the Large Synoptic Survey Telescope (LSST), breaking the community's current followup paradigm. Transient brokers - software to sift through, characterize, annotate and prioritize events for followup - will be a critical tool for managing alert streams in the LSST era. Developing the algorithms that underlie the brokers, and obtaining simulated LSST-like datasets prior to LSST commissioning, to train and test these algorithms are formidable, though not insurmountable challenges. The Arizona-NOAO Temporal Analysis and Response to Events System (ANTARES) is a joint project of the National Optical Astronomy Observatory and the Department of Computer Science at the University of Arizona. We have been developing completely automated methods to characterize and classify variable and transient events from their multiband optical photometry. We describe the hierarchical ensemble machine learning algorithm we are developing, and test its performance on sparse, unevenly sampled, heteroskedastic data from various existing observational campaigns, as well as our progress towards incorporating these into a real-time event broker working on live alert streams from time-domain surveys.
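As a minimal illustration of what "hierarchical" means in this context, the sketch below chains a top-level classifier for broad classes with per-branch sub-classifiers for finer types. It is a generic two-stage pattern built on scikit-learn, not the ANTARES algorithm, and the class structure is assumed.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

class TwoStageClassifier:
    """Top-level model separates broad classes (e.g. periodic variable vs
    transient); a dedicated model then refines the label within each branch."""

    def __init__(self):
        self.top = RandomForestClassifier(n_estimators=200, random_state=0)
        self.leaf = {}  # one sub-classifier per broad class

    def fit(self, X, broad_labels, fine_labels):
        self.top.fit(X, broad_labels)
        for cls in np.unique(broad_labels):
            sel = broad_labels == cls
            clf = RandomForestClassifier(n_estimators=200, random_state=0)
            clf.fit(X[sel], fine_labels[sel])
            self.leaf[cls] = clf

    def predict(self, X):
        broad = self.top.predict(X)
        fine = np.empty(len(X), dtype=object)
        for cls, clf in self.leaf.items():
            sel = broad == cls
            if sel.any():
                fine[sel] = clf.predict(X[sel])
        return broad, fine
```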
Giga-z: A 100,000 Object Superconducting Spectrophotometer for LSST Follow-up
NASA Astrophysics Data System (ADS)
Marsden, Danica W.; Mazin, Benjamin A.; O'Brien, Kieran; Hirata, Chris
2013-09-01
We simulate the performance of a new type of instrument, a Superconducting Multi-Object Spectrograph (SuperMOS), that uses microwave kinetic inductance detectors (MKIDs). MKIDs, a new detector technology, feature good quantum efficiency in the UVOIR, can count individual photons with microsecond timing accuracy, and, like X-ray calorimeters, determine their energy to several percent. The performance of Giga-z, a SuperMOS designed for wide field imaging follow-up observations, is evaluated using simulated observations of the COSMOS mock catalog with an array of 100,000 MKID pixels with R_{423 nm} = E/ΔE = 30. We compare our results against a simultaneous simulation of LSST observations. In 3 yr on a dedicated 4 m class telescope, Giga-z could observe ≈2 billion galaxies, yielding a low-resolution spectral energy distribution spanning 350-1350 nm for each; 1000 times the number measured with any currently proposed LSST spectroscopic follow-up, at a fraction of the cost and time. Giga-z would provide redshifts for galaxies up to z ≈ 6 with magnitudes m_i ≲ 25, with accuracy σ_Δz/(1 + z) ≈ 0.03 for the whole sample, and σ_Δz/(1 + z) ≈ 0.007 for a select subset. We also find catastrophic failure rates and biases that are consistently lower than for LSST. The added constraint on dark energy parameters for WL + CMB by Giga-z using the FoMSWG default model is equivalent to multiplying the LSST Fisher matrix by a factor of α = 1.27 (w_p), 1.53 (w_a), or 1.98 (Δγ). This is equivalent to multiplying both the LSST coverage area and the training sets by α and reducing all systematics by a factor of 1/√α, advantages that are robust to even more extreme models of intrinsic alignment.
The Large Synoptic Survey Telescope: Projected Near-Earth Object Discovery Performance
NASA Technical Reports Server (NTRS)
Chesley, Steven R.; Veres, Peter
2016-01-01
The Large Synoptic Survey Telescope (LSST) is a large-aperture, wide-field survey that has the potential to detect millions of asteroids. LSST is under construction with survey operations slated to begin in 2022. We describe an independent study to assess the performance of LSST for detecting and cataloging near-Earth objects (NEOs). A significant component of the study will be to assess the survey's ability to link observations of a single object from among the large numbers of false detections and detections of other objects. We will also explore the survey's basic performance in terms of the fraction of NEOs discovered and cataloged, both for the planned baseline survey and for enhanced surveys that are more carefully tuned for NEO search, generally at the expense of other science drivers. Preliminary results indicate that, with successful linkage under the current baseline survey, LSST would discover approximately 65% of NEOs with absolute magnitude H < 22, which corresponds approximately to a diameter of 140 m.
An optical to IR sky brightness model for the LSST
NASA Astrophysics Data System (ADS)
Yoachim, Peter; Coughlin, Michael; Angeli, George Z.; Claver, Charles F.; Connolly, Andrew J.; Cook, Kem; Daniel, Scott; Ivezić, Željko; Jones, R. Lynne; Petry, Catherine; Reuter, Michael; Stubbs, Christopher; Xin, Bo
2016-07-01
To optimize the observing strategy of a large survey such as the LSST, one needs an accurate model of the night sky emission spectrum across a range of atmospheric conditions and from the near-UV to the near-IR. We have used the ESO SkyCalc Sky Model Calculator to construct a library of template spectra for the Chilean night sky. The ESO model includes emission from the upper and lower atmosphere, scattered starlight, scattered moonlight, and zodiacal light. We have then extended the ESO templates with an empirical fit to the twilight sky emission as measured by a Canon all-sky camera installed at the LSST site. With the ESO templates and our twilight model we can quickly interpolate to any arbitrary sky position and date and return the full sky spectrum or surface brightness magnitudes in the LSST filter system. Comparing our model to all-sky observations, we find typical residual RMS values of +/-0.2-0.3 magnitudes per square arcsecond.
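The "interpolate templates and return band magnitudes" step can be pictured with the toy sketch below; the template grid, filter edges, and normalization are invented placeholders, not the LSST sky-brightness model code.

    # Toy sketch of interpolating a library of sky templates and integrating
    # through a filter. The airmass grid, template shapes, and r-band edges are
    # placeholders invented for illustration only.
    import numpy as np

    wave = np.linspace(350.0, 1100.0, 500)                 # nm
    airmass_grid = np.array([1.0, 1.5, 2.0])
    # Toy template library: surface-brightness spectra (arbitrary units) vs airmass.
    templates = np.array([(1.0 + 0.3 * (x - 1.0)) * np.exp(-((wave - 600.0) / 400.0) ** 2)
                          for x in airmass_grid])

    def sky_spectrum(airmass):
        """Linearly interpolate the template spectra to an arbitrary airmass."""
        return np.array([np.interp(airmass, airmass_grid, templates[:, i])
                         for i in range(wave.size)])

    def band_flux(spec, lam_lo, lam_hi):
        """Sum the spectrum through a top-hat stand-in for an LSST filter."""
        mask = (wave >= lam_lo) & (wave <= lam_hi)
        return np.sum(spec[mask]) * (wave[1] - wave[0])

    spec = sky_spectrum(1.3)
    print(f"relative r-band sky brightness at X=1.3: {band_flux(spec, 552.0, 691.0):.3f}")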
DOE Office of Scientific and Technical Information (OSTI.GOV)
Okura, Yuki; Petri, Andrea; May, Morgan
Weak gravitational lensing causes subtle changes in the apparent shapes of galaxies due to the bending of light by the gravity of foreground masses. By measuring the shapes of large numbers of galaxies (millions in recent surveys, up to tens of billions in future surveys) we can infer the parameters that determine cosmology. Imperfections in the detectors used to record images of the sky can introduce changes in the apparent shape of galaxies, which in turn can bias the inferred cosmological parameters. In this paper we consider the effect of two widely discussed sensor imperfections: tree-rings, due to impurity gradients which cause transverse electric fields in the Charge-Coupled Devices (CCDs), and pixel-size variation, due to periodic CCD fabrication errors. These imperfections can be observed when the detectors are subject to uniform illumination (flat field images). We develop methods to determine the spurious shear and convergence (due to the imperfections) from the flat-field images. We calculate how the spurious shear, when added to the lensing shear, will bias the determination of cosmological parameters. We apply our methods to candidate sensors of the Large Synoptic Survey Telescope (LSST) as a timely and important example, analyzing flat field images recorded with LSST prototype CCDs in the laboratory. In conclusion, we find that tree-rings and periodic pixel-size variation present in the LSST CCDs will introduce negligible bias to cosmological parameters determined from the lensing power spectrum, specifically w, Ω_m, and σ_8.
Okura, Yuki; Petri, Andrea; May, Morgan; ...
2016-06-27
Weak gravitational lensing causes subtle changes in the apparent shapes of galaxies due to the bending of light by the gravity of foreground masses. By measuring the shapes of large numbers of galaxies (millions in recent surveys, up to tens of billions in future surveys) we can infer the parameters that determine cosmology. Imperfections in the detectors used to record images of the sky can introduce changes in the apparent shape of galaxies, which in turn can bias the inferred cosmological parameters. In this paper we consider the effect of two widely discussed sensor imperfections: tree-rings, due to impurity gradients which cause transverse electric fields in the Charge-Coupled Devices (CCDs), and pixel-size variation, due to periodic CCD fabrication errors. These imperfections can be observed when the detectors are subject to uniform illumination (flat field images). We develop methods to determine the spurious shear and convergence (due to the imperfections) from the flat-field images. We calculate how the spurious shear, when added to the lensing shear, will bias the determination of cosmological parameters. We apply our methods to candidate sensors of the Large Synoptic Survey Telescope (LSST) as a timely and important example, analyzing flat field images recorded with LSST prototype CCDs in the laboratory. In conclusion, we find that tree-rings and periodic pixel-size variation present in the LSST CCDs will introduce negligible bias to cosmological parameters determined from the lensing power spectrum, specifically w, Ω_m, and σ_8.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Riot, V J; Olivier, S; Bauman, B
2012-05-24
The Large Synoptic Survey Telescope (LSST) uses a novel three-mirror telescope design feeding a camera system that includes a set of broad-band filters and three refractive corrector lenses to produce a flat field at the focal plane with a wide field of view. The optical design of the camera lenses and filters is integrated with the optical design of the telescope mirrors to optimize performance. We discuss the rationale for the LSST camera optics design, describe the methodology for fabricating, coating, mounting and testing the lenses and filters, and present the results of detailed analyses demonstrating that the camera optics will meet their performance goals.
Probing LSST's Ability to Detect Planets Around White Dwarfs
NASA Astrophysics Data System (ADS)
Cortes, Jorge; Kipping, David
2018-01-01
Over the last four years more than 2,000 planets outside our solar system have been discovered, motivating us to search for and characterize potentially habitable worlds. Most planets orbit Sun-like stars, but more exotic stars can also host planets. Debris disks and disintegrating planetary bodies have been detected around white dwarf stars, the inert, Earth-sized cores of once-thriving stars like our Sun. These detections are clues that planets may exist around white dwarfs. Due to the faintness of white dwarfs and the potential rarity of planets around them, a vast survey is required to have a chance at detecting these planetary systems. The Large Synoptic Survey Telescope (LSST), scheduled to commence operations in 2023, will image the entire southern sky every few nights for 10 years, providing our first real opportunity to detect planets around white dwarfs. We characterized LSST’s ability to detect planets around white dwarfs through simulations that incorporate realistic models for LSST’s observing strategy and the white dwarf distribution within the Milky Way galaxy. This was done through the use of LSST's Operations Simulator (OpSim) and Catalog Simulator (CatSim). Our preliminary results indicate that, if all white dwarfs were to possess a planet, LSST would yield a detection for every 100 observed white dwarfs. In the future, a larger set of ongoing simulations will help us quantify the number of planets LSST could potentially find.
OCTOCAM: A Workhorse Instrument for the Gemini Telescopes During the Era of LSST
NASA Astrophysics Data System (ADS)
Roming, Peter; van der Horst, Alexander; OCTOCAM Team
2018-01-01
The decade of the 2020s is planned to be an era of large surveys and giant telescopes. A trademark of this era will be the large number of interesting objects observed daily by high-cadence surveys, such as the LSST. Because of the sheer numbers, only a very small fraction of these interesting objects will be observed with extremely large telescopes. The follow-up workhorses during this era will be the 8-meter class telescopes and corresponding instruments that are prepared to pursue these interesting objects. One such workhorse instrument is OCTOCAM, a highly efficient instrument designed to probe the time domain window with simultaneous broad-wavelength coverage. OCTOCAM optimizes the use of Gemini for broadband imaging and spectroscopic single-target observations. The instrument is designed for high temporal resolution, broad spectral coverage, and moderate spectral resolution. OCTOCAM was selected as part of the Gemini instrumentation program in early 2017. Here we provide a description of the science cases to be addressed, the overall instrument design, and the current status.
NASA Astrophysics Data System (ADS)
Kohler, Susanna
2016-06-01
How can we hunt down all the near-Earth asteroids that are capable of posing a threat to us? A new study looks at whether the upcoming Large Synoptic Survey Telescope (LSST) is up to the job. Charting nearby threats: LSST is an 8.4-m wide-survey telescope currently being built in Chile. When it goes online in 2022, it will spend the next ten years surveying our sky, mapping tens of billions of stars and galaxies, searching for signatures of dark energy and dark matter, and hunting for transient optical events like novae and supernovae. But in its scanning, LSST will also be looking for asteroids that approach near Earth. [Figure: Cumulative number of near-Earth asteroids discovered over time, as of June 16, 2016. NASA/JPL/Chamberlin] Near-Earth objects (NEOs) have the potential to be hazardous if they cross Earth's path and are large enough to do significant damage when they impact Earth. Earth's history is riddled with dangerous asteroid encounters, including the recent Chelyabinsk airburst in 2013, the encounter that caused the kilometer-sized Meteor Crater in Arizona, and the impact thought to have contributed to the extinction of the dinosaurs. Recognizing the potential danger that NEOs can pose to Earth, Congress has tasked NASA with tracking down 90% of NEOs larger than 140 meters in diameter. With our current survey capabilities, we believe we've discovered roughly 25% of these NEOs thus far. Now a new study led by Tommy Grav (Planetary Science Institute) examines whether LSST will be able to complete this task. [Figure: Absolute magnitude, H, of a synthetic NEO population. Though these NEOs are all larger than 140 m, they have a large spread in albedos. Grav et al. 2016] Can LSST help? Based on previous observations of NEOs and resulting predictions for NEO properties and orbits, Grav and collaborators simulate a synthetic population of NEOs all above 140 m in size. With these improved population models, they demonstrate that the common tactic of using an asteroid's absolute magnitude as a proxy for its size is a poor approximation, due to asteroids' large spread in albedos. Roughly 23% of NEOs larger than 140 m have absolute magnitudes fainter than H = 22 mag, the authors show, which is the value usually assumed as the default absolute magnitude of a 140 m NEO. [Figure: Fraction of NEOs we've detected as a function of time, based on the authors' simulations, for the current surveys (red), LSST plus the current surveys (black), NEOCam plus the current surveys (blue), and the combined result for all surveys (green). Grav et al. 2016] Taking this into account, Grav and collaborators then use information about the planned LSST survey strategies and detection limits to test what fraction of this synthetic NEO population LSST will be able to detect in its proposed 10-year mission. The authors find that, within 10 years, LSST will likely be able to detect only 63% of NEOs larger than 140 m. Luckily, LSST may not have to work alone; in addition to the current surveys in operation, a proposed infrared space-based survey mission called NEOCam is planned for launch in 2021. If NEOCam is funded, it will complement LSST's discovery capabilities, potentially allowing the two surveys to jointly achieve the 90% detection goal within a decade. Citation: T. Grav et al. 2016, AJ, 151, 172. doi:10.3847/0004-6256/151/6/172
Final Technical Report for DE-SC0012297
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dell'Antonio, Ian
This is the final report on the work performed in award DE-SC0012297, Cosmic Frontier work in support of the LSST Dark Energy Science Collaboration's effort to develop algorithms, simulations, and statistical tests to ensure optimal extraction of the dark energy properties from galaxy clusters observed with LSST. This work focused on effects that could produce a systematic error on the measurement of cluster masses (which will be used to probe the effects of dark energy on the growth of structure). These effects stem from the deviations from pure ellipticity of the gravitational lensing signal and from the blending of light of neighboring galaxies. Both these effects are expected to be more significant for LSST than for stage III experiments such as the Dark Energy Survey. We calculate the magnitude of the mass error (or bias) for the first time and demonstrate that it can be treated as a multiplicative correction and calibrated out, allowing mass measurements of clusters from gravitational lensing to meet the requirements of LSST's dark energy investigation.
Using SysML for MBSE analysis of the LSST system
NASA Astrophysics Data System (ADS)
Claver, Charles F.; Dubois-Felsmann, Gregory; Delgado, Francisco; Hascall, Pat; Marshall, Stuart; Nordby, Martin; Schalk, Terry; Schumacher, German; Sebag, Jacques
2010-07-01
The Large Synoptic Survey Telescope is a complex hardware-software system of systems, making up a highly automated observatory in the form of an 8.4m wide-field telescope, a 3.2 billion pixel camera, and a peta-scale data processing and archiving system. As a project, the LSST is using model based systems engineering (MBSE) methodology for developing the overall system architecture coded with the Systems Modeling Language (SysML). With SysML we use a recursive process to establish three-fold relationships between requirements, logical & physical structural component definitions, and overall behavior (activities and sequences) at successively deeper levels of abstraction and detail. Using this process we have analyzed and refined the LSST system design, ensuring the consistency and completeness of the full set of requirements and their match to associated system structure and behavior. As the recursion process proceeds to deeper levels we derive more detailed requirements and specifications, and ensure their traceability. We also expose, define, and specify critical system interfaces, physical and information flows, and clarify the logic and control flows governing system behavior. The resulting integrated model database is used to generate documentation and specifications and will evolve to support activities from construction through final integration, test, and commissioning, serving as a living representation of the LSST as designed and built. We discuss the methodology and present several examples of its application to specific systems engineering challenges in the LSST design.
Astroinformatics in the Age of LSST: Analyzing the Summer 2012 Data Release
NASA Astrophysics Data System (ADS)
Borne, Kirk D.; De Lee, N. M.; Stassun, K.; Paegert, M.; Cargile, P.; Burger, D.; Bloom, J. S.; Richards, J.
2013-01-01
The Large Synoptic Survey Telescope (LSST) will image the visible southern sky every three nights. This multi-band, multi-epoch survey will produce a torrent of data, which traditional methods of object-by-object data analysis will not be able to accommodate. Thus there is a need for new astroinformatics tools to visualize, simulate, mine, and analyze this quantity of data. The Berkeley Center for Time-Domain Informatics (CTDI) is building the informatics infrastructure for generic light curve classification, including the innovation of new algorithms for feature generation and machine learning. The CTDI portal (http://dotastro.org) contains one of the largest collections of public light curves, with visualization and exploration tools. The group has also published the first calibrated probabilistic classification catalog of 50k variable stars, along with a data exploration portal called http://bigmacc.info. Twice a year, the LSST collaboration releases simulated LSST data in order to aid software development. This poster also showcases a suite of new tools from the Vanderbilt Initiative in Data-intensive Astrophysics (VIDA), designed to take advantage of these large data sets. VIDA's Filtergraph interactive web tool allows one to instantly create an interactive data portal for fast, real-time visualization of large data sets. Filtergraph enables quick selection of interesting objects by easily filtering on many different columns, 2-D and 3-D representations, and on-the-fly arithmetic calculations on the data. It also makes sharing the data and the tool with collaborators very easy. The EB/RRL Factory is a neural-network based variable star classifier, designed to quickly identify variable stars in a variety of classes from LSST light curve data (currently tuned to Eclipsing Binaries and RR Lyrae stars), and to provide likelihood-based orbital elements or stellar parameters as appropriate. Finally, the LCsimulator software allows one to create simulated light curves of multiple types of variable stars based on an LSST cadence.
Stellar Populations with the LSST
NASA Astrophysics Data System (ADS)
Saha, Abhijit; Olsen, K.; LSST Stellar Populations Collaboration
2006-12-01
The LSST will produce a multi-color map and photometric object catalog of half the sky to g ~ 27.5 (5σ). Strategically cadenced time-space sampling of each field spanning ten years will allow variability, proper motion and parallax measurements for objects brighter than g ~ 25. As part of providing an unprecedented map of the Galaxy, the accurate multi-band photometry will permit photometric parallaxes, chemical abundances and a handle on ages via colors at turn-off for main-sequence stars at all distances within the Galaxy, permitting a comprehensive study of star formation histories (SFH) and chemical evolution for field stars. With a geometric parallax accuracy of 1 mas, LSST will produce a robust complete sample of the solar neighborhood stars. While delivering parallax accuracy comparable to HIPPARCOS, LSST will extend the catalog to a limit more than 10 magnitudes fainter, and will be complete to M_V ~ 15. In the Magellanic Clouds too, the photometry will reach M_V ~ +8, allowing the SFH and chemical signatures in the expansive outer extremities to be gleaned from their main sequence stars. This in turn will trace the detailed interaction of the Clouds with the Galaxy halo. The LSST time sampling will identify and characterize variable stars of all types, from time scales of 1 hr to several years, a feast for variable star astrophysics. Cepheids and LPVs in all galaxies in the Sculptor, M83 and Cen-A groups are obvious data products: comparative studies will reveal systematic differences with galaxy properties, and help to fine tune the rungs of the distance ladder. Dwarf galaxies within 10 Mpc that are too faint to find from surface brightness enhancements will be revealed via over-densities of their red giants: this systematic census will extend the luminosity function of galaxies to the faint limit. Novae discovered by LSST time sampling will trace intergalactic stars out to the Virgo and Fornax clusters.
The LSST Metrics Analysis Framework (MAF)
NASA Astrophysics Data System (ADS)
Jones, R. Lynne; Yoachim, Peter; Chandrasekharan, Srinivasan; Connolly, Andrew J.; Cook, Kem H.; Ivezic, Zeljko; Krughoff, K. Simon; Petry, Catherine E.; Ridgway, Stephen T.
2015-01-01
Studying potential observing strategies or cadences for the Large Synoptic Survey Telescope (LSST) is a complicated but important problem. To address this, LSST has created an Operations Simulator (OpSim) to create simulated surveys, including realistic weather and sky conditions. Analyzing the results of these simulated surveys for the wide variety of science cases to be considered for LSST is, however, difficult. We have created a Metric Analysis Framework (MAF), an open-source python framework, to be a user-friendly, customizable and easily extensible tool to help analyze the outputs of OpSim. MAF reads the pointing history of the LSST generated by the OpSim, then enables the subdivision of these pointings based on position on the sky (RA/Dec, etc.) or the characteristics of the observations (e.g. airmass or sky brightness) and a calculation of how well these observations meet a specified science objective (or metric). An example simple metric could be the mean single visit limiting magnitude for each position in the sky; a more complex metric might be the expected astrometric precision. The output of these metrics can be generated for a full survey, for specified time intervals, or for regions of the sky, and can be easily visualized using a web interface. An important goal for MAF is to facilitate analysis of the OpSim outputs for a wide variety of science cases. A user can often write a new metric to evaluate OpSim for new science goals in less than a day once they are familiar with the framework. Some of these new metrics are illustrated in the accompanying poster, "Analyzing Simulated LSST Survey Performance With MAF". While MAF has been developed primarily for application to OpSim outputs, it can be applied to any dataset. The most obvious examples are examining pointing histories of other survey projects or telescopes, such as CFHT.
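To make the metric-plus-sky-subdivision idea concrete, here is a self-contained toy in the same spirit. This is not the real MAF API; the class names, column names, and crude declination-band slicer are stand-ins. The example metric mirrors the simple one mentioned above: the mean single-visit limiting magnitude per sky region.

    # Toy mini-framework in the spirit of the Metric/Slicer design described above.
    # NOT the real MAF API: class names, columns, and the declination-band
    # "slicer" are simplified stand-ins for illustration only.
    import numpy as np

    class MeanLimitingMagMetric:
        """Metric: mean single-visit 5-sigma limiting magnitude in a sky slice."""
        def run(self, visits):
            return np.mean(visits["m5"]) if visits.size else np.nan

    class DecBandSlicer:
        """Slicer: subdivide visits into declination bands (a crude sky pixelization)."""
        def __init__(self, edges_deg):
            self.edges = np.asarray(edges_deg)
        def slices(self, visits):
            for lo, hi in zip(self.edges[:-1], self.edges[1:]):
                yield (lo, hi), visits[(visits["dec"] >= lo) & (visits["dec"] < hi)]

    # Fake "pointing history": declination and 5-sigma depth of each simulated visit.
    rng = np.random.default_rng(1)
    visits = np.zeros(1000, dtype=[("dec", float), ("m5", float)])
    visits["dec"] = rng.uniform(-90, 10, visits.size)
    visits["m5"] = rng.normal(24.5, 0.3, visits.size)

    metric, slicer = MeanLimitingMagMetric(), DecBandSlicer(np.arange(-90, 11, 20))
    for band, subset in slicer.slices(visits):
        print(f"dec {band[0]:+4.0f}..{band[1]:+4.0f}: mean m5 = {metric.run(subset):.2f}")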
LSST Probes of Dark Energy: New Energy vs New Gravity
NASA Astrophysics Data System (ADS)
Bradshaw, Andrew; Tyson, A.; Jee, M. J.; Zhan, H.; Bard, D.; Bean, R.; Bosch, J.; Chang, C.; Clowe, D.; Dell'Antonio, I.; Gawiser, E.; Jain, B.; Jarvis, M.; Kahn, S.; Knox, L.; Newman, J.; Wittman, D.; Weak Lensing, LSST; LSS Science Collaborations
2012-01-01
Is the late time acceleration of the universe due to new physics in the form of stress-energy or a departure from General Relativity? LSST will measure the shape, magnitude, and color of 4×10^9 galaxies to high S/N over 18,000 square degrees. These data will be used to separately measure the gravitational growth of mass structure and distance vs redshift to unprecedented precision by combining multiple probes in a joint analysis. Of the five LSST probes of dark energy, the weak gravitational lensing (WL) and baryon acoustic oscillation (BAO) probes are particularly effective in combination. By measuring the 2-D BAO scale in ugrizy-band photometric redshift-selected samples, LSST will determine the angular diameter distance to a dozen redshifts with sub-percent-level errors. Reconstruction of the WL shear power spectrum on linear and weakly non-linear scales, and of the cross-correlation of shear measured in different photometric redshift bins, provides a constraint on the evolution of dark energy that is complementary to the purely geometric measures provided by supernovae and BAO. Cross-correlation of the WL shear and BAO signal within redshift shells minimizes the sensitivity to systematics. LSST will also detect shear peaks, providing independent constraints. Tomographic study of the shear of background galaxies as a function of redshift allows a geometric test of dark energy. To extract the dark energy signal and distinguish between the two forms of new physics, LSST will rely on accurate stellar point-spread functions (PSF) and unbiased reconstruction of galaxy image shapes from hundreds of exposures. Although a weighted co-added deep image has high S/N, it is a form of lossy compression. Bayesian forward modeling algorithms can in principle use all the information. We explore systematic effects on shape measurements and present tests of an algorithm called Multi-Fit, which appears to avoid PSF-induced shear systematics in a computationally efficient way.
Giga-z: A 100,000 OBJECT SUPERCONDUCTING SPECTROPHOTOMETER FOR LSST FOLLOW-UP
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marsden, Danica W.; Mazin, Benjamin A.; O'Brien, Kieran
2013-09-15
We simulate the performance of a new type of instrument, a Superconducting Multi-Object Spectrograph (SuperMOS), that uses microwave kinetic inductance detectors (MKIDs). MKIDs, a new detector technology, feature good quantum efficiency in the UVOIR, can count individual photons with microsecond timing accuracy, and, like X-ray calorimeters, determine their energy to several percent. The performance of Giga-z, a SuperMOS designed for wide field imaging follow-up observations, is evaluated using simulated observations of the COSMOS mock catalog with an array of 100,000 MKID pixels with R_423nm = E/ΔE = 30. We compare our results against a simultaneous simulation of LSST observations. In 3 yr on a dedicated 4 m class telescope, Giga-z could observe ≈2 billion galaxies, yielding a low-resolution spectral energy distribution spanning 350-1350 nm for each; 1000 times the number measured with any currently proposed LSST spectroscopic follow-up, at a fraction of the cost and time. Giga-z would provide redshifts for galaxies up to z ≈ 6 with magnitudes m_i ≲ 25, with accuracy σ_Δz/(1+z) ≈ 0.03 for the whole sample, and σ_Δz/(1+z) ≈ 0.007 for a select subset. We also find catastrophic failure rates and biases that are consistently lower than for LSST. The added constraint on dark energy parameters for WL + CMB by Giga-z using the FoMSWG default model is equivalent to multiplying the LSST Fisher matrix by a factor of α = 1.27 (w_p), 1.53 (w_a), or 1.98 (Δγ). This is equivalent to multiplying both the LSST coverage area and the training sets by α and reducing all systematics by a factor of 1/√α, advantages that are robust to even more extreme models of intrinsic alignment.
Mechanical Design of the LSST Camera
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nordby, Martin; Bowden, Gordon; Foss, Mike
2008-06-13
The LSST camera is a tightly packaged, hermetically-sealed system that is cantilevered into the main beam of the LSST telescope. It comprises three refractive lenses, on-board storage for five large filters, a high-precision shutter, and a cryostat that houses the 3.2 giga-pixel CCD focal plane along with its support electronics. The physically large optics and focal plane demand large structural elements to support them, but the overall size of the camera and its components must be minimized to reduce impact on the image stability. Also, focal plane and optics motions must be minimized to reduce systematic errors in image reconstruction. Design and analysis for the camera body and cryostat will be detailed.
Designing for Peta-Scale in the LSST Database
NASA Astrophysics Data System (ADS)
Kantor, J.; Axelrod, T.; Becla, J.; Cook, K.; Nikolaev, S.; Gray, J.; Plante, R.; Nieto-Santisteban, M.; Szalay, A.; Thakar, A.
2007-10-01
The Large Synoptic Survey Telescope (LSST), a proposed ground-based 8.4 m telescope with a 10 deg^2 field of view, will generate 15 TB of raw images every observing night. When calibration and processed data are added, the image archive, catalogs, and meta-data will grow 15 PB yr^{-1} on average. The LSST Data Management System (DMS) must capture, process, store, index, replicate, and provide open access to this data. Alerts must be triggered within 30 s of data acquisition. To do this in real-time at these data volumes will require advances in data management, database, and file system techniques. This paper describes the design of the LSST DMS and emphasizes features for peta-scale data. The LSST DMS will employ a combination of distributed database and file systems, with schema, partitioning, and indexing oriented for parallel operations. Image files are stored in a distributed file system with references to, and meta-data from, each file stored in the databases. The schema design supports pipeline processing, rapid ingest, and efficient query. Vertical partitioning reduces disk input/output requirements, horizontal partitioning allows parallel data access using arrays of servers and disks. Indexing is extensive, utilizing both conventional RAM-resident indexes and column-narrow, row-deep tag tables/covering indices that are extracted from tables that contain many more attributes. The DMS Data Access Framework is encapsulated in a middleware framework to provide a uniform service interface to all framework capabilities. This framework will provide the automated work-flow, replication, and data analysis capabilities necessary to make data processing and data quality analysis feasible at this scale.
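The horizontal-partitioning and narrow tag-table ideas described above can be pictured with a toy in-memory SQLite example; the schema, chunking scheme, and column names below are invented for illustration and are far simpler than the actual LSST DMS design.

    # Toy illustration of horizontal partitioning by sky chunk plus a narrow,
    # heavily indexed "tag table". Schema and chunking are invented placeholders.
    import sqlite3, random

    def chunk_id(ra, dec, n_ra=18, n_dec=9):
        """Assign each object to a coarse sky chunk (the horizontal partition key)."""
        return int(ra // (360 / n_ra)) * n_dec + int((dec + 90) // (180 / n_dec))

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE object (id INTEGER, chunk INTEGER, ra REAL, dec REAL, mag_r REAL, payload TEXT)")
    # Column-narrow tag table: only the attributes that common queries filter on.
    db.execute("CREATE TABLE object_tag (id INTEGER, chunk INTEGER, mag_r REAL)")
    db.execute("CREATE INDEX tag_idx ON object_tag (chunk, mag_r)")

    random.seed(0)
    for i in range(10000):
        ra, dec, mag = random.uniform(0, 360), random.uniform(-90, 0), random.uniform(16, 27)
        db.execute("INSERT INTO object VALUES (?,?,?,?,?,?)", (i, chunk_id(ra, dec), ra, dec, mag, "..."))
        db.execute("INSERT INTO object_tag VALUES (?,?,?)", (i, chunk_id(ra, dec), mag))

    # A query touching one chunk and the narrow tag table avoids scanning the wide rows.
    n, = db.execute("SELECT COUNT(*) FROM object_tag WHERE chunk = ? AND mag_r < 24.5",
                    (chunk_id(30, -45),)).fetchone()
    print(f"bright objects in chunk {chunk_id(30, -45)}: {n}")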
LSST telescope and site status
NASA Astrophysics Data System (ADS)
Gressler, William J.
2016-07-01
The Large Synoptic Survey Telescope (LSST) Project received its construction authorization from the National Science Foundation in August 2014. The Telescope and Site (T and S) group has made considerable progress towards completion in subsystems required to support the scope of the LSST science mission. The LSST goal is to conduct a wide, fast, deep survey via a 3-mirror wide field of view optical design, a 3.2-Gpixel camera, and an automated data processing system. The summit facility is currently under construction on Cerro Pachón in Chile, with major vendor subsystem deliveries and integration planned over the next several years. This paper summarizes the status of the activities of the T and S group, tasked with design, analysis, and construction of the summit and base facilities and infrastructure necessary to control the survey, capture the light, and calibrate the data. All major telescope work package procurements have been awarded to vendors and are in varying stages of design and fabrication maturity and completion. The unique M1M3 primary/tertiary mirror polishing effort is completed and the mirror now resides in storage awaiting future testing. Significant progress has been achieved on all the major telescope subsystems including the summit facility, telescope mount assembly, dome, hexapod and rotator systems, coating plant, base facility, and the calibration telescope. In parallel, in-house efforts including the software needed to control the observatory, such as the scheduler and the active optics control, have also seen substantial advancement. The progress and status of these subsystems and future LSST plans during this construction phase are presented.
LSST and the Physics of the Dark Universe
Tyson, Anthony [UC Davis, California, United States]
2017-12-09
The physics that underlies the accelerating cosmic expansion is unknown. This 'dark energy' and the equally mysterious 'dark matter' comprise most of the mass-energy of the universe and are outside the standard model. Recent advances in optics, detectors, and information technology have led to the design of a facility that will repeatedly image an unprecedented volume of the universe: LSST. For the first time, the sky will be surveyed wide, deep and fast. The history of astronomy has taught us repeatedly that there are surprises whenever we view the sky in a new way. I will review the technology of LSST, and focus on several independent probes of the nature of dark energy and dark matter. These new investigations will rely on the statistical precision obtainable with billions of galaxies.
Solar System science with the Large Synoptic Survey Telescope
NASA Astrophysics Data System (ADS)
Jones, Lynne; Brown, Mike; Ivezić, Zeljko; Jurić, Mario; Malhotra, Renu; Trilling, David
2015-11-01
The Large Synoptic Survey Telescope (LSST; http://lsst.org) will be a large-aperture, wide-field, ground-based telescope that will survey half the sky every few nights in six optical bands from 320 to 1050 nm. It will explore a wide range of astrophysical questions, ranging from performing a census of the Solar System to examining the nature of dark energy. It is currently under construction, slated for first light in 2019 and full operations by 2022. The LSST will survey over 20,000 square degrees with a rapid observational cadence, to typical limiting magnitudes of r~24.5 in each visit (9.6 square degree field of view). Automated software will link the individual detections into orbits; these orbits, as well as precisely calibrated astrometry (~50 mas) and photometry (~0.01-0.02 mag) in multiple bandpasses, will be available as LSST data products. The resulting data set will have tremendous potential for planetary astronomy: multi-color catalogs of hundreds of thousands of NEOs and Jupiter Trojans, millions of asteroids, tens of thousands of TNOs, as well as thousands of other objects such as comets and irregular satellites of the major planets. LSST catalogs will increase the sample size of objects with well-known orbits 10-100 times for small body populations throughout the Solar System, enabling a major increase in the completeness level of the inventory of most dynamical classes of small bodies and generating new insights into planetary formation and evolution. Precision multi-color photometry will allow determination of lightcurves and colors, as well as spin state and shape modeling through sparse lightcurve inversion. LSST is currently investigating survey strategies to optimize science return across a broad range of goals. To aid in this investigation, we are making a series of realistic simulated survey pointing histories available, together with a Python software package to model and evaluate survey detections for a user-defined input population. Preliminary metrics from these simulations are shown here; the community is invited to provide further input.
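Underlying such catalogs is the conversion from an asteroid's absolute magnitude H to a per-visit apparent magnitude. A simplified sketch of that conversion is below (our illustration, not LSST code); it omits the phase-function term of the full IAU (H, G) system.

    # Simplified sketch: apparent magnitude of an asteroid with absolute magnitude H
    # at heliocentric distance r and geocentric distance Delta (both in au).
    # m ~ H + 5 log10(r * Delta); the phase correction -2.5 log10(Phi(alpha)) is omitted.
    import math

    def apparent_mag(H, r_helio_au, delta_geo_au):
        return H + 5.0 * math.log10(r_helio_au * delta_geo_au)

    # A main-belt asteroid with H = 18 near opposition at r ~ 2.7 au:
    print(f"{apparent_mag(18.0, 2.7, 1.7):.1f}")   # ~21.3, brighter than the r~24.5 single-visit depth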
How Many Kilonovae Can Be Found in Past, Present, and Future Survey Data Sets?
NASA Astrophysics Data System (ADS)
Scolnic, D.; Kessler, R.; Brout, D.; Cowperthwaite, P. S.; Soares-Santos, M.; Annis, J.; Herner, K.; Chen, H.-Y.; Sako, M.; Doctor, Z.; Butler, R. E.; Palmese, A.; Diehl, H. T.; Frieman, J.; Holz, D. E.; Berger, E.; Chornock, R.; Villar, V. A.; Nicholl, M.; Biswas, R.; Hounsell, R.; Foley, R. J.; Metzger, J.; Rest, A.; García-Bellido, J.; Möller, A.; Nugent, P.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Bechtol, K.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Castander, F. J.; Cunha, C. E.; D’Andrea, C. B.; da Costa, L. N.; Davis, C.; Doel, P.; Drlica-Wagner, A.; Eifler, T. F.; Flaugher, B.; Fosalba, P.; Gaztanaga, E.; Gerdes, D. W.; Gruen, D.; Gruendl, R. A.; Gschwend, J.; Gutierrez, G.; Hartley, W. G.; Honscheid, K.; James, D. J.; Johnson, M. W. G.; Johnson, M. D.; Krause, E.; Kuehn, K.; Kuhlmann, S.; Lahav, O.; Li, T. S.; Lima, M.; Maia, M. A. G.; March, M.; Marshall, J. L.; Menanteau, F.; Miquel, R.; Neilsen, E.; Plazas, A. A.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, M.; Smith, R. C.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, R. C.; Tucker, D. L.; Walker, A. R.; DES Collaboration
2018-01-01
The discovery of a kilonova (KN) associated with the Advanced LIGO (aLIGO)/Virgo event GW170817 opens up new avenues of multi-messenger astrophysics. Here, using realistic simulations, we provide estimates of the number of KNe that could be found in data from past, present, and future surveys without a gravitational-wave trigger. For the simulation, we construct a spectral time-series model based on the DES-GW multi-band light curve from the single known KN event, and we use an average of BNS rates from past studies of 10^3 Gpc^-3 yr^-1, consistent with the one event found so far. Examining past and current data sets from transient surveys, the number of KNe we expect to find for ASAS-SN, SDSS, PS1, SNLS, DES, and SMT is between 0 and 0.3. We predict the number of detections per future survey to be 8.3 from ATLAS, 10.6 from ZTF, 5.5/69 from LSST (the Deep Drilling/Wide Fast Deep), and 16.0 from WFIRST. The maximum redshift of KNe discovered for each survey is z = 0.8 for WFIRST, z = 0.25 for LSST, and z = 0.04 for ZTF and ATLAS. This maximum redshift for WFIRST is well beyond the sensitivity of aLIGO and some future GW missions. For the LSST survey, we also provide contamination estimates from Type Ia and core-collapse supernovae: after light curve and template-matching requirements, we estimate a background of just two events. More broadly, we stress that future transient surveys should consider how to optimize their search strategies to improve their detection efficiency and to consider similar analyses for GW follow-up programs.
How Many Kilonovae Can Be Found in Past, Present, and Future Survey Data Sets?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scolnic, D.; Kessler, R.; Brout, D.
The discovery of a kilonova (KN) associated with the Advanced LIGO (aLIGO)/Virgo event GW170817 opens up new avenues of multi-messenger astrophysics. Here, using realistic simulations, we provide estimates of the number of KNe that could be found in data from past, present, and future surveys without a gravitational-wave trigger. For the simulation, we construct a spectral time-series model based on the DES-GW multi-band light curve from the single known KN event, and we use an average of BNS rates from past studies of 10^3 Gpc^-3 yr^-1, consistent with the one event found so far. Examining past and current data sets from transient surveys, the number of KNe we expect to find for ASAS-SN, SDSS, PS1, SNLS, DES, and SMT is between 0 and 0.3. We predict the number of detections per future survey to be 8.3 from ATLAS, 10.6 from ZTF, 5.5/69 from LSST (the Deep Drilling/Wide Fast Deep), and 16.0 from WFIRST. The maximum redshift of KNe discovered for each survey is z = 0.8 for WFIRST, z = 0.25 for LSST, and z = 0.04 for ZTF and ATLAS. This maximum redshift for WFIRST is well beyond the sensitivity of aLIGO and some future GW missions. For the LSST survey, we also provide contamination estimates from Type Ia and core-collapse supernovae: after light curve and template-matching requirements, we estimate a background of just two events. Finally, more broadly, we stress that future transient surveys should consider how to optimize their search strategies to improve their detection efficiency and to consider similar analyses for GW follow-up programs.
How Many Kilonovae Can Be Found in Past, Present, and Future Survey Data Sets?
Scolnic, D.; Kessler, R.; Brout, D.; ...
2017-12-22
The discovery of a kilonova (KN) associated with the Advanced LIGO (aLIGO)/Virgo event GW170817 opens up new avenues of multi-messenger astrophysics. Here, using realistic simulations, we provide estimates of the number of KNe that could be found in data from past, present, and future surveys without a gravitational-wave trigger. For the simulation, we construct a spectral time-series model based on the DES-GW multi-band light curve from the single known KN event, and we use an average of BNS rates from past studies of 10^3 Gpc^-3 yr^-1, consistent with the one event found so far. Examining past and current data sets from transient surveys, the number of KNe we expect to find for ASAS-SN, SDSS, PS1, SNLS, DES, and SMT is between 0 and 0.3. We predict the number of detections per future survey to be 8.3 from ATLAS, 10.6 from ZTF, 5.5/69 from LSST (the Deep Drilling/Wide Fast Deep), and 16.0 from WFIRST. The maximum redshift of KNe discovered for each survey is z = 0.8 for WFIRST, z = 0.25 for LSST, and z = 0.04 for ZTF and ATLAS. This maximum redshift for WFIRST is well beyond the sensitivity of aLIGO and some future GW missions. For the LSST survey, we also provide contamination estimates from Type Ia and core-collapse supernovae: after light curve and template-matching requirements, we estimate a background of just two events. Finally, more broadly, we stress that future transient surveys should consider how to optimize their search strategies to improve their detection efficiency and to consider similar analyses for GW follow-up programs.
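As an order-of-magnitude cross-check of how a volumetric rate of 10^3 Gpc^-3 yr^-1 maps onto survey yields (our rough arithmetic, not the paper's simulation), one can multiply the rate by the comoving volume within a survey's maximum useful redshift, its sky fraction, and its duration. The Hubble constant, sky fraction, and duration below are assumed values, and detection efficiency and time dilation, which the paper models in detail, are ignored.

    # Order-of-magnitude sketch only: convert the BNS volumetric rate into a rough
    # kilonova yield for a survey reaching z_max, ignoring detection efficiency,
    # cadence, and time dilation (which the paper's simulations handle properly).
    import math

    RATE = 1e3            # events / Gpc^3 / yr
    C_OVER_H0 = 4.28      # Hubble distance in Gpc for an assumed H0 = 70 km/s/Mpc

    def rough_yield(z_max, sky_fraction, years):
        d = C_OVER_H0 * z_max                      # low-z comoving distance approximation
        volume = 4.0 / 3.0 * math.pi * d**3        # Gpc^3
        return RATE * volume * sky_fraction * years

    # e.g. a ZTF-like survey: z_max ~ 0.04, ~3/4 of the sky, 3 years of searching (assumed)
    print(f"~{rough_yield(0.04, 0.75, 3.0):.0f} kilonovae before efficiency cuts")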
Surveying the Inner Solar System with an Infrared Space Telescope
NASA Astrophysics Data System (ADS)
Buie, Marc W.; Reitsema, Harold J.; Linfield, Roger P.
2016-11-01
We present an analysis of surveying the inner solar system for objects that may pose some threat to Earth. Most of the analysis is based on understanding the capability provided by Sentinel, a concept for an infrared space-based telescope placed in a heliocentric orbit near the distance of Venus. From this analysis, we show that (1) the size range being targeted can affect the survey design, (2) the orbit distribution of the target sample can affect the survey design, (3) minimum observational arc length during the survey is an important metric of survey performance, and (4) surveys must consider objects as small as D = 15-30 m to meet the goal of identifying objects that have the potential to cause damage on Earth in the next 100 yr. Sentinel will be able to find 50% of all impactors larger than 40 m in a 6.5 yr survey. The Sentinel mission concept is shown to be as effective as any survey in finding objects bigger than D = 140 m but is more effective when applied to finding smaller objects on Earth-impacting orbits. Sentinel is also more effective at finding objects of interest for human exploration that benefit from lower propulsion requirements. To explore the interaction between space and ground search programs, we also study a case where Sentinel is combined with the Large Synoptic Survey Telescope (LSST) and show the benefit of placing a space-based observatory in an orbit that reduces the overlap in search regions with a ground-based telescope. In this case, Sentinel+LSST can find more than 70% of the impactors larger than 40 m assuming a 6.5 yr lifetime for Sentinel and 10 yr for LSST.
Wavelength-Dependent PSFs and their Impact on Weak Lensing Measurements
NASA Astrophysics Data System (ADS)
Carlsten, S. G.; Strauss, Michael A.; Lupton, Robert H.; Meyers, Joshua E.; Miyazaki, Satoshi
2018-06-01
We measure and model the wavelength dependence of the point spread function (PSF) in the Hyper Suprime-Cam Subaru Strategic Program survey. We find that PSF chromaticity is present in that redder stars appear smaller than bluer stars in the g, r, and i-bands at the 1-2 per cent level and in the z and y-bands at the 0.1-0.2 per cent level. From the color dependence of the PSF, we fit a model between the monochromatic PSF size based on weighted second moments, R, and wavelength of the form R(λ) ∝ λ^(-b). We find values of b between 0.2 and 0.5, depending on the epoch and filter. This is consistent with the expectations of a turbulent atmosphere with an outer scale length of ~10-100 m, indicating that the atmosphere is dominating the chromaticity. In the best seeing data, we find that the optical system and detector also contribute some wavelength dependence. Meyers & Burchat (2015b) showed that b must be measured to an accuracy of ~0.02 not to dominate the systematic error budget of the Large Synoptic Survey Telescope (LSST) weak lensing (WL) survey. Using simple image simulations, we find that b can be inferred with this accuracy in the r and i-bands for all positions in the LSST focal plane, assuming a stellar density of 1 star arcmin^-2 and that the optical component of the PSF can be accurately modeled. Therefore, it is possible to correct for most, if not all, of the bias that the wavelength-dependent PSF will introduce into an LSST-like WL survey.
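The index b in R(λ) ∝ λ^(-b) can be estimated by a straight-line fit in log-log space to PSF sizes measured at several effective wavelengths; the sketch below uses invented wavelengths and sizes purely to illustrate the fit, not HSC or LSST measurements.

    # Sketch of estimating the chromatic index b in R(lambda) ~ lambda^(-b) from
    # PSF sizes measured at several effective wavelengths. The wavelengths and
    # sizes are invented for illustration only.
    import numpy as np

    lam = np.array([475.0, 622.0, 770.0, 890.0, 975.0])       # nm, rough grizy centers
    true_b = 0.3
    R = 0.70 * (lam / lam[0]) ** (-true_b)                     # arcsec, noiseless toy PSF sizes
    R *= 1.0 + np.random.default_rng(2).normal(0, 0.005, lam.size)   # add 0.5% scatter

    # log R = const - b * log(lambda): ordinary least squares on the log-log relation
    slope, intercept = np.polyfit(np.log(lam), np.log(R), 1)
    print(f"recovered b = {-slope:.3f} (input {true_b})")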
Synthesizing Planetary Nebulae for Large Scale Surveys: Predictions for LSST
NASA Astrophysics Data System (ADS)
Vejar, George; Montez, Rodolfo; Morris, Margaret; Stassun, Keivan G.
2017-01-01
The short-lived planetary nebula (PN) phase of stellar evolution is characterized by a hot central star and a bright, ionized nebula. The PN phase forms after a low- to intermediate-mass star stops burning hydrogen in its core, ascends the asymptotic giant branch, and expels its outer layers of material into space. The exposed hot core produces ionizing UV photons and a fast stellar wind that sweeps up the surrounding material into a dense shell of ionized gas known as the PN. This fleeting stage of stellar evolution provides insight into rare atomic processes and the nucleosynthesis of elements in stars. The inherent brightness of PNe allows them to be used to obtain distances to nearby stellar systems via the PN luminosity function and as kinematic tracers in other galaxies. However, the prevalence of non-spherical morphologies of PNe challenges the current paradigm of PN formation. The role of binarity in the shaping of the PN has recently gained traction, ultimately suggesting that single stars might not form PNe. Searches for binary central stars have increased the binary fraction, but the current PN sample is incomplete. Future wide-field, multi-epoch surveys like the Large Synoptic Survey Telescope (LSST) can impact studies of PNe and improve our understanding of their origin and formation. Using a suite of Cloudy radiative transfer calculations, we study the detectability of PNe in the proposed LSST multiband observations. We compare our synthetic PNe to common sources (stars, galaxies, quasars) and establish discrimination techniques. Finally, we discuss follow-up strategies to verify new LSST-discovered PNe and use limiting distances to estimate the potential sample of PNe enabled by LSST.
NASA Astrophysics Data System (ADS)
Goldstein, Daniel A.; Nugent, Peter E.; Kasen, Daniel N.; Collett, Thomas E.
2018-03-01
Time delays between the multiple images of strongly gravitationally lensed Type Ia supernovae (glSNe Ia) have the potential to deliver precise cosmological constraints, but the effects of microlensing on time delay extraction have not been studied in detail. Here we quantify the effect of microlensing on the glSN Ia yield of the Large Synoptic Survey Telescope (LSST) and the effect of microlensing on the precision and accuracy of time delays that can be extracted from LSST glSNe Ia. Microlensing has a negligible effect on the LSST glSN Ia yield, but it can be increased by a factor of ∼2 over previous predictions to 930 systems using a novel photometric identification technique based on spectral template fitting. Crucially, the microlensing of glSNe Ia is achromatic until three rest-frame weeks after the explosion, making the early-time color curves microlensing-insensitive time delay indicators. By fitting simulated flux and color observations of microlensed glSNe Ia with their underlying, unlensed spectral templates, we forecast the distribution of absolute time delay error due to microlensing for LSST, which is unbiased at the sub-percent level and peaked at 1% for color curve observations in the achromatic phase, while for light-curve observations it is comparable to state-of-the-art mass modeling uncertainties (4%). About 70% of LSST glSN Ia images should be discovered during the achromatic phase, indicating that microlensing time delay uncertainties can be minimized if prompt multicolor follow-up observations are obtained. Accounting for microlensing, the 1–2 day time delay on the recently discovered glSN Ia iPTF16geu can be measured to 40% precision, limiting its cosmological utility.
Goldstein, Daniel A.; Nugent, Peter E.; Kasen, Daniel N.; ...
2018-03-01
Time delays between the multiple images of strongly gravitationally lensed Type Ia supernovae (glSNe Ia) have the potential to deliver precise cosmological constraints, but the effects of microlensing on time delay extraction have not been studied in detail. Here we quantify the effect of microlensing on the glSN Ia yield of the Large Synoptic Survey Telescope (LSST) and the effect of microlensing on the precision and accuracy of time delays that can be extracted from LSST glSNe Ia. Microlensing has a negligible effect on the LSST glSN Ia yield, but it can be increased by a factor of ~2 over previous predictions to 930 systems using a novel photometric identification technique based on spectral template fitting. Crucially, the microlensing of glSNe Ia is achromatic until three rest-frame weeks after the explosion, making the early-time color curves microlensing-insensitive time delay indicators. By fitting simulated flux and color observations of microlensed glSNe Ia with their underlying, unlensed spectral templates, we forecast the distribution of absolute time delay error due to microlensing for LSST, which is unbiased at the sub-percent level and peaked at 1% for color curve observations in the achromatic phase, while for light-curve observations it is comparable to state-of-the-art mass modeling uncertainties (4%). About 70% of LSST glSN Ia images should be discovered during the achromatic phase, indicating that microlensing time delay uncertainties can be minimized if prompt multicolor follow-up observations are obtained. Lastly, accounting for microlensing, the 1-2 day time delay on the recently discovered glSN Ia iPTF16geu can be measured to 40% precision, limiting its cosmological utility.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldstein, Daniel A.; Nugent, Peter E.; Kasen, Daniel N.
Time delays between the multiple images of strongly gravitationally lensed Type Ia supernovae (glSNe Ia) have the potential to deliver precise cosmological constraints, but the effects of microlensing on time delay extraction have not been studied in detail. Here we quantify the effect of microlensing on the glSN Ia yield of the Large Synoptic Survey Telescope (LSST) and the effect of microlensing on the precision and accuracy of time delays that can be extracted from LSST glSNe Ia. Microlensing has a negligible effect on the LSST glSN Ia yield, but it can be increased by a factor of ~2 over previous predictions to 930 systems using a novel photometric identification technique based on spectral template fitting. Crucially, the microlensing of glSNe Ia is achromatic until three rest-frame weeks after the explosion, making the early-time color curves microlensing-insensitive time delay indicators. By fitting simulated flux and color observations of microlensed glSNe Ia with their underlying, unlensed spectral templates, we forecast the distribution of absolute time delay error due to microlensing for LSST, which is unbiased at the sub-percent level and peaked at 1% for color curve observations in the achromatic phase, while for light-curve observations it is comparable to state-of-the-art mass modeling uncertainties (4%). About 70% of LSST glSN Ia images should be discovered during the achromatic phase, indicating that microlensing time delay uncertainties can be minimized if prompt multicolor follow-up observations are obtained. Lastly, accounting for microlensing, the 1-2 day time delay on the recently discovered glSN Ia iPTF16geu can be measured to 40% precision, limiting its cosmological utility.
Mapping the Solar System with LSST
NASA Astrophysics Data System (ADS)
Ivezic, Z.; Juric, M.; Lupton, R.; Connolly, A.; Kubica, J.; Moore, A.; Harris, A.; Bowell, T.; Bernstein, G.; Stubbs, C.; LSST Collaboration
2004-12-01
The currently considered LSST cadence, based on two 10 sec exposures, may result in orbital parameters, light curves and accurate colors for over a million main-belt asteroids (MBA), and about 20,000 trans-Neptunian objects (TNO). Compared to the current state-of-the-art, this sample would represent a factor of 5 increase in the number of MBAs with known orbits, a factor of 20 increase in the number of MBAs with known orbits and accurate color measurements, and a factor of 100 increase in the number of MBAs with measured variability properties. The corresponding sample increase for TNOs is 10, 100, and 1000, respectively. The LSST MBA and TNO samples will enable detailed studies of the dynamical and chemical history of the solar system. For example, they will constrain the MBA size distribution for objects larger than 100 m, and TNO size distribution for objects larger than 100 km, their physical state through variability measurements (solid body vs. a rubble pile), as well as their surface chemistry through color measurements. A proposed deep TNO survey, based on 1 hour exposures, may result in a sample of about 100,000 TNOs, while spending only 10% of the LSST observing time. Such a deep TNO survey would be capable of discovering Sedna-like objects at distances beyond 150 AU, thereby increasing the observable Solar System volume by about a factor of 7. The increase in data volume associated with LSST asteroid science will present many computational challenges to how we might extract tracks and orbits of asteroids from the underlying clutter. Tree-based algorithms for multihypothesis testing of asteroid tracks can help solve these challenges by providing the necessary 1000-fold speed-ups over current approaches while recovering 95% of the underlying asteroid populations.
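A greatly simplified picture of the tree-based linking idea mentioned at the end of this abstract: detections from two nights can be paired with a kd-tree query restricted to a maximum plausible apparent motion, avoiding an all-pairs comparison. Real multi-night, multi-hypothesis linking is far more involved; the flat-sky coordinates, motion limit, and detection counts below are assumptions made for the example.

    # Greatly simplified illustration of tree-based detection linking: pair up
    # detections from two nights whose separation is consistent with a maximum
    # apparent motion. Coordinates here are flat-sky degrees; real pipelines use
    # many nights, full orbit hypotheses, and large false-detection rates.
    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(3)
    night1 = rng.uniform(0, 3, size=(500, 2))                 # (ra, dec) detections, deg
    motion = rng.uniform(-0.1, 0.1, size=(500, 2))            # true per-night motion, deg
    night2 = np.vstack([night1 + motion,                      # the same objects, moved...
                        rng.uniform(0, 3, size=(200, 2))])    # ...plus unrelated detections

    max_motion_deg = 0.15                                     # assumed maximum apparent motion per night
    tree = cKDTree(night2)
    pairs = tree.query_ball_point(night1, r=max_motion_deg)   # candidate links per night-1 detection
    n_links = sum(len(p) for p in pairs)
    print(f"{n_links} candidate links for 500 x 700 detections (vs 350000 brute-force pairs)")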
Management evolution in the LSST project
NASA Astrophysics Data System (ADS)
Sweeney, Donald; Claver, Charles; Jacoby, Suzanne; Kantor, Jeffrey; Krabbendam, Victor; Kurita, Nadine
2010-07-01
The Large Synoptic Survey Telescope (LSST) project has evolved from just a few staff members in 2003 to about 100 in 2010; the affiliation of four founding institutions has grown to 32 universities, government laboratories, and industry. The public-private collaboration aims to complete the estimated $450 M observatory in the 2017 timeframe. During the design phase of the project, from 2003 to the present, the management structure has been remarkably stable. At the same time, the funding levels, staffing levels and scientific community participation have grown dramatically. The LSSTC has introduced project controls and tools required to manage the LSST's complex funding model, technical structure and distributed work force. Project controls have been configured to comply with the requirements of federal funding agencies. Some of these tools for risk management, configuration control and resource-loaded schedule have been effective and others have not. Technical tasks associated with building the LSST are distributed into three subsystems: Telescope & Site, Camera, and Data Management. Each subsystem has its own experienced Project Manager and System Scientist. Delegation of authority is enabling and effective; it encourages a strong sense of ownership within the project. At the project level, subsystem management follows the principle that there is one Board of Directors, Director, and Project Manager who have overall authority.
Delta Doping High Purity CCDs and CMOS for LSST
NASA Technical Reports Server (NTRS)
Blacksberg, Jordana; Nikzad, Shouleh; Hoenk, Michael; Elliott, S. Tom; Bebek, Chris; Holland, Steve; Kolbe, Bill
2006-01-01
A viewgraph presentation describing delta doping high purity CCDs and CMOS for LSST is shown. The topics include: 1) Overview of JPL's versatile back-surface process for CCDs and CMOS; 2) Application to SNAP and ORION missions; 3) Delta doping as a back-surface electrode for fully depleted LBNL CCDs; 4) Delta doping high purity CCDs for SNAP and ORION; 5) JPL CMP thinning process development; and 6) Antireflection coating process development.
NASA Astrophysics Data System (ADS)
Saha, A.; Monet, D.
2005-12-01
Continued acquisition and analysis for short-exposure observations support the preliminary conclusion presented by Monet et al. (BAAS v36, p1531, 2004) that a 10-second exposure in 1.0-arcsecond seeing can provide a differential astrometric accuracy of about 10 milliarcseconds. A single solution for mapping coefficients appears to be valid over spatial scales of up to 10 arcminutes, and this suggests that numerical processing can proceed on a per-sensor basis without the need to further divide the individual fields of view into several astrometric patches. Data from the Subaru public archive as well as from the LSST Cerro Pachon 2005 observing campaign and various CTIO and NOAO 4-meter engineering runs have been considered. Should these results be confirmed, the expected astrometric accuracy after 10 years of LSST observations should be around 1.0 milliarcseconds for parallax and 0.2 milliarcseconds/year for proper motions.
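As a rough consistency check on these numbers (our own scaling arithmetic, not the authors' analysis), standard least-squares scalings applied to a ~10 mas single-visit accuracy with an assumed ~800 visits spread over 10 years give end-of-survey precisions of the same order as those quoted:

    # Rough scaling sketch (our arithmetic, not the authors' analysis): propagate a
    # ~10 mas single-visit accuracy to end-of-survey precision, assuming ~800 visits
    # uniformly spread over 10 years and ignoring systematics and correlations.
    import math

    sigma_single = 10.0      # mas per visit
    n_visits = 800           # assumed number of usable epochs
    baseline = 10.0          # years

    sigma_position = sigma_single / math.sqrt(n_visits)                  # mean position
    sigma_pm = sigma_single * math.sqrt(12.0 / n_visits) / baseline      # slope of a uniformly sampled series
    sigma_parallax = sigma_position / 0.7                                # assuming an RMS parallax factor ~0.7

    print(f"position ~{sigma_position:.2f} mas, proper motion ~{sigma_pm:.2f} mas/yr, "
          f"parallax ~{sigma_parallax:.2f} mas")
    # ~0.35 mas, ~0.12 mas/yr and ~0.5 mas: the same order as the quoted 1.0 mas
    # and 0.2 mas/yr once real-world systematics are folded in.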
The LSST metrics analysis framework (MAF)
NASA Astrophysics Data System (ADS)
Jones, R. L.; Yoachim, Peter; Chandrasekharan, Srinivasan; Connolly, Andrew J.; Cook, Kem H.; Ivezic, Željko; Krughoff, K. S.; Petry, Catherine; Ridgway, Stephen T.
2014-07-01
We describe the Metrics Analysis Framework (MAF), an open-source python framework developed to provide a user-friendly, customizable, easily extensible set of tools for analyzing data sets. MAF is part of the Large Synoptic Survey Telescope (LSST) Simulations effort. Its initial goal is to provide a tool to evaluate LSST Operations Simulation (OpSim) simulated surveys to help understand the effects of telescope scheduling on survey performance; however, MAF can be applied to a much wider range of datasets. The building blocks of the framework are Metrics (algorithms to analyze a given quantity of data), Slicers (subdividing the overall data set into smaller data slices as relevant for each Metric), and Database classes (to access the dataset and read data into memory). We describe how these building blocks work together, and provide an example of using MAF to evaluate different dithering strategies. We also outline how users can write their own custom Metrics and use these within the framework.
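The building-block structure described above can be illustrated with a few lines of Python. The classes below are a minimal sketch of the Metric/Slicer/Database pattern, not the actual MAF API; the table name, column layout, and class names are assumptions made for the example.

import sqlite3
import numpy as np

class Database:
    """Reads OpSim-like visit records into memory."""
    def __init__(self, path):
        self.path = path
    def fetch(self, columns):
        with sqlite3.connect(self.path) as conn:
            rows = conn.execute(
                "SELECT " + ", ".join(columns) + " FROM observations").fetchall()
        return np.array(rows, dtype=float)

class OneDSlicer:
    """Subdivides the data set into bins of one column."""
    def __init__(self, column_index, bin_edges):
        self.i, self.edges = column_index, bin_edges
    def slices(self, data):
        for lo, hi in zip(self.edges[:-1], self.edges[1:]):
            yield data[(data[:, self.i] >= lo) & (data[:, self.i] < hi)]

class MeanMetric:
    """Computes the mean of one column within each data slice."""
    def __init__(self, column_index):
        self.i = column_index
    def run(self, data_slice):
        return np.nan if len(data_slice) == 0 else data_slice[:, self.i].mean()

# Example (hypothetical 'observations' table with night and airmass columns):
# db = Database("opsim_output.sqlite")
# data = db.fetch(["night", "airmass"])
# slicer, metric = OneDSlicer(0, np.arange(0, 3653, 30)), MeanMetric(1)
# mean_airmass_per_month = [metric.run(s) for s in slicer.slices(data)]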
Detection of Double White Dwarf Binaries with Gaia, LSST and eLISA
NASA Astrophysics Data System (ADS)
Korol, V.; Rossi, E. M.; Groot, P. J.
2017-03-01
According to simulations, around 10^8 double degenerate white dwarf binaries are expected to be present in the Milky Way. Due to their intrinsic faintness, the detection of these systems is a challenge, and the total number of detected sources so far amounts only to a few tens. This will change in the next two decades with the advent of Gaia, the LSST and eLISA. We present an estimation of how many compact DWDs with orbital periods less than a few hours we will be able to detect 1) through electromagnetic radiation with Gaia and LSST and 2) through gravitational wave radiation with eLISA. We find that the sample of simultaneous electromagnetic and gravitational wave detections is expected to be substantial, and will provide us with a powerful tool for probing white dwarf astrophysics and the structure of the Milky Way, ushering in the era of multi-messenger astronomy for these sources.
NASA Astrophysics Data System (ADS)
Marshall, Stuart; Thaler, Jon; Schalk, Terry; Huffer, Michael
2006-06-01
The LSST Camera Control System (CCS) will manage the activities of the various camera subsystems and coordinate those activities with the LSST Observatory Control System (OCS). The CCS comprises a set of modules (nominally implemented in software) which are each responsible for managing one camera subsystem. Generally, a control module will be a long-lived "server" process running on an embedded computer in the subsystem. Multiple control modules may run on a single computer, or a module may be implemented in "firmware" on a subsystem. In any case, control modules must exchange messages and status data with a master control module (MCM). The main features of this approach are: (1) control is distributed to the local subsystem level; (2) the systems follow a "Master/Slave" strategy; (3) coordination is achieved by the exchange of messages through the interfaces between the CCS and its subsystems. The interface between the camera data acquisition system and its downstream clients is also presented.
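As a rough illustration of the master/slave message exchange described above, the Python sketch below stands in for the CCS transport with in-process queues. The module names, message fields, and the OPEN command are hypothetical; in the real system each control module runs as its own long-lived process and messages travel over the camera's messaging layer.

import queue

class MasterControlModule:
    """Coordinates the subsystem control modules and collects their responses."""
    def __init__(self):
        self.inbox = queue.Queue()
        self.slaves = {}
    def register(self, name, module):
        self.slaves[name] = module
    def send_command(self, name, cmd):
        self.slaves[name].inbox.put({"cmd": cmd, "reply_to": self.inbox})
    def collect_reply(self, timeout=5.0):
        return self.inbox.get(timeout=timeout)

class SubsystemControlModule:
    """A long-lived 'server' managing one camera subsystem."""
    def __init__(self, name):
        self.name = name
        self.inbox = queue.Queue()
        self.status = "IDLE"
    def poll(self):
        msg = self.inbox.get_nowait()          # raises queue.Empty when idle
        self.status = "EXECUTING " + msg["cmd"]
        msg["reply_to"].put({"from": self.name, "ack": msg["cmd"],
                             "status": self.status})

mcm = MasterControlModule()
shutter = SubsystemControlModule("shutter")    # hypothetical subsystem
mcm.register("shutter", shutter)
mcm.send_command("shutter", "OPEN")
shutter.poll()                                 # would run in the subsystem's own process
print(mcm.collect_reply())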
A rigid and thermally stable all ceramic optical support bench assembly for the LSST Camera
NASA Astrophysics Data System (ADS)
Kroedel, Matthias; Langton, J. Brian; Wahl, Bill
2017-09-01
This paper presents the ceramic design, fabrication and metrology results, and assembly plan for the LSST camera optical bench structure, which exploits the unique manufacturing features of the HB-Cesic technology. The optical bench assembly consists of a rigid "Grid" structure supporting individual raft plates that mount the sensor assemblies by way of a rigid kinematic support system, to meet extremely stringent requirements for focal plane planarity and stability.
The Large Synoptic Survey Telescope
NASA Astrophysics Data System (ADS)
Axelrod, T. S.
2006-07-01
The Large Synoptic Survey Telescope (LSST) is an 8.4 meter telescope with a 10 square degree field of view and a 3 Gigapixel imager, planned to be on-sky in 2012. It is a dedicated all-sky survey instrument, with several complementary science missions. These include understanding dark energy through weak lensing and supernovae; exploring transients and variable objects; creating and maintaining a solar system map, with particular emphasis on potentially hazardous objects; and increasing the precision with which we understand the structure of the Milky Way. The instrument operates continuously at a rapid cadence, repetitively scanning the visible sky every few nights. The data flow rates from LSST are larger than those from current surveys by roughly a factor of 1000: a few GB/night are typical today, while LSST will deliver a few TB/night. From a computing hardware perspective, this factor of 1000 can be dealt with easily in 2012. The major issues in designing the LSST data management system arise from the fact that the number of people available to critically examine the data will not grow from current levels. This has a number of implications. For example, every large imaging survey today is resigned to the fact that its image reduction pipeline fails at some significant rate. Many of these failures are dealt with by rerunning the reduction pipeline under human supervision, with carefully "tweaked" parameters to deal with the original problem. For LSST, this will no longer be feasible. The problem is compounded by the fact that the processing must of necessity occur on clusters with large numbers of CPUs and disk drives, and with some components connected by long-haul networks. This inevitably results in a significant rate of hardware component failures, which can easily lead to further software failures. Both hardware and software failures must be seen as a routine fact of life rather than rare exceptions to normality.
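The kind of unattended failure handling argued for above might look, very schematically, like the Python sketch below: each exposure is retried down a ladder of progressively relaxed parameter sets, and persistent failures are flagged for later bulk inspection rather than rescued by hand. This is an illustrative pattern, not the LSST data management design; all names and the parameter-ladder idea are assumptions.

import logging

def reduce_exposure(exposure_id, params):
    """Placeholder for a single-exposure reduction; raises on failure."""
    raise NotImplementedError

def reduce_with_fallbacks(exposure_id, parameter_ladder):
    """Try each parameter set in turn; flag the exposure if all of them fail."""
    for params in parameter_ladder:
        try:
            return reduce_exposure(exposure_id, params)
        except Exception as err:               # broad catch is deliberate here
            logging.warning("exposure %s failed with %s: %s",
                            exposure_id, params, err)
    logging.error("exposure %s exhausted all parameter sets", exposure_id)
    return None                                # flagged for later bulk inspection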
Atmospheric Dispersion Effects in Weak Lensing Measurements
Plazas, Andrés Alejandro; Bernstein, Gary
2012-10-01
The wavelength dependence of atmospheric refraction causes elongation of finite-bandwidth images along the elevation vector, which produces spurious signals in weak gravitational lensing shear measurements unless this atmospheric dispersion is calibrated and removed to high precision. Because astrometric solutions and PSF characteristics are typically calibrated from stellar images, differences between the reference stars' spectra and the galaxies' spectra will leave residual errors in both the astrometric positions (dr) and in the second moment (width) of the wavelength-averaged PSF (dv) for galaxies. We estimate the level of dv that will induce spurious weak lensing signals in PSF-corrected galaxy shapes that exceed the statistical errors of the DES and the LSST cosmic-shear experiments. We also estimate the dr signals that will produce unacceptable spurious distortions after stacking of exposures taken at different airmasses and hour angles. We also calculate the errors in the griz bands, and find that dispersion systematics, uncorrected, are up to 6 and 2 times larger in g and r bands, respectively, than the requirements for the DES error budget, but can be safely ignored in i and z bands. For the LSST requirements, the factors are about 30, 10, and 3 in g, r, and i bands, respectively. We find that a simple correction linear in galaxy color is accurate enough to reduce dispersion shear systematics to insignificant levels in the r band for DES and the i band for LSST, but still as much as 5 times larger than the requirements for LSST r-band observations. More complex corrections will likely be able to reduce the systematic cosmic-shear errors below statistical errors for LSST r band. But g-band effects remain large enough that it seems likely that induced systematics will dominate the statistical errors of both surveys, and cosmic-shear measurements should rely on the redder bands.
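Schematically, and in our own notation rather than necessarily the paper's, the observables can be written in terms of the refraction displacement R(\lambda) \simeq (n(\lambda) - 1)\tan z_a along the elevation vector, weighted by the photon distribution p(\lambda) within the band:

\bar{R} = \frac{\int R(\lambda)\,p(\lambda)\,d\lambda}{\int p(\lambda)\,d\lambda}, \qquad
V = \frac{\int \left[R(\lambda)-\bar{R}\right]^2 p(\lambda)\,d\lambda}{\int p(\lambda)\,d\lambda},

with dr and dv then corresponding to the differences in \bar{R} and V between a galaxy's spectrum and the stellar spectra used for calibration.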
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peterson, John Russell
This grant funded the development and dissemination of the Photon Simulator (PhoSim) for the purpose of studying dark energy at high precision with the upcoming Large Synoptic Survey Telescope (LSST) astronomical survey. The work was in collaboration with the LSST Dark Energy Science Collaboration (DESC). Several detailed physics improvements were made in the optics, atmosphere, and sensor, a number of validation studies were performed, and a significant number of usability features were implemented. Future work in DESC will use PhoSim as the image simulation tool for data challenges used by the analysis groups.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jain, B.; Spergel, D.; Connolly, A.
2015-02-02
The scientific opportunity offered by the combination of data from LSST, WFIRST and Euclid goes well beyond the science enabled by any one of the data sets alone. The range in wavelength, angular resolution and redshift coverage that these missions jointly span is remarkable. With major investments in LSST and WFIRST, and partnership with ESA in Euclid, the US has an outstanding scientific opportunity to carry out a combined analysis of these data sets. It is imperative for us to seize it and, together with our European colleagues, prepare for the defining cosmological pursuit of the 21st century. The main argument for conducting a single, high-quality reference co-analysis exercise and carefully documenting the results is the complexity and subtlety of systematics that define this co-analysis. Falling back on many small efforts by different teams in selected fields and for narrow goals will be inefficient, leading to significant duplication of effort.
The Emerging Infrastructure of Autonomous Astronomy
NASA Astrophysics Data System (ADS)
Seaman, R.; Allan, A.; Axelrod, T.; Cook, K.; White, R.; Williams, R.
2007-10-01
Advances in the understanding of cosmic processes demand that sky transient events be confronted with statistical techniques honed on static phenomena. Time domain data sets require vast surveys such as LSST {http://www.lsst.org/lsst_home.shtml} and Pan-STARRS {http://www.pan-starrs.ifa.hawaii.edu}. A new autonomous infrastructure must close the loop from the scheduling of survey observations, through data archiving and pipeline processing, to the publication of transient event alerts and automated follow-up, and to the easy analysis of resulting data. The IVOA VOEvent {http://voevent.org} working group leads efforts to characterize sky transient alerts published through VOEventNet {http://voeventnet.org}. The Heterogeneous Telescope Networks (HTN {http://www.telescope-networks.org}) consortium comprises observatories and robotic telescope projects seeking interoperability, with a long-term goal of creating an e-market for telescope time. Two projects relying on VOEvent and HTN are eSTAR {http://www.estar.org.uk} and the Thinking Telescope {http://www.thinkingtelescopes.lanl.gov} Project.
LIMB-DARKENING COEFFICIENTS FOR ECLIPSING WHITE DWARFS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gianninas, A.; Strickland, B. D.; Kilic, Mukremin
2013-03-20
We present extensive calculations of linear and nonlinear limb-darkening coefficients as well as complete intensity profiles appropriate for modeling the light curves of eclipsing white dwarfs. We compute limb-darkening coefficients in the Johnson-Kron-Cousins UBVRI photometric system as well as the Large Synoptic Survey Telescope (LSST) ugrizy system using the most up-to-date model atmospheres available. In all, we provide the coefficients for seven different limb-darkening laws. We describe the variations of these coefficients as a function of the atmospheric parameters, including the effects of convection at low effective temperatures. Finally, we discuss the importance of having readily available limb-darkening coefficients in the context of present and future photometric surveys like the LSST, Palomar Transient Factory, and the Panoramic Survey Telescope and Rapid Response System (Pan-STARRS). The LSST, for example, may find ~10^5 eclipsing white dwarfs. The limb-darkening calculations presented here will be an essential part of the detailed analysis of all of these systems.
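For reference, two of the commonly used laws among those tabulated take the standard forms below, where \mu = \cos\theta is the cosine of the angle between the line of sight and the local surface normal and u, a, b are the fitted coefficients:

I(\mu)/I(1) = 1 - u\,(1-\mu) \quad \text{(linear)}, \qquad
I(\mu)/I(1) = 1 - a\,(1-\mu) - b\,(1-\mu)^2 \quad \text{(quadratic)}.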
Is flat fielding safe for precision CCD astronomy?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baumer, Michael; Davis, Christopher P.; Roodman, Aaron
2017-07-06
The ambitious goals of precision cosmology with wide-field optical surveys such as the Dark Energy Survey (DES) and the Large Synoptic Survey Telescope (LSST) demand precision CCD astronomy as their foundation. This in turn requires an understanding of previously uncharacterized sources of systematic error in CCD sensors, many of which manifest themselves as static effective variations in pixel area. Such variation renders a critical assumption behind the traditional procedure of flat fielding—that a sensor's pixels comprise a uniform grid—invalid. In this work, we present a method to infer a curl-free model of a sensor's underlying pixel grid from flat-field images, incorporating the superposition of all electrostatic sensor effects—both known and unknown—present in flat-field data. We use these pixel grid models to estimate the overall impact of sensor systematics on photometry, astrometry, and PSF shape measurements in a representative sensor from the Dark Energy Camera (DECam) and a prototype LSST sensor. Applying the method to DECam data recovers known significant sensor effects for which corrections are currently being developed within DES. For an LSST prototype CCD with pixel-response non-uniformity (PRNU) of 0.4%, we find the impact of "improper" flat fielding on these observables is negligible in nominal 0.7'' seeing conditions. Furthermore, these errors scale linearly with the PRNU, so for future LSST production sensors, which may have larger PRNU, our method provides a way to assess whether pixel-level calibration beyond flat fielding will be required.
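In our notation, a curl-free model simply means that the effective displacement d(x, y) of each pixel center from a uniform grid is taken to be the gradient of a scalar potential, which is the form expected for purely electrostatic effects:

\mathbf{d}(x,y) = \nabla\phi(x,y), \qquad \nabla\times\mathbf{d} = 0,

so the full two-dimensional displacement field can be reconstructed from flat-field data up to a single scalar function.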
Mental Fatigue Impairs Soccer-Specific Physical and Technical Performance.
Smith, Mitchell R; Coutts, Aaron J; Merlini, Michele; Deprez, Dieter; Lenoir, Matthieu; Marcora, Samuele M
2016-02-01
To investigate the effects of mental fatigue on soccer-specific physical and technical performance. This investigation consisted of two separate studies. Study 1 assessed the soccer-specific physical performance of 12 moderately trained soccer players using the Yo-Yo Intermittent Recovery Test, Level 1 (Yo-Yo IR1). Study 2 assessed the soccer-specific technical performance of 14 experienced soccer players using the Loughborough Soccer Passing and Shooting Tests (LSPT, LSST). Each test was performed on two occasions and preceded, in a randomized, counterbalanced order, by 30 min of the Stroop task (mentally fatiguing treatment) or 30 min of reading magazines (control treatment). Subjective ratings of mental fatigue were measured before and after treatment, and mental effort and motivation were measured after treatment. Distance run, heart rate, and ratings of perceived exertion were recorded during the Yo-Yo IR1. LSPT performance time was calculated as original time plus penalty time. LSST performance was assessed using shot speed, shot accuracy, and shot sequence time. Subjective ratings of mental fatigue and effort were higher after the Stroop task in both studies (P < 0.001), whereas motivation was similar between conditions. This mental fatigue significantly reduced running distance in the Yo-Yo IR1 (P < 0.001). No difference in heart rate existed between conditions, whereas ratings of perceived exertion were significantly higher at iso-time in the mental fatigue condition (P < 0.01). LSPT original time and performance time were not different between conditions; however, penalty time significantly increased in the mental fatigue condition (P = 0.015). Mental fatigue also impaired shot speed (P = 0.024) and accuracy (P < 0.01), whereas shot sequence time was similar between conditions. Mental fatigue impairs soccer-specific running, passing, and shooting performance.
NASA Astrophysics Data System (ADS)
Coughlin, Michael; Stubbs, Christopher; Claver, Chuck
2016-06-01
We report measurements from which we determine the spatial structure of the lunar contribution to night sky brightness, taken at the LSST site on Cerro Pachon in Chile. We use an array of six photodiodes with filters that approximate the Large Synoptic Survey Telescope's u, g, r, i, z, and y bands. We use the sun as a proxy for the moon, and measure sky brightness as a function of the zenith angle of the point on sky, the zenith angle of the sun, and the angular distance between the sun and the point on sky. We make a correction for the difference between the illumination spectrum of the sun and the moon. Since scattered sunlight totally dominates the daytime sky brightness, this technique allows us to cleanly determine the contribution to the (cloudless) night sky from backscattered moonlight, without contamination from other sources of night sky brightness. We estimate our uncertainty in the relative lunar night sky brightness vs. zenith and lunar angle to be between 0.3 and 0.7 mag, depending on the passband. This information is useful in planning the optimal execution of the LSST survey, and perhaps for other astronomical observations as well. Although our primary objective is to map out the angular structure and spectrum of the scattered light from the atmosphere and particulates, we also make an estimate of the expected number of scattered lunar photons per pixel per second in LSST, and find values that are in overall agreement with previous estimates.
Probing the Solar System with LSST
NASA Astrophysics Data System (ADS)
Harris, A.; Ivezic, Z.; Juric, M.; Lupton, R.; Connolly, A.; Kubica, J.; Moore, A.; Bowell, E.; Bernstein, G.; Cook, K.; Stubbs, C.
2005-12-01
LSST will catalog small Potentially Hazardous Asteroids (PHAs), survey the main belt asteroid (MBA) population to extraordinarily small size, discover comets far from the sun where their nuclear properties can be discerned without coma, and survey the Centaur and Trans-Neptunian Object (TNO) populations. The present planned observing strategy is to "visit" each field (9.6 square degrees) with two back-to-back exposures of ~15 sec, reaching to at least V magnitude 24.5. An intra-night revisit time of the order of half an hour will distinguish stationary transients from even very distant (~70 AU) solar system bodies. In order to link observations and determine orbits, each sky area will be visited several times during a month, spaced by about a week. This cadence will result in orbital parameters for several million MBAs and about 20,000 TNOs, with light curves and colorimetry for the brighter 10% or so of each population. Compared to the data currently available, this would represent a factor of 10 to 100 increase in the numbers of orbits, colors, and variability measurements for the two classes of objects. The LSST MBA and TNO samples will enable detailed studies of the dynamical and chemical history of the solar system. The increase in data volume associated with LSST asteroid science will present many computational challenges to how we might extract tracks and orbits of asteroids from the underlying clutter. Tree-based algorithms for multihypothesis testing of asteroid tracks can help solve these challenges by providing the necessary 1000-fold speed-ups over current approaches while recovering 95% of the underlying moving objects.
Scheduling Algorithm for the Large Synoptic Survey Telescope
NASA Astrophysics Data System (ADS)
Ichharam, Jaimal; Stubbs, Christopher
2015-01-01
The Large Synoptic Survey Telescope (LSST) is a wide-field telescope currently under construction and scheduled to be deployed in Chile by 2022 and operate for a ten-year survey. As a ground-based telescope with the largest etendue ever constructed, and the ability to take images approximately once every eighteen seconds, the LSST will be able to capture the entirety of the observable sky every few nights in six different band passes. With these remarkable features, LSST is primed to provide the scientific community with invaluable data in numerous areas of astronomy, including the observation of near-Earth asteroids, the detection of transient optical events such as supernovae, and the study of dark matter and dark energy through weak gravitational lensing. In order to maximize the utility that LSST will provide toward achieving these scientific objectives, it proves necessary to develop a flexible scheduling algorithm for the telescope which both optimizes its observational efficiency and allows for adjustment based on the evolving needs of the astronomical community. This work defines a merit function that incorporates the urgency of observing a particular field in the sky as a function of time elapsed since last observed, dynamic viewing conditions (in particular transparency and sky brightness), and a measure of scientific interest in the field. The problem of maximizing this merit function, summed across the entire observable sky, is then reduced to a classic variant of the dynamic traveling salesman problem. We introduce a new approximation technique that appears particularly well suited for this situation. We analyze its effectiveness in resolving this problem, obtaining some promising initial results.
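As a toy illustration of the kind of merit function described above, the Python sketch below scores a field by the product of an urgency term, the current transparency, a sky-brightness bonus, and a science weight. The functional form, the 72-hour revisit target, and the 21 mag/arcsec^2 pivot are all illustrative assumptions, not the algorithm developed in this work.

def field_merit(hours_since_last_visit, transparency, sky_brightness_mag,
                science_weight, target_revisit_hours=72.0):
    """Score a candidate field for the next visit; higher is better."""
    urgency = hours_since_last_visit / target_revisit_hours
    # darker sky (larger mag/arcsec^2 value) earns a bonus; 21.0 is an arbitrary pivot
    dark_bonus = 10.0 ** (0.4 * (sky_brightness_mag - 21.0))
    return science_weight * urgency * transparency * dark_bonus

def pick_next_field(fields):
    """fields maps a field name to the keyword arguments of field_merit."""
    return max(fields, key=lambda name: field_merit(**fields[name]))

print(pick_next_field({
    "ecliptic_0421": dict(hours_since_last_visit=80.0, transparency=0.9,
                          sky_brightness_mag=21.2, science_weight=1.0),
    "galactic_0133": dict(hours_since_last_visit=30.0, transparency=0.7,
                          sky_brightness_mag=19.5, science_weight=1.2),
}))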
NASA Astrophysics Data System (ADS)
Schaan, Emmanuel; Krause, Elisabeth; Eifler, Tim; Doré, Olivier; Miyatake, Hironao; Rhodes, Jason; Spergel, David N.
2017-06-01
The next-generation weak lensing surveys (i.e., LSST, Euclid, and WFIRST) will require exquisite control over systematic effects. In this paper, we address shear calibration and present the most realistic forecast to date for LSST/Euclid/WFIRST and CMB lensing from a stage 4 CMB experiment ("CMB S4"). We use the cosmolike code to simulate a joint analysis of all the two-point functions of galaxy density, galaxy shear, and CMB lensing convergence. We include the full Gaussian and non-Gaussian covariances and explore the resulting joint likelihood with Monte Carlo Markov chains. We constrain shear calibration biases while simultaneously varying cosmological parameters, galaxy biases, and photometric redshift uncertainties. We find that CMB lensing from CMB S4 enables the calibration of the shear biases down to 0.2%-3% in ten tomographic bins for LSST (below the ˜0.5 % requirements in most tomographic bins), down to 0.4%-2.4% in ten bins for Euclid, and 0.6%-3.2% in ten bins for WFIRST. For a given lensing survey, the method works best at high redshift where shear calibration is otherwise most challenging. This self-calibration is robust to Gaussian photometric redshift uncertainties and to a reasonable level of intrinsic alignment. It is also robust to changes in the beam and the effectiveness of the component separation of the CMB experiment, and slowly dependent on its depth, making it possible with third-generation CMB experiments such as AdvACT and SPT-3G, as well as the Simons Observatory.
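The shear calibration biases being constrained are conventionally parametrized, in each tomographic bin i, by a multiplicative and an additive term (generic notation, not necessarily the paper's):

\hat{\gamma}_i = (1 + m_i)\,\gamma_i^{\rm true} + c_i,

and it is the multiplicative factors m_i that the CMB lensing cross-correlations calibrate to the sub-percent levels quoted above.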
Optical Design of the LSST Camera
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olivier, S S; Seppala, L; Gilmore, K
2008-07-16
The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror, modified Paul-Baker design, with an 8.4-meter primary mirror, a 3.4-m secondary, and a 5.0-m tertiary feeding a camera system that includes a set of broad-band filters and refractive corrector lenses to produce a flat focal plane with a field of view of 9.6 square degrees. Optical design of the camera lenses and filters is integrated with optical design of telescope mirrors to optimize performance, resulting in excellent image quality over the entire field from ultra-violet to near infra-red wavelengths. The LSST camera optics design consists of three refractive lenses with clear aperture diameters of 1.55 m, 1.10 m and 0.69 m and six interchangeable, broad-band filters with clear aperture diameters of 0.75 m. We describe the methodology for fabricating, coating, mounting and testing these lenses and filters, and we present the results of detailed tolerance analyses, demonstrating that the camera optics will perform to the specifications required to meet their performance goals.
Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)
NASA Astrophysics Data System (ADS)
Selvy, Brian M.; Claver, Charles; Angeli, George
2014-08-01
This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.
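The planning structure described above can be summarized, purely for illustration, with a few Python dataclasses; the project itself captures this in SysML within Enterprise Architect rather than in code, and the field names below simply mirror the terms used in the text.

from dataclasses import dataclass, field
from typing import List

@dataclass
class VerificationActivity:
    name: str
    method: str          # e.g. "Test", "Analysis", "Inspection", "Demonstration"
    owner: str

@dataclass
class VerificationPlan:
    requirement_id: str
    verification_requirement: str
    success_criteria: str
    level: str           # e.g. "Subsystem" or "System"
    activities: List[VerificationActivity] = field(default_factory=list)

@dataclass
class VerificationEvent:
    """A collection of activities that can be executed concurrently."""
    name: str
    activities: List[VerificationActivity] = field(default_factory=list)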
Photometric Redshift Calibration Strategy for WFIRST Cosmology
NASA Astrophysics Data System (ADS)
Hemmati, Shoubaneh; WFIRST, WFIRST-HLS-COSMOLOGY
2018-01-01
In order for WFIRST and other Stage IV dark energy experiments (e.g. LSST, Euclid) to infer cosmological parameters not limited by systematic errors, accurate redshift measurements are needed. This accuracy can only be met using spectroscopic subsamples to calibrate the full sample. In this poster, we employ the machine-learning, SOM-based spectroscopic sampling technique developed in Masters et al. 2015, which uses the empirical color-redshift relation among galaxies, to find the minimum spectroscopic sample required for the WFIRST weak lensing calibration. We use galaxies from the CANDELS survey to build the LSST+WFIRST lensing analog sample of ~36k objects and train the LSST+WFIRST SOM. We show that 26% of the WFIRST lensing sample consists of sources fainter than the Euclid depth in the optical, 91% of which live in color cells already occupied by brighter galaxies. We demonstrate the similarity between faint and bright galaxies as well as the feasibility of redshift measurements at different brightness levels. However, 4% of SOM cells are occupied only by faint galaxies, for which we recommend extra spectroscopy of ~200 new sources. Acquiring the spectra of these sources will enable the comprehensive calibration of the WFIRST color-redshift relation.
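A minimal self-organizing map, the core of the color-cell approach described above, can be written in a few dozen lines of NumPy. The sketch below is generic: the grid size, learning-rate schedule, and the idea of feeding it galaxy colors are assumptions for illustration, not the trained LSST+WFIRST SOM used in the poster.

import numpy as np

def train_som(colors, grid=(20, 20), epochs=10, lr0=0.5, sigma0=3.0, seed=0):
    """colors: (N, d) array of galaxy colors; returns (grid[0], grid[1], d) cell weights."""
    rng = np.random.default_rng(seed)
    weights = rng.normal(size=grid + (colors.shape[1],))
    gy, gx = np.mgrid[0:grid[0], 0:grid[1]]
    n_steps, step = epochs * len(colors), 0
    for _ in range(epochs):
        for x in rng.permutation(colors):
            frac = step / n_steps
            lr, sigma = lr0 * (1.0 - frac), sigma0 * (1.0 - frac) + 0.5
            # best-matching unit: cell whose weights are closest to this galaxy
            d2 = ((weights - x) ** 2).sum(axis=-1)
            by, bx = np.unravel_index(d2.argmin(), grid)
            # neighborhood function pulls nearby cells toward the galaxy's colors
            h = np.exp(-((gy - by) ** 2 + (gx - bx) ** 2) / (2.0 * sigma ** 2))
            weights += lr * h[..., None] * (x - weights)
            step += 1
    return weights

def map_to_cell(weights, x):
    """Return the (row, col) of the SOM cell a galaxy's colors fall into."""
    d2 = ((weights - x) ** 2).sum(axis=-1)
    return np.unravel_index(d2.argmin(), weights.shape[:2])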
NASA Astrophysics Data System (ADS)
Mróz, Przemek; Poleski, Radosław
2018-04-01
We use three-dimensional distributions of classical Cepheids and RR Lyrae stars in the Small Magellanic Cloud (SMC) to model the stellar density distribution of a young and old stellar population in that galaxy. We use these models to estimate the microlensing self-lensing optical depth to the SMC, which is in excellent agreement with the observations. Our models are consistent with the total stellar mass of the SMC being about 1.0 × 10^9 M⊙ under the assumption that all microlensing events toward this galaxy are caused by self-lensing. We also calculate the expected event rates and estimate that future large-scale surveys, like the Large Synoptic Survey Telescope (LSST), will be able to detect up to a few dozen microlensing events in the SMC annually. If the planet frequency in the SMC is similar to that in the Milky Way, a few extragalactic planets can be detected over the course of the LSST survey, provided significant changes in the SMC observing strategy are devised. A relatively small investment of LSST resources can give us a unique probe of the population of extragalactic exoplanets.
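For context, the microlensing optical depth toward a source at distance D_S takes the standard form (written here for a single source; the self-lensing estimate averages this over the modeled source distribution):

\tau = \frac{4\pi G}{c^2}\int_0^{D_S} \rho(D_L)\,\frac{D_L\,(D_S - D_L)}{D_S}\,dD_L,

where \rho(D_L) is the density of lenses along the line of sight.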
The Hyper Suprime-Cam software pipeline
NASA Astrophysics Data System (ADS)
Bosch, James; Armstrong, Robert; Bickerton, Steven; Furusawa, Hisanori; Ikeda, Hiroyuki; Koike, Michitaro; Lupton, Robert; Mineo, Sogo; Price, Paul; Takata, Tadafumi; Tanaka, Masayuki; Yasuda, Naoki; AlSayyad, Yusra; Becker, Andrew C.; Coulton, William; Coupon, Jean; Garmilla, Jose; Huang, Song; Krughoff, K. Simon; Lang, Dustin; Leauthaud, Alexie; Lim, Kian-Tat; Lust, Nate B.; MacArthur, Lauren A.; Mandelbaum, Rachel; Miyatake, Hironao; Miyazaki, Satoshi; Murata, Ryoma; More, Surhud; Okura, Yuki; Owen, Russell; Swinbank, John D.; Strauss, Michael A.; Yamada, Yoshihiko; Yamanoi, Hitomi
2018-01-01
In this paper, we describe the optical imaging data processing pipeline developed for the Subaru Telescope's Hyper Suprime-Cam (HSC) instrument. The HSC Pipeline builds on the prototype pipeline being developed by the Large Synoptic Survey Telescope's Data Management system, adding customizations for HSC, large-scale processing capabilities, and novel algorithms that have since been reincorporated into the LSST codebase. While designed primarily to reduce HSC Subaru Strategic Program (SSP) data, it is also the recommended pipeline for reducing general-observer HSC data. The HSC pipeline includes high-level processing steps that generate coadded images and science-ready catalogs as well as low-level detrending and image characterizations.
An automated system to measure the quantum efficiency of CCDs for astronomy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coles, R.; Chiang, J.; Cinabro, D.
2017-04-18
We describe a system to measure the Quantum Efficiency, in the wavelength range of 300 nm to 1100 nm, of 40 × 40 mm n-channel CCD sensors for the construction of the 3.2 gigapixel LSST focal plane. The technique uses a series of instruments to create a very uniform flux of photons of controllable intensity in the wavelength range of interest across the face of the sensor. This allows the absolute Quantum Efficiency to be measured with an accuracy in the 1% range. Finally, this system will be part of a production facility at Brookhaven National Lab for the basic component of the LSST camera.
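The measured quantity is simply the ratio of detected photoelectrons to incident photons at each wavelength; with the incident flux tied to a calibrated reference, schematically (our notation):

\mathrm{QE}(\lambda) = \frac{N_{e^-}(\lambda)}{N_\gamma(\lambda)}, \qquad N_\gamma(\lambda) = \frac{P(\lambda)\,t\,\lambda}{h\,c},

where P(\lambda) is the optical power reaching the sensor and t the exposure time.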
Fast force actuators for LSST primary/tertiary mirror
NASA Astrophysics Data System (ADS)
Hileman, Edward; Warner, Michael; Wiecha, Oliver
2010-07-01
The very short slew times and resulting high inertial loads imposed upon the Large Synoptic Survey Telescope (LSST) create new challenges for the primary mirror support actuators. Traditionally, large borosilicate mirrors are supported by pneumatic systems, which is also the case for the LSST. These force-based actuators bear the weight of the mirror and provide active figure correction, but do not define the mirror position. A set of six locating actuators (hardpoints) arranged in a hexapod fashion serves to locate the mirror. The stringent dynamic requirements demand that the force actuators be able to counteract, in real time, the dynamic forces on the hardpoints during slewing to prevent excessive hardpoint loads. The support actuators must also maintain the prescribed forces accurately during tracking to maintain acceptable mirror figure. To meet these requirements, candidate pneumatic cylinders incorporating force feedback control and high-speed servo valves are being tested using custom instrumentation with automatic data recording. Comparative charts are produced showing details of friction, hysteresis cycles, operating bandwidth, and temperature dependency. Extremely low-power actuator controllers are being developed to avoid heat dissipation in critical portions of the mirror and also to allow for increased control capabilities at the actuator level, thus improving the safety, performance, and flexibility of the support system.
Machine-learning-based Brokers for Real-time Classification of the LSST Alert Stream
NASA Astrophysics Data System (ADS)
Narayan, Gautham; Zaidi, Tayeb; Soraisam, Monika D.; Wang, Zhe; Lochner, Michelle; Matheson, Thomas; Saha, Abhijit; Yang, Shuo; Zhao, Zhenge; Kececioglu, John; Scheidegger, Carlos; Snodgrass, Richard T.; Axelrod, Tim; Jenness, Tim; Maier, Robert S.; Ridgway, Stephen T.; Seaman, Robert L.; Evans, Eric Michael; Singh, Navdeep; Taylor, Clark; Toeniskoetter, Jackson; Welch, Eric; Zhu, Songzhe; The ANTARES Collaboration
2018-05-01
The unprecedented volume and rate of transient events that will be discovered by the Large Synoptic Survey Telescope (LSST) demand that the astronomical community update its follow-up paradigm. Alert brokers—automated software systems that sift through, characterize, annotate, and prioritize events for follow-up—will be critical tools for managing alert streams in the LSST era. The Arizona-NOAO Temporal Analysis and Response to Events System (ANTARES) is one such broker. In this work, we develop a machine learning pipeline to characterize and classify variable and transient sources using only the available multiband optical photometry. We describe three illustrative stages of the pipeline, serving the three goals of early, intermediate, and retrospective classification of alerts. The first takes the form of variable versus transient categorization, the second a multiclass typing of the combined variable and transient data set, and the third a purity-driven subtyping of a transient class. Although several similar algorithms have proven themselves in simulations, we validate their performance on real observations for the first time. We quantitatively evaluate our pipeline on sparse, unevenly sampled, heteroskedastic data from various existing observational campaigns, and demonstrate very competitive classification performance. We describe our progress toward adapting the pipeline developed in this work into a real-time broker working on live alert streams from time-domain surveys.
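To make the first (variable versus transient) stage concrete, the Python sketch below extracts a few hand-built summary features from an unevenly sampled light curve and feeds them to a random forest. The feature set and classifier choice are illustrative assumptions, not the ANTARES pipeline itself.

import numpy as np
from scipy.stats import skew
from sklearn.ensemble import RandomForestClassifier

def light_curve_features(times, mags, errs):
    """Simple summary statistics for one unevenly sampled light curve."""
    amp = np.ptp(mags)                       # peak-to-peak amplitude
    std = np.std(mags)
    med_err = np.median(errs)
    significance = amp / max(med_err, 1e-6)  # amplitude relative to typical error
    span = times.max() - times.min()
    return np.array([amp, std, significance, skew(mags), span])

def train_stage1(feature_rows, labels):
    """Stage 1: variable (label 0) versus transient (label 1) categorization."""
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(np.vstack(feature_rows), labels)
    return clf

# For a new alert: p_transient = clf.predict_proba([light_curve_features(t, m, e)])[:, 1]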
Gamma Ray Bursts as Cosmological Probes with EXIST
NASA Astrophysics Data System (ADS)
Hartmann, Dieter; EXIST Team
2006-12-01
The EXIST mission, studied as a Black Hole Finder Probe within NASA's Beyond Einstein Program, would, in its current design, trigger on 1000 Gamma Ray Bursts (GRBs) per year (Grindlay et al., this meeting). The redshift distribution of these GRBs, using results from Swift as a guide, would probe the z > 7 epoch at an event rate of > 50 per year. These bursts trace early cosmic star formation history, point to a first generation of stellar objects that reionize the universe, and provide bright beacons for absorption line studies with ground- and space-based observatories. We discuss how EXIST, in conjunction with other space missions and future large survey programs such as LSST, can be utilized to advance our understanding of cosmic chemical evolution, the structure and evolution of the baryonic cosmic web, and the formation of stars in low metallicity environments.
IAC level "O" program development
NASA Technical Reports Server (NTRS)
Vos, R. G.
1982-01-01
The current status of the IAC development activity is summarized. The listed prototype software and documentation were delivered, and details were planned for development of the level 1 operational system. The planned end-product IAC is required to support LSST design analysis and performance evaluation, with emphasis on the coupling of the required technical disciplines. The long-term IAC effectively provides two distinct features: a specific set of analysis modules (thermal, structural, controls, antenna radiation performance and instrument optical performance) that will function together with the IAC supporting software in an integrated and user-friendly manner; and a general framework whereby new analysis modules can readily be incorporated into IAC or be allowed to communicate with it.
NASA Astrophysics Data System (ADS)
Huijse, Pablo; Estévez, Pablo A.; Förster, Francisco; Daniel, Scott F.; Connolly, Andrew J.; Protopapas, Pavlos; Carrasco, Rodrigo; Príncipe, José C.
2018-05-01
The Large Synoptic Survey Telescope (LSST) will produce an unprecedented amount of light curves using six optical bands. Robust and efficient methods that can aggregate data from multidimensional sparsely sampled time-series are needed. In this paper we present a new method for light curve period estimation based on quadratic mutual information (QMI). The proposed method does not assume a particular model for the light curve nor its underlying probability density and it is robust to non-Gaussian noise and outliers. By combining the QMI from several bands the true period can be estimated even when no single-band QMI yields the period. Period recovery performance as a function of average magnitude and sample size is measured using 30,000 synthetic multiband light curves of RR Lyrae and Cepheid variables generated by the LSST Operations and Catalog simulators. The results show that aggregating information from several bands is highly beneficial in LSST sparsely sampled time-series, obtaining an absolute increase in period recovery rate up to 50%. We also show that the QMI is more robust to noise and light curve length (sample size) than the multiband generalizations of the Lomb–Scargle and AoV periodograms, recovering the true period in 10%–30% more cases than its competitors. A python package containing efficient Cython implementations of the QMI and other methods is provided.
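For orientation, a far simpler single-band period search than the QMI estimator described above is brute-force phase-dispersion minimization over a grid of trial periods, sketched below in Python. This is a generic baseline, not the paper's method and not its multiband aggregation.

import numpy as np

def phase_dispersion(times, mags, period, n_bins=10):
    """Mean within-bin variance of the phase-folded light curve (lower is better)."""
    phase = (times / period) % 1.0
    bins = np.minimum((phase * n_bins).astype(int), n_bins - 1)
    disp = [np.var(mags[bins == b]) for b in range(n_bins) if np.any(bins == b)]
    return np.mean(disp)

def best_period(times, mags, trial_periods):
    scores = [phase_dispersion(times, mags, p) for p in trial_periods]
    return trial_periods[int(np.argmin(scores))]

# Example usage (for an RR Lyrae-like star, periods in days):
# trial_periods = np.linspace(0.1, 1.0, 20000)
# p_est = best_period(times, mags, trial_periods)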
A Euclid, LSST and WFIRST Joint Processing Study
NASA Astrophysics Data System (ADS)
Chary, Ranga-Ram; Joint Processing Working Group
2018-01-01
Euclid, LSST and WFIRST are the flagship cosmological projects of the next decade. By mapping several thousand square degrees of sky and covering the electromagnetic spectrum from the optical to the NIR with (sub-)arcsec resolution, these projects will provide exciting new constraints on the nature of dark energy and dark matter. The ultimate cosmological, astrophysical and time-domain science yield from these missions, which will detect several billions of sources, requires joint processing at the pixel-level. Three U.S. agencies (DOE, NASA and NSF) are supporting an 18-month study which aims to 1) assess the optimal techniques to combine these, and ancillary data sets at the pixel level; 2) investigate options for an interface that will enable community access to the joint data products; and 3) identify the computing and networking infrastructure to properly handle and manipulate these large datasets together. A Joint Processing Working Group (JPWG) is carrying out this study and consists of US-based members from the community and science/data processing centers of each of these projects. Coordination with European partners is envisioned in the future and European Euclid members are involved in the JPWG as observers. The JPWG will scope the effort and resources required to build up the capabilities to support scientific investigations using joint processing in time for the start of science surveys by LSST and Euclid.
Solar System Science with LSST
NASA Astrophysics Data System (ADS)
Jones, R. L.; Chesley, S. R.; Connolly, A. J.; Harris, A. W.; Ivezic, Z.; Knezevic, Z.; Kubica, J.; Milani, A.; Trilling, D. E.
2008-09-01
The Large Synoptic Survey Telescope (LSST) will provide a unique tool to study moving objects throughout the solar system, creating massive catalogs of Near Earth Objects (NEOs), asteroids, Trojans, TransNeptunian Objects (TNOs), comets and planetary satellites with well-measured orbits and high quality, multi-color photometry accurate to 0.005 magnitudes for the brightest objects. In the baseline LSST observing plan, back-to-back 15-second images will reach a limiting magnitude as faint as r=24.7 in each 9.6 square degree image, twice per night; a total of approximately 15,000 square degrees of the sky will be imaged in multiple filters every 3 nights. This time sampling will continue throughout each lunation, creating a huge database of observations.
[Fig. 1: Sky coverage of LSST over 10 years; separate panels for each of the 6 LSST filters. Color bars indicate the number of observations in each filter.]
The catalogs will include more than 80% of the potentially hazardous asteroids larger than 140 m in diameter within the first 10 years of LSST operation, millions of main-belt asteroids and perhaps 20,000 Trans-Neptunian Objects. Objects with diameters as small as 100 m in the Main Belt and <100 km in the Kuiper Belt can be detected in individual images. Specialized "deep drilling" observing sequences will detect KBOs down to tens of kilometers in diameter. Long period comets will be detected at larger distances than previously possible, constraining models of the Oort cloud. With the large number of objects expected in the catalogs, it may be possible to observe a pristine comet start outgassing on its first journey into the inner solar system. By observing fields over a wide range of ecliptic longitudes and latitudes, including large separations from the ecliptic plane, not only will these catalogs greatly increase the numbers of known objects, but the characterization of the inclination distributions of these populations will also be much improved. Derivation of proper elements for main belt and Trojan asteroids will allow ever finer resolution of asteroid families and their size-frequency distribution, as well as the study of the long-term dynamics of the individual asteroids and the asteroid belt as a whole.
[Fig. 2: Orbital parameters of Main Belt Asteroids, color-coded according to ugriz colors measured by SDSS. The left panel shows osculating elements, the right panel proper elements; note the asteroid families visible as clumps in parameter space [1].]
By obtaining multi-color ugrizy data for a substantial fraction of objects, relationships between color and dynamical history can be established. This will also enable taxonomic classification of asteroids, provide further links between diverse populations such as irregular satellites and TNOs or planetary Trojans, and enable estimates of asteroid diameter with rms uncertainty of 30%. With the addition of light-curve information, rotation periods and phase curves can be measured for large fractions of each population, leading to new insight on physical characteristics. Photometric variability information, together with sparse lightcurve inversion, will allow spin state and shape estimation for up to two orders of magnitude more objects than presently known. This will leverage physical studies of asteroids by constraining the size-strength relationship, which has important implications for the internal structure (solid, fractured, rubble pile) and in turn the collisional evolution of the asteroid belt. Similar information can be gained for other solar system bodies.
[1] Parker, A., Ivezic
Preparing for LSST with the LCOGT NEO Follow-up Network
NASA Astrophysics Data System (ADS)
Greenstreet, Sarah; Lister, Tim; Gomez, Edward
2016-10-01
The Las Cumbres Observatory Global Telescope Network (LCOGT) provides an ideal platform for follow-up and characterization of Solar System objects (e.g. asteroids, Kuiper Belt Objects, comets, Near-Earth Objects (NEOs)) and ultimately for the discovery of new objects. The LCOGT NEO Follow-up Network uses the LCOGT telescope network, together with a web-based system developed to perform prioritized target selection, scheduling, and data reduction, to confirm NEO candidates and characterize radar-targeted known NEOs. In order to determine how to maximize our NEO follow-up efforts, we must first define our goals for the LCOGT NEO Follow-up Network. This means answering the following questions. Should we follow up all objects brighter than some magnitude limit? Should we only focus on the brightest objects, or push to the limits of our capabilities by observing the faintest objects we think we can see and risk not finding the objects in our data? Do we (and how do we) prioritize objects somewhere in the middle of our observable magnitude range? If we want to push to faint objects, how do we minimize the amount of data in which the signal-to-noise ratio is too low to see the object? And how do we find a balance between performing follow-up and characterization observations? To help answer these questions, we have developed a LCOGT NEO Follow-up Network simulator that allows us to test our prioritization algorithms for target selection, confirm signal-to-noise predictions, and determine ideal block lengths and exposure times for observing NEO candidates. We will present our results from the simulator and progress on our NEO follow-up efforts. In the era of LSST, developing and utilizing infrastructure, such as the LCOGT NEO Follow-up Network and our web-based platform for selecting, scheduling, and reducing NEO observations, capable of handling the large number of detections expected to be produced daily by LSST will be critical to follow-up efforts. We hope our work can act as an example and tool for the community as together we prepare for the age of LSST.
Toroid Joining Gun. [thermoplastic welding system using induction heating
NASA Technical Reports Server (NTRS)
Buckley, J. D.; Fox, R. L.; Swaim, R J.
1985-01-01
The Toroid Joining Gun is a low-cost, self-contained, portable, low-powered (100-400 watts) thermoplastic welding system developed at Langley Research Center for joining plastic and composite parts using an induction heating technique. The device, developed for use in the fabrication of large space structures (LSST Program), can be used in any atmosphere or in a vacuum. Components can be joined in situ, whether on earth or on a space platform. The expanded application of this welding gun is in the joining of thermoplastic composites, thermosetting composites, metals, and combinations of these materials. Its low power requirements, light weight, rapid response, low cost, portability, and effective joining make it a candidate for solving many varied and unique bonding tasks.
In Pursuit of LSST Science Requirements: A Comparison of Photometry Algorithms
NASA Astrophysics Data System (ADS)
Becker, Andrew C.; Silvestri, Nicole M.; Owen, Russell E.; Ivezić, Željko; Lupton, Robert H.
2007-12-01
We have developed an end-to-end photometric data-processing pipeline to compare current photometric algorithms commonly used on ground-based imaging data. This test bed is exceedingly adaptable and enables us to perform many research and development tasks, including image subtraction and co-addition, object detection and measurements, the production of photometric catalogs, and the creation and stocking of database tables with time-series information. This testing has been undertaken to evaluate existing photometry algorithms for consideration by a next-generation image-processing pipeline for the Large Synoptic Survey Telescope (LSST). We outline the results of our tests for four packages: the Sloan Digital Sky Survey's Photo package, DAOPHOT and ALLFRAME, DOPHOT, and two versions of Source Extractor (SExtractor). The ability of these algorithms to perform point-source photometry, astrometry, shape measurements, and star-galaxy separation and to measure objects at low signal-to-noise ratio is quantified. We also perform a detailed crowded-field comparison of DAOPHOT and ALLFRAME, and profile the speed and memory requirements in detail for SExtractor. We find that both DAOPHOT and Photo are able to perform aperture photometry to high enough precision to meet LSST's science requirements, but perform less well at PSF-fitting photometry. Photo performs the best at simultaneous point- and extended-source shape and brightness measurements. SExtractor is the fastest algorithm, and recent upgrades in the software yield high-quality centroid and shape measurements with little bias toward faint magnitudes. ALLFRAME yields the best photometric results in crowded fields.
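For readers unfamiliar with the measurement being compared, the NumPy sketch below shows bare-bones aperture photometry: summing counts in a circular aperture and subtracting a local sky estimate from an annulus. The packages evaluated above do this, and PSF fitting, far more carefully; all radii and names here are arbitrary illustration.

import numpy as np

def aperture_photometry(image, x0, y0, r_ap=5.0, r_in=8.0, r_out=12.0):
    """Sum counts in a circular aperture minus the median sky from an annulus."""
    yy, xx = np.indices(image.shape)
    r = np.hypot(xx - x0, yy - y0)
    sky = np.median(image[(r >= r_in) & (r < r_out)])   # local background estimate
    ap = r < r_ap
    flux = image[ap].sum() - sky * ap.sum()
    return flux, sky

# counts, sky = aperture_photometry(img, x0=512.3, y0=498.7)
# mag = zero_point - 2.5 * np.log10(counts)   # zero_point comes from calibration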
FY-79 - development of fiber optics connector technology for large space systems
NASA Technical Reports Server (NTRS)
Campbell, T. G.
1980-01-01
The development of physical concepts for integrating fiber optic connectors and cables with structural concepts proposed for the LSST is discussed. Emphasis is placed on remote connections using integrated cables.
Projected Near-Earth Object Discovery Performance of the Large Synoptic Survey Telescope
NASA Technical Reports Server (NTRS)
Chesley, Steven R.; Veres, Peter
2017-01-01
This report describes the methodology and results of an assessment study of the performance of the Large Synoptic Survey Telescope (LSST) in its planned efforts to detect and catalog near-Earth objects (NEOs).
Searching for modified growth patterns with tomographic surveys
NASA Astrophysics Data System (ADS)
Zhao, Gong-Bo; Pogosian, Levon; Silvestri, Alessandra; Zylberberg, Joel
2009-04-01
In alternative theories of gravity, designed to produce cosmic acceleration at the current epoch, the growth of large scale structure can be modified. We study the potential of upcoming and future tomographic surveys such as Dark Energy Survey (DES) and Large Synoptic Survey Telescope (LSST), with the aid of cosmic microwave background (CMB) and supernovae data, to detect departures from the growth of cosmic structure expected within general relativity. We employ parametric forms to quantify the potential time- and scale-dependent variation of the effective gravitational constant and the differences between the two Newtonian potentials. We then apply the Fisher matrix technique to forecast the errors on the modified growth parameters from galaxy clustering, weak lensing, CMB, and their cross correlations across multiple photometric redshift bins. We find that even with conservative assumptions about the data, DES will produce nontrivial constraints on modified growth and that LSST will do significantly better.
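Schematically, one common parametrization of this kind (our notation; exact conventions vary between papers) introduces two free functions through the Poisson equation and the ratio of the metric potentials,

k^2\,\Psi = -4\pi G\,a^2\,\mu(k,a)\,\rho\,\Delta, \qquad \frac{\Phi}{\Psi} = \gamma(k,a),

so that \mu = \gamma = 1 recovers general relativity, and the Fisher forecast propagates the survey observables into errors on departures of (\mu, \gamma) from unity.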
Baseline design and requirements for the LSST rotating enclosure (dome)
NASA Astrophysics Data System (ADS)
Neill, D. R.; DeVries, J.; Hileman, E.; Sebag, J.; Gressler, W.; Wiecha, O.; Andrew, J.; Schoening, W.
2014-07-01
The Large Synoptic Survey Telescope (LSST) is a large (8.4 meter) wide-field (3.5 degree) survey telescope, which will be located on the Cerro Pachón summit in Chile. As a result of the wide field of view, its optical system is unusually susceptible to stray light; consequently, besides protecting the telescope from the environment, the rotating enclosure (Dome) also provides indispensable light baffling. All dome vents are covered with light baffles which simultaneously provide both essential dome flushing and stray light attenuation. The wind screen also (and primarily) functions as a light screen, providing only a minimum clear aperture. Since the dome must operate continuously, and the drives produce significant heat, they are located on the fixed lower enclosure to facilitate glycol water cooling. To accommodate daytime thermal control, a duct system channels cooling air provided by the facility when the dome is in its parked position.
LSST communications middleware implementation
NASA Astrophysics Data System (ADS)
Mills, Dave; Schumacher, German; Lotz, Paul
2016-07-01
The LSST communications middleware is based on a set of software abstractions which provide standard interfaces for common communications services. The observatory requires communication between diverse subsystems, implemented by different contractors, and comprehensive archiving of subsystem status data. The Service Abstraction Layer (SAL) is implemented using open source packages that implement the open standards of DDS (Data Distribution Service) for data communication and SQL (Structured Query Language) for database access. For every subsystem, abstractions for each of the telemetry datastreams, along with Command/Response and Events, have been agreed with the appropriate component vendor (such as Dome, TMA, Hexapod) and captured in ICDs (Interface Control Documents). The OpenSplice (Prismtech) Community Edition of DDS provides an LGPL-licensed distribution which may be freely redistributed. The availability of the full source code provides assurance that the project will be able to maintain it over the full 10-year survey, independent of the fortunes of the original providers.
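The publish/subscribe pattern that the SAL exposes over DDS can be caricatured with the small in-process Python sketch below. The class, the topic name, and the sample fields are purely illustrative and are not the generated SAL interfaces, which are produced per subsystem from the ICDs.

from collections import defaultdict

class MessageBus:
    """A toy stand-in for a DDS-style topic bus."""
    def __init__(self):
        self._subscribers = defaultdict(list)
    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)
    def publish(self, topic, sample):
        for callback in self._subscribers[topic]:
            callback(sample)

bus = MessageBus()
bus.subscribe("dome_azimuth",                  # hypothetical telemetry topic
              lambda sample: print("archive:", sample))
bus.publish("dome_azimuth", {"timestamp": 0.0, "position_deg": 123.4})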
LSST camera readout chip ASPIC: test tools
NASA Astrophysics Data System (ADS)
Antilogus, P.; Bailly, Ph; Jeglot, J.; Juramy, C.; Lebbolo, H.; Martin, D.; Moniez, M.; Tocut, V.; Wicek, F.
2012-02-01
The LSST camera will have more than 3000 video-processing channels. The readout of this large focal plane requires a very compact readout chain. The Correlated Double Sampling technique, which is generally used for the signal readout of CCDs, is also adopted for this application and implemented with the so-called "dual slope integrator" method. We have designed and implemented an ASIC for LSST: the Analog Signal Processing asIC (ASPIC). The goal is to amplify the signal close to the output, in order to maximize the signal-to-noise ratio, and to send differential outputs to the digitization stage. Other requirements are that each chip should process the output of half a CCD (8 channels) and operate at 173 K. A specific back-end board has been designed especially for lab test purposes. It manages the clock signals, digitizes the differential analog outputs of the ASPIC, and stores the data in memory. It contains 8 ADCs (18 bits), 512 kwords of memory and a USB interface. An FPGA manages all signals from/to the components on board and generates the timing sequence for the ASPIC; its firmware is written in Verilog and VHDL. Internal registers define the various test parameters of the ASPIC, and a LabVIEW GUI loads or updates these registers and checks for proper operation. Several series of tests, including linearity, noise and crosstalk, have been performed over the past year to characterize the ASPIC at room and cold temperatures. At present, the ASPIC, back-end board and CCD detectors are being integrated to perform a characterization of the whole readout chain.
LSST Telescope and Optics Status
NASA Astrophysics Data System (ADS)
Krabbendam, Victor; Gressler, W. J.; Andrew, J. R.; Barr, J. D.; DeVries, J.; Hileman, E.; Liang, M.; Neill, D. R.; Sebag, J.; Wiecha, O.; LSST Collaboration
2011-01-01
The LSST Project continues to advance the design and development of an observatory system capable of capturing 20,000 deg^2 of the sky in six wavebands over ten years. Optical fabrication of the unique M1/M3 monolithic mirror has entered final front-surface optical processing. After substantial grinding to remove 5 tons of excess glass above the M3 surface, a residual of the single spin casting, both distinct optical surfaces are now clearly evident. Loose abrasive grinding has begun; polishing is to occur during 2011, and final optical testing is planned for early 2012. The M1/M3 telescope cell and internal component designs have matured to support on-telescope operational requirements and off-telescope coating needs. The mirror position system (hardpoint actuators) and mirror support system (figure actuators) designs have developed through internal laboratory analysis and testing. Review of thermal requirements has assisted with definition of a thermal conditioning and control system. Pre-cooling the M1/M3 substrate will enable productive observing during the large temperature swing often seen at twilight. The M2 ULE™ substrate is complete and lies in storage awaiting additional funding to enable final optical polishing. This 3.5 m diameter, 100 mm thick meniscus substrate has been ground to within 40 microns of the final figure. Detailed design of the telescope mount, including subflooring, has been developed. Finally, substantial progress has been achieved on the facility design. In early 2010, LSST contracted with ARCADIS Geotecnica Consultores, a Santiago-based engineering firm, to lead the formal architectural design effort for the summit facility.
Near-Earth Object Orbit Linking with the Large Synoptic Survey Telescope
NASA Astrophysics Data System (ADS)
Vereš, Peter; Chesley, Steven R.
2017-07-01
We have conducted a detailed simulation of the ability of the Large Synoptic Survey Telescope (LSST) to link near-Earth and main belt asteroid detections into orbits. The key elements of the study were a high-fidelity detection model and the presence of false detections in the form of both statistical noise and difference image artifacts. We employed the Moving Object Processing System (MOPS) to generate tracklets, tracks, and orbits with a realistic detection density for one month of the LSST survey. The main goals of the study were to understand whether (a) the linking of near-Earth objects (NEOs) into orbits can succeed in a realistic survey, (b) the number of false tracks and orbits will be manageable, and (c) the accuracy of linked orbits would be sufficient for automated processing of discoveries and attributions. We found that the overall density of asteroids was more than 5000 per LSST field near opposition on the ecliptic, plus up to 3000 false detections per field in good seeing. We achieved 93.6% NEO linking efficiency for H < 22 on tracks composed of tracklets from at least three distinct nights within a 12-day interval. The derived NEO catalog comprised 96% correct linkages. Less than 0.1% of orbits included false detections, and the remainder of false linkages stemmed from main belt confusion, which was an artifact of the short time span of the simulation. The MOPS linking efficiency can be improved by refined attribution of detections to known objects and by improved tuning of the internal kd-tree linking algorithms.
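The tracklet stage of a MOPS-like linker can be illustrated with a toy flat-sky sketch. The snippet below pairs same-night detections whose implied angular rate is below a cap using a kd-tree; the detection arrays and rate limits are hypothetical, and MOPS itself is substantially more sophisticated.

```python
import numpy as np
from scipy.spatial import cKDTree

def make_tracklets(ra, dec, t, max_rate_deg_per_day=2.0, max_dt_days=0.05):
    """Pair same-night detections into candidate tracklets.

    A pair (i, j) is kept if the implied angular rate is below max_rate and the
    time separation is below max_dt. This is a toy flat-sky version of the
    tracklet stage only; track building and orbit fitting are not shown.
    """
    pts = np.column_stack([ra, dec])
    tree = cKDTree(pts)
    max_sep = max_rate_deg_per_day * max_dt_days   # largest allowed sky motion
    tracklets = []
    for i, j in tree.query_pairs(r=max_sep):
        dt = abs(t[j] - t[i])
        if 0 < dt <= max_dt_days:
            tracklets.append((i, j))
    return tracklets

# Hypothetical detections: (ra, dec) in degrees, t in days
ra  = np.array([10.00, 10.01, 10.50, 10.51])
dec = np.array([-5.00, -5.01, -5.20, -5.21])
t   = np.array([0.00, 0.02, 0.00, 0.02])
print(make_tracklets(ra, dec, t))   # two tracklets: (0, 1) and (2, 3)
```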
Multi-Wavelength Spectroscopy of Tidal Disruption Flares: A Legacy Sample for the LSST Era
NASA Astrophysics Data System (ADS)
Cenko, Stephen
2017-08-01
When a star passes within the sphere of disruption of a massive black hole, tidal forces will overcome self-gravity and unbind the star. While approximately half of the stellar debris is ejected at high velocities, the remaining material stays bound to the black hole and accretes, resulting in a luminous, long-lived transient known as a tidal disruption flare (TDF). In addition to serving as unique laboratories for accretion physics, TDFs offer the hope of measuring black hole masses in galaxies much too distant for resolved kinematic studies. In order to realize this potential, we must better understand the detailed processes by which the bound debris circularizes and forms an accretion disk. Spectroscopy is critical to this effort, as emission and absorption line diagnostics provide insight into the location and physical state (velocity, density, composition) of the emitting gas (in analogy with quasars). UV spectra are particularly critical, as most strong atomic features fall in this bandpass, and high-redshift TDF discoveries from LSST will sample rest-frame UV wavelengths. Here we propose to obtain a sequence of UV (HST) and optical (Gemini/GMOS) spectra for a sample of 5 TDFs discovered by the Zwicky Transient Facility, doubling the number of TDFs with UV spectra. Our observations will directly test models for the generation of the UV/optical emission (circularization vs. reprocessing) by searching for outflows and measuring densities, temperatures, and composition as a function of time. This effort is critical to developing the framework by which we can infer black hole properties (e.g., mass) from LSST TDF discoveries.
NASA Technical Reports Server (NTRS)
1980-01-01
The results of the Large Space Systems Technology special emphasis task are presented. The task was an analysis of structural requirements deriving from the initial Phase A Operational Geostationary Platform study.
Satellite Power Systems (SPS). LSST systems and integration task for SPS flight test article
NASA Technical Reports Server (NTRS)
Greenberg, H. S.
1981-01-01
This research activity emphasizes the systems definition and resulting structural requirements for the primary structure of two potential SPS large space structure test articles. These test articles represent potential steps in the SPS research and technology development.
A warm Spitzer survey of the LSST/DES 'Deep drilling' fields
NASA Astrophysics Data System (ADS)
Lacy, Mark; Farrah, Duncan; Brandt, Niel; Sako, Masao; Richards, Gordon; Norris, Ray; Ridgway, Susan; Afonso, Jose; Brunner, Robert; Clements, Dave; Cooray, Asantha; Covone, Giovanni; D'Andrea, Chris; Dickinson, Mark; Ferguson, Harry; Frieman, Joshua; Gupta, Ravi; Hatziminaoglou, Evanthia; Jarvis, Matt; Kimball, Amy; Lubin, Lori; Mao, Minnie; Marchetti, Lucia; Mauduit, Jean-Christophe; Mei, Simona; Newman, Jeffrey; Nichol, Robert; Oliver, Seb; Perez-Fournon, Ismael; Pierre, Marguerite; Rottgering, Huub; Seymour, Nick; Smail, Ian; Surace, Jason; Thorman, Paul; Vaccari, Mattia; Verma, Aprajita; Wilson, Gillian; Wood-Vasey, Michael; Cane, Rachel; Wechsler, Risa; Martini, Paul; Evrard, August; McMahon, Richard; Borne, Kirk; Capozzi, Diego; Huang, Jiashang; Lagos, Claudia; Lidman, Chris; Maraston, Claudia; Pforr, Janine; Sajina, Anna; Somerville, Rachel; Strauss, Michael; Jones, Kristen; Barkhouse, Wayne; Cooper, Michael; Ballantyne, David; Jagannathan, Preshanth; Murphy, Eric; Pradoni, Isabella; Suntzeff, Nicholas; Covarrubias, Ricardo; Spitler, Lee
2014-12-01
We propose a warm Spitzer survey to microJy depth of the four predefined Deep Drilling Fields (DDFs) for the Large Synoptic Survey Telescope (LSST) (three of which are also deep drilling fields for the Dark Energy Survey (DES)). Imaging these fields with warm Spitzer is a key component of the overall success of these projects, which address the 'Physics of the Universe' theme of the Astro2010 decadal survey. With deep, accurate, near-infrared photometry from Spitzer in the DDFs, we will generate photometric redshift distributions to apply to the surveys as a whole. The DDFs are also the areas where the supernova searches of DES and LSST are concentrated, and deep Spitzer data are essential to obtain photometric redshifts, stellar masses and constraints on ages and metallicities for the >10,000 supernova host galaxies these surveys will find. This 'DEEPDRILL' survey will also address the 'Cosmic Dawn' goal of Astro2010 by being deep enough to find all the >10^11 solar mass galaxies within the survey area out to z~6. DEEPDRILL will complete the final 24.4 square degrees of imaging in the DDFs, which, when added to the 14 square degrees already imaged to this depth, will map a volume of 1 Gpc^3 at z>2. It will find ~100 galaxies with >10^11 solar masses at z~5 and ~40 protoclusters at z>2, providing targets for JWST that can be found in no other way. The Spitzer data, in conjunction with the multiwavelength surveys in these fields, ranging from X-ray through far-infrared and cm-radio, will comprise a unique legacy dataset for studies of galaxy evolution.
Stellar Populations and Nearby Galaxies with the LSST
NASA Astrophysics Data System (ADS)
Saha, Abhijit; Olsen, K.; Monet, D. G.; LSST Stellar Populations Collaboration
2009-01-01
The LSST will produce a multi-color map and photometric object catalog of half the sky to r=27.6 (AB mag; 5-sigma). Time-space sampling of each field spanning ten years will allow variability, proper motion and parallax measurements for objects brighter than r=24.7. As part of providing an unprecedented map of the Galaxy, the accurate multi-band photometry will permit photometric parallaxes, chemical abundances and a handle on ages via colors at turn-off for main-sequence (MS) stars at all distances within the Galaxy as well as in the Magellanic Clouds and dwarf satellites of the Milky Way. This will support comprehensive studies of star formation histories and chemical evolution for field stars. The structures of the Clouds and dwarf spheroidals will be traced with the MS stars, to equivalent surface densities fainter than 35 mag per square arcsecond. With geometric parallax accuracy of 1 milliarcsecond, comparable to HIPPARCOS but reaching more than 10 magnitudes fainter, a robust complete sample of solar neighborhood stars will be obtained. The LSST time sampling will identify and characterize variable stars of all types, from time scales of 1 hr to several years, a feast for variable star astrophysics. The combination of wide coverage, multi-band photometry, time sampling and parallax taken together will address several key problems: e.g., fine-tuning the extragalactic distance scale by examining properties of RR Lyraes and Cepheids as a function of parent populations, extending the faint end of the galaxy luminosity function by discovering galaxies through star-count density enhancements on degree scales, and identifying intergalactic stars through novae and Long Period Variables.
Properties of tree rings in LSST sensors
Park, H. Y.; Nomerotski, A.; Tsybychev, D.
2017-05-30
Images of uniformly illuminated sensors for the Large Synoptic Survey Telescope have circular periodic patterns with an appearance similar to tree rings. These patterns are caused by circularly symmetric variations of the dopant concentration in the monocrystal silicon boule induced by the manufacturing process. The non-uniform charge density results in parasitic electric fields inside the silicon sensor, which may distort the shapes of astronomical sources. Here, we analyzed data from fifteen LSST sensors produced by ITL to determine the main parameters of the tree rings, amplitude and period, and also their variability across the sensors tested at Brookhaven National Laboratory. The tree ring pattern has a weak dependence on wavelength: the ring amplitude gets smaller as the wavelength gets longer, since longer wavelengths penetrate deeper into the silicon. The tree ring amplitude grows toward the outer part of the wafer, from 0.1% to 1.0%, indicating that the resistivity variation is larger at larger radii.
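A minimal sketch of how a ring amplitude could be extracted from a flat-field image is given below, assuming the boule center position is known; the toy image and numbers are illustrative and do not reproduce the analysis used in the paper.

```python
import numpy as np

def radial_profile(flat, center):
    """Azimuthally averaged flat-field signal vs. distance from the boule center.

    Returns (radii, mean_signal) for every integer-radius bin that contains pixels.
    """
    y, x = np.indices(flat.shape)
    r = np.hypot(x - center[0], y - center[1]).astype(int)
    sums = np.bincount(r.ravel(), weights=flat.ravel())
    counts = np.bincount(r.ravel())
    good = counts > 0
    return np.flatnonzero(good), sums[good] / counts[good]

# Toy flat with a 1% sinusoidal "tree ring" imprint of ~100 px period,
# centered on a hypothetical boule axis far off the chip corner
yy, xx = np.indices((500, 500))
rr = np.hypot(xx + 2000, yy + 2000)
flat = 1.0 + 0.01 * np.sin(2 * np.pi * rr / 100.0)

radii, prof = radial_profile(flat, center=(-2000, -2000))
amplitude = 0.5 * (prof.max() - prof.min())
print(f"recovered ring amplitude ~ {amplitude:.2%}")   # ~1%
```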
scarlet: Source separation in multi-band images by Constrained Matrix Factorization
NASA Astrophysics Data System (ADS)
Melchior, Peter; Moolekamp, Fred; Jerdee, Maximilian; Armstrong, Robert; Sun, Ai-Lei; Bosch, James; Lupton, Robert
2018-03-01
SCARLET performs source separation (aka "deblending") on multi-band images. It is geared towards optical astronomy, where scenes are composed of stars and galaxies, but it is straightforward to apply it to other imaging data. Separation is achieved through a constrained matrix factorization, which models each source with a Spectral Energy Distribution (SED) and a non-parametric morphology, or multiple such components per source. The code performs forced photometry (with PSF matching if needed) using an optimal weight function given by the signal-to-noise weighted morphology across bands. The approach works well if the sources in the scene have different colors and can be further strengthened by imposing various additional constraints/priors on each source. Because of its generic utility, this package provides a stand-alone implementation that contains the core components of the source separation algorithm. However, the development of this package is part of the LSST Science Pipeline; the meas_deblender package contains a wrapper to implement the algorithms here for the LSST stack.
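The core idea of modeling a multi-band scene as per-source SEDs times per-source morphologies can be sketched with a plain non-negative matrix factorization; the snippet below uses scikit-learn on a toy two-source blend and omits the additional morphology constraints and priors that scarlet applies.

```python
import numpy as np
from sklearn.decomposition import NMF

# Hypothetical blended scene: 3 bands, 32x32 pixels, two overlapping sources
# with different colors (SEDs) and Gaussian morphologies.
ny = nx = 32
yy, xx = np.mgrid[:ny, :nx]
morph1 = np.exp(-((xx - 12) ** 2 + (yy - 16) ** 2) / (2 * 3.0 ** 2)).ravel()
morph2 = np.exp(-((xx - 20) ** 2 + (yy - 16) ** 2) / (2 * 3.0 ** 2)).ravel()
sed1, sed2 = np.array([1.0, 0.6, 0.3]), np.array([0.2, 0.5, 1.0])   # "blue" and "red"
cube = np.outer(sed1, morph1) + np.outer(sed2, morph2)               # (bands, pixels)

# Non-negative matrix factorization: cube ~ A @ S, with A holding per-source SEDs
# and S per-source morphologies. Non-negativity is the only constraint here;
# scarlet layers further constraints (e.g. monotonic, symmetric morphologies) on top.
model = NMF(n_components=2, init="nndsvda", max_iter=2000)
A = model.fit_transform(cube)     # (n_bands, n_sources)
S = model.components_             # (n_sources, n_pixels)
print("recovered SEDs (columns, up to scale):\n", A / A.max(axis=0))
```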
A study of astrometric distortions due to “tree rings” in CCD sensors using LSST Photon Simulator
Beamer, Benjamin; Nomerotski, Andrei; Tsybychev, Dmitri
2015-05-22
Imperfections in the production process of thick CCDs lead to circularly symmetric dopant concentration variations, which in turn produce electric fields transverse to the surface of the fully depleted CCD that displace the photogenerated charges. We use PhoSim, a Monte Carlo photon simulator, to explore and examine the likely impacts these dopant concentration variations will have on astrometric measurements in LSST. The scale and behavior of both the astrometric shifts imparted to point sources and the intensity variations in flat field images that result from these doping imperfections are similar to those previously observed in Dark Energy Camera CCDs, giving initial confirmation of PhoSim's model for these effects. In addition, organized shape distortions were observed as a result of the symmetric nature of these dopant variations: nominally round sources acquire a measurable ellipticity either aligned with or transverse to the radial direction of the dopant variation pattern.
Fringing in MonoCam Y4 filter images
Brooks, J.; Fisher-Levine, M.; Nomerotski, A.
2017-05-05
Here, we study the fringing patterns observed in MonoCam, a camera with a single Large Synoptic Survey Telescope (LSST) CCD sensor. Images were taken at the U.S. Naval Observatory in Flagstaff, Arizona (NOFS) employing its 1.3 m telescope and an LSST y4 filter. Fringing occurs due to the reflection of infrared light (700 nm or longer) from the bottom surface of the CCD, which constructively or destructively interferes with the incident light to produce a net "fringe" pattern superimposed on all images taken. Emission lines from the atmosphere, dominated by hydroxyl (OH) spectra, can change in their relative intensities as the night goes on, producing different fringe patterns in the images taken. We found through several methods that the general shape of the fringe patterns remained constant, though with slight changes in the amplitude and phase of the fringes. Lastly, we found that a superposition of fringes from two monochromatic lines taken in the lab offered a reasonable description of the sky data.
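As a rough illustration of the "superposition of two monochromatic fringes" description above, the sketch below fits a two-sinusoid model to a synthetic 1-D fringe cut; the amplitudes, spatial frequencies, and noise level are invented for the example, and the initial guesses assume the fringe frequencies are approximately known.

```python
import numpy as np
from scipy.optimize import curve_fit

def two_line_fringe(x, a1, k1, p1, a2, k2, p2, c):
    """Superposition of fringes from two monochromatic sky lines plus a constant."""
    return a1 * np.sin(k1 * x + p1) + a2 * np.sin(k2 * x + p2) + c

# Hypothetical 1-D cut across a background-subtracted sky frame, in ADU
x = np.linspace(0, 400, 400)
truth = (30, 0.11, 0.4, 18, 0.07, 2.0, 1000)
data = two_line_fringe(x, *truth) + np.random.default_rng(1).normal(0, 3, x.size)

# Start the fit at the (approximately known) fringe frequencies of the two lines
p0 = (25, 0.11, 0.0, 15, 0.07, 0.0, 995)
popt, _ = curve_fit(two_line_fringe, x, data, p0=p0)
print("fitted amplitudes and spatial frequencies:", popt[[0, 1, 3, 4]])
```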
The Large Synoptic Survey Telescope OCS and TCS models
NASA Astrophysics Data System (ADS)
Schumacher, German; Delgado, Francisco
2010-07-01
The Large Synoptic Survey Telescope (LSST) is envisioned as a system of systems with demanding science, technical, and operational requirements that must perform as a fully integrated unit. The design and implementation of such a system poses significant engineering challenges in requirements analysis, detailed interface definition, and studies of operational modes and control strategies. The OMG Systems Modeling Language (SysML) has been selected as the framework for the systems engineering analysis and documentation for the LSST. Models for the overall system architecture and different observatory subsystems have been built describing requirements, structure, interfaces and behavior. In this paper we show the models for the Observatory Control System (OCS) and the Telescope Control System (TCS), and how this methodology has helped clarify the design and requirements. In one common language, the relationships of the OCS, TCS, Camera, and Data Management subsystems are captured with models of their structure, behavior, requirements, and the traceability between them.
“Big Data” Teen Astronomy Cafes at NOAO
NASA Astrophysics Data System (ADS)
Pompea, Stephen; Walker, Constance E.
2018-01-01
The National Optical Astronomy Observatory has designed and implemented a prototype educational program to test and understand best practices for working with high school students and to promote an understanding of modern astronomy research, with its emphasis on large data sets, data tools, and visualization tools. This program, designed to cultivate the interest of talented youth in astronomy, is based on a teen science café model developed at Los Alamos as the Café Scientifique New Mexico. In our program, we provide a free, fun way for teens to explore current research topics in astronomy on Saturday mornings at the NOAO headquarters. The program encourages stimulating conversations with astronomers in an informal and relaxed setting, with free food, of course. The café is organized through a leadership team of local high school students and recruits students from all parts of the greater Tucson area. The high school students who attend have the opportunity to interact with expert astronomers working with large astronomical data sets on topics such as killer asteroids, the birth and death of stars, colliding galaxies, the structure of the universe, gravitational waves, gravitational lensing, dark energy, and dark matter. The students also have the opportunity to explore astronomical data sets and data tools using computers provided by the program. The program may serve as a model for educational outreach for the 40+ institutions involved in the LSST.
chroma: Chromatic effects for LSST weak lensing
NASA Astrophysics Data System (ADS)
Meyers, Joshua E.; Burchat, Patricia R.
2018-04-01
Chroma investigates biases originating from two chromatic effects in the atmosphere: differential chromatic refraction (DCR), and wavelength dependence of seeing. These biases arise when using the point spread function (PSF) measured with stars to estimate the shapes of galaxies with different spectral energy distributions (SEDs) than the stars.
LSST (Hoop/Column) Maypole Antenna Development Program, phase 1, part 1
NASA Technical Reports Server (NTRS)
Sullivan, M. R.
1982-01-01
The first phase of a two-phase program was performed to develop the technology necessary to evaluate, design, manufacture, package, transport and deploy the hoop/column deployable antenna reflector by means of a ground-based program. The hoop/column concept consists of a cable-stiffened, large-diameter hoop and a central column structure that supports and contours a radio-frequency reflective mesh surface. Mission scenarios for communications, radiometry, and radio astronomy were studied. Data were provided to establish technology drivers, resulting in the specification of a point design. The point design is a multiple-beam, quad-aperture offset antenna system which provides four separate offset areas of illumination on a 100 meter diameter symmetrical parent reflector. The periphery of the reflector is a hoop having 48 segments that articulate into a small stowed volume around a central extendable column. The hoop and column are structurally connected by graphite and quartz cables. The prominence of cables in the design resulted in the development of advanced cable technology. Design verification models were built of the hoop, column, and surface stowage subassemblies. Model designs were generated for a half-scale sector of the surface and a 1/6-scale model of the complete deployable reflector.
Near-Field Cosmology with Resolved Stellar Populations Around Local Volume LMC Stellar-Mass Galaxies
NASA Astrophysics Data System (ADS)
Carlin, Jeffrey L.; Sand, David J.; Willman, Beth; Brodie, Jean P.; Crnojevic, Denija; Forbes, Duncan; Hargis, Jonathan R.; Peter, Annika; Pucha, Ragadeepika; Romanowsky, Aaron J.; Spekkens, Kristine; Strader, Jay
2018-06-01
We discuss our ongoing observational program to comprehensively map the entire virial volumes of galaxies of roughly LMC stellar mass at distances of ~2-4 Mpc. The MADCASH (Magellanic Analog Dwarf Companions And Stellar Halos) survey will deliver the first census of the dwarf satellite populations and stellar halo properties within LMC-like environments in the Local Volume. Our results will inform our understanding of the recent DES discoveries of dwarf satellites tentatively affiliated with the LMC/SMC system. This program has already yielded the discovery of the faintest known dwarf galaxy satellite of an LMC stellar-mass host beyond the Local Group, based on deep Subaru+HyperSuprimeCam imaging reaching ~2 magnitudes below its TRGB, and at least two additional candidate satellites. We will summarize the survey results and status to date, highlighting some challenges encountered and lessons learned as we process the data for this program through a prototype LSST pipeline. Our program will examine whether LMC stellar-mass dwarfs have extended stellar halos, allowing us to assess the relative contributions of in-situ stars vs. merger debris to their stellar populations and halo density profiles. We outline the constraints on galaxy formation models that will be provided by our observations of low-mass galaxy halos and their satellites.
Wood-Vasey DOE #SC0011834 Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood-Vasey, William Michael
During the past reporting period (Year 3), this grant has provided partial support for graduate students Daniel Perrefort and Kara Ponder. They have been exploring different aspects of the technical work needed to take full advantage of the potential for cosmological inference using Type Ia supernovae (SNe Ia) with LSST.
Dwarf Hosts of Low-z Supernovae
NASA Astrophysics Data System (ADS)
Pyotr Kolobow, Craig; Perlman, Eric S.; Strolger, Louis
2018-01-01
Hostless supernovae (SNe), or SNe in dwarf galaxies, may serve as excellent beacons for probing the spatial density of dwarf galaxies (M < 10^8 M⊙), which themselves are scarcely detected beyond only a few Mpc. Depending on the assumed model for the stellar-mass-to-halo-mass relation for these galaxies, LSST might see thousands of SNe (of all types) from dwarf galaxies alone. Conversely, one can take the measured rates of these SNe and test the model predictions for the density of dwarf galaxies in the local universe. Current "all-sky" surveys, like PanSTARRS and ASAS-SN, are now finding hostless SNe at a number sufficient to measure their rate. What is missing is the appropriate weighting of their host luminosities. Here we seek to continue a successful program to recover the luminosities of these hostless SNe, out to z = 0.15, and to use their rate to constrain the faint-end slope of the low-z galaxy luminosity function.
Cosmic Visions Dark Energy: Small Projects Portfolio
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dawson, Kyle; Frieman, Josh; Heitmann, Katrin
Understanding cosmic acceleration is one of the key science drivers for astrophysics and high-energy physics in the coming decade (2014 P5 Report). With the Large Synoptic Survey Telescope (LSST) and the Dark Energy Spectroscopic Instrument (DESI) and other new facilities beginning operations soon, we are entering an exciting phase during which we expect an order of magnitude improvement in constraints on dark energy and the physics of the accelerating Universe. This is a key moment for a matching Small Projects portfolio that can (1) greatly enhance the science reach of these flagship projects, (2) have immediate scientific impact, and (3) lay the groundwork for the next stages of the Cosmic Frontier Dark Energy program. In this White Paper, we outline a balanced portfolio that can accomplish these goals through a combination of observational, experimental, and theory and simulation efforts.
LSST camera grid structure made out of ceramic composite material, HB-Cesic
NASA Astrophysics Data System (ADS)
Kroedel, Matthias R.; Langton, J. Bryan
2016-08-01
In this paper we present the ceramic design and fabrication of the camera grid structure, which uses the unique manufacturing features of the HB-Cesic technology together with a dedicated metrology device to ensure the challenging flatness requirement of 4 microns over the full array.
Deriving photometric redshifts using fuzzy archetypes and self-organizing maps - II. Implementation
NASA Astrophysics Data System (ADS)
Speagle, Joshua S.; Eisenstein, Daniel J.
2017-07-01
With an eye towards the computational requirements of future large-scale surveys such as Euclid and the Large Synoptic Survey Telescope (LSST), which will require photometric redshifts (photo-z's) for ≳ 10^9 objects, we investigate a variety of ways that 'fuzzy archetypes' can be used to improve photometric redshifts and explore their respective statistical interpretations. We characterize their relative performance using an idealized LSST ugrizY and Euclid YJH mock catalogue of 10,000 objects spanning z = 0-6 at Y = 24 mag. We find most schemes are able to robustly identify redshift probability distribution functions that are multimodal and/or poorly constrained. Once these objects are flagged and removed, the results are generally in good agreement with the strict accuracy requirements necessary to meet Euclid weak lensing goals for most redshifts between 0.8 ≲ z ≲ 2. These results demonstrate the statistical robustness and flexibility that can be gained by combining template-fitting and machine-learning methods and provide useful insights into how astronomers can further exploit the colour-redshift relation.
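As a generic illustration of exploiting the colour-redshift relation (not the fuzzy-archetype or self-organizing-map method of the paper), the sketch below trains a simple nearest-neighbour regressor on toy colours with known redshifts; the colour-redshift mapping and catalogue sizes are invented.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

# Hypothetical training set: five galaxy colours with known spectroscopic
# redshifts; in practice these would come from a mock ugrizY + YJH catalogue.
rng = np.random.default_rng(42)
z_train = rng.uniform(0, 3, 5000)
colors_train = np.column_stack(
    [np.sin(z_train + k) + 0.05 * rng.normal(size=z_train.size) for k in range(5)]
)   # toy, invertible colour-redshift relation plus photometric noise

photoz = KNeighborsRegressor(n_neighbors=20, weights="distance")
photoz.fit(colors_train, z_train)

# Noise-free test colours drawn from the same toy relation
z_test = rng.uniform(0, 3, 100)
colors_test = np.column_stack([np.sin(z_test + k) for k in range(5)])
z_phot = photoz.predict(colors_test)
print("scatter sigma(z_phot - z_spec) =", np.std(z_phot - z_test))
```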
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rasmussen, Andrew P.; Hale, Layton; Kim, Peter
Meeting the science goals for the Large Synoptic Survey Telescope (LSST) translates into a demanding set of imaging performance requirements for the optical system over a wide (3.5°) field of view. In turn, meeting those imaging requirements necessitates maintaining precise control of the focal plane surface (10 µm P-V) over the entire field of view (640 mm diameter) at the operating temperature (T ≈ -100 °C) and over the operational elevation angle range. We briefly describe the hierarchical design approach for the LSST Camera focal plane and the baseline design for assembling the flat focal plane at room temperature. Preliminary results of gravity load and thermal distortion calculations are provided, and early metrological verification of candidate materials under cold thermal conditions is presented. A detailed, generalized method for stitching together sparse metrology data originating from differential, non-contact metrological data acquisition spanning multiple (non-continuous) sensor surfaces making up the focal plane is described and demonstrated. Finally, we describe some in situ alignment verification alternatives, some of which may be integrated into the camera's focal plane.
NASA Astrophysics Data System (ADS)
Selvy, Brian M.; Claver, Charles; Willman, Beth; Petravick, Don; Johnson, Margaret; Reil, Kevin; Marshall, Stuart; Thomas, Sandrine; Lotz, Paul; Schumacher, German; Lim, Kian-Tat; Jenness, Tim; Jacoby, Suzanne; Emmons, Ben; Axelrod, Tim
2016-08-01
We provide an overview of the Model Based Systems Engineering (MBSE) language, tool, and methodology being used in our development of the Operational Plan for Large Synoptic Survey Telescope (LSST) operations. LSST's Systems Engineering (SE) team is using a model-based approach to operational plan development to: 1) capture the top-down stakeholders' needs and functional allocations defining the scope, required tasks, and personnel needed for operations, and 2) capture the bottom-up operations and maintenance activities required to conduct the LSST survey across its distributed operations sites for the full ten-year survey duration. To accomplish these complementary goals and ensure self-consistent results, we have developed a holistic approach using the Sparx Enterprise Architect modeling tool and Systems Modeling Language (SysML). This approach utilizes SysML Use Cases, Actors, associated relationships, and Activity Diagrams to document and refine all of the major operations and maintenance activities that will be required to successfully operate the observatory and meet stakeholder expectations. We have developed several customized extensions of the SysML language, including the creation of a custom stereotyped Use Case element with unique tagged values, as well as unique association connectors and Actor stereotypes. We demonstrate that this customized MBSE methodology enables us to define: 1) the roles each human Actor must take on to successfully carry out the activities associated with the Use Cases; 2) the skills each Actor must possess; 3) the functional allocation of all required stakeholder activities and Use Cases to organizational entities tasked with carrying them out; and 4) the organization structure required to successfully execute the operational survey. Our approach allows for continual refinement utilizing the systems engineering spiral method to expose finer levels of detail as necessary. For example, the bottom-up, Use Case-driven approach will be deployed in the future to develop the detailed work procedures required to successfully execute each operational activity.
Lower Boundary Forcing related to the Occurrence of Rain in the Tropical Western Pacific
NASA Astrophysics Data System (ADS)
Li, Y.; Carbone, R. E.
2013-12-01
Global weather and climate models have a long and somewhat tortured history with respect to simulation and prediction of tropical rainfall in the relative absence of balanced flow in the geostrophic sense. An important correlate with tropical rainfall is sea surface temperature (SST). The introduction of SST information to convective rainfall parameterization in global models has improved model climatologies of tropical oceanic rainfall. Nevertheless, large systematic errors have persisted, several of which are common to most atmospheric models. Models have evolved to the point where increased spatial resolution demands representation of the SST field at compatible temporal and spatial scales, leading to common usage of monthly SST fields at scales of 10-100 km. While large systematic errors persist, significant skill has been realized from various atmospheric and coupled ocean models, including assimilation of weekly or even daily SST fields, as tested by the European Centre for Medium-Range Weather Forecasts. A few investigators have explored the role of SST gradients in relation to the occurrence of precipitation. Some of this research has focused on large-scale gradients, mainly associated with surface ocean-atmosphere climatology. These studies conclude that lower boundary atmospheric convergence, under some conditions, could be substantially enhanced over SST gradients, destabilizing the atmosphere and thereby enabling moist convection. While the concept has a firm theoretical foundation, it has not gained a sizeable following far beyond the realm of western boundary currents. Li and Carbone (2012) examined the role of transient mesoscale (~100 km) SST gradients in the western Pacific warm pool by means of GHRSST and CMORPH rainfall data. They found that excitation of deep moist convection was strongly associated with the Laplacian of SST (LSST). Specifically, -LSST is associated with rainfall onset in 75% of 10,000 events over 4 years, whereas the background ocean is symmetric about zero Laplacian. This finding is fully consistent with theory for gradients of order ~1 °C in low mean wind conditions, capable of inducing atmospheric convergence of N × 10^-5 s^-1. We will present new findings resulting from the application of a Madden-Julian oscillation (MJO) passband filter to GHRSST/CMORPH data. It shows that the -LSST field organizes at scales of 1000-2000 km and can persist for periods of two weeks to 3 months. Such -LSST anomalies are in quadrature with MJO rainfall, tracking and leading the wet phase of the MJO by 10-14 days, from the Indian Ocean to the dateline. More generally, an evaluation of SST structure in rainfall production will be presented, which represents a decidedly alternative view to conventional wisdom. Li, Yanping, and R.E. Carbone, 2012: Excitation of Rainfall over the Tropical Western Pacific, J. Atmos. Sci., 69, 2983-2994.
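The Laplacian-of-SST diagnostic is straightforward to compute on a gridded field. The sketch below applies a five-point finite-difference Laplacian to a toy warm mesoscale anomaly; the grid spacing and anomaly parameters are illustrative only.

```python
import numpy as np

def laplacian(field, dx_km):
    """Five-point Laplacian of a gridded field (e.g., SST in degC on a ~10 km grid)."""
    return (np.roll(field, 1, 0) + np.roll(field, -1, 0) +
            np.roll(field, 1, 1) + np.roll(field, -1, 1) - 4 * field) / dx_km ** 2

# Toy SST field: a 1 degC warm mesoscale anomaly ~100 km across on a 10 km grid
x = np.linspace(-500, 500, 101)            # km
xx, yy = np.meshgrid(x, x)
sst = 28.0 + 1.0 * np.exp(-(xx ** 2 + yy ** 2) / (2 * 50.0 ** 2))

lap_sst = laplacian(sst, dx_km=10.0)
# The negative Laplacian (-LSST in the authors' notation) peaks over the warm
# anomaly, where boundary-layer convergence and convective onset are favored.
print("max(-LSST) =", (-lap_sst).max(), "degC / km^2")
```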
Astronomy development in Serbia in view of the IAU Strategic Plan
NASA Astrophysics Data System (ADS)
Atanacković, Olga
2015-03-01
An overview of astronomy development in Serbia in view of the goals envisaged by the IAU Strategic Plan is given. Due attention is paid to the recent reform of education at all levels. In the primary schools, several extra astronomy topics are introduced in the physics course. Attempts are made to reintroduce astronomy as a separate subject in the secondary schools. Special emphasis is put on the role and activities of the Petnica Science Center, the biggest center for informal education in SE Europe, and on the successful participation of the Serbian team in International Astronomy Olympiads. Astronomy topics are taught at all five state universities in Serbia. At the Universities of Belgrade and Novi Sad, students can enroll in astronomy from the first study year. Students train at the Ondrejov Observatory (Czech Republic) and at the astronomical station on Mount Vidojevica in southern Serbia. Astronomy research in Serbia is performed at the Astronomical Observatory, Belgrade and the Department of Astronomy, Faculty of Mathematics, University of Belgrade. There are about 70 researchers in astronomy in Serbia (and about as many abroad) who participate in eight projects financed by the Ministry of Education and Science and in several international cooperations and projects: SREAC, VAMDC, Belissima (recruitment of experienced expatriate researchers), Astromundus (a 2-year joint master program with four other European universities), and LSST. One of the goals in the near future is twinning between universities in the SEE region and worldwide. The ever-increasing activities of 20 amateur astronomical societies are also described.
DESCQA: Synthetic Sky Catalog Validation Framework
NASA Astrophysics Data System (ADS)
Mao, Yao-Yuan; Uram, Thomas D.; Zhou, Rongpu; Kovacs, Eve; Ricker, Paul M.; Kalmbach, J. Bryce; Padilla, Nelson; Lanusse, François; Zu, Ying; Tenneti, Ananth; Vikraman, Vinu; DeRose, Joseph
2018-04-01
The DESCQA framework provides rigorous validation protocols for assessing the quality of simulated sky catalogs in a straightforward and comprehensive way. DESCQA enables the inspection, validation, and comparison of an inhomogeneous set of synthetic catalogs via the provision of a common interface within an automated framework. An interactive web interface is also available at portal.nersc.gov/project/lsst/descqa.
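A schematic of the validation-test pattern that DESCQA formalizes is sketched below; the class name, catalog fields, reference data, and pass criterion are all hypothetical and do not reproduce the actual DESCQA interface.

```python
import numpy as np

class StellarMassFunctionTest:
    """Schematic validation test: compare a catalog's stellar mass function
    against reference data and return pass/fail. Names are illustrative only."""

    def __init__(self, reference_bins, reference_phi, tolerance=0.5):
        self.bins = reference_bins          # log10(M*/Msun) bin edges
        self.ref_phi = reference_phi        # reference number densities per bin
        self.tolerance = tolerance          # allowed fractional deviation

    def run(self, catalog, volume_mpc3):
        logm = np.log10(catalog["stellar_mass"])
        counts, _ = np.histogram(logm, bins=self.bins)
        phi = counts / volume_mpc3 / np.diff(self.bins)
        frac_dev = np.abs(phi - self.ref_phi) / self.ref_phi
        return bool(np.all(frac_dev < self.tolerance)), frac_dev

# Hypothetical synthetic catalog and matching flat reference
rng = np.random.default_rng(3)
catalog = {"stellar_mass": 10 ** rng.uniform(9, 11.5, 20000)}
bins = np.linspace(9, 11.5, 6)
ref = np.full(5, 20000 / 5 / 1e6 / np.diff(bins)[0])
passed, dev = StellarMassFunctionTest(bins, ref).run(catalog, volume_mpc3=1e6)
print("passed:", passed)
```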
Final acceptance testing of the LSST monolithic primary/tertiary mirror
NASA Astrophysics Data System (ADS)
Tuell, Michael T.; Burge, James H.; Cuerden, Brian; Gressler, William; Martin, Hubert M.; West, Steven C.; Zhao, Chunyu
2014-07-01
The Large Synoptic Survey Telescope (LSST) is a three-mirror wide-field survey telescope with the primary and tertiary mirrors on one monolithic substrate. This substrate is made of Ohara E6 borosilicate glass in a honeycomb sandwich, spin cast at the Steward Observatory Mirror Lab at The University of Arizona. Each surface is aspheric, with the specification given in terms of conic constant error, maximum active bending forces, and a structure function specification on the residual errors. High-order deformation terms carry no separate tolerance; any such error is treated as a surface error and is included in the structure function. The radii of curvature are very different, requiring two independent test stations, each with instantaneous phase-shifting interferometers with null correctors. The primary null corrector is a standard two-element Offner null lens. The tertiary null corrector is a phase-etched computer-generated hologram (CGH). This paper details the two optical systems and their tolerances, showing that the uncertainty in measuring the figure is a small fraction of the structure function specification. Additional metrology includes the radii of curvature, optical axis locations, and relative surface tilts. The methods for measuring these will also be described along with their tolerances.
On the Detectability of Interstellar Objects Like 1I/'Oumuamua
NASA Astrophysics Data System (ADS)
Ragozzine, Darin
2018-04-01
Almost since Oort's 1950 hypothesis of a tenuously bound cloud of comets, planetary formation theorists have realized that the process of planet formation must have ejected very large numbers of planetesimals into interstellar space. Unfortunately, these objects are distributed over galactic volumes, while they are only likely to be detectable if they pass within a few AU of Earth, resulting in an incredibly sparse detectable population. Furthermore, hypotheses for the formation and distribution of these bodies allow for uncertainties of orders of magnitude in the expected detection rate: our analysis suggested LSST would discover 0.01-100 objects during its lifetime (Cook et al. 2016). The discovery of 1I/'Oumuamua by a survey less powerful than LSST indicates either a low-probability event or that the properties of this population are at the more favorable end of the spectrum. We revisit the detailed detection analysis of Cook et al. (2016) in light of the detection of 1I/'Oumuamua. We use these results to better understand 1I/'Oumuamua and to update our assessment of future detections of interstellar objects. We highlight some key questions that can be answered only by additional discoveries.
Earth's Minimoons: Opportunities for Science and Technology.
NASA Astrophysics Data System (ADS)
Jedicke, Robert; Bolin, Bryce T.; Bottke, William F.; Chyba, Monique; Fedorets, Grigori; Granvik, Mikael; Jones, Lynne; Urrutxua, Hodei
2018-05-01
Twelve years ago the Catalina Sky Survey discovered Earth's first known natural geocentric object other than the Moon, a few-meter diameter asteroid designated \RH. Despite significant improvements in ground-based asteroid surveying technology in the past decade, these surveys have not discovered another temporarily-captured orbiter (TCO; colloquially known as a minimoon), but the all-sky fireball system operated in the Czech Republic as part of the European Fireball Network detected a bright natural meteor that was almost certainly in a geocentric orbit before it struck Earth's atmosphere. Within a few years the Large Synoptic Survey Telescope (LSST) will either begin to regularly detect TCOs or force a re-analysis of the creation and dynamical evolution of small asteroids in the inner solar system. The first studies of the provenance, properties, and dynamics of Earth's minimoons suggested that there should be a steady-state population with about one 1- to 2-meter diameter captured object at any time, with the number of captured meteoroids increasing exponentially for smaller sizes. That model was then improved and extended to include the population of temporarily-captured flybys (TCFs), objects that fail to make an entire revolution around Earth while energetically bound to the Earth-Moon system. Several different techniques for discovering TCOs have been considered, but their small diameters, proximity, and rapid motion make them challenging targets for existing ground-based optical, meteor, and radar surveys. However, the LSST's tremendous light-gathering power and short exposure times could allow it to detect and discover many minimoons. We expect that if the TCO population is confirmed, and new objects are frequently discovered, they can provide new opportunities for 1) studying the dynamics of the Earth-Moon system, 2) testing models of the production and dynamical evolution of small asteroids from the asteroid belt, 3) rapid and frequent low delta-v missions to multiple minimoons, and 4) evaluating in-situ resource utilization techniques on asteroidal material. Here we review the past decade of minimoon studies in preparation for capitalizing on the scientific and commercial opportunities of TCOs in the first decade of LSST operations.
NASA Astrophysics Data System (ADS)
Borne, K. D.; Fortson, L.; Gay, P.; Lintott, C.; Raddick, M. J.; Wallin, J.
2009-12-01
The remarkable success of Galaxy Zoo as a citizen science project for galaxy classification within a terascale astronomy data collection has led to the development of a broader collaboration, known as the Zooniverse. Activities will include astronomy, lunar science, solar science, and digital humanities. Some features of our program include development of a unified framework for citizen science projects, development of a common set of user-based research tools, engagement of the machine learning community to apply machine learning algorithms on the rich training data provided by citizen scientists, and extension across multiple research disciplines. The Zooniverse collaboration is just getting started, but already we are implementing a scientifically deep follow-on to Galaxy Zoo. This project, tentatively named Galaxy Merger Zoo, will engage users in running numerical simulations, whose input parameter space is voluminous and therefore demands a clever solution, such as allowing the citizen scientists to select their own sets of parameters, which then trigger new simulations of colliding galaxies. The user interface design has many of the engaging features that retain users, including rapid feedback, visually appealing graphics, and the sense of playing a competitive game for the benefit of science. We will discuss these topics. In addition, we will also describe applications of Citizen Science that are being considered for the petascale science project LSST (Large Synoptic Survey Telescope). LSST will produce a scientific data system that consists of a massive image archive (nearly 100 petabytes) and a similarly massive scientific parameter database (20-40 petabytes). Applications of Citizen Science for such an enormous data collection will enable greater scientific return in at least two ways. First, citizen scientists work with real data and perform authentic research tasks of value to the advancement of the science, providing "human computation" capabilities and resources to review, annotate, and explore aspects of the data that are too overwhelming for the science team. Second, citizen scientists' inputs (in the form of rich training data and class labels) can be used to improve the classifiers that the project team uses to classify and prioritize new events detected in the petascale data stream. This talk will review these topics and provide an update on the Zooniverse project.
NASA Astrophysics Data System (ADS)
2008-10-01
As floods and hurricanes disrupt the lives of people around the world, a new generation of scientific tools is supporting both storm preparedness and recovery. As the International Year of Astronomy 2009 approaches, the UK website is developing more features that make it easier to see what is planned for this science extravaganza.
The Impact of Microlensing on the Standardisation of Strongly Lensed Type Ia Supernovae
NASA Astrophysics Data System (ADS)
Foxley-Marrable, Max; Collett, Thomas E.; Vernardos, Georgios; Goldstein, Daniel A.; Bacon, David
2018-05-01
We investigate the effect of microlensing on the standardisation of strongly lensed Type Ia supernovae (GLSNe Ia). We present predictions for the amount of scatter induced by microlensing across a range of plausible strong lens macromodels. We find that lensed images in regions of low convergence, shear and stellar density are standardisable, where the microlensing scatter is ≲ 0.15 magnitudes, comparable to the intrinsic dispersion of a typical SN Ia. These standardisable configurations correspond to asymmetric lenses with an image located far outside the Einstein radius of the lens. Symmetric lenses and those with small Einstein radii (≲ 0.5 arcsec) are not standardisable. We apply our model to the recently discovered GLSN Ia iPTF16geu and find that the large discrepancy between the observed flux and the macromodel predictions from More et al. (2017) cannot be explained by microlensing alone. Using the mock GLSNe Ia catalogue of Goldstein et al. (2017), we predict that ~22% of GLSNe Ia discovered by LSST will be standardisable, with a median Einstein radius of 0.9 arcseconds and a median time-delay of 41 days. By breaking the mass-sheet degeneracy, the full LSST GLSNe Ia sample will be able to detect systematics in H0 at the 0.5% level.
Optical Variability and Classification of High Redshift (3.5 < z < 5.5) Quasars on SDSS Stripe 82
NASA Astrophysics Data System (ADS)
AlSayyad, Yusra; McGreer, Ian D.; Fan, Xiaohui; Connolly, Andrew J.; Ivezic, Zeljko; Becker, Andrew C.
2015-01-01
Recent studies have shown promise in combining optical colors with variability to efficiently select and estimate the redshifts of low- to mid-redshift quasars in upcoming ground-based time-domain surveys. We extend these studies to fainter and less abundant high-redshift quasars using light curves from 235 sq. deg. and 10 years of Stripe 82 imaging reprocessed with the prototype LSST data management stack. Sources are detected on the i-band co-adds (5σ: i ~ 24) but measured on the single-epoch (ugriz) images, generating complete and unbiased lightcurves for sources fainter than the single-epoch detection threshold. Using these forced photometry lightcurves, we explore optical variability characteristics of high redshift quasars and validate classification methods with particular attention to the low signal limit. In this low SNR limit, we quantify the degradation of the uncertainties and biases on variability parameters using simulated light curves. Completeness/efficiency and redshift accuracy are verified with new spectroscopic observations on the MMT and APO 3.5m. These preliminary results are part of a survey to measure the z~4 luminosity function for quasars (i < 23) on Stripe 82 and to validate purely photometric classification techniques for high redshift quasars in LSST.
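One simple variability statistic relevant to the low-SNR regime discussed above is the normalized excess variance, sketched here on a toy forced-photometry light curve; the epoch counts, amplitudes, and error levels are illustrative only.

```python
import numpy as np

def excess_variance(flux, flux_err):
    """Intrinsic (excess) fractional variance of a light curve after subtracting
    the mean measurement variance; negative values are consistent with no
    detectable variability at the given noise level (the low-SNR regime)."""
    return (np.var(flux, ddof=1) - np.mean(flux_err ** 2)) / np.mean(flux) ** 2

# Toy forced-photometry light curve: 60 epochs, ~10% intrinsic variability,
# photometric errors comparable to the signal (faint, high-z quasar regime)
rng = np.random.default_rng(7)
true_flux = 1.0 + 0.1 * rng.normal(size=60)
errs = np.full(60, 0.08)
obs = true_flux + errs * rng.normal(size=60)
print("fractional excess variance:", excess_variance(obs, errs))
```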
NASA Astrophysics Data System (ADS)
Fong, M.; Bowyer, R.; Whitehead, A.; Lee, B.; King, L.; Applegate, D.; McCarthy, I.
2018-05-01
For more than two decades, the Navarro, Frenk, and White (NFW) model has stood the test of time; it has been used to describe the distribution of mass in galaxy clusters out to their outskirts. Stacked weak lensing measurements of clusters are now revealing the distribution of mass out to and beyond their virial radii, where the NFW model is no longer applicable. In this study we assess how well the parameterised Diemer & Kravtsov (DK) density profile describes the characteristic mass distribution of galaxy clusters extracted from cosmological simulations. This is determined from stacked synthetic lensing measurements of the 50 most massive clusters extracted from the Cosmo-OWLS simulations, using the Dark Matter Only run and also the run that most closely matches observations. The characteristics of the data reflect the Weighing the Giants survey and data from the future Large Synoptic Survey Telescope (LSST). In comparison with the NFW model, the DK model is favored by the stacked data, in particular for the future LSST data, where the number density of background galaxies is higher. The DK profile depends on the accretion history of clusters, which is specified in the current study. Eventually, however, subsamples of galaxy clusters with qualities indicative of disparate accretion histories could be studied.
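For reference, the NFW profile against which the DK model is compared has a simple closed form; a minimal sketch follows (the DK profile itself, with its steepening and outer power-law terms, is not reproduced here).

```python
import numpy as np

def rho_nfw(r, rho_s, r_s):
    """NFW density profile: rho(r) = rho_s / [(r/r_s) * (1 + r/r_s)^2]."""
    x = r / r_s
    return rho_s / (x * (1 + x) ** 2)

# The NFW profile falls off as r^-3 at large radii, which is where the stacked
# lensing data beyond the virial radius start to prefer the DK form instead.
r = np.logspace(-2, 1, 50)            # radii in units of r_s
print(rho_nfw(r, rho_s=1.0, r_s=1.0)[:5])
```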
Exploring Two Approaches for an End-to-End Scientific Analysis Workflow
NASA Astrophysics Data System (ADS)
Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; Paterno, Marc; Sehrish, Saba
2015-12-01
The scientific discovery process can be advanced by the integration of independently-developed programs run on disparate computing facilities into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components to these workflows, and to execute different components on resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC); the first one is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second one is based on a programming language, Python. In this paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.
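The "plain Python" flavour of workflow composition mentioned above can be sketched in a few lines; the stage names and the parallelization choice below are illustrative and are not the DESC framework itself.

```python
from concurrent.futures import ProcessPoolExecutor

# Each stage is an ordinary function; a driver chains them and farms independent
# pieces out in parallel. All names and numbers are stand-ins for real analysis stages.

def select_galaxies(tract_id):
    return {"tract": tract_id, "n_gal": 1000 + tract_id}      # stand-in catalog query

def measure_shear(catalog):
    return {"tract": catalog["tract"], "shear": 0.01 * catalog["n_gal"] / 1000}

def combine(results):
    return sum(r["shear"] for r in results) / len(results)

if __name__ == "__main__":
    tracts = range(8)
    with ProcessPoolExecutor() as pool:
        catalogs = list(pool.map(select_galaxies, tracts))
        shears = list(pool.map(measure_shear, catalogs))
    print("mean shear over tracts:", combine(shears))
```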
New characterization techniques for LSST sensors
Nomerotski, A.
2015-06-18
Fully depleted, thick CCDs with extended infra-red response have become the sensor of choice for modern sky surveys. Charge transport effects in the silicon and the associated astrometric distortions could make the mapping between sky coordinates and sensor coordinates non-trivial, and limit the ultimate precision achievable with these sensors. Two new characterization techniques for the CCDs, both of which could probe these issues, are discussed: x-ray flat fielding and imaging of pinhole arrays.
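A sketch of how pinhole-array images could probe astrometric distortions is given below: centroid every spot and compare against the known grid geometry. The image, grid spacing, and threshold are invented for the example and are not the apparatus described in the paper.

```python
import numpy as np
from scipy.ndimage import center_of_mass, label

def pinhole_centroids(image, threshold):
    """Centroid every pinhole spot; comparing the measured centroids against the
    known grid geometry probes astrometric distortions in the CCD."""
    mask = image > threshold
    labeled, n_spots = label(mask)
    return np.array(center_of_mass(image, labeled, index=range(1, n_spots + 1)))

# Toy image with a 4x4 grid of Gaussian spots spaced 50 px apart
yy, xx = np.indices((220, 220))
img = np.zeros((220, 220))
for cy in range(35, 220, 50):
    for cx in range(35, 220, 50):
        img += np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / (2 * 2.0 ** 2))

print(pinhole_centroids(img, threshold=0.2)[:3])   # (y, x) centroids of first spots
```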
[Galaxy/quasar classification based on nearest neighbor method].
Li, Xiang-Ru; Lu, Yu; Zhou, Jian-Ming; Wang, Yong-Jun
2011-09-01
With the wide application of high-quality CCDs in celestial spectroscopy and the implementation of many large sky survey programs (e.g., the Sloan Digital Sky Survey (SDSS), the Two-degree-Field Galaxy Redshift Survey (2dF), the Spectroscopic Survey Telescope (SST), the Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST) program and the Large Synoptic Survey Telescope (LSST) program), astronomical observational data are arriving in torrents. Therefore, to utilize them effectively and fully, research on automated processing methods for celestial data is imperative. In the present work, we investigated how to recognize galaxies and quasars from spectra using the nearest neighbor method. Galaxies and quasars are extragalactic objects; they are far away from Earth, and their spectra are usually contaminated by noise. Therefore, recognizing these two types of spectra is a typical problem in automatic spectral classification. Furthermore, the utilized method, nearest neighbor, is one of the most typical, classic, and mature algorithms in pattern recognition and data mining, and is often used as a benchmark when developing novel algorithms. Regarding applicability in practice, it is shown that the recognition ratio of the nearest neighbor (NN) method is comparable to the best results reported in the literature with more complicated methods; the advantage of NN is that it does not need to be trained, which is useful for incremental learning and parallel computation in massive spectral data processing. In conclusion, the results in this work are helpful for galaxy and quasar spectral classification.
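A minimal sketch of nearest-neighbour galaxy/quasar discrimination is shown below using scikit-learn on toy spectra; the simulated emission bump stands in for real quasar spectral features, and all sizes and amplitudes are invented.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

# Toy stand-in for galaxy/quasar spectra: each "spectrum" is a noisy flux vector,
# and quasars get an extra broad emission bump. Real inputs would be calibrated spectra.
rng = np.random.default_rng(0)
wave = np.linspace(0, 1, 200)
n = 1000
labels = rng.integers(0, 2, n)                      # 0 = galaxy, 1 = quasar
spectra = rng.normal(1.0, 0.2, (n, wave.size))
spectra[labels == 1] += 1.5 * np.exp(-(wave - 0.4) ** 2 / 0.002)

X_train, X_test, y_train, y_test = train_test_split(spectra, labels, random_state=1)
clf = KNeighborsClassifier(n_neighbors=1)           # plain NN: no training phase needed
clf.fit(X_train, y_train)
print("recognition ratio:", clf.score(X_test, y_test))
```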
NASA Technical Reports Server (NTRS)
Forester, R. H.
1978-01-01
Polyimide membranes with thicknesses ranging from under 0.01 micrometer to greater than 1 micrometer can be produced at an estimated cost of 50 cents per sq m (plus the cost of the polymer). The polymer of interest is dissolved in a solvent which is soluble in water. The polymer or casting solution is allowed to flow down an inclined ramp onto a water surface where a pool of floating polymer develops. The solvent dissolves into the water, lowering the surface tension of the water. Consequently, the contact angle of the polymer pool is very low and the edge of the pool is very thin. The solvent dissolves from this thin region too rapidly to be replenished from the bulk of the pool and a solid polymer film forms. Film formation is rapid and spontaneous and the film spreads out unaided, many feet from the leading edge of the pool. The driving force for this process is the exothermic solution of the organic solvent from the polymer solution into the water.
Wide-Field InfraRed Survey Telescope WFIRST
NASA Technical Reports Server (NTRS)
Green, J.; Schechter, P.; Baltay, C.; Bean, R.; Bennett, D.; Brown, R.; Conselice, C.; Donahue, M.; Fan, X.; Rauscher, B.;
2012-01-01
In December 2010, NASA created a Science Definition Team (SDT) for WFIRST, the Wide Field Infra-Red Survey Telescope, recommended by the Astro 2010 Decadal Survey as the highest priority for a large space mission. The SDT was chartered to work with the WFIRST Project Office at GSFC and the Program Office at JPL to produce a Design Reference Mission (DRM) for WFIRST. Part of the original charge was to produce an interim design reference mission by mid-2011. That document was delivered to NASA and widely circulated within the astronomical community. In late 2011 the Astrophysics Division augmented its original charge, asking for two design reference missions. The first of these, DRM1, was to be a finalized version of the interim DRM, reducing overall mission costs where possible. The second of these, DRM2, was to identify and eliminate capabilities that overlapped with those of NASA's James Webb Space Telescope (henceforth JWST), ESA's Euclid mission, and the NSF's ground-based Large Synoptic Survey Telescope (henceforth LSST), and again to reduce overall mission cost, while staying faithful to NWNH. This report presents both DRM1 and DRM2.
Machine-assisted discovery of relationships in astronomy
NASA Astrophysics Data System (ADS)
Graham, Matthew J.; Djorgovski, S. G.; Mahabal, Ashish A.; Donalek, Ciro; Drake, Andrew J.
2013-05-01
High-volume feature-rich data sets are becoming the bread-and-butter of 21st century astronomy but present significant challenges to scientific discovery. In particular, identifying scientifically significant relationships between sets of parameters is non-trivial. Similar problems in biological and geosciences have led to the development of systems which can explore large parameter spaces and identify potentially interesting sets of associations. In this paper, we describe the application of automated discovery systems of relationships to astronomical data sets, focusing on an evolutionary programming technique and an information-theory technique. We demonstrate their use with classical astronomical relationships - the Hertzsprung-Russell diagram and the Fundamental Plane of elliptical galaxies. We also show how they work with the issue of binary classification which is relevant to the next generation of large synoptic sky surveys, such as the Large Synoptic Survey Telescope (LSST). We find that comparable results to more familiar techniques, such as decision trees, are achievable. Finally, we consider the reality of the relationships discovered and how this can be used for feature selection and extraction.
Synergistic Effects of Phase Folding and Wavelet Denoising with Applications in Light Curve Analysis
2016-09-15
future research. II. Astrostatistics: Historically, astronomy has been a data-driven science. Larger and more precise data sets have led to the... forthcoming Large Synoptic Survey Telescope (LSST), the human-centric approach to astronomy is becoming strained [13, 24, 25, 63]. More than ever... process. One use of the filtering process is to remove artifacts from the data set. In the context of time domain astronomy, an artifact is an error in
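Since the snippet above turns on the idea of phase folding, here is a minimal, hedged sketch of that step alone (the wavelet-denoising stage is not shown); the trial period and toy light curve are invented for illustration.

import numpy as np

def phase_fold(times, values, period, t0=0.0):
    # Map observation times onto phase in [0, 1) and sort, so that periodic
    # structure lines up regardless of observing gaps.
    phase = ((times - t0) / period) % 1.0
    order = np.argsort(phase)
    return phase[order], values[order]

# Toy light curve: irregular sampling of a 3.7-day sinusoid plus noise
t = np.sort(np.random.uniform(0, 100, 500))
flux = np.sin(2 * np.pi * t / 3.7) + 0.1 * np.random.randn(t.size)
phase, folded_flux = phase_fold(t, flux, period=3.7)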
Large Synoptic Survey Telescope: From Science Drivers to Reference Design
2008-01-01
faint time domain. The LSST design is driven by four main science themes: constraining dark energy and dark matter, taking an inventory of the Solar... Energy and Dark Matter (2) Taking an Inventory of the Solar System (3) Exploring the Transient Optical Sky (4) Mapping the Milky Way. Each of these four... Constraining Dark Energy and Dark Matter: Current models of cosmology require the existence of both dark matter and dark energy to match observational
Responding to the Event Deluge
NASA Technical Reports Server (NTRS)
Williams, Roy D.; Barthelmy, Scott D.; Denny, Robert B.; Graham, Matthew J.; Swinbank, John
2012-01-01
We present the VOEventNet infrastructure for large-scale rapid follow-up of astronomical events, including selection, annotation, machine intelligence, and coordination of observations. The VOEvent standard is central to this vision, with distributed and replicated services rather than centralized facilities. We also describe some of the event brokers, services, and software that are connected to the network. These technologies will become more important in the coming years, with new event streams from Gaia, LOFAR, LIGO, LSST, and many others.
On the accuracy of modelling the dynamics of large space structures
NASA Technical Reports Server (NTRS)
Diarra, C. M.; Bainum, P. M.
1985-01-01
Proposed space missions will require large scale, light weight, space based structural systems. Large space structure technology (LSST) systems will have to accommodate (among others): ocean data systems; electronic mail systems; large multibeam antenna systems; and, space based solar power systems. The structures are to be delivered into orbit by the space shuttle. Because of their inherent size, modelling techniques and scaling algorithms must be developed so that system performance can be predicted accurately prior to launch and assembly. When the size and weight-to-area ratio of proposed LSST systems dictate that the entire system be considered flexible, there are two basic modeling methods which can be used. The first is a continuum approach, a mathematical formulation for predicting the motion of a general orbiting flexible body, in which elastic deformations are considered small compared with characteristic body dimensions. This approach is based on an a priori knowledge of the frequencies and shape functions of all modes included within the system model. Alternatively, finite element techniques can be used to model the entire structure as a system of lumped masses connected by a series of (restoring) springs and possibly dampers. In addition, a computational algorithm was developed to evaluate the coefficients of the various coupling terms in the equations of motion as applied to the finite element model of the Hoop/Column.
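To make the lumped-mass alternative concrete, the sketch below computes the natural frequencies of a simple free-free chain of point masses joined by springs; it is only an illustration of that modeling style, with invented mass and stiffness values, not the Hoop/Column finite element model.

import numpy as np

# Free-free chain of n point masses connected by identical springs; natural
# frequencies follow from the generalized eigenproblem K x = w^2 M x.
n, m, k = 5, 10.0, 1.0e4          # number of masses, mass [kg], stiffness [N/m] (illustrative)
M = m * np.eye(n)
K = np.zeros((n, n))
for i in range(n - 1):            # assemble the stiffness of each spring between neighbors
    K[i, i] += k
    K[i + 1, i + 1] += k
    K[i, i + 1] -= k
    K[i + 1, i] -= k

eigvals = np.linalg.eigvals(np.linalg.solve(M, K))
freqs_hz = np.sqrt(np.sort(np.abs(eigvals.real))) / (2.0 * np.pi)
print(freqs_hz)                   # the first value is ~0: the rigid-body mode of a free structure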
Firefly: embracing future web technologies
NASA Astrophysics Data System (ADS)
Roby, W.; Wu, X.; Goldina, T.; Joliet, E.; Ly, L.; Mi, W.; Wang, C.; Zhang, Lijun; Ciardi, D.; Dubois-Felsmann, G.
2016-07-01
At IPAC/Caltech, we have developed the Firefly web archive and visualization system. Used in production for the last eight years in many missions, Firefly gives the scientist significant capabilities to study data. Firefly provided the first completely web-based FITS viewer as well as a growing set of tabular and plotting visualizers. Further, it will be used for the science user interface of the LSST telescope, which goes online in 2021. Firefly must meet the needs of archive access and visualization for the 2021 LSST telescope and must serve astronomers beyond the year 2030. Recently, our team has faced the fact that the technology behind the Firefly software was becoming obsolete. We were searching for ways to utilize the current breakthroughs in maintaining stability, testability, speed, and reliability of large web applications, which Firefly exemplifies. In the last year, we have ported Firefly to cutting-edge web technologies. Embarking on this massive overhaul is no small feat. Choosing the technologies that will maintain a forward trajectory in a future development project is always hard and often overwhelming. When a team must port 150,000 lines of code for a production-level product there is little room to make poor choices. This paper will give an overview of the most modern web technologies and lessons learned in our conversion from a GWT-based system to a React/Redux-based system.
Variable classification in the LSST era: exploring a model for quasi-periodic light curves
NASA Astrophysics Data System (ADS)
Zinn, J. C.; Kochanek, C. S.; Kozłowski, S.; Udalski, A.; Szymański, M. K.; Soszyński, I.; Wyrzykowski, Ł.; Ulaczyk, K.; Poleski, R.; Pietrukowicz, P.; Skowron, J.; Mróz, P.; Pawlak, M.
2017-06-01
The Large Synoptic Survey Telescope (LSST) is expected to yield ~10^7 light curves over the course of its mission, which will require a concerted effort in automated classification. Stochastic processes provide one means of quantitatively describing variability with the potential advantage over simple light-curve statistics that the parameters may be physically meaningful. Here, we survey a large sample of periodic, quasi-periodic and stochastic Optical Gravitational Lensing Experiment-III variables using the damped random walk (DRW; CARMA(1,0)) and quasi-periodic oscillation (QPO; CARMA(2,1)) stochastic process models. The QPO model is described by an amplitude, a period and a coherence time-scale, while the DRW has only an amplitude and a time-scale. We find that the periodic and quasi-periodic stellar variables are generally better described by a QPO than a DRW, while quasars are better described by the DRW model. There are ambiguities in interpreting the QPO coherence time due to non-sinusoidal light-curve shapes, signal-to-noise ratio, error mischaracterizations and cadence. Higher order implementations of the QPO model that better capture light-curve shapes are necessary for the coherence time to have its implied physical meaning. Independent of physical meaning, the extra parameter of the QPO model successfully distinguishes most of the classes of periodic and quasi-periodic variables we consider.
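For intuition about the simpler of the two models, the hedged sketch below simulates a damped random walk (the CARMA(1,0) / Ornstein-Uhlenbeck process) on an irregular time grid using its exact one-step update; the amplitude, time-scale and sampling are illustrative values, not fits from the paper.

import numpy as np

def simulate_drw(times, amplitude, tau, mean_mag=18.0, seed=1):
    # amplitude = asymptotic rms (mag), tau = damping time-scale (same units as times)
    rng = np.random.default_rng(seed)
    mags = np.empty_like(times)
    mags[0] = mean_mag + amplitude * rng.standard_normal()
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        decay = np.exp(-dt / tau)                        # exact OU update across the gap dt
        scatter = amplitude * np.sqrt(1.0 - decay**2)
        mags[i] = mean_mag + decay * (mags[i - 1] - mean_mag) + scatter * rng.standard_normal()
    return mags

t = np.sort(np.random.uniform(0, 3650, 800))             # ~10 yr of sparse, irregular sampling
lc = simulate_drw(t, amplitude=0.2, tau=200.0)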
Transient survey rates for orphan afterglows from compact merger jets
NASA Astrophysics Data System (ADS)
Lamb, Gavin P.; Tanaka, Masaomi; Kobayashi, Shiho
2018-06-01
Orphan afterglows from short γ-ray bursts (GRBs) are potential candidates for electromagnetic (EM) counterpart searches to gravitational wave (GW) detected neutron star or neutron star black hole mergers. Various jet dynamical and structure models have been proposed that can be tested by the detection of a large sample of GW-EM counterparts. We make predictions for the expected rate of optical transients from these jet models for future survey telescopes, without a GW or GRB trigger. A sample of merger jets is generated in the redshift limits 0 ≤ z ≤ 3.0, and the expected peak r-band flux and time-scale above the Large Synoptic Survey Telescope (LSST) or Zwicky Transient Facility (ZTF) detection threshold, m_r = 24.5 and 20.4, respectively, is calculated. General all-sky rates are shown for m_r ≤ 26.0 and m_r ≤ 21.0. The detected orphan and GRB afterglow rate depends on the jet model, typically 16 ≲ R ≲ 76 yr⁻¹ for the LSST, and 2 ≲ R ≲ 8 yr⁻¹ for ZTF. An excess in the rate of orphan afterglows for a survey to a depth of m_r ≤ 26 would indicate that merger jets have a dominant low-Lorentz factor population, or that the jets exhibit intrinsic jet structure. Careful filtering of transients is required to successfully identify orphan afterglows from either short- or long-GRB progenitors.
Control system design for the large space systems technology reference platform
NASA Technical Reports Server (NTRS)
Edmunds, R. S.
1982-01-01
Structural models and classical frequency domain control system designs were developed for the large space systems technology (LSST) reference platform, which consists of a central bus structure, solar panels, and platform arms on which a variety of experiments may be mounted. It is shown that operation of multiple independently articulated payloads on a single platform presents major problems when sub-arcsecond pointing stability is required. Experiment compatibility will be an important operational consideration for systems of this type.
Exploring Two Approaches for an End-to-End Scientific Analysis Workflow
Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; ...
2015-12-23
The advance of the scientific discovery process is accomplished by the integration of independently developed programs run on disparate computing facilities into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components into these workflows, and to execute different components on resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC); the first one is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second one is based on a programming language, Python. In our paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.
RoboTAP: Target priorities for robotic microlensing observations
NASA Astrophysics Data System (ADS)
Hundertmark, M.; Street, R. A.; Tsapras, Y.; Bachelet, E.; Dominik, M.; Horne, K.; Bozza, V.; Bramich, D. M.; Cassan, A.; D'Ago, G.; Figuera Jaimes, R.; Kains, N.; Ranc, C.; Schmidt, R. W.; Snodgrass, C.; Wambsganss, J.; Steele, I. A.; Mao, S.; Ment, K.; Menzies, J.; Li, Z.; Cross, S.; Maoz, D.; Shvartzvald, Y.
2018-01-01
Context. The ability to automatically select scientifically important transient events from an alert stream of many such events, and to conduct follow-up observations in response, will become increasingly important in astronomy. With wide-angle time domain surveys pushing to fainter limiting magnitudes, the demand for follow-up of transient alerts far exceeds our follow-up telescope resources, and effective target prioritization becomes essential. The RoboNet-II microlensing program is a pathfinder project, which has developed an automated target selection process (RoboTAP) for gravitational microlensing events, which are observed in real time using the Las Cumbres Observatory telescope network. Aims: Follow-up telescopes typically have a much smaller field of view compared to surveys, therefore the most promising microlensing events must be automatically selected at any given time from an annual sample exceeding 2000 events. The main challenge is to select between events with a high planet detection sensitivity, with the aim of detecting many planets and characterizing planetary anomalies. Methods: Our target selection algorithm is a hybrid system based on estimates of the planet detection zones around a microlens. It follows automatic anomaly alerts and respects the expected survey coverage of specific events. Results: We introduce the RoboTAP algorithm, whose purpose is to select and prioritize microlensing events with high sensitivity to planetary companions. In this work, we determine the planet sensitivity of the RoboNet follow-up program and provide a working example of how a broker can be designed for a real-life transient science program conducting follow-up observations in response to alerts; we explore the issues that will confront similar programs being developed for the Large Synoptic Survey Telescope (LSST) and other time domain surveys.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Masters, Daniel C.; Stern, Daniel K.; Rhodes, Jason D.
A key goal of the Stage IV dark energy experiments Euclid, LSST, and WFIRST is to measure the growth of structure with cosmic time from weak lensing analysis over large regions of the sky. Weak lensing cosmology will be challenging: in addition to highly accurate galaxy shape measurements, statistically robust and accurate photometric redshift (photo-z) estimates for billions of faint galaxies will be needed in order to reconstruct the three-dimensional matter distribution. Here we present an overview of and initial results from the Complete Calibration of the Color–Redshift Relation (C3R2) survey, which is designed specifically to calibrate the empirical galaxy color–redshift relation to the Euclid depth. These redshifts will also be important for the calibrations of LSST and WFIRST. The C3R2 survey is obtaining multiplexed observations with Keck (DEIMOS, LRIS, and MOSFIRE), the Gran Telescopio Canarias (GTC; OSIRIS), and the Very Large Telescope (VLT; FORS2 and KMOS) of a targeted sample of galaxies that are most important for the redshift calibration. We focus spectroscopic efforts on undersampled regions of galaxy color space identified in previous work in order to minimize the number of spectroscopic redshifts needed to map the color–redshift relation to the required accuracy. We present the C3R2 survey strategy and initial results, including the 1283 high-confidence redshifts obtained in the 2016A semester and released as Data Release 1.
NASA Astrophysics Data System (ADS)
DeVries, J.; Neill, D. R.; Barr, J.; De Lorenzi, Simone; Marchiori, Gianpietro
2016-07-01
The Large Synoptic Survey Telescope (LSST) is a large (8.4 meter) wide-field (3.5 degree) survey telescope, which will be located on the Cerro Pachón summit in Chile. As a result of the Telescope's wide field of view, the optical system is unusually susceptible to stray light. In addition, balancing the effect of wind-induced telescope vibrations with Dome seeing is crucial. The rotating enclosure system (Dome) includes a moving wind screen and light baffle system. All of the Dome vents include hinged light baffles, which provide exceptional Dome flushing and stray light attenuation, and allow for vent maintenance access from inside the Dome. The wind screen also functions as a light screen, and helps define a clear optical aperture for the Telescope. The Dome must operate continuously without rotational travel limits to accommodate the Telescope cadence and travel. Consequently, the Azimuth drives are located on the fixed lower enclosure to accommodate glycol water cooling without the need for a utility cable wrap. An air duct system aligns when the Dome is in its parked position, and this provides air cooling for temperature conditioning of the Dome during the daytime. A bridge crane and a series of ladders, stairs and platforms provide for the inspection, maintenance and repair of all of the Dome mechanical systems. The contract to build the Dome was awarded to European Industrial Engineering in Mestre, Italy in May 2015. In this paper, we present the final design of this telescope and site subsystem.
Time-Resolved Surveys of Stellar Clusters
NASA Astrophysics Data System (ADS)
Eyer, Laurent; Eggenberger, Patrick; Greco, Claudia; Saesen, Sophie; Anderson, Richard I.; Mowlavi, Nami
We describe the information that can be gained when a survey is carried out over multiple epochs, and its particular impact on open cluster research. We first explain the irreplaceable information that multi-epoch observations provide in astrometry, photometry and spectroscopy. Then we give three examples of results on open clusters from multi-epoch surveys, namely the distance to the Pleiades, the angular momentum evolution of low mass stars, and asteroseismology. Finally we mention several very large surveys which are ongoing or planned for the future: Gaia, JASMINE, LSST, and VVV.
NASA Astrophysics Data System (ADS)
Holoien, Thomas W.-S.; Marshall, Philip J.; Wechsler, Risa H.
2017-06-01
We describe two new open-source tools written in Python for performing extreme deconvolution Gaussian mixture modeling (XDGMM) and using a conditioned model to re-sample observed supernova and host galaxy populations. XDGMM is a new program that uses Gaussian mixtures to perform density estimation of noisy data using extreme deconvolution (XD) algorithms. Additionally, it has functionality not available in other XD tools. It allows the user to select between the AstroML and Bovy et al. fitting methods and is compatible with scikit-learn machine learning algorithms. Most crucially, it allows the user to condition a model based on the known values of a subset of parameters. This gives the user the ability to produce a tool that can predict unknown parameters based on a model that is conditioned on known values of other parameters. EmpiriciSN is an exemplary application of this functionality, which can be used to fit an XDGMM model to observed supernova/host data sets and predict likely supernova parameters using a model conditioned on observed host properties. It is primarily intended to simulate realistic supernovae for LSST data simulations based on empirical galaxy properties.
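To illustrate the conditioning step mentioned above in isolation, here is a hedged sketch of conditioning a single multivariate Gaussian on known values of some of its dimensions; a full mixture model repeats this per component and reweights the components, and this is not the XDGMM API itself (the function name and test numbers are invented).

import numpy as np

def condition_gaussian(mu, cov, known_idx, known_vals):
    # Return the mean and covariance of the unknown dimensions given the known ones.
    known_idx = np.asarray(known_idx)
    unknown_idx = np.setdiff1d(np.arange(len(mu)), known_idx)
    mu_u, mu_k = mu[unknown_idx], mu[known_idx]
    S_uu = cov[np.ix_(unknown_idx, unknown_idx)]
    S_uk = cov[np.ix_(unknown_idx, known_idx)]
    S_kk = cov[np.ix_(known_idx, known_idx)]
    gain = S_uk @ np.linalg.inv(S_kk)
    mu_cond = mu_u + gain @ (known_vals - mu_k)          # conditional mean
    cov_cond = S_uu - gain @ S_uk.T                      # conditional covariance
    return mu_cond, cov_cond

mu = np.array([0.0, 1.0, 2.0])
cov = np.array([[1.0, 0.5, 0.2],
                [0.5, 2.0, 0.3],
                [0.2, 0.3, 1.5]])
print(condition_gaussian(mu, cov, known_idx=[2], known_vals=np.array([2.5])))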
SPHEREx: Probing the Physics of Inflation with an All-Sky Spectroscopic Galaxy Survey
NASA Astrophysics Data System (ADS)
Dore, Olivier; SPHEREx Science Team
2018-01-01
SPHEREx, a mission in NASA's Medium Explorer (MIDEX) program that was selected for Phase A in August 2017, is an all-sky survey satellite designed to address all three science goals in NASA's astrophysics division: probe the origin and destiny of our Universe; explore whether planets around other stars could harbor life; and explore the origin and evolution of galaxies. These themes are addressed by a single survey, with a single instrument. In this poster, we describe how SPHEREx can probe the physics of inflationary non-Gaussianity by measuring large-scale structure with galaxy redshifts over a large cosmological volume at low redshifts, complementing high-redshift surveys optimized to constrain dark energy. SPHEREx will be the first all-sky near-infrared spectral survey, creating a legacy archive of spectra. In particular, it will measure the redshifts of over 500 million galaxies of all types, an unprecedented dataset. Using this catalog, SPHEREx will reduce the uncertainty in f_NL -- a parameter describing the inflationary initial conditions -- by a factor of more than 10 compared with CMB measurements. At the same time, this catalog will enable strong scientific synergies with Euclid, WFIRST and LSST.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holoien, Thomas W. -S.; Marshall, Philip J.; Wechsler, Risa H.
We describe two new open-source tools written in Python for performing extreme deconvolution Gaussian mixture modeling (XDGMM) and using a conditioned model to re-sample observed supernova and host galaxy populations. XDGMM is a new program that uses Gaussian mixtures to perform density estimation of noisy data using extreme deconvolution (XD) algorithms. Additionally, it has functionality not available in other XD tools. It allows the user to select between the AstroML and Bovy et al. fitting methods and is compatible with scikit-learn machine learning algorithms. Most crucially, it allows the user to condition a model based on the known values of a subset of parameters. This gives the user the ability to produce a tool that can predict unknown parameters based on a model that is conditioned on known values of other parameters. EmpiriciSN is an exemplary application of this functionality, which can be used to fit an XDGMM model to observed supernova/host data sets and predict likely supernova parameters using a model conditioned on observed host properties. It is primarily intended to simulate realistic supernovae for LSST data simulations based on empirical galaxy properties.
Measuring the scale dependence of intrinsic alignments using multiple shear estimates
NASA Astrophysics Data System (ADS)
Leonard, C. Danielle; Mandelbaum, Rachel
2018-06-01
We present a new method for measuring the scale dependence of the intrinsic alignment (IA) contamination to the galaxy-galaxy lensing signal, which takes advantage of multiple shear estimation methods applied to the same source galaxy sample. By exploiting the resulting correlation of both shape noise and cosmic variance, our method can provide an increase in the signal-to-noise of the measured IA signal as compared to methods which rely on the difference of the lensing signal from multiple photometric redshift bins. For a galaxy-galaxy lensing measurement which uses LSST sources and DESI lenses, the signal-to-noise on the IA signal from our method is predicted to improve by a factor of ~2 relative to the method of Blazek et al. (2012), for pairs of shear estimates which yield substantially different measured IA amplitudes and highly correlated shape noise terms. We show that statistical error necessarily dominates the measurement of intrinsic alignments using our method. We also consider a physically motivated extension of the Blazek et al. (2012) method which assumes that all nearby galaxy pairs, rather than only excess pairs, are subject to IA. In this case, the signal-to-noise of the method of Blazek et al. (2012) is improved.
CMU DeepLens: deep learning for automatic image-based galaxy-galaxy strong lens finding
NASA Astrophysics Data System (ADS)
Lanusse, François; Ma, Quanbin; Li, Nan; Collett, Thomas E.; Li, Chun-Liang; Ravanbakhsh, Siamak; Mandelbaum, Rachel; Póczos, Barnabás
2018-01-01
Galaxy-scale strong gravitational lensing can not only provide a valuable probe of the dark matter distribution of massive galaxies, but also provide valuable cosmological constraints, either by studying the population of strong lenses or by measuring time delays in lensed quasars. Due to the rarity of galaxy-scale strongly lensed systems, fast and reliable automated lens finding methods will be essential in the era of large surveys such as Large Synoptic Survey Telescope, Euclid and Wide-Field Infrared Survey Telescope. To tackle this challenge, we introduce CMU DeepLens, a new fully automated galaxy-galaxy lens finding method based on deep learning. This supervised machine learning approach does not require any tuning after the training step which only requires realistic image simulations of strongly lensed systems. We train and validate our model on a set of 20 000 LSST-like mock observations including a range of lensed systems of various sizes and signal-to-noise ratios (S/N). We find on our simulated data set that for a rejection rate of non-lenses of 99 per cent, a completeness of 90 per cent can be achieved for lenses with Einstein radii larger than 1.4 arcsec and S/N larger than 20 on individual g-band LSST exposures. Finally, we emphasize the importance of realistically complex simulations for training such machine learning methods by demonstrating that the performance of models of significantly different complexities cannot be distinguished on simpler simulations. We make our code publicly available at https://github.com/McWilliamsCenter/CMUDeepLens.
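As a hedged, much-reduced illustration of the supervised deep-learning approach (CMU DeepLens itself is a deep residual network, not this toy model), the sketch below defines a small convolutional binary classifier for lens/non-lens postage stamps; the cutout size, layer sizes and random training data are invented for the example.

import numpy as np
from tensorflow import keras

model = keras.Sequential([
    keras.layers.Input(shape=(45, 45, 1)),                      # assumed single-band cutout size
    keras.layers.Conv2D(16, 3, activation="relu", padding="same"),
    keras.layers.MaxPooling2D(2),
    keras.layers.Conv2D(32, 3, activation="relu", padding="same"),
    keras.layers.MaxPooling2D(2),
    keras.layers.Flatten(),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),                 # output = P(lens)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Toy training call on random cutouts; a real run would use simulated lensed systems.
x = np.random.rand(128, 45, 45, 1).astype("float32")
y = np.random.randint(0, 2, size=(128, 1))
model.fit(x, y, epochs=1, batch_size=32, verbose=0)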
NASA Astrophysics Data System (ADS)
Friedrich, Oliver; Eifler, Tim
2018-01-01
Computing the inverse covariance matrix (or precision matrix) of large data vectors is crucial in weak lensing (and multiprobe) analyses of the large-scale structure of the Universe. Analytically computed covariances are noise-free and hence straightforward to invert; however, the model approximations might be insufficient for the statistical precision of future cosmological data. Estimating covariances from numerical simulations improves on these approximations, but the sample covariance estimator is inherently noisy, which introduces uncertainties in the error bars on cosmological parameters and also additional scatter in their best-fitting values. For future surveys, reducing both effects to an acceptable level requires an unfeasibly large number of simulations. In this paper we describe a way to expand the precision matrix around a covariance model and show how to estimate the leading order terms of this expansion from simulations. This is especially powerful if the covariance matrix is the sum of two contributions, C = A+B, where A is well understood analytically and can be turned off in simulations (e.g. shape noise for cosmic shear) to yield a direct estimate of B. We test our method in mock experiments resembling tomographic weak lensing data vectors from the Dark Energy Survey (DES) and the Large Synoptic Survey Telescope (LSST). For DES we find that 400 N-body simulations are sufficient to achieve negligible statistical uncertainties on parameter constraints. For LSST this is achieved with 2400 simulations. The standard covariance estimator would require >105 simulations to reach a similar precision. We extend our analysis to a DES multiprobe case finding a similar performance.
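A schematic numerical illustration of the C = A + B idea (not the expansion actually derived in the paper): the analytic part A is kept noise-free, B is estimated from mock simulations run with A switched off, and a precision-matrix estimate follows by inverting the sum. The dimensions, amplitudes and mock covariance shape below are invented.

import numpy as np

rng = np.random.default_rng(2)
ndim, nsim = 20, 400

A = np.diag(np.full(ndim, 0.05))                      # analytic, noise-free part (e.g. shape noise)
B_true = 0.02 * np.exp(-np.abs(np.subtract.outer(np.arange(ndim),
                                                 np.arange(ndim))) / 5.0)

# Mock "simulations" drawn with A turned off, giving a direct estimate of B only
sims = rng.multivariate_normal(np.zeros(ndim), B_true, size=nsim)
B_hat = np.cov(sims, rowvar=False)

precision = np.linalg.inv(A + B_hat)                  # precision matrix estimate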
Strong Lens Time Delay Challenge. I. Experimental Design
NASA Astrophysics Data System (ADS)
Dobler, Gregory; Fassnacht, Christopher D.; Treu, Tommaso; Marshall, Phil; Liao, Kai; Hojjati, Alireza; Linder, Eric; Rumbaugh, Nicholas
2015-02-01
The time delays between point-like images in gravitational lens systems can be used to measure cosmological parameters. The number of lenses with measured time delays is growing rapidly; the upcoming Large Synoptic Survey Telescope (LSST) will monitor ~10^3 strongly lensed quasars. In an effort to assess the present capabilities of the community, to accurately measure the time delays, and to provide input to dedicated monitoring campaigns and future LSST cosmology feasibility studies, we have invited the community to take part in a "Time Delay Challenge" (TDC). The challenge is organized as a set of "ladders," each containing a group of simulated data sets to be analyzed blindly by participating teams. Each rung on a ladder consists of a set of realistic mock observed lensed quasar light curves, with the rungs' data sets increasing in complexity and realism. The initial challenge described here has two ladders, TDC0 and TDC1. TDC0 has a small number of data sets, and is designed to be used as a practice set by the participating teams. The (non-mandatory) deadline for completion of TDC0 was the TDC1 launch date, 2013 December 1. The TDC1 deadline was 2014 July 1. Here we give an overview of the challenge, we introduce a set of metrics that will be used to quantify the goodness of fit, efficiency, precision, and accuracy of the algorithms, and we present the results of TDC0. Thirteen teams participated in TDC0 using 47 different methods. Seven of those teams qualified for TDC1, which is described in the companion paper.
Fabrication of the LSST monolithic primary-tertiary mirror
NASA Astrophysics Data System (ADS)
Tuell, Michael T.; Martin, Hubert M.; Burge, James H.; Ketelsen, Dean A.; Law, Kevin; Gressler, William J.; Zhao, Chunyu
2012-09-01
As previously reported (at the SPIE Astronomical Instrumentation conference of 2010 in San Diego), the Large Synoptic Survey Telescope (LSST) utilizes a three-mirror design in which the primary (M1) and tertiary (M3) mirrors are two concentric aspheric surfaces on one monolithic substrate. The substrate material is Ohara E6 borosilicate glass, in a honeycomb sandwich configuration, currently in production at The University of Arizona’s Steward Observatory Mirror Lab. We will provide an update to the status of the mirrors and metrology systems, which have advanced from concepts to hardware in the past two years. In addition to the normal requirements for smooth surfaces of the appropriate prescriptions, the alignment of the two surfaces must be accurately measured and controlled in the production lab, reducing the degrees of freedom needed to be controlled in the telescope. The surface specification is described as a structure function, related to seeing in excellent conditions. Both the pointing and centration of the two optical axes are important parameters, in addition to the axial spacing of the two vertices. This paper details the manufacturing process and metrology systems for each surface, including the alignment of the two surfaces. M1 is a hyperboloid and can utilize a standard Offner null corrector, whereas M3 is an oblate ellipsoid, so it has positive spherical aberration. The null corrector is a phase-etched computer-generated hologram (CGH) between the mirror surface and the center-of-curvature. Laser trackers are relied upon to measure the alignment and spacing as well as rough-surface metrology during loose-abrasive grinding.
PERIODOGRAMS FOR MULTIBAND ASTRONOMICAL TIME SERIES
DOE Office of Scientific and Technical Information (OSTI.GOV)
VanderPlas, Jacob T.; Ivezic, Željko
This paper introduces the multiband periodogram, a general extension of the well-known Lomb–Scargle approach for detecting periodic signals in time-domain data. In addition to advantages of the Lomb–Scargle method such as treatment of non-uniform sampling and heteroscedastic errors, the multiband periodogram significantly improves period finding for randomly sampled multiband light curves (e.g., Pan-STARRS, DES, and LSST). The light curves in each band are modeled as arbitrary truncated Fourier series, with the period and phase shared across all bands. The key aspect is the use of Tikhonov regularization which drives most of the variability into the so-called base model common to all bands, while fits for individual bands describe residuals relative to the base model and typically require lower-order Fourier series. This decrease in the effective model complexity is the main reason for improved performance. After a pedagogical development of the formalism of least-squares spectral analysis, which motivates the essential features of the multiband model, we use simulated light curves and randomly subsampled SDSS Stripe 82 data to demonstrate the superiority of this method compared to other methods from the literature and find that this method will be able to efficiently determine the correct period in the majority of LSST’s bright RR Lyrae stars with as little as six months of LSST data, a vast improvement over the years of data reported to be required by previous studies. A Python implementation of this method, along with code to fully reproduce the results reported here, is available on GitHub.
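As a hedged sketch of the underlying model (without the Tikhonov regularization that the paper identifies as the key ingredient), the code below fits a truncated Fourier series with a period shared across bands, plus a constant offset per band, by weighted linear least squares at one trial period; scanning this over many trial periods yields a periodogram. All names and the period grid are invented for the example.

import numpy as np

def multiband_chi2(t, y, dy, band, period, nterms=2):
    # Shared-phase Fourier terms for all bands, plus one constant offset per band.
    omega = 2 * np.pi / period
    bands = np.unique(band)
    cols = [np.ones_like(t)]                                    # global mean term
    for n in range(1, nterms + 1):                              # shared Fourier terms
        cols += [np.sin(n * omega * t), np.cos(n * omega * t)]
    for b in bands[1:]:                                         # per-band offsets
        cols.append((band == b).astype(float))
    X = np.column_stack(cols)
    w = 1.0 / dy
    coef, *_ = np.linalg.lstsq(X * w[:, None], y * w, rcond=None)
    resid = (y - X @ coef) * w
    return np.sum(resid**2)

# Periodogram = chi^2 versus trial period (lower chi^2 means a better fit)
periods = np.linspace(0.2, 1.0, 2000)
# chi2 = [multiband_chi2(t, y, dy, band, p) for p in periods]   # given data arrays t, y, dy, band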
The Santiago-Harvard-Edinburgh-Durham void comparison - I. SHEDding light on chameleon gravity tests
NASA Astrophysics Data System (ADS)
Cautun, Marius; Paillas, Enrique; Cai, Yan-Chuan; Bose, Sownak; Armijo, Joaquin; Li, Baojiu; Padilla, Nelson
2018-05-01
We present a systematic comparison of several existing and new void-finding algorithms, focusing on their potential power to test a particular class of modified gravity models - chameleon f(R) gravity. These models deviate from standard general relativity (GR) more strongly in low-density regions and thus voids are a promising venue to test them. We use halo occupation distribution (HOD) prescriptions to populate haloes with galaxies, and tune the HOD parameters such that the galaxy two-point correlation functions are the same in both f(R) and GR models. We identify both three-dimensional (3D) voids and two-dimensional (2D) underdensities in the plane of the sky to find the same void abundance and void galaxy number density profiles across all models, which suggests that they do not contain much information beyond galaxy clustering. However, the underlying void dark matter density profiles are significantly different, with f(R) voids being more underdense than GR ones, which leads to f(R) voids having a larger tangential shear signal than their GR analogues. We investigate the potential of each void finder to test f(R) models with near-future lensing surveys such as EUCLID and LSST. The 2D voids have the largest power to probe f(R) gravity, with an LSST analysis of tunnel (which is a new type of 2D underdensity introduced here) lensing distinguishing at 80 and 11σ (statistical error) f(R) models with parameters |f_R0| = 10^-5 and 10^-6, from GR.
Lighting the Fire for 25 years: The Nature and Legacy of Astronomy Camp
NASA Astrophysics Data System (ADS)
McCarthy, Donald W.; Hooper, E.; Benecchi, S. D.; Henry, T. J.; Kirkpatrick, J. D.; Kulesa, C.; Oey, M. S.; Regester, J.; Schlingman, W. M.; Camp Staff, Astronomy
2013-01-01
In 1988, Astronomy Camp began in an era when science was entirely the realm of professionals, astronomical observatories were off-limits to the public at night, and scientists were not encouraged to spend time in science education. Since then we have grown a dynamic science education program that immerses individuals (ages 11-80), educators, schools, and Girl Scout Leaders in authentic science at Arizona’s research observatories in the Catalina mountains and at Kitt Peak. Often labeled “life changing,” these residential programs have engaged thousands of people from 49 U.S. states and 20 foreign countries. Female enrollment has increased steadily, and women now generally outnumber men in our teenage programs. Graduate students have played a major creative role and many have gone on to become educators and research leaders around the world. By involving a wide range of ages, the Camps have helped strengthen the STEM-pipeline. Many of our alumni remain in touch via social and professional networks and have developed not only into professional astronomers but also into leaders throughout society, parents, and educators. Our emphasis on age-appropriate research helped inspire today’s concepts of research-based science education and Citizen Science. An accompanying paper (E. Hooper et al.) discusses our approach to project-oriented astronomical research. Scientific discoveries include Near-Earth Objects, supernova classification, and lightcurves of Kuiper Belt Objects. The Camps have also contributed to educational research involving informal science education, youth perceptions, and student identities. Ironically, the Camps have leveraged new initiatives in both research and education at NOAO, LSST, and JWST. Here we review the philosophy, conduct, and content of Astronomy Camp and summarize the unexpected nature of its ongoing legacy. We remain grateful to The University of Arizona Alumni Association for its long-term encouragement and support.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Hugh H.; Balasubramanian, V.; Bernstein, G.
The University of Pennsylvania elementary particle physics/particle cosmology group, funded by the Department of Energy Office of Science, participates in research in high energy physics and particle cosmology that addresses some of the most important unanswered questions in science. The research is divided into five areas. Energy Frontier - We participate in the study of proton-proton collisions at the Large Hadron Collider in Geneva, Switzerland using the ATLAS detector. The University of Pennsylvania group was responsible for the design, installation, and commissioning of the front-end electronics for the Transition Radiation Tracker (TRT) and plays the primary role in its maintenance and operation. We play an important role in the triggering of ATLAS, and we have made large contributions to the TRT performance and to the study and identification of electrons, photons, and taus. We have been actively involved in searches for the Higgs boson and for SUSY and other exotic particles. We have made significant contributions to measurement of Standard Model processes such as inclusive photon production and WW pair production. We also have participated significantly in R&D for upgrades to the ATLAS detector. Cosmic Frontier - The Dark Energy Survey (DES) telescope will be used to elucidate the nature of dark energy and the distribution of dark matter. Penn has played a leading role both in the use of weak gravitational lensing of distant galaxies and the discovery of large numbers of distant supernovae. The techniques and forecasts developed at Penn are also guiding the development of the proposed Large Synoptic Survey Telescope (LSST). We are also developing a new detector, MiniClean, to search for direct detection of dark matter particles. Intensity Frontier - We are participating in the design and R&D of detectors for the Long Baseline Neutrino Experiment (now DUNE), a new experiment to study the properties of neutrinos. Advanced Technology R&D - We have an extensive involvement in electronics required for sophisticated new detectors at the LHC and are developing electronics for the LSST camera. Theoretical Physics - We are carrying out a broad program studying the fundamental forces of nature and early universe cosmology and mathematical physics. Our activities span the range from model building, formal field theory, and string theory to new paradigms for cosmology and the interface of string theory with mathematics. Our effort combines extensive development of the formal aspects of string theory with a focus on real phenomena in particle physics, cosmology and gravity.
The Amateurs' Love Affair with Large Datasets
NASA Astrophysics Data System (ADS)
Price, Aaron; Jacoby, S. H.; Henden, A.
2006-12-01
Amateur astronomers are professionals in other areas. They bring expertise from such varied and technical careers as computer science, mathematics, engineering, and marketing. These skills, coupled with an enthusiasm for astronomy, can be used to help manage the large data sets coming online in the next decade. We will show specific examples where teams of amateurs have been involved in mining large, online data sets and have authored and published their own papers in peer-reviewed astronomical journals. Using the proposed LSST database as an example, we will outline a framework for involving amateurs in data analysis and education with large astronomical surveys.
Data Management challenges in Astronomy and Astroparticle Physics
NASA Astrophysics Data System (ADS)
Lamanna, Giovanni
2015-12-01
Astronomy and Astroparticle Physics domains are experiencing a deluge of data with the next generation of facilities prioritised in the European Strategy Forum on Research Infrastructures (ESFRI), such as SKA, CTA, KM3NeT, and with other world-class projects, namely LSST, EUCLID, EGO, etc. The new ASTERICS-H2020 project brings together the relevant scientific communities in Europe to work together to find common solutions to their Big Data challenges, their interoperability, and their data access. The presentation will highlight these new challenges and the work being undertaken, also in cooperation with e-infrastructures in Europe.
The Cosmic Evolution Through UV Spectroscopy (CETUS) Probe Mission Concept
NASA Astrophysics Data System (ADS)
Danchi, William; Heap, Sara; Woodruff, Robert; Hull, Anthony; Kendrick, Stephen E.; Purves, Lloyd; McCandliss, Stephan; Kelly Dodson, Greg Mehle, James Burge, Martin Valente, Michael Rhee, Walter Smith, Michael Choi, Eric Stoneking
2018-01-01
CETUS is a mission concept for an all-UV telescope with 3 scientific instruments: a wide-field camera, a wide-field multi-object spectrograph, and a point-source high- and medium-resolution spectrograph. It is primarily intended to work with other survey telescopes in the 2020s (e.g. E-ROSITA (X-ray), LSST, Subaru, WFIRST (optical-near-IR), SKA (radio)) to solve major, outstanding problems in astrophysics. In this poster presentation, we give an overview of CETUS key science goals and a progress report on the CETUS mission and instrument design.
On determination of charge transfer efficiency of thick, fully depleted CCDs with 55Fe x-rays
Yates, D.; Kotov, I.; Nomerotski, A.
2017-07-01
Charge transfer efficiency (CTE) is one of the most important CCD characteristics. Our paper examines ways to optimize the algorithms used to analyze 55Fe x-ray data on the CCDs, as well as explores new types of observables for CTE determination that can be used for testing LSST CCDs. Furthermore, the observables are modeled employing simple Monte Carlo simulations to determine how the charge diffusion in thick, fully depleted silicon affects the measurement. The data is compared to the simulations for one of the observables, integral flux of the x-ray hit.
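For orientation, here is a hedged sketch of the classic single-pixel 55Fe approach to CTE (not the new observables or Monte Carlo simulations explored in the paper): the charge lost per parallel transfer is estimated from the trend of isolated Mn K-alpha hit amplitudes versus row number. The function name, thresholds and toy data are invented.

import numpy as np

def cte_from_xray_hits(rows, amplitudes_e):
    # rows: number of parallel transfers for each isolated hit
    # amplitudes_e: hit amplitude in electrons (Mn K-alpha deposits ~1620 e-)
    slope, intercept = np.polyfit(rows, amplitudes_e, 1)   # electrons lost per transfer (slope < 0)
    cti = -slope / intercept                               # fractional charge loss per transfer
    return 1.0 - cti                                       # charge transfer efficiency

# Toy example: simulate hits on a 4000-row device with a true CTE of 0.999999
rng = np.random.default_rng(3)
rows = rng.integers(0, 4000, 5000)
amps = 1620.0 * 0.999999**rows + rng.normal(0, 10, rows.size)
print(cte_from_xray_hits(rows, amps))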
Near-Earth Object Survey Simulation Software
NASA Astrophysics Data System (ADS)
Naidu, Shantanu P.; Chesley, Steven R.; Farnocchia, Davide
2017-10-01
There is a significant interest in Near-Earth objects (NEOs) because they pose an impact threat to Earth, offer valuable scientific information, and are potential targets for robotic and human exploration. The number of NEO discoveries has been rising rapidly over the last two decades with over 1800 being discovered last year, making the total number of known NEOs >16000. Pan-STARRS and the Catalina Sky Survey are currently the most prolific NEO surveys, having discovered >1600 NEOs between them in 2016. As next generation surveys such as Large Synoptic Survey Telescope (LSST) and the proposed Near-Earth Object Camera (NEOCam) become operational in the next decade, the discovery rate is expected to increase tremendously. Coordination between various survey telescopes will be necessary in order to optimize NEO discoveries and create a unified global NEO discovery network. We are collaborating on a community-based, open-source software project to simulate asteroid surveys to facilitate such coordination and develop strategies for improving discovery efficiency. Our effort so far has focused on development of a fast and efficient tool capable of accepting user-defined asteroid population models and telescope parameters such as a list of pointing angles and camera field-of-view, and generating an output list of detectable asteroids. The software takes advantage of the widely used and tested SPICE library and architecture developed by NASA’s Navigation and Ancillary Information Facility (Acton, 1996) for saving and retrieving asteroid trajectories and camera pointing. Orbit propagation is done using OpenOrb (Granvik et al. 2009) but future versions will allow the user to plug in a propagator of their choice. The software allows the simulation of both ground-based and space-based surveys. Performance is being tested using the Grav et al. (2011) asteroid population model and the LSST simulated survey “enigma_1189”.
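One small piece of such a simulator, sketched here under assumed values (a circular field of view of 1.75 degree radius and invented coordinates), is the geometric test of whether a predicted asteroid position falls inside a given camera pointing; this is an illustration only, not the project's code, which handles ephemeris propagation through SPICE and OpenOrb.

import numpy as np

def angular_separation(ra1, dec1, ra2, dec2):
    # All angles in radians; returns the separation in radians (haversine formula).
    dra, ddec = ra2 - ra1, dec2 - dec1
    a = np.sin(ddec / 2)**2 + np.cos(dec1) * np.cos(dec2) * np.sin(dra / 2)**2
    return 2 * np.arcsin(np.sqrt(a))

def in_field(ast_ra, ast_dec, point_ra, point_dec, fov_radius_deg=1.75):
    # True where an asteroid lies within the circular field of view of a pointing.
    sep = angular_separation(np.radians(ast_ra), np.radians(ast_dec),
                             np.radians(point_ra), np.radians(point_dec))
    return sep <= np.radians(fov_radius_deg)

# Toy usage: which of these asteroids land inside a single pointing?
ra = np.array([10.0, 12.0, 50.0])      # degrees
dec = np.array([-5.0, -4.0, 20.0])
print(in_field(ra, dec, point_ra=11.0, point_dec=-4.5))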
Strong Gravitational Lensing as a Probe of Gravity, Dark-Matter and Super-Massive Black Holes
NASA Astrophysics Data System (ADS)
Koopmans, L.V.E.; Barnabe, M.; Bolton, A.; Bradac, M.; Ciotti, L.; Congdon, A.; Czoske, O.; Dye, S.; Dutton, A.; Elliasdottir, A.; Evans, E.; Fassnacht, C.D.; Jackson, N.; Keeton, C.; Lasio, J.; Moustakas, L.; Meneghetti, M.; Myers, S.; Nipoti, C.; Suyu, S.; van de Ven, G.; Vegetti, S.; Wucknitz, O.; Zhao, H.-S.
Whereas considerable effort has been afforded in understanding the properties of galaxies, a full physical picture, connecting their baryonic and dark-matter content, super-massive black holes, and (metric) theories of gravity, is still ill-defined. Strong gravitational lensing furnishes a powerful method to probe gravity in the central regions of galaxies. It can (1) provide a unique detection-channel of dark-matter substructure beyond the local galaxy group, (2) constrain dark-matter physics, complementary to direct-detection experiments, as well as metric theories of gravity, (3) probe central super-massive black holes, and (4) provide crucial insight into galaxy formation processes from the dark matter point of view, independently of the nature and state of dark matter. To seriously address the above questions, a considerable increase in the number of strong gravitational-lens systems is required. In the timeframe 2010-2020, a staged approach with radio (e.g. EVLA, e-MERLIN, LOFAR, SKA phase-I) and optical (e.g. LSST and JDEM) instruments can provide 10^(2-4) new lenses, and up to 10^(4-6) new lens systems from SKA/LSST/JDEM all-sky surveys around ~2020. Follow-up imaging of (radio) lenses is necessary with moderate ground/space-based optical-IR telescopes and with 30-50m telescopes for spectroscopy (e.g. TMT, GMT, ELT). To answer these fundamental questions through strong gravitational lensing, a strong investment in large radio and optical-IR facilities is therefore critical in the coming decade. In particular, only large-scale radio lens surveys (e.g. with SKA) provide the large numbers of high-resolution and high-fidelity images of lenses needed for SMBH and flux-ratio anomaly studies.
STRONG LENS TIME DELAY CHALLENGE. I. EXPERIMENTAL DESIGN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dobler, Gregory; Fassnacht, Christopher D.; Rumbaugh, Nicholas
2015-02-01
The time delays between point-like images in gravitational lens systems can be used to measure cosmological parameters. The number of lenses with measured time delays is growing rapidly; the upcoming Large Synoptic Survey Telescope (LSST) will monitor ∼10^3 strongly lensed quasars. In an effort to assess the present capabilities of the community, to accurately measure the time delays, and to provide input to dedicated monitoring campaigns and future LSST cosmology feasibility studies, we have invited the community to take part in a "Time Delay Challenge" (TDC). The challenge is organized as a set of "ladders", each containing a group of simulated data sets to be analyzed blindly by participating teams. Each rung on a ladder consists of a set of realistic mock observed lensed quasar light curves, with the rungs' data sets increasing in complexity and realism. The initial challenge described here has two ladders, TDC0 and TDC1. TDC0 has a small number of data sets, and is designed to be used as a practice set by the participating teams. The (non-mandatory) deadline for completion of TDC0 was the TDC1 launch date, 2013 December 1. The TDC1 deadline was 2014 July 1. Here we give an overview of the challenge, we introduce a set of metrics that will be used to quantify the goodness of fit, efficiency, precision, and accuracy of the algorithms, and we present the results of TDC0. Thirteen teams participated in TDC0 using 47 different methods. Seven of those teams qualified for TDC1, which is described in the companion paper.
STRONG LENS TIME DELAY CHALLENGE. II. RESULTS OF TDC1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liao, Kai; Treu, Tommaso; Marshall, Phil
2015-02-10
We present the results of the first strong lens time delay challenge. The motivation, experimental design, and entry level challenge are described in a companion paper. This paper presents the main challenge, TDC1, which consisted of analyzing thousands of simulated light curves blindly. The observational properties of the light curves cover the range in quality obtained for current targeted efforts (e.g., COSMOGRAIL) and expected from future synoptic surveys (e.g., LSST), and include simulated systematic errors. Seven teams participated in TDC1, submitting results from 78 different method variants. After describing each method, we compute and analyze basic statistics measuring accuracy (or bias) A, goodness of fit χ², precision P, and success rate f. For some methods we identify outliers as an important issue. Other methods show that outliers can be controlled via visual inspection or conservative quality control. Several methods are competitive, i.e., give |A| < 0.03, P < 0.03, and χ² < 1.5, with some of the methods already reaching sub-percent accuracy. The fraction of light curves yielding a time delay measurement is typically in the range f = 20%-40%. It depends strongly on the quality of the data: COSMOGRAIL-quality cadence and light curve lengths yield significantly higher f than does sparser sampling. Taking the results of TDC1 at face value, we estimate that LSST should provide around 400 robust time-delay measurements, each with P < 0.03 and |A| < 0.01, comparable to current lens modeling uncertainties. In terms of observing strategies, we find that A and f depend mostly on season length, while P depends mostly on cadence and campaign duration.
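To make the quoted statistics concrete, the sketch below computes the per-team summary metrics named above, assuming the commonly quoted TDC-style definitions (fractional bias for A, claimed fractional uncertainty for P, and a reduced chi-square of the residuals); the exact definitions are given in the paper, and the toy numbers here are invented.

import numpy as np

def tdc_metrics(dt_true, dt_est, sigma):
    # Arrays may contain NaN where no delay was submitted; those count against f.
    ok = np.isfinite(dt_est)
    f = ok.mean()                                             # success fraction
    x, xhat, s = dt_true[ok], dt_est[ok], sigma[ok]
    A = np.mean((xhat - x) / x)                               # accuracy / bias
    P = np.mean(s / np.abs(x))                                # claimed precision
    chi2 = np.mean(((xhat - x) / s)**2)                       # goodness of fit
    return f, A, P, chi2

dt_true = np.array([20.0, 35.0, 80.0, 50.0])                  # true delays (days)
dt_est = np.array([20.5, 34.0, np.nan, 51.0])                 # submitted delays
sigma = np.array([1.0, 1.5, np.nan, 2.0])                     # submitted uncertainties
print(tdc_metrics(dt_true, dt_est, sigma))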
PROFIT: Bayesian profile fitting of galaxy images
NASA Astrophysics Data System (ADS)
Robotham, A. S. G.; Taranu, D. S.; Tobar, R.; Moffett, A.; Driver, S. P.
2017-04-01
We present PROFIT, a new code for Bayesian two-dimensional photometric galaxy profile modelling. PROFIT consists of a low-level C++ library (
Unveiling the population of orphan γ-ray bursts
NASA Astrophysics Data System (ADS)
Ghirlanda, G.; Salvaterra, R.; Campana, S.; Vergani, S. D.; Japelj, J.; Bernardini, M. G.; Burlon, D.; D'Avanzo, P.; Melandri, A.; Gomboc, A.; Nappo, F.; Paladini, R.; Pescalli, A.; Salafia, O. S.; Tagliaferri, G.
2015-06-01
Gamma-ray bursts (GRBs) are detectable in the γ-ray band if their jets are oriented toward the observer. However, for each GRB with a typical θ_jet, there should be ~2/θ_jet² bursts whose emission cone is oriented elsewhere in space. These off-axis bursts can eventually be detected when, due to the deceleration of their relativistic jets, the beaming angle becomes comparable to the viewing angle. Orphan afterglows (OAs) should outnumber the current population of bursts detected in the γ-ray band even if they have not been conclusively observed so far at any frequency. We compute the expected flux of the population of orphan afterglows in the mm, optical, and X-ray bands through a population synthesis code of GRBs and the standard afterglow emission model. We estimate the detection rate of OAs with ongoing and forthcoming surveys. The average duration of OAs as transients above a given limiting flux is derived and described with analytical expressions: in general OAs should appear as daily transients in optical surveys and as monthly/yearly transients in the mm/radio band. We find that ~2 OA yr⁻¹ could already be detected by Gaia and up to 20 OA yr⁻¹ could be observed by the ZTF survey. A larger number of 50 OA yr⁻¹ should be detected by LSST in the optical band. For the X-ray band, ~26 OA yr⁻¹ could be detected by eROSITA. For the large population of OAs detectable by LSST, the X-ray and optical follow up of the light curve (for the brightest cases) and/or the extensive follow up of their emission in the mm and radio band could be the key to disentangling their GRB nature from other extragalactic transients of comparable flux density.
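The quoted ~2/θ_jet² factor is the usual beaming correction; as a hedged aside, it follows from the fraction of the sky covered by a double-sided jet of half-opening angle θ_jet in the small-angle limit:

\[
  \frac{N_{\rm total}}{N_{\rm on\text{-}axis}}
  = \frac{4\pi}{2 \times 2\pi\,(1 - \cos\theta_{\rm jet})}
  = \frac{1}{1 - \cos\theta_{\rm jet}}
  \simeq \frac{2}{\theta_{\rm jet}^{2}}
  \qquad (\theta_{\rm jet} \ll 1).
\]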
Optical testing of the LSST combined primary/tertiary mirror
NASA Astrophysics Data System (ADS)
Tuell, Michael T.; Martin, Hubert M.; Burge, James H.; Gressler, William J.; Zhao, Chunyu
2010-07-01
The Large Synoptic Survey Telescope (LSST) utilizes a three-mirror design in which the primary (M1) and tertiary (M3) mirrors are two concentric aspheric surfaces on one monolithic substrate. The substrate material is Ohara E6 borosilicate glass, in a honeycomb sandwich configuration, currently in production at The University of Arizona's Steward Observatory Mirror Lab. In addition to the normal requirements for smooth surfaces of the appropriate prescriptions, the alignment of the two surfaces must be accurately measured and controlled in the production lab. Both the pointing and centration of the two optical axes are important parameters, in addition to the axial spacing of the two vertices. This paper describes the basic metrology systems for each surface, with particular attention to the alignment of the two surfaces. These surfaces are aspheric enough to require null correctors for each wavefront. Both M1 and M3 are concave surfaces with both non-zero conic constants and higher-order terms (6th order for M1 and both 6th and 8th orders for M3). M1 is hyperboloidal and can utilize a standard Offner null corrector. M3 is an oblate ellipsoid, so has positive spherical aberration. We have chosen to place a phase-etched computer-generated hologram (CGH) between the mirror surface and the center-of-curvature (CoC), whereas the M1 null lens is beyond the CoC. One relatively new metrology tool is the laser tracker, which is relied upon to measure the alignment and spacings. A separate laser tracker system will be used to measure both surfaces during loose abrasive grinding and initial polishing.
Computer analysis of digital sky surveys using citizen science and manual classification
NASA Astrophysics Data System (ADS)
Kuminski, Evan; Shamir, Lior
2015-01-01
As current and future digital sky surveys such as SDSS, LSST, DES, Pan-STARRS and Gaia create increasingly massive databases containing millions of galaxies, there is a growing need to be able to efficiently analyze these data. One way to do this is through manual analysis; however, this may be insufficient given the extremely vast pipelines of astronomical images generated by present and future surveys. Some efforts have been made to use citizen science to classify galaxies by their morphology on a larger scale than individual or small groups of scientists can. While these citizen science efforts such as Zooniverse have helped obtain reasonably accurate morphological information about large numbers of galaxies, they cannot scale to provide complete analysis of the billions of galaxy images that will be collected by future ventures such as LSST. Since current forms of manual classification cannot scale to the masses of data collected by digital sky surveys, it is clear that in order to keep up with the growing databases some form of automation of the data analysis will be required, working either independently or in combination with human analysis such as citizen science. Here we describe a computer vision method that can automatically analyze galaxy images and deduce galaxy morphology. Experiments using Galaxy Zoo 2 data show that the performance of the method increases as the degree of agreement between the citizen scientists gets higher, providing a cleaner dataset. For several morphological features, such as the spirality of the galaxy, the algorithm agreed with the citizen scientists on around 95% of the samples. However, the method failed to analyze some of the morphological features, such as the number of spiral arms, and provided an accuracy of just ~36%.
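As a rough illustration of the agreement-threshold idea described above, the sketch below keeps only galaxies whose citizen-science vote is decisive before comparing against an automated classifier. The column names and numbers are invented for demonstration and are not the Galaxy Zoo 2 schema or the authors' pipeline.

```python
import pandas as pd

# Hypothetical table; real Galaxy Zoo 2 catalogs use different column names.
votes = pd.DataFrame({
    "galaxy_id":         [1, 2, 3, 4],
    "spiral_fraction":   [0.95, 0.55, 0.10, 0.85],   # fraction of volunteers voting "spiral"
    "machine_is_spiral": [True, True, False, True],   # automated morphology decision
})

def agreement_subset(df, threshold):
    """Keep galaxies where the citizen-science vote is decisive either way."""
    decisive = (df["spiral_fraction"] >= threshold) | (df["spiral_fraction"] <= 1 - threshold)
    return df[decisive]

for cut in (0.5, 0.8, 0.95):
    sub = agreement_subset(votes, cut)
    human_label = sub["spiral_fraction"] >= 0.5
    accuracy = (human_label == sub["machine_is_spiral"]).mean()
    print(f"cut={cut:.2f}: n={len(sub)}, agreement with machine={accuracy:.2f}")
```

Raising the cut shrinks the sample but, as the abstract reports, tends to raise the human-machine agreement on what remains.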
NASA Astrophysics Data System (ADS)
Narayan, Gautham; Axelrod, Tim; Calamida, Annalisa; Saha, Abhijit; Matheson, Thomas; Olszewski, Edward; Holberg, Jay; Bohlin, Ralph; Stubbs, Christopher W.; Rest, Armin; Deustua, Susana; Sabbi, Elena; MacKenty, John W.; Points, Sean D.; Hubeny, Ivan
2018-01-01
We have established a network of faint (16.5 < V < 19) hot DA white dwarfs as spectrophotometric standards for present and future wide-field observatories. Our standards are accessible from both hemispheres and suitable for ground- and space-based observatories covering the UV to the near-IR. The network is tied directly to the most precise astrophysical reference presently available - the CALSPEC standards - through a multi-cycle imaging program using the Wide-Field Camera 3 (WFC3) on the Hubble Space Telescope (HST). We have developed two independent analyses to forward model all the observed photometry and ground-based spectroscopy and infer a spectral energy distribution for each source using a non-local-thermodynamic-equilibrium (NLTE) DA white dwarf atmosphere extincted by interstellar dust. The models are in excellent agreement with each other, and agree with the observations to better than 0.01 mag in all passbands, and better than 0.005 mag in the optical. The high precision of these faint sources, tied directly to the most accurate flux standards presently available, makes our network of standards ideally suited for any experiments that have very stringent requirements on absolute flux calibration, such as studies of dark energy using the Large Synoptic Survey Telescope (LSST) and the Wide-Field Infrared Survey Telescope (WFIRST).
voevent-parse: Parse, manipulate, and generate VOEvent XML packets
NASA Astrophysics Data System (ADS)
Staley, Tim D.
2014-11-01
voevent-parse, written in Python, parses, manipulates, and generates VOEvent XML packets; it is built atop lxml.objectify. Details of transients detected by many projects, including Fermi, Swift, and the Catalina Sky Survey, are currently made available as VOEvents, which will also be the standard alert format for future facilities such as LSST and SKA. However, working with XML and adhering to the sometimes lengthy VOEvent schema can be a tricky process. voevent-parse provides convenience routines for common tasks, while allowing the user to utilise the full power of the lxml library when required. An earlier version of voevent-parse was part of the pysovo (ascl:1411.002) library.
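A minimal usage sketch, assuming the package is installed as voeventparse and that the file (a placeholder path) is a schema-valid VOEvent; the child elements reachable by dot access follow the lxml.objectify style and may vary between packets.

```python
import voeventparse as vp

# Load a VOEvent packet from disk (path is a placeholder).
with open("example_voevent.xml", "rb") as f:
    packet = vp.load(f)

# Top-level fields are plain XML attributes of the root element...
print("IVORN:", packet.attrib["ivorn"])
print("Role: ", packet.attrib["role"])

# ...while child elements are reachable via lxml.objectify dot access,
# if present in this particular packet.
print("Author IVORN:", packet.Who.AuthorIVORN)
```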
The Follow-up Crisis: Optimizing Science in an Opportunity Rich Environment
NASA Astrophysics Data System (ADS)
Vestrand, T.
Rapid follow-up tasking for robotic telescopes has been dominated by a one-dimensional uncoordinated response strategy developed for gamma-ray burst studies. However, this second-grade soccer approach is increasingly showing its limitations even when there are only a few events per night. And it will certainly fail when faced with the denial-of-service attack generated by the nightly flood of new transients from massive variability surveys like LSST. We discuss approaches for optimizing the scientific return from autonomous robotic telescopes in the high event-rate limit and explore the potential of a coordinated ecosystem of heterogeneous telescopes.
High Energy Physics and Nuclear Physics Network Requirements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dart, Eli; Bauerdick, Lothar; Bell, Greg
The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the U.S. Department of Energy (DOE) Office of Science (SC), the single largest supporter of basic research in the physical sciences in the United States. In support of SC programs, ESnet regularly updates and refreshes its understanding of the networking requirements needed by instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 25 years. In August 2013, ESnet and the DOE SC Offices of High Energy Physics (HEP) and Nuclear Physics (NP) organized a review to characterize the networking requirements of the programs funded by the HEP and NP program offices. Several key findings resulted from the review. Among them: 1. The Large Hadron Collider's ATLAS (A Toroidal LHC Apparatus) and CMS (Compact Muon Solenoid) experiments are adopting remote input/output (I/O) as a core component of their data analysis infrastructure. This will significantly increase their demands on the network from both a reliability perspective and a performance perspective. 2. The Large Hadron Collider (LHC) experiments (particularly ATLAS and CMS) are working to integrate network awareness into the workflow systems that manage the large number of daily analysis jobs (1 million analysis jobs per day for ATLAS), which are an integral part of the experiments. Collaboration with networking organizations such as ESnet, and the consumption of performance data (e.g., from perfSONAR [PERformance Service Oriented Network monitoring Architecture]) are critical to the success of these efforts. 3. The international aspects of HEP and NP collaborations continue to expand. This includes the LHC experiments, the Relativistic Heavy Ion Collider (RHIC) experiments, the Belle II Collaboration, the Large Synoptic Survey Telescope (LSST), and others. The international nature of these collaborations makes them heavily reliant on transoceanic connectivity, which is subject to longer term service disruptions than terrestrial connectivity. The network engineering aspects of undersea connectivity will continue to be a significant part of the planning, deployment, and operation of the data analysis infrastructure for HEP and NP experiments for the foreseeable future. Given their critical dependency on networking services, the experiments have expressed the need for tight integration (both technically and operationally) of the domestic and the transoceanic parts of the network infrastructure that supports the experiments. 4. The datasets associated with simulations continue to increase in size, and the need to move these datasets between analysis centers is placing ever-increasing demands on networks and on data management systems at the supercomputing centers. In addition, there is a need to harmonize cybersecurity practice with the data transfer performance requirements of the science. This report expands on these points, and addresses others as well. The report contains a findings section in addition to the text of the case studies discussed during the review.
Protecting Dark Skies in Chile
NASA Astrophysics Data System (ADS)
Smith, R. Chris; Sanhueza, Pedro; Phillips, Mark
2018-01-01
Current projections indicate that Chile will host approximately 70% of the astronomical collecting area on Earth by 2030, augmenting the enormous area of ALMA with that of three next-generation optical telescopes: LSST, GMTO, and E-ELT. These cutting-edge facilities represent billions of dollars of investment in the astronomical facilities hosted in Chile. The Chilean government, Chilean astronomical community, and the international observatories in Chile have recognized that these investments are threatened by light pollution, and have formed a strong collaboration to work at managing the threats. We will provide an update on the work being done in Chile, ranging from training municipalities about new lighting regulations to exploring international recognition of the dark sky sites of Northern Chile.
Machine Learning for Zwicky Transient Facility
NASA Astrophysics Data System (ADS)
Mahabal, Ashish; Zwicky Transient Facility, Catalina Real-Time Transient Survey
2018-01-01
The Zwicky Transient Facility (ZTF) will operate from 2018 to 2020, covering the accessible sky with its large 47 square degree camera. The transient detection rate is expected to be about a million per night. ZTF is thus a perfect LSST prototype. The big difference is that all of the ZTF transients can be followed up by 4- to 8-m class telescopes. Given the large numbers, using human scanners to separate the genuine transients from artifacts is out of the question. That first step, as well as classifying the transients with minimal follow-up, requires machine learning. We describe the tools and plans to take on this task using follow-up facilities and knowledge gained from archival datasets.
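The first-step filtering mentioned above is commonly cast as a supervised "real/bogus" classifier over image-difference features. The sketch below uses a random forest on made-up feature vectors purely to illustrate the workflow; it is not the ZTF pipeline and the feature choices are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Fake per-detection features, e.g. FWHM, ellipticity, distance to nearest bad pixel.
n = 2000
X_real = rng.normal(loc=[2.0, 0.1, 30.0], scale=[0.3, 0.05, 10.0], size=(n, 3))
X_bogus = rng.normal(loc=[1.0, 0.5, 5.0], scale=[0.5, 0.2, 5.0], size=(n, 3))
X = np.vstack([X_real, X_bogus])
y = np.r_[np.ones(n), np.zeros(n)]          # 1 = genuine transient, 0 = artifact

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(f"held-out real/bogus accuracy: {clf.score(X_test, y_test):.3f}")
```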
Agile software development in an earned value world: a survival guide
NASA Astrophysics Data System (ADS)
Kantor, Jeffrey; Long, Kevin; Becla, Jacek; Economou, Frossie; Gelman, Margaret; Juric, Mario; Lambert, Ron; Krughoff, Simon; Swinbank, John D.; Wu, Xiuqin
2016-08-01
Agile methodologies are current best practice in software development. They are favored for, among other reasons, preventing premature optimization by taking a somewhat short-term focus, and allowing frequent replans/reprioritizations of upcoming development work based on recent results and current backlog. At the same time, funding agencies prescribe earned value management accounting for large projects which, these days, inevitably include substantial software components. Earned Value approaches emphasize a more comprehensive and typically longer-range plan, and tend to characterize frequent replans and reprioritizations as indicative of problems. Here we describe the planning, execution and reporting framework used by the LSST Data Management team, which navigates these opposing tensions.
Characterising CCDs with cosmic rays
Fisher-Levine, M.; Nomerotski, A.
2015-08-06
The properties of cosmic ray muons make them a useful probe for measuring the properties of thick, fully depleted CCD sensors. The known energy deposition per unit length allows measurement of the gain of the sensor's amplifiers, whilst the straightness of the tracks allows for a crude assessment of the static lateral electric fields at the sensor's edges. The small volume in which the muons deposit their energy allows measurement of the contribution to the PSF from the diffusion of charge as it drifts across the sensor. In this work we present a validation of the cosmic ray gain measurement technique by comparing with radioisotope gain measurements, and calculate the charge diffusion coefficient for prototype LSST sensors.
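As a rough sketch of the gain-measurement idea (not the authors' code), one can compare the charge expected from a minimum-ionizing muon with the summed ADU along a reconstructed track. The constants below, roughly 80 electron-hole pairs per micron of silicon, a 100 micron thick fully depleted sensor, and a 10 micron pixel pitch, are stated assumptions.

```python
import numpy as np

PAIRS_PER_UM = 80.0          # approx. e-h pairs per micron for a minimum-ionizing muon (assumption)
SENSOR_THICKNESS_UM = 100.0  # assumed fully depleted thickness
PIXEL_PITCH_UM = 10.0        # assumed pixel size

def gain_from_track(track_adu_sum, track_length_pix):
    """Estimate gain in e-/ADU from one muon track crossing the sensor."""
    projected_um = track_length_pix * PIXEL_PITCH_UM
    path_um = np.hypot(projected_um, SENSOR_THICKNESS_UM)   # 3D path length through silicon
    expected_electrons = PAIRS_PER_UM * path_um
    return expected_electrons / track_adu_sum

# Example: a track spanning 25 pixels that deposited 30,000 ADU in total.
print(f"gain ~ {gain_from_track(30_000.0, 25):.2f} e-/ADU")
```

Averaging such estimates over many tracks, and cross-checking against radioisotope lines, is the spirit of the validation described in the abstract.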
Multi-Wavelength Photometric Identification of Quenching Galaxies in ZFOURGE
NASA Astrophysics Data System (ADS)
Forrest, Ben; Tran, Kim-Vy; ZFOURGE Collaboration
2018-01-01
In the new millennium, multi-wavelength photometric surveys of thousands of galaxies, such as SDSS, CANDELS, NMBS, and ZFOURGE, have become the standard for analyzing large populations. With ongoing surveys such as DES, and upcoming programs with LSST and JWST, finding ways to leverage large amounts of data will continue to be an area of important research. Many diagnostics have been used to classify these galaxies, most notably the rest-frame UVJ color-color diagram, which splits galaxies into star-forming and quiescent populations. With the plethora of data probing wavelengths outside of the optical, however, we can do better. In this talk I present a scheme for classifying galaxies using composite SEDs that clearly reveals rare populations such as extreme emission line galaxies and post-starburst galaxies. We use a sample of ~8000 galaxies from ZFOURGE which have SNR_Ks>20, observations from 0.3-8 microns, and are at 1
Extracting meaning from astronomical telegrams
NASA Astrophysics Data System (ADS)
Graham, Matthew; Conwill, L.; Djorgovski, S. G.; Mahabal, A.; Donalek, C.; Drake, A.
2011-01-01
The rapidly emerging field of time domain astronomy is one of the most exciting and vibrant new research frontiers, ranging in scientific scope from studies of the Solar System to extreme relativistic astrophysics and cosmology. It is being enabled by a new generation of large synoptic digital sky surveys - LSST, PanStarrs, CRTS - that cover large areas of sky repeatedly, looking for transient objects and phenomena. One of the biggest challenges facing these is the automated classification of transient events, a process that needs machine-processible astronomical knowledge. Semantic technologies enable the formal representation of concepts and relations within a particular domain. ATELs (http://www.astronomerstelegram.org) are a commonly-used means for reporting and commenting upon new astronomical observations of transient sources (supernovae, stellar outbursts, blazar flares, etc). However, they are loose and unstructured and employ scientific natural language for description: this makes automated processing of them - a necessity within the next decade with petascale data rates - a challenge. Nevertheless they represent a potentially rich corpus of information that could lead to new and valuable insights into transient phenomena. This project lies in the cutting-edge field of astrosemantics, a branch of astroinformatics, which applies semantic technologies to astronomy. The ATELs have been used to develop an appropriate concept scheme - a representation of the information they contain - for transient astronomy using aspects of natural language processing. We demonstrate that it is possible to infer the subject of an ATEL from the vocabulary used and to identify previously unassociated reports.
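A toy illustration of inferring an ATel's subject from its vocabulary, using a simple bag-of-words classifier; the example snippets and labels are invented for demonstration and do not represent the project's concept scheme or training corpus.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Invented training snippets mimicking telegram language.
texts = [
    "spectrum shows broad H-alpha, consistent with a type II supernova",
    "classification spectrum confirms a type Ia supernova near maximum light",
    "the blazar shows a strong gamma-ray flare with increased optical flux",
    "optical outburst of the cataclysmic variable detected in recent imaging",
]
labels = ["supernova", "supernova", "blazar flare", "stellar outburst"]

model = make_pipeline(CountVectorizer(), MultinomialNB()).fit(texts, labels)
print(model.predict(["new transient with silicon absorption, likely a type Ia supernova"]))
```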
Constraining neutrino masses with the integrated-Sachs-Wolfe-galaxy correlation function
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lesgourgues, Julien; Valkenburg, Wessel; Gaztanaga, Enrique
2008-03-15
Temperature anisotropies in the cosmic microwave background (CMB) are affected by the late integrated Sachs-Wolfe (lISW) effect caused by any time variation of the gravitational potential on linear scales. Dark energy is not the only source of lISW, since massive neutrinos induce a small decay of the potential on small scales during both matter and dark energy domination. In this work, we study the prospect of using the cross correlation between CMB and galaxy-density maps as a tool for constraining the neutrino mass. On the one hand massive neutrinos reduce the cross-correlation spectrum because free-streaming slows down structure formation; on the other hand, they enhance it through their change in the effective linear growth. We show that in the observable range of scales and redshifts, the first effect dominates, but the second one is not negligible. We carry out an error forecast analysis by fitting some mock data inspired by the Planck satellite, Dark Energy Survey (DES) and Large Synoptic Survey Telescope (LSST). The inclusion of the cross correlation data from Planck and LSST increases the sensitivity to the neutrino mass m_ν by 38% (and to the dark energy equation of state w by 83%) with respect to Planck alone. The correlation between Planck and DES brings a far less significant improvement. This method is not potentially as good for detecting m_ν as the measurement of galaxy, cluster, or cosmic shear power spectra, but since it is independent and affected by different systematics, it remains potentially interesting if the total neutrino mass is of the order of 0.2 eV; if instead it is close to the lower bound from atmospheric oscillations, m_ν ≈ 0.05 eV, we do not expect the ISW-galaxy correlation to be ever sensitive to m_ν.
The Future of Astrometric Education
NASA Astrophysics Data System (ADS)
van Altena, W.; Stavinschi, M.
2005-10-01
Astrometry is poised to enter an era of unparalleled growth and relevance due to the wealth of highly accurate data expected from the SIM and GAIA space missions. Innovative ground-based telescopes, such as the LSST, are planned which will provide less precise data, but for many more stars. The potential for studies of the structure, kinematics and dynamics of our Galaxy as well as for the physical nature of stars and the cosmological distance scale is without equal in the history of astronomy. It is therefore ironic that in two years not one course in astrometry will be taught in the US, leaving all astrometric education to Europe, China and Latin America. Who will ensure the astrometric quality control for the JWST, SIM, GAIA, LSST, to say nothing of the current large ground-based facilities, such as the VLT, Gemini, Keck, NOAO, Magellan, LBT, etc.? Hipparcos and the HST were astrometric successes due only to the dedicated work of specialists in astrometry who fought to maintain the astrometric characteristics of those satellites and their data pipelines. We propose a renewal of astrometric education in the universities to prepare qualified scientists so that the scientific returns from the investment of billions of dollars in these unique facilities will be maximized. The funding agencies are providing outstanding facilities. The universities, national and international observatories and agencies should acknowledge their responsibility to hire qualified full-time astrometric scientists to teach students, and to supervise existing and planned astronomical facilities so that quality data will be obtained and analyzed. A temporary solution to this problem is proposed in the form of a series of international summer schools in Astrometry. The Michelson Science Center of the SIM project has offered to hold an astrometry summer school in 2005 to begin this process. A one-semester syllabus is suggested as a means of meeting the needs of Astronomy by educating students in astrometric techniques that might be most valuable for careers associated with modern astrophysics.
Managing Astronomy Research Data: Case Studies of Big and Small Research Projects
NASA Astrophysics Data System (ADS)
Sands, Ashley E.
2015-01-01
Astronomy data management refers to all actions taken upon data over the course of the entire research process. It includes activities involving the collection, organization, analysis, release, storage, archiving, preservation, and curation of research data. Astronomers have cultivated data management tools, infrastructures, and local practices to ensure the use and future reuse of their data. However, new sky surveys will soon amass petabytes of data requiring new data management strategies. The goal of this dissertation, to be completed in 2015, is to identify and understand data management practices and the infrastructure and expertise required to support best practices. This will benefit the astronomy community in efforts toward an integrated scholarly communication framework. This dissertation employs qualitative, social science research methods (including interviews, observations, and document analysis) to conduct case studies of data management practices, covering the entire data lifecycle, amongst three populations: Sloan Digital Sky Survey (SDSS) collaboration team members; individual and small-group users of SDSS data; and Large Synoptic Survey Telescope (LSST) collaboration team members. I have been observing the collection, release, and archiving of data by the SDSS collaboration, the data practices of individuals and small groups using SDSS data in journal articles, and the LSST collaboration's planning and building of infrastructure to produce data. Preliminary results demonstrate that current data management practices in astronomy are complex, situational, and heterogeneous. Astronomers often have different management repertoires for working on sky surveys and for their own data collections, varying their data practices as they move between projects. The multitude of practices complicates coordinated efforts to maintain data. While astronomy expertise proves critical to managing astronomy data in the short, medium, and long term, the larger astronomy data workforce encompasses a greater breadth of educational backgrounds. Results show that teams of individuals with distinct expertise are key to ensuring the long-term preservation and usability of astronomy datasets.
Color Me Intrigued: The Discovery of iPTF 16fnm, an SN 2002cx-like Object
NASA Astrophysics Data System (ADS)
Miller, A. A.; Kasliwal, M. M.; Cao, Y.; Adams, S. M.; Goobar, A.; Knežević, S.; Laher, R. R.; Lunnan, R.; Masci, F. J.; Nugent, P. E.; Perley, D. A.; Petrushevska, T.; Quimby, R. M.; Rebbapragada, U. D.; Sollerman, J.; Taddia, F.; Kulkarni, S. R.
2017-10-01
Modern wide-field, optical time-domain surveys must solve a basic optimization problem: maximize the number of transient discoveries or minimize the follow-up needed for the new discoveries. Here, we describe the Color Me Intrigued experiment, the first from the intermediate Palomar Transient Factory (iPTF) to search for transients simultaneously in the g_PTF and R_PTF bands. During the course of this experiment, we discovered iPTF 16fnm, a new member of the 02cx-like subclass of Type Ia supernovae (SNe). iPTF 16fnm peaked at M_gPTF = -15.09 ± 0.17 mag, making it the second-least-luminous known SN Ia. iPTF 16fnm exhibits all the hallmarks of the 02cx-like class: (I) low luminosity at peak, (II) low ejecta velocities, and (III) a non-nebular spectrum several months after peak. Spectroscopically, iPTF 16fnm exhibits a striking resemblance to two other low-luminosity 02cx-like SNe: SN 2007qd and SN 2010ae. iPTF 16fnm and SN 2005hk decline at nearly the same rate, despite a 3 mag difference in brightness at peak. When considering the full subclass of 02cx-like SNe, we do not find evidence for a tight correlation between peak luminosity and decline rate in either the g' or r' band. We measure the relative rate of 02cx-like SNe to normal SNe Ia and find N_02cx/N_Ia = 33 (+158, -25)%. We further examine the g' - r' evolution of 02cx-like SNe and find that their unique color evolution can be used to separate them from 91bg-like and normal SNe Ia. This selection function will be especially important in the spectroscopically incomplete Zwicky Transient Facility/Large Synoptic Survey Telescope (LSST) era. Finally, we close by recommending that LSST periodically evaluate, and possibly update, its observing cadence to maximize transient science.
Liverpool telescope 2: a new robotic facility for rapid transient follow-up
NASA Astrophysics Data System (ADS)
Copperwheat, C. M.; Steele, I. A.; Barnsley, R. M.; Bates, S. D.; Bersier, D.; Bode, M. F.; Carter, D.; Clay, N. R.; Collins, C. A.; Darnley, M. J.; Davis, C. J.; Gutierrez, C. M.; Harman, D. J.; James, P. A.; Knapen, J. H.; Kobayashi, S.; Marchant, J. M.; Mazzali, P. A.; Mottram, C. J.; Mundell, C. G.; Newsam, A.; Oscoz, A.; Palle, E.; Piascik, A.; Rebolo, R.; Smith, R. J.
2015-03-01
The Liverpool Telescope is one of the world's premier facilities for time domain astronomy. The time domain landscape is set to radically change in the coming decade, with synoptic all-sky surveys such as LSST providing huge numbers of transient detections on a nightly basis; transient detections across the electromagnetic spectrum from other major facilities such as SVOM, SKA and CTA; and the era of `multi-messenger astronomy', wherein astrophysical events are detected via non-electromagnetic means, such as neutrino or gravitational wave emission. We describe here our plans for the Liverpool Telescope 2: a new robotic telescope designed to capitalise on this new era of time domain astronomy. LT2 will be a 4-metre class facility co-located with the Liverpool Telescope at the Observatorio del Roque de Los Muchachos on the Canary island of La Palma. The telescope will be designed for extremely rapid response: the aim is that the telescope will take data within 30 seconds of the receipt of a trigger from another facility. The motivation for this is twofold: firstly it will make it a world-leading facility for the study of fast fading transients and explosive phenomena discovered at early times. Secondly, it will enable large-scale programmes of low-to-intermediate resolution spectral classification of transients to be performed with great efficiency. In the target-rich environment of the LSST era, minimising acquisition overheads will be key to maximising the science gains from any follow-up programme. The telescope will have a diverse instrument suite which is simultaneously mounted for automatic changes, but it is envisaged that the primary instrument will be an intermediate resolution, optical/infrared spectrograph for scientific exploitation of transients discovered with the next generation of synoptic survey facilities. In this paper we outline the core science drivers for the telescope, and the requirements for the optical and mechanical design.
NASA Astrophysics Data System (ADS)
TAMURA, NAOYUKI
2015-08-01
PFS (Prime Focus Spectrograph), a next generation facility instrument on Subaru, is a very wide-field, massively multiplexed optical & near-infrared spectrograph. Exploiting the Subaru prime focus, 2400 reconfigurable fibers will be distributed in the 1.3 degree field. The spectrograph will have 3 arms of blue, red, and near-infrared cameras to simultaneously observe spectra from 380nm to 1260nm at one exposure. The development of this instrument has been undertaken by the international collaboration at the initiative of Kavli IPMU. The project is now going into the construction phase aiming at system integration and on-sky commissioning in 2017-2018, and science operation in 2019. In parallel, the survey design has also been developed envisioning a Subaru Strategic Program (SSP) that spans roughly 300 nights over 5 years. The major science areas are threefold: Cosmology, galaxy/AGN evolution, and Galactic archaeology (GA). The cosmology program will constrain the nature of dark energy via a survey of emission line galaxies over a comoving volume of ~10 Gpc^3 in the redshift range of 0.8 < z < 2.4. In the GA program, radial velocities and chemical abundances of stars in the Milky Way, dwarf spheroidal galaxies, and M31 will be used to understand the past assembly histories of those galaxies and the structures of their dark matter halos. Spectra will be taken for ~1 million stars as faint as V = 22, therefore out to large distances from the Sun. For the extragalactic program, our simulations suggest the wide wavelength coverage of PFS will be particularly powerful in probing the galaxy populations and their clustering properties over a wide redshift range. We will conduct a survey of color-selected 1 < z < 2 galaxies and AGN over 20 square degrees down to J = 23.4, yielding a fair sample of galaxies with stellar masses above ~10^10 solar masses. Further, PFS will also provide unique spectroscopic opportunities even in the era of Euclid, LSST, WFIRST and TMT. In this presentation, an overview of the instrument, current project status and path forward will be given.
Ground/bonding for Large Space System Technology (LSST). [of metallic and nonmetallic structures
NASA Technical Reports Server (NTRS)
Dunbar, W. G.
1980-01-01
The influence of the environment and extravehicular activity/remote assembly operations on the grounding and bonding of metallic and nonmetallic structures is discussed. Grounding and bonding philosophy is outlined for the electrical systems and electronic compartments which contain high voltage, high power electrical and electronic equipment. The influence of plasma and particulates on the system was analyzed and the effects of static buildup on the spacecraft electrical system discussed. Conceptual grounding and bonding designs are assessed for capability to withstand high current arcs to ground from a high voltage conductor and electromagnetic interference. Also shown were the extravehicular activities required of the space station and/or supply spacecraft crew members to join and inspect the ground system using manual or remote assembly construction.
Cables and connectors for Large Space System Technology (LSST)
NASA Technical Reports Server (NTRS)
Dunbar, W. G.
1980-01-01
The effect of the environment and extravehicular activity/remote assembly operations on the cables and connectors for spacecraft with metallic and/or nonmetallic structures was examined. Cable and connector philosophy was outlined for the electrical systems and electronic compartments which contain high-voltage, high-power electrical and electronic equipment. The influence of plasma and particulates on the system is analyzed and the effect of static buildup on the spacecraft electrical system discussed. Conceptual cable and connector designs are assessed for capability to withstand high current and high voltage without danger of arcs and electromagnetic interference. The extravehicular activities required of the space station and/or supply spacecraft crew members to join and inspect the electrical system, using manual or remote assembly construction, are also considered.
Automated software configuration in the MONSOON system
NASA Astrophysics Data System (ADS)
Daly, Philip N.; Buchholz, Nick C.; Moore, Peter C.
2004-09-01
MONSOON is the next generation OUV-IR controller project being developed at NOAO. The design is flexible, emphasizing code re-use, maintainability and scalability as key factors. The software needs to support widely divergent detector systems ranging from multi-chip mosaics (for LSST, QUOTA, ODI and NEWFIRM) down to large single or multi-detector laboratory development systems. In order for this flexibility to be effective and safe, the software must be able to configure itself to the requirements of the attached detector system at startup. The basic building block of all MONSOON systems is the PAN-DHE pair which make up a single data acquisition node. In this paper we discuss the software solutions used in the automatic PAN configuration system.
The Dynamics of the Local Group in the Era of Precision Astrometry
NASA Astrophysics Data System (ADS)
Besla, Gurtina; Garavito-Camargo, Nicolas; Patel, Ekta
2018-06-01
Our understanding of the dynamics of our Local Group of galaxies has changed dramatically over the past few years owing to significant advancements in astrometry and our theoretical understanding of galaxy structure. New surveys now enable us to map the 3D structure of our Milky Way and the dynamics of tracers of its dark matter distribution, like globular clusters, satellite galaxies and streams, with unprecedented precision. Some results have met with controversy, challenging preconceived notions of the orbital dynamics of key components of the Local Group. I will provide an overview of this evolving picture of our Local Group and outline how we can test the cold dark matter paradigm in the era of Gaia, LSST and JWST.
Ground/bonding for Large Space System Technology (LSST)
NASA Astrophysics Data System (ADS)
Dunbar, W. G.
1980-04-01
The influence of the environment and extravehicular activity/remote assembly operations on the grounding and bonding of metallic and nonmetallic structures is discussed. Grounding and bonding philosophy is outlined for the electrical systems and electronic compartments which contain high voltage, high power electrical and electronic equipment. The influence of plasma and particulates on the system was analyzed and the effects of static buildup on the spacecraft electrical system discussed. Conceptual grounding and bonding designs are assessed for capability to withstand high current arcs to ground from a high voltage conductor and electromagnetic interference. Also shown were the extravehicular activities required of the space station and/or supply spacecraft crew members to join and inspect the ground system using manual or remote assembly construction.
NASA Astrophysics Data System (ADS)
Liebenberg, Janet; Mentz, Elsa; Breed, Betty
2012-09-01
This paper reports on a qualitative study that examined how pair programming shapes the experience of secondary school girls taking IT as a subject, with respect to their enjoyment of programming and the subject itself. The study involved six Grade 11 girls who were doing solo programming in Grade 10 and pair programming in their following Grade. The results showed that the girls enjoyed the subject more when programming in pairs due to improved comprehension of the task. They especially enjoyed the socialization and communication brought about by pair programming. The assistance, support, motivation, focus and encouragement they received from partners when stuck or while fixing errors made the programming experience more enjoyable for them. The increased enjoyment brought about by pair programming resulted in the perception of greater learning in the subject IT and also to greater interest in it. It also led to greater persistence in dealing with problems. Pair programming should be implemented right from the start of Grade 10 since it may lead to greater enjoyment of programming and the subject IT in general. The approach may also lead to more girls being attracted to the subject.
NASA Astrophysics Data System (ADS)
Hara, Toshitsugu
An elementary engineering education program based on a dual system, combining workshop programs with the teaching of fundamental subjects using practical material, is discussed. The dual system, which consists of several workshop programs and fundamental subjects (such as mathematics, English and physics) taught with practical material, has been implemented for freshmen. The elementary workshop program (primary course) comprises four workshops and related lectures. Fundamental subjects are taught with practical or engineering texts. English is taught by designated teachers who have worked in engineering fields using English. The dual system is supported by structures such as the center for success initiative and the English education center.
Supernovae and cosmology with future European facilities.
Hook, I M
2013-06-13
Prospects for future supernova surveys are discussed, focusing on the European Space Agency's Euclid mission and the European Extremely Large Telescope (E-ELT), both expected to be in operation around the turn of the decade. Euclid is a 1.2 m space survey telescope that will operate at visible and near-infrared wavelengths, and has the potential to find and obtain multi-band lightcurves for thousands of distant supernovae. The E-ELT is a planned, general-purpose ground-based, 40-m-class optical-infrared telescope with adaptive optics built in, which will be capable of obtaining spectra of type Ia supernovae to redshifts of at least four. The contribution to supernova cosmology with these facilities will be discussed in the context of other future supernova programmes such as those proposed for DES, JWST, LSST and WFIRST.
Using Deep Learning to Analyze the Voices of Stars.
NASA Astrophysics Data System (ADS)
Boudreaux, Thomas Macaulay
2018-01-01
With several new large-scale surveys on the horizon, including LSST, TESS, ZTF, and Evryscope, faster and more accurate analysis methods will be required to adequately process the enormous amount of data produced. Deep learning, used in industry for years now, allows for advanced feature detection in minimally prepared datasets at very high speeds; however, despite the advantages of this method, its application to astrophysics has not yet been extensively explored. This dearth may be due to a lack of training data available to researchers. Here we generate synthetic data loosely mimicking the properties of acoustic mode pulsating stars and compare the performance of different deep learning algorithms, including Artificial Neural Networks and Convolutional Neural Networks, in classifying these synthetic data sets as either pulsators or stars not observed to vary.
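A minimal sketch of the synthetic-data approach described above, using a small fully connected network from scikit-learn rather than the authors' architectures; the light curves are simple noisy sinusoids versus pure noise, which is a deliberate simplification.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_per_class, n_points = 500, 200
t = np.linspace(0, 10, n_points)

# Pulsators: noisy sinusoids with random period and amplitude; non-variables: pure noise.
periods = rng.uniform(0.5, 3.0, n_per_class)
amps = rng.uniform(0.5, 2.0, n_per_class)
pulsators = amps[:, None] * np.sin(2 * np.pi * t[None, :] / periods[:, None]) \
            + rng.normal(0, 0.3, (n_per_class, n_points))
constants = rng.normal(0, 0.3, (n_per_class, n_points))

X = np.vstack([pulsators, constants])
y = np.r_[np.ones(n_per_class), np.zeros(n_per_class)]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0).fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.3f}")
```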
The Maunakea Spectroscopic Explorer: Status and System Overview
NASA Astrophysics Data System (ADS)
Mignot, S.; Murowinski, R.; Szeto, K.; Blin, A.; Caillier, P.
2017-12-01
The Maunakea Spectroscopic Explorer (MSE) project explores the possibility of upgrading the existing CFHT telescope and collaboration to turn it into the most powerful spectroscopic facility available in the 2020s. Its 10 meter aperture and its 1.5 square degree hexagonal field of view will allow both large and deep surveys, as complements to current (Gaia, eROSITA, LOFAR) and future imaging (Euclid, WFIRST, SKA, LSST) surveys, but also to provide tentative targets to the TMT or the E-ELT. In perfect agreement with INSU's 2015-2020 prospective, besides being well represented in MSE's science team (23/105 members), France is also a major contributor to the Conceptual Design studies, with CRAL developing a concept for the low- and moderate-resolution spectrographs, DT INSU for the prime focus environment, and GEPI for systems engineering.
Winkler, Sabune J; Cagliero, Enrico; Witte, Elizabeth; Bierer, Barbara E
2014-08-01
The Harvard Clinical and Translational Science Center ("Harvard Catalyst") Research Subject Advocacy (RSA) Program has reengineered subject advocacy, distributing the delivery of advocacy functions through a multi-institutional, central platform rather than vesting these roles and responsibilities in a single individual functioning as a subject advocate. The program is process-oriented and output-driven, drawing on the strengths of participating institutions to engage local stakeholders both in the protection of research subjects and in advocacy for subjects' rights. The program engages stakeholder communities in the collaborative development and distributed delivery of accessible and applicable educational programming and resources. The Harvard Catalyst RSA Program identifies, develops, and supports the sharing and distribution of expertise, education, and resources for the benefit of all institutions, with a particular focus on the frontline: research subjects, researchers, research coordinators, and research nurses. © 2014 Wiley Periodicals, Inc.
Augmenting the Funding Sources for Space Science and the ASTRO-1 Space Telescope
NASA Astrophysics Data System (ADS)
Morse, Jon
2015-08-01
The BoldlyGo Institute was formed in 2013 to augment the planned space science portfolio through philanthropically funded robotic space missions, similar to how some U.S. medical institutes and ground-based telescopes are funded. I introduce BoldlyGo's two current projects: the SCIM mission to Mars and the ASTRO-1 space telescope. In particular, ASTRO-1 is a 1.8-meter off-axis (unobscured) ultraviolet-visible space observatory to be located in a Lagrange point or heliocentric orbit with a wide-field panchromatic camera, medium- and high-resolution spectrograph, and high-contrast imaging coronagraph and/or an accompanying starshade/occulter. It is intended for the post-Hubble Space Telescope era in the 2020s, enabling unique measurements of a broad range of celestial targets, while providing vital complementary capabilities to other ground- and space-based facilities such as the JWST, ALMA, WFIRST-AFTA, LSST, TESS, Euclid, and PLATO. The ASTRO-1 architecture simultaneously wields great scientific power while being technically viable and affordable. A wide variety of scientific programs can be accomplished, addressing topics across space astronomy, astrophysics, fundamental physics, and solar system science, as well as being technologically informative to future large-aperture programs. ASTRO-1 is intended to be a new-generation research facility serving a broad national and international community, as well as a vessel for impactful public engagement. Traditional institutional partnerships and consortia, such as are common with private ground-based observatories, may play a role in the support and governance of ASTRO-1; we are currently engaging interested international organizations. In addition to our planned open guest observer program and accessible data archive, we intend to provide a mechanism whereby individual scientists can buy into a fraction of the guaranteed observing time. Our next step in ASTRO-1 development is to form the ASTRO-1 Requirements Team (ART), to which international scientists are invited to apply. The ART will be tasked with anchoring the science case, optimizing the observatory design, and constructing a design reference mission during late-2015 and 2016.
ERIC Educational Resources Information Center
Liebenberg, Janet; Mentz, Elsa; Breed, Betty
2012-01-01
This paper reports on a qualitative study that examined how pair programming shapes the experience of secondary school girls taking IT as a subject, with respect to their enjoyment of programming and the subject itself. The study involved six Grade 11 girls who were doing solo programming in Grade 10 and pair programming in their following Grade.…
Clinical evaluation of higher stimulation rates in the nucleus research platform 8 system.
Plant, Kerrie; Holden, Laura; Skinner, Margo; Arcaroli, Jennifer; Whitford, Lesley; Law, Mary-Ann; Nel, Esti
2007-06-01
The effect on speech perception of using higher stimulation rates than the 14.4 kHz available in the Nucleus 24 cochlear implant system was investigated. The study used the Nucleus Research Platform 8 (RP8) system, comprising the CI24RE receiver-stimulator with the Contour electrode array, the L34SP body-worn research speech processor, and the Nucleus Programming Environment (NPE) fitting and Neural Response Telemetry (NRT) software. This system enabled clinical investigation of higher stimulation rates before an implementation in the Freedom cochlear implant system commercially released by Cochlear Limited. Use of higher stimulation rates in the ACE coding strategy was assessed in 15 adult subjects. An ABAB experimental design was used to control for order effects. Program A used a total stimulation rate of between 12 kHz and 14.4 kHz. This program was used for at least the first 3 mo after initial device activation. After evaluation with this program, each subject was provided with two different higher stimulation rate programs: one with a total stimulation rate of 24 kHz and the other with a total stimulation rate of 32 kHz. After a 6-week period of familiarization, each subject identified his/her preferred higher rate program (program B), and this was used for the evaluation. Subjects then repeated their use of program A for 3 wk, then program B for 3 wk, before the second evaluation with each. Speech perception was evaluated by using CNC open-set monosyllabic words presented in quiet and CUNY open-set sentences presented in noise. Preference for stimulation rate program was assessed via a subjective questionnaire. Threshold (T)- and Comfortable (C)-levels, as well as subjective reports of tinnitus, were monitored for each subject throughout the study to determine whether there were any changes that might be associated with the use of higher stimulation rates. No significant mean differences in speech perception results were found for the group between the two programs for tests in either quiet or noise. Analysis of individual subject data showed that five subjects had significant benefit from use of program B for tests administered in quiet and for tests administered in noise. However, only two of these subjects showed benefit in both test conditions. One subject showed significant benefit from use of program A when tested in quiet, whereas another showed benefit with this program in noise. Each subject's preferred program varied. Five subjects reported a preference for program A, eight subjects reported a preference for program B and two reported no overall preference. Preference between the different stimulation rates provided within program B also varied, with 10 subjects preferring 24 kHz and five preferring 32 kHz total stimulation rates. A significant increase in T-levels from baseline measures was observed after three weeks of initial experience with program B, however there was no difference between the baseline levels and those obtained after five weeks of use. No significant change in C-levels was found over the monitoring period. No long-term changes in tinnitus that could be associated with the use of the higher stimulation rates were reported by any of the subjects. The use of higher stimulation rates may provide benefit to some but not all cochlear implant recipients. It is important to optimize the stimulation rate for an individual to ensure maximal benefit. 
The absence of any changes in T- and C-levels or in tinnitus suggests that higher stimulation rates are safe for clinical use.
Introductory Programming Subject in European Higher Education
ERIC Educational Resources Information Center
Aleksic, Veljko; Ivanovic, Mirjana
2016-01-01
Programming is one of the basic subjects in most informatics, computer science, mathematics and technical faculties' curricula. An integrated overview of the models for teaching programming, problems in teaching and suggested solutions was presented in this paper. Research covered the current state of 1019 programming subjects in 715 study programmes at…
A fringe projector-based study of the Brighter-Fatter Effect in LSST CCDs
Gilbertson, W.; Nomerotski, A.; Takacs, P.
2017-09-07
Achieving the goals of the Large Synoptic Survey Telescope for Dark Energy science requires a detailed understanding of CCD sensor effects. One such sensor effect is the Point Spread Function (PSF) increasing with flux, alternatively called the `Brighter-Fatter Effect.' Here a novel approach was tested to perform the PSF measurements in the context of the Brighter-Fatter Effect, employing a Michelson interferometer to project a sinusoidal fringe pattern onto the CCD. The Brighter-Fatter Effect predicts that the fringe pattern should become asymmetric in the intensity pattern, as the brighter peaks corresponding to a larger flux are smeared by a larger PSF. By fitting the data with a model that allows for a changing PSF, the strength of the Brighter-Fatter Effect can be evaluated.
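One way to picture the fitting idea described above is to model the fringe as a sinusoid whose local Gaussian smearing grows with intensity, and recover the flux-dependence of the smearing from simulated data. This is a sketch under simplified assumptions, not the experiment's actual model or code; all parameter values are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

def fringe(x, amp, period, phase, offset, bf_slope):
    """Sinusoidal fringe whose local Gaussian smearing grows with intensity (toy model)."""
    ideal = offset + amp * np.sin(2 * np.pi * x / period + phase)
    # Flux-dependent PSF width: brighter regions are smeared more (the brighter-fatter idea).
    sigma = 0.5 + bf_slope * (ideal - ideal.min())
    smeared = np.empty_like(ideal)
    for i, x0 in enumerate(x):
        kernel = np.exp(-0.5 * ((x - x0) / sigma[i]) ** 2)
        smeared[i] = np.sum(ideal * kernel) / np.sum(kernel)
    return smeared

x = np.arange(200, dtype=float)
truth = fringe(x, amp=1000, period=40, phase=0.3, offset=2000, bf_slope=0.002)
data = truth + np.random.default_rng(1).normal(0, 5, x.size)

popt, _ = curve_fit(fringe, x, data, p0=[900, 38, 0.0, 1900, 0.0])
print(f"fitted brighter-fatter slope: {popt[-1]:.4f} (truth 0.0020)")
```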
Architectural Implications for Spatial Object Association Algorithms*
Kumar, Vijay S.; Kurc, Tahsin; Saltz, Joel; Abdulla, Ghaleb; Kohn, Scott R.; Matarazzo, Celeste
2013-01-01
Spatial object association, also referred to as crossmatch of spatial datasets, is the problem of identifying and comparing objects in two or more datasets based on their positions in a common spatial coordinate system. In this work, we evaluate two crossmatch algorithms that are used for astronomical sky surveys, on the following database system architecture configurations: (1) Netezza Performance Server®, a parallel database system with active disk style processing capabilities, (2) MySQL Cluster, a high-throughput network database system, and (3) a hybrid configuration consisting of a collection of independent database system instances with data replication support. Our evaluation provides insights about how architectural characteristics of these systems affect the performance of the spatial crossmatch algorithms. We conducted our study using real use-case scenarios borrowed from a large-scale astronomy application known as the Large Synoptic Survey Telescope (LSST). PMID:25692244
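For reference, a common nearest-neighbour crossmatch of two catalogues within a fixed angular radius can be sketched with a k-d tree; the coordinates are made up, and a small-angle tangent-plane approximation is used (a full spherical crossmatch would also apply the cos(Dec) correction).

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(3)

# Two mock catalogues in a 1-degree patch (RA, Dec in degrees); small-angle approximation.
cat_a = rng.uniform(0.0, 1.0, size=(5000, 2))
cat_b = cat_a[:4000] + rng.normal(0, 1e-4, size=(4000, 2))          # shifted copies plus ...
cat_b = np.vstack([cat_b, rng.uniform(0.0, 1.0, size=(1000, 2))])   # ... unrelated sources

radius_deg = 1.0 / 3600.0   # 1 arcsecond match radius
tree = cKDTree(cat_b)
dist, idx = tree.query(cat_a, k=1, distance_upper_bound=radius_deg)
matched = np.isfinite(dist)   # unmatched objects come back with infinite distance
print(f"{matched.sum()} of {len(cat_a)} objects matched within 1 arcsec")
```

Database implementations such as those evaluated in the paper express the same join with spatial indexes instead of an in-memory tree, but the matching criterion is the same.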
Centroid Position as a Function of Total Counts in a Windowed CMOS Image of a Point Source
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wurtz, R E; Olivier, S; Riot, V
2010-05-27
We obtained 960,200 22-by-22-pixel windowed images of a pinhole spot using the Teledyne H2RG CMOS detector with un-cooled SIDECAR readout. We performed an analysis to determine the precision we might expect in the position error signals to a telescope's guider system. We find that, under non-optimized operating conditions, the error in the computed centroid is strongly dependent on the total counts in the point image only below a certain threshold, approximately 50,000 photo-electrons. The LSST guider camera specification currently requires a 0.04 arcsecond error at 10 Hertz. Given the performance measured here, this specification can be delivered with a single star at 14th to 18th magnitude, depending on the passband.
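For illustration, the flux-weighted centroid of a small windowed image, the quantity whose scatter sets the guiding error discussed above, can be computed as a simple first moment. This is a sketch, not the LSST guider algorithm; the window size and spot parameters below are invented.

```python
import numpy as np

def centroid(window, background=0.0):
    """Flux-weighted (first-moment) centroid of a 2D windowed image, in pixel coordinates."""
    img = np.clip(window - background, 0, None)
    total = img.sum()
    rows, cols = np.indices(img.shape)
    return (cols * img).sum() / total, (rows * img).sum() / total   # (x, y)

# Toy 22x22 window with a Gaussian spot centred near (x, y) = (12.3, 9.7) plus read noise.
ys, xs = np.indices((22, 22))
spot = 50_000 * np.exp(-0.5 * (((xs - 12.3) / 1.5) ** 2 + ((ys - 9.7) / 1.5) ** 2))
spot += np.random.default_rng(0).normal(0, 20, spot.shape)
print("centroid (x, y):", centroid(spot))
```

Repeating this over many frames at different total counts gives the centroid scatter versus flux relation that the study measured.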
Managing the Big Data Avalanche in Astronomy - Data Mining the Galaxy Zoo Classification Database
NASA Astrophysics Data System (ADS)
Borne, Kirk D.
2014-01-01
We will summarize a variety of data mining experiments that have been applied to the Galaxy Zoo database of galaxy classifications, which were provided by the volunteer citizen scientists. The goal of these exercises is to learn new and improved classification rules for diverse populations of galaxies, which can then be applied to much larger sky surveys of the future, such as the LSST (Large Synoptic Survey Telescope), which is proposed to obtain detailed photometric data for approximately 20 billion galaxies. The massive Big Data that astronomy projects will generate in the future demand greater application of data mining and data science algorithms, as well as greater training of astronomy students in the skills of data mining and data science. The project described here has involved several graduate and undergraduate research assistants at George Mason University.
Connecting the time domain community with the Virtual Astronomical Observatory
NASA Astrophysics Data System (ADS)
Graham, Matthew J.; Djorgovski, S. G.; Donalek, Ciro; Drake, Andrew J.; Mahabal, Ashish A.; Plante, Raymond L.; Kantor, Jeffrey; Good, John C.
2012-09-01
The time domain has been identified as one of the most important areas of astronomical research for the next decade. The Virtual Observatory is in the vanguard with dedicated tools and services that enable and facilitate the discovery, dissemination and analysis of time domain data. These range in scope from rapid notifications of time-critical astronomical transients to annotating long-term variables with the latest modelling results. In this paper, we will review the prior art in these areas and focus on the capabilities that the VAO is bringing to bear in support of time domain science. In particular, we will focus on the issues involved with the heterogeneous collections of (ancillary) data associated with astronomical transients, and the time series characterization and classification tools required by the next generation of sky surveys, such as LSST and SKA.
Structural overview and learner control in hypermedia instructional programs
NASA Astrophysics Data System (ADS)
Burke, Patricia Anne
1998-09-01
This study examined the effects of a structural overview and learner control in a computer-based program on the achievement, attitudes, time in program and Linearity of path of fifth-grade students. Four versions of a computer-based instructional program about the Sun and planets were created in a 2 x 2 factorial design. The program consisted of ten sections, one for each planet and one for the Sun. Two structural overview conditions (structural overview, no structural overview) were crossed with two control conditions (learner control, program control). Subjects in the structural overview condition chose the order in which they would learn about the planets from among three options: ordered by distance from the Sun, ordered by size, or ordered by temperature. Subjects in the learner control condition were able to move freely among screens within a section and to choose their next section after finishing the previous one. In contrast, those in the program control condition advanced through the program in a prescribed linear manner. A 2 x 2 ANOVA yielded no significant differences in posttest scores for either independent variable or for their interaction. The structural overview was most likely not effective because subjects spent only a small percentage of their total time on the structural overview screens and they were not required to act upon the information in those screens. Learner control over content sequencing may not have been effective because most learner-control subjects chose the same overall sequence of instruction (i.e., distance from the Sun) prescribed for program-control subjects. Learner-control subjects chose to view an average of 40 more screens than the fixed number of 160 screens in the program-control version. However, program-control subjects spent significantly more time per screen than learner-control subjects, and the total time in program did not differ significantly between the two groups. Learner-control subjects receiving the structural overview deviated from the linear path significantly more often than subjects who did not have the structural overview, but deviation from the linear path was not associated with higher posttest scores.
Coldwell, S E; Getz, T; Milgrom, P; Prall, C W; Spadafora, A; Ramsay, D S
1998-04-01
This paper describes CARL (Computer Assisted Relaxation Learning), a computerized, exposure-based therapy program for the treatment of dental injection fear. The CARL program operates primarily in two different modes; in vitro, which presents a video-taped exposure hierarchy, and in vivo, which presents scripts for a dentist or hygienist to use while working with a subject. Two additional modes are used to train subjects to use the program and to administer behavioral assessment tests. The program contains five different modules, which function to register a subject, train subjects to use physical and cognitive relaxation techniques, deliver an exposure hierarchy, question subjects about the helpfulness of each of the therapy components, and test for memory effects of anxiolytic medication. Nine subjects have completed the CARL therapy program and 1-yr follow-up as participants in a placebo-controlled clinical trial examining the effects of alprazolam on exposure therapy for dental injection phobia. All nine subjects were able to receive two dental injections, and all reduced their general fear of dental injections. Initial results therefore indicate that the CARL program successfully reduces dental injection fear.
Searching for Exoplanets using Artificial Intelligence
NASA Astrophysics Data System (ADS)
Pearson, Kyle Alexander; Palafox, Leon; Griffith, Caitlin Ann
2017-10-01
In the last decade, over a million stars were monitored to detect transiting planets. The large volume of data obtained from current and future missions (e.g. Kepler, K2, TESS and LSST) requires automated methods to detect the signature of a planet. Manual interpretation of potential exoplanet candidates is labor intensive and subject to human error, the results of which are difficult to quantify. Here we present a new method of detecting exoplanet candidates in large planetary search projects which, unlike current methods, uses a neural network. Neural networks, also called "deep learning" or "deep nets", are a state of the art machine learning technique designed to give a computer perception into a specific problem by training it to recognize patterns. Unlike past transit detection algorithms, the deep net learns to characterize the data instead of relying on hand-coded metrics that humans perceive as the most representative. Exoplanet transits have different shapes, as a result of, e.g., the planet's and stellar atmosphere and transit geometry. Thus, a simple template does not suffice to capture the subtle details, especially if the signal is below the noise or strong systematics are present. Current false-positive rates from the Kepler data are estimated around 12.3% for Earth-like planets, and there has been no study of the false-negative rates. It is therefore important to ask exactly how the properties of current algorithms affect the results of the Kepler mission and future missions such as TESS, which flies next year. These uncertainties affect the fundamental research derived from missions, such as the discovery of habitable planets, estimates of their occurrence rates and our understanding of the nature and evolution of planetary systems.
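Training data for such a network are typically synthetic transits injected into noisy light curves. The snippet below generates a toy box-shaped transit purely for illustration; the period, depth and noise level are arbitrary, not Kepler values, and a real injection would use a limb-darkened transit model.

```python
import numpy as np

def box_transit(time, period, duration, depth, t0=0.0):
    """Toy box-shaped transit: flux drops by `depth` for `duration` around each mid-transit."""
    phase = (time - t0 + 0.5 * period) % period - 0.5 * period
    flux = np.ones_like(time)
    flux[np.abs(phase) < duration / 2] -= depth
    return flux

time = np.linspace(0, 30, 3000)                                    # days
flux = box_transit(time, period=7.2, duration=0.15, depth=0.01)
flux += np.random.default_rng(5).normal(0, 0.004, time.size)       # photometric noise

# A labelled training set would pair many such light curves (with and without
# injected transits) with 1/0 labels for the network to learn from.
print("minimum relative flux:", round(float(flux.min()), 4))
```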
A Photometric (griz) Metallicity Calibration for Cool Stars
NASA Astrophysics Data System (ADS)
West, Andrew A.; Davenport, James R. A.; Dhital, Saurav; Mann, Andrew; Massey, Angela P
2014-06-01
We present results from a study that uses wide pairs as tools for estimating and constraining the metal content of cool stars from their spectra and broad band colors. Specifically, we will present results that optimize the Mann et al. M dwarf metallicity calibrations (derived using wide binaries) for the optical regime covered by SDSS spectra. We will demonstrate the robustness of the new calibrations using a sample of wide, low-mass binaries for which both components have an SDSS spectrum. Using these new spectroscopic metallicity calibrations, we will present relations between the metallicities (from optical spectra) and the Sloan colors derived using more than 20,000 M dwarfs in the SDSS DR7 spectroscopic catalog. These relations have important ramifications for studies of Galactic chemical evolution, the search for exoplanets and subdwarfs, and are essential for surveys such as Pan-STARRS and LSST, which use griz photometry but have no spectroscopic component.
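To illustrate what a photometric metallicity calibration of this kind looks like in practice, here is a hedged sketch of fitting a simple polynomial relation between griz colors and [Fe/H] by linear least squares. The functional form, color ranges, and synthetic data are placeholders; they are not the published Mann et al. or SDSS-based calibrations.

# Schematic least-squares fit of a photometric metallicity relation of the form
# [Fe/H] = a + b*(g - r) + c*(r - z) + d*(r - z)**2.  The functional form and the
# synthetic data below are illustrative assumptions, not the published calibration.
import numpy as np

rng = np.random.default_rng(1)
n = 500
g_r = rng.uniform(1.2, 1.8, n)                 # assumed M-dwarf g-r color range
r_z = rng.uniform(1.0, 3.0, n)
feh_true = -0.5 + 0.3 * (g_r - 1.5) + 0.2 * (r_z - 2.0)
feh_obs = feh_true + rng.normal(0, 0.05, n)    # "spectroscopic" [Fe/H] with scatter

# Design matrix for the linear least-squares problem.
A = np.column_stack([np.ones(n), g_r, r_z, r_z**2])
coeffs, *_ = np.linalg.lstsq(A, feh_obs, rcond=None)
print("fitted coefficients (a, b, c, d):", coeffs)

# Evaluate the calibration for one star as a sanity check.
print("predicted [Fe/H] for star 0:", A[0] @ coeffs)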
Transient Go: A Mobile App for Transient Astronomy Outreach
NASA Astrophysics Data System (ADS)
Crichton, D.; Mahabal, A.; Djorgovski, S. G.; Drake, A.; Early, J.; Ivezic, Z.; Jacoby, S.; Kanbur, S.
2016-12-01
Augmented Reality (AR) is set to revolutionize human interaction with the real world as demonstrated by the phenomenal success of `Pokemon Go'. That very technology can be used to rekindle the interest in science at the school level. We are in the process of developing a prototype app based on sky maps that will use AR to introduce different classes of astronomical transients to students as they are discovered i.e. in real-time. This will involve transient streams from surveys such as the Catalina Real-time Transient Survey (CRTS) today and the Large Synoptic Survey Telescope (LSST) in the near future. The transient streams will be combined with archival and latest image cut-outs and other auxiliary data as well as historical and statistical perspectives on each of the transient types being served. Such an app could easily be adapted to work with various NASA missions and NSF projects to enrich the student experience.
Data Service: Distributed Data Capture and Replication
NASA Astrophysics Data System (ADS)
Warner, P. B.; Pietrowicz, S. R.
2007-10-01
Data Service is a critical component of the NOAO Data Management and Science Support (DMaSS) Solutions Platform, which is based on a service-oriented architecture, and is to replace the current NOAO Data Transport System. Its responsibilities include capturing data from NOAO and partner telescopes and instruments and replicating the data across multiple (currently six) storage sites. Java 5 was chosen as the implementation language, and Java EE as the underlying enterprise framework. Application metadata persistence is performed using EJB and Hibernate on the JBoss Application Server, with PostgreSQL as the persistence back-end. Although potentially any underlying mass storage system may be used as the Data Service file persistence technology, DTS deployments and Data Service test deployments currently use the Storage Resource Broker from SDSC. This paper presents an overview and high-level design of the Data Service, including aspects of deployment, e.g., for the LSST Data Challenge at the NCSA computing facilities.
Architectural Implications for Spatial Object Association Algorithms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kumar, V S; Kurc, T; Saltz, J
2009-01-29
Spatial object association, also referred to as cross-match of spatial datasets, is the problem of identifying and comparing objects in two or more datasets based on their positions in a common spatial coordinate system. In this work, we evaluate two cross-match algorithms that are used for astronomical sky surveys, on the following database system architecture configurations: (1) Netezza Performance Server, a parallel database system with active-disk-style processing capabilities, (2) MySQL Cluster, a high-throughput network database system, and (3) a hybrid configuration consisting of a collection of independent database system instances with data replication support. Our evaluation provides insights about how architectural characteristics of these systems affect the performance of the spatial cross-match algorithms. We conducted our study using real use-case scenarios borrowed from a large-scale astronomy application known as the Large Synoptic Survey Telescope (LSST).
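For readers unfamiliar with the operation being benchmarked, the following sketch shows a basic positional cross-match with astropy (nearest-neighbour matching within a tolerance radius). It illustrates the cross-match itself, not the parallel database implementations evaluated in the paper; the catalog sizes and the 1-arcsecond matching radius are arbitrary choices for the example.

# Minimal sketch of a positional cross-match between two catalogs using astropy.
import numpy as np
import astropy.units as u
from astropy.coordinates import SkyCoord

rng = np.random.default_rng(2)
cat1 = SkyCoord(ra=rng.uniform(0, 1, 10000) * u.deg,
                dec=rng.uniform(-1, 0, 10000) * u.deg)
cat2 = SkyCoord(ra=rng.uniform(0, 1, 8000) * u.deg,
                dec=rng.uniform(-1, 0, 8000) * u.deg)

# For every object in cat1, find the nearest object in cat2.
idx, sep2d, _ = cat1.match_to_catalog_sky(cat2)

# Keep only pairs closer than an (arbitrary) 1 arcsecond matching radius.
matched = sep2d < 1.0 * u.arcsec
print("matched", matched.sum(), "of", len(cat1), "objects")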
Probing Neutrino Hierarchy and Chirality via Wakes.
Zhu, Hong-Ming; Pen, Ue-Li; Chen, Xuelei; Inman, Derek
2016-04-08
The relic neutrinos are expected to acquire a bulk relative velocity with respect to the dark matter at low redshifts, and neutrino wakes are expected to develop downstream of the dark matter halos. We propose a method of measuring the neutrino mass based on this mechanism. This neutrino wake will cause a dipole distortion of the galaxy-galaxy lensing pattern. This effect could be detected by combining upcoming lensing surveys with a low redshift galaxy survey or a 21 cm intensity mapping survey, which can map the neutrino flow field. The data obtained with LSST and Euclid should enable us to make a positive detection if the three neutrino masses are quasidegenerate with each neutrino mass of ∼0.1 eV, and a future high precision 21 cm lensing survey would allow the normal hierarchy and inverted hierarchy cases to be distinguished, and even the right-handed Dirac neutrinos may be detectable.
Summary of LSST systems analysis and integration task for SPS flight test articles
NASA Astrophysics Data System (ADS)
Greenberg, H. S.
1981-02-01
The structural and equipment requirements for two solar power satellite (SPS) test articles are defined. The first SPS concept uses a hexagonal frame structure to stabilize the array of primary tension cables configured to support a Mills Cross antenna containing 17,925 subarrays composed of dipole radiating elements and solid state power amplifier modules. The second test article consists of a microwave antenna and its power source, a 20 by 200 m array of solar cell blankets, both of which are supported by the solar blanket array support structure. The test article structure, a ladder, is comprised of two longitudinal beams (215 m long) spaced 10 m apart and interconnected by six lateral beams. The system control module structure and bridge fitting provide bending and torsional stiffness, and supplement the in plane Vierendeel structure behavior. Mission descriptions, construction, and structure interfaces are addressed.
Implications from XMM and Chandra Source Catalogs for Future Studies with Lynx
NASA Astrophysics Data System (ADS)
Ptak, Andrew
2018-01-01
Lynx will perform extremely sensitive X-ray surveys by combining very high-resolution imaging over a large field of view with a high effective area. These will include deep planned surveys and serendipitous source surveys. Here we discuss implications that can be gleaned from current Chandra and XMM-Newton serendipitous source surveys. These current surveys have discovered novel sources such as tidal disruption events, binary AGN, and ULX pulsars. In addition, these surveys have detected large samples of normal galaxies, low-luminosity AGN, and quasars due to the wide-area coverage of the Chandra and XMM-Newton source catalogs, allowing the evolution of these phenomena to be explored. The wide-area Lynx surveys will probe further down in flux and will be coupled with very sensitive wide-area surveys such as LSST and SKA, allowing for detailed modeling of source SEDs and the discovery of rare, exotic sources and transient events.
Cosmic Evolution Through UV Spectroscopy (CETUS): A NASA Probe-Class Mission Concept
NASA Astrophysics Data System (ADS)
Heap, Sara R.; CETUS Team
2017-01-01
CETUS is a probe-class mission concept proposed for study to NASA in November 2016. Its overarching objective is to provide access to the ultraviolet (~100-400 nm) after Hubble has died. CETUS will be a major player in the emerging global network of powerful, new telescopes such as E-ROSITA, DESI, Subaru/PFS, GMT, LSST, WFIRST, JWST, and SKA. The CETUS mission concept provisionally features a 1.5-m telescope with a suite of instruments including a near-UV multi-object spectrograph (200-400 nm) complementing Subaru/PFS observations, wide-field far-UV and near-UV cameras, and far-UV and near-UV spectrographs that can be operated in either high-resolution or low-resolution mode. We have derived the scope and specific science requirements for CETUS for understanding the evolutionary history of galaxies, stars, and dust, but other applications are possible.
The science enabled by the Maunakea Spectroscopic Explorer
NASA Astrophysics Data System (ADS)
Martin, N. F.; Babusiaux, C.
2017-12-01
With its unique wide-field, multi-object, and dedicated spectroscopic capabilities, the Maunakea Spectroscopic Explorer (MSE) is a powerful facility to shed light on the faint Universe. Built around an upgrade of the Canada-France-Hawaii Telescope (CFHT) to an 11.25-meter telescope with a dedicated ~1.5 deg^2, 4,000-fiber wide-field spectrograph that covers optical and near-infrared wavelengths at resolutions between 2,500 and 40,000, the MSE is the essential follow-up complement to the current and next generations of multi-wavelength imaging surveys, such as the LSST, Gaia, Euclid, eROSITA, SKA, and WFIRST, and is an ideal feeder facility for the extremely large telescopes that are currently being built (E-ELT, GMT, and TMT). The science enabled by the MSE is vast and would have an impact on almost all aspects of astronomy research.
Estimating explosion properties of normal hydrogen-rich core-collapse supernovae
NASA Astrophysics Data System (ADS)
Pejcha, Ondrej
2017-08-01
Recent parameterized 1D explosion models of hundreds of core-collapse supernova progenitors suggest that success and failure are intertwined in a complex pattern that is not a simple function of the progenitor initial mass. This rugged landscape is also present in other explosion properties, allowing for quantitative tests of the neutrino mechanism from observations of the hundreds of supernovae discovered every year. We present a new self-consistent and versatile method that derives photospheric radius and temperature variations of normal hydrogen-rich core-collapse supernovae based on their photometric measurements and expansion velocities. We construct SED and bolometric light curves and determine explosion energies, ejecta masses, and nickel masses while taking into account all uncertainties and covariances of the model. We describe our efforts to compare the inferences to the predictions of the neutrino mechanism. The model can be adapted to include more physical assumptions and to utilize primarily photometric data coming from surveys such as LSST.
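The core relations behind such a method can be illustrated with a toy calculation: a photospheric radius from homologous expansion (R ≈ v·t) and a blackbody temperature together give a bolometric luminosity L = 4πR²σT⁴. The velocities, temperatures, and epochs below are invented for illustration; the actual model fits these quantities self-consistently with full uncertainties and covariances.

# Toy illustration of the quantities behind the method.  The numbers are made up.
import numpy as np

SIGMA_SB = 5.670374419e-5   # Stefan-Boltzmann constant, erg cm^-2 s^-1 K^-4
DAY = 86400.0               # seconds per day
KM = 1.0e5                  # cm per km

t = np.array([20.0, 40.0, 60.0])                    # days since explosion (assumed)
v_phot = np.array([8000.0, 6000.0, 4500.0]) * KM    # photospheric velocity, cm/s
T_phot = np.array([9000.0, 6500.0, 5500.0])         # blackbody temperature, K

R_phot = v_phot * t * DAY                           # homologous expansion: R ~ v * t
L_bol = 4.0 * np.pi * R_phot**2 * SIGMA_SB * T_phot**4

for ti, Ri, Li in zip(t, R_phot, L_bol):
    print(f"t = {ti:4.0f} d   R = {Ri:.2e} cm   L = {Li:.2e} erg/s")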
The applications of deep neural networks to sdBV classification
NASA Astrophysics Data System (ADS)
Boudreaux, Thomas M.
2017-12-01
With several new large-scale surveys on the horizon, including LSST, TESS, ZTF, and Evryscope, faster and more accurate analysis methods will be required to adequately process the enormous amount of data produced. Deep learning, used in industry for years now, allows for advanced feature detection in minimally prepared datasets at very high speeds; however, despite the advantages of this method, its application to astrophysics has not yet been extensively explored. This dearth may be due to a lack of training data available to researchers. Here we generate synthetic data loosely mimicking the properties of acoustic-mode pulsating stars, and we show that two separate paradigms of deep learning - the artificial neural network and the convolutional neural network - can both be used to classify these synthetic data effectively. We further show that this classification can be performed at relatively high levels of accuracy with minimal time spent adjusting network hyperparameters.
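A minimal sketch of the synthetic-data idea, under assumed parameters: light curves built from sums of short-period sinusoids (loosely mimicking acoustic-mode pulsators) versus pure noise, classified with a small scikit-learn neural network standing in for the paper's ANN. Periods, amplitudes, and the classifier configuration are illustrative, not those used in the study.

# Sketch: synthetic pulsator vs. non-pulsator light curves classified with a small
# neural network.  All parameters are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n_points, n_curves = 200, 1000
t = np.linspace(0, 2.0, n_points)               # hours (assumed cadence)

X = rng.normal(0.0, 1.0, size=(n_curves, n_points))   # pure-noise light curves
y = rng.integers(0, 2, size=n_curves)
for i in np.where(y == 1)[0]:                    # inject a few pulsation modes
    for _ in range(rng.integers(1, 4)):
        period = rng.uniform(0.03, 0.15)         # hours, loosely p-mode-like
        phase = rng.uniform(0, 2 * np.pi)
        X[i] += rng.uniform(0.5, 2.0) * np.sin(2 * np.pi * t / period + phase)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=300, random_state=0)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))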
LensFlow: A Convolutional Neural Network in Search of Strong Gravitational Lenses
NASA Astrophysics Data System (ADS)
Pourrahmani, Milad; Nayyeri, Hooshang; Cooray, Asantha
2018-03-01
In this work, we present our machine learning classification algorithm for identifying strong gravitational lenses in wide-area surveys using convolutional neural networks: LENSFLOW. We train and test the algorithm using a wide variety of strong gravitational lens configurations from simulations of lensing events. Images are processed through multiple convolutional layers that extract the feature maps necessary to assign a lens probability to each image. LENSFLOW provides a ranking scheme for all sources that can be used to identify potential gravitational lens candidates by significantly reducing the number of images that have to be visually inspected. We apply our algorithm to the HST/ACS i-band observations of the COSMOS field and present our sample of identified lensing candidates. The developed machine learning algorithm is computationally efficient, complementary to classical lens identification algorithms, and ideal for discovering such events across wide areas in current and future surveys such as LSST and WFIRST.
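A hedged sketch of the general pattern described here: a small convolutional network that assigns a lens probability to each image cutout, whose outputs are then sorted so that only the top-ranked candidates need visual inspection. The tiny architecture, cutout size, and random images are placeholders and do not reproduce LENSFLOW itself.

# Minimal sketch of ranking image cutouts by a CNN lens probability.
import numpy as np
import tensorflow as tf

CUTOUT = 45  # pixels per side (assumed cutout size)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(CUTOUT, CUTOUT, 1)),
    tf.keras.layers.Conv2D(8, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(2),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # lens probability
])
model.compile(optimizer="adam", loss="binary_crossentropy")

# Score a batch of (here random, placeholder) cutouts and rank them; in practice
# the model would first be trained on simulated lens / non-lens images.
cutouts = np.random.default_rng(4).normal(size=(1000, CUTOUT, CUTOUT, 1)).astype("float32")
p_lens = model.predict(cutouts, verbose=0).ravel()
ranked = np.argsort(p_lens)[::-1]                  # indices from most to least lens-like
print("top 10 candidate indices:", ranked[:10])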
Boonyasopun, Umaporn; Aree, Patcharaporn; Avant, Kay C
2008-06-01
This quasi-experimental study examined the effects of an empowerment-based nutrition promotion program on food consumption and serum lipid levels among hyperlipidemic Thai elderly. Fifty-six experimental subjects received the program; 48 control subjects maintained their habitual lifestyle. The statistical methods used were the t-test, Z-test, and chi2/Fisher's exact test. After the program, the consumption of high saturated fat, cholesterol, and simple sugar diets was significantly lower for the experimental group than for the control group. The percentage change of the serum total cholesterol of the experimental subjects was significantly higher than that of the control subjects. The number of experimental subjects that changed from hyperlipidemia to normolipidemia significantly increased compared to that for the control subjects. The implementation of this program was related to an improvement in food consumption and serum lipid levels among hyperlipidemic Thai elderly and, therefore, has implications for practice.
Marshall, E; Buckner, E; Powell, K
1991-01-01
The purpose of this study was to evaluate a teen parent program designed to increase parents' self-esteem, improve parenting skills, and increase parental knowledge about child development. Subjects (n = 30) in the program were referred from public health services. Control subjects (n = 30) were served by a local health department. Subjects were tested before and on completion of the program (or 6-9 months later for controls) using the Coopersmith Self-Esteem Inventory (SEI), the Inventory of Parents' Experiences (IPE), and the Denver Developmental Screening Test (DDST). Findings included (a) intervention subjects scored lower than control subjects on the pretesting in self-esteem (p less than 0.05), parental role satisfaction (p less than 0.05), and community support (p less than 0.0001); (b) control subjects scored lower on satisfaction with intimate relationships (p less than 0.0001); (c) at post-test, there were no statistically significant differences, and intervention subjects recorded self-esteem scores had increased to control levels; and (d) no developmental delays were detected in newborns at either pre- or post-testing. Implications of this study include (a) data support effectiveness of the program in enhancing self-esteem, maintaining satisfaction in parental role, and increasing community support for teen parents; and (b) evaluation of teen parent programs' effects should be done every 3-6 months to reduce subject attrition.
Code of Federal Regulations, 2010 CFR
2010-04-01
24 CFR 1000.156 (Title 24, Housing and Urban Development): Is affordable housing developed, acquired, or assisted under the IHBG program subject to limitations on cost or design standards?
Code of Federal Regulations, 2011 CFR
2011-04-01
24 CFR 1000.156 (Title 24, Housing and Urban Development): Is affordable housing developed, acquired, or assisted under the IHBG program subject to limitations on cost or design standards?
Code of Federal Regulations, 2010 CFR
2010-04-01
Title 24, Housing and Urban Development: Are non-dwelling structures developed, acquired or assisted under the IHBG program subject to limitations on cost or design standards?
Code of Federal Regulations, 2011 CFR
2011-04-01
Title 24, Housing and Urban Development: Are non-dwelling structures developed, acquired or assisted under the IHBG program subject to limitations on cost or design standards?
7 CFR 782.11 - Extent to which commodities are subject to end-use certificate regulations.
Code of Federal Regulations, 2014 CFR
2014-01-01
Title 7, Agriculture; Farm Service Agency, Department of Agriculture; Special Programs; End-Use Certificate Program; Implementation of the End-Use Certificate Program; § 782.11 Extent to which commodities are subject to end-use certificate regulations.
7 CFR 782.11 - Extent to which commodities are subject to end-use certificate regulations.
Code of Federal Regulations, 2010 CFR
2010-01-01
Title 7, Agriculture; Farm Service Agency, Department of Agriculture; Special Programs; End-Use Certificate Program; Implementation of the End-Use Certificate Program; § 782.11 Extent to which commodities are subject to end-use certificate regulations.
7 CFR 782.11 - Extent to which commodities are subject to end-use certificate regulations.
Code of Federal Regulations, 2011 CFR
2011-01-01
Title 7, Agriculture; Farm Service Agency, Department of Agriculture; Special Programs; End-Use Certificate Program; Implementation of the End-Use Certificate Program; § 782.11 Extent to which commodities are subject to end-use certificate regulations.
7 CFR 782.11 - Extent to which commodities are subject to end-use certificate regulations.
Code of Federal Regulations, 2012 CFR
2012-01-01
Title 7, Agriculture; Farm Service Agency, Department of Agriculture; Special Programs; End-Use Certificate Program; Implementation of the End-Use Certificate Program; § 782.11 Extent to which commodities are subject to end-use certificate regulations.
7 CFR 782.11 - Extent to which commodities are subject to end-use certificate regulations.
Code of Federal Regulations, 2013 CFR
2013-01-01
Title 7, Agriculture; Farm Service Agency, Department of Agriculture; Special Programs; End-Use Certificate Program; Implementation of the End-Use Certificate Program; § 782.11 Extent to which commodities are subject to end-use certificate regulations.
Correlated Curriculum Program: An Experimental Program, Mathematics Level 1. Project No. 10006.
ERIC Educational Resources Information Center
Magram, Elyse; And Others
The Correlated Curriculum Program is a 4-year career-oriented program designed to provide a more effective educational program for the general course student, with an interdisciplinary approach to teaching. Teachers are organized into teams to plan for correlated lessons. Correlating career subjects with academic subjects serves to reinforce…
ERIC Educational Resources Information Center
Buditjahjanto, I. G. P. Asto; Nurlaela, Luthfiyah; Ekohariadi; Riduwan, Mochamad
2017-01-01
Programming technique is one of the subjects taught at vocational high schools in Indonesia. The subject covers the theory and application of programming using Visual Programming. Students experience difficulty learning from purely textual materials; it is therefore necessary to develop media as a tool for delivering the learning materials. The objectives of this…
An Attempt of Making Program-Generated Animation in a Beginners’ Programming Class
NASA Astrophysics Data System (ADS)
Matsuyama, Chieko; Nakashima, Toyoshiro; Ishii, Naohiro
In universities, mathematical subjects are generally used as material for programming education. Many students, however, lose interest in programming because they assume that programming with mathematical expressions is difficult; beginners in particular tend to lose interest. It has therefore been suggested that, where possible, subject matter requiring little mathematical knowledge should be used. In this paper, the authors describe an attempt to have students create program-generated animation, instead of working on mathematical subjects, in a beginners' programming class taught in the widely used C language. The authors discuss how this approach improved students' interest in programming and report on its effects.
Use of telemedicine in the remote programming of cochlear implants.
Ramos, Angel; Rodriguez, Carina; Martinez-Beneyto, Paz; Perez, Daniel; Gault, Alexandre; Falcon, Juan Carlos; Boyle, Patrick
2009-05-01
Remote cochlear implant (CI) programming is a viable, safe, user-friendly and cost-effective procedure, equivalent to standard programming in terms of efficacy and user's perception, which can complement the standard procedures. The potential benefits of this technique are outlined. We assessed the technical viability, risks and difficulties of remote CI programming; and evaluated the benefits for the user comparing the standard on-site CI programming versus the remote CI programming. The Remote Programming System (RPS) basically consists of completing the habitual programming protocol in a regular CI centre, assisted by local staff, although guided by a remote expert, who programs the CI device using a remote programming station that takes control of the local station through the Internet. A randomized prospective study has been designed with the appropriate controls comparing RPS to the standard on-site CI programming. Study subjects were implanted adults with a HiRes 90K(R) CI with post-lingual onset of profound deafness and 4-12 weeks of device use. Subjects underwent two daily CI programming sessions either remote or standard, on 4 programming days separated by 3 month intervals. A total of 12 remote and 12 standard sessions were completed. To compare both CI programming modes we analysed: program parameters, subjects' auditory progress, subjects' perceptions of the CI programming sessions, and technical aspects, risks and difficulties of remote CI programming. Control of the local station from the remote station was carried out successfully and remote programming sessions were achieved completely and without incidents. Remote and standard program parameters were compared and no significant differences were found between the groups. The performance evaluated in subjects who had been using either standard or remote programs for 3 months showed no significant difference. Subjects were satisfied with both the remote and standard sessions. Safety was proven by checking emergency stops in different conditions. A very small delay was noticed that did not affect the ease of the fitting. The oral and video communication between the local and the remote equipment was established without difficulties and was of high quality.
30 CFR 250.911 - If my platform is subject to the Platform Verification Program, what must I do?
Code of Federal Regulations, 2012 CFR
2012-07-01
Title 30, Mineral Resources; Platforms and Structures; Platform Verification Program; § 250.911 If my platform is subject to the Platform Verification Program, what must I do?
30 CFR 250.911 - If my platform is subject to the Platform Verification Program, what must I do?
Code of Federal Regulations, 2013 CFR
2013-07-01
Title 30, Mineral Resources; Platforms and Structures; Platform Verification Program; § 250.911 If my platform is subject to the Platform Verification Program, what must I do?
30 CFR 250.911 - If my platform is subject to the Platform Verification Program, what must I do?
Code of Federal Regulations, 2014 CFR
2014-07-01
Title 30, Mineral Resources; Platforms and Structures; Platform Verification Program; § 250.911 If my platform is subject to the Platform Verification Program, what must I do?
31 CFR 205.3 - What Federal assistance programs are subject to this subpart A?
Code of Federal Regulations, 2011 CFR
2011-07-01
Title 31, Money and Finance: Treasury; § 205.3 What Federal assistance programs are subject to this subpart A? ... programs which: (1) are listed in the Catalog of Federal Domestic Assistance; (2) meet the funding ... Federal assistance programs subject to subpart A if a State or Federal Program Agency fails to comply with ...
31 CFR 205.3 - What Federal assistance programs are subject to this subpart A?
Code of Federal Regulations, 2012 CFR
2012-07-01
Title 31, Money and Finance: Treasury; § 205.3 What Federal assistance programs are subject to this subpart A? ... programs which: (1) are listed in the Catalog of Federal Domestic Assistance; (2) meet the funding ... Federal assistance programs subject to subpart A if a State or Federal Program Agency fails to comply with ...
31 CFR 205.3 - What Federal assistance programs are subject to this subpart A?
Code of Federal Regulations, 2013 CFR
2013-07-01
Title 31, Money and Finance: Treasury; § 205.3 What Federal assistance programs are subject to this subpart A? ... programs which: (1) are listed in the Catalog of Federal Domestic Assistance; (2) meet the funding ... Federal assistance programs subject to subpart A if a State or Federal Program Agency fails to comply with ...
31 CFR 205.3 - What Federal assistance programs are subject to this subpart A?
Code of Federal Regulations, 2014 CFR
2014-07-01
Title 31, Money and Finance: Treasury; § 205.3 What Federal assistance programs are subject to this subpart A? ... programs which: (1) are listed in the Catalog of Federal Domestic Assistance; (2) meet the funding ... Federal assistance programs subject to subpart A if a State or Federal Program Agency fails to comply with ...
31 CFR 205.3 - What Federal assistance programs are subject to this subpart A?
Code of Federal Regulations, 2010 CFR
2010-07-01
Title 31, Money and Finance: Treasury; § 205.3 What Federal assistance programs are subject to this subpart A? ... programs which: (1) are listed in the Catalog of Federal Domestic Assistance; (2) meet the funding ... Federal assistance programs subject to subpart A if a State or Federal Program Agency fails to comply with ...
[Effects of training on static and dynamic balance in elderly subjects who have had a fall or not].
Toulotte, C; Thévenon, A; Fabre, C
2004-11-01
To evaluate the effects of a physical training program on static and dynamic balance during single and dual task conditions in elderly subjects who have had a fall or not. Two groups, comprising a total of 33 elderly subjects, were trained: 16 who had a fall were 69.2 +/- 5.0 years old and 17 who had not had a fall were 67.3 +/- 3.8 years. All subjects underwent an unipedal test with eyes open and eyes closed, followed by gait assessment during single and dual motor task conditions, before and after a physical training program. All subjects showed a significant decrease, by six times for subjects who had fallen and four times by those who had not, in the number of touch-downs in the unipedal test with eyes open (P < 0.05), and by 2.5 and 2 times, respectively, with eyes closed (P < 0.05) after the training program. All subjects showed a significant increase in speed (P < 0.05), cadence (P < 0.05) and stride length (P < 0.05) and a significant decrease in the single support time (P < 0.05) and stride time (P < 0.05) in gait assessment during single and dual task conditions after the training program. During the training program, no subjects fell. The physical training program improved static balance and quality of gait in elderly subjects who had had a fall and those who had not, which could contribute to minimizing and/or retarding the effects of aging and maintaining physical independence.
Evaluation of a Group-Based Trauma Recovery Program in Gaza: Students' Subjective Experiences
ERIC Educational Resources Information Center
Barron, Ian; Abdullah, Ghassan
2012-01-01
Internationally, evaluation of group-based trauma recovery programs has relied upon normative outcome measures, with no studies systematically analyzing children's subjective experience for program development. In contrast, the current study explored children's experience of a Gazan recovery program "in their own words." Twenty-four…
Index to Computer Based Learning.
ERIC Educational Resources Information Center
Hoye, Robert E., Ed.; Wang, Anastasia C., Ed.
The computer-based programs and projects described in this index are listed under 98 different subject matter fields. Descriptions of programs include information on: subject field, program name and number, author, source, the program's curriculum content, prerequisites, level of instruction, type of student for which it is intended, total hours of…
Enhancing programming logic thinking using analogy mapping
NASA Astrophysics Data System (ADS)
Sukamto, R. A.; Megasari, R.
2018-05-01
Programming logic thinking is the most important competence for computer science students. However, programming is one of the most difficult subjects in computer science programs. This paper reports our work on enhancing students' programming logic thinking using Analogy Mapping in a basic programming subject. Analogy Mapping is a computer application which converts source code into analogy images. This research used a time-series evaluation, and the results showed that Analogy Mapping can enhance students' programming logic thinking.
NASA Astrophysics Data System (ADS)
Tamura, Naoyuki
This short article is about the Prime Focus Spectrograph (PFS), a very wide-field, massively multiplexed optical and near-infrared (NIR) spectrograph being developed as a next-generation facility instrument on the Subaru Telescope. More details and updates are available on the PFS official website (http://pfs.ipmu.jp), blog (http://pfs.ipmu.jp/blog/), and references therein. The project, instrument, and timeline: PFS will position 2400 fibers to science targets or blank sky in the 1.3 degree field on the Subaru prime focus. These fibers will be quickly (~60 sec) reconfigurable and will feed the photons collected during exposures to the Spectrograph System (SpS). SpS consists of 4 modules, each of which accommodates ~600 fibers and delivers spectral images ranging from 380 nm to 1260 nm simultaneously in one exposure via the 3 arms of blue, red, and NIR cameras. The instrument development has been undertaken by an international collaboration at the initiative of Kavli IPMU. The project is now going into the construction phase, aiming at system integration and on-sky engineering observations in 2017-2018, and science operation in 2019. The survey design has also been under development, envisioning a survey spanning ~300 nights over ~5 years in the framework of the Subaru Strategic Program (SSP). The key science areas are cosmology, galaxy/AGN evolution, and Galactic Archaeology (GA) (Takada et al. 2014). The cosmology program will constrain the nature of dark energy via a survey of emission-line galaxies over a comoving volume of 10 Gpc^3 at z = 0.8-2.4. In the galaxy/AGN program, the wide wavelength coverage of PFS as well as the large field of view will be exploited to characterize the galaxy populations and their clustering properties over a wide redshift range. A survey of color-selected galaxies/AGN at z = 1-2 will be conducted over 20 square degrees, yielding a fair sample of galaxies with stellar masses down to ~10^10 M⊙. In the GA program, radial velocities and chemical abundances of stars in the Milky Way, dwarf spheroidals, and M31 will be used to understand the past assembly histories of those galaxies and the structures of their dark matter halos. Spectra will be taken for 1 million stars as faint as V = 22 mag, and therefore out to large distances from the Sun. PFS will provide powerful spectroscopic capabilities even in the era of Euclid, LSST, WFIRST, and TMT, and effective synergies are expected to yield further unique science outputs.
High profile students' growth of mathematical understanding in solving linear programming problems
NASA Astrophysics Data System (ADS)
Utomo; Kusmayadi, TA; Pramudya, I.
2018-04-01
Linear programming plays an important role in everyday life. It is taught at the senior high school and college levels and is applied in economics, transportation, the military, and other fields; mastering linear programming is therefore useful preparation for life. This research describes the growth of mathematical understanding in solving linear programming problems, based on the Pirie-Kieren model of the growth of understanding, and therefore used a qualitative approach. The subjects were grade XI students in Salatiga city; the two subjects of this study had high profiles. The researcher generally chose the subjects based on the growth of understanding shown in a classroom test result, with marks of ≥ 75 on the prerequisite material. Both subjects were interviewed by the researcher to investigate their growth of mathematical understanding in solving linear programming problems. The findings showed that the subjects often folded back to the primitive knowing level in order to move forward to the next level; this happened because their primitive understanding was not comprehensive.
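For context, here is a made-up example of the kind of linear programming problem such students solve (maximize a linear objective subject to linear constraints), written with SciPy; the coefficients are classroom-style placeholders, not taken from the study.

# A made-up classroom-style linear programming problem solved with SciPy.
from scipy.optimize import linprog

# Maximize 3x + 5y  subject to  x + 2y <= 14,  3x - y >= 0,  x - y <= 2,  x, y >= 0.
# linprog minimizes, so the objective is negated.
c = [-3, -5]
A_ub = [[1, 2],    # x + 2y <= 14
        [-3, 1],   # 3x - y >= 0, rewritten as -3x + y <= 0
        [1, -1]]   # x - y <= 2
b_ub = [14, 0, 2]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print("optimal (x, y):", res.x, " maximum value:", -res.fun)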
Kondo, Ayumi; Ide, Mihoko; Takahashi, Ikue; Taniai, Tomoko; Miura, Kasumi; Yamaguchi, Akiko; Yotsuji, Naomi; Matsumoto, Toshihiko
2014-04-01
We developed the TAMA Mental Health and Welfare Center Relapse Prevention Program (TAMARPP) and evaluated its efficacy. We provided the program to 59 substance abusers at the Tokyo Tama Comprehensive Mental Health and Welfare Center, and conducted brief interviews and questionnaire surveys with them four times during an eight-month follow-up period. The main results were as follows. 1) Most of the subjects had not yet "hit bottom". 2) More than half of the subjects continued participating in the program for more than 2 months, and their attendance rate was fairly high. 3) Some of the subjects began attending a self-help group such as NA or AA during the follow-up period. 4) The mood states of the subjects gradually improved during the period. 5) About one-third of the subjects abused substances again after two months in the program, but all of them continued to attend the program or private counseling, and most of their families also continued to receive support from the center. These findings suggest that it is meaningful to offer a friendly, less confrontational program such as TAMARPP at our center to support the many substance abusers who have not yet "hit bottom", as well as their families.
Pilot study of a multidisciplinary gout patient education and monitoring program.
Fields, Theodore R; Rifaat, Adam; Yee, Arthur M F; Ashany, Dalit; Kim, Katherine; Tobin, Matthew; Oliva, Nicole; Fields, Kara; Richey, Monica; Kasturi, Shanthini; Batterman, Adena
2017-04-01
Gout patient self-management knowledge and adherence to treatment regimens are poor. Our objective was to assess the feasibility and acceptability of a multidisciplinary team-based pilot program for the education and monitoring of gout patients. Subjects completed a gout self-management knowledge exam, along with gout flare history and compliance questionnaires, at enrollment and at 6 and 12 months. Each exam was followed by a nursing educational intervention via a structured gout curriculum. Structured monthly follow-up calls from pharmacists emphasized adherence to management programs. Primary outcomes were subject and provider program evaluation questionnaires at 6 and 12 months, program retention rate and success in reaching patients via monthly calls. Overall, 40/45 subjects remained in the study at 12 months. At 12 months, on a scale of 1 (most) to 5 (least), ratings of 3 or better were given by 84.6% of subjects evaluating the usefulness of the overall program in understanding and managing their gout, 81.0% of subjects evaluating the helpfulness of the nursing education program, and 50.0% of subjects evaluating the helpfulness of the calls from the pharmacists. Knowledge exam questions that were most frequently answered incorrectly on repeat testing concerned bridge therapy, the possibility of being flare-free, and the genetic component of gout. Our multidisciplinary program of gout patient education and monitoring demonstrates feasibility and acceptability. We identified variability in patient preference for components of the program and persistent patient knowledge gaps. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
10 CFR 1040.63 - Discrimination prohibited.
Code of Federal Regulations, 2010 CFR
2010-01-01
... subjected to discrimination under any program or activity that receives Federal financial assistance from... or substantially impairing accomplishment of the objectives of the recipient's program or activity... subjecting them to discrimination under any program or activity that receives Federal financial assistance...
13 CFR 121.701 - What SBIR programs are subject to size determinations?
Code of Federal Regulations, 2013 CFR
2013-01-01
... Requirements for the Small Business Innovation Research (sbir) Program § 121.701 What SBIR programs are subject... funding agreement pursuant to the Small Business Innovation Development Act of 1982 (Pub. L. 97-219, 15 U... Small Business Innovation Research (SBIR) and Small Business Technology Transfer (STTR) Programs 2. At...
40 CFR 97.534 - Recordkeeping and reporting.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Acid Rain Program or a TR NOX Annual emissions limitation or if the owner or operator of such unit... not subject to the Acid Rain Program or a TR NOX Annual emissions limitation, then the designated... Ozone Season units that are also subject to the Acid Rain Program, TR NOX Annual Trading Program, TR SO2...
40 CFR 97.534 - Recordkeeping and reporting.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Acid Rain Program or a TR NOX Annual emissions limitation or if the owner or operator of such unit... not subject to the Acid Rain Program or a TR NOX Annual emissions limitation, then the designated... Ozone Season units that are also subject to the Acid Rain Program, TR NOX Annual Trading Program, TR SO2...
40 CFR 97.534 - Recordkeeping and reporting.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Acid Rain Program or a TR NOX Annual emissions limitation or if the owner or operator of such unit... not subject to the Acid Rain Program or a TR NOX Annual emissions limitation, then the designated... Ozone Season units that are also subject to the Acid Rain Program, TR NOX Annual Trading Program, TR SO2...
Does My Program Really Make a Difference? Program Evaluation Utilizing Aggregate Single-Subject Data
ERIC Educational Resources Information Center
Burns, Catherine E.
2015-01-01
In the current climate of increasing fiscal and clinical accountability, information is required about overall program effectiveness using clinical data. These requests present a challenge for programs utilizing single-subject data due to the use of highly individualized behavior plans and behavioral monitoring. Subsequently, the diversity of the…
Code of Federal Regulations, 2013 CFR
2013-04-01
25 CFR 1000.123 (Title 25, Indians): Are there non-BIA programs for which the Secretary must negotiate for inclusion in an AFA subject to such terms as the parties may negotiate?
Code of Federal Regulations, 2014 CFR
2014-04-01
25 CFR 1000.123 (Title 25, Indians): Are there non-BIA programs for which the Secretary must negotiate for inclusion in an AFA subject to such terms as the parties may negotiate?
Code of Federal Regulations, 2012 CFR
2012-04-01
25 CFR 1000.123 (Title 25, Indians): Are there non-BIA programs for which the Secretary must negotiate for inclusion in an AFA subject to such terms as the parties may negotiate?
Code of Federal Regulations, 2010 CFR
2010-04-01
25 CFR 1000.123 (Title 25, Indians): Are there non-BIA programs for which the Secretary must negotiate for inclusion in an AFA subject to such terms as the parties may negotiate?
Code of Federal Regulations, 2011 CFR
2011-04-01
25 CFR 1000.123 (Title 25, Indians): Are there non-BIA programs for which the Secretary must negotiate for inclusion in an AFA subject to such terms as the parties may negotiate?
Valenti-Hein, D C; Yarnold, P R; Mueser, K T
1994-01-01
The effectiveness of a social skills training program for improving heterosocial interactions in persons with mental retardation was examined. Moderate to borderline mentally retarded subjects were selected based on problems with social anxiety and social skill deficits. Subjects were then randomly assigned to either a 12-session Dating Skills Program (DSP) or a wait list control (WLC) group. Assessments of social skills in a role-play test, knowledge about social/sexual situations, and social anxiety were obtained for all subjects at baseline, posttreatment, and at an 8-week follow-up. In addition, naturalistic observations were made of interactions of subjects in the DSP group. Subjects who participated in the DSP showed improvements in social skill and social/sexual knowledge at posttest and at follow-up compared to subjects in the WLC group. Social anxiety did not change over time for either group of subjects. Subjects who received the DSP increased interactions with persons of the opposite gender over time, while same-gender interactions decreased. The results replicate and extend previous research on the Dating Skills Program, and suggest that social skills training interventions may improve the heterosocial interactions of adults with mental retardation.
DiStefano, Lindsay J; Padua, Darin A; DiStefano, Michael J; Marshall, Stephen W
2009-03-01
Anterior cruciate ligament (ACL) injury prevention programs show promising results with changing movement; however, little information exists regarding whether a program designed for an individual's movements may be effective or how baseline movements may affect outcomes. A program designed to change specific movements would be more effective than a "one-size-fits-all" program. Greatest improvement would be observed among individuals with the most baseline error. Subjects of different ages and sexes respond similarly. Randomized controlled trial; Level of evidence, 1. One hundred seventy-three youth soccer players from 27 teams were randomly assigned to a generalized or stratified program. Subjects were videotaped during jump-landing trials before and after the program and were assessed using the Landing Error Scoring System (LESS), which is a valid clinical movement analysis tool. A high LESS score indicates more errors. Generalized players performed the same exercises, while the stratified players performed exercises to correct their initial movement errors. Change scores were compared between groups of varying baseline errors, ages, sexes, and programs. Subjects with the highest baseline LESS score improved the most (95% CI, -3.4 to -2.0). High school subjects (95% CI, -1.7 to -0.98) improved their technique more than pre-high school subjects (95% CI, -1.0 to -0.4). There was no difference between the programs or sexes. Players with the greatest amount of movement errors experienced the most improvement. A program's effectiveness may be enhanced if this population is targeted.
Code of Federal Regulations, 2010 CFR
2010-01-01
... LIQUIDITY GUARANTEE PROGRAM § 370.10 Oversight. (a) Participating entities are subject to the FDIC's oversight regarding compliance with the terms of the temporary liquidity guarantee program. (b) A..., for the duration of the temporary liquidity guarantee program, to be subject to the FDIC's authority...
49 CFR 17.3 - What programs and activities of the Department are subject to these regulations?
Code of Federal Regulations, 2010 CFR
2010-10-01
Title 49, Transportation; Office of the Secretary of Transportation; Intergovernmental Review of Department of Transportation Programs and Activities; § 17.3 What programs and activities of the Department are subject to these regulations?
Yang, Jong-Eun; Lee, Tac-Young; Kim, Jin-Kyung
2017-12-01
[Purpose] The purpose of this study is to explore the effect of a VR exercise program on falls and depression in the elderly with mild depression who reside in the local community. [Subjects and Methods] This study was performed by targeting 15 elderly subjects with mild depression who resided in the local community. The targeted subjects voluntarily selected 3 VR exercise programs (each lasting 10 minutes) among 4 activities, and a resting time of 5 minutes was given for an interval after each activity. The VR exercise program was performed for total 12 weeks (36 times), 3 times a week, 45 minutes per session. [Results] After exercise, scores of static balance test (anteroposterior), Falls Efficacy Scale, and the Activities-specific Balance Confidence Scale in the test subjects were improved and depression and internal stress scores were significantly decreased after the intervention. [Conclusion] It can be concluded that the VR exercise program exerts a positive effect not only on the physical factor but also on the mental factor of the elderly subjects with mild depression who reside in the local community. It is expected that based on the VR exercise program, diversified home programs for the elderly should be developed in the future.
NASA Astrophysics Data System (ADS)
Abdul Hadi, Normi; Mohd Noor, Norlenda; Abd Halim, Suhaila; Alwadood, Zuraida; Khairol Azmi, Nurul Nisa'
2013-04-01
Mathematics is a basic subject in primary and secondary schools. Early exposure to mathematics is very important since it affects students' perception of the subject for their entire lives. Therefore, a program called 'Mini Hari Matematik' was conducted to introduce basic mathematics concepts through games suited to the knowledge of Standard Four and Five students. A questionnaire regarding students' perception of the subject was distributed before and after the program. The analysis showed that the program positively changed the students' perception of mathematics.
Turan, Tanya N; Al Kasab, Sami; Nizam, Azhar; Lynn, Michael J; Harrell, Jamie; Derdeyn, Colin P; Fiorella, David; Janis, L Scott; Lane, Bethany F; Montgomery, Jean; Chimowitz, Marc I
2018-03-01
Lifestyle modification programs have improved the achievement of risk factor targets in a variety of clinical settings, including patients who have previously suffered a stroke or transient ischemic attack and those with multiple risk factors. Stenting Aggressive Medical Management for Prevention of Recurrent Stroke in Intracranial Stenosis (SAMMPRIS) was the first vascular disease prevention trial to provide a commercially available lifestyle modification program to enhance risk factor control. We sought to determine the relationship between compliance with this program and risk factor control in SAMMPRIS. SAMMPRIS aggressive medical management included a telephonic lifestyle modification program provided free of charge to all subjects (n = 451) during their participation in the study. Subjects with fewer than 3 expected lifestyle-coaching calls were excluded from these analyses. Compliant subjects (n = 201) had greater than or equal to 78.5% of calls (median % of completed/expected calls). Noncompliant subjects (n = 200) had less than 78.5% of calls or refused to participate. Mean risk factor values or % in-target for each risk factor was compared between compliant versus noncompliant subjects, using t tests and chi-square tests. Risk factor changes from baseline to follow-up were compared between the groups to account for baseline differences. Compliant subjects had better risk factor control throughout follow-up for low-density lipoprotein, systolic blood pressure (SBP), hemoglobin A1c (HgA1c), non-high-density lipoprotein, nonsmoking, and exercise than noncompliant subjects, but there was no difference for body mass index. After adjusting for baseline differences between the groups, compliant subjects had a greater change from baseline than noncompliant subjects for SBP did at 24 months and HgA1c at 6 months. SAMMPRIS subjects who were compliant with the lifestyle modification program had better risk factor control during the study for almost all risk factors. Published by Elsevier Inc.
Evaluation of a Stress Management Program in a Child Protection Agency.
ERIC Educational Resources Information Center
Cahill, Janet; Feldman, Lenard H.
High stress levels experienced by child protection workers have been well documented. This study examined the effectiveness of a stress management program in a child protection agency. Subjects were case workers, immediate supervisors, and clerical staff; 320 subjects participated in pretesting and 279 subjects participated in posttesting.…
32 CFR 321.5 - Access by subject individuals.
Code of Federal Regulations, 2011 CFR
2011-07-01
Title 32, National Defense; Department of Defense, Office of the Secretary of Defense; Privacy Program; Defense Security Service Privacy Program; § 321.5 Access by subject individuals.
32 CFR 321.5 - Access by subject individuals.
Code of Federal Regulations, 2012 CFR
2012-07-01
Title 32, National Defense; Department of Defense, Office of the Secretary of Defense; Privacy Program; Defense Security Service Privacy Program; § 321.5 Access by subject individuals.
32 CFR 321.5 - Access by subject individuals.
Code of Federal Regulations, 2014 CFR
2014-07-01
Title 32, National Defense; Department of Defense, Office of the Secretary of Defense; Privacy Program; Defense Security Service Privacy Program; § 321.5 Access by subject individuals.
32 CFR 321.5 - Access by subject individuals.
Code of Federal Regulations, 2013 CFR
2013-07-01
Title 32, National Defense; Department of Defense, Office of the Secretary of Defense; Privacy Program; Defense Security Service Privacy Program; § 321.5 Access by subject individuals.
32 CFR 321.5 - Access by subject individuals.
Code of Federal Regulations, 2010 CFR
2010-07-01
Title 32, National Defense; Department of Defense, Office of the Secretary of Defense; Privacy Program; Defense Security Service Privacy Program; § 321.5 Access by subject individuals.
Park, Byoung-Sun; Noh, Ji-Woong; Kim, Mee-Young; Lee, Lim-Kyu; Yang, Seung-Min; Lee, Won-Deok; Shin, Yong-Sub; Kim, Ju-Hyun; Lee, Jeong-Uk; Kwak, Taek-Yong; Lee, Tae-Hyun; Park, Jaehong; Kim, Junghwan
2016-06-01
[Purpose] The purpose of this study was to compare the effects of aquatic and land-based trunk exercise program on gait in stroke patients. [Subjects and Methods] The subjects were 28 hemiplegic stroke patients (20 males, 8 females). The subjects performed a trunk exercise program for a total of four weeks. [Results] Walking speed and cycle, stance phase and stride length of the affected side, and the symmetry index of the stance phase significantly improved after the aquatic and land-based trunk exercise program. [Conclusion] These results suggest that the aquatic and land-based trunk exercise program may help improve gait performance ability after stroke.
New York City Russian Bilingual Program, 1981-1982. O.E.E. Final Evaluation Report.
ERIC Educational Resources Information Center
New York City Board of Education, Brooklyn, NY. Office of Educational Evaluation.
The New York City Russian Bilingual Program, evaluated here, serves students in grades 9-12 in three public and eight private schools. Three groups of subjects are included in the program: English as a second language, native language arts, and content-area subjects. All students take some mainstream classes from the beginning of the program. In…
13 CFR 121.401 - What procurement programs are subject to size determinations?
Code of Federal Regulations, 2010 CFR
2010-01-01
Title 13, Business Credit and Assistance; Small Business Administration; § 121.401 What procurement programs are subject to size determinations? Covered programs include the Women-Owned Small Business (WOSB) Federal Contract Assistance Procedures and SBA's Service ...
Ha, Sung-min; Kwon, Oh-yun; Yi, Chung-hwi; Cynn, Heon-seock; Weon, Jong-hyuck; Kim, Tae-ho
2016-02-01
The purpose of this study was to investigate the effects of a 6-week scapular upward rotation exercise (SURE) on scapular and clavicular alignment and scapular upward rotators strength in subjects with scapular downward rotation syndrome (SDRS). Seventeen volunteer subjects with SDRS were recruited from university populations. The alignment of the scapula and clavicle was measured using radiographic analysis and compared in subjects before and after a 6-week self-SURE program. A hand-held dynamometer was used to measure the strength of the scapular upward rotators. The subjects were instructed how to perform the self-SURE program at home. The 6-week self-SURE program was divided into two sections (the first section with non-resistive SURE during weeks 1-3, and the second section with resistive SURE using thera-band during weeks 4-6). The significance of the difference between pre- and post-program was assessed using a paired t-test, with the level of statistical significance set at p<0.05. Significant differences between pre- and post-program were found for scapular and clavicular alignment (p<0.05). Additionally, the comparison between pre- and post-program measurements of the strength of the scapular upward rotators showed significant differences (p<0.05). The results of this study showed that a 6-week self-SURE program is effective for improving scapular and clavicular alignment and increasing the strength of scapular upward rotator muscles in subjects with SDRS. Copyright © 2015 Elsevier Ltd. All rights reserved.
Jung, Ji-Yoon; Park, So-Yeon; Kim, Jin-Kyung
2018-01-01
[Purpose] This study aimed to examine the effects of a client-centered leisure activity program on satisfaction, upper limb function, self-esteem, and depression in elderly residents of a long-term care facility. [Subjects and Methods] This study included 12 elderly subjects, aged 65 or older, residing in a nursing home. The subjects were divided into an experimental and a control group. Subjects in the control group received leisure activities already provided by the facility. The experimental group participated in a client-centered leisure activity program. The subjects conducted individual activities three times per week, 30 minutes per session. The group activity was conducted three times per week for eight weeks. Each subject’s performance of and satisfaction with the leisure activity programs, upper limb function, self-esteem, and depression were measured before and after the intervention. [Results] After participating in a program, significant improvements were seen in both the Canadian Occupational Performance Measure and upper limb function in the experimental group. Also after the intervention, the subjects’ self-esteem significantly increased and their depression significantly decreased. [Conclusion] A client-centered leisure activity program motivates elderly people residing in a long-term care facility and induces their voluntary participation. Such customized programs are therefore effective for enhancing physical and psychological functioning in this population. PMID:29410570
Park, Byoung-Sun; Noh, Ji-Woong; Kim, Mee-Young; Lee, Lim-Kyu; Yang, Seung-Min; Lee, Won-Deok; Shin, Yong-Sub; Kim, Ju-Hyun; Lee, Jeong-Uk; Kwak, Taek-Yong; Lee, Tae-Hyun; Park, Jaehong; Kim, Junghwan
2016-01-01
[Purpose] The purpose of this study was to compare the effects of aquatic and land-based trunk exercise program on gait in stroke patients. [Subjects and Methods] The subjects were 28 hemiplegic stroke patients (20 males, 8 females). The subjects performed a trunk exercise program for a total of four weeks. [Results] Walking speed and cycle, stance phase and stride length of the affected side, and the symmetry index of the stance phase significantly improved after the aquatic and land-based trunk exercise program. [Conclusion] These results suggest that the aquatic and land-based trunk exercise program may help improve gait performance ability after stroke. PMID:27390444
Interactive (statistical) visualisation and exploration of a billion objects with vaex
NASA Astrophysics Data System (ADS)
Breddels, M. A.
2017-06-01
With new catalogues arriving, such as Gaia DR1 with more than a billion objects, new methods of handling and visualizing these data volumes are needed. We show that by calculating statistics on a regular (N-dimensional) grid, visualizations of a billion objects can be done within a second on a modern desktop computer. This is achieved using memory mapping of hdf5 files together with a simple binning algorithm, both of which are part of a Python library called vaex. This enables efficient interactive exploration of large datasets, making science exploration of large catalogues feasible. Vaex is both a Python library and an application that allows interactive exploration and visualization. The motivation for developing vaex is the catalogue of the Gaia satellite; however, vaex can also be used on SPH or N-body simulations, on other (future) catalogues such as SDSS, Pan-STARRS, or LSST, or on any other tabular data. The homepage for vaex is http://vaex.astro.rug.nl.
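To make the binning idea concrete, here is a minimal sketch (not vaex's actual implementation) of accumulating counts on a regular grid by streaming columns from an hdf5 file with numpy, so only one chunk is ever resident in memory; the file name "catalogue.hdf5" and the column paths are hypothetical.

import h5py
import numpy as np

# Open the catalogue without loading it into RAM; h5py returns datasets
# that can be sliced lazily. File and column names are illustrative.
with h5py.File("catalogue.hdf5", "r") as f:
    x = f["/x"]          # e.g. longitude column
    y = f["/y"]          # e.g. latitude column
    shape = (512, 512)   # regular 2-d grid
    grid = np.zeros(shape)
    n = x.shape[0]
    chunk = 10_000_000
    # Stream the columns in chunks and accumulate per-cell counts.
    for i in range(0, n, chunk):
        xs = x[i:i + chunk]
        ys = y[i:i + chunk]
        h, _, _ = np.histogram2d(xs, ys, bins=shape,
                                 range=[[0, 360], [-90, 90]])
        grid += h

# Log-scaling the counts gives the familiar density plot of a
# billion-object catalogue without ever plotting individual points.
density = np.log1p(grid)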
Ganalyzer: A tool for automatic galaxy image analysis
NASA Astrophysics Data System (ADS)
Shamir, Lior
2011-05-01
Ganalyzer is a model-based tool that automatically analyzes and classifies galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large datasets of galaxy images collected by autonomous sky surveys such as SDSS, LSST or DES.
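The sketch below is a rough illustration of the radial-intensity idea described above, not Ganalyzer's actual code: for each radius around a previously determined galaxy centre it records the angle of the brightest pixel on that ring, and the slope of angle against radius serves as a crude spirality measure (near zero for ellipticals, larger in magnitude for spirals whose arms wind with radius). The function names and the one-peak-per-ring simplification are my own.

import numpy as np

def radial_intensity_peaks(image, cx, cy, r_max, n_theta=360):
    """For each integer radius, return the polar angle of the brightest
    pixel on that ring; assumes the rings stay inside the image."""
    thetas = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    peak_angle = []
    for r in range(1, r_max):
        xs = (cx + r * np.cos(thetas)).astype(int)
        ys = (cy + r * np.sin(thetas)).astype(int)
        ring = image[ys, xs]                 # sample the ring
        peak_angle.append(thetas[np.argmax(ring)])
    return np.unwrap(np.array(peak_angle))   # remove 2*pi jumps

def spirality(image, cx, cy, r_max):
    """Slope of peak angle vs. radius as a simple spirality indicator."""
    angles = radial_intensity_peaks(image, cx, cy, r_max)
    radii = np.arange(1, r_max)
    slope, _ = np.polyfit(radii, angles, 1)
    return abs(slope)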
LSST system analysis and integration task for an advanced science and application space platform
NASA Technical Reports Server (NTRS)
1980-01-01
To support the development of an advanced science and application space platform (ASASP), requirements were defined for a representative set of payloads requiring large separation distances, selected from the Science and Applications Space Platform data base. These payloads were a 100 meter diameter atmospheric gravity wave antenna, a 100 meter by 100 meter particle beam injection experiment, a 2 meter diameter, 18 meter long astrometric telescope, and a 15 meter diameter, 35 meter long large ambient deployable IR telescope. A low earth orbit at 500 km altitude and 56 deg inclination was selected as being the best compromise for meeting payload requirements. Platform subsystems were defined which would support the payload requirements and a physical platform concept was developed. Structural system requirements which included utilities accommodation, interface requirements, and platform strength and stiffness requirements were developed. An attitude control system concept was also described. The resultant ASASP concept was analyzed and technological developments deemed necessary in the area of large space systems were recommended.
NASA Astrophysics Data System (ADS)
Yoon, Mijin; Jee, Myungkook James; Tyson, Tony
2018-01-01
The Deep Lens Survey (DLS), a precursor to the Large Synoptic Survey Telescope (LSST), is a 20 sq. deg survey carried out with NOAO’s Blanco and Mayall telescopes. The strength of the survey lies in its depth reaching down to ~27th mag in BVRz bands. This enables a broad redshift baseline study and allows us to investigate cosmological evolution of the large-scale structure. In this poster, we present the first cosmological analysis from the DLS using galaxy-shear correlations and galaxy clustering signals. Our DLS shear calibration accuracy has been validated through the most recent public weak-lensing data challenge. Photometric redshift systematic errors are tested by performing lens-source flip tests. Instead of real-space correlations, we reconstruct band-limited power spectra for cosmological parameter constraints. Our analysis puts a tight constraint on the matter density and the power spectrum normalization parameters. Our results are highly consistent with our previous cosmic shear analysis and also with the Planck CMB results.
Variability Analysis: Detection and Classification
NASA Astrophysics Data System (ADS)
Eyer, L.
2005-01-01
The Gaia mission will offer an exceptional opportunity to perform variability studies. The data homogeneity, its optimised photometric systems, composed of 11 medium and 4-5 broad bands, the high photometric precision in G band of one milli-mag for V = 13-15, the radial velocity measurements and the exquisite astrometric precision for one billion stars will permit a detailed description of variable objects like stars, quasars and asteroids. However the time sampling and the total number of measurements change from one object to another because of the satellite scanning law. The data analysis is a challenge because of the huge amount of data, the complexity of the observed objects and the peculiarities of the satellite, and needs thorough preparation. Experience can be gained by the study of past and present survey analyses and results, and Gaia should be put in perspective with the future large scale surveys, like PanSTARRS or LSST. We present the activities of the Variable Star Working Group and a general plan to digest this unprecedented data set, focusing here on the photometry.
Primary results from the Pan-STARRS-1 Outer Solar System Key Project
NASA Astrophysics Data System (ADS)
Holman, Matthew J.; Chen, Ying-Tung; Lackner, Michael; Payne, Matthew John; Lin, Hsing-Wen; Cristopher Fraser, Wesley; Lacerda, Pedro; Pan-STARRS 1 Science Consortium
2016-10-01
We have completed a search for slow moving bodies in the data obtained by the Pan-STARRS-1 (PS1) Science Consortium from 2010 to 2014. The data set covers the full sky north of -30 degrees declination, in the PS1 g, r, i, z, y, and w (g+r+i) filters. Our novel distance-based search is effective at detecting and linking very slow moving objects with sparsely sampled observations, even if observations are widely separated in RA, Dec and time, which is relevant to the future LSST solar system searches. In particular, our search is sensitive to objects at heliocentric distances of 25-2000 AU with magnitudes brighter than approximately r=22.5, without limits on the inclination of the object. We recover hundreds of known TNOs and Centaurs and discover hundreds of new objects, measuring phase and color information for many of them. Other highlights include the discovery of a second retrograde TNO, a number of Neptune Trojans, and large numbers of distant resonant TNOs.
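As a hedged illustration of what a distance-based linking search can look like (this is not the PS1 pipeline), the sketch below maps each detection to a heliocentric direction under a trial distance and groups detections whose corrected directions agree within a tolerance; at a roughly correct trial distance, detections of a slow, distant object cluster together even when they are widely separated in time. All names and the simple greedy grouping are assumptions.

import numpy as np

def heliocentric_direction(earth_pos, ra, dec, distance_au):
    """Map a geocentric detection (ra, dec in radians) to the direction of
    the object as seen from the Sun, assuming it lies at distance_au."""
    los = np.array([np.cos(dec) * np.cos(ra),
                    np.cos(dec) * np.sin(ra),
                    np.sin(dec)])
    helio = earth_pos + distance_au * los    # AU, heliocentric position
    return helio / np.linalg.norm(helio)

def link_by_trial_distance(detections, distance_au, tol_rad=1e-4):
    """Group detections whose distance-corrected directions agree within
    tol_rad.  detections is a list of (earth_pos, ra, dec) tuples."""
    dirs = [heliocentric_direction(e, ra, dec, distance_au)
            for e, ra, dec in detections]
    groups = []
    for i, d in enumerate(dirs):
        for g in groups:
            sep = np.arccos(np.clip(np.dot(d, dirs[g[0]]), -1.0, 1.0))
            if sep < tol_rad:
                g.append(i)
                break
        else:
            groups.append([i])
    return groups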
zBEAMS: a unified solution for supernova cosmology with redshift uncertainties
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roberts, Ethan; Lochner, Michelle; Bassett, Bruce A.
Supernova cosmology without spectra will be an important component of future surveys such as LSST. This lack of supernova spectra results in uncertainty in the redshifts which, if ignored, leads to significantly biased estimates of cosmological parameters. Here we present a hierarchical Bayesian formalism, zBEAMS, that addresses this problem by marginalising over the unknown or uncertain supernova redshifts to produce unbiased cosmological estimates that are competitive with supernova data with spectroscopically confirmed redshifts. zBEAMS provides a unified treatment of both photometric redshifts and host galaxy misidentification (occurring due to chance galaxy alignments or faint hosts), effectively correcting the inevitable contamination in the Hubble diagram. Like its predecessor BEAMS, our formalism also takes care of non-Ia supernova contamination by marginalising over the unknown supernova type. We illustrate this technique with simulations of supernovae with photometric redshifts and host galaxy misidentification. A novel feature of the photometric redshift case is the important role played by the redshift distribution of the supernovae.
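A minimal sketch of the central idea, marginalising a single supernova's likelihood over its photometric-redshift distribution rather than fixing the redshift, is given below. It assumes a flat LCDM distance modulus and omits the host-misidentification and non-Ia components that zBEAMS also handles; the cosmology values and function names are illustrative assumptions.

import numpy as np
from scipy.integrate import quad

C_KM_S, H0 = 299792.458, 70.0   # assumed cosmology for this sketch

def distance_modulus(z, omega_m=0.3):
    """Flat LCDM distance modulus at redshift z (z > 0)."""
    integrand = lambda zp: 1.0 / np.sqrt(omega_m * (1 + zp)**3 + 1 - omega_m)
    d_c, _ = quad(integrand, 0.0, z)
    d_l = (1 + z) * (C_KM_S / H0) * d_c          # luminosity distance in Mpc
    return 5.0 * np.log10(d_l) + 25.0

def marginal_likelihood(mu_obs, sigma_mu, z_grid, p_z, omega_m=0.3):
    """Likelihood of one supernova's observed distance modulus with the
    unknown redshift integrated out against its photo-z distribution p_z
    (z_grid should start above zero).  This is the core marginalisation
    step; the full formalism adds host and type components."""
    mu_model = np.array([distance_modulus(z, omega_m) for z in z_grid])
    gauss = np.exp(-0.5 * ((mu_obs - mu_model) / sigma_mu) ** 2) \
            / (np.sqrt(2 * np.pi) * sigma_mu)
    return np.trapz(gauss * p_z, z_grid)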
Unveiling the Low Surface Brightness Stellar Peripheries of Galaxies
NASA Astrophysics Data System (ADS)
Ferguson, Annette M. N.
2018-01-01
The low surface brightness peripheral regions of galaxies contain a gold mine of information about how minor mergers and accretions have influenced their evolution over cosmic time. Enormous stellar envelopes and copious amounts of faint tidal debris are natural outcomes of the hierarchical assembly process and the search for and study of these features, albeit highly challenging, offers the potential for unrivalled insight into the mechanisms of galaxy growth. Over the last two decades, there has been burgeoning interest in probing galaxy outskirts using resolved stellar populations. Wide-field surveys have uncovered vast tidal debris features and new populations of very remote globular clusters, while deep Hubble Space Telescope photometry has provided exquisite star formation histories back to the earliest epochs. I will highlight some recent results from studies within and beyond the Local Group and conclude by briefly discussing the great potential of future facilities, such as JWST, Euclid, LSST and WFIRST, for major breakthroughs in low surface brightness galaxy periphery science.
Systematic Serendipity: A Method to Discover the Anomalous
NASA Astrophysics Data System (ADS)
Giles, Daniel; Walkowicz, Lucianne
2018-01-01
One of the challenges in the era of big data astronomical surveys is identifying anomalous data, data that exhibit as-yet-unobserved behavior. These data may result from systematic errors, extreme (or rare) forms of known phenomena, or, most interestingly, truly novel phenomena that have historically required a trained eye and often fortuitous circumstance to identify. We describe a method that uses machine clustering techniques to discover anomalous data in Kepler lightcurves, as a step towards systematizing the detection of novel phenomena in the era of LSST. As a proof of concept, we apply our anomaly detection method to Kepler data including Boyajian's Star (KIC 8462852). We examine quarters 4, 8, 11, and 16 of the Kepler data, which contain Boyajian's Star acting normally (quarters 4 and 11) and anomalously (quarters 8 and 16). We demonstrate that our method is capable of identifying Boyajian's Star's anomalous behavior in the quarters of interest, and we further identify other anomalous light curves that exhibit a range of interesting variability.
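A toy version of such a clustering-based anomaly search might look like the following sketch (an assumed feature set and scikit-learn KMeans, not the authors' pipeline): light curves are reduced to a few summary statistics, clustered, and scored by their distance to the nearest cluster centre, with the largest scores flagged as anomaly candidates.

import numpy as np
from sklearn.cluster import KMeans

def lightcurve_features(flux):
    """A few simple per-quarter statistics; a real analysis would use a
    much richer feature set."""
    f = flux / np.nanmedian(flux)
    good = f[~np.isnan(f)]
    return [np.nanstd(f),
            np.nanmax(f) - np.nanmin(f),
            np.mean(np.abs(np.diff(good)))]

def anomaly_scores(lightcurves, n_clusters=20):
    """Cluster the feature vectors and score each light curve by its
    distance to the nearest cluster centre; the largest scores are the
    candidates for anomalous behaviour."""
    X = np.array([lightcurve_features(lc) for lc in lightcurves])
    X = (X - X.mean(axis=0)) / X.std(axis=0)       # standardise features
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(X)
    return np.min(km.transform(X), axis=1)         # distance to nearest centre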
A model to forecast data centre infrastructure costs.
NASA Astrophysics Data System (ADS)
Vernet, R.
2015-12-01
The computing needs in the HEP community are increasing steadily, but the current funding situation in many countries is tight. As a consequence, experiments, data centres, and funding agencies have to rationalize resource usage and expenditures. CC-IN2P3 (Lyon, France) provides computing resources to many experiments including LHC, and is a major partner for astroparticle projects like LSST, CTA or Euclid. The financial cost to accommodate all these experiments is substantial and has to be planned well in advance for funding and strategic reasons. In that perspective, leveraging the infrastructure expenses, electric power costs, and hardware performance observed at our site over the last years, we have built a model that integrates these data and provides estimates of the investments that would be required to cater to the experiments for the mid-term future. We present how our model is built and the expenditure forecast it produces, taking into account the experiment roadmaps. We also examine the resource growth predicted by our model over the coming years, assuming a flat-budget scenario.
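As an illustration of the kind of model described (not CC-IN2P3's actual model), the sketch below forecasts yearly spending from pledged capacity, an assumed per-server performance and price trend, a fixed retirement rate, and an electricity term; every parameter value is a made-up assumption.

def yearly_purchase_cost(required_hs06, installed_hs06, retiring_hs06,
                         hs06_per_server, cost_per_server):
    """CPU purchase needed to meet the pledged capacity for one year;
    HS06 is used as the (illustrative) capacity unit."""
    gap = max(0.0, required_hs06 - (installed_hs06 - retiring_hs06))
    return (gap / hs06_per_server) * cost_per_server

def forecast(requirements, perf_growth=1.15, price_deflation=0.95,
             hs06_per_server=400.0, cost_per_server=5000.0,
             power_cost_per_hs06=0.5):
    """Very simplified cost model: each year hardware gets faster and
    slightly cheaper per unit capacity, 20% of the farm is retired, and
    the electricity bill scales with installed capacity.  'requirements'
    is a list of pledged HS06 values, one per year."""
    installed, costs = 0.0, []
    for req in requirements:
        buy = yearly_purchase_cost(req, installed, 0.2 * installed,
                                   hs06_per_server, cost_per_server)
        added = (buy / cost_per_server) * hs06_per_server
        installed = installed * 0.8 + added
        costs.append(buy + power_cost_per_hs06 * installed)
        hs06_per_server *= perf_growth        # faster servers next year
        cost_per_server *= price_deflation    # and slightly cheaper
    return costs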
Recovering Galaxy Properties Using Gaussian Process SED Fitting
NASA Astrophysics Data System (ADS)
Iyer, Kartheik; Awan, Humna
2018-01-01
Information about physical quantities like the stellar mass, star formation rates, and ages of distant galaxies is contained in their spectral energy distributions (SEDs), obtained through photometric surveys like SDSS, CANDELS, and LSST. However, noise in the photometric observations is often a problem, and using naive machine learning methods to estimate physical quantities can result in overfitting the noise, or in converging on solutions that lie outside the physical regime of parameter space. We use Gaussian Process regression trained on a sample of SEDs corresponding to galaxies from a Semi-Analytic model (Somerville+15a) to estimate their stellar masses, and compare its performance to a variety of different methods, including simple linear regression, Random Forests, and k-Nearest Neighbours. We find that the Gaussian Process method is robust to noise and predicts not only stellar masses but also their uncertainties. The method is also robust in cases where the distribution of the training data is not identical to that of the target data, which can be extremely useful when generalized to more subtle galaxy properties.
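A minimal sketch of this approach with scikit-learn is given below; the training photometry and masses are random placeholders standing in for the semi-analytic-model SEDs, and the kernel choice is an assumption. The white-noise term and the returned predictive standard deviation correspond to the noise robustness and per-galaxy uncertainties mentioned above.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Placeholder training set: rows would be broadband magnitudes from the
# semi-analytic model, targets log10 stellar masses.
X_train = np.random.normal(size=(1000, 6))
y_train = np.random.normal(10.0, 0.5, size=1000)

# The white-noise kernel lets the GP absorb photometric noise instead of
# overfitting it; the predictive variance gives per-galaxy uncertainties.
kernel = 1.0 * RBF(length_scale=np.ones(6)) + WhiteKernel(noise_level=0.01)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(X_train, y_train)

X_test = np.random.normal(size=(100, 6))
mass_pred, mass_std = gp.predict(X_test, return_std=True)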
Quality of Subjective Experience in a Summer Science Program for Academically Talented Adolescents.
ERIC Educational Resources Information Center
Tuss, Paul
This study utilized the flow theory of intrinsic motivation to evaluate the subjective experience of 78 academically talented high school sophomores participating in an 8-day summer research apprenticeship program in materials and nuclear science. The program involved morning lectures on such topics as physics of electromagnetic radiation, energy…
Marijuana Use by Heroin Abusers as a Factor in Program Retention
ERIC Educational Resources Information Center
Ellner, Melvyn
1977-01-01
Primary heroin abusers who remained in a voluntary drug-free treatment program for an average of nine months were carefully matched with not-retained control subjects. Marijuana was used by the retained subjects as a heroin substitute and those who used marijuana were more apt to remain in the treatment program. (Author)
ERIC Educational Resources Information Center
Abramson, Theodore; Kagen, Edward
This study investigated attribute-by-treatment interactions between prior familiarity and response mode to programmed materials for college-level subjects by manipulating subjects' familiarity. The programs were a revised version of Diagnosis of Myocardial Infarction in standard format and in a reading version. Materials to familiarize subjects…
45 CFR 660.3 - What programs and activities of the Foundation are subject to these regulations?
Code of Federal Regulations, 2012 CFR
2012-10-01
Section 660.3, Title 45 (Public Welfare), National Science Foundation: specifies which programs and activities of the Foundation are subject to the intergovernmental review regulations for National Science Foundation programs.
45 CFR 660.3 - What programs and activities of the Foundation are subject to these regulations?
Code of Federal Regulations, 2014 CFR
2014-10-01
Section 660.3, Title 45 (Public Welfare), National Science Foundation: specifies which programs and activities of the Foundation are subject to the intergovernmental review regulations for National Science Foundation programs.
45 CFR 660.3 - What programs and activities of the Foundation are subject to these regulations?
Code of Federal Regulations, 2013 CFR
2013-10-01
Section 660.3, Title 45 (Public Welfare), National Science Foundation: specifies which programs and activities of the Foundation are subject to the intergovernmental review regulations for National Science Foundation programs.
45 CFR 660.3 - What programs and activities of the Foundation are subject to these regulations?
Code of Federal Regulations, 2011 CFR
2011-10-01
Section 660.3, Title 45 (Public Welfare), National Science Foundation: specifies which programs and activities of the Foundation are subject to the intergovernmental review regulations for National Science Foundation programs.
45 CFR 660.3 - What programs and activities of the Foundation are subject to these regulations?
Code of Federal Regulations, 2010 CFR
2010-10-01
Section 660.3, Title 45 (Public Welfare), National Science Foundation: specifies which programs and activities of the Foundation are subject to the intergovernmental review regulations for National Science Foundation programs.
Comparison of traditional six-year and new four-year dental curricula in South Korea.
Komabayashi, Takashi; Ahn, Chul; Kim, Kang-Ju; Oh, Hyo-Won
2012-01-01
This study aimed to compare the dental curriculum of the traditional six-year system with that of the new four-year (graduate-entry) system in South Korea. There are 11 dental schools in South Korea: six are public and five are private. Eight offer the new four-year program and the other three offer the traditional six-year program. Descriptive analyses were conducted using bibliographic data and local information along with statistical analyses such as chi-square tests. In the six-year programs, clinical dentistry subjects were taught almost equally in practical and didactic courses, while the basic science courses were taught more often as practical courses (P < 0.0001). In the four-year programs, both the basic science and clinical dentistry subjects were more often taught didactically, and more dentistry subjects were taught than basic sciences (P = 0.004). The four-year program model in South Korea is more focused on dentistry than on basic science, while both basic and clinical dentistry subjects were taught equally in the six-year program.
JPKWIC - General key word in context and subject index report generator
NASA Technical Reports Server (NTRS)
Jirka, R.; Kabashima, N.; Kelly, D.; Plesset, M.
1968-01-01
The JPKWIC computer program is a general key-word-in-context and subject index report generator specifically developed to help nonprogrammers and nontechnical personnel use the computer to access files, libraries, and mass documentation. This program is designed to produce a KWIC index, a subject index, an edit report, a summary report, and an exclusion list.
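For readers unfamiliar with KWIC indexing, a minimal modern sketch of what such a generator does is shown below (this is illustrative Python, not the original 1968 JPKWIC code): every significant word of each title becomes an alphabetized index entry displayed in its original context.

def kwic_index(titles,
               stopwords=frozenset({"a", "an", "and", "of", "the", "in", "for"})):
    """Build a key-word-in-context index: each non-stopword becomes an
    entry showing the word bracketed within its original title."""
    entries = []
    for title in titles:
        words = title.split()
        for i, w in enumerate(words):
            if w.lower() in stopwords:
                continue
            left = " ".join(words[:i])
            right = " ".join(words[i + 1:])
            entries.append((w.lower(), f"{left} [{w}] {right}".strip()))
    return sorted(entries)                 # alphabetical by keyword

titles = ["JPKWIC - General key word in context and subject index report generator"]
for key, context in kwic_index(titles):
    print(f"{key:12s} {context}")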
Code of Federal Regulations, 2010 CFR
2010-07-01
…Register a list of the Corps of Engineers Civil Works programs and activities that are subject to these regulations (Section 384.3, Navigation and Navigable Waters; Corps of Engineers, Department of the Army, Department of Defense; Intergovernmental Review).
Subjective evaluation of a peer support program by women with breast cancer: A qualitative study.
Ono, Miho; Tsuyumu, Yuko; Ota, Hiroko; Okamoto, Reiko
2017-01-01
The aim of this study was to determine the subjective evaluation of a breast cancer peer support program based on a survey of the participants who completed the program. Semistructured interviews were held with 10 women with breast cancer. The responses were subject to a qualitative inductive analysis. Women with breast cancer who participated in the breast cancer peer support program evaluated the features of the program and cited benefits, such as "Receiving individual peer support tailored to your needs," "Easily consulted trained peer supporters," and "Excellent coordination." Also indicated were benefits of the peer support that was received, such as "Receiving peer-specific emotional support," "Obtaining specific experimental information," "Re-examining yourself," and "Making preparations to move forward." The women also spoke of disadvantages, such as "Strict management of personal information" and "Matching limitations." In this study, the subjective evaluation of a peer support program by women with breast cancer was clarified. The women with breast cancer felt that the program had many benefits and some disadvantages. These results suggest that there is potential for peer support-based patient-support programs in medical services that are complementary to the current support that is provided by professionals. © 2016 Japan Academy of Nursing Science.
Uzunovic, Slavoljub; Kostic, Radmila; Zivkovic, Dobrica
2010-09-01
This study aimed to determine the effects of two different programs of modern sports dancing on coordination, strength, and speed in 60 beginner-level female dancers, aged 13 and 14 yrs. The subjects were divided into two experimental groups (E1 and E2), each numbering 30 subjects, drawn from local dance clubs. In order to determine motor coordination, strength, and speed, we used 15 measurements. The groups were tested before and after the experimental programs. Both experimental programs lasted for 18 wks, with training sessions twice a week for 60 minutes. The subjects from the E1 group trained according to a new experimental program of disco dance (DD) modern sports dance, and the E2 group trained according to the classic DD program of the same kind for beginner selections. The obtained results were assessed by statistical analysis: a paired-samples t-test and MANCOVA/ANCOVA. The results indicated that following the experimental programs, both groups showed a statistically significant improvement in the evaluated skills, but the changes among the E1 group subjects were more pronounced. The basic assumption of this research was confirmed, that the new experimental DD program has a significant influence on coordination, strength, and speed. In relation to these changes, the application of the new DD program was recommended for beginner dancers.
13 CFR 101.401 - What programs and activities of SBA are subject to this subpart?
Code of Federal Regulations, 2010 CFR
2010-01-01
Section 101.401, Title 13 (Business Credit and Assistance), Small Business Administration, Intergovernmental Partnership: specifies which programs and activities of SBA are subject to this subpart.
13 CFR 101.401 - What programs and activities of SBA are subject to this subpart?
Code of Federal Regulations, 2011 CFR
2011-01-01
Section 101.401, Title 13 (Business Credit and Assistance), Small Business Administration, Intergovernmental Partnership: specifies which programs and activities of SBA are subject to this subpart.
ERIC Educational Resources Information Center
Grumm, Mandy; Hein, Sascha; Fingerle, Michael
2013-01-01
School-based aggression prevention programs have been implemented in many educational institutions, and fostering the development of social competencies is one of the central aspects of many approaches. The aim of the present study was to assess the level of subjectively perceived usefulness of the prevention program "Faustlos" in…
Kordi, Ramin; Nourian, Ruhollah; Ghayour, Mahboubeh; Kordi, Mahboubeh; Younesian, Ali
2012-01-01
Objective: The objectives of this study were a) to develop a physical activity program for nursery schools, and b) to evaluate the effects of this program on the fundamental movement skills of preschool-age children in Iran. Methods: In this quasi-experimental study, 147 children from five nursery schools in five different cities in Iran were enrolled. A physical activity program was developed for nursery children. Trained nursery physical activity instructors conducted the program for 10 weeks for all subjects. The levels of gross motor development of all subjects were measured before the intervention and after the 10-week physical activity program employing the Test of Gross Motor Development-edition 2 (TGMD-2). Findings: The participants in this study had a mean (SD) age of 4.95 (0.83) years. At the end of the study, scores of subjects on all components of the TGMD-2 (including locomotor, object control, sum of standard scores and gross motor quotient) were significantly improved compared to the baseline scores (P<0.001). Based on the descriptive rating of the "Gross Motor Quotient" at baseline, 11.5% of subjects were superior/very superior (GMQ >120), and after the 10-week intervention this rate increased to 49.7% of all subjects. Conclusion: It seems that the developed physical activity program conducted by trained nursery physical activity instructors could be an effective and practical way of increasing levels of fundamental movement skills of preschool children in Iran. PMID:23400235
40 CFR 60.4174 - Recordkeeping and reporting.
Code of Federal Regulations, 2011 CFR
2011-07-01
... unit is subject to an Acid Rain emission limitation or the CAIR NOX Annual Trading Program, CAIR SO2... are also subject to an Acid Rain emissions limitation or the CAIR NOX Annual Trading Program, CAIR SO2...
40 CFR 60.4174 - Recordkeeping and reporting.
Code of Federal Regulations, 2010 CFR
2010-07-01
... unit is subject to an Acid Rain emission limitation or the CAIR NOX Annual Trading Program, CAIR SO2... are also subject to an Acid Rain emissions limitation or the CAIR NOX Annual Trading Program, CAIR SO2...
Solar system science with ESA Euclid
NASA Astrophysics Data System (ADS)
Carry, B.
2018-01-01
Context. The ESA Euclid mission has been designed to map the geometry of the dark Universe. Scheduled for launch in 2020, it will conduct a six-year visible and near-infrared imaging and spectroscopic survey over 15 000 deg² down to V_AB ≈ 24.5. Although the survey will avoid ecliptic latitudes below 15°, the survey pattern in repeated sequences of four broadband filters seems well-adapted to detect and characterize solar system objects (SSOs). Aims: We aim at evaluating the capability of Euclid of discovering SSOs and of measuring their position, apparent magnitude, and spectral energy distribution. We also investigate how the SSO orbits, morphology (activity and multiplicity), physical properties (rotation period, spin orientation, and 3D shape), and surface composition can be determined based on these measurements. Methods: We used the current census of SSOs to extrapolate the total amount of SSOs that will be detectable by Euclid, that is, objects within the survey area and brighter than the limiting magnitude. For each different population of SSO, from neighboring near-Earth asteroids to distant Kuiper-belt objects (KBOs) and including comets, we compared the expected Euclid astrometry, photometry, and spectroscopy with the SSO properties to estimate how Euclid will constrain the SSOs' dynamical, physical, and compositional properties. Results: With the current survey design, about 150 000 SSOs, mainly from the asteroid main-belt, should be observable by Euclid. These objects will all have high inclination, in contrast to many SSO surveys that focus on the ecliptic plane. Euclid may be able to discover several 10⁴ SSOs, in particular distant KBOs at high declination. The Euclid observations will consist of a suite of four sequences of four measurements and will refine the spectral classification of SSOs by extending the spectral coverage provided by Gaia and the LSST, for instance, to 2 microns. Combined with sparse photometry such as measured by Gaia and the LSST, the time-resolved photometry will contribute to determining the SSO rotation period, spin orientation, and 3D shape model. The sharp and stable point-spread function of Euclid will also allow us to resolve binary systems in the Kuiper belt and detect activity around Centaurs. Conclusions: The depth of the Euclid survey (V_AB ≈ 24.5), its spectral coverage (0.5 to 2.0 μm), and its observation cadence has great potential for solar system research. A dedicated processing for SSOs is being set up within the Euclid consortium to produce astrometry catalogs, multicolor and time-resolved photometry, and spectral classification of some 10⁵ SSOs, which will be delivered as Legacy Science.
Lawani, M M; Hounkpatin, S; Akplogan, B
2006-01-01
Asthma is a worldwide public health problem. It is the most common chronic disease of school-age children, and its severity is constantly increasing. The frequency of hospitalizations for asthma has increased in practically all countries. Physical exercise and sport are used more and more as therapeutic means, particularly in developed northern countries, where it was understood very early that asthmatic subjects should be integrated into a program of specific physical training. This study, undertaken in a sub-Saharan African country, also considers adherence to a physical training program as a factor in increasing expiratory peak flow, reinforcing some of the principal muscles involved, and improving the respiratory function of asthmatic subjects. Physical exercise is used as a non-pharmacological therapy for asthma. This cross-sectional study was carried out on fourteen asthmatic subjects of both sexes from colleges in the town of Porto-Novo, aged 15 to 25 years. The results showed that: the expiratory peak flow (EPF) of the subjects at the beginning of the program was lower than the minimal average value for the group, whatever the sex; the subjects' average EPF increased by approximately 35% compared to the average at the beginning of the program; subjects from families with a history of asthma were much more prone to post-exercise respiratory difficulties than those who were not; and the post-exercise respiratory difficulties noticed in the first weeks faded before the end of the program. This study suggests physical exercise adapted to asthmatic subjects for the improvement of their health.
Exercise program improved subjective dry eye symptoms for office workers.
Sano, Kokoro; Kawashima, Motoko; Takechi, Sayuri; Mimura, Masaru; Tsubota, Kazuo
2018-01-01
We investigated the benefits of a cognitive behavior therapy-based exercise program to reduce the dry eye symptoms of office workers. We recruited 11 office workers with dry eye symptoms, aged 31-64 years, who voluntarily participated in group health guidance at a manufacturing company. Participants learned about the role of physical activity and exercise in enhancing wellness and performed an exercise program at home 3 days per week for 10 weeks. We estimated the indexes of body composition, dry eye symptoms, and psychological distress using the Dry Eye-Related Quality of Life Score and the World Health Organization's Subjective Well-Being Inventory questionnaires pre- and postintervention. The 10-week exercise program and the questionnaires were completed by 48.1% (39 of 81) of the participants. Body composition did not change pre- and postintervention. However, the average of the Dry Eye-Related Quality of Life Score scores in participants with subjective dry eye significantly improved after the intervention. Moreover, the World Health Organization's Subjective Well-Being Inventory positive well-being score tended to increase after the intervention. In this study, we showed that a 10-week exercise program improved subjective dry eye symptoms of healthy office workers. Our study suggests that a cognitive behavior therapy-based exercise program can play an important role in the treatment of patients with dry eye disease.
Effects of virtual reality programs on balance in functional ankle instability
Kim, Ki-Jong; Heo, Myoung
2015-01-01
[Purpose] The aim of the present study was to identify the impact that recent virtual reality training programs used in a variety of fields have had on the ankle's static and dynamic senses of balance among subjects with functional ankle instability. [Subjects and Methods] This study randomly divided research subjects into two groups, a strengthening exercise group (Group I) and a balance exercise group (Group II), with each group consisting of 10 people. A virtual reality program was performed three times a week for four weeks. Exercises from the Nintendo Wii Fit Plus program were applied to each group for twenty minutes along with ten minutes of warming up and wrap-up exercises. [Results] Group II showed a significant decrease in post-intervention static and dynamic balance overall in the anterior-posterior and mediolateral directions, compared with the pre-intervention test results. In a comparison of post-intervention static and dynamic balance between Group I and Group II, a significant decrease was observed overall. [Conclusion] Virtual reality programs improved the static balance and dynamic balance of subjects with functional ankle instability. Virtual reality programs can be used more safely and efficiently if they are implemented under appropriate monitoring by a physiotherapist. PMID:26644652
Effects of virtual reality programs on balance in functional ankle instability.
Kim, Ki-Jong; Heo, Myoung
2015-10-01
[Purpose] The aim of the present study was to identify the impact that recent virtual reality training programs used in a variety of fields have had on the ankle's static and dynamic senses of balance among subjects with functional ankle instability. [Subjects and Methods] This study randomly divided research subjects into two groups, a strengthening exercise group (Group I) and a balance exercise group (Group II), with each group consisting of 10 people. A virtual reality program was performed three times a week for four weeks. Exercises from the Nintendo Wii Fit Plus program were applied to each group for twenty minutes along with ten minutes of warming up and wrap-up exercises. [Results] Group II showed a significant decrease in post-intervention static and dynamic balance overall in the anterior-posterior and mediolateral directions, compared with the pre-intervention test results. In a comparison of post-intervention static and dynamic balance between Group I and Group II, a significant decrease was observed overall. [Conclusion] Virtual reality programs improved the static balance and dynamic balance of subjects with functional ankle instability. Virtual reality programs can be used more safely and efficiently if they are implemented under appropriate monitoring by a physiotherapist.
Effect of Lower Extremity Stretching Exercises on Balance in Geriatric Population.
Reddy, Ravi Shankar; Alahmari, Khalid A
2016-07-01
The purpose of this study was to determine the effect of lower extremity stretching exercises on balance in the geriatric population. Sixty subjects (30 male and 30 female) participated in the study. The subjects underwent a 10-week lower limb stretching exercise program. Before and after the 10-week stretching exercise program, the subjects were assessed for balance using single-limb stance time in seconds and the Berg Balance Scale score. These outcome measures were analyzed. Balance before and after lower extremity stretching was compared using a paired t-test. Of the 60 subjects, 50 completed the stretching exercise program. Paired-sample t-test analysis showed a significant improvement in single-limb stance time (eyes open and eyes closed) (p<0.001) and Berg balance score (p<0.001). Lower extremity stretching exercises enhance balance in the geriatric population and thereby reduce the number of falls.
Feasibility and Effectiveness of a Pilot Health Promotion Program for Adults With Type 2 Diabetes
Kluding, Patricia M.; Singh, Rupali; Goetz, Jeanine; Rucker, Jason; Bracciano, Sarah; Curry, Natasha
2013-01-01
Purpose The purpose of this pilot study was to assess the feasibility and effectiveness of an intense health promotion program in older adults with diabetes. The program combined individually prescribed and supervised exercise with nutrition and education programs on glycemic control and aerobic fitness. Methods Various recruitment and retention strategies were analyzed for effectiveness. Out of 28 potential subjects assessed for eligibility, 6 subjects with type 2 diabetes (2 male and 4 female; all white; age, 60.2 ± 4.7 years) participated in the 10-week intervention. Aerobic and resistance exercise was performed on alternate days (3-4 days per week), with individualized nutrition counseling and diabetes health education sessions once weekly. The primary outcome measures were aerobic fitness and glycemic control (A1C), and secondary outcome measures included body mass index (BMI), self-efficacy, and symptoms of neuropathy. Changes in outcomes were assessed using descriptive statistics and paired t test analysis (α = .05). Results Following the intervention, subjects had improvements that approached significance in A1C and pain, with significant improvements in self-efficacy. Conclusions A systematic approach to analysis of feasibility revealed issues with recruitment and retention that would need to be addressed for future studies or clinical implementation of this program. However, for the subset of subjects who did complete the intervention, adherence was excellent, and satisfaction with the program was confirmed by exit interview comments. Following participation in this pilot health promotion program, subjects had meaningful improvements in glycemic control, pain, and self-efficacy. PMID:20530663
ECIA Chapter 1 Program: 1982-83 Evaluation Report. Report No. 11:06:82/83:108 wp:5899.
ERIC Educational Resources Information Center
Phoenix Union High School District, AZ. Research Services.
This is a report of the testing results of the 1982-83 Chapter 1 program in the Phoenix Union High School District, Arizona. The average Normal Curve Equivalent (NCE) gains in the District's Chapter 1 Programs showed the students gained more knowledge in each subject than similar students across the nation. However, in some subjects the students…
ERIC Educational Resources Information Center
Jelks-Emmanuel, Merry
A study examined the effectiveness of a Reading Recovery program. Subjects, 14 first-grade students who received the Reading Recovery program and 20 first-grade students who did not receive the program, were administered the Iowa Tests of Basic Skills in the spring of 1994. The subject population was comprised of 100% minority students attending…
ERIC Educational Resources Information Center
Katzin, Ori
2015-01-01
This article presents findings from a longitudinal qualitative study that examined teaching approaches of neophyte teachers in Israel during their 4-year exclusive teachers' training program for teaching Jewish subjects and first two years of teaching. The program wanted to promote change in secular pupils' attitudes toward Jewish subjects. We…
ERIC Educational Resources Information Center
Bianchini, Paolo; Morandini, Maria Cristina
2017-01-01
Civic education has always been an ancillary subject in the Italian school system. Introduced at the end of the 1950s as a sort of appendage to the history programs, it has recently been subject to multiple reforms although little or nothing has changed in reality. The analysis of a sample of civic education textbooks in use in schools explains…
Selles, Ruud W; Li, Xiaoyan; Lin, Fang; Chung, Sun G; Roth, Elliot J; Zhang, Li-Qun
2005-12-01
To investigate the effect of repeated feedback-controlled and programmed "intelligent" stretching of the ankle plantar- and dorsiflexors to treat subjects with ankle spasticity and/or contracture in stroke. Noncontrolled trial. Institutional research center. Subjects with spasticity and/or contracture after stroke. Stretching of the plantar- and dorsiflexors of the ankle 3 times a week for 45 minutes during a 4-week period by using a feedback-controlled and programmed stretching device. Passive and active range of motion (ROM), muscle strength, joint stiffness, joint viscous damping, reflex excitability, comfortable walking speed, and subjective experiences of the subjects. Significant improvements were found in the passive ROM, maximum voluntary contraction, ankle stiffness, and comfortable walking speed. The visual analog scales indicated very positive subjective evaluation in terms of the comfort of stretching and the effect on their involved ankle. Repeated feedback-controlled or intelligent stretching had a positive influence on the joint properties of the ankle with spasticity and/or contracture after stroke. The stretching device may be an effective and safe alternative to manual passive motion treatment by a therapist and has potential to be used to repeatedly and regularly stretch the ankle of subjects with spasticity and/or contracture without daily involvement of clinicians or physical therapists.
Porsdal, Vibeke; Beal, Catherine; Kleivenes, Ole Kristian; Martinsen, Egil W; Lindström, Eva; Nilsson, Harriet; Svanborg, Pär
2010-06-10
Solutions for Wellness (SfW) is an educational 3-month program concerning nutrition and exercise for persons with psychiatric disorders on psychotropic medication, who have weight problems. This observational study assessed the impact of SfW on subjective well-being, weight and waist circumference (WC). Data was collected at 49 psychiatric clinics. Where the SfW program was offered patients could enter the intervention group; where not, the control group. Subjective well-being was measured by the Subjective Well-being under Neuroleptics scale (SWN), at baseline, at the end of SfW participation, and at a follow-up 6 months after baseline. Demographic, disease and treatment data was also collected. 314 patients enrolled in the SfW group, 59 in the control group. 54% of the patients had schizophrenia, 67% received atypical antipsychotics, 56% were female. They averaged 41 +/- 12.06 years and had a BMI of 31.4 +/- 6.35. There were significant differences at baseline between groups for weight, SWN total score and other factors. Stepwise logistic models controlling for baseline covariates yielded an adjusted non-significant association between SfW program participation and response in subjective well-being (SWN increase). However, statistically significant associations were found between program participation and weight-response (weight loss or gain < 1 kg) OR = 2; 95% CI [1.1; 3.7] and between program participation and WC-response (WC decrease or increase < 2 cm) OR = 5; 95% CI [2.4; 10.3]), at 3 months after baseline. SfW program participation was associated with maintaining or decreasing weight and WC but not with improved subjective well-being as measured with the SWN scale.
Passenger ride quality determined from commercial airline flights
NASA Technical Reports Server (NTRS)
Richards, L. G.; Kuhlthau, A. R.; Jacobson, I. D.
1975-01-01
The University of Virginia ride-quality research program is reviewed. Data from two flight programs, involving seven types of aircraft, are considered in detail. An apparatus for measuring physical variations in the flight environment and recording the subjective reactions of test subjects is described. Models are presented for predicting the comfort response of test subjects from the physical data, and predicting the overall comfort reaction of test subjects from their moment by moment responses. The correspondence of mean passenger comfort judgments and test subject response is shown. Finally, the models of comfort response based on data from the 5-point and 7-point comfort scales are shown to correspond.
Phenelzine as a stimulant drug antagonist: a preliminary report.
Maletzky, B M
1977-08-01
Phenelzine administration, monitored via a pharmacy-controlled program, was employed in 38 subjects over a 6-month period to prevent amphetamine-type drug abuse, in much the same manner as disulfiram programs are employed against alcohol abuse. Advantages of the program were apparent, with a majority of subjects abstaining during the enforced phenelzine trial. Subjects generally made use of this abstinent period to benefit from a variety of psychotherapeutic modes, and demonstrated enhanced job and school performance and improved marital relationships. Results based on subject and observer reports, reports from dispensing pharmacies, and random urinalyses for drugs were encouraging. However, the study was uncontrolled and observational, and thus results are merely suggestive at present. Potential dangers as well as benefits of administering phenelzine to such a population are discussed.
NASA TLX: software for assessing subjective mental workload.
Cao, Alex; Chintamani, Keshav K; Pandya, Abhilash K; Ellis, R Darin
2009-02-01
The NASA Task Load Index (TLX) is a popular technique for measuring subjective mental workload. It relies on a multidimensional construct to derive an overall workload score based on a weighted average of ratings on six subscales: mental demand, physical demand, temporal demand, performance, effort, and frustration level. A program for implementing a computerized version of the NASA TLX is described. The software version assists in simplifying collection, postprocessing, and storage of raw data. The program collects raw data from the subject and calculates the weighted (or unweighted) workload score, which is output to a text file. The program can also be tailored to a specific experiment using a simple input text file, if desired. The program was designed in Visual Studio 2005 and is capable of running on a Pocket PC with Windows CE or on a PC with Windows 2000 or higher. The NASA TLX program is available for free download.
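A small sketch of the scoring step is given below: the overall workload is the mean of the six subscale ratings, or, in the weighted form, the ratings combined with the pairwise-comparison tallies (which sum to 15). The example numbers are arbitrary, and this illustrates the published TLX procedure rather than the software described above.

def tlx_score(ratings, weights=None):
    """Overall NASA TLX workload from six subscale ratings (0-100).
    'weights' are the pairwise-comparison tallies (summing to 15); if
    omitted, the unweighted (raw TLX) mean is returned."""
    subscales = ["mental", "physical", "temporal",
                 "performance", "effort", "frustration"]
    r = [ratings[s] for s in subscales]
    if weights is None:
        return sum(r) / len(r)
    w = [weights[s] for s in subscales]
    assert sum(w) == 15, "pairwise weights must sum to 15"
    return sum(ri * wi for ri, wi in zip(r, w)) / 15.0

ratings = {"mental": 70, "physical": 20, "temporal": 55,
           "performance": 40, "effort": 60, "frustration": 30}
weights = {"mental": 5, "physical": 1, "temporal": 3,
           "performance": 2, "effort": 3, "frustration": 1}
print(tlx_score(ratings), tlx_score(ratings, weights))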
Jo, Kae Hwa; Kim, Yeong Kyeong
2008-06-01
The purpose of this study was to develop a multidimensional suicide prevention program for Korean elders by utilizing a community network and to evaluate its effect. A non-equivalent control group pretest-posttest design was used. The subjects were recruited from two different elderly institutions located in D city and K province, Korea. Nineteen subjects in the control group received no intervention and 20 subjects in the experimental group received a multidimensional suicide prevention program. There were significantly greater decreases in depression and suicide ideation, and increases in life satisfaction, in the experimental group compared to the control group. According to the above results, the multidimensional suicide prevention program for Korean elders decreased depression and suicide ideation and increased life satisfaction through the community network. These findings suggest that this program can be used as an efficient intervention for elders in a critical situation.
Eakin, Brenda; Kirk, Rosalind; Piechowski, Patricia; Thomas, Barbara
2014-01-01
Funders, institutions, and research organizations are increasingly recognizing the need for human subjects protections training programs for those engaged in academic research. Current programs tend to be online and directed toward an audience of academic researchers. Research teams now include many nonacademic members, such as community partners, who are less likely to respond to either the method or the content of current online trainings. A team at the CTSA-supported Michigan Institute for Clinical and Health Research at the University of Michigan developed a pilot human subjects protection training program for community partners that is both locally implemented and adaptable to local contexts, yet nationally consistent and deliverable from a central administrative source. Here, the developers and the analysts of this program discuss its development, its content, and the results of its evaluation. PMID:24720288
Promoting aging well: evaluation of Vital-Aging-Multimedia Program in Madrid, Spain.
Caprara, Mariagiovanna; Fernández-Ballesteros, Rocío; Alessandri, Guido
2016-09-01
This article attests to the effectiveness of Vital Aging-Multimedia (VA-M, 'Vivir con Vitalidad-M'), a psycho-educational multimedia program designed to promote successful aging. The program was implemented over 3 months through 35 h of video lessons grouped into 15 thematic units addressing four domains of experience commonly associated with aging well: health and healthy habits, cognitive functioning, aging self-efficacy and well-being and social participation. In accordance with a quasi-experimental design, a total of 115 senior citizens (aged 54-82) participated: 73 subjects attended the VA-M, while 42 subjects with similar characteristics served as controls. All subjects were assessed before and after the program on target variables related to the above domains of functioning. Significant changes in most of the examined variables documented the positive effects of the program. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Solomon, Stephanie; Eakin, Brenda; Kirk, Rosalind; Piechowski, Patricia; Thomas, Barbara
2014-04-01
Funders, institutions, and research organizations are increasingly recognizing the need for human subjects protections training programs for those engaged in academic research. Current programs tend to be online and directed toward an audience of academic researchers. Research teams now include many nonacademic members, such as community partners, who are less likely to respond to either the method or the content of current online trainings. A team at the CTSA-supported Michigan Institute for Clinical and Health Research at the University of Michigan developed a pilot human subjects protection training program for community partners that is both locally implemented and adaptable to local contexts, yet nationally consistent and deliverable from a central administrative source. Here, the developers and the analysts of this program discuss its development, its content, and the results of its evaluation. © 2014 Wiley Periodicals, Inc.
Effect of a virtual reality-enhanced exercise protocol after coronary artery bypass grafting.
Chuang, Tien-Yow; Sung, Wen-Hsu; Chang, Hwa-Ann; Wang, Ray-Yau
2006-10-01
Virtual reality (VR) technology has gained importance in many areas of medicine. Knowledge concerning the application and the influence of VR-enhanced exercise programs is limited for patients receiving coronary artery bypass grafting. The purpose of this study was to evaluate the effect of a virtual "country walk" on the number of sessions necessary to reach cardiac rehabilitation goals in patients undergoing coronary artery bypass grafting. Twenty subjects who were seen for cardiac rehabilitation between January and June 2004 comprised the study sample. The protocol for this study included an initial maximum graded exercise tolerance test, given to determine the subsequent training goals for the subject, followed by biweekly submaximal endurance training sessions. All subjects were assigned by lot to 1 of 2 submaximal endurance training programs, one (group 2) with and the other (group 1) without the added VR environment. In all other respects, the 2 programs were identical. Each training session lasted for 30 minutes and was carried out twice per week for about 3 months. The primary outcome measures were maximum load during the work sessions, target oxygen consumption, target heart rate (beats per minute), and number of training sessions required to reach rehabilitation goals. By the end of 20 training sessions, only 4 of the 10 control subjects had reached the heart rate target goal of 85% their maximum heart rate. In contrast, 9 of the 10 subjects in the VR program had attained this goal by 9 or fewer training sessions. When target metabolic cost (75% peak oxygen consumption) was used as the training goal, all 10 subjects in the VR program had reached this target after 2 training sessions (or, in some cases, 1 training session), but not until training session 15 did a cumulative number of 9 control subjects reach this goal. These study outcomes clearly support the notion that incorporating a VR environment into cardiac rehabilitation programs will accelerate maximum recovery of patients' cardiovascular function.
Program adherence and effectiveness of a commercial nutrition program: the metabolic balance study.
Meffert, Cornelia; Gerdes, Nikolaus
2010-01-01
Objective. To assess the effectiveness of a commercial nutrition program in improving weight, blood lipids, and health-related quality of life (HRQOL). Methods. Prospective observational study with followup after 1, 3, 6, and 12 months with data from questionnaires and blood samples. Subjects. After 12 months, we had data from 524 subjects (= 60.6% of the initial samples). 84.1% of the subjects were women. The average BMI at baseline was 30.3 (SD = 5.7). Results. After 12 months, the average weight loss was 6.8 kg (SD = 7.1 kg). Program adherence declined over time but was still high after 12 months and showed a positive linear correlation with weight loss. Relevant blood parameters as well as HRQOL improved significantly. Conclusion. After 12 months, nearly two thirds of the samples had achieved >5% reduction of their initial weights. The high degree of program adherence is probably due to personal counseling and individually designed nutrition plans provided by the program.
Experimental evaluation of sensorimotor patterning used with mentally retarded children.
Neman, R; Roos, P; McCann, R M; Menolascino, F J; Heal, L W
1975-01-01
In the present study, a sensorimotor "patterning" program used with 66 institutionalized, mentally retarded children and adolescents was evaluated. The subjects were randomly assigned to one of three groups: (a) Experimental 1 group, which received a program of mobility exercises including patterning, creeping, and crawling; visual-motor training; and sensory stimulation exercises; (b) Experimental 2 group, which received a program of physical activity, personal attention, and the same sensory stimulation program given to the first group; or (c) Passive Control group, which provided baseline measures but which received no additional programming as part of the study. Experimental 1 group subjects improved more than subjects in the other groups in visual perception, program-related measures of mobility, and language ability. Intellectual functioning did not appear to be enhanced by the procedures, at least during the active phase of the project. The results were discussed with reference to other researchers who have failed to support the patterning approach, and some reasons were suggested for the differences between the present and past investigations.
14 CFR 119.49 - Contents of operations specifications.
Code of Federal Regulations, 2013 CFR
2013-01-01
... markings, and serial number of each aircraft that is subject to an airworthiness maintenance program..., and emergency equipment of aircraft that are subject to an airworthiness maintenance program required... Transportation, if required. (4) Type of aircraft, registration markings, and serial numbers of each aircraft...
14 CFR 119.49 - Contents of operations specifications.
Code of Federal Regulations, 2011 CFR
2011-01-01
... markings, and serial number of each aircraft that is subject to an airworthiness maintenance program..., and emergency equipment of aircraft that are subject to an airworthiness maintenance program required... Transportation, if required. (4) Type of aircraft, registration markings, and serial numbers of each aircraft...
14 CFR 119.49 - Contents of operations specifications.
Code of Federal Regulations, 2014 CFR
2014-01-01
... markings, and serial number of each aircraft that is subject to an airworthiness maintenance program..., and emergency equipment of aircraft that are subject to an airworthiness maintenance program required... Transportation, if required. (4) Type of aircraft, registration markings, and serial numbers of each aircraft...
14 CFR 119.49 - Contents of operations specifications.
Code of Federal Regulations, 2012 CFR
2012-01-01
... markings, and serial number of each aircraft that is subject to an airworthiness maintenance program..., and emergency equipment of aircraft that are subject to an airworthiness maintenance program required... Transportation, if required. (4) Type of aircraft, registration markings, and serial numbers of each aircraft...
Jeon, Mi Yang; Jeong, HyeonCheol; Petrofsky, Jerrold; Lee, Haneul; Yim, JongEun
2014-11-14
Falling can lead to severe health issues in the elderly and importantly contributes to morbidity, death, immobility, hospitalization, and early entry to long-term care facilities. The aim of this study was to devise a recurrent fall prevention program for elderly women in rural areas. This study adopted an assessor-blinded, randomized, controlled trial methodology. Subjects were enrolled in a 12-week recurrent fall prevention program, which comprised strength training, balance training, and patient education. Muscle strength and endurance of the ankles and the lower extremities, static balance, dynamic balance, depression, compliance with preventive behavior related to falls, fear of falling, and fall self-efficacy at baseline and immediately after the program were assessed. Sixty-two subjects (mean age 69.2±4.3 years old) completed the program--31 subjects in the experimental group and 31 subjects in the control group. When the results of the program in the 2 groups were compared, significant differences were found in ankle heel rise test, lower extremity heel rise test, dynamic balance, depression, compliance with fall preventative behavior, fear of falling, and fall self-efficacy (p<0.05), but no significant difference was found in static balance. This study shows that the fall prevention program described effectively improves muscle strength and endurance, balance, and psychological aspects in elderly women with a fall history.
ERIC Educational Resources Information Center
Appenzellar, Anne B.; Kelley, H. Paul
Two validity studies of the College Board College-Level Examination Program (CLEP) Subject Examination in Elementary Computer Programming: Fortran IV determined that CLEP scores are appropriate for granting examination credit at the University of Texas at Austin. The standard-setting administration was in the spring of 1979, with a re-evaluation…
Effectiveness of two Arthritis Foundation programs: Walk With Ease, and YOU Can Break the Pain Cycle
Bruno, Michelle; Cummins, Susan; Gaudiano, Lisha; Stoos, Johanna; Blanpied, Peter
2006-01-01
Objective: To evaluate the effectiveness of two Arthritis Foundation programs: Walk With Ease (WWE) and YOU Can Break The Pain Cycle (PC). Design: Quasi-experimental, repeated measures design. Retested at six weeks and four months. Setting: Community based intervention. Participants: Volunteer sample of 163 adults with arthritis recruited through mailings, newspapers, and flyers. Interventions: Subjects participated in a 90 minute seminar (PC, Group A), a six-week walking program (WWE, Group B), or both programs (Group C). Main outcome measures: Survey assessment of arthritis knowledge, general health, self-management activities, confidence, physical abilities, depression, health distress, and how arthritis affects their life. A Squat Test, a Six Minute Walk test, and a Timed Functional Walk Test were also administered. Results: Subjects in Group B were more confident, less depressed, had less health distress, and less pain than subjects in Group A. Scores of Group C were between Group A and B scores. Differences in groups over time indicated that the WWE resulted in increased confidence, physical abilities, time spent in self-management activities and decreased pain and fatigue. All groups increased in walking endurance at six weeks, and increased in health distress at four months. Conclusion: Subjects in different programs differed on impact of arthritis. These programs provide effective arthritis management opportunities. PMID:18046884
The effect of time-management training on employee attitudes and behavior: a field experiment.
Orpen, C
1994-07-01
This field experiment tested for the effect of time-management training on 56 employees at an Australian manufacturing company, half of whom attended a 3-day training program and half of whom did not. The training group subjects rated their management of time significantly higher after the program than did the group who did not attend the training program. The diary entries of the trained subjects over a 2-week period after the training program were also rated by three superiors as exhibiting significantly better time management than the diary entries of the untrained group. Given that subjects had been randomly assigned to the two conditions, these results suggest that appropriate training can cause employees to improve how they manage their time at work.
Dooley, K O; Farmer, A
1988-08-01
Neurolinguistic programming's hypothesized eye movements were measured independently using videotapes of 10 nonfluent aphasic and 10 control subjects matched for age and sex. Chi-squared analysis indicated that eye-position responses were significantly different for the groups. Although earlier research has not supported the hypothesized eye positions for normal subjects, the present findings support the contention that eye-position responses may differ between neurologically normal and aphasic individuals.
Solomon, Stephanie; Bullock, Sherita; Calhoun, Karen; Crosby, Lori; Eakin, Brenda; Franco, Zeno; Hardwick, Emily; Holland, Samuel; Leinberger-Jabari, Andrea; Newton, Gail; Odell, Jere; Paberzs, Adam; Spellecy, Ryan
2014-04-01
Funders, institutions, and research organizations are increasingly recognizing the need for human subjects protections training programs for those engaged in academic research. Current programs tend to be online and directed toward an audience of academic researchers. Research teams now include many nonacademic members, such as community partners, who are less likely to respond to either the method or the content of current online trainings. A team at the CTSA-supported Michigan Institute for Clinical and Health Research at the University of Michigan developed a pilot human subjects protection training program for community partners that is both locally implemented and adaptable to local contexts, yet nationally consistent and deliverable from a central administrative source. Here, the developers of the program and the collaborators who participated in the pilot across the United States describe 10 important lessons learned that align with four major themes: The distribution of the program, the implementation of the program, the involvement of community engagement in the program, and finally lessons regarding the content of the program. These lessons are relevant to anyone who anticipates developing or improving a training program that is developed in a central location and intended for local implementation. © 2014 Wiley Periodicals, Inc.
Bullock, Sherita; Calhoun, Karen; Crosby, Lori; Eakin, Brenda; Franco, Zeno; Hardwick, Emily; Leinberger‐Jabari, Andrea; Newton, Gail; Odell, Jere; Paberzs, Adam; Spellecy, Ryan
2014-01-01
Abstract Funders, institutions, and research organizations are increasingly recognizing the need for human subjects protections training programs for those engaged in academic research. Current programs tend to be online and directed toward an audience of academic researchers. Research teams now include many nonacademic members, such as community partners, who are less likely to respond to either the method or the content of current online trainings. A team at the CTSA‐supported Michigan Institute for Clinical and Health Research at the University of Michigan developed a pilot human subjects protection training program for community partners that is both locally implemented and adaptable to local contexts, yet nationally consistent and deliverable from a central administrative source. Here, the developers of the program and the collaborators who participated in the pilot across the United States describe 10 important lessons learned that align with four major themes: The distribution of the program, the implementation of the program, the involvement of community engagement in the program, and finally lessons regarding the content of the program. These lessons are relevant to anyone who anticipates developing or improving a training program that is developed in a central location and intended for local implementation. PMID:24720349
The Role of Mental Models in Learning to Program.
ERIC Educational Resources Information Center
Pirolli, Peter L.; Anderson, John R.
This study reports two experiments which indicate that the processes of providing subjects with insightful representations of example programs and guiding subjects through an "ideal" problem solving strategy facilitate learning. A production system model (GRAPES) has been developed that simulates problem-solving and learning in the…
Deyle, Gail D; Allison, Stephen C; Matekel, Robert L; Ryder, Michael G; Stang, John M; Gohdes, David D; Hutton, Jeremy P; Henderson, Nancy E; Garber, Matthew B
2005-12-01
Manual therapy and exercise have not previously been compared with a home exercise program for patients with osteoarthritis (OA) of the knee. The purpose of this study was to compare outcomes between a home-based physical therapy program and a clinically based physical therapy program. One hundred thirty-four subjects with OA of the knee were randomly assigned to a clinic treatment group (n=66; 61% female, 39% male; mean age [±SD]=64±10 years) or a home exercise group (n=68; 71% female, 29% male; mean age [±SD]=62±9 years). Subjects in the clinic treatment group received supervised exercise, individualized manual therapy, and a home exercise program over a 4-week period. Subjects in the home exercise group received the same home exercise program initially, reinforced at a clinic visit 2 weeks later. Measured outcomes were the distance walked in 6 minutes and the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC). Both groups showed clinically and statistically significant improvements in 6-minute walk distances and WOMAC scores at 4 weeks; improvements were still evident in both groups at 8 weeks. By 4 weeks, WOMAC scores had improved by 52% in the clinic treatment group and by 26% in the home exercise group. Average 6-minute walk distances had improved about 10% in both groups. At 1 year, both groups were substantially and about equally improved over baseline measurements. Subjects in the clinic treatment group were less likely to be taking medications for their arthritis and were more satisfied with the overall outcome of their rehabilitative treatment compared with subjects in the home exercise group. Although both groups improved by 1 month, subjects in the clinic treatment group achieved about twice as much improvement in WOMAC scores as subjects who performed similar unsupervised exercises at home. Equivalent maintenance of improvements at 1 year was presumably due to both groups continuing the identical home exercise program. The results indicate that a home exercise program for patients with OA of the knee provides important benefit. Adding a small number of additional clinical visits for the application of manual therapy and supervised exercise adds greater symptomatic relief.
Castrillon, Tabitha; Hanney, William J; Rothschild, Carey E; Kolber, Morey J; Liu, Xinliang; Masaracchio, Michael
2017-01-01
An alternative approach to facilitating movement and control through the trunk and pelvis is belly dancing. Investigations of belly dancing mechanics indicate muscular activation patterns similar to those known to influence chronic low back pain (cLBP). However, no documented studies have examined its effectiveness as a treatment for cLBP. The purpose of this study was to investigate the influence of a standardized belly dance program in women with cLBP. A single-subject design was used to evaluate weekly outcomes during a three-week baseline period, a six-week belly dance program, and again at a two-month follow-up. Outcome measures for pain, disability, function, and fear-avoidance beliefs were utilized. Two subjects completed the program. No significant differences were noted during the baseline assessment period. At two months, subject one demonstrated change scores of -1.12, -1%, and 2.2 for pain, disability, and function, respectively, while subject two demonstrated change scores of 5.4, 5%, and 1.1 for pain, disability, and function, respectively. Subject one showed a clinically significant change score for both fear avoidance of work and of physical activity, with score changes of 4 and 3.3, respectively. The results of this study suggest a standardized belly dance program may positively influence pain and function in women with cLBP.
Effectiveness of an Individualized Training Based on Force-Velocity Profiling during Jumping
Jiménez-Reyes, Pedro; Samozino, Pierre; Brughelli, Matt; Morin, Jean-Benoît
2017-01-01
Ballistic performances are determined by both the maximal lower limb power output (Pmax) and the individual force-velocity (F-v) mechanical profile, especially the F-v imbalance (FVimb): the difference between the athlete's actual and optimal profile. An optimized training program should aim to increase Pmax and/or reduce FVimb. The aim of this study was to test whether an individualized training program based on the individual F-v profile would decrease subjects' individual FVimb and in turn improve vertical jump performance. FVimb was used as the reference to assign participants to different training intervention groups. Eighty-four subjects were assigned to three groups: an "optimized" group divided into velocity-deficit, force-deficit, and well-balanced sub-groups based on subjects' FVimb; a "non-optimized" group for which the training program was not specifically based on FVimb; and a control group. All subjects underwent a 9-week specific resistance training program. The programs were designed to reduce FVimb for the optimized groups (with specific programs for sub-groups based on individual FVimb values), while the non-optimized group followed a classical program identical for all subjects. All subjects in the three optimized training sub-groups (velocity-deficit, force-deficit, and well-balanced) increased their jumping performance (12.7 ± 5.7%, ES = 0.93 ± 0.09; 14.2 ± 7.3%, ES = 1.00 ± 0.17; and 7.2 ± 4.5%, ES = 0.70 ± 0.36, respectively), with jump height improvement for all subjects, whereas the results were much more variable and unclear in the non-optimized group. This greater change in jump height was associated with a markedly reduced FVimb for both force-deficit (57.9 ± 34.7% decrease in FVimb) and velocity-deficit (20.1 ± 4.3%) subjects, and unclear or small changes in Pmax (−0.40 ± 8.4% and +10.5 ± 5.2%, respectively). An individualized training program specifically based on FVimb (the gap between the actual and optimal F-v profiles of each individual) was more efficient at improving jumping performance (i.e., unloaded squat jump height) than a traditional resistance training program common to all subjects regardless of their FVimb. Although improving both FVimb and Pmax has to be considered to improve ballistic performance, the present results showed that reducing FVimb without even increasing Pmax led to clearly beneficial jump performance changes. Thus, FVimb could be considered a potentially useful variable for prescribing optimal resistance training to improve ballistic performance. PMID:28119624
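For readers who want to experiment with the profiling idea, the sketch below fits a linear F-v profile to force and velocity measurements and expresses the actual profile slope as a percentage of a supplied optimal slope. It is a minimal illustration of the concept, not the authors' protocol: the function names and data are invented, and the computation of the optimal slope itself is treated as a given input rather than reproduced.

```python
import numpy as np

def fv_profile(forces, velocities):
    """Fit a linear force-velocity profile F(v) = F0 - (F0/v0) * v.
    Returns F0, v0, the profile slope S_fv = -F0/v0, and Pmax = F0*v0/4."""
    slope, intercept = np.polyfit(velocities, forces, 1)
    F0 = intercept
    v0 = -intercept / slope
    Pmax = F0 * v0 / 4.0
    return F0, v0, slope, Pmax

def fv_imbalance(S_fv, S_fv_opt):
    """Actual slope as a percentage of the optimal slope; 100% would be a
    perfectly balanced profile (illustrative convention only)."""
    return 100.0 * S_fv / S_fv_opt

# Hypothetical loaded-jump data: mean force (N/kg) and velocity (m/s) per load.
forces = np.array([30.0, 26.0, 22.0, 18.0])
velocities = np.array([1.0, 1.5, 2.0, 2.5])
F0, v0, S_fv, Pmax = fv_profile(forces, velocities)
print(F0, v0, Pmax, fv_imbalance(S_fv, S_fv_opt=-10.0))
```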
7 CFR 614.3 - Decisions subject to informal appeal procedures.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Program; and (x) Conservation Innovation Grants. (2) Non-Title XII conservation programs or provisions, including: (i) Agriculture Management Assistance Program; (ii) Emergency Watershed Protection Program; (iii...
Figuratively Speaking: Analogies in the Accounting Classroom
ERIC Educational Resources Information Center
Tucker, Basil P.
2017-01-01
One of the foundational subjects comprising most Master of Business Administration (MBA) programs is an introductory accounting course, in which students are exposed to the study of financial and management accounting at a basic level. For many students accounting is arguably the most feared subject in the MBA program. Although some students…
32 CFR 321.10 - Disclosure to other than subject.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 32 National Defense 2 2012-07-01 2012-07-01 false Disclosure to other than subject. 321.10 Section 321.10 National Defense Department of Defense (Continued) OFFICE OF THE SECRETARY OF DEFENSE (CONTINUED) PRIVACY PROGRAM DEFENSE SECURITY SERVICE PRIVACY PROGRAM § 321.10 Disclosure to other than...
32 CFR 321.10 - Disclosure to other than subject.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 32 National Defense 2 2013-07-01 2013-07-01 false Disclosure to other than subject. 321.10 Section 321.10 National Defense Department of Defense (Continued) OFFICE OF THE SECRETARY OF DEFENSE (CONTINUED) PRIVACY PROGRAM DEFENSE SECURITY SERVICE PRIVACY PROGRAM § 321.10 Disclosure to other than...
32 CFR 321.10 - Disclosure to other than subject.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 32 National Defense 2 2011-07-01 2011-07-01 false Disclosure to other than subject. 321.10 Section 321.10 National Defense Department of Defense (Continued) OFFICE OF THE SECRETARY OF DEFENSE (CONTINUED) PRIVACY PROGRAM DEFENSE SECURITY SERVICE PRIVACY PROGRAM § 321.10 Disclosure to other than...
32 CFR 321.10 - Disclosure to other than subject.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 32 National Defense 2 2014-07-01 2014-07-01 false Disclosure to other than subject. 321.10 Section 321.10 National Defense Department of Defense (Continued) OFFICE OF THE SECRETARY OF DEFENSE (CONTINUED) PRIVACY PROGRAM DEFENSE SECURITY SERVICE PRIVACY PROGRAM § 321.10 Disclosure to other than...
10 CFR 26.33 - Behavioral observation.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 10 Energy 1 2014-01-01 2014-01-01 false Behavioral observation. 26.33 Section 26.33 Energy NUCLEAR REGULATORY COMMISSION FITNESS FOR DUTY PROGRAMS Program Elements § 26.33 Behavioral observation. Licensees and other entities shall ensure that the individuals who are subject to this subpart are subject to...
10 CFR 26.33 - Behavioral observation.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 10 Energy 1 2013-01-01 2013-01-01 false Behavioral observation. 26.33 Section 26.33 Energy NUCLEAR REGULATORY COMMISSION FITNESS FOR DUTY PROGRAMS Program Elements § 26.33 Behavioral observation. Licensees and other entities shall ensure that the individuals who are subject to this subpart are subject to...
10 CFR 26.33 - Behavioral observation.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 1 2012-01-01 2012-01-01 false Behavioral observation. 26.33 Section 26.33 Energy NUCLEAR REGULATORY COMMISSION FITNESS FOR DUTY PROGRAMS Program Elements § 26.33 Behavioral observation. Licensees and other entities shall ensure that the individuals who are subject to this subpart are subject to...
10 CFR 26.33 - Behavioral observation.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 1 2010-01-01 2010-01-01 false Behavioral observation. 26.33 Section 26.33 Energy NUCLEAR REGULATORY COMMISSION FITNESS FOR DUTY PROGRAMS Program Elements § 26.33 Behavioral observation. Licensees and other entities shall ensure that the individuals who are subject to this subpart are subject to...
10 CFR 26.33 - Behavioral observation.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 10 Energy 1 2011-01-01 2011-01-01 false Behavioral observation. 26.33 Section 26.33 Energy NUCLEAR REGULATORY COMMISSION FITNESS FOR DUTY PROGRAMS Program Elements § 26.33 Behavioral observation. Licensees and other entities shall ensure that the individuals who are subject to this subpart are subject to...
25 CFR 1000.240 - What construction programs included in an AFA are subject to this subpart?
Code of Federal Regulations, 2012 CFR
2012-04-01
..., DEPARTMENT OF THE INTERIOR ANNUAL FUNDING AGREEMENTS UNDER THE TRIBAL SELF-GOVERNMENT ACT AMENDMENTS TO THE INDIAN SELF-DETERMINATION AND EDUCATION ACT Construction § 1000.240 What construction programs included... AFA are subject to this subpart. This includes design, construction, repair, improvement, expansion...
25 CFR 1000.240 - What construction programs included in an AFA are subject to this subpart?
Code of Federal Regulations, 2013 CFR
2013-04-01
..., DEPARTMENT OF THE INTERIOR ANNUAL FUNDING AGREEMENTS UNDER THE TRIBAL SELF-GOVERNMENT ACT AMENDMENTS TO THE INDIAN SELF-DETERMINATION AND EDUCATION ACT Construction § 1000.240 What construction programs included... AFA are subject to this subpart. This includes design, construction, repair, improvement, expansion...
25 CFR 1000.240 - What construction programs included in an AFA are subject to this subpart?
Code of Federal Regulations, 2014 CFR
2014-04-01
..., DEPARTMENT OF THE INTERIOR ANNUAL FUNDING AGREEMENTS UNDER THE TRIBAL SELF-GOVERNMENT ACT AMENDMENTS TO THE INDIAN SELF-DETERMINATION AND EDUCATION ACT Construction § 1000.240 What construction programs included... AFA are subject to this subpart. This includes design, construction, repair, improvement, expansion...
25 CFR 1000.240 - What construction programs included in an AFA are subject to this subpart?
Code of Federal Regulations, 2011 CFR
2011-04-01
..., DEPARTMENT OF THE INTERIOR ANNUAL FUNDING AGREEMENTS UNDER THE TRIBAL SELF-GOVERNMENT ACT AMENDMENTS TO THE INDIAN SELF-DETERMINATION AND EDUCATION ACT Construction § 1000.240 What construction programs included... AFA are subject to this subpart. This includes design, construction, repair, improvement, expansion...
The Application of Autogenic Feedback Training in a Smoking Termination Program.
ERIC Educational Resources Information Center
Boullion, Jean K.; Chen, W. William
1980-01-01
Autogenic feedback training was an effective adjunct to a smoking termination program. An 81 percent reduction in smoking activity was found for the subjects who received the training. Achieving relaxation and reducing anxiety through autogenic feedback training helped subjects restore their self-confidence and deal with stress. (Author)
Reliability Programs for Nonelectronic Designs. Volume 1
1983-04-01
[Extraction residue from a rating table. The table lists reliability program documents with columns for Document ID, Title/Subject, Application (System/Component/Both), and Effectiveness (Excellent/Good/Poor); legible entries include MIL-STD-781B (Reliability Tests, Exponential Distribution) and MIL-STD-1535A (Supplier Quality Assurance Program).]
Kadri, Mohamed Abdelhafid; Noé, Frederic; Nouar, Merbouha Boulahbel; Paillard, Thierry
2017-09-01
To compare the effects of unilateral strength training by stimulated and voluntary contractions on muscle strength and monopedal postural control of the contralateral limb, 36 non-active healthy male subjects were recruited and split randomly into three groups. Two groups of 12 subjects took part in a strength-training program (3 sessions a week over 8 weeks) comprising 43 contractions of the quadriceps femoris of the ipsilateral limb (at 20% of the MVC). One group carried out voluntary contractions exclusively (VOL group), while the other group received exclusively electro-induced contractions (NMES group). The other 12 subjects formed the control (CON) group. Assessments of MVC and monopedal postural control in static and dynamic postural tasks were performed with the ipsilateral (IPSI) and contralateral (CONTRA) limbs before (PRE) and after (POST) completion of the training program. After the training program, the MVC of the IPSI and CONTRA limbs increased similarly for both experimental groups (VOL and NMES). There were no significant improvements of monopedal postural control for the IPSI or CONTRA limbs in either the VOL or NMES experimental group. No change was observed for the CON group over the protocol period. The proposed training program with NMES vs. VOL contractions induced strength gains but did not permit any improvement of contralateral monopedal postural control in healthy young subjects. This has potential for therapeutic application and allows clinicians to focus their training programs on dynamic and poly-articular exercises to improve postural control in young subjects.
Kirenskaya, Anna V; Novototsky-Vlasov, Vladimir Y; Chistyakov, Andrey N; Zvonikov, Vyacheslav M
2011-04-01
Subjective scoring and autonomic variables (heart rate, skin conductance span) were used to verify the reality of inner experience during recollection of emotionally neutral, positive, and negative past events in 19 high (HH) and 12 low (LH) hypnotizable subjects in hypnotic and nonhypnotic experimental sessions. In addition, the influence of hypnotizability on the effectiveness of an imagery-based neurolinguistic programming (NLP) technique was evaluated. Results demonstrated that subjective scores of image vividness and emotional intensity were significantly higher in the HH subjects compared to the LH subjects in both sessions. The past-events recollection was followed by increased autonomic activity only in the HH subjects. The NLP procedure was followed by decreased negative emotional intensity in both groups, but the decline in autonomic activity was observed in the HH subjects and not in the LH subjects.
Astronomical Image Processing with Hadoop
NASA Astrophysics Data System (ADS)
Wiley, K.; Connolly, A.; Krughoff, S.; Gardner, J.; Balazinska, M.; Howe, B.; Kwon, Y.; Bu, Y.
2011-07-01
In the coming decade astronomical surveys of the sky will generate tens of terabytes of images and detect hundreds of millions of sources every night. With a requirement that these images be analyzed in real time to identify moving sources such as potentially hazardous asteroids or transient objects such as supernovae, these data streams present many computational challenges. In the commercial world, new techniques that utilize cloud computing have been developed to handle massive data streams. In this paper we describe how cloud computing, and in particular the map-reduce paradigm, can be used in astronomical data processing. We will focus on our experience implementing a scalable image-processing pipeline for the SDSS database using Hadoop (http://hadoop.apache.org). This multi-terabyte imaging dataset approximates future surveys such as those which will be conducted with the LSST. Our pipeline performs image coaddition in which multiple partially overlapping images are registered, integrated and stitched into a single overarching image. We will first present our initial implementation, then describe several critical optimizations that have enabled us to achieve high performance, and finally describe how we are incorporating a large in-house existing image processing library into our Hadoop system. The optimizations involve prefiltering of the input to remove irrelevant images from consideration, grouping individual FITS files into larger, more efficient indexed files, and a hybrid system in which a relational database is used to determine the input images relevant to the task. The incorporation of an existing image processing library, written in C++, presented difficult challenges since Hadoop is programmed primarily in Java. We will describe how we achieved this integration and the sophisticated image processing routines that were made feasible as a result. We will end by briefly describing the longer term goals of our work, namely detection and classification of transient objects and automated object classification.
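As a rough illustration of the map-reduce coaddition pattern described above, the sketch below follows the Hadoop Streaming convention of a mapper and reducer that read and emit tab-separated records on standard input and output. It is a schematic stand-in, not the authors' pipeline: the sky-cell binning, record format, and cell size are invented, and a real coaddition reducer would resample, register, and stack pixel data rather than just group image identifiers.

```python
#!/usr/bin/env python
# Hadoop Streaming-style sketch (hypothetical; not the authors' pipeline):
# group overlapping image pointings by sky cell so a reducer can later
# register and stack them into a coadd.
import sys
from collections import defaultdict

CELL_DEG = 1.0  # assumed sky-cell size in degrees

def mapper(stream):
    # Each input line: image_id ra_deg dec_deg
    for line in stream:
        image_id, ra, dec = line.split()
        key = f"{int(float(ra) // CELL_DEG)}_{int(float(dec) // CELL_DEG)}"
        print(f"{key}\t{image_id}")

def reducer(stream):
    # Collect image ids per cell; real code would resample and average pixels.
    groups = defaultdict(list)
    for line in stream:
        key, image_id = line.rstrip("\n").split("\t")
        groups[key].append(image_id)
    for key, ids in groups.items():
        print(f"{key}\tcoadd of {len(ids)} images: {','.join(ids)}")

if __name__ == "__main__":
    mode = sys.argv[1] if len(sys.argv) > 1 else "map"
    mapper(sys.stdin) if mode == "map" else reducer(sys.stdin)
```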
Large Survey Database: A Distributed Framework for Storage and Analysis of Large Datasets
NASA Astrophysics Data System (ADS)
Juric, Mario
2011-01-01
The Large Survey Database (LSD) is a Python framework and DBMS for distributed storage, cross-matching and querying of large survey catalogs (>10^9 rows, >1 TB). The primary driver behind its development is the analysis of Pan-STARRS PS1 data. It is specifically optimized for fast queries and parallel sweeps of positionally and temporally indexed datasets. It transparently scales to more than 10^2 nodes, and can be made to function in "shared nothing" architectures. An LSD database consists of a set of vertically and horizontally partitioned tables, physically stored as compressed HDF5 files. Vertically, we partition the tables into groups of related columns ("column groups"), storing together logically related data (e.g., astrometry, photometry). Horizontally, the tables are partitioned into partially overlapping "cells" by position in space (lon, lat) and time (t). This organization allows for fast lookups based on spatial and temporal coordinates, as well as data and task distribution. The design was inspired by the success of Google BigTable (Chang et al., 2006). Our programming model is a pipelined extension of MapReduce (Dean and Ghemawat, 2004). An SQL-like query language is used to access data. For complex tasks, map-reduce "kernels" that operate on query results on a per-cell basis can be written, with the framework taking care of scheduling and execution. The combination leverages users' familiarity with SQL, while offering a fully distributed computing environment. LSD adds little overhead compared to direct Python file I/O. In tests, we swept through 1.1 Grows (billion rows) of PanSTARRS+SDSS data (220 GB) in less than 15 minutes on a dual-CPU machine. In a cluster environment, we achieved bandwidths of 17 Gbit/s (I/O limited). Based on current experience, we believe LSD should scale to be useful for analysis and storage of LSST-scale datasets. It can be downloaded from http://mwscience.net/lsd.
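To make the partitioning idea concrete, the toy sketch below assigns catalog rows to spatial-temporal cells keyed on (lon, lat, t), the kind of key that lets related rows be stored together and swept in parallel. It is only an illustration of the concept in the abstract; the binning scheme, function name, and cell sizes are invented and do not reflect LSD's actual implementation or API.

```python
def cell_key(lon_deg, lat_deg, t_mjd, ang_res_deg=1.0, t_res_days=30.0):
    """Toy spatial+temporal cell index; rows sharing a key would be stored
    and processed together (not LSD's real partitioning scheme)."""
    n_lon = int(360 / ang_res_deg)
    i_lon = int(lon_deg // ang_res_deg) % n_lon
    i_lat = int((lat_deg + 90.0) // ang_res_deg)
    i_t = int(t_mjd // t_res_days)
    return (i_lon, i_lat, i_t)

# Group a handful of synthetic detections by cell, as a map phase might.
rows = [(10.2, -5.1, 55200.3), (10.4, -5.0, 55210.9), (200.0, 45.0, 55200.3)]
cells = {}
for lon, lat, t in rows:
    cells.setdefault(cell_key(lon, lat, t), []).append((lon, lat, t))
print(cells)
```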
NASA Astrophysics Data System (ADS)
Myers, Steven T.
2013-01-01
The Jansky Very Large Array is a recently completed upgrade to the VLA that has significantly expanded its capabilities through replacement of the receivers, electronics, signal paths, and correlator with cutting-edge technology. This enhancement provides significantly increased continuum sensitivity and spectral survey speeds (by factors of 100 or more in select cases) from 1-50 GHz and in key bands below 1 GHz. Concurrently, we are greatly enhancing the sensitivity of the Very Long Baseline Array. A suite of ever more ambitious radio sky survey programs undertaken with these new instruments addresses science goals central to answering the questions posed by Astro2010, and will undoubtedly incite new inquiries. The science themes of the Jansky VLA and the VLBA are: illuminating the obscured, probing the magnetic, sounding the transient, and charting the evolving Universe. New observations will allow us to image young stars and massive black holes in dust-enshrouded environments, measure the strength and topology of the cosmic magnetic field, follow the rapid evolution of energetic phenomena, and study the formation and evolution of stars, galaxies, AGN, and the Universe itself. We can follow the evolution of gas and galaxies and particles and fields through cosmic time to bridge the eras from cosmic dawn to the dawn of new worlds. I will describe the salient features of the Jansky VLA and the VLBA for cosmological survey work, and summarize the multi-wavelength synergies with ALMA and next-generation optical, infrared, X-ray, and gamma-ray telescopes. Example data taken from Jansky VLA and upgraded VLBA commissioning tests and early science will illustrate these features. I will also describe the evolution of the VLA and VLBA and their capabilities for future surveys that will lead towards the next decade, into the era of the LSST and the SKA.
Using "Big Data" in a Classroom Setting for Student-Developed Projects
NASA Astrophysics Data System (ADS)
Hayes-Gehrke, Melissa; Vogel, Stuart N.
2018-01-01
The advances in exploration of the optical transient sky anticipated with major facilities such as the Zwicky Transient Facility (ZTF) and Large Synoptic Survey Telescope (LSST) provide an opportunity to integrate large public research datasets into the undergraduate classroom. As a step in this direction, the NSF PIRE-funded GROWTH (Global Relay of Observatories Watching Transients Happen) collaboration provided funding for curriculum development using data from the precursor to ZTF, the Intermediate Palomar Transient Factory (iPTF). One of the iPTF portals, the PTF Variable Marshal, was used by 56 Astronomy majors in the fall 2016 and 2017 semesters of the required Observational Astronomy course at the University of Maryland. Student teams learned about the iPTF survey and how to use the PTF Variable Marshal and then developed their own hypotheses about variable stars to test using data they gathered from the Variable Marshal. Through this project, students gained experience in how to develop scientific questions that can be explored using large datasets and became aware of the limitations and difficulties of such projects. This work was supported in part by NSF award OISE-1545949.
Astrometric surveys in the Gaia era
NASA Astrophysics Data System (ADS)
Zacharias, Norbert
2018-04-01
The Gaia first data release (DR1) already provides an almost error free optical reference frame on the milli-arcsecond (mas) level allowing significantly better calibration of ground-based astrometric data than ever before. Gaia DR1 provides positions, proper motions and trigonometric parallaxes for just over 2 million stars in the Tycho-2 catalog. For over 1.1 billion additional stars DR1 gives positions. Proper motions for these, mainly fainter stars (G >= 11.5) are currently provided by several new projects which combine earlier epoch ground-based observations with Gaia DR1 positions. These data are very helpful in the interim period but will become obsolete with the second Gaia data release (DR2) expected in April 2018. The era of traditional, ground-based, wide-field astrometry with the goal to provide accurate reference stars has come to an end. Future ground-based astrometry will fill in some gaps (very bright stars, observations needed at many or specific epochs) and mainly will go fainter than the Gaia limit, like the PanSTARRS and the upcoming LSST surveys.
The Maunakea Spectroscopic Explorer
NASA Astrophysics Data System (ADS)
Venn, Kim; Starkenburg, Else; Martin, Nicolas; Kielty, Collin; Youakim, Kris; Arnetsen, Anke
2018-06-01
The Maunakea Spectroscopic Explorer (MSE) is an ambitious project to transform the Canada-France-Hawaii 3.6-metre telescope into an 11.25-metre facility dedicated to wide field multi-object spectroscopy. Following a successful conceptual design review of ten subsystems and the systems-level review in January 2018, MSE is preparing to move into the Preliminary Design Phase. MSE will simultaneously deploy over 3000 fibers that feed low/medium resolution spectrometers and 1000 fibers that feed high-resolution (R~40,000) spectrometers. This design is expected to revolutionize astrophysical studies requiring large spectroscopic datasets: i.e., reconstructing the Milky Way's formation history through the chemical tagging of stars, searches for the effects of dark matter on stellar streams, determination of environmental influences on galaxy formation since cosmic noon, measuring black hole masses through repeat spectroscopy of quasars, follow-up of large samples identified in other surveys (Gaia, LSST, SKA, etc.), and more. MSE will reuse a large fraction of CFHT’s existing facilities while tripling the diameter of the telescope’s primary mirror and increasing the height of the enclosure by only 10%. I will discuss the progress to date and opportunities for partnerships.
Liverpool Telescope 2: beginning the design phase
NASA Astrophysics Data System (ADS)
Copperwheat, Christopher M.; Steele, Iain A.; Barnsley, Robert M.; Bates, Stuart D.; Bode, Mike F.; Clay, Neil R.; Collins, Chris A.; Jermak, Helen E.; Knapen, Johan H.; Marchant, Jon M.; Mottram, Chris J.; Piascik, Andrzej S.; Smith, Robert J.
2016-07-01
The Liverpool Telescope is a fully robotic 2-metre telescope located at the Observatorio del Roque de los Muchachos on the Canary Island of La Palma. The telescope began routine science operations in 2004, and currently seven simultaneously mounted instruments support a broad science programme, with a focus on transient followup and other time domain topics well suited to the characteristics of robotic observing. Work has begun on a successor facility with the working title `Liverpool Telescope 2'. We are entering a new era of time domain astronomy with new discovery facilities across the electromagnetic spectrum, and the next generation of optical survey facilities such as LSST are set to revolutionise the field of transient science in particular. The fully robotic Liverpool Telescope 2 will have a 4-metre aperture and an improved response time, and will be designed to meet the challenges of this new era. Following a conceptual design phase, we are about to begin the detailed design which will lead towards the start of construction in 2018, for first light ˜2022. In this paper we provide an overview of the facility and an update on progress.
Supernova Cosmology in the Big Data Era
NASA Astrophysics Data System (ADS)
Kessler, Richard
Here we describe large "Big Data" Supernova (SN) Ia surveys, past and present, used to make precision measurements of cosmological parameters that describe the expansion history of the universe. In particular, we focus on surveys designed to measure the dark energy equation of state parameter w and its dependence on cosmic time. These large surveys have at least four photometric bands, and they use a rolling search strategy in which the same instrument is used for both discovery and photometric follow-up observations. These surveys include the Supernova Legacy Survey (SNLS), Sloan Digital Sky Survey II (SDSS-II), Pan-STARRS 1 (PS1), Dark Energy Survey (DES), and Large Synoptic Survey Telescope (LSST). We discuss the development of how systematic uncertainties are evaluated, and how methods to reduce them play a major role in designing new surveys. The key systematic effects that we discuss are (1) calibration, measuring the telescope efficiency in each filter band; (2) biases from a magnitude-limited survey and from the analysis; and (3) photometric SN classification for current surveys that do not have enough resources to spectroscopically confirm each SN candidate.
Prompt emission from the counter jet of a short gamma-ray burst
NASA Astrophysics Data System (ADS)
Yamazaki, Ryo; Ioka, Kunihito; Nakamura, Takashi
2018-03-01
The counter jet of a short gamma-ray burst (sGRB) has not yet been observed, while recent discoveries of gravitational waves (GWs) from the binary neutron star merger GW170817 and the associated sGRB 170817A have demonstrated that off-axis sGRB jets are detectable. We calculate the prompt emission from the counter jet of an sGRB and show that it is typically 23-26 mag in the optical-infrared band 10-10^3 s after the GWs for an sGRB 170817A-like event, which is brighter than the early macronova (or kilonova) emission and detectable by LSST in the near future. We also propose a new method to constrain the unknown jet properties, such as the Lorentz factor, opening angle, emission radii, and jet launch time, by observing both the forward and counter jets. To scrutinize the counter jets, space GW detectors like DECIGO are powerful tools, forecasting the merger time (to ≲ 1 s) and position (to ≲ 1 arcmin) about a week before the merger.
The Zwicky Transient Facility: Overview and Commissioning Activities
NASA Astrophysics Data System (ADS)
Graham, Matthew; Zwicky Transient Facility (ZTF) Project Team
2018-01-01
The Zwicky Transient Facility (ZTF) is the first of a new generation of LSST-scope sky surveys to be realized. It will employ a 47 square degree field-of-view camera mounted on the Samuel Oschin 48-inch Schmidt telescope at Palomar Observatory to scan more than 3750 square degrees an hour to a depth of 20.5 – 21 mag. This will lead to unprecedented discovery rates for transients – a young supernova less than 24 hours after its explosion each night as well as rarer and more exotic sources. Repeated imaging of the Northern sky (including the Galactic Plane) will produce a photometric variability catalog with nearly 300 observations each year, ideal for studies of variable stars, binaries, AGN, and asteroids. ZTF represents a significant increase in scale relative to previous surveys in terms of both data volume and data complexity. It will be the first survey to produce one million alerts a night and the first to have a trillion row data archive. We will present an overview of the survey and its challenges and describe recent commissioning activities.
Priming the search for cosmic superstrings using GADGET2 simulations
NASA Astrophysics Data System (ADS)
Cousins, Bryce; Jia, Hewei; Braverman, William; Chernoff, David
2018-01-01
String theory is an extensive mathematical theory which, despite its broad explanatory power, is still lacking empirical support. However, this may change when considering the scope of cosmology, where “cosmic superstrings” may serve as observational evidence. According to string theory, these superstrings were stretched to cosmic scales in the early Universe and may now be detectable, via microlensing or gravitational radiation. Negative results from prior surveys have put some limits on superstring properties, so to investigate the parameter space more effectively, we ask: “where should we expect to find cosmic superstrings, and how many should we predict?” This research investigates these questions by simulating cosmic string behavior during structure formation in the universe using GADGET2. The sizes and locations of superstring clusters are assessed using kernel density estimation and radial correlation functions. Currently, only preliminary small-scale simulations have been performed, producing superstring clustering with low sensitivity. However, future simulations of greater magnitude will offer far higher resolution, allowing us to more precisely track superstring behavior within structures. Such results will guide future searches, most imminently those made possible by LSST and WFIRST.
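As a minimal illustration of the cluster-density estimation step mentioned above, the sketch below applies a Gaussian kernel density estimate to a set of 3-D positions. The positions are random stand-ins rather than simulation output, and the analysis choices (kernel, default bandwidth, query points) are assumptions, not the authors' actual pipeline.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical 3-D positions of string segments; shape (3, N) as required by
# gaussian_kde. A real analysis would load these from a simulation snapshot.
rng = np.random.default_rng(0)
positions = rng.normal(size=(3, 500))

kde = gaussian_kde(positions)            # smooth density estimate of clustering
query = rng.uniform(-3, 3, size=(3, 1000))
density = kde(query)                     # density sampled at query points
print("peak sampled density:", density.max())
```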
SKA weak lensing - I. Cosmological forecasts and the power of radio-optical cross-correlations
NASA Astrophysics Data System (ADS)
Harrison, Ian; Camera, Stefano; Zuntz, Joe; Brown, Michael L.
2016-12-01
We construct forecasts for cosmological parameter constraints from weak gravitational lensing surveys involving the Square Kilometre Array (SKA). Considering matter content, dark energy and modified gravity parameters, we show that the first phase of the SKA (SKA1) can be competitive with other Stage III experiments such as the Dark Energy Survey and that the full SKA (SKA2) can potentially form tighter constraints than Stage IV optical weak lensing experiments, such as those that will be conducted with LSST, WFIRST-AFTA or Euclid-like facilities. Using weak lensing alone, going from SKA1 to SKA2 represents improvements by factors of ˜10 in matter, ˜10 in dark energy and ˜5 in modified gravity parameters. We also show, for the first time, the powerful result that comparably tight constraints (within ˜5 per cent) for both Stage III and Stage IV experiments, can be gained from cross-correlating shear maps between the optical and radio wavebands, a process which can also eliminate a number of potential sources of systematic errors which can otherwise limit the utility of weak lensing cosmology.
Liverpool Telescope and Liverpool Telescope 2
NASA Astrophysics Data System (ADS)
Copperwheat, C. M.; Steele, I. A.; Barnsley, R. M.; Bates, S. D.; Clay, N. R.; Jermak, H.; Marchant, J. M.; Mottram, C. J.; Piascik, A.; Smith, R. J.
2016-12-01
The Liverpool Telescope is a fully robotic optical/near-infrared telescope with a 2-metre clear aperture, located at the Observatorio del Roque de los Muchachos on the Canary Island of La Palma. The telescope is owned and operated by Liverpool John Moores University, with financial support from the UK's Science and Technology Facilities Council. The telescope began routine science operations in 2004 and is a common-user facility with time available through a variety of committees via an open, peer reviewed process. Seven simultaneously mounted instruments support a broad science programme, with a focus on transient follow-up and other time domain topics well suited to the characteristics of robotic observing. Development has also begun on a successor facility, with the working title `Liverpool Telescope 2', to capitalise on the new era of time domain astronomy which will be brought about by the next generation of survey facilities such as LSST. The fully robotic Liverpool Telescope 2 will have a 4-metre aperture and an improved response time. In this paper we provide an overview of the current status of both facilities.
The dynamics and control of large flexible space structures-V
NASA Technical Reports Server (NTRS)
Bainum, P. M.; Reddy, A. S. S. R.; Diarra, C. M.; Kumar, V. K.
1982-01-01
A general survey of the progress made in the areas of mathematical modelling of the system dynamics, structural analysis, development of control algorithms, and simulation of environmental disturbances is presented. Graph theory techniques are employed to examine the effects of inherent damping associated with LSST systems on the number and locations of the required control actuators. A mathematical model of the forces and moments induced on a flexible orbiting beam due to solar radiation pressure is developed, and typical steady-state open-loop responses are obtained for the case when rotations and vibrations are limited to occur within the orbit plane. A preliminary controls analysis based on a truncated (13-mode) finite element model of the 122 m Hoop/Column antenna indicates that a minimum of six appropriately placed actuators is required for controllability. An algorithm to evaluate the coefficients which describe coupling between the rigid rotational and flexible modes, as well as intramodal coupling, was developed; numerical evaluation based on the finite element model of the Hoop/Column system is currently in progress.
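The controllability conclusion above rests on the standard Kalman rank test: the pair (A, B) is controllable if and only if [B, AB, ..., A^(n-1)B] has full row rank. The sketch below applies that test to a toy two-mode, single-actuator system; the matrices are invented for illustration and have nothing to do with the 13-mode Hoop/Column model in the abstract.

```python
import numpy as np

def is_controllable(A, B):
    """Kalman rank test on the controllability matrix [B, AB, ..., A^(n-1)B]."""
    n = A.shape[0]
    blocks = [B]
    for _ in range(n - 1):
        blocks.append(A @ blocks[-1])
    return np.linalg.matrix_rank(np.hstack(blocks)) == n

# Toy example: two lightly damped modes at distinct frequencies, one actuator
# that excites both modes.
A = np.array([[0.0, 1.0, 0.0, 0.0],
              [-1.0, -0.01, 0.0, 0.0],
              [0.0, 0.0, 0.0, 1.0],
              [0.0, 0.0, -4.0, -0.02]])
B = np.array([[0.0], [1.0], [0.0], [0.5]])
print(is_controllable(A, B))  # True for this toy system
```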
Science capabilities of the Maunakea Spectroscopic Explorer
NASA Astrophysics Data System (ADS)
Devost, Daniel; McConnachie, Alan; Flagey, Nicolas; Cote, Patrick; Balogh, Michael; Driver, Simon P.; Venn, Kim
2017-01-01
The Maunakea Spectroscopic Explorer (MSE) project will transform the CFHT 3.6m optical telescope into a 10m class dedicated multiobject spectroscopic facility, with an ability to simultaneously measure thousands of objects with a spectral resolution range spanning 2,000 to 20,000. The project is currently in design phase, with full science operations nominally starting in 2025. MSE will enable transformational science in areas as diverse as exoplanetary host characterization; stellar monitoring campaigns; tomographic mapping of the interstellar and intergalactic media; the in-situ chemical tagging of the distant Galaxy; connecting galaxies to the large scale structure of the Universe; measuring the mass functions of cold dark matter sub-halos in galaxy and cluster-scale hosts; reverberation mapping of supermassive black holes in quasars. MSE is an essential follow-up facility to current and next generations of multi-wavelength imaging surveys, including LSST, Gaia, Euclid, eROSITA, SKA, and WFIRST, and is an ideal feeder facility for E-ELT, TMT and GMT. I will give an update on the status of the project and review some of the most exciting scientific capabilities of the observatory.
Status of mirror segment production for the Giant Magellan Telescope
NASA Astrophysics Data System (ADS)
Martin, H. M.; Burge, J. H.; Davis, J. M.; Kim, D. W.; Kingsley, J. S.; Law, K.; Loeff, A.; Lutz, R. D.; Merrill, C.; Strittmatter, P. A.; Tuell, M. T.; Weinberger, S. N.; West, S. C.
2016-07-01
The Richard F. Caris Mirror Lab at the University of Arizona is responsible for production of the eight 8.4 m segments for the primary mirror of the Giant Magellan Telescope, including one spare off-axis segment. We report on the successful casting of Segment 4, the center segment. Prior to generating the optical surface of Segment 2, we carried out a major upgrade of our 8.4 m Large Optical Generator. The upgrade includes new hardware and software to improve accuracy, safety, reliability and ease of use. We are currently carrying out an upgrade of our 8.4 m polishing machine that includes improved orbital polishing capabilities. We added and modified several components of the optical tests during the manufacture of Segment 1, and we have continued to improve the systems in preparation for Segments 2-8. We completed two projects that were prior commitments before GMT Segment 2: casting and polishing the combined primary and tertiary mirrors for the LSST, and casting and generating a 6.5 m mirror for the Tokyo Atacama Observatory.
How the cosmic web induces intrinsic alignments of galaxies
NASA Astrophysics Data System (ADS)
Codis, S.; Dubois, Y.; Pichon, C.; Devriendt, J.; Slyz, A.
2016-10-01
Intrinsic alignments are believed to be a major source of systematics for the next generation of weak gravitational lensing surveys such as Euclid or LSST. Direct measurements of the alignment of the projected light distribution of galaxies in wide-field imaging data seem to agree on a contamination at the level of a few per cent of the shear correlation functions, although the amplitude of the effect depends on the population of galaxies considered. Given this dependency, it is difficult to use dark matter-only simulations as the sole resource to predict and control intrinsic alignments. We report here estimates of the level of intrinsic alignment in the cosmological hydrodynamical simulation Horizon-AGN, which could be a major source of systematic errors in weak gravitational lensing measurements. In particular, assuming that the spin of galaxies is a good proxy for their ellipticity, we show how those spins are spatially correlated and how they couple to the tidal field in which they are embedded. We will also present theoretical calculations that illustrate and qualitatively explain the observed signals.
Detection technique for artificially illuminated objects in the outer solar system and beyond.
Loeb, Abraham; Turner, Edwin L
2012-04-01
Existing and planned optical telescopes and surveys can detect artificially illuminated objects, comparable in total brightness to a major terrestrial city, at the outskirts of the Solar System. Orbital parameters of Kuiper belt objects (KBOs) are routinely measured to exquisite precisions of < 10^-3. Here, we propose to measure the variation of the observed flux F from such objects as a function of their changing orbital distances D. Sunlight-illuminated objects will show a logarithmic slope α ≡ (d log F/d log D) = -4, whereas artificially illuminated objects should exhibit α = -2. The proposed Large Synoptic Survey Telescope (LSST) and other planned surveys will provide superb data and allow measurement of α for thousands of KBOs. If objects with α = -2 are found, follow-up observations could measure their spectra to determine whether they are illuminated by artificial lighting. The search can be extended beyond the Solar System with future generations of telescopes on the ground and in space that would have the capacity to detect phase modulation due to very strong artificial illumination on the nightside of planets as they orbit their parent stars.
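The α discriminant above is straightforward to estimate from photometric monitoring: fit a straight line to log F versus log D. The sketch below does this for two synthetic objects whose fluxes follow the D^-4 and D^-2 scalings; the distances and flux normalization are made up purely for illustration.

```python
import numpy as np

# Estimate alpha = d log F / d log D from flux measurements at several
# orbital distances. Values near -4 indicate reflected sunlight; values
# near -2 would flag intrinsic (e.g., artificial) illumination.
D = np.array([30.0, 35.0, 40.0, 45.0])        # heliocentric distance (AU), synthetic
F_sunlit = 1.0e-3 * (D / 30.0) ** -4          # synthetic sunlight-reflecting object
F_selflit = 1.0e-3 * (D / 30.0) ** -2         # synthetic self-illuminated object

for F in (F_sunlit, F_selflit):
    alpha, _ = np.polyfit(np.log10(D), np.log10(F), 1)
    print(f"alpha = {alpha:.2f}")
```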
Making The Most Of Flaring M Dwarfs
NASA Astrophysics Data System (ADS)
Hunt-Walker, Nicholas; Hilton, E.; Kowalski, A.; Hawley, S.; Matthews, J.; Holtzman, J.
2011-01-01
We present observations of flare activity using the Microvariability and Oscillations of Stars (MOST) satellite in conjunction with simultaneous spectroscopic and photometric observations from the ARC 3.5-meter, NMSU 1.0-meter, and ARCSAT 0.5-meter telescopes at the Apache Point Observatory. The MOST observations enable unprecedented completeness with regard to observing frequent, low-energy flares on the well-known dMe flare star AD Leo with broadband photometry. The observations span approximately one week with a 60-second cadence and are sensitive to flares as small as 0.01-magnitudes. The time-resolved, ground-based spectroscopy gives measurements of Hα and other important chromospheric emission lines, whereas the Johnson U-, SDSS u-, and SDSS g-band photometry provide color information during the flare events and allow us to relate the MOST observations to decades of previous broadband observations. Understanding the rates and energetics of flare events on M dwarfs will help characterize this source of variability in large time-domain surveys such as LSST and Pan-STARRS. Flare rates are also of interest to astrobiology, since flares affect the habitability of exoplanets orbiting M dwarfs.
Meira, Erik P.; En Gilpin, Hui; Brunette, Meredith
2011-01-01
Background and Purpose: Golf is a popular sport played by hundreds of thousands of individuals of all ages and of varying skill levels. An orthopedic or sports-related injury and/or surgery may limit an individual's sport participation, require him/her to complete a course of rehabilitation, and initiate (or resume) a sport-specific training program. Unlike the availability of evidence to guide postsurgical rehabilitation and sport-specific training of athletes from sports other than golf, there have only been two reports describing outcomes after surgery for golfers. The purpose of this case report is to present a post-rehabilitation return-to-sport training program for a recreational golfer 11 months after a rotator cuff repair. Case Description: The subject, a 67-year-old female, injured her right shoulder, requiring a rotator cuff repair 11 months prior to her participation in a golf fitness training program. The subject participated in six training sessions over a seven-week period consisting of general strengthening exercises (including exercises for the rotator cuff), exercises for the core, plyometrics, and power exercises. Outcomes: The subject made improvements in power and muscular endurance of the core. She was able to resume golf at the completion of the training program. Discussion: The subject was able to make functional improvements and return to golf after participation in a comprehensive strength program. Additional studies are necessary to improve program design for golfers who wish to return to sport after shoulder surgery. PMID:22163096
Jeon, Mi Yang; Jeong, HyeonCheol; Petrofsky, Jerrold; Lee, Haneul; Yim, JongEun
2014-01-01
Background Falling can lead to severe health issues in the elderly and importantly contributes to morbidity, death, immobility, hospitalization, and early entry to long-term care facilities. The aim of this study was to devise a recurrent fall prevention program for elderly women in rural areas. Material/Methods This study adopted an assessor-blinded, randomized, controlled trial methodology. Subjects were enrolled in a 12-week recurrent fall prevention program, which comprised strength training, balance training, and patient education. Muscle strength and endurance of the ankles and the lower extremities, static balance, dynamic balance, depression, compliance with preventive behavior related to falls, fear of falling, and fall self-efficacy at baseline and immediately after the program were assessed. Sixty-two subjects (mean age 69.2±4.3 years old) completed the program – 31 subjects in the experimental group and 31 subjects in the control group. Results When the results of the program in the 2 groups were compared, significant differences were found in ankle heel rise test, lower extremity heel rise test, dynamic balance, depression, compliance with fall preventative behavior, fear of falling, and fall self-efficacy (p<0.05), but no significant difference was found in static balance. Conclusions This study shows that the fall prevention program described effectively improves muscle strength and endurance, balance, and psychological aspects in elderly women with a fall history. PMID:25394805
Online diabetes self-management program: a randomized study.
Lorig, Kate; Ritter, Philip L; Laurent, Diana D; Plant, Kathryn; Green, Maurice; Jernigan, Valarie Blue Bird; Case, Siobhan
2010-06-01
We hypothesized that people with type 2 diabetes in an online diabetes self-management program, compared with usual-care control subjects, would 1) demonstrate reduced A1C at 6 and 18 months, 2) have fewer symptoms, 3) demonstrate increased exercise, and 4) have improved self-efficacy and patient activation. In addition, participants randomized to listserv reinforcement would have better 18-month outcomes than participants receiving no reinforcement. A total of 761 participants were randomized to 1) the program, 2) the program with e-mail reinforcement, or 3) usual care (no treatment). This sample included 110 American Indians/Alaska Natives (AI/ANs). Analyses of covariance models were used at the 6- and 18-month follow-up to compare groups. At 6 months, A1C, patient activation, and self-efficacy were improved for program participants compared with usual-care control subjects (P < 0.05). There were no changes in other health or behavioral indicators. The AI/AN program participants demonstrated improvements in health distress and activity limitation compared with usual-care control subjects. The subgroup with initial A1C >7% demonstrated stronger improvement in A1C (P = 0.01). At 18 months, self-efficacy and patient activation were improved for program participants. A1C was not measured. Reinforcement showed no improvement. An online diabetes self-management program is acceptable for people with type 2 diabetes. Although the results were mixed, they suggest 1) that the program may have beneficial effects in reducing A1C, 2) AI/AN populations can be engaged in and benefit from online interventions, and 3) our follow-up reinforcement appeared to have no value.
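The "analyses of covariance" step above can be sketched as a regression of the follow-up outcome on treatment group adjusted for the baseline value. The data frame, variable names, and numbers below are hypothetical, not the study's data.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: one row per participant with baseline and 6-month A1C
# plus the randomized arm (program, program + e-mail, usual care).
df = pd.DataFrame({
    "a1c_6mo":  [7.1, 6.8, 7.9, 8.2, 7.0, 7.6, 8.4, 6.9],
    "a1c_base": [7.8, 7.5, 8.1, 8.3, 7.7, 7.9, 8.6, 7.4],
    "group":    ["program", "program", "email", "usual",
                 "usual", "email", "usual", "program"],
})

# ANCOVA: follow-up A1C modeled as a group effect adjusted for baseline A1C,
# with usual care as the reference level.
model = smf.ols("a1c_6mo ~ C(group, Treatment(reference='usual')) + a1c_base",
                data=df).fit()
print(model.summary())
```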
Eye-movements and ongoing task processing.
Burke, David T; Meleger, Alec; Schneider, Jeffrey C; Snyder, Jim; Dorvlo, Atsu S S; Al-Adawi, Samir
2003-06-01
This study tests the relation between eye-movements and thought processing. Subjects were given specific modality tasks (visual, gustatory, kinesthetic) and assessed on whether they responded with distinct eye-movements. Some subjects' eye-movements reflected ongoing thought processing. Instead of a universal pattern, as suggested by the neurolinguistic programming hypothesis, this study yielded subject-specific idiosyncratic eye-movements across all modalities. Included is a discussion of the neurolinguistic programming hypothesis regarding eye-movements and its implications for the eye-movement desensitization and reprocessing theory.
32 CFR 199.15 - Quality and utilization review peer review organization program.
Code of Federal Regulations, 2011 CFR
2011-07-01
... CHAMPUS program. In recognition of the similarity of purpose and design between the Medicare and CHAMPUS... by OCHAMPUS as subject to a pattern of abuse shall be the subject of intensified quality assurance.... (iv) Notify OCHAMPUS of all such actions. (2) Findings related to a pattern of inappropriate practices...
32 CFR 199.15 - Quality and utilization review peer review organization program.
Code of Federal Regulations, 2010 CFR
2010-07-01
... CHAMPUS program. In recognition of the similarity of purpose and design between the Medicare and CHAMPUS... by OCHAMPUS as subject to a pattern of abuse shall be the subject of intensified quality assurance.... (iv) Notify OCHAMPUS of all such actions. (2) Findings related to a pattern of inappropriate practices...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-07
... that are not subject to Independent Public Accountant (IPA) audit requirements. DATES: Comments Due... numerous PHAs that are not subject to Independent Public Accountant (IPA) audit requirements. Number of... Proposed Information Collection to OMB Public Housing Capital Fund Program AGENCY: Office of the Chief...
ERIC Educational Resources Information Center
Ragonis, Noa; Shilo, Gila
2014-01-01
The paper presents a theoretical investigational study of the potential advantages that secondary school learners may gain from learning two different subjects, namely, logic programming within computer science studies and argumentation texts within linguistics studies. The study suggests drawing an analogy between the two subjects since they both…
44 CFR 4.3 - What programs and activities of FEMA are subject to these regulations?
Code of Federal Regulations, 2010 CFR
2010-10-01
... 44 Emergency Management and Assistance 1 2010-10-01 2010-10-01 false What programs and activities of FEMA are subject to these regulations? 4.3 Section 4.3 Emergency Management and Assistance FEDERAL EMERGENCY MANAGEMENT AGENCY, DEPARTMENT OF HOMELAND SECURITY GENERAL INTERGOVERNMENTAL REVIEW OF FEDERAL...
ERIC Educational Resources Information Center
And Others; Stalonas, Peter M., Jr.
1978-01-01
Investigated behavioral programs for obesity. Exercise and self-managed contingency components were compared using obese subjects who were evaluated after treatment and follow-up. Significant weight loss was observed at termination. The influence of exercise at follow-up was noticeable. Subjects engaged in behaviors, yet behaviors were not related…
30 CFR 250.910 - Which of my facilities are subject to the Platform Verification Program?
Code of Federal Regulations, 2013 CFR
2013-07-01
..., production, and pipeline risers, and riser tensioning systems (each platform must be designed to accommodate all the loads imposed by all risers and riser does not have tensioning systems);(ii) Turrets and... are subject to the Platform Verification Program: (i) Drilling, production, and pipeline risers, and...
30 CFR 250.910 - Which of my facilities are subject to the Platform Verification Program?
Code of Federal Regulations, 2011 CFR
2011-07-01
..., production, and pipeline risers, and riser tensioning systems (each platform must be designed to accommodate all the loads imposed by all risers and riser does not have tensioning systems);(ii) Turrets and... are subject to the Platform Verification Program: (i) Drilling, production, and pipeline risers, and...
30 CFR 250.910 - Which of my facilities are subject to the Platform Verification Program?
Code of Federal Regulations, 2012 CFR
2012-07-01
..., production, and pipeline risers, and riser tensioning systems (each platform must be designed to accommodate all the loads imposed by all risers and riser does not have tensioning systems);(ii) Turrets and... are subject to the Platform Verification Program: (i) Drilling, production, and pipeline risers, and...
30 CFR 250.910 - Which of my facilities are subject to the Platform Verification Program?
Code of Federal Regulations, 2014 CFR
2014-07-01
..., production, and pipeline risers, and riser tensioning systems (each platform must be designed to accommodate all the loads imposed by all risers and riser does not have tensioning systems);(ii) Turrets and... are subject to the Platform Verification Program: (i) Drilling, production, and pipeline risers, and...
Victory of Prepaid Tuition: Court Says Michigan Plan Is Not Subject to Taxes.
ERIC Educational Resources Information Center
Healy, Patricia
1994-01-01
An appellate court found that Michigan's popular prepaid college tuition program is not subject to federal income taxes, reversing an earlier decision. The program, considered a model for other states, was suspended in 1991 due to fund depletion from tax payments. The Internal Revenue Service is considering an appeal. (MSE)
Lechner, L; De Vries, H
1995-11-01
This article presents a study of the determinants of starting participation in an employee fitness program. Information was obtained from 488 employees recruited from two worksites, and the determinants of their participation were studied. A questionnaire based on two theoretical models was used. The Stages of Change model was used to measure health behavior, consisting of precontemplation (no intention to participate), contemplation (considering participation), preparation (intending to participate within a short period), and action (participating in fitness). The possible determinants were measured according to the ASE model, including attitude toward an employee fitness program, social influence, and self-efficacy expectations. Subjects in the action stage were most convinced of the benefits of participation in the employee fitness program and of their own skills to participate in a fitness program. Subjects in the precontemplation stage were least convinced of the advantages of participation and had the lowest self-efficacy scores. Subjects in the action stage experienced the most social support to participate in the employee fitness program. Health education for employees within industrial fitness programs can be tailored to their motivational stage. Promotional activities for industrial fitness programs should concentrate on persons in the precontemplation and contemplation stages, since people in these stages are insufficiently convinced of the advantages of a fitness program and expect many problems with regard to their ability to participate in the program.
Consumer use of health-related endorsements on food labels in the United Kingdom and Australia.
Rayner, M; Boaz, A; Higginson, C
2001-01-01
The objective of this research was to examine how consumers use health-related food endorsements on food labels. Three endorsement programs were examined: those of the two major retailers in the United Kingdom, Tesco and Sainsbury's, and the "Pick the Tick" program of the National Heart Foundation of Australia. The main methodology used was protocol analysis. This involves the subject "thinking aloud" while performing a task--in this case, (a) shopping normally and (b) shopping "healthily" for foods on a predetermined list--to generate a protocol. Each subject was also interviewed to investigate reported use of endorsements. Subjects were a quota sample (N = 44) of shoppers representative of the U.K. and Australian populations. Information about the subjects, the protocols, and interview data were analyzed quantitatively; the protocols were also analyzed qualitatively. Sainsbury's and Australian shoppers never used the endorsements when shopping but Tesco shoppers did, albeit rarely. Tesco shoppers used the endorsement in complex ways and not just as a trigger to food selection. They sometimes used the endorsement to reject endorsed foods. Subjects claimed to use the endorsements even though the protocol analysis revealed no actual use. There are features of the Tesco endorsement program that make it more helpful to consumers than the other programs.
NASA Astrophysics Data System (ADS)
Ward, Peggy
Although hailed as a powerful form of instruction, inquiry-based instruction is, in most teaching and learning contexts, fraught with ambiguous and conflicting definitions and descriptions. Yet little has been written about the experiences preservice science teachers have as they learn to teach science through inquiry. This project sought to understand how select preservice secondary science teachers enrolled in three UTeach programs in Arkansas conceptualize inquiry instruction and how they rationalize its value in a teaching and learning context. The three teacher education programs investigated in this study are adoption sites aligned with the UTeach Program in Austin, TX, which distinguishes itself in part by its inquiry emphasis. Using a mixed method investigation design, this study utilized two sources of data to explore the preservice science teachers' thinking. In the first phase, a modified version of the Pedagogy of Science Teaching Test (POSTT) was used to identify select program participants who indicated preferences for inquiry instruction over other instructional strategies. In the second phase, the study used an open-ended questionnaire to explore the selected subjects' beliefs and conceptions of teaching and learning science in an inquiry context. The study also focused on identifying particular junctures in the prospective science teachers' education preparation that might impact their understanding about inquiry. Using a constant comparative approach, this study explored 19 preservice science teachers' conceptions about inquiry. The results indicate that across all levels of instruction, the prospective teachers tended to have strong student-centered teaching orientations. Except for subjects in the earliest courses, subjects' definitions and descriptions of inquiry tended toward a few of the science practices. More advanced subjects, however, expressed more in-depth descriptions. Excluding the subjects who had completed the program, multiple subjects tended to associate inquiry learning exclusively with exploring before lecture and with getting a single correct answer. Additionally, various subjects at multiple levels described inquiry in terms of the 5E Model of Instruction, which is emphasized in the Arkansas UTeach lesson design. Implications of these findings and suggestions for program improvement at the course level are presented.
34 CFR 668.10 - Direct assessment programs.
Code of Federal Regulations, 2010 CFR
2010-07-01
... such as creativity, analysis or synthesis associated with the subject matter of the program. Examples... measurement apply to direct assessment programs. Because a direct assessment program does not utilize credit... program includes regularly scheduled learning sessions, faculty-guided independent study, consultations...
MSFC Skylab Orbital Workshop, volume 5
NASA Technical Reports Server (NTRS)
1974-01-01
The various programs involved in the development of the Skylab Orbital Workshop are discussed. The subjects considered include the following: (1) reliability program, (2) system safety program, (3) testing program, (4) engineering program management, (5) mission operations support, and (6) aerospace applications.
Impact of a Post-Discharge Integrated Disease Management Program on COPD Hospital Readmissions.
Russo, Ashlee N; Sathiyamoorthy, Gayathri; Lau, Chris; Saygin, Didem; Han, Xiaozhen; Wang, Xiao-Feng; Rice, Richard; Aboussouan, Loutfi S; Stoller, James K; Hatipoğlu, Umur
2017-11-01
Readmission following a hospitalization for COPD is associated with significant health-care expenditure. A multicomponent COPD post-discharge integrated disease management program was implemented at the Cleveland Clinic to improve the care of patients with COPD and reduce readmissions. This retrospective study reports our experience with the program. Groups of subjects who were exposed to different components of the program were compared regarding their readmission rates. Multivariate logistic regression analysis was performed to build predictive models for 30- and 90-d readmission. One hundred sixty subjects completed a 90-d follow-up, of whom 67 attended the exacerbation clinic, 16 subjects received care coordination, 51 subjects completed both, and 26 subjects did not participate in any component despite referral. Thirty- and 90-d readmission rates for the entire group were 18.1 and 46.2%, respectively. Thirty- and 90-d readmission rates for the individual groups were: exacerbation clinic, 11.9 and 35.8%; care coordination, 25.0 and 50.0%; both, 19.6 and 41.2%; and neither, 26.9 and 80.8%, respectively. The model with the best predictive ability for 30-d readmission risk included the number of hospitalizations within the previous year and use of noninvasive ventilation (C statistic of 0.84). The model for 90-d readmission risk included receiving any component of the post-discharge integrated disease management program, the number of hospitalizations, and primary care physician visits within the previous year (C statistic of 0.87). Receiving any component of a post-discharge integrated disease management program was associated with reduced 90-d readmission rate. Previous health-care utilization and lung function impairment were strong predictors of readmission. Copyright © 2017 by Daedalus Enterprises.
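For readers less familiar with the modeling, the readmission models above are logistic regressions whose C statistic is the area under the ROC curve. The sketch below mirrors the named predictors (prior-year hospitalizations, noninvasive ventilation use) but uses hypothetical data; it is not the study's code.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Hypothetical per-subject predictors: prior-year hospitalizations and
# noninvasive ventilation use (0/1), as named in the abstract.
X = np.array([[0, 0], [1, 0], [2, 1], [3, 1], [0, 1], [4, 0], [1, 1], [2, 0]])
y = np.array([0, 0, 1, 1, 0, 1, 1, 0])  # 30-d readmission (hypothetical)

model = LogisticRegression().fit(X, y)
# The C statistic for a binary outcome equals the ROC AUC of the predicted risks.
c_stat = roc_auc_score(y, model.predict_proba(X)[:, 1])
print(f"C statistic: {c_stat:.2f}")
```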
Confirmational study: a positive-based thumb and finger sucking elimination program.
Green, Shari E
2010-11-01
This article emphasizes the critical need for information specifically regarding the topic of retained sucking behaviors. The study aimed to confirm the results reported by Van Norman in 1997 for 723 subjects. Parent surveys were collected on 441 subjects who received an orofacial myofunctional treatment program provided by one certified orofacial myologist. Results of this study confirm that retained digit sucking behavior may be addressed successfully and expediently by a program based on positive behavior modification techniques.
45 CFR 1214.149 - Program accessibility: Discrimination prohibited.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 45 Public Welfare 4 2012-10-01 2012-10-01 false Program accessibility: Discrimination prohibited... PROGRAMS OR ACTIVITIES CONDUCTED BY ACTION § 1214.149 Program accessibility: Discrimination prohibited... of, be excluded from participation in, or otherwise be subjected to discrimination under any program...
Comprehension: The Challenge for Children's Television.
ERIC Educational Resources Information Center
Storm, Susan R.
The purpose of this research was to determine young children's comprehension of selected TV program content. The subjects were 210 children in grades K-2. All subjects, in groups of five, were shown segments from four TV programs: a scalloped potatoes commercial, a "Batman" and Robin episode, a news story on the MIG-25, and a segment of the…
76 FR 41491 - Applications for New Awards; Arts in Education National Program
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-14
... the arts as a core academic subject in the school curriculum. Interrupted time series design means a... interrupted time series design that relies on the comparison of treatment effects on a single subject or group.... That is, the series should show an ``interruption'' of the prior situation at the time when the program...
Item Response Theory at Subject- and Group-Level. Research Report 90-1.
ERIC Educational Resources Information Center
Tobi, Hilde
This paper reviews the literature about item response models for the subject level and aggregated level (group level). Group-level item response models (IRMs) are used in the United States in large-scale assessment programs such as the National Assessment of Educational Progress and the California Assessment Program. In the Netherlands, these…
24 CFR 906.9 - Title restrictions and encumbrances on properties sold under a homeownership program.
Code of Federal Regulations, 2011 CFR
2011-04-01
... program. (a) If the property is subject to indebtedness under the Annual Contributions Contract (ACC), HUD will continue to make any debt service contributions for which it is obligated under the ACC, and the... title restrictions prescribed by the ACC. Because the property will no longer be subject to the ACC...
24 CFR 906.9 - Title restrictions and encumbrances on properties sold under a homeownership program.
Code of Federal Regulations, 2010 CFR
2010-04-01
... program. (a) If the property is subject to indebtedness under the Annual Contributions Contract (ACC), HUD will continue to make any debt service contributions for which it is obligated under the ACC, and the... title restrictions prescribed by the ACC. Because the property will no longer be subject to the ACC...
Emetic and Electric Shock Alcohol Aversion Therapy: Six- and Twelve-Month Follow-Up.
ERIC Educational Resources Information Center
Cannon, Dale S.; Baker, Timothy B.
1981-01-01
Follow-up data are presented for 6- and 12-months on male alcoholics (N=20) who received either a multifaceted inpatient alcoholism treatment program alone (controls) or emetic or shock aversion therapy in addition to that program. Both emetic and control subjects compiled more days of abstinence than shock subjects. (Author)
Code of Federal Regulations, 2011 CFR
2011-10-01
... coverage. (1) If a State, local or private program provides for health insurance for the full-time... program provides health insurance coverage for the full-time participant, the sponsor must also continue... Selection and Treatment of Participants § 2540.220 Under what circumstances and subject to what conditions...
Code of Federal Regulations, 2010 CFR
2010-10-01
... coverage. (1) If a State, local or private program provides for health insurance for the full-time... program provides health insurance coverage for the full-time participant, the sponsor must also continue... Selection and Treatment of Participants § 2540.220 Under what circumstances and subject to what conditions...
Assessment of Selected Aspects of Teaching Programming in SK and CZ
ERIC Educational Resources Information Center
Záhorec, Jan; Hašková, Alena; Munk, Michal
2014-01-01
The authors of this paper carried out broader international research aimed at assessing computer science education at the upper secondary level of education (ISCED 3A). The assessed school subjects were informatics and programming, the school subjects most commonly taught within computer science at secondary schools. The assessment was based on the…
Kangas, Brian D; Berry, Meredith S; Cassidy, Rachel N; Dallery, Jesse; Vaidya, Manish; Hackenberg, Timothy D
2009-10-01
Adult human subjects engaged in a simulated Rock/Paper/Scissors game against a computer opponent. The computer opponent's responses were determined by programmed probabilities that differed across 10 blocks of 100 trials each. Response allocation in Experiment 1 was well described by a modified version of the generalized matching equation, with undermatching observed in all subjects. To assess the effects of instructions on response allocation, accurate probability-related information on how the computer was programmed to respond was provided to subjects in Experiment 2. Five of 6 subjects played the counter response of the computer's dominant programmed response near-exclusively (e.g., subjects played paper almost exclusively if the probability of rock was high), resulting in minor overmatching, and higher reinforcement rates relative to Experiment 1. On the whole, the study shows that the generalized matching law provides a good description of complex human choice in a gaming context, and illustrates a promising set of laboratory methods and analytic techniques that capture important features of human choice outside the laboratory.
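For readers unfamiliar with the generalized matching equation referenced above, it relates response ratios to reinforcer ratios in log space, with a slope below 1 corresponding to the undermatching reported here. The sketch below fits the sensitivity and bias parameters to illustrative data, not the study's data.

```python
import numpy as np

# Generalized matching law: log(B1/B2) = a * log(R1/R2) + log(b),
# where a is sensitivity (a < 1 indicates undermatching) and b is bias.
B1 = np.array([120, 90, 60, 45, 30])   # responses allocated to option 1 (illustrative)
B2 = np.array([30, 45, 60, 80, 110])   # responses allocated to option 2
R1 = np.array([40, 30, 20, 12, 6])     # reinforcers earned on option 1
R2 = np.array([6, 12, 20, 28, 38])     # reinforcers earned on option 2

x = np.log10(R1 / R2)
y = np.log10(B1 / B2)
a, log_b = np.polyfit(x, y, 1)          # least-squares slope (sensitivity) and intercept (log bias)
print(f"sensitivity a = {a:.2f}, bias b = {10**log_b:.2f}")
```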
Nuclear thermal propulsion program overview
NASA Technical Reports Server (NTRS)
Bennett, Gary L.
1991-01-01
Nuclear thermal propulsion program is described. The following subject areas are covered: lunar and Mars missions; national space policy; international cooperation in space exploration; propulsion technology; nuclear rocket program; and budgeting.
Influence of Smartphones and Software on Acoustic Voice Measures
GRILLO, ELIZABETH U.; BROSIOUS, JENNA N.; SORRELL, STACI L.; ANAND, SUPRAJA
2016-01-01
This study assessed the within-subject variability of voice measures captured using different recording devices (i.e., smartphones and a head-mounted microphone) and software programs (i.e., Analysis of Dysphonia in Speech and Voice (ADSV), Multi-Dimensional Voice Program (MDVP), and Praat). Correlations between the software programs that calculated the voice measures were also analyzed. Results demonstrated no significant within-subject variability across devices and software and that some of the measures were highly correlated across software programs. The study suggests that certain smartphones may be appropriate to record daily voice measures representing the effects of vocal loading within individuals. In addition, even though different algorithms are used to compute voice measures across software programs, some of the programs and measures share a similar relationship. PMID:28775797
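A minimal sketch of the cross-software correlation analysis described above: per-subject values of one voice measure from the three programs are compared pairwise. The values below are hypothetical and this is not the study's code.

```python
import pandas as pd

# Hypothetical per-subject values of one voice measure as computed by three programs.
measures = pd.DataFrame({
    "ADSV":  [1.2, 0.9, 1.5, 1.1, 0.8],
    "MDVP":  [1.3, 1.0, 1.4, 1.2, 0.9],
    "Praat": [1.1, 0.8, 1.6, 1.0, 0.7],
})

# Pairwise Pearson correlations between programs computing the same measure.
print(measures.corr(method="pearson"))
```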
45 CFR 63.31 - Protection of human subjects.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 45 Public Welfare 1 2010-10-01 2010-10-01 false Protection of human subjects. 63.31 Section 63.31 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION GRANT PROGRAMS ADMINISTERED... Protection of human subjects. All grants made pursuant to this part are subject to the specific provisions of...
45 CFR 63.31 - Protection of human subjects.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 45 Public Welfare 1 2012-10-01 2012-10-01 false Protection of human subjects. 63.31 Section 63.31 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION GRANT PROGRAMS ADMINISTERED... Protection of human subjects. All grants made pursuant to this part are subject to the specific provisions of...
45 CFR 63.31 - Protection of human subjects.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 45 Public Welfare 1 2011-10-01 2011-10-01 false Protection of human subjects. 63.31 Section 63.31 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION GRANT PROGRAMS ADMINISTERED... Protection of human subjects. All grants made pursuant to this part are subject to the specific provisions of...
45 CFR 63.31 - Protection of human subjects.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 45 Public Welfare 1 2014-10-01 2014-10-01 false Protection of human subjects. 63.31 Section 63.31 Public Welfare Department of Health and Human Services GENERAL ADMINISTRATION GRANT PROGRAMS ADMINISTERED... Protection of human subjects. All grants made pursuant to this part are subject to the specific provisions of...
45 CFR 63.31 - Protection of human subjects.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 45 Public Welfare 1 2013-10-01 2013-10-01 false Protection of human subjects. 63.31 Section 63.31 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION GRANT PROGRAMS ADMINISTERED... Protection of human subjects. All grants made pursuant to this part are subject to the specific provisions of...
Hypothesized eye movements of neurolinguistic programming: a statistical artifact.
Farmer, A; Rooney, R; Cunningham, J R
1985-12-01
Neurolinguistic programming's hypothesized eye-movements were measured independently from videotapes of 30 subjects, aged 15 to 76 yr., who were asked to recall visual pictures, recorded audio sounds, and textural objects. Chi-squared tests indicated that subjects' responses were significantly different from those predicted. When the chi-squared comparisons were weighted by the number of eye positions assigned to each modality (3 visual, 3 auditory, 1 kinesthetic), subjects' responses did not differ significantly from the expected pattern. These data indicate that the eye-movement hypothesis may represent randomly occurring rather than sensory-modality-related positions.
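The weighting step described above can be sketched as a chi-squared goodness-of-fit test whose expected proportions follow the 3:3:1 split of eye positions across modalities. The counts below are hypothetical, not the study's data.

```python
import numpy as np
from scipy.stats import chisquare

# Hypothetical counts of responses classified as visual, auditory, kinesthetic.
observed = np.array([52, 41, 17])

# Unweighted expectation: equal thirds across the three modalities.
expected_equal = np.full(3, observed.sum() / 3)

# Weighted expectation: 3 visual, 3 auditory, 1 kinesthetic eye positions (3:3:1).
weights = np.array([3, 3, 1]) / 7
expected_weighted = observed.sum() * weights

print(chisquare(observed, expected_equal))     # comparison against equal proportions
print(chisquare(observed, expected_weighted))  # comparison against the weighted pattern
```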
NASA Astrophysics Data System (ADS)
Julianto, Eko Nugroho; Salamah, Ummu
2017-03-01
Under the 2012 curriculum, the Vocational Education Program at Universitas Negeri Semarang allowed students to choose specialization subjects according to their ability. The specialization subjects were offered in the 6th semester to prepare students for field work experience. Each specialization course attracts its own enthusiasts, and students weigh certain considerations when selecting a course. These considerations differ from student to student because each has their own talents, interests, aspirations, and perceptions of the specialization subjects offered by the Construction Engineering Vocational Education Program. The purpose of this study was to determine how much of the 2012 and 2013 cohorts' interest in selecting specialization subjects was attributable to intrinsic and extrinsic factors. This research is descriptive with a quantitative approach, carried out to determine the magnitude of students' interest in choosing specialization courses. The research was conducted at the Civil Engineering Department of Universitas Negeri Semarang, with the 2012 and 2013 PTB student cohorts as subjects and a total sample of 87 students. The results showed that the interest of the 2012 and 2013 students in selecting specialization subjects was 68.06%, meeting the "interested" criterion, with intrinsic factors contributing 35.48% and extrinsic factors 64.52%.
30 CFR 250.911 - If my platform is subject to the Platform Verification Program, what must I do?
Code of Federal Regulations, 2010 CFR
2010-07-01
... a project management timeline, Gantt Chart, that depicts when interim and final reports required by... 30 Mineral Resources 2 2010-07-01 2010-07-01 false If my platform is subject to the Platform Verification Program, what must I do? 250.911 Section 250.911 Mineral Resources MINERALS MANAGEMENT SERVICE...
Code of Federal Regulations, 2011 CFR
2011-01-01
... 10 Energy 2 2011-01-01 2011-01-01 false Physical Protection of Irradiated Reactor Fuel in Transit... Irradiated Reactor Fuel in Transit, Training Program Subject Schedule Pursuant to the provision of § 73.37 of... reactor fuel is required to assure that individuals used as shipment escorts have completed a training...
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 2 2010-01-01 2010-01-01 false Physical Protection of Irradiated Reactor Fuel in Transit... Irradiated Reactor Fuel in Transit, Training Program Subject Schedule Pursuant to the provision of § 73.37 of... reactor fuel is required to assure that individuals used as shipment escorts have completed a training...