Evaluation of Potential LSST Spatial Indexing Strategies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nikolaev, S; Abdulla, G; Matzke, R
2006-10-13
The LSST requirement for producing alerts in near real-time, and the fact that generating an alert depends on knowing the history of light variations for a given sky position, together imply that clustering information for all detections must be available at any time during the survey. Therefore, any data structure describing the clustering of detections in LSST needs to be continuously updated, even as new detections arrive from the pipeline. We call this use case "incremental clustering," to reflect this continuous updating of clustering information. This document describes the evaluation results for several potential LSST incremental clustering strategies, using: (1) a Neighbors table with zone optimization to store spatial clusters (a.k.a. Jim Gray's, or the SDSS, algorithm); (2) the MySQL built-in R-tree implementation; (3) an external spatial index library that supports a query interface.
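The core idea of the zone optimization evaluated above can be sketched briefly: cut the sky into fixed-height declination zones so a cone search scans only a few zones plus an RA window, instead of the whole table. The following is an illustrative sketch only, not the paper's implementation; the zone height, function names, and in-memory dictionary standing in for a zone-clustered table are all assumptions.

```python
import math

ZONE_HEIGHT_DEG = 0.05  # assumed zone height; real systems tune this

def zone_id(dec_deg):
    """Map a declination in degrees to its integer zone number."""
    return int(math.floor((dec_deg + 90.0) / ZONE_HEIGHT_DEG))

def candidate_zones(dec_deg, radius_deg):
    """Zones that can contain neighbors within radius_deg of dec_deg."""
    lo = zone_id(max(-90.0, dec_deg - radius_deg))
    hi = zone_id(min(90.0, dec_deg + radius_deg))
    return range(lo, hi + 1)

def neighbors(detections, ra_deg, dec_deg, radius_deg):
    """Cone search restricted to candidate zones.

    `detections` maps zone id -> list of (ra, dec) pairs, mimicking a
    zone-clustered table; only the relevant zones are scanned.
    """
    out = []
    for z in candidate_zones(dec_deg, radius_deg):
        for ra, dec in detections.get(z, []):
            # Flat-sky separation for illustration; a real system would use
            # a proper spherical distance and an RA window scaled by cos(dec).
            dra = (ra - ra_deg) * math.cos(math.radians(dec_deg))
            if math.hypot(dra, dec - dec_deg) <= radius_deg:
                out.append((ra, dec))
    return out
```

Because new detections only append rows to their zone, this layout supports the incremental-clustering use case: updating the index is a cheap per-zone insert rather than a global rebuild.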
Data Mining Research with the LSST
NASA Astrophysics Data System (ADS)
Borne, Kirk D.; Strauss, M. A.; Tyson, J. A.
2007-12-01
The LSST catalog database will exceed 10 petabytes, comprising several hundred attributes for 5 billion galaxies, 10 billion stars, and over 1 billion variable sources (optical variables, transients, or moving objects), extracted from over 20,000 square degrees of deep imaging in 5 passbands with thorough time-domain coverage: 1000 visits over the 10-year LSST survey lifetime. The opportunities are enormous for novel scientific discoveries within this rich time-domain, ultra-deep, multi-band survey database. Data mining, machine learning, and knowledge discovery research opportunities with the LSST are now under study, with the potential for new collaborations to contribute to these investigations. We will describe features of the LSST science database that are amenable to scientific data mining, object classification, outlier identification, anomaly detection, image quality assurance, and survey science validation. We also give some illustrative examples of current scientific data mining research in astronomy, and point out where new research is needed. In particular, the data mining research community will need to address several issues in the coming years as we prepare for the LSST data deluge. The data mining research agenda includes: scalability (at petabyte scales) of existing machine learning and data mining algorithms; development of grid-enabled parallel data mining algorithms; designing a robust system for brokering classifications from the LSST event pipeline (which may produce 10,000 or more event alerts per night); multi-resolution methods for exploration of petascale databases; visual data mining algorithms for visual exploration of the data; indexing of multi-attribute, multi-dimensional astronomical databases (beyond RA-Dec spatial indexing) for rapid querying of petabyte databases; and more.
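One standard structure for the multi-attribute indexing the agenda calls for, beyond RA-Dec alone, is a k-d tree, which supports box queries over any combination of attributes (e.g. position, magnitude, and color together). The sketch below is a generic textbook illustration under assumed attribute tuples, not LSST's actual indexing design.

```python
def build_kdtree(points, depth=0):
    """Recursively build a k-d tree over equal-length attribute tuples,
    e.g. (ra, dec, magnitude). Splits cycle through the attributes."""
    if not points:
        return None
    k = len(points[0])
    axis = depth % k
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return {
        "point": points[mid],
        "axis": axis,
        "left": build_kdtree(points[:mid], depth + 1),
        "right": build_kdtree(points[mid + 1:], depth + 1),
    }

def range_query(node, lo, hi, out=None):
    """Collect every point p with lo[i] <= p[i] <= hi[i] on all axes,
    pruning subtrees whose half-space cannot intersect the box."""
    if out is None:
        out = []
    if node is None:
        return out
    p, axis = node["point"], node["axis"]
    if all(l <= v <= h for v, l, h in zip(p, lo, hi)):
        out.append(p)
    if lo[axis] <= p[axis]:   # left subtree may still intersect the box
        range_query(node["left"], lo, hi, out)
    if hi[axis] >= p[axis]:   # right subtree may still intersect the box
        range_query(node["right"], lo, hi, out)
    return out
```

For petabyte-scale catalogs the same idea appears in out-of-core and distributed forms, but the pruning logic is unchanged.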
Finally, we will identify opportunities for synergistic collaboration between the data mining research group and the LSST Data Management and Science Collaboration teams.
Examining the Potential of LSST to Contribute to Exoplanet Discovery
NASA Astrophysics Data System (ADS)
Lund, Michael B.; Pepper, Joshua; Jacklin, Savannah; Stassun, Keivan G.
2018-01-01
The Large Synoptic Survey Telescope (LSST), currently under construction in Chile with scheduled first light in 2019, will be one of the major sources of astronomical data in the next decade and is one of the top priorities expressed in the last Decadal Survey. Because LSST is intended to address a range of science questions, the LSST community is still working on optimizing the observing strategy of the survey. With a survey area that will cover half the sky in 6 bands, providing photometric data on billions of stars from 16th to 24th magnitude, LSST can be leveraged to contribute to exoplanet science. In particular, LSST has the potential to detect exoplanets around stellar populations that are not usually included in transiting exoplanet searches. This includes searching for exoplanets around red and white dwarfs, stars in the galactic plane and bulge, stellar clusters, and potentially even the Magellanic Clouds. In probing these varied stellar populations, relative exoplanet frequency can be examined, and in turn, LSST may be able to provide fresh insight into how stellar environment plays a role in planetary formation rates. Our initial work on this project has been to demonstrate that, even with the limitations of the LSST cadence, exoplanets would be recoverable and detectable in the LSST photometry, and to show that exoplanets are indeed worth including in discussions of variable sources to which LSST can contribute. We have continued to expand this work to examine exoplanets around stars belonging to various stellar populations, both to show the types of systems that LSST is capable of discovering and to determine the potential exoplanet yields using standard algorithms already implemented in transiting exoplanet searches, as well as how changes to LSST's observing schedule may impact both of these results.
LSST Survey Data: Models for EPO Interaction
NASA Astrophysics Data System (ADS)
Olsen, J. K.; Borne, K. D.
2007-12-01
The potential for education and public outreach with the Large Synoptic Survey Telescope is as far reaching as the telescope itself. LSST data will be available to the public, giving anyone with a web browser a movie-like window on the Universe. The LSST project is unique in designing its data management and data access systems with the public and community users in mind. The enormous volume of data to be generated by LSST is staggering: 30 Terabytes per night, 10 Petabytes per year. The final database of extracted science parameters from the images will also be enormous -- 50-100 Petabytes -- a rich gold mine for data mining and scientific discovery potential. LSST will also generate 100,000 astronomical alerts per night, for 10 years. The LSST EPO team is examining models for EPO interaction with the survey data, particularly in how the community (amateurs, teachers, students, and general public) can participate in the discovery process. We will outline some of our models of community interaction for inquiry-based science using the LSST survey data, and we invite discussion on these topics.
The LSST Data Mining Research Agenda
NASA Astrophysics Data System (ADS)
Borne, K.; Becla, J.; Davidson, I.; Szalay, A.; Tyson, J. A.
2008-12-01
We describe features of the LSST science database that are amenable to scientific data mining, object classification, outlier identification, anomaly detection, image quality assurance, and survey science validation. The data mining research agenda includes: scalability (at petabyte scales) of existing machine learning and data mining algorithms; development of grid-enabled parallel data mining algorithms; designing a robust system for brokering classifications from the LSST event pipeline (which may produce 10,000 or more event alerts per night); multi-resolution methods for exploration of petascale databases; indexing of multi-attribute, multi-dimensional astronomical databases (beyond spatial indexing) for rapid querying of petabyte databases; and more.
Probing LSST's Ability to Detect Planets Around White Dwarfs
NASA Astrophysics Data System (ADS)
Cortes, Jorge; Kipping, David
2018-01-01
Over the last four years more than 2,000 planets outside our solar system have been discovered, motivating us to search for and characterize potentially habitable worlds. Most planets orbit Sun-like stars, but more exotic stars can also host planets. Debris disks and disintegrating planetary bodies have been detected around white dwarf stars, the inert, Earth-sized cores of once-thriving stars like our Sun. These detections are clues that planets may exist around white dwarfs. Due to the faintness of white dwarfs and the potential rarity of planets around them, a vast survey is required to have a chance at detecting these planetary systems. The Large Synoptic Survey Telescope (LSST), scheduled to commence operations in 2023, will image the entire southern sky every few nights for 10 years, providing our first real opportunity to detect planets around white dwarfs. We characterized LSST’s ability to detect planets around white dwarfs through simulations that incorporate realistic models for LSST’s observing strategy and the white dwarf distribution within the Milky Way galaxy. This was done through the use of LSST's Operations Simulator (OpSim) and Catalog Simulator (CatSim). Our preliminary results indicate that, if all white dwarfs were to possess a planet, LSST would yield a detection for every 100 observed white dwarfs. In the future, a larger set of ongoing simulations will help us quantify the number of planets LSST could potentially find.
The Large Synoptic Survey Telescope as a Near-Earth Object discovery machine
NASA Astrophysics Data System (ADS)
Jones, R. Lynne; Slater, Colin T.; Moeyens, Joachim; Allen, Lori; Axelrod, Tim; Cook, Kem; Ivezić, Željko; Jurić, Mario; Myers, Jonathan; Petry, Catherine E.
2018-03-01
Using the most recent prototypes, design, and as-built system information, we test and quantify the capability of the Large Synoptic Survey Telescope (LSST) to discover Potentially Hazardous Asteroids (PHAs) and Near-Earth Objects (NEOs). We empirically estimate an expected upper limit to the false detection rate in LSST image differencing, using measurements on DECam data and prototype LSST software, and find it to be about 450 deg⁻². We show that this rate is already tractable with the current prototype of the LSST Moving Object Processing System (MOPS) by processing a 30-day simulation consistent with measured false detection rates. We proceed to evaluate the performance of the LSST baseline survey strategy for PHAs and NEOs using a high-fidelity simulated survey pointing history. We find that LSST alone, using its baseline survey strategy, will detect 66% of the PHA and 61% of the NEO population objects brighter than H = 22, with an uncertainty of ±5 percentage points. By generating and examining variations on the baseline survey strategy, we show it is possible to further improve the discovery yields. In particular, we find that extending the LSST survey by two additional years and doubling the MOPS search window increases the completeness for PHAs to 86% (including those discovered by contemporaneous surveys) without jeopardizing other LSST science goals (77% for NEOs). This equates to reducing the undiscovered population of PHAs by an additional 26% (15% for NEOs), relative to the baseline survey.
NASA Astrophysics Data System (ADS)
Saha, A.; Monet, D.
2005-12-01
Continued acquisition and analysis of short-exposure observations support the preliminary conclusion presented by Monet et al. (BAAS v36, p1531, 2004) that a 10-second exposure in 1.0-arcsecond seeing can provide a differential astrometric accuracy of about 10 milliarcseconds. A single solution for mapping coefficients appears to be valid over spatial scales of up to 10 arcminutes, and this suggests that numerical processing can proceed on a per-sensor basis without the need to further divide the individual fields of view into several astrometric patches. Data from the Subaru public archive as well as from the LSST Cerro Pachon 2005 observing campaign and various CTIO and NOAO 4-meter engineering runs have been considered. Should these results be confirmed, the expected astrometric accuracy after 10 years of LSST observations should be around 1.0 milliarcseconds for parallax and 0.2 milliarcseconds/year for proper motions.
Asteroid Discovery and Characterization with the Large Synoptic Survey Telescope
NASA Astrophysics Data System (ADS)
Jones, R. Lynne; Jurić, Mario; Ivezić, Željko
2016-01-01
The Large Synoptic Survey Telescope (LSST) will be a ground-based, optical, all-sky, rapid cadence survey project with tremendous potential for discovering and characterizing asteroids. With LSST's large 6.5m effective-diameter primary mirror, a wide 9.6 square degree field of view, 3.2 Gigapixel camera, and rapid observational cadence, LSST will discover more than 5 million asteroids over its ten year survey lifetime. With a single visit limiting magnitude of 24.5 in r band, LSST will be able to detect asteroids in the Main Belt down to sub-kilometer sizes. The current strawman for the LSST survey strategy is to obtain two visits (each `visit' being a pair of back-to-back 15s exposures) per field, separated by about 30 minutes, covering the entire visible sky every 3-4 days throughout the observing season, for ten years. The catalogs generated by LSST will increase the known number of small bodies in the Solar System by a factor of 10-100 across all populations. The median number of observations for Main Belt asteroids will be on the order of 200-300, with Near Earth Objects receiving a median of 90 observations. These observations will be spread among ugrizy bandpasses, providing photometric colors and allowing sparse lightcurve inversion to determine rotation periods, spin axes, and shape information. These catalogs will be created using automated detection software, the LSST Moving Object Processing System (MOPS), that will take advantage of the carefully characterized LSST optical system, cosmetically clean camera, and recent improvements in difference imaging. Tests with the prototype MOPS software indicate that linking detections (and thus `discovery') will be possible at LSST depths with our working model for the survey strategy, but evaluation of MOPS and improvements in the survey strategy will continue. All data products and software created by LSST will be publicly available.
The Large Synoptic Survey Telescope Science Requirements
NASA Astrophysics Data System (ADS)
Tyson, J. A.; LSST Collaboration
2004-12-01
The Large Synoptic Survey Telescope (LSST) is a wide-field telescope facility that will add a qualitatively new capability in astronomy and will address some of the most pressing open questions in astronomy and fundamental physics. The 8.4-meter telescope and 3 billion pixel camera covering ten square degrees will reach sky in less than 10 seconds in each of 5-6 optical bands. This is enabled by advances in microelectronics, software, and large optics fabrication. The unprecedented optical throughput drives LSST's ability to go faint-wide-fast. The LSST will produce time-lapse digital imaging of faint astronomical objects across the entire visible sky with good resolution. For example, the LSST will provide unprecedented 3-dimensional maps of the mass distribution in the Universe, in addition to the traditional images of luminous stars and galaxies. These weak lensing data can be used to better understand the nature of Dark Energy. The LSST will also provide a comprehensive census of our solar system. By surveying deeply the entire accessible sky every few nights, the LSST will provide large samples of events which we now only rarely observe, and will create substantial potential for new discoveries. The LSST will produce the largest non-proprietary data set in the world. Several key science drivers are representative of the LSST system capabilities: Precision Characterization of Dark Energy, Solar System Map, Optical Transients, and a map of our Galaxy and its environs. In addition to enabling all four of these major scientific initiatives, LSST will make it possible to pursue many other research programs. The community has suggested a number of exciting programs using these data, and the long-lived data archives of the LSST will have the astrometric and photometric precision needed to support entirely new research directions which will inevitably develop during the next several decades.
Scientific Synergy between LSST and Euclid
NASA Astrophysics Data System (ADS)
Rhodes, Jason; Nichol, Robert C.; Aubourg, Éric; Bean, Rachel; Boutigny, Dominique; Bremer, Malcolm N.; Capak, Peter; Cardone, Vincenzo; Carry, Benoît; Conselice, Christopher J.; Connolly, Andrew J.; Cuillandre, Jean-Charles; Hatch, N. A.; Helou, George; Hemmati, Shoubaneh; Hildebrandt, Hendrik; Hložek, Renée; Jones, Lynne; Kahn, Steven; Kiessling, Alina; Kitching, Thomas; Lupton, Robert; Mandelbaum, Rachel; Markovic, Katarina; Marshall, Phil; Massey, Richard; Maughan, Ben J.; Melchior, Peter; Mellier, Yannick; Newman, Jeffrey A.; Robertson, Brant; Sauvage, Marc; Schrabback, Tim; Smith, Graham P.; Strauss, Michael A.; Taylor, Andy; Von Der Linden, Anja
2017-12-01
Euclid and the Large Synoptic Survey Telescope (LSST) are poised to dramatically change the astronomy landscape early in the next decade. The combination of high-cadence, deep, wide-field optical photometry from LSST with high-resolution, wide-field optical photometry, and near-infrared photometry and spectroscopy from Euclid will be powerful for addressing a wide range of astrophysical questions. We explore Euclid/LSST synergy, ignoring the political issues associated with data access to focus on the scientific, technical, and financial benefits of coordination. We focus primarily on dark energy cosmology, but also discuss galaxy evolution, transient objects, solar system science, and galaxy cluster studies. We concentrate on synergies that require coordination in cadence or survey overlap, or would benefit from pixel-level co-processing that is beyond the scope of what is currently planned, rather than scientific programs that could be accomplished only at the catalog level without coordination in data processing or survey strategies. We provide two quantitative examples of scientific synergies: the decrease in photo-z errors (benefiting many science cases) when high-resolution Euclid data are used for LSST photo-z determination, and the resulting increase in weak-lensing signal-to-noise ratio from smaller photo-z errors. We briefly discuss other areas of coordination, including high-performance computing resources and calibration data. Finally, we address concerns about the loss of independence and potential cross-checks between the two missions and the potential consequences of not collaborating.
NASA Astrophysics Data System (ADS)
Darch, Peter T.; Sands, Ashley E.
2016-06-01
Sky surveys, such as the Sloan Digital Sky Survey (SDSS) and the Large Synoptic Survey Telescope (LSST), generate data on an unprecedented scale. While many scientific projects span a few years from conception to completion, sky surveys are typically on the scale of decades. This paper focuses on critical challenges arising from long timescales, and how sky surveys address these challenges. We present findings from a study of LSST, comprising interviews (n=58) and observation. Conceived in the 1990s, the LSST Corporation was formed in 2003, and construction began in 2014. LSST will commence data collection operations in 2022 for ten years. One challenge arising from this long timescale is uncertainty about future needs of the astronomers who will use these data many years hence. Sources of uncertainty include scientific questions to be posed, astronomical phenomena to be studied, and tools and practices these astronomers will have at their disposal. These uncertainties are magnified by the rapid technological and scientific developments anticipated between now and the start of LSST operations. LSST is implementing a range of strategies to address these challenges. Some strategies involve delaying resolution of uncertainty, placing this resolution in the hands of future data users. Other strategies aim to reduce uncertainty by shaping astronomers' data analysis practices so that these practices will integrate well with LSST once operations begin. One approach that exemplifies both types of strategy is the decision to make LSST data management software open source, even now as it is being developed. This policy will enable future data users to adapt this software to evolving needs.
In addition, LSST intends for astronomers to start using this software well in advance of 2022, thereby embedding LSST software and data analysis approaches in the practices of astronomers.These findings strengthen arguments for making the software supporting sky surveys available as open source. Such arguments usually focus on reuse potential of software, and enhancing replicability of analyses. In this case, however, open source software also promises to mitigate the critical challenge of anticipating the needs of future data users.
Photometric Redshifts with the LSST: Evaluating Survey Observing Strategies
NASA Astrophysics Data System (ADS)
Graham, Melissa L.; Connolly, Andrew J.; Ivezić, Željko; Schmidt, Samuel J.; Jones, R. Lynne; Jurić, Mario; Daniel, Scott F.; Yoachim, Peter
2018-01-01
In this paper we present and characterize a nearest-neighbors color-matching photometric redshift estimator that features a direct relationship between the precision and accuracy of the input magnitudes and the output photometric redshifts. This aspect makes our estimator an ideal tool for evaluating the impact of changes to LSST survey parameters that affect the measurement errors of the photometry, which is the main motivation of our work (i.e., it is not intended to provide the "best" photometric redshifts for LSST data). We show how the photometric redshifts will improve with time over the 10-year LSST survey and confirm that the nominal distribution of visits per filter provides the most accurate photo-z results. The LSST survey strategy naturally produces observations over a range of airmass, which offers the opportunity of using an SED- and z-dependent atmospheric effect on the observed photometry as a color-independent redshift indicator. We show that measuring this airmass effect and including it as a prior has the potential to improve the photometric redshifts and can ameliorate extreme outliers, but that it will only be adequately measured for the brightest galaxies, which limits its overall impact on LSST photometric redshifts. We furthermore demonstrate how this airmass effect can induce a bias in the photo-z results, and caution against survey strategies that prioritize high-airmass observations for the purpose of improving this prior. Ultimately, we intend for this work to serve as a guide for the expectations and preparations of the LSST science community with regard to the minimum quality of photo-z as the survey progresses.
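The basic mechanics of a nearest-neighbors color-matching estimator can be illustrated in a few lines: assign each test galaxy the redshift of its nearest training galaxy in color space. This toy sketch is only in the spirit of the abstract; the paper's estimator is more sophisticated (in particular, it propagates magnitude errors into photo-z errors), and the data here are invented for illustration.

```python
import math

def colors(mags):
    """Turn a tuple of magnitudes (e.g. u, g, r, i, z, y) into adjacent
    colors (u-g, g-r, ...), which are what the matching is done in."""
    return [m1 - m2 for m1, m2 in zip(mags, mags[1:])]

def photo_z(test_mags, training_set):
    """Return the redshift of the color-space nearest neighbor.

    `training_set` is a list of (magnitudes, spectroscopic_z) pairs.
    """
    c_test = colors(test_mags)
    best_z, best_d = None, float("inf")
    for mags, z in training_set:
        d = math.dist(c_test, colors(mags))  # Euclidean distance in color space
        if d < best_d:
            best_z, best_d = z, d
    return best_z
```

Note that because only colors enter the distance, a uniform magnitude offset across all bands (e.g. from dust dimming or distance) does not change the match, which is the point of color-matching estimators.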
NOAO and LSST: Illuminating the Path to LSST for All Users
NASA Astrophysics Data System (ADS)
Olsen, Knut A.; Matheson, T.; Ridgway, S. T.; Saha, A.; Lauer, T. R.; NOAO LSST Science Working Group
2013-01-01
As LSST moves toward construction and survey definition, the burden on the user community to begin planning and preparing for the massive data stream grows. In light of the significant challenge and opportunity that LSST now brings, a critical role for a National Observatory will be to advocate for, respond to, and advise the U.S. community on its use of LSST. NOAO intends to establish an LSST Community Science Center to meet these common needs. Such a Center builds on NOAO's leadership in offering survey-style instruments, proposal opportunities, and data management software over the last decade. This leadership has enabled high-impact scientific results, as evidenced by the award of the 2011 Nobel Prize in Physics for the discovery of Dark Energy, which stemmed directly from survey-style observations taken at NOAO. As steps towards creating an LSST Community Science Center, NOAO is 1) supporting the LSST Science Collaborations through membership calls and collaboration meetings; 2) developing the LSST operations simulator, the tool by which the community's scientific goals are tested against the reality of what LSST's cadence can deliver; 3) embarking on a project to establish metrics for science data quality assessment, which will be critical for establishing faith in LSST results; 4) developing a roadmap and proposal to host and support the capability to help the community manage the expected flood of automated alerts from LSST; and 5) starting a serious discussion of the system capabilities needed for photometric and spectroscopic followup of LSST observations. The fundamental goal is to enable productive, world-class research with LSST by the entire US community-at-large in tight collaboration with the LSST Project, LSST Science Collaborations, and the funding agencies.
LSST: Education and Public Outreach
NASA Astrophysics Data System (ADS)
Bauer, Amanda; Herrold, Ardis; LSST Education and Public Outreach Team
2018-01-01
The Large Synoptic Survey Telescope (LSST) will conduct a 10-year wide, fast, and deep survey of the night sky starting in 2022. LSST Education and Public Outreach (EPO) will enable public access to a subset of LSST data so anyone can explore the universe and be part of the discovery process. LSST EPO aims to facilitate a pathway from entry-level exploration of astronomical imagery to more sophisticated interaction with LSST data using tools similar to what professional astronomers use. To deliver data to the public, LSST EPO is creating an online Portal to serve as the main hub for EPO activities. The Portal will host an interactive Skyviewer, access to LSST data for educators and the public through online Jupyter notebooks, and original multimedia for informal science centers and planetariums, and will feature citizen science projects that use LSST data. LSST EPO will engage with the Chilean community through Spanish-language components of the Portal and will partner with organizations serving underrepresented groups in STEM.
NASA Astrophysics Data System (ADS)
Kohler, Susanna
2016-06-01
How can we hunt down all the near-Earth asteroids that are capable of posing a threat to us? A new study looks at whether the upcoming Large Synoptic Survey Telescope (LSST) is up to the job.

Charting Nearby Threats

LSST is an 8.4-m wide-survey telescope currently being built in Chile. When it goes online in 2022, it will spend the next ten years surveying our sky, mapping tens of billions of stars and galaxies, searching for signatures of dark energy and dark matter, and hunting for transient optical events like novae and supernovae. But in its scanning, LSST will also be looking for asteroids that approach near Earth.

[Figure: Cumulative number of near-Earth asteroids discovered over time, as of June 16, 2016. NASA/JPL/Chamberlin]

Near-Earth objects (NEOs) have the potential to be hazardous if they cross Earth's path and are large enough to do significant damage when they impact Earth. Earth's history is riddled with dangerous asteroid encounters, including the recent Chelyabinsk airburst in 2013, the encounter that caused the kilometer-sized Meteor Crater in Arizona, and the impact thought to contribute to the extinction of the dinosaurs.

Recognizing the potential danger that NEOs can pose to Earth, Congress has tasked NASA with tracking down 90% of NEOs larger than 140 meters in diameter. With our current survey capabilities, we believe we've discovered roughly 25% of these NEOs thus far. Now a new study led by Tommy Grav (Planetary Science Institute) examines whether LSST will be able to complete this task.

[Figure: Absolute magnitude, H, of a synthetic NEO population. Though these NEOs are all larger than 140 m, they have a large spread in albedos. Grav et al. 2016]

Can LSST Help?

Based on previous observations of NEOs and resulting predictions for NEO properties and orbits, Grav and collaborators simulate a synthetic population of NEOs all above 140 m in size.
With these improved population models, they demonstrate that the common tactic of using an asteroid's absolute magnitude as a proxy for its size is a poor approximation, due to asteroids' large spread in albedos. Roughly 23% of NEOs larger than 140 m have absolute magnitudes fainter than H = 22 mag, the authors show, which is the value usually assumed as the default absolute magnitude of a 140 m NEO.

[Figure: Fraction of NEOs we've detected as a function of time, based on the authors' simulations of the current surveys (red), LSST plus the current surveys (black), NEOCam plus the current surveys (blue), and the combined result for all surveys (green). Grav et al. 2016]

Taking this into account, Grav and collaborators then use information about the planned LSST survey strategies and detection limits to test what fraction of this synthetic NEO population LSST will be able to detect in its proposed 10-year mission.

The authors find that, within 10 years, LSST will likely be able to detect only 63% of NEOs larger than 140 m. Luckily, LSST may not have to work alone; in addition to the current surveys in operation, a proposed infrared space-based survey mission called NEOCam is planned for launch in 2021. If NEOCam is funded, it will complement LSST's discovery capabilities, potentially allowing the two surveys to jointly achieve the 90% detection goal within a decade.

Citation: T. Grav et al. 2016 AJ 151 172. doi:10.3847/0004-6256/151/6/172
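The size-vs-H relation behind this discussion is the standard conversion D(km) = 1329 / sqrt(p_V) × 10^(-H/5), where p_V is the geometric albedo. A quick sketch shows why H is a poor size proxy across a population with a wide albedo spread: the assumed albedo of ~0.14 makes H = 22 correspond to roughly 140 m, but a dark NEO at the same H is substantially larger.

```python
import math

def diameter_km(H, albedo):
    """Asteroid diameter in km from absolute magnitude H and geometric
    albedo, via the standard relation D = 1329 / sqrt(p_V) * 10^(-H/5)."""
    return 1329.0 / math.sqrt(albedo) * 10 ** (-H / 5.0)

# diameter_km(22.0, 0.14) is ~0.14 km, the usual "H = 22 means 140 m" rule;
# diameter_km(22.0, 0.05), a dark surface, is closer to 0.24 km.
```

So two NEOs with identical H can differ in diameter by nearly a factor of two, which is why the authors find that a fixed H = 22 cutoff misses a sizable fraction of the >140 m population.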
The Large Synoptic Survey Telescope: Projected Near-Earth Object Discovery Performance
NASA Technical Reports Server (NTRS)
Chesley, Steven R.; Veres, Peter
2016-01-01
The Large Synoptic Survey Telescope (LSST) is a large-aperture, wide-field survey that has the potential to detect millions of asteroids. LSST is under construction, with survey operations slated to begin in 2022. We describe an independent study to assess the performance of LSST for detecting and cataloging near-Earth objects (NEOs). A significant component of the study will be to assess the survey's ability to link observations of a single object from among the large numbers of false detections and detections of other objects. We will also explore the survey's basic performance in terms of the fraction of NEOs discovered and cataloged, both for the planned baseline survey and for enhanced surveys that are more carefully tuned for NEO search, generally at the expense of other science drivers. Preliminary results indicate that, with successful linkage, under the current baseline survey LSST would discover approximately 65% of NEOs with absolute magnitude H < 22, which corresponds approximately to a 140 m diameter.
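The H = 22 threshold corresponds to roughly 140 m only for an assumed albedo, which is why albedo spread matters so much in these NEO completeness studies. A minimal sketch of the standard magnitude-diameter relation, D(km) = 1329 / sqrt(p_V) * 10^(-H/5), makes the point; the function name and example albedos below are illustrative:

```python
import math

def neo_diameter_m(H, albedo):
    """Approximate NEO diameter in meters from absolute magnitude H and
    geometric albedo p_V, via the standard relation
    D(km) = 1329 / sqrt(p_V) * 10**(-H / 5)."""
    return 1329.0 / math.sqrt(albedo) * 10 ** (-H / 5.0) * 1000.0

# H = 22 corresponds to ~140 m only for a moderately bright albedo (~0.14);
# a dark object (p_V ~ 0.05) with the same H is substantially larger.
print(round(neo_diameter_m(22.0, 0.14)))  # 141
print(round(neo_diameter_m(22.0, 0.05)))  # 237
```

A fixed H cut therefore under-counts large dark objects, consistent with the ~23% figure quoted by Grav et al. above.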
LSST: Cadence Design and Simulation
NASA Astrophysics Data System (ADS)
Cook, Kem H.; Pinto, P. A.; Delgado, F.; Miller, M.; Petry, C.; Saha, A.; Gee, P. A.; Tyson, J. A.; Ivezic, Z.; Jones, L.; LSST Collaboration
2009-01-01
The LSST Project has developed an operations simulator to investigate how best to observe the sky to achieve its multiple science goals. The simulator has a sophisticated model of the telescope and dome to properly constrain potential observing cadences. This model has also proven useful for investigating various engineering issues, ranging from the sizing of slew motors to the design of cryogen lines to the camera. The simulator is capable of balancing cadence goals from multiple science programs, and attempts to minimize time spent slewing as it carries out these goals. The operations simulator has been used to demonstrate a 'universal' cadence which delivers the science requirements for a deep cosmology survey, a Near Earth Object survey, and good sampling in the time domain. We will present the results of simulating 10 years of LSST operations using realistic seeing distributions, historical weather data, scheduled engineering downtime, and current telescope and camera parameters. These simulations demonstrate the capability of the LSST to deliver a 25,000 square degree survey probing the time domain, including 20,000 square degrees for a uniform deep, wide, fast survey, while effectively surveying for NEOs over the same area. We will also present our plans for future development of the simulator: better global minimization of slew time and an eventual transition to a scheduler for the real LSST.
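Slew-time minimization of the kind mentioned above is, at its simplest, a traveling-salesman-style ordering problem over field pointings. A toy greedy nearest-neighbor sketch conveys the idea; this is not the actual operations simulator algorithm, and the field coordinates and flat-sky separation formula are illustrative assumptions:

```python
import math

def ang_sep(a, b):
    """Rough flat-sky angular separation in degrees between (ra, dec) pairs."""
    dra = (a[0] - b[0]) * math.cos(math.radians(0.5 * (a[1] + b[1])))
    return math.hypot(dra, a[1] - b[1])

def greedy_order(fields, start):
    """Visit fields in nearest-neighbor order, starting from `start`."""
    order, current, remaining = [], start, list(fields)
    while remaining:
        nxt = min(remaining, key=lambda f: ang_sep(current, f))
        remaining.remove(nxt)
        order.append(nxt)
        current = nxt
    return order

fields = [(50.0, -10.0), (10.5, -29.5), (11.0, -30.5), (10.0, -30.0)]
order = greedy_order(fields, (0.0, -30.0))
print(order[0])  # the field nearest the starting position
```

Greedy ordering is only locally optimal, which is why the abstract's planned "better global minimization of slew time" is a real improvement over this kind of heuristic.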
The European perspective for LSST
NASA Astrophysics Data System (ADS)
Gangler, Emmanuel
2017-06-01
LSST is a next generation telescope that will produce an unprecedented data flow. The project goal is to deliver data products such as images and catalogs, thus enabling scientific analysis for a wide community of users. As a large-scale survey, LSST data will be complementary with other facilities in a wide range of scientific domains, including data from ESA or ESO. European countries have invested in LSST since 2007, in the construction of the camera as well as in the computing effort. The latter will be instrumental in designing the next step: how to distribute LSST data to Europe. The astroinformatics challenges for LSST thus include not only the analysis of LSST big data, but also the practical efficiency of data access.
The Large Synoptic Survey Telescope (LSST) Camera
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Ranked as the top ground-based national priority for the field for the current decade, LSST is currently under construction in Chile. The U.S. Department of Energy’s SLAC National Accelerator Laboratory is leading the construction of the LSST camera – the largest digital camera ever built for astronomy. SLAC Professor Steven M. Kahn is the overall Director of the LSST project, and SLAC personnel are also participating in the data management. The National Science Foundation is the lead agency for construction of the LSST. Additional financial support comes from the Department of Energy and private funding raised by the LSST Corporation.
NASA Astrophysics Data System (ADS)
Kantor, J.
During LSST observing, transient events will be detected and alerts generated at the LSST Archive Center at NCSA in Champaign, Illinois. As a very high rate of alerts is expected, approaching ~10 million per night, we plan for VOEvent-compliant Distributor/Brokers (http://voevent.org) to be the primary end-points of the full LSST alert streams. End users will then use these Distributor/Brokers to classify and filter events on the stream for those fitting their science goals. These Distributor/Brokers are envisioned to be operated as a community service by third parties who will have signed MOUs with LSST. The exact identification of Distributor/Brokers to receive alerts will be determined as LSST approaches full operations and may change over time, but it is in our interest to identify and coordinate with them as early as possible. LSST will also operate a limited Distributor/Broker with a filtering capability at the Archive Center, to allow alerts to be sent directly to a limited number of entities that for some reason need to have a more direct connection to LSST. This might include, for example, observatories with significant follow-up capabilities whose observing may temporarily be more directly tied to LSST observing. It will let astronomers create simple filters that limit which alerts are ultimately forwarded to them; users will be able to specify these filters using an SQL-like declarative language or short snippets of (likely Python) code. We emphasize that this LSST-provided capability will be limited, and is not intended to satisfy the wide variety of use cases that a full-fledged public Event Distributor/Broker could. End users will not be able to subscribe to full, unfiltered alert streams coming directly from LSST. In this session, we will discuss anticipated LSST data rates, and capabilities for alert processing and distribution/brokering.
We will clarify what the LSST Observatory will provide versus what we anticipate will be a community effort.
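As a concrete illustration of the kind of user-defined Python filter snippet described above, here is a minimal sketch. The alert structure and field names ("diaSource", "magpsf", "magdiff") are assumptions for illustration, not the actual LSST alert schema:

```python
# Sketch of a user-defined alert filter; field names are hypothetical.

def bright_transient_filter(alert):
    """Pass alerts brighter than mag 20 that brightened by more than 1 mag."""
    src = alert["diaSource"]
    return src["magpsf"] < 20.0 and src["magdiff"] > 1.0

alerts = [
    {"diaSource": {"magpsf": 19.2, "magdiff": 1.5}},  # bright, fast riser
    {"diaSource": {"magpsf": 21.0, "magdiff": 2.0}},  # too faint: filtered out
]
selected = [a for a in alerts if bright_transient_filter(a)]
print(len(selected))  # 1
```

A broker would run such predicates server-side on the alert stream, forwarding only the matching alerts to the subscribed user.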
NASA Astrophysics Data System (ADS)
Coughlin, Michael; Stubbs, Christopher; Claver, Chuck
2016-06-01
We report measurements from which we determine the spatial structure of the lunar contribution to night sky brightness, taken at the LSST site on Cerro Pachon in Chile. We use an array of six photodiodes with filters that approximate the Large Synoptic Survey Telescope's u, g, r, i, z, and y bands. We use the sun as a proxy for the moon, and measure sky brightness as a function of zenith angle of the point on sky, zenith angle of the sun, and angular distance between the sun and the point on sky. We make a correction for the difference between the illumination spectrum of the sun and the moon. Since scattered sunlight totally dominates the daytime sky brightness, this technique allows us to cleanly determine the contribution to the (cloudless) night sky from backscattered moonlight, without contamination from other sources of night sky brightness. We estimate our uncertainty in the relative lunar night sky brightness vs. zenith and lunar angle to be between 0.3 and 0.7 mag, depending on the passband. This information is useful in planning the optimal execution of the LSST survey, and perhaps for other astronomical observations as well. Although our primary objective is to map out the angular structure and spectrum of the scattered light from the atmosphere and particulates, we also make an estimate of the expected number of scattered lunar photons per pixel per second in LSST, and find values that are in overall agreement with previous estimates.
Science Education with the LSST
NASA Astrophysics Data System (ADS)
Jacoby, S. H.; Khandro, L. M.; Larson, A. M.; McCarthy, D. W.; Pompea, S. M.; Shara, M. M.
2004-12-01
LSST will create the first true celestial cinematography - a revolution in public access to the changing universe. The challenge will be to take advantage of the unique capabilities of the LSST while presenting the data in ways that are manageable, engaging, and supportive of national science education goals. To prepare for this opportunity for exploration, tools and displays will be developed using current deep-sky multi-color imaging data. Education professionals from LSST partners invite input from interested members of the community. Initial LSST science education priorities include: - Fostering authentic student-teacher research projects at all levels, - Exploring methods of visualizing the large and changing datasets in science centers, - Defining Web-based interfaces and tools for access and interaction with the data, - Delivering online instructional materials, and - Developing meaningful interactions between LSST scientists and the public.
LSST active optics system software architecture
NASA Astrophysics Data System (ADS)
Thomas, Sandrine J.; Chandrasekharan, Srinivasan; Lotz, Paul; Xin, Bo; Claver, Charles; Angeli, George; Sebag, Jacques; Dubois-Felsmann, Gregory P.
2016-08-01
The Large Synoptic Survey Telescope (LSST) is an 8-meter-class wide-field telescope now under construction on Cerro Pachon, near La Serena, Chile. This ground-based telescope is designed to conduct a decade-long time-domain survey of the optical sky. In order to achieve the LSST scientific goals, the telescope must deliver seeing-limited image quality over its 3.5-degree field of view. Like many telescopes, LSST will use an Active Optics System (AOS) to correct in near real-time the system aberrations primarily introduced by gravity and temperature gradients. The LSST AOS uses a combination of 4 curvature wavefront sensors (CWS) located on the outside of the LSST field of view. The information coming from the 4 CWS is combined to calculate the appropriate corrections to be sent to the 3 mirrors composing LSST. The AOS software incorporates a wavefront sensor estimation pipeline (WEP) and an active optics control system (AOCS). The WEP estimates the wavefront residual error from the CWS images. The AOCS determines the correction to be sent to the different degrees of freedom every 30 seconds. In this paper, we describe the design and implementation of the AOS. More particularly, we will focus on the software architecture as well as the AOS interactions with the various subsystems within LSST.
The LSST Metrics Analysis Framework (MAF)
NASA Astrophysics Data System (ADS)
Jones, R. Lynne; Yoachim, Peter; Chandrasekharan, Srinivasan; Connolly, Andrew J.; Cook, Kem H.; Ivezic, Zeljko; Krughoff, K. Simon; Petry, Catherine E.; Ridgway, Stephen T.
2015-01-01
Studying potential observing strategies or cadences for the Large Synoptic Survey Telescope (LSST) is a complicated but important problem. To address this, LSST has created an Operations Simulator (OpSim) to create simulated surveys, including realistic weather and sky conditions. Analyzing the results of these simulated surveys for the wide variety of science cases to be considered for LSST is, however, difficult. We have created a Metric Analysis Framework (MAF), an open-source Python framework, to be a user-friendly, customizable and easily extensible tool to help analyze the outputs of the OpSim.

MAF reads the pointing history of the LSST generated by the OpSim, then enables the subdivision of these pointings based on position on the sky (RA/Dec, etc.) or the characteristics of the observations (e.g. airmass or sky brightness) and a calculation of how well these observations meet a specified science objective (or metric). An example of a simple metric could be the mean single-visit limiting magnitude for each position in the sky; a more complex metric might be the expected astrometric precision. The output of these metrics can be generated for a full survey, for specified time intervals, or for regions of the sky, and can be easily visualized using a web interface.

An important goal for MAF is to facilitate analysis of the OpSim outputs for a wide variety of science cases. A user can often write a new metric to evaluate OpSim for new science goals in less than a day once they are familiar with the framework. Some of these new metrics are illustrated in the accompanying poster, "Analyzing Simulated LSST Survey Performance With MAF".

While MAF has been developed primarily for application to OpSim outputs, it can be applied to any dataset. The most obvious examples are examining pointing histories of other survey projects or telescopes, such as CFHT.
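The slicer/metric pattern described above can be sketched in a few lines of NumPy: slice simulated visits by sky position, then evaluate a metric (here, mean single-visit limiting magnitude) per slice. This mimics the MAF idea only; it does not use the real MAF API, and all names and numbers are illustrative:

```python
import numpy as np

# Toy "operations simulator output": random visit positions and depths.
rng = np.random.default_rng(42)
n_visits = 10_000
ra = rng.uniform(0.0, 360.0, n_visits)   # visit RA in degrees
m5 = rng.normal(24.5, 0.3, n_visits)     # single-visit limiting magnitude

# Slicer: 36 coarse RA bins. Metric: mean m5 in each bin.
edges = np.linspace(0.0, 360.0, 37)
idx = np.digitize(ra, edges) - 1
mean_m5 = np.array([m5[idx == i].mean() for i in range(36)])

print(mean_m5.shape)  # (36,)
```

Real MAF slicers work on HEALPix or opsim field IDs rather than crude RA bins, but the separation of "how to subdivide the pointings" from "what to compute per subdivision" is the same.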
Big Software for Big Data: Scaling Up Photometry for LSST (Abstract)
NASA Astrophysics Data System (ADS)
Rawls, M.
2017-06-01
(Abstract only) The Large Synoptic Survey Telescope (LSST) will capture mosaics of the sky every few nights, each containing more data than your computer's hard drive can store. As a result, the software to process these images is as critical to the science as the telescope and the camera. I discuss the algorithms and software being developed by the LSST Data Management team to handle such a large volume of data. All of our work is open source and available to the community. Once LSST comes online, our software will produce catalogs of objects and a stream of alerts. These will bring exciting new opportunities for follow-up observations and collaborations with LSST scientists.
Investigating the Bright End of LSST Photometry
NASA Astrophysics Data System (ADS)
Ojala, Elle; Pepper, Joshua; LSST Collaboration
2018-01-01
The Large Synoptic Survey Telescope (LSST) will begin operations in 2022, conducting a wide-field, synoptic multiband survey of the southern sky. Some fraction of objects at the bright end of the magnitude regime observed by LSST will overlap with other wide-sky surveys, allowing for calibration and cross-checking between surveys. The LSST is optimized for observations of very faint objects, so much of this data overlap will be comprised of saturated images. This project provides the first in-depth analysis of saturation in LSST images. Using the PhoSim package to create simulated LSST images, we evaluate saturation properties of several types of stars to determine the brightness limitations of LSST. We also collect metadata from many wide-field photometric surveys to provide cross-survey accounting and comparison. Additionally, we evaluate the accuracy of the PhoSim modeling parameters to determine the reliability of the software. These efforts will allow us to determine the expected usable data overlap between bright-end LSST images and faint-end images in other wide-sky surveys. Our next steps are developing methods to extract photometry from saturated images. This material is based upon work supported in part by the National Science Foundation through Cooperative Agreement 1258333 managed by the Association of Universities for Research in Astronomy (AURA), and the Department of Energy under Contract No. DE-AC02-76SF00515 with the SLAC National Accelerator Laboratory. Additional LSST funding comes from private donations, grants to universities, and in-kind support from LSSTC Institutional Members. Thanks to NSF grant PHY-135195 and the 2017 LSSTC Grant Award #2017-UG06 for making this project possible.
Investigating interoperability of the LSST data management software stack with Astropy
NASA Astrophysics Data System (ADS)
Jenness, Tim; Bosch, James; Owen, Russell; Parejko, John; Sick, Jonathan; Swinbank, John; de Val-Borro, Miguel; Dubois-Felsmann, Gregory; Lim, K.-T.; Lupton, Robert H.; Schellart, Pim; Krughoff, K. S.; Tollerud, Erik J.
2016-07-01
The Large Synoptic Survey Telescope (LSST) will be an 8.4m optical survey telescope sited in Chile and capable of imaging the entire sky twice a week. The data rate of approximately 15TB per night and the requirements to both issue alerts on transient sources within 60 seconds of observing and create annual data releases mean that automated data management systems and data processing pipelines are a key deliverable of the LSST construction project. The LSST data management software has been in development since 2004 and is based on a C++ core with a Python control layer. The software consists of nearly a quarter of a million lines of code covering the system from fundamental WCS and table libraries to pipeline environments and distributed process execution. The Astropy project began in 2011 as an attempt to bring together disparate open source Python projects and build a core standard infrastructure that can be used and built upon by the astronomy community. This project has been phenomenally successful since it began and has grown to be the de facto standard for Python software in astronomy. Astropy brings with it considerable expectations from the community on how astronomy Python software should be developed, and it is clear that by the time LSST is fully operational in the 2020s, many of the prospective users of the LSST software stack will expect it to be fully interoperable with Astropy. In this paper we describe the overlap between the LSST science pipeline software and Astropy software and investigate areas where the LSST software provides new functionality. We also discuss the possibilities of re-engineering the LSST science pipeline software to build upon Astropy, including the option of contributing affiliated packages.
Synthesizing Planetary Nebulae for Large Scale Surveys: Predictions for LSST
NASA Astrophysics Data System (ADS)
Vejar, George; Montez, Rodolfo; Morris, Margaret; Stassun, Keivan G.
2017-01-01
The short-lived planetary nebula (PN) phase of stellar evolution is characterized by a hot central star and a bright, ionized nebula. The PN phase forms after a low- to intermediate-mass star stops burning hydrogen in its core, ascends the asymptotic giant branch, and expels its outer layers of material into space. The exposed hot core produces ionizing UV photons and a fast stellar wind that sweeps up the surrounding material into a dense shell of ionized gas known as the PN. This fleeting stage of stellar evolution provides insight into rare atomic processes and the nucleosynthesis of elements in stars. The inherent brightness of PNe allows them to be used to obtain distances to nearby stellar systems via the PN luminosity function and as kinematic tracers in other galaxies. However, the prevalence of non-spherical morphologies of PNe challenges the current paradigm of PN formation. The role of binarity in the shaping of the PN has recently gained traction, ultimately suggesting that single stars might not form PNe. Searches for binary central stars have increased the binary fraction, but the current PN sample is incomplete. Future wide-field, multi-epoch surveys like the Large Synoptic Survey Telescope (LSST) can impact studies of PNe and improve our understanding of their origin and formation. Using a suite of Cloudy radiative transfer calculations, we study the detectability of PNe in the proposed LSST multiband observations. We compare our synthetic PNe to common sources (stars, galaxies, quasars) and establish discrimination techniques. Finally, we discuss follow-up strategies to verify new LSST-discovered PNe and use limiting distances to estimate the potential sample of PNe enabled by LSST.
NASA Astrophysics Data System (ADS)
Goldstein, Daniel A.; Nugent, Peter E.; Kasen, Daniel N.; Collett, Thomas E.
2018-03-01
Time delays between the multiple images of strongly gravitationally lensed Type Ia supernovae (glSNe Ia) have the potential to deliver precise cosmological constraints, but the effects of microlensing on time delay extraction have not been studied in detail. Here we quantify the effect of microlensing on the glSN Ia yield of the Large Synoptic Survey Telescope (LSST) and the effect of microlensing on the precision and accuracy of time delays that can be extracted from LSST glSNe Ia. Microlensing has a negligible effect on the LSST glSN Ia yield, but it can be increased by a factor of ∼2 over previous predictions to 930 systems using a novel photometric identification technique based on spectral template fitting. Crucially, the microlensing of glSNe Ia is achromatic until three rest-frame weeks after the explosion, making the early-time color curves microlensing-insensitive time delay indicators. By fitting simulated flux and color observations of microlensed glSNe Ia with their underlying, unlensed spectral templates, we forecast the distribution of absolute time delay error due to microlensing for LSST, which is unbiased at the sub-percent level and peaked at 1% for color curve observations in the achromatic phase, while for light-curve observations it is comparable to state-of-the-art mass modeling uncertainties (4%). About 70% of LSST glSN Ia images should be discovered during the achromatic phase, indicating that microlensing time delay uncertainties can be minimized if prompt multicolor follow-up observations are obtained. Accounting for microlensing, the 1–2 day time delay on the recently discovered glSN Ia iPTF16geu can be measured to 40% precision, limiting its cosmological utility.
NASA Astrophysics Data System (ADS)
Claver, Chuck F.; Debois-Felsmann, G. P.; Delgado, F.; Hascall, P.; Marshall, S.; Nordby, M.; Schumacher, G.; Sebag, J.; LSST Collaboration
2011-01-01
The Large Synoptic Survey Telescope (LSST) is a complete observing system that acquires and archives images, processes and analyzes them, and publishes reduced images and catalogs of sources and objects. The LSST will operate over a ten year period, producing a survey of 20,000 square degrees over the entire [Southern] sky in 6 filters (ugrizy), with each field having been visited several hundred times, enabling a wide spectrum of science from fast transients to exploration of dark matter and dark energy. The LSST itself is a complex system of systems consisting of the 8.4m 3-mirror telescope, a 3.2 billion pixel camera, and a peta-scale data management system. The LSST project uses a Model Based Systems Engineering (MBSE) methodology to ensure an integrated approach to system design and rigorous definition of system interfaces and specifications. The MBSE methodology is applied through modeling of the LSST's systems with the System Modeling Language (SysML). The SysML modeling recursively establishes the threefold relationship between requirements, logical and physical functional decomposition and definition, and system and component behavior at successively deeper levels of abstraction and detail. The LSST modeling includes analyzing and documenting the flow of command and control information and data between the suite of systems in the LSST observatory that are needed to carry out the activities of the survey. The MBSE approach is applied throughout all stages of the project, from design, to validation and verification, through to commissioning.
Integration and verification testing of the Large Synoptic Survey Telescope camera
NASA Astrophysics Data System (ADS)
Lange, Travis; Bond, Tim; Chiang, James; Gilmore, Kirk; Digel, Seth; Dubois, Richard; Glanzman, Tom; Johnson, Tony; Lopez, Margaux; Newbry, Scott P.; Nordby, Martin E.; Rasmussen, Andrew P.; Reil, Kevin A.; Roodman, Aaron J.
2016-08-01
We present an overview of the Integration and Verification Testing activities of the Large Synoptic Survey Telescope (LSST) Camera at the SLAC National Accelerator Lab (SLAC). The LSST Camera, the sole instrument for LSST and under construction now, comprises a 3.2-gigapixel imager and a three-element corrector with a 3.5 degree diameter field of view. LSST Camera Integration and Test will be taking place over the next four years, with final delivery to the LSST observatory anticipated in early 2020. We outline the planning for Integration and Test, describe some of the key verification hardware systems being developed, and identify some of the more complicated assembly/integration activities. Specific details of integration and verification hardware systems will be discussed, highlighting some of the technical challenges anticipated.
Searching for modified growth patterns with tomographic surveys
NASA Astrophysics Data System (ADS)
Zhao, Gong-Bo; Pogosian, Levon; Silvestri, Alessandra; Zylberberg, Joel
2009-04-01
In alternative theories of gravity, designed to produce cosmic acceleration at the current epoch, the growth of large scale structure can be modified. We study the potential of upcoming and future tomographic surveys such as Dark Energy Survey (DES) and Large Synoptic Survey Telescope (LSST), with the aid of cosmic microwave background (CMB) and supernovae data, to detect departures from the growth of cosmic structure expected within general relativity. We employ parametric forms to quantify the potential time- and scale-dependent variation of the effective gravitational constant and the differences between the two Newtonian potentials. We then apply the Fisher matrix technique to forecast the errors on the modified growth parameters from galaxy clustering, weak lensing, CMB, and their cross correlations across multiple photometric redshift bins. We find that even with conservative assumptions about the data, DES will produce nontrivial constraints on modified growth and that LSST will do significantly better.
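The Fisher matrix technique mentioned above has a compact general form: for observables with parameter derivatives A and Gaussian data covariance C, the Fisher matrix is F = A^T C^-1 A, and the marginalized 1-sigma forecast on parameter i is sqrt((F^-1)_ii). A minimal sketch with an illustrative two-parameter linear model (not the actual DES/LSST observables or modified-growth parametrization) shows the mechanics:

```python
import numpy as np

# Toy Fisher forecast: model d = A @ theta with two parameters and
# independent Gaussian errors of 0.2 on each of 50 data points.
n_data = 50
x = np.linspace(0.1, 1.0, n_data)
A = np.column_stack([np.ones(n_data), x])  # d(model)/d(theta) for 2 params
C = np.diag(np.full(n_data, 0.2**2))       # data covariance matrix

# Fisher matrix and marginalized 1-sigma parameter forecasts.
F = A.T @ np.linalg.inv(C) @ A
sigma = np.sqrt(np.diag(np.linalg.inv(F)))
print(sigma.shape)  # (2,)
```

In the real forecast, A holds derivatives of the galaxy clustering, weak lensing, and CMB observables with respect to the modified-growth parameters across redshift bins, and cross-correlations enter through off-diagonal blocks of C.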
Solar System science with the Large Synoptic Survey Telescope
NASA Astrophysics Data System (ADS)
Jones, Lynne; Brown, Mike; Ivezić, Zeljko; Jurić, Mario; Malhotra, Renu; Trilling, David
2015-11-01
The Large Synoptic Survey Telescope (LSST; http://lsst.org) will be a large-aperture, wide-field, ground-based telescope that will survey half the sky every few nights in six optical bands from 320 to 1050 nm. It will explore a wide range of astrophysical questions, ranging from performing a census of the Solar System, to examining the nature of dark energy. It is currently in construction, slated for first light in 2019 and full operations by 2022. The LSST will survey over 20,000 square degrees with a rapid observational cadence, to typical limiting magnitudes of r~24.5 in each visit (9.6 square degree field of view). Automated software will link the individual detections into orbits; these orbits, as well as precisely calibrated astrometry (~50 mas) and photometry (~0.01-0.02 mag) in multiple bandpasses, will be available as LSST data products. The resulting data set will have tremendous potential for planetary astronomy: multi-color catalogs of hundreds of thousands of NEOs and Jupiter Trojans, millions of asteroids, tens of thousands of TNOs, as well as thousands of other objects such as comets and irregular satellites of the major planets. LSST catalogs will increase the sample size of objects with well-known orbits 10-100 times for small body populations throughout the Solar System, enabling a major increase in the completeness level of the inventory of most dynamical classes of small bodies and generating new insights into planetary formation and evolution. Precision multi-color photometry will allow determination of lightcurves and colors, as well as spin state and shape modeling through sparse lightcurve inversion. LSST is currently investigating survey strategies to optimize science return across a broad range of goals. To aid in this investigation, we are making a series of realistic simulated survey pointing histories available together with a Python software package to model and evaluate survey detections for a user-defined input population.
Preliminary metrics from these simulations are shown here; the community is invited to provide further input.
Satellite Power Systems (SPS). LSST systems and integration task for SPS flight test article
NASA Technical Reports Server (NTRS)
Greenberg, H. S.
1981-01-01
This research activity emphasizes the systems definition and resulting structural requirements for the primary structure of two potential SPS large space structure test articles. These test articles represent potential steps in the SPS research and technology development.
A Prototype External Event Broker for LSST
NASA Astrophysics Data System (ADS)
Elan Alvarez, Gabriella; Stassun, Keivan; Burger, Dan; Siverd, Robert; Cox, Donald
2015-01-01
LSST plans to have an alerts system that will automatically identify various types of "events" appearing in the LSST data stream. These events will include supernovae, moving objects, and many other types, and it is expected that there may be millions, perhaps tens of millions, of events each night. To help the LSST community parse and take full advantage of the LSST alerts stream, we are working to design an external "event alert broker" that will generate real-time notification of LSST events to users and/or robotic telescope facilities based on user-specified criteria. For example, users will be able to specify that they wish to be notified immediately via text message of urgent events, such as GRB counterparts, or notified only occasionally in digest form of less time-sensitive events, such as eclipsing binaries. This poster will summarize results from a survey of scientists on the most important features that such an alerts notification service needs to provide, and will present a preliminary design for our external event broker.
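The filtering core of a broker like this reduces to matching each incoming alert against stored user criteria and routing it either to immediate notification or into a digest. The sketch below is our own illustration of that flow, not the poster's design; all class and field names are hypothetical.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Alert:
    """A simplified LSST-style event alert; the fields are illustrative only."""
    event_type: str   # e.g. "GRB", "SN", "eclipsing_binary"
    magnitude: float
    urgent: bool = False

@dataclass
class Subscription:
    """One user's notification request."""
    name: str
    predicate: Callable[[Alert], bool]  # user-specified matching criterion
    digest: bool = False                # True => batch into an occasional digest

class EventBroker:
    """Minimal sketch of the filtering core of an external alert broker."""

    def __init__(self) -> None:
        self.subs: List[Subscription] = []
        self.digests: Dict[str, List[Alert]] = {}

    def subscribe(self, sub: Subscription) -> None:
        self.subs.append(sub)

    def publish(self, alert: Alert) -> List[str]:
        """Route one alert; return the subscribers notified immediately."""
        notified = []
        for sub in self.subs:
            if not sub.predicate(alert):
                continue
            if sub.digest:
                # accumulate for a later digest message instead of notifying now
                self.digests.setdefault(sub.name, []).append(alert)
            else:
                notified.append(sub.name)
        return notified
```

A real broker would add delivery channels (text message, VOEvent, etc.) and persistence, but the per-alert predicate test above is the essential operation.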
Astrometry with LSST: Objectives and Challenges
NASA Astrophysics Data System (ADS)
Casetti-Dinescu, D. I.; Girard, T. M.; Méndez, R. A.; Petronchak, R. M.
2018-01-01
The forthcoming Large Synoptic Survey Telescope (LSST) is an optical telescope with an effective aperture of 6.4 m and a field of view of 9.6 square degrees. Thus, LSST will have an étendue larger than any other optical telescope, performing wide-field, deep imaging of the sky. There are four broad categories of science objectives: 1) dark energy and dark matter, 2) transients, 3) the Milky Way and its neighbours, and 4) the Solar System. In particular, for the Milky Way science case, astrometry will make a critical contribution; therefore, special attention must be devoted to extracting the maximum amount of astrometric information from the LSST data. Here, we outline the astrometric challenges posed by such a massive survey. We also present some current examples of ground-based, wide-field, deep imagers used for astrometry, as precursors of the LSST.
The Stellar Populations of the Milky Way and Nearby Galaxies with LSST
NASA Astrophysics Data System (ADS)
Olsen, Knut A.; Covey, K.; Saha, A.; Beers, T. C.; Bochanski, J.; Boeshaar, P.; Cargile, P.; Catelan, M.; Burgasser, A.; Cook, K.; Dhital, S.; Figer, D.; Ivezic, Z.; Kalirai, J.; McGehee, P.; Minniti, D.; Pepper, J.; Prsa, A.; Sarajedini, A.; Silva, D.; Smith, J. A.; Stassun, K.; Thorman, P.; Williams, B.; LSST Stellar Populations Collaboration
2011-01-01
The LSST will produce a multi-color map and photometric object catalog of half the sky to r=27.6 (AB mag; 5-sigma) when observations at the individual epochs of the standard cadence are stacked. Analyzing the ten years of independent measurements in each field will allow variability, proper motion and parallax measurements to be derived for objects brighter than r=24.5. These photometric, astrometric, and variability data will enable the construction of a detailed and robust map of the stellar populations of the Milky Way, its satellites and its nearest extra-galactic neighbors--allowing exploration of their star formation, chemical enrichment, and accretion histories on a grand scale. For example, with geometric parallax accuracy of 1 milli-arc-sec, comparable to HIPPARCOS but reaching more than 10 magnitudes fainter, LSST will allow a complete census of all stars above the hydrogen-burning limit that are closer than 500 pc, including thousands of predicted L and T dwarfs. The LSST time sampling will identify and characterize variable stars of all types, on time scales from 1 hr to several years, a feast for variable star astrophysics; LSST's projected impact on the study of several variable star classes, including eclipsing binaries, is discussed here. We also describe the ongoing efforts of the collaboration to optimize the LSST system for stellar populations science. We are currently investigating the trade-offs associated with the exact wavelength boundaries of the LSST filters, identifying the most scientifically valuable locations for fields that will receive enhanced temporal coverage compared to the standard cadence, and analyzing synthetic LSST outputs to verify that the system's performance will be sufficient to achieve our highest priority science goals.
Designing a Multi-Petabyte Database for LSST
DOE Office of Scientific and Technical Information (OSTI.GOV)
Becla, Jacek; Hanushevsky, Andrew; Nikolaev, Sergei
2007-01-10
The 3.2 giga-pixel LSST camera will produce approximately half a petabyte of archive images every month. These data need to be reduced in under a minute to produce real-time transient alerts, and then added to the cumulative catalog for further analysis. The catalog is expected to grow by about three hundred terabytes per year. The data volume, the real-time transient alerting requirements of the LSST, and its spatio-temporal aspects require innovative techniques to build an efficient data access system at reasonable cost. As currently envisioned, the system will rely on a database for catalogs and metadata. Several database systems are being evaluated to understand how they perform at these data rates, data volumes, and access patterns. This paper describes the LSST requirements, the challenges they impose, the data access philosophy, results to date from evaluating available database technologies against LSST requirements, and the proposed database architecture to meet the data challenges.
NASA Astrophysics Data System (ADS)
Claver, Chuck F.; Dubois-Felsmann, G. P.; Delgado, F.; Hascall, P.; Horn, D.; Marshall, S.; Nordby, M.; Schalk, T. L.; Schumacher, G.; Sebag, J.; LSST Project Team
2010-01-01
The LSST is a complete observing system that acquires and archives images, processes and analyzes them, and publishes reduced images and catalogs of sources and objects. The LSST will operate over a ten year period producing a survey of 20,000 square degrees over the entire southern sky in 6 filters (ugrizy) with each field having been visited several hundred times, enabling a wide spectrum of science from fast transients to exploration of dark matter and dark energy. The LSST itself is a complex system of systems consisting of the 8.4m three mirror telescope, a 3.2 billion pixel camera, and a peta-scale data management system. The LSST project uses a Model Based Systems Engineering (MBSE) methodology to ensure an integrated approach to system design and rigorous definition of system interfaces and specifications. The MBSE methodology is applied through modeling of the LSST's systems with the System Modeling Language (SysML). The SysML modeling recursively establishes the threefold relationship between requirements, logical & physical functional decomposition and definition, and system and component behavior at successively deeper levels of abstraction and detail. The MBSE approach is applied throughout all stages of the project from design, to validation and verification, through to commissioning.
The Effects of Commercial Airline Traffic on LSST Observing Efficiency
NASA Astrophysics Data System (ADS)
Gibson, Rose; Claver, Charles; Stubbs, Christopher
2016-01-01
The Large Synoptic Survey Telescope (LSST) is a ten-year survey that will map the southern sky in six different filters 800 times before the end of its run. In this paper, we explore the primary effect of airline traffic on scheduling the LSST observations in addition to the secondary effect of condensation trails, or contrails, created by the presence of the aircraft. The large national investment being made in LSST implies that small improvements in observing efficiency through aircraft and contrail avoidance can result in a significant improvement in the quality of the survey and its science. We have used the Automatic Dependent Surveillance-Broadcast (ADS-B) signals received from commercial aircraft to monitor and record activity over the LSST site. We installed an ADS-B ground station on Cerro Pachón, Chile, consisting of a 1090 MHz antenna on the Andes Lidar Observatory feeding an RTL2832U software-defined radio. We used dump1090 to convert the received ADS-B telemetry into Basestation format, and found that during the busiest time of the night only 4 signals per minute were received on average, which will have very little direct effect, if any, on the LSST observing scheduler. As part of future studies we will examine the effects of contrails on LSST observations. Gibson was supported by the NOAO/KPNO Research Experiences for Undergraduates (REU) Program which is funded by the National Science Foundation Research Experience for Undergraduates Program (AST-1262829).
Architectural Implications for Spatial Object Association Algorithms
Kumar, Vijay S.; Kurc, Tahsin; Saltz, Joel; Abdulla, Ghaleb; Kohn, Scott R.; Matarazzo, Celeste
2013-01-01
Spatial object association, also referred to as crossmatch of spatial datasets, is the problem of identifying and comparing objects in two or more datasets based on their positions in a common spatial coordinate system. In this work, we evaluate two crossmatch algorithms that are used for astronomical sky surveys, on the following database system architecture configurations: (1) Netezza Performance Server®, a parallel database system with active disk style processing capabilities, (2) MySQL Cluster, a high-throughput network database system, and (3) a hybrid configuration consisting of a collection of independent database system instances with data replication support. Our evaluation provides insights about how architectural characteristics of these systems affect the performance of the spatial crossmatch algorithms. We conducted our study using real use-case scenarios borrowed from a large-scale astronomy application known as the Large Synoptic Survey Telescope (LSST). PMID:25692244
NASA Astrophysics Data System (ADS)
Cook, K. H.; Delgado, F.; Miller, M.; Saha, A.; Allsman, R.; Pinto, P.; Gee, P. A.
2005-12-01
We have developed an operations simulator for LSST and used it to explore design and operations parameter space for this large etendue telescope and its ten year survey mission. The design is modular, with separate science programs coded in separate modules. There is a sophisticated telescope module with all motions parametrized for ease of testing different telescope capabilities, e.g. effect of acceleration capabilities of various motors on science output. Sky brightness is calculated as a function of moon phase and separation. A sophisticated exposure time calculator has been developed for LSST which is being incorporated into the simulator to allow specification of S/N requirements. All important parameters for the telescope, the site and the science programs are easily accessible in configuration files. Seeing and cloud data from the three candidate LSST sites are used for our simulations. The simulator has two broad categories of science proposals: sky coverage and transient events. Sky coverage proposals base their observing priorities on a required number of observations for each field in a particular filter with specified conditions (maximum seeing, sky brightness, etc) and one is used for a weak lensing investigation. Transient proposals are highly configurable. A transient proposal can require sequential, multiple exposures in various filters with a specified sequence of filters, and require a particular cadence for multiple revisits to complete an observation sequence. Each science proposal ranks potential observations based upon the internal logic of that proposal. We present the results of a variety of mixed science program observing simulations, showing how varied programs can be carried out simultaneously, with many observations serving multiple science goals. The simulator has shown that LSST can carry out its multiple missions under a variety of conditions. KHC's work was performed under the auspices of the US DOE, NNSA by the Univ. of California, LLNL under contract No. W-7405-Eng-48.
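The proposal-ranking loop described in this abstract can be caricatured in a few lines: each science proposal scores candidate fields by its own internal logic, and the scheduler observes the field with the highest combined score, so one observation can serve several programs at once. This is purely our illustration of the idea; none of the names below come from the simulator.

```python
def simulate_night(proposals, fields, n_visits):
    """Toy scheduler loop (illustrative sketch only).

    proposals: callables (field, history) -> score, each encoding one
    science program's ranking logic. At every step, observe the field
    whose summed score across all proposals is highest.
    """
    history = []
    for _ in range(n_visits):
        best = max(fields, key=lambda f: sum(p(f, history) for p in proposals))
        history.append(best)
    return history

# Two hypothetical proposals: one wants sky coverage, one wants revisits of "A".
def coverage(field, history):
    return 0.0 if field in history else 1.0

def transient(field, history):
    return 0.5 if field == "A" else 0.0
```

Running `simulate_night([coverage, transient], ["A", "B", "C"], 4)` first sweeps all three fields (with "A" prioritized by the combined score) and then revisits "A" once coverage is satisfied, mimicking how mixed programs share observations.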
The Large Synoptic Survey Telescope
NASA Astrophysics Data System (ADS)
Axelrod, T. S.
2006-07-01
The Large Synoptic Survey Telescope (LSST) is an 8.4 meter telescope with a 10 square degree field and a 3 Gigapixel imager, planned to be on-sky in 2012. It is a dedicated all-sky survey instrument, with several complementary science missions. These include understanding dark energy through weak lensing and supernovae; exploring transients and variable objects; creating and maintaining a solar system map, with particular emphasis on potentially hazardous objects; and increasing the precision with which we understand the structure of the Milky Way. The instrument operates continuously at a rapid cadence, repetitively scanning the visible sky every few nights. The data flow rates from LSST are larger than those from current surveys by roughly a factor of 1000: a few GB/night are typical today; LSST will deliver a few TB/night. From a computing hardware perspective, this factor of 1000 can be dealt with easily in 2012. The major issues in designing the LSST data management system arise from the fact that the number of people available to critically examine the data will not grow from current levels. This has a number of implications. For example, every large imaging survey today is resigned to the fact that its image reduction pipeline fails at some significant rate. Many of these failures are dealt with by rerunning the reduction pipeline under human supervision, with carefully "tweaked" parameters to deal with the original problem. For LSST, this will no longer be feasible. The problem is compounded by the fact that the processing must of necessity occur on clusters with large numbers of CPUs and disk drives, and with some components connected by long-haul networks. This inevitably results in a significant rate of hardware component failures, which can easily lead to further software failures. Both hardware and software failures must be seen as a routine fact of life rather than rare exceptions to normality.
Designing a multi-petabyte database for LSST
DOE Office of Scientific and Technical Information (OSTI.GOV)
Becla, J; Hanushevsky, A
2005-12-21
The 3.2 giga-pixel LSST camera will produce over half a petabyte of raw images every month. This data needs to be reduced in under a minute to produce real-time transient alerts, and then cataloged and indexed to allow efficient access and simplify further analysis. The indexed catalogs alone are expected to grow at a rate of about 600 terabytes per year. The sheer volume of data, the real-time transient alerting requirements of the LSST, and its spatio-temporal aspects require cutting-edge techniques to build an efficient data access system at reasonable cost. As currently envisioned, the system will rely on a database for catalogs and metadata. Several database systems are being evaluated to understand how they will scale and perform at these data volumes in anticipated LSST access patterns. This paper describes the LSST requirements, the challenges they impose, the data access philosophy, and the database architecture that is expected to be adopted in order to meet the data challenges.
TRANSITING PLANETS WITH LSST. II. PERIOD DETECTION OF PLANETS ORBITING 1 M⊙ HOSTS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacklin, Savannah; Lund, Michael B.; Stassun, Keivan G.
2015-07-15
The Large Synoptic Survey Telescope (LSST) will photometrically monitor ∼10^9 stars for 10 years. The resulting light curves can be used to detect transiting exoplanets. In particular, as demonstrated by Lund et al., LSST will probe stellar populations currently undersampled in most exoplanet transit surveys, including out to extragalactic distances. In this paper we test the efficiency of the box-fitting least-squares (BLS) algorithm for accurately recovering the periods of transiting exoplanets using simulated LSST data. We model planets with a range of radii orbiting a solar-mass star at a distance of 7 kpc, with orbital periods ranging from 0.5 to 20 days. We find that standard-cadence LSST observations will be able to reliably recover the periods of Hot Jupiters with periods shorter than ∼3 days; however, it will remain a challenge to confidently distinguish these transiting planets from false positives. At the same time, we find that the LSST deep-drilling cadence is extremely powerful: the BLS algorithm successfully recovers at least 30% of sub-Saturn-size exoplanets with orbital periods as long as 20 days, and a simple BLS power criterion robustly distinguishes ∼98% of these from photometric (i.e., statistical) false positives.
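The core of a BLS-style search is to phase-fold the light curve at each trial period and score the deepest box-shaped dip. The numpy sketch below is our own simplification of that idea (not the authors' code, and not the full BLS statistic; the sqrt(N)-weighted depth score and the parameter defaults are illustrative choices).

```python
import numpy as np

def bls_recover_period(t, y, trial_periods, duration_frac=0.05, nbins=200):
    """Minimal box-least-squares-style period search (illustrative only).

    For each trial period: phase-fold the light curve, bin it, and score
    every wrap-around box of width ~duration_frac of the period by how far
    the in-box mean flux dips below the global mean, weighted by sqrt(N)
    in the box. Returns the trial period with the best-scoring box.
    """
    t, y = np.asarray(t, float), np.asarray(y, float)
    width = max(1, int(duration_frac * nbins))
    kernel = np.ones(width)
    mean_flux = y.mean()
    best_period, best_score = None, -np.inf
    for p in trial_periods:
        phase_bin = np.minimum(((t % p) / p * nbins).astype(int), nbins - 1)
        sums = np.bincount(phase_bin, weights=y, minlength=nbins)
        counts = np.bincount(phase_bin, minlength=nbins).astype(float)
        # wrap-around box sums via convolution over a cyclically extended array
        box_sum = np.convolve(np.r_[sums, sums[:width - 1]], kernel, "valid")
        box_n = np.convolve(np.r_[counts, counts[:width - 1]], kernel, "valid")
        ok = box_n >= 3
        if not ok.any():
            continue
        depth = mean_flux - box_sum[ok] / box_n[ok]
        score = np.max(depth * np.sqrt(box_n[ok]))
        if score > best_score:
            best_score, best_period = score, p
    return best_period
```

A production analysis would use a tested implementation such as `astropy.timeseries.BoxLeastSquares` rather than a hand-rolled loop; the sketch only shows why sparse, irregular LSST sampling still permits period recovery once the data are phase-folded.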
LSST and the Epoch of Reionization Experiments
NASA Astrophysics Data System (ADS)
Ivezić, Željko
2018-05-01
The Large Synoptic Survey Telescope (LSST), a next generation astronomical survey, sited on Cerro Pachon in Chile, will provide an unprecedented amount of imaging data for studies of the faint optical sky. The LSST system includes an 8.4m (6.7m effective) primary mirror and a 3.2 Gigapixel camera with a 9.6 sq. deg. field of view. This system will enable about 10,000 sq. deg. of sky to be covered twice per night, every three to four nights on average, with typical 5-sigma depth for point sources of r = 24.5 (AB). With over 800 observations in the ugrizy bands over a 10-year period, these data will enable coadded images reaching r = 27.5 (about 5 magnitudes deeper than SDSS) as well as studies of faint time-domain astronomy. The measured properties of newly discovered and known astrometric and photometric transients will be publicly reported within 60 sec after closing the shutter. The resulting hundreds of petabytes of imaging data for about 40 billion objects will be used for scientific investigations ranging from the properties of near-Earth asteroids to characterizations of dark matter and dark energy. For example, simulations estimate that LSST will discover about 1,000 quasars at redshifts exceeding 7; this sample will place tight constraints on the cosmic environment at the end of the reionization epoch. In addition to a brief introduction to LSST, I review the value of LSST data in support of epoch of reionization experiments and discuss how international participants can join LSST.
Probing the Solar System with LSST
NASA Astrophysics Data System (ADS)
Harris, A.; Ivezic, Z.; Juric, M.; Lupton, R.; Connolly, A.; Kubica, J.; Moore, A.; Bowell, E.; Bernstein, G.; Cook, K.; Stubbs, C.
2005-12-01
LSST will catalog small Potentially Hazardous Asteroids (PHAs), survey the main belt asteroid (MBA) population to extraordinarily small size, discover comets far from the sun where their nuclear properties can be discerned without coma, and survey the Centaur and Trans-Neptunian Object (TNO) populations. The present planned observing strategy is to "visit" each field (9.6 deg²) with two back-to-back exposures of ~15 sec, reaching to at least V magnitude 24.5. An intra-night revisit time of the order of half an hour will distinguish stationary transients from even very distant (~70 AU) solar system bodies. In order to link observations and determine orbits, each sky area will be visited several times during a month, spaced by about a week. This cadence will result in orbital parameters for several million MBAs and about 20,000 TNOs, with light curves and colorimetry for the brighter 10% or so of each population. Compared to the data currently available, this would represent a factor of 10 to 100 increase in the numbers of orbits, colors, and variability measurements for the two classes of objects. The LSST MBA and TNO samples will enable detailed studies of the dynamical and chemical history of the solar system. The increase in data volume associated with LSST asteroid science will present many computational challenges to how we might extract tracks and orbits of asteroids from the underlying clutter. Tree-based algorithms for multihypothesis testing of asteroid tracks can help solve these challenges by providing the necessary 1000-fold speed-ups over current approaches while recovering 95% of the underlying moving objects.
Architectural Implications for Spatial Object Association Algorithms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kumar, V S; Kurc, T; Saltz, J
2009-01-29
Spatial object association, also referred to as cross-match of spatial datasets, is the problem of identifying and comparing objects in two or more datasets based on their positions in a common spatial coordinate system. In this work, we evaluate two crossmatch algorithms that are used for astronomical sky surveys, on the following database system architecture configurations: (1) Netezza Performance Server®, a parallel database system with active disk style processing capabilities, (2) MySQL Cluster, a high-throughput network database system, and (3) a hybrid configuration consisting of a collection of independent database system instances with data replication support. Our evaluation provides insights about how architectural characteristics of these systems affect the performance of the spatial crossmatch algorithms. We conducted our study using real use-case scenarios borrowed from a large-scale astronomy application known as the Large Synoptic Survey Telescope (LSST).
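At toy scale, a positional crossmatch of the kind these papers evaluate can be sketched with declination-zone bucketing, the same idea behind the SDSS/Gray zone algorithm mentioned elsewhere in these results. All names below are ours, and a real implementation would use proper spherical geometry and a database index rather than Python dictionaries.

```python
import math
from collections import defaultdict

def crossmatch(cat_a, cat_b, radius_arcsec=1.0):
    """Toy zone-based positional crossmatch (illustrative sketch).

    cat_a, cat_b: iterables of (id, ra_deg, dec_deg). Catalog B is bucketed
    into declination zones whose height equals the match radius, so each
    A source is compared only against B sources in its own and the two
    adjacent zones, instead of against the whole catalog.
    """
    r = radius_arcsec / 3600.0  # match radius in degrees
    zones = defaultdict(list)
    for rec in cat_b:
        zones[int(rec[2] // r)].append(rec)
    matches = []
    for ida, ra, dec in cat_a:
        z = int(dec // r)
        for zz in (z - 1, z, z + 1):
            for idb, rb, db in zones.get(zz, ()):
                # flat-sky small-angle distance; adequate at arcsecond scales
                dra = (ra - rb) * math.cos(math.radians(dec))
                if dra * dra + (dec - db) ** 2 <= r * r:
                    matches.append((ida, idb))
    return matches
```

The zoning turns an O(|A|·|B|) all-pairs comparison into one that touches only a thin declination band per source, which is the architectural point the paper's database configurations exploit in different ways.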
LSST Resources for the Community
NASA Astrophysics Data System (ADS)
Jones, R. Lynne
2011-01-01
LSST will generate 100 petabytes of images and 20 petabytes of catalogs, covering 18,000-20,000 square degrees of area sampled every few days over a total of ten years -- all publicly available and exquisitely calibrated. The primary access to this data will be through Data Access Centers (DACs). DACs will provide access to catalogs of sources (single detections from individual images) and objects (associations of sources from multiple images). Simple user interfaces or direct SQL queries at the DAC can return user-specified portions of data from catalogs or images. More complex manipulations of the data, such as calculating multi-point correlation functions or creating alternative photo-z measurements on terabyte-scale data, can be completed with the DAC's own resources. Even more data-intensive computations requiring access to large numbers of image pixels at petabyte scale could also be conducted at the DAC, using compute resources allocated in a similar manner to a TAC. DAC resources will be available to all individuals in member countries or institutes and LSST science collaborations. DACs will also assist investigators with requests for allocations at national facilities such as the Petascale Computing Facility, TeraGrid, and Open Science Grid. Using data on this scale requires new approaches to accessibility and analysis which are being developed through interactions with the LSST Science Collaborations. We are producing simulated images (as might be acquired by LSST) based on models of the universe and generating catalogs from these images (as well as from the base model) using the LSST data management framework in a series of data challenges. The resulting images and catalogs are being made available to the science collaborations to verify the algorithms and develop user interfaces. All LSST software is open source and available online, including preliminary catalog formats. We encourage feedback from the community.
Measuring the Growth Rate of Structure with Type IA Supernovae from LSST
NASA Astrophysics Data System (ADS)
Howlett, Cullan; Robotham, Aaron S. G.; Lagos, Claudia D. P.; Kim, Alex G.
2017-10-01
We investigate the peculiar motions of galaxies up to z = 0.5 using Type Ia supernovae (SNe Ia) from the Large Synoptic Survey Telescope (LSST) and predict the subsequent constraints on the growth rate of structure. We consider two cases. Our first is based on measurements of the volumetric SNe Ia rate and assumes we can obtain spectroscopic redshifts and light curves for varying fractions of objects that are detected pre-peak luminosity by LSST (some of which may be obtained by LSST itself, and others that would require additional follow-up observations). We find that these measurements could produce growth rate constraints at z < 0.5 that significantly outperform those found using Redshift Space Distortions (RSD) with DESI or 4MOST, even though there are ~4× fewer objects. For our second case, we use semi-analytic simulations and a prescription for the SNe Ia rate as a function of stellar mass and star-formation rate to predict the number of LSST SNe Ia whose host redshifts may already have been obtained with the Taipan+WALLABY surveys or with a future multi-object spectroscopic survey. We find ~18,000 and ~160,000 SNe Ia with host redshifts for these cases, respectively. While this is only a fraction of the total LSST-detected SNe Ia, they could be used to significantly augment and improve the growth rate constraints compared to only RSD. Ultimately, we find that combining LSST SNe Ia with large numbers of galaxy redshifts will provide the most powerful probe of large-scale gravity in the z < 0.5 regime over the coming decades.
The Large Synoptic Survey Telescope project management control system
NASA Astrophysics Data System (ADS)
Kantor, Jeffrey P.
2012-09-01
The Large Synoptic Survey Telescope (LSST) program is jointly funded by the NSF, the DOE, and private institutions and donors. From an NSF funding standpoint, the LSST is a Major Research Equipment and Facilities (MREFC) project. The NSF funding process requires proposals and D&D reviews to include activity-based budgets and schedules; documented basis of estimates; risk-based contingency analysis; and cost escalation and categorization. "Out of the box," the commercial tool Primavera P6 contains approximately 90% of the planning and estimating capability needed to satisfy R&D phase requirements, and it is customizable/configurable for the remainder with relatively little effort. We describe the customization/configuration and use of Primavera for the LSST Project Management Control System (PMCS), assess our experience to date, and describe future directions. Examples in this paper are drawn from the LSST Data Management System (DMS), which is one of three main subsystems of the LSST and is funded by the NSF. By astronomy standards the LSST DMS is a large data management project, processing and archiving over 70 petabytes of image data, producing over 20 petabytes of catalogs annually, and generating 2 million transient alerts per night. Over the 6-year construction and commissioning phase, the DM project is estimated to require 600,000 hours of engineering effort. In total, the DMS cost is approximately 60% hardware/system software and 40% labor.
From Science To Design: Systems Engineering For The Lsst
NASA Astrophysics Data System (ADS)
Claver, Chuck F.; Axelrod, T.; Fouts, K.; Kantor, J.; Nordby, M.; Sebag, J.; LSST Collaboration
2009-01-01
The LSST is a general-purpose survey telescope that will address scores of scientific missions. To help the technical teams converge on a specific engineering design, the LSST Science Requirements Document (SRD) selects four stressing principal science missions: 1) constraining dark matter and dark energy; 2) taking an inventory of the Solar System; 3) exploring the transient optical sky; and 4) mapping the Milky Way. From these four missions the SRD specifies the requirements for single images and the full 10-year survey that enable a wide range of science beyond the four principal missions. Through optical design and analysis, operations simulation, and throughput modeling, the systems engineering effort in the LSST has largely focused on taking the SRD specifications and deriving system functional requirements that define the system design. A Model Based Systems Engineering approach with SysML is used to manage the flow down of requirements from science to system function to sub-system. The rigor of requirements flow and management helps the LSST keep the overall scope, and hence budget and schedule, under control.
Strong Gravitational Lensing with LSST
NASA Astrophysics Data System (ADS)
Marshall, Philip J.; Bradac, M.; Chartas, G.; Dobler, G.; Eliasdottir, A.; Falco, E.; Fassnacht, C. D.; Jee, M. J.; Keeton, C. R.; Oguri, M.; Tyson, J. A.; LSST Strong Lensing Science Collaboration
2010-01-01
LSST will find more strong gravitational lensing events than any other survey preceding it, and will monitor them all at a cadence of a few days to a few weeks. We can expect the biggest advances in strong lensing science made with LSST to be in those areas that benefit most from the large volume, and the high accuracy multi-filter time series: studies of, and using, several thousand lensed quasars and several hundred supernovae. However, the high quality imaging will allow us to detect and measure large numbers of background galaxies multiply-imaged by galaxies, groups and clusters. In this poster we give an overview of the strong lensing science enabled by LSST, and highlight the particular associated technical challenges that will have to be faced when working with the survey.
Expanding the user base beyond HEP for the Ganga distributed analysis user interface
NASA Astrophysics Data System (ADS)
Currie, R.; Egede, U.; Richards, A.; Slater, M.; Williams, M.
2017-10-01
This document presents the results of recent developments within the Ganga[1] project to support users from new communities outside of HEP. In particular, we examine the case of users from the Large Synoptic Survey Telescope (LSST) group looking to use resources provided by the UK-based GridPP[2][3] DIRAC[4][5] instance. An example use case is work performed with users from the LSST Virtual Organisation (VO) to distribute the workflow used for galaxy shape identification analyses. This work highlighted some LSST-specific challenges that could be well solved by common tools within the HEP community. As a result of this work, the LSST community was able to take advantage of GridPP[2][3] resources to perform large computing tasks within the UK.
LSST Astroinformatics And Astrostatistics: Data-oriented Astronomical Research
NASA Astrophysics Data System (ADS)
Borne, Kirk D.; Stassun, K.; Brunner, R. J.; Djorgovski, S. G.; Graham, M.; Hakkila, J.; Mahabal, A.; Paegert, M.; Pesenson, M.; Ptak, A.; Scargle, J.; Informatics, LSST; Statistics Team
2011-01-01
The LSST Informatics and Statistics Science Collaboration (ISSC) focuses on research and scientific discovery challenges posed by the very large and complex data collection that LSST will generate. Application areas include astroinformatics, machine learning, data mining, astrostatistics, visualization, scientific data semantics, time series analysis, and advanced signal processing. Research problems to be addressed with these methodologies include transient event characterization and classification, rare class discovery, correlation mining, outlier/anomaly/surprise detection, improved estimators (e.g., for photometric redshift or early onset supernova classification), exploration of highly dimensional (multivariate) data catalogs, and more. We present sample science results from these data-oriented approaches to large-data astronomical research, drawn from LSST ISSC team members, including the EB (Eclipsing Binary) Factory, the environmental variations in the fundamental plane of elliptical galaxies, and outlier detection in multivariate catalogs.
Photometric classification and redshift estimation of LSST Supernovae
NASA Astrophysics Data System (ADS)
Dai, Mi; Kuhlmann, Steve; Wang, Yun; Kovacs, Eve
2018-07-01
Supernova (SN) classification and redshift estimation using photometric data only have become very important for the Large Synoptic Survey Telescope (LSST), given the large number of SNe that LSST will observe and the impossibility of spectroscopically following up all of them. We investigate the performance of an SN classifier that uses SN colours to classify LSST SNe with the Random Forest classification algorithm. Our classifier achieves an area under the curve of 0.98, which represents excellent classification. We are able to obtain a photometric SN sample containing 99 per cent SNe Ia by choosing a probability threshold. We estimate the photometric redshifts (photo-z) of SNe in our sample by fitting the SN light curves using the SALT2 model with nested sampling. We obtain a mean bias ⟨z_phot − z_spec⟩ of 0.012 with σ((z_phot − z_spec)/(1 + z_spec)) = 0.0294 without using a host-galaxy photo-z prior, and a mean bias of 0.0017 with σ((z_phot − z_spec)/(1 + z_spec)) = 0.0116 using a host-galaxy photo-z prior. Assuming a flat ΛCDM model with Ω_m = 0.3, we obtain Ω_m = 0.305 ± 0.008 (statistical errors only), using the simulated LSST sample of photometric SNe Ia (with intrinsic scatter σ_int = 0.11) derived with our methodology without a host-galaxy photo-z prior. Our method will help boost the power of SNe from the LSST as cosmological probes.
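The colour-based Random Forest classification described in this abstract can be sketched as follows. This is an illustrative toy only: the feature values, sample sizes, and the 0.9 purity threshold are assumptions, not the authors' pipeline, and real inputs would be colours measured from fitted LSST light curves.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Toy stand-in for SN colour features (e.g. g-r, r-i, i-z near peak).
# Class means and scatters are invented for illustration.
n = 2000
X_ia = rng.normal(loc=[0.0, 0.1, 0.2], scale=0.15, size=(n, 3))   # "SN Ia"
X_cc = rng.normal(loc=[0.4, 0.5, 0.5], scale=0.25, size=(n, 3))   # "core collapse"
X = np.vstack([X_ia, X_cc])
y = np.concatenate([np.ones(n), np.zeros(n)])  # 1 = SN Ia

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X, y)
proba = clf.predict_proba(X)[:, 1]

# Area under the ROC curve measures classification quality; a probability
# threshold then trades completeness against sample purity.
auc = roc_auc_score(y, proba)
pure_ia = proba > 0.9  # high-purity cut; the threshold is an assumption
```

Choosing a higher threshold yields a purer but smaller photometric Ia sample, which is the trade-off behind the 99 per cent purity quoted in the abstract.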
Lower Boundary Forcing related to the Occurrence of Rain in the Tropical Western Pacific
NASA Astrophysics Data System (ADS)
Li, Y.; Carbone, R. E.
2013-12-01
Global weather and climate models have a long and somewhat tortured history with respect to simulation and prediction of tropical rainfall in the relative absence of balanced flow in the geostrophic sense. An important correlate with tropical rainfall is sea surface temperature (SST). The introduction of SST information to convective rainfall parameterization in global models has improved model climatologies of tropical oceanic rainfall. Nevertheless, large systematic errors have persisted, several of which are common to most atmospheric models. Models have evolved to the point where increased spatial resolution demands representation of the SST field at compatible temporal and spatial scales, leading to common usage of monthly SST fields at scales of 10-100 km. While large systematic errors persist, significant skill has been realized from various atmospheric and coupled ocean models, including assimilation of weekly or even daily SST fields, as tested by the European Centre for Medium-Range Weather Forecasts. A few investigators have explored the role of SST gradients in relation to the occurrence of precipitation. Some of this research has focused on large-scale gradients, mainly associated with surface ocean-atmosphere climatology. These studies conclude that lower boundary atmospheric convergence, under some conditions, could be substantially enhanced over SST gradients, destabilizing the atmosphere and thereby enabling moist convection. While the concept has a firm theoretical foundation, it has not gained a sizeable following far beyond the realm of western boundary currents. Li and Carbone (2012) examined the role of transient mesoscale (~100 km) SST gradients in the western Pacific warm pool by means of GHRSST and CMORPH rainfall data. They found that excitation of deep moist convection was strongly associated with the Laplacian of SST (LSST).
Specifically, −LSST is associated with rainfall onset in 75% of 10,000 events over 4 years, whereas the background ocean is symmetric about zero Laplacian. This finding is fully consistent with theory for gradients of order ~1 °C in low mean wind conditions, capable of inducing atmospheric convergence of N × 10⁻⁵ s⁻¹. We will present new findings resulting from the application of a Madden-Julian oscillation (MJO) passband filter to GHRSST/CMORPH data. It shows that the −LSST field organizes at scales of 1000-2000 km and can persist for periods of two weeks to three months. Such −LSST anomalies are in quadrature with MJO rainfall, tracking and leading the wet phase of the MJO by 10-14 days, from the Indian Ocean to the dateline. More generally, an evaluation of SST structure in rainfall production will be presented, which represents a decidedly alternative view to conventional wisdom. Li, Y., and R. E. Carbone, 2012: Excitation of Rainfall over the Tropical Western Pacific. J. Atmos. Sci., 69, 2983-2994.
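The quantity at the heart of this study, the Laplacian of SST, is straightforward to compute on a gridded field. A minimal sketch with a five-point stencil is shown below; the grid, the warm-pool bump, and its amplitude are invented for illustration (the real analysis uses mesoscale GHRSST fields), but it demonstrates why a warm SST anomaly has negative Laplacian (hence positive −LSST) at its centre.

```python
import numpy as np

def laplacian(field, dx=1.0):
    """Five-point discrete Laplacian of a 2-D field (interior points only)."""
    lap = np.zeros_like(field)
    lap[1:-1, 1:-1] = (
        field[2:, 1:-1] + field[:-2, 1:-1]
        + field[1:-1, 2:] + field[1:-1, :-2]
        - 4.0 * field[1:-1, 1:-1]
    ) / dx**2
    return lap

# A toy warm SST bump of ~1 degC on a 28 degC background: the Laplacian
# is negative at the bump's centre, so -LSST > 0 there -- the configuration
# the study associates with rainfall onset.
x = np.linspace(-1.0, 1.0, 41)
X, Y = np.meshgrid(x, x)
sst = 28.0 + 1.0 * np.exp(-(X**2 + Y**2) / 0.1)
lap = laplacian(sst, dx=x[1] - x[0])
```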
NASA Astrophysics Data System (ADS)
Claver, C. F.; Selvy, Brian M.; Angeli, George; Delgado, Francisco; Dubois-Felsmann, Gregory; Hascall, Patrick; Lotz, Paul; Marshall, Stuart; Schumacher, German; Sebag, Jacques
2014-08-01
The Large Synoptic Survey Telescope project was an early adopter of SysML and Model Based Systems Engineering practices. The LSST project began using MBSE for requirements engineering in 2006, shortly after the initial release of the first SysML standard. Out of this early work the LSST's MBSE effort has grown to include system requirements, operational use cases, physical system definition, interfaces, and system states, along with behavior sequences and activities. In this paper we describe our approach and methodology for cross-linking these system elements over the three classical systems engineering domains (requirements, functional, and physical) into the LSST System Architecture model. We also show how this model is used as the central element of the overall project systems engineering effort. More recently we have begun to use the cross-linked modeled system architecture to develop and plan the system verification and test process. In presenting this work we also describe "lessons learned" from several missteps the project has had with MBSE. Lastly, we conclude by summarizing the overall status of the LSST's System Architecture model and our plans for the future as the LSST heads toward construction.
Commentary: Learning About the Sky Through Simulations. Chapter 34
NASA Technical Reports Server (NTRS)
Way, Michael J.
2012-01-01
The Large Synoptic Survey Telescope (LSST) simulator being built by Andy Connolly and collaborators is an impressive undertaking and should make working with LSST in the beginning stages far easier than it was initially with the Sloan Digital Sky Survey (SDSS). However, I would like to focus on an equally important problem that has not yet been discussed here, but that the community will need to address in the coming years: can we deal with the flood of data from LSST, and will we need to rethink the way we work?
Cosmology with the Large Synoptic Survey Telescope: an overview
NASA Astrophysics Data System (ADS)
Zhan, Hu; Tyson, J. Anthony
2018-06-01
The Large Synoptic Survey Telescope (LSST) is a high étendue imaging facility that is being constructed atop Cerro Pachón in northern Chile. It is scheduled to begin science operations in 2022. With an 8.4 m (6.7 m effective) aperture, a novel three-mirror design achieving a seeing-limited 9.6 deg² field of view, and a 3.2 gigapixel camera, the LSST has the deep-wide-fast imaging capability necessary to carry out an 18,000 deg² survey in six passbands (ugrizy) to a coadded depth of r ~ 27.5 over 10 years using 90% of its observational time. The remaining 10% of the time will be devoted to considerably deeper and faster time-domain observations and smaller surveys. In total, each patch of the sky in the main survey will receive 800 visits allocated across the six passbands, with 30 s exposure visits. The huge volume of high-quality LSST data will provide a wide range of science opportunities and, in particular, open a new era of precision cosmology with unprecedented statistical power and tight control of systematic errors. In this review, we give a brief account of the LSST cosmology program with an emphasis on dark energy investigations. The LSST will address dark energy physics and cosmology in general by exploiting diverse precision probes including large-scale structure, weak lensing, type Ia supernovae, galaxy clusters, and strong lensing. Combined with the cosmic microwave background data, these probes form interlocking tests on the cosmological model and the nature of dark energy in the presence of various systematics. The LSST data products will be made available to the US and Chilean scientific communities and to international partners with no proprietary period. Close collaborations with contemporaneous imaging and spectroscopy surveys observing at a variety of wavelengths, resolutions, depths, and timescales will be a vital part of the LSST science program, which will not only enhance specific studies but, more importantly, also allow a more complete understanding of the Universe through different windows.
Yoon, Dong Hyun; Kang, Dongheon; Kim, Hee-Jae; Kim, Jin-Soo; Song, Han Sol; Song, Wook
2017-05-01
The effectiveness of resistance training in improving cognitive function in older adults is well demonstrated. In particular, unconventional high-speed resistance training can improve muscle power development. In the present study, the effectiveness of 12 weeks of elastic band-based high-speed power training (HSPT) was examined. Participants were randomly assigned into a HSPT group (n = 14, age 75.0 ± 0.9 years), a low-speed strength training (LSST) group (n = 9, age 76.0 ± 1.3 years) and a control group (CON; n = 7, age 78.0 ± 1.0 years). A 1-h exercise program was provided twice a week for 12 weeks for the HSPT and LSST groups, and balance and tone exercises were carried out by the CON group. Significant increases in levels of cognitive function, physical function, and muscle strength were observed in both the HSPT and LSST groups. In cognitive function, significant improvements in the Mini-Mental State Examination and Montreal Cognitive Assessment were seen in both the HSPT and LSST groups compared with the CON group. In physical functions, Short Physical Performance Battery scores were increased significantly in the HSPT and LSST groups compared with the CON group. In the 12 weeks of elastic band-based training, the HSPT group showed greater improvements in older women with mild cognitive impairment than the LSST group, although both regimens were effective in improving cognitive function, physical function and muscle strength. We conclude that elastic band-based HSPT, as compared with LSST, is more efficient in helping older women with mild cognitive impairment to improve cognitive function, physical performance and muscle strength. Geriatr Gerontol Int 2017; 17: 765-772. © 2016 Japan Geriatrics Society.
LSST Painting Risk Evaluation Memo
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wolfe, Justin E.
The optics subsystem is required to paint the edges of optics black where possible. Due to the risks in applying the paint, LSST requests a review of the impact of removing this requirement for the filters and L3.
Predicting Constraints on Ultra-Light Axion Parameters due to LSST Observations
NASA Astrophysics Data System (ADS)
Given, Gabriel; Grin, Daniel
2018-01-01
Ultra-light axions (ULAs) are dark matter or dark energy candidates (depending on their mass) predicted to have masses between $10^{-33}$ and $10^{-18}$ eV. The Large Synoptic Survey Telescope (LSST) is expected to provide a large number of weak lensing observations, which will lower the statistical uncertainty on the convergence power spectrum. I began work with Daniel Grin to predict how accurately data from the LSST will be able to constrain ULA properties. I wrote Python code that takes a matter power spectrum calculated by axionCAMB and converts it to a convergence power spectrum. My code then takes derivatives of the convergence power spectrum with respect to several cosmological parameters; these derivatives will be used in a Fisher matrix analysis to determine the sensitivity of LSST observations to axion parameters.
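The Fisher matrix step described in this abstract can be sketched in a few lines: given derivatives of a power spectrum with respect to parameters and a covariance for the measurements, the Fisher matrix forecasts parameter uncertainties. The power-law spectrum, the two-parameter model, and the cosmic-variance-only covariance below are toy assumptions standing in for the axionCAMB-derived convergence spectra.

```python
import numpy as np

ells = np.arange(100, 1000)

def cl_model(amp, tilt):
    # Toy convergence power spectrum: a simple power law in ell.
    return amp * (ells / 500.0) ** tilt * 1e-9

p0 = (1.0, -1.0)   # fiducial (amplitude, tilt); values are invented
eps = 1e-4

def deriv(i):
    # Central finite difference dC_ell / dp_i about the fiducial point.
    lo, hi = list(p0), list(p0)
    lo[i] -= eps
    hi[i] += eps
    return (cl_model(*hi) - cl_model(*lo)) / (2 * eps)

derivs = [deriv(0), deriv(1)]

# Toy Gaussian (cosmic variance) covariance per multipole, f_sky = 0.5.
cov = 2.0 * cl_model(*p0) ** 2 / ((2 * ells + 1) * 0.5)

# F_ij = sum_ell (dC/dp_i)(dC/dp_j) / Cov(C_ell)
F = np.array([[np.sum(derivs[i] * derivs[j] / cov) for j in range(2)]
              for i in range(2)])
sigma = np.sqrt(np.diag(np.linalg.inv(F)))  # forecast 1-sigma errors
```

The diagonal of the inverse Fisher matrix gives the marginalized variance on each parameter, which is how derivative spectra like those computed here translate into sensitivity forecasts.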
Manuel, Anastacia M; Phillion, Donald W; Olivier, Scot S; Baker, Kevin L; Cannon, Brice
2010-01-18
The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror, modified Paul-Baker design, with an 8.4-meter primary mirror, a 3.4-m secondary, and a 5.0-m tertiary, along with three refractive corrector lenses to produce a flat focal plane with a field of view of 9.6 square degrees. In order to maintain image quality during operation, the deformations and rigid body motions of the three large mirrors must be actively controlled to minimize optical aberrations, which arise primarily from forces due to gravity and thermal expansion. We describe the methodology for measuring the telescope aberrations using a set of curvature wavefront sensors located in the four corners of the LSST camera focal plane. We present a comprehensive analysis of the wavefront sensing system, including the availability of reference stars, demonstrating that this system will perform to the specifications required to meet the LSST performance goals.
NASA Astrophysics Data System (ADS)
O'Mullane, William; LSST Data Management Team
2018-01-01
The Large Synoptic Survey Telescope (LSST) is an 8-m optical ground-based telescope being constructed on Cerro Pachón in Chile. LSST will survey half the sky every few nights in six optical bands. The data will be transferred to the data center in North America, where within 60 seconds they will be reduced using difference imaging and an alert list generated for the community. Additionally, annual data releases will be constructed from all the data taken during the 10-year mission, producing catalogs and deep co-added images with unprecedented time resolution for such a large region of sky. In this paper we present the current status of the LSST stack, including the data processing components, the Qserv database, and the data visualization software; describe how to obtain it; and provide a summary of the development road map. We also discuss the move to Python 3 and the timeline for dropping Python 2.
Large-Scale Overlays and Trends: Visually Mining, Panning and Zooming the Observable Universe.
Luciani, Timothy Basil; Cherinka, Brian; Oliphant, Daniel; Myers, Sean; Wood-Vasey, W Michael; Labrinidis, Alexandros; Marai, G Elisabeta
2014-07-01
We introduce a web-based computing infrastructure to assist the visual integration, mining and interactive navigation of large-scale astronomy observations. Following an analysis of the application domain, we design a client-server architecture to fetch distributed image data and to partition local data into a spatial index structure that allows prefix-matching of spatial objects. In conjunction with hardware-accelerated pixel-based overlays and an online cross-registration pipeline, this approach allows the fetching, displaying, panning and zooming of gigabit panoramas of the sky in real time. To further facilitate the integration and mining of spatial and non-spatial data, we introduce interactive trend images: compact visual representations for identifying outlier objects and for studying trends within large collections of spatial objects of a given class. In a demonstration, images from three sky surveys (SDSS, FIRST and simulated LSST results) are cross-registered and integrated as overlays, allowing cross-spectrum analysis of astronomy observations. Trend images are interactively generated from catalog data and used to visually mine astronomy observations of similar type. The front-end of the infrastructure uses the web technologies WebGL and HTML5 to enable cross-platform, web-based functionality. Our approach attains interactive rendering frame rates; its power and flexibility enable it to serve the needs of the astronomy community. Evaluation on three case studies, as well as feedback from domain experts, emphasizes the benefits of this visual approach to the observational astronomy field and its potential benefits to large-scale geospatial visualization in general.
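The "spatial index structure that allows prefix-matching of spatial objects" can be illustrated with a hierarchical quadtree key: positions in the same cell share a key prefix, so neighbour retrieval reduces to string prefix matching. The flat RA/Dec rectangle, key depth, and digit encoding below are assumptions for illustration and do not reproduce the paper's actual index or sky projection.

```python
def quadkey(ra, dec, depth=8):
    """Toy hierarchical quadtree key for a sky position (degrees).

    Each digit subdivides the current RA/Dec rectangle into four
    quadrants, so objects sharing a key prefix lie in the same cell
    at the corresponding coarser level.
    """
    x0, x1, y0, y1 = 0.0, 360.0, -90.0, 90.0
    key = ""
    for _ in range(depth):
        xm, ym = (x0 + x1) / 2, (y0 + y1) / 2
        quad = (2 if dec >= ym else 0) + (1 if ra >= xm else 0)
        key += str(quad)
        x0, x1 = (xm, x1) if ra >= xm else (x0, xm)
        y0, y1 = (ym, y1) if dec >= ym else (y0, ym)
    return key

# Nearby positions share a long common prefix; distant ones diverge early.
a = quadkey(150.10, 2.20)
b = quadkey(150.12, 2.21)
c = quadkey(330.00, -45.0)
```

Stored in a sorted table or B-tree, such keys make "all objects in this cell" a range scan over one prefix, which is the property the client-server design exploits for fast spatial fetches.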
NASA Astrophysics Data System (ADS)
Barr, Jeffrey D.; Gressler, William; Sebag, Jacques; Seriche, Jaime; Serrano, Eduardo
2016-07-01
The civil work, site infrastructure and buildings for the summit facility of the Large Synoptic Survey Telescope (LSST) are among the first major elements that need to be designed, bid and constructed to support the subsequent integration of the dome, telescope, optics, camera and supporting systems. As the contracts for those other major subsystems now move forward under the management of the LSST Telescope and Site (T and S) team, there has been inevitable and beneficial evolution in their designs, which has resulted in significant modifications to the facility and infrastructure. The earliest design requirements for the LSST summit facility were first documented in 2005, its contracted full design was initiated in 2010, and construction began in January, 2015. During that entire development period, and extending now roughly halfway through construction, there continue to be necessary modifications to the facility design resulting from the refinement of interfaces to other major elements of the LSST project and now, during construction, due to unanticipated field conditions. Changes from evolving interfaces have principally involved the telescope mount, the dome and mirror handling/coating facilities which have included significant variations in mass, dimensions, heat loads and anchorage conditions. Modifications related to field conditions have included specifying and testing alternative methods of excavation and contending with the lack of competent rock substrate where it was predicted to be. While these and other necessary changes are somewhat specific to the LSST project and site, they also exemplify inherent challenges related to the typical timeline for the design and construction of astronomical observatory support facilities relative to the overall development of the project.
Photometric classification and redshift estimation of LSST Supernovae
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dai, Mi; Kuhlmann, Steve; Wang, Yun
Supernova (SN) classification and redshift estimation using photometric data only have become very important for the Large Synoptic Survey Telescope (LSST), given the large number of SNe that LSST will observe and the impossibility of spectroscopically following up all of them. We investigate the performance of an SN classifier that uses SN colours to classify LSST SNe with the Random Forest classification algorithm. Our classifier achieves an area under the curve of 0.98, which represents excellent classification. We are able to obtain a photometric SN sample containing 99 per cent SNe Ia by choosing a probability threshold. We estimate the photometric redshifts (photo-z) of SNe in our sample by fitting the SN light curves using the SALT2 model with nested sampling. We obtain a mean bias ⟨z_phot − z_spec⟩ of 0.012 with σ((z_phot − z_spec)/(1 + z_spec)) = 0.0294 without using a host-galaxy photo-z prior, and a mean bias of 0.0017 with σ((z_phot − z_spec)/(1 + z_spec)) = 0.0116 using a host-galaxy photo-z prior. Assuming a flat ΛCDM model with Ω_m = 0.3, we obtain Ω_m = 0.305 ± 0.008 (statistical errors only), using the simulated LSST sample of photometric SNe Ia (with intrinsic scatter σ_int = 0.11) derived with our methodology without a host-galaxy photo-z prior. Our method will help boost the power of SNe from the LSST as cosmological probes.
On the Detectability of Planet X with LSST
NASA Astrophysics Data System (ADS)
Trilling, David E.; Bellm, Eric C.; Malhotra, Renu
2018-06-01
Two planetary mass objects in the far outer solar system—collectively referred to here as Planet X—have recently been hypothesized to explain the orbital distribution of distant Kuiper Belt Objects. Neither planet is thought to be exceptionally faint, but the sky locations of these putative planets are poorly constrained. Therefore, a wide-area survey is needed to detect these possible planets. The Large Synoptic Survey Telescope (LSST) will carry out an unbiased, large-area (around 18,000 deg²), deep (limiting magnitude of individual frames of 24.5) survey (the "wide-fast-deep (WFD)" survey) of the southern sky beginning in 2022, and it will therefore be an important tool in searching for these hypothesized planets. Here, we explore the effectiveness of LSST as a search platform for these possible planets. Assuming the current baseline cadence (which includes the WFD survey plus additional coverage), we estimate that LSST will confidently detect or rule out the existence of Planet X in 61% of the entire sky. At orbital distances up to ∼75 au, Planet X could simply be found in the normal nightly moving object processing; at larger distances, it will require custom data processing. We also discuss the implications of a nondetection of Planet X in LSST data.
Machine Learning-based Transient Brokers for Real-time Classification of the LSST Alert Stream
NASA Astrophysics Data System (ADS)
Narayan, Gautham; Zaidi, Tayeb; Soraisam, Monika; ANTARES Collaboration
2018-01-01
The number of transient events discovered by wide-field time-domain surveys already far outstrips the combined followup resources of the astronomical community. This number will only increase as we progress towards the commissioning of the Large Synoptic Survey Telescope (LSST), breaking the community's current followup paradigm. Transient brokers - software to sift through, characterize, annotate and prioritize events for followup - will be a critical tool for managing alert streams in the LSST era. Developing the algorithms that underlie the brokers, and obtaining simulated LSST-like datasets prior to LSST commissioning, to train and test these algorithms are formidable, though not insurmountable challenges. The Arizona-NOAO Temporal Analysis and Response to Events System (ANTARES) is a joint project of the National Optical Astronomy Observatory and the Department of Computer Science at the University of Arizona. We have been developing completely automated methods to characterize and classify variable and transient events from their multiband optical photometry. We describe the hierarchical ensemble machine learning algorithm we are developing, and test its performance on sparse, unevenly sampled, heteroskedastic data from various existing observational campaigns, as well as our progress towards incorporating these into a real-time event broker working on live alert streams from time-domain surveys.
Wood-Vasey DOE #SC0011834 Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood-Vasey, William Michael
During the past reporting period (Year 3), this grant has provided partial support for graduate students Daniel Perrefort and Kara Ponder. They have been exploring different aspects of the technical work needed to take full advantage of the potential for cosmological inference using Type Ia supernovae (SNe Ia) with LSST.
NASA Astrophysics Data System (ADS)
Herrold, Ardis; Bauer, Amanda, Dr.; Peterson, J. Matt; Large Synoptic Survey Telescope Education and Public Outreach Team
2018-01-01
The Large Synoptic Survey Telescope will usher in a new age of astronomical data exploration for science educators and students. LSST data sets will be large, deep, and dynamic, and will establish a time-domain record that will extend over a decade. They will be used to provide engaging, relevant learning experiences. The EPO team will develop online investigations using authentic LSST data that offer varying levels of challenge and depth by the start of telescope operations, slated to begin in 2022. The topics will cover common introductory astronomy concepts and will align with the four science domains of LSST: the Milky Way, the changing sky (transients), solar system (moving) objects, and dark matter and dark energy. Online Jupyter notebooks will make LSST data easy for students at the advanced middle school through college levels to access and analyze. Using online notebooks will circumvent common obstacles caused by firewalls, bandwidth issues, and the need to download software, as they will be accessible from any computer or tablet with internet access. Although the LSST EPO Jupyter notebooks are Python-based, knowledge of programming will not be required to use them. Each topical investigation will include teacher and student versions of Jupyter notebooks, instructional videos, and access to a suite of support materials including a forum, professional development training, and tutorial videos. Jupyter notebooks will contain embedded widgets to process data, eliminating the need to use external spreadsheets and plotting software. Students will be able to analyze data using some of the existing modules already developed for professional astronomers. This will shorten the time needed to conduct investigations and will shift the emphasis to understanding the underlying science themes, which is often lost with novice learners.
Giga-z: A 100,000 Object Superconducting Spectrophotometer for LSST Follow-up
NASA Astrophysics Data System (ADS)
Marsden, Danica W.; Mazin, Benjamin A.; O'Brien, Kieran; Hirata, Chris
2013-09-01
We simulate the performance of a new type of instrument, a Superconducting Multi-Object Spectrograph (SuperMOS), that uses microwave kinetic inductance detectors (MKIDs). MKIDs, a new detector technology, feature good quantum efficiency in the UVOIR, can count individual photons with microsecond timing accuracy, and, like X-ray calorimeters, determine their energy to several percent. The performance of Giga-z, a SuperMOS designed for wide-field imaging follow-up observations, is evaluated using simulated observations of the COSMOS mock catalog with an array of 100,000 MKID pixels with energy resolution R_423 nm = E/ΔE = 30. We compare our results against a simultaneous simulation of LSST observations. In 3 yr on a dedicated 4 m class telescope, Giga-z could observe ≈2 billion galaxies, yielding a low-resolution spectral energy distribution spanning 350-1350 nm for each; 1000 times the number measured with any currently proposed LSST spectroscopic follow-up, at a fraction of the cost and time. Giga-z would provide redshifts for galaxies up to z ≈ 6 with magnitudes m_i ≲ 25, with accuracy σ_Δz/(1 + z) ≈ 0.03 for the whole sample, and σ_Δz/(1 + z) ≈ 0.007 for a select subset. We also find catastrophic failure rates and biases that are consistently lower than for LSST. The added constraint on dark energy parameters for WL + CMB by Giga-z using the FoMSWG default model is equivalent to multiplying the LSST Fisher matrix by a factor of α = 1.27 (w_p), 1.53 (w_a), or 1.98 (Δγ). This is equivalent to multiplying both the LSST coverage area and the training sets by α and reducing all systematics by a factor of 1/√α, advantages that are robust to even more extreme models of intrinsic alignment.
An optical to IR sky brightness model for the LSST
NASA Astrophysics Data System (ADS)
Yoachim, Peter; Coughlin, Michael; Angeli, George Z.; Claver, Charles F.; Connolly, Andrew J.; Cook, Kem; Daniel, Scott; Ivezić, Željko; Jones, R. Lynne; Petry, Catherine; Reuter, Michael; Stubbs, Christopher; Xin, Bo
2016-07-01
To optimize the observing strategy of a large survey such as the LSST, one needs an accurate model of the night sky emission spectrum across a range of atmospheric conditions and from the near-UV to the near-IR. We have used the ESO SkyCalc Sky Model Calculator [1, 2] to construct a library of template spectra for the Chilean night sky. The ESO model includes emission from the upper and lower atmosphere, scattered starlight, scattered moonlight, and zodiacal light. We have then extended the ESO templates with an empirical fit to the twilight sky emission as measured by a Canon all-sky camera installed at the LSST site. With the ESO templates and our twilight model we can quickly interpolate to any arbitrary sky position and date and return the full sky spectrum or surface brightness magnitudes in the LSST filter system. Comparing our model to all-sky observations, we find typical residual RMS values of ±0.2-0.3 magnitudes per square arcsecond.
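The "quickly interpolate to any arbitrary sky position and date" step amounts to interpolating precomputed template spectra over a grid of observing conditions. A minimal one-dimensional sketch is below; the grid values are invented (the real library interpolates full spectra from the ESO SkyCalc model plus the empirical twilight fit, over several condition axes, not just airmass).

```python
import numpy as np

# Precomputed sky surface brightness (mag/arcsec^2) in one filter on a
# coarse airmass grid. These numbers are placeholders for illustration.
airmass_grid = np.array([1.0, 1.5, 2.0, 2.5])
sb_grid = np.array([21.2, 21.0, 20.8, 20.6])

def sky_brightness(airmass):
    """Linearly interpolate the template grid to an arbitrary airmass."""
    return float(np.interp(airmass, airmass_grid, sb_grid))

sb = sky_brightness(1.25)  # between the 1.0 and 1.5 grid points
```

The same lookup-plus-interpolation pattern extends to more axes (moon phase and separation, solar activity, time since sunset) by interpolating over a multidimensional grid of templates.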
Cosmology with the Large Synoptic Survey Telescope: an overview.
Zhan, Hu; Anthony Tyson, J
2018-06-01
The Large Synoptic Survey Telescope (LSST) is a high étendue imaging facility that is being constructed atop Cerro Pachón in northern Chile. It is scheduled to begin science operations in 2022. With an 8.4 m (6.7 m effective) aperture, a novel three-mirror design achieving a seeing-limited 9.6 deg² field of view, and a 3.2 gigapixel camera, the LSST has the deep-wide-fast imaging capability necessary to carry out an 18,000 deg² survey in six passbands (ugrizy) to a coadded depth of r ~ 27.5 over 10 years using 90% of its observational time. The remaining 10% of the time will be devoted to considerably deeper and faster time-domain observations and smaller surveys. In total, each patch of the sky in the main survey will receive 800 visits allocated across the six passbands, with 30 s exposure visits. The huge volume of high-quality LSST data will provide a wide range of science opportunities and, in particular, open a new era of precision cosmology with unprecedented statistical power and tight control of systematic errors. In this review, we give a brief account of the LSST cosmology program with an emphasis on dark energy investigations. The LSST will address dark energy physics and cosmology in general by exploiting diverse precision probes including large-scale structure, weak lensing, type Ia supernovae, galaxy clusters, and strong lensing. Combined with the cosmic microwave background data, these probes form interlocking tests on the cosmological model and the nature of dark energy in the presence of various systematics. The LSST data products will be made available to the US and Chilean scientific communities and to international partners with no proprietary period.
Close collaborations with contemporaneous imaging and spectroscopy surveys observing at a variety of wavelengths, resolutions, depths, and timescales will be a vital part of the LSST science program, which will not only enhance specific studies but, more importantly, also allow a more complete understanding of the Universe through different windows.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Riot, V J; Olivier, S; Bauman, B
2012-05-24
The Large Synoptic Survey Telescope (LSST) uses a novel three-mirror telescope design feeding a camera system that includes a set of broad-band filters and three refractive corrector lenses to produce a flat field at the focal plane with a wide field of view. Optical design of the camera lenses and filters is integrated with the optical design of the telescope mirrors to optimize performance. We discuss the rationale for the LSST camera optics design; describe the methodology for fabricating, coating, mounting, and testing the lenses and filters; and present the results of detailed analyses demonstrating that the camera optics will meet their performance goals.
Advancing the LSST Operations Simulator
NASA Astrophysics Data System (ADS)
Saha, Abhijit; Ridgway, S. T.; Cook, K. H.; Delgado, F.; Chandrasekharan, S.; Petry, C. E.; Operations Simulator Group
2013-01-01
The Operations Simulator for the Large Synoptic Survey Telescope (LSST; http://lsst.org) allows the planning of LSST observations that obey explicit science driven observing specifications, patterns, schema, and priorities, while optimizing against the constraints placed by design-specific opto-mechanical system performance of the telescope facility, site specific conditions (including weather and seeing), as well as additional scheduled and unscheduled downtime. A simulation run records the characteristics of all observations (e.g., epoch, sky position, seeing, sky brightness) in a MySQL database, which can be queried for any desired purpose. Derivative information digests of the observing history database are made with an analysis package called Simulation Survey Tools for Analysis and Reporting (SSTAR). Merit functions and metrics have been designed to examine how suitable a specific simulation run is for several different science applications. This poster reports recent work which has focussed on an architectural restructuring of the code that will allow us to a) use "look-ahead" strategies that avoid cadence sequences that cannot be completed due to observing constraints; and b) examine alternate optimization strategies, so that the most efficient scheduling algorithm(s) can be identified and used: even few-percent efficiency gains will create substantive scientific opportunity. The enhanced simulator will be used to assess the feasibility of desired observing cadences, study the impact of changing science program priorities, and assist with performance margin investigations of the LSST system.
Final Technical Report for DE-SC0012297
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dell'Antonio, Ian
This is the final report on the work performed in award DE-SC0012297, Cosmic Frontier work in support of the LSST Dark Energy Science Collaboration's effort to develop algorithms, simulations, and statistical tests to ensure optimal extraction of the dark energy properties from galaxy clusters observed with LSST. This work focused on effects that could produce a systematic error on the measurement of cluster masses (which will be used to probe the effects of dark energy on the growth of structure). These effects stem from the deviations from pure ellipticity of the gravitational lensing signal and from the blending of light of neighboring galaxies. Both these effects are expected to be more significant for LSST than for stage III experiments such as the Dark Energy Survey. We calculate the magnitude of the mass error (or bias) for the first time and demonstrate that it can be treated as a multiplicative correction and calibrated out, allowing mass measurements of clusters from gravitational lensing to meet the requirements of LSST's dark energy investigation.
Using SysML for MBSE analysis of the LSST system
NASA Astrophysics Data System (ADS)
Claver, Charles F.; Dubois-Felsmann, Gregory; Delgado, Francisco; Hascall, Pat; Marshall, Stuart; Nordby, Martin; Schalk, Terry; Schumacher, German; Sebag, Jacques
2010-07-01
The Large Synoptic Survey Telescope is a complex hardware-software system of systems, making up a highly automated observatory in the form of an 8.4m wide-field telescope, a 3.2 billion pixel camera, and a peta-scale data processing and archiving system. As a project, the LSST is using model based systems engineering (MBSE) methodology for developing the overall system architecture coded with the Systems Modeling Language (SysML). With SysML we use a recursive process to establish three-fold relationships between requirements, logical & physical structural component definitions, and overall behavior (activities and sequences) at successively deeper levels of abstraction and detail. Using this process we have analyzed and refined the LSST system design, ensuring the consistency and completeness of the full set of requirements and their match to associated system structure and behavior. As the recursion process proceeds to deeper levels we derive more detailed requirements and specifications, and ensure their traceability. We also expose, define, and specify critical system interfaces, physical and information flows, and clarify the logic and control flows governing system behavior. The resulting integrated model database is used to generate documentation and specifications and will evolve to support activities from construction through final integration, test, and commissioning, serving as a living representation of the LSST as designed and built. We discuss the methodology and present several examples of its application to specific systems engineering challenges in the LSST design.
Astroinformatics in the Age of LSST: Analyzing the Summer 2012 Data Release
NASA Astrophysics Data System (ADS)
Borne, Kirk D.; De Lee, N. M.; Stassun, K.; Paegert, M.; Cargile, P.; Burger, D.; Bloom, J. S.; Richards, J.
2013-01-01
The Large Synoptic Survey Telescope (LSST) will image the visible southern sky every three nights. This multi-band, multi-epoch survey will produce a torrent of data, which traditional methods of object-by-object data analysis will not be able to accommodate; hence the need for new astroinformatics tools to visualize, simulate, mine, and analyze this quantity of data. The Berkeley Center for Time-Domain Informatics (CTDI) is building the informatics infrastructure for generic light curve classification, including the innovation of new algorithms for feature generation and machine learning. The CTDI portal (http://dotastro.org) contains one of the largest collections of public light curves, with visualization and exploration tools. The group has also published the first calibrated probabilistic classification catalog of 50k variable stars along with a data exploration portal called http://bigmacc.info. Twice a year, the LSST collaboration releases simulated LSST data in order to aid software development. This poster also showcases a suite of new tools from the Vanderbilt Initiative in Data-intensive Astrophysics (VIDA), designed to take advantage of these large data sets. VIDA's Filtergraph interactive web tool allows one to instantly create an interactive data portal for fast, real-time visualization of large data sets. Filtergraph enables quick selection of interesting objects by easily filtering on many different columns, 2-D and 3-D representations, and on-the-fly arithmetic calculations on the data. It also makes sharing the data and the tool with collaborators very easy. The EB/RRL Factory is a neural-network-based variable star classifier, designed to quickly identify variable stars in a variety of classes from LSST light curve data (currently tuned to Eclipsing Binaries and RR Lyrae stars), and to provide likelihood-based orbital elements or stellar parameters as appropriate.
Finally, the LCsimulator software allows one to create simulated light curves of multiple types of variable stars based on an LSST cadence.
NASA Astrophysics Data System (ADS)
Delgado, Francisco; Saha, Abhijit; Chandrasekharan, Srinivasan; Cook, Kem; Petry, Catherine; Ridgway, Stephen
2014-08-01
The Operations Simulator for the Large Synoptic Survey Telescope (LSST; http://www.lsst.org) allows the planning of LSST observations that obey explicit science driven observing specifications, patterns, schema, and priorities, while optimizing against the constraints placed by design-specific opto-mechanical system performance of the telescope facility, site specific conditions as well as additional scheduled and unscheduled downtime. It has a detailed model to simulate the external conditions with real weather history data from the site, a fully parameterized kinematic model for the internal conditions of the telescope, camera and dome, and serves as a prototype for an automatic scheduler for the real time survey operations with LSST. The Simulator is a critical tool that has been key since very early in the project, to help validate the design parameters of the observatory against the science requirements and the goals from specific science programs. A simulation run records the characteristics of all observations (e.g., epoch, sky position, seeing, sky brightness) in a MySQL database, which can be queried for any desired purpose. Derivative information digests of the observing history are made with an analysis package called Simulation Survey Tools for Analysis and Reporting (SSTAR). Merit functions and metrics have been designed to examine how suitable a specific simulation run is for several different science applications. Software to efficiently compare the efficacy of different survey strategies for a wide variety of science applications using such a growing set of metrics is under development. 
A recent restructuring of the code allows us to a) use "look-ahead" strategies that avoid cadence sequences that cannot be completed due to observing constraints; and b) examine alternate optimization strategies, so that the most efficient scheduling algorithm(s) can be identified and used: even few-percent efficiency gains will create substantive scientific opportunity. The enhanced simulator is being used to assess the feasibility of desired observing cadences, study the impact of changing science program priorities and assist with performance margin investigations of the LSST system.
Stellar Populations with the LSST
NASA Astrophysics Data System (ADS)
Saha, Abhijit; Olsen, K.; LSST Stellar Populations Collaboration
2006-12-01
The LSST will produce a multi-color map and photometric object catalog of half the sky to g ~ 27.5 (5σ). Strategically cadenced time-space sampling of each field spanning ten years will allow variability, proper motion and parallax measurements for objects brighter than g ~ 25. As part of providing an unprecedented map of the Galaxy, the accurate multi-band photometry will permit photometric parallaxes, chemical abundances and a handle on ages via colors at turn-off for main-sequence stars at all distances within the Galaxy, permitting a comprehensive study of star formation histories (SFH) and chemical evolution for field stars. With a geometric parallax accuracy of 1 mas, LSST will produce a robust complete sample of the solar neighborhood stars. While delivering parallax accuracy comparable to HIPPARCOS, LSST will extend the catalog to a limit more than 10 magnitudes fainter, and will be complete to MV ~ 15. In the Magellanic Clouds too, the photometry will reach MV ~ +8, allowing the SFH and chemical signatures in the expansive outer extremities to be gleaned from their main sequence stars. This in turn will trace the detailed interaction of the Clouds with the Galaxy halo. The LSST time sampling will identify and characterize variable stars of all types, from time scales of 1 hr to several years, a feast for variable star astrophysics. Cepheids and LPVs in all galaxies in the Sculptor, M83 and Cen-A groups are obvious data products: comparative studies will reveal systematic differences with galaxy properties, and help to fine-tune the rungs of the distance ladder. Dwarf galaxies within 10 Mpc that are too faint to find from surface brightness enhancements will be revealed via over-densities of their red giants: this systematic census will extend the luminosity function of galaxies to the faint limit. Novae discovered by LSST time sampling will trace intergalactic stars out to the Virgo and Fornax clusters.
LSST Probes of Dark Energy: New Energy vs New Gravity
NASA Astrophysics Data System (ADS)
Bradshaw, Andrew; Tyson, A.; Jee, M. J.; Zhan, H.; Bard, D.; Bean, R.; Bosch, J.; Chang, C.; Clowe, D.; Dell'Antonio, I.; Gawiser, E.; Jain, B.; Jarvis, M.; Kahn, S.; Knox, L.; Newman, J.; Wittman, D.; Weak Lensing, LSST; LSS Science Collaborations
2012-01-01
Is the late-time acceleration of the universe due to new physics in the form of stress-energy or a departure from General Relativity? LSST will measure the shape, magnitude, and color of 4×10^9 galaxies to high S/N over 18,000 square degrees. These data will be used to separately measure the gravitational growth of mass structure and distance vs. redshift to unprecedented precision by combining multiple probes in a joint analysis. Of the five LSST probes of dark energy, the weak gravitational lensing (WL) and baryon acoustic oscillation (BAO) probes are particularly effective in combination. By measuring the 2-D BAO scale in ugrizy-band photometric-redshift-selected samples, LSST will determine the angular diameter distance to a dozen redshifts with sub-percent-level errors. Reconstruction of the WL shear power spectrum on linear and weakly non-linear scales, and of the cross-correlation of shear measured in different photometric redshift bins, provides a constraint on the evolution of dark energy that is complementary to the purely geometric measures provided by supernovae and BAO. Cross-correlation of the WL shear and BAO signal within redshift shells minimizes the sensitivity to systematics. LSST will also detect shear peaks, providing independent constraints. Tomographic study of the shear of background galaxies as a function of redshift allows a geometric test of dark energy. To extract the dark energy signal and distinguish between the two forms of new physics, LSST will rely on accurate stellar point-spread functions (PSF) and unbiased reconstruction of galaxy image shapes from hundreds of exposures. Although a weighted co-added deep image has high S/N, it is a form of lossy compression. Bayesian forward-modeling algorithms can in principle use all the information. We explore systematic effects on shape measurements and present tests of an algorithm called Multi-Fit, which appears to avoid PSF-induced shear systematics in a computationally efficient way.
Giga-z: A 100,000 OBJECT SUPERCONDUCTING SPECTROPHOTOMETER FOR LSST FOLLOW-UP
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marsden, Danica W.; Mazin, Benjamin A.; O'Brien, Kieran
2013-09-15
We simulate the performance of a new type of instrument, a Superconducting Multi-Object Spectrograph (SuperMOS), that uses microwave kinetic inductance detectors (MKIDs). MKIDs, a new detector technology, feature good quantum efficiency in the UVOIR, can count individual photons with microsecond timing accuracy, and, like X-ray calorimeters, determine their energy to several percent. The performance of Giga-z, a SuperMOS designed for wide field imaging follow-up observations, is evaluated using simulated observations of the COSMOS mock catalog with an array of 100,000 MKID pixels with R(423 nm) = E/ΔE = 30. We compare our results against a simultaneous simulation of LSST observations. In 3 yr on a dedicated 4 m class telescope, Giga-z could observe ≈2 billion galaxies, yielding a low-resolution spectral energy distribution spanning 350-1350 nm for each; 1000 times the number measured with any currently proposed LSST spectroscopic follow-up, at a fraction of the cost and time. Giga-z would provide redshifts for galaxies up to z ≈ 6 with magnitudes m_i ≲ 25, with accuracy σ_Δz/(1+z) ≈ 0.03 for the whole sample, and σ_Δz/(1+z) ≈ 0.007 for a select subset. We also find catastrophic failure rates and biases that are consistently lower than for LSST. The added constraint on dark energy parameters for WL + CMB by Giga-z using the FoMSWG default model is equivalent to multiplying the LSST Fisher matrix by a factor of α = 1.27 (w_p), 1.53 (w_a), or 1.98 (Δγ). This is equivalent to multiplying both the LSST coverage area and the training sets by α and reducing all systematics by a factor of 1/√α, advantages that are robust to even more extreme models of intrinsic alignment.
Mechanical Design of the LSST Camera
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nordby, Martin; Bowden, Gordon; Foss, Mike
2008-06-13
The LSST camera is a tightly packaged, hermetically-sealed system that is cantilevered into the main beam of the LSST telescope. It comprises three refractive lenses, on-board storage for five large filters, a high-precision shutter, and a cryostat that houses the 3.2 giga-pixel CCD focal plane along with its support electronics. The physically large optics and focal plane demand large structural elements to support them, but the overall size of the camera and its components must be minimized to reduce impact on the image stability. Also, focal plane and optics motions must be minimized to reduce systematic errors in image reconstruction. Design and analysis for the camera body and cryostat will be detailed.
Designing for Peta-Scale in the LSST Database
NASA Astrophysics Data System (ADS)
Kantor, J.; Axelrod, T.; Becla, J.; Cook, K.; Nikolaev, S.; Gray, J.; Plante, R.; Nieto-Santisteban, M.; Szalay, A.; Thakar, A.
2007-10-01
The Large Synoptic Survey Telescope (LSST), a proposed ground-based 8.4 m telescope with a 10 deg^2 field of view, will generate 15 TB of raw images every observing night. When calibration and processed data are added, the image archive, catalogs, and meta-data will grow 15 PB yr^{-1} on average. The LSST Data Management System (DMS) must capture, process, store, index, replicate, and provide open access to this data. Alerts must be triggered within 30 s of data acquisition. To do this in real-time at these data volumes will require advances in data management, database, and file system techniques. This paper describes the design of the LSST DMS and emphasizes features for peta-scale data. The LSST DMS will employ a combination of distributed database and file systems, with schema, partitioning, and indexing oriented for parallel operations. Image files are stored in a distributed file system with references to, and meta-data from, each file stored in the databases. The schema design supports pipeline processing, rapid ingest, and efficient query. Vertical partitioning reduces disk input/output requirements, while horizontal partitioning allows parallel data access using arrays of servers and disks. Indexing is extensive, utilizing both conventional RAM-resident indexes and column-narrow, row-deep tag tables/covering indices that are extracted from tables that contain many more attributes. The DMS Data Access Framework is encapsulated in a middleware framework to provide a uniform service interface to all framework capabilities. This framework will provide the automated work-flow, replication, and data analysis capabilities necessary to make data processing and data quality analysis feasible at this scale.
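The horizontal-partitioning and tag-table ideas above can be illustrated in a few lines. The chunking rule and the column choices below are invented for the sketch; the actual LSST DMS partitioner and schema are far richer.

```python
# Toy sketch of horizontal (spatial) partitioning plus a narrow
# "tag table" (covering index); the chunking rule and columns are
# invented, not the actual LSST DMS scheme.
def chunk_for(ra_deg, n_chunks=8):
    """Map a right ascension in degrees to a partition (server) id."""
    return int(ra_deg % 360.0 // (360.0 / n_chunks))

# A full catalog row carries hundreds of attributes; the tag table
# keeps only the hot columns, so common queries scan far less disk.
full_row = {"objectId": 42, "ra": 123.4, "dec": -5.6,
            "gMag": 21.3, "rMag": 20.8}  # ...plus many more attributes
tag_row = {k: full_row[k] for k in ("objectId", "ra", "dec", "rMag")}
```

Queries restricted to a sky region then touch only the servers whose chunks intersect that region, which is what enables parallel access across arrays of servers and disks.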
LSST telescope and site status
NASA Astrophysics Data System (ADS)
Gressler, William J.
2016-07-01
The Large Synoptic Survey Telescope (LSST) Project received its construction authorization from the National Science Foundation in August 2014. The Telescope and Site (T and S) group has made considerable progress towards completion in subsystems required to support the scope of the LSST science mission. The LSST goal is to conduct a wide, fast, deep survey via a 3-mirror wide field of view optical design, a 3.2-Gpixel camera, and an automated data processing system. The summit facility is currently under construction on Cerro Pachón in Chile, with major vendor subsystem deliveries and integration planned over the next several years. This paper summarizes the status of the activities of the T and S group, tasked with design, analysis, and construction of the summit and base facilities and infrastructure necessary to control the survey, capture the light, and calibrate the data. All major telescope work package procurements have been awarded to vendors and are in varying stages of design and fabrication maturity and completion. The unique M1M3 primary/tertiary mirror polishing effort is completed and the mirror now resides in storage awaiting future testing. Significant progress has been achieved on all the major telescope subsystems including the summit facility, telescope mount assembly, dome, hexapod and rotator systems, coating plant, base facility, and the calibration telescope. In parallel, in-house efforts, including the software needed to control the observatory such as the scheduler and the active optics control, have also seen substantial advancement. The progress and status of these subsystems and future LSST plans during this construction phase are presented.
The LSST Scheduler from design to construction
NASA Astrophysics Data System (ADS)
Delgado, Francisco; Reuter, Michael A.
2016-07-01
The Large Synoptic Survey Telescope (LSST) will be a highly robotic facility, demanding a very high efficiency during its operation. To achieve this, the LSST Scheduler has been envisioned as an autonomous software component of the Observatory Control System (OCS), that selects the sequence of targets in real time. The Scheduler will drive the survey using optimization of a dynamic cost function of more than 200 parameters. Multiple science programs produce thousands of candidate targets for each observation, and multiple telemetry measurements are received to evaluate the external and the internal conditions of the observatory. The design of the LSST Scheduler started early in the project supported by Model Based Systems Engineering, detailed prototyping and scientific validation of the survey capabilities required. In order to build such a critical component, an agile development path in incremental releases is presented, integrated to the development plan of the Operations Simulator (OpSim) to allow constant testing, integration and validation in a simulated OCS environment. The final product is a Scheduler that is also capable of running 2000 times faster than real time in simulation mode for survey studies and scientific validation during commissioning and operations.
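Cost-function target selection of the kind described above can be sketched as follows. The feature names and weights are invented stand-ins for the Scheduler's 200+ real parameters:

```python
# Minimal sketch of cost-function target selection; feature names and
# weights are hypothetical, not the Scheduler's actual parameters.
def score(target, weights):
    """Weighted sum of target features (higher is better)."""
    return sum(w * target[k] for k, w in weights.items())

weights = {"science_priority": 1.0, "slew_penalty": -0.5, "airmass_penalty": -0.3}
candidates = [
    {"name": "field_a", "science_priority": 0.9, "slew_penalty": 0.2, "airmass_penalty": 0.1},
    {"name": "field_b", "science_priority": 0.7, "slew_penalty": 0.8, "airmass_penalty": 0.4},
]
# At each decision point the scheduler picks the highest-scoring candidate.
best = max(candidates, key=lambda t: score(t, weights))
print(best["name"])  # field_a
```

In the real system the weights and penalties are updated continuously from telemetry (seeing, sky brightness, downtime), which is why the function must be re-evaluated for thousands of candidates in real time.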
LSST and the Physics of the Dark Universe
Tyson, Anthony [UC Davis, California, United States]
2017-12-09
The physics that underlies the accelerating cosmic expansion is unknown. This 'dark energy' and the equally mysterious 'dark matter' comprise most of the mass-energy of the universe and lie outside the standard model. Recent advances in optics, detectors, and information technology have led to the design of a facility that will repeatedly image an unprecedented volume of the universe: LSST. For the first time, the sky will be surveyed wide, deep and fast. The history of astronomy has taught us repeatedly that there are surprises whenever we view the sky in a new way. I will review the technology of LSST, and focus on several independent probes of the nature of dark energy and dark matter. These new investigations will rely on the statistical precision obtainable with billions of galaxies.
Mapping the Solar System with LSST
NASA Astrophysics Data System (ADS)
Ivezic, Z.; Juric, M.; Lupton, R.; Connolly, A.; Kubica, J.; Moore, A.; Harris, A.; Bowell, T.; Bernstein, G.; Stubbs, C.; LSST Collaboration
2004-12-01
The currently considered LSST cadence, based on two 10 sec exposures, may result in orbital parameters, light curves and accurate colors for over a million main-belt asteroids (MBA), and about 20,000 trans-Neptunian objects (TNO). Compared to the current state-of-the-art, this sample would represent a factor of 5 increase in the number of MBAs with known orbits, a factor of 20 increase in the number of MBAs with known orbits and accurate color measurements, and a factor of 100 increase in the number of MBAs with measured variability properties. The corresponding sample increase for TNOs is 10, 100, and 1000, respectively. The LSST MBA and TNO samples will enable detailed studies of the dynamical and chemical history of the solar system. For example, they will constrain the MBA size distribution for objects larger than 100 m, and TNO size distribution for objects larger than 100 km, their physical state through variability measurements (solid body vs. a rubble pile), as well as their surface chemistry through color measurements. A proposed deep TNO survey, based on 1 hour exposures, may result in a sample of about 100,000 TNOs, while spending only 10% of the LSST observing time. Such a deep TNO survey would be capable of discovering Sedna-like objects at distances beyond 150 AU, thereby increasing the observable Solar System volume by about a factor of 7. The increase in data volume associated with LSST asteroid science will present many computational challenges to how we might extract tracks and orbits of asteroids from the underlying clutter. Tree-based algorithms for multihypothesis testing of asteroid tracks can help solve these challenges by providing the necessary 1000-fold speed-ups over current approaches while recovering 95% of the underlying asteroid populations.
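The pruning idea at the heart of tree-based track algorithms — discard detection pairs whose apparent motion between nights is physically implausible — can be sketched with a brute-force toy. Real implementations use kd-trees to avoid the quadratic scan; the coordinates and rate limit below are illustrative:

```python
# Toy sketch of motion-consistent pair linking between two nights;
# real multihypothesis trackers use tree structures for speed.
def link_pairs(night1, night2, max_rate, dt):
    """Return (i, j) index pairs whose separation is <= max_rate * dt."""
    pairs = []
    for i, (x1, y1) in enumerate(night1):
        for j, (x2, y2) in enumerate(night2):
            if ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5 <= max_rate * dt:
                pairs.append((i, j))
    return pairs

n1 = [(0.0, 0.0), (5.0, 5.0)]   # detections on night 1 (deg)
n2 = [(0.1, 0.0), (9.0, 9.0)]   # detections on night 2 (deg)
print(link_pairs(n1, n2, max_rate=0.5, dt=1.0))  # [(0, 0)]
```

Replacing the inner loop with a spatial-tree range query is what yields the 1000-fold speed-ups the abstract refers to, since each detection then only examines nearby candidates.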
Multi-Wavelength Spectroscopy of Tidal Disruption Flares: A Legacy Sample for the LSST Era
NASA Astrophysics Data System (ADS)
Cenko, Stephen
2017-08-01
When a star passes within the sphere of disruption of a massive black hole, tidal forces will overcome self-gravity and unbind the star. While approximately half of the stellar debris is ejected at high velocities, the remaining material stays bound to the black hole and accretes, resulting in a luminous, long-lived transient known as a tidal disruption flare (TDF). In addition to serving as unique laboratories for accretion physics, TDFs offer the hope of measuring black hole masses in galaxies much too distant for resolved kinematic studies. In order to realize this potential, we must better understand the detailed processes by which the bound debris circularizes and forms an accretion disk. Spectroscopy is critical to this effort, as emission and absorption line diagnostics provide insight into the location and physical state (velocity, density, composition) of the emitting gas (in analogy with quasars). UV spectra are particularly critical, as most strong atomic features fall in this bandpass, and high-redshift TDF discoveries from LSST will sample rest-frame UV wavelengths. Here we propose to obtain a sequence of UV (HST) and optical (Gemini/GMOS) spectra for a sample of 5 TDFs discovered by the Zwicky Transient Facility, doubling the number of TDFs with UV spectra. Our observations will directly test models for the generation of the UV/optical emission (circularization vs. reprocessing) by searching for outflows and measuring densities, temperatures, and composition as a function of time. This effort is critical to developing the framework by which we can infer black hole properties (e.g., mass) from LSST TDF discoveries.
Management evolution in the LSST project
NASA Astrophysics Data System (ADS)
Sweeney, Donald; Claver, Charles; Jacoby, Suzanne; Kantor, Jeffrey; Krabbendam, Victor; Kurita, Nadine
2010-07-01
The Large Synoptic Survey Telescope (LSST) project has evolved from just a few staff members in 2003 to about 100 in 2010; the affiliation of four founding institutions has grown to 32 universities, government laboratories, and industry partners. The public-private collaboration aims to complete the estimated $450M observatory in the 2017 timeframe. During the design phase of the project, from 2003 to the present, the management structure has been remarkably stable. At the same time, the funding levels, staffing levels and scientific community participation have grown dramatically. The LSSTC has introduced project controls and tools required to manage the LSST's complex funding model, technical structure and distributed work force. Project controls have been configured to comply with the requirements of federal funding agencies. Some of these tools for risk management, configuration control and resource-loaded scheduling have been effective and others have not. Technical tasks associated with building the LSST are distributed into three subsystems: Telescope & Site, Camera, and Data Management. Each subsystem has its own experienced Project Manager and System Scientist. Delegation of authority is enabling and effective; it encourages a strong sense of ownership within the project. At the project level, subsystem management follows the principle that there is one Board of Directors, Director, and Project Manager who have overall authority.
Optimizing the LSST Dither Pattern for Survey Uniformity
NASA Astrophysics Data System (ADS)
Awan, Humna; Gawiser, Eric J.; Kurczynski, Peter; Carroll, Christopher M.; LSST Dark Energy Science Collaboration
2015-01-01
The Large Synoptic Survey Telescope (LSST) will gather detailed data of the southern sky, enabling unprecedented study of Baryonic Acoustic Oscillations, which are an important probe of dark energy. These studies require a survey with highly uniform depth, and we aim to find an observation strategy that optimizes this uniformity. We have shown that in the absence of dithering (large telescope-pointing offsets), the LSST survey will vary significantly in depth. Hence, we implemented various dithering strategies, including random and repulsive-random pointing offsets and spiral patterns with the spiral reaching completion in either a few months or the entire ten-year run. We employed three different implementations of dithering strategies: a single offset assigned to all fields observed on each night, offsets assigned to each field independently whenever the field is observed, and offsets assigned to each field only when the field is observed on a new night. Our analysis reveals that large dithers are crucial to guarantee survey uniformity and that assigning dithers to each field independently whenever the field is observed significantly increases this uniformity. These results suggest paths towards an optimal observation strategy that will enable LSST to achieve its science goals. We gratefully acknowledge support from the National Science Foundation REU program at Rutgers, PHY-1263280, and the Department of Energy, DE-SC0011636.
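The simplest of the strategies above — one random offset per night, shared by all fields — can be sketched as follows; the 1.75-degree maximum offset is illustrative:

```python
import random

# Sketch of the "one offset per night, shared by all fields" dithering
# strategy; the maximum offset value is illustrative only.
def nightly_dithers(n_nights, max_offset_deg, seed=42):
    """One (dx, dy) pointing offset per night, drawn uniformly."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    return [(rng.uniform(-max_offset_deg, max_offset_deg),
             rng.uniform(-max_offset_deg, max_offset_deg))
            for _ in range(n_nights)]

# Every field observed on night i gets offsets[i] added to its pointing.
offsets = nightly_dithers(n_nights=3650, max_offset_deg=1.75)
```

The per-field variants differ only in the keying: offsets are drawn per (field, visit) or per (field, night) rather than per night, which is what decorrelates the depth pattern between neighboring fields.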
Delta Doping High Purity CCDs and CMOS for LSST
NASA Technical Reports Server (NTRS)
Blacksberg, Jordana; Nikzad, Shouleh; Hoenk, Michael; Elliott, S. Tom; Bebek, Chris; Holland, Steve; Kolbe, Bill
2006-01-01
A viewgraph presentation describing delta doping of high purity CCDs and CMOS for LSST is shown. The topics include: 1) Overview of JPL's versatile back-surface process for CCDs and CMOS; 2) Application to SNAP and ORION missions; 3) Delta doping as a back-surface electrode for fully depleted LBNL CCDs; 4) Delta doping high purity CCDs for SNAP and ORION; 5) JPL CMP thinning process development; and 6) Antireflection coating process development.
The LSST metrics analysis framework (MAF)
NASA Astrophysics Data System (ADS)
Jones, R. L.; Yoachim, Peter; Chandrasekharan, Srinivasan; Connolly, Andrew J.; Cook, Kem H.; Ivezic, Željko; Krughoff, K. S.; Petry, Catherine; Ridgway, Stephen T.
2014-07-01
We describe the Metrics Analysis Framework (MAF), an open-source python framework developed to provide a user-friendly, customizable, easily-extensible set of tools for analyzing data sets. MAF is part of the Large Synoptic Survey Telescope (LSST) Simulations effort. Its initial goal is to provide a tool to evaluate LSST Operations Simulation (OpSim) simulated surveys to help understand the effects of telescope scheduling on survey performance; however, MAF can be applied to a much wider range of datasets. The building blocks of the framework are Metrics (algorithms to analyze a given quantity of data), Slicers (subdividing the overall data set into smaller data slices as relevant for each Metric), and Database classes (to access the dataset and read data into memory). We describe how these building blocks work together, and provide an example of using MAF to evaluate different dithering strategies. We also outline how users can write their own custom Metrics and use these within the framework.
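The Metric/Slicer division of labor can be shown schematically. The class names and interfaces below are invented for the sketch and differ from the real MAF API:

```python
# Schematic of the Metric/Slicer pattern; class names and interfaces
# are hypothetical, not the actual MAF classes.
class MeanSeeingMetric:
    """Metric: reduce one data slice to a single number."""
    def run(self, data_slice):
        vals = [row["seeing"] for row in data_slice]
        return sum(vals) / len(vals)

class FilterSlicer:
    """Slicer: subdivide the data set into one slice per filter band."""
    def slice(self, data):
        slices = {}
        for row in data:
            slices.setdefault(row["band"], []).append(row)
        return slices

data = [{"band": "r", "seeing": 0.8}, {"band": "r", "seeing": 1.0},
        {"band": "g", "seeing": 0.6}]
metric = MeanSeeingMetric()
results = {band: metric.run(s)
           for band, s in FilterSlicer().slice(data).items()}
print(results)  # {'r': 0.9, 'g': 0.6}
```

Because any Metric can be paired with any Slicer, one analysis algorithm can be evaluated per filter, per sky pixel, or per night without rewriting it; that composability is the framework's central design choice.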
Detection of Double White Dwarf Binaries with Gaia, LSST and eLISA
NASA Astrophysics Data System (ADS)
Korol, V.; Rossi, E. M.; Groot, P. J.
2017-03-01
According to simulations, around 10^8 double degenerate white dwarf (DWD) binaries are expected to be present in the Milky Way. Due to their intrinsic faintness, the detection of these systems is a challenge, and the total number of detected sources so far amounts to only a few tens. This will change in the next two decades with the advent of Gaia, the LSST and eLISA. We present an estimate of how many compact DWDs with orbital periods less than a few hours we will be able to detect (1) through electromagnetic radiation with Gaia and LSST and (2) through gravitational wave radiation with eLISA. We find that the sample of simultaneous electromagnetic and gravitational wave detections is expected to be substantial, and will provide us with a powerful tool for probing white dwarf astrophysics and the structure of the Milky Way, ushering in the era of multi-messenger astronomy for these sources.
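For a circular binary, the dominant gravitational-wave emission is at twice the orbital frequency, which is what places hour-scale DWDs in eLISA's millihertz band; a one-line check:

```python
# Dominant gravitational-wave frequency of a circular binary:
# f_GW = 2 / P_orb (twice the orbital frequency).
def gw_frequency_hz(period_hours):
    return 2.0 / (period_hours * 3600.0)

# A 20-minute double white dwarf emits near 1.7 mHz, inside the
# milli-hertz band targeted by eLISA.
f = gw_frequency_hz(20.0 / 60.0)
```

This simple scaling is why only the shortest-period systems (minutes to a few hours) are simultaneously detectable electromagnetically and in gravitational waves.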
NASA Astrophysics Data System (ADS)
Marshall, Stuart; Thaler, Jon; Schalk, Terry; Huffer, Michael
2006-06-01
The LSST Camera Control System (CCS) will manage the activities of the various camera subsystems and coordinate those activities with the LSST Observatory Control System (OCS). The CCS comprises a set of modules (nominally implemented in software) which are each responsible for managing one camera subsystem. Generally, a control module will be a long lived "server" process running on an embedded computer in the subsystem. Multiple control modules may run on a single computer or a module may be implemented in "firmware" on a subsystem. In any case control modules must exchange messages and status data with a master control module (MCM). The main features of this approach are: (1) control is distributed to the local subsystem level; (2) the systems follow a "Master/Slave" strategy; (3) coordination will be achieved by the exchange of messages through the interfaces between the CCS and its subsystems. The interface between the camera data acquisition system and its downstream clients is also presented.
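The master/slave message exchange between the MCM and subsystem control modules can be caricatured in a few lines; the module names and message format here are invented for the sketch, not the CCS protocol:

```python
# Toy caricature of the CCS master/slave pattern; module names and the
# message format are hypothetical.
class SubsystemModule:
    """A long-lived per-subsystem control module (a "slave")."""
    def __init__(self, name):
        self.name = name
        self.status = "idle"

    def handle(self, command):
        self.status = command  # e.g. "configure", "expose"
        return {"from": self.name, "status": self.status}

class MasterControlModule:
    """The MCM routes commands to modules and collects their status."""
    def __init__(self, modules):
        self.modules = {m.name: m for m in modules}

    def send(self, target, command):
        return self.modules[target].handle(command)

mcm = MasterControlModule([SubsystemModule("shutter"),
                           SubsystemModule("cryostat")])
reply = mcm.send("shutter", "expose")
```

In the real system the modules are separate processes (or firmware) on embedded computers and the exchange happens over an inter-process messaging layer, but the control topology is the same: one master, many local slaves.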
NASA Astrophysics Data System (ADS)
Delgado, Francisco; Schumacher, German
2014-08-01
The Large Synoptic Survey Telescope (LSST) is a complex system of systems with demanding performance and operational requirements. The nature of its scientific goals requires a special Observatory Control System (OCS) and particularly a very specialized automatic Scheduler. The OCS Scheduler is an autonomous software component that drives the survey, selecting the detailed sequence of visits in real time, taking into account multiple science programs, the current external and internal conditions, and the history of observations. We have developed a SysML model for the OCS Scheduler that fits coherently in the OCS and LSST integrated model. We have also developed a prototype of the Scheduler that implements the scheduling algorithms in the simulation environment provided by the Operations Simulator, where the environment and the observatory are modeled with real weather data and detailed kinematics parameters. This paper expands on the Scheduler architecture and the proposed algorithms to achieve the survey goals.
A rigid and thermally stable all ceramic optical support bench assembly for the LSST Camera
NASA Astrophysics Data System (ADS)
Kroedel, Matthias; Langton, J. Brian; Wahl, Bill
2017-09-01
This paper presents the ceramic design, fabrication and metrology results, and assembly plan of the LSST camera optical bench structure, which uses the unique manufacturing features of the HB-Cesic technology. The optical bench assembly consists of a rigid "Grid" supporting individual raft plates, which mount sensor assemblies by way of a rigid kinematic support system to meet extremely stringent requirements for focal plane planarity and stability.
Solar System Science with LSST
NASA Astrophysics Data System (ADS)
Jones, R. L.; Chesley, S. R.; Connolly, A. J.; Harris, A. W.; Ivezic, Z.; Knezevic, Z.; Kubica, J.; Milani, A.; Trilling, D. E.
2008-09-01
The Large Synoptic Survey Telescope (LSST) will provide a unique tool to study moving objects throughout the solar system, creating massive catalogs of Near Earth Objects (NEOs), asteroids, Trojans, TransNeptunian Objects (TNOs), comets and planetary satellites with well-measured orbits and high quality, multi-color photometry accurate to 0.005 magnitudes for the brightest objects. In the baseline LSST observing plan, back-to-back 15-second images will reach a limiting magnitude as faint as r=24.7 in each 9.6 square degree image, twice per night; a total of approximately 15,000 square degrees of the sky will be imaged in multiple filters every 3 nights. This time sampling will continue throughout each lunation, creating a huge database of observations. [Fig. 1: Sky coverage of LSST over 10 years; separate panels for each of the 6 LSST filters. Color bars indicate the number of observations in each filter.] The catalogs will include more than 80% of the potentially hazardous asteroids larger than 140 m in diameter within the first 10 years of LSST operation, millions of main-belt asteroids and perhaps 20,000 Trans-Neptunian Objects. Objects with diameters as small as 100 m in the Main Belt and <100 km in the Kuiper Belt can be detected in individual images. Specialized `deep drilling' observing sequences will detect KBOs down to tens of kilometers in diameter. Long-period comets will be detected at larger distances than previously possible, constraining models of the Oort cloud. With the large number of objects expected in the catalogs, it may be possible to observe a pristine comet start outgassing on its first journey into the inner solar system. By observing fields over a wide range of ecliptic longitudes and latitudes, including large separations from the ecliptic plane, these catalogs will not only greatly increase the numbers of known objects but also much improve the characterization of the inclination distributions of these populations.
Derivation of proper elements for main belt and Trojan asteroids will allow ever finer resolution of asteroid families and their size-frequency distribution, as well as the study of the long-term dynamics of the individual asteroids and the asteroid belt as a whole. [Fig. 2: Orbital parameters of Main Belt Asteroids, color-coded according to ugriz colors measured by SDSS. The left panel shows osculating elements, the right panel shows proper elements; note the asteroid families visible as clumps in parameter space [1].] By obtaining multi-color ugrizy data for a substantial fraction of objects, relationships between color and dynamical history can be established. This will also enable taxonomic classification of asteroids, provide further links between diverse populations such as irregular satellites and TNOs or planetary Trojans, and enable estimates of asteroid diameter with rms uncertainty of 30%. With the addition of light-curve information, rotation periods and phase curves can be measured for large fractions of each population, leading to new insight into physical characteristics. Photometric variability information, together with sparse lightcurve inversion, will allow spin state and shape estimation for up to two orders of magnitude more objects than presently known. This will leverage physical studies of asteroids by constraining the size-strength relationship, which has important implications for the internal structure (solid, fractured, rubble pile) and in turn the collisional evolution of the asteroid belt. Similar information can be gained for other solar system bodies. [1] Parker, A., Ivezic
Surveying the Inner Solar System with an Infrared Space Telescope
NASA Astrophysics Data System (ADS)
Buie, Marc W.; Reitsema, Harold J.; Linfield, Roger P.
2016-11-01
We present an analysis of surveying the inner solar system for objects that may pose some threat to Earth. Most of the analysis is based on understanding the capability provided by Sentinel, a concept for an infrared space-based telescope placed in a heliocentric orbit near the distance of Venus. From this analysis, we show that (1) the size range being targeted can affect the survey design, (2) the orbit distribution of the target sample can affect the survey design, (3) minimum observational arc length during the survey is an important metric of survey performance, and (4) surveys must consider objects as small as D = 15-30 m to meet the goal of identifying objects that have the potential to cause damage on Earth in the next 100 yr. Sentinel will be able to find 50% of all impactors larger than 40 m in a 6.5 yr survey. The Sentinel mission concept is shown to be as effective as any survey in finding objects bigger than D = 140 m, but is more effective when applied to finding smaller objects on Earth-impacting orbits. Sentinel is also more effective at finding objects of interest for human exploration that benefit from lower propulsion requirements. To explore the interaction between space and ground search programs, we also study a case where Sentinel is combined with the Large Synoptic Survey Telescope (LSST) and show the benefit of placing a space-based observatory in an orbit that reduces the overlap in search regions with a ground-based telescope. In this case, Sentinel+LSST can find more than 70% of the impactors larger than 40 m, assuming a 6.5 yr lifetime for Sentinel and 10 yr for LSST.
Atmospheric Dispersion Effects in Weak Lensing Measurements
Plazas, Andrés Alejandro; Bernstein, Gary
2012-10-01
The wavelength dependence of atmospheric refraction causes elongation of finite-bandwidth images along the elevation vector, which produces spurious signals in weak gravitational lensing shear measurements unless this atmospheric dispersion is calibrated and removed to high precision. Because astrometric solutions and PSF characteristics are typically calibrated from stellar images, differences between the reference stars' spectra and the galaxies' spectra will leave residual errors in both the astrometric positions (dr) and in the second moment (width) of the wavelength-averaged PSF (dv) for galaxies. We estimate the level of dv that will induce spurious weak lensing signals in PSF-corrected galaxy shapes that exceed the statistical errors of the DES and the LSST cosmic-shear experiments. We also estimate the dr signals that will produce unacceptable spurious distortions after stacking of exposures taken at different airmasses and hour angles. We also calculate the errors in the griz bands, and find that dispersion systematics, uncorrected, are up to 6 and 2 times larger in g and r bands, respectively, than the requirements for the DES error budget, but can be safely ignored in i and z bands. For the LSST requirements, the factors are about 30, 10, and 3 in g, r, and i bands, respectively. We find that a simple correction linear in galaxy color is accurate enough to reduce dispersion shear systematics to insignificant levels in the r band for DES and the i band for LSST, but still as much as 5 times the requirements for LSST r-band observations. More complex corrections will likely be able to reduce the systematic cosmic-shear errors below statistical errors for LSST r band. But g-band effects remain large enough that it seems likely that induced systematics will dominate the statistical errors of both surveys, and cosmic-shear measurements should rely on the redder bands.
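A "correction linear in galaxy color" of the kind the abstract describes can be sketched as fitting a straight line of the PSF-width residual dv against a color index and subtracting the trend. The numbers below are synthetic and the single color index is an illustrative assumption; the actual correction in the paper is calibrated against real spectra.

```python
# Toy linear-in-color correction: fit dv vs. color by least squares,
# then remove the fitted trend. Synthetic data lie exactly on a line,
# so the residual systematic vanishes after correction.
import statistics

colors = [0.2, 0.5, 0.8, 1.1, 1.4]                # g-i color index (made up)
dv     = [0.010, 0.016, 0.022, 0.028, 0.034]      # dispersion-induced PSF width error

mean_c, mean_v = statistics.mean(colors), statistics.mean(dv)
slope = sum((c - mean_c) * (v - mean_v) for c, v in zip(colors, dv)) / \
        sum((c - mean_c) ** 2 for c in colors)
intercept = mean_v - slope * mean_c

# Residual dv after subtracting the linear color model.
residuals = [v - (intercept + slope * c) for c, v in zip(colors, dv)]
print(max(abs(r) for r in residuals) < 1e-9)
```

With real galaxies the residuals would not be exactly zero; the paper's point is that in some bands they fall below the survey requirements and in others (g band) they do not.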
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peterson, John Russell
This grant funded the development and dissemination of the Photon Simulator (PhoSim) for the purpose of studying dark energy at high precision with the upcoming Large Synoptic Survey Telescope (LSST) astronomical survey. The work was in collaboration with the LSST Dark Energy Science Collaboration (DESC). Several detailed physics improvements were made in the optics, atmosphere, and sensor, a number of validation studies were performed, and a significant number of usability features were implemented. Future work in DESC will use PhoSim as the image simulation tool for data challenges used by the analysis groups.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jain, B.; Spergel, D.; Connolly, A.
2015-02-02
The scientific opportunity offered by the combination of data from LSST, WFIRST and Euclid goes well beyond the science enabled by any one of the data sets alone. The range in wavelength, angular resolution and redshift coverage that these missions jointly span is remarkable. With major investments in LSST and WFIRST, and partnership with ESA in Euclid, the US has an outstanding scientific opportunity to carry out a combined analysis of these data sets. It is imperative for us to seize it and, together with our European colleagues, prepare for the defining cosmological pursuit of the 21st century. The main argument for conducting a single, high-quality reference co-analysis exercise and carefully documenting the results is the complexity and subtlety of systematics that define this co-analysis. Falling back on many small efforts by different teams in selected fields and for narrow goals will be inefficient, leading to significant duplication of effort.
The Emerging Infrastructure of Autonomous Astronomy
NASA Astrophysics Data System (ADS)
Seaman, R.; Allan, A.; Axelrod, T.; Cook, K.; White, R.; Williams, R.
2007-10-01
Advances in the understanding of cosmic processes demand that sky transient events be confronted with statistical techniques honed on static phenomena. Time domain data sets require vast surveys such as LSST {http://www.lsst.org/lsst_home.shtml} and Pan-STARRS {http://www.pan-starrs.ifa.hawaii.edu}. A new autonomous infrastructure must close the loop from the scheduling of survey observations, through data archiving and pipeline processing, to the publication of transient event alerts and automated follow-up, and to the easy analysis of resulting data. The IVOA VOEvent {http://voevent.org} working group leads efforts to characterize sky transient alerts published through VOEventNet {http://voeventnet.org}. The Heterogeneous Telescope Networks (HTN {http://www.telescope-networks.org}) consortium comprises observatories and robotic telescope projects seeking interoperability, with the long-term goal of creating an e-market for telescope time. Two projects relying on VOEvent and HTN are eSTAR {http://www.estar.org.uk} and the Thinking Telescope {http://www.thinkingtelescopes.lanl.gov} Project.
LIMB-DARKENING COEFFICIENTS FOR ECLIPSING WHITE DWARFS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gianninas, A.; Strickland, B. D.; Kilic, Mukremin
2013-03-20
We present extensive calculations of linear and nonlinear limb-darkening coefficients as well as complete intensity profiles appropriate for modeling the light curves of eclipsing white dwarfs. We compute limb-darkening coefficients in the Johnson-Kron-Cousins UBVRI photometric system as well as the Large Synoptic Survey Telescope (LSST) ugrizy system using the most up-to-date model atmospheres available. In all, we provide the coefficients for seven different limb-darkening laws. We describe the variations of these coefficients as a function of the atmospheric parameters, including the effects of convection at low effective temperatures. Finally, we discuss the importance of having readily available limb-darkening coefficients in the context of present and future photometric surveys like the LSST, Palomar Transient Factory, and the Panoramic Survey Telescope and Rapid Response System (Pan-STARRS). The LSST, for example, may find ~10^5 eclipsing white dwarfs. The limb-darkening calculations presented here will be an essential part of the detailed analysis of all of these systems.
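Two of the standard limb-darkening laws of the kind tabulated in this work are the linear and quadratic forms, which give the specific intensity relative to disk center as a function of mu = cos(theta). The coefficient values below are placeholders for illustration, not the published coefficients.

```python
# Linear and quadratic limb-darkening laws, I(mu)/I(1).
def linear_law(mu, u):
    """Linear law: 1 - u*(1 - mu)."""
    return 1.0 - u * (1.0 - mu)

def quadratic_law(mu, a, b):
    """Quadratic law: 1 - a*(1 - mu) - b*(1 - mu)^2."""
    return 1.0 - a * (1.0 - mu) - b * (1.0 - mu) ** 2

# At disk center (mu = 1) both laws return 1 by construction.
print(linear_law(1.0, 0.5))   # 1.0
print(quadratic_law(0.5, 0.4, 0.2))
```

Fitting an eclipsing white dwarf light curve amounts to integrating such a profile over the occulted part of the stellar disk, which is why tabulated coefficients per filter and atmosphere model are needed.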
Is flat fielding safe for precision CCD astronomy?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baumer, Michael; Davis, Christopher P.; Roodman, Aaron
The ambitious goals of precision cosmology with wide-field optical surveys such as the Dark Energy Survey (DES) and the Large Synoptic Survey Telescope (LSST) demand precision CCD astronomy as their foundation. This in turn requires an understanding of previously uncharacterized sources of systematic error in CCD sensors, many of which manifest themselves as static effective variations in pixel area. Such variation renders a critical assumption behind the traditional procedure of flat fielding—that a sensor's pixels comprise a uniform grid—invalid. In this work, we present a method to infer a curl-free model of a sensor's underlying pixel grid from flat-field images, incorporating the superposition of all electrostatic sensor effects—both known and unknown—present in flat-field data. We use these pixel grid models to estimate the overall impact of sensor systematics on photometry, astrometry, and PSF shape measurements in a representative sensor from the Dark Energy Camera (DECam) and a prototype LSST sensor. Applying the method to DECam data recovers known significant sensor effects for which corrections are currently being developed within DES. For an LSST prototype CCD with pixel-response non-uniformity (PRNU) of 0.4%, we find the impact of "improper" flat fielding on these observables is negligible in nominal 0.7'' seeing conditions. Furthermore, these errors scale linearly with the PRNU, so for future LSST production sensors, which may have larger PRNU, our method provides a way to assess whether pixel-level calibration beyond flat fielding will be required.
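The "traditional procedure of flat fielding" that the abstract questions can be sketched in a few lines: divide the science frame by a flat normalized to unit mean, which attributes all pixel-to-pixel response variation to sensitivity. The numbers are synthetic; the paper's point is that when some of that variation is really pixel-area variation, this division silently miscalibrates fluxes.

```python
# Conventional flat-field correction under the uniform-pixel-grid assumption.
flat    = [0.98, 1.02, 1.00, 1.00]      # measured flat response (PRNU ~ 2%)
science = [98.0, 102.0, 100.0, 100.0]   # science frame, uniformly illuminated

mean_flat = sum(flat) / len(flat)
# Divide out the normalized flat; a truly uniform scene comes back uniform.
corrected = [s / (f / mean_flat) for s, f in zip(science, flat)]
print(corrected)  # all pixels near 100, i.e. uniform illumination recovered
```

If the 2% variation in `flat` came from pixel *area* rather than sensitivity, the same division would be wrong for point sources, which is exactly the effect the authors quantify with their pixel-grid models.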
Scheduling Algorithm for the Large Synoptic Survey Telescope
NASA Astrophysics Data System (ADS)
Ichharam, Jaimal; Stubbs, Christopher
2015-01-01
The Large Synoptic Survey Telescope (LSST) is a wide-field telescope currently under construction, scheduled to be deployed in Chile by 2022 and to operate for a ten-year survey. As a ground-based telescope with the largest etendue ever constructed, and the ability to take images approximately once every eighteen seconds, the LSST will be able to capture the entirety of the observable sky every few nights in six different band passes. With these remarkable features, LSST is primed to provide the scientific community with invaluable data in numerous areas of astronomy, including the observation of near-Earth asteroids, the detection of transient optical events such as supernovae, and the study of dark matter and dark energy through weak gravitational lensing. In order to maximize the utility that LSST will provide toward achieving these scientific objectives, it proves necessary to develop a flexible scheduling algorithm for the telescope which both optimizes its observational efficiency and allows for adjustment based on the evolving needs of the astronomical community. This work defines a merit function that incorporates the urgency of observing a particular field in the sky as a function of time elapsed since it was last observed, dynamic viewing conditions (in particular transparency and sky brightness), and a measure of scientific interest in the field. The problem of maximizing this merit function, summed across the entire observable sky, is then reduced to a classic variant of the dynamic traveling salesman problem. We introduce a new approximation technique that appears particularly well suited for this situation. We analyze its effectiveness in resolving this problem, obtaining some promising initial results.
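A merit function of the kind the abstract describes, combining urgency since last visit, current conditions, and scientific interest, can be sketched as below. The weights and functional forms are illustrative assumptions, not the paper's actual merit function, and the greedy `pick_next` stands in for the traveling-salesman optimization.

```python
# Toy scheduler merit function: urgency * conditions * science weight.
import math

def merit(time_since_visit_days, transparency, sky_brightness_mag, science_weight):
    """Score a candidate field; higher is better (all forms are assumptions)."""
    urgency = 1.0 - math.exp(-time_since_visit_days / 3.0)   # saturates after a few nights
    conditions = transparency * (sky_brightness_mag / 22.0)  # darker sky scores higher
    return science_weight * urgency * conditions

def pick_next(fields):
    """Greedy step: choose the field with the highest current merit."""
    return max(fields, key=lambda f: merit(*f[1:]))

fields = [
    ("A", 0.5, 0.9, 21.0, 1.0),   # visited recently
    ("B", 4.0, 0.9, 21.0, 1.0),   # overdue, good conditions
    ("C", 4.0, 0.4, 19.5, 1.0),   # overdue, but cloudy and bright sky
]
print(pick_next(fields)[0])  # B
```

The real problem also charges for slew time between fields, which is what turns a per-field ranking into the dynamic traveling salesman variant the authors study.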
NASA Astrophysics Data System (ADS)
Schaan, Emmanuel; Krause, Elisabeth; Eifler, Tim; Doré, Olivier; Miyatake, Hironao; Rhodes, Jason; Spergel, David N.
2017-06-01
The next-generation weak lensing surveys (i.e., LSST, Euclid, and WFIRST) will require exquisite control over systematic effects. In this paper, we address shear calibration and present the most realistic forecast to date for LSST/Euclid/WFIRST and CMB lensing from a stage 4 CMB experiment ("CMB S4"). We use the CosmoLike code to simulate a joint analysis of all the two-point functions of galaxy density, galaxy shear, and CMB lensing convergence. We include the full Gaussian and non-Gaussian covariances and explore the resulting joint likelihood with Markov chain Monte Carlo. We constrain shear calibration biases while simultaneously varying cosmological parameters, galaxy biases, and photometric redshift uncertainties. We find that CMB lensing from CMB S4 enables the calibration of the shear biases down to 0.2%-3% in ten tomographic bins for LSST (below the ~0.5% requirements in most tomographic bins), down to 0.4%-2.4% in ten bins for Euclid, and 0.6%-3.2% in ten bins for WFIRST. For a given lensing survey, the method works best at high redshift, where shear calibration is otherwise most challenging. This self-calibration is robust to Gaussian photometric redshift uncertainties and to a reasonable level of intrinsic alignment. It is also robust to changes in the beam and the effectiveness of the component separation of the CMB experiment, and only weakly dependent on its depth, making it possible with third-generation CMB experiments such as AdvACT and SPT-3G, as well as the Simons Observatory.
Optical Design of the LSST Camera
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olivier, S S; Seppala, L; Gilmore, K
2008-07-16
The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror, modified Paul-Baker design, with an 8.4-meter primary mirror, a 3.4-m secondary, and a 5.0-m tertiary feeding a camera system that includes a set of broad-band filters and refractive corrector lenses to produce a flat focal plane with a field of view of 9.6 square degrees. Optical design of the camera lenses and filters is integrated with optical design of the telescope mirrors to optimize performance, resulting in excellent image quality over the entire field from ultraviolet to near-infrared wavelengths. The LSST camera optics design consists of three refractive lenses with clear aperture diameters of 1.55 m, 1.10 m and 0.69 m, and six interchangeable, broad-band filters with clear aperture diameters of 0.75 m. We describe the methodology for fabricating, coating, mounting and testing these lenses and filters, and we present the results of detailed tolerance analyses, demonstrating that the camera optics will perform to the specifications required to meet their performance goals.
Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)
NASA Astrophysics Data System (ADS)
Selvy, Brian M.; Claver, Charles; Angeli, George
2014-08-01
This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.
University of Arizona High Energy Physics Program at the Cosmic Frontier 2014-2016
DOE Office of Scientific and Technical Information (OSTI.GOV)
abate, alex; cheu, elliott
This is the final technical report from the University of Arizona High Energy Physics program at the Cosmic Frontier covering the period 2014-2016. The work aims to advance the understanding of dark energy using the Large Synoptic Survey Telescope (LSST). Progress on the engineering design of the power supplies for the LSST camera is discussed. A variety of contributions to photometric redshift measurement uncertainties were studied. The effect of the intergalactic medium on the photometric redshift of very distant galaxies was evaluated. Computer code was developed realizing the full chain of calculations needed to accurately and efficiently run large-scale simulations.
Photometric Redshift Calibration Strategy for WFIRST Cosmology
NASA Astrophysics Data System (ADS)
Hemmati, Shoubaneh; WFIRST, WFIRST-HLS-COSMOLOGY
2018-01-01
In order for WFIRST and other Stage IV dark energy experiments (e.g. LSST, Euclid) to infer cosmological parameters not limited by systematic errors, accurate redshift measurements are needed. This accuracy can only be met by using spectroscopic subsamples to calibrate the full sample. In this poster, we employ the machine-learning, self-organizing-map (SOM) based spectroscopic sampling technique developed in Masters et al. 2015, using the empirical color-redshift relation among galaxies to find the minimum number of spectra required for the WFIRST weak lensing calibration. We use galaxies from the CANDELS survey to build the LSST+WFIRST lensing analog sample of ~36k objects and train the LSST+WFIRST SOM. We show that 26% of the WFIRST lensing sample consists of sources fainter than the Euclid depth in the optical, 91% of which live in color cells already occupied by brighter galaxies. We demonstrate the similarity between faint and bright galaxies as well as the feasibility of redshift measurements at different brightness levels. However, 4% of SOM cells are occupied only by faint galaxies, for which we recommend extra spectroscopy of ~200 new sources. Acquiring the spectra of these sources will enable the comprehensive calibration of the WFIRST color-redshift relation.
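The cell-occupancy check underlying the 91%/4% figures above can be sketched without a full SOM: group galaxies into color cells and flag cells whose members are all fainter than some cut. The grid cells, magnitudes, and cut below are hypothetical stand-ins for the trained SOM and survey depths.

```python
# Flag color cells populated only by galaxies fainter than a magnitude cut;
# those are the cells that would need new targeted spectroscopy.
from collections import defaultdict

galaxies = [
    {"color_cell": (1, 2), "mag": 24.0},   # bright member anchors this cell
    {"color_cell": (1, 2), "mag": 26.5},
    {"color_cell": (3, 0), "mag": 27.1},   # faint-only cell
]
FAINT_CUT = 25.5  # hypothetical depth of the bright spectroscopic sample

cells = defaultdict(list)
for g in galaxies:
    cells[g["color_cell"]].append(g["mag"])

faint_only = [c for c, mags in cells.items() if min(mags) > FAINT_CUT]
print(faint_only)  # cells with no bright counterpart
```

Cells with at least one bright member inherit their calibration from the existing spectroscopy; the `faint_only` cells are the analogue of the 4% needing ~200 extra spectra.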
NASA Astrophysics Data System (ADS)
Mróz, Przemek; Poleski, Radosław
2018-04-01
We use three-dimensional distributions of classical Cepheids and RR Lyrae stars in the Small Magellanic Cloud (SMC) to model the stellar density distribution of the young and old stellar populations in that galaxy. We use these models to estimate the microlensing self-lensing optical depth to the SMC, which is in excellent agreement with the observations. Our models are consistent with a total stellar mass of the SMC of about 1.0 × 10^9 M_⊙, under the assumption that all microlensing events toward this galaxy are caused by self-lensing. We also calculate the expected event rates and estimate that future large-scale surveys, like the Large Synoptic Survey Telescope (LSST), will be able to detect up to a few dozen microlensing events in the SMC annually. If the planet frequency in the SMC is similar to that in the Milky Way, a few extragalactic planets can be detected over the course of the LSST survey, provided significant changes in the SMC observing strategy are devised. A relatively small investment of LSST resources can give us a unique probe of the population of extragalactic exoplanets.
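The microlensing optical depth has the standard form tau = (4*pi*G/c^2) * integral of rho(D) * D * (1 - D/Ds) dD over lens distance D to source distance Ds. The uniform-density slab of lenses below is a toy assumption for illustration; the paper integrates over the fitted 3D density models instead.

```python
# Numerical sketch of the self-lensing optical depth integral for a
# constant-density slab of lenses in front of sources at distance Ds.
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
KPC = 3.086e19       # one kiloparsec in meters

def optical_depth(rho, depth_kpc, ds_kpc, steps=1000):
    """Midpoint Riemann sum of (4*pi*G/c^2) * rho * D * (1 - D/Ds) dD."""
    ds = ds_kpc * KPC
    dx = depth_kpc * KPC / steps
    total = 0.0
    for i in range(steps):
        d = (i + 0.5) * dx
        total += rho * d * (1.0 - d / ds) * dx
    return 4.0 * math.pi * G / C**2 * total

# Toy numbers: ~1e-21 kg/m^3 of lenses over 5 kpc, sources at 60 kpc.
tau = optical_depth(rho=1e-21, depth_kpc=5.0, ds_kpc=60.0)
print(f"{tau:.2e}")
```

With these toy inputs tau comes out near the 10^-7 scale characteristic of Magellanic Cloud self-lensing, which is the quantity the authors compare against the observed event statistics.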
An automated system to measure the quantum efficiency of CCDs for astronomy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coles, R.; Chiang, J.; Cinabro, D.
We describe a system to measure the quantum efficiency, in the wavelength range 300 nm to 1100 nm, of 40 × 40 mm n-channel CCD sensors for the construction of the 3.2 gigapixel LSST focal plane. The technique uses a series of instruments to create a very uniform flux of photons of controllable intensity, in the wavelength range of interest, across the face of the sensor. This allows the absolute quantum efficiency to be measured with an accuracy in the 1% range. This system will be part of a production facility at Brookhaven National Lab for the basic component of the LSST camera.
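Quantum efficiency is the ratio of photoelectrons detected to photons incident. A common lab form of the measurement, sketched here with made-up numbers, infers the photon rate from a calibrated power reading at a known wavelength and compares it with the CCD signal; the specific instrument chain in the paper is more elaborate.

```python
# QE from beam power: photon energy E = h*c/lambda, rate = P/E,
# QE = detected electrons per second / incident photons per second.
H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s

def photon_rate(power_w, wavelength_m):
    """Photons per second in a monochromatic beam of the given power."""
    return power_w / (H * C / wavelength_m)

def quantum_efficiency(electrons_per_s, power_w, wavelength_m):
    return electrons_per_s / photon_rate(power_w, wavelength_m)

# Toy numbers: 1 nW of 500 nm light, 2.2e9 photoelectrons/s detected.
qe = quantum_efficiency(electrons_per_s=2.2e9, power_w=1e-9, wavelength_m=500e-9)
print(f"{qe:.2f}")
```

The 1% accuracy quoted in the abstract is then dominated by how well the incident flux (the denominator) is known across the sensor face.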
The Santiago-Harvard-Edinburgh-Durham void comparison - I. SHEDding light on chameleon gravity tests
NASA Astrophysics Data System (ADS)
Cautun, Marius; Paillas, Enrique; Cai, Yan-Chuan; Bose, Sownak; Armijo, Joaquin; Li, Baojiu; Padilla, Nelson
2018-05-01
We present a systematic comparison of several existing and new void-finding algorithms, focusing on their potential power to test a particular class of modified gravity models - chameleon f(R) gravity. These models deviate from standard general relativity (GR) more strongly in low-density regions, and thus voids are a promising venue to test them. We use halo occupation distribution (HOD) prescriptions to populate haloes with galaxies, and tune the HOD parameters such that the galaxy two-point correlation functions are the same in both f(R) and GR models. We identify both three-dimensional (3D) voids and two-dimensional (2D) underdensities in the plane of the sky, and find the same void abundance and void galaxy number density profiles across all models, which suggests that they do not contain much information beyond galaxy clustering. However, the underlying void dark matter density profiles are significantly different, with f(R) voids being more underdense than GR ones, which leads to f(R) voids having a larger tangential shear signal than their GR analogues. We investigate the potential of each void finder to test f(R) models with near-future lensing surveys such as Euclid and LSST. The 2D voids have the largest power to probe f(R) gravity, with an LSST analysis of tunnel lensing (a tunnel being a new type of 2D underdensity introduced here) distinguishing f(R) models with parameters |fR0| = 10^-5 and 10^-6 from GR at 80σ and 11σ (statistical error), respectively.
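An HOD prescription of the kind used above assigns a mean galaxy count to each halo as a function of its mass; a standard parametrization (the Zheng et al. 2005 form) is sketched below. The parameter values are illustrative only, not the tuned values from the paper.

```python
# Standard HOD: smoothed step for centrals, power law for satellites.
import math

def n_central(log_m, log_m_min=12.0, sigma=0.2):
    """Mean number of central galaxies in a halo of log10 mass log_m."""
    return 0.5 * (1.0 + math.erf((log_m - log_m_min) / sigma))

def n_satellite(log_m, log_m0=12.0, log_m1=13.0, alpha=1.0):
    """Mean number of satellites; zero at and below the cutoff mass M0."""
    m, m0, m1 = 10**log_m, 10**log_m0, 10**log_m1
    return 0.0 if m <= m0 else ((m - m0) / m1) ** alpha

for log_m in (11.5, 12.0, 13.5):
    print(round(n_central(log_m), 3), round(n_satellite(log_m), 3))
```

Tuning these parameters separately in the f(R) and GR simulations until the galaxy two-point functions match is what lets the authors isolate the gravity signal in the void lensing profiles.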
Fast force actuators for LSST primary/tertiary mirror
NASA Astrophysics Data System (ADS)
Hileman, Edward; Warner, Michael; Wiecha, Oliver
2010-07-01
The very short slew times and resulting high inertial loads imposed upon the Large Synoptic Survey Telescope (LSST) create new challenges for the primary mirror support actuators. Traditionally, large borosilicate mirrors are supported by pneumatic systems, which is also the case for the LSST. These force-based actuators bear the weight of the mirror and provide active figure correction, but do not define the mirror position. A set of six locating actuators (hardpoints) arranged in a hexapod fashion serves to locate the mirror. The stringent dynamic requirements demand that the force actuators be able to compensate in real time for dynamic forces on the hardpoints during slewing, to prevent excessive hardpoint loads. The support actuators must also maintain the prescribed forces accurately during tracking to maintain acceptable mirror figure. To meet these requirements, candidate pneumatic cylinders incorporating force feedback control and high-speed servo valves are being tested using custom instrumentation with automatic data recording. Comparative charts are produced showing details of friction, hysteresis cycles, operating bandwidth, and temperature dependency. Extremely low power actuator controllers are being developed to avoid heat dissipation in critical portions of the mirror and also to allow for increased control capabilities at the actuator level, thus improving safety, performance, and the flexibility of the support system.
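A force-feedback loop of the kind being tested can be sketched as a PI controller driving the actuator toward a commanded force. The gains, time step, and first-order plant response below are illustrative assumptions, not measured actuator dynamics.

```python
# Toy PI force controller converging on a 1000 N setpoint.
def simulate(setpoint, steps=200, kp=0.5, ki=0.1, dt=0.01):
    force, integral = 0.0, 0.0
    for _ in range(steps):
        error = setpoint - force
        integral += error * dt
        command = kp * error + ki * integral       # PI control law
        force += command * dt * 50.0               # toy first-order actuator
    return force

print(round(simulate(1000.0), 1))
```

Real pneumatic actuators add the friction, hysteresis, and valve bandwidth limits the abstract mentions, which is precisely what the comparative test charts characterize.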
The Search for Transients and Variables in the LSST Pathfinder Survey
NASA Astrophysics Data System (ADS)
Gorsuch, Mary Katherine; Kotulla, Ralf
2018-01-01
This research was completed during participation in the NSF-REU program at the University of Wisconsin-Madison. Two fields of a few square degrees, close to the galactic plane, were imaged on the WIYN 3.5 meter telescope during the commissioning of the One Degree Imager (ODI) focal plane. These images were taken with repeated, shorter exposures in order to mimic an LSST-like cadence, with the goal of identifying transient and variable light sources. This was done by using Source Extractor to generate a catalog of all sources in each exposure, and inserting these data into a larger photometry database composed of all exposures for each field. A Python program was developed to analyze the data and isolate sources of interest from the large data set. We found some discrepancies in the data, which led to some interesting results that we are investigating further. Variable and transient sources, while relatively well understood, are not numerous in current cataloging systems. Cataloging them will be a major undertaking of the Large Synoptic Survey Telescope (LSST), to which this project is a precursor. Locating these sources may give us a better understanding of where they are located and how they impact their surroundings.
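The catalog-based search described above boils down to testing, per source, whether the scatter of its repeated photometry exceeds its stated errors. A minimal, hypothetical sketch of such a variability cut (function names, the toy catalog, and the chi-square threshold are illustrative, not the project's actual code):

```python
def variability_chi2(mags, errs):
    """Reduced chi-square of a light curve against its weighted mean.

    A value well above 1 suggests variability beyond the photometric
    errors; a value near 1 is consistent with a constant source.
    """
    weights = [1.0 / (e * e) for e in errs]
    mean = sum(w * m for w, m in zip(weights, mags)) / sum(weights)
    chi2 = sum(((m - mean) / e) ** 2 for m, e in zip(mags, errs))
    return chi2 / (len(mags) - 1)

def flag_candidates(catalog, threshold=3.0):
    """Return IDs of sources whose light curves exceed the chi-square threshold."""
    return [sid for sid, (mags, errs) in catalog.items()
            if variability_chi2(mags, errs) > threshold]

# Toy catalog: one constant star and one variable star (0.02 mag errors).
catalog = {
    "constant": ([15.00, 15.01, 14.99, 15.02], [0.02] * 4),
    "variable": ([15.0, 15.5, 14.6, 15.4], [0.02] * 4),
}
print(flag_candidates(catalog))  # only the variable star survives the cut
```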
Machine-learning-based Brokers for Real-time Classification of the LSST Alert Stream
NASA Astrophysics Data System (ADS)
Narayan, Gautham; Zaidi, Tayeb; Soraisam, Monika D.; Wang, Zhe; Lochner, Michelle; Matheson, Thomas; Saha, Abhijit; Yang, Shuo; Zhao, Zhenge; Kececioglu, John; Scheidegger, Carlos; Snodgrass, Richard T.; Axelrod, Tim; Jenness, Tim; Maier, Robert S.; Ridgway, Stephen T.; Seaman, Robert L.; Evans, Eric Michael; Singh, Navdeep; Taylor, Clark; Toeniskoetter, Jackson; Welch, Eric; Zhu, Songzhe; The ANTARES Collaboration
2018-05-01
The unprecedented volume and rate of transient events that will be discovered by the Large Synoptic Survey Telescope (LSST) demand that the astronomical community update its follow-up paradigm. Alert-brokers—automated software systems that sift through, characterize, annotate, and prioritize events for follow-up—will be critical tools for managing alert streams in the LSST era. The Arizona-NOAO Temporal Analysis and Response to Events System (ANTARES) is one such broker. In this work, we develop a machine learning pipeline to characterize and classify variable and transient sources using only the available multiband optical photometry. We describe three illustrative stages of the pipeline, serving the three goals of early, intermediate, and retrospective classification of alerts. The first takes the form of variable versus transient categorization, the second a multiclass typing of the combined variable and transient data set, and the third a purity-driven subtyping of a transient class. Although several similar algorithms have proven themselves in simulations, we validate their performance on real observations for the first time. We quantitatively evaluate our pipeline on sparse, unevenly sampled, heteroskedastic data from various existing observational campaigns, and demonstrate very competitive classification performance. We describe our progress toward adapting the pipeline developed in this work into a real-time broker working on live alert streams from time-domain surveys.
Constraining neutrino masses with the integrated-Sachs-Wolfe-galaxy correlation function
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lesgourgues, Julien; Valkenburg, Wessel; Gaztanaga, Enrique
2008-03-15
Temperature anisotropies in the cosmic microwave background (CMB) are affected by the late integrated Sachs-Wolfe (lISW) effect caused by any time variation of the gravitational potential on linear scales. Dark energy is not the only source of lISW, since massive neutrinos induce a small decay of the potential on small scales during both matter and dark energy domination. In this work, we study the prospect of using the cross correlation between CMB and galaxy-density maps as a tool for constraining the neutrino mass. On the one hand, massive neutrinos reduce the cross-correlation spectrum because free-streaming slows down structure formation; on the other hand, they enhance it through their change in the effective linear growth. We show that in the observable range of scales and redshifts, the first effect dominates, but the second one is not negligible. We carry out an error forecast analysis by fitting mock data inspired by the Planck satellite, Dark Energy Survey (DES) and Large Synoptic Survey Telescope (LSST). The inclusion of the cross correlation data from Planck and LSST increases the sensitivity to the neutrino mass m_ν by 38% (and to the dark energy equation of state w by 83%) with respect to Planck alone. The correlation between Planck and DES brings a far less significant improvement. This method is not potentially as good for detecting m_ν as the measurement of galaxy, cluster, or cosmic shear power spectra, but since it is independent and affected by different systematics, it remains potentially interesting if the total neutrino mass is of the order of 0.2 eV; if instead it is close to the lower bound from atmospheric oscillations, m_ν ≈ 0.05 eV, we do not expect the ISW-galaxy correlation to ever be sensitive to m_ν.
Variable classification in the LSST era: exploring a model for quasi-periodic light curves
NASA Astrophysics Data System (ADS)
Zinn, J. C.; Kochanek, C. S.; Kozłowski, S.; Udalski, A.; Szymański, M. K.; Soszyński, I.; Wyrzykowski, Ł.; Ulaczyk, K.; Poleski, R.; Pietrukowicz, P.; Skowron, J.; Mróz, P.; Pawlak, M.
2017-06-01
The Large Synoptic Survey Telescope (LSST) is expected to yield ~10^7 light curves over the course of its mission, which will require a concerted effort in automated classification. Stochastic processes provide one means of quantitatively describing variability with the potential advantage over simple light-curve statistics that the parameters may be physically meaningful. Here, we survey a large sample of periodic, quasi-periodic and stochastic Optical Gravitational Lensing Experiment-III variables using the damped random walk (DRW; CARMA(1,0)) and quasi-periodic oscillation (QPO; CARMA(2,1)) stochastic process models. The QPO model is described by an amplitude, a period and a coherence time-scale, while the DRW has only an amplitude and a time-scale. We find that the periodic and quasi-periodic stellar variables are generally better described by a QPO than a DRW, while quasars are better described by the DRW model. There are ambiguities in interpreting the QPO coherence time due to non-sinusoidal light-curve shapes, signal-to-noise ratio, error mischaracterizations and cadence. Higher order implementations of the QPO model that better capture light-curve shapes are necessary for the coherence time to have its implied physical meaning. Independent of physical meaning, the extra parameter of the QPO model successfully distinguishes most of the classes of periodic and quasi-periodic variables we consider.
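The DRW model referred to above is the CARMA(1,0), or Ornstein-Uhlenbeck, process: magnitude fluctuations decay toward the mean on a damping time-scale τ with asymptotic amplitude σ. A minimal simulation sketch using the exact discrete update of the OU process (the parameter values and mean magnitude are illustrative, not fit to any survey):

```python
import math
import random

def simulate_drw(n, dt, tau, sigma, mean_mag=19.0, seed=42):
    """Simulate a damped-random-walk (OU-process) light curve.

    tau is the damping time-scale and sigma the asymptotic standard
    deviation; the exact one-step update preserves the stationary
    distribution for any dt.
    """
    rng = random.Random(seed)
    decay = math.exp(-dt / tau)
    kick = sigma * math.sqrt(1.0 - decay * decay)
    x = rng.gauss(0.0, sigma)  # start in the stationary distribution
    curve = []
    for _ in range(n):
        curve.append(mean_mag + x)
        x = decay * x + kick * rng.gauss(0.0, 1.0)
    return curve

# 1000 nightly samples of a process with tau = 100 d and sigma = 0.2 mag.
lc = simulate_drw(n=1000, dt=1.0, tau=100.0, sigma=0.2)
```

The QPO (CARMA(2,1)) generalization adds an oscillatory kernel on top of this damping, which is what supplies the period and coherence time-scale discussed in the abstract.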
Transient survey rates for orphan afterglows from compact merger jets
NASA Astrophysics Data System (ADS)
Lamb, Gavin P.; Tanaka, Masaomi; Kobayashi, Shiho
2018-06-01
Orphan afterglows from short γ-ray bursts (GRBs) are potential candidates for electromagnetic (EM) counterpart searches to gravitational wave (GW) detected neutron star or neutron star black hole mergers. Various jet dynamical and structure models have been proposed that can be tested by the detection of a large sample of GW-EM counterparts. We make predictions for the expected rate of optical transients from these jet models for future survey telescopes, without a GW or GRB trigger. A sample of merger jets is generated in the redshift limits 0 ≤ z ≤ 3.0, and the expected peak r-band flux and time-scale above the Large Synoptic Survey Telescope (LSST) or Zwicky Transient Facility (ZTF) detection threshold, mr = 24.5 and 20.4, respectively, is calculated. General all-sky rates are shown for mr ≤ 26.0 and mr ≤ 21.0. The detected orphan and GRB afterglow rate depends on jet model, typically 16 ≲ R ≲ 76 yr^-1 for the LSST, and 2 ≲ R ≲ 8 yr^-1 for ZTF. An excess in the rate of orphan afterglows for a survey to a depth of mr ≤ 26 would indicate that merger jets have a dominant low-Lorentz factor population, or that the jets exhibit intrinsic structure. Careful filtering of transients is required to successfully identify orphan afterglows from either short- or long-GRB progenitors.
Dwarf Hosts of Low-z Supernovae
NASA Astrophysics Data System (ADS)
Pyotr Kolobow, Craig; Perlman, Eric S.; Strolger, Louis
2018-01-01
Hostless supernovae (SNe), or SNe in dwarf galaxies, may serve as excellent beacons for probing the spatial density of dwarf galaxies (M < 10^8 M⊙), which themselves are scarcely detected beyond only a few Mpc. Depending on the assumed model for the stellar-mass to halo-mass relation for these galaxies, LSST might see thousands of SNe (of all types) from dwarf galaxies alone. Conversely, one can take the measured rates of these SNe and test the model predictions for the density of dwarf galaxies in the local universe. Current “all-sky” surveys, like PanSTARRS and ASAS-SN, are now finding hostless SNe at a number sufficient to measure their rate. What is missing is the appropriate weighting of their host luminosities. Here we seek to continue a successful program to recover the luminosities of these hostless SNe, to z = 0.15, and to use their rate to constrain the faint-end slope of the low-z galaxy luminosity function.
NASA Astrophysics Data System (ADS)
Huijse, Pablo; Estévez, Pablo A.; Förster, Francisco; Daniel, Scott F.; Connolly, Andrew J.; Protopapas, Pavlos; Carrasco, Rodrigo; Príncipe, José C.
2018-05-01
The Large Synoptic Survey Telescope (LSST) will produce an unprecedented amount of light curves using six optical bands. Robust and efficient methods that can aggregate data from multidimensional sparsely sampled time-series are needed. In this paper we present a new method for light curve period estimation based on quadratic mutual information (QMI). The proposed method does not assume a particular model for the light curve nor its underlying probability density, and it is robust to non-Gaussian noise and outliers. By combining the QMI from several bands the true period can be estimated even when no single-band QMI yields the period. Period recovery performance as a function of average magnitude and sample size is measured using 30,000 synthetic multiband light curves of RR Lyrae and Cepheid variables generated by the LSST Operations and Catalog simulators. The results show that aggregating information from several bands is highly beneficial in LSST sparsely sampled time-series, obtaining an absolute increase in period recovery rate up to 50%. We also show that the QMI is more robust to noise and light curve length (sample size) than the multiband generalizations of the Lomb–Scargle and AoV periodograms, recovering the true period in 10%–30% more cases than its competitors. A Python package containing efficient Cython implementations of the QMI and other methods is provided.
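The QMI estimator itself is nontrivial, but the underlying idea of scoring trial periods by how tightly the folded light curve clusters can be illustrated with classic phase-dispersion minimization. This is only a stand-in sketch, not the paper's QMI method, and the toy light curve and trial grid are invented for illustration:

```python
import math

def phase_dispersion(times, mags, period, nbins=10):
    """Within-bin variance of magnitudes after folding at a trial period.

    The dispersion is small when the trial period matches the true one,
    because folded magnitudes in each phase bin become nearly constant.
    """
    bins = [[] for _ in range(nbins)]
    for t, m in zip(times, mags):
        phase = (t / period) % 1.0
        bins[min(int(phase * nbins), nbins - 1)].append(m)
    total, count = 0.0, 0
    for b in bins:
        if len(b) > 1:
            mu = sum(b) / len(b)
            total += sum((m - mu) ** 2 for m in b)
            count += len(b) - 1
    return total / count if count else float("inf")

def best_period(times, mags, trial_periods):
    return min(trial_periods, key=lambda p: phase_dispersion(times, mags, p))

# Toy sinusoidal light curve with true period 0.75 d, irregular sampling.
times = [0.13 * i + 0.011 * (i % 7) for i in range(200)]
mags = [15.0 + 0.3 * math.sin(2 * math.pi * t / 0.75) for t in times]
trials = [0.25 + 0.005 * k for k in range(200)]  # 0.25-1.25 d grid
print(best_period(times, mags, trials))
```

The multiband aggregation in the paper plays the same role as summing such a dispersion statistic across bands, so that a period too weak to stand out in any single band can still dominate the combined score.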
A Euclid, LSST and WFIRST Joint Processing Study
NASA Astrophysics Data System (ADS)
Chary, Ranga-Ram; Joint Processing Working Group
2018-01-01
Euclid, LSST and WFIRST are the flagship cosmological projects of the next decade. By mapping several thousand square degrees of sky and covering the electromagnetic spectrum from the optical to the NIR with (sub-)arcsec resolution, these projects will provide exciting new constraints on the nature of dark energy and dark matter. The ultimate cosmological, astrophysical and time-domain science yield from these missions, which will detect several billions of sources, requires joint processing at the pixel-level. Three U.S. agencies (DOE, NASA and NSF) are supporting an 18-month study which aims to 1) assess the optimal techniques to combine these, and ancillary data sets at the pixel level; 2) investigate options for an interface that will enable community access to the joint data products; and 3) identify the computing and networking infrastructure to properly handle and manipulate these large datasets together. A Joint Processing Working Group (JPWG) is carrying out this study and consists of US-based members from the community and science/data processing centers of each of these projects. Coordination with European partners is envisioned in the future and European Euclid members are involved in the JPWG as observers. The JPWG will scope the effort and resources required to build up the capabilities to support scientific investigations using joint processing in time for the start of science surveys by LSST and Euclid.
NASA Astrophysics Data System (ADS)
Reyes, Reinabelle; Mandelbaum, R.; Seljak, U.; Gunn, J.; Lombriser, L.
2009-01-01
We perform a test of gravity on large scales (5-50 Mpc/h) using 70,000 luminous red galaxies (LRGs) from the Sloan Digital Sky Survey (SDSS) DR7 with redshifts 0.16
The Follow-up Crisis: Optimizing Science in an Opportunity Rich Environment
NASA Astrophysics Data System (ADS)
Vestrand, T.
Rapid follow-up tasking for robotic telescopes has been dominated by a one-dimensional, uncoordinated response strategy developed for gamma-ray burst studies. However, this second-grade soccer approach is increasingly showing its limitations even when there are only a few events per night, and it will certainly fail when faced with the denial-of-service attack generated by the nightly flood of new transients from massive variability surveys like LSST. We discuss approaches for optimizing the scientific return from autonomous robotic telescopes in the high event-rate limit and explore the potential of a coordinated telescope ecosystem employing heterogeneous telescopes.
Preparing for LSST with the LCOGT NEO Follow-up Network
NASA Astrophysics Data System (ADS)
Greenstreet, Sarah; Lister, Tim; Gomez, Edward
2016-10-01
The Las Cumbres Observatory Global Telescope Network (LCOGT) provides an ideal platform for follow-up and characterization of Solar System objects (e.g. asteroids, Kuiper Belt Objects, comets, Near-Earth Objects (NEOs)) and ultimately for the discovery of new objects. The LCOGT NEO Follow-up Network is using the LCOGT telescope network in addition to a web-based system developed to perform prioritized target selection, scheduling, and data reduction to confirm NEO candidates and characterize radar-targeted known NEOs. In order to determine how to maximize our NEO follow-up efforts, we must first define our goals for the LCOGT NEO Follow-up Network. This means answering the following questions. Should we follow-up all objects brighter than some magnitude limit? Should we only focus on the brightest objects or push to the limits of our capabilities by observing the faintest objects we think we can see and risk not finding the objects in our data? Do we (and how do we) prioritize objects somewhere in the middle of our observable magnitude range? If we want to push to faint objects, how do we minimize the amount of data in which the signal-to-noise ratio is too low to see the object? And how do we find a balance between performing follow-up and characterization observations? To help answer these questions, we have developed a LCOGT NEO Follow-up Network simulator that allows us to test our prioritization algorithms for target selection, confirm signal-to-noise predictions, and determine ideal block lengths and exposure times for observing NEO candidates. We will present our results from the simulator and progress on our NEO follow-up efforts. In the era of LSST, developing and utilizing infrastructure, such as the LCOGT NEO Follow-up Network and our web-based platform for selecting, scheduling, and reducing NEO observations, capable of handling the large number of detections expected to be produced on a daily basis by LSST will be critical to follow-up efforts.
We hope our work can act as an example and tool for the community as together we prepare for the age of LSST.
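The prioritization questions posed in this abstract amount to scoring candidates by predicted signal-to-noise and follow-up value. A toy scoring function along those lines, with wholly hypothetical field names, thresholds, and a simplistic SNR scaling (this is not the LCOGT system's actual algorithm):

```python
def snr_estimate(mag, mag_limit, snr_at_limit=5.0):
    """Crude SNR scaling: SNR taken proportional to source flux,
    normalized to snr_at_limit at the survey magnitude limit."""
    return snr_at_limit * 10 ** (0.4 * (mag_limit - mag))

def prioritize(targets, mag_limit=21.5, min_snr=3.0):
    """Drop targets below the SNR floor, then rank the rest:
    unconfirmed candidates first, and among them the faintest first,
    on the logic that faint candidates are the ones most easily lost."""
    observable = [t for t in targets if snr_estimate(t["mag"], mag_limit) >= min_snr]
    return sorted(observable, key=lambda t: (t["confirmed"], -t["mag"]))

targets = [
    {"name": "cand_A", "mag": 20.8, "confirmed": False},
    {"name": "cand_B", "mag": 18.2, "confirmed": False},
    {"name": "known_C", "mag": 19.0, "confirmed": True},
    {"name": "cand_D", "mag": 23.0, "confirmed": False},  # below the SNR floor
]
print([t["name"] for t in prioritize(targets)])
```

A simulator like the one the abstract describes would then replay such a scoring policy against synthetic detection streams to tune the thresholds.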
LSST (Hoop/Column) Maypole Antenna Development Program, phase 1, part 2
NASA Technical Reports Server (NTRS)
Sullivan, M. R.
1982-01-01
Cable technology is discussed. Manufacturing flow and philosophy are considered. Acceptance, qualification and flight tests are discussed. Fifteen-meter and fifty-meter models are considered. An economic assessment is included.
LSST (Hoop/Column) Maypole Antenna Development Program, phase 1, part 2
NASA Astrophysics Data System (ADS)
Sullivan, M. R.
1982-06-01
Cable technology is discussed. Manufacturing flow and philosophy are considered. Acceptance, qualification and flight tests are discussed. Fifteen-meter and fifty-meter models are considered. An economic assessment is included.
The LSST Camera 500 watt -130 degC Mixed Refrigerant Cooling System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowden, Gordon B.; Langton, Brian J.; /SLAC
2014-05-28
The LSST Camera has a higher cryogenic heat load than previous CCD telescope cameras due to its large size (634 mm diameter focal plane, 3.2 gigapixels) and its close-coupled front-end electronics operating at low temperature inside the cryostat. Various refrigeration technologies were considered for this telescope/camera environment, and MMR-Technology’s mixed refrigerant technology was chosen; a collaboration with that company was started in 2009. The system, based on a cluster of Joule-Thomson refrigerators running a special blend of mixed refrigerants, is described. Both the advantages and problems of applying this technology to telescope camera refrigeration are discussed. Test results from a prototype refrigerator running in a realistic telescope configuration are reported. Current and future stages of the development program are described.
In Pursuit of LSST Science Requirements: A Comparison of Photometry Algorithms
NASA Astrophysics Data System (ADS)
Becker, Andrew C.; Silvestri, Nicole M.; Owen, Russell E.; Ivezić, Željko; Lupton, Robert H.
2007-12-01
We have developed an end-to-end photometric data-processing pipeline to compare current photometric algorithms commonly used on ground-based imaging data. This test bed is exceedingly adaptable and enables us to perform many research and development tasks, including image subtraction and co-addition, object detection and measurements, the production of photometric catalogs, and the creation and stocking of database tables with time-series information. This testing has been undertaken to evaluate existing photometry algorithms for consideration by a next-generation image-processing pipeline for the Large Synoptic Survey Telescope (LSST). We outline the results of our tests for four packages: the Sloan Digital Sky Survey's Photo package, DAOPHOT and ALLFRAME, DOPHOT, and two versions of Source Extractor (SExtractor). The ability of these algorithms to perform point-source photometry, astrometry, shape measurements, and star-galaxy separation and to measure objects at low signal-to-noise ratio is quantified. We also perform a detailed crowded-field comparison of DAOPHOT and ALLFRAME, and profile the speed and memory requirements in detail for SExtractor. We find that both DAOPHOT and Photo are able to perform aperture photometry to high enough precision to meet LSST's science requirements, but less adequately at PSF-fitting photometry. Photo performs the best at simultaneous point- and extended-source shape and brightness measurements. SExtractor is the fastest algorithm, and recent upgrades in the software yield high-quality centroid and shape measurements with little bias toward faint magnitudes. ALLFRAME yields the best photometric results in crowded fields.
The variable sky of deep synoptic surveys
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ridgway, Stephen T.; Matheson, Thomas; Mighell, Kenneth J.
2014-11-20
The discovery of variable and transient sources is an essential product of synoptic surveys. The alert stream will require filtering for personalized criteria—a process managed by a functionality commonly described as a Broker. In order to understand quantitatively the magnitude of the alert generation and Broker tasks, we have undertaken an analysis of the most numerous types of variable targets in the sky—Galactic stars, quasi-stellar objects (QSOs), active galactic nuclei (AGNs), and asteroids. It is found that the Large Synoptic Survey Telescope (LSST) will be capable of discovering ~10^5 high latitude (|b| > 20°) variable stars per night at the beginning of the survey. (The corresponding number for |b| < 20° is orders of magnitude larger, but subject to caveats concerning extinction and crowding.) However, the number of new discoveries may well drop below 100 per night within less than one year. The same analysis applied to GAIA clarifies the complementarity of the GAIA and LSST surveys. Discoveries of AGNs and QSOs are each predicted to begin at ~3000 per night and decrease by a factor of 50 over four years. Supernovae are expected at ~1100 per night, and after several survey years will dominate the new variable discovery rate. LSST asteroid discoveries will start at >10^5 per night, and if orbital determination has a 50% success rate per epoch, they will drop below 1000 per night within two years.
FY-79 - development of fiber optics connector technology for large space systems
NASA Technical Reports Server (NTRS)
Campbell, T. G.
1980-01-01
The development of physical concepts for integrating fiber optic connectors and cables with structural concepts proposed for the LSST is discussed. Emphasis is placed on remote connections using integrated cables.
Projected Near-Earth Object Discovery Performance of the Large Synoptic Survey Telescope
NASA Technical Reports Server (NTRS)
Chesley, Steven R.; Veres, Peter
2017-01-01
This report describes the methodology and results of an assessment study of the performance of the Large Synoptic Survey Telescope (LSST) in its planned efforts to detect and catalog near-Earth objects (NEOs).
Baseline design and requirements for the LSST rotating enclosure (dome)
NASA Astrophysics Data System (ADS)
Neill, D. R.; DeVries, J.; Hileman, E.; Sebag, J.; Gressler, W.; Wiecha, O.; Andrew, J.; Schoening, W.
2014-07-01
The Large Synoptic Survey Telescope (LSST) is a large (8.4 meter) wide-field (3.5 degree) survey telescope, which will be located on the Cerro Pachón summit in Chile. As a result of the wide field of view, its optical system is unusually susceptible to stray light; consequently, besides protecting the telescope from the environment, the rotating enclosure (dome) also provides indispensable light baffling. All dome vents are covered with light baffles which simultaneously provide both essential dome flushing and stray light attenuation. The wind screen also (and primarily) functions as a light screen, providing only a minimum clear aperture. Since the dome must operate continuously, and the drives produce significant heat, the drives are located on the fixed lower enclosure to facilitate glycol water cooling. To accommodate daytime thermal control, a duct system channels cooling air provided by the facility when the dome is in its parked position.
LSST communications middleware implementation
NASA Astrophysics Data System (ADS)
Mills, Dave; Schumacher, German; Lotz, Paul
2016-07-01
The LSST communications middleware is based on a set of software abstractions which provide standard interfaces for common communications services. The observatory requires communication between diverse subsystems, implemented by different contractors, and comprehensive archiving of subsystem status data. The Service Abstraction Layer (SAL) is implemented using open source packages that implement the open standards of DDS (Data Distribution Service) for data communication and SQL (Structured Query Language) for database access. For every subsystem, abstractions for each of the telemetry data streams, along with commands/responses and events, have been agreed upon with the appropriate component vendor (such as Dome, TMA, Hexapod) and captured in ICDs (Interface Control Documents). The OpenSplice (PrismTech) Community Edition of DDS provides an LGPL-licensed distribution which may be freely redistributed. The availability of the full source code provides assurance that the project will be able to maintain it over the full 10-year survey, independent of the fortunes of the original providers.
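Abstracted away from the DDS specifics, the SAL's role is a topic-based publish/subscribe layer: a subsystem publishes telemetry samples on named topics, and any interested subscriber receives them. A minimal in-memory analogue (the topic name below is hypothetical; real SAL topics and their schemas are defined by the ICDs and carried over DDS, not by this sketch):

```python
class TelemetryBus:
    """Minimal in-memory stand-in for a DDS-style publish/subscribe layer.

    A topic here is just a string and a subscriber is just a callback;
    real DDS adds typed schemas, discovery, and QoS policies on top.
    """
    def __init__(self):
        self._subs = {}

    def subscribe(self, topic, callback):
        self._subs.setdefault(topic, []).append(callback)

    def publish(self, topic, sample):
        # Deliver the sample to every subscriber registered on the topic.
        for cb in self._subs.get(topic, []):
            cb(sample)

bus = TelemetryBus()
received = []
bus.subscribe("Dome_position", received.append)   # hypothetical topic name
bus.publish("Dome_position", {"azimuth_deg": 123.4})
print(received)
```

The value of the abstraction is exactly what the abstract argues: subsystems built by different contractors only need to agree on the topic definitions in the ICDs, not on each other's internals.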
LSST camera readout chip ASPIC: test tools
NASA Astrophysics Data System (ADS)
Antilogus, P.; Bailly, Ph; Jeglot, J.; Juramy, C.; Lebbolo, H.; Martin, D.; Moniez, M.; Tocut, V.; Wicek, F.
2012-02-01
The LSST camera will have more than 3000 video-processing channels. The readout of this large focal plane requires a very compact readout chain. The Correlated Double Sampling technique, which is generally used for the signal readout of CCDs, is also adopted for this application and implemented with the so-called "Dual Slope Integrator" method. We have designed and implemented an ASIC for LSST: the Analog Signal Processing asIC (ASPIC). The goal is to amplify the signal close to the output, in order to maximize the signal-to-noise ratio, and to send differential outputs to the digitization stage. Other requirements are that each chip should process the output of half a CCD, i.e., 8 channels, and should operate at 173 K. A specific back-end board has been designed especially for lab test purposes. It manages the clock signals, digitizes the differential analog outputs of the ASPIC, and stores data into a memory. It contains 8 ADCs (18 bits), 512 kwords of memory, and a USB interface. An FPGA manages all signals from/to all components on the board and generates the timing sequence for the ASPIC; its firmware is written in the Verilog and VHDL languages. Internal registers define the various test parameters of the ASPIC, and a LabVIEW GUI allows these registers to be loaded or updated and proper operation to be checked. Several series of tests, including linearity, noise and crosstalk, have been performed over the past year to characterize the ASPIC at room and cold temperatures. At present, the ASPIC, back-end board and CCD detectors are being integrated to perform a characterization of the whole readout chain.
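Correlated Double Sampling, which the ASPIC realizes in analog hardware with a dual-slope integrator, can be summarized numerically: average the reset (baseline) level, average the signal level, and take the difference, so that slow offset drifts common to both phases cancel. A toy illustration with made-up sample values (not the ASPIC's actual waveforms):

```python
def correlated_double_sample(reset_samples, signal_samples):
    """Correlated Double Sampling in discrete form: difference the averaged
    signal level and the averaged reset (baseline) level.  Any offset common
    to both phases, such as slow amplifier drift, cancels in the subtraction.
    """
    baseline = sum(reset_samples) / len(reset_samples)
    level = sum(signal_samples) / len(signal_samples)
    return level - baseline

# Toy readout: a common offset of ~100 units affects both phases;
# the true pixel signal is 250 units.
reset = [100.2, 99.8, 100.1, 99.9]
signal = [350.1, 349.9, 350.2, 349.8]
print(correlated_double_sample(reset, signal))  # ~250.0
```

The dual-slope integrator does the same subtraction in continuous time by integrating the reset level with one polarity and the signal level with the other.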
LSST Telescope and Optics Status
NASA Astrophysics Data System (ADS)
Krabbendam, Victor; Gressler, W. J.; Andrew, J. R.; Barr, J. D.; DeVries, J.; Hileman, E.; Liang, M.; Neill, D. R.; Sebag, J.; Wiecha, O.; LSST Collaboration
2011-01-01
The LSST Project continues to advance the design and development of an observatory system capable of capturing 20,000 deg2 of the sky in six wavebands over ten years. Optical fabrication of the unique M1/M3 monolithic mirror has entered final front-surface optical processing. After substantial grinding to remove 5 tons of excess glass above the M3 surface (a residual of the single spin casting), both distinct optical surfaces are now clearly evident. Loose abrasive grinding has begun; polishing is to occur during 2011, and final optical testing is planned for early 2012. The M1/M3 telescope cell and internal component designs have matured to support on-telescope operational requirements and off-telescope coating needs. The mirror position system (hardpoint actuators) and mirror support system (figure actuators) designs have been developed through internal laboratory analysis and testing. A review of thermal requirements has assisted with the definition of a thermal conditioning and control system. Pre-cooling the M1/M3 substrate will enable productive observing during the large temperature swing often seen at twilight. The M2 ULE™ substrate is complete and lies in storage awaiting additional funding to enable final optical polishing. This 3.5 m diameter, 100 mm thick meniscus substrate has been ground to within 40 microns of its final figure. Detailed design of the telescope mount, including subflooring, has been developed. Finally, substantial progress has been achieved on the facility design. In early 2010, LSST contracted with ARCADIS Geotecnica Consultores, a Santiago-based engineering firm, to lead the formal architectural design effort for the summit facility.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Okura, Yuki; Petri, Andrea; May, Morgan
2016-06-27
Weak gravitational lensing causes subtle changes in the apparent shapes of galaxies due to the bending of light by the gravity of foreground masses. By measuring the shapes of large numbers of galaxies (millions in recent surveys, up to tens of billions in future surveys) we can infer the parameters that determine cosmology. Imperfections in the detectors used to record images of the sky can introduce changes in the apparent shape of galaxies, which in turn can bias the inferred cosmological parameters. Here we consider the effect of two widely discussed sensor imperfections: tree-rings, due to impurity gradients which cause transverse electric fields in the charge-coupled devices (CCDs), and pixel-size variation, due to periodic CCD fabrication errors. These imperfections can be observed when the detectors are subject to uniform illumination (flat-field images). We develop methods to determine the spurious shear and convergence (due to the imperfections) from the flat-field images. We calculate how the spurious shear, when added to the lensing shear, will bias the determination of cosmological parameters. We apply our methods to candidate sensors of the Large Synoptic Survey Telescope (LSST) as a timely and important example, analyzing flat-field images recorded with LSST prototype CCDs in the laboratory. We find that the tree-rings and periodic pixel-size variation present in the LSST CCDs will introduce negligible bias to cosmological parameters determined from the lensing power spectrum, specifically w, Ω_m and σ_8.
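The bookkeeping that maps a detector pattern to a spurious shear can be illustrated with weighted second moments of an image patch: the normalized combinations (Qxx - Qyy) and 2Qxy play the role of the two shear components (g1, g2). This toy calculation only illustrates that moments-to-shear mapping; the paper's actual analysis works from measured tree-ring displacement fields, not from raw moments like this:

```python
def spurious_shear(patch):
    """Weighted second moments of a 2D patch, read as a shear estimate.

    Treats pixel values as weights about the patch's geometric center;
    anisotropy in the weighting shows up as a nonzero (g1, g2).
    """
    ny, nx = len(patch), len(patch[0])
    cx, cy = (nx - 1) / 2.0, (ny - 1) / 2.0
    qxx = qyy = qxy = norm = 0.0
    for j in range(ny):
        for i in range(nx):
            w = patch[j][i]
            dx, dy = i - cx, j - cy
            qxx += w * dx * dx
            qyy += w * dy * dy
            qxy += w * dx * dy
            norm += w
    qxx, qyy, qxy = qxx / norm, qyy / norm, qxy / norm
    g1 = (qxx - qyy) / (qxx + qyy)
    g2 = 2.0 * qxy / (qxx + qyy)
    return g1, g2

# A uniform flat gives zero shear; an x-elongated response pattern gives g1 > 0.
uniform = [[1.0] * 5 for _ in range(5)]
elongated = [[1.0 + 0.05 * (i - 2) ** 2 for i in range(5)] for _ in range(5)]
print(spurious_shear(uniform), spurious_shear(elongated))
```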
Near-Earth Object Orbit Linking with the Large Synoptic Survey Telescope
NASA Astrophysics Data System (ADS)
Vereš, Peter; Chesley, Steven R.
2017-07-01
We have conducted a detailed simulation of the ability of the Large Synoptic Survey Telescope (LSST) to link near-Earth and main belt asteroid detections into orbits. The key elements of the study were a high-fidelity detection model and the presence of false detections in the form of both statistical noise and difference image artifacts. We employed the Moving Object Processing System (MOPS) to generate tracklets, tracks, and orbits with a realistic detection density for one month of the LSST survey. The main goals of the study were to understand whether (a) the linking of near-Earth objects (NEOs) into orbits can succeed in a realistic survey, (b) the number of false tracks and orbits will be manageable, and (c) the accuracy of linked orbits would be sufficient for automated processing of discoveries and attributions. We found that the overall density of asteroids was more than 5000 per LSST field near opposition on the ecliptic, plus up to 3000 false detections per field in good seeing. We achieved 93.6% NEO linking efficiency for H < 22 on tracks composed of tracklets from at least three distinct nights within a 12 day interval. The derived NEO catalog was comprised of 96% correct linkages. Less than 0.1% of orbits included false detections, and the remainder of false linkages stemmed from main belt confusion, which was an artifact of the short time span of the simulation. The MOPS linking efficiency can be improved by refined attribution of detections to known objects and by improved tuning of the internal kd-tree linking algorithms.
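Stripped to its essentials, the linking test is whether one tracklet's extrapolated linear motion lands on another tracklet at a later epoch. A toy version in flat sky coordinates (the coordinates, tolerance, and two-night example are invented for illustration; MOPS itself uses kd-tree searches and proper spherical geometry):

```python
def tracklet_motion(tracklet):
    """Linear rate of motion (deg/day) from a tracklet's first and last detections.
    Each detection is a (time_days, x_deg, y_deg) tuple."""
    (t0, x0, y0), (t1, x1, y1) = tracklet[0], tracklet[-1]
    dt = t1 - t0
    return ((x1 - x0) / dt, (y1 - y0) / dt)

def link(tracklet_a, tracklet_b, tol=0.05):
    """Accept the pair if extrapolating tracklet_a's linear motion to
    tracklet_b's epoch lands within tol degrees of its first detection."""
    vx, vy = tracklet_motion(tracklet_a)
    t_end, x_end, y_end = tracklet_a[-1]
    t_b, x_b, y_b = tracklet_b[0]
    dt = t_b - t_end
    dx = x_end + vx * dt - x_b
    dy = y_end + vy * dt - y_b
    return (dx * dx + dy * dy) ** 0.5 < tol

# Same object moving 0.5 deg/day in x, observed on nights 0 and 3,
# plus a stray tracklet that should not link.
night1 = [(0.00, 10.000, 5.000), (0.02, 10.010, 5.000)]
night2 = [(3.00, 11.500, 5.000), (3.02, 11.510, 5.000)]
stray  = [(3.00, 12.400, 5.300), (3.02, 12.405, 5.299)]
print(link(night1, night2), link(night1, stray))
```

The false-track problem quantified in the abstract arises precisely because, with thousands of detections per field, many unrelated tracklet pairs pass a test like this by chance and must be rejected later at the orbit-fitting stage.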
NASA Technical Reports Server (NTRS)
1980-01-01
The results of the Large Space Systems Technology special emphasis task are presented. The task was an analysis of structural requirements deriving from the initial Phase A Operational Geostationary Platform study.
How Many Kilonovae Can Be Found in Past, Present, and Future Survey Data Sets?
NASA Astrophysics Data System (ADS)
Scolnic, D.; Kessler, R.; Brout, D.; Cowperthwaite, P. S.; Soares-Santos, M.; Annis, J.; Herner, K.; Chen, H.-Y.; Sako, M.; Doctor, Z.; Butler, R. E.; Palmese, A.; Diehl, H. T.; Frieman, J.; Holz, D. E.; Berger, E.; Chornock, R.; Villar, V. A.; Nicholl, M.; Biswas, R.; Hounsell, R.; Foley, R. J.; Metzger, J.; Rest, A.; García-Bellido, J.; Möller, A.; Nugent, P.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Bechtol, K.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Castander, F. J.; Cunha, C. E.; D’Andrea, C. B.; da Costa, L. N.; Davis, C.; Doel, P.; Drlica-Wagner, A.; Eifler, T. F.; Flaugher, B.; Fosalba, P.; Gaztanaga, E.; Gerdes, D. W.; Gruen, D.; Gruendl, R. A.; Gschwend, J.; Gutierrez, G.; Hartley, W. G.; Honscheid, K.; James, D. J.; Johnson, M. W. G.; Johnson, M. D.; Krause, E.; Kuehn, K.; Kuhlmann, S.; Lahav, O.; Li, T. S.; Lima, M.; Maia, M. A. G.; March, M.; Marshall, J. L.; Menanteau, F.; Miquel, R.; Neilsen, E.; Plazas, A. A.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, M.; Smith, R. C.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, R. C.; Tucker, D. L.; Walker, A. R.; DES Collaboration
2018-01-01
The discovery of a kilonova (KN) associated with the Advanced LIGO (aLIGO)/Virgo event GW170817 opens up new avenues of multi-messenger astrophysics. Here, using realistic simulations, we provide estimates of the number of KNe that could be found in data from past, present, and future surveys without a gravitational-wave trigger. For the simulation, we construct a spectral time-series model based on the DES-GW multi-band light curve from the single known KN event, and we use an average of BNS rates from past studies of 10^3 Gpc^-3 yr^-1, consistent with the one event found so far. Examining past and current data sets from transient surveys, the number of KNe we expect to find for ASAS-SN, SDSS, PS1, SNLS, DES, and SMT is between 0 and 0.3. We predict the number of detections per future survey to be 8.3 from ATLAS, 10.6 from ZTF, 5.5/69 from LSST (the Deep Drilling/Wide Fast Deep), and 16.0 from WFIRST. The maximum redshift of KNe discovered for each survey is z=0.8 for WFIRST, z=0.25 for LSST, and z=0.04 for ZTF and ATLAS. This maximum redshift for WFIRST is well beyond the sensitivity of aLIGO and some future GW missions. For the LSST survey, we also provide contamination estimates from Type Ia and core-collapse supernovae: after light curve and template-matching requirements, we estimate a background of just two events. More broadly, we stress that future transient surveys should consider how to optimize their search strategies to improve their detection efficiency and to consider similar analyses for GW follow-up programs.
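An order-of-magnitude sanity check of yields like those quoted above is N ≈ R × V(z_max) × T, the volumetric rate times the surveyed comoving volume times the survey duration. The sketch below uses a Euclidean volume from the Hubble law, which is only adequate at the low redshifts quoted for ZTF/ATLAS; H0 = 70 km/s/Mpc is an assumed value, the detection efficiency is left as an explicit factor, and the function name is hypothetical. The paper's full simulation instead models light curves, cadence, and selection.

```python
C_KM_S = 299792.458   # speed of light, km/s
H0 = 70.0             # Hubble constant, km/s/Mpc (assumed value)

def expected_kne(rate_per_gpc3_yr, z_max, t_yr, efficiency=1.0):
    """Euclidean back-of-envelope kilonova yield: rate * volume * time.

    Valid only for z_max << 1, where distance ~ c*z/H0.
    """
    d_gpc = C_KM_S * z_max / H0 / 1000.0          # Hubble-law distance, Gpc
    volume_gpc3 = (4.0 / 3.0) * 3.141592653589793 * d_gpc**3
    return rate_per_gpc3_yr * volume_gpc3 * t_yr * efficiency
```

For the quoted rate of 10^3 Gpc^-3 yr^-1 and z_max = 0.04, this gives roughly 21 events per year before any efficiency cut, bracketing the 10.6/yr predicted for ZTF once detection efficiency is applied.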
A warm Spitzer survey of the LSST/DES 'Deep drilling' fields
NASA Astrophysics Data System (ADS)
Lacy, Mark; Farrah, Duncan; Brandt, Niel; Sako, Masao; Richards, Gordon; Norris, Ray; Ridgway, Susan; Afonso, Jose; Brunner, Robert; Clements, Dave; Cooray, Asantha; Covone, Giovanni; D'Andrea, Chris; Dickinson, Mark; Ferguson, Harry; Frieman, Joshua; Gupta, Ravi; Hatziminaoglou, Evanthia; Jarvis, Matt; Kimball, Amy; Lubin, Lori; Mao, Minnie; Marchetti, Lucia; Mauduit, Jean-Christophe; Mei, Simona; Newman, Jeffrey; Nichol, Robert; Oliver, Seb; Perez-Fournon, Ismael; Pierre, Marguerite; Rottgering, Huub; Seymour, Nick; Smail, Ian; Surace, Jason; Thorman, Paul; Vaccari, Mattia; Verma, Aprajita; Wilson, Gillian; Wood-Vasey, Michael; Cane, Rachel; Wechsler, Risa; Martini, Paul; Evrard, August; McMahon, Richard; Borne, Kirk; Capozzi, Diego; Huang, Jiashang; Lagos, Claudia; Lidman, Chris; Maraston, Claudia; Pforr, Janine; Sajina, Anna; Somerville, Rachel; Strauss, Michael; Jones, Kristen; Barkhouse, Wayne; Cooper, Michael; Ballantyne, David; Jagannathan, Preshanth; Murphy, Eric; Pradoni, Isabella; Suntzeff, Nicholas; Covarrubias, Ricardo; Spitler, Lee
2014-12-01
We propose a warm Spitzer survey to microJy depth of the four predefined Deep Drilling Fields (DDFs) for the Large Synoptic Survey Telescope (LSST), three of which are also deep drilling fields for the Dark Energy Survey (DES). Imaging these fields with warm Spitzer is a key component of the overall success of these projects, which address the 'Physics of the Universe' theme of the Astro2010 decadal survey. With deep, accurate, near-infrared photometry from Spitzer in the DDFs, we will generate photometric redshift distributions to apply to the surveys as a whole. The DDFs are also the areas where the supernova searches of DES and LSST are concentrated, and deep Spitzer data are essential to obtain photometric redshifts, stellar masses, and constraints on ages and metallicities for the >10,000 supernova host galaxies these surveys will find. This 'DEEPDRILL' survey will also address the 'Cosmic Dawn' goal of Astro2010 by being deep enough to find all of the >10^11 solar-mass galaxies within the survey area out to z~6. DEEPDRILL will complete the final 24.4 square degrees of imaging in the DDFs, which, when added to the 14 square degrees already imaged to this depth, will map a volume of 1 Gpc^3 at z>2. It will find ~100 galaxies of >10^11 solar masses at z~5 and ~40 protoclusters at z>2, providing targets for JWST that can be found in no other way. The Spitzer data, in conjunction with the multiwavelength surveys in these fields, ranging from X-ray through far-infrared and cm-radio, will comprise a unique legacy dataset for studies of galaxy evolution.
Stellar Populations and Nearby Galaxies with the LSST
NASA Astrophysics Data System (ADS)
Saha, Abhijit; Olsen, K.; Monet, D. G.; LSST Stellar Populations Collaboration
2009-01-01
The LSST will produce a multi-color map and photometric object catalog of half the sky to r=27.6 (AB mag; 5-sigma). Time-space sampling of each field spanning ten years will allow variability, proper motion, and parallax measurements for objects brighter than r=24.7. As part of providing an unprecedented map of the Galaxy, the accurate multi-band photometry will permit photometric parallaxes, chemical abundances, and a handle on ages via colors at turn-off for main-sequence (MS) stars at all distances within the Galaxy as well as in the Magellanic Clouds and dwarf satellites of the Milky Way. This will support comprehensive studies of star formation histories and chemical evolution for field stars. The structures of the Clouds and dwarf spheroidals will be traced with the MS stars, to equivalent surface densities fainter than 35 mag per square arcsecond. With geometric parallax accuracy of 1 milli-arcsec, comparable to HIPPARCOS but reaching more than 10 magnitudes fainter, a robust complete sample of solar neighborhood stars will be obtained. The LSST time sampling will identify and characterize variable stars of all types, from time scales of 1 hr to several years, a feast for variable star astrophysics. The combination of wide coverage, multi-band photometry, time sampling, and parallax taken together will address several key problems: e.g., fine-tuning the extragalactic distance scale by examining the properties of RR Lyraes and Cepheids as a function of their parent populations; extending the faint end of the galaxy luminosity function by discovering faint galaxies through star-count density enhancements on degree scales; and identifying intergalactic stars through novae and Long Period Variables.
Wavelength-Dependent PSFs and their Impact on Weak Lensing Measurements
NASA Astrophysics Data System (ADS)
Carlsten, S. G.; Strauss, Michael A.; Lupton, Robert H.; Meyers, Joshua E.; Miyazaki, Satoshi
2018-06-01
We measure and model the wavelength dependence of the point spread function (PSF) in the Hyper Suprime-Cam Subaru Strategic Program survey. We find that PSF chromaticity is present in that redder stars appear smaller than bluer stars in the g, r, and i-bands at the 1-2 per cent level and in the z and y-bands at the 0.1-0.2 per cent level. From the color dependence of the PSF, we fit a model between the monochromatic PSF size based on weighted second moments, R, and wavelength of the form R(λ) ∝ λ^(-b). We find values of b between 0.2 and 0.5, depending on the epoch and filter. This is consistent with the expectations of a turbulent atmosphere with an outer scale length of ~10-100 m, indicating that the atmosphere is dominating the chromaticity. In the best seeing data, we find that the optical system and detector also contribute some wavelength dependence. Meyers & Burchat (2015b) showed that b must be measured to an accuracy of ~0.02 not to dominate the systematic error budget of the Large Synoptic Survey Telescope (LSST) weak lensing (WL) survey. Using simple image simulations, we find that b can be inferred with this accuracy in the r and i-bands for all positions in the LSST focal plane, assuming a stellar density of 1 star arcmin^-2 and that the optical component of the PSF can be accurately modeled. Therefore, it is possible to correct for most, if not all, of the bias that the wavelength-dependent PSF will introduce into an LSST-like WL survey.
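Because R(λ) ∝ λ^(-b) is a pure power law, the exponent b can be recovered by linear regression in log-log space from stellar PSF sizes at known effective wavelengths. The sketch below is a minimal illustration of that fit, not the authors' pipeline; the function name is an assumption.

```python
import numpy as np

def fit_psf_chromaticity(wavelengths_nm, psf_sizes):
    """Fit R(lambda) ∝ lambda**(-b) by least squares in log-log space
    and return the chromaticity exponent b.

    log R = const - b * log(lambda), so b is minus the fitted slope.
    """
    slope, _ = np.polyfit(np.log(wavelengths_nm), np.log(psf_sizes), 1)
    return -slope
```

In practice each star contributes one (effective wavelength, second-moment size) point per exposure, and the quoted ~0.02 accuracy requirement on b sets how many such stars per field are needed.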
Properties of tree rings in LSST sensors
Park, H. Y.; Nomerotski, A.; Tsybychev, D.
2017-05-30
Images of uniformly illuminated sensors for the Large Synoptic Survey Telescope have circular periodic patterns with an appearance similar to tree rings. These patterns are caused by circularly symmetric variations of the dopant concentration in the monocrystal silicon boule induced by the manufacturing process. The non-uniform charge density results in a parasitic electric field inside the silicon sensor, which may distort the shapes of astronomical sources. Here, we analyzed data from fifteen LSST sensors produced by ITL to determine the main parameters of the tree rings, amplitude and period, and their variability across the sensors tested at Brookhaven National Laboratory. The tree-ring pattern has a weak dependence on wavelength: the ring amplitude decreases as the wavelength increases, since longer wavelengths penetrate deeper into the silicon. The amplitude also increases toward the outer part of the wafer, from 0.1% to 1.0%, indicating that the resistivity variation is larger at larger radii.
Fringing in MonoCam Y4 filter images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brooks, J.; Fisher-Levine, M.; Nomerotski, A.
Here, we study the fringing patterns observed in MonoCam, a camera with a single Large Synoptic Survey Telescope (LSST) CCD sensor. Images were taken at the U.S. Naval Observatory in Flagstaff, Arizona (NOFS) employing its 1.3 m telescope and an LSST y4 filter. Fringing occurs due to the reflection of infrared light (700 nm or longer) from the bottom surface of the CCD, which constructively or destructively interferes with the incident light to produce a net "fringe" pattern superimposed on all images taken. Emission lines from the atmosphere, dominated by hydroxyl (OH) spectra, can change in their relative intensities as the night goes on, producing different fringe patterns in the images taken. We found through several methods that the general shape of the fringe patterns remained constant, though with slight changes in the amplitude and phase of the fringes. Lastly, we found that a superposition of fringes from two monochromatic lines taken in the lab offered a reasonable description of the sky data.
scarlet: Source separation in multi-band images by Constrained Matrix Factorization
NASA Astrophysics Data System (ADS)
Melchior, Peter; Moolekamp, Fred; Jerdee, Maximilian; Armstrong, Robert; Sun, Ai-Lei; Bosch, James; Lupton, Robert
2018-03-01
SCARLET performs source separation (aka "deblending") on multi-band images. It is geared towards optical astronomy, where scenes are composed of stars and galaxies, but it is straightforward to apply it to other imaging data. Separation is achieved through a constrained matrix factorization, which models each source with a Spectral Energy Distribution (SED) and a non-parametric morphology, or multiple such components per source. The code performs forced photometry (with PSF matching if needed) using an optimal weight function given by the signal-to-noise weighted morphology across bands. The approach works well if the sources in the scene have different colors and can be further strengthened by imposing various additional constraints/priors on each source. Because of its generic utility, this package provides a stand-alone implementation that contains the core components of the source separation algorithm. However, the development of this package is part of the LSST Science Pipeline; the meas_deblender package contains a wrapper to implement the algorithms here for the LSST stack.
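The constrained matrix factorization at the heart of the method models the (bands × pixels) image cube as a product of per-source SEDs and non-negative morphologies. The toy below illustrates only the unconstrained non-negative factorization idea via standard Lee & Seung multiplicative updates; scarlet itself adds proximal constraints (monotonicity, symmetry, PSF matching) on top, and the function name here is an assumption, not the scarlet API.

```python
import numpy as np

def deblend_nmf(cube, n_sources, n_iter=500, seed=0):
    """Toy non-negative factorization of a (bands, pixels) cube:
    cube ≈ A @ S, with A the (bands, K) SED matrix and S the
    (K, pixels) morphology matrix, both kept non-negative.
    """
    rng = np.random.default_rng(seed)
    n_bands, n_pix = cube.shape
    A = rng.random((n_bands, n_sources)) + 1e-3
    S = rng.random((n_sources, n_pix)) + 1e-3
    for _ in range(n_iter):
        # multiplicative updates preserve non-negativity by construction
        S *= (A.T @ cube) / (A.T @ A @ S + 1e-12)
        A *= (cube @ S.T) / (A @ S @ S.T + 1e-12)
    return A, S
```

Separation succeeds when sources have distinct SEDs, which is exactly the "different colors" condition noted in the abstract; the extra constraints in scarlet are what resolve the degenerate cases.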
A study of astrometric distortions due to “tree rings” in CCD sensors using LSST Photon Simulator
Beamer, Benjamin; Nomerotski, Andrei; Tsybychev, Dmitri
2015-05-22
Imperfections in the production process of thick CCDs lead to circularly symmetric dopant concentration variations, which in turn produce electric fields transverse to the surface of the fully depleted CCD that displace the photogenerated charges. We use PhoSim, a Monte Carlo photon simulator, to explore and examine the likely impacts these dopant concentration variations will have on astrometric measurements in LSST. The scale and behavior of both the astrometric shifts imparted to point sources and the intensity variations in flat field images that result from these doping imperfections are similar to those previously observed in Dark Energy Camera CCDs, giving initial confirmation of PhoSim's model for these effects. In addition, organized shape distortions were observed as a result of the symmetric nature of these dopant variations, with nominally round sources acquiring a measurable ellipticity either aligned with or transverse to the radial direction of the dopant variation pattern.
OCTOCAM: A Workhorse Instrument for the Gemini Telescopes During the Era of LSST
NASA Astrophysics Data System (ADS)
Roming, Peter; van der Horst, Alexander; OCTOCAM Team
2018-01-01
The decade of the 2020s is expected to be an era of large surveys and giant telescopes. A trademark of this era will be the large number of interesting objects observed daily by high-cadence surveys, such as the LSST. Because of the sheer numbers, only a very small fraction of these interesting objects will be observed with extremely large telescopes. The follow-up workhorses during this era will be the 8-meter class telescopes and corresponding instruments that are prepared to pursue these interesting objects. One such workhorse instrument is OCTOCAM, a highly efficient instrument designed to probe the time domain window with simultaneous broad-wavelength coverage. OCTOCAM optimizes the use of Gemini for broadband imaging and spectroscopic single-target observations. The instrument is designed for high temporal resolution, broad spectral coverage, and moderate spectral resolution. OCTOCAM was selected as part of the Gemini instrumentation program in early 2017. Here we provide a description of the science cases to be addressed, the overall instrument design, and the current status.
The Large Synoptic Survey Telescope OCS and TCS models
NASA Astrophysics Data System (ADS)
Schumacher, German; Delgado, Francisco
2010-07-01
The Large Synoptic Survey Telescope (LSST) is a project envisioned as a system of systems with demanding science, technical, and operational requirements, that must perform as a fully integrated unit. The design and implementation of such a system poses big engineering challenges when performing requirements analysis, detailed interface definitions, operational modes and control strategy studies. The OMG System Modeling Language (SysML) has been selected as the framework for the systems engineering analysis and documentation for the LSST. Models for the overall system architecture and different observatory subsystems have been built describing requirements, structure, interfaces and behavior. In this paper we show the models for the Observatory Control System (OCS) and the Telescope Control System (TCS), and how this methodology has helped in the clarification of the design and requirements. In one common language, the relationships of the OCS, TCS, Camera and Data management subsystems are captured with models of the structure, behavior, requirements and the traceability between them.
chroma: Chromatic effects for LSST weak lensing
NASA Astrophysics Data System (ADS)
Meyers, Joshua E.; Burchat, Patricia R.
2018-04-01
Chroma investigates biases originating from two chromatic effects in the atmosphere: differential chromatic refraction (DCR), and wavelength dependence of seeing. These biases arise when using the point spread function (PSF) measured with stars to estimate the shapes of galaxies with different spectral energy distributions (SEDs) than the stars.
Near-Earth Object Survey Simulation Software
NASA Astrophysics Data System (ADS)
Naidu, Shantanu P.; Chesley, Steven R.; Farnocchia, Davide
2017-10-01
There is a significant interest in Near-Earth objects (NEOs) because they pose an impact threat to Earth, offer valuable scientific information, and are potential targets for robotic and human exploration. The number of NEO discoveries has been rising rapidly over the last two decades with over 1800 being discovered last year, making the total number of known NEOs >16000. Pan-STARRS and the Catalina Sky Survey are currently the most prolific NEO surveys, having discovered >1600 NEOs between them in 2016. As next generation surveys such as Large Synoptic Survey Telescope (LSST) and the proposed Near-Earth Object Camera (NEOCam) become operational in the next decade, the discovery rate is expected to increase tremendously. Coordination between various survey telescopes will be necessary in order to optimize NEO discoveries and create a unified global NEO discovery network. We are collaborating on a community-based, open-source software project to simulate asteroid surveys to facilitate such coordination and develop strategies for improving discovery efficiency. Our effort so far has focused on development of a fast and efficient tool capable of accepting user-defined asteroid population models and telescope parameters such as a list of pointing angles and camera field-of-view, and generating an output list of detectable asteroids. The software takes advantage of the widely used and tested SPICE library and architecture developed by NASA’s Navigation and Ancillary Information Facility (Acton, 1996) for saving and retrieving asteroid trajectories and camera pointing. Orbit propagation is done using OpenOrb (Granvik et al. 2009) but future versions will allow the user to plug in a propagator of their choice. The software allows the simulation of both ground-based and space-based surveys. Performance is being tested using the Grav et al. (2011) asteroid population model and the LSST simulated survey “enigma_1189”.
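The core detectability step such a simulator performs, deciding which propagated asteroid positions fall inside a given camera pointing, reduces to an angular separation test against the field-of-view radius. The sketch below is an illustrative stand-in for that step, not the project's code; the function name is an assumption, and 1.75 deg is the LSST half field (3.5 deg diameter).

```python
import numpy as np

def in_field(obj_ra, obj_dec, ptg_ra, ptg_dec, fov_radius=1.75):
    """Boolean mask of objects inside a circular field of view.

    All angles in degrees; uses the exact great-circle separation
    (spherical law of cosines), valid at any declination.
    """
    r1, d1 = np.radians(obj_ra), np.radians(obj_dec)
    r2, d2 = np.radians(ptg_ra), np.radians(ptg_dec)
    cos_sep = (np.sin(d1) * np.sin(d2)
               + np.cos(d1) * np.cos(d2) * np.cos(r1 - r2))
    sep_deg = np.degrees(np.arccos(np.clip(cos_sep, -1.0, 1.0)))
    return sep_deg <= fov_radius
```

A full simulator applies this test per exposure after propagating the population to the exposure epoch, then layers on magnitude limits and trailing losses to decide which in-field objects are actually detected.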
How the cosmic web induces intrinsic alignments of galaxies
NASA Astrophysics Data System (ADS)
Codis, S.; Dubois, Y.; Pichon, C.; Devriendt, J.; Slyz, A.
2016-10-01
Intrinsic alignments are believed to be a major source of systematics for future generation of weak gravitational lensing surveys like Euclid or LSST. Direct measurements of the alignment of the projected light distribution of galaxies in wide field imaging data seem to agree on a contamination at a level of a few per cent of the shear correlation functions, although the amplitude of the effect depends on the population of galaxies considered. Given this dependency, it is difficult to use dark matter-only simulations as the sole resource to predict and control intrinsic alignments. We report here estimates on the level of intrinsic alignment in the cosmological hydrodynamical simulation Horizon-AGN that could be a major source of systematic errors in weak gravitational lensing measurements. In particular, assuming that the spin of galaxies is a good proxy for their ellipticity, we show how those spins are spatially correlated and how they couple to the tidal field in which they are embedded. We will also present theoretical calculations that illustrate and qualitatively explain the observed signals.
Supernovae and cosmology with future European facilities.
Hook, I M
2013-06-13
Prospects for future supernova surveys are discussed, focusing on the European Space Agency's Euclid mission and the European Extremely Large Telescope (E-ELT), both expected to be in operation around the turn of the decade. Euclid is a 1.2 m space survey telescope that will operate at visible and near-infrared wavelengths, and has the potential to find and obtain multi-band lightcurves for thousands of distant supernovae. The E-ELT is a planned, general-purpose ground-based, 40-m-class optical-infrared telescope with adaptive optics built in, which will be capable of obtaining spectra of type Ia supernovae to redshifts of at least four. The contribution to supernova cosmology with these facilities will be discussed in the context of other future supernova programmes such as those proposed for DES, JWST, LSST and WFIRST.
LSST camera grid structure made out of ceramic composite material, HB-Cesic
NASA Astrophysics Data System (ADS)
Kroedel, Matthias R.; Langton, J. Bryan
2016-08-01
In this paper we present the ceramic design and fabrication of the camera structure, which uses the unique manufacturing features of the HB-Cesic technology together with a dedicated metrology device to ensure the challenging flatness requirement of 4 microns over the full array.
Deriving photometric redshifts using fuzzy archetypes and self-organizing maps - II. Implementation
NASA Astrophysics Data System (ADS)
Speagle, Joshua S.; Eisenstein, Daniel J.
2017-07-01
With an eye towards the computational requirements of future large-scale surveys such as Euclid and Large Synoptic Survey Telescope (LSST) that will require photometric redshifts (photo-z's) for ≳ 10^9 objects, we investigate a variety of ways that 'fuzzy archetypes' can be used to improve photometric redshifts and explore their respective statistical interpretations. We characterize their relative performance using an idealized LSST ugrizY and Euclid YJH mock catalogue of 10 000 objects spanning z = 0-6 at Y = 24 mag. We find most schemes are able to robustly identify redshift probability distribution functions that are multimodal and/or poorly constrained. Once these objects are flagged and removed, the results are generally in good agreement with the strict accuracy requirements necessary to meet Euclid weak lensing goals for most redshifts between 0.8 ≲ z ≲ 2. These results demonstrate the statistical robustness and flexibility that can be gained by combining template-fitting and machine-learning methods and provide useful insights into how astronomers can further exploit the colour-redshift relation.
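The template-fitting half of such hybrid photo-z schemes reduces, at its simplest, to a minimum-chi-square search over redshifted model fluxes. The sketch below shows that baseline step only; it is a generic stand-in, not the paper's fuzzy-archetype method, and the function name and grid layout are assumptions.

```python
import numpy as np

def photoz_chi2(fluxes, flux_err, template_grid):
    """Minimum-chi-square photometric redshift over a precomputed grid.

    template_grid: dict mapping redshift -> model fluxes in the same
    bands as `fluxes`. Each model is scaled by its analytic best-fit
    amplitude before computing chi^2.
    """
    best_z, best_chi2 = None, np.inf
    for z, model in template_grid.items():
        # best-fit linear amplitude for this template (weighted least squares)
        amp = (np.sum(fluxes * model / flux_err**2)
               / np.sum(model**2 / flux_err**2))
        chi2 = np.sum(((fluxes - amp * model) / flux_err) ** 2)
        if chi2 < best_chi2:
            best_z, best_chi2 = z, chi2
    return best_z
```

The fuzzy-archetype idea in the abstract effectively replaces each fixed template with a distribution of perturbed templates, turning the single chi-square minimum into a full redshift probability distribution from which multimodal cases can be flagged.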
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rasmussen, Andrew P.; Hale, Layton; Kim, Peter
Meeting the science goals for the Large Synoptic Survey Telescope (LSST) translates into a demanding set of imaging performance requirements for the optical system over a wide (3.5°) field of view. In turn, meeting those imaging requirements necessitates maintaining precise control of the focal plane surface (10 μm peak-to-valley) over the entire field of view (640 mm diameter) at the operating temperature (T ≈ -100 °C) and over the operational elevation angle range. We briefly describe the hierarchical design approach for the LSST Camera focal plane and the baseline design for assembling the flat focal plane at room temperature. Preliminary results of gravity-load and thermal-distortion calculations are provided, and early metrological verification of candidate materials under cold thermal conditions is presented. A detailed, generalized method for stitching together sparse metrology data, originating from differential, non-contact metrological data acquisition spanning the multiple (non-contiguous) sensor surfaces making up the focal plane, is described and demonstrated. Finally, we describe some in situ alignment verification alternatives, some of which may be integrated into the camera's focal plane.
Unveiling the Low Surface Brightness Stellar Peripheries of Galaxies
NASA Astrophysics Data System (ADS)
Ferguson, Annette M. N.
2018-01-01
The low surface brightness peripheral regions of galaxies contain a gold mine of information about how minor mergers and accretions have influenced their evolution over cosmic time. Enormous stellar envelopes and copious amounts of faint tidal debris are natural outcomes of the hierarchical assembly process and the search for and study of these features, albeit highly challenging, offers the potential for unrivalled insight into the mechanisms of galaxy growth. Over the last two decades, there has been burgeoning interest in probing galaxy outskirts using resolved stellar populations. Wide-field surveys have uncovered vast tidal debris features and new populations of very remote globular clusters, while deep Hubble Space Telescope photometry has provided exquisite star formation histories back to the earliest epochs. I will highlight some recent results from studies within and beyond the Local Group and conclude by briefly discussing the great potential of future facilities, such as JWST, Euclid, LSST and WFIRST, for major breakthroughs in low surface brightness galaxy periphery science.
NASA Astrophysics Data System (ADS)
Selvy, Brian M.; Claver, Charles; Willman, Beth; Petravick, Don; Johnson, Margaret; Reil, Kevin; Marshall, Stuart; Thomas, Sandrine; Lotz, Paul; Schumacher, German; Lim, Kian-Tat; Jenness, Tim; Jacoby, Suzanne; Emmons, Ben; Axelrod, Tim
2016-08-01
We provide an overview of the Model Based Systems Engineering (MBSE) language, tool, and methodology being used in our development of the Operational Plan for Large Synoptic Survey Telescope (LSST) operations. LSST's Systems Engineering (SE) team is using a model-based approach to operational plan development to: 1) capture the top-down stakeholders' needs and functional allocations defining the scope, required tasks, and personnel needed for operations, and 2) capture the bottom-up operations and maintenance activities required to conduct the LSST survey across its distributed operations sites for the full ten-year survey duration. To accomplish these complementary goals and ensure that they yield self-consistent results, we have developed a holistic approach using the Sparx Enterprise Architect modeling tool and Systems Modeling Language (SysML). This approach utilizes SysML Use Cases, Actors, associated relationships, and Activity Diagrams to document and refine all of the major operations and maintenance activities that will be required to successfully operate the observatory and meet stakeholder expectations. We have developed several customized extensions of the SysML language, including a custom stereotyped Use Case element with unique tagged values, as well as unique association connectors and Actor stereotypes. We demonstrate that this customized MBSE methodology enables us to define: 1) the roles each human Actor must take on to successfully carry out the activities associated with the Use Cases; 2) the skills each Actor must possess; 3) the functional allocation of all required stakeholder activities and Use Cases to organizational entities tasked with carrying them out; and 4) the organization structure required to successfully execute the operational survey. Our approach allows for continual refinement utilizing the systems engineering spiral method to expose finer levels of detail as necessary. 
For example, the bottom-up, Use Case-driven approach will be deployed in the future to develop the detailed work procedures required to successfully execute each operational activity.
DESCQA: Synthetic Sky Catalog Validation Framework
NASA Astrophysics Data System (ADS)
Mao, Yao-Yuan; Uram, Thomas D.; Zhou, Rongpu; Kovacs, Eve; Ricker, Paul M.; Kalmbach, J. Bryce; Padilla, Nelson; Lanusse, François; Zu, Ying; Tenneti, Ananth; Vikraman, Vinu; DeRose, Joseph
2018-04-01
The DESCQA framework provides rigorous validation protocols for assessing the quality of simulated sky catalogs in a straightforward and comprehensive way. DESCQA enables the inspection, validation, and comparison of an inhomogeneous set of synthetic catalogs via the provision of a common interface within an automated framework. An interactive web interface is also available at portal.nersc.gov/project/lsst/descqa.
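The "common interface" idea behind a framework like DESCQA can be sketched as a base validation-test class applied uniformly to heterogeneous catalogs. The class and method names below are illustrative assumptions, not the actual DESCQA API:

```python
# Hypothetical sketch of a DESCQA-style validation interface; names are
# illustrative, not the real DESCQA classes.
import numpy as np

class ValidationTest:
    """One science-quality check, applied identically to every catalog."""
    def run(self, catalog):
        raise NotImplementedError

class StellarMassFunctionTest(ValidationTest):
    """Compare a catalog's stellar-mass histogram against reference counts."""
    def __init__(self, ref_bins, ref_counts, tolerance=0.5):
        self.ref_bins = ref_bins        # log10(M*) bin edges
        self.ref_counts = ref_counts    # expected counts per bin
        self.tolerance = tolerance      # allowed fractional deviation

    def run(self, catalog):
        counts, _ = np.histogram(catalog["log_mass"], bins=self.ref_bins)
        frac_dev = np.abs(counts - self.ref_counts) / np.maximum(self.ref_counts, 1)
        return {"passed": bool(np.all(frac_dev <= self.tolerance)),
                "max_deviation": float(frac_dev.max())}

def run_suite(tests, catalogs):
    """Apply every registered test to every catalog, so inhomogeneous
    catalogs are compared on an identical footing."""
    return {(t.__class__.__name__, name): t.run(cat)
            for t in tests for name, cat in catalogs.items()}
```

The key design point the abstract describes is that catalogs never talk to tests directly: both sides conform to a shared interface, so adding a new synthetic catalog requires no changes to the validation suite.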
Final acceptance testing of the LSST monolithic primary/tertiary mirror
NASA Astrophysics Data System (ADS)
Tuell, Michael T.; Burge, James H.; Cuerden, Brian; Gressler, William; Martin, Hubert M.; West, Steven C.; Zhao, Chunyu
2014-07-01
The Large Synoptic Survey Telescope (LSST) is a three-mirror wide-field survey telescope with the primary and tertiary mirrors on one monolithic substrate. This substrate is made of Ohara E6 borosilicate glass in a honeycomb sandwich, spin cast at the Steward Observatory Mirror Lab at The University of Arizona. Each surface is aspheric, with the specification given in terms of conic-constant error, maximum active bending forces, and finally a structure-function specification on the residual errors. Higher-order deformation terms carry no separate tolerance; any such error is counted as a surface error and is included in the structure function. The radii of curvature are very different, requiring two independent test stations, each with an instantaneous phase-shifting interferometer and a null corrector. The primary null corrector is a standard two-element Offner null lens. The tertiary null corrector is a phase-etched computer-generated hologram (CGH). This paper details the two optical systems and their tolerances, showing that the uncertainty in measuring the figure is a small fraction of the structure-function specification. Additional metrology includes the radii of curvature, optical axis locations, and relative surface tilts. The methods for measuring these are also described, along with their tolerances.
On the Detectability of Interstellar Objects Like 1I/'Oumuamua
NASA Astrophysics Data System (ADS)
Ragozzine, Darin
2018-04-01
Almost since Oort's 1950 hypothesis of a tenuously bound cloud of comets, planetary formation theorists have realized that the process of planet formation must have ejected very large numbers of planetesimals into interstellar space. Unfortunately, these objects are distributed over galactic volumes, while they are only likely to be detectable if they pass within a few AU of Earth, resulting in an incredibly sparse detectable population. Furthermore, hypotheses for the formation and distribution of these bodies allow for uncertainties of orders of magnitude in the expected detection rate: our analysis suggested LSST would discover 0.01-100 objects during its lifetime (Cook et al. 2016). The discovery of 1I/'Oumuamua by a survey less powerful than LSST indicates either a low-probability event or that the properties of this population are on the more favorable end of the spectrum. We revisit the detailed detection analysis of Cook et al. 2016 in light of the detection of 1I/'Oumuamua. We use these results to better understand 1I/'Oumuamua and to update our assessment of future detections of interstellar objects. We highlight some key questions that can be answered only by additional discoveries.
Supernovae anisotropy power spectrum
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ghodsi, Hoda; Baghram, Shant; Habibi, Farhang, E-mail: h.ghodsi@mehr.sharif.ir, E-mail: baghram@sharif.edu, E-mail: habibi@lal.in2p3.fr
2017-10-01
We contribute another anisotropy study to this field of research using Type Ia supernovae (SNe Ia). In this work, we utilise the power spectrum calculation method and apply it to both the current SNe Ia data and simulation. Using the Union2.1 data set at all redshifts, we compare the spectrum of the residuals of the observed distance moduli to that expected from an isotropic universe affected by the Union2.1 observational uncertainties at low multipoles. Through this comparison we find a dipolar anisotropy with a tension of less than 2σ towards l = 171° ± 21° and b = −26° ± 28°, which is mainly induced by the anisotropic spatial distribution of the SNe with z > 0.2 rather than being a cosmic effect. Furthermore, we find a tension of ∼ 4σ at ℓ = 4 between the two spectra. Our simulations are constructed with the characteristics of upcoming surveys like the Large Synoptic Survey Telescope (LSST), which shall bring us the largest SNe Ia collection to date. We make predictions for the amplitude of a possible dipolar anisotropy that would be detectable by future SNe Ia surveys.
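The lowest multipole of such a residual spectrum, the ℓ = 1 dipole, can be estimated by a simple least-squares fit of the distance-modulus residuals against sky unit vectors. This is only a sketch of the dipole term; the paper's actual analysis works with the full angular power spectrum and survey uncertainties:

```python
# Minimal sketch: least-squares dipole fit to Hubble-diagram residuals.
# Illustrates only the l=1 (dipole) component of an anisotropy search.
import numpy as np

def fit_dipole(ra_deg, dec_deg, residuals):
    """Fit residuals(n) = m + D.n over unit vectors n; return the
    monopole m and dipole vector D (in the residuals' units)."""
    ra, dec = np.radians(ra_deg), np.radians(dec_deg)
    # Cartesian unit vectors for each supernova's sky position
    n = np.stack([np.cos(dec) * np.cos(ra),
                  np.cos(dec) * np.sin(ra),
                  np.sin(dec)], axis=1)
    # Design matrix [1, nx, ny, nz]; solve the linear least-squares problem
    A = np.hstack([np.ones((len(residuals), 1)), n])
    coeff, *_ = np.linalg.lstsq(A, residuals, rcond=None)
    return coeff[0], coeff[1:]
```

An anisotropic footprint (e.g. the uneven sky coverage of the z > 0.2 SNe noted above) correlates the columns of the design matrix, which is exactly how survey geometry can masquerade as a cosmic dipole.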
The Future of Astrometric Education
NASA Astrophysics Data System (ADS)
van Altena, W.; Stavinschi, M.
2005-10-01
Astrometry is poised to enter an era of unparalleled growth and relevance due to the wealth of highly accurate data expected from the SIM and GAIA space missions. Innovative ground-based telescopes, such as the LSST, are planned which will provide less precise data, but for many more stars. The potential for studies of the structure, kinematics and dynamics of our Galaxy, as well as of the physical nature of stars and the cosmological distance scale, is without equal in the history of astronomy. It is therefore ironic that in two years not one course in astrometry will be taught in the US, leaving all astrometric education to Europe, China and Latin America. Who will ensure the astrometric quality control for the JWST, SIM, GAIA, LSST, to say nothing of the current large ground-based facilities, such as the VLT, Gemini, Keck, NOAO, Magellan, LBT, etc.? Hipparcos and the HST were astrometric successes due only to the dedicated work of specialists in astrometry who fought to maintain the astrometric characteristics of those satellites and their data pipelines. We propose a renewal of astrometric education in the universities to prepare qualified scientists so that the scientific returns from the investment of billions of dollars in these unique facilities will be maximized. The funding agencies are providing outstanding facilities. The universities, national and international observatories and agencies should acknowledge their responsibility to hire qualified full-time astrometric scientists to teach students, and to supervise existing and planned astronomical facilities so that quality data will be obtained and analyzed. A temporary solution to this problem is proposed in the form of a series of international summer schools in Astrometry. The Michelson Science Center of the SIM project has offered to hold an astrometry summer school in 2005 to begin this process. 
A one-semester syllabus is suggested as a means of meeting the needs of Astronomy by educating students in astrometric techniques that might be most valuable for careers associated with modern astrophysics.
Data Service: Distributed Data Capture and Replication
NASA Astrophysics Data System (ADS)
Warner, P. B.; Pietrowicz, S. R.
2007-10-01
Data Service is a critical component of the NOAO Data Management and Science Support (DMaSS) Solutions Platform, which is based on a service-oriented architecture, and is to replace the current NOAO Data Transport System. Its responsibilities include capturing data from NOAO and partner telescopes and instruments and replicating the data across multiple (currently six) storage sites. Java 5 was chosen as the implementation language, and Java EE as the underlying enterprise framework. Application metadata persistence is performed using EJB and Hibernate on the JBoss Application Server, with PostgreSQL as the persistence back-end. Although potentially any underlying mass storage system may be used as the Data Service file persistence technology, DTS deployments and Data Service test deployments currently use the Storage Resource Broker from SDSC. This paper presents an overview and high-level design of the Data Service, including aspects of deployment, e.g., for the LSST Data Challenge at the NCSA computing facilities.
LensFlow: A Convolutional Neural Network in Search of Strong Gravitational Lenses
NASA Astrophysics Data System (ADS)
Pourrahmani, Milad; Nayyeri, Hooshang; Cooray, Asantha
2018-03-01
In this work, we present our machine learning classification algorithm for identifying strong gravitational lenses from wide-area surveys using convolutional neural networks: LensFlow. We train and test the algorithm using a wide variety of strong gravitational lens configurations from simulations of lensing events. Images are processed through multiple convolutional layers that extract the feature maps necessary to assign a lens probability to each image. LensFlow provides a ranking scheme for all sources that can be used to identify potential gravitational lens candidates by significantly reducing the number of images that have to be visually inspected. We apply our algorithm to the HST/ACS i-band observations of the COSMOS field and present our sample of identified lensing candidates. The developed machine learning algorithm is more computationally efficient than, and complementary to, classical lens identification algorithms, and is ideal for discovering such events across wide areas in current and future surveys such as LSST and WFIRST.
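The convolve → feature map → probability → ranking pipeline the abstract describes can be illustrated with a toy single-layer version. This is not the LensFlow architecture, and the weights below are arbitrary rather than trained; it only shows how a convolutional feature map is reduced to a score used for triage:

```python
# Toy illustration of a CNN-style lens scorer (NOT the LensFlow model):
# one convolutional feature map, ReLU, global pooling, logistic score.
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2-D cross-correlation producing one feature map."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def lens_score(image, kernel, weight, bias):
    """Pool the feature map and squash to a (0, 1) pseudo-probability."""
    fmap = np.maximum(conv2d(image, kernel), 0.0)   # ReLU activation
    pooled = fmap.mean()                            # global average pool
    return 1.0 / (1.0 + np.exp(-(weight * pooled + bias)))

def rank_candidates(images, kernel, weight=1.0, bias=-0.5):
    """Return image indices sorted by descending score: a triage list,
    so only the top-ranked images need visual inspection."""
    scores = [lens_score(im, kernel, weight, bias) for im in images]
    return sorted(range(len(images)), key=lambda i: -scores[i])
```

The ranking step is the practically important part: rather than a hard classification, a sorted candidate list lets inspectors stop once the hit rate drops off.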
SKA weak lensing - I. Cosmological forecasts and the power of radio-optical cross-correlations
NASA Astrophysics Data System (ADS)
Harrison, Ian; Camera, Stefano; Zuntz, Joe; Brown, Michael L.
2016-12-01
We construct forecasts for cosmological parameter constraints from weak gravitational lensing surveys involving the Square Kilometre Array (SKA). Considering matter content, dark energy and modified gravity parameters, we show that the first phase of the SKA (SKA1) can be competitive with other Stage III experiments such as the Dark Energy Survey, and that the full SKA (SKA2) can potentially form tighter constraints than Stage IV optical weak lensing experiments, such as those that will be conducted with LSST, WFIRST-AFTA or Euclid-like facilities. Using weak lensing alone, going from SKA1 to SKA2 represents improvements by factors of ˜10 in matter, ˜10 in dark energy and ˜5 in modified gravity parameters. We also show, for the first time, the powerful result that comparably tight constraints (within ˜5 per cent) for both Stage III and Stage IV experiments can be gained from cross-correlating shear maps between the optical and radio wavebands, a process which can also eliminate a number of potential sources of systematic error that can otherwise limit the utility of weak lensing cosmology.
Earth's Minimoons: Opportunities for Science and Technology.
NASA Astrophysics Data System (ADS)
Jedicke, Robert; Bolin, Bryce T.; Bottke, William F.; Chyba, Monique; Fedorets, Grigori; Granvik, Mikael; Jones, Lynne; Urrutxua, Hodei
2018-05-01
Twelve years ago the Catalina Sky Survey discovered Earth's first known natural geocentric object other than the Moon, a few-meter-diameter asteroid designated 2006 RH120. Despite significant improvements in ground-based asteroid surveying technology in the past decade, surveys have not discovered another temporarily-captured orbiter (TCO; colloquially known as a minimoon), but the all-sky fireball system operated in the Czech Republic as part of the European Fireball Network detected a bright natural meteor that was almost certainly in a geocentric orbit before it struck Earth's atmosphere. Within a few years the Large Synoptic Survey Telescope (LSST) will either begin to regularly detect TCOs or force a re-analysis of the creation and dynamical evolution of small asteroids in the inner solar system. The first studies of the provenance, properties, and dynamics of Earth's minimoons suggested that there should be a steady-state population with about one 1- to 2-meter-diameter captured object at any time, with the number of captured meteoroids increasing exponentially at smaller sizes. That model was then improved and extended to include the population of temporarily-captured flybys (TCFs), objects that fail to make an entire revolution around Earth while energetically bound to the Earth-Moon system. Several different techniques for discovering TCOs have been considered, but their small diameters, proximity, and rapid motion make them challenging targets for existing ground-based optical, meteor, and radar surveys. However, the LSST's tremendous light-gathering power and short exposure times could allow it to detect and discover many minimoons. 
We expect that if the TCO population is confirmed, and new objects are frequently discovered, they can provide new opportunities for 1) studying the dynamics of the Earth-Moon system, 2) testing models of the production and dynamical evolution of small asteroids from the asteroid belt, 3) rapid and frequent low delta-v missions to multiple minimoons, and 4) evaluating in-situ resource utilization techniques on asteroidal material. Here we review the past decade of minimoon studies in preparation for capitalizing on the scientific and commercial opportunities of TCOs in the first decade of LSST operations.
NASA Astrophysics Data System (ADS)
2008-10-01
As floods and hurricanes disrupt the lives of people around the world, a new generation of scientific tools is supporting both storm preparedness and recovery. As International Year of Astronomy 2009 approaches, the UK website is developing more features that make it easier to see what's planned for this science extravaganza.
The Impact of Microlensing on the Standardisation of Strongly Lensed Type Ia Supernovae
NASA Astrophysics Data System (ADS)
Foxley-Marrable, Max; Collett, Thomas E.; Vernardos, Georgios; Goldstein, Daniel A.; Bacon, David
2018-05-01
We investigate the effect of microlensing on the standardisation of strongly lensed Type Ia supernovae (GLSNe Ia). We present predictions for the amount of scatter induced by microlensing across a range of plausible strong lens macromodels. We find that lensed images in regions of low convergence, shear and stellar density are standardisable, where the microlensing scatter is ≲ 0.15 magnitudes, comparable to the intrinsic dispersion of a typical SN Ia. These standardisable configurations correspond to asymmetric lenses with an image located far outside the Einstein radius of the lens. Symmetric and small Einstein radius lenses (≲ 0.5 arcsec) are not standardisable. We apply our model to the recently discovered GLSN Ia iPTF16geu and find that the large discrepancy between the observed flux and the macromodel predictions from More et al. (2017) cannot be explained by microlensing alone. Using the mock GLSNe Ia catalogue of Goldstein et al. (2017), we predict that ˜ 22% of GLSNe Ia discovered by LSST will be standardisable, with a median Einstein radius of 0.9 arcseconds and a median time delay of 41 days. By breaking the mass-sheet degeneracy, the full LSST GLSNe Ia sample will be able to detect systematics in H0 at the 0.5% level.
The LSSTC Data Science Fellowship Program
NASA Astrophysics Data System (ADS)
Miller, Adam; Walkowicz, Lucianne; LSSTC DSFP Leadership Council
2017-01-01
The Large Synoptic Survey Telescope Corporation (LSSTC) Data Science Fellowship Program (DSFP) is a unique professional development program for astronomy graduate students. DSFP students complete a series of six one-week-long training sessions over the course of two years. The sessions are cumulative, each building on the last, to allow an in-depth exploration of the topics covered: data science basics, statistics, image processing, machine learning, scalable software, data visualization, time-series analysis, and science communication. The first session was held in Aug 2016 at Northwestern University, with all materials and lectures publicly available via github and YouTube. Each session focuses on a series of technical problems which are written in iPython notebooks. The initial class of fellows includes 16 students selected from across the globe, while an additional 14 fellows will be added to the program in year 2. Future sessions of the DSFP will be hosted by a rotating cast of LSSTC member institutions. The DSFP is designed to supplement graduate education in astronomy by teaching the essential skills necessary for dealing with big data, serving as a resource for all in the LSST era. The LSSTC DSFP is made possible by the generous support of the LSST Corporation, the Data Science Initiative (DSI) at Northwestern, and CIERA.
Optical Variability and Classification of High Redshift (3.5 < z < 5.5) Quasars on SDSS Stripe 82
NASA Astrophysics Data System (ADS)
AlSayyad, Yusra; McGreer, Ian D.; Fan, Xiaohui; Connolly, Andrew J.; Ivezic, Zeljko; Becker, Andrew C.
2015-01-01
Recent studies have shown promise in combining optical colors with variability to efficiently select and estimate the redshifts of low- to mid-redshift quasars in upcoming ground-based time-domain surveys. We extend these studies to fainter and less abundant high-redshift quasars using light curves from 235 sq. deg. and 10 years of Stripe 82 imaging reprocessed with the prototype LSST data management stack. Sources are detected on the i-band co-adds (5σ: i ~ 24) but measured on the single-epoch (ugriz) images, generating complete and unbiased lightcurves for sources fainter than the single-epoch detection threshold. Using these forced photometry lightcurves, we explore optical variability characteristics of high redshift quasars and validate classification methods with particular attention to the low signal limit. In this low SNR limit, we quantify the degradation of the uncertainties and biases on variability parameters using simulated light curves. Completeness/efficiency and redshift accuracy are verified with new spectroscopic observations on the MMT and APO 3.5m. These preliminary results are part of a survey to measure the z~4 luminosity function for quasars (i < 23) on Stripe 82 and to validate purely photometric classification techniques for high redshift quasars in LSST.
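Two standard variability statistics often applied to forced-photometry light curves in the low-S/N regime are the χ² against a constant-source model and the noise-corrected excess variance. The estimators below are textbook forms offered for orientation, not the specific pipeline of this study:

```python
# Simple variability statistics for a forced-photometry light curve.
# These are generic estimators, not the study's actual measurement code.
import numpy as np

def reduced_chi2(flux, err):
    """Chi-square per degree of freedom against a constant
    (inverse-variance weighted mean) model."""
    w = 1.0 / err**2
    mean = np.sum(w * flux) / np.sum(w)
    return np.sum(w * (flux - mean) ** 2) / (len(flux) - 1)

def excess_variance(flux, err):
    """Variance beyond measurement noise. Near zero for a steady source;
    at low S/N it scatters and can go negative, which is why parameter
    biases must be quantified with simulated light curves."""
    return np.var(flux, ddof=1) - np.mean(err**2)
```

The behaviour of `excess_variance` at low S/N is exactly the degradation the abstract refers to: for faint quasars the noise term dominates, so uncertainties and biases on variability parameters must be calibrated against simulations before classification.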
NASA Astrophysics Data System (ADS)
Fong, M.; Bowyer, R.; Whitehead, A.; Lee, B.; King, L.; Applegate, D.; McCarthy, I.
2018-05-01
For more than two decades, the Navarro, Frenk, and White (NFW) model has stood the test of time; it has been used to describe the distribution of mass in galaxy clusters out to their outskirts. Stacked weak lensing measurements of clusters are now revealing the distribution of mass out to and beyond their virial radii, where the NFW model is no longer applicable. In this study we assess how well the parameterised Diemer & Kravtsov (DK) density profile describes the characteristic mass distribution of galaxy clusters extracted from cosmological simulations. This is determined from stacked synthetic lensing measurements of the 50 most massive clusters extracted from the Cosmo-OWLS simulations, using the Dark Matter Only run and also the run that most closely matches observations. The characteristics of the data reflect the Weighing the Giants survey and data from the future Large Synoptic Survey Telescope (LSST). In comparison with the NFW model, the DK model is favored by the stacked data, in particular for the future LSST data, where the number density of background galaxies is higher. The DK profile depends on the accretion history of clusters, which is specified in the current study. Eventually, however, subsamples of galaxy clusters with qualities indicative of disparate accretion histories could be studied.
Test of Parameterized Post-Newtonian Gravity with Galaxy-scale Strong Lensing Systems
NASA Astrophysics Data System (ADS)
Cao, Shuo; Li, Xiaolei; Biesiada, Marek; Xu, Tengpeng; Cai, Yongzhi; Zhu, Zong-Hong
2017-01-01
Based on a mass-selected sample of galaxy-scale strong gravitational lenses from the SLACS, BELLS, LSD, and SL2S surveys and using a well-motivated fiducial set of lens-galaxy parameters, we tested the weak-field metric on kiloparsec scales and found a constraint on the post-Newtonian parameter γ = 0.995^{+0.037}_{-0.047} under the assumption of a flat ΛCDM universe with parameters taken from Planck observations. General relativity (GR) predicts exactly γ = 1. Uncertainties concerning the total mass density profile, anisotropy of the velocity dispersion, and the shape of the light profile combine to systematic uncertainties of ˜25%. By applying a cosmological model-independent method to the simulated future LSST data, we found a significant degeneracy between the PPN γ parameter and the spatial curvature of the universe. Setting a prior on the cosmic curvature parameter -0.007 < Ωk < 0.006, we obtained the constraint γ = 1.000^{+0.0023}_{-0.0025}. We conclude that strong lensing systems with measured stellar velocity dispersions may serve as another important probe of the validity of GR, if the mass-dynamical structure of the lensing galaxies is accurately constrained in future lens surveys.
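The way γ enters such a test can be sketched with the standard singular isothermal sphere (SIS) relation: light deflection in PPN gravity carries a factor (1 + γ)/2 relative to the dynamical (velocity-dispersion) mass, so comparing the measured Einstein radius with the stellar velocity dispersion constrains γ. This is the generic textbook relation, stated here for orientation rather than taken from the paper:

```latex
% PPN-modified Einstein radius of an SIS lens: sigma is the velocity
% dispersion, D_ls and D_s are lens-source and source angular diameter
% distances; GR (gamma = 1) recovers the usual expression.
\theta_E \;=\; 4\pi\,\frac{\sigma^2}{c^2}\,\frac{D_{ls}}{D_s}\,\frac{1+\gamma}{2}
```

Because the distance ratio D_ls/D_s depends on cosmology, including spatial curvature, a curvature prior is needed to break the degeneracy noted in the abstract.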
SOURCE EXPLORER: Towards Web Browser Based Tools for Astronomical Source Visualization and Analysis
NASA Astrophysics Data System (ADS)
Young, M. D.; Hayashi, S.; Gopu, A.
2014-05-01
As a new generation of large-format, high-resolution imagers comes online (ODI, DECAM, LSST, etc.), we are faced with the daunting prospect of astronomical images containing upwards of hundreds of thousands of identifiable sources. Visualizing and interacting with such large datasets using traditional astronomical tools appears to be unfeasible, and a new approach is required. We present here a method for the display and analysis of arbitrarily large source datasets using dynamically scaling levels of detail, enabling scientists to rapidly move from large-scale spatial overviews down to the level of individual sources and everything in between. Based on the recognized standards of HTML5+JavaScript, we enable observers and archival users to interact with their images and sources from any modern computer without having to install specialized software. We demonstrate the ability to produce large-scale source lists from the images themselves, as well as to overlay data from publicly available source catalogs (2MASS, GALEX, SDSS, etc.) or user-provided source lists. A high-availability cluster of computational nodes allows us to produce these source maps on demand, customized based on user input. User-generated source lists and maps are persistent across sessions and are available for further plotting, analysis, refinement, and culling.
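The "dynamically scaling levels of detail" idea can be sketched as a server-side decision: at coarse zoom, return per-cell aggregates of a grid whose resolution tracks the zoom level; once few enough sources remain in view, return them individually. The function below is an illustrative assumption about how such a tile server might behave, not the SOURCE EXPLORER API:

```python
# Hedged sketch of level-of-detail source serving: aggregate counts at
# coarse zoom, individual sources at fine zoom. Names are hypothetical.
import numpy as np

def tiles_for_zoom(ra, dec, zoom, max_points=1000):
    """Return individual sources if few enough are in view, otherwise a
    density grid whose resolution doubles with each zoom level."""
    if len(ra) <= max_points:
        return {"type": "sources", "ra": ra, "dec": dec}
    ncells = 2 ** zoom                   # grid refines as the user zooms in
    counts, ra_edges, dec_edges = np.histogram2d(
        ra, dec, bins=ncells, range=[[0.0, 360.0], [-90.0, 90.0]])
    return {"type": "density", "counts": counts,
            "ra_edges": ra_edges, "dec_edges": dec_edges}
```

The payload sent to the browser is then bounded regardless of catalog size, which is what makes an HTML5+JavaScript client viable for hundreds of thousands of sources.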
New characterization techniques for LSST sensors
Nomerotski, A.
2015-06-18
Fully depleted, thick CCDs with extended infra-red response have become the sensor of choice for modern sky surveys. The charge transport effects in the silicon and associated astrometric distortions could make mapping between the sky coordinates and sensor coordinates non-trivial, and limit the ultimate precision achievable with these sensors. Two new characterization techniques for the CCDs, which both could probe these issues, are discussed: x-ray flat fielding and imaging of pinhole arrays.
NASA Astrophysics Data System (ADS)
Lacki, Brian C.; Kochanek, Christopher S.; Stanek, Krzysztof Z.; Inada, Naohisa; Oguri, Masamune
2009-06-01
Difference imaging provides a new way to discover gravitationally lensed quasars because few nonlensed sources will show spatially extended, time variable flux. We test the method on the fields of lens candidates in the Sloan Digital Sky Survey (SDSS) Supernova Survey region from the SDSS Quasar Lens Search (SQLS) and one serendipitously discovered lensed quasar. Starting from 20,536 sources, including 49 SDSS quasars, 32 candidate lenses/lensed images, and one known lensed quasar, we find that 174 sources including 35 SDSS quasars, 16 candidate lenses/lensed images, and the known lensed quasar are nonperiodic variable sources. We can measure the spatial structure of the variable flux for 119 of these variable sources and identify only eight as candidate extended variables, including the known lensed quasar. Only the known lensed quasar appears as a close pair of sources on the difference images. Inspection of the remaining seven suggests they are false positives, and only two were spectroscopically identified quasars. One of the lens candidates from the SQLS survives our cuts, but only as a single image instead of a pair. This indicates a false positive rate of order ~1/4000 for the method, or given our effective survey area of order 0.82 deg², ~5 per deg² in the SDSS Supernova Survey. The fraction of quasars not found to be variable and the false positive rate would both fall if we had analyzed the full, later data releases for the SDSS fields. While application of the method to the SDSS is limited by the resolution, depth, and sampling of the survey, several future surveys such as Pan-STARRS, LSST, and SNAP will significantly improve on these limitations.
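The selection idea, flag sources whose variable flux is spatially extended after subtracting a reference image, can be sketched in a toy form. The thresholds and the extent statistic below are illustrative choices, not the paper's actual pipeline:

```python
# Toy sketch of extended-variable selection via difference imaging.
# A lensed quasar's images vary together, leaving two separated peaks
# in the difference image; a lone variable star leaves one point source.
import numpy as np

def variable_pixels(epochs, reference, nsigma=5.0, noise=1.0):
    """Difference each epoch against the reference and flag pixels whose
    maximum absolute deviation exceeds nsigma times the noise."""
    diff = np.abs(np.stack(epochs) - reference).max(axis=0)
    return diff > nsigma * noise

def is_extended(mask):
    """Crude extent test: RMS radius of flagged pixels about their
    centroid. The 1.5-pixel threshold is an arbitrary demo value."""
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return False
    r2 = (xs - xs.mean()) ** 2 + (ys - ys.mean()) ** 2
    return bool(np.sqrt(r2.mean()) > 1.5)
```

A real implementation would measure the variable-flux structure against the point-spread function rather than a fixed pixel radius, but the logic is the same: spatially extended variability is the rare signature that cuts 20,536 sources down to a handful of candidates.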
Big Data Science Cafés: High School Students Experiencing Real Research with Scientists
NASA Astrophysics Data System (ADS)
Walker, C. E.; Pompea, S. M.
2017-12-01
The Education and Public Outreach group at the National Optical Astronomy Observatory has designed an outside-of-school education program to excite the interest of talented youth in future projects like the Large Synoptic Survey Telescope (LSST) and the NOAO (archival) Data Lab, including their data approaches and key science projects. Originally funded by the LSST Corporation, the program cultivates talented youth to enter STEM disciplines and serves as a model to disseminate to the 40+ institutions involved in LSST. One Saturday a month during the academic year, high school students have the opportunity to interact with expert astronomers who work with large astronomical data sets in their scientific work. Students learn about killer asteroids, the birth and death of stars, colliding galaxies, the structure of the universe, gravitational waves, dark energy, dark matter, and more. The format for the Saturday science cafés has been a short presentation, discussion (plus food), a computer lab activity, and more discussion. They last about 2.5 hours and have been planned by a group of interested local high school students, an undergraduate student coordinator, the presenting astronomers, the program director, and an evaluator. High school youth leaders help ensure an enjoyable and successful program for fellow students. They help their fellow students with the activities and help evaluate how well the science café went. Their remarks shape the next science café and improve the program. The experience offers youth leaders ownership of the program, opportunities to take on responsibilities and learn leadership and communication skills, as well as fostering their continued interests in STEM. The prototype Big Data Science Academy was implemented successfully in spring 2017 and engaged almost 40 teens from greater Tucson in the fundamentals of astronomy concepts and research. As with any first implementation there were bumps. 
However, staff, scientists, and student leaders all stepped up to make the program a success. The project achieved many of its goals with a relatively small budget, providing value not only to the student leaders and student attendees, but to the scientists and staff as well. Staff learned what worked and what needed more fine-tuning to successfully launch and run a big data academy for teens in the years that follow.
Synergistic Effects of Phase Folding and Wavelet Denoising with Applications in Light Curve Analysis
2016-09-15
future research. … Historically, astronomy has been a data-driven science. Larger and more precise data sets have led to the … forthcoming Large Synoptic Survey Telescope (LSST), the human-centric approach to astronomy is becoming strained [13, 24, 25, 63]. More than ever … process. One use of the filtering process is to remove artifacts from the data set. In the context of time-domain astronomy, an artifact is an error in
Large Synoptic Survey Telescope: From Science Drivers to Reference Design
2008-01-01
faint time domain. The LSST design is driven by four main science themes: constraining dark energy and dark matter, taking an inventory of the Solar … Energy and Dark Matter, (2) Taking an Inventory of the Solar System, (3) Exploring the Transient Optical Sky, (4) Mapping the Milky Way. Each of these four … Constraining Dark Energy and Dark Matter: current models of cosmology require the existence of both dark matter and dark energy to match observational
Responding to the Event Deluge
NASA Technical Reports Server (NTRS)
Williams, Roy D.; Barthelmy, Scott D.; Denny, Robert B.; Graham, Matthew J.; Swinbank, John
2012-01-01
We present the VOEventNet infrastructure for large-scale rapid follow-up of astronomical events, including selection, annotation, machine intelligence, and coordination of observations. The VOEvent standard is central to this vision, with distributed and replicated services rather than centralized facilities. We also describe some of the event brokers, services, and software that are connected to the network. These technologies will become more important in the coming years, with new event streams from Gaia, LOFAR, LIGO, LSST, and many others.
On the accuracy of modelling the dynamics of large space structures
NASA Technical Reports Server (NTRS)
Diarra, C. M.; Bainum, P. M.
1985-01-01
Proposed space missions will require large-scale, lightweight, space-based structural systems. Large space structure technology (LSST) systems will have to accommodate (among others): ocean data systems; electronic mail systems; large multibeam antenna systems; and space-based solar power systems. The structures are to be delivered into orbit by the space shuttle. Because of their inherent size, modelling techniques and scaling algorithms must be developed so that system performance can be predicted accurately prior to launch and assembly. When the size and weight-to-area ratio of proposed LSST systems dictate that the entire system be considered flexible, there are two basic modelling methods which can be used. The first is a continuum approach, a mathematical formulation for predicting the motion of a general orbiting flexible body, in which elastic deformations are considered small compared with characteristic body dimensions. This approach is based on an a priori knowledge of the frequencies and shape functions of all modes included within the system model. Alternatively, finite element techniques can be used to model the entire structure as a system of lumped masses connected by a series of (restoring) springs and possibly dampers. In addition, a computational algorithm was developed to evaluate the coefficients of the various coupling terms in the equations of motion as applied to the finite element model of the Hoop/Column.
Firefly: embracing future web technologies
NASA Astrophysics Data System (ADS)
Roby, W.; Wu, X.; Goldina, T.; Joliet, E.; Ly, L.; Mi, W.; Wang, C.; Zhang, Lijun; Ciardi, D.; Dubois-Felsmann, G.
2016-07-01
At IPAC/Caltech, we have developed the Firefly web archive and visualization system. Used in production for the last eight years in many missions, Firefly gives the scientist significant capabilities to study data. Firefly provided the first completely web-based FITS viewer, as well as a growing set of tabular and plotting visualizers. Further, it will be used for the science user interface of the LSST telescope, which goes online in 2021. Firefly must meet the needs of archive access and visualization for the 2021 LSST telescope and must serve astronomers beyond the year 2030. Recently, our team faced the fact that the technology behind the Firefly software was becoming obsolete. We were searching for ways to utilize current breakthroughs in maintaining the stability, testability, speed, and reliability of large web applications, which Firefly exemplifies. In the last year, we have ported Firefly to cutting-edge web technologies. Embarking on this massive overhaul is no small feat. Choosing the technologies that will maintain a forward trajectory in a future development project is always hard and often overwhelming. When a team must port 150,000 lines of code for a production-level product, there is little room to make poor choices. This paper will give an overview of the most modern web technologies and lessons learned in our conversion from a GWT-based system to a React/Redux-based system.
Synoptic Sky Surveys: Lessons Learned and Challenges Ahead
NASA Astrophysics Data System (ADS)
Djorgovski, Stanislav G.; CRTS Team
2014-01-01
A new generation of synoptic sky surveys is now opening the time domain for systematic exploration, presenting both great new scientific opportunities and challenges. These surveys touch essentially all subfields of astronomy, producing large statistical samples of the known types of objects and events (e.g., SNe, AGN, variable stars of many kinds), and have already uncovered previously unknown subtypes of these (e.g., rare or peculiar types of SNe). They are generating new science now, and paving the way for even larger surveys to come, e.g., the LSST. Our ability to fully exploit such forthcoming facilities depends critically on the science, methodology, and experience that are being accumulated now. Among the outstanding challenges, the foremost is our ability to conduct an effective follow-up of the interesting events discovered by the surveys in any wavelength regime. The follow-up resources, especially spectroscopy, are already severely limited, and this problem will grow by orders of magnitude. This requires an intelligent down-selection of the most astrophysically interesting events to follow. The first step in that process is an automated, real-time, iterative classification of transient events that incorporates heterogeneous data from the surveys themselves, archival information (spatial, temporal, and multiwavelength), and the incoming follow-up observations. The second step is an optimal automated event prioritization and allocation of the available follow-up resources, which also change in time. Both of these challenges are highly non-trivial, and require a strong cyber-infrastructure based on the Virtual Observatory data grid and the various astroinformatics efforts now under way. This is inherently an astronomy of telescope-computational systems, one that increasingly depends on novel machine learning and artificial intelligence tools.
Another arena with a strong potential for discovery is an archival, non-time-critical exploration of the time domain, with the time dimension adding the complexity to an already challenging problem of data mining of highly-dimensional data parameter spaces.
Technology for large space systems: A special bibliography with indexes
NASA Technical Reports Server (NTRS)
1979-01-01
This bibliography lists 460 reports, articles, and other documents introduced into the NASA scientific and technical information system between January 1, 1968 and December 31, 1978. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design in the area of the Large Space Systems Technology (LSST) Program. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, and flight experiments.
Technology for large space systems: A special bibliography with indexes (supplement 01)
NASA Technical Reports Server (NTRS)
1979-01-01
This bibliography lists 180 reports, articles, and other documents introduced into the NASA scientific and technical information system between January 1, 1979 and June 30, 1979. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design in the area of the Large Space Systems Technology (LSST) Program. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, and flight experiments.
Control system design for the large space systems technology reference platform
NASA Technical Reports Server (NTRS)
Edmunds, R. S.
1982-01-01
Structural models and classical frequency domain control system designs were developed for the large space systems technology (LSST) reference platform which consists of a central bus structure, solar panels, and platform arms on which a variety of experiments may be mounted. It is shown that operation of multiple independently articulated payloads on a single platform presents major problems when subarc second pointing stability is required. Experiment compatibility will be an important operational consideration for systems of this type.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Masters, Daniel C.; Stern, Daniel K.; Rhodes, Jason D.
A key goal of the Stage IV dark energy experiments Euclid, LSST, and WFIRST is to measure the growth of structure with cosmic time from weak lensing analysis over large regions of the sky. Weak lensing cosmology will be challenging: in addition to highly accurate galaxy shape measurements, statistically robust and accurate photometric redshift (photo-z) estimates for billions of faint galaxies will be needed in order to reconstruct the three-dimensional matter distribution. Here we present an overview of and initial results from the Complete Calibration of the Color–Redshift Relation (C3R2) survey, which is designed specifically to calibrate the empirical galaxy color–redshift relation to the Euclid depth. These redshifts will also be important for the calibrations of LSST and WFIRST. The C3R2 survey is obtaining multiplexed observations with Keck (DEIMOS, LRIS, and MOSFIRE), the Gran Telescopio Canarias (GTC; OSIRIS), and the Very Large Telescope (VLT; FORS2 and KMOS) of a targeted sample of galaxies that are most important for the redshift calibration. We focus spectroscopic efforts on undersampled regions of galaxy color space identified in previous work in order to minimize the number of spectroscopic redshifts needed to map the color–redshift relation to the required accuracy. We present the C3R2 survey strategy and initial results, including the 1283 high-confidence redshifts obtained in the 2016A semester and released as Data Release 1.
NASA Astrophysics Data System (ADS)
DeVries, J.; Neill, D. R.; Barr, J.; De Lorenzi, Simone; Marchiori, Gianpietro
2016-07-01
The Large Synoptic Survey Telescope (LSST) is a large (8.4 meter) wide-field (3.5 degree) survey telescope, which will be located on the Cerro Pachón summit in Chile. As a result of the Telescope's wide field of view, the optical system is unusually susceptible to stray light. In addition, balancing the effect of wind-induced telescope vibrations against Dome seeing is crucial. The rotating enclosure system (Dome) includes a moving wind screen and light baffle system. All of the Dome vents include hinged light baffles, which provide exceptional Dome flushing and stray light attenuation, and allow for vent maintenance access from inside the Dome. The wind screen also functions as a light screen, and helps define a clear optical aperture for the Telescope. The Dome must operate continuously without rotational travel limits to accommodate the Telescope cadence and travel. Consequently, the Azimuth drives are located on the fixed lower enclosure to accommodate glycol water cooling without the need for a utility cable wrap. An air duct system aligns when the Dome is in its parked position, providing air cooling for temperature conditioning of the Dome during the daytime. A bridge crane and a series of ladders, stairs, and platforms provide for the inspection, maintenance, and repair of all of the Dome mechanical systems. The contract to build the Dome was awarded to European Industrial Engineering in Mestre, Italy in May 2015. In this paper, we present the final design of this telescope and site sub-system.
Technology for Large Space Systems: A Special Bibliography with Indexes (Supplement 2)
NASA Technical Reports Server (NTRS)
1980-01-01
This bibliography lists 258 reports, articles, and other documents introduced into the NASA scientific and technical information system between July 1, 1979 and December 31, 1979. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design in the area of the Large Space Systems Technology (LSST) Program. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, solar power satellite systems, and flight experiments.
Time-Resolved Surveys of Stellar Clusters
NASA Astrophysics Data System (ADS)
Eyer, Laurent; Eggenberger, Patrick; Greco, Claudia; Saesen, Sophie; Anderson, Richard I.; Mowlavi, Nami
We describe the information that can be gained when a survey is done multi-epoch, and its particular impact in open cluster research. We first explain the irreplaceable information that multi-epoch observations are giving within astrometry, photometry and spectroscopy. Then we give three examples of results on open clusters from multi-epoch surveys, namely, the distance to the Pleiades, the angular momentum evolution of low mass stars and asteroseismology. Finally we mention several very large surveys, which are ongoing or planned for the future, Gaia, JASMINE, LSST, and VVV.
Technology for large space systems: A special bibliography with indexes (supplement 05)
NASA Technical Reports Server (NTRS)
1981-01-01
This bibliography lists 298 reports, articles, and other documents introduced into the NASA scientific and technical information system between January 1, 1981 and June 30, 1981. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design in the area of the Large Space Systems Technology (LSST) Program. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, solar power satellite systems, and flight experiments.
Technology for large space systems: A special bibliography with indexes (supplement 06)
NASA Technical Reports Server (NTRS)
1982-01-01
This bibliography lists 220 reports, articles and other documents introduced into the NASA scientific and technical information system between July 1, 1981 and December 31, 1981. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design in the area of the Large Space Systems Technology (LSST) Program. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, solar power satellite systems, and flight experiments.
Machine-assisted discovery of relationships in astronomy
NASA Astrophysics Data System (ADS)
Graham, Matthew J.; Djorgovski, S. G.; Mahabal, Ashish A.; Donalek, Ciro; Drake, Andrew J.
2013-05-01
High-volume feature-rich data sets are becoming the bread-and-butter of 21st century astronomy but present significant challenges to scientific discovery. In particular, identifying scientifically significant relationships between sets of parameters is non-trivial. Similar problems in biological and geosciences have led to the development of systems which can explore large parameter spaces and identify potentially interesting sets of associations. In this paper, we describe the application of automated discovery systems of relationships to astronomical data sets, focusing on an evolutionary programming technique and an information-theory technique. We demonstrate their use with classical astronomical relationships - the Hertzsprung-Russell diagram and the Fundamental Plane of elliptical galaxies. We also show how they work with the issue of binary classification which is relevant to the next generation of large synoptic sky surveys, such as the Large Synoptic Survey Telescope (LSST). We find that comparable results to more familiar techniques, such as decision trees, are achievable. Finally, we consider the reality of the relationships discovered and how this can be used for feature selection and extraction.
CMU DeepLens: deep learning for automatic image-based galaxy-galaxy strong lens finding
NASA Astrophysics Data System (ADS)
Lanusse, François; Ma, Quanbin; Li, Nan; Collett, Thomas E.; Li, Chun-Liang; Ravanbakhsh, Siamak; Mandelbaum, Rachel; Póczos, Barnabás
2018-01-01
Galaxy-scale strong gravitational lensing can not only provide a valuable probe of the dark matter distribution of massive galaxies, but also provide valuable cosmological constraints, either by studying the population of strong lenses or by measuring time delays in lensed quasars. Due to the rarity of galaxy-scale strongly lensed systems, fast and reliable automated lens finding methods will be essential in the era of large surveys such as Large Synoptic Survey Telescope, Euclid and Wide-Field Infrared Survey Telescope. To tackle this challenge, we introduce CMU DeepLens, a new fully automated galaxy-galaxy lens finding method based on deep learning. This supervised machine learning approach does not require any tuning after the training step which only requires realistic image simulations of strongly lensed systems. We train and validate our model on a set of 20 000 LSST-like mock observations including a range of lensed systems of various sizes and signal-to-noise ratios (S/N). We find on our simulated data set that for a rejection rate of non-lenses of 99 per cent, a completeness of 90 per cent can be achieved for lenses with Einstein radii larger than 1.4 arcsec and S/N larger than 20 on individual g-band LSST exposures. Finally, we emphasize the importance of realistically complex simulations for training such machine learning methods by demonstrating that the performance of models of significantly different complexities cannot be distinguished on simpler simulations. We make our code publicly available at https://github.com/McWilliamsCenter/CMUDeepLens.
NASA Astrophysics Data System (ADS)
Friedrich, Oliver; Eifler, Tim
2018-01-01
Computing the inverse covariance matrix (or precision matrix) of large data vectors is crucial in weak lensing (and multiprobe) analyses of the large-scale structure of the Universe. Analytically computed covariances are noise-free and hence straightforward to invert; however, the model approximations might be insufficient for the statistical precision of future cosmological data. Estimating covariances from numerical simulations improves on these approximations, but the sample covariance estimator is inherently noisy, which introduces uncertainties in the error bars on cosmological parameters and also additional scatter in their best-fitting values. For future surveys, reducing both effects to an acceptable level requires an unfeasibly large number of simulations. In this paper we describe a way to expand the precision matrix around a covariance model and show how to estimate the leading order terms of this expansion from simulations. This is especially powerful if the covariance matrix is the sum of two contributions, C = A+B, where A is well understood analytically and can be turned off in simulations (e.g. shape noise for cosmic shear) to yield a direct estimate of B. We test our method in mock experiments resembling tomographic weak lensing data vectors from the Dark Energy Survey (DES) and the Large Synoptic Survey Telescope (LSST). For DES we find that 400 N-body simulations are sufficient to achieve negligible statistical uncertainties on parameter constraints. For LSST this is achieved with 2400 simulations. The standard covariance estimator would require >10^5 simulations to reach a similar precision. We extend our analysis to a DES multiprobe case finding a similar performance.
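A toy NumPy illustration of why the C = A + B split helps (this is only the decomposition idea, not the paper's full precision-matrix expansion; all dimensions and numerical values below are made up for the demo): if A is known analytically and simulations run with the A-term switched off sample B directly, then only B carries estimator noise.

```python
import numpy as np

rng = np.random.default_rng(2)
ndim, nsim = 5, 200

# "Analytic" contribution A (e.g. shape noise): known exactly, diagonal here.
A = np.diag(np.full(ndim, 2.0))

# True simulation-only contribution B (hypothetical values for the demo).
L = rng.normal(size=(ndim, ndim)) * 0.3
B_true = L @ L.T + np.eye(ndim)

# Simulations with the A-term switched off sample B directly.
sims = rng.multivariate_normal(np.zeros(ndim), B_true, size=nsim)
B_hat = np.cov(sims, rowvar=False)

# Hybrid precision matrix: exact A plus a noisy-but-unbiased estimate of B.
precision = np.linalg.inv(A + B_hat)

# Relative error with respect to the true precision matrix.
truth = np.linalg.inv(A + B_true)
rel_err = np.linalg.norm(precision - truth) / np.linalg.norm(truth)
```

Because the noise-free A dominates part of the covariance, the inverse of A + B_hat is far less noisy than inverting a fully simulation-estimated C would be; a real analysis would still debias the inverse (e.g. with a Hartlap-type factor).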
Strong Lens Time Delay Challenge. I. Experimental Design
NASA Astrophysics Data System (ADS)
Dobler, Gregory; Fassnacht, Christopher D.; Treu, Tommaso; Marshall, Phil; Liao, Kai; Hojjati, Alireza; Linder, Eric; Rumbaugh, Nicholas
2015-02-01
The time delays between point-like images in gravitational lens systems can be used to measure cosmological parameters. The number of lenses with measured time delays is growing rapidly; the upcoming Large Synoptic Survey Telescope (LSST) will monitor ~10^3 strongly lensed quasars. In an effort to assess the present capabilities of the community to accurately measure the time delays, and to provide input to dedicated monitoring campaigns and future LSST cosmology feasibility studies, we have invited the community to take part in a "Time Delay Challenge" (TDC). The challenge is organized as a set of "ladders," each containing a group of simulated data sets to be analyzed blindly by participating teams. Each rung on a ladder consists of a set of realistic mock observed lensed quasar light curves, with the rungs' data sets increasing in complexity and realism. The initial challenge described here has two ladders, TDC0 and TDC1. TDC0 has a small number of data sets, and is designed to be used as a practice set by the participating teams. The (non-mandatory) deadline for completion of TDC0 was the TDC1 launch date, 2013 December 1. The TDC1 deadline was 2014 July 1. Here we give an overview of the challenge, we introduce a set of metrics that will be used to quantify the goodness of fit, efficiency, precision, and accuracy of the algorithms, and we present the results of TDC0. Thirteen teams participated in TDC0 using 47 different methods. Seven of those teams qualified for TDC1, which is described in the companion paper.
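As a minimal sketch of the measurement the challenge targets (nothing like a competitive TDC entry, which must handle noise models and microlensing), one can grid over trial lags and pick the one that best aligns the two image light curves. The function and the test signal below are our own illustrative choices, assuming uniform sampling:

```python
import numpy as np

def delay_by_shifted_correlation(t, y1, y2, lags):
    """Toy time-delay estimator: evaluate curve 2 at epochs shifted by
    each trial lag (linear interpolation) and return the lag maximizing
    the Pearson correlation with curve 1."""
    best_lag, best_r = lags[0], -np.inf
    for lag in lags:
        inside = t + lag <= t[-1]              # avoid extrapolating past the data
        y2_shifted = np.interp(t[inside] + lag, t, y2)
        r = np.corrcoef(y1[inside], y2_shifted)[0, 1]
        if r > best_r:
            best_lag, best_r = lag, r
    return best_lag

# Demo: a smooth two-component signal, with image B lagging image A by 5 days.
t = np.linspace(0.0, 100.0, 400)

def curve(u):
    return np.sin(0.3 * u) + 0.5 * np.sin(0.07 * u + 1.0)

y_a, y_b = curve(t), curve(t - 5.0)
lag_hat = delay_by_shifted_correlation(t, y_a, y_b, np.linspace(0.0, 10.0, 201))
```

The grid spacing (0.05 days here) sets the resolution of the estimate; real methods refine the peak and propagate measurement errors.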
Fabrication of the LSST monolithic primary-tertiary mirror
NASA Astrophysics Data System (ADS)
Tuell, Michael T.; Martin, Hubert M.; Burge, James H.; Ketelsen, Dean A.; Law, Kevin; Gressler, William J.; Zhao, Chunyu
2012-09-01
As previously reported (at the SPIE Astronomical Instrumentation conference of 2010 in San Diego), the Large Synoptic Survey Telescope (LSST) utilizes a three-mirror design in which the primary (M1) and tertiary (M3) mirrors are two concentric aspheric surfaces on one monolithic substrate. The substrate material is Ohara E6 borosilicate glass, in a honeycomb sandwich configuration, currently in production at The University of Arizona's Steward Observatory Mirror Lab. We will provide an update on the status of the mirrors and metrology systems, which have advanced from concepts to hardware in the past two years. In addition to the normal requirements for smooth surfaces of the appropriate prescriptions, the alignment of the two surfaces must be accurately measured and controlled in the production lab, reducing the degrees of freedom that need to be controlled in the telescope. The surface specification is described as a structure function, related to seeing in excellent conditions. Both the pointing and centration of the two optical axes are important parameters, in addition to the axial spacing of the two vertices. This paper details the manufacturing process and metrology systems for each surface, including the alignment of the two surfaces. M1 is a hyperboloid and can utilize a standard Offner null corrector, whereas M3 is an oblate ellipsoid, so it has positive spherical aberration. The null corrector is a phase-etched computer-generated hologram (CGH) between the mirror surface and the center of curvature. Laser trackers are relied upon to measure the alignment and spacing as well as rough-surface metrology during loose-abrasive grinding.
PERIODOGRAMS FOR MULTIBAND ASTRONOMICAL TIME SERIES
DOE Office of Scientific and Technical Information (OSTI.GOV)
VanderPlas, Jacob T.; Ivezic, Željko
This paper introduces the multiband periodogram, a general extension of the well-known Lomb–Scargle approach for detecting periodic signals in time-domain data. In addition to advantages of the Lomb–Scargle method such as treatment of non-uniform sampling and heteroscedastic errors, the multiband periodogram significantly improves period finding for randomly sampled multiband light curves (e.g., Pan-STARRS, DES, and LSST). The light curves in each band are modeled as arbitrary truncated Fourier series, with the period and phase shared across all bands. The key aspect is the use of Tikhonov regularization which drives most of the variability into the so-called base model common to all bands, while fits for individual bands describe residuals relative to the base model and typically require lower-order Fourier series. This decrease in the effective model complexity is the main reason for improved performance. After a pedagogical development of the formalism of least-squares spectral analysis, which motivates the essential features of the multiband model, we use simulated light curves and randomly subsampled SDSS Stripe 82 data to demonstrate the superiority of this method compared to other methods from the literature and find that this method will be able to efficiently determine the correct period in the majority of LSST's bright RR Lyrae stars with as little as six months of LSST data, a vast improvement over the years of data reported to be required by previous studies. A Python implementation of this method, along with code to fully reproduce the results reported here, is available on GitHub.
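The shared-period idea can be sketched in a few lines of NumPy. This is a deliberately stripped-down stand-in for the paper's method: a single sinusoid with period and phase shared across bands plus a constant offset per band, with no Tikhonov regularization or higher-order Fourier terms (function and variable names are ours, not the paper's):

```python
import numpy as np

def multiband_periodogram(t, y, band, freqs):
    """Toy multiband periodogram: at each trial frequency, least-squares
    fit a sinusoid shared across all bands plus one constant offset per
    band, and report the fractional chi^2 reduction versus the
    constant-per-band reference model."""
    bands = np.unique(band)
    onehot = (band[:, None] == bands[None, :]).astype(float)
    mean_fit, *_ = np.linalg.lstsq(onehot, y, rcond=None)
    chi2_ref = np.sum((y - onehot @ mean_fit) ** 2)
    power = np.empty(len(freqs))
    for i, f in enumerate(freqs):
        w = 2 * np.pi * f
        X = np.column_stack([np.sin(w * t), np.cos(w * t), onehot])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        power[i] = 1.0 - np.sum((y - X @ coef) ** 2) / chi2_ref
    return power

# Demo: two bands with different mean magnitudes sampling one signal at 2 cycles/day.
rng = np.random.default_rng(0)
t = rng.uniform(0, 10, 120)
band = rng.integers(0, 2, 120)
y = (np.where(band == 0, 1.0, 3.0)
     + 0.8 * np.sin(2 * np.pi * 2.0 * t + 0.3)
     + 0.05 * rng.normal(size=120))
freqs = np.linspace(0.5, 5.0, 2000)
best_freq = freqs[np.argmax(multiband_periodogram(t, y, band, freqs))]
```

Pooling all bands into one fit is what lets sparse per-band sampling still pin down the period; the paper's regularized multi-term Fourier model generalizes this.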
AXIS - Advanced X-ray Imaging Satellite
NASA Astrophysics Data System (ADS)
Loewenstein, Michael; AXIS Team
2018-01-01
We present an overview of the Advanced X-ray Imaging Satellite (AXIS), a probe mission concept under study to the 2020 Decadal survey. AXIS follows in the footsteps of the spectacularly successful Chandra X-ray Observatory with similar or higher angular resolution and an order of magnitude more collecting area in the 0.3-10 keV band over a 15' field of view. These capabilities are designed to attain a wide range of science goals such as (i) measuring the event horizon scale structure in AGN accretion disks and the spin of supermassive black holes through monitoring of gravitationally microlensed quasars; (ii) understanding AGN and starburst feedback in galaxies and galaxy clusters through direct imaging of winds and interaction of jets and via spatially resolved imaging of galaxies at high-z; (iii) probing the fueling of AGN by resolving the SMBH sphere of influence in nearby galaxies; (iv) investigating hierarchical structure formation and the SMBH merger rate through measurement of the occurrence rate of dual AGN and occupation fraction of SMBHs; (v) advancing SNR physics and galaxy ecology through large detailed samples of SNR in nearby galaxies; (vi) measuring the Cosmic Web through its connection to cluster outskirts. With a nominal 2028 launch, AXIS benefits from natural synergies with LSST, ELTs, ALMA, WFIRST and ATHENA, and will be a valuable precursor to Lynx. AXIS utilizes breakthroughs in the construction of light-weight X-ray optics from mono-crystalline silicon blocks, and developments in the fabrication of large format, small pixel, high readout detectors.
NASA Astrophysics Data System (ADS)
Petri, Andrea; May, Morgan; Haiman, Zoltán
2016-09-01
Weak gravitational lensing is becoming a mature technique for constraining cosmological parameters, and future surveys will be able to constrain the dark energy equation of state w. When analyzing galaxy surveys, redshift information has proven to be a valuable addition to angular shear correlations. We forecast parameter constraints on the triplet (Ωm, w, σ8) for an LSST-like photometric galaxy survey, using tomography of the shear-shear power spectrum, convergence peak counts, and higher convergence moments. We find that redshift tomography with the power spectrum reduces the area of the 1σ confidence interval in (Ωm, w) space by a factor of 8 with respect to the case of the single highest redshift bin. We also find that adding non-Gaussian information from the peak counts and higher-order moments of the convergence field and its spatial derivatives further reduces the constrained area in (Ωm, w) by factors of 3 and 4, respectively. When we add cosmic microwave background parameter priors from Planck to our analysis, tomography improves power spectrum constraints by a factor of 3. Adding moments yields an improvement by an additional factor of 2, and adding both moments and peaks improves by almost a factor of 3 over power spectrum tomography alone. We evaluate the effect of uncorrected systematic photometric redshift errors on the parameter constraints. We find that different statistics lead to different bias directions in parameter space, suggesting the possibility of eliminating this bias via self-calibration.
The Amateurs' Love Affair with Large Datasets
NASA Astrophysics Data System (ADS)
Price, Aaron; Jacoby, S. H.; Henden, A.
2006-12-01
Amateur astronomers are professionals in other areas. They bring expertise from such varied and technical careers as computer science, mathematics, engineering, and marketing. These skills, coupled with an enthusiasm for astronomy, can be used to help manage the large data sets coming online in the next decade. We will show specific examples where teams of amateurs have been involved in mining large, online data sets and have authored and published their own papers in peer-reviewed astronomical journals. Using the proposed LSST database as an example, we will outline a framework for involving amateurs in data analysis and education with large astronomical surveys.
Data Management challenges in Astronomy and Astroparticle Physics
NASA Astrophysics Data System (ADS)
Lamanna, Giovanni
2015-12-01
Astronomy and Astroparticle Physics domains are experiencing a deluge of data with the next generation of facilities prioritised in the European Strategy Forum on Research Infrastructures (ESFRI), such as SKA, CTA, KM3Net and with other world-class projects, namely LSST, EUCLID, EGO, etc. The new ASTERICS-H2020 project brings together the concerned scientific communities in Europe to work together to find common solutions to their Big Data challenges, their interoperability, and their data access. The presentation will highlight these new challenges and the work being undertaken also in cooperation with e-infrastructures in Europe.
The Cosmic Evolution Through UV Spectroscopy (CETUS) Probe Mission Concept
NASA Astrophysics Data System (ADS)
Danchi, William; Heap, Sara; Woodruff, Robert; Hull, Anthony; Kendrick, Stephen E.; Purves, Lloyd; McCandliss, Stephan; Dodson, Kelly; Mehle, Greg; Burge, James; Valente, Martin; Rhee, Michael; Smith, Walter; Choi, Michael; Stoneking, Eric
2018-01-01
CETUS is a mission concept for an all-UV telescope with 3 scientific instruments: a wide-field camera, a wide-field multi-object spectrograph, and a point-source high- and medium-resolution spectrograph. It is primarily intended to work with other survey telescopes in the 2020s (e.g., E-ROSITA (X-ray), LSST, Subaru, WFIRST (optical-near-IR), and SKA (radio)) to solve major outstanding problems in astrophysics. In this poster presentation, we give an overview of CETUS key science goals and a progress report on the CETUS mission and instrument design.
On determination of charge transfer efficiency of thick, fully depleted CCDs with 55Fe x-rays
Yates, D.; Kotov, I.; Nomerotski, A.
2017-07-01
Charge transfer efficiency (CTE) is one of the most important CCD characteristics. Our paper examines ways to optimize the algorithms used to analyze 55Fe x-ray data on the CCDs, and explores new types of observables for CTE determination that can be used for testing LSST CCDs. Furthermore, the observables are modeled using simple Monte Carlo simulations to determine how charge diffusion in thick, fully depleted silicon affects the measurement. The data are compared to the simulations for one of the observables, the integral flux of the x-ray hit.
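The basic observable behind such measurements is simple to state: after n transfers, a charge packet of Q0 electrons retains Q0·CTE^n, so the nearly monoenergetic 55Fe hits (about 1620 e- for Mn Kα) trace CTE through the decline of hit amplitude with row number. Below is a minimal sketch of that fit, ignoring the charge-diffusion and hit-reconstruction subtleties the paper actually studies; the simulated CTE and noise level are invented for the demo:

```python
import numpy as np

def estimate_cte(transfers, amplitudes):
    """Fit log(Q) = log(Q0) + n*log(CTE): the slope of a straight-line
    fit of log amplitude versus transfer count n gives log(CTE)."""
    slope, _intercept = np.polyfit(transfers, np.log(amplitudes), 1)
    return np.exp(slope)

# Demo with simulated hits at random row positions.
rng = np.random.default_rng(1)
rows = rng.integers(0, 4000, size=500)      # number of parallel transfers per hit
true_cte = 0.999995
amps = 1620.0 * true_cte ** rows * (1 + 0.001 * rng.normal(size=500))
cte_hat = estimate_cte(rows, amps)
```

With CTE this close to unity the total signal loss over thousands of transfers is only a few percent, which is why the analysis algorithms and observables themselves need the careful optimization the paper describes.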
Astrophysics in the Era of Massive Time-Domain Surveys
NASA Astrophysics Data System (ADS)
Djorgovski, G.
Synoptic sky surveys are now the largest data producers in astronomy, entering the Petascale regime and opening the time domain for systematic exploration. A great variety of interesting phenomena, spanning essentially all subfields of astronomy, can only be studied in the time domain, and these new surveys are producing large statistical samples of the known types of objects and events for further studies (e.g., SNe, AGN, variable stars of many kinds), and have already uncovered previously unknown subtypes of these (e.g., rare or peculiar types of SNe). These surveys are generating new science, and paving the way for even larger surveys to come, e.g., the LSST; our ability to fully exploit such forthcoming facilities depends critically on the science, methodology, and experience that are being accumulated now. Among the outstanding challenges, the foremost is our ability to conduct an effective follow-up of the interesting events discovered by the surveys in any wavelength regime. The follow-up resources, especially spectroscopy, are already severely limited and will remain so for the foreseeable future, thus requiring an intelligent down-selection of the most astrophysically interesting events to follow. The first step in that process is an automated, real-time, iterative classification of events that incorporates heterogeneous data from the surveys themselves, archival and contextual information (spatial, temporal, and multiwavelength), and the incoming follow-up observations. The second step is an optimal automated event prioritization and allocation of the available follow-up resources, which also change in time. Both of these challenges are highly non-trivial, and require a strong cyber-infrastructure based on the Virtual Observatory data grid and the various astroinformatics efforts. Time domain astronomy is inherently an astronomy of telescope-computational systems, and will increasingly depend on novel machine learning and artificial intelligence tools.
Another arena with a strong potential for discovery is a purely archival, non-time-critical exploration of the time domain, with the time dimension adding the complexity to an already challenging problem of data mining of highly-dimensional parameter spaces produced by sky surveys.
Mental Fatigue Impairs Soccer-Specific Physical and Technical Performance.
Smith, Mitchell R; Coutts, Aaron J; Merlini, Michele; Deprez, Dieter; Lenoir, Matthieu; Marcora, Samuele M
2016-02-01
To investigate the effects of mental fatigue on soccer-specific physical and technical performance. This investigation consisted of two separate studies. Study 1 assessed the soccer-specific physical performance of 12 moderately trained soccer players using the Yo-Yo Intermittent Recovery Test, Level 1 (Yo-Yo IR1). Study 2 assessed the soccer-specific technical performance of 14 experienced soccer players using the Loughborough Soccer Passing and Shooting Tests (LSPT, LSST). Each test was performed on two occasions and preceded, in a randomized, counterbalanced order, by 30 min of the Stroop task (mentally fatiguing treatment) or 30 min of reading magazines (control treatment). Subjective ratings of mental fatigue were measured before and after treatment, and mental effort and motivation were measured after treatment. Distance run, heart rate, and ratings of perceived exertion were recorded during the Yo-Yo IR1. LSPT performance time was calculated as original time plus penalty time. LSST performance was assessed using shot speed, shot accuracy, and shot sequence time. Subjective ratings of mental fatigue and effort were higher after the Stroop task in both studies (P < 0.001), whereas motivation was similar between conditions. This mental fatigue significantly reduced running distance in the Yo-Yo IR1 (P < 0.001). No difference in heart rate existed between conditions, whereas ratings of perceived exertion were significantly higher at iso-time in the mental fatigue condition (P < 0.01). LSPT original time and performance time were not different between conditions; however, penalty time significantly increased in the mental fatigue condition (P = 0.015). Mental fatigue also impaired shot speed (P = 0.024) and accuracy (P < 0.01), whereas shot sequence time was similar between conditions. Mental fatigue impairs soccer-specific running, passing, and shooting performance.
Strong Gravitational Lensing as a Probe of Gravity, Dark-Matter and Super-Massive Black Holes
NASA Astrophysics Data System (ADS)
Koopmans, L.V.E.; Barnabe, M.; Bolton, A.; Bradac, M.; Ciotti, L.; Congdon, A.; Czoske, O.; Dye, S.; Dutton, A.; Elliasdottir, A.; Evans, E.; Fassnacht, C.D.; Jackson, N.; Keeton, C.; Lasio, J.; Moustakas, L.; Meneghetti, M.; Myers, S.; Nipoti, C.; Suyu, S.; van de Ven, G.; Vegetti, S.; Wucknitz, O.; Zhao, H.-S.
Although considerable effort has been devoted to understanding the properties of galaxies, a full physical picture, connecting their baryonic and dark-matter content, super-massive black holes, and (metric) theories of gravity, is still ill-defined. Strong gravitational lensing furnishes a powerful method to probe gravity in the central regions of galaxies. It can (1) provide a unique detection channel for dark-matter substructure beyond the local galaxy group, (2) constrain dark-matter physics, complementary to direct-detection experiments, as well as metric theories of gravity, (3) probe central super-massive black holes, and (4) provide crucial insight into galaxy formation processes from the dark matter point of view, independently of the nature and state of dark matter. To seriously address the above questions, a considerable increase in the number of strong gravitational-lens systems is required. In the timeframe 2010-2020, a staged approach with radio (e.g. EVLA, e-MERLIN, LOFAR, SKA phase-I) and optical (e.g. LSST and JDEM) instruments can provide 10^(2-4) new lenses, and up to 10^(4-6) new lens systems from SKA/LSST/JDEM all-sky surveys around ~2020. Follow-up imaging of (radio) lenses is necessary with moderate ground- or space-based optical-IR telescopes, and with 30-50 m telescopes for spectroscopy (e.g. TMT, GMT, ELT). To answer these fundamental questions through strong gravitational lensing, a strong investment in large radio and optical-IR facilities is therefore critical in the coming decade. In particular, only large-scale radio lens surveys (e.g. with SKA) provide the large numbers of high-resolution and high-fidelity images of lenses needed for SMBH and flux-ratio anomaly studies.
STRONG LENS TIME DELAY CHALLENGE. I. EXPERIMENTAL DESIGN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dobler, Gregory; Fassnacht, Christopher D.; Rumbaugh, Nicholas
2015-02-01
The time delays between point-like images in gravitational lens systems can be used to measure cosmological parameters. The number of lenses with measured time delays is growing rapidly; the upcoming Large Synoptic Survey Telescope (LSST) will monitor ~10^3 strongly lensed quasars. In an effort to assess the community's present capabilities for accurately measuring time delays, and to provide input to dedicated monitoring campaigns and future LSST cosmology feasibility studies, we have invited the community to take part in a "Time Delay Challenge" (TDC). The challenge is organized as a set of "ladders", each containing a group of simulated data sets to be analyzed blindly by participating teams. Each rung on a ladder consists of a set of realistic mock observed lensed quasar light curves, with the rungs' data sets increasing in complexity and realism. The initial challenge described here has two ladders, TDC0 and TDC1. TDC0 has a small number of data sets and is designed to be used as a practice set by the participating teams. The (non-mandatory) deadline for completion of TDC0 was the TDC1 launch date, 2013 December 1. The TDC1 deadline was 2014 July 1. Here we give an overview of the challenge, introduce a set of metrics that will be used to quantify the goodness of fit, efficiency, precision, and accuracy of the algorithms, and present the results of TDC0. Thirteen teams participated in TDC0 using 47 different methods. Seven of those teams qualified for TDC1, which is described in the companion paper.
STRONG LENS TIME DELAY CHALLENGE. II. RESULTS OF TDC1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liao, Kai; Treu, Tommaso; Marshall, Phil
2015-02-10
We present the results of the first strong lens time delay challenge. The motivation, experimental design, and entry-level challenge are described in a companion paper. This paper presents the main challenge, TDC1, which consisted of analyzing thousands of simulated light curves blindly. The observational properties of the light curves cover the range in quality obtained for current targeted efforts (e.g., COSMOGRAIL) and expected from future synoptic surveys (e.g., LSST), and include simulated systematic errors. Seven teams participated in TDC1, submitting results from 78 different method variants. After describing each method, we compute and analyze basic statistics measuring accuracy (or bias) A, goodness of fit χ², precision P, and success rate f. For some methods we identify outliers as an important issue. Other methods show that outliers can be controlled via visual inspection or conservative quality control. Several methods are competitive, i.e., give |A| < 0.03, P < 0.03, and χ² < 1.5, with some of the methods already reaching sub-percent accuracy. The fraction of light curves yielding a time delay measurement is typically in the range f = 20%-40%. It depends strongly on the quality of the data: COSMOGRAIL-quality cadence and light curve lengths yield significantly higher f than does sparser sampling. Taking the results of TDC1 at face value, we estimate that LSST should provide around 400 robust time-delay measurements, each with P < 0.03 and |A| < 0.01, comparable to current lens modeling uncertainties. In terms of observing strategies, we find that A and f depend mostly on season length, while P depends mostly on cadence and campaign duration.
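The four summary statistics (A, P, χ², f) can be sketched in a few lines of code. The definitions below paraphrase the challenge metrics as averages over the successfully measured systems; the function name and array layout are illustrative, not taken from any TDC code.

```python
import numpy as np

def tdc_metrics(dt_true, dt_est, sigma, measured):
    """Summary statistics in the spirit of the Time Delay Challenge.

    dt_true, dt_est, sigma : arrays of true delays, estimated delays,
    and reported uncertainties; `measured` is a boolean mask of the
    systems for which a delay was actually submitted.
    """
    f = measured.mean()                       # success rate
    t = dt_true[measured]
    e = dt_est[measured]
    s = sigma[measured]
    A = np.mean((e - t) / t)                  # accuracy (fractional bias)
    P = np.mean(s / np.abs(t))                # precision (relative uncertainty)
    chi2 = np.mean(((e - t) / s) ** 2)        # goodness of fit
    return A, P, chi2, f
```

A "competitive" method in the paper's sense would then satisfy |A| < 0.03, P < 0.03, and χ² < 1.5 on the measured subset.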
PROFIT: Bayesian profile fitting of galaxy images
NASA Astrophysics Data System (ADS)
Robotham, A. S. G.; Taranu, D. S.; Tobar, R.; Moffett, A.; Driver, S. P.
2017-04-01
We present PROFIT, a new code for Bayesian two-dimensional photometric galaxy profile modelling. PROFIT consists of a low-level C++ library (
Unveiling the population of orphan γ-ray bursts
NASA Astrophysics Data System (ADS)
Ghirlanda, G.; Salvaterra, R.; Campana, S.; Vergani, S. D.; Japelj, J.; Bernardini, M. G.; Burlon, D.; D'Avanzo, P.; Melandri, A.; Gomboc, A.; Nappo, F.; Paladini, R.; Pescalli, A.; Salafia, O. S.; Tagliaferri, G.
2015-06-01
Gamma-ray bursts (GRBs) are detectable in the γ-ray band if their jets are oriented toward the observer. However, for each GRB with a typical jet half-opening angle θ_jet, there should be ~2/θ_jet² bursts whose emission cone is oriented elsewhere in space. These off-axis bursts can eventually be detected when, owing to the deceleration of their relativistic jets, the beaming angle becomes comparable to the viewing angle. Orphan afterglows (OAs) should outnumber the current population of bursts detected in the γ-ray band, even though they have not been conclusively observed so far at any frequency. We compute the expected flux of the population of orphan afterglows in the mm, optical, and X-ray bands through a population synthesis code of GRBs and the standard afterglow emission model. We estimate the detection rate of OAs with ongoing and forthcoming surveys. The average duration of OAs as transients above a given limiting flux is derived and described with analytical expressions: in general, OAs should appear as daily transients in optical surveys and as monthly/yearly transients in the mm/radio band. We find that ~2 OAs yr⁻¹ could already be detected by Gaia and up to 20 OAs yr⁻¹ could be observed by the ZTF survey. A larger number of 50 OAs yr⁻¹ should be detected by LSST in the optical band. In the X-ray band, ~26 OAs yr⁻¹ could be detected by eROSITA. For the large population of OAs detectable by LSST, X-ray and optical follow-up of the light curve (for the brightest cases) and/or extensive follow-up of their emission in the mm and radio bands could be the key to disentangling their GRB nature from other extragalactic transients of comparable flux density.
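The ~2/θ_jet² scaling follows from simple solid-angle geometry; a minimal sketch (the 5° opening angle below is only an illustrative value, not one quoted in the abstract):

```python
import math

def orphan_ratio(theta_jet_rad):
    """Ratio of off-axis (orphan) to on-axis bursts for a two-sided jet.

    The fraction of the sky covered by both jet cones is
    2 * 2*pi*(1 - cos(theta_jet)) / (4*pi) = 1 - cos(theta_jet),
    which for small angles is ~theta_jet^2 / 2, giving the
    ~2/theta_jet^2 scaling quoted in the abstract.
    """
    p_on_axis = 1.0 - math.cos(theta_jet_rad)
    return (1.0 - p_on_axis) / p_on_axis

# For a jet half-opening angle of ~5 degrees, each detected burst
# implies a couple of hundred orphan counterparts.
theta = math.radians(5.0)
ratio = orphan_ratio(theta)
```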
Optical testing of the LSST combined primary/tertiary mirror
NASA Astrophysics Data System (ADS)
Tuell, Michael T.; Martin, Hubert M.; Burge, James H.; Gressler, William J.; Zhao, Chunyu
2010-07-01
The Large Synoptic Survey Telescope (LSST) utilizes a three-mirror design in which the primary (M1) and tertiary (M3) mirrors are two concentric aspheric surfaces on one monolithic substrate. The substrate material is Ohara E6 borosilicate glass, in a honeycomb sandwich configuration, currently in production at The University of Arizona's Steward Observatory Mirror Lab. In addition to the normal requirements for smooth surfaces of the appropriate prescriptions, the alignment of the two surfaces must be accurately measured and controlled in the production lab. Both the pointing and centration of the two optical axes are important parameters, in addition to the axial spacing of the two vertices. This paper describes the basic metrology systems for each surface, with particular attention to the alignment of the two surfaces. These surfaces are aspheric enough to require null correctors for each wavefront. Both M1 and M3 are concave surfaces with both non-zero conic constants and higher-order terms (6th order for M1 and both 6th and 8th orders for M3). M1 is hyperboloidal and can utilize a standard Offner null corrector. M3 is an oblate ellipsoid, so has positive spherical aberration. We have chosen to place a phase-etched computer-generated hologram (CGH) between the mirror surface and the center-of-curvature (CoC), whereas the M1 null lens is beyond the CoC. One relatively new metrology tool is the laser tracker, which is relied upon to measure the alignment and spacings. A separate laser tracker system will be used to measure both surfaces during loose abrasive grinding and initial polishing.
Computer analysis of digital sky surveys using citizen science and manual classification
NASA Astrophysics Data System (ADS)
Kuminski, Evan; Shamir, Lior
2015-01-01
As current and future digital sky surveys such as SDSS, LSST, DES, Pan-STARRS, and Gaia create increasingly massive databases containing millions of galaxies, there is a growing need to analyze these data efficiently. One approach is manual analysis; however, this may be insufficient given the extremely vast pipelines of astronomical images generated by present and future surveys. Some efforts have been made to use citizen science to classify galaxies by their morphology on a larger scale than individual scientists or small groups can. While citizen science efforts such as Zooniverse have helped obtain reasonably accurate morphological information about large numbers of galaxies, they cannot scale to provide complete analysis of the billions of galaxy images that will be collected by future ventures such as LSST. Since current forms of manual classification cannot scale to the masses of data collected by digital sky surveys, it is clear that keeping up with the growing databases will require some form of automation of the data analysis, working either independently or in combination with human analysis such as citizen science. Here we describe a computer vision method that can automatically analyze galaxy images and deduce galaxy morphology. Experiments using Galaxy Zoo 2 data show that the performance of the method increases as the degree of agreement between the citizen scientists gets higher, providing a cleaner dataset. For several morphological features, such as the spirality of the galaxy, the algorithm agreed with the citizen scientists on around 95% of the samples. However, the method failed on some morphological features, such as the number of spiral arms, providing an accuracy of just ~36%.
Solar system science with ESA Euclid
NASA Astrophysics Data System (ADS)
Carry, B.
2018-01-01
Context. The ESA Euclid mission has been designed to map the geometry of the dark Universe. Scheduled for launch in 2020, it will conduct a six-year visible and near-infrared imaging and spectroscopic survey over 15,000 deg² down to V_AB ~ 24.5. Although the survey will avoid ecliptic latitudes below 15°, the survey pattern, in repeated sequences of four broadband filters, seems well-adapted to detecting and characterizing solar system objects (SSOs). Aims: We aim to evaluate the capability of Euclid to discover SSOs and to measure their position, apparent magnitude, and spectral energy distribution. We also investigate how the SSO orbits, morphology (activity and multiplicity), physical properties (rotation period, spin orientation, and 3D shape), and surface composition can be determined based on these measurements. Methods: We used the current census of SSOs to extrapolate the total number of SSOs that will be detectable by Euclid, that is, objects within the survey area and brighter than the limiting magnitude. For each population of SSOs, from neighboring near-Earth asteroids to distant Kuiper-belt objects (KBOs), and including comets, we compared the expected Euclid astrometry, photometry, and spectroscopy with the SSO properties to estimate how Euclid will constrain their dynamical, physical, and compositional properties. Results: With the current survey design, about 150,000 SSOs, mainly from the asteroid main belt, should be observable by Euclid. These objects will all have high inclination, unlike the targets of many SSO surveys, which focus on the ecliptic plane. Euclid may be able to discover several 10^4 SSOs, in particular distant KBOs at high declination. The Euclid observations will consist of a suite of four sequences of four measurements and will refine the spectral classification of SSOs by extending the spectral coverage provided by Gaia and the LSST, for instance, to 2 microns.
Combined with sparse photometry such as that measured by Gaia and the LSST, the time-resolved photometry will contribute to determining SSO rotation periods, spin orientations, and 3D shape models. The sharp and stable point-spread function of Euclid will also allow us to resolve binary systems in the Kuiper belt and to detect activity around Centaurs. Conclusions: The depth of the Euclid survey (V_AB ~ 24.5), its spectral coverage (0.5 to 2.0 μm), and its observation cadence have great potential for solar system research. A dedicated processing chain for SSOs is being set up within the Euclid consortium to produce astrometry catalogs, multicolor and time-resolved photometry, and spectral classification of some 10^5 SSOs, which will be delivered as Legacy Science.
voevent-parse: Parse, manipulate, and generate VOEvent XML packets
NASA Astrophysics Data System (ADS)
Staley, Tim D.
2014-11-01
voevent-parse, written in Python, parses, manipulates, and generates VOEvent XML packets; it is built atop lxml.objectify. Details of transients detected by many projects, including Fermi, Swift, and the Catalina Sky Survey, are currently made available as VOEvents, which will also be the standard alert format for future facilities such as LSST and SKA. However, working with XML and adhering to the sometimes lengthy VOEvent schema can be a tricky process. voevent-parse provides convenience routines for common tasks, while allowing the user to utilise the full power of the lxml library when required. An earlier version of voevent-parse was part of the pysovo (ascl:1411.002) library.
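To illustrate the underlying task (not the voevent-parse API itself), here is a minimal sketch of pulling fields out of a VOEvent-style packet using only Python's standard-library ElementTree; the packet fragment below is hand-written for illustration, not a real alert, and real code would use voevent-parse/lxml and validate against the schema.

```python
import xml.etree.ElementTree as ET

# A hand-written, minimal VOEvent 2.0-style fragment (illustrative only).
packet = """<voe:VOEvent xmlns:voe="http://www.ivoa.net/xml/VOEvent/v2.0"
             version="2.0" role="test" ivorn="ivo://example/stream#1">
  <Who><Date>2014-11-01T00:00:00</Date></Who>
  <What>
    <Param name="mag" value="19.2" ucd="phot.mag"/>
  </What>
</voe:VOEvent>"""

root = ET.fromstring(packet)

# The identifier lives in the root element's attributes; the child
# elements here are in the default (empty) namespace, so plain paths work.
ivorn = root.attrib["ivorn"]
mag = float(root.find("./What/Param[@name='mag']").attrib["value"])
```

Libraries like voevent-parse wrap this kind of navigation in convenience accessors so users rarely touch raw XPath.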
The Hyper Suprime-Cam software pipeline
NASA Astrophysics Data System (ADS)
Bosch, James; Armstrong, Robert; Bickerton, Steven; Furusawa, Hisanori; Ikeda, Hiroyuki; Koike, Michitaro; Lupton, Robert; Mineo, Sogo; Price, Paul; Takata, Tadafumi; Tanaka, Masayuki; Yasuda, Naoki; AlSayyad, Yusra; Becker, Andrew C.; Coulton, William; Coupon, Jean; Garmilla, Jose; Huang, Song; Krughoff, K. Simon; Lang, Dustin; Leauthaud, Alexie; Lim, Kian-Tat; Lust, Nate B.; MacArthur, Lauren A.; Mandelbaum, Rachel; Miyatake, Hironao; Miyazaki, Satoshi; Murata, Ryoma; More, Surhud; Okura, Yuki; Owen, Russell; Swinbank, John D.; Strauss, Michael A.; Yamada, Yoshihiko; Yamanoi, Hitomi
2018-01-01
In this paper, we describe the optical imaging data processing pipeline developed for the Subaru Telescope's Hyper Suprime-Cam (HSC) instrument. The HSC Pipeline builds on the prototype pipeline being developed by the Large Synoptic Survey Telescope's Data Management system, adding customizations for HSC, large-scale processing capabilities, and novel algorithms that have since been reincorporated into the LSST codebase. While designed primarily to reduce HSC Subaru Strategic Program (SSP) data, it is also the recommended pipeline for reducing general-observer HSC data. The HSC pipeline includes high-level processing steps that generate coadded images and science-ready catalogs as well as low-level detrending and image characterizations.
Cosmological N-body Simulation of Galaxy and Large-Scale Structure Formation: The Gravity Frontier
NASA Astrophysics Data System (ADS)
Klypin, Anatoly
2015-04-01
One of the first N-body simulations, done almost 50 years ago, had only 200 self-gravitating particles. Even this first baby step made a substantial impact on our understanding of how astronomical objects should form. Now powerful supercomputers and new algorithms allow astronomers to produce N-body simulations that employ up to a trillion dark matter particles and yield vital theoretical predictions regarding the formation, evolution, structure, and statistics of objects ranging from dwarf galaxies to clusters and superclusters of galaxies. With only gravity involved in these theoretical models, one would naively expect that by now we should know everything we need about the N-body dynamics of cosmological fluctuations. That is not the case. It appears that the Universe was not cooperative and gave us divergences in the initial conditions generated during the inflation epoch and the subsequent expansion of the Universe: the infinite phase-space density and divergent density fluctuations. Ever-increasing observational demands on the statistics and accuracy of theoretical predictions are another driving force for more realistic and larger N-body simulations. Large current and planned observational projects such as BOSS, eBOSS, Euclid, and LSST will bring information on the spatial distribution, motion, and properties of millions of galaxies at different redshifts. Direct simulations of the evolution of gas and the formation of stars for millions of forming galaxies will not be available for years, leaving astronomers with only one option: to develop methods that combine large N-body simulations with models of galaxy formation to produce accurate theoretical predictions. I will discuss the current status of the field and the directions of its development.
Petri, Andrea; May, Morgan; Haiman, Zoltán
2016-09-30
Weak gravitational lensing is becoming a mature technique for constraining cosmological parameters, and future surveys will be able to constrain the dark energy equation of state w. When analyzing galaxy surveys, redshift information has proven to be a valuable addition to angular shear correlations. We forecast parameter constraints on the triplet (Ω_m, w, σ_8) for an LSST-like photometric galaxy survey, using tomography of the shear-shear power spectrum, convergence peak counts, and higher convergence moments. We find that redshift tomography with the power spectrum reduces the area of the 1σ confidence interval in (Ω_m, w) space by a factor of 8 with respect to the case of the single highest redshift bin. We also find that adding non-Gaussian information from the peak counts and higher-order moments of the convergence field and its spatial derivatives further reduces the constrained area in (Ω_m, w) by factors of 3 and 4, respectively. When we add cosmic microwave background parameter priors from Planck to our analysis, tomography improves power spectrum constraints by a factor of 3. Adding moments yields an improvement by an additional factor of 2, and adding both moments and peaks improves by almost a factor of 3 over power spectrum tomography alone. We evaluate the effect of uncorrected systematic photometric redshift errors on the parameter constraints. In conclusion, we find that different statistics lead to different bias directions in parameter space, suggesting the possibility of eliminating this bias via self-calibration.
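The "area of the 1σ confidence interval" figure of merit can be sketched from a parameter covariance matrix: for a 2D Gaussian posterior, the confidence ellipse enclosing a region of Δχ² has area π · Δχ² · √(det C). The covariance values below are made-up numbers for illustration, not the paper's forecasts.

```python
import numpy as np

def ellipse_area(cov, delta_chi2=2.30):
    """Area of the confidence ellipse for two parameters.

    For a 2D Gaussian posterior with covariance `cov`, the ellipse
    enclosing the standard "1 sigma" region (delta chi^2 = 2.30 for
    two parameters) has area pi * delta_chi2 * sqrt(det cov).
    """
    return np.pi * delta_chi2 * np.sqrt(np.linalg.det(cov))

# Illustrative (made-up) covariances for (Omega_m, w): shrinking the
# covariance by a factor of 8 shrinks the ellipse area by ~8x, the
# kind of gain the abstract attributes to redshift tomography.
single_bin = np.array([[4.0e-4, 0.0],
                       [0.0,    1.6e-2]])
tomographic = single_bin / 8.0
ratio = ellipse_area(single_bin) / ellipse_area(tomographic)
```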
Future Sky Surveys: New Discovery Frontiers
NASA Astrophysics Data System (ADS)
Tyson, J. Anthony; Borne, Kirk D.
2012-03-01
Driven by the availability of new instrumentation, there has been an evolution in astronomical science toward comprehensive investigations of new phenomena. Major advances in our understanding of the Universe over the history of astronomy have often arisen from dramatic improvements in our capability to observe the sky to greater depth, in previously unexplored wavebands, with higher precision, or with improved spatial, spectral, or temporal resolution. Substantial progress in the important scientific problems of the next decade (determining the nature of dark energy and dark matter, studying the evolution of galaxies and the structure of our own Milky Way, opening up the time domain to discover faint variable objects, and mapping both the inner and outer Solar System) can be achieved through the application of advanced data mining methods and machine learning algorithms operating on the numerous large astronomical databases that will be generated from a variety of revolutionary future sky surveys. Over the next decade, astronomy will irrevocably enter the era of big surveys and of really big telescopes. New sky surveys (some of which will produce petabyte-scale data collections) will begin their operations, and one or more very large telescopes (ELTs = Extremely Large Telescopes) will enter the construction phase. These programs and facilities will generate a remarkable wealth of data of high complexity, endowed with enormous scientific knowledge discovery potential. New parameter spaces will be opened, in multiple wavelength domains as well as the time domain, across wide areas of the sky, and down to unprecedented faint source flux limits. 
The synergies of grand facilities, massive data collections, and advanced machine learning algorithms will come together to enable discoveries within most areas of astronomical science, including the Solar System, exo-planets, star formation, stellar populations, stellar death, galaxy assembly, galaxy evolution, quasar evolution, and cosmology. Current and future sky surveys, comprising an alphabet soup of project names (e.g., Pan-STARRS, WISE, Kepler, DES, VST, VISTA, GAIA, EUCLID, SKA, LSST, and WFIRST; some of which are discussed in Chapters 17, 18, and 20), will contribute to the exponential explosion of complex data in astronomy. The scientific goals of these projects are as monumental as the programs themselves. The core scientific output of all of these will be their scientific data collections. Consequently, data mining and machine learning algorithms and specialists will become a common component of future astronomical research with these facilities. This synergistic combination and collaboration among multiple disciplines are essential in order to maximize the scientific discovery potential, the science output, the research efficiency, and the success of these projects.
IAC level "O" program development
NASA Technical Reports Server (NTRS)
Vos, R. G.
1982-01-01
The current status of the IAC development activity is summarized. The listed prototype software and documentation were delivered, and details were planned for development of the level 1 operational system. The planned end-product IAC is required to support LSST design analysis and performance evaluation, with emphasis on the coupling of the required technical disciplines. The long-term IAC effectively provides two distinct features: a specific set of analysis modules (thermal, structural, controls, antenna radiation performance, and instrument optical performance) that will function together with the IAC supporting software in an integrated and user-friendly manner; and a general framework whereby new analysis modules can readily be incorporated into IAC or be allowed to communicate with it.
Protecting Dark Skies in Chile
NASA Astrophysics Data System (ADS)
Smith, R. Chris; Sanhueza, Pedro; Phillips, Mark
2018-01-01
Current projections indicate that Chile will host approximately 70% of the astronomical collecting area on Earth by 2030, augmenting the enormous area of ALMA with that of three next-generation optical telescopes: LSST, GMTO, and E-ELT. These cutting-edge facilities represent billions of dollars of investment in the astronomical facilities hosted in Chile. The Chilean government, Chilean astronomical community, and the international observatories in Chile have recognized that these investments are threatened by light pollution, and have formed a strong collaboration to work at managing the threats. We will provide an update on the work being done in Chile, ranging from training municipalities about new lighting regulations to exploring international recognition of the dark sky sites of Northern Chile.
Machine Learning for Zwicky Transient Facility
NASA Astrophysics Data System (ADS)
Mahabal, Ashish; Zwicky Transient Facility, Catalina Real-Time Transient Survey
2018-01-01
The Zwicky Transient Facility (ZTF) will operate from 2018 to 2020, covering the accessible sky with its large 47 square degree camera. The transient detection rate is expected to be about a million per night. ZTF is thus a perfect LSST prototype. The big difference is that all of the ZTF transients can be followed up by 4- to 8-m class telescopes. Given the large numbers, using human scanners to separate genuine transients from artifacts is out of the question. Both that first step and the classification of transients with minimal follow-up require machine learning. We describe the tools and plans to take on this task using follow-up facilities and knowledge gained from archival datasets.
Agile software development in an earned value world: a survival guide
NASA Astrophysics Data System (ADS)
Kantor, Jeffrey; Long, Kevin; Becla, Jacek; Economou, Frossie; Gelman, Margaret; Juric, Mario; Lambert, Ron; Krughoff, Simon; Swinbank, John D.; Wu, Xiuqin
2016-08-01
Agile methodologies are current best practice in software development. They are favored for, among other reasons, preventing premature optimization by taking a somewhat short-term focus, and allowing frequent replanning and reprioritization of upcoming development work based on recent results and the current backlog. At the same time, funding agencies prescribe earned value management accounting for large projects which, these days, inevitably include substantial software components. Earned value approaches emphasize a more comprehensive and typically longer-range plan, and tend to characterize frequent replans and reprioritizations as indicative of problems. Here we describe the planning, execution, and reporting framework used by the LSST Data Management team that navigates these opposing tensions.
Characterising CCDs with cosmic rays
Fisher-Levine, M.; Nomerotski, A.
2015-08-06
The properties of cosmic ray muons make them a useful probe for measuring the properties of thick, fully depleted CCD sensors. The known energy deposition per unit length allows measurement of the gain of the sensor's amplifiers, whilst the straightness of the tracks allows a crude assessment of the static lateral electric fields at the sensor's edges. The small volume in which the muons deposit their energy allows measurement of the contribution to the PSF from the diffusion of charge as it drifts across the sensor. In this work we present a validation of the cosmic ray gain measurement technique by comparing with radioisotope gain measurements, and calculate the charge diffusion coefficient for prototype LSST sensors.
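The gain measurement rests on the known energy deposition per unit length; a minimal sketch of the arithmetic, using the textbook figure of roughly 80 electron-hole pairs liberated per micron of silicon by a minimum-ionizing muon (an assumed number, not one quoted from the paper):

```python
# Assumed physics input: a minimum-ionizing muon liberates roughly
# 80 electron-hole pairs per micron of silicon track length
# (a standard textbook value, not taken from this paper).
PAIRS_PER_MICRON = 80.0

def gain_from_track(adu_per_micron):
    """System gain in electrons/ADU.

    Given the measured signal per unit path length of a muon track
    (in ADU per micron), the gain is simply the known charge
    deposition rate divided by the measured signal rate.
    """
    return PAIRS_PER_MICRON / adu_per_micron

# If a track yields e.g. 20 ADU per micron of path length, the
# implied gain is 80 / 20 = 4 electrons per ADU.
gain = gain_from_track(20.0)
```

In practice one fits the distribution of many track segments (the deposition is Landau-distributed), which is why the paper cross-checks against radioisotope gain measurements.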
Managing Astronomy Research Data: Case Studies of Big and Small Research Projects
NASA Astrophysics Data System (ADS)
Sands, Ashley E.
2015-01-01
Astronomy data management refers to all actions taken upon data over the course of the entire research process. It includes activities involving the collection, organization, analysis, release, storage, archiving, preservation, and curation of research data. Astronomers have cultivated data management tools, infrastructures, and local practices to ensure the use and future reuse of their data. However, new sky surveys will soon amass petabytes of data requiring new data management strategies. The goal of this dissertation, to be completed in 2015, is to identify and understand data management practices and the infrastructure and expertise required to support best practices. This will benefit the astronomy community in efforts toward an integrated scholarly communication framework. This dissertation employs qualitative, social science research methods (including interviews, observations, and document analysis) to conduct case studies of data management practices, covering the entire data lifecycle, amongst three populations: Sloan Digital Sky Survey (SDSS) collaboration team members; individual and small-group users of SDSS data; and Large Synoptic Survey Telescope (LSST) collaboration team members. I have been observing the collection, release, and archiving of data by the SDSS collaboration, the data practices of individuals and small groups using SDSS data in journal articles, and the LSST collaboration's planning and building of infrastructure to produce data. Preliminary results demonstrate that current data management practices in astronomy are complex, situational, and heterogeneous. Astronomers often have different management repertoires for working on sky surveys and for their own data collections, varying their data practices as they move between projects.
The multitude of practices complicates coordinated efforts to maintain data. While astronomy expertise proves critical to managing astronomy data in the short, medium, and long term, the larger astronomy data workforce encompasses a greater breadth of educational backgrounds. Results show that teams of individuals with distinct expertise are key to ensuring the long-term preservation and usability of astronomy datasets.
Color Me Intrigued: The Discovery of iPTF 16fnm, an SN 2002cx-like Object
NASA Astrophysics Data System (ADS)
Miller, A. A.; Kasliwal, M. M.; Cao, Y.; Adams, S. M.; Goobar, A.; Knežević, S.; Laher, R. R.; Lunnan, R.; Masci, F. J.; Nugent, P. E.; Perley, D. A.; Petrushevska, T.; Quimby, R. M.; Rebbapragada, U. D.; Sollerman, J.; Taddia, F.; Kulkarni, S. R.
2017-10-01
Modern wide-field, optical time-domain surveys must solve a basic optimization problem: maximize the number of transient discoveries or minimize the follow-up needed for the new discoveries. Here, we describe the Color Me Intrigued experiment, the first from the intermediate Palomar Transient Factory (iPTF) to search for transients simultaneously in the g_PTF and R_PTF bands. During the course of this experiment, we discovered iPTF 16fnm, a new member of the 02cx-like subclass of Type Ia supernovae (SNe). iPTF 16fnm peaked at M_g,PTF = -15.09 +/- 0.17 mag, making it the second-least-luminous known SN Ia. iPTF 16fnm exhibits all the hallmarks of the 02cx-like class: (i) low luminosity at peak, (ii) low ejecta velocities, and (iii) a non-nebular spectrum several months after peak. Spectroscopically, iPTF 16fnm exhibits a striking resemblance to two other low-luminosity 02cx-like SNe: SN 2007qd and SN 2010ae. iPTF 16fnm and SN 2005hk decline at nearly the same rate, despite a 3 mag difference in brightness at peak. When considering the full subclass of 02cx-like SNe, we do not find evidence for a tight correlation between peak luminosity and decline rate in either the g' or r' band. We measure the relative rate of 02cx-like SNe to normal SNe Ia and find N_02cx/N_Ia = 33^{+158}_{-25}%. We further examine the g' - r' evolution of 02cx-like SNe and find that their unique color evolution can be used to separate them from 91bg-like and normal SNe Ia. This selection function will be especially important in the spectroscopically incomplete Zwicky Transient Facility/Large Synoptic Survey Telescope (LSST) era. Finally, we close by recommending that LSST periodically evaluate, and possibly update, its observing cadence to maximize transient science.
Liverpool telescope 2: a new robotic facility for rapid transient follow-up
NASA Astrophysics Data System (ADS)
Copperwheat, C. M.; Steele, I. A.; Barnsley, R. M.; Bates, S. D.; Bersier, D.; Bode, M. F.; Carter, D.; Clay, N. R.; Collins, C. A.; Darnley, M. J.; Davis, C. J.; Gutierrez, C. M.; Harman, D. J.; James, P. A.; Knapen, J. H.; Kobayashi, S.; Marchant, J. M.; Mazzali, P. A.; Mottram, C. J.; Mundell, C. G.; Newsam, A.; Oscoz, A.; Palle, E.; Piascik, A.; Rebolo, R.; Smith, R. J.
2015-03-01
The Liverpool Telescope is one of the world's premier facilities for time domain astronomy. The time domain landscape is set to radically change in the coming decade, with synoptic all-sky surveys such as LSST providing huge numbers of transient detections on a nightly basis; transient detections across the electromagnetic spectrum from other major facilities such as SVOM, SKA and CTA; and the era of `multi-messenger astronomy', wherein astrophysical events are detected via non-electromagnetic means, such as neutrino or gravitational wave emission. We describe here our plans for the Liverpool Telescope 2: a new robotic telescope designed to capitalise on this new era of time domain astronomy. LT2 will be a 4-metre class facility co-located with the Liverpool Telescope at the Observatorio del Roque de Los Muchachos on the Canary island of La Palma. The telescope will be designed for extremely rapid response: the aim is that the telescope will take data within 30 seconds of the receipt of a trigger from another facility. The motivation for this is twofold: firstly it will make it a world-leading facility for the study of fast fading transients and explosive phenomena discovered at early times. Secondly, it will enable large-scale programmes of low-to-intermediate resolution spectral classification of transients to be performed with great efficiency. In the target-rich environment of the LSST era, minimising acquisition overheads will be key to maximising the science gains from any follow-up programme. The telescope will have a diverse instrument suite which is simultaneously mounted for automatic changes, but it is envisaged that the primary instrument will be an intermediate resolution, optical/infrared spectrograph for scientific exploitation of transients discovered with the next generation of synoptic survey facilities. In this paper we outline the core science drivers for the telescope, and the requirements for the optical and mechanical design.
AXIS - A High Angular Resolution X-ray Probe Concept Study
NASA Astrophysics Data System (ADS)
Mushotzky, Richard; AXIS Study Team
2018-01-01
AXIS is a probe-class concept under study for the 2020 Decadal Survey. AXIS will extend and enhance the science of high angular resolution X-ray imaging and spectroscopy in the next decade with ~0.3" angular resolution over a 7' radius field of view and an order of magnitude more collecting area than Chandra in the 0.3-12 keV band, at a cost consistent with a probe. These capabilities enable major advances in a wide range of science, such as: (i) measuring the event-horizon-scale structure in AGN accretion disks and the spins of supermassive black holes through observations of gravitationally microlensed quasars; (ii) determining AGN and starburst feedback in galaxies and galaxy clusters through direct imaging of winds and the interaction of jets, and via spatially resolved imaging of galaxies at high-z; (iii) fueling of AGN by probing the Bondi radius of over 20 nearby galaxies; (iv) hierarchical structure formation and the SMBH merger rate through measurement of the occurrence rate of dual AGN and the occupation fraction of SMBHs; (v) advancing SNR physics and galaxy ecology through large detailed samples of SNR in nearby galaxies; (vi) measuring the Cosmic Web through its connection to cluster outskirts. With a nominal 2028 launch, AXIS benefits from natural synergies with the ELTs, LSST, ALMA, WFIRST and ATHENA. AXIS utilizes breakthroughs in the construction of lightweight X-ray optics from mono-crystalline silicon blocks, and developments in the fabrication of large-format, small-pixel, high-readout-rate detectors, allowing a robust and cost-effective design. The AXIS team welcomes input and feedback from the community in preparation for the 2020 Decadal review.
The Unprecedented Properties of the First Electromagnetic Counterpart to a Gravitational-wave Source
NASA Astrophysics Data System (ADS)
Siebert, M. R.; Foley, R. J.; Drout, M. R.; Kilpatrick, C. D.; Shappee, B. J.; Coulter, D. A.; Kasen, D.; Madore, B. F.; Murguia-Berthier, A.; Pan, Y.-C.; Piro, A. L.; Prochaska, J. X.; Ramirez-Ruiz, E.; Rest, A.; Contreras, C.; Morrell, N.; Rojas-Bravo, C.; Simon, J. D.
2017-10-01
We discovered Swope Supernova Survey 2017a (SSS17a) in the LIGO/Virgo Collaboration (LVC) localization volume of GW170817, the first detected binary neutron star (BNS) merger, only 10.9 hr after the trigger. No object was present at the location of SSS17a only a few days earlier, providing a qualitative spatial and temporal association with GW170817. Here, we quantify this association, finding that SSS17a is almost certainly the counterpart of GW170817, with the chance of a coincidence being ≤ 9 × 10^-6 (90% confidence). We arrive at this conclusion by comparing the optical properties of SSS17a to other known astrophysical transients, finding that SSS17a fades and cools faster than any other observed transient. For instance, SSS17a fades >5 mag in g within 7 days of our first data point, while all other known transients of similar luminosity fade by <1 mag during the same time period. Its spectra are also unique, being mostly featureless, even as it cools. The rarity of “SSS17a-like” transients combined with the relatively small LVC localization volume and recent non-detection imply the extremely unlikely chance coincidence. We find that the volumetric rate of SSS17a-like transients is ≤ 1.6 × 10^4 Gpc^-3 yr^-1 and the Milky Way rate is ≤ 0.19 per century. A transient survey designed to discover similar events should be high cadence and observe in red filters. The LVC will likely detect substantially more BNS mergers than current optical surveys will independently discover SSS17a-like transients; however, a 1-day-cadence survey with the Large Synoptic Survey Telescope (LSST) could discover an order of magnitude more events.
Tidal features of classical Milky Way satellites in a Λ cold dark matter universe
NASA Astrophysics Data System (ADS)
Wang, M.-Y.; Fattahi, Azadeh; Cooper, Andrew P.; Sawala, Till; Strigari, Louis E.; Frenk, Carlos S.; Navarro, Julio F.; Oman, Kyle; Schaller, Matthieu
2017-07-01
We use the APOSTLE (A Project Of Simulating The Local Environment) cosmological hydrodynamic simulations to examine the effects of tidal stripping on cold dark matter subhaloes that host three of the most luminous Milky Way dwarf satellite galaxies: Fornax, Sculptor and Leo I. We identify simulated satellites that match the observed spatial and kinematic distributions of stars in these galaxies, and track their evolution after infall. We find ~30 per cent of subhaloes hosting satellites with present-day stellar mass 10^6-10^8 M⊙ experience >20 per cent stellar mass-loss after infall. Fornax analogues have earlier infall times compared to Sculptor and Leo I analogues. Star formation in Fornax analogues continues for ~3-6 Gyr after infall, whereas Sculptor and Leo I analogues stop forming stars <2-3 Gyr after infall. Fornax analogues typically show more significant stellar mass-loss and exhibit stellar tidal tails, whereas Sculptor and Leo I analogues, which are more deeply embedded in their host dark matter haloes at infall, do not show substantial mass-loss due to tides. When additionally comparing the orbital motion of the host subhaloes to the measured proper motion of Fornax, we find the matching more difficult; host subhaloes tend to have pericentres smaller than that measured for Fornax itself. From the kinematic and orbital data, we estimate that Fornax has lost 10-20 per cent of its infall stellar mass. Our best estimate for the surface brightness of a stellar tidal stream associated with Fornax is Σ ~ 32.6 mag arcsec^-2, which may be detectable with deep imaging surveys such as DES and LSST.
OPTIMAL TIME-SERIES SELECTION OF QUASARS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Butler, Nathaniel R.; Bloom, Joshua S.
2011-03-15
We present a novel method for the optimal selection of quasars using time-series observations in a single photometric bandpass. Utilizing the damped random walk model of Kelly et al., we parameterize the ensemble quasar structure function in Sloan Stripe 82 as a function of observed brightness. The ensemble model fit can then be evaluated rigorously for and calibrated with individual light curves with no parameter fitting. This yields a classification in two statistics, one describing the fit confidence and the other describing the probability of a false alarm, which can be tuned, a priori, to achieve high quasar detection fractions (99% completeness with default cuts), given an acceptable rate of false alarms. We establish the typical rate of false alarms due to known variable stars as ≲3% (high purity). Applying the classification, we increase the sample of potential quasars relative to those known in Stripe 82 by as much as 29%, and by nearly a factor of two in the redshift range 2.5 < z < 3, where selection by color is extremely inefficient. This represents 1875 new quasars in a 290 deg^2 field. The observed rates of both quasars and stars agree well with the model predictions, with >99% of quasars exhibiting the expected variability profile. We discuss the utility of the method at high redshift and in the regime of noisy and sparse data. Our time-series selection complements well independent selection based on quasar colors and has strong potential for identifying high-redshift quasars for Baryon Acoustic Oscillations and other cosmology studies in the LSST era.
Ground/bonding for Large Space System Technology (LSST). [of metallic and nonmetallic structures]
NASA Technical Reports Server (NTRS)
Dunbar, W. G.
1980-01-01
The influence of the environment and extravehicular activity/remote assembly operations on the grounding and bonding of metallic and nonmetallic structures is discussed. Grounding and bonding philosophy is outlined for the electrical systems and electronic compartments which contain high-voltage, high-power electrical and electronic equipment. The influence of plasma and particulates on the system was analyzed and the effects of static buildup on the spacecraft electrical system discussed. Conceptual grounding/bonding designs are assessed for capability to withstand high current arcs to ground from a high voltage conductor and electromagnetic interference. Also described are the extravehicular activities required of the space station and/or supply spacecraft crew members to join and inspect the ground system using manual or remote assembly construction.
Cables and connectors for Large Space System Technology (LSST)
NASA Technical Reports Server (NTRS)
Dunbar, W. G.
1980-01-01
The effect of the environment and extravehicular activity/remote assembly operations on the cables and connectors for spacecraft with metallic and/or nonmetallic structures was examined. Cable and connector philosophy was outlined for the electrical systems and electronic compartments which contain high-voltage, high-power electrical and electronic equipment. The influence of plasma and particulates on the system is analyzed and the effect of static buildup on the spacecraft electrical system discussed. Conceptual cable and connector designs are assessed for capability to withstand high current and high voltage without danger of arcs and electromagnetic interference. The extravehicular activites required of the space station and/or supply spacecraft crew members to join and inspect the electrical system, using manual or remote assembly construction are also considered.
Toroid Joining Gun. [thermoplastic welding system using induction heating
NASA Technical Reports Server (NTRS)
Buckley, J. D.; Fox, R. L.; Swaim, R. J.
1985-01-01
The Toroid Joining Gun is a low cost, self-contained, portable, low-powered (100-400 watts) thermoplastic welding system developed at Langley Research Center for joining plastic and composite parts using an induction heating technique. The device, developed for use in the fabrication of large space structures (LSST Program), can be used in any atmosphere or in a vacuum. Components can be joined in situ, whether on earth or on a space platform. The expanded application of this welding gun is in the joining of thermoplastic composites, thermosetting composites, metals, and combinations of these materials. Its low-power requirements, light weight, rapid response, low cost, portability, and effective joining make it a candidate for solving many varied and unique bonding tasks.
Automated software configuration in the MONSOON system
NASA Astrophysics Data System (ADS)
Daly, Philip N.; Buchholz, Nick C.; Moore, Peter C.
2004-09-01
MONSOON is the next generation OUV-IR controller project being developed at NOAO. The design is flexible, emphasizing code re-use, maintainability and scalability as key factors. The software needs to support widely divergent detector systems ranging from multi-chip mosaics (for LSST, QUOTA, ODI and NEWFIRM) down to large single or multi-detector laboratory development systems. In order for this flexibility to be effective and safe, the software must be able to configure itself to the requirements of the attached detector system at startup. The basic building block of all MONSOON systems is the PAN-DHE pair which make up a single data acquisition node. In this paper we discuss the software solutions used in the automatic PAN configuration system.
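The self-configuration idea (the software identifies the attached detector system at startup and refuses to run against unknown hardware) can be sketched as a simple profile lookup. All names and parameters below are invented for illustration and are not MONSOON's actual API:

```python
# Hypothetical sketch of startup self-configuration keyed on a detector ID
# reported by the hardware; profiles and field names are invented.
DETECTOR_PROFILES = {
    "mosaic-8k":  {"nodes": 4, "channels": 16, "readout_mhz": 1.0},
    "lab-single": {"nodes": 1, "channels": 2,  "readout_mhz": 0.5},
}

def configure(detector_id: str) -> dict:
    """Pick the acquisition-node configuration matching the attached detector,
    failing loudly if the hardware is unknown (safer than a silent default)."""
    try:
        return DETECTOR_PROFILES[detector_id]
    except KeyError:
        raise ValueError(f"no profile for detector {detector_id!r}")

cfg = configure("mosaic-8k")
```

Failing on an unknown ID, rather than falling back to a default, reflects the safety requirement the abstract mentions: flexible startup configuration is only acceptable if it cannot silently misconfigure the attached hardware.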
The Dynamics of the Local Group in the Era of Precision Astrometry
NASA Astrophysics Data System (ADS)
Besla, Gurtina; Garavito-Camargo, Nicolas; Patel, Ekta
2018-06-01
Our understanding of the dynamics of our Local Group of galaxies has changed dramatically over the past few years owing to significant advancements in astrometry and our theoretical understanding of galaxy structure. New surveys now enable us to map the 3D structure of our Milky Way and the dynamics of tracers of its dark matter distribution, like globular clusters, satellite galaxies and streams, with unprecedented precision. Some results have met with controversy, challenging preconceived notions of the orbital dynamics of key components of the Local Group. I will provide an overview of this evolving picture of our Local Group and outline how we can test the cold dark matter paradigm in the era of Gaia, LSST and JWST.
Wu, Wei; Wang, Jin
2013-09-28
We established a potential and flux field landscape theory to quantify the global stability and dynamics of general spatially dependent non-equilibrium deterministic and stochastic systems. We extended our potential and flux landscape theory for spatially independent non-equilibrium stochastic systems described by Fokker-Planck equations to spatially dependent stochastic systems governed by general functional Fokker-Planck equations as well as functional Kramers-Moyal equations derived from master equations. Our general theory is applied to reaction-diffusion systems. For equilibrium spatially dependent systems with detailed balance, the potential field landscape alone, defined in terms of the steady state probability distribution functional, determines the global stability and dynamics of the system. The global stability of the system is closely related to the topography of the potential field landscape in terms of the basins of attraction and barrier heights in the field configuration state space. The effective driving force of the system is generated by the functional gradient of the potential field alone. For non-equilibrium spatially dependent systems, the curl probability flux field is indispensable in breaking detailed balance and creating non-equilibrium condition for the system. A complete characterization of the non-equilibrium dynamics of the spatially dependent system requires both the potential field and the curl probability flux field. While the non-equilibrium potential field landscape attracts the system down along the functional gradient similar to an electron moving in an electric field, the non-equilibrium flux field drives the system in a curly way similar to an electron moving in a magnetic field. 
In the small fluctuation limit, the intrinsic potential field (the small fluctuation limit of the potential field for spatially dependent non-equilibrium systems, closely related to the steady state probability distribution functional) is found to be a Lyapunov functional of the deterministic spatially dependent system. Therefore, the intrinsic potential landscape can characterize the global stability of the deterministic system. The relative entropy functional of the stochastic spatially dependent non-equilibrium system is found to be the Lyapunov functional of the stochastic dynamics of the system. Therefore, the relative entropy functional quantifies the global stability of the stochastic system with finite fluctuations. Our theory offers a general alternative to other field-theoretic techniques for studying the global stability and dynamics of spatially dependent non-equilibrium field systems. It can be applied to many physical, chemical, and biological spatially dependent non-equilibrium systems.
Masked areas in shear peak statistics. A forward modeling approach
Bard, D.; Kratochvil, J. M.; Dawson, W.
2016-03-09
The statistics of shear peaks have been shown to provide valuable cosmological information beyond the power spectrum, and will be an important constraint on models of cosmology in forthcoming astronomical surveys. Surveys include masked areas due to bright stars, bad pixels, etc., which must be accounted for in producing constraints on cosmology from shear maps. We advocate a forward-modeling approach, where the impacts of masking and other survey artifacts are accounted for in the theoretical prediction of cosmological parameters, rather than correcting survey data to remove them. We use masks based on the Deep Lens Survey, and explore the impact of up to 37% of the survey area being masked on LSST and DES-scale surveys. By reconstructing maps of aperture mass the masking effect is smoothed out, resulting in up to 14% smaller statistical uncertainties compared to simply reducing the survey area by the masked area. We show that, even in the presence of large survey masks, the bias in cosmological parameter estimation produced in the forward-modeling process is ≈1%, dominated by bias caused by limited simulation volume. We also explore how this potential bias scales with survey area and evaluate how much small survey areas are impacted by the differences in cosmological structure in the data and simulated volumes, due to cosmic variance.
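The reconstruction step, in which smoothing fills in masked pixels so that peak statistics can still be computed, can be sketched as a mask-normalised Gaussian convolution followed by peak counting. This is a toy stand-in under assumed parameters (map size, mask fraction, smoothing scale), not the analysis used in the paper:

```python
import numpy as np
from scipy import ndimage

def aperture_mass(kappa, mask, smoothing_pix=4.0):
    """Smooth a convergence map with a Gaussian aperture, normalising by the
    smoothed mask so masked pixels do not bias the local average (a simple
    stand-in for the aperture-mass reconstruction described in the text)."""
    num = ndimage.gaussian_filter(kappa * mask, smoothing_pix)
    den = ndimage.gaussian_filter(mask.astype(float), smoothing_pix)
    return np.where(den > 0.1, num / np.maximum(den, 1e-12), 0.0)

def count_peaks(m, threshold):
    """Count local maxima above a threshold (3x3 neighbourhood)."""
    local_max = (m == ndimage.maximum_filter(m, size=3))
    return int(np.sum(local_max & (m > threshold)))

# Toy map: a Gaussian random field with roughly 37% of the area masked,
# mirroring the mask fraction explored in the abstract.
rng = np.random.default_rng(1)
kappa = ndimage.gaussian_filter(rng.standard_normal((256, 256)), 2.0)
mask = rng.random((256, 256)) > 0.37   # True = usable pixel

m = aperture_mass(kappa, mask)
n_peaks = count_peaks(m, threshold=2.0 * m.std())
```

In the forward-modeling approach the same mask and smoothing would be applied to the simulated maps used for theoretical predictions, so the peak counts in data and theory carry the same masking signature.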
Unveiling the extreme nature of the hyper faint galaxy Virgo I
NASA Astrophysics Data System (ADS)
Crnojevic, Denija
2017-08-01
We request HST/ACS imaging to obtain a deep color-magnitude diagram of the newly discovered candidate Milky Way satellite Virgo I. With an estimated absolute magnitude of only M_V ≈ -0.8 and a Galactocentric radius of 90 kpc, Virgo I is one of the faintest and most distant dwarfs ever observed, and could be identified as a prototype ''hyper'' faint galaxy. The detailed characterization of the smallest inhabited dark matter subhalos is crucial to guide hierarchical galaxy formation models, and in particular to constrain reionization, the nature of the dark matter particle, etc. With the advent of deep, wide-field, ground-based surveys, the potential of uncovering these lowest-mass galaxies is quickly turning into reality, as demonstrated by the discovery in the past two years of tens of new Local Group members in the ultra-faint regime (M_V > -8). Virgo I represents a new record in galaxy physical properties, and urges us to be prepared for the likely emergence of an entirely new class of such objects in the era of future wide-field surveys (e.g., LSST). Only high resolution HST observations can enable us to confirm the nature of Virgo I, providing significantly more accurate estimates for its distance and structural properties, when compared to the discovery Subaru/Hyper Suprime-Cam imaging. Our proposed dataset will constitute a fundamental step in the upcoming hunt for galaxies with similarly extreme properties.
DESI and other Dark Energy experiments in the era of neutrino mass measurements
Font-Ribera, Andreu; McDonald, Patrick; Mostek, Nick; ...
2014-05-19
Here we present Fisher matrix projections for future cosmological parameter measurements, including neutrino masses, Dark Energy, curvature, modified gravity, the inflationary perturbation spectrum, non-Gaussianity, and dark radiation. We focus on DESI and, more generally, redshift surveys (BOSS, HETDEX, eBOSS, Euclid, and WFIRST), but also include CMB (Planck) and weak gravitational lensing (DES and LSST) constraints. The goal is to present a consistent set of projections, for concrete experiments, which are otherwise scattered throughout many papers and proposals. We include neutrino mass as a free parameter in most projections, as it will inevitably be relevant: DESI and other experiments can measure the sum of neutrino masses to ~0.02 eV or better, while the minimum possible sum is 0.06 eV. We note that constraints on Dark Energy are significantly degraded by the presence of neutrino mass uncertainty, especially when using galaxy clustering only as a probe of the BAO distance scale (because this introduces additional uncertainty in the background evolution after the CMB epoch). Using broadband galaxy power becomes relatively more powerful, and bigger gains are achieved by combining lensing survey constraints with redshift survey constraints. Finally, we do not try to be especially innovative, e.g., with complex treatments of potential systematic errors; these projections are intended as a straightforward baseline for comparison to more detailed analyses.
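The mechanics behind such projections are compact: each experiment contributes a Fisher matrix, independent probes combine by matrix addition, and marginalized 1-sigma errors are the square roots of the diagonal of the inverse. The two-parameter matrices below are made-up numbers for illustration, not projections from the paper:

```python
import numpy as np

def marginalized_errors(fisher):
    """1-sigma marginalized uncertainties: sqrt of the diagonal of F^-1."""
    return np.sqrt(np.diag(np.linalg.inv(fisher)))

# Hypothetical 2-parameter example (say, sum of neutrino masses and w):
# independent probes combine by simple addition of their Fisher matrices.
F_redshift = np.array([[2500.0,  400.0],
                       [ 400.0,  900.0]])
F_lensing  = np.array([[1200.0, -300.0],
                       [-300.0, 1600.0]])

err_single = marginalized_errors(F_redshift)
err_joint  = marginalized_errors(F_redshift + F_lensing)

# Adding an independent (positive semi-definite) Fisher matrix can only
# tighten, or at worst preserve, the marginalized errors.
```

This additivity is why combining lensing and redshift surveys helps so much in the abstract's projections: lensing breaks the neutrino-mass degeneracy that degrades the Dark Energy constraints from clustering alone.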
NASA Astrophysics Data System (ADS)
Schaan, Emmanuel
2017-01-01
I will present two promising ways in which the cosmic microwave background (CMB) sheds light on critical uncertain physics and systematics of the large-scale structure. Shear calibration with CMB lensing: Realizing the full potential of upcoming weak lensing surveys requires an exquisite understanding of the errors in galaxy shape estimation. In particular, such errors lead to a multiplicative bias in the shear, degenerate with the matter density parameter and the amplitude of fluctuations. Its redshift-evolution can hide the true evolution of the growth of structure, which probes dark energy and possible modifications to general relativity. I will show that CMB lensing from a stage 4 experiment (CMB S4) can self-calibrate the shear for an LSST-like optical lensing survey. This holds in the presence of photo-z errors and intrinsic alignment. Evidence for the kinematic Sunyaev-Zel'dovich (kSZ) effect; cluster energetics: Through the kSZ effect, the baryon momentum field is imprinted on the CMB. I will report significant evidence for the kSZ effect from ACTPol and peculiar velocities reconstructed from BOSS. I will present the prospects for constraining cluster gas profiles and energetics from the kSZ effect with SPT-3G, AdvACT and CMB S4. This will provide constraints on galaxy formation and feedback models.
Using Deep Learning to Analyze the Voices of Stars.
NASA Astrophysics Data System (ADS)
Boudreaux, Thomas Macaulay
2018-01-01
With several new large-scale surveys on the horizon, including LSST, TESS, ZTF, and Evryscope, faster and more accurate analysis methods will be required to adequately process the enormous amount of data produced. Deep learning, used in industry for years now, allows for advanced feature detection in minimally prepared datasets at very high speeds; however, despite the advantages of this method, its application to astrophysics has not yet been extensively explored. This dearth may be due to a lack of training data available to researchers. Here we generate synthetic data loosely mimicking the properties of acoustic mode pulsating stars and compare the performance of different deep learning algorithms, including Artificial Neural Networks and Convolutional Neural Networks, in classifying these synthetic data sets as either pulsators or stars not observed to vary.
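A miniature version of this experiment can be built with nothing but NumPy: simulate sinusoidal "pulsator" light curves and pure-noise curves, move to the amplitude spectrum where a periodic signal stands out, and train a classifier. A single-layer network (logistic regression) stands in here for the deeper architectures compared in the abstract; all sizes and parameters are arbitrary choices for the sketch:

```python
import numpy as np

rng = np.random.default_rng(2)
N, T = 400, 256   # number of light curves, samples per curve

def synth_curve(pulsator):
    """Synthetic light curve: noise, plus a sinusoid if 'pulsator' is True
    (a loose mimic of an acoustic-mode pulsation)."""
    y = 0.5 * rng.standard_normal(T)
    if pulsator:
        f = rng.uniform(0.02, 0.2)   # arbitrary frequency range [cycles/sample]
        y += 2.0 * np.sin(2 * np.pi * f * np.arange(T) + rng.uniform(0, 2 * np.pi))
    return y

labels = rng.random(N) < 0.5
X = np.array([synth_curve(p) for p in labels])

# Feature extraction: amplitude spectrum (drop the DC bin), standardized per bin.
F = np.abs(np.fft.rfft(X, axis=1))[:, 1:]
F = (F - F.mean(axis=0)) / (F.std(axis=0) + 1e-9)

# Single-layer network trained by batch gradient descent on the logistic loss.
w, b = np.zeros(F.shape[1]), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(F @ w + b)))
    grad = p - labels                  # derivative of the loss w.r.t. the score
    w -= 0.1 * F.T @ grad / N
    b -= 0.1 * grad.mean()

p = 1.0 / (1.0 + np.exp(-(F @ w + b)))
accuracy = np.mean((p > 0.5) == labels)
```

Even this toy separates the two classes well on its own training data, which is the abstract's point: synthetic data make it possible to benchmark architectures before real survey data arrive.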
The Maunakea Spectroscopic Explorer: Status and System Overview
NASA Astrophysics Data System (ADS)
Mignot, S.; Murowinski, R.; Szeto, K.; Blin, A.; Caillier, P.
2017-12-01
The Maunakea Spectroscopic Explorer (MSE) project explores the possibility of upgrading the existing CFHT telescope and collaboration to turn it into the most powerful spectroscopic facility available in the 2020s. Its 10 meter aperture and its 1.5 square degree hexagonal field of view will allow both large and deep surveys, complementing current (Gaia, eROSITA, LOFAR) and future (Euclid, WFIRST, SKA, LSST) imaging surveys, as well as providing candidate targets to the TMT or the E-ELT. In agreement with INSU's 2015-2020 prospective, besides being well represented in MSE's science team (23 of 105 members), France is also a major contributor to the Conceptual Design studies, with CRAL developing a concept for the low- and moderate-resolution spectrographs, DT INSU the prime focus environment, and GEPI the systems engineering.
ERIC Educational Resources Information Center
Hirai, Masahiro; Hiraki, Kazuo
2006-01-01
We investigated how the spatiotemporal structure of animations of biological motion (BM) affects brain activity. We measured event-related potentials (ERPs) during the perception of BM under four conditions: normal spatial and temporal structure; scrambled spatial and normal temporal structure; normal spatial and scrambled temporal structure; and…
Only marginal alignment of disc galaxies
NASA Astrophysics Data System (ADS)
Andrae, René; Jahnke, Knud
2011-12-01
Testing theories of angular-momentum acquisition of rotationally supported disc galaxies is the key to understanding the formation of this type of galaxy. The tidal-torque theory aims to explain this acquisition process in a cosmological framework and predicts positive autocorrelations of angular-momentum orientation and spiral-arm handedness, i.e. alignment of disc galaxies, on short distance scales of 1 Mpc h⁻¹. This disc alignment can also cause systematic effects in weak-lensing measurements. Previous observations claimed detections of these correlations but were overly optimistic in the reported level of statistical significance. Errors in redshift, ellipticity and morphological classification were not taken into account, although they have a significant impact. We explain how to rigorously propagate all the important errors through the estimation process. Analysing disc galaxies in the Sloan Digital Sky Survey (SDSS) database, we find that positive autocorrelations of spiral-arm handedness and angular-momentum orientation on distance scales of 1 Mpc h⁻¹ are plausible but not statistically significant. Current data are not good enough to constrain the parameters of the theory. This result agrees with a simple hypothesis test in the Local Group, where we also find no evidence for disc alignment. Moreover, we demonstrate that ellipticity estimates based on second moments are strongly biased by galactic bulges even for Scd galaxies, thereby corrupting correlation estimates and overestimating the impact of disc alignment on weak-lensing studies. Finally, we discuss the potential of future sky surveys. We argue that photometric redshifts have errors too large for this purpose, so PanSTARRS and LSST cannot be used, while the EUCLID project will not cover the relevant redshift regime. We also discuss the potentials and problems of front-edge classifications of galaxy discs for improving the autocorrelation estimates of angular-momentum orientation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loebman, Sarah R.; Ivezic, Zeljko; Quinn, Thomas R.
2012-10-10
We search for evidence of dark matter in the Milky Way by utilizing the stellar number density distribution and kinematics measured by the Sloan Digital Sky Survey (SDSS) to heliocentric distances exceeding ~10 kpc. We employ the cylindrically symmetric form of Jeans equations and focus on the morphology of the resulting acceleration maps, rather than the normalization of the total mass as done in previous, mostly local, studies. Jeans equations are first applied to a mock catalog based on a cosmologically derived N-body+SPH simulation, and the known acceleration (gradient of gravitational potential) is successfully recovered. The same simulation is also used to quantify the impact of dark matter on the total acceleration. We use Galfast, a code designed to quantitatively reproduce SDSS measurements and selection effects, to generate a synthetic stellar catalog. We apply Jeans equations to this catalog and produce two-dimensional maps of stellar acceleration. These maps reveal that in a Newtonian framework, the implied gravitational potential cannot be explained by visible matter alone. The acceleration experienced by stars at galactocentric distances of ~20 kpc is three times larger than what can be explained by purely visible matter. The application of an analytic method for estimating the dark matter halo axis ratio to SDSS data implies an oblate halo with q_DM = 0.47 ± 0.14 within the same distance range. These techniques can be used to map the dark matter halo to much larger distances from the Galactic center using upcoming deep optical surveys, such as LSST.
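For reference, the cylindrically symmetric (axisymmetric) Jeans equations used in such analyses relate the stellar number density ν, the velocity moments, and the gravitational potential Φ; the form below is the standard textbook one (conventions as in Binney & Tremaine), quoted here as context rather than taken from the paper itself:

```latex
\frac{\partial(\nu\,\overline{v_R^2})}{\partial R}
  + \frac{\partial(\nu\,\overline{v_R v_z})}{\partial z}
  + \nu\left(\frac{\overline{v_R^2}-\overline{v_\phi^2}}{R}
  + \frac{\partial\Phi}{\partial R}\right) = 0,
\qquad
\frac{\partial(\nu\,\overline{v_R v_z})}{\partial R}
  + \frac{\partial(\nu\,\overline{v_z^2})}{\partial z}
  + \frac{\nu\,\overline{v_R v_z}}{R}
  + \nu\,\frac{\partial\Phi}{\partial z} = 0 .
```

The acceleration maps described in the abstract correspond to the potential-gradient terms ∂Φ/∂R and ∂Φ/∂z, recovered from the measured density and kinematic moments.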
NASA Astrophysics Data System (ADS)
Borne, K. D.; Fortson, L.; Gay, P.; Lintott, C.; Raddick, M. J.; Wallin, J.
2009-12-01
The remarkable success of Galaxy Zoo as a citizen science project for galaxy classification within a terascale astronomy data collection has led to the development of a broader collaboration, known as the Zooniverse. Activities will include astronomy, lunar science, solar science, and digital humanities. Some features of our program include development of a unified framework for citizen science projects, development of a common set of user-based research tools, engagement of the machine learning community to apply machine learning algorithms on the rich training data provided by citizen scientists, and extension across multiple research disciplines. The Zooniverse collaboration is just getting started, but already we are implementing a scientifically deep follow-on to Galaxy Zoo. This project, tentatively named Galaxy Merger Zoo, will engage users in running numerical simulations, whose input parameter space is voluminous and therefore demands a clever solution, such as allowing the citizen scientists to select their own sets of parameters, which then trigger new simulations of colliding galaxies. The user interface design has many of the engaging features that retain users, including rapid feedback, visually appealing graphics, and the sense of playing a competitive game for the benefit of science. We will discuss these topics. In addition, we will also describe applications of Citizen Science that are being considered for the petascale science project LSST (Large Synoptic Survey Telescope). LSST will produce a scientific data system that consists of a massive image archive (nearly 100 petabytes) and a similarly massive scientific parameter database (20-40 petabytes). Applications of Citizen Science for such an enormous data collection will enable greater scientific return in at least two ways. 
First, citizen scientists work with real data and perform authentic research tasks of value to the advancement of the science, providing "human computation" capabilities and resources to review, annotate, and explore aspects of the data that are too overwhelming for the science team. Second, citizen scientists' inputs (in the form of rich training data and class labels) can be used to improve the classifiers that the project team uses to classify and prioritize new events detected in the petascale data stream. This talk will review these topics and provide an update on the Zooniverse project.
High resolution optical surface metrology with the slope measuring portable optical test system
NASA Astrophysics Data System (ADS)
Maldonado, Alejandro V.
New optical designs strive to achieve extreme performance and continually increase the complexity of prescribed optical shapes, which often require wide dynamic range and high resolution. SCOTS, or the Software Configurable Optical Test System, can measure a wide range of optical surfaces with high sensitivity using surface slope. This dissertation introduces a high resolution version of SCOTS called SPOTS, or the Slope measuring Portable Optical Test System. SPOTS improves the metrology of surface features on the order of sub-millimeter to decimeter spatial scales and nanometer to micrometer height scales. Currently there is no optical surface metrology instrument with the same utility. SCOTS uses a computer controlled display (such as an LCD monitor) and camera to measure surface slopes over the entire surface of a mirror. SPOTS differs in that an additional lens is placed near the surface under test. A small prototype system is discussed in general, providing the support for the design of future SPOTS devices. Then the SCOTS instrument transfer function is addressed, which defines the way the system filters surface heights. Lastly, the calibration and performance of a larger SPOTS device are analyzed with example measurements of the 8.4-m diameter aspheric Large Synoptic Survey Telescope (LSST) primary mirror. In general, optical systems have a transfer function which filters data. In the case of optical imaging systems the instrument transfer function (ITF) follows the modulation transfer function (MTF), which causes a reduction of contrast as a function of increasing spatial frequency due to diffraction. In SCOTS, the ITF is shown to decrease the measured height of surface features as their spatial frequency increases, and thus the SCOTS and SPOTS ITF is proportional to the camera system's MTF. Theory and simulations are supported by a SCOTS measurement of a test piece with a set of lithographically written sinusoidal surface topographies.
In addition, an example of a simple inverse filtering technique is provided. The success of a small SPOTS proof-of-concept instrument paved the way for a new, larger prototype system, intended to measure subaperture regions on large optical mirrors. On large optics, the prototype SPOTS is lightweight and rests on the surface being tested. One advantage of this SPOTS is its stability over time in maintaining its calibration. Thus the optician can simply place SPOTS on the mirror, perform a simple alignment, collect measurement data, then pick the system up and repeat at a new location. The entire process takes approximately 5 to 10 minutes, of which 3 minutes is spent collecting data. SPOTS' simplicity of design, light weight, robustness, wide dynamic range, and high sensitivity make it a useful tool for optical shop use during the fabrication and testing of large and small optics.
2013-01-01
Introduction: There is a great health services disparity between urban and rural areas in China, and the percentage of people who are unable to access health services due to long travel times is increasing. This paper takes Donghai County as the study unit to analyse areas with physician shortages and the characteristics of the potential spatial accessibility of health services. We analyse how the unequal distribution of health services resources and the New Cooperative Medical Scheme affect the potential spatial accessibility of health services in Donghai County. We also give some advice on how to alleviate the unequal spatial accessibility of health services in areas that are more remote and isolated. Methods: The shortest travel times from hospitals to villages are calculated with an origin-destination (O-D) matrix in a GIS extension model. This paper applies an enhanced two-step floating catchment area (E2SFCA) method to study the spatial accessibility of health services and to determine areas with physician shortages in Donghai County. The sensitivity of the E2SFCA for assessing variation in the spatial accessibility of health services is checked using different impedance coefficient values. A Geostatistical Analyst model and a Spatial Analyst method are used to analyse the spatial pattern and the edge effect of the potential spatial accessibility of health services. Results: The results show that 69% of villages have lower potential spatial accessibility of health services than the average for Donghai County, and 79% of the village scores are lower than the average for Jiangsu Province. The potential spatial accessibility of health services diminishes greatly from the centre of the county to outlying areas. Using a smaller impedance coefficient leads to greater disparity among the villages. The spatial accessibility of health services is greater along highways in the county. Conclusions: Most villages are in underserved health services areas.
An unequal distribution of health service resources and the reimbursement policies of the New Cooperative Medical Scheme have led to an edge effect regarding spatial accessibility of health services in Donghai County, whereby people living on the edge of the county have less access to health services. Comprehensive measures should be considered to alleviate the unequal spatial accessibility of health services in areas that are more remote and isolated. PMID:23688278
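The E2SFCA method named above has a compact formulation: step 1 computes each hospital's supply-to-demand ratio over its decay-weighted catchment population; step 2 sums the reachable ratios for each village. A minimal sketch follows; the Gaussian impedance form, the coefficient value, and the toy numbers are illustrative assumptions, not the study's actual parameters.

```python
import numpy as np

def e2sfca(supply, population, travel_time, t_max=60.0, beta=440.0):
    """Enhanced two-step floating catchment area (E2SFCA) sketch.

    supply[j]        : physicians at hospital j
    population[i]    : residents of village i
    travel_time[i,j] : minutes from village i to hospital j
    beta             : Gaussian impedance coefficient (illustrative value)

    Returns accessibility A[i]: decay-weighted physicians per capita
    reachable from village i.
    """
    w = np.exp(-travel_time**2 / beta)     # distance-decay weights
    w[travel_time > t_max] = 0.0           # outside the catchment

    # Step 1: supply-to-demand ratio R_j within each hospital's catchment.
    demand = w.T @ population              # decay-weighted population per hospital
    R = supply / np.maximum(demand, 1e-12)

    # Step 2: sum the reachable ratios for each village.
    return w @ R

# Toy example: two hospitals, three villages (minutes of travel time).
tt = np.array([[10.0, 50.0],
               [30.0, 30.0],
               [70.0, 20.0]])
A = e2sfca(supply=np.array([20.0, 5.0]),
           population=np.array([1000.0, 2000.0, 1500.0]),
           travel_time=tt)
```

Re-running with a smaller `beta` steepens the decay, which is exactly the sensitivity check the abstract describes: nearer villages gain and remote villages lose, widening the disparity.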
Spatial Bose-Einstein Condensation.
ERIC Educational Resources Information Center
Masut, Remo; Mullin, William J.
1979-01-01
Analyzes three examples of spatial Bose-Einstein condensations in which the particles macroscopically occupy the lowest localized state of an inhomogeneous external potential. The three cases are (1) a box with a small square potential well inside, (2) a harmonic oscillator potential, and (3) randomly sized trapping potentials caused by…
A fringe projector-based study of the Brighter-Fatter Effect in LSST CCDs
Gilbertson, W.; Nomerotski, A.; Takacs, P.
2017-09-07
Achieving the goals of the Large Synoptic Survey Telescope for Dark Energy science requires a detailed understanding of CCD sensor effects. One such sensor effect is the Point Spread Function (PSF) increasing with flux, alternatively called the `Brighter-Fatter Effect.' Here a novel approach was tested to perform PSF measurements in the context of the Brighter-Fatter Effect, employing a Michelson interferometer to project a sinusoidal fringe pattern onto the CCD. The Brighter-Fatter effect predicts that the fringe pattern should become asymmetric in intensity, as the brighter peaks corresponding to a larger flux are smeared by a larger PSF. By fitting the data with a model that allows for a changing PSF, the strength of the Brighter-Fatter effect can be evaluated.
Centroid Position as a Function of Total Counts in a Windowed CMOS Image of a Point Source
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wurtz, R E; Olivier, S; Riot, V
2010-05-27
We obtained 960,200 22-by-22-pixel windowed images of a pinhole spot using the Teledyne H2RG CMOS detector with un-cooled SIDECAR readout. We performed an analysis to determine the precision we might expect in the position error signals to a telescope's guider system. We find that, under non-optimized operating conditions, the error in the computed centroid is strongly dependent on the total counts in the point image only below a certain threshold, approximately 50,000 photo-electrons. The LSST guider camera specification currently requires a 0.04 arcsecond error at 10 Hertz. Given the performance measured here, this specification can be delivered with a single star at 14th to 18th magnitude, depending on the passband.
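The quantity analyzed above, the centroid of a small windowed image of a point source, can be sketched as a background-subtracted, flux-weighted mean of pixel coordinates. The median-based background estimate and the synthetic spot parameters below are assumptions for illustration, not the procedure used in the measurement.

```python
import numpy as np

def windowed_centroid(img):
    """Flux-weighted centroid (x, y) of a small windowed image."""
    data = img - np.median(img)            # crude background removal
    data = np.clip(data, 0.0, None)        # negative pixels carry no flux
    total = data.sum()
    ys, xs = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    return (xs * data).sum() / total, (ys * data).sum() / total

# Synthetic 22x22 pinhole image: ~50,000 e- Gaussian spot (the threshold
# flux quoted above) plus photon-like noise on a flat sky background.
rng = np.random.default_rng(0)
ys, xs = np.mgrid[0:22, 0:22]
spot = 50000.0 / (2 * np.pi * 2.0**2) * np.exp(
    -((xs - 11.3)**2 + (ys - 10.6)**2) / (2 * 2.0**2))
img = rng.poisson(spot + 100.0).astype(float)   # 100 e- sky per pixel
cx, cy = windowed_centroid(img)
```

At this flux level the recovered centroid lands within a small fraction of a pixel of the true spot center; repeating the experiment at lower total counts shows the noise-driven growth of the centroid error that the abstract quantifies.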
Managing the Big Data Avalanche in Astronomy - Data Mining the Galaxy Zoo Classification Database
NASA Astrophysics Data System (ADS)
Borne, Kirk D.
2014-01-01
We will summarize a variety of data mining experiments that have been applied to the Galaxy Zoo database of galaxy classifications, which were provided by the volunteer citizen scientists. The goal of these exercises is to learn new and improved classification rules for diverse populations of galaxies, which can then be applied to much larger sky surveys of the future, such as the LSST (Large Synoptic Survey Telescope), which is proposed to obtain detailed photometric data for approximately 20 billion galaxies. The massive Big Data that astronomy projects will generate in the future demand greater application of data mining and data science algorithms, as well as greater training of astronomy students in the skills of data mining and data science. The project described here has involved several graduate and undergraduate research assistants at George Mason University.
Gamma Ray Bursts as Cosmological Probes with EXIST
NASA Astrophysics Data System (ADS)
Hartmann, Dieter; EXIST Team
2006-12-01
The EXIST mission, studied as a Black Hole Finder Probe within NASA's Beyond Einstein Program, would, in its current design, trigger on 1000 Gamma Ray Bursts (GRBs) per year (Grindlay et al, this meeting). The redshift distribution of these GRBs, using results from Swift as a guide, would probe the z > 7 epoch at an event rate of > 50 per year. These bursts trace early cosmic star formation history, point to a first generation of stellar objects that reionize the universe, and provide bright beacons for absorption line studies with ground- and space-based observatories. We discuss how EXIST, in conjunction with other space missions and future large survey programs such as LSST, can be utilized to advance our understanding of cosmic chemical evolution, the structure and evolution of the baryonic cosmic web, and the formation of stars in low metallicity environments.
Connecting the time domain community with the Virtual Astronomical Observatory
NASA Astrophysics Data System (ADS)
Graham, Matthew J.; Djorgovski, S. G.; Donalek, Ciro; Drake, Andrew J.; Mahabal, Ashish A.; Plante, Raymond L.; Kantor, Jeffrey; Good, John C.
2012-09-01
The time domain has been identified as one of the most important areas of astronomical research for the next decade. The Virtual Observatory is in the vanguard with dedicated tools and services that enable and facilitate the discovery, dissemination and analysis of time domain data. These range in scope from rapid notifications of time-critical astronomical transients to annotating long-term variables with the latest modelling results. In this paper, we will review the prior art in these areas and focus on the capabilities that the VAO is bringing to bear in support of time domain science. In particular, we will focus on the issues involved with the heterogeneous collections of (ancillary) data associated with astronomical transients, and the time series characterization and classification tools required by the next generation of sky surveys, such as LSST and SKA.
Do's and Don'ts with Lateralized Event-Related Brain Potentials
ERIC Educational Resources Information Center
Praamstra, Peter
2007-01-01
K. Wiegand and E. Wascher (2005) used the lateralized readiness potential (LRP) to investigate the mechanisms underlying spatial stimulus-response (S-R) correspondence. The authors compared spatial S-R correspondence effects obtained with horizontal and vertical S-R arrangements. In some relevant previous investigations on spatial S-R…
Extracting meaning from astronomical telegrams
NASA Astrophysics Data System (ADS)
Graham, Matthew; Conwill, L.; Djorgovski, S. G.; Mahabal, A.; Donalek, C.; Drake, A.
2011-01-01
The rapidly emerging field of time domain astronomy is one of the most exciting and vibrant new research frontiers, ranging in scientific scope from studies of the Solar System to extreme relativistic astrophysics and cosmology. It is being enabled by a new generation of large synoptic digital sky surveys - LSST, PanStarrs, CRTS - that cover large areas of sky repeatedly, looking for transient objects and phenomena. One of the biggest challenges facing these is the automated classification of transient events, a process that needs machine-processible astronomical knowledge. Semantic technologies enable the formal representation of concepts and relations within a particular domain. ATELs (http://www.astronomerstelegram.org) are a commonly-used means for reporting and commenting upon new astronomical observations of transient sources (supernovae, stellar outbursts, blazar flares, etc). However, they are loose and unstructured and employ scientific natural language for description: this makes automated processing of them - a necessity within the next decade with petascale data rates - a challenge. Nevertheless they represent a potentially rich corpus of information that could lead to new and valuable insights into transient phenomena. This project lies in the cutting-edge field of astrosemantics, a branch of astroinformatics, which applies semantic technologies to astronomy. The ATELs have been used to develop an appropriate concept scheme - a representation of the information they contain - for transient astronomy using aspects of natural language processing. We demonstrate that it is possible to infer the subject of an ATEL from the vocabulary used and to identify previously unassociated reports.
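The final claim above, that an ATel's subject can be inferred from its vocabulary, can be illustrated with a bare-bones keyword scorer. The subject classes and keyword lists below are illustrative stand-ins for the concept scheme the project actually developed, and real astrosemantic processing would use proper NLP rather than substring matching.

```python
# Hypothetical, minimal vocabulary for three transient classes.
SUBJECT_KEYWORDS = {
    "supernova":        {"supernova", "type ia", "ejecta", "maximum light"},
    "blazar":           {"blazar", "gamma-ray flare", "jet", "bl lac"},
    "stellar_outburst": {"nova", "outburst", "dwarf nova", "eruption"},
}

def infer_subject(text):
    """Score each subject by how many of its keywords occur in the text;
    return the best-scoring subject, or 'unknown' if nothing matches."""
    lowered = text.lower()
    scores = {subject: sum(kw in lowered for kw in kws)
              for subject, kws in SUBJECT_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unknown"

subject = infer_subject(
    "We report optical spectroscopy of the new transient, showing broad "
    "ejecta features consistent with a Type Ia supernova near maximum.")
```

Even this toy version shows why scientific natural language is hard: "nova" is a substring of "supernova", so naive matching double-counts, which is exactly the kind of ambiguity a formal concept scheme is meant to resolve.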
Validity of the Born approximation for beyond Gaussian weak lensing observables
Petri, Andrea; Haiman, Zoltan; May, Morgan
2017-06-06
Accurate forward modeling of weak lensing (WL) observables from cosmological parameters is necessary for upcoming galaxy surveys. Because WL probes structures in the nonlinear regime, analytical forward modeling is very challenging, if not impossible. Numerical simulations of WL features rely on ray tracing through the outputs of N-body simulations, which requires knowledge of the gravitational potential and accurate solvers for light ray trajectories. A less accurate procedure, based on the Born approximation, only requires knowledge of the density field, and can be implemented more efficiently and at a lower computational cost. In this work, we use simulations to show that deviations of the Born-approximated convergence power spectrum, skewness and kurtosis from their fully ray-traced counterparts are consistent with the smallest nontrivial O(Φ^3) post-Born corrections (so-called geodesic and lens-lens terms). Our results imply a cancellation among the larger O(Φ^4) (and higher order) terms, consistent with previous analytic work. We also find that cosmological parameter bias induced by the Born-approximated power spectrum is negligible even for an LSST-like survey, once galaxy shape noise is considered. When considering higher order statistics such as the κ skewness and kurtosis, however, we find significant bias of up to 2.5σ. Using the LensTools software suite, we show that the Born approximation saves a factor of 4 in computing time with respect to the full ray tracing in reconstructing the convergence.
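For context, the Born approximation discussed above replaces full ray tracing by an integral of the density contrast δ along the unperturbed line of sight; in a flat universe the first-order convergence takes the standard form (χ is comoving distance, χ_s that of the source plane, a the scale factor):

```latex
\kappa^{(1)}(\boldsymbol{\theta})
  = \frac{3 H_0^2 \Omega_m}{2 c^2}
    \int_0^{\chi_s} \mathrm{d}\chi\,
    \frac{\chi\,(\chi_s-\chi)}{\chi_s}\,
    \frac{\delta(\chi\boldsymbol{\theta}, \chi)}{a(\chi)} .
```

The O(Φ^3) geodesic and lens-lens corrections quoted in the abstract are the leading terms this expression neglects.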
Connecting Variability and Metals in White Dwarfs
NASA Astrophysics Data System (ADS)
Kilic, Mukremin
2016-10-01
The Kepler and K2 missions have revealed that about half of the observed white dwarfs with sufficient signal-to-noise ratio light curves have low-level photometric variations at hour to day timescales. Potential explanations for the observed variability include the relativistic beaming effect, ellipsoidal variations, eclipses, and reflection off of giant planets in close orbits. However, these are all rare events. Roughly 10% of white dwarfs are magnetic, and magnetic fields can explain part of this puzzle. However, the high incidence (50%) of variability is currently unexplained. HST COS spectroscopy of nearby white dwarfs shows that about half of them have metals on their surface. Hence, we propose that the observed variability is due to the rotation of the star coupled with an inhomogeneous surface distribution of accreted metals. We have recently discovered an ideal system to test this hypothesis. J1529 is an apparently non-magnetic white dwarf that shows 5.9% photometric dips in the optical every 38 min. We propose to obtain COS TIME-TAG spectroscopy of J1529 over 4 orbits to search for surface abundance differences throughout the orbit and look for the flux redistribution effect in the optical. These observations will confirm or rule out the idea that inhomogeneous metal accretion on white dwarfs can explain the high incidence of variability. We predict that the LSST will identify 100,000 variable white dwarfs. Hence, understanding the source of variability in white dwarfs has implications for the current and future transient surveys.
Shim, Miseon; Kim, Do-Won; Yoon, Sunkyung; Park, Gewnhi; Im, Chang-Hwan; Lee, Seung-Hwan
2016-06-01
Deficits in facial emotion processing are a major characteristic of patients with panic disorder. It is known that visual stimuli with different spatial frequencies take distinct neural pathways. This study investigated facial emotion processing involving stimuli presented at broad, high, and low spatial frequencies in patients with panic disorder. Eighteen patients with panic disorder and 19 healthy controls were recruited. Seven event-related potential (ERP) components (P100, N170, early posterior negativity (EPN), vertex positive potential (VPP), N250, P300, and late positive potential (LPP)) were evaluated while the participants looked at fearful and neutral facial stimuli presented at three spatial frequencies. When a fearful face was presented, panic disorder patients showed a significantly increased P100 amplitude in response to low spatial frequency compared to high spatial frequency, whereas healthy controls demonstrated significant broad-spatial-frequency-dependent processing in P100 amplitude. Vertex positive potential amplitude was significantly increased in high and broad spatial frequency, compared to low spatial frequency, in panic disorder. Early posterior negativity amplitude was significantly different between HSF and BSF, and between LSF and BSF processing, in both groups, regardless of facial expression. The possibly confounding effects of medication could not be controlled. During early visual processing, patients with panic disorder prefer global to detailed information. However, in later processing, panic disorder patients overuse detailed information for the perception of facial expressions. These findings suggest that unique spatial-frequency-dependent facial processing could shed light on the neural pathology associated with panic disorder. Copyright © 2016 Elsevier B.V. All rights reserved.
Liu, Shen; McGree, James; Hayes, John F; Goonetilleke, Ashantha
2016-10-01
Potential human health risk from waterborne diseases arising from unsatisfactory performance of on-site wastewater treatment systems is driven by landscape factors such as topography, soil characteristics, depth to water table, drainage characteristics and the presence of surface water bodies. These factors are present as random variables which are spatially distributed across a region. A methodological framework is presented that can be applied to model and evaluate the influence of various factors on waterborne disease potential. This framework is informed by spatial data and expert knowledge. For prediction at unsampled sites, interpolation methods were used to derive a spatially smoothed surface of disease potential which takes into account the uncertainty due to spatial variation at any pre-determined level of significance. This surface was constructed by accounting for the influence of multiple variables which appear to contribute to disease potential. The framework developed in this work strengthens the understanding of the characteristics of disease potential and provides predictions of this potential across a region. The study outcomes presented constitute an innovative approach to environmental monitoring and management in the face of data paucity. Copyright © 2016 Elsevier B.V. All rights reserved.
Constraints on Primordial Non-Gaussianity from 800 000 Photometric Quasars.
Leistedt, Boris; Peiris, Hiranya V; Roth, Nina
2014-11-28
We derive robust constraints on primordial non-Gaussianity (PNG) using the clustering of 800 000 photometric quasars from the Sloan Digital Sky Survey in the redshift range 0.5
Large Survey Database: A Distributed Framework for Storage and Analysis of Large Datasets
NASA Astrophysics Data System (ADS)
Juric, Mario
2011-01-01
The Large Survey Database (LSD) is a Python framework and DBMS for distributed storage, cross-matching and querying of large survey catalogs (>10^9 rows, >1 TB). The primary driver behind its development is the analysis of Pan-STARRS PS1 data. It is specifically optimized for fast queries and parallel sweeps of positionally and temporally indexed datasets. It transparently scales to more than 10^2 nodes, and can be made to function in "shared nothing" architectures. An LSD database consists of a set of vertically and horizontally partitioned tables, physically stored as compressed HDF5 files. Vertically, we partition the tables into groups of related columns ("column groups"), storing together logically related data (e.g., astrometry, photometry). Horizontally, the tables are partitioned into partially overlapping "cells" by position in space (lon, lat) and time (t). This organization allows for fast lookups based on spatial and temporal coordinates, as well as data and task distribution. The design was inspired by the success of Google BigTable (Chang et al., 2006). Our programming model is a pipelined extension of MapReduce (Dean and Ghemawat, 2004). An SQL-like query language is used to access data. For complex tasks, map-reduce "kernels" that operate on query results on a per-cell basis can be written, with the framework taking care of scheduling and execution. The combination leverages users' familiarity with SQL, while offering a fully distributed computing environment. LSD adds little overhead compared to direct Python file I/O. In tests, we swept through 1.1 Grows (gigarows) of PanSTARRS+SDSS data (220 GB) in less than 15 minutes on a dual-CPU machine. In a cluster environment, we achieved bandwidths of 17 Gbit/s (I/O limited). Based on current experience, we believe LSD should scale to be useful for analysis and storage of LSST-scale datasets. It can be downloaded from http://mwscience.net/lsd.
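The horizontal (lon, lat, t) "cell" partitioning described above can be sketched as a bucketed store in which a box query only touches the cells that overlap it. The cell sizes, key scheme, and in-memory layout below are assumptions for illustration, not LSD's actual on-disk HDF5 organization.

```python
from collections import defaultdict

CELL_DEG = 1.0    # spatial cell size, degrees (illustrative)
CELL_DAYS = 30.0  # temporal cell size, days (illustrative)

def cell_key(lon, lat, t):
    """Map a detection to its (lon, lat, t) cell index."""
    return (int(lon // CELL_DEG), int(lat // CELL_DEG), int(t // CELL_DAYS))

class CellStore:
    def __init__(self):
        self.cells = defaultdict(list)

    def insert(self, lon, lat, t, row):
        self.cells[cell_key(lon, lat, t)].append((lon, lat, t, row))

    def query_box(self, lon0, lon1, lat0, lat1, t0, t1):
        """Scan only the cells overlapping the query box, then filter
        the surviving rows exactly."""
        k0 = cell_key(lon0, lat0, t0)
        k1 = cell_key(lon1, lat1, t1)
        out = []
        for kx in range(k0[0], k1[0] + 1):
            for ky in range(k0[1], k1[1] + 1):
                for kt in range(k0[2], k1[2] + 1):
                    for lon, lat, t, row in self.cells.get((kx, ky, kt), []):
                        if lon0 <= lon <= lon1 and lat0 <= lat <= lat1 and t0 <= t <= t1:
                            out.append(row)
        return out

store = CellStore()
store.insert(10.2, -5.1, 100.0, "det_a")
store.insert(40.0, 20.0, 10.0, "det_b")
hits = store.query_box(10.0, 11.0, -6.0, -5.0, 90.0, 110.0)
```

Because cell keys are derived directly from coordinates, a positional query visits a bounded number of buckets regardless of catalog size, which is also what makes the cells natural units for per-cell map-reduce kernels.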
NASA Astrophysics Data System (ADS)
Schaan, Emmanuel Sebastien
The primary fluctuations in the cosmic microwave background (CMB), the leftover heat from the big bang, have revealed invaluable clues about our universe (age, history, geometry, composition), and are now measured almost to the cosmic variance limit. While important fundamental physics questions remain to be answered from the primary CMB alone (e.g., detection of gravitational waves from inflation, number of relativistic species), many others require looking beyond the primary anisotropies: what is dark energy, this mysterious component responsible for the accelerated expansion of the universe? What is the nature of the dark matter, five times more abundant than ordinary matter? What are the masses of the neutrinos? The clustering pattern in the spatial distribution of galaxies across the universe, the so-called large-scale structure (LSS), contains the key to these fundamental physics questions, as well as many tightly related astrophysical questions: what are the key processes in galaxy formation? How did the universe transition from neutral to ionized, one billion years after the big bang? However, several hurdles hinder extracting this information: non-linear evolution under gravity is complex to model and turns independent Gaussian initial conditions into coupled non-Gaussian modes; uncertain astrophysical effects obscure the connection between visible and dark matter, and alter the matter power spectrum on small-scales; LSS observables are often complex and systematics-limited. In this thesis, I tackle these issues and explore various ways of using the CMB as a backlight for the LSS, to illuminate aspects of its uncertain physics and systematics. 
In the coming years, ever more sensitive CMB experiments (AdvACT, SPT-3G, Simons Observatory, CMB Stage 4) will overlap with imaging surveys (DES, HSC, LSST, Euclid, WFIRST) and spectroscopic surveys (DESI, PFS), thus greatly magnifying the power of the methods I developed, and helping to answer some of the most pressing astrophysics and fundamental physics questions.
David W. Williams; Andrew M. Liebhold
1995-01-01
Changes in geographical ranges and spatial extent of outbreaks of pest species are likely consequences of climatic change. We investigated potential changes in spatial distribution of outbreaks of western spruce budworm, Choristoneura occidentalis Freeman, and gypsy moth, Lymantria dispar (L.), in Oregon and Pennsylvania,...
Regular and Chaotic Spatial Distribution of Bose-Einstein Condensed Atoms in a Ratchet Potential
NASA Astrophysics Data System (ADS)
Li, Fei; Xu, Lan; Li, Wenwu
2018-02-01
We study the regular and chaotic spatial distribution of Bose-Einstein condensed atoms with a space-dependent nonlinear interaction in a ratchet potential. There exists in the system a space-dependent atomic current that can be tuned via the Feshbach resonance technique. In the presence of the space-dependent atomic current and a weak ratchet potential, the Smale-horseshoe chaos is studied and the Melnikov chaotic criterion is obtained. Numerical simulations show that the ratio between the intensities of the optical potentials forming the ratchet potential, the wave vector of the laser producing the ratchet potential, or the wave vector of the modulating laser can be chosen as control parameters to induce or suppress chaotic spatial distributions.
ERIC Educational Resources Information Center
Miyazaki, Mikio; Kimiho, Chino; Katoh, Ryuhei; Arai, Hitoshi; Ogihara, Fumihiro; Oguchi, Yuichi; Morozumi, Tatsuo; Kon, Mayuko; Komatsu, Kotaro
2012-01-01
Three-dimensional dynamic geometry software has the power to enhance students' learning of spatial geometry. The purpose of this research is to clarify what potential using three-dimensional dynamic geometry software can offer us in terms of how to develop the spatial geometry curriculum in lower secondary schools. By focusing on the impacts the…
Bagny Beilhe, Leïla; Piou, Cyril; Tadu, Zéphirin; Babin, Régis
2018-06-06
The use of ants for biological control of insect pests was the first reported case of conservation biological control. Direct and indirect community interactions between ants and pests lead to differential spatial patterns. We investigated spatial interactions between mirids, the major cocoa pests in West Africa, and numerically dominant ant species, using bivariate point pattern analysis to identify potential biological control agents. We assume that potential biological control agents should display negative spatial interactions with mirids, considering their niche overlap. The mirid/ant data were collected in complex cacao-based agroforestry systems sampled in three agroecological areas over a forest-savannah gradient in Cameroon. Three species, Crematogaster striatula Emery (Hymenoptera: Formicidae), Crematogaster clariventris Mayr (Hymenoptera: Formicidae), and Oecophylla longinoda Latreille (Hymenoptera: Formicidae), with highly predatory and aggressive behaviors, were identified as dominant and showed negative spatial relationships with mirids. The weaver ant, O. longinoda, was identified as the only potential biological control agent, considering its ubiquity in the plots, the similarity in niche requirements, and the spatial segregation with mirids, probably resulting from exclusion mechanisms. Combining bivariate point pattern analysis with sound knowledge of insect ecology proved an effective method for identifying a potentially good biological control agent.
A Photometric (griz) Metallicity Calibration for Cool Stars
NASA Astrophysics Data System (ADS)
West, Andrew A.; Davenport, James R. A.; Dhital, Saurav; Mann, Andrew; Massey, Angela P
2014-06-01
We present results from a study that uses wide pairs as tools for estimating and constraining the metal content of cool stars from their spectra and broad band colors. Specifically, we will present results that optimize the Mann et al. M dwarf metallicity calibrations (derived using wide binaries) for the optical regime covered by SDSS spectra. We will demonstrate the robustness of the new calibrations using a sample of wide, low-mass binaries for which both components have an SDSS spectrum. Using these new spectroscopic metallicity calibrations, we will present relations between the metallicities (from optical spectra) and the Sloan colors derived using more than 20,000 M dwarfs in the SDSS DR7 spectroscopic catalog. These relations have important ramifications for studies of Galactic chemical evolution, the search for exoplanets and subdwarfs, and are essential for surveys such as Pan-STARRS and LSST, which use griz photometry but have no spectroscopic component.
Transient Go: A Mobile App for Transient Astronomy Outreach
NASA Astrophysics Data System (ADS)
Crichton, D.; Mahabal, A.; Djorgovski, S. G.; Drake, A.; Early, J.; Ivezic, Z.; Jacoby, S.; Kanbur, S.
2016-12-01
Augmented Reality (AR) is set to revolutionize human interaction with the real world as demonstrated by the phenomenal success of `Pokemon Go'. That very technology can be used to rekindle the interest in science at the school level. We are in the process of developing a prototype app based on sky maps that will use AR to introduce different classes of astronomical transients to students as they are discovered i.e. in real-time. This will involve transient streams from surveys such as the Catalina Real-time Transient Survey (CRTS) today and the Large Synoptic Survey Telescope (LSST) in the near future. The transient streams will be combined with archival and latest image cut-outs and other auxiliary data as well as historical and statistical perspectives on each of the transient types being served. Such an app could easily be adapted to work with various NASA missions and NSF projects to enrich the student experience.
Probing Neutrino Hierarchy and Chirality via Wakes.
Zhu, Hong-Ming; Pen, Ue-Li; Chen, Xuelei; Inman, Derek
2016-04-08
The relic neutrinos are expected to acquire a bulk relative velocity with respect to the dark matter at low redshifts, and neutrino wakes are expected to develop downstream of the dark matter halos. We propose a method of measuring the neutrino mass based on this mechanism. This neutrino wake will cause a dipole distortion of the galaxy-galaxy lensing pattern. This effect could be detected by combining upcoming lensing surveys with a low redshift galaxy survey or a 21 cm intensity mapping survey, which can map the neutrino flow field. The data obtained with LSST and Euclid should enable us to make a positive detection if the three neutrino masses are quasidegenerate with each neutrino mass of ∼0.1 eV, and a future high precision 21 cm lensing survey would allow the normal hierarchy and inverted hierarchy cases to be distinguished, and even the right-handed Dirac neutrinos may be detectable.
Cosmic Visions Dark Energy: Small Projects Portfolio
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dawson, Kyle; Frieman, Josh; Heitmann, Katrin
Understanding cosmic acceleration is one of the key science drivers for astrophysics and high-energy physics in the coming decade (2014 P5 Report). With the Large Synoptic Survey Telescope (LSST) and the Dark Energy Spectroscopic Instrument (DESI) and other new facilities beginning operations soon, we are entering an exciting phase during which we expect an order of magnitude improvement in constraints on dark energy and the physics of the accelerating Universe. This is a key moment for a matching Small Projects portfolio that can (1) greatly enhance the science reach of these flagship projects, (2) have immediate scientific impact, and (3) lay the groundwork for the next stages of the Cosmic Frontier Dark Energy program. In this White Paper, we outline a balanced portfolio that can accomplish these goals through a combination of observational, experimental, and theory and simulation efforts.
Summary of LSST systems analysis and integration task for SPS flight test articles
NASA Astrophysics Data System (ADS)
Greenberg, H. S.
1981-02-01
The structural and equipment requirements for two solar power satellite (SPS) test articles are defined. The first SPS concept uses a hexagonal frame structure to stabilize the array of primary tension cables configured to support a Mills Cross antenna containing 17,925 subarrays composed of dipole radiating elements and solid state power amplifier modules. The second test article consists of a microwave antenna and its power source, a 20 by 200 m array of solar cell blankets, both of which are supported by the solar blanket array support structure. The test article structure, a ladder, comprises two longitudinal beams (215 m long) spaced 10 m apart and interconnected by six lateral beams. The system control module structure and bridge fitting provide bending and torsional stiffness, and supplement the in-plane Vierendeel structural behavior. Mission descriptions, construction, and structure interfaces are addressed.
Implications from XMM and Chandra Source Catalogs for Future Studies with Lynx
NASA Astrophysics Data System (ADS)
Ptak, Andrew
2018-01-01
Lynx will perform extremely sensitive X-ray surveys by combining very high-resolution imaging over a large field of view with a high effective area. These will include deep planned surveys and serendipitous source surveys. Here we discuss implications that can be gleaned from current Chandra and XMM-Newton serendipitous source surveys. These current surveys have discovered novel sources such as tidal disruption events, binary AGN, and ULX pulsars. In addition these surveys have detected large samples of normal galaxies, low-luminosity AGN and quasars due to the wide-area coverage of the Chandra and XMM-Newton source catalogs, allowing the evolution of these phenonema to be explored. The wide area Lynx surveys will probe down further in flux and will be coupled with very sensitive wide-area surveys such as LSST and SKA, allowing for detailed modeling of their SEDs and the discovery of rare, exotic sources and transient events.
Cosmic Evolution Through UV Spectroscopy (CETUS): A NASA Probe-Class Mission Concept
NASA Astrophysics Data System (ADS)
Heap, Sara R.; CETUS Team
2017-01-01
CETUS is a probe-class mission concept proposed for study to NASA in November 2016. Its overarching objective is to provide access to the ultraviolet (~100-400 nm) after Hubble has died. CETUS will be a major player in the emerging global network of powerful, new telescopes such as E-ROSITA, DESI, Subaru/PFS, GMT, LSST, WFIRST, JWST, and SKA. The CETUS mission concept provisionally features a 1.5-m telescope with a suite of instruments including a near-UV multi-object spectrograph (200-400 nm) complementing Subaru/PFS observations, wide-field far-UV and near-UV cameras, and far-UV and near-UV spectrographs that can be operated in either high-resolution or low-resolution mode. We have derived the scope and specific science requirements for CETUS for understanding the evolutionary history of galaxies, stars, and dust, but other applications are possible.
The science enabled by the Maunakea Spectroscopic Explorer
NASA Astrophysics Data System (ADS)
Martin, N. F.; Babusiaux, C.
2017-12-01
With its unique wide-field, multi-object, and dedicated spectroscopic capabilities, the Maunakea Spectroscopic Explorer (MSE) is a powerful facility to shed light on the faint Universe. Built around an upgrade of the Canada-France-Hawaii Telescope (CFHT) to an 11.25-meter telescope with a dedicated ~1.5 deg^2, 4,000-fiber wide-field spectrograph that covers the optical and near-infrared wavelengths at resolutions between 2,500 and 40,000, the MSE is the essential follow-up complement to the current and next generations of multi-wavelength imaging surveys, such as the LSST, Gaia, Euclid, eROSITA, SKA, and WFIRST, and is an ideal feeder facility for the extremely large telescopes that are currently being built (E-ELT, GMT, and TMT). The science enabled by the MSE is vast and would have an impact on almost all aspects of astronomy research.
Estimating explosion properties of normal hydrogen-rich core-collapse supernovae
NASA Astrophysics Data System (ADS)
Pejcha, Ondrej
2017-08-01
Recent parameterized 1D explosion models of hundreds of core-collapse supernova progenitors suggest that success and failure are intertwined in a complex pattern that is not a simple function of the progenitor initial mass. This rugged landscape is present also in other explosion properties, allowing for quantitative tests of the neutrino mechanism from observations of the hundreds of supernovae discovered every year. We present a new self-consistent and versatile method that derives photospheric radius and temperature variations of normal hydrogen-rich core-collapse supernovae based on their photometric measurements and expansion velocities. We construct SED and bolometric light curves, and determine explosion energies, ejecta and nickel masses while taking into account all uncertainties and covariances of the model. We describe the efforts to compare the inferences to the predictions of the neutrino mechanism. The model can be adapted to include more physical assumptions to utilize primarily photometric data coming from surveys such as LSST.
The applications of deep neural networks to sdBV classification
NASA Astrophysics Data System (ADS)
Boudreaux, Thomas M.
2017-12-01
With several new large-scale surveys on the horizon, including LSST, TESS, ZTF, and Evryscope, faster and more accurate analysis methods will be required to adequately process the enormous amount of data produced. Deep learning, used in industry for years now, allows for advanced feature detection in minimally prepared datasets at very high speeds; however, despite the advantages of this method, its application to astrophysics has not yet been extensively explored. This dearth may be due to a lack of training data available to researchers. Here we generate synthetic data loosely mimicking the properties of acoustic-mode pulsating stars, and we show that two separate paradigms of deep learning - the artificial neural network and the convolutional neural network - can both be used to classify this synthetic data effectively. Additionally, this classification can be performed at relatively high accuracy with minimal time spent adjusting network hyperparameters.
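The synthetic-training-data step described above can be sketched minimally: a "pulsator" light curve is white noise plus a few sinusoids with short, p-mode-like periods. The amplitudes, periods, and cadence below are illustrative assumptions, not the values used in the paper.

```python
import math, random

def synthetic_lightcurve(pulsator, n=600, cadence_s=30.0, noise=1.0, seed=0):
    """Generate a synthetic light curve: Gaussian noise, plus (for a
    'pulsator') three sinusoids with periods of a few minutes, loosely
    mimicking sdBV acoustic modes.  All parameters are illustrative."""
    rng = random.Random(seed)
    modes = ([(rng.uniform(100.0, 400.0), rng.uniform(3.0, 6.0))
              for _ in range(3)] if pulsator else [])
    flux = []
    for i in range(n):
        t = i * cadence_s
        f = rng.gauss(0.0, noise)  # photometric noise
        for period, amp in modes:
            f += amp * math.sin(2.0 * math.pi * t / period)
        flux.append(f)
    return flux

def std(xs):
    m = sum(xs) / len(xs)
    return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

quiet = synthetic_lightcurve(False, seed=1)
pulsating = synthetic_lightcurve(True, seed=1)
```

Labeled batches of such curves (light curve, pulsator flag) are the kind of minimally prepared input a neural-network classifier can be trained on.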
Vaughan, Adam S; Kramer, Michael R; Cooper, Hannah L F; Rosenberg, Eli S; Sullivan, Patrick S
2017-02-01
Theory and research on HIV among men who have sex with men (MSM) have long suggested the importance of non-residential locations in defining structural exposures. Despite this, most studies within these fields define place as a residential context, neglecting the potential influence of non-residential locations on HIV-related outcomes. The concept of activity spaces, defined as a set of locations to which an individual is routinely exposed, represents one theoretical basis for addressing this potential imbalance. Using a one-time online survey to collect demographic, behavioral, and spatial data from MSM, this paper describes activity spaces and examines correlates of this spatial variation. We used latent class analysis to identify categories of activity spaces using spatial data on home, routine, potential sexual risk, and HIV prevention locations. We then assessed individual and area-level covariates for their associations with these categories. Classes were distinguished by the degree of spatial variation in routine and prevention behaviors (which were the same within each class) and in sexual risk behaviors (i.e., sex locations and locations of meeting sex partners). Partner type (e.g., casual or main) represented a key correlate of the activity space. In this early examination of activity spaces in an online sample of MSM, patterns of spatial behavior represent further evidence of significant spatial variation in locations of routine, potential HIV sexual risk, and HIV prevention behaviors among MSM. Although prevention behaviors tend to have similar geographic variation to routine behaviors, locations where men engage in potentially high-risk behaviors may be more spatially focused for some MSM than for others. Copyright © 2016 Elsevier Ltd. All rights reserved.
Loureiro, Adriana; Costa, Cláudia; Almendra, Ricardo; Freitas, Ângela; Santana, Paula
2015-11-01
This study's aims are: (i) identifying spatial patterns for the risk of hospitalization due to mental illness and for the potential risk resulting from contextual factors with influence on mental health; and (ii) analyzing the spatial association between risk of hospitalization due to mental illness and potential risk resulting from contextual factors in the metropolitan areas of Lisbon and Porto, Portugal. A cross-sectional ecological study was conducted by applying statistical methods for assessing spatial dependency and heterogeneity. Results reveal a spatial association of moderate intensity between risk of hospitalization due to mental illness and potential risk resulting from contextual factors. Twenty percent of the population under study lives in areas with both a high potential risk resulting from contextual factors and a high risk of hospitalization due to mental illness. The Porto Metropolitan Area shows the highest percentage of population living in parishes with a significantly high risk of hospitalization due to mental illness, which points to the need for interventions on territory-adjusted contextual factors influencing mental health.
The Potential for Spatial Distribution Indices to Signal Thresholds in Marine Fish Biomass
Reuchlin-Hugenholtz, Emilie
2015-01-01
The frequently observed positive relationship between fish population abundance and spatial distribution suggests that changes in distribution can be indicative of trends in abundance. If contractions in spatial distribution precede declines in spawning stock biomass (SSB), spatial distribution reference points could complement the SSB reference points that are commonly used in marine conservation biology and fisheries management. When relevant spatial distribution information is integrated into fisheries management and recovery plans, risks and uncertainties associated with a plan based solely on the SSB criterion would be reduced. To assess the added value of spatial distribution data, we examine the relationship between SSB and four metrics of spatial distribution intended to reflect changes in population range, concentration, and density for 10 demersal populations (9 species) inhabiting the Scotian Shelf, Northwest Atlantic. Our primary purpose is to assess their potential to serve as indices of SSB, using fisheries independent survey data. We find that metrics of density offer the best correlate of spawner biomass. A decline in the frequency of encountering high density areas is associated with, and in a few cases preceded by, rapid declines in SSB in 6 of 10 populations. Density-based indices have considerable potential to serve both as an indicator of SSB and as spatially based reference points in fisheries management. PMID:25789624
Özdem, Ceylan; Brass, Marcel; Van der Cruyssen, Laurens; Van Overwalle, Frank
2017-04-01
Neuroimaging research has demonstrated that the temporo-parietal junction (TPJ) is activated when unexpected stimuli appear in spatial reorientation tasks as well as during thinking about the beliefs of other people triggered by verbal scenarios. While the role of potential common component processes subserved by the TPJ has been extensively studied to explain this common activation, the potential confounding role of input modality (spatial vs. verbal) has been largely ignored. To investigate the role of input modality apart from task processes, we developed a novel spatial false belief task based on moving shapes. We explored the overlap in TPJ activation across this novel task and traditional tasks of spatial reorientation (Posner) and verbal belief (False Belief vs. Photo stories). The results show substantial overlap across the same spatial input modality (both reorientation and false belief) as well as across the common task process (verbal and spatial belief), but no triple overlap. This suggests the potential for an overarching function of the TPJ, with some degree of specialization in different subregions due to modality, function and connectivity. The results are discussed with respect to recent theoretical models of the TPJ.
Banerjee, Samiran
2012-01-01
Ammonia oxidation is a major process in nitrogen cycling, and it plays a key role in nitrogen-limited soil ecosystems such as those in the arctic. Although mm-scale spatial dependency of ammonia oxidizers has been investigated, little is known about the field-scale spatial dependency of aerobic ammonia oxidation processes and ammonia-oxidizing archaeal and bacterial communities, particularly in arctic soils. The purpose of this study was to explore the drivers of ammonia oxidation at the field scale in cryosols (soils with permafrost within 1 m of the surface). We measured aerobic ammonia oxidation potential (both autotrophic and heterotrophic) and functional gene abundance (bacterial amoA and archaeal amoA) in 279 soil samples collected from three arctic ecosystems. The variability associated with quantifying genes was substantially less than the spatial variability observed in these soils, suggesting that molecular methods can be used to reliably evaluate spatial dependency in arctic ecosystems. Ammonia-oxidizing archaeal and bacterial communities and aerobic ammonia oxidation were spatially autocorrelated. Gene abundances were spatially structured within 4 m, whereas biochemical processes were structured within 40 m. Ammonia oxidation was driven at small scales (<1 m) by moisture and total organic carbon, whereas gene abundance and other edaphic factors drove ammonia oxidation at medium (1 to 10 m) and large (10 to 100 m) scales. In these arctic soils heterotrophs contributed between 29 and 47% of total ammonia oxidation potential. The spatial scale for aerobic ammonia oxidation genes differed from potential ammonia oxidation, suggesting that in arctic ecosystems edaphic, rather than genetic, factors are an important control on ammonia oxidation. PMID:22081570
Evaluation of potential water conservation using site-specific irrigation
USDA-ARS?s Scientific Manuscript database
With the advent of site-specific variable-rate irrigation (VRI) systems, irrigation can be spatially managed within sub-field-sized zones. Spatial irrigation management can optimize spatial water use efficiency and may conserve water. Spatial VRI systems are currently being managed by consultants ...
Ecologic Niche Modeling and Spatial Patterns of Disease Transmission
2006-01-01
Ecologic niche modeling (ENM) is a growing field with many potential applications to questions regarding the geography and ecology of disease transmission. Specifically, ENM has the potential to inform investigations concerned with the geography, or potential geography, of vectors, hosts, pathogens, or human cases, and it can achieve fine spatial resolution without the loss of information inherent in many other techniques. Potential applications and current frontiers and challenges are reviewed. PMID:17326931
Virtual Reality: An Instructional Medium for Visual-Spatial Tasks.
ERIC Educational Resources Information Center
Regian, J. Wesley; And Others
1992-01-01
Describes an empirical exploration of the instructional potential of virtual reality as an interface for simulation-based training. Shows that subjects learned spatial-procedural and spatial-navigational skills in virtual reality. (SR)
Exploration-driven NEO Detection Requirements
NASA Astrophysics Data System (ADS)
Head, J. N.; Sykes, M. V.
2005-12-01
The Vision for Space Exploration calls for the use of in situ resources to support human solar system exploration goals. Focus has been on potential lunar polar ice, Martian subsurface water, and resource extraction from Phobos. Near-Earth objects (NEOs) offer easily accessible targets that may represent a critical component to achieving sustainable human operations, in particular small, newly discovered asteroids within a specified dynamical range having requisite composition and frequency. A minimum size requirement is estimated assuming a concept of operations (CONOPS) with an NEO harvester on station at L1. When the NEO launch window opens, the vehicle departs, rendezvousing within 30 days. Mining and processing operations (~60 days) produce dirty water for the return trip (~30 days) to L1 for final refinement into propellants. The market for propellant at L1 is estimated to be 700 mT/year: 250 mT for Mars missions, 100 mT for GTO services (Blair et al. 2002), 50 mT for L1-to-lunar-surface services, and 300 mT for bringing NEO-derived propellants to L1. Assuming an appropriate NEO has 5% recoverable water, exploited with 50% efficiency, 23,000 mT/year must be processed. At 1500 kg/m^3, this corresponds to one object per year with a radius of 15 meters, or two 5 m radius objects per month, of which it is estimated there are 10,000 having delta-v < 4.2 km/s and 200/year of these available for short round-trip missions to meet resource requirements (Jones et al. 2002). The importance of these potential resource objects should drive a requirement that next-generation NEO detection systems (e.g., Pan-STARRS/LSST) be capable by 2010 of detecting dark NEOs fainter than V=24, allowing for identification 3 months before closest approach. Blair et al. 2002. Final Report to NASA Exploration Team, December 20, 2002. Jones et al. 2002. ASP Conf. Series Vol. 202 (M. Sykes, Ed.), pp. 141-154.
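The sizing arithmetic in the abstract can be checked directly: 23,000 mT/year of raw material at an assumed bulk density of 1500 kg/m^3 corresponds to a single sphere of roughly 15 m radius. The density, water fraction, and efficiency below are the abstract's stated assumptions.

```python
import math

MT = 1000.0                     # 1 metric ton in kg
mass_per_year = 23_000 * MT     # kg of raw material processed annually
density = 1500.0                # kg/m^3 (assumed bulk density)
water_fraction = 0.05           # recoverable water fraction (assumed)
efficiency = 0.50               # extraction efficiency (assumed)

volume = mass_per_year / density                    # m^3 per year
radius = (3.0 * volume / (4.0 * math.pi)) ** (1/3)  # single-sphere radius, m
water_yield = mass_per_year * water_fraction * efficiency / MT  # mT/year
```

The computed radius comes out near 15.4 m, consistent with the quoted "one object per year with a radius of 15 meters"; the water yield is ~575 mT/year, so the abstract's round numbers should be read as order-of-magnitude estimates against the 700 mT/year market.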
Matter power spectrum and the challenge of percent accuracy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schneider, Aurel; Teyssier, Romain; Potter, Doug
2016-04-01
Future galaxy surveys require one percent precision in the theoretical knowledge of the power spectrum over a large range including very nonlinear scales. While this level of accuracy is easily obtained in the linear regime with perturbation theory, it represents a serious challenge for small scales where numerical simulations are required. In this paper we quantify the precision of present-day N-body methods, identifying main potential error sources from the set-up of initial conditions to the measurement of the final power spectrum. We directly compare three widely used N-body codes, Ramses, Pkdgrav3, and Gadget3, which represent three main discretisation techniques: the particle-mesh method, the tree method, and a hybrid combination of the two. For standard run parameters, the codes agree to within one percent at k ≤ 1 h Mpc^-1 and to within three percent at k ≤ 10 h Mpc^-1. We also consider the bispectrum and show that the reduced bispectra agree at the sub-percent level for k ≤ 2 h Mpc^-1. In a second step, we quantify potential errors due to initial conditions, box size, and resolution using an extended suite of simulations performed with our fastest code, Pkdgrav3. We demonstrate that the simulation box size should not be smaller than L = 0.5 h^-1 Gpc to avoid systematic finite-volume effects (while much larger boxes are required to beat down the statistical sample variance). Furthermore, a maximum particle mass of M_p = 10^9 h^-1 M_⊙ is required to conservatively obtain one percent precision of the matter power spectrum. As a consequence, numerical simulations covering large survey volumes of upcoming missions such as DES, LSST, and Euclid will need more than a trillion particles to reproduce clustering properties at the targeted accuracy.
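The closing trillion-particle claim follows from the quoted particle-mass requirement: the particle count is the box's total matter mass divided by M_p. The values of Omega_m and the critical density below are standard assumptions, not taken from the paper; the 2.5 h^-1 Gpc survey-scale box is an illustrative choice.

```python
# Rough check: at M_p = 1e9 h^-1 Msun, a survey-scale box needs > 1e12
# particles.  Cosmological parameters here are assumed fiducial values.
RHO_CRIT = 2.775e11   # critical density, h^2 Msun / Mpc^3
OMEGA_M = 0.3         # assumed matter density parameter
M_P = 1.0e9           # particle mass, h^-1 Msun (paper's requirement)

def n_particles(box_gpc):
    """Particles needed for a cube of side box_gpc (h^-1 Gpc)."""
    volume = (box_gpc * 1000.0) ** 3          # (h^-1 Mpc)^3
    total_mass = OMEGA_M * RHO_CRIT * volume  # h^-1 Msun in the box
    return total_mass / M_P

n_min_box = n_particles(0.5)   # the paper's minimum box size: ~1e10
n_survey = n_particles(2.5)    # illustrative survey-scale box: >1e12
```

Even the minimum 0.5 h^-1 Gpc box needs roughly 10^10 particles at this mass resolution, and scaling the side length by five multiplies the count by 125, which is why survey-volume runs cross the trillion-particle mark.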
Constructing Concept Schemes From Astronomical Telegrams Via Natural Language Clustering
NASA Astrophysics Data System (ADS)
Graham, Matthew; Zhang, M.; Djorgovski, S. G.; Donalek, C.; Drake, A. J.; Mahabal, A.
2012-01-01
The rapidly emerging field of time domain astronomy is one of the most exciting and vibrant new research frontiers, ranging in scientific scope from studies of the Solar System to extreme relativistic astrophysics and cosmology. It is being enabled by a new generation of large synoptic digital sky surveys - LSST, PanStarrs, CRTS - that cover large areas of sky repeatedly, looking for transient objects and phenomena. One of the biggest challenges facing these surveys is the automated classification of transient events, a process that needs machine-processible astronomical knowledge. Semantic technologies enable the formal representation of concepts and relations within a particular domain. ATELs (http://www.astronomerstelegram.org) are a commonly-used means for reporting and commenting upon new astronomical observations of transient sources (supernovae, stellar outbursts, blazar flares, etc). However, they are loose and unstructured and employ scientific natural language for description: this makes automated processing of them - a necessity within the next decade with petascale data rates - a challenge. Nevertheless they represent a potentially rich corpus of information that could lead to new and valuable insights into transient phenomena. This project lies in the cutting-edge field of astrosemantics, a branch of astroinformatics, which applies semantic technologies to astronomy. The ATELs have been used to develop an appropriate concept scheme - a representation of the information they contain - for transient astronomy using hierarchical clustering of processed natural language. This allows us to automatically organize ATELs based on the vocabulary used. We conclude that we can use simple algorithms to process and extract meaning from astronomical textual data.
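The "hierarchical clustering of processed natural language" step can be sketched with a toy agglomerative clusterer: bag-of-words vectors, cosine similarity, and average-linkage merging. This is a minimal illustration of the technique, not the project's actual pipeline, and the sample telegram texts are invented.

```python
import math, re
from collections import Counter

def vectorize(text):
    """Bag-of-words term counts (a stand-in for real NLP processing)."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a if w in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def average_link(ci, cj, sims):
    """Average pairwise similarity between two clusters of indices."""
    return sum(sims[i][j] for i in ci for j in cj) / (len(ci) * len(cj))

def cluster(texts, n_clusters=2):
    """Agglomerative clustering: repeatedly merge the most similar pair."""
    vecs = [vectorize(t) for t in texts]
    sims = [[cosine(a, b) for b in vecs] for a in vecs]
    clusters = [[i] for i in range(len(texts))]
    while len(clusters) > n_clusters:
        i, j = max(((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
                   key=lambda p: average_link(clusters[p[0]],
                                              clusters[p[1]], sims))
        clusters[i] += clusters[j]
        del clusters[j]
    return clusters
```

Run on a handful of telegram-like snippets, supernova reports group away from blazar-flare reports purely on shared vocabulary, which is the organizing principle the abstract describes.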
MOSFiT: Modular Open Source Fitter for Transients
NASA Astrophysics Data System (ADS)
Guillochon, James; Nicholl, Matt; Villar, V. Ashley; Mockler, Brenna; Narayan, Gautham; Mandel, Kaisey S.; Berger, Edo; Williams, Peter K. G.
2018-05-01
Much of the progress made in time-domain astronomy is accomplished by relating observational multiwavelength time-series data to models derived from our understanding of physical laws. This goal is typically accomplished by dividing the task in two: collecting data (observing), and constructing models to represent that data (theorizing). Owing to the natural tendency for specialization, a disconnect can develop between the best available theories and the best available data, potentially delaying advances in our understanding of new classes of transients. We introduce MOSFiT: the Modular Open Source Fitter for Transients, a Python-based package that downloads transient data sets from open online catalogs (e.g., the Open Supernova Catalog), generates Monte Carlo ensembles of semi-analytical light-curve fits to those data sets and their associated Bayesian parameter posteriors, and optionally delivers the fitting results back to those same catalogs to make them available to the rest of the community. MOSFiT is designed to help bridge the gap between observations and theory in time-domain astronomy; in addition to making the application of existing models and creation of new models as simple as possible, MOSFiT yields statistically robust predictions for transient characteristics, with a standard output format that includes all the setup information necessary to reproduce a given result. As large-scale surveys, such as that conducted with the Large Synoptic Survey Telescope (LSST), discover entirely new classes of transients, tools such as MOSFiT will be critical for enabling rapid comparison of models against data in statistically consistent, reproducible, and scientifically beneficial ways.
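The Monte Carlo ensemble fitting that MOSFiT performs can be illustrated with a toy example. The sketch below is not MOSFiT's API: it fits a hypothetical exponentially declining light curve with a hand-rolled Metropolis sampler, using only the standard library, to show the idea of drawing Bayesian parameter posteriors for a semi-analytical model.

```python
import math
import random

random.seed(42)

# Toy "semi-analytical" light-curve model: exponentially declining flux.
def model_flux(t, amp, tau):
    return amp * math.exp(-t / tau)

# Synthetic observations from known parameters, with Gaussian noise.
AMP_TRUE, TAU_TRUE, SIGMA = 10.0, 5.0, 0.3
times = [float(t) for t in range(10)]
fluxes = [model_flux(t, AMP_TRUE, TAU_TRUE) + random.gauss(0, SIGMA)
          for t in times]

def log_likelihood(amp, tau):
    if amp <= 0 or tau <= 0:
        return -math.inf
    return -0.5 * sum((f - model_flux(t, amp, tau)) ** 2 / SIGMA ** 2
                      for t, f in zip(times, fluxes))

# Metropolis sampler over (amp, tau): accept uphill moves always,
# downhill moves with probability exp(delta log-likelihood).
def sample(n_steps=5000, burn=1000):
    amp, tau = 8.0, 3.0
    ll = log_likelihood(amp, tau)
    chain = []
    for _ in range(n_steps):
        amp_p = amp + random.gauss(0, 0.2)
        tau_p = tau + random.gauss(0, 0.2)
        ll_p = log_likelihood(amp_p, tau_p)
        if ll_p > ll or random.random() < math.exp(ll_p - ll):
            amp, tau, ll = amp_p, tau_p, ll_p
        chain.append((amp, tau))
    return chain[burn:]

posterior = sample()
tau_hat = sum(t for _, t in posterior) / len(posterior)  # posterior mean
```

With the seed fixed, the posterior mean of `tau` recovers the true decline timescale to within the noise; a production tool would use an ensemble sampler and physically motivated models rather than this single-chain toy.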
Measuring the velocity field from type Ia supernovae in an LSST-like sky survey
DOE Office of Scientific and Technical Information (OSTI.GOV)
Odderskov, Io; Hannestad, Steen, E-mail: isho07@phys.au.dk, E-mail: sth@phys.au.dk
2017-01-01
In a few years, the Large Synoptic Survey Telescope will vastly increase the number of type Ia supernovae observed in the local universe. This will allow for a precise mapping of the velocity field and, since the source of peculiar velocities is variations in the density field, cosmological parameters related to the matter distribution can subsequently be extracted from the velocity power spectrum. One way to quantify this is through the angular power spectrum of radial peculiar velocities on spheres at different redshifts. We investigate how well this observable can be measured, despite the problems caused by areas with no information. To obtain a realistic distribution of supernovae, we create mock supernova catalogs by using a semi-analytical code for galaxy formation on the merger trees extracted from N-body simulations. We measure the cosmic variance in the velocity power spectrum by repeating the procedure many times for differently located observers, and vary several aspects of the analysis, such as the observer environment, to see how this affects the measurements. Our results confirm the findings from earlier studies regarding the precision with which the angular velocity power spectrum can be determined in the near future. This level of precision has been found to imply that the angular velocity power spectrum from type Ia supernovae is competitive in its potential to measure parameters such as σ₈. This makes the peculiar velocity power spectrum from type Ia supernovae a promising new observable, which deserves further attention.
The Strong Lensing Time Delay Challenge (2014)
NASA Astrophysics Data System (ADS)
Liao, Kai; Dobler, G.; Fassnacht, C. D.; Treu, T.; Marshall, P. J.; Rumbaugh, N.; Linder, E.; Hojjati, A.
2014-01-01
Time delays between multiple images in strong lensing systems are a powerful probe of cosmology. At the moment the application of this technique is limited by the number of lensed quasars with measured time delays. However, the number of such systems is expected to increase dramatically in the next few years. Hundreds of such systems are expected within this decade, while the Large Synoptic Survey Telescope (LSST) is expected to deliver of order 1000 time delays in the 2020s. In order to exploit this bounty of lenses we need to make sure the time delay determination algorithms have sufficiently high precision and accuracy. As a first step to test current algorithms and identify potential areas for improvement we have started a "Time Delay Challenge" (TDC). An "evil" team has created realistic simulated light curves, to be analyzed blindly by "good" teams. The challenge is open to all interested parties. The initial challenge consists of two steps (TDC0 and TDC1). TDC0 consists of a small number of datasets to be used as a training template. The non-mandatory deadline is December 1, 2013. The "good" teams that complete TDC0 will be given access to TDC1. TDC1 consists of thousands of lightcurves, a number sufficient to test precision and accuracy at the subpercent level, necessary for time-delay cosmography. The deadline for responding to TDC1 is July 1, 2014. Submissions will be analyzed and compared in terms of predefined metrics to establish the goodness-of-fit, efficiency, precision and accuracy of current algorithms. This poster describes the challenge in detail and gives instructions for participation.
Impact of Atmospheric Chromatic Effects on Weak Lensing Measurements
NASA Astrophysics Data System (ADS)
Meyers, Joshua E.; Burchat, Patricia R.
2015-07-01
Current and future imaging surveys will measure cosmic shear with statistical precision that demands a deeper understanding of potential systematic biases in galaxy shape measurements than has been achieved to date. We use analytic and computational techniques to study the impact on shape measurements of two atmospheric chromatic effects for ground-based surveys such as the Dark Energy Survey and the Large Synoptic Survey Telescope (LSST): (1) atmospheric differential chromatic refraction and (2) wavelength dependence of seeing. We investigate the effects of using the point-spread function (PSF) measured with stars to determine the shapes of galaxies that have different spectral energy distributions than the stars. We find that both chromatic effects lead to significant biases in galaxy shape measurements for current and future surveys, if not corrected. Using simulated galaxy images, we find a form of chromatic “model bias” that arises when fitting a galaxy image with a model that has been convolved with a stellar, instead of galactic, PSF. We show that both forms of atmospheric chromatic biases can be predicted (and corrected) with minimal model bias by applying an ordered set of perturbative PSF-level corrections based on machine-learning techniques applied to six-band photometry. Catalog-level corrections do not address the model bias. We conclude that achieving the ultimate precision for weak lensing from current and future ground-based imaging surveys requires a detailed understanding of the wavelength dependence of the PSF from the atmosphere, and from other sources such as optics and sensors. The source code for this analysis is available at https://github.com/DarkEnergyScienceCollaboration/chroma.
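The wavelength dependence of seeing discussed above follows, for Kolmogorov turbulence, FWHM ∝ λ^(-1/5). A minimal sketch of the resulting PSF-size mismatch between objects of different color; the fiducial FWHM and the effective wavelengths below are illustrative assumptions, not the paper's numbers:

```python
def seeing_fwhm(wavelength_nm, fwhm_500=0.7):
    # Kolmogorov-turbulence seeing: FWHM scales as wavelength^(-1/5).
    # fwhm_500 is a fiducial FWHM in arcsec at 500 nm (assumed value).
    return fwhm_500 * (wavelength_nm / 500.0) ** -0.2

# Effective wavelengths (nm) of a blue star versus a redder galaxy
# within one filter band -- illustrative values only.
star_eff, galaxy_eff = 480.0, 530.0
ratio = seeing_fwhm(star_eff) / seeing_fwhm(galaxy_eff)
# ratio - 1 is the fractional PSF-size mismatch, here about 2 percent,
# which is large compared with weak-lensing shape-measurement tolerances.
```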
Awh, E; Anllo-Vento, L; Hillyard, S A
2000-09-01
We investigated the hypothesis that the covert focusing of spatial attention mediates the on-line maintenance of location information in spatial working memory. During the delay period of a spatial working-memory task, behaviorally irrelevant probe stimuli were flashed at both memorized and nonmemorized locations. Multichannel recordings of event-related potentials (ERPs) were used to assess visual processing of the probes at the different locations. Consistent with the hypothesis of attention-based rehearsal, early ERP components were enlarged in response to probes that appeared at memorized locations. These visual modulations were similar in latency and topography to those observed after explicit manipulations of spatial selective attention in a parallel experimental condition that employed an identical stimulus display.
Spatially distributed potential evapotranspiration modeling and climate projections.
Gharbia, Salem S; Smullen, Trevor; Gill, Laurence; Johnston, Paul; Pilla, Francesco
2018-08-15
Evapotranspiration integrates energy and mass transfer between the Earth's surface and atmosphere and is the most active mechanism linking the atmosphere, hydrosphere, lithosphere and biosphere. This study focuses on fine-resolution modeling and projection of spatially distributed potential evapotranspiration at the large-catchment scale in response to climate change. Six potential evapotranspiration algorithms, systematically selected based on structured criteria and data availability, were applied and then validated against long-term mean monthly data for the Shannon River catchment at a 50 m² cell size. The best-validated algorithm was then applied to evaluate the possible effect of future climate change on potential evapotranspiration rates. Spatially distributed potential evapotranspiration projections were modeled based on climate change projections from multi-GCM ensembles for three future time intervals (2020, 2050 and 2080) using a range of Representative Concentration Pathways, producing four scenarios for each time interval. Finally, seasonal results were compared to baseline results to evaluate the impact of climate change on potential evapotranspiration and therefore on the catchment's dynamical water balance. The results present evidence that the modeled climate change scenarios would have a significant impact on future potential evapotranspiration rates. All the simulated scenarios predicted an increase in potential evapotranspiration for each modeled future time interval, which would significantly affect the dynamical catchment water balance. This study addresses a gap in the literature on using GIS-based algorithms to model fine-scale spatially distributed potential evapotranspiration for large catchment systems based on climatological observations and simulations in different climatological zones.
Providing fine-scale potential evapotranspiration data is crucial for assessing the dynamical catchment water balance and setting up management scenarios for water abstractions. This study illustrates a transferable, systematic method for designing GIS-based algorithms to simulate spatially distributed potential evapotranspiration for large catchment systems.
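As an illustration of the kind of algorithm evaluated in such studies (the abstract does not name the six candidates), the widely used Hargreaves equation estimates reference evapotranspiration from temperature and extraterrestrial radiation alone:

```python
import math

def hargreaves_pet(t_mean, t_max, t_min, ra):
    # Hargreaves reference evapotranspiration in mm/day.
    # t_mean, t_max, t_min: daily air temperatures (deg C); ra is
    # extraterrestrial radiation in mm/day of evaporation equivalent.
    # 0.0023 is the standard Hargreaves coefficient.
    return 0.0023 * ra * (t_mean + 17.8) * math.sqrt(t_max - t_min)

# Illustrative mid-latitude summer day (values assumed, not from the study).
pet = hargreaves_pet(t_mean=20.0, t_max=26.0, t_min=14.0, ra=15.0)
```

In a spatially distributed model this function would be evaluated per raster cell, with temperature surfaces interpolated from stations and radiation computed from latitude and day of year.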
Power-law spatial dispersion from fractional Liouville equation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tarasov, Vasily E.
2013-10-15
A microscopic model in the framework of fractional kinetics is suggested to describe spatial dispersion of power-law type. The Liouville equation with the Caputo fractional derivatives is used to obtain the power-law dependence of the absolute permittivity on the wave vector. The fractional differential equations for the electrostatic potential in media with power-law spatial dispersion are derived. The particular solutions of these equations for the electric potential of a point charge in such media are considered.
Morin, Dana J.; Fuller, Angela K.; Royle, J. Andrew; Sutherland, Chris
2017-01-01
Conservation and management of spatially structured populations is challenging because solutions must consider where individuals are located, but also differential individual space use as a result of landscape heterogeneity. A recent extension of spatial capture–recapture (SCR) models, the ecological distance model, uses spatial encounter histories of individuals (e.g., a record of where individuals are detected across space, often sequenced over multiple sampling occasions), to estimate the relationship between space use and characteristics of a landscape, allowing simultaneous estimation of both local densities of individuals across space and connectivity at the scale of individual movement. We developed two model-based estimators derived from the SCR ecological distance model to quantify connectivity over a continuous surface: (1) potential connectivity—a metric of the connectivity of areas based on resistance to individual movement; and (2) density-weighted connectivity (DWC)—potential connectivity weighted by estimated density. Estimates of potential connectivity and DWC can provide spatial representations of areas that are most important for the conservation of threatened species, or management of abundant populations (i.e., areas with high density and landscape connectivity), and thus generate predictions that have great potential to inform conservation and management actions. We used a simulation study with a stationary trap design across a range of landscape resistance scenarios to evaluate how well our model estimates resistance, potential connectivity, and DWC. Correlation between true and estimated potential connectivity was high, and there was positive correlation and high spatial accuracy between estimated DWC and true DWC. We applied our approach to data collected from a population of black bears in New York, and found that forested areas represented low levels of resistance for black bears. 
We demonstrate that formal inference about measures of landscape connectivity can be achieved from standard methods of studying animal populations which yield individual encounter history data such as camera trapping. Resulting biological parameters including resistance, potential connectivity, and DWC estimate the spatial distribution and connectivity of the population within a statistical framework, and we outline applications to many possible conservation and management problems.
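The "ecological distance" underlying potential connectivity can be sketched as a least-cost distance over a resistance raster. The following is a minimal illustration; the step-cost convention of averaging the two cells crossed is an assumption for the sketch, not the authors' exact formulation:

```python
import heapq

def least_cost_distance(resistance, start):
    # Dijkstra over a 2-D resistance grid with 4-neighbour moves.
    # Each step costs the mean resistance of the two cells crossed.
    rows, cols = len(resistance), len(resistance[0])
    dist = {start: 0.0}
    heap = [(0.0, start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if d > dist.get((r, c), float("inf")):
            continue  # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + 0.5 * (resistance[r][c] + resistance[nr][nc])
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(heap, (nd, (nr, nc)))
    return dist

# A low-resistance (e.g. forested) corridor through a high-resistance matrix.
grid = [[1, 9, 9],
        [1, 1, 1],
        [9, 9, 1]]
d = least_cost_distance(grid, (0, 0))
```

Potential connectivity between two locations then decays with this least-cost distance, and density-weighted connectivity would weight it by the estimated density surface.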
On Electron-Positron Pair Production by a Spatially Inhomogeneous Electric Field
NASA Astrophysics Data System (ADS)
Chervyakov, A.; Kleinert, H.
2018-05-01
A detailed analysis of electron-positron pair creation from the vacuum, induced by a spatially non-uniform and static electric field, is presented. A typical example is provided by the Sauter potential. For this potential, we derive analytic expressions for the vacuum decay and pair-production rates, accounting for the entire range of spatial variations. In the limit of a sharp step, we recover the divergent result due to the singular electric field at the origin. The limit of a constant field reproduces the classical result of Euler, Heisenberg and Schwinger, if the latter is properly averaged over the width of the spatial variation. Pair production by the Sauter potential is described for different regimes, from weak to strong fields. For all these regimes, the locally constant-field rate is shown to be an upper limit.
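For reference, the constant-field limit mentioned above is the Schwinger pair-production probability per unit volume and time (quoted here in natural units, ħ = c = 1, as standard context rather than from the paper itself):

```latex
w \;=\; \frac{(eE)^{2}}{4\pi^{3}}
\sum_{n=1}^{\infty} \frac{1}{n^{2}}
\exp\!\left(-\frac{n\pi m^{2}}{eE}\right)
```

The exponential suppression by the critical scale m²/e explains why a locally constant-field estimate bounds the rate for smoothly varying fields such as the Sauter profile.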
Li, Qi; Yu, Hongtao; Wu, Yan; Gao, Ning
2016-08-26
The integration of multiple sensory inputs is essential for perception of the external world. The spatial factor is a fundamental property of multisensory audiovisual integration. Previous studies of the spatial constraints on bimodal audiovisual integration have mainly focused on the spatial congruity of audiovisual information. However, the effect of spatial reliability within audiovisual information on bimodal audiovisual integration remains unclear. In this study, we used event-related potentials (ERPs) to examine the effect of spatial reliability of task-irrelevant sounds on audiovisual integration. Three relevant ERP components emerged: the first at 140-200 ms over a wide central area, the second at 280-320 ms over the fronto-central area, and a third at 380-440 ms over the parieto-occipital area. Our results demonstrate that ERP amplitudes elicited by audiovisual stimuli with reliable spatial relationships are larger than those elicited by stimuli with inconsistent spatial relationships. In addition, we hypothesized that spatial reliability within an audiovisual stimulus enhances feedback projections to the primary visual cortex from multisensory integration regions. Overall, our findings suggest that the spatial linking of visual and auditory information depends on spatial reliability within an audiovisual stimulus and occurs at a relatively late stage of processing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sinistore, Julie C.; Reinemann, D. J.; Izaurralde, Roberto C.
Spatial variability in yields and greenhouse gas emissions from soils has been identified as a key source of variability in life cycle assessments (LCAs) of agricultural products such as cellulosic ethanol. This study aims to conduct an LCA of cellulosic ethanol production from switchgrass in a way that captures this spatial variability and tests results for sensitivity to using spatially averaged results. The Environment Policy Integrated Climate (EPIC) model was used to calculate switchgrass yields, greenhouse gas (GHG) emissions, and nitrogen and phosphorus emissions from crop production in southern Wisconsin and Michigan at the watershed scale. These data were combined with cellulosic ethanol production data via ammonia fiber expansion and dilute acid pretreatment methods and region-specific electricity production data into an LCA model of eight ethanol production scenarios. Standard deviations from the spatial mean yields and soil emissions were used to test the sensitivity of net energy ratio, global warming potential intensity, and eutrophication and acidification potential metrics to spatial variability. Substantial variation in the eutrophication potential was also observed when nitrogen and phosphorus emissions from soils were varied. This work illustrates the need for spatially explicit agricultural production data in the LCA of biofuels and other agricultural products.
Wirojanagud, Wanpen; Srisatit, Thares
2014-01-01
A fuzzy overlay approach on three raster maps, covering land slope, soil type, and distance to stream, can be used to identify the locations with the highest potential for arsenic contamination in soils. High arsenic contamination was verified by collecting samples, analyzing their arsenic content, and interpolating the surface with a spatial anisotropic method. A total of 51 soil samples were collected at the potential contaminated locations identified by the fuzzy overlay approach. At each location, soil samples were taken at a depth of 0.00-1.00 m below the ground surface. The interpolated surface of the analysed arsenic content was used to verify the potential contamination locations obtained from the fuzzy overlay outputs. The outputs of the anisotropic surface interpolation and the fuzzy overlay mapping showed significant spatial agreement. Three contaminated areas with arsenic concentrations of 7.19 ± 2.86, 6.60 ± 3.04, and 4.90 ± 2.67 mg/kg exceeded the arsenic content of 3.9 mg/kg, the maximum concentration level (MCL) for agricultural soils as designated by the Office of National Environment Board of Thailand. It is concluded that fuzzy overlay mapping can be employed to identify potential contamination areas, verified by the anisotropic surface approach together with intensive sampling and analysis of the substances of interest. PMID:25110751
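The fuzzy overlay step can be sketched as follows; the membership thresholds below are illustrative assumptions for a single raster cell, since the abstract does not give the study's actual membership functions:

```python
def linear_membership(value, lo, hi):
    # Map a raw raster value to a fuzzy membership score in [0, 1].
    if value <= lo:
        return 0.0
    if value >= hi:
        return 1.0
    return (value - lo) / (hi - lo)

def fuzzy_and(*memberships):
    # Fuzzy AND overlay: a cell scores high only if all criteria do.
    return min(memberships)

# One cell: slope in degrees, a soil-suitability score, distance to
# stream in metres (all thresholds below are invented for illustration).
slope_m = 1.0 - linear_membership(12.0, 0.0, 30.0)    # flatter = higher
soil_m = linear_membership(0.8, 0.0, 1.0)
stream_m = 1.0 - linear_membership(150.0, 0.0, 500.0)  # nearer = higher
score = fuzzy_and(slope_m, soil_m, stream_m)
```

Applying this per cell across the three rasters yields the potential-contamination surface; a fuzzy OR (max) or gamma operator could be substituted where criteria are compensatory.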
Integrative Spatial Data Analytics for Public Health Studies of New York State
Chen, Xin; Wang, Fusheng
2016-01-01
Increased accessibility of health data made available by the government provides a unique opportunity for spatial analytics with much higher resolution to discover patterns of diseases and their correlation with spatial impact indicators. This paper demonstrates our vision of integrative spatial analytics for public health by linking the New York Cancer Mapping Dataset with datasets containing potential spatial impact indicators. We performed spatially based discovery of disease patterns and variations across New York State, and identified potential correlations between diseases and demographic, socio-economic and environmental indicators. Our methods were validated by three correlation studies: the correlation between stomach cancer and Asian race, the correlation between breast cancer and high education population, and the correlation between lung cancer and air toxics. Our work will allow public health researchers, government officials or other practitioners to adequately identify, analyze, and monitor health problems at the community or neighborhood level for New York State. PMID:28269834
The Ability of Young Korean Children to Use Spatial Representations
ERIC Educational Resources Information Center
Kim, Minsung; Bednarz, Robert; Kim, Jaeyil
2012-01-01
The National Research Council emphasizes using tools of representation as an essential element of spatial thinking. However, it is debatable at what age the use of spatial representation for spatial thinking skills should begin. This study investigated whether young Korean children possess the potential to understand map-like representation using…
Hippocampal CA1 Kindling but Not Long-Term Potentiation Disrupts Spatial Memory Performance
ERIC Educational Resources Information Center
Leung, L. Stan; Shen, Bixia
2006-01-01
Long-term synaptic enhancement in the hippocampus has been suggested to cause deficits in spatial performance. Synaptic enhancement has been reported after hippocampal kindling that induced repeated electrographic seizures or afterdischarges (ADs) and after long-term potentiation (LTP) defined as synaptic enhancement without ADs. We studied…
Zhong, Wei-Ping; Belić, Milivoj R
2010-05-01
We report on the nonlinear tunneling effects of spatial solitons of the generalized nonlinear Schrödinger equation with distributed coefficients in an external harmonic potential. By using the homogeneous balance principle and the F-expansion technique we find the spatial bright and dark soliton solutions. We then display tunneling effects of such solutions occurring under special conditions, specifically when the spatial solitons pass unchanged through potential barriers and wells created by special choices of the diffraction and/or the nonlinearity coefficients. Our results show that the solitons display tunneling effects not only when passing through the nonlinear potential barriers or wells but also when passing through the diffractive barriers or wells. During tunneling the solitons may also undergo a controllable compression.
Hess, Katherine C; Epting, William K; Litster, Shawn
2011-12-15
We report the development and use of a microstructured electrode scaffold (MES) to make spatially resolved, in situ, electrolyte potential measurements through the thickness of a polymer electrolyte fuel cell (PEFC) electrode. This new approach uses a microfabricated apparatus to analyze the coupled transport and electrochemical phenomena in porous electrodes at the microscale. In this study, the MES allows the fuel cell to run under near-standard operating conditions, while providing electrolyte potential measurements at discrete distances through the electrode's thickness. Here we use spatial distributions of electrolyte potential to evaluate the effects of Ohmic and mass transport resistances on the through-plane reaction distribution for various operating conditions. Additionally, we use the potential distributions to estimate the ionic conductivity of the electrode. Our results indicate the in situ conductivity is higher than typically estimated for PEFC electrodes based on bulk polymer electrolyte membrane (PEM) conductivity.
Estimating floodplain sedimentation in the Laguna de Santa Rosa, Sonoma County, CA
Curtis, Jennifer A.; Flint, Lorraine E.; Hupp, Cliff R.
2013-01-01
We present a conceptual and analytical framework for predicting the spatial distribution of floodplain sedimentation for the Laguna de Santa Rosa, Sonoma County, CA. We assess the role of the floodplain as a sink for fine-grained sediment and investigate concerns regarding the potential loss of flood storage capacity due to historic sedimentation. We characterized the spatial distribution of sedimentation during a post-flood survey and developed a spatially distributed sediment deposition potential map that highlights zones of floodplain sedimentation. The sediment deposition potential map, built using raster files that describe the spatial distribution of relevant hydrologic and landscape variables, was calibrated using 2 years of measured overbank sedimentation data and verified using longer-term rates determined using dendrochronology. The calibrated floodplain deposition potential relation was used to estimate an average annual floodplain sedimentation rate (3.6 mm/year) for the ~11 km² floodplain. This study documents the development of a conceptual model of overbank sedimentation, describes a methodology to estimate the potential for various parts of a floodplain complex to accumulate sediment over time, and provides estimates of short- and long-term overbank sedimentation rates that can be used for ecosystem management and prioritization of restoration activities.
Implications of the spatial dynamics of fire spread for the bistability of savanna and forest.
Schertzer, E; Staver, A C; Levin, S A
2015-01-01
The role of fire in expanding the global distribution of savanna is well recognized. Empirical observations and modeling suggest that fire spread has a threshold response to fuel-layer continuity, which sets up a positive feedback that maintains savanna-forest bistability. However, modeling has so far failed to examine fire spread as a spatial process that interacts with vegetation. Here, we use simple, well-supported assumptions about fire spread as an infection process and its effects on trees to ask whether spatial dynamics qualitatively change the potential for savanna-forest bistability. We show that the spatial effects of fire spread are the fundamental reason that bistability is possible: because fire spread is an infection process, it exhibits a threshold response to fuel continuity followed by a rapid increase in fire size. Other ecological processes may also contribute, including temporal variability in demography or fire spread. Finally, including the potential for spatial aggregation increases the potential both for savanna-forest bistability and for savanna and forest to coexist in a landscape mosaic.
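The threshold response of fire spread to fuel continuity can be illustrated with a minimal site-percolation sketch; the grid size, seed, and 4-neighbour spread rule are assumptions for illustration, not the authors' model:

```python
import random
from collections import deque

def burned_fraction(p, size=40, seed=1):
    # Fraction of all cells burned when fire enters a random fuel layer.
    # Each cell holds fuel with probability p; fire starts along the top
    # edge and spreads to 4-neighbour fuel cells (an infection process).
    rng = random.Random(seed)
    fuel = [[rng.random() < p for _ in range(size)] for _ in range(size)]
    burned = set((0, c) for c in range(size) if fuel[0][c])
    queue = deque(burned)
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < size and 0 <= nc < size and fuel[nr][nc]
                    and (nr, nc) not in burned):
                burned.add((nr, nc))
                queue.append((nr, nc))
    return len(burned) / (size * size)

low = burned_fraction(0.3)   # below the percolation threshold (~0.59)
high = burned_fraction(0.8)  # above it: fire sweeps most of the grid
```

The sharp jump in burned fraction as fuel cover crosses the percolation threshold is the infection-process behavior that makes the grass-fuel feedback, and hence bistability, possible.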
Teaching the Gifted Visual Spatial Learner
ERIC Educational Resources Information Center
Freed, Jeff
2006-01-01
In working with right-brained or visual spatial children for the past 20 years, the author has noticed that they all learn in a similar manner. He has also noticed that a high percentage of gifted children are visual spatial learners. The more visual spatial a child is, the higher the potential for school difficulties. Since most teachers are…
Crooks, Valorie A; Schuurman, Nadine
2012-08-01
Primary health care (PHC) encompasses an array of health and social services that focus on preventative, diagnostic, and basic care measures to maintain wellbeing and address illnesses. In Canada, PHC involves the provision of first-contact health care services by providers such as family physicians and general practitioners - collectively referred to as PHC physicians here. Ensuring access is a key requirement of effective PHC delivery. This is because having access to PHC has been shown to positively impact a number of health outcomes. We build on recent innovations in measuring potential spatial access to PHC physicians using geographic information systems (GIS) by running and then interpreting the findings of a modified gravity model. Elsewhere we have introduced the protocol for this model. In this article we run it for five selected Canadian provinces and territories. Our objectives are to present the results of the modified gravity model in order to: (1) understand how potential spatial access to PHC physicians can be interpreted in these Canadian jurisdictions, and (2) provide guidance regarding how findings of the modified gravity model should be interpreted in other analyses. Regarding the first objective, two distinct spatial patterns emerge regarding potential spatial access to PHC physicians in the five selected Canadian provinces: (1) a clear north-south pattern, where southern areas have greater potential spatial access than northern areas; and (2) while gradients of potential spatial access exist in and around urban areas, access outside of densely-to-moderately populated areas is fairly binary. Regarding the second objective, we identify three principles that others can use to interpret the findings of the modified gravity model when used in other research contexts. Future applications of the modified gravity model are needed in order to refine the recommendations we provide on interpreting its results.
It is important that studies are undertaken that can help administrators, policy-makers, researchers, and others with characterizing the state of access to PHC, including potential spatial access. We encourage further research to be done using GIS in order to offer new, spatial perspectives on issues of access to health services given the increased recognition that the place-based nature of health services can benefit from the use of the capabilities of GIS to enhance the role that visualization plays in decision-making.
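The core of a gravity model of potential spatial access can be sketched as follows. This is the unmodified textbook form, A_i = Σ_j S_j / d_ij^β, not the authors' modified protocol, and all coordinates and supply values below are invented for illustration:

```python
def gravity_access(pop_points, physician_points, beta=1.0):
    # Potential spatial access score for each population point:
    # sum over supply sites of (supply / distance^beta).
    scores = []
    for px, py in pop_points:
        total = 0.0
        for sx, sy, supply in physician_points:
            # Clamp tiny distances to avoid division by zero when a
            # population point coincides with a supply point.
            d = max(((px - sx) ** 2 + (py - sy) ** 2) ** 0.5, 0.1)
            total += supply / d ** beta
        scores.append(total)
    return scores

# A "southern" town near two clinics versus a "northern" town far from both.
south, north = (0.0, 0.0), (0.0, 100.0)
clinics = [(1.0, 0.0, 5.0), (3.0, 4.0, 3.0)]
access = gravity_access([south, north], clinics)
```

Modified gravity models refine this skeleton with calibrated distance-decay functions and adjustment for competition among the population for the same supply.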
Quantum interference between transverse spatial waveguide modes.
Mohanty, Aseema; Zhang, Mian; Dutt, Avik; Ramelow, Sven; Nussenzveig, Paulo; Lipson, Michal
2017-01-20
Integrated quantum optics has the potential to markedly reduce the footprint and resource requirements of quantum information processing systems, but its practical implementation demands broader utilization of the available degrees of freedom within the optical field. To date, integrated photonic quantum systems have primarily relied on path encoding. However, in the classical regime, the transverse spatial modes of a multi-mode waveguide have been easily manipulated using the waveguide geometry to densely encode information. Here, we demonstrate quantum interference between the transverse spatial modes within a single multi-mode waveguide using quantum circuit-building blocks. This work shows that spatial modes can be controlled to an unprecedented level and have the potential to enable practical and robust quantum information processing.
Interactive (statistical) visualisation and exploration of a billion objects with vaex
NASA Astrophysics Data System (ADS)
Breddels, M. A.
2017-06-01
With new catalogues arriving such as Gaia DR1, containing more than a billion objects, new methods of handling and visualizing these data volumes are needed. We show that by calculating statistics on a regular (N-dimensional) grid, visualizations of a billion objects can be done within a second on a modern desktop computer. This is achieved using memory mapping of hdf5 files together with a simple binning algorithm, which are part of a Python library called vaex. This enables efficient interactive exploration of large datasets, making science exploration of large catalogues feasible. Vaex is a Python library and an application that allows for interactive exploration and visualization. The motivation for developing vaex is the catalogue of the Gaia satellite; however, vaex can also be used on SPH or N-body simulations, any other (future) catalogues such as SDSS, Pan-STARRS, LSST, etc., or other tabular data. The homepage for vaex is http://vaex.astro.rug.nl.
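The grid-binning idea behind vaex can be sketched in a few lines of NumPy: map each point to a cell of a regular grid and accumulate counts in a single pass. This is an illustrative sketch, not vaex's actual implementation (which additionally memory-maps hdf5 columns so the data never has to fit in RAM); the function name and parameters here are ours.

```python
import numpy as np

def grid_count(x, y, bins=256, limits=((0.0, 1.0), (0.0, 1.0))):
    """Count points on a regular 2-D grid -- the core operation behind
    density visualisations of very large catalogues."""
    (x0, x1), (y0, y1) = limits
    # Map coordinates to integer bin indices (truncation = flooring here).
    ix = ((x - x0) / (x1 - x0) * bins).astype(np.int64).clip(0, bins - 1)
    iy = ((y - y0) / (y1 - y0) * bins).astype(np.int64).clip(0, bins - 1)
    # One pass over the data: accumulate counts with bincount.
    grid = np.bincount(ix * bins + iy, minlength=bins * bins)
    return grid.reshape(bins, bins)

rng = np.random.default_rng(0)
x = rng.random(1_000_000)
y = rng.random(1_000_000)
grid = grid_count(x, y)
```

The resulting grid can be displayed directly as an image; because the pass over the data is a single vectorized reduction, it scales to the billion-row regime the abstract describes when combined with chunked, memory-mapped input.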
Ganalyzer: A tool for automatic galaxy image analysis
NASA Astrophysics Data System (ADS)
Shamir, Lior
2011-05-01
Ganalyzer is a model-based tool that automatically analyzes and classifies galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large datasets of galaxy images collected by autonomous sky surveys such as SDSS, LSST or DES.
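The radial-intensity step can be illustrated with a short sketch: sample the image on a circle around the galaxy centre and locate peaks; tracking how the peak angle shifts with radius gives the spirality measure the abstract describes. This is a toy illustration, not Ganalyzer's code; the function name and the test image are ours.

```python
import numpy as np

def radial_intensity(image, cx, cy, r, n_theta=360):
    """Sample pixel intensities on a circle of radius r around (cx, cy).
    In a spiral galaxy the peaks of this curve shift with r; the slope of
    that shift measures the spirality (tightness of the arms)."""
    theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    xs = np.clip((cx + r * np.cos(theta)).round().astype(int), 0, image.shape[1] - 1)
    ys = np.clip((cy + r * np.sin(theta)).round().astype(int), 0, image.shape[0] - 1)
    return image[ys, xs]

# Toy image: a single bright spot at angle 0, radius 20 from the centre.
# A non-spiral feature stays at the same angle at every radius (zero slope).
img = np.zeros((101, 101))
img[50, 70] = 1.0
profile = radial_intensity(img, 50, 50, 20)
```

Repeating this at several radii and fitting a line through the peak angles yields the slope that Ganalyzer uses to separate spiral from elliptical morphologies.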
LSST system analysis and integration task for an advanced science and application space platform
NASA Technical Reports Server (NTRS)
1980-01-01
To support the development of an advanced science and application space platform (ASASP), requirements were derived from a representative set of payloads requiring large separation distances, selected from the Science and Applications Space Platform data base. These payloads were a 100 meter diameter atmospheric gravity wave antenna, a 100 meter by 100 meter particle beam injection experiment, a 2 meter diameter, 18 meter long astrometric telescope, and a 15 meter diameter, 35 meter long large ambient deployable IR telescope. A low earth orbit at 500 km altitude and 56 deg inclination was selected as the best compromise for meeting payload requirements. Platform subsystems were defined to support the payload requirements, and a physical platform concept was developed. Structural system requirements were developed, including utilities accommodation, interface requirements, and platform strength and stiffness requirements. An attitude control system concept was also described. The resultant ASASP concept was analyzed and the technological developments deemed necessary in the area of large space systems were recommended.
NASA Astrophysics Data System (ADS)
Yoon, Mijin; Jee, Myungkook James; Tyson, Tony
2018-01-01
The Deep Lens Survey (DLS), a precursor to the Large Synoptic Survey Telescope (LSST), is a 20 sq. deg survey carried out with NOAO’s Blanco and Mayall telescopes. The strength of the survey lies in its depth reaching down to ~27th mag in BVRz bands. This enables a broad redshift baseline study and allows us to investigate cosmological evolution of the large-scale structure. In this poster, we present the first cosmological analysis from the DLS using galaxy-shear correlations and galaxy clustering signals. Our DLS shear calibration accuracy has been validated through the most recent public weak-lensing data challenge. Photometric redshift systematic errors are tested by performing lens-source flip tests. Instead of real-space correlations, we reconstruct band-limited power spectra for cosmological parameter constraints. Our analysis puts a tight constraint on the matter density and the power spectrum normalization parameters. Our results are highly consistent with our previous cosmic shear analysis and also with the Planck CMB results.
Variability Analysis: Detection and Classification
NASA Astrophysics Data System (ADS)
Eyer, L.
2005-01-01
The Gaia mission will offer an exceptional opportunity to perform variability studies. The homogeneity of the data, the optimised photometric system composed of 11 medium and 4-5 broad bands, the high photometric precision in the G band of one milli-mag for V = 13-15, the radial velocity measurements, and the exquisite astrometric precision for one billion stars will permit a detailed description of variable objects like stars, quasars and asteroids. However, the time sampling and the total number of measurements change from one object to another because of the satellite scanning law. The data analysis is a challenge because of the huge amount of data, the complexity of the observed objects and the peculiarities of the satellite, and needs thorough preparation. Experience can be gained from the study of past and present survey analyses and results, and Gaia should be put in perspective with future large-scale surveys like Pan-STARRS or LSST. We present the activities of the Variable Star Working Group and a general plan to digest this unprecedented data set, focusing here on the photometry.
Primary results from the Pan-STARRS-1 Outer Solar System Key Project
NASA Astrophysics Data System (ADS)
Holman, Matthew J.; Chen, Ying-Tung; Lackner, Michael; Payne, Matthew John; Lin, Hsing-Wen; Fraser, Wesley Cristopher; Lacerda, Pedro; Pan-STARRS 1 Science Consortium
2016-10-01
We have completed a search for slow moving bodies in the data obtained by the Pan-STARRS-1 (PS1) Science Consortium from 2010 to 2014. The data set covers the full sky north of -30 degrees declination, in the PS1 g, r, i, z, y, and w (g+r+i) filters. Our novel distance-based search is effective at detecting and linking very slow moving objects with sparsely sampled observations, even if observations are widely separated in RA, Dec and time, which is relevant to the future LSST solar system searches. In particular, our search is sensitive to objects at heliocentric distances of 25-2000 AU with magnitudes brighter than approximately r=22.5, without limits on the inclination of the object. We recover hundreds of known TNOs and Centaurs and discover hundreds of new objects, measuring phase and color information for many of them. Other highlights include the discovery of a second retrograde TNO, a number of Neptune Trojans, and large numbers of distant resonant TNOs.
zBEAMS: a unified solution for supernova cosmology with redshift uncertainties
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roberts, Ethan; Lochner, Michelle; Bassett, Bruce A.
Supernova cosmology without spectra will be an important component of future surveys such as LSST. This lack of supernova spectra results in uncertainty in the redshifts which, if ignored, leads to significantly biased estimates of cosmological parameters. Here we present a hierarchical Bayesian formalism, zBEAMS, that addresses this problem by marginalising over the unknown or uncertain supernova redshifts to produce unbiased cosmological estimates that are competitive with supernova data with spectroscopically confirmed redshifts. zBEAMS provides a unified treatment of both photometric redshifts and host galaxy misidentification (occurring due to chance galaxy alignments or faint hosts), effectively correcting the inevitable contamination in the Hubble diagram. Like its predecessor BEAMS, our formalism also takes care of non-Ia supernova contamination by marginalising over the unknown supernova type. We illustrate this technique with simulations of supernovae with photometric redshifts and host galaxy misidentification. A novel feature of the photometric redshift case is the important role played by the redshift distribution of the supernovae.
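The marginalisation idea can be sketched numerically: integrate the distance-modulus likelihood against the photometric-redshift distribution on a grid. This toy uses a linear low-z Hubble law and invented numbers; it is not the zBEAMS code, which additionally marginalises over host misidentification and supernova type.

```python
import numpy as np

C = 299792.458  # speed of light, km/s

def mu_model(z, H0=70.0):
    """Toy distance modulus from a linear Hubble law, d_L = c*z/H0 (Mpc);
    valid only at low redshift -- a real analysis uses the full distance."""
    return 5.0 * np.log10(C * z / H0) + 25.0

def loglike_marginalized(mu_obs, sig_mu, z_phot, sig_z, H0=70.0):
    """Gaussian likelihood of one supernova's distance modulus, marginalized
    over its uncertain redshift on a grid:
    L = sum_z N(mu_obs | mu(z), sig_mu) * N(z | z_phot, sig_z) * dz."""
    z = np.linspace(max(z_phot - 5 * sig_z, 1e-4), z_phot + 5 * sig_z, 2001)
    dz = z[1] - z[0]
    pz = np.exp(-0.5 * ((z - z_phot) / sig_z) ** 2)
    pz /= pz.sum() * dz                      # normalize the redshift prior
    pmu = np.exp(-0.5 * ((mu_obs - mu_model(z, H0)) / sig_mu) ** 2)
    return np.log((pmu * pz).sum() * dz)

ll = loglike_marginalized(mu_obs=35.5, sig_mu=0.15, z_phot=0.03, sig_z=0.01)
```

Because the likelihood is averaged over plausible redshifts rather than evaluated at a single (possibly wrong) one, the effective error widens and the bias from treating z_phot as exact disappears.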
Systematic Serendipity: A Method to Discover the Anomalous
NASA Astrophysics Data System (ADS)
Giles, Daniel; Walkowicz, Lucianne
2018-01-01
One of the challenges in the era of big data astronomical surveys is identifying anomalous data: data that exhibit as-of-yet unobserved behavior. These data may result from systematic errors, extreme (or rare) forms of known phenomena, or, most interestingly, truly novel phenomena that have historically required a trained eye and often fortuitous circumstance to identify. We describe a method that uses machine clustering techniques to discover anomalous data in Kepler lightcurves, as a step towards systematizing the detection of novel phenomena in the era of LSST. As a proof of concept, we apply our anomaly detection method to Kepler data including Boyajian's Star (KIC 8462852). We examine quarters 4, 8, 11, and 16 of the Kepler data, which contain Boyajian's Star acting normally (quarters 4 and 11) and anomalously (quarters 8 and 16). We demonstrate that our method is capable of identifying Boyajian's Star's anomalous behavior in the quarters of interest, and we further identify other anomalous light curves that exhibit a range of interesting variability.
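A minimal version of clustering-based anomaly detection can be sketched as follows: cluster light-curve feature vectors with a small k-means and score each object by its distance to the nearest cluster centre, so that objects far from every cluster are flagged. The features, data, and parameters here are invented for illustration; the abstract does not specify the authors' actual feature extraction or clustering choices.

```python
import numpy as np

def kmeans_anomaly_scores(X, k=3, n_iter=20):
    """Cluster feature vectors with a small k-means, then score each point
    by its distance to the nearest cluster centre; points far from every
    cluster get high scores and are candidate anomalies."""
    centers = X[:k].copy()                 # simple deterministic init
    for _ in range(n_iter):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    return d.min(axis=1)

# Toy features (e.g. amplitude, periodicity strength) for 300 "normal"
# light curves plus one extreme outlier standing in for an anomalous source.
rng = np.random.default_rng(1)
X = rng.normal(0.0, 1.0, size=(300, 2))
X = np.vstack([X, [[8.0, 8.0]]])
scores = kmeans_anomaly_scores(X)
```

Ranking objects by this score surfaces the outlier first, which is the basic mechanism by which a quarter of anomalous Boyajian's-Star-like behavior would rise to the top of a candidate list.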
A model to forecast data centre infrastructure costs.
NASA Astrophysics Data System (ADS)
Vernet, R.
2015-12-01
The computing needs in the HEP community are increasing steadily, but the current funding situation in many countries is tight. As a consequence, experiments, data centres, and funding agencies have to rationalize resource usage and expenditures. CC-IN2P3 (Lyon, France) provides computing resources to many experiments including LHC, and is a major partner for astroparticle projects like LSST, CTA or Euclid. The financial cost of accommodating all these experiments is substantial and has to be planned well in advance for funding and strategic reasons. In that perspective, leveraging the infrastructure expenses, electric power costs and hardware performance observed at our site over the last years, we have built a model that integrates these data and provides estimates of the investments that would be required to cater to the experiments in the mid-term future. We present how our model is built and the expenditure forecast it produces, taking into account the experiment roadmaps. We also examine the resource growth predicted by our model over the next years assuming a flat-budget scenario.
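A flat-budget forecast of this general kind can be caricatured in a few lines: constant annual spend, price/performance that improves each year, and a fixed hardware lifetime after which machines are retired. All numbers below are invented for illustration and are not CC-IN2P3's figures or model.

```python
def capacity_forecast(budget, cost_per_unit, improvement=0.15, lifetime=4, years=8):
    """Toy flat-budget model: each year the same budget buys hardware whose
    cost per unit of capacity falls by `improvement`; machines are retired
    after `lifetime` years. Returns installed capacity per year."""
    purchases = []   # capacity bought in each year
    capacity = []    # total capacity still in service each year
    cost = cost_per_unit
    for y in range(years):
        purchases.append(budget / cost)
        cost *= (1.0 - improvement)  # next year's hardware is cheaper per unit
        # Installed capacity = purchases still within their service lifetime.
        capacity.append(sum(purchases[max(0, y - lifetime + 1): y + 1]))
    return capacity

cap = capacity_forecast(budget=100.0, cost_per_unit=1.0)
```

Even under a flat budget, capacity grows in this toy because each replacement cycle buys more per unit of money; whether that growth keeps pace with experiment roadmaps is exactly the question such a model is built to answer.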
Recovering Galaxy Properties Using Gaussian Process SED Fitting
NASA Astrophysics Data System (ADS)
Iyer, Kartheik; Awan, Humna
2018-01-01
Information about physical quantities like the stellar mass, star formation rates, and ages of distant galaxies is contained in their spectral energy distributions (SEDs), obtained through photometric surveys like SDSS, CANDELS, LSST, etc. However, noise in the photometric observations is often a problem, and using naive machine learning methods to estimate physical quantities can result in overfitting the noise or converging on solutions that lie outside the physical regime of parameter space. We use Gaussian Process regression trained on a sample of SEDs corresponding to galaxies from a semi-analytic model (Somerville+15a) to estimate their stellar masses, and compare its performance to a variety of different methods, including simple linear regression, Random Forests, and k-Nearest Neighbours. We find that the Gaussian Process method is robust to noise and predicts not only stellar masses but also their uncertainties. The method is also robust when the distribution of the training data is not identical to that of the target data, which can be extremely useful when generalized to more subtle galaxy properties.
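The core of GP regression with an explicit noise term, which is what makes the fit robust to photometric scatter and yields predictive uncertainties, fits in a short sketch. This is a generic textbook GP, not the authors' pipeline; the 1-D toy data and hyperparameters are ours.

```python
import numpy as np

def gp_predict(X_train, y_train, X_test, length=1.0, sigma_f=1.0, sigma_n=0.1):
    """Minimal Gaussian-process regression with an RBF kernel: returns the
    predictive mean and standard deviation at X_test. The noise variance
    sigma_n**2 on the diagonal is what keeps the fit from chasing noise."""
    def rbf(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=2)
        return sigma_f ** 2 * np.exp(-0.5 * d2 / length ** 2)

    K = rbf(X_train, X_train) + sigma_n ** 2 * np.eye(len(X_train))
    Ks = rbf(X_test, X_train)
    Kss = rbf(X_test, X_test)
    mean = Ks @ np.linalg.solve(K, y_train)
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.sqrt(np.clip(np.diag(cov), 0.0, None))

# Toy stand-in for "noisy colours -> stellar mass": 1-D inputs, sin target.
rng = np.random.default_rng(2)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.1, 40)
mean, std = gp_predict(X, y, np.array([[0.0]]))
```

The predictive standard deviation is the feature the abstract highlights: it grows where training data are sparse, flagging target galaxies whose properties the model cannot constrain.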
Spatial modeling of potential woody biomass flow
Woodam Chung; Nathaniel Anderson
2012-01-01
The flow of woody biomass to end users is determined by economic factors, especially the amount available across a landscape and the cost of delivery to bioenergy facilities. The objective of this study was to develop a methodology for quantifying landscape-level stocks and potential biomass flows using currently available spatial databases and road network analysis tools. We applied this...
USDA-ARS?s Scientific Manuscript database
This paper reviews the literature and reports on the current state of knowledge regarding the potential for managers to use visual (VC), auditory (AC), and olfactory (OC) cues to manage foraging behavior and spatial distribution of rangeland livestock. We present evidence that free-ranging livestock...
Spatial adaptation of the cortical visual evoked potential of the cat.
Bonds, A B
1984-06-01
Adaptation that is spatially specific for the adapting pattern has been seen psychophysically in humans. This is indirect evidence for independent analyzers (putatively single units) that are specific for orientation and spatial frequency in the human visual system, but it is unclear how global adaptation characteristics may be related to single unit performance. Spatially specific adaptation was sought in the cat visual evoked potential (VEP), with a view towards relating this phenomenon with what we know of cat single units. Adaptation to sine-wave gratings results in a temporary loss of cat VEP amplitude, with induction and recovery similar to that seen in human psychophysical experiments. The amplitude loss was specific for both the spatial frequency and orientation of the adapting pattern. The bandwidth of adaptation was not unlike the average selectivity of a population of cat single units.
NASA Astrophysics Data System (ADS)
Balss, Karin Maria
The research contained in this thesis is focused on the formation and characterization of surface composition gradients on thin gold films that are formed by applications of in-plane potential gradients. Injecting milliamp currents into thin Au films yields significant in-plane voltage drops, so that, rather than assuming a single value of potential, an in-plane potential gradient is imposed on the film which depends on the resistivity of the film, the cross-sectional area, and the magnitude of the potential drop. Furthermore, the in-plane electric potential gradient means that, relative to a solution reference couple, electrochemical reactions occur at defined spatial positions corresponding to the local potential, V(x) ≈ E0. The spatial gradient in electrochemical potential can then produce spatially dependent electrochemistry. Surface-chemical potential gradients can be prepared by arranging the spread of potentials to span an electrochemical wave mediating redox-associated adsorption or desorption. Examples of reactions that can be spatially patterned include the electrosorption of alkanethiols and over-potential metal deposition. The unique advantage of this method for patterning spatial compositions is the control of surface coverage in both space and time. The thesis is organized into two parts. In Part I, the formation and characterization of 1- and 2-component alkanethiol monolayer gradients is investigated. Numerous surface science tools are employed to examine the distribution in coverage obtained by application of in-plane potential gradients. Macroscopic characterization was obtained by sessile water drop contact angle measurements and surface plasmon resonance imaging. Gradients were also imaged on micron length scales with pulsed-force mode atomic force microscopy. Direct chemical evidence of surface compositions in aromatic thiol surface coverage was obtained by surface-enhanced Raman spectroscopy.
In Part II, applications of in-plane potential gradients are discussed. Electrochemical reactions other than electrosorption of alkanethiols were demonstrated with over-potential deposition of copper onto gold films. One application of these patterns is to control the movement of supermolecular objects. As a first step towards this goal, biological cells were seeded onto gradient patterns containing adhesion promoters and inhibitors. Their morphology and adhesion were investigated as a function of concentration along the gradient.
Chromatic and Achromatic Spatial Resolution of Local Field Potentials in Awake Cortex
Jansen, Michael; Li, Xiaobing; Lashgari, Reza; Kremkow, Jens; Bereshpolova, Yulia; Swadlow, Harvey A.; Zaidi, Qasim; Alonso, Jose-Manuel
2015-01-01
Local field potentials (LFPs) have become an important measure of neuronal population activity in the brain and could provide robust signals to guide the implant of visual cortical prosthesis in the future. However, it remains unclear whether LFPs can detect weak cortical responses (e.g., cortical responses to equiluminant color) and whether they have enough visual spatial resolution to distinguish different chromatic and achromatic stimulus patterns. By recording from awake behaving macaques in primary visual cortex, here we demonstrate that LFPs respond robustly to pure chromatic stimuli and exhibit ∼2.5 times lower spatial resolution for chromatic than achromatic stimulus patterns, a value that resembles the ratio of achromatic/chromatic resolution measured with psychophysical experiments in humans. We also show that, although the spatial resolution of LFP decays with visual eccentricity as is also the case for single neurons, LFPs have higher spatial resolution and show weaker response suppression to low spatial frequencies than spiking multiunit activity. These results indicate that LFP recordings are an excellent approach to measure spatial resolution from local populations of neurons in visual cortex including those responsive to color. PMID:25416722
ERIC Educational Resources Information Center
Yoon, So Yoon; Mann, Eric L.
2017-01-01
Spatial ability has been valued as a talent domain and as an assessment form that reduces cultural, linguistic, and socioeconomic status biases, yet little is known of the spatial ability of students in gifted programs compared with those in general education. Spatial ability is considered an important indicator of potential talent in the domains…
Spatial-Temporal Dynamics of Urban Fire Incidents: a Case Study of Nanjing, China
NASA Astrophysics Data System (ADS)
Yao, J.; Zhang, X.
2016-06-01
Fire and rescue services are among the fundamental public services provided by government to protect people, property, and the environment from fires and other disasters, and thus promote a safer living environment. A good understanding of the spatial-temporal dynamics of fire incidents can offer insights into the potential determinants of various fire events and enable better fire risk estimation, assisting the future allocation of prevention resources and the strategic planning of mitigation programs. Using a 12-year (2002-2013) dataset containing the urban fire events in Nanjing, China, this research explores the spatial-temporal dynamics of urban fire incidents. A range of exploratory spatial data analysis (ESDA) approaches and tools, such as spatial kernel density and co-maps, are employed to examine the spatial, temporal and spatial-temporal variations of the fire events. Particular attention has been paid to two types of fire incidents, residential properties and local facilities, due to their relatively higher occurrence frequencies. The results demonstrate that the number of urban fires has greatly increased in the last decade and that the spatial-temporal distribution of fire events varies among incident types, implying varying impacts of potential influencing factors for further investigation.
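One of the ESDA tools mentioned, spatial kernel density, can be sketched directly: sum a Gaussian kernel over incident locations on a regular grid to produce a hot-spot surface. The coordinates, bandwidth, and grid below are invented for illustration, not the Nanjing data.

```python
import numpy as np

def kde_grid(points, grid_x, grid_y, bandwidth=0.5):
    """Gaussian kernel density estimate of incident locations on a regular
    grid -- the basic operation behind spatial hot-spot maps of fire events."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    density = np.zeros_like(gx)
    for px, py in points:
        d2 = (gx - px) ** 2 + (gy - py) ** 2
        density += np.exp(-0.5 * d2 / bandwidth ** 2)
    # Normalize so the surface integrates (approximately) to 1.
    density /= (2.0 * np.pi * bandwidth ** 2 * len(points))
    return density

# Toy incidents clustered near (2, 2): the density surface peaks there.
rng = np.random.default_rng(3)
pts = rng.normal(2.0, 0.3, size=(200, 2))
gx = np.linspace(0, 4, 41)
gy = np.linspace(0, 4, 41)
dens = kde_grid(pts, gx, gy)
```

Computing such surfaces per year or per incident type, and comparing them side by side, is essentially what co-maps do for the spatial-temporal variation the abstract describes.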
Spatial Resolution Requirements for Accurate Identification of Drivers of Atrial Fibrillation
Roney, Caroline H.; Cantwell, Chris D.; Bayer, Jason D.; Qureshi, Norman A.; Lim, Phang Boon; Tweedy, Jennifer H.; Kanagaratnam, Prapa; Vigmond, Edward J.; Ng, Fu Siong
2017-01-01
Background— Recent studies have demonstrated conflicting mechanisms underlying atrial fibrillation (AF), with the spatial resolution of data often cited as a potential reason for the disagreement. The purpose of this study was to investigate whether the variation in spatial resolution of mapping may lead to misinterpretation of the underlying mechanism in persistent AF. Methods and Results— Simulations of rotors and focal sources were performed to estimate the minimum number of recording points required to correctly identify the underlying AF mechanism. The effects of different data types (action potentials and unipolar or bipolar electrograms) and rotor stability on resolution requirements were investigated. We also determined the ability of clinically used endocardial catheters to identify AF mechanisms using clinically recorded and simulated data. The spatial resolution required for correct identification of rotors and focal sources is a linear function of spatial wavelength (the distance between wavefronts) of the arrhythmia. Rotor localization errors are larger for electrogram data than for action potential data. Stationary rotors are more reliably identified compared with meandering trajectories, for any given spatial resolution. All clinical high-resolution multipolar catheters are of sufficient resolution to accurately detect and track rotors when placed over the rotor core although the low-resolution basket catheter is prone to false detections and may incorrectly identify rotors that are not present. Conclusions— The spatial resolution of AF data can significantly affect the interpretation of the underlying AF mechanism. Therefore, the interpretation of human AF data must be taken in the context of the spatial resolution of the recordings. PMID:28500175
VoPham, Trang; Hart, Jaime E; Laden, Francine; Chiang, Yao-Yi
2018-04-17
Geospatial artificial intelligence (geoAI) is an emerging scientific discipline that combines innovations in spatial science, artificial intelligence methods in machine learning (e.g., deep learning), data mining, and high-performance computing to extract knowledge from spatial big data. In environmental epidemiology, exposure modeling is a commonly used approach to conduct exposure assessment to determine the distribution of exposures in study populations. geoAI technologies provide important advantages for exposure modeling in environmental epidemiology, including the ability to incorporate large amounts of big spatial and temporal data in a variety of formats; computational efficiency; flexibility in algorithms and workflows to accommodate relevant characteristics of spatial (environmental) processes including spatial nonstationarity; and scalability to model other environmental exposures across different geographic areas. The objectives of this commentary are to provide an overview of key concepts surrounding the evolving and interdisciplinary field of geoAI including spatial data science, machine learning, deep learning, and data mining; recent geoAI applications in research; and potential future directions for geoAI in environmental epidemiology.
Calculating potential fields using microchannel spatial light modulators
NASA Technical Reports Server (NTRS)
Reid, Max B.
1993-01-01
We describe and present experimental results of the optical calculation of potential field maps suitable for mobile robot navigation. The optical computation employs two write modes of a microchannel spatial light modulator (MSLM). In one mode, written patterns expand spatially, and this characteristic is used to create an extended two dimensional function representing the influence of the goal in a robot's workspace. Distinct obstacle patterns are written in a second, non-expanding, mode. A model of the mechanisms determining MSLM write mode characteristics is developed and used to derive the optical calculation time for full potential field maps. Field calculations at a few hertz are possible with current technology, and calculation time vs. map size scales favorably in comparison to digital electronic computation.
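A digital analogue of the optically computed map can be sketched as a standard attractive/repulsive potential field on a grid: a bowl centred on the goal (the expanding write mode's role) plus short-range peaks around obstacles (the non-expanding mode). This illustrates the kind of field being computed, not the MSLM optics; the gains and radii are arbitrary.

```python
import numpy as np

def potential_field(shape, goal, obstacles, k_att=1.0, k_rep=50.0, r0=5.0):
    """Grid potential field for navigation: quadratic attraction toward the
    goal plus repulsion that acts only within radius r0 of each obstacle.
    A robot descends the gradient of this field toward the goal."""
    ys, xs = np.mgrid[0:shape[0], 0:shape[1]]
    d_goal = np.hypot(ys - goal[0], xs - goal[1])
    field = 0.5 * k_att * d_goal ** 2
    for oy, ox in obstacles:
        d = np.hypot(ys - oy, xs - ox)
        mask = d < r0
        field[mask] += 0.5 * k_rep * (1.0 / (d[mask] + 1e-6) - 1.0 / r0) ** 2
    return field

field = potential_field((50, 50), goal=(25, 40), obstacles=[(25, 20), (10, 10)])
```

The field's unique minimum sits at the goal, with obstacle cells raised into steep local maxima; the paper's contribution is computing exactly this kind of map optically, at rates set by the MSLM write modes rather than by digital arithmetic.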
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Hugh H.; Balasubramanian, V.; Bernstein, G.
The University of Pennsylvania elementary particle physics/particle cosmology group, funded by the Department of Energy Office of Science, participates in research in high energy physics and particle cosmology that addresses some of the most important unanswered questions in science. The research is divided into five areas. Energy Frontier - We participate in the study of proton-proton collisions at the Large Hadron Collider in Geneva, Switzerland using the ATLAS detector. The University of Pennsylvania group was responsible for the design, installation, and commissioning of the front-end electronics for the Transition Radiation Tracker (TRT) and plays the primary role in its maintenance and operation. We play an important role in the triggering of ATLAS, and we have made large contributions to the TRT performance and to the study and identification of electrons, photons, and taus. We have been actively involved in searches for the Higgs boson and for SUSY and other exotic particles. We have made significant contributions to measurement of Standard Model processes such as inclusive photon production and WW pair production. We also have participated significantly in R&D for upgrades to the ATLAS detector. Cosmic Frontier - The Dark Energy Survey (DES) telescope will be used to elucidate the nature of dark energy and the distribution of dark matter. Penn has played a leading role both in the use of weak gravitational lensing of distant galaxies and in the discovery of large numbers of distant supernovae. The techniques and forecasts developed at Penn are also guiding the development of the proposed Large Synoptic Survey Telescope (LSST). We are also developing a new detector, MiniClean, to search for direct detection of dark matter particles. Intensity Frontier - We are participating in the design and R&D of detectors for the Long Baseline Neutrino Experiment (now DUNE), a new experiment to study the properties of neutrinos.
Advanced Technology R&D - We have an extensive involvement in electronics required for sophisticated new detectors at the LHC and are developing electronics for the LSST camera. Theoretical Physics - We are carrying out a broad program studying the fundamental forces of nature, early universe cosmology, and mathematical physics. Our activities span the range from model building, formal field theory, and string theory to new paradigms for cosmology and the interface of string theory with mathematics. Our effort combines extensive development of the formal aspects of string theory with a focus on real phenomena in particle physics, cosmology and gravity.
[In search for neurophysiological criteria of altered consciousness].
Sviderskaia, N E
2002-01-01
Neurophysiological approaches to brain mechanisms of consciousness are discussed. The concept of spatial synchronization of nervous processes developed by M.N. Livanov is applied to the neurophysiological analysis of higher brain functions. However, spatial synchronization of brain potentials is only a condition for information processing and does not represent it as such, which imposes restrictions on conclusions about the neural mechanisms of consciousness. It is more adequate to use the concept of spatial synchronization within a view of consciousness as a psychophysiological level, alongside sub- and superconsciousness, in the three-level structure of mind according to P.V. Simonov. The forms of interaction between consciousness and the other levels bear on the problem of altered consciousness and may be reflected in various patterns of spatial organization of brain potentials.
Pixels, Blocks of Pixels, and Polygons: Choosing a Spatial Unit for Thematic Accuracy Assessment
Pixels, polygons, and blocks of pixels are all potentially viable spatial assessment units for conducting an accuracy assessment. We develop a statistical population-based framework to examine how the spatial unit chosen affects the outcome of an accuracy assessment. The populati...
The Impact of Participation in Music on Learning Mathematics
ERIC Educational Resources Information Center
Holmes, Sylwia; Hallam, Susan
2017-01-01
Music psychologists have established that some forms of musical activity improve intellectual performance, spatial-temporal reasoning and other skills advantageous for learning. In this research, the potential of active music-making for improving pupils' achievement in spatial-temporal reasoning was investigated. As spatial-temporal skills are…
Visual attention spreads broadly but selects information locally.
Shioiri, Satoshi; Honjyo, Hajime; Kashiwase, Yoshiyuki; Matsumiya, Kazumichi; Kuriki, Ichiro
2016-10-19
Visual attention spreads over a range around the focus, as the spotlight metaphor describes. The spatial spread of attentional enhancement and local selection/inhibition are crucial factors determining the profile of spatial attention. Enhancement and ignorance/suppression are opposite effects of attention and appear to be mutually exclusive, yet no unified view of these factors has been provided despite their necessity for understanding the functions of spatial attention. This report provides electroencephalographic and behavioral evidence for attentional spread at an early stage and selection/inhibition at a later stage of visual processing. The steady-state visual evoked potential showed broad spatial tuning, whereas the P3 component of the event-related potential showed local selection or inhibition of the adjacent areas. Based on these results, we propose a two-stage model of spatial attention with broad spread at an early stage and local selection at a later stage.
NASA Astrophysics Data System (ADS)
Penteado, Paulo F.; Trilling, David; Szalay, Alexander; Budavári, Tamás; Fuentes, César
2014-11-01
We are building the first system that will allow efficient data mining in the astronomical archives for observations of Solar System Bodies. While the Virtual Observatory has enabled data-intensive research making use of large collections of observations across multiple archives, Planetary Science has largely been denied this opportunity: most astronomical data services are built based on sky positions, and moving objects are often filtered out. To identify serendipitous observations of Solar System objects, we ingest the archive metadata. The coverage of each image in an archive is a volume in a 3D space (RA, Dec, time), which we can represent efficiently through a hierarchical triangular mesh (HTM) for the spatial dimensions, plus a contiguous time interval. In this space, an asteroid occupies a curve, which we determine by integrating its orbit into the past. Thus, when an asteroid trajectory intercepts the volume of an archived image, we have a possible observation of that body. Our pipeline then looks in the archive's catalog for a source with the corresponding coordinates, to retrieve its photometry. All these matches are stored in a database, which can be queried by object identifier. This database consists of archived observations of known Solar System objects. This means that it grows not only from the ingestion of new images, but also from the growth in the number of known objects. As new bodies are discovered, our pipeline can find archived observations where they could have been recorded, providing colors for these newly-found objects. This growth becomes more relevant with the new generation of wide-field surveys, particularly LSST. We also present one use case of our prototype archive: after ingesting the metadata for SDSS, 2MASS and GALEX, we were able to identify serendipitous observations of Solar System bodies in these 3 archives.
Cross-matching these occurrences provided us with colors from the UV to the IR, a much wider spectral range than that commonly used for asteroid taxonomy. We present here archive-derived spectrophotometry from searching for 440 thousand asteroids, from 0.3 to 3 µm. In the future we will expand to other archives, including HST, Spitzer, WISE and Pan-STARRS.
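The matching step this abstract describes can be sketched in simplified form. Below, a coarse RA/Dec cell grid stands in for the HTM (a real HTM footprint spans several trixels), and all names and values (`ArchiveIndex`, `sdss-0001`, the coordinates) are illustrative, not from the authors' pipeline.

```python
from collections import defaultdict

def cell_id(ra, dec, res=1.0):
    """Coarse sky cell (res degrees per side); a stand-in for an HTM trixel."""
    return (int(ra // res), int(dec // res))

class ArchiveIndex:
    """Index image footprints by sky cell plus a contiguous time interval."""
    def __init__(self):
        self.cells = defaultdict(list)  # cell -> [(t_start, t_end, image_id)]

    def add_image(self, image_id, ra, dec, t_start, t_end):
        # Simplification: one cell per image center instead of a full footprint.
        self.cells[cell_id(ra, dec)].append((t_start, t_end, image_id))

    def match(self, ephemeris):
        """ephemeris: [(t, ra, dec)] points from integrating the orbit into the past.
        Returns (image_id, t) pairs where the trajectory enters an image's volume."""
        hits = []
        for t, ra, dec in ephemeris:
            for t0, t1, image_id in self.cells.get(cell_id(ra, dec), ()):
                if t0 <= t <= t1:
                    hits.append((image_id, t))
        return hits

index = ArchiveIndex()
index.add_image("sdss-0001", ra=150.2, dec=2.1, t_start=100.0, t_end=100.1)
asteroid_track = [(99.9, 150.9, 2.4), (100.05, 150.4, 2.6)]
print(index.match(asteroid_track))  # [('sdss-0001', 100.05)]
```

Only the second ephemeris point lands inside both the image's sky cell and its time interval; in the real system such a hit triggers a catalog lookup for photometry at those coordinates.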
NASA Astrophysics Data System (ADS)
Bergström, Per; Lindegarth, Susanne; Lindegarth, Mats
2013-10-01
Human pressures on coastal seas are increasing, and methods for sustainable management, including spatial planning and mitigative actions, are therefore needed. In coastal areas worldwide, the development of mussel farming as an economically and ecologically sustainable industry requires geographic information on growth and potential production capacity. In practice this means that coherent maps of temporally stable spatial patterns of growth need to be available in the planning process, and that these maps need to be based on mechanistic or empirical models. Therefore, as a first step towards development of models of growth, we assessed empirically the fundamental requirement that there are temporally consistent spatial patterns of growth in the blue mussel, Mytilus edulis. Using a pilot study, we designed and dimensioned a transplant experiment in which the spatial consistency of mussel growth was evaluated at two resolutions. We found strong temporal and scale-dependent spatial variability in growth, and the data suggested that spatial patterns of shell growth and soft-tissue growth were uncoupled. Spatial patterns of shell growth were complex and largely inconsistent among years. Importantly, however, growth of soft tissue was qualitatively consistent among years at the scale of kilometers. The results suggest that processes affecting the whole coastal area cause substantial differences in soft-tissue growth among years, but that factors varying at the kilometer scale create strong and persistent spatial patterns of growth, with a potential doubling of productivity if the most suitable locations are identified. We conclude that the observed spatial consistency provides a basis for further development of predictive modelling and mapping of soft-tissue growth in these coastal areas.
Potential causes of observed patterns, consequences for mussel-farming as a tool for mitigating eutrophication, aspects of precision of modelling and sampling of mussel growth as well as ecological functions in general are discussed.
Serendipitous discovery of a strong-lensed galaxy in integral field spectroscopy from MUSE
NASA Astrophysics Data System (ADS)
Galbany, Lluís; Collett, Thomas E.; Méndez-Abreu, Jairo; Sánchez, Sebastián F.; Anderson, Joseph P.; Kuncarayakti, Hanindyo
2018-06-01
2MASX J04035024-0239275 is a bright red elliptical galaxy at redshift 0.0661 that presents two extended sources 2″ to the north-east and 1″ to the south-west. The sizes and surface brightnesses of the two blue sources are consistent with a gravitationally lensed background galaxy. In this paper we present MUSE observations of this galaxy from the All-weather MUse Supernova Integral-field Nearby Galaxies (AMUSING) survey, and report the discovery of a background lensed galaxy at redshift 0.1915, together with 15 other background galaxies at redshifts ranging from 0.09 to 0.9 that are not multiply imaged. We have extracted aperture spectra of the lens and all the sources and fit the stellar continuum with STARLIGHT to estimate their stellar and emission-line properties. Traces of past merger and active-nucleus activity are found in the lensing galaxy, while the background lensed galaxy is found to be star-forming. Modeling the lensing potential with a singular isothermal ellipsoid, we find an Einstein radius of 1.″45 ± 0.″04, which corresponds to 1.9 kpc at the redshift of the lens and is much smaller than the galaxy's effective radius (reff ∼ 9″). Comparing the Einstein mass and the STARLIGHT stellar mass within the same aperture yields a dark matter fraction of 18% ± 8% within the Einstein radius. The advent of large surveys such as the Large Synoptic Survey Telescope (LSST) will uncover many strongly lensed systems, and here we demonstrate how wide-field integral field spectroscopy offers an excellent approach to studying them and precisely modeling lensing effects.
THE HISTORY OF TIDAL DISRUPTION EVENTS IN GALACTIC NUCLEI
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aharon, Danor; Battisti, Alessandra Mastrobuono; Perets, Hagai B.
The tidal disruption of a star by a massive black hole (MBH) is thought to produce a transient luminous event. Such tidal disruption events (TDEs) may play an important role in the detection and characterization of MBHs, and in probing the properties and dynamics of their nuclear stellar cluster (NSC) hosts. Previous studies estimated the recent rates of TDEs in the local universe; however, the long-term evolution of the rates throughout the history of the universe has been little explored. Here we consider TDE history, using evolutionary models for galactic nuclei. We use a 1D Fokker–Planck approach to explore the evolution of MBH-hosting NSCs, and obtain the disruption rates of stars during their evolution. We complement these with an analysis of TDE history based on N-body simulation data, and find them to be comparable. We consider NSCs that are built up from close-in star formation (SF) or from far-out SF/cluster-dispersal, a few pc from the MBH. We also explore cases where primordial NSCs exist and later evolve through additional SF/cluster-dispersal processes. We study the dependence of the TDE history on the type of galaxy, as well as on the MBH mass. These provide several scenarios, with a continuous increase of the TDE rates over time for cases of far-out SF and a more complex behavior for the close-in SF cases. Finally, we integrate the TDE histories of the various scenarios to provide a total TDE history of the universe, which can potentially be probed with future large surveys (e.g., LSST).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yao, Ji; Ishak, Mustapha; Lin, Weikang
Intrinsic alignments (IA) of galaxies have been recognized as one of the most serious contaminants of weak lensing. These systematics need to be isolated and mitigated in order for ongoing and future lensing surveys to reach their full potential. The IA self-calibration (SC) method was shown in previous studies to reduce the GI contamination by up to a factor of 10 for the 2-point and 3-point correlations. The SC method does not require the assumption of an IA model and can extract the GI signal from the same photo-z survey, offering the possibility to test and understand structure-formation scenarios and their relationship to IA models. In this paper, we study the effects of the IA SC mitigation method on the precision and accuracy of cosmological parameter constraints from the future cosmic-shear surveys LSST, WFIRST and Euclid. We perform analytical and numerical calculations to estimate the loss of precision and the residual bias in the best-fit cosmological parameters after self-calibration is performed, taking into account uncertainties from photometric redshifts and the galaxy bias. We find that the confidence contours are slightly inflated by applying the SC method itself, while a significant increase is due to the inclusion of the photo-z uncertainties. The bias of cosmological parameters is reduced from several σ, when IA is not corrected for, to below 1σ after SC is applied. These numbers are comparable to those resulting from marginalizing over IA model parameters, despite the fact that the two methods operate very differently. We conclude that implementing SC for these future cosmic-shear surveys will not only efficiently mitigate the GI contaminant but also help us understand IA modeling and its link to structure formation.
Probing Inflation Using Galaxy Clustering On Ultra-Large Scales
NASA Astrophysics Data System (ADS)
Dalal, Roohi; de Putter, Roland; Dore, Olivier
2018-01-01
A detailed understanding of curvature perturbations in the universe is necessary to constrain theories of inflation. In particular, measurements of the local non-Gaussianity parameter, f_NL^loc, enable us to distinguish between two broad classes of inflationary theories, single-field and multi-field inflation. While most single-field theories predict f_NL^loc ≈ -(5/12)(n_s - 1), in multi-field theories f_NL^loc is not constrained to this value and is allowed to be observably large. Achieving σ(f_NL^loc) = 1 would give us discovery potential for detecting multi-field inflation, while finding f_NL^loc = 0 would rule out a good fraction of interesting multi-field models. We study the use of galaxy clustering on ultra-large scales to achieve this level of constraint on f_NL^loc. Upcoming surveys such as Euclid and LSST will give us galaxy catalogs from which we can construct the galaxy power spectrum and hence infer a value of f_NL^loc. We consider two possible methods of determining the galaxy power spectrum from a catalog of galaxy positions: the traditional Feldman-Kaiser-Peacock (FKP) power spectrum estimator, and an optimal quadratic estimator (OQE). We implemented and tested each method using mock galaxy catalogs, and compared the resulting constraints on f_NL^loc. We find that the FKP estimator can measure f_NL^loc in an unbiased way, but there remains room for improvement in its precision. We also find that the OQE is not computationally fast, but remains a promising option due to its ability to isolate the power spectrum at large scales. We plan to extend this research to study alternative methods, such as pixel-based likelihood functions, and to study the impact of general relativistic effects at these scales on our ability to measure f_NL^loc.
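In its simplest limit (uniform weights, periodic box, no survey window or shot noise), the FKP-type power-spectrum estimate this abstract refers to reduces to squaring Fourier modes of the galaxy overdensity field. A minimal 1D sketch with illustrative values, not the authors' implementation:

```python
import numpy as np

def power_spectrum_1d(delta, box_size):
    """P(k) of a 1D periodic overdensity field with uniform (trivial FKP) weights."""
    n = delta.size
    delta_k = np.fft.rfft(delta) * box_size / n        # approximate continuum Fourier transform
    k = 2 * np.pi * np.fft.rfftfreq(n, d=box_size / n) # angular wavenumbers of the modes
    pk = np.abs(delta_k) ** 2 / box_size               # P(k) = |delta_k|^2 / L
    return k, pk

# Sanity check: a single cosine mode of amplitude A gives P = A^2 L / 4 at its wavenumber.
L, n, A = 100.0, 256, 0.1
x = np.linspace(0, L, n, endpoint=False)
delta = A * np.cos(2 * np.pi * 4 * x / L)              # mode number 4, i.e. k = 8*pi/L
k, pk = power_spectrum_1d(delta, L)
print(np.isclose(pk[4], A**2 * L / 4))                 # True
```

A real FKP estimator additionally applies the optimal weights w = 1/(1 + n̄ P) to the galaxy and random catalogs, subtracts shot noise, and bins |δ_k|² in shells of k; scale-dependent bias from f_NL^loc then appears as excess power at the smallest k.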
Disentangling dark energy and cosmic tests of gravity from weak lensing systematics
NASA Astrophysics Data System (ADS)
Laszlo, Istvan; Bean, Rachel; Kirk, Donnacha; Bridle, Sarah
2012-06-01
We consider the impact of key astrophysical and measurement systematics on constraints on dark energy and modifications to gravity on cosmic scales. We focus on upcoming photometric ‘stage III’ and ‘stage IV’ large-scale structure surveys such as the Dark Energy Survey (DES), the Subaru Measurement of Images and Redshifts survey, the Euclid survey, the Large Synoptic Survey Telescope (LSST) and Wide Field Infra-Red Space Telescope (WFIRST). We illustrate the different redshift dependencies of gravity modifications compared to intrinsic alignments, the main astrophysical systematic. The way in which systematic uncertainties, such as galaxy bias and intrinsic alignments, are modelled can change dark energy equation-of-state parameter and modified gravity figures of merit by a factor of 4. The inclusion of cross-correlations of cosmic shear and galaxy position measurements helps reduce the loss of constraining power from the lensing shear surveys. When forecasts for Planck cosmic microwave background and stage IV surveys are combined, constraints on the dark energy equation-of-state parameter and modified gravity model are recovered, relative to those from shear data with no systematic uncertainties, provided fewer than 36 free parameters in total are used to describe the galaxy bias and intrinsic alignment models as a function of scale and redshift. While some uncertainty in the intrinsic alignment (IA) model can be tolerated, it is going to be important to be able to parametrize IAs well in order to realize the full potential of upcoming surveys. To facilitate future investigations, we also provide a fitting function for the matter power spectrum arising from the phenomenological modified gravity model we consider.
Searching for Exoplanets using Artificial Intelligence
NASA Astrophysics Data System (ADS)
Pearson, Kyle Alexander; Palafox, Leon; Griffith, Caitlin Ann
2017-10-01
In the last decade, over a million stars were monitored to detect transiting planets. The large volume of data obtained from current and future missions (e.g., Kepler, K2, TESS and LSST) requires automated methods to detect the signature of a planet. Manual interpretation of potential exoplanet candidates is labor-intensive and subject to human error, the results of which are difficult to quantify. Here we present a new method of detecting exoplanet candidates in large planetary search projects which, unlike current methods, uses a neural network. Neural networks, also called "deep learning" or "deep nets", are a state-of-the-art machine learning technique designed to give a computer perception of a specific problem by training it to recognize patterns. Unlike past transit detection algorithms, the deep net learns to characterize the data instead of relying on hand-coded metrics that humans perceive as the most representative. Exoplanet transits have different shapes, as a result of, e.g., the planetary and stellar atmospheres and the transit geometry. Thus, a simple template does not suffice to capture the subtle details, especially if the signal is below the noise or strong systematics are present. Current false-positive rates from the Kepler data are estimated at around 12.3% for Earth-like planets, and there has been no study of the false-negative rates. It is therefore important to ask how the properties of current algorithms affect the results of the Kepler mission and future missions such as TESS, which flies next year. These uncertainties affect the fundamental research derived from such missions, including the discovery of habitable planets, estimates of their occurrence rates, and our understanding of the nature and evolution of planetary systems.
Spatial Visualisation and Cognitive Style: How Do Gender Differences Play Out?
ERIC Educational Resources Information Center
Ramful, Ajay; Lowrie, Tom
2015-01-01
This study investigated potential gender differences in a sample of 807 Year 6 Singaporean students in relation to two variables: spatial visualisation ability and cognitive style. In contrast to the general trend, overall there were no significant gender differences on spatial visualisation ability. However, gender differences were prevalent…
Facilitating Spatial Thinking in World Geography Using Web-Based GIS
ERIC Educational Resources Information Center
Jo, Injeong; Hong, Jung Eun; Verma, Kanika
2016-01-01
Advocates for geographic information system (GIS) education contend that learning about GIS promotes students' spatial thinking. Empirical studies are still needed to elucidate the potential of GIS as an instructional tool to support spatial thinking in other geography courses. Using a non-equivalent control group research design, this study…
A review of potential image fusion methods for remote sensing-based irrigation management: Part II
USDA-ARS?s Scientific Manuscript database
Satellite-based sensors provide data at either greater spectral and coarser spatial resolutions, or lower spectral and finer spatial resolutions due to complementary spectral and spatial characteristics of optical sensor systems. In order to overcome this limitation, image fusion has been suggested ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Petrović, V. M.; Miladinović, T. B., E-mail: tanja.miladinovic@gmail.com
2016-05-15
Within the framework of the Ammosov–Delone–Krainov theory, we consider the angular and energy distribution of outgoing electrons due to ionization by a circularly polarized electromagnetic field. A correction of the ground ionization potential by the ponderomotive and Stark shift is incorporated in both distributions. Spatial dependence is analyzed.
Razmjou, Amir; Asadnia, Mohsen; Ghaebi, Omid; Yang, Hao-Cheng; Ebrahimi Warkiani, Majid; Hou, Jingwei; Chen, Vicki
2017-11-01
In this work, a thin, dense zeolitic imidazolate framework (ZIF-8) pattern was generated using photolithography and a nanoscale (60 nm) dopamine coating. A bioinspired, unique, reversible, two-color iridescent pattern can easily be obtained for potential applications in sensing and photonics.
Das, Ayan; Bhattacharya, Pallab; Heo, Junseok; Banerjee, Animesh; Guo, Wei
2013-01-01
A spatial potential trap is formed in a 6.0-μm Al(Ga)N nanowire by varying the Al composition along its length during epitaxial growth. The polariton emission characteristics of a dielectric microcavity with the single nanowire embedded in-plane have been studied at room temperature. Excitation is provided at the Al(Ga)N end of the nanowire, and polariton emission is observed from the lowest bandgap GaN region within the potential trap. Comparison of the results with those measured in an identical microcavity with a uniform GaN nanowire and having an identical exciton–photon detuning suggests evaporative cooling of the polaritons as they are transported into the trap in the Al(Ga)N nanowire. Measurement of the spectral characteristics of the polariton emission, their momentum distribution, first-order spatial coherence, and time-resolved measurements of polariton cooling provides strong evidence of the formation of a near-equilibrium Bose–Einstein condensate in the GaN region of the nanowire at room temperature. In contrast, the condensate formed in the uniform GaN nanowire–dielectric microcavity without the spatial potential trap is only in self-equilibrium. PMID:23382183
Moriguchi, Sachiko; Tominaga, Atsushi; Irwin, Kelly J; Freake, Michael J; Suzuki, Kazutaka; Goka, Koichi
2015-04-08
Batrachochytrium dendrobatidis (Bd) is the pathogen responsible for chytridiomycosis, a disease that is associated with a worldwide amphibian population decline. In this study, we predicted the potential distribution of Bd in East and Southeast Asia based on limited occurrence data. Our goal was to design an effective survey area where efforts to detect the pathogen can be focused. We generated ecological niche models using the maximum-entropy approach, with alleviation of multicollinearity and spatial autocorrelation. We applied eigenvector-based spatial filters as independent variables, in addition to environmental variables, to resolve spatial autocorrelation, and compared the model's accuracy and the degree of spatial autocorrelation with those of a model estimated using only environmental variables. We were able to identify areas of high suitability for Bd with accuracy. Among the environmental variables, factors related to temperature and precipitation were more effective in predicting the potential distribution of Bd than factors related to land use and cover type. Our study successfully predicted the potential distribution of Bd in East and Southeast Asia. This information should now be used to prioritize survey areas and generate a surveillance program to detect the pathogen.
Spatial navigation by congenitally blind individuals.
Schinazi, Victor R; Thrash, Tyler; Chebat, Daniel-Robert
2016-01-01
Spatial navigation in the absence of vision has been investigated from a variety of perspectives and disciplines. These different approaches have advanced our understanding of spatial knowledge acquisition by blind individuals, including their abilities, strategies, and corresponding mental representations. In this review, we propose a framework for investigating differences in spatial knowledge acquisition by blind and sighted people consisting of three longitudinal models (i.e., convergent, cumulative, and persistent). Recent advances in neuroscience and technological devices have provided novel insights into the different neural mechanisms underlying spatial navigation by blind and sighted people and the potential for functional reorganization. Despite these advances, there is still a lack of consensus regarding the extent to which locomotion and wayfinding depend on amodal spatial representations. This challenge largely stems from methodological limitations such as heterogeneity in the blind population and terminological ambiguity related to the concept of cognitive maps. Coupled with an over-reliance on potential technological solutions, the field has diffused into theoretical and applied branches that do not always communicate. Here, we review research on navigation by congenitally blind individuals with an emphasis on behavioral and neuroscientific evidence, as well as the potential of technological assistance. Throughout the article, we emphasize the need to disentangle strategy choice and performance when discussing the navigation abilities of the blind population. © 2015 The Authors. WIREs Cognitive Science published by Wiley Periodicals, Inc.
Leng, Shuai; Rajendran, Kishore; Gong, Hao; Zhou, Wei; Halaweish, Ahmed F; Henning, Andre; Kappler, Steffen; Baer, Matthias; Fletcher, Joel G; McCollough, Cynthia H
2018-05-28
The aims of this study were to quantitatively assess two new scan modes on a photon-counting-detector computed tomography system, each designed to maximize spatial resolution, and to qualitatively demonstrate their potential clinical impact using patient data. This Health Insurance Portability and Accountability Act-compliant study was approved by our institutional review board. Two high-spatial-resolution scan modes (Sharp and UHR) were evaluated using phantoms to quantify spatial resolution and image noise, and results were compared with the standard mode (Macro). Patients were scanned using a conventional energy-integrating-detector scanner and the photon-counting-detector scanner at the same radiation dose. In initial patient images, anatomic details were qualitatively evaluated to demonstrate potential clinical impact. Sharp and UHR modes showed a 69% and 87% improvement in in-plane spatial resolution, respectively, compared with Macro mode (10% modulation transfer function (MTF) values of 16.05, 17.69, and 9.48 lp/cm, respectively). The cutoff spatial frequency of the UHR mode (32.4 lp/cm) corresponded to a limiting spatial resolution of 150 μm. The full-width-at-half-maximum values of the section sensitivity profiles were 0.41, 0.44, and 0.67 mm for the thinnest image thickness of each mode (0.25, 0.25, and 0.5 mm, respectively). At the same in-plane spatial resolution, Sharp and UHR images had up to 15% lower noise than Macro images. Patient images acquired in Sharp mode showed better delineation of fine anatomic structures than Macro mode images. Phantom studies demonstrated superior resolution and noise properties for the Sharp and UHR modes relative to the standard Macro mode, and patient images demonstrated the potential benefit of these scan modes for clinical practice.
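The quoted limiting resolution follows from the cutoff frequency by simple arithmetic: one line pair spans two resolution elements, so the smallest resolvable detail is d = 1/(2f). A quick check of the numbers in this abstract:

```python
def limiting_resolution_um(f_lp_per_cm):
    """Smallest resolvable detail d = 1 / (2 f), converted from cm to micrometers."""
    return 1.0 / (2.0 * f_lp_per_cm) * 1e4

# UHR cutoff of 32.4 lp/cm from the abstract above:
print(round(limiting_resolution_um(32.4)))  # 154, i.e. ~150 um as quoted
```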
NASA Astrophysics Data System (ADS)
Mosquera, Martín A.
2017-10-01
Provided the initial state, the Runge-Gross theorem establishes that the time-dependent (TD) external potential of a system of non-relativistic electrons determines uniquely their TD electronic density, and vice versa (up to a constant in the potential). This theorem requires the TD external potential and density to be Taylor-expandable around the initial time of the propagation. This paper presents an extension without this restriction. Given the initial state of the system and evolution of the density due to some TD scalar potential, we show that a perturbative (not necessarily weak) TD potential that induces a non-zero divergence of the external force-density, inside a small spatial subset and immediately after the initial propagation time, will cause a change in the density within that subset, implying that the TD potential uniquely determines the TD density. In this proof, we assume unitary evolution of wavefunctions and first-order differentiability (which does not imply analyticity) in time of the internal and external force-densities, electronic density, current density, and their spatial derivatives over the small spatial subset and short time interval.
Cicore, Pablo; Serrano, João; Shahidian, Shakib; Sousa, Adelia; Costa, José Luis; da Silva, José Rafael Marques
2016-09-01
Little information is available on the degree of within-field variability of the potential production of tall wheatgrass (Thinopyrum ponticum) forage under unirrigated conditions. The aim of this study was to characterize the spatial variability of accumulated biomass (AB) without nutritional limitations through vegetation indexes, and then use this information to delineate potential management zones. A 27 × 27 m grid cell size was chosen and 84 biomass sampling areas (BSA), each 2 m² in size, were georeferenced. Nitrogen and phosphorus fertilizers were applied after an initial cut at 3 cm height. At 500 °C·day of accumulated thermal time, the AB from each sampling area was collected and evaluated. The spatial variability of AB was estimated most accurately using the Normalized Difference Vegetation Index (NDVI) calculated from LANDSAT 8 images obtained on 24 November 2014 (NDVInov) and 10 December 2014 (NDVIdec), because the potential AB was highly associated with NDVInov and NDVIdec (r² = 0.85 and 0.83, respectively). The models relating potential AB to NDVI were evaluated by root mean squared error (RMSE) and relative root mean squared error (RRMSE); the RRMSE was 12% and 15% for NDVInov and NDVIdec, respectively. The spatial correlation of potential AB and NDVI was quantified with semivariograms; the spatial dependence of AB was low. Six classes of NDVI were analyzed for comparison, and two management zones (MZ) were established from them. To evaluate whether the NDVI method allows us to delimit MZ with different attainable yields, the AB estimated for these MZ was compared through an ANOVA test. The potential AB differed significantly among MZ. Based on these findings, it can be concluded that NDVI obtained from LANDSAT 8 images can be reliably used for creating MZ in soils under permanent pastures dominated by tall wheatgrass.
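The NDVI and RRMSE quantities used in this abstract are straightforward to compute; the sketch below uses hypothetical reflectance and biomass values, not the study's data.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectances."""
    return (nir - red) / (nir + red)

def rrmse(observed, predicted):
    """Relative RMSE (%): RMSE normalized by the mean observed value."""
    observed = np.asarray(observed, float)
    predicted = np.asarray(predicted, float)
    rmse = np.sqrt(np.mean((observed - predicted) ** 2))
    return 100.0 * rmse / observed.mean()

# Hypothetical biomass (t/ha) at four sampling areas vs. an NDVI-based prediction.
obs = np.array([2.0, 2.5, 3.0, 3.5])
pred = np.array([2.2, 2.4, 2.9, 3.8])
print(round(rrmse(obs, pred), 1))  # 7.0
```

For Landsat 8, NDVI uses the band-5 (NIR) and band-4 (red) surface reflectances; an RRMSE of 12-15%, as reported above, indicates the model's typical error is a modest fraction of the mean observed biomass.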
Chromatic and Achromatic Spatial Resolution of Local Field Potentials in Awake Cortex.
Jansen, Michael; Li, Xiaobing; Lashgari, Reza; Kremkow, Jens; Bereshpolova, Yulia; Swadlow, Harvey A; Zaidi, Qasim; Alonso, Jose-Manuel
2015-10-01
Local field potentials (LFPs) have become an important measure of neuronal population activity in the brain and could provide robust signals to guide the implant of visual cortical prosthesis in the future. However, it remains unclear whether LFPs can detect weak cortical responses (e.g., cortical responses to equiluminant color) and whether they have enough visual spatial resolution to distinguish different chromatic and achromatic stimulus patterns. By recording from awake behaving macaques in primary visual cortex, here we demonstrate that LFPs respond robustly to pure chromatic stimuli and exhibit ∼2.5 times lower spatial resolution for chromatic than achromatic stimulus patterns, a value that resembles the ratio of achromatic/chromatic resolution measured with psychophysical experiments in humans. We also show that, although the spatial resolution of LFP decays with visual eccentricity as is also the case for single neurons, LFPs have higher spatial resolution and show weaker response suppression to low spatial frequencies than spiking multiunit activity. These results indicate that LFP recordings are an excellent approach to measure spatial resolution from local populations of neurons in visual cortex including those responsive to color. © The Author 2014. Published by Oxford University Press.
Spatialized audio improves call sign recognition during multi-aircraft control.
Kim, Sungbin; Miller, Michael E; Rusnock, Christina F; Elshaw, John J
2018-07-01
We investigated the impact of a spatialized audio display on response time, workload, and accuracy while monitoring auditory information for relevance. The human ability to differentiate sound direction implies that spatial audio may be used to encode information. We therefore hypothesized that spatial audio cues can aid differentiation of critical versus noncritical verbal auditory information. We used a human performance model and a laboratory study involving 24 participants to examine the effect of applying a notional, automated parser to present audio in a particular ear depending on information relevance. Operator workload and performance were assessed while subjects listened for and responded to relevant audio cues associated with critical information among additional noncritical information. Encoding relevance through spatial location in a spatialized audio display, as opposed to monophonic, binaural presentation, significantly reduced response time and workload, particularly for noncritical information. Future auditory displays employing spatial cues to indicate relevance have the potential to reduce workload and improve operator performance in similar task domains. Furthermore, these displays have the potential to reduce the dependence of workload and performance on the number of audio cues. Published by Elsevier Ltd.
Application of Remote Sensing for Generation of Groundwater Prospect Map
NASA Astrophysics Data System (ADS)
Inayathulla, Masool
2016-07-01
In developing accurate hydrogeomorphological analyses, the ability to monitor, to generate information in the spatial and temporal domains, and to delineate land features is crucial for successful analysis and prediction of groundwater resources. Remote sensing (RS) and GIS can handle large amounts of spatial data and provide accurate information on geological and geomorphological characteristics, which are controlling factors for the occurrence and movement of groundwater; studies in various parts of India have used IRS LISS II data at 1:50,000 scale along with topographic maps to develop integrated groundwater potential zones. The present work integrates RS- and GIS-based analysis and methodology to identify groundwater potential zones in the Arkavathi Basin, Bangalore, study area. Information on geology, geomorphology, soil, slope, rainfall, water level and land use/land cover was gathered, and a GIS platform was used to integrate the various themes. The resulting composite map was classified according to the spatial variation of groundwater potential into five zones: poor, moderate to poor, moderate, good and very good. Hydrogeomorphological units such as valley fills and alluvial plains are potential zones for groundwater exploration and development, and valley fills associated with lineaments are highly promising areas for groundwater recharge. The spatial variation of the potential indicates that groundwater occurrence is controlled by geology, land use/land cover, slope and landforms.
Seedling establishment and physiological responses to temporal and spatial soil moisture changes
Jeremy Pinto; John D. Marshall; Kas Dumroese; Anthony S. Davis; Douglas R. Cobos
2016-01-01
In many forests of the world, the summer season (temporal element) brings drought conditions causing low soil moisture in the upper soil profile (spatial element) - a potentially large barrier to seedling establishment. We evaluated the relationship between initial seedling root depth, temporal and spatial changes in soil moisture during drought after...
Does Spatial Training Improve Children's Mathematics Ability?
ERIC Educational Resources Information Center
Cheng, Yi-Ling; Mix, Kelly
2011-01-01
The authors' primary aim was to investigate a potential causal relationship between spatial ability and math ability. To do so, they used a pretest-training-posttest experimental design in which children received short-term spatial training and were tested on problem solving in math. They focused on first and second graders because earlier studies…
Michael S. Hand; Matthew P. Thompson; Dave Calkin
2016-01-01
Increasing costs of wildfire management have highlighted the need to better understand suppression expenditures and potential tradeoffs of land management activities that may affect fire risks. Spatially and temporally descriptive data is used to develop a model of wildfire suppression expenditures, providing new insights into the role of spatial and temporal...
Potential for spatial management of hunted mammal populations in tropical forests
Miranda H. Mockrin; Kent H. Redford
2011-01-01
Unsustainable hunting in tropical forests threatens biodiversity and rural livelihoods, yet managing these harvests in remote forests with low scientific capacity and funding is challenging. In response, some conservationists propose managing harvests through spatial management, a system of establishing no-take zones where hunting is not allowed. Spatial management was...
Detecting spatial regimes in ecosystems
Sundstrom, Shana M.; Eason, Tarsha; Nelson, R. John; Angeler, David G.; Barichievy, Chris; Garmestani, Ahjond S.; Graham, Nicholas A.J.; Granholm, Dean; Gunderson, Lance; Knutson, Melinda; Nash, Kirsty L.; Spanbauer, Trisha; Stow, Craig A.; Allen, Craig R.
2017-01-01
Research on early warning indicators has generally focused on assessing temporal transitions with limited application of these methods to detecting spatial regimes. Traditional spatial boundary detection procedures that result in ecoregion maps are typically based on ecological potential (i.e. potential vegetation), and often fail to account for ongoing changes due to stressors such as land use change and climate change and their effects on plant and animal communities. We use Fisher information, an information theory-based method, on both terrestrial and aquatic animal data (U.S. Breeding Bird Survey and marine zooplankton) to identify ecological boundaries, and compare our results to traditional early warning indicators, conventional ecoregion maps and multivariate analyses such as nMDS and cluster analysis. We successfully detected spatial regimes and transitions in both terrestrial and aquatic systems using Fisher information. Furthermore, Fisher information provided explicit spatial information about community change that is absent from other multivariate approaches. Our results suggest that defining spatial regimes based on animal communities may better reflect ecological reality than do traditional ecoregion maps, especially in our current era of rapid and unpredictable ecological change.
Motor expertise and performance in spatial tasks: A meta-analysis.
Voyer, Daniel; Jansen, Petra
2017-08-01
The present study aimed to provide a summary of findings relevant to the influence of motor expertise on performance in spatial tasks and to examine potential moderators of this effect. Studies of relevance were those in which individuals involved in activities presumed to require motor expertise were compared to non-experts in such activities. A final set of 62 effect sizes from 33 samples was included in a multilevel meta-analysis. The results showed an overall advantage in favor of motor experts in spatial tasks (d=0.38). However, the magnitude of that effect was moderated by expert type (athlete, open skills/ball sports, runner/cyclist, gymnast/dancers, musicians), stimulus type (2D, blocks, bodies, others), test category (mental rotation, spatial perception, spatial visualization), specific test (Mental Rotations Test, generic mental rotation, disembedding, rod-and-frame test, other), and publication status. These findings are discussed in the context of embodied cognition and the potential role of activities requiring motor expertise in promoting good spatial performance. Copyright © 2017 Elsevier B.V. All rights reserved.
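The pooled effect size reported above (d = 0.38 across 62 effects) comes from a multilevel meta-analysis; as a simplified stand-in, the core inverse-variance pooling step can be sketched with a fixed-effect model. The effect sizes and variances below are invented for illustration, not the study's data.

```python
import math

# Minimal fixed-effect meta-analysis sketch (illustrative numbers only):
# each study contributes its standardized mean difference d weighted by
# the inverse of its sampling variance.
studies = [(0.45, 0.02), (0.30, 0.05), (0.40, 0.04)]  # (d, variance)

def pooled_effect(studies):
    weights = [1.0 / v for _, v in studies]
    d_bar = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))  # standard error of the pooled d
    return d_bar, se

d_bar, se = pooled_effect(studies)
print(round(d_bar, 3), round(se, 3))
```

A multilevel model, as used in the study, would additionally account for the nesting of multiple effect sizes within samples.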
Predicting the Spatial Distribution of Aspen Growth Potential in the Upper Great Lakes Region
Eric J. Gustafson; Sue M. Lietz; John L. Wright
2003-01-01
One way to increase aspen yields is to produce aspen on sites where aspen growth potential is highest. Aspen growth rates are typically predicted using site index, but this is impractical for landscape-level assessments. We tested the hypothesis that aspen growth can be predicted from site and climate variables and generated a model to map the spatial variability of...
S. Salazar; M. Mendoza; A. M. Tejeda
2006-01-01
A spatial model is presented to explain the concentration of heavy metals (Fe, Cu, Zn, Ni, Cr, Co and Pb) in the soils around the industrial complex near the Port of Veracruz, Mexico. Unexpectedly low-concentration sites were then tested to detect woody plant species that may have the capability to hyperaccumulate these contaminants, hence having a potential for...
Estimates of spatial and temporal variation of energy crops biomass yields in the US
NASA Astrophysics Data System (ADS)
Song, Y.; Jain, A. K.; Landuyt, W.; Kheshgi, H. S.
2013-12-01
Perennial grasses, such as switchgrass (Panicum virgatum) and Miscanthus (Miscanthus x giganteus), have been identified for potential use as biomass feedstocks in the US. Current research on perennial grass biomass production has been evaluated on small-scale plots. However, the extent to which this potential can be realized at a landscape scale will depend on the biophysical potential to grow these grasses with the minimum possible amount of land diverted from food to fuel production. To assess this potential, three questions about the biomass yield of these grasses need to be answered: (1) how the yields of different grasses vary spatially and temporally across the US; (2) whether the yields are temporally stable; and (3) how the spatial and temporal trends in yields of these perennial grasses are controlled by limiting factors, including soil type, water availability, climate, and crop varieties. To answer these questions, the growth processes of the perennial grasses are implemented in a coupled biophysical, physiological and biogeochemical model (ISAM). The model has been applied to quantitatively investigate the spatial and temporal trends in biomass yields over the period 1980-2010 in the US. The bioenergy grasses considered in this study include Miscanthus, Cave-in-Rock switchgrass and Alamo switchgrass. The effects of climate, soil and topography on the spatial and temporal trends of biomass yields are quantitatively analyzed using principal component analysis and GIS-based geographically weighted regression. The spatial-temporal trend results are further evaluated to classify each part of the US into four homogeneous potential yield zones: high and stable yield zone (HS), high but unstable yield zone (HU), low and stable yield zone (LS) and low but unstable yield zone (LU).
Our preliminary results indicate that the yields of perennial grasses in different zones are strongly related to different controlling factors. For example, the yield in the HS zone depends on soil and topography factors, whereas the yields in the HU zone are controlled more by climate factors, leading to a large uncertainty in the yield potential of bioenergy grasses under future climate change.
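The four-zone scheme above (HS/HU/LS/LU) amounts to summarizing each location's yearly yields by a level (mean) and a stability (variability) criterion. A minimal sketch, with hypothetical thresholds rather than those used with ISAM:

```python
import statistics

# Classify a grid cell's yield time series into the four zones named
# above: H/L by mean yield, S/U by coefficient of variation.
# Both thresholds are hypothetical, for illustration only.
MEAN_THRESHOLD = 15.0  # t/ha: hypothetical high/low split
CV_THRESHOLD = 0.20    # hypothetical stable/unstable split

def yield_zone(yearly_yields):
    mean = statistics.mean(yearly_yields)
    cv = statistics.stdev(yearly_yields) / mean
    level = "H" if mean >= MEAN_THRESHOLD else "L"
    stability = "S" if cv <= CV_THRESHOLD else "U"
    return level + stability  # "HS", "HU", "LS", or "LU"

print(yield_zone([18, 19, 20, 18, 19]))  # high mean, low variability
```

Applied per cell over the 1980-2010 simulated yields, this produces the homogeneous-zone map described in the abstract.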
Retrieved Products from Simulated Hyperspectral Observations of a Hurricane
NASA Technical Reports Server (NTRS)
Susskind, Joel; Kouvaris, Louis; Iredell, Lena; Blaisdell, John
2015-01-01
We demonstrate via Observing System Simulation Experiments (OSSEs) the potential utility of flying high-spatial-resolution AIRS-class IR sounders on future LEO and GEO missions. The study simulates and analyzes radiances for three sounders with AIRS spectral and radiometric properties on different orbits and with different spatial resolutions: 1) a control run at 13 km AIRS spatial resolution at nadir in LEO (Aqua orbit); 2) a 2 km spatial resolution LEO sounder at nadir (ARIES); 3) a 5 km spatial resolution sounder in GEO orbit, with radiances simulated every 72 minutes.
Stelzenmüller, V; Lee, J; Garnacho, E; Rogers, S I
2010-10-01
For the UK continental shelf we developed a Bayesian Belief Network-GIS framework to visualise relationships between cumulative human pressures, sensitive marine landscapes and landscape vulnerability, to assess the consequences of potential marine planning objectives, and to map uncertainty-related changes in management measures. Results revealed that the spatial assessment of footprints and intensities of human activities had more influence on landscape vulnerabilities than the type of landscape sensitivity measure used. We addressed questions regarding consequences of potential planning targets, and necessary management measures with spatially-explicit assessment of their consequences. We conclude that the BN-GIS framework is a practical tool allowing for the visualisation of relationships, the spatial assessment of uncertainty related to spatial management scenarios, the engagement of different stakeholder views, and enables a quick update of new spatial data and relationships. Ultimately, such BN-GIS based tools can support the decision-making process used in adaptive marine management. Copyright © 2010 Elsevier Ltd. All rights reserved.
Tang, Xiaoyu; Li, Chunlin; Li, Qi; Gao, Yulin; Yang, Weiping; Yang, Jingjing; Ishikawa, Soushirou; Wu, Jinglong
2013-10-11
Utilizing the high temporal resolution of event-related potentials (ERPs), we examined how visual spatial or temporal cues modulate auditory stimulus processing. The visual spatial cue (VSC) induces orienting of attention to spatial locations; the visual temporal cue (VTC) induces orienting of attention to temporal intervals. Participants were instructed to respond to auditory targets. Behavioral responses to auditory stimuli following the VSC were faster and more accurate than those following the VTC. VSC and VTC had the same effect on the auditory N1 (150-170 ms after stimulus onset). The mean amplitude of the auditory P1 (90-110 ms) in the VSC condition was larger than that in the VTC condition, and the mean amplitude of the late positivity (300-420 ms) in the VTC condition was larger than that in the VSC condition. These findings suggest that the modulation of auditory stimulus processing by visually induced spatial and temporal orienting of attention is different but partially overlapping. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Spatial modelling of landscape aesthetic potential in urban-rural fringes.
Sahraoui, Yohan; Clauzel, Céline; Foltête, Jean-Christophe
2016-10-01
The aesthetic potential of landscape has to be modelled to provide tools for land-use planning. This involves identifying landscape attributes and revealing individuals' landscape preferences. Landscape aesthetic judgments of individuals (n = 1420) were studied by means of a photo-based survey. A set of landscape visibility metrics was created to measure landscape composition and configuration in each photograph using spatial data. These metrics were used as explanatory variables in multiple linear regressions to explain aesthetic judgments. We demonstrate that landscape aesthetic judgments may be synthesized in three consensus groups. The statistical results obtained show that landscape visibility metrics have good explanatory power. Ultimately, we propose a spatial modelling of landscape aesthetic potential based on these results combined with systematic computation of visibility metrics. Copyright © 2016 Elsevier Ltd. All rights reserved.
Tu, Yiheng; Huang, Gan; Hung, Yeung Sam; Hu, Li; Hu, Yong; Zhang, Zhiguo
2013-01-01
Event-related potentials (ERPs) are widely used in brain-computer interface (BCI) systems as input signals conveying a subject's intention. A fast and reliable single-trial ERP detection method can be used to develop a BCI system with both high speed and high accuracy. However, most single-trial ERP detection methods are developed for offline EEG analysis and thus have high computational complexity and require manual operations. They are therefore not applicable to practical BCI systems, which require a low-complexity, automatic ERP detection method. This work presents a joint spatial-time-frequency filter that combines common spatial patterns (CSP) and wavelet filtering (WF) to improve the signal-to-noise ratio (SNR) of visual evoked potentials (VEPs), which can lead to a single-trial ERP-based BCI.
Evaluating Sentinel-2 for Lakeshore Habitat Mapping Based on Airborne Hyperspectral Data.
Stratoulias, Dimitris; Balzter, Heiko; Sykioti, Olga; Zlinszky, András; Tóth, Viktor R
2015-09-11
Monitoring of lakeshore ecosystems requires fine-scale information to account for the high biodiversity typically encountered in the land-water ecotone. Sentinel-2 is a satellite with high spatial and spectral resolution and improved revisiting frequency and is expected to have significant potential for habitat mapping and classification of complex lakeshore ecosystems. In this context, investigations of the capabilities of Sentinel-2 in regard to the spatial and spectral dimensions are needed to assess its potential and the quality of the expected output. This study presents the first simulation of the high spatial resolution (i.e., 10 m and 20 m) bands of Sentinel-2 for lakeshore mapping, based on the satellite's Spectral Response Function and hyperspectral airborne data collected over Lake Balaton, Hungary in August 2010. A comparison of supervised classifications of the simulated products is presented and the information loss from spectral aggregation and spatial upscaling in the context of lakeshore vegetation classification is discussed. We conclude that Sentinel-2 imagery has a strong potential for monitoring fine-scale habitats, such as reed beds.
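Band simulation of the kind described above reduces, per pixel, to a weighted average of the narrow hyperspectral bands, with weights given by the satellite's Spectral Response Function (SRF). The sketch below uses hypothetical SRF weights and reflectances, not the actual Sentinel-2 SRF values.

```python
# Hedged sketch: simulate one broad satellite band from narrow
# hyperspectral bands as the SRF-weighted average of reflectances.
# All numbers are hypothetical illustrations.

def simulate_band(reflectances, srf):
    """reflectances, srf: dicts keyed by hyperspectral band centre (nm)."""
    num = sum(srf[w] * reflectances[w] for w in srf)
    den = sum(srf.values())
    return num / den

# Hypothetical narrow-band reflectances near a red band (around 665 nm)
refl = {650: 0.08, 660: 0.07, 670: 0.06, 680: 0.05}
srf = {650: 0.4, 660: 1.0, 670: 1.0, 680: 0.5}
print(round(simulate_band(refl, srf), 4))
```

Repeating this per pixel and per target band yields a simulated multispectral image that can then be resampled to the 10 m or 20 m grid for classification.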
Punchets: nonlinear transport in Hamiltonian pump-ratchet hybrids
NASA Astrophysics Data System (ADS)
Dittrich, Thomas; Medina Sánchez, Nicolás
2018-02-01
‘Punchets’ are hybrids between ratchets and pumps, combining a spatially periodic static potential, typically asymmetric under space inversion, with a local driving that breaks time-reversal invariance; they are intended to model metal or semiconductor surfaces irradiated by a collimated laser beam. Their crucial feature is irregular driven scattering between asymptotic regions supporting periodic (as opposed to free) motion. With all binary spatio-temporal symmetries broken, scattering in punchets typically generates directed currents. We here study the underlying nonlinear transport mechanisms, from chaotic scattering to the parameter dependence of the currents, in three types of Hamiltonian models: (i) spatially periodic potentials where spatial and temporal symmetries are broken only in the driven scattering region; (ii) spatially asymmetric (ratchet) potentials with a driving that only breaks time-reversal invariance; and (iii), as a more realistic model of laser-irradiated surfaces, a driving in the form of a running wave confined to a compact region by a static envelope. In this last case, the induced current can even run against the direction of wave propagation, drastically evidencing its nonlinear nature. Quantizing punchets is indicated as a viable research perspective.
Cabrera, Alvaro Fuentes; Hoffmann, Pablo Faundez
2010-01-01
This study focuses on the single-trial classification of auditory event-related potentials elicited by sound stimuli from different spatial directions. Five naïve subjects were asked to localize a sound stimulus reproduced over one of 8 loudspeakers placed in a circular array, equally spaced by 45°. The subject was seated in the center of the circular array. Because of the complexity of an eight-class classification, our approach consisted of feeding our classifier with two classes, or spatial directions, at a time. The seven chosen pairs combined 0°, the loudspeaker directly in front of the subject, with each of the other seven directions. The discrete wavelet transform was used to extract features in the time-frequency domain, and a support vector machine performed the classification. The average accuracy over all subjects and all pairs of spatial directions was 76.5%, σ = 3.6. The results of this study provide evidence that the direction of a sound is encoded in single-trial auditory event-related potentials.
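The feature-extraction step named in this abstract, the discrete wavelet transform, can be sketched in its simplest form: one level of the Haar DWT splits an epoch into approximation (low-frequency) and detail (high-frequency) coefficients, which would then feed a classifier such as an SVM (not shown here). The toy epoch is invented for illustration.

```python
import math

# One level of the Haar discrete wavelet transform, a minimal sketch of
# time-frequency feature extraction for a single-trial ERP epoch.

def haar_dwt(signal):
    """Single-level Haar DWT; signal length must be even."""
    s = 1.0 / math.sqrt(2.0)
    approx = [s * (signal[i] + signal[i + 1]) for i in range(0, len(signal), 2)]
    detail = [s * (signal[i] - signal[i + 1]) for i in range(0, len(signal), 2)]
    return approx, detail

epoch = [2.0, 4.0, 6.0, 8.0]  # toy "single-trial ERP" samples
approx, detail = haar_dwt(epoch)
print(approx, detail)
```

In practice a multi-level decomposition with a smoother wavelet (e.g. from a dedicated wavelet library) would be used, and the coefficients concatenated into the classifier's feature vector.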
Determination of scattering structures from spatial coherence measurements.
Zarubin, A M
1996-03-01
A new method of structure determination and microscopic imaging with short-wavelength radiations (charged particles, X-rays, neutrons), based on measurements of the modulus and the phase of the degree of spatial coherence of the scattered radiation, is developed. The underlying principle of the method--transfer of structural information about the scattering potential via spatial coherence of the secondary (scattering) source of radiation formed by this potential--is expressed by the generalization of the van Cittert-Zernike theorem to wave and particle scattering [A.M. Zarubin, Opt. Commun. 100 (1993) 491; Opt. Commun. 102 (1993) 543]. Shearing interferometric techniques are proposed for implementing the above measurements; the limits of spatial resolution attainable by reconstruction of the absolute square of a 3D scattering potential and its 2D projections from the measurements are analyzed. It is shown theoretically that 3D imaging with atomic resolution can be realized in a "synthetic aperture" electron or ion microscope and that a 3D resolution of about 6 nm can be obtained with a "synthetic aperture" X-ray microscope. A proof-of-principle optical experiment is presented.
A case study for the integration of predictive mineral potential maps
NASA Astrophysics Data System (ADS)
Lee, Saro; Oh, Hyun-Joo; Heo, Chul-Ho; Park, Inhye
2014-09-01
This study aims to produce mineral potential maps using various models and to verify their accuracy for epithermal gold (Au)-silver (Ag) deposits in a Geographic Information System (GIS) environment, under the assumption that all deposits share a common genesis. The maps of potential Au and Ag deposits were produced from geological data for the Taebaeksan mineralized area, Korea. The methodological framework consists of three main steps: 1) identification of spatial relationships; 2) quantification of such relationships; and 3) combination of multiple quantified relationships. A spatial database containing 46 Au-Ag deposits was constructed using GIS. The spatial associations between training deposits and 26 related factors were identified and quantified by probabilistic and statistical modelling. The mineral potential maps were generated by integrating all factors using the overlay method and were afterwards recombined using the likelihood ratio model. They were verified by comparison with test mineral deposit locations. The verification revealed that the combined mineral potential map had the greatest accuracy (83.97%), compared with 72.24%, 65.85%, 72.23% and 71.02% for the likelihood ratio, weight of evidence, logistic regression and artificial neural network models, respectively. The mineral potential map can provide useful information for mineral resource development.
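The likelihood-ratio step named in this abstract can be sketched as follows: for each class of an evidential layer, the ratio compares the proportion of known deposits falling in that class to the proportion of study-area cells it occupies, and per-cell scores are combined by summing log-ratios across layers. The counts below are invented for illustration, not the study's data.

```python
import math

# Hedged sketch of a likelihood-ratio evidential layer score.
# LR > 1 means the class is richer in deposits than expected by area.

def likelihood_ratio(deposits_in_class, total_deposits,
                     cells_in_class, total_cells):
    return (deposits_in_class / total_deposits) / (cells_in_class / total_cells)

# One hypothetical layer class: 30 of 46 deposits on 10% of the area
lr = likelihood_ratio(30, 46, 1000, 10000)
score = math.log(lr)  # this layer's additive contribution for a cell
print(round(lr, 3))
```

Summing such log-ratio scores over all 26 factor layers for each cell would produce the combined potential map that is then verified against held-out deposit locations.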
ERIC Educational Resources Information Center
Zane, Emily
2016-01-01
This project used Event-Related Potentials (ERPs) to explore neurophysiological brain responses to prepositional phrases involving concrete and abstract reference nouns (e.g., "plate" and "moment," respectively) after the presentation of objects with varying spatial features. Prepositional phrases were headed by "in"…
Spatial Ability Mediates the Gender Difference in Middle School Students' Science Performance
ERIC Educational Resources Information Center
Ganley, Colleen M.; Vasilyeva, Marina; Dulaney, Alana
2014-01-01
Prior research has demonstrated a male advantage in spatial skills and science achievement. The present research integrated these findings by testing the potential role of spatial skills in gender differences in the science performance of eighth-grade students (13-15 years old). In "Study 1" (N = 113), the findings showed that mental…
Kyle J. Haynes; Ottar N. Bjornstad; Andrew J. Allstadt; Andrew M. Liebhold
2012-01-01
Despite the pervasiveness of spatial synchrony of population fluctuations in virtually every taxon, it remains difficult to disentangle its underlying mechanisms, such as environmental perturbations and dispersal. We used multiple regression of distance matrices (MRMs) to statistically partition the importance of several factors potentially synchronizing the dynamics...
ERIC Educational Resources Information Center
Güven, Bülent; Kosa, Temel
2008-01-01
Geometry is the study of shape and space. Without spatial ability, students cannot fully appreciate the natural world. Spatial ability is also very important for work in various fields such as computer graphics, engineering, architecture, and cartography. A number of studies have demonstrated that technology has an important potential to develop…
Number Prompts Left-to-Right Spatial Mapping in Toddlerhood
ERIC Educational Resources Information Center
McCrink, Koleen; Perez, Jasmin; Baruch, Erica
2017-01-01
Toddlers performed a spatial mapping task in which they were required to learn the location of a hidden object in a vertical array and then transpose this location information 90° to a horizontal array. During the vertical training, they were given (a) no labels, (b) alphabetical labels, or (c) numerical labels for each potential spatial location.…
A perturbation analysis of a mechanical model for stable spatial patterning in embryology
NASA Astrophysics Data System (ADS)
Bentil, D. E.; Murray, J. D.
1992-12-01
We investigate a mechanical cell-traction mechanism that generates stationary spatial patterns. A linear analysis highlights the model's potential for these heterogeneous solutions. We use multiple-scale perturbation techniques to study the evolution of these solutions and compare our solutions with numerical simulations of the model system. We discuss some potential biological applications among which are the formation of ridge patterns, dermatoglyphs, and wound healing.
NASA Astrophysics Data System (ADS)
Nordtvedt, Kenneth
2018-01-01
In the author's previous publications, a recursive linear algebraic method was introduced for obtaining (without gravitational radiation) the full potential expansions for the gravitational metric field components and the Lagrangian for a general N-body system. Two apparent properties of gravity, Exterior Effacement and Interior Effacement, were defined and fully enforced to obtain the recursive algebra, especially for the motion-independent potential expansions of the general N-body situation. The linear algebraic equations of this method determine the potential coefficients at any order n of the expansions in terms of the lower-order coefficients. Then, enforcing Exterior and Interior Effacement on a selected few potential series of the full motion-independent potential expansions, the complete exterior metric field for a single, spherically-symmetric mass source was obtained, producing the Schwarzschild metric field of general relativity. In this fourth paper of the series, the complete spatial metric's motion-independent potentials for N bodies are obtained by enforcing Interior Effacement and using knowledge of the Schwarzschild potentials. From the full spatial metric, the complete set of temporal metric potentials and Lagrangian potentials in the motion-independent case can then be found by transfer equations among the coefficients κ(n, α) → λ(n, ɛ) → ξ(n, α), with κ(n, α), λ(n, ɛ), ξ(n, α) being the numerical coefficients in the spatial metric, the Lagrangian, and the temporal metric potential expansions, respectively.
Current practices in the spatial analysis of cancer: flies in the ointment
Jacquez, Geoffrey M
2004-01-01
While many lessons have been learned from the spatial analysis of cancer, there are several caveats that apply to many, if not all such analyses. As "flies in the ointment", these can substantially detract from a spatial analysis, and if not accounted for, can lead to weakened and erroneous conclusions. This paper discusses several assumptions and limitations of spatial analysis, identifies problems of scientific inference, and concludes with potential solutions and future directions. PMID:15479473
Altering spatial priority maps via reward-based learning.
Chelazzi, Leonardo; Eštočinová, Jana; Calletti, Riccardo; Lo Gerfo, Emanuele; Sani, Ilaria; Della Libera, Chiara; Santandrea, Elisa
2014-06-18
Spatial priority maps are real-time representations of the behavioral salience of locations in the visual field, resulting from the combined influence of stimulus-driven activity and top-down signals related to the current goals of the individual. They arbitrate which of a number of (potential) targets in the visual scene will win the competition for attentional resources. As a result, deployment of visual attention to a specific spatial location is determined by the current peak of activation (corresponding to the highest behavioral salience) across the map. Here we report a behavioral study performed on healthy human volunteers, where we demonstrate that spatial priority maps can be shaped via reward-based learning, reflecting long-lasting alterations (biases) in the behavioral salience of specific spatial locations. These biases exert an especially strong influence on performance under conditions where multiple potential targets compete for selection, conferring competitive advantage to targets presented in spatial locations associated with greater reward during learning relative to targets presented in locations associated with lesser reward. Such acquired biases of spatial attention are persistent, are nonstrategic in nature, and generalize across stimuli and task contexts. These results suggest that reward-based attentional learning can induce plastic changes in spatial priority maps, endowing these representations with the "intelligent" capacity to learn from experience. Copyright © 2014 the authors.
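The mechanism described above can be sketched as a toy computational model: each location carries a learned bias nudged toward the reward received there, and selection goes to the location with the highest stimulus salience plus bias. The learning rule, rate, and reward values are hypothetical illustrations, not the study's model.

```python
# Illustrative sketch of a spatial priority map shaped by reward
# learning. All parameters are hypothetical.

ALPHA = 0.1  # hypothetical learning rate

def select(salience, bias):
    """Return the index of the current peak of the priority map."""
    priority = [s + b for s, b in zip(salience, bias)]
    return priority.index(max(priority))

def learn(bias, location, reward):
    """Nudge a location's bias toward the reward obtained there."""
    bias[location] += ALPHA * (reward - bias[location])

bias = [0.0] * 4
for _ in range(50):              # location 2 pays more than location 0
    learn(bias, 2, reward=1.0)
    learn(bias, 0, reward=0.2)

# With equal stimulus salience everywhere, the high-reward location wins
print(select([0.5, 0.5, 0.5, 0.5], bias))
```

This reproduces, in miniature, the paper's key observation: after learning, equally salient targets no longer compete on equal terms.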
NASA Astrophysics Data System (ADS)
Paudyal, D. R.; McDougall, K.; Apan, A.
2012-07-01
The participation and engagement of grass-roots community groups and citizens in natural resource management has a long history. With recent developments in ICT tools and spatial technology, these groups are seeking new opportunities to manage natural resource data. A large amount of spatial information is collected or generated by landcare groups, landholders and other community groups at the grass-roots level through their volunteer initiatives. State government organisations are also interested in gaining access to this spatial data/information and in engaging these groups to collect spatial information under their mapping programs. The aim of this paper is to explore the possible utilisation of volunteered geographic information (VGI) for catchment management activities. This paper discusses the importance of spatial information and spatial data infrastructure (SDI) for catchment management and the emergence of VGI. A conceptual framework has been developed to illustrate how these emerging spatial information applications and various community volunteer activities can contribute to more inclusive SDI development at the local level. A survey of 56 regional NRM bodies in Australia was used to explore current community-driven volunteer initiatives for NRM activities and the potential utilisation of VGI initiatives in the NRM decision-making process. This paper concludes that VGI activities have great potential to contribute to SDI development at the community level to achieve better natural resource management (NRM) outcomes.
Aicher, Wilhelm K; Rolauffs, Bernd
2014-04-01
Chondrocytes display within the articular cartilage depth-dependent variations of their many properties that are comparable to the depth-dependent changes of the properties of the surrounding extracellular matrix. However, not much is known about the spatial organisation of the chondrocytes throughout the tissue. Recent studies revealed that human chondrocytes display distinct spatial patterns of organisation within the articular surface, and each joint surface is dominated in a typical way by one of four basic spatial patterns. The resulting complex spatial organisations correlate with the specific diarthrodial joint type, suggesting an association between the chondrocyte organisation within the joint surface and the biomechanical forces that occur there. In response to focal osteoarthritis (OA), the superficial chondrocytes experience a destruction of their spatial organisation within the OA lesion, but they also undergo a defined remodelling process distant from the OA lesion in the remaining, intact cartilage surface. One of the biological insights that can be derived from this spatial remodelling process is that the chondrocytes are able to respond in a generalised and coordinated fashion to distant focal OA. The spatial characteristics of this process are tremendously different from the cellular aggregations typical for OA lesions, suggesting differences in the underlying mechanisms. Here we summarise the available information on the spatial organisation of chondrocytes and its potential roles in cartilage functioning. The spatial organisation could be used to diagnose early OA onset before manifest OA results in tissue destruction and clinical symptoms. With further development, this concept may become clinically suitable for the diagnosis of preclinical OA.
Precision cosmology with time delay lenses: High resolution imaging requirements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meng, Xiao-Lei; Treu, Tommaso; Agnello, Adriano
Lens time delays are a powerful probe of cosmology, provided that the gravitational potential of the main deflector can be modeled with sufficient precision. Recent work has shown that this can be achieved by detailed modeling of the host galaxies of lensed quasars, which appear as ``Einstein Rings'' in high resolution images. The distortion of these arcs and counter-arcs, as measured over a large number of pixels, provides tight constraints on the difference between the gravitational potential between the quasar image positions, and thus on cosmology in combination with the measured time delay. We carry out a systematic exploration of the high resolution imaging required to exploit the thousands of lensed quasars that will be discovered by current and upcoming surveys within the next decade. Specifically, we simulate realistic lens systems as imaged by the Hubble Space Telescope (HST), James Webb Space Telescope (JWST), and ground based adaptive optics images taken with Keck or the Thirty Meter Telescope (TMT). We compare the performance of these pointed observations with that of images taken by the Euclid (VIS), Wide-Field Infrared Survey Telescope (WFIRST) and Large Synoptic Survey Telescope (LSST) surveys. We use as our metric the precision with which the slope γ' of the total mass density profile ρ_tot ∝ r^(-γ') for the main deflector can be measured. Ideally, we require that the statistical error on γ' be less than 0.02, such that it is subdominant to other sources of random and systematic uncertainties. We find that survey data will likely have sufficient depth and resolution to meet the target only for the brighter gravitational lens systems, comparable to those discovered by the SDSS survey. For fainter systems, that will be discovered by current and future surveys, targeted follow-up will be required. 
Furthermore, the exposure time required with upcoming facilitites such as JWST, the Keck Next Generation Adaptive Optics System, and TMT, will only be of order a few minutes per system, thus making the follow-up of hundreds of systems a practical and efficient cosmological probe.« less
Precision cosmology with time delay lenses: high resolution imaging requirements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meng, Xiao-Lei; Liao, Kai; Treu, Tommaso
Lens time delays are a powerful probe of cosmology, provided that the gravitational potential of the main deflector can be modeled with sufficient precision. Recent work has shown that this can be achieved by detailed modeling of the host galaxies of lensed quasars, which appear as ''Einstein Rings'' in high resolution images. The distortion of these arcs and counter-arcs, as measured over a large number of pixels, provides tight constraints on the difference in the gravitational potential between the quasar image positions, and thus on cosmology in combination with the measured time delay. We carry out a systematic exploration of the high resolution imaging required to exploit the thousands of lensed quasars that will be discovered by current and upcoming surveys within the next decade. Specifically, we simulate realistic lens systems as imaged by the Hubble Space Telescope (HST), James Webb Space Telescope (JWST), and ground based adaptive optics images taken with Keck or the Thirty Meter Telescope (TMT). We compare the performance of these pointed observations with that of images taken by the Euclid (VIS), Wide-Field Infrared Survey Telescope (WFIRST) and Large Synoptic Survey Telescope (LSST) surveys. We use as our metric the precision with which the slope γ' of the total mass density profile ρ_tot ∝ r^(−γ') for the main deflector can be measured. Ideally, we require that the statistical error on γ' be less than 0.02, such that it is subdominant to other sources of random and systematic uncertainties. We find that survey data will likely have sufficient depth and resolution to meet the target only for the brighter gravitational lens systems, comparable to those discovered by the SDSS survey. For fainter systems that will be discovered by current and future surveys, targeted follow-up will be required.
However, the exposure time required with upcoming facilities such as JWST, the Keck Next Generation Adaptive Optics System, and TMT will only be of order a few minutes per system, thus making the follow-up of hundreds of systems a practical and efficient cosmological probe.
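The metric in the two records above, the slope γ' of a power-law density profile ρ_tot ∝ r^(−γ'), can be made concrete with a short numerical sketch (illustrative values only, not the paper's data or method): in log-log space the profile is a straight line, so γ' is just the negated slope of a linear fit.

```python
import numpy as np

def fit_density_slope(r, rho):
    """Least-squares estimate of gamma in rho ∝ r**(-gamma)."""
    slope, _ = np.polyfit(np.log(r), np.log(rho), 1)
    return -slope

r = np.logspace(-1, 2, 200)      # radii, arbitrary units
gamma_true = 2.08                # an assumed, near-isothermal slope
rho = 3.5 * r**(-gamma_true)     # noiseless power-law profile

gamma_fit = fit_density_slope(r, rho)
print(round(gamma_fit, 2))       # recovers 2.08
```

In practice the papers constrain γ' by modeling the full lensed image, not by fitting ρ(r) directly; this sketch only illustrates what the parameter means.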
NASA Astrophysics Data System (ADS)
Budy, Phaedra; Baker, Matthew; Dahle, Samuel K.
2011-10-01
Anthropogenic impairment of water bodies represents a global environmental concern, yet few attempts have successfully linked fish performance to thermal habitat suitability and fewer have distinguished co-varying water quality constraints. We interfaced fish bioenergetics, field measurements, and Thermal Remote Imaging to generate a spatially-explicit, high-resolution surface of fish growth potential, and next employed a structured hypothesis to detect relationships among measures of fish performance and co-varying water quality constraints. Our thermal surface of fish performance captured the amount and spatial-temporal arrangement of thermally-suitable habitat for three focal species in an extremely heterogeneous reservoir, but interpretation of this pattern was initially confounded by seasonal covariation of water residence time and water quality. Subsequent path analysis revealed that in terms of seasonal patterns in growth potential, catfish and walleye responded to temperature, positively and negatively, respectively; crappie and walleye responded to eutrophy (negatively). At the high eutrophy levels observed in this system, some desired fishes appear to suffer from excessive cultural eutrophication within the context of elevated temperatures whereas others appear to be largely unaffected or even enhanced. Our overall findings do not lead to the conclusion that this system is degraded by pollution; however, they do highlight the need to use a sensitive focal species in the process of determining allowable nutrient loading and as integrators of habitat suitability across multiple spatial and temporal scales. We provide an integrated approach useful for quantifying fish growth potential and identifying water quality constraints on fish performance at spatial scales appropriate for whole-system management.
Detecting spatial regimes in ecosystems
Research on early warning indicators has generally focused on assessing temporal transitions, with limited application of these methods to detecting spatial regimes. Traditional spatial boundary detection procedures that result in ecoregion maps are typically based on ecological potential (i.e. potential vegetation), and often fail to account for ongoing changes due to stressors such as land use change and climate change and their effects on plant and animal communities. We use Fisher information, an information theory based method, on both terrestrial and aquatic animal data (US Breeding Bird Survey and marine zooplankton) to identify ecological boundaries, and compare our results to traditional early warning indicators, conventional ecoregion maps, and multivariate analyses such as nMDS (non-metric Multidimensional Scaling) and cluster analysis. We successfully detect spatial regimes and transitions in both terrestrial and aquatic systems using Fisher information. Furthermore, Fisher information provided explicit spatial information about community change that is absent from other multivariate approaches. Our results suggest that defining spatial regimes based on animal communities may better reflect ecological reality than do traditional ecoregion maps, especially in our current era of rapid and unpredictable ecological change.
High density event-related potential data acquisition in cognitive neuroscience.
Slotnick, Scott D
2010-04-16
Functional magnetic resonance imaging (fMRI) is currently the standard method of evaluating brain function in the field of Cognitive Neuroscience, in part because fMRI data acquisition and analysis techniques are readily available. Because fMRI has excellent spatial resolution but poor temporal resolution, this method can only be used to identify the spatial location of brain activity associated with a given cognitive process (and reveals virtually nothing about the time course of brain activity). By contrast, event-related potential (ERP) recording, a method that is used much less frequently than fMRI, has excellent temporal resolution and thus can track rapid temporal modulations in neural activity. Unfortunately, ERPs are underutilized in Cognitive Neuroscience because data acquisition techniques are not readily available and low density ERP recording has poor spatial resolution. In an effort to foster the increased use of ERPs in Cognitive Neuroscience, the present article details key techniques involved in high density ERP data acquisition. Critically, high density ERPs offer the promise of excellent temporal resolution and good spatial resolution (or excellent spatial resolution if coupled with fMRI), which is necessary to capture the spatial-temporal dynamics of human brain function.
Xie, Hualin; Liu, Zhifei; Wang, Peng; Liu, Guiying; Lu, Fucai
2013-01-01
Ecological land is one of the key resources and conditions for the survival of humans because it provides ecosystem services and is particularly important to public health and safety. Exploring the evolution mechanisms of ecological land is extremely valuable for effective ecological management. Based on spatial statistical analyses, we explored the spatial disparities and primary potential drivers of ecological land change in the Poyang Lake Eco-economic Zone of China. The results demonstrated that the global Moran’s I value was 0.1646 during the 1990 to 2005 time period, indicating significant positive spatial correlation (p < 0.05). The results also imply that the clustering trend of ecological land changes weakened in the study area. Some potential driving forces were identified by applying a spatial autoregressive model in this study. The results demonstrated that a higher economic development level and industrialization rate were the main drivers of the faster change of ecological land in the study area. This study also tested the superiority of the spatial autoregressive model for studying the mechanisms of ecological land change by comparing it with a traditional linear regression model. PMID:24384778
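The global Moran's I statistic reported in the abstract above measures spatial autocorrelation: positive values mean similar values cluster in space. A minimal sketch of the computation, using a tiny hypothetical dataset (four regions on a line with rook adjacency, not the study's land-use data):

```python
import numpy as np

def morans_i(values, W):
    """Global Moran's I: (n/S0) * (z'Wz)/(z'z), W a spatial weights matrix."""
    z = values - values.mean()   # mean-centered attribute
    n = len(values)
    s0 = W.sum()                 # sum of all weights
    return (n / s0) * (z @ W @ z) / (z @ z)

# Hypothetical adjacency for 4 regions arranged in a row
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
x = np.array([1.0, 2.0, 3.0, 4.0])   # smoothly varying attribute

print(morans_i(x, W))   # positive: neighboring regions have similar values
```

With these toy values the statistic comes out at 1/3, i.e. positive spatial correlation, analogous in sign to the 0.1646 the study reports. Production analyses typically use a library such as PySAL with row-standardized weights rather than this raw form.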
Temporally increasing spatial synchrony of North American temperature and bird populations
NASA Astrophysics Data System (ADS)
Koenig, Walter D.; Liebhold, Andrew M.
2016-06-01
The ecological impacts of modern global climate change are detectable in a wide variety of phenomena, ranging from shifts in species ranges to changes in community composition and human disease dynamics. So far, however, little attention has been given to temporal changes in spatial synchrony--the coincident change in abundance or value across the landscape--despite the importance of environmental synchrony as a driver of population trends and the central role of environmental variability in population rescue and extinction. Here we demonstrate that across North America, spatial synchrony of a significant proportion of 49 widespread North American wintering bird species has increased over the past 50 years--the period encompassing particularly intense anthropogenic effects in climate--paralleling significant increases in spatial synchrony of mean maximum air temperature. These results suggest the potential for increased spatial synchrony in environmental factors to be affecting a wide range of ecological phenomena. These effects are likely to vary, but for North American wildlife species, increased spatial synchrony driven by environmental factors may be the basis for a previously unrecognized threat to their long-term persistence in the form of more synchronized population dynamics reducing the potential for demographic rescue among interacting subpopulations.
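Spatial synchrony, as used in the abstract above, is commonly quantified as the mean pairwise correlation of time series recorded at different sites. A hedged sketch with synthetic data (not the study's bird or temperature records):

```python
import numpy as np

def spatial_synchrony(series):
    """Mean pairwise Pearson correlation across rows (sites x years)."""
    r = np.corrcoef(series)
    upper = r[np.triu_indices_from(r, k=1)]   # unique site pairs
    return upper.mean()

rng = np.random.default_rng(0)
common = rng.normal(size=50)                           # shared climate signal
sites_sync = common + 0.1 * rng.normal(size=(5, 50))   # sites driven by it
sites_async = rng.normal(size=(5, 50))                 # independent sites

print(spatial_synchrony(sites_sync) > spatial_synchrony(sites_async))
```

Sites sharing a strong common driver yield synchrony near 1, independent sites near 0; a temporal increase in this statistic is what the study detects for temperature and bird populations.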
NASA Technical Reports Server (NTRS)
Forester, R. H.
1978-01-01
Polyimide membranes with thicknesses ranging from under 0.01 micron to greater than 1 micron can be produced at an estimated cost of 50 cents per sq m (plus the cost of the polymer). The polymer of interest is dissolved in a solvent which is soluble in water. The polymer or casting solution is allowed to flow down an inclined ramp onto a water surface, where a pool of floating polymer develops. The solvent dissolves into the water, lowering the surface tension of the water. Consequently, the contact angle of the polymer pool is very low and the edge of the pool is very thin. The solvent dissolves from this thin region too rapidly to be replenished from the bulk of the pool, and a solid polymer film forms. Film formation is rapid and spontaneous, and the film spreads out unaided, many feet from the leading edge of the pool. The driving force for this process is the exothermic solution of the organic solvent from the polymer solution into the water.
Using "Big Data" in a Classroom Setting for Student-Developed Projects
NASA Astrophysics Data System (ADS)
Hayes-Gehrke, Melissa; Vogel, Stuart N.
2018-01-01
The advances in exploration of the optical transient sky anticipated with major facilities such as the Zwicky Transient Facility (ZTF) and Large Synoptic Survey Telescope (LSST) provide an opportunity to integrate large public research datasets into the undergraduate classroom. As a step in this direction, the NSF PIRE-funded GROWTH (Global Relay of Observatories Watching Transients Happen) collaboration provided funding for curriculum development using data from the precursor to ZTF, the Intermediate Palomar Transient Factory (iPTF). One of the iPTF portals, the PTF Variable Marshal, was used by 56 Astronomy majors in the fall 2016 and 2017 semesters of the required Observational Astronomy course at the University of Maryland. Student teams learned about the iPTF survey and how to use the PTF Variable Marshal and then developed their own hypotheses about variable stars to test using data they gathered from the Variable Marshal. Through this project, students gained experience in how to develop scientific questions that can be explored using large datasets and became aware of the limitations and difficulties of such projects. This work was supported in part by NSF award OISE-1545949.
Astrometric surveys in the Gaia era
NASA Astrophysics Data System (ADS)
Zacharias, Norbert
2018-04-01
The Gaia first data release (DR1) already provides an almost error free optical reference frame at the milli-arcsecond (mas) level, allowing significantly better calibration of ground-based astrometric data than ever before. Gaia DR1 provides positions, proper motions and trigonometric parallaxes for just over 2 million stars in the Tycho-2 catalog. For over 1.1 billion additional stars, DR1 gives positions. Proper motions for these mainly fainter stars (G >= 11.5) are currently provided by several new projects which combine earlier-epoch ground-based observations with Gaia DR1 positions. These data are very helpful in the interim period but will become obsolete with the second Gaia data release (DR2) expected in April 2018. The era of traditional, ground-based, wide-field astrometry with the goal of providing accurate reference stars has come to an end. Future ground-based astrometry will fill in some gaps (very bright stars, observations needed at many or specific epochs) and mainly will go fainter than the Gaia limit, like the PanSTARRS and the upcoming LSST surveys.
The Maunakea Spectroscopic Explorer
NASA Astrophysics Data System (ADS)
Venn, Kim; Starkenburg, Else; Martin, Nicolas; Kielty, Collin; Youakim, Kris; Arnetsen, Anke
2018-06-01
The Maunakea Spectroscopic Explorer (MSE) is an ambitious project to transform the Canada-France-Hawaii 3.6-metre telescope into an 11.25-metre facility dedicated to wide field multi-object spectroscopy. Following a successful conceptual design review of ten subsystems and the systems-level review in January 2018, MSE is preparing to move into the Preliminary Design Phase. MSE will simultaneously deploy over 3000 fibers that feed low/medium resolution spectrometers and 1000 fibers that feed high-resolution (R~40,000) spectrometers. This design is expected to revolutionize astrophysical studies requiring large spectroscopic datasets: i.e., reconstructing the Milky Way's formation history through the chemical tagging of stars, searches for the effects of dark matter on stellar streams, determination of environmental influences on galaxy formation since cosmic noon, measuring black hole masses through repeat spectroscopy of quasars, follow-up of large samples identified in other surveys (Gaia, LSST, SKA, etc.), and more. MSE will reuse a large fraction of CFHT’s existing facilities while tripling the diameter of the telescope’s primary mirror and increasing the height of the enclosure by only 10%. I will discuss the progress to date and opportunities for partnerships.
Wide-Field InfraRed Survey Telescope WFIRST
NASA Technical Reports Server (NTRS)
Green, J.; Schechter, P.; Baltay, C.; Bean, R.; Bennett, D.; Brown, R.; Conselice, C.; Donahue, M.; Fan, X.; Rauscher, B.;
2012-01-01
In December 2010, NASA created a Science Definition Team (SDT) for WFIRST, the Wide Field Infra-Red Survey Telescope, recommended by the Astro 2010 Decadal Survey as the highest priority for a large space mission. The SDT was chartered to work with the WFIRST Project Office at GSFC and the Program Office at JPL to produce a Design Reference Mission (DRM) for WFIRST. Part of the original charge was to produce an interim design reference mission by mid-2011. That document was delivered to NASA and widely circulated within the astronomical community. In late 2011 the Astrophysics Division augmented its original charge, asking for two design reference missions. The first of these, DRM1, was to be a finalized version of the interim DRM, reducing overall mission costs where possible. The second of these, DRM2, was to identify and eliminate capabilities that overlapped with those of NASA's James Webb Space Telescope (henceforth JWST), ESA's Euclid mission, and the NSF's ground-based Large Synoptic Survey Telescope (henceforth LSST), and again to reduce overall mission cost, while staying faithful to NWNH. This report presents both DRM1 and DRM2.
Liverpool Telescope 2: beginning the design phase
NASA Astrophysics Data System (ADS)
Copperwheat, Christopher M.; Steele, Iain A.; Barnsley, Robert M.; Bates, Stuart D.; Bode, Mike F.; Clay, Neil R.; Collins, Chris A.; Jermak, Helen E.; Knapen, Johan H.; Marchant, Jon M.; Mottram, Chris J.; Piascik, Andrzej S.; Smith, Robert J.
2016-07-01
The Liverpool Telescope is a fully robotic 2-metre telescope located at the Observatorio del Roque de los Muchachos on the Canary Island of La Palma. The telescope began routine science operations in 2004, and currently seven simultaneously mounted instruments support a broad science programme, with a focus on transient followup and other time domain topics well suited to the characteristics of robotic observing. Work has begun on a successor facility with the working title `Liverpool Telescope 2'. We are entering a new era of time domain astronomy with new discovery facilities across the electromagnetic spectrum, and the next generation of optical survey facilities such as LSST are set to revolutionise the field of transient science in particular. The fully robotic Liverpool Telescope 2 will have a 4-metre aperture and an improved response time, and will be designed to meet the challenges of this new era. Following a conceptual design phase, we are about to begin the detailed design which will lead towards the start of construction in 2018, for first light ˜2022. In this paper we provide an overview of the facility and an update on progress.
Supernova Cosmology in the Big Data Era
NASA Astrophysics Data System (ADS)
Kessler, Richard
Here we describe large "Big Data" Supernova (SN) Ia surveys, past and present, used to make precision measurements of cosmological parameters that describe the expansion history of the universe. In particular, we focus on surveys designed to measure the dark energy equation of state parameter w and its dependence on cosmic time. These large surveys have at least four photometric bands, and they use a rolling search strategy in which the same instrument is used for both discovery and photometric follow-up observations. These surveys include the Supernova Legacy Survey (SNLS), Sloan Digital Sky Survey II (SDSS-II), Pan-STARRS 1 (PS1), Dark Energy Survey (DES), and Large Synoptic Survey Telescope (LSST). We discuss the development of how systematic uncertainties are evaluated, and how methods to reduce them play a major role in designing new surveys. The key systematic effects that we discuss are (1) calibration, measuring the telescope efficiency in each filter band; (2) biases from a magnitude-limited survey and from the analysis; and (3) photometric SN classification for current surveys that don't have enough resources to spectroscopically confirm each SN candidate.
Prompt emission from the counter jet of a short gamma-ray burst
NASA Astrophysics Data System (ADS)
Yamazaki, Ryo; Ioka, Kunihito; Nakamura, Takashi
2018-03-01
The counter jet of a short gamma-ray burst (sGRB) has not yet been observed, while recent discoveries of gravitational waves (GWs) from a binary neutron star merger GW170817 and the associated sGRB 170817A have demonstrated that off-axis sGRB jets are detectable. We calculate the prompt emission from the counter jet of an sGRB and show that it is typically 23-26 mag in the optical-infrared band 10-10^3 s after the GWs for an sGRB 170817A-like event, which is brighter than the early macronova (or kilonova) emission and detectable by LSST in the near future. We also propose a new method to constrain the unknown jet properties, such as the Lorentz factor, opening angle, emission radii, and jet launch time, by observing both the forward and counter jets. To scrutinize the counter jets, space GW detectors like DECIGO are powerful, forecasting the merger time (to within ≲ 1 s) and position (to within ≲ 1 arcmin) about a week before the merger.
The Zwicky Transient Facility: Overview and Commissioning Activities
NASA Astrophysics Data System (ADS)
Graham, Matthew; Zwicky Transient Facility (ZTF) Project Team
2018-01-01
The Zwicky Transient Facility (ZTF) is the first of a new generation of LSST-scope sky surveys to be realized. It will employ a 47 square degree field-of-view camera mounted on the Samuel Oschin 48-inch Schmidt telescope at Palomar Observatory to scan more than 3750 square degrees an hour to a depth of 20.5 – 21 mag. This will lead to unprecedented discovery rates for transients – a young supernova less than 24 hours after its explosion each night as well as rarer and more exotic sources. Repeated imaging of the Northern sky (including the Galactic Plane) will produce a photometric variability catalog with nearly 300 observations each year, ideal for studies of variable stars, binaries, AGN, and asteroids. ZTF represents a significant increase in scale relative to previous surveys in terms of both data volume and data complexity. It will be the first survey to produce one million alerts a night and the first to have a trillion row data archive. We will present an overview of the survey and its challenges and describe recent commissioning activities.
Priming the search for cosmic superstrings using GADGET2 simulations
NASA Astrophysics Data System (ADS)
Cousins, Bryce; Jia, Hewei; Braverman, William; Chernoff, David
2018-01-01
String theory is an extensive mathematical theory which, despite its broad explanatory power, is still lacking empirical support. However, this may change when considering the scope of cosmology, where “cosmic superstrings” may serve as observational evidence. According to string theory, these superstrings were stretched to cosmic scales in the early Universe and may now be detectable, via microlensing or gravitational radiation. Negative results from prior surveys have put some limits on superstring properties, so to investigate the parameter space more effectively, we ask: “where should we expect to find cosmic superstrings, and how many should we predict?” This research investigates these questions by simulating cosmic string behavior during structure formation in the universe using GADGET2. The sizes and locations of superstring clusters are assessed using kernel density estimation and radial correlation functions. Currently, only preliminary small-scale simulations have been performed, producing superstring clustering with low sensitivity. However, future simulations of greater magnitude will offer far higher resolution, allowing us to more precisely track superstring behavior within structures. Such results will guide future searches, most imminently those made possible by LSST and WFIRST.
Liverpool Telescope and Liverpool Telescope 2
NASA Astrophysics Data System (ADS)
Copperwheat, C. M.; Steele, I. A.; Barnsley, R. M.; Bates, S. D.; Clay, N. R.; Jermak, H.; Marchant, J. M.; Mottram, C. J.; Piascik, A.; Smith, R. J.
2016-12-01
The Liverpool Telescope is a fully robotic optical/near-infrared telescope with a 2-metre clear aperture, located at the Observatorio del Roque de los Muchachos on the Canary Island of La Palma. The telescope is owned and operated by Liverpool John Moores University, with financial support from the UK's Science and Technology Facilities Council. The telescope began routine science operations in 2004 and is a common-user facility with time available through a variety of committees via an open, peer reviewed process. Seven simultaneously mounted instruments support a broad science programme, with a focus on transient follow-up and other time domain topics well suited to the characteristics of robotic observing. Development has also begun on a successor facility, with the working title `Liverpool Telescope 2', to capitalise on the new era of time domain astronomy which will be brought about by the next generation of survey facilities such as LSST. The fully robotic Liverpool Telescope 2 will have a 4-metre aperture and an improved response time. In this paper we provide an overview of the current status of both facilities.
The dynamics and control of large flexible space structures-V
NASA Technical Reports Server (NTRS)
Bainum, P. M.; Reddy, A. S. S. R.; Diarra, C. M.; Kumar, V. K.
1982-01-01
A general survey of the progress made in the areas of mathematical modelling of the system dynamics, structural analysis, development of control algorithms, and simulation of environmental disturbances is presented. Graph theory techniques are employed to examine the effects of inherent damping associated with LSST systems on the number and locations of the required control actuators. A mathematical model of the forces and moments induced on a flexible orbiting beam due to solar radiation pressure is developed, and typical steady state open loop responses are obtained for the case when rotations and vibrations are limited to occur within the orbit plane. A preliminary controls analysis based on a truncated (13 mode) finite element model of the 122 m Hoop/Column antenna indicates that a minimum of six appropriately placed actuators is required for controllability. An algorithm to evaluate the coefficients which describe coupling between the rigid rotational and flexible modes, and also intramodal coupling, was developed; numerical evaluation based on the finite element model of the Hoop/Column system is currently in progress.
Science capabilities of the Maunakea Spectroscopic Explorer
NASA Astrophysics Data System (ADS)
Devost, Daniel; McConnachie, Alan; Flagey, Nicolas; Cote, Patrick; Balogh, Michael; Driver, Simon P.; Venn, Kim
2017-01-01
The Maunakea Spectroscopic Explorer (MSE) project will transform the CFHT 3.6m optical telescope into a 10m class dedicated multiobject spectroscopic facility, with an ability to simultaneously measure thousands of objects with a spectral resolution range spanning 2,000 to 20,000. The project is currently in design phase, with full science operations nominally starting in 2025. MSE will enable transformational science in areas as diverse as exoplanetary host characterization; stellar monitoring campaigns; tomographic mapping of the interstellar and intergalactic media; the in-situ chemical tagging of the distant Galaxy; connecting galaxies to the large scale structure of the Universe; measuring the mass functions of cold dark matter sub-halos in galaxy and cluster-scale hosts; reverberation mapping of supermassive black holes in quasars. MSE is an essential follow-up facility to current and next generations of multi-wavelength imaging surveys, including LSST, Gaia, Euclid, eROSITA, SKA, and WFIRST, and is an ideal feeder facility for E-ELT, TMT and GMT. I will give an update on the status of the project and review some of the most exciting scientific capabilities of the observatory.
Status of mirror segment production for the Giant Magellan Telescope
NASA Astrophysics Data System (ADS)
Martin, H. M.; Burge, J. H.; Davis, J. M.; Kim, D. W.; Kingsley, J. S.; Law, K.; Loeff, A.; Lutz, R. D.; Merrill, C.; Strittmatter, P. A.; Tuell, M. T.; Weinberger, S. N.; West, S. C.
2016-07-01
The Richard F. Caris Mirror Lab at the University of Arizona is responsible for production of the eight 8.4 m segments for the primary mirror of the Giant Magellan Telescope, including one spare off-axis segment. We report on the successful casting of Segment 4, the center segment. Prior to generating the optical surface of Segment 2, we carried out a major upgrade of our 8.4 m Large Optical Generator. The upgrade includes new hardware and software to improve accuracy, safety, reliability and ease of use. We are currently carrying out an upgrade of our 8.4 m polishing machine that includes improved orbital polishing capabilities. We added and modified several components of the optical tests during the manufacture of Segment 1, and we have continued to improve the systems in preparation for Segments 2-8. We completed two projects that were prior commitments before GMT Segment 2: casting and polishing the combined primary and tertiary mirrors for the LSST, and casting and generating a 6.5 m mirror for the Tokyo Atacama Observatory.
Detection technique for artificially illuminated objects in the outer solar system and beyond.
Loeb, Abraham; Turner, Edwin L
2012-04-01
Existing and planned optical telescopes and surveys can detect artificially illuminated objects, comparable in total brightness to a major terrestrial city, at the outskirts of the Solar System. Orbital parameters of Kuiper belt objects (KBOs) are routinely measured to exquisite precisions of <10^(-3). Here, we propose to measure the variation of the observed flux F from such objects as a function of their changing orbital distance D. Sunlight-illuminated objects will show a logarithmic slope α ≡ d log F / d log D = -4, whereas artificially illuminated objects should exhibit α = -2. The proposed Large Synoptic Survey Telescope (LSST) and other planned surveys will provide superb data and allow measurement of α for thousands of KBOs. If objects with α = -2 are found, follow-up observations could measure their spectra to determine whether they are illuminated by artificial lighting. The search can be extended beyond the Solar System with future generations of telescopes on the ground and in space that would have the capacity to detect phase modulation due to very strong artificial illumination on the nightside of planets as they orbit their parent stars.
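The discriminant proposed in the abstract above is simple enough to sketch numerically (illustrative values, not the paper's): reflected sunlight falls as F ∝ D^(-4) (incident illumination drops as D^(-2) and the light returning to the observer drops by another D^(-2)), while an object with its own constant luminosity falls only as F ∝ D^(-2), so the log-log slope α separates the two cases.

```python
import numpy as np

def log_slope(D, F):
    """alpha = d log F / d log D, estimated by a least-squares fit."""
    slope, _ = np.polyfit(np.log(D), np.log(F), 1)
    return slope

D = np.linspace(30.0, 50.0, 50)   # hypothetical orbital distances (AU)
F_sun = 1.0 / D**4                # reflected sunlight
F_art = 1.0 / D**2                # constant intrinsic (artificial) luminosity

print(round(log_slope(D, F_sun)))   # -4
print(round(log_slope(D, F_art)))   # -2
```

Real photometry would add noise and phase-angle effects, so the fitted α would carry uncertainties; the clean values here are just the idealized limits the paper describes.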
Making The Most Of Flaring M Dwarfs
NASA Astrophysics Data System (ADS)
Hunt-Walker, Nicholas; Hilton, E.; Kowalski, A.; Hawley, S.; Matthews, J.; Holtzman, J.
2011-01-01
We present observations of flare activity using the Microvariability and Oscillations of Stars (MOST) satellite in conjunction with simultaneous spectroscopic and photometric observations from the ARC 3.5-meter, NMSU 1.0-meter, and ARCSAT 0.5-meter telescopes at the Apache Point Observatory. The MOST observations enable unprecedented completeness with regard to observing frequent, low-energy flares on the well-known dMe flare star AD Leo with broadband photometry. The observations span approximately one week with a 60-second cadence and are sensitive to flares as small as 0.01-magnitudes. The time-resolved, ground-based spectroscopy gives measurements of Hα and other important chromospheric emission lines, whereas the Johnson U-, SDSS u-, and SDSS g-band photometry provide color information during the flare events and allow us to relate the MOST observations to decades of previous broadband observations. Understanding the rates and energetics of flare events on M dwarfs will help characterize this source of variability in large time-domain surveys such as LSST and Pan-STARRS. Flare rates are also of interest to astrobiology, since flares affect the habitability of exoplanets orbiting M dwarfs.
Synchronization of autonomous objects in discrete event simulation
NASA Technical Reports Server (NTRS)
Rogers, Ralph V.
1990-01-01
Autonomous objects in event-driven discrete event simulation offer the potential to combine the freedom of unrestricted movement and positional accuracy through Euclidean space of time-driven models with the computational efficiency of event-driven simulation. The principal challenge to autonomous object implementation is object synchronization. The concept of a spatial blackboard is offered as a potential methodology for synchronization. The issues facing implementation of a spatial blackboard are outlined and discussed.
Spatial decision support system to evaluate crop residue energy potential by anaerobic digestion.
Escalante, Humberto; Castro, Liliana; Gauthier-Maradei, Paola; Rodríguez De La Vega, Reynel
2016-11-01
Implementing anaerobic digestion (AD) in energy production from crop residues requires development of decision tools to assess its feasibility and sustainability. A spatial decision support system (SDSS) was constructed to assist decision makers to select appropriate feedstock according to biomethanation potential, identify the most suitable location for biogas facilities, determine optimum plant capacity and supply chain, and evaluate associated risks and costs. SDSS involves a spatially explicit analysis, fuzzy multi-criteria analysis, and statistical and optimization models. The tool was validated on seven crop residues located in Santander, Colombia. For example, fique bagasse generates about 0.21 million m³ CH₄ year⁻¹ (0.329 m³ CH₄ kg⁻¹ volatile solids) with a minimum profitable plant capacity of about 2000 ton year⁻¹ and an internal rate of return of 10.5%. SDSS can be applied to evaluate other biomass resources, availability periods, and co-digestion potential. Copyright © 2016. Published by Elsevier Ltd.
Metabolic Flexibility as a Major Predictor of Spatial Distribution in Microbial Communities
Carbonero, Franck; Oakley, Brian B.; Purdy, Kevin J.
2014-01-01
A better understanding of the ecology of microbes and their role in the global ecosystem could be achieved if traditional ecological theories can be applied to microbes. In ecology, organisms are defined as specialists or generalists according to the breadth of their niche. Spatial distribution is often used as a proxy measure of niche breadth; generalists have broad niches and a wide spatial distribution, and specialists a narrow niche and spatial distribution. Previous studies suggest that microbial distribution patterns are contrary to this idea: a microbial generalist genus (Desulfobulbus) has a limited spatial distribution while a specialist genus (Methanosaeta) has a cosmopolitan distribution. Therefore, we hypothesise that this counter-intuitive distribution within generalist and specialist microbial genera is a common microbial characteristic. Using molecular fingerprinting, the distributions of four microbial genera (two generalists, Desulfobulbus and the methanogenic archaeon Methanosarcina, and two specialists, Methanosaeta and the sulfate-reducing bacterium Desulfobacter) were analysed in sediment samples from along a UK estuary. Detected genotypes of both generalist genera showed a distinct spatial distribution, significantly correlated with geographic distance between sites. Genotypes of both specialist genera showed no significant differential spatial distribution. These data support the hypothesis that the spatial distribution of specialist and generalist microbes does not match that seen with specialist and generalist large organisms. It may be that generalist microbes, while having a wider potential niche, are constrained, possibly by intrageneric competition, to exploit only a small part of that potential niche, while specialists, with far fewer constraints to their niche, are more capable of filling their potential niche more effectively, perhaps by avoiding intrageneric competition. 
We suggest that these counter-intuitive distribution patterns may be a common feature of microbes in general and represent a distinct microbial principle in ecology, which is a real challenge if we are to develop a truly inclusive ecology. PMID:24465487
High Resolution Tissue Imaging Using the Single-probe Mass Spectrometry under Ambient Conditions
NASA Astrophysics Data System (ADS)
Rao, Wei; Pan, Ning; Yang, Zhibo
2015-06-01
Ambient mass spectrometry imaging (MSI) is an emerging field with great potential for the detailed spatial analysis of biological samples with minimal pretreatment. We have developed a miniaturized sampling and ionization device, the Single-probe, which uses in-situ surface micro-extraction to achieve high detection sensitivity and spatial resolution during MSI experiments. The Single-probe was coupled to a Thermo LTQ Orbitrap XL mass spectrometer and was able to create high spatial and high mass resolution MS images at 8 ± 2 and 8.5 μm on flat polycarbonate microscope slides and mouse kidney sections, respectively, which are among the highest resolutions available for ambient MSI techniques. Our proof-of-principle experiments indicate that the Single-probe MSI technique has the potential to obtain ambient MS images with very high spatial resolutions with minimal sample preparation, which opens the possibility for subcellular ambient tissue MSI to be performed in the future.
The spatial scaling of species interaction networks.
Galiana, Nuria; Lurgi, Miguel; Claramunt-López, Bernat; Fortin, Marie-Josée; Leroux, Shawn; Cazelles, Kevin; Gravel, Dominique; Montoya, José M
2018-05-01
Species-area relationships (SARs) are pivotal to understand the distribution of biodiversity across spatial scales. We know little, however, about how the network of biotic interactions in which biodiversity is embedded changes with spatial extent. Here we develop a new theoretical framework that enables us to explore how different assembly mechanisms and theoretical models affect multiple properties of ecological networks across space. We present a number of testable predictions on network-area relationships (NARs) for multi-trophic communities. Network structure changes as area increases because of the existence of different SARs across trophic levels, the preferential selection of generalist species at small spatial extents and the effect of dispersal limitation promoting beta-diversity. Developing an understanding of NARs will complement the growing body of knowledge on SARs with potential applications in conservation ecology. Specifically, combined with further empirical evidence, NARs can generate predictions of potential effects on ecological communities of habitat loss and fragmentation in a changing world.
NASA Astrophysics Data System (ADS)
Zhang, Chaosheng
2010-05-01
Outliers in urban soil geochemical databases may indicate potentially contaminated land. Several methodologies that can be easily implemented for the identification of global and spatial outliers were applied to Pb concentrations in urban soils of Galway City, Ireland. Because of the strongly skewed distribution of the data, a Box-Cox transformation was performed prior to further analyses. The graphic methods of histogram and box-and-whisker plot were effective in identifying global outliers at the original scale of the dataset. Spatial outliers could be identified by a local indicator of spatial association (local Moran's I), cross-validation of kriging, and geographically weighted regression. The spatial locations of outliers were visualised using a geographical information system. The different methods showed generally consistent results, but differences existed. It is suggested that outliers identified by statistical methods should be confirmed and justified using scientific knowledge before they are properly dealt with.
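The workflow this abstract describes (a Box-Cox transform followed by a box-and-whisker rule for global outliers and a local Moran's I for spatial outliers) can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: the Pb values, the neighbour lists, and the fixed Box-Cox parameter are all hypothetical.

```python
import numpy as np

def boxcox(x, lam):
    """Box-Cox transform to reduce skew (lam = 0 gives the log transform)."""
    x = np.asarray(x, dtype=float)
    return np.log(x) if lam == 0 else (x**lam - 1.0) / lam

def global_outliers(x, k=1.5):
    """Tukey box-and-whisker rule: flag values beyond k*IQR of the quartiles."""
    q1, q3 = np.percentile(x, [25, 75])
    fence = k * (q3 - q1)
    return (x < q1 - fence) | (x > q3 + fence)

def local_moran(x, neighbors):
    """Local Moran's I with binary neighbour lists; strongly negative values
    mark spatial outliers (sites unlike their neighbours)."""
    z = (x - x.mean()) / x.std()
    return np.array([z[i] * z[nbrs].mean() for i, nbrs in enumerate(neighbors)])

# Toy Pb data: one strongly elevated site among background concentrations
pb = np.array([20.0, 22.0, 19.0, 21.0, 400.0, 23.0])
flagged = global_outliers(boxcox(pb, 0))   # log-transform, then boxplot rule
```

In practice the Box-Cox exponent would be estimated from the data and the neighbour lists derived from the GIS, but the two-stage logic (global screening at the transformed scale, then a spatial association statistic) is the same.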
Hanks, Ephraim M.; Schliep, Erin M.; Hooten, Mevin B.; Hoeting, Jennifer A.
2015-01-01
In spatial generalized linear mixed models (SGLMMs), covariates that are spatially smooth are often collinear with spatially smooth random effects. This phenomenon is known as spatial confounding and has been studied primarily in the case where the spatial support of the process being studied is discrete (e.g., areal spatial data). In this case, the most common approach suggested is restricted spatial regression (RSR) in which the spatial random effects are constrained to be orthogonal to the fixed effects. We consider spatial confounding and RSR in the geostatistical (continuous spatial support) setting. We show that RSR provides computational benefits relative to the confounded SGLMM, but that Bayesian credible intervals under RSR can be inappropriately narrow under model misspecification. We propose a posterior predictive approach to alleviating this potential problem and discuss the appropriateness of RSR in a variety of situations. We illustrate RSR and SGLMM approaches through simulation studies and an analysis of malaria frequencies in The Gambia, Africa.
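The core of restricted spatial regression (RSR) is an orthogonality constraint: the spatial random effects are projected onto the orthogonal complement of the fixed-effects design matrix so they cannot be collinear with the covariates. A minimal sketch of that projection, with a hypothetical design matrix rather than the paper's malaria model:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50
# Hypothetical fixed-effects design matrix: intercept plus one smooth covariate
X = np.column_stack([np.ones(n), rng.standard_normal(n)])

# Projection onto the orthogonal complement of col(X); RSR restricts the
# spatial random effects to this subspace.
P_perp = np.eye(n) - X @ np.linalg.solve(X.T @ X, X.T)

w = rng.standard_normal(n)   # a would-be spatial random effect
w_rsr = P_perp @ w           # restricted version: orthogonal to the covariates
```

By construction X.T @ w_rsr is numerically zero, which removes the confounding but, as the abstract notes, can make Bayesian credible intervals inappropriately narrow under model misspecification.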
NASA Astrophysics Data System (ADS)
Sun, Y.
2017-09-01
In the development of sustainable transportation and green cities, policymakers encourage people to commute by cycling and walking instead of by motor vehicle. On the one hand, cycling and walking reduce air pollution emissions. On the other hand, they offer health benefits by increasing people's physical activity. Earlier studies investigating spatial patterns of active travel (cycling and walking) have been limited by a lack of spatially fine-grained data. In recent years, with the development of information and communications technology, GPS-enabled devices have become popular and portable. With smart phones or smart watches, people are able to record their cycling or walking GPS traces as they move. A large number of cyclists and pedestrians upload their GPS traces to sport social media to share their historical traces with other people. These sport social media platforms have thus become a potential source of spatially fine-grained cycling and walking data. Very recently, Strava Metro began offering aggregated cycling and walking data with high spatial granularity, aggregating a large number of cycling and walking GPS traces of Strava users to streets and intersections across a city. Accordingly, as a kind of crowdsourced geographic information, the aggregated data are useful for investigating spatial patterns of cycling and walking activities, and thus hold high potential for understanding cycling and walking behaviour at a large spatial scale. This study is a first step in demonstrating the usefulness of Strava Metro data for exploring cycling and walking patterns at a large scale.
Jones, Leslie A.; Muhlfeld, Clint C.; Marshall, Lucy A.; McGlynn, Brian L.; Kershner, Jeffrey L.
2013-01-01
Understanding the vulnerability of aquatic species and habitats under climate change is critical for conservation and management of freshwater systems. Climate warming is predicted to increase water temperatures in freshwater ecosystems worldwide, yet few studies have developed spatially explicit modelling tools for understanding the potential impacts. We parameterized a nonspatial model, a spatial flow-routed model, and a spatial hierarchical model to predict August stream temperatures (22-m resolution) throughout the Flathead River Basin, USA and Canada. Model comparisons showed that the spatial models performed significantly better than the nonspatial model, explaining the spatial autocorrelation found between sites. The spatial hierarchical model explained 82% of the variation in summer mean (August) stream temperatures and was used to estimate thermal regimes for threatened bull trout (Salvelinus confluentus) habitats, one of the most thermally sensitive coldwater species in western North America. The model estimated summer thermal regimes of spawning and rearing habitats at <13 °C and foraging, migrating, and overwintering habitats at <14 °C. To illustrate the useful application of such a model, we simulated climate warming scenarios to quantify potential loss of critical habitats under forecasted climatic conditions. As air and water temperatures continue to increase, our model simulations show that lower portions of the Flathead River Basin drainage (foraging, migrating, and overwintering habitat) may become thermally unsuitable and headwater streams (spawning and rearing) may become isolated because of increasing thermal fragmentation during summer. Model results can be used to focus conservation and management efforts on populations of concern, by identifying critical habitats and assessing thermal changes at a local scale.
Li, C; Huang, P; Lu, Q; Zhou, M; Guo, L; Xu, X
2014-11-07
Spatial memory retrieval and hippocampal long-term potentiation (LTP) are impaired by stress. KCNQ/Kv7 channels are closely associated with memory, and the KCNQ/Kv7 channel activator flupirtine has neuroprotective effects. This study aims to test whether KCNQ/Kv7 channel activation prevents acute stress-induced impairments of spatial memory retrieval and hippocampal LTP. Rats were placed on an elevated platform in the middle of a bright room for 30 min to evoke acute stress. The expression of KCNQ/Kv7 subunits was analyzed at 1, 3 and 12 h after stress by Western blotting. Spatial memory was examined by the Morris water maze (MWM) and the field excitatory postsynaptic potential (fEPSP) in the hippocampal CA1 area was recorded in vivo. Acute stress transiently decreased the expression of KCNQ2 and KCNQ3 in the hippocampus. Acute stress impaired spatial memory retrieval and hippocampal LTP; the KCNQ/Kv7 channel activator flupirtine prevented these impairments, and the protective effects of flupirtine were blocked by XE-991 (10,10-bis(4-Pyridinylmethyl)-9(10H)-anthracenone), a selective KCNQ channel blocker. Furthermore, acute stress decreased the phosphorylation of glycogen synthase kinase-3β (GSK-3β) at Ser9 in the hippocampus, and flupirtine inhibited this reduction. These results suggest that the KCNQ/Kv7 channels may be a potential target for protecting both hippocampal synaptic plasticity and spatial memory retrieval from the influence of acute stress. Copyright © 2014 IBRO. Published by Elsevier Ltd. All rights reserved.
Extracting potential bus lines of Customized City Bus Service based on public transport big data
NASA Astrophysics Data System (ADS)
Ren, Yibin; Chen, Ge; Han, Yong; Zheng, Huangcheng
2016-11-01
Customized City Bus Service (CCBS) can effectively reduce the traffic congestion and environmental pollution caused by the increase in private cars. This study aims to extract potential CCBS bus lines and each line's passenger density by mining public transport big data. The datasets used in this study are mainly Smart Card Data (SCD) and bus GPS data from Qingdao, China, collected between October 11th and November 7th, 2015. Firstly, we compute the temporal origin-destination (TOD) of passengers by mining the SCD and bus GPS data. Compared with a traditional OD, a TOD not only has the spatial location but also contains the trip's boarding time. Secondly, based on the traditional DBSCAN algorithm, we put forward an algorithm, named TOD-DBSCAN, that incorporates the spatial-temporal features of TOD. TOD-DBSCAN is used to cluster the TOD trajectories in peak hours of all working days. Then, we define two variables, P and N, to describe a potential CCBS line: P is the probability of the CCBS line, and N represents the line's potential passenger density. Lastly, we visualise the potential CCBS lines extracted by our procedure on a map and analyse the relationship between potential CCBS lines and the urban spatial structure.
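The TOD-DBSCAN algorithm itself is not given in the abstract; the sketch below shows the general idea under stated assumptions: a standard DBSCAN whose neighbourhood requires a point to be close in both space and boarding time. The thresholds, point layout, and function names are hypothetical.

```python
import numpy as np

def st_neighbors(points, i, eps_space, eps_time):
    """Indices whose spatial distance AND boarding-time gap to point i are both
    small: the spatio-temporal neighbourhood a TOD-style DBSCAN would use.
    Each row of `points` is (x, y, boarding_time)."""
    p = points[i]
    d_sp = np.hypot(points[:, 0] - p[0], points[:, 1] - p[1])
    d_t = np.abs(points[:, 2] - p[2])
    return np.flatnonzero((d_sp <= eps_space) & (d_t <= eps_time))

def st_dbscan(points, eps_space, eps_time, min_pts):
    """Plain DBSCAN over the spatio-temporal neighbourhood; -1 marks noise."""
    labels = np.full(len(points), -1)
    cluster = 0
    for i in range(len(points)):
        if labels[i] != -1:
            continue
        seeds = st_neighbors(points, i, eps_space, eps_time)
        if len(seeds) < min_pts:
            continue                      # not a core point (may still be a border)
        labels[seeds] = cluster
        frontier = list(seeds)
        while frontier:                   # expand density-reachable points
            j = frontier.pop()
            nbrs = st_neighbors(points, j, eps_space, eps_time)
            if len(nbrs) >= min_pts:
                new = nbrs[labels[nbrs] == -1]
                labels[new] = cluster
                frontier.extend(new.tolist())
        cluster += 1
    return labels

# Three morning trips from roughly the same place and time, plus one stray trip
points = np.array([[0.0, 0.0, 480.0], [0.1, 0.0, 482.0],
                   [0.0, 0.1, 479.0], [5.0, 5.0, 1000.0]])
labels = st_dbscan(points, eps_space=0.5, eps_time=10, min_pts=2)
```

In the study's terms, each resulting cluster would be a candidate CCBS line, with its size feeding the passenger-density variable N.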
Environmental boundaries as a mechanism for correcting and anchoring spatial maps
2016-01-01
Abstract Ubiquitous throughout the animal kingdom, path integration‐based navigation allows an animal to take a circuitous route out from a home base and using only self‐motion cues, calculate a direct vector back. Despite variation in an animal's running speed and direction, medial entorhinal grid cells fire in repeating place‐specific locations, pointing to the medial entorhinal circuit as a potential neural substrate for path integration‐based spatial navigation. Supporting this idea, grid cells appear to provide an environment‐independent metric representation of the animal's location in space and preserve their periodic firing structure even in complete darkness. However, a series of recent experiments indicate that spatially responsive medial entorhinal neurons depend on environmental cues in a more complex manner than previously proposed. While multiple types of landmarks may influence entorhinal spatial codes, environmental boundaries have emerged as salient landmarks that both correct error in entorhinal grid cells and bind internal spatial representations to the geometry of the external spatial world. The influence of boundaries on error correction and grid symmetry points to medial entorhinal border cells, which fire at a high rate only near environmental boundaries, as a potential neural substrate for landmark‐driven control of spatial codes. The influence of border cells on other entorhinal cell populations, such as grid cells, could depend on plasticity, raising the possibility that experience plays a critical role in determining how external cues influence internal spatial representations. PMID:26563618
Robert E. Keane; Janice L. Garner; Kirsten M. Schmidt; Donald G. Long; James P. Menakis; Mark A. Finney
1998-01-01
Fuel and vegetation spatial data layers required by the spatially explicit fire growth model FARSITE were developed for all lands in and around the Selway-Bitterroot Wilderness Area in Idaho and Montana. Satellite imagery and terrain modeling were used to create the three base vegetation spatial data layers of potential vegetation, cover type, and structural stage....
In vitro spatially organizing the differentiation in individual multicellular stem cell aggregates.
Qi, Hao; Huang, Guoyou; Han, Yu Long; Lin, Wang; Li, Xiujun; Wang, Shuqi; Lu, Tian Jian; Xu, Feng
2016-01-01
With significant potential as a robust source for producing specific somatic cells for regenerative medicine, stem cells have attracted increasing attention from both academia and government. In vivo, stem cell differentiation is a process under complicated regulation that precisely builds tissue with unique spatial structures. Since multicellular spheroidal aggregates of stem cells, commonly called embryoid bodies (EBs), are considered capable of recapitulating the events of the early stage of embryonic development, a variety of methods have been developed to form EBs in vitro for studying the differentiation of embryonic stem cells. The regulation of stem cell differentiation is crucial in directing stem cells to build tissue with the correct spatial architecture for specific functions. However, stem cells within three-dimensional multicellular aggregates undergo differentiation in a less predictable and spatially controlled manner in vitro than in vivo. Recently, various microengineering technologies have been developed to manipulate stem cells in vitro in a spatially controlled manner. Herein, we spotlight these technologies and research efforts that offer new potential for the manipulation of stem cells for specific purposes.
A periodic spatio-spectral filter for event-related potentials.
Ghaderi, Foad; Kim, Su Kyoung; Kirchner, Elsa Andrea
2016-12-01
With respect to single trial detection of event-related potentials (ERPs), spatial and spectral filters are two of the most commonly used pre-processing techniques for signal enhancement. Spatial filters reduce the dimensionality of the data while suppressing the noise contribution and spectral filters attenuate frequency components that most likely belong to noise subspace. However, the frequency spectrum of ERPs overlap with that of the ongoing electroencephalogram (EEG) and different types of artifacts. Therefore, proper selection of the spectral filter cutoffs is not a trivial task. In this research work, we developed a supervised method to estimate the spatial and finite impulse response (FIR) spectral filters, simultaneously. We evaluated the performance of the method on offline single trial classification of ERPs in datasets recorded during an oddball paradigm. The proposed spatio-spectral filter improved the overall single-trial classification performance by almost 9% on average compared with the case that no spatial filters were used. We also analyzed the effects of different spectral filter lengths and the number of retained channels after spatial filtering. Copyright © 2016. Published by Elsevier Ltd.
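The paper's joint supervised estimation of the spatial and FIR spectral filters is not reproduced here. As a minimal sketch of the two building blocks it combines, the following applies a windowed-sinc FIR low-pass per channel and then a data-driven spatial projection (a first principal component standing in for the learned spatial filter). All signals and parameters are synthetic.

```python
import numpy as np

def fir_lowpass(cutoff, fs, ntaps=51):
    """Windowed-sinc FIR low-pass taps (Hamming window, unit DC gain)."""
    n = np.arange(ntaps) - (ntaps - 1) / 2
    h = np.sinc(2 * cutoff / fs * n) * np.hamming(ntaps)
    return h / h.sum()

def spatial_filter(X):
    """Data-driven spatial filter: project the channels onto their first
    principal component (a stand-in for a supervised spatial filter)."""
    Xc = X - X.mean(axis=1, keepdims=True)
    _, _, vt = np.linalg.svd(Xc.T, full_matrices=False)
    return vt[0] @ X

# channels x samples EEG-like toy data: a shared slow wave plus channel noise
fs = 100.0
t = np.arange(0, 2, 1 / fs)
source = np.sin(2 * np.pi * 2 * t)            # 2 Hz "ERP-like" component
rng = np.random.default_rng(0)
X = np.vstack([source + 0.3 * rng.standard_normal(t.size) for _ in range(8)])

h = fir_lowpass(8.0, fs)                      # spectral filter: keep the slow band
Xf = np.array([np.convolve(ch, h, mode="same") for ch in X])
y = spatial_filter(Xf)                        # single enhanced time course
```

The point of the paper is that choosing the spectral cutoffs by hand, as done here, is not trivial; their method estimates both filters simultaneously from labelled trials.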
Hoover, Joseph H; Coker, Eric; Barney, Yolanda; Shuey, Chris; Lewis, Johnnye
2018-08-15
Contaminant mixtures are identified regularly in public and private drinking water supplies throughout the United States; however, the complex and often correlated nature of mixtures makes identification of relevant combinations challenging. This study employed a Bayesian clustering method to identify subgroups of water sources with similar metal and metalloid profiles. Additionally, a spatial scan statistic assessed spatial clustering of these subgroups and a human health metric was applied to investigate potential for human toxicity. These methods were applied to a dataset comprised of metal and metalloid measurements from unregulated water sources located on the Navajo Nation, in the southwest United States. Results indicated distinct subgroups of water sources with similar contaminant profiles and that some of these subgroups were spatially clustered. Several profiles had metal and metalloid concentrations that may have potential for human toxicity including arsenic, uranium, lead, manganese, and selenium. This approach may be useful for identifying mixtures in water sources, spatially evaluating the clusters, and help inform toxicological research investigating mixtures. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
Interactive Marine Spatial Planning: Siting Tidal Energy Arrays around the Mull of Kintyre
Alexander, Karen A.; Janssen, Ron; Arciniegas, Gustavo; O'Higgins, Timothy G.; Eikelboom, Tessa; Wilding, Thomas A.
2012-01-01
The rapid development of the offshore renewable energy sector has led to an increased requirement for Marine Spatial Planning (MSP) and, increasingly, this is carried out in the context of the ‘ecosystem approach’ (EA) to management. We demonstrate a novel method to facilitate implementation of the EA. Using a real-time interactive mapping device (touch-table) and stakeholder workshops we gathered data and facilitated negotiation of spatial trade-offs at a potential site for tidal renewable energy off the Mull of Kintyre (Scotland). Conflicts between the interests of tidal energy developers and commercial and recreational users of the area were identified, and use preferences and concerns of stakeholders were highlighted. Social, cultural and spatial issues associated with conversion of common pool to private resource were also revealed. The method identified important gaps in existing spatial data and helped to fill these through interactive user inputs. The workshops developed a degree of consensus between conflicting users on the best areas for potential development suggesting that this approach should be adopted during MSP. PMID:22253865
Carlson, Mary H.; Zientek, Michael L.; Causey, J. Douglas; Kayser, Helen Z.; Spanski, Gregory T.; Wilson, Anna B.; Van Gosen, Bradley S.; Trautwein, Charles M.
2007-01-01
This report compiles selected results from 13 U.S. Geological Survey (USGS) mineral resource assessment studies conducted in Idaho and Montana into consistent spatial databases that can be used in a geographic information system. The 183 spatial databases represent areas of mineral potential delineated in these studies and include attributes on mineral deposit type, level of mineral potential, certainty, and a reference. The assessments were conducted for five 1° x 2° quadrangles (Butte, Challis, Choteau, Dillon, and Wallace), several U.S. Forest Service (USFS) National Forests (including Challis, Custer, Gallatin, Helena, and Payette), and one Bureau of Land Management (BLM) Resource Area (Dillon). The data contained in the spatial databases are based on published information: no new interpretations are made. This digital compilation is part of an ongoing effort to provide mineral resource information formatted for use in spatial analysis. In particular, this is one of several reports prepared to address USFS needs for science information as forest management plans are revised in the Northern Rocky Mountains.
Hevesi, Joseph A.; Flint, Alan L.; Flint, Lorraine E.
2003-01-01
This report presents the development and application of the distributed-parameter watershed model, INFILv3, for estimating the temporal and spatial distribution of net infiltration and potential recharge in the Death Valley region, Nevada and California. The estimates of net infiltration quantify the downward drainage of water across the lower boundary of the root zone and are used to indicate potential recharge under variable climate conditions and drainage basin characteristics. Spatial variability in recharge in the Death Valley region likely is high owing to large differences in precipitation, potential evapotranspiration, bedrock permeability, soil thickness, vegetation characteristics, and contributions to recharge along active stream channels. The quantity and spatial distribution of recharge representing the effects of variable climatic conditions and drainage basin characteristics on recharge are needed to reduce uncertainty in modeling ground-water flow. The U.S. Geological Survey, in cooperation with the Department of Energy, developed a regional saturated-zone ground-water flow model of the Death Valley regional ground-water flow system to help evaluate the current hydrogeologic system and the potential effects of natural or human-induced changes. Although previous estimates of recharge have been made for most areas of the Death Valley region, including the area defined by the boundary of the Death Valley regional ground-water flow system, the uncertainty of these estimates is high, and the spatial and temporal variability of the recharge in these basins has not been quantified. To estimate the magnitude and distribution of potential recharge in response to variable climate and spatially varying drainage basin characteristics, the INFILv3 model uses a daily water-balance model of the root zone with a primarily deterministic representation of the processes controlling net infiltration and potential recharge. 
The daily water balance includes precipitation (as either rain or snow), snow accumulation, sublimation, snowmelt, infiltration into the root zone, evapotranspiration, drainage, water content change throughout the root-zone profile (represented as a 6-layered system), runoff (defined as excess rainfall and snowmelt) and surface water run-on (defined as runoff that is routed downstream), and net infiltration (simulated as drainage from the bottom root-zone layer). Potential evapotranspiration is simulated using an hourly solar radiation model to simulate daily net radiation, and daily evapotranspiration is simulated as an empirical function of root zone water content and potential evapotranspiration. The model uses daily climate records of precipitation and air temperature from a regionally distributed network of 132 climate stations and a spatially distributed representation of drainage basin characteristics defined by topography, geology, soils, and vegetation to simulate daily net infiltration at all locations, including stream channels with intermittent streamflow in response to runoff from rain and snowmelt. The temporal distribution of daily, monthly, and annual net infiltration can be used to evaluate the potential effect of future climatic conditions on potential recharge. The INFILv3 model inputs representing drainage basin characteristics were developed using a geographic information system (GIS) to define a set of spatially distributed input parameters uniquely assigned to each grid cell of the INFILv3 model grid. The model grid, which was defined by a digital elevation model (DEM) of the Death Valley region, consists of 1,252,418 model grid cells with a uniform grid cell dimension of 278.5 meters in the north-south and east-west directions. The elevation values from the DEM were used with monthly regression models developed from the daily climate data to estimate the spatial distribution of daily precipitation and air temperature. 
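One day of such a root-zone water balance can be sketched as a simple bucket model. This is a heavily simplified stand-in for INFILv3, not the USGS implementation: the single-layer storage, the ET scaling by wetness, and the field-capacity threshold are all illustrative assumptions.

```python
def daily_step(storage, capacity, rain, snowmelt, pet, runon=0.0):
    """One day of a bucket-style root-zone water balance (simplified stand-in
    for INFILv3): inflow fills storage, excess above capacity becomes runoff,
    ET is limited by available water, and drainage below a field-capacity
    threshold is counted as net infiltration. All quantities in mm."""
    storage += rain + snowmelt + runon
    runoff = max(0.0, storage - capacity)        # excess rainfall and snowmelt
    storage -= runoff
    et = min(pet * storage / capacity, storage)  # ET scaled by root-zone wetness
    storage -= et
    field_capacity = 0.8 * capacity              # hypothetical threshold
    drainage = max(0.0, storage - field_capacity)  # net infiltration
    storage -= drainage
    return storage, runoff, et, drainage

# A wet day on a 100 mm-capacity root zone
s, ro, et, d = daily_step(storage=60.0, capacity=100.0,
                          rain=55.0, snowmelt=5.0, pet=4.0)
```

Run over a daily climate record per grid cell, with run-on routed downstream between cells, the accumulated drainage term is what the report maps as net infiltration and potential recharge.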
The elevation values were also used to simulate atmosp
Application of a Chimera Full Potential Algorithm for Solving Aerodynamic Problems
NASA Technical Reports Server (NTRS)
Holst, Terry L.; Kwak, Dochan (Technical Monitor)
1997-01-01
A numerical scheme utilizing a chimera zonal grid approach for solving the three dimensional full potential equation is described. Special emphasis is placed on describing the spatial differencing algorithm around the chimera interface. Results from two spatial discretization variations are presented; one using a hybrid first-order/second-order-accurate scheme and the second using a fully second-order-accurate scheme. The presentation is highlighted with a number of transonic wing flow field computations.
Spatial shaping for generating arbitrary optical dipole traps for ultracold degenerate gases
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Jeffrey G., E-mail: jglee@umd.edu; Institute for Physical Science and Technology, University of Maryland, College Park, Maryland 20742; Hill, W. T., E-mail: wth@umd.edu
2014-10-15
We present two spatial-shaping approaches – phase and amplitude – for creating two-dimensional optical dipole potentials for ultracold neutral atoms. When combined with an attractive or repulsive Gaussian sheet formed by an astigmatically focused beam, atoms are trapped in three dimensions resulting in planar confinement with an arbitrary network of potentials – a free-space atom chip. The first approach utilizes an adaptation of the generalized phase-contrast technique to convert a phase structure embedded in a beam after traversing a phase mask, to an identical intensity profile in the image plane. Phase masks, and a requisite phase-contrast filter, can be chemically etched into optical material (e.g., fused silica) or implemented with spatial light modulators; etching provides the highest quality while spatial light modulators enable prototyping and real-time structure modification. This approach was demonstrated on an ensemble of thermal atoms. Amplitude shaping is possible when the potential structure is made as an opaque mask in the path of a dipole trap beam, followed by imaging the shadow onto the plane of the atoms. While much more lossy, this very simple and inexpensive approach can produce dipole potentials suitable for containing degenerate gases. High-quality amplitude masks can be produced with standard photolithography techniques. Amplitude shaping was demonstrated on a Bose-Einstein condensate.
De Azevedo, Thiago S; Bourke, Brian Patrick; Piovezan, Rafael; Sallum, Maria Anice M
2018-05-08
We addressed the potential associations among the temporal and spatial distribution of larval habitats of Aedes (Stegomyia) aegypti, the presence of urban heat islands and socioeconomic factors. Data on larval habitats were collected in Santa Bárbara d'Oeste, São Paulo, Brazil, from 2004 to 2006, and spatial and temporal variations were analysed using a wavelet-based approach. We quantified urban heat islands by calculating surface temperatures using the results of wavelet analyses and grey level transformation from Thematic Mapper images (Landsat 5). Ae. aegypti larval habitats were geo-referenced to match the wavelet analyses and to test the potential association between the geographical distribution of habitats and surface temperature. Using an inhomogeneous spatial point process, we estimated the frequency of occurrence of larval habitats in relation to temperature. The São Paulo State Social Vulnerability Index in the municipality of Santa Bárbara d'Oeste was used to test the potential association between the presence of larval habitats and social vulnerability. We found abundant Ae. aegypti larval habitats in areas of higher surface temperature and social vulnerability and fewer larval habitats in areas with lower surface temperature and social vulnerability.
Continuous Variable Cluster State Generation over the Optical Spatial Mode Comb
Pooser, Raphael C.; Jing, Jietai
2014-10-20
One-way quantum computing uses single-qubit projective measurements performed on a cluster state (a highly entangled state of multiple qubits) in order to enact quantum gates. The model is promising due to its potential scalability; the cluster state may be produced at the beginning of the computation and operated on over time. Continuous variables (CV) offer another potential benefit in the form of deterministic entanglement generation. This determinism can lead to robust cluster states and scalable quantum computation. Recent demonstrations of CV cluster states have made great strides on the path to scalability, utilizing either time or frequency multiplexing in optical parametric oscillators (OPO) both above and below threshold. The techniques relied on a combination of entangling operators and beam splitter transformations. Here we show that an analogous transformation exists for amplifiers with Gaussian input states operating on multiple spatial modes. By judicious selection of local oscillators (LOs), the spatial mode distribution is analogous to the optical frequency comb consisting of axial modes in an OPO cavity. We outline an experimental system that generates cluster states across the spatial frequency comb and can also scale the amount of quantum noise reduction to potentially larger than in other systems.
Biocapacity optimization in regional planning
Guo, Jianjun; Yue, Dongxia; Li, Kai; Hui, Cang
2017-01-01
Ecological overshoot has been accelerating across the globe. Optimizing biocapacity has become key to resolving the overshoot of ecological demand in regional sustainable development. However, most literature has focused on reducing the ecological footprint while ignoring the potential of spatially optimizing biocapacity through regional planning of land use. Here we develop a spatial probability model and present four scenarios for optimizing the biocapacity of a river basin in Northwest China. The potential of enhanced biocapacity and its effects on ecological overshoot and water consumption in the region were explored. Two scenarios with no restrictions on croplands and water use reduced the overshoot by 29 to 53%, whereas two scenarios that do not allow croplands and water use to increase worsened the overshoot by 11 to 15%. More spatially flexible transition rules of land use led to a higher magnitude of change after optimization. However, biocapacity optimization required a large amount of additional water resources, putting considerable pressure on the already water-scarce socio-ecological system. Our results highlight the potential for policy makers to manage and optimize regional land use to address ecological overshoot. Investigation of the feasibility of such spatial optimization complies with forward-looking policies for sustainable development and deserves further attention.
Calibration of a distributed hydrologic model using observed spatial patterns from MODIS data
NASA Astrophysics Data System (ADS)
Demirel, Mehmet C.; González, Gorka M.; Mai, Juliane; Stisen, Simon
2016-04-01
Distributed hydrologic models are typically calibrated against streamflow observations at the outlet of the basin. Along with these observations from gauging stations, satellite-based estimates offer independent evaluation data such as remotely sensed actual evapotranspiration (aET) and land surface temperature. The primary objective of the study is to compare model calibrations against traditional downstream discharge measurements with calibrations against simulated spatial patterns and combinations of both types of observations. While discharge-based model calibration typically improves the temporal dynamics of the model, it seems to yield minimal improvement of the simulated spatial patterns. In contrast, objective functions specifically targeting the spatial pattern performance could potentially increase the spatial model performance. However, most modeling studies, including the model formulations and parameterization, are not designed to actually change the simulated spatial pattern during calibration. This study investigates the potential benefits of incorporating spatial patterns from MODIS data to calibrate the mesoscale hydrologic model (mHM). This model is selected as it allows for a change in the spatial distribution of key soil parameters through the optimization of pedo-transfer function parameters and includes options for using fully distributed daily Leaf Area Index (LAI) values directly as input. In addition, the simulated aET can be estimated at a spatial resolution suitable for comparison to the spatial patterns observed with MODIS data. To increase our control on spatial calibration we introduced three additional parameters to the model. These new parameters are part of an empirical equation to calculate the crop coefficient (Kc) from daily LAI maps, which is used to update potential evapotranspiration (PET) as model input.
This is done instead of correcting/updating PET with just a uniform (or aspect-driven) factor as used in the mHM model (version 5.3). We selected the 20 most important parameters out of 53 mHM parameters based on a comprehensive sensitivity analysis (Cuntz et al., 2015). We calibrated 1 km-daily mHM for the Skjern basin in Denmark using the Shuffled Complex Evolution (SCE) algorithm and inputs at different spatial scales, i.e. meteorological data at 10 km and morphological data at 250 m. We used correlation coefficients between observed monthly (summer months only) MODIS data calculated from cloud-free days over the calibration period from 2001 to 2008 and simulated aET from mHM over the same period. Similarly, other metrics, e.g. mapcurves and fraction skill score, are also included in our objective function to assess the co-location of the grid cells. The preliminary results show that multi-objective calibration of mHM against observed streamflow and spatial patterns together does not significantly reduce the spatial errors in aET, while it improves the streamflow simulations. This is a strong signal for further investigation of the multi-parameter regionalization affecting spatial aET patterns and of the weighting of the spatial metrics in the objective function relative to the streamflow metrics.
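A multi-objective calibration of this kind can be sketched as a single weighted scalar objective combining a streamflow metric with a spatial-pattern metric. The sketch below is an assumption-laden simplification: the use of Nash-Sutcliffe efficiency for discharge, the fixed weight `w_spatial`, and the function names are illustrative, and it omits the study's mapcurves and fraction-skill-score terms.

```python
import numpy as np

def nse(sim, obs):
    """Nash-Sutcliffe efficiency for a streamflow time series (1 = perfect)."""
    sim = np.asarray(sim, float)
    obs = np.asarray(obs, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def spatial_pattern_corr(sim_map, obs_map):
    """Pearson correlation between simulated and observed (e.g. MODIS aET)
    maps, skipping cells masked as NaN (e.g. cloud-covered)."""
    s = np.asarray(sim_map, float).ravel()
    o = np.asarray(obs_map, float).ravel()
    ok = ~(np.isnan(s) | np.isnan(o))
    return np.corrcoef(s[ok], o[ok])[0, 1]

def multi_objective(sim_q, obs_q, sim_maps, obs_maps, w_spatial=0.5):
    """Scalar to MINIMISE: (1 - NSE) for discharge blended with
    (1 - mean spatial correlation) over monthly aET maps."""
    r = np.mean([spatial_pattern_corr(s, o) for s, o in zip(sim_maps, obs_maps)])
    return (1 - w_spatial) * (1.0 - nse(sim_q, obs_q)) + w_spatial * (1.0 - r)
```

An optimizer such as SCE would then search the parameter space to minimise this value; a perfect simulation scores 0.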
NASA Astrophysics Data System (ADS)
Liu, Shurong; Herbst, Michael; Bol, Roland; Gottselig, Nina; Pütz, Thomas; Weymann, Daniel; Wiekenkamp, Inge; Vereecken, Harry; Brüggemann, Nicolas
2016-04-01
Hydroxylamine (NH2OH), a reactive intermediate of several microbial nitrogen turnover processes, is a potential precursor of nitrous oxide (N2O) formation in the soil. However, the contribution of soil NH2OH to soil N2O emission rates in natural ecosystems is unclear. Here, we determined the spatial variability of the NH2OH content and potential N2O emission rates of the organic (Oh) and mineral (Ah) soil layers of a Norway spruce forest, using a recently developed analytical method for the determination of soil NH2OH content, combined with a geostatistical kriging approach. Potential soil N2O emission rates were determined by laboratory incubations under oxic conditions, followed by gas chromatographic analysis, and complemented by ancillary measurements of soil characteristics. Stepwise multiple regressions demonstrated that the potential N2O emission rates, NH2OH content and nitrate (NO3-) content were spatially highly correlated, with hotspots for all three parameters observed in the headwater of a small creek flowing through the sampling area. In contrast, soil ammonium (NH4+) was only weakly correlated with potential N2O emission rates and was excluded from the multiple regression models. While soil NH2OH content explained the potential soil N2O emission rates best for both layers, NO3- and Mn content also turned out to be significant parameters explaining N2O formation in both soil layers. The kriging approach was improved markedly by the addition of the co-variable information of soil NH2OH and NO3- content. The results indicate that determination of the soil NH2OH content could provide crucial information for the prediction of the spatial variability of soil N2O emissions.
APPLICATION OF SPATIAL INFORMATION TECHNOLOGY TO PETROLEUM RESOURCE ASSESSMENT ANALYSIS.
Miller, Betty M.; Domaratz, Michael A.
1984-01-01
Petroleum resource assessment procedures require the analysis of a large volume of spatial data. The US Geological Survey (USGS) has developed and applied spatial information handling procedures and digital cartographic techniques to a recent study involving the assessment of oil and gas resource potential for 74 million acres of designated and proposed wilderness lands in the western United States. The part of the study which dealt with the application of spatial information technology to petroleum resource assessment procedures is reviewed. A method was designed to expedite the gathering, integrating, managing, manipulating and plotting of spatial data from multiple data sources that are essential in modern resource assessment procedures.
Shafer, Sarah L; Bartlein, Patrick J; Gray, Elizabeth M; Pelltier, Richard T
2015-01-01
Future climate change may significantly alter the distributions of many plant taxa. The effects of climate change may be particularly large in mountainous regions where climate can vary significantly with elevation. Understanding potential future vegetation changes in these regions requires methods that can resolve vegetation responses to climate change at fine spatial resolutions. We used LPJ, a dynamic global vegetation model, to assess potential future vegetation changes for a large topographically complex area of the northwest United States and southwest Canada (38.0-58.0°N latitude by 136.6-103.0°W longitude). LPJ is a process-based vegetation model that mechanistically simulates the effect of changing climate and atmospheric CO2 concentrations on vegetation. It was developed and has been mostly applied at spatial resolutions of 10-minutes or coarser. In this study, we used LPJ at a 30-second (~1-km) spatial resolution to simulate potential vegetation changes for 2070-2099. LPJ was run using downscaled future climate simulations from five coupled atmosphere-ocean general circulation models (CCSM3, CGCM3.1(T47), GISS-ER, MIROC3.2(medres), UKMO-HadCM3) produced using the A2 greenhouse gases emissions scenario. Under projected future climate and atmospheric CO2 concentrations, the simulated vegetation changes result in the contraction of alpine, shrub-steppe, and xeric shrub vegetation across the study area and the expansion of woodland and forest vegetation. Large areas of maritime cool forest and cold forest are simulated to persist under projected future conditions. The fine spatial-scale vegetation simulations resolve patterns of vegetation change that are not visible at coarser resolutions and these fine-scale patterns are particularly important for understanding potential future vegetation changes in topographically complex areas.
Three dimensional simulation of spatial and temporal variability of stratospheric hydrogen chloride
NASA Technical Reports Server (NTRS)
Kaye, Jack A.; Rood, Richard B.; Jackman, Charles H.; Allen, Dale J.; Larson, Edmund M.
1989-01-01
Spatial and temporal variability of atmospheric HCl columns are calculated for January 1979 using a three-dimensional chemistry-transport model designed to provide the best possible representation of stratospheric transport. Large spatial and temporal variability of the HCl columns is shown to be correlated with lower stratospheric potential vorticity and thus to be of dynamical origin. Systematic longitudinal structure is correlated with planetary wave structure. These results can help place spatially and temporally isolated column and profile measurements in a regional and/or global perspective.
Functional CAR models for large spatially correlated functional datasets.
Zhang, Lin; Baladandayuthapani, Veerabhadran; Zhu, Hongxiao; Baggerly, Keith A; Majewski, Tadeusz; Czerniak, Bogdan A; Morris, Jeffrey S
2016-01-01
We develop a functional conditional autoregressive (CAR) model for spatially correlated data for which functions are collected on areal units of a lattice. Our model performs functional response regression while accounting for spatial correlations with potentially nonseparable and nonstationary covariance structure, in both the space and functional domains. We show theoretically that our construction leads to a CAR model at each functional location, with spatial covariance parameters varying and borrowing strength across the functional domain. Using basis transformation strategies, the nonseparable spatial-functional model is computationally scalable to enormous functional datasets, generalizable to different basis functions, and can be used on functions defined on higher dimensional domains such as images. Through simulation studies, we demonstrate that accounting for the spatial correlation in our modeling leads to improved functional regression performance. Applied to a high-throughput spatially correlated copy number dataset, the model identifies genetic markers not identified by comparable methods that ignore spatial correlations.
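The areal CAR building block underlying such a model can be written down directly: a proper-CAR precision matrix over the lattice of areal units. In the functional extension the abstract describes, a matrix like this would apply per basis coefficient, with its own spatial parameters borrowing strength across the functional domain. The function name and parameterization below are a generic proper-CAR sketch, not the authors' implementation.

```python
import numpy as np

def car_precision(adjacency, rho, tau=1.0):
    """Precision matrix of a proper conditional autoregressive (CAR) model:
    Q = tau * (D - rho * A), where A is the 0/1 adjacency matrix of the
    areal lattice, D = diag(number of neighbours), and |rho| < 1 keeps Q
    positive definite (rho controls spatial dependence, tau the scale)."""
    A = np.asarray(adjacency, float)
    D = np.diag(A.sum(axis=1))
    return tau * (D - rho * A)
```

Sampling from or evaluating the CAR prior then reduces to standard Gaussian linear algebra with this sparse precision matrix.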
Spatializing health research: what we know and where we are heading
Yang, Tse-Chuan; Shoff, Carla; Noah, Aggie J.
2013-01-01
Beyond individual-level factors, researchers have adopted a spatial perspective to explore potentially modifiable environmental determinants of health. A spatial perspective can be integrated into health research by incorporating spatial data into studies or analyzing georeferenced data. Given the rapid changes in data collection methods and the complex dynamics between individuals and environment, we argue that GIS functions have shortcomings with respect to analytical capability and are limited when it comes to visualizing the temporal component in spatio-temporal data. In addition, we maintain that relatively little effort has been made to handle spatial heterogeneity. To that end, health researchers should be persuaded to better justify the theoretical meaning underlying the spatial matrix in analysis, while spatial data collectors, GIS specialists, spatial analysis methodologists, and the different breeds of users should be encouraged to work together to move health research forward by addressing these issues.
Dark gap solitons in exciton-polariton condensates in a periodic potential.
Cheng, Szu-Cheng; Chen, Ting-Wei
2018-03-01
We show that dark spatial gap solitons can occur inside the band gap of an exciton-polariton condensate (EPC) in a one-dimensional periodic potential. The energy dispersions of an EPC loaded into a periodic potential show a band-gap structure. Using the effective-mass model of the complex Gross-Pitaevskii equation with pump and dissipation in an EPC in a periodic potential, dark gap solitons are demonstrated near the minimum energy points of the band center and band edge of the first and second bands, respectively. The excitation energies of dark gap solitons are below these minimum points and fall into the band gap. The spatial width of a dark gap soliton becomes smaller as the pump power is increased.
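For reference, a commonly used form of the open-dissipative (complex) Gross-Pitaevskii equation with pump and loss in a one-dimensional periodic potential is the following; the exact coefficients and gain-saturation form used by the authors may differ, so this is a generic sketch rather than their equation:

```latex
i\hbar\,\frac{\partial\psi}{\partial t}
= \left[ -\frac{\hbar^{2}}{2m^{*}}\frac{\partial^{2}}{\partial x^{2}}
 + V_{0}\cos^{2}(kx)
 + g\,|\psi|^{2}
 + \frac{i\hbar}{2}\bigl(P - \gamma - \eta\,|\psi|^{2}\bigr) \right]\psi
```

Here \(m^{*}\) is the polariton effective mass, \(V_{0}\) and \(k\) the depth and wavenumber of the periodic potential, \(g\) the polariton-polariton interaction strength, \(P\) the pump rate, \(\gamma\) the polariton loss rate, and \(\eta\) the gain saturation; band-gap structure and gap solitons arise from the interplay of the periodic term with the nonlinearity.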
ERIC Educational Resources Information Center
Sanderson, David J.; Good, Mark A.; Skelton, Kathryn; Sprengel, Rolf; Seeburg, Peter H.; Rawlins, J. Nicholas P.; Bannerman, David M.
2009-01-01
The GluA1 AMPA receptor subunit is a key mediator of hippocampal synaptic plasticity and is especially important for a rapidly-induced, short-lasting form of potentiation. GluA1 gene deletion impairs hippocampus-dependent, spatial working memory, but spares hippocampus-dependent spatial reference memory. These findings may reflect the necessity of…
TBI-Induced Formation of Toxic Tau and Its Biochemical Similarities to Tau in AD Brains
2016-10-01
[Garbled report-form extract; recoverable fragments:] …onto wild-type mice markedly reduces 1) memory, including contextual fear memory and spatial memory, and 2) long-term potentiation, a type of… …mechanism leading to TBI and AD. Keywords: Tau, contextual fear memory, spatial memory, synaptic plasticity, traumatic brain injury, Alzheimer's disease.
Application of the automated spatial surveillance program to birth defects surveillance data.
Gardner, Bennett R; Strickland, Matthew J; Correa, Adolfo
2007-07-01
Although many birth defects surveillance programs incorporate georeferenced records into their databases, practical methods for routine spatial surveillance are lacking. We present a macroprogram written for the software package R designed for routine exploratory spatial analysis of birth defects data, the Automated Spatial Surveillance Program (ASSP), and present an application of this program using spina bifida prevalence data for metropolitan Atlanta. Birth defects surveillance data were collected by the Metropolitan Atlanta Congenital Defects Program. We generated ASSP maps for two groups of years that correspond roughly to the periods before (1994-1998) and after (1999-2002) folic acid fortification of flour. ASSP maps display census tract-specific spina bifida prevalence, smoothed prevalence contours, and locations of statistically elevated prevalence. We used these maps to identify areas of elevated prevalence for spina bifida. We identified a large area of potential concern in the years following fortification of grains and cereals with folic acid. This area overlapped census tracts containing large numbers of Hispanic residents. The potential utility of ASSP for spatial disease monitoring was demonstrated by the identification of areas of high spina bifida prevalence, which may warrant further study and monitoring. We intend to further develop ASSP so that it becomes practical for routine spatial monitoring of birth defects.
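The "statistically elevated prevalence" step that ASSP maps can be illustrated with a minimal one-sided Poisson exceedance test per census tract. ASSP itself is an R macroprogram with smoothing and contouring; the Python sketch below covers only the flagging step, and the function names, data layout, and alpha threshold are illustrative assumptions.

```python
import math

def poisson_sf(k, mu):
    """P(X >= k) for X ~ Poisson(mu), via the complementary CDF."""
    cdf = sum(math.exp(-mu) * mu ** i / math.factorial(i) for i in range(k))
    return 1.0 - cdf

def elevated_tracts(cases, births, alpha=0.05):
    """Flag tracts whose observed case count is statistically elevated
    relative to the overall prevalence. `cases` and `births` are dicts
    keyed by census-tract id; returns {tract: (observed, expected, p)}."""
    overall = sum(cases.values()) / sum(births.values())
    flagged = {}
    for tract, n in births.items():
        expected = overall * n                 # expected cases under H0
        p = poisson_sf(cases[tract], expected) # one-sided exceedance
        if p < alpha:
            flagged[tract] = (cases[tract], expected, p)
    return flagged
```

A production version would also adjust for multiple comparisons across tracts and smooth rates spatially before testing.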
SPATIAL AND DIEL AVAILABILITY OF FLYING INSECTS AS POTENTIAL DUCKLING FOOD IN PRAIRIE WETLANDS.
The study examined spatial and diel availability of flying insects, which are a critical food resource for young ducklings. Insects were sampled in three native prairie wetlands on the Woodworth Study Area of south-central North Dakota.
Spatio-temporal patterns of key exploited marine species in the Northwestern Mediterranean Sea.
Morfin, Marie; Fromentin, Jean-Marc; Jadaud, Angélique; Bez, Nicolas
2012-01-01
This study analyzes the temporal variability/stability of the spatial distributions of key exploited species in the Gulf of Lions (Northwestern Mediterranean Sea). To do so, we analyzed data from the MEDITS bottom-trawl scientific surveys from 1994 to 2010 at 66 fixed stations and selected 12 key exploited species. We proposed a geostatistical approach to handle zero-inflated and non-stationary distributions and to test for the temporal stability of the spatial structures. Empirical Orthogonal Functions and other descriptors were then applied to investigate the temporal persistence and the characteristics of the spatial patterns. The spatial structure of the distribution (i.e. the pattern of spatial autocorrelation) of the 12 key species studied remained highly stable over the time period sampled. The spatial distributions of all species obtained through kriging also appeared to be stable over time, while each species displayed a specific spatial distribution. Furthermore, adults were generally more densely concentrated than juveniles and occupied areas included in the distribution of juveniles. Despite the strong persistence of spatial distributions, we also observed that the area occupied by each species was correlated with its abundance: the more abundant the species, the larger the occupation area. Such a result tends to support MacCall's basin theory, according to which density-dependent responses would drive the expansion of those 12 key species in the Gulf of Lions. Further analyses showed that these species never saturated their habitats, suggesting that they are below their carrying capacity; an assumption in agreement with the overexploitation of several of these species. Finally, the stability of their spatial distributions over time and their potential ability to diffuse outside their main habitats give support to Marine Protected Areas as a potentially pertinent management tool.
NASA Astrophysics Data System (ADS)
Dang, H.; Stayman, J. W.; Xu, J.; Sisniega, A.; Zbijewski, W.; Wang, X.; Foos, D. H.; Aygun, N.; Koliatsos, V. E.; Siewerdsen, J. H.
2016-03-01
Intracranial hemorrhage (ICH) is associated with pathologies such as hemorrhagic stroke and traumatic brain injury. Multi-detector CT is the current front-line imaging modality for detecting ICH (fresh blood contrast 40-80 HU, down to 1 mm). Flat-panel detector (FPD) cone-beam CT (CBCT) offers a potential alternative with a smaller scanner footprint, greater portability, and lower cost, potentially well suited to deployment at the point of care outside standard diagnostic radiology and emergency room settings. Previous studies have suggested reliable detection of ICH down to 3 mm in CBCT using high-fidelity artifact correction and penalized weighted least-squares (PWLS) image reconstruction with a post-artifact-correction noise model. However, ICH reconstructed by traditional image regularization exhibits nonuniform spatial resolution and noise due to interaction between the statistical weights and regularization, which potentially degrades the detectability of ICH. In this work, we propose three regularization methods designed to overcome these challenges. The first two compute spatially varying certainty for uniform spatial resolution and noise, respectively. The third computes spatially varying regularization strength to achieve uniform "detectability," combining both spatial resolution and noise in a manner analogous to a delta-function detection task. Experiments were conducted on a CBCT test-bench, and image quality was evaluated for simulated ICH in different regions of an anthropomorphic head. The first two methods improved the uniformity in spatial resolution and noise compared to traditional regularization. The third exhibited the highest uniformity in detectability among all methods and the best overall image quality. The proposed regularization provides a valuable means to achieve uniform image quality in CBCT of ICH and is being incorporated in a CBCT prototype for ICH imaging.
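The core PWLS idea, and the role of a spatially varying regularization strength, can be reduced to a one-dimensional toy problem: each sample carries a statistical weight, and the penalty strength beta may differ from location to location. This is a didactic reduction with illustrative names, not the paper's CBCT reconstruction (which operates on projection data with task-based certainty maps).

```python
import numpy as np

def pwls_denoise_1d(y, w, beta):
    """Closed-form penalized weighted least-squares estimate of a 1-D signal.
    Minimises (x - y)^T W (x - y) + x^T L x, where W = diag(w) encodes the
    measurement statistics and L is a first-difference roughness penalty
    whose strength beta[i] may vary along the signal (spatially varying
    regularization). Solution: x = (W + L)^{-1} W y."""
    n = len(y)
    W = np.diag(np.asarray(w, float))
    L = np.zeros((n, n))
    for i in range(1, n):
        b = beta[i]                     # penalty on (x[i] - x[i-1])^2
        L[i, i] += b
        L[i - 1, i - 1] += b
        L[i, i - 1] -= b
        L[i - 1, i] -= b
    return np.linalg.solve(W + L, W @ np.asarray(y, float))
```

With uniform beta, regions of low statistical weight get smoothed more than high-weight regions (the nonuniformity the paper describes); tuning beta[i] against w[i] is the 1-D analogue of equalizing resolution, noise, or detectability.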
A geo-spatial data management system for potentially active volcanoes—GEOWARN project
NASA Astrophysics Data System (ADS)
Gogu, Radu C.; Dietrich, Volker J.; Jenny, Bernhard; Schwandner, Florian M.; Hurni, Lorenz
2006-02-01
Integrated studies of active volcanic systems for the purpose of long-term monitoring and forecast and short-term eruption prediction require large numbers of data-sets from various disciplines. A modern database concept has been developed for managing and analyzing multi-disciplinary volcanological data-sets. The GEOWARN project (choosing the "Kos-Yali-Nisyros-Tilos volcanic field, Greece" and the "Campi Flegrei, Italy" as test sites) is oriented toward potentially active volcanoes situated in regions of high geodynamic unrest. This article describes the volcanological database of the spatial and temporal data acquired within the GEOWARN project. As a first step, a spatial database embedded in a Geographic Information System (GIS) environment was created. Digital data of different spatial resolution, and time-series data collected at different intervals or periods, were unified in a common, four-dimensional representation of space and time. The database scheme comprises various information layers containing geographic data (e.g. seafloor and land digital elevation model, satellite imagery, anthropogenic structures, land-use), geophysical data (e.g. from active and passive seismicity, gravity, tomography, SAR interferometry, thermal imagery, differential GPS), geological data (e.g. lithology, structural geology, oceanography), and geochemical data (e.g. from hydrothermal fluid chemistry and diffuse degassing features). As a second step based on the presented database, spatial data analysis has been performed using custom-programmed interfaces that execute query scripts resulting in a graphical visualization of data. These query tools were designed and compiled following scenarios of known "behavior" patterns of dormant volcanoes and first candidate signs of potential unrest. The spatial database and query approach is intended to facilitate scientific research on volcanic processes and phenomena, and volcanic surveillance.
Soto-Moyano, Rubén; Burgos, Héctor; Flores, Francisco; Valladares, Luis; Sierralta, Walter; Fernández, Victor; Pérez, Hernán; Hernández, Paula; Hernández, Alejandro
2006-10-01
Melatonin has been shown to inhibit long-term potentiation (LTP) in hippocampal slices of rats. Since LTP may be one of the main mechanisms by which memory traces are encoded and stored in the central nervous system, it is possible that melatonin could modulate cognitive performance by interfering with the cellular and/or molecular mechanisms involved in LTP. We investigated in rats the effects of intraperitoneally-administered melatonin (0.1, 1 and 10 mg/kg), its saline-ethanol solvent, or saline alone, on the acquisition of visuo-spatial memory as well as on the ability of the cerebral cortex to develop LTP in vivo. Visuo-spatial performance was assessed daily in rats, for 10 days, in an 8-arm radial maze, 30 min after they received a single daily dose of melatonin. Visual cortex LTP was determined in sodium pentobarbital anesthetized rats (65 mg/kg i.p.), by potentiating transcallosal evoked responses with a tetanizing train (312 Hz, 500 ms duration) 30 min after administration of a single dose of melatonin. Results showed that melatonin impaired visuo-spatial performance in rats, as revealed by the greater number of errors committed and time spent to solve the task in the radial maze. Melatonin also prevented the induction of neocortical LTP. It is concluded that melatonin, at the doses utilized in this study, could alter some forms of neocortical plasticity involved in short- and long-term visuo-spatial memories in rats.
NASA Astrophysics Data System (ADS)
Scaduto, David A.; Lubinsky, Anthony R.; Rowlands, John A.; Kenmotsu, Hidenori; Nishimoto, Norihito; Nishino, Takeshi; Tanioka, Kenkichi; Zhao, Wei
2014-03-01
We have previously proposed SAPHIRE (scintillator avalanche photoconductor with high resolution emitter readout), a novel detector concept with potentially superior spatial resolution and low-dose performance compared with existing flat-panel imagers. The detector comprises a scintillator that is optically coupled to an amorphous selenium photoconductor operated with avalanche gain, known as a high-gain avalanche rushing photoconductor (HARP). High-resolution electron beam readout is achieved using a field emitter array (FEA). This combination of avalanche gain, allowing for very low-dose imaging, and electron emitter readout, providing high spatial resolution, offers potentially superior image quality compared with existing flat-panel imagers, with specific applications to fluoroscopy and breast imaging. Through the present collaboration, a prototype HARP sensor with integrated electrostatic focusing and nano-Spindt FEA readout technology has been fabricated. The integrated electron-optic focusing approach is more suitable for fabricating large-area detectors. We investigate the dependence of spatial resolution on sensor structure and operating conditions, and compare the performance of electrostatic focusing with previous technologies. Our results show a clear dependence of spatial resolution on electrostatic focusing potential, with performance approaching that of the previous design with an external mesh electrode. Further, the temporal performance (lag) of the detector was evaluated, and the results show that the integrated electrostatic focusing design exhibits comparable or better performance than the mesh-electrode design. This study represents the first technical evaluation and characterization of the SAPHIRE concept with integrated electrostatic focusing.
Confronting the Paradox of Enrichment to the Metacommunity Perspective
Hauzy, Céline; Nadin, Grégoire; Canard, Elsa; Gounand, Isabelle; Mouquet, Nicolas; Ebenman, Bo
2013-01-01
Resource enrichment can potentially destabilize predator-prey dynamics. This phenomenon, historically referred to as the "paradox of enrichment", has mostly been explored in spatially homogeneous environments. However, many predator-prey communities exchange organisms within spatially heterogeneous networks called metacommunities. This heterogeneity can result from an uneven distribution of resources among communities and thus can lead to the spreading of local enrichment within metacommunities. Here, we adapted the original Rosenzweig-MacArthur predator-prey model, built to study the paradox of enrichment, to investigate the effect of regional enrichment and of its spatial distribution on predator-prey dynamics in metacommunities. We found that the potential for destabilization depended on the connectivity among communities and the spatial distribution of enrichment. On the one hand, we found that at low dispersal regional enrichment led to the destabilization of predator-prey dynamics. This destabilizing effect was more pronounced when the enrichment was uneven among communities. On the other hand, we found that high dispersal could stabilize the predator-prey dynamics when the enrichment was spatially heterogeneous. Our results illustrate that the destabilizing effect of enrichment can be dampened when the spatial scale of resource enrichment is smaller than that of organisms' movements (heterogeneous enrichment). From a conservation perspective, our results illustrate that spatial heterogeneity could decrease the regional extinction risk of species involved in specialized trophic interactions. From the perspective of biological control, our results show that the heterogeneous distribution of pest resources could favor or dampen outbreaks of pests and of their natural enemies, depending on the spatial scale of heterogeneity.
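A minimal two-patch version of the adapted Rosenzweig-MacArthur model can be sketched as below: logistic prey, Holling type-II predation, and predator dispersal at rate d between patches whose carrying capacities K1 and K2 may differ (uneven enrichment). The parameter values and the simple Euler integrator are illustrative assumptions, not those of the paper.

```python
import numpy as np

def rm_two_patch(K1, K2, d, T=1000.0, dt=0.01,
                 r=1.0, a=1.0, h=1.0, e=0.5, m=0.2):
    """Integrate a two-patch Rosenzweig-MacArthur predator-prey model with
    predator dispersal rate d. State s = (N1, P1, N2, P2); returns the
    state at time T. Illustrative parameters: prey growth r, attack rate a,
    handling time h, conversion efficiency e, predator mortality m."""
    def f(s):
        N1, P1, N2, P2 = s
        g1 = a * N1 / (1 + a * h * N1)   # Holling type-II functional response
        g2 = a * N2 / (1 + a * h * N2)
        return np.array([
            r * N1 * (1 - N1 / K1) - g1 * P1,
            e * g1 * P1 - m * P1 + d * (P2 - P1),
            r * N2 * (1 - N2 / K2) - g2 * P2,
            e * g2 * P2 - m * P2 + d * (P1 - P2),
        ])
    s = np.array([K1 / 2, 0.1, K2 / 2, 0.1])
    for _ in range(int(T / dt)):
        s = np.maximum(s + dt * f(s), 0.0)   # forward Euler, clamped at 0
    return s
```

Raising K in one or both patches past the Hopf threshold reproduces the paradox of enrichment (sustained cycles), and varying d against uneven (K1, K2) probes the stabilizing role of dispersal described in the abstract.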
Towards a 3d Spatial Urban Energy Modelling Approach
NASA Astrophysics Data System (ADS)
Bahu, J.-M.; Koch, A.; Kremers, E.; Murshed, S. M.
2013-09-01
Today's need to reduce the environmental impact of energy use imposes dramatic changes on energy infrastructure and existing demand patterns (e.g. buildings) corresponding to their specific context. In addition, future energy systems are expected to integrate a considerable share of fluctuating power sources and equally a high share of distributed generation of electricity. Energy system models capable of describing such future systems and allowing the simulation of the impact of these developments thus require a spatial representation in order to reflect the local context and the boundary conditions. This paper describes two recent research approaches developed at EIFER in the fields of (a) geo-localised simulation of heat energy demand in cities based on 3D morphological data and (b) spatially explicit Agent-Based Models (ABM) for the simulation of smart grids. 3D city models were used to assess the solar potential and heat energy demand of residential buildings, which enables cities to target building refurbishment potentials. Distributed energy systems require innovative modelling techniques where individual components are represented and can interact. With this approach, several smart grid demonstrators were simulated, in which heterogeneous models are spatially represented. Coupling 3D geodata with energy system ABMs holds different advantages for both approaches. On the one hand, energy system models can be enhanced with high-resolution data from 3D city models and their semantic relations. Furthermore, they allow for spatial analysis and visualisation of the results, with emphasis on spatial and structural correlations among the different layers (e.g. infrastructure, buildings, administrative zones) to provide an integrated approach. On the other hand, 3D models can benefit from more detailed system descriptions of energy infrastructure, representing dynamic phenomena and high-resolution models for energy use at component level.
The proposed modelling strategies conceptually and practically integrate urban spatial and energy planning approaches. The combined modelling approach that will be developed based on the described sectorial models holds the potential to represent hybrid energy systems coupling distributed generation of electricity with thermal conversion systems.
Considerations for applying digital soil mapping to ecological sites
USDA-ARS?s Scientific Manuscript database
Recent advancements in the spatial prediction of soil properties are not currently being fully utilized for ecological studies. Linking digital soil mapping (DSM) with ecological sites (ES) has the potential to better inform land management decisions by improving spatial resolution and precision as well as...
Spatial Learning and Wayfinding in an Immersive Environment: The Digital Fulldome.
Hedge, Craig; Weaver, Ruth; Schnall, Simone
2017-05-01
Previous work has examined whether immersive technologies can benefit learning in virtual environments, but the potential benefits of technology in this context are confounded by individual differences such as spatial ability. We assessed spatial knowledge acquisition in male and female participants using a technology not previously examined empirically: the digital fulldome. Our primary aim was to examine whether performance on a test of survey knowledge was better in a fulldome (N = 28, 12 males) relative to a large, flat screen display (N = 27, 13 males). Regression analysis showed that, compared to a flat screen display, males showed higher levels of performance on a test of survey knowledge after learning in the fulldome, but no benefit occurred for females. Furthermore, performance correlated with spatial visualization ability in male participants, but not in female participants. Thus, the digital fulldome is a potentially useful learning aid, capable of accommodating multiple users, but individual differences and use of strategy need to be considered.
Zhang, Chaosheng; Tang, Ya; Luo, Lin; Xu, Weilin
2009-11-01
Outliers in urban soil geochemical databases may indicate potentially contaminated land. Different methodologies that can be easily implemented for the identification of global and spatial outliers were applied to Pb concentrations in urban soils of Galway City in Ireland. Because the concentration data were strongly skewed, a Box-Cox transformation was performed prior to further analyses. The graphic methods of histogram and box-and-whisker plot were effective in identifying global outliers at the original scale of the dataset. Spatial outliers could be identified by a local indicator of spatial association (local Moran's I), cross-validation of kriging, and geographically weighted regression. The spatial locations of outliers were visualised using a geographical information system. The different methods showed generally consistent results, though differences existed. It is suggested that outliers identified by statistical methods be confirmed and justified using scientific knowledge before they are dealt with.
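The global-outlier step described above can be sketched in a few lines: transform the skewed concentrations, then apply the box-and-whisker rule. This is a minimal illustration with made-up values, not the paper's data; the paper fits the Box-Cox lambda, whereas here lambda = 0 (a log transform) is fixed for simplicity.

```python
import numpy as np
from scipy import stats

def global_outliers(values, whisker=1.5):
    """Flag global outliers with the box-and-whisker rule applied to
    Box-Cox-transformed data (values must be strictly positive).
    lambda = 0 (log transform) is fixed here for simplicity."""
    transformed = stats.boxcox(np.asarray(values, dtype=float), lmbda=0)
    q1, q3 = np.percentile(transformed, [25, 75])
    lo = q1 - whisker * (q3 - q1)
    hi = q3 + whisker * (q3 - q1)
    return [v for v, t in zip(values, transformed) if t < lo or t > hi]

# Illustrative skewed "Pb concentrations" (mg/kg) with one extreme value
pb = [20, 25, 22, 30, 28, 24, 26, 21, 23, 900]
print(global_outliers(pb))  # → [900]
```

Without the transformation, the whiskers computed on the raw scale would be inflated by the extreme value itself, which is why the paper transforms first.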
Brown, Jason L; Cameron, Alison; Yoder, Anne D; Vences, Miguel
2014-10-09
Pattern and process are inextricably linked in biogeographic analyses: we can observe pattern, but we must infer process. Inferences of process are often based on ad hoc comparisons using a single spatial predictor. Here, we present an alternative approach that uses mixed-spatial models to measure the predictive potential of combinations of hypotheses. Biodiversity patterns are estimated from 8,362 occurrence records of 745 species of Malagasy amphibians and reptiles. By incorporating 18 spatially explicit predictions of 12 major biogeographic hypotheses, we show that mixed models greatly improve our ability to explain the observed biodiversity patterns. We conclude that the patterns are influenced by a combination of diversification processes rather than by a single predominant mechanism; a 'one-size-fits-all' model does not exist. By developing a novel method for examining and synthesizing spatial parameters such as species richness, endemism and community similarity, we demonstrate the potential of these analyses for understanding the diversification history of Madagascar's biota.
Light-Addressable Potentiometric Sensors for Quantitative Spatial Imaging of Chemical Species.
Yoshinobu, Tatsuo; Miyamoto, Ko-Ichiro; Werner, Carl Frederik; Poghossian, Arshak; Wagner, Torsten; Schöning, Michael J
2017-06-12
A light-addressable potentiometric sensor (LAPS) is a semiconductor-based chemical sensor, in which a measurement site on the sensing surface is defined by illumination. This light addressability can be applied to visualize the spatial distribution of pH or the concentration of a specific chemical species, with potential applications in the fields of chemistry, materials science, biology, and medicine. In this review, the features of this chemical imaging sensor technology are compared with those of other technologies. Instrumentation, principles of operation, and various measurement modes of chemical imaging sensor systems are described. The review discusses and summarizes state-of-the-art technologies, especially with regard to the spatial resolution and measurement speed; for example, a high spatial resolution in a submicron range and a readout speed in the range of several tens of thousands of pixels per second have been achieved with the LAPS. The possibility of combining this technology with microfluidic devices and other potential future developments are discussed.
Recording and assessment of evoked potentials with electrode arrays.
Miljković, N; Malešević, N; Kojić, V; Bijelić, G; Keller, T; Popović, D B
2015-09-01
In order to optimize the procedure for the assessment of evoked potentials and to provide visualization of the flow of action potentials along the motor systems, we introduced array electrodes for stimulation and recording and developed software for the analysis of the recordings. The system uses a stimulator connected to an electrode array for the generation of evoked potentials; an electrode array connected to the amplifier, A/D converter and computer for the recording of evoked potentials; and a dedicated software application. The method has been tested for the assessment of the H-reflex on the triceps surae muscle in six healthy humans. The stimulation electrode array with 16 pads was positioned over the posterior aspect of the thigh, while the recording electrode array with 16 pads was positioned over the triceps surae muscle. The stimulator activated all the pads of the stimulation electrode array asynchronously, while the signals were recorded continuously at all the recording sites. The results are topography maps (spatial distribution of evoked potentials) and matrices (spatial visualization of nerve excitability). The software allows the automatic selection of the lowest stimulation intensity that achieves maximal H-reflex amplitude and the selection of the recording/stimulation pads according to predefined criteria. The analysis of results shows that the method provides richer information on the spatial distribution than conventional recording of the H-reflex.
The role of potential agents in making spatial perspective taking social
Clements-Stephens, Amy M.; Vasiljevic, Katarina; Murray, Alexandra J.; Shelton, Amy L.
2013-01-01
A striking relationship between visual spatial perspective taking (VSPT) and social skills has been demonstrated for perspective-taking tasks in which the target of the imagined or inferred perspective is a potential agent, suggesting that the presence of a potential agent may create a social context for the seemingly spatial task of imagining a novel visual perspective. In a series of studies, we set out to investigate how and when a target might be viewed as sufficiently agent-like to incur a social influence on VSPT performance. By varying the perceptual and conceptual features that defined the targets as potential agents, we find that even something as simple as suggesting animacy for a simple wooden block may be sufficient. More critically, we found that experience with one potential agent influenced the performance with subsequent targets, either by inducing or eliminating the influence of social skills on VSPT performance. These carryover effects suggest that the relationship between social skills and VSPT performance is mediated by a complex relationship that includes the task, the target, and the context in which that target is perceived. These findings highlight potential problems that arise when identifying a task as belonging exclusively to a single cognitive domain and stress instead the highly interactive nature of cognitive domains and their susceptibility to cross-domain individual differences. PMID:24046735
NASA Astrophysics Data System (ADS)
Jang, Cheng-Shin; Liu, Chen-Wuing
2005-10-01
This study aimed to analyze the contamination potential associated with the reactive transport of nitrate-N and ammonium-N in the Choushui River alluvial fan, Taiwan and to evaluate a risk region in developing a groundwater protection policy in 2021. In this area, an aquifer redox sequence provided a good understanding of the spatial distributions of nitrate-N and ammonium-N and of aerobic and anaerobic environments. Equiprobable hydraulic conductivity (K) fields reproduced by geostatistical methods characterized the spatial uncertainty of contaminant transport in the heterogeneous aquifer. Nitrogen contamination potential fronts for high and low threshold concentrations based on a 95% risk probability were used to assess different levels of risk. The simulated result reveals that the spatial uncertainty of highly heterogeneous K fields governs the contamination potential assessment of the nitrogen compounds along the regional flow directions. The contamination potential of nitrate-N is more uncertain than that of ammonium-N. High nitrate-N concentrations (≥3 mg/L) are prevalent in the aerobic environment. The low-concentration nitrate-N plumes (0.5-3 mg/L) gradually migrate to the mid-fan area and to a maximum distance of 15 km from the aerobic region. The nitrate-N plumes pose a potential human health risk in the aerobic and anaerobic environments. The ammonium-N plumes remain stably confined to the distal-fan and partial mid-fan areas.
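The "95% risk probability" fronts described above amount to an exceedance-probability map over an ensemble of equiprobable realizations: for each grid cell, count the fraction of simulated concentration fields that exceed a threshold, and contour where that fraction reaches the risk level. The sketch below uses a synthetic ensemble, not the study's transport model; grid size, means and the 3 mg/L-style threshold are illustrative.

```python
import numpy as np

def exceedance_probability(realizations, threshold):
    """Per-cell probability that the simulated concentration meets or
    exceeds `threshold`, estimated over an ensemble of equiprobable
    realizations of shape (n_realizations, ny, nx)."""
    return (np.asarray(realizations) >= threshold).mean(axis=0)

def risk_front(realizations, threshold, risk=0.95):
    """Boolean mask of cells inside the contamination-potential front."""
    return exceedance_probability(realizations, threshold) >= risk

# Toy ensemble standing in for transport runs on equiprobable K fields:
# 100 realizations on a 4x4 grid; the left two columns are "contaminated"
rng = np.random.default_rng(0)
ens = rng.normal(loc=np.where(np.arange(4) < 2, 5.0, 0.5),
                 scale=0.5, size=(100, 4, 4))
mask = risk_front(ens, threshold=3.0)  # True only in the left two columns
```

Lowering the threshold widens the front, which mirrors the study's use of separate high- and low-threshold fronts to express different levels of risk.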
González, Cristián; Castillo, Miguel; García-Chevesich, Pablo; Barrios, Juan
2018-02-01
A spatial model of Chilean wildfire occurrence was built using Dempster-Shafer evidence theory, considering the 2006-2010 period for the Valparaiso Region (central Chile), a representative area for this experiment. Results indicate strong spatial correlation between documented wildfires and cumulative evidence maps, yielding a powerful tool for future wildfire risk prevention programs. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Chelsky, A.; Marton, J. M.; Bernhard, A. E.; Giblin, A. E.; Setta, S. P.; Hill, T. D.; Roberts, B. J.
2016-02-01
Louisiana salt marshes are important sites for carbon and nitrogen cycling because they can mitigate fluxes of nutrients and carbon to the Gulf of Mexico where a large hypoxic zone develops annually. The aim of this study was to investigate spatial and temporal patterns of biogeochemical processes in Louisiana coastal wetlands during peak growing season, and to investigate whether the Deepwater Horizon oil spill resulted in persistent changes to these rates. We measured nitrification potential and sediment characteristics at two pairs of oiled/unoiled marshes in three regions across the Louisiana coast (Terrebonne and east and west Barataria Bay) in July from 2012 to 2015, with plots along a gradient from the salt marsh edge to the interior. Rates of nitrification potential across the coast (overall mean of 901 ± 115 nmol gdw⁻¹ d⁻¹ from 2012-2014) were high compared to other published rates for salt marshes but displayed high variability at the plot level (4 orders of magnitude). Within each region, interannual means varied by factors of 2-5. Nitrification potential did not differ with oiling history, but did display consistent spatial patterns within each region that corresponded to changes in relative elevation and inundation, which influence patterns of soil properties and microbial communities. In 2015, we also measured greenhouse gas (CO2, N2O and CH4) production and denitrification enzyme activity rates in addition to nitrification potential across the region to investigate spatial relationships between these processes.
NASA Astrophysics Data System (ADS)
Edler, Karl T.
The issue of eddy currents induced by the rapid switching of magnetic field gradients is a long-standing problem in magnetic resonance imaging. A new method for dealing with this problem is presented whereby spatial harmonic components of the magnetic field are continuously sensed, through their temporal rates of change, and corrected. In this way, the effects of the eddy currents on multiple spatial harmonic components of the magnetic field can be detected and corrections applied during the rise time of the gradients. Sensing the temporal changes in each spatial harmonic is made possible with specially designed detection coils. However, to make the design of these coils possible, general relationships between the spatial harmonics of the field, scalar potential, and vector potential are found within the quasi-static approximation. These relationships allow the vector potential to be found from the field (an inverse curl operation) and may be of use beyond the specific problem of detection coil design. Using the detection coils as sensors, methods are developed for designing a negative feedback system to control the eddy current effects and for optimizing that system with respect to image noise and distortion. The design methods are successfully tested in a series of proof-of-principle experiments, which lead to a discussion of how to incorporate similar designs into an operational MRI. Keywords: magnetic resonance imaging, eddy currents, dynamic shimming, negative feedback, quasi-static fields, vector potential, inverse curl
Regional risk assessment for contaminated sites part 2: ranking of potentially contaminated sites.
Pizzol, Lisa; Critto, Andrea; Agostini, Paola; Marcomini, Antonio
2011-11-01
Environmental risks are traditionally assessed and presented in non-spatial ways, although the heterogeneity of contaminants' spatial distributions, the spatial positions of and relations between receptors and stressors, and the spatial distribution of the variables involved in the risk assessment strongly influence exposure estimations and hence risks. Taking spatial variability into account is increasingly recognized as a further and essential step in sound exposure and risk assessment. To address this issue, an innovative methodology integrating spatial analysis with a relative risk approach was developed. Its purpose is to prioritize sites at the regional scale where a preliminary site investigation may be required. The methodology, aimed at supporting the inventory of contaminated sites, was implemented within the spatial decision support sYstem for Regional rIsk Assessment of DEgraded land (SYRIADE) and applied to the case study of the Upper Silesia region (Poland). The developed methodology and tool are both flexible and easy to adapt to different regional contexts, allowing the user to introduce the relevant regional parameters identified on the basis of user expertise and regional data availability. Moreover, the GIS functionalities, integrated with mathematical approaches, allow the multiplicity of sources and impacted receptors within the region of concern to be considered all at once, the risks posed by all contaminated sites in the region to be assessed and, finally, a risk-based ranking of the potentially contaminated sites to be provided. Copyright © 2011. Published by Elsevier Ltd.
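The core of a relative risk ranking of this kind can be sketched as a score per site combining a source hazard rank with distance-weighted receptor sensitivities, then sorting. The functional form, site names, ranks and distances below are entirely hypothetical; SYRIADE's actual scoring is not reproduced here.

```python
def site_risk_score(hazard, receptors, decay=1.0):
    """Unitless relative risk score for one potentially contaminated
    site: a hazard rank scaled by inverse-distance-weighted receptor
    sensitivity weights. The functional form is illustrative only."""
    return hazard * sum(w / (1.0 + decay * d) for d, w in receptors)

# Hypothetical sites: (name, hazard rank, [(distance_km, receptor_weight), ...])
sites = [
    ("site_A", 3, [(0.5, 1.0), (2.0, 0.6)]),
    ("site_B", 5, [(4.0, 0.8)]),
    ("site_C", 2, [(0.2, 1.0)]),
]
ranking = sorted(sites, key=lambda s: site_risk_score(s[1], s[2]), reverse=True)
print([name for name, _, _ in ranking])  # → ['site_A', 'site_C', 'site_B']
```

Note that the moderately hazardous site_A outranks the high-hazard but remote site_B, which is the point of making the assessment spatial: proximity to receptors, not hazard alone, drives the priority.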
Thapa, Dharendra; Shepherd, Danielle L.
2014-01-01
Cardiac tissue contains discrete pools of mitochondria that are characterized by their subcellular spatial arrangement. Subsarcolemmal mitochondria (SSM) exist below the cell membrane, interfibrillar mitochondria (IFM) reside in rows between the myofibrils, and perinuclear mitochondria are situated at the nuclear poles. Microstructural imaging of heart tissue coupled with the development of differential isolation techniques designed to sequentially separate spatially distinct mitochondrial subpopulations have revealed differences in morphological features including shape, absolute size, and internal cristae arrangement. These findings have been complemented by functional studies indicating differences in biochemical parameters and, potentially, functional roles for the ATP generated, based upon subcellular location. Consequently, mitochondrial subpopulations appear to be influenced differently during cardiac pathologies including ischemia/reperfusion, heart failure, aging, exercise, and diabetes mellitus. These influences may be the result of specific structural and functional disparities between mitochondrial subpopulations such that the stress elicited by a given cardiac insult differentially impacts subcellular locales and the mitochondria contained within. The goal of this review is to highlight some of the inherent structural and functional differences that exist between spatially distinct cardiac mitochondrial subpopulations as well as provide an overview of the differential impact of various cardiac pathologies on spatially distinct mitochondrial subpopulations. As an outcome, we will instill a basis for incorporating subcellular spatial location when evaluating the impact of cardiac pathologies on the mitochondrion. Incorporation of subcellular spatial location may offer the greatest potential for delineating the influence of cardiac pathology on this critical organelle. PMID:24778166
Bayesian spatio-temporal discard model in a demersal trawl fishery
NASA Astrophysics Data System (ADS)
Grazia Pennino, M.; Muñoz, Facundo; Conesa, David; López-Quílez, Antonio; Bellido, José M.
2014-07-01
Spatial management of discards has recently been proposed as a useful tool for the protection of juveniles: it reduces discard rates and can serve as a buffer against management errors and recruitment failure. In this study Bayesian hierarchical spatial models were used to analyze about 440 trawl fishing operations of two different metiers, sampled between 2009 and 2012, in order to improve our understanding of the factors that influence the quantity of discards and to identify their spatio-temporal distribution in the study area. Our analysis showed that the relative importance of each variable differed between metiers, with a few similarities. In particular, the random vessel effect and seasonal variability were identified as the main driving variables for both metiers. Predictive maps of the abundance of discards and maps of the posterior mean of the spatial component show several hot spots with high discard concentration for each metier. We argue that the seasonal/spatial effects, and knowledge of the factors that influence discarding, could be exploited as potential mitigation measures in future fisheries management strategies. However, misidentification of hotspots and uncertain predictions can culminate in inappropriate mitigation practices, which can sometimes be irreversible. The proposed Bayesian spatial method overcomes these issues, since it offers a unified approach that allows the incorporation of spatial random-effect terms, spatial correlation of the variables and the uncertainty of the parameters in the modeling process, resulting in a better quantification of uncertainty and more accurate predictions.
Spatial data analysis for exploration of regional scale geothermal resources
NASA Astrophysics Data System (ADS)
Moghaddam, Majid Kiavarz; Noorollahi, Younes; Samadzadegan, Farhad; Sharifi, Mohammad Ali; Itoi, Ryuichi
2013-10-01
Defining a comprehensive conceptual model of the resources sought is one of the most important steps in geothermal potential mapping. In this study, Fry analysis as a spatial distribution method, and 5% well existence, distance distribution, weights of evidence (WofE), and evidential belief function (EBF) methods as spatial association methods, were applied comparatively to known geothermal occurrences and to publicly available regional-scale geoscience data in Akita and Iwate provinces within the Tohoku volcanic arc, in northern Japan. Fry analysis and rose diagrams revealed similar directional patterns among geothermal wells and volcanoes, NNW-, NNE-, and NE-trending faults, hot springs and fumaroles. Among the spatial association methods, WofE defined a conceptual model consistent with real-world conditions, confirmed with the aid of expert opinion. The results of the spatial association analyses quantitatively indicated that the known geothermal occurrences are strongly spatially associated with geological features such as volcanoes, craters and NNW-, NNE-, and NE-trending faults, and with geochemical features such as hot springs, hydrothermal alteration zones and fumaroles. The geophysical evidence comprises temperature gradients over 100 °C/km and heat flow over 100 mW/m². In general, geochemical and geophysical data were better evidence layers than geological data for exploring geothermal resources. The spatial analyses of the case study area suggested that quantitative knowledge of hydrothermal geothermal resources is significantly useful for further exploration and for geothermal potential mapping in the region. The results can also be extended to regions with broadly similar characteristics.
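For a single binary evidence layer, the WofE weights mentioned above are log ratios of conditional probabilities: W+ = ln[P(B|D)/P(B|D̄)] where the evidence B is present, and W- the analogue where it is absent, with D the known occurrences. A sketch with hypothetical cell counts (the counts are not from the Tohoku study):

```python
import math

def weights_of_evidence(n_cells, n_occ, n_b, n_b_occ):
    """Positive and negative weights (W+, W-) for one binary evidence
    layer. n_cells: total unit cells; n_occ: cells with known
    occurrences; n_b: cells where the evidence is present; n_b_occ:
    cells with both evidence and an occurrence."""
    p_b_d = n_b_occ / n_occ                        # P(B | D)
    p_b_nd = (n_b - n_b_occ) / (n_cells - n_occ)   # P(B | not D)
    w_plus = math.log(p_b_d / p_b_nd)
    w_minus = math.log((1 - p_b_d) / (1 - p_b_nd))
    return w_plus, w_minus

# Hypothetical layer: 10,000 cells, 50 occurrences, evidence present on
# 1,000 cells, 30 of which host occurrences
wp, wm = weights_of_evidence(10_000, 50, 1_000, 30)
contrast = wp - wm  # positive contrast: positive spatial association
```

The contrast C = W+ - W- is the usual single-number summary of how strongly the layer is spatially associated with the occurrences, which is how the study compares, say, geochemical against geological evidence layers.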
How Math Anxiety Relates to Number-Space Associations.
Georges, Carrie; Hoffmann, Danielle; Schiltz, Christine
2016-01-01
Given the considerable prevalence of math anxiety, it is important to identify the factors contributing to it in order to improve mathematical learning. Research on math anxiety typically focusses on the effects of more complex arithmetic skills. Recent evidence, however, suggests that deficits in basic numerical processing and spatial skills also constitute potential risk factors of math anxiety. Given these observations, we determined whether math anxiety also depends on the quality of spatial-numerical associations. Behavioral evidence for a tight link between numerical and spatial representations is given by the SNARC (spatial-numerical association of response codes) effect, characterized by faster left-/right-sided responses for small/large digits respectively in binary classification tasks. We compared the strength of the SNARC effect between high and low math anxious individuals using the classical parity judgment task in addition to evaluating their spatial skills, arithmetic performance, working memory and inhibitory control. Greater math anxiety was significantly associated with stronger spatio-numerical interactions. This finding adds to the recent evidence supporting a link between math anxiety and basic numerical abilities and strengthens the idea that certain characteristics of low-level number processing such as stronger number-space associations constitute a potential risk factor of math anxiety.
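The strength of the SNARC effect referred to above is conventionally quantified per participant as the slope of a regression of dRT (right-hand minus left-hand response time) on digit magnitude; a more negative slope means stronger number-space association. A minimal sketch with illustrative (not empirical) response times:

```python
import numpy as np

def snarc_slope(digits, drt):
    """Slope of the regression of dRT (right-hand minus left-hand
    response time, in ms) on digit magnitude; a negative slope is the
    classic SNARC signature (relatively faster right-hand responses
    for larger digits)."""
    slope, _intercept = np.polyfit(digits, drt, 1)
    return slope

digits = [1, 2, 3, 4, 6, 7, 8, 9]           # parity-task digits, 5 excluded
drt = [35, 28, 20, 12, -10, -18, -25, -33]  # illustrative single participant
print(round(snarc_slope(digits, drt), 2))   # → -8.82
```

Comparing the distribution of such per-participant slopes between high and low math anxious groups is the kind of analysis the study's SNARC comparison rests on.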
Wei, Jianbing; Feng, Hao; Cheng, Quanguo; Gao, Shiqian; Liu, Haiyan
2017-02-01
The objective of this study was to test the hypothesis that the environmental regulators of riparian zone soil denitrification potential differ according to spatial scale within a watershed; a second objective was to provide spatial strategies for conserving and restoring the runoff purification function of riparian ecosystems. The results show that soil denitrification in riparian zones was more heterogeneous at the profile scale than at the cross-section and landscape scales. At the profile scale, biogeochemical factors (including soil total organic carbon, total nitrogen, and nitrate-nitrogen) were the major direct regulators of the spatial distribution of soil denitrification enzyme activity (DEA). At the cross-section scale, factors included distance from river bank and vegetation density, while landscape-scale factors, including topographic index, elevation, and land use types, indirectly regulated the spatial distribution of DEA. At the profile scale, soil DEA was greatest in the upper soil layers. At the cross-section scale, maximum soil DEA occurred in the mid-part of the riparian zone. At the landscape scale, soil DEA showed an increasing trend towards downstream sites, except for those in urbanized areas.
Norris, Edmund J; Coats, Joel R
2017-01-29
Every year, approximately 700,000 people die from complications associated with etiologic disease agents transmitted by mosquitoes. While insecticide-based vector control strategies are important for the management of mosquito-borne diseases, insecticide resistance and other logistical hurdles may lower the efficacy of this approach, especially in developing countries. Repellent technologies represent another fundamental aspect of preventing mosquito-borne disease transmission. Among these technologies, spatial repellents are promising alternatives to the currently utilized contact repellents and may significantly aid in the prevention of mosquito-borne disease if properly incorporated into integrated pest management approaches. As their deployment would not rely on prohibitively expensive or impractical novel accessory technologies and resources, they have potential utility in developing countries, where the burden of mosquito-borne disease is greatest. This review aims to describe the history of various repellent technologies, highlight the potential of repellent technologies in preventing the spread of mosquito-borne disease, and discuss currently known mechanisms that confer resistance to current contact and spatial repellents and may lead to the failure of these repellents. In the subsequent section, current and future research projects aimed at exploring long-lasting non-pyrethroid spatial repellent molecules, along with new paradigms and rationale for their development, are discussed.