Survey Strategy Optimization for the Atacama Cosmology Telescope
NASA Technical Reports Server (NTRS)
De Bernardis, F.; Stevens, J. R.; Hasselfield, M.; Alonso, D.; Bond, J. R.; Calabrese, E.; Choi, S. K.; Crowley, K. T.; Devlin, M.; Wollack, E. J.
2016-01-01
In recent years there have been significant improvements in the sensitivity and the angular resolution of the instruments dedicated to the observation of the Cosmic Microwave Background (CMB). ACTPol is the first polarization receiver for the Atacama Cosmology Telescope (ACT) and is observing the CMB sky with arcmin resolution over approximately 2000 square degrees. Its upgrade, Advanced ACTPol (AdvACT), will observe the CMB in five frequency bands and over a larger area of the sky. We describe the optimization and implementation of the ACTPol and AdvACT surveys. The selection of the observed fields is driven mainly by the science goals, that is, small angular scale CMB measurements, B-mode measurements and cross-correlation studies. For the ACTPol survey we have observed patches of the southern galactic sky with low galactic foreground emissions which were also chosen to maximize the overlap with several galaxy surveys to allow unique cross-correlation studies. A wider field in the northern galactic cap ensured significant additional overlap with the BOSS spectroscopic survey. The exact shapes and footprints of the fields were optimized to achieve uniform coverage and to obtain cross-linked maps by observing the fields with different scan directions. We have maximized the efficiency of the survey by implementing a close to 24-hour observing strategy, switching between daytime and nighttime observing plans and minimizing the telescope idle time. We describe the challenges represented by the survey optimization for the significantly wider area observed by AdvACT, which will observe roughly half of the low-foreground sky. The survey strategies described here may prove useful for planning future ground-based CMB surveys, such as the Simons Observatory and CMB Stage IV surveys.
Optimizing the LSST Dither Pattern for Survey Uniformity
NASA Astrophysics Data System (ADS)
Awan, Humna; Gawiser, Eric J.; Kurczynski, Peter; Carroll, Christopher M.; LSST Dark Energy Science Collaboration
2015-01-01
The Large Synoptic Survey Telescope (LSST) will gather detailed data on the southern sky, enabling unprecedented study of Baryonic Acoustic Oscillations, which are an important probe of dark energy. These studies require a survey with highly uniform depth, and we aim to find an observation strategy that optimizes this uniformity. We have shown that in the absence of dithering (large telescope-pointing offsets), the LSST survey will vary significantly in depth. Hence, we implemented various dithering strategies, including random and repulsive random pointing offsets and spiral patterns with the spiral reaching completion in either a few months or the entire ten-year run. We employed three different implementations of dithering strategies: a single offset assigned to all fields observed on each night, offsets assigned to each field independently whenever the field is observed, and offsets assigned to each field only when the field is observed on a new night. Our analysis reveals that large dithers are crucial to guarantee survey uniformity and that assigning dithers to each field independently whenever the field is observed significantly increases this uniformity. These results suggest paths towards an optimal observation strategy that will enable LSST to achieve its science goals. We gratefully acknowledge support from the National Science Foundation REU program at Rutgers, PHY-1263280, and the Department of Energy, DE-SC0011636.
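As a toy illustration of the three dither-assignment schemes compared above, the following Python sketch draws random offsets under each scheme; the field/visit bookkeeping and the maximum dither radius are assumptions for the demo, not LSST code.

import numpy as np

rng = np.random.default_rng(42)
MAX_DITHER = 1.75  # deg; assumed here to be of order the field-of-view radius

def random_offset():
    """Draw one dither uniformly inside a disc of radius MAX_DITHER."""
    r = MAX_DITHER * np.sqrt(rng.uniform())
    theta = rng.uniform(0.0, 2.0 * np.pi)
    return r * np.cos(theta), r * np.sin(theta)

def assign_dithers(visits, scheme):
    """visits: list of (night, field_id). Returns one dither per visit."""
    per_night, per_field_night, offsets = {}, {}, []
    for night, field in visits:
        if scheme == "per_night":              # one offset for all fields each night
            per_night.setdefault(night, random_offset())
            offsets.append(per_night[night])
        elif scheme == "per_visit":            # new offset every time a field is observed
            offsets.append(random_offset())
        elif scheme == "per_field_new_night":  # new offset only on a new night
            key = (field, night)
            per_field_night.setdefault(key, random_offset())
            offsets.append(per_field_night[key])
    return offsets

visits = [(n, f) for n in range(3) for f in (101, 102)]
print(assign_dithers(visits, "per_visit"))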
Optimizing baryon acoustic oscillation surveys - II. Curvature, redshifts and external data sets
NASA Astrophysics Data System (ADS)
Parkinson, David; Kunz, Martin; Liddle, Andrew R.; Bassett, Bruce A.; Nichol, Robert C.; Vardanyan, Mihran
2010-02-01
We extend our study of the optimization of large baryon acoustic oscillation (BAO) surveys to return the best constraints on the dark energy, building on Paper I of this series by Parkinson et al. The survey galaxies are assumed to be pre-selected active, star-forming galaxies observed by their line emission with a constant number density across the redshift bin. Star-forming galaxies have a redshift desert in the region 1.6 < z < 2, and so this redshift range was excluded from the analysis. We use the Seo & Eisenstein fitting formula for the accuracies of the BAO measurements, using only the information for the oscillatory part of the power spectrum as distance and expansion rate rulers. We go beyond our earlier analysis by examining the effect of including curvature on the optimal survey configuration and updating the expected `prior' constraints from Planck and the Sloan Digital Sky Survey. We once again find that the optimal survey strategy involves minimizing the exposure time and maximizing the survey area (within the instrumental constraints), and that all time should be spent observing in the low-redshift range (z < 1.6) rather than beyond the redshift desert, z > 2. We find that, when assuming a flat universe, the optimal survey makes measurements in the redshift range 0.1 < z < 0.7, but that including curvature as a nuisance parameter requires us to push the maximum redshift to 1.35, to remove the degeneracy between curvature and evolving dark energy. The inclusion of expected other data sets (such as WiggleZ, the Baryon Oscillation Spectroscopic Survey and a stage III Type Ia supernova survey) removes the necessity of measurements below redshift 0.9, and pushes the maximum redshift up to 1.5. We discuss considerations in determining the best survey strategy in light of uncertainty in the true underlying cosmological model.
Ocean data assimilation using optimal interpolation with a quasi-geostrophic model
NASA Technical Reports Server (NTRS)
Rienecker, Michele M.; Miller, Robert N.
1991-01-01
A quasi-geostrophic (QG) stream function is analyzed by optimal interpolation (OI) over a 59-day period in a 150-km-square domain off northern California. Hydrographic observations acquired over five surveys were assimilated into a QG open boundary ocean model. Assimilation experiments were conducted separately for individual surveys to investigate the sensitivity of the OI analyses to parameters defining the decorrelation scale of an assumed error covariance function. The analyses were intercompared through dynamical hindcasts between surveys. The best hindcast was obtained using the smooth analyses produced with assumed error decorrelation scales identical to those of the observed stream function. The rms difference between the hindcast stream function and the final analysis was only 23 percent of the observation standard deviation. The two sets of OI analyses were temporally smoother than the fields from statistical objective analysis and in good agreement with the only independent data available for comparison.
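The OI analysis step underlying such experiments can be written compactly. The sketch below shows the generic update on a 1-D grid with a Gaussian error covariance whose decorrelation scale L stands in for the parameters varied in the study; all values are illustrative assumptions, not those of the paper.

import numpy as np

def oi_analysis(xb, grid, obs, obs_loc, L=50.0, obs_var=0.1):
    """xb: background field on a 1-D grid; obs at obs_loc; L: decorrelation scale (km)."""
    # Background error covariance with an assumed Gaussian decorrelation model
    B = np.exp(-0.5 * ((grid[:, None] - grid[None, :]) / L) ** 2)
    # Observation operator: nearest grid point (linear interpolation also common)
    H = np.zeros((len(obs), len(grid)))
    for i, xo in enumerate(obs_loc):
        H[i, np.argmin(np.abs(grid - xo))] = 1.0
    R = obs_var * np.eye(len(obs))                # observation error covariance
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)  # gain matrix
    return xb + K @ (obs - H @ xb)                # analysis field

grid = np.linspace(0.0, 150.0, 31)                # km
xb = np.zeros_like(grid)                          # flat background
xa = oi_analysis(xb, grid, obs=np.array([1.0, -0.5]), obs_loc=np.array([40.0, 110.0]))
print(xa.round(2))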
NASA Astrophysics Data System (ADS)
Jouvel, S.; Kneib, J.-P.; Bernstein, G.; Ilbert, O.; Jelinsky, P.; Milliard, B.; Ealet, A.; Schimd, C.; Dahlen, T.; Arnouts, S.
2011-08-01
Context. With the discovery of the accelerated expansion of the universe, different observational probes have been proposed to investigate the presence of dark energy, including possible modifications to the gravitation laws, by accurately measuring the expansion of the Universe and the growth of structures. We need to optimize the return from future dark energy surveys to obtain the best results from these probes. Aims: A high precision weak-lensing analysis requires not only an accurate measurement of galaxy shapes but also a precise and unbiased measurement of galaxy redshifts. The survey strategy has to be defined considering both the photometric redshift and the shape measurement accuracy. Methods: We define the key properties of the weak-lensing instrument and compute the effective PSF and the overall throughput and sensitivities. We then investigate the impact of the pixel scale on the sampling of the effective PSF, and place upper limits on the pixel scale. We then define the survey strategy, computing the survey area and including in particular both the Galactic absorption and the Zodiacal light variation across the sky. Using the Le Phare photometric redshift code and a realistic galaxy mock catalog, we investigate the properties of different filter-sets and the importance of the u-band photometry quality to optimize the photometric redshift and the dark energy figure of merit (FoM). Results: Using the predicted photometric redshift quality, simple shape measurement requirements, and a proper sky model, we explore what could be an optimal weak-lensing dark energy mission based on FoM calculation. We find that we can derive the most accurate photometric redshifts for the bulk of the faint galaxy population when filters have a resolution ℛ ~ 3.2. We show that an optimal mission would survey the sky through eight filters using two cameras (visible and near infrared). Assuming a five-year mission duration, a mirror size of 1.5 m and a 0.5 deg2 FOV with a visible pixel scale of 0.15'', we found that a homogeneous survey reaching I_AB = 25.6 (10σ) with a sky coverage of ~11 000 deg2 maximizes the weak lensing FoM. The effective number density of galaxies used for WL is then ~45 gal/arcmin2, which is at least a factor of two higher than ground-based surveys. Conclusions: This study demonstrates that a full account of the observational strategy is required to properly optimize the instrument parameters and maximize the FoM of the future weak-lensing space dark energy mission.
Optimization of spectroscopic surveys for testing non-Gaussianity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Raccanelli, Alvise; Doré, Olivier; Dalal, Neal
We investigate optimization strategies to measure primordial non-Gaussianity with future spectroscopic surveys. We forecast measurements coming from the 3D galaxy power spectrum and compute constraints on the primordial non-Gaussianity parameters f_NL and n_NG. After studying the dependence of those parameters upon survey specifications such as redshift range, area, and number density, we assume a reference mock survey and investigate the trade-off between number density and area surveyed. We then define the observational requirements to reach a detection of f_NL of order 1. Our results show that power spectrum constraints on non-Gaussianity from future spectroscopic surveys can improve on current CMB limits, but the multi-tracer technique and higher order correlations will be needed in order to reach an even better precision in the measurements of the non-Gaussianity parameter f_NL.
Observation strategies with the Fermi Gamma-ray Space Telescope
NASA Astrophysics Data System (ADS)
McEnery, Julie E.; Fermi mission Teams
2015-01-01
During the first few years of the Fermi mission, the default observation mode has been an all-sky survey, optimized to provide relatively uniform coverage of the entire sky every three hours. Over 95% of the mission has been performed in this observation mode. However, Fermi is capable of flexible survey-mode patterns and inertially pointed observations, both of which allow increased coverage of selected parts of the sky. In this presentation, we will describe the types of observations that Fermi can make, the relative advantages and disadvantages of various observations, and provide guidelines to help Fermi users plan and evaluate non-standard observations.
Using simulation to improve wildlife surveys: Wintering mallards in Mississippi, USA
Pearse, A.T.; Reinecke, K.J.; Dinsmore, S.J.; Kaminski, R.M.
2009-01-01
Wildlife conservation plans generally require reliable data about population abundance and density. Aerial surveys often can provide these data; however, associated costs necessitate designing and conducting surveys efficiently. We developed methods to simulate population distributions of mallards (Anas platyrhynchos) wintering in western Mississippi, USA, by combining bird observations from three previous strip-transect surveys and habitat data from three sets of satellite images representing conditions when surveys were conducted. For each simulated population distribution, we compared 12 primary survey designs and two secondary design options by using coefficients of variation (CV) of population indices as the primary criterion for assessing survey performance. In all, 3 of the 12 primary designs provided the best precision (CV ≤ 11.7%) and performed equally well (difference ≤ 0.6%). Features of the designs that provided the largest gains in precision were optimal allocation of sample effort among strata and configuring the study area into five rather than four strata, to more precisely estimate mallard indices in areas of consistently high density. Of the two secondary design options, we found including a second observer to double the size of strip transects increased precision or decreased costs, whereas ratio estimation using auxiliary habitat data from satellite images did not increase precision appreciably. We recommend future surveys of mallard populations in our study area use the strata we developed, optimally allocate samples among strata, employ PPS or EPS sampling, and include two observers when qualified staff are available. More generally, the methods we developed to simulate population distributions from prior survey data provide a cost-effective method to assess performance of alternative wildlife surveys critical to informing management decisions, and could be extended to account for effects of detectability on estimates of true abundance. © 2009 CSIRO.
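The "optimal allocation of sample effort among strata" mentioned above is conventionally Neyman allocation; a minimal sketch follows, with invented stratum sizes and standard deviations (a final adjustment may be needed so the rounded allocations hit the exact total).

import numpy as np

N_h = np.array([120, 80, 50, 30, 20])          # stratum sizes (e.g., transect counts)
S_h = np.array([5.0, 8.0, 20.0, 35.0, 60.0])   # stratum SDs (high-density strata vary more)
n_total = 60                                   # total transects the budget allows

weights = N_h * S_h
n_h = np.rint(n_total * weights / weights.sum()).astype(int)
print(dict(zip(range(1, 6), n_h)))             # samples to allocate per stratum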
The ESSENCE Supernova Survey: Survey Optimization, Observations, and Supernova Photometry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miknaitis, Gajus; Pignata, G.; Rest, A.
We describe the implementation and optimization of the ESSENCE supernova survey, which we have undertaken to measure the equation of state parameter of the dark energy. We present a method for optimizing the survey exposure times and cadence to maximize our sensitivity to the dark energy equation of state parameter w = P/ρc² for a given fixed amount of telescope time. For our survey on the CTIO 4m telescope, measuring the luminosity distances and redshifts for supernovae at modest redshifts (z ≈ 0.5 ± 0.2) is optimal for determining w. We describe the data analysis pipeline based on using reliable and robust image subtraction to find supernovae automatically and in near real-time. Since making cosmological inferences with supernovae relies crucially on accurate measurement of their brightnesses, we describe our efforts to establish a thorough calibration of the CTIO 4m natural photometric system. In its first four years, ESSENCE has discovered and spectroscopically confirmed 102 type Ia SNe, at redshifts from 0.10 to 0.78, identified through an impartial, effective methodology for spectroscopic classification and redshift determination. We present the resulting light curves for all type Ia supernovae found by ESSENCE and used in our measurement of w, presented in Wood-Vasey et al. (2007).
When to stop managing or surveying cryptic threatened species
Chadès, Iadine; McDonald-Madden, Eve; McCarthy, Michael A.; Wintle, Brendan; Linkie, Matthew; Possingham, Hugh P.
2008-01-01
Threatened species become increasingly difficult to detect as their populations decline. Managers of such cryptic threatened species face several dilemmas: if they are not sure the species is present, should they continue to manage for that species or invest the limited resources in surveying? We find optimal solutions to this problem using a Partially Observable Markov Decision Process and rules of thumb derived from an analytical approximation. We discover that managing a protected area for a cryptic threatened species can be optimal even if we are not sure the species is present. The more threatened and valuable the species is, relative to the costs of management, the more likely we are to manage this species without determining its continued persistence by using surveys. If a species remains unseen, our belief in the persistence of the species declines to a point where the optimal strategy is to shift resources from saving the species to surveying for it. Finally, when surveys lead to a sufficiently low belief that the species is extant, we surrender resources to other conservation actions. We illustrate our findings with a case study using parameters based on the critically endangered Sumatran tiger (Panthera tigris sumatrae), and we generate rules of thumb on how to allocate conservation effort for any cryptic species. Using Partially Observable Markov Decision Processes in conservation science, we determine the conditions under which it is better to abandon management for that species because our belief that it continues to exist is too low. PMID:18779594
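The belief dynamics driving these rules of thumb can be illustrated with a minimal Bayes update under repeated non-detection; the persistence and detection probabilities below are assumed demo values, not the paper's tiger parameters.

p_persist = 0.95   # annual probability the species persists if extant
p_detect = 0.3     # probability a survey detects the species if present

belief = 0.9       # initial belief the species is extant
for year in range(1, 11):
    prior = belief * p_persist                 # survives to this year
    # survey finds nothing: down-weight by the miss probability
    belief = prior * (1 - p_detect) / (prior * (1 - p_detect) + (1 - prior))
    print(f"year {year}: belief extant = {belief:.3f}")
# Once belief falls below thresholds set by costs and species value, the optimal
# policy switches from managing to surveying, and eventually to surrender.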
Matanda, Dennis Juma; Urke, Helga Bjørnøy; Mittelmark, Maurice B
2016-01-01
Using nationally representative surveys conducted in Kenya, this study examined optimal health promoting childcare practices in 2003, 2008-9 and 2014. This was undertaken in the context of continuous child health promotion activities conducted by government and non-government organizations throughout Kenya. It was the aim of such activities to increase the prevalence of health promoting childcare practices; to what extent have there been changes in optimal childcare practices in Kenya during the 11-year period under study? Cross-sectional data were obtained from the Kenya Demographic and Health Surveys conducted in 2003, 2008-9 and 2014. Women 15-49 years old with children 0-59 months were interviewed about a range of childcare practices. Logistic regression analysis was used to examine changes in, and correlates of, optimal childcare practices using the 2003, 2008-9 and 2014 data. Samples of 5949, 6079 and 20964 women interviewed in 2003, 2008-9 and 2014, respectively, were used in the analysis. Between 2003 and 2014, there were increases in all health facility-based childcare practices, with major increases observed in seeking medical treatment for diarrhoea and complete child vaccination. Mixed results were observed in home-based care, where increases were noted in the use of insecticide-treated bed nets, sanitary stool disposal and use of oral rehydration solutions, while decreases were observed in the prevalence of urging more fluid/food during diarrhoea and consumption of a minimum acceptable diet. Logit models showed that area of residence (region), household wealth, maternal education, parity, mother's age, child's age and pregnancy history were significant determinants of optimal childcare practices across the three surveys. The study observed variation in the uptake of the recommended optimal childcare practices in Kenya. National, regional and local child health promotion activities, coupled with changes in society and in living conditions between 2003 and 2014, could have influenced uptake of certain recommended childcare practices in Kenya. The decreases in the prevalence of children who were offered the same amount or more fluid/food when they had diarrhoea and of children who consumed the minimum acceptable diet are alarming, and perhaps a red flag to stakeholders who may have focused more on health facility-based care at the expense of home-based care. Concerted efforts are needed to address the consistent inequities in the uptake of the recommended childcare practices. Such efforts should be cognizant of the underlying factors that affect childcare in Kenya, herein defined as region, household wealth, maternal education, parity, mother's age, child's age and pregnancy history.
VizieR Online Data Catalog: AKARI NEP Survey sources at 18um (Pearson+, 2014)
NASA Astrophysics Data System (ADS)
Pearson, C. P.; Serjeant, S.; Oyabu, S.; Matsuhara, H.; Wada, T.; Goto, T.; Takagi, T.; Lee, H. M.; Im, M.; Ohyama, Y.; Kim, S. J.; Murata, K.
2015-04-01
The NEP-Deep survey at 18 μm in the IRC-L18W band is constructed from a total of 87 individual pointed observations taken between May 2006 and August 2007, using the IRC Astronomical Observing Template (AOT) designed for deep observations (IRC05), with approximately 2500 second exposures per IRC filter in all mid-infrared bands. The deep imaging IRC05 AOT has no explicit dithering built into the AOT operation; therefore, dithering is achieved by layering separate pointed observations on at least three positions on a given piece of sky. The NEP-Wide survey consists of 446 pointed observations with ~300 second exposures for each filter. The NEP-Wide survey uses the shallower IRC03 AOT optimized for large area multi-band mapping with the dithering included within the AOT. Note that for both surveys, although images are taken simultaneously in all three IRC channels, the target area of sky in the MIR-L channel is offset from the corresponding area of sky in the NIR/MIR-S channel by ~20 arcmin. (2 data files).
Cary, L.E.
1984-01-01
The U.S. Geological Survey's precipitation-runoff modeling system was tested using 2 years' data for the daily mode and 17 storms for the storm mode from a basin in southeastern Montana. Two hydrologic response unit delineations were studied. The more complex delineation did not provide superior results. In this application, the optimum numbers of hydrologic response units were 16 and 18 for the two alternatives. The first alternative, with 16 units, was modified to facilitate interfacing with the storm mode. A parameter subset was defined for the daily mode using sensitivity analysis. Following optimization, the simulated hydrographs approximated the observed hydrograph during the first year, a year of large snowfall. More runoff was simulated than observed during the second year. There was reasonable correspondence between the observed snowpack and the simulated snowpack in the first season but poor correspondence in the second. More soil moisture was withdrawn than was indicated by soil moisture observations. Optimization of parameters in the storm mode resulted in much larger values than originally estimated, commonly larger than published values of the Green and Ampt parameters. Following optimization, variable results were obtained. The results obtained are probably related to inadequate representation of basin infiltration characteristics and to precipitation variability. (USGS)
VizieR Online Data Catalog: HST/COS survey of z<0.9 AGNs. I. (Danforth+, 2016)
NASA Astrophysics Data System (ADS)
Danforth, C. W.; Keeney, B. A.; Tilton, E. M.; Shull, J. M.; Stocke, J. T.; Stevans, M.; Pieri, M. M.; Savage, B. D.; France, K.; Syphers, D.; Smith, B. D.; Green, J. C.; Froning, C.; Penton, S. V.; Osterman, S. N.
2016-05-01
COS is the fourth-generation UV spectrograph on board HST and is optimized for medium-resolution (R~18000, Δv~17km/s) spectroscopy of point sources in the 1135-1800Å band. To constitute our survey, we selected 82 AGN sight lines from the archive which met the selection criteria. Most of the AGNs observed in Cycles 18-20 under the Guaranteed Time Observation programs (GTO; PI Green) are included, along with numerous archival data sets collected under various Guest Observer programs. Observational and programmatic details are presented in Table 2; see also section 2.1. (5 data files).
NASA Astrophysics Data System (ADS)
Bhattacharjya, D.; Mukerji, T.; Mascarenhas, O.; Weyant, J.
2005-12-01
Designing a cost-effective and reliable monitoring program is crucial to the success of any geological CO2 storage project. Effective design entails determining both the optimal measurement modality and the frequency of monitoring the site. Time-lapse seismic provides the best spatial coverage and resolution for reservoir monitoring. Initial results from Sleipner (Norway) have demonstrated effective monitoring of CO2 plume movement. However, time-lapse seismic is an expensive monitoring technique, especially over the long-term life of a storage project, and should be used judiciously. We present a mathematical model based on dynamic programming that can be used to estimate the site-specific optimal frequency of time-lapse surveys. The dynamics of the CO2 sequestration process are simplified and modeled as a four-state Markov process with transition probabilities. The states are M: injected CO2 safely migrating within the target zone; L: leakage from the target zone to the adjacent geosphere; R: safe migration after recovery from the leakage state; and S: seepage from the geosphere to the biosphere. The states are observed only when a monitoring survey is performed. We assume that the system may go to state S only from state L. We also assume that once observed to be in state L, remedial measures are always taken to bring it back to state R. Remediation benefits are captured by calculating the expected penalty if CO2 seeped into the biosphere. There is a trade-off between the conflicting objectives of minimum discounted costs of performing the next time-lapse survey and minimum risk of seepage and its associated costly consequences. A survey performed earlier would spot a leak earlier, so remediation could be applied earlier, saving costs attributed to excessive seepage. On the other hand, there are also costs for the survey and remedial measures. The problem is solved numerically using Bellman's optimality principle of dynamic programming to optimize over the entire finite time horizon. We use a Monte Carlo approach to explore trade-offs between survey costs, remediation costs, and survey frequency, and to analyze the sensitivity to leakage probabilities and carbon tax. The model can be useful in determining a monitoring regime appropriate to a specific site's risk and set of remediation options, rather than a generic one based on a maximum downside risk threshold for CO2 storage as a whole. This may have implications for the overall costs associated with deploying carbon capture and storage on a large scale.
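A minimal Monte Carlo sketch of this trade-off, simulating the four-state chain with surveys every k years; all probabilities, costs, discounting and horizon are invented illustration values, not the paper's calibration.

import numpy as np

rng = np.random.default_rng(0)
P = {"M": [("M", 0.97), ("L", 0.03)],   # safe migration -> possible leak
     "L": [("L", 0.90), ("S", 0.10)],   # undetected leak -> possible seepage
     "R": [("R", 0.99), ("L", 0.01)],   # recovered, small re-leak chance
     "S": [("S", 1.00)]}                # seepage treated as absorbing here

def step(s):
    states, probs = zip(*P[s])
    return rng.choice(states, p=probs)

def expected_cost(k, horizon=50, trials=2000, c_survey=1.0,
                  c_remediate=20.0, c_seepage=500.0, discount=0.95):
    total = 0.0
    for _ in range(trials):
        s, cost = "M", 0.0
        for t in range(1, horizon + 1):
            s = step(s)
            if t % k == 0:                      # survey year
                cost += c_survey * discount**t
                if s == "L":                    # leak seen -> remediate
                    cost += c_remediate * discount**t
                    s = "R"
            if s == "S":                        # seepage penalty, stop episode
                cost += c_seepage * discount**t
                break
        total += cost
    return total / trials

for k in (1, 2, 5, 10):
    print(f"survey every {k:2d} yr: expected discounted cost = {expected_cost(k):.1f}")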
Calculating Proper Motions in the WFCAM Science Archive for the UKIRT Infrared Deep Sky Surveys
NASA Astrophysics Data System (ADS)
Collins, R.; Hambly, N.
2012-09-01
The ninth data release from the UKIRT Infrared Deep Sky Surveys (hereafter UKIDSS DR9) represents five years' worth of observations by its wide-field camera (WFCAM) and will be the first to include proper motion values in its source catalogues for the shallow, wide-area surveys: the Large Area Survey (LAS), Galactic Clusters Survey (GCS) and (ultimately) Galactic Plane Survey (GPS). We, the Wide Field Astronomy Unit (WFAU) at the University of Edinburgh, who prepare these regular data releases in the WFCAM Science Archive (WSA), describe in this paper how we make optimal use of the individual detection catalogues from each observation to derive high-quality astrometric fits for the positions of each detection, enabling us to calculate a proper motion solution across multiple epochs and passbands when constructing a merged source catalogue. We also describe how the proper motion solutions affect the calculation of the various attributes provided in the database source catalogue tables and what measures of data quality we provide, and we demonstrate the results for observations of the Pleiades cluster.
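At its core, a proper-motion solution of this kind is a weighted linear fit of position against epoch in each coordinate. A minimal sketch on synthetic epochs follows; the values are fabricated and this is not WSA code.

import numpy as np

epochs = np.array([2005.4, 2006.3, 2007.5, 2008.4, 2009.6])   # yr
ra_off = np.array([0.0, 21.0, 44.8, 63.1, 88.0])              # mas offsets in RA*cos(Dec)
err = np.full_like(ra_off, 15.0)                              # per-epoch astrometric error, mas

# Weighted least squares: position = zero-point + mu * (epoch - mean epoch);
# with equal weights, centering the epochs decouples slope and zero-point.
t = epochs - epochs.mean()
w = 1.0 / err**2
mu = np.sum(w * t * ra_off) / np.sum(w * t**2)                # mas/yr
mu_err = np.sqrt(1.0 / np.sum(w * t**2))
print(f"mu_RA = {mu:.1f} +/- {mu_err:.1f} mas/yr")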
Real-time dedispersion for fast radio transient surveys, using auto tuning on many-core accelerators
NASA Astrophysics Data System (ADS)
Sclocco, A.; van Leeuwen, J.; Bal, H. E.; van Nieuwpoort, R. V.
2016-01-01
Dedispersion, the removal of deleterious smearing of impulsive signals by the interstellar matter, is one of the most intensive processing steps in any radio survey for pulsars and fast transients. We here present a study of the parallelization of this algorithm on many-core accelerators, including GPUs from AMD and NVIDIA, and the Intel Xeon Phi. We find that dedispersion is inherently memory-bound. Even in a perfect scenario, hardware limitations keep the arithmetic intensity low, thus limiting performance. We next exploit auto-tuning to adapt dedispersion to different accelerators, observations, and even telescopes. We demonstrate that the optimal settings differ between observational setups, and that auto-tuning significantly improves performance. This impacts time-domain surveys from Apertif to SKA.
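For reference, the kernel being parallelized is brute-force incoherent dedispersion: shift each frequency channel by the dispersive delay and sum. A minimal NumPy sketch, with arbitrary array sizes and DM trial value:

import numpy as np

def dedisperse(data, freqs_mhz, dm, dt):
    """data: (n_chan, n_samp) dynamic spectrum; dt: sampling time in s."""
    k_dm = 4.148808e3                                   # s MHz^2 pc^-1 cm^3
    f_ref = freqs_mhz.max()
    delays = k_dm * dm * (freqs_mhz**-2 - f_ref**-2)    # s, >= 0 below f_ref
    shifts = np.round(delays / dt).astype(int)
    n_samp = data.shape[1] - shifts.max()
    out = np.zeros(n_samp)
    for chan, s in enumerate(shifts):                   # the memory-bound inner loop
        out += data[chan, s:s + n_samp]
    return out

n_chan, n_samp = 512, 4096
freqs = np.linspace(1250.0, 1550.0, n_chan)             # MHz
data = np.random.default_rng(1).normal(size=(n_chan, n_samp))
series = dedisperse(data, freqs, dm=100.0, dt=1e-4)
print(series.shape)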
NASA Astrophysics Data System (ADS)
Ma, Shu-Guo; Esamdin, Ali; Ma, Lu; Niu, Hu-Biao; Fu, Jian-Ning; Zhang, Yu; Liu, Jin-Zhong; Yang, Tao-Zhi; Song, Fang-Fang; Pu, Guang-Xin
2018-04-01
Following the LAMOST Spectroscopic Survey and the Xuyi Photometric Survey of the Galactic Anti-center, we plan to carry out a time-domain survey of the Galactic Anti-center (TDS-GAC) to study variable stars by using the Nanshan 1-meter telescope. Before the beginning of TDS-GAC, a precursive sky survey (PSS) was executed. The goal of the PSS is to optimize the observation strategy of TDS-GAC and to detect some strong transient events, as well as to find short time-scale variable stars of different types. By observing a discontinuous sky area of 15.03 deg2 with the standard Johnson-Cousins-Bessel V filter, 48 variable stars were found and their time series analyzed. Based on the behaviors of the light curves, 28 eclipsing binary stars, 10 RR Lyraes, and 3 periodic pulsating variables of other types have been classified. The remaining 7 variables stay unclassified owing to insufficient data. In addition, the observation strategy of TDS-GAC is described, and the data reduction pipeline is tested.
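Light-curve classification of this sort typically starts from a period search. A minimal sketch using astropy's Lomb-Scargle periodogram on a synthetic RR Lyrae-like signal (all numbers invented):

import numpy as np
from astropy.timeseries import LombScargle

rng = np.random.default_rng(3)
t = np.sort(rng.uniform(0.0, 30.0, 200))             # observation epochs (days)
true_period = 0.56                                   # days, RR Lyrae-like
mag = 15.0 + 0.4 * np.sin(2 * np.pi * t / true_period) + rng.normal(0, 0.02, t.size)

frequency, power = LombScargle(t, mag).autopower(minimum_frequency=0.5,
                                                 maximum_frequency=5.0)
best_period = 1.0 / frequency[np.argmax(power)]
print(f"recovered period: {best_period:.3f} d (true {true_period} d)")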
NASA Astrophysics Data System (ADS)
Delgado, Francisco; Saha, Abhijit; Chandrasekharan, Srinivasan; Cook, Kem; Petry, Catherine; Ridgway, Stephen
2014-08-01
The Operations Simulator for the Large Synoptic Survey Telescope (LSST; http://www.lsst.org) allows the planning of LSST observations that obey explicit science driven observing specifications, patterns, schema, and priorities, while optimizing against the constraints placed by design-specific opto-mechanical system performance of the telescope facility, site specific conditions as well as additional scheduled and unscheduled downtime. It has a detailed model to simulate the external conditions with real weather history data from the site, a fully parameterized kinematic model for the internal conditions of the telescope, camera and dome, and serves as a prototype for an automatic scheduler for the real time survey operations with LSST. The Simulator is a critical tool that has been key since very early in the project, to help validate the design parameters of the observatory against the science requirements and the goals from specific science programs. A simulation run records the characteristics of all observations (e.g., epoch, sky position, seeing, sky brightness) in a MySQL database, which can be queried for any desired purpose. Derivative information digests of the observing history are made with an analysis package called Simulation Survey Tools for Analysis and Reporting (SSTAR). Merit functions and metrics have been designed to examine how suitable a specific simulation run is for several different science applications. Software to efficiently compare the efficacy of different survey strategies for a wide variety of science applications using such a growing set of metrics is under development. A recent restructuring of the code allows us to a) use "look-ahead" strategies that avoid cadence sequences that cannot be completed due to observing constraints; and b) examine alternate optimization strategies, so that the most efficient scheduling algorithm(s) can be identified and used: even few-percent efficiency gains will create substantive scientific opportunity. The enhanced simulator is being used to assess the feasibility of desired observing cadences, study the impact of changing science program priorities and assist with performance margin investigations of the LSST system.
Sparsely sampling the sky: a Bayesian experimental design approach
NASA Astrophysics Data System (ADS)
Paykari, P.; Jaffe, A. H.
2013-08-01
The next generation of galaxy surveys will observe millions of galaxies over large volumes of the Universe. These surveys are expensive both in time and cost, raising questions regarding the optimal investment of this time and money. In this work, we investigate criteria for selecting amongst observing strategies for constraining the galaxy power spectrum and a set of cosmological parameters. Depending on the parameters of interest, it may be more efficient to observe a larger, but sparsely sampled, area of sky instead of a smaller contiguous area. By making use of the principles of Bayesian experimental design, we investigate the advantages and disadvantages of sparse sampling of the sky and discuss the circumstances in which a sparse survey is indeed the most efficient strategy. For the Dark Energy Survey (DES), we find that by sparsely observing the same area in a smaller amount of time, we only increase the errors on the parameters by a maximum of 0.45 per cent. Conversely, investing the same amount of time as the original DES to observe a sparser but larger area of sky, we can in fact constrain the parameters with errors reduced by 28 per cent.
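A common utility in such Bayesian design comparisons is the log-determinant of the posterior Fisher matrix (D-optimality). A minimal sketch, with invented 2x2 Fisher matrices standing in for the two strategies:

import numpy as np

def log_det_utility(fisher_prior, fisher_survey):
    """Larger is better: tighter joint posterior on the parameters."""
    _sign, logdet = np.linalg.slogdet(fisher_prior + fisher_survey)
    return logdet

F_prior = np.diag([1.0, 1.0])
F_contiguous = np.array([[40.0, 12.0], [12.0, 9.0]])    # deep, smaller area
F_sparse = np.array([[55.0, 20.0], [20.0, 11.0]])       # larger, sparsely sampled area

for name, F in [("contiguous", F_contiguous), ("sparse", F_sparse)]:
    print(name, round(log_det_utility(F_prior, F), 3))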
DOE Office of Scientific and Technical Information (OSTI.GOV)
Soto, M.; Bellini, A.; Anderson, J.
The Hubble Space Telescope (HST) UV Legacy Survey of Galactic Globular Clusters (GO-13297) has been specifically designed to complement the existing F606W and F814W observations of the Advanced Camera for Surveys (ACS) Globular Cluster Survey (GO-10775) by observing the most accessible 47 of the previous survey's 65 clusters in three WFC3/UVIS filters F275W, F336W, and F438W. The new survey also adds the super-solar metallicity open cluster NGC 6791 to increase the metallicity diversity. The combined survey provides a homogeneous 5-band data set that can be used to pursue a broad range of scientific investigations. In particular, the chosen UV filters allow the identification of multiple stellar populations by targeting the regions of the spectrum that are sensitive to abundance variations in C, N, and O. In order to provide the community with uniform preliminary catalogs, we have devised an automated procedure that performs high-quality photometry on the new UV observations (along with similar observations of seven other programs in the archive). This procedure finds and measures the potential sources on each individual exposure using library point-spread functions and cross-correlates these observations with the original ACS-Survey catalog. The catalog of 57 clusters we publish here will be useful to identify stars in the different stellar populations, in particular for spectroscopic follow-up. Eventually, we will construct a more sophisticated catalog and artificial-star tests based on an optimal reduction of the UV survey data, but the catalogs presented here give the community the chance to make early use of this HST Treasury survey.
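One plausible reading of the catalog cross-correlation step is a positional cross-match, which can be illustrated with astropy. The coordinates and the 0.1 arcsec tolerance below are assumptions for the demo, not the project's adopted values.

import astropy.units as u
from astropy.coordinates import SkyCoord

uv = SkyCoord(ra=[250.42311, 250.42522] * u.deg, dec=[-26.52561, -26.52703] * u.deg)
acs = SkyCoord(ra=[250.42310, 250.42900] * u.deg, dec=[-26.52560, -26.52100] * u.deg)

idx, sep2d, _ = uv.match_to_catalog_sky(acs)     # nearest ACS source per UV source
matched = sep2d < 0.1 * u.arcsec
for i, (j, s, ok) in enumerate(zip(idx, sep2d.to(u.arcsec), matched)):
    print(f"UV {i} -> ACS {j}, sep = {s:.3f}, matched = {ok}")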
Advancing the LSST Operations Simulator
NASA Astrophysics Data System (ADS)
Saha, Abhijit; Ridgway, S. T.; Cook, K. H.; Delgado, F.; Chandrasekharan, S.; Petry, C. E.; Operations Simulator Group
2013-01-01
The Operations Simulator for the Large Synoptic Survey Telescope (LSST; http://lsst.org) allows the planning of LSST observations that obey explicit science driven observing specifications, patterns, schema, and priorities, while optimizing against the constraints placed by design-specific opto-mechanical system performance of the telescope facility, site specific conditions (including weather and seeing), as well as additional scheduled and unscheduled downtime. A simulation run records the characteristics of all observations (e.g., epoch, sky position, seeing, sky brightness) in a MySQL database, which can be queried for any desired purpose. Derivative information digests of the observing history database are made with an analysis package called Simulation Survey Tools for Analysis and Reporting (SSTAR). Merit functions and metrics have been designed to examine how suitable a specific simulation run is for several different science applications. This poster reports recent work which has focussed on an architectural restructuring of the code that will allow us to a) use "look-ahead" strategies that avoid cadence sequences that cannot be completed due to observing constraints; and b) examine alternate optimization strategies, so that the most efficient scheduling algorithm(s) can be identified and used: even few-percent efficiency gains will create substantive scientific opportunity. The enhanced simulator will be used to assess the feasibility of desired observing cadences, study the impact of changing science program priorities, and assist with performance margin investigations of the LSST system.
Barca, E.; Castrignanò, A.; Buttafuoco, G.; De Benedetto, D.; Passarella, G.
2015-07-01
Soil survey is generally time-consuming, labor-intensive, and costly. Optimization of the sampling scheme allows one to reduce the number of sampling points without decreasing, or even while increasing, the accuracy of the investigated attribute. Maps of bulk soil electrical conductivity (ECa) recorded with electromagnetic induction (EMI) sensors can be used effectively to direct soil sampling design for assessing the spatial variability of soil moisture. A protocol using a field-scale bulk ECa survey has been applied in an agricultural field in the Apulia region (southeastern Italy). Spatial simulated annealing was used as a method to optimize the spatial soil sampling scheme, taking into account sampling constraints, field boundaries, and preliminary observations. Three optimization criteria were used: the first criterion (minimization of the mean of the shortest distances, MMSD) optimizes the spreading of the point observations over the entire field by minimizing the expectation of the distance between an arbitrarily chosen point and its nearest observation; the second criterion (minimization of the weighted mean of the shortest distances, MWMSD) is a weighted version of the MMSD, which uses the digital gradient of the grid ECa data as weighting function; and the third criterion (mean of average ordinary kriging variance, MAOKV) minimizes the mean kriging estimation variance of the target variable. The last criterion utilizes the variogram model of soil water content estimated in a previous trial. The procedures, or a combination of them, were tested and compared in a real case. Simulated annealing was implemented with the software MSANOS, able to define or redesign any sampling scheme by increasing or decreasing the original sampling locations. The output consists of the computed sampling scheme, the convergence time, and the cooling law, which can be an invaluable support to the process of sampling design. The proposed approach found the optimal solution in a reasonable computation time. The use of the bulk ECa gradient as an exhaustive variable, known at every node of an interpolation grid, allowed the optimization of the sampling scheme, distinguishing among areas with different priority levels.
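A minimal sketch of spatial simulated annealing under the MMSD criterion: jitter one sampling point at a time and accept moves that reduce the mean distance from grid nodes to their nearest sample, with worse moves accepted at a temperature-controlled probability. The field geometry, cooling law and move size are toy choices; the ECa-weighted MWMSD variant would only change the objective.

import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(7)
grid = np.stack(np.meshgrid(np.linspace(0, 100, 50),
                            np.linspace(0, 100, 50)), -1).reshape(-1, 2)

def mmsd(samples):
    d, _ = cKDTree(samples).query(grid)     # distance from each node to nearest sample
    return d.mean()

samples = rng.uniform(0, 100, size=(15, 2))  # initial random scheme
obj, temp = mmsd(samples), 5.0
for it in range(3000):
    cand = samples.copy()
    i = rng.integers(len(cand))
    cand[i] = np.clip(cand[i] + rng.normal(0, 5, 2), 0, 100)
    new_obj = mmsd(cand)
    if new_obj < obj or rng.uniform() < np.exp((obj - new_obj) / temp):
        samples, obj = cand, new_obj
    temp *= 0.999                            # simple geometric cooling
print(f"final MMSD = {obj:.2f}")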
Structure, Features, and Faculty Content in ARL Member Repositories
ERIC Educational Resources Information Center
Mercer, Holly; Koenig, Jay; McGeachin, Robert B.; Tucker, Sandra L.
2011-01-01
Questions about the optimal way to present repository content to authors, submitters, and end-users prompted this study. The authors examined, through an observation and a survey, the institutional repositories of peer institutions in the ARL for good practices related to the presentation and organization of faculty-authored institutional…
ICE-COLA: towards fast and accurate synthetic galaxy catalogues optimizing a quasi-N-body method
NASA Astrophysics Data System (ADS)
Izard, Albert; Crocce, Martin; Fosalba, Pablo
2016-07-01
Next generation galaxy surveys demand the development of massive ensembles of galaxy mocks to model the observables and their covariances, which is computationally prohibitive using N-body simulations. COmoving Lagrangian Acceleration (COLA) is a novel method designed to make this feasible by following an approximate dynamics, but with up to three orders of magnitude speed-ups when compared to an exact N-body. In this paper, we investigate the optimization of the code parameters in the compromise between computational cost and recovered accuracy in observables such as two-point clustering and halo abundance. We benchmark those observables against a state-of-the-art N-body run, the MICE Grand Challenge simulation. We find that using 40 time-steps linearly spaced since z_I ~ 20, and a force mesh resolution three times finer than that of the number of particles, yields a matter power spectrum within 1 per cent for k ≲ 1 h Mpc^-1 and a halo mass function within 5 per cent of those in the N-body. In turn, the halo bias is accurate within 2 per cent for k ≲ 0.7 h Mpc^-1 whereas, in redshift space, the halo monopole and quadrupole are within 4 per cent for k ≲ 0.4 h Mpc^-1. These results hold for a broad range in redshift (0 < z < 1) and for all halo mass bins investigated (M > 10^12.5 h^-1 M⊙). To bring the accuracy in clustering to the one per cent level we study various methods that re-calibrate halo masses and/or velocities. We thus propose an optimized choice of COLA code parameters as a powerful tool to optimally exploit future galaxy surveys.
Optimization of Planet Finder Observing Strategy
NASA Astrophysics Data System (ADS)
Sinukoff, E.
2014-03-01
We evaluate radial velocity observing strategies to be considered for future planet-hunting surveys with the Automated Planet Finder, a new 2.4-m telescope at Lick Observatory. Observing strategies can be optimized to mitigate stellar noise, which can mask and imitate the weak Doppler signals of low-mass planets. We estimate and compare the sensitivities of 5 different observing strategies to planets around G2-M2 dwarfs, constructing RV noise models for each stellar spectral type and accounting for acoustic, granulation, and magnetic activity modes. The strategies differ in exposure time, nightly and monthly cadence, and number of years. Synthetic RV time-series are produced by injecting a planet signal onto the stellar noise, sampled according to each observing strategy. For each star and each observing strategy, thousands of planet injection-recovery trials are conducted to determine the detection efficiency as a function of orbital period, minimum mass, and eccentricity. We find that 4-year observing strategies of 10 nights per month are sensitive to planets ~25-40% lower in mass than the corresponding 1-year strategies of 30 nights per month. Three 5-minute exposures spaced evenly throughout each night provide a 10% gain in sensitivity over the corresponding single 15-minute exposure strategies. All strategies are sensitive to planets of lowest mass around the modeled K7 dwarf. This study indicates that APF surveys adopting the 4-year strategies should detect Earth-mass planets on <10-day orbits around quiet late-K dwarfs as well as >1.6 Earth-mass planets in their habitable zones.
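A single injection-recovery trial of the kind described can be sketched as follows; the white-noise model understates correlated activity noise, and all numbers are illustrative.

import numpy as np
from astropy.timeseries import LombScargle

rng = np.random.default_rng(11)

def one_trial(period=9.3, amplitude=0.6, sigma=1.0, nights=120):
    # observing epochs drawn from a 4-year window, one RV per night
    t = np.sort(rng.choice(np.arange(4 * 365), nights, replace=False)).astype(float)
    rv = amplitude * np.sin(2 * np.pi * t / period) + rng.normal(0, sigma, nights)
    ls = LombScargle(t, rv)
    freq, power = ls.autopower(maximum_frequency=1.0)
    fap = ls.false_alarm_probability(power.max(), maximum_frequency=1.0)
    # recovered if the peak is significant and lands near the injected period
    return fap < 0.01 and abs(1 / freq[np.argmax(power)] - period) < 0.1 * period

n_trials = 200
rate = sum(one_trial() for _ in range(n_trials)) / n_trials
print(f"detection efficiency: {rate:.2f}")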
Optimal design of a lagrangian observing system for hydrodynamic surveys in coastal areas
NASA Astrophysics Data System (ADS)
Cucco, Andrea; Quattrocchi, Giovanni; Antognarelli, Fabio; Satta, Andrea; Maicu, Francesco; Ferrarin, Christian; Umgiesser, Georg
2014-05-01
The optimization of ocean observing systems is a pressing need for scientific research. In particular, the improvement of short-term ocean observing networks is achievable by reducing the cost-benefit ratio of the field campaigns and by increasing the quality of measurements. Numerical modeling is a powerful tool for determining the appropriateness of a specific observing system and for optimizing the sampling design. This is particularly true when observations are carried out in coastal areas and lagoons where the use of satellites is prohibitive due to the water shallowness. For such areas, numerical models are the most efficient tool both to provide a preliminary assessment of the local physical environment and to make short-term predictions of its change. In this context, a test case experiment was carried out within an enclosed shallow water area, the Cabras Lagoon (Sardinia, Italy). The aim of the experiment was to explore the optimal design for a field survey based on the use of coastal lagrangian buoys. A three-dimensional hydrodynamic model based on the finite element method (SHYFEM3D, Umgiesser et al., 2004) was implemented to simulate the lagoon water circulation. The model domain extends over the whole Cabras Lagoon and the whole Oristano Gulf, including the surrounding coastal area. Lateral open boundary conditions were provided by the operational ocean model system WMED, and only wind forcing, provided by the SKIRON atmospheric model (Kallos et al., 1997), was considered as surface boundary condition. The model was applied to provide a number of ad hoc scenarios and to explore the efficiency of the short-term hydrodynamic survey. A first field campaign was carried out to investigate the lagrangian circulation inside the lagoon under the main wind forcing condition (Mistral wind from north-west). The trajectories followed by the lagrangian buoys and the estimated lagrangian velocities were used to calibrate the model parameters and to validate the simulation results. A set of calibration runs was performed and the model accuracy in reproducing the surface circulation was established. A numerical simulation was then conducted to predict the wind-induced lagoon water circulation and the paths followed by numerical particles inside the lagoon domain. The simulated particle paths were analyzed and the optimal configuration for the buoy deployment was designed in real time. The selected deployment geometry was then tested during a further field campaign. The obtained dataset revealed that the chosen measurement strategy provided a near-synoptic survey with the longest records for the considered specific observing experiment. This work aims to emphasize the mutual usefulness of observations and numerical simulations in coastal ocean applications, and it proposes an efficient approach to harmonizing different expertise toward the investigation of a given specific research issue.
References: Cucco, A., Sinerchia, M., Ribotti, A., Olita, A., Fazioli, L., Perilli, A., Sorgente, B., Borghini, M., Schroeder, K., Sorgente, R., 2012. A high-resolution real-time forecasting system for predicting the fate of oil spills in the Strait of Bonifacio (western Mediterranean Sea). Marine Pollution Bulletin 64(6), 1186-1200. Kallos, G., Nickovic, S., Papadopoulos, A., Jovic, D., Kakaliagou, O., Misirlis, N., Boukas, L., Mimikou, N., Anadranistakis, E., Manousakis, M., 1997. The regional weather forecasting system Skiron: An overview. In: Proceedings of the Symposium on Regional Weather Prediction on Parallel Computer Environments, Athens, Greece, 109-122. Umgiesser, G., Melaku Canu, D., Cucco, A., Solidoro, C., 2004. A finite element model for the Venice Lagoon. Development, set up, calibration and validation. Journal of Marine Systems 51, 123-145.
NASA Astrophysics Data System (ADS)
Reiser, Fabienne; Schmelzbach, Cedric; Maurer, Hansruedi; Greenhalgh, Stewart; Hellwig, Olaf
2017-04-01
A primary focus of geothermal seismic imaging is to map dipping faults and fracture zones that control rock permeability and fluid flow. Vertical seismic profiling (VSP) is therefore a most valuable means to image the immediate surroundings of an existing borehole to guide, for example, the placing of new boreholes to optimize production from known faults and fractures. We simulated 2D and 3D acoustic synthetic seismic data and processed them through to pre-stack depth migration to optimize VSP survey layouts for mapping moderately to steeply dipping fracture zones within possible basement geothermal reservoirs. Our VSP survey optimization procedure for sequentially selecting source locations to define the area where source points are best located for optimal imaging makes use of a cross-correlation statistic, by which a subset of migrated shot gathers is compared with a target or reference image from a comprehensive set of source gathers. In geothermal exploration at established sites, it is reasonable to assume that sufficient a priori information is available to construct such a target image. We generally obtained good results with a relatively small number of optimally chosen source positions distributed over an ideal source location area for different fracture zone scenarios (different dips, azimuths, and distances from the surveying borehole). Adding further sources outside the optimal source area did not necessarily improve the results, but rather resulted in image distortions. It was found that fracture zones located at borehole-receiver depths and laterally offset from the borehole by 300 m can be imaged reliably for a range of different dips, but more source positions and large offsets between sources and the borehole are required for imaging steeply dipping interfaces. When such features cross-cut the borehole, they are particularly difficult to image. For fracture zones with different azimuths, 3D effects are observed: far-offset source positions contribute less to the image quality as the fracture zone azimuth increases. Our optimization methodology is best suited for designing future field surveys with a favorable benefit-cost ratio in areas with significant a priori knowledge. Moreover, our optimization workflow is valuable for selecting useful subsets of acquired data for optimum target-oriented processing.
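The source-selection statistic can be sketched as greedy selection by normalized cross-correlation against a target image; the "migrated images" below are random stand-ins for migrated shot gathers.

import numpy as np

rng = np.random.default_rng(5)
target = rng.normal(size=(60, 60))                  # reference (a priori) image
# hypothetical migrated single-shot images: pieces of the target plus noise
shots = [0.25 * target + rng.normal(0, 1.0, target.shape) for _ in range(40)]

def ncc(a, b):
    a, b = a - a.mean(), b - b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

selected, stack = [], np.zeros_like(target)
for _ in range(8):                                  # pick 8 source positions
    best = max((i for i in range(len(shots)) if i not in selected),
               key=lambda i: ncc(stack + shots[i], target))
    selected.append(best)
    stack += shots[best]
    print(f"added source {best:2d}, NCC = {ncc(stack, target):.3f}")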
NASA Astrophysics Data System (ADS)
Hutter, Anne; Trott, Cathryn M.; Dayal, Pratika
2018-06-01
Detections of the cross-correlation signal between the 21cm signal during reionization and high-redshift Lyman Alpha emitters (LAEs) are subject to observational uncertainties, which mainly include systematics associated with radio interferometers and LAE selection. These uncertainties can be reduced by increasing the survey volume and/or the survey luminosity limit, i.e. the faintest detectable Lyman Alpha (Lyα) luminosity. We use our model of high-redshift LAEs and the underlying reionization state to compute the uncertainties of the 21cm-LAE cross-correlation function at z ≃ 6.6 for observations with SKA1-Low and LAE surveys with Δz = 0.1 for three different values of the average IGM ionization state (⟨χHI⟩ ≃ 0.1, 0.25, 0.5). At z ≃ 6.6, we find SILVERRUSH-type surveys, with a field of view of 21 deg2 and survey luminosity limits of Lα ≥ 7.9 × 10^42 erg s^-1, to be optimal to distinguish between an intergalactic medium (IGM) that is 50%, 25% or 10% neutral, while surveys with smaller fields of view and lower survey luminosity limits, such as the 5 and 10 deg2 surveys with WFIRST, can only discriminate between a 50% and 10% neutral IGM.
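A toy estimator for such a cross-correlation on gridded fields; the anti-correlation between 21cm brightness and LAEs inside ionized regions is put in by hand, and the grid size, smoothing scale and noise levels are arbitrary.

import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(9)
n = 64
base = gaussian_filter(rng.normal(size=(n, n, n)), sigma=3)   # correlated structure
base /= base.std()
field21 = base + 0.5 * rng.normal(size=(n, n, n))   # mock 21cm fluctuation
lae = -base + 0.5 * rng.normal(size=(n, n, n))      # LAEs sit in 21cm-dark (ionized) regions

def cross_corr(a, b, max_shift=16):
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return [float(np.mean(a * np.roll(b, s, axis=0))) for s in range(max_shift)]

xi = cross_corr(field21, lae)
print(np.round(xi[:8], 3))   # strongly negative at zero lag, decaying with separation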
NASA Astrophysics Data System (ADS)
Smith, B. D.; White, J.; Kress, W. H.; Clark, B. R.; Barlow, J.
2016-12-01
Hydrogeophysical surveys have become an integral part of understanding hydrogeological frameworks used in groundwater models. Regional models cover a large area where water well data are, at best, scattered and irregular. Since budgets are finite, priorities must be assigned to select optimal areas for geophysical surveys. For airborne electromagnetic (AEM) geophysical surveys, optimization of mapping depth and line spacing needs to take into account the objectives of the groundwater models. The approach discussed here uses a first-order, second-moment (FOSM) uncertainty analysis, which assumes an approximately linear relation between model parameters and observations. This assumption allows FOSM analyses to be applied to estimate the value of increased parameter knowledge in reducing forecast uncertainty. FOSM is used to facilitate optimization of yet-to-be-completed geophysical surveying to reduce model forecast uncertainty. The main objective of the geophysical surveying is assumed to be estimating values and spatial variation in hydrologic parameters (i.e., hydraulic conductivity) as well as mapping lower permeability layers that influence the spatial distribution of recharge flux. The proposed data worth analysis was applied to the Mississippi Embayment Regional Aquifer Study (MERAS), which is being updated. The objective of MERAS is to assess the groundwater availability (status and trends) of the Mississippi embayment aquifer system. The study area covers portions of eight states including Alabama, Arkansas, Illinois, Kentucky, Louisiana, Mississippi, Missouri, and Tennessee. The active model grid covers approximately 70,000 square miles, and incorporates some 6,000 miles of major rivers and over 100,000 water wells. In the FOSM analysis, a dense network of pilot points was used to capture uncertainty in hydraulic conductivity and recharge. To simulate the effect of AEM flight lines, the prior uncertainty for hydraulic conductivity and recharge pilot points along potential flight lines was reduced. The FOSM forecast uncertainty estimates were then recalculated and compared to the base forecast uncertainty estimates. The resulting reduction in forecast uncertainty is a measure of the effect of the AEM survey on the model. Iterating through this process results in optimization of flight line locations.
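The FOSM data-worth arithmetic reduces to linear error propagation, sigma_f^2 = J C_p J^T, recomputed after shrinking the prior variance of parameters crossed by a candidate flight line. A minimal sketch with an invented Jacobian and prior covariance:

import numpy as np

rng = np.random.default_rng(13)
n_par = 50                                   # pilot points (e.g., K, recharge)
J = rng.normal(0.0, 1.0, size=(1, n_par))    # forecast sensitivity (Jacobian row)
prior_var = np.full(n_par, 1.0)

def forecast_var(var):
    C_p = np.diag(var)
    return float(J @ C_p @ J.T)              # linear (FOSM) propagation

base = forecast_var(prior_var)
flown = prior_var.copy()
flown[10:25] *= 0.2                          # AEM line assumed to cut these variances 5x
print(f"forecast variance: base = {base:.2f}, with AEM line = {forecast_var(flown):.2f}")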
GPS baseline configuration design based on robustness analysis
NASA Astrophysics Data System (ADS)
Yetkin, M.; Berber, M.
2012-11-01
The robustness analysis results obtained from a Global Positioning System (GPS) network are dramatically influenced by the configuration
Optimal surveys for weak-lensing tomography
NASA Astrophysics Data System (ADS)
Amara, Adam; Réfrégier, Alexandre
2007-11-01
Weak-lensing surveys provide a powerful probe of dark energy through the measurement of the mass distribution of the local Universe. A number of ground-based and space-based surveys are being planned for this purpose. Here, we study the optimal strategy for these future surveys using the joint constraints on the equation-of-state parameter w0 and its evolution wa as a figure of merit, by considering power spectrum tomography. For this purpose, we first consider an `ideal' survey which is both wide and deep and exempt from systematics. We find that such a survey has great potential for dark energy studies, reaching 1σ precisions of 1 and 10 per cent on the two parameters, respectively. We then study the relative impact of various limitations by degrading this ideal survey. In particular, we consider the effect of sky coverage, survey depth, shape measurement systematics, photometric redshift systematics and uncertainties in the non-linear power spectrum predictions. We find that, for a given observing time, it is always advantageous to choose a wide rather than a deep survey geometry. We also find that the dark energy constraints from power spectrum tomography are robust to photometric redshift errors and catastrophic failures if a spectroscopic calibration sample of 10⁴-10⁵ galaxies is available. The impact of these systematics is small compared to the limitations that come from potential uncertainties in the power spectrum due to shear measurement and theoretical errors. To help the planning of future surveys, we summarize our results with comprehensive scaling relations which avoid the need for full Fisher matrix calculations.
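For reference, the Fisher-matrix bookkeeping that such scaling relations replace is short; the 2x2 matrix below is a hypothetical stand-in (real forecasts marginalize over many more cosmological and nuisance parameters).

    import numpy as np

    F = np.array([[4000.0, -1200.0],
                  [-1200.0,   450.0]])          # hypothetical Fisher matrix for (w0, wa)
    cov = np.linalg.inv(F)                      # parameter covariance
    sigma_w0, sigma_wa = np.sqrt(np.diag(cov))  # marginalized 1-sigma errors
    fom = 1.0 / np.sqrt(np.linalg.det(cov))     # one common figure-of-merit convention
    print(sigma_w0, sigma_wa, fom)              # ~0.035, ~0.11, 600

With these toy numbers the marginalized errors land near the 1 and 10 per cent precisions quoted for the ideal survey.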
Optimal systems of geoscience surveying: A preliminary discussion
NASA Astrophysics Data System (ADS)
Shoji, Tetsuya
2006-10-01
In any geoscience survey, each survey technique must be applied effectively, and many techniques are often combined optimally. An important task is to obtain the information that is necessary and sufficient to meet the requirements of the survey. A prize-penalty function quantifies the effectiveness of a survey, and hence can be used to determine the best survey technique. An information-cost function, on the other hand, can be used to determine the optimal combination of survey techniques on the basis of the geoinformation obtained. Entropy can be used to evaluate geoinformation. A simple model suggests that low-resolvability techniques are generally best applied at the early stages of a survey, and that higher-resolvability techniques should alternate with lower-resolvability ones as the survey progresses.
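As a toy version of this entropy bookkeeping (all probabilities and costs below are invented for illustration), a survey technique can be scored by the Shannon information it gains per unit cost:

    import numpy as np

    def entropy_bits(p):
        """Binary Shannon entropy, in bits, of ore/no-ore probabilities."""
        p = np.clip(p, 1e-12, 1 - 1e-12)
        return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

    prior = np.full(100, 0.5)      # 100 grid cells, maximally uncertain: 1 bit each
    posterior = np.full(100, 0.9)  # assumed state after a low-resolvability survey
    info_gain = entropy_bits(prior).sum() - entropy_bits(posterior).sum()
    cost = 100 * 20.0              # assumed cost units for surveying all cells
    print(info_gain, info_gain / cost)  # information per unit cost guides technique choice

Comparing this ratio across techniques, stage by stage, reproduces the qualitative conclusion that cheap low-resolvability methods pay off early, while expensive high-resolvability methods pay off later.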
Optical+Near-IR Bayesian Classification of Quasars
NASA Astrophysics Data System (ADS)
Mehta, Sajjan S.; Richards, G. T.; Myers, A. D.
2011-05-01
We describe the details of an optimal Bayesian classification of quasars with combined optical+near-IR photometry from the SDSS and UKIDSS LAS surveys. Using only deep co-added SDSS photometry from the "Stripe 82" region and requiring full four-band UKIDSS detections, we reliably identify 2665 quasar candidates with a computed efficiency in excess of 99%. Relaxing the data constraints to combinations of two-band detections yields up to 6424 candidates with minimal trade-off in completeness and efficiency. The completeness and efficiency of the sample are investigated with existing spectra from the SDSS, 2SLAQ, and AUS surveys in addition to recent single-slit observations from Palomar Observatory, which revealed 22 quasars from a subsample of 29 high-z candidates. SDSS-III/BOSS observations will allow further exploration of the completeness/efficiency of the sample over 2.2
Multidisciplinary aerospace design optimization: Survey of recent developments
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, Jaroslaw; Haftka, Raphael T.
1995-01-01
The increasing complexity of engineering systems has sparked increasing interest in multidisciplinary optimization (MDO). This paper presents a survey of recent publications in the field of aerospace, where interest in MDO has been particularly intense. The two main challenges of MDO are computational expense and organizational complexity; accordingly, the survey focuses on the various ways in which researchers deal with these challenges. The survey is organized by a breakdown of MDO into its conceptual components, with sections on Mathematical Modeling, Design-oriented Analysis, Approximation Concepts, Optimization Procedures, System Sensitivity, and Human Interface. With the authors' main expertise being in the structures area, the bulk of the references focus on the interaction of the structures discipline with other disciplines. In particular, two sections at the end focus on two such interactions that have recently been pursued with particular vigor: Simultaneous Optimization of Structures and Aerodynamics, and Simultaneous Optimization of Structures Combined With Active Control.
WFIRST: Exoplanet Target Selection and Scheduling with Greedy Optimization
NASA Astrophysics Data System (ADS)
Keithly, Dean; Garrett, Daniel; Delacroix, Christian; Savransky, Dmitry
2018-01-01
We present target selection and scheduling algorithms for missions with direct imaging of exoplanets, and the Wide Field Infrared Survey Telescope (WFIRST) in particular, which will be equipped with a coronagraphic instrument (CGI). Optimal scheduling of CGI targets can maximize the expected value of directly imaged exoplanets (completeness). Using target completeness as a reward metric and integration time plus overhead time as a cost metric, we can maximize the summed completeness for a mission of fixed duration. We optimize over these metrics to create a list of target stars using a greedy optimization algorithm based on altruistic yield optimization (AYO) under ideal conditions. We simulate full missions using EXOSIMS by observing the targets in this list for their predetermined integration times. In this poster, we report the theoretical maximum summed completeness, the mean number of detected exoplanets from Monte Carlo simulations, and the ideal expected value of the simulated missions.
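At its core, the greedy step ranks targets by expected completeness per unit time and fills the time budget. The sketch below illustrates that step only; the target names, completeness values, and times are invented, and the actual AYO-based scheduler includes revisit and constraint logic beyond this.

    # Hypothetical targets: (name, single-visit completeness, integration+overhead days)
    targets = [("star_a", 0.12, 1.5), ("star_b", 0.08, 0.6),
               ("star_c", 0.05, 0.3), ("star_d", 0.15, 2.5)]
    budget = 3.0                              # days of CGI time available

    schedule, used, total_completeness = [], 0.0, 0.0
    for name, c, t in sorted(targets, key=lambda x: x[1] / x[2], reverse=True):
        if used + t <= budget:                # greedily take best completeness per day
            schedule.append(name)
            used += t
            total_completeness += c
    print(schedule, total_completeness)       # ['star_c', 'star_b', 'star_a'], 0.25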
NASA Astrophysics Data System (ADS)
Braun, A.; Walter, C. A.; Parvar, K.
2016-12-01
The current platforms for collecting magnetic data include traditional airborne surveys, with dense coverage but low resolution, and terrestrial surveys, with high resolution but low coverage. Both platforms leave a critical observation gap between the ground surface and approximately 100 m above ground elevation, a gap that can be navigated efficiently by new technologies such as Unmanned Aerial Vehicles (UAVs). Specifically, multi-rotor UAV platforms provide the ability to sense the magnetic field as a full 3-D tensor, which increases the quality of the data collected over other current platform types. Payload requirements and target requirements must be balanced to fully exploit the 3-D magnetic tensor. This study outlines the integration of a GEM Systems Cesium Vapour UAV Magnetometer, a Lightware SF-11 Laser Altimeter and a uBlox EVK-7P GPS module with a DJI s900 Multi Rotor UAV. The Cesium Magnetometer is suspended beneath the UAV platform by a cable of varying length. A set of surveys was carried out to optimize the sensor orientation, the sensor cable length beneath the UAV and the data collection methods of the GEM Systems Cesium Vapour UAV Magnetometer when mounted on the DJI s900. The target for these surveys is a 12-inch steam pipeline located approximately 2 feet below the ground surface. A systematic variation of cable length, sensor orientation and inclination was conducted. The data collected from the UAV magnetometer were compared to a terrestrial survey conducted with the GEM GST-19 Proton Precession Magnetometer at the same elevation, which also served as a reference station. This allowed for a cross-examination between the UAV system and a proven industry standard for magnetic field data collection. The surveys optimized the above parameters by minimizing instrument error and ensuring reliable data acquisition. The results demonstrate that an optimized UAV magnetometer survey can yield industry-standard measurements.
NASA Astrophysics Data System (ADS)
Zackay, Barak; Ofek, Eran O.
2017-02-01
Stacks of digital astronomical images are combined in order to increase image depth. The variable seeing conditions, sky background, and transparency of ground-based observations make the coaddition process nontrivial. We present image coaddition methods that maximize the signal-to-noise ratio (S/N) and are optimized for source detection and flux measurement. We show that for these purposes, the best way to combine images is to apply a matched filter to each image using its own point-spread function (PSF) and only then to sum the images with the appropriate weights. Methods that either match the filter after coaddition or perform PSF homogenization prior to coaddition will result in a loss of sensitivity. We argue that our method provides an increase of between a few per cent and 25% in the survey speed of deep ground-based imaging surveys compared with weighted coaddition techniques. We demonstrate this claim using simulated data as well as data from the Palomar Transient Factory data release 2. We present a variant of this coaddition method which is optimal for PSF or aperture photometry. We also provide an analytic formula for calculating the S/N for PSF photometry on single or multiple observations. In the next paper in this series, we present a method for image coaddition in the limit of background-dominated noise which is optimal for any statistical test or measurement on the constant-in-time image (e.g., source detection, shape or flux measurement, or star-galaxy separation), making the original data redundant. We provide an implementation of these algorithms in MATLAB.
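A minimal sketch of the scheme described here, assuming background-dominated Gaussian noise with known per-image variances, PSFs, and transparencies (all toy values below, and only the detection-oriented variant): cross-correlate each image with its own PSF, then sum with flux-scaled inverse-variance weights.

    import numpy as np
    from scipy.signal import fftconvolve

    def matched_filter_coadd(images, psfs, variances, transparencies):
        """Sum of per-image matched filters with weights F_j / var_j."""
        score = np.zeros_like(images[0])
        for img, psf, var, F in zip(images, psfs, variances, transparencies):
            matched = fftconvolve(img, psf[::-1, ::-1], mode="same")  # matched filter
            score += (F / var) * matched
        return score  # maximizes point-source detection S/N under these assumptions

    rng = np.random.default_rng(0)
    psf = np.outer(np.hanning(7), np.hanning(7))
    psf /= psf.sum()
    images = [rng.normal(size=(64, 64)) for _ in range(3)]  # toy noise-only frames
    score = matched_filter_coadd(images, [psf] * 3, [1.0, 1.4, 0.8], [1.0, 0.9, 1.1])

Filtering each image with its own PSF before summing is what distinguishes this from homogenize-then-stack pipelines.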
NASA Astrophysics Data System (ADS)
Allahyari, M.; Olsen, M. J.; Gillins, D. T.; Dennis, M. L.
2016-12-01
Many current surveying standards in the United States require several long-duration, static Global Navigation Satellite System (GNSS) observations to derive high-accuracy geodetic coordinates. However, over the past decade, many entities have established real-time GNSS networks (RTNs), which could reduce the field time for establishing geodetic control from hours to minutes. To evaluate the accuracy of RTN GNSS observations, data collected from two National Geodetic Survey (NGS) surveys in South Carolina and Oregon were studied. The objectives were to: 1) determine the accuracy of a real-time observation as a function of duration; 2) examine the influence of including GLONASS (Russia's version of GPS); 3) compare results using a single base to the full RTN network solution; and 4) assess the effect of baseline length on accuracy. In South Carolina, 360 observations ranging from 5 to 600 seconds were collected on 20 passive marks using RTN and single-base solutions, both with GPS+GLONASS and GPS-only. In Oregon, 18 passive marks were observed from 5 to 900 seconds using GPS-only with the RTN, and with GPS+GLONASS and GPS-only from a single-base. To develop "truth" coordinates, at least 30 hours of static GPS data were also collected on all marks. Each static survey session was post-processed in OPUS-Projects, and the resulting vectors were used to build survey networks that were least-squares adjusted using the NGS software ADJUST. The resulting coordinates provided the basis for evaluating the accuracy of the real-time observations. Results from this study indicate great potential in the use of RTNs for accurate derivation of geodetic coordinates. Both case studies showed an optimal observation duration of 180 seconds. RTN data tended to be more accurate and consistent than single-base data, and GLONASS slightly improved accuracy. A key benefit of GLONASS was the ability to obtain more fixed solutions at longer baseline lengths than single-base solutions.
Optimization of Exposure Time Division for Multi-object Photometry
NASA Astrophysics Data System (ADS)
Popowicz, Adam; Kurek, Aleksander R.
2017-09-01
Optical observations of wide fields of view entail the problem of selecting the best exposure time. As many objects are usually observed simultaneously, the quality of the photometry of the brightest objects is always better than that of the dimmer ones, even though all of them are frequently equally interesting to astronomers. Thus, measuring all objects with the highest possible precision is desirable. In this paper, we present a new optimization algorithm, dedicated to dividing exposure time into sub-exposures, which enables photometry with a more balanced noise budget. The proposed technique increases the photometric precision of dimmer objects at the expense of the measurement fidelity of the brightest ones. We have tested the method on real observations using two telescope setups, demonstrating its usefulness and good consistency with theoretical expectations. The main application of our approach is a wide range of sky surveys, including ones performed by space telescopes. The method can be used to plan virtually any photometric observation of objects that span a wide range of magnitudes.
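The trade-off being optimized can be seen in a toy noise model, assuming photon noise, a per-frame read noise, and a detector full well (all illustrative numbers, not the paper's algorithm): more sub-exposures cost faint sources read noise but rescue bright sources from saturation.

    import numpy as np

    def snr(flux, total_time, n_sub, read_noise=5.0, full_well=1e5):
        """S/N of a source of given flux (counts/s) split into n_sub frames."""
        t_sub = total_time / n_sub
        counts = flux * t_sub
        if counts >= full_well:
            return 0.0  # saturated frames give unusable photometry
        return n_sub * counts / np.sqrt(n_sub * (counts + read_noise**2))

    total = 600.0  # seconds
    for n in (1, 4, 16, 64):
        print(n, round(snr(50.0, total, n), 1), round(snr(5e3, total, n), 1))

With these numbers the faint star loses only a few per cent of S/N at 64 sub-exposures, while the bright star goes from saturated to well measured; the published algorithm performs this division formally.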
NASA Astrophysics Data System (ADS)
Uhlemann, Sebastian; Wilkinson, Paul B.; Maurer, Hansruedi; Wagner, Florian M.; Johnson, Timothy C.; Chambers, Jonathan E.
2018-07-01
Within geoelectrical imaging, the choice of measurement configurations and electrode locations is known to control the image resolution. Previous work has shown that optimized survey designs can provide a model resolution that is superior to standard survey designs. This paper demonstrates a methodology to optimize resolution within a target area while limiting the number of required electrodes, thereby selecting optimal electrode locations. This is achieved by extending previous work on the `Compare-R' algorithm, which optimizes the model resolution in a target area by calculating updates to the resolution matrix. Here, an additional weighting factor is introduced that allows measurement configurations that can be acquired on a given set of electrodes to be added preferentially. The performance of the optimization is tested on two synthetic examples and verified with a laboratory study. The effect of the weighting factor is investigated using an acquisition layout comprising a single line of electrodes. The results show that an increasing weight decreases the area of improved resolution, but leads to a smaller number of electrode positions. Imaging results superior to a standard survey design were achieved using 56 per cent fewer electrodes. The performance was also tested on a 3-D acquisition grid, where superior resolution within a target at the base of an embankment was achieved using 22 per cent fewer electrodes than a comparable standard survey. The effect of the underlying resistivity distribution on the performance of the optimization was investigated, and it was shown that even strong resistivity contrasts have only a minor impact. The synthetic results were verified in a laboratory tank experiment, where notable image improvements were achieved. This work shows that optimized surveys can be designed that have a resolution superior to standard survey designs while requiring significantly fewer electrodes, providing a means for improving the efficiency of geoelectrical imaging.
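The weighting idea can be sketched as a greedy loop in which each candidate configuration's resolution benefit is penalized by the number of new electrode positions it requires. The benefit numbers below are placeholders; the real Compare-R algorithm computes them from updates to the model resolution matrix at each step.

    # Candidate configurations: (id, resolution benefit, electrodes used) — hypothetical
    candidates = [("c1", 0.90, {1, 2, 3, 4}), ("c2", 0.80, {1, 2, 5, 6}),
                  ("c3", 0.75, {3, 4, 5, 6}), ("c4", 0.60, {7, 8, 9, 10})]
    weight = 0.1                         # larger weight -> fewer new electrode positions

    chosen, electrodes = [], set()
    for _ in range(3):
        def score(c):
            new = len(c[2] - electrodes)
            return c[1] - weight * new   # benefit minus penalty for new electrodes
        best = max((c for c in candidates if c[0] not in chosen), key=score)
        chosen.append(best[0])
        electrodes |= best[2]
    print(chosen, sorted(electrodes))    # configurations reuse electrodes 1-6

Raising the weight shrinks the electrode set at the cost of the area of improved resolution, mirroring the trade-off reported above.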
NASA Technical Reports Server (NTRS)
Young, Larry A.; Pisanich, Gregory; Ippolito, Corey; Alena, Rick
2005-01-01
The objective of this paper is to review the anticipated imaging and remote-sensing technology requirements for aerial vehicle survey missions to other planetary bodies in our Solar System that can support in-atmosphere flight. In the not-too-distant future, such planetary aerial vehicle (a.k.a. aerial explorer) exploration missions will become feasible. Imaging and remote-sensing observations will be a key objective for these missions. Accordingly, it is imperative that optimal solutions be developed for image acquisition and the real-time autonomous analysis of image data sets on such vehicles.
Recent Advances in Bathymetric Surveying of Continental Shelf Regions Using Autonomous Vehicles
NASA Astrophysics Data System (ADS)
Holland, K. T.; Calantoni, J.; Slocum, D.
2016-02-01
Obtaining bathymetric observations on the continental shelf in areas close to the shore is often time consuming and dangerous, especially when uncharted shoals and rocks present safety concerns to survey ships and launches. However, surveys in these regions are critically important to the numerical simulation of oceanographic processes, as bathymetry serves as the bottom boundary condition in operational forecasting models. We will present recent progress in bathymetric surveying using both traditional vessels retrofitted for autonomous operations and relatively inexpensive, small-team-deployable Autonomous Underwater Vehicles (AUVs). Both systems include either high-resolution multibeam echo sounders or interferometric sidescan sonar sensors, with integrated inertial navigation system capabilities consistent with present commercial-grade survey operations. The advantages and limitations of these two configurations, employing both unmanned and autonomous strategies, are compared using results from several recent survey operations. We will demonstrate how sensor data collected from unmanned platforms can augment or even replace traditional data collection technologies. Oceanographic observations (e.g., sound speed, temperature and currents) collected simultaneously with bathymetry using autonomous technologies provide additional opportunities for advanced data assimilation in numerical forecasts. Discussion focuses on our vision for unmanned and autonomous systems working in conjunction with manned or in-situ systems to optimally and simultaneously collect data in environmentally hostile or difficult-to-reach areas.
NASA Astrophysics Data System (ADS)
Rhoads, James
Central objectives: WFIRST-AFTA has tremendous potential for studying the epoch of "Cosmic Dawn", the period encompassing the formation of the first galaxies and quasars and their impact on the surrounding universe through cosmological reionization. Our goal is to ensure that this potential is realized through the middle stages of mission planning, culminating in designs for both WFIRST and its core surveys that meet the core objectives in dark energy and exoplanet science while maximizing the complementary Cosmic Dawn science. Methods: We will take a combined approach to studying Cosmic Dawn, using a judicious mixture of guest-investigator data analysis of the primary WFIRST surveys and a specifically designed Guest Observer program to complement those surveys. The Guest Observer program will serve primarily to obtain deep field observations, with particular attention to the capabilities of WFIRST for spectroscopic deep fields using the WFI grism. We will bring to bear our years of experience with slitless spectroscopy on the Hubble Space Telescope, along with an expectation of JWST slitless grism spectroscopy. We will use this experience to examine the implications of WFIRST's grism resolution and wavelength coverage for deep field observations and, if appropriate, to suggest potential modifications of these parameters to optimize the science return of WFIRST. We have assembled a team of experts specializing in (1) Lyman break galaxies at redshifts higher than 7, (2) quasars at high redshifts, (3) Lyman-alpha galaxies as probes of reionization, (4) theoretical simulations of high-redshift galaxies, (5) simulations of grism observations, (6) post-processing analysis to find emission-line galaxies and high-redshift galaxies, and (7) JWST observations and calibrations. With this team we intend to perform end-to-end simulations, starting with halo populations and expected spectra of high-redshift galaxies and finally extracting what we can learn about (a) reionization, using the Lyman-alpha test, (b) the sources of reionization, both galaxies and AGN, and (c) how to optimize WFIRST-AFTA surveys to maximize the scientific output of this mission. Along the way, we will simulate the galaxy and AGN populations expected beyond redshift 7, and will simulate observations and data analysis of these populations with WFIRST. Significance of work: Cosmic Dawn is one of the central pillars of the "New Worlds, New Horizons" decadal survey. WFIRST's highly sensitive and wide-field near-infrared capabilities offer a natural tool for obtaining statistically useful samples of faint galaxies and AGN beyond redshift 7. Thus, we expect Cosmic Dawn observations to constitute a major component of the GO program ultimately executed by WFIRST. By supporting our Science Investigation Team to consider the interplay between the mission parameters and the ultimate harvest of Cosmic Dawn science, NASA will help ensure the success of WFIRST as a broadly focused flagship mission.
Optimizing cosmological surveys in a crowded market
NASA Astrophysics Data System (ADS)
Bassett, Bruce A.
2005-04-01
Optimizing the major next-generation cosmological surveys (such as SNAP, KAOS, etc.) is a key problem, given our ignorance of the physics underlying cosmic acceleration and the plethora of planned surveys. We propose a Bayesian design framework which (1) maximizes the discrimination power of a survey without assuming any underlying dark-energy model, (2) finds the best niche survey geometry given current data and future competing experiments, (3) maximizes the cross section for serendipitous discoveries and (4) can be adapted to answer specific questions (such as “is dark energy dynamical?”). Integrated parameter-space optimization (IPSO) is a design framework that integrates projected parameter errors over an entire dark energy parameter space and then extremizes a figure of merit (such as Shannon entropy gain, which we show is stable to off-diagonal covariance matrix perturbations) as a function of survey parameters, using analytical, grid or MCMC techniques. We discuss examples where the optimization can be performed analytically. IPSO is thus a general, model-independent and scalable framework that allows us to appropriately use prior information to design the best possible surveys.
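A minimal numerical sketch of the IPSO idea, with an invented error model standing in for full forecasts: average a figure of merit over a grid of fiducial dark-energy models, then extremize over a survey parameter (here a depth/width trade-off at fixed observing time).

    import numpy as np

    def sigma_w(depth_frac, w0):
        """Hypothetical projected error on w as a function of survey design."""
        wide = 1.0 - depth_frac             # fixed time: wider means shallower
        return 0.05 / np.sqrt(wide + 1e-3) + 0.02 * abs(1 + w0) / (depth_frac + 1e-3)

    w0_grid = np.linspace(-1.3, -0.7, 31)   # integrate over the model space
    depths = np.linspace(0.05, 0.95, 91)    # candidate survey designs
    fom = [np.mean(1.0 / sigma_w(d, w0_grid)**2) for d in depths]
    print(f"optimal depth fraction ~ {depths[int(np.argmax(fom))]:.2f}")

The key structural point is that the figure of merit is averaged over the whole (w0 here, in general w0-wa) parameter space before the survey parameter is chosen, so no single fiducial dark-energy model drives the design.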
Automated data processing architecture for the Gemini Planet Imager Exoplanet Survey
NASA Astrophysics Data System (ADS)
Wang, Jason J.; Perrin, Marshall D.; Savransky, Dmitry; Arriaga, Pauline; Chilcote, Jeffrey K.; De Rosa, Robert J.; Millar-Blanchaer, Maxwell A.; Marois, Christian; Rameau, Julien; Wolff, Schuyler G.; Shapiro, Jacob; Ruffio, Jean-Baptiste; Maire, Jérôme; Marchis, Franck; Graham, James R.; Macintosh, Bruce; Ammons, S. Mark; Bailey, Vanessa P.; Barman, Travis S.; Bruzzone, Sebastian; Bulger, Joanna; Cotten, Tara; Doyon, René; Duchêne, Gaspard; Fitzgerald, Michael P.; Follette, Katherine B.; Goodsell, Stephen; Greenbaum, Alexandra Z.; Hibon, Pascale; Hung, Li-Wei; Ingraham, Patrick; Kalas, Paul; Konopacky, Quinn M.; Larkin, James E.; Marley, Mark S.; Metchev, Stanimir; Nielsen, Eric L.; Oppenheimer, Rebecca; Palmer, David W.; Patience, Jennifer; Poyneer, Lisa A.; Pueyo, Laurent; Rajan, Abhijith; Rantakyrö, Fredrik T.; Schneider, Adam C.; Sivaramakrishnan, Anand; Song, Inseok; Soummer, Remi; Thomas, Sandrine; Wallace, J. Kent; Ward-Duong, Kimberly; Wiktorowicz, Sloane J.
2018-01-01
The Gemini Planet Imager Exoplanet Survey (GPIES) is a multiyear direct imaging survey of 600 stars to discover and characterize young Jovian exoplanets and their environments. We have developed an automated data architecture to process and index all data related to the survey uniformly. An automated and flexible data processing framework, which we term the Data Cruncher, combines multiple data reduction pipelines (DRPs) together to process all spectroscopic, polarimetric, and calibration data taken with GPIES. With no human intervention, fully reduced and calibrated data products are available less than an hour after the data are taken to expedite follow-up on potential objects of interest. The Data Cruncher can run on a supercomputer to reprocess all GPIES data in a single day as improvements are made to our DRPs. A backend MySQL database indexes all files, which are synced to the cloud, and a front-end web server allows for easy browsing of all files associated with GPIES. To help observers, quicklook displays show reduced data as they are processed in real time, and chatbots on Slack post observing information as well as reduced data products. Together, the GPIES automated data processing architecture reduces our workload, provides real-time data reduction, optimizes our observing strategy, and maintains a homogeneously reduced dataset to study planet occurrence and instrument performance.
A Survey of Distributed Optimization and Control Algorithms for Electric Power Systems
Molzahn, Daniel K.; Dorfler, Florian K.; Sandberg, Henrik; ...
2017-07-25
Historically, centrally computed algorithms have been the primary means of power system optimization and control. With increasing penetrations of distributed energy resources requiring optimization and control of power systems with many controllable devices, distributed algorithms have been the subject of significant research interest. Here, this paper surveys the literature of distributed algorithms with applications to optimization and control of power systems. In particular, this paper reviews distributed algorithms for offline solution of optimal power flow (OPF) problems as well as online algorithms for real-time solution of OPF, optimal frequency control, optimal voltage control, and optimal wide-area control problems.
Hamel, Wolfgang; Köppen, Johannes A; Alesch, François; Antonini, Angelo; Barcia, Juan A; Bergman, Hagai; Chabardes, Stephan; Contarino, Maria Fiorella; Cornu, Philippe; Demmel, Walter; Deuschl, Günther; Fasano, Alfonso; Kühn, Andrea A; Limousin, Patricia; McIntyre, Cameron C; Mehdorn, H Maximilian; Pilleri, Manuela; Pollak, Pierre; Rodríguez-Oroz, Maria C; Rumià, Jordi; Samuel, Michael; Timmermann, Lars; Valldeoriola, Francesc; Vesper, Jan; Visser-Vandewalle, Veerle; Volkmann, Jens; Lozano, Andres M
2017-03-01
Deep brain stimulation within or adjacent to the subthalamic nucleus (STN) represents the most common stereotactic procedure performed for Parkinson disease. Better STN imaging is often regarded as a requirement for improving stereotactic targeting. However, it is unclear whether there is consensus about the optimal target. To obtain an expert opinion on the site regarded optimal for "STN stimulation," movement disorder specialists were asked to indicate their preferred position for an active contact on hard copies of the Schaltenbrand and Wahren atlas depicting the STN in all 3 planes. This represented an idealized setting, and it mimicked optimal imaging for direct target definition in a perfectly delineated STN. The suggested targets were heterogeneous, although some clustering was observed in the dorsolateral STN and subthalamic area. In particular, in the anteroposterior direction, the intended targets differed to a great extent. Most of the indicated targets are thought to also result in concomitant stimulation of structures adjacent to the STN, including the zona incerta, fields of Forel, and internal capsule. This survey illustrates that most sites regarded as optimal for STN stimulation are close to each other, but there appears to be no uniform perception of the optimal anatomic target, possibly influencing surgical results. The anatomic sweet zone for STN stimulation needs further specification, as this information is likely to make magnetic resonance imaging-based target definition less variable when applied to individual patients.
The SED Machine: A Robotic Spectrograph for Fast Transient Classification
NASA Astrophysics Data System (ADS)
Blagorodnova, Nadejda; Neill, James D.; Walters, Richard; Kulkarni, Shrinivas R.; Fremling, Christoffer; Ben-Ami, Sagi; Dekany, Richard G.; Fucik, Jason R.; Konidaris, Nick; Nash, Reston; Ngeow, Chow-Choong; Ofek, Eran O.; O’Sullivan, Donal; Quimby, Robert; Ritter, Andreas; Vyhmeister, Karl E.
2018-03-01
Current time domain facilities are finding several hundred transient astronomical events a year. The discovery rate is expected to increase in the future as new surveys such as the Zwicky Transient Facility (ZTF) and the Large Synoptic Survey Telescope (LSST) come online. Presently, the rate at which transients are classified is approximately one order of magnitude lower than the discovery rate, leading to an increasing “follow-up drought”. Existing telescopes with moderate aperture can help address this deficit when equipped with spectrographs optimized for spectral classification. Here, we provide an overview of the design, operations and first results of the Spectral Energy Distribution Machine (SEDM), operating on the Palomar 60-inch telescope (P60). The instrument is optimized for classification and high observing efficiency. It combines a low-resolution (R ∼ 100) integral field unit (IFU) spectrograph with the “Rainbow Camera” (RC), a multi-band field acquisition camera which also serves as a multi-band (ugri) photometer. The SEDM was commissioned during the operation of the intermediate Palomar Transient Factory (iPTF) and has already lived up to its promise. The success of the SEDM demonstrates the value of spectrographs optimized for spectral classification.
Genetic Algorithm for Initial Orbit Determination with Too Short Arc
NASA Astrophysics Data System (ADS)
Li, Xin-ran; Wang, Xin
2017-01-01
A huge quantity of too-short-arc (TSA) observational data has been obtained in sky surveys of space objects. However, reasonable results for the TSAs can hardly be obtained with the classical methods of initial orbit determination (IOD). In this paper, the IOD is reduced to a two-stage hierarchical optimization problem containing three variables for each stage. Using the genetic algorithm, a new method of IOD for TSAs is established through the selection of the optimization variables and the corresponding genetic operators for specific problems. Numerical experiments based on real measurements show that the method can provide valid initial values for the follow-up work.
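A generic sketch of one stage of such a genetic-algorithm search, assuming box bounds on three orbital elements and a stand-in quadratic fitness in place of the real TSA residual function (the paper's two-stage hierarchy would nest two such loops):

    import numpy as np

    rng = np.random.default_rng(1)
    lo = np.array([7000.0, 0.0, 0.0])        # assumed bounds: a [km], e, M [rad]
    hi = np.array([9000.0, 0.2, 2 * np.pi])

    def fitness(x):
        """Stand-in for minus the RMS of the TSA observation residuals."""
        truth = np.array([8000.0, 0.05, 1.0])
        return -np.sum(((x - truth) / (hi - lo)) ** 2)

    pop = rng.uniform(lo, hi, size=(100, 3))
    for _ in range(200):
        scores = np.array([fitness(x) for x in pop])
        parents = pop[np.argsort(scores)[-50:]]          # truncation selection
        idx = rng.integers(0, 50, size=(100, 2))
        alpha = rng.random((100, 1))
        pop = alpha * parents[idx[:, 0]] + (1 - alpha) * parents[idx[:, 1]]  # blend crossover
        pop += rng.normal(0.0, 0.01, pop.shape) * (hi - lo)                  # mutation
        pop = np.clip(pop, lo, hi)
    print(pop[np.argmax([fitness(x) for x in pop])])     # best candidate elements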
Optimizing measurements of cluster velocities and temperatures for CCAT-prime and future surveys
NASA Astrophysics Data System (ADS)
Mittal, Avirukt; de Bernardis, Francesco; Niemack, Michael D.
2018-02-01
Galaxy cluster velocity correlations and mass distributions are sensitive probes of cosmology and the growth of structure. Upcoming microwave surveys will enable extraction of velocities and temperatures from many individual clusters for the first time. We forecast constraints on peculiar velocities, electron temperatures, and optical depths of galaxy clusters obtainable with upcoming multi-frequency measurements of the kinematic, thermal, and relativistic Sunyaev-Zeldovich effects. The forecasted constraints are compared for different measurement configurations with frequency bands between 90 GHz and 1 THz, and for different survey strategies for the 6-meter CCAT-prime telescope. We study methods for improving cluster constraints by removing emission from dusty star forming galaxies, and by using X-ray temperature priors from eROSITA. Cluster constraints are forecast for several model cluster masses. A sensitivity optimization for seven frequency bands is presented for a CCAT-prime first light instrument and a next generation instrument that takes advantage of the large optical throughput of CCAT-prime. We find that CCAT-prime observations are expected to enable measurement and separation of the SZ effects to characterize the velocity, temperature, and optical depth of individual massive clusters (~10¹⁵ Msolar). Submillimeter measurements are shown to play an important role in separating these components from dusty galaxy contamination. Using a modular instrument configuration with similar optical throughput for each detector array, we develop a rule of thumb for the number of detector arrays desired at each frequency to optimize extraction of these signals. Our results are relevant for a future "Stage IV" cosmic microwave background survey, which could enable galaxy cluster measurements over a larger range of masses and redshifts than will be accessible by other experiments.
Optimal measurement of ice-sheet deformation from surface-marker arrays
NASA Astrophysics Data System (ADS)
Macayeal, D. R.
Surface strain rate is best observed by fitting a strain-rate ellipsoid to the measured movement of a stake network or other collection of surface features, using a least-squares procedure. The error of the resulting fit varies as 1/(L Δt √N), where L is the stake separation, Δt is the time period between the initial and final stake surveys, and N is the number of stakes in the network. This relation suggests that if N is sufficiently high, the traditional practice of revisiting stake-network sites in successive field seasons may be replaced by a less costly single-year operation. A demonstration using Ross Ice Shelf data shows that reasonably accurate measurements are obtained from 12 stakes after only 4 days of deformation. The least-squares procedure can also aid airborne photogrammetric surveys, because reducing the time interval between survey and re-survey permits better surface-feature recognition.
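A quick numerical check of this scaling, with an assumed prefactor set by a stake-position accuracy of 0.05 m (an illustration, not a value from the study):

    import math

    def strain_rate_error(L_m, dt_days, n_stakes, pos_error_m=0.05):
        # assumed scaling: error ~ position error / (L * Δt * sqrt(N))
        return pos_error_m / (L_m * dt_days * math.sqrt(n_stakes))

    print(strain_rate_error(1000.0, 365.0, 4))   # traditional year-long revisit
    print(strain_rate_error(1000.0, 4.0, 12))    # 12 stakes after only 4 days

Because the error falls only as √N but linearly in Δt, shrinking the revisit interval by a factor of ~90 must be bought back mostly with a longer baseline or accepted as a larger, but often still useful, error.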
Searching for Single Pulses Using Heimdall
NASA Astrophysics Data System (ADS)
Walsh, Gregory; Lynch, Ryan
2018-01-01
In radio pulsar surveys, the interstellar medium causes a frequency-dependent dispersive delay of a pulsed signal across the observing band. If not corrected, this delay substantially lowers the S/N and makes most pulses undetectable. The delay is proportional to an unknown dispersion measure (DM), which must be searched over with many trial values. A number of new, GPU-accelerated codes are now available to optimize this dedispersion task and to search for transient pulsed radio emission. We report on the use of Heimdall, one such GPU-accelerated tree dedispersion utility, to search for transient radio sources in a Green Bank Telescope survey of the Cygnus Region and North Galactic Plane. The survey is carried out at a central frequency of 820 MHz with the goal of finding Fast Radio Bursts, Rotating Radio Transients, young pulsars, and millisecond pulsars. We describe the survey, the data processing pipeline, and the follow-up of candidate sources.
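For reference, the dispersive sweep being searched over follows the standard cold-plasma relation; the band edges below are assumed for illustration and are not taken from the survey description.

    def dm_delay_s(dm, f_lo_ghz, f_hi_ghz):
        """Delay (s) across the band: 4.149 ms x DM x (nu/GHz)^-2, DM in pc cm^-3."""
        return 4.149e-3 * dm * (f_lo_ghz ** -2 - f_hi_ghz ** -2)

    # e.g., an 820 MHz receiver with an assumed 0.72-0.92 GHz band:
    print(dm_delay_s(100.0, 0.72, 0.92))  # ~0.31 s of sweep at DM = 100

Each trial DM corresponds to removing one such sweep before summing the band, which is why the search cost grows with the number of DM trials and why tree algorithms like Heimdall's are valuable.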
WFIRST: Project Overview and Status
NASA Astrophysics Data System (ADS)
Kruk, Jeffrey; WFIRST Formulation Science Working Group, WFIRST Project Team
2018-01-01
The Wide-Field InfraRed Survey Telescope (WFIRST) will be the next Astrophysics flagship mission to follow JWST. The observatory payload consists of a Hubble-size telescope aperture with a wide-field NIR instrument and a coronagraph operating at visible wavelengths and employing state-of-the-art wavefront sensing and control. The Wide-field instrument is optimized for large area NIR imaging and spectroscopic surveys, with performance requirements driven by programs to study cosmology and exoplanet detection via gravitational microlensing. All data will be public immediately, and a substantial guest observer program will be supported.The WFIRST Project is presently in Phase A, with a transition to Phase B expected in early to mid 2018. Candidate observing programs are under detailed study in order to inform the mission design, but the actual science investigations will not be selected until much closer to launch. We will present an overview of the present mission design and expected performance, a summary of Project status, and plans for selecting the observing programs.
OCTOCAM: A Workhorse Instrument for the Gemini Telescopes During the Era of LSST
NASA Astrophysics Data System (ADS)
Roming, Peter; van der Horst, Alexander; OCTOCAM Team
2018-01-01
The 2020s are planned to be an era of large surveys and giant telescopes. A trademark of this era will be the large number of interesting objects observed daily by high-cadence surveys, such as the LSST. Because of the sheer numbers, only a very small fraction of these interesting objects will be observed with extremely large telescopes. The follow-up workhorses during this era will be the 8-meter-class telescopes and the corresponding instruments that are prepared to pursue these interesting objects. One such workhorse instrument is OCTOCAM, a highly efficient instrument designed to probe the time-domain window with simultaneous broad-wavelength coverage. OCTOCAM optimizes the use of Gemini for broadband imaging and spectroscopic single-target observations. The instrument is designed for high temporal resolution, broad spectral coverage, and moderate spectral resolution. OCTOCAM was selected as part of the Gemini instrumentation program in early 2017. Here we provide a description of the science cases to be addressed, the overall instrument design, and the current status.
Optimizing future imaging survey of galaxies to confront dark energy and modified gravity models
NASA Astrophysics Data System (ADS)
Yamamoto, Kazuhiro; Parkinson, David; Hamana, Takashi; Nichol, Robert C.; Suto, Yasushi
2007-07-01
We consider the extent to which future imaging surveys of galaxies can distinguish between dark energy and modified gravity models for the origin of the cosmic acceleration. Dynamical dark energy models may have expansion rates similar to those of modified gravity models, yet predict different growth-of-structure histories. We parametrize the cosmic expansion by the two parameters w0 and wa, and the linear growth rate of density fluctuations by Linder’s γ, independently. Dark energy models generically predict γ ≈ 0.55, while the Dvali-Gabadadze-Porrati (DGP) model predicts γ ≈ 0.68. To determine if future imaging surveys can constrain γ to within 20% (or Δγ < 0.1), we perform a Fisher matrix analysis for a weak-lensing survey such as the ongoing Hyper Suprime-Cam (HSC) project. Under the condition that the total observation time is fixed, we compute the figure of merit (FoM) as a function of the exposure time texp. We find that the tomography technique effectively improves the FoM, which has a broad peak around texp ≃ several × 10 min; a shallow and wide survey is preferred to constrain the γ parameter. While Δγ < 0.1 cannot be achieved by the HSC weak-lensing survey alone, one can improve the constraints by combining with a follow-up spectroscopic survey like the Wide-field Fiber-fed Multi-Object Spectrograph (WFMOS) and/or future cosmic microwave background (CMB) observations.
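To see the size of the signal behind the Δγ < 0.1 target, one can integrate Linder's parametrization f(a) = Ωm(a)^γ for the two values quoted above; the flat, Ωm0 = 0.3, w = -1 background is an assumption of this sketch.

    import numpy as np

    def growth(gamma, om0=0.3, n=2000):
        """Integrate dlnD/dlna = Omega_m(a)^gamma on a flat w = -1 background."""
        a = np.linspace(1e-3, 1.0, n)
        om_a = om0 / (om0 + (1.0 - om0) * a**3)
        lnD = np.cumsum(om_a**gamma * np.gradient(np.log(a)))
        return np.exp(lnD - lnD[-1])                 # normalized so D(a=1) = 1

    D_de, D_dgp = growth(0.55), growth(0.68)
    print(f"D(a~0.5) differs by {100 * abs(1 - D_dgp[999] / D_de[999]):.1f}%")

The resulting few-per-cent difference in the growth factor at z ≈ 1 is what the lensing tomography must resolve, which is why the survey geometry matters so much.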
The automated data processing architecture for the GPI Exoplanet Survey
NASA Astrophysics Data System (ADS)
Wang, Jason J.; Perrin, Marshall D.; Savransky, Dmitry; Arriaga, Pauline; Chilcote, Jeffrey K.; De Rosa, Robert J.; Millar-Blanchaer, Maxwell A.; Marois, Christian; Rameau, Julien; Wolff, Schuyler G.; Shapiro, Jacob; Ruffio, Jean-Baptiste; Graham, James R.; Macintosh, Bruce
2017-09-01
The Gemini Planet Imager Exoplanet Survey (GPIES) is a multi-year direct imaging survey of 600 stars to discover and characterize young Jovian exoplanets and their environments. We have developed an automated data architecture to process and index all data related to the survey uniformly. An automated and flexible data processing framework, which we term the GPIES Data Cruncher, combines multiple data reduction pipelines together to intelligently process all spectroscopic, polarimetric, and calibration data taken with GPIES. With no human intervention, fully reduced and calibrated data products are available less than an hour after the data are taken to expedite follow-up on potential objects of interest. The Data Cruncher can run on a supercomputer to reprocess all GPIES data in a single day as improvements are made to our data reduction pipelines. A backend MySQL database indexes all files, which are synced to the cloud, and a front-end web server allows for easy browsing of all files associated with GPIES. To help observers, quicklook displays show reduced data as they are processed in real-time, and chatbots on Slack post observing information as well as reduced data products. Together, the GPIES automated data processing architecture reduces our workload, provides real-time data reduction, optimizes our observing strategy, and maintains a homogeneously reduced dataset to study planet occurrence and instrument performance.
Simulating the Performance of Ground-Based Optical Asteroid Surveys
NASA Astrophysics Data System (ADS)
Christensen, Eric J.; Shelly, Frank C.; Gibbs, Alex R.; Grauer, Albert D.; Hill, Richard E.; Johnson, Jess A.; Kowalski, Richard A.; Larson, Stephen M.
2014-11-01
We are developing a set of asteroid survey simulation tools in order to estimate the capability of existing and planned ground-based optical surveys, and to test a variety of possible survey cadences and strategies. The survey simulator is composed of several layers, including a model population of solar system objects and an orbital integrator, a site-specific atmospheric model (including inputs for seeing, haze and seasonal cloud cover), a model telescope (with a complete optical path to estimate throughput), a model camera (including FOV, pixel scale, and focal plane fill factor) and model source extraction and moving object detection layers with tunable detection requirements. We have also developed a flexible survey cadence planning tool to automatically generate nightly survey plans. Inputs to the cadence planner include camera properties (FOV, readout time), telescope limits (horizon, declination, hour angle, lunar and zenithal avoidance), preferred and restricted survey regions in RA/Dec, ecliptic, and Galactic coordinate systems, and recent coverage by other asteroid surveys. Simulated surveys are created for a subset of current and previous NEO surveys (LINEAR, Pan-STARRS and the three Catalina Sky Survey telescopes), and compared against the actual performance of these surveys in order to validate the model’s performance. The simulator tracks objects within the FOV of any pointing that were not discovered (e.g. too few observations, too trailed, focal plane array gaps, too fast or slow), thus dividing the population into “discoverable” and “discovered” subsets, to inform possible survey design changes. Ongoing and future work includes generating a realistic “known” subset of the model NEO population, running multiple independent simulated surveys in coordinated and uncoordinated modes, and testing various cadences to find optimal strategies for detecting NEO sub-populations. These tools can also assist in quantifying the efficiency of novel yet unverified survey cadences (e.g. the baseline LSST cadence) that sparsely spread the observations required for detection over several days or weeks.
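A toy pass through the kind of per-object detection logic such a simulator layers together, with illustrative thresholds (limiting magnitude, fill factor, trailing limit, detections per tracklet) rather than values from any of the surveys named above:

    import numpy as np

    rng = np.random.default_rng(3)

    def discovered(mag, rate_deg_day, n_visits=4, m_lim=21.5,
                   fill_factor=0.9, trail_limit=2.5):
        """Toy detection logic; all thresholds are illustrative."""
        if mag > m_lim or rate_deg_day > trail_limit:
            return False                           # too faint or too trailed
        hits = rng.random(n_visits) < fill_factor  # chip gaps / fill-factor losses
        return hits.sum() >= 3                     # detections needed for a tracklet

    population = [(rng.uniform(18.0, 24.0), rng.uniform(0.1, 4.0))
                  for _ in range(10000)]           # (magnitude, sky-rate) draws
    frac = np.mean([discovered(m, r) for m, r in population])
    print(f"discoverable fraction: {frac:.2%}")

Tracking which rejection branch fires for each undiscovered object is what lets the full simulator split the population into "discoverable" and "discovered" subsets and point to cadence changes.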
A Technical Survey on Optimization of Processing Geo Distributed Data
NASA Astrophysics Data System (ADS)
Naga Malleswari, T. Y. J.; Ushasukhanya, S.; Nithyakalyani, A.; Girija, S.
2018-04-01
With growing cloud services and technology, there is growth in the number of geographically distributed data centers that store large amounts of data. Analysis of geo-distributed data is required by various services for data processing, storage of essential information, etc.; processing this geo-distributed data and performing analytics on it is a challenging task. Distributed data processing is accompanied by issues in storage, computation and communication, the key issues being time efficiency, cost minimization and utility maximization. This paper describes various optimization methods, like end-to-end multiphase, G-MR, etc., which use techniques like Map-Reduce, CDS (Community Detection based Scheduling), ROUT, Workload-Aware Scheduling, SAGE and AMP (Ant Colony Optimization) to handle these issues. The various optimization methods and techniques used are analyzed. It has been observed that end-to-end multiphase achieves time efficiency; cost minimization concentrates on achieving Quality of Service with reduced computation and communication cost; and SAGE achieves performance improvement in processing geo-distributed data sets.
Surveying multidisciplinary aspects in real-time distributed coding for Wireless Sensor Networks.
Braccini, Carlo; Davoli, Franco; Marchese, Mario; Mongelli, Maurizio
2015-01-27
Wireless Sensor Networks (WSNs), where a multiplicity of sensors observe a physical phenomenon and transmit their measurements to one or more sinks, pertain to the class of multi-terminal source and channel coding problems of Information Theory. In this category, "real-time" coding is often encountered for WSNs, referring to the problem of finding the minimum distortion (according to a given measure) attainable by encoding and decoding functions under transmission power constraints, with stringent limits on delay and complexity. The Decision Theory approach, on the other hand, seeks to determine the optimal coding/decoding strategies or some of their structural properties. Since encoder(s) and decoder(s) possess different information, though sharing a common goal, the setting here is that of Team Decision Theory. A more pragmatic vision, rooted in Signal Processing, consists of fixing the form of the coding strategies (e.g., to linear functions) and, consequently, finding the corresponding optimal decoding strategies and the achievable distortion, generally by applying parametric optimization techniques. All approaches have a long history of past investigations and recent results. The goal of the present paper is to provide a taxonomy of the various formulations, a survey of the vast related literature, examples from the authors' own research, and some highlights on the interplay of the different theories.
Designing occupancy studies when false-positive detections occur
Clement, Matthew
2016-01-01
1. Recently, estimators have been developed to estimate occupancy probabilities when false-positive detections occur during presence-absence surveys. Some of these estimators combine different types of survey data to improve estimates of occupancy. With these estimators, there is a tradeoff between the number of sample units surveyed, and the number and type of surveys at each sample unit. Guidance on efficient design of studies when false positives occur is unavailable. 2. For a range of scenarios, I identified survey designs that minimized the mean square error of the estimate of occupancy. I considered an approach that uses one survey method and two observation states and an approach that uses two survey methods. For each approach, I used numerical methods to identify optimal survey designs when model assumptions were met and parameter values were correctly anticipated, when parameter values were not correctly anticipated, and when the assumption of no unmodelled detection heterogeneity was violated. 3. Under the approach with two observation states, false positive detections increased the number of recommended surveys, relative to standard occupancy models. If parameter values could not be anticipated, pessimism about detection probabilities avoided poor designs. Detection heterogeneity could require more or fewer repeat surveys, depending on parameter values. If model assumptions were met, the approach with two survey methods was inefficient. However, with poor anticipation of parameter values, with detection heterogeneity, or with removal sampling schemes, combining two survey methods could improve estimates of occupancy. 4. Ignoring false positives can yield biased parameter estimates, yet false positives greatly complicate the design of occupancy studies. Specific guidance for major types of false-positive occupancy models, and for two assumption violations common in field data, can conserve survey resources. This guidance can be used to design efficient monitoring programs and studies of species occurrence, species distribution, or habitat selection, when false positives occur during surveys.
A Catalog of Cool Dwarf Targets for the Transiting Exoplanet Survey Satellite
NASA Astrophysics Data System (ADS)
Muirhead, Philip S.; Dressing, Courtney D.; Mann, Andrew W.; Rojas-Ayala, Bárbara; Lépine, Sébastien; Paegert, Martin; De Lee, Nathan; Oelkers, Ryan
2018-04-01
We present a catalog of cool dwarf targets (V-J > 2.7, Teff ≲ 4000 K) and their stellar properties for the upcoming Transiting Exoplanet Survey Satellite (TESS), for the purpose of determining which cool dwarfs should be observed using two-minute observations. TESS has the opportunity to search tens of thousands of nearby, cool, late K- and M-type dwarfs for transiting exoplanets, an order of magnitude more than current or previous transiting exoplanet surveys, such as Kepler, K2, and ground-based programs. This necessitates a new approach to choosing cool dwarf targets. Cool dwarfs are chosen by collating parallax and proper motion catalogs from the literature and subjecting them to a variety of selection criteria. We calculate stellar parameters and TESS magnitudes using the best available relations from the literature, while maintaining uniformity of methods for the sake of reproducibility. We estimate the expected planet yield from TESS observations using statistical results from the Kepler mission, and use these results to choose the best targets for two-minute observations, optimizing for small planets whose masses can conceivably be measured using follow-up Doppler spectroscopy by current and future Doppler spectrometers. The catalog is available in machine-readable format and is incorporated into the TESS Input Catalog and TESS Candidate Target List until a more complete and accurate cool dwarf catalog identified by ESA’s Gaia mission can be incorporated.
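The headline selection cut can be expressed directly; the magnitudes and temperatures below are invented placeholders for the collated catalog columns, and the full target list applies many further criteria.

    import numpy as np

    V = np.array([14.2, 11.8, 13.5, 15.0])          # placeholder catalog columns
    J = np.array([10.9, 10.1, 10.2, 11.6])
    teff = np.array([3300.0, 4800.0, 3600.0, 3900.0])

    cool_dwarf = ((V - J) > 2.7) & (teff <= 4000.0)  # color and temperature cuts
    print(np.nonzero(cool_dwarf)[0])                 # indices of selected targets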
Molina, Carlos Martin; Pringle, Jamie K; Saumett, Miguel; Evans, Gethin T
2016-04-01
In most Latin American countries there are significant numbers of both missing people and forced disappearances, ∼71,000 in Colombia alone. Successful detection of buried human remains by forensic search teams can be difficult in varying terrains and climates. Three clandestine burials were simulated at two different depths commonly encountered in Latin America. In order to gain critical knowledge of optimum geophysical detection techniques, the burials were monitored using ground penetrating radar, magnetic susceptibility, bulk ground conductivity and electrical resistivity up to twenty-two months post-burial. Radar survey results showed good detection of the modern half-clothed pig cadavers throughout the survey period on 2D profiles, with the 250 MHz antennae judged optimal. Both the skeletonised and the decapitated and burnt human remains were poorly imaged on 2D profiles, with loss of signal continuity observed throughout the survey period. Horizontal radar time slices showed good anomalies over the targets, but these decreased in amplitude over the post-burial period, judged to be due to detection of the disturbed grave soil rather than just the buried targets. Magnetic susceptibility and electrical resistivity surveys were successful at target detection, in contrast to bulk ground conductivity surveys, which were unsuccessful. Deeper burials were all harder to image than shallower ones. Forensic geophysical surveys should be undertaken at suspected burial sites.
Ulep, Valerie Gilbert T; Borja, Maridel P
2012-07-23
The effect of pregnancy intention on post-natal practices like breastfeeding is still poorly understood in the Philippines. In this light, this study aims to determine the association between pregnancy intention and optimal breastfeeding practices in the Philippines. This is a cross-sectional study using the 2003 Philippine National Demographic and Health Survey. Logistic regression analysis was used to determine the independent association between pregnancy intention and optimal breastfeeding practices. The study includes 3,044 last-born children aged 6-36 months at the time of the survey. Dead children were also included as long as their age at death satisfied the age criterion. Children born from mistimed pregnancies are more likely to have late breastfeeding initiation compared to children born from wanted pregnancies (OR = 1.44; 90% CI: 1.17-1.78). However, this occurs only among children belonging to households with low socio-economic status; among children belonging to households with high socio-economic status, no significant effect of pregnancy intention on breastfeeding initiation was observed. Children born from unwanted pregnancies are less likely to have short breastfeeding duration (OR = 0.60; 90% CI: 0.48-0.76). However, this occurs only among children belonging to households with high socio-economic status; no significant effect of pregnancy intention on breastfeeding duration was observed among children belonging to households with low socio-economic status. The findings of this study suggest that pregnancy intention affects the two types of optimal breastfeeding practices examined differently. With regard to breastfeeding duration, among infants belonging to high-SES households the odds of having short breastfeeding duration are lower among children born from unwanted pregnancies than among children born from wanted ones. Conversely, among children belonging to low-SES households, the odds of having late breastfeeding initiation are higher among children born from mistimed pregnancies than among children born from wanted pregnancies.
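As an illustration of the analytical approach described above, the sketch below shows how an SES-stratified adjusted odds ratio of this kind could be estimated with logistic regression in Python. The file name, column names and covariates are hypothetical; this is not the authors' code.

```python
# Illustrative sketch (not the authors' code): estimating an adjusted odds
# ratio for late breastfeeding initiation by pregnancy intention, stratified
# by household socio-economic status. Column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ndhs_2003_children.csv")  # hypothetical survey extract

for ses, group in df.groupby("ses_level"):  # e.g. "low" / "high"
    model = smf.logit(
        "late_initiation ~ C(intention, Treatment('wanted')) + maternal_age + parity",
        data=group,
    ).fit(disp=0)
    or_ci = np.exp(model.conf_int(alpha=0.10))  # 90% CI, matching the paper
    print(ses, np.exp(model.params), or_ci)
```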
How to find and type red/brown dwarf stars in near-infrared imaging space observatories
NASA Astrophysics Data System (ADS)
Holwerda, Benne Willem; Ryan, Russell; Bridge, Joanna; Pirzkal, Nor; Kenworthy, Matthew; Andersen, Morten; Wilkins, Stephen; Trenti, Michele; Meshkat, Tiffany; Bernard, Stephanie; Smit, Renske
2018-01-01
Here we evaluate the near-infrared colors of brown dwarfs as observed with four major infrared imaging space observatories: the Hubble Space Telescope (HST), the James Webb Space Telescope (JWST), the EUCLID mission, and the WFIRST telescope. We use the SpeX Prism Library Analysis Toolkit (splat) to map out the colors of the M, L, and T-type brown dwarfs. We identify which color-color combination is optimal for identifying broad type and which single color is optimal to then identify the subtype (e.g., T0-9). We evaluate each observatory separately as well as the narrow-field (HST and JWST) and wide-field (EUCLID and WFIRST) combinations. HST filters used thus far for high-redshift searches (e.g., CANDELS and BoRG) are close to optimal within the available filter combinations. A clear improvement over HST is one of two broad/medium filter combinations on JWST: pairing F140M with either F150W or F162M discriminates well between brown dwarf subtypes. The improvement of the JWST filter set over the HST one is so marked that no combination of HST and JWST filters improves the classification further. The EUCLID filter set alone performs poorly in terms of typing brown dwarfs, and WFIRST performs only marginally better despite a wider selection of filters. A combined EUCLID and WFIRST observation, using WFIRST's W146 and F062 filters and EUCLID's Y band, allows for much better discrimination between broad brown dwarf categories. In this respect, WFIRST acts as a targeted follow-up observatory for the all-sky EUCLID survey. However, subsequent subtyping with the combination of EUCLID and WFIRST observations remains uncertain due to the lack of medium- or narrow-band filters in this wavelength range. We argue that a medium band added to the WFIRST filter selection would greatly improve its ability to preselect against brown dwarfs in high-latitude surveys.
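The selection logic described here, ranking colour-colour combinations by how cleanly they separate the broad spectral types, can be made concrete with a toy sketch. The input table of synthetic magnitudes (e.g., from convolving SpeX spectra with filter curves) and its column names are assumptions.

```python
# Toy sketch of the selection logic: rank colour-colour combinations by how
# cleanly they separate M, L and T templates. Synthetic magnitudes per filter
# are assumed to be precomputed in a table with hypothetical column names.
import itertools
import numpy as np
import pandas as pd

mags = pd.read_csv("synthetic_mags.csv")  # columns: sptype, F140M, F150W, ...
filters = [c for c in mags.columns if c != "sptype"]

def separation(col_a, col_b):
    """Mean between-type distance over mean within-type scatter in the plane."""
    x = mags[col_a[0]] - mags[col_a[1]]
    y = mags[col_b[0]] - mags[col_b[1]]
    g = pd.DataFrame({"t": mags.sptype.str[0], "x": x, "y": y}).groupby("t")
    centers = g.mean()
    spread = g.std().mean().mean() + 1e-9
    dists = [np.hypot(*(centers.loc[a] - centers.loc[b]))
             for a, b in itertools.combinations(centers.index, 2)]
    return np.mean(dists) / spread

colours = list(itertools.combinations(filters, 2))
best = max(itertools.combinations(colours, 2), key=lambda cc: separation(*cc))
print("best colour-colour combination:", best)
```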
AlKherayf, Fahad; Xu, Yan; Westwick, Harrison; Moldovan, Ioana Doina; Wells, Philip S
2017-03-01
While oral anticoagulation (OAC) is universally indicated for patients with mechanical heart valves (MHVs), OAC resumption following anticoagulant-associated intracerebral hemorrhage (ICH) is an area of uncertainty. We sought to determine the practice preferences of North American neurosurgeons and thrombosis experts on the optimal timing of OAC re-initiation. A cross-sectional survey was disseminated to North American members of the American Association of Neurological Surgeons and the International Society on Thrombosis and Haemostasis. Demographic factors, as well as a clinical scenario with 14 modifiable clinical risk factors, were included in the survey. 504 physicians completed our survey (response rate 34.3%). The majority of participants were affiliated with academic centres and managed ≤10 ICH patients with MHV per year. There was a wide distribution in responses on the optimal timing for OAC resumption following an ICH: 59% and 60% preferred to restart OAC between 3 and 14 days following the hemorrhagic event (median of 6-7 days). Smaller hemorrhages (<30 cm³), CHADS2 score ≥2, concomitant venous thromboembolism, mitral valve prosthesis, caged-ball valves and multiple valves prompted earlier OAC resumption. Wide variation in the current practice of neurosurgeons and thrombosis specialists exists when they encounter patients with ICH and MHV, though decisions were influenced by patient- and valve-related factors. As the observed variation likely reflects the immense gap in current evidence, prospective randomized trials in this population are urgently needed. Copyright © 2017 Elsevier B.V. All rights reserved.
Marital status and optimism score among breast cancer survivors.
Croft, Lindsay; Sorkin, John; Gallicchio, Lisa
2014-11-01
There is an increasing number of breast cancer survivors, but their psychosocial and supportive care needs are not well understood. Recent work has found marital status, social support, and optimism to be associated with quality of life, but little research has been conducted to understand how these factors relate to one another. Survey data from 722 breast cancer survivors were analyzed to estimate the association between marital status and optimism score, as measured using the Life Orientation Test-Revised. Linear regression was used to estimate the relationship between marital status and optimism, controlling for potential confounding variables and assessing effect modification. The results showed that the association between marital status and optimism was modified by time since breast cancer diagnosis. Specifically, among those most recently diagnosed (within 5 years), married breast cancer survivors had a 1.50-point higher mean optimism score than unmarried survivors (95% confidence interval (CI) 0.37, 2.62; p = 0.009). The difference in optimism score by marital status was not present more than 5 years from breast cancer diagnosis. Findings suggest that among breast cancer survivors within 5 years of diagnosis, those who are married have higher optimism scores than their unmarried counterparts; this association was not observed among longer-term breast cancer survivors. Future research should examine whether the difference in optimism score in this subgroup of breast cancer survivors is clinically relevant.
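Effect modification of this kind is commonly tested with an interaction term; a minimal sketch with hypothetical variable names follows (not the authors' analysis code).

```python
# Sketch of testing effect modification: an interaction between marital
# status and time since diagnosis in a linear model of optimism score.
# File and variable names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("survivor_survey.csv")
m = smf.ols(
    "optimism ~ married * recent_dx + age + stage",  # recent_dx: <=5 yr since diagnosis
    data=df,
).fit()
print(m.summary())  # a significant married:recent_dx term indicates modification
```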
How can we Optimize Global Satellite Observations of Glacier Velocity and Elevation Changes?
NASA Astrophysics Data System (ADS)
Willis, M. J.; Pritchard, M. E.; Zheng, W.
2015-12-01
We have started a global compilation of glacier surface-elevation change rates, measured by altimeters and by differencing of Digital Elevation Models, and of glacier velocities, measured by Synthetic Aperture Radar (SAR) and optical feature tracking as well as by Interferometric SAR (InSAR). Our goal is to compile statistics on recent ice flow velocities and surface-elevation change rates near the fronts of all available glaciers using the literature and our own data sets for the Russian Arctic, Patagonia, Alaska, Greenland and Antarctica, the Himalayas, and other locations. We quantify the percentage of the glaciers on the planet that can be regarded as fast-flowing, with surface velocities of more than 50 meters per year, while also recording glaciers that have elevation change rates of more than 2 meters per year. We examine whether glaciers have significant interannual variations in velocity, or have accelerated or stagnated, where time series of ice motions are available. We use glacier boundaries and identifiers from the Randolph Glacier Inventory. Our survey highlights glaciers that are likely to react quickly to changes in their mass accumulation rates. The study also identifies geographical areas where our knowledge of glacier dynamics remains poor. Our survey helps guide how frequently observations must be made in order to provide quality satellite-derived velocity and ice-elevation observations across a variety of glacier thermal regimes, speeds and widths. Our objectives are to determine to what extent the joint NASA and Indian Space Research Organization Synthetic Aperture Radar mission (NISAR) will be able to provide global precision coverage of ice speed changes, and to determine how to optimize observations from the global constellation of satellite missions to record important changes to glacier elevations and velocities worldwide.
ART: Surveying the Local Universe at 2-11 keV
NASA Technical Reports Server (NTRS)
O'Dell, S. L.; Ramsey, B. D.; Adams, M. L.; Brandt, W. N.; Gubarev, M. V.; Hasinger, G.; Pavlinsky, M.; Predehl, P.; Romaine, S. E.; Swartz, D. A.
2008-01-01
The Astronomical Roentgen Telescope (ART) is a medium-energy x-ray telescope system proposed for the Russian-led mission Spectrum-Roentgen-Gamma (SRG). Optimized for performance over the 2-11-keV band, ART complements the softer response of the SRG prime instrument, the German eROSITA x-ray telescope system. The anticipated number of ART detections is 50,000, with 1,000 heavily obscured (N_H > 3x10^23 cm^-2) AGN, in the SRG 4-year all-sky survey, plus a comparable number in deeper wide-field (500 deg^2 total) surveys. ART's surveys will provide a minimally biased, nearly complete census of the local Universe in the medium-energy x-ray band (including Fe-K lines) at CCD spectral resolution. During long (approximately 100-ks) pointed observations, ART can obtain statistically significant spectral data up to about 15 keV for bright sources, and medium-energy x-ray continuum and Fe-K-line spectra of AGN detected with the contemporaneous NuSTAR hard-x-ray mission.
Use of High Fidelity Methods in Multidisciplinary Optimization-A Preliminary Survey
NASA Technical Reports Server (NTRS)
Guruswamy, Guru P.; Kwak, Dochan (Technical Monitor)
2002-01-01
Multidisciplinary optimization is a key element of the design process. To date, multidisciplinary optimization methods that use low-fidelity models are well advanced: optimization methods based on simple linear aerodynamic equations and plate structural equations have been applied to complex aerospace configurations. However, the use of high-fidelity methods, such as the Euler/Navier-Stokes equations for fluids and 3-D (three-dimensional) finite elements for structures, has begun only recently. As an activity of the Multidisciplinary Design Optimization Technical Committee (MDO TC) of the AIAA (American Institute of Aeronautics and Astronautics), an effort was initiated to assess the status of the use of high-fidelity methods in multidisciplinary optimization. Contributions were solicited through the members of the MDO TC. This paper provides a summary of that survey.
Pipeline Reduction of Binary Light Curves from Large-Scale Surveys
NASA Astrophysics Data System (ADS)
Prša, Andrej; Zwitter, Tomaž
2007-08-01
One of the most important changes in observational astronomy of the 21st century is the rapid shift from classical object-by-object observations to extensive automatic surveys. As CCD detectors get better and their prices get lower, more and more small and medium-size observatories are refocusing their attention toward the detection of stellar variability through systematic sky-scanning missions. This trend is additionally powered by the success of pioneering surveys such as ASAS, DENIS, OGLE, TASS, their space counterpart Hipparcos, and others. Such surveys produce massive amounts of data, and it is not at all clear how these data are to be reduced and analysed. This is especially striking in the eclipsing binary (EB) field, where the most frequently used tools are optimized for object-by-object analysis. A clear need for thorough, reliable and fully automated approaches to the modeling and analysis of EB data is thus obvious. The task is very difficult because of limited data quality, non-uniform phase coverage and parameter degeneracy. The talk will review recent advancements in putting together semi-automatic and fully automatic pipelines for EB data processing. Automatic procedures have already been used to process the Hipparcos data, LMC/SMC observations, and the OGLE and ASAS catalogs, among others. We shall discuss the advantages and shortcomings of these procedures and overview the current status of automatic EB modeling pipelines for upcoming missions such as CoRoT, Kepler, Gaia and others.
Scheduling Algorithm for the Large Synoptic Survey Telescope
NASA Astrophysics Data System (ADS)
Ichharam, Jaimal; Stubbs, Christopher
2015-01-01
The Large Synoptic Survey Telescope (LSST) is a wide-field telescope currently under construction, scheduled to be deployed in Chile by 2022 and to operate for a ten-year survey. As a ground-based telescope with the largest etendue ever constructed, and the ability to take images approximately once every eighteen seconds, the LSST will be able to capture the entirety of the observable sky every few nights in six different bandpasses. With these remarkable features, LSST is primed to provide the scientific community with invaluable data in numerous areas of astronomy, including the observation of near-Earth asteroids, the detection of transient optical events such as supernovae, and the study of dark matter and dark energy through weak gravitational lensing. In order to maximize the utility that LSST will provide toward achieving these scientific objectives, it proves necessary to develop a flexible scheduling algorithm for the telescope which both optimizes its observational efficiency and allows for adjustment based on the evolving needs of the astronomical community. This work defines a merit function that incorporates the urgency of observing a particular field in the sky as a function of time elapsed since it was last observed, dynamic viewing conditions (in particular transparency and sky brightness), and a measure of scientific interest in the field. The problem of maximizing this merit function, summed across the entire observable sky, is then reduced to a classic variant of the dynamic traveling salesman problem. We introduce a new approximation technique that appears particularly well suited to this situation, analyze its effectiveness in resolving the problem, and obtain some promising initial results.
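A minimal greedy illustration of a merit-driven scheduler of this flavour is sketched below; the functional forms of the urgency and condition terms, and the slew penalty, are invented for illustration and are not the algorithm proposed in the talk.

```python
# Assumption-laden sketch (not the LSST algorithm): pick the next field that
# maximizes a merit function minus a slew penalty, which is the greedy view
# of the dynamic travelling-salesman formulation described above.
import numpy as np

def merit(t_since_obs, transparency, sky_brightness, science_weight):
    urgency = 1.0 - np.exp(-t_since_obs / 3.0)   # saturating urgency, in days
    conditions = transparency / sky_brightness    # toy observing-quality term
    return science_weight * urgency * conditions

def next_field(fields, current_pos, slew_rate=5.0):
    """fields: dicts with keys t_since, transp, sky, weight, pos (deg)."""
    scores = [
        merit(f["t_since"], f["transp"], f["sky"], f["weight"])
        - np.hypot(*(np.asarray(f["pos"]) - current_pos)) / slew_rate
        for f in fields
    ]
    return int(np.argmax(scores))
```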
Optimizing an experimental design for an electromagnetic experiment
NASA Astrophysics Data System (ADS)
Roux, Estelle; Garcia, Xavier
2013-04-01
Most geophysical studies focus on data acquisition and analysis, but another aspect that is gaining importance is the discussion of how to acquire suitable datasets. This can be done through the design of an optimal experiment. Optimizing an experimental design implies a compromise between maximizing the information we obtain about the target and reducing the cost of the experiment, subject to a wide range of constraints (logistical, financial, experimental, ...). We are currently developing a method to design an optimal controlled-source electromagnetic (CSEM) experiment to detect a potential CO2 reservoir and to monitor this reservoir during and after CO2 injection. Our statistical algorithm combines the use of linearized inverse theory (to evaluate the quality of a given design via the objective function) with stochastic optimization methods such as genetic algorithms (to examine a wide range of possible surveys). The particularity of our method is that it uses a multi-objective genetic algorithm that searches for designs fitting several objective functions simultaneously. One main advantage of this kind of technique for designing an experiment is that it does not require the acquisition of any data, and it can thus easily be conducted before any geophysical survey. Our new experimental-design algorithm has been tested with a realistic one-dimensional resistivity model of the Earth in the region of study (the northern Spain CO2 sequestration test site). We show that a small number of well-distributed observations have the potential to resolve the target. This simple test also points out the importance of a well-chosen objective function. Finally, in the context of the CO2 sequestration that motivates this study, we may be interested in maximizing the information we obtain about the reservoir layer; in that case, we show how the combination of two different objective functions considerably improves its resolution.
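To make the multi-objective search concrete, here is a toy Pareto-based evolutionary loop over candidate receiver layouts. Both objective functions are placeholders, not the linearized-inversion metrics used in the study, and the selection scheme is deliberately simplified.

```python
# Toy multi-objective GA in the spirit described above: candidate CSEM
# designs are bit-masks over possible receiver sites; one objective proxies
# target resolution, the other counts cost. Both objectives are placeholders.
import numpy as np

rng = np.random.default_rng(0)
N_SITES, POP, GEN = 30, 40, 100

def resolution(design):          # placeholder for a linearized-inversion metric
    return design.sum() + 0.5 * design[::3].sum()

def cost(design):                # placeholder: one unit per deployed receiver
    return design.sum()

def dominates(a, b):             # maximize resolution, minimize cost
    return (resolution(a) >= resolution(b) and cost(a) <= cost(b)
            and (resolution(a) > resolution(b) or cost(a) < cost(b)))

pop = rng.integers(0, 2, size=(POP, N_SITES))
for _ in range(GEN):
    children = pop.copy()
    flip = rng.random(children.shape) < 0.05        # bit-flip mutation
    children[flip] ^= 1
    union = np.vstack([pop, children])
    front = [d for d in union if not any(dominates(o, d) for o in union)]
    idx = rng.choice(len(union), POP, replace=False)
    pop = np.vstack([front, union[idx]])[:POP]      # keep the front, pad randomly
print("non-dominated designs found:", len(front))
```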
Optimal surveillance strategy for invasive species management when surveys stop after detection.
Guillera-Arroita, Gurutzeta; Hauser, Cindy E; McCarthy, Michael A
2014-05-01
Invasive species are a cause for concern in natural and economic systems and require both monitoring and management. There is a trade-off between the amount of resources spent on surveying for the species and conducting early management of occupied sites, and the resources that are ultimately spent on delayed management at sites where the species was present but undetected. Previous work addressed this optimal resource allocation problem assuming that surveys continue, despite detection, until the initially planned survey effort is consumed. However, a more realistic scenario is often that surveys stop after detection (i.e., follow a "removal" sampling design) and management then begins. Such an approach implies a different optimal survey design and can be expected to be more efficient. We analyze this case and compare the expected efficiency of invasive species management programs under both survey methods. We also evaluate the impact of mis-specifying the type of sampling approach during the program design phase. We derive analytical expressions that optimize resource allocation between monitoring and management in surveillance programs when surveys stop after detection. We do this under a scenario of unconstrained resources and under scenarios where the survey budget is constrained. The efficiency of surveillance programs is greater if a "removal survey" design is used, with larger gains obtained when savings from early detection are high, occupancy is high, and survey costs are not much lower than early management costs at a site. Designing a surveillance program while disregarding that surveys stop after detection can result in an efficiency loss. Our results help guide the design of future surveillance programs for invasive species. Addressing program design within a decision-theoretic framework can lead to a better use of available resources. We show how species prevalence, its detectability, and the benefits derived from early detection can be considered.
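The flavour of the trade-off can be reproduced with a toy expected-cost model; the paper's analytical expressions are more general, and all parameter values below are invented.

```python
# Illustrative expected-cost comparison under a toy model (not the paper's
# derivation): a site is occupied with probability psi, each survey visit
# detects with probability p and costs c; early management costs Ce, delayed
# management (species present but missed) costs Cd > Ce.
def expected_cost_fixed(n, psi=0.3, p=0.5, c=1.0, Ce=20.0, Cd=100.0):
    miss = (1 - p) ** n                  # all n visits always made
    return n * c + psi * ((1 - miss) * Ce + miss * Cd)

def expected_cost_removal(n, psi=0.3, p=0.5, c=1.0, Ce=20.0, Cd=100.0):
    miss = (1 - p) ** n
    # expected visits actually made, since surveying stops at first detection
    e_visits = psi * sum((1 - p) ** (k - 1) * p * k for k in range(1, n + 1))
    e_visits += psi * miss * n + (1 - psi) * n
    return e_visits * c + psi * ((1 - miss) * Ce + miss * Cd)

best = min(range(1, 15), key=expected_cost_removal)
print(best, expected_cost_removal(best), expected_cost_fixed(best))
```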
Development and tuning of an original search engine for patent libraries in medicinal chemistry.
Pasche, Emilie; Gobeill, Julien; Kreim, Olivier; Oezdemir-Zaech, Fatma; Vachon, Therese; Lovis, Christian; Ruch, Patrick
2014-01-01
The large increase in the size of patent collections has led to a need for efficient search strategies, but the development of advanced text-mining applications dedicated to patents in the biomedical field remains rare, in particular applications addressing the needs of the pharmaceutical and biotech industry, which intensively uses patent libraries for competitive intelligence and drug development. We describe here the development of an advanced retrieval engine to search information in patent collections in the field of medicinal chemistry. We investigate and combine different strategies and evaluate their respective impact on the performance of the search engine applied to various search tasks, which cover the putatively most frequent search behaviours of intellectual property officers in medicinal chemistry: 1) a prior art search task; 2) a technical survey task; and 3) a variant of the technical survey task, sometimes called a known-item search task, where a single patent is targeted. The optimal tuning of our engine resulted in a top-precision of 6.76% for the prior art search task, 23.28% for the technical survey task and 46.02% for the variant of the technical survey task. We observed that co-citation boosting was an appropriate strategy to improve prior art search tasks, while IPC classification of queries improved retrieval effectiveness for technical survey tasks. Surprisingly, the use of the full body of the patent was always detrimental to search effectiveness. It was also observed that normalizing biomedical entities using curated dictionaries had simply no impact on the search tasks we evaluated. The search engine was finally implemented as a web application within Novartis Pharma; the application is briefly described in the report. We have presented the development of a search engine dedicated to patent search, based on state-of-the-art methods applied to patent corpora. We have shown that properly tuning the system to the various search tasks clearly increases its effectiveness. We conclude that different search tasks demand different information-retrieval engine settings in order to yield optimal end-user retrieval.
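One of the reported tuning ideas, co-citation boosting, can be sketched generically as blending a text-similarity score with a normalized co-citation count. The TF-IDF back-end and the boost weight below are assumptions, not the engine's actual components.

```python
# Hedged sketch of the general idea (not the engine itself): blend a text
# similarity score with a co-citation boost when ranking patents for a
# prior-art query.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def rank(query, docs, cocitations, boost=0.2):
    """docs: list of patent texts; cocitations: per-doc co-citation counts."""
    vec = TfidfVectorizer(stop_words="english")
    tfidf = vec.fit_transform(docs + [query])
    text_score = cosine_similarity(tfidf[-1], tfidf[:-1]).ravel()
    cc = np.asarray(cocitations, dtype=float)
    cc_score = cc / (cc.max() + 1e-9)           # normalize the boost term
    return np.argsort(-(text_score + boost * cc_score))
```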
NASA Astrophysics Data System (ADS)
Gen, Mitsuo; Lin, Lin
Many combinatorial optimization problems arising from real-world industrial engineering and operations research are very complex in nature and quite hard to solve by conventional techniques. Since the 1960s there has been increasing interest in imitating living beings to solve such hard combinatorial optimization problems. Simulating the natural evolutionary process of human beings results in stochastic optimization techniques called evolutionary algorithms (EAs), which can often outperform conventional optimization methods when applied to difficult real-world problems. In this survey paper, we provide a comprehensive survey of the current state of the art in the use of EAs in manufacturing and logistics systems. To demonstrate that EAs are powerful and broadly applicable stochastic search and optimization techniques, we deal with the following engineering design problems: transportation planning models, layout design models and two-stage logistics models in logistics systems; and job-shop scheduling and resource-constrained project scheduling in manufacturing systems.
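A minimal permutation-encoded EA on a toy routing instance makes the generic loop concrete; this is a textbook-style sketch, not a method from the survey.

```python
# Minimal permutation-encoded evolutionary algorithm on a toy routing
# instance: elitist selection plus a 2-opt style reversal mutation.
import numpy as np

rng = np.random.default_rng(1)
cities = rng.random((12, 2))                    # random city coordinates

def tour_len(perm):
    pts = cities[np.r_[perm, perm[0]]]          # close the tour
    return np.linalg.norm(np.diff(pts, axis=0), axis=1).sum()

pop = [rng.permutation(len(cities)) for _ in range(50)]
for _ in range(300):
    pop.sort(key=tour_len)
    elite = pop[:10]                            # keep the 10 best tours
    children = []
    for _ in range(40):
        p = elite[rng.integers(10)].copy()
        i, j = sorted(rng.integers(len(p), size=2))
        p[i:j + 1] = p[i:j + 1][::-1]           # reversal mutation
        children.append(p)
    pop = elite + children
print("best tour length:", tour_len(pop[0]))
```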
Resolving the Milky Way and Nearby Galaxies with WFIRST
NASA Astrophysics Data System (ADS)
Kalirai, Jasonjot
High-resolution studies of nearby stellar populations have served as a foundation for our quest to understand the nature of galaxies. Today, studies of resolved stellar populations constrain fundamental relations -- such as the initial mass function of stars, the time scales of stellar evolution, the timing of mass loss and the amount of energetic feedback, the color-magnitude relation and its dependence on age and metallicity, the stellar-dark matter connection in galaxy halos, and the build-up of stellar populations over cosmic time -- that represent key ingredients in our prescription to interpret light from the Universe and to measure the physical state of galaxies. More than in any other area of astrophysics, WFIRST will yield a transformative impact in measuring and characterizing resolved stellar populations in the Milky Way and nearby galaxies. The proximity of these populations, and the level of detail at which they must be studied, map directly onto all three pillars of WFIRST's capabilities: sensitivity from a 2.4-meter space-based telescope, resolution from 0.1" pixels, and a large 0.3-degree field of view from multiple detectors. Our WFIRST GO Science Investigation Team (F) will develop three WFIRST (notional) GO programs related to resolved stellar populations to fully stress WFIRST's Wide Field Instrument. The programs will include a Survey of the Milky Way, a Survey of Nearby Galaxy Halos, and a Survey of Star-Forming Galaxies. Specific science goals for each program will be validated through a wide range of observational data sets, simulations, and new algorithms. As an output of this study, our team will deliver optimized strategies and tools to maximize stellar population science with WFIRST. These will include: new grids of IR-optimized stellar evolution and synthetic spectroscopic models; pipelines and algorithms for optimal data reduction at the WFIRST sensitivity and pixel scale; wide-field simulations of MW environments and galaxy halos; cosmological simulations of nearby galaxy halos matched to WFIRST observations; strategies and automated algorithms to find substructure and dwarf galaxies in WFIRST IR data sets; and documentation. Our team will work closely with the WFIRST Science Center to translate our notional programs into inputs that can help achieve readiness for WFIRST science operations. This includes building full observing programs with target definitions, observing sequences, scheduling constraints, data processing needs, and calibration requirements. Our team has been chosen carefully. Team members are leading scientists in the stellar population work that will be a core science theme for WFIRST and are also involved in all large future astronomy projects that will operate in the WFIRST era. The team is intentionally small, and each member will "own" significant science projects. The team will aggressively advocate for WFIRST through innovative initiatives. The team is also diverse in geographical location, in observers and theorists, and in gender.
A Brief Survey of Modern Optimization for Statisticians
Lange, Kenneth; Chi, Eric C.; Zhou, Hua
2014-01-01
Modern computational statistics is turning more and more to high-dimensional optimization to handle the deluge of big data. Once a model is formulated, its parameters can be estimated by optimization. Because model parsimony is important, models routinely include nondifferentiable penalty terms such as the lasso. This sober reality complicates minimization and maximization. Our broad survey stresses a few important principles in algorithm design. Rather than view these principles in isolation, it is more productive to mix and match them. A few well chosen examples illustrate this point. Algorithm derivation is also emphasized, and theory is downplayed, particularly the abstractions of the convex calculus. Thus, our survey should be useful and accessible to a broad audience. PMID:25242858
ERIC Educational Resources Information Center
Hsieh, Chuan-Chung; Yen, Hung-Chin; Kuan, Liu-Yen
2014-01-01
This study empirically investigates the relationships among principals' technology leadership, teaching innovations, and students' academic optimism by surveying elementary school educators across Taiwan. Of the total 1,080 questionnaires distributed, 755 valid surveys were returned for a 69.90% return rate. Teachers were asked to indicate the…
The DiskMass Survey. II. Error Budget
NASA Astrophysics Data System (ADS)
Bershady, Matthew A.; Verheijen, Marc A. W.; Westfall, Kyle B.; Andersen, David R.; Swaters, Rob A.; Martinsson, Thomas
2010-06-01
We present a performance analysis of the DiskMass Survey. The survey uses collisionless tracers in the form of disk stars to measure the surface density of spiral disks, to provide an absolute calibration of the stellar mass-to-light ratio (Υ*), and to yield robust estimates of the dark-matter halo density profile in the inner regions of galaxies. We find that a disk inclination range of 25°-35° is optimal for our measurements, consistent with our survey design to select nearly face-on galaxies. Uncertainties in disk scale heights are significant, but can be estimated from radial scale lengths to 25% now, and more precisely in the future. We detail the spectroscopic analysis used to derive line-of-sight velocity dispersions, precise at low surface brightness and accurate in the presence of composite stellar populations. Our methods take full advantage of large-grasp integral-field spectroscopy and an extensive library of observed stars. We show that the baryon-to-total mass fraction (F_bar) is not a well-defined observational quantity because it is coupled to the halo mass model. This remains true even when the disk mass is known and spatially extended rotation curves are available. In contrast, the fraction of the rotation speed supplied by the disk at 2.2 scale lengths (disk maximality) is a robust observational indicator of the baryonic disk contribution to the potential. We construct the error budget for the key quantities: dynamical disk mass surface density (Σ_dyn), disk stellar mass-to-light ratio (Υ*^disk), and disk maximality (F*^disk_max ≡ V*^disk_max / V_c). Random and systematic errors in these quantities for individual galaxies will be ~25%, while survey precision for sample quartiles is reduced to 10%, largely devoid of systematic errors aside from distance uncertainties.
Multiple sensitive estimation and optimal sample size allocation in the item sum technique.
Perri, Pier Francesco; Rueda García, María Del Mar; Cobo Rodríguez, Beatriz
2018-01-01
For surveys of sensitive issues in the life sciences, statistical procedures can be used to reduce nonresponse and social-desirability response bias. Both of these phenomena provoke nonsampling errors that are difficult to deal with and can seriously flaw the validity of the analyses. The item sum technique (IST) is a very recent indirect questioning method, derived from the item count technique, that seeks to procure more reliable responses on quantitative items than direct questioning while preserving respondents' anonymity. This article addresses two important questions concerning the IST: (i) its implementation when two or more sensitive variables are investigated and efficient estimates of their unknown population means are required; and (ii) the determination of the optimal sample size to achieve minimum-variance estimates. These aspects are of great relevance for survey practitioners engaged in sensitive research and, to the best of our knowledge, had not been studied before. In this article, theoretical results for multiple estimation and optimal allocation are obtained under a generic sampling design and then particularized to simple random sampling and stratified sampling designs. The theoretical considerations are integrated with a number of simulation studies based on data from two real surveys, conducted to ascertain the efficiency gain derived from optimal allocation in different situations. One of the surveys concerns cannabis consumption among university students. Our findings highlight some methodological advances that can be obtained in life sciences IST surveys when optimal allocation is achieved. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
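The classical Neyman allocation, with stratum sample sizes proportional to N_h S_h, conveys the idea behind optimal allocation in stratified designs; the IST-specific variance components derived in the paper are not reproduced here.

```python
# Classic Neyman-style optimal allocation for stratified sampling: allocate
# n_total across strata in proportion to stratum size times stratum standard
# deviation. A sketch of the underlying idea, not the paper's IST formulas.
import numpy as np

def neyman_allocation(n_total, N_h, S_h):
    """N_h: stratum sizes; S_h: stratum standard deviations."""
    N_h, S_h = np.asarray(N_h, float), np.asarray(S_h, float)
    w = N_h * S_h
    return np.round(n_total * w / w.sum()).astype(int)

# Example: 500 interviews over three strata of unequal size and variability.
print(neyman_allocation(500, [2000, 5000, 3000], [4.0, 1.5, 2.5]))
```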
JAMES WEBB SPACE TELESCOPE CAN DETECT KILONOVAE IN GRAVITATIONAL WAVE FOLLOW-UP SEARCH
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bartos, I.; Márka, S.; Huard, T. L., E-mail: ibartos@phys.columbia.edu
Kilonovae represent an important electromagnetic counterpart for compact binary mergers, which could become the most commonly detected gravitational-wave (GW) source. Follow-up observations of kilonovae, triggered by GW events, are nevertheless difficult due to poor localization by GW detectors and due to their faint near-infrared peak emission, for which observational capability is limited. We show that the Near-Infrared Camera (NIRCam) on the James Webb Space Telescope will be able to detect kilonovae within the relevant GW-detection range of ∼200 Mpc in short (≲12-s) exposure times for a week following the merger. Despite this sensitivity, a kilonova search fully covering a fiducial localized area of 10 deg² will not be viable with NIRCam due to its limited field of view. However, targeted surveys may be developed to optimize the likelihood of discovering kilonovae efficiently within limited observing time. We estimate that a survey of 10 deg² focused on galaxies within 200 Mpc would require about 13 hr, dominated by overhead times; a survey further focused on galaxies exhibiting high star formation rates would require ∼5 hr. The characteristic time may be reduced to as little as ∼4 hr, without compromising the likelihood of detecting kilonovae, by surveying sky areas associated with the 50%, rather than 90%, confidence regions of 3 GW events, rather than a single event. Upon the detection and identification of a kilonova, a limited number of NIRCam follow-up observations could constrain the properties of matter ejected by the binary and the equation of state of dense nuclear matter.
Caya, Teresa; Musuuza, Jackson; Yanke, Eric; Schmitz, Michelle; Anderson, Brooke; Carayon, Pascale; Safdar, Nasia
2015-01-01
We undertook a systems engineering approach to evaluate housewide implementation of daily chlorhexidine bathing. We performed direct observations of the bathing process and conducted provider and patient surveys. The main outcome was compliance with the bathing protocol, assessed using a checklist. Fifty-seven percent of baths were fully compliant with the chlorhexidine bathing protocol; the additional time required was the main barrier. Institutions undertaking daily chlorhexidine bathing should perform a rigorous assessment of implementation to optimize the benefits of this intervention.
WINGS: WFIRST Infrared Nearby Galaxy Survey
NASA Astrophysics Data System (ADS)
Williams, Benjamin
WFIRST's combination of wide field and high resolution will revolutionize the study of nearby galaxies. We propose to produce and analyze simulated WFIRST data of nearby galaxies and their halos to maximize the scientific yield in the limited observing time available, ensuring the legacy value of WFIRST's eventual archive. We will model both halo structure and resolved stellar populations to optimize WFIRST's constraints on both dark matter and galaxy formation models in the local universe. WFIRST can map galaxy structure down to ~35 mag/square arcsecond using individual stars. The resulting maps of stellar halos and accreting dwarf companions will provide stringent tests of galaxy formation and dark matter models on galactic (and even sub-galactic) scales, which is where the most theoretical tension exists with the Lambda-CDM model. With a careful, coordinated plan, WFIRST can be expected to improve current sample sizes by 2 orders of magnitude, down to surface brightness limits comparable to those currently reached only in the Local Group, and that are >4 magnitudes fainter than achievable from the ground due to limitations in star-galaxy separation. WFIRST's maps of galaxy halos will simultaneously produce photometry for billions of stars in the main bodies of galaxies within 10 Mpc. These data will transform studies of star formation histories that track stellar mass growth as a function of time and position within a galaxy. They also will constrain critical stellar evolution models of the near-infrared bright, rapidly evolving stars that can contribute significantly to the integrated light of galaxies in the near-infrared. Thus, with WFIRST we can derive the detailed evolution of individual galaxies, reconstruct the complete history of star formation in the nearby universe, and put crucial constraints on the theoretical models used to interpret near-infrared extragalactic observations. We propose a three-component work plan that will ensure these gains by testing and optimizing WFIRST observing strategies and providing science guidance to trade studies of observatory requirements such as field of view, pixel scale and filter selection. First, we will perform extensive simulations of galaxies' halo substructures and stellar populations that will be used as input for optimizing observing strategies and sample selection. Second, we will develop a pipeline that optimizes stellar photometry, proper motion, and variability measurements with WFIRST. This software will: maximize data quality & scientific yield; provide essential, independent calibrations to the larger WFIRST efforts; and rapidly provide accurate photometry and astrometry to the community. Third, we will derive quantitative performance metrics to fairly evaluate trade-offs between different survey strategies and WFIRST performance capabilities. The end result of this effort will be: (1) an efficient survey strategy that maximizes the scientific yield of what would otherwise be a chaotic archive of observations from small, un-coordinated programs; (2) a suite of analysis tools and a state-of-the-art pipeline that can be deployed after launch to rapidly deliver stellar photometry to the public; (3) a platform to independently verify the calibration and point spread function modeling that are essential to the primary WFIRST goals, but that are best tested from images of stellar populations. 
These activities will be carried out by a Science Investigation Team that has decades of experience in using nearby galaxies to inform fundamental topics in astrophysics. This team is composed of researchers who have led the charge in observational and theoretical studies of resolved stellar populations and stellar halos. With our combined background, we are poised to take full advantage of the large field of view and high spatial resolution WFIRST will offer.
Raymond, Jofrey; Kassim, Neema; Rose, Jerman W.; Agaba, Morris
2017-01-01
ABSTRACT Background: Achieving the nutritional goals of infants and young children while maintaining the intake of local and culture-specific foods can be a daunting task. Diet optimisation using linear goal programming (LP) can effectively generate optimal formulations incorporating local and culturally acceptable foods. Objective: The primary objective of this study was to determine whether a realistic and affordable diet that achieves dietary recommended intakes (DRIs) for 22 selected nutrients can be formulated for rural 6–23-month-old children in Tanzania. Design: Dietary intakes of 400 children aged 6–23 months were assessed using a weighed dietary record (WDR), 24-hour dietary recalls and a 7-day food record. A market survey was also carried out to estimate the cost per 100 g of edible portion of foods that are commonly consumed in the study area. Dietary and market survey data were then used to define LP model parameters for diet optimisation. All LP analyses were done using the linear program solver (LiPS) version 1.9.4 to generate optimal food formulations. Results: Optimal formulations that achieved DRIs for 20 nutrients for children aged 6–11 months, and for all selected nutrients for children aged 12–23 months, were successfully developed at twice the observed food purchase cost across age groups. The optimal formulations contained a mixture of ingredients, such as wholegrain cereals, Irish potatoes, pulses and seeds, fish and poultry meat, as well as fruits and vegetables, that can be sourced locally. Conclusions: Our findings revealed that, given the available food choices, it is possible to develop optimal formulations that can improve dietary adequacy for rural 6–23-month-old children if the food budget for the child's diet is doubled. These findings suggest the need for alternative interventions that can help households increase access to nutrient-dense foods to fill the identified nutrient gaps. PMID:28814951
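The diet-optimisation step can be illustrated with scipy's linear-programming routine (the study itself used LiPS); the foods, costs and nutrient contents below are placeholders, and only three nutrient constraints are shown rather than the 22 used in the study.

```python
# Sketch of LP diet optimisation: minimise daily cost subject to nutrient
# intakes meeting or exceeding the DRIs. All numbers are placeholders.
import numpy as np
from scipy.optimize import linprog

cost = np.array([0.08, 0.15, 0.30, 0.05])   # cost per 100 g for 4 foods
nutrients = np.array([                       # rows: energy (kcal), protein (g), iron (mg)
    [350, 340, 120, 80],
    [8, 22, 18, 2],
    [2.0, 5.0, 1.2, 0.4],
])
dri = np.array([700, 11, 7])                 # daily targets

# linprog enforces A_ub @ x <= b_ub, so negate to express nutrients @ x >= dri;
# bounds cap each food at 500 g/day.
res = linprog(cost, A_ub=-nutrients, b_ub=-dri, bounds=(0, 5))
print("portions (x100 g):", res.x, "daily cost:", res.fun)
```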
VizieR Online Data Catalog: The UV-bright Quasar Survey (UVQS) DR1 (Monroe+, 2016)
NASA Astrophysics Data System (ADS)
Monroe, T. R.; Prochaska, J. X.; Tejos, N.; Worseck, G.; Hennawi, J. F.; Schmidt, T.; Tumlinson, J.; Shen, Y.
2016-09-01
We have performed an all-sky survey for z~1, FUV-bright quasars selected from GALEX and WISE photometry. We generated a list of 1450 primary candidates (Table 1). In several of the observing runs, conditions were unexpectedly favorable and we exhausted the primary candidates at certain right ascension ranges. To fill the remaining observing time, we generated a secondary candidate list; this secondary set of candidates is provided in Table 2. We proceeded to obtain discovery-quality longslit spectra (i.e., low dispersion, large wavelength coverage, and modest signal-to-noise ratio (S/N)) of our UV-bright Quasar Survey (UVQS) candidates in one calendar year. Our principal facilities were: (i) the dual Kast spectrometer on the 3m Shane telescope at the Lick Observatory; (ii) the Boller & Chivens (BCS) spectrometer on the Irenee du Pont 100-inch telescope at the Las Campanas Observatory; and (iii) the Calar Alto Faint Object Spectrograph on the CAHA 2.2m telescope at the Calar Alto Observatory (CAHA). We acquired an additional ~20 spectra on larger-aperture telescopes (Keck/ESI, MMT/MBC, Magellan/MagE) during twilight or under poor observing conditions. Typical exposure times were limited to <~200 s, with adjustments for fainter sources or sub-optimal observing conditions. Table 3 provides a list of the observed candidates. There are 93 sources with a good-quality spectrum for which we cannot recover a secure redshift; the majority of these have been previously cataloged as blazars (or BL Lac objects). Table 6 lists the sample of these unknowns. (6 data files).
Investigating the Bright End of LSST Photometry
NASA Astrophysics Data System (ADS)
Ojala, Elle; Pepper, Joshua; LSST Collaboration
2018-01-01
The Large Synoptic Survey Telescope (LSST) will begin operations in 2022, conducting a wide-field, synoptic multiband survey of the southern sky. Some fraction of objects at the bright end of the magnitude regime observed by LSST will overlap with other wide-sky surveys, allowing for calibration and cross-checking between surveys. Because LSST is optimized for observations of very faint objects, much of this data overlap will be comprised of saturated images. This project provides the first in-depth analysis of saturation in LSST images. Using the PhoSim package to create simulated LSST images, we evaluate the saturation properties of several types of stars to determine the brightness limitations of LSST. We also collect metadata from many wide-field photometric surveys to provide cross-survey accounting and comparison. Additionally, we evaluate the accuracy of the PhoSim modeling parameters to determine the reliability of the software. These efforts will allow us to determine the expected usable data overlap between bright-end LSST images and faint-end images in other wide-sky surveys. Our next steps are developing methods to extract photometry from saturated images. This material is based upon work supported in part by the National Science Foundation through Cooperative Agreement 1258333 managed by the Association of Universities for Research in Astronomy (AURA), and the Department of Energy under Contract No. DE-AC02-76SF00515 with the SLAC National Accelerator Laboratory. Additional LSST funding comes from private donations, grants to universities, and in-kind support from LSSTC Institutional Members. Thanks to NSF grant PHY-135195 and the 2017 LSSTC Grant Award #2017-UG06 for making this project possible.
NASA Astrophysics Data System (ADS)
Price-Whelan, Adrian M.; Agüeros, Marcel A.; Fournier, Amanda P.; Street, Rachel; Ofek, Eran O.; Covey, Kevin R.; Levitan, David; Laher, Russ R.; Sesar, Branimir; Surace, Jason
2014-01-01
Many photometric time-domain surveys are driven by specific goals, such as searches for supernovae or transiting exoplanets, which set the cadence with which fields are re-imaged. In the case of the Palomar Transient Factory (PTF), several sub-surveys are conducted in parallel, leading to non-uniform sampling over its ~20,000 deg² footprint. While the median 7.26 deg² PTF field has been imaged ~40 times in the R band, ~2300 deg² have been observed >100 times. We use PTF data to study the trade-off between searching for microlensing events in a survey whose footprint is much larger than that of typical microlensing searches, but with far-from-optimal time sampling. To examine the probability that microlensing events can be recovered in these data, we test statistics used on uniformly sampled data to identify variables and transients. We find that the von Neumann ratio performs best for identifying simulated microlensing events in our data. We develop a selection method using this statistic and apply it to data from fields with >10 R-band observations, 1.1 × 10⁹ light curves in total, uncovering three candidate microlensing events. We lack simultaneous, multi-color photometry to confirm these as microlensing events. However, their number is consistent with predictions for the event rate in the PTF footprint over the survey's three years of operations, as estimated from near-field microlensing models. This work can help constrain all-sky event-rate predictions and tests microlensing signal recovery in large data sets, which will be useful to future time-domain surveys, such as that planned with the Large Synoptic Survey Telescope.
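The von Neumann ratio has a standard definition: the mean squared successive difference divided by the sample variance. Values well below the white-noise expectation of ~2 indicate smooth, correlated variability such as a microlensing bump. The paper's selection thresholds are not reproduced here.

```python
# Standard von Neumann ratio for a light curve: eta ~ 2 for uncorrelated
# noise; smooth trends (e.g. a microlensing bump) drive eta well below 2.
import numpy as np

def von_neumann_ratio(mag):
    mag = np.asarray(mag, float)
    mssd = np.sum(np.diff(mag) ** 2) / (len(mag) - 1)  # mean sq. successive diff.
    return mssd / np.var(mag, ddof=1)
```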
The Clustering of Lifestyle Behaviours in New Zealand and their Relationship with Optimal Wellbeing.
Prendergast, Kate B; Mackay, Lisa M; Schofield, Grant M
2016-10-01
The purpose of this research was to determine (1) associations between multiple lifestyle behaviours and optimal wellbeing and (2) the extent to which five lifestyle behaviours-sleep, physical activity, sedentary behaviour, sugary drink consumption, and fruit and vegetable intake-cluster in a national sample. A national sample of New Zealand adults participated in a web-based wellbeing survey. Five lifestyle behaviours-sleep, physical activity, sedentary behaviour, sugary drink consumption, and fruit and vegetable intake-were dichotomised into healthy (meets recommendations) and unhealthy (does not meet recommendations) categories. Optimal wellbeing was calculated using a multi-dimensional flourishing scale, and binary logistic regression analysis was used to calculate the relationship between multiple healthy behaviours and optimal wellbeing. Clustering was examined by comparing the observed and expected prevalence rates (O/E) of healthy and unhealthy two-, three-, four-, and five-behaviour combinations. Data from 9425 participants show those engaging in four to five healthy behaviours (23 %) were 4.7 (95 % confidence interval (CI) 3.8-5.7) times more likely to achieve optimal wellbeing compared to those engaging in zero to one healthy behaviour (21 %). Clustering was observed for healthy (5 %, O/E 2.0, 95 % CI 1.8-2.2) and unhealthy (5 %, O/E 2.1, 95 % CI 1.9-2.3) five-behaviour combinations and for four- and three-behaviour combinations. At the two-behaviour level, healthy fruit and vegetable intake clustered with all behaviours, except sleep which did not cluster with any behaviour. Multiple lifestyle behaviours were positively associated with optimal wellbeing. The results show lifestyle behaviours cluster, providing support for multiple behaviour lifestyle-based interventions for optimising wellbeing.
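The observed/expected (O/E) clustering statistic used above is the joint prevalence of a behaviour combination divided by the product of the marginal prevalences, i.e. the expectation under independence. A sketch with hypothetical column names:

```python
# O/E clustering for the healthy five-behaviour combination: observed joint
# prevalence over the independence expectation. Column names are hypothetical.
import pandas as pd

df = pd.read_csv("wellbeing_survey.csv")  # 0/1 indicator per healthy behaviour
behaviours = ["sleep", "activity", "sedentary", "sugary", "fruitveg"]

observed = (df[behaviours] == 1).all(axis=1).mean()   # joint prevalence
expected = (df[behaviours] == 1).mean().prod()        # product of marginals
print("five-behaviour O/E:", observed / expected)
```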
Fisher, Jason C.
2013-01-01
Long-term groundwater monitoring networks can provide essential information for the planning and management of water resources. Budget constraints in water resource management agencies often mean a reduction in the number of observation wells included in a monitoring network. A network design tool, distributed as an R package, was developed to determine which wells to exclude from a monitoring network because they add little or no beneficial information. A kriging-based genetic algorithm method was used to optimize the monitoring network. The algorithm was used to find the set of wells whose removal leads to the smallest increase in the weighted sum of the (1) mean standard error at all nodes in the kriging grid where the water table is estimated, (2) root-mean-squared-error between the measured and estimated water-level elevation at the removed sites, (3) mean standard deviation of measurements across time at the removed sites, and (4) mean measurement error of wells in the reduced network. The solution to the optimization problem (the best wells to retain in the monitoring network) depends on the total number of wells removed; this number is a management decision. The network design tool was applied to optimize two observation well networks monitoring the water table of the eastern Snake River Plain aquifer, Idaho; these networks include the 2008 Federal-State Cooperative water-level monitoring network (Co-op network) with 166 observation wells, and the 2008 U.S. Geological Survey-Idaho National Laboratory water-level monitoring network (USGS-INL network) with 171 wells. Each water-level monitoring network was optimized five times: by removing (1) 10, (2) 20, (3) 40, (4) 60, and (5) 80 observation wells from the original network. An examination of the trade-offs associated with changes in the number of wells to remove indicates that 20 wells can be removed from the Co-op network with a relatively small degradation of the estimated water table map, and 40 wells can be removed from the USGS-INL network before the water table map degradation accelerates. The optimal network designs indicate the robustness of the network design tool. Observation wells were removed from high well-density areas of the network while retaining the spatial pattern of the existing water-table map.
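A stand-in for the kriging-based genetic algorithm described above: the sketch below optimizes a keep-mask over wells against a weighted four-term objective of the kind listed, with toy component terms and a simple stochastic swap search in place of the R package's kriging and GA machinery.

```python
# Toy network-reduction search: the four objective components are crude
# stand-ins for the kriging-based terms in the published tool.
import numpy as np

rng = np.random.default_rng(2)
levels = rng.normal(1500, 10, size=(166, 24))    # wells x monthly water levels

def objective(keep_mask, w=(1.0, 1.0, 1.0, 1.0)):
    kept, removed = levels[keep_mask], levels[~keep_mask]
    se_grid = kept.std() / np.sqrt(keep_mask.sum())           # toy kriging SE
    rmse_removed = np.abs(removed.mean(axis=1) - kept.mean()).mean()
    sd_removed = removed.std(axis=1).mean()                   # temporal SD
    meas_err = 0.01 * kept.mean()                             # toy meas. error
    return w[0]*se_grid + w[1]*rmse_removed + w[2]*sd_removed + w[3]*meas_err

n_remove = 20                                    # the management decision
mask = np.ones(166, bool); mask[:n_remove] = False; rng.shuffle(mask)
best = mask.copy()
for _ in range(2000):                            # swap-based stochastic search
    trial = best.copy()
    i = rng.choice(np.where(trial)[0]); j = rng.choice(np.where(~trial)[0])
    trial[i], trial[j] = False, True             # keeps n_remove constant
    if objective(trial) < objective(best):
        best = trial
print("wells retained:", best.sum())
```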
Degeneracy in NLP and the development of results motivated by its presence
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fiacco, A.; Liu, J.
We study notions of nondegeneracy and several levels of increasing degeneracy from the perspective of the local behavior of a local solution of a nonlinear program when problem parameters are slightly perturbed. This overview may be viewed as a structured survey of sensitivity and stability results; the focus is on progressive levels of degeneracy. We note connections between nondegeneracy and the convergence of algorithms, and observe the striking parallel between the effects of nondegeneracy and degeneracy on optimality conditions, stability analysis and algorithmic convergence behavior. Although our orientation here is primarily interpretive and noncritical, we conclude that more effort is needed to unify optimality, stability and convergence theory, and that more results are needed in all three areas for radically degenerate problems.
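The standard nondegeneracy package that such sensitivity and stability surveys build on can be stated compactly; these are the textbook conditions, not the paper's specific notation:

```latex
% Textbook nondegeneracy conditions at a local minimizer x* of
%   min f(x)  s.t.  g_i(x) <= 0,  h_j(x) = 0,  with active set A(x*):
\begin{align*}
&\text{(LICQ)} && \{\nabla g_i(x^*) : i \in A(x^*)\} \cup \{\nabla h_j(x^*)\}
                  \ \text{is linearly independent},\\
&\text{(SC)}   && \lambda_i^* > 0 \ \text{for every active constraint } i \in A(x^*),\\
&\text{(SOSC)} && d^\top \nabla^2_{xx} L(x^*, \lambda^*, \mu^*)\, d > 0
                  \ \text{for all } d \neq 0 \text{ in the critical cone}.
\end{align*}
```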
Exploring the Outer Solar System with the ESSENCE Supernova Survey
DOE Office of Scientific and Technical Information (OSTI.GOV)
Becker, A. C.; Arraki, K.
We report the discovery and orbital determination of 14 trans-Neptunian objects (TNOs) from the ESSENCE Supernova Survey difference-imaging data set. Two additional objects discovered in a similar search of the SDSS-II Supernova Survey database were recovered in this effort. ESSENCE repeatedly observed fields far from the solar system ecliptic (-21° < β < -5°), reaching limiting magnitudes per observation of I ≈ 23.1 and R ≈ 23.7. We examine several of the newly detected objects in detail, including 2003 UC414, which orbits entirely between Uranus and Neptune and lies very close to a dynamical region that would make it stable for the lifetime of the solar system. 2003 SS422 and 2007 TA418 have high eccentricities and large perihelia, making them candidate members of an outer class of TNOs. We also report a new member of the "extended" or "detached" scattered disk, 2004 VN112, and verify the stability of its orbit using numerical simulations. This object would have been visible to ESSENCE for only ≈2% of its orbit, suggesting a vast number of similar objects across the sky. We emphasize that off-ecliptic surveys are optimal for uncovering the diversity of such objects, which in turn will constrain the history of gravitational influences that shaped our early solar system.
Using electronic surveys in nursing research.
Cope, Diane G
2014-11-01
Computer and Internet use in businesses and homes in the United States has increased dramatically since the early 1980s. In 2011, 76% of households reported having a computer, compared with only 8% in 1984 (File, 2013). A similar increase in Internet use has also been seen, with 72% of households reporting access to the Internet in 2011 compared with 18% in 1997 (File, 2013). This emerging trend in technology has prompted the use of electronic surveys in the research community as an alternative to telephone and postal surveys. Electronic surveys can offer an efficient, cost-effective method for data collection; however, challenges exist. An awareness of the issues, and of strategies to optimize data collection using web-based surveys, is critical when designing research studies. This column will discuss the different types of electronic surveys, the advantages and disadvantages of using them in nursing research, and methods to optimize the quality and quantity of survey responses.
eGSM: An Extended Sky Model of Diffuse Radio Emission
NASA Astrophysics Data System (ADS)
Kim, Doyeon; Liu, Adrian; Switzer, Eric
2018-01-01
Both cosmic microwave background and 21 cm cosmology observations must contend with astrophysical foreground contaminants in the form of diffuse radio emission. For precise cosmological measurements, these foregrounds must be accurately modeled over the entire sky. Ideally, such full-sky models ought to be primarily motivated by observations. Yet in practice these observations are limited, with data sets that are not only observed in a heterogeneous fashion, but also cover limited frequency ranges. Previously, the Global Sky Model (GSM) took some steps towards solving the problem of incomplete observational data by interpolating over multi-frequency maps using principal component analysis (PCA). In this poster, we present an extended version of the GSM (called the eGSM) that includes the following improvements: 1) better zero-level calibration; 2) incorporation of non-uniform survey resolutions and sky coverage; 3) the ability to quantify uncertainties in sky models; and 4) the ability to optimally select spectral models using Bayesian evidence techniques.
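The PCA interpolation at the heart of the GSM/eGSM approach can be sketched in a few lines: fit components to multi-frequency training maps, then interpolate the component amplitudes in log-frequency to predict a map at an unobserved frequency. The training frequencies and the random stand-in maps below are placeholders.

```python
# Minimal PCA-interpolation sketch of the GSM idea (not the eGSM code):
# decompose training maps into components, interpolate amplitudes vs log-nu.
import numpy as np
from scipy.interpolate import interp1d

freqs = np.array([45., 408., 1420., 23000.])         # MHz, training surveys
maps = np.random.default_rng(3).random((4, 3072))    # stand-in sky maps (npix)

mean = maps.mean(axis=0)
u, s, vt = np.linalg.svd(maps - mean, full_matrices=False)
k = 3                                                # number of components
amps = interp1d(np.log(freqs), u[:, :k] * s[:k], axis=0)

def predict(freq_mhz):
    """Predicted sky map at an unobserved frequency within the training range."""
    return mean + amps(np.log(freq_mhz)) @ vt[:k]

print(predict(900.).shape)                           # one map of 3072 pixels
```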
THE LUPUS TRANSIT SURVEY FOR HOT JUPITERS: RESULTS AND LESSONS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bayliss, Daniel D. R.; Sackett, Penny D.; Weldrake, David T. F.
2009-05-15
We present the results of a deep, wide-field transit survey targeting 'Hot Jupiter' planets in the Lupus region of the Galactic plane, conducted over 53 nights concentrated in two epochs separated by a year. Using the Australian National University 40-inch telescope at Siding Spring Observatory (SSO), the survey covered a 0.66 deg² region close to the Galactic plane (b = 11°) and monitored a total of 110,372 stars (15.0 ≤ V ≤ 22.0). Using difference imaging photometry, 16,134 light curves with a photometric precision of σ < 0.025 mag were obtained. These light curves were searched for transits, and four candidates were detected that displayed low-amplitude variability consistent with a transiting giant planet. Further investigations, including spectral typing and radial velocity measurements for some candidates, revealed that of the four, one is a true planetary companion (Lupus-TR-3), two are blended systems (Lupus-TR-1 and 4), and one is a binary (Lupus-TR-2). The results of this successful survey are instructive for optimizing the observational strategy and follow-up procedure for deep searches for transiting planets, including an upcoming survey using the SkyMapper telescope at SSO.
Optimizing Methods of Obtaining Stellar Parameters for the H3 Survey
NASA Astrophysics Data System (ADS)
Ivory, KeShawn; Conroy, Charlie; Cargile, Phillip
2018-01-01
The Stellar Halo at High Resolution with Hectochelle Survey (H3) is in the process of observing and collecting stellar parameters for stars in the Milky Way's halo. With a goal of measuring radial velocities for fainter stars, it is crucial that we have optimal methods of obtaining this and other parameters from the data for these stars. The method currently in use is The Payne, named after Cecilia Payne-Gaposchkin: a code that uses neural networks and Markov chain Monte Carlo methods to fit both spectra and photometry for stellar parameters. This project investigated the benefit of fitting both spectra and spectral energy distributions (SEDs). Mock spectra using the parameters of the Sun were created, and noise was inserted at various signal-to-noise values. The Payne then fit each mock spectrum with and without a mock SED, also generated from solar parameters. The result was that at high signal-to-noise, the spectrum dominated and the effect of fitting the SED was minimal. But at low signal-to-noise, the addition of the SED greatly decreased the scatter of the recovered parameters and resulted in more accurate values for temperature and metallicity.
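The qualitative behavior described above falls out of a joint Gaussian likelihood in which the spectral and photometric chi-squares simply add: when the spectral errors grow, the SED term dominates the posterior. A minimal sketch follows; the parameter vector, model callables, and error arrays are all assumptions for illustration, not The Payne's actual interfaces.

```python
import numpy as np

def joint_log_likelihood(theta, spec_obs, spec_err, sed_obs, sed_err,
                         spec_model, sed_model):
    """Gaussian log-likelihood combining a spectrum and an SED.

    theta      : stellar parameters, e.g. (Teff, logg, [Fe/H]).
    spec_model : callable returning a model spectrum for theta.
    sed_model  : callable returning model photometry for theta.
    """
    chi2_spec = np.sum(((spec_obs - spec_model(theta)) / spec_err) ** 2)
    chi2_sed = np.sum(((sed_obs - sed_model(theta)) / sed_err) ** 2)
    # At high spectral S/N the first term dominates; at low S/N the
    # SED term pulls the fit toward the photometric constraints.
    return -0.5 * (chi2_spec + chi2_sed)
```

An MCMC sampler would then explore theta using this function as the (log) posterior, up to priors.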
Antecedent and Consequence of School Academic Optimism and Teachers' Academic Optimism Model
ERIC Educational Resources Information Center
Hong, Fu-Yuan
2017-01-01
The main purpose of this research was to examine the relationships among school principals' transformational leadership, school academic optimism, teachers' academic optimism and teachers' professional commitment. This study conducted a questionnaire survey on 367 teachers from 20 high schools in Taiwan by random sampling, using principals'…
Taber, Jennifer M.; Klein, William M. P.; Ferrer, Rebecca A.; Kent, Erin E.; Harris, Peter R.
2016-01-01
Background: Optimism and self-affirmation promote adaptive coping, goal achievement, and better health. Purpose: To examine the associations of optimism and spontaneous self-affirmation (SSA) with physical, mental, and cognitive health and information seeking among cancer survivors. Methods: Cancer survivors (n=326) completed the Health Information National Trends Survey 2013, a national survey of U.S. adults. Participants reported optimism, SSA, cognitive and physical impairment, affect, health status, and information seeking. Results: Participants higher in optimism reported better health on nearly all indices examined, even when controlling for SSA. Participants higher in SSA reported lower likelihood of cognitive impairment, greater happiness and hopefulness, and greater likelihood of cancer information seeking. SSA remained significantly associated with greater hopefulness and cancer information seeking when controlling for optimism. Conclusions: Optimism and SSA may be associated with beneficial health-related outcomes among cancer survivors. Given the demonstrated malleability of self-affirmation, these findings represent important avenues for future research. PMID:26497697
Sparsely sampling the sky: Regular vs. random sampling
NASA Astrophysics Data System (ADS)
Paykari, P.; Pires, S.; Starck, J.-L.; Jaffe, A. H.
2015-09-01
Aims: The next generation of galaxy surveys, aiming to observe millions of galaxies, will be expensive both in time and money. This raises questions regarding the optimal investment of this time and money for future surveys. In previous work, we showed that a sparse sampling strategy could be a powerful substitute for the usually favoured contiguous observation of the sky. In our previous paper, regular sparse sampling was investigated, where the sparsely observed patches were regularly distributed on the sky. The regularity of the mask introduces a periodic pattern in the window function, which induces periodic correlations at specific scales. Methods: In this paper, we use Bayesian experimental design to investigate a "random" sparse sampling approach, where the observed patches are randomly distributed over the total sparsely sampled area. Results: We find that in this setting, the induced correlation is evenly distributed amongst all scales, as there is no preferred scale in the window function. Conclusions: This is desirable when we are interested in any specific scale in the galaxy power spectrum, such as the matter-radiation equality scale. As the figure of merit shows, however, there is no preference between regular and random sampling for constraining the overall galaxy power spectrum and the cosmological parameters.
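A toy one-dimensional experiment makes the window-function argument concrete: a regularly spaced mask concentrates its power at harmonics of the grid spacing, while a random mask of equal observed area spreads it across all scales. The sketch below uses entirely illustrative sizes and assumes nothing from the paper beyond this qualitative point.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1024                  # 1-D sky strip, for illustration
n_patches, patch = 16, 8  # number of observed patches and their width

def window_power(mask):
    """Power spectrum of the survey window (mask) function."""
    w = np.fft.rfft(mask - mask.mean())
    return np.abs(w) ** 2

# Regular sampling: patches on a fixed grid -> periodic window.
regular = np.zeros(n)
for start in np.arange(0, n, n // n_patches):
    regular[start:start + patch] = 1.0

# Random sampling: same area, patches at random positions.
random_mask = np.zeros(n)
for start in rng.choice(n - patch, size=n_patches, replace=False):
    random_mask[start:start + patch] = 1.0

# The regular mask shows sharp harmonic spikes (preferred scales),
# while the random mask spreads the window power over all scales.
print(window_power(regular).max() / window_power(random_mask).max())
```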
NASA Astrophysics Data System (ADS)
Frith, J.; Barker, E.; Cowardin, H.; Buckalew, B.; Anz-Meador, P.; Lederer, S.
The National Aeronautics and Space Administration (NASA) Orbital Debris Program Office (ODPO) recently commissioned the Meter Class Autonomous Telescope (MCAT) on Ascension Island with the primary goal of obtaining population statistics of the geosynchronous (GEO) orbital debris environment. To help facilitate this, studies have been conducted using MCAT’s known and projected capabilities to estimate the accuracy and timeliness with which it can survey the GEO environment, incorporating collected weather data and the proposed observational data collection cadence. To optimize observing cadences and probability of detection, ongoing work runs a simulated GEO debris population, sampled at various cadences, through the Constrained Admissible Region Multi Hypotheses Filter (CAR-MHF). The orbits computed from the results are then compared to the simulated data to assess MCAT’s ability to accurately determine the orbits of debris at various sample rates. The goal of this work is to discriminate GEO and near-GEO objects from GEO transfer orbit objects, which can appear as GEO objects in the environmental models due to short observation arcs and an assumed circular orbit. The specific methods and results are presented here.
Source finding in linear polarization for LOFAR, and SKA predecessor surveys, using Faraday moments
NASA Astrophysics Data System (ADS)
Farnes, J. S.; Heald, G.; Junklewitz, H.; Mulcahy, D. D.; Haverkorn, M.; Van Eck, C. L.; Riseley, C. J.; Brentjens, M.; Horellou, C.; Vacca, V.; Jones, D. I.; Horneffer, A.; Paladino, R.
2018-03-01
The optimal source-finding strategy for linear polarization data is an unsolved problem, with many inhibitive factors imposed by the technically challenging nature of polarization observations. Such an algorithm is essential for Square Kilometre Array (SKA) pathfinder surveys, such as the Multifrequency Snapshot Sky Survey with the LOw Frequency ARray (LOFAR), as data volumes are significant enough to prohibit manual inspection. We present a new strategy of `Faraday Moments' for source-finding in linear polarization with LOFAR, using the moments of the frequency-dependent full-Stokes data (i.e. the mean, standard deviation, skewness, and excess kurtosis). Through simulations of the sky, we find that moments can identify polarized sources with a high completeness: 98.5 per cent at a signal-to-noise ratio of 5. While the method has low reliability, rotation measure (RM) synthesis can be applied per candidate source to filter out instrumental and spurious detections. This combined strategy will result in a complete and reliable catalogue of polarized sources that includes the full sensitivity of the observational bandwidth. We find that the technique can reduce the number of pixels on which RM synthesis needs to be performed by a factor of ≈1 × 10^5 for source distributions anticipated with modern radio telescopes. Through tests on LOFAR data, we find that the technique works effectively in the presence of diffuse emission. Extensions of this method are directly applicable to other upcoming radio surveys such as the POlarization Sky Survey of the Universe's Magnetism with the Australia Square Kilometre Array Pathfinder, and the SKA itself.
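As a rough sketch of the moment statistics involved, the function below computes the four named moments of a single pixel's Stokes spectrum with scipy; array shapes and the flagging step are assumptions, and the real pipeline applies this per Stokes parameter across full image cubes.

```python
import numpy as np
from scipy import stats

def faraday_moments(stokes):
    """First four moments of a frequency-dependent Stokes spectrum.

    stokes : (n_channels,) array of Stokes Q (or U) values for one
             pixel across the observing band.
    Polarized sources imprint channel-to-channel structure that
    shifts these moments away from the noise-only expectation.
    """
    return (np.mean(stokes),
            np.std(stokes, ddof=1),
            stats.skew(stokes),
            stats.kurtosis(stokes))   # Fisher convention: excess kurtosis

# Pixels whose moments exceed noise-derived thresholds would then be
# passed to RM synthesis to reject instrumental and spurious detections.
```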
Comparative risk assessment and cessation information seeking among smokeless tobacco users.
Jun, Jungmi; Nan, Xiaoli
2018-05-01
This research examined (1) smokeless tobacco users' comparative optimism in assessing the health and addiction risks of their own product in comparison with cigarettes, and (2) the effects of comparative optimism on cessation information-seeking. A nationally representative sample from the 2015 Health Information National Trends Survey (HINTS)-FDA was employed. The analyses revealed the presence of comparative optimism in assessing both health and addiction risks among smokeless tobacco users. Comparative optimism was negatively correlated with most cessation information-seeking variables. Health bias (the health risk rating gap between the subject's own tobacco product and cigarettes) was associated with decreased intent to use cessation support. However, health bias and addiction bias (the addiction risk rating gap between the subject's own tobacco product and cigarettes) were not consistent predictors of all cessation information-seeking variables when covariates of socio-demographics and tobacco use status were included. In addition, positive correlations between health bias and past/recent cessation-information searches were observed. Optimistic biases may negatively influence cessation behaviors not only directly but also indirectly by influencing an important moderator, cessation information-seeking. Future interventions should prioritize dispelling the comparative optimism in perceiving risks of smokeless tobacco use, as well as provide more reliable cessation information specific to smokeless tobacco users.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lazkoz, Ruth; Escamilla-Rivera, Celia; Salzano, Vincenzo
Cosmography provides a model-independent way to map the expansion history of the Universe. In this paper we simulate a Euclid-like survey and explore cosmographic constraints from future Baryonic Acoustic Oscillations (BAO) observations. We derive general expressions for the BAO transverse and radial modes and discuss the optimal order of the cosmographic expansion that provides reliable cosmological constraints. Through constraints on the deceleration and jerk parameters, we show that future BAO data have the potential to provide a model-independent check of the cosmic acceleration as well as a discrimination between the standard ΛCDM model and alternative mechanisms of cosmic acceleration.
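For orientation, the cosmographic expansion referred to above is a Taylor series of the distance-redshift relation in which the deceleration parameter q_0 and the jerk j_0 appear order by order; a standard form for the luminosity distance in a spatially flat universe is:

```latex
d_L(z) = \frac{c\,z}{H_0}\left[ 1
       + \frac{1}{2}\,(1 - q_0)\,z
       - \frac{1}{6}\left(1 - q_0 - 3q_0^2 + j_0\right) z^2
       + \mathcal{O}(z^3) \right]
```

Truncating the series at too low an order biases the inferred parameters, while going too high inflates their uncertainties, which is why the optimal expansion order matters for the BAO forecasts discussed in the abstract.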
NASA Technical Reports Server (NTRS)
Gillett, Frederick; Houck, James; Bally, John; Becklin, Eric; Brown, Robert Hamilton; Draine, Bruce; Frogel, Jay; Gatley, Ian; Gehrz, Robert; Hildebrand, Roger
1991-01-01
The decade of the 1990s presents an opportunity to address fundamental astrophysical issues through observations at IR wavelengths made possible by technological and scientific advances during the last decade. The major elements of the recommended program are: the Space Infrared Telescope Facility (SIRTF), the Stratospheric Observatory For Infrared Astronomy (SOFIA), the IR-Optimized 8-m Telescope (IRO), a detector and instrumentation program, the Submillimeter Mission (SMMM), the Two Micron All Sky Survey (2MASS), a sound infrastructure, and technology development programs. Also presented are: perspective, science opportunities, a technical overview, project recommendations, future directions, and infrastructure.
Vetter, Thomas R; Barman, Joydip; Boudreaux, Arthur M; Jones, Keith A
2016-03-22
Persistently variable success has been experienced in locally translating even well-grounded national clinical practice guidelines, including in the perioperative setting. We have sought greater applicability and acceptance of clinical practice guidelines and protocols with our novel Perioperative Risk Optimization and Management Planning Tool (PROMPT™). This study was undertaken to survey our institutional perioperative clinicians regarding (a) their qualitative recommendations for, and (b) their quantitative perceptions of, the relative importance of a series of clinical issues and patient medical conditions as potential topics for creating a PROMPT™. We applied a mixed methods research design that involved collecting, analyzing, and "mixing" both qualitative and quantitative methods and data in a single study to answer a research question. Survey One was qualitative in nature and asked the study participants to list, as free text, up to 12 patient medical conditions or clinical issues that they perceived to be high-priority topics for development of a PROMPT™. Survey Two was quantitative in nature and asked the study participants to rate each of 57 specific, pre-selected clinical issues and patient medical conditions on an 11-point Likert scale of perceived importance as a potential topic for a PROMPT™. The two electronic, online surveys were completed by participants recruited from the faculty in our Department of Anesthesiology and Perioperative Medicine and Department of Surgery, and from the cohort of hospital-employed certified registered nurse anesthetists. A total of 57 possible topics for a PROMPT™ was created and prioritized by our stakeholders. A strong correlation (r = 0.82, 95% CI: 0.71, 0.89, P < 0.001) was observed between the quantitative survey rating scores reported by the anesthesiologists/certified registered nurse anesthetists and those reported by the surgeons. The quantitative survey displayed strong inter-rater reliability (ICC = 0.92, P < 0.001). Our qualitative clinician stakeholder survey generated a comprehensive roster of clinical issues and patient medical conditions. Our subsequent quantitative clinician stakeholder survey indicated that there is generally strong agreement among anesthesiologists/certified registered nurse anesthetists and surgeons about the relative importance of these clinical issues and patient medical conditions as potential topics for perioperative optimization and risk management.
Real-time localization of mobile device by filtering method for sensor fusion
NASA Astrophysics Data System (ADS)
Fuse, Takashi; Nagara, Keita
2017-06-01
Most applications on mobile devices require self-localization of the device. Since GPS cannot be used in indoor environments, the positions of mobile devices are usually estimated autonomously using an IMU. Because the IMU has low accuracy, self-localization in indoor environments remains challenging. Self-localization methods using images have also been developed, and their accuracy is increasing. This paper develops a self-localization method for indoor environments without GPS by simultaneously integrating the sensors on mobile devices, such as the IMU and cameras. The proposed method consists of observation, forecasting, and filtering steps. The position and velocity of the mobile device are defined as a state vector. In the self-localization, the observations correspond to the data from the IMU and camera (observation vector), forecasting to the mobile device motion model (system model), and filtering to a tracking method based on inertial surveying, the coplanarity condition, and an inverse depth model (observation model). Positions of a tracked mobile device are first estimated with the system model (forecasting step), which assumes linear motion. The estimated positions are then optimized against the new observation data based on their likelihood (filtering step). The optimization in the filtering step corresponds to maximum a posteriori estimation. A particle filter is utilized to carry out the forecasting and filtering steps. The proposed method is applied to data acquired by mobile devices in an indoor environment. Through the experiments, the high performance of the method is confirmed.
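The forecast/filter cycle described above maps naturally onto a particle filter. The sketch below is a simplified 2-D position-velocity version under assumed Gaussian noise, and is illustrative only: the state layout, noise levels, and the camera measurement model are placeholders for the paper's inertial-plus-coplanarity formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, weights, imu_accel, dt,
                         camera_meas, meas_std, accel_std=0.5):
    """One forecast/filter cycle for 2-D position-velocity particles.

    particles   : (N, 4) array of state vectors [x, y, vx, vy].
    imu_accel   : (2,) IMU acceleration used in the motion (system) model.
    camera_meas : (2,) position fix derived from image measurements.
    """
    # Forecast: propagate each particle with the linear motion model
    # plus process noise reflecting the low accuracy of the IMU.
    noise = rng.normal(0.0, accel_std, size=(len(particles), 2))
    particles[:, 2:] += (imu_accel + noise) * dt       # update velocity
    particles[:, :2] += particles[:, 2:] * dt          # update position

    # Filtering: reweight particles by the likelihood of the camera
    # observation; the weighted ensemble approximates the posterior.
    d2 = np.sum((particles[:, :2] - camera_meas) ** 2, axis=1)
    weights *= np.exp(-0.5 * d2 / meas_std ** 2)
    weights /= weights.sum()

    # Resample when the effective sample size collapses.
    if 1.0 / np.sum(weights ** 2) < len(particles) / 2:
        idx = rng.choice(len(particles), size=len(particles), p=weights)
        particles = particles[idx]
        weights = np.full(len(particles), 1.0 / len(particles))
    return particles, weights
```

The weighted mean of the particles after each cycle gives the position estimate corresponding to the maximum a posteriori optimization described in the abstract.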
Blinded evaluation of the effects of hyaluronic acid filler injections on first impressions.
Dayan, Steven H; Arkins, John P; Gal, Thomas J
2010-11-01
Facial appearance has a profound influence on the first impression that is projected to others. To determine the effects that complete correction of the nasolabial folds (NLFs) with hyaluronic acid (HA) filler has on the first impression one makes, twenty-two subjects received injections of HA filler into the NLFs. Photographs of the face in a relaxed pose were taken at baseline, at the optimal correction visit, and 4 weeks after optimal correction. Three hundred four blinded evaluators completed a survey rating first impression on various measures of success for each photo. In total, 5,776 first impressions were recorded, totaling 46,208 individual assessments of first impression. Our findings indicate a significant improvement in mean first impression in the categories of dating success, attractiveness, financial success, relationship success, athletic success, and overall first impression at the optimal correction visit. At 4 weeks after the optimal correction visit, significance was observed in all categories measured: social skills, academic performance, dating success, occupational success, attractiveness, financial success, relationship success, athletic success, and overall first impression. Full correction of the NLFs with HA filler significantly and positively influences the first impression an individual projects.
Levin, Adeera; Soroka, Steven; Allu, Selina; Au, Flora; Gil, Sarah; Manns, Braden
2014-01-01
The goals of care for patients with chronic kidney disease (CKD) are to delay progression to end-stage renal disease, reduce complications, and ensure timely transition to dialysis or transplantation, while optimizing independence. Recent guidelines recommend that multidisciplinary team-based care should be available to patients with CKD. While most provinces fund CKD care, the specific models by which these outcomes are achieved are not known. Funding for clinics is hospital or program based. Objective: to describe the structure and function of clinics in order to understand the current models of care, inform best practice, and potentially standardize models of care. Design: prospective cross-sectional observational survey study of Canadian nephrology programs in all provinces. Using an open-ended semi-structured questionnaire, we surveyed 71 of 84 multidisciplinary adult CKD clinics across Canada, by telephone and with written semi-structured questionnaires (June 2012 to November 2013). Standardized introductory scripts were used, in both English and French. CKD clinic structure and models of care vary significantly across Canada. Large variation exists in staffing ratios (nephrologists, dietitians, pharmacists, and nurses to patients) and in referral criteria. Dialysis initiation decisions were usually made by physicians. The majority of clinics (57%) had a consistent model of care (the same nephrologist and nurse per patient), while others had patients seeing a different nephrologist and nurse at each clinic visit. Targets for various modality choices varied, as did access to those modalities. No patient or provider educational tools describing the optimal time to start dialysis exist in any of the clinics. The surveys rely on self-reporting without validation from independent sources, and there was limited involvement of Quebec clinics; these are relative limitations and do not affect the main results. The variability in clinic structure and function offers an opportunity to explore the relationship of these elements to patient outcomes, and to determine optimal models of care. The list of contacts generated through this study serves as a basis for establishing a CKD clinic network, which is anticipated to facilitate the conduct of clinical trials to test novel interventions or strategies within the context of well-characterized models of care.
NASA Astrophysics Data System (ADS)
Higdon, S. J. U.; Weedman, D.; Higdon, J. L.; Houck, J. R.; Soifer, B. T.; Armus, L.; Charmandaris, V.; Herter, T. L.; Brandl, B. R.; Brown, M. J. I.; Dey, A.; Jannuzi, B.; Le Floc'h, E.; Rieke, M.
2004-12-01
We have surveyed a field covering 8.4 deg² within the NOAO Deep Wide-Field Survey region in Boötes with the Multiband Imaging Photometer on the Spitzer Space Telescope to a limiting 24 μm flux density of 0.3 mJy, identifying ~22,000 point sources. Thirty-one sources from this survey with F(24 μm) > 0.75 mJy, which are optically ``invisible'' (R > 26) or very faint (I > 24), have been observed with the low-resolution modules of the Infrared Spectrograph (IRS) on the SST. The spectra were extracted using the IRS SMART spectral analysis package in order to optimize their signal-to-noise ratio. A suite of mid-IR spectral templates of well-known galaxies, observed as part of the IRS GTO program, is used to perform formal fits to the spectral energy distributions of the Boötes sources. These fits enable us to measure their redshifts, to calculate the depth of the 9.7 μm silicate feature along with the strength of the 7.7 μm PAH feature, and to estimate their bolometric luminosities. We compare the mid-IR slope, the measured PAH luminosity, and the optical depth of these sources with those of galaxies in the local Universe. As a result we are able to estimate the contribution of a dust-enshrouded active nucleus to the mid-IR and bolometric luminosity of these systems. This work is based [in part] on observations made with the Spitzer Space Telescope, which is operated by the Jet Propulsion Laboratory, California Institute of Technology under NASA contract 1407. Support for this work was provided by NASA through Contract Number 1257184 issued by JPL/Caltech.
Optimization of a hydrometric network extension using specific flow, kriging and simulated annealing
NASA Astrophysics Data System (ADS)
Chebbi, Afef; Kebaili Bargaoui, Zoubeida; Abid, Nesrine; da Conceição Cunha, Maria
2017-12-01
In hydrometric stations, water levels are continuously observed and discharge rating curves are constantly updated to achieve accurate river level and discharge observations. An adequate spatial distribution of hydrological gauging stations is of considerable interest for characterizing the river regime, designing water infrastructure, managing water resources, and conducting ecological surveys. Due to the growth of riverside populations and the associated flood risk, hydrological networks constantly need to be developed. This paper suggests taking advantage of kriging approaches to improve the design of a hydrometric network. The context deals with the application of an optimization approach using ordinary kriging and simulated annealing (SA) to identify the best locations for new hydrometric gauges. The task at hand is to extend an existing hydrometric network in order to estimate, at ungauged sites, the average specific annual discharge, which is a key basin descriptor. This methodology is developed for the hydrometric network of the transboundary Medjerda River in northern Tunisia. A Geographic Information System (GIS) is adopted to delineate basin limits and centroids; the latter are used to assign the location of basins in the kriging development. Scenarios where the size of an existing 12-station network is alternatively increased by 1, 2, 3, 4, or 5 new station(s) are investigated using geo-regression and minimization of the variance of kriging errors. The analysis of the optimized locations from one scenario to another shows perfect conformity with respect to the location of the new sites. The new locations ensure a better spatial coverage of the study area, as seen in the increase of both the average and the maximum of inter-station distances after optimization. The optimization procedure selects the basins that ensure shifting the mean drainage area towards higher specific discharges.
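The combinatorial search described above can be sketched with a generic simulated-annealing loop: propose swapping one selected gauge site for an unused candidate, score the network by its mean ordinary-kriging error variance, and accept worse networks with a temperature-controlled probability. Everything here (the function names, the cooling schedule, and the kriging_variance callable) is an illustrative assumption, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

def anneal_new_stations(candidates, kriging_variance, n_new,
                        n_iter=5000, t0=1.0, cooling=0.999):
    """Pick n_new gauge sites from candidate basins by simulated annealing.

    candidates       : (M, 2) basin-centroid coordinates.
    kriging_variance : callable taking an index set and returning the
                       mean kriging error variance of the extended network.
    """
    current = list(rng.choice(len(candidates), size=n_new, replace=False))
    best, best_cost = current[:], kriging_variance(current)
    cost, temp = best_cost, t0

    for _ in range(n_iter):
        # Propose swapping one selected site for an unused candidate.
        trial = current[:]
        out = rng.integers(n_new)
        pool = [i for i in range(len(candidates)) if i not in trial]
        trial[out] = int(rng.choice(pool))

        trial_cost = kriging_variance(trial)
        # Accept improvements always, uphill moves with Boltzmann probability.
        if trial_cost < cost or rng.random() < np.exp((cost - trial_cost) / temp):
            current, cost = trial, trial_cost
            if cost < best_cost:
                best, best_cost = current[:], cost
        temp *= cooling
    return best, best_cost
```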
Perceived Parenting Styles on College Students' Optimism
ERIC Educational Resources Information Center
Baldwin, Debora R.; McIntyre, Anne; Hardaway, Elizabeth
2007-01-01
The purpose of this study was to examine the relationship between perceived parenting styles and levels of optimism in undergraduate college students. Sixty-three participants were administered surveys measuring dispositional optimism and perceived parental Authoritative and Authoritarian styles. Multiple regression analysis revealed that both…
Adaptive Critic Nonlinear Robust Control: A Survey.
Wang, Ding; He, Haibo; Liu, Derong
2017-10-01
Adaptive dynamic programming (ADP) and reinforcement learning are closely related approaches to intelligent optimization. Both are regarded as promising methods built on the key components of evaluation and improvement, against the background of information technology such as artificial intelligence, big data, and deep learning. Although great progress has been achieved, and surveyed, in addressing nonlinear optimal control problems, research on the robustness of ADP-based control strategies under uncertain environments has not been fully summarized. Hence, this survey reviews the recent main results of adaptive-critic-based robust control design for continuous-time nonlinear systems. ADP-based nonlinear optimal regulation is reviewed, followed by robust stabilization of nonlinear systems with matched uncertainties, guaranteed cost control design of unmatched plants, and decentralized stabilization of interconnected systems. Additionally, further comprehensive discussions are presented, including event-based robust control design, improvement of the critic learning rule, nonlinear H∞ control design, and several notes on future perspectives. By applying the ADP-based optimal and robust control methods to a practical power system and an overhead crane plant, two typical examples are provided to verify the effectiveness of the theoretical results. Overall, this survey should help promote the development of adaptive critic control methods with robustness guarantees and the construction of higher-level intelligent systems.
NASA Astrophysics Data System (ADS)
Louie, Dana; Albert, Loic; Deming, Drake
2017-01-01
The 2018 launch of the James Webb Space Telescope (JWST), coupled with the 2017 launch of the Transiting Exoplanet Survey Satellite (TESS), heralds a new era in exoplanet science, with TESS projected to detect over one thousand transiting sub-Neptune-sized planets (Ricker et al. 2014) and JWST offering unprecedented spectroscopic capabilities. Sullivan et al. (2015) used Monte Carlo simulations to predict the properties of the planets that TESS is likely to detect, and published a catalog of 962 simulated TESS planets. Prior to TESS launch, the re-scoped Kepler K2 mission and ground-based surveys such as MEarth continue to seek nearby Earth-like exoplanets orbiting M-dwarf host stars. The exoplanet community will undoubtedly employ JWST for atmospheric characterization follow-up studies of promising exoplanets, but the targeted planets for these studies must be chosen wisely to maximize JWST science return. The goal of this project is to estimate the capabilities of JWST’s Near InfraRed Imager and Slitless Spectrograph (NIRISS)—operating with the GR700XD grism in Single Object Slitless Spectroscopy (SOSS) mode—during observations of exoplanets transiting their host stars. We compare results obtained for the simulated TESS planets, confirmed K2-discovered super-Earths, and exoplanets discovered using ground-based surveys. By determining the target planet characteristics that result in the most favorable JWST observing conditions, we can optimize the choice of target planets in future JWST follow-on atmospheric characterization studies.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-06
... a nonuse valuation survey of the U.S. public. A key aspect of the survey design process is to use... use the results of these information collection activities to optimize the design of the survey... information to address several key questions relating to the survey and, in particular, the conjoint design...
Hernandez, Wilmar
2007-01-01
In this paper a survey of recent applications of optimal signal processing techniques to improve the performance of mechanical sensors is made. A comparison between classical filters and optimal filters for automotive sensors is presented, and the current state of the art of applying robust and optimal control and signal processing techniques to the design of the intelligent (or smart) sensors that today's cars need is illustrated through several experimental results, which show that the fusion of intelligent sensors and optimal signal processing techniques is the clear way forward. However, the switch between traditional methods of designing automotive sensors and the new ones cannot be made overnight, because there are open research issues that have to be solved. This paper draws attention to one of these open research issues and tries to arouse researchers' interest in the fusion of intelligent sensors and optimal signal processing techniques.
ERIC Educational Resources Information Center
Hamby, Tyler; Taylor, Wyn
2016-01-01
This study examined the predictors and psychometric outcomes of survey satisficing, wherein respondents provide quick, "good enough" answers (satisficing) rather than carefully considered answers (optimizing). We administered surveys to university students and respondents--half of whom held college degrees--from a for-pay survey website,…
Initial Results of a Survey of Earth's L4 Point for Possible Earth Trojan Asteroids
NASA Astrophysics Data System (ADS)
Connors, M.; Veillet, C.; Wiegert, P.; Innanen, K.; Mikkola, S.
2000-10-01
Using the Canada-France-Hawaii 3.6 m telescope and the new CFH12k wide-field CCD imager, a survey of the region near Earth's L4 (morning) Lagrange point was conducted in May and July/August 2000, in hopes of finding asteroids at or near this point. This survey was motivated by the dynamical interest of a possible Earth Trojan asteroid (ETA) population and by the fact that such bodies would be the easiest asteroids to access from Earth. Recent calculations (Wiegert, Innanen and Mikkola, 2000, Icarus v. 145, 33-43) indicate stability of objects in ETA orbits over a million-year timescale and that their on-sky density would be greatest roughly five degrees sunward of the L4 position. An optimized search technique was used, with tracking at the anticipated rate of the target bodies, near real-time scanning of images, and duplication of fields to aid in detection and permit follow-up. Limited time is available on any given night to search near the Lagrange points, and operations must be conducted at large air mass. Approximately 9 square degrees were efficiently searched and two interesting asteroids were found, NEA 2000 PM8 and our provisionally named CFZ001. CFZ001 cannot be excluded from being an Earth Trojan, although that is not the optimal solution for the short arc we observed. This object, of R magnitude 22, was easily detected, suggesting that our search technique worked well. This survey supports the earlier conclusion of Whiteley and Tholen (1998, Icarus v. 136, 154-167) that a large population of several-hundred-meter diameter ETAs does not exist. However, our effective search technique and the discovery of two interesting asteroids suggest the value of completing the survey, with approximately 10 more square degrees to be searched near L4 and a comparable search to be done at L5. Funding from Canada's NSERC and HIA and the Academic Research Fund of Athabasca University is gratefully acknowledged.
Calibration of HST wide field camera for quantitative analysis of faint galaxy images
NASA Technical Reports Server (NTRS)
Ratnatunga, Kavan U.; Griffiths, Richard E.; Casertano, Stefano; Neuschaefer, Lyman W.; Wyckoff, Eric W.
1994-01-01
We present the methods adopted to optimize the calibration of images obtained with the Hubble Space Telescope (HST) Wide Field Camera (WFC) (1991-1993). Our main goal is to improve quantitative measurement of faint images, with special emphasis on the faint (I ≈ 20-24 mag) stars and galaxies observed as part of the Medium-Deep Survey. Several modifications to the standard calibration procedures have been introduced, including improved bias and dark images, and a new supersky flatfield obtained by combining a large number of relatively object-free Medium-Deep Survey exposures of random fields. The supersky flat has a pixel-to-pixel rms error of about 2.0% in F555W and of 2.4% in F785LP; large-scale variations are smaller than 1% rms. Overall, our modifications improve the quality of faint images with respect to the standard calibration by about a factor of five in photometric accuracy and about 0.3 mag in sensitivity, corresponding to about a factor of two in observing time. The relevant calibration images have been made available to the scientific community.
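The supersky-flat idea, stacking many nearly object-free exposures and median-combining them after clipping residual sources, can be sketched in a few lines. The normalization and sigma-clipping loop below are illustrative assumptions, not the calibration pipeline actually used for the WFC.

```python
import numpy as np

def supersky_flat(exposures, clip_sigma=3.0):
    """Build a flatfield from many object-free survey exposures.

    exposures : (n_exp, ny, nx) stack of bias- and dark-corrected
                frames of random fields.
    Faint stars and galaxies are rejected by sigma-clipping before
    the final per-pixel median combine.
    """
    # Normalize each exposure by its median sky level.
    stack = exposures / np.median(exposures, axis=(1, 2), keepdims=True)

    # Iteratively mask pixels that deviate from the per-pixel median.
    masked = np.ma.masked_invalid(stack)
    for _ in range(3):
        med = np.ma.median(masked, axis=0)
        std = masked.std(axis=0)
        masked = np.ma.masked_where(
            np.abs(masked - med[None]) > clip_sigma * std[None], masked)

    flat = np.ma.median(masked, axis=0).filled(1.0)
    return flat / np.median(flat)   # unit-mean flatfield
```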
NASA Astrophysics Data System (ADS)
Jeffers, S. V.; Schöfer, P.; Lamert, A.; Reiners, A.; Montes, D.; Caballero, J. A.; Cortés-Contreras, M.; Marvin, C. J.; Passegger, V. M.; Zechmeister, M.; Quirrenbach, A.; Alonso-Floriano, F. J.; Amado, P. J.; Bauer, F. F.; Casal, E.; Alonso, E. Diez; Herrero, E.; Morales, J. C.; Mundt, R.; Ribas, I.; Sarmiento, L. F.
2018-06-01
CARMENES is a spectrograph for radial velocity surveys of M dwarfs with the aim of detecting Earth-mass planets orbiting in the habitable zones of their host stars. To ensure an optimal use of the CARMENES guaranteed time observations, in this paper we investigate the correlation of activity and rotation for approximately 2200 M dwarfs, ranging in spectral type from M0.0 V to M9.0 V. We present new high-resolution spectroscopic observations with FEROS, CAFE, and HRS of approximately 500 M dwarfs. For each new observation, we determined its radial velocity and measured its Hα activity index and its rotation velocity. Additionally, we have multiple observations of many stars to investigate if there are any radial velocity variations due to multiplicity. The results of our survey confirm that early-M dwarfs are Hα inactive with low rotational velocities and that late-M dwarfs are Hα active with very high rotational velocities. The results of this high-resolution analysis comprise the most extensive catalogue of rotation and activity in M dwarfs currently available. Based on observations made at the Calar Alto Observatory, Spain, the European Southern Observatory, La Silla, Chile, and McDonald Observatory, USA. Tables A.1-A.3 are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/614/A76
Artificial intelligence for the EChO long-term mission planning tool
NASA Astrophysics Data System (ADS)
García-Piquer, Álvaro; Ribas, Ignasi; Colomé, Josep
2014-08-01
The Exoplanet Characterisation Observatory (EChO) was an ESA mission candidate competing for a launch opportunity within the M3 call. Its main aim was to carry out research on the physics and chemistry of the atmospheres of transiting planets. This requires the observation of two types of events: primary and secondary eclipses. The events of each exoplanet have to be observed several times in order to obtain measurements with an adequate signal-to-noise ratio. Furthermore, several criteria must be considered in scheduling an observation, among which we can highlight the exoplanet's visibility, its event duration, and the avoidance of overlap with other tasks. It is important to emphasize that, since communications for transferring data between ground stations and the spacecraft are restricted, it is necessary to compute a long-term plan of observations in order to provide autonomy to the observatory. A suitable mission plan thus increases the efficiency of telescope operation, which results in greater scientific return and reduced operational costs. Obtaining a long-term mission plan is unaffordable for human planners due to the complexity of evaluating the enormous number of possible combinations in search of a near-optimal solution. In this contribution we present a long-term mission planning tool based on genetic algorithms, which are well suited to solving optimization problems such as the planning of several tasks. Specifically, the proposed tool finds a solution that highly optimizes the defined objectives, which are based on maximizing the time spent on scientific observations and the scientific return (e.g., the coverage of the mission survey). The results obtained on a large experimental set-up support that the proposed scheduler technology is robust and can function in a variety of scenarios, offering competitive performance that does not depend on the collection of objects to be observed. Finally, it is noteworthy that the conducted experiments allow us to size some aspects of the mission with the aim of guaranteeing its feasibility.
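A genetic algorithm for this kind of plan can be sketched generically: individuals encode which candidate eclipse events are scheduled, and selection, crossover, and mutation evolve the population toward high-fitness plans. The boolean encoding and the fitness callable below are illustrative assumptions, not the EChO tool's actual design.

```python
import numpy as np

rng = np.random.default_rng(7)

def evolve_plan(events, fitness, pop_size=100, n_gen=200, p_mut=0.02):
    """Genetic-algorithm sketch for a long-term observation plan.

    events  : number of candidate eclipse observations; an individual
              is a boolean vector selecting which events to schedule.
    fitness : callable scoring a plan (science time maximized, with
              penalties for visibility violations and overlaps).
    """
    pop = rng.random((pop_size, events)) < 0.5

    for _ in range(n_gen):
        scores = np.array([fitness(ind) for ind in pop])
        # Tournament selection: keep the fitter of random pairs.
        a, b = rng.integers(pop_size, size=(2, pop_size))
        parents = np.where((scores[a] > scores[b])[:, None], pop[a], pop[b])
        # One-point crossover between consecutive parents.
        cut = rng.integers(1, events, size=pop_size)
        children = parents.copy()
        for i in range(0, pop_size - 1, 2):
            c = cut[i]
            children[i, c:], children[i + 1, c:] = (parents[i + 1, c:],
                                                    parents[i, c:])
        # Bit-flip mutation keeps the search from stagnating.
        children ^= rng.random(children.shape) < p_mut
        pop = children

    scores = np.array([fitness(ind) for ind in pop])
    return pop[np.argmax(scores)]
```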
Towards a Future ICRF Realization
NASA Technical Reports Server (NTRS)
Ma, Chopo; Gordon, D.; MacMillan, D.; Petrov, L.; Smith, David E. (Technical Monitor)
2001-01-01
The data and analysis for the ICRF were completed in 1995 to define a frame to which the Hipparcos optical catalog could be fixed. Additional observations on most of the 608 sources in the overall ICRF catalog have been acquired using a small portion of geodetic observing time as well as astrometric sessions concentrating on the southern hemisphere. Positions of new sources have been determined, including approx.1200 from a VLBA phase calibrator survey. A future ICRF realization will require improved geophysical modeling, sophisticated treatment of position variations and/or source structure, optimized data selection and weighting, and reidentification of defining sources. The motivation for the next realization could be significant improvement in accuracy and density or preparation for optical extragalactic catalogs with microarcsecond precision.
Towards a Future ICRF Realization
NASA Technical Reports Server (NTRS)
Ma, Chopo; Gordon, David; MacMillan, Daniel; Petrov, Leonid
2002-01-01
The data and analysis for the ICRF were completed in 1995 to define a frame to which the Hipparcos optical catalog could be fixed. Additional observations on most of the 608 sources in the overall ICRF catalog have been acquired using a small portion of geodetic observing time as well as astrometric sessions concentrating on the Southern Hemisphere. Positions of new sources have been determined, including approximately 1200 from a VLBA phase calibrator survey. A future ICRF realization will require improved geophysical modeling, sophisticated treatment of position variations and/or source structure, optimized data selection and weighting, and re-identification of defining sources. The motivation for the next realization could be significant improvement in accuracy and density or preparation for optical extragalactic catalogs with microarcsecond precision.
Optimization and Control of Cyber-Physical Vehicle Systems
Bradley, Justin M.; Atkins, Ella M.
2015-01-01
A cyber-physical system (CPS) is composed of tightly-integrated computation, communication and physical elements. Medical devices, buildings, mobile devices, robots, transportation and energy systems can benefit from CPS co-design and optimization techniques. Cyber-physical vehicle systems (CPVSs) are rapidly advancing due to progress in real-time computing, control and artificial intelligence. Multidisciplinary or multi-objective design optimization maximizes CPS efficiency, capability and safety, while online regulation enables the vehicle to be responsive to disturbances, modeling errors and uncertainties. CPVS optimization occurs at design-time and at run-time. This paper surveys the run-time cooperative optimization or co-optimization of cyber and physical systems, which have historically been considered separately. A run-time CPVS is also cooperatively regulated or co-regulated when cyber and physical resources are utilized in a manner that is responsive to both cyber and physical system requirements. This paper surveys research that considers both cyber and physical resources in co-optimization and co-regulation schemes with applications to mobile robotic and vehicle systems. Time-varying sampling patterns, sensor scheduling, anytime control, feedback scheduling, task and motion planning and resource sharing are examined. PMID:26378541
Ehlers, Diane K; Huberty, Jennifer; Buman, Matthew; Hooker, Steven; Todd, Michael; de Vreede, Gert-Jan
2016-03-01
Commercially available mobile and Internet technologies present a promising opportunity to feasibly conduct ecological momentary assessment (EMA). The purpose of this study was to describe a novel EMA protocol administered on middle-aged women's smartphones via text messaging and mobile Internet. Women (N = 9; mean age = 46.2 ± 8.2 y) received 35 text message prompts to a mobile survey assessing activity, self-worth, and self-efficacy over 14 days. Prompts were scheduled and surveys were administered using commercial, Internet-based programs. Prompting was tailored to each woman's daily wake/sleep schedule. Women concurrently wore a wrist-worn accelerometer. Feasibility was assessed via survey completion, accelerometer wear, participant feedback, and researcher notes. Of 315 prompted surveys, 287 responses were valid (91.1%). Average completion time was 1.52 ± 1.03 minutes. One participant's activity data were excluded due to accelerometer malfunction, resulting in complete data from 8 participants (n = 252 [80.0%] valid observations). Women reported that the survey was easily and quickly read and completed. However, most thought the accelerometer was inconvenient. High completion rates and perceived usability suggest that capitalizing on widely available technology and tailoring prompting schedules may optimize EMA in middle-aged women. However, researchers may need to carefully select objective monitors to maintain data validity while limiting participant burden.
ERIC Educational Resources Information Center
Messick, Penelope Pope
2012-01-01
This study examined the relationships among enabling school structures, academic optimism, and organizational citizenship behaviors. Additionally, it sought to determine if academic optimism served as a mediator between enabling school structures and organizational citizenship behaviors. Three existing survey instruments, previously tested for…
VizieR Online Data Catalog: VIMOS Public Extragalactic Survey (VIPERS) DR1 (Garilli+, 2014)
NASA Astrophysics Data System (ADS)
Garilli, B.; Guzzo, L.; Scodeggio, M.; Bolzonella, M.; Abbas, U.; Adami, C.; Arnouts, S.; Bel, J.; Bottini, D.; Branchini, E.; Cappi, A.; Coupon, J.; Cucciati, O.; Davidzon, I.; de Lucia, G.; de la Torre, S.; Franzetti, P.; Fritz, A.; Fumana, M.; Granett, B. R.; Ilbert, O.; Iovino, A.; Krywult, J.; Le Brun, V.; Le Fevre, O.; Maccagni, D.; Malek, K.; Marulli, F.; McCracken, H. J.; Paioro, L.; Polletta, M.; Pollo, A.; Schlagenhaufer, H.; Tasca, L. A. M.; Tojeiro, R.; Vergani, D.; Zamorani, G.; Zanichelli, A.; Burden, A.; di Porto, C.; Marchetti, A.; Marinoni, C.; Mellier, Y.; Moscardini, L.; Nichol, R. C.; Peacock, J. A.; Percival, W. J.; Phleps, S.; Wolk, M.
2014-09-01
We present the first Public Data Release (PDR-1) of the VIMOS Public Extragalactic Survey (VIPERS). It comprises 57,204 spectroscopic measurements together with all additional information necessary for optimal scientific exploitation of the data, in particular the associated photometric measurements and quantification of the photometric and survey completeness. VIPERS is an ESO Large Programme designed to build a spectroscopic sample of ≃100,000 galaxies with i_AB < 22.5 and 0.5
Optimizing weak lensing mass estimates for cluster profile uncertainty
Gruen, D.; Bernstein, G. M.; Lam, T. Y.; ...
2011-09-11
Weak lensing measurements of cluster masses are necessary for calibrating mass-observable relations (MORs) to investigate the growth of structure and the properties of dark energy. However, the measured cluster shear signal varies at fixed mass M_200m due to inherent ellipticity of background galaxies, intervening structures along the line of sight, and variations in the cluster structure due to scatter in concentrations, asphericity and substructure. We use N-body simulated halos to derive and evaluate a weak lensing circular aperture mass measurement M_ap that minimizes the mass estimate variance <(M_ap - M_200m)^2> in the presence of all these forms of variability. Depending on halo mass and observational conditions, the resulting mass estimator improves on M_ap filters optimized for circular NFW-profile clusters in the presence of uncorrelated large scale structure (LSS) about as much as the latter improve on an estimator that only minimizes the influence of shape noise. Optimizing for uncorrelated LSS while ignoring the variation of internal cluster structure puts too much weight on the profile near the cores of halos, and under some circumstances can even be worse than not accounting for LSS at all. As a result, we discuss the impact of variability in cluster structure and correlated structures on the design and performance of weak lensing surveys intended to calibrate cluster MORs.
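At heart, deriving a minimum-variance linear mass estimator from simulations reduces to a least-squares problem over the ensemble of halos. The sketch below, with hypothetical array names, illustrates that reduction only; the paper's estimator involves considerably more care (aperture geometry, profile modeling, and the separate noise terms listed above).

```python
import numpy as np

def optimal_aperture_weights(shear_profiles, masses):
    """Minimum-variance linear mass estimator from simulated halos.

    shear_profiles : (n_halo, n_bins) tangential shear in radial bins,
                     including shape noise and line-of-sight structure.
    masses         : (n_halo,) true M_200m of each simulated halo.
    Solves for weights w with M_ap = w . g minimizing <(M_ap - M_200m)^2>
    over the ensemble, which is ordinary least squares.
    """
    w, *_ = np.linalg.lstsq(shear_profiles, masses, rcond=None)
    return w

# Usage: m_est = observed_profile @ w for a new cluster.
```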
The LSST Scheduler from design to construction
NASA Astrophysics Data System (ADS)
Delgado, Francisco; Reuter, Michael A.
2016-07-01
The Large Synoptic Survey Telescope (LSST) will be a highly robotic facility, demanding very high efficiency during its operation. To achieve this, the LSST Scheduler has been envisioned as an autonomous software component of the Observatory Control System (OCS) that selects the sequence of targets in real time. The Scheduler will drive the survey using optimization of a dynamic cost function of more than 200 parameters. Multiple science programs produce thousands of candidate targets for each observation, and multiple telemetry measurements are received to evaluate the external and internal conditions of the observatory. The design of the LSST Scheduler started early in the project, supported by Model Based Systems Engineering, detailed prototyping, and scientific validation of the required survey capabilities. In order to build such a critical component, an agile development path of incremental releases is presented, integrated with the development plan of the Operations Simulator (OpSim) to allow constant testing, integration, and validation in a simulated OCS environment. The final product is a Scheduler that is also capable of running 2000 times faster than real time in simulation mode, for survey studies and scientific validation during commissioning and operations.
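Conceptually, a dynamic cost function scheduler scores each candidate target from current telemetry and survey need and picks the cheapest. The toy below uses three stand-in terms in place of the roughly 200 parameters of the real Scheduler; all field names and weights are invented for illustration.

```python
import numpy as np

def rank_targets(targets, weights):
    """Score candidate targets with a weighted dynamic cost function.

    targets : list of dicts holding per-target telemetry-driven terms
              ("slew_s", "airmass") and a per-proposal "need" term;
              illustrative stand-ins for the real parameter set.
    """
    def cost(t):
        return (weights["slew"] * t["slew_s"]              # minimize slew time
                + weights["airmass"] * (t["airmass"] - 1.0)  # prefer low airmass
                - weights["need"] * t["need"])             # reward survey need

    return sorted(targets, key=cost)

targets = [{"slew_s": 12.0, "airmass": 1.3, "need": 0.8},
           {"slew_s": 4.0, "airmass": 1.6, "need": 0.5}]
weights = {"slew": 1.0, "airmass": 20.0, "need": 30.0}
print(rank_targets(targets, weights)[0])   # best next observation
```

In the real system the weights themselves evolve with survey progress, which is what makes the cost function dynamic.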
S.A. Bowe; R.L. Smith; D. Earl Kline; Philip A. Araman
2002-01-01
A nationwide survey of advanced scanning and optimizing technology in the hardwood sawmill industry was conducted in the fall of 1999. Three specific hardwood sawmill technologies were examined that included current edger-optimizer systems, future edger-optimizer systems, and future automated grading systems. The objectives of the research were to determine differences...
Optimized clustering estimators for BAO measurements accounting for significant redshift uncertainty
NASA Astrophysics Data System (ADS)
Ross, Ashley J.; Banik, Nilanjan; Avila, Santiago; Percival, Will J.; Dodelson, Scott; Garcia-Bellido, Juan; Crocce, Martin; Elvin-Poole, Jack; Giannantonio, Tommaso; Manera, Marc; Sevilla-Noarbe, Ignacio
2017-12-01
We determine an optimized clustering statistic to be used for galaxy samples with significant redshift uncertainty, such as those that rely on photometric redshifts. To do so, we study the baryon acoustic oscillation (BAO) information content as a function of the orientation of galaxy clustering modes with respect to their angle to the line of sight (LOS). The clustering along the LOS, as observed in a redshift-space with significant redshift uncertainty, has contributions from clustering modes with a range of orientations with respect to the true LOS. For redshift uncertainty σz ≥ 0.02(1 + z), we find that while the BAO information is confined to transverse clustering modes in the true space, it is spread nearly evenly in the observed space. Thus, measuring clustering in terms of the projected separation (regardless of the LOS) is an efficient and nearly lossless compression of the signal for σz ≥ 0.02(1 + z). For reduced redshift uncertainty, a more careful consideration is required. We then use more than 1700 realizations (combining two separate sets) of galaxy simulations mimicking the Dark Energy Survey Year 1 (DES Y1) sample to validate our analytic results and optimized analysis procedure. We find that using the correlation function binned in projected separation, we can achieve uncertainties that are within 10 per cent of those predicted by Fisher matrix forecasts. We predict that DES Y1 should achieve a 5 per cent distance measurement using our optimized methods. We expect the results presented here to be important for any future BAO measurements made using photometric redshift data.
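Binning pair counts by projected separation, regardless of the LOS component, is straightforward to sketch; the brute-force example below (hypothetical names, flat-sky assumption) illustrates the compression the paper advocates for σz ≥ 0.02(1 + z).

```python
import numpy as np

def projected_separation_histogram(pos, los_axis=2, bins=None):
    """Count galaxy pairs binned by projected (transverse) separation.

    pos : (n, 3) comoving positions; the line of sight is taken along
          one coordinate axis (flat-sky approximation).
    With large redshift errors the LOS coordinate is noisy, so only
    the separation transverse to the LOS enters the BAO fit.
    """
    if bins is None:
        bins = np.linspace(0.0, 200.0, 41)    # Mpc/h, illustrative
    transverse = np.delete(pos, los_axis, axis=1)
    # Pairwise projected separations; O(n^2) memory, fine for a sketch.
    diff = transverse[:, None, :] - transverse[None, :, :]
    s_perp = np.sqrt((diff ** 2).sum(-1))
    iu = np.triu_indices(len(pos), k=1)       # count each pair once
    counts, _ = np.histogram(s_perp[iu], bins=bins)
    return counts, bins
```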
Optimal Allocation of Sampling Effort in Depletion Surveys
We consider the problem of designing a depletion or removal survey as part of estimating animal abundance for populations with imperfect capture or detection rates. In a depletion survey, animals are captured from a given area, counted, and withheld from the population. This proc...
Peer observation and feedback of resident teaching.
Snydman, Laura; Chandler, Daniel; Rencic, Joseph; Sung, Yung-Chi
2013-02-01
Resident doctors (residents) play a significant role in the education of medical students. Morning work rounds provide an optimal venue to assess resident teaching. The purpose of this study was to assess the feasibility of peer observation of resident work rounds, to evaluate resident perceptions of peer observation, and to evaluate resident perceptions of peer feedback. Twenty-four internal medicine residents were simultaneously observed by an attending physician and a peer while teaching during work rounds (between August 2008 and May 2009). At year-end, residents received a survey to characterise their attitudes towards peer observation and feedback. Twenty-one residents (87.5%) completed the survey. Half (52.4%) felt that participating in the peer observation study stimulated their interest in teaching during work rounds. Prior to participation in the study, fewer than half (42.9%) felt comfortable being observed by their peers, compared with 71.4 percent after participation (p=0.02). The proportion of residents who felt comfortable giving feedback to peers increased from 26.3 to 65.0 percent (p=0.004), and the proportion of residents who felt comfortable receiving feedback from peers increased from 76.2 to 95.2 percent (p=0.02). Peer observation and feedback of resident teaching during work rounds is feasible and rewarding for the residents involved. Comfort with being observed by peers, with receiving feedback from peers, and with giving feedback to peers significantly increased after the study. Most residents reported changes in their teaching behaviour resulting from feedback. Residents felt that observing a peer teach on work rounds was one of the most useful activities for improving their own teaching on work rounds.
Optimal satellite sampling to resolve global-scale dynamics in the I-T system
NASA Astrophysics Data System (ADS)
Rowland, D. E.; Zesta, E.; Connor, H. K.; Pfaff, R. F., Jr.
2016-12-01
The recent Decadal Survey highlighted the need for multipoint measurements of ion-neutral coupling processes to study the pathways by which solar wind energy drives dynamics in the I-T system. The emphasis in the Decadal Survey is on global-scale dynamics and processes, and in particular, mission concepts making use of multiple identical spacecraft in low Earth orbit were considered for the GDC and DYNAMIC missions. This presentation will provide quantitative assessments of the optimal spacecraft sampling needed to significantly advance our knowledge of I-T dynamics on the global scale. We will examine storm time and quiet time conditions as simulated by global circulation models, and determine how well various candidate satellite constellations and sampling schemes can quantify the plasma and neutral convection patterns and global-scale distributions of plasma density, neutral density, and composition, and their response to changes in the IMF. While the global circulation models are data-starved, and do not contain all the physics that we might expect to observe with a global-scale constellation mission, they are nonetheless an excellent "starting point" for discussions of the implementation of such a mission. The results will be of great utility for the design of future missions, such as GDC, to study the global-scale dynamics of the I-T system.
The Swift Gamma-Ray Burst Host Galaxy Legacy Survey. I. Sample Selection and Redshift Distribution
NASA Technical Reports Server (NTRS)
Perley, D. A.; Kruhler, T.; Schulze, S.; Postigo, A. De Ugarte; Hjorth, J.; Berger, E.; Cenko, S. B.; Chary, R.; Cucchiara, A.; Ellis, R.;
2016-01-01
We introduce the Swift Gamma-Ray Burst Host Galaxy Legacy Survey (SHOALS), a multi-observatory high redshift galaxy survey targeting the largest unbiased sample of long-duration gamma-ray burst (GRB) hosts yet assembled (119 in total). We describe the motivations of the survey and the development of our selection criteria, including an assessment of the impact of various observability metrics on the success rate of afterglow-based redshift measurement. We briefly outline our host galaxy observational program, consisting of deep Spitzer/IRAC imaging of every field supplemented by similarly deep, multicolor optical/near-IR photometry, plus spectroscopy of events without preexisting redshifts. Our optimized selection cuts combined with host galaxy follow-up have so far enabled redshift measurements for 110 targets (92%) and placed upper limits on all but one of the remainder. About 20% of GRBs in the sample are heavily dust obscured, and at most 2% originate from z > 5.5. Using this sample, we estimate the redshift-dependent GRB rate density, showing it to peak at z approx. 2.5 and fall by at least an order of magnitude toward low (z = 0) redshift, while declining more gradually toward high (z approx. 7) redshift. This behavior is consistent with a progenitor whose formation efficiency varies modestly over cosmic history. Our survey will permit the most detailed examination to date of the connection between the GRB host population and general star-forming galaxies, directly measure evolution in the host population over cosmic time and discern its causes, and provide new constraints on the fraction of cosmic star formation occurring in undetectable galaxies at all redshifts.
Prpić, Katarina
2011-11-01
This paper finds that the Croatian public's and the social elites' perceptions of science are a mixture of scientific and technological optimism, of the tendency to absolve science of social responsibility, of skepticism about the social effects of science, and of cognitive optimism and skepticism. However, perceptions differ significantly according to the different social roles and the wider value system of the observed groups. The survey data show some key similarities, as well as certain specificities in the configuration of the types of views of the four groups--the public, scientists, politicians and managers. The results suggest that the well-known typology of the four cultures reveals some of the ideologies of the key actors of scientific and technological policy. The greatest social, primarily educational and socio-spatial, differentiation of the perceptions of science was found in the general public.
Applications of numerical optimization methods to helicopter design problems: A survey
NASA Technical Reports Server (NTRS)
Miura, H.
1984-01-01
Applications of mathematical programming methods to improve the design of helicopters and their components are surveyed. Applications of multivariable search techniques in finite-dimensional space are considered. Five categories of helicopter design problems are considered: (1) conceptual and preliminary design, (2) rotor-system design, (3) airframe structures design, (4) control system design, and (5) flight trajectory planning. Key technical progress in numerical optimization methods relevant to rotorcraft applications is summarized.
SPOKES: An end-to-end simulation facility for spectroscopic cosmological surveys
Nord, B.; Amara, A.; Refregier, A.; ...
2016-03-03
The nature of dark matter, dark energy and large-scale gravity poses some of the most pressing questions in cosmology today. These fundamental questions require highly precise measurements, and a number of wide-field spectroscopic survey instruments are being designed to meet this requirement. A key component in these experiments is the development of a simulation tool to forecast science performance, define requirement flow-downs, optimize implementation, demonstrate feasibility, and prepare for exploitation. We present SPOKES (SPectrOscopic KEn Simulation), an end-to-end simulation facility for spectroscopic cosmological surveys designed to address this challenge. SPOKES is based on an integrated infrastructure, modular function organization, coherent data handling and fast data access. These key features allow reproducibility of pipeline runs, enable ease of use and provide flexibility to update functions within the pipeline. The cyclic nature of the pipeline offers the possibility to make the science output an efficient measure for design optimization and feasibility testing. We present the architecture, first science, and computational performance results of the simulation pipeline. The framework is general, but for the benchmark tests, we use the Dark Energy Spectrometer (DESpec), one of the early concepts for the upcoming project, the Dark Energy Spectroscopic Instrument (DESI). As a result, we discuss how the SPOKES framework enables a rigorous process to optimize and exploit spectroscopic survey experiments in order to derive high-precision cosmological measurements.
NASA Astrophysics Data System (ADS)
Frolov, Sergey; Garau, Bartolame; Bellingham, James
2014-08-01
Regular grid ("lawnmower") survey is a classical strategy for synoptic sampling of the ocean. Is it possible to achieve a more effective use of available resources if one takes into account a priori knowledge about variability in magnitudes of uncertainty and decorrelation scales? In this article, we develop and compare the performance of several path-planning algorithms: optimized "lawnmower," a graph-search algorithm (A*), and a fully nonlinear genetic algorithm. We use the machinery of the best linear unbiased estimator (BLUE) to quantify the ability of a vehicle fleet to synoptically map distribution of phytoplankton off the central California coast. We used satellite and in situ data to specify covariance information required by the BLUE estimator. Computational experiments showed that two types of sampling strategies are possible: a suboptimal space-filling design (produced by the "lawnmower" and the A* algorithms) and an optimal uncertainty-aware design (produced by the genetic algorithm). Unlike the space-filling designs that attempted to cover the entire survey area, the optimal design focused on revisiting areas of high uncertainty. Results of the multivehicle experiments showed that fleet performance predictors, such as cumulative speed or the weight of the fleet, predicted the performance of a homogeneous fleet well; however, these were poor predictors for comparing the performance of different platforms.
The Science and Prospects of Astrophysical Observations with New Horizons
NASA Astrophysics Data System (ADS)
Nguyen, Chi; Zemcov, Michael; Cooray, Asantha; Lisse, Carey; Poppe, Andrew
2018-01-01
Astrophysical observation from the outer solar system provides a unique and quiet vantage point from which to understand our cosmos. If properly designed, such observations enable several niche science cases that are difficult or impossible to perform near Earth. NASA's New Horizons mission includes several instruments with ~10 cm telescopes that provide imaging capability from UV to near-IR wavelengths with moderate spectral resolution. A carefully designed survey can optimize the expendable propellant and limited data telemetry bandwidth to allow several unique measurements, including a detailed understanding of the cosmic extragalactic background light in the optical and near-IR, studies of the local and extragalactic UV background, measurements of the properties of dust and ice in the outer solar system, searches for moons and other faint structures around exoplanets, and determinations of the mass of planets far from their parent stars using gravitational microlensing. New Horizons is currently in an extended mission, concluding in 2021, that is designed to survey distant objects in the Kuiper Belt at high phase angles and perform a close flyby of KBO 2014 MU69. Afterwards, the astrophysics community will have a unique, generational opportunity to use this mission for astronomical observations at heliocentric distances beyond 50 AU. In this poster, we present the science case for an extended 2021-2026 astrophysics mission, and discuss some of the practical considerations that must be addressed to maximize the potential science return.
ICE-COLA: fast simulations for weak lensing observables
NASA Astrophysics Data System (ADS)
Izard, Albert; Fosalba, Pablo; Crocce, Martin
2018-01-01
Approximate methods to full N-body simulations provide a fast and accurate solution to the development of mock catalogues for the modelling of galaxy clustering observables. In this paper we extend ICE-COLA, based on an optimized implementation of the approximate COLA method, to produce weak lensing maps and halo catalogues in the light-cone using an integrated and self-consistent approach. We show that despite the approximate dynamics, the catalogues thus produced enable an accurate modelling of weak lensing observables one decade beyond the characteristic scale where the growth becomes non-linear. In particular, we compare ICE-COLA to the MICE Grand Challenge N-body simulation for some fiducial cases representative of upcoming surveys and find that, for sources at redshift z = 1, their convergence power spectra agree to within 1 per cent up to high multipoles (i.e. of order 1000). The corresponding shear two point functions, ξ+ and ξ-, yield similar accuracy down to 2 and 20 arcmin respectively, while tangential shear around a z = 0.5 lens sample is accurate down to 4 arcmin. We show that such accuracy is stable against an increased angular resolution of the weak lensing maps. Hence, this opens the possibility of using approximate methods for the joint modelling of galaxy clustering and weak lensing observables and their covariance in ongoing and future galaxy surveys.
NASA Astrophysics Data System (ADS)
He, Xin
2017-03-01
The ideal observer is widely used in imaging system optimization. One practical question remains open: do the ideal and human observers have the same preference in system optimization and evaluation? Based on the ideal observer's mathematical properties proposed by Barrett et al. and the empirical properties of human observers investigated by Myers et al., I attempt to pursue general rules regarding the applicability of the ideal observer in system optimization. Particularly, in software optimization, the ideal observer pursues data conservation while humans pursue data presentation or perception. In hardware optimization, the ideal observer pursues a system with the maximum total information, while humans pursue a system with the maximum selected (e.g., certain frequency bands) information. These different objectives may result in different system optimizations between human and ideal observers. Thus, an ideal-observer-optimized system is not necessarily optimal for humans. I cite empirical evidence in search and detection tasks, in hardware and software evaluation, and in X-ray CT, pinhole imaging, and emission computed tomography to corroborate these claims. (Disclaimer: the views expressed in this work do not necessarily represent those of the FDA)
H2 Fluorescence in M dwarf Systems: A Stellar Origin
NASA Astrophysics Data System (ADS)
Kruczek, Nicholas; France, Kevin; Evonosky, William; Youngblood, Allison; Loyd, R. O. Parke
2017-01-01
Observations of Lyα-driven H2 fluorescence can be a useful tool for measuring the abundance of H2 in exoplanet atmospheres. This emission has been previously observed in M dwarfs with planetary systems, but at too low a signal level to determine its origin. It may have been originating in the atmospheres of planets, but conditions within these systems also mean that the H2 could be residing on the stellar surface or in a circumstellar disk. We use observations from the "Measurements of the Ultraviolet Spectral Characteristics of Low-mass Exoplanet Host Stars" (MUSCLES) Hubble Space Telescope (HST) Treasury Survey to study H2 fluorescence in M dwarfs with and without confirmed planets to determine the origin of the emission. The results are further supported by the direct imaging of a candidate M dwarf system using the HST Advanced Camera for Surveys/Solar Blind Channel. We constrain the location of the fluorescing H2 through analysis of the line profiles and determine that the emission is originating on the star. We verify that this interpretation is consistent with 1D radiative transfer models that are optimized using the spectra of the MUSCLES stars and find that the H2 likely resides in starspots or a cool region of the lower chromosphere.
Studying Galaxy Formation with the Hubble, Spitzer and James Webb Space Telescopes
NASA Technical Reports Server (NTRS)
Gardner, Jonathan P.
2007-01-01
The deepest optical to infrared observations of the universe include the Hubble Deep Fields, the Great Observatories Origins Deep Survey and the recent Hubble Ultra-Deep Field. Galaxies are seen in these surveys at redshifts z>6, less than 1 Gyr after the Big Bang, at the end of a period when light from the galaxies has reionized Hydrogen in the inter-galactic medium. These observations, combined with theoretical understanding, indicate that the first stars and galaxies formed at z>10, beyond the reach of the Hubble and Spitzer Space Telescopes. To observe the first galaxies, NASA is planning the James Webb Space Telescope (JWST), a large (6.5m), cold (<50K), infrared-optimized observatory to be launched early in the next decade into orbit around the second Earth- Sun Lagrange point. JWST will have four instruments: The Near-Infrared Camera, the Near-Infrared multi-object Spectrograph, and the Tunable Filter Imager will cover the wavelength range 0.6 to 5 microns, while the Mid-Infrared Instrument will do both imaging and spectroscopy from 5 to 28.5 microns. In addition to JWST's ability to study the formation and evolution of galaxies, I will also briefly review its expected contributions to studies of the formation of stars and planetary systems.
A sectional denture as the optimal prosthesis.
Cohen, K
1989-08-01
A case is described where because of various local factors--anterior ridge loss, Class III skeletal relationship, survey lines, appearance, retention and support problems--a sectional prosthesis was found to be the optimal restoration.
NASA Technical Reports Server (NTRS)
Friedmann, Peretz P.
1991-01-01
This paper presents a survey of the state of the art in the field of structural optimization applied to vibration reduction of helicopters in forward flight with aeroelastic and multidisciplinary constraints. It emphasizes the application of the modern approach, in which the optimization is formulated as a mathematical programming problem: the objective function consists of the vibration levels at the hub, and behavior constraints are imposed on the blade frequencies and aeroelastic stability margins, as well as on a number of additional ingredients that can have a significant effect on the overall performance and flight mechanics of the helicopter. It is shown that the integrated multidisciplinary optimization of rotorcraft offers the potential for substantial improvements, which can be achieved by careful preliminary design and analysis without requiring additional hardware such as rotor vibration absorbers or isolation systems.
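Schematically, the formulation described above is a standard nonlinear program: minimize a hub-vibration objective over blade design variables subject to frequency-placement and stability constraints. A toy sketch with placeholder models (hub_vibration and blade_frequencies are stand-ins, not real aeroelastic analyses):

```python
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint

def hub_vibration(x):
    # Stand-in for an aeroelastic prediction of N/rev hub loads.
    return np.sum((x - 0.3) ** 2) + 0.1 * np.sin(x).sum()

def blade_frequencies(x):
    # Stand-in for computed blade natural frequencies (per rev).
    return 1.0 + x

x0 = np.full(5, 0.5)  # design variables, e.g. spar thicknesses
# Keep frequencies inside a window away from integer multiples of
# the rotor speed (the frequency-placement behavior constraint).
freq_window = NonlinearConstraint(blade_frequencies, 1.1, 1.9)
res = minimize(hub_vibration, x0, method="SLSQP",
               bounds=[(0.1, 1.0)] * 5, constraints=[freq_window])
print(res.x, res.fun)
```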
Joosten, Alexandre; Desebbe, Olivier; Suehiro, Koichi; Essiet, Mfonobong; Alexander, Brenton; Ricks, Cameron; Rinehart, Joseph; Faraoni, David; Cecconi, Maurizio; Van der Linden, Philippe; Cannesson, Maxime
2017-02-01
To assess the relationship between the addition of advanced monitoring variables and changes in clinical decision-making. A 15-question survey was anonymously emailed to international experts and physician members of five anesthesia societies, focused on assessing clinicians' treatment decisions during three realistic clinical scenarios at two distinct time points: first when typical case information and basic monitoring were provided (T1), and again after the addition of advanced monitoring variables (T2). We hypothesized that the addition of advanced variables would increase the incidence of an optimal therapeutic decision (a priori defined as the answer with the highest percentage of expert agreement) and decrease the variability among the physicians' suggested treatments. The survey was completed by 18 experts and 839 physicians. Overall, adding advanced monitoring did not significantly increase physician response accuracy, with the least substantial changes noted on questions related to volume expansion or vasopressor administration. Moreover, advanced monitoring data did not significantly decrease the high level of initial practice variability in physician-suggested treatments (P = 0.13), in contrast to the low variability observed within the expert group (P = 0.039). Additionally, 5-10 years of practice (P < 0.0001) and a cardiovascular subspecialty (P = 0.048) were both physician characteristics associated with a higher rate of optimal therapeutic decisions. The addition of advanced variables was of limited benefit for most physicians, further indicating the need for more in-depth education on the clinical value and technical understanding of such variables.
DSSD detectors development: PACT, a new space Compton telescope at the horizon 2025
NASA Astrophysics Data System (ADS)
Laurent, P.; Khalil, M.; Dolgorouki, Y.; Bertoli, W.; Oger, R.; Bréelle, E.
2015-07-01
PACT is a Pair and Compton telescope that aims to make a sensitive survey of the gamma-ray sky between 100 keV and 100 MeV. It will be devoted to the detection of radioactivity lines from present and past supernova explosions, the observation of thousands of new blazars, and the study of polarized radiation from gamma-ray bursts, pulsars and accreting black holes. It will reach a sensitivity one to two orders of magnitude lower than COMPTEL/CGRO (e.g. about 50 times lower for the broad-band survey sensitivity at 1 MeV after 5 years). The PACT telescope is based upon three main components: a silicon-based gamma-ray tracker, a crystal-based calorimeter (e.g. CeBr3), and an anticoincidence detector made of plastic scintillator panels. Prototypes of the silicon detector planes have been optimized and are currently being tested in the APC laboratory.
Comparing cosmic web classifiers using information theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leclercq, Florent; Lavaux, Guilhem; Wandelt, Benjamin
We introduce a decision scheme for optimally choosing a classifier, which segments the cosmic web into different structure types (voids, sheets, filaments, and clusters). Our framework, based on information theory, accounts for the design aims of different classes of possible applications: (i) parameter inference, (ii) model selection, and (iii) prediction of new observations. As an illustration, we use cosmographic maps of web-types in the Sloan Digital Sky Survey to assess the relative performance of the classifiers T-WEB, DIVA and ORIGAMI for: (i) analyzing the morphology of the cosmic web, (ii) discriminating dark energy models, and (iii) predicting galaxy colors. Our study substantiates a data-supported connection between cosmic web analysis and information theory, and paves the path towards principled design of analysis procedures for the next generation of galaxy surveys. We have made the cosmic web maps, galaxy catalog, and analysis scripts used in this work publicly available.
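In spirit, such a decision scheme amounts to scoring each classifier by its expected utility under the intended application and picking the maximizer. A generic sketch of that idea (not the paper's actual information-theoretic criterion):

```python
import numpy as np

def expected_utility(posterior, utility):
    """posterior: (n_voxels, n_types) web-type probabilities;
    utility: (n_types, n_types) gain for (true, assigned) pairs.
    Assign each voxel the utility-maximizing type, then average."""
    gains = posterior @ utility       # expected gain of each action
    return gains.max(axis=1).mean()

rng = np.random.default_rng(0)
post = rng.dirichlet(np.ones(4), size=1000)  # toy posteriors, 4 types
U = np.eye(4)           # unit gain for a correct type, zero otherwise
print(expected_utility(post, U))
```

Different applications (inference, model selection, prediction) correspond to different utility matrices, which is why the best classifier can change with the task.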
NASA Astrophysics Data System (ADS)
Koay, J. Y.; Macquart, J.-P.; Jauncey, D. L.; Pursimo, T.; Giroletti, M.; Bignall, H. E.; Lovell, J. E. J.; Rickett, B. J.; Kedziora-Chudczer, L.; Ojha, R.; Reynolds, C.
2018-03-01
We investigate the relationship between 5 GHz interstellar scintillation (ISS) and 15 GHz intrinsic variability of compact, radio-selected active galactic nuclei (AGNs) drawn from the Microarcsecond Scintillation-Induced Variability (MASIV) Survey and the Owens Valley Radio Observatory blazar monitoring program. We discover that the strongest scintillators at 5 GHz (modulation index, m5 ≥ 0.02) all exhibit strong 15 GHz intrinsic variability (m15 ≥ 0.1). This relationship can be attributed mainly to the mutual dependence of intrinsic variability and ISS amplitudes on radio core compactness at ~100 μas scales, and to a lesser extent, on their mutual dependences on source flux density, arcsec-scale core dominance and redshift. However, not all sources displaying strong intrinsic variations show high amplitude scintillation, since ISS is also strongly dependent on Galactic line-of-sight scattering properties. This observed relationship between intrinsic variability and ISS highlights the importance of optimizing the observing frequency, cadence, timespan and sky coverage of future radio variability surveys, such that these two effects can be better distinguished to study the underlying physics. For the full MASIV sample, we find that Fermi-detected gamma-ray loud sources exhibit significantly higher 5 GHz ISS amplitudes than gamma-ray quiet sources. This relationship is weaker than the known correlation between gamma-ray loudness and the 15 GHz variability amplitudes, most likely due to jet opacity effects.
NASA Astrophysics Data System (ADS)
Megan Gillies, D.; Knudsen, D.; Donovan, E.; Jackel, B.; Gillies, R.; Spanswick, E.
2017-08-01
We present a comprehensive survey of 630 nm (red-line) emission discrete auroral arcs using the newly deployed Redline Emission Geospace Observatory. In this study we discuss the need for observations of the 630 nm aurora and the issues posed by the large altitude range of red-line emission. We compare field-aligned currents (FACs) measured by the Swarm constellation of satellites with the locations of 10 red-line (630 nm) auroral arcs observed by all-sky imagers (ASIs) and find that a characteristic emission height of 200 km applied to the ASI maps gives optimal agreement between the two observations. We also compare the new FAC method against the traditional triangulation method using pairs of ASIs, and against electron density profiles obtained from the Resolute Bay Incoherent Scatter Radar - Canada (RISR-C), both of which are consistent with a characteristic emission height of 200 km.
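The sensitivity to the assumed emission height comes from simple mapping geometry: an arc seen at zenith angle θ projects to a ground distance of roughly h tan θ, so changing h shifts where the arc lands on the map. A minimal illustration (flat-Earth approximation; the function name is ours):

```python
import numpy as np

def arc_ground_distance(zenith_deg, emission_height_km=200.0):
    """Horizontal distance from an all-sky imager to the point below
    an auroral feature seen at the given zenith angle, for an assumed
    emission height (flat-Earth approximation, adequate at small
    zenith angles)."""
    return emission_height_km * np.tan(np.radians(zenith_deg))

# A 45-degree feature maps ~200 km away for h = 200 km, but only
# ~110 km away for h = 110 km, hence the need to pin down h.
print(arc_ground_distance(45.0), arc_ground_distance(45.0, 110.0))
```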
Ivan P. Edwards; Jennifer L. Cripliver; Andrew R. Gillespie; Kurt H. Johnsen; M. Scholler; Ronald F. Turco
2004-01-01
We investigated the effect of an optimal nutrition strategy designed to maximize loblolly pine (Pinus taeda) growth on the rank abundance structure and diversity of associated basidiomycete communities. We conducted both small- and large-scale below-ground surveys 10 years after the initiation of optimal...
IP Subsurface Imaging in the Presence of Buried Steel Infrastructure
NASA Astrophysics Data System (ADS)
Smart, N. H.; Everett, M. E.
2017-12-01
The purpose of this research is to explore the use of induced polarization to image closely spaced steel columns at a controlled test site. Texas A&M University's Riverside Campus (RELLIS) was used as a controlled test site to examine the difference between actual and remotely sensed observed depths. Known borehole depths and soil composition made this site ideal. The subsurface metal structures were assessed using a combination of ER (Electrical Resistivity) and IP (Induced Polarization), and the data were later processed using inversion. Surveying was set up in reference to known locations and depths of steel structures in order to maximize control data quality. Comparing known and remotely sensed foundation depths raises a series of questions regarding how the percent error between imaged and actual depths can be lowered, which we address by comparing our survey results with the known depth and width of the metal beams. Because RELLIS offers a controlled setting for this research, ideal survey geometry and inversion parameters can be determined to achieve optimal results and resolution.
Autonomous In-Situ Resources Prospector
NASA Technical Reports Server (NTRS)
Dissly, R. W.; Buehler, M. G.; Schaap, M. G.; Nicks, D.; Taylor, G. J.; Castano, R.; Suarez, D.
2004-01-01
This presentation will describe the concept of an autonomous, intelligent, rover-based rapid surveying system to identify and map several key lunar resources to optimize their ISRU (In Situ Resource Utilization) extraction potential. Prior to an extraction phase for any target resource, ground-based surveys are needed to provide confirmation of remote observation, to quantify and map their 3-D distribution, and to locate optimal extraction sites (e.g. ore bodies) with precision to maximize their economic benefit. The system will search for and quantify optimal minerals for oxygen production feedstock, water ice, and high glass-content regolith that can be used for building materials. These are targeted because of their utility and because they are, or are likely to be, variable in quantity over spatial scales accessible to a rover (i.e., few km). Oxygen has benefits for life support systems and as an oxidizer for propellants. Water is a key resource for sustainable exploration, with utility for life support, propellants, and other industrial processes. High glass-content regolith has utility as a feedstock for building materials as it readily sinters upon heating into a cohesive matrix more readily than other regolith materials or crystalline basalts. Lunar glasses are also a potential feedstock for oxygen production, as many are rich in iron and titanium oxides that are optimal for oxygen extraction. To accomplish this task, a system of sensors and decision-making algorithms for an autonomous prospecting rover is described. One set of sensors will be located in the wheel tread of the robotic search vehicle providing contact sensor data on regolith composition. Another set of instruments will be housed on the platform of the rover, including VIS-NIR imagers and spectrometers, both for far-field context and near-field characterization of the regolith in the immediate vicinity of the rover. Also included in the sensor suite are a neutron spectrometer, ground-penetrating radar, and an instrumented cone penetrometer for subsurface assessment. Output from these sensors will be evaluated autonomously in real-time by decision-making software to evaluate if any of the targeted resources has been detected, and if so, to quantify their abundance. Algorithms for optimizing the mapping strategy based on target resource abundance and distribution are also included in the autonomous software. This approach emphasizes on-the-fly survey measurements to enable efficient and rapid prospecting of large areas, which will improve the economics of ISRU system approaches. The mature technology will enable autonomous rovers to create in-situ resource maps of lunar or other planetary surfaces, which will facilitate human and robotic exploration.
The Pan-STARRS Survey for Transients (PSST)
NASA Astrophysics Data System (ADS)
Huber, Mark; Carter Chambers, Kenneth; Flewelling, Heather; Smartt, Stephen J.; Smith, Ken; Wright, Darryl
2015-08-01
The Pan-STARRS1 (PS1) Science Consortium finished the 3Pi survey of the whole sky north of -30 degrees between 2010-2014 in grizy (PS1-specific filters), and the PS1 telescope has been running a wide-field survey for near-earth objects, funded by NASA through the NEO Observation Program. This survey takes data in a w-band (a wide-band filter spanning g, r, i) in dark time, and combinations of r, i, z and y during bright time. We are now processing these data through the Pan-STARRS IPP difference imaging pipeline and recovering stationary transients. Effectively, the 3Pi survey for transients that started during the PS1 Science Consortium is being continued under the new NEO-optimized operations mode. The observing procedure is to take a quad of exposures, typically 30-45 seconds each and separated by 10-20 minutes, revealing high-confidence transients (greater than 5-sigma) to depths of i ~ 20.7, y ~ 18.3 (AB mags). This cadence may be repeated on subsequent nights in a return pointing. Continuing the public release of the first 880 transients from the PS1 3Pi survey during the search period September 2013 - January 2014, beginning February 2015 the transient events using the data from the Pan-STARRS NEO Science Consortium are now regularly added. These are mostly supernova candidates, but the list also contains some variable stars, AGN, and nuclear transients. The light curves are too sparsely sampled to be of standalone use, but they may be of use to the community in combination with existing data (e.g. Fraser et al. 2013, ApJ, 779, L8), constraining explosion and rise times (e.g. Nicholl et al. 2013, Nature, 502, 346), as well as many being new discoveries. For additional details visit http://star.pst.qub.ac.uk/ps1threepi/
Lanier, Wendy E.; Bailey, Larissa L.; Muths, Erin L.
2016-01-01
Conservation of imperiled species often requires knowledge of vital rates and population dynamics. However, these can be difficult to estimate for rare species and small populations. This problem is further exacerbated when individuals are not available for detection during some surveys due to limited access, delaying surveys and creating mismatches between the breeding behavior and survey timing. Here we use simulations to explore the impacts of this issue using four hypothetical boreal toad (Anaxyrus boreas boreas) populations, representing combinations of logistical access (accessible, inaccessible) and breeding behavior (synchronous, asynchronous). We examine the bias and precision of survival and breeding probability estimates generated by survey designs that differ in effort and timing for these populations. Our findings indicate that the logistical access of a site and mismatch between the breeding behavior and survey design can greatly limit the ability to yield accurate and precise estimates of survival and breeding probabilities. Simulations similar to what we have performed can help researchers determine an optimal survey design(s) for their system before initiating sampling efforts.
Use of combined radar and radiometer systems in space for precipitation measurement: Some ideas
NASA Technical Reports Server (NTRS)
Moore, R. K.
1981-01-01
A brief survey is given of some fundamental physical concepts of the optimal polarization characteristics of a transmission path or scattering ensemble of hydrometeors. It is argued that, based on this optimization concept, definite advances in remote atmospheric sensing are to be expected. Basic properties of Kennaugh's optimal polarization theory are identified.
NASA Astrophysics Data System (ADS)
Mohammadi, B.; Pironneau, O.
2002-12-01
This paper is a short survey of optimal shape design (OSD) for fluids. OSD is an interesting field both mathematically and for industrial applications. Existence, sensitivity, correct discretization are important theoretical issues. Practical implementation issues for airplane designs are critical too. The paper is also a summary of the material covered in our recent book, Applied Optimal Shape Design, Oxford University Press, 2001.
Jackson, George L; Zullig, Leah L; Phelan, Sean M; Provenzale, Dawn; Griffin, Joan M; Clauser, Steven B; Haggstrom, David A; Jindal, Rahul M; van Ryn, Michelle
2015-07-01
The current study was performed to determine whether patient characteristics, including race/ethnicity, were associated with patient-reported care coordination for patients with colorectal cancer (CRC) who were treated in the Veterans Affairs (VA) health care system, with the goal of better understanding potential goals of quality improvement efforts aimed at improving coordination. The nationwide Cancer Care Assessment and Responsive Evaluation Studies survey involved VA patients with CRC who were diagnosed in 2008 (response rate, 67%). The survey included a 4-item scale of patient-reported frequency ("never," "sometimes," "usually," and "always") of care coordination activities (scale score range, 1-4). Among 913 patients with CRC who provided information regarding care coordination, demographics, and symptoms, multivariable logistic regression was used to examine odds of patients reporting optimal care coordination. VA patients with CRC were found to report high levels of care coordination (mean scale score, 3.50 [standard deviation, 0.61]). Approximately 85% of patients reported a high level of coordination, including the 43% reporting optimal/highest-level coordination. There was no difference observed in the odds of reporting optimal coordination by race/ethnicity. Patients with early-stage disease (odds ratio [OR], 0.60; 95% confidence interval [95% CI], 0.45-0.81), greater pain (OR, 0.97 for a 1-point increase in pain scale; 95% CI, 0.96-0.99), and greater levels of depression (OR, 0.97 for a 1-point increase in depression scale; 95% CI, 0.96-0.99) were less likely to report optimal coordination. Patients with CRC in the VA reported high levels of care coordination. Unlike what has been reported in settings outside the VA, there appears to be no racial/ethnic disparity in reported coordination. However, challenges remain in ensuring coordination of care for patients with less advanced disease and a high symptom burden. Cancer 2015;121:2207-2213. © 2015 American Cancer Society. © 2015 American Cancer Society.
Optimum structural design with plate bending elements - A survey
NASA Technical Reports Server (NTRS)
Haftka, R. T.; Prasad, B.
1981-01-01
A survey is presented of recently published papers in the field of optimum structural design of plates, largely with respect to the minimum-weight design of plates subject to such constraints as fundamental frequency maximization. It is shown that, due to the availability of powerful computers, the trend in optimum plate design is away from methods tailored to specific geometry and loads and toward methods that can be easily programmed for any kind of plate, such as finite element methods. A corresponding shift is seen in optimization from variational techniques to numerical optimization algorithms. Among the topics covered are fully stressed design and optimality criteria, mathematical programming, smooth and ribbed designs, design against plastic collapse, buckling constraints, and vibration constraints.
Survey of optimization techniques for nonlinear spacecraft trajectory searches
NASA Technical Reports Server (NTRS)
Wang, Tseng-Chan; Stanford, Richard H.; Sunseri, Richard F.; Breckheimer, Peter J.
1988-01-01
Mathematical analysis of the optimal search of a nonlinear spacecraft trajectory to arrive at a set of desired targets is presented. A high precision integrated trajectory program and several optimization software libraries are used to search for a converged nonlinear spacecraft trajectory. Several examples for the Galileo Jupiter Orbiter and the Ocean Topography Experiment (TOPEX) are presented that illustrate a variety of the optimization methods used in nonlinear spacecraft trajectory searches.
NASA Astrophysics Data System (ADS)
Pryet, A.; d'Ozouville, N.; Violette, S.; Deffontaines, B.; Auken, E.
2012-12-01
Many volcanic islands face freshwater stress, and the situation may worsen with climate change and sea level rise. In this context, optimum management of freshwater resources becomes crucial, but is often impeded by the lack of data. With the aim of investigating the hydrogeological settings of southern San Cristóbal Island (Galapagos), we conducted a helicopter-borne transient electromagnetic survey with the SkyTEM system. It provided unprecedented insights into the 3-D resistivity structure of this extinct basaltic shield. Combined with remote sensing and fieldwork, it allowed the definition of the first hydrogeological conceptual model of the island. Springs are fed by a series of perched aquifers overlying a regional basal aquifer subject to seawater intrusion. Dykes, evidenced by alignments of eruptive cones at the surface, correspond to sharp sub-vertical contrasts in resistivity in the subsurface, and impound groundwater in a summit channel. Combined with geomorphological observations, airborne electromagnetics are shown to be useful for hydrogeological exploratory studies in complex, poorly known environments. They allow optimal development of land-based geophysical surveys and drilling campaigns.
Lluch, Anne; Maillot, Matthieu; Gazan, Rozenn; Vieux, Florent; Delaere, Fabien; Vaudaine, Sarah; Darmon, Nicole
2017-02-20
Dietary changes needed to achieve nutritional adequacy for 33 nutrients were determined for 1719 adults from a representative French national dietary survey. For each individual, an iso-energy nutritionally adequate diet was generated using diet modeling, staying as close as possible to the observed diet. The French food composition table was completed with free sugar (FS) content. Results were analyzed separately for individuals with FS intakes in their observed diets ≤10% or >10% of their energy intake (named below FS-ACCEPTABLE and FS-EXCESS, respectively). The FS-EXCESS group represented 41% of the total population (average energy intake of 14.2% from FS). Compared with FS-ACCEPTABLE individuals, FS-EXCESS individuals had diets of lower nutritional quality and consumed more energy (2192 vs. 2123 kcal/day), particularly during snacking occasions (258 vs. 131 kcal/day) (all p-values < 0.01). In order to meet nutritional targets, for both FS-ACCEPTABLE and FS-EXCESS individuals, the main dietary changes in optimized diets were significant increases in fresh fruits, starchy foods, water, hot beverages and plain yogurts; and significant decreases in mixed dishes/sandwiches, meat/eggs/fish and cheese. For FS-EXCESS individuals only, the optimization process significantly increased vegetables and significantly decreased sugar-sweetened beverages, sweet products and fruit juices. The diets of French adults with excessive intakes of FS are of lower nutritional quality, but can be optimized via specific dietary changes.
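Diet models of this kind are typically cast as linear programs: minimize the total deviation from the observed diet subject to nutrient-adequacy and iso-energy constraints. A toy sketch with made-up foods, nutrients and targets (not the study's 33-nutrient model):

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: 4 foods x 3 nutrients (per serving) and energy.
N = np.array([[2.0, 0.1, 30.0],
              [0.5, 1.2, 5.0],
              [8.0, 0.3, 2.0],
              [1.0, 0.8, 12.0]])
energy = np.array([150.0, 60.0, 300.0, 90.0])   # kcal per serving
q_obs = np.array([2.0, 3.0, 1.0, 1.5])          # observed servings
n_min = np.array([10.0, 6.0, 60.0])             # adequacy targets

# Variables: q (4 servings) then d (4 deviations |q - q_obs|).
c = np.concatenate([np.zeros(4), np.ones(4)])   # minimize sum of d
A_ub = np.block([[-N.T, np.zeros((3, 4))],      # nutrients >= n_min
                 [np.eye(4), -np.eye(4)],       # q - q_obs <= d
                 [-np.eye(4), -np.eye(4)]])     # q_obs - q <= d
b_ub = np.concatenate([-n_min, q_obs, -q_obs])
A_eq = np.concatenate([energy, np.zeros(4)])[None, :]  # iso-energy
b_eq = [energy @ q_obs]
res = linprog(c, A_ub, b_ub, A_eq, b_eq, bounds=[(0, None)] * 8)
print(res.x[:4])   # optimized servings, as close to q_obs as allowed
```

In this toy instance the observed diet is slightly deficient in the second nutrient, so the solver shifts a small amount of intake toward the nutrient-dense food while holding total energy fixed: the same logic, at toy scale, as the modeled dietary changes above.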
DOE Office of Scientific and Technical Information (OSTI.GOV)
G. Ostrouchov; W. E. Doll; D. A. Wolf
2003-07-01
Unexploded ordnance (UXO) surveys encompass large areas, and the cost of surveying these areas can be high. Experience with earlier protocols for sampling UXO sites has shown the shortcomings of these procedures and led to a call for the development of scientifically defensible statistical procedures for survey design and analysis. This project is one of three funded by SERDP to address this need.
NASA Astrophysics Data System (ADS)
Sorini, D.
2017-04-01
Measuring the clustering of galaxies from surveys allows us to estimate the power spectrum of matter density fluctuations, thus constraining cosmological models. This requires careful modelling of observational effects to avoid misinterpretation of data. In particular, signals coming from different distances encode information from different epochs. This is known as the "light-cone effect" and is going to have a higher impact as upcoming galaxy surveys probe larger redshift ranges. Generalising the method by Feldman, Kaiser and Peacock (1994) [1], I define a minimum-variance estimator of the linear power spectrum at a fixed time, properly taking into account the light-cone effect. An analytic expression for the estimator is provided, and it is consistent with the findings of previous works in the literature. I test the method within the context of the Halofit model, assuming Planck 2014 cosmological parameters [2]. I show that the estimator presented here recovers the fiducial linear power spectrum at present time within 5% accuracy up to k ~ 0.80 h Mpc-1 and within 10% up to k ~ 0.94 h Mpc-1, well into the non-linear regime of the growth of density perturbations. As such, the method could be useful in the analysis of data from future large-scale surveys, like Euclid.
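For reference, the starting point being generalized here is the Feldman-Kaiser-Peacock minimum-variance weighting. In its standard, fixed-time form, the weighted, mean-subtracted galaxy field and its weights are

$$
F(\mathbf{r}) = \frac{w(\mathbf{r})\left[n_g(\mathbf{r}) - \alpha\, n_s(\mathbf{r})\right]}{\left[\int \mathrm{d}^3 r\; \bar{n}^2(\mathbf{r})\, w^2(\mathbf{r})\right]^{1/2}},
\qquad
w(\mathbf{r}) = \frac{1}{1 + \bar{n}(\mathbf{r})\, P_0},
$$

where $n_g$ and $n_s$ are the galaxy and synthetic (random) catalogue densities, $\alpha$ their ratio, $\bar{n}$ the expected mean density, and $P_0$ a fiducial power-spectrum amplitude. The estimator described in the abstract modifies this construction so that the evolution of the field along the light cone is propagated consistently to a fixed-time linear spectrum.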
Hwang, S; Choi, H S; Kim, K M; Rhee, Y; Lim, S K
2015-01-01
The association between 25-hydroxyvitamin D (25(OH)D) levels and bone mineral density (BMD) and proximal femur bone geometry was examined in the Korean population. A positive relationship between skeletal health and 25(OH)D levels was observed. However, there were no significant differences in skeletal health between the groups with 25(OH)D level of 50-75 nmol/L and greater than 75 nmol/L. Vitamin D plays an important role in calcium and phosphate homeostasis and normal mineralization of bone. However, the optimal level of vitamin D for skeletal health has not been clearly established. We analyzed the associations between serum 25(OH)D and BMD and proximal femur bone geometry and determined the optimal 25(OH)D level. This was a cross-sectional study of 10,062 participants (20-95 years, 4,455 men, 5,607 women) in the Fourth Korea National Health and Nutrition Examination Surveys (KNHANES IV) conducted from 2008 to 2009. Participants were divided into groups according to 25(OH)D level (<25, 25-50, 50-75, and ≥75 nmol/L). BMD and proximal femur geometric indices were measured. The group with 25(OH)D levels of 50-75 nmol/L had greater bone density values, with the exception of the lumbar spine, and also had greater femur neck cortical thickness, cross-sectional area, and cross-sectional moment of inertia, as well as a lesser buckling ratio than the groups with 25(OH)D level of 25-50 nmol/L and less than 25 nmol/L. However, there were no significant differences in BMD and proximal femur geometry properties between the groups with 50-75 nmol/L and greater than 75 nmol/L of 25(OH)D. The skeletal outcomes, including BMD and proximal femur geometric indices observed in this study, suggest that serum 25(OH)D levels of 50 to <75 nmol/L are optimal for skeletal health.
NASA Astrophysics Data System (ADS)
Zhu, Zhe; Gallant, Alisa L.; Woodcock, Curtis E.; Pengra, Bruce; Olofsson, Pontus; Loveland, Thomas R.; Jin, Suming; Dahal, Devendra; Yang, Limin; Auch, Roger F.
2016-12-01
The U.S. Geological Survey's Land Change Monitoring, Assessment, and Projection (LCMAP) initiative is a new end-to-end capability to continuously track and characterize changes in land cover, use, and condition to better support research and applications relevant to resource management and environmental change. Among the LCMAP product suite are annual land cover maps that will be available to the public. This paper describes an approach to optimize the selection of training and auxiliary data for deriving the thematic land cover maps based on all available clear observations from Landsats 4-8. Training data were selected from map products of the U.S. Geological Survey's Land Cover Trends project. The Random Forest classifier was applied for different classification scenarios based on the Continuous Change Detection and Classification (CCDC) algorithm. We found that extracting training data proportionally to the occurrence of land cover classes was superior to an equal distribution of training data per class, and suggest using a total of 20,000 training pixels to classify an area about the size of a Landsat scene. The problem of unbalanced training data was alleviated by extracting a minimum of 600 training pixels and a maximum of 8000 training pixels per class. We additionally explored removing outliers contained within the training data based on their spectral and spatial criteria, but observed no significant improvement in classification results. We also tested the importance of different types of auxiliary data that were available for the conterminous United States, including: (a) five variables used by the National Land Cover Database, (b) three variables from the cloud screening "Function of mask" (Fmask) statistics, and (c) two variables from the change detection results of CCDC. We found that auxiliary variables such as a Digital Elevation Model and its derivatives (aspect, position index, and slope), potential wetland index, water probability, snow probability, and cloud probability improved the accuracy of land cover classification. Compared to the original strategy of the CCDC algorithm (500 pixels per class), the use of the optimal strategy improved the classification accuracies substantially (15-percentage point increase in overall accuracy and 4-percentage point increase in minimum accuracy).
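The proportional-with-caps training allocation described above can be written down directly. A small sketch (the rebalancing is simplified to a single pass; the budget and per-class bounds follow the numbers recommended in the abstract):

```python
import numpy as np

def allocate_training(class_counts, total=20000, lo=600, hi=8000):
    """Split a training budget across classes proportionally to their
    occurrence in the map, clamped to [lo, hi] pixels per class."""
    counts = np.asarray(class_counts, dtype=float)
    alloc = np.clip(total * counts / counts.sum(), lo, hi)
    # One rebalancing pass: redistribute budget freed (or consumed)
    # by the clamps across the unclamped classes, then re-clamp.
    free = (alloc > lo) & (alloc < hi)
    if free.any():
        remaining = total - alloc[~free].sum()
        alloc[free] *= remaining / alloc[free].sum()
        alloc = np.clip(alloc, lo, hi)
    return alloc.astype(int)

# Rare classes get the 600-pixel floor, dominant ones hit the cap.
print(allocate_training([5_000_000, 800_000, 40_000, 5_000]))
```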
Emergency department discharge prescription interventions by emergency medicine pharmacists.
Cesarz, Joseph L; Steffenhagen, Aaron L; Svenson, James; Hamedani, Azita G
2013-02-01
We determine the rate and details of interventions associated with emergency medicine pharmacist review of discharge prescriptions for patients discharged from the emergency department (ED). Additionally, we evaluate care providers' satisfaction with such services provided by emergency medicine pharmacists. This was a prospective observational study in the ED of an academic medical center that serves both adult and pediatric patients. Details of emergency medicine pharmacist interventions on discharge prescriptions were compiled with a standardized form. Interventions were categorized as error prevention or optimization of therapy. The staff of the ED was surveyed related to the influence and satisfaction of this new emergency medicine pharmacist-provided service. The 674 discharge prescriptions reviewed by emergency medicine pharmacists during the study period included 602 (89.3%) for adult patients and 72 (10.7%) for pediatric patients. Emergency medicine pharmacists intervened on 68 prescriptions, resulting in an intervention rate of 10.1% (95% confidence interval [CI] 8.0% to 12.7%). The intervention rate was 8.5% (95% CI 6.4% to 11.1%) for adult prescriptions and 23.6% for pediatric prescriptions (95% CI 14.7% to 35.3%) (difference 15.1%; 95% CI 5.1% to 25.2%). There were a similar number of interventions categorized as error prevention and optimization of medication therapy, 37 (54%) and 31 (46%), respectively. More than 95% of survey respondents believed that the new pharmacist services improved patient safety, optimized medication regimens, and improved patient satisfaction. Emergency medicine pharmacist review of discharge prescriptions for discharged ED patients has the potential to significantly improve patient care associated with suboptimal prescriptions and is highly valued by ED care providers. Copyright © 2012. Published by Mosby, Inc.
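The overall intervention rate's confidence interval can be closely reproduced with a standard score interval for a binomial proportion. The sketch below assumes a Wilson-type interval, since the abstract does not state the paper's exact method:

```python
import numpy as np
from scipy.stats import norm

def wilson_ci(k, n, alpha=0.05):
    """Wilson score interval for a binomial proportion k/n."""
    z = norm.ppf(1 - alpha / 2)
    p = k / n
    center = (p + z**2 / (2 * n)) / (1 + z**2 / n)
    half = (z * np.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
            / (1 + z**2 / n))
    return center - half, center + half

print(wilson_ci(68, 674))   # overall rate: ~(0.080, 0.126)
print(wilson_ci(17, 72))    # pediatric: ~(0.15, 0.35), approximately
                            # matching the reported 14.7%-35.3%
```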
A Satellite Survey of Cloud Cover and Water Vapor in the Southwestern USA and Northern Mexico
NASA Astrophysics Data System (ADS)
Carrasco, E.; Avila, R.; Erasmus, A.; Djorgovski, S. G.; Walker, A. R.; Blum, R.
2017-03-01
Cloud cover and water vapor conditions in the southwestern USA and northern Mexico were surveyed as preparatory work for the Thirty Meter Telescope (TMT) in situ site testing program. Although the telescope site has already been selected, the TMT site testing team decided to make these results public given their usefulness to the community. Using 58 months of meteorological satellite observations between 1993 July and 1999 September, different atmospheric parameters were quantified from data in the 10.7 μm and 6.7 μm windows. In particular, cloud cover and water vapor conditions were identified in preferred areas. As a result of the areal analysis, 15 existing and potential telescope sites were selected, compared, and ranked in terms of their observing quality. The clearest sites are located along the spine of the Baja peninsula and into southern California on mountain peaks above the temperature inversion layer. A steep gradient of cloudiness was observed along the coast, where coastal cloud and fog are trapped below the inversion layer. Moving from west to east over the continent, a significant increase in cloudiness was observed. The analysis shows that San Pedro Mártir, San Gorgonio Mountain and San Jacinto Peak have the largest fraction of clear sky conditions (~74%). The site with the optimal combination of clear skies and low precipitable water vapor is Boundary Peak, Nevada. This satellite-based approach provided a reliable method for site comparison.
NASA Astrophysics Data System (ADS)
Tian, Baoqing; Xu, Peifen; Ling, Suqun; Du, Jianguo; Xu, Xueqiu; Pang, Zhonghe
2017-10-01
Geophysical techniques are critical tools in geothermal resource surveys. In recent years, the microtremor survey method, which comprises two branch techniques (the microtremor sounding technique and the two-dimensional (2D) microtremor profiling technique), has become a common method for geothermal resource exploration. The results of microtremor surveys provide important deep information for probing the structures of geothermal basins and investigating heat-controlling structures, as well as providing a basis for siting geothermal wells. In this paper, the southern Jiangsu geothermal resources area is taken as a study example. By comparing microtremor survey results with drilling conclusions, and by analyzing survey effectiveness together with geological and technical factors such as observation radius and sampling frequency, we study the applicability of the microtremor survey method and the optimal way of working with it to achieve better detection results. A comparative study of survey results and geothermal drilling results shows that the microtremor sounding technique effectively distinguishes sub-layers and determines the depth of geothermal reservoirs in areas with favorable layer conditions. The depth error is generally no more than 8% compared with drilling results, and greater depths can be probed by enlarging the observation radius. The 2D microtremor profiling technique accurately locates buried structures, which appear as low-velocity anomalies in the apparent S-wave velocity profile; such anomalies are the key signature used by the technique to identify and interpret buried geothermal structures. 2D microtremor profiling results thus provide an important basis for siting geothermal wells precisely and reducing the risk of drilling dry wells.
Inverse Regional Modeling with Adjoint-Free Technique
NASA Astrophysics Data System (ADS)
Yaremchuk, M.; Martin, P.; Panteleev, G.; Beattie, C.
2016-02-01
The ongoing parallelization trend in computer technologies facilitates the use of ensemble methods in geophysical data assimilation. Of particular interest are ensemble techniques which do not require the development of tangent linear numerical models and their adjoints for optimization. These ``adjoint-free'' methods minimize the cost function within a sequence of subspaces spanned by carefully chosen sets of perturbations of the control variables. In this presentation, an adjoint-free variational technique (a4dVar) is demonstrated in an application estimating the initial conditions of two numerical models: the Navy Coastal Ocean Model (NCOM) and the surface wave model (WAM). With the NCOM, the performance of the adjoint and adjoint-free 4dVar data assimilation techniques is compared in application to the hydrographic surveys and velocity observations collected in the Adriatic Sea in 2006. Numerical experiments have shown that a4dVar is capable of providing forecast skill similar to that of conventional 4dVar at comparable computational expense, while being less susceptible to the excitation of ageostrophic modes that are not supported by observations. The adjoint-free technique constrained by the WAM model is tested in a series of data assimilation experiments with synthetic observations in the southern Chukchi Sea. The types of observations considered are directional spectra estimated from point measurements by stationary buoys, significant wave height (SWH) observations by coastal high-frequency radars, and along-track SWH observations by satellite altimeters. The a4dVar forecast skill is shown to be 30-40% better than the skill of the sequential assimilation method based on optimal interpolation which is currently used in operations. Prospects for further development of a4dVar methods in regional applications are discussed.
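To make the subspace idea concrete, here is a minimal adjoint-free step in Python. The function names, the toy linear "model", and the perturbation handling are illustrative assumptions of this sketch, not the a4dVar implementation used with NCOM or WAM: the cost function is minimized over the coefficients of a few perturbation directions, with extra model runs replacing the adjoint.

```python
import numpy as np

def a4dvar_step(model, x0, obs, perturbations, eps=1e-2):
    """One subspace-minimization step of an adjoint-free 4dVar scheme.

    model(x)      -- maps a control vector (e.g. initial conditions) to
                     the model-equivalent of the observations
    x0            -- current control-vector estimate
    obs           -- observation vector
    perturbations -- (m, n) array: m perturbation directions of length n
    """
    r0 = model(x0) - obs                      # current residual
    # Finite-difference responses replace tangent-linear/adjoint models:
    # column j approximates the model response along direction j.
    responses = np.column_stack([
        (model(x0 + eps * p) - obs - r0) / eps for p in perturbations
    ])
    # Minimize ||r0 + responses @ a||^2 over the subspace coefficients a.
    a, *_ = np.linalg.lstsq(responses, -r0, rcond=None)
    return x0 + perturbations.T @ a

# Toy check: recover the "initial conditions" of a linear model.
rng = np.random.default_rng(0)
A = rng.normal(size=(8, 5))
obs = A @ rng.normal(size=5)
x = np.zeros(5)
for _ in range(3):                            # converges in one step here
    x = a4dvar_step(lambda v: A @ v, x, obs, np.eye(5))
```

In a realistic setting each `model` call is a full nonlinear forecast, so the cost of a step is one run per perturbation direction, which is what makes the method attractive on parallel machines.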
Johnson, R.H.; Poeter, E.P.
2007-01-01
Perchloroethylene (PCE) saturations determined from GPR surveys were used as observations for inversion of multiphase flow simulations of a PCE injection experiment (Borden 9 m cell), allowing for the estimation of optimal bulk intrinsic permeability values. The resulting fit statistics and analysis of residuals (observed minus simulated PCE saturations) were used to improve the conceptual model. These improvements included adjustment of the elevation of a permeability contrast, use of the van Genuchten versus Brooks-Corey capillary pressure-saturation curve, and a weighting scheme to account for greater measurement error with larger saturation values. A limitation in determining PCE saturations through one-dimensional GPR modeling is non-uniqueness when multiple GPR parameters are unknown (i.e., permittivity, depth, and gain function). Site knowledge, fixing the gain function, and multiphase flow simulations assisted in evaluating non-unique conceptual models of PCE saturation, where depth and layering were reinterpreted to provide alternate conceptual models. Remaining bias in the residuals is attributed to the violation of assumptions in the one-dimensional GPR interpretation (which assumes flat, infinite, horizontal layering) resulting from multidimensional influences that were not included in the conceptual model. While the limitations and errors in using GPR data as observations for inverse multiphase flow simulations are frustrating and difficult to quantify, simulation results indicate that the error and bias in the PCE saturation values are small enough to still provide reasonable optimal permeability values. The effort to improve model fit and reduce residual bias decreases simulation error even for an inversion based on biased observations and provides insight into alternate GPR data interpretations. Thus, this effort is warranted and provides information on bias in the observation data when this bias is otherwise difficult to assess. © 2006 Elsevier B.V. All rights reserved.
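The weighting scheme mentioned above amounts to down-weighting residuals where measurement error is larger. A minimal sketch, with an error model whose coefficients are invented purely for illustration (the study does not state them):

```python
import numpy as np

def weighted_sse(observed, simulated, sigma0=0.02, slope=0.1):
    """Weighted sum of squared residuals for PCE-saturation observations.

    Measurement error is assumed to grow with saturation, so each
    residual is normalized by an error model sigma(S) = sigma0 + slope*S.
    sigma0 and slope are illustrative values, not from the study.
    """
    observed = np.asarray(observed)
    sigma = sigma0 + slope * observed
    residuals = (observed - np.asarray(simulated)) / sigma
    return float(np.sum(residuals ** 2))
```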
Optimal galaxy survey for detecting the dipole in the cross-correlation with 21 cm Intensity Mapping
NASA Astrophysics Data System (ADS)
Lepori, Francesca; Di Dio, Enea; Villa, Eleonora; Viel, Matteo
2018-05-01
We investigate the future perspectives for the detection of the relativistic dipole by cross-correlating the 21 cm emission in Intensity Mapping (IM) and galaxy surveys at low redshift. We model the neutral hydrogen (HI) and the galaxy population by means of the halo model to relate the parameters that affect the dipole signal, such as the biases of the two tracers and the Poissonian noise. We investigate the behavior of the signal-to-noise as a function of the galaxy and magnification biases, for two fixed models of the neutral hydrogen. In both cases we find that the signal-to-noise does not grow by increasing the difference between the biases of the two tracers, due to the larger shot noise yielded by highly biased tracers. We also study and provide an optimal luminosity-threshold galaxy catalogue to enhance the signal-to-noise ratio of the relativistic dipole. Interestingly, we show that the maximum magnitude provided by the survey does not lead to the maximum signal-to-noise for detecting relativistic effects, and we predict the optimal value for the limiting magnitude. Our work suggests that an optimal analysis could increase the signal-to-noise ratio by up to a factor of five compared to a standard one.
A survey of compiler optimization techniques
NASA Technical Reports Server (NTRS)
Schneck, P. B.
1972-01-01
Major optimization techniques of compilers are described and grouped into three categories: machine-dependent, architecture-dependent, and architecture-independent. Machine-dependent optimizations tend to be local and are performed upon short spans of generated code, using particular properties of an instruction set to reduce the time or space required by a program. Architecture-dependent optimizations are global and are performed while generating code. These optimizations consider the structure of a computer, but not its detailed instruction set. Architecture-independent optimizations are also global but are based on analysis of the program flow graph and the dependencies among statements of the source program. A conceptual review of a universal optimizer that performs architecture-independent optimizations at source-code level is also presented.
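To make the third category concrete, here is a hedged illustration of one classic architecture-independent optimization, constant folding over a toy three-address intermediate representation. The IR format and the straight-line-code restriction are inventions of this sketch, not from the paper:

```python
import operator

def fold_constants(instructions):
    """Constant-fold a tiny three-address IR.

    Each instruction is (dest, op, arg1, arg2); literals are ints and
    variables are strings. This relies only on def-use information, not
    on any machine or architecture detail, so it belongs to the
    architecture-independent category.
    """
    ops = {"+": operator.add, "-": operator.sub, "*": operator.mul}
    known = {}   # variables proven constant so far (straight-line code)
    folded = []
    for dest, op, a, b in instructions:
        a = known.get(a, a)
        b = known.get(b, b)
        if isinstance(a, int) and isinstance(b, int):
            known[dest] = ops[op](a, b)        # evaluate at compile time
            folded.append((dest, "=", known[dest], None))
        else:
            folded.append((dest, op, a, b))
    return folded

# x = 2 + 3; y = x * 4; z = y + n   becomes   x = 5; y = 20; z = 20 + n
prog = [("x", "+", 2, 3), ("y", "*", "x", 4), ("z", "+", "y", "n")]
print(fold_constants(prog))
```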
Reconciling tensor and scalar observables in G-inflation
NASA Astrophysics Data System (ADS)
Ramírez, Héctor; Passaglia, Samuel; Motohashi, Hayato; Hu, Wayne; Mena, Olga
2018-04-01
The simple m²φ² potential as an inflationary model is coming under increasing tension with limits on the tensor-to-scalar ratio r and measurements of the scalar spectral index ns. Cubic Galileon interactions in the context of the Horndeski action can potentially reconcile the observables. However, we show that this cannot be achieved with only a constant Galileon mass scale, because the interactions turn off too slowly, leading also to gradient instabilities after inflation ends. Allowing for a more rapid transition can reconcile the observables but moderately breaks the slow-roll approximation, leading to a relatively large and negative running of the tilt αs that can be of order ns − 1. We show that the observables on CMB and large-scale structure scales can be predicted accurately using the optimized slow-roll approach instead of the traditional slow-roll expansion. Upper limits on |αs| place a lower bound of r ≳ 0.005 and, conversely, a given r places a lower bound on |αs|, both of which are potentially observable with next-generation CMB and large-scale structure surveys.
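For reference, the tilt and its running follow from standard definitions, shown here with their leading-order expressions in the potential slow-roll parameters ε, η, ξ²; these are textbook relations, not results of this paper:

```latex
n_s - 1 \equiv \frac{d\ln \mathcal{P}_{\zeta}}{d\ln k} \simeq 2\eta - 6\epsilon ,
\qquad
\alpha_s \equiv \frac{d n_s}{d\ln k} \simeq -24\epsilon^2 + 16\epsilon\eta - 2\xi^2 .
```

A running with |αs| comparable to |ns − 1| therefore signals a breakdown of the usual slow-roll hierarchy, which is the regime the rapid Galileon transition induces.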
IDENTIFYING IONIZED REGIONS IN NOISY REDSHIFTED 21 cm DATA SETS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malloy, Matthew; Lidz, Adam, E-mail: mattma@sas.upenn.edu
One of the most promising approaches for studying reionization is to use the redshifted 21 cm line. Early generations of redshifted 21 cm surveys will not, however, have the sensitivity to make detailed maps of the reionization process, and will instead focus on statistical measurements. Here, we show that it may nonetheless be possible to directly identify ionized regions in upcoming data sets by applying suitable filters to the noisy data. The locations of prominent minima in the filtered data correspond well with the positions of ionized regions. In particular, we corrupt semi-numeric simulations of the redshifted 21 cm signal during reionization with thermal noise at the level expected for a 500 antenna tile version of the Murchison Widefield Array (MWA), and mimic the degrading effects of foreground cleaning. Using a matched filter technique, we find that the MWA should be able to directly identify ionized regions despite the large thermal noise. In a plausible fiducial model in which ≈20% of the volume of the universe is neutral at z ≈ 7, we find that a 500-tile MWA may directly identify as many as ≈150 ionized regions in a 6 MHz portion of its survey volume and roughly determine the size of each of these regions. This may, in turn, allow interesting multi-wavelength follow-up observations, comparing galaxy properties inside and outside of ionized regions. We discuss how the optimal configuration of radio antenna tiles for detecting ionized regions with a matched filter technique differs from the optimal design for measuring power spectra. These considerations have potentially important implications for the design of future redshifted 21 cm surveys.
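A one-dimensional sketch of the matched-filter idea: ionized bubbles carry no 21 cm signal, so filtering the noisy field with the expected bubble profile produces prominent minima at their positions. The bubble profile, noise level, and detection threshold below are illustrative placeholders; the actual analysis works on simulated three-dimensional MWA data.

```python
import numpy as np

def matched_filter_minima(signal_1d, template, threshold=-3.0):
    """Flag candidate ionized regions as prominent minima of the
    matched-filtered field, in units of the filtered field's std."""
    filtered = np.convolve(signal_1d, template, mode="same")
    norm = (filtered - filtered.mean()) / filtered.std()
    return np.where(norm < threshold)[0]

# Toy field: uniform 21 cm emission with two "bubbles" plus thermal noise.
rng = np.random.default_rng(1)
field = np.full(2048, 25.0)                 # mK, arbitrary fiducial level
field[400:480] = 0.0                        # ionized region 1
field[1500:1600] = 0.0                      # ionized region 2
field += rng.normal(0.0, 20.0, field.size)  # per-pixel thermal noise

hits = matched_filter_minima(field - field.mean(), np.ones(80))
```

Summing over the ~80-pixel template suppresses the per-pixel noise by roughly the square root of the template width, which is why the bubbles stand out even when individual pixels are noise-dominated.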
HI Intensity Mapping with FAST
NASA Astrophysics Data System (ADS)
Bigot-Sazy, M.-A.; Ma, Y.-Z.; Battye, R. A.; Browne, I. W. A.; Chen, T.; Dickinson, C.; Harper, S.; Maffei, B.; Olivari, L. C.; Wilkinsondagger, P. N.
2016-02-01
We discuss the detectability of large-scale HI intensity fluctuations using the FAST telescope. We present forecasts for the accuracy of measuring the Baryonic Acoustic Oscillations and constraining the properties of dark energy. The FAST 19-beam L-band receivers (1.05-1.45 GHz) can provide constraints on the matter power spectrum and dark energy equation of state parameters (w0, wa) that are comparable to the BINGO and CHIME experiments. For one year of integration time we find that the optimal survey area is 6000 deg². However, observing with larger frequency coverage at higher redshift (0.95-1.35 GHz) improves the projected error bars on the HI power spectrum by more than 2σ. The combined constraints from FAST, CHIME, BINGO and Planck CMB observations can provide reliable, stringent constraints on the dark energy equation of state.
NASA Astrophysics Data System (ADS)
Miyauchi, T.; Machimura, T.
2013-12-01
In simulations using an ecosystem process model, the adjustment of parameters is indispensable for improving the accuracy of prediction. This procedure, however, requires much time and effort to bring the simulation results close to the measurements in models consisting of various ecosystem processes. In this study, we applied a general-purpose optimization tool to the parameter optimization of an ecosystem model, and examined its validity by comparing the simulated and measured biomass growth of a woody plantation. A biometric survey of tree biomass growth was performed in 2009 in an 11-year-old Eucommia ulmoides plantation in Henan Province, China. The climate of the site was dry temperate. Leaf, above- and below-ground woody biomass were measured from three cut trees and converted into carbon mass per area using measured carbon contents and stem density. Yearly woody biomass growth of the plantation was calculated according to allometric relationships determined by tree-ring analysis of seven cut trees. We used Biome-BGC (Thornton, 2002) to reproduce the biomass growth of the plantation. Air temperature and humidity from 1981 to 2010 were used as input climate conditions. The plant functional type was deciduous broadleaf, and non-optimized parameters were left at their defaults. 11-year normal simulations were performed following a spin-up run. In order to select the parameters to optimize, we analyzed the sensitivity of leaf, above- and below-ground woody biomass to the eco-physiological parameters. Following the selection, parameter optimization was performed using the Dakota optimizer. Dakota was developed by Sandia National Laboratories to provide a systematic and rapid means of obtaining optimal designs using simulation-based models. As the objective function, we calculated the sum of relative errors between simulated and measured leaf, above- and below-ground woody carbon at each of the eleven years. In an alternative run, errors at the last year (the year of the field survey) were weighted for priority. We compared several global optimization methods of Dakota, starting with the default parameters of Biome-BGC. The sensitivity analysis showed that the carbon allocation parameters between coarse root and leaf and between stem and leaf, as well as SLA, contributed most to both leaf and woody biomass changes; these parameters were selected for optimization. The measured leaf, above- and below-ground woody biomass carbon densities at the last year were 0.22, 1.81 and 0.86 kgC m-2, respectively, whereas those simulated in the non-optimized control case using all default parameters were 0.12, 2.26 and 0.52 kgC m-2, respectively. After optimizing the parameters, the simulated values improved to 0.19, 1.81 and 0.86 kgC m-2, respectively. The coliny global optimization method gave better fitness than the efficient global and ncsu direct methods. The optimized parameters showed higher carbon allocation rates to coarse roots and leaves and lower SLA than the default parameters, consistent with the general water-physiological response in a dry climate. The simulation using the weighted objective function produced results closer to the measurements at the last year, at the cost of lower fitness during the previous years.
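The objective function described above is simple to state in code. A minimal sketch, in which the array shapes and the form of the final-year weighting are assumptions for illustration:

```python
import numpy as np

def biomass_objective(sim, meas, final_year_weight=1.0):
    """Sum of relative errors between simulated and measured carbon pools.

    sim, meas -- arrays of shape (n_years, 3) holding leaf, above- and
    below-ground woody carbon (kgC m-2) for each year of the record.
    final_year_weight > 1 prioritizes the year of the field survey,
    mirroring the weighted alternative run described in the abstract.
    """
    rel_err = np.abs(sim - meas) / meas          # relative error per pool
    weights = np.ones(sim.shape[0])
    weights[-1] = final_year_weight              # emphasize survey year
    return float((rel_err.sum(axis=1) * weights).sum())
```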
Studying Galaxy Formation with the Hubble, Spitzer and James Webb Space Telescopes
NASA Technical Reports Server (NTRS)
Gardner, Jonathan P.
2009-01-01
The deepest optical to infrared observations of the universe include the Hubble Deep Fields, the Great Observatories Origins Deep Survey and the recent Hubble Ultra-Deep Field. Galaxies are seen in these surveys at redshifts z greater than 6, less than 1 Gyr after the Big Bang, at the end of a period when light from the galaxies has reionized hydrogen in the intergalactic medium. These observations, combined with theoretical understanding, indicate that the first stars and galaxies formed at z greater than 10, beyond the reach of the Hubble and Spitzer Space Telescopes. To observe the first galaxies, NASA is planning the James Webb Space Telescope (JWST), a large (6.5m), cold (less than 50K), infrared-optimized observatory to be launched early in the next decade into orbit around the second Earth-Sun Lagrange point. JWST will have four instruments: The Near-Infrared Camera, the Near-Infrared multi-object Spectrograph, and the Tunable Filter Imager will cover the wavelength range 0.6 to 5 microns, while the Mid-Infrared Instrument will do both imaging and spectroscopy from 5 to 28.5 microns. In addition to JWST's ability to study the formation and evolution of galaxies, I will also briefly review its expected contributions to studies of the formation of stars and planetary systems, and discuss recent progress in constructing the observatory.
Nappi, R E; Krychman, M L
2016-06-01
Vulvar and vaginal atrophy (VVA) is a common complaint in postmenopausal women and consists of a variety of symptoms with strong repercussions that negatively affect comfort during sexual activity and ultimately impact quality of life. The EU and US REVIVE surveys have detected significant barriers in health-care professional management and educational programs that prevent correct diagnosis and effective treatment. This was common to both Europe and the US, but differential behaviors and patterns could be detected after reviewing the published results. The frequency of reporting VVA symptoms was lower among European participants. However, awareness that VVA is a consequence of menopause was better in Europe, probably in relation to more frequent gynecological visits and more frequent referral to specialist health-care professionals. Moreover, a trend towards improved satisfaction with management by the health-care professional was observed in Europe. European participants acknowledged a significantly higher impact of VVA symptoms on sexual intercourse and partner interaction than North American (US) participants, and both cohorts were observed to have differences between their respective VVA symptom profiles. These observations have implications for the overall concerns participants expressed about long-term VVA medication and for the optimal therapeutic approach, providing evidence that opportunities to improve the management of patients with VVA remain unexplored.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Calore, F.; Weniger, C.; Mauro, M. Di
The dense stellar environment of the Galactic center has been proposed to host a large population of as-yet undetected millisecond pulsars (MSPs). Recently, this hypothesis has found support in an analysis of gamma-rays detected using the Large Area Telescope onboard the Fermi satellite, which revealed an excess of diffuse GeV photons in the inner 15 deg about the Galactic center. The excess can be interpreted as the collective emission of thousands of MSPs in the Galactic bulge, with a spherical distribution strongly peaked toward the Galactic center. In order to fully establish the MSP interpretation, it is essential to find corroborating evidence in multi-wavelength searches, most notably through the detection of radio pulsations from individual bulge MSPs. Based on globular cluster observations and gamma-ray emission from the inner Galaxy, we investigate the prospects for detecting MSPs in the Galactic bulge. While previous pulsar surveys failed to identify this population, we demonstrate that upcoming large-area surveys of this region should lead to the detection of dozens of bulge MSPs. Additionally, we show that deep targeted searches of unassociated Fermi sources should be able to detect the first few MSPs in the bulge. The prospects for these deep searches are enhanced by a tentative gamma-ray/radio correlation that we infer from high-latitude gamma-ray MSPs. Such detections would constitute the first clear discoveries of field MSPs in the Galactic bulge, with far-reaching implications for gamma-ray observations, the formation history of the central Milky Way, and strategy optimization for future deep radio pulsar surveys.
NASA Astrophysics Data System (ADS)
Williams, Christina C.; Curtis-Lake, Emma; Hainline, Kevin N.; Chevallard, Jacopo; Robertson, Brant E.; Charlot, Stephane; Endsley, Ryan; Stark, Daniel P.; Willmer, Christopher N. A.; Alberts, Stacey; Amorin, Ricardo; Arribas, Santiago; Baum, Stefi; Bunker, Andrew; Carniani, Stefano; Crandall, Sara; Egami, Eiichi; Eisenstein, Daniel J.; Ferruit, Pierre; Husemann, Bernd; Maseda, Michael V.; Maiolino, Roberto; Rawle, Timothy D.; Rieke, Marcia; Smit, Renske; Tacchella, Sandro; Willott, Chris J.
2018-06-01
We present an original phenomenological model to describe the evolution of galaxy number counts, morphologies, and spectral energy distributions across a wide range of redshifts (0.2 < z < 15) and stellar masses (log(M/M⊙) ≥ 6). Our model follows observed mass and luminosity functions of both star-forming and quiescent galaxies, and reproduces the redshift evolution of colors, sizes, star formation, and chemical properties of the observed galaxy population. Unlike other existing approaches, our model includes a self-consistent treatment of stellar and photoionized gas emission and dust attenuation based on the BEAGLE tool. The mock galaxy catalogs generated with our new model can be used to simulate and optimize extragalactic surveys with future facilities such as the James Webb Space Telescope (JWST), and to enable critical assessments of analysis procedures, interpretation tools, and measurement systematics for both photometric and spectroscopic data. As a first application of this work, we make predictions for the upcoming JWST Advanced Deep Extragalactic Survey (JADES), a joint program of the JWST/NIRCam and NIRSpec Guaranteed Time Observations teams. We show that JADES will detect, with NIRCam imaging, thousands of galaxies at z ≳ 6, and tens at z ≳ 10 at mAB ≲ 30 (5σ) within the 236 arcmin² of the survey. The JADES data will enable accurate constraints on the evolution of the UV luminosity function at z > 8, and resolve the current debate about the rate of evolution of galaxies at z ≳ 8. Ready-to-use mock catalogs and software to generate new realizations are publicly available as the JAdes extraGalactic Ultradeep Artificial Realizations (JAGUAR) package.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kirk, Donnacha; Lahav, Ofer; Bridle, Sarah
The combination of multiple cosmological probes can produce measurements of cosmological parameters much more stringent than those possible with any individual probe. We examine the combination of two highly correlated probes of late-time structure growth: (i) weak gravitational lensing from a survey with photometric redshifts and (ii) galaxy clustering and redshift space distortions from a survey with spectroscopic redshifts. We choose generic survey designs so that our results are applicable to a range of current and future photometric redshift (e.g. KiDS, DES, HSC, Euclid) and spectroscopic redshift (e.g. DESI, 4MOST, Sumire) surveys. Combining the surveys greatly improves their power to measure both dark energy and modified gravity. An independent, non-overlapping combination sees a dark energy figure of merit more than 4 times larger than that produced by either survey alone. The powerful synergies between the surveys are strongest for modified gravity, where their constraints are orthogonal, producing a non-overlapping joint figure of merit nearly 2 orders of magnitude larger than either alone. Our projected angular power spectrum formalism makes it easy to model the cross-correlation observable when the surveys overlap on the sky, producing a joint data vector and full covariance matrix. We calculate a same-sky improvement factor, from the inclusion of these cross-correlations, relative to non-overlapping surveys. We find nearly a factor of 4 for dark energy and more than a factor of 2 for modified gravity. The exact forecast figures of merit and same-sky benefits can be radically affected by a range of forecast assumptions, which we explore methodically in a sensitivity analysis. We show that our fiducial assumptions produce robust results which give a good average picture of the science return from combining photometric and spectroscopic surveys.
NASA Astrophysics Data System (ADS)
Hütsi, Gert; Gilfanov, Marat; Kolodzig, Alexander; Sunyaev, Rashid
2014-12-01
We investigate the potential of large X-ray-selected AGN samples for detecting baryonic acoustic oscillations (BAO). Though AGN selection in the X-ray band is very clean and efficient, it does not provide redshift information, and thus needs to be complemented with an optical follow-up. The main focus of this study is (i) to find the requirements needed for the quality of the optical follow-up and (ii) to formulate the optimal strategy of the X-ray survey, in order to detect the BAO. We demonstrate that a redshift accuracy of σ0 = 10⁻² at z = 1 and a catastrophic failure rate of ffail ≲ 30% are sufficient for a reliable detection of BAO in future X-ray surveys. Spectroscopic-quality redshifts (σ0 = 10⁻³ and ffail ~ 0) will boost the confidence level of the BAO detection by a factor of ~2. For a meaningful detection of BAO, X-ray surveys of moderate depth, Flim ~ a few × 10⁻¹⁵ erg s⁻¹ cm⁻², covering a sky area from a few hundred to ~ten thousand square degrees are required. The optimal strategy for BAO detection does not necessarily require full sky coverage. For example, in a 1000-day survey by an eROSITA-type telescope, an optimal strategy would be to survey a sky area of ~9000 deg², yielding a ~16σ BAO detection. A similar detection will be achieved by ATHENA+ or WFXT class telescopes in a survey with a duration of 100 days, covering a similar sky area. XMM-Newton can achieve a marginal BAO detection in a 100-day survey covering ~400 deg². These surveys would demand a moderate-to-high cost in terms of optical follow-up, requiring determination of redshifts of ~10⁵ (XMM-Newton) to ~3 × 10⁶ objects (eROSITA, ATHENA+, and WFXT) in these sky areas.
Miller, Alicia S.; Shepherd, Gary R.; Fratantoni, Paula S.
2016-01-01
Black sea bass (Centropristis striata) migrations are believed to play a role in overwinter survival and connectivity between juvenile and adult populations. This study investigated oceanographic drivers of winter habitat choice and regional differences between populations of juvenile and adult black sea bass. Trends in cohort strength, as a result of juvenile survival, were also identified. Oceanographic and fisheries survey data were analyzed using generalized additive models. Among the oceanographic variables investigated, salinity was the main driver in habitat selection with an optimal range of 33–35 practical salinity units (PSU) for both juveniles and adults. Preferred temperature ranges varied between juveniles and adults, but held a similar minimum preference of >8°C. Salinity and temperature ranges also differed by regions north and south of Hudson Canyon. Shelf water volume had less of an effect than temperature or salinity, but showed an overall negative relationship with survey catch. The effect of winter conditions on juvenile abundance was also observed across state and federal survey index trends. A lack of correlation observed among surveys in the fall paired with a strong correlation in the spring identifies the winter period as a factor determining year-class strength of new recruits to the population. A rank order analysis of spring indices identified three of the largest year classes occurring during years with reduced shelf water volumes, warmer winter shelf waters, and a 34 PSU isohaline aligned farther inshore. While greater catches of black sea bass in the northwest Atlantic Ocean remain south of Hudson Canyon, the species’ range has expanded north in recent years. PMID:26824350
Lisa D. Jackson; Daniel A. Fieselmann
2011-01-01
The mission of the Cooperative Agricultural Pest Survey (CAPS) program is to provide a survey profile of exotic plant pests in the United States deemed to be of regulatory significance to USDA Animal and Plant Health Inspection Service (APHIS), Plant Protection and Quarantine (PPQ), State Departments of Agriculture, tribal governments, and cooperators by confirming the...
A Survey of Shape Parameterization Techniques
NASA Technical Reports Server (NTRS)
Samareh, Jamshid A.
1999-01-01
This paper provides a survey of shape parameterization techniques for multidisciplinary optimization and highlights some emerging ideas. The survey focuses on the suitability of available techniques for complex configurations, with suitability criteria based on the efficiency, effectiveness, ease of implementation, and availability of analytical sensitivities for geometry and grids. The paper also contains a section on field grid regeneration, grid deformation, and sensitivity analysis techniques.
2015-04-01
This report addresses the capability to conduct airfield surveys outside of a permissive environment and the optimization of the Rapid Raptor Forward Arming and Refueling Point (FARP) concept. The Air Force Rapid Raptor Fighter FARP concept is an initial approach at dispersing air operations.
Tang, Liyang
2013-04-04
The main aim of China's Health Care System Reform was to help the decision maker find the optimal solution to China's institutional problem of health care provider selection. A pilot health care provider research system was recently organized in China's health care system to efficiently collect, from various experts, the data needed to determine the optimal solution to this problem. The purpose of this study was therefore to apply an optimal implementation methodology to help the decision maker effectively translate the views of various experts into optimal solutions, with the support of this pilot system. After the general framework of China's institutional problem of health care provider selection was established, this study collaborated with the National Bureau of Statistics of China to commission a large-scale 2009 to 2010 national expert survey (n = 3,914), conducted for the first time in China through the pilot health care provider research system, and the analytic network process (ANP) implementation methodology was adopted to analyze the survey dataset. The market-oriented health care provider approach was the optimal solution from the doctors' point of view; the traditional government regulation-oriented approach was optimal from the points of view of pharmacists, hospital administrators, and health officials in health administration departments; and the public-private partnership (PPP) approach was optimal from the points of view of nurses, officials in medical insurance agencies, and health care researchers. The data collected through the pilot health care provider research system in the 2009 to 2010 national expert survey can thus help the decision maker effectively translate various experts' views into optimal solutions to China's institutional problem of health care provider selection.
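The abstract names the analytic network process (ANP) without detailing it. As a hedged illustration of the eigenvector prioritization step at the core of ANP (and its special case, AHP), not the authors' actual supermatrix computation, and with a wholly hypothetical comparison matrix:

```python
import numpy as np

def priority_vector(pairwise):
    """Principal-eigenvector prioritization used in AHP/ANP.

    pairwise[i, j] holds how strongly alternative i is preferred to j
    (Saaty's 1-9 scale, with pairwise[j, i] = 1 / pairwise[i, j]).
    Full ANP additionally arranges such blocks into a supermatrix to
    capture dependence between criteria; this shows one block only.
    """
    vals, vecs = np.linalg.eig(pairwise)
    principal = np.real(vecs[:, np.argmax(np.real(vals))])
    return principal / principal.sum()     # normalized weights

# Hypothetical comparison of three provider approaches: market-oriented,
# government regulation-oriented, public-private partnership.
m = np.array([[1.0, 3.0, 2.0],
              [1/3, 1.0, 1/2],
              [1/2, 2.0, 1.0]])
print(priority_vector(m))                  # weights summing to 1
```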
Resources Management Strategy For Mud Crabs (Scylla spp.) In Pemalang Regency
NASA Astrophysics Data System (ADS)
Purnama Fitri, Aristi Dian; Boesono, Herry; Sabdono, Agus; Adlina, Nadia
2017-02-01
The aim of this research is to develop resource management strategies for the mud crab (Scylla spp.) in Pemalang Regency. The method used was a descriptive survey within a case study, drawing on both primary and secondary data. Primary data were collected through field observations and in-depth interviews with key stakeholders; secondary data were collected from related publications and documents issued by the competent institutions. SWOT analysis was used to inventory the strengths, weaknesses, opportunities and threats, and a TOWS matrix was used to develop alternative resource management strategies. The SWOT analysis yielded six alternative strategies that can be applied to optimize fisheries development in Pemalang Regency: control of mud crab fishing gear, restrictions on the allowable size of mud crabs, control of the mud crab fishing season, catch monitoring of mud crabs, a management institution to ensure implementation of the regulations, and the introduction of mud crab aquaculture. These strategies can be combined to optimize resource development in Pemalang Regency.
NASA Astrophysics Data System (ADS)
Sivasundaram, Seenith
2016-07-01
The review paper [1] is devoted to a survey of the different structures that have been developed for the modeling and analysis of various types of fibrosis. Biomathematics, bioinformatics, biomechanics and biophysics modeling are treated by means of a brief description of the different models developed. The review is impressive and clearly written, addressed to a reader interested not only in the theoretical modeling but also in the biological description. The models are described without resorting to technical statements or mathematical equations, thus allowing the non-specialist reader to understand which framework is more suitable at a certain observation scale. The review [1] concludes with the possibility of developing a multiscale approach that also considers the definition of a therapeutic strategy for pathological fibrosis. In particular, the control and optimization of therapeutic action is an important issue, and this article aims to comment on this topic.
Water Cycle Missions for the Next Decade
NASA Astrophysics Data System (ADS)
Houser, P. R.
2013-12-01
The global water cycle describes the circulation of water as a vital and dynamic substance in its liquid, solid, and vapor phases as it moves through the atmosphere, oceans and land. Life in its many forms exists because of water, and modern civilization depends on learning how to live within the constraints imposed by the availability of water. The scientific challenge posed by the need to observe the global water cycle is to integrate in situ and space-borne observations to quantify the key water-cycle state variables and fluxes. The vision to address that challenge is a series of Earth observation missions that will measure the states, stocks, flows, and residence times of water on regional to global scales, followed by a series of coordinated missions that will address the processes, on a global scale, that underlie variability and changes in water in all its three phases. The accompanying societal challenge is to foster the improved use of water data and information as a basis for enlightened management of water resources, to protect life and property from effects of extremes in the water cycle. A major change in thinking about water science that goes beyond its physics to include its role in ecosystems and society is also required. Better water-cycle observations, especially on the continental and global scales, will be essential. Water-cycle predictions need to be readily available globally to reduce loss of life and property caused by water-related natural hazards. Building on the 2007 Earth Science Decadal Survey, NASA's Plan for a Climate-Centric Architecture for Earth Observations and Applications from Space, and the 2012 Chapman Conference on Remote Sensing of the Terrestrial Water Cycle, a workshop was held in April 2013 to gather wisdom and determine how to prepare for the next generation of water cycle missions in support of the second Earth Science Decadal Survey. This talk will present the outcomes of the workshop, including the intersection between science questions, technology readiness and satellite design optimization. A series of next-generation water cycle mission working groups were proposed, and white papers, designed to identify capacity gaps and inform NASA, were developed. The workshop identified several visions for the next decade of water cycle satellite observations, and developed a roadmap and action plan for building the foundation for these missions. Achieving this outcome will result in optimized community investments and better functionality of these future missions, and will help to foster a broader range of scientists and professionals engaged in water cycle observation planning and development around the country, and the world.
NASA Astrophysics Data System (ADS)
Stoeckel, Gerhard P.; Doyle, Keith B.
2017-08-01
The Transiting Exoplanet Survey Satellite (TESS) is an instrument consisting of four wide field-of-view CCD cameras dedicated to the discovery of exoplanets around the brightest stars, and understanding the diversity of planets and planetary systems in our galaxy. Each camera utilizes a seven-element lens assembly with low-power and low-noise CCD electronics. Advanced multivariable optimization and numerical simulation capabilities accommodating arbitrarily complex objective functions have been added to the internally developed Lincoln Laboratory Integrated Modeling and Analysis Software (LLIMAS) and used to assess system performance. Various optical phenomena are accounted for in these analyses, including full dn/dT spatial distributions in lenses and charge diffusion in the CCD electronics. These capabilities are utilized to design CCD shims for thermal vacuum chamber testing and flight, and verify comparable performance in both environments across a range of wavelengths, field points and temperature distributions. Additionally, optimizations and simulations are used for model correlation and robustness optimizations.
The Deep Lens Survey : Real--time Optical Transient and Moving Object Detection
NASA Astrophysics Data System (ADS)
Becker, Andy; Wittman, David; Stubbs, Chris; Dell'Antonio, Ian; Loomba, Dinesh; Schommer, Robert; Tyson, J. Anthony; Margoniner, Vera; DLS Collaboration
2001-12-01
We report on the real-time optical transient program of the Deep Lens Survey (DLS). Meeting the DLS core science weak-lensing objective requires repeated visits to the same part of the sky: 20 visits for 63 sub-fields in 4 filters, on a 4-m telescope. These data are reduced in real-time and differenced against each other on all available timescales. Our observing strategy is optimized to allow sensitivity to transients on several-minute, one-day, one-month, and one-year timescales. The depth of the survey allows us to detect and classify both moving and stationary transients down to ~25th magnitude, a relatively unconstrained region of astronomical variability space. All transients and moving objects, including asteroids, Kuiper belt (or trans-Neptunian) objects, variable stars, supernovae, 'unknown' bursts with no apparent host, orphan gamma-ray burst afterglows, as well as airplanes, are posted on the web in real-time for use by the community. We emphasize our sensitivity to detect and respond in real-time to orphan afterglows of gamma-ray bursts, and present one candidate orphan in the field of Abell 1836. See http://dls.bell-labs.com/transients.html.
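As an illustration of the kind of image differencing such a real-time pipeline performs, here is a toy sketch. It omits the astrometric registration, flux scaling and PSF matching a real pipeline needs, and the injected source and noise levels are invented for the example:

```python
import numpy as np

def find_transients(ref, new, nsigma=5.0):
    """Difference two registered epochs of the same field and flag
    pixels deviating by more than nsigma robust standard deviations."""
    diff = new - ref
    # Robust sigma from the median absolute deviation of the difference.
    sigma = 1.4826 * np.median(np.abs(diff - np.median(diff)))
    return np.argwhere(np.abs(diff) > nsigma * sigma)

rng = np.random.default_rng(2)
ref = rng.normal(100.0, 3.0, size=(256, 256))   # sky + noise, epoch 1
new = ref + rng.normal(0.0, 3.0, size=ref.shape)  # epoch 2
new[128, 64] += 60.0                            # injected transient
print(find_transients(ref, new))                # recovers pixel (128, 64)
```

Running the same subtraction across pairs of epochs separated by minutes, days, months and years is what gives sensitivity across the range of variability timescales the survey targets.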
2012 Workplace and Gender Relations Survey of Active Duty Members: Nonresponse Bias Analysis Report
2014-01-01
The report compares trends in WGRA and SOFS-A response rates (Table 1) and notes related instruments such as command climate surveys (e.g., DEOCS). DMDC draws optimized samples to reduce survey burden on members as well as to produce high levels of precision for important domain estimates. Because paygrade is a significant predictor of survey response (at α = .05), the odds ratios of the paygrade levels were examined.
NASA Astrophysics Data System (ADS)
Wang, Dandan; Zhao, Gong-Bo; Wang, Yuting; Percival, Will J.; Ruggeri, Rossana; Zhu, Fangzhou; Tojeiro, Rita; Myers, Adam D.; Chuang, Chia-Hsun; Baumgarten, Falk; Zhao, Cheng; Gil-Marín, Héctor; Ross, Ashley J.; Burtin, Etienne; Zarrouk, Pauline; Bautista, Julian; Brinkmann, Jonathan; Dawson, Kyle; Brownstein, Joel R.; de la Macorra, Axel; Schneider, Donald P.; Shafieloo, Arman
2018-06-01
We present a measurement of the anisotropic and isotropic Baryon Acoustic Oscillations (BAO) from the extended Baryon Oscillation Spectroscopic Survey Data Release 14 quasar sample with optimal redshift weights. Applying the redshift weights improves the constraint on the BAO dilation parameter α(zeff) by 17 per cent. We reconstruct the evolution history of the BAO distance indicators in the redshift range of 0.8 < z < 2.2. This paper is part of a set that analyses the eBOSS DR14 quasar sample.
The impact of chief executive officer optimism on hospital strategic decision making.
Langabeer, James R; Yao, Emery
2012-01-01
Previous strategic decision making research has focused mostly on the analytical positioning approach, which broadly emphasizes an alignment between rationality and the external environment. In this study, we propose that hospital chief executive optimism (or the general tendency to expect positive future outcomes) will moderate the relationship between a comprehensively rational decision-making process and organizational performance. The purpose of this study was to explore the impact that dispositional optimism has on the well-established relationship between rational decision-making processes and organizational performance. Specifically, we hypothesized that optimism will moderate the relationship between the level of rationality and the organization's performance. We further suggest that this relationship will be more negative for those with high, as opposed to low, optimism. We surveyed 168 hospital CEOs and used moderated hierarchical regression methods to statistically test our hypothesis. On the basis of this survey, we found evidence of a complex interplay of optimism in the rationality-organizational performance relationship. More specifically, we found that the two-way interactions between optimism and rational decision making were negatively associated with performance and that where optimism was the highest, the rationality-performance relationship was the most negative. Executive optimism was positively associated with organizational performance. We also found that greater perceived environmental turbulence, when interacting with optimism, did not have a significant interaction effect on the rationality-performance relationship. These findings suggest potential for broader participation in strategic processes and the use of organizational development techniques that assess executive disposition and traits in recruitment processes, because CEO optimism influences hospital-level processes. Research implications include incorporating greater use of behavior and cognition constructs to better depict decision-making processes in complex organizations like hospitals.
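For readers unfamiliar with moderation analysis, the test reported above boils down to the significance and sign of an interaction term. A minimal sketch with synthetic data; the column names, coefficients and noise are illustrative placeholders, not the study's data:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the CEO survey (n matches the study's sample).
rng = np.random.default_rng(3)
n = 168
df = pd.DataFrame({
    "rationality": rng.normal(0, 1, n),   # mean-centered predictors
    "optimism": rng.normal(0, 1, n),
})
df["performance"] = (0.3 * df.rationality + 0.2 * df.optimism
                     - 0.25 * df.rationality * df.optimism   # moderation
                     + rng.normal(0, 1, n))

# "a * b" in the formula expands to main effects plus the a:b interaction;
# a negative interaction coefficient matches the pattern reported above.
model = smf.ols("performance ~ rationality * optimism", data=df).fit()
print(model.params["rationality:optimism"], model.pvalues["rationality:optimism"])
```

Centering the predictors before forming the product term is the standard precaution against collinearity in such moderated regressions.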
NASA Technical Reports Server (NTRS)
Bolton, Adam S.; Burles, Scott; Koopmans, Leon V. E.; Treu, Tommaso; Moustakas, Leonidas A.
2006-01-01
The Sloan Lens ACS (SLACS) Survey is an efficient Hubble Space Telescope (HST) Snapshot imaging survey for new galaxy-scale strong gravitational lenses. The targeted lens candidates are selected spectroscopically from the Sloan Digital Sky Survey (SDSS) database of galaxy spectra for having multiple nebular emission lines at a redshift significantly higher than that of the SDSS target galaxy. The SLACS survey is optimized to detect bright early-type lens galaxies with faint lensed sources in order to increase the sample of known gravitational lenses suitable for detailed lensing, photometric, and dynamical modeling. In this paper, the first in a series on the current results of our HST Cycle 13 imaging survey, we present a catalog of 19 newly discovered gravitational lenses, along with nine other observed candidate systems that are either possible lenses, nonlenses, or nondetections. The survey efficiency is thus ≥68%. We also present Gemini 8 m and Magellan 6.5 m integral-field spectroscopic data for nine of the SLACS targets, which further support the lensing interpretation. A new method for the effective subtraction of foreground galaxy images to reveal faint background features is presented. We show that the SLACS lens galaxies have colors and ellipticities typical of the spectroscopic parent sample from which they are drawn (SDSS luminous red galaxies and quiescent MAIN sample galaxies), but are somewhat brighter and more centrally concentrated. Several explanations for the latter bias are suggested. The SLACS survey provides the first statistically significant and homogeneously selected sample of bright early-type lens galaxies, furnishing a powerful probe of the structure of early-type galaxies within the half-light radius. The high confirmation rate of lenses in the SLACS survey suggests consideration of spectroscopic lens discovery as an explicit science goal of future spectroscopic galaxy surveys.
AUTONOMOUS GAUSSIAN DECOMPOSITION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lindner, Robert R.; Vera-Ciro, Carlos; Murray, Claire E.
2015-04-15
We present a new algorithm, named Autonomous Gaussian Decomposition (AGD), for automatically decomposing spectra into Gaussian components. AGD uses derivative spectroscopy and machine learning to provide optimized guesses for the number of Gaussian components in the data, and also their locations, widths, and amplitudes. We test AGD and find that it produces results comparable to human-derived solutions on 21 cm absorption spectra from the 21 cm SPectral line Observations of Neutral Gas with the EVLA (21-SPONGE) survey. We use AGD with Monte Carlo methods to derive the HI line completeness as a function of peak optical depth and velocity width for the 21-SPONGE data, and also show that the results of AGD are stable against varying observational noise intensity. The autonomy and computational efficiency of the method over traditional manual Gaussian fits allow for truly unbiased comparisons between observations and simulations, and for the ability to scale up and interpret the very large data volumes from the upcoming Square Kilometer Array and pathfinder telescopes.
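A hedged sketch of the derivative-spectroscopy idea behind AGD: minima of the smoothed second derivative sit near the centers of blended components and seed a least-squares fit of a sum of Gaussians. In AGD the smoothing scale is chosen by a trained model; here it is hand-tuned, and the peak thresholds, synthetic spectrum and initial widths are all illustrative.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d
from scipy.optimize import curve_fit
from scipy.signal import find_peaks

def initial_guesses(x, y, smooth=5.0):
    # Peaks of -d2y/dx2 (smoothed) mark centers of blended components.
    d2 = gaussian_filter1d(y, smooth, order=2)
    peaks, _ = find_peaks(-d2, height=0.3 * (-d2).max(), distance=10)
    return [(y[i], x[i], 2.0) for i in peaks]   # (amplitude, center, width)

def gaussian_sum(x, *params):
    # Sum of Gaussians; params are flattened (amp, center, width) triples.
    y = np.zeros_like(x)
    for amp, mu, sig in zip(*[iter(params)] * 3):
        y += amp * np.exp(-0.5 * ((x - mu) / sig) ** 2)
    return y

x = np.linspace(-20.0, 20.0, 400)
truth = gaussian_sum(x, 1.0, -3.0, 2.0, 0.6, 4.0, 1.5)   # two blended lines
y = truth + np.random.default_rng(4).normal(0.0, 0.01, x.size)

p0 = np.ravel(initial_guesses(x, y))             # autonomous guesses
popt, _ = curve_fit(gaussian_sum, x, y, p0=p0)   # refined decomposition
```

Automating the initial-guess step is what removes the human from the loop: the subsequent least-squares refinement is standard, but its success depends almost entirely on the quality of those guesses.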
Nanosurface design of dental implants for improved cell growth and function
NASA Astrophysics Data System (ADS)
Pan, Hsu-An; Hung, Yao-Ching; Chiou, Jin-Chern; Tai, Shih-Ming; Chen, Hsin-Hung; Huang, G. Steven
2012-08-01
A strategy was proposed for the topological design of dental implants based on an in vitro survey of optimized nanodot structures. An in vitro survey was performed using nanodot arrays with dot diameters ranging from 10 to 200 nm. MG63 osteoblasts were seeded on nanodot arrays and cultured for 3 days. Cell number, percentage undergoing apoptotic-like cell death, cell adhesion and cytoskeletal organization were evaluated. Nanodots with a diameter of approximately 50 nm enhanced cell number by 44%, minimized apoptotic-like cell death to 2.7%, promoted a 30% increase in microfilament bundles and maximized cell adhesion with a 73% increase in focal adhesions. An enhancement of about 50% in mineralization was observed, determined by von Kossa staining and by Alizarin Red S staining. Therefore, we provide a complete range of nanosurfaces for growing osteoblasts to discriminate their nanoscale environment. Nanodot arrays present an opportunity to positively and negatively modulate cell behavior and maturation. Our results suggest a topological approach which is beneficial for the design of dental implants.
Murray, Jessica R.; Svarc, Jerry L.
2017-01-01
The U.S. Geological Survey Earthquake Science Center collects and processes Global Positioning System (GPS) data throughout the western United States to measure crustal deformation related to earthquakes and tectonic processes as part of a long‐term program of research and monitoring. Here, we outline data collection procedures and present the GPS dataset built through repeated temporary deployments since 1992. This dataset consists of observations at ∼1950 locations. In addition, this article details our data processing and analysis procedures, which consist of the following. We process the raw data collected through temporary deployments, in addition to data from continuously operating western U.S. GPS stations operated by multiple agencies, using the GIPSY software package to obtain position time series. Subsequently, we align the positions to a common reference frame, determine the optimal parameters for a temporally correlated noise model, and apply this noise model when carrying out time‐series analysis to derive deformation measures, including constant interseismic velocities, coseismic offsets, and transient postseismic motion.
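A minimal sketch of the kind of trajectory model fitted to such position time series, assuming a constant interseismic velocity, a single coseismic offset and an annual cycle; the postseismic transients and the temporally correlated noise model described above are omitted, and all inputs are hypothetical:

```python
import numpy as np

def fit_trajectory(t, pos, t_eq):
    """Least-squares fit of a simple GPS position time series.

    t    -- epochs in decimal years
    pos  -- position component (e.g. east, in mm)
    t_eq -- earthquake epoch for the coseismic step
    """
    step = (t >= t_eq).astype(float)
    G = np.column_stack([
        np.ones_like(t),            # reference position
        t - t[0],                   # constant interseismic velocity
        step,                       # coseismic offset
        np.sin(2 * np.pi * t),      # annual seasonal terms
        np.cos(2 * np.pi * t),
    ])
    m, *_ = np.linalg.lstsq(G, pos, rcond=None)
    return m    # [offset, velocity mm/yr, coseismic mm, sin amp, cos amp]
```

Using ordinary least squares here implicitly assumes white noise; accounting for the temporally correlated noise the article describes mainly inflates the velocity uncertainties rather than changing the estimates themselves.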
Identifying nearby field T dwarfs in the UKIDSS Galactic Clusters Survey
NASA Astrophysics Data System (ADS)
Lodieu, N.; Burningham, B.; Hambly, N. C.; Pinfield, D. J.
2009-07-01
We present the discovery of two new late-T dwarfs identified in the UKIRT Infrared Deep Sky Survey (UKIDSS) Galactic Clusters Survey (GCS) Data Release 2 (DR2). These T dwarfs are nearby old T dwarfs along the line of sight to star-forming regions and open clusters targeted by the UKIDSS GCS. They are found towards the α Per cluster and the Orion complex, respectively, from a search in 54 deg² surveyed in five filters. Photometric candidates were picked up in two-colour diagrams, in a very similar manner to candidates extracted from the UKIDSS Large Area Survey (LAS) but taking advantage of the Z filter employed by the GCS. Both candidates exhibit near-infrared J-band spectra with strong methane and water absorption bands characteristic of late-T dwarfs. We derive spectral types of T6.5 +/- 0.5 and T7 +/- 1 and estimate photometric distances of less than 50 pc for UGCS J030013.86+490142.5 and UGCS J053022.52-052447.4, respectively. The space density of T dwarfs found in the GCS seems consistent with discoveries in the larger areal coverage of the UKIDSS LAS, indicating one T dwarf per 6-11 deg². The final area surveyed by the GCS, 1000 deg² in five passbands, will allow expansion of the LAS search area by 25 per cent, increase the probability of finding ultracool brown dwarfs, and provide optimal estimates of contamination by old field brown dwarfs in deep surveys to identify such objects in open clusters and star-forming regions. Based on observations made with the United Kingdom Infrared Telescope, operated by the Joint Astronomy Centre on behalf of the U.K. Science Technology and Facility Council. E-mail: nlodieu@iac.es
Photometric Type Ia supernova surveys in narrow-band filters
NASA Astrophysics Data System (ADS)
Xavier, Henrique S.; Abramo, L. Raul; Sako, Masao; Benítez, Narciso; Calvão, Maurício O.; Ederoclite, Alessandro; Marín-Franch, Antonio; Molino, Alberto; Reis, Ribamar R. R.; Siffert, Beatriz B.; Sodré, Laerte.
2014-11-01
We study the characteristics of a narrow-band Type Ia supernova (SN) survey through simulations based on the upcoming Javalambre Physics of the Accelerating Universe Astrophysical Survey. This unique survey has the capabilities of obtaining distances, redshifts and the SN type from a single experiment, thereby circumventing the challenges faced by the resource-intensive spectroscopic follow-up observations. We analyse the flux measurements signal-to-noise ratio and bias, the SN typing performance, the ability to recover light-curve parameters given by the SALT2 model, the photometric redshift precision from Type Ia SN light curves and the effects of systematic errors on the data. We show that such a survey is not only feasible but may yield large Type Ia SN samples (up to 250 SNe at z < 0.5 per month of search) with low core-collapse contamination (˜1.5 per cent), good precision on the SALT2 parameters (average σmB = 0.063, σx1 = 0.47 and σc = 0.040) and on the distance modulus (average σμ = 0.16, assuming an intrinsic scatter σint = 0.14), with identified systematic uncertainties σsys ≲ 0.10σstat. Moreover, the filters are narrow enough to detect most spectral features and obtain excellent photometric redshift precision of σz = 0.005, apart from ˜2 per cent of outliers. We also present a few strategies for optimizing the survey's outcome. Together with the detailed host galaxy information, narrow-band surveys can be very valuable for the study of SN rates, spectral feature relations, intrinsic colour variations and correlations between SN and host galaxy properties, all of which are important information for SN cosmological applications.
A ubiquitous but ineffective intervention: Signs do not increase hand hygiene compliance.
Birnbach, David J; Rosen, Lisa F; Fitzpatrick, Maureen; Everett-Thomas, Ruth; Arheart, Kristopher L
Proper hand hygiene is critical for preventing healthcare-associated infection, but provider compliance remains suboptimal. While signs are commonly used to remind physicians and nurses to perform hand hygiene, the content of these signs is rarely based on specific, validated health behavior theories. This observational study assessed the efficacy of a hand hygiene sign disseminated by the Centers for Disease Control and Prevention in an intensive care unit, compared to an optimized evidence-based sign designed by a team of patient safety experts. The optimized sign was developed by four patient safety experts to include known evidence-based components and was subsequently validated by surveying ten physicians and ten nurses using a 10-point Likert scale. Eighty-two physicians and 98 nurses (102 females; 78 males) were observed for hand hygiene (HH) compliance, and the total HH compliance rate was 16%. HH compliance was not significantly different among the signs (Baseline 10% vs. CDC 18% vs. OIS 20%; p=0.280). The findings of this study suggest that even when the content and design of a hand hygiene reminder sign incorporates evidence-based constructs, healthcare providers comply only a fraction of the time. Copyright © 2016 King Saud Bin Abdulaziz University for Health Sciences. Published by Elsevier Ltd. All rights reserved.
1998-05-01
Modeling Training Site Vegetation Coverage Probability with a Random Optimization Procedure: An Artificial Neural Network Approach, by Biing T. Guan, George Z. Gertner, et al. The study models vegetation coverage based on past coverage; a literature survey was conducted to identify artificial neural network analysis techniques applicable to the problem.
Artificial intelligence for the EChO mission planning tool
NASA Astrophysics Data System (ADS)
Garcia-Piquer, Alvaro; Ribas, Ignasi; Colomé, Josep
2015-12-01
The Exoplanet Characterisation Observatory (EChO) has as its main goal the measurement of atmospheres of transiting planets. This requires the observation of two types of events: primary and secondary eclipses. In order to yield measurements of sufficient signal-to-noise ratio to fulfil the mission objectives, the events of each exoplanet have to be observed several times. In addition, several criteria have to be considered to carry out each observation, such as the exoplanet visibility, its event duration, and no overlapping with other tasks. It is expected that a suitable mission plan increases the efficiency of telescope operation, which will represent an important benefit in terms of scientific return and operational costs. Nevertheless, obtaining a long-term mission plan is unaffordable for human planners, owing to the huge number of possible combinations that must be computed to find an optimal solution. In this contribution we present a long-term mission planning tool based on Genetic Algorithms, which are well suited to optimization problems such as the planning of several tasks. Specifically, the proposed tool finds a solution that highly optimizes the defined objectives, which are based on the maximization of the time spent on scientific observations and the scientific return (e.g., the coverage of the mission survey). The results obtained on a large experimental setup show that the proposed scheduler technology is robust and can function in a variety of scenarios, offering a competitive performance which does not depend on the collection of exoplanets to be observed. Specifically, the results show that, with the proposed tool, EChO uses 94% of the available time of the mission, so the amount of downtime is small, and it completes 98% of the targets.
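As a sketch of how a genetic algorithm can serve as a long-term planner, here is a toy formulation. The slot-based plan encoding, the merit scores and all population settings are assumptions of this example, not the EChO tool's actual design:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical inputs: score[i, s] is the merit of observing target i in
# time slot s (zero when the target is not visible in that slot).
n_targets, n_slots = 40, 120
score = (rng.uniform(0, 1, (n_targets, n_slots))
         * (rng.random((n_targets, n_slots)) > 0.6))

def fitness(plan):
    """plan[s] = target observed in slot s, or -1 for idle. Rewards
    visible-target observations; a full objective would also track
    per-target event counts and overlap constraints."""
    slots = np.arange(n_slots)
    active = plan >= 0
    return score[plan[active], slots[active]].sum()

def evolve(pop_size=60, n_gen=200, p_mut=0.05):
    pop = rng.integers(-1, n_targets, (pop_size, n_slots))
    for _ in range(n_gen):
        fit = np.array([fitness(p) for p in pop])
        # Tournament selection: keep the fitter of two random parents.
        a, b = rng.integers(0, pop_size, (2, pop_size))
        parents = pop[np.where(fit[a] > fit[b], a, b)]
        # One-point crossover between consecutive parents.
        children = parents.copy()
        for i in range(0, pop_size - 1, 2):
            cut = rng.integers(1, n_slots)
            children[i, cut:], children[i + 1, cut:] = (
                parents[i + 1, cut:].copy(), parents[i, cut:].copy())
        # Random mutation of individual slots.
        mask = rng.random(children.shape) < p_mut
        children[mask] = rng.integers(-1, n_targets, mask.sum())
        pop = children
    return pop[np.argmax([fitness(p) for p in pop])]

best_plan = evolve()
```

The appeal of this representation is that hard constraints (visibility, overlaps) can be folded into the fitness function as penalties without changing the search machinery.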
Brazil, Kevin; Galway, Karen; Carter, Gillian; van der Steen, Jenny T
2017-05-01
The European Association for Palliative Care (EAPC) recently issued a framework that defines optimal palliative care in dementia. However, implementation of the guidelines may pose challenges for physicians working with dementia patients in practice. To measure and compare the perceptions of physicians in two European regions regarding the importance and challenges of implementing recommendations for optimal palliative care in dementia patients. Cross-sectional observational study. The Netherlands and the United Kingdom. Physicians (n = 317) providing palliative care to patients with dementia. Postal survey. Physicians in the Netherlands and Northern Ireland (NI), United Kingdom, prioritized the same domains of optimal palliative care for dementia and these match the priorities in the EAPC-endorsed guidelines. Respondents in both countries rated lack of education of professional teams and lack of awareness of the general public among the most important barriers to providing palliative care in dementia. NI respondents also identified access to specialist support as a barrier. The results indicate that there is a strong consensus among experts, elderly care physicians, and general practitioners across a variety of settings in Europe that person-centered care involving optimal communication and shared decision making is the top priority for delivering optimal palliative care in dementia. The current findings both support and enhance the new recommendations ratified by the EAPC. To take forward the implementation of EAPC guidelines for palliative care for dementia, it will be necessary to assess the challenges more thoroughly at a country-specific level and to design and test interventions that may include systemic changes to help physicians overcome such challenges.
Observatorio Astrofísico de Javalambre: observation scheduler and sequencer
NASA Astrophysics Data System (ADS)
Ederoclite, A.; Cristóbal-Hornillos, D.; Moles, M.; Cenarro, A. J.; Marín-Franch, A.; Yanes Díaz, A.; Gruel, N.; Varela, J.; Chueca, S.; Rueda-Teruel, F.; Rueda-Teruel, S.; Luis-Simoes, R.; Hernández-Fuertes, J.; López-Sainz, A.; Chioare Díaz-Martín, M.
2013-05-01
The observational strategy is on the critical path of any large survey. Planning a night requires knowledge of the fields already observed, the quality of the data secured so far, and the fields still to be observed, so that the scientific return can be optimized. In addition, each field's maximum altitude, the sky distance and brightness during the night, and meteorological data (cloud coverage and seeing) have to be taken into account to increase the chance of a successful observation. To support the execution of the J-PAS project at the Javalambre Astrophysical Observatory, we have prepared a scheduler and a sequencer (SCH/SQ) that take all of these parameters into account. The scheduler first selects the fields that can be observed during the night and orders them on the basis of their figure of merit, taking into account the quality and spectral coverage of the existing observations as well as the possibility of obtaining a good observation during the night. The sequencer then takes the meteorological variables into account to prepare the observation queue for the night. During the commissioning of the telescopes at the OAJ, we expect to refine our figures of merit and eventually arrive at a system that can function semi-automatically. This poster describes the design of this software.
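As an illustration of the scheduler's ranking step (with an invented figure of merit, not the actual J-PAS one), the sketch below scores candidate fields and orders them for the night.

```python
import math

# Hypothetical per-field records; the real scheduler uses richer inputs.
fields = [
    {"name": "F001", "max_alt_deg": 75, "moon_sep_deg": 90, "frac_complete": 0.2},
    {"name": "F002", "max_alt_deg": 55, "moon_sep_deg": 40, "frac_complete": 0.8},
    {"name": "F003", "max_alt_deg": 80, "moon_sep_deg": 120, "frac_complete": 0.0},
]

def figure_of_merit(f):
    # Favour high culmination (lower airmass), dark sky (Moon far away),
    # and fields whose spectral coverage is least complete.
    airmass_term = math.sin(math.radians(f["max_alt_deg"]))
    moon_term = min(f["moon_sep_deg"] / 90.0, 1.0)
    need_term = 1.0 - f["frac_complete"]
    return airmass_term * moon_term * need_term

queue = sorted(fields, key=figure_of_merit, reverse=True)
for f in queue:
    print(f["name"], round(figure_of_merit(f), 3))
```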
NASA Astrophysics Data System (ADS)
Giomi, Matteo; Gerard, Lucie; Maier, Gernot
2016-07-01
Variable emission is one of the defining characteristics of active galactic nuclei (AGN). While providing precious information on the nature and physics of the sources, variability is often challenging to observe with time- and field-of-view-limited astronomical observatories such as Imaging Atmospheric Cherenkov Telescopes (IACTs). In this work, we address two questions relevant for the observation of sources characterized by AGN-like variability: what is the most time-efficient way to detect such sources, and what observational bias can be introduced by the choice of observing strategy when conducting blind surveys of the sky. Different observing strategies are evaluated using simulated light curves and realistic instrument response functions of the Cherenkov Telescope Array (CTA), a future gamma-ray observatory. We show that strategies that make use of very small observing windows, spread over large periods of time, allow for faster detection of the source and are less influenced by the variability properties of the sources, compared to strategies that concentrate the observing time in a small number of large observing windows. Although derived using CTA as an example, our conclusions are conceptually valid for any IACT facility and, in general, for all observatories with a small field of view and limited duty cycle.
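To make the windowing trade-off concrete, the toy simulation below (a two-state light curve rather than CTA instrument response functions; all numbers invented) compares many short, widely spaced observing windows against a few long blocks with the same total exposure.

```python
import random

random.seed(1)
HOURS = 5000            # total simulated time span
FLARE_FRAC = 0.05       # source is in a detectable high state 5% of the time

def make_lightcurve():
    """Toy two-state light curve: each 10 h block is high with prob FLARE_FRAC."""
    lc = []
    for _ in range(HOURS // 10):
        lc += [random.random() < FLARE_FRAC] * 10
    return lc

def hours_to_detect(lc, window, spacing):
    """Observe `window` consecutive hours every `spacing` hours; detection
    requires catching the source in its high state for one full window."""
    spent = 0
    for start in range(0, HOURS - window, spacing):
        spent += window
        if all(lc[start:start + window]):
            return spent
    return float("inf")

trials = [make_lightcurve() for _ in range(500)]
for window, spacing in [(1, 50), (10, 500)]:   # same total exposure budget
    times = [hours_to_detect(lc, window, spacing) for lc in trials]
    ok = [t for t in times if t != float("inf")]
    median = sorted(ok)[len(ok) // 2] if ok else "n/a"
    print(f"window={window:2d}h: detected {len(ok)}/500, "
          f"median hours spent = {median}")
```

With the same exposure budget, the many-short-windows strategy visits the source more often and so typically detects the high state after fewer observing hours, in line with the paper's conclusion.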
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nord, B.; Amara, A.; Refregier, A.
The nature of dark matter, dark energy and large-scale gravity poses some of the most pressing questions in cosmology today. These fundamental questions require highly precise measurements, and a number of wide-field spectroscopic survey instruments are being designed to meet this requirement. A key component in these experiments is the development of a simulation tool to forecast science performance, define requirement flow-downs, optimize implementation, demonstrate feasibility, and prepare for exploitation. We present SPOKES (SPectrOscopic KEn Simulation), an end-to-end simulation facility for spectroscopic cosmological surveys designed to address this challenge. SPOKES is based on an integrated infrastructure, modular function organization, coherent data handling and fast data access. These key features allow reproducibility of pipeline runs, enable ease of use and provide flexibility to update functions within the pipeline. The cyclic nature of the pipeline offers the possibility to make the science output an efficient measure for design optimization and feasibility testing. We present the architecture, first science, and computational performance results of the simulation pipeline. The framework is general, but for the benchmark tests we use the Dark Energy Spectrometer (DESpec), one of the early concepts for the upcoming Dark Energy Spectroscopic Instrument (DESI). Finally, we discuss how the SPOKES framework enables a rigorous process to optimize and exploit spectroscopic survey experiments in order to derive high-precision cosmological measurements optimally.
Solis, Armando D
2015-12-01
To reduce complexity, understand generalized rules of protein folding, and facilitate de novo protein design, the 20-letter amino acid alphabet is commonly reduced to a smaller alphabet by clustering amino acids based on some measure of similarity. In this work, we seek the optimal alphabet that preserves as much as possible of the structural information found in long-range (contact) interactions among amino acids in natively folded proteins. We employ the Information Maximization Device, based on information theory, to partition the amino acids into well-defined clusters. Ranging from 2 to 19 groups, these optimal clusters of amino acids, while generated automatically, embody well-known properties of amino acids such as hydrophobicity/polarity, charge, size, and aromaticity, and are demonstrated to maintain the discriminative power of long-range interactions with minimal loss of mutual information. Our measurements suggest that reduced alphabets (of fewer than 10 letters) are able to capture virtually all of the information residing in native contacts and may be sufficient for fold recognition, as demonstrated by extensive threading tests. In an expansive survey of the literature, we observe that alphabets derived from various approaches (including those based on physicochemical intuition, local structure considerations, and sequence alignments of remote homologs) fare consistently well in preserving contact interaction information, highlighting a convergence in the various factors thought to be relevant to the folding code. Moreover, we find that alphabets commonly used in experimental protein design are nearly optimal and are largely consistent with the observations arising in this work. © 2015 Wiley Periodicals, Inc.
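For concreteness, reduced-alphabet encoding itself is a simple mapping. The sketch below uses the classic binary hydrophobic/polar (HP) grouping as a stand-in; this is an illustrative partition, not the information-optimal clusters derived in the paper.

```python
# Illustrative two-letter reduction of the 20 standard amino acids.
REDUCED = {
    "H": "AVILMFWCY",    # broadly hydrophobic/aromatic
    "P": "GSTNQDEKRHP",  # broadly polar/charged/special
}
AA_TO_GROUP = {aa: g for g, members in REDUCED.items() for aa in members}

def reduce_sequence(seq):
    """Map a 20-letter protein sequence onto the reduced alphabet."""
    return "".join(AA_TO_GROUP[aa] for aa in seq)

print(reduce_sequence("MKTAYIAKQR"))  # -> 'HPPHHHHPPP'
```

An optimal k-letter alphabet in the paper's sense would replace this hand-made grouping with the partition that maximizes the retained mutual information between contacting residue pairs.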
Schnelle, John F; Bertrand, Rosanna; Hurd, Donna; White, Alan; Squires, David; Feuerberg, Marvin; Hickey, Kelly; Simmons, Sandra F
2009-10-01
Guidelines written for government surveyors who assess nursing home (NH) compliance with federal standards contain instructions to observe the quality of mealtime assistance. However, these instructions are vague and no protocol is provided for surveyors to record observational data. This study compared government survey staff observations of mealtime assistance quality to observations by research staff using a standardized protocol that met basic standards for accurate behavioral measurement. Survey staff used either the observation instructions in the standard survey process or those written for the revised Quality Improvement Survey (QIS). Trained research staff observed mealtime care in 20 NHs in 5 states during the same time period that survey staff evaluated care in the same facilities, although it could not be determined if survey and research staff observed the same residents during the same meals. Ten NHs were evaluated by government surveyors using the QIS survey instructions and 10 NHs were evaluated by surveyors using the standard survey instructions. Research staff observations using a standardized observation protocol identified a higher proportion of residents receiving inadequate feeding assistance during meals relative to survey staff using either the standard or QIS survey instructions. For example, more than 50% of the residents who ate less than half of their meals based on research staff observation were not offered an alternative to the served meal, and the lack of alternatives, or meal substitutions, was common in all 20 NHs. In comparison, the QIS survey teams documented only 2 instances when meal substitutes were not offered in 10 NHs and the standard survey teams documented no instances in 10 NHs. Standardized mealtime observations by research staff revealed feeding assistance care quality issues in all 20 study NHs. Surveyors following the instructions in either the standard or revised QIS surveys did not detect most of these care quality issues. Survey staff instructions for observation of nutritional care are not clearly written; thus, these instructions do not permit accurate behavioral measurement. These instructions should be revised in consideration of basic principles that guide accurate behavioral measurement and shared with NH providers to enable them to effectively implement quality improvement programs.
Bio-mimic optimization strategies in wireless sensor networks: a survey.
Adnan, Md Akhtaruzzaman; Abdur Razzaque, Mohammd; Ahmed, Ishtiaque; Isnin, Ismail Fauzi
2013-12-24
For the past 20 years, many authors have focused their investigations on wireless sensor networks. Various issues related to wireless sensor networks such as energy minimization (optimization), compression schemes, self-organizing network algorithms, routing protocols, quality of service management, security, energy harvesting, etc., have been extensively explored. The three most important issues among these are energy efficiency, quality of service and security management. To get the best possible results in one or more of these issues in wireless sensor networks, optimization is necessary. Furthermore, in a number of applications (e.g., body area sensor networks, vehicular ad hoc networks) these issues might conflict and require a trade-off amongst them. Due to the high energy consumption and data processing requirements, the use of classical algorithms has historically been disregarded. In this context, contemporary researchers have started using bio-mimetic strategy-based optimization techniques in the field of wireless sensor networks. These techniques are diverse and involve many different optimization algorithms. As far as we know, most existing works tend to focus only on optimization of one specific issue of the three mentioned above. It is high time that these individual efforts are put into perspective and a more holistic view is taken. In this paper we take a step in that direction by presenting a survey of the literature in the area of wireless sensor network optimization, concentrating especially on the three most widely used bio-mimetic algorithms, namely, particle swarm optimization, ant colony optimization and genetic algorithms. In addition, to stimulate new research and development interests in this field, open research issues, challenges and future research directions are highlighted.
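Of the three algorithms surveyed, particle swarm optimization is the simplest to sketch. The following toy implementation minimizes a stand-in objective (in a WSN application this could be an energy-cost model); all parameter values are conventional illustrative choices.

```python
import random

def sphere(x):
    """Toy objective; in a WSN setting this could be expected energy cost."""
    return sum(v * v for v in x)

DIM, N, STEPS = 5, 20, 200
W, C1, C2 = 0.7, 1.5, 1.5   # inertia and acceleration coefficients

pos = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(N)]
vel = [[0.0] * DIM for _ in range(N)]
pbest = [p[:] for p in pos]             # each particle's best-known position
gbest = min(pbest, key=sphere)          # swarm's best-known position

for _ in range(STEPS):
    for i in range(N):
        for d in range(DIM):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (W * vel[i][d]
                         + C1 * r1 * (pbest[i][d] - pos[i][d])
                         + C2 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if sphere(pos[i]) < sphere(pbest[i]):
            pbest[i] = pos[i][:]
    gbest = min(pbest, key=sphere)

print("best value:", sphere(gbest))
```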
How does the cosmic large-scale structure bias the Hubble diagram?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fleury, Pierre; Clarkson, Chris; Maartens, Roy, E-mail: pierre.fleury@uct.ac.za, E-mail: chris.clarkson@qmul.ac.uk, E-mail: roy.maartens@gmail.com
2017-03-01
The Hubble diagram is one of the cornerstones of observational cosmology. It is usually analysed assuming that, on average, the underlying relation between magnitude and redshift matches the prediction of a Friedmann-Lemaître-Robertson-Walker model. However, the inhomogeneity of the Universe generically biases these observables, mainly through peculiar velocities and gravitational lensing, in a way that depends on the notion of average used in theoretical calculations. In this article, we carefully derive the notion of average which corresponds to the observation of the Hubble diagram. We then calculate its bias at second order in cosmological perturbations, and estimate the consequences for the inference of cosmological parameters, for various current and future surveys. We find that this bias deeply affects direct estimations of the evolution of the dark-energy equation of state. However, errors in the standard inference of cosmological parameters remain smaller than observational uncertainties, even though they reach the percent level on some parameters; they reduce to the sub-percent level if an optimal distance indicator is used.
Lukewich, Julia; Mann, Elizabeth; VanDenKerkhof, Elizabeth; Tranmer, Joan
2015-11-01
The aim of this study was to describe chronic pain self-management from the perspective of individuals living with chronic pain in the context of primary care nursing. Self-management is a key chronic pain treatment modality and support for self-managing chronic pain is mainly provided in the context of primary care. Although nurses are optimally suited to facilitate self-management in primary care, there is a need to explore opportunities for optimizing their roles. Two cross-sectional studies. The Chronic Pain Self-Management Survey was conducted in 2011-2012 to explore the epidemiology and self-management of chronic pain in Canadian adults. The questionnaire was distributed to 1504 individuals in Ontario. In 2011, the Primary Care Nursing Roles Survey was distributed to 1911 primary care nurses in Ontario to explore their roles and to determine the extent to which chronic disease management strategies, including support for self-management, were implemented in primary care. Few respondents to the pain survey identified nurses as being the 'most helpful' facilitator of self-management while physicians were most commonly cited. Seventy-six per cent of respondents used medication to manage their chronic pain. Few respondents to the nursing survey worked in practices with specific programmes for individuals with chronic pain. Individuals with chronic pain identified barriers and facilitators to self-managing their pain and nurses identified barriers and facilitators to optimizing their role in primary care. There are several opportunities for primary care practices to facilitate self-management of chronic pain, including the optimization of the primary care nursing role. © 2015 John Wiley & Sons Ltd.
ERIC Educational Resources Information Center
Yoder, Janice D.; Snell, Andrea F.; Tobias, Ann
2012-01-01
To identify a multivariate configuration of feminist beliefs best associated with optimal psychological functioning, 215 mostly White college women completed an online survey measuring their feminist beliefs (Feminist Perspectives Scale, Attitudes toward Feminism and the Women's Movement, sense of common fate, and Feminist Identity Composite) and…
23 CFR 1340.7 - Observation procedures.
Code of Federal Regulations, 2013 CFR
2013-04-01
... OBSERVATIONAL SURVEYS OF SEAT BELT USE Survey Design Requirements § 1340.7 Observation procedures. (a) Data collection dates. All survey data shall be collected through direct observation completely within the...), the survey shall be conducted in accordance to the schedule determined in § 1340.6. (b) Roadway and...
23 CFR 1340.7 - Observation procedures.
Code of Federal Regulations, 2014 CFR
2014-04-01
... OBSERVATIONAL SURVEYS OF SEAT BELT USE Survey Design Requirements § 1340.7 Observation procedures. (a) Data collection dates. All survey data shall be collected through direct observation completely within the...), the survey shall be conducted in accordance to the schedule determined in § 1340.6. (b) Roadway and...
23 CFR 1340.7 - Observation procedures.
Code of Federal Regulations, 2012 CFR
2012-04-01
... OBSERVATIONAL SURVEYS OF SEAT BELT USE Survey Design Requirements § 1340.7 Observation procedures. (a) Data collection dates. All survey data shall be collected through direct observation completely within the...), the survey shall be conducted in accordance to the schedule determined in § 1340.6. (b) Roadway and...
Li, Shuang-Shuang; Pan, Shuo; Ma, Yi-Tong; Yang, Yi-Ning; Ma, Xiang; Li, Xiao-Mei; Fu, Zhen-Yan; Xie, Xiang; Liu, Fen; Chen, You; Chen, Bang-Dang; Yu, Zi-Xiang; He, Chun-Hui; Zheng, Ying-Ying; Abudukeremu, Nuremanguli; Abuzhalihan, Jialin; Wang, Yong-Tao
2014-07-29
The optimal cutoff of the waist-to-hip ratio (WHR) among Han adults in Xinjiang, which is located in the center of Asia, is unknown. We aimed to examine the relationship between different WHRs and cardiovascular risk factors among Han adults in Xinjiang, and determine the optimal cutoff of the WHR. The Cardiovascular Risk Survey was conducted from October 2007 to March 2010. A total of 14618 representative participants were selected using a four-stage stratified sampling method. A total of 5757 Han participants were included in the study. The present statistical analysis was restricted to the 5595 Han subjects who had complete anthropometric data. The sensitivity, specificity, and distance on the receiver operating characteristic (ROC) curve in each WHR level were calculated. The shortest distance in the ROC curves was used to determine the optimal cutoff of the WHR for detecting cardiovascular risk factors. In women, the WHR was positively associated with systolic blood pressure, diastolic blood pressure, and serum concentrations of serum total cholesterol. The prevalence of hypertension and hypertriglyceridemia increased as the WHR increased. The same results were not observed among men. The optimal WHR cutoffs for predicting hypertension, diabetes, dyslipidemia and ≥ two of these risk factors for Han adults in Xinjiang were 0.92, 0.92, 0.91, 0.92 in men and 0.88, 0.89, 0.88, 0.89 in women, respectively. Higher cutoffs for the WHR are required in the identification of Han adults aged ≥ 35 years with a high risk of cardiovascular diseases in Xinjiang.
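The shortest-distance criterion described above has a direct implementation. The sketch below is a toy version with invented data, not the study's dataset: for each candidate WHR cutoff it computes sensitivity and specificity and keeps the cutoff whose ROC point lies closest to the perfect classifier at (0, 1).

```python
import numpy as np

# Hypothetical data: WHR values and binary risk-factor status (1 = present).
whr = np.array([0.82, 0.85, 0.88, 0.90, 0.91, 0.93, 0.95, 0.97])
risk = np.array([0, 0, 0, 1, 0, 1, 1, 1])

best = None
for cut in np.unique(whr):
    pred = (whr >= cut).astype(int)
    tp = np.sum((pred == 1) & (risk == 1))
    fn = np.sum((pred == 0) & (risk == 1))
    tn = np.sum((pred == 0) & (risk == 0))
    fp = np.sum((pred == 1) & (risk == 0))
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    # Distance from this ROC point to the perfect classifier at (0, 1).
    d = np.hypot(1 - sens, 1 - spec)
    if best is None or d < best[0]:
        best = (d, cut, sens, spec)

print("optimal cutoff = %.2f (sens=%.2f, spec=%.2f)" % best[1:])
```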
NASA Astrophysics Data System (ADS)
Diallo, M. S.; Holschneider, M.; Kulesh, M.; Scherbaum, F.; Ohrnberger, M.; Lück, E.
2004-05-01
This contribution is concerned with the estimation of the attenuation and dispersion characteristics of surface waves observed on a shallow seismic record. The analysis is based on an initial parameterization of the phase and attenuation functions, which are then estimated by minimizing a properly defined merit function. To minimize the effect of random noise on the estimates of dispersion and attenuation, we use cross-correlations (in the Fourier domain) of preselected traces from some region of interest along the survey line. These cross-correlations are then expressed in terms of the parameterized attenuation and phase functions and the auto-correlation of the so-called source trace or reference trace. The cross-correlations that enter the optimization are selected so as to provide an average estimate of both the attenuation function and the phase (group) velocity of the area under investigation. The advantage of the method over the standard two-station Fourier technique is that uncertainties related to phase unwrapping and to estimating the number of 2π cycle skips in the phase are eliminated. However, when multiple mode arrivals are observed, it becomes nearly impossible to obtain reliable estimates of the dispersion curves for the different modes using the optimization method alone. To circumvent this limitation, we use the presented approach in conjunction with the wavelet propagation operator (Kulesh et al., 2003), which allows band-pass filtering in the (ω, t) domain to select a particular mode for the minimization. Also, by expressing the cost function in the wavelet domain, the optimization can be performed with respect to the phase, the modulus of the transform, or a combination of both. This flexibility in the design of the cost function provides an additional means of constraining the optimization results. Results from the application of this dispersion and attenuation analysis method are shown for both synthetic and real 2D shallow seismic data sets. M. Kulesh, M. Holschneider, M. S. Diallo, Q. Xie and F. Scherbaum, Modeling of Wave Dispersion Using Wavelet Transform (submitted to Pure and Applied Geophysics).
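As an illustration of the parameterized merit-function minimization described above (with an invented, much simplified parameterization rather than the authors'), the sketch below fits phase-velocity and attenuation parameters to a synthetic two-trace cross-spectrum by nonlinear least squares.

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic example: two traces separated by dx, with dispersive phase
# velocity c(w) = c0 + c1*w and exponential attenuation alpha(w) = a0*w.
dx = 10.0                           # trace separation in metres
w = np.linspace(10, 100, 64)        # angular frequencies (rad/s)
true = dict(c0=200.0, c1=0.5, a0=1e-3)

def model_cross_spectrum(p, w):
    c0, c1, a0 = p
    phase = w * dx / (c0 + c1 * w)  # travel-time phase delay between traces
    amp = np.exp(-a0 * w * dx)      # attenuation between the traces
    return amp * np.exp(-1j * phase)

obs = model_cross_spectrum([true["c0"], true["c1"], true["a0"]], w)
obs *= np.exp(1j * 0.02 * np.random.default_rng(0).standard_normal(w.size))

def residuals(p):
    r = obs - model_cross_spectrum(p, w)
    return np.concatenate([r.real, r.imag])  # least_squares needs real values

fit = least_squares(residuals, x0=[150.0, 1.0, 5e-4])
print("recovered c0, c1, a0:", fit.x)
```

Because the fit acts directly on the complex cross-spectrum, no explicit phase unwrapping is needed, which mirrors the advantage the abstract claims over the two-station Fourier method.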
Cosmic shear bias and calibration in dark energy studies
NASA Astrophysics Data System (ADS)
Taylor, A. N.; Kitching, T. D.
2018-07-01
With the advent of large-scale weak lensing surveys there is a need to understand how realistic, scale-dependent systematics bias cosmic shear and dark energy measurements, and how they can be removed. Here, we show how spatially varying image distortions are convolved with the shear field, mixing convergence E and B modes, and bias the observed shear power spectrum. In practice, many of these biases can be removed by calibration to data or simulations. The uncertainty in this calibration is marginalized over, and we calculate how this propagates into parameter estimation and degrades the dark energy Figure-of-Merit. We find that noise-like biases affect dark energy measurements the most, while spikes in the bias power have the least impact. We argue that, in order to remove systematic biases in cosmic shear surveys and maintain statistical power, effort should be put into improving the accuracy of the bias calibration rather than minimizing the size of the bias. In general, this appears to be a weaker condition for bias removal. We also investigate how to minimize the size of the calibration set for a fixed reduction in the Figure-of-Merit. Our results can be used to correctly model the effect of biases and calibration on a cosmic shear survey, assess their impact on the measurement of modified gravity and dark energy models, and to optimize survey and calibration requirements.
Reducing random measurement error in assessing postural load on the back in epidemiologic surveys.
Burdorf, A
1995-02-01
The goal of this study was to design strategies to assess postural load on the back in occupational epidemiology by taking into account the reliability of measurement methods and the variability of exposure among the workers under study. Intermethod reliability studies were evaluated to estimate the systematic bias (accuracy) and random measurement error (precision) of various methods to assess postural load on the back. Intramethod reliability studies were reviewed to estimate random variability of back load over time. Intermethod surveys have shown that questionnaires have a moderate reliability for gross activities such as sitting, whereas duration of trunk flexion and rotation should be assessed by observation methods or inclinometers. Intramethod surveys indicate that exposure variability can markedly affect the reliability of estimates of back load if the estimates are based upon a single measurement over a certain time period. Equations have been presented to evaluate various study designs according to the reliability of the measurement method, the optimum allocation of the number of repeated measurements per subject, and the number of subjects in the study. Prior to a large epidemiologic study, an exposure-oriented survey should be conducted to evaluate the performance of measurement instruments and to estimate sources of variability for back load. The strategy for assessing back load can be optimized by balancing the number of workers under study and the number of repeated measurements per worker.
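The design trade-off described above (more workers versus more repeated measurements per worker) is usually analyzed with a standard variance-components result; the formula below is that textbook expression, not one quoted from the paper.

```latex
% Reliability of a worker's mean exposure estimated from k repeated
% measurements, with between-worker variance \sigma_B^2 and
% within-worker variance \sigma_W^2:
R(k) = \frac{\sigma_B^2}{\sigma_B^2 + \sigma_W^2 / k}
```

Increasing k drives R(k) towards 1, so for a fixed budget one can trade the number of subjects against the number of repeats per subject to reach a target reliability.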
Operations research methods improve chemotherapy patient appointment scheduling.
Santibáñez, Pablo; Aristizabal, Ruben; Puterman, Martin L; Chow, Vincent S; Huang, Wenhai; Kollmannsberger, Christian; Nordin, Travis; Runzer, Nancy; Tyldesley, Scott
2012-12-01
Clinical complexity, scheduling restrictions, and outdated manual booking processes resulted in frequent clerical rework, long waitlists for treatment, and late appointment notification for patients at a chemotherapy clinic in a large cancer center in British Columbia, Canada. A 17-month study was conducted to address booking, scheduling and workload issues and to develop, implement, and evaluate solutions. A review of scheduling practices included process observation and mapping, analysis of historical appointment data, creation of a new performance metric (final appointment notification lead time), and a baseline patient satisfaction survey. Process improvement involved discrete event simulation to evaluate alternative booking practice scenarios, development of an optimization-based scheduling tool to improve scheduling efficiency, and change management for implementation of process changes. Results were evaluated through analysis of appointment data, a follow-up patient survey, and staff surveys. Process review revealed a two-stage scheduling process. Long waitlists and late notification resulted from an inflexible first-stage process. The second-stage process was time consuming and tedious. After a revised, more flexible first-stage process and an automated second-stage process were implemented, the median percentage of appointments exceeding the final appointment notification lead time target of one week was reduced by 57% and median waitlist size decreased by 83%. Patient surveys confirmed increased satisfaction while staff feedback reported reduced stress levels. Significant operational improvements can be achieved through process redesign combined with operations research methods.
The Automation and Exoplanet Orbital Characterization from the Gemini Planet Imager Exoplanet Survey
NASA Astrophysics Data System (ADS)
Wang, Jason Jinfei; Graham, James; Perrin, Marshall; Pueyo, Laurent; Savransky, Dmitry; Kalas, Paul; Arriaga, Pauline; Chilcote, Jeffrey K.; De Rosa, Robert J.; Ruffio, Jean-Baptiste; Sivaramakrishnan, Anand; Gemini Planet Imager Exoplanet Survey Collaboration
2018-01-01
The Gemini Planet Imager (GPI) Exoplanet Survey (GPIES) is a multi-year 600-star survey to discover and characterize young Jovian exoplanets and their planet-forming environments. For large surveys like GPIES, it is critical to have a uniform dataset processed with the latest techniques and calibrations. I will describe the GPI Data Cruncher, an automated data processing framework that can generate fully reduced data minutes after the data are taken and can also reprocess the entire campaign in a single day on a supercomputer. The Data Cruncher integrates into a larger automated data processing infrastructure which syncs, logs, and displays the data. I will discuss the benefits of the GPIES data infrastructure, including optimizing observing strategies, finding planets, characterizing instrument performance, and constraining giant planet occurrence. I will also discuss my work in characterizing the exoplanets we have imaged in GPIES through monitoring their orbits. Using advanced data processing algorithms and GPI's precise astrometric calibration, I will show that GPI can achieve one-milliarcsecond astrometry on the extensively studied planet Beta Pic b. With GPI, we can confidently rule out a possible transit of Beta Pic b, but we have precise timings for a Hill sphere transit, and I will discuss efforts to search for transiting circumplanetary material this year. I will also discuss the orbital monitoring of other exoplanets as part of GPIES.
Gorouhi, Farzam; Alikhan, Ali; Rezaei, Arash; Fazel, Nasim
2014-01-01
Background. Dermatology residency programs are relatively diverse in their resident selection process. The authors investigated the importance of 25 dermatology residency selection criteria focusing on differences in program directors' (PDs') perception based on specific program demographics. Methods. This cross-sectional nationwide observational survey utilized a 41-item questionnaire that was developed by literature search, brainstorming sessions, and online expert reviews. The data were analyzed utilizing the reliability test, two-step clustering, and K-means methods as well as other methods. The main purpose of this study was to investigate the differences in PDs' perception regarding the importance of the selection criteria based on program demographics. Results. Ninety-five out of 114 PDs (83.3%) responded to the survey. The top five criteria for dermatology residency selection were interview, letters of recommendation, United States Medical Licensing Examination Step I scores, medical school transcripts, and clinical rotations. The following criteria were preferentially ranked based on different program characteristics: “advanced degrees,” “interest in academics,” “reputation of undergraduate and medical school,” “prior unsuccessful attempts to match,” and “number of publications.” Conclusions. Our survey provides up-to-date factual data on dermatology PDs' perception in this regard. Dermatology residency programs may find the reported data useful in further optimizing their residency selection process. PMID:24772165
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sorini, D., E-mail: sorini@mpia-hd.mpg.de
2017-04-01
Measuring the clustering of galaxies from surveys allows us to estimate the power spectrum of matter density fluctuations, thus constraining cosmological models. This requires careful modelling of observational effects to avoid misinterpretation of data. In particular, signals coming from different distances encode information from different epochs. This is known as the 'light-cone effect' and will have a greater impact as upcoming galaxy surveys probe larger redshift ranges. Generalising the method by Feldman, Kaiser and Peacock (1994) [1], I define a minimum-variance estimator of the linear power spectrum at a fixed time, properly taking into account the light-cone effect. An analytic expression for the estimator is provided, and it is consistent with the findings of previous works in the literature. I test the method within the context of the Halofit model, assuming Planck 2014 cosmological parameters [2]. I show that the estimator presented recovers the fiducial linear power spectrum at the present time within 5% accuracy up to k ∼ 0.80 h Mpc⁻¹ and within 10% up to k ∼ 0.94 h Mpc⁻¹, well into the non-linear regime of the growth of density perturbations. As such, the method could be useful in the analysis of data from future large-scale surveys, like Euclid.
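For orientation, the classic minimum-variance weight from the Feldman, Kaiser and Peacock (FKP) method that this work generalizes is the standard result from the cited 1994 paper, not the light-cone expression derived here:

```latex
w(\mathbf{r}) = \frac{1}{1 + \bar{n}(\mathbf{r})\, P(k)}
```

where n̄(r) is the expected galaxy density and P(k) a fiducial power spectrum; the light-cone estimator modifies such weights to account for the redshift evolution of clustering along the line of sight.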
NASA Astrophysics Data System (ADS)
Rubin, D.; Aldering, G.; Barbary, K.; Boone, K.; Chappell, G.; Currie, M.; Deustua, S.; Fagrelius, P.; Fruchter, A.; Hayden, B.; Lidman, C.; Nordin, J.; Perlmutter, S.; Saunders, C.; Sofiatti, C.; Supernova Cosmology Project, The
2015-11-01
While recent supernova (SN) cosmology research has benefited from improved measurements, current analysis approaches are not statistically optimal and will prove insufficient for future surveys. This paper discusses the limitations of current SN cosmological analyses in treating outliers, selection effects, shape- and color-standardization relations, unexplained dispersion, and heterogeneous observations. We present a new Bayesian framework, called UNITY (Unified Nonlinear Inference for Type-Ia cosmologY), that incorporates significant improvements in our ability to confront these effects. We apply the framework to real SN observations and demonstrate smaller statistical and systematic uncertainties. We verify earlier results that SNe Ia require nonlinear shape and color standardizations, but we now include these nonlinear relations in a statistically well-justified way. This analysis was primarily performed blinded, in that the basic framework was first validated on simulated data before transitioning to real data. We also discuss possible extensions of the method.
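For context, the linear shape and color standardization that UNITY generalizes is usually written as the Tripp relation, a standard convention in SN Ia cosmology with SALT2 light-curve parameters (the symbols below are not taken from this abstract):

```latex
\mu = m_B - M_B + \alpha\, x_1 - \beta\, c
```

where μ is the distance modulus, m_B the peak apparent magnitude, x_1 the light-curve shape, c the color, and α, β the fitted standardization coefficients; UNITY replaces the linear α and β terms with statistically well-justified nonlinear relations.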
GALEX: a UV telescope to map the star formation history of the universe
NASA Astrophysics Data System (ADS)
Milliard, Bruno; Grange, Robert; Martin, Christopher; Schiminovich, David
2017-11-01
The NASA Small Mission EXplorer GALEX (PI: C.Martin, Caltech) is under development at JPL for launch late 2001. It has been designed to map the history of star formation in the Universe over the redshift range 0-2, a major era where galaxies and gas content evolved dramatically. The expected depth and imaging quality matches the Palomar Observatory Surveys, allowing GALEX to provide the astronomical community with a database of FUV photometric and spectroscopic observations of several million galaxies in the nearby and distant Universe. The 1.24 degree FOV, 50 cm aperture compact Ritchey-Chrétien telescope is equipped with two 65 mm photon-counting detectors. It will perform several surveys of different coverage and depths, that will take advantage of a high throughput UV-transmissive Grism newly developed in France to easily switch between imagery and field spectroscopy modes. A thin aspherized fused silica dichroic component provides simultaneous observations in two UV bands (135-185 nm and 185-300 nm) as well as correction for field aberrations. We shall briefly present the mission science goals, and will describe the optical concept, along with the guidelines and compromises used for its optimization in the context of the "Faster, Better, Cheaper" NASA philosophy, and give a brief development status report.
Structural optimization: Status and promise
NASA Astrophysics Data System (ADS)
Kamat, Manohar P.
Chapters contained in this book include fundamental concepts of optimum design, mathematical programming methods for constrained optimization, function approximations, approximate reanalysis methods, dual mathematical programming methods for constrained optimization, a generalized optimality criteria method, and a tutorial and survey of multicriteria optimization in engineering. Also included are chapters on the compromise decision support problem and the adaptive linear programming algorithm, sensitivity analyses of discrete and distributed systems, the design sensitivity analysis of nonlinear structures, optimization by decomposition, mixed elements in shape sensitivity analysis of structures based on local criteria, and optimization of stiffened cylindrical shells subjected to destabilizing loads. Other chapters are on applications to fixed-wing aircraft and spacecraft, integrated optimum structural and control design, modeling concurrency in the design of composite structures, and tools for structural optimization. (No individual items are abstracted in this volume)
Optimal placement of tuning masses on truss structures by genetic algorithms
NASA Technical Reports Server (NTRS)
Ponslet, Eric; Haftka, Raphael T.; Cudney, Harley H.
1993-01-01
Optimal placement of tuning masses, actuators and other peripherals on large space structures is a combinatorial optimization problem. This paper surveys several techniques for solving this problem. The genetic algorithm approach to the solution of the placement problem is described in detail. An example of minimizing the difference between the two lowest frequencies of a laboratory truss by adding tuning masses is used for demonstrating some of the advantages of genetic algorithms. The relative efficiencies of different codings are compared using the results of a large number of optimization runs.
Multiobjective optimization in bioinformatics and computational biology.
Handl, Julia; Kell, Douglas B; Knowles, Joshua
2007-01-01
This paper reviews the application of multiobjective optimization in the fields of bioinformatics and computational biology. A survey of existing work, organized by application area, forms the main body of the review, following an introduction to the key concepts in multiobjective optimization. An original contribution of the review is the identification of five distinct "contexts," giving rise to multiple objectives: These are used to explain the reasons behind the use of multiobjective optimization in each application area and also to point the way to potential future uses of the technique.
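The notion underlying all of these applications is Pareto dominance. Below is a minimal sketch, with invented objective vectors, of filtering a candidate set down to its non-dominated front.

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# E.g. trading off model error against model complexity:
candidates = [(0.10, 12), (0.12, 5), (0.08, 20), (0.12, 7), (0.30, 2)]
print(pareto_front(candidates))
# -> [(0.1, 12), (0.12, 5), (0.08, 20), (0.3, 2)]
```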
PyOperators: Operators and solvers for high-performance computing
NASA Astrophysics Data System (ADS)
Chanial, P.; Barbey, N.
2012-12-01
PyOperators is a publicly available library that provides basic operators and solvers for small-to-very large inverse problems ({http://pchanial.github.com/pyoperators}). It forms the backbone of the package PySimulators, which implements specific operators to construct an instrument model and means to conveniently represent a map, a timeline or a time-dependent observation ({http://pchanial.github.com/pysimulators}). Both are part of the Tamasis (Tools for Advanced Map-making, Analysis and SImulations of Submillimeter surveys) toolbox, aiming at providing versatile, reliable, easy-to-use, and optimal map-making tools for Herschel and future generation of sub-mm instruments. The project is a collaboration between 4 institutes (ESO Garching, IAS Orsay, CEA Saclay, Univ. Leiden).
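PyOperators' own API is not reproduced here; purely as a generic illustration of the operator-plus-solver style it implements, the sketch below solves a toy map-making inverse problem with SciPy's LinearOperator and LSQR. All names and sizes are invented for the example.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, lsqr

# Toy "map-making" inverse problem: a timeline y = A m + n, where A points
# each time sample at one of npix map pixels.
rng = np.random.default_rng(0)
npix, nsamp = 50, 2000
pointing = rng.integers(0, npix, size=nsamp)   # pixel hit by each sample
true_map = rng.standard_normal(npix)

def matvec(m):        # A: map -> timeline
    return m[pointing]

def rmatvec(y):       # A^T: timeline -> map (bin samples into pixels)
    return np.bincount(pointing, weights=y, minlength=npix)

A = LinearOperator((nsamp, npix), matvec=matvec, rmatvec=rmatvec)
timeline = A.matvec(true_map) + 0.1 * rng.standard_normal(nsamp)

recovered = lsqr(A, timeline)[0]   # least-squares map estimate
print("max abs error:", np.abs(recovered - true_map).max())
```

The key design idea, shared by operator libraries, is that A is never stored as a dense matrix: only its action (and that of its transpose) is implemented, which is what makes very large inverse problems tractable.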
A Comprehensive Review of Swarm Optimization Algorithms
2015-01-01
Many swarm optimization algorithms have been introduced since the early 1960s, from Evolutionary Programming to the most recent Grey Wolf Optimization. All of these algorithms have demonstrated their potential to solve many optimization problems. This paper provides an in-depth survey of well-known optimization algorithms. Selected algorithms are briefly explained and compared with each other comprehensively through experiments conducted using thirty well-known benchmark functions. Their advantages and disadvantages are also discussed. A number of statistical tests are then carried out to determine the significant performances. The results indicate the overall advantage of Differential Evolution (DE), closely followed by Particle Swarm Optimization (PSO), compared with the other considered approaches. PMID:25992655
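As a companion to the survey's headline finding, the sketch below is a minimal Differential Evolution (DE/rand/1/bin) implementation on a standard benchmark function; the control parameters are conventional choices, not values from the paper.

```python
import math
import random

def rastrigin(x):
    """Standard multimodal benchmark; global minimum 0 at the origin."""
    return 10 * len(x) + sum(v * v - 10 * math.cos(2 * math.pi * v) for v in x)

DIM, NP, F, CR, GENS = 5, 30, 0.8, 0.9, 300
pop = [[random.uniform(-5.12, 5.12) for _ in range(DIM)] for _ in range(NP)]

for _ in range(GENS):
    for i in range(NP):
        r1, r2, r3 = random.sample([j for j in range(NP) if j != i], 3)
        jr = random.randrange(DIM)                   # guaranteed crossover index
        trial = [pop[r1][d] + F * (pop[r2][d] - pop[r3][d])
                 if (random.random() < CR or d == jr) else pop[i][d]
                 for d in range(DIM)]
        if rastrigin(trial) <= rastrigin(pop[i]):    # greedy selection
            pop[i] = trial

print("best value:", min(map(rastrigin, pop)))
```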
The hubris hypothesis: The downside of comparative optimism displays.
Hoorens, Vera; Van Damme, Carolien; Helweg-Larsen, Marie; Sedikides, Constantine
2017-04-01
According to the hubris hypothesis, observers respond more unfavorably to individuals who express their positive self-views comparatively than to those who express their positive self-views non-comparatively, because observers infer that the former hold a more disparaging view of others and particularly of observers. Two experiments extended the hubris hypothesis in the domain of optimism. Observers attributed less warmth (but not less competence) to, and showed less interest in affiliating with, an individual displaying comparative optimism (the belief that one's future will be better than others' future) than with an individual displaying absolute optimism (the belief that one's future will be good). Observers responded differently to individuals displaying comparative versus absolute optimism, because they inferred that the former held a gloomier view of the observers' future. Consistent with previous research, observers still attributed more positive traits to a comparative or absolute optimist than to a comparative or absolute pessimist. Copyright © 2016. Published by Elsevier Inc.
Exploiting EMI Signals During Active Transmission
2010-08-12
[Figure caption residue; recoverable information only:] (Left) fixed-wing airborne EM system (…Surveys); (Center) AeroTEM (AeroQuest Surveys) helicopter-based airborne time-domain EM system; (Right) VTEM (Geotech Ltd.) helicopter-borne AEM system. The surrounding text also mentions systems such as the UTEM (Lamontagne Geophysics) and SPECTREM AEM systems, and notes that Geotech Ltd. uses a complicated waveform which has been optimized to…
unWISE: Unblurred Coadds of the WISE Imaging
NASA Astrophysics Data System (ADS)
Lang, Dustin
2014-05-01
The Wide-field Infrared Survey Explorer (WISE) satellite observed the full sky in four mid-infrared bands in the 2.8-28 μm range. The primary mission was completed in 2010. The WISE team has done a superb job of producing a series of high-quality, well-documented, complete data releases in a timely manner. However, the "Atlas Image" coadds that are part of the recent AllWISE and previous data releases were intentionally blurred. Convolving the images by the point-spread function while coadding results in "matched-filtered" images that are close to optimal for detecting isolated point sources. But these matched-filtered images are sub-optimal or inappropriate for other purposes. For example, we are photometering the WISE images at the locations of sources detected in the Sloan Digital Sky Survey through forward modeling, and this blurring decreases the available signal-to-noise by effectively broadening the point-spread function. This paper presents a new set of coadds of the WISE images that have not been blurred. These images retain the intrinsic resolution of the data and are appropriate for photometry preserving the available signal-to-noise. Users should be cautioned, however, that the W3- and W4-band coadds contain artifacts around large, bright structures (large galaxies, dusty nebulae, etc.); eliminating these artifacts is the subject of ongoing work. These new coadds, and the code used to produce them, are publicly available at http://unwise.me.
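The point about blurring can be made quantitative with the standard matched-filter result (a textbook derivation, not quoted from the paper): for pixel values x_i containing a point source at a known position, PSF values P_i, and white noise, the maximum signal-to-noise linear flux estimate is

```latex
\hat{F} = \frac{\sum_i P_i\, x_i}{\sum_i P_i^{2}}
```

i.e. the value of the PSF-cross-correlated (blurred) image at the source position. This is why the Atlas coadds are near-optimal for detecting isolated point sources, while forward-modeling photometry on already-convolved images effectively applies the PSF twice and loses signal-to-noise.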
MSE observatory: a revised and optimized astronomical facility
NASA Astrophysics Data System (ADS)
Bauman, Steven E.; Angers, Mathieu; Benedict, Tom; Crampton, David; Flagey, Nicolas; Gedig, Mike; Green, Greg; Liu, Andy; Lo, David; Loewen, Nathan; McConnachie, Alan; Murowinski, Rick; Racine, René; Salmon, Derrick; Stiemer, Siegfried; Szeto, Kei; Wu, Di
2016-07-01
The Canada-France-Hawaii-Telescope Corporation (CFHT) plans to repurpose its observatory on the summit of Maunakea and operate a (60 segment) 11.25m aperture wide field spectroscopic survey telescope, the Maunakea Spectroscopic Explorer (MSE). The prime focus telescope will be equipped with dedicated instrumentation to take advantage of one of the best sites in the northern hemisphere and offer its users the ability to perform large surveys. Central themes of the development plan are reusing and upgrading wherever possible. MSE will reuse the CFHT site and build upon the existing observatory infrastructure, using the same building and telescope pier as CFHT, while minimizing environmental impact on the summit. MSE will require structural support upgrades to the building to meet the latest building seismic code requirements and accommodate a new larger telescope and upgraded enclosure. It will be necessary to replace the current dome since a larger slit opening is needed for a larger telescope. MSE will use a thermal management system to remove heat generated by loads from the building, flush excess heat from lower levels, and maintain the observing environment temperature. This paper describes the design approach for redeveloping the CFHT facility for MSE. Once the project is completed the new facility will be almost indistinguishable on the outside from the current CFHT observatory. Past experience and lessons learned from CFHT staff and the astronomical community will be used to create a modern, optimized, and transformative scientific data collecting machine.
ERIC Educational Resources Information Center
Ritter, Lois A., Ed.; Sue, Valerie M., Ed.
2007-01-01
Research regarding the optimal fielding of online surveys is in its infancy and just beginning to offer clear suggestions for effective recruiting of participants as well as techniques for maximizing the response rate. In this article, the authors discuss the process of recruiting participants by e-mailing invitations to a list of recipients…
SLJ's 2011 Technology Survey: Things Are Changing. Fast
ERIC Educational Resources Information Center
Kenney, Brian
2011-01-01
Despite the funding challenges nearly all school libraries face, many media specialists are optimistic about the role of technology in the school library, according to "School Library Journal's" ("SLJ") 2011 Technology Survey. But in spite of the general optimism, others point to some significant obstacles: technological innovations are often…
Barriers to pediatric pain management: a nursing perspective.
Czarnecki, Michelle L; Simon, Katherine; Thompson, Jamie J; Armus, Cheryl L; Hanson, Tom C; Berg, Kristin A; Petrie, Jodie L; Xiang, Qun; Malin, Shelly
2011-09-01
This study describes strategies used by the Joint Clinical Practice Council of Children's Hospital of Wisconsin to identify barriers perceived as interfering with nurses' (RNs) ability to provide optimal pain management. A survey was used to ascertain how nurses described optimal pain management and how much nurses perceived potential barriers as interfering with their ability to provide that level of care. The survey, "Barriers to Optimal Pain management" (adapted from Van Hulle Vincent & Denyes, 2004), was distributed to all RNs working in all patient care settings. Two hundred seventy-two surveys were returned. The five most significant barriers identified were insufficient physician (MD) orders, insufficient MD orders before procedures, insufficient time to premedicate patients before procedures, the perception of a low priority given to pain management by medical staff, and parents' reluctance to have patients receive pain medication. Additional barriers were identified through narrative comments. Information regarding the impact of the Acute Pain Service on patient care, RNs' ability to overcome barriers, and RNs' perception of current pain management practices is included, as are several specific interventions aimed at improving or ultimately eliminating identified barriers. Copyright © 2011 American Society for Pain Management Nursing. Published by Elsevier Inc. All rights reserved.
Morgan, Matthew; Lau, Davina; Jivraj, Tanaz; Principi, Tania; Dietrich, Sandra; Bell, Chaim M
2015-01-01
Email is becoming a widely accepted communication tool in healthcare settings. This study sought to test the feasibility of Internet-based email surveys of patient experience in the ambulatory setting. We conducted a study of email Internet-based surveys sent to patients in selected ambulatory clinics at Mount Sinai Hospital in Toronto, Canada. Our findings suggest that email links to Internet surveys are a feasible, timely and efficient method to solicit patient feedback about their experience. Further research is required to optimally leverage Internet-based email surveys as a tool to better understand the patient experience.
76 FR 18042 - Uniform Criteria for State Observational Surveys of Seat Belt Use
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-01
... [Docket No. NHTSA-2010-0002] RIN 2127-AK41 Uniform Criteria for State Observational Surveys of Seat Belt... designing and conducting State seat belt use observational surveys and the procedures for obtaining NHTSA... use rate, known as the Uniform Criteria for State Observational Surveys of Seat Belt Use (the Uniform...
Wan, Eric Yuk Fai; Fung, Colman Siu Cheung; Wong, Carlos King Ho; Choi, Edmond Pui Hang; Jiao, Fang Fang; Chan, Anca Ka Chun; Chan, Karina Hiu Yen; Lam, Cindy Lo Kuen
2017-02-01
Little is known about how patient-reported outcomes are influenced by the multidisciplinary risk-assessment-and-management programme for patients with diabetes mellitus (RAMP-DM). This paper aims to evaluate the effectiveness of RAMP-DM on patient-reported outcomes. This was a prospective longitudinal study of 1039 diabetes mellitus patients (714 RAMP-DM, 325 non-RAMP-DM) managed in a primary care setting. Of these, 536 and 402 RAMP-DM participants, and 237 and 187 non-RAMP-DM participants, completed the follow-up survey at 12 and 24 months, respectively. Patient-reported outcomes included health-related quality of life (HRQOL), change in global health condition, and patient enablement, measured by the Short Form-12 Health Survey version 2 (SF-12v2), the Global Rating Scale (GRS), and the Patient Enablement Instrument, respectively. The effects of RAMP-DM on patient-reported outcomes were evaluated with mixed-effect models. Subgroup analysis was performed by stratifying haemoglobin A1c (HbA1c) (optimal HbA1c < 7%, suboptimal HbA1c ≥ 7%). RAMP-DM participants with suboptimal HbA1c showed greater improvement in the SF-12v2 physical component summary score at 12 months (coefficient: 3.80; P-value < 0.05) and 24 months (coefficient: 3.82; P-value < 0.05), were more likely to feel more enabled at 12 months (odds ratio: 2.57; P-value < 0.05), and were more likely to have improved on the GRS at 24 months (odds ratio: 4.05; P-value < 0.05), compared to non-RAMP-DM participants. However, there was no significant difference in patient-reported outcomes between RAMP-DM and non-RAMP-DM participants with optimal HbA1c. Participation in RAMP-DM is effective in improving the physical component of HRQOL, the Global Rating Scale, and patient enablement among diabetes mellitus patients with suboptimal HbA1c, but not in those with optimal HbA1c. Patients with suboptimal diabetes mellitus control should be the priority target population for RAMP-DM. This observational study design may carry potential bias in the characteristics between groups, and a randomized clinical trial is needed to confirm the results.
MARVELS 1D Pipeline Development, Optimization, and Performance
NASA Astrophysics Data System (ADS)
Thomas, Neil; Ge, Jian; Grieves, Nolan; Li, Rui; Sithajan, Sirinrat
2016-04-01
We describe the processing pipeline for one-dimensional spectra from the SDSS-III Multi-object APO Radial Velocity Exoplanet Large-area Survey (MARVELS). This medium-resolution interferometric spectroscopic survey observed more than 3300 stars over the course of four years, with the primary goal of detecting and characterizing giant planets (>0.5 M_Jup) within a large, homogeneous sample of FGK stars. The successful extraction of radial velocities (RVs) from MARVELS is complicated by several instrument effects. The wide-field nature of this multi-object spectrograph produces spectra that are initially distorted and require conditioning of the raw images for precise RV extraction. Also, the simultaneous observation of sixty stars per exposure leads to several effects not typically seen in a single-object instrument. For instance, fiber illumination changes over time can easily become the dominant source of RV measurement error when these changes differ between the stellar and calibration optical paths. We present a method for statistically quantifying these instrument effects to combat the difficulty of giant planet detection caused by systematic RV errors. We also present an overview of the performance of the entire survey as of SDSS-III Data Release 12, as well as key results from the very latest improvements. This includes a novel technique, called lucky RV, by which stable regions of spectra can be statistically determined and emphasized during RV extraction, leading to a large reduction of the long-term RV offsets in the MARVELS data. These improved RV data are to be released via the NASA Exoplanet Archive in the fall of 2015.
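The abstract does not spell out the lucky RV algorithm; purely as an illustration of the stated idea (statistically identifying stable spectral regions and emphasizing them), the sketch below weights per-chunk radial velocities by their inverse empirical variance. The data, chunk structure, and weighting scheme are invented for the example.

```python
import numpy as np

# rv_chunks[i, j]: RV measured from spectral chunk j at epoch i (m/s).
# Hypothetical toy data; not the MARVELS procedure itself.
rng = np.random.default_rng(1)
n_epochs, n_chunks = 40, 12
stability = rng.uniform(5, 80, size=n_chunks)          # per-chunk RV scatter
rv_chunks = rng.normal(0.0, stability, size=(n_epochs, n_chunks))

scatter = rv_chunks.std(axis=0)          # empirical per-chunk stability
weights = 1.0 / scatter**2               # emphasize the stable ("lucky") chunks
rv = (rv_chunks * weights).sum(axis=1) / weights.sum()

print("unweighted epoch RMS:", rv_chunks.mean(axis=1).std())
print("weighted epoch RMS:  ", rv.std())
```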
The GALAH survey: chemical tagging of star clusters and new members in the Pleiades
NASA Astrophysics Data System (ADS)
Kos, Janez; Bland-Hawthorn, Joss; Freeman, Ken; Buder, Sven; Traven, Gregor; De Silva, Gayandhi M.; Sharma, Sanjib; Asplund, Martin; Duong, Ly; Lin, Jane; Lind, Karin; Martell, Sarah; Simpson, Jeffrey D.; Stello, Dennis; Zucker, Daniel B.; Zwitter, Tomaž; Anguiano, Borja; Da Costa, Gary; D'Orazi, Valentina; Horner, Jonathan; Kafle, Prajwal R.; Lewis, Geraint; Munari, Ulisse; Nataf, David M.; Ness, Melissa; Reid, Warren; Schlesinger, Katie; Ting, Yuan-Sen; Wyse, Rosemary
2018-02-01
The technique of chemical tagging uses the elemental abundances of stellar atmospheres to 'reconstruct' chemically homogeneous star clusters that have long since dispersed. The GALAH spectroscopic survey - which aims to observe one million stars using the Anglo-Australian Telescope - allows us to measure up to 30 elements or dimensions in the stellar chemical abundance space, many of which are not independent. How to find clustering reliably in a noisy high-dimensional space is a difficult problem that remains largely unsolved. Here, we explore t-distributed stochastic neighbour embedding (t-SNE), which identifies an optimal mapping of a high-dimensional space into fewer dimensions whilst conserving the original clustering information. Typically, the projection is made to a 2D space to aid recognition of clusters by eye. We show that this method is a reliable tool for chemical tagging because it can: (i) resolve clustering in chemical space alone, (ii) recover known open and globular clusters with high efficiency and low contamination, and (iii) relate field stars to known clusters. t-SNE also provides a useful visualization of a high-dimensional space. We demonstrate the method on a data set of 13 abundances measured in the spectra of 187 000 stars by the GALAH survey. We recover seven of the nine observed clusters (six globular and three open clusters) in chemical space with minimal contamination from field stars and low numbers of outliers. With chemical tagging, we also identify two Pleiades supercluster members (which we confirm kinematically), one as far as 6° (one tidal radius) from the cluster centre.
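As a minimal illustration of the projection step described above (not the GALAH pipeline itself), the sketch below embeds a toy 13-dimensional abundance table into 2D with scikit-learn's t-SNE; the clusters and field population are simulated.

```python
import numpy as np
from sklearn.manifold import TSNE

# Toy abundance table: rows = stars, columns = abundance dimensions.
# Two fake chemically homogeneous "clusters" plus a diffuse field population.
rng = np.random.default_rng(0)
cluster_a = rng.normal(+0.2, 0.02, size=(100, 13))
cluster_b = rng.normal(-0.1, 0.02, size=(80, 13))
field = rng.normal(0.0, 0.15, size=(500, 13))
X = np.vstack([cluster_a, cluster_b, field])

# Project the 13-D chemical space to 2-D while preserving local structure.
emb = TSNE(n_components=2, perplexity=30, init="pca",
           random_state=0).fit_transform(X)
print(emb.shape)   # (680, 2); the clusters appear as tight islands in the plane
```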
Adaptive sampling in behavioral surveys.
Thompson, S K
1997-01-01
Studies of populations such as drug users encounter difficulties because the members of the populations are rare, hidden, or hard to reach. Conventionally designed large-scale surveys detect relatively few members of the populations so that estimates of population characteristics have high uncertainty. Ethnographic studies, on the other hand, reach suitable numbers of individuals only through the use of link-tracing, chain referral, or snowball sampling procedures that often leave the investigators unable to make inferences from their sample to the hidden population as a whole. In adaptive sampling, the procedure for selecting people or other units to be in the sample depends on variables of interest observed during the survey, so the design adapts to the population as encountered. For example, when self-reported drug use is found among members of the sample, sampling effort may be increased in nearby areas. Types of adaptive sampling designs include ordinary sequential sampling, adaptive allocation in stratified sampling, adaptive cluster sampling, and optimal model-based designs. Graph sampling refers to situations with nodes (for example, people) connected by edges (such as social links or geographic proximity). An initial sample of nodes or edges is selected and edges are subsequently followed to bring other nodes into the sample. Graph sampling designs include network sampling, snowball sampling, link-tracing, chain referral, and adaptive cluster sampling. A graph sampling design is adaptive if the decision to include linked nodes depends on variables of interest observed on nodes already in the sample. Adjustment methods for nonsampling errors such as imperfect detection of drug users in the sample apply to adaptive as well as conventional designs.
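As an illustration of adaptive cluster sampling as described above, the toy sketch below samples cells of a grid at random and, whenever the observed count meets the adaptive condition, adds the neighbouring cells to the sample; the population, threshold, and sample sizes are invented.

```python
import random

random.seed(3)
N = 20                                   # N x N study region
# Sparse, clumped population: a few hotspots with many individuals.
counts = [[0] * N for _ in range(N)]
for cx, cy in [(4, 5), (14, 12)]:
    for _ in range(40):
        x = min(max(cx + random.randint(-1, 1), 0), N - 1)
        y = min(max(cy + random.randint(-1, 1), 0), N - 1)
        counts[x][y] += 1

THRESHOLD = 1                            # adaptive condition: any individuals
sampled, frontier = set(), set()
while len(frontier) < 30:                # initial simple random sample of cells
    frontier.add((random.randrange(N), random.randrange(N)))

while frontier:
    cell = frontier.pop()
    if cell in sampled:
        continue
    sampled.add(cell)
    x, y = cell
    if counts[x][y] >= THRESHOLD:        # condition met: add the neighbours
        for dx, dy in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
            nx, ny = x + dx, y + dy
            if 0 <= nx < N and 0 <= ny < N:
                frontier.add((nx, ny))

found = sum(counts[x][y] for x, y in sampled)
print(f"sampled {len(sampled)} cells, found {found} individuals")
```

Unbiased estimation under such a design requires weights that account for the expansion rule (e.g. Horvitz-Thompson-type estimators), which is the statistical machinery the adaptive sampling literature develops.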
2013-01-01
Background: The main aim of China's Health Care System Reform was to help the decision maker find the optimal solution to China's institutional problem of health care provider selection. A pilot health care provider research system was recently organized within China's health care system that can efficiently collect, from a wide range of experts, the data needed to determine such a solution. The purpose of this study was to apply an optimal implementation methodology to help the decision maker effectively promote the various experts' views into optimal solutions to this problem, with the support of this pilot system. Methods: After establishing the general framework of China's institutional problem of health care provider selection, this study collaborated with the National Bureau of Statistics of China to commission a large-scale 2009-2010 national expert survey (n = 3,914), conducted for the first time in China through the pilot health care provider research system, and adopted the analytic network process (ANP) implementation methodology to analyze the resulting dataset. Results: The market-oriented health care provider approach was the optimal solution from the doctors' point of view; the traditional government's regulation-oriented approach was optimal from the points of view of pharmacists, hospital administrators, and health officials in health administration departments; and the public-private partnership (PPP) approach was optimal from the points of view of nurses, officials in medical insurance agencies, and health care researchers. Conclusions: The data collected through the pilot health care provider research system in the 2009-2010 national expert survey can help the decision maker effectively promote the various experts' views into optimal solutions to China's institutional problem of health care provider selection. PMID:23557082
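The analytic network process mentioned above extends the analytic hierarchy process (AHP) with interdependencies between criteria, aggregated through a supermatrix. As a minimal illustration of the shared core step, priority extraction, the sketch below computes Saaty-style priority weights as the principal eigenvector of a toy pairwise-comparison matrix; the judgments are invented and the full ANP supermatrix is omitted.

```python
import numpy as np

# Toy pairwise-comparison matrix for three alternatives (Saaty 1-9 scale).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                # priority weights, sum to 1
print("priorities:", np.round(w, 3))

# Saaty's consistency index for the judgments (0 = perfectly consistent):
ci = (eigvals.real[k] - len(A)) / (len(A) - 1)
print("consistency index:", round(ci, 4))
```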
A Perfect View of Vesta: Creating Pointing Observations for the Dawn Spacecraft on Asteroid 4 Vesta
NASA Technical Reports Server (NTRS)
Hay, Katrina M.
2005-01-01
The Dawn spacecraft has a timely and clever assignment in store. It will take a close look at two intact survivors from the dawn of the solar system (asteroids 4 Vesta and 1 Ceres) to understand more about solar system origin and evolution. To optimize science return, Dawn must make carefully designed observations on approach and in survey orbit, high altitude mapping orbit, and low altitude mapping orbit at each body. In this report, observations outlined in the science plan are modeled using the science opportunity analyzer program for the Vesta encounter. Specifically, I encoded Dawn's flight rules into the program, modeled pointing profiles of the optical instruments (framing camera, visible infrared spectrometer) and mapped their fields of view onto Vesta's surface. Visualization of coverage will provide the science team with information necessary to assess feasibility of alternative observation plans. Dawn launches in summer 2006 and ends its journey in 2016. Instrument observations on Vesta in 2011 will supply detailed information about Vesta's surface and internal structure. These data will be used to analyze the formation and history of the protoplanet and, therefore, complete an important step in understanding the development of our solar system.
Barriers to Quality Care for Dying Patients in Rural Communities
ERIC Educational Resources Information Center
Van Vorst, Rebecca F.; Crane, Lori A.; Barton, Phoebe Lindsey; Kutner, Jean S.; Kallail, K. James; Westfall, John M.
2006-01-01
Context: Barriers to providing optimal palliative care in rural communities are not well understood. Purpose: To identify health care personnel's perceptions of the care provided to dying patients in rural Kansas and Colorado and to identify barriers to providing optimal care. Methods: An anonymous self-administered survey was sent to health care…
NASA Astrophysics Data System (ADS)
Dobson, B.; Pianosi, F.; Wagener, T.
2016-12-01
Extensive scientific literature exists on the study of how operation decisions in water resource systems can be made more effectively through the use of optimization methods. However, to the best of the authors' knowledge, there is little in the literature on the implementation of these optimization methods by practitioners. We have performed a survey among UK reservoir operators to assess the current state of method implementation in practice. We also ask questions to assess the potential for implementation of operation optimization. This will help academics to target industry in their current research, identify any misconceptions in industry about the area and open new branches of research for which there is an unsatisfied demand. The UK is a good case study because the regulatory framework is changing to impose "no build" solutions for supply issues, as well as planning across entire water resource systems rather than individual components. Additionally there is a high appetite for efficiency due to the water industry's privatization and most operators are part of companies that control multiple water resources, increasing the potential for cooperation and coordination.
Bio-Mimic Optimization Strategies in Wireless Sensor Networks: A Survey
Adnan, Md. Akhtaruzzaman; Razzaque, Mohammd Abdur; Ahmed, Ishtiaque; Isnin, Ismail Fauzi
2014-01-01
For the past 20 years, many authors have focused their investigations on wireless sensor networks. Various issues related to wireless sensor networks such as energy minimization (optimization), compression schemes, self-organizing network algorithms, routing protocols, quality of service management, security, energy harvesting, etc., have been extensively explored. The three most important issues among these are energy efficiency, quality of service and security management. To get the best possible results for one or more of these issues in wireless sensor networks, optimization is necessary. Furthermore, in a number of applications (e.g., body area sensor networks, vehicular ad hoc networks) these issues might conflict and require a trade-off amongst them. Due to the high energy consumption and data processing requirements, the use of classical algorithms has historically been disregarded. In this context, contemporary researchers have started using bio-mimetic strategy-based optimization techniques in the field of wireless sensor networks. These techniques are diverse and involve many different optimization algorithms. As far as we know, most existing works tend to focus only on optimization of one specific issue of the three mentioned above. It is high time that these individual efforts are put into perspective and a more holistic view is taken. In this paper we take a step in that direction by presenting a survey of the literature in the area of wireless sensor network optimization, concentrating especially on the three most widely used bio-mimetic algorithms, namely particle swarm optimization, ant colony optimization and genetic algorithms. In addition, to stimulate new research and development interests in this field, open research issues, challenges and future research directions are highlighted. PMID:24368702
An ACOR-Based Multi-Objective WSN Deployment Example for Lunar Surveying.
López-Matencio, Pablo
2016-02-06
Wireless sensor networks (WSNs) can gather in situ real data measurements and work unattended for long periods, even in remote, rough places. A critical aspect of WSN design is node placement, as this determines sensing capacities, network connectivity, network lifetime and, in short, the whole operational capabilities of the WSN. This paper proposes and studies a new node placement algorithm that focuses on these aspects. As a motivating example, we consider a network designed to describe the distribution of helium-3 (³He), a potential enabling element for fusion reactors, on the Moon. ³He is abundant on the Moon's surface, and knowledge of its distribution is essential for future harvesting purposes. Previous data are inconclusive, and there is general agreement that on-site measurements, obtained over a long time period, are necessary to better understand the mechanisms involved in the distribution of this element on the Moon. Although a mission of this type is extremely complex, it allows us to illustrate the main challenges involved in a multi-objective WSN placement problem, i.e., selection of optimal observation sites and maximization of the lifetime of the network. To tackle optimization, we use a recent adaptation of the ant colony optimization (ACOR) metaheuristic, extended to continuous domains. Solutions are provided in the form of a Pareto frontier that shows the optimal equilibria. Moreover, we compared our scheme with the four-directional placement (FDP) heuristic, which was outperformed in all cases.
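To make the ACOR mechanics concrete, here is a minimal single-objective sketch (the paper's deployment problem is multi-objective with a Pareto frontier; this only shows how ACOR extends ant colony optimization to continuous variables: a ranked solution archive, rank-based kernel weights, and Gaussian sampling around archive members). The toy objective and all parameter values are illustrative, not the paper's.

```python
import numpy as np

# Minimal single-objective ACOR sketch: archive-guided Gaussian sampling.
rng = np.random.default_rng(0)

def sphere(x):                        # stand-in objective (to minimize)
    return float(np.sum(x ** 2))

dim, k, n_ants, q, xi = 2, 10, 8, 0.5, 0.85
archive = rng.uniform(-5, 5, (k, dim))
fitness = np.array([sphere(x) for x in archive])

for _ in range(100):
    order = np.argsort(fitness)
    archive, fitness = archive[order], fitness[order]
    ranks = np.arange(1, k + 1)
    w = np.exp(-(ranks - 1) ** 2 / (2 * (q * k) ** 2))
    w /= w.sum()                      # rank-based kernel weights
    for _ in range(n_ants):
        j = rng.choice(k, p=w)        # pick a guiding archive solution
        sigma = xi * np.mean(np.abs(archive - archive[j]), axis=0)
        x = rng.normal(archive[j], sigma + 1e-12)
        f = sphere(x)
        iw = int(np.argmax(fitness))  # index of current worst member
        if f < fitness[iw]:
            archive[iw], fitness[iw] = x, f

best = int(np.argmin(fitness))
print("best solution:", np.round(archive[best], 4), " f =", round(fitness[best], 6))
```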
NASA Technical Reports Server (NTRS)
Dubovik, O; Herman, M.; Holdak, A.; Lapyonok, T.; Tanré, D.; Deuzé, J. L.; Ducos, F.; Sinyuk, A.
2011-01-01
The proposed development is an attempt to enhance aerosol retrieval by emphasizing statistical optimization in the inversion of advanced satellite observations. This optimization concept improves retrieval accuracy by relying on knowledge of the measurement error distribution. Efficient application of such optimization requires pronounced data redundancy (an excess of the number of measurements over the number of unknowns) that is not common in satellite observations. The POLDER imager on board the PARASOL microsatellite registers spectral polarimetric characteristics of the reflected atmospheric radiation at up to 16 viewing directions over each observed pixel. The completeness of such observations is notably higher than for most currently operating passive satellite aerosol sensors. This provides an opportunity for profound utilization of statistical optimization principles in satellite data inversion. The proposed retrieval scheme is designed as a statistically optimized multi-variable fitting of all available angular observations obtained by the POLDER sensor in the window spectral channels where absorption by gas is minimal. The total number of such observations by PARASOL always exceeds a hundred over each pixel, and the statistical optimization concept promises to be efficient even if the algorithm retrieves several tens of aerosol parameters. Based on this idea, the proposed algorithm uses a large number of unknowns and is aimed at retrieval of an extended set of parameters affecting the measured radiation.
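The "statistically optimized fitting" idea has a compact linear-algebra core: when measurements outnumber unknowns and the error covariance is known, the maximum-likelihood estimate weights each observation by its inverse variance. A minimal synthetic sketch follows; the linear forward model, dimensions, and noise levels are invented (the real algorithm fits a nonlinear radiative transfer model).

```python
import numpy as np

# Inverse-variance weighted least squares: the statistical-optimization core.
rng = np.random.default_rng(3)

n_obs, n_par = 120, 5                # strong redundancy, as with POLDER pixels
A = rng.normal(size=(n_obs, n_par))  # linearised forward model (illustrative)
x_true = rng.normal(size=n_par)
sigma = rng.uniform(0.05, 0.5, n_obs)
y = A @ x_true + rng.normal(0, sigma)

W = np.diag(1.0 / sigma ** 2)        # inverse measurement-error covariance
x_hat = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)
print("max abs parameter error:", np.abs(x_hat - x_true).max())
```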
NASA Astrophysics Data System (ADS)
Li, X.; Li, S. W.
2012-07-01
In this paper, an efficient global optimization algorithm from the field of artificial intelligence, Particle Swarm Optimization (PSO), is introduced into close range photogrammetric data processing. PSO can be applied to obtain approximate values of the exterior orientation elements under the condition that multi-intersection photography and a small portable plane control frame are used. PSO, put forward by the American social psychologist J. Kennedy and the electrical engineer R. C. Eberhart, is a stochastic global optimization method based on swarm intelligence, inspired by the social behavior of bird flocking and fish schooling. The strategy for obtaining approximate values of the exterior orientation elements with PSO is as follows. From the observed image coordinates and the space coordinates of a few control points, equations for the image coordinate residual errors can be formed, where a residual error is defined as the difference between an observed image coordinate and the image coordinate computed through the collinearity condition equations; the sum of the absolute values of these residuals is taken as the objective function to be minimized. First, a gross search area for the exterior orientation elements is given, and the remaining parameters are adjusted so that the particles fly within this area. After a certain number of iterations, satisfactory approximate values of the exterior orientation elements are obtained. In this way, procedures such as positioning and measuring space control points in close range photogrammetry can be avoided. This method can therefore greatly improve surveying efficiency and at the same time decrease the surveying cost. Moreover, only one small portable control frame with a couple of control points is employed, and there are no strict requirements for the spatial distribution of the control points. To verify the effectiveness of this algorithm, two experiments were carried out. In the first experiment, images of a standard grid board were taken with a digital camera according to multi-intersection photography. Three or six points located at the lower-left corner of the standard grid were used as control points, the exterior orientation elements of each image were computed through PSO, and the results were compared with those computed through bundle adjustment. In the second experiment, the exterior orientation elements obtained from the first experiment were used as approximate values in a bundle adjustment, and the space coordinates of the other grid points on the board were computed. The differences between these computed coordinates and the known coordinates of the grid points were used to assess accuracy. The point accuracies in the two experiments were ±0.76 mm and ±0.43 mm, respectively. These experiments demonstrate the effectiveness of PSO in close range photogrammetry for computing approximate values of the exterior orientation elements, and show that the algorithm can meet higher accuracy requirements. In short, PSO can deliver good results in a faster, cheaper way than other surveying methods in close range photogrammetry.
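To make the PSO loop described above concrete, here is a minimal sketch: a swarm searches a box (the "gross area") for parameters minimizing a sum of absolute residuals. The toy objective stands in for the collinearity-equation reprojection error, and all constants (swarm size, inertia, acceleration coefficients) are illustrative, not the paper's.

```python
import numpy as np

# Minimal PSO minimizing a sum of absolute residuals over a search box.
rng = np.random.default_rng(7)
target = np.array([1.0, -2.0, 0.5])            # stand-in "true" parameters

def objective(p):                               # sum of absolute residuals
    return np.sum(np.abs(p - target))

n, dim, w, c1, c2 = 30, 3, 0.7, 1.5, 1.5
pos = rng.uniform(-10, 10, (n, dim))            # initial swarm in gross area
vel = np.zeros((n, dim))
pbest = pos.copy()
pbest_f = np.array([objective(p) for p in pos])

for _ in range(200):
    g = pbest[np.argmin(pbest_f)]               # global best so far
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
    pos += vel
    f = np.array([objective(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]

print("best parameters:", np.round(pbest[np.argmin(pbest_f)], 4))
```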
Integrated analysis of remote sensing products from basic geological surveys. [Brazil
NASA Technical Reports Server (NTRS)
Dasilvafagundesfilho, E. (Principal Investigator)
1984-01-01
Recent advances in remote sensing led to the development of several techniques to obtain image information. These techniques are analyzed as effective tools in geological mapping. A strategy for optimizing the use of images in basic geological surveying is presented. It embraces an integrated analysis of spatial, spectral, and temporal data through photoptic (color additive viewer) and computer processing at different scales, allowing large areas to be surveyed in a fast, precise, and low-cost manner.
LSD: Large Survey Database framework
NASA Astrophysics Data System (ADS)
Juric, Mario
2012-09-01
The Large Survey Database (LSD) is a Python framework and DBMS for distributed storage, cross-matching and querying of large survey catalogs (>10^9 rows, >1 TB). The primary driver behind its development is the analysis of Pan-STARRS PS1 data. It is specifically optimized for fast queries and parallel sweeps of positionally and temporally indexed datasets. It transparently scales to more than 10^2 nodes, and can be made to function in "shared nothing" architectures.
Leung, Janni; Atherton, Iain; Kyle, Richard G; Hubbard, Gill; McLaughlin, Deirdre
2016-04-01
The aim of this study is to examine the association between optimism and psychological distress in women with breast cancer after taking into account their self-rated general health. Data were aggregated from the Scottish Health Survey (2008 to 2011) to derive a nationally representative sample of 12,255 women (11,960 cancer-free controls, and 295 breast cancer cases identified from linked cancer registry data). The explanatory variables were optimism and general health, and the outcome variable was symptoms of psychological distress. Logistic regression analyses were conducted, with optimism entered in step 1 and general health entered in step 2. In an unadjusted model, higher levels of optimism were associated with lower odds of psychological distress in both the control group (OR = 0.57, 95 % CI = 0.51-0.60) and the breast cancer group (OR = 0.64, 95 % CI = 0.47-0.88). However, in a model adjusting for general health, optimism was associated with lower odds of psychological distress only in the control group (OR = 0.50, 95 % CI = 0.44-0.57), but not significantly in the breast cancer group (OR = 1.15, 95 % CI = 0.32-4.11). In the breast cancer group, poor general health was more strongly associated with psychological distress (OR = 4.98, 95 % CI = 1.32-18.75). Results were consistent after adjusting for age, years since breast cancer diagnosis, survey year, socioeconomic status, education, marital status, body mass index, smoking status, and alcohol consumption. This research confirms the value of multicomponent supportive care interventions for women with breast cancer. Specifically, it suggests that following breast cancer diagnosis, health care professionals need to provide advice and signpost to services that assist women to maintain or improve both their psychological and general health.
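The stepwise logistic modelling described above can be sketched in a few lines. The example below uses synthetic data and the statsmodels library to fit optimism alone (step 1) and optimism plus general health (step 2), reporting odds ratios; all coefficients and variable constructions are invented for illustration.

```python
import numpy as np
import statsmodels.api as sm

# Two-step logistic regression on synthetic data (all effects invented).
rng = np.random.default_rng(0)
n = 2000
optimism = rng.normal(size=n)
health = rng.normal(size=n) - 0.3 * optimism     # correlated with optimism
logit_p = -1.0 - 0.6 * optimism + 0.9 * health
distress = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

predictors = {"optimism": optimism, "health": health}
for cols in (["optimism"], ["optimism", "health"]):
    X = sm.add_constant(np.column_stack([predictors[c] for c in cols]))
    fit = sm.Logit(distress, X).fit(disp=False)
    print("step with", cols, "-> odds ratios:",
          np.round(np.exp(fit.params[1:]), 2))
```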
Innovative method for optimizing Side-Scan Sonar mapping: The blind band unveiled
NASA Astrophysics Data System (ADS)
Pergent, Gérard; Monnier, Briac; Clabaut, Philippe; Gascon, Gilles; Pergent-Martini, Christine; Valette-Sansevin, Audrey
2017-07-01
Over the past few years, the mapping of Mediterranean marine habitats has become a priority for scientists, environment managers and stakeholders, in particular in order to comply with European directives (Water Framework Directive and Marine Strategy Framework Directive) and to implement legislation to ensure their conservation. Side-scan sonar (SSS) is recognised as one of the most effective tools for underwater mapping. However, interpretation of acoustic data (sonograms) requires extensive field calibration, and the ground-truthing process remains essential. Several techniques are commonly used, with sampling methods involving grabs, scuba diving observations or Remotely Operated Vehicle (ROV) underwater video recordings. All these techniques are time consuming, expensive and only provide sporadic information. In the present study, the possibility of coupling a camera with a SSS and acquiring underwater videos in a continuous way has been tested. During the 'PosidCorse' oceanographic survey carried out along the eastern coast of Corsica, optical and acoustic data were respectively obtained using a GoPro™ camera and a Klein 3000™ SSS. In all, five profiles were performed between 10 and 50 m depth, corresponding to more than 20 km of data acquisition. The vertical images recorded with the camera fixed under the SSS and positioned facing downwards provided photo mosaics of very good quality corresponding to the entire blind band of the sonograms. From the photo mosaics, 94% of the different bottom types and main habitats were identified; specific structures linked to hydrodynamic conditions and to anthropic and biological activities were also observed, as well as the substrate on which the Posidonia oceanica meadow grows. The association between acoustic data and underwater videos has proved to be a non-destructive and cost-effective method for ground-truthing in marine habitat mapping. Nevertheless, in order to optimize the results of future surveys, certain limitations will need to be remedied.
Zhou, Zhiyong; Wagar, Nick; DeVos, Joshua R.; Rottinghaus, Erin; Diallo, Karidia; Nguyen, Duc B.; Bassey, Orji; Ugbena, Richard; Wadonda-Kabondo, Nellie; McConnell, Michelle S.; Zulu, Isaac; Chilima, Benson; Nkengasong, John; Yang, Chunfu
2011-01-01
Commercially available HIV-1 drug resistance (HIVDR) genotyping assays are expensive and have limitations in detecting non-B subtypes and circulating recombinant forms that are co-circulating in resource-limited settings (RLS). This study aimed to optimize a low-cost and broadly sensitive in-house assay for detecting HIVDR mutations in the protease (PR) and reverse transcriptase (RT) regions of the pol gene. The overall plasma genotyping sensitivity was 95.8% (N = 96). Compared to the original in-house assay and two commercially available genotyping systems, TRUGENE® and ViroSeq®, the optimized in-house assay showed a nucleotide sequence concordance of 99.3%, 99.6% and 99.1%, respectively. The optimized in-house assay was more sensitive in detecting mixture bases than the original in-house (N = 87, P<0.001) and TRUGENE® and ViroSeq® assays. When the optimized in-house assay was applied to genotype samples collected for HIVDR surveys (N = 230), all 72 (100%) plasma and 69 (95.8%) of the matched dried blood spots (DBS) in the Vietnam transmitted HIVDR survey were genotyped and nucleotide sequence concordance was 98.8%; testing of treatment-experienced patient plasmas with viral load (VL) ≥ and <3 log10 copies/ml from the Nigeria and Malawi surveys yielded 100% (N = 46) and 78.6% (N = 14) genotyping rates, respectively. Furthermore, all 18 matched DBS stored at room temperature from the Nigeria survey were genotyped. Phylogenetic analysis of the 236 sequences revealed that 43.6% were CRF01_AE, 25.9% subtype C, 13.1% CRF02_AG, 5.1% subtype G, 4.2% subtype B, 2.5% subtype A, 2.1% each subtype F and unclassifiable, 0.4% each CRF06_CPX, CRF07_BC and CRF09_CPX. Conclusions The optimized in-house assay is broadly sensitive in genotyping HIV-1 group M viral strains and more sensitive than the original in-house, TRUGENE® and ViroSeq® in detecting mixed viral populations. The broad sensitivity and substantial reagent cost saving make this assay more accessible for RLS where HIVDR surveillance is recommended to minimize the development and transmission of HIVDR. PMID:22132237
NASA Astrophysics Data System (ADS)
O'Carroll, Jack P. J.; Kennedy, Robert; Ren, Lei; Nash, Stephen; Hartnett, Michael; Brown, Colin
2017-10-01
The INFOMAR (Integrated Mapping For the Sustainable Development of Ireland's Marine Resource) initiative has acoustically mapped and classified a significant proportion of Ireland's Exclusive Economic Zone (EEZ), and is likely to be an important tool in Ireland's efforts to meet the criteria of the MSFD. In this study, open source and relic data were used in combination with new grab survey data to model EUNIS level 4 biotope distributions in Galway Bay, Ireland. The correct prediction rates of two artificial neural networks (ANNs) were compared to assess the effectiveness, as predictor variables, of acoustic sediment classifications versus sediment classes visually assigned by an expert in the field. To test for autocorrelation between predictor variables, the RELATE routine with the Spearman rank correlation method was used. Optimal models were derived by iteratively removing predictor variables and comparing the correct prediction rates of each model; the models with the highest correct prediction rates were chosen as optimal. The optimal models each used a combination of salinity (binary; 0 = polyhaline and 1 = euhaline), proximity to reef (binary; 0 = within 50 m and 1 = outside 50 m), depth (continuous; metres) and a sediment descriptor (acoustic or observed) as predictor variables. As the status of benthic habitats is required to be assessed under the MSFD, the Ecological Status (ES) of the subtidal sediments of Galway Bay was also assessed, using the Infaunal Quality Index. The ANN that used observed sediment classes as predictor variables could correctly predict the distribution of biotopes 67% of the time, compared to 63% for the ANN using acoustic sediment classes. Acoustic sediment ANN predictions were affected by local sediment heterogeneity and the lack of a mixed sediment class. The overall poor performance of the ANNs is likely to be a result of the temporally variable and sparsely distributed data within the study area.
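A minimal sketch of the comparison performed above, using scikit-learn and synthetic data: two identical feed-forward networks are scored by correct prediction rate, differing only in whether the sediment predictor is the "observed" class or a noisier "acoustic" class. Variable roles mirror the abstract; all data and noise levels are invented.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

# Compare two ANNs that differ only in the sediment predictor (synthetic).
rng = np.random.default_rng(5)
n = 500
salinity = rng.integers(0, 2, n)           # binary: 0 polyhaline, 1 euhaline
reef = rng.integers(0, 2, n)               # binary: within / outside 50 m
depth = rng.uniform(2, 40, n)
sediment_obs = rng.integers(0, 4, n)       # expert-labelled sediment class
sediment_acu = np.where(rng.random(n) < 0.8, sediment_obs,
                        rng.integers(0, 4, n))   # noisier acoustic class
biotope = (sediment_obs + (depth > 20) + reef) % 4   # synthetic target

for name, sed in [("observed", sediment_obs), ("acoustic", sediment_acu)]:
    X = np.column_stack([salinity, reef, depth, sed])
    net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                        random_state=0)
    score = cross_val_score(net, X, biotope, cv=5).mean()
    print(f"{name} sediment ANN: {score:.0%} correct")
```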
Association of Rhinoplasty With Perceived Attractiveness, Success, and Overall Health.
Nellis, Jason C; Ishii, Masaru; Bater, Kristin L; Papel, Ira D; Kontis, Theda C; Byrne, Patrick J; Boahene, Kofi D O; Ishii, Lisa E
2018-03-01
To date, the impact of rhinoplasty surgery on social perceptions has not been quantified. To measure the association of rhinoplasty with observer-graded perceived attractiveness, success, and overall health. In a web-based survey, blinded casual observers viewed independent images of 13 unique patient faces before or after rhinoplasty. The Delphi method was used to select standardized patient images, confirming appropriate patient candidacy and overall surgical effect. Observers rated the attractiveness, perceived success, and perceived overall health for each patient image. Facial perception questions were answered on a visual analog scale from 0 to 100, where higher scores corresponded to more positive responses. A multivariate mixed-effects regression model was used to determine the effect of rhinoplasty while accounting for observer biases. To further characterize the effect of rhinoplasty, estimated ordinal rank change was calculated for each domain. The primary objective was to measure the effect of rhinoplasty on observer-graded perceived attractiveness, success, and overall health. A total of 473 observers (mean age, 29 years [range, 18-73 years]; 305 [70.8%] were female) successfully completed the survey. On multivariate regression, patients after rhinoplasty were rated as significantly more attractive (rhinoplasty effect, 6.26; 95% CI, 5.10-7.41), more successful (rhinoplasty effect, 3.24; 95% CI, 2.32-4.17), and overall healthier (rhinoplasty effect, 3.78; 95% CI, 2.79-4.81). The ordinal rank change for an average individual's perceived attractiveness, success, and overall health was a positive shift of 14, 9, and 10 out of 100 rank positions, respectively. As perceived by casual observers, rhinoplasty surgery was associated with patients appearing significantly more attractive, more successful, and healthier. These results suggest patients undergoing rhinoplasty may derive a multifaceted benefit when partaking in social interactions. Furthermore, these results facilitate improved patient discussions aiming to provide more precise surgical expectations, with an understanding that these results represent optimal outcomes. NA.
Wide-Field Infrared Survey Telescope (WFIRST) Interim Report
NASA Technical Reports Server (NTRS)
Green, J.; Schechter, P.; Baltay, C.; Bean, R.; Bennett, D.; Brown, R.; Conselice, C.; Donahue, M.; Gaudi, S.; Lauer, T.;
2011-01-01
The New Worlds, New Horizons (NWNH) in Astronomy and Astrophysics 2010 Decadal Survey prioritized the community consensus for ground-based and space-based observatories. Recognizing that many of the community's key questions could be answered with a wide-field infrared survey telescope in space, and that the decade would be one of budget austerity, WFIRST was top ranked in the large space mission category. In addition to the powerful new science that could be accomplished with a wide-field infrared telescope, the WFIRST mission was determined to be both technologically ready and only a small fraction of the cost of previous flagship missions, such as HST or JWST. In response to the top ranking by the community, NASA formed the WFIRST Science Definition Team (SDT) and Project Office. The SDT was charged with fleshing out the NWNH scientific requirements to a greater level of detail. NWNH evaluated the risk and cost of the JDEM-Omega mission design, as submitted by NASA, and stated that it should serve as the basis for the WFIRST mission. The SDT and Project Office were charged with developing a mission optimized for achieving the science goals laid out by the NWNH report. The SDT and Project Office opted to use the JDEM-Omega hardware configuration as an initial starting point for the hardware implementation. JDEM-Omega and WFIRST both have an infrared imager with a filter wheel, as well as counter-dispersed moderate resolution spectrometers. The primary advantage of space observations is being above the Earth's atmosphere, which absorbs, scatters, warps and emits light. Observing from above the atmosphere enables WFIRST to obtain precision infrared measurements of the shapes of galaxies for weak lensing, infrared light-curves of supernovae and exoplanet microlensing events with low systematic errors, and infrared measurements of the Hα hydrogen line to be cleanly detected in the 1
NASA Astrophysics Data System (ADS)
Richards, Joseph W.; Starr, Dan L.; Miller, Adam A.; Bloom, Joshua S.; Butler, Nathaniel R.; Brink, Henrik; Crellin-Quick, Arien
2012-12-01
With growing data volumes from synoptic surveys, astronomers necessarily must become more abstracted from the discovery and introspection processes. Given the scarcity of follow-up resources, there is a particularly sharp onus on the frameworks that replace these human roles to provide accurate and well-calibrated probabilistic classification catalogs. Such catalogs inform the subsequent follow-up, allowing consumers to optimize the selection of specific sources for further study and permitting rigorous treatment of classification purities and efficiencies for population studies. Here, we describe a process to produce a probabilistic classification catalog of variability with machine learning from a multi-epoch photometric survey. In addition to producing accurate classifications, we show how to estimate calibrated class probabilities and motivate the importance of probability calibration. We also introduce a methodology for feature-based anomaly detection, which allows discovery of objects in the survey that do not fit within the predefined class taxonomy. Finally, we apply these methods to sources observed by the All-Sky Automated Survey (ASAS), and release the Machine-learned ASAS Classification Catalog (MACC), a 28 class probabilistic classification catalog of 50,124 ASAS sources in the ASAS Catalog of Variable Stars. We estimate that MACC achieves a sub-20% classification error rate and demonstrate that the class posterior probabilities are reasonably calibrated. MACC classifications compare favorably to the classifications of several previous domain-specific ASAS papers and to the ASAS Catalog of Variable Stars, which had classified only 24% of those sources into one of 12 science classes.
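The probability-calibration step emphasised above can be sketched with scikit-learn. The example below is illustrative only (synthetic data, a random forest standing in for the paper's classifier): raw scores are mapped to calibrated class probabilities with isotonic regression, and the Brier score quantifies the improvement.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.calibration import CalibratedClassifierCV
from sklearn.model_selection import train_test_split
from sklearn.metrics import brier_score_loss

# Post-hoc probability calibration of a classifier (synthetic data).
X, y = make_classification(n_samples=3000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

raw = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
cal = CalibratedClassifierCV(
    RandomForestClassifier(random_state=0), method="isotonic", cv=5
).fit(X_tr, y_tr)

for name, model in [("raw", raw), ("calibrated", cal)]:
    p = model.predict_proba(X_te)[:, 1]
    print(name, "Brier score:", round(brier_score_loss(y_te, p), 4))
```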
Aerial survey methodology for bison population estimation in Yellowstone National Park
Hess, Steven C.
2002-01-01
I developed aerial survey methods for statistically rigorous bison population estimation in Yellowstone National Park to support sound resource management decisions and to understand bison ecology. Survey protocols, data recording procedures, a geographic framework, and seasonal stratifications were based on field observations from February 1998-September 2000. The reliability of this framework and strata were tested with long-term data from 1970-1997. I simulated different sample survey designs and compared them to high-effort censuses of well-defined large areas to evaluate effort, precision, and bias. Sample survey designs require much effort and extensive information on the current spatial distribution of bison and therefore do not offer any substantial reduction in time and effort over censuses. I conducted concurrent ground surveys, or 'double sampling', to estimate detection probability during aerial surveys. Group size distribution and habitat strongly affected detection probability. In winter, 75% of the groups and 92% of individual bison were detected on average from aircraft, while in summer, 79% of groups and 97% of individual bison were detected. I also used photography to quantify the bias in counting large groups of bison and found that undercounting increased with group size and could reach 15%. I compared survey conditions between seasons and identified optimal time windows for conducting surveys in both winter and summer. These windows account for the habitats and total area bison occupy, and for group size distribution. Bison became increasingly scattered over the Yellowstone region in smaller groups and occupied more unfavorable habitats as winter progressed. Therefore, the best conditions for winter surveys occur early in the season (Dec-Jan). In summer, bison were most spatially aggregated and occurred in the largest groups by early August. Low variability between surveys and high detection probability provide population estimates with an overall coefficient of variation of approximately 8% and have high power for detecting trends in population change. I demonstrated how population estimates from winter and summer can be integrated into a comprehensive monitoring program to estimate annual growth rates, overall winter mortality, and an index of calf production, requiring about 30 hours of flight per year.
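The detection-probability correction implied by double sampling amounts to dividing the raw count by the estimated detection rate. A minimal worked example follows, reusing the abstract's winter detection figure and coefficient of variation but with an invented raw count.

```python
# Double-sampling correction: raw aerial count / detection probability.
aerial_count = 2300                  # hypothetical raw winter aerial count
p_indiv = 0.92                       # winter detection rate of individuals
                                     # (from the abstract)
estimate = aerial_count / p_indiv    # detection-corrected estimate
cv = 0.08                            # overall CV reported in the abstract
se = cv * estimate
print(f"corrected estimate: {estimate:.0f} +/- {1.96 * se:.0f} (approx. 95% CI)")
```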
NASA Astrophysics Data System (ADS)
Goetz, Jason; Marcer, Marco; Bodin, Xavier; Brenning, Alexander
2017-04-01
Snow depth mapping in open areas using close range aerial imagery is just one of the many cases where developments in structure-from-motion and multi-view-stereo (SfM-MVS) 3D reconstruction techniques have been applied in the geosciences - and with good reason. Our ability to increase the spatial resolution and frequency of observations may allow us to improve our understanding of how snow depth distribution varies through space and time. However, to ensure accurate snow depth observations from close range sensing we must adequately characterize the uncertainty related to our measurement techniques. In this study, we explore the spatial uncertainties of snow elevation models for estimation of snow depth in complex alpine terrain from close range aerial imagery. We accomplish this by conducting repeat autonomous aerial surveys over a snow-covered active rock glacier located in the French Alps. The imagery obtained from each flight of an unmanned aerial vehicle (UAV) is used to create an individual digital elevation model (DEM) of the snow surface. As a result, we obtain multiple DEMs of the snow surface for the same site. These DEMs are obtained by processing the imagery with the photogrammetry software Agisoft Photoscan. The elevation models are also georeferenced within Photoscan using the geotagged imagery from an onboard GNSS in combination with ground targets placed around the rock glacier, which have been surveyed with highly accurate RTK-GNSS equipment. The random error associated with multi-temporal DEMs of the snow surface is estimated from the repeat aerial survey data. The multiple flights are designed to follow the same flight path and altitude above the ground to simulate the optimal conditions of a repeat survey of the site, and thus to estimate the maximum precision associated with our snow-elevation measurement technique. The bias of the DEMs is assessed with RTK-GNSS survey observations of the snow surface elevation of the area on and surrounding the rock glacier. Additionally, one of the challenges with processing snow cover imagery with SfM-MVS is dealing with the general homogeneity of the surface, which makes it difficult for automated feature detection algorithms to identify key features for point matching. This challenge depends on the snow cover surface conditions, such as scale, lighting conditions (high vs. low contrast), and availability of snow-free features within a scene, among others. We attempt to explore this aspect by spatially modelling the factors influencing the precision and bias of the DEMs from image, flight, and terrain attributes.
RFI in the 0.5 to 10.8 GHz Band at the Allen Telescope Array
NASA Astrophysics Data System (ADS)
Backus, Peter R.; Kilsdonk, T. N.; Allen Telescope Array Team
2007-05-01
Thanks to funding from the Paul G. Allen Foundation (and other philanthropic supporters) for the technology development and first phase of construction, the first 42 elements of the Allen Telescope Array (ATA-42) are being commissioned for rapid surveys of the astrophysical and technological sky. Because of the innovative design of this array, which will eventually include 350 elements, traditional radio astronomy and SETI are enabled simultaneously 24x7. The array has been designed to provide an optimal snapshot image of a very large field of view and, simultaneously, 16 (dual polarization) phased beams within the field of view to be analyzed by a suite of backend processors. Four independent 100 MHz bands may be tuned anywhere within the instantaneous receiver bandwidth from 0.5 to 11.2 GHz. One key to the success of rapid surveys for astrophysical or technological signals is a quiet background. This poster presents the results of initial high-spectral-resolution surveys, made with 6.1 meter dishes, of the background spectrum from 0.5 to 10.8 GHz at the Hat Creek Radio Observatory, where the ATA is being constructed, and compares it with the background spectrum from 1.2-3 GHz at other observatories where SETI observations have been conducted within the past 11 years.
Smoking, social support, and hassles in an urban African-American community.
Romano, P S; Bloom, J; Syme, S L
1991-01-01
BACKGROUND. Despite public health efforts, the prevalence of smoking among African Americans remains high. The determinants of smoking behavior in this population must be elucidated so that interventions can be better targeted and more effective. METHODS. As part of a prospective community intervention trial to reduce cancer mortality, we conducted a random household survey of 1137 African-American adults in San Francisco and Oakland between November 1985 and July 1986. The survey instrument included questions about social network characteristics, instrumental and emotional aspects of social support, smoking behavior, and stressors. RESULTS. The overall prevalence of smoking (41.9%) was higher than that reported in national surveys. Logistic models revealed that persons reporting high levels of stress, represented by an abbreviated hassles index, were more likely to smoke than those reporting less stress. Women with poor social networks were more likely to smoke (odds ratio = 3.1) than women with optimal networks; however, this relationship did not hold among men. Indeed, men lacking emotional support from friends or family were less likely to smoke (odds ratio = 0.5) than men receiving such support. No interaction between social support and hassles was observed. CONCLUSIONS. Stressful environments may contribute to high-risk smoking behavior among urban African Americans. PMID:1951797
Russell, Richard; Kingsland, Charles; Alfirevic, Zarko; Gazvani, Rafet
2015-03-01
Luteal support is considered as an essential component of IVF treatment following ovarian stimulation and embryo transfer. Several studies have consistently demonstrated a benefit of luteal support compared with no treatment and whilst a number of preparations are available, no product has been demonstrated as superior. There is an emerging body of evidence which suggests that extension of luteal support beyond biochemical pregnancy does not confer a benefit in terms of successful pregnancy outcome. We performed two surveys separated by 5 years of practice evolution, with the latter reporting on the use of luteal support in all IVF clinics in the UK. All clinics reported utilising luteal support with the majority favouring the use of Cyclogest 400 mg twice daily. In contrast, there was no consensus on the optimal duration of luteal support. Whilst 24% of clinics withdrew luteal support at biochemical confirmation of pregnancy, 40% continued treatment until 12 weeks gestation. Several clinics even extended luteal support beyond 12 weeks gestation. We observed no difference in practice based on the size of the IVF unit or treatment funding source. Although there was some change in practice between surveys in many clinics, there was no uniformity in the direction of change.
Objectively Optimized Observation Direction System Providing Situational Awareness for a Sensor Web
NASA Astrophysics Data System (ADS)
Aulov, O.; Lary, D. J.
2010-12-01
There is great utility in having a flexible and automated objective observation direction system for the decadal survey missions and beyond. Such a system allows us to optimize the observations made by a suite of sensors to address specific goals, from long term monitoring to rapid response. We have developed such a prototype using a network of communicating software elements to control a heterogeneous network of sensor systems, which can have multiple modes and flexible viewing geometries. Our system makes sensor systems intelligent and situationally aware. Together they form a sensor web of multiple sensors working together and capable of automated target selection, i.e. the sensors “know” where they are, what they are able to observe, and what targets and with what priorities they should observe. This system is implemented in three components. The first component is a Sensor Web simulator. The Sensor Web simulator describes the capabilities and locations of each sensor as a function of time, whether they are orbital, sub-orbital, or ground based. The simulator has been implemented using AGI's Satellite Tool Kit (STK). STK makes it easy to analyze and visualize optimal solutions for complex space scenarios, perform complex analysis of land, sea, air, and space assets, and share results in one integrated solution. The second component is a target scheduler, implemented with STK Scheduler. STK Scheduler is powered by a scheduling engine that finds better solutions in a shorter amount of time than traditional heuristic algorithms. The global search algorithm within this engine is based on neural network technology that is capable of finding solutions to larger and more complex problems and maximizing the value of limited resources. The third component is a modeling and data assimilation system. It provides situational awareness by supplying the time evolution of uncertainty and information content metrics that are used to tell us what we need to observe and the priority we should give to the observations. A prototype of this component was implemented with AutoChem. AutoChem is NASA release software constituting an automatic code generation, symbolic differentiation, analysis, documentation, and web site creation tool for atmospheric chemical modeling and data assimilation. Its model is explicit and uses an adaptive time-step, error-monitoring time integration scheme for stiff systems of equations. AutoChem was the first model ever to have the facility to perform 4D-Var data assimilation and Kalman filtering. The project developed a control system with three main accomplishments. First, fully multivariate observational and theoretical information with associated uncertainties was combined using a full Kalman filter data assimilation system. Second, an optimal distribution of the computations and of data queries was achieved by utilizing high performance computers/load balancing and a set of automatically mirrored databases. Third, inter-instrument bias correction was performed using machine learning. The PI for this project was Dr. David Lary of the UMBC Joint Center for Earth Systems Technology at NASA/Goddard Space Flight Center.
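The core operation of the data-assimilation component, a Kalman filter update, fits in a few lines. A minimal sketch with an invented two-component state: assimilating a single observation shrinks the uncertainty of the observed component, and the remaining posterior variances are exactly the kind of information-content metric used to prioritise the next observations.

```python
import numpy as np

# Single Kalman filter update step (all values illustrative).
x = np.array([1.0, 0.5])                 # prior state estimate
P = np.diag([0.5, 1.0])                  # prior error covariance
H = np.array([[1.0, 0.0]])               # we can only observe component 0
R = np.array([[0.1]])                    # observation error covariance
y = np.array([1.3])                      # incoming observation

K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
x_post = x + K @ (y - H @ x)                   # updated state
P_post = (np.eye(2) - K @ H) @ P               # updated covariance

print("posterior state:", x_post)
print("posterior variances:", np.diag(P_post))  # component 1 stays uncertain
```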
Optimal design of a plot cluster for monitoring
Charles T. Scott
1993-01-01
Traveling costs incurred during extensive forest surveys make cluster sampling cost-effective. Clusters are specified by the type of plots, plot size, number of plots, and the distance between plots within the cluster. A method to determine the optimal cluster design when different plot types are used for different forest resource attributes is described. The method...
Robust surveillance and control of invasive species using a scenario optimization approach
Denys Yemshanov; Robert G. Haight; Frank H. Koch; Bo Lu; Robert C. Venette; Ronald E. Fournier; Jean J. Turgeon
2017-01-01
Uncertainty about future outcomes of invasions is a major hurdle in the planning of invasive species management programs. We present a scenario optimization model that incorporates uncertainty about the spread of an invasive species and allocates survey and eradication measures to minimize the number of infested or potentially infested host plants on the landscape. We...
New Results in Astrodynamics Using Genetic Algorithms
NASA Technical Reports Server (NTRS)
Coverstone-Carroll, V.; Hartmann, J. W.; Williams, S. N.; Mason, W. J.
1998-01-01
Genetic algorithms have gained popularity as an effective procedure for obtaining solutions to traditionally difficult space mission optimization problems. In this paper, a brief survey of the use of genetic algorithms to solve astrodynamics problems is presented and is followed by new results obtained from applying a Pareto genetic algorithm to the optimization of low-thrust interplanetary spacecraft missions.
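For readers new to the technique, a minimal single-objective genetic algorithm is sketched below (the paper itself applies a Pareto genetic algorithm to multi-objective low-thrust mission design; this only illustrates the basic selection, crossover and mutation loop on a toy fitness function). All parameters are illustrative.

```python
import random

# Minimal genetic algorithm: selection, crossover, mutation on a toy problem.
random.seed(2)

def fitness(x):                        # toy objective: maximize
    return -(x - 3.7) ** 2

pop = [random.uniform(-10, 10) for _ in range(40)]
for _ in range(60):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:20]                 # truncation selection
    children = []
    while len(children) < 20:
        a, b = random.sample(parents, 2)
        child = 0.5 * (a + b)          # arithmetic crossover
        child += random.gauss(0, 0.3)  # Gaussian mutation
        children.append(child)
    pop = parents + children

print("best solution found:", round(max(pop, key=fitness), 3))
```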
Optimizing Balanced Incomplete Block Designs for Educational Assessments
ERIC Educational Resources Information Center
van der Linden, Wim J.; Veldkamp, Bernard P.; Carlson, James E.
2004-01-01
A popular design in large-scale educational assessments as well as any other type of survey is the balanced incomplete block design. The design is based on an item pool split into a set of blocks of items that are assigned to sets of "assessment booklets." This article shows how the problem of calculating an optimal balanced incomplete block…
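To make the design concrete: the smallest classical balanced incomplete block design assigns 7 item blocks to 7 booklets of 3 blocks each (the Fano plane), so every block appears in 3 booklets and every pair of blocks shares exactly one booklet. The check below is a generic illustration with parameters (v=7, b=7, r=3, k=3, lambda=1), not tied to any particular assessment.

```python
from itertools import combinations

# Fano-plane BIBD: 7 item blocks arranged into 7 booklets of 3 blocks each.
booklets = [(1, 2, 3), (1, 4, 5), (1, 6, 7),
            (2, 4, 6), (2, 5, 7), (3, 4, 7), (3, 5, 6)]

for pair in combinations(range(1, 8), 2):
    shared = sum(set(pair) <= set(b) for b in booklets)
    assert shared == 1        # every block pair co-occurs exactly lambda=1 time
print("BIBD check passed: every pair of blocks shares exactly one booklet")
```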
Optimizing the Number of Students for an Effective Online Discussion Board Learning Experience
ERIC Educational Resources Information Center
Reonieri, Dean C., Sr.
2006-01-01
The purpose of this research was to determine if there is an opportunity for colleges and universities to improve the quality of knowledge constructed in online (asynchronous) discussion boards by optimizing the number of students in the discussion. 93 online graduate students and 36 online faculty were surveyed to gain the perspective from both…
Task Scheduling in Desktop Grids: Open Problems
NASA Astrophysics Data System (ADS)
Chernov, Ilya; Nikitina, Natalia; Ivashko, Evgeny
2017-12-01
We survey the areas of Desktop Grid task scheduling that seem to be insufficiently studied so far and are promising for efficiency, reliability, and quality of Desktop Grid computing. These topics include optimal task grouping, "needle in a haystack" paradigm, game-theoretical scheduling, domain-imposed approaches, special optimization of the final stage of the batch computation, and Enterprise Desktop Grids.
Identity Crises in Love and at Work: Dispositional Optimism as a Durable Personal Resource
ERIC Educational Resources Information Center
Andersson, Matthew A.
2012-01-01
Using the 2004 General Social Survey (N = 453), the identity stress process is investigated in terms of crises in intimate relationships and at the workplace. I discuss dispositional optimism as a psychological resource that is relatively independent of the situation and the self, making it ideal for structurally disadvantaged actors and for…
Optimal multi-dimensional poverty lines: The state of poverty in Iraq
NASA Astrophysics Data System (ADS)
Ameen, Jamal R. M.
2017-09-01
Poverty estimation based on calorie intake is unrealistic. The established concept of multidimensional poverty has methodological weaknesses in the treatment of different dimensions, and there is disagreement over methods of combining them into a single poverty line. This paper introduces a methodology to estimate optimal multidimensional poverty lines and uses the Iraqi household socio-economic survey data of 2012 to demonstrate the idea. The optimal poverty line for Iraq is found to be 170.5 Thousand Iraqi Dinars (TID).
Optimization Strategies for Sensor and Actuator Placement
NASA Technical Reports Server (NTRS)
Padula, Sharon L.; Kincaid, Rex K.
1999-01-01
This paper provides a survey of actuator and sensor placement problems from a wide range of engineering disciplines and a variety of applications. Combinatorial optimization methods are recommended as a means for identifying sets of actuators and sensors that maximize performance. Several sample applications from NASA Langley Research Center, such as active structural acoustic control, are covered in detail. Laboratory and flight tests of these applications indicate that actuator and sensor placement methods are effective and important. Lessons learned in solving these optimization problems can guide future research.
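As a toy illustration of the combinatorial placement problems surveyed above, the sketch below greedily selects sensor locations to maximize coverage of sample points. Greedy selection is just one simple heuristic among the combinatorial methods such surveys consider; all geometry and parameters here are invented.

```python
import random

# Greedy sensor placement: maximize the number of covered sample points.
random.seed(0)
points = [(random.random(), random.random()) for _ in range(200)]
candidates = [(random.random(), random.random()) for _ in range(30)]
RADIUS = 0.2

def covered(sensor):
    sx, sy = sensor
    return {i for i, (x, y) in enumerate(points)
            if (x - sx) ** 2 + (y - sy) ** 2 <= RADIUS ** 2}

chosen, coverage = [], set()
for _ in range(5):                       # place 5 sensors
    best = max(candidates, key=lambda s: len(coverage | covered(s)))
    candidates.remove(best)              # each site used at most once
    chosen.append(best)
    coverage |= covered(best)

print(f"{len(chosen)} sensors cover {len(coverage)} of {len(points)} points")
```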
An efficient method for removing point sources from full-sky radio interferometric maps
NASA Astrophysics Data System (ADS)
Berger, Philippe; Oppermann, Niels; Pen, Ue-Li; Shaw, J. Richard
2017-12-01
A new generation of wide-field radio interferometers designed for 21-cm surveys is being built as drift scan instruments allowing them to observe large fractions of the sky. With large numbers of antennas and frequency channels, the enormous instantaneous data rates of these telescopes require novel, efficient, data management and analysis techniques. The m-mode formalism exploits the periodicity of such data with the sidereal day, combined with the assumption of statistical isotropy of the sky, to achieve large computational savings and render optimal analysis methods computationally tractable. We present an extension to that work that allows us to adopt a more realistic sky model and treat objects such as bright point sources. We develop a linear procedure for deconvolving maps, using a Wiener filter reconstruction technique, which simultaneously allows filtering of these unwanted components. We construct an algorithm, based on the Sherman-Morrison-Woodbury formula, to efficiently invert the data covariance matrix, as required for any optimal signal-to-noise ratio weighting. The performance of our algorithm is demonstrated using simulations of a cylindrical transit telescope.
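The covariance inversion step can be illustrated directly. Below is a minimal numpy sketch of the Sherman-Morrison-Woodbury identity, assuming (for illustration only) a diagonal noise matrix plus a low-rank term standing in for bright point sources; the real pipeline applies this within the m-mode formalism.

```python
import numpy as np

# Sherman-Morrison-Woodbury: invert C = N + U S U^T via a small k x k solve.
rng = np.random.default_rng(1)
n, k = 500, 4                          # data dimension, low-rank size
N_inv = np.diag(1.0 / rng.uniform(0.5, 2.0, n))   # diagonal noise, easy inverse
U = rng.normal(size=(n, k))
S = np.diag(rng.uniform(0.5, 1.5, k))

# (N + U S U^T)^-1 = N^-1 - N^-1 U (S^-1 + U^T N^-1 U)^-1 U^T N^-1
core = np.linalg.inv(np.linalg.inv(S) + U.T @ N_inv @ U)   # only k x k
C_inv = N_inv - N_inv @ U @ core @ U.T @ N_inv

C = np.linalg.inv(N_inv) + U @ S @ U.T                     # direct check
print("max deviation from direct inverse:",
      np.abs(C_inv - np.linalg.inv(C)).max())
```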
2010-06-01
surface directly (vertically) above the hypocenter (United States Geological Survey, 2009). A graphical depiction of epicenter and hypocenter appears...to their focal depth: shallow (0-70 km), intermediate (70-300 km), and deep (300-700 km) (United States Geological Survey, 1989a). The concepts of...magnitude (Mb), and moment magnitude (MW) scales (Papazachos & Papazachou, 2003, p. 39; United States Geological Survey, 2009c). All these measurement
van Dorp, Sofie M; Notermans, Daan W; Alblas, Jeroen; Gastmeier, Petra; Mentula, Silja; Nagy, Elisabeth; Spigaglia, Patrizia; Ivanova, Katiusha; Fitzpatrick, Fidelma; Barbut, Frédéric; Morris, Trefor; Wilcox, Mark H; Kinross, Pete; Suetens, Carl; Kuijper, Ed J
2016-07-21
Suboptimal laboratory diagnostics for Clostridium difficile infection (CDI) impedes its surveillance and control across Europe. We evaluated changes in local laboratory CDI diagnostics and changes in national diagnostic and typing capacity for CDI during the European C. difficile Infection Surveillance Network (ECDIS-Net) project, through cross-sectional surveys in 33 European countries in 2011 and 2014. In 2011, 126 (61%) of a convenience sample of 206 laboratories in 31 countries completed a survey on local diagnostics. In 2014, 84 (67%) of these 126 laboratories in 26 countries completed a follow-up survey. Among laboratories that participated in both surveys, use of CDI diagnostics deemed 'optimal' or 'acceptable' increased from 19% to 46% and from 10% to 15%, respectively (p < 0.001). The survey of national capacity was completed by national coordinators of 31 and 32 countries in 2011 and 2014, respectively. Capacity for any C. difficile typing method increased from 22/31 countries in 2011 to 26/32 countries in 2014; for PCR ribotyping from 20/31 countries to 23/32 countries, and specifically for capillary PCR ribotyping from 7/31 countries to 16/32 countries. While our study indicates improved diagnostic capability and national capacity for capillary PCR ribotyping across European laboratories between 2011 and 2014, increased use of 'optimal' diagnostics should be promoted. This article is copyright of The Authors, 2016.
SKA weak lensing- II. Simulated performance and survey design considerations
NASA Astrophysics Data System (ADS)
Bonaldi, Anna; Harrison, Ian; Camera, Stefano; Brown, Michael L.
2016-12-01
We construct a pipeline for simulating weak lensing cosmology surveys with the Square Kilometre Array (SKA), taking as inputs telescope sensitivity curves; correlated source flux, size and redshift distributions; a simple ionospheric model; source redshift and ellipticity measurement errors. We then use this simulation pipeline to optimize a 2-yr weak lensing survey performed with the first deployment of the SKA (SKA1). Our assessments are based on the total signal to noise of the recovered shear power spectra, a metric that we find to correlate very well with a standard dark energy figure of merit. We first consider the choice of frequency band, trading off increases in number counts at lower frequencies against poorer resolution; our analysis strongly prefers the higher frequency Band 2 (950-1760 MHz) channel of the SKA-MID telescope to the lower frequency Band 1 (350-1050 MHz). Best results would be obtained by allowing the centre of Band 2 to shift towards lower frequency, around 1.1 GHz. We then move on to consider survey size, finding that an area of 5000 deg2 is optimal for most SKA1 instrumental configurations. Finally, we forecast the performance of a weak lensing survey with the second deployment of the SKA. The increased survey size (3π steradian) and sensitivity improves both the signal to noise and the dark energy metrics by two orders of magnitude.
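The "total signal to noise of the recovered shear power spectra" metric has a standard closed form worth recording. The expression below is the conventional Gaussian estimate (not necessarily the paper's exact definition), with f_sky the observed sky fraction, C_ℓ^γγ the shear power spectrum, σ_ε the intrinsic ellipticity dispersion and n̄ the source surface density:

```latex
% Conventional Gaussian signal-to-noise for a measured shear power spectrum
\left(\frac{S}{N}\right)^{2}
  \simeq \sum_{\ell} \frac{(2\ell + 1)\, f_{\mathrm{sky}}}{2}
    \left( \frac{C_{\ell}^{\gamma\gamma}}
                {C_{\ell}^{\gamma\gamma} + N_{\ell}} \right)^{2},
\qquad
N_{\ell} = \frac{\sigma_{\epsilon}^{2}}{\bar{n}}
```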
Updyke, Katelyn Mariko; Urso, Brittany; Beg, Shazia; Solomon, James
2017-10-09
Systemic lupus erythematosus (SLE) is a multi-organ, autoimmune disease in which patients lose self-tolerance and develop immune complexes that deposit systemically, causing multi-organ damage and inflammation. Patients often experience unpredictable flares of symptoms with poorly identified triggers. Literature suggests exogenous exposures may contribute to flares in symptoms. An online pilot survey was marketed globally through social media to self-reported SLE patients with the goal of identifying specific subpopulations who are susceptible to disease state changes based on analyzed exogenous factors. The pilot survey was promoted for two weeks; 80 respondents fully completed the survey and were included in the statistical analysis. Descriptive statistical analysis was performed on de-identified patient surveys and compared to previous literature studies reporting known or theorized triggers in the SLE disease state. The pilot survey identified exogenous triggers similar to those in previous literature, including antibiotics, increasing beef intake, and metal implants. The goal of the pilot survey is to use similar questions to develop a detailed internet-based patient interactive form that can be edited and time stamped as a method to promote continuous quality improvement assessments. The ultimate objective of the platform is to interact with SLE patients from across the globe longitudinally to optimize disease control and improve quality of care by allowing them to avoid harmful triggers.
2014-01-01
Background The optimal cutoff of the waist-to-hip ratio (WHR) among Han adults in Xinjiang, which is located in the center of Asia, is unknown. We aimed to examine the relationship between different WHRs and cardiovascular risk factors among Han adults in Xinjiang, and determine the optimal cutoff of the WHR. Methods The Cardiovascular Risk Survey was conducted from October 2007 to March 2010. A total of 14618 representative participants were selected using a four-stage stratified sampling method. A total of 5757 Han participants were included in the study. The present statistical analysis was restricted to the 5595 Han subjects who had complete anthropometric data. The sensitivity, specificity, and distance on the receiver operating characteristic (ROC) curve in each WHR level were calculated. The shortest distance in the ROC curves was used to determine the optimal cutoff of the WHR for detecting cardiovascular risk factors. Results In women, the WHR was positively associated with systolic blood pressure, diastolic blood pressure, and serum concentrations of serum total cholesterol. The prevalence of hypertension and hypertriglyceridemia increased as the WHR increased. The same results were not observed among men. The optimal WHR cutoffs for predicting hypertension, diabetes, dyslipidemia and ≥ two of these risk factors for Han adults in Xinjiang were 0.92, 0.92, 0.91, 0.92 in men and 0.88, 0.89, 0.88, 0.89 in women, respectively. Conclusions Higher cutoffs for the WHR are required in the identification of Han adults aged ≥ 35 years with a high risk of cardiovascular diseases in Xinjiang. PMID:25074400
Islam, M Mazharul; Masud, Mohammad Shahed
2018-04-30
The World Health Organization (WHO) recommends four antenatal care (ANC) visits, delivery in a health facility, and three postnatal care (PNC) visits to optimize maternal health outcomes. This study examines the level and determinants of maternal health care seeking behaviour during pregnancy, delivery and the postnatal period, and assesses compliance with the WHO recommended levels of care in Bangladesh. The study is based on secondary analysis of data from the 2014 Bangladesh Demographic and Health Survey (BDHS), a cross-sectional survey of a nationally representative sample of 17,863 ever-married women aged 15-49 years selected following a two-stage stratified cluster sampling design. A subsample of 4,627 ever-married women who had delivered their last birth within three years before the survey was included in the analysis to meet the objectives of the study. Descriptive statistics and a multinomial logistic regression model were used for data analysis. Only 31% of mothers had the recommended four or more ANC visits, 37% of births were delivered at health facilities, and 65% of mothers received at least one PNC visit. Only 18.0% of mothers received the WHO recommended optimal level of care: four or more ANC visits, birth in a health facility, and at least one PNC visit. Being aged less than 20 years, living in a rural area, having no education and no media exposure, multiparity, poor wealth status, a husband with no education, and the husband's employment status appeared as significant predictors of an optimal level of maternal health care after adjusting for other factors. Mothers living in the Sylhet, Chittagong and Barisal regions were less likely to receive the optimum level of care. Utilization of maternal health care during pregnancy, delivery and the postnatal period among Bangladeshi women does not reflect complete compliance with the WHO recommendations. Further studies are needed to identify the reasons for underutilization of the optimum level of maternal care in Bangladesh. The findings underscore the need for targeted interventions for those groups of mothers identified as having the lowest level of maternal care across the continuum of care. Copyright © 2018 Elsevier Ltd. All rights reserved.
23 CFR 1340.5 - Selection of observation sites.
Code of Federal Regulations, 2013 CFR
2013-04-01
... STATE OBSERVATIONAL SURVEYS OF SEAT BELT USE Survey Design Requirements § 1340.5 Selection of... observation sites. The survey design shall include at a minimum the following protocols: (1) Protocol when...
23 CFR 1340.5 - Selection of observation sites.
Code of Federal Regulations, 2014 CFR
2014-04-01
... STATE OBSERVATIONAL SURVEYS OF SEAT BELT USE Survey Design Requirements § 1340.5 Selection of... observation sites. The survey design shall include at a minimum the following protocols: (1) Protocol when...
23 CFR 1340.5 - Selection of observation sites.
Code of Federal Regulations, 2012 CFR
2012-04-01
... STATE OBSERVATIONAL SURVEYS OF SEAT BELT USE Survey Design Requirements § 1340.5 Selection of... observation sites. The survey design shall include at a minimum the following protocols: (1) Protocol when...
Constraints on the Energy Density Content of the Universe Using Only Clusters of Galaxies
NASA Technical Reports Server (NTRS)
Molnar, Sandor M.; Haiman, Zoltan; Birkinshaw, Mark
2003-01-01
We demonstrate that it is possible to constrain the energy content of the Universe with high accuracy using observations of clusters of galaxies only. The degeneracies in the cosmological parameters are lifted by combining constraints from different observables of galaxy clusters. We show that constraints on cosmological parameters from galaxy cluster number counts as a function of redshift and from accurate angular diameter distance measurements to clusters are complementary, and that their combination can constrain the energy density content of the Universe well. The number counts can be obtained from X-ray and/or SZ (Sunyaev-Zeldovich effect) surveys; the angular diameter distances can be determined from deep observations of the intra-cluster gas using its thermal bremsstrahlung X-ray emission and the SZ effect (X-SZ method). In this letter we combine constraints from simulated cluster number counts expected from a 12 deg^2 SZ cluster survey with constraints from simulated angular diameter distance measurements based on the X-SZ method, assuming an expected accuracy of 7% in the angular diameter distance determination of 70 clusters with redshifts less than 1.5. We find that Ω_m can be determined to within about 25%, Ω_Λ within 20%, and w within 16%. Any cluster survey can be used to select clusters for high-accuracy distance measurements, but we assumed accurate angular diameter distance measurements for only 70 clusters, since long observations are necessary to achieve high accuracy in distance measurements. Thus the question naturally arises: how should clusters of galaxies be selected for accurate angular diameter distance determinations? In this letter, as an example, we demonstrate that it is possible to optimize this selection by changing the number of clusters observed and the upper cutoff of their redshift range. We show that, contrary to general expectations, constraints on cosmological parameters from combining cluster number counts and angular diameter distance measurements will not improve substantially when selecting clusters with redshifts higher than one. This important conclusion allows us to restrict our cluster sample to clusters with redshifts less than one, a range where the observational times required for accurate distance measurements are more manageable. Subject headings: cosmological parameters - cosmology: theory - galaxies: clusters: general - X-rays: galaxies: clusters
Object classification and outliers analysis in the forthcoming Gaia mission
NASA Astrophysics Data System (ADS)
Ordóñez-Blanco, D.; Arcay, B.; Dafonte, C.; Manteiga, M.; Ulla, A.
2010-12-01
Astrophysics is evolving towards the rational optimization of costly observational material through the intelligent exploitation of large astronomical databases from both ground-based telescopes and space mission archives. However, there has been relatively little advance in the development of the highly scalable data exploitation and analysis tools needed to generate scientific returns from these large and expensively obtained datasets. Among the upcoming projects of astronomical instrumentation, Gaia is the next cornerstone ESA mission. The Gaia survey foresees the creation of a data archive and its future exploitation with automated or semi-automated analysis tools. This work reviews some of the developments by the Gaia Data Processing and Analysis Consortium for object classification and the analysis of outliers in the forthcoming mission.
Formation Design Strategy for SCOPE High-Elliptic Formation Flying Mission
NASA Technical Reports Server (NTRS)
Tsuda, Yuichi
2007-01-01
A new formation design strategy using simulated annealing (SA) optimization is presented. The SA algorithm is useful for surveying the whole solution space of optimal formations, taking into account realistic constraints composed of continuous and discrete functions. It is shown that this method is applicable not only to circular orbits but also to high-elliptic-orbit formation flying. The developed algorithm is first tested with a simple cart-wheel motion example, and then applied to the formation design for SCOPE, the next-generation geomagnetotail observation mission planned by JAXA, which utilizes formation flying technology in a high-elliptic orbit. A distinctive and useful heuristic is found by investigating the SA results, showing the effectiveness of the proposed design process.
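To make the optimization step concrete, here is a minimal, generic simulated-annealing minimizer in Python. The toy problem (spreading three satellites around a circle) and every parameter are hypothetical illustrations, not SCOPE's actual cost function or constraints.

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=1.0, cooling=0.995, n_iter=5000):
    """Generic SA minimizer; `cost` may encode continuous and discrete constraints."""
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(n_iter):
        y = neighbor(x)
        fy = cost(y)
        # Always accept improvements; accept worse moves with Boltzmann probability.
        if fy <= fx or random.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling  # geometric cooling schedule
    return best, fbest

# Toy usage: place 3 satellites on a circle to maximize their minimum separation.
def cost(angles):
    # Negative of the smallest directed angular gap (assumes distinct angles).
    return -min((a - b) % 360 for a in angles for b in angles if a != b)

def neighbor(angles):
    a = list(angles)
    i = random.randrange(len(a))
    a[i] = (a[i] + random.uniform(-10, 10)) % 360  # perturb one satellite
    return tuple(a)

print(simulated_annealing(cost, neighbor, (0.0, 10.0, 20.0)))
```

The accept-worse-moves rule is what lets SA escape local minima of mixed continuous/discrete cost landscapes, which is the property the abstract highlights.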
Miller, Jennifer R. B.; Jhala, Yadvendradev V.; Schmitz, Oswald J.
2016-01-01
Human-carnivore conflict is challenging to quantify because it is shaped by both the realities and people’s perceptions of carnivore threats. Whether perceptions align with realities can have implications for conflict mitigation: misalignments can lead to heightened and indiscriminate persecution of carnivores whereas alignments can offer deeper insights into human-carnivore interactions. We applied a landscape-scale spatial analysis of livestock killed by tigers and leopards in India to model and map observed attack risk, and surveyed owners of livestock killed by tigers and leopards for their rankings of threats across habitats to map perceived attack risk. Observed tiger risk to livestock was greatest near dense forests and at moderate distances from human activity while leopard risk was greatest near open vegetation. People accurately perceived spatial differences between tiger and leopard hunting patterns, expecting greater threat in areas with high values of observed risk for both carnivores. Owners’ perception of threats largely did not depend on environmental conditions surrounding their village (spatial location, dominant land-use or observed carnivore risk). Surveys revealed that owners who previously lost livestock to carnivores used more livestock protection methods than those who had no prior losses, and that owners who had recently lost livestock for the first time expressed greater interest in changing their protection methods than those who experienced prior losses. Our findings suggest that in systems where realities and perceptions of carnivore risk align, conservation programs and policies can optimize conservation outcomes by (1) improving the effectiveness of livestock protection methods and (2) working with owners who have recently lost livestock and are most willing to invest effort in adapting protection strategies to mitigate human-carnivore conflict. PMID:27617831
NASA Astrophysics Data System (ADS)
Bouwens, Rychard; Morashita, Takahiro; Stefanon, Mauro; Magee, Dan
2018-05-01
The combination of observations taken by Hubble and Spitzer revealed the unexpected presence of sources as bright as our own Milky Way as early as 400 Myr after the Big Bang, potentially highlighting a new highly efficient regime for star formation in L>L* galaxies at very early times. Yet, the sample of high-quality z>8 galaxies with both HST and Spitzer/IRAC imaging is still small, particularly at the highest luminosities. We propose here to remedy this situation and use Spitzer/IRAC to efficiently follow up the most promising z>8 sources from our Hubble Brightest of Reionizing Galaxies (BoRG) survey, which covers a footprint on the sky similar to CANDELS, provides a deeper search than ground-based surveys like UltraVISTA, and is robust against cosmic variance because of its 210 independent lines of sight. The proposed new 3.6 micron observations will continue our Spitzer cycle 12 and 13 BORG911 programs, targeting 15 additional fields, leveraging over 200 new HST orbits to identify a final sample of about 8 bright galaxies at z >= 8.5. For optimal time use (just 20 hours), our goal is to readily discriminate between z>8 sources (undetected or marginally detected in IRAC) and z ~ 2 interlopers (strongly detected in IRAC) with just 1-2 hours per pointing. The high-quality candidates that we will identify with IRAC will be ideal targets for further studies investigating the ionization state of the distant universe through near-IR Keck/VLT spectroscopy. They will also be uniquely suited to measurement of the redshift and stellar population properties through JWST/NIRSPEC observations, with the potential to elucidate how the first generations of stars are assembled in the earliest stages of the epoch of reionization.
Pärgmäe, P; Martins, N; Rodríguez, D; Christopoulos, P; Werner, H M J
2011-01-01
To review compliance with the European Working Time Directive (EWTD) in different teaching hospitals across Europe and its consequences for training. This is an observational, descriptive, cross-sectional study. The sample consists of answers from trainees selected by the representatives of 29 European Network of Trainees in Ob/Gyn (ENTOG) member countries to a survey designed by the ENTOG Executive. The survey content was based on a joint survey by the Royal College of Obstetricians and Gynaecologists (RCOG) and the Royal College for Paediatrics (RCP), carried out in 2008, but adapted for use at a European level. An answer rate of 75% was obtained. Only 5 countries out of 29 were compliant with the EWTD two months before the compulsory adherence. Countries needed to introduce 1 to 4 changes to the system to make the rotas compliant. A positive effect on the balance between work and private life was noted in 87% of all responses. Trainees note the need to further improve training programmes in order to maintain the same quality of training and continuous care of patients. Steps towards implementing the EWTD are being made. Trainees should be involved in its introduction to optimize training conditions under the EWTD. Countries that still struggle to introduce the directive may learn from countries that are already compliant. It is suggested to organize a survey at senior society level to gain additional information and further investigate the effects on training quality and patient care.
Optimal Observations for Variational Data Assimilation
NASA Technical Reports Server (NTRS)
Koehl, Armin; Stammer, Detlef
2003-01-01
An important aspect of ocean state estimation is the design of an observing system that allows the efficient study of climate aspects of the ocean. A solution to the design problem is presented here in terms of optimal observations that emerge as nondimensionalized singular vectors of the modified data resolution matrix. The actual computation is feasible only for scalar quantities in the limit of large observational errors. In the framework of a 1° resolution North Atlantic primitive equation model, it is demonstrated that such optimal observations, when applied to determining the strength of the volume and heat transport across the Greenland-Scotland ridge, perform significantly better than traditional section data. On seasonal to inter-annual time-scales, optimal observations are located primarily along the continental shelf, and information about heat transport, wind stress and stratification is communicated via boundary waves and advective processes. On time-scales of about a month, sea surface height observations appear to be more efficient in reconstructing the cross-ridge heat transport than hydrographic observations. Optimal observations also provide a tool for understanding how the ocean state is affected by anomalies of integral quantities such as meridional heat transport.
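A minimal sketch of the underlying linear algebra, under strong assumptions: a diagonal observation error covariance R and an adjoint-derived sensitivity vector g (both invented here). For a scalar target quantity the modified data resolution matrix is rank one, so its only nontrivial singular vector is the normalized, error-weighted sensitivity.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 300                              # candidate observation points (hypothetical)
g = rng.normal(size=n)               # sensitivity of a scalar target (e.g., cross-ridge
                                     # heat transport) to each observation, from an adjoint
sigma2 = rng.uniform(0.5, 2.0, n)    # observational error variances (diagonal R)

# Nondimensionalize by the observational errors; for a scalar target the
# modified data resolution matrix is the rank-one outer product v v^T,
# so its only nontrivial singular vector is v itself.
v = g / np.sqrt(sigma2)
v /= np.linalg.norm(v)

# Large |v_i| marks observations that constrain the target most efficiently.
top = np.argsort(-np.abs(v))[:10]
print("ten most informative candidate sites:", top)
```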
Slip Rates of Main Active Fault Zones Through Turkey Inferred From GPS Observations
NASA Astrophysics Data System (ADS)
Ozener, H.; Aktug, B.; Dogru, A.; Tasci, L.; Acar, M.; Emre, O.; Yilmaz, O.; Turgut, B.; Halicioglu, K.; Sabuncu, A.; Bal, O.; Eraslan, A.
2015-12-01
The Active Fault Map of Turkey was revised and published by the General Directorate of Mineral Research and Exploration in 2012. This map reveals about 500 faults that can generate earthquakes. In order to understand the earthquake potential of these faults, their slip rates must be determined. Although many regional and local studies were performed in the past, the slip rates of the active faults in Turkey have not been determined. In this study, we carry out block modelling, the most common method for estimating slip rates. The GPS velocities required for block modelling are being compiled from published studies and from the raw data provided, and a combined velocity field is formed. To form a homogeneous velocity field, different stochastic models will be used and the optimal velocity field will be derived. In the literature, GPS site velocities computed for different purposes and published are combined globally, and the combined velocity field is used in the analysis of strain accumulation. We also aim to develop optimal stochastic models for combining the velocity data. Real-time, survey-mode and published GPS observations are being combined in this study, and we also perform new GPS observations. Furthermore, micro-blocks and main fault zones from the Active Fault Map of Turkey will be determined, and the homogeneous velocity field will be used to infer slip rates of these active faults. Here, we present the results of the first year of the study. This study is supported by the Scientific and Technological Research Council of Turkey (TUBITAK)-CAYDAG under grant no. 113Y430.
Examining the Potential of LSST to Contribute to Exoplanet Discovery
NASA Astrophysics Data System (ADS)
Lund, Michael B.; Pepper, Joshua; Jacklin, Savannah; Stassun, Keivan G.
2018-01-01
The Large Synoptic Survey Telescope (LSST), currently under construction in Chile with scheduled first light in 2019, will be one of the major sources of data in the next decade and is one of the top priorities expressed in the last Decadal Survey. Because LSST is intended to address a range of science questions, the LSST community is still working on optimizing the survey's observing strategy. With a survey area that will cover half the sky in 6 bands, providing photometric data on billions of stars from 16th to 24th magnitude, LSST can be leveraged to contribute to exoplanet science. In particular, LSST has the potential to detect exoplanets around stellar populations that are not usually included in transiting exoplanet searches. This includes searching for exoplanets around red and white dwarfs, stars in the galactic plane and bulge, stellar clusters, and potentially even the Magellanic Clouds. In probing these varied stellar populations, relative exoplanet frequency can be examined, and in turn, LSST may be able to provide fresh insight into how stellar environment plays a role in planetary formation rates. Our initial work on this project has been to demonstrate that, even with the limitations of the LSST cadence, exoplanets would be recoverable and detectable in the LSST photometry, and to show that exoplanets are indeed worth including in discussions of the variable sources to which LSST can contribute. We have continued to expand this work to examine exoplanets around stars belonging to various stellar populations, both to show the types of systems that LSST is capable of discovering and to determine the potential exoplanet yields using standard algorithms already implemented in transiting exoplanet searches, as well as how changes to LSST's observing schedule may impact both of these results.
Validation of the Chinese Version of the Social Emotional Health Survey-Primary
ERIC Educational Resources Information Center
Wang, Cixin; Yang, Chunyan; Jiang, Xu; Furlong, Michael
2018-01-01
The Social Emotional Health Survey-Primary (SEHS-P) was originally developed to assess U.S. elementary students' positive psychological traits: gratitude, zest, optimism, and persistence, and the higher-order latent construct of covitality. The present study evaluated the validity of a Chinese version of SEHS-P with a sample of 653 Chinese…
Teacher Educators Developing Professional Roles: Frictions between Current and Optimal Practices
ERIC Educational Resources Information Center
Meeus, Wil; Cools, Wouter; Placklé, Inge
2018-01-01
This article reports on a study of the professional learning of Flemish teacher educators. In the first part, an exemplary survey was conducted in order to compile an inventory of the existing types of education initiatives for teacher educators in Flanders. An electronic survey was then conducted in order to identify the professional needs of…
Optimal allocation of invasive species surveillance with the maximum expected coverage concept
Denys Yemshanov; Robert G. Haight; Frank H. Koch; Bo Lu; Robert Venette; D. Barry Lyons; Taylor Scarr; Krista Ryall; Brian. Leung
2015-01-01
We address the problem of geographically allocating scarce survey resources to detect pests in their pathways of introduction given information about their likelihood of movement between origins and destinations. We introduce a model for selecting destination sites for survey that departs from the aim of reducing propagule pressure (PP) in pest destinations and instead...
Survey of EPA facilities for solar thermal energy applications
NASA Technical Reports Server (NTRS)
Nelson, E. V.; Overly, P. T.; Bell, D. M.
1980-01-01
A study was done to assess the feasibility of applying solar thermal energy systems to EPA facilities. A survey was conducted to determine those EPA facilities where solar energy could best be used. These systems were optimized for each specific application and the system/facility combinations were ranked on the basis of greatest cost effectiveness.
Impact of a smoking ban in public places: a rapid assessment in the Seychelles.
Viswanathan, Bharathi; Plumettaz, Chloé; Gedeon, Jude; Bovet, Pascal
2011-11-01
We assessed the impact of a smoking ban in hospitality venues in the Seychelles 9 months after the legislation was implemented. Survey officers observed compliance with the smoking ban in the 38 most popular hospitality venues and administered a structured questionnaire to two customers, two workers and one manager in each venue. Virtually no customers or workers were seen smoking in the indoor premises. Patrons, workers and managers largely supported the ban. The personnel of the hospitality venues reported that most smokers had no difficulty refraining from smoking. However, a third of workers did not systematically request customers to stop smoking, and half of them did not report adequate training. Workers reported improved health. No substantial change in the number of customers was noted. The ban on public smoking was generally well implemented in hospitality venues, but some less than optimal findings suggest the need for adequate training of workers and strengthened enforcement measures. The simple and inexpensive methodology used in this rapid survey may be a useful approach to evaluating the implementation and impact of clean air policies in low- and middle-income countries.
ART-XC: A Medium-energy X-ray Telescope System for the Spectrum-R-Gamma Mission
NASA Technical Reports Server (NTRS)
Arefiev, V.; Pavlinsky, M.; Lapshov, I.; Thachenko, A.; Sazonov, S.; Revnivtsev, M.; Semena, N.; Buntov, M.; Vikhlinin, A.; Gubarev, M.;
2008-01-01
The ART-XC instrument is an X-ray grazing-incidence telescope system in an ABRIXAS-type optical configuration optimized for the survey observational mode of the Spectrum-RG astrophysical mission which is scheduled to be launched in 2011. ART-XC has two units, each equipped with four identical X-ray multi-shell mirror modules. The optical axes of the individual mirror modules are not parallel but are separated by several degrees to permit the four modules to share a single CCD focal plane detector, 1/4 of the area each. The 450-micron-thick pnCCD (similar to the adjacent eROSITA telescope detector) will allow detection of X-ray photons up to 15 keV. The field of view of the individual mirror module is about 18 x 18 arcmin^2 and the sensitivity of the ART-XC system for 4 years of survey will be better than 10^-12 erg s^-1 cm^-2 over the 4-12 keV energy band. This will allow the ART-XC instrument to discover several thousand new AGNs.
Black Holes and the Centers of Galaxies
NASA Astrophysics Data System (ADS)
Richstone, Douglas
1997-07-01
We propose to continue our survey of centers of nearby galaxies. The major goal for Cycle 7 is to survey an unbiased set of galaxies with a potentially wide range of black hole masses. The results will constrain the prevalence and formation of massive black holes and their relationship to AGNs. Over the last several years, we have used HST to characterize the scaling laws for galaxy centers, to identify an apparent dichotomy in galaxy types based on their central light profiles, and to identify new black hole candidates and confirm ground-based results on known candidates. In the STIS epoch, we wish to capitalize on the presence of a genuine slit spectrograph to study the central stellar dynamics of a large set of systematically selected elliptical and S0 galaxies. The sample for this cycle has been carefully chosen to optimize our leverage on the character of a proposed correlation of black hole mass with galaxy mass. In addition, high-S/N observations of line profiles should permit us to distinguish between BHs and anisotropic stellar orbits, a critical degeneracy that has long plagued this subject.
Feehan, M; Walsh, M; Godin, J; Sundwall, D; Munger, M A
2017-12-01
In order to improve public health, it is necessary to facilitate patients' easy access to affordable, high-quality primary health care, and one enhanced approach may be to provide primary healthcare services in the community pharmacy setting. Discrete choice experiments to evaluate patient demand for services in pharmacy have been relatively limited and hampered by a focus on only a few service alternatives, most addressing changes in more traditional pharmacy services. The study aim was to gauge patient preferences explicitly for primary healthcare services that could be delivered through community pharmacy settings in the USA, using a very large sample to accommodate multiple service delivery options. An online survey was administered to a total of 9202 adult patients from the general population. A subsequent online survey was administered to 50 payer reimbursement decision-makers. The patient survey included a discrete choice experiment (DCE) which showed competing scenarios describing primary care service offerings. The respondents chose which scenario would be most likely to induce them to switch from their current pharmacy, and an optimal patient primary care service model was derived. The likelihood that this model would be reimbursed was then determined in the payer survey. The final optimal service configuration that would maximize patient preference included the pharmacy: offering appointments to see a healthcare provider in the pharmacy, having access to the patient's full medical record, providing point-of-care diagnostic testing, offering preventive health screening, providing limited physical examinations such as measuring vital signs, and prescribing drugs in the pharmacy. The optimal model had the pharmacist as the provider; however, little change in demand was evident if the provider was a nurse practitioner or physician assistant. The demand for this optimal model was 2-fold higher (25.5%; 95% Bayesian precision interval (BPI) 23.5%-27.0%) than for a base pharmacy offering minimal primary care services (12.6%; 95% BPI 12.2%-13.2%), and was highest among Hispanic (30.6%; 95% BPI: 25.7%-34.3%) and African American patients (30.7%; 95% BPI: 27.1%-35.2%). In the second reimbursement decision-maker survey, the majority (66%) indicated their organization would be likely to reimburse the services described in the optimal patient model if provided in the pharmacy setting. This United States national study provides empirical support for a model of providing primary care services through community pharmacy settings that would increase access, with the potential to improve the public health. © 2017 John Wiley & Sons Ltd.
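Demand shares such as 25.5% versus 12.6% are typically computed from a multinomial logit model fitted to the DCE choices. The sketch below shows only the share computation; the scenario utilities are hypothetical placeholders, not estimates from this study.

```python
import numpy as np

def logit_shares(utilities):
    """Multinomial logit choice shares: P_j = exp(V_j) / sum_k exp(V_k)."""
    v = np.asarray(utilities, dtype=float)
    e = np.exp(v - v.max())          # subtract the max for numerical stability
    return e / e.sum()

# Hypothetical part-worth utilities for three competing scenarios:
# staying with the current pharmacy, a base (minimal-services) pharmacy,
# and the optimal primary-care service model.
V = {"stay": 1.2, "base": 0.1, "optimal": 0.8}
for name, share in zip(V, logit_shares(list(V.values()))):
    print(f"{name}: {share:.1%}")
```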
2017-03-01
OCW from ages 25–30 based on the 2015 U.S. Department of Health and Human Services Report and the Navy's biennial Pregnancy and Parenthood Survey ... Survey, adds the optimal childbearing window (OCW) to illustrate how pregnancy timing would align between ages 25–30 on each SWO career path. Each ... U.S. Department of Health and Human Services report and the biennial U.S. Navy Pregnancy and Parenthood Survey, a majority of women are having
LQG/LTR Optimal Attitude Control of Small Flexible Spacecraft Using Free-Free Boundary Conditions
2006-08-03
particular on attitude control of flexible space structures. Croopnick et al. [50] present a literature survey in the areas of attitude control ... modeling and control of space structures is compiled by Nurre et al. [161]. One important thing to note from the surveys listed above is the focus on the ... papers surveyed by Croopnick et al. in 1979, by Meirovitch in 1979, Balas in 1982, and Nurre et al. in 1984. The focus of the papers included in all
Global Seabed Materials and Habitats Mapped: The Computational Methods
NASA Astrophysics Data System (ADS)
Jenkins, C. J.
2016-02-01
What the seabed is made of has proven difficult to map on the scale of whole ocean basins. Direct sampling and observation can be augmented with proxy-parameter methods such as acoustics. Both avenues are essential to obtain enough detail and coverage, and also to validate the mapping methods. We focus on the direct observations, such as samplings, photo and video, probes, diver and sub reports, and surveyed features. These are often in word-descriptive form: over 85% of the records for site materials are in this form, whether as sample/view descriptions or classifications, or described parameters such as consolidation, color, odor, structures and components. Descriptions are absolutely necessary for unusual materials and for processes - in other words, for research. The dbSEABED project not only has the largest collection of seafloor materials data worldwide, but also uses advanced computational mathematics to obtain the best possible coverage and detail. Among those techniques are linguistic text analysis (e.g., Natural Language Processing, NLP), fuzzy set theory (FST), and machine learning (ML, e.g., Random Forest). These techniques allow efficient and accurate import of huge datasets, thereby optimizing the data that exist. They merge quantitative and qualitative types of data into rich parameter sets, and extrapolate where the data are sparse for best map production. The dbSEABED data resources are now very widely used worldwide in oceanographic research, environmental management, the geosciences, engineering and surveying.
NASA Astrophysics Data System (ADS)
Jasper, Cameron A.
Although aquifer recharge and recovery systems are a sustainable, decentralized, low-cost, and low-energy approach for the reclamation, treatment, and storage of post-treatment wastewater, they can suffer from poor infiltration rates and the development of a near-surface clogging layer within infiltration ponds. One such aquifer recharge and recovery system, the Aurora Water site in Colorado, U.S.A., functions at about 25% of its predicted capacity to recharge floodplain deposits by flooding infiltration ponds with post-treatment wastewater extracted from river bank aquifers along the South Platte River. The underwater self-potential method was developed to survey self-potential signals at the ground surface in a flooded infiltration pond for mapping infiltration pathways. A method for using heat as a groundwater tracer within the infiltration pond used an array of in situ high-resolution temperature sensing probes. Both relatively positive and negative underwater self-potential anomalies are consistent with observed recovery well pumping rates and specific discharge estimates from temperature data. Results from electrical resistivity tomography and electromagnetics surveys provide consistent electrical conductivity distributions associated with sediment textures. A lab method was developed for resistivity tests of near-surface sediment samples. Forward numerical modeling synthesizes the geophysical information to best match observed self-potential anomalies and provide permeability distributions, which is important for effective aquifer recharge and recovery system design and optimization strategy development.
A Survey on Multimedia-Based Cross-Layer Optimization in Visual Sensor Networks
Costa, Daniel G.; Guedes, Luiz Affonso
2011-01-01
Visual sensor networks (VSNs) comprised of battery-operated electronic devices endowed with low-resolution cameras have expanded the applicability of a series of monitoring applications. Those types of sensors are interconnected by ad hoc error-prone wireless links, imposing stringent restrictions on available bandwidth, end-to-end delay and packet error rates. In such a context, multimedia coding is required for data compression and error-resilience, also ensuring energy preservation over the path(s) toward the sink and improving the end-to-end perceptual quality of the received media. Cross-layer optimization may enhance the expected efficiency of VSN applications, disrupting the conventional information flow of the protocol layers. When the inner characteristics of the multimedia coding techniques are exploited by cross-layer protocols and architectures, higher efficiency may be obtained in visual sensor networks. This paper surveys recent research on multimedia-based cross-layer optimization, presenting the proposed strategies and mechanisms for transmission rate adjustment, congestion control, multipath selection, energy preservation and error recovery. We note that many multimedia-based cross-layer optimization solutions have been proposed in recent years, each one bringing a wealth of contributions to visual sensor networks. PMID:22163908
Alcohol Warning Label Awareness and Attention: A Multi-method Study.
Pham, Cuong; Rundle-Thiele, Sharyn; Parkinson, Joy; Li, Shanshi
2018-01-01
Evaluation of alcohol warning labels requires careful consideration to ensure that research captures more than awareness, given that labels may not be prominent enough to attract attention. This study investigates attention to current in-market alcohol warning labels and examines whether attention can be enhanced through theoretically informed design. Attention scores obtained through self-report methods are compared to objective measures (eye-tracking). A multi-method experimental design was used, delivering four conditions, namely control, colour, size, and colour and size. The first study (n = 559) involved a self-report survey to measure attention. The second study (n = 87) utilized eye-tracking to measure fixation count, fixation duration, and time to first fixation. Analysis of Variance (ANOVA) was utilized. Eye-tracking identified that 60% of participants looked at the current in-market alcohol warning label, while 81% looked at the optimized design (larger and red). In line with observed attention, self-reported attention increased for the optimized design. The current study casts doubt on the dominant practices (largely self-report) that have been used to evaluate alcohol warning labels. Awareness cannot be used in isolation to assess warning label effectiveness in cases where attention does not occur 100% of the time. Mixed methods permit objective data collection methodologies to be triangulated with surveys to assess warning label effectiveness. Attention should be incorporated as a measure in warning label effectiveness evaluations. Colour and size changes to the existing Australian warning labels, aided by theoretically informed design, increased attention. © The Author 2017. Medical Council on Alcohol and Oxford University Press. All rights reserved.
Optimal observation network design for conceptual model discrimination and uncertainty reduction
NASA Astrophysics Data System (ADS)
Pham, Hai V.; Tsai, Frank T.-C.
2016-02-01
This study expands the Box-Hill discrimination function to design an optimal observation network to discriminate conceptual models and, in turn, identify a most favored model. The Box-Hill discrimination function measures the expected decrease in Shannon entropy (for model identification) before and after the optimal design for one additional observation. This study modifies the discrimination function to account for multiple future observations that are assumed spatiotemporally independent and Gaussian-distributed. Bayesian model averaging (BMA) is used to incorporate existing observation data and quantify future observation uncertainty arising from conceptual and parametric uncertainties in the discrimination function. In addition, the BMA method is adopted to predict future observation data in a statistical sense. The design goal is to find optimal locations and least data via maximizing the Box-Hill discrimination function value subject to a posterior model probability threshold. The optimal observation network design is illustrated using a groundwater study in Baton Rouge, Louisiana, to collect additional groundwater heads from USGS wells. The sources of uncertainty creating multiple groundwater models are geological architecture, boundary condition, and fault permeability architecture. Impacts of considering homoscedastic and heteroscedastic future observation data and the sources of uncertainties on potential observation areas are analyzed. Results show that heteroscedasticity should be considered in the design procedure to account for various sources of future observation uncertainty. After the optimal design is obtained and the corresponding data are collected for model updating, total variances of head predictions can be significantly reduced by identifying a model with a superior posterior model probability.
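A minimal numerical sketch of a Box-Hill-style criterion for one additional observation follows, under strong simplifying assumptions: each conceptual model's BMA predictive distribution at a candidate site is summarized by a Gaussian mean and variance, and the expectation is estimated by Monte Carlo. All model probabilities and predictive moments are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def expected_entropy_decrease(prior, mu, var, n_mc=5000):
    """Expected drop in Shannon entropy over model probabilities after one
    new observation at a candidate site (Monte Carlo over the BMA mixture)."""
    h0 = entropy(prior)
    m = rng.choice(len(prior), size=n_mc, p=prior)   # sample a "true" model
    y = rng.normal(mu[m], np.sqrt(var[m]))           # simulate the future datum
    like = np.exp(-0.5 * (y[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
    post = prior * like                              # Bayesian update per sample
    post /= post.sum(axis=1, keepdims=True)
    return h0 - np.mean([entropy(p) for p in post])

# Three hypothetical conceptual models; site B separates them better than site A.
prior = np.array([0.5, 0.3, 0.2])
var = np.full(3, 0.25)
print("site A:", expected_entropy_decrease(prior, np.array([10.0, 10.2, 10.1]), var))
print("site B:", expected_entropy_decrease(prior, np.array([10.0, 12.0, 14.0]), var))
```

A design procedure would evaluate this criterion over candidate locations and keep those maximizing the expected entropy decrease, subject to the posterior model probability threshold the abstract describes.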
Optimal and Autonomous Control Using Reinforcement Learning: A Survey.
Kiumarsi, Bahare; Vamvoudakis, Kyriakos G; Modares, Hamidreza; Lewis, Frank L
2018-06-01
This paper reviews the current state of the art of reinforcement learning (RL)-based feedback control solutions for optimal regulation and tracking of single and multiagent systems. Existing RL solutions to both optimal H2 and H∞ control problems, as well as graphical games, are reviewed. RL methods learn the solution to optimal control and game problems online, using measured data along the system trajectories. We discuss Q-learning and the integral RL algorithm as core algorithms for discrete-time (DT) and continuous-time (CT) systems, respectively. Moreover, we discuss a new direction of off-policy RL for both CT and DT systems. Finally, we review several applications.
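As a pointer to the DT core algorithm named above, the following is a textbook tabular Q-learning sketch on a hypothetical chain environment; the survey itself treats function-approximation settings far beyond this toy.

```python
import numpy as np

# Tabular Q-learning on a toy 5-state chain: action 1 moves right, action 0
# moves left, and reaching the last state yields a reward of 1.
n_states, n_actions = 5, 2
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.1, 0.95, 0.1
rng = np.random.default_rng(0)

def step(s, a):
    s2 = min(s + 1, n_states - 1) if a == 1 else max(s - 1, 0)
    r = 1.0 if s2 == n_states - 1 else 0.0
    return s2, r

s = 0
for _ in range(10000):
    # Epsilon-greedy action selection.
    a = rng.integers(n_actions) if rng.random() < eps else int(np.argmax(Q[s]))
    s2, r = step(s, a)
    # Q-learning update: move Q(s,a) toward the Bellman target r + gamma * max_a' Q(s',a').
    Q[s, a] += alpha * (r + gamma * Q[s2].max() - Q[s, a])
    s = 0 if s2 == n_states - 1 else s2   # restart the episode at the goal
print(Q)
```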
NASA Astrophysics Data System (ADS)
Liu, Derong; Huang, Yuzhu; Wang, Ding; Wei, Qinglai
2013-09-01
In this paper, an observer-based optimal control scheme is developed for unknown nonlinear systems using adaptive dynamic programming (ADP) algorithm. First, a neural-network (NN) observer is designed to estimate system states. Then, based on the observed states, a neuro-controller is constructed via ADP method to obtain the optimal control. In this design, two NN structures are used: a three-layer NN is used to construct the observer which can be applied to systems with higher degrees of nonlinearity and without a priori knowledge of system dynamics, and a critic NN is employed to approximate the value function. The optimal control law is computed using the critic NN and the observer NN. Uniform ultimate boundedness of the closed-loop system is guaranteed. The actor, critic, and observer structures are all implemented in real-time, continuously and simultaneously. Finally, simulation results are presented to demonstrate the effectiveness of the proposed control scheme.
NASA Astrophysics Data System (ADS)
Mulia, Iyan E.; Gusman, Aditya Riadi; Satake, Kenji
2017-12-01
Recently, numerous tsunami observation networks have been deployed in several major tsunamigenic regions. However, guidance on where to optimally place the measurement devices is limited. This study presents a methodological approach to select strategic observation locations for the purpose of tsunami source characterization, particularly in terms of the fault slip distribution. Initially, we identify favorable locations and determine the initial number of observations. These locations are selected based on the extrema of empirical orthogonal function (EOF) spatial modes. To further improve the accuracy, we apply an optimization algorithm called mesh adaptive direct search to remove redundant measurement locations from the EOF-generated points. We test the proposed approach using multiple hypothetical tsunami sources around the Nankai Trough, Japan. The results suggest that the optimized observation points can produce more accurate fault slip estimates with a considerably smaller number of observations than the existing tsunami observation networks.
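The EOF-based first stage can be sketched in a few lines. The ensemble below is a random stand-in for simulated tsunami fields; only the structure of the selection (extrema of leading spatial modes, then deduplication) is intended to be representative.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical ensemble: n_scenarios simulated tsunami fields on n_grid points.
n_scenarios, n_grid = 60, 1000
X = rng.normal(size=(n_scenarios, n_grid))        # stand-in for simulated wave fields

# EOF (principal-component) decomposition of the anomaly matrix.
A = X - X.mean(axis=0)
_, s, Vt = np.linalg.svd(A, full_matrices=False)  # rows of Vt are EOF spatial modes

# Candidate observation points: extrema (max and min) of each leading mode.
n_modes = 5
candidates = []
for mode in Vt[:n_modes]:
    candidates += [int(np.argmax(mode)), int(np.argmin(mode))]
candidates = sorted(set(candidates))              # drop duplicates
print("EOF-selected candidate gauges:", candidates)

# A derivative-free optimizer (e.g., mesh adaptive direct search) would then
# prune redundant points from this candidate set, as the abstract describes.
```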
An Optimal Current Observer for Predictive Current Controlled Buck DC-DC Converters
Min, Run; Chen, Chen; Zhang, Xiaodong; Zou, Xuecheng; Tong, Qiaoling; Zhang, Qiao
2014-01-01
In digital current-mode controlled DC-DC converters, conventional current sensors might not provide isolation at minimized price, power loss and size. Therefore, a current observer, which can be realized in the digital controller itself, is a possible substitute. However, the observed current may diverge due to the parasitic resistors and the forward conduction voltage of the diode. Moreover, divergence of the observed current causes steady-state errors in the output voltage. In this paper, an optimal current observer is proposed. It achieves the highest observation accuracy by compensating for all the known parasitic parameters. A buck converter is implemented employing the optimal current observer-based predictive current controller. The converter has a convergently and accurately observed inductor current, and shows a better transient response than a conventional voltage-mode controlled converter. Besides, cost, power loss and size are minimized since the strategy requires no additional hardware for current sensing. The effectiveness of the proposed optimal current observer is demonstrated experimentally. PMID:24854061
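The divergence problem and its remedy are easy to see in simulation. The sketch below uses an averaged discrete-time buck-converter model with hypothetical component values; the naive observer ignores parasitics and drifts without bound, while the parasitic-compensated observer, in the spirit of the proposed approach, tracks the true current.

```python
# Averaged discrete-time inductor-current observers for a buck converter.
Ts, L = 1e-5, 100e-6            # switching period, inductance (hypothetical values)
Vin, Vd = 12.0, 0.4             # input voltage, diode forward drop
RL, Ron, Rd = 0.08, 0.05, 0.03  # parasitic resistances: inductor, switch, diode

def plant(i, d, vo):
    """'True' averaged inductor-current dynamics including all parasitics."""
    vL = d * Vin - (1 - d) * Vd - (RL + d * Ron + (1 - d) * Rd) * i - vo
    return i + Ts / L * vL

def observer_naive(i, d, vo):
    """Ideal-component observer: drifts because parasitics are ignored."""
    return i + Ts / L * (d * Vin - vo)

def observer_compensated(i, d, vo):
    """Observer compensating the parasitic resistances and the diode drop."""
    vL = d * Vin - (1 - d) * Vd - (RL + d * Ron + (1 - d) * Rd) * i - vo
    return i + Ts / L * vL

i_true = i_n = i_c = 0.0
d, vo = 0.5, 5.0                # fixed duty cycle and output voltage for the sketch
for _ in range(2000):
    i_true = plant(i_true, d, vo)
    i_n = observer_naive(i_n, d, vo)
    i_c = observer_compensated(i_c, d, vo)
print(f"true {i_true:.3f} A, naive {i_n:.3f} A, compensated {i_c:.3f} A")
```

With these numbers the naive estimate grows without bound while the compensated estimate settles on the true current, mirroring the divergence and steady-state-error argument in the abstract.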
Teaching and learning communication skills in physiotherapy: what is done and how should it be done?
Parry, Ruth H; Brown, Kay
2009-12-01
To survey practice and opinion regarding school-based teaching of communication skills, to summarise relevant research evidence from physiotherapy and beyond, to reflect on practice in light of the evidence, and to propose associated recommendations. Survey using customised questionnaires. Basic descriptive statistical analysis and thematic content analysis were used. The results were compared with evidence from systematic reviews to derive recommendations. Survey participants and setting: educators in all UK centres delivering physiotherapy qualifying programmes in 2006. A response rate of 69% was achieved. The majority of respondents reported delivering communication-specific modules. Lecturing was common, and more experiential methods were also used. Assessment was mainly by written work. Educators commented on the challenges and strategies involved in student engagement, provision of authentic experiences, availability of teaching time and expertise, and physiotherapy-specific teaching resources. Evidence from allied health, medical and nursing education research emphasises the importance of experiential teaching, formative feedback, observational assessment and a substantial evidence base on which to ground course content. In physiotherapy, the latter is emerging but incomplete. There are also gaps in direct evidence about the advantages or otherwise of stand-alone modules and the benefits of pre-qualification communication training. Evidence suggests that effective training requires substantial teaching time, expertise and a body of empirical research on specific communication practices and their effects. Curriculum designers and educators should endeavour to maximise the degree to which training in this area is experiential, provide training when students have already had some contact with patients, and assess students by observation if at all possible. Due to gaps in the evidence, some important questions about optimal practice remain unanswered.
On-line Machine Learning and Event Detection in Petascale Data Streams
NASA Astrophysics Data System (ADS)
Thompson, David R.; Wagstaff, K. L.
2012-01-01
Traditional statistical data mining involves off-line analysis in which all data are available and equally accessible. However, petascale datasets have challenged this premise, since it is often impossible to store, let alone analyze, the relevant observations. This has led the machine learning community to investigate adaptive processing chains in which data mining is a continuous process. Here pattern recognition permits triage and followup decisions at multiple stages of a processing pipeline. Such techniques can also benefit new astronomical instruments such as the Large Synoptic Survey Telescope (LSST) and Square Kilometre Array (SKA) that will generate petascale data volumes. We summarize some machine learning perspectives on real-time data mining, with representative cases of astronomical applications and event detection in high-volume data streams. The first is a "supervised classification" approach currently used for transient event detection at the Very Long Baseline Array (VLBA). It injects known signals of interest - faint single-pulse anomalies - and tunes system parameters to recover these events. This permits meaningful event detection for diverse instrument configurations and observing conditions whose noise cannot be well-characterized in advance. Second, "semi-supervised novelty detection" finds novel events based on statistical deviations from previous patterns. It detects outlier signals of interest while considering known examples of false alarm interference. Applied to data from the Parkes pulsar survey, the approach identifies anomalous "peryton" phenomena that do not match previous event models. Finally, we consider online light curve classification that can trigger adaptive followup measurements of candidate events. Classifier performance analyses suggest optimal survey strategies, and permit principled followup decisions from incomplete data. These examples trace a broad range of algorithm possibilities available for online astronomical data mining. This talk describes research performed at the Jet Propulsion Laboratory, California Institute of Technology. Copyright 2012, All Rights Reserved. U.S. Government support acknowledged.
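A minimal streaming sketch of the novelty-detection idea: maintain running statistics with Welford's online algorithm and flag samples that deviate strongly from the distribution seen so far. The deployed systems described above use far richer models; the threshold and data here are hypothetical.

```python
import math
import random

class StreamingNoveltyDetector:
    """Flag samples that deviate from the running distribution (Welford stats)."""

    def __init__(self, z_thresh=5.0, warmup=10):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0
        self.z_thresh, self.warmup = z_thresh, warmup

    def update(self, x):
        """Return True if x is anomalous w.r.t. the data seen so far."""
        novel = False
        if self.n > self.warmup:
            std = math.sqrt(self.m2 / (self.n - 1))
            novel = std > 0 and abs(x - self.mean) / std > self.z_thresh
        # Welford's online update of the mean and (unnormalized) variance.
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return novel

random.seed(0)
det = StreamingNoveltyDetector()
stream = [random.gauss(0, 1) for _ in range(1000)] + [12.0]  # one injected anomaly
print("anomalies at indices:", [i for i, x in enumerate(stream) if det.update(x)])
```

Because the statistics update one sample at a time, the detector never needs to store the stream, which is the property that matters at petascale.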
NASA Astrophysics Data System (ADS)
Arrighi, Chiara; Campo, Lorenzo
2017-04-01
In recent years, concern about the economic losses and loss of life due to urban floods has grown hand in hand with the numerical capability to simulate such events. The large amount of computational power needed to address the problem (simulating a flood in complex terrain such as a medium-large city) is only one of the issues. Others include the general lack of exhaustive observations during the event (exact extent, dynamics, water levels reached in different parts of the involved area), which are needed for calibration and validation of the model; the need to consider sewer effects; and the availability of a correct and precise description of the problem geometry. In large cities, topographic surveys generally provide a limited number of measured points, while a complete hydraulic simulation needs a detailed description of the terrain over the whole computational domain. LIDAR surveys can achieve this goal, providing a comprehensive description of the terrain, although they often lack precision. In this work an optimal merging of these two sources of geometrical information, measured elevation points and a LIDAR survey, is proposed, taking into account the error variance of both. The procedure is applied to a flood-prone city over an area of approximately 35 square km, starting with a DTM from LIDAR with a spatial resolution of 1 m and 13,000 measured points. The spatial pattern of the error (LIDAR vs. points) is analysed, and the merging method is tested with a series of jackknife procedures that take into account different densities of the available points. A discussion of the results is provided.
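Inverse-variance weighting is the standard way to combine two estimates given their error variances; the toy sketch below illustrates the idea on a small grid. The paper's actual merging scheme, including how the influence of sparse points is spread spatially, may differ, and all numbers are hypothetical.

```python
import numpy as np

def merge_inverse_variance(z_lidar, var_lidar, z_pts, var_pts):
    """Combine two elevation estimates by weighting with their error variances.

    Where a surveyed point is available (non-NaN), the merged value is the
    inverse-variance-weighted mean; elsewhere the LIDAR value is kept."""
    w_l = 1.0 / var_lidar
    w_p = np.where(np.isnan(z_pts), 0.0, 1.0 / var_pts)
    z_p = np.where(np.isnan(z_pts), 0.0, z_pts)
    merged = (w_l * z_lidar + w_p * z_p) / (w_l + w_p)
    merged_var = 1.0 / (w_l + w_p)   # variance of the combined estimate
    return merged, merged_var

# Toy grid: LIDAR everywhere (sigma ~ 0.15 m), sparse points (sigma ~ 0.03 m).
z_lidar = np.array([10.12, 10.30, 10.55, 10.80])
z_pts   = np.array([10.02, np.nan, np.nan, 10.71])
merged, var = merge_inverse_variance(z_lidar, 0.15**2, z_pts, 0.03**2)
print(merged.round(3), np.sqrt(var).round(3))
```

Where a precise surveyed point exists, the merged surface is pulled strongly toward it; where only LIDAR exists, the DTM value and its larger variance pass through unchanged.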
Ergonomic Microscope: Need of the Hour
Mhaske, Shubhangi Ashok; Ahmad, Malik Ajaz; Yuwanati, Monal B.; Prabhu, Shweta; Pardhe, Nilesh
2015-01-01
Background Prolonged use of a conventional microscope can cause musculoskeletal injuries such as chronic pain syndromes, including shoulder, neck and back aches, and fatigue. Since the problems go unnoticed, the injuries can lead to serious permanent damage. This further compromises the health and welfare of the person and the institution. Hence, an understanding of ergonomics is the need of the hour in this postmodern era. In spite of a few studies and surveys on ergonomics, there is still a steep rise in musculoskeletal disorders. Aim of the Study The aim of our study was to gauge the general awareness of pathologists, microbiologists and oral pathologists towards ergonomics in their profession. Materials and Methods A cross-sectional survey-based study was designed, which included a questionnaire with multiple choice questions offering four alternatives. Professionals (pathologists, microbiologists and oral pathologists) were included in the survey. Teaching faculty (Professors, Associate Professors and Lecturers) and postgraduate students formed the study group. Results and Observations The response to the questionnaire was 100%. Less than 50% of oral pathologists were aware of the importance of ergonomics in their profession. The most common sites affected were the neck and back. A striking observation was that oral pathologists suffered from a combination of problems affecting the neck, back, eyes, shoulders, arms and wrists, along with headaches. Conclusion Improving our understanding of ergonomically designed microscopes can increase our efficiency and in turn improve our general well-being. With improvements in ergonomics, professionals would be able to modify and optimize their working conditions. Certain guidelines need to be followed by professionals to reduce the chances of musculoskeletal disorders. PMID:26155565
SERVS: the Spitzer Extragalactic Representative Volume Survey
NASA Astrophysics Data System (ADS)
Lacy, Mark; Afonso, Jose; Alexander, Dave; Best, Philip; Bonfield, David; Castro, Nieves; Cava, Antonio; Chapman, Scott; Dunlop, James; Dyke, Eleanor; Edge, Alastair; Farrah, Duncan; Ferguson, Harry; Foucaud, Sebastian; Franceschini, Alberto; Geach, Jim; Gonzales, Eduardo; Hatziminaoglou, Evanthia; Hickey, Samantha; Ivison, Rob; Jarvis, Matt; Le Fèvre, Olivier; Lonsdale, Carol; Maraston, Claudia; McLure, Ross; Mortier, Angela; Oliver, Seb; Ouchi, Masami; Parish, Glen; Perez-Fournon, Ismael; Petric, Andreea; Pierre, Mauguerite; Readhead, Tony; Ridgway, Susan; Romer, Katherine; Rottgering, Huub; Rowan-Robinson, Michael; Sajina, Anna; Seymour, Nick; Smail, Ian; Surace, Jason; Thomas, Peter; Trichas, Markos; Vaccari, Mattia; Verma, Aprajita; Xu, Kevin; van Kampen, Eelco
2008-12-01
We will use warm Spitzer to image 18 deg^2 of sky to microJy depth. This is deep enough to undertake a complete census of massive galaxies from z~6 to ~1 in a volume of ~0.8 Gpc^3, large enough to overcome the effects of cosmic variance, which place severe limitations on the conclusions that can be drawn from smaller fields. We will greatly enhance the diagnostic power of the Spitzer data by performing most of this survey in the region covered by the near-IR VISTA-VIDEO survey, and in other areas covered by near-IR, Herschel and SCUBA2 surveys. We will build complete near-infrared spectral energy distributions using the superb datasets from VIDEO, in conjunction with our Spitzer data, to derive accurate photometric redshifts and the key properties of stellar mass and star formation rate for a large sample of high-z galaxies. Obscured star formation rates and dust-shrouded BH growth phases will be uncovered by combining the Spitzer data with the Herschel and SCUBA2 surveys. We will thus build a complete picture of the formation of massive galaxies from z~6, where only about 1% of the stars in massive galaxies have formed, to z~1, where ~50% of them have formed. Our large volume will also allow us to find examples of rare objects such as high-z quasars (~10-100 at z>6.5) and high-z galaxy clusters (~20 at z>1.5 with dark halo masses >10^14 solar masses), and to evaluate how quasar activity and galaxy environment affect star formation. This survey makes nearly optimal use of warm Spitzer: (a) all of the complementary data is either taken or will be taken in the very near future, and will be immediately publicly accessible; (b) the slew overheads are relatively small; (c) the observations are deep enough to detect high-redshift galaxies but not so deep that source confusion reduces the effective survey area.
The Great Observatories All-Sky LIRG Survey: Herschel Image Atlas and Aperture Photometry
NASA Astrophysics Data System (ADS)
Chu, Jason K.; Sanders, D. B.; Larson, K. L.; Mazzarella, J. M.; Howell, J. H.; Díaz-Santos, T.; Xu, K. C.; Paladini, R.; Schulz, B.; Shupe, D.; Appleton, P.; Armus, L.; Billot, N.; Chan, B. H. P.; Evans, A. S.; Fadda, D.; Frayer, D. T.; Haan, S.; Ishida, C. M.; Iwasawa, K.; Kim, D.-C.; Lord, S.; Murphy, E.; Petric, A.; Privon, G. C.; Surace, J. A.; Treister, E.
2017-04-01
Far-infrared images and photometry are presented for 201 Luminous and Ultraluminous Infrared Galaxies [LIRGs: log(L_IR/L_⊙) = 11.00-11.99; ULIRGs: log(L_IR/L_⊙) = 12.00-12.99] in the Great Observatories All-Sky LIRG Survey (GOALS), based on observations with the Herschel Space Observatory Photodetector Array Camera and Spectrometer (PACS) and the Spectral and Photometric Imaging Receiver (SPIRE) instruments. The image atlas displays each GOALS target in the three PACS bands (70, 100, and 160 μm) and the three SPIRE bands (250, 350, and 500 μm), optimized to reveal structures at both high and low surface brightness levels, with images scaled to simplify comparison of structures in the same physical areas of ~100 × 100 kpc^2. Flux densities of companion galaxies in merging systems are provided where possible, depending on their angular separation and the spatial resolution in each passband, along with integrated system fluxes (sum of components). This data set constitutes the imaging and photometric component of the GOALS Herschel OT1 observing program, and is complementary to atlases presented for the Hubble Space Telescope, Spitzer Space Telescope, and Chandra X-ray Observatory. Collectively, these data will enable a wide range of detailed studies of active galactic nucleus and starburst activity within the most luminous infrared galaxies in the local universe. Based on Herschel Space Observatory observations. Herschel is an ESA space observatory with science instruments provided by the European-led Principal Investigator consortia, and important participation from NASA.
[Acceptance of medical apps and e‑books among German radiologists].
Schleder, S; Dendl, L M; Niessen, C; Stroszczynski, C; Schreyer, A G
2017-09-01
Smartphones, tablet PCs, mobile applications (apps) and electronic book files (e-books) affect our lives in both private and job-related settings. The aim of this study was to analyze radiologists' use of smartphones, tablet PCs, medical apps and e-books and to investigate its effect on their daily work. An online survey comprising 23 questions was conducted using Survey Monkey© ( www.surveymonkey.com ). Invitations to the survey were distributed via the newsletter of the German Radiological Society (DRG). The acquired data were automatically stored by the software and then analyzed using descriptive statistics. In total, 104 radiologists (29% female) participated in the online survey. Of these, 93% and 96.5% owned a smartphone or a tablet PC, respectively, and 72% and 67% used medical apps and e-books, respectively. Through their use, 31% found moderate and 41% found enormous improvement in their daily work. A majority of participating radiologists would be willing to pay an increased user fee for optimized apps or e-books. With currently only moderate individual benefit from mobile medical apps and e-books, there is a widespread need for optimally configured apps and e-books, with a correspondingly high market potential. (1) Radiologists use smartphones (93%) or tablet PCs (96.5%); (2) 72% of radiologists use a smartphone or tablet PC for medical material; (3) 53% of radiologists report significant assistance from or a high value of the mobile medical applications used; (4) There is a willingness to pay a license fee for optimized mobile applications or e-books.
75 FR 4509 - Uniform Criteria for State Observational Surveys of Seat Belt Use
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-28
... establishing the criteria for designing and conducting State seat belt use observational surveys, procedures for obtaining NHTSA approval of survey designs, and a new form for reporting seat belt use rates to.... Assignment of Observation Times D. Observation Procedures E. Quality Control F. Computation of Estimates III...
Relative risk perception for terrorism: implications for preparedness and risk communication.
Caponecchia, Carlo
2012-09-01
Terrorism presents a significant risk that is often approached at the public policy, infrastructure, or emergency management level. Public perceptions of the likelihood of terrorist events, and how this may relate to individual preparedness, are not always extensively examined. The tendency to think that negative events are less likely to happen to oneself than to the average person is known as optimism bias. Optimism bias is relevant to perceptions of terrorism because it is thought to be related to a reduction in precaution use. Using an online survey of 164 participants, this study aimed to determine whether Sydney residents thought they had a lower likelihood of experiencing terrorist events than other Australians. Significant optimism bias was observed for witnessing terrorist events, but not for personally experiencing terrorist events. In addition, Sydney residents tended to think that terrorist attacks were more likely to occur in Sydney than in another major Australian city in the next five years. At the same time, household and workplace preparedness for terrorism was quite low, as was awareness of emergency strategies in the central business district. Perceptions of a high likelihood of terrorism in one's own city, combined with low preparedness, present a challenge for risk communication and emergency management strategies. The diversity of possible terrorist targets, and the simple plans that can moderate the effects of a disaster, may need to be emphasized in future anti-terrorism initiatives. © 2012 Society for Risk Analysis.
Extracting DEM from airborne X-band data based on PolInSAR
NASA Astrophysics Data System (ADS)
Hou, X. X.; Huang, G. M.; Zhao, Z.
2015-06-01
Polarimetric Interferometric Synthetic Aperture Radar (PolInSAR) is a new trend in SAR remote sensing that combines polarimetric multichannel information with interferometric information. It is of great significance for extracting DEMs in regions where conventional DEM precision is low, such as vegetated areas and areas with dense buildings. In this paper we describe our experiments with high-resolution X-band fully polarimetric SAR data acquired by a dual-baseline interferometric airborne SAR system over an area of Danling in southern China. The Pauli algorithm is used to generate the dual-polarimetric interferometry data, and the Singular Value Decomposition (SVD), Numerical Radius (NR) and Phase Diversity (PD) methods are used to generate the fully polarimetric interferometry data. The polarimetric interferometric information is then used to extract the DEM through a processing chain of pre-filtering, image registration, image resampling, coherence optimization, multilook processing, flat-earth removal, interferogram filtering, phase unwrapping, parameter calibration, height derivation and geo-coding. The processing system, named SARPlore, has been developed in VC++ under the lead of the Chinese Academy of Surveying and Mapping. Finally, comparing the optimized results with single-polarimetric interferometry, it has been observed that coherence optimization can reduce the interferometric noise and the phase-unwrapping residuals, and improve the precision of the DEM. The result of fully polarimetric interferometry is better than that of dual-polarimetric interferometry, and the degree of improvement varies with terrain.
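The coherence-optimization step in the chain above can be illustrated with a short numerical sketch. Assuming multilooked Hermitian coherency matrices T11 and T22 and a cross-coherency matrix Ω12 estimated for one pixel from the two polarimetric images (all placeholders here, not the paper's code), an SVD of the whitened cross-coherency matrix yields the optimized coherence and the corresponding scattering mechanisms:

```python
import numpy as np

def coherence_opt_svd(T11, T22, Om12):
    """SVD-based coherence optimization on one multilooked pixel."""
    def inv_sqrt(T):
        # T^(-1/2) via eigendecomposition of a Hermitian matrix
        w, V = np.linalg.eigh(T)
        return V @ np.diag(1.0 / np.sqrt(np.clip(w, 1e-12, None))) @ V.conj().T
    M = inv_sqrt(T11) @ Om12 @ inv_sqrt(T22)
    U, s, Vh = np.linalg.svd(M)
    # Largest singular value = optimized coherence magnitude; the singular
    # vectors, de-whitened, are the projection (scattering mechanism) vectors.
    w1 = inv_sqrt(T11) @ U[:, 0]
    w2 = inv_sqrt(T22) @ Vh.conj().T[:, 0]
    return s[0], w1, w2
```

Applying such an optimization pixel by pixel before phase unwrapping is the kind of step that reduces the interferometric noise and residuals reported above.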
Modelling rainfall interception by a lowland tropical rain forest in northeastern Puerto Rico
NASA Astrophysics Data System (ADS)
Schellekens, J.; Scatena, F. N.; Bruijnzeel, L. A.; Wickel, A. J.
1999-12-01
Recent surveys of tropical forest water use suggest that rainfall interception by the canopy is largest in wet maritime locations. To investigate the underlying processes at one such location—the Luquillo Experimental Forest in eastern Puerto Rico—66 days of detailed throughfall and above-canopy climatic data were collected in 1996 and analysed using the Rutter and Gash models of rainfall interception. Throughfall occurred on 80% of the days distributed over 80 rainfall events. Measured interception loss was 50% of gross precipitation. When Penman-Monteith based estimates for the wet canopy evaporation rate (0.11 mm h -1 on average) and a canopy storage of 1.15 mm were used, both models severely underestimated measured interception loss. A detailed analysis of four storms using the Rutter model showed that optimizing the model for the wet canopy evaporation component yielded much better results than increasing the canopy storage capacity. However, the Rutter model failed to properly estimate throughfall amounts during an exceptionally large event. The analytical model, on the other hand, was capable of representing interception during the extreme event, but once again optimizing wet canopy evaporation rates produced a much better fit than optimizing the canopy storage capacity. As such, the present results support the idea that it is primarily a high rate of evaporation from a wet canopy that is responsible for the observed high interception losses.
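The analytical (Gash-type) model discussed above can be sketched compactly. Using the abstract's canopy storage (1.15 mm) and wet-canopy evaporation rate (0.11 mm/h) plus an assumed mean rainfall intensity (not given above), per-storm interception loss splits into wetting-up, in-storm evaporation, and post-storm evaporation terms; trunk storage and free throughfall are ignored in this simplification:

```python
import numpy as np

S = 1.15   # canopy storage capacity [mm] (from the abstract)
E = 0.11   # mean wet-canopy evaporation rate [mm/h] (from the abstract)
R = 3.0    # mean rainfall intensity onto a saturated canopy [mm/h] (assumed)

# Gross rainfall needed to saturate the canopy
P_sat = -(R * S / E) * np.log(1.0 - E / R)

def interception(P):
    """Interception loss [mm] for a storm of gross rainfall P [mm]."""
    if P < P_sat:   # canopy never saturates; all intercepted water evaporates
        return P    # upper bound: a free-throughfall coefficient would scale this
    # wetting-up loss + in-storm evaporation + post-storm evaporation of storage
    return (P_sat - S) + (E / R) * (P - P_sat) + S

print(P_sat, [interception(P) for P in (0.5, 10.0, 40.0)])
```

With these numbers the canopy saturates after roughly 1.2 mm of rain, so in a wet maritime climate nearly every storm reaches the evaporation-dominated regime, which is consistent with the finding above that the wet-canopy evaporation rate matters more than the storage capacity.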
Yoon, Jong Lull; Cho, Jung Jin; Park, Kyung Mi; Noh, Hye Mi; Park, Yong Soon
2015-02-01
Associations between body mass index (BMI), body fat percentage (BF%), and health risks differ between Asian and European populations. BMI is commonly used to diagnose obesity; however, its accuracy in detecting adiposity in Koreans is unknown. The present cross-sectional study aimed at assessing the accuracy of BMI in determining BF%-defined obesity in 6,017 subjects (age 20-69 yr, 43.6% men) from the 2009 Korean National Health and Nutrition Examination Survey. We assessed the diagnostic performance of BMI using the Western Pacific Regional Office of World Health Organization reference standard for BF%-defined obesity by sex and age and identified the optimal BMI cut-off for BF%-defined obesity using receiver operating characteristic curve analysis. BMI-defined obesity (≥25 kg/m^2) was observed in 38.7% of men and 28.1% of women, with high specificity (89%, men; 84%, women) but poor sensitivity (56%, men; 72%, women) for BF%-defined obesity (25.2%, men; 31.1%, women). The optimal BMI cut-off (24.2 kg/m^2) had 78% sensitivity and 71% specificity. BMI demonstrated limited diagnostic accuracy for adiposity in Korea. There was a -1.3 kg/m^2 difference in optimal BMI cut-offs between Korea and America, smaller than the 5-unit difference between the Western Pacific Regional Office and global World Health Organization obesity criteria.
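The cut-off selection above is a standard ROC exercise. A minimal sketch, with synthetic BMI values and BF%-defined labels standing in for the survey data, picks the threshold maximizing Youden's J (sensitivity + specificity - 1):

```python
import numpy as np
from sklearn.metrics import roc_curve

# Placeholder data: 700 non-obese and 300 obese subjects by the BF% criterion
rng = np.random.default_rng(0)
bmi = np.concatenate([rng.normal(22, 2.5, 700), rng.normal(27, 3.0, 300)])
obese_bf = np.concatenate([np.zeros(700, int), np.ones(300, int)])

fpr, tpr, thresholds = roc_curve(obese_bf, bmi)
youden = tpr - fpr                 # Youden's J = sensitivity + specificity - 1
best = np.argmax(youden)
print(f"optimal BMI cut-off ~ {thresholds[best]:.1f} kg/m^2, "
      f"sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f}")
```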
Alperet, Derrick Johnston; Lim, Wei-Yen; Mok-Kwee Heng, Derrick; Ma, Stefan; van Dam, Rob M
2016-10-01
To identify optimal anthropometric measures and cutoffs to identify undiagnosed diabetes mellitus (UDM) in three major Asian ethnic groups (Chinese, Malays, and Asian-Indians). Cross-sectional data were analyzed from 14,815 ethnic Chinese, Malay, and Asian-Indian participants of the Singapore National Health Surveys, which included anthropometric measures and an oral glucose tolerance test. Receiver operating characteristic curve analyses were used with calculation of the area under the curve (AUC) to evaluate the performance of body mass index (BMI), waist circumference (WC), waist-to-hip ratio (WHR), and waist-to-height ratio (WHTR) for the identification of UDM. BMI performed significantly worse (AUC = 0.70 in men; 0.75 in women) than abdominal measures, whereas WHTR (AUC = 0.76 in men; 0.79 in women) was among the best performing measures in both sexes and all ethnic groups. Anthropometric measures performed better in Chinese than in Asian-Indian participants for the identification of UDM. A WHTR cutoff of 0.52 appeared optimal, with a sensitivity of 76% in men and 73% in women and a specificity of 63% in men and 70% in women. Although ethnic differences were observed in the performance of anthropometric measures for the identification of UDM, abdominal adiposity measures generally performed better than BMI, and WHTR performed best in all Asian ethnic groups. © 2016 The Obesity Society.
23 CFR 1340.13 - Annual reporting requirements.
Code of Federal Regulations, 2013 CFR
2013-04-01
... survey design, alternate observation site selected subsequent to the original survey design), and by... survey design that was approved by NHTSA, in writing, as conforming to the Uniform Criteria for State Observational Surveys of Seat Belt Use, 23 CFR Part 1340; (3) The survey design has remained unchanged since the...
23 CFR 1340.13 - Annual reporting requirements.
Code of Federal Regulations, 2012 CFR
2012-04-01
... survey design, alternate observation site selected subsequent to the original survey design), and by... survey design that was approved by NHTSA, in writing, as conforming to the Uniform Criteria for State Observational Surveys of Seat Belt Use, 23 CFR Part 1340; (3) The survey design has remained unchanged since the...
23 CFR 1340.13 - Annual reporting requirements.
Code of Federal Regulations, 2014 CFR
2014-04-01
... survey design, alternate observation site selected subsequent to the original survey design), and by... survey design that was approved by NHTSA, in writing, as conforming to the Uniform Criteria for State Observational Surveys of Seat Belt Use, 23 CFR Part 1340; (3) The survey design has remained unchanged since the...
Dos Santos, Quenia; Sichieri, Rosely; Darmon, Nicole; Maillot, Matthieu; Verly-Junior, Eliseu
2018-06-01
To identify optimal food choices that meet nutritional recommendations to reduce the prevalence of inadequate nutrient intakes. Linear programming was used to obtain an optimized diet of sixty-eight foods with the least difference from the observed population mean dietary intake while meeting a set of nutritional goals that included reducing the prevalence of inadequate nutrient intakes to ≤20 %. Brazil. Participants (men and women, n 25 324) aged 20 years or more from the first National Dietary Survey (NDS) 2008-2009. No feasible solution to the model was found when all constraints were imposed; the infeasible nutrients were Ca, vitamins D and E, Mg, Zn, fibre, linolenic acid, monounsaturated fat and Na. A feasible solution was obtained after relaxing the nutritional constraints for these limiting nutrients by including a deviation variable in the model. The estimated prevalence of nutrient inadequacy was reduced by 60-70 % for most nutrients, and mean saturated and trans-fat intakes decreased in the optimized diet meeting the model constraints. The optimized diet was characterized by increases especially in fruits (+92 g), beans (+64 g), vegetables (+43 g), milk (+12 g), fish and seafood (+15 g) and whole cereals (+14 g), and by reductions in sugar-sweetened beverages (-90 g), rice (-63 g), snacks (-14 g), red meat (-13 g) and processed meat (-9·7 g). Linear programming is a unique tool to identify which changes in the current diet can increase nutrient intake and place the population at lower risk of nutrient inadequacy. Reaching nutritional adequacy for all nutrients would require major changes in the Brazilian diet.
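The optimization above has a compact LP formulation: minimize the L1 distance to the observed diet subject to nutrient floors, linearized with per-food deviation variables. A minimal sketch with three illustrative foods and two nutrients (none of the values are NDS figures):

```python
import numpy as np
from scipy.optimize import linprog

x_obs = np.array([150.0, 30.0, 80.0])   # observed g/day: rice, beans, fruit
N = np.array([[0.03, 0.09, 0.02],       # protein content per g of each food
              [0.002, 0.015, 0.024]])   # fibre content per g of each food
target = np.array([10.0, 3.0])          # illustrative daily nutrient minima

n = len(x_obs)
# Decision vector z = [x, d]; minimize sum(d), where d_i >= |x_i - x_obs_i|
c = np.concatenate([np.zeros(n), np.ones(n)])
A_dev = np.block([[np.eye(n), -np.eye(n)],     #  x - d <=  x_obs
                  [-np.eye(n), -np.eye(n)]])   # -x - d <= -x_obs
b_dev = np.concatenate([x_obs, -x_obs])
A_nut = np.hstack([-N, np.zeros_like(N)])      # enforce N @ x >= target
res = linprog(c, A_ub=np.vstack([A_dev, A_nut]),
              b_ub=np.concatenate([b_dev, -target]),
              bounds=[(0, None)] * (2 * n))
print(res.x[:n])    # optimized intakes; res.x[n:] are absolute deviations
```

The relaxation described above corresponds to moving a hard nutrient floor into the objective as a penalized deviation whenever no feasible diet exists.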
New Optimizations of Microcalorimeter Arrays for High-Resolution Imaging X-ray Spectroscopy
NASA Astrophysics Data System (ADS)
Kilbourne, Caroline
We propose to continue our successful research program in developing arrays of superconducting transition-edge sensors (TES) for x-ray astrophysics. Our standard 0.3 mm TES pixel achieves better than 2.5-eV resolution, and we now make 32x32 arrays of such pixels. We have also achieved better than 1-eV resolution in smaller pixels, and promising performance in a range of position-sensitive designs. We propose to continue to advance the designs of both the single-pixel and position-sensitive microcalorimeters so that we can produce arrays suitable for several x-ray spectroscopy observatories presently in formulation. We will also investigate various array and pixel optimizations such as would be needed for large arrays for surveys, large-pixel arrays for diffuse soft x-ray measurements, or sub-arrays of fast pixels optimized for neutron-star burst spectroscopy. In addition, we will develop fabrication processes for integrating sub-arrays with very different pixel designs into a monolithic focal-plane array to simplify the design of the focal-plane assembly and make feasible new detector configurations such as the one currently baselined for AXSIO. Through a series of measurements on test devices, we have improved our understanding of the weak-link physics governing the observed resistive transitions in TES detectors. We propose to build on that work and ultimately use the results to improve the immunity of the detector to environmental magnetic fields, as well as its fundamental performance, in each of the targeted optimizations we are developing.
Energy systems research and development for petroleum refineries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, J.L.
1982-08-01
For the past several years, Exxon Research and Engineering has carried out a specific R&D program aimed at improving refinery energy efficiency through optimization of energy systems. Energy systems include steam/power systems, heat exchange systems (including hot oil and hot water belts) and fuel systems, as well as some of the processes. This paper describes the three major thrusts of this program: development of methods to support Site Energy Survey activities; development of energy management methods; and energy system optimization, which includes development of consistent, realistic economic incentives for energy system improvements. Project selection criteria will also be discussed, and the technique of a site energy survey will be described briefly.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Masters, Daniel; Steinhardt, Charles; Faisst, Andreas
2015-11-01
Calibrating the photometric redshifts of ≳10^9 galaxies for upcoming weak lensing cosmology experiments is a major challenge for the astrophysics community. The path to obtaining the required spectroscopic redshifts for training and calibration is daunting, given the anticipated depths of the surveys and the difficulty in obtaining secure redshifts for some faint galaxy populations. Here we present an analysis of the problem based on the self-organizing map, a method of mapping the distribution of data in a high-dimensional space and projecting it onto a lower-dimensional representation. We apply this method to existing photometric data from the COSMOS survey selected to approximate the anticipated Euclid weak lensing sample, enabling us to robustly map the empirical distribution of galaxies in the multidimensional color space defined by the expected Euclid filters. Mapping this multicolor distribution lets us determine where—in galaxy color space—redshifts from current spectroscopic surveys exist and where they are systematically missing. Crucially, the method lets us determine whether a spectroscopic training sample is representative of the full photometric space occupied by the galaxies in a survey. We explore optimal sampling techniques and estimate the additional spectroscopy needed to map out the color–redshift relation, finding that sampling the galaxy distribution in color space in a systematic way can efficiently meet the calibration requirements. While the analysis presented here focuses on the Euclid survey, similar analysis can be applied to other surveys facing the same calibration challenge, such as DES, LSST, and WFIRST.
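A toy version of the self-organizing map is easy to write down. The sketch below trains a small SOM on synthetic "colors" (placeholders for Euclid-like photometry); after training, cells occupied by photometric galaxies but never by spectroscopic ones flag the systematically missing regions of color space:

```python
import numpy as np

rng = np.random.default_rng(1)
colors = rng.normal(size=(5000, 8))            # placeholder galaxy colors

grid = (15, 15)
W = rng.normal(size=grid + (colors.shape[1],)) # cell weight vectors
gy, gx = np.mgrid[0:grid[0], 0:grid[1]]

for t, v in enumerate(colors):
    lr = 0.5 * np.exp(-t / 2000)               # decaying learning rate
    sigma = 3.0 * np.exp(-t / 2000)            # decaying neighborhood radius
    d = np.linalg.norm(W - v, axis=2)
    by, bx = np.unravel_index(np.argmin(d), grid)   # best-matching cell
    h = np.exp(-((gy - by) ** 2 + (gx - bx) ** 2) / (2 * sigma ** 2))
    W += lr * h[..., None] * (v - W)            # pull neighborhood toward sample

# After training: map each galaxy to its winning cell and compare the cell
# occupation of the photometric sample with that of the spectroscopic sample.
```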
Participation and contribution in crowdsourced surveys.
Swain, Robert; Berger, Alex; Bongard, Josh; Hines, Paul
2015-01-01
This paper identifies trends within and relationships between the amount of participation and the quality of contributions in three crowdsourced surveys. Participants were asked to perform a collective problem solving task that lacked any explicit incentive: they were instructed not only to respond to survey questions but also to pose new questions that they thought might, if responded to by others, predict an outcome variable of interest to them. While the three surveys had very different outcome variables, target audiences, methods of advertisement, and lengths of deployment, we found very similar patterns of collective behavior. In particular, we found that: the rate at which participants submitted new survey questions followed a heavy-tailed distribution; the distribution in the types of questions posed was similar; and many users posed non-obvious yet predictive questions. By analyzing responses to questions that contained a built-in range of valid responses, we found that less than 0.2% of responses lay outside of those ranges, indicating that most participants tend to respond honestly to surveys of this form, even without explicit incentives for honesty. While we did not find a significant relationship between the quantity of participation and the quality of contribution for both response submissions and question submissions, we did find several other more nuanced participant behavior patterns, which did correlate with contribution in one of the three surveys. We conclude that there exists an optimal time for users to pose questions early on in their participation, but only after they have submitted a few responses to other questions. This suggests that future crowdsourced surveys may attract more predictive questions by prompting users to pose new questions at specific times during their participation and limiting question submission at non-optimal times.
Alegana, Victor A; Wright, Jim; Bosco, Claudio; Okiro, Emelda A; Atkinson, Peter M; Snow, Robert W; Tatem, Andrew J; Noor, Abdisalan M
2017-11-21
One pillar of monitoring progress towards the Sustainable Development Goals is investment in high quality data to strengthen the scientific basis for decision-making. At present, nationally-representative surveys are the main source of data for establishing a scientific evidence base, monitoring, and evaluation of health metrics. However, the optimal precision of many population-level health and development indicators remains unquantified in nationally-representative household surveys. Here, a retrospective analysis of the precision of prevalence estimates from these surveys was conducted. Using malaria indicators, data were assembled in nine sub-Saharan African countries with at least two nationally-representative surveys. A Bayesian statistical model was used to estimate between- and within-cluster variability for fever and malaria prevalence, and insecticide-treated bed net (ITN) use in children under the age of 5 years. The intra-class correlation coefficient was estimated along with the optimal sample size for each indicator with associated uncertainty. Results suggest that the sample sizes required of nationally-representative surveys increase with declining malaria prevalence. Comparison between the actual sample size and the modelled estimate showed a requirement to increase the sample size for parasite prevalence by up to 77.7% (95% Bayesian credible intervals 74.7-79.4) for the 2015 Kenya MIS (estimated sample size of children 0-4 years 7218 [7099-7288]), and 54.1% [50.1-56.5] for the 2014-2015 Rwanda DHS (12,220 [11,950-12,410]). This study highlights the importance of defining indicator-relevant sample sizes to achieve the required precision in the current national surveys. While expanding the current surveys would need additional investment, the study highlights the need for improved approaches to cost-effective sampling.
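The cluster-sampling logic behind these sample-size statements can be sketched in a few lines: the intra-class correlation inflates the simple-random-sample size through the design effect. All numbers below are illustrative, not the paper's estimates:

```python
import math

p = 0.08      # assumed parasite prevalence
moe = 0.02    # desired absolute margin of error
icc = 0.10    # assumed intra-class correlation coefficient
m = 25        # children sampled per cluster

n_srs = (1.96 ** 2) * p * (1 - p) / moe ** 2   # simple random sample size
deff = 1 + (m - 1) * icc                        # design effect
n_cluster = math.ceil(n_srs * deff)             # required size under clustering
print(n_srs, deff, n_cluster)
```

As prevalence p falls, p(1-p) shrinks more slowly than the tolerable margin of error usually does, which is why lower-prevalence settings demand larger surveys.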
Participation and Contribution in Crowdsourced Surveys
Swain, Robert; Berger, Alex; Bongard, Josh; Hines, Paul
2015-01-01
This paper identifies trends within and relationships between the amount of participation and the quality of contributions in three crowdsourced surveys. Participants were asked to perform a collective problem solving task that lacked any explicit incentive: they were instructed not only to respond to survey questions but also to pose new questions that they thought might-if responded to by others-predict an outcome variable of interest to them. While the three surveys had very different outcome variables, target audiences, methods of advertisement, and lengths of deployment, we found very similar patterns of collective behavior. In particular, we found that: the rate at which participants submitted new survey questions followed a heavy-tailed distribution; the distribution in the types of questions posed was similar; and many users posed non-obvious yet predictive questions. By analyzing responses to questions that contained a built-in range of valid response we found that less than 0.2% of responses lay outside of those ranges, indicating that most participants tend to respond honestly to surveys of this form, even without explicit incentives for honesty. While we did not find a significant relationship between the quantity of participation and the quality of contribution for both response submissions and question submissions, we did find several other more nuanced participant behavior patterns, which did correlate with contribution in one of the three surveys. We conclude that there exists an optimal time for users to pose questions early on in their participation, but only after they have submitted a few responses to other questions. This suggests that future crowdsourced surveys may attract more predictive questions by prompting users to pose new questions at specific times during their participation and limiting question submission at non-optimal times. PMID:25837602
Prospective Relationships Between Physical Activity and Optimism in Young and Mid-aged Women.
Pavey, Toby G; Burton, Nicola W; Brown, Wendy J
2015-07-01
There is growing evidence that regular physical activity (PA) reduces the risk of poor mental health. Less research has focused on the relationship between PA and positive wellbeing. The study aims were to assess the prospective associations between PA and optimism in both young and mid-aged women. 9688 young women (born 1973-1978) completed self-report surveys in 2000 (age 22 to 27), 2003, 2006, and 2009; and 11,226 mid-aged women (born 1946-1951) completed surveys in 2001 (age 50-55), 2004, 2007, and 2010, as part of the Australian Longitudinal Study on Women's Health. Generalized estimating equation models (with a 3-year time lag) were used to examine the relationship between PA and optimism in both cohorts. In both cohorts, women reporting higher levels of PA had greater odds of reporting higher optimism over the 9-year period (young, OR = 5.04, 95% CI: 3.85-6.59; mid-age, OR = 5.77, 95% CI: 4.76-7.00) than women who reported no PA. Odds were attenuated in adjusted models, with depression accounting for a large amount of this attenuation (young, OR = 2.00, 95% CI: 1.57-2.55; mid-age, OR = 1.64, 95% CI: 1.38-1.94). Physical activity can promote optimism in young and mid-aged women over time, even after accounting for the negative effects of other psychosocial indicators such as depression.
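A GEE of this kind is straightforward to set up with statsmodels. The sketch below uses a synthetic panel and a binary "high optimism" outcome for simplicity (the study used repeated optimism scores); all variable names, data, and effect sizes are invented:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n, waves = 500, 4
df = pd.DataFrame({
    "id": np.repeat(np.arange(n), waves),          # woman identifier
    "pa": rng.choice([0, 1, 2], size=n * waves),   # none/low/high activity
})
logit = -0.5 + 0.4 * df["pa"] + rng.normal(0, 1, n * waves)
df["optimistic"] = (rng.random(n * waves) < 1 / (1 + np.exp(-logit))).astype(int)

# Exchangeable working correlation handles repeated waves within each woman
model = smf.gee("optimistic ~ pa", groups="id", data=df,
                family=sm.families.Binomial(),
                cov_struct=sm.cov_struct.Exchangeable())
res = model.fit()
print(np.exp(res.params["pa"]))   # odds ratio per activity level
```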
Wolf, Joshua; Sun, Yilun; Tang, Li; Newland, Jason G; Gerber, Jeffrey S; Van Dyke, Christie J; Hymes, Saul R; Yu, Diana; Carias, Delia C; Bryant, Penelope A
2016-03-01
We undertook a cross-sectional survey of antimicrobial stewardship clinicians in North America and Australasia regarding practices, goals, and barriers to implementation of stewardship for pediatric oncology patients. Goals and barriers were similar regardless of clinician or institutional characteristics and geographic location. Strategies addressing these factors could help optimize antimicrobial use.
1992-03-01
... customer focus, the setting of sub-optimal goals and quotas, barriers between departments, and awarding contracts primarily on price are all anti-TQM practices that hinder ... surveys are often required to determine if a lower competitive price could be achieved before exercising options. This requirement is a sub-optimal ...
NASA Astrophysics Data System (ADS)
Cowton, L. R.; Neufeld, J. A.; Bickle, M.; White, N.; White, J.; Chadwick, A.
2017-12-01
Vertically-integrated gravity current models enable computationally efficient simulations of CO2 flow in sub-surface reservoirs. These simulations can be used to investigate the properties of reservoirs by minimizing differences between observed and modeled CO2 distributions. At the Sleipner project, about 1 Mt yr^-1 of supercritical CO2 is injected at a depth of 1 km into a pristine saline aquifer with a thick shale caprock. Analysis of time-lapse seismic reflection surveys shows that CO2 is distributed within 9 discrete layers. The trapping mechanism comprises a stacked series of 1 m thick, impermeable shale horizons that are spaced at 30 m intervals through the reservoir. Within the stratigraphically highest reservoir layer, Layer 9, a submarine channel deposit has been mapped on the pre-injection seismic survey. Detailed measurements of the three-dimensional CO2 distribution within Layer 9 have been made using seven time-lapse surveys, providing a useful benchmark against which numerical flow simulations can be tested. Previous simulations have, in general, been largely unsuccessful in matching the migration rate of CO2 in this layer. Here, CO2 flow within Layer 9 is modeled as a vertically-integrated gravity current that spreads beneath a structurally complex caprock using a two-dimensional grid, considerably increasing computational efficiency compared to conventional three-dimensional simulators. This flow model is inverted to find the optimal reservoir permeability in Layer 9 by minimizing the difference between observed and predicted distributions of CO2 as a function of space and time. A three-parameter inverse model, comprising reservoir permeability, channel permeability and channel width, is investigated by grid search. The best-fitting reservoir permeability is 3 Darcys, which is consistent with measurements made on core material from the reservoir. The best-fitting channel permeability is 26 Darcys. Finally, the ability of this simplified numerical model to forecast CO2 flow within Layer 9 is tested. Permeability recovered by modeling a suite of early seismic surveys is used to predict the CO2 distribution for a suite of later seismic surveys with a considerable degree of success. Forecasts have also been carried out that can be tested using future seismic surveys.
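The three-parameter grid search described above reduces to a few lines once a forward model is available. In this sketch `simulate_layer9` stands in for the vertically-integrated gravity-current simulator (assumed, not provided), and `h_obs` is the stack of observed CO2 thickness maps; the grid values are illustrative:

```python
import numpy as np
from itertools import product

def misfit(h_obs, h_mod):
    """Sum of squared thickness differences over all surveys and grid cells."""
    return np.nansum((h_obs - h_mod) ** 2)

k_res_grid = [1.0, 2.0, 3.0, 5.0]      # reservoir permeability [Darcy]
k_chan_grid = [10.0, 26.0, 50.0]       # channel permeability [Darcy]
w_chan_grid = [100.0, 200.0, 400.0]    # channel width [m]

def invert(h_obs, simulate_layer9):
    """Return (misfit, k_res, k_chan, w) minimizing the data misfit."""
    best = None
    for k_res, k_chan, w in product(k_res_grid, k_chan_grid, w_chan_grid):
        m = misfit(h_obs, simulate_layer9(k_res, k_chan, w))
        if best is None or m < best[0]:
            best = (m, k_res, k_chan, w)
    return best
```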
Observing Strategies for Focused Orbital Debris Surveys Using the Magellan Telescope
NASA Technical Reports Server (NTRS)
Frith, James; Cowardin, Heather; Buckalew, Brent; Anz-Meador, Phillip; Lederer, Susan; Matney, Mark
2017-01-01
A breakup of the Titan 3C-17 Transtage rocket body was reported to have occurred on June 4th, 2014 at 02:38 UT by the Space Surveillance Network (SSN). Five objects were associated with this breakup and this is the fourth breakup known for this class of object. There are likely many more objects associated with this event that are not within the Space Surveillance Network's ability to detect and have not been catalogued. Several months after the breakup, observing time was obtained on the Magellan Baade 6.5 meter telescope to be used for observations of geosynchronous (GEO) space debris targets. Using the NASA Standard Satellite Breakup Model (SSBM), a simulated debris cloud of the recent Transtage breakup was produced and propagated forward in time. This provided right ascension, declination, and tracking rate predictions for where debris associated with this breakup may be more likely to be found in the sky over Magellan for our observing run. Magellan observations were then optimized using the angles and tracking rates from the model predictions to focus the search for Transtage debris. Data were collected and analysed and preliminary comparisons made between the number of objects detected and the number expected from the model. We present our results here.
An optical to IR sky brightness model for the LSST
NASA Astrophysics Data System (ADS)
Yoachim, Peter; Coughlin, Michael; Angeli, George Z.; Claver, Charles F.; Connolly, Andrew J.; Cook, Kem; Daniel, Scott; Ivezić, Željko; Jones, R. Lynne; Petry, Catherine; Reuter, Michael; Stubbs, Christopher; Xin, Bo
2016-07-01
To optimize the observing strategy of a large survey such as the LSST, one needs an accurate model of the night sky emission spectrum across a range of atmospheric conditions and from the near-UV to the near-IR. We have used the ESO SkyCalc Sky Model Calculator to construct a library of template spectra for the Chilean night sky. The ESO model includes emission from the upper and lower atmosphere, scattered starlight, scattered moonlight, and zodiacal light. We have then extended the ESO templates with an empirical fit to the twilight sky emission as measured by a Canon all-sky camera installed at the LSST site. With the ESO templates and our twilight model we can quickly interpolate to any arbitrary sky position and date and return the full sky spectrum or surface brightness magnitudes in the LSST filter system. Comparing our model to all-sky observations, we find typical residual RMS values of ±0.2-0.3 magnitudes per square arcsecond.
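The template-library lookup amounts to interpolation over a grid of observing conditions. A minimal sketch with made-up grid axes and flat placeholder spectra (the real library is indexed on more dimensions than these two):

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

airmass = np.array([1.0, 1.5, 2.0, 2.5])
moon_alt = np.array([-90.0, -10.0, 0.0, 45.0, 90.0])   # degrees
wave = np.linspace(300.0, 1100.0, 200)                 # nm

# Placeholder template cube: sky brightness on (airmass, moon_alt, wave)
templates = np.ones((airmass.size, moon_alt.size, wave.size))

interp = RegularGridInterpolator((airmass, moon_alt, wave), templates)

def sky_spectrum(X, alt):
    """Interpolated sky spectrum at airmass X and moon altitude alt."""
    pts = np.column_stack([np.full(wave.size, X),
                           np.full(wave.size, alt),
                           wave])
    return interp(pts)

spec = sky_spectrum(1.2, 20.0)   # integrate through a filter curve for magnitudes
```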
The S-054 X-ray telescope experiment on Skylab
NASA Technical Reports Server (NTRS)
Vaiana, G. S.; Van Speybroeck, L.; Zombeck, M. V.; Krieger, A. S.; Silk, J. K.; Timothy, A.
1977-01-01
A description of the S-054 X-ray telescope on Skylab is presented with a discussion of the experimental objectives, observing program, data reduction and analysis. Some results from the Skylab mission are given. The telescope photographically records high-resolution images of the solar corona in several broadband regions of the soft X-ray spectrum. It includes an objective grating used to study the line spectrum. The spatial resolution, sensitivity, dynamic range and time resolution of the instrument were chosen to survey a wide variety of solar phenomena. It embodies improvements in design, fabrication, and calibration techniques which were developed over a ten-year period. The observing program was devised to optimize the use of the instrument and to provide studies on a wide range of time scales. The data analysis program includes morphological studies and quantitative analysis using digitized images. A small sample of the data obtained in the mission is presented to demonstrate the type of information that is available and the kinds of results that can be obtained from it.
Gigahertz-peaked Spectra Pulsars and Thermal Absorption Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kijak, J.; Basu, R.; Lewandowski, W.
2017-05-10
We present the results of our radio interferometric observations of pulsars at 325 and 610 MHz using the Giant Metrewave Radio Telescope. We used the imaging method to estimate the flux densities of several pulsars at these radio frequencies. The analysis of the shapes of the pulsar spectra allowed us to identify five new gigahertz-peaked spectra (GPS) pulsars. Using the hypothesis that the spectral turnovers are caused by thermal free–free absorption in the interstellar medium, we modeled the spectra of all known objects of this kind. Using the model, we were able to put some observational constraints on the physical parameters of the absorbing matter, which allows us to distinguish between the possible sources of absorption. We also discuss the possible effects of the existence of GPS pulsars on future search surveys, showing that the optimal frequency range for finding such objects would be from a few GHz (for regular GPS sources) to possibly 10 GHz for pulsars and radio magnetars exhibiting very strong absorption.
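The thermal absorption model named above is an intrinsic power law attenuated by a free-free optical depth scaling as ν^-2.1, which produces the low-frequency turnover. A sketch of fitting it to illustrative (made-up) flux densities:

```python
import numpy as np
from scipy.optimize import curve_fit

def gps_model(nu_ghz, A, alpha, B):
    """Power law times free-free absorption, tau = B * nu^-2.1."""
    return A * nu_ghz ** alpha * np.exp(-B * nu_ghz ** -2.1)

nu = np.array([0.325, 0.610, 1.4, 4.8])    # GHz
S = np.array([20.0, 55.0, 60.0, 25.0])     # mJy, made-up GPS-like spectrum

popt, _ = curve_fit(gps_model, nu, S, p0=[80.0, -1.6, 0.3], maxfev=10000)
A, alpha, B = popt
nu_peak = (2.1 * B / -alpha) ** (1 / 2.1)  # where d ln S / d ln nu = 0
print(popt, nu_peak)
```

The peak frequency grows with the optical depth coefficient B, which is why strongly absorbed pulsars are best sought at higher frequencies, as the abstract notes.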
Peters, Megan A K; Lau, Hakwan
2015-01-01
Many believe that humans can ‘perceive unconsciously’ – that for weak stimuli, briefly presented and masked, above-chance discrimination is possible without awareness. Interestingly, an online survey reveals that most experts in the field recognize the lack of convincing evidence for this phenomenon, and yet they persist in this belief. Using a recently developed bias-free experimental procedure for measuring subjective introspection (confidence), we found no evidence for unconscious perception; participants’ behavior matched that of a Bayesian ideal observer, even though the stimuli were visually masked. This surprising finding suggests that the thresholds for subjective awareness and objective discrimination are effectively the same: if objective task performance is above chance, there is likely conscious experience. These findings shed new light on decades-old methodological issues regarding what it takes to consider a neurobiological or behavioral effect to be 'unconscious,' and provide a platform for rigorously investigating unconscious perception in future studies. DOI: http://dx.doi.org/10.7554/eLife.09651.001 PMID:26433023
The ARIEL mission reference sample
NASA Astrophysics Data System (ADS)
Zingales, Tiziano; Tinetti, Giovanna; Pillitteri, Ignazio; Leconte, Jérémy; Micela, Giuseppina; Sarkar, Subhajit
2018-02-01
The ARIEL (Atmospheric Remote-sensing Exoplanet Large-survey) mission concept is one of the three M4 mission candidates selected by the European Space Agency (ESA) for a Phase A study, competing for a launch in 2026. ARIEL has been designed to study the physical and chemical properties of a large and diverse sample of exoplanets and, through those, understand how planets form and evolve in our galaxy. Here we describe the assumptions made to estimate an optimal sample of exoplanets - including already known exoplanets and expected ones yet to be discovered - observable by ARIEL and define a realistic mission scenario. To achieve the mission objectives, the sample should include gaseous and rocky planets with a range of temperatures around stars of different spectral type and metallicity. The current ARIEL design enables the observation of ˜1000 planets, covering a broad range of planetary and stellar parameters, during its four year mission lifetime. This nominal list of planets is expected to evolve over the years depending on the new exoplanet discoveries.
Three-reflections telescope proposal as flat-field anastigmat for wide field observations at Dome C
NASA Astrophysics Data System (ADS)
Ferrari, M.; Lemaître, G.; Viotti, R.; La Padula, C.; Comte, G.; Blanc, M.; Boer, M.
It is now evident that the exceptional seeing at Dome C will allow astronomical programs to be pursued in the coming years under conditions better than at any other observatory in the world, and very close to those of space experiments. With a new type of wide-field telescope, particular astronomical programs could be well optimized for observations at Dome C, such as surveys for the discovery and follow-up of near-Earth asteroids, searches for extra-solar planets using transit or micro-lensing events, and studies of stellar luminosity variations. We propose to build a 1.5-2 m class three-reflection telescope with a 1-1.5 degree FOV, four times shorter than an equivalent Schmidt telescope, and providing a flat field without requiring a triplet- or quadruplet-lens corrector since its design is anastigmatic. We present the preliminary optical tests of such designs: MINITRUST 1 and 2 are two identical 45 cm prototypes, based in France and Italy, manufactured using active optics techniques.
Optimization of the MINERVA Exoplanet Search Strategy via Simulations
NASA Astrophysics Data System (ADS)
Nava, Chantell; Johnson, Samson; McCrady, Nate; Minerva
2015-01-01
Detection of low-mass exoplanets requires high spectroscopic precision and high observational cadence. MINERVA is a dedicated observatory capable of sub-meter-per-second radial velocity precision. As a dedicated observatory, MINERVA can observe with the every-clear-night cadence that is essential for low-mass exoplanet detection. However, this cadence complicates the determination of an optimal observing strategy. We simulate MINERVA observations to optimize our observing strategy and maximize exoplanet detections. A dispatch scheduling algorithm provides observations of MINERVA targets every day over a three-year observing campaign. An exoplanet population with a distribution informed by Kepler statistics is assigned to the targets, and the radial velocity curves induced by the planets are constructed. We apply a correlated noise model that realistically simulates stellar astrophysical noise sources. The simulated radial velocity data are fed to the MINERVA planet detection code and the expected exoplanet yield is calculated. The full simulation provides a tool to test different strategies for scheduling observations of our targets and optimizing the MINERVA exoplanet search strategy.
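One building block of such a simulation is generating a radial velocity curve with correlated noise. A minimal sketch for a circular orbit plus red noise drawn from an exponential covariance kernel (all parameters illustrative, not MINERVA values):

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.sort(rng.uniform(0, 3 * 365.25, 400))   # ~nightly epochs over 3 yr [d]

K, P, phi = 1.5, 12.3, 0.7                     # semi-amplitude [m/s], period [d], phase
rv_planet = K * np.sin(2 * np.pi * t / P + phi)

sigma, tau = 2.0, 5.0                          # noise amplitude [m/s], timescale [d]
cov = sigma ** 2 * np.exp(-np.abs(t[:, None] - t[None, :]) / tau)
rv_noise = rng.multivariate_normal(np.zeros(t.size), cov)

rv_obs = rv_planet + rv_noise                  # feed to a detection pipeline
```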
Nash equilibrium and multi criterion aerodynamic optimization
NASA Astrophysics Data System (ADS)
Tang, Zhili; Zhang, Lianhe
2016-06-01
Game theory, and in particular its Nash Equilibrium (NE), has gained importance in solving Multi Criterion Optimization (MCO) problems in engineering over the past decade. The solution of an MCO problem can be viewed as a NE under the concept of competitive games. This paper surveys and proposes four efficient algorithms for calculating a NE of an MCO problem. Existence and equivalence of the solution are analyzed and proved in the paper based on the fixed-point theorem. A specific virtual symmetric Nash game is also presented to set up an optimization strategy for single-objective optimization problems. Two numerical examples are presented to verify the proposed algorithms. One is the optimization of mathematical functions, illustrating the detailed numerical procedures of the algorithms; the other is aerodynamic drag reduction of a civil transport wing-fuselage configuration using the virtual game. The successful application validates the efficiency of the algorithms in solving complex aerodynamic optimization problems.
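The fixed-point view translates directly into a best-response iteration. A sketch for two quadratic criteria coupled through a bilinear term (coefficients invented; convergence holds here because the iteration map is a contraction):

```python
# Player 1 minimizes (x - a)^2 + c1*x*y over x;
# Player 2 minimizes (y - b)^2 + c2*x*y over y.
a, b, c1, c2 = 1.0, 2.0, 0.5, 0.8

def br1(y):  # best response of player 1: argmin_x (x - a)^2 + c1*x*y
    return a - 0.5 * c1 * y

def br2(x):  # best response of player 2: argmin_y (y - b)^2 + c2*x*y
    return b - 0.5 * c2 * x

x, y = 0.0, 0.0
for _ in range(100):
    x, y = br1(y), br2(x)   # simultaneous best-response update

print(x, y)   # Nash equilibrium; converges since sqrt(c1*c2)/2 < 1
```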
Petitot, Maud; Manceau, Nicolas; Geniez, Philippe; Besnard, Aurélien
2014-09-01
Setting up effective conservation strategies requires precise determination of the targeted species' distribution area and, if possible, its local abundance. However, detection issues make these objectives complex for most vertebrates. The detection probability is usually <1 and is highly dependent on species phenology and other environmental variables. The aim of this study was to define an optimized survey protocol for the Mediterranean amphibian community, that is, to determine the most favorable periods and the most effective sampling techniques for detecting all species present on a site in a minimum number of field sessions and a minimum amount of prospecting effort. We visited 49 ponds located in the Languedoc region of southern France on four occasions between February and June 2011. Amphibians were detected using three methods: nighttime call counts, nighttime visual encounters, and daytime netting. The detection/nondetection data obtained were then modeled using site-occupancy models. The detection probability of amphibians differed sharply between species, the survey method used and the date of the survey; these three covariates also interacted. Thus, a minimum of three visits spread over the breeding season, using a combination of all three survey methods, is needed to reach a 95% detection level for all species in the Mediterranean region. Synthesis and applications: detection/nondetection surveys combined with a site-occupancy modeling approach are a powerful way to estimate the detection probability and to determine the prospecting effort necessary to assert that a species is absent from a site.
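The "95% with three visits" statement follows from the cumulative detection probability: with per-visit detection probability p, the chance of at least one detection in k visits is 1 - (1 - p)^k. A two-line check (the per-visit probability here is illustrative):

```python
import math

p = 0.65                                   # assumed per-visit detection probability
k = math.ceil(math.log(0.05) / math.log(1 - p))
print(k, 1 - (1 - p) ** k)                 # visits needed, achieved probability
```

With p = 0.65 this gives k = 3 visits for a cumulative detection probability of about 0.96, matching the order of effort reported above.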
Designing a space-based galaxy redshift survey to probe dark energy
NASA Astrophysics Data System (ADS)
Wang, Yun; Percival, Will; Cimatti, Andrea; Mukherjee, Pia; Guzzo, Luigi; Baugh, Carlton M.; Carbone, Carmelita; Franzetti, Paolo; Garilli, Bianca; Geach, James E.; Lacey, Cedric G.; Majerotto, Elisabetta; Orsi, Alvaro; Rosati, Piero; Samushia, Lado; Zamorani, Giovanni
2010-12-01
A space-based galaxy redshift survey would have enormous power in constraining dark energy and testing general relativity, provided that its parameters are suitably optimized. We study viable space-based galaxy redshift surveys, exploring the dependence of the Dark Energy Task Force (DETF) figure of merit (FoM) on redshift accuracy, redshift range, survey area, target selection and forecast method. Fitting formulae are provided for convenience. We also consider the dependence on the information used: the full galaxy power spectrum P(k), P(k) marginalized over its shape, or just the Baryon Acoustic Oscillations (BAO). We find that the inclusion of growth rate information (extracted using redshift space distortion and galaxy clustering amplitude measurements) leads to a factor of ~3 improvement in the FoM, assuming general relativity is not modified. This inclusion partially compensates for the loss of information when only the BAO are used to give geometrical constraints, rather than using the full P(k) as a standard ruler. We find that a space-based galaxy redshift survey covering ~20,000 deg^2 with σ_z/(1 + z) <= 0.001 exploits a redshift range that is only easily accessible from space, extends to sufficiently low redshifts to allow a vast 3D map of the universe using a single tracer population, and overlaps with ground-based surveys to enable robust modelling of systematic effects. We argue that these parameters are close to their optimal values given current instrumental and practical constraints.
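Under one common convention, the DETF FoM is the inverse area of the (w0, wa) error ellipse, computable directly from the forecast covariance matrix after marginalizing over all other parameters. A sketch with an illustrative matrix (not a value from this paper):

```python
import numpy as np

# Marginalized 2x2 covariance block for the dark energy parameters (w0, wa)
cov_w0_wa = np.array([[0.04, -0.10],
                      [-0.10, 0.36]])

fom = 1.0 / np.sqrt(np.linalg.det(cov_w0_wa))   # larger FoM = tighter ellipse
print(fom)
```

The factor-of-~3 FoM gains quoted above correspond to shrinking the determinant of this block by roughly an order of magnitude.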
NASA Astrophysics Data System (ADS)
Haapasalo, Erkka; Pellonpää, Juha-Pekka
2017-12-01
Various forms of optimality for quantum observables described as normalized positive-operator-valued measures (POVMs) are studied in this paper. We give characterizations for observables that determine the values of the measured quantity with probabilistic certainty or a state of the system before or after the measurement. We investigate observables that are free from noise caused by classical post-processing, mixing, or pre-processing of quantum nature. Especially, a complete characterization of pre-processing and post-processing clean observables is given, and necessary and sufficient conditions are imposed on informationally complete POVMs within the set of pure states. We also discuss joint and sequential measurements of optimal quantum observables.
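As a concrete anchor for the POVM formalism above, the sketch below checks the two defining conditions (positive semidefinite effects summing to the identity) for the standard qubit trine measurement, chosen here purely for illustration:

```python
import numpy as np

def bloch_effect(theta, weight):
    """Effect proportional to a projector along a Bloch vector in the x-z plane."""
    n = np.array([np.sin(theta), 0.0, np.cos(theta)])
    sx = np.array([[0, 1], [1, 0]], complex)
    sy = np.array([[0, -1j], [1j, 0]], complex)
    sz = np.array([[1, 0], [0, -1]], complex)
    return weight * (np.eye(2) + n[0] * sx + n[1] * sy + n[2] * sz) / 2

# Trine POVM: three effects at 120 degrees, each with weight 2/3
effects = [bloch_effect(th, 2 / 3) for th in (0, 2 * np.pi / 3, 4 * np.pi / 3)]

assert np.allclose(sum(effects), np.eye(2))                        # normalization
assert all(np.linalg.eigvalsh(E).min() > -1e-12 for E in effects)  # positivity
```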
DOE Office of Scientific and Technical Information (OSTI.GOV)
Albano Farias, L.; Stephany, J.
2010-12-15
We analyze the statistics of observables in continuous-variable (CV) quantum teleportation in the formalism of the characteristic function. We derive expressions for average values of output-state observables, in particular cumulants, which are additive in terms of the input state and the resource of teleportation. Working with a general class of teleportation resources, the squeezed-bell-like states, which may be optimized in a free parameter for better teleportation performance, we discuss the relation between resources optimal for fidelity and those optimal for different observable averages. We obtain the values of the free parameter of the squeezed-bell-like states which optimize the central momenta and cumulants up to fourth order. For the cumulants, the distortion between in and out states due to teleportation depends only on the resource. We obtain optimal parameters Δ_(2)^opt and Δ_(4)^opt for the second- and fourth-order cumulants, which do not depend on the squeezing of the resource. The second-order central momenta, which are equal to the second-order cumulants, and the photon number average are also optimized by the resource with Δ_(2)^opt. We show that the optimal fidelity resource, which has been found previously to depend on the characteristics of the input, approaches for high squeezing the resource that optimizes the second-order momenta. A similar behavior is obtained for the resource that optimizes the photon statistics, which is treated here using the sum of the squared differences in photon probabilities of input versus output states as the distortion measure. This is interpreted naturally to mean that the distortions associated with second-order momenta dominate the behavior of the output state for large squeezing of the resource. Optimal fidelity resources and optimal photon statistics resources are compared, and it is shown that for mixtures of Fock states both resources are equivalent.
Decision making with regard to antiviral intervention during an influenza pandemic.
Shim, Eunha; Chapman, Gretchen B; Galvani, Alison P
2010-01-01
Antiviral coverage is defined by the proportion of the population that takes antiviral prophylaxis or treatment. High coverage of an antiviral drug has epidemiological and evolutionary repercussions. Antivirals select for drug resistance within the population, and individuals may experience adverse effects. To determine optimal antiviral coverage in the context of an influenza outbreak, we compared 2 perspectives: 1) the individual level (the Nash perspective), and 2) the population level (utilitarian perspective). We developed an epidemiological game-theoretic model of an influenza pandemic. The data sources were published literature and a national survey. The target population was the US population. The time horizon was 6 months. The perspective was individuals and the population overall. The interventions were antiviral prophylaxis and treatment. The outcome measures were the optimal coverage of antivirals in an influenza pandemic. At current antiviral pricing, the optimal Nash strategy is 0% coverage for prophylaxis and 30% coverage for treatment, whereas the optimal utilitarian strategy is 19% coverage for prophylaxis and 100% coverage for treatment. Subsidizing prophylaxis by $440 and treatment by $85 would bring the Nash and utilitarian strategies into alignment. For both prophylaxis and treatment, the optimal antiviral coverage decreases as pricing of antivirals increases. Our study does not incorporate the possibility of an effective vaccine and lacks probabilistic sensitivity analysis. Our survey also does not completely represent the US population. Because our model assumes a homogeneous population and homogeneous antiviral pricing, it does not incorporate heterogeneity of preference. The optimal antiviral coverage from the population perspective and individual perspectives differs widely for both prophylaxis and treatment strategies. Optimal population and individual strategies for prophylaxis and treatment might be aligned through subsidization.
Decision Making with Regard to Antiviral Intervention during an Influenza Pandemic
Shim, Eunha; Chapman, Gretchen B.; Galvani, Alison P.
2012-01-01
Background Antiviral coverage is defined by the proportion of the population that takes antiviral prophylaxis or treatment. High coverage of an antiviral drug has epidemiological and evolutionary repercussions. Antivirals select for drug resistance within the population, and individuals may experience adverse effects. To determine optimal antiviral coverage in the context of an influenza outbreak, we compared 2 perspectives: 1) the individual level (the Nash perspective), and 2) the population level (utilitarian perspective). Methods We developed an epidemiological game-theoretic model of an influenza pandemic. The data sources were published literature and a national survey. The target population was the US population. The time horizon was 6 months. The perspective was individuals and the population overall. The interventions were antiviral prophylaxis and treatment. The outcome measures were the optimal coverage of antivirals in an influenza pandemic. Results At current antiviral pricing, the optimal Nash strategy is 0% coverage for prophylaxis and 30% coverage for treatment, whereas the optimal utilitarian strategy is 19% coverage for prophylaxis and 100% coverage for treatment. Subsidizing prophylaxis by $440 and treatment by $85 would bring the Nash and utilitarian strategies into alignment. For both prophylaxis and treatment, the optimal antiviral coverage decreases as pricing of antivirals increases. Our study does not incorporate the possibility of an effective vaccine and lacks probabilistic sensitivity analysis. Our survey also does not completely represent the US population. Because our model assumes a homogeneous population and homogeneous antiviral pricing, it does not incorporate heterogeneity of preference. Conclusions The optimal antiviral coverage from the population perspective and individual perspectives differs widely for both prophylaxis and treatment strategies. Optimal population and individual strategies for prophylaxis and treatment might be aligned through subsidization. PMID:20634545
Multi-Objective Optimization for Trustworthy Tactical Networks: A Survey and Insights
2013-06-01
... problems: using repeated cooperative games [12], hedonic games [25], and nontransferable utility cooperative games [27]. It should be noted that trust ... examined an optimal task allocation problem in a distributed computing system where program modules need to be allocated to different processors to ...
An Optical and Sunyaev-Zeldovich Blind Cluster Survey
NASA Astrophysics Data System (ADS)
Gomez, Percy; Romer, A. Kathy; Holzapfel, William; Peterson, Jeffrey; Ruhl, John; Goldstein, Jon; Daub, Mike
2005-08-01
We propose to perform multicolor observations of two deep fields that were observed with the ACBAR bolometer array located at the South Pole. These fields were observed down to a sensitivity of 8 microK/5 arcmin beam at 150 GHz. These observations will be used as control fields for our blind cluster survey which has identified some 30 cluster candidates to date. The goal of the observations is to quantify the number of clusters missed by our SZE survey. This information is important in order to derive constraints on sigma-8 from our SZE blind cluster survey.
Concordance of chart and billing data with direct observation in dental practice.
Demko, Catherine A; Victoroff, Kristin Zakariasen; Wotman, Stephen
2008-10-01
The commonly used methods of chart review, billing data summaries and practitioner self-reporting have not been examined for their ability to validly and reliably represent time use and service delivery in routine dental practice. A more thorough investigation of these data sources would provide insight into the appropriateness of each approach for measuring various clinical behaviors. The aim of this study was to assess the validity of commonly used methods such as dental chart review, billing data, or practitioner self-report compared with a 'gold standard' of information derived from direct observation of routine dental visits. A team of trained dental hygienists directly observed 3751 patient visits in 120 dental practices and recorded the behaviors and procedures performed by dentists and hygienists during patient contact time. Following each visit, charts and billing records were reviewed for the performed and billed procedures. Dental providers characterized their frequency of preventive service delivery through self-administered surveys. We standardized the observation and abstraction methods to obtain optimal measures from each of the multiple data sources. Multi-rater kappa coefficients were computed to monitor standardization, while sensitivity, specificity, and kappa coefficients were calculated to compare the various data sources with direct observation. Chart audits were more sensitive than billing data for all observed procedures and demonstrated higher agreement with directly observed data. Chart and billing records were not sensitive for several prevention-related tasks (oral cancer screening and oral hygiene instruction). Provider self-reports of preventive behaviors were always over-estimated compared with direct observation. Inter-method reliability kappa coefficients for 13 procedures ranged from 0.197 to 0.952. These concordance findings suggest that strengths and weaknesses of data collection sources should be considered when investigating delivery of dental services especially when using practitioner survey data. Future investigations can more fully rely on charted information rather than billing data and provider self-report for most dental procedures, but nonbillable procedures and most counseling interactions will not be captured with routine charting and billing practices.
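The agreement statistics reported above (sensitivity, specificity, kappa) reduce to arithmetic on a 2x2 table of chart results against direct observation. A sketch with made-up counts:

```python
import numpy as np

#                     observed: yes   observed: no
table = np.array([[620,  45],     # chart: yes
                  [130, 705]])    # chart: no

tp, fp = table[0]
fn, tn = table[1]
sens = tp / (tp + fn)                                # sensitivity vs observation
spec = tn / (tn + fp)                                # specificity vs observation

n = table.sum()
po = (tp + tn) / n                                   # observed agreement
pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2  # chance agreement
kappa = (po - pe) / (1 - pe)                         # Cohen's kappa
print(sens, spec, kappa)
```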
Self-esteem and optimism in rural youth: gender differences.
Puskar, Kathryn R; Bernardo, Lisa Marie; Ren, Dianxu; Haley, Tammy M; Tark, Kirsti Hetager; Switala, Joann; Siemon, Linda
2010-01-01
To identify and describe gender-related differences in the self-esteem and optimism levels of rural adolescents. Self-esteem and optimism have been broadly examined and are associated with health practices, social interaction, attachment, resiliency, and personal identity. Information describing the relationship of self-esteem and optimism to gender is limited. Using a cross-sectional survey design, students (N = 193) from three high schools in rural Pennsylvania, USA completed the Rosenberg Self-Esteem Scale and the Optimism Scale-Life Orientation Test-Revised as part of a National Institutes of Health, National Institute of Nursing Research funded study. Both instruments' mean scores were in the average range for this population, with females scoring lower than males in both self-esteem (p < 0.0001) and optimism (p < 0.0001). The results of this study have nursing implications for evidence-based interventions that target self-esteem and optimism. Attention to self-esteem and optimism in female youth is recommended.
Visual aids for aerial observers on forest insect surveys.
A.T. Larsen
1957-01-01
Aerial surveys are widely used to detect, appraise, and map damage caused to forest trees by insects. The success of these surveys largely depends upon the ability of observers to distinguish differences in foliage color and tree condition. The observers' ability is influenced by several factors.
Validity of the language development survey in infants born preterm.
Beaulieu-Poulin, Camille; Simard, Marie-Noëlle; Babakissa, Hélène; Lefebvre, Francine; Luu, Thuy Mai
2016-07-01
Preterm infants are at greater risk of language delay. Early identification of language delay is essential to improve functional outcomes in these children. To examine the concurrent validity of Rescorla's Language Development Survey and the Bayley Scales of Infant and Toddler Development (Bayley-III) at 18 months corrected age in preterm infants. Test accuracy study. 189 preterm infants born <29 weeks were assessed at 18 months. The Language Development Survey, a parent-reported screening instrument, was administered in French concurrently with the Language Scales of the Bayley-III. Receiver operating characteristic curves were used to determine the optimal cut-off score on the Language Development Survey to identify a Bayley-III score <85. Sensitivity, specificity, positive and negative predictive values, and the κ coefficient were calculated. Using Rescorla's original cut-off scores of ≤10 words for boys and ≤24 for girls, sensitivity was 76% and 88% for boys and girls, respectively, and specificity was 73% and 52% for boys and girls, respectively, in identifying language delay as per the Bayley-III. The optimal threshold was ≤10 words for both boys and girls. In girls, lowering the cut-off score decreased sensitivity (79%) but improved specificity (82%), thus lowering the number of false positives. Our findings support using the Language Development Survey as an expressive language screener in preterm infants. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Galaxy redshift surveys with sparse sampling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chiang, Chi-Ting; Wullstein, Philipp; Komatsu, Eiichiro
2013-12-01
Survey observations of the three-dimensional locations of galaxies are a powerful approach to measure the distribution of matter in the universe, which can be used to learn about the nature of dark energy, physics of inflation, neutrino masses, etc. A competitive survey, however, requires a large volume (e.g., V_survey ∼ 10 Gpc³) to be covered, and thus tends to be expensive. A 'sparse sampling' method offers a more affordable solution to this problem: within a survey footprint covering a given survey volume, V_survey, we observe only a fraction of the volume. The distribution of observed regions should be chosen such that their separation is smaller than the length scale corresponding to the wavenumber of interest. Then one can recover the power spectrum of galaxies with precision expected for a survey covering a volume of V_survey (rather than the volume of the sum of observed regions) with the number density of galaxies given by the total number of observed galaxies divided by V_survey (rather than the number density of galaxies within an observed region). We find that regularly-spaced sampling yields an unbiased power spectrum with no window function effect, and deviations from regularly-spaced sampling, which are unavoidable in realistic surveys, introduce calculable window function effects and increase the uncertainties of the recovered power spectrum. On the other hand, we show that the two-point correlation function (pair counting) is not affected by sparse sampling. While we discuss the sparse sampling method within the context of the forthcoming Hobby-Eberly Telescope Dark Energy Experiment, the method is general and can be applied to other galaxy surveys.
Bulk Data Dissemination in Low Power Sensor Networks: Present and Future Directions
Xu, Zhirong; Hu, Tianlei; Song, Qianshu
2017-01-01
Wireless sensor network-based (WSN-based) applications need an efficient and reliable data dissemination service to facilitate maintenance, management and data distribution tasks. As WSNs nowadays are becoming pervasive and data intensive, bulk data dissemination protocols have been extensively studied in recent years. This paper provides a comprehensive survey of state-of-the-art bulk data dissemination protocols. The many papers available in the literature propose various techniques to optimize dissemination protocols. Unlike existing surveys, which separately explore the building blocks of dissemination, our work categorizes the literature according to the optimization purposes: reliability, scalability, and transmission/energy efficiency. By summarizing and reviewing the key insights and techniques, we further discuss future directions for each category. Our survey helps unveil three key findings for future directions: (1) The recent advances in wireless communications (e.g., study on cross-technology interference, error estimating codes, constructive interference, capture effect) can be potentially exploited to support further optimization of the reliability and energy efficiency of dissemination protocols; (2) Dissemination in multi-channel, multi-task and opportunistic networks requires more effort to fully exploit the spatial-temporal network resources to enhance data propagation; (3) Since many designs incur changes to MAC layer protocols, the co-existence of dissemination with other network protocols is another problem left to be addressed. PMID:28098830
DESCQA: An Automated Validation Framework for Synthetic Sky Catalogs
NASA Astrophysics Data System (ADS)
Mao, Yao-Yuan; Kovacs, Eve; Heitmann, Katrin; Uram, Thomas D.; Benson, Andrew J.; Campbell, Duncan; Cora, Sofía A.; DeRose, Joseph; Di Matteo, Tiziana; Habib, Salman; Hearin, Andrew P.; Bryce Kalmbach, J.; Krughoff, K. Simon; Lanusse, François; Lukić, Zarija; Mandelbaum, Rachel; Newman, Jeffrey A.; Padilla, Nelson; Paillas, Enrique; Pope, Adrian; Ricker, Paul M.; Ruiz, Andrés N.; Tenneti, Ananth; Vega-Martínez, Cristian A.; Wechsler, Risa H.; Zhou, Rongpu; Zu, Ying; The LSST Dark Energy Science Collaboration
2018-02-01
The use of high-quality simulated sky catalogs is essential for the success of cosmological surveys. The catalogs have diverse applications, such as investigating signatures of fundamental physics in cosmological observables, understanding the effect of systematic uncertainties on measured signals and testing mitigation strategies for reducing these uncertainties, aiding analysis pipeline development and testing, and survey strategy optimization. The list of applications is growing with improvements in the quality of the catalogs and the details that they can provide. Given the importance of simulated catalogs, it is critical to provide rigorous validation protocols that enable both catalog providers and users to assess the quality of the catalogs in a straightforward and comprehensive way. For this purpose, we have developed the DESCQA framework for the Large Synoptic Survey Telescope Dark Energy Science Collaboration as well as for the broader community. The goal of DESCQA is to enable the inspection, validation, and comparison of an inhomogeneous set of synthetic catalogs via the provision of a common interface within an automated framework. In this paper, we present the design concept and first implementation of DESCQA. In order to establish and demonstrate its full functionality we use a set of interim catalogs and validation tests. We highlight several important aspects, both technical and scientific, that require thoughtful consideration when designing a validation framework, including validation metrics and how these metrics impose requirements on the synthetic sky catalogs.
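As an illustration of the design concept, the sketch below shows the kind of uniform test-on-catalog interface such a framework can impose; the class and method names are hypothetical, not DESCQA's actual API:

```python
# Hedged sketch of a validation-framework interface: every test implements
# the same entry point, so an inhomogeneous set of catalogs can be checked
# uniformly. Names and the pass criterion here are illustrative only.
import numpy as np

class ValidationTest:
    """Base class: every test runs uniformly on any catalog reader."""
    def run(self, catalog):
        raise NotImplementedError

class StellarMassRangeTest(ValidationTest):
    """Pass if the catalog's stellar masses span a required range."""
    def __init__(self, min_mass=1e8, max_mass=1e11):
        self.min_mass, self.max_mass = min_mass, max_mass

    def run(self, catalog):
        mass = np.asarray(catalog["stellar_mass"])
        ok = mass.min() <= self.min_mass and mass.max() >= self.max_mass
        return {"test": type(self).__name__, "passed": bool(ok),
                "range": (float(mass.min()), float(mass.max()))}

# Any catalog exposing dict-style column access can be validated the same way:
mock = {"stellar_mass": 10 ** np.random.uniform(7.5, 11.5, 10000)}
print(StellarMassRangeTest().run(mock))
```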
Calibration of weak-lensing shear in the Kilo-Degree Survey
NASA Astrophysics Data System (ADS)
Fenech Conti, I.; Herbonnet, R.; Hoekstra, H.; Merten, J.; Miller, L.; Viola, M.
2017-05-01
We describe and test the pipeline used to measure the weak-lensing shear signal from the Kilo-Degree Survey (KiDS). It includes a novel method of 'self-calibration' that partially corrects for the effect of noise bias. We also discuss the 'weight bias' that may arise in optimally weighted measurements, and present a scheme to mitigate that bias. To study the residual biases arising from both galaxy selection and shear measurement, and to derive an empirical correction to reduce the shear biases to ≲1 per cent, we create a suite of simulated images whose properties are close to those of the KiDS survey observations. We find that the use of 'self-calibration' reduces the additive and multiplicative shear biases significantly, although further correction via a calibration scheme is required, which also corrects for a dependence of the bias on galaxy properties. We find that the calibration relation itself is biased by the use of noisy, measured galaxy properties, which may limit the final accuracy that can be achieved. We assess the accuracy of the calibration in the tomographic bins used for the KiDS cosmic shear analysis, testing in particular the effect of possible variations in the uncertain distributions of galaxy size, magnitude and ellipticity, and conclude that the calibration procedure is accurate at the level of multiplicative bias ≲1 per cent required for the KiDS cosmic shear analysis.
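The shear biases discussed here are conventionally parameterized as g_obs = (1 + m) g_true + c, with m the multiplicative and c the additive bias; a minimal sketch, on simulated shears with made-up bias values, of recovering m and c by linear regression:

```python
# Hedged sketch: estimate multiplicative (m) and additive (c) shear bias
# from image simulations via the standard linear model. Numbers are made up.
import numpy as np

rng = np.random.default_rng(1)
g_true = rng.uniform(-0.05, 0.05, 5000)          # input shears in the sims
m_in, c_in = -0.012, 3e-4                        # hypothetical true biases
g_obs = (1 + m_in) * g_true + c_in + rng.normal(0, 0.003, g_true.size)

# Linear least-squares fit of slope and intercept:
slope, intercept = np.polyfit(g_true, g_obs, 1)
m_hat, c_hat = slope - 1, intercept
print(f"m = {m_hat:+.4f}, c = {c_hat:+.2e}")     # recovers ~(-0.012, 3e-4)
```

In practice such fits are repeated per tomographic bin and as a function of galaxy properties, which is what the calibration relation in the paper captures.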
Detecting Earth's temporarily-captured natural satellites-Minimoons
NASA Astrophysics Data System (ADS)
Bolin, Bryce; Jedicke, Robert; Granvik, Mikael; Brown, Peter; Howell, Ellen; Nolan, Michael C.; Jenniskens, Peter; Chyba, Monique; Patterson, Geoff; Wainscoat, Richard
2014-10-01
We present a study on the discoverability of temporarily captured orbiters (TCOs) by present day or near-term anticipated ground-based and space-based facilities. TCOs (Granvik, M., Vaubaillon, J., Jedicke, R. [2012]. Icarus 218, 262-277) are potential targets for spacecraft rendezvous or human exploration (Chyba, M., Patterson, G., Picot, G., Granvik, M., Jedicke, R., Vaubaillon, J. [2014]. J. Indust. Manage. Optim. 10, 477-501) and provide an opportunity to study the population of the smallest asteroids in the Solar System. We find that present day ground-based optical surveys such as Pan-STARRS and ATLAS can discover the largest TCOs over years of operation. A targeted survey conducted with the Subaru telescope can discover TCOs in the 0.5-1.0 m diameter size range in about 5 nights of observing. Furthermore, we discuss the application of space-based infrared surveys, such as NEOWISE, and ground-based meteor detection systems such as CAMS, CAMO and ASGARD in discovering TCOs. These systems can detect TCOs, but at an uninteresting rate. Finally, we discuss the application of bi-static radar at Arecibo and Green Bank to discover TCOs. Our radar simulations are strongly dependent on the rotation rate distribution of the smallest asteroids, but with an optimistic distribution we find that these systems have a >80% chance of detecting a >10 cm diameter TCO in about 40 h of operation.
Low Frequency Flats for Imaging Cameras on the Hubble Space Telescope
NASA Astrophysics Data System (ADS)
Kossakowski, Diana; Avila, Roberto J.; Borncamp, David; Grogin, Norman A.
2017-01-01
We created a revamped Low Frequency Flat (L-Flat) algorithm for the Hubble Space Telescope (HST) and all of its imaging cameras. The current program that makes these calibration files does not compile on modern computer systems and it requires translation to Python. We took the opportunity to explore various methods that reduce the scatter of photometric observations using chi-squared optimizers along with Markov Chain Monte Carlo (MCMC). We created simulations to validate the algorithms and then worked with the UV photometry of the globular cluster NGC6681 to update the calibration files for the Advanced Camera for Surveys (ACS) and Solar Blind Channel (SBC). The new software was made for general usage and therefore can be applied to any of the current imaging cameras on HST.
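A minimal sketch of the underlying idea, with simulated photometry rather than NGC6681 data: solve for per-zone flat-field offsets that minimize the scatter of repeated stellar measurements across the detector, here with a chi-squared optimizer standing in for the MCMC variant:

```python
# Hedged sketch: per-zone photometric offsets chosen to minimize the scatter
# of repeated observations of the same stars. All inputs are simulated.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
n_zones, n_stars, n_obs = 8, 60, 6
true_offset = rng.normal(0, 0.05, n_zones)
true_offset -= true_offset.mean()      # offsets defined only up to a constant
true_mag = rng.uniform(18, 22, n_stars)

star = np.repeat(np.arange(n_stars), n_obs)          # which star each row is
zone = rng.integers(0, n_zones, star.size)           # where it landed
obs = true_mag[star] + true_offset[zone] + rng.normal(0, 0.01, star.size)

def scatter(offsets):
    # center offsets to remove the degenerate global-shift direction
    corrected = obs - (offsets - offsets.mean())[zone]
    # best-fit star magnitudes are the per-star means of corrected values
    means = np.bincount(star, corrected) / np.bincount(star)
    return np.sum((corrected - means[star]) ** 2)    # chi-squared-like scatter

res = minimize(scatter, np.zeros(n_zones), method="Powell")
fit = res.x - res.x.mean()
print(np.max(np.abs(fit - true_offset)))             # small residual error
```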
Evaluation of sites for the location of WEEE recycling plants in Spain.
Queiruga, Dolores; Walther, Grit; González-Benito, Javier; Spengler, Thomas
2008-01-01
As a consequence of new European legal regulations for treatment of waste electrical and electronic equipment (WEEE), recycling plants have to be installed in Spain. In this context, this contribution describes a method for ranking Spanish municipalities according to their appropriateness for the installation of these plants. In order to rank the alternatives, the discrete multi-criteria decision method PROMETHEE (Preference Ranking Organisation METHod for Enrichment Evaluations), combined with a survey of experts, is applied. As existing plants are located in North and East Spain, a significant concentration of top-ranking municipalities can be observed in South and Central Spain. The method does not determine an optimal structure for the future recycling system, but provides a selection of good alternatives for potential locations of recycling plants.
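For reference, a minimal sketch of PROMETHEE II net-flow ranking with an assumed linear preference function; the weights, criteria, and scores below are placeholders, not the paper's expert-survey data:

```python
# Hedged sketch of PROMETHEE II: pairwise preferences per criterion are
# aggregated with weights, and alternatives are ranked by net outranking flow.
import numpy as np

def promethee_ii(scores, weights, p=1.0):
    """scores: (n_alternatives, n_criteria), higher is better.
    p: preference threshold of a linear preference function."""
    n = scores.shape[0]
    pi = np.zeros((n, n))                        # aggregated preference of a over b
    for a in range(n):
        for b in range(n):
            d = scores[a] - scores[b]
            pref = np.clip(d / p, 0, 1)          # linear preference in [0, 1]
            pi[a, b] = np.dot(weights, pref)
    phi_plus = pi.sum(axis=1) / (n - 1)          # positive outranking flow
    phi_minus = pi.sum(axis=0) / (n - 1)         # negative outranking flow
    return phi_plus - phi_minus                  # net flow: rank high to low

scores = np.array([[0.8, 0.4, 0.9],              # e.g., access, land cost, WEEE volume
                   [0.5, 0.9, 0.6],
                   [0.3, 0.7, 0.8]])
weights = np.array([0.5, 0.2, 0.3])              # hypothetical expert weights
print(promethee_ii(scores, weights))
```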
Rasch Analysis for Instrument Development: Why, When, and How?
Boone, William J.
2016-01-01
This essay describes Rasch analysis psychometric techniques and how such techniques can be used by life sciences education researchers to guide the development and use of surveys and tests. Specifically, Rasch techniques can be used to document and evaluate the measurement functioning of such instruments. Rasch techniques also allow researchers to construct “Wright maps” to explain the meaning of a test score or survey score and develop alternative forms of tests and surveys. Rasch techniques provide a mechanism by which the quality of life sciences–related tests and surveys can be optimized and the techniques can be used to provide a context (e.g., what topics a student has mastered) when explaining test and survey results. PMID:27856555
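The dichotomous Rasch model underlying such analyses gives the probability of a correct response as a logistic function of person ability minus item difficulty; a minimal sketch with simulated responses and a deliberately crude difficulty estimate (full Rasch estimation uses joint or conditional maximum likelihood):

```python
# Hedged sketch of the dichotomous Rasch model and a rough item-difficulty
# readout from simulated response data. Not a full Rasch estimation routine.
import numpy as np

def rasch_prob(theta, b):
    """P(correct) = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

rng = np.random.default_rng(3)
theta = rng.normal(0, 1, 500)                    # person abilities
b = np.array([-1.0, 0.0, 1.5])                   # item difficulties
responses = rng.random((500, 3)) < rasch_prob(theta[:, None], b[None, :])

# Crude difficulty estimate: logit of each item's failure rate, centered.
p = responses.mean(axis=0)                       # proportion correct per item
b_hat = np.log((1 - p) / p)
print(b_hat - b_hat.mean())                      # ordering matches b
```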
Shock and vibration technology with applications to electrical systems
NASA Technical Reports Server (NTRS)
Eshleman, R. L.
1972-01-01
A survey is presented of shock and vibration technology for electrical systems developed by the aerospace programs. The shock environment is surveyed along with new techniques for modeling, computer simulation, damping, and response analysis. Design techniques based on the use of analog computers, shock spectra, optimization, and nonlinear isolation are discussed. Shock mounting of rotors for performance and survival, and vibration isolation techniques are reviewed.
Resident choice and the survey process: the need for standardized observation and transparency.
Schnelle, John F; Bertrand, Rosanna; Hurd, Donna; White, Alan; Squires, David; Feuerberg, Marvin; Hickey, Kelly; Simmons, Sandra F
2009-08-01
To describe a standardized observation protocol to determine if nursing home (NH) staff offer choice to residents during 3 morning activities of daily living (ADL) and compare the observational data with deficiency statements cited by state survey staff. Morning ADL care was observed in 20 NHs in 5 states by research staff using a standardized observation protocol. The number of observations in which choice was not offered was documented for 3 morning ADL care activities and compared with deficiency statements made by surveyors. Staff failed to offer choice during morning ADL care delivery for at least 1 of 3 ADL care activities in all 20 NHs. Observational data showed residents were not offered choice about when to get out of bed (11%), what to wear (25%), and breakfast dining location (39%). In comparison, survey staff issued only 2 deficiencies in all 20 NHs relevant to choice in the targeted ADL care activities, and neither deficiency was based on observational data. Survey interpretative guidelines instruct surveyors to observe if residents are offered choice during daily care provision, but standardized observation protocols are not provided to surveyors to make this determination. The use of a standardized observation protocol in the survey process similar to that used by research staff in this study would improve the accuracy and transparency of the survey process.
Optimal Pain Assessment in Pediatric Rehabilitation: Implementation of a Nursing Guideline.
Kingsnorth, Shauna; Joachimides, Nick; Krog, Kim; Davies, Barbara; Higuchi, Kathryn Smith
2015-12-01
In Ontario, Canada, the Registered Nurses' Association promotes a Best Practice Spotlight Organization initiative to enhance evidence-based practice. Qualifying organizations are required to implement strategies, evaluate outcomes, and sustain practices aligned with nursing clinical practice guidelines. This study reports on the development and evaluation of a multifaceted implementation strategy to support adoption of a nursing clinical practice guideline on the assessment and management of acute pain in a pediatric rehabilitation and complex continuing care hospital. Multiple approaches were employed to influence behavior, attitudes, and awareness around optimal pain practice (e.g., instructional resources, electronic reminders, audits, and feedback). Four measures were introduced to assess pain in communicating and noncommunicating children as part of a campaign to treat pain as the fifth vital sign. A prospective repeated measures design examined survey and audit data to assess practice aligned with the guideline. The Knowledge and Attitudes Survey (KNAS) was adapted to ensure relevance to the local practice setting and was assessed before and after nurses' participation in three education modules. Audit data included client demographics and pain scores assessed annually over a 3-year window. A final sample of 69 nurses (78% response rate) provided pre-/post-survey data. A total of 108 pediatric surgical clients (younger than 19 years) contributed audit data across the three collection cycles. Significant improvements in nurses' knowledge, attitudes, and behaviors related to optimal pain care for children with disabilities were noted following adoption of the pain clinical practice guideline. Targeted guideline implementation strategies are central to supporting optimal pain practice. Copyright © 2015 American Society for Pain Management Nursing. Published by Elsevier Inc. All rights reserved.
Baldwin, Maureen; Hashima, Jason; Guise, Jeanne-Marie; Gregory, William Thomas; Edelman, Alison; Segel, Sally
2010-01-01
Objective At our institution, traditional postpartum rounds consisted of separate visits from all members of the obstetric team. This led to patient care inefficiencies and miscommunication. In an effort to improve patient care, patient-centered collaborative care (PCCC) was established, whereby physicians, residents, medical students, nurses, case managers, and social workers conduct rounds as a team. The goal of this observational study was to evaluate how PCCC rounds affected resident physicians' assessment of their work environment. Methods Obstetrics and gynecology residents completed a 13-question written survey designed to assess their sense of workflow, education, and workplace cohesion. Surveys were completed before and 6 months after the implementation of PCCC. Responses were compared in aggregate for preintervention and postintervention with Pearson χ2 test. Results Ninety-two percent of the obstetrics residents (n = 23) completed the preintervention survey, and 79% (n = 19) completed the postintervention survey. For most measures, there was no difference in resident perception between the 2 time points. After implementation of PCCC rounds, fewer residents felt that rounds were educational (preintervention = 39%, postintervention = 7%; P = .03). Conclusion Residents did not report negative impacts on workflow, cohesion, or general well-being after the implementation of PCCC rounds. However, there was a perception that PCCC rounds negatively impacted the educational value of postpartum rounds. This information will help identify ways to improve the resident physician experience in the obstetric service while optimizing patient care. PMID:21975886
A survey of compiler development aids. [concerning lexical, syntax, and semantic analysis
NASA Technical Reports Server (NTRS)
Buckles, B. P.; Hodges, B. C.; Hsia, P.
1977-01-01
A theoretical background was established for the compilation process by dividing it into five phases and explaining the concepts and algorithms that underpin each. The five selected phases were lexical analysis, syntax analysis, semantic analysis, optimization, and code generation. Graph theoretical optimization techniques were presented, and approaches to code generation were described for both one-pass and multipass compilation environments. Following the initial tutorial sections, more than 20 tools that were developed to aid in the process of writing compilers were surveyed. Eight of the more recent compiler development aids were selected for special attention - SIMCMP/STAGE2, LANG-PAK, COGENT, XPL, AED, CWIC, LIS, and JOCIT. The impact of compiler development aids was assessed, some of their shortcomings were identified, and some of the areas of research currently in progress were inspected.
Nzioki, Japheth Mativo; Ouma, James; Ombaka, James Hebert; Onyango, Rosebella Ongutu
2017-01-01
Immunization is a powerful and cost-effective health intervention which averts an estimated 2 to 3 million deaths every year. Kenya has high infant and under-five mortality and morbidity rates. Increasing routine child immunization coverage is one way of reducing child morbidity and mortality rates in Kenya. Community Health Workers (CHWs) have emerged as critical human resources for health in developing countries. The Community Strategy (CS) is one of the CHW-led interventions promoting Maternal and Child Health (MCH) in Kenya. This study sought to establish the effect of the CS on infant vaccination coverage (IVC) in Mwingi west sub-county, Kenya. This was a pretest - posttest experimental study design with 1 pretest and 2 post-test surveys conducted in intervention and control sites. Mwingi west and Mwingi north sub-counties were the intervention and control sites, respectively. The sample size in each survey was 422 households. Women with a child aged 9-12 months were the main respondents. The intervention site end-term evaluation indicated that the CS increased IVC by 10.1% (Z = 6.0241, P < 0.0001), from a suboptimal level of 88.7% at the baseline survey to an optimal level of 98.8% at the end-term survey. Infants in the intervention site were 2.5 times more likely to receive all recommended immunizations within their first year of life [(crude OR = 2.475, P < 0.0001; 95%CI: 1.794-3.414) (adj. OR = 2.516, P < 0.0001; 95%CI: 1.796-3.5240)]. The CS increased IVC in the intervention site to an optimal level (98.8%). To improve child health outcomes through immunization coverage, Kenya needs to fast-track nationwide implementation of the CS intervention.
Implementation of a 4x8 NIR and CCD Mosaic Focal Plane Technology
NASA Astrophysics Data System (ADS)
Jelinsky, Patrick; Bebek, C. J.; Besuner, R. W.; Haller, G. M.; Harris, S. E.; Hart, P. A.; Heetderks, H. D.; Levi, M. E.; Maldonado, S. E.; Roe, N. A.; Roodman, A. J.; Sapozhnikov, L.
2011-01-01
Mission concepts for NASA's Wide Field Infrared Survey Telescope (WFIRST), ESA's EUCLID mission, as well as for ground-based observations, have requirements for large mosaic focal planes to image visible and near infrared (NIR) wavelengths. We have developed detectors, readout electronics and focal plane design techniques that can be used to create very large scalable focal plane mosaic cameras. In our technology, CCDs and HgCdTe detectors can be intermingled on a single, silicon carbide (SiC) cold plate. This enables optimized, wideband observing strategies. The CCDs, developed at Lawrence Berkeley National Laboratory, are fully-depleted, p-channel devices that are backside illuminated capable of operating at temperatures as low as 110K and have been optimized for the weak lensing dark energy technique. The NIR detectors are 1.7 µm and 2.0 µm wavelength cutoff H2RG® HgCdTe, manufactured by Teledyne Imaging Sensors under contract to LBL. Both the CCDs and NIR detectors are packaged on 4-side abuttable SiC pedestals with a common mounting footprint supporting a 44.16 mm mosaic pitch and are coplanar. Both types of detectors have direct-attached, readout electronics that convert the detector signal directly to serial, digital data streams and allow a flexible, low cost data acquisition strategy, despite the large data volume. A mosaic of these detectors can be operated at a common temperature that achieves the required dark current and read noise performance in both types of detectors necessary for dark energy observations. We report here the design and integration for a focal plane designed to accommodate a 4x8 heterogeneous array of CCDs and HgCdTe detectors. Our current implementation contains over 1/4-billion pixels.
Optimization of Supercomputer Use on EADS II System
NASA Technical Reports Server (NTRS)
Ahmed, Ardsher
1998-01-01
The main objective of this research was to optimize supercomputer use to achieve better throughput and utilization of supercomputers and to help facilitate the movement of non-supercomputing (inappropriate for supercomputer) codes to mid-range systems for better use of Government resources at Marshall Space Flight Center (MSFC). This work involved the survey of architectures available on EADS II and monitoring customer (user) applications running on a CRAY T90 system.
Rodríguez Pérez, Sunay; Marshall, Nicholas William; Struelens, Lara; Bosmans, Hilde
2018-01-01
This work concerns the validation of the Kyoto-Kagaku thorax anthropomorphic phantom Lungman for use in chest radiography optimization. The equivalence in terms of polymethyl methacrylate (PMMA) was established for the lung and mediastinum regions of the phantom. Patient chest examination data acquired under automatic exposure control were collated over a 2-year period for a standard x-ray room. Parameters surveyed included exposure index, air kerma area product, and exposure time, which were compared with Lungman values. Finally, a voxel model was developed by segmenting computed tomography images of the phantom and implemented in PENELOPE/penEasy Monte Carlo code to compare phantom tissue-equivalent materials with materials from ICRP Publication 89 in terms of organ dose. PMMA equivalence varied depending on tube voltage, from 9.5 to 10.0 cm and from 13.5 to 13.7 cm, for the lungs and mediastinum regions, respectively. For the survey, close agreement was found between the phantom and the patients' median values (deviations lay between 8% and 14%). Differences in lung doses, an important organ for optimization in chest radiography, were below 13% when comparing the use of phantom tissue-equivalent materials versus ICRP materials. The study confirms the value of the Lungman for chest optimization studies.
ERIC Educational Resources Information Center
Yamaguchi, Kazuo
2016-01-01
This article describes (1) the survey methodological and statistical characteristics of the nonrandomized method for surveying sensitive questions for both cross-sectional and panel survey data and (2) the way to use the incompletely observed variable obtained from this survey method in logistic regression and in loglinear and log-multiplicative…
Probabilistic Selection of High-redshift Quasars with Subaru / Hyper Suprime-Cam Survey
NASA Astrophysics Data System (ADS)
Onoue, Masafusa
2015-08-01
High-redshift quasars are an important probe of the distant Universe. They enable observational studies of the early growth of supermassive black holes, cosmic reionization, chemical enrichment of host galaxies, and so on. We are now starting a new ground-breaking survey of high-redshift quasars (z>6) using the exquisite imaging data provided by the Hyper Suprime-Cam (HSC) Subaru Strategic Program (SSP) Survey. With its extremely wide-area coverage and high sensitivity through five optical bands (1,400 deg² to the depth of r~26 in the Wide layer), it is one of the most powerful contemporary surveys, making it possible for the HSC-AGN collaboration to increase the number of z>6 quasars by almost an order of magnitude, i.e., 300 at z~6 and 50 at z~7 based on the current estimate of the QLF at z>6 (Willott et al. 2010). One of the biggest challenges in the candidate selection is the significant contamination by Galactic brown dwarfs, which have the same point-like appearance as and similarly red colors to z>6 quasars. To overcome this issue, we have developed a template SED fitting method optimized for high-redshift quasar selection for constructing the largest z>6 quasar sample with the HSC survey. Since 500 deg² of the footprint of the HSC survey overlaps with the VISTA/VIKING survey, it is expected that z>6 quasars, with the characteristic large Lyman break and flat red continuum in their SEDs, can be separated from contaminating sources by applying SED fitting with multi-wavelength photometric data. In practice, its application with 27 photometric bands to the COSMOS quasars at 3
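A minimal sketch of the template-fitting step described above: for each candidate template the best flux scaling has a closed form, and the lowest-chi-squared template classifies the source. Band fluxes and template shapes below are placeholders, not HSC or VIKING measurements:

```python
# Hedged sketch: photometric chi-squared template fitting, comparing a
# quasar-like SED (Lyman break, flat red continuum) against a brown-dwarf-like
# SED. Minimizing sum(w*(f - a*t)^2) over a gives a = sum(w*f*t)/sum(w*t^2).
import numpy as np

def fit_template(flux, err, template):
    """Best-fit amplitude and chi-squared for model = a * template."""
    w = 1.0 / err**2
    a = np.sum(w * flux * template) / np.sum(w * template**2)
    chi2 = np.sum(w * (flux - a * template) ** 2)
    return a, chi2

flux = np.array([0.01, 0.02, 0.9, 1.1, 1.2])     # e.g., g, r, i, z, y (arbitrary units)
err = np.full(5, 0.05)
quasar_t = np.array([0.0, 0.0, 1.0, 1.2, 1.3])   # Lyman break + flat red continuum
dwarf_t = np.array([0.05, 0.1, 0.6, 1.3, 2.0])   # steeply rising red SED

for name, t in [("quasar", quasar_t), ("dwarf", dwarf_t)]:
    print(name, fit_template(flux, err, t))       # lower chi2 wins
```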
Astrophysics in the Era of Massive Time-Domain Surveys
NASA Astrophysics Data System (ADS)
Djorgovski, G.
Synoptic sky surveys are now the largest data producers in astronomy, entering the Petascale regime, opening the time domain for a systematic exploration. A great variety of interesting phenomena, spanning essentially all subfields of astronomy, can only be studied in the time domain, and these new surveys are producing large statistical samples of the known types of objects and events for further studies (e.g., SNe, AGN, variable stars of many kinds), and have already uncovered previously unknown subtypes of these (e.g., rare or peculiar types of SNe). These surveys are generating a new science, and paving the way for even larger surveys to come, e.g., the LSST; our ability to fully exploit such forthcoming facilities depends critically on the science, methodology, and experience that are being accumulated now. Among the outstanding challenges, the foremost is our ability to conduct an effective follow-up of the interesting events discovered by the surveys in any wavelength regime. The follow-up resources, especially spectroscopy, are already, and for the foreseeable future will be, severely limited, thus requiring an intelligent down-selection of the most astrophysically interesting events to follow. The first step in that process is an automated, real-time, iterative classification of events, that incorporates heterogeneous data from the surveys themselves, archival and contextual information (spatial, temporal, and multiwavelength), and the incoming follow-up observations. The second step is an optimal automated event prioritization and allocation of the available follow-up resources that also change in time. Both of these challenges are highly non-trivial, and require a strong cyber-infrastructure based on the Virtual Observatory data grid, and the various astroinformatics efforts. Time domain astronomy is inherently an astronomy of telescope-computational systems, and will increasingly depend on novel machine learning and artificial intelligence tools. Another arena with a strong potential for discovery is a purely archival, non-time-critical exploration of the time domain, with the time dimension adding complexity to an already challenging problem of data mining of highly-dimensional parameter spaces produced by sky surveys.
Cui, Huanqing; Shu, Minglei; Song, Min; Wang, Yinglong
2017-03-01
Localization is a key technology in wireless sensor networks. Faced with the challenges of the sensors' memory, computational constraints, and limited energy, particle swarm optimization has been widely applied in the localization of wireless sensor networks, demonstrating better performance than other optimization methods. In particle swarm optimization-based localization algorithms, the variants and parameters should be chosen elaborately to achieve the best performance. However, there is a lack of guidance on how to choose these variants and parameters. Further, there is no comprehensive performance comparison among particle swarm optimization algorithms. The main contribution of this paper is three-fold. First, it surveys the popular particle swarm optimization variants and particle swarm optimization-based localization algorithms for wireless sensor networks. Secondly, it presents parameter selection of nine particle swarm optimization variants and six types of swarm topologies by extensive simulations. Thirdly, it comprehensively compares the performance of these algorithms. The results show that the particle swarm optimization with constriction coefficient using ring topology outperforms other variants and swarm topologies, and it performs better than the second-order cone programming algorithm.
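A minimal sketch of the winning configuration named above, PSO with Clerc's constriction coefficient (χ ≈ 0.7298 for φ = 4.1) and a ring topology, applied to range-based node localization; anchor positions and measured ranges are simulated, not from any particular deployment:

```python
# Hedged sketch: constriction-coefficient PSO with a ring neighborhood,
# minimizing squared range residuals to locate one sensor node.
import numpy as np

rng = np.random.default_rng(4)
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
node = np.array([3.0, 7.0])                       # unknown position to recover
ranges = np.linalg.norm(anchors - node, axis=1) + rng.normal(0, 0.05, 4)

def cost(p):                                      # squared range residuals
    return np.sum((np.linalg.norm(anchors - p, axis=1) - ranges) ** 2)

n, chi, c = 20, 0.7298, 2.05                      # phi = c + c = 4.1 -> chi ~ 0.7298
pos = rng.uniform(0, 10, (n, 2))
vel = np.zeros((n, 2))
pbest, pbest_f = pos.copy(), np.array([cost(p) for p in pos])

for _ in range(200):
    # ring topology: neighborhood best among particle i and its two neighbors
    nb = np.array([min((i - 1) % n, i, (i + 1) % n, key=lambda j: pbest_f[j])
                   for i in range(n)])
    r1, r2 = rng.random((n, 2)), rng.random((n, 2))
    vel = chi * (vel + c * r1 * (pbest - pos) + c * r2 * (pbest[nb] - pos))
    pos += vel
    f = np.array([cost(p) for p in pos])
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]

print(pbest[np.argmin(pbest_f)])                  # close to (3, 7)
```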
The Sensitivity to Trans-Neptunian Dwarf Planets of the Siding Spring Survey
NASA Astrophysics Data System (ADS)
Bannister, Michele; Brown, M. E.; Schmidt, B. P.; Francis, P.; McNaught, R.; Garrad, G.; Larson, S.; Beshore, E.
2012-10-01
The last decade has seen considerable effort in assessing the populations of icy worlds in the outer Solar System, with major surveys in the Northern and, more recently, the Southern Hemisphere skies. Our archival search of more than ten thousand square degrees of sky south of the ecliptic observed over five years is a bright-object survey, sensitive to dwarf-planet sized trans-Neptunian objects. Our innovative survey analyses observations of the Siding Spring Survey, an ongoing survey for near-Earth asteroids at the 0.5 m Uppsala telescope at Siding Spring Observatory. This survey observed each of 2300 4.55 square degree fields on between 30 and 90 of the nights from early 2004 to late 2009, creating a dataset with dense temporal coverage, which we reprocessed for TNOs with a dedicated pipeline. We assess our survey's sensitivity to trans-Neptunian objects by simulating the observation of the synthetic outer Solar System populations of Grav et al. (2011): Centaurs, Kuiper belt and scattered disk. As our fields span approx. -15 to -70 declination, avoiding the galactic plane by 10 degrees on either side, we are particularly sensitive to dwarf planets in high-inclination orbits. Partly due to this coverage far from the ecliptic, all known dwarf planets, including Pluto, fell outside our survey coverage during its temporal span. We apply the widest plausible range of absolute magnitudes to each observable synthetic object, measuring each subsequent apparent magnitude against the magnitude depth of the survey observations. We evaluate our survey's null detection of new dwarf planets in light of our detection efficiencies as a function of trans-Neptunian orbital parameter space. MTB appreciates the funding support of the Joan Duffield Postgraduate Scholarship, an Australian Postgraduate Award, and the Astronomical Society of Australia.
NASA Astrophysics Data System (ADS)
Cioaca, Alexandru
A deep scientific understanding of complex physical systems, such as the atmosphere, can be achieved neither by direct measurements nor by numerical simulations alone. Data assimilation is a rigorous procedure to fuse information from a priori knowledge of the system state, the physical laws governing the evolution of the system, and real measurements, all with associated error statistics. Data assimilation produces best (a posteriori) estimates of model states and parameter values, and results in considerably improved computer simulations. The acquisition and use of observations in data assimilation raises several important scientific questions related to optimal sensor network design, quantification of data impact, pruning redundant data, and identifying the most beneficial additional observations. These questions originate in operational data assimilation practice, and have started to attract considerable interest in the recent past. This dissertation advances the state of knowledge in four dimensional variational (4D-Var) data assimilation by developing, implementing, and validating a novel computational framework for estimating observation impact and for optimizing sensor networks. The framework builds on the powerful methodologies of second-order adjoint modeling and the 4D-Var sensitivity equations. Efficient computational approaches for quantifying the observation impact include matrix free linear algebra algorithms and low-rank approximations of the sensitivities to observations. The sensor network configuration problem is formulated as a meta-optimization problem. Best values for parameters such as sensor location are obtained by optimizing a performance criterion, subject to the constraint posed by the 4D-Var optimization. Tractable computational solutions to this "optimization-constrained" optimization problem are provided. The results of this work can be directly applied to the deployment of intelligent sensors and adaptive observations, as well as to reducing the operating costs of measuring networks, while preserving their ability to capture the essential features of the system under consideration.
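For context, the strong-constraint 4D-Var cost function that such frameworks minimize can be written as follows (conventional notation, not quoted from the dissertation):

```latex
J(x_0) = \tfrac{1}{2}\,(x_0 - x_b)^{\mathsf T} B^{-1} (x_0 - x_b)
       + \tfrac{1}{2} \sum_{k=0}^{N} \big(\mathcal{H}_k(x_k) - y_k\big)^{\mathsf T}
         R_k^{-1} \big(\mathcal{H}_k(x_k) - y_k\big),
\qquad x_k = \mathcal{M}_{0 \to k}(x_0)
```

Here x_b is the background state, B and R_k are the background and observation error covariances, H_k are the observation operators, and M is the model propagator. The first-order adjoint of M supplies the gradient of J, and the second-order adjoint supplies Hessian-vector products, which is where the sensitivity-to-observations and observation-impact computations described above enter.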
PALM-3000: EXOPLANET ADAPTIVE OPTICS FOR THE 5 m HALE TELESCOPE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dekany, Richard; Bouchez, Antonin; Baranec, Christoph
2013-10-20
We describe and report first results from PALM-3000, the second-generation astronomical adaptive optics (AO) facility for the 5.1 m Hale telescope at Palomar Observatory. PALM-3000 has been engineered for high-contrast imaging and emission spectroscopy of brown dwarfs and large planetary mass bodies at near-infrared wavelengths around bright stars, but also supports general natural guide star use to V ≈ 17. Using its unique 66 × 66 actuator deformable mirror, PALM-3000 has thus far demonstrated residual wavefront errors of 141 nm rms under ∼1'' seeing conditions. PALM-3000 can provide phase conjugation correction over a 6.''4 × 6.''4 working region at λ = 2.2 μm, or full electric field (amplitude and phase) correction over approximately one-half of this field. With optimized back-end instrumentation, PALM-3000 is designed to enable 10⁻⁷ contrast at 1'' angular separation, including post-observation speckle suppression processing. While continued optimization of the AO system is ongoing, we have already successfully commissioned five back-end instruments and begun a major exoplanet characterization survey, Project 1640.
Educational attainment moderates the associations of diabetes education with health outcomes.
Kim, Su Hyun
2016-10-01
Diabetes education is a critical element of care for people with diabetes. However, the associations between diabetes education and self-care or health outcomes have not been clearly demonstrated at a national level. The aims of this study were to examine the associations of attendance of diabetes education classes with health behaviours and glycaemic control, and to understand whether these associations were moderated by level of educational attainment. Data were analysed for 456 adults from the 2012 Korea National Health and Nutrition Examination Survey V, collected from January 2010 to December 2012. No significant differences were observed between patients who had attended diabetes education classes and those who had never attended for factors such as smoking, drinking, exercise, nutrition therapy or glycaemic control. There was a significant interaction effect between receiving diabetes education and level of educational attainment on obtaining optimal glycaemic control. Attending diabetes education was positively associated with optimal glycaemic control among patients with more than a high school education but was negatively associated with it among those with less than middle school education. Diabetes education programmes need to be tailored to the needs and cognitive capacities of the target population. © 2016 John Wiley & Sons Australia, Ltd.
Synoptic Sky Surveys: Lessons Learned and Challenges Ahead
NASA Astrophysics Data System (ADS)
Djorgovski, Stanislav G.; CRTS Team
2014-01-01
A new generation of synoptic sky surveys is now opening the time domain for systematic exploration, presenting both great new scientific opportunities and challenges. These surveys are touching essentially all subfields of astronomy, producing large statistical samples of the known types of objects and events (e.g., SNe, AGN, variable stars of many kinds), and have already uncovered previously unknown subtypes of these (e.g., rare or peculiar types of SNe). They are generating new science now, and paving the way for even larger surveys to come, e.g., the LSST. Our ability to fully exploit such forthcoming facilities depends critically on the science, methodology, and experience that are being accumulated now. Among the outstanding challenges, the foremost is our ability to conduct an effective follow-up of the interesting events discovered by the surveys in any wavelength regime. The follow-up resources, especially spectroscopy, are already severely limited, and this problem will grow by orders of magnitude. This requires an intelligent down-selection of the most astrophysically interesting events to follow. The first step in that process is an automated, real-time, iterative classification of transient events that incorporates heterogeneous data from the surveys themselves, archival information (spatial, temporal, and multiwavelength), and the incoming follow-up observations. The second step is an optimal automated event prioritization and allocation of the available follow-up resources that also change in time. Both of these challenges are highly non-trivial, and require a strong cyber-infrastructure based on the Virtual Observatory data grid and the various astroinformatics efforts now under way. This is inherently an astronomy of telescope-computational systems, one that increasingly depends on novel machine learning and artificial intelligence tools. Another arena with a strong potential for discovery is an archival, non-time-critical exploration of the time domain, with the time dimension adding complexity to an already challenging problem of data mining of highly-dimensional data parameter spaces.
Koneff, M.D.; Royle, J. Andrew; Forsell, D.J.; Wortham, J.S.; Boomer, G.S.; Perry, M.C.
2005-01-01
Survey design for wintering scoters (Melanitta sp.) and other sea ducks that occur in offshore waters is challenging because these species have large ranges, are subject to distributional shifts among years and within a season, and can occur in aggregations. Interest in winter sea duck population abundance surveys has grown in recent years. This interest stems from concern over the population status of some sea ducks, limitations of extant breeding waterfowl survey programs in North America and the logistical challenges and costs of conducting surveys in northern breeding regions, high winter area philopatry in some species and its potential conservation implications, and increasing concern over offshore development and other threats to sea duck wintering habitats. The efficiency and practicality of statistically rigorous monitoring strategies for mobile, aggregated wintering sea duck populations have not been sufficiently investigated. This study evaluated a 2-phase adaptive stratified strip transect sampling plan to estimate the wintering population size of scoters, long-tailed ducks (Clangula hyemalis), and other sea ducks and provide information on distribution. The sampling plan results in an optimal allocation of a fixed sampling effort among offshore strata in the U.S. mid-Atlantic coast region. Phase 1 transect selection probabilities were based on historic distribution and abundance data, while Phase 2 selection probabilities were based on observations made during Phase 1 flights. Distance sampling methods were used to estimate detection rates. Environmental variables thought to affect detection rates were recorded during the survey, and post-stratification and covariate modeling were investigated to reduce the effect of heterogeneity on detection estimation. We assessed cost-precision tradeoffs under a number of fixed-cost sampling scenarios using Monte Carlo simulation. We discuss advantages and limitations of this sampling design for estimating wintering sea duck abundance and mapping distribution, and suggest improvements for future surveys.
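A minimal sketch of the optimal-allocation idea referenced above, in its simplest (Neyman) form: transect effort is allocated across strata in proportion to stratum size times stratum standard deviation. All numbers below are illustrative, not from the mid-Atlantic survey:

```python
# Hedged sketch: Neyman allocation of a fixed number of sampling units
# across strata, n_h proportional to N_h * S_h. Inputs are made up.
import numpy as np

def neyman_allocation(n_total, stratum_sizes, stratum_sds):
    """Allocate n_total sampling units across strata: n_h ∝ N_h * S_h."""
    weight = np.asarray(stratum_sizes) * np.asarray(stratum_sds)
    n_h = np.floor(n_total * weight / weight.sum()).astype(int)
    n_h[np.argmax(weight)] += n_total - n_h.sum()   # hand remainder to largest term
    return n_h

# Three offshore strata: nearshore (small area, highly variable counts),
# mid-shelf, and offshore (large area, sparse and less variable).
print(neyman_allocation(120, stratum_sizes=[50, 120, 300], stratum_sds=[40, 15, 4]))
```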
75 FR 48989 - Federal Interagency Steering Committee on Multimedia Environmental Modeling
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-12
..., ISCMEM Chair, U.S. Geological Survey, National Research Program, Branch of Regional Research, Eastern..., optimization modeling, reactive transport modeling, and watershed and distributed water quality modeling...
50 CFR 679.50 - Groundfish Observer Program.
Code of Federal Regulations, 2012 CFR
2012-10-01
... completion of the electronic vessel and/or processor survey(s); (B) Complete NMFS electronic vessel and/or processor surveys before performing other jobs or duties which are not part of NMFS groundfish observer...
50 CFR 679.50 - Groundfish Observer Program.
Code of Federal Regulations, 2011 CFR
2011-10-01
... completion of the electronic vessel and/or processor survey(s); (B) Complete NMFS electronic vessel and/or processor surveys before performing other jobs or duties which are not part of NMFS groundfish observer...
First Results from the ISO-IRAS Faint Galaxy Survey
NASA Technical Reports Server (NTRS)
Wolstencroft, R. D.; Wehrle, A. E.; Levine, D. A.
1997-01-01
We present the first results from the ISO-IRAS Faint Galaxy Survey (IIFGS), a program designed to obtain ISO observations of the most distant and luminous galaxies in the IRAS Faint Source Survey by filling short gaps in the ISO observing schedule with pairs of 12 μm ISOCAM and 90 μm ISOPHOT observations.
43 CFR 3861.2-1 - Particulars to be observed in mineral surveys.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 43 Public Lands: Interior 2 2012-10-01 2012-10-01 false Particulars to be observed in mineral...) BUREAU OF LAND MANAGEMENT, DEPARTMENT OF THE INTERIOR MINERALS MANAGEMENT (3000) MINERAL PATENT APPLICATIONS Surveys and Plats § 3861.2-1 Particulars to be observed in mineral surveys. (a) The following...
43 CFR 3861.2-1 - Particulars to be observed in mineral surveys.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 43 Public Lands: Interior 2 2013-10-01 2013-10-01 false Particulars to be observed in mineral...) BUREAU OF LAND MANAGEMENT, DEPARTMENT OF THE INTERIOR MINERALS MANAGEMENT (3000) MINERAL PATENT APPLICATIONS Surveys and Plats § 3861.2-1 Particulars to be observed in mineral surveys. (a) The following...
43 CFR 3861.2-1 - Particulars to be observed in mineral surveys.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 43 Public Lands: Interior 2 2011-10-01 2011-10-01 false Particulars to be observed in mineral...) BUREAU OF LAND MANAGEMENT, DEPARTMENT OF THE INTERIOR MINERALS MANAGEMENT (3000) MINERAL PATENT APPLICATIONS Surveys and Plats § 3861.2-1 Particulars to be observed in mineral surveys. (a) The following...
43 CFR 3861.2-1 - Particulars to be observed in mineral surveys.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 43 Public Lands: Interior 2 2014-10-01 2014-10-01 false Particulars to be observed in mineral...) BUREAU OF LAND MANAGEMENT, DEPARTMENT OF THE INTERIOR MINERALS MANAGEMENT (3000) MINERAL PATENT APPLICATIONS Surveys and Plats § 3861.2-1 Particulars to be observed in mineral surveys. (a) The following...
Sources of variation in detection of wading birds from aerial surveys in the Florida Everglades
Conroy, M.J.; Peterson, J.T.; Bass, O.L.; Fonnesbeck, C.J.; Howell, J.E.; Moore, C.T.; Runge, J.P.
2008-01-01
We conducted dual-observer trials to estimate detection probabilities (the probability that a group that is present and available is detected) for fixed-wing aerial surveys of wading birds in the Everglades system, Florida. Detection probability ranged from <0.2 to ∼0.75 and varied according to species, group size, observer, and the observer's position in the aircraft (front or rear seat). Aerial-survey simulations indicated that incomplete detection can have a substantial effect on assessment of population trends, particularly over relatively short intervals (<= 3 years) and small annual changes in population size (<= 3%). We conclude that detection bias is an important consideration for interpreting observations from aerial surveys of wading birds, potentially limiting the use of these data for comparative purposes and trend analyses. We recommend that workers conducting aerial surveys for wading birds endeavor to reduce observer and other controllable sources of detection bias and account for uncontrollable sources through incorporation of dual-observer or other calibration methods as part of survey design (e.g., using double sampling).
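A minimal sketch of how dual-observer counts yield detection probabilities of the kind reported above, using the standard two-observer (Lincoln-Petersen-style) estimator; the counts are illustrative, not the Everglades data:

```python
# Hedged sketch: detection probabilities from dual-observer trial counts.
# Groups seen by both observers calibrate each observer's detection rate.
def dual_observer_detection(seen_by_1_only, seen_by_2_only, seen_by_both):
    n1 = seen_by_1_only + seen_by_both           # total detected by observer 1
    n2 = seen_by_2_only + seen_by_both           # total detected by observer 2
    p1 = seen_by_both / n2                       # est. P(obs 1 detects | present)
    p2 = seen_by_both / n1                       # est. P(obs 2 detects | present)
    p_either = 1 - (1 - p1) * (1 - p2)           # combined detection probability
    return p1, p2, p_either

print(dual_observer_detection(seen_by_1_only=35, seen_by_2_only=20, seen_by_both=60))
```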
Heimdall System for MSSS Sensor Tasking
NASA Astrophysics Data System (ADS)
Herz, A.; Jones, B.; Herz, E.; George, D.; Axelrad, P.; Gehly, S.
In Norse Mythology, Heimdall uses his foreknowledge and keen eyesight to keep watch for disaster from his home near the Rainbow Bridge. Orbit Logic and the Colorado Center for Astrodynamics Research (CCAR) at the University of Colorado (CU) have developed the Heimdall System to schedule observations of known and uncharacterized objects and search for new objects from the Maui Space Surveillance Site. Heimdall addresses the current need for automated and optimized SSA sensor tasking driven by factors associated with improved space object catalog maintenance. Orbit Logic and CU developed an initial baseline prototype SSA sensor tasking capability for select sensors at the Maui Space Surveillance Site (MSSS) using STK and STK Scheduler, and then added a new Track Prioritization Component for FISST-inspired computations for predicted Information Gain and Probability of Detection, and a new SSA-specific Figure-of-Merit (FOM) for optimized SSA sensor tasking. While the baseline prototype addresses automation and some of the multi-sensor tasking optimization, the SSA-improved prototype addresses all of the key elements required for improved tasking leading to enhanced object catalog maintenance. The Heimdall proof-of-concept was demonstrated for MSSS SSA sensor tasking for a 24 hour period to attempt observations of all operational satellites in the unclassified NORAD catalog, observe a small set of high priority GEO targets every 30 minutes, make a sky survey of the GEO belt region accessible to MSSS sensors, and observe particular GEO regions that have a high probability of finding new objects with any excess sensor time. This Heimdall prototype software paves the way for further R&D that will integrate this technology into the MSSS systems for operational scheduling, improve the software's scalability, and further tune and enhance schedule optimization. The Heimdall software for SSA sensor tasking provides greatly improved performance over manual tasking, improved coordinated sensor usage, and tasking schedules driven by catalog improvement goals (reduced overall covariance, etc.). The improved performance also enables more responsive sensor tasking to address external events, newly detected objects, newly detected object activity, and sensor anomalies. Instead of having to wait until the next day's scheduling phase, events can be addressed with new tasking schedules immediately (within seconds or minutes). Perhaps the most important benefit is improved SSA based on an overall improvement to the quality of the space catalog. By driving sensor tasking and scheduling based on predicted Information Gain and other relevant factors, better decisions are made in the application of available sensor resources, leading to an improved catalog and better information about the objects of most interest. The Heimdall software solution provides a configurable, automated system to improve sensor tasking efficiency and responsiveness for SSA applications. The FISST algorithms for Track Prioritization, SSA specific task and resource attributes, Scheduler algorithms, and configurable SSA-specific Figure-of-Merit together provide optimized and tunable scheduling for the Maui Space Surveillance Site and possibly other sites and organizations across the U.S. military and for allies around the world.
Research Methods in Healthcare Epidemiology: Survey and Qualitative Research.
Safdar, Nasia; Abbo, Lilian M; Knobloch, Mary Jo; Seo, Susan K
2016-11-01
Surveys are one of the most frequently employed study designs in healthcare epidemiology research. Generally easier to undertake and less costly than many other study designs, surveys can be invaluable to gain insights into opinions and practices in large samples and may be descriptive and/or be used to test associations. In this context, qualitative research methods may complement this study design either at the survey development phase and/or at the interpretation/extension of results stage. This methods article focuses on key considerations for designing and deploying surveys in healthcare epidemiology and antibiotic stewardship, including identification of whether or not de novo survey development is necessary, ways to optimally lay out and display a survey, denominator measurement, discussion of biases to keep in mind particularly in research using surveys, and the role of qualitative research methods to complement surveys. We review examples of surveys in healthcare epidemiology and antimicrobial stewardship and review the pros and cons of methods used. A checklist is provided to help aid design and deployment of surveys in healthcare epidemiology and antimicrobial stewardship. Infect Control Hosp Epidemiol 2016;1-6.
NASA Astrophysics Data System (ADS)
Hotokezaka, K.; Nissanke, S.; Hallinan, G.; Lazio, T. J. W.; Nakar, E.; Piran, T.
2016-11-01
Mergers of binary neutron stars and black hole-neutron star binaries produce gravitational-wave (GW) emission and outflows with significant kinetic energies. These outflows result in radio emission through synchrotron radiation. We explore the detectability of these synchrotron-generated radio signals through follow-up observations of GW merger events lacking a detection of electromagnetic counterparts in other wavelengths. We model radio light curves arising from (I) sub-relativistic merger ejecta and (II) ultra-relativistic jets. The former produce radio remnants on timescales of a few years and the latter produce γ-ray bursts in the direction of the jet and orphan radio afterglows extending over wider angles on timescales of weeks. Based on the derived light curves, we suggest an optimized survey at 1.4 GHz with five epochs separated by a logarithmic time interval. We estimate the detectability, with current and future radio facilities, of the radio counterparts of simulated GW merger events detected by advanced LIGO and Virgo. The detectable distances for these GW merger events could be as high as 1 Gpc. Around 20%-60% of the long-lasting radio remnants will be detectable in the case of a moderate kinetic energy of 3×10⁵⁰ erg and a circum-merger density of 0.1 cm⁻³ or larger, while 5%-20% of the orphan radio afterglows with kinetic energy of 10⁴⁸ erg will be detectable. The detection likelihood increases if one focuses on the well-localizable GW events. We discuss the background noise due to radio fluxes of host galaxies and false positives arising from extragalactic radio transients and variable active galactic nuclei, and we show that the quiet radio transient sky is of great advantage when searching for the radio counterparts.
Empirical Likelihood in Nonignorable Covariate-Missing Data Problems.
Xie, Yanmei; Zhang, Biao
2017-04-20
Missing covariate data arise frequently in regression analyses in the health and social sciences as well as in survey sampling. We study methods for the analysis of a nonignorable covariate-missing data problem in an assumed conditional mean function when some covariates are completely observed but other covariates are missing for some subjects. We adopt the semiparametric perspective of Bartlett et al. (Improving upon the efficiency of complete case analysis when covariates are MNAR. Biostatistics 2014;15:719-30) on regression analyses with nonignorable missing covariates, in which they have introduced the use of two working models, the working probability model of missingness and the working conditional score model. In this paper, we study an empirical likelihood approach to nonignorable covariate-missing data problems with the objective of effectively utilizing the two working models in the analysis of covariate-missing data. We propose a unified approach to constructing a system of unbiased estimating equations, where there are more equations than unknown parameters of interest. One useful feature of these unbiased estimating equations is that they naturally incorporate the incomplete data into the data analysis, making it possible to seek efficient estimation of the parameter of interest even when the working regression function is not specified to be the optimal regression function. We apply the general methodology of empirical likelihood to optimally combine these unbiased estimating equations. We propose three maximum empirical likelihood estimators of the underlying regression parameters and compare their efficiencies with other existing competitors. We present a simulation study to compare the finite-sample performance of various methods with respect to bias, efficiency, and robustness to model misspecification. The proposed empirical likelihood method is also illustrated by an analysis of a data set from the US National Health and Nutrition Examination Survey (NHANES).
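For readers unfamiliar with the machinery, here is a minimal empirical likelihood computation for the simplest case, a population mean. It is not Xie and Zhang's estimator, but it shows the profile-likelihood construction that underlies combining unbiased estimating equations.

    import numpy as np
    from scipy.optimize import brentq

    def el_logratio_mean(x, mu):
        """-2 log empirical likelihood ratio for H0: E[X] = mu.
        Weights w_i = 1/(n*(1 + lam*g_i)), g_i = x_i - mu, where lam solves
        sum g_i/(1 + lam*g_i) = 0 (the profile-EL score equation)."""
        g = np.asarray(x) - mu
        if g.min() >= 0 or g.max() <= 0:
            return np.inf                    # mu outside the convex hull
        score = lambda lam: np.sum(g / (1.0 + lam * g))
        lo, hi = -1.0 / g.max(), -1.0 / g.min()  # keep all 1 + lam*g_i > 0
        lam = brentq(score, lo + 1e-9, hi - 1e-9)
        return 2.0 * np.sum(np.log(1.0 + lam * g))  # ~ chi^2_1 under H0

    x = np.random.default_rng(0).normal(1.0, 1.0, 200)
    print(el_logratio_mean(x, 1.0))   # small: H0 consistent with the data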
The SDSS-IV MaNGA Sample: Design, Optimization, and Usage Considerations
NASA Astrophysics Data System (ADS)
Wake, David A.; Bundy, Kevin; Diamond-Stanic, Aleksandar M.; Yan, Renbin; Blanton, Michael R.; Bershady, Matthew A.; Sánchez-Gallego, José R.; Drory, Niv; Jones, Amy; Kauffmann, Guinevere; Law, David R.; Li, Cheng; MacDonald, Nicholas; Masters, Karen; Thomas, Daniel; Tinker, Jeremy; Weijmans, Anne-Marie; Brownstein, Joel R.
2017-09-01
We describe the sample design for the SDSS-IV MaNGA survey and present the final properties of the main samples along with important considerations for using these samples for science. Our target selection criteria were developed while simultaneously optimizing the size distribution of the MaNGA integral field units (IFUs), the IFU allocation strategy, and the target density to produce a survey defined in terms of maximizing signal-to-noise ratio, spatial resolution, and sample size. Our selection strategy makes use of redshift limits that only depend on I-band absolute magnitude (M_I), or, for a small subset of our sample, M_I and color (NUV - I). Such a strategy ensures that all galaxies span the same range in angular size irrespective of luminosity and are therefore covered evenly by the adopted range of IFU sizes. We define three samples: the Primary and Secondary samples are selected to have a flat number density with respect to M_I and are targeted to have spectroscopic coverage to 1.5 and 2.5 effective radii (R_e), respectively. The Color-Enhanced supplement increases the number of galaxies in the low-density regions of color-magnitude space by extending the redshift limits of the Primary sample in the appropriate color bins. The samples cover the stellar mass range 5 × 10^8 ≤ M* ≤ 3 × 10^11 M⊙ h^-2 and are sampled at median physical resolutions of 1.37 and 2.5 kpc for the Primary and Secondary samples, respectively. We provide weights that will statistically correct for our luminosity and color-dependent selection function and IFU allocation strategy, thus correcting the observed sample to a volume-limited sample.
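A hedged sketch of the kind of volume weighting involved: a 1/V_max weight computed from per-galaxy redshift limits with astropy. The redshift limits, survey area, and Planck15 cosmology below are illustrative assumptions, not the MaNGA values.

    import numpy as np
    from astropy.cosmology import Planck15 as cosmo

    def vmax_weight(z_min, z_max, survey_area_deg2):
        """Illustrative 1/V_max weight: inverse of the comoving volume in
        which a galaxy with the given redshift limits could enter the sample."""
        frac = survey_area_deg2 / 41252.96   # fraction of the full sky
        v = frac * (cosmo.comoving_volume(z_max) - cosmo.comoving_volume(z_min))
        return 1.0 / v.value                  # Mpc^-3

    print(vmax_weight(0.02, 0.08, 2500.0))    # hypothetical limits and area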
Fu, M; Ahrenmark, U; Berglund, S; Lindholm, C J; Lehto, A; Broberg, A Månsson; Tasevska-Dinevska, G; Wikstrom, G; Ågard, A; Andersson, B
2017-12-01
Although heart rate (HR) control is one of the guideline-recommended treatment goals for heart failure (HF) patients, implementation has been painstakingly slow. It is therefore important to identify patients who have not yet achieved their target heart rates and to assess possible reasons why the targets are not met. The survey of HR in patients with HF in Sweden (HR-HF survey) is an investigator-initiated, prospective, multicenter, observational longitudinal study designed to investigate the state of the art in the control of HR in HF and to explore potential mechanisms underlying suboptimal HR control, with a focus on physicians' awareness of and adherence to guidelines for HR control and on the contributing role of beta-blockers (BBs). In 734 HF patients the mean HR was 68 ± 12 beats per minute (bpm), and 37.2% of the patients had a HR >70 bpm. Patients with HF with reduced ejection fraction (HFrEF) (n = 425) had the highest HR (70 ± 13 bpm, with 42% >70 bpm), followed by HF with preserved ejection fraction and HF with mid-range ejection fraction. Patients with atrial fibrillation, irrespective of HF type, had higher HR than those in sinus rhythm; a similar pattern was observed with BB treatment. Moreover, non-achievement of the recommended target HR (<70 bpm) in HFrEF with sinus rhythm was unrelated to age, sex, cardiovascular risk factors, cardiovascular diseases, and comorbidities, but was related to EF and to the clinical decision of the physician. Approximately 50% of the physicians considered a HR of >70 bpm optimal, and an equal number considered a HR of >70 bpm too high but did not recommend further action. Furthermore, suboptimal HR control cannot be attributed to the use of BBs, because there was neither a difference in the use of BBs nor an interaction with BBs for HR >70 bpm compared with HR <70 bpm. Suboptimal control of HR was noted in HFrEF with sinus rhythm, and it appeared to be attributable to physician decision making rather than to the use of BBs. Our results therefore underline the need for greater attention to HR control in patients with HFrEF and sinus rhythm, and they indicate a potential for improved HF care.
Griffin, Paul C.; Schoenecker, Kate A.; Gogan, Peter J.; Lubow, Bruce C.
2009-01-01
Reliable estimates of elk (Cervus elaphus) and deer (Odocoileus hemionus) abundance on Santa Rosa Island, Channel Islands National Park, California, are required to assess the success of management actions directed at these species. We conducted a double-observer aerial survey of elk on a large portion of Santa Rosa Island on March 19, 2009. All four persons on the helicopter were treated as observers. We used two analytical approaches: (1) with three capture occasions corresponding to three possible observers, pooling the observations from the two rear-seat observers, and (2) with four capture occasions treating each observer separately. Approach 1 resulted in an estimate of 483 elk in the survey zone with a 95-percent confidence interval of 479 to 524 elk. Approach 2 resulted in an estimate of 489 elk in the survey zone with a 95-percent confidence interval of 471 to 535 elk. Approximately 5 percent of the elk groups that were estimated to have been present in the survey area were not seen by any observer. Fog prevented us from collecting double-observer observations for deer as intended on March 20. However, we did count 434 deer during the double-observer counts of elk on March 19. Both the calculated number of elk and the observed number of deer are minimal estimates of numbers of each ungulate species on Santa Rosa Island as weather conditions precluded us from surveying the entire island.
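For intuition about double-observer estimation, the sketch below applies Chapman's bias-corrected two-sample mark-recapture estimator, a standard double-observer abundance formula. The counts are hypothetical, and the multi-occasion analysis in the report is more elaborate.

    def chapman_estimate(n1, n2, m):
        """Chapman's bias-corrected two-sample abundance estimator.
        n1, n2: animals seen by observer 1 and 2; m: seen by both."""
        n_hat = (n1 + 1) * (n2 + 1) / (m + 1) - 1
        var = ((n1 + 1) * (n2 + 1) * (n1 - m) * (n2 - m)
               / ((m + 1) ** 2 * (m + 2)))
        return n_hat, var ** 0.5

    n_hat, se = chapman_estimate(n1=120, n2=110, m=95)  # hypothetical counts
    print(round(n_hat), round(se, 1))

Animals missed by both observers are exactly what the estimator accounts for, which is why the estimate exceeds the raw count of unique animals seen.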
Observational Requirements for Underway Observations from Research Vessels
NASA Astrophysics Data System (ADS)
Smith, S. R.; Van Waes, M.
2016-02-01
Identifying observational requirements to build and sustain a global ocean observing system requires input from the user community. Research vessels are an essential and versatile component of the observing system. The authors will present results from a survey of the marine climate and oceanographic community that solicited observational requirements for research vessels. The goal of the survey is to determine priorities for underway instrumentation to be run on NOAA vessels operated by the Office of Marine and Aviation Operations (OMAO) to support secondary users of the NOAA fleet. Secondary users are defined as persons that do not routinely participate in cruises on NOAA vessels, but have a research or operational need for underway observations from these vessels. Secondary applications of underway data from NOAA vessels include, but are not limited to, evaluation of analyses/forecast from ocean and atmospheric models, developing satellite retrieval algorithms, and validating observations from remote sensing systems (e.g., satellites, aircraft). For this survey, underway observations are defined as digital data generated by environmental sensor systems permanently installed on the vessel and routinely maintained by the operator. The survey also assessed the need for access to these observations in real-time versus delayed-mode. The authors will discuss how these survey results can be used to inform NOAA management on the requirements for underway observations during future NOAA vessel deployments. Although originally designed to assess requirements for NOAA vessels, the international response to the survey makes the results applicable to research vessel operations around the world.
Locating waterfowl observations on aerial surveys
Butler, W.I.; Hodges, J.I.; Stehn, R.A.
1995-01-01
We modified standard aerial survey data collection to obtain the geographic location for each waterfowl observation on surveys in Alaska during 1987-1993. Using transect navigation with GPS (global positioning system), data recording on continuously running tapes, and a computer data input program, we located observations with an average deviation along transects of 214 m. The method provided flexibility in survey design and data analysis. Although developed for geese nesting near the coast of the Yukon-Kuskokwim Delta, the methods are widely applicable and were used on other waterfowl surveys in Alaska to map distribution and relative abundance of waterfowl. Accurate location data with GIS analysis and display may improve the precision and usefulness of data from any aerial transect survey.
Markley, J Daniel; Pakyz, Amy; Bernard, Shaina; Lee, Kimberly; Appelbaum, Nital; Bearman, Gonzalo; Stevens, Michael P
2017-03-01
Mobile medical apps are commonly used by health care professionals and could be used by antimicrobial stewardship programs to enhance adherence to local recommendations. We conducted a survey of health care workers to inform the design of an antimicrobial stewardship smartphone app. Copyright © 2017 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
Bernstein, Daniel A; Salsgiver, Elizabeth; Simon, Matthew S; Greendyke, William; Eiras, Daniel P; Ito, Masahiro; Caruso, Dean A; Woodward, Timothy M; Perriel, Odette T; Saiman, Lisa; Furuya, E Yoko; Calfee, David P
2016-12-01
In this study, we used an online survey to assess knowledge, attitudes, and practices related to environmental cleaning and other infection prevention strategies among environmental services workers (ESWs) at 5 hospitals. Our findings suggest that ESWs could benefit from additional education and feedback as well as new strategies to address workflow challenges. Infect Control Hosp Epidemiol 2016;1492-1495.
The Earth Phenomena Observing System: Intelligent Autonomy for Satellite Operations
NASA Technical Reports Server (NTRS)
Ricard, Michael; Abramson, Mark; Carter, David; Kolitz, Stephan
2003-01-01
Earth monitoring systems of the future may include large numbers of inexpensive small satellites, tasked in a coordinated fashion to observe both long term and transient targets. For best performance, a tool which helps operators optimally assign targets to satellites will be required. We present the design of algorithms developed for real-time optimized autonomous planning of large numbers of small single-sensor Earth observation satellites. The algorithms will reduce requirements on the human operators of such a system of satellites, ensure good utilization of system resources, and provide the capability to dynamically respond to temporal terrestrial phenomena. Our initial real-time system model consists of approximately 100 satellites and a large number of points of interest on Earth (e.g., hurricanes, volcanoes, and forest fires), with the objective of maximizing the total science value of observations over time. Options for calculating the science value of observations include 1) total observation time, 2) number of observations, and 3) quality of the observations (a function of, e.g., sensor type, range, and slant angle). An integrated approach using integer programming, optimization, and astrodynamics is used to calculate optimized observation and sensor-tasking plans.
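A toy version of the tasking problem, solved by brute force rather than integer programming, illustrates the objective: assign each satellite one target so that the total science value is maximized. The value matrix is hypothetical.

    from itertools import product

    # science_value[s][t]: value of satellite s observing target t (hypothetical)
    science_value = [[5.0, 2.0, 0.0],
                     [1.0, 4.0, 3.0],
                     [0.0, 1.0, 6.0]]

    best_value, best_plan = float("-inf"), None
    for plan in product(range(3), repeat=3):   # each satellite picks one target
        value = sum(science_value[s][t] for s, t in enumerate(plan))
        if value > best_value:
            best_value, best_plan = value, plan

    print(best_plan, best_value)   # (0, 1, 2) 15.0

An integer-programming formulation scales far better than enumeration, which grows as targets^satellites, but the objective being maximized is the same.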
Seidahmed, Osama M. E.; Eltahir, Elfatih A. B.
2016-01-01
In dengue-endemic areas, transmission shows both a seasonal and interannual variability. To investigate how rainfall impacts dengue seasonality in Singapore, we carried out a longitudinal survey in the Geylang neighborhood from August 2014 to August 2015. The survey comprised of twice-weekly random inspections to outdoor breeding habitats and continuous monitoring for positive ones. In addition, observations of rainstorms were collected. Out of 6824 inspected habitats, 67 contained Aedes aegypti, 11 contained Aedes albopictus and 24 contained Culex spp. The main outdoors habitat of Aedes aegypti was storm drains (54/67). We found that 80% of breeding sites in drains (43/54) were lost after intense rainstorms related to the wet phase of the Northeast monsoon (NE) between November 2014 and early January 2015. Subsequently, 95% (41/43) of these flushed drains had dried out during the dry phase of the NE in late January-February 2015. A return in the outdoor breeding of Aedes aegypti was observed after the onset of Southwest monsoon (SW) between May and August 2015. There was also a reduction in productivity of breeding habitats for larvae and pupae after the onset of the NE. In wet equatorial regions like Singapore, rainfall varies with the monsoons. A monsoon-driven sequence of flushing and drying shapes the outdoor seasonal abundance of Aedes aegypti. This finding can be used to optimize vector control strategies and better understand dengue in the context of climate change. PMID:27459322
A DATA-DRIVEN MODEL FOR SPECTRA: FINDING DOUBLE REDSHIFTS IN THE SLOAN DIGITAL SKY SURVEY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsalmantza, P.; Hogg, David W., E-mail: vivitsal@mpia.de
2012-07-10
We present a data-driven method, heteroscedastic matrix factorization (a kind of probabilistic factor analysis), for modeling or performing dimensionality reduction on observed spectra or other high-dimensional data with known but non-uniform observational uncertainties. The method uses an iterative inverse-variance-weighted least-squares minimization procedure to generate a best set of basis functions. The method is similar to principal components analysis (PCA), but with the substantial advantage that it uses measurement uncertainties in a responsible way and accounts naturally for poorly measured and missing data; it models the variance in the noise-deconvolved data space. A regularization can be applied, in the form of a smoothness prior (inspired by Gaussian processes) or a non-negative constraint, without making the method prohibitively slow. Because the method optimizes a justified scalar (related to the likelihood), the basis provides a better fit to the data in a probabilistic sense than any PCA basis. We test the method on Sloan Digital Sky Survey (SDSS) spectra, concentrating on spectra known to contain two redshift components: these are spectra of gravitational lens candidates and massive black hole binaries. We apply a hypothesis test to compare one-redshift and two-redshift models for these spectra, utilizing the data-driven model trained on a random subset of all SDSS spectra. This test confirms 129 of the 131 lens candidates in our sample and all of the known binary candidates, and turns up very few false positives.
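A minimal numpy sketch of the core iteration, assuming plain alternating inverse-variance-weighted least squares with no smoothness prior or non-negativity constraint; it is a simplified stand-in, not the authors' code.

    import numpy as np

    def hmf(X, W, k, n_iter=50, seed=0):
        """Alternating inverse-variance-weighted least squares: find A (n x k)
        and G (k x m) minimizing sum_ij W_ij * (X_ij - (A@G)_ij)^2.
        W = 1/sigma^2; W = 0 marks missing data."""
        rng = np.random.default_rng(seed)
        n, m = X.shape
        A = rng.normal(size=(n, k))
        G = rng.normal(size=(k, m))
        for _ in range(n_iter):
            for i in range(n):   # update each row of A by weighted LS
                Wi = W[i][:, None] * G.T
                A[i] = np.linalg.lstsq(G @ Wi, G @ (W[i] * X[i]), rcond=None)[0]
            for j in range(m):   # update each column of G by weighted LS
                Wj = W[:, j][:, None] * A
                G[:, j] = np.linalg.lstsq(A.T @ Wj, A.T @ (W[:, j] * X[:, j]),
                                          rcond=None)[0]
        return A, G

    # toy usage: noisy rank-2 data with heteroscedastic, partly missing weights
    rng = np.random.default_rng(1)
    X = rng.normal(size=(30, 2)) @ rng.normal(size=(2, 20))
    W = rng.uniform(0.5, 2.0, X.shape) * (rng.random(X.shape) > 0.1)
    A, G = hmf(X, W, k=2)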
Starshade orbital maneuver study for WFIRST
NASA Astrophysics Data System (ADS)
Soto, Gabriel; Sinha, Amlan; Savransky, Dmitry; Delacroix, Christian; Garrett, Daniel
2017-09-01
The Wide Field Infrared Survey Telescope (WFIRST) mission, scheduled for launch in the mid-2020s, will perform exoplanet science via both direct imaging and a microlensing survey. An internal coronagraph is planned to perform starlight suppression for exoplanet imaging, but an external starshade could be used to achieve the required high contrasts with potentially higher throughput. This approach would require a separately launched occulter spacecraft to be positioned at exact distances from the telescope along the line of sight to a target star system. We present a detailed study to quantify the Δv requirements and feasibility of deploying this additional spacecraft as a means of exoplanet imaging. The primary focus of this study is the fuel used by the occulter while repositioning between targets. Based on its design, the occulter is given an offset distance from the nominal WFIRST halo orbit. Target star systems and look vectors are generated using the Exoplanet Open-Source Imaging Simulator (EXOSIMS); a boundary value problem is then solved between successive targets. On average, 50 observations are achievable with randomly selected targets given a 30-day transfer time. Individual trajectories can be optimized for transfer time as well as fuel usage for use in mission scheduling. Minimizing transfer time shortens the total mission time by a factor of up to 4.5 in some simulations before the entire fuel budget is expended. Minimizing Δv can generate starshade missions that achieve over 100 unique observations within the designated mission lifetime of WFIRST.
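For a rough feel for the fuel numbers, the toy below estimates a two-impulse Δv for a field-free, straight-line retargeting slew; the real study solves a boundary value problem in the dynamics near the WFIRST halo orbit, and the distance and time here are hypothetical.

    import numpy as np

    def two_impulse_dv(r0, v0, r1, v1, t_days):
        """Toy two-impulse transfer in field-free space: the coast velocity is
        fixed by the boundary positions, so dv = |v_coast - v0| + |v1 - v_coast|.
        (The WFIRST study solves the full boundary value problem near L2.)"""
        t = t_days * 86400.0
        v_coast = (np.asarray(r1, float) - np.asarray(r0, float)) / t
        return np.linalg.norm(v_coast - v0) + np.linalg.norm(v1 - v_coast)

    # hypothetical 50,000 km retargeting slew in 30 days, start and end at rest
    dv = two_impulse_dv([0, 0, 0], [0, 0, 0], [5e7, 0, 0], [0, 0, 0], 30.0)
    print(dv, "m/s")   # ~ 2 * (5e7 m / 2.592e6 s) = ~38.6 m/s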
NASA Astrophysics Data System (ADS)
Wijesinghe, D. B.; Hopkins, A. M.; Sharp, R.; Gunawardhana, M.; Brough, S.; Sadler, E. M.; Driver, S.; Baldry, I.; Bamford, S.; Liske, J.; Loveday, J.; Norberg, P.; Peacock, J.; Popescu, C. C.; Tuffs, R. J.; Bland-Hawthorn, J.; Cameron, E.; Croom, S.; Frenk, C.; Hill, D.; Jones, D. H.; van Kampen, E.; Kelvin, L.; Kuijken, K.; Madore, B.; Nichol, B.; Parkinson, H.; Pimbblet, K. A.; Prescott, M.; Robotham, A. S. G.; Seibert, M.; Simmat, E.; Sutherland, W.; Taylor, E.; Thomas, D.
2011-02-01
We present self-consistent star formation rates derived through pan-spectral analysis of galaxies drawn from the Galaxy and Mass Assembly (GAMA) survey. We determine the most appropriate form of dust obscuration correction via application of a range of extinction laws drawn from the literature as applied to Hα, [O II] and UV luminosities. These corrections are applied to a sample of 31,508 galaxies from the GAMA survey at z < 0.35. We consider several different obscuration curves, including the Milky Way, Calzetti, and Fischera & Dopita curves, and their effects on the observed luminosities. At the core of this technique is the observed Balmer decrement, and we provide a prescription to apply optimal obscuration corrections using the Balmer decrement. We carry out an analysis of the star formation history (SFH) using stellar population synthesis tools to investigate the evolutionary history of our sample of galaxies, as well as to understand the effects of variation in the initial mass function (IMF) on the evolutionary history of galaxies. We find that the Fischera & Dopita obscuration curve with an R_V value of 4.5 gives the best agreement between the different SFR indicators. The 2200 Å feature needed to be removed from this curve to obtain complete consistency between all SFR indicators, suggesting that this feature may not be common in the average integrated attenuation of galaxy emission. We also find that the UV dust obscuration is strongly dependent on the SFR.
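The Balmer-decrement correction at the core of the technique can be sketched in a few lines. The extinction-curve coefficients k(Hα) and k(Hβ) below are typical Galactic-curve values assumed for illustration, not the paper's fitted Fischera & Dopita curve.

    import numpy as np

    def halpha_corrected(f_ha, f_hb, k_ha=2.53, k_hb=3.61, dec_int=2.86):
        """Correct an observed H-alpha flux using the Balmer decrement:
        E(B-V) = 2.5/(k_Hb - k_Ha) * log10((f_Ha/f_Hb)/2.86).
        The k values are assumed Galactic-curve coefficients."""
        ebv = 2.5 / (k_hb - k_ha) * np.log10((f_ha / f_hb) / dec_int)
        ebv = max(ebv, 0.0)                 # no negative dust correction
        return f_ha * 10 ** (0.4 * k_ha * ebv)

    print(halpha_corrected(f_ha=100.0, f_hb=25.0))  # decrement 4.0 -> ~219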
The XXL Survey. VI. The 1000 brightest X-ray point sources
NASA Astrophysics Data System (ADS)
Fotopoulou, S.; Pacaud, F.; Paltani, S.; Ranalli, P.; Ramos-Ceja, M. E.; Faccioli, L.; Plionis, M.; Adami, C.; Bongiorno, A.; Brusa, M.; Chiappetti, L.; Desai, S.; Elyiv, A.; Lidman, C.; Melnyk, O.; Pierre, M.; Piconcelli, E.; Vignali, C.; Alis, S.; Ardila, F.; Arnouts, S.; Baldry, I.; Bremer, M.; Eckert, D.; Guennou, L.; Horellou, C.; Iovino, A.; Koulouridis, E.; Liske, J.; Maurogordato, S.; Menanteau, F.; Mohr, J. J.; Owers, M.; Poggianti, B.; Pompei, E.; Sadibekova, T.; Stanford, A.; Tuffs, R.; Willis, J.
2016-06-01
Context. X-ray extragalactic surveys are ideal laboratories for the study of the evolution and clustering of active galactic nuclei (AGN). Usually, a combination of deep and wide surveys is necessary to create a complete picture of the population. Deep X-ray surveys provide the faint population at high redshift, while wide surveys provide the rare bright sources. Nevertheless, very wide area surveys often lack the ancillary information available for modern deep surveys. The XXL survey spans two fields of a combined 50 deg^2 observed for more than 6 Ms with XMM-Newton, occupying the parameter space that lies between deep surveys and very wide area surveys; at the same time it benefits from a wealth of ancillary data. Aims: This paper marks the first release of the XXL point source catalogue including four optical photometry bands and redshift estimates. Our sample is selected in the 2-10 keV energy band with the goal of providing a sizable sample useful for AGN studies. The limiting flux is F_2-10 keV = 4.8 × 10^-14 erg s^-1 cm^-2. Methods: We use both public and proprietary data sets to identify the counterparts of the X-ray point-like sources by means of a likelihood ratio test. We improve upon the photometric redshift determination for AGN by applying a Random Forest classification trained to identify for each object the optimal photometric redshift category (passive, star forming, starburst, AGN, quasi-stellar objects (QSO)). Additionally, we assign a probability to each source that indicates whether it might be a star or an outlier. We apply Bayesian analysis to model the X-ray spectra assuming a power-law model with the presence of an absorbing medium. Results: We find that the average unabsorbed photon index is ⟨Γ⟩ = 1.85 ± 0.40, while the average hydrogen column density is log⟨N_H/cm^-2⟩ = 21.07 ± 1.2. We find no trend of Γ or N_H with redshift and a fraction of 26% absorbed sources (log N_H > 22), consistent with the literature on bright sources (log L_x > 44). The counterpart identification rate reaches 96.7% for sources in the northern field, 97.7% for the southern field, and 97.2% in total. The photometric redshift accuracy is 0.095 for the full XMM-XXL, with 28% catastrophic outliers estimated on a sample of 339 sources. Conclusions: We show that the XXL-1000-AGN sample number counts extend the number counts of the COSMOS survey to higher fluxes and are fully consistent with the Euclidean expectation. We constrain the intrinsic luminosity function of AGN in the 2-10 keV energy band, where the unabsorbed X-ray flux is estimated from the X-ray spectral fit, up to z = 3. Finally, we demonstrate the presence of a supercluster-size structure at redshift 0.14, identified by means of percolation analysis of the XXL-1000-AGN sample. The XXL survey, reaching a medium flux limit and covering a wide area, is a stepping stone between current deep fields and planned wide area surveys. Based on observations obtained with XMM-Newton, an ESA science mission with instruments and contributions directly funded by ESA Member States and NASA. Based on observations made with ESO Telescopes at the La Silla and Paranal Observatories under programme ID 089.A-0666 and LP191.A-0268. A copy of the XXL-1000-AGN Catalogue is available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/592/A5
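A hedged sklearn sketch of the Random Forest step, classifying each source into a photometric-redshift template category from its photometry; the features and labels below are synthetic stand-ins for the XXL catalogue.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(1)
    # hypothetical features: optical colors plus an X-ray flux ratio per source
    X_train = rng.normal(size=(500, 5))
    # hypothetical labels: 0=passive, 1=star-forming, 2=starburst, 3=AGN, 4=QSO
    y_train = rng.integers(0, 5, size=500)

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_train, y_train)
    proba = clf.predict_proba(rng.normal(size=(3, 5)))
    print(proba.round(2))   # per-source probability of each category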
Study optimizes gas lift in Gulf of Suez field
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abdel-Waly, A.A.; Darwish, T.A.; Osman Salama, A.
1996-06-24
A study using PVT data combined with fluid and multiphase flow correlations optimized gas lift in the Ramadan field, Nubia C, oil wells, in the Gulf of Suez. Selection of appropriate correlations followed by multiphase flow calculations at various points of injection (POI) were the first steps in the study. After determining the POI for each well from actual pressure and temperature surveys, the study constructed lift gas performance curves for each well. Actual and optimum operating conditions were compared to determine the optimal gas lift. The study indicated a net 2,115 bo/d could be gained from implementing its recommendations. The actual net oil gained as a result of this optimization and injected gas reallocation was 2,024 bo/d. The paper discusses the Ramadan field, fluid properties, multiphase flow, production optimization, and results.
A High-Resolution Near-Earth Object Population Enabling Next-Generation Search Strategies
NASA Technical Reports Server (NTRS)
Tricarico, Pasquale; Beshore, E. C.; Larson, S. M.; Boattini, A.; Williams, G. V.
2010-01-01
Over the past decade, the dedicated search for kilometer-size near-Earth objects (NEOs), potentially hazardous objects (PHOs), and potential Earth impactors has led to a boost in the rate of discoveries of these objects. The catalog of known NEOs is the fundamental ingredient used to develop a model for the NEO population, either by assessing and correcting for the observational bias (Jedicke et al., 2002) or by evaluating the migration rates from the NEO source regions (Bottke et al., 2002). The modeled NEO population is a necessary tool for tracking progress in the search for large NEOs (Jedicke et al., 2003), for predicting the distribution of the ones still undiscovered, and for studying the sky distribution of potential Earth impactors (Chesley & Spahr, 2004). We present a method to model the NEO population in all six orbital elements, on a finely grained grid, allowing us to design and test targeted and optimized search strategies. This method relies on the observational data routinely reported to the Minor Planet Center (MPC) by the Catalina Sky Survey (CSS) and by other active NEO surveys over the past decade to determine, on a nightly basis, the efficiency in detecting moving objects as a function of observable quantities including apparent magnitude, rate of motion, airmass, and galactic latitude. The cumulative detection probability is then computed for objects within a small range in orbital elements and absolute magnitude, and the comparison with the number of known NEOs within the same range allows us to model the population. When propagated to the present epoch and projected on the sky plane, this provides the distribution of the missing large NEOs, PHOs, and potential impactors.
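The nightly efficiency determination can be illustrated with a common functional form, a plateau rolling off near a 50%-efficiency magnitude; both the functional form and the synthetic data below are assumptions for illustration.

    import numpy as np
    from scipy.optimize import curve_fit

    def efficiency(m, eps0, m50, width):
        """Detection efficiency vs apparent magnitude: a plateau eps0
        rolling off around the 50%-efficiency magnitude m50."""
        return eps0 / (1.0 + np.exp((m - m50) / width))

    # hypothetical nightly efficiencies binned by magnitude
    mags = np.arange(15.0, 22.5, 0.5)
    eff_obs = (efficiency(mags, 0.95, 20.5, 0.4)
               + np.random.default_rng(2).normal(0, 0.02, mags.size))
    popt, _ = curve_fit(efficiency, mags, eff_obs, p0=[0.9, 20.0, 0.5])
    print(popt)   # fitted [eps0, m50, width]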
Wang, Chang; Qi, Fei; Shi, Guangming; Wang, Xiaotian
2013-01-01
Deployment is a critical issue affecting the quality of service of camera networks. The deployment problem aims to use the fewest cameras to cover the whole scene, which may contain obstacles that occlude the line of sight, with the expected observation quality. This is generally formulated as a non-convex optimization problem, which is hard to solve in polynomial time. In this paper, we propose an efficient convex solution for deployment that optimizes the observation quality, based on a novel anisotropic sensing model of cameras which provides a reliable measurement of the observation quality. The deployment is formulated as the selection of a subset of nodes from a redundant initial deployment with numerous cameras, which is an ℓ0 minimization problem. We then relax this non-convex optimization to a convex ℓ1 minimization employing sparse representation, so that a high-quality deployment is obtained efficiently via convex optimization. Simulation results confirm the effectiveness of the proposed camera deployment algorithms. PMID:23989826
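A minimal cvxpy sketch of the described ℓ0-to-ℓ1 relaxation, assuming a precomputed quality matrix and a unit quality requirement at every grid point (both of which are assumptions for illustration):

    import cvxpy as cp
    import numpy as np

    rng = np.random.default_rng(3)
    n_cams, n_pts = 40, 60
    # Q[j, i]: observation quality grid point j gets from candidate camera i
    # (hypothetical values; zero where occluded or out of the field of view)
    Q = rng.uniform(0, 1, (n_pts, n_cams)) * (rng.random((n_pts, n_cams)) < 0.3)

    x = cp.Variable(n_cams)                   # relaxed selection indicator
    prob = cp.Problem(cp.Minimize(cp.norm1(x)),
                      [Q @ x >= 1.0,          # required quality everywhere
                       x >= 0, x <= 1])
    prob.solve()
    selected = np.flatnonzero(x.value > 0.5)  # threshold back to a subset
    print(len(selected), "cameras selected")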
Using a genetic algorithm to optimize a water-monitoring network for accuracy and cost effectiveness
NASA Astrophysics Data System (ADS)
Julich, R. J.
2004-05-01
The purpose of this project is to determine the optimal spatial distribution of water-monitoring wells to maximize important data collection and to minimize the cost of managing the network. We have employed a genetic algorithm (GA) towards this goal. The GA uses a simple fitness measure with two parts: the first part awards a maximal score to those combinations of hydraulic head observations whose net uncertainty is closest to the value representing all observations present, thereby maximizing accuracy; the second part applies a penalty function to minimize the number of observations, thereby minimizing the overall cost of the monitoring network. We used the linear statistical inference equation to calculate standard deviations on predictions from a numerical model generated for the 501-observation Death Valley Regional Flow System as the basis for our uncertainty calculations. We have organized the results to address the following three questions: 1) what is the optimal design strategy for a genetic algorithm to optimize this problem domain; 2) what is the consistency of solutions over several optimization runs; and 3) how do these results compare to what is known about the conceptual hydrogeology? Our results indicate the genetic algorithms are a more efficient and robust method for solving this class of optimization problems than have been traditional optimization approaches.
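A compact GA sketch with the two-part fitness the abstract describes: closeness of the subset's uncertainty to the full-network value, minus a penalty on network size. The uncertainty function here is a stand-in for the model-based prediction standard deviations used in the study.

    import numpy as np

    rng = np.random.default_rng(4)
    N_WELLS, POP, GENS, COST = 60, 80, 200, 0.02

    def uncertainty(mask):
        """Stand-in for the linear-inference prediction uncertainty of a
        well subset (the study used model-based standard deviations)."""
        return 1.0 / (1.0 + mask.sum())

    FULL = uncertainty(np.ones(N_WELLS, dtype=bool))

    def fitness(mask):
        closeness = -abs(uncertainty(mask) - FULL)      # match full accuracy
        return closeness - COST * mask.sum() / N_WELLS  # penalize size

    pop = rng.random((POP, N_WELLS)) < 0.5
    for _ in range(GENS):
        scores = np.array([fitness(m) for m in pop])
        parents = pop[np.argsort(scores)[-POP // 2:]]    # truncation selection
        cut = rng.integers(1, N_WELLS, POP // 2)         # one-point crossover
        kids = np.array([np.concatenate((parents[i][:c], parents[-1 - i][c:]))
                         for i, c in enumerate(cut)])
        kids ^= rng.random(kids.shape) < 0.01            # bit-flip mutation
        pop = np.vstack((parents, kids))

    best = pop[np.argmax([fitness(m) for m in pop])]
    print(best.sum(), "wells kept of", N_WELLS)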
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-17
... the States have accumulated substantial experience in the design and implementation of these surveys... observational surveys is to include a specification of the survey design, to be reassessed and, if appropriate, updated every five (5) years, or earlier if the State so desires. The survey design specification will...
Using large spectroscopic surveys to test the double degenerate model for Type Ia supernovae
NASA Astrophysics Data System (ADS)
Breedt, E.; Steeghs, D.; Marsh, T. R.; Gentile Fusillo, N. P.; Tremblay, P.-E.; Green, M.; De Pasquale, S.; Hermes, J. J.; Gänsicke, B. T.; Parsons, S. G.; Bours, M. C. P.; Longa-Peña, P.; Rebassa-Mansergas, A.
2017-07-01
An observational constraint on the contribution of double degenerates to Type Ia supernovae requires multiple radial velocity measurements of ideally thousands of white dwarfs. This is because only a small fraction of the double degenerate population is massive enough, with orbital periods short enough, to be considered viable Type Ia progenitors. We show how the radial velocity information available from public surveys such as the Sloan Digital Sky Survey can be used to pre-select targets for variability, leading to a 10-fold reduction in observing time required compared to an unranked or random survey. We carry out Monte Carlo simulations to quantify the detection probability of various types of binaries in the survey and show that this method, even in the most pessimistic case, doubles the survey size of the largest survey to date (the SPY Survey) in less than 15 per cent of the required observing time. Our initial follow-up observations corroborate the method, yielding 15 binaries so far (eight known and seven new), as well as orbital periods for four of the new binaries.
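The pre-selection step can be illustrated with a chi-square variability statistic computed from the survey sub-exposures; the velocities and errors below are hypothetical.

    import numpy as np
    from scipy.stats import chi2

    def variability_logp(rv, rv_err):
        """Chi-square test against a constant radial velocity: returns
        log10 of the probability that the scatter is pure measurement noise.
        Very negative values flag likely binaries for follow-up."""
        rv, rv_err = np.asarray(rv), np.asarray(rv_err)
        mean = np.average(rv, weights=1.0 / rv_err**2)
        stat = np.sum(((rv - mean) / rv_err) ** 2)
        return chi2.logsf(stat, df=len(rv) - 1) / np.log(10.0)

    # two SDSS-like sub-exposures of one white dwarf (hypothetical, km/s)
    print(variability_logp([10.0, 95.0], [15.0, 15.0]))   # strongly variable

Ranking targets by this statistic and observing the most significant first is what yields the quoted reduction in telescope time over an unranked survey.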
Optimization of Steel Structures in Bending Using Plastic Analysis (Optimisation des structures métalliques fléchies dans un calcul plastique)
NASA Astrophysics Data System (ADS)
Geara, F.; Raphael, W.; Kaddah, F.
2005-05-01
Steel construction is widely used in civil engineering. In the design and then the fabrication and erection of a steel structure, the conceptual phase is often a source of discontinuities that prevents global optimization of the steel material. In our study, we used the traditional optimization approach, which is essentially based on minimizing the weight of the structure while taking advantage of the plastic properties of steel in a structure loaded in bending. This was made possible by the relation found between the cross-sectional areas of the steel elements and the plastic moments of these sections. These relations were derived for different types of steel. To take advantage of linear programming, a simplification was introduced by transforming these relations into linear ones, which allows simple methods such as the simplex algorithm to be used. This procedure proves very useful in the first phases of design and gives very interesting results.
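A toy linear program in the spirit described, assuming linearized area-to-plastic-moment relations Mp ≈ k·a (the coefficients and loads are hypothetical), solved with scipy's linprog:

    import numpy as np
    from scipy.optimize import linprog

    # Toy plastic design: choose section areas a1, a2 (cm^2) of two beam
    # groups to minimize weight, with linearized capacity Mp ~= k * a
    # (the linear a-Mp relations stand in for the fitted curves in the paper).
    k = np.array([12.0, 9.0])            # kN*m of plastic moment per cm^2
    length = np.array([6.0, 4.0])        # total member length per group, m
    m_req = np.array([180.0, 90.0])      # required plastic moments, kN*m

    c = length                           # weight proxy: area * length
    A_ub = -np.diag(k)                   # encodes -k_i * a_i <= -m_req_i
    b_ub = -m_req
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2)
    print(res.x)                         # [15., 10.] cm^2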
Optimal design of focused experiments and surveys
NASA Astrophysics Data System (ADS)
Curtis, Andrew
1999-10-01
Experiments and surveys are often performed to obtain data that constrain some previously underconstrained model. Often, constraints are most desired in a particular subspace of model space. Experiment design optimization requires that the quality of any particular design can be both quantified and then maximized. This study shows how the quality can be defined such that it depends on the amount of information that is focused in the particular subspace of interest. In addition, algorithms are presented which allow one particular focused quality measure (from the class of focused measures) to be evaluated efficiently. A subclass of focused quality measures is also related to the standard variance and resolution measures from linearized inverse theory. The theory presented here requires that the relationship between model parameters and data can be linearized around a reference model without significant loss of information. Physical and financial constraints define the space of possible experiment designs. Cross-well tomographic examples are presented, plus a strategy for survey design to maximize information about linear combinations of parameters such as bulk modulus, κ = λ + 2μ/3.
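A small numpy sketch of one plausible focused quality measure, the inverse trace of the linearized model covariance projected onto the subspace of interest; the damping term and random designs are illustrative assumptions, not Curtis's exact measure.

    import numpy as np

    def focused_quality(G, P, damping=1e-6):
        """Quality of a linearized design G (data x model) focused on the
        model subspace spanned by the columns of P: the smaller the
        projected model covariance, the better the design."""
        m = G.shape[1]
        cov = np.linalg.inv(G.T @ G + damping * np.eye(m))  # model covariance
        return 1.0 / np.trace(P.T @ cov @ P)

    # compare two hypothetical cross-well geometries for the same 3 parameters
    rng = np.random.default_rng(5)
    P = np.eye(3)[:, :1]                 # focus on parameter 1 only
    G_a, G_b = rng.normal(size=(8, 3)), rng.normal(size=(12, 3))
    print(focused_quality(G_a, P), focused_quality(G_b, P))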
Optimization of Medication Use at Accountable Care Organizations.
Wilks, Chrisanne; Krisle, Erik; Westrich, Kimberly; Lunner, Kristina; Muhlestein, David; Dubois, Robert
2017-10-01
Optimized medication use involves the effective use of medications for better outcomes, improved patient experience, and lower costs. Few studies systematically gather data on the actions accountable care organizations (ACOs) have taken to optimize medication use. To (a) assess how ACOs optimize medication use; (b) establish an association between efforts to optimize medication use and achievement on financial and quality metrics; (c) identify organizational factors that correlate with optimized medication use; and (d) identify barriers to optimized medication use. This cross-sectional study consisted of a survey and interviews that gathered information on the perceptions of ACO leadership. The survey contained a medication practices inventory (MPI) composed of 38 capabilities across 6 functional domains related to optimizing medication use. ACOs completed self-assessments that included rating each component of the MPI on a scale of 1 to 10. Fisher's exact tests, 2-proportions tests, t-tests, and logistic regression were used to test for associations between ACO scores on the MPI and performance on financial and quality metrics, and on ACO descriptive characteristics. Of the 847 ACOs that were contacted, 49 provided usable survey data. These ACOs rated their own system's ability to manage the quality and costs of optimizing medication use, providing a 64% and 31% affirmative response, respectively. Three ACOs achieved an overall MPI score of 8 or higher, 45 scored between 4 and 7.9, and 1 scored between 0 and 3.9. Using the 3 score groups, the study did not identify a relationship between MPI scores and achievement on financial or quality benchmarks, ACO provider type, member volume, date of ACO creation, or the presence of a pharmacist in a leadership position. Barriers to optimizing medication use relate to reimbursement for pharmacist integration, lack of health information technology interoperability, lack of data, feasibility issues, and physician buy-in. Compared with 2012 data, data on ACOs that participated in this study show that they continue to build effective strategies to optimize medication use. These ACOs struggle with both notification related to prescription use and measurement of the influence optimized medication use has on costs and quality outcomes. Compared with the earlier study, these data find that more ACOs are involving pharmacists directly in care, expanding the use of generics, electronically transmitting prescriptions, identifying gaps in care and potential adverse events, and educating patients on therapeutic alternatives. ACO-level policies that facilitate practices to optimize medication use are needed. Integrating pharmacists into care, giving both pharmacists and physicians access to clinical data, obtaining physician buy-in, and measuring the impact of practices to optimize medication use may improve these practices. This research was sponsored and funded by the National Pharmaceutical Council (NPC), an industry funded health policy research group that is not involved in lobbying or advocacy. Employees of the sponsor contributed to the research questions, determination of the relevance of the research questions, and the research design. Specifically, there was involvement in the survey and interview instruments. They also contributed to some data interpretation and revision of the manuscript. 
Leavitt Partners was hired by NPC to conduct research for this study and also serves a number of health care clients, including life sciences companies, provider organizations, accountable care organizations, and payers. Westrich and Dubois are employed by the NPC. Wilks, Krisle, Lunner, and Muhlestein are employed by Leavitt Partners and did not receive separate compensation. Study concept and design were contributed by Krisle, Dubois, and Muhlestein, along with Lunner and Westrich. Krisle and Muhlestein collected the data, and data interpretation was performed by Wilks, Krisle, and Muhlestein, along with Dubois and Westrich. The manuscript was written primarily by Wilks, along with Krisle and Muhlestein, and revised by Wilks, Westrich, Lunner, and Krisle. Preliminary versions of this work were presented at the following: National Council for Prescription Drug Programs Educational Summit, November 1, 2016; Academy Health 2016 Annual Research Meeting, June 27, 2016; Accountable Care Learning Collaborative Webinar, June 16, 2016; the 21st Annual PBMI Drug Benefit Conference, February 29, 2016; National Value-Based Payment and Pay for Performance Summit, February 17, 2016; National Accountable Care Congress, November 17, 2015; and American Journal of Managed Care's ACO Emerging Healthcare Delivery Coalition, Fall 2015 Live Meeting, October 15, 2015.
Rosenberg, Abby R; Bona, Kira; Wharton, Claire M; Bradford, Miranda; Shaffer, Michele L; Wolfe, Joanne; Baker, Kevin Scott
2016-04-01
Conducting patient-reported outcomes research with adolescents and young adults (AYAs) is difficult due to low participation rates and high attrition. Forty-seven AYAs with newly diagnosed cancer at two large hospitals were prospectively surveyed at the time of diagnosis and 3-6 and 12-18 months later. A subset participated in 1:1 semistructured interviews. Attrition prompted early study closure at one site. The majority of patients preferred paper-pencil to online surveys. Interview participants were more likely to complete surveys (e.g., 93% vs. 58% completion of 3-6 month surveys, P = 0.02). Engaging patients through qualitative methodologies and using patient-preferred instruments may optimize future research success. © 2015 Wiley Periodicals, Inc.
Toward an efficient Photometric Supernova Classifier
NASA Astrophysics Data System (ADS)
McClain, Bradley
2018-01-01
The Sloan Digital Sky Survey Supernova Survey (SDSS) discovered more than 1,000 Type Ia supernovae, yet fewer than half of these have spectroscopic measurements. As wide-field imaging surveys such as the Dark Energy Survey (DES) and the Panoramic Survey Telescope and Rapid Response System (Pan-STARRS) discover more supernovae, the need for accurate and computationally cheap photometric classifiers increases. My goal is to use a photometric classification algorithm based on sncosmo, a Python library for supernova cosmology analysis, to reclassify previously identified Hubble SN and other non-spectroscopically confirmed surveys. My results will be compared to other photometric classifiers such as PSNID and STARDUST. In the near future, I expect to have the algorithm validated with simulated data, optimized for efficiency, and applied with high-performance computing to real data.
A Twenty-Year Survey of Novae in M31
NASA Astrophysics Data System (ADS)
Crayton, Hannah; Rector, Travis A.; Walentosky, Matthew J.; Shafter, Allen W.; Lauber, Stephanie; Pilachowski, Catherine A.; RBSE Nova Search Team
2018-06-01
Numerous surveys of M31 in search of extragalactic novae have been completed over the last century, with a total of more than 1000 discovered during this time. From these surveys it has been estimated that approximately 65 novae occur in M31 per year (Darnley et al. 2006). A fraction of these are recurrent novae, which erupt again on timescales of years to decades (Shafter et al. 2015). From 1997 to 2017 we observed M31 with the KPNO/WIYN 0.9-meter telescope, which offers a wide field of view suitable for surveying nearly all of the bulge and much of the disk of M31. Observations were completed in Hα so as to better detect novae in the bulge of the galaxy, where most novae reside. Our survey achieves a limiting absolute magnitude per epoch of M_Hα ∼ -7.5 mag, which prior M31 nova surveys in Hα (e.g., Ciardullo et al. 1987; Shafter & Irby 2001) have shown to be sufficiently deep to detect a typical nova several months after eruption. By completing nearly all of the observations with the same telescope, cameras, and filters, we obtained a remarkably consistent dataset. Our survey offers several benefits compared to prior surveys. Nearly 200 epochs of observations were completed during the survey period. Observations were typically completed on a monthly basis, although on several occasions we completed weekly and nightly observations to search for novae with faster decay rates. Thus we were sensitive to most of the novae that erupted in M31 during the survey period. Over twenty years we detected 316 novae. Our survey found 85% of the novae in M31 that were reported by other surveys completed during the same time range and in the same survey area as ours (Pietsch et al. 2007). We also discovered 39 novae that were not found by other surveys. We present the complete catalog of novae from our survey, along with example light curves. Among other uses, our catalog will be useful for improving estimates of the nova rate in M31. We also identify 72 standard stars within the survey area that will be useful for future surveys.
Chang, E C; Bridewell, W B
1998-02-01
The present study compared the effects of irrational beliefs, measured by the Survey of Personal Beliefs (SPB), and of optimism and pessimism, measured by the revised Life Orientation Test (LOT-R), on depressive and anxious symptoms 6 weeks later. Results of analyses of variance for both measures of psychological distress indicated a significant main effect for pessimism only. Implications for Ellis's Rational Emotive Therapy are discussed.
Reid, Thomas; Chaganti, Subba Rao; Droppo, Ian G; Weisener, Christopher G
2018-06-01
Baseline biogeochemical surveys of natural environments are an often overlooked field of environmental studies. Too often, research begins once contamination has occurred, leaving a knowledge gap as to how the affected area behaved prior to outside (often anthropogenic) influences. Baseline characterizations can inform proposed bioremediation strategies crucial in cleaning up chemical spill sites or heavily mined regions. Hence, this study was conducted to survey the in-situ microbial activity within freshwater hydrocarbon-rich environments cutting through the McMurray formation, the geologic stratum constituting the oil sands. We are the first to report in-situ functional variations among these freshwater microbial ecosystems using metatranscriptomics, providing insight into in-situ gene expression within these naturally hydrocarbon-rich sites. Key genes involved in energy metabolism (nitrogen, sulfur, and methane) and hydrocarbon degradation are reported, including transcripts relating to the observed expression of methane oxidation. This information provides better linkages between hydrocarbon-impacted environments, closing knowledge gaps for optimizing not only oil sands mine reclamation but also microbial reclamation strategies in other freshwater environments. These findings can also be applied to existing contaminated environments in need of efficient reclamation efforts. Copyright © 2018 Elsevier Ltd. All rights reserved.
Color Separation of Galaxy Types in the Sloan Digital Sky Survey Imaging Data
NASA Astrophysics Data System (ADS)
Strateva, Iskra; Ivezić, Željko; Knapp, Gillian R.; Narayanan, Vijay K.; Strauss, Michael A.; Gunn, James E.; Lupton, Robert H.; Schlegel, David; Bahcall, Neta A.; Brinkmann, Jon; Brunner, Robert J.; Budavári, Tamás; Csabai, István; Castander, Francisco Javier; Doi, Mamoru; Fukugita, Masataka; Győry, Zsuzsanna; Hamabe, Masaru; Hennessy, Greg; Ichikawa, Takashi; Kunszt, Peter Z.; Lamb, Don Q.; McKay, Timothy A.; Okamura, Sadanori; Racusin, Judith; Sekiguchi, Maki; Schneider, Donald P.; Shimasaku, Kazuhiro; York, Donald
2001-10-01
We study the optical colors of 147,920 galaxies brighter than g*=21, observed in five bands by the Sloan Digital Sky Survey (SDSS) over ~100 deg^2 of high Galactic latitude sky along the celestial equator. The distribution of galaxies in the g*-r* versus u*-g* color-color diagram is strongly bimodal, with an optimal color separator of u*-r*=2.22. We use visual morphology and spectral classification of subsamples of 287 and 500 galaxies, respectively, to show that the two peaks correspond roughly to early- (E, S0, and Sa) and late-type (Sb, Sc, and Irr) galaxies, as expected from their different stellar populations. We also find that the colors of galaxies are correlated with their radial profiles, as measured by the concentration index and by the likelihoods of exponential and de Vaucouleurs' profile fits. While it is well known that late-type galaxies are bluer than early-type galaxies, this is the first detection of a local minimum in their color distribution. In all SDSS bands, the counts versus apparent magnitude relations for the two color types are significantly different and demonstrate that the fraction of blue galaxies increases toward the faint end.
NASA Astrophysics Data System (ADS)
Burgarella, D.; Levacher, P.; Vives, S.; Dohlen, K.; Pascal, S.
2016-07-01
FLARE (First Light And Reionization Explorer) is a space mission that will be submitted to ESA (M5 call). Its primary goal (~80% of lifetime) is to identify and study the universe before the end of reionization at z > 6. A secondary objective (~20% of lifetime) is to survey star formation in the Milky Way. FLARE's strategy optimizes the science return: imaging and spectroscopic integral-field observations will be carried out simultaneously on two parallel focal planes and over very wide instantaneous fields of view. FLARE will help address two of ESA's Cosmic Vision themes: a) "How did the universe originate and what is it made of?" and b) "What are the conditions for planet formation and the emergence of life?", more specifically "From gas and dust to stars and planets". FLARE will give the ESA community a leading position for statistical studies of the early universe after JWST's deep but pinhole-like surveys. Moreover, the instrumental development of wide-field imaging and wide-field integral-field spectroscopy in space will be a major breakthrough, following their deployment on ground-based telescopes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McQuinn, Kristen B. W.; Skillman, Evan D.; Dolphin, Andrew E.
Accurate distances are fundamental for interpreting various measured properties of galaxies. Surprisingly, many of the best-studied spiral galaxies in the Local Volume have distance uncertainties that are much larger than can be achieved with modern observation techniques. Using Hubble Space Telescope optical imaging, we use the tip of the red giant branch method to measure the distances to six galaxies that are included in the Spitzer Infrared Nearby Galaxies Survey program and its offspring surveys. The sample includes M63, M74, NGC 1291, NGC 4559, NGC 4625, and NGC 5398. We compare our results with distances reported to these galaxies based on a variety of methods. Depending on the technique, there can be a wide range in published distances, particularly from the Tully-Fisher relation. In addition, differences between the planetary nebular luminosity function and surface brightness fluctuation techniques can vary between galaxies, suggesting inaccuracies that cannot be explained by systematics in the calibrations. Our distances improve upon previous results, as we use a well-calibrated, stable distance indicator, precision photometry in an optimally selected field of view, and a Bayesian maximum likelihood technique that reduces measurement uncertainties.
NASA Astrophysics Data System (ADS)
Themistocleous, Kyriacos; Neocleous, Kyriacos; Pilakoutas, Kypros; Hadjimitsis, Diofantos G.
2014-08-01
The predominant approach for conducting road condition surveys and analyses is still largely based on extensive field observations. However, visual assessment alone cannot identify the actual extent and severity of damage. New non-invasive and cost-effective non-destructive testing (NDT) remote sensing technologies can be used to monitor road pavements across their life cycle, including remotely sensed aerial and satellite visual and thermal image data, Unmanned Aerial Vehicles (UAVs), spectroscopy, and Ground Penetrating Radar (GPR). These non-contact techniques can be used to obtain surface and sub-surface information about damage in road pavements, including crack depth and in-depth structural failure. Thus, a smart and cost-effective methodology is required that integrates several of these non-destructive, non-contact techniques for damage assessment and monitoring at different levels. This paper presents an overview of how an integration of the above technologies can be used to conduct detailed road condition surveys. The proposed approach can also be used to predict future road maintenance needs; this information is valuable to strategic decision-making tools that optimize maintenance based on resources and environmental issues.
Assessing Diversity of DNA Structure-Related Sequence Features in Prokaryotic Genomes
Huang, Yongjie; Mrázek, Jan
2014-01-01
Prokaryotic genomes are diverse in terms of their nucleotide and oligonucleotide composition as well as presence of various sequence features that can affect physical properties of the DNA molecule. We present a survey of local sequence patterns which have a potential to promote non-canonical DNA conformations (i.e. different from standard B-DNA double helix) and interpret the results in terms of relationships with organisms' habitats, phylogenetic classifications, and other characteristics. Our present work differs from earlier similar surveys not only by investigating a wider range of sequence patterns in a large number of genomes but also by using a more realistic null model to assess significant deviations. Our results show that simple sequence repeats and Z-DNA-promoting patterns are generally suppressed in prokaryotic genomes, whereas palindromes and inverted repeats are over-represented. Representation of patterns that promote Z-DNA and intrinsic DNA curvature increases with increasing optimal growth temperature (OGT), and decreases with increasing oxygen requirement. Additionally, representations of close direct repeats, palindromes and inverted repeats exhibit clear negative trends with increasing OGT. The observed relationships with environmental characteristics, particularly OGT, suggest possible evolutionary scenarios of structural adaptation of DNA to particular environmental niches. PMID:24408877
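Counting one of the surveyed signals, reverse-complement (DNA) palindromes, takes only a few lines; the sliding-window counter below is a simplified stand-in for the paper's pattern statistics and null model.

    COMP = str.maketrans("ACGT", "TGCA")

    def count_palindromes(seq, k=6):
        """Count k-mers equal to their own reverse complement (DNA
        palindromes, e.g. GAATTC), one signal surveyed across genomes."""
        n = 0
        for i in range(len(seq) - k + 1):
            kmer = seq[i:i + k]
            if kmer == kmer.translate(COMP)[::-1]:
                n += 1
        return n

    print(count_palindromes("ACGAATTCGTGGATCCAA"))  # 2: GAATTC and GGATCC

Assessing over- or under-representation then requires comparing such counts against expectations from a realistic null model, as the paper emphasizes.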
Using wound care algorithms: a content validation study.
Beitz, J M; van Rijswijk, L
1999-09-01
Valid and reliable heuristic devices facilitating optimal wound care are lacking. The objectives of this study were to establish content validation data for a set of wound care algorithms, to identify their associated strengths and weaknesses, and to gain insight into the wound care decision-making process. Forty-four registered nurse wound care experts were surveyed and interviewed at national and regional educational meetings. Using a cross-sectional study design and an 83-item, 4-point Likert-type scale, this purposive sample was asked to quantify the degree of validity of the algorithms' decisions and components. Participants' comments were tape-recorded, transcribed, and themes were derived. On a scale of 1 to 4, the mean score of the entire instrument was 3.47 (SD +/- 0.87), the instrument's Content Validity Index was 0.86, and the individual Content Validity Index of 34 of 44 participants was > 0.8. Item scores were lower for those related to packing deep wounds (P < .001). No other significant differences were observed. Qualitative data analysis revealed themes of difficulty associated with wound assessment and care issues, that is, the absence of valid and reliable definitions. The wound care algorithms studied proved valid. However, the lack of valid and reliable wound assessment and care definitions hinders optimal use of these instruments. Further research documenting their clinical use is warranted. Research-based practice recommendations should direct the development of future valid and reliable algorithms designed to help nurses provide optimal wound care.
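The Content Validity Index computation reported here is easy to reproduce; the sketch below uses one common definition (share of experts rating an item 3 or 4 on the 4-point scale, averaged over items, which is an assumption) with synthetic ratings.

    import numpy as np

    def content_validity_index(ratings, relevant_min=3):
        """ratings: experts x items, on a 4-point Likert scale. The item CVI
        is the share of experts scoring the item 3 or 4; the scale CVI
        averages the item CVIs (one common definition)."""
        ratings = np.asarray(ratings)
        item_cvi = (ratings >= relevant_min).mean(axis=0)
        return item_cvi, item_cvi.mean()

    rng = np.random.default_rng(6)
    ratings = rng.integers(2, 5, size=(44, 83))  # hypothetical 44 experts, 83 items
    item_cvi, scale_cvi = content_validity_index(ratings)
    print(round(scale_cvi, 2))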
Schlossberg, Scott; Chase, Michael J; Griffin, Curtice R
2016-01-01
Accurate counts of animals are critical for prioritizing conservation efforts. Past research, however, suggests that observers on aerial surveys may fail to detect all individuals of the target species present in the survey area. Such errors could bias population estimates low and confound trend estimation. We used two approaches to assess the accuracy of aerial surveys for African savanna elephants (Loxodonta africana) in northern Botswana. First, we used double-observer sampling, in which two observers make observations on the same herds, to estimate detectability of elephants and determine what variables affect it. Second, we compared total counts, a complete survey of the entire study area, against sample counts, in which only a portion of the study area is sampled. Total counts are often considered a complete census, so comparing total counts against sample counts can help to determine if sample counts are underestimating elephant numbers. We estimated that observers detected only 76 ± 2% (SE) of elephant herds and 87 ± 1% of individual elephants present in survey strips. Detectability increased strongly with elephant herd size. Out of the four observers used in total, one observer had a lower detection probability than the other three, and detectability was higher in the rear row of seats than the front. The habitat immediately adjacent to animals also affected detectability, with detection more likely in more open habitats. Total counts were not statistically distinguishable from sample counts. Because, however, the double-observer samples revealed that observers missed 13% of elephants, we conclude that total counts may be undercounting elephants as well. These results suggest that elephant population estimates from both sample and total counts are biased low. Because factors such as observer and habitat affected detectability of elephants, comparisons of elephant populations across time or space may be confounded. We encourage survey teams to incorporate detectability analysis in all aerial surveys for mammals.
Schlossberg, Scott; Chase, Michael J.; Griffin, Curtice R.
2016-01-01
Accurate counts of animals are critical for prioritizing conservation efforts. Past research, however, suggests that observers on aerial surveys may fail to detect all individuals of the target species present in the survey area. Such errors could bias population estimates low and confound trend estimation. We used two approaches to assess the accuracy of aerial surveys for African savanna elephants (Loxodonta africana) in northern Botswana. First, we used double-observer sampling, in which two observers make observations on the same herds, to estimate detectability of elephants and determine what variables affect it. Second, we compared total counts, a complete survey of the entire study area, against sample counts, in which only a portion of the study area is sampled. Total counts are often considered a complete census, so comparing total counts against sample counts can help to determine if sample counts are underestimating elephant numbers. We estimated that observers detected only 76 ± 2% (SE) of elephant herds and 87 ± 1% of individual elephants present in survey strips. Detectability increased strongly with elephant herd size. Out of the four observers used in total, one observer had a lower detection probability than the other three, and detectability was higher in the rear row of seats than the front. The habitat immediately adjacent to animals also affected detectability, with detection more likely in more open habitats. Total counts were not statistically distinguishable from sample counts. However, because the double-observer samples revealed that observers missed 13% of elephants, we conclude that total counts may be undercounting elephants as well. These results suggest that elephant population estimates from both sample and total counts are biased low. Because factors such as observer and habitat affected detectability of elephants, comparisons of elephant populations across time or space may be confounded. We encourage survey teams to incorporate detectability analysis in all aerial surveys for mammals. PMID:27755570
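A minimal sketch of the double-observer logic described above, assuming invented sighting counts: each observer's detection probability is estimated from the herds the other observer found, and the combined probability corrects the raw count. This is the standard Lincoln-Petersen-style estimator, not the authors' full model, which also included covariates such as herd size and habitat.

    # Hypothetical double-observer tallies from a set of survey strips.
    n1_only = 40   # herds seen only by the front-seat observer
    n2_only = 25   # herds seen only by the rear-seat observer
    n_both  = 150  # herds seen by both observers

    # Detection probability of each observer, conditional on the other's sightings.
    p1 = n_both / (n_both + n2_only)   # front observer
    p2 = n_both / (n_both + n1_only)   # rear observer

    # Probability a herd is detected by at least one observer.
    p_combined = 1 - (1 - p1) * (1 - p2)

    seen_total = n1_only + n2_only + n_both
    n_hat = seen_total / p_combined    # detectability-corrected herd count

    print(f"p1={p1:.2f}, p2={p2:.2f}, combined={p_combined:.2f}, N-hat={n_hat:.0f}")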
Optimal Inversion Parameters for Full Waveform Inversion using OBS Data Set
NASA Astrophysics Data System (ADS)
Kim, S.; Chung, W.; Shin, S.; Kim, D.; Lee, D.
2017-12-01
In recent years, Full Waveform Inversion (FWI) has been among the most actively researched techniques in seismic data processing. It uses the residuals between observed and modeled data as an objective function; thereafter, the final subsurface velocity model is generated through a series of iterations meant to minimize the residuals. Research on FWI has expanded from acoustic media to elastic media. In acoustic media, the subsurface property is defined by P-velocity; however, in elastic media, properties are defined by multiple parameters, such as P-velocity, S-velocity, and density. Further, elastic media can also be defined by the Lamé constants and density (λ, μ, ρ) or by the impedances (PI, SI); consequently, research is being carried out to ascertain the optimal parameters. With results from advanced exploration equipment and Ocean Bottom Seismic (OBS) surveys, it is now possible to obtain multi-component seismic data. However, to perform FWI on these data and generate an accurate subsurface model, it is important to determine optimal inversion parameters among (Vp, Vs, ρ), (λ, μ, ρ), and (PI, SI) in elastic media. In this study, a staggered-grid finite-difference method was applied to simulate the OBS survey. For the inversion, the l2-norm was set as the objective function. Further, the gradient direction was computed accurately using the back-propagation technique, and its scaling was done using the pseudo-Hessian matrix. In acoustic media, only Vp is used as the inversion parameter. In contrast, various sets of parameters, such as (Vp, Vs, ρ) and (λ, μ, ρ), can be used to define the inversion in elastic media. Therefore, it is important to ascertain the parameter set that gives the most accurate result for inversion with an OBS data set. In this study, we generated Vp and Vs subsurface models by using (λ, μ, ρ) and (Vp, Vs, ρ) as inversion parameters in every iteration, and compared the two final FWI results. This research was supported by the Basic Research Project (17-3312) of the Korea Institute of Geoscience and Mineral Resources (KIGAM), funded by the Ministry of Science, ICT and Future Planning of Korea.
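In the notation implied by the abstract (the symbols below are ours, not the authors'), the l2-norm objective over sources s and receivers r, and the pseudo-Hessian-scaled model update, can be written as:

    J(\mathbf{m}) = \tfrac{1}{2} \sum_{s,r} \left\lVert d^{\mathrm{obs}}_{s,r} - d^{\mathrm{calc}}_{s,r}(\mathbf{m}) \right\rVert_2^2,
    \qquad
    \Delta\mathbf{m} \propto -\,\tilde{\mathbf{H}}^{-1} \nabla_{\mathbf{m}} J,

where m is the vector of inversion parameters, whether (Vp, Vs, ρ), (λ, μ, ρ), or (PI, SI), the gradient is computed by back-propagation of the residuals, and H̃ denotes the pseudo-Hessian used for scaling.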
Euclid Mission: Mapping the Geometry of the Dark Universe. Mission and Consortium Status
NASA Technical Reports Server (NTRS)
Rhodes, Jason
2011-01-01
Euclid concept: (1) High-precision survey mission to map the geometry of the Dark Universe (2) Optimized for two complementary cosmological probes: (2a) Weak Gravitational Lensing (2b) Baryonic Acoustic Oscillations (2c) Additional probes: clusters, redshift space distortions, ISW (3) Full extragalactic sky survey with 1.2m telescope at L2: (3a) Imaging: (3a-1) High precision imaging at visible wavelengths (3a-2) Photometry/Imaging in the near-infrared (3b) Near Infrared Spectroscopy (4) Synergy with ground based surveys (5) Legacy science for a wide range of fields in astronomy
VISIONS - Vista Star Formation Atlas
NASA Astrophysics Data System (ADS)
Meingast, Stefan; Alves, J.; Bouy, H.; Ascenso, J.
2017-06-01
In this talk I will present the new ESO public survey VISIONS. Starting in early 2017 we will use the ESO VISTA survey telescope in a 550 h long programme to map the largest molecular cloud complexes within 500 pc in a multi-epoch program. The survey is optimized for measuring the proper motions of young stellar objects invisible to Gaia and mapping the cloud-structure with extinction. VISIONS will address a series of ISM topics ranging from the connection of dense cores to YSOs and the dynamical evolution of embedded clusters to variations in the reddening law on both small and large scales.
Decision-Aiding and Optimization for Vertical Navigation of Long-Haul Aircraft
NASA Technical Reports Server (NTRS)
Patrick, Nicholas J. M.; Sheridan, Thomas B.
1996-01-01
Most decisions made in the cockpit are related to safety, and have therefore been proceduralized in order to reduce risk. There are very few which are made on the basis of a value metric such as economic cost. One which can be shown to be value based, however, is the selection of a flight profile. Fuel consumption and flight time both have a substantial effect on aircraft operating cost, but they cannot be minimized simultaneously. In addition, winds, turbulence, and performance vary widely with altitude and time. These factors make it important and difficult for pilots to (a) evaluate the outcomes associated with a particular trajectory before it is flown and (b) decide among possible trajectories. The two elements of this problem considered here are: (1) determining what constitutes optimality, and (2) finding optimal trajectories. Pilots and dispatchers from major U.S. airlines were surveyed to determine which attributes of the outcome of a flight they considered the most important. Avoiding turbulence, for passenger comfort, topped the list of items which were not safety related. Pilots' decision making about the selection of flight profile on the basis of flight time, fuel burn, and exposure to turbulence was then observed. Of the several behavioral and prescriptive decision models invoked to explain the pilots' choices, utility maximization is shown to best reproduce the pilots' decisions. After considering more traditional methods for optimizing trajectories, a novel method is developed using a genetic algorithm (GA) operating on a discrete representation of the trajectory search space. The representation is a sequence of command altitudes, and was chosen to be compatible with the constraints imposed by Air Traffic Control, and with the training given to pilots. Since trajectory evaluation for the GA is performed holistically, a wide class of objective functions can be optimized easily. Also, using the GA it is possible to compare the costs associated with different airspace design and air traffic management policies. A decision aid is proposed which would combine the pilot's notion of optimality with the GA-based optimization, provide the pilot with a number of alternative Pareto-optimal trajectories, and allow him to consider unmodelled attributes and constraints in choosing among them. A solution to the problem of displaying alternatives in a multi-attribute decision space is also presented.
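To make the GA concrete, here is a toy sketch of a genetic algorithm over command-altitude sequences. The cost model, altitude range, and GA settings are all invented for illustration and are not the paper's.

    import random

    ALTITUDES = list(range(28, 42))   # hypothetical flight levels, thousands of feet
    SEGMENTS = 10                     # cruise segments in the profile

    def cost(profile):
        # Invented cost: fuel burn improves with altitude; each altitude
        # change adds a penalty (crew workload / ATC coordination).
        fuel = sum(50.0 - alt for alt in profile)
        changes = sum(a != b for a, b in zip(profile, profile[1:]))
        return fuel + 2.0 * changes

    def mutate(profile):
        p = profile[:]
        p[random.randrange(SEGMENTS)] = random.choice(ALTITUDES)
        return p

    def crossover(a, b):
        cut = random.randrange(1, SEGMENTS)
        return a[:cut] + b[cut:]

    random.seed(0)
    pop = [[random.choice(ALTITUDES) for _ in range(SEGMENTS)] for _ in range(50)]
    for _ in range(200):
        pop.sort(key=cost)            # select: keep the 10 cheapest profiles
        elite = pop[:10]
        pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                       for _ in range(40)]

    best = min(pop, key=cost)
    print("best profile:", best, "cost:", round(cost(best), 1))

Because each candidate profile is evaluated as a whole, swapping in a different objective (e.g., adding a turbulence-exposure term) requires changing only the cost function, which is the flexibility the abstract highlights.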
Analysis of Spatial Autocorrelation for Optimal Observation Network in Korea
NASA Astrophysics Data System (ADS)
Park, S.; Lee, S.; Lee, E.; Park, S. K.
2016-12-01
Many studies aimed at improving the prediction of high-impact weather have been implemented, such as THORPEX (The Observing System Research and Predictability Experiment), FASTEX (Fronts and Atlantic Storm-Track Experiment), NORPEX (North Pacific Experiment), WSR/NOAA (Winter Storm Reconnaissance), and DOTSTAR (Dropwindsonde Observations for Typhoon Surveillance near the TAiwan Region). One of the most important objectives in these studies is to quantify the effect of observations on forecasts and to establish an optimal observation network. However, such studies are lacking for Korea, even though the Korean peninsula exhibits highly complex terrain that makes its weather phenomena difficult to predict. To build an optimal future observation network, it is necessary to increase the utilization of numerical weather prediction and to improve the monitoring, tracking, and prediction skills for high-impact weather in Korea. Therefore, we perform a preliminary study to understand the spatial scale appropriate for an expansion of the observation system through Spatial Autocorrelation (SAC) analysis. In addition, we will develop a testbed system to design an optimal observation network. The analysis is conducted with Automatic Weather System (AWS) rainfall data, global upper-air gridded observations (i.e., temperature, pressure, humidity), and Himawari satellite data (i.e., water vapor) over Korea during 2013-2015. This study will provide a guideline for constructing an observation network that not only improves weather prediction skill but is also cost-effective.
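A common SAC statistic that could underpin such an analysis is Moran's I; the sketch below computes it for hypothetical station rainfall values with inverse-distance weights. The choice of Moran's I and of the weighting scheme are our assumptions, not details given in the abstract.

    import numpy as np

    def morans_i(values, weights):
        """Moran's I spatial autocorrelation for a 1-D array of station values
        and an (n, n) spatial weight matrix with a zero diagonal."""
        z = values - values.mean()
        n = len(values)
        return (n / weights.sum()) * (z @ weights @ z) / (z @ z)

    # Hypothetical example: 5 stations with inverse-distance weights.
    rng = np.random.default_rng(1)
    coords = rng.uniform(0, 100, size=(5, 2))      # station locations, km
    rain = rng.uniform(0, 30, size=5)              # AWS rainfall, mm

    d = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
    w = np.where(d > 0, 1.0 / d, 0.0)              # zero diagonal

    print(f"Moran's I: {morans_i(rain, w):.3f}")

Values of I near +1 indicate strong spatial clustering (nearby stations see similar rainfall), which would argue for wider station spacing; values near 0 indicate that denser sampling is needed.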
A proposal of optimal sampling design using a modularity strategy
NASA Astrophysics Data System (ADS)
Simone, A.; Giustolisi, O.; Laucelli, D. B.
2016-08-01
Real water distribution networks (WDNs) contain thousands of nodes, and the optimal placement of pressure and flow observations is a relevant issue for different management tasks. The planning of pressure observations in terms of spatial distribution and number is named sampling design, and it has traditionally been approached from the standpoint of model calibration. Nowadays, the design of system monitoring is a relevant issue for water utilities, e.g., in order to manage background leakages, to detect anomalies and bursts, to guarantee service quality, etc. In recent years, the optimal location of flow observations, related to the design of optimal district metering areas (DMAs) and to leakage management purposes, has been addressed considering optimal network segmentation and the modularity index using a multiobjective strategy. Optimal network segmentation is the basis for identifying network modules by means of optimal conceptual cuts, which are the candidate locations of closed gates or flow meters creating the DMAs. Starting from the WDN-oriented modularity index, as a metric for WDN segmentation, this paper proposes a new way to perform the sampling design, i.e., the optimal location of pressure meters, using a newly developed sampling-oriented modularity index. The strategy optimizes the pressure monitoring system mainly based on network topology and on weights assigned to pipes according to the specific technical tasks. A multiobjective optimization minimizes the cost of pressure meters while maximizing the sampling-oriented modularity index. The methodology is presented and discussed using the Apulian and Exnet networks.
Latest Results of the SETHI Survey at Arecibo
NASA Astrophysics Data System (ADS)
Korpela, E. J.; Demorest, P.; Heien, E.; Heiles, C.; Werthimer, D.
2004-10-01
SETHi is a survey of the distribution of galactic neutral hydrogen being performed commensally at the NAIC Arecibo Observatory. At the same time that observers use receivers in the Gregorian dome, SETHi is recording a 2.5 MHz band centered at 1420 MHz from a flat feed on Carriage House 1. During normal astronomical observations, the SETHi feed scans across the sky at twice the sidereal rate. During 4 years of observations, we have accumulated over 15,000 hours of data covering most of the sky accessible to Arecibo. This survey has higher angular resolution than existing single-dish surveys and higher sensitivity than existing or planned interferometric surveys.
NASA Astrophysics Data System (ADS)
Ghaly, Michael; Links, Jonathan M.; Frey, Eric C.
2016-03-01
The collimator is the primary factor that determines the spatial resolution and noise tradeoff in myocardial perfusion SPECT images. In this paper, the goal was to find the collimator that optimizes the image quality in terms of a perfusion defect detection task. Since the optimal collimator could depend on the level of approximation of the collimator-detector response (CDR) compensation modeled in reconstruction, we performed this optimization for the cases of modeling the full CDR (including geometric, septal penetration and septal scatter responses), the geometric CDR, or no model of the CDR. We evaluated the performance on the detection task using three model observers. Two observers operated on data in the projection domain: the Ideal Observer (IO) and IO with Model-Mismatch (IO-MM). The third observer was an anthropomorphic Channelized Hotelling Observer (CHO), which operated on reconstructed images. The projection-domain observers have the advantage that they are computationally less intensive. The IO has perfect knowledge of the image formation process, i.e. it has a perfect model of the CDR. The IO-MM takes into account the mismatch between the true (complete and accurate) model and an approximate model, e.g. one that might be used in reconstruction. We evaluated the utility of these projection domain observers in optimizing instrumentation parameters. We investigated a family of 8 parallel-hole collimators, spanning a wide range of resolution and sensitivity tradeoffs, using a population of simulated projection (for the IO and IO-MM) and reconstructed (for the CHO) images that included background variability. We simulated anterolateral and inferior perfusion defects with variable extents and severities. The area under the ROC curve was estimated from the IO, IO-MM, and CHO test statistics and served as the figure-of-merit. The optimal collimator for the IO had a resolution of 9-11 mm FWHM at 10 cm, which is poorer resolution than typical collimators used for MPS. When the IO-MM and CHO used a geometric or no model of the CDR, the optimal collimator shifted toward higher resolution than that obtained using the IO and the CHO with full CDR modeling. With the optimal collimator, the IO-MM and CHO using geometric modeling gave similar performance to full CDR modeling. Collimators with poorer resolution were optimal when CDR modeling was used. The agreement of rankings between the IO-MM and CHO confirmed that the IO-MM is useful for optimization tasks when model mismatch is present due to its substantially reduced computational burden compared to the CHO.
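A compact sketch of the anthropomorphic CHO described above, assuming synthetic channelized data: the Hotelling template is built from the class means and the average intra-class scatter matrix, and the AUC figure-of-merit is estimated nonparametrically. The channel count and data are placeholders, not the study's simulation.

    import numpy as np

    def cho_auc(channels_signal, channels_noise):
        """Channelized Hotelling Observer on (n_images, n_channels) arrays:
        build the Hotelling template, then estimate AUC from test statistics."""
        mean_diff = channels_signal.mean(0) - channels_noise.mean(0)
        # Average intra-class scatter matrix.
        s = 0.5 * (np.cov(channels_signal.T) + np.cov(channels_noise.T))
        w = np.linalg.solve(s, mean_diff)          # Hotelling template
        t_sig = channels_signal @ w
        t_noise = channels_noise @ w
        # Nonparametric AUC: fraction of (signal, noise) pairs correctly ordered.
        return (t_sig[:, None] > t_noise[None, :]).mean()

    rng = np.random.default_rng(2)
    noise = rng.normal(0.0, 1.0, size=(200, 6))    # e.g., 6 rotationally symmetric channels
    signal = rng.normal(0.5, 1.0, size=(200, 6))   # defect-present class, shifted mean

    print(f"AUC: {cho_auc(signal, noise):.2f}")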
Applying Squeaky-Wheel Optimization to Schedule Airborne Astronomy Observations
NASA Technical Reports Server (NTRS)
Frank, Jeremy; Kuerklue, Elif
2004-01-01
We apply the Squeaky Wheel Optimization (SWO) algorithm to the problem of scheduling astronomy observations for the Stratospheric Observatory for Infrared Astronomy, an airborne observatory. The problem contains complex constraints relating the feasibility of an astronomical observation to the position and time at which the observation begins, telescope elevation limits, special use airspace, and available fuel. Solving the problem requires making discrete choices (e.g. selection and sequencing of observations) and continuous ones (e.g. takeoff time and setting up observations by repositioning the aircraft). The problem also includes optimization criteria such as maximizing observing time while simultaneously minimizing total flight time. Previous approaches to the problem fail to scale when accounting for all constraints. We describe how to customize SWO to solve this problem, and show that it finds better flight plans, often with less computation time, than previous approaches.
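A generic SWO construct/analyze/prioritize loop, sketched on a toy packing problem: build a schedule greedily in priority order, blame the items that failed to schedule, and move those "squeaky wheels" forward for the next pass. This is the algorithmic skeleton only, not the SOFIA flight-planning constraints (airspace, fuel, elevation limits) handled by the paper.

    import random

    def swo(items, fits, slots, iterations=100):
        priority = list(items)
        best, best_count = None, -1
        for _ in range(iterations):
            schedule, free, unplaced = [], set(slots), []
            for item in priority:               # construct, in priority order
                slot = fits(item, free)         # first feasible slot, or None
                if slot is None:
                    unplaced.append(item)
                else:
                    free.discard(slot)
                    schedule.append((item, slot))
            if len(schedule) > best_count:
                best, best_count = schedule, len(schedule)
            # Analyze/prioritize: blamed (unplaced) items move to the front.
            random.shuffle(unplaced)            # mild noise to escape cycles
            priority = unplaced + [i for i, _ in schedule]
        return best

    random.seed(0)
    items = [3, 1, 4, 1, 5, 9, 2, 6]            # toy observation "sizes"
    slots = [2, 3, 5, 7, 9]                     # toy time-slot capacities
    fits = lambda item, free: min((s for s in free if s >= item), default=None)
    print(swo(items, fits, slots))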
CTEPP STANDARD OPERATING PROCEDURE FOR RECORDING DATA COLLECTION FORMS (SOP-2.22)
This SOP describes the method for recording information onto the data collection forms. The data collection forms are organized into 10 modules: Recruitment Survey; House/Building Characteristics Observation Survey; Day Care Center/Building Characteristics Observation Survey; Par...
McClure, Philip K; Woiczik, Marcella; Karol, Lori; Sankar, Wudbhav N
The introduction of the 80-hour work week for Accreditation Council for Graduate Medical Education (ACGME) accredited fellowship programs initiated many efforts to optimize surgical training. One particular area of interest is the recording and tracking of surgical experiences. The current standard is logging cases based on Current Procedural Terminology codes, which are primarily designed for billing. Proposed guidelines from the ACGME regarding logging exist, but their implementation is unknown, as is the variation in case volume across fellowship programs. The purpose of this study was to investigate variability in the national case log data, and to explore potential sources of variation using fellow surveys. National ACGME case log data for pediatric orthopaedic fellowships from 2012 to 2015 were reviewed, with particular attention to the domains of spine, pelvis/hip, arthroscopy, trauma, and other (which includes clubfoot casting). To explore potential sources of case log variability, a survey on case logging behavior was distributed to all pediatric orthopaedic fellows for the academic year 2015 to 2016. Reported experiences based on ACGME case logs varied widely between fellows, with percentage differences of up to 100% in all areas. Similarly, wide variability is present in the coding practices of pediatric orthopaedic fellows, who often lack formal education on the topic of appropriate coding/logging. In the survey, hypothetical case scenarios had an absolute difference in recorded codes of up to 13 and a percentage difference of up to 100%. ACGME case log data for pediatric orthopaedic fellowships demonstrate wide variability in reported surgical experiences. This variability may be due, in part, to differences in logging practices by individual fellows. This observation makes meaningful interpretation of national data on surgical volume challenging. Proposed surgical experience minimums should be interpreted in light of these data, and may not be advisable unless accompanied by standardized and specific guidelines for case log entry. Efforts to optimize training in the post-80-hour era will require accurate data to serve as a starting point for future educational efforts.
The magnetic fields of hot subdwarf stars
NASA Astrophysics Data System (ADS)
Landstreet, J. D.; Bagnulo, S.; Fossati, L.; Jordan, S.; O'Toole, S. J.
2012-05-01
Context. Detection of magnetic fields has been reported in several sdO and sdB stars. Recent literature has cast doubts on the reliability of most of these detections. The situation concerning the occurrence and frequency of magnetic fields in hot subdwarfs is at best confused. Aims: We revisit data previously published in the literature, and we present new observations to clarify the question of how common magnetic fields are in subdwarf stars. Methods: We consider a sample of about 40 hot subdwarf stars. About 30 of them have been observed with the FORS1 and FORS2 instruments of the ESO VLT. Results have been published for only about half of the hot subdwarfs observed with FORS. Here we present new FORS1 field measurements for 17 stars, 14 of which have never been observed for magnetic fields before. We also critically review the measurements already published in the literature, and in particular we try to explain why previous papers based on the same FORS1 data have reported contradictory results. Results: All new and re-reduced measurements obtained with FORS1 are shown to be consistent with non-detection of magnetic fields. We explain previous spurious field detections from data obtained with FORS1 as due to a non-optimal method of wavelength calibration. Field detections in other surveys are found to be uncertain or doubtful, and certainly in need of confirmation. Conclusions: There is presently no strong evidence for the occurrence of a magnetic field in any sdB or sdO star, with typical longitudinal field uncertainties of the order of 2-400 G. It appears that globally simple fields of more than about 1 or 2 kG in strength occur in at most a few percent of hot subdwarfs. Further high-precision surveys, both with high-resolution spectropolarimeters and with instruments similar to FORS1 on large telescopes, would be very valuable. Based on observations collected at the European Organisation for Astronomical Research in the Southern Hemisphere, Chile under observing programmes 072.D-0290 and 075.D-0352, or obtained from the ESO/ST-ECF Science Archive Facility.
From molecule to market: steroid hormones and financial risk-taking.
Coates, John M; Gurnell, Mark; Sarnyai, Zoltan
2010-01-27
Little is known about the role of the endocrine system in financial decision-making. Here, we survey research on steroid hormones and their cognitive effects, and examine potential links to trader performance in the financial markets. Preliminary findings suggest that cortisol codes for risk and testosterone for reward. A key finding of this endocrine research is the different cognitive effects of acute versus chronic exposure to hormones: acutely elevated steroids may optimize performance on a range of tasks; but chronically elevated steroids may promote irrational risk-reward choices. We present a hypothesis suggesting that the irrational exuberance and pessimism observed during market bubbles and crashes may be mediated by steroid hormones. If hormones can exaggerate market moves, then perhaps the age and sex composition among traders and asset managers may affect the level of instability witnessed in the financial markets.
NASA Astrophysics Data System (ADS)
Saleh Malawat, M.; Putra, M. Umar Maya
2018-03-01
This paper studies the implementation of business opportunities that can improve the revenue of the Bunut Shoes micro, small, and medium enterprises. A probit model estimated with the EViews 6 program was used to assess how far business-opportunity variables such as education, training, capital assistance, and technology procurement improve revenue. Primary data were collected through questionnaire surveys of enterprise members over the 2013-2015 observation period. The results showed that none of the implementation variables correlated with increased revenue; the Asahan District Government is therefore urged to create creative breakthroughs to achieve optimal business revenue and to cooperate with other private institutions to increase business income.
Resource and environmental surveys from space with the thematic mapper in the 1980's
NASA Technical Reports Server (NTRS)
1976-01-01
Observation of vegetation is the primary optimization objective of the thematic mapper. The following are aspects of plans for the thematic mapper: (1) to include an appropriately modified first generation MSS in the thematic mapper mission; (2) to provide assured coverage for a minimum of six years to give agencies and other users an opportunity to justify the necessary commitment of resources for the transition into a completely valid operational phase; (3) to provide for global, direct data read-out, without the necessity for on-board data storage or dependence on foreign receiving stations; (4) to recognize the operational character of the thematic mapper after successful completion of its experimental evaluation; and (5) to combine future experimental packages with compatible orbits as part of the operational LANDSAT follow-on payloads.
NASA Astrophysics Data System (ADS)
Tsuno, S.; Korenaga, M.; Okamoto, K.; Chimoto, K.; Yamanaka, H.; Yamada, N.; Matsushima, T.
2017-12-01
To evaluate local site effects in the Kumamoto Plain, we installed 15 temporary seismic stations along a north-south survey line after the 2016 Kumamoto earthquake foreshock (Mj 6.4). In this report, to investigate earthquake ground motions observed along the north-south survey line, we estimated site amplification factors from weak ground motion data and estimated S-wave velocity structures from array microtremor observations at the temporary seismic stations. The 15 temporary seismic stations were installed at intervals of 300 m to 2.5 km along the survey line. We estimated site amplification factors with a station at Mt. Kinbo as a reference. Site amplification factors at the middle and southern parts of the survey line, located in the alluvial lowland, were dominant at frequencies of 1-2 Hz. On the other hand, site amplification factors at the northern part of the survey line were dominant at frequencies of 2-5 Hz. This suggests that the ground profiles near the surface are complicated along this north-south survey line in the Kumamoto Plain. Therefore, we performed array microtremor observations at the temporary seismic stations to estimate S-wave velocity structures along the survey line. We obtained phase velocities of Rayleigh waves by the SPAC method and estimated S-wave velocity structures by applying a Genetic Algorithm to those phase velocities. A low-velocity layer with a thickness of around 15 m was present at the surface at sites located in the alluvial lowland. Finally, we compared the distribution of PGAs observed along the survey line to the AVs30 values estimated from the S-wave velocity structures. As a result, PGAs along the survey line were strongly controlled by AVs30. We conclude that earthquake ground motions at frequencies above 1 Hz observed along this north-south survey line were amplified by the low-velocity layer near the surface.
Solar System science with the Large Synoptic Survey Telescope
NASA Astrophysics Data System (ADS)
Jones, Lynne; Brown, Mike; Ivezić, Zeljko; Jurić, Mario; Malhotra, Renu; Trilling, David
2015-11-01
The Large Synoptic Survey Telescope (LSST; http://lsst.org) will be a large-aperture, wide-field, ground-based telescope that will survey half the sky every few nights in six optical bands from 320 to 1050 nm. It will explore a wide range of astrophysical questions, ranging from performing a census of the Solar System, to examining the nature of dark energy. It is currently under construction, slated for first light in 2019 and full operations by 2022. The LSST will survey over 20,000 square degrees with a rapid observational cadence, to typical limiting magnitudes of r~24.5 in each visit (9.6 square degree field of view). Automated software will link the individual detections into orbits; these orbits, as well as precisely calibrated astrometry (~50mas) and photometry (~0.01-0.02 mag) in multiple bandpasses will be available as LSST data products. The resulting data set will have tremendous potential for planetary astronomy; multi-color catalogs of hundreds of thousands of NEOs and Jupiter Trojans, millions of asteroids, tens of thousands of TNOs, as well as thousands of other objects such as comets and irregular satellites of the major planets. LSST catalogs will increase the sample size of objects with well-known orbits 10-100 times for small body populations throughout the Solar System, enabling a major increase in the completeness level of the inventory of most dynamical classes of small bodies and generating new insights into planetary formation and evolution. Precision multi-color photometry will allow determination of lightcurves and colors, as well as spin state and shape modeling through sparse lightcurve inversion. LSST is currently investigating survey strategies to optimize science return across a broad range of goals. To aid in this investigation, we are making a series of realistic simulated survey pointing histories available together with a Python software package to model and evaluate survey detections for a user-defined input population. Preliminary metrics from these simulations are shown here; the community is invited to provide further input.
The km³ Mediterranean neutrino observatory - the NEMO.RD project
NASA Astrophysics Data System (ADS)
De Marzo, C. N.
2001-05-01
The NEMO.RD Project is a feasibility study of a km³ underwater telescope for high energy astrophysical neutrinos to be located in the Mediterranean Sea. Results on various issues of this project are presented on: i) Monte Carlo simulation study of the capabilities of various arrays of phototubes in order to determine the detector geometry that can optimize performance and cost; ii) oceanographic survey of various sites in search of the optimal one; iii) feasibility study of mechanics, deployment, connections and maintenance of such a detector. Parameters of a site near Capo Passero, Sicily, where depth, transparency and other water parameters seem optimal are shown.
Prenatal yoga in late pregnancy and optimism, power, and well-being.
Reis, Pamela J; Alligood, Martha R
2014-01-01
The study reported here explored changes in optimism, power, and well-being over time in women who participated in a six-week prenatal yoga program during their second and third trimesters of pregnancy. The study was conceptualized from the perspective of Rogers' science of unitary human beings. A correlational, one-group, pre-post-assessment survey design with a convenience sample was conducted. Increases in mean scores for optimism, power, and well-being were statistically significant from baseline to completion of the prenatal yoga program. Findings from this study suggested that yoga as a self-care practice that nurses might recommend to promote well-being in pregnant women.
Exact solution of large asymmetric traveling salesman problems.
Miller, D L; Pekny, J F
1991-02-15
The traveling salesman problem is one of a class of difficult problems in combinatorial optimization that is representative of a large number of important scientific and engineering problems. A survey is given of recent applications and methods for solving large problems. In addition, an algorithm for the exact solution of the asymmetric traveling salesman problem is presented along with computational results for several classes of problems. The results show that the algorithm performs remarkably well for some classes of problems, determining an optimal solution even for problems with large numbers of cities, yet for other classes, even small problems thwart determination of a provably optimal solution.
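For small instances, "provably optimal" can be demonstrated with the Held-Karp dynamic program, shown below for an asymmetric distance matrix. Note this O(n² 2ⁿ) approach is only an illustration of exact solution at toy scale; the paper's algorithm is a different, far more scalable exact method.

    from itertools import combinations

    def held_karp(dist):
        """Exact asymmetric TSP by Held-Karp dynamic programming.
        dist[i][j] may differ from dist[j][i]; returns the optimal tour cost."""
        n = len(dist)
        # best[(S, j)] = cheapest path from city 0 through set S, ending at j.
        best = {(frozenset([j]), j): dist[0][j] for j in range(1, n)}
        for size in range(2, n):
            for subset in combinations(range(1, n), size):
                s = frozenset(subset)
                for j in subset:
                    best[(s, j)] = min(best[(s - {j}, k)] + dist[k][j]
                                       for k in subset if k != j)
        full = frozenset(range(1, n))
        return min(best[(full, j)] + dist[j][0] for j in range(1, n))

    # Hypothetical 4-city asymmetric instance.
    d = [[0, 2, 9, 10],
         [1, 0, 6, 4],
         [15, 7, 0, 8],
         [6, 3, 12, 0]]
    print(held_karp(d))  # optimal tour cost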
A Multi-resolution, Multi-epoch Low Radio Frequency Survey of the Kepler K2 Mission Campaign 1 Field
NASA Astrophysics Data System (ADS)
Tingay, S. J.; Hancock, P. J.; Wayth, R. B.; Intema, H.; Jagannathan, P.; Mooley, K.
2016-10-01
We present the first dedicated radio continuum survey of a Kepler K2 mission field, Field 1, covering the North Galactic Cap. The survey is wide field, contemporaneous, multi-epoch, and multi-resolution in nature and was conducted at low radio frequencies between 140 and 200 MHz. The multi-epoch and ultra wide field (but relatively low resolution) part of the survey was provided by 15 nights of observation using the Murchison Widefield Array (MWA) over a period of approximately a month, contemporaneous with K2 observations of the field. The multi-resolution aspect of the survey was provided by the low resolution (4′) MWA imaging, complemented by non-contemporaneous but much higher resolution (20″) observations using the Giant Metrewave Radio Telescope (GMRT). The survey is, therefore, sensitive to the details of radio structures across a wide range of angular scales. Consistent with other recent low radio frequency surveys, no significant radio transients or variables were detected in the survey. The resulting source catalogs consist of 1085 and 1468 detections in the two MWA observation bands (centered at 154 and 185 MHz, respectively) and 7445 detections in the GMRT observation band (centered at 148 MHz), over 314 square degrees. The survey is presented as a significant resource for multi-wavelength investigations of the more than 21,000 target objects in the K2 field. We briefly examine our survey data against K2 target lists for dwarf star types (stellar types M and L) that have been known to produce radio flares.
High resolution Florida IR silicon immersion grating spectrometer and an M dwarf planet survey
NASA Astrophysics Data System (ADS)
Ge, Jian; Powell, Scott; Zhao, Bo; Wang, Ji; Fletcher, Adam; Schofield, Sidney; Liu, Jian; Muterspaugh, Matthew; Blake, Cullen; Barnes, Rory
2012-09-01
We report the system design and predicted performance of the Florida IR Silicon immersion grating spectromeTer (FIRST). This new generation cryogenic IR spectrograph offers broad-band high resolution IR spectroscopy with R=72,000 at 1.4-1.8 μm and R=60,000 at 0.8-1.35 μm in a single exposure with a 2kx2k H2RG IR array. It is enabled by a compact design using an extremely high dispersion silicon immersion grating (SIG) and an R4 echelle with a 50 mm diameter pupil in combination with an Image Slicer. This instrument is operated in vacuum with temperature precisely controlled to reach long term stability for high precision radial velocity (RV) measurements of nearby stars, especially M dwarfs and young stars. The primary technical goal is to reach better than 4 m/s long term RV precision with J<9 M dwarfs within 30 min exposures. This instrument is scheduled to be commissioned at the Tennessee State University (TSU) 2-m Automatic Spectroscopic Telescope (AST) at Fairborn Observatory in spring 2013. FIRST can also be used for observing transiting planets, young stellar objects (YSOs), magnetic fields, binaries, brown dwarfs (BDs), ISM and stars. We plan to launch the FIRST NIR M dwarf planet survey in 2014 after FIRST is commissioned at the AST. This NIR M dwarf survey is the first large-scale NIR high precision Doppler survey dedicated to detecting and characterizing planets around 215 nearby M dwarfs with J< 10. Our primary science goal is to look for habitable Super-Earths around the late M dwarfs and also to identify transiting systems for follow-up observations with JWST to measure the planetary atmospheric compositions and study their habitability. Our secondary science goal is to detect and characterize a large number of planets around M dwarfs to understand the statistics of planet populations around these low mass stars and constrain planet formation and evolution models. Our survey baseline is expected to detect ~30 exoplanets, including 10 Super Earths, within 100 day periods. About half of the Super-Earths are in their habitable zones and one of them may be a transiting planet. The AST, with its robotic control and ease of switching between instruments (in seconds), enables great flexibility and efficiency, and enables an optimal strategy, in terms of schedule and cadence, for this NIR M dwarf planet survey.
Multi-Objective Design Of Optimal Greenhouse Gas Observation Networks
NASA Astrophysics Data System (ADS)
Lucas, D. D.; Bergmann, D. J.; Cameron-Smith, P. J.; Gard, E.; Guilderson, T. P.; Rotman, D.; Stolaroff, J. K.
2010-12-01
One of the primary scientific functions of a Greenhouse Gas Information System (GHGIS) is to infer GHG source emission rates and their uncertainties by combining measurements from an observational network with atmospheric transport modeling. Certain features of the observational networks that serve as inputs to a GHGIS (for example, sampling location and frequency) can greatly impact the accuracy of the retrieved GHG emissions. Observation System Simulation Experiments (OSSEs) provide a framework to characterize emission uncertainties associated with a given network configuration. By minimizing these uncertainties, OSSEs can be used to determine optimal sampling strategies. Designing a real-world GHGIS observing network, however, will involve multiple, conflicting objectives; there will be trade-offs between sampling density, coverage, and measurement costs. To address these issues, we have added multi-objective optimization capabilities to OSSEs. We demonstrate these capabilities by quantifying the trade-offs between retrieval error and measurement costs for a prototype GHGIS, and deriving GHG observing networks that are Pareto optimal. [LLNL-ABS-452333: This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.]
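The Pareto-optimality criterion used in such multi-objective designs is easy to state in code: a candidate network survives if no other candidate is at least as good on both retrieval error and measurement cost. The candidate list below is invented for illustration.

    # Pareto-optimal network designs (sketch): each candidate network is scored
    # by (retrieval error, measurement cost); both are to be minimized.

    def pareto_front(designs):
        """Return designs not dominated by any other (lower is better on both axes)."""
        front = []
        for d in designs:
            dominated = any(o["error"] <= d["error"] and o["cost"] <= d["cost"]
                            and o is not d for o in designs)
            if not dominated:
                front.append(d)
        return front

    candidates = [
        {"name": "dense",  "error": 0.05, "cost": 9.0},
        {"name": "sparse", "error": 0.30, "cost": 2.0},
        {"name": "medium", "error": 0.12, "cost": 4.5},
        {"name": "bad",    "error": 0.35, "cost": 5.0},   # dominated by "medium"
    ]
    for d in pareto_front(candidates):
        print(d["name"], d["error"], d["cost"])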
Detection of sea otters in boat-based surveys of Prince William Sound, Alaska
Udevitz, Mark S.; Bodkin, James L.; Costa, Daniel P.
1995-01-01
Boat-based surveys have been commonly used to monitor sea otter populations, but there has been little quantitative work to evaluate detection biases that may affect these surveys. We used ground-based observers to investigate sea otter detection probabilities in a boat-based survey of Prince William Sound, Alaska. We estimated that 30% of the otters present on surveyed transects were not detected by boat crews. Approximately half (53%) of the undetected otters were missed because the otters left the transects, apparently in response to the approaching boat. Unbiased estimates of detection probabilities will be required for obtaining unbiased population estimates from boat-based surveys of sea otters. Therefore, boat-based surveys should include methods to estimate sea otter detection probabilities under the conditions specific to each survey. Unbiased estimation of detection probabilities with ground-based observers requires either that the ground crews detect all of the otters in observed subunits, or that there are no errors in determining which crews saw each detected otter. Ground-based observer methods may be appropriate in areas where nearly all of the sea otter habitat is potentially visible from ground-based vantage points.
GPU computing in medical physics: a review.
Pratx, Guillem; Xing, Lei
2011-05-01
The graphics processing unit (GPU) has emerged as a competitive platform for computing massively parallel problems. Many computing applications in medical physics can be formulated as data-parallel tasks that exploit the capabilities of the GPU for reducing processing times. The authors review the basic principles of GPU computing as well as the main performance optimization techniques, and survey existing applications in three areas of medical physics, namely image reconstruction, dose calculation and treatment plan optimization, and image processing.
Brake System Design Optimization : Volume 1. A Survey and Assessment.
DOT National Transportation Integrated Search
1978-06-01
Existing freight car braking systems, components, and subsystems are characterized both physically and functionally, and life-cycle costs are examined. Potential improvements to existing systems previously proposed or available are identified and des...
Team Dynamics. Implications for Coaching.
ERIC Educational Resources Information Center
Freishlag, Jerry
1985-01-01
A recent survey of coaches ranks team cohesion as the most critical problem coaches face. Optimal interpersonal relationships among athletes and their coaches can maximize collective performance. Team dynamics are discussed and coaching tips are provided. (MT)
Regional operations : one approach to improve traffic signal timing.
DOT National Transportation Integrated Search
2016-11-11
In the 2014 Texas Transportation Poll, survey participants identified more effective traffic signal timing as the highest-rated strategy for resolving regional transportation issues (1). One way traffic engineers optimize traffic signal performance i...
An Optimal Design for Placements of Tsunami Observing Systems Around the Nankai Trough, Japan
NASA Astrophysics Data System (ADS)
Mulia, I. E.; Gusman, A. R.; Satake, K.
2017-12-01
Presently, there are numerous tsunami observing systems deployed in several major tsunamigenic regions throughout the world. However, documentation on how and where to optimally place such measurement devices is limited. This study presents a methodological approach to select the best and fewest observation points for the purpose of tsunami source characterization, particularly in the form of fault slip distributions. We apply the method to design a new tsunami observation network around the Nankai Trough, Japan. In brief, our method can be divided into two stages: initialization and optimization. The initialization stage aims to identify favorable locations of observation points, as well as to determine the initial number of observations. These points are generated based on extrema of empirical orthogonal function (EOF) spatial modes derived from 11 hypothetical tsunami events in the region. In order to further improve the accuracy, we apply an optimization algorithm called mesh adaptive direct search (MADS) to remove redundant measurements from the points initially generated in the first stage. A combinatorial search by MADS improves the accuracy and reduces the number of observations simultaneously. The EOF analysis of the hypothetical tsunamis, using the first 2 leading modes with 4 extrema on each mode, results in 30 observation points spread along the trench. This is obtained after replacing clustered points within a radius of 30 km with a single representative. Furthermore, the MADS optimization can improve the accuracy of the EOF-generated points by approximately 10-20% with fewer observations (23 points). Finally, we compare our result with the existing observation points (68 stations) in the region. The result shows that the optimized design, with a smaller number of observations, produces better source characterizations, with approximately 20-60% improvement in accuracy across all 11 hypothetical cases. It should be noted, however, that our design is a tsunami-based approach; some of the existing observing systems are equipped with additional devices to measure other parameters of interest, e.g., for monitoring seismic activity.
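A sketch of the EOF-based initialization stage, under our own assumptions about the data layout: a scenarios-by-gridpoints amplitude matrix is decomposed by SVD, and the largest-magnitude entries of the leading modes mark candidate gauge locations. The scenario data here are random stand-ins.

    import numpy as np

    rng = np.random.default_rng(3)
    n_scenarios, n_points = 11, 500
    wavefield = rng.normal(size=(n_scenarios, n_points))  # stand-in for max amplitudes

    # EOF spatial modes are the right singular vectors of the centered matrix.
    centered = wavefield - wavefield.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)

    candidates = set()
    for mode in vt[:2]:                            # 2 leading EOF modes
        extrema = np.argsort(np.abs(mode))[-4:]    # 4 largest-magnitude entries per mode
        candidates.update(int(i) for i in extrema)

    print(sorted(candidates))                      # candidate grid-point indices for gauges

A subsequent combinatorial search (MADS in the paper) would then prune this candidate set while tracking slip-inversion accuracy.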
Analyses in Support of the WFIRST Supernova Survey
NASA Astrophysics Data System (ADS)
Rubin, David; Aldering, Greg Scott; Baltay, Charles; Barbary, Kyle H.; Currie, Miles; Deustua, Susana E.; Fagrelius, Parker; Dosovitz Fox, Ori; Fruchter, Andrew S.; Law, David R.; Perlmutter, Saul; Pontoppidan, Klaus; Rabinowitz, David L.; Sako, Masao
2017-01-01
The Wide-Field Infrared Survey Telescope (WFIRST) is a future optical-NIR space telescope with science spanning astrophysics and cosmology. The combination of wide-field IR imaging and optical-NIR integral-field spectroscopy enables a SN cosmology experiment with excellent systematics control. The Science Definition Team (SDT) presented a first concept of such a survey with 2700 SNe to z=1.7. We make several key improvements to the SDT analysis, including a significantly improved exposure-time calculator, evaluations of host-galaxy background light, supernova typing simulations, all combined with a spectrophotometric cosmology analysis built on a Bayesian hierarchical model. Our work will be useful for deriving accurate cosmological forecasts, optimizing the survey, and the evaluation of calibration, resolution, and stability requirements.
Effects of Simplifying Choice Tasks on Estimates of Taste Heterogeneity in Stated-Choice Surveys
Johnson, F. Reed; Ozdemir, Semra; Phillips, Kathryn A
2011-01-01
Researchers usually employ orthogonal arrays or D-optimal designs with little or no attribute overlap in stated-choice surveys. The challenge is to balance statistical efficiency and respondent burden to minimize the overall error in the survey responses. This study examined whether simplifying the choice task, by using a design with more overlap, provides advantages over standard minimum-overlap methods. We administered two designs for eliciting HIV test preferences to split samples. Surveys were undertaken at four HIV testing locations in San Francisco, California. Personal characteristics had different effects on willingness to pay for the two treatments, and gains in statistical efficiency in the minimal-overlap version more than compensated for possible imprecision from increased measurement error. PMID:19880234
Kass, M. Andy
2013-01-01
Line spacing and flight height are critical parameters in airborne gravity gradient surveys; the optimal trade-off between survey costs and desired resolution, however, is different for every situation. This article investigates the additional benefit of reducing the flight height and line spacing through a study of a survey conducted over the Great Sand Dunes National Park and Preserve, which is the highest-resolution public-domain airborne gravity gradient data set available, with overlapping high- and lower-resolution surveys. By using Fourier analysis and matched filtering, it is shown that while the lower-resolution survey delineates the target body, reducing the flight height from 80 m to 40 m and the line spacing from 100 m to 50 m improves the recoverable resolution even at basement depths.
Update on the SDSS-III MARVELS data pipeline development
NASA Astrophysics Data System (ADS)
Li, Rui; Ge, J.; Thomas, N. B.; Petersen, E.; Wang, J.; Ma, B.; Sithajan, S.; Shi, J.; Ouyang, Y.; Chen, Y.
2014-01-01
MARVELS (Multi-object APO Radial Velocity Exoplanet Large-area Survey), as one of the four surveys in the SDSS-III program, has monitored over 3,300 stars during 2008-2012, with each being visited an average of 26 times over a 2-year window. Although the early data pipeline was able to detect over 20 brown dwarf candidates and several hundred binaries, no giant planet candidates were reliably identified due to its large systematic errors. Learning from the lessons of the early pipeline, we re-designed the entire pipeline to handle various types of systematic effects caused by the instrument (such as trace, slant, distortion, drifts, and dispersion) and by observing-condition changes (such as illumination profile and continuum). We also introduced several advanced methods to precisely extract the RV signals. To date, we have achieved a long-term RMS RV measurement error of 14 m/s for HIP-14810 (one of our reference stars) after removal of the known planet signal based on previous HIRES RV measurements. This new 1-D data pipeline has been used to robustly identify four giant planet candidates within the small fraction of the survey data that has been processed (Thomas et al., this meeting). The team is currently working hard to optimize the pipeline, especially the 2-D interference-fringe RV extraction, where early results show a 1.5 times improvement over the 1-D data pipeline. We are quickly approaching the survey baseline performance requirement of 10-35 m/s RMS for 8-12 solar-type stars. With this fine-tuned pipeline and the soon-to-be-processed plates of data, we expect to discover many more giant planet candidates and make a large statistical impact on exoplanet studies.
A comparison of cosmological models using time delay lenses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wei, Jun-Jie; Wu, Xue-Feng; Melia, Fulvio, E-mail: jjwei@pmo.ac.cn, E-mail: xfwu@pmo.ac.cn, E-mail: fmelia@email.arizona.edu
2014-06-20
The use of time-delay gravitational lenses to examine the cosmological expansion introduces a new standard ruler with which to test theoretical models. The sample suitable for this kind of work now includes 12 lens systems, which have thus far been used solely for optimizing the parameters of ΛCDM. In this paper, we broaden the base of support for this new, important cosmic probe by using these observations to carry out a one-on-one comparison between competing models. The currently available sample indicates a likelihood of ∼70%-80% that the R_h = ct universe is the correct cosmology versus ∼20%-30% for the standard model. This possibly interesting result reinforces the need to greatly expand the sample of time-delay lenses, e.g., with the successful implementation of the Dark Energy Survey, the VST ATLAS survey, and the Large Synoptic Survey Telescope. In anticipation of a greatly expanded catalog of time-delay lenses identified with these surveys, we have produced synthetic samples to estimate how large they would have to be in order to rule out either model at a ∼99.7% confidence level. We find that if the real cosmology is ΛCDM, a sample of ∼150 time-delay lenses would be sufficient to rule out R_h = ct at this level of accuracy, while ∼1000 time-delay lenses would be required to rule out ΛCDM if the real universe is instead R_h = ct. This difference in required sample size reflects the greater number of free parameters available to fit the data with ΛCDM.
NASA Astrophysics Data System (ADS)
Richards, Joseph W.; Starr, Dan L.; Brink, Henrik; Miller, Adam A.; Bloom, Joshua S.; Butler, Nathaniel R.; James, J. Berian; Long, James P.; Rice, John
2012-01-01
Despite the great promise of machine-learning algorithms to classify and predict astrophysical parameters for the vast numbers of astrophysical sources and transients observed in large-scale surveys, the peculiarities of the training data often manifest as strongly biased predictions on the data of interest. Typically, training sets are derived from historical surveys of brighter, more nearby objects than those from more extensive, deeper surveys (testing data). This sample selection bias can cause catastrophic errors in predictions on the testing data because (1) standard assumptions for machine-learned model selection procedures break down and (2) dense regions of testing space might be completely devoid of training data. We explore possible remedies to sample selection bias, including importance weighting, co-training, and active learning (AL). We argue that AL—where the data whose inclusion in the training set would most improve predictions on the testing set are queried for manual follow-up—is an effective approach and is appropriate for many astronomical applications. For a variable star classification problem on a well-studied set of stars from Hipparcos and Optical Gravitational Lensing Experiment, AL is the optimal method in terms of error rate on the testing data, beating the off-the-shelf classifier by 3.4% and the other proposed methods by at least 3.0%. To aid with manual labeling of variable stars, we developed a Web interface which allows for easy light curve visualization and querying of external databases. Finally, we apply AL to classify variable stars in the All Sky Automated Survey, finding dramatic improvement in our agreement with the ASAS Catalog of Variable Stars, from 65.5% to 79.5%, and a significant increase in the classifier's average confidence for the testing set, from 14.6% to 42.9%, after a few AL iterations.
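A pool-based active-learning loop of the kind described, sketched with synthetic features and a random-forest classifier (scikit-learn is assumed to be available); the query rule here is a simple low-margin criterion, and all data are stand-ins for light-curve features.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Synthetic stand-ins for light-curve features and a hidden labeling rule.
    rng = np.random.default_rng(4)
    X_pool = rng.normal(size=(1000, 8))
    y_pool = (X_pool[:, 0] + 0.5 * X_pool[:, 1] > 0).astype(int)

    # Small initial training set with both classes represented.
    labeled = list(np.where(y_pool == 0)[0][:10]) + list(np.where(y_pool == 1)[0][:10])
    unlabeled = [i for i in range(1000) if i not in labeled]

    for _ in range(5):                              # a few AL iterations
        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        clf.fit(X_pool[labeled], y_pool[labeled])
        proba = clf.predict_proba(X_pool[unlabeled])
        margin = np.abs(proba[:, 1] - proba[:, 0])  # low margin = low confidence
        queries = np.argsort(margin)[:10]           # 10 objects for "manual" labels
        for q in sorted(queries, reverse=True):     # pop high indices first
            labeled.append(unlabeled.pop(q))

    print("training set size after AL:", len(labeled))

In the paper's setting, the "manual labels" step corresponds to human inspection of the queried light curves through the Web interface described above.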
Susanto, Christopher; Kooman, J; Courtens, A M; Konings, C J A M
2018-01-01
Conservative care for patients aged 75 years and older with CKD stage 5, as a treatment option besides dialysis, was proposed officially in the Netherlands in October 2016. This national survey examined the current implementation of this option in Netherlands nephrology departments. A web-based survey was sent to the medical managers of 60 nephrology departments in the Netherlands in August 2016. Twenty-one medical managers (35%) completed the survey. The term "conservative care" is frequently used and well known. The estimated number of patients in whom the decision for maximal conservative care was made in 2015 was 310 of 2249 patients with CKD stage 5 aged 75 years and older (range 5-50 patients per department); 164 patients became symptomatic and received no dialysis. There is no official registration for this treatment option and patient category, and practice patterns vary widely. Only one of 21 respondents reported a conservative care outpatient clinic. Formal training or education regarding conservative care is not available in most departments. 95% of respondents discussed this treatment option with their patients. General practitioners are always informed about their patient's decision; their main role is providing or organizing palliative care support at the end of life and discussing advance care planning. Most respondents (86%) would consider including their patients in a prospective multicentre observational study of conservative care versus dialysis. Conservative care as a treatment option for patients with CKD stage 5 aged 75 years and older is well established, but practice patterns vary across the Netherlands. Follow-up studies are needed to see whether the new multidisciplinary guideline facilitates harmonization of practice patterns. Funding is needed to optimize the implementation of conservative care.
Full-Depth Coadds of the WISE and First-Year NEOWISE-Reactivation Images
Meisner, Aaron M.; Lang, Dustin; Schlegel, David J.
2017-01-03
The Near Earth Object Wide-field Infrared Survey Explorer (NEOWISE) Reactivation mission released data from its first full year of observations in 2015. This data set includes ~2.5 million exposures in each of W1 and W2, effectively doubling the amount of WISE imaging available at 3.4 μm and 4.6 μm relative to the AllWISE release. In this paper, we have created the first ever full-sky set of coadds combining all publicly available W1 and W2 exposures from both the AllWISE and NEOWISE-Reactivation (NEOWISER) mission phases. We employ an adaptation of the unWISE image coaddition framework, which preserves the native WISE angular resolution and is optimized for forced photometry. By incorporating two additional scans of the entire sky, we not only improve the W1/W2 depths, but also largely eliminate time-dependent artifacts such as off-axis scattered moonlight. We anticipate that our new coadds will have a broad range of applications, including target selection for upcoming spectroscopic cosmology surveys, identification of distant/massive galaxy clusters, and discovery of high-redshift quasars. In particular, our full-depth AllWISE+NEOWISER coadds will be an important input for the Dark Energy Spectroscopic Instrument selection of luminous red galaxy and quasar targets. Our full-depth W1/W2 coadds are already in use within the DECam Legacy Survey (DECaLS) and Mayall z-band Legacy Survey (MzLS) reduction pipelines. Finally, much more work still remains in order to fully leverage NEOWISER imaging for astrophysical applications beyond the solar system.
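The core of any coaddition scheme of this kind is an inverse-variance-weighted stack; the sketch below shows the principle on synthetic exposures. It is not the unWISE pipeline itself, which additionally handles astrometric alignment, outlier rejection, and PSF considerations.

    import numpy as np

    rng = np.random.default_rng(5)
    n_exp, ny, nx = 50, 64, 64
    truth = rng.normal(10.0, 2.0, size=(ny, nx))        # hypothetical true sky

    # Single exposures with varying per-exposure noise levels.
    sigmas = rng.uniform(0.5, 2.0, size=n_exp)
    exposures = truth + rng.normal(size=(n_exp, ny, nx)) * sigmas[:, None, None]
    ivar = 1.0 / sigmas**2                              # inverse-variance weights

    # Weighted stack: noisier exposures contribute less.
    coadd = np.tensordot(ivar, exposures, axes=1) / ivar.sum()
    print("residual rms:", float(np.sqrt(((coadd - truth) ** 2).mean())))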
Consensus Treatment Plans for New-Onset Systemic Juvenile Idiopathic Arthritis
DeWitt, Esi Morgan; Kimura, Yukiko; Beukelman, Timothy; Nigrovic, Peter A.; Onel, Karen; Prahalad, Sampath; Schneider, Rayfel; Stoll, Matthew L.; Angeles-Han, Sheila; Milojevic, Diana; Schikler, Kenneth N.; Vehe, Richard K.; Weiss, Jennifer E.; Weiss, Pamela; Ilowite, Norman T.; Wallace, Carol A.
2012-01-01
Objective There is wide variation in therapeutic approaches to systemic juvenile idiopathic arthritis (sJIA) among North American rheumatologists. Understanding the comparative effectiveness of the diverse therapeutic options available for treatment of sJIA can result in better health outcomes. The Childhood Arthritis and Rheumatology Research Alliance (CARRA) developed consensus treatment plans and standardized assessment schedules for use in clinical practice to facilitate such studies. Methods Case-based surveys were administered to CARRA members to identify prevailing treatments for new-onset sJIA. A 2-day consensus conference in April 2010 employed modified nominal group technique to formulate preliminary treatment plans and determine important data elements for collection. Follow-up surveys were employed to refine the plans and assess clinical acceptability. Results The initial case-based survey identified significant variability among current treatment approaches for new onset sJIA, underscoring the utility of standardized plans to evaluate comparative effectiveness. We developed four consensus treatment plans for the first 9 months of therapy, as well as case definitions and clinical and laboratory monitoring schedules. The four treatment regimens included glucocorticoids only, or therapy with methotrexate, anakinra or tocilizumab, with or without glucocorticoids. This approach was approved by >78% of CARRA membership. Conclusion Four standardized treatment plans were developed for new-onset sJIA. Coupled with data collection at defined intervals, use of these treatment plans will create the opportunity to evaluate comparative effectiveness in an observational setting to optimize initial management of sJIA. PMID:22290637