Science.gov

Sample records for large-scale health survey

  1. Survey on large scale system control methods

    NASA Technical Reports Server (NTRS)

    Mercadal, Mathieu

    1987-01-01

    The problems inherent to large-scale systems such as power networks, communication networks, and economic or ecological systems were studied. The increase in size and flexibility of future spacecraft has put those dynamical systems into the category of large-scale systems, and tools specific to this class of systems are being sought to design control systems that can guarantee greater stability and better performance. Among several survey papers, reference was found to a thorough investigation of decentralized control methods. Especially helpful was its classification of the different existing approaches to dealing with large-scale systems. A very similar classification is used here, even though the papers surveyed differ somewhat from those reviewed elsewhere. Special attention is given to the applicability of the existing methods to controlling large mechanical systems such as large space structures. Some recent developments are added to this survey.

  2. The XMM Large Scale Structure Survey

    NASA Astrophysics Data System (ADS)

    Pierre, Marguerite

    2005-10-01

    We propose to complete, by an additional 5 deg2, the XMM-LSS Survey region overlying the Spitzer/SWIRE field. This field already has CFHTLS and Integral coverage, and will encompass about 10 deg2. The resulting multi-wavelength medium-depth survey, which complements XMM and Chandra deep surveys, will provide a unique view of large-scale structure over a wide range of redshift, and will show active galaxies in the full range of environments. The complete coverage by optical and IR surveys provides high-quality photometric redshifts, so that cosmological results can quickly be extracted. In the spirit of a Legacy survey, we will make the raw X-ray data immediately public. Multi-band catalogues and images will also be made available on short time scales.

  3. Large scale survey of enteric viruses in river and waste water underlines the health status of the local population.

    PubMed

    Prevost, B; Lucas, F S; Goncalves, A; Richard, F; Moulin, L; Wurtzer, S

    2015-06-01

    Although enteric viruses constitute a major cause of acute waterborne diseases worldwide, environmental data on the occurrence and viral load of enteric viruses in water are not often available. In this study, enteric viruses (i.e., adenovirus, aichivirus, astrovirus, cosavirus, enterovirus, hepatitis A and E viruses, norovirus of genogroups I and II, rotavirus A and salivirus) were monitored in the Seine River and the origin of contamination was untangled. A total of 275 water samples were collected, twice a month for one year, from the river Seine, its tributaries and the major wastewater treatment plant (WWTP) effluents in the Paris agglomeration. All water samples were negative for hepatitis A and E viruses. Adenovirus (AdV), norovirus genogroups I and II (NVGI, NVGII) and rotavirus A (RV-A) were the most prevalent and abundant populations in all water samples. The viral load and the detection frequency increased significantly between the samples collected furthest upstream and those collected furthest downstream of the Paris urban area. The calculated viral fluxes clearly demonstrated the measurable impact of WWTP effluents on the viral contamination of the Seine River. The viral load was seasonal for almost all enteric viruses, in accordance with the gastroenteritis records provided by the French medical authorities. These results implied a close relationship between the health status of inhabitants and the viral contamination of WWTP effluents, and consequently of surface water. Thus, the regular analysis of wastewater could serve as a proxy for monitoring the human viruses circulating both in a population and in surface water. PMID:25795193

  4. Survey Design for Large-Scale, Unstructured Resistivity Surveys

    NASA Astrophysics Data System (ADS)

    Labrecque, D. J.; Casale, D.

    2009-12-01

    In this paper, we discuss the issues in designing data collection strategies for large-scale, poorly structured resistivity surveys. Existing or proposed applications for these types of surveys include carbon sequestration, enhanced oil recovery monitoring, monitoring of leachate from working or abandoned mines, and mineral surveys. Electrode locations are generally constrained by land access, utilities, roads, existing wells, etc. Classical arrays such as the Wenner array or dipole-dipole arrays are not applicable if the electrodes cannot be placed in quasi-regular lines or grids. A new, far more generalized strategy is needed for building data collection schemes. Following the approach of earlier two-dimensional (2-D) survey designs, the proposed method begins by defining a base array. In 2-D design, this base array is often a standard dipole-dipole array. For unstructured three-dimensional (3-D) design, determining this base array is a multi-step process. The first step is to determine a set of base dipoles with similar characteristics. For example, the base dipoles may consist of electrode pairs trending within 30 degrees of north and with lengths between 100 and 250 m. These dipoles are then combined into a trial set of arrays. This trial set is reduced by applying a series of filters based on criteria such as the separation between the dipoles. Using the base array set, additional arrays are added and tested to determine the overall improvement in resolution and to determine an optimal set of arrays. Examples of the design process are shown for a proposed carbon sequestration monitoring system.
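
    As a rough, hypothetical illustration of the base-dipole selection step described above, the Python sketch below keeps only electrode pairs whose azimuth and length fall inside chosen windows; the electrode coordinates and window values are illustrative assumptions, not part of the original survey design.

```python
import itertools
import math

def dipole_properties(p1, p2):
    """Return (length in m, azimuth in degrees east of north) of an electrode pair."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]                # east and north offsets (m)
    length = math.hypot(dx, dy)
    azimuth = math.degrees(math.atan2(dx, dy)) % 180.0   # fold direction to 0-180 deg
    return length, azimuth

def select_base_dipoles(electrodes, az_center=0.0, az_tol=30.0,
                        min_len=100.0, max_len=250.0):
    """Keep pairs trending within az_tol degrees of az_center and with
    lengths inside [min_len, max_len], as in the example in the abstract."""
    selected = []
    for p1, p2 in itertools.combinations(electrodes, 2):
        length, azimuth = dipole_properties(p1, p2)
        az_diff = abs(azimuth - az_center)
        az_diff = min(az_diff, 180.0 - az_diff)          # circular difference
        if min_len <= length <= max_len and az_diff <= az_tol:
            selected.append((p1, p2))
    return selected

# Arbitrary electrode coordinates in metres (east, north), chosen by access constraints.
electrodes = [(0, 0), (30, 140), (200, 60), (210, 260), (400, 90)]
print(len(select_base_dipoles(electrodes)), "base dipoles selected")
```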

  5. Large Scale Survey Data in Career Development Research

    ERIC Educational Resources Information Center

    Diemer, Matthew A.

    2008-01-01

    Large scale survey datasets have been underutilized but offer numerous advantages for career development scholars, as they contain many career development constructs with large and diverse samples that are followed longitudinally. Constructs such as work salience, vocational expectations, educational expectations, work satisfaction, and…

  6. Theoretical expectations for bulk flows in large-scale surveys

    NASA Technical Reports Server (NTRS)

    Feldman, Hume A.; Watkins, Richard

    1994-01-01

    We calculate the theoretical expectation for the bulk motion of a large-scale survey of the type recently carried out by Lauer and Postman. Included are the effects of survey geometry, errors in the distance measurements, clustering properties of the sample, and different assumed power spectra. We consider the power spectrum calculated from the Infrared Astronomical Satellite (IRAS)-QDOT survey, as well as spectra from hot + cold and standard cold dark matter models. We find that measurement uncertainty, sparse sampling, and clustering can lead to a much larger expectation for the bulk motion of a cluster sample than for the volume as a whole. However, our results suggest that the expected bulk motion is still inconsistent with that reported by Lauer and Postman at the 95%-97% confidence level.

  7. Interloper bias in future large-scale structure surveys

    NASA Astrophysics Data System (ADS)

    Pullen, Anthony R.; Hirata, Christopher M.; Doré, Olivier; Raccanelli, Alvise

    2016-02-01

    Next-generation spectroscopic surveys will map the large-scale structure of the observable universe, using emission line galaxies as tracers. While each survey will map the sky with a specific emission line, interloping emission lines can masquerade as the survey's intended emission line at different redshifts. Interloping lines from galaxies that are not removed can contaminate the power spectrum measurement, mixing correlations from various redshifts and diluting the true signal. We assess the potential for power spectrum contamination, finding that an interloper fraction worse than 0.2% could bias power spectrum measurements for future surveys by more than 10% of statistical errors, while also biasing power spectrum inferences. We also construct a formalism for predicting cosmological parameter measurement bias, demonstrating that a 0.15%-0.3% interloper fraction could bias the growth rate by more than 10% of the error, which can affect constraints on gravity from upcoming surveys. We use the COSMOS Mock Catalog (CMC), with the emission lines rescaled to better reproduce recent data, to predict potential interloper fractions for the Prime Focus Spectrograph (PFS) and the Wide-Field InfraRed Survey Telescope (WFIRST). We find that secondary line identification, or confirming galaxy redshifts by finding correlated emission lines, can remove interlopers for PFS. For WFIRST, we use the CMC to predict that the 0.2% target can be reached for the WFIRST Hα survey, but sensitive optical and near-infrared photometry will be required. For the WFIRST [O III] survey, the predicted interloper fractions reach several percent and their effects will have to be estimated and removed statistically (e.g., with deep training samples). These results are optimistic as the CMC does not capture the full set of correlations of galaxy properties in the real Universe, and they do not include blending effects. Mitigating interloper contamination will be crucial to the next generation of

  8. Characterizing unknown systematics in large scale structure surveys

    SciTech Connect

    Agarwal, Nishant; Ho, Shirley; Myers, Adam D.; Seo, Hee-Jong; Ross, Ashley J.; Bahcall, Neta; Brinkmann, Jonathan; Eisenstein, Daniel J.; Muna, Demitri; Palanque-Delabrouille, Nathalie; Yèche, Christophe; Petitjean, Patrick; Schneider, Donald P.; Streblyanska, Alina; Weaver, Benjamin A.

    2014-04-01

    Photometric large scale structure (LSS) surveys probe the largest volumes in the Universe, but are inevitably limited by systematic uncertainties. Imperfect photometric calibration leads to biases in our measurements of the density fields of LSS tracers such as galaxies and quasars, and as a result in cosmological parameter estimation. Earlier studies have proposed using cross-correlations between different redshift slices or cross-correlations between different surveys to reduce the effects of such systematics. In this paper we develop a method to characterize unknown systematics. We demonstrate that while we do not have sufficient information to correct for unknown systematics in the data, we can obtain an estimate of their magnitude. We define a parameter to estimate contamination from unknown systematics using cross-correlations between different redshift slices and propose discarding bins in the angular power spectrum that lie outside a certain contamination tolerance level. We show that this method improves estimates of the bias using simulated data and further apply it to photometric luminous red galaxies in the Sloan Digital Sky Survey as a case study.
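
    A minimal sketch of the cross-correlation idea, under the assumption that two widely separated redshift slices share essentially no cosmological signal, so any significant correlation between their overdensity maps points to a common contaminant; the maps below are random placeholders, not survey data.

```python
import numpy as np

rng = np.random.default_rng(1)
n_pix = 10_000

# Placeholder overdensity maps for two widely separated redshift slices,
# both contaminated by the same (unknown) systematic pattern.
systematic = rng.normal(0.0, 0.02, n_pix)
slice_lo_z = rng.normal(0.0, 0.10, n_pix) + systematic
slice_hi_z = rng.normal(0.0, 0.10, n_pix) + systematic

# With negligible cosmological cross-signal, a non-zero correlation
# coefficient estimates the relative strength of the shared contaminant.
r = np.corrcoef(slice_lo_z, slice_hi_z)[0, 1]
print(f"cross-correlation between disjoint slices: {r:.3f}")
```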

  9. Consent and widespread access to personal health information for the delivery of care: a large scale telephone survey of consumers' attitudes using vignettes in New Zealand

    PubMed Central

    Whiddett, Dick; Hunter, Inga; McDonald, Barry; Norris, Tony; Waldon, John

    2016-01-01

    Objectives In light of recent health policy, to examine factors which influence the public's willingness to consent to share their health information in a national electronic health record (EHR). Design Data were collected in a national telephone survey in 2008. Respondents were presented with vignettes that described situations in which their health information was shared and asked if they would consent to such sharing. The subset consisting of the 18 vignettes that covered providing care was reanalysed in depth using new statistical methods in 2016. Setting Adult population of New Zealand accessible by telephone landline. Participants 4209 adults aged 18+ years in the full data set, 2438 of whom are included in the selected subset. Main outcome measures For each of the 18 vignettes, we measured the percentage of respondents who would consent for their information to be shared for two groups: those who did not consider that their records contained sensitive information, and those who did (or refused to say). Results Rates of consent ranged from 89% (95% CI 87% to 92%) for sharing of information with hospital doctors and nurses to 51% (47% to 55%) for government agencies. Mixed-effects logistic regression was used to identify factors which had a significant impact on consent. The role of the recipient and the level of detail influenced respondents' willingness to consent (p<0.0001 for both factors). Of the individual characteristics, the biggest impact was that respondents whose records contain sensitive information (or who refused to answer) were less willing to consent (p<0.0001). Conclusions A proportion of the population are reluctant to share their health information beyond doctors, nurses and paramedics, particularly when records contain sensitive information. These findings may have adverse implications for healthcare strategies based on widespread sharing of information. Further research is needed to understand and overcome people's ambivalence towards
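
    A simplified, fixed-effects stand-in for the mixed-effects logistic regression described above is sketched below with statsmodels; the variable names and the synthetic data-generating process are hypothetical, chosen only to mirror the reported direction of the effects.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500

# Hypothetical vignette responses: recipient role, level of detail shared,
# and whether the respondent's record contains sensitive information.
recipient = rng.choice(["hospital", "gp", "government"], size=n)
detail = rng.choice(["summary", "full"], size=n)
sensitive = rng.integers(0, 2, size=n)

# Assumed data-generating process: consent less likely for government
# recipients, full detail, and sensitive records (directions only).
logit_p = (1.5
           - 1.2 * (recipient == "government")
           - 0.4 * (detail == "full")
           - 0.8 * sensitive)
consent = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))).astype(int)

df = pd.DataFrame({"consent": consent, "recipient": recipient,
                   "detail": detail, "sensitive": sensitive})

# Ordinary logistic regression; the published analysis additionally
# included random effects for respondents.
result = smf.logit("consent ~ C(recipient) + C(detail) + sensitive", data=df).fit(disp=False)
print(result.summary())
```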

  10. Survey of decentralized control methods [for large scale dynamic systems]

    NASA Technical Reports Server (NTRS)

    Athans, M.

    1975-01-01

    An overview is presented of the types of problems that are being considered by control theorists in the area of dynamic large scale systems with emphasis on decentralized control strategies. Approaches that deal directly with decentralized decision making for large scale systems are discussed. It is shown that future advances in decentralized system theory are intimately connected with advances in the stochastic control problem with nonclassical information pattern. The basic assumptions and mathematical tools associated with the latter are summarized, and recommendations concerning future research are presented.

  11. A bibliographical survey of large-scale systems

    NASA Technical Reports Server (NTRS)

    Corliss, W. R.

    1970-01-01

    A limited, partly annotated bibliography was prepared on the subject of large-scale system control. Approximately 400 references are divided into thirteen application areas, such as large societal systems and large communication systems. A first-author index is provided.

  12. Performance Health Monitoring of Large-Scale Systems

    SciTech Connect

    Rajamony, Ram

    2014-11-20

    This report details the progress made on the ASCR-funded project Performance Health Monitoring for Large Scale Systems. A large-scale application may not achieve its full performance potential due to degraded performance of even a single subsystem. Detecting performance faults, isolating them, and taking remedial action is critical for the scale of systems on the horizon. PHM aims to develop techniques and tools that can be used to identify and mitigate such performance problems. We accomplish this through two main components. The PHM framework encompasses diagnostics, system monitoring, fault isolation, and performance evaluation capabilities that indicate when a performance fault has been detected, whether due to an anomaly in the system itself or due to contention for shared resources between concurrently executing jobs. Software components called the PHM Control system then build upon the capabilities provided by the PHM framework to mitigate degradation caused by performance problems.

  13. Probing the large scale structure with the Dark Energy Survey

    NASA Astrophysics Data System (ADS)

    Leistedt, Boris

    2016-03-01

    I will present the latest cosmological results from the Dark Energy Survey (DES), a 5000 square degree optical galaxy survey in the Southern Hemisphere started in 2012. I will focus on the constraints on Baryon Acoustic Oscillations and other cosmological parameters obtained with galaxy clustering measurements from the first years of DES data. I will highlight the various tests and methods that make these results not only precise but also robust against observational systematics and modeling uncertainties. Finally, I will describe the future phases of the survey, the expected increase in constraining power, and the challenges that need to be addressed to fully exploit the data from surveys such as DES and LSST.

  14. Testing model independent modified gravity with future large scale surveys

    SciTech Connect

    Thomas, Daniel B.; Contaldi, Carlo R. E-mail: c.contaldi@ic.ac.uk

    2011-12-01

    Model-independent parametrisations of modified gravity have attracted a lot of attention over the past few years, and numerous combinations of experiments and observables have been suggested to constrain the parameters used in these models. Galaxy clusters have been mentioned, but not examined as extensively in the literature as some other probes. Here we look at adding galaxy clusters into the mix of observables and examine how they could improve the constraints on the modified gravity parameters. In particular, we forecast the constraints from combining Planck satellite Cosmic Microwave Background (CMB) measurements and a Sunyaev-Zel'dovich (SZ) cluster catalogue with a DES-like Weak Lensing (WL) survey. We find that cluster counts significantly improve the constraints over those derived using CMB and WL alone. We then look at surveys further into the future, to see how much further it may be feasible to tighten the constraints.

  15. A large-scale integrated aerogeophysical survey of Afghanistan

    NASA Astrophysics Data System (ADS)

    Brozena, J. M.; Childers, V. A.; Gardner, J. M.; Liang, R. T.; Bowles, J. H.; Abraham, J. D.

    2007-12-01

    A multi-sensor, multidisciplinary aerogeophysical survey of a major portion of Afghanistan was recently conducted by investigators from the Naval Research Laboratory and the U.S. Geological Survey. More than 110,000 line km of data tracks were flown aboard an NP-3D Orion aircraft. Sensor systems installed on the P-3 included dual gravimeters, scalar and vector magnetometers, a digital photogrammetric camera, a hyperspectral imager, and an L-band polarimetric synthetic aperture radar (SAR). Data from all sources were precisely co-registered to the ground by a combination of interferometric-mode Global Positioning System (GPS) and inertial measurements. The data from this integrated mapping mission support numerous basic and applied science efforts in Afghanistan, including resource assessment and exploration for oil, gas, and minerals; development of techniques for sensor fusion and automated analysis; and topics in crustal geophysics and geodesy. The data will also support civil infrastructure needs such as cadastral surveying, urban planning and development, pipeline/powerline/road routing and construction, agriculture and hydrologic resource management, earthquake hazard analysis, and base maps for humanitarian relief missions.

  16. Large-Scale Environmental Influences on Aquatic Animal Health

    EPA Science Inventory

    In the latter portion of the 20th century, North America experienced numerous large-scale mortality events affecting a broad diversity of aquatic animals. Short-term forensic investigations of these events have sometimes characterized a causative agent or condition, but have rare...

  17. Large Scale Structure at 24 Microns in the SWIRE Survey

    NASA Astrophysics Data System (ADS)

    Masci, F. J.; SWIRE Team

    2006-12-01

    We present initial results of galaxy clustering at 24 μm by analyzing statistics of the projected galaxy distribution from counts-in-cells. This study focuses on the ELAIS-North1 SWIRE field. The sample covers ≃5.9 deg² and contains 24,715 sources detected at 24 μm to a 5.6σ limit of 250 μJy (in the lowest coverage regions). We have explored clustering as a function of 3.6 - 24 μm color and 24 μm flux density using angular-averaged two-point correlation functions derived from the variance of counts-in-cells on scales 0°.05-0°.7. Using a power-law parameterization, w₂(θ) = A(θ/deg)^(1-γ), we find [A, γ] = [(5.43 ± 0.20) × 10⁻⁴, 2.01 ± 0.02] for the full sample (1σ errors throughout). We have inverted Limber's equation and estimated a spatial correlation length of r₀ = 3.32 ± 0.19 h⁻¹ Mpc for the full sample, assuming stable clustering and a redshift model consistent with observed 24 μm counts. We also find that blue [fν(24)/fν(3.6) ≤ 5.5] and red [fν(24)/fν(3.6) ≥ 6.5] galaxies have the lowest and highest r₀ values respectively, implying that redder galaxies are more clustered (by a factor of ≈3 on scales ≳0°.2). Overall, the clustering estimates are smaller than those derived from optical surveys, but in agreement with results from IRAS and ISO in the mid-infrared. This extends to higher redshifts the notion that infrared-selected surveys show weaker clustering than optical surveys.
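
    As a small illustration of the power-law fit quoted above, the sketch below fits w(θ) = A(θ/deg)^(1-γ) as a straight line in log-log space to a cell-averaged correlation assumed to have already been estimated from counts-in-cells via Var(N) = <N> + <N>² w̄(θ); the numbers are placeholders, not the SWIRE measurements.

```python
import numpy as np

theta = np.array([0.05, 0.1, 0.2, 0.4, 0.7])                # cell size (degrees)
w_bar = np.array([0.017, 0.0085, 0.0042, 0.0021, 0.0012])   # placeholder w_bar(theta)

# Fit w(theta) = A * (theta/deg)**(1 - gamma) as a straight line in log-log space.
slope, log_A = np.polyfit(np.log10(theta), np.log10(w_bar), 1)
A, gamma = 10**log_A, 1.0 - slope
print(f"A = {A:.2e}, gamma = {gamma:.2f}")
```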

  18. A Novel Electronic Data Collection System for Large-Scale Surveys of Neglected Tropical Diseases

    PubMed Central

    King, Jonathan D.; Buolamwini, Joy; Cromwell, Elizabeth A.; Panfel, Andrew; Teferi, Tesfaye; Zerihun, Mulat; Melak, Berhanu; Watson, Jessica; Tadesse, Zerihun; Vienneau, Danielle; Ngondi, Jeremiah; Utzinger, Jürg; Odermatt, Peter; Emerson, Paul M.

    2013-01-01

    …-based technology was suitable for a large-scale health survey, saved time, provided more accurate geo-coordinates, and was preferred by recorders over standard paper-based questionnaires. PMID:24066147

  19. Large-scale structure in the Southern Sky Redshift Survey

    NASA Technical Reports Server (NTRS)

    Park, Changbom; Gott, J. R., III; Da Costa, L. N.

    1992-01-01

    The power spectra of the Southern Sky Redshift Survey and the CfA samples are measured in order to explore the amplitude of fluctuations in the galaxy density. At λ ≤ 30 h⁻¹ Mpc the observed power spectrum is quite consistent with the standard CDM model. At larger scales the data indicate an excess of power over the standard CDM model. The observed power spectrum from these optical galaxy samples is in good agreement with that drawn from the sparsely sampled IRAS galaxies. The shape of the power spectrum is also studied by examining the relation between the genus per unit volume and the smoothing length. It is found that, over Gaussian smoothing scales from 6 to 14 h⁻¹ Mpc, the power spectrum has a slope of about -1. The topology of the galaxy density field is studied by measuring the shift of the genus curve from the Gaussian case. Over all smoothing scales studied, the observed genus curves are consistent with a random-phase distribution of the galaxy density field, as predicted by inflationary scenarios.

  20. An Open-Source Galaxy Redshift Survey Simulator for next-generation Large Scale Structure Surveys

    NASA Astrophysics Data System (ADS)

    Seljak, Uros

    Galaxy redshift surveys produce three-dimensional maps of the galaxy distribution. On large scales these maps trace the underlying matter fluctuations in a relatively simple manner, so that the properties of the primordial fluctuations along with the overall expansion history and growth of perturbations can be extracted. The BAO standard ruler method to measure the expansion history of the universe using galaxy redshift surveys is thought to be robust to observational artifacts and understood theoretically with high precision. These same surveys can offer a host of additional information, including a measurement of the growth rate of large scale structure through redshift space distortions, the possibility of measuring the sum of neutrino masses, tighter constraints on the expansion history through the Alcock-Paczynski effect, and constraints on the scale-dependence and non-Gaussianity of the primordial fluctuations. Extracting this broadband clustering information hinges on both our ability to minimize and subtract observational systematics to the observed galaxy power spectrum, and our ability to model the broadband behavior of the observed galaxy power spectrum with exquisite precision. Rapid development on both fronts is required to capitalize on WFIRST's data set. We propose to develop an open-source computational toolbox that will propel development in both areas by connecting large scale structure modeling and instrument and survey modeling with the statistical inference process. We will use the proposed simulator to both tailor perturbation theory and fully non-linear models of the broadband clustering of WFIRST galaxies and discover novel observables in the non-linear regime that are robust to observational systematics and able to distinguish between a wide range of spatial and dynamic biasing models for the WFIRST galaxy redshift survey sources. We have demonstrated the utility of this approach in a pilot study of the SDSS-III BOSS galaxies, in which we

  1. The Use of Online Social Networks by Polish Former Erasmus Students: A Large-Scale Survey

    ERIC Educational Resources Information Center

    Bryla, Pawel

    2014-01-01

    Online social networks play an increasing role in the lives of young Poles. We conducted a large-scale survey among Polish former Erasmus students. We received 2450 completed questionnaires from alumni of 115 higher education institutions all over Poland. 85.4% of our respondents reported that they kept in touch with their former Erasmus…

  2. PERSPECTIVES ON LARGE-SCALE NATURAL RESOURCES SURVEYS WHEN CAUSE-EFFECT IS A POTENTIAL ISSUE

    EPA Science Inventory

    Our objective is to present a perspective on large-scale natural resource monitoring when cause-effect is a potential issue. We believe that the approach of designing a survey to meet traditional commodity production and resource state descriptive objectives is too restrictive an...

  3. An Alternative Way to Model Population Ability Distributions in Large-Scale Educational Surveys

    ERIC Educational Resources Information Center

    Wetzel, Eunike; Xu, Xueli; von Davier, Matthias

    2015-01-01

    In large-scale educational surveys, a latent regression model is used to compensate for the shortage of cognitive information. Conventionally, the covariates in the latent regression model are principal components extracted from background data. This operational method has several important disadvantages, such as the handling of missing data and…

  4. Horvitz-Thompson survey sample methods for estimating large-scale animal abundance

    USGS Publications Warehouse

    Samuel, M.D.; Garton, E.O.

    1994-01-01

    Large-scale surveys to estimate animal abundance can be useful for monitoring population status and trends, for measuring responses to management or environmental alterations, and for testing ecological hypotheses about abundance. However, large-scale surveys may be expensive and logistically complex. To ensure resources are not wasted on unattainable targets, the goals and uses of each survey should be specified carefully and alternative methods for addressing these objectives should always be considered. During survey design, the importance of each survey error component (spatial design, proportion of detected animals, precision in detection) should be considered carefully to produce a complete, statistically based survey. Failure to address these three survey components may produce population estimates that are inaccurate (biased low), have unrealistic precision (too precise) and do not satisfactorily meet the survey objectives. Optimum survey design requires trade-offs in these sources of error relative to the costs of sampling plots and detecting animals on plots, considerations that are specific to the spatial logistics and survey methods. The Horvitz-Thompson estimators provide a comprehensive framework for considering all three survey components during the design and analysis of large-scale wildlife surveys. Problems of spatial and temporal (especially survey-to-survey) heterogeneity in detection probabilities have received little consideration, but failure to account for heterogeneity produces biased population estimates. The goal of producing unbiased population estimates is in conflict with the increased variation from heterogeneous detection in the population estimate. One solution to this conflict is to use an MSE-based approach to achieve a balance between bias reduction and increased variation. Further research is needed to develop methods that address spatial heterogeneity in detection, evaluate the effects of temporal heterogeneity on survey
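
    A minimal numerical sketch of a Horvitz-Thompson-style abundance estimate follows, assuming each sampled plot has a known inclusion probability from the spatial design and an estimated detection probability; all values are illustrative.

```python
import numpy as np

y  = np.array([12, 5, 0, 21, 8])                # animals detected on each sampled plot
pi = np.array([0.10, 0.10, 0.05, 0.20, 0.10])   # plot inclusion probabilities (design)
p  = np.array([0.70, 0.60, 0.80, 0.70, 0.50])   # detection probabilities on each plot

# Horvitz-Thompson estimator: correct counts for imperfect detection,
# then expand by the inverse inclusion probability.
N_hat = np.sum(y / (p * pi))
print(f"Estimated total abundance: {N_hat:.0f}")
```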

  5. The Observations of Redshift Evolution in Large Scale Environments (ORELSE) Survey

    NASA Astrophysics Data System (ADS)

    Squires, Gordon K.; Lubin, L. M.; Gal, R. R.

    2007-05-01

    We present the motivation, design, and latest results from the Observations of Redshift Evolution in Large Scale Environments (ORELSE) Survey, a systematic search for structure on scales greater than 10 Mpc around 20 known galaxy clusters at z > 0.6. When complete, the survey will cover nearly 5 square degrees, all targeted at high-density regions, making it complementary and comparable to field surveys such as DEEP2, GOODS, and COSMOS. For the survey, we are using the Large Format Camera on the Palomar 5-m and SuPRIME-Cam on the Subaru 8-m to obtain optical/near-infrared imaging of an approximately 30 arcmin region around previously studied high-redshift clusters. Colors are used to identify likely member galaxies which are targeted for follow-up spectroscopy with the DEep Imaging Multi-Object Spectrograph on the Keck 10-m. This technique has been used to identify successfully the Cl 1604 supercluster at z = 0.9, a large scale structure containing at least eight clusters (Gal & Lubin 2004; Gal, Lubin & Squires 2005). We present the most recent structures to be photometrically and spectroscopically confirmed through this program, discuss the properties of the member galaxies as a function of environment, and describe our planned multi-wavelength (radio, mid-IR, and X-ray) observations of these systems. The goal of this survey is to identify and examine a statistical sample of large scale structures during an active period in the assembly history of the most massive clusters. With such a sample, we can begin to constrain large scale cluster dynamics and determine the effect of the larger environment on galaxy evolution.

  6. Addressing statistical and operational challenges in designing large-scale stream condition surveys.

    PubMed

    Dobbie, Melissa J; Negus, Peter

    2013-09-01

    Implementing a statistically valid and practical monitoring design for large-scale stream condition monitoring and assessment programs can be difficult due to factors including the likely existence of a diversity of ecosystem types such as ephemeral streams over the sampling domain; limited resources to undertake detailed monitoring surveys and address knowledge gaps; and operational constraints on effective sampling at monitoring sites. In statistical speak, these issues translate to defining appropriate target populations and sampling units; designing appropriate spatial and temporal sample site selection methods; selection and use of appropriate indicators; and setting effect sizes with limited ecological and statistical information about the indicators of interest. We identify the statistical and operational challenges in designing large-scale stream condition surveys and discuss general approaches for addressing them. The ultimate aim in drawing attention to these challenges is to ensure operational practicality in carrying out future monitoring programs and that the resulting inferences about stream condition are statistically valid and relevant. PMID:23344628

  7. Large-scale survey of adverse reactions to canine non-rabies combined vaccines in Japan.

    PubMed

    Miyaji, Kazuki; Suzuki, Aki; Shimakura, Hidekatsu; Takase, Yukari; Kiuchi, Akio; Fujimura, Masato; Kurita, Goro; Tsujimoto, Hajime; Sakaguchi, Masahiro

    2012-01-15

    Canine non-rabies combined vaccines are widely used to protect animals from infectious agents, and also play an important role in public health. We performed a large-scale survey to investigate vaccine-associated adverse events (VAAEs), including anaphylaxis, in Japan by distributing questionnaires on VAAEs to veterinary hospitals from April 1, 2006 through May 31, 2007. Valid responses were obtained for 57,300 vaccinated dogs at 573 animal hospitals; we obtained VAAE information for the last 100 vaccinated dogs in each veterinary hospital. We found that 359 of the 57,300 dogs showed VAAEs. Of the 359 dogs, death was observed in 1, anaphylaxis in 41, dermatological signs in 244, gastrointestinal signs in 160, and other signs in 106. Onset of VAAEs was mostly observed within 12 h after vaccination (n=299, 83.3%). In this study, anaphylaxis events occurred within 60 min after vaccination, and about half of these events occurred within 5 min (n=19, 46.3%). Furthermore, where anaphylaxis was reported, additional information to support the diagnosis was obtained by reinvestigation. Our resurvey of dogs with anaphylaxis yielded responses on 31 dogs; 27 of these demonstrated collapse (87.1%), 24 demonstrated cyanosis (77.4%), and both signs occurred in 22 (71.0%). Higher rates of VAAEs, anaphylaxis, and death were found in Japan than in other countries. Further investigations, including survey studies, will be necessary to elucidate the relationship between death and vaccination and the risk factors for VAAEs, and thus develop safer vaccines. Moreover, it may also be necessary to continually update the VAAE data. PMID:22264736

  8. A sparse-sampling strategy for the estimation of large-scale clustering from redshift surveys

    NASA Astrophysics Data System (ADS)

    Kaiser, N.

    1986-04-01

    It is shown that, for the estimation of large-scale clustering, a sparsely sampled faint-magnitude-limited redshift survey can significantly reduce the uncertainty in the two-point function for a given investment of telescope time. The signal-to-noise ratio for a 1-in-20 bright-galaxy sample is roughly twice that provided by a complete survey of the same cost, and this performance equals that of a larger complete survey of about seven times the cost. A similar performance increase is achieved with multiple redshift collection by a wide-field telescope on a survey with close to full sky coverage. Little performance improvement is seen for smaller multiply-collected surveys, which are ideally sampled at about a 1-in-10 bright-galaxy rate. The optimum sampling fraction for Abell's rich clusters is found to be close to unity, with little performance improvement from sparse sampling.

  9. Studying populations of eclipsing binaries using large scale multi-epoch photometric surveys

    NASA Astrophysics Data System (ADS)

    Mowlavi, Nami; Barblan, Fabio; Holl, Berry; Rimoldini, Lorenzo; Lecoeur-Taïbi, Isabelle; Süveges, Maria; Eyer, Laurent; Guy, Leanne; Nienartowicz, Krzysztof; Ordonez, Diego; Charnas, Jonathan; Jévardat de Fombelle, Grégory

    2015-08-01

    Large-scale multi-epoch photometric surveys provide unique opportunities to study populations of binary stars through the study of eclipsing binaries, provided the basic properties of binary systems can be derived from their light curves without the need to fully model the binary system. Those systems can then be classified into various types: for example, from close to wide systems, from circular to highly elliptical systems, or from systems with similar components to highly asymmetric systems. The challenge is to extract physically relevant information from the light curve geometry. In this contribution, we present the study of eclipsing binaries in the Large Magellanic Cloud (LMC) from the OGLE-III survey. The study is based on the analysis of the geometry of their light curves, parameterized using a two-Gaussian model. We show what physical parameters can be extracted from such an analysis, and the results for the LMC eclipsing binaries. The method is very well adapted to processing large-scale surveys containing millions of eclipsing binaries, such as is expected from the current Gaia mission or the future LSST survey.
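
    One plausible form of the two-Gaussian light-curve parameterization mentioned above is sketched below: a constant out-of-eclipse magnitude plus two Gaussian eclipse dips in orbital phase. Parameter names and values are illustrative and not the published implementation; in practice the model would be fitted to each light curve, e.g. with scipy.optimize.curve_fit.

```python
import numpy as np

def two_gaussian_model(phase, const, d1, mu1, s1, d2, mu2, s2):
    """Magnitude versus orbital phase (0..1): a constant level plus two
    Gaussian dips (positive depths in magnitudes, i.e. fainter in eclipse).
    Each Gaussian is replicated over adjacent cycles so eclipses near
    phase 0 or 1 are handled correctly."""
    def wrapped_gauss(depth, mu, sigma):
        g = np.zeros_like(phase)
        for shift in (-1.0, 0.0, 1.0):
            g += depth * np.exp(-0.5 * ((phase - mu + shift) / sigma) ** 2)
        return g
    return const + wrapped_gauss(d1, mu1, s1) + wrapped_gauss(d2, mu2, s2)

phase = np.linspace(0.0, 1.0, 200)
mag = two_gaussian_model(phase, const=15.2,
                         d1=0.60, mu1=0.00, s1=0.02,   # primary eclipse
                         d2=0.25, mu2=0.50, s2=0.02)   # secondary eclipse
```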

  10. Measures of large-scale structure in the CfA redshift survey slices

    NASA Technical Reports Server (NTRS)

    De Lapparent, Valerie; Geller, Margaret J.; Huchra, John P.

    1991-01-01

    Variations of the counts-in-cells with cell size are used here to define two statistical measures of large-scale clustering in three 6 deg slices of the CfA redshift survey. A percolation criterion is used to estimate the filling factor, which measures the fraction of the total volume in the survey occupied by the large-scale structures. For the full 18 deg slice of the CfA redshift survey, f ≈ 0.25 ± 0.05. After removing groups with more than five members from two of the slices, variations of the counts in occupied cells with cell size have a power-law behavior with a slope β ≈ 2.2 on scales from 1-10 h⁻¹ Mpc. Application of both this statistic and the percolation analysis to simulations suggests that a network of two-dimensional structures is a better description of the geometry of the clustering in the CfA slices than a network of one-dimensional structures. Counts-in-cells are also used to estimate the average galaxy surface density in sheets like the Great Wall at about 0.3 h² galaxies Mpc⁻².

  11. Measures of large-scale structure in the CfA redshift survey slices

    SciTech Connect

    De Lapparent, V.; Geller, M.J.; Huchra, J.P. (Harvard-Smithsonian Center for Astrophysics, Cambridge, MA)

    1991-03-01

    Variations of the counts-in-cells with cell size are used here to define two statistical measures of large-scale clustering in three 6 deg slices of the CfA redshift survey. A percolation criterion is used to estimate the filling factor, which measures the fraction of the total volume in the survey occupied by the large-scale structures. For the full 18 deg slice of the CfA redshift survey, f ≈ 0.25 ± 0.05. After removing groups with more than five members from two of the slices, variations of the counts in occupied cells with cell size have a power-law behavior with a slope β ≈ 2.2 on scales from 1-10 h⁻¹ Mpc. Application of both this statistic and the percolation analysis to simulations suggests that a network of two-dimensional structures is a better description of the geometry of the clustering in the CfA slices than a network of one-dimensional structures. Counts-in-cells are also used to estimate the average galaxy surface density in sheets like the Great Wall at about 0.3 h² galaxies Mpc⁻².

  12. Large Scale eHealth Deployment in Europe: Insights from Concurrent Use of Standards.

    PubMed

    Eichelberg, Marco; Chronaki, Catherine

    2016-01-01

    Large-scale eHealth deployment projects face a major challenge when called to select the right set of standards and tools to achieve sustainable interoperability in an ecosystem including both legacy systems and new systems reflecting technological trends and progress. There is no single standard that would cover all needs of an eHealth project, and there is a multitude of overlapping and perhaps competing standards that can be employed to define document formats, terminology, and communication protocols, mirroring alternative technical approaches and schools of thought. eHealth projects need to respond to the important question of how alternative or inconsistently implemented standards and specifications can be used to ensure practical interoperability and long-term sustainability in large-scale eHealth deployment. In the eStandards project, 19 European case studies reporting from R&D and large-scale eHealth deployment and policy projects were analyzed. Although this study is not exhaustive, by reflecting on the concepts, standards, and tools for concurrent use and on the successes, failures, and lessons learned, this paper offers practical insights into how eHealth deployment projects can make the most of the available eHealth standards and tools, and how standards and profile developing organizations can serve users while embracing sustainability and technical innovation. PMID:27577416

  13. A large-scale survey of thermal comfort in office premises in Hong Kong

    SciTech Connect

    Chan, D.W.T.; Burnett, J.; Ng, S.C.H.; Dear, R.J. de

    1998-10-01

    Hong Kong is a densely populated city in which the service sector dominates. The significant outdoor noise pollution and subtropical climate severely restrict the opportunity for office premises to be naturally ventilated. The high energy consumption for space cooling and the demand for improved indoor thermal comfort conditions stimulated a large-scale survey of thermal comfort conditions in Hong Kong office premises. The neutral temperatures and preferred temperatures are found to be lower than those found in other studies in the tropics, with 60% of the surveyed subjects preferring a change of the thermal conditions in summer. The outcome provides a better notion of thermal comfort that can inform design criteria. The results also add weight to concerns about the field validity of the traditional chamber test data underlying ASHRAE Standard 55-1992. It further suggests the potential for adopting an adaptive control algorithm for thermal comfort.

  14. Public health concerns for neighbors of large-scale swine production operations.

    PubMed

    Thu, K M

    2002-05-01

    This article provides a review and critical synthesis of research related to public health concerns for neighbors exposed to emissions from large-scale swine production operations. The rapid industrialization of pork production in the 1990s produced a generation of confined animal feeding operations (CAFOs) of a size previously unseen in the U.S. Recent research and results from federally sponsored scientific symposia consistently indicate that neighbors of large-scale swine CAFOs can experience health problems at significantly higher rates than controlled comparison populations. Symptoms experienced by swine CAFO neighbors are generally oriented toward irritation of the respiratory tract and are consistent with the types of symptoms among interior confinement workers that have been well documented in the occupational health literature. However, additional exposure assessment research is required to elucidate the relationship between reported symptoms among swine CAFO neighbors and CAFO emissions. PMID:12046804

  15. Measuring large-scale structure with quasars in narrow-band filter surveys

    NASA Astrophysics Data System (ADS)

    Abramo, L. Raul; Strauss, Michael A.; Lima, Marcos; Hernández-Monteagudo, Carlos; Lazkoz, Ruth; Moles, Mariano; de Oliveira, Claudia Mendes; Sendra, Irene; Sodré, Laerte; Storchi-Bergmann, Thaisa

    2012-07-01

    We show that a large-area imaging survey using narrow-band filters could detect quasars in sufficiently high number densities, and with more than sufficient accuracy in their photometric redshifts, to turn them into suitable tracers of large-scale structure. If a narrow-band optical survey can detect objects as faint as i = 23, it could reach volumetric number densities as high as 10⁻⁴ h³ Mpc⁻³ (comoving) at z ~ 1.5. Such a catalogue would lead to precision measurements of the power spectrum up to z ~ 3-4. We also show that it is possible to employ quasars to measure baryon acoustic oscillations at high redshifts, where the uncertainties from redshift distortions and non-linearities are much smaller than at z ≲ 1. As a concrete example we study the future impact of the Javalambre Physics of the Accelerating Universe Astrophysical Survey (J-PAS), which is a narrow-band imaging survey in the optical over 1/5 of the unobscured sky with 42 filters of ~100 Å full width at half-maximum. We show that J-PAS will be able to take advantage of the broad emission lines of quasars to deliver excellent photometric redshifts, σ_z ≃ 0.002(1 + z), for millions of objects.

  16. Google Street View as an alternative method to car surveys in large-scale vegetation assessments.

    PubMed

    Deus, Ernesto; Silva, Joaquim S; Catry, Filipe X; Rocha, Miguel; Moreira, Francisco

    2015-10-01

    Car surveys (CS) are a common method for assessing the distribution of alien invasive plants. Google Street View (GSV), a free-access web technology where users may experience a virtual travel along roads, has been suggested as a cost-effective alternative to car surveys. We tested if we could replicate the results from a countrywide survey conducted by car in Portugal using GSV as a remote sensing tool, aiming at assessing the distribution of Eucalyptus globulus Labill. wildlings on roadsides adjacent to eucalypt stands. Georeferenced points gathered along CS were used to create road transects visible as lines overlapping the road in GSV environment, allowing surveying the same sampling areas using both methods. This paper presents the results of the comparison between the two methods. Both methods produced similar models of plant abundance, selecting the same explanatory variables, in the same hierarchical order of importance and depicting a similar influence on plant abundance. Even though the GSV model had a lower performance and the GSV survey detected fewer plants, additional variables collected exclusively with GSV improved model performance and provided a new insight into additional factors influencing plant abundance. The survey using GSV required ca. 9 % of the funds and 62 % of the time needed to accomplish the CS. We conclude that GSV may be a cost-effective alternative to CS. We discuss some advantages and limitations of GSV as a survey method. We forecast that GSV may become a widespread tool in road ecology, particularly in large-scale vegetation assessments. PMID:27624742

  17. Large-Scale Surveys of Snow Depth on Arctic Sea Ice from Operation IceBridge

    NASA Technical Reports Server (NTRS)

    Kurtz, Nathan T.; Farrell, Sinead L.

    2011-01-01

    We show the first results of a large-scale survey of snow depth on Arctic sea ice from NASA's Operation IceBridge snow radar system for the 2009 season and compare the data to climatological snow depth values established over the 1954-1991 time period. For multiyear ice, the mean radar-derived snow depth is 33.1 cm and the corresponding mean climatological snow depth is 33.4 cm. The small mean difference suggests consistency between contemporary estimates of snow depth and the historical climatology for the multiyear ice region of the Arctic. A 16.5 cm mean difference (climatology minus radar) is observed for first-year ice areas, suggesting that the increasingly seasonal sea ice cover of the Arctic Ocean has led to an overall loss of snow as the region has transitioned away from a dominantly multiyear ice cover.

  18. Ten key considerations for the successful implementation and adoption of large-scale health information technology

    PubMed Central

    Cresswell, Kathrin M; Bates, David W; Sheikh, Aziz

    2013-01-01

    The implementation of health information technology interventions is at the forefront of most policy agendas internationally. However, such undertakings are often far from straightforward as they require complex strategic planning accompanying the systemic organizational changes associated with such programs. Building on our experiences of designing and evaluating the implementation of large-scale health information technology interventions in the USA and the UK, we highlight key lessons learned in the hope of informing the on-going international efforts of policymakers, health directorates, healthcare management, and senior clinicians. PMID:23599226

  19. Large-scale internal structure in volcanogenic breakout flood deposits: Extensive GPR survey on volcaniclastic deposits

    NASA Astrophysics Data System (ADS)

    Kataoka, K.; Gomez, C. A.

    2012-12-01

    Large-scale outburst floods from volcanic lakes, such as caldera lakes or volcanically dammed river valleys, tend to be voluminous, with total discharges of >1-10s of km³ and peak discharges of >10,000s to 100,000s of m³ s⁻¹. Such large floods can travel long distances and leave sediments and bedforms/landforms extensively, with large-scale internal structures that are difficult to assess from single local sites. Moreover, the sediments and bedforms/landforms are sometimes untraceable, and outcrop information obtained by classical geological and geomorphological field surveys is limited to the dissected/terraced parts of the fan body, road cuts and/or large quarries. Therefore, GPR (Ground Penetrating Radar), which uses the propagation of electromagnetic waves through media, seems best adapted for the appraisal of large-scale subsurface structures. Recently, studies on GPR applications to volcanic deposits have successfully captured images of lava flows and volcaniclastic deposits and proved the usefulness of this method even in volcanic areas, which often encompass complicated stratigraphy and structures with variable material, grain size, and ferromagnetic content. Using GPR, the present study aims to understand the large-scale internal structures of volcanogenic flood deposits. The survey was carried out over two volcanogenic flood fans (or aprons) in northeast Japan, at Numazawa and Towada volcanoes. The 5 ka Numazawa flood deposits in the Tadami river catchment were emplaced by a breakout flood from an ignimbrite-dammed valley, leaving pumiceous gravelly sediments with meter-sized boulders along the flow path. At Towada volcano, a comparable flood event originating from a breach in the caldera rim emplaced the 13-15 ka Sanbongi fan deposits in the Oirase river valley, which are characterized by bouldery fan deposits. The GPR data was collected following 200 to 500 m long lateral and longitudinal transects, which were captured using a GPR Pulse

  20. Searching transients in large-scale surveys. A method based on the Abbe value

    NASA Astrophysics Data System (ADS)

    Mowlavi, N.

    2014-08-01

    Aims: A new method is presented to identify transient candidates in large-scale surveys based on the variability pattern in their light curves. Methods: The method is based on the Abbe value, Ab, which estimates the smoothness of a light curve, and on a newly introduced quantity called the excess Abbe value, denoted excessAb, which estimates the regularity of the light curve variability pattern over the duration of the observations. Results: Based on simulated light curves, transients are shown to occupy a specific region in the excessAb versus Ab diagram, distinct from sources presenting pulsating-like features in their light curves or having featureless light curves. The method is tested on real light curves taken from the EROS-2 and OGLE-II surveys in a 0.50° × 0.17° field of the sky in the Large Magellanic Cloud centered at RA(J2000) = 5h25m56.5s and Dec(J2000) = -69d29m43.3s. The method identifies 43 EROS-2 transient candidates out of a total of 1300 variable stars, and 19 more OGLE-II candidates, 10 of which do not have any EROS-2 variable star matches and would need further confirmation to assess their reliability. The efficiency of the method is further tested by comparing the list of transient candidates with known Be stars in the literature. It is shown that all Be stars known in the studied field of view with detectable bursts or outbursts are successfully extracted by the method. In addition, four new transient candidates displaying bursts and/or outbursts are found in the field, of which at least two are good new Be candidates. Conclusions: The new method proves to be a potentially powerful tool to extract transient candidates from large-scale multi-epoch surveys. The better the photometric measurement uncertainties are, the cleaner the list of detected transient candidates is. In addition, the excessAb versus Ab diagram is shown to be a good diagnostic tool to check the data quality of multi-epoch photometric surveys. A trend of instrumental and/or data reduction origin
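
    A minimal sketch of the Abbe value referred to above (the ratio of the mean squared successive difference to twice the variance): smooth, transient-like light curves give small Ab, while noise-dominated ones give Ab near 1. The excess Abbe value, which compares Ab computed on sub-intervals with the full-length value, is omitted here, and the light curves below are synthetic placeholders.

```python
import numpy as np

def abbe(x):
    """Abbe value: Ab = n * sum(diff(x)**2) / (2 * (n - 1) * sum((x - mean)**2))."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    return n * np.sum(np.diff(x) ** 2) / (2.0 * (n - 1) * np.sum((x - x.mean()) ** 2))

t = np.linspace(0.0, 1.0, 100)
burst = np.exp(-0.5 * ((t - 0.3) / 0.05) ** 2)       # smooth, burst-like light curve
noise = np.random.default_rng(2).normal(size=100)    # pure noise

print(f"Ab(burst) = {abbe(burst):.3f}, Ab(noise) = {abbe(noise):.3f}")
```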

  1. Effects of unstable dark matter on large-scale structure and constraints from future surveys

    NASA Astrophysics Data System (ADS)

    Wang, Mei-Yu; Zentner, Andrew R.

    2012-02-01

    In this paper we explore the effect of decaying dark matter (DDM) on large-scale structure and possible constraints from galaxy imaging surveys. DDM models have been studied, in part, as a way to address apparent discrepancies between the predictions of standard cold dark matter models and observations of galactic structure. Our study is aimed at developing independent constraints on these models. In such models, DDM decays into a less massive, stable dark matter (SDM) particle and a significantly lighter particle. The small mass splitting between the parent DDM and the daughter SDM provides the SDM with a recoil or "kick" velocity v_k, inducing a free-streaming suppression of matter fluctuations. This suppression can be probed via weak lensing power spectra measured by a number of forthcoming imaging surveys that aim primarily to constrain dark energy. Using scales on which linear perturbation theory alone is valid (multipoles ℓ < 300), surveys like Euclid or the Large Synoptic Survey Telescope can be sensitive to v_k ≳ 90 km/s for lifetimes τ ~ 1-5 Gyr. To estimate more aggressive constraints, we model nonlinear corrections to lensing power using a simple halo evolution model that is in good agreement with numerical simulations. In our most ambitious forecasts, using multipoles ℓ < 3000, we find that imaging surveys can be sensitive to v_k ~ 10 km/s for lifetimes τ ≲ 10 Gyr. Lensing will provide a particularly interesting complement to existing constraints in that it will probe the long-lifetime regime (τ ≫ H₀⁻¹) far better than contemporary techniques. A caveat to these ambitious forecasts is that the evolution of perturbations on nonlinear scales will need to be well calibrated by numerical simulations before they can be realized. This work motivates the pursuit of such a numerical simulation campaign to constrain dark matter with cosmological weak lensing.

  2. Large-scale fluctuations in the number density of galaxies in independent surveys of deep fields

    NASA Astrophysics Data System (ADS)

    Shirokov, S. I.; Lovyagin, N. Yu.; Baryshev, Yu. V.; Gorokhov, V. L.

    2016-06-01

    New arguments supporting the reality of large-scale fluctuations in the density of the visible matter in deep galaxy surveys are presented. A statistical analysis of the radial distributions of galaxies in the COSMOS and HDF-N deep fields is presented. Independent spectral and photometric surveys exist for each field, carried out in different wavelength ranges and using different observing methods. Catalogs of photometric redshifts in the optical (COSMOS-Zphot) and infrared (UltraVISTA) were used for the COSMOS field in the redshift interval 0.1 < z < 3.5, as well as the zCOSMOS (10kZ) spectroscopic survey and the XMM-COSMOS and ALHAMBRA-F4 photometric redshift surveys. The HDFN-Zphot and ALHAMBRA-F5 catalogs of photometric redshifts were used for the HDF-N field. The Pearson correlation coefficient for the fluctuations in the numbers of galaxies obtained for independent surveys of the same deep field reaches R = 0.70 ± 0.16. The presence of this positive correlation supports the reality of fluctuations in the density of visible matter with sizes of up to 1000 Mpc and amplitudes of up to 20% at redshifts z ~ 2. The absence of correlations between the fluctuations in different fields (the correlation coefficient between COSMOS and HDF-N is R = -0.20 ± 0.31) testifies to the independence of structures visible in different directions on the celestial sphere. This also indicates an absence of any influence from universal systematic errors (such as "spectral voids"), which could imitate the detection of correlated structures.
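
    An illustrative sketch of the correlation test described above: radial count fluctuations are computed for two independent catalogues of the same field and compared with a Pearson correlation coefficient. The redshift arrays and the smoothing choice are placeholders, not the published analysis.

```python
import numpy as np
from scipy.stats import pearsonr

def radial_fluctuations(z, bins, smooth_window=5):
    """Fractional fluctuations of binned radial counts around a smoothed trend."""
    counts, _ = np.histogram(z, bins=bins)
    trend = np.convolve(counts, np.ones(smooth_window) / smooth_window, mode="same")
    return (counts - trend) / np.where(trend > 0, trend, 1.0)

bins = np.linspace(0.1, 3.5, 35)
rng = np.random.default_rng(3)
z_optical  = rng.uniform(0.1, 3.5, 20_000)   # e.g. optical photometric redshifts
z_infrared = rng.uniform(0.1, 3.5, 15_000)   # e.g. infrared photometric redshifts

r, p = pearsonr(radial_fluctuations(z_optical, bins),
                radial_fluctuations(z_infrared, bins))
print(f"Pearson R = {r:.2f} (p = {p:.3f})")
```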

  3. Inclusive constraints on unified dark matter models from future large-scale surveys

    SciTech Connect

    Camera, Stefano; Carbone, Carmelita; Moscardini, Lauro E-mail: carmelita.carbone@unibo.it

    2012-03-01

    In recent years, cosmological models where the properties of the dark components of the Universe — dark matter and dark energy — are accounted for by a single "dark fluid" have drawn increasing attention and interest. Amongst many proposals, Unified Dark Matter (UDM) cosmologies are promising candidates as effective theories. In these models, a scalar field with a non-canonical kinetic term in its Lagrangian mimics both the accelerated expansion of the Universe at late times and the clustering properties of the large-scale structure of the cosmos. However, UDM models also present peculiar behaviours, the most interesting being the fact that the perturbations in the dark-matter component of the scalar field have a non-negligible speed of sound. This gives rise to an effective Jeans scale for the Newtonian potential, below which the dark fluid no longer clusters. This implies a growth of structures fairly different from that of the concordance ΛCDM model. In this paper, we demonstrate that forthcoming large-scale surveys will be able to discriminate between viable UDM models and ΛCDM to a good degree of accuracy. To this purpose, the planned Euclid satellite will be a powerful tool, since it will provide very accurate data on galaxy clustering and the weak lensing effect of cosmic shear. Finally, we also exploit the constraining power of the ongoing CMB Planck experiment. Although our approach is the most conservative, including only well-understood, linear dynamics, in the end we also show what could be done if some amount of non-linear information were included.

  4. Survey and analysis of selected jointly owned large-scale electric utility storage projects

    SciTech Connect

    Not Available

    1982-05-01

    The objective of this study was to examine and document the issues surrounding the curtailment in commercialization of large-scale electric storage projects. It was sensed that if these issues could be uncovered, then efforts might be directed toward clearing away these barriers and allowing these technologies to penetrate the market to their maximum potential. Joint ownership of these projects was seen as a possible solution to overcoming the major barriers, particularly economic barriers, to commercialization. Therefore, discussions with partners involved in four pumped storage projects took place to identify the difficulties and advantages of joint-ownership agreements. The four plants surveyed included Yards Creek (Public Service Electric and Gas and Jersey Central Power and Light); Seneca (Pennsylvania Electric and Cleveland Electric Illuminating Company); Ludington (Consumers Power and Detroit Edison); and Bath County (Virginia Electric Power Company and Allegheny Power System, Inc.). Also investigated were several pumped storage projects which were never completed. These included Blue Ridge (American Electric Power); Cornwall (Consolidated Edison); Davis (Allegheny Power System, Inc.); and Kittatinny Mountain (General Public Utilities). Institutional, regulatory, technical, environmental, economic, and special issues at each project were investigated, and the conclusions relative to each issue are presented. The major barriers preventing the growth of energy storage are the high cost of these systems in times of extremely high cost of capital, diminishing load growth, and regulatory influences which will not allow the building of large-scale storage systems due to environmental objections or other reasons. However, the future for energy storage looks viable despite difficult economic times for the utility industry. Joint ownership can ease some of the economic hardships for utilities which demonstrate a need for energy storage.

  5. Photometric Redshifts for the Dark Energy Survey and VISTA and Implications for Large Scale Structure

    SciTech Connect

    Banerji, Manda; Abdalla, Filipe B.; Lahav, Ofer; Lin, Huan

    2007-11-01

    We conduct a detailed analysis of the photometric redshift requirements for the proposed Dark Energy Survey (DES) using two sets of mock galaxy simulations and an artificial neural network code, ANNz. In particular, we examine how optical photometry in the DES grizY bands can be complemented with near infra-red photometry from the planned VISTA Hemisphere Survey (VHS) in the JHKs bands in order to improve the photometric redshift estimate by a factor of two at z > 1. We draw attention to the effects of galaxy formation scenarios such as reddening on the photo-z estimate and, using our neural network code, calculate A_V for these reddened galaxies. We also look at the impact of using different training sets when calculating photometric redshifts. In particular, we find that using the ongoing DEEP2 and VVDS-Deep spectroscopic surveys to calibrate photometric redshifts for DES will prove effective. However we need to be aware of uncertainties in the photometric redshift bias that arise when using different training sets as these will translate into errors in the dark energy equation of state parameter, w. Furthermore, we show that the neural network error estimate on the photometric redshift may be used to remove outliers from our samples before any kind of cosmological analysis, in particular for large-scale structure experiments. By removing all galaxies with a 1σ photo-z scatter greater than 0.1 from our DES+VHS sample, we can constrain the galaxy power spectrum out to a redshift of 2 and reduce the fractional error on this power spectrum by ~15-20% compared to using the entire catalogue.
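
    The outlier cut described above can be sketched with a generic neural-network regressor rather than ANNz itself; the committee spread plays the role of the per-object photo-z error, and the synthetic magnitudes, network size and 0.1 threshold below are illustrative assumptions only.

      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.model_selection import train_test_split

      # Hypothetical training set: grizY + JHKs magnitudes with spectroscopic redshifts.
      rng = np.random.default_rng(1)
      mags = rng.normal(22.0, 1.0, size=(5000, 8))          # stand-in for real photometry
      z_spec = np.clip(0.5 + 0.1 * mags[:, 0] - 0.05 * mags[:, 5]
                       + rng.normal(0, 0.05, 5000), 0, None)

      x_train, x_test, y_train, y_test = train_test_split(mags, z_spec, random_state=0)

      # A small committee of networks; the spread of their predictions serves as a
      # rough per-object error estimate, in the spirit of the 1-sigma photo-z scatter.
      committee = [MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=2000,
                                random_state=seed).fit(x_train, y_train)
                   for seed in range(5)]
      preds = np.array([m.predict(x_test) for m in committee])
      z_phot, sigma_z = preds.mean(axis=0), preds.std(axis=0)

      clean = sigma_z < 0.1        # drop likely outliers before any clustering analysis
      print(f"kept {clean.mean():.0%} of objects; "
            f"residual scatter {np.std(z_phot[clean] - y_test[clean]):.3f}")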

  6. Conducting Large-Scale Surveys in Secondary Schools: The Case of the Youth On Religion (YOR) Project

    ERIC Educational Resources Information Center

    Madge, Nicola; Hemming, Peter J.; Goodman, Anthony; Goodman, Sue; Kingston, Sarah; Stenson, Kevin; Webster, Colin

    2012-01-01

    There are few published articles on conducting large-scale surveys in secondary schools, and this paper seeks to fill this gap. Drawing on the experiences of the Youth On Religion project, it discusses the politics of gaining access to these schools and the considerations leading to the adoption and administration of an online survey. It is…

  7. EVALUATION OF A MEASUREMENT METHOD FOR FOREST VEGETATION IN A LARGE-SCALE ECOLOGICAL SURVEY

    EPA Science Inventory

    We evaluate a field method for determining species richness and canopy cover of vascular plants for the Forest Health Monitoring Program (FHM), an ecological survey of U.S. forests. Measurements are taken within 12 1-m² quadrats on 1/15-ha plots in FHM. Species richness and cover...

  8. Ensuring Adequate Health and Safety Information for Decision Makers during Large-Scale Chemical Releases

    NASA Astrophysics Data System (ADS)

    Petropoulos, Z.; Clavin, C.; Zuckerman, B.

    2015-12-01

    The 2014 4-Methylcyclohexanemethanol (MCHM) spill in the Elk River of West Virginia highlighted existing gaps in emergency planning for, and response to, large-scale chemical releases in the United States. The Emergency Planning and Community Right-to-Know Act requires that facilities with hazardous substances provide Material Safety Data Sheets (MSDSs), which contain health and safety information on the hazardous substances. The MSDS produced by Eastman Chemical Company, the manufacturer of MCHM, listed "no data available" for various human toxicity subcategories, such as reproductive toxicity and carcinogenicity. As a result of incomplete toxicity data, the public and media received conflicting messages on the safety of the contaminated water from government officials, industry, and the public health community. Two days after the governor lifted the ban on water use, the health department partially retracted the ban by warning pregnant women to continue avoiding the contaminated water, which the Centers for Disease Control and Prevention deemed safe three weeks later. The response in West Virginia represents a failure in risk communication and calls into question whether government officials have sufficient information to support evidence-based decisions during future incidents. Research capabilities, like the National Science Foundation RAPID funding, can provide a solution to some of the data gaps, such as information on environmental fate in the case of the MCHM spill. In order to inform policy discussions on this issue, a methodology for assessing the outcomes of RAPID and similar National Institutes of Health grants in the context of emergency response is employed to examine the efficacy of research-based capabilities in enhancing public health decision making capacity. The results of this assessment highlight potential roles rapid scientific research can fill in ensuring adequate health and safety data is readily available for decision makers during large-scale chemical releases.

  9. A process for creating multimetric indices for large-scale aquatic surveys

    EPA Science Inventory

    Differences in sampling and laboratory protocols, differences in techniques used to evaluate metrics, and differing scales of calibration and application prohibit the use of many existing multimetric indices (MMIs) in large-scale bioassessments. We describe an approach to develop...

  10. ELISA: A small balloon Experiment for a Large Scale Survey in the Sub-millimeter

    NASA Astrophysics Data System (ADS)

    Bernard, J.-Ph.; Ristorcelli, I.; Stepnik, B.; Abergel, A.; Boulanger, F.; Giard, M.; Lagache, G.; Lamarre, J. M.; Meny, C.; Torre, J. P.; Armengaud, M.; Crussaire, J. P.; Leriche, B.; Longval, Y.

    2002-03-01

    HERSCHEL and the PLANCK space missions to be launched in 2007. The ELISA data will also be usable to help calibrate the observations of HERSCHEL and PLANCK and to plan the large-scale surveys to be undertaken with HERSCHEL. In line with these objectives, three flights of the ELISA experiment, including one from the Southern hemisphere, are foreseen in the period from 2004 to 2006. The ELISA project is carried out by an international collaboration including France (CESR, IAS, CEA, CNES), the Netherlands (SSD/ESTEC), Denmark (DSRI), England (QMW), the USA (JPL/Caltech), and Italy (ASI).

  11. Evaluating large-scale health programmes at a district level in resource-limited countries.

    PubMed

    Svoronos, Theodore; Mate, Kedar S

    2011-11-01

    Recent experience in evaluating large-scale global health programmes has highlighted the need to consider contextual differences between sites implementing the same intervention. Traditional randomized controlled trials are ill-suited for this purpose, as they are designed to identify whether an intervention works, not how, when and why it works. In this paper we review several evaluation designs that attempt to account for contextual factors that contribute to intervention effectiveness. Using these designs as a base, we propose a set of principles that may help to capture information on context. Finally, we propose a tool, called a driver diagram, that is traditionally used in implementation and would allow evaluators to systematically monitor changing dynamics in project implementation and identify contextual variation across sites. We describe an implementation-related example from South Africa to underline the strengths of the tool. If used across multiple sites and multiple projects, the resulting driver diagrams could be pooled together to form a generalized theory for how, when and why a widely used intervention works. Mechanisms similar to the driver diagram are urgently needed to complement existing evaluations of large-scale implementation efforts. PMID:22084529

  12. Linking Errors in Trend Estimation in Large-Scale Surveys: A Case Study. Research Report. ETS RR-10-10

    ERIC Educational Resources Information Center

    Xu, Xueli; von Davier, Matthias

    2010-01-01

    One of the major objectives of large-scale educational surveys is reporting trends in academic achievement. For this purpose, a substantial number of items are carried from one assessment cycle to the next. The linking process that places academic abilities measured in different assessments on a common scale is usually based on a concurrent…

  13. Human-Machine Cooperation in Large-Scale Multimedia Retrieval: A Survey

    ERIC Educational Resources Information Center

    Shirahama, Kimiaki; Grzegorzek, Marcin; Indurkhya, Bipin

    2015-01-01

    "Large-Scale Multimedia Retrieval" (LSMR) is the task to fast analyze a large amount of multimedia data like images or videos and accurately find the ones relevant to a certain semantic meaning. Although LSMR has been investigated for more than two decades in the fields of multimedia processing and computer vision, a more…

  14. Health risks from large-scale water pollution: trends in Central Asia.

    PubMed

    Törnqvist, Rebecka; Jarsjö, Jerker; Karimov, Bakhtiyor

    2011-02-01

    Limited data on the pollution status of spatially extensive water systems constrain health-risk assessments at basin scales. Using a recipient measurement approach in a terminal water body, we show that agricultural and industrial pollutants in groundwater-surface water systems of the Aral Sea Drainage Basin (covering the main part of Central Asia) yield cumulative health hazards above guideline values in downstream surface waters, due to high concentrations of copper, arsenic, nitrite, and to a certain extent dichlorodiphenyltrichloroethane (DDT). Considering these high-impact contaminants, we furthermore perform trend analyses of their upstream spatial-temporal distribution, investigating dominant large-scale spreading mechanisms. The ratio between parent DDT and its degradation products showed that discharges into or depositions onto surface waters are likely to be recent or ongoing. In river water, copper concentrations peak during the spring season, after thawing and snow melt. High spatial variability of arsenic concentrations in river water could reflect its local presence in the top soil of nearby agricultural fields. Overall, groundwaters were associated with much higher health risks than surface waters. Health risks can therefore increase considerably if the downstream population must switch to groundwater-based drinking water supplies during surface water shortages. Arid regions are generally vulnerable to this problem due to ongoing irrigation expansion and climate changes. PMID:21131050
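
    A cumulative hazard of the kind referred to above is commonly expressed as a hazard index, the sum of concentration-to-guideline ratios; the sketch below uses illustrative concentrations and WHO-style drinking-water guideline values for demonstration only, not the study's actual risk model.

      # Hazard quotients (concentration / guideline) and their sum, the hazard index.
      guideline_mg_per_l = {"copper": 2.0, "arsenic": 0.01, "nitrite": 3.0, "DDT": 0.001}
      sample_mg_per_l = {"copper": 1.1, "arsenic": 0.012, "nitrite": 2.0, "DDT": 0.0004}

      hq = {s: sample_mg_per_l[s] / guideline_mg_per_l[s] for s in sample_mg_per_l}
      hazard_index = sum(hq.values())
      print({s: round(q, 2) for s, q in hq.items()}, "HI =", round(hazard_index, 2))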

  15. A survey on routing protocols for large-scale wireless sensor networks.

    PubMed

    Li, Changle; Zhang, Hanxiao; Hao, Binbin; Li, Jiandong

    2011-01-01

    With the advances in micro-electronics, wireless sensor devices have been made much smaller and more integrated, and large-scale wireless sensor networks (WSNs), based on the cooperation among a significant number of nodes, have become a hot topic. "Large-scale" mainly refers to a large network area or a high node density. Accordingly, the routing protocols must scale well as the network extent and node density increase. A sensor node is normally energy-limited and cannot be recharged, and thus its energy consumption has a significant effect on the scalability of the protocol. To the best of our knowledge, the mainstream methods to solve the energy problem in large-scale WSNs are currently the hierarchical routing protocols. In a hierarchical routing protocol, all the nodes are divided into several groups with different assignment levels. The nodes at the high level are responsible for data aggregation and management work, and the low-level nodes for sensing their surroundings and collecting information. Hierarchical routing protocols have proved to be more energy-efficient than flat ones, in which all the nodes play the same role, especially in terms of data aggregation and the flooding of control packets. With a focus on the hierarchical structure, in this paper we provide an insight into routing protocols designed specifically for large-scale WSNs. According to their different objectives, the protocols are generally classified based on different criteria such as control overhead reduction, energy consumption mitigation and energy balance. In order to gain a comprehensive understanding of each protocol, we highlight their innovative ideas, describe the underlying principles in detail and analyze their advantages and disadvantages. Moreover, a comparison of the routing protocols is conducted to demonstrate the differences between them in terms of message complexity, memory requirements, localization, data aggregation, clustering manner and
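
    As one concrete example of the cluster-head election that hierarchical protocols rely on, the sketch below implements a LEACH-style probabilistic rotation; LEACH is used here purely as an illustration and is not singled out by the survey, and the head fraction p is a placeholder.

      import random

      def leach_threshold(p, r):
          # LEACH threshold T(n) for a node that has not yet served as cluster head
          # in the current rotation cycle (p = desired head fraction, r = round number).
          return p / (1 - p * (r % int(1 / p)))

      def elect_cluster_heads(node_ids, p=0.05, r=0, eligible=None):
          # Each eligible node independently elects itself head with probability T(n).
          eligible = set(node_ids) if eligible is None else eligible
          t = leach_threshold(p, r)
          return [n for n in node_ids if n in eligible and random.random() < t]

      heads = elect_cluster_heads(range(200), p=0.05, r=3)
      print(f"round 3: {len(heads)} cluster heads elected")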

  16. A Survey on Routing Protocols for Large-Scale Wireless Sensor Networks

    PubMed Central

    Li, Changle; Zhang, Hanxiao; Hao, Binbin; Li, Jiandong

    2011-01-01

    With the advances in micro-electronics, wireless sensor devices have been made much smaller and more integrated, and large-scale wireless sensor networks (WSNs), based on the cooperation among a significant number of nodes, have become a hot topic. “Large-scale” mainly refers to a large network area or a high node density. Accordingly, the routing protocols must scale well as the network extent and node density increase. A sensor node is normally energy-limited and cannot be recharged, and thus its energy consumption has a significant effect on the scalability of the protocol. To the best of our knowledge, the mainstream methods to solve the energy problem in large-scale WSNs are currently the hierarchical routing protocols. In a hierarchical routing protocol, all the nodes are divided into several groups with different assignment levels. The nodes at the high level are responsible for data aggregation and management work, and the low-level nodes for sensing their surroundings and collecting information. Hierarchical routing protocols have proved to be more energy-efficient than flat ones, in which all the nodes play the same role, especially in terms of data aggregation and the flooding of control packets. With a focus on the hierarchical structure, in this paper we provide an insight into routing protocols designed specifically for large-scale WSNs. According to their different objectives, the protocols are generally classified based on different criteria such as control overhead reduction, energy consumption mitigation and energy balance. In order to gain a comprehensive understanding of each protocol, we highlight their innovative ideas, describe the underlying principles in detail and analyze their advantages and disadvantages. Moreover, a comparison of the routing protocols is conducted to demonstrate the differences between them in terms of message complexity, memory requirements, localization, data aggregation, clustering manner

  17. Public knowledge and preventive behavior during a large-scale Salmonella outbreak: results from an online survey in the Netherlands

    PubMed Central

    2014-01-01

    Background Food-borne Salmonella infections are a worldwide concern. During a large-scale outbreak, it is important that the public follows preventive advice. To increase compliance, insight in how the public gathers its knowledge and which factors determine whether or not an individual complies with preventive advice is crucial. Methods In 2012, contaminated salmon caused a large Salmonella Thompson outbreak in the Netherlands. During the outbreak, we conducted an online survey (n = 1,057) to assess the general public’s perceptions, knowledge, preventive behavior and sources of information. Results Respondents perceived Salmonella infections and the 2012 outbreak as severe (m = 4.21; five-point scale with 5 as severe). Their knowledge regarding common food sources, the incubation period and regular treatment of Salmonella (gastro-enteritis) was relatively low (e.g., only 28.7% knew that Salmonella is not normally treated with antibiotics). Preventive behavior differed widely, and the majority (64.7%) did not check for contaminated salmon at home. Most information about the outbreak was gathered through traditional media and news and newspaper websites. This was mostly determined by time spent on the medium. Social media played a marginal role. Wikipedia seemed a potentially important source of information. Conclusions To persuade the public to take preventive actions, public health organizations should deliver their message primarily through mass media. Wikipedia seems a promising instrument for educating the public about food-borne Salmonella. PMID:24479614

  18. Child Maltreatment Experience among Primary School Children: A Large Scale Survey in Selangor State, Malaysia

    PubMed Central

    Ahmed, Ayesha; Wan-Yuen, Choo; Marret, Mary Joseph; Guat-Sim, Cheah; Othman, Sajaratulnisah; Chinna, Karuthan

    2015-01-01

    Official reports of child maltreatment in Malaysia have persistently increased throughout the last decade. However, there is a lack of population surveys evaluating the actual burden of child maltreatment, its correlates and its consequences in the country. This cross-sectional study employed two-stage stratified cluster random sampling of public primary schools to survey 3,509 ten- to twelve-year-old school children in Selangor state. It aimed to estimate the prevalence of parental physical and emotional maltreatment, parental neglect and teacher-inflicted physical maltreatment. It further aimed to examine the associations between child maltreatment and important socio-demographic factors, family functioning and symptoms of depression among children. Logistic regression on weighted samples was used to extend results to a population level. Three-quarters of 10–12-year-olds reported at least one form of maltreatment, with parental physical maltreatment being most common. Males had higher odds of maltreatment in general except for emotional maltreatment. Ethnicity and parental conflict were key factors associated with maltreatment. The study contributes important evidence towards improving public health interventions for child maltreatment prevention in the country. PMID:25786214

  19. Child maltreatment experience among primary school children: a large scale survey in Selangor state, Malaysia.

    PubMed

    Ahmed, Ayesha; Wan-Yuen, Choo; Marret, Mary Joseph; Guat-Sim, Cheah; Othman, Sajaratulnisah; Chinna, Karuthan

    2015-01-01

    Official reports of child maltreatment in Malaysia have persistently increased throughout the last decade. However, there is a lack of population surveys evaluating the actual burden of child maltreatment, its correlates and its consequences in the country. This cross-sectional study employed two-stage stratified cluster random sampling of public primary schools to survey 3,509 ten- to twelve-year-old school children in Selangor state. It aimed to estimate the prevalence of parental physical and emotional maltreatment, parental neglect and teacher-inflicted physical maltreatment. It further aimed to examine the associations between child maltreatment and important socio-demographic factors, family functioning and symptoms of depression among children. Logistic regression on weighted samples was used to extend results to a population level. Three-quarters of 10-12-year-olds reported at least one form of maltreatment, with parental physical maltreatment being most common. Males had higher odds of maltreatment in general except for emotional maltreatment. Ethnicity and parental conflict were key factors associated with maltreatment. The study contributes important evidence towards improving public health interventions for child maltreatment prevention in the country. PMID:25786214

  20. Body burden of cadmium and its related factors: a large-scale survey in China.

    PubMed

    Ke, Shen; Cheng, Xi-Yu; Li, Hao; Jia, Wen-Jing; Zhang, Jie-Ying; Luo, Hui-Fang; Wang, Zi-Ling; Chen, Zhi-Nan

    2015-04-01

    A survey of more than 6000 participants from four distinct non-polluted and polluted regions in China was conducted to evaluate the body burden of cadmium (Cd) in Chinese populations using urinary Cd (UCd) as a biomarker. The findings revealed that the UCd level was 1.24 μg/g creatinine (μg/g cr) for the sample population from non-polluted Shanghai, and the UCd levels exceeded 5 μg/g cr, which is the health-based exposure limit set by the World Health Organization (WHO), in 1.1% of people. The mean UCd levels in the moderately polluted areas (Hubei and Liaoning) and the highly polluted area (Guizhou) were 4.69, 3.62 and 6.08 μg/g cr, respectively, and these levels were 2.9 to 4.9 times the levels observed in Shanghai. Notably, the UCd levels exceeded the recently updated human biomonitoring II values (i.e., intervention or "action level") in 44.8%-87.9% of people from these areas compared to only 5.1%-21.4% of people in Shanghai. The corresponding prevalence of elevated UCd levels (>WHO threshold, 5 μg/g cr) was also significantly higher (30.7% to 63.8% vs. 1.1%), which indicates elevated Cd-induced health risks to residents in these areas. Age and region were significant determinants for UCd levels in a population, whereas gender did not significantly influence UCd. PMID:25594907

  1. Workplace Bullying and Sleep Disturbances: Findings from a Large Scale Cross-Sectional Survey in the French Working Population

    PubMed Central

    Niedhammer, Isabelle; David, Simone; Degioanni, Stéphanie; Drummond, Anne; Philip, Pierre

    2009-01-01

    Study Objectives: The purpose of this study was to explore the associations between workplace bullying, the characteristics of workplace bullying, and sleep disturbances in a large sample of employees of the French working population. Design: Workplace bullying, evaluated using the validated instrument developed by Leymann, and sleep disturbances, as well as covariates, were measured using a self-administered questionnaire. Covariates included age, marital status, presence of children, education, occupation, working hours, night work, physical and chemical exposures at work, self-reported health, and depressive symptoms. Statistical analysis was performed using logistic regression analysis and was carried out separately for men and women. Setting: General working population. Participants: The study population consisted of a random sample of 3132 men and 4562 women of the working population in the southeast of France. Results: Workplace bullying was strongly associated with sleep disturbances. Past exposure to bullying also increased the risk for this outcome. The more frequent the exposure to bullying, the higher the risk of experiencing sleep disturbances. Observing someone else being bullied in the workplace was also associated with the outcome. Adjustment for covariates did not modify the results. Additional adjustment for self-reported health and depressive symptoms diminished the magnitude of the associations, which remained significant. Conclusions: The prevalence of workplace bullying (around 10%) was found to be high in this study, as was the impact of this major job-related stressor on sleep disturbances. Although no conclusion about causality could be drawn from this cross-sectional study, the findings suggest that the contribution of workplace bullying to the burden of sleep disturbances may be substantial. Citation: Niedhammer I; David S; Degioanni S; Drummond A; Philip P. Workplace bullying and sleep disturbances: findings from a large scale cross-sectional survey in the French working population.
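
    A sex-stratified logistic regression of the kind described above might look like the following sketch; the variables, coding and synthetic data are hypothetical stand-ins for the study's questionnaire items.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      # Hypothetical analysis dataset: one row per respondent.
      rng = np.random.default_rng(7)
      n = 3000
      df = pd.DataFrame({
          "sleep_disturbance": rng.integers(0, 2, n),   # outcome (yes/no)
          "bullied": rng.integers(0, 2, n),             # exposure to workplace bullying
          "age": rng.integers(20, 60, n),
          "night_work": rng.integers(0, 2, n),
          "sex": rng.choice(["men", "women"], n),
      })

      # Fit separately for men and women, adjusting for covariates, as in the study design.
      for sex, grp in df.groupby("sex"):
          x = sm.add_constant(grp[["bullied", "age", "night_work"]])
          fit = sm.Logit(grp["sleep_disturbance"], x).fit(disp=0)
          print(sex, "odds ratio for bullying = %.2f" % np.exp(fit.params["bullied"]))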

  2. Multi-stage sampling for large scale natural resources surveys: A case study of rice and waterfowl

    USGS Publications Warehouse

    Stafford, J.D.; Reinecke, K.J.; Kaminski, R.M.; Gerard, P.D.

    2005-01-01

    Large-scale sample surveys to estimate abundance and distribution of organisms and their habitats are increasingly important in ecological studies. Multi-stage sampling (MSS) is especially suited to large-scale surveys because of the natural clustering of resources. To illustrate an application, we: (1) designed a stratified MSS to estimate late autumn abundance (kg/ha) of rice seeds in harvested fields as food for waterfowl wintering in the Mississippi Alluvial Valley (MAV); (2) investigated options for improving the MSS design; and (3) compared statistical and cost efficiency of MSS to simulated simple random sampling (SRS). During 2000-2002, we sampled 25-35 landowners per year, 1 or 2 fields per landowner per year, and measured seed mass in 10 soil cores collected within each field. Analysis of variance components and costs for each stage of the survey design indicated that collecting 10 soil cores per field was near the optimum of 11-15, whereas sampling >1 field per landowner provided few benefits because data from fields within landowners were highly correlated. Coefficients of variation (CV) of annual estimates of rice abundance ranged from 0.23 to 0.31 and were limited by variation among landowners and the number of landowners sampled. Design effects representing the statistical efficiency of MSS relative to SRS ranged from 3.2 to 9.0, and simulations indicated SRS would cost, on average, 1.4 times more than MSS because clustering of sample units in MSS decreased travel costs. We recommend MSS as a potential sampling strategy for large-scale natural resource surveys and specifically for future surveys of the availability of rice as food for waterfowl in the MAV and similar areas.
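
    The design-effect comparison between multi-stage and simple random sampling can be sketched as follows, with synthetic landowner/field/core data standing in for the rice-seed measurements; the cost terms and variance-component optimisation of the actual design are omitted.

      import numpy as np

      rng = np.random.default_rng(2)

      # Synthetic two-stage sample: 30 landowners, 1 field each, 10 soil cores per field.
      landowner_means = rng.normal(100.0, 30.0, 30)          # kg/ha, between-landowner spread
      cores = np.array([rng.normal(m, 10.0, 10) for m in landowner_means])

      field_means = cores.mean(axis=1)                        # stage-2 averages
      estimate = field_means.mean()                           # overall mean seed mass
      se_cluster = field_means.std(ddof=1) / np.sqrt(len(field_means))

      # Standard error an SRS of the same total number of cores would have given.
      se_srs = cores.std(ddof=1) / np.sqrt(cores.size)
      design_effect = (se_cluster / se_srs) ** 2

      print(f"mean = {estimate:.1f} kg/ha, CV = {se_cluster / estimate:.2f}, "
            f"design effect = {design_effect:.1f}")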

  3. A Survey of Residents' Perceptions of the Effect of Large-Scale Economic Developments on Perceived Safety, Violence, and Economic Benefits

    PubMed Central

    Fabio, Anthony; Geller, Ruth; Bazaco, Michael; Bear, Todd M.; Foulds, Abigail L.; Duell, Jessica; Sharma, Ravi

    2015-01-01

    Background. Emerging research highlights the promise of community- and policy-level strategies in preventing youth violence. Large-scale economic developments, such as sports and entertainment arenas and casinos, may improve the living conditions, economics, public health, and overall wellbeing of area residents and may influence rates of violence within communities. Objective. To assess the effect of community economic development efforts on neighborhood residents' perceptions on violence, safety, and economic benefits. Methods. Telephone survey in 2011 using a listed sample of randomly selected numbers in six Pittsburgh neighborhoods. Descriptive analyses examined measures of perceived violence and safety and economic benefit. Responses were compared across neighborhoods using chi-square tests for multiple comparisons. Survey results were compared to census and police data. Results. Residents in neighborhoods with the large-scale economic developments reported more casino-specific and arena-specific economic benefits. However, 42% of participants in the neighborhood with the entertainment arena felt there was an increase in crime, and 29% of respondents from the neighborhood with the casino felt there was an increase. In contrast, crime decreased in both neighborhoods. Conclusions. Large-scale economic developments have a direct influence on the perception of violence, despite actual violence rates. PMID:26273310
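
    The chi-square comparisons mentioned above reduce to contingency-table tests of perception proportions across neighborhoods; the counts below are illustrative only, not the survey's data.

      import numpy as np
      from scipy.stats import chi2_contingency

      # Hypothetical counts of residents who perceived an increase in crime (yes/no),
      # by neighborhood; the survey compared such proportions with chi-square tests.
      table = np.array([[84, 116],    # arena neighborhood  (~42% "increase")
                        [58, 142],    # casino neighborhood (~29% "increase")
                        [30, 170]])   # comparison neighborhood

      chi2, p, dof, _ = chi2_contingency(table)
      print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.4f}")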

  4. Assessing the Hypothesis of Measurement Invariance in the Context of Large-Scale International Surveys

    ERIC Educational Resources Information Center

    Rutkowski, Leslie; Svetina, Dubravka

    2014-01-01

    In the field of international educational surveys, equivalence of achievement scale scores across countries has received substantial attention in the academic literature; however, only a relatively recent emphasis on scale score equivalence in nonachievement education surveys has emerged. Given the current state of research in multiple-group…

  5. A Large-Scale Radio Polarization Survey of the Southern Sky at 21cm

    NASA Astrophysics Data System (ADS)

    Testori, J. C.; Reich, P.; Reich, W.

    2004-02-01

    We have successfully reduced the polarization data from the recently published 21 cm continuum survey of the southern sky carried out with a 30-m antenna at Villa Elisa (Argentina). We describe the reduction and calibration methods of the survey. The result is a fully sampled survey, which covers declinations from -90 degrees to -10 degrees with a typical rms-noise of 15 mK TB. The map of polarized intensity shows large regions with smooth low-level emission, but also a number of enhanced high-latitude features. Most of these regions have no counterpart in total intensity and indicate Faraday active regions.

  6. A Strong-Lens Survey in AEGIS: the Influence of Large Scale Structure

    SciTech Connect

    Moustakas, Leonidas A.; Marshall, Phil J.; Newman, Jeffrey A.; Coil, Alison L.; Cooper, Michael C.; Davis, Marc; Fassnacht, Christopher D.; Guhathakurta, Puragra; Hopkins, Andrew; Koekemoer, Anton; Konidaris, Nicholas P.; Lotz, Jennifer M.; Willmer, Christopher N.A.

    2006-07-14

    We report on the results of a visual search for galaxy-scale strong gravitational lenses over 650 arcmin² of HST/ACS imaging in the Extended Groth Strip (EGS). These deep F606W- and F814W-band observations are in the DEEP2-EGS field. In addition to a previously-known Einstein Cross also found by our search (the "Cross", HSTJ141735+52264, with z_lens = 0.8106 and a published z_source = 3.40), we identify two new strong galaxy-galaxy lenses with multiple extended arcs. The first, HSTJ141820+52361 (the "Dewdrop"; z_lens = 0.5798), lenses two distinct extended sources into two pairs of arcs (z_source = 0.9818 by nebular [O II] emission), while the second, HSTJ141833+52435 (the "Anchor"; z_lens = 0.4625), produces a single pair of arcs (source redshift not yet known). Four less convincing arc/counter-arc and two-image lens candidates are also found and presented for completeness. All three definite lenses are fit reasonably well by simple singular isothermal ellipsoid models including external shear, giving χ²_ν values close to unity. Using the three-dimensional line-of-sight (LOS) information on galaxies from the DEEP2 data, we calculate the convergence and shear contributions κ_los and γ_los to each lens, assuming singular isothermal sphere halos truncated at 200 h⁻¹ kpc. These are compared against a robust measure of local environment, δ_3, a normalized density that uses the distance to the third nearest neighbor. We find that even strong lenses in demonstrably underdense local environments may be considerably affected by LOS contributions, which in turn, under the adopted assumptions, may be underestimates of the effect of large scale structure.
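
    The environment measure δ_3 can be approximated from the distance to the third nearest neighbor; the sketch below uses a simple 3/(π d₃²) surface-density estimate normalized by the sample mean, which may differ in detail from the paper's definition, and the positions are synthetic.

      import numpy as np
      from scipy.spatial import cKDTree

      def delta3(positions):
          # Normalized local density from the distance to the 3rd nearest neighbour:
          # n_i ~ 3 / (pi * d3_i^2), divided by the mean density of the sample.
          tree = cKDTree(positions)
          # k=4 because the first neighbour returned is the point itself.
          d, _ = tree.query(positions, k=4)
          d3 = d[:, 3]
          local = 3.0 / (np.pi * d3 ** 2)
          return local / local.mean()

      rng = np.random.default_rng(3)
      xy = rng.uniform(0, 100, size=(2000, 2))    # hypothetical projected galaxy positions
      print("median delta_3 =", np.round(np.median(delta3(xy)), 2))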

  7. GLOBAL CLIMATE AND LARGE-SCALE INFLUENCES ON AQUATIC ANIMAL HEALTH

    EPA Science Inventory

    The last 3 decades have witnessed numerous large-scale mortality events of aquatic organisms in North America. Affected species range from ecologically-important sea urchins to commercially-valuable American lobsters and protected marine mammals. Short-term forensic investigation...

  8. Large-scale structure: the Chile-UK uv-excess quasar survey

    NASA Astrophysics Data System (ADS)

    Clowes, R. G.; Newman, P. R.; Campusano, L. E.; Graham, M. J.

    1996-12-01

    We report the first results from a new-generation survey for quasars using the 2°-field, 128-fibre, multi-object spectrograph on the 2.5-m du Pont telescope at Las Campanas Observatory in Chile. Survey candidates are all objects with (U-B) < -0.3 and B ≤ 19.7 on digitized UK Schmidt plates. The survey will cover 140 deg² and produce a homogeneous, magnitude-limited catalogue of ~1500 quasars with redshifts 0.4 ≤ z ≤ 2.2. We have so far surveyed 18.7 deg² and identified 183 quasars, including all 43 previously-published quasars within the selection criteria. The survey will be used to study in detail the large (~200 h⁻¹ Mpc) quasar group discovered at z ≈ 1.3 by Clowes & Campusano (1991, MNRAS, 249, 218) -- the largest known structure in the early Universe -- and to study the clustering of quasars in general. The group was found with sparse sampling of quasar candidates across 25 deg²; it strikes the boundaries of this area. Our spectroscopic survey will include all candidates in an area around the group of 100 deg², plus a 40 deg² control area ~34° away. This survey should allow the determination of the full extent, membership and statistical significance of the group, using the MST method of Graham, Clowes and Campusano (1995, MNRAS, 275, 790). Preliminary analysis of our new data shows that the group persists with increased membership. The measurement of the density contrast of the quasar group will be compared with theoretical expectations, and so determine the consistency of the group with formation from Gaussian density fluctuations. We will search for sub-clustering in the group and test the hypothesis that all small-scale (≤10 h⁻¹ Mpc) quasar clustering is attributable to large groups. Our sample will allow further investigation of the clustering of quasars in general. We will also identify and characterise any other large quasar groups in the survey using the MST method.
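
    The MST group-finding idea cited above (Graham, Clowes & Campusano 1995) amounts to building a minimal spanning tree over the quasar positions and cutting edges longer than a linking length; the positions, units and 30 Mpc/h cut in the sketch below are hypothetical.

      import numpy as np
      from scipy.sparse.csgraph import minimum_spanning_tree, connected_components
      from scipy.spatial.distance import pdist, squareform

      def mst_groups(points, max_edge):
          # Cut MST edges longer than max_edge and return a group label per point.
          dist = squareform(pdist(points))
          mst = minimum_spanning_tree(dist).toarray()
          mst[mst > max_edge] = 0.0                  # sever long edges
          _, labels = connected_components(mst != 0, directed=False)
          return labels

      rng = np.random.default_rng(4)
      pos = rng.uniform(0, 500, size=(300, 3))       # hypothetical comoving positions, Mpc/h
      labels = mst_groups(pos, max_edge=30.0)
      print("largest candidate group:", np.bincount(labels).max(), "members")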

  9. A composite large-scale CO survey at high Galactic latitudes in the second quadrant

    NASA Technical Reports Server (NTRS)

    Heithausen, A.; Stacy, J. G.; Thaddeus, P.

    1990-01-01

    Surveys of the second quadrant of the Galaxy undertaken with the CfA 1.2-m telescope have been combined to produce a map of approximately 620 sq deg in the 2.6-mm CO (J = 1-0) line at high Galactic latitudes. CO was detected over about 13 percent of the region surveyed, an order of magnitude more gas by area than previously estimated. In contrast, only 26 percent of the area predicted by Desert et al. (1988) to contain molecular gas actually reveals CO, and about two-thirds of the clouds detected are not listed in their catalog of IR excess clouds.

  10. Children's Attitudes about Nuclear War: Results of Large-Scale Surveys of Adolescents.

    ERIC Educational Resources Information Center

    Doctor, Ronald M.; And Others

    A three-section survey instrument was developed to provide descriptive and expressive information about teenagers' attitudes and fear reactions related to the nuclear threat. The first section consisted of one open-ended statement, "Write down your three greatest worries." The second section consisted of 20 areas of potential worry or concern…

  11. Nonparametric Bayesian Multiple Imputation for Incomplete Categorical Variables in Large-Scale Assessment Surveys

    ERIC Educational Resources Information Center

    Si, Yajuan; Reiter, Jerome P.

    2013-01-01

    In many surveys, the data comprise a large number of categorical variables that suffer from item nonresponse. Standard methods for multiple imputation, like log-linear models or sequential regression imputation, can fail to capture complex dependencies and can be difficult to implement effectively in high dimensions. We present a fully Bayesian,…

  12. Large-scale clustering of galaxies in the CfA Redshift Survey

    NASA Technical Reports Server (NTRS)

    Vogeley, Michael S.; Park, Changbom; Geller, Margaret J.; Huchra, John P.

    1992-01-01

    The power spectrum of the galaxy distribution in the Center for Astrophysics Redshift Survey (de Lapparent et al., 1986; Geller and Huchra, 1989; and Huchra et al., 1992) is measured up to wavelengths of 200/h Mpc. Results are compared with several cosmological simulations with Gaussian initial conditions. It is shown that the power spectrum of the standard CDM model is inconsistent with the observed power spectrum at the 99 percent confidence level.
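
    A power spectrum of the kind measured above can be estimated from a gridded overdensity field with an FFT; the sketch below ignores the survey window, selection function and shot-noise corrections that the actual analysis requires, and the grid and box size are placeholders.

      import numpy as np

      def power_spectrum(delta, box_size, n_bins=20):
          # Spherically averaged P(k) of an overdensity field on a cubic grid.
          n = delta.shape[0]
          delta_k = np.fft.rfftn(delta)
          power = (np.abs(delta_k) ** 2) * (box_size ** 3) / n ** 6

          k = 2 * np.pi * np.fft.fftfreq(n, d=box_size / n)
          kz = 2 * np.pi * np.fft.rfftfreq(n, d=box_size / n)
          kmag = np.sqrt(k[:, None, None] ** 2 + k[None, :, None] ** 2 + kz[None, None, :] ** 2)

          bins = np.linspace(kmag[kmag > 0].min(), kmag.max(), n_bins + 1)
          idx = np.digitize(kmag.ravel(), bins)
          pk = np.array([power.ravel()[idx == i].mean() for i in range(1, n_bins + 1)])
          return 0.5 * (bins[1:] + bins[:-1]), pk

      rng = np.random.default_rng(5)
      field = rng.normal(size=(64, 64, 64))               # stand-in for a gridded overdensity field
      k_centres, pk = power_spectrum(field, box_size=200.0)   # box size in Mpc/h, hypothetical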

  13. A composite large-scale CO survey at high galactic latitudes in the second quadrant

    NASA Astrophysics Data System (ADS)

    Heithausen, A.; Stacy, J. G.; de Vries, H. W.; Mebold, U.; Thaddeus, P.

    1993-02-01

    Surveys undertaken in the 2nd quadrant of the Galaxy with the CfA 1.2 m telescope have been combined to produce a map covering about 620 sq deg in the 2.6 mm CO(J = 1 - 0) line at high galactic latitudes. There is CO emission from molecular 'cirrus' clouds in about 13 percent of the region surveyed. The CO clouds are grouped together into three major cloud complexes with 29 individual members. All clouds are associated with infrared emission at 100 microns, although there is no one-to-one correlation between the corresponding intensities. CO emission is detected in all bright and dark Lynds' nebulae cataloged in that region; however, not all CO clouds are visible on optical photographs as reflection or absorption features. The clouds are probably local. At an adopted distance of 240 pc, cloud sizes range from 0.1 to 30 pc and cloud masses from 1 to 1600 solar masses. The molecular cirrus clouds contribute between 0.4 and 0.8 solar masses per square parsec to the surface density of molecular gas in the galactic plane. Only 26 percent of the 'infrared-excess clouds' in the area surveyed actually show CO and about 2/3 of the clouds detected in CO do not show an infrared excess.

  14. A composite large-scale CO survey at high galactic latitudes in the second quadrant

    NASA Technical Reports Server (NTRS)

    Heithausen, A.; Stacy, J. G.; De Vries, H. W.; Mebold, U.; Thaddeus, P.

    1993-01-01

    Surveys undertaken in the 2nd quadrant of the Galaxy with the CfA 1.2 m telescope have been combined to produce a map covering about 620 sq deg in the 2.6 mm CO(J = 1 - 0) line at high galactic latitudes. There is CO emission from molecular 'cirrus' clouds in about 13 percent of the region surveyed. The CO clouds are grouped together into three major cloud complexes with 29 individual members. All clouds are associated with infrared emission at 100 microns, although there is no one-to-one correlation between the corresponding intensities. CO emission is detected in all bright and dark Lynds' nebulae cataloged in that region; however, not all CO clouds are visible on optical photographs as reflection or absorption features. The clouds are probably local. At an adopted distance of 240 pc, cloud sizes range from 0.1 to 30 pc and cloud masses from 1 to 1600 solar masses. The molecular cirrus clouds contribute between 0.4 and 0.8 solar masses per square parsec to the surface density of molecular gas in the galactic plane. Only 26 percent of the 'infrared-excess clouds' in the area surveyed actually show CO and about 2/3 of the clouds detected in CO do not show an infrared excess.

  15. Implementing large-scale workforce change: learning from 55 pilot sites of allied health workforce redesign in Queensland, Australia

    PubMed Central

    2013-01-01

    Background Increasingly, health workforces are undergoing high-level ‘re-engineering’ to help them better meet the needs of the population, workforce and service delivery. Queensland Health implemented a large scale 5-year workforce redesign program across more than 13 health-care disciplines. This study synthesized the findings from this program to identify and codify mechanisms associated with successful workforce redesign to help inform other large workforce projects. Methods This study used Inductive Logic Reasoning (ILR), a process that uses logic models as the primary functional tool to develop theories of change, which are subsequently validated through proposition testing. Initial theories of change were developed from a systematic review of the literature and synthesized using a logic model. These theories of change were then developed into propositions and subsequently tested empirically against documentary, interview, and survey data from 55 projects in the workforce redesign program. Results Three overarching principles were identified that optimized successful workforce redesign: (1) drivers for change need to be close to practice; (2) contexts need to be supportive both at the local levels and legislatively; and (3) mechanisms should include appropriate engagement, resources to facilitate change management, governance, and support structures. Attendance to these factors was uniformly associated with success of individual projects. Conclusions ILR is a transparent and reproducible method for developing and testing theories of workforce change. Despite the heterogeneity of projects, professions, and approaches used, a consistent set of overarching principles underpinned success of workforce change interventions. These concepts have been operationalized into a workforce change checklist. PMID:24330616

  16. The Muenster Red Sky Survey: Large-scale structures in the universe

    NASA Astrophysics Data System (ADS)

    Ungruhe, R.; Seitter, W. C.; Duerbeck, H. W.

    2003-01-01

    We present a large-scale galaxy catalogue for the red spectral region which covers an area of 5,000 square degrees. It contains positions, red magnitudes, radii, ellipticities and position angles of about 5.5 million galaxies. Together with the APM catalogue (4,300 square degrees) in the blue spectral region, this catalogue forms at present the largest coherent database for cosmological investigations in the southern hemisphere. 217 ESO Southern Sky Atlas R Schmidt plates with galactic latitudes below -45 degrees were digitized with the two PDS microdensitometers of the Astronomisches Institut Münster, with a step width of 15 microns, corresponding to 1.01 arcseconds per pixel. All data were stored on different storage media and are available for further investigations. Suitable search parameters must be chosen in such a way that all objects are found on the plates, and that the percentage of artificial objects remains as low as possible. Based on two reference areas on different plates, a search threshold of 140 PDS density units and a minimum number of four pixels per object were chosen. The detected objects were stored, according to size, in frames of different side lengths. Each object was investigated in its frame, and 18 object parameters were determined. The classification of objects into stars, galaxies and perturbed objects was done with an automatic procedure which makes use of combinations of computed object parameters. In the first step, the perturbed objects are removed from the catalogue. Double objects and noise objects can be excluded on the basis of symmetry properties, while for satellite trails, a new classification criterion based on apparent magnitude, effective radius and apparent ellipticity was developed. For the remaining objects, a star/galaxy separation was carried out. For bright objects, the relation between apparent magnitude and effective radius serves as the discriminating property; for fainter objects, the relation between effective

  17. Climate, Water, and Human Health: Large Scale Hydroclimatic Controls in Forecasting Cholera Epidemics

    NASA Astrophysics Data System (ADS)

    Akanda, A. S.; Jutla, A. S.; Islam, S.

    2009-12-01

    Although cholera has ravaged the continents through seven global pandemics in past centuries, the seasonal and interannual variability of its outbreaks remains a mystery. Previous studies have focused on the role of various environmental and climatic factors, but provided little or no predictive capability. Recent findings suggest a more prominent role of large scale hydroclimatic extremes - droughts and floods - and attempt to explain the seasonality and the unique dual cholera peaks in the Bengal Delta region of South Asia. We investigate the seasonal and interannual nature of cholera epidemiology in three geographically distinct locations within the region to identify the larger scale hydroclimatic controls that can set the ecological and environmental ‘stage’ for outbreaks and have significant memory on a seasonal scale. Here we show that two distinctly different, pre- and post-monsoon, cholera transmission mechanisms related to large scale climatic controls prevail in the region. An implication of our findings is that extreme climatic events such as prolonged droughts, record floods, and major cyclones may cause major disruption in the ecosystem and trigger large epidemics. We postulate that a quantitative understanding of the large-scale hydroclimatic controls and dominant processes with significant system memory will form the basis for forecasting such epidemic outbreaks. A multivariate regression method using these predictor variables to develop probabilistic forecasts of cholera outbreaks will be explored. Forecasts from such a system with a seasonal lead-time are likely to have measurable impact on early cholera detection and prevention efforts in endemic regions.

  18. A satellite geodetic survey of large-scale deformation of volcanic centres in the central Andes.

    PubMed

    Pritchard, Matthew E; Simons, Mark

    2002-07-11

    Surface deformation in volcanic areas usually indicates movement of magma or hydrothermal fluids at depth. Stratovolcanoes tend to exhibit a complex relationship between deformation and eruptive behaviour. The characteristically long time spans between such eruptions require a long time series of observations to determine whether deformation without an eruption is common at a given edifice. Such studies, however, are logistically difficult to carry out in most volcanic arcs, as these tend to be remote regions with large numbers of volcanoes (hundreds to even thousands). Here we present a satellite-based interferometric synthetic aperture radar (InSAR) survey of the remote central Andes volcanic arc, a region formed by subduction of the Nazca oceanic plate beneath continental South America. Spanning the years 1992 to 2000, our survey reveals the background level of activity of about 900 volcanoes, 50 of which have been classified as potentially active. We find four centres of broad (tens of kilometres wide), roughly axisymmetric surface deformation. None of these centres are at volcanoes currently classified as potentially active, although two lie within about 10 km of volcanoes with known activity. Source depths inferred from the patterns of deformation lie between 5 and 17 km. In contrast to the four new sources found, we do not observe any deformation associated with recent eruptions of Lascar, Chile. PMID:12110886

  19. A satellite geodetic survey of large-scale deformation of volcanic centres in the central Andes

    NASA Astrophysics Data System (ADS)

    Pritchard, Matthew E.; Simons, Mark

    2002-07-01

    Surface deformation in volcanic areas usually indicates movement of magma or hydrothermal fluids at depth. Stratovolcanoes tend to exhibit a complex relationship between deformation and eruptive behaviour. The characteristically long time spans between such eruptions require a long time series of observations to determine whether deformation without an eruption is common at a given edifice. Such studies, however, are logistically difficult to carry out in most volcanic arcs, as these tend to be remote regions with large numbers of volcanoes (hundreds to even thousands). Here we present a satellite-based interferometric synthetic aperture radar (InSAR) survey of the remote central Andes volcanic arc, a region formed by subduction of the Nazca oceanic plate beneath continental South America. Spanning the years 1992 to 2000, our survey reveals the background level of activity of about 900 volcanoes, 50 of which have been classified as potentially active. We find four centres of broad (tens of kilometres wide), roughly axisymmetric surface deformation. None of these centres are at volcanoes currently classified as potentially active, although two lie within about 10 km of volcanoes with known activity. Source depths inferred from the patterns of deformation lie between 5 and 17 km. In contrast to the four new sources found, we do not observe any deformation associated with recent eruptions of Lascar, Chile.

  20. Digital Archiving of People Flow by Recycling Large-Scale Social Survey Data of Developing Cities

    NASA Astrophysics Data System (ADS)

    Sekimoto, Y.; Watanabe, A.; Nakamura, T.; Horanont, T.

    2012-07-01

    Data on people flow have become increasingly important in the field of business, including the areas of marketing and public services. Although mobile phones enable a person's position to be located to a certain degree, it is a challenge to acquire sufficient data from people with mobile phones. In order to grasp people flow in its entirety, it is important to establish a practical method of reconstructing people flow from various kinds of existing fragmentary spatio-temporal data, such as social survey data. For example, although typical Person Trip (PT) survey data collected by the public sector record only fragmentary spatio-temporal positions, the data are attractive because the sample size is large enough to estimate the entire flow of people. In this study, we apply our proposed basic method to Japan International Cooperation Agency (JICA) PT data pertaining to developing cities around the world, and we propose some correction methods to resolve the difficulties in applying it stably to many cities and to infrastructure data.

  1. A Spatio-Temporally Explicit Random Encounter Model for Large-Scale Population Surveys.

    PubMed

    Jousimo, Jussi; Ovaskainen, Otso

    2016-01-01

    Random encounter models can be used to estimate population abundance from indirect data collected by non-invasive sampling methods, such as track counts or camera-trap data. The classical Formozov-Malyshev-Pereleshin (FMP) estimator converts track counts into an estimate of mean population density, assuming that data on the daily movement distances of the animals are available. We utilize generalized linear models with spatio-temporal error structures to extend the FMP estimator into a flexible Bayesian modelling approach that estimates not only total population size, but also spatio-temporal variation in population density. We also introduce a weighting scheme to estimate density on habitats that are not covered by survey transects, assuming that movement data on a subset of individuals is available. We test the performance of spatio-temporal and temporal approaches by a simulation study mimicking the Finnish winter track count survey. The results illustrate how the spatio-temporal modelling approach is able to borrow information from observations made on neighboring locations and times when estimating population density, and that spatio-temporal and temporal smoothing models can provide improved estimates of total population size compared to the FMP method. PMID:27611683
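
    The classical FMP estimator referred to above converts track crossings per unit transect length into a density given the mean daily movement distance, D = (π/2)·x/(S·d); the sketch below applies that formula to made-up transect data, not the Finnish survey's records.

      import numpy as np

      def fmp_density(crossings, transect_length_km, daily_movement_km):
          # Classical FMP estimator: D = (pi/2) * x / (S * d), animals per km^2.
          return (np.pi / 2.0) * crossings / (transect_length_km * daily_movement_km)

      # Hypothetical winter track-count data for one survey region.
      crossings = np.array([12, 7, 19, 4])          # track crossings per transect
      lengths = np.array([4.0, 3.5, 4.0, 2.5])      # transect lengths, km
      daily_movement = 2.8                          # mean daily movement distance, km

      density = fmp_density(crossings.sum(), lengths.sum(), daily_movement)
      print(f"estimated density: {density:.2f} animals per km^2")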

  2. Statistical analysis of large scale surveys for constraining the Galaxy evolution

    NASA Astrophysics Data System (ADS)

    Martins, A. M. M.; Robin, A. C.

    2014-07-01

    The formation and evolution of the thick disc of the Milky Way remain controversial. We make use of the Besançon Galaxy model, which among other utilities can be used for data interpretation and to test different scenarios of galaxy formation and evolution. We examine these questions by studying the metallicity distribution of the thin and thick discs with the help of a sample of Main Sequence turn-off stars from the SEGUE survey. We developed a tool based on an MCMC-ABC method to determine the metallicity distribution and study the correlation between the fitted parameters. We obtained a local metallicity of the thick disc of -0.47 ± 0.03 dex, similar to previous studies, and the thick disc shows no gradient. A flat gradient in the thick disc can be a consequence of radial mixing or the result of a strong turbulent gaseous disc.
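
    A bare-bones rejection-ABC loop conveys the flavour of the MCMC-ABC fit described above, here recovering the mean and width of a Gaussian metallicity distribution from summary statistics; the priors, tolerance and synthetic "observations" are assumptions, not the authors' setup.

      import numpy as np

      rng = np.random.default_rng(6)

      # "Observed" metallicities: a synthetic stand-in for thick-disc turn-off stars.
      obs = rng.normal(-0.47, 0.25, 800)
      obs_summary = np.array([obs.mean(), obs.std()])

      def simulate(mu, sigma, n=800):
          s = rng.normal(mu, sigma, n)
          return np.array([s.mean(), s.std()])

      # Rejection ABC: keep prior draws whose simulated summaries lie close to the data.
      accepted = []
      for _ in range(20000):
          mu, sigma = rng.uniform(-1.0, 0.0), rng.uniform(0.05, 0.6)
          if np.linalg.norm(simulate(mu, sigma) - obs_summary) < 0.05:
              accepted.append((mu, sigma))

      accepted = np.array(accepted)
      print("posterior mean [Fe/H]: %.2f +/- %.2f"
            % (accepted[:, 0].mean(), accepted[:, 0].std()))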

  3. Large-scale survey to describe acne management in Brazilian clinical practice

    PubMed Central

    Seité, Sophie; Caixeta, Clarice; Towersey, Loan

    2015-01-01

    Background Acne is a chronic disease of the pilosebaceous unit that mainly affects adolescents. It is the most common dermatological problem, affecting approximately 80% of teenagers between 12 and 18 years of age. Diagnosis is clinical and is based on the patient’s age at the time the lesions first appear, and on its polymorphism, type of lesions, and their anatomical location. The right treatment for the right patient is key to treating acne safely. The aim of this investigational survey was to evaluate how Brazilian dermatologists in private practice currently manage acne. Materials and methods Dermatologists practicing in 12 states of Brazil were asked how they manage patients with grades I, II, III, and IV acne. Each dermatologist completed a written questionnaire about patient characteristics, acne severity, and the therapy they usually prescribe for each situation. Results In total, 596 dermatologists were interviewed. Adolescents presented as the most common acneic population received by dermatologists, and the most common acne grade was grade II. The doctors could choose more than one type of treatment for each patient, and treatment choices varied according to acne severity. A great majority of dermatologists considered treatment with drugs as the first alternative for all acne grades, choosing either topical or oral presentation depending on the pathology severity. Dermocosmetics were chosen mostly as adjunctive therapy, and their inclusion in the treatment regimen decreased as acne grades increased. Conclusion This survey illustrates that Brazilian dermatologists employ complex treatment regimens to manage acne, choosing systemic drugs, particularly isotretinoin, even in some cases of grade I acne, and heavily prescribe antibiotics. Because complex regimens are harder for patients to comply with, this result notably raises the question of adherence, which is a key factor in successful treatment. PMID:26609243

  4. Studying Displacement After a Disaster Using Large Scale Survey Methods: Sumatra After the 2004 Tsunami

    PubMed Central

    Gray, Clark; Frankenberg, Elizabeth; Gillespie, Thomas; Sumantri, Cecep; Thomas, Duncan

    2014-01-01

    Understanding of human vulnerability to environmental change has advanced in recent years, but measuring vulnerability and interpreting mobility across many sites differentially affected by change remains a significant challenge. Drawing on longitudinal data collected on the same respondents who were living in coastal areas of Indonesia before the 2004 Indian Ocean tsunami and were re-interviewed after the tsunami, this paper illustrates how the combination of population-based survey methods, satellite imagery and multivariate statistical analyses has the potential to provide new insights into vulnerability, mobility and impacts of major disasters on population well-being. The data are used to map and analyze vulnerability to post-tsunami displacement across the provinces of Aceh and North Sumatra and to compare patterns of migration after the tsunami between damaged areas and areas not directly affected by the tsunami. The comparison reveals that migration after a disaster is less selective overall than migration in other contexts. Gender and age, for example, are strong predictors of moving from undamaged areas but are not related to displacement in areas experiencing damage. In our analyses traditional predictors of vulnerability do not always operate in expected directions. Low levels of socioeconomic status and education were not predictive of moving after the tsunami, although for those who did move, they were predictive of displacement to a camp rather than a private home. This survey-based approach, though not without difficulties, is broadly applicable to many topics in human-environment research, and potentially opens the door to rigorous testing of new hypotheses in this literature. PMID:24839300

  5. SDSS-III Baryon Oscillation Spectroscopic Survey Data Release 12: galaxy target selection and large-scale structure catalogues

    NASA Astrophysics Data System (ADS)

    Reid, Beth; Ho, Shirley; Padmanabhan, Nikhil; Percival, Will J.; Tinker, Jeremy; Tojeiro, Rita; White, Martin; Eisenstein, Daniel J.; Maraston, Claudia; Ross, Ashley J.; Sánchez, Ariel G.; Schlegel, David; Sheldon, Erin; Strauss, Michael A.; Thomas, Daniel; Wake, David; Beutler, Florian; Bizyaev, Dmitry; Bolton, Adam S.; Brownstein, Joel R.; Chuang, Chia-Hsun; Dawson, Kyle; Harding, Paul; Kitaura, Francisco-Shu; Leauthaud, Alexie; Masters, Karen; McBride, Cameron K.; More, Surhud; Olmstead, Matthew D.; Oravetz, Daniel; Nuza, Sebastián E.; Pan, Kaike; Parejko, John; Pforr, Janine; Prada, Francisco; Rodríguez-Torres, Sergio; Salazar-Albornoz, Salvador; Samushia, Lado; Schneider, Donald P.; Scóccola, Claudia G.; Simmons, Audrey; Vargas-Magana, Mariana

    2016-01-01

    The Baryon Oscillation Spectroscopic Survey (BOSS), part of the Sloan Digital Sky Survey (SDSS) III project, has provided the largest survey of galaxy redshifts available to date, in terms of both the number of galaxy redshifts measured by a single survey, and the effective cosmological volume covered. Key to analysing the clustering of these data to provide cosmological measurements is understanding the detailed properties of this sample. Potential issues include variations in the target catalogue caused by changes either in the targeting algorithm or properties of the data used, the pattern of spectroscopic observations, the spatial distribution of targets for which redshifts were not obtained, and variations in the target sky density due to observational systematics. We document here the target selection algorithms used to create the galaxy samples that comprise BOSS. We also present the algorithms used to create large-scale structure catalogues for the final Data Release (DR12) samples and the associated random catalogues that quantify the survey mask. The algorithms are an evolution of those used by the BOSS team to construct catalogues from earlier data, and have been designed to accurately quantify the galaxy sample. The code used, designated MKSAMPLE, is released with this paper.
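
    The random catalogues mentioned above sample the unmasked sky uniformly so that the survey geometry can be divided out of clustering estimators. Below is a minimal sketch of that idea under stated assumptions: the footprint test is a toy placeholder and this is not the MKSAMPLE code.

```python
import numpy as np

def random_catalogue(n_points, in_footprint, seed=12345):
    """Draw points uniformly on the celestial sphere and keep those inside a footprint.

    A uniform sky density requires RA uniform in [0, 360) deg and Dec uniform in
    sin(Dec); `in_footprint(ra, dec) -> bool array` is a stand-in for the real
    survey mask, which in practice encodes completeness and vetoed regions.
    """
    rng = np.random.default_rng(seed)
    ra = rng.uniform(0.0, 360.0, n_points)
    dec = np.degrees(np.arcsin(rng.uniform(-1.0, 1.0, n_points)))
    keep = in_footprint(ra, dec)
    return ra[keep], dec[keep]

# Toy rectangular footprint standing in for the actual DR12 mask.
ra_r, dec_r = random_catalogue(
    100_000, lambda ra, dec: (ra > 110) & (ra < 260) & (dec > -5) & (dec < 60))
```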

  6. SDSS-III Baryon Oscillation Spectroscopic Survey data release 12: Galaxy target selection and large-scale structure catalogues

    SciTech Connect

    Reid, Beth; Ho, Shirley; Padmanabhan, Nikhil; Percival, Will J.; Tinker, Jeremy; Tojeiro, Rita; White, Martin; Eisenstein, Daniel J.; Maraston, Claudia; Ross, Ashley J.; Sanchez, Ariel G.; Schlegel, David; Sheldon, Erin; Strauss, Michael A.; Thomas, Daniel; Wake, David; Beutler, Florian; Bizyaev, Dmitry; Bolton, Adam S.; Brownstein, Joel R.; Chuang, Chia-Hsun; Dawson, Kyle; Harding, Paul; Kitaura, Francisco-Shu; Leauthaud, Alexie; Masters, Karen; McBride, Cameron K.; More, Surhud; Olmstead, Matthew D.; Oravetz, Daniel; Nuza, Sebastian E.; Pan, Kaike; Parejko, John; Pforr, Janine; Prada, Francisco; Rodriguez-Torres, Sergio; Salazar-Albornoz, Salvador; Samushia, Lado; Schneider, Donald P.; Scoccola, Claudia G.; Simmons, Audrey; Vargas-Magana, Mariana

    2015-11-17

    The Baryon Oscillation Spectroscopic Survey (BOSS), part of the Sloan Digital Sky Survey (SDSS) III project, has provided the largest survey of galaxy redshifts available to date, in terms of both the number of galaxy redshifts measured by a single survey, and the effective cosmological volume covered. Key to analysing the clustering of these data to provide cosmological measurements is understanding the detailed properties of this sample. Potential issues include variations in the target catalogue caused by changes either in the targeting algorithm or properties of the data used, the pattern of spectroscopic observations, the spatial distribution of targets for which redshifts were not obtained, and variations in the target sky density due to observational systematics. We document here the target selection algorithms used to create the galaxy samples that comprise BOSS. We also present the algorithms used to create large-scale structure catalogues for the final Data Release (DR12) samples and the associated random catalogues that quantify the survey mask. The algorithms are an evolution of those used by the BOSS team to construct catalogues from earlier data, and have been designed to accurately quantify the galaxy sample. Furthermore, the code used, designated mksample, is released with this paper.

  7. SDSS-III Baryon Oscillation Spectroscopic Survey data release 12: Galaxy target selection and large-scale structure catalogues

    DOE PAGES Beta

    Reid, Beth; Ho, Shirley; Padmanabhan, Nikhil; Percival, Will J.; Tinker, Jeremy; Tojeiro, Rita; White, Martin; Eisenstein, Daniel J.; Maraston, Claudia; Ross, Ashley J.; et al

    2015-11-17

    The Baryon Oscillation Spectroscopic Survey (BOSS), part of the Sloan Digital Sky Survey (SDSS) III project, has provided the largest survey of galaxy redshifts available to date, in terms of both the number of galaxy redshifts measured by a single survey, and the effective cosmological volume covered. Key to analysing the clustering of these data to provide cosmological measurements is understanding the detailed properties of this sample. Potential issues include variations in the target catalogue caused by changes either in the targeting algorithm or properties of the data used, the pattern of spectroscopic observations, the spatial distribution of targets for which redshifts were not obtained, and variations in the target sky density due to observational systematics. We document here the target selection algorithms used to create the galaxy samples that comprise BOSS. We also present the algorithms used to create large-scale structure catalogues for the final Data Release (DR12) samples and the associated random catalogues that quantify the survey mask. The algorithms are an evolution of those used by the BOSS team to construct catalogues from earlier data, and have been designed to accurately quantify the galaxy sample. Furthermore, the code used, designated mksample, is released with this paper.

  8. Large-scale distribution of surface ozone mixing ratio in southern Mongolia: A survey

    NASA Astrophysics Data System (ADS)

    Meixner, F. X.; Behrendt, T.; Ermel, M.; Hempelmann, N.; Andreae, M. O.; Jöckel, P.

    2012-04-01

    For the first time, measurements of surface ozone mixing ratio have been performed from semi-arid steppe to the arid/hyper-arid southern Mongolian Gobi desert. During 12-29 August 2009, ozone mixing ratio was continuously measured from a mobile platform (4x4 Furgon SUV). The survey (3060 km / 229,171 km2) started at the Mongolian capital Ulaan-Baatar (47.9582° N, 107.0190° E), headed south-west (Echin Gol, 43.2586° N, 99.0255° E), then eastward to Dalanzadgad (43.6061° N, 104.4445° E), and finally back to Ulaan-Baatar. Ambient air was sampled (approx. 1 l/min) through a 4 m long PTFE intake line along a forward-facing boom mounted on the roof of the SUV. Ozone mixing ratio was measured by UV spectroscopy using a mobile dual-cell ozone analyzer (model 205, 2B Technologies, Boulder, U.S.A.). While ozone signals were measured every 5 seconds, 1-minute averages and standard deviations were calculated on-line and stored in the data logger. The latter are used to identify and discriminate against unrealistically low or high ozone mixing ratios caused by occasionally passing plumes of vehicle exhaust and/or biomass-burning gases, as well as gasoline (at gas filling stations). Even under desert conditions, the temporal behaviour of ozone mixing ratio was characterized by considerable and regular diel variations. Minimum mixing ratios (15-25 ppb) occurred early in the morning (approx. 06:00 local), when surface depletion of ozone (by dry deposition) cannot be compensated by supply from the free troposphere owing to the thermodynamic stability of the nocturnal boundary layer. Late in the afternoon (approx. 17:00 local), under conditions of a turbulently well-mixed convective boundary layer, maximum ozone mixing ratios (45-55 ppb) were reached. Daily amplitudes of the diel cycle of ozone mixing ratio ranged from about 30 ppb (steppe) and 20 ppb (arid desert) to approx. 5 ppb (hyper-arid Gobi desert, Shargyn Gobi). Ozone surface measurements were
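
    A rough sketch of the on-line reduction described above (5 s readings averaged to 1 min values, with suspect minutes flagged) is given below; the column names and the screening threshold are assumptions for illustration, not values from the survey.

```python
import pandas as pd

def one_minute_stats(ozone_ppb: pd.Series, max_std: float = 5.0) -> pd.DataFrame:
    """Aggregate 5-second ozone readings to 1-minute means and standard deviations.

    `ozone_ppb` is a Series of mixing ratios (ppb) with a DatetimeIndex. Minutes
    whose within-minute standard deviation exceeds `max_std` are flagged as
    suspect, a crude stand-in for the plume screening mentioned in the abstract.
    """
    grouped = ozone_ppb.resample("1min")
    stats = pd.DataFrame({"mean_ppb": grouped.mean(), "std_ppb": grouped.std()})
    stats["suspect"] = stats["std_ppb"] > max_std
    return stats
```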

  9. Implementation of a large-scale hospital information infrastructure for multi-unit health-care services.

    PubMed

    Yoo, Sun K; Kim, Dong Keun; Kim, Jung C; Park, Youn Jung; Chang, Byung Chul

    2008-01-01

    With the increase in demand for high quality medical services, the need for an innovative hospital information system has become essential. An improved system has been implemented in all hospital units of the Yonsei University Health System. Interoperability between multi-units required appropriate hardware infrastructure and software architecture. This large-scale hospital information system encompassed PACS (Picture Archiving and Communications Systems), EMR (Electronic Medical Records) and ERP (Enterprise Resource Planning). It involved two tertiary hospitals and 50 community hospitals. The monthly data production rate by the integrated hospital information system is about 1.8 TByte and the total quantity of data produced so far is about 60 TByte. Large scale information exchange and sharing will be particularly useful for telemedicine applications. PMID:18430292

  10. A global survey of martian central mounds: Central mounds as remnants of previously more extensive large-scale sedimentary deposits

    NASA Astrophysics Data System (ADS)

    Bennett, Kristen A.; Bell, James F.

    2016-01-01

    We conducted a survey of central mounds within large (>25 km diameter) impact craters on Mars. We use mound locations, mound offsets within their host craters, and relative mound heights to address and extend various mound formation hypotheses. The results of this survey support the hypothesis that mound sediments once filled their host craters and were later eroded into the features we observe today. The majority of mounds are located near the boundaries of previously identified large-scale sedimentary deposits. We discuss the implications of the hypothesis that central mounds are part of previously more extensive sedimentary units that filled and overtopped the underlying impact craters. In this scenario, as erosion of the sedimentary unit occurred, the sediment within impact craters was preserved slightly longer than the overlying sediment because it was sheltered by the crater walls. Our study also reveals that most mounds are offset from the center of their host crater in the same direction as the present regional winds (e.g., the mounds in Arabia Terra are offset towards the western portion of their craters). We propose that this implies that wind has been the dominant agent eroding the central mounds. Mound offset (r) is normalized to each crater's radius, and mound offset (θ) is measured such that 0° is north and 270° is west.

  11. Pre- and Postnatal Influences on Preschool Mental Health: A Large-Scale Cohort Study

    ERIC Educational Resources Information Center

    Robinson, Monique; Oddy, Wendy H.; Li, Jianghong; Kendall, Garth E.; de Klerk, Nicholas H.; Silburn, Sven R.; Zubrick, Stephen R.; Newnham, John P.; Stanley, Fiona J.; Mattes, Eugen

    2008-01-01

    Background: Methodological challenges such as confounding have made the study of the early determinants of mental health morbidity problematic. This study aims to address these challenges in investigating antenatal, perinatal and postnatal risk factors for the development of mental health problems in pre-school children in a cohort of Western…

  12. AGN and QSOs in the eROSITA All-Sky Survey. II. The large-scale structure

    NASA Astrophysics Data System (ADS)

    Kolodzig, Alexander; Gilfanov, Marat; Hütsi, Gert; Sunyaev, Rashid

    2013-10-01

    The four-year X-ray all-sky survey (eRASS) of the eROSITA telescope aboard the Spektrum-Roentgen-Gamma satellite will detect about 3 million active galactic nuclei (AGN) with a median redshift of z ≈ 1 and a typical luminosity of L_0.5-2.0 keV ~ 10^44 erg s^-1. We show that this unprecedented AGN sample, complemented with redshift information, will supply us with outstanding opportunities for large-scale structure research. For the first time, detailed redshift- and luminosity-resolved studies of the bias factor for X-ray selected AGN will become possible. The eRASS AGN sample will not only improve the redshift and luminosity resolution of these studies, but will also expand their luminosity range beyond L_0.5-2.0 keV ~ 10^44 erg s^-1, thus enabling a direct comparison of the clustering properties of luminous X-ray AGN and optical quasars. These studies will dramatically improve our understanding of the AGN environment, triggering mechanisms, the growth of supermassive black holes and their co-evolution with dark matter halos. The eRASS AGN sample will become a powerful cosmological probe. It will enable detecting baryonic acoustic oscillations (BAOs) for the first time with X-ray selected AGN. With the data from the entire extragalactic sky, BAO will be detected at a ≳10σ confidence level in the full redshift range and with ~8σ confidence in the 0.8 < z < 2.0 range, which is currently not covered by any existing BAO surveys. To exploit the full potential of the eRASS AGN sample, photometric and spectroscopic surveys of large areas and a sufficient depth will be needed.

  13. The SRG/eROSITA All-Sky Survey: A new era of large-scale structure studies with AGN

    NASA Astrophysics Data System (ADS)

    Kolodzig, Alexander; Gilfanov, Marat; Hütsi, Gert; Sunyaev, Rashid

    2015-08-01

    The four-year X-ray All-Sky Survey (eRASS) of the eROSITA telescope aboard the Spektrum-Roentgen-Gamma (SRG) satellite will detect about 3 million active galactic nuclei (AGN) with a median redshift of z ~ 1 and a typical luminosity of L_0.5-2.0 keV ~ 10^44 erg/s. We demonstrate that this unprecedented AGN sample, complemented with redshift information, will supply us with outstanding opportunities for large-scale structure (LSS) studies. We show that with this sample of X-ray selected AGN, it will become possible for the first time to perform detailed redshift- and luminosity-resolved studies of AGN clustering. This enables us to put strong constraints on different AGN triggering/fueling models as a function of AGN environment, which will dramatically improve our understanding of super-massive black hole growth and its correlation with the co-evolving LSS. Further, the eRASS AGN sample will become a powerful cosmological probe. We demonstrate for the first time that, given the breadth and depth of eRASS, it will become possible to convincingly detect baryonic acoustic oscillations (BAOs) with ~8σ confidence in the 0.8 < z < 2.0 range, currently not covered by any existing BAO survey. Finally, we discuss the requirements for follow-up missions and demonstrate that in order to fully exploit the potential of the eRASS AGN sample, photometric and spectroscopic surveys of large areas and a sufficient depth will be needed.

  14. Health-2000: an integrated large-scale expert system for the hospital of the future.

    PubMed

    Boyom, S F; Kwankam, S Y; Asoh, D A; Asaah, C; Kengne, F

    1997-02-01

    Decision making and management are problems which plague health systems in developing countries, particularly in Sub-Saharan Africa where there is significant waste of resources. The need goes beyond national health management information systems, to tools required in daily micro-management of various components of the health system. This paper describes an integrated expert system, Health-2000, an information-oriented tool for acquiring, processing and disseminating medical knowledge, data and decisions in the hospital of the future. It integrates six essential features of the medical care environment: personnel management, patient management, medical diagnosis, laboratory management, propharmacy, and equipment management. Disease conditions covered are the major tropical diseases. An intelligent tutoring feature completes the package. Emphasis is placed on the graphical user interface to facilitate interactions between the user and the system, which is developed for PCs using Pascal, C, Clipper and Prolog. PMID:9242002

  15. Awareness and Concern about Large-Scale Livestock and Poultry: Results from a Statewide Survey of Ohioans

    ERIC Educational Resources Information Center

    Sharp, Jeff; Tucker, Mark

    2005-01-01

    The development of large-scale livestock facilities has become a controversial issue in many regions of the U.S. in recent years. In this research, rural-urban differences in familiarity and concern about large-scale livestock facilities among Ohioans are examined, as well as the relationship of social distance from agriculture and trust in risk…

  16. Health Benefits from Large-Scale Ozone Reduction in the United States

    PubMed Central

    Berman, Jesse D.; Fann, Neal; Hollingsworth, John W.; Pinkerton, Kent E.; Rom, William N.; Szema, Anthony M.; Breysse, Patrick N.; White, Ronald H.

    2012-01-01

    Background: Exposure to ozone has been associated with adverse health effects, including premature mortality and cardiopulmonary and respiratory morbidity. In 2008, the U.S. Environmental Protection Agency (EPA) lowered the primary (health-based) National Ambient Air Quality Standard (NAAQS) for ozone to 75 ppb, expressed as the fourth-highest daily maximum 8-hr average over a 24-hr period. Based on recent monitoring data, U.S. ozone levels still exceed this standard in numerous locations, resulting in avoidable adverse health consequences. Objectives: We sought to quantify the potential human health benefits from achieving the current primary NAAQS standard of 75 ppb and two alternative standard levels, 70 and 60 ppb, which represent the range recommended by the U.S. EPA Clean Air Scientific Advisory Committee (CASAC). Methods: We applied health impact assessment methodology to estimate the numbers of deaths and other adverse health outcomes that would have been avoided during 2005, 2006, and 2007 if the current (or lower) NAAQS ozone standards had been met. Estimated reductions in ozone concentrations were interpolated according to geographic area and year, and concentration–response functions were obtained or derived from the epidemiological literature. Results: We estimated that the annual number of avoided ozone-related premature deaths would have ranged from 1,410 to 2,480 at 75 ppb, from 2,450 to 4,130 at 70 ppb, and from 5,210 to 7,990 at 60 ppb. Acute respiratory symptoms would have been reduced by 3 million cases and school-loss days by 1 million cases annually if the current 75-ppb standard had been attained. Substantially greater health benefits would have resulted if the CASAC-recommended range of standards (70–60 ppb) had been met. Conclusions: Attaining a more stringent primary ozone standard would significantly reduce ozone-related premature mortality and morbidity. PMID:22809899
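
    The health impact assessment step summarized above typically combines a concentration-response coefficient from the epidemiological literature with baseline rates and population counts. A generic log-linear sketch follows; the coefficient and inputs are illustrative assumptions, not values used in the study.

```python
import math

def avoided_cases(beta: float, delta_c_ppb: float, baseline_rate: float, population: float) -> float:
    """Estimated avoided cases for an ozone reduction of delta_c_ppb.

    Uses the common log-linear health impact form
        delta_y = y0 * (1 - exp(-beta * delta_c)) * population,
    where y0 is the baseline incidence rate per person and beta the
    concentration-response coefficient per ppb. All numbers are placeholders.
    """
    return baseline_rate * (1.0 - math.exp(-beta * delta_c_ppb)) * population

# Illustrative only: 5 ppb reduction, 0.8% annual baseline mortality, 1 million people.
print(avoided_cases(beta=0.0004, delta_c_ppb=5.0, baseline_rate=0.008, population=1_000_000))
```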

  17. Monitoring and Evaluating the Transition of Large-Scale Programs in Global Health

    PubMed Central

    Bao, James; Rodriguez, Daniela C; Paina, Ligia; Ozawa, Sachiko; Bennett, Sara

    2015-01-01

    Purpose: Donors are increasingly interested in the transition and sustainability of global health programs as priorities shift and external funding declines. Systematic and high-quality monitoring and evaluation (M&E) of such processes is rare. We propose a framework and related guiding questions to systematize the M&E of global health program transitions. Methods: We conducted stakeholder interviews, searched the peer-reviewed and gray literature, gathered feedback from key informants, and reflected on author experiences to build a framework on M&E of transition and to develop guiding questions. Findings: The conceptual framework models transition as a process spanning pre-transition and transition itself and extending into sustained services and outcomes. Key transition domains include leadership, financing, programming, and service delivery, and relevant activities that drive the transition in these domains forward include sustaining a supportive policy environment, creating financial sustainability, developing local stakeholder capacity, communicating to all stakeholders, and aligning programs. Ideally transition monitoring would begin prior to transition processes being implemented and continue for some time after transition has been completed. As no set of indicators will be applicable across all types of health program transitions, we instead propose guiding questions and illustrative quantitative and qualitative indicators to be considered and adapted based on the transition domains identified as most important to the particular health program transition. The M&E of transition faces new and unique challenges, requiring measuring constructs to which evaluators may not be accustomed. Many domains hinge on measuring “intangibles” such as the management of relationships. Monitoring these constructs may require a compromise between rigorous data collection and the involvement of key stakeholders. Conclusion: Monitoring and evaluating transitions in global

  18. Automation of Survey Data Processing, Documentation and Dissemination: An Application to Large-Scale Self-Reported Educational Survey.

    ERIC Educational Resources Information Center

    Shim, Eunjae; Shim, Minsuk K.; Felner, Robert D.

    Automation of the survey process has proved successful in many industries, yet it is still underused in educational research. This is largely due to the facts that (1) number crunching is usually carried out using software developed before modern information technology existed, and (2) educational research is to a great extent trapped…

  19. Large-Scale Survey Findings Inform Patients’ Experiences in Using Secure Messaging to Engage in Patient-Provider Communication and Self-Care Management: A Quantitative Assessment

    PubMed Central

    Patel, Nitin R; Lind, Jason D; Antinori, Nicole

    2015-01-01

    Background Secure email messaging is part of a national transformation initiative in the United States to promote new models of care that support enhanced patient-provider communication. To date, only a limited number of large-scale studies have evaluated users’ experiences in using secure email messaging. Objective To quantitatively assess veteran patients’ experiences in using secure email messaging in a large patient sample. Methods A cross-sectional mail-delivered paper-and-pencil survey study was conducted with a sample of respondents identified as registered for the Veterans Health Administration’s Web-based patient portal (My HealtheVet) who had opted to use secure messaging. The survey collected demographic data and assessed computer literacy, health literacy, and secure messaging use. Analyses conducted on survey data include frequencies and proportions, chi-square tests, and one-way analysis of variance. Results The majority of respondents (N=819) reported using secure messaging for 6 months or longer (n=499, 60.9%). They reported secure messaging to be helpful for completing medication refills (n=546, 66.7%), managing appointments (n=343, 41.9%), looking up test results (n=350, 42.7%), and asking health-related questions (n=340, 41.5%). Notably, some respondents reported using secure messaging to address sensitive health topics (n=67, 8.2%). Survey responses indicated that younger age (P=.039) and higher levels of education (P=.025) and income (P=.003) were associated with more frequent use of secure messaging. Females were more likely to report using secure messaging more often, compared with their male counterparts (P=.098). Minorities were more likely to report using secure messaging more often, at least once a month, compared with nonminorities (P=.086). Individuals with higher levels of health literacy reported more frequent use of secure messaging (P=.007), greater satisfaction (P=.002), and indicated that secure messaging is a useful (P=.002) and easy
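
    The frequency comparisons mentioned in the methods (chi-square tests on proportions of users across groups) can be sketched as follows; the contingency table is hypothetical and not taken from the survey data.

```python
from scipy import stats

# Hypothetical 2x2 table: frequent vs infrequent secure-messaging use by education level.
observed = [[220, 120],   # higher education: frequent, infrequent
            [180, 160]]   # lower education: frequent, infrequent
chi2, p_value, dof, expected = stats.chi2_contingency(observed)
print(f"chi2={chi2:.2f}, p={p_value:.3f}, dof={dof}")
```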

  20. A Conceptual Framework for Allocation of Federally Stockpiled Ventilators During Large-Scale Public Health Emergencies.

    PubMed

    Zaza, Stephanie; Koonin, Lisa M; Ajao, Adebola; Nystrom, Scott V; Branson, Richard; Patel, Anita; Bray, Bruce; Iademarco, Michael F

    2016-01-01

    Some types of public health emergencies could result in large numbers of patients with respiratory failure who need mechanical ventilation. Federal public health planning has included needs assessment and stockpiling of ventilators. However, additional federal guidance is needed to assist states in further allocating federally supplied ventilators to individual hospitals to ensure that ventilators are shipped to facilities where they can best be used during an emergency. A major consideration in planning is a hospital's ability to absorb additional ventilators, based on available space and staff expertise. A simple pro rata plan that does not take these factors into account might result in suboptimal use or unused scarce resources. This article proposes a conceptual framework that identifies the steps in planning and an important gap in federal guidance regarding the distribution of stockpiled mechanical ventilators during an emergency. PMID:26828799
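
    As a toy illustration of the planning gap described above, the sketch below contrasts a pure pro-rata share with a capacity-capped allocation; hospital names, needs, and capacities are hypothetical, and the rule itself is only one possible scheme, not the framework proposed in the article.

```python
def allocate_ventilators(stockpile: int, hospitals: dict) -> dict:
    """Allocate stockpiled ventilators pro rata by need, capped by absorption capacity.

    `hospitals` maps name -> {"need": int, "capacity": int}, where capacity is the
    number of extra ventilators the facility can actually absorb (space plus staff).
    Leftover units are redistributed to facilities with remaining headroom.
    """
    total_need = sum(h["need"] for h in hospitals.values())
    alloc = {name: min(int(stockpile * h["need"] / total_need), h["capacity"])
             for name, h in hospitals.items()}
    leftover = stockpile - sum(alloc.values())
    for name, h in sorted(hospitals.items(), key=lambda kv: -kv[1]["need"]):
        extra = min(leftover, h["capacity"] - alloc[name])
        alloc[name] += extra
        leftover -= extra
    return alloc

# Hypothetical example: 100 ventilators across three facilities.
print(allocate_ventilators(100, {
    "A": {"need": 80, "capacity": 30},
    "B": {"need": 40, "capacity": 60},
    "C": {"need": 20, "capacity": 25},
}))
```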

  1. Engaging in large-scale digital health technologies and services. What factors hinder recruitment?

    PubMed

    O'Connor, Siobhan; Mair, Frances S; McGee-Lennon, Marilyn; Bouamrane, Matt-Mouley; O'Donnell, Kate

    2015-01-01

    Implementing consumer oriented digital health products and services at scale is challenging and a range of barriers to reaching and recruiting users to these types of solutions can be encountered. This paper describes the experience of implementers with the rollout of the Delivering Assisted Living Lifestyles at Scale (dallas) programme. The findings are based on qualitative analysis of baseline and midpoint interviews and project documentation. Eight main themes emerged as key factors which hindered participation. These include how the dallas programme was designed and operationalised, constraints imposed by partnerships, technology, branding, and recruitment strategies, as well as challenges with the development cycle and organisational culture. PMID:25991155

  2. Perspectives on Clinical Informatics: Integrating Large-Scale Clinical, Genomic, and Health Information for Clinical Care

    PubMed Central

    Choi, In Young; Kim, Tae-Min; Kim, Myung Shin; Mun, Seong K.

    2013-01-01

    The advances in electronic medical records (EMRs) and bioinformatics (BI) represent two significant trends in healthcare. The widespread adoption of EMR systems and the completion of the Human Genome Project developed the technologies for data acquisition, analysis, and visualization in two different domains. The massive amount of data from both clinical and biology domains is expected to provide personalized, preventive, and predictive healthcare services in the near future. The integrated use of EMR and BI data needs to consider four key informatics areas: data modeling, analytics, standardization, and privacy. Bioclinical data warehouses integrating heterogeneous patient-related clinical or omics data should be considered. The representative standardization effort by the Clinical Bioinformatics Ontology (CBO) aims to provide uniquely identified concepts to include molecular pathology terminologies. Since individual genome data are easily used to predict current and future health status, different safeguards to ensure confidentiality should be considered. In this paper, we focused on the informatics aspects of integrating the EMR community and BI community by identifying opportunities, challenges, and approaches to provide the best possible care service for our patients and the population. PMID:24465229

  3. Perspectives on clinical informatics: integrating large-scale clinical, genomic, and health information for clinical care.

    PubMed

    Choi, In Young; Kim, Tae-Min; Kim, Myung Shin; Mun, Seong K; Chung, Yeun-Jun

    2013-12-01

    The advances in electronic medical records (EMRs) and bioinformatics (BI) represent two significant trends in healthcare. The widespread adoption of EMR systems and the completion of the Human Genome Project developed the technologies for data acquisition, analysis, and visualization in two different domains. The massive amount of data from both clinical and biology domains is expected to provide personalized, preventive, and predictive healthcare services in the near future. The integrated use of EMR and BI data needs to consider four key informatics areas: data modeling, analytics, standardization, and privacy. Bioclinical data warehouses integrating heterogeneous patient-related clinical or omics data should be considered. The representative standardization effort by the Clinical Bioinformatics Ontology (CBO) aims to provide uniquely identified concepts to include molecular pathology terminologies. Since individual genome data are easily used to predict current and future health status, different safeguards to ensure confidentiality should be considered. In this paper, we focused on the informatics aspects of integrating the EMR community and BI community by identifying opportunities, challenges, and approaches to provide the best possible care service for our patients and the population. PMID:24465229

  4. LARGE-SCALE STAR-FORMATION-DRIVEN OUTFLOWS AT 1 < z < 2 IN THE 3D-HST SURVEY

    SciTech Connect

    Lundgren, Britt F.; Van Dokkum, Pieter; Bezanson, Rachel; Momcheva, Ivelina; Nelson, Erica; Skelton, Rosalind E.; Wake, David; Whitaker, Katherine; Brammer, Gabriel; Franx, Marijn; Fumagalli, Mattia; Labbe, Ivo; Patel, Shannon; Da Cunha, Elizabete; Rix, Hans Walter; Schmidt, Kasper; Erb, Dawn K.; Fan Xiaohui; Kriek, Mariska; Marchesini, Danilo; and others

    2012-11-20

    We present evidence of large-scale outflows from three low-mass (log(M*/M_Sun) ≈ 9.75) star-forming (SFR > 4 M_Sun yr^-1) galaxies observed at z = 1.24, z = 1.35, and z = 1.75 in the 3D-HST Survey. Each of these galaxies is located within a projected physical distance of 60 kpc around the sight line to the quasar SDSS J123622.93+621526.6, which exhibits well-separated strong (W_r(λ2796) ≳ 0.8 Å) Mg II absorption systems matching precisely to the redshifts of the three galaxies. We derive the star formation surface densities from the Hα emission in the WFC3 G141 grism observations for the galaxies and find that in each case the star formation surface density well exceeds 0.1 M_Sun yr^-1 kpc^-2, the typical threshold for starburst galaxies in the local universe. From a small but complete parallel census of the 0.65 < z < 2.6 galaxies with H_140 ≲ 24 proximate to the quasar sight line, we detect Mg II absorption associated with galaxies extending to physical distances of 130 kpc. We determine that the W_r > 0.8 Å Mg II covering fraction of star-forming galaxies at 1 < z < 2 may be as large as unity on scales extending to at least 60 kpc, providing early constraints on the typical extent of starburst-driven winds around galaxies at this redshift. Our observations additionally suggest that the azimuthal distribution of W_r > 0.4 Å Mg II absorbing gas around star-forming galaxies may evolve from z ≈ 2 to the present, consistent with recent observations of an increasing collimation of star-formation-driven outflows with time from z ≈ 3.

  5. Large-Scale Disasters

    NASA Astrophysics Data System (ADS)

    Gad-El-Hak, Mohamed

    "Extreme" events - including climatic events, such as hurricanes, tornadoes, and drought - can cause massive disruption to society, including large death tolls and property damage in the billions of dollars. Events in recent years have shown the importance of being prepared and that countries need to work together to help alleviate the resulting pain and suffering. This volume presents a review of the broad research field of large-scale disasters. It establishes a common framework for predicting, controlling and managing both manmade and natural disasters. There is a particular focus on events caused by weather and climate change. Other topics include air pollution, tsunamis, disaster modeling, the use of remote sensing and the logistics of disaster management. It will appeal to scientists, engineers, first responders and health-care professionals, in addition to graduate students and researchers who have an interest in the prediction, prevention or mitigation of large-scale disasters.

  6. Large-scale latitude distortions of the inner Milky Way disk from the Herschel/Hi-GAL Survey

    NASA Astrophysics Data System (ADS)

    Molinari, S.; Noriega-Crespo, A.; Bally, J.; Moore, T. J. T.; Elia, D.; Schisano, E.; Plume, R.; Swinyard, B.; Di Giorgio, A. M.; Pezzuto, S.; Benedettini, M.; Testi, L.

    2016-04-01

    -infrared catalogues are filtered according to criteria that primarily select Young Stellar Objects (YSOs). Conclusions: The distortions of the Galactic inner disk revealed by Herschel confirm previous findings from CO surveys and HII/OB source counts but with much greater statistical significance and are interpreted as large-scale bending modes of the plane. The lack of similar distortions in tracers of more evolved YSOs or stars rules out gravitational instabilities or satellite-induced perturbations, because they should act on both the diffuse and stellar disk components. We propose that the observed bends are caused by incoming flows of extra-planar gas from the Galactic fountain or the Galactic halo interacting with the gaseous disk. With a much lower cross-section, stars decouple from the gaseous ISM and relax into the stellar disk potential. The timescale required for the disappearance of the distortions from the diffuse ISM to the relatively evolved YSO stages are compatible with star formation timescales.

  7. A LARGE-SCALE CLUSTER RANDOMIZED TRIAL TO DETERMINE THE EFFECTS OF COMMUNITY-BASED DIETARY SODIUM REDUCTION – THE CHINA RURAL HEALTH INITIATIVE SODIUM REDUCTION STUDY

    PubMed Central

    Li, Nicole; Yan, Lijing L.; Niu, Wenyi; Labarthe, Darwin; Feng, Xiangxian; Shi, Jingpu; Zhang, Jianxin; Zhang, Ruijuan; Zhang, Yuhong; Chu, Hongling; Neiman, Andrea; Engelgau, Michael; Elliott, Paul; Wu, Yangfeng; Neal, Bruce

    2013-01-01

    Background Cardiovascular diseases are the leading cause of death and disability in China. High blood pressure caused by excess intake of dietary sodium is widespread, and an effective sodium reduction program has the potential to improve cardiovascular health. Design This study is a large-scale, cluster-randomized trial conducted in five northern Chinese provinces. Two counties were selected from each province and 12 townships in each county, making a total of 120 clusters. Within each township one village was selected for participation, with 1:1 randomization stratified by county. The sodium reduction intervention comprises community health education and a food supply strategy based upon providing access to salt substitute. The price of salt substitute was subsidized in 30 intervention villages selected at random. Control villages continued usual practices. The primary outcome for the study is dietary sodium intake estimated from assays of 24-hour urine. Trial status The trial recruited and randomized 120 townships in April 2011. The sodium reduction program was commenced in the 60 intervention villages between May and June of that year, with outcome surveys scheduled for October to December 2012. Baseline data collection shows that randomisation achieved good balance across groups. Discussion The establishment of the China Rural Health Initiative has enabled the launch of this large-scale trial designed to identify a novel, scalable strategy for reduction of dietary sodium and control of blood pressure. If proved effective, the intervention could plausibly be implemented at low cost in large parts of China and other countries worldwide. PMID:24176436
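
    The 1:1, county-stratified allocation of clusters described in the design can be sketched as follows; the county and township identifiers are hypothetical, and this only illustrates the general randomisation scheme rather than the trial's actual procedure.

```python
import random

def stratified_cluster_randomise(clusters_by_stratum: dict, seed: int = 2011) -> dict:
    """Assign clusters 1:1 to intervention/control within each stratum (county).

    `clusters_by_stratum` maps a stratum label to a list of cluster identifiers;
    each stratum is split evenly between the two arms.
    """
    rng = random.Random(seed)
    allocation = {}
    for stratum, clusters in clusters_by_stratum.items():
        shuffled = list(clusters)
        rng.shuffle(shuffled)
        half = len(shuffled) // 2
        for cluster in shuffled[:half]:
            allocation[cluster] = "intervention"
        for cluster in shuffled[half:]:
            allocation[cluster] = "control"
    return allocation

# Hypothetical example: one county (stratum) containing 12 townships.
print(stratified_cluster_randomise({"county_01": [f"township_{i:02d}" for i in range(1, 13)]}))
```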

  8. Evaluating a Large-Scale Community-Based Intervention to Improve Pregnancy and Newborn Health Among the Rural Poor in India.

    PubMed

    Acharya, Arnab; Lalwani, Tanya; Dutta, Rahul; Rajaratnam, Julie Knoll; Ruducha, Jenny; Varkey, Leila Caleb; Wunnava, Sita; Menezes, Lysander; Taylor, Catharine; Bernson, Jeff

    2015-01-01

    Objectives. We evaluated the effectiveness of the Sure Start project, which was implemented in 7 districts of Uttar Pradesh, India, to improve maternal and newborn health. Methods. Interventions were implemented at 2 randomly assigned levels of intensity. Forty percent of the areas received a more intense intervention, including community-level meetings with expectant mothers. A baseline survey consisted of 12 000 women who completed pregnancy in 2007; a follow-up survey was conducted for women in 2010 in the same villages. Our quantitative analyses provide an account of the project's impact. Results. We observed significant health improvements in both intervention areas over time; in the more intensive intervention areas, we found greater improvements in care-seeking and healthy behaviors. The more intensive intervention areas did not experience a significantly greater decline in neonatal mortality. Conclusions. This study demonstrates that community-based efforts, especially mothers' group meetings designed to increase care-seeking and healthy behaviors, are effective and can be implemented at large scale. PMID:25393175

  9. Prevalence and determinants of child maltreatment among high school students in Southern China: A large scale school based survey

    PubMed Central

    Leung, Phil WS; Wong, William CW; Chen, WQ; Tang, Catherine SK

    2008-01-01

    Background Child maltreatment can cause significant physical and psychological problems. The present study aimed to investigate the prevalence and determinants of child maltreatment in Guangzhou, China, where such issues are often considered a taboo subject. Methods A school-based survey was conducted in southern China in 2005. 24 high schools were selected using a stratified random sampling strategy based on their districts and bandings. The self-administered, validated Chinese version of the parent-child Conflict Tactics Scale (CTSPC) was used as the main assessment tool to measure the abusive experiences encountered by students in the previous six months. Results The response rate of this survey was 99.7%. Among the 6592 responding students, the mean age was 14.68 years. The prevalences of parental psychological aggression, corporal punishment, severe physical maltreatment, and very severe physical maltreatment in the past 6 months were 78.3%, 23.2%, 15.1% and 2.8%, respectively. The prevalence of sexual abuse was 0.6%. The most commonly cited reasons for maltreatment included 'disobedience to parents', 'poor academic performance', and 'quarrelling between parents'. Age, parental education, place of origin and type of housing were found to be associated with physical maltreatment, whereas gender and fathers' education level were associated with sexual abuse. Conclusion Though largely unspoken, child maltreatment is a common problem in China. Identification of significant determinants in this study can provide valuable information for teachers and health professionals so that they can pay special attention to at-risk children. PMID:18823544

  10. Prevalence of disability in Manikganj district of Bangladesh: results from a large-scale cross-sectional survey

    PubMed Central

    Zaman, M Mostafa; Mashreky, Saidur Rahman

    2016-01-01

    Objective To conduct a comprehensive survey on disability to determine the prevalence and distribution of cause-specific disability among residents of the Manikganj district in Bangladesh. Methods The survey was conducted in Manikganj, a typical district in Bangladesh, in 2009. Data were collected from 37 030 individuals of all ages. Samples were drawn from 8905 households in urban and rural areas, proportionate to population size. Three sets of interviewer-administered questionnaires were used separately for the age groups 0–1 years, 2–10 years, and 11 years and above. For the age groups 0–1 years and 2–10 years, the parents or the head of the household were interviewed to obtain the responses. Impairments, activity limitations and restriction of participation were considered in defining disability, consistent with the International Classification of Functioning, Disability and Health framework. Results Overall, the age-standardised prevalence of disability per 1000 was 46.5 (95% CI 44.4 to 48.6). Prevalence was significantly higher among respondents living in rural areas (50.2; 95% CI 47.7 to 52.7) than in urban areas (31.0; 95% CI 27.0 to 35.0). Overall, female respondents had more disability (50.0; 95% CI 46.9 to 53.1) than male respondents (43.4; 95% CI 40.5 to 46.3). Educational deprivation was closely linked to a higher prevalence of disability. Commonly reported prevalences (per 1000) for underlying causes of disability were 20.2 for illness, followed by 9.4 for congenital causes and 6.8 for injury, and these were consistent in males and females. Conclusions Disability is a common problem in this typical district of Bangladesh, and the findings are largely generalisable. Interventions at community level with special attention to the socioeconomically deprived are warranted. PMID:27431897
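
    The age-standardised prevalence quoted above is a weighted average of age-specific prevalences, with weights taken from a standard population. A minimal sketch of the direct method follows; the age groups and counts are made up for illustration and are not the survey's data.

```python
def age_standardised_prevalence_per_1000(cases, persons, standard_pop):
    """Direct age standardisation of a prevalence, expressed per 1000 population.

    `cases`, `persons`, and `standard_pop` are sequences aligned by age group:
    observed cases, observed persons, and the standard population used for weights.
    """
    total_std = sum(standard_pop)
    rate = sum((w / total_std) * (c / n)
               for w, c, n in zip(standard_pop, cases, persons))
    return 1000.0 * rate

# Hypothetical three age groups.
print(age_standardised_prevalence_per_1000(
    cases=[120, 310, 560], persons=[9000, 15000, 13000], standard_pop=[30, 45, 25]))
```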

  11. National Health Care Survey

    Cancer.gov

    This survey encompasses a family of health care provider surveys, including information about the facilities that supply health care, the services rendered, and the characteristics of the patients served.

  12. Large-scale monitoring of shorebird populations using count data and N-mixture models: Black Oystercatcher (Haematopus bachmani) surveys by land and sea

    USGS Publications Warehouse

    Lyons, James E.; Royle, J. Andrew; Thomas, Susan M.; Elliott-Smith, Elise; Evenson, Joseph R.; Kelly, Elizabeth G.; Milner, Ruth L.; Nysewander, David R.; Andres, Brad A.

    2012-01-01

    Large-scale monitoring of bird populations is often based on count data collected across spatial scales that may include multiple physiographic regions and habitat types. Monitoring at large spatial scales may require multiple survey platforms (e.g., from boats and land when monitoring coastal species) and multiple survey methods. It becomes especially important to explicitly account for detection probability when analyzing count data that have been collected using multiple survey platforms or methods. We evaluated a new analytical framework, N-mixture models, to estimate actual abundance while accounting for multiple detection biases. During May 2006, we made repeated counts of Black Oystercatchers (Haematopus bachmani) from boats in the Puget Sound area of Washington (n = 55 sites) and from land along the coast of Oregon (n = 56 sites). We used a Bayesian analysis of N-mixture models to (1) assess detection probability as a function of environmental and survey covariates and (2) estimate total Black Oystercatcher abundance during the breeding season in the two regions. Probability of detecting individuals during boat-based surveys was 0.75 (95% credible interval: 0.42–0.91) and was not influenced by tidal stage. Detection probability from surveys conducted on foot was 0.68 (0.39–0.90); the latter was not influenced by fog, wind, or number of observers but was ~35% lower during rain. The estimated population size was 321 birds (262–511) in Washington and 311 (276–382) in Oregon. N-mixture models provide a flexible framework for modeling count data and covariates in large-scale bird monitoring programs designed to understand population change.
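
    The binomial N-mixture model evaluated above treats each site's repeated counts as binomial draws from a latent abundance with a Poisson prior, and sums the latent abundance out of the likelihood. The paper used a Bayesian fit; the sketch below is a generic maximum-likelihood version with the latent abundance truncated at K, and the data and settings are illustrative assumptions only.

```python
import numpy as np
from scipy import stats, optimize

def nmixture_neg_loglik(params, counts, K=200):
    """Negative log-likelihood of a basic binomial N-mixture model (Royle 2004).

    counts: array of shape (n_sites, n_visits) of repeated counts per site.
    params: (log_lambda, logit_p); N_i ~ Poisson(lambda), y_ij ~ Binomial(N_i, p),
    and the latent N_i is summed out over 0..K.
    """
    lam = np.exp(params[0])
    p = 1.0 / (1.0 + np.exp(-params[1]))
    N = np.arange(K + 1)                       # support of the latent abundance
    log_prior = stats.poisson.logpmf(N, lam)   # log P(N | lambda)
    total = 0.0
    for y in counts:                           # marginal likelihood, one site at a time
        log_binom = stats.binom.logpmf(y[:, None], N[None, :], p).sum(axis=0)
        total += np.logaddexp.reduce(log_prior + log_binom)
    return -total

# Tiny illustrative data set: 4 sites, 3 repeat counts each.
counts = np.array([[3, 2, 4], [0, 1, 0], [5, 6, 4], [2, 2, 3]])
fit = optimize.minimize(nmixture_neg_loglik, x0=[np.log(3.0), 0.0], args=(counts,))
print(np.exp(fit.x[0]), 1 / (1 + np.exp(-fit.x[1])))  # estimated lambda and detection p
```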

  13. Large-Scale Survey of Chinese Precollege Students' Epistemological Beliefs about Physics: A Progression or a Regression?

    ERIC Educational Resources Information Center

    Zhang, Ping; Ding, Lin

    2013-01-01

    This paper reports a cross-grade comparative study of Chinese precollege students' epistemological beliefs about physics by using the Colorado Learning Attitudes Survey about Sciences (CLASS). Our students of interest are middle and high schoolers taking traditional lecture-based physics as a mandatory science course each year from the 8th grade…

  14. Adult Siblings of Individuals with Down Syndrome versus with Autism: Findings from a Large-Scale US Survey

    ERIC Educational Resources Information Center

    Hodapp, R. M.; Urbano, R. C.

    2007-01-01

    Background: As adults with Down syndrome live increasingly longer lives, their adult siblings will most likely assume caregiving responsibilities. Yet little is known about either the sibling relationship or the general functioning of these adult siblings. Using a national, web-based survey, this study compared adult siblings of individuals with…

  15. An evaluation of two large scale demand side financing programs for maternal health in India: the MATIND study protocol

    PubMed Central

    2012-01-01

    Background High maternal mortality in India is a serious public health challenge. Demand side financing interventions have emerged as a strategy to promote access to emergency obstetric care. Two such state-run programs, Janani Suraksha Yojana (JSY) and Chiranjeevi Yojana (CY), were designed and implemented to reduce financial access barriers that preclude women from obtaining emergency obstetric care. JSY, a conditional cash transfer, awards money directly to a woman who delivers in a public health facility. This will be studied in Madhya Pradesh province. CY, a voucher-based program, empanels private obstetricians in Gujarat province, who are reimbursed by the government to perform deliveries of socioeconomically disadvantaged women. The programs have been in operation for the last seven years. Methods/designs The study outlined in this protocol will assess and compare the influence of the two programs on various aspects of maternal health care, including trends in program uptake, institutional delivery rates, maternal and neonatal outcomes, quality of care, experiences of service providers and users, and cost effectiveness. The study will collect primary data using a combination of qualitative and quantitative methods, including facility-level questionnaires, observations, a population-based survey, in-depth interviews, and focus group discussions. Primary data will be collected in three districts of each province. The research will take place at three levels: the state health departments, obstetric facilities in the districts and among recently delivered mothers in the community. Discussion The protocol is a comprehensive assessment of the performance and impact of the programs and an economic analysis. It will fill existing evidence gaps in the scientific literature, including access to and quality of services, utilization, coverage and impact. The implementation of the protocol will also generate evidence to facilitate decision making among policy makers and

  16. Assessing large-scale surveyor variability in the historic forest data of the original U.S. Public Land Survey

    USGS Publications Warehouse

    Manies, K.L.; Mladenoff, D.J.; Nordheim, E.V.

    2001-01-01

    The U.S. General Land Office Public Land Survey (PLS) records are a valuable resource for studying pre-European settlement vegetation. However, these data were taken for legal, not ecological, purposes. In turn, the instructions the surveyors followed affected the data collected. For this reason, it has been suggested that the PLS data may not truly represent the surveyed landscapes. This study examined the PLS data of northern Wisconsin, U.S.A., to determine the extent of variability among surveyors. We statistically tested for differences among surveyors in recorded tree species, size, location, and distance from the survey point. While we cannot rule out effects from other influences (e.g., environmental factors), we found evidence suggesting some level of surveyor bias for four of five variables, including tree species and size. The PLS data remain one of the best records of pre-European settlement vegetation available. However, based on our findings, we recommend that projects using PLS records examine these data carefully. This assessment should include not only the choice of variables to be studied but also the spatial extent at which the data will be examined.

  17. Abuse of Medications Employed for the Treatment of ADHD: Results From a Large-Scale Community Survey

    PubMed Central

    Bright, George M.

    2008-01-01

    Objective The objective is to assess abuse of prescription and illicit stimulants among individuals being treated for attention-deficit/hyperactivity disorder (ADHD). Methods A survey was distributed to patients enrolled in an ADHD treatment center. It included questions designed to gain information about demographics; ADHD treatment history; illicit drug use; and misuse of prescribed stimulant medications, including type of stimulant medication most frequently misused or abused, and how the stimulant was prepared and administered. Results A total of 545 subjects (89.2% with ADHD) were included in the survey. Results indicated that 14.3% of respondents abused prescription stimulants. Of these, 79.8% abused short-acting agents; 17.2% abused long-acting stimulants; 2.0% abused both short- and long-acting agents; and 1.0% abused other agents. The specific medications abused most often were mixed amphetamine salts (Adderall; 40.0%), mixed amphetamine salts extended release (Adderall XR; 14.2%), and methylphenidate (Ritalin; 15.0%), and the most common manner of stimulant abuse was crushing pills and snorting (75.0%). Survey results also showed that 39.1% of respondents used nonprescription stimulants, most often cocaine (62.2%), methamphetamine (4.8%), and both cocaine and amphetamine (31.1%). Choice of illicit drug was based on rapidity of high onset (43.5%), ease of acquisition (40.7%), ease of use (10.2%), and cost (5.5%). Conclusions The risks for abuse of prescription and illicit stimulants are elevated among individuals being treated in an ADHD clinic. Prescription agents used most often are those with pharmacologic and pharmacokinetic characteristics that provide a rapid high. This suggests that long-acting stimulant preparations that have been developed for the treatment of ADHD may have lower abuse potential than short-acting formulations. PMID:18596945

  18. Future Large-Scale Surveys of 'Interesting' Stars in The Halo and Thick Disk of the Galaxy

    NASA Astrophysics Data System (ADS)

    Beers, T. C.

    The age of slow, methodical, star-by-star, single-slit spectroscopic observations of rare stars in the halo and thick disk of the Milky Way has come to an end. As the result of the labors of numerous astronomers over the past 40 years, spectroscopic data for some 2000 stars with metallicity less than [Fe/H] = -1.5 has been obtained. Under the assumption of a constant flux of astronomers working in this area (and taking 50 major players over the years), the long-term average yield works out to ONE (1!) such star per astronomer per year. The use of new spectroscopic and photometric survey techniques which obtain large sky coverage to faint magnitudes will enable substantially better "return on investment" in the near future. We review the present state of surveys for low metallicity and field horizontal-branch stars in the Galaxy, and describe several new lines of attack which should open the way to a more than one hundred-fold increase in the numbers of interesting stars with available spectroscopic and photometric information.

  19. Large-scale survey of Chinese precollege students' epistemological beliefs about physics: A progression or a regression?

    NASA Astrophysics Data System (ADS)

    Zhang, Ping; Ding, Lin

    2013-06-01

    This paper reports a cross-grade comparative study of Chinese precollege students’ epistemological beliefs about physics, using the Colorado Learning Attitudes Survey about Sciences (CLASS). Our students of interest are middle and high schoolers taking traditional lecture-based physics as a mandatory science course each year from the 8th grade to the 12th grade in China. The original CLASS was translated into Mandarin through a rigorous transadaptation process, and then administered as a pencil-and-paper in-class survey to a total of 1318 students across all five grade levels (8-12). Our results showed that although in general student epistemological beliefs became less expertlike after receiving more years of traditional instruction (a trend consistent with what was reported in the previous literature), the cross-grade change was not a monotonic decrease. Instead, students at grades 9 and 12 showed a slight positive shift in their beliefs measured by CLASS. In particular, when compared to the 8th graders, students at the 9th grade demonstrated a significant increase in their views about the conceptual nature of physics and problem-solving sophistication. We hypothesize that both pedagogical and nonpedagogical factors may have contributed to these positive changes. Our results cast light on the complex nature of the relationship between formal instruction and student epistemological beliefs.

  20. The large scale structure of the Universe revealed with high redshift emission-line galaxies: implications for future surveys

    NASA Astrophysics Data System (ADS)

    Antonino Orsi, Alvaro

    2015-08-01

    Nebular emission in galaxies traces their star-formation activity within the last 10 Myr or so. Hence, these objects are typically found in the outskirts of massive clusters, where environmental effects have not yet effectively stopped the star formation process. In this talk I discuss the nature of emission-line galaxies (ELGs) and the implications for their clustering properties. To account for the relevant physical ingredients that produce nebular emission, I combine semi-analytical models of galaxy formation with a radiative transfer code for Ly-alpha photons and the photoionization and shock code MAPPINGS-III. As a result, the clustering strength of ELGs is found to correlate weakly with the line luminosities. Also, their 2-d clustering displays a weak finger-of-god effect, and the clustering on linear scales is affected by assembly bias. I review the implications of the nature of this galaxy population for future large spectroscopic surveys targeting ELGs to extract cosmological results. In particular, I present forecasts for the ELG population in J-PAS, an 8000 deg^2 survey with 54 narrow-band filters covering the optical range, expected to start in 2016.

  1. A Large-Scale, Low-Frequency Murchison Widefield Array Survey of Galactic H ii Regions between 260 < l < 340

    NASA Astrophysics Data System (ADS)

    Hindson, L.; Johnston-Hollitt, M.; Hurley-Walker, N.; Callingham, J. R.; Su, H.; Morgan, J.; Bell, M.; Bernardi, G.; Bowman, J. D.; Briggs, F.; Cappallo, R. J.; Deshpande, A. A.; Dwarakanath, K. S.; For, B.-Q.; Gaensler, B. M.; Greenhill, L. J.; Hancock, P.; Hazelton, B. J.; Kapińska, A. D.; Kaplan, D. L.; Lenc, E.; Lonsdale, C. J.; Mckinley, B.; McWhirter, S. R.; Mitchell, D. A.; Morales, M. F.; Morgan, E.; Oberoi, D.; Offringa, A.; Ord, S. M.; Procopio, P.; Prabu, T.; Shankar, N. Udaya; Srivani, K. S.; Staveley-Smith, L.; Subrahmanyan, R.; Tingay, S. J.; Wayth, R. B.; Webster, R. L.; Williams, A.; Williams, C. L.; Wu, C.; Zheng, Q.

    2016-05-01

    We have compiled a catalogue of H ii regions detected with the Murchison Widefield Array between 72 and 231 MHz. The multiple frequency bands provided by the Murchison Widefield Array allow us to identify the characteristic spectrum generated by the thermal bremsstrahlung process in H ii regions. We detect 306 H ii regions between 260° < l < 340° and report the positions, sizes, peak and integrated flux densities, and spectral indices of these H ii regions. By identifying the point at which H ii regions transition from the optically thin to the optically thick regime, we derive physical properties, including the electron density, ionised gas mass, and ionising photon flux, for 61 H ii regions. This catalogue represents the most extensive and uniform low-frequency survey of H ii regions in the Galaxy to date.
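
    For readers unfamiliar with the underlying radio physics, the following is an illustrative sketch (not the paper's exact formulation) of how such spectral fits yield physical properties. A commonly used approximation for the free-free optical depth is

    \[ \tau_\nu \simeq 3.28\times10^{-7}\,\left(\frac{T_e}{10^4\,\mathrm{K}}\right)^{-1.35}\left(\frac{\nu}{\mathrm{GHz}}\right)^{-2.1}\left(\frac{EM}{\mathrm{pc\,cm^{-6}}}\right), \]

    so that S_ν ∝ ν^2 in the optically thick regime (τ_ν ≫ 1) and S_ν ∝ ν^{-0.1} in the optically thin regime (τ_ν ≪ 1). Locating the turnover frequency where τ_ν ≈ 1 gives the emission measure EM, and, for an assumed source depth L, an electron density of order n_e ≈ (EM/L)^{1/2}.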

  2. Large scale dynamic systems

    NASA Technical Reports Server (NTRS)

    Doolin, B. F.

    1975-01-01

    Classes of large scale dynamic systems were discussed in the context of modern control theory. Specific examples discussed were in the technical fields of aeronautics, water resources and electric power.

  3. DEMOGRAPHIC AND HEALTH SURVEYS

    EPA Science Inventory

    Demographic and Health Surveys are nationally representative household surveys with large sample sizes of between 5,000 and 30,000 households, typically. DHS surveys provide data for a wide range of monitoring and impact evaluation indicators in the areas of population, health, a...

  4. Evaluation of airborne geophysical surveys for large-scale mapping of contaminated mine pools: draft final report

    SciTech Connect

    Hammack, R. W.

    2006-12-28

    Decades of underground coal mining have left about 5,000 square miles of abandoned mine workings that are rapidly filling with water. The water quality of mine pools is often poor; environmental regulatory agencies are concerned because water from mine pools could contaminate diminishing surface and groundwater supplies. Mine pools are also a threat to the safety of current mining operations. Conversely, mine pools are a large, untapped water resource that, with treatment, could be used for a variety of industrial purposes. Others have proposed using mine pools in conjunction with heat pumps as a source of heating and cooling for large industrial facilities. The management or use of mine pool water requires accurate maps of mine pools. West Virginia University has predicted the likely location and volume of mine pools in the Pittsburgh Coalbed using existing mine maps, structure contour maps, and measured mine pool elevations. Unfortunately, mine maps only reflect conditions at the time of mining, are not available for all mines, and do not always denote the maximum extent of mining. Since 1999, the National Energy Technology Laboratory (NETL) has been evaluating helicopter-borne electromagnetic sensing technologies for the detection and mapping of mine pools. Frequency-domain electromagnetic sensors are able to detect shallow mine pools (depth < 50 m) if there is sufficient contrast between the conductance of the mine pool and that of the overburden. The mine pools (conductors) most confidently detected by this technology are overlain by thick, resistive sandstone layers. In 2003, a helicopter time-domain electromagnetic sensor was applied to mined areas in southwestern Virginia in an attempt to increase the depth of mine pool detection. This study failed because the mine pool targets were thin and not very conductive. Also, large areas of the surveys were degraded or made unusable by excessive amounts of cultural electromagnetic noise that obscured the

  5. Galaxy clustering on large scales.

    PubMed

    Efstathiou, G

    1993-06-01

    I describe some recent observations of large-scale structure in the galaxy distribution. The best constraints come from two-dimensional galaxy surveys and studies of angular correlation functions. Results from galaxy redshift surveys are much less precise but are consistent with the angular correlations, provided the distortions in mapping between real space and redshift space are relatively weak. The galaxy two-point correlation function, rich-cluster two-point correlation function, and galaxy-cluster cross-correlation function are all well described on large scales (≳ 20 h^-1 Mpc, where the Hubble constant H0 = 100h km s^-1 Mpc^-1; 1 pc = 3.09 × 10^16 m) by the power spectrum of an initially scale-invariant, adiabatic, cold-dark-matter Universe with Γ = Ωh ≈ 0.2. I discuss how this fits in with the Cosmic Background Explorer (COBE) satellite detection of large-scale anisotropies in the microwave background radiation and other measures of large-scale structure in the Universe. PMID:11607400
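
    As a hedged aside for context (a standard convention of the period, not a quotation from the paper), the shape parameter Γ = Ωh enters through the cold-dark-matter transfer function; a widely used fitting form (Bardeen et al. 1986) is

    \[ T(q) = \frac{\ln(1+2.34q)}{2.34q}\left[1 + 3.89q + (16.1q)^2 + (5.46q)^3 + (6.71q)^4\right]^{-1/4}, \qquad q = k/\Gamma \ \ (k \ \mathrm{in}\ h\,\mathrm{Mpc}^{-1}), \]

    with the linear power spectrum of an initially scale-invariant spectrum given by P(k) ∝ k T^2(k).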

  6. Galaxy evolution and large-scale structure in the far-infrared. II - The IRAS faint source survey

    NASA Astrophysics Data System (ADS)

    Lonsdale, Carol J.; Hacking, Perry B.; Conrow, T. P.; Rowan-Robinson, M.

    1990-07-01

    The new IRAS Faint Source Survey data base is used to confirm the conclusion of Hacking et al. (1987) that the 60 micron source counts fainter than about 0.5 Jy lie in excess of predictions based on nonevolving model populations. The existence of an anisotropy between the northern and southern Galactic caps discovered by Rowan-Robinson et al. (1986) and Needham and Rowan-Robinson (1988) is confirmed, and it is found to extend below their sensitivity limit to about 0.3 Jy in 60 micron flux density. The count anisotropy at f(60) > 0.3 Jy can be interpreted reasonably as due to the Local Supercluster; however, no single structure accounting for the fainter anisotropy can be easily identified in either optical or far-IR two-dimensional sky distributions. The far-IR galaxy sky distributions are considerably smoother than distributions from the published optical galaxy catalogs. It is likely that structures of the large size discussed here have been discriminated against in earlier studies owing to insufficient volume sampling.

  7. Galaxy evolution and large-scale structure in the far-infrared. II - The IRAS faint source survey

    NASA Technical Reports Server (NTRS)

    Lonsdale, Carol J.; Hacking, Perry B.; Conrow, T. P.; Rowan-Robinson, M.

    1990-01-01

    The new IRAS Faint Source Survey data base is used to confirm the conclusion of Hacking et al. (1987) that the 60 micron source counts fainter than about 0.5 Jy lie in excess of predictions based on nonevolving model populations. The existence of an anisotropy between the northern and southern Galactic caps discovered by Rowan-Robinson et al. (1986) and Needham and Rowan-Robinson (1988) is confirmed, and it is found to extend below their sensitivity limit to about 0.3 Jy in 60 micron flux density. The count anisotropy at f(60) > 0.3 Jy can be interpreted reasonably as due to the Local Supercluster; however, no single structure accounting for the fainter anisotropy can be easily identified in either optical or far-IR two-dimensional sky distributions. The far-IR galaxy sky distributions are considerably smoother than distributions from the published optical galaxy catalogs. It is likely that structures of the large size discussed here have been discriminated against in earlier studies owing to insufficient volume sampling.

  8. Galaxy evolution and large-scale structure in the far-infrared. II. The IRAS faint source survey

    SciTech Connect

    Lonsdale, C.J.; Hacking, P.B.; Conrow, T.P.; Rowan-Robinson, M. (Queen Mary College, London)

    1990-07-01

    The new IRAS Faint Source Survey data base is used to confirm the conclusion of Hacking et al. (1987) that the 60 micron source counts fainter than about 0.5 Jy lie in excess of predictions based on nonevolving model populations. The existence of an anisotropy between the northern and southern Galactic caps discovered by Rowan-Robinson et al. (1986) and Needham and Rowan-Robinson (1988) is confirmed, and it is found to extend below their sensitivity limit to about 0.3 Jy in 60 micron flux density. The count anisotropy at f(60) > 0.3 Jy can be interpreted reasonably as due to the Local Supercluster; however, no single structure accounting for the fainter anisotropy can be easily identified in either optical or far-IR two-dimensional sky distributions. The far-IR galaxy sky distributions are considerably smoother than distributions from the published optical galaxy catalogs. It is likely that structures of the large size discussed here have been discriminated against in earlier studies owing to insufficient volume sampling. 105 refs.

  9. Macro- and microstructural diversity of sea urchin teeth revealed by large-scale micro-computed tomography survey

    NASA Astrophysics Data System (ADS)

    Ziegler, Alexander; Stock, Stuart R.; Menze, Björn H.; Smith, Andrew B.

    2012-10-01

    Sea urchins (Echinodermata: Echinoidea) generally possess an intricate jaw apparatus that incorporates five teeth. Although echinoid teeth consist of calcite, their complex internal design results in biomechanical properties far superior to those of inorganic forms of the constituent material. While the individual elements (or microstructure) of echinoid teeth provide general insight into processes of biomineralization, the cross-sectional shape (or macrostructure) of echinoid teeth is useful for phylogenetic and biomechanical inferences. However, studies of sea urchin tooth macro- and microstructure have traditionally been limited to a few readily available species, effectively disregarding a potentially high degree of structural diversity that could be informative in a number of ways. Having scanned numerous sea urchin species using micro-computed tomography (µCT) and synchrotron µCT, we report a large variation in the macro- and microstructure of sea urchin teeth. In addition, we describe aberrant tooth shapes and apply 3D visualization protocols that permit accelerated visual access to the complex microstructure of sea urchin teeth. Our broad survey identifies key taxa for further in-depth study and integrates previously assembled data on fossil species into a more comprehensive systematic analysis of sea urchin teeth. In order to circumvent imprecise, word-based descriptions of tooth shape, we introduce shape analysis algorithms that permit the numerical, and therefore more objective, description of tooth macrostructure. Finally, we discuss how synchrotron µCT datasets permit virtual models of tooth microstructure to be generated, as well as the simulation of tooth mechanics based on finite element modeling.

  10. Assessing outcomes of large-scale public health interventions in the absence of baseline data using a mixture of Cox and binomial regressions

    PubMed Central

    2014-01-01

    Background Large-scale public health interventions with rapid scale-up are increasingly being implemented worldwide. Such implementation allows a large target population to be reached in a short period of time. But when the time comes to investigate the effectiveness of these interventions, the rapid scale-up creates several methodological challenges, such as the lack of baseline data and the absence of control groups. One example of such an intervention is Avahan, the India HIV/AIDS initiative of the Bill & Melinda Gates Foundation. One question of interest is the effect of Avahan on condom use by female sex workers with their clients. By retrospectively reconstructing condom use and sex work history from survey data, it is possible to estimate how condom use rates evolve over time. However, formal inference about how this rate changes at a given point in calendar time remains challenging. Methods We propose a new statistical procedure based on a mixture of binomial regression and Cox regression. We compare this new method to an existing approach based on generalized estimating equations through simulations and application to Indian data. Results Both methods are unbiased, but the proposed method is more powerful than the existing method, especially when initial condom use is high. When applied to the Indian data, the new method mostly agrees with the existing method, but appears to have corrected some implausible results of the latter in a few districts. We also show how the new method can be used to analyze the data of all districts combined. Conclusions The use of both methods can be recommended for exploratory data analysis. However, for formal statistical inference the new method has better power. PMID:24397563

  11. Large Scale Computing

    NASA Astrophysics Data System (ADS)

    Capiluppi, Paolo

    2005-04-01

    Large Scale Computing is acquiring an important role in the field of data analysis and treatment for many sciences and also for some social activities. The present paper discusses the characteristics of computing when it becomes "Large Scale" and the current state of the art for some particular applications needing such large, distributed resources and organization. High Energy Particle Physics (HEP) experiments are discussed in this respect; in particular, the Large Hadron Collider (LHC) experiments are analyzed. The Computing Models of the LHC experiments represent the current prototype implementation of Large Scale Computing and describe the level of maturity of the possible deployment solutions. Some of the most recent results on the measurements of the performance and functionality of the LHC experiments' testing are discussed.

  12. Research on the Second Region of Sino-German 6 cm Polarization Survey of the Galactic Plane and Large-scale Supernova Remnants

    NASA Astrophysics Data System (ADS)

    Xiao, L.

    2011-11-01

    Polarization observation provides a useful tool to study the properties of the interstellar medium (ISM). It can directly show the orientation of large-scale magnetic fields, helping us understand the structure of the large-scale magnetic field in our Galaxy and the evolution of supernova remnants (SNRs). Moreover, combined with polarization observations at other wavelengths, the Faraday rotation can be used to study the properties of the thermal electron density, filling factor, and regular and random magnetic fields in the ISM and SNRs. Previous polarization measurements, mostly conducted at low frequencies, were significantly influenced by the Faraday effects of the ISM; at 6 cm these effects are much weaker, and polarized emission from larger distances can be detected. By studying Faraday screens, we can explore the physical parameters of the sources as well as the synchrotron emissivities of the Galaxy. The 6 cm total-intensity measurements are key data for clarifying the spectral behavior of diffuse emission or individual objects at high frequencies, and they help us understand the distribution of relativistic electrons, the disk-halo interaction, and the evolution of late-stage SNRs. In August 2009, the 6 cm continuum and polarization survey of the Galactic plane was completed successfully using the 25 m radio telescope at Urumqi. The work presented in this thesis is mainly based on data analysis of the second survey region, with 60° ≤ l ≤ 129° and |b| ≤ 5°. We tried to compensate for the missing large-scale structures by extrapolating the WMAP K-band polarization data with a spectral-index model and a simulation of the rotation measures (RMs). By comparing the maps before and after this "calibration", we studied the extended objects in this region. We analyzed the depolarization structure at the periphery of an HII region complex using a Faraday screen model, and studied the distribution of fluctuations in the entire survey region using structure functions

  13. [The benefit of large-scale cohort studies for health research: the example of the German National Cohort].

    PubMed

    Ahrens, Wolfgang; Jöckel, K-H

    2015-08-01

    The prospective nature of large-scale epidemiological multi-purpose cohort studies with long observation periods facilitates the search for complex causes of diseases, the analysis of the natural history of diseases, and the identification of novel pre-clinical markers of disease. The German National Cohort (GNC) is a population-based, highly standardised and in-depth phenotyped cohort. It is intended to create the basis for new strategies for risk assessment and identification, early diagnosis and prevention of multifactorial diseases. The GNC is the largest population-based cohort study in Germany to date. In 2014 the examination of 200,000 women and men aged 20-69 years started in 18 study centers. The study facilitates the investigation of the etiology of chronic diseases in relation to lifestyle, genetic, socioeconomic, psychosocial and environmental factors. In this way the GNC creates the basis for the development of methods for the early diagnosis and prevention of these diseases. Cardiovascular and respiratory diseases, cancer, diabetes, neurodegenerative and psychiatric diseases, and musculoskeletal and infectious diseases are the focus of this study. Owing to its sheer size, the study might be characterized as a Big Data project; we argue that this is not the case. PMID:26077870

  14. Health Occupations Survey.

    ERIC Educational Resources Information Center

    Willett, Lynn H.

    A survey was conducted to determine the need for health occupations personnel in the Moraine Valley Community College district, specifically to: (1) describe present employment for selected health occupations; (2) project health occupation employment to 1974; (3) identify the supply of applicants for the selected occupations; and (4) identify…

  15. Evolution of clustering length, large-scale bias, and host halo mass at 2 < z < 5 in the VIMOS Ultra Deep Survey (VUDS)⋆

    NASA Astrophysics Data System (ADS)

    Durkalec, A.; Le Fèvre, O.; Pollo, A.; de la Torre, S.; Cassata, P.; Garilli, B.; Le Brun, V.; Lemaux, B. C.; Maccagni, D.; Pentericci, L.; Tasca, L. A. M.; Thomas, R.; Vanzella, E.; Zamorani, G.; Zucca, E.; Amorín, R.; Bardelli, S.; Cassarà, L. P.; Castellano, M.; Cimatti, A.; Cucciati, O.; Fontana, A.; Giavalisco, M.; Grazian, A.; Hathi, N. P.; Ilbert, O.; Paltani, S.; Ribeiro, B.; Schaerer, D.; Scodeggio, M.; Sommariva, V.; Talia, M.; Tresse, L.; Vergani, D.; Capak, P.; Charlot, S.; Contini, T.; Cuby, J. G.; Dunlop, J.; Fotopoulou, S.; Koekemoer, A.; López-Sanjuan, C.; Mellier, Y.; Pforr, J.; Salvato, M.; Scoville, N.; Taniguchi, Y.; Wang, P. W.

    2015-11-01

    We investigate the evolution of galaxy clustering for galaxies in the redshift range 2.0 < z < 5.0 in the VIMOS Ultra Deep Survey (VUDS). We present the projected (real-space) two-point correlation function wp(rp) measured using 3022 galaxies with robust spectroscopic redshifts in two independent fields (COSMOS and VVDS-02h) covering in total 0.8 deg2. We quantify how the scale-dependent clustering amplitude r0 changes with redshift, making use of mock samples to evaluate and correct the survey selection function. Using a power-law model ξ(r) = (r/r0)^-γ we find that the correlation function for the general population is best fit by a model with a clustering length r0 = 3.95+0.48-0.54 h-1 Mpc and slope γ = 1.8+0.02-0.06 at z ~ 2.5, and r0 = 4.35 ± 0.60 h-1 Mpc and γ = 1.6+0.12-0.13 at z ~ 3.5. We use these clustering parameters to derive the large-scale linear galaxy bias bLPL between galaxies and dark matter. We find bLPL = 2.68 ± 0.22 at redshift z ~ 3 (assuming σ8 = 0.8), significantly higher than found at intermediate and low redshifts for similarly general galaxy populations. We fit a halo occupation distribution (HOD) model to the data and obtain an average halo mass at redshift z ~ 3 of Mh = 10^11.75 ± 0.23 h-1 M⊙. From this fit we confirm that the large-scale linear galaxy bias is relatively high at bLHOD = 2.82 ± 0.27. Comparing these measurements with similar measurements at lower redshifts, we infer that the star-forming population of galaxies at z ~ 3 should evolve into the massive and bright (Mr < -21.5) galaxy population, which typically occupies haloes of mass ⟨Mh⟩ = 10^13.9 h-1 M⊙ at redshift z = 0. Based on data obtained with the European Southern Observatory Very Large Telescope, Paranal, Chile, under Large Program 185.A-0791. Appendices are available in electronic form at http://www.aanda.org
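
    As a brief, hedged illustration of the standard relations behind these numbers (the paper's own fitting procedure may differ in detail), a power-law correlation function ξ(r) = (r/r0)^-γ implies a projected correlation function

    \[ w_p(r_p) = r_p \left(\frac{r_0}{r_p}\right)^{\gamma} \frac{\Gamma\!\left(\tfrac{1}{2}\right)\,\Gamma\!\left(\tfrac{\gamma-1}{2}\right)}{\Gamma\!\left(\tfrac{\gamma}{2}\right)}, \]

    and the large-scale linear bias can be estimated as b ≈ σ8,gal(z)/σ8,dm(z), i.e. the ratio of the rms galaxy fluctuation implied by (r0, γ) in spheres of 8 h^-1 Mpc to that of the dark matter at the same redshift.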

  16. Governance of extended lifecycle in large-scale eHealth initiatives: analyzing variability of enterprise architecture elements.

    PubMed

    Mykkänen, Juha; Virkanen, Hannu; Tuomainen, Mika

    2013-01-01

    The governance of large eHealth initiatives requires traceability of many requirements and design decisions. We provide a model which we use to conceptually analyze the variability of several enterprise architecture (EA) elements throughout the extended lifecycle of development goals, drawing on interrelated projects associated with the national ePrescription service in Finland. PMID:23920887

  17. The VIMOS Public Extragalactic Redshift Survey (VIPERS). An unprecedented view of galaxies and large-scale structure at 0.5 < z < 1.2

    NASA Astrophysics Data System (ADS)

    Guzzo, L.; Scodeggio, M.; Garilli, B.; Granett, B. R.; Fritz, A.; Abbas, U.; Adami, C.; Arnouts, S.; Bel, J.; Bolzonella, M.; Bottini, D.; Branchini, E.; Cappi, A.; Coupon, J.; Cucciati, O.; Davidzon, I.; De Lucia, G.; de la Torre, S.; Franzetti, P.; Fumana, M.; Hudelot, P.; Ilbert, O.; Iovino, A.; Krywult, J.; Le Brun, V.; Le Fèvre, O.; Maccagni, D.; Małek, K.; Marulli, F.; McCracken, H. J.; Paioro, L.; Peacock, J. A.; Polletta, M.; Pollo, A.; Schlagenhaufer, H.; Tasca, L. A. M.; Tojeiro, R.; Vergani, D.; Zamorani, G.; Zanichelli, A.; Burden, A.; Di Porto, C.; Marchetti, A.; Marinoni, C.; Mellier, Y.; Moscardini, L.; Nichol, R. C.; Percival, W. J.; Phleps, S.; Wolk, M.

    2014-06-01

    We describe the construction and general features of VIPERS, the VIMOS Public Extragalactic Redshift Survey. This ESO Large Programme is using the Very Large Telescope with the aim of building a spectroscopic sample of ~100 000 galaxies with iAB < 22.5 and 0.5 < z < 1.2. The survey covers a total area of ~24 deg2 within the CFHTLS-Wide W1 and W4 fields. VIPERS is designed to address a broad range of problems in large-scale structure and galaxy evolution, thanks to a unique combination of volume (~5 × 10^7 h^-3 Mpc^3) and sampling rate (~40%), comparable to state-of-the-art surveys of the local Universe, together with extensive multi-band optical and near-infrared photometry. Here we present the survey design, the selection of the source catalogue and the development of the spectroscopic observations. We discuss in detail the overall selection function that results from the combination of the different constituents of the project. This includes the masks arising from the parent photometric sample and the spectroscopic instrumental footprint, together with the weights needed to account for the sampling and success rates of the observations. Using the catalogue of 53 608 galaxy redshifts composing the forthcoming VIPERS Public Data Release 1 (PDR-1), we provide a first assessment of the quality of the spectroscopic data. The stellar contamination is found to be only 3.2%, endorsing the quality of the star-galaxy separation process and fully confirming the original estimates based on the VVDS data, which also indicate a galaxy incompleteness from this process of only 1.4%. Using a set of 1215 repeated observations, we estimate an rms redshift error σz/(1 + z) = 4.7 × 10^-4 and calibrate the internal spectral quality grading. Benefiting from the combination of size and detailed sampling of this dataset, we conclude by presenting a map showing in unprecedented detail the large-scale distribution of galaxies between 5 and 8 billion years ago. Based on observations

  18. Understanding Uncertainties in Non-Linear Population Trajectories: A Bayesian Semi-Parametric Hierarchical Approach to Large-Scale Surveys of Coral Cover

    PubMed Central

    Vercelloni, Julie; Caley, M. Julian; Kayal, Mohsen; Low-Choy, Samantha; Mengersen, Kerrie

    2014-01-01

    Recently, attempts to improve decision making in species management have focussed on uncertainties associated with modelling temporal fluctuations in populations. Reducing model uncertainty is challenging; while larger samples improve estimation of species trajectories and reduce statistical errors, they typically amplify variability in observed trajectories. In particular, traditional modelling approaches aimed at estimating population trajectories usually do not account well for nonlinearities and uncertainties associated with multi-scale observations characteristic of large spatio-temporal surveys. We present a Bayesian semi-parametric hierarchical model for simultaneously quantifying uncertainties associated with model structure and parameters, and scale-specific variability over time. We estimate uncertainty across a four-tiered spatial hierarchy of coral cover from the Great Barrier Reef. Coral variability is well described; however, our results show that, in the absence of additional model specifications, conclusions regarding coral trajectories become highly uncertain when considering multiple reefs, suggesting that management should focus more at the scale of individual reefs. The approach presented facilitates the description and estimation of population trajectories and associated uncertainties when variability cannot be attributed to specific causes and origins. We argue that our model can unlock value contained in large-scale datasets, provide guidance for understanding sources of uncertainty, and support better informed decision making. PMID:25364915

  19. Estimates of occupational safety and health impacts resulting from large-scale production of major photovoltaic technologies

    SciTech Connect

    Owens, T.; Ungers, L.; Briggs, T.

    1980-08-01

    The purpose of this study is to estimate, both quantitatively and qualitatively, the worker and societal risks attributable to four photovoltaic cell (solar cell) production processes. Quantitative risk values were determined using statistics from the California semiconductor industry. The qualitative risk assessment was performed using a variety of both governmental and private sources of data. The occupational health statistics derived from the semiconductor industry were used to predict injury and fatality levels associated with photovoltaic cell manufacturing. The use of these statistics to characterize the two silicon processes described herein is defensible from the standpoint that many of the same process steps and materials are used in both the semiconductor and photovoltaic industries. These health statistics are less applicable to the gallium arsenide and cadmium sulfide manufacturing processes, primarily because of differences in the materials utilized. Although such differences tend to discourage any absolute comparisons among the four photovoltaic cell production processes, certain relative comparisons are warranted. To facilitate a risk comparison of the four processes, the number and severity of process-related chemical hazards were assessed. This qualitative hazard assessment addresses both the relative toxicity and the exposure potential of substances in the workplace. In addition to the worker-related hazards, estimates of process-related emissions and wastes are also provided.

  20. Public appraisal of government efforts and participation intent in medico-ethical policymaking in Japan: a large scale national survey concerning brain death and organ transplant

    PubMed Central

    Sato, Hajime; Akabayashi, Akira; Kai, Ichiro

    2005-01-01

    Background Public satisfaction with the policy process influences the legitimacy and acceptance of policies, and conditions the future political process, especially when contending ethical value judgments are involved. On the other hand, public involvement is required if effective policy is to be developed and accepted. Methods Using data from a large-scale national opinion survey, this study evaluates public appraisal of past government efforts to legalize organ transplant from brain-dead bodies in Japan, and examines the public's intent to participate in future policymaking. Results A relatively large percentage of people became aware of the issue when government actions were initiated, and many increasingly formed their own opinions on the policy in question. However, a significant number (43.3%) remained unaware of any legislative efforts, and only 26.3% of those who were aware provided positive appraisals of the policymaking process. Furthermore, a majority of respondents (61.8%) indicated unwillingness to participate in future policy discussions of bioethical issues. Multivariate analysis revealed that the following factors are associated with positive appraisals of policy development: greater age; earlier opinion formation; and familiarity with donor cards. Factors associated with likelihood of future participation in policy discussion include younger age, earlier attention to the issue, and knowledge of past government efforts. Those unwilling to participate cited as their reasons that experts are more knowledgeable and that the issues are too complex. Conclusions Results of an opinion survey in Japan were presented, and a set of factors statistically associated with them was discussed. Further efforts to improve the policy-making process on bioethical issues are desirable. PMID:15661080

  1. Testing deviations from ΛCDM with growth rate measurements from six large-scale structure surveys at z = 0.06-1

    NASA Astrophysics Data System (ADS)

    Alam, Shadab; Ho, Shirley; Silvestri, Alessandra

    2016-03-01

    We use measurements from the Planck satellite mission and galaxy redshift surveys over the last decade to test three of the basic assumptions of the standard model of cosmology, ΛCDM (Λ cold dark matter): the spatial curvature of the universe, the nature of dark energy, and the laws of gravity on large scales. We obtain improved constraints on several scenarios that violate one or more of these assumptions. We measure w0 = -0.94 ± 0.17 (an 18 per cent measurement) and 1 + wa = 1.16 ± 0.36 (a 31 per cent measurement) for models with a time-dependent equation of state, which is an improvement over current best constraints. In the context of modified gravity, we consider popular scalar-tensor models as well as a parametrization of the growth factor. In the case of one-parameter f(R) gravity models with a ΛCDM background, we constrain B0 < 1.36 × 10^-5 (1σ C.L.), which is an improvement by a factor of 4 on the current best constraint. We provide the very first constraint on the coupling parameters of general scalar-tensor theory and a stringent constraint on the only free coupling parameter of Chameleon models. We also derive constraints on extended Chameleon models, improving the constraint on the coupling by a factor of 6 over the current best. The constraints on the coupling parameter for the Chameleon model rule out the value of β1 = 4/3 required for f(R) gravity. We also measure γ = 0.612 ± 0.072 (an 11.7 per cent measurement) for the growth-index parametrization. We improve on all current constraints by combining results from various galaxy redshift surveys in a coherent way, which includes a careful treatment of the scale dependence introduced by modified gravity.
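
    For orientation, the standard parametrizations matching the quantities quoted above (an assumption about, not a quotation of, the paper's exact conventions) are usually written as

    \[ w(a) = w_0 + w_a\,(1-a), \qquad f(a) \equiv \frac{d\ln D}{d\ln a} \simeq \Omega_m(a)^{\gamma}, \]

    where D is the linear growth factor and γ ≈ 0.55 is the value expected in general relativity, so the quoted γ = 0.612 ± 0.072 is consistent with GR at roughly the 1σ level.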

  2. The Big Drink Debate: perceptions of the impact of price on alcohol consumption from a large scale cross-sectional convenience survey in north west England

    PubMed Central

    2011-01-01

    Background A large-scale survey was conducted in 2008 in north west England, a region with high levels of alcohol-related harm, during a regional 'Big Drink Debate' campaign. The aim of this paper is to explore perceptions of how alcohol consumption would change if alcohol prices were to increase or decrease. Methods A convenience survey of residents (≥ 18 years) of north west England measured demographics, income, alcohol consumption in the previous week, and opinions on how drinking behaviour would change under two pricing conditions, lower prices and discounts versus increased alcohol prices (response options: 'decrease', 'no change' or 'increase'). Multinomial logistic regression used three outcomes: 'completely elastic' (consider that lower prices increase drinking and higher prices decrease drinking); 'lower price elastic' (lower prices increase drinking, higher prices have no effect); and 'price inelastic' (no change for either). Results Of 22,780 drinkers surveyed, 80.3% considered that lower alcohol prices and discounts would increase alcohol consumption, while 22.1% thought raising prices would decrease consumption, making lower price elasticity only (i.e. lower prices increase drinking, higher prices have no effect) the most common outcome (62%). Compared to a high income/high drinking category, the lightest drinkers with a low income (adjusted odds ratio AOR = 1.78, 95% confidence intervals CI 1.38-2.30) or medium income (AOR = 1.88, CI 1.47-2.41) were most likely to be lower price elastic. Females were more likely than males to be lower price elastic (65% vs 57%), while the reverse was true for complete elasticity (20% vs 26%, P < 0.001). Conclusions Lower pricing increases alcohol consumption, and the alcohol industry's continued focus on discounting sales encourages higher drinking levels. International evidence suggests increasing the price of alcohol reduces consumption, and one in five of the surveyed population agreed; more work is required to increase this agreement to achieve public
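
    To make the analytic step concrete, the following is a minimal sketch of a multinomial logistic regression of the kind described, fitted to synthetic data; the variable names, outcome coding and data are invented for illustration and are not taken from the survey.

    # Hedged sketch: multinomial logistic regression on synthetic data.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 1000
    df = pd.DataFrame({
        "female": rng.integers(0, 2, n),           # hypothetical covariates
        "low_income": rng.integers(0, 2, n),
        "units_last_week": rng.poisson(8, n),
    })
    # Outcome coding: 0 = price inelastic, 1 = lower-price elastic, 2 = completely elastic
    df["outcome"] = rng.choice([0, 1, 2], size=n, p=[0.18, 0.62, 0.20])

    X = sm.add_constant(df[["female", "low_income", "units_last_week"]])
    fit = sm.MNLogit(df["outcome"], X).fit(disp=False)
    print(np.exp(fit.params))   # adjusted odds ratios relative to the baseline category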

  3. Health risks from large-scale water pollution: Current trends and implications for improving drinking water quality in the lower Amu Darya drainage basin, Uzbekistan

    NASA Astrophysics Data System (ADS)

    Törnqvist, Rebecka; Jarsjö, Jerker

    2010-05-01

    Safe drinking water is a primary prerequisite for human health, well-being and development. Yet roughly one billion people around the world lack access to a safe drinking water supply. Health risk assessments are effective for evaluating the suitability of using various water sources as drinking water supplies. Additionally, knowledge of pollutant transport processes on relatively large scales is needed to identify effective management strategies for improving water resources of poor quality. The lower Amu Darya drainage basin close to the Aral Sea in Uzbekistan suffers from physical water scarcity and poor water quality. This is mainly due to the intensive agricultural production in the region, which requires extensive freshwater withdrawals and the use of fertilizers and pesticides. In addition, recurrent droughts in the region affect surface water availability. On average, 20% of the population in rural areas of Uzbekistan lack access to improved drinking water sources, and the situation is even more severe in the lower Amu Darya basin. In this study, we consider health risks related to water-borne contaminants by dividing measured substance concentrations by health-risk-based guideline values from the World Health Organisation (WHO). In particular, we analyse novel results of water quality measurements performed in 2007 and 2008 in the Mejdurechye Reservoir (located in the downstream part of the Amu Darya river basin). We furthermore identify large-scale trends by comparing the Mejdurechye results to reported water quality results from a considerable stretch of the Amu Darya river basin, including drainage water, river water and groundwater. The results show that concentrations of cadmium and nitrite exceed the WHO health-risk-based guideline values in the Mejdurechye Reservoir. Furthermore, concentrations of the long-banned and highly toxic pesticides dichlorodiphenyltrichloroethane (DDT) and γ-hexachlorocyclohexane (γ-HCH) were detected in
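
    In formula form, the screening metric described above is commonly called a hazard quotient, with an optional summed hazard index; this is a standard convention consistent with the approach described, not necessarily the authors' exact notation:

    \[ HQ_i = \frac{C_i}{GV_i}, \qquad HI = \sum_i HQ_i, \]

    where C_i is the measured concentration of substance i and GV_i the corresponding WHO health-risk-based guideline value; HQ_i > 1 flags a potential health concern.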

  4. Seismic texture and amplitude analysis of large scale fluid escape pipes using time lapses seismic surveys: examples from the Loyal Field (Scotland, UK)

    NASA Astrophysics Data System (ADS)

    Maestrelli, Daniele; Jihad, Ali; Iacopini, David; Bond, Clare

    2016-04-01

    ) affected by large-scale fractures (semblance image) and seem consistent with a suspended, non-fluidized mud/sand mixture flow. Near-, middle- and far-offset amplitude analysis confirms that most of the amplitude anomalies within the pipe conduits and termini are only partly related to gas. An interpretation of the possible textures observed is proposed, with a discussion of the noise and artefacts induced by resolution and migration problems. Possible formation mechanisms for these pipes are discussed.

  5. Analysis and Modeling of Threatening Factors of Workforce’s Health in Large-Scale Workplaces: Comparison of Four-Fitting Methods to select optimum technique

    PubMed Central

    Mohammadfam, Iraj; Soltanzadeh, Ahmad; Moghimbeigi, Abbas; Savareh, Behrouz Alizadeh

    2016-01-01

    Introduction Workforce is one of the pillars of development in any country. Therefore, the workforce’s health is very important, and analyzing the factors that threaten it is one of the fundamental steps in health planning. This study was the first part of a comprehensive study aimed at comparing fitting methods for analyzing and modeling the factors threatening health in occupational injuries. Methods In this study, 980 occupational injuries in 10 Iranian large-scale workplaces over 10 years (2005-2014) were analyzed and modeled using four fitting methods: linear regression, regression analysis, the generalized linear model, and artificial neural networks (ANN), using IBM SPSS Modeler 14.2. Results The Accident Severity Rate (ASR) of occupational injuries was 557.47 ± 397.87. The results showed that the mean age and work experience of injured workers were 27.82 ± 5.23 and 4.39 ± 3.65 years, respectively. Analysis of health-threatening factors showed that several factors, including age, quality of provided H&S training, number of workers, hazard identification (HAZID), periodic risk assessment, and periodic H&S training, were important factors affecting ASR. In addition, the comparison of the four fitting methods showed that the correlation coefficient of ANN (R = 0.968) was the highest and the relative error of ANN (R.E = 0.063) the lowest among the fitting methods. Conclusion The findings of the present study indicated that, despite the suitability and effectiveness of all fitting methods in analyzing the severity of occupational injuries, ANN is the best fitting method for modeling the factors threatening a workforce’s health. Furthermore, all fitting methods, especially ANN, should be considered more in analyzing and modeling occupational injuries and health-threatening factors, as well as in planning to maintain and improve the workforce’s health. PMID:27053999
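
    As an illustrative sketch only (synthetic data, invented feature names, and scikit-learn in place of the IBM SPSS Modeler used in the study), the following compares a linear model and a small neural network using the two figures of merit reported above:

    # Hedged sketch: comparing regression and ANN fits on synthetic injury-severity data.
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    n = 980
    X = np.column_stack([
        rng.normal(28, 5, n),        # worker age (hypothetical)
        rng.normal(4.4, 3.6, n),     # work experience in years (hypothetical)
        rng.integers(0, 2, n),       # periodic H&S training provided (hypothetical)
    ])
    y = 600 - 8 * X[:, 1] - 50 * X[:, 2] + rng.normal(0, 100, n)   # synthetic ASR

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    for name, model in [("linear", LinearRegression()),
                        ("ann", MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0))]:
        pred = model.fit(X_tr, y_tr).predict(X_te)
        r = np.corrcoef(y_te, pred)[0, 1]                          # correlation coefficient R
        rel_err = np.mean(np.abs(pred - y_te) / np.abs(y_te))      # relative error
        print(name, round(r, 3), round(rel_err, 3))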

  6. Large scale tracking algorithms.

    SciTech Connect

    Hansen, Ross L.; Love, Joshua Alan; Melgaard, David Kennett; Karelitz, David B.; Pitts, Todd Alan; Zollweg, Joshua David; Anderson, Dylan Z.; Nandy, Prabal; Whitlow, Gary L.; Bender, Daniel A.; Byrne, Raymond Harry

    2015-01-01

    Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination but will unfortunately also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied to detection algorithms also changes. For low-resolution sensors, "blob" tracking is the norm. For higher-resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.

  7. Large scale traffic simulations

    SciTech Connect

    Nagel, K.; Barrett, C.L.; Rickert, M.

    1997-04-01

    Large-scale microscopic (i.e. vehicle-based) traffic simulations pose high demands on computational speed in at least two application areas: (i) real-time traffic forecasting, and (ii) long-term planning applications (where repeated "looping" between the microsimulation and the simulated planning of individual persons' behavior is necessary). As a rough number, a real-time simulation of an area such as Los Angeles (ca. 1 million travellers) will need a computational speed much higher than 1 million "particle" (= vehicle) updates per second. This paper reviews how this problem is approached in different projects and how these approaches depend both on the specific questions and on the prospective user community. The approaches range from highly parallel and vectorizable, single-bit implementations on parallel supercomputers for Statistical Physics questions, via more realistic implementations on coupled workstations, to more complicated driving dynamics implemented again on parallel supercomputers. 45 refs., 9 figs., 1 tab.

  8. The large-scale distribution of galaxies

    NASA Technical Reports Server (NTRS)

    Geller, Margaret J.

    1989-01-01

    The spatial distribution of galaxies in the universe is characterized on the basis of the six completed strips of the Harvard-Smithsonian Center for Astrophysics redshift-survey extension. The design of the survey is briefly reviewed, and the results are presented graphically. Vast low-density voids similar to the void in Bootes are found, almost completely surrounded by thin sheets of galaxies. Also discussed are the implications of the results for the survey sampling problem, the two-point correlation function of the galaxy distribution, the possibility of detecting large-scale coherent flows, theoretical models of large-scale structure, and the identification of groups and clusters of galaxies.

  9. Self-Assessments or Tests? Comparing Cross-National Differences in Patterns and Outcomes of Graduates' Skills Based on International Large-Scale Surveys

    ERIC Educational Resources Information Center

    Humburg, Martin; van der Velden, Rolf

    2015-01-01

    In this paper an analysis is carried out of whether objective tests and subjective self-assessments in international large-scale studies yield similar results when looking at cross-national differences in the effects of skills on earnings, and at skills patterns across countries, fields of study and gender. The findings indicate that subjective skills…

  10. A Numeric Scorecard Assessing the Mental Health Preparedness for Large-Scale Crises at College and University Campuses: A Delphi Study

    ERIC Educational Resources Information Center

    Burgin, Rick A.

    2012-01-01

    Large-scale crises continue to surprise, overwhelm, and shatter college and university campuses. While the devastation to physical plants and persons is often evident and is addressed with crisis management plans, the number of emotional casualties left in the wake of these large-scale crises may not be apparent and are often not addressed with…

  11. The Development of the Older Persons and Informal Caregivers Survey Minimum DataSet (TOPICS-MDS): A Large-Scale Data Sharing Initiative

    PubMed Central

    Lutomski, Jennifer E.; Baars, Maria A. E.; Schalk, Bianca W. M.; Boter, Han; Buurman, Bianca M.; den Elzen, Wendy P. J.; Jansen, Aaltje P. D.; Kempen, Gertrudis I. J. M.; Steunenberg, Bas; Steyerberg, Ewout W.; Olde Rikkert, Marcel G. M.; Melis, René J. F.

    2013-01-01

    Introduction In 2008, the Ministry of Health, Welfare and Sport commissioned the National Care for the Elderly Programme. While numerous research projects in older persons’ health care were to be conducted under this national agenda, the Programme further advocated the development of The Older Persons and Informal Caregivers Survey Minimum DataSet (TOPICS-MDS), which would be integrated into all funded research protocols. In this context, we describe the TOPICS data sharing initiative (www.topics-mds.eu). Materials and Methods A working group drafted the TOPICS-MDS prototype, which was subsequently approved by a multidisciplinary panel. Using instruments validated for older populations, information was collected on demographics, morbidity, quality of life, functional limitations, mental health, social functioning and health service utilisation. For informal caregivers, information was collected on demographics, hours of informal care and quality of life (including subjective care-related burden). Results Between 2010 and 2013, a total of 41 research projects contributed data to TOPICS-MDS, resulting in preliminary data available for 32,310 older persons and 3,940 informal caregivers. The majority of studies sampled were from primary care settings and inclusion criteria differed across studies. Discussion TOPICS-MDS is a public data repository which contains essential data to better understand health challenges experienced by older persons and informal caregivers. Such findings are relevant for countries where increasing health-related expenditure has necessitated the evaluation of contemporary health care delivery. Although open sharing of data can be difficult to achieve in practice, proactively addressing issues of data protection, conflicting data analysis requests and funding limitations during the TOPICS-MDS developmental phase has fostered a data sharing culture. To date, TOPICS-MDS has been successfully incorporated into 41 research projects, thus supporting the

  12. NATIONAL PREGNANCY AND HEALTH SURVEY

    EPA Science Inventory

    The National Pregnancy and Health Survey conducted by NIDA is a nationwide hospital survey to determine the extent of drug abuse among pregnant women in the United States. The primary objective of the National Pregnancy and Health Survey (NPHS) was to produce national annual esti...

  13. Effectiveness of a large-scale health and nutritional education program on anemia in children younger than 5 years in Shifang, a heavily damaged area of Wenchuan earthquake.

    PubMed

    Yang, Fan; Wang, Chuan; Yang, Hui; Yang, Huiming; Yang, Sufei; Yu, Tao; Tang, Zhanghui; Ji, Qiaoyun; Li, Fengyi; Shi, Hua; Mao, Meng

    2015-03-01

    This study aimed to explore an ideal way to prevent anemia among children younger than 5 years after disasters, especially when health care facilities are insufficient. A preliminary survey was carried out involving 13 065 children younger than 5 years. Pretested questionnaires were used for data collection and hemoglobin levels were measured. After a 12-month intervention, an impact survey involving 2769 children was conducted. Results showed improvements in both feeding knowledge and practices related to anemia. The overall prevalence of anemia decreased from 14.3% to 7.8% (P < .001), and the severity of anemia also declined. The hemoglobin concentration increased significantly from 118.8 ± 10.5 to 122.0 ± 9.9 g/L (P < .001). Thus, health and nutritional education, with multiparty cooperation, could be an ideal way to combat anemia after disasters, especially in less developed areas. The methods and experience of this study may be well worth learning from and implementing. PMID:23536239

  14. A large scale survey of trace metal levels in coastal waters of the Western Mediterranean basin using caged mussels (Mytilus galloprovincialis).

    PubMed

    Benedicto, José; Andral, Bruno; Martínez-Gómez, Concepción; Guitart, Carlos; Deudero, Salud; Cento, Alessandro; Scarpato, Alfonso; Caixach, Josep; Benbrahim, Samir; Chouba, Lassaad; Boulahdid, Mostefa; Galgani, François

    2011-05-01

    A large-scale study of trace metal contamination (Hg, Cd, Pb and Ni) by means of caged mussels (Mytilus galloprovincialis) was undertaken along the coastal waters of the Western Mediterranean Sea within the context of the MYTILOS project. Individual mussels from a homogeneous population (shell size 50 ± 5 mm) obtained from an aquaculture farm were consecutively caged and deployed at 123 sites located in the Alborán, North-Western, South-Western and Tyrrhenian sub-basins for 12 weeks (April-July) in 2004, 2005 and 2006. After cage recovery, both the metal content in the whole mussel tissue and the allometric parameters were measured. Statistical analysis of the datasets showed significant differences between sub-basins for some metal concentrations and for the mussel condition index (CI). Linear regression models coupled to the CI were revisited for the adjustment of certain trace metal data (Hg, Cd and Ni), and four level categories were statistically derived to facilitate interregional comparison. Seawater masses surrounding coastal areas impacted by run-off from mineralised coastal land and by industrial activities displayed the highest concentration ranges (Hg: 0.15-0.31 mg kg(-1) dw; Cd: 1.97-2.11; Ni: 2.18-3.20 and Pb: 3.1-3.8), although the levels obtained at most of the sites fell within the moderate or low categories and could be considered baseline concentrations. However, a few sites currently considered to be little influenced by human activities showed high concentrations of Cd, Ni and Pb, and these constitute new areas of concern. Overall, the active biomonitoring (ABM) approach made it possible to investigate trace metal contamination in order to support policy makers in establishing regional strategies (particularly with regard to the European Marine Strategy Directive). PMID:21384032

  15. Is cost-related non-collection of prescriptions associated with a reduction in health? Findings from a large-scale longitudinal study of New Zealand adults

    PubMed Central

    Jatrana, Santosh; Richardson, Ken; Norris, Pauline; Crampton, Peter

    2015-01-01

    Objective To investigate whether cost-related non-collection of prescription medication is associated with a decline in health. Settings New Zealand Survey of Family, Income and Employment (SoFIE)-Health. Participants Data from 17 363 participants with at least two observations in three waves (2004–2005, 2006–2007, 2008–2009) of a panel study were analysed using fixed effects regression modelling. Primary outcome measures Self-rated health (SRH), physical health (PCS) and mental health scores (MCS) were the health measures used in this study. Results After adjusting for time-varying confounders, non-collection of prescription items was associated with a 0.11 (95% CI 0.07 to 0.15) unit worsening in SRH, a 1.00 (95% CI 0.61 to 1.40) unit decline in PCS and a 1.69 (95% CI 1.19 to 2.18) unit decline in MCS. The interaction of the main exposure with gender was significant for SRH and MCS. Non-collection of prescription items was associated with a decline in SRH of 0.18 (95% CI 0.11 to 0.25) units for males and 0.08 (95% CI 0.03 to 0.13) units for females, and a decrease in MCS of 2.55 (95% CI 1.67 to 3.42) and 1.29 (95% CI 0.70 to 1.89) units for males and females, respectively. The interaction of the main exposure with age was significant for SRH. For respondents aged 15–24 and 25–64 years, non-collection of prescription items was associated with a decline in SRH of 0.12 (95% CI 0.03 to 0.21) and 0.12 (95% CI 0.07 to 0.17) units, respectively, but for respondents aged 65 years and over, non-collection of prescription items had no significant effect on SRH. Conclusion Our results show that those who do not collect prescription medications because of cost have an increased risk of a subsequent decline in health. PMID:26553826
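
    The following is a minimal sketch of the kind of fixed-effects (within-person) regression described, fitted to invented data; the variable names, effect sizes and the simple demeaning estimator are illustrative assumptions, not the study's actual specification.

    # Hedged sketch: within-person fixed-effects estimate on synthetic panel data.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    people, waves = 500, 3
    df = pd.DataFrame({
        "pid": np.repeat(np.arange(people), waves),
        "non_collection": rng.integers(0, 2, people * waves),
    })
    df["srh"] = 3.5 - 0.11 * df["non_collection"] + rng.normal(0, 0.5, len(df))

    # Within transformation: demean outcome and exposure by person, then run OLS.
    cols = ["srh", "non_collection"]
    demeaned = df[cols] - df.groupby("pid")[cols].transform("mean")
    fe = sm.OLS(demeaned["srh"], demeaned[["non_collection"]]).fit()
    print(fe.params)   # point estimate of the within-person effect (SEs need a df correction)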

  16. What Sort of Girl Wants to Study Physics after the Age of 16? Findings from a Large-Scale UK Survey

    ERIC Educational Resources Information Center

    Mujtaba, Tamjid; Reiss, Michael J.

    2013-01-01

    This paper investigates the characteristics of 15-year-old girls who express an intention to study physics post-16. This paper unpacks issues around within-girl group differences and similarities between boys and girls in survey responses about physics. The analysis is based on the year 10 (age 15 years) responses of 5,034 students from 137 UK…

  17. A Short Survey on the State of the Art in Architectures and Platforms for Large Scale Data Analysis and Knowledge Discovery from Data

    SciTech Connect

    Begoli, Edmon

    2012-01-01

    Intended as a survey for practicing architects and researchers seeking an overview of the state-of-the-art architectures for data analysis, this paper provides an overview of the emerging data management and analytic platforms, including parallel databases, Hadoop-based systems, High Performance Computing (HPC) platforms and platforms popularly referred to as NoSQL platforms. Platforms are presented based on their relevance, the analyses they support and the data organization model they use.

  18. Very Large Scale Integration (VLSI).

    ERIC Educational Resources Information Center

    Yeaman, Andrew R. J.

    Very Large Scale Integration (VLSI), the state-of-the-art production techniques for computer chips, promises such powerful, inexpensive computing that, in the future, people will be able to communicate with computer devices in natural language or even speech. However, before full-scale VLSI implementation can occur, certain salient factors must be…

  19. Relationship between overactive bladder and irritable bowel syndrome: a large-scale internet survey in Japan using the overactive bladder symptom score and Rome III criteria

    PubMed Central

    Matsumoto, Seiji; Hashizume, Kazumi; Wada, Naoki; Hori, Jyunichi; Tamaki, Gaku; Kita, Masafumi; Iwata, Tatsuya; Kakizaki, Hidehiro

    2013-01-01

    What's known on the subject? and What does the study add? There is known to be an association between overactive bladder (OAB) and irritable bowel syndrome (IBS). The study investigates the association between OAB and IBS using an internet-based survey in Japan. It is the first to investigate the prevalence and severity of OAB in the general population using the OAB symptom score questionnaire. Objective To investigate the association between overactive bladder (OAB) and irritable bowel syndrome (IBS) by using an internet-based survey in Japan. Subjects and Methods Questionnaires were sent via the internet to Japanese adults. The overactive bladder symptom score was used for screening OAB, and the Japanese version of the Rome III criteria for the diagnosis of IBS was used for screening this syndrome. Results The overall prevalence of OAB and IBS was 9.3% and 21.2%, respectively. Among the subjects with OAB, 33.3% had concurrent IBS. The prevalence of OAB among men was 9.7% and among women it was 8.9%, while 18.6% of men and 23.9% of women had IBS. Concurrent IBS was noted in 32.0% of men and 34.8% of women with OAB. Conclusion Taking into account a high rate of concurrent IBS in patients with OAB, it seems to be important for physicians to assess the defaecation habits of patients when diagnosing and treating OAB. PMID:23106867

  20. Prevalence of HIV among MSM in Europe: comparison of self-reported diagnoses from a large scale internet survey and existing national estimates

    PubMed Central

    2012-01-01

    Background Country-level comparison of HIV prevalence among men having sex with men (MSM) is challenging for a variety of reasons, including differences in the definition and measurement of the denominator group, recruitment strategies and HIV detection methods. To assess their comparability, self-reported data on HIV diagnoses in a 2010 pan-European MSM internet survey (EMIS) were compared with pre-existing estimates of HIV prevalence in MSM from a variety of European countries. Methods The first pan-European survey of MSM recruited more than 180,000 men from 38 countries across Europe and included questions on the year and result of the last HIV test. HIV prevalence as measured in EMIS was compared with national estimates of HIV prevalence based on studies using biological measurements or modelling approaches, to explore the degree of agreement between the different methods. Existing estimates were taken from Dublin Declaration Monitoring Reports or UNAIDS country fact sheets, and were verified by contacting the nominated contact points for HIV surveillance in EU/EEA countries. Results The EMIS self-reported measurements of HIV prevalence were strongly correlated with existing estimates based on biological measurement and on modelling studies using surveillance data (R2 = 0.70 and 0.72, respectively). In most countries HIV-positive MSM appeared disproportionately likely to participate in EMIS, and prevalences as measured in EMIS are approximately twice the pre-existing estimates. Conclusions Comparison of diagnosed HIV prevalence as measured in EMIS with pre-existing estimates based on biological measurements using varied sampling frames (e.g. Respondent Driven Sampling, Time and Location Sampling) demonstrates a high correlation and suggests similar selection biases in both types of studies. For comparison with modelled estimates, the self-selection bias of the internet survey, with increased participation of men diagnosed with HIV, has to be taken into account. For

  1. First measurements of the scope for growth (SFG) in mussels from a large scale survey in the North-Atlantic Spanish coast.

    PubMed

    Albentosa, Marina; Viñas, Lucía; Besada, Victoria; Franco, Angeles; González-Quijano, Amelia

    2012-10-01

    SFG and physiological rates were measured in wild mussels from the Spanish Marine Pollution monitoring program (SMP) in order to determine seawater quality. The SMP consists of 41 stations covering almost 2500 km of coast, making it the widest-ranging monitoring network in the Iberian Peninsula's Atlantic region. Results are presented from the 2007 and 2008 surveys, in which 39 sites were sampled (20 in 2007 and 19 in 2008, with 8 sites sampled in both years). Chemical analyses were carried out to determine the relationships between physiological rates and the accumulation of toxic compounds. Data presented are the first to become available on the use of SFG as a biomarker of the marine environment on a large spatial scale (>1000 km) along Spain's Atlantic seaboard. SFG values enable significant differences to be established between the areas sampled and between the two years surveyed. The integration of biological and chemical data suggests that certain organochlorine compounds, namely chlordanes and DDTs, may have a negative effect on SFG, although such an effect is of a lesser magnitude than that associated with certain biological parameters such as condition index and mussel age. These variables act as confounding factors when attempting to determine the effect of chemical compounds present in the marine environment on mussel SFG. Further research is therefore needed on the relation between these confounding factors and SFG in order to apply the relevant corrective strategies to enable this index to be used in monitoring programs. The effect of these confounding factors is more clearly revealed in studies that cover a wide-ranging spatial and time scale, such as those carried out within the SMP. These results do not invalidate the use of biological data in monitoring programs, but rather point to the need to analyze all the factors affecting each biological process. PMID:22885349

  2. A 1.85-m mm-submm Telescope for Large-Scale Molecular Gas Surveys in 12CO, 13CO, and C18O (J = 2-1)

    NASA Astrophysics Data System (ADS)

    Onishi, Toshikazu; Nishimura, Atsushi; Ota, Yuya; Hashizume, Akio; Kojima, Yoshiharu; Minami, Akihito; Tokuda, Kazuki; Touga, Shiori; Abe, Yasuhiro; Kaiden, Masahiro; Kimura, Kimihiro; Muraoka, Kazuyuki; Maezawa, Hiroyuki; Ogawa, Hideo; Dobashi, Kazuhito; Shimoikura, Tomomi; Yonekura, Yoshinori; Asayama, Shin'ichiro; Handa, Toshihiro; Nakajima, Taku; Noguchi, Takashi; Kuno, Nario

    2013-08-01

    We have developed a new mm-submm telescope with a diameter of 1.85 m installed at the Nobeyama Radio Observatory. The scientific goal is to precisely reveal the physical properties of molecular clouds in the Milky Way Galaxy by obtaining a large-scale distribution of molecular gas, which can also be compared with large-scale observations at various wavelengths. The target frequency is ~230 GHz; simultaneous observations of the J = 2-1 rotational lines of three carbon monoxide isotopologues (12CO, 13CO, C18O) are achieved with a beam size (HPBW) of 2.7'. In order to accomplish the simultaneous observations, we have developed waveguide-type sideband-separating SIS mixers to obtain spectra separately in the upper and lower sidebands. A Fourier digital spectrometer with a 1 GHz bandwidth and 16384 channels is installed, and the bandwidth of the spectrometer is divided into three parts, one for each of the three spectra; the IF system has been designed to inject these three lines into the spectrometer. A flexible observation system was created mainly in Python on Linux PCs, enabling efficient OTF (On-The-Fly) scans for large-area mapping. The telescope is enclosed in a membrane-covered radome to prevent harmful effects of sunlight, strong wind, and precipitation, to minimize errors in the telescope pointing, and to stabilize the receiver and the IF devices. Science operation started in November 2011, resulting in large-scale surveys of the Orion A/B clouds, Cygnus OB7, the Galactic plane, Taurus, and other regions. We also updated the receiver system for dual-polarization observations.
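
    The observation system mentioned above is written mainly in Python and drives On-The-Fly (OTF) mapping. As a rough illustration of what an OTF raster scan involves, the sketch below generates a boustrophedon grid of dump positions over a rectangular map; the function, its parameters and the spacings are assumptions for illustration and are not taken from the 1.85-m telescope software.

      # Schematic OTF raster-scan generator (illustrative only; not the
      # telescope's actual observation system).
      import numpy as np

      def otf_raster(center_x, center_y, width, height, row_step, scan_speed, dump_time):
          """Yield (x, y) map offsets in degrees for a boustrophedon OTF scan."""
          sample_step = scan_speed * dump_time          # spacing between spectrum dumps
          n_rows = int(np.ceil(height / row_step)) + 1
          n_cols = int(np.ceil(width / sample_step)) + 1
          for i in range(n_rows):
              y = center_y - height / 2 + i * row_step
              xs = center_x - width / 2 + np.arange(n_cols) * sample_step
              if i % 2 == 1:                            # reverse direction on alternate rows
                  xs = xs[::-1]
              for x in xs:
                  yield x, y

      # Example: a 1 deg x 1 deg map, rows spaced by 1', dumps every ~30''.
      points = list(otf_raster(83.8, -5.4, 1.0, 1.0, 1/60, 1/60, 0.5))
      print(len(points), "dump positions")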

  3. IRAM 30 m large scale survey of {sup 12}CO(2-1) and {sup 13}CO(2-1) emission in the Orion molecular cloud

    SciTech Connect

    Berné, O.; Cernicharo, J.; Marcelino, N.

    2014-11-01

    Using the IRAM 30 m telescope, we have surveyed a 1 × 0.°8 part of the Orion molecular cloud in the 12CO and 13CO (2-1) lines with a maximal spatial resolution of ~11'' and spectral resolution of ~0.4 km s-1. The cloud appears filamentary, clumpy, and with a complex kinematical structure. We derive an estimated mass of the cloud of 7700 M ⊙ (half of which is found in regions with visual extinctions AV below ~10) and a dynamical age for the nebula of the order of 0.2 Myr. The energy balance suggests that magnetic fields play an important role in supporting the cloud, at large and small scales. According to our analysis, the turbulent kinetic energy in the molecular gas due to outflows is comparable to turbulent kinetic energy resulting from the interaction of the cloud with the H II region. This latter feedback appears negative, i.e., the triggering of star formation by the H II region is inefficient in Orion. The reduced data as well as additional products such as the column density map are made available online (http://userpages.irap.omp.eu/~oberne/Olivier_Berne/Data).

  4. What Sort of Girl Wants to Study Physics After the Age of 16? Findings from a Large-scale UK Survey

    NASA Astrophysics Data System (ADS)

    Mujtaba, Tamjid; Reiss, Michael J.

    2013-11-01

    This paper investigates the characteristics of 15-year-old girls who express an intention to study physics post-16. This paper unpacks issues around within-girl group differences and similarities between boys and girls in survey responses about physics. The analysis is based on the year 10 (age 15 years) responses of 5,034 students from 137 UK schools as learners of physics during the academic year 2008-2009. A comparison between boys and girls indicates the pervasiveness of gender issues, with boys more likely to respond positively towards physics-specific constructs than girls. The analysis also indicates that girls and boys who expressed intentions to participate in physics post-16 gave similar responses towards their physics teachers and physics lessons and had comparable physics extrinsic motivation. Girls (regardless of their intention to participate in physics) were less likely than boys to be encouraged to study physics post-16 by teachers, family and friends. Despite this, there was a subset of girls still intending to study physics post-16. The crucial difference between the girls who intended to study physics post-16 and those who did not is that the girls who intended to study physics post-16 had higher physics extrinsic motivation, more positive perceptions of physics teachers and lessons, greater competitiveness and a tendency to be less extrovert. This strongly suggests that higher extrinsic motivation in physics could be the crucial underlying key that encourages a subset of girls (as well as boys) to want to pursue physics post-16.

  5. IRAM 30 m Large Scale Survey of 12CO(2-1) and 13CO(2-1) Emission in the Orion Molecular Cloud

    NASA Astrophysics Data System (ADS)

    Berné, O.; Marcelino, N.; Cernicharo, J.

    2014-11-01

    Using the IRAM 30 m telescope, we have surveyed a 1 × 0.°8 part of the Orion molecular cloud in the 12CO and 13CO (2-1) lines with a maximal spatial resolution of ~11'' and spectral resolution of ~0.4 km s-1. The cloud appears filamentary, clumpy, and with a complex kinematical structure. We derive an estimated mass of the cloud of 7700 M ⊙ (half of which is found in regions with visual extinctions AV below ~10) and a dynamical age for the nebula of the order of 0.2 Myr. The energy balance suggests that magnetic fields play an important role in supporting the cloud, at large and small scales. According to our analysis, the turbulent kinetic energy in the molecular gas due to outflows is comparable to turbulent kinetic energy resulting from the interaction of the cloud with the H II region. This latter feedback appears negative, i.e., the triggering of star formation by the H II region is inefficient in Orion. The reduced data as well as additional products such as the column density map are made available online (http://userpages.irap.omp.eu/~oberne/Olivier_Berne/Data).

  6. The clustering of galaxies in the SDSS-III Baryon Oscillation Spectroscopic Survey: cosmological implications of the large-scale two-point correlation function

    NASA Astrophysics Data System (ADS)

    Sánchez, Ariel G.; Scóccola, C. G.; Ross, A. J.; Percival, W.; Manera, M.; Montesano, F.; Mazzalay, X.; Cuesta, A. J.; Eisenstein, D. J.; Kazin, E.; McBride, C. K.; Mehta, K.; Montero-Dorta, A. D.; Padmanabhan, N.; Prada, F.; Rubiño-Martín, J. A.; Tojeiro, R.; Xu, X.; Magaña, M. Vargas; Aubourg, E.; Bahcall, N. A.; Bailey, S.; Bizyaev, D.; Bolton, A. S.; Brewington, H.; Brinkmann, J.; Brownstein, J. R.; Gott, J. Richard; Hamilton, J. C.; Ho, S.; Honscheid, K.; Labatie, A.; Malanushenko, E.; Malanushenko, V.; Maraston, C.; Muna, D.; Nichol, R. C.; Oravetz, D.; Pan, K.; Ross, N. P.; Roe, N. A.; Reid, B. A.; Schlegel, D. J.; Shelden, A.; Schneider, D. P.; Simmons, A.; Skibba, R.; Snedden, S.; Thomas, D.; Tinker, J.; Wake, D. A.; Weaver, B. A.; Weinberg, David H.; White, Martin; Zehavi, I.; Zhao, G.

    2012-09-01

    We obtain constraints on cosmological parameters from the spherically averaged redshift-space correlation function of the CMASS Data Release 9 (DR9) sample of the Baryon Oscillation Spectroscopic Survey (BOSS). We combine this information with additional data from recent cosmic microwave background (CMB), supernova and baryon acoustic oscillation measurements. Our results show no significant evidence of deviations from the standard flat Λ cold dark matter model, whose basic parameters can be specified by Ωm = 0.285 ± 0.009, 100 Ωb = 4.59 ± 0.09, ns = 0.961 ± 0.009, H0 = 69.4 ± 0.8 km s-1 Mpc-1 and σ8 = 0.80 ± 0.02. The CMB+CMASS combination sets tight constraints on the curvature of the Universe, with Ωk = -0.0043 ± 0.0049, and the tensor-to-scalar amplitude ratio, for which we find r < 0.16 at the 95 per cent confidence level (CL). These data show a clear signature of a deviation from scale invariance also in the presence of tensor modes, with ns < 1 at the 99.7 per cent CL. We derive constraints on the fraction of massive neutrinos of fν < 0.049 (95 per cent CL), implying a limit of ∑mν < 0.51 eV. We find no signature of a deviation from a cosmological constant from the combination of all data sets, with a constraint of wDE = -1.033 ± 0.073 when this parameter is assumed time-independent, and no evidence of a departure from this value when it is allowed to evolve as wDE(a) = w0 + wa(1 - a). The achieved accuracy on our cosmological constraints is a clear demonstration of the constraining power of current cosmological observations.
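
    As a rough consistency check on the quoted neutrino bound, the fraction fν can be converted into a mass limit. The steps below assume the common conventions fν ≡ Ων/Ωdm and ∑mν ≈ 93.14 eV · Ωνh², which the abstract itself does not spell out; with those assumptions the quoted numbers hang together.

      \Omega_{\rm dm} = \Omega_m - \Omega_b \simeq 0.285 - 0.046 = 0.239, \qquad
      \Omega_\nu < f_\nu\,\Omega_{\rm dm} \simeq 0.049 \times 0.239 \simeq 0.012,

      \sum m_\nu \simeq 93.14\,{\rm eV}\;\Omega_\nu h^2
                 < 93.14 \times 0.012 \times (0.694)^2\;{\rm eV} \simeq 0.5\,{\rm eV},

    in agreement, to within rounding, with the quoted limit of ∑mν < 0.51 eV.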

  7. Condition Monitoring of Large-Scale Facilities

    NASA Technical Reports Server (NTRS)

    Hall, David L.

    1999-01-01

    This document provides a summary of the research conducted for the NASA Ames Research Center under grant NAG2-1182 (Condition-Based Monitoring of Large-Scale Facilities). The information includes copies of view graphs presented at NASA Ames in the final Workshop (held during December of 1998), as well as a copy of a technical report provided to the COTR (Dr. Anne Patterson-Hine) subsequent to the workshop. The material describes the experimental design, collection of data, and analysis results associated with monitoring the health of large-scale facilities. In addition to this material, a copy of the Pennsylvania State University Applied Research Laboratory data fusion visual programming tool kit was also provided to NASA Ames researchers.

  8. CHINA HEALTH AND NUTRITION SURVEY

    EPA Science Inventory

    The China Health and Nutrition Survey is designed to examine the effects of health, nutrition, and family planning policies and programs as they have been implemented by national and local governments. It is designed to examine how both the social and economic transformation of C...

  9. MEDICARE HEALTH OUTCOMES SURVEY (HOS)

    EPA Science Inventory

    The Medicare Health Outcomes Survey (HOS) is the first Medicare managed care outcomes measure. CMS, in collaboration with NCQA, launched the Medicare HOS in the 1998 Health Plan Employer Data and Information Set (HEDIS?). The measure includes the most recent advances in summarizi...

  10. An Integrative Structural Health Monitoring System for the Local/Global Responses of a Large-Scale Irregular Building under Construction

    PubMed Central

    Park, Hyo Seon; Shin, Yunah; Choi, Se Woon; Kim, Yousok

    2013-01-01

    In this study, a practical and integrative SHM system was developed and applied to a large-scale irregular building under construction, where many challenging issues exist. In the proposed sensor network, customized energy-efficient wireless sensing units (sensor nodes, repeater nodes, and master nodes) were employed and comprehensive communications from the sensor node to the remote monitoring server were conducted through wireless communications. The long-term (13-month) monitoring results recorded from a large number of sensors (75 vibrating wire strain gauges, 10 inclinometers, and three laser displacement sensors) indicated that the construction event exhibiting the largest influence on structural behavior was the removal of bents that were temporarily installed to support the free end of the cantilevered members during their construction. The safety of each member could be confirmed based on the quantitative evaluation of each response. Furthermore, it was also confirmed that the relation between these responses (i.e., deflection, strain, and inclination) can provide information about the global behavior of structures induced from specific events. Analysis of the measurement results demonstrates the proposed sensor network system is capable of automatic and real-time monitoring and can be applied and utilized for both the safety evaluation and precise implementation of buildings under construction. PMID:23860317

  11. An integrative structural health monitoring system for the local/global responses of a large-scale irregular building under construction.

    PubMed

    Park, Hyo Seon; Shin, Yunah; Choi, Se Woon; Kim, Yousok

    2013-01-01

    In this study, a practical and integrative SHM system was developed and applied to a large-scale irregular building under construction, where many challenging issues exist. In the proposed sensor network, customized energy-efficient wireless sensing units (sensor nodes, repeater nodes, and master nodes) were employed and comprehensive communications from the sensor node to the remote monitoring server were conducted through wireless communications. The long-term (13-month) monitoring results recorded from a large number of sensors (75 vibrating wire strain gauges, 10 inclinometers, and three laser displacement sensors) indicated that the construction event exhibiting the largest influence on structural behavior was the removal of bents that were temporarily installed to support the free end of the cantilevered members during their construction. The safety of each member could be confirmed based on the quantitative evaluation of each response. Furthermore, it was also confirmed that the relation between these responses (i.e., deflection, strain, and inclination) can provide information about the global behavior of structures induced from specific events. Analysis of the measurement results demonstrates the proposed sensor network system is capable of automatic and real-time monitoring and can be applied and utilized for both the safety evaluation and precise implementation of buildings under construction. PMID:23860317

  12. Microfluidic large-scale integration.

    PubMed

    Thorsen, Todd; Maerkl, Sebastian J; Quake, Stephen R

    2002-10-18

    We developed high-density microfluidic chips that contain plumbing networks with thousands of micromechanical valves and hundreds of individually addressable chambers. These fluidic devices are analogous to electronic integrated circuits fabricated using large-scale integration. A key component of these networks is the fluidic multiplexor, which is a combinatorial array of binary valve patterns that exponentially increases the processing power of a network by allowing complex fluid manipulations with a minimal number of inputs. We used these integrated microfluidic networks to construct the microfluidic analog of a comparator array and a microfluidic memory storage device whose behavior resembles random-access memory. PMID:12351675
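
    The key idea of the fluidic multiplexor is that 2·log2(N) binary control lines suffice to address N flow channels: each address bit has a pair of complementary control lines, and pressurizing one line of each pair closes every flow channel whose address disagrees with the selected one in that bit. The sketch below is an idealized software simulation of that addressing logic (not code from the paper), useful for checking which channel remains open for a given valve pattern.

      # Idealized simulation of a binary fluidic multiplexor:
      # 2*log2(N) control lines address N flow channels. Illustrative only.

      def select_channel(address, n_bits):
          """Pressurized control lines (bit, value) that leave only `address` open:
          for each bit, pressurize the line matching the complement of that bit."""
          return {(k, 1 - ((address >> k) & 1)) for k in range(n_bits)}

      def open_channels(pressurized, n_bits):
          """A channel stays open iff no pressurized line (k, b) matches its bit k == b
          (a matching line closes that channel's valve)."""
          open_set = []
          for ch in range(2 ** n_bits):
              closed = any(((ch >> k) & 1) == b for (k, b) in pressurized)
              if not closed:
                  open_set.append(ch)
          return open_set

      n_bits = 3                                  # 6 control lines address 8 channels
      lines = select_channel(5, n_bits)
      assert open_channels(lines, n_bits) == [5]  # only channel 5 remains open
      print(lines)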

  13. A large-scale field study examining effects of exposure to clothianidin seed-treated canola on honey bee colony health, development, and overwintering success.

    PubMed

    Cutler, G Christopher; Scott-Dupree, Cynthia D; Sultan, Maryam; McFarlane, Andrew D; Brewer, Larry

    2014-01-01

    In summer 2012, we initiated a large-scale field experiment in southern Ontario, Canada, to determine whether exposure to clothianidin seed-treated canola (oil seed rape) has any adverse impacts on honey bees. Colonies were placed in clothianidin seed-treated or control canola fields during bloom, and thereafter were moved to an apiary with no surrounding crops grown from seeds treated with neonicotinoids. Colony weight gain, honey production, pest incidence, bee mortality, number of adults, and amount of sealed brood were assessed in each colony throughout summer and autumn. Samples of honey, beeswax, pollen, and nectar were regularly collected, and samples were analyzed for clothianidin residues. Several of these endpoints were also measured in spring 2013. Overall, colonies were vigorous during and after the exposure period, and we found no effects of exposure to clothianidin seed-treated canola on any endpoint measures. Bees foraged heavily on the test fields during peak bloom and residue analysis indicated that honey bees were exposed to low levels (0.5-2 ppb) of clothianidin in pollen. Low levels of clothianidin were detected in a few pollen samples collected toward the end of the bloom from control hives, illustrating the difficulty of conducting a perfectly controlled field study with free-ranging honey bees in agricultural landscapes. Overwintering success did not differ significantly between treatment and control hives, and was similar to overwintering colony loss rates reported for the winter of 2012-2013 for beekeepers in Ontario and Canada. Our results suggest that exposure to canola grown from seed treated with clothianidin poses low risk to honey bees. PMID:25374790

  14. A large-scale field study examining effects of exposure to clothianidin seed-treated canola on honey bee colony health, development, and overwintering success

    PubMed Central

    Scott-Dupree, Cynthia D.; Sultan, Maryam; McFarlane, Andrew D.; Brewer, Larry

    2014-01-01

    In summer 2012, we initiated a large-scale field experiment in southern Ontario, Canada, to determine whether exposure to clothianidin seed-treated canola (oil seed rape) has any adverse impacts on honey bees. Colonies were placed in clothianidin seed-treated or control canola fields during bloom, and thereafter were moved to an apiary with no surrounding crops grown from seeds treated with neonicotinoids. Colony weight gain, honey production, pest incidence, bee mortality, number of adults, and amount of sealed brood were assessed in each colony throughout summer and autumn. Samples of honey, beeswax, pollen, and nectar were regularly collected, and samples were analyzed for clothianidin residues. Several of these endpoints were also measured in spring 2013. Overall, colonies were vigorous during and after the exposure period, and we found no effects of exposure to clothianidin seed-treated canola on any endpoint measures. Bees foraged heavily on the test fields during peak bloom and residue analysis indicated that honey bees were exposed to low levels (0.5–2 ppb) of clothianidin in pollen. Low levels of clothianidin were detected in a few pollen samples collected toward the end of the bloom from control hives, illustrating the difficulty of conducting a perfectly controlled field study with free-ranging honey bees in agricultural landscapes. Overwintering success did not differ significantly between treatment and control hives, and was similar to overwintering colony loss rates reported for the winter of 2012–2013 for beekeepers in Ontario and Canada. Our results suggest that exposure to canola grown from seed treated with clothianidin poses low risk to honey bees. PMID:25374790

  15. Large scale topography of Io

    NASA Technical Reports Server (NTRS)

    Gaskell, R. W.; Synnott, S. P.

    1987-01-01

    To investigate the large scale topography of the Jovian satellite Io, both limb observations and stereographic techniques applied to landmarks are used. The raw data for this study consists of Voyager 1 images of Io, 800x800 arrays of picture elements each of which can take on 256 possible brightness values. In analyzing this data it was necessary to identify and locate landmarks and limb points on the raw images, remove the image distortions caused by the camera electronics and translate the corrected locations into positions relative to a reference geoid. Minimizing the uncertainty in the corrected locations is crucial to the success of this project. In the highest resolution frames, an error of a tenth of a pixel in image space location can lead to a 300 m error in true location. In the lowest resolution frames, the same error can lead to an uncertainty of several km.

  16. Challenges for Large Scale Simulations

    NASA Astrophysics Data System (ADS)

    Troyer, Matthias

    2010-03-01

    With computational approaches becoming ubiquitous the growing impact of large scale computing on research influences both theoretical and experimental work. I will review a few examples in condensed matter physics and quantum optics, including the impact of computer simulations in the search for supersolidity, thermometry in ultracold quantum gases, and the challenging search for novel phases in strongly correlated electron systems. While only a decade ago such simulations needed the fastest supercomputers, many simulations can now be performed on small workstation clusters or even a laptop: what was previously restricted to a few experts can now potentially be used by many. Only part of the gain in computational capabilities is due to Moore's law and improvement in hardware. Equally impressive is the performance gain due to new algorithms - as I will illustrate using some recently developed algorithms. At the same time modern peta-scale supercomputers offer unprecedented computational power and allow us to tackle new problems and address questions that were impossible to solve numerically only a few years ago. While there is a roadmap for future hardware developments to exascale and beyond, the main challenges are on the algorithmic and software infrastructure side. Among the problems that face the computational physicist are: the development of new algorithms that scale to thousands of cores and beyond, a software infrastructure that lifts code development to a higher level and speeds up the development of new simulation programs for large scale computing machines, tools to analyze the large volume of data obtained from such simulations, and as an emerging field provenance-aware software that aims for reproducibility of the complete computational workflow from model parameters to the final figures. Interdisciplinary collaborations and collective efforts will be required, in contrast to the cottage-industry culture currently present in many areas of computational

  17. California's "5 a day--for better health!" campaign: an innovative population-based effort to effect large-scale dietary change.

    PubMed

    Foerster, S B; Kizer, K W; Disogra, L K; Bal, D G; Krieg, B F; Bunch, K L

    1995-01-01

    The annual toll of diet-related diseases in the United States is similar to that taken by tobacco, but less progress has been achieved in reaching the Public Health Service's Healthy People 2000 objectives for improving food consumption than for reducing tobacco use. In 1988, the California Department of Health Services embarked upon an innovative multi-year social marketing program to increase fruit and vegetable consumption. The 5 a Day--for Better Health! Campaign had several distinctive features, including its simple, positive, behavior-specific message to eat 5 servings of fruits and vegetables every day as part of a low-fat, high fiber diet; its use of mass media; its partnership between the state health department and the produce and supermarket industries; and its extensive use of point-of-purchase messages. Over its nearly three years of operation in California, the 5 a Day Campaign appears to have raised public awareness that fruits and vegetables help reduce cancer risk, increased fruit and vegetable consumption in major population segments, and created an ongoing partnership between public health and agribusiness that has allowed extension of the campaign to other population segments, namely children and Latino adults. In 1991 the campaign was adopted as a national initiative by the National Cancer Institute and the Produce for Better Health Foundation. By 1994, over 700 industry organizations and 48 states, territories, and the District of Columbia were licensed to participate. Preventive medicine practitioners and others involved in health promotion may build upon the 5 a Day Campaign experience in developing and implementing efforts to reach the nation's dietary goals. PMID:7632448

  18. National Adolescent Student Health Survey.

    ERIC Educational Resources Information Center

    Health Education (Washington D.C.), 1988

    1988-01-01

    Results are reported from a national survey of teenaged youth on their attitudes toward a variety of health-related issues. Topics covered were Acquired Immune Deficiency Syndrome, sexually transmitted diseases, violence, suicide, injury prevention, drug abuse, nutrition, and consumer education. (JD)

  19. HEALTH AND DIET SURVEY (HDS)

    EPA Science Inventory

    The FDA conducts this periodic omnibus survey of American consumers to track consumer attitudes, knowledge, and reported behaviors related to diet and health issues, including cholesterol, awareness of diet-disease risk factors, food label use, dietary supplement use, and awarenes...

  20. The NIHR collaboration for leadership in applied health research and care (CLAHRC) for greater manchester: combining empirical, theoretical and experiential evidence to design and evaluate a large-scale implementation strategy

    PubMed Central

    2011-01-01

    Background In response to policy recommendations, nine National Institute for Health Research (NIHR) Collaborations for Leadership in Applied Health Research and Care (CLAHRCs) were established in England in 2008, aiming to create closer working between the health service and higher education and narrow the gap between research and its implementation in practice. The Greater Manchester (GM) CLAHRC is a partnership between the University of Manchester and twenty National Health Service (NHS) trusts, with a five-year mission to improve healthcare and reduce health inequalities for people with cardiovascular conditions. This paper outlines the GM CLAHRC approach to designing and evaluating a large-scale, evidence- and theory-informed, context-sensitive implementation programme. Discussion The paper makes a case for embedding evaluation within the design of the implementation strategy. Empirical, theoretical, and experiential evidence relating to implementation science and methods has been synthesised to formulate eight core principles of the GM CLAHRC implementation strategy, recognising the multi-faceted nature of evidence, the complexity of the implementation process, and the corresponding need to apply approaches that are situationally relevant, responsive, flexible, and collaborative. In turn, these core principles inform the selection of four interrelated building blocks upon which the GM CLAHRC approach to implementation is founded. These determine the organizational processes, structures, and roles utilised by specific GM CLAHRC implementation projects, as well as the approach to researching implementation, and comprise: the Promoting Action on Research Implementation in Health Services (PARIHS) framework; a modified version of the Model for Improvement; multiprofessional teams with designated roles to lead, facilitate, and support the implementation process; and embedded evaluation and learning. Summary Designing and evaluating a large-scale implementation

  1. Large Scale Homing in Honeybees

    PubMed Central

    Pahl, Mario; Zhu, Hong; Tautz, Jürgen; Zhang, Shaowu

    2011-01-01

    Honeybee foragers frequently fly several kilometres to and from vital resources, and communicate those locations to their nest mates by a symbolic dance language. Research has shown that they achieve this feat by memorizing landmarks and the skyline panorama, using the sun and polarized skylight as compasses and by integrating their outbound flight paths. In order to investigate the capacity of the honeybees' homing abilities, we artificially displaced foragers to novel release spots at various distances up to 13 km in the four cardinal directions. Returning bees were individually registered by a radio frequency identification (RFID) system at the hive entrance. We found that homing rate, homing speed and the maximum homing distance depend on the release direction. Bees released in the east were more likely to find their way back home, and returned faster than bees released in any other direction, due to the familiarity of global landmarks seen from the hive. Our findings suggest that such large scale homing is facilitated by global landmarks acting as beacons, and possibly the entire skyline panorama. PMID:21602920

  2. Local gravity and large-scale structure

    NASA Technical Reports Server (NTRS)

    Juszkiewicz, Roman; Vittorio, Nicola; Wyse, Rosemary F. G.

    1990-01-01

    The magnitude and direction of the observed dipole anisotropy of the galaxy distribution can in principle constrain the amount of large-scale power present in the spectrum of primordial density fluctuations. This paper confronts the data, provided by a recent redshift survey of galaxies detected by the IRAS satellite, with the predictions of two cosmological models with very different levels of large-scale power: the biased Cold Dark Matter dominated model (CDM) and a baryon-dominated model (BDM) with isocurvature initial conditions. Model predictions are investigated for the Local Group peculiar velocity, v(R), induced by mass inhomogeneities distributed out to a given radius, R, for R less than about 10,000 km/s. Several convergence measures for v(R) are developed, which can become powerful cosmological tests when deep enough samples become available. For the present data sets, the CDM and BDM predictions are indistinguishable at the 2 sigma level and both are consistent with observations. A promising discriminant between cosmological models is the misalignment angle between v(R) and the apex of the dipole anisotropy of the microwave background.

  3. Individual Skill Differences and Large-Scale Environmental Learning

    ERIC Educational Resources Information Center

    Fields, Alexa W.; Shelton, Amy L.

    2006-01-01

    Spatial skills are known to vary widely among normal individuals. This project was designed to address whether these individual differences are differentially related to large-scale environmental learning from route (ground-level) and survey (aerial) perspectives. Participants learned two virtual environments (route and survey) with limited…

  4. Large Scale Nanolaminate Deformable Mirror

    SciTech Connect

    Papavasiliou, A; Olivier, S; Barbee, T; Miles, R; Chang, K

    2005-11-30

    This work concerns the development of a technology that uses Nanolaminate foils to form light-weight, deformable mirrors that are scalable over a wide range of mirror sizes. While MEMS-based deformable mirrors and spatial light modulators have considerably reduced the cost and increased the capabilities of adaptive optic systems, there has not been a way to utilize the advantages of lithography and batch-fabrication to produce large-scale deformable mirrors. This technology is made scalable by using fabrication techniques and lithography that are not limited to the sizes of conventional MEMS devices. Like many MEMS devices, these mirrors use parallel plate electrostatic actuators. This technology replicates that functionality by suspending a horizontal piece of nanolaminate foil over an electrode by electroplated nickel posts. This actuator is attached, with another post, to another nanolaminate foil that acts as the mirror surface. Most MEMS devices are produced with integrated circuit lithography techniques that are capable of very small line widths, but are not scalable to large sizes. This technology is very tolerant of lithography errors and can use coarser, printed circuit board lithography techniques that can be scaled to very large sizes. These mirrors use small, lithographically defined actuators and thin nanolaminate foils allowing them to produce deformations over a large area while minimizing weight. This paper will describe a staged program to develop this technology. First-principles models were developed to determine design parameters. Three stages of fabrication will be described starting with a 3 x 3 device using conventional metal foils and epoxy to a 10-across all-metal device with nanolaminate mirror surfaces.
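
    For context on the parallel-plate electrostatic actuators described above, the attractive force between a suspended foil and its electrode follows the textbook relation below; this is a generic first-order expression (ignoring membrane stiffness and fringing fields), not a design equation taken from the report.

      F = \frac{\varepsilon_0 A V^2}{2 d^2}

    where A is the electrode area, V the applied voltage and d the remaining gap. Because the force grows as 1/d², the stable open-loop travel of a spring-suspended plate is limited to roughly one third of the initial gap (the pull-in instability), a standard consideration when sizing this kind of actuator.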

  5. Large-Scale Information Systems

    SciTech Connect

    D. M. Nicol; H. R. Ammerlahn; M. E. Goldsby; M. M. Johnson; D. E. Rhodes; A. S. Yoshimura

    2000-12-01

    Large enterprises are ever more dependent on their Large-Scale Information Systems (LSIS), computer systems that are distinguished architecturally by distributed components--data sources, networks, computing engines, simulations, human-in-the-loop control and remote access stations. These systems provide such capabilities as workflow, data fusion and distributed database access. The Nuclear Weapons Complex (NWC) contains many examples of LSIS components, a fact that motivates this research. However, most LSIS in use grew up from collections of separate subsystems that were not designed to be components of an integrated system. For this reason, they are often difficult to analyze and control. The problem is made more difficult by the size of a typical system, its diversity of information sources, and the institutional complexities associated with its geographic distribution across the enterprise. Moreover, there is no integrated approach for analyzing or managing such systems. Indeed, integrated development of LSIS is an active area of academic research. This work developed such an approach by simulating the various components of the LSIS and allowing the simulated components to interact with real LSIS subsystems. This research demonstrated two benefits. First, applying it to a particular LSIS provided a thorough understanding of the interfaces between the system's components. Second, it demonstrated how more rapid and detailed answers could be obtained to questions significant to the enterprise by interacting with the relevant LSIS subsystems through simulated components designed with those questions in mind. In a final, added phase of the project, investigations were made on extending this research to wireless communication networks in support of telemetry applications.

  6. Psychological Resilience after Hurricane Sandy: The Influence of Individual- and Community-Level Factors on Mental Health after a Large-Scale Natural Disaster

    PubMed Central

    Lowe, Sarah R.; Sampson, Laura; Gruebner, Oliver; Galea, Sandro

    2015-01-01

    Several individual-level factors are known to promote psychological resilience in the aftermath of disasters. Far less is known about the role of community-level factors in shaping postdisaster mental health. The purpose of this study was to explore the influence of both individual- and community-level factors on resilience after Hurricane Sandy. A representative sample of household residents (N = 418) from 293 New York City census tracts that were most heavily affected by the storm completed telephone interviews approximately 13–16 months postdisaster. Multilevel multivariable models explored the independent and interactive contributions of individual- and community-level factors to posttraumatic stress and depression symptoms. At the individual-level, having experienced or witnessed any lifetime traumatic event was significantly associated with higher depression and posttraumatic stress, whereas demographic characteristics (e.g., older age, non-Hispanic Black race) and more disaster-related stressors were significantly associated with higher posttraumatic stress only. At the community-level, living in an area with higher social capital was significantly associated with higher posttraumatic stress. Additionally, higher community economic development was associated with lower risk of depression only among participants who did not experience any disaster-related stressors. These results provide evidence that individual- and community-level resources and exposure operate in tandem to shape postdisaster resilience. PMID:25962178
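
    The multilevel models described above nest individuals within census tracts. A minimal sketch of such a model, assuming hypothetical column names (ptss, age, trauma, stressors, social_capital, tract) in a pandas DataFrame rather than the study's actual variables, could look like this with statsmodels:

      # Minimal two-level (individuals within census tracts) model sketch.
      # File name and column names are hypothetical placeholders.
      import pandas as pd
      import statsmodels.formula.api as smf

      df = pd.read_csv("sandy_survey.csv")   # one row per respondent (hypothetical)

      # Random intercept for census tract; individual-level fixed effects plus
      # a tract-level predictor (social capital).
      model = smf.mixedlm(
          "ptss ~ age + trauma + stressors + social_capital",
          data=df,
          groups=df["tract"],
      )
      result = model.fit()
      print(result.summary())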

  7. Large-Scale Astrophysical Visualization on Smartphones

    NASA Astrophysics Data System (ADS)

    Becciani, U.; Massimino, P.; Costa, A.; Gheller, C.; Grillo, A.; Krokos, M.; Petta, C.

    2011-07-01

    Nowadays digital sky surveys and long-duration, high-resolution numerical simulations using high performance computing and grid systems produce multidimensional astrophysical datasets in the order of several Petabytes. Sharing visualizations of such datasets within communities and collaborating research groups is of paramount importance for disseminating results and advancing astrophysical research. Moreover educational and public outreach programs can benefit greatly from novel ways of presenting these datasets by promoting understanding of complex astrophysical processes, e.g., formation of stars and galaxies. We have previously developed VisIVO Server, a grid-enabled platform for high-performance large-scale astrophysical visualization. This article reviews the latest developments on VisIVO Web, a custom designed web portal wrapped around VisIVO Server, then introduces VisIVO Smartphone, a gateway connecting VisIVO Web and data repositories for mobile astrophysical visualization. We discuss current work and summarize future developments.

  8. Evaluating effective reaction rates of kinetically driven solutes in large-scale, anisotropic media: human health risk implications in CO2 leakage

    NASA Astrophysics Data System (ADS)

    Siirila, E. R.; Maxwell, R. M.

    2011-12-01

    The role of high and low hydraulic conductivity (K) regions in heterogeneous, stratified and non-stratified flow fields and the subsequent effect of rate dependent geochemical reactions are investigated with regards to mobilized arsenic from CO2 leakage at a Carbon Capture and Storage (CCS) site. Following the methodology of previous work, human health risk is used as an endpoint for comparison via a two-stage or nested Monte Carlo scheme, explicitly considering joint uncertainty and variability for a hypothetical population of individuals. This study identifies geo-hydrologic conditions where solute reactions are either rate limited (non-reactive), in equilibrium (linear equilibrium assumption, LEA, is appropriate), or are sensitive to time-dependent kinetic reaction rates. Potential interplay between multiple parameters (i.e. positive or negative feedbacks) is shown utilizing stochastic ensembles. In particular, the effect of preferential flow pathways and solute mixing on the field-scale (macrodispersion) and sub-grid (local dispersion) is examined for varying degrees of stratification and regional groundwater velocities. Results show effective reaction rates of kinetic ensembles are dissimilar from LEA ensembles with the inclusion of local dispersion, resulting in an additive tailing effect of the solute plume, a retarded peak time, and an increased cancer risk. This discrepancy between kinetic and LEA ensembles is augmented in highly anisotropic media, especially at intermediate regional groundwater velocities. The distribution, magnitude, and associated uncertainty of cancer risk are controlled by these factors, but are also strongly dependent on the regional groundwater velocity. We demonstrate a higher associated uncertainty of cancer risk in stratified domains is linked to higher aquifer connectivity and less macrodispersion in the flow field. This study has implications in CCS site selection and groundwater driven risk assessment modeling.
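
    The two-stage (nested) Monte Carlo scheme referred to above separates uncertainty (outer loop, e.g. poorly known aquifer and reaction-rate parameters) from inter-individual variability (inner loop, e.g. exposure behaviour). The sketch below only illustrates that loop structure with a toy risk model; the distributions, parameter names and risk equation are placeholders, not those of the study.

      # Toy nested (two-stage) Monte Carlo: outer loop = uncertain parameters,
      # inner loop = inter-individual variability. All distributions are placeholders.
      import numpy as np

      rng = np.random.default_rng(0)
      n_uncertainty, n_variability = 200, 1000

      risk_percentiles = []
      for _ in range(n_uncertainty):
          # Outer loop: one realization of uncertain field/reaction parameters
          log_k = rng.normal(-4.0, 0.5)            # log10 hydraulic conductivity (toy)
          reaction_rate = rng.lognormal(-2.0, 0.7)
          concentration = 10 ** log_k / (1.0 + reaction_rate)   # toy exposure concentration

          # Inner loop: variability across individuals in a hypothetical population
          intake = rng.lognormal(0.0, 0.3, n_variability)       # toy intake factor
          slope_factor = 1.5                                    # toy cancer slope factor
          risk = concentration * intake * slope_factor
          risk_percentiles.append(np.percentile(risk, 95))

      print("median of 95th-percentile individual risk:", np.median(risk_percentiles))
      print("90% uncertainty interval:", np.percentile(risk_percentiles, [5, 95]))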

  9. Accuracy of Electronic Health Record Data for Identifying Stroke Cases in Large-Scale Epidemiological Studies: A Systematic Review from the UK Biobank Stroke Outcomes Group

    PubMed Central

    Woodfield, Rebecca; Grant, Ian; Sudlow, Cathie L. M.

    2015-01-01

    Objective Long-term follow-up of population-based prospective studies is often achieved through linkages to coded regional or national health care data. Our knowledge of the accuracy of such data is incomplete. To inform methods for identifying stroke cases in UK Biobank (a prospective study of 503,000 UK adults recruited in middle-age), we systematically evaluated the accuracy of these data for stroke and its main pathological types (ischaemic stroke, intracerebral haemorrhage, subarachnoid haemorrhage), determining the optimum codes for case identification. Methods We sought studies published from 1990-November 2013, which compared coded data from death certificates, hospital admissions or primary care with a reference standard for stroke or its pathological types. We extracted information on a range of study characteristics and assessed study quality with the Quality Assessment of Diagnostic Studies tool (QUADAS-2). To assess accuracy, we extracted data on positive predictive values (PPV) and—where available—on sensitivity, specificity, and negative predictive values (NPV). Results 37 of 39 eligible studies assessed accuracy of International Classification of Diseases (ICD)-coded hospital or death certificate data. They varied widely in their settings, methods, reporting, quality, and in the choice and accuracy of codes. Although PPVs for stroke and its pathological types ranged from 6–97%, appropriately selected, stroke-specific codes (rather than broad cerebrovascular codes) consistently produced PPVs >70%, and in several studies >90%. The few studies with data on sensitivity, specificity and NPV showed higher sensitivity of hospital versus death certificate data for stroke, with specificity and NPV consistently >96%. Few studies assessed either primary care data or combinations of data sources. Conclusions Particular stroke-specific codes can yield high PPVs (>90%) for stroke/stroke types. Inclusion of primary care data and combining data sources should
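
    For reference, the accuracy measures extracted in the review are the standard confusion-matrix quantities; writing them out makes clear why a code set can achieve a high PPV yet a modest sensitivity. With TP, FP, FN and TN denoting true/false positives and negatives against the reference standard:

      \mathrm{PPV} = \frac{TP}{TP+FP}, \qquad
      \mathrm{Sensitivity} = \frac{TP}{TP+FN}, \qquad
      \mathrm{Specificity} = \frac{TN}{TN+FP}, \qquad
      \mathrm{NPV} = \frac{TN}{TN+FN}.

    A narrowly drawn, stroke-specific code set tends to raise PPV (fewer false positives) at the possible cost of sensitivity (more missed cases), which is why the choice of codes matters so much across the studies surveyed.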

  10. Curvature constraints from large scale structure

    NASA Astrophysics Data System (ADS)

    Di Dio, Enea; Montanari, Francesco; Raccanelli, Alvise; Durrer, Ruth; Kamionkowski, Marc; Lesgourgues, Julien

    2016-06-01

    We modified the CLASS code in order to include relativistic galaxy number counts in spatially curved geometries; we present the formalism and study the effect of relativistic corrections on spatial curvature. The new version of the code is now publicly available. Using a Fisher matrix analysis, we investigate how measurements of the spatial curvature parameter ΩK with future galaxy surveys are affected by relativistic effects, which influence observations of the large scale galaxy distribution. These effects include contributions from cosmic magnification, Doppler terms and terms involving the gravitational potential. As an application, we consider angle and redshift dependent power spectra, which are especially well suited for model independent cosmological constraints. We compute our results for a representative deep, wide and spectroscopic survey, and our results show the impact of relativistic corrections on spatial curvature parameter estimation. We show that constraints on the curvature parameter may be strongly biased if, in particular, cosmic magnification is not included in the analysis. Other relativistic effects turn out to be subdominant in the studied configuration. We analyze how the shift in the estimated best-fit value for the curvature and other cosmological parameters depends on the magnification bias parameter, and find that significant biases are to be expected if this term is not properly considered in the analysis.
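
    A Fisher matrix forecast of the kind used above reduces, once the matrix F has been assembled from the survey observables, to inverting F and reading off marginalized errors as the square roots of the diagonal of F⁻¹. The numbers below are an arbitrary illustrative matrix, not the paper's forecast.

      # Marginalized 1-sigma errors from a Fisher matrix: sigma_i = sqrt((F^-1)_ii).
      # The matrix values are arbitrary placeholders, not the paper's forecast.
      import numpy as np

      params = ["Omega_K", "h", "n_s"]
      F = np.array([[4.0e4, 1.0e3, 5.0e2],
                    [1.0e3, 9.0e3, 2.0e2],
                    [5.0e2, 2.0e2, 6.0e3]])   # Fisher matrix (toy values)

      cov = np.linalg.inv(F)                   # parameter covariance matrix
      for name, var in zip(params, np.diag(cov)):
          print(f"sigma({name}) = {np.sqrt(var):.4f}")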

  11. Young women's reproductive health survey.

    PubMed

    Lewis, H

    1987-08-12

    A survey of reproductive health issues was conducted on 15 year old Hutt Valley secondary school girls by means of a self-administered anonymous questionnaire. The prevalence of sexual intercourse in the sample was 29%. Sixteen percent of the sexually active respondents used no method of contraception. Knowledge of reproductive health facts and contraception was poor both amongst sexually experienced and inexperienced respondents. Twenty-six percent relied on peers for this information, with mothers, teachers and books being other important sources cited. Respondents requested more information on sexually transmitted diseases, contraception and sexual relationships. Most would like this information more readily accessible. Preferred sources of information mentioned were: parents, books, films/videos, family planning clinics and friends. PMID:3455514

  12. THE OBSERVATIONS OF REDSHIFT EVOLUTION IN LARGE-SCALE ENVIRONMENTS (ORELSE) SURVEY. I. THE SURVEY DESIGN AND FIRST RESULTS ON CL 0023+0423 AT z = 0.84 AND RX J1821.6+6827 AT z = 0.82

    SciTech Connect

    Lubin, L. M.; Lemaux, B. C.; Kocevski, D. D.; Gal, R. R.; Squires, G. K.

    2009-06-15

    We present the Observations of Redshift Evolution in Large-Scale Environments (ORELSE) Survey, a systematic search for structure on scales greater than 10 h-1 70 Mpc around 20 well-known clusters at redshifts of 0.6 < z < 1.3. The goal of the survey is to examine a statistical sample of dynamically active clusters and large-scale structures in order to quantify galaxy properties over the full range of local and global environments. We describe the survey design, the cluster sample, and our extensive observational data covering at least 25' around each target cluster. We use adaptively smoothed red galaxy density maps from our wide-field optical imaging to identify candidate groups/clusters and intermediate-density large-scale filaments/walls in each cluster field. Because photometric techniques (such as photometric redshifts, statistical overdensities, and richness estimates) can be highly uncertain, the crucial component of this survey is the unprecedented amount of spectroscopic coverage. We are using the wide-field, multiobject spectroscopic capabilities of the Deep Multiobject Imaging Spectrograph to obtain 100-200+ confirmed cluster members in each field. Our survey has already discovered the Cl 1604 supercluster at z ≈ 0.9, a structure which contains at least eight groups and clusters and spans 13 Mpc × 100 Mpc. Here, we present the results on the large-scale environments of two additional clusters, Cl 0023+0423 at z = 0.84 and RX J1821.6+6827 at z = 0.82, which highlight the diversity of global properties at these redshifts. The optically selected Cl 0023+0423 is a four-way group-group merger with constituent groups having measured velocity dispersions between 206 and 479 km s-1. The galaxy population is dominated by blue, star-forming galaxies, with 80% of the confirmed members showing [O II] emission. The strength of the Hδ line in a composite spectrum of 138 members indicates a substantial contribution from recent

  13. The Observations of Redshift Evolution in Large-Scale Environments (ORELSE) Survey. I. The Survey Design and First Results on CL 0023+0423 at z = 0.84 and RX J1821.6+6827 at z = 0.82

    NASA Astrophysics Data System (ADS)

    Lubin, L. M.; Gal, R. R.; Lemaux, B. C.; Kocevski, D. D.; Squires, G. K.

    2009-06-01

    We present the Observations of Redshift Evolution in Large-Scale Environments (ORELSE) Survey, a systematic search for structure on scales greater than 10 h -1 70 Mpc around 20 well-known clusters at redshifts of 0.6 < z < 1.3. The goal of the survey is to examine a statistical sample of dynamically active clusters and large-scale structures in order to quantify galaxy properties over the full range of local and global environments. We describe the survey design, the cluster sample, and our extensive observational data covering at least 25' around each target cluster. We use adaptively smoothed red galaxy density maps from our wide-field optical imaging to identify candidate groups/clusters and intermediate-density large-scale filaments/walls in each cluster field. Because photometric techniques (such as photometric redshifts, statistical overdensities, and richness estimates) can be highly uncertain, the crucial component of this survey is the unprecedented amount of spectroscopic coverage. We are using the wide-field, multiobject spectroscopic capabilities of the Deep Multiobject Imaging Spectrograph to obtain 100-200+ confirmed cluster members in each field. Our survey has already discovered the Cl 1604 supercluster at z ≈ 0.9, a structure which contains at least eight groups and clusters and spans 13 Mpc × 100 Mpc. Here, we present the results on the large-scale environments of two additional clusters, Cl 0023+0423 at z = 0.84 and RX J1821.6+6827 at z = 0.82, which highlight the diversity of global properties at these redshifts. The optically selected Cl 0023+0423 is a four-way group-group merger with constituent groups having measured velocity dispersions between 206 and 479 km s-1. The galaxy population is dominated by blue, star-forming galaxies, with 80% of the confirmed members showing [O II] emission. The strength of the Hδ line in a composite spectrum of 138 members indicates a substantial contribution from recent starbursts to the overall galaxy

  14. Large-Scale Reform Comes of Age

    ERIC Educational Resources Information Center

    Fullan, Michael

    2009-01-01

    This article reviews the history of large-scale education reform and makes the case that large-scale or whole system reform policies and strategies are becoming increasingly evident. The review briefly addresses the pre 1997 period concluding that while the pressure for reform was mounting that there were very few examples of deliberate or…

  15. Large-scale infrared scene projectors

    NASA Astrophysics Data System (ADS)

    Murray, Darin A.

    1999-07-01

    Large-scale infrared scene projectors typically have unique opto-mechanical characteristics associated with their application. This paper outlines two large-scale zoom lens assemblies with different environmental and package constraints. Various challenges and their respective solutions are discussed and presented.

  16. The convergent validity of three surveys as alternative sources of health information to the 2011 UK census.

    PubMed

    Taylor, Joanna; Twigg, Liz; Moon, Graham

    2014-09-01

    Censuses have traditionally been a key source of localised information on the state of a nation's health. Many countries are now adopting alternative approaches to the traditional census, placing such information at risk. The purpose of this paper is to inform debate about whether existing social surveys could provide an adequate 'base' for alternative model-based small area estimates of health data in a post traditional census era. Using a case study of 2011 UK Census questions on self-assessed health and limiting long term illness, we examine the extent to which the results from three large-scale surveys - the Health Survey for England, the Crime Survey for England and Wales and the Integrated Household Survey - conform to census output. Particularly in the case of limiting long term illness, the question wording renders comparisons difficult. However, with the exception of the general health question from the Health Survey for England all three surveys meet tests for convergent validity. PMID:25016326

  17. Individual skill differences and large-scale environmental learning.

    PubMed

    Fields, Alexa W; Shelton, Amy L

    2006-05-01

    Spatial skills are known to vary widely among normal individuals. This project was designed to address whether these individual differences are differentially related to large-scale environmental learning from route (ground-level) and survey (aerial) perspectives. Participants learned two virtual environments (route and survey) with limited exposure and were tested on judgments about the relative locations of objects. They also performed a series of spatial and nonspatial component skill tests. With limited learning, performance after route encoding was worse than performance after survey encoding. Furthermore, performance after route and survey encoding appeared to be preferentially linked to perspective and object-based transformations, respectively. Together, the results provide clues to how different skills might be engaged by different individuals for the same goal of learning a large-scale environment. PMID:16719662

  18. The large-scale landslide risk classification in catchment scale

    NASA Astrophysics Data System (ADS)

    Liu, Che-Hsin; Wu, Tingyeh; Chen, Lien-Kuang; Lin, Sheng-Chi

    2013-04-01

    The landslide disasters during Typhoon Morakot in 2009 caused heavy casualties. Because of the casualty numbers, this event is defined as a large-scale landslide disaster, and it also shows that surveys of large-scale landslide potential have so far been insufficient. Analysis of large-scale landslide potential indicates where attention should be focused, even though such areas are difficult to distinguish. Accordingly, the authors investigate the methods used in different countries, such as Hong Kong, Italy, Japan and Switzerland, to clarify the assessment methodology. The objects include places susceptible to rock slide and dip slope failure and the major landslide areas defined from historical records. Three levels of scale, from country down to slopeland, are confirmed to be necessary: basin, catchment, and slope scales. In total, ten spots were classified with high large-scale landslide potential at the basin scale. The authors therefore focus on the catchment scale in this paper and employ a risk matrix to classify the potential. Protected objects and the large-scale landslide susceptibility ratio are the two main indexes used to classify large-scale landslide risk. The protected objects are constructions and transportation facilities. The large-scale landslide susceptibility ratio is based on data for major landslide areas and for dip slope and rock slide areas. In total, 1,040 catchments are considered and are classified into three levels: high, medium, and low. The proportions of the high, medium, and low levels are 11%, 51%, and 38%, respectively. This result identifies the catchments with a high proportion of protected objects or high large-scale landslide susceptibility. The conclusions will serve as base material for the slopeland authorities when considering slopeland management and further investigation.
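
    The catchment-scale classification combines the two indexes in a risk matrix. The sketch below shows one way such a matrix could be encoded; the thresholds and the matrix cells are illustrative assumptions, not the values used by the authors.

      # Illustrative risk-matrix classification for catchments.
      # Thresholds and the matrix itself are assumptions, not the study's values.

      def level(value, low_cut, high_cut):
          """Map a continuous index to 'low' / 'medium' / 'high'."""
          if value >= high_cut:
              return "high"
          return "medium" if value >= low_cut else "low"

      # Keys: (protected-objects level, susceptibility-ratio level).
      RISK_MATRIX = {
          ("low", "low"): "low",        ("low", "medium"): "low",       ("low", "high"): "medium",
          ("medium", "low"): "low",     ("medium", "medium"): "medium", ("medium", "high"): "high",
          ("high", "low"): "medium",    ("high", "medium"): "high",     ("high", "high"): "high",
      }

      def classify_catchment(protected_objects_index, susceptibility_ratio):
          p = level(protected_objects_index, 0.2, 0.6)   # illustrative cut-offs
          s = level(susceptibility_ratio, 0.1, 0.3)
          return RISK_MATRIX[(p, s)]

      print(classify_catchment(0.7, 0.25))   # -> "high"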

  19. Large-scale investment in green space as an intervention for physical activity, mental and cardiometabolic health: study protocol for a quasi-experimental evaluation of a natural experiment

    PubMed Central

    Astell-Burt, Thomas; Feng, Xiaoqi; Kolt, Gregory S

    2016-01-01

    Introduction ‘Green spaces’ such as public parks are regarded as determinants of health, but evidence tends to be based on cross-sectional designs. This protocol describes a study that will evaluate a large-scale investment in approximately 5280 hectares of green space stretching 27 km north to south in Western Sydney, Australia. Methods and analysis A Geographic Information System was used to identify 7272 participants in the 45 and Up Study baseline data (2006–2008) living within 5 km of the Western Sydney Parklands and some of the features that have been constructed since 2009, such as public access points, advertising billboards, walking and cycle tracks, BBQ stations, and children's playgrounds. These data were linked to information on a range of health and behavioural outcomes, with the second wave of data collection initiated by the Sax Institute in 2012 and expected to be completed by 2015. Multilevel models will be used to analyse potential change in physical activity, weight status, social contacts, and mental and cardiometabolic health within a closed sample of residentially stable participants. Comparisons between persons with contrasting proximities to different areas of the Parklands will provide ‘treatment’ and ‘control’ groups within a ‘quasi-experimental’ study design. In line with expectations, baseline results prior to the enhancement of the Western Sydney Parklands indicated virtually no significant differences in the distribution of any of the outcomes with respect to proximity to green space preintervention. Ethics and dissemination Ethical approval was obtained for the 45 and Up Study from the University of New South Wales Human Research Ethics Committee. Ethics approval for this study was obtained from the University of Western Sydney Ethics Committee. Findings will be disseminated through partner organisations (the Western Sydney Parklands and the National Heart Foundation of Australia), as well as to policymakers in
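
    A minimal sketch of the kind of multilevel model the protocol describes, assuming a person-period dataset built from the linked 45 and Up Study records; all file and column names below are hypothetical placeholders.

```python
# Random-intercept (multilevel) model: repeated observations nested within
# participants. Column names are illustrative, not actual study variables.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("45_and_up_linked_waves.csv")  # hypothetical linked dataset

# The wave x proximity interaction captures change over time that differs
# between residents near the Parklands ("treatment") and those farther away.
model = smf.mixedlm(
    "physical_activity ~ wave * near_parklands + age + sex",
    data=df,
    groups=df["participant_id"],
)
result = model.fit()
print(result.summary())
```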

  20. Synthesis of small and large scale dynamos

    NASA Astrophysics Data System (ADS)

    Subramanian, Kandaswamy

    Using a closure model for the evolution of magnetic correlations, we uncover an interesting plausible saturated state of the small-scale fluctuation dynamo (SSD) and a novel analogy between quantum mechanical tunnelling and the generation of large-scale fields. Large-scale fields develop via the α-effect, but as magnetic helicity can only change on a resistive timescale, the time it takes to organize the field into large scales increases with the magnetic Reynolds number. This is very similar to the results obtained from simulations using the full MHD equations.

  1. Considerations for Managing Large-Scale Clinical Trials.

    ERIC Educational Resources Information Center

    Tuttle, Waneta C.; And Others

    1989-01-01

    Research management strategies used effectively in a large-scale clinical trial to determine the health effects of exposure to Agent Orange in Vietnam are discussed, including pre-project planning, organization according to strategy, attention to scheduling, a team approach, emphasis on guest relations, cross-training of personnel, and preparing…

  2. DESIGN OF LARGE-SCALE AIR MONITORING NETWORKS

    EPA Science Inventory

    The potential effects of air pollution on human health have received much attention in recent years. In the U.S. and other countries, there are extensive large-scale monitoring networks designed to collect data to inform the public of exposure risks to air pollution. A major crit...

  3. Large-scale regions of antimatter

    SciTech Connect

    Grobov, A. V. Rubin, S. G.

    2015-07-15

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  4. Washington State Survey of Adolescent Health Behaviors.

    ERIC Educational Resources Information Center

    Washington State Dept. of Social and Health Services, Olympia.

    The 1992 Washington State Survey of Adolescent Health Behaviors (WSSAHB) was created to collect information regarding a variety of adolescent health behaviors among students in the state of Washington. It expands on two previous administrations of a student tobacco, alcohol, and other drug survey and includes questions about medical care, safety,…

  5. NATIONAL EMPLOYER HEALTH INSURANCE SURVEY (NEHIS)

    EPA Science Inventory

    The National Employer Health Insurance Survey (NEHIS) was developed to produce estimates on employer-sponsored health insurance data in the United States. The NEHIS was the first Federal survey to represent all employers in the United States by State and obtain information on all...

  6. Estimating health expenditure shares from household surveys

    PubMed Central

    Brooks, Benjamin PC; Hanlon, Michael

    2013-01-01

    Abstract Objective To quantify the effects of household expenditure survey characteristics on the estimated share of a household’s expenditure devoted to health. Methods A search was conducted for all country surveys reporting data on health expenditure and total household expenditure. Data on total expenditure and health expenditure were extracted from the surveys to generate the health expenditure share (i.e. fraction of the household expenditure devoted to health). To do this the authors relied on survey microdata or survey reports to calculate the health expenditure share for the particular instrument involved. Health expenditure share was modelled as a function of the survey’s recall period, the number of health expenditure items, the number of total expenditure items, the data collection method and the placement of the health module within the survey. Data exists across space and time, so fixed effects for territory and year were included as well. The model was estimated by means of ordinary least squares regression with clustered standard errors. Findings A one-unit increase in the number of health expenditure questions was accompanied by a 1% increase in the estimated health expenditure share. A one-unit increase in the number of non-health expenditure questions resulted in a 0.2% decrease in the estimated share. Increasing the recall period by one month was accompanied by a 6% decrease in the health expenditure share. Conclusion The characteristics of a survey instrument examined in the study affect the estimate of the health expenditure share. Those characteristics need to be accounted for when comparing results across surveys within a territory and, ultimately, across territories. PMID:23825879
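
    A minimal sketch of the estimation strategy described above (OLS with territory and year fixed effects and clustered standard errors), assuming a tidy survey-level dataset; the file and column names are illustrative, not taken from the paper.

```python
# Ordinary least squares with fixed effects and clustered standard errors,
# mirroring the modelling approach summarised in the abstract.
import pandas as pd
import statsmodels.formula.api as smf

surveys = pd.read_csv("survey_characteristics.csv")  # hypothetical dataset

model = smf.ols(
    "health_share ~ recall_months + n_health_items + n_nonhealth_items"
    " + collection_method + module_placement + C(territory) + C(year)",
    data=surveys,
)
# Cluster standard errors by territory.
result = model.fit(cov_type="cluster", cov_kwds={"groups": surveys["territory"]})
print(result.summary())
```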

  7. Large-scale inhomogeneities and galaxy statistics

    NASA Technical Reports Server (NTRS)

    Schaeffer, R.; Silk, J.

    1984-01-01

    The density fluctuations associated with the formation of large-scale cosmic pancake-like and filamentary structures are evaluated using the Zel'dovich approximation for the evolution of nonlinear inhomogeneities in the expanding universe. It is shown that the large-scale nonlinear density fluctuations in the galaxy distribution due to pancakes modify the standard scale-invariant correlation function xi(r) at scales comparable to the coherence length of adiabatic fluctuations. The typical contribution of pancakes and filaments to the J3 integral, and more generally to the moments of galaxy counts in a volume of approximately (15-40 per h Mpc)exp 3, provides a statistical test for the existence of large scale inhomogeneities. An application to several recent three dimensional data sets shows that despite large observational uncertainties over the relevant scales characteristic features may be present that can be attributed to pancakes in most, but not all, of the various galaxy samples.
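
    For reference, the J3 statistic referred to above is commonly defined (up to convention-dependent factors) as the volume integral of the two-point correlation function:

```latex
J_3(R) = \int_0^R \xi(r)\, r^2 \, dr ,
```

    so that excess pancake- or filament-induced clustering on a given scale shows up directly in the moments of galaxy counts in cells of comparable size.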

  8. Development of the adult and child complementary medicine questionnaires fielded on the National Health Interview Survey

    PubMed Central

    2013-01-01

    The 2002, 2007, and 2012 complementary medicine questionnaires fielded on the National Health Interview Survey provide the most comprehensive data on complementary medicine available for the United States. They filled the void for large-scale, nationally representative, publicly available datasets on the out-of-pocket costs, prevalence, and reasons for use of complementary medicine in the U.S. Despite their wide use, this is the first article describing the multi-faceted and largely qualitative processes undertaken to develop the surveys. We hope this in-depth description enables policy makers and researchers to better judge the content validity and utility of the questionnaires and their resultant publications. PMID:24267412

  9. Post-disaster mental health need assessment surveys - the challenge of improved future research.

    PubMed

    Kessler, Ronald C; Wittchen, Hans-Ulrich

    2008-12-01

    Disasters are very common occurrences, becoming increasingly prevalent throughout the world. The number of natural disasters either affecting more than 100 people or resulting in a call for international assistance increased from roughly 100 per year worldwide in the late 1960s, to over 500 per year in the past decade. Population growth, environmental degradation, and global warming all play parts in accounting for these increases. There is also the possibility of a pandemic. This paper and the associated journal issue focus on a topic of growing worldwide importance: mental health needs assessment in the wake of large-scale disasters. Although natural and human-made disasters are known to have substantial effects on the mental health of the people who experience them, research shows that the prevalence of post-disaster psychopathology varies enormously from one disaster to another in ways that are difficult to predict merely by knowing the objective circumstances of the disaster. Mental health needs assessment surveys are consequently carried out after many large-scale natural and human-made disasters to provide information for service planners on the nature and magnitude of need for services. These surveys vary greatly, though, in the rigor with which they assess disaster-related stressors and post-disaster mental illness. Synthesis of findings across surveys is hampered by these inconsistencies. The typically limited focus of these surveys with regard to the inclusion of risk factors, follow-up assessments, and evaluations of treatment also limits insights from these surveys concerning post-disaster mental illness and treatment response. The papers in this issue discuss methodological issues in the design and implementation of post-disaster mental health needs assessment surveys aimed at improving on the quality of previous such surveys. The many recommendations in these papers will hopefully help to foster improvements in the next generation of post

  10. Management of large-scale technology

    NASA Technical Reports Server (NTRS)

    Levine, A.

    1985-01-01

    Two major themes are addressed in this assessment of the management of large-scale NASA programs: (1) how a high-technology agency fared through a decade marked by a rapid expansion of funds and manpower in the first half and an almost as rapid contraction in the second; and (2) how NASA combined central planning and control with decentralized project execution.

  11. A Large Scale Computer Terminal Output Controller.

    ERIC Educational Resources Information Center

    Tucker, Paul Thomas

    This paper describes the design and implementation of a large scale computer terminal output controller which supervises the transfer of information from a Control Data 6400 Computer to a PLATO IV data network. It discusses the cost considerations leading to the selection of educational television channels rather than telephone lines for…

  12. Large Scale Commodity Clusters for Lattice QCD

    SciTech Connect

    A. Pochinsky; W. Akers; R. Brower; J. Chen; P. Dreher; R. Edwards; S. Gottlieb; D. Holmgren; P. Mackenzie; J. Negele; D. Richards; J. Simone; W. Watson

    2002-06-01

    We describe the construction of large scale clusters for lattice QCD computing being developed under the umbrella of the U.S. DoE SciDAC initiative. We discuss the study of floating point and network performance that drove the design of the cluster, and present our plans for future multi-Terascale facilities.

  13. Evaluating Large-Scale Interactive Radio Programmes

    ERIC Educational Resources Information Center

    Potter, Charles; Naidoo, Gordon

    2009-01-01

    This article focuses on the challenges involved in conducting evaluations of interactive radio programmes in South Africa with large numbers of schools, teachers, and learners. It focuses on the role such large-scale evaluation has played during the South African radio learning programme's development stage, as well as during its subsequent…

  14. ARPACK: Solving large scale eigenvalue problems

    NASA Astrophysics Data System (ADS)

    Lehoucq, Rich; Maschhoff, Kristi; Sorensen, Danny; Yang, Chao

    2013-11-01

    ARPACK is a collection of Fortran77 subroutines designed to solve large scale eigenvalue problems. The package is designed to compute a few eigenvalues and corresponding eigenvectors of a general n by n matrix A. It is most appropriate for large sparse or structured matrices A, where structured means that a matrix-vector product w ← Av requires on the order of n rather than the usual n² floating-point operations.
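
    ARPACK's routines are wrapped by several higher-level libraries; the sketch below uses SciPy's ARPACK-backed sparse eigensolver to compute a few eigenpairs of a large sparse matrix, which is the use case the abstract describes.

```python
# Compute a handful of eigenvalues/eigenvectors of a large sparse matrix
# with scipy.sparse.linalg.eigs, which calls ARPACK under the hood.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigs

n = 10_000
# Illustrative sparse matrix: random off-diagonal structure plus the identity.
A = sp.random(n, n, density=1e-4, format="csr", random_state=0) + sp.eye(n)

# Ask for the 6 eigenvalues of largest magnitude and their eigenvectors.
vals, vecs = eigs(A, k=6, which="LM")
print(np.sort_complex(vals))
```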

  15. Large-scale CFB combustion demonstration project

    SciTech Connect

    Nielsen, P.T.; Hebb, J.L.; Aquino, R.

    1998-07-01

    The Jacksonville Electric Authority's large-scale CFB demonstration project is described. Given the early stage of project development, the paper focuses on the project organizational structure, its role within the Department of Energy's Clean Coal Technology Demonstration Program, and the projected environmental performance. A description of the CFB combustion process is included.

  16. Large-scale CFB combustion demonstration project

    SciTech Connect

    Nielsen, P.T.; Hebb, J.L.; Aquino, R.

    1998-04-01

    The Jacksonville Electric Authority's large-scale CFB demonstration project is described. Given the early stage of project development, the paper focuses on the project organizational structure, its role within the Department of Energy's Clean Coal Technology Demonstration Program, and the projected environmental performance. A description of the CFB combustion process is included.

  17. Multidisciplinary eHealth Survey Evaluation Methods

    ERIC Educational Resources Information Center

    Karras, Bryant T.; Tufano, James T.

    2006-01-01

    This paper describes the development process of an evaluation framework for describing and comparing web survey tools. We believe that this approach will help shape the design, development, deployment, and evaluation of population-based health interventions. A conceptual framework for describing and evaluating web survey systems will enable the…

  18. Planned NLM/AHCPR large-scale vocabulary test: using UMLS technology to determine the extent to which controlled vocabularies cover terminology needed for health care and public health.

    PubMed Central

    Humphreys, B L; Hole, W T; McCray, A T; Fitzmaurice, J M

    1996-01-01

    The National Library of Medicine (NLM) and the Agency for Health Care Policy and Research (AHCPR) are sponsoring a test to determine the extent to which a combination of existing health-related terminologies covers vocabulary needed in health information systems. The test vocabularies are the 30 that are fully or partially represented in the 1996 edition of the Unified Medical Language System (UMLS) Metathesaurus, plus three planned additions: the portions of SNOMED International not in the 1996 Metathesaurus, the Read Clinical Classification, and the Logical Observation Identifiers Names and Codes (LOINC) system. These vocabularies are available to testers through a special interface to the Internet-based UMLS Knowledge Source Server. The test will determine the ability of the test vocabularies to serve as a source of controlled vocabulary for health data systems and applications. It should provide the basis for realistic resource estimates for developing and maintaining a comprehensive "standard" health vocabulary that is based on existing terminologies. PMID:8816351

  19. The Consortium on Health and Ageing: Network of Cohorts in Europe and the United States (CHANCES) project--design, population and data harmonization of a large-scale, international study.

    PubMed

    Boffetta, Paolo; Bobak, Martin; Borsch-Supan, Axel; Brenner, Hermann; Eriksson, Sture; Grodstein, Fran; Jansen, Eugene; Jenab, Mazda; Juerges, Hendrik; Kampman, Ellen; Kee, Frank; Kuulasmaa, Kari; Park, Yikyung; Tjonneland, Anne; van Duijn, Cornelia; Wilsgaard, Tom; Wolk, Alicja; Trichopoulos, Dimitrios; Bamia, Christina; Trichopoulou, Antonia

    2014-12-01

    There is a public health demand to prevent health conditions which lead to increased morbidity and mortality among the rapidly-increasing elderly population. Data for the incidence of such conditions exist in cohort studies worldwide, which, however, differ in various aspects. The Consortium on Health and Ageing: Network of Cohorts in Europe and the United States (CHANCES) project aims at harmonizing data from existing major longitudinal studies for the elderly whilst focussing on cardiovascular diseases, diabetes mellitus, cancer, fractures and cognitive impairment in order to estimate their prevalence, incidence and cause-specific mortality, and identify lifestyle, socioeconomic, and genetic determinants and biomarkers for the incidence of and mortality from these conditions. A survey instrument assessing ageing-related conditions of the elderly will also be developed. Fourteen cohort studies participate in CHANCES with 683,228 elderly (and 150,210 deaths), from 23 European and three non-European countries. So far, 287 variables on health conditions and a variety of exposures, including biomarkers and genetic data, have been harmonized. Different research hypotheses are investigated with meta-analyses. The results which will be produced can help international organizations, governments and policy-makers to better understand the broader implications and consequences of ageing and thus make informed decisions. PMID:25504016

  20. THE GUATEMALAN SURVEY OF FAMILY HEALTH

    EPA Science Inventory

    The Guatemalan Survey of Family Health, known as EGSF from its name in Spanish, was designed to examine the way in which rural Guatemalan families and individuals cope with childhood illness and pregnancy, and the role of ethnicity, poverty, and social support and health beliefs ...

  1. HISPANIC HEALTH AND NUTRITION EXAMINATION SURVEY (HHANES)

    EPA Science Inventory

    The Hispanic Health and Nutrition Examination Survey (HHANES) was a nationwide probability sample of approximately 16,000 persons, 6 months-74 years of age. Hispanics were included in past health and nutrition examinations, but neither in sufficient numbers to produce estimates o...

  2. NATIONAL MATERNAL AND INFANT HEALTH SURVEY (NMIHS)

    EPA Science Inventory

    The National Maternal and Infant Health Survey (NMIHS) provides data on maternal and infant health, including prenatal care, birth weight, fetal loss, and infant mortality. The objective of the NMIHS is to collect data needed by Federal, State, and private researchers to study fa...

  3. Fractals and cosmological large-scale structure

    NASA Technical Reports Server (NTRS)

    Luo, Xiaochun; Schramm, David N.

    1992-01-01

    Observations of galaxy-galaxy and cluster-cluster correlations as well as other large-scale structure can be fit with a 'limited' fractal with dimension D of about 1.2. This is not a 'pure' fractal out to the horizon: the distribution shifts from power law to random behavior at some large scale. If the observed patterns and structures are formed through an aggregation growth process, the fractal dimension D can serve as an interesting constraint on the properties of the stochastic motion responsible for limiting the fractal structure. In particular, it is found that the observed fractal should have grown from two-dimensional sheetlike objects such as pancakes, domain walls, or string wakes. This result is generic and does not depend on the details of the growth process.
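
    As a worked illustration of the scaling argument above (standard relations, not taken from the paper): a limited fractal of dimension D has counts within radius r scaling as a power law, which maps onto the familiar power-law slope of the galaxy two-point correlation function.

```latex
N(<r) \propto r^{D}, \qquad \xi(r) \propto r^{-(3-D)},
\qquad D \approx 1.2 \;\Rightarrow\; \xi(r) \propto r^{-1.8}.
```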

  4. Large-scale extraction of proteins.

    PubMed

    Cunha, Teresa; Aires-Barros, Raquel

    2002-01-01

    The production of foreign proteins using a selected host with the necessary posttranslational modifications is one of the key successes of modern biotechnology. This methodology allows the industrial production of proteins that would otherwise be produced in small quantities. However, the separation and purification of these proteins from the fermentation media constitute a major bottleneck for the widespread commercialization of recombinant proteins. The major production costs (50-90%) for a typical biological product reside in the purification strategy. There is a need for efficient, effective, and economic large-scale bioseparation techniques to achieve high purity and high recovery while maintaining the biological activity of the molecule. Aqueous two-phase systems (ATPS) allow process integration, as simultaneous separation and concentration of the target protein are achieved, with subsequent removal and recycling of the polymer. The ease of scale-up combined with the high partition coefficients obtained allows their potential application in large-scale downstream processing of proteins produced by fermentation. The equipment and methodology for aqueous two-phase extraction of proteins on a large scale using mixer-settler and column contactors are described. The operation of the columns, either stagewise or differential, is summarized. A brief description of the methods used to account for mass transfer coefficients; hydrodynamic parameters of hold-up, drop size, and velocity; back-mixing in the phases; and flooding performance, all required for column design, is also provided. PMID:11876297
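
    For orientation, the partition coefficient mentioned above and the resulting single-stage yield in the top phase follow from a simple mass balance (a generic ATPS relation, not a result specific to this chapter):

```latex
K = \frac{C_{\mathrm{top}}}{C_{\mathrm{bottom}}}, \qquad
Y_{\mathrm{top}} = \frac{C_{\mathrm{top}} V_{\mathrm{top}}}
{C_{\mathrm{top}} V_{\mathrm{top}} + C_{\mathrm{bottom}} V_{\mathrm{bottom}}}
= \frac{1}{1 + 1/(K R)}, \qquad R = \frac{V_{\mathrm{top}}}{V_{\mathrm{bottom}}},
```

    so a high partition coefficient and a favourable phase-volume ratio together drive the high recoveries that make large-scale extraction attractive.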

  5. Large scale processes in the solar nebula.

    NASA Astrophysics Data System (ADS)

    Boss, A. P.

    Most proposed chondrule formation mechanisms involve processes occurring inside the solar nebula, so the large scale (roughly 1 to 10 AU) structure of the nebula is of general interest for any chondrule-forming mechanism. Chondrules and Ca, Al-rich inclusions (CAIs) might also have been formed as a direct result of the large scale structure of the nebula, such as passage of material through high temperature regions. While recent nebula models do predict the existence of relatively hot regions, the maximum temperatures in the inner planet region may not be high enough to account for chondrule or CAI thermal processing, unless the disk mass is considerably greater than the minimum mass necessary to restore the planets to solar composition. Furthermore, it does not seem to be possible to achieve both rapid heating and rapid cooling of grain assemblages in such a large scale furnace. However, if the accretion flow onto the nebula surface is clumpy, as suggested by observations of variability in young stars, then clump-disk impacts might be energetic enough to launch shock waves which could propagate through the nebula to the midplane, thermally processing any grain aggregates they encounter, and leaving behind a trail of chondrules.

  6. Afghan Health Education Project: a community survey.

    PubMed

    Lipson, J G; Omidian, P A; Paul, S M

    1995-06-01

    This study assessed the health concerns and needs for health education in the Afghan refugee and immigrant community of the San Francisco Bay Area. The study used a telephone survey, seven community meetings and a survey administered to 196 Afghan families through face-to-face interviews. Data were analyzed qualitatively and statistically. Health problems of most concern are mental health problems and stress related to past refugee trauma and loss, current occupational and economic problems, and culture conflict. Physical health problems include heart disease, diabetes and dental problems. Needed health education topics include dealing with stress, heart health, nutrition, raising children in the United States (particularly adolescents), aging in the United States, and diabetes. Using coalition building and involving Afghans in their community assessment, we found that the Afghan community is eager for culture- and language-appropriate health education programs through videos, television, lectures, and written materials. Brief health education talks in community meetings and a health fair revealed enthusiasm and willingness to consider health promotion and disease-prevention practices. PMID:7596962

  7. Multiresolution comparison of precipitation datasets for large-scale models

    NASA Astrophysics Data System (ADS)

    Chun, K. P.; Sapriza Azuri, G.; Davison, B.; DeBeer, C. M.; Wheater, H. S.

    2014-12-01

    Gridded precipitation datasets are crucial for driving large-scale models used in weather forecasting and climate research. However, the quality of precipitation products is usually validated individually. Comparisons between gridded precipitation products, together with ground observations, provide another avenue for investigating how precipitation uncertainty affects the performance of large-scale models. In this study, using data from a set of precipitation gauges over British Columbia and Alberta, we evaluate several widely used North American gridded products, including the Canadian Gridded Precipitation Anomalies (CANGRD), the National Center for Environmental Prediction (NCEP) reanalysis, the Water and Global Change (WATCH) project, the thin plate spline smoothing algorithm (ANUSPLIN) and the Canadian Precipitation Analysis (CaPA). Based on verification criteria for various temporal and spatial scales, the results provide an assessment of possible applications for the various precipitation datasets. For long-term climate variation studies (~100 years), CANGRD, NCEP, WATCH and ANUSPLIN have different comparative advantages in terms of resolution and accuracy. For synoptic and mesoscale precipitation patterns, CaPA provides appealing spatial coherence. In addition to the product comparison, various downscaling methods are also surveyed to explore new verification and bias-reduction methods for improving gridded precipitation outputs for large-scale models.
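
    A minimal sketch of the kind of gauge-versus-product verification described above, assuming the product values have already been co-located with the gauges; the criteria shown (bias, RMSE, correlation) are illustrative, not the study's full verification suite.

```python
# Simple verification of a gridded precipitation product against gauges.
import numpy as np

def verify(product: np.ndarray, gauges: np.ndarray) -> dict:
    """Both arrays hold co-located precipitation values (e.g. monthly totals)."""
    err = product - gauges
    return {
        "bias": float(np.mean(err)),
        "rmse": float(np.sqrt(np.mean(err ** 2))),
        "corr": float(np.corrcoef(product, gauges)[0, 1]),
    }

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gauges = rng.gamma(2.0, 20.0, size=500)             # synthetic gauge series
    product = 0.9 * gauges + rng.normal(0.0, 5.0, 500)  # synthetic gridded product
    print(verify(product, gauges))
```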

  8. Evaluations of treatment efficacy of depression from perspective of both patients' symptoms and general sense of mental health and wellbeing: A large scale, multi-centered, longitudinal study in China.

    PubMed

    Zeng, Qingzhi; Wang, Wei Chun; Fang, Yiru; Mellor, David; Mccabe, Marita; Byrne, Linda; Zuo, Sai; Xu, Yifeng

    2016-07-30

    Relying on the absence, presence, or level of symptomatology may not provide an adequate indication of the effects of treatment for depression, nor sufficient information for the development of treatment plans that meet patients' needs. Using a prospective, multi-centered, and observational design, the present study surveyed a large sample of outpatients with depression in China (n=9855). The 17-item Hamilton Rating Scale for Depression (HRSD-17) and the Remission Evaluation and Mood Inventory Tool (REMIT) were administered at baseline and at two and four weeks, to assess patients' self-reported symptoms and general sense of mental health and wellbeing. Of the 9855 outpatients, 91.3% were diagnosed as experiencing moderate to severe depression. The patients reported significant improvement over time on both depressive symptoms and general sense of mental health and wellbeing after four weeks of treatment. The effect sizes of change in general sense were lower than those in symptoms at both the two-week and four-week follow-ups. Treatment effects on both general sense and depressive symptomatology were associated with demographic and clinical factors. The findings indicate that a focus on general sense of mental health and wellbeing, in addition to depressive symptomatology, will provide clinicians, researchers and patients themselves with a broader perspective on the status of patients. PMID:27156024

  9. Multitree Algorithms for Large-Scale Astrostatistics

    NASA Astrophysics Data System (ADS)

    March, William B.; Ozakin, Arkadas; Lee, Dongryeol; Riegel, Ryan; Gray, Alexander G.

    2012-03-01

    this number every week, resulting in billions of objects. At such scales, even linear-time analysis operations present challenges, particularly since statistical analyses are inherently interactive processes, requiring that computations complete within some reasonable human attention span. The quadratic (or worse) runtimes of straightforward implementations quickly become unbearable. Examples of applications. These analysis subroutines occur ubiquitously in astrostatistical work. We list just a few examples. The need to cross-match objects across different catalogs has led to various algorithms, which at some point perform an AllNN computation. 2-point and higher-order spatial correlations form the basis of spatial statistics, and are utilized in astronomy to compare the spatial structures of two datasets, such as an observed sample and a theoretical sample, forming the basis for two-sample hypothesis testing. Friends-of-friends clustering is often used to identify halos in data from astrophysical simulations. Minimum spanning tree properties have also been proposed as statistics of large-scale structure. Comparison of the distributions of different kinds of objects requires accurate density estimation, for which KDE is the overall statistical method of choice. The prediction of redshifts from optical data requires accurate regression, for which kernel regression is a powerful method. The identification of objects of various types in astronomy, such as stars versus galaxies, requires accurate classification, for which KDA is a powerful method. Overview. In this chapter, we will briefly sketch the main ideas behind recent fast algorithms which achieve, for example, linear runtimes for pairwise-distance problems, or similarly dramatic reductions in computational growth. In some cases, the runtime orders for these algorithms are mathematically provable statements, while in others we have only conjectures backed by experimental observations for the time being.
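
    Two of the subroutines named above (all-nearest-neighbours and kernel density estimation) can be sketched with single-tree methods; the multitree algorithms discussed in the chapter generalise this idea to pairs or tuples of trees. The sketch below uses scikit-learn on synthetic data and is illustrative only.

```python
# Tree-based AllNN and KDE on a synthetic 3-D catalogue.
import numpy as np
from sklearn.neighbors import KDTree, KernelDensity

rng = np.random.default_rng(0)
catalog = rng.uniform(0.0, 1.0, size=(100_000, 3))   # synthetic positions

tree = KDTree(catalog)
# AllNN: nearest neighbour of every object (k=2 because the closest point
# returned for each object is the object itself).
dist, idx = tree.query(catalog, k=2)
nn_dist = dist[:, 1]

# KDE: Gaussian kernel density estimate, evaluated on a subset of points.
kde = KernelDensity(kernel="gaussian", bandwidth=0.05).fit(catalog)
log_density = kde.score_samples(catalog[:1000])

print(nn_dist.mean(), log_density.mean())
```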

  10. Colloquium: Large scale simulations on GPU clusters

    NASA Astrophysics Data System (ADS)

    Bernaschi, Massimo; Bisson, Mauro; Fatica, Massimiliano

    2015-06-01

    Graphics processing units (GPU) are currently used as a cost-effective platform for computer simulations and big-data processing. Large scale applications require that multiple GPUs work together, but the efficiency obtained with clusters of GPUs is, at times, sub-optimal because the GPU features are not exploited at their best. We describe how it is possible to achieve excellent efficiency for applications in statistical mechanics, particle dynamics and network analysis by using suitable memory access patterns and mechanisms like CUDA streams, profiling tools, etc. Similar concepts and techniques may also be applied to other problems, such as the solution of partial differential equations.
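
    A minimal illustration of one of the mechanisms mentioned (CUDA streams), written with CuPy rather than the authors' codes; it assumes a CUDA-capable GPU and simply issues two independent pieces of work on separate streams so that they can overlap.

```python
# Launch independent GPU work on two CUDA streams (CuPy sketch).
import cupy as cp

a = cp.random.random((4096, 4096)).astype(cp.float32)
b = cp.random.random((4096, 4096)).astype(cp.float32)

s1, s2 = cp.cuda.Stream(), cp.cuda.Stream()
with s1:
    c = a @ a          # matrix product queued on stream 1
with s2:
    d = b @ b          # matrix product queued on stream 2
s1.synchronize()
s2.synchronize()
print(float(c.sum() + d.sum()))
```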

  11. Experimental Simulations of Large-Scale Collisions

    NASA Technical Reports Server (NTRS)

    Housen, Kevin R.

    2002-01-01

    This report summarizes research on the effects of target porosity on the mechanics of impact cratering. Impact experiments conducted on a centrifuge provide direct simulations of large-scale cratering on porous asteroids. The experiments show that large craters in porous materials form mostly by compaction, with essentially no deposition of material into the ejecta blanket that is a signature of cratering in less-porous materials. The ratio of ejecta mass to crater mass is shown to decrease with increasing crater size or target porosity. These results are consistent with the observation that large closely-packed craters on asteroid Mathilde appear to have formed without degradation to earlier craters.

  12. Nonthermal Components in the Large Scale Structure

    NASA Astrophysics Data System (ADS)

    Miniati, Francesco

    2004-12-01

    I address the issue of nonthermal processes in the large scale structure of the universe. After reviewing the properties of cosmic shocks and their role as particle accelerators, I discuss the main observational results, from radio to γ-ray, and describe the processes that are thought to be responsible for the observed nonthermal emissions. Finally, I emphasize the important role of γ-ray astronomy for progress in the field. Non-detections at these photon energies have already allowed us to draw important conclusions. Future observations will tell us more about the physics of the intracluster medium, shock dissipation and CR acceleration.

  13. Large-Scale PV Integration Study

    SciTech Connect

    Lu, Shuai; Etingov, Pavel V.; Diao, Ruisheng; Ma, Jian; Samaan, Nader A.; Makarov, Yuri V.; Guo, Xinxin; Hafen, Ryan P.; Jin, Chunlian; Kirkham, Harold; Shlatz, Eugene; Frantzis, Lisa; McClive, Timothy; Karlson, Gregory; Acharya, Dhruv; Ellis, Abraham; Stein, Joshua; Hansen, Clifford; Chadliev, Vladimir; Smart, Michael; Salgo, Richard; Sorensen, Rahn; Allen, Barbara; Idelchik, Boris

    2011-07-29

    This research effort evaluates the impact of large-scale photovoltaic (PV) and distributed generation (DG) output on NV Energy’s electric grid system in southern Nevada. It analyzes the ability of NV Energy’s generation to accommodate increasing amounts of utility-scale PV and DG, and the resulting cost of integrating variable renewable resources. The study was jointly funded by the United States Department of Energy and NV Energy, and conducted by a project team comprised of industry experts and research scientists from Navigant Consulting Inc., Sandia National Laboratories, Pacific Northwest National Laboratory and NV Energy.

  14. Large-scale planar lightwave circuits

    NASA Astrophysics Data System (ADS)

    Bidnyk, Serge; Zhang, Hua; Pearson, Matt; Balakrishnan, Ashok

    2011-01-01

    By leveraging advanced wafer processing and flip-chip bonding techniques, we have succeeded in hybrid integrating a myriad of active optical components, including photodetectors and laser diodes, with our planar lightwave circuit (PLC) platform. We have combined hybrid integration of active components with monolithic integration of other critical functions, such as diffraction gratings, on-chip mirrors, mode-converters, and thermo-optic elements. Further process development has led to the integration of polarization controlling functionality. Most recently, all these technological advancements have been combined to create large-scale planar lightwave circuits that comprise hundreds of optical elements integrated on chips less than a square inch in size.

  15. Neutrinos and large-scale structure

    SciTech Connect

    Eisenstein, Daniel J.

    2015-07-15

    I review the use of cosmological large-scale structure to measure properties of neutrinos and other relic populations of light relativistic particles. With experiments to measure the anisotropies of the cosmic microwave background and the clustering of matter at low redshift, we have now securely measured a relativistic background with density appropriate to the cosmic neutrino background. Our limits on the mass of the neutrino continue to shrink. Experiments coming in the next decade will greatly improve the available precision on searches for the energy density of novel relativistic backgrounds and the mass of neutrinos.

  16. Large scale phononic metamaterials for seismic isolation

    SciTech Connect

    Aravantinos-Zafiris, N.; Sigalas, M. M.

    2015-08-14

    In this work, we numerically examine structures that could be characterized as large scale phononic metamaterials. These novel structures could have band gaps in the frequency spectrum of seismic waves when their dimensions are chosen appropriately, thus raising the belief that they could be serious candidates for seismic isolation structures. Different and easy to fabricate structures were examined made from construction materials such as concrete and steel. The well-known finite difference time domain method is used in our calculations in order to calculate the band structures of the proposed metamaterials.

  17. Quantifying expert consensus against the existence of a secret, large-scale atmospheric spraying program

    NASA Astrophysics Data System (ADS)

    Shearer, Christine; West, Mick; Caldeira, Ken; Davis, Steven J.

    2016-08-01

    Nearly 17% of people in an international survey said they believed the existence of a secret large-scale atmospheric program (SLAP) to be true or partly true. SLAP is commonly referred to as ‘chemtrails’ or ‘covert geoengineering’, and has led to a number of websites purported to show evidence of widespread chemical spraying linked to negative impacts on human health and the environment. To address these claims, we surveyed two groups of experts—atmospheric chemists with expertise in condensation trails and geochemists working on atmospheric deposition of dust and pollution—to scientifically evaluate for the first time the claims of SLAP theorists. Results show that 76 of the 77 scientists (98.7%) who took part in this study said they had not encountered evidence of a SLAP, and that the data cited as evidence could be explained through other factors, including well-understood physics and chemistry associated with aircraft contrails and atmospheric aerosols. Our goal is not to sway those already convinced that there is a secret, large-scale spraying program—who often reject counter-evidence as further proof of their theories—but rather to establish a source of objective science that can inform public discourse.

  18. Large-scale data mining pilot project in human genome

    SciTech Connect

    Musick, R.; Fidelis, R.; Slezak, T.

    1997-05-01

    This whitepaper briefly describes a new, aggressive effort in large-scale data mining at Livermore National Labs. The implications of 'large-scale' will be clarified in a later section. In the short term, this effort will focus on several mission-critical questions of the Genome project. We will adapt current data mining techniques to the Genome domain, quantify the accuracy of inference results, and lay the groundwork for a more extensive effort in large-scale data mining. A major aspect of the approach is that we will build on a fully-staffed data warehousing effort in the human Genome area. The long-term goal is a strong applications-oriented research program in large-scale data mining. The tools and skill set gained will be directly applicable to a wide spectrum of tasks involving large spatial and multidimensional data. This includes applications in ensuring non-proliferation, stockpile stewardship, enabling Global Ecology (Materials Database, Industrial Ecology), advancing the Biosciences (Human Genome Project), and supporting data for others (Battlefield Management, Health Care).

  19. Large-scale Globally Propagating Coronal Waves

    NASA Astrophysics Data System (ADS)

    Warmuth, Alexander

    2015-09-01

    Large-scale, globally propagating wave-like disturbances have been observed in the solar chromosphere and by inference in the corona since the 1960s. However, detailed analysis of these phenomena has only been conducted since the late 1990s. This was prompted by the availability of high-cadence coronal imaging data from numerous spaced-based instruments, which routinely show spectacular globally propagating bright fronts. Coronal waves, as these perturbations are usually referred to, have now been observed in a wide range of spectral channels, yielding a wealth of information. Many findings have supported the "classical" interpretation of the disturbances: fast-mode MHD waves or shocks that are propagating in the solar corona. However, observations that seemed inconsistent with this picture have stimulated the development of alternative models in which "pseudo waves" are generated by magnetic reconfiguration in the framework of an expanding coronal mass ejection. This has resulted in a vigorous debate on the physical nature of these disturbances. This review focuses on demonstrating how the numerous observational findings of the last one and a half decades can be used to constrain our models of large-scale coronal waves, and how a coherent physical understanding of these disturbances is finally emerging.

  20. Korea Community Health Survey Data Profiles.

    PubMed

    Kang, Yang Wha; Ko, Yun Sil; Kim, Yoo Jin; Sung, Kyoung Mi; Kim, Hyo Jin; Choi, Hyung Yun; Sung, Changhyun; Jeong, Eunkyeong

    2015-06-01

    In 2008, Korea Centers for Disease Control and Prevention initiated the first nationwide survey, the Korea Community Health Survey (KCHS), to provide data that could be used to plan, implement, monitor, and evaluate community health promotion and disease prevention programs. This community-based cross-sectional survey has been conducted by 253 community health centers, 35 community universities, and 1500 interviewers. The KCHS standardized questionnaire was developed jointly by the Korea Centers for Disease Control and Prevention staff, a working group of the health indicators standardization subcommittee, and 16 metropolitan cities and provinces with 253 regional sites. The questionnaire covers a variety of topics related to health behaviors and prevention, which is used to assess the prevalence of personal health practices and behaviors related to the leading causes of disease, including smoking, alcohol use, drinking and driving, high blood pressure control, physical activity, weight control, quality of life (European Quality of Life-5 Dimensions, European Quality of Life-Visual Analogue Scale, Korean Instrumental Activities of Daily Living), medical service, accidents, injury, etc. The KCHS was administered by trained interviewers, and the quality control of the KCHS was improved by the introduction of computer-assisted personal interviewing in 2010. The KCHS data allow direct comparison of health issues among provinces. Furthermore, the provinces can use these data for their own cost-effective health interventions to improve health promotion and disease prevention. For users and researchers throughout the world, microdata (in the form of SAS files) and analytic guidelines can be downloaded from the KCHS website (http://KCHS.cdc.go.kr/) in Korean. PMID:26430619
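
    Since the microdata are distributed as SAS files, a downloaded table can be read with standard tooling; the file name and encoding below are hypothetical placeholders, not actual KCHS release details.

```python
# Load a KCHS microdata extract distributed as a SAS dataset (illustrative).
import pandas as pd

kchs = pd.read_sas("kchs_2015.sas7bdat", encoding="euc-kr")  # hypothetical file
print(kchs.shape)
print(kchs.head())
```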

  1. Decision maker perceptions of resource allocation processes in Canadian health care organizations: a national survey

    PubMed Central

    2013-01-01

    Background Resource allocation is a key challenge for healthcare decision makers. While several case studies of organizational practice exist, there have been few large-scale cross-organization comparisons. Methods Between January and April 2011, we conducted an on-line survey of senior decision makers within regional health authorities (and closely equivalent organizations) across all Canadian provinces and territories. We received returns from 92 individual managers, from 60 out of 89 organizations in total. The survey inquired about structures, process features, and behaviours related to organization-wide resource allocation decisions. We focus here on three main aspects: type of process, perceived fairness, and overall rating. Results About one-half of respondents indicated that their organization used a formal process for resource allocation, while the others reported that political or historical factors were predominant. Seventy percent (70%) of respondents self-reported that their resource allocation process was fair, and just over one-half assessed their process as ‘good’ or ‘very good’. This paper explores these findings in greater detail and assesses them in the context of the larger literature. Conclusion Data from this large-scale cross-jurisdictional survey help to illustrate common challenges and areas of positive performance among Canada's health system leadership teams. PMID:23819598

  2. Large-scale magnetic topologies of early M dwarfs

    NASA Astrophysics Data System (ADS)

    Donati, J.-F.; Morin, J.; Petit, P.; Delfosse, X.; Forveille, T.; Aurière, M.; Cabanac, R.; Dintrans, B.; Fares, R.; Gastine, T.; Jardine, M. M.; Lignières, F.; Paletou, F.; Ramirez Velez, J. C.; Théado, S.

    2008-10-01

    We present here additional results of a spectropolarimetric survey of a small sample of stars ranging from spectral type M0 to M8 aimed at investigating observationally how dynamo processes operate in stars on both sides of the full convection threshold (spectral type M4). The present paper focuses on early M stars (M0-M3), that is, above the full convection threshold. Applying tomographic imaging techniques to time series of rotationally modulated circularly polarized profiles collected with the NARVAL spectropolarimeter, we determine the rotation period and reconstruct the large-scale magnetic topologies of six early M dwarfs. We find that early-M stars preferentially host large-scale fields with dominantly toroidal and non-axisymmetric poloidal configurations, along with significant differential rotation (and long-term variability); only the lowest-mass star of our subsample is found to host an almost fully poloidal, mainly axisymmetric large-scale field resembling those found in mid-M dwarfs. This abrupt change in the large-scale magnetic topologies of M dwarfs (occurring at spectral type M3) has no related signature on X-ray luminosities (measuring the total amount of magnetic flux); it thus suggests that underlying dynamo processes become more efficient at producing large-scale fields (despite producing the same flux) at spectral types later than M3. We suspect that this change relates to the rapid decrease in the radiative cores of low-mass stars and to the simultaneous sharp increase of the convective turnover times (with decreasing stellar mass) that models predict to occur at M3; it may also be (at least partly) responsible for the reduced magnetic braking reported for fully convective stars. Based on observations obtained at the Télescope Bernard Lyot (TBL), operated by the Institut National des Sciences de l'Univers of the Centre National de la Recherche Scientifique of France.

  3. [National Strategic Promotion for Large-Scale Clinical Cancer Research].

    PubMed

    Toyama, Senya

    2016-04-01

    The number of clinical research studies conducted by clinical cancer study groups has been decreasing in Japan this year. They say the reason is the abolition of donations to the groups from the pharmaceutical companies after the Diovan scandal. But I suppose the fundamental problem is that a government-supported, large-scale clinical cancer study system for evidence-based medicine (EBM) has not been fully established. Urgent establishment of such a system, based on a national strategy, is needed for cancer patients and for public health promotion. PMID:27220800

  4. Quality of data in multiethnic health surveys.

    PubMed Central

    Pasick, R. J.; Stewart, S. L.; Bird, J. A.; D'Onofrio, C. N.

    2001-01-01

    OBJECTIVE: There has been insufficient research on the influence of ethno-cultural and language differences in public health surveys. Using data from three independent studies, the authors examine methods to assess data quality and to identify causes of problematic survey questions. METHODS: Qualitative and quantitative methods were used in this exploratory study, including secondary analyses of data from three baseline surveys (conducted in English, Spanish, Cantonese, Mandarin, and Vietnamese). Collection of additional data included interviews with investigators and interviewers; observations of item development; focus groups; think-aloud interviews; a test-retest assessment survey; and a pilot test of alternatively worded questions. RESULTS: The authors identify underlying causes for the 12 most problematic variables in three multiethnic surveys and describe them in terms of ethnic differences in reliability, validity, and cognitive processes (interpretation, memory retrieval, judgment formation, and response editing), and differences with regard to cultural appropriateness and translation problems. CONCLUSIONS: Multiple complex elements affect measurement in a multiethnic survey, many of which are neither readily observed nor understood through standard tests of data quality. Multiethnic survey questions are best evaluated using a variety of quantitative and qualitative methods that reveal different types and causes of problems. PMID:11889288

  5. Large-Scale Organization of Glycosylation Networks

    NASA Astrophysics Data System (ADS)

    Kim, Pan-Jun; Lee, Dong-Yup; Jeong, Hawoong

    2009-03-01

    Glycosylation is a highly complex process to produce a diverse repertoire of cellular glycans that are frequently attached to proteins and lipids. Glycans participate in fundamental biological processes including molecular trafficking and clearance, cell proliferation and apoptosis, developmental biology, immune response, and pathogenesis. N-linked glycans found on proteins are formed by sequential attachments of monosaccharides with the help of a relatively small number of enzymes. Many of these enzymes can accept multiple N-linked glycans as substrates, thus generating a large number of glycan intermediates and their intermingled pathways. Motivated by the quantitative methods developed in complex network research, we investigate the large-scale organization of such N-glycosylation pathways in a mammalian cell. The uncovered results give the experimentally-testable predictions for glycosylation process, and can be applied to the engineering of therapeutic glycoproteins.

  6. Large-scale databases of proper names.

    PubMed

    Conley, P; Burgess, C; Hage, D

    1999-05-01

    Few tools for research in proper names have been available--specifically, there is no large-scale corpus of proper names. Two corpora of proper names were constructed, one based on U.S. phone book listings, the other derived from a database of Usenet text. Name frequencies from both corpora were compared with human subjects' reaction times (RTs) to the proper names in a naming task. Regression analysis showed that the Usenet frequencies contributed to predictions of human RT, whereas phone book frequencies did not. In addition, semantic neighborhood density measures derived from the HAL corpus were compared with the subjects' RTs and found to be a better predictor of RT than was frequency in either corpus. These new corpora are freely available on line for download. Potentials for these corpora range from using the names as stimuli in experiments to using the corpus data in software applications. PMID:10495803

  7. Estimation of large-scale dimension densities.

    PubMed

    Raab, C; Kurths, J

    2001-07-01

    We propose a technique to calculate large-scale dimension densities in both higher-dimensional spatio-temporal systems and low-dimensional systems from only a few data points, where known methods usually have an unsatisfactory scaling behavior. This is mainly due to boundary and finite-size effects. With our rather simple method, we normalize boundary effects and get a significant correction of the dimension estimate. This straightforward approach is based on rather general assumptions. So even weak coherent structures obtained from small spatial couplings can be detected with this method, which is impossible by using the Lyapunov-dimension density. We demonstrate the efficiency of our technique for coupled logistic maps, coupled tent maps, the Lorenz attractor, and the Roessler attractor. PMID:11461376
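
    As background for the quantity being refined above, a plain correlation-sum (Grassberger-Procaccia style) dimension estimate can be sketched as follows; this is the conventional estimator whose boundary and finite-size problems the paper's normalisation addresses, not the authors' own technique.

```python
# Generic correlation-sum dimension estimate on a small point set.
import numpy as np
from scipy.spatial.distance import pdist

def correlation_dimension(points: np.ndarray, radii: np.ndarray) -> float:
    """Slope of log C(r) versus log r estimates the correlation dimension."""
    d = pdist(points)                                  # all pairwise distances
    c = np.array([(d < r).mean() for r in radii])      # correlation sum C(r)
    slope, _ = np.polyfit(np.log(radii), np.log(c), 1)
    return float(slope)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    cloud = rng.uniform(size=(2000, 2))         # points filling the unit square
    radii = np.logspace(-2, -0.5, 10)
    print(correlation_dimension(cloud, radii))  # expected to be close to 2
```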

  8. The challenge of large-scale structure

    NASA Astrophysics Data System (ADS)

    Gregory, S. A.

    1996-03-01

    The tasks that I have assumed for myself in this presentation include three separate parts. The first, appropriate to the particular setting of this meeting, is to review the basic work of the founding of this field; the appropriateness comes from the fact that W. G. Tifft made immense contributions that are not often realized by the astronomical community. The second task is to outline the general tone of the observational evidence for large scale structures. (Here, in particular, I cannot claim to be complete. I beg forgiveness from any workers who are left out by my oversight for lack of space and time.) The third task is to point out some of the major aspects of the field that may represent the clues by which some brilliant sleuth will ultimately figure out how galaxies formed.

  9. Engineering management of large scale systems

    NASA Technical Reports Server (NTRS)

    Sanders, Serita; Gill, Tepper L.; Paul, Arthur S.

    1989-01-01

    The organization of high technology and engineering problem solving has given rise to an emerging concept. Reasoning principles for integrating traditional engineering problem solving with system theory, management sciences, behavioral decision theory, and planning and design approaches can be incorporated into a methodological approach to solving problems with a long-range perspective. Long-range planning has great potential to improve productivity by using a systematic and organized approach. Thus, efficiency and cost effectiveness are the driving forces in promoting the organization of engineering problems. Aspects of systems engineering that provide an understanding of the management of large-scale systems are broadly covered here. Due to the focus and application of the research, other significant factors (e.g., human behavior, decision making, etc.) are not emphasized but are considered.

  10. Large scale cryogenic fluid systems testing

    NASA Technical Reports Server (NTRS)

    1992-01-01

    NASA Lewis Research Center's Cryogenic Fluid Systems Branch (CFSB) within the Space Propulsion Technology Division (SPTD) has the ultimate goal of enabling the long term storage and in-space fueling/resupply operations for spacecraft and reusable vehicles in support of space exploration. Using analytical modeling, ground based testing, and on-orbit experimentation, the CFSB is studying three primary categories of fluid technology: storage, supply, and transfer. The CFSB is also investigating fluid handling, advanced instrumentation, and tank structures and materials. Ground based testing of large-scale systems is done using liquid hydrogen as a test fluid at the Cryogenic Propellant Tank Facility (K-site) at Lewis' Plum Brook Station in Sandusky, Ohio. A general overview of tests involving liquid transfer, thermal control, pressure control, and pressurization is given.