Science.gov

Sample records for large-scale health survey

  1. Survey on large scale system control methods

    NASA Technical Reports Server (NTRS)

    Mercadal, Mathieu

    1987-01-01

    The problems inherent to large-scale systems such as power networks, communication networks, and economic or ecological systems were studied. The increase in size and flexibility of future spacecraft has placed those dynamical systems in the category of large-scale systems, and tools specific to this class are being sought to design control systems that can guarantee more stability and better performance. Among several survey papers, reference was found to a thorough investigation of decentralized control methods. Especially helpful was its classification of the different existing approaches to dealing with large-scale systems. A very similar classification is used here, even though the papers surveyed differ somewhat from those reviewed elsewhere. Special attention is given to the applicability of the existing methods to controlling large mechanical systems such as large space structures. Some recent developments are added to this survey.

  2. The XMM Large Scale Structure Survey

    NASA Astrophysics Data System (ADS)

    Pierre, Marguerite

    2005-10-01

    We propose to complete, by an additional 5 deg², the XMM-LSS Survey region overlying the Spitzer/SWIRE field. This field already has CFHTLS and INTEGRAL coverage, and will encompass about 10 deg². The resulting multi-wavelength medium-depth survey, which complements the XMM and Chandra deep surveys, will provide a unique view of large-scale structure over a wide range of redshift, and will show active galaxies in the full range of environments. The complete coverage by optical and IR surveys provides high-quality photometric redshifts, so that cosmological results can quickly be extracted. In the spirit of a Legacy survey, we will make the raw X-ray data immediately public. Multi-band catalogues and images will also be made available on short time scales.

  3. Large scale survey of enteric viruses in river and waste water underlines the health status of the local population.

    PubMed

    Prevost, B; Lucas, F S; Goncalves, A; Richard, F; Moulin, L; Wurtzer, S

    2015-06-01

    Although enteric viruses constitute a major cause of acute waterborne diseases worldwide, environmental data about the occurrence and viral load of enteric viruses in water are not often available. In this study, enteric viruses (i.e., adenovirus (AdV), aichivirus, astrovirus, cosavirus, enterovirus, hepatitis A and E viruses, norovirus of genogroups I (NVGI) and II (NVGII), rotavirus A (RV-A) and salivirus) were monitored in the Seine River and the origin of contamination was untangled. A total of 275 water samples were collected, twice a month for one year, from the river Seine, its tributaries and the major wastewater treatment plant (WWTP) effluents in the Paris agglomeration. All water samples were negative for hepatitis A and E viruses. AdV, NVGI, NVGII and RV-A were the most prevalent and abundant populations in all water samples. The viral load and the detection frequency increased significantly between the samples collected furthest upstream and furthest downstream of the Paris urban area. The calculated viral fluxes clearly demonstrated the measurable impact of WWTP effluents on the viral contamination of the Seine River. The viral load was seasonal for almost all enteric viruses, in accordance with the gastroenteritis recordings provided by the French medical authorities. These results imply a close relationship between the health status of inhabitants and the viral contamination of WWTP effluents, and consequently surface water contamination. Accordingly, the regular analysis of wastewater could serve as a proxy for monitoring the human viruses circulating both in a population and in surface water. PMID:25795193

  4. Survey Design for Large-Scale, Unstructured Resistivity Surveys

    NASA Astrophysics Data System (ADS)

    Labrecque, D. J.; Casale, D.

    2009-12-01

    In this paper, we discuss the issues in designing data collection strategies for large-scale, poorly structured resistivity surveys. Existing or proposed applications for these types of surveys include carbon sequestration, enhanced oil recovery monitoring, monitoring of leachate from working or abandoned mines, and mineral surveys. Electrode locations are generally dictated by land access, utilities, roads, existing wells, etc. Classical arrays such as the Wenner or dipole-dipole arrays are not applicable if the electrodes cannot be placed in quasi-regular lines or grids. A new, far more generalized strategy is needed for building data collection schemes. Following the approach of earlier two-dimensional (2-D) survey designs, the proposed method begins by defining a base array. In 2-D design, this base array is often a standard dipole-dipole array. For unstructured three-dimensional (3-D) design, determining this base array is a multi-step process. The first step is to determine a set of base dipoles with similar characteristics. For example, the base dipoles may consist of electrode pairs trending within 30 degrees of north and with a length between 100 and 250 m. These dipoles are then combined into a trial set of arrays, which is reduced by applying a series of filters based on criteria such as the separation between the dipoles. Using the base array set, additional arrays are added and tested to determine the overall improvement in resolution and to arrive at an optimal set of arrays. Examples of the design process are shown for a proposed carbon sequestration monitoring system.
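    The base-dipole selection step described above amounts to a geometric filter over candidate electrode pairs. The sketch below is illustrative, not code from the paper: the thresholds (within 30 degrees of north, 100-250 m long) come from the example in the abstract, while the electrode layout and function name are invented for demonstration.

    ```python
    import numpy as np

    def candidate_dipoles(electrodes, az_tol_deg=30.0, min_len=100.0, max_len=250.0):
        """Return index pairs (i, j) of electrodes whose separation trends
        within az_tol_deg of north and whose length lies in [min_len, max_len].
        electrodes: (n, 2) array of (east, north) positions in metres."""
        pairs = []
        n = len(electrodes)
        for i in range(n):
            for j in range(i + 1, n):
                dx, dy = electrodes[j] - electrodes[i]  # east, north offsets (m)
                length = np.hypot(dx, dy)
                if not (min_len <= length <= max_len):
                    continue
                # Azimuth from north, folded to [0, 90] so N- and S-trending match
                az = np.degrees(np.arctan2(abs(dx), abs(dy)))
                if az <= az_tol_deg:
                    pairs.append((i, j))
        return pairs

    # Irregular electrode positions, as would arise from land-access constraints
    rng = np.random.default_rng(0)
    electrodes = rng.uniform(0.0, 1000.0, size=(50, 2))
    base_dipoles = candidate_dipoles(electrodes)
    ```

    The same pattern extends to the later filtering stage: trial arrays built from these dipoles are culled by further predicates (e.g. dipole separation) before resolution testing.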

  5. Large Scale Survey Data in Career Development Research

    ERIC Educational Resources Information Center

    Diemer, Matthew A.

    2008-01-01

    Large scale survey datasets have been underutilized but offer numerous advantages for career development scholars, as they contain numerous career development constructs with large and diverse samples that are followed longitudinally. Constructs such as work salience, vocational expectations, educational expectations, work satisfaction, and…

  6. Theoretical expectations for bulk flows in large-scale surveys

    NASA Technical Reports Server (NTRS)

    Feldman, Hume A.; Watkins, Richard

    1994-01-01

    We calculate the theoretical expectation for the bulk motion of a large-scale survey of the type recently carried out by Lauer and Postman. Included are the effects of survey geometry, errors in the distance measurements, clustering properties of the sample, and different assumed power spectra. We consider the power spectrum calculated from the Infrared Astronomy Satellite (IRAS)-QDOT survey, as well as spectra from hot + cold and standard cold dark matter models. We find that measurement uncertainty, sparse sampling, and clustering can lead to a much larger expectation for the bulk motion of a cluster sample than for the volume as a whole. However, our results suggest that the expected bulk motion is still inconsistent with that reported by Lauer and Postman at the 95%-97% confidence level.
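    For context, the linear-theory expectation such calculations rest on can be written as follows; this is the standard windowed bulk-flow variance, stated here for reference rather than quoted from the paper:

    ```latex
    \sigma_B^2 \;=\; \frac{H_0^2\, f(\Omega)^2}{2\pi^2} \int_0^\infty \mathrm{d}k\; P(k)\, \bigl|\widetilde{W}(k)\bigr|^2
    ```

    where P(k) is the assumed matter power spectrum and the window function W̃ encodes the survey geometry, sampling, and distance errors — precisely the effects the abstract lists as modifying the expectation.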

  7. Interloper bias in future large-scale structure surveys

    NASA Astrophysics Data System (ADS)

    Pullen, Anthony R.; Hirata, Christopher M.; Doré, Olivier; Raccanelli, Alvise

    2016-02-01

    Next-generation spectroscopic surveys will map the large-scale structure of the observable universe, using emission line galaxies as tracers. While each survey will map the sky with a specific emission line, interloping emission lines can masquerade as the survey's intended emission line at different redshifts. Interloping lines from galaxies that are not removed can contaminate the power spectrum measurement, mixing correlations from various redshifts and diluting the true signal. We assess the potential for power spectrum contamination, finding that an interloper fraction worse than 0.2% could bias power spectrum measurements for future surveys by more than 10% of statistical errors, while also biasing power spectrum inferences. We also construct a formalism for predicting cosmological parameter measurement bias, demonstrating that a 0.15%-0.3% interloper fraction could bias the growth rate by more than 10% of the error, which can affect constraints on gravity from upcoming surveys. We use the COSMOS Mock Catalog (CMC), with the emission lines rescaled to better reproduce recent data, to predict potential interloper fractions for the Prime Focus Spectrograph (PFS) and the Wide-Field InfraRed Survey Telescope (WFIRST). We find that secondary line identification, or confirming galaxy redshifts by finding correlated emission lines, can remove interlopers for PFS. For WFIRST, we use the CMC to predict that the 0.2% target can be reached for the WFIRST Hα survey, but sensitive optical and near-infrared photometry will be required. For the WFIRST [O III] survey, the predicted interloper fractions reach several percent and their effects will have to be estimated and removed statistically (e.g., with deep training samples). These results are optimistic as the CMC does not capture the full set of correlations of galaxy properties in the real Universe, and they do not include blending effects. Mitigating interloper contamination will be crucial to the next generation of

  8. Characterizing unknown systematics in large scale structure surveys

    SciTech Connect

    Agarwal, Nishant; Ho, Shirley; Myers, Adam D.; Seo, Hee-Jong; Ross, Ashley J.; Bahcall, Neta; Brinkmann, Jonathan; Eisenstein, Daniel J.; Muna, Demitri; Palanque-Delabrouille, Nathalie; Yèche, Christophe; Petitjean, Patrick; Schneider, Donald P.; Streblyanska, Alina; Weaver, Benjamin A.

    2014-04-01

    Photometric large scale structure (LSS) surveys probe the largest volumes in the Universe, but are inevitably limited by systematic uncertainties. Imperfect photometric calibration leads to biases in our measurements of the density fields of LSS tracers such as galaxies and quasars, and as a result in cosmological parameter estimation. Earlier studies have proposed using cross-correlations between different redshift slices or cross-correlations between different surveys to reduce the effects of such systematics. In this paper we develop a method to characterize unknown systematics. We demonstrate that while we do not have sufficient information to correct for unknown systematics in the data, we can obtain an estimate of their magnitude. We define a parameter to estimate contamination from unknown systematics using cross-correlations between different redshift slices and propose discarding bins in the angular power spectrum that lie outside a certain contamination tolerance level. We show that this method improves estimates of the bias using simulated data and further apply it to photometric luminous red galaxies in the Sloan Digital Sky Survey as a case study.

  9. Consent and widespread access to personal health information for the delivery of care: a large scale telephone survey of consumers' attitudes using vignettes in New Zealand

    PubMed Central

    Whiddett, Dick; Hunter, Inga; McDonald, Barry; Norris, Tony; Waldon, John

    2016-01-01

    Objectives In light of recent health policy, to examine factors which influence the public's willingness to consent to share their health information in a national electronic health record (EHR). Design Data were collected in a national telephone survey in 2008. Respondents were presented with vignettes that described situations in which their health information was shared and asked if they would consent to such sharing. The subset consisting of the 18 vignettes that covered providing care was reanalysed in depth using new statistical methods in 2016. Setting Adult population of New Zealand accessible by telephone landline. Participants 4209 adults aged 18+ years in the full data set, 2438 of whom are included in the selected subset. Main outcome measures For each of the 18 vignettes, we measured the percentage of respondents who would consent for their information to be shared, for two groups: those who did not consider that their records contained sensitive information, and those who did (or refused to say). Results Rates of consent ranged from 89% (95% CI 87% to 92%) for sharing of information with hospital doctors and nurses to 51% (47% to 55%) for government agencies. Mixed-effects logistic regression was used to identify factors which had a significant impact on consent. The role of the recipient and the level of detail influenced respondents' willingness to consent (p<0.0001 for both factors). Of the individual characteristics, the biggest effect was that respondents whose records contain sensitive information (or who refused to answer) were less willing to consent (p<0.0001). Conclusions A proportion of the population is reluctant to share their health information beyond doctors, nurses and paramedics, particularly when records contain sensitive information. These findings may have adverse implications for healthcare strategies based on widespread sharing of information. Further research is needed to understand and overcome people's ambivalence towards

  10. Survey of decentralized control methods. [for large scale dynamic systems

    NASA Technical Reports Server (NTRS)

    Athans, M.

    1975-01-01

    An overview is presented of the types of problems that are being considered by control theorists in the area of dynamic large scale systems with emphasis on decentralized control strategies. Approaches that deal directly with decentralized decision making for large scale systems are discussed. It is shown that future advances in decentralized system theory are intimately connected with advances in the stochastic control problem with nonclassical information pattern. The basic assumptions and mathematical tools associated with the latter are summarized, and recommendations concerning future research are presented.

  11. A bibliographical survey of large-scale systems

    NASA Technical Reports Server (NTRS)

    Corliss, W. R.

    1970-01-01

    A limited, partly annotated bibliography was prepared on the subject of large-scale system control. Approximately 400 references are divided into thirteen application areas, such as large societal systems and large communication systems. A first-author index is provided.

  12. Performance Health Monitoring of Large-Scale Systems

    SciTech Connect

    Rajamony, Ram

    2014-11-20

    This report details the progress made on the ASCR-funded project Performance Health Monitoring for Large Scale Systems. A large-scale application may not achieve its full performance potential due to degraded performance of even a single subsystem. Detecting performance faults, isolating them, and taking remedial action is critical for the scale of systems on the horizon. Performance Health Monitoring (PHM) aims to develop techniques and tools that can be used to identify and mitigate such performance problems. We accomplish this through two main aspects. The PHM framework encompasses diagnostics, system monitoring, fault isolation, and performance evaluation capabilities that indicate when a performance fault has been detected, whether due to an anomaly in the system itself or due to contention for shared resources between concurrently executing jobs. Software components called the PHM Control system then build upon the capabilities provided by the PHM framework to mitigate degradation caused by performance problems.

  13. Probing the large scale structure with the Dark Energy Survey

    NASA Astrophysics Data System (ADS)

    Leistedt, Boris

    2016-03-01

    I will present the latest cosmological results from the Dark Energy Survey (DES), a 5000 square degree optical galaxy survey in the Southern Hemisphere started in 2012. I will focus on the constraints on Baryon Acoustic Oscillations and other cosmological parameters obtained with galaxy clustering measurements from the first years of DES data. I will highlight the various tests and methods that make these results not only precise but also robust against observational systematics and modeling uncertainties. Finally, I will describe the future phases of the survey, the expected increase in constraining power, and the challenges that need to be addressed to fully exploit the data from surveys such as DES and LSST.

  14. Testing model independent modified gravity with future large scale surveys

    SciTech Connect

    Thomas, Daniel B.; Contaldi, Carlo R. E-mail: c.contaldi@ic.ac.uk

    2011-12-01

    Model-independent parametrisations of modified gravity have attracted a lot of attention over the past few years, and numerous combinations of experiments and observables have been suggested to constrain the parameters used in these models. Galaxy clusters have been mentioned, but not examined as extensively in the literature as some other probes. Here we add galaxy clusters to the mix of observables and examine how they could improve the constraints on the modified gravity parameters. In particular, we forecast the constraints from combining Planck satellite Cosmic Microwave Background (CMB) measurements and its Sunyaev-Zel'dovich (SZ) cluster catalogue with a DES-like Weak Lensing (WL) survey. We find that cluster counts significantly improve the constraints over those derived using CMB and WL alone. We then look at surveys further into the future, to see how much further it may be feasible to tighten the constraints.

  15. A large-scale integrated aerogeophysical survey of Afghanistan

    NASA Astrophysics Data System (ADS)

    Brozena, J. M.; Childers, V. A.; Gardner, J. M.; Liang, R. T.; Bowles, J. H.; Abraham, J. D.

    2007-12-01

    A multi-sensor, multidisciplinary aerogeophysical survey of a major portion of Afghanistan was recently conducted by investigators from the Naval Research Laboratory and the U.S. Geological Survey. More than 110,000 line km of data tracks were flown aboard an NP-3D Orion aircraft. Sensor systems installed on the P-3 included dual gravimeters, scalar and vector magnetometers, a digital photogrammetric camera, a hyperspectral imager, and an L-band polarimetric synthetic aperture radar (SAR). Data from all sources were precisely co-registered to the ground by a combination of interferometric-mode Global Positioning System (GPS) and inertial measurements. The data from this integrated mapping mission support numerous basic and applied science efforts in Afghanistan, including resource assessment and exploration for oil, gas, and minerals; development of techniques for sensor fusion and automated analysis; and topics in crustal geophysics and geodesy. The data will also support civil infrastructure needs such as cadastral surveying, urban planning and development, pipeline/powerline/road routing and construction, agriculture and hydrologic resource management, earthquake hazard analysis, and base maps for humanitarian relief missions.

  16. Large-Scale Environmental Influences on Aquatic Animal Health

    EPA Science Inventory

    In the latter portion of the 20th century, North America experienced numerous large-scale mortality events affecting a broad diversity of aquatic animals. Short-term forensic investigations of these events have sometimes characterized a causative agent or condition, but have rare...

  17. Large Scale Structure at 24 Microns in the SWIRE Survey

    NASA Astrophysics Data System (ADS)

    Masci, F. J.; SWIRE Team

    2006-12-01

    We present initial results of galaxy clustering at 24 μm by analyzing statistics of the projected galaxy distribution from counts-in-cells. This study focuses on the ELAIS-North1 SWIRE field. The sample covers ≃5.9 deg² and contains 24,715 sources detected at 24 μm to a 5.6σ limit of 250 μJy (in the lowest coverage regions). We have explored clustering as a function of 3.6-24 μm color and 24 μm flux density using angular-averaged two-point correlation functions derived from the variance of counts-in-cells on scales 0.05°-0.7°. Using a power-law parameterization, w_2(θ) = A(θ/deg)^(1-γ), we find [A, γ] = [(5.43±0.20)×10^-4, 2.01±0.02] for the full sample (1σ errors throughout). We have inverted Limber's equation and estimated a spatial correlation length of r_0 = 3.32±0.19 h^-1 Mpc for the full sample, assuming stable clustering and a redshift model consistent with observed 24 μm counts. We also find that blue [f_ν(24)/f_ν(3.6) ≤ 5.5] and red [f_ν(24)/f_ν(3.6) ≥ 6.5] galaxies have the lowest and highest r_0 values respectively, implying that redder galaxies are more clustered (by a factor of ≈3 on scales ≳0.2°). Overall, the clustering estimates are smaller than those derived from optical surveys, but in agreement with results from IRAS and ISO in the mid-infrared. This extends to higher redshifts the notion that infrared-selected surveys show weaker clustering than optical surveys.
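    The quoted power-law fit can be evaluated directly. The snippet below is a minimal sketch assuming only the fitted full-sample values stated in the abstract (A ≈ 5.43×10^-4, γ ≈ 2.01); the function name is ours, not the authors'.

    ```python
    import numpy as np

    # Counts-in-cells power-law fit for the full 24-micron sample:
    # w2(theta) = A * (theta / deg)**(1 - gamma)
    A, gamma = 5.43e-4, 2.01

    def w2(theta_deg):
        """Angular-averaged two-point correlation at separation theta (degrees)."""
        return A * np.asarray(theta_deg, dtype=float) ** (1.0 - gamma)

    # Evaluate over the range of scales probed, 0.05-0.7 deg
    scales = np.array([0.05, 0.1, 0.35, 0.7])
    amplitudes = w2(scales)
    ```

    With γ ≈ 2, w_2 falls off roughly as 1/θ, so clustering power is an order of magnitude stronger at 0.05° than at 0.5°.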

  18. A Novel Electronic Data Collection System for Large-Scale Surveys of Neglected Tropical Diseases

    PubMed Central

    King, Jonathan D.; Buolamwini, Joy; Cromwell, Elizabeth A.; Panfel, Andrew; Teferi, Tesfaye; Zerihun, Mulat; Melak, Berhanu; Watson, Jessica; Tadesse, Zerihun; Vienneau, Danielle; Ngondi, Jeremiah; Utzinger, Jürg; Odermatt, Peter; Emerson, Paul M.

    2013-01-01

    -based technology was suitable for a large-scale health survey, saved time, provided more accurate geo-coordinates, and was preferred by recorders over standard paper-based questionnaires. PMID:24066147

  19. Large-scale structure in the Southern Sky Redshift Survey

    NASA Technical Reports Server (NTRS)

    Park, Changbom; Gott, J. R., III; Da Costa, L. N.

    1992-01-01

    The power spectra from the Southern Sky Redshift Survey and the CfA samples are measured in order to explore the amplitude of fluctuations in the galaxy density. At wavelengths λ ≤ 30 h^-1 Mpc the observed power spectrum is quite consistent with the standard CDM model. At larger scales the data indicate an excess of power over the standard CDM model. The observed power spectrum from these optical galaxy samples is in good agreement with that drawn from the sparsely sampled IRAS galaxies. The shape of the power spectrum is also studied by examining the relation between the genus per unit volume and the smoothing length. It is found that, over Gaussian smoothing scales from 6 to 14 h^-1 Mpc, the power spectrum has a slope of about -1. The topology of the galaxy density field is studied by measuring the shift of the genus curve from the Gaussian case. Over all smoothing scales studied, the observed genus curves are consistent with a random-phase distribution of the galaxy density field, as predicted by inflationary scenarios.
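    For reference, the Gaussian benchmark against which the genus curve is compared takes the standard random-phase form (a textbook result, not quoted from the abstract):

    ```latex
    g(\nu) \;=\; \frac{1}{(2\pi)^2}\left(\frac{\langle k^2\rangle}{3}\right)^{3/2}\left(1-\nu^2\right)\, e^{-\nu^2/2}
    ```

    where g is the genus per unit volume, ν is the density threshold in units of the rms fluctuation, and ⟨k²⟩ is set by the smoothed power spectrum. A horizontal shift of the measured curve relative to this form is the non-Gaussianity diagnostic the abstract describes.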

  20. An Open-Source Galaxy Redshift Survey Simulator for next-generation Large Scale Structure Surveys

    NASA Astrophysics Data System (ADS)

    Seljak, Uros

    Galaxy redshift surveys produce three-dimensional maps of the galaxy distribution. On large scales these maps trace the underlying matter fluctuations in a relatively simple manner, so that the properties of the primordial fluctuations along with the overall expansion history and growth of perturbations can be extracted. The BAO standard ruler method to measure the expansion history of the universe using galaxy redshift surveys is thought to be robust to observational artifacts and understood theoretically with high precision. These same surveys can offer a host of additional information, including a measurement of the growth rate of large scale structure through redshift space distortions, the possibility of measuring the sum of neutrino masses, tighter constraints on the expansion history through the Alcock-Paczynski effect, and constraints on the scale-dependence and non-Gaussianity of the primordial fluctuations. Extracting this broadband clustering information hinges on both our ability to minimize and subtract observational systematics to the observed galaxy power spectrum, and our ability to model the broadband behavior of the observed galaxy power spectrum with exquisite precision. Rapid development on both fronts is required to capitalize on WFIRST's data set. We propose to develop an open-source computational toolbox that will propel development in both areas by connecting large scale structure modeling and instrument and survey modeling with the statistical inference process. We will use the proposed simulator to both tailor perturbation theory and fully non-linear models of the broadband clustering of WFIRST galaxies and discover novel observables in the non-linear regime that are robust to observational systematics and able to distinguish between a wide range of spatial and dynamic biasing models for the WFIRST galaxy redshift survey sources. 
We have demonstrated the utility of this approach in a pilot study of the SDSS-III BOSS galaxies, in which we

  1. PERSPECTIVES ON LARGE-SCALE NATURAL RESOURCES SURVEYS WHEN CAUSE-EFFECT IS A POTENTIAL ISSUE

    EPA Science Inventory

    Our objective is to present a perspective on large-scale natural resource monitoring when cause-effect is a potential issue. We believe that the approach of designing a survey to meet traditional commodity production and resource state descriptive objectives is too restrictive an...

  2. The Use of Online Social Networks by Polish Former Erasmus Students: A Large-Scale Survey

    ERIC Educational Resources Information Center

    Bryla, Pawel

    2014-01-01

    There is an increasing role of online social networks in the life of young Poles. We conducted a large-scale survey among Polish former Erasmus students. We have received 2450 completed questionnaires from alumni of 115 higher education institutions all over Poland. 85.4% of our respondents reported they kept in touch with their former Erasmus…

  3. An Alternative Way to Model Population Ability Distributions in Large-Scale Educational Surveys

    ERIC Educational Resources Information Center

    Wetzel, Eunike; Xu, Xueli; von Davier, Matthias

    2015-01-01

    In large-scale educational surveys, a latent regression model is used to compensate for the shortage of cognitive information. Conventionally, the covariates in the latent regression model are principal components extracted from background data. This operational method has several important disadvantages, such as the handling of missing data and…

  4. Horvitz-Thompson survey sample methods for estimating large-scale animal abundance

    USGS Publications Warehouse

    Samuel, M.D.; Garton, E.O.

    1994-01-01

    Large-scale surveys to estimate animal abundance can be useful for monitoring population status and trends, for measuring responses to management or environmental alterations, and for testing ecological hypotheses about abundance. However, large-scale surveys may be expensive and logistically complex. To ensure resources are not wasted on unattainable targets, the goals and uses of each survey should be specified carefully, and alternative methods for addressing these objectives should always be considered. During survey design, the importance of each survey error component (spatial design, proportion of detected animals, precision in detection) should be considered carefully to produce a complete, statistically based survey. Failure to address these three survey components may produce population estimates that are inaccurate (biased low), have unrealistic precision (too precise) and do not satisfactorily meet the survey objectives. Optimum survey design requires trade-offs among these sources of error relative to the costs of sampling plots and detecting animals on plots, considerations that are specific to the spatial logistics and survey methods. The Horvitz-Thompson estimators provide a comprehensive framework for considering all three survey components during the design and analysis of large-scale wildlife surveys. Problems of spatial and temporal (especially survey-to-survey) heterogeneity in detection probabilities have received little consideration, but failure to account for heterogeneity produces biased population estimates. The goal of producing unbiased population estimates is in conflict with the increased variation that heterogeneous detection introduces into the population estimate. One solution to this conflict is to use an MSE-based approach to achieve a balance between bias reduction and increased variation. Further research is needed to develop methods that address spatial heterogeneity in detection, evaluate the effects of temporal heterogeneity on survey
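    The Horvitz-Thompson framework weights each sampled unit's count by the inverse of its inclusion probability. The sketch below is illustrative, with invented numbers; treating the inclusion probability as the product of plot-selection and detection probabilities is one simple way to combine the survey components the abstract lists, not the paper's exact formulation.

    ```python
    def horvitz_thompson_total(counts, inclusion_probs):
        """Horvitz-Thompson estimate of a population total: sum of each
        sampled plot's count divided by its probability of appearing in
        the sample. Unbiased when the probabilities are correct."""
        if len(counts) != len(inclusion_probs):
            raise ValueError("counts and inclusion_probs must have equal length")
        return sum(y / p for y, p in zip(counts, inclusion_probs))

    # Hypothetical wildlife survey: 3 plots, each selected with probability
    # 0.1, and animals on a surveyed plot detected with probability 0.8.
    counts = [12, 7, 9]
    pi = [0.1 * 0.8] * 3          # joint selection x detection probability
    estimate = horvitz_thompson_total(counts, pi)   # (12 + 7 + 9) / 0.08 = 350
    ```

    The abstract's point about heterogeneity follows directly: if detection actually varies across plots or surveys but a single average probability is used, the weights are wrong and the total is biased.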

  5. The Observations of Redshift Evolution in Large Scale Environments (ORELSE) Survey

    NASA Astrophysics Data System (ADS)

    Squires, Gordon K.; Lubin, L. M.; Gal, R. R.

    2007-05-01

    We present the motivation, design, and latest results from the Observations of Redshift Evolution in Large Scale Environments (ORELSE) Survey, a systematic search for structure on scales greater than 10 Mpc around 20 known galaxy clusters at z > 0.6. When complete, the survey will cover nearly 5 square degrees, all targeted at high-density regions, making it complementary and comparable to field surveys such as DEEP2, GOODS, and COSMOS. For the survey, we are using the Large Format Camera on the Palomar 5-m and SuPRIME-Cam on the Subaru 8-m to obtain optical/near-infrared imaging of an approximately 30 arcmin region around previously studied high-redshift clusters. Colors are used to identify likely member galaxies, which are targeted for follow-up spectroscopy with the DEep Imaging Multi-Object Spectrograph on the Keck 10-m. This technique has been used to successfully identify the Cl 1604 supercluster at z = 0.9, a large scale structure containing at least eight clusters (Gal & Lubin 2004; Gal, Lubin & Squires 2005). We present the most recent structures to be photometrically and spectroscopically confirmed through this program, discuss the properties of the member galaxies as a function of environment, and describe our planned multi-wavelength (radio, mid-IR, and X-ray) observations of these systems. The goal of this survey is to identify and examine a statistical sample of large scale structures during an active period in the assembly history of the most massive clusters. With such a sample, we can begin to constrain large scale cluster dynamics and determine the effect of the larger environment on galaxy evolution.

  6. Addressing statistical and operational challenges in designing large-scale stream condition surveys.

    PubMed

    Dobbie, Melissa J; Negus, Peter

    2013-09-01

    Implementing a statistically valid and practical monitoring design for large-scale stream condition monitoring and assessment programs can be difficult due to factors including the likely existence of a diversity of ecosystem types such as ephemeral streams over the sampling domain; limited resources to undertake detailed monitoring surveys and address knowledge gaps; and operational constraints on effective sampling at monitoring sites. In statistical speak, these issues translate to defining appropriate target populations and sampling units; designing appropriate spatial and temporal sample site selection methods; selection and use of appropriate indicators; and setting effect sizes with limited ecological and statistical information about the indicators of interest. We identify the statistical and operational challenges in designing large-scale stream condition surveys and discuss general approaches for addressing them. The ultimate aim in drawing attention to these challenges is to ensure operational practicality in carrying out future monitoring programs and that the resulting inferences about stream condition are statistically valid and relevant. PMID:23344628

  7. Large-scale survey of adverse reactions to canine non-rabies combined vaccines in Japan.

    PubMed

    Miyaji, Kazuki; Suzuki, Aki; Shimakura, Hidekatsu; Takase, Yukari; Kiuchi, Akio; Fujimura, Masato; Kurita, Goro; Tsujimoto, Hajime; Sakaguchi, Masahiro

    2012-01-15

    Canine non-rabies combined vaccines are widely used to protect animals from infectious agents, and also play an important role in public health. We performed a large-scale survey to investigate vaccine-associated adverse events (VAAEs), including anaphylaxis, in Japan by distributing questionnaires on VAAEs to veterinary hospitals from April 1, 2006 through May 31, 2007. Valid responses were obtained for 57,300 vaccinated dogs at 573 animal hospitals; we obtained VAAE information for the last 100 vaccinated dogs at each veterinary hospital. Of the 57,300 dogs, 359 showed VAAEs. Of these 359 dogs, death was observed in 1, anaphylaxis in 41, dermatological signs in 244, gastrointestinal signs in 160, and other signs in 106. Onset of VAAEs was mostly observed within 12 h after vaccination (n = 299, 83.3%). In this study, anaphylaxis events occurred within 60 min after vaccination, and about half of these events occurred within 5 min (n = 19, 46.3%). Furthermore, where anaphylaxis was reported, additional information to support the diagnosis was obtained by reinvestigation. Our resurvey of dogs with anaphylaxis yielded responses on 31 dogs; 27 of these demonstrated collapse (87.1%), 24 demonstrated cyanosis (77.4%), and both signs occurred in 22 (71.0%). Higher rates of animal VAAEs, anaphylaxis, and death were found in Japan than in other countries. Further investigations, including survey studies, will be necessary to elucidate the relationship between death and vaccination and the risk factors for VAAEs, and thus develop safer vaccines. Moreover, it may also be necessary to continually update the VAAE data. PMID:22264736

  8. A sparse-sampling strategy for the estimation of large-scale clustering from redshift surveys

    NASA Astrophysics Data System (ADS)

    Kaiser, N.

    1986-04-01

    It is shown that, in the estimation of large-scale clustering, a sparsely sampled faint-magnitude-limited redshift survey can significantly reduce the uncertainty in the two-point function for a given investment of telescope time. The signal-to-noise ratio for a 1-in-20 sampled bright-galaxy survey is roughly twice that provided by a complete survey of the same cost, and matches the performance of a larger complete survey of about seven times the cost. A similar gain is achieved by collecting multiple redshifts with a wide-field telescope in a survey with close to full-sky coverage; for smaller multiply-sampled surveys, ideally sampled at a 1-in-10 bright-galaxy rate, the improvement is modest. The optimum sampling fraction for Abell's rich clusters is found to be close to unity, with little improvement from sparse sampling.
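The scaling behind these numbers can be sketched with a toy noise model (an illustrative assumption, not the paper's full calculation): the per-mode uncertainty on the power spectrum scales as (P + 1/n)², and the number of Fourier modes scales with surveyed volume, so for a fixed number of measured redshifts a 1-in-f sample trades density for volume. The value n₀P = 20 below is a hypothetical density-times-power product for the fully sampled survey.

```python
import math

# (S/N)^2 per fixed number of redshifts, up to a common constant:
# a 1-in-f sample covers f times the volume at 1/f the density, and
# per-mode noise on P(k) scales as (P + 1/n)^2, giving
#   (S/N)^2  proportional to  f / (1 + f/(n0*P))^2
def snr2(f, n0P=20.0):
    return f / (1.0 + f / n0P) ** 2

# Signal-to-noise gain of a 1-in-20 sparse survey over a complete
# survey of the same cost (same total number of redshifts).
gain = math.sqrt(snr2(20.0) / snr2(1.0))
print(round(gain, 2))  # roughly a factor of two, as in the abstract
```

Under this toy model the gain is largest when the complete survey is heavily oversampled (n₀P ≫ 1), which is exactly the regime the abstract describes for bright galaxies.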

  9. Studying populations of eclipsing binaries using large scale multi-epoch photometric surveys

    NASA Astrophysics Data System (ADS)

    Mowlavi, Nami; Barblan, Fabio; Holl, Berry; Rimoldini, Lorenzo; Lecoeur-Taïbi, Isabelle; Süveges, Maria; Eyer, Laurent; Guy, Leanne; Nienartowicz, Krzysztof; Ordonez, Diego; Charnas, Jonathan; Jévardat de Fombelle, Grégory

    2015-08-01

    Large-scale multi-epoch photometric surveys provide unique opportunities to study populations of binary stars through the study of eclipsing binaries, provided the basic properties of binary systems can be derived from their light curves without the need to fully model each binary system. Those systems can then be classified into various types: for example, from close to wide systems, from circular to highly elliptical systems, or from systems with similar components to highly asymmetric systems. The challenge is to extract physically relevant information from the light curve geometry. In this contribution, we present a study of eclipsing binaries in the Large Magellanic Cloud (LMC) from the OGLE-III survey. The study is based on the analysis of the geometry of their light curves, parameterized using a two-Gaussian model. We show which physical parameters can be extracted from such an analysis, and present the results for the LMC eclipsing binaries. The method is very well adapted to processing large-scale surveys containing millions of eclipsing binaries, such as are expected from the current Gaia mission or the future LSST survey.

  10. Measures of large-scale structure in the CfA redshift survey slices

    NASA Technical Reports Server (NTRS)

    De Lapparent, Valerie; Geller, Margaret J.; Huchra, John P.

    1991-01-01

    Variations of the counts-in-cells with cell size are used here to define two statistical measures of large-scale clustering in three 6 deg slices of the CfA redshift survey. A percolation criterion is used to estimate the filling factor f, which measures the fraction of the total survey volume occupied by the large-scale structures. For the full 18 deg slice of the CfA redshift survey, f ≈ 0.25 ± 0.05. After removing groups with more than five members from two of the slices, variations of the counts in occupied cells with cell size have a power-law behavior with a slope β ≈ 2.2 on scales of 1-10 h⁻¹ Mpc. Application of both this statistic and the percolation analysis to simulations suggests that a network of two-dimensional structures describes the geometry of the clustering in the CfA slices better than a network of one-dimensional structures. Counts-in-cells are also used to estimate the average galaxy surface density in sheets like the Great Wall at about 0.3 galaxies h² Mpc⁻².

  11. Measures of large-scale structure in the CfA redshift survey slices

    SciTech Connect

    De Lapparent, V.; Geller, M.J.; Huchra, J.P. (Harvard-Smithsonian Center for Astrophysics, Cambridge, MA)

    1991-03-01

    Variations of the counts-in-cells with cell size are used here to define two statistical measures of large-scale clustering in three 6 deg slices of the CfA redshift survey. A percolation criterion is used to estimate the filling factor f, which measures the fraction of the total survey volume occupied by the large-scale structures. For the full 18 deg slice of the CfA redshift survey, f ≈ 0.25 ± 0.05. After removing groups with more than five members from two of the slices, variations of the counts in occupied cells with cell size have a power-law behavior with a slope β ≈ 2.2 on scales of 1-10 h⁻¹ Mpc. Application of both this statistic and the percolation analysis to simulations suggests that a network of two-dimensional structures describes the geometry of the clustering in the CfA slices better than a network of one-dimensional structures. Counts-in-cells are also used to estimate the average galaxy surface density in sheets like the Great Wall at about 0.3 galaxies h² Mpc⁻². 46 refs.
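The counts-in-cells statistic used in these two records can be illustrated on synthetic data: the log-log slope of the mean count in occupied cells against cell size plays the role of the power-law exponent. This is a minimal 2-D sketch with made-up clustered points, not the CfA analysis; all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-D "survey": clustered points (clumps) plus a uniform background
# in a 100 x 100 box.
centers = rng.uniform(0, 100, size=(30, 2))
gals = np.vstack([c + rng.normal(0, 1.5, size=(40, 2)) for c in centers])
gals = np.vstack([gals, rng.uniform(0, 100, size=(200, 2))])

def mean_occupied_count(points, cell):
    """Mean galaxy count per OCCUPIED cell for a grid of side `cell`."""
    idx = np.floor(points / cell).astype(int)
    _, counts = np.unique(idx, axis=0, return_counts=True)
    return counts.mean()

cells = np.array([2.0, 4.0, 8.0, 16.0])
m = np.array([mean_occupied_count(gals, c) for c in cells])

# Log-log slope: for uniform points this approaches the dimension of
# the space; clustering pulls it away from that value.
slope = np.polyfit(np.log(cells), np.log(m), 1)[0]
print(slope)
```

The fitted slope is the 2-D analogue of the exponent β quoted in the abstract; its value relative to the embedding dimension is what discriminates sheet-like from filament-like clustering.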

  12. Large Scale eHealth Deployment in Europe: Insights from Concurrent Use of Standards.

    PubMed

    Eichelberg, Marco; Chronaki, Catherine

    2016-01-01

    Large-scale eHealth deployment projects face a major challenge when called to select the right set of standards and tools to achieve sustainable interoperability in an ecosystem that includes both legacy systems and new systems reflecting technological trends and progress. No single standard covers all the needs of an eHealth project, and a multitude of overlapping and sometimes competing standards can be employed to define document formats, terminology, and communication protocols, mirroring alternative technical approaches and schools of thought. eHealth projects must therefore answer an important question: how can alternative or inconsistently implemented standards and specifications be used to ensure practical interoperability and long-term sustainability in large-scale eHealth deployment? In the eStandards project, 19 European case studies reporting from R&D and large-scale eHealth deployment and policy projects were analyzed. Although the study is not exhaustive, by reflecting on the concepts, standards, and tools for concurrent use and on the successes, failures, and lessons learned, this paper offers practical insights into how eHealth deployment projects can make the most of the available eHealth standards and tools, and how standards and profile developing organizations can serve their users while embracing sustainability and technical innovation. PMID:27577416

  13. A large-scale survey of thermal comfort in office premises in Hong Kong

    SciTech Connect

    Chan, D.W.T.; Burnett, J.; Ng, S.C.H.; de Dear, R.J.

    1998-10-01

    Hong Kong is a densely populated city in which the service sector dominates. The significant outdoor noise pollution and subtropical climate severely restrict the opportunity for office premises to be naturally ventilated. The high energy consumption for space cooling and the demand for improved indoor thermal comfort conditions stimulated a large-scale survey of thermal comfort conditions in Hong Kong office premises. The neutral temperatures and preferred temperatures are found to be lower than those found in other studies in the tropics, with 60% of the surveyed subjects preferring a change of the thermal conditions in summer. The outcome provides for a better notion of thermal comfort, which can be imposed on design criteria. The results also add weight to the concern about the validity in the field of the traditional chamber test data presented by ASHRAE Standard 55-1992. It further suggests the potential for adopting an adaptive control algorithm for thermal comfort.

  14. Public health concerns for neighbors of large-scale swine production operations.

    PubMed

    Thu, K M

    2002-05-01

    This article provides a review and critical synthesis of research related to public health concerns for neighbors exposed to emissions from large-scale swine production operations. The rapid industrialization of pork production in the 1990s produced a generation of confined animal feeding operations (CAFOs) of a size previously unseen in the U.S. Recent research and results from federally sponsored scientific symposia consistently indicate that neighbors of large-scale swine CAFOs can experience health problems at significantly higher rates than controlled comparison populations. Symptoms experienced by swine CAFO neighbors are generally oriented toward irritation of the respiratory tract and are consistent with the types of symptoms among interior confinement workers that have been well documented in the occupational health literature. However, additional exposure assessment research is required to elucidate the relationship of reported symptoms among swine CAFO neighbors and CAFO emissions. PMID:12046804

  15. Measuring large-scale structure with quasars in narrow-band filter surveys

    NASA Astrophysics Data System (ADS)

    Abramo, L. Raul; Strauss, Michael A.; Lima, Marcos; Hernández-Monteagudo, Carlos; Lazkoz, Ruth; Moles, Mariano; de Oliveira, Claudia Mendes; Sendra, Irene; Sodré, Laerte; Storchi-Bergmann, Thaisa

    2012-07-01

    We show that a large-area imaging survey using narrow-band filters could detect quasars in sufficiently high number densities, and with more than sufficient accuracy in their photometric redshifts, to turn them into suitable tracers of large-scale structure. If a narrow-band optical survey can detect objects as faint as i = 23, it could reach comoving volumetric number densities as high as 10⁻⁴ h³ Mpc⁻³ at z ~ 1.5. Such a catalogue would lead to precision measurements of the power spectrum up to z ~ 3-4. We also show that it is possible to employ quasars to measure baryon acoustic oscillations at high redshifts, where the uncertainties from redshift distortions and non-linearities are much smaller than at z ≲ 1. As a concrete example we study the future impact of the Javalambre Physics of the Accelerating Universe Astrophysical Survey (J-PAS), which is a narrow-band imaging survey in the optical over 1/5 of the unobscured sky with 42 filters of ~100 Å full width at half-maximum. We show that J-PAS will be able to take advantage of the broad emission lines of quasars to deliver excellent photometric redshifts, σ_z ≃ 0.002(1 + z), for millions of objects.
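A quick figure of merit behind the quoted number density: a tracer population samples the density field well when the product of its number density and clustering power satisfies nP(k) ≳ 1, so that shot noise is subdominant. The power amplitude below is an illustrative assumption for biased quasars near k ~ 0.1 h/Mpc, not a value from the paper.

```python
# Shot noise vs. clustering power for the quoted quasar sample.
n = 1e-4   # comoving number density from the abstract, h^3 Mpc^-3
P = 2e4    # assumed quasar power at k ~ 0.1 h/Mpc, in (h^-1 Mpc)^3

nP = n * P
print(nP)  # nP >= 1 means shot noise does not dominate at this scale
```

With these illustrative numbers nP is of order unity or larger, which is why the abstract can treat such a quasar catalogue as a viable large-scale-structure tracer.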

  16. Google Street View as an alternative method to car surveys in large-scale vegetation assessments.

    PubMed

    Deus, Ernesto; Silva, Joaquim S; Catry, Filipe X; Rocha, Miguel; Moreira, Francisco

    2015-10-01

    Car surveys (CS) are a common method for assessing the distribution of alien invasive plants. Google Street View (GSV), a free-access web technology where users may experience a virtual travel along roads, has been suggested as a cost-effective alternative to car surveys. We tested whether we could replicate the results of a countrywide survey conducted by car in Portugal using GSV as a remote sensing tool, with the aim of assessing the distribution of Eucalyptus globulus Labill. wildlings on roadsides adjacent to eucalypt stands. Georeferenced points gathered along the CS were used to create road transects, visible as lines overlapping the road in the GSV environment, allowing the same sampling areas to be surveyed using both methods. This paper presents the results of the comparison between the two methods. Both methods produced similar models of plant abundance, selecting the same explanatory variables, in the same hierarchical order of importance, and depicting a similar influence on plant abundance. Even though the GSV model had a lower performance and the GSV survey detected fewer plants, additional variables collected exclusively with GSV improved model performance and provided new insight into additional factors influencing plant abundance. The survey using GSV required ca. 9% of the funds and 62% of the time needed to accomplish the CS. We conclude that GSV may be a cost-effective alternative to CS. We discuss some advantages and limitations of GSV as a survey method. We forecast that GSV may become a widespread tool in road ecology, particularly in large-scale vegetation assessments. PMID:27624742

  17. Large-Scale Surveys of Snow Depth on Arctic Sea Ice from Operation IceBridge

    NASA Technical Reports Server (NTRS)

    Kurtz, Nathan T.; Farrell, Sinead L.

    2011-01-01

    We show the first results of a large-scale survey of snow depth on Arctic sea ice from NASA's Operation IceBridge snow radar system for the 2009 season and compare the data to climatological snow depth values established over the 1954-1991 time period. For multiyear ice, the mean radar-derived snow depth is 33.1 cm and the corresponding mean climatological snow depth is 33.4 cm. The small mean difference suggests consistency between contemporary estimates of snow depth and the historical climatology for the multiyear ice region of the Arctic. A 16.5 cm mean difference (climatology minus radar) is observed for first-year ice areas, suggesting that the increasingly seasonal sea ice cover of the Arctic Ocean has led to an overall loss of snow as the region has transitioned away from a dominantly multiyear ice cover.

  18. Ten key considerations for the successful implementation and adoption of large-scale health information technology

    PubMed Central

    Cresswell, Kathrin M; Bates, David W; Sheikh, Aziz

    2013-01-01

    The implementation of health information technology interventions is at the forefront of most policy agendas internationally. However, such undertakings are often far from straightforward as they require complex strategic planning accompanying the systemic organizational changes associated with such programs. Building on our experiences of designing and evaluating the implementation of large-scale health information technology interventions in the USA and the UK, we highlight key lessons learned in the hope of informing the on-going international efforts of policymakers, health directorates, healthcare management, and senior clinicians. PMID:23599226

  19. Large-scale internal structure in volcanogenic breakout flood deposits: Extensive GPR survey on volcaniclastic deposits

    NASA Astrophysics Data System (ADS)

    Kataoka, K.; Gomez, C. A.

    2012-12-01

    Large-scale outburst floods from volcanic lakes, such as caldera lakes or volcanically dammed river valleys, tend to be voluminous, with total discharges of >1-10 km³ and peak discharges of >10⁴-10⁵ m³ s⁻¹. Such a large flood can travel a long distance, leaving sediments and bedforms/landforms with large-scale internal structures over an extensive area that is difficult to assess from single local sites. Moreover, the sediments and bedforms/landforms are sometimes untraceable, and outcrop information obtained by classical geological and geomorphological field surveys is limited to the dissected/terraced parts of a fan body, road cuts, and/or large quarries. Therefore, GPR (Ground Penetrating Radar), which exploits the propagation of electromagnetic waves through media, seems best adapted for the appraisal of large-scale subsurface structures. Recently, studies applying GPR to volcanic deposits have successfully captured images of lava flows and volcaniclastic deposits, proving the usefulness of this method even in volcanic areas, which often encompass complicated stratigraphy and structures with variable material, grain size, and ferromagnetic content. Using GPR, the present study aims to understand the large-scale internal structures of volcanogenic flood deposits. The survey was carried out over two volcanogenic flood fan (or apron) deposits in northeast Japan, at Numazawa and Towada volcanoes. The 5 ka Numazawa flood deposits in the Tadami river catchment were emplaced by a breakout flood from an ignimbrite-dammed valley, leaving pumiceous gravelly sediments with meter-sized boulders in the flow path. At Towada volcano, a comparable flood event originating from a breach in the caldera rim emplaced the 13-15 ka Sanbongi fan deposits in the Oirase river valley, which are characterized by bouldery fan deposits. The GPR data was collected following 200 to 500 m long lateral and longitudinal transects, which were captured using a GPR Pulse

  20. Searching transients in large-scale surveys. A method based on the Abbe value

    NASA Astrophysics Data System (ADS)

    Mowlavi, N.

    2014-08-01

    Aims: A new method is presented to identify transient candidates in large-scale surveys based on the variability pattern in their light curves. Methods: The method is based on the Abbe value, Ab, which estimates the smoothness of a light curve, and on a newly introduced quantity called the excess Abbe, denoted excessAb, which estimates the regularity of the light curve variability pattern over the duration of the observations. Results: Based on simulated light curves, transients are shown to occupy a specific region in the Ab-excessAb diagram, distinct from sources presenting pulsating-like features in their light curves or having featureless light curves. The method is tested on real light curves taken from the EROS-2 and OGLE-II surveys in a 0.50° × 0.17° field of the sky in the Large Magellanic Cloud centered at RA(J2000) = 5h25m56.5s and Dec(J2000) = -69d29m43.3s. The method identifies 43 EROS-2 transient candidates out of a total of 1300 variable stars, and 19 more OGLE-II candidates, 10 of which do not have any EROS-2 variable star matches and would need further confirmation to assess their reliability. The efficiency of the method is further tested by comparing the list of transient candidates with known Be stars in the literature. It is shown that all Be stars known in the studied field of view with detectable bursts or outbursts are successfully extracted by the method. In addition, four new transient candidates displaying bursts and/or outbursts are found in the field, of which at least two are good new Be candidates. Conclusions: The new method proves to be a potentially powerful tool to extract transient candidates from large-scale multi-epoch surveys. The better the photometric measurement uncertainties are, the cleaner the list of detected transient candidates is. In addition, the Ab-excessAb diagram is shown to be a good diagnostic tool to check the data quality of multi-epoch photometric surveys. A trend of instrumental and/or data reduction origin
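The Abbe value itself is straightforward to compute: it is the ratio of the mean squared successive difference to the variance, near 1 for white noise and near 0 for a smooth trend. The sketch below implements it, together with a simplified stand-in for the excess Abbe over fixed subintervals (the paper's exact definition may differ); all light curves are simulated.

```python
import numpy as np

def abbe(x):
    """Abbe value: ~1 for uncorrelated noise, -> 0 for a smooth trend."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    num = np.sum(np.diff(x) ** 2)
    den = np.sum((x - x.mean()) ** 2)
    return n / (2.0 * (n - 1)) * num / den

def excess_abbe(x, sub_len):
    """Simplified excess Abbe (an assumption, not the paper's formula):
    mean Abbe value over sliding subintervals minus the Abbe value of
    the full series.  A localized smooth burst pushes this up."""
    subs = [abbe(x[i:i + sub_len])
            for i in range(0, len(x) - sub_len + 1, sub_len // 2)]
    return np.mean(subs) - abbe(x)

rng = np.random.default_rng(1)
t = np.arange(200)
noise = rng.normal(0.0, 1.0, 200)                               # featureless curve
burst = noise + 8 * np.exp(-0.5 * ((t - 100) / 10.0) ** 2)      # transient-like burst

print(abbe(noise))   # close to 1
print(abbe(burst))   # well below 1: the burst makes the curve smooth
print(excess_abbe(burst, 50))
```

Plotting Ab against excessAb for many curves reproduces the kind of diagram the abstract uses to separate transients from pulsators and featureless sources.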

  1. Effects of unstable dark matter on large-scale structure and constraints from future surveys

    NASA Astrophysics Data System (ADS)

    Wang, Mei-Yu; Zentner, Andrew R.

    2012-02-01

    In this paper we explore the effect of decaying dark matter (DDM) on large-scale structure and possible constraints from galaxy imaging surveys. DDM models have been studied, in part, as a way to address apparent discrepancies between the predictions of standard cold dark matter models and observations of galactic structure. Our study is aimed at developing independent constraints on these models. In such models, DDM decays into a less massive, stable dark matter (SDM) particle and a significantly lighter particle. The small mass splitting between the parent DDM and the daughter SDM provides the SDM with a recoil or “kick” velocity v_k, inducing a free-streaming suppression of matter fluctuations. This suppression can be probed via weak lensing power spectra measured by a number of forthcoming imaging surveys that aim primarily to constrain dark energy. Using scales on which linear perturbation theory alone is valid (multipoles ℓ < 300), surveys like Euclid or the Large Synoptic Survey Telescope can be sensitive to v_k ≳ 90 km/s for lifetimes τ ~ 1-5 Gyr. To estimate more aggressive constraints, we model nonlinear corrections to lensing power using a simple halo evolution model that is in good agreement with numerical simulations. In our most ambitious forecasts, using multipoles ℓ < 3000, we find that imaging surveys can be sensitive to v_k ~ 10 km/s for lifetimes τ ≲ 10 Gyr. Lensing will provide a particularly interesting complement to existing constraints in that it will probe the long-lifetime regime (τ ≫ H₀⁻¹) far better than contemporary techniques. A caveat to these ambitious forecasts is that the evolution of perturbations on nonlinear scales will need to be well calibrated by numerical simulations before they can be realized. This work motivates the pursuit of such a numerical simulation campaign to constrain dark matter with cosmological weak lensing.

  2. Large-scale fluctuations in the number density of galaxies in independent surveys of deep fields

    NASA Astrophysics Data System (ADS)

    Shirokov, S. I.; Lovyagin, N. Yu.; Baryshev, Yu. V.; Gorokhov, V. L.

    2016-06-01

    New arguments supporting the reality of large-scale fluctuations in the density of the visible matter in deep galaxy surveys are presented. A statistical analysis of the radial distributions of galaxies in the COSMOS and HDF-N deep fields is presented. Independent spectral and photometric surveys exist for each field, carried out in different wavelength ranges and using different observing methods. Catalogs of photometric redshifts in the optical (COSMOS-Zphot) and infrared (UltraVISTA) were used for the COSMOS field in the redshift interval 0.1 < z < 3.5, as well as the zCOSMOS (10kZ) spectroscopic survey and the XMM-COSMOS and ALHAMBRA-F4 photometric redshift surveys. The HDFN-Zphot and ALHAMBRA-F5 catalogs of photometric redshifts were used for the HDF-N field. The Pearson correlation coefficient for the fluctuations in the numbers of galaxies obtained for independent surveys of the same deep field reaches R = 0.70 ± 0.16. The presence of this positive correlation supports the reality of fluctuations in the density of visible matter with sizes of up to 1000 Mpc and amplitudes of up to 20% at redshifts z ~ 2. The absence of correlations between the fluctuations in different fields (the correlation coefficient between COSMOS and HDF-N is R = -0.20 ± 0.31) testifies to the independence of structures visible in different directions on the celestial sphere. This also indicates an absence of any influence from universal systematic errors (such as "spectral voids"), which could imitate the detection of correlated structures.
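The core test in this abstract, correlating radial count fluctuations from independent surveys of the same field against those from a different field, can be mimicked on toy data. All numbers below are illustrative: two surveys of the same field share an underlying fluctuation signal plus independent noise, while a different field carries its own independent signal.

```python
import numpy as np

rng = np.random.default_rng(3)
z_bins = 35  # radial (redshift) bins of the toy distributions

# Shared large-scale fluctuations for two surveys of the SAME field,
# each with independent measurement noise.
common = rng.normal(0.0, 1.0, z_bins)
survey_a = common + rng.normal(0.0, 0.5, z_bins)
survey_b = common + rng.normal(0.0, 0.5, z_bins)

# A survey of a DIFFERENT field: independent fluctuations and noise.
other_field = rng.normal(0.0, 1.0, z_bins) + rng.normal(0.0, 0.5, z_bins)

r_same = np.corrcoef(survey_a, survey_b)[0, 1]
r_diff = np.corrcoef(survey_a, other_field)[0, 1]
print(r_same)  # high: real structure shared between independent surveys
print(r_diff)  # near zero: independent directions on the sky
```

This is exactly the logic of the abstract's argument: a high cross-survey correlation within one field (R = 0.70 ± 0.16 there) supports real density fluctuations, while a near-zero correlation between fields argues against a shared systematic.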

  3. Inclusive constraints on unified dark matter models from future large-scale surveys

    SciTech Connect

    Camera, Stefano; Carbone, Carmelita; Moscardini, Lauro E-mail: carmelita.carbone@unibo.it

    2012-03-01

    In the very last years, cosmological models where the properties of the dark components of the Universe — dark matter and dark energy — are accounted for by a single ''dark fluid'' have drawn increasing attention and interest. Amongst many proposals, Unified Dark Matter (UDM) cosmologies are promising candidates as effective theories. In these models, a scalar field with a non-canonical kinetic term in its Lagrangian mimics both the accelerated expansion of the Universe at late times and the clustering properties of the large-scale structure of the cosmos. However, UDM models also present peculiar behaviours, the most interesting one being the fact that the perturbations in the dark-matter component of the scalar field do have a non-negligible speed of sound. This gives rise to an effective Jeans scale for the Newtonian potential, below which the dark fluid does not cluster any more. This implies a growth of structures fairly different from that of the concordance ΛCDM model. In this paper, we demonstrate that forthcoming large-scale surveys will be able to discriminate between viable UDM models and ΛCDM to a good degree of accuracy. To this purpose, the planned Euclid satellite will be a powerful tool, since it will provide very accurate data on galaxy clustering and the weak lensing effect of cosmic shear. Finally, we also exploit the constraining power of the ongoing CMB Planck experiment. Although our approach is the most conservative, with the inclusion of only well-understood, linear dynamics, in the end we also show what could be done if some amount of non-linear information were included.

  4. Survey and analysis of selected jointly owned large-scale electric utility storage projects

    SciTech Connect

    Not Available

    1982-05-01

    The objective of this study was to examine and document the issues surrounding the curtailment in commercialization of large-scale electric storage projects. It was sensed that if these issues could be uncovered, then efforts might be directed toward clearing away these barriers and allowing these technologies to penetrate the market to their maximum potential. Joint ownership of these projects was seen as a possible solution to overcoming the major barriers, particularly economic barriers, of commercialization. Therefore, discussions with partners involved in four pumped storage projects took place to identify the difficulties and advantages of joint-ownership agreements. The four plants surveyed included Yards Creek (Public Service Electric and Gas and Jersey Central Power and Light); Seneca (Pennsylvania Electric and Cleveland Electric Illuminating Company); Ludington (Consumers Power and Detroit Edison); and Bath County (Virginia Electric Power Company and Allegheny Power System, Inc.). Also investigated were several pumped storage projects which were never completed. These included Blue Ridge (American Electric Power); Cornwall (Consolidated Edison); Davis (Allegheny Power System, Inc.); and Kittatinny Mountain (General Public Utilities). Institutional, regulatory, technical, environmental, economic, and special issues at each project were investigated, and the conclusions relative to each issue are presented. The major barriers preventing the growth of energy storage are the high cost of these systems in times of extremely high cost of capital, diminishing load growth, and regulatory influences which will not allow the building of large-scale storage systems due to environmental objections or other reasons. However, the future for energy storage looks viable despite difficult economic times for the utility industry. Joint ownership can ease some of the economic hardships for utilities which demonstrate a need for energy storage.

  5. Photometric Redshifts for the Dark Energy Survey and VISTA and Implications for Large Scale Structure

    SciTech Connect

    Banerji, Manda; Abdalla, Filipe B.; Lahav, Ofer; Lin, Huan (Fermilab)

    2007-11-01

    We conduct a detailed analysis of the photometric redshift requirements for the proposed Dark Energy Survey (DES) using two sets of mock galaxy simulations and an artificial neural network code, ANNz. In particular, we examine how optical photometry in the DES grizY bands can be complemented with near-infrared photometry from the planned VISTA Hemisphere Survey (VHS) in the JHKs bands in order to improve the photometric redshift estimate by a factor of two at z > 1. We draw attention to the effects of galaxy formation scenarios such as reddening on the photo-z estimate and, using our neural network code, calculate A_V for these reddened galaxies. We also look at the impact of using different training sets when calculating photometric redshifts. In particular, we find that using the ongoing DEEP2 and VVDS-Deep spectroscopic surveys to calibrate photometric redshifts for DES will prove effective. However, we need to be aware of uncertainties in the photometric redshift bias that arise when using different training sets, as these will translate into errors in the dark energy equation of state parameter, w. Furthermore, we show that the neural network error estimate on the photometric redshift may be used to remove outliers from our samples before any kind of cosmological analysis, in particular for large-scale structure experiments. By removing all galaxies with a 1σ photo-z scatter greater than 0.1 from our DES+VHS sample, we can constrain the galaxy power spectrum out to a redshift of 2 and reduce the fractional error on this power spectrum by ≈15-20% compared to using the entire catalogue.
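The outlier-removal step described at the end, cutting objects whose estimated 1σ photo-z scatter exceeds 0.1, amounts to a simple boolean mask on the catalogue. The mock catalogue below is purely illustrative (field names and distributions are assumptions, not the DES schema):

```python
import numpy as np

rng = np.random.default_rng(7)
n_gal = 10000

# Hypothetical per-object photo-z error estimates: a well-constrained
# bulk plus a 10% tail of poorly constrained outliers.
sigma_photoz = np.abs(rng.normal(0.04, 0.02, n_gal))
sigma_photoz[rng.random(n_gal) < 0.1] += 0.15

# The quality cut from the abstract: drop objects with 1-sigma
# photo-z scatter greater than 0.1.
keep = sigma_photoz <= 0.1
clean_fraction = keep.mean()
print(clean_fraction)  # fraction of the catalogue surviving the cut
```

The trade-off the abstract quantifies is exactly this one: a modest loss of objects in exchange for a ≈15-20% reduction in the fractional error on the power spectrum.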

  6. Conducting Large-Scale Surveys in Secondary Schools: The Case of the Youth On Religion (YOR) Project

    ERIC Educational Resources Information Center

    Madge, Nicola; Hemming, Peter J.; Goodman, Anthony; Goodman, Sue; Kingston, Sarah; Stenson, Kevin; Webster, Colin

    2012-01-01

    There are few published articles on conducting large-scale surveys in secondary schools, and this paper seeks to fill this gap. Drawing on the experiences of the Youth On Religion project, it discusses the politics of gaining access to these schools and the considerations leading to the adoption and administration of an online survey. It is…

  7. EVALUATION OF A MEASUREMENT METHOD FOR FOREST VEGETATION IN A LARGE-SCALE ECOLOGICAL SURVEY

    EPA Science Inventory

    We evaluate a field method for determining species richness and canopy cover of vascular plants for the Forest Health Monitoring Program (FHM), an ecological survey of U.S. forests. Measurements are taken within 12 1-m2 quadrats on 1/15 ha plots in FHM. Species richness and cover...

  8. Ensuring Adequate Health and Safety Information for Decision Makers during Large-Scale Chemical Releases

    NASA Astrophysics Data System (ADS)

    Petropoulos, Z.; Clavin, C.; Zuckerman, B.

    2015-12-01

    The 2014 4-Methylcyclohexanemethanol (MCHM) spill in the Elk River of West Virginia highlighted existing gaps in emergency planning for, and response to, large-scale chemical releases in the United States. The Emergency Planning and Community Right-to-Know Act requires that facilities with hazardous substances provide Material Safety Data Sheets (MSDSs), which contain health and safety information on the hazardous substances. The MSDS produced by Eastman Chemical Company, the manufacturer of MCHM, listed "no data available" for various human toxicity subcategories, such as reproductive toxicity and carcinogenicity. As a result of incomplete toxicity data, the public and media received conflicting messages on the safety of the contaminated water from government officials, industry, and the public health community. Two days after the governor lifted the ban on water use, the health department partially retracted the ban by warning pregnant women to continue avoiding the contaminated water, which the Centers for Disease Control and Prevention deemed safe three weeks later. The response in West Virginia represents a failure in risk communication and calls into question whether government officials have sufficient information to support evidence-based decisions during future incidents. Research capabilities, like the National Science Foundation RAPID funding, can provide a solution to some of the data gaps, such as information on environmental fate in the case of the MCHM spill. In order to inform policy discussions on this issue, a methodology for assessing the outcomes of RAPID and similar National Institutes of Health grants in the context of emergency response is employed to examine the efficacy of research-based capabilities in enhancing public health decision making capacity. The results of this assessment highlight potential roles rapid scientific research can fill in ensuring adequate health and safety data is readily available for decision makers during large-scale

  9. A process for creating multimetric indices for large-scale aquatic surveys

    EPA Science Inventory

    Differences in sampling and laboratory protocols, differences in techniques used to evaluate metrics, and differing scales of calibration and application prohibit the use of many existing multimetric indices (MMIs) in large-scale bioassessments. We describe an approach to develop...

  10. ELISA: A small balloon Experiment for a Large Scale Survey in the Sub-millimeter

    NASA Astrophysics Data System (ADS)

    Bernard, J.-Ph.; Ristorcelli, I.; Stepnik, B.; Abergel, A.; Boulanger, F.; Giard, M.; Lagache, G.; Lamarre, J. M.; Meny, C.; Torre, J. P.; Armengaud, M.; Crussaire, J. P.; Leriche, B.; Longval, Y.

    2002-03-01

    HERSCHEL and the PLANCK space missions to be launched in 2007. The ELISA data will also be usable to help calibrate the observations of HERSCHEL and PLANCK and to plan the large-scale surveys to be undertaken with HERSCHEL. To meet these objectives, three flights of the ELISA experiment, including one from the Southern hemisphere, are foreseen in the period from 2004 to 2006. The ELISA project is carried out by an international collaboration including France (CESR, IAS, CEA, CNES), the Netherlands (SSD/ESTEC), Denmark (DSRI), England (QMW), the USA (JPL/Caltech) and Italy (ASI).

  11. Evaluating large-scale health programmes at a district level in resource-limited countries.

    PubMed

    Svoronos, Theodore; Mate, Kedar S

    2011-11-01

    Recent experience in evaluating large-scale global health programmes has highlighted the need to consider contextual differences between sites implementing the same intervention. Traditional randomized controlled trials are ill-suited for this purpose, as they are designed to identify whether an intervention works, not how, when and why it works. In this paper we review several evaluation designs that attempt to account for contextual factors that contribute to intervention effectiveness. Using these designs as a base, we propose a set of principles that may help to capture information on context. Finally, we propose a tool called a driver diagram, traditionally used in implementation work, that would allow evaluators to systematically monitor changing dynamics in project implementation and identify contextual variation across sites. We describe an implementation-related example from South Africa to underline the strengths of the tool. If used across multiple sites and multiple projects, the resulting driver diagrams could be pooled together to form a generalized theory for how, when and why a widely-used intervention works. Mechanisms similar to the driver diagram are urgently needed to complement existing evaluations of large-scale implementation efforts. PMID:22084529

  12. Linking Errors in Trend Estimation in Large-Scale Surveys: A Case Study. Research Report. ETS RR-10-10

    ERIC Educational Resources Information Center

    Xu, Xueli; von Davier, Matthias

    2010-01-01

    One of the major objectives of large-scale educational surveys is reporting trends in academic achievement. For this purpose, a substantial number of items are carried from one assessment cycle to the next. The linking process that places academic abilities measured in different assessments on a common scale is usually based on a concurrent…

  13. Human-Machine Cooperation in Large-Scale Multimedia Retrieval: A Survey

    ERIC Educational Resources Information Center

    Shirahama, Kimiaki; Grzegorzek, Marcin; Indurkhya, Bipin

    2015-01-01

    "Large-Scale Multimedia Retrieval" (LSMR) is the task to fast analyze a large amount of multimedia data like images or videos and accurately find the ones relevant to a certain semantic meaning. Although LSMR has been investigated for more than two decades in the fields of multimedia processing and computer vision, a more…

  14. Health risks from large-scale water pollution: trends in Central Asia.

    PubMed

    Törnqvist, Rebecka; Jarsjö, Jerker; Karimov, Bakhtiyor

    2011-02-01

    Limited data on the pollution status of spatially extensive water systems constrain health-risk assessments at basin scales. Using a recipient measurement approach in a terminal water body, we show that agricultural and industrial pollutants in groundwater-surface water systems of the Aral Sea Drainage Basin (covering the main part of Central Asia) yield cumulative health hazards above guideline values in downstream surface waters, due to high concentrations of copper, arsenic, nitrite, and to a certain extent dichlorodiphenyltrichloroethane (DDT). Considering these high-impact contaminants, we furthermore perform trend analyses of their upstream spatial-temporal distribution, investigating dominant large-scale spreading mechanisms. The ratio between parent DDT and its degradation products showed that discharges into or depositions onto surface waters are likely to be recent or ongoing. In river water, copper concentrations peak during the spring season, after thawing and snow melt. High spatial variability of arsenic concentrations in river water could reflect its local presence in the top soil of nearby agricultural fields. Overall, groundwaters were associated with much higher health risks than surface waters. Health risks can therefore increase considerably if the downstream population must switch to groundwater-based drinking water supplies during surface water shortage. Arid regions are generally vulnerable to this problem due to ongoing irrigation expansion and climate changes. PMID:21131050

  15. A survey on routing protocols for large-scale wireless sensor networks.

    PubMed

    Li, Changle; Zhang, Hanxiao; Hao, Binbin; Li, Jiandong

    2011-01-01

    With the advances in micro-electronics, wireless sensor devices have become much smaller and more integrated, and large-scale wireless sensor networks (WSNs) based on the cooperation of a significant number of nodes have become a hot topic. "Large-scale" mainly means a large area or a high density of nodes in the network. Accordingly, routing protocols must scale well as the network scope extends and the node density increases. A sensor node is normally energy-limited and cannot be recharged, so its energy consumption has a significant effect on the scalability of a protocol. To the best of our knowledge, the mainstream methods for solving the energy problem in large-scale WSNs are currently hierarchical routing protocols. In a hierarchical routing protocol, all the nodes are divided into several groups at different assignment levels. Nodes at the high level are responsible for data aggregation and management, and low-level nodes for sensing their surroundings and collecting information. Hierarchical routing protocols have proved to be more energy-efficient than flat ones, in which all the nodes play the same role, especially with respect to data aggregation and the flooding of control packets. With a focus on the hierarchical structure, in this paper we provide an insight into routing protocols designed specifically for large-scale WSNs. According to their different objectives, the protocols are classified by criteria such as control-overhead reduction, energy-consumption mitigation and energy balance. In order to give a comprehensive understanding of each protocol, we highlight their innovative ideas, describe the underlying principles in detail and analyze their advantages and disadvantages. Moreover, a comparison of the routing protocols is conducted to demonstrate the differences between them in terms of message complexity, memory requirements, localization, data aggregation, clustering manner and…
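
    The cluster-based energy argument above can be sketched numerically. The following is a minimal illustration, not taken from the survey itself: a single LEACH-style round in which roughly 10% of nodes act as cluster heads, under a simple first-order radio model. All constants, coordinates and the field geometry are invented for the example.

```python
# Hedged sketch of hierarchical vs. flat routing energy in a WSN.
# Radio model (e_elec + e_amp * d^2 per bit) and all parameters are
# illustrative assumptions, not values from the surveyed protocols.
import math
import random

random.seed(1)

FIELD = 100.0          # square field side, metres
SINK = (50.0, 150.0)   # sink located outside the sensing field
P_HEAD = 0.1           # desired fraction of cluster heads per round

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def tx_cost(d, bits=1.0, e_elec=50e-9, e_amp=100e-12):
    # first-order radio model: electronics term + d^2 amplifier term
    return bits * (e_elec + e_amp * d * d)

nodes = [(random.uniform(0, FIELD), random.uniform(0, FIELD)) for _ in range(100)]

# flat routing: every node transmits its reading directly to the sink
flat = sum(tx_cost(dist(n, SINK)) for n in nodes)

# hierarchical routing: heads aggregate their cluster, then relay to the sink
heads = [n for n in nodes if random.random() < P_HEAD] or [nodes[0]]
hier = 0.0
for n in nodes:
    if n in heads:
        hier += tx_cost(dist(n, SINK))            # head -> sink (aggregated)
    else:
        nearest = min(heads, key=lambda h: dist(n, h))
        hier += tx_cost(dist(n, nearest))         # member -> nearest head

print(f"flat: {flat:.3e} J  hierarchical: {hier:.3e} J")
```

    Because the d² amplifier term dominates on long links, short member-to-head hops are cheap, and only the aggregated head-to-sink transmissions pay the long-distance cost.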

  17. Public knowledge and preventive behavior during a large-scale Salmonella outbreak: results from an online survey in the Netherlands

    PubMed Central

    2014-01-01

    Background Food-borne Salmonella infections are a worldwide concern. During a large-scale outbreak, it is important that the public follows preventive advice. To increase compliance, insight into how the public gathers its knowledge and which factors determine whether or not an individual complies with preventive advice is crucial. Methods In 2012, contaminated salmon caused a large Salmonella Thompson outbreak in the Netherlands. During the outbreak, we conducted an online survey (n = 1,057) to assess the general public’s perceptions, knowledge, preventive behavior and sources of information. Results Respondents perceived Salmonella infections and the 2012 outbreak as severe (m = 4.21; five-point scale with 5 as severe). Their knowledge regarding common food sources, the incubation period and regular treatment of Salmonella (gastro-enteritis) was relatively low (e.g., only 28.7% knew that Salmonella is not normally treated with antibiotics). Preventive behavior differed widely, and the majority (64.7%) did not check for contaminated salmon at home. Most information about the outbreak was gathered through traditional media and news and newspaper websites. This was mostly determined by time spent on the medium. Social media played a marginal role. Wikipedia seemed a potentially important source of information. Conclusions To persuade the public to take preventive actions, public health organizations should deliver their message primarily through mass media. Wikipedia seems a promising instrument for educating the public about food-borne Salmonella. PMID:24479614

  18. Child maltreatment experience among primary school children: a large scale survey in Selangor state, Malaysia.

    PubMed

    Ahmed, Ayesha; Wan-Yuen, Choo; Marret, Mary Joseph; Guat-Sim, Cheah; Othman, Sajaratulnisah; Chinna, Karuthan

    2015-01-01

    Official reports of child maltreatment in Malaysia have persistently increased throughout the last decade. However, there is a lack of population surveys evaluating the actual burden of child maltreatment, its correlates and its consequences in the country. This cross-sectional study employed two-stage stratified cluster random sampling of public primary schools to survey 3509 ten- to twelve-year-old school children in Selangor state. It aimed to estimate the prevalence of parental physical and emotional maltreatment, parental neglect and teacher-inflicted physical maltreatment. It further aimed to examine the associations between child maltreatment and important socio-demographic factors, family functioning and symptoms of depression among children. Logistic regression on weighted samples was used to extend results to a population level. Three quarters of the 10-12 year olds reported at least one form of maltreatment, with parental physical maltreatment being most common. Males had higher odds of maltreatment in general, except for emotional maltreatment. Ethnicity and parental conflict were key factors associated with maltreatment. The study contributes important evidence towards improving public health interventions for child maltreatment prevention in the country. PMID:25786214

  20. Body burden of cadmium and its related factors: a large-scale survey in China.

    PubMed

    Ke, Shen; Cheng, Xi-Yu; Li, Hao; Jia, Wen-Jing; Zhang, Jie-Ying; Luo, Hui-Fang; Wang, Zi-Ling; Chen, Zhi-Nan

    2015-04-01

    A survey of more than 6000 participants from four distinct non-polluted and polluted regions in China was conducted to evaluate the body burden of cadmium (Cd) in Chinese populations, using urinary Cd (UCd) as a biomarker. The findings revealed that the UCd level was 1.24 μg/g creatinine (μg/g cr) for the sample population from non-polluted Shanghai, where 1.1% of people exceeded the health-based exposure limit of 5 μg/g cr set by the World Health Organization (WHO). The mean UCd levels in the moderately polluted (Hubei and Liaoning) and highly polluted (Guizhou) areas were 4.69 μg/g cr, 3.62 μg/g cr and 6.08 μg/g cr, respectively; these levels were 2.9 to 4.9 times those observed in Shanghai. Notably, UCd levels exceeded the recently updated human biomonitoring II values (i.e., the intervention or "action" level) in 44.8%-87.9% of people from these areas, compared to only 5.1%-21.4% of people in Shanghai. The corresponding prevalence of elevated UCd levels (above the WHO threshold of 5 μg/g cr) was also significantly higher (30.7% to 63.8% vs. 1.1%), which indicates elevated Cd-induced health risks for residents of these areas. Age and region were significant determinants of UCd levels in a population, whereas gender did not significantly influence UCd. PMID:25594907

  1. Workplace Bullying and Sleep Disturbances: Findings from a Large Scale Cross-Sectional Survey in the French Working Population

    PubMed Central

    Niedhammer, Isabelle; David, Simone; Degioanni, Stéphanie; Drummond, Anne; Philip, Pierre

    2009-01-01

    Study Objectives: The purpose of this study was to explore the associations between workplace bullying, the characteristics of workplace bullying, and sleep disturbances in a large sample of employees of the French working population. Design: Workplace bullying, evaluated using the validated instrument developed by Leymann, and sleep disturbances, as well as covariates, were measured using a self-administered questionnaire. Covariates included age, marital status, presence of children, education, occupation, working hours, night work, physical and chemical exposures at work, self-reported health, and depressive symptoms. Statistical analysis was performed using logistic regression analysis and was carried out separately for men and women. Setting: General working population. Participants: The study population consisted of a random sample of 3132 men and 4562 women of the working population in the southeast of France. Results: Workplace bullying was strongly associated with sleep disturbances. Past exposure to bullying also increased the risk for this outcome. The more frequent the exposure to bullying, the higher the risk of experiencing sleep disturbances. Observing someone else being bullied in the workplace was also associated with the outcome. Adjustment for covariates did not modify the results. Additional adjustment for self-reported health and depressive symptoms diminished the magnitude of the associations, which remained significant. Conclusions: The prevalence of workplace bullying (around 10%) was found to be high in this study, as was the impact of this major job-related stressor on sleep disturbances. Although no conclusion about causality could be drawn from this cross-sectional study, the findings suggest that the contribution of workplace bullying to the burden of sleep disturbances may be substantial. Citation: Niedhammer I; David S; Degioanni S; Drummond A; Philip P. Workplace bullying and sleep disturbances: findings from a large scale cross…

  2. Multi-stage sampling for large scale natural resources surveys: A case study of rice and waterfowl

    USGS Publications Warehouse

    Stafford, J.D.; Reinecke, K.J.; Kaminski, R.M.; Gerard, P.D.

    2005-01-01

    Large-scale sample surveys to estimate abundance and distribution of organisms and their habitats are increasingly important in ecological studies. Multi-stage sampling (MSS) is especially suited to large-scale surveys because of the natural clustering of resources. To illustrate an application, we: (1) designed a stratified MSS to estimate late autumn abundance (kg/ha) of rice seeds in harvested fields as food for waterfowl wintering in the Mississippi Alluvial Valley (MAV); (2) investigated options for improving the MSS design; and (3) compared statistical and cost efficiency of MSS to simulated simple random sampling (SRS). During 2000–2002, we sampled 25–35 landowners per year, 1 or 2 fields per landowner per year, and measured seed mass in 10 soil cores collected within each field. Analysis of variance components and costs for each stage of the survey design indicated that collecting 10 soil cores per field was near the optimum of 11–15, whereas sampling >1 field per landowner provided few benefits because data from fields within landowners were highly correlated. Coefficients of variation (CV) of annual estimates of rice abundance ranged from 0.23 to 0.31 and were limited by variation among landowners and the number of landowners sampled. Design effects representing the statistical efficiency of MSS relative to SRS ranged from 3.2 to 9.0, and simulations indicated SRS would cost, on average, 1.4 times more than MSS because clustering of sample units in MSS decreased travel costs. We recommend MSS as a potential sampling strategy for large-scale natural resource surveys and specifically for future surveys of the availability of rice as food for waterfowl in the MAV and similar areas.
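
    The within-landowner correlation that limited the benefit of sampling extra fields can be illustrated with the standard design-effect approximation deff ≈ 1 + (m − 1)ρ for clusters of size m and intraclass correlation ρ. The sketch below is my illustration, not the authors' analysis; the intraclass correlation and sample sizes are invented. It checks the formula against simulation:

```python
# Illustrative check of the cluster-sampling design effect deff = 1 + (m-1)*rho.
# RHO, SIGMA2 and the sample sizes are made-up values for demonstration only.
import random
import statistics

random.seed(42)

RHO = 0.7        # assumed intraclass correlation among fields of one landowner
SIGMA2 = 1.0     # total variance of field-level rice mass (arbitrary units)
N_OWNERS = 30    # landowners (clusters) per simulated survey
M_FIELDS = 2     # fields sampled per landowner

def survey_mean():
    """Mean of one simulated two-stage sample: owner effect + field noise."""
    vals = []
    for _ in range(N_OWNERS):
        owner = random.gauss(0.0, (RHO * SIGMA2) ** 0.5)   # shared component
        for _ in range(M_FIELDS):
            vals.append(owner + random.gauss(0.0, ((1 - RHO) * SIGMA2) ** 0.5))
    return statistics.fmean(vals)

sims = [survey_mean() for _ in range(5000)]
var_cluster = statistics.variance(sims)            # empirical variance of the mean
var_srs = SIGMA2 / (N_OWNERS * M_FIELDS)           # same n, fully independent draws
deff_theory = 1 + (M_FIELDS - 1) * RHO

print(f"empirical deff: {var_cluster / var_srs:.2f}  theory: {deff_theory:.2f}")
```

    With ρ = 0.7, a second field per landowner inflates the variance of the sample mean by roughly 70% relative to an independent sample of the same size, which mirrors the paper's finding that extra fields per landowner added little information.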

  3. A Survey of Residents' Perceptions of the Effect of Large-Scale Economic Developments on Perceived Safety, Violence, and Economic Benefits

    PubMed Central

    Fabio, Anthony; Geller, Ruth; Bazaco, Michael; Bear, Todd M.; Foulds, Abigail L.; Duell, Jessica; Sharma, Ravi

    2015-01-01

    Background. Emerging research highlights the promise of community- and policy-level strategies in preventing youth violence. Large-scale economic developments, such as sports and entertainment arenas and casinos, may improve the living conditions, economics, public health, and overall wellbeing of area residents and may influence rates of violence within communities. Objective. To assess the effect of community economic development efforts on neighborhood residents' perceptions on violence, safety, and economic benefits. Methods. Telephone survey in 2011 using a listed sample of randomly selected numbers in six Pittsburgh neighborhoods. Descriptive analyses examined measures of perceived violence and safety and economic benefit. Responses were compared across neighborhoods using chi-square tests for multiple comparisons. Survey results were compared to census and police data. Results. Residents in neighborhoods with the large-scale economic developments reported more casino-specific and arena-specific economic benefits. However, 42% of participants in the neighborhood with the entertainment arena felt there was an increase in crime, and 29% of respondents from the neighborhood with the casino felt there was an increase. In contrast, crime decreased in both neighborhoods. Conclusions. Large-scale economic developments have a direct influence on the perception of violence, despite actual violence rates. PMID:26273310

  4. Assessing the Hypothesis of Measurement Invariance in the Context of Large-Scale International Surveys

    ERIC Educational Resources Information Center

    Rutkowski, Leslie; Svetina, Dubravka

    2014-01-01

    In the field of international educational surveys, equivalence of achievement scale scores across countries has received substantial attention in the academic literature; however, only a relatively recent emphasis on scale score equivalence in nonachievement education surveys has emerged. Given the current state of research in multiple-group…

  5. A Large-Scale Radio Polarization Survey of the Southern Sky at 21cm

    NASA Astrophysics Data System (ADS)

    Testori, J. C.; Reich, P.; Reich, W.

    2004-02-01

    We have successfully reduced the polarization data from the recently published 21 cm continuum survey of the southern sky carried out with a 30-m antenna at Villa Elisa (Argentina). We describe the reduction and calibration methods of the survey. The result is a fully sampled survey, which covers declinations from -90 degrees to -10 degrees with a typical rms noise of 15 mK T_B. The map of polarized intensity shows large regions with smooth low-level emission, but also a number of enhanced high-latitude features. Most of these regions have no counterpart in total intensity and indicate Faraday active regions.

  6. A Strong-Lens Survey in AEGIS: the Influence of Large Scale Structure

    SciTech Connect

    Moustakas, Leonidas A.; Marshall, Phil J.; Newman, Jeffrey A.; Coil, Alison L.; Cooper, Michael C.; Davis, Marc; Fassnacht, Christopher D.; Guhathakurta, Puragra; Hopkins, Andrew; Koekemoer, Anton; Konidaris, Nicholas P.; Lotz, Jennifer M.; Willmer, Christopher N.A.

    2006-07-14

    We report on the results of a visual search for galaxy-scale strong gravitational lenses over 650 arcmin² of HST/ACS imaging in the Extended Groth Strip (EGS). These deep F606W- and F814W-band observations are in the DEEP2-EGS field. In addition to a previously-known Einstein Cross also found by our search (the "Cross", HSTJ141735+52264, with z_lens = 0.8106 and a published z_source = 3.40), we identify two new strong galaxy-galaxy lenses with multiple extended arcs. The first, HSTJ141820+52361 (the "Dewdrop"; z_lens = 0.5798), lenses two distinct extended sources into two pairs of arcs (z_source = 0.9818 by nebular [O II] emission), while the second, HSTJ141833+52435 (the "Anchor"; z_lens = 0.4625), produces a single pair of arcs (source redshift not yet known). Four less convincing arc/counter-arc and two-image lens candidates are also found and presented for completeness. All three definite lenses are fit reasonably well by simple singular isothermal ellipsoid models including external shear, giving reduced chi-squared values χ²_ν close to unity. Using the three-dimensional line-of-sight (LOS) information on galaxies from the DEEP2 data, we calculate the convergence and shear contributions κ_los and γ_los to each lens, assuming singular isothermal sphere halos truncated at 200 h⁻¹ kpc. These are compared against a robust measure of local environment, δ_3, a normalized density that uses the distance to the third nearest neighbor. We find that even strong lenses in demonstrably underdense local environments may be considerably affected by LOS contributions, which in turn, under the adopted assumptions, may be underestimates of the effect of large scale structure.
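
    The line-of-sight bookkeeping described here rests on the singular isothermal sphere (SIS), for which the convergence and shear at angular separation θ are κ(θ) = γ(θ) = θ_E/(2θ), with Einstein radius θ_E = 4π(σ/c)²(D_ls/D_s). Below is a hedged sketch with an invented neighbour list; the paper's actual DEEP2 catalogue and the 200 h⁻¹ kpc truncation are not reproduced.

```python
# Toy SIS external-convergence sum. The velocity dispersions, distance
# ratios and separations are hypothetical values for illustration only.
import math

C_KM_S = 299_792.458  # speed of light in km/s

def sis_theta_e(sigma_km_s, dls_over_ds):
    """Einstein radius in radians for an SIS with velocity dispersion sigma."""
    return 4.0 * math.pi * (sigma_km_s / C_KM_S) ** 2 * dls_over_ds

def sis_kappa(theta_e, theta):
    """SIS convergence (equal to the shear magnitude) at separation theta."""
    return theta_e / (2.0 * theta)

ARCSEC = math.pi / (180.0 * 3600.0)  # arcseconds -> radians

# hypothetical LOS neighbours: (sigma [km/s], D_ls/D_s, separation [arcsec])
neighbours = [(150.0, 0.5, 30.0), (220.0, 0.3, 60.0), (100.0, 0.7, 15.0)]

kappa_los = sum(
    sis_kappa(sis_theta_e(sigma, ratio), sep * ARCSEC)
    for sigma, ratio, sep in neighbours
)
print(f"kappa_los ~ {kappa_los:.4f}")
```

    Even this toy sum yields a percent-level external convergence from a handful of ordinary neighbours, which is the sense in which lenses in underdense environments can still be measurably affected by the line of sight.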

  7. GLOBAL CLIMATE AND LARGE-SCALE INFLUENCES ON AQUATIC ANIMAL HEALTH

    EPA Science Inventory

    The last 3 decades have witnessed numerous large-scale mortality events of aquatic organisms in North America. Affected species range from ecologically-important sea urchins to commercially-valuable American lobsters and protected marine mammals. Short-term forensic investigation...

  8. Large-scale structure: the Chile-UK uv-excess quasar survey

    NASA Astrophysics Data System (ADS)

    Clowes, R. G.; Newman, P. R.; Campusano, L. E.; Graham, M. J.

    1996-12-01

    We report the first results from a new-generation survey for quasars using the 2°-field, 128-fibre, multi-object spectrograph on the 2.5-m du Pont telescope at Las Campanas Observatory in Chile. Survey candidates are all objects with (U-B) < -0.3 and B ≤ 19.7 on digitized UK Schmidt plates. The survey will cover 140 deg² and produce a homogeneous, magnitude-limited catalogue of ~1500 quasars with redshifts 0.4 ≤ z ≤ 2.2. We have so far surveyed 18.7 deg² and identified 183 quasars, including all 43 previously-published quasars within the selection criteria. The survey will be used to study in detail the large (~200 h⁻¹ Mpc) quasar group discovered at z ≈ 1.3 by Clowes & Campusano (1991, MNRAS, 249, 218), the largest known structure in the early Universe, and to study the clustering of quasars in general. The group was found with sparse sampling of quasar candidates across 25 deg²; it strikes the boundaries of this area. Our spectroscopic survey will include all candidates in an area of 100 deg² around the group, plus a 40 deg² control area ~34° away. This survey should allow the determination of the full extent, membership and statistical significance of the group, using the MST method of Graham, Clowes and Campusano (1995, MNRAS, 275, 790). Preliminary analysis of our new data shows that the group persists with increased membership. The measured density contrast of the quasar group will be compared with theoretical expectations, to determine the consistency of the group with formation from Gaussian density fluctuations. We will search for sub-clustering in the group and test the hypothesis that all small-scale (≤10 h⁻¹ Mpc) quasar clustering is attributable to large groups. Our sample will allow further investigation of the clustering of quasars in general. We will also identify and characterise any other large quasar groups in the survey using the MST method.

  9. A composite large-scale CO survey at high Galactic latitudes in the second quadrant

    NASA Technical Reports Server (NTRS)

    Heithausen, A.; Stacy, J. G.; Thaddeus, P.

    1990-01-01

    Surveys of the second quadrant of the Galaxy undertaken with the CfA 1.2-m telescope have been combined to produce a map of approximately 620 sq deg in the 2.6-mm CO (J = 1-0) line at high Galactic latitudes. CO was detected over about 13 percent of the region surveyed, an order of magnitude more gas by area than previously estimated. In contrast, only 26 percent of the area predicted by Desert et al. (1988) to contain molecular gas actually reveals CO, and about two-thirds of the clouds detected are not listed in their catalog of IR excess clouds.

  10. Nonparametric Bayesian Multiple Imputation for Incomplete Categorical Variables in Large-Scale Assessment Surveys

    ERIC Educational Resources Information Center

    Si, Yajuan; Reiter, Jerome P.

    2013-01-01

    In many surveys, the data comprise a large number of categorical variables that suffer from item nonresponse. Standard methods for multiple imputation, like log-linear models or sequential regression imputation, can fail to capture complex dependencies and can be difficult to implement effectively in high dimensions. We present a fully Bayesian,…

  11. Children's Attitudes about Nuclear War: Results of Large-Scale Surveys of Adolescents.

    ERIC Educational Resources Information Center

    Doctor, Ronald M.; And Others

    A three-section survey instrument was developed to provide descriptive and expressive information about teenagers' attitudes and fear reactions related to the nuclear threat. The first section consisted of one open-ended statement, "Write down your three greatest worries." The second section consisted of 20 areas of potential worry or concern…

  12. Large-scale clustering of galaxies in the CfA Redshift Survey

    NASA Technical Reports Server (NTRS)

    Vogeley, Michael S.; Park, Changbom; Geller, Margaret J.; Huchra, John P.

    1992-01-01

    The power spectrum of the galaxy distribution in the Center for Astrophysics Redshift Survey (de Lapparent et al., 1986; Geller and Huchra, 1989; and Huchra et al., 1992) is measured up to wavelengths of 200/h Mpc. Results are compared with several cosmological simulations with Gaussian initial conditions. It is shown that the power spectrum of the standard CDM model is inconsistent with the observed power spectrum at the 99 percent confidence level.
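
    A power-spectrum measurement of the kind described can be sketched as follows: Fourier-transform an overdensity field on a grid and shell-average |δ_k|² in bins of k. The grid size, box size and white-noise test field below are arbitrary choices of mine, not the CfA pipeline.

```python
# Minimal FFT power-spectrum estimator on a periodic grid, tested on a
# white-noise field (whose true P(k) is flat and equal to the cell volume).
import numpy as np

rng = np.random.default_rng(0)
N, L = 64, 200.0                          # cells per side, box size [h^-1 Mpc]

delta = rng.standard_normal((N, N, N))    # stand-in overdensity field
delta -= delta.mean()                     # remove the DC mode

dk = np.fft.rfftn(delta) * (L / N) ** 3   # FFT with cell-volume normalisation
power = np.abs(dk) ** 2 / L ** 3          # P(k) estimator per Fourier mode

# |k| for every mode of the real-to-complex transform
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)
kx, ky = np.meshgrid(k, k, indexing="ij")
kz = 2 * np.pi * np.fft.rfftfreq(N, d=L / N)
kmag = np.sqrt(kx[..., None] ** 2 + ky[..., None] ** 2 + kz[None, None, :] ** 2)

# shell-average P(k) in bins of |k|
bins = np.linspace(kmag[kmag > 0].min(), kmag.max(), 20)
which = np.digitize(kmag.ravel(), bins)
pk = np.array([power.ravel()[which == i].mean() for i in range(1, len(bins))])

print(f"mid-bin P(k): {pk[5:15].mean():.2f}  white-noise expectation: {(L/N)**3:.2f}")
```

    For white noise the estimator should recover a flat spectrum at the cell volume (L/N)³; a survey analysis would additionally correct for the selection function and the survey window, which this sketch omits.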

  13. A composite large-scale CO survey at high galactic latitudes in the second quadrant

    NASA Technical Reports Server (NTRS)

    Heithausen, A.; Stacy, J. G.; De Vries, H. W.; Mebold, U.; Thaddeus, P.

    1993-01-01

    Surveys undertaken in the 2nd quadrant of the Galaxy with the CfA 1.2 m telescope have been combined to produce a map covering about 620 sq deg in the 2.6 mm CO(J = 1 - 0) line at high galactic latitudes. There is CO emission from molecular 'cirrus' clouds in about 13 percent of the region surveyed. The CO clouds are grouped together into three major cloud complexes with 29 individual members. All clouds are associated with infrared emission at 100 micron, although there is no one-to-one correlation between the corresponding intensities. CO emission is detected in all bright and dark Lynds' nebulae cataloged in that region; however, not all CO clouds are visible on optical photographs as reflection or absorption features. The clouds are probably local. At an adopted distance of 240 pc, cloud sizes range from 0.1 to 30 pc and cloud masses from 1 to 1600 solar masses. The molecular cirrus clouds contribute between 0.4 and 0.8 solar masses per square parsec to the surface density of molecular gas in the galactic plane. Only 26 percent of the 'infrared-excess clouds' in the area surveyed actually show CO, and about 2/3 of the clouds detected in CO do not show an infrared excess.

  14. A composite large-scale CO survey at high galactic latitudes in the second quadrant

    NASA Astrophysics Data System (ADS)

    Heithausen, A.; Stacy, J. G.; de Vries, H. W.; Mebold, U.; Thaddeus, P.

    1993-02-01

Surveys undertaken in the 2nd quadrant of the Galaxy with the CfA 1.2 m telescope have been combined to produce a map covering about 620 sq deg in the 2.6 mm CO(J = 1 - 0) line at high galactic latitudes. There is CO emission from molecular 'cirrus' clouds in about 13 percent of the region surveyed. The CO clouds are grouped together into three major cloud complexes with 29 individual members. All clouds are associated with infrared emission at 100 microns, although there is no one-to-one correlation between the corresponding intensities. CO emission is detected in all bright and dark Lynds' nebulae cataloged in that region; however, not all CO clouds are visible on optical photographs as reflection or absorption features. The clouds are probably local. At an adopted distance of 240 pc, cloud sizes range from 0.1 to 30 pc and cloud masses from 1 to 1600 solar masses. The molecular cirrus clouds contribute between 0.4 and 0.8 solar masses/sq pc to the surface density of molecular gas in the galactic plane. Only 26 percent of the 'infrared-excess clouds' in the area surveyed actually show CO, and about 2/3 of the clouds detected in CO do not show an infrared excess.

  15. The Muenster Red Sky Survey: Large-scale structures in the universe

    NASA Astrophysics Data System (ADS)

    Ungruhe, R.; Seitter, W. C.; Duerbeck, H. W.

    2003-01-01

We present a large-scale galaxy catalogue for the red spectral region which covers an area of 5,000 square degrees. It contains positions, red magnitudes, radii, ellipticities and position angles of about 5.5 million galaxies. Together with the APM catalogue (4,300 square degrees) in the blue spectral region, this catalogue forms at present the largest coherent data base for cosmological investigations in the southern hemisphere. 217 ESO Southern Sky Atlas R Schmidt plates with galactic latitudes below -45 degrees were digitized with the two PDS microdensitometers of the Astronomisches Institut Münster, with a step width of 15 microns, corresponding to 1.01 arcseconds per pixel. All data were stored on different storage media and are available for further investigations. Suitable search parameters must be chosen in such a way that all objects are found on the plates, and that the percentage of artificial objects remains as low as possible. Based on two reference areas on different plates, a search threshold of 140 PDS density units and a minimum number of four pixels per object were chosen. The detected objects were stored, according to size, in frames of different side lengths. Each object was investigated in its frame, and 18 object parameters were determined. The classification of objects into stars, galaxies and perturbed objects was done with an automatic procedure which makes use of combinations of computed object parameters. In the first step, the perturbed objects are removed from the catalogue. Double objects and noise objects can be excluded on the basis of symmetry properties, while for satellite trails, a new classification criterion based on apparent magnitude, effective radius and apparent ellipticity was developed. For the remaining objects, a star/galaxy separation was carried out. For bright objects, the relation between apparent magnitude and effective radius serves as the discriminating property; for fainter objects, the relation between effective

  16. Implementing large-scale workforce change: learning from 55 pilot sites of allied health workforce redesign in Queensland, Australia

    PubMed Central

    2013-01-01

    Background Increasingly, health workforces are undergoing high-level ‘re-engineering’ to help them better meet the needs of the population, workforce and service delivery. Queensland Health implemented a large scale 5-year workforce redesign program across more than 13 health-care disciplines. This study synthesized the findings from this program to identify and codify mechanisms associated with successful workforce redesign to help inform other large workforce projects. Methods This study used Inductive Logic Reasoning (ILR), a process that uses logic models as the primary functional tool to develop theories of change, which are subsequently validated through proposition testing. Initial theories of change were developed from a systematic review of the literature and synthesized using a logic model. These theories of change were then developed into propositions and subsequently tested empirically against documentary, interview, and survey data from 55 projects in the workforce redesign program. Results Three overarching principles were identified that optimized successful workforce redesign: (1) drivers for change need to be close to practice; (2) contexts need to be supportive both at the local levels and legislatively; and (3) mechanisms should include appropriate engagement, resources to facilitate change management, governance, and support structures. Attendance to these factors was uniformly associated with success of individual projects. Conclusions ILR is a transparent and reproducible method for developing and testing theories of workforce change. Despite the heterogeneity of projects, professions, and approaches used, a consistent set of overarching principles underpinned success of workforce change interventions. These concepts have been operationalized into a workforce change checklist. PMID:24330616

  17. Climate, Water, and Human Health: Large Scale Hydroclimatic Controls in Forecasting Cholera Epidemics

    NASA Astrophysics Data System (ADS)

    Akanda, A. S.; Jutla, A. S.; Islam, S.

    2009-12-01

Despite cholera's ravages across the continents through seven global pandemics in past centuries, the seasonal and interannual variability of cholera outbreaks remains a mystery. Previous studies have focused on the role of various environmental and climatic factors, but provided little or no predictive capability. Recent findings suggest a more prominent role of large scale hydroclimatic extremes - droughts and floods - and attempt to explain the seasonality and the unique dual cholera peaks in the Bengal Delta region of South Asia. We investigate the seasonal and interannual nature of cholera epidemiology in three geographically distinct locations within the region to identify the larger scale hydroclimatic controls that can set the ecological and environmental ‘stage’ for outbreaks and have significant memory on a seasonal scale. Here we show that two distinctly different, pre- and post-monsoon, cholera transmission mechanisms related to large scale climatic controls prevail in the region. An implication of our findings is that extreme climatic events such as prolonged droughts, record floods, and major cyclones may cause major disruption in the ecosystem and trigger large epidemics. We postulate that a quantitative understanding of the large-scale hydroclimatic controls and dominant processes with significant system memory will form the basis for forecasting such epidemic outbreaks. A multivariate regression method using these predictor variables to develop probabilistic forecasts of cholera outbreaks will be explored. Forecasts from such a system with a seasonal lead-time are likely to have measurable impact on early cholera detection and prevention efforts in endemic regions.

  18. A satellite geodetic survey of large-scale deformation of volcanic centres in the central Andes

    NASA Astrophysics Data System (ADS)

    Pritchard, Matthew E.; Simons, Mark

    2002-07-01

    Surface deformation in volcanic areas usually indicates movement of magma or hydrothermal fluids at depth. Stratovolcanoes tend to exhibit a complex relationship between deformation and eruptive behaviour. The characteristically long time spans between such eruptions requires a long time series of observations to determine whether deformation without an eruption is common at a given edifice. Such studies, however, are logistically difficult to carry out in most volcanic arcs, as these tend to be remote regions with large numbers of volcanoes (hundreds to even thousands). Here we present a satellite-based interferometric synthetic aperture radar (InSAR) survey of the remote central Andes volcanic arc, a region formed by subduction of the Nazca oceanic plate beneath continental South America. Spanning the years 1992 to 2000, our survey reveals the background level of activity of about 900 volcanoes, 50 of which have been classified as potentially active. We find four centres of broad (tens of kilometres wide), roughly axisymmetric surface deformation. None of these centres are at volcanoes currently classified as potentially active, although two lie within about 10km of volcanoes with known activity. Source depths inferred from the patterns of deformation lie between 5 and 17km. In contrast to the four new sources found, we do not observe any deformation associated with recent eruptions of Lascar, Chile.

  19. Digital Archiving of People Flow by Recycling Large-Scale Social Survey Data of Developing Cities

    NASA Astrophysics Data System (ADS)

    Sekimoto, Y.; Watanabe, A.; Nakamura, T.; Horanont, T.

    2012-07-01

Data on people flow have become increasingly important in the field of business, including the areas of marketing and public services. Although mobile phones enable a person's position to be located to a certain degree, it is a challenge to acquire sufficient data from people with mobile phones. In order to grasp people flow in its entirety, it is important to establish a practical method of reconstructing people flow from various kinds of existing fragmentary spatio-temporal data, such as social survey data. For example, although typical Person Trip Survey data collected by the public sector record only fragmentary spatio-temporal positions, they are attractive given a sample size sufficiently large to estimate the entire flow of people. In this study, we apply our proposed basic method to Japan International Cooperation Agency (JICA) PT data pertaining to developing cities around the world, and we propose some correction methods to resolve the difficulties in applying it stably to many cities and their infrastructure data.

  20. A satellite geodetic survey of large-scale deformation of volcanic centres in the central Andes.

    PubMed

    Pritchard, Matthew E; Simons, Mark

    2002-07-11

    Surface deformation in volcanic areas usually indicates movement of magma or hydrothermal fluids at depth. Stratovolcanoes tend to exhibit a complex relationship between deformation and eruptive behaviour. The characteristically long time spans between such eruptions requires a long time series of observations to determine whether deformation without an eruption is common at a given edifice. Such studies, however, are logistically difficult to carry out in most volcanic arcs, as these tend to be remote regions with large numbers of volcanoes (hundreds to even thousands). Here we present a satellite-based interferometric synthetic aperture radar (InSAR) survey of the remote central Andes volcanic arc, a region formed by subduction of the Nazca oceanic plate beneath continental South America. Spanning the years 1992 to 2000, our survey reveals the background level of activity of about 900 volcanoes, 50 of which have been classified as potentially active. We find four centres of broad (tens of kilometres wide), roughly axisymmetric surface deformation. None of these centres are at volcanoes currently classified as potentially active, although two lie within about 10 km of volcanoes with known activity. Source depths inferred from the patterns of deformation lie between 5 and 17 km. In contrast to the four new sources found, we do not observe any deformation associated with recent eruptions of Lascar, Chile. PMID:12110886

  1. A Spatio-Temporally Explicit Random Encounter Model for Large-Scale Population Surveys.

    PubMed

    Jousimo, Jussi; Ovaskainen, Otso

    2016-01-01

    Random encounter models can be used to estimate population abundance from indirect data collected by non-invasive sampling methods, such as track counts or camera-trap data. The classical Formozov-Malyshev-Pereleshin (FMP) estimator converts track counts into an estimate of mean population density, assuming that data on the daily movement distances of the animals are available. We utilize generalized linear models with spatio-temporal error structures to extend the FMP estimator into a flexible Bayesian modelling approach that estimates not only total population size, but also spatio-temporal variation in population density. We also introduce a weighting scheme to estimate density on habitats that are not covered by survey transects, assuming that movement data on a subset of individuals is available. We test the performance of spatio-temporal and temporal approaches by a simulation study mimicking the Finnish winter track count survey. The results illustrate how the spatio-temporal modelling approach is able to borrow information from observations made on neighboring locations and times when estimating population density, and that spatio-temporal and temporal smoothing models can provide improved estimates of total population size compared to the FMP method. PMID:27611683
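As a rough illustration of the classical FMP conversion the abstract describes (not the authors' code; the function name, units, and example numbers are invented), the track-count-to-density step can be sketched as:

```python
import math

def fmp_density(track_crossings, transect_length_km, daily_movement_km):
    """Classical Formozov-Malyshev-Pereleshin (FMP) track-count estimator.

    Converts a count of animal track crossings along survey transects into a
    mean population density (animals per sq km), assuming the mean daily
    movement distance of the animals is known.
    """
    if transect_length_km <= 0 or daily_movement_km <= 0:
        raise ValueError("transect length and daily movement must be positive")
    # D = (pi / 2) * x / (S * d): the pi/2 factor corrects for the random
    # orientation of animal paths relative to the transect lines.
    return (math.pi / 2) * track_crossings / (transect_length_km * daily_movement_km)

# Example: 24 crossings on 10 km of transect, animals moving ~3 km per day.
density = fmp_density(24, 10.0, 3.0)  # ~1.26 animals per sq km
```

The Bayesian extension in the paper replaces this single pooled estimate with spatio-temporally varying densities; the closed-form version above is only the baseline it generalizes.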

  2. Statistical analysis of large scale surveys for constraining the Galaxy evolution

    NASA Astrophysics Data System (ADS)

    Martins, A. M. M.; Robin, A. C.

    2014-07-01

The formation and evolution of the thick disc of the Milky Way remain controversial. We make use of the Besançon Galaxy model, which among other utilities can be used for data interpretation and to test different scenarios of galaxy formation and evolution. We examine these questions by studying the metallicity distribution of the thin and thick disc with the help of a sample of Main Sequence turn-off stars from the SEGUE survey. We developed a tool based on an MCMC-ABC method to determine the metallicity distribution and study the correlation between the fitted parameters. We obtained a local metallicity of the thick disc of -0.47 ± 0.03 dex, similar to previous studies, and the thick disc shows no gradient. A flat gradient in the thick disc can be a consequence of radial mixing or the result of a strong turbulent gaseous disc.

  3. Large-scale survey to describe acne management in Brazilian clinical practice

    PubMed Central

    Seité, Sophie; Caixeta, Clarice; Towersey, Loan

    2015-01-01

Background Acne is a chronic disease of the pilosebaceous unit that mainly affects adolescents. It is the most common dermatological problem, affecting approximately 80% of teenagers between 12 and 18 years of age. Diagnosis is clinical and is based on the patient’s age at the time the lesions first appear, and on its polymorphism, type of lesions, and their anatomical location. The right treatment for the right patient is key to treating acne safely. The aim of this investigational survey was to evaluate how Brazilian dermatologists in private practice currently manage acne. Materials and methods Dermatologists practicing in 12 states of Brazil were asked how they manage patients with grades I, II, III, and IV acne. Each dermatologist completed a written questionnaire about patient characteristics, acne severity, and the therapy they usually prescribe for each situation. Results In total, 596 dermatologists were interviewed. Adolescents were the most common acne population seen by dermatologists, and the most common acne grade was grade II. The doctors could choose more than one type of treatment for each patient, and treatment choices varied according to acne severity. A great majority of dermatologists considered treatment with drugs as the first alternative for all acne grades, choosing either topical or oral presentation depending on the pathology severity. Dermocosmetics were chosen mostly as adjunctive therapy, and their inclusion in the treatment regimen decreased as acne grades increased. Conclusion This survey illustrates that Brazilian dermatologists employ complex treatment regimens to manage acne, choosing systemic drugs, particularly isotretinoin, even in some cases of grade I acne, and heavily prescribe antibiotics. Because complex regimens are harder for patients to comply with, this result notably raises the question of adherence, which is a key factor in successful treatment. PMID:26609243

  4. Studying Displacement After a Disaster Using Large Scale Survey Methods: Sumatra After the 2004 Tsunami

    PubMed Central

    Gray, Clark; Frankenberg, Elizabeth; Gillespie, Thomas; Sumantri, Cecep; Thomas, Duncan

    2014-01-01

    Understanding of human vulnerability to environmental change has advanced in recent years, but measuring vulnerability and interpreting mobility across many sites differentially affected by change remains a significant challenge. Drawing on longitudinal data collected on the same respondents who were living in coastal areas of Indonesia before the 2004 Indian Ocean tsunami and were re-interviewed after the tsunami, this paper illustrates how the combination of population-based survey methods, satellite imagery and multivariate statistical analyses has the potential to provide new insights into vulnerability, mobility and impacts of major disasters on population well-being. The data are used to map and analyze vulnerability to post-tsunami displacement across the provinces of Aceh and North Sumatra and to compare patterns of migration after the tsunami between damaged areas and areas not directly affected by the tsunami. The comparison reveals that migration after a disaster is less selective overall than migration in other contexts. Gender and age, for example, are strong predictors of moving from undamaged areas but are not related to displacement in areas experiencing damage. In our analyses traditional predictors of vulnerability do not always operate in expected directions. Low levels of socioeconomic status and education were not predictive of moving after the tsunami, although for those who did move, they were predictive of displacement to a camp rather than a private home. This survey-based approach, though not without difficulties, is broadly applicable to many topics in human-environment research, and potentially opens the door to rigorous testing of new hypotheses in this literature. PMID:24839300

  5. SDSS-III Baryon Oscillation Spectroscopic Survey data release 12: Galaxy target selection and large-scale structure catalogues

    SciTech Connect

Reid, Beth; Ho, Shirley; Padmanabhan, Nikhil; Percival, Will J.; Tinker, Jeremy; Tojeiro, Rita; White, Martin; Eisenstein, Daniel J.; Maraston, Claudia; Ross, Ashley J.; Sanchez, Ariel G.; Schlegel, David; Sheldon, Erin; Strauss, Michael A.; Thomas, Daniel; Wake, David; Beutler, Florian; Bizyaev, Dmitry; Bolton, Adam S.; Brownstein, Joel R.; Chuang, Chia-Hsun; Dawson, Kyle; Harding, Paul; Kitaura, Francisco-Shu; Leauthaud, Alexie; Masters, Karen; McBride, Cameron K.; More, Surhud; Olmstead, Matthew D.; Oravetz, Daniel; Nuza, Sebastian E.; Pan, Kaike; Parejko, John; Pforr, Janine; Prada, Francisco; Rodriguez-Torres, Sergio; Salazar-Albornoz, Salvador; Samushia, Lado; Schneider, Donald P.; Scoccola, Claudia G.; Simmons, Audrey; Vargas-Magana, Mariana

    2015-11-17

    The Baryon Oscillation Spectroscopic Survey (BOSS), part of the Sloan Digital Sky Survey (SDSS) III project, has provided the largest survey of galaxy redshifts available to date, in terms of both the number of galaxy redshifts measured by a single survey, and the effective cosmological volume covered. Key to analysing the clustering of these data to provide cosmological measurements is understanding the detailed properties of this sample. Potential issues include variations in the target catalogue caused by changes either in the targeting algorithm or properties of the data used, the pattern of spectroscopic observations, the spatial distribution of targets for which redshifts were not obtained, and variations in the target sky density due to observational systematics. We document here the target selection algorithms used to create the galaxy samples that comprise BOSS. We also present the algorithms used to create large-scale structure catalogues for the final Data Release (DR12) samples and the associated random catalogues that quantify the survey mask. The algorithms are an evolution of those used by the BOSS team to construct catalogues from earlier data, and have been designed to accurately quantify the galaxy sample. Furthermore, the code used, designated mksample, is released with this paper.

  6. SDSS-III Baryon Oscillation Spectroscopic Survey data release 12: Galaxy target selection and large-scale structure catalogues

    DOE PAGESBeta

Reid, Beth; Ho, Shirley; Padmanabhan, Nikhil; Percival, Will J.; Tinker, Jeremy; Tojeiro, Rita; White, Martin; Eisenstein, Daniel J.; Maraston, Claudia; Ross, Ashley J.; et al

    2015-11-17

The Baryon Oscillation Spectroscopic Survey (BOSS), part of the Sloan Digital Sky Survey (SDSS) III project, has provided the largest survey of galaxy redshifts available to date, in terms of both the number of galaxy redshifts measured by a single survey, and the effective cosmological volume covered. Key to analysing the clustering of these data to provide cosmological measurements is understanding the detailed properties of this sample. Potential issues include variations in the target catalogue caused by changes either in the targeting algorithm or properties of the data used, the pattern of spectroscopic observations, the spatial distribution of targets for which redshifts were not obtained, and variations in the target sky density due to observational systematics. We document here the target selection algorithms used to create the galaxy samples that comprise BOSS. We also present the algorithms used to create large-scale structure catalogues for the final Data Release (DR12) samples and the associated random catalogues that quantify the survey mask. The algorithms are an evolution of those used by the BOSS team to construct catalogues from earlier data, and have been designed to accurately quantify the galaxy sample. Furthermore, the code used, designated mksample, is released with this paper.

  7. SDSS-III Baryon Oscillation Spectroscopic Survey Data Release 12: galaxy target selection and large-scale structure catalogues

    NASA Astrophysics Data System (ADS)

    Reid, Beth; Ho, Shirley; Padmanabhan, Nikhil; Percival, Will J.; Tinker, Jeremy; Tojeiro, Rita; White, Martin; Eisenstein, Daniel J.; Maraston, Claudia; Ross, Ashley J.; Sánchez, Ariel G.; Schlegel, David; Sheldon, Erin; Strauss, Michael A.; Thomas, Daniel; Wake, David; Beutler, Florian; Bizyaev, Dmitry; Bolton, Adam S.; Brownstein, Joel R.; Chuang, Chia-Hsun; Dawson, Kyle; Harding, Paul; Kitaura, Francisco-Shu; Leauthaud, Alexie; Masters, Karen; McBride, Cameron K.; More, Surhud; Olmstead, Matthew D.; Oravetz, Daniel; Nuza, Sebastián E.; Pan, Kaike; Parejko, John; Pforr, Janine; Prada, Francisco; Rodríguez-Torres, Sergio; Salazar-Albornoz, Salvador; Samushia, Lado; Schneider, Donald P.; Scóccola, Claudia G.; Simmons, Audrey; Vargas-Magana, Mariana

    2016-01-01

    The Baryon Oscillation Spectroscopic Survey (BOSS), part of the Sloan Digital Sky Survey (SDSS) III project, has provided the largest survey of galaxy redshifts available to date, in terms of both the number of galaxy redshifts measured by a single survey, and the effective cosmological volume covered. Key to analysing the clustering of these data to provide cosmological measurements is understanding the detailed properties of this sample. Potential issues include variations in the target catalogue caused by changes either in the targeting algorithm or properties of the data used, the pattern of spectroscopic observations, the spatial distribution of targets for which redshifts were not obtained, and variations in the target sky density due to observational systematics. We document here the target selection algorithms used to create the galaxy samples that comprise BOSS. We also present the algorithms used to create large-scale structure catalogues for the final Data Release (DR12) samples and the associated random catalogues that quantify the survey mask. The algorithms are an evolution of those used by the BOSS team to construct catalogues from earlier data, and have been designed to accurately quantify the galaxy sample. The code used, designated MKSAMPLE, is released with this paper.

  8. Large-scale distribution of surface ozone mixing ratio in southern Mongolia: A survey

    NASA Astrophysics Data System (ADS)

    Meixner, F. X.; Behrendt, T.; Ermel, M.; Hempelmann, N.; Andreae, M. O.; Jöckel, P.

    2012-04-01

For the first time, measurements of surface ozone mixing ratio have been performed from semi-arid steppe to arid/hyper-arid southern Mongolian Gobi desert. During 12-29 August 2009, ozone mixing ratio was continuously measured from a mobile platform (4x4 Furgon SUV). The survey (3060 km / 229,171 sq km) started at the Mongolian capital Ulaan-Baatar (47.9582° N, 107.0190° E), heading to south-west (Echin Gol, 43.2586° N, 99.0255° E), eastward to Dalanzadgad (43.6061° N, 104.4445° E), and finally back to Ulaan-Baatar. Ambient air was sampled (approx. 1 l/min) through a 4 m long PTFE intake line along a forward-facing boom mounted on the roof of the SUV. Ozone mixing ratio was measured by UV spectroscopy using a mobile dual-cell ozone analyzer (model 205, 2B Technologies, Boulder, U.S.A.). While ozone signals were measured every 5 seconds, 1-minute averages and standard deviations were calculated on-line and stored in the data logger. The latter are used to identify and discriminate against unrealistically low or high ozone mixing ratios due to occasionally passing plumes of vehicle exhaust and/or biomass burning gases, as well as gasoline (at gas filling stations). Even under desert conditions, the temporal behaviour of ozone mixing ratio was characterized by considerable and regular diel variations. Minimum mixing ratios (15-25 ppb) occurred early in the morning (approx. 06:00 local), when surface depletion of ozone (by dry deposition) cannot be compensated by supply from the free troposphere due to thermodynamic stability of the nocturnal boundary layer. Late in the afternoon (approx. 17:00 local), under conditions of a turbulently well-mixed convective boundary layer, maximum ozone mixing ratios (45-55 ppb) were reached. Daily amplitudes of the diel cycle of ozone mixing ratio were on the order of 30 ppb (steppe), 20 ppb (arid desert), to approx. 5 ppb (hyper-arid Gobi desert (Shargyn Gobi)). Ozone surface measurements were
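The on-line screening step the abstract describes (1-minute means and standard deviations used to reject readings contaminated by passing plumes) can be sketched as follows; the 3 ppb threshold, function name, and sample values are assumptions for illustration, not from the study:

```python
from statistics import mean, stdev

def minute_screen(samples_ppb, max_std_ppb=3.0):
    """Average 5-second ozone readings into 1-minute means and use the
    within-minute standard deviation to drop minutes contaminated by
    exhaust or smoke plumes. Each group of 12 samples is one minute."""
    kept = []
    for i in range(0, len(samples_ppb) - 11, 12):
        minute = samples_ppb[i:i + 12]
        if stdev(minute) <= max_std_ppb:  # quiet minute: keep its mean
            kept.append(mean(minute))
    return kept

# A clean minute (~40 ppb) followed by a plume-spiked minute: only the
# clean minute's 1-minute average is retained.
clean = [40.0 + 0.1 * (j % 3) for j in range(12)]
spiked = [40.0] * 6 + [120.0] * 6  # vehicle-exhaust style spike
result = minute_screen(clean + spiked)
```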

  9. Implementation of a large-scale hospital information infrastructure for multi-unit health-care services.

    PubMed

    Yoo, Sun K; Kim, Dong Keun; Kim, Jung C; Park, Youn Jung; Chang, Byung Chul

    2008-01-01

    With the increase in demand for high quality medical services, the need for an innovative hospital information system has become essential. An improved system has been implemented in all hospital units of the Yonsei University Health System. Interoperability between multi-units required appropriate hardware infrastructure and software architecture. This large-scale hospital information system encompassed PACS (Picture Archiving and Communications Systems), EMR (Electronic Medical Records) and ERP (Enterprise Resource Planning). It involved two tertiary hospitals and 50 community hospitals. The monthly data production rate by the integrated hospital information system is about 1.8 TByte and the total quantity of data produced so far is about 60 TByte. Large scale information exchange and sharing will be particularly useful for telemedicine applications. PMID:18430292

  10. A global survey of martian central mounds: Central mounds as remnants of previously more extensive large-scale sedimentary deposits

    NASA Astrophysics Data System (ADS)

    Bennett, Kristen A.; Bell, James F.

    2016-01-01

We conducted a survey of central mounds within large (>25 km diameter) impact craters on Mars. We use mound locations, mound offsets within their host craters, and relative mound heights to address and extend various mound formation hypotheses. The results of this survey support the hypothesis that mound sediments once filled their host craters and were later eroded into the features we observe today. The majority of mounds are located near the boundaries of previously identified large-scale sedimentary deposits. We discuss the implications of the hypothesis that central mounds are part of previously more extensive sedimentary units that filled and overtopped underlying impact craters. In this scenario, as erosion of the sedimentary unit occurred, the sediment within impact craters was preserved slightly longer than the overlying sediment because it was sheltered by the crater walls. Our study also reveals that most mounds are offset from the center of their host crater in the same direction as the present regional winds (e.g., the mounds in Arabia Terra are offset towards the western portion of their craters), which implies that wind has been the dominant agent causing the erosion of central mounds. Mound offset (r) is normalized to each crater's radius; mound offset (θ) is measured so that 0° is north and 270° is west.
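The offset conventions in the closing note (radial offset normalized by crater radius, bearing with 0° = north and 270° = west) can be made concrete with a small sketch; the coordinates and function name are hypothetical, not the survey's actual data format:

```python
import math

def mound_offset(crater_xy_km, mound_xy_km, crater_radius_km):
    """Offset of a central mound from its host crater's centre: r is the
    centre-to-centre distance normalized by the crater radius, and theta
    is a compass bearing (0 deg = north, 90 = east, 270 = west).
    Coordinates are assumed to be local (east, north) distances in km."""
    de = mound_xy_km[0] - crater_xy_km[0]  # eastward offset
    dn = mound_xy_km[1] - crater_xy_km[1]  # northward offset
    r = math.hypot(de, dn) / crater_radius_km
    # atan2(east, north) gives a clockwise-from-north compass bearing.
    theta = math.degrees(math.atan2(de, dn)) % 360.0
    return r, theta

# A mound displaced due west by half the crater radius:
r, theta = mound_offset((0.0, 0.0), (-10.0, 0.0), 20.0)  # r = 0.5, theta = 270
```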

  11. Pre- and Postnatal Influences on Preschool Mental Health: A Large-Scale Cohort Study

    ERIC Educational Resources Information Center

    Robinson, Monique; Oddy, Wendy H.; Li, Jianghong; Kendall, Garth E.; de Klerk, Nicholas H.; Silburn, Sven R.; Zubrick, Stephen R.; Newnham, John P.; Stanley, Fiona J.; Mattes, Eugen

    2008-01-01

    Background: Methodological challenges such as confounding have made the study of the early determinants of mental health morbidity problematic. This study aims to address these challenges in investigating antenatal, perinatal and postnatal risk factors for the development of mental health problems in pre-school children in a cohort of Western…

  12. AGN and QSOs in the eROSITA All-Sky Survey. II. The large-scale structure

    NASA Astrophysics Data System (ADS)

    Kolodzig, Alexander; Gilfanov, Marat; Hütsi, Gert; Sunyaev, Rashid

    2013-10-01

The four-year X-ray all-sky survey (eRASS) of the eROSITA telescope aboard the Spektrum-Roentgen-Gamma satellite will detect about 3 million active galactic nuclei (AGN) with a median redshift of z ≈ 1 and a typical luminosity of L(0.5-2.0 keV) ~ 10^44 erg/s. We show that this unprecedented AGN sample, complemented with redshift information, will supply us with outstanding opportunities for large-scale structure research. For the first time, detailed redshift- and luminosity-resolved studies of the bias factor for X-ray selected AGN will become possible. The eRASS AGN sample will not only improve the redshift and luminosity resolution of these studies, but will also expand their luminosity range beyond L(0.5-2.0 keV) ~ 10^44 erg/s, thus enabling a direct comparison of the clustering properties of luminous X-ray AGN and optical quasars. These studies will dramatically improve our understanding of the AGN environment, triggering mechanisms, the growth of supermassive black holes and their co-evolution with dark matter halos. The eRASS AGN sample will become a powerful cosmological probe. It will enable detecting baryonic acoustic oscillations (BAOs) for the first time with X-ray selected AGN. With the data from the entire extragalactic sky, BAO will be detected at a ≳10σ confidence level in the full redshift range and with ~8σ confidence in the 0.8 < z < 2.0 range, which is currently not covered by any existing BAO surveys. To exploit the full potential of the eRASS AGN sample, photometric and spectroscopic surveys of large areas and a sufficient depth will be needed.

  13. The SRG/eROSITA All-Sky Survey: A new era of large-scale structure studies with AGN

    NASA Astrophysics Data System (ADS)

    Kolodzig, Alexander; Gilfanov, Marat; Hütsi, Gert; Sunyaev, Rashid

    2015-08-01

The four-year X-ray All-Sky Survey (eRASS) of the eROSITA telescope aboard the Spektrum-Roentgen-Gamma (SRG) satellite will detect about 3 million active galactic nuclei (AGN) with a median redshift of z ≈ 1 and a typical luminosity of L(0.5-2.0 keV) ~ 10^44 erg/s. We demonstrate that this unprecedented AGN sample, complemented with redshift information, will supply us with outstanding opportunities for large-scale structure (LSS) studies. We show that with this sample of X-ray selected AGN, it will become possible for the first time to perform detailed redshift- and luminosity-resolved studies of AGN clustering. This will enable us to put strong constraints on different AGN triggering/fueling models as a function of AGN environment, which will dramatically improve our understanding of supermassive black hole growth and its correlation with the co-evolving LSS. Further, the eRASS AGN sample will become a powerful cosmological probe. We demonstrate for the first time that, given the breadth and depth of eRASS, it will become possible to convincingly detect baryonic acoustic oscillations (BAOs) with ~8σ confidence in the 0.8 < z < 2.0 range, currently not covered by any existing BAO survey. Finally, we discuss the requirements for follow-up missions and demonstrate that in order to fully exploit the potential of the eRASS AGN sample, photometric and spectroscopic surveys of large areas and a sufficient depth will be needed.

  14. Health-2000: an integrated large-scale expert system for the hospital of the future.

    PubMed

    Boyom, S F; Kwankam, S Y; Asoh, D A; Asaah, C; Kengne, F

    1997-02-01

    Decision making and management are problems which plague health systems in developing countries, particularly in Sub-Saharan Africa where there is significant waste of resources. The need goes beyond national health management information systems, to tools required in daily micro-management of various components of the health system. This paper describes an integrated expert system, Health-2000, an information-oriented tool for acquiring, processing and disseminating medical knowledge, data and decisions in the hospital of the future. It integrates six essential features of the medical care environment: personnel management, patient management, medical diagnosis, laboratory management, propharmacy, and equipment management. Disease conditions covered are the major tropical diseases. An intelligent tutoring feature completes the package. Emphasis is placed on the graphical user interface to facilitate interactions between the user and the system, which is developed for PCs using Pascal, C, Clipper and Prolog. PMID:9242002

  15. Awareness and Concern about Large-Scale Livestock and Poultry: Results from a Statewide Survey of Ohioans

    ERIC Educational Resources Information Center

    Sharp, Jeff; Tucker, Mark

    2005-01-01

    The development of large-scale livestock facilities has become a controversial issue in many regions of the U.S. in recent years. In this research, rural-urban differences in familiarity and concern about large-scale livestock facilities among Ohioans is examined as well as the relationship of social distance from agriculture and trust in risk…

  16. Health Benefits from Large-Scale Ozone Reduction in the United States

    PubMed Central

    Berman, Jesse D.; Fann, Neal; Hollingsworth, John W.; Pinkerton, Kent E.; Rom, William N.; Szema, Anthony M.; Breysse, Patrick N.; White, Ronald H.

    2012-01-01

Background: Exposure to ozone has been associated with adverse health effects, including premature mortality and cardiopulmonary and respiratory morbidity. In 2008, the U.S. Environmental Protection Agency (EPA) lowered the primary (health-based) National Ambient Air Quality Standard (NAAQS) for ozone to 75 ppb, expressed as the fourth-highest daily maximum 8-hr average over a 24-hr period. Based on recent monitoring data, U.S. ozone levels still exceed this standard in numerous locations, resulting in avoidable adverse health consequences. Objectives: We sought to quantify the potential human health benefits from achieving the current primary NAAQS of 75 ppb and two alternative standard levels, 70 and 60 ppb, which represent the range recommended by the U.S. EPA Clean Air Scientific Advisory Committee (CASAC). Methods: We applied health impact assessment methodology to estimate the numbers of deaths and other adverse health outcomes that would have been avoided during 2005, 2006, and 2007 if the current (or lower) NAAQS ozone standards had been met. Estimated reductions in ozone concentrations were interpolated according to geographic area and year, and concentration–response functions were obtained or derived from the epidemiological literature. Results: We estimated that the annual number of avoided ozone-related premature deaths would have been 1,410–2,480 at 75 ppb, 2,450–4,130 at 70 ppb, and 5,210–7,990 at 60 ppb. Acute respiratory symptoms would have been reduced by 3 million cases and school-loss days by 1 million cases annually if the current 75-ppb standard had been attained. Substantially greater health benefits would have resulted if the CASAC-recommended range of standards (70–60 ppb) had been met. Conclusions: Attaining a more stringent primary ozone standard would significantly reduce ozone-related premature mortality and morbidity. PMID:22809899
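
The mortality estimates above come from standard health impact assessment arithmetic: a concentration–response coefficient from the epidemiological literature is applied to the modeled ozone reduction and the exposed population. The sketch below shows the common log-linear form; the population size, baseline mortality rate, and coefficient are hypothetical placeholders, not values from the study:

```python
import math

def avoided_deaths(baseline_rate, population, beta, delta_c):
    """Log-linear health impact function:
    avoided cases = y0 * pop * (1 - exp(-beta * delta_c)),
    where y0 is the baseline mortality rate (cases/person/year),
    beta is the concentration-response coefficient (per ppb), and
    delta_c is the modeled ozone reduction (ppb)."""
    return baseline_rate * population * (1.0 - math.exp(-beta * delta_c))

# Hypothetical inputs (not from the study): a city of 1 million people,
# baseline mortality rate 0.008/yr, beta = 0.00052 per ppb, and a
# 5 ppb ozone reduction needed to attain the standard.
print(round(avoided_deaths(0.008, 1_000_000, 0.00052, 5.0), 1))
```

Summing such stratum-level estimates over geographic areas and years, with coefficients drawn from several epidemiological studies, produces ranges of the kind reported in the abstract.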

  17. Monitoring and Evaluating the Transition of Large-Scale Programs in Global Health

    PubMed Central

    Bao, James; Rodriguez, Daniela C; Paina, Ligia; Ozawa, Sachiko; Bennett, Sara

    2015-01-01

    Purpose: Donors are increasingly interested in the transition and sustainability of global health programs as priorities shift and external funding declines. Systematic and high-quality monitoring and evaluation (M&E) of such processes is rare. We propose a framework and related guiding questions to systematize the M&E of global health program transitions. Methods: We conducted stakeholder interviews, searched the peer-reviewed and gray literature, gathered feedback from key informants, and reflected on author experiences to build a framework on M&E of transition and to develop guiding questions. Findings: The conceptual framework models transition as a process spanning pre-transition and transition itself and extending into sustained services and outcomes. Key transition domains include leadership, financing, programming, and service delivery, and relevant activities that drive the transition in these domains forward include sustaining a supportive policy environment, creating financial sustainability, developing local stakeholder capacity, communicating to all stakeholders, and aligning programs. Ideally transition monitoring would begin prior to transition processes being implemented and continue for some time after transition has been completed. As no set of indicators will be applicable across all types of health program transitions, we instead propose guiding questions and illustrative quantitative and qualitative indicators to be considered and adapted based on the transition domains identified as most important to the particular health program transition. The M&E of transition faces new and unique challenges, requiring measuring constructs to which evaluators may not be accustomed. Many domains hinge on measuring “intangibles” such as the management of relationships. Monitoring these constructs may require a compromise between rigorous data collection and the involvement of key stakeholders. Conclusion: Monitoring and evaluating transitions in global

  18. Automation of Survey Data Processing, Documentation and Dissemination: An Application to Large-Scale Self-Reported Educational Survey.

    ERIC Educational Resources Information Center

    Shim, Eunjae; Shim, Minsuk K.; Felner, Robert D.

Automation of the survey process has proved successful in many industries, yet it is still underused in educational research. This is largely due to two facts: (1) number crunching is usually carried out using software developed before modern information technology existed, and (2) educational research is to a great extent trapped…

  19. Large-Scale Survey Findings Inform Patients’ Experiences in Using Secure Messaging to Engage in Patient-Provider Communication and Self-Care Management: A Quantitative Assessment

    PubMed Central

    Patel, Nitin R; Lind, Jason D; Antinori, Nicole

    2015-01-01

Background Secure email messaging is part of a national transformation initiative in the United States to promote new models of care that support enhanced patient-provider communication. To date, only a limited number of large-scale studies have evaluated users’ experiences in using secure email messaging. Objective To quantitatively assess veteran patients’ experiences in using secure email messaging in a large patient sample. Methods A cross-sectional mail-delivered paper-and-pencil survey study was conducted with a sample of respondents who were registered for the Veterans Health Administration’s Web-based patient portal (My HealtheVet) and had opted to use secure messaging. The survey collected demographic data and assessed computer literacy, health literacy, and secure messaging use. Analyses conducted on survey data included frequencies and proportions, chi-square tests, and one-way analysis of variance. Results The majority of respondents (N=819) reported using secure messaging 6 months or longer (n=499, 60.9%). They reported secure messaging to be helpful for completing medication refills (n=546, 66.7%), managing appointments (n=343, 41.9%), looking up test results (n=350, 42.7%), and asking health-related questions (n=340, 41.5%). Notably, some respondents reported using secure messaging to address sensitive health topics (n=67, 8.2%). Survey responses indicated that younger age (P=.039) and higher levels of education (P=.025) and income (P=.003) were associated with more frequent use of secure messaging. Females were more likely than males to report using secure messaging more often (P=.098). Minorities were more likely than nonminorities to report using secure messaging at least once a month (P=.086). Individuals with higher levels of health literacy reported more frequent use of secure messaging (P=.007), greater satisfaction (P=.002), and indicated that secure messaging is a useful (P=.002) and easy
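
The statistical methods named in the abstract (frequencies, chi-square tests, one-way ANOVA) can be sketched with SciPy. The contingency table and satisfaction scores below are invented for illustration only; they are not the survey's data:

```python
import numpy as np
from scipy import stats

# Hypothetical 2x3 contingency table (not the study's data):
# rows = age group (<65, >=65), columns = secure-messaging use
# (never, monthly, weekly).
table = np.array([[40, 120, 160],
                  [60, 110,  90]])

# Chi-square test of independence between age group and use frequency
chi2, p, dof, expected = stats.chi2_contingency(table)
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.4f}")

# One-way ANOVA comparing simulated satisfaction scores across three
# education levels (again, illustrative data only)
rng = np.random.default_rng(0)
groups = [rng.normal(loc=m, scale=1.0, size=50) for m in (3.0, 3.2, 3.6)]
f_stat, p_anova = stats.f_oneway(*groups)
print(f"F={f_stat:.2f}, p={p_anova:.4f}")
```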

  20. A Conceptual Framework for Allocation of Federally Stockpiled Ventilators During Large-Scale Public Health Emergencies.

    PubMed

    Zaza, Stephanie; Koonin, Lisa M; Ajao, Adebola; Nystrom, Scott V; Branson, Richard; Patel, Anita; Bray, Bruce; Iademarco, Michael F

    2016-01-01

    Some types of public health emergencies could result in large numbers of patients with respiratory failure who need mechanical ventilation. Federal public health planning has included needs assessment and stockpiling of ventilators. However, additional federal guidance is needed to assist states in further allocating federally supplied ventilators to individual hospitals to ensure that ventilators are shipped to facilities where they can best be used during an emergency. A major consideration in planning is a hospital's ability to absorb additional ventilators, based on available space and staff expertise. A simple pro rata plan that does not take these factors into account might result in suboptimal use or unused scarce resources. This article proposes a conceptual framework that identifies the steps in planning and an important gap in federal guidance regarding the distribution of stockpiled mechanical ventilators during an emergency. PMID:26828799

  1. Engaging in large-scale digital health technologies and services. What factors hinder recruitment?

    PubMed

    O'Connor, Siobhan; Mair, Frances S; McGee-Lennon, Marilyn; Bouamrane, Matt-Mouley; O'Donnell, Kate

    2015-01-01

    Implementing consumer oriented digital health products and services at scale is challenging and a range of barriers to reaching and recruiting users to these types of solutions can be encountered. This paper describes the experience of implementers with the rollout of the Delivering Assisted Living Lifestyles at Scale (dallas) programme. The findings are based on qualitative analysis of baseline and midpoint interviews and project documentation. Eight main themes emerged as key factors which hindered participation. These include how the dallas programme was designed and operationalised, constraints imposed by partnerships, technology, branding, and recruitment strategies, as well as challenges with the development cycle and organisational culture. PMID:25991155

  2. Perspectives on Clinical Informatics: Integrating Large-Scale Clinical, Genomic, and Health Information for Clinical Care

    PubMed Central

    Choi, In Young; Kim, Tae-Min; Kim, Myung Shin; Mun, Seong K.

    2013-01-01

    The advances in electronic medical records (EMRs) and bioinformatics (BI) represent two significant trends in healthcare. The widespread adoption of EMR systems and the completion of the Human Genome Project developed the technologies for data acquisition, analysis, and visualization in two different domains. The massive amount of data from both clinical and biology domains is expected to provide personalized, preventive, and predictive healthcare services in the near future. The integrated use of EMR and BI data needs to consider four key informatics areas: data modeling, analytics, standardization, and privacy. Bioclinical data warehouses integrating heterogeneous patient-related clinical or omics data should be considered. The representative standardization effort by the Clinical Bioinformatics Ontology (CBO) aims to provide uniquely identified concepts to include molecular pathology terminologies. Since individual genome data are easily used to predict current and future health status, different safeguards to ensure confidentiality should be considered. In this paper, we focused on the informatics aspects of integrating the EMR community and BI community by identifying opportunities, challenges, and approaches to provide the best possible care service for our patients and the population. PMID:24465229

  3. Perspectives on clinical informatics: integrating large-scale clinical, genomic, and health information for clinical care.

    PubMed

    Choi, In Young; Kim, Tae-Min; Kim, Myung Shin; Mun, Seong K; Chung, Yeun-Jun

    2013-12-01

    The advances in electronic medical records (EMRs) and bioinformatics (BI) represent two significant trends in healthcare. The widespread adoption of EMR systems and the completion of the Human Genome Project developed the technologies for data acquisition, analysis, and visualization in two different domains. The massive amount of data from both clinical and biology domains is expected to provide personalized, preventive, and predictive healthcare services in the near future. The integrated use of EMR and BI data needs to consider four key informatics areas: data modeling, analytics, standardization, and privacy. Bioclinical data warehouses integrating heterogeneous patient-related clinical or omics data should be considered. The representative standardization effort by the Clinical Bioinformatics Ontology (CBO) aims to provide uniquely identified concepts to include molecular pathology terminologies. Since individual genome data are easily used to predict current and future health status, different safeguards to ensure confidentiality should be considered. In this paper, we focused on the informatics aspects of integrating the EMR community and BI community by identifying opportunities, challenges, and approaches to provide the best possible care service for our patients and the population. PMID:24465229

  4. LARGE-SCALE STAR-FORMATION-DRIVEN OUTFLOWS AT 1 < z < 2 IN THE 3D-HST SURVEY

    SciTech Connect

    Lundgren, Britt F.; Van Dokkum, Pieter; Bezanson, Rachel; Momcheva, Ivelina; Nelson, Erica; Skelton, Rosalind E.; Wake, David; Whitaker, Katherine; Brammer, Gabriel; Franx, Marijn; Fumagalli, Mattia; Labbe, Ivo; Patel, Shannon; Da Cunha, Elizabete; Rix, Hans Walter; Schmidt, Kasper; Erb, Dawn K.; Fan Xiaohui; Kriek, Mariska; Marchesini, Danilo; and others

    2012-11-20

We present evidence of large-scale outflows from three low-mass (log(M*/M_Sun) ≈ 9.75) star-forming (SFR > 4 M_Sun yr^-1) galaxies observed at z = 1.24, z = 1.35, and z = 1.75 in the 3D-HST Survey. Each of these galaxies is located within a projected physical distance of 60 kpc around the sight line to the quasar SDSS J123622.93+621526.6, which exhibits well-separated strong (W_r(λ2796) ≳ 0.8 Å) Mg II absorption systems matching precisely to the redshifts of the three galaxies. We derive the star formation surface densities from the Hα emission in the WFC3 G141 grism observations for the galaxies and find that in each case the star formation surface density well exceeds 0.1 M_Sun yr^-1 kpc^-2, the typical threshold for starburst galaxies in the local universe. From a small but complete parallel census of the 0.65 < z < 2.6 galaxies with H_140 ≲ 24 proximate to the quasar sight line, we detect Mg II absorption associated with galaxies extending to physical distances of 130 kpc. We determine that the W_r > 0.8 Å Mg II covering fraction of star-forming galaxies at 1 < z < 2 may be as large as unity on scales extending to at least 60 kpc, providing early constraints on the typical extent of starburst-driven winds around galaxies at this redshift. Our observations additionally suggest that the azimuthal distribution of W_r > 0.4 Å Mg II absorbing gas around star-forming galaxies may evolve from z ≈ 2 to the present, consistent with recent observations of an increasing collimation of star-formation-driven outflows with time from z ≈ 3.

  5. Large-Scale Disasters

    NASA Astrophysics Data System (ADS)

    Gad-El-Hak, Mohamed

    "Extreme" events - including climatic events, such as hurricanes, tornadoes, and drought - can cause massive disruption to society, including large death tolls and property damage in the billions of dollars. Events in recent years have shown the importance of being prepared and that countries need to work together to help alleviate the resulting pain and suffering. This volume presents a review of the broad research field of large-scale disasters. It establishes a common framework for predicting, controlling and managing both manmade and natural disasters. There is a particular focus on events caused by weather and climate change. Other topics include air pollution, tsunamis, disaster modeling, the use of remote sensing and the logistics of disaster management. It will appeal to scientists, engineers, first responders and health-care professionals, in addition to graduate students and researchers who have an interest in the prediction, prevention or mitigation of large-scale disasters.

  6. Large-scale latitude distortions of the inner Milky Way disk from the Herschel/Hi-GAL Survey

    NASA Astrophysics Data System (ADS)

    Molinari, S.; Noriega-Crespo, A.; Bally, J.; Moore, T. J. T.; Elia, D.; Schisano, E.; Plume, R.; Swinyard, B.; Di Giorgio, A. M.; Pezzuto, S.; Benedettini, M.; Testi, L.

    2016-04-01

-infrared catalogues are filtered according to criteria that primarily select Young Stellar Objects (YSOs). Conclusions: The distortions of the Galactic inner disk revealed by Herschel confirm previous findings from CO surveys and HII/OB source counts, but with much greater statistical significance, and are interpreted as large-scale bending modes of the plane. The lack of similar distortions in tracers of more evolved YSOs or stars rules out gravitational instabilities or satellite-induced perturbations, because these should act on both the diffuse and stellar disk components. We propose that the observed bends are caused by incoming flows of extra-planar gas from the Galactic fountain or the Galactic halo interacting with the gaseous disk. With a much lower cross-section, stars decouple from the gaseous ISM and relax into the stellar disk potential. The timescales required for the distortions to disappear between the diffuse ISM and the relatively evolved YSO stages are compatible with star formation timescales.

  7. A LARGE-SCALE CLUSTER RANDOMIZED TRIAL TO DETERMINE THE EFFECTS OF COMMUNITY-BASED DIETARY SODIUM REDUCTION – THE CHINA RURAL HEALTH INITIATIVE SODIUM REDUCTION STUDY

    PubMed Central

    Li, Nicole; Yan, Lijing L.; Niu, Wenyi; Labarthe, Darwin; Feng, Xiangxian; Shi, Jingpu; Zhang, Jianxin; Zhang, Ruijuan; Zhang, Yuhong; Chu, Hongling; Neiman, Andrea; Engelgau, Michael; Elliott, Paul; Wu, Yangfeng; Neal, Bruce

    2013-01-01

Background Cardiovascular diseases are the leading cause of death and disability in China. High blood pressure caused by excess intake of dietary sodium is widespread, and an effective sodium reduction program has the potential to improve cardiovascular health. Design This study is a large-scale, cluster-randomized trial conducted in five Northern Chinese provinces. Two counties were selected from each province and 12 townships in each county, making a total of 120 clusters. Within each township, one village was selected for participation, with 1:1 randomization stratified by county. The sodium reduction intervention comprises community health education and a food supply strategy based upon providing access to salt substitute. The price of salt substitute was subsidized in 30 randomly selected intervention villages. Control villages continued usual practices. The primary outcome for the study is dietary sodium intake estimated from assays of 24-hour urine. Trial status The trial recruited and randomized 120 townships in April 2011. The sodium reduction program commenced in the 60 intervention villages between May and June of that year, with outcome surveys scheduled for October to December 2012. Baseline data collection shows that randomization achieved good balance across groups. Discussion The establishment of the China Rural Health Initiative has enabled the launch of this large-scale trial designed to identify a novel, scalable strategy for reduction of dietary sodium and control of blood pressure. If proved effective, the intervention could plausibly be implemented at low cost in large parts of China and other countries worldwide. PMID:24176436

  8. Evaluating a Large-Scale Community-Based Intervention to Improve Pregnancy and Newborn Health Among the Rural Poor in India.

    PubMed

    Acharya, Arnab; Lalwani, Tanya; Dutta, Rahul; Rajaratnam, Julie Knoll; Ruducha, Jenny; Varkey, Leila Caleb; Wunnava, Sita; Menezes, Lysander; Taylor, Catharine; Bernson, Jeff

    2015-01-01

    Objectives. We evaluated the effectiveness of the Sure Start project, which was implemented in 7 districts of Uttar Pradesh, India, to improve maternal and newborn health. Methods. Interventions were implemented at 2 randomly assigned levels of intensity. Forty percent of the areas received a more intense intervention, including community-level meetings with expectant mothers. A baseline survey consisted of 12 000 women who completed pregnancy in 2007; a follow-up survey was conducted for women in 2010 in the same villages. Our quantitative analyses provide an account of the project's impact. Results. We observed significant health improvements in both intervention areas over time; in the more intensive intervention areas, we found greater improvements in care-seeking and healthy behaviors. The more intensive intervention areas did not experience a significantly greater decline in neonatal mortality. Conclusions. This study demonstrates that community-based efforts, especially mothers' group meetings designed to increase care-seeking and healthy behaviors, are effective and can be implemented at large scale. PMID:25393175

  9. Prevalence and determinants of child maltreatment among high school students in Southern China: A large scale school based survey

    PubMed Central

    Leung, Phil WS; Wong, William CW; Chen, WQ; Tang, Catherine SK

    2008-01-01

Background Child maltreatment can cause significant physical and psychological problems. The present study aimed to investigate the prevalence and determinants of child maltreatment in Guangzhou, China, where such issues are often considered a taboo subject. Methods A school-based survey was conducted in southern China in 2005. Twenty-four high schools were selected using a stratified random sampling strategy based on their districts and bandings. The self-administered, validated Chinese version of the parent-child Conflict Tactics Scale (CTSPC) was used as the main assessment tool to measure the abusive experiences encountered by students in the previous six months. Results The response rate of this survey was 99.7%. Among the 6592 responding students, the mean age was 14.68 years. Prevalences of parental psychological aggression, corporal punishment, and severe and very severe physical maltreatment in the past 6 months were 78.3%, 23.2%, 15.1% and 2.8%, respectively. The prevalence of sexual abuse was 0.6%. The most commonly cited reasons for maltreatment included 'disobedience to parents', 'poor academic performance', and 'quarrelling between parents'. Age, parental education, place of origin and type of housing were found to be associated with physical maltreatment, whereas gender and fathers' education level were associated with sexual abuse. Conclusion Though largely unspoken, child maltreatment is a common problem in China. Identification of significant determinants in this study can provide valuable information for teachers and health professionals so as to pay special attention to those at-risk children. PMID:18823544

  10. Prevalence of disability in Manikganj district of Bangladesh: results from a large-scale cross-sectional survey

    PubMed Central

    Zaman, M Mostafa; Mashreky, Saidur Rahman

    2016-01-01

    Objective To conduct a comprehensive survey on disability to determine the prevalence and distribution of cause-specific disability among residents of the Manikganj district in Bangladesh. Methods The survey was conducted in Manikganj, a typical district in Bangladesh, in 2009. Data were collected from 37 030 individuals of all ages. Samples were drawn from 8905 households from urban and rural areas proportionate to population size. Three sets of interviewer-administered questionnaires were used separately for age groups 0–1 years, 2–10 years and 11 years and above to collect data. For the age groups 0–1 years and 2–10 years, the parents or the head of the household were interviewed to obtain the responses. Impairments, activity limitations and restriction of participation were considered in defining disability consistent with the International Classification of Functioning, Disability and Health framework. Results Overall, age-standardised prevalence of disability per 1000 was 46.5 (95% CI 44.4 to 48.6). Prevalence was significantly higher among respondents living in rural areas (50.2; 95% CI 47.7 to 52.7) than in urban areas (31.0; 95% CI 27.0 to 35.0). Overall, female respondents had more disability (50.0; 95% CI 46.9 to 53.1) than male respondents (43.4; 95% CI 40.5 to 46.3). Educational deprivation was closely linked to higher prevalence of disability. Commonly reported prevalences (per 1000) for underlying causes of disability were 20.2 for illness, followed by 9.4 for congenital causes and 6.8 for injury, and these were consistent in males and females. Conclusions Disability is a common problem in this typical district of Bangladesh, which is largely generalisable. Interventions at community level with special attention to the socioeconomically deprived are warranted. PMID:27431897
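
Age-standardised prevalences such as the 46.5 per 1,000 reported here are typically computed by direct standardisation: stratum-specific rates are weighted by a reference population's age distribution. A minimal sketch with hypothetical strata (not the Manikganj data):

```python
# Direct age-standardisation of a prevalence per 1,000. All numbers
# here are hypothetical placeholders, not the survey's data.
strata = [
    # (cases, sample_n, standard_population_weight)
    (30, 2000, 0.35),   # 0-14 years
    (55, 4000, 0.45),   # 15-49 years
    (90, 1500, 0.20),   # 50+ years
]

def age_standardised_rate_per_1000(strata):
    # Weighted sum of stratum-specific rates, with weights taken from
    # a reference ("standard") population and summing to 1.
    return 1000 * sum(w * cases / n for cases, n, w in strata)

print(round(age_standardised_rate_per_1000(strata), 1))
```

The weighting removes differences in age structure between groups, which is what makes the urban/rural and male/female comparisons in the abstract meaningful.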

  11. National Health Care Survey

    Cancer.gov

    This survey encompasses a family of health care provider surveys, including information about the facilities that supply health care, the services rendered, and the characteristics of the patients served.

  12. Large-scale monitoring of shorebird populations using count data and N-mixture models: Black Oystercatcher (Haematopus bachmani) surveys by land and sea

    USGS Publications Warehouse

Lyons, James E.; Royle, J. Andrew; Thomas, Susan M.; Elliott-Smith, Elise; Evenson, Joseph R.; Kelly, Elizabeth G.; Milner, Ruth L.; Nysewander, David R.; Andres, Brad A.

    2012-01-01

    Large-scale monitoring of bird populations is often based on count data collected across spatial scales that may include multiple physiographic regions and habitat types. Monitoring at large spatial scales may require multiple survey platforms (e.g., from boats and land when monitoring coastal species) and multiple survey methods. It becomes especially important to explicitly account for detection probability when analyzing count data that have been collected using multiple survey platforms or methods. We evaluated a new analytical framework, N-mixture models, to estimate actual abundance while accounting for multiple detection biases. During May 2006, we made repeated counts of Black Oystercatchers (Haematopus bachmani) from boats in the Puget Sound area of Washington (n = 55 sites) and from land along the coast of Oregon (n = 56 sites). We used a Bayesian analysis of N-mixture models to (1) assess detection probability as a function of environmental and survey covariates and (2) estimate total Black Oystercatcher abundance during the breeding season in the two regions. Probability of detecting individuals during boat-based surveys was 0.75 (95% credible interval: 0.42–0.91) and was not influenced by tidal stage. Detection probability from surveys conducted on foot was 0.68 (0.39–0.90); the latter was not influenced by fog, wind, or number of observers but was ~35% lower during rain. The estimated population size was 321 birds (262–511) in Washington and 311 (276–382) in Oregon. N-mixture models provide a flexible framework for modeling count data and covariates in large-scale bird monitoring programs designed to understand population change.
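
The paper's N-mixture approach models true site abundance as Poisson and the repeated counts as binomial thinning by the detection probability. The authors used a Bayesian analysis; the sketch below instead shows a minimal maximum-likelihood version of the same binomial–Poisson model fit to simulated data, with all parameter values hypothetical:

```python
import numpy as np
from scipy import stats, optimize

def neg_log_lik(params, counts, n_max=100):
    """Binomial-Poisson N-mixture likelihood: N_i ~ Poisson(lambda),
    repeated counts y_ij ~ Binomial(N_i, p). The unobserved N_i are
    marginalized by summing over 0..n_max."""
    lam, logit_p = params[0], params[1]
    p = 1.0 / (1.0 + np.exp(-logit_p))
    if lam <= 0:
        return np.inf
    Ns = np.arange(n_max + 1)
    prior = stats.poisson.pmf(Ns, lam)
    ll = 0.0
    for y in counts:  # y = vector of repeated counts at one site
        lik_N = prior * np.prod(stats.binom.pmf(y[:, None], Ns, p), axis=0)
        ll += np.log(lik_N.sum())
    return -ll

# Simulate 55 sites with 3 visits each: lambda = 6 birds/site,
# per-individual detection probability p = 0.7 (hypothetical values).
rng = np.random.default_rng(1)
N = rng.poisson(6, size=55)
counts = [rng.binomial(n, 0.7, size=3) for n in N]

res = optimize.minimize(neg_log_lik, x0=[3.0, 0.0], args=(counts,),
                        method="Nelder-Mead")
lam_hat = res.x[0]
p_hat = 1.0 / (1.0 + np.exp(-res.x[1]))
print(f"lambda_hat={lam_hat:.2f}, p_hat={p_hat:.2f}")
```

Total abundance is then estimated as the number of sites times the fitted lambda, which is how count data corrected for imperfect detection yield the population sizes quoted in the abstract.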

  13. Adult Siblings of Individuals with Down Syndrome versus with Autism: Findings from a Large-Scale US Survey

    ERIC Educational Resources Information Center

    Hodapp, R. M.; Urbano, R. C.

    2007-01-01

    Background: As adults with Down syndrome live increasingly longer lives, their adult siblings will most likely assume caregiving responsibilities. Yet little is known about either the sibling relationship or the general functioning of these adult siblings. Using a national, web-based survey, this study compared adult siblings of individuals with…

  14. Large-Scale Survey of Chinese Precollege Students' Epistemological Beliefs about Physics: A Progression or a Regression?

    ERIC Educational Resources Information Center

    Zhang, Ping; Ding, Lin

    2013-01-01

This paper reports a cross-grade comparative study of Chinese precollege students' epistemological beliefs about physics, using the Colorado Learning Attitudes about Science Survey (CLASS). Our students of interest are middle and high schoolers taking traditional lecture-based physics as a mandatory science course each year from the 8th grade…

  15. An evaluation of two large scale demand side financing programs for maternal health in India: the MATIND study protocol

    PubMed Central

    2012-01-01

Background High maternal mortality in India is a serious public health challenge. Demand side financing interventions have emerged as a strategy to promote access to emergency obstetric care. Two such state run programs, Janani Suraksha Yojana (JSY) and Chiranjeevi Yojana (CY), were designed and implemented to reduce financial access barriers that preclude women from obtaining emergency obstetric care. JSY, a conditional cash transfer, awards money directly to a woman who delivers in a public health facility. This will be studied in Madhya Pradesh province. CY, a voucher based program, empanels private obstetricians in Gujarat province, who are reimbursed by the government to perform deliveries of socioeconomically disadvantaged women. The programs have been in operation for the last seven years. Methods/designs The study outlined in this protocol will assess and compare the influence of the two programs on various aspects of maternal health care including trends in program uptake, institutional delivery rates, maternal and neonatal outcomes, quality of care, experiences of service providers and users, and cost effectiveness. The study will collect primary data using a combination of qualitative and quantitative methods, including facility level questionnaires, observations, a population based survey, in-depth interviews, and focus group discussions. Primary data will be collected in three districts of each province. Discussion The protocol is a comprehensive assessment of the performance and impact of the programs and an economic analysis. It will fill existing evidence gaps in the scientific literature including access and quality to services, utilization, coverage and impact. The implementation of the protocol will also generate evidence to facilitate decision making among policy makers and

  16. Assessing large-scale surveyor variability in the historic forest data of the original U.S. Public Land Survey

    USGS Publications Warehouse

    Manies, K.L.; Mladenoff, D.J.; Nordheim, E.V.

    2001-01-01

    The U.S. General Land Office Public Land Survey (PLS) records are a valuable resource for studying pre-European settlement vegetation. However, these data were taken for legal, not ecological, purposes. In turn, the instructions the surveyors followed affected the data collected. For this reason, it has been suggested that the PLS data may not truly represent the surveyed landscapes. This study examined the PLS data of northern Wisconsin, U.S.A., to determine the extent of variability among surveyors. We statistically tested for differences among surveyors in recorded tree species, size, location, and distance from the survey point. While we cannot rule out effects from other influences (e.g., environmental factors), we found evidence suggesting some level of surveyor bias for four of five variables, including tree species and size. The PLS data remain one of the best records of pre-European settlement vegetation available. However, based on our findings, we recommend that projects using PLS records examine these data carefully. This assessment should include not only the choice of variables to be studied but also the spatial extent at which the data will be examined.

  17. Abuse of Medications Employed for the Treatment of ADHD: Results From a Large-Scale Community Survey

    PubMed Central

    Bright, George M.

    2008-01-01

    Objective The objective is to assess abuse of prescription and illicit stimulants among individuals being treated for attention-deficit/hyperactivity disorder (ADHD). Methods A survey was distributed to patients enrolled in an ADHD treatment center. It included questions designed to gain information about demographics; ADHD treatment history; illicit drug use; and misuse of prescribed stimulant medications, including type of stimulant medication most frequently misused or abused, and how the stimulant was prepared and administered. Results A total of 545 subjects (89.2% with ADHD) were included in the survey. Results indicated that 14.3% of respondents abused prescription stimulants. Of these, 79.8% abused short-acting agents; 17.2% abused long-acting stimulants; 2.0% abused both short- and long-acting agents; and 1.0% abused other agents. The specific medications abused most often were mixed amphetamine salts (Adderall; 40.0%), mixed amphetamine salts extended release (Adderall XR; 14.2%), and methylphenidate (Ritalin; 15.0%), and the most common manner of stimulant abuse was crushing pills and snorting (75.0%). Survey results also showed that 39.1% of respondents used nonprescription stimulants, most often cocaine (62.2%), methamphetamine (4.8%), and both cocaine and amphetamine (31.1%). Choice of illicit drug was based on rapidity of high onset (43.5%), ease of acquisition (40.7%), ease of use (10.2%), and cost (5.5%). Conclusions The risks for abuse of prescription and illicit stimulants are elevated among individuals being treated in an ADHD clinic. Prescription agents used most often are those with pharmacologic and pharmacokinetic characteristics that provide a rapid high. This suggests that long-acting stimulant preparations that have been developed for the treatment of ADHD may have lower abuse potential than short-acting formulations. PMID:18596945

  18. Future Large-Scale Surveys of 'Interesting' Stars in The Halo and Thick Disk of the Galaxy

    NASA Astrophysics Data System (ADS)

    Beers, T. C.

    The age of slow, methodical, star-by-star, single-slit spectroscopic observations of rare stars in the halo and thick disk of the Milky Way has come to an end. As the result of the labors of numerous astronomers over the past 40 years, spectroscopic data for some 2000 stars with metallicity less than [Fe/H] = -1.5 have been obtained. Under the assumption of a constant flux of astronomers working in this area (and taking 50 major players over the years), the long-term average yield works out to ONE (1!) such star per astronomer per year. The use of new spectroscopic and photometric survey techniques which obtain large sky coverage to faint magnitudes will enable substantially better "return on investment" in the near future. We review the present state of surveys for low metallicity and field horizontal-branch stars in the Galaxy, and describe several new lines of attack which should open the way to a more than one hundred-fold increase in the numbers of interesting stars with available spectroscopic and photometric information.

  19. Large-scale survey of Chinese precollege students' epistemological beliefs about physics: A progression or a regression?

    NASA Astrophysics Data System (ADS)

    Zhang, Ping; Ding, Lin

    2013-06-01

    This paper reports a cross-grade comparative study of Chinese precollege students’ epistemological beliefs about physics using the Colorado Learning Attitudes Survey about Sciences (CLASS). Our students of interest are middle and high schoolers taking traditional lecture-based physics as a mandatory science course each year from the 8th grade to the 12th grade in China. The original CLASS was translated into Mandarin through a rigorous transadaptation process, and then it was administered as a pencil-and-paper in-class survey to a total of 1318 students across all five grade levels (8-12). Our results showed that although in general student epistemological beliefs became less expertlike after receiving more years of traditional instruction (a trend consistent with what was reported in the previous literature), the cross-grade change was not a monotonic decrease. Instead, students at grades 9 and 12 showed a slight positive shift in their beliefs measured by CLASS. Particularly, when compared to the 8th graders, students at the 9th grade demonstrated a significant increase in their views about the conceptual nature of physics and problem-solving sophistication. We hypothesize that both pedagogical and nonpedagogical factors may have contributed to these positive changes. Our results cast light on the complex nature of the relationship between formal instruction and student epistemological beliefs.

  20. The large scale structure of the Universe revealed with high redshift emission-line galaxies: implications for future surveys

    NASA Astrophysics Data System (ADS)

    Antonino Orsi, Alvaro

    2015-08-01

    Nebular emission in galaxies traces their star-formation activity within the last 10 Myr or so. Hence, these objects are typically found in the outskirts of massive clusters, where environmental effects can otherwise effectively stop the star formation process. In this talk I discuss the nature of emission-line galaxies (ELGs) and the implications for their clustering properties. To account for the relevant physical ingredients that produce nebular emission, I combine semi-analytical models of galaxy formation with a radiative transfer code for Ly-alpha photons and the photoionization and shock code MAPPINGS-III. As a result, the clustering strength of ELGs is found to correlate only weakly with line luminosity. Their 2D clustering displays a weak finger-of-god effect, and the clustering on linear scales is affected by assembly bias. I review the implications of the nature of this galaxy population for future large spectroscopic surveys targeting ELGs to extract cosmological results. In particular, I present forecasts for the ELG population in J-PAS, an 8000 deg^2 survey with 54 narrow-band filters covering the optical range, expected to start in 2016.

  1. A Large-Scale, Low-Frequency Murchison Widefield Array Survey of Galactic H ii Regions between 260 < l < 340

    NASA Astrophysics Data System (ADS)

    Hindson, L.; Johnston-Hollitt, M.; Hurley-Walker, N.; Callingham, J. R.; Su, H.; Morgan, J.; Bell, M.; Bernardi, G.; Bowman, J. D.; Briggs, F.; Cappallo, R. J.; Deshpande, A. A.; Dwarakanath, K. S.; For, B.-Q.; Gaensler, B. M.; Greenhill, L. J.; Hancock, P.; Hazelton, B. J.; Kapińska, A. D.; Kaplan, D. L.; Lenc, E.; Lonsdale, C. J.; Mckinley, B.; McWhirter, S. R.; Mitchell, D. A.; Morales, M. F.; Morgan, E.; Oberoi, D.; Offringa, A.; Ord, S. M.; Procopio, P.; Prabu, T.; Shankar, N. Udaya; Srivani, K. S.; Staveley-Smith, L.; Subrahmanyan, R.; Tingay, S. J.; Wayth, R. B.; Webster, R. L.; Williams, A.; Williams, C. L.; Wu, C.; Zheng, Q.

    2016-05-01

    We have compiled a catalogue of H ii regions detected with the Murchison Widefield Array between 72 and 231 MHz. The multiple frequency bands provided by the Murchison Widefield Array allow us to identify the characteristic spectrum generated by the thermal bremsstrahlung process in H ii regions. We detect 306 H ii regions between 260° < l < 340° and report the positions, sizes, peak and integrated flux densities, and spectral indices of these H ii regions. By identifying the point at which H ii regions transition from the optically thin to the optically thick regime, we derive physical properties, including the electron density, ionised gas mass, and ionising photon flux, towards 61 H ii regions. This catalogue of H ii regions represents the most extensive and uniform low-frequency survey of H ii regions in the Galaxy to date.
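
    The thin-to-thick transition exploited in this abstract has a simple closed form under the standard Mezger & Henderson (1967) approximation for the free-free optical depth, τ ≈ 3.28×10⁻⁷ (Tₑ/10⁴ K)⁻¹·³⁵ (ν/GHz)⁻²·¹ (EM/pc cm⁻⁶). A hedged sketch (not the authors' actual pipeline; the emission-measure values below are invented for illustration) of solving τ = 1 for the turnover frequency:

    ```python
    import math

    def freefree_turnover_ghz(em_pc_cm6, te_k=1.0e4):
        """Frequency (GHz) at which the free-free optical depth reaches unity,
        using the Mezger & Henderson (1967) approximation
        tau ~ 3.28e-7 (Te/1e4 K)^-1.35 (nu/GHz)^-2.1 (EM/pc cm^-6)."""
        coeff = 3.28e-7 * (te_k / 1.0e4) ** -1.35 * em_pc_cm6
        # Setting tau = 1 and solving for nu gives nu = coeff^(1/2.1)
        return coeff ** (1.0 / 2.1)

    # A hypothetical H ii region with EM = 5.7e4 pc cm^-6 turns over
    # near 150 MHz, i.e. inside the 72-231 MHz band surveyed here:
    print(round(freefree_turnover_ghz(5.7e4), 2))  # -> 0.15 (GHz)
    ```

    Denser regions (larger EM) turn over at higher frequencies, which is why only a subset of the catalogued regions show the transition within the observed band.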

  2. Large scale dynamic systems

    NASA Technical Reports Server (NTRS)

    Doolin, B. F.

    1975-01-01

    Classes of large scale dynamic systems were discussed in the context of modern control theory. Specific examples discussed were in the technical fields of aeronautics, water resources and electric power.

  3. DEMOGRAPHIC AND HEALTH SURVEYS

    EPA Science Inventory

    Demographic and Health Surveys are nationally representative household surveys with large sample sizes of between 5,000 and 30,000 households, typically. DHS surveys provide data for a wide range of monitoring and impact evaluation indicators in the areas of population, health, a...

  4. Evaluation of airborne geophysical surveys for large-scale mapping of contaminated mine pools: draft final report

    SciTech Connect

    Hammack, R. W.

    2006-12-28

    Decades of underground coal mining have left about 5,000 square miles of abandoned mine workings that are rapidly filling with water. The water quality of mine pools is often poor; environmental regulatory agencies are concerned because water from mine pools could contaminate diminishing surface and groundwater supplies. Mine pools are also a threat to the safety of current mining operations. Conversely, mine pools are a large, untapped water resource that, with treatment, could be used for a variety of industrial purposes. Others have proposed using mine pools in conjunction with heat pumps as a source of heating and cooling for large industrial facilities. The management or use of mine pool water requires accurate maps of mine pools. West Virginia University has predicted the likely location and volume of mine pools in the Pittsburgh Coalbed using existing mine maps, structure contour maps, and measured mine pool elevations. Unfortunately, mine maps only reflect conditions at the time of mining, are not available for all mines, and do not always denote the maximum extent of mining. Since 1999, the National Energy Technology Laboratory (NETL) has been evaluating helicopter-borne, electromagnetic sensing technologies for the detection and mapping of mine pools. Frequency domain electromagnetic sensors are able to detect shallow mine pools (depth < 50 m) if there is sufficient contrast between the conductance of the mine pool and the conductance of the overburden. The mine pools (conductors) most confidently detected by this technology are overlain by thick, resistive sandstone layers. In 2003, a helicopter time domain electromagnetic sensor was applied to mined areas in southwestern Virginia in an attempt to increase the depth of mine pool detection. This study failed because the mine pool targets were thin and not very conductive. 
Also, large areas of the surveys were degraded or made unusable by excessive amounts of cultural electromagnetic noise that obscured the

  5. Galaxy clustering on large scales.

    PubMed

    Efstathiou, G

    1993-06-01

    I describe some recent observations of large-scale structure in the galaxy distribution. The best constraints come from two-dimensional galaxy surveys and studies of angular correlation functions. Results from galaxy redshift surveys are much less precise but are consistent with the angular correlations, provided the distortions in mapping between real space and redshift space are relatively weak. The galaxy two-point correlation function, rich-cluster two-point correlation function, and galaxy-cluster cross-correlation function are all well described on large scales (≳ 20 h⁻¹ Mpc, where the Hubble constant H₀ = 100h km s⁻¹ Mpc⁻¹; 1 pc = 3.09 × 10¹⁶ m) by the power spectrum of an initially scale-invariant, adiabatic, cold-dark-matter Universe with Γ = Ωh ≈ 0.2. I discuss how this fits in with the Cosmic Background Explorer (COBE) satellite detection of large-scale anisotropies in the microwave background radiation and other measures of large-scale structure in the Universe. PMID:11607400

  6. Galaxy evolution and large-scale structure in the far-infrared. II - The IRAS faint source survey

    NASA Technical Reports Server (NTRS)

    Lonsdale, Carol J.; Hacking, Perry B.; Conrow, T. P.; Rowan-Robinson, M.

    1990-01-01

    The new IRAS Faint Source Survey data base is used to confirm the conclusion of Hacking et al. (1987) that the 60 micron source counts fainter than about 0.5 Jy lie in excess of predictions based on nonevolving model populations. The existence of an anisotropy between the northern and southern Galactic caps discovered by Rowan-Robinson et al. (1986) and Needham and Rowan-Robinson (1988) is confirmed, and it is found to extend below their sensitivity limit to about 0.3 Jy in 60 micron flux density. The count anisotropy at f(60) greater than 0.3 can be interpreted reasonably as due to the Local Supercluster; however, no one structure accounting for the fainter anisotropy can be easily identified in either optical or far-IR two-dimensional sky distributions. The far-IR galaxy sky distributions are considerably smoother than distributions from the published optical galaxy catalogs. It is likely that structures of the large size discussed here have been discriminated against in earlier studies due to insufficient volume sampling.

  7. Galaxy evolution and large-scale structure in the far-infrared. II - The IRAS faint source survey

    NASA Astrophysics Data System (ADS)

    Lonsdale, Carol J.; Hacking, Perry B.; Conrow, T. P.; Rowan-Robinson, M.

    1990-07-01

    The new IRAS Faint Source Survey data base is used to confirm the conclusion of Hacking et al. (1987) that the 60 micron source counts fainter than about 0.5 Jy lie in excess of predictions based on nonevolving model populations. The existence of an anisotropy between the northern and southern Galactic caps discovered by Rowan-Robinson et al. (1986) and Needham and Rowan-Robinson (1988) is confirmed, and it is found to extend below their sensitivity limit to about 0.3 Jy in 60 micron flux density. The count anisotropy at f(60) greater than 0.3 can be interpreted reasonably as due to the Local Supercluster; however, no one structure accounting for the fainter anisotropy can be easily identified in either optical or far-IR two-dimensional sky distributions. The far-IR galaxy sky distributions are considerably smoother than distributions from the published optical galaxy catalogs. It is likely that structures of the large size discussed here have been discriminated against in earlier studies due to insufficient volume sampling.

  8. Galaxy evolution and large-scale structure in the far-infrared. II. The IRAS faint source survey

    SciTech Connect

    Lonsdale, C.J.; Hacking, P.B.; Conrow, T.P.; Rowan-Robinson, M. Queen Mary College, London )

    1990-07-01

    The new IRAS Faint Source Survey data base is used to confirm the conclusion of Hacking et al. (1987) that the 60 micron source counts fainter than about 0.5 Jy lie in excess of predictions based on nonevolving model populations. The existence of an anisotropy between the northern and southern Galactic caps discovered by Rowan-Robinson et al. (1986) and Needham and Rowan-Robinson (1988) is confirmed, and it is found to extend below their sensitivity limit to about 0.3 Jy in 60 micron flux density. The count anisotropy at f(60) greater than 0.3 can be interpreted reasonably as due to the Local Supercluster; however, no one structure accounting for the fainter anisotropy can be easily identified in either optical or far-IR two-dimensional sky distributions. The far-IR galaxy sky distributions are considerably smoother than distributions from the published optical galaxy catalogs. It is likely that structures of the large size discussed here have been discriminated against in earlier studies due to insufficient volume sampling. 105 refs.

  9. Macro- and microstructural diversity of sea urchin teeth revealed by large-scale micro-computed tomography survey

    NASA Astrophysics Data System (ADS)

    Ziegler, Alexander; Stock, Stuart R.; Menze, Björn H.; Smith, Andrew B.

    2012-10-01

    Sea urchins (Echinodermata: Echinoidea) generally possess an intricate jaw apparatus that incorporates five teeth. Although echinoid teeth consist of calcite, their complex internal design results in biomechanical properties far superior to those of inorganic forms of the constituent material. While the individual elements (or microstructure) of echinoid teeth provide general insight into processes of biomineralization, the cross-sectional shape (or macrostructure) of echinoid teeth is useful for phylogenetic and biomechanical inferences. However, studies of sea urchin tooth macro- and microstructure have traditionally been limited to a few readily available species, effectively disregarding a potentially high degree of structural diversity that could be informative in a number of ways. Having scanned numerous sea urchin species using micro-computed tomography (µCT) and synchrotron µCT, we report a large variation in macro- and microstructure of sea urchin teeth. In addition, we describe aberrant tooth shapes and apply 3D visualization protocols that permit accelerated visual access to the complex microstructure of sea urchin teeth. Our broad survey identifies key taxa for further in-depth study and integrates previously assembled data on fossil species into a more comprehensive systematic analysis of sea urchin teeth. In order to circumvent the imprecise, word-based description of tooth shape, we introduce shape analysis algorithms that will permit the numerical and therefore more objective description of tooth macrostructure. Finally, we discuss how synchrotron µCT datasets permit virtual models of tooth microstructure to be generated as well as the simulation of tooth mechanics based on finite element modeling.

  10. Assessing outcomes of large-scale public health interventions in the absence of baseline data using a mixture of Cox and binomial regressions

    PubMed Central

    2014-01-01

    Background Large-scale public health interventions with rapid scale-up are increasingly being implemented worldwide. Such implementation allows for a large target population to be reached in a short period of time. But when the time comes to investigate the effectiveness of these interventions, the rapid scale-up creates several methodological challenges, such as the lack of baseline data and the absence of control groups. One example of such an intervention is Avahan, the India HIV/AIDS initiative of the Bill & Melinda Gates Foundation. One question of interest is the effect of Avahan on condom use by female sex workers with their clients. By retrospectively reconstructing condom use and sex work history from survey data, it is possible to estimate how condom use rates evolve over time. However formal inference about how this rate changes at a given point in calendar time remains challenging. Methods We propose a new statistical procedure based on a mixture of binomial regression and Cox regression. We compare this new method to an existing approach based on generalized estimating equations through simulations and application to Indian data. Results Both methods are unbiased, but the proposed method is more powerful than the existing method, especially when initial condom use is high. When applied to the Indian data, the new method mostly agrees with the existing method, but seems to have corrected some implausible results of the latter in a few districts. We also show how the new method can be used to analyze the data of all districts combined. Conclusions The use of both methods can be recommended for exploratory data analysis. However for formal statistical inference, the new method has better power. PMID:24397563
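
    The paper's actual estimator mixes Cox and binomial regressions over retrospectively reconstructed histories; as a much simpler, hypothetical sketch of the underlying question (did a proportion shift at a known intervention date?), a binomial likelihood-ratio test can be written in a few lines. The counts below are invented for illustration and are not from the Avahan data:

    ```python
    import math

    def binom_loglik(k, n, p):
        # Binomial log-likelihood of k successes in n trials
        # (the constant binomial coefficient term is dropped).
        return k * math.log(p) + (n - k) * math.log(1 - p)

    def change_point_lrt(k_pre, n_pre, k_post, n_post):
        """Likelihood-ratio test for a shift in a binomial proportion
        at a known change point (e.g. the programme's start date)."""
        p_pre, p_post = k_pre / n_pre, k_post / n_post
        p_pooled = (k_pre + k_post) / (n_pre + n_post)
        lr = 2 * (binom_loglik(k_pre, n_pre, p_pre)
                  + binom_loglik(k_post, n_post, p_post)
                  - binom_loglik(k_pre + k_post, n_pre + n_post, p_pooled))
        # Under the null (no shift), lr ~ chi-square with 1 df;
        # the 5% critical value is 3.84.
        return p_pre, p_post, lr

    # Hypothetical counts of reported condom use before/after the change point
    p0, p1, lr = change_point_lrt(420, 600, 540, 650)
    print(round(p0, 2), round(p1, 2), lr > 3.84)  # -> 0.7 0.83 True
    ```

    The methods compared in the paper (GEE and the Cox/binomial mixture) additionally handle within-worker correlation and censored sex-work histories, which this toy test ignores.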

  11. Large Scale Computing

    NASA Astrophysics Data System (ADS)

    Capiluppi, Paolo

    2005-04-01

    Large Scale Computing is acquiring an important role in the field of data analysis and treatment for many sciences and also for some social activities. The present paper discusses the characteristics of computing when it becomes "Large Scale" and the current state of the art for particular applications needing such large distributed resources and organization. High Energy Particle Physics (HEP) experiments are discussed in this respect; in particular, the Large Hadron Collider (LHC) experiments are analyzed. The Computing Models of the LHC Experiments represent the current prototype implementation of Large Scale Computing and indicate the level of maturity of the possible deployment solutions. Some of the most recent results on the measured performance and functionality of the LHC Experiments' computing tests are discussed.

  12. Research on the Second Region of Sino-German 6 cm Polarization Survey of the Galactic Plane and Large-scale Supernova Remnants

    NASA Astrophysics Data System (ADS)

    Xiao, L.

    2011-11-01

    Polarization observation provides a useful tool to study the properties of the interstellar medium (ISM). It can directly show the orientation of large-scale magnetic fields, and help us understand the structure of the large-scale magnetic field in our Galaxy and the evolution of supernova remnants (SNRs). Moreover, combined with polarization observations at other wavelengths, Faraday rotation can be applied to study the properties of the thermal electron density, filling factor, and regular and random magnetic fields in the ISM and SNRs. Previous polarization measurements, mostly conducted at low frequencies, were significantly influenced by the Faraday effects of the ISM, whereas at 6 cm they are much less affected and polarized emission from larger distances can be detected. By studying Faraday screens, we can explore the physical parameters of the sources as well as the synchrotron emissivities of the Galaxy. The 6 cm total intensity measurements are the key data for clarifying the spectral behavior of diffuse emission and individual objects at high frequencies, and they help us understand the distribution of relativistic electrons, the disk-halo interaction, and the evolution of late-stage SNRs. In August 2009, the 6 cm continuum and polarization survey of the Galactic plane was completed successfully using the 25 m radio telescope at Urumqi. The work presented in this thesis is mainly based on data analysis of the second survey region, with 60° ≤ l ≤ 129° and |b| ≤ 5°. We tried to compensate for the missing large-scale structures by extrapolating the WMAP K-band polarization data with a spectral index model and a simulation of the rotation measures (RMs). By comparing the maps pre- and post-calibration, we studied the extended objects in this region. We analyzed the depolarization structure at the periphery of an HII region complex using a Faraday screen model, and studied the distribution of fluctuations in the entire survey region using structure functions

  13. [The benefit of large-scale cohort studies for health research: the example of the German National Cohort].

    PubMed

    Ahrens, Wolfgang; Jöckel, K-H

    2015-08-01

    The prospective nature of large-scale epidemiological multi-purpose cohort studies with long observation periods facilitates the search for complex causes of diseases, the analysis of the natural history of diseases, and the identification of novel pre-clinical markers of disease. The German National Cohort (GNC) is a population-based, highly standardised and in-depth phenotyped cohort. It shall create the basis for new strategies for risk assessment and identification, early diagnosis and prevention of multifactorial diseases. The GNC is the largest population-based cohort study in Germany to date. In 2014, the examination of 200,000 women and men aged 20-69 years started in 18 study centers. The study facilitates the investigation of the etiology of chronic diseases in relation to lifestyle, genetic, socioeconomic, psychosocial and environmental factors. In this way, the GNC creates the basis for the development of methods for early diagnosis and prevention of these diseases. Cardiovascular and respiratory diseases, cancer, diabetes, neurodegenerative/psychiatric diseases, musculoskeletal and infectious diseases are the focus of this study. Owing to its sheer size, the study might be characterized as a Big Data project; we argue that this is not the case. PMID:26077870

  14. Health Occupations Survey.

    ERIC Educational Resources Information Center

    Willett, Lynn H.

    A survey was conducted to determine the need for health occupations personnel in the Moraine Valley Community College district, specifically to: (1) describe present employment for selected health occupations; (2) project health occupation employment to 1974; (3) identify the supply of applicants for the selected occupations; and (4) identify…

  15. Evolution of clustering length, large-scale bias, and host halo mass at 2 < z < 5 in the VIMOS Ultra Deep Survey (VUDS)⋆

    NASA Astrophysics Data System (ADS)

    Durkalec, A.; Le Fèvre, O.; Pollo, A.; de la Torre, S.; Cassata, P.; Garilli, B.; Le Brun, V.; Lemaux, B. C.; Maccagni, D.; Pentericci, L.; Tasca, L. A. M.; Thomas, R.; Vanzella, E.; Zamorani, G.; Zucca, E.; Amorín, R.; Bardelli, S.; Cassarà, L. P.; Castellano, M.; Cimatti, A.; Cucciati, O.; Fontana, A.; Giavalisco, M.; Grazian, A.; Hathi, N. P.; Ilbert, O.; Paltani, S.; Ribeiro, B.; Schaerer, D.; Scodeggio, M.; Sommariva, V.; Talia, M.; Tresse, L.; Vergani, D.; Capak, P.; Charlot, S.; Contini, T.; Cuby, J. G.; Dunlop, J.; Fotopoulou, S.; Koekemoer, A.; López-Sanjuan, C.; Mellier, Y.; Pforr, J.; Salvato, M.; Scoville, N.; Taniguchi, Y.; Wang, P. W.

    2015-11-01

    We investigate the evolution of galaxy clustering for galaxies in the redshift range 2.0 < z < 5.0 in the VIMOS Ultra Deep Survey (VUDS). We present the projected (real-space) two-point correlation function wp(rp) measured using 3022 galaxies with robust spectroscopic redshifts in two independent fields (COSMOS and VVDS-02h) covering in total 0.8 deg2. We quantify how the scale-dependent clustering amplitude r0 changes with redshift, making use of mock samples to evaluate and correct the survey selection function. Using a power-law model ξ(r) = (r/r0)-γ we find that the correlation function for the general population is best fit by a model with a clustering length r0 = 3.95+0.48-0.54 h-1 Mpc and slope γ = 1.8+0.02-0.06 at z ~ 2.5, and r0 = 4.35 ± 0.60 h-1 Mpc and γ = 1.6+0.12-0.13 at z ~ 3.5. We use these clustering parameters to derive the large-scale linear galaxy bias bLPL between galaxies and dark matter. We find bLPL = 2.68 ± 0.22 at redshift z ~ 3 (assuming σ8 = 0.8), significantly higher than found at intermediate and low redshifts for similarly general galaxy populations. We fit a halo occupation distribution (HOD) model to the data and obtain an average halo mass at redshift z ~ 3 of Mh = 1011.75 ± 0.23 h-1 M⊙. From this fit we confirm that the large-scale linear galaxy bias is relatively high at bLHOD = 2.82 ± 0.27. Comparing these measurements with similar measurements at lower redshifts, we infer that the star-forming population of galaxies at z ~ 3 should evolve into the massive and bright (Mr < -21.5) galaxy population, which typically occupies haloes of mass ⟨ Mh ⟩ = 1013.9 h-1 M⊙ at redshift z = 0. Based on data obtained with the European Southern Observatory Very Large Telescope, Paranal, Chile, under Large Program 185.A-0791. Appendices are available in electronic form at http://www.aanda.org
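
    A power-law ξ(r) = (r/r0)^-γ, as fitted in this abstract, implies a closed form for the projected correlation function via the standard Davis & Peebles (1983) integral: wp(rp) = rp (r0/rp)^γ Γ(1/2)Γ((γ-1)/2)/Γ(γ/2). A minimal sketch using the abstract's z ~ 2.5 best-fit values (the choice of rp here is arbitrary and for illustration only):

    ```python
    import math

    def wp_powerlaw(rp, r0, gamma):
        """Projected two-point correlation function wp(rp) implied by a
        power-law xi(r) = (r/r0)^-gamma, via the Davis & Peebles (1983)
        analytic integral along the line of sight."""
        prefac = (math.gamma(0.5) * math.gamma((gamma - 1.0) / 2.0)
                  / math.gamma(gamma / 2.0))
        return rp * (r0 / rp) ** gamma * prefac

    # Abstract's best-fit values at z ~ 2.5: r0 = 3.95 h^-1 Mpc, gamma = 1.8,
    # evaluated at rp = 1 h^-1 Mpc (units of wp are h^-1 Mpc)
    print(round(wp_powerlaw(1.0, 3.95, 1.8), 1))
    ```

    Note that wp scales as rp^(1-γ), so a single measured wp(rp) slope directly constrains γ while the amplitude constrains r0; the linear bias quoted in the abstract then follows from comparing this amplitude to the dark-matter prediction.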

  16. Governance of extended lifecycle in large-scale eHealth initiatives: analyzing variability of enterprise architecture elements.

    PubMed

    Mykkänen, Juha; Virkanen, Hannu; Tuomainen, Mika

    2013-01-01

    The governance of large eHealth initiatives requires traceability of many requirements and design decisions. We provide a model which we use to conceptually analyze variability of several enterprise architecture (EA) elements throughout the extended lifecycle of development goals using interrelated projects related to the national ePrescription in Finland. PMID:23920887

  17. The VIMOS Public Extragalactic Redshift Survey (VIPERS). An unprecedented view of galaxies and large-scale structure at 0.5 < z < 1.2

    NASA Astrophysics Data System (ADS)

    Guzzo, L.; Scodeggio, M.; Garilli, B.; Granett, B. R.; Fritz, A.; Abbas, U.; Adami, C.; Arnouts, S.; Bel, J.; Bolzonella, M.; Bottini, D.; Branchini, E.; Cappi, A.; Coupon, J.; Cucciati, O.; Davidzon, I.; De Lucia, G.; de la Torre, S.; Franzetti, P.; Fumana, M.; Hudelot, P.; Ilbert, O.; Iovino, A.; Krywult, J.; Le Brun, V.; Le Fèvre, O.; Maccagni, D.; Małek, K.; Marulli, F.; McCracken, H. J.; Paioro, L.; Peacock, J. A.; Polletta, M.; Pollo, A.; Schlagenhaufer, H.; Tasca, L. A. M.; Tojeiro, R.; Vergani, D.; Zamorani, G.; Zanichelli, A.; Burden, A.; Di Porto, C.; Marchetti, A.; Marinoni, C.; Mellier, Y.; Moscardini, L.; Nichol, R. C.; Percival, W. J.; Phleps, S.; Wolk, M.

    2014-06-01

    We describe the construction and general features of VIPERS, the VIMOS Public Extragalactic Redshift Survey. This ESO Large Programme is using the Very Large Telescope with the aim of building a spectroscopic sample of ~ 100 000 galaxies with iAB < 22.5 and 0.5 < z < 1.2. The survey covers a total area of ~ 24 deg2 within the CFHTLS-Wide W1 and W4 fields. VIPERS is designed to address a broad range of problems in large-scale structure and galaxy evolution, thanks to a unique combination of volume (~ 5 × 107h-3 Mpc3) and sampling rate (~ 40%), comparable to state-of-the-art surveys of the local Universe, together with extensive multi-band optical and near-infrared photometry. Here we present the survey design, the selection of the source catalogue and the development of the spectroscopic observations. We discuss in detail the overall selection function that results from the combination of the different constituents of the project. This includes the masks arising from the parent photometric sample and the spectroscopic instrumental footprint, together with the weights needed to account for the sampling and the success rates of the observations. Using the catalogue of 53 608 galaxy redshifts composing the forthcoming VIPERS Public Data Release 1 (PDR-1), we provide a first assessment of the quality of the spectroscopic data. The stellar contamination is found to be only 3.2%, endorsing the quality of the star-galaxy separation process and fully confirming the original estimates based on the VVDS data, which also indicate a galaxy incompleteness from this process of only 1.4%. Using a set of 1215 repeated observations, we estimate an rms redshift error σz/ (1 + z) = 4.7 × 10-4 and calibrate the internal spectral quality grading. Benefiting from the combination of size and detailed sampling of this dataset, we conclude by presenting a map showing in unprecedented detail the large-scale distribution of galaxies between 5 and 8 billion years ago. Based on observations

  18. Understanding Uncertainties in Non-Linear Population Trajectories: A Bayesian Semi-Parametric Hierarchical Approach to Large-Scale Surveys of Coral Cover

    PubMed Central

    Vercelloni, Julie; Caley, M. Julian; Kayal, Mohsen; Low-Choy, Samantha; Mengersen, Kerrie

    2014-01-01

    Recently, attempts to improve decision making in species management have focussed on uncertainties associated with modelling temporal fluctuations in populations. Reducing model uncertainty is challenging; while larger samples improve estimation of species trajectories and reduce statistical errors, they typically amplify variability in observed trajectories. In particular, traditional modelling approaches aimed at estimating population trajectories usually do not account well for nonlinearities and uncertainties associated with multi-scale observations characteristic of large spatio-temporal surveys. We present a Bayesian semi-parametric hierarchical model for simultaneously quantifying uncertainties associated with model structure and parameters, and scale-specific variability over time. We estimate uncertainty across a four-tiered spatial hierarchy of coral cover from the Great Barrier Reef. Coral variability is well described; however, our results show that, in the absence of additional model specifications, conclusions regarding coral trajectories become highly uncertain when considering multiple reefs, suggesting that management should focus more at the scale of individual reefs. The approach presented facilitates the description and estimation of population trajectories and associated uncertainties when variability cannot be attributed to specific causes and origins. We argue that our model can unlock value contained in large-scale datasets, provide guidance for understanding sources of uncertainty, and support better informed decision making. PMID:25364915

  19. Estimates of occupational safety and health impacts resulting from large-scale production of major photovoltaic technologies

    SciTech Connect

    Owens, T.; Ungers, L.; Briggs, T.

    1980-08-01

    The purpose of this study is to estimate, both quantitatively and qualitatively, the worker and societal risks attributable to four photovoltaic cell (solar cell) production processes. Quantitative risk values were determined by use of statistics from the California semiconductor industry. The qualitative risk assessment was performed using a variety of both governmental and private sources of data. The occupational health statistics derived from the semiconductor industry were used to predict injury and fatality levels associated with photovoltaic cell manufacturing. The use of these statistics to characterize the two silicon processes described herein is defensible from the standpoint that many of the same process steps and materials are used in both the semiconductor and photovoltaic industries. These health statistics are less applicable to the gallium arsenide and cadmium sulfide manufacturing processes, primarily because of differences in the materials utilized. Although such differences tend to discourage any absolute comparisons among the four photovoltaic cell production processes, certain relative comparisons are warranted. To facilitate a risk comparison of the four processes, the number and severity of process-related chemical hazards were assessed. This qualitative hazard assessment addresses both the relative toxicity and the exposure potential of substances in the workplace. In addition to the worker-related hazards, estimates of process-related emissions and wastes are also provided.

  20. Public appraisal of government efforts and participation intent in medico-ethical policymaking in Japan: a large scale national survey concerning brain death and organ transplant

    PubMed Central

    Sato, Hajime; Akabayashi, Akira; Kai, Ichiro

    2005-01-01

    Background Public satisfaction with the policy process influences the legitimacy and acceptance of policies, and conditions the future political process, especially when contending ethical value judgments are involved. On the other hand, public involvement is required if effective policy is to be developed and accepted. Methods Using data from a large-scale national opinion survey, this study evaluates public appraisal of past government efforts to legalize organ transplant from brain-dead bodies in Japan, and examines the public's intent to participate in future policy. Results A relatively large percentage of people became aware of the issue when government actions were initiated, and many increasingly formed their own opinions on the policy in question. However, a significant number (43.3%) remained unaware of any legislative efforts, and only 26.3% of those who were aware provided positive appraisals of the policymaking process. Furthermore, a majority of respondents (61.8%) indicated unwillingness to participate in future policy discussions of bioethical issues. Multivariate analysis revealed that the following factors are associated with positive appraisals of policy development: greater age; earlier opinion formation; and familiarity with donor cards. Factors associated with likelihood of future participation in policy discussion include younger age, earlier attention to the issue, and knowledge of past government efforts. Those unwilling to participate cited as their reasons that experts are more knowledgeable and that the issues are too complex. Conclusions The results of an opinion survey in Japan were presented, and a set of factors statistically associated with them was discussed. Further efforts to improve the policymaking process on bioethical issues are desirable. PMID:15661080

  1. Testing deviations from ΛCDM with growth rate measurements from six large-scale structure surveys at z = 0.06-1

    NASA Astrophysics Data System (ADS)

    Alam, Shadab; Ho, Shirley; Silvestri, Alessandra

    2016-03-01

    We use measurements from the Planck satellite mission and galaxy redshift surveys over the last decade to test three of the basic assumptions of the standard model of cosmology, ΛCDM (Λ cold dark matter): the spatial curvature of the universe, the nature of dark energy and the laws of gravity on large scales. We obtain improved constraints on several scenarios that violate one or more of these assumptions. We measure w_0 = -0.94 ± 0.17 (18 per cent measurement) and 1 + w_a = 1.16 ± 0.36 (31 per cent measurement) for models with a time-dependent equation of state, which is an improvement over current best constraints. In the context of modified gravity, we consider popular scalar-tensor models as well as a parametrization of the growth factor. In the case of one-parameter f(R) gravity models with a ΛCDM background, we constrain B_0 < 1.36 × 10⁻⁵ (1σ C.L.), which is an improvement by a factor of 4 on the current best. We provide the very first constraint on the coupling parameters of general scalar-tensor theory and a stringent constraint on the only free coupling parameter of Chameleon models. We also derive constraints on extended Chameleon models, improving the constraint on the coupling by a factor of 6 on the current best. The constraints on the coupling parameter for the Chameleon model rule out the value of β_1 = 4/3 required for f(R) gravity. We also measure γ = 0.612 ± 0.072 (11.7 per cent measurement) for the growth index parametrization. We improve all the current constraints by combining results from various galaxy redshift surveys in a coherent way, which includes a careful treatment of the scale dependence introduced by modified gravity.

  2. The Big Drink Debate: perceptions of the impact of price on alcohol consumption from a large scale cross-sectional convenience survey in north west England

    PubMed Central

    2011-01-01

    Background A large-scale survey was conducted in 2008 in north west England, a region with high levels of alcohol-related harm, during a regional 'Big Drink Debate' campaign. The aim of this paper is to explore perceptions of how alcohol consumption would change if alcohol prices were to increase or decrease. Methods A convenience survey of residents (≥ 18 years) of north west England measured demographics, income, alcohol consumption in previous week, and opinions on drinking behaviour under two pricing conditions: low prices and discounts and increased alcohol prices (either 'decrease', 'no change' or 'increase'). Multinomial logistic regression used three outcomes: 'completely elastic' (consider that lower prices increase drinking and higher prices decrease drinking); 'lower price elastic' (lower prices increase drinking, higher prices have no effect); and 'price inelastic' (no change for either). Results Of 22,780 drinkers surveyed, 80.3% considered lower alcohol prices and discounts would increase alcohol consumption, while 22.1% thought raising prices would decrease consumption, making lower price elasticity only (i.e. lower prices increase drinking, higher prices have no effect) the most common outcome (62%). Compared to a high income/high drinking category, the lightest drinkers with a low income (adjusted odds ratio AOR = 1.78, 95% confidence intervals CI 1.38-2.30) or medium income (AOR = 1.88, CI 1.47-2.41) were most likely to be lower price elastic. Females were more likely than males to be lower price elastic (65% vs 57%) while the reverse was true for complete elasticity (20% vs 26%, P < 0.001). Conclusions Lower pricing increases alcohol consumption, and the alcohol industry's continued focus on discounting sales encourages higher drinking levels. International evidence suggests increasing the price of alcohol reduces consumption, and one in five of the surveyed population agreed; more work is required to increase this agreement to achieve public
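
    The three elasticity outcomes used in the regression amount to a simple classification over the answers to the two pricing questions. A sketch of that mapping (category names from the abstract; the function name and the catch-all "other" category are illustrative assumptions):

```python
def elasticity_category(low_price: str, high_price: str) -> str:
    """Classify stated drinking responses to lower and higher alcohol prices.

    Each argument is one of 'increase', 'decrease' or 'no change'.
    """
    if low_price == "increase" and high_price == "decrease":
        return "completely elastic"     # responsive in both price directions
    if low_price == "increase" and high_price == "no change":
        return "lower price elastic"    # responsive to discounts only
    if low_price == "no change" and high_price == "no change":
        return "price inelastic"        # unresponsive either way
    return "other"                      # remaining answer combinations

print(elasticity_category("increase", "no change"))
```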

  3. Health risks from large-scale water pollution: Current trends and implications for improving drinking water quality in the lower Amu Darya drainage basin, Uzbekistan

    NASA Astrophysics Data System (ADS)

    Törnqvist, Rebecka; Jarsjö, Jerker

    2010-05-01

    Safe drinking water is a primary prerequisite to human health, well-being and development. Yet, roughly one billion people around the world lack access to a safe drinking water supply. Health risk assessments are effective for evaluating the suitability of using various water sources as drinking water supply. Additionally, knowledge of pollutant transport processes on relatively large scales is needed to identify effective management strategies for improving water resources of poor quality. The lower Amu Darya drainage basin close to the Aral Sea in Uzbekistan suffers from physical water scarcity and poor water quality. This is mainly due to the intensive agricultural production in the region, which requires extensive freshwater withdrawals and the use of fertilizers and pesticides. In addition, recurrent droughts in the region affect the surface water availability. On average, 20% of the population in rural areas of Uzbekistan lack access to improved drinking water sources, and the situation is even more severe in the lower Amu Darya basin. In this study, we consider health risks related to water-borne contaminants by dividing measured substance concentrations by health-risk based guideline values from the World Health Organisation (WHO). In particular, we analyse novel results of water quality measurements performed in 2007 and 2008 in the Mejdurechye Reservoir (located in the downstream part of the Amu Darya river basin). We furthermore identify large-scale trends by comparing the Mejdurechye results to reported water quality results from a considerable stretch of the Amu Darya river basin, including drainage water, river water and groundwater. The results show that concentrations of cadmium and nitrite exceed the WHO health-risk based guideline values in Mejdurechye Reservoir. Furthermore, concentrations of the long-banned and highly toxic pesticides dichlorodiphenyltrichloroethane (DDT) and γ-hexachlorocyclohexane (γ-HCH) were detected in
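
    The screening method described, dividing a measured concentration by the WHO guideline value, is a standard hazard-quotient calculation: a quotient above 1 flags a potential concern. A minimal sketch (the guideline figures are the WHO drinking-water values for cadmium and nitrite; the measured concentrations are invented for illustration):

```python
# WHO drinking-water guideline values, mg/L (cadmium 0.003, nitrite 3.0).
GUIDELINES_MG_L = {"cadmium": 0.003, "nitrite": 3.0}

# Hypothetical measured concentrations at a single sampling site (mg/L).
measured_mg_l = {"cadmium": 0.005, "nitrite": 4.5}

def hazard_quotient(substance: str, concentration: float) -> float:
    """Measured concentration divided by the guideline value; HQ > 1 is a concern."""
    return concentration / GUIDELINES_MG_L[substance]

for substance, conc in measured_mg_l.items():
    hq = hazard_quotient(substance, conc)
    status = "exceeds" if hq > 1 else "within"
    print(f"{substance}: HQ = {hq:.2f} ({status} guideline)")
```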

  4. Seismic texture and amplitude analysis of large scale fluid escape pipes using time-lapse seismic surveys: examples from the Loyal Field (Scotland, UK)

    NASA Astrophysics Data System (ADS)

    Maestrelli, Daniele; Jihad, Ali; Iacopini, David; Bond, Clare

    2016-04-01

    …affected by large-scale fractures (semblance image), and seem consistent with a suspended, non-fluidized mud/sand mixture flow. Near-, middle- and far-offset amplitude analysis confirms that most of the amplitude anomalies within the pipe conduits and termini are only partly related to gas. An interpretation of the possible textures observed is proposed, with a discussion of the noise and artefacts induced by resolution and migration problems. Possible formation mechanisms for these pipes are discussed.

  5. Analysis and Modeling of Threatening Factors of Workforce’s Health in Large-Scale Workplaces: Comparison of Four Fitting Methods to Select the Optimum Technique

    PubMed Central

    Mohammadfam, Iraj; Soltanzadeh, Ahmad; Moghimbeigi, Abbas; Savareh, Behrouz Alizadeh

    2016-01-01

    Introduction Workforce is one of the pillars of development in any country. Therefore, the workforce’s health is very important, and analyzing its threatening factors is one of the fundamental steps for health planning. This study was the first part of a comprehensive study aimed at comparing fitting methods to analyze and model the factors threatening health in occupational injuries. Methods In this study, 980 human occupational injuries in 10 Iranian large-scale workplaces over 10 years (2005–2014) were analyzed and modeled based on four fitting methods: linear regression, regression analysis, generalized linear model, and artificial neural networks (ANN), using IBM SPSS Modeler 14.2. Results The Accident Severity Rate (ASR) of occupational injuries was 557.47 ± 397.87. The results showed that the mean age and work experience of injured workers were 27.82 ± 5.23 and 4.39 ± 3.65 years, respectively. Analysis of health-threatening factors showed that age, quality of the provided H&S training, number of workers, hazard identification (HAZID), periodic risk assessment, and periodic H&S training were important factors affecting ASR. In addition, comparison of the four fitting methods showed that the correlation coefficient of ANN (R = 0.968) was the highest, and its relative error (R.E = 0.063) the lowest, among the fitting methods. Conclusion The findings of the present study indicated that, despite the suitability and effectiveness of all fitting methods in analyzing the severity of occupational injuries, ANN is the best fitting method for modeling the threatening factors of a workforce’s health. Furthermore, all fitting methods, especially ANN, should be considered more in analyzing and modeling occupational injuries and health-threatening factors, as well as in planning to provide and improve the workforce’s health. PMID:27053999
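
    The model comparison rests on two summary metrics: a correlation coefficient R between observed and predicted severity, and a relative error. A sketch of that comparison on synthetic data, with a quadratic polynomial standing in for the more flexible learner (nothing here reproduces the study's actual models or data):

```python
import numpy as np

rng = np.random.default_rng(7)
x = rng.uniform(0.0, 10.0, 200)                            # synthetic predictor
y = 2.0 * x + 0.5 * x**2 + rng.normal(0.0, 2.0, x.size)    # nonlinear outcome

def fit_metrics(y_true, y_pred):
    """Correlation coefficient R and mean relative error of a fitted model."""
    r = np.corrcoef(y_true, y_pred)[0, 1]
    rel_err = np.mean(np.abs(y_true - y_pred)) / np.mean(np.abs(y_true))
    return r, rel_err

linear_pred = np.polyval(np.polyfit(x, y, 1), x)   # rigid model
flex_pred = np.polyval(np.polyfit(x, y, 2), x)     # flexible model (ANN stand-in)

r_lin, e_lin = fit_metrics(y, linear_pred)
r_flex, e_flex = fit_metrics(y, flex_pred)
print(f"linear:   R = {r_lin:.3f}, rel. error = {e_lin:.3f}")
print(f"flexible: R = {r_flex:.3f}, rel. error = {e_flex:.3f}")
```

    On nonlinear data the flexible model shows the higher R and lower relative error, which is the pattern the study reports for the ANN.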

  6. Large scale tracking algorithms.

    SciTech Connect

    Hansen, Ross L.; Love, Joshua Alan; Melgaard, David Kennett; Karelitz, David B.; Pitts, Todd Alan; Zollweg, Joshua David; Anderson, Dylan Z.; Nandy, Prabal; Whitlow, Gary L.; Bender, Daniel A.; Byrne, Raymond Harry

    2015-01-01

    Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination, but unfortunately, will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied towards detection algorithms also changes. For low resolution sensors, "blob" tracking is the norm. For higher resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.

  7. Large scale traffic simulations

    SciTech Connect

    Nagel, K.; Barrett, C.L.; Rickert, M.

    1997-04-01

    Large scale microscopic (i.e. vehicle-based) traffic simulations pose high demands on computational speed in at least two application areas: (i) real-time traffic forecasting, and (ii) long-term planning applications (where repeated "looping" between the microsimulation and the simulated planning of individual person's behavior is necessary). As a rough number, a real-time simulation of an area such as Los Angeles (ca. 1 million travellers) will need a computational speed much higher than 1 million "particle" (= vehicle) updates per second. This paper reviews how this problem is approached in different projects and how these approaches depend both on the specific questions and on the prospective user community. The approaches range from highly parallel and vectorizable, single-bit implementations on parallel supercomputers for Statistical Physics questions, via more realistic implementations on coupled workstations, to more complicated driving dynamics implemented again on parallel supercomputers. 45 refs., 9 figs., 1 tab.

  8. The large-scale distribution of galaxies

    NASA Technical Reports Server (NTRS)

    Geller, Margaret J.

    1989-01-01

    The spatial distribution of galaxies in the universe is characterized on the basis of the six completed strips of the Harvard-Smithsonian Center for Astrophysics redshift-survey extension. The design of the survey is briefly reviewed, and the results are presented graphically. Vast low-density voids similar to the void in Bootes are found, almost completely surrounded by thin sheets of galaxies. Also discussed are the implications of the results for the survey sampling problem, the two-point correlation function of the galaxy distribution, the possibility of detecting large-scale coherent flows, theoretical models of large-scale structure, and the identification of groups and clusters of galaxies.

  9. Self-Assessments or Tests? Comparing Cross-National Differences in Patterns and Outcomes of Graduates' Skills Based on International Large-Scale Surveys

    ERIC Educational Resources Information Center

    Humburg, Martin; van der Velden, Rolf

    2015-01-01

    In this paper an analysis is carried out whether objective tests and subjective self-assessments in international large-scale studies yield similar results when looking at cross-national differences in the effects of skills on earnings, and skills patterns across countries, fields of study and gender. The findings indicate that subjective skills…

  10. A Numeric Scorecard Assessing the Mental Health Preparedness for Large-Scale Crises at College and University Campuses: A Delphi Study

    ERIC Educational Resources Information Center

    Burgin, Rick A.

    2012-01-01

    Large-scale crises continue to surprise, overwhelm, and shatter college and university campuses. While the devastation to physical plants and persons is often evident and is addressed with crisis management plans, the number of emotional casualties left in the wake of these large-scale crises may not be apparent and are often not addressed with…

  11. The Development of the Older Persons and Informal Caregivers Survey Minimum DataSet (TOPICS-MDS): A Large-Scale Data Sharing Initiative

    PubMed Central

    Lutomski, Jennifer E.; Baars, Maria A. E.; Schalk, Bianca W. M.; Boter, Han; Buurman, Bianca M.; den Elzen, Wendy P. J.; Jansen, Aaltje P. D.; Kempen, Gertrudis I. J. M.; Steunenberg, Bas; Steyerberg, Ewout W.; Olde Rikkert, Marcel G. M.; Melis, René J. F.

    2013-01-01

    Introduction In 2008, the Ministry of Health, Welfare and Sport commissioned the National Care for the Elderly Programme. While numerous research projects in older persons’ health care were to be conducted under this national agenda, the Programme further advocated the development of The Older Persons and Informal Caregivers Survey Minimum DataSet (TOPICS-MDS) which would be integrated into all funded research protocols. In this context, we describe TOPICS data sharing initiative (www.topics-mds.eu). Materials and Methods A working group drafted TOPICS-MDS prototype, which was subsequently approved by a multidisciplinary panel. Using instruments validated for older populations, information was collected on demographics, morbidity, quality of life, functional limitations, mental health, social functioning and health service utilisation. For informal caregivers, information was collected on demographics, hours of informal care and quality of life (including subjective care-related burden). Results Between 2010 and 2013, a total of 41 research projects contributed data to TOPICS-MDS, resulting in preliminary data available for 32,310 older persons and 3,940 informal caregivers. The majority of studies sampled were from primary care settings and inclusion criteria differed across studies. Discussion TOPICS-MDS is a public data repository which contains essential data to better understand health challenges experienced by older persons and informal caregivers. Such findings are relevant for countries where increasing health-related expenditure has necessitated the evaluation of contemporary health care delivery. Although open sharing of data can be difficult to achieve in practice, proactively addressing issues of data protection, conflicting data analysis requests and funding limitations during TOPICS-MDS developmental phase has fostered a data sharing culture. To date, TOPICS-MDS has been successfully incorporated into 41 research projects, thus supporting the

  12. NATIONAL PREGNANCY AND HEALTH SURVEY

    EPA Science Inventory

    The National Pregnancy and Health Survey conducted by NIDA is a nationwide hospital survey to determine the extent of drug abuse among pregnant women in the United States. The primary objective of the National Pregnancy and Health Survey (NPHS) was to produce national annual esti...

  13. Effectiveness of a large-scale health and nutritional education program on anemia in children younger than 5 years in Shifang, a heavily damaged area of Wenchuan earthquake.

    PubMed

    Yang, Fan; Wang, Chuan; Yang, Hui; Yang, Huiming; Yang, Sufei; Yu, Tao; Tang, Zhanghui; Ji, Qiaoyun; Li, Fengyi; Shi, Hua; Mao, Meng

    2015-03-01

    This study aimed to explore an ideal way to prevent anemia among children younger than 5 years after disasters especially when health care facilities are not enough. A preliminary survey was carried out involving 13 065 children younger than 5 years. Pretested questionnaires were used for data collection and hemoglobin levels were measured. After 12-month intervention, the impact survey involving 2769 children was conducted. Results showed that there were some improvements both in feeding knowledge and practice related to anemia. The total prevalence of anemia decreased from 14.3% to 7.8% (P < .001), and the severity of anemia also declined. The hemoglobin concentration increased significantly from 118.8 ± 10.5 to 122.0 ± 9.9 g/L (P < .001). Thus, health and nutritional education could be an ideal way to combat anemia after disasters especially in less developed areas with multiparty cooperation. The methods and experiences of this study may be well worth learning and implementing. PMID:23536239

  14. A large scale survey of trace metal levels in coastal waters of the Western Mediterranean basin using caged mussels (Mytilus galloprovincialis).

    PubMed

    Benedicto, José; Andral, Bruno; Martínez-Gómez, Concepción; Guitart, Carlos; Deudero, Salud; Cento, Alessandro; Scarpato, Alfonso; Caixach, Josep; Benbrahim, Samir; Chouba, Lassaad; Boulahdid, Mostefa; Galgani, François

    2011-05-01

    A large scale study of trace metal contamination (Hg, Cd, Pb and Ni) by means of caged mussels Mytilus galloprovincialis was undertaken along the coastal waters of the Western Mediterranean Sea within the context of the MYTILOS project. Individual mussels from a homogeneous population (shell size 50 ± 5 mm) obtained from an aquaculture farm were consecutively caged and deployed at 123 sites located in the Alborán, North-Western, South-Western and Tyrrhenian sub-basins for 12 weeks (April-July) in 2004, 2005 and 2006. After cage recovery, both the metal content in the whole mussel tissue and the allometric parameters were measured. Statistical analysis of the datasets showed significant differences between sub-basins for some metals and for the mussel condition index (CI). Linear regression models coupled to the CI were revisited for the data adjustment of certain trace metals (Hg, Cd and Ni), and four level categories were statistically derived to facilitate interregional comparison. Seawater masses surrounding coastal areas impacted by run-off from mineralised coastal land and industrial activities displayed the highest concentration ranges (Hg: 0.15-0.31 mg kg⁻¹ dw; Cd: 1.97-2.11; Ni: 2.18-3.20 and Pb: 3.1-3.8), although the levels obtained at most of the sites fell within the moderate or low categories and could be considered baseline concentrations. However, a few sites currently considered little influenced by human activities showed high concentrations of Cd, Ni and Pb, and these constitute new areas of concern. Overall, the active biomonitoring (ABM) approach made it possible to investigate trace metal contamination in order to support policy makers in establishing regional strategies (particularly with regard to the European Marine Strategy Directive). PMID:21384032

  15. Is cost-related non-collection of prescriptions associated with a reduction in health? Findings from a large-scale longitudinal study of New Zealand adults

    PubMed Central

    Jatrana, Santosh; Richardson, Ken; Norris, Pauline; Crampton, Peter

    2015-01-01

    Objective To investigate whether cost-related non-collection of prescription medication is associated with a decline in health. Settings New Zealand Survey of Family, Income and Employment (SoFIE)-Health. Participants Data from 17 363 participants with at least two observations in three waves (2004–2005, 2006–2007, 2008–2009) of a panel study were analysed using fixed effects regression modelling. Primary outcome measures Self-rated health (SRH), physical health (PCS) and mental health scores (MCS) were the health measures used in this study. Results After adjusting for time-varying confounders, non-collection of prescription items was associated with a 0.11 (95% CI 0.07 to 0.15) unit worsening in SRH, a 1.00 (95% CI 0.61 to 1.40) unit decline in PCS and a 1.69 (95% CI 1.19 to 2.18) unit decline in MCS. The interaction of the main exposure with gender was significant for SRH and MCS. Non-collection of prescription items was associated with a decline in SRH of 0.18 (95% CI 0.11 to 0.25) units for males and 0.08 (95% CI 0.03 to 0.13) units for females, and a decrease in MCS of 2.55 (95% CI 1.67 to 3.42) and 1.29 (95% CI 0.70 to 1.89) units for males and females, respectively. The interaction of the main exposure with age was significant for SRH. For respondents aged 15–24 and 25–64 years, non-collection of prescription items was associated with a decline in SRH of 0.12 (95% CI 0.03 to 0.21) and 0.12 (95% CI 0.07 to 0.17) units, respectively, but for respondents aged 65 years and over, non-collection of prescription items had no significant effect on SRH. Conclusion Our results show that those who do not collect prescription medications because of cost have an increased risk of a subsequent decline in health. PMID:26553826

  16. What Sort of Girl Wants to Study Physics after the Age of 16? Findings from a Large-Scale UK Survey

    ERIC Educational Resources Information Center

    Mujtaba, Tamjid; Reiss, Michael J.

    2013-01-01

    This paper investigates the characteristics of 15-year-old girls who express an intention to study physics post-16. This paper unpacks issues around within-girl group differences and similarities between boys and girls in survey responses about physics. The analysis is based on the year 10 (age 15 years) responses of 5,034 students from 137 UK…

  17. A Short Survey on the State of the Art in Architectures and Platforms for Large Scale Data Analysis and Knowledge Discovery from Data

    SciTech Connect

    Begoli, Edmon

    2012-01-01

    Intended as a survey for practicing architects and researchers seeking an overview of the state-of-the-art architectures for data analysis, this paper provides an overview of the emerging data management and analytic platforms including parallel databases, Hadoop-based systems, High Performance Computing (HPC) platforms and platforms popularly referred to as NoSQL platforms. Platforms are presented based on their relevance, the analysis they support and the data organization model they support.

  18. Very Large Scale Integration (VLSI).

    ERIC Educational Resources Information Center

    Yeaman, Andrew R. J.

    Very Large Scale Integration (VLSI), the state-of-the-art production techniques for computer chips, promises such powerful, inexpensive computing that, in the future, people will be able to communicate with computer devices in natural language or even speech. However, before full-scale VLSI implementation can occur, certain salient factors must be…

  19. Relationship between overactive bladder and irritable bowel syndrome: a large-scale internet survey in Japan using the overactive bladder symptom score and Rome III criteria

    PubMed Central

    Matsumoto, Seiji; Hashizume, Kazumi; Wada, Naoki; Hori, Jyunichi; Tamaki, Gaku; Kita, Masafumi; Iwata, Tatsuya; Kakizaki, Hidehiro

    2013-01-01

    What's known on the subject? and What does the study add? There is known to be an association between overactive bladder (OAB) and irritable bowel syndrome (IBS). The study investigates the association between OAB and IBS using an internet-based survey in Japan. It is the first to investigate the prevalence and severity of OAB in the general population using the OAB symptom score questionnaire. Objective To investigate the association between overactive bladder (OAB) and irritable bowel syndrome (IBS) by using an internet-based survey in Japan. Subjects and Methods Questionnaires were sent via the internet to Japanese adults. The overactive bladder symptom score was used for screening OAB, and the Japanese version of the Rome III criteria for the diagnosis of IBS was used for screening this syndrome. Results The overall prevalence of OAB and IBS was 9.3% and 21.2%, respectively. Among the subjects with OAB, 33.3% had concurrent IBS. The prevalence of OAB among men was 9.7% and among women it was 8.9%, while 18.6% of men and 23.9% of women had IBS. Concurrent IBS was noted in 32.0% of men and 34.8% of women with OAB. Conclusion Taking into account a high rate of concurrent IBS in patients with OAB, it seems to be important for physicians to assess the defaecation habits of patients when diagnosing and treating OAB. PMID:23106867

  20. Prevalence of HIV among MSM in Europe: comparison of self-reported diagnoses from a large scale internet survey and existing national estimates

    PubMed Central

    2012-01-01

    Background Country level comparisons of HIV prevalence among men having sex with men (MSM) are challenging for a variety of reasons, including differences in the definition and measurement of the denominator group, recruitment strategies and the HIV detection methods. To assess their comparability, self-reported data on HIV diagnoses in a 2010 pan-European MSM internet survey (EMIS) were compared with pre-existing estimates of HIV prevalence in MSM from a variety of European countries. Methods The first pan-European survey of MSM recruited more than 180,000 men from 38 countries across Europe and included questions on the year and result of the last HIV test. HIV prevalence as measured in EMIS was compared with national estimates of HIV prevalence based on studies using biological measurements or modelling approaches to explore the degree of agreement between the different methods. Existing estimates were taken from Dublin Declaration Monitoring Reports or UNAIDS country fact sheets, and were verified by contacting the nominated contact points for HIV surveillance in EU/EEA countries. Results The EMIS self-reported measurements of HIV prevalence were strongly correlated with existing estimates based on biological measurement and modelling studies using surveillance data (R² = 0.70 and 0.72, respectively). In most countries HIV-positive MSM appeared disproportionately likely to participate in EMIS, and prevalences as measured in EMIS are approximately twice the pre-existing estimates. Conclusions Comparison of diagnosed HIV prevalence as measured in EMIS with pre-existing estimates based on biological measurements using varied sampling frames (e.g. Respondent Driven Sampling, Time and Location Sampling) demonstrates a high correlation and suggests similar selection biases in both types of studies. For comparison with modelled estimates, the self-selection bias of the internet survey, with increased participation of men diagnosed with HIV, has to be taken into account. For
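
    The agreement check described amounts to correlating the two sets of country-level estimates and examining their ratio. A toy sketch with invented prevalence figures, where the survey column is roughly double the existing estimates to mirror the reported pattern (none of these numbers come from EMIS):

```python
import numpy as np

# Hypothetical HIV prevalence (%) among MSM for six countries.
existing = np.array([2.0, 4.5, 6.0, 8.5, 11.0, 14.0])  # surveillance-based
emis = np.array([4.1, 8.8, 12.5, 16.4, 22.9, 27.0])    # internet-survey self-reports

r_squared = np.corrcoef(existing, emis)[0, 1] ** 2     # strength of agreement
ratio = np.mean(emis / existing)                       # average inflation factor

print(f"R^2 = {r_squared:.2f}, EMIS/existing ratio = {ratio:.2f}")
```

    A high R² with a ratio near 2 is exactly the combination the abstract describes: strong correlation, but systematic over-representation of diagnosed men in the internet sample.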

  1. First measurements of the scope for growth (SFG) in mussels from a large scale survey in the North-Atlantic Spanish coast.

    PubMed

    Albentosa, Marina; Viñas, Lucía; Besada, Victoria; Franco, Angeles; González-Quijano, Amelia

    2012-10-01

    SFG and physiological rates were measured in wild mussels from the Spanish Marine Pollution monitoring program (SMP) in order to determine seawater quality. The SMP consists of 41 stations covering almost 2500 km of coast, making it the widest-ranging monitoring network in the Iberian Peninsula's Atlantic region. Results are presented from the 2007 and 2008 surveys, in which 39 sites were sampled (20 in 2007 and 19 in 2008, with 8 sites sampled in both years). Chemical analyses were carried out to determine the relationships between physiological rates and the accumulation of toxic compounds. The data presented are the first to become available on the use of SFG as a biomarker of the marine environment on a large spatial scale (>1000 km) along Spain's Atlantic seaboard. SFG values enable significant differences to be established between the areas sampled and between the two years surveyed. The integration of biological and chemical data suggests that certain organochlorine compounds, namely chlordanes and DDTs, may have a negative effect on SFG, although such an effect is of a lesser magnitude than that associated with certain biological parameters such as condition index and mussel age. These variables act as confounding factors when attempting to determine the effect of chemical compounds present in the marine environment on mussel SFG. Further research is therefore needed on the relation between these confounding factors and SFG in order to apply the relevant corrective strategies to enable this index to be used in monitoring programs. The effect of these confounding factors is more clearly revealed in studies that cover a wide-ranging spatial and time scale, such as those carried out within the SMP. These results do not invalidate the use of biological data in monitoring programs, but rather point to the need to analyze all the factors affecting each biological process. PMID:22885349

  2. A 1.85-m mm-submm Telescope for Large-Scale Molecular Gas Surveys in 12CO, 13CO, and C18O (J = 2-1)

    NASA Astrophysics Data System (ADS)

    Onishi, Toshikazu; Nishimura, Atsushi; Ota, Yuya; Hashizume, Akio; Kojima, Yoshiharu; Minami, Akihito; Tokuda, Kazuki; Touga, Shiori; Abe, Yasuhiro; Kaiden, Masahiro; Kimura, Kimihiro; Muraoka, Kazuyuki; Maezawa, Hiroyuki; Ogawa, Hideo; Dobashi, Kazuhito; Shimoikura, Tomomi; Yonekura, Yoshinori; Asayama, Shin'ichiro; Handa, Toshihiro; Nakajima, Taku; Noguchi, Takashi; Kuno, Nario

    2013-08-01

    We have developed a new mm-submm telescope with a diameter of 1.85 m installed at the Nobeyama Radio Observatory. The scientific goal is to precisely reveal the physical properties of molecular clouds in the Milky Way Galaxy by obtaining a large-scale distribution of molecular gas, which can also be compared with large-scale observations at various wavelengths. The target frequency is ~230 GHz; simultaneous observations of the J = 2-1 rotational lines of three carbon monoxide isotopologues (12CO, 13CO, C18O) are achieved with a beam size (HPBW) of 2.7'. In order to accomplish the simultaneous observations, we have developed waveguide-type sideband-separating SIS mixers to obtain spectra separately in the upper and lower sidebands. A Fourier digital spectrometer with a 1 GHz bandwidth and 16384 channels is installed; the bandwidth of the spectrometer is divided into three parts, one for each of the three spectra, and the IF system has been designed to inject these three lines into the spectrometer. A flexible observation system was created mainly in Python on Linux PCs, enabling effective OTF (On-The-Fly) scans for large-area mapping. The telescope is enclosed in a membrane-covered radome to prevent harmful effects of sunlight, strong wind, and precipitation, in order to minimize errors in the telescope pointing and to stabilize the receiver and the IF devices. Science operation started in 2011 November, resulting in large-scale surveys of the Orion A/B clouds, Cygnus OB7, the Galactic plane, Taurus, and other regions. We also updated the receiver system for dual-polarization observations.
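
As a rough consistency check on the numbers above, the spectrometer's per-channel resolution and the implied velocity resolution at 230 GHz follow from simple arithmetic (the equal three-way channel split is an assumption for illustration; the actual sub-band allocation may differ):

```python
# Illustrative arithmetic from the instrument parameters quoted above.
C_KM_S = 299_792.458        # speed of light, km/s
bandwidth_hz = 1.0e9        # spectrometer bandwidth
n_channels = 16_384
nu_hz = 230.0e9             # target frequency ~230 GHz

dnu_hz = bandwidth_hz / n_channels        # ~61 kHz per channel
dv_km_s = C_KM_S * dnu_hz / nu_hz         # per-channel velocity resolution
channels_per_line = n_channels // 3       # assumed: one equal sub-band per CO line

print(f"{dnu_hz / 1e3:.1f} kHz/channel, {dv_km_s * 1e3:.0f} m/s/channel, "
      f"{channels_per_line} channels per line")
```

The ~0.08 km/s per channel is comfortably finer than typical molecular cloud linewidths, which is what makes simultaneous three-line OTF mapping at this resolution practical.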

  3. IRAM 30 m large scale survey of 12CO(2-1) and 13CO(2-1) emission in the Orion molecular cloud

    SciTech Connect

    Berné, O.; Cernicharo, J.; Marcelino, N.

    2014-11-01

    Using the IRAM 30 m telescope, we have surveyed a 1° × 0.8° part of the Orion molecular cloud in the 12CO and 13CO (2-1) lines with a maximal spatial resolution of ~11'' and spectral resolution of ~0.4 km s-1. The cloud appears filamentary, clumpy, and with a complex kinematical structure. We derive an estimated mass of the cloud of 7700 M☉ (half of which is found in regions with visual extinctions AV below ~10) and a dynamical age for the nebula of the order of 0.2 Myr. The energy balance suggests that magnetic fields play an important role in supporting the cloud, at large and small scales. According to our analysis, the turbulent kinetic energy in the molecular gas due to outflows is comparable to turbulent kinetic energy resulting from the interaction of the cloud with the H II region. This latter feedback appears negative, i.e., the triggering of star formation by the H II region is inefficient in Orion. The reduced data as well as additional products such as the column density map are made available online (http://userpages.irap.omp.eu/~oberne/Olivier_Berne/Data).

  4. IRAM 30 m Large Scale Survey of 12CO(2-1) and 13CO(2-1) Emission in the Orion Molecular Cloud

    NASA Astrophysics Data System (ADS)

    Berné, O.; Marcelino, N.; Cernicharo, J.

    2014-11-01

    Using the IRAM 30 m telescope, we have surveyed a 1° × 0.8° part of the Orion molecular cloud in the 12CO and 13CO (2-1) lines with a maximal spatial resolution of ~11'' and spectral resolution of ~0.4 km s-1. The cloud appears filamentary, clumpy, and with a complex kinematical structure. We derive an estimated mass of the cloud of 7700 M ⊙ (half of which is found in regions with visual extinctions AV below ~10) and a dynamical age for the nebula of the order of 0.2 Myr. The energy balance suggests that magnetic fields play an important role in supporting the cloud, at large and small scales. According to our analysis, the turbulent kinetic energy in the molecular gas due to outflows is comparable to turbulent kinetic energy resulting from the interaction of the cloud with the H II region. This latter feedback appears negative, i.e., the triggering of star formation by the H II region is inefficient in Orion. The reduced data as well as additional products such as the column density map are made available online (http://userpages.irap.omp.eu/~oberne/Olivier_Berne/Data).

  5. What Sort of Girl Wants to Study Physics After the Age of 16? Findings from a Large-scale UK Survey

    NASA Astrophysics Data System (ADS)

    Mujtaba, Tamjid; Reiss, Michael J.

    2013-11-01

    This paper investigates the characteristics of 15-year-old girls who express an intention to study physics post-16. It unpacks within-girl group differences as well as similarities between boys and girls in survey responses about physics. The analysis is based on the year 10 (age 15 years) responses of 5,034 students from 137 UK schools as learners of physics during the academic year 2008-2009. A comparison between boys and girls indicates the pervasiveness of gender issues, with boys more likely to respond positively towards physics-specific constructs than girls. The analysis also indicates that girls and boys who expressed intentions to participate in physics post-16 gave similar responses towards their physics teachers and physics lessons and had comparable physics extrinsic motivation. Girls (regardless of their intention to participate in physics) were less likely than boys to be encouraged to study physics post-16 by teachers, family and friends. Despite this, a subset of girls still intended to study physics post-16. The crucial difference between the girls who intended to study physics post-16 and those who did not is that the former had higher physics extrinsic motivation, more positive perceptions of physics teachers and lessons, greater competitiveness and a tendency to be less extroverted. This strongly suggests that higher extrinsic motivation in physics could be the crucial underlying key that encourages a subset of girls (as well as boys) to pursue physics post-16.

  6. The clustering of galaxies in the SDSS-III Baryon Oscillation Spectroscopic Survey: cosmological implications of the large-scale two-point correlation function

    NASA Astrophysics Data System (ADS)

    Sánchez, Ariel G.; Scóccola, C. G.; Ross, A. J.; Percival, W.; Manera, M.; Montesano, F.; Mazzalay, X.; Cuesta, A. J.; Eisenstein, D. J.; Kazin, E.; McBride, C. K.; Mehta, K.; Montero-Dorta, A. D.; Padmanabhan, N.; Prada, F.; Rubiño-Martín, J. A.; Tojeiro, R.; Xu, X.; Magaña, M. Vargas; Aubourg, E.; Bahcall, N. A.; Bailey, S.; Bizyaev, D.; Bolton, A. S.; Brewington, H.; Brinkmann, J.; Brownstein, J. R.; Gott, J. Richard; Hamilton, J. C.; Ho, S.; Honscheid, K.; Labatie, A.; Malanushenko, E.; Malanushenko, V.; Maraston, C.; Muna, D.; Nichol, R. C.; Oravetz, D.; Pan, K.; Ross, N. P.; Roe, N. A.; Reid, B. A.; Schlegel, D. J.; Shelden, A.; Schneider, D. P.; Simmons, A.; Skibba, R.; Snedden, S.; Thomas, D.; Tinker, J.; Wake, D. A.; Weaver, B. A.; Weinberg, David H.; White, Martin; Zehavi, I.; Zhao, G.

    2012-09-01

    We obtain constraints on cosmological parameters from the spherically averaged redshift-space correlation function of the CMASS Data Release 9 (DR9) sample of the Baryon Oscillation Spectroscopic Survey (BOSS). We combine this information with additional data from recent cosmic microwave background (CMB), supernova and baryon acoustic oscillation measurements. Our results show no significant evidence of deviations from the standard flat Λ cold dark matter model, whose basic parameters can be specified by Ωm = 0.285 ± 0.009, 100 Ωb = 4.59 ± 0.09, ns = 0.961 ± 0.009, H0 = 69.4 ± 0.8 km s-1 Mpc-1 and σ8 = 0.80 ± 0.02. The CMB+CMASS combination sets tight constraints on the curvature of the Universe, with Ωk = -0.0043 ± 0.0049, and the tensor-to-scalar amplitude ratio, for which we find r < 0.16 at the 95 per cent confidence level (CL). These data show a clear signature of a deviation from scale invariance also in the presence of tensor modes, with ns < 1 at the 99.7 per cent CL. We derive constraints on the fraction of massive neutrinos of fν < 0.049 (95 per cent CL), implying a limit of ∑mν < 0.51 eV. We find no signature of a deviation from a cosmological constant from the combination of all data sets, with a constraint of wDE = -1.033 ± 0.073 when this parameter is assumed time-independent, and no evidence of a departure from this value when it is allowed to evolve as wDE(a) = w0 + wa(1 - a). The achieved accuracy on our cosmological constraints is a clear demonstration of the constraining power of current cosmological observations.
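
A back-of-the-envelope check connects the quoted fν limit to the ∑mν bound, assuming the standard relation ∑mν ≈ 93.14 eV · Ωνh² and that fν denotes the neutrino fraction of the dark-matter density (both conventions are assumptions here, not stated in the abstract):

```python
# Rough consistency check of the quoted bound sum(m_nu) < 0.51 eV,
# using central parameter values quoted above. Assumptions: f_nu is the
# neutrino fraction of the dark-matter density Omega_dm = Omega_m - Omega_b,
# and sum(m_nu) ~ 93.14 eV * Omega_nu * h^2.
Omega_m, Omega_b, h = 0.285, 0.0459, 0.694
f_nu_max = 0.049                             # 95 per cent CL upper limit

Omega_nu_h2 = f_nu_max * (Omega_m - Omega_b) * h**2
sum_mnu_ev = 93.14 * Omega_nu_h2
print(f"sum m_nu < {sum_mnu_ev:.2f} eV")     # lands close to the quoted 0.51 eV
```

The small residual difference from the published 0.51 eV reflects rounding of the central values and the fact that the paper's limit comes from the full likelihood rather than this single relation.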

  7. Condition Monitoring of Large-Scale Facilities

    NASA Technical Reports Server (NTRS)

    Hall, David L.

    1999-01-01

    This document provides a summary of the research conducted for the NASA Ames Research Center under grant NAG2-1182 (Condition-Based Monitoring of Large-Scale Facilities). The information includes copies of view graphs presented at NASA Ames in the final Workshop (held during December of 1998), as well as a copy of a technical report provided to the COTR (Dr. Anne Patterson-Hine) subsequent to the workshop. The material describes the experimental design, collection of data, and analysis results associated with monitoring the health of large-scale facilities. In addition to this material, a copy of the Pennsylvania State University Applied Research Laboratory data fusion visual programming tool kit was also provided to NASA Ames researchers.

  8. CHINA HEALTH AND NUTRITION SURVEY

    EPA Science Inventory

    The China Health and Nutrition Survey is designed to examine the effects of health, nutrition, and family planning policies and programs as they have been implemented by national and local governments. It is designed to examine how both the social and economic transformation of C...

  9. MEDICARE HEALTH OUTCOMES SURVEY (HOS)

    EPA Science Inventory

    The Medicare Health Outcomes Survey (HOS) is the first Medicare managed care outcomes measure. CMS, in collaboration with NCQA, launched the Medicare HOS in the 1998 Health Plan Employer Data and Information Set (HEDIS®). The measure includes the most recent advances in summarizi...

  10. An Integrative Structural Health Monitoring System for the Local/Global Responses of a Large-Scale Irregular Building under Construction

    PubMed Central

    Park, Hyo Seon; Shin, Yunah; Choi, Se Woon; Kim, Yousok

    2013-01-01

    In this study, a practical and integrative SHM system was developed and applied to a large-scale irregular building under construction, where many challenging issues exist. In the proposed sensor network, customized energy-efficient wireless sensing units (sensor nodes, repeater nodes, and master nodes) were employed and comprehensive communications from the sensor node to the remote monitoring server were conducted through wireless communications. The long-term (13-month) monitoring results recorded from a large number of sensors (75 vibrating wire strain gauges, 10 inclinometers, and three laser displacement sensors) indicated that the construction event exhibiting the largest influence on structural behavior was the removal of bents that were temporarily installed to support the free end of the cantilevered members during their construction. The safety of each member could be confirmed based on the quantitative evaluation of each response. Furthermore, it was also confirmed that the relation between these responses (i.e., deflection, strain, and inclination) can provide information about the global behavior of structures induced from specific events. Analysis of the measurement results demonstrates that the proposed sensor network system is capable of automatic and real-time monitoring and can be applied and utilized for both the safety evaluation and precise implementation of buildings under construction. PMID:23860317

  11. An integrative structural health monitoring system for the local/global responses of a large-scale irregular building under construction.

    PubMed

    Park, Hyo Seon; Shin, Yunah; Choi, Se Woon; Kim, Yousok

    2013-01-01

    In this study, a practical and integrative SHM system was developed and applied to a large-scale irregular building under construction, where many challenging issues exist. In the proposed sensor network, customized energy-efficient wireless sensing units (sensor nodes, repeater nodes, and master nodes) were employed and comprehensive communications from the sensor node to the remote monitoring server were conducted through wireless communications. The long-term (13-month) monitoring results recorded from a large number of sensors (75 vibrating wire strain gauges, 10 inclinometers, and three laser displacement sensors) indicated that the construction event exhibiting the largest influence on structural behavior was the removal of bents that were temporarily installed to support the free end of the cantilevered members during their construction. The safety of each member could be confirmed based on the quantitative evaluation of each response. Furthermore, it was also confirmed that the relation between these responses (i.e., deflection, strain, and inclination) can provide information about the global behavior of structures induced from specific events. Analysis of the measurement results demonstrates that the proposed sensor network system is capable of automatic and real-time monitoring and can be applied and utilized for both the safety evaluation and precise implementation of buildings under construction. PMID:23860317

  12. Microfluidic large-scale integration.

    PubMed

    Thorsen, Todd; Maerkl, Sebastian J; Quake, Stephen R

    2002-10-18

    We developed high-density microfluidic chips that contain plumbing networks with thousands of micromechanical valves and hundreds of individually addressable chambers. These fluidic devices are analogous to electronic integrated circuits fabricated using large-scale integration. A key component of these networks is the fluidic multiplexor, which is a combinatorial array of binary valve patterns that exponentially increases the processing power of a network by allowing complex fluid manipulations with a minimal number of inputs. We used these integrated microfluidic networks to construct the microfluidic analog of a comparator array and a microfluidic memory storage device whose behavior resembles random-access memory. PMID:12351675
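
The multiplexor's exponential gain can be sketched in a few lines: giving each flow channel a binary address means n channels need only 2·log2(n) control lines, since each address bit requires one control line for the bit and one for its complement (the function names below are illustrative, not from the paper):

```python
from math import log2

# Sketch of the combinatorial addressing behind a fluidic multiplexor:
# each flow channel carries a binary address; each address bit is served by
# a pair of control lines (bit and complement), so n flow channels are
# addressed with only 2 * log2(n) control lines.
def control_lines_needed(n_flow_channels: int) -> int:
    return 2 * int(log2(n_flow_channels))

def valve_pattern(channel: int, bits: int) -> list[str]:
    # The binary address of `channel`, one entry per address bit; the
    # multiplexor actuates the complementary line of each bit so that
    # only this channel remains open.
    return [f"bit{b}={(channel >> b) & 1}" for b in range(bits)]

print(control_lines_needed(1024))   # 20 control lines address 1024 chambers
print(valve_pattern(5, 4))
```

This logarithmic scaling is what the abstract means by the multiplexor "exponentially" increasing the processing power of a network for a minimal number of inputs.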

  13. A large-scale field study examining effects of exposure to clothianidin seed-treated canola on honey bee colony health, development, and overwintering success.

    PubMed

    Cutler, G Christopher; Scott-Dupree, Cynthia D; Sultan, Maryam; McFarlane, Andrew D; Brewer, Larry

    2014-01-01

    In summer 2012, we initiated a large-scale field experiment in southern Ontario, Canada, to determine whether exposure to clothianidin seed-treated canola (oil seed rape) has any adverse impacts on honey bees. Colonies were placed in clothianidin seed-treated or control canola fields during bloom, and thereafter were moved to an apiary with no surrounding crops grown from seeds treated with neonicotinoids. Colony weight gain, honey production, pest incidence, bee mortality, number of adults, and amount of sealed brood were assessed in each colony throughout summer and autumn. Samples of honey, beeswax, pollen, and nectar were regularly collected, and samples were analyzed for clothianidin residues. Several of these endpoints were also measured in spring 2013. Overall, colonies were vigorous during and after the exposure period, and we found no effects of exposure to clothianidin seed-treated canola on any endpoint measures. Bees foraged heavily on the test fields during peak bloom and residue analysis indicated that honey bees were exposed to low levels (0.5-2 ppb) of clothianidin in pollen. Low levels of clothianidin were detected in a few pollen samples collected toward the end of the bloom from control hives, illustrating the difficulty of conducting a perfectly controlled field study with free-ranging honey bees in agricultural landscapes. Overwintering success did not differ significantly between treatment and control hives, and was similar to overwintering colony loss rates reported for the winter of 2012-2013 for beekeepers in Ontario and Canada. Our results suggest that exposure to canola grown from seed treated with clothianidin poses low risk to honey bees. PMID:25374790

  14. A large-scale field study examining effects of exposure to clothianidin seed-treated canola on honey bee colony health, development, and overwintering success

    PubMed Central

    Scott-Dupree, Cynthia D.; Sultan, Maryam; McFarlane, Andrew D.; Brewer, Larry

    2014-01-01

    In summer 2012, we initiated a large-scale field experiment in southern Ontario, Canada, to determine whether exposure to clothianidin seed-treated canola (oil seed rape) has any adverse impacts on honey bees. Colonies were placed in clothianidin seed-treated or control canola fields during bloom, and thereafter were moved to an apiary with no surrounding crops grown from seeds treated with neonicotinoids. Colony weight gain, honey production, pest incidence, bee mortality, number of adults, and amount of sealed brood were assessed in each colony throughout summer and autumn. Samples of honey, beeswax, pollen, and nectar were regularly collected, and samples were analyzed for clothianidin residues. Several of these endpoints were also measured in spring 2013. Overall, colonies were vigorous during and after the exposure period, and we found no effects of exposure to clothianidin seed-treated canola on any endpoint measures. Bees foraged heavily on the test fields during peak bloom and residue analysis indicated that honey bees were exposed to low levels (0.5–2 ppb) of clothianidin in pollen. Low levels of clothianidin were detected in a few pollen samples collected toward the end of the bloom from control hives, illustrating the difficulty of conducting a perfectly controlled field study with free-ranging honey bees in agricultural landscapes. Overwintering success did not differ significantly between treatment and control hives, and was similar to overwintering colony loss rates reported for the winter of 2012–2013 for beekeepers in Ontario and Canada. Our results suggest that exposure to canola grown from seed treated with clothianidin poses low risk to honey bees. PMID:25374790

  15. Large scale topography of Io

    NASA Technical Reports Server (NTRS)

    Gaskell, R. W.; Synnott, S. P.

    1987-01-01

    To investigate the large-scale topography of the Jovian satellite Io, both limb observations and stereographic techniques applied to landmarks are used. The raw data for this study consist of Voyager 1 images of Io: 800 × 800 arrays of picture elements, each of which can take on 256 possible brightness values. In analyzing these data it was necessary to identify and locate landmarks and limb points on the raw images, remove the image distortions caused by the camera electronics, and translate the corrected locations into positions relative to a reference geoid. Minimizing the uncertainty in the corrected locations is crucial to the success of this project. In the highest-resolution frames, an error of a tenth of a pixel in image-space location can lead to a 300 m error in true location. In the lowest-resolution frames, the same error can lead to an uncertainty of several km.

  16. Challenges for Large Scale Simulations

    NASA Astrophysics Data System (ADS)

    Troyer, Matthias

    2010-03-01

    With computational approaches becoming ubiquitous, the growing impact of large-scale computing on research influences both theoretical and experimental work. I will review a few examples in condensed matter physics and quantum optics, including the impact of computer simulations in the search for supersolidity, thermometry in ultracold quantum gases, and the challenging search for novel phases in strongly correlated electron systems. While only a decade ago such simulations needed the fastest supercomputers, many simulations can now be performed on small workstation clusters or even a laptop: what was previously restricted to a few experts can now potentially be used by many. Only part of the gain in computational capabilities is due to Moore's law and improvement in hardware. Equally impressive is the performance gain due to new algorithms, as I will illustrate using some recently developed algorithms. At the same time, modern peta-scale supercomputers offer unprecedented computational power and allow us to tackle new problems and address questions that were impossible to solve numerically only a few years ago. While there is a roadmap for future hardware developments to exascale and beyond, the main challenges are on the algorithmic and software infrastructure side. Among the problems that face the computational physicist are: the development of new algorithms that scale to thousands of cores and beyond; a software infrastructure that lifts code development to a higher level and speeds up the development of new simulation programs for large-scale computing machines; tools to analyze the large volume of data obtained from such simulations; and, as an emerging field, provenance-aware software that aims for reproducibility of the complete computational workflow from model parameters to the final figures. Interdisciplinary collaborations and collective efforts will be required, in contrast to the cottage-industry culture currently present in many areas of computational

  17. California's "5 a day--for better health!" campaign: an innovative population-based effort to effect large-scale dietary change.

    PubMed

    Foerster, S B; Kizer, K W; Disogra, L K; Bal, D G; Krieg, B F; Bunch, K L

    1995-01-01

    The annual toll of diet-related diseases in the United States is similar to that taken by tobacco, but less progress has been achieved in reaching the Public Health Service's Healthy People 2000 objectives for improving food consumption than for reducing tobacco use. In 1988, the California Department of Health Services embarked upon an innovative multi-year social marketing program to increase fruit and vegetable consumption. The 5 a Day--for Better Health! Campaign had several distinctive features, including its simple, positive, behavior-specific message to eat 5 servings of fruits and vegetables every day as part of a low-fat, high fiber diet; its use of mass media; its partnership between the state health department and the produce and supermarket industries; and its extensive use of point-of-purchase messages. Over its nearly three years of operation in California, the 5 a Day Campaign appears to have raised public awareness that fruits and vegetables help reduce cancer risk, increased fruit and vegetable consumption in major population segments, and created an ongoing partnership between public health and agribusiness that has allowed extension of the campaign to other population segments, namely children and Latino adults. In 1991 the campaign was adopted as a national initiative by the National Cancer Institute and the Produce for Better Health Foundation. By 1994, over 700 industry organizations and 48 states, territories, and the District of Columbia were licensed to participate. Preventive medicine practitioners and others involved in health promotion may build upon the 5 a Day Campaign experience in developing and implementing efforts to reach the nation's dietary goals. PMID:7632448

  18. National Adolescent Student Health Survey.

    ERIC Educational Resources Information Center

    Health Education (Washington D.C.), 1988

    1988-01-01

    Results are reported from a national survey of teenaged youth on their attitudes toward a variety of health-related issues. Topics covered were Acquired Immune Deficiency Syndrome, sexually transmitted diseases, violence, suicide, injury prevention, drug abuse, nutrition, and consumer education. (JD)

  19. HEALTH AND DIET SURVEY (HDS)

    EPA Science Inventory

    The FDA conducts this periodic omnibus survey of American consumers to track consumer attitudes, knowledge, and reported behaviors related to diet and health issues, including cholesterol, awareness of diet-disease risk factors, food label use, dietary supplement use, and awarenes...

  20. The NIHR collaboration for leadership in applied health research and care (CLAHRC) for greater manchester: combining empirical, theoretical and experiential evidence to design and evaluate a large-scale implementation strategy

    PubMed Central

    2011-01-01

    Background In response to policy recommendations, nine National Institute for Health Research (NIHR) Collaborations for Leadership in Applied Health Research and Care (CLAHRCs) were established in England in 2008, aiming to create closer working between the health service and higher education and narrow the gap between research and its implementation in practice. The Greater Manchester (GM) CLAHRC is a partnership between the University of Manchester and twenty National Health Service (NHS) trusts, with a five-year mission to improve healthcare and reduce health inequalities for people with cardiovascular conditions. This paper outlines the GM CLAHRC approach to designing and evaluating a large-scale, evidence- and theory-informed, context-sensitive implementation programme. Discussion The paper makes a case for embedding evaluation within the design of the implementation strategy. Empirical, theoretical, and experiential evidence relating to implementation science and methods has been synthesised to formulate eight core principles of the GM CLAHRC implementation strategy, recognising the multi-faceted nature of evidence, the complexity of the implementation process, and the corresponding need to apply approaches that are situationally relevant, responsive, flexible, and collaborative. In turn, these core principles inform the selection of four interrelated building blocks upon which the GM CLAHRC approach to implementation is founded. These determine the organizational processes, structures, and roles utilised by specific GM CLAHRC implementation projects, as well as the approach to researching implementation, and comprise: the Promoting Action on Research Implementation in Health Services (PARIHS) framework; a modified version of the Model for Improvement; multiprofessional teams with designated roles to lead, facilitate, and support the implementation process; and embedded evaluation and learning. Summary Designing and evaluating a large-scale implementation

  1. Large Scale Homing in Honeybees

    PubMed Central

    Pahl, Mario; Zhu, Hong; Tautz, Jürgen; Zhang, Shaowu

    2011-01-01

    Honeybee foragers frequently fly several kilometres to and from vital resources, and communicate those locations to their nest mates by a symbolic dance language. Research has shown that they achieve this feat by memorizing landmarks and the skyline panorama, using the sun and polarized skylight as compasses, and by integrating their outbound flight paths. To investigate the limits of the honeybees' homing abilities, we artificially displaced foragers to novel release spots at various distances up to 13 km in the four cardinal directions. Returning bees were individually registered by a radio frequency identification (RFID) system at the hive entrance. We found that homing rate, homing speed and the maximum homing distance depend on the release direction. Bees released in the east were more likely to find their way back home, and returned faster than bees released in any other direction, due to the familiarity of global landmarks seen from the hive. Our findings suggest that such large-scale homing is facilitated by global landmarks acting as beacons, and possibly by the entire skyline panorama. PMID:21602920

  2. Local gravity and large-scale structure

    NASA Technical Reports Server (NTRS)

    Juszkiewicz, Roman; Vittorio, Nicola; Wyse, Rosemary F. G.

    1990-01-01

    The magnitude and direction of the observed dipole anisotropy of the galaxy distribution can in principle constrain the amount of large-scale power present in the spectrum of primordial density fluctuations. This paper confronts the data, provided by a recent redshift survey of galaxies detected by the IRAS satellite, with the predictions of two cosmological models with very different levels of large-scale power: the biased Cold Dark Matter dominated model (CDM) and a baryon-dominated model (BDM) with isocurvature initial conditions. Model predictions are investigated for the Local Group peculiar velocity, v(R), induced by mass inhomogeneities distributed out to a given radius, R, for R less than about 10,000 km/s. Several convergence measures for v(R) are developed, which can become powerful cosmological tests when deep enough samples become available. For the present data sets, the CDM and BDM predictions are indistinguishable at the 2 sigma level and both are consistent with observations. A promising discriminant between cosmological models is the misalignment angle between v(R) and the apex of the dipole anisotropy of the microwave background.
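
The quantity v(R) referred to above is, in linear perturbation theory, the standard integral of the density contrast δ interior to radius R (this expression is textbook cosmology, not spelled out in the abstract):

```latex
\mathbf{v}(R) \;=\; \frac{H_0\, f(\Omega_m)}{4\pi}
\int_{r<R} \delta(\mathbf{r})\, \frac{\hat{\mathbf{r}}}{r^{2}}\, \mathrm{d}^{3}r ,
\qquad f(\Omega_m) \simeq \Omega_m^{0.6}
```

How quickly v(R) converges as R grows is the discriminant the authors exploit: a spectrum with substantial large-scale power keeps contributing to the integral out to large R, whereas in a low-power model v(R) saturates at small radii.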

  3. Individual Skill Differences and Large-Scale Environmental Learning

    ERIC Educational Resources Information Center

    Fields, Alexa W.; Shelton, Amy L.

    2006-01-01

    Spatial skills are known to vary widely among normal individuals. This project was designed to address whether these individual differences are differentially related to large-scale environmental learning from route (ground-level) and survey (aerial) perspectives. Participants learned two virtual environments (route and survey) with limited…

  4. Large Scale Nanolaminate Deformable Mirror

    SciTech Connect

    Papavasiliou, A; Olivier, S; Barbee, T; Miles, R; Chang, K

    2005-11-30

    This work concerns the development of a technology that uses Nanolaminate foils to form light-weight, deformable mirrors that are scalable over a wide range of mirror sizes. While MEMS-based deformable mirrors and spatial light modulators have considerably reduced the cost and increased the capabilities of adaptive optic systems, there has not been a way to utilize the advantages of lithography and batch-fabrication to produce large-scale deformable mirrors. This technology is made scalable by using fabrication techniques and lithography that are not limited to the sizes of conventional MEMS devices. Like many MEMS devices, these mirrors use parallel plate electrostatic actuators. This technology replicates that functionality by suspending a horizontal piece of nanolaminate foil over an electrode by electroplated nickel posts. This actuator is attached, with another post, to another nanolaminate foil that acts as the mirror surface. Most MEMS devices are produced with integrated circuit lithography techniques that are capable of very small line widths, but are not scalable to large sizes. This technology is very tolerant of lithography errors and can use coarser, printed circuit board lithography techniques that can be scaled to very large sizes. These mirrors use small, lithographically defined actuators and thin nanolaminate foils allowing them to produce deformations over a large area while minimizing weight. This paper will describe a staged program to develop this technology. First-principles models were developed to determine design parameters. Three stages of fabrication will be described starting with a 3 x 3 device using conventional metal foils and epoxy to a 10-across all-metal device with nanolaminate mirror surfaces.
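The parallel-plate electrostatic actuators named above follow the standard textbook force law F = ε0·A·V²/(2g²). A minimal sketch of that relation; the drive voltage, electrode area, and gap below are illustrative assumptions, not the paper's design values:

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def plate_force(voltage, area, gap):
    """Attractive force between parallel-plate electrodes:
    F = eps0 * A * V**2 / (2 * g**2)."""
    return EPS0 * area * voltage ** 2 / (2 * gap ** 2)

# Illustrative (assumed) actuator geometry: 1 mm^2 electrode, 10 um gap, 50 V drive.
force = plate_force(50.0, 1e-6, 10e-6)  # ~0.11 mN
# Note: electrostatic pull-in instability limits stable travel of such
# actuators to roughly one-third of the gap (standard MEMS result).
```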

  5. Large-Scale Information Systems

    SciTech Connect

    D. M. Nicol; H. R. Ammerlahn; M. E. Goldsby; M. M. Johnson; D. E. Rhodes; A. S. Yoshimura

    2000-12-01

Large enterprises are ever more dependent on their Large-Scale Information Systems (LSIS), computer systems that are distinguished architecturally by distributed components--data sources, networks, computing engines, simulations, human-in-the-loop control and remote access stations. These systems provide such capabilities as workflow, data fusion and distributed database access. The Nuclear Weapons Complex (NWC) contains many examples of LSIS components, a fact that motivates this research. However, most LSIS in use grew up from collections of separate subsystems that were not designed to be components of an integrated system. For this reason, they are often difficult to analyze and control. The problem is made more difficult by the size of a typical system, its diversity of information sources, and the institutional complexities associated with its geographic distribution across the enterprise. Moreover, there is no integrated approach for analyzing or managing such systems. Indeed, integrated development of LSIS is an active area of academic research. This work developed such an approach by simulating the various components of the LSIS and allowing the simulated components to interact with real LSIS subsystems. This research demonstrated two benefits. First, applying it to a particular LSIS provided a thorough understanding of the interfaces between the system's components. Second, it demonstrated how more rapid and detailed answers could be obtained to questions significant to the enterprise by interacting with the relevant LSIS subsystems through simulated components designed with those questions in mind. In a final, added phase of the project, investigations were made on extending this research to wireless communication networks in support of telemetry applications.

  6. Psychological Resilience after Hurricane Sandy: The Influence of Individual- and Community-Level Factors on Mental Health after a Large-Scale Natural Disaster

    PubMed Central

    Lowe, Sarah R.; Sampson, Laura; Gruebner, Oliver; Galea, Sandro

    2015-01-01

    Several individual-level factors are known to promote psychological resilience in the aftermath of disasters. Far less is known about the role of community-level factors in shaping postdisaster mental health. The purpose of this study was to explore the influence of both individual- and community-level factors on resilience after Hurricane Sandy. A representative sample of household residents (N = 418) from 293 New York City census tracts that were most heavily affected by the storm completed telephone interviews approximately 13–16 months postdisaster. Multilevel multivariable models explored the independent and interactive contributions of individual- and community-level factors to posttraumatic stress and depression symptoms. At the individual-level, having experienced or witnessed any lifetime traumatic event was significantly associated with higher depression and posttraumatic stress, whereas demographic characteristics (e.g., older age, non-Hispanic Black race) and more disaster-related stressors were significantly associated with higher posttraumatic stress only. At the community-level, living in an area with higher social capital was significantly associated with higher posttraumatic stress. Additionally, higher community economic development was associated with lower risk of depression only among participants who did not experience any disaster-related stressors. These results provide evidence that individual- and community-level resources and exposure operate in tandem to shape postdisaster resilience. PMID:25962178

  7. Large-Scale Astrophysical Visualization on Smartphones

    NASA Astrophysics Data System (ADS)

    Becciani, U.; Massimino, P.; Costa, A.; Gheller, C.; Grillo, A.; Krokos, M.; Petta, C.

    2011-07-01

    Nowadays digital sky surveys and long-duration, high-resolution numerical simulations using high performance computing and grid systems produce multidimensional astrophysical datasets in the order of several Petabytes. Sharing visualizations of such datasets within communities and collaborating research groups is of paramount importance for disseminating results and advancing astrophysical research. Moreover educational and public outreach programs can benefit greatly from novel ways of presenting these datasets by promoting understanding of complex astrophysical processes, e.g., formation of stars and galaxies. We have previously developed VisIVO Server, a grid-enabled platform for high-performance large-scale astrophysical visualization. This article reviews the latest developments on VisIVO Web, a custom designed web portal wrapped around VisIVO Server, then introduces VisIVO Smartphone, a gateway connecting VisIVO Web and data repositories for mobile astrophysical visualization. We discuss current work and summarize future developments.

  8. Evaluating effective reaction rates of kinetically driven solutes in large-scale, anisotropic media: human health risk implications in CO2 leakage

    NASA Astrophysics Data System (ADS)

    Siirila, E. R.; Maxwell, R. M.

    2011-12-01

    The role of high and low hydraulic conductivity (K) regions in heterogeneous, stratified and non-stratified flow fields and the subsequent effect of rate dependent geochemical reactions are investigated with regards to mobilized arsenic from CO2 leakage at a Carbon Capture and Storage (CCS) site. Following the methodology of previous work, human health risk is used as an endpoint for comparison via a two-stage or nested Monte Carlo scheme, explicitly considering joint uncertainty and variability for a hypothetical population of individuals. This study identifies geo-hydrologic conditions where solute reactions are either rate limited (non-reactive), in equilibrium (linear equilibrium assumption, LEA, is appropriate), or are sensitive to time-dependent kinetic reaction rates. Potential interplay between multiple parameters (i.e. positive or negative feedbacks) is shown utilizing stochastic ensembles. In particular, the effect of preferential flow pathways and solute mixing on the field-scale (macrodispersion) and sub-grid (local dispersion) is examined for varying degrees of stratification and regional groundwater velocities. Results show effective reaction rates of kinetic ensembles are dissimilar from LEA ensembles with the inclusion of local dispersion, resulting in an additive tailing effect of the solute plume, a retarded peak time, and an increased cancer risk. This discrepancy between kinetic and LEA ensembles is augmented in highly anisotropic media, especially at intermediate regional groundwater velocities. The distribution, magnitude, and associated uncertainty of cancer risk are controlled by these factors, but are also strongly dependent on the regional groundwater velocity. We demonstrate a higher associated uncertainty of cancer risk in stratified domains is linked to higher aquifer connectivity and less macrodispersion in the flow field. This study has implications in CCS site selection and groundwater driven risk assessment modeling.

  9. Accuracy of Electronic Health Record Data for Identifying Stroke Cases in Large-Scale Epidemiological Studies: A Systematic Review from the UK Biobank Stroke Outcomes Group

    PubMed Central

    Woodfield, Rebecca; Grant, Ian; Sudlow, Cathie L. M.

    2015-01-01

    Objective Long-term follow-up of population-based prospective studies is often achieved through linkages to coded regional or national health care data. Our knowledge of the accuracy of such data is incomplete. To inform methods for identifying stroke cases in UK Biobank (a prospective study of 503,000 UK adults recruited in middle-age), we systematically evaluated the accuracy of these data for stroke and its main pathological types (ischaemic stroke, intracerebral haemorrhage, subarachnoid haemorrhage), determining the optimum codes for case identification. Methods We sought studies published from 1990-November 2013, which compared coded data from death certificates, hospital admissions or primary care with a reference standard for stroke or its pathological types. We extracted information on a range of study characteristics and assessed study quality with the Quality Assessment of Diagnostic Studies tool (QUADAS-2). To assess accuracy, we extracted data on positive predictive values (PPV) and—where available—on sensitivity, specificity, and negative predictive values (NPV). Results 37 of 39 eligible studies assessed accuracy of International Classification of Diseases (ICD)-coded hospital or death certificate data. They varied widely in their settings, methods, reporting, quality, and in the choice and accuracy of codes. Although PPVs for stroke and its pathological types ranged from 6–97%, appropriately selected, stroke-specific codes (rather than broad cerebrovascular codes) consistently produced PPVs >70%, and in several studies >90%. The few studies with data on sensitivity, specificity and NPV showed higher sensitivity of hospital versus death certificate data for stroke, with specificity and NPV consistently >96%. Few studies assessed either primary care data or combinations of data sources. Conclusions Particular stroke-specific codes can yield high PPVs (>90%) for stroke/stroke types. Inclusion of primary care data and combining data sources should
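The accuracy measures the review extracts all derive from a standard 2x2 validation table comparing coded cases against a reference standard (e.g. chart review). A minimal sketch with invented counts:

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """PPV, NPV, sensitivity and specificity from a 2x2 validation table
    (true/false positives and negatives of the coded data)."""
    return {
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
    }

# Hypothetical validation of a stroke-specific ICD code (counts are invented).
m = diagnostic_accuracy(tp=90, fp=10, fn=30, tn=870)
# PPV 0.90, sensitivity 0.75, specificity ~0.99, NPV ~0.97
```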

  10. Curvature constraints from large scale structure

    NASA Astrophysics Data System (ADS)

    Di Dio, Enea; Montanari, Francesco; Raccanelli, Alvise; Durrer, Ruth; Kamionkowski, Marc; Lesgourgues, Julien

    2016-06-01

    We modified the CLASS code in order to include relativistic galaxy number counts in spatially curved geometries; we present the formalism and study the effect of relativistic corrections on spatial curvature. The new version of the code is now publicly available. Using a Fisher matrix analysis, we investigate how measurements of the spatial curvature parameter ΩK with future galaxy surveys are affected by relativistic effects, which influence observations of the large scale galaxy distribution. These effects include contributions from cosmic magnification, Doppler terms and terms involving the gravitational potential. As an application, we consider angle and redshift dependent power spectra, which are especially well suited for model independent cosmological constraints. We compute our results for a representative deep, wide and spectroscopic survey, and our results show the impact of relativistic corrections on spatial curvature parameter estimation. We show that constraints on the curvature parameter may be strongly biased if, in particular, cosmic magnification is not included in the analysis. Other relativistic effects turn out to be subdominant in the studied configuration. We analyze how the shift in the estimated best-fit value for the curvature and other cosmological parameters depends on the magnification bias parameter, and find that significant biases are to be expected if this term is not properly considered in the analysis.
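A Fisher matrix analysis of the kind used above forecasts parameter constraints from derivatives of the observables: for a Gaussian likelihood with independent errors, F_ij = Σ_k (∂O_k/∂θ_i)(∂O_k/∂θ_j)/σ_k², and the marginalized 1σ error on θ_i is sqrt((F⁻¹)_ii). A toy sketch with a two-parameter linear model, not the CLASS-based angular power spectra of the paper:

```python
def fisher_matrix(model, theta, sigmas, eps=1e-6):
    """F_ij = sum_k dO_k/dtheta_i * dO_k/dtheta_j / sigma_k**2,
    with derivatives taken by central finite differences."""
    n = len(theta)
    def deriv(i):
        up, dn = list(theta), list(theta)
        up[i] += eps
        dn[i] -= eps
        return [(a - b) / (2 * eps) for a, b in zip(model(up), model(dn))]
    d = [deriv(i) for i in range(n)]
    return [[sum(d[i][k] * d[j][k] / sigmas[k] ** 2
                 for k in range(len(sigmas)))
             for j in range(n)] for i in range(n)]

# Toy 'observable': two data points depending linearly on (theta0, theta1).
model = lambda t: [t[0] + t[1], t[0] - t[1]]
F = fisher_matrix(model, [1.0, 0.5], [0.1, 0.1])  # ~[[200, 0], [0, 200]]
```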

  11. Young women's reproductive health survey.

    PubMed

    Lewis, H

    1987-08-12

    A survey of reproductive health issues was conducted on 15 year old Hutt Valley secondary school girls by means of a self-administered anonymous questionnaire. The prevalence of sexual intercourse in the sample was 29%. Sixteen percent of the sexually active respondents used no method of contraception. Knowledge of reproductive health facts and contraception was poor both amongst sexually experienced and inexperienced respondents. Twenty-six percent relied on peers for this information, with mothers, teachers and books being other important sources cited. Respondents requested more information on sexually transmitted diseases, contraception and sexual relationships. Most would like this information more readily accessible. Preferred sources of information mentioned were: parents, books, films/videos, family planning clinics and friends. PMID:3455514

  12. THE OBSERVATIONS OF REDSHIFT EVOLUTION IN LARGE-SCALE ENVIRONMENTS (ORELSE) SURVEY. I. THE SURVEY DESIGN AND FIRST RESULTS ON CL 0023+0423 AT z = 0.84 AND RX J1821.6+6827 AT z = 0.82

    SciTech Connect

    Lubin, L. M.; Lemaux, B. C.; Kocevski, D. D.; Gal, R. R.; Squires, G. K.

    2009-06-15

We present the Observations of Redshift Evolution in Large-Scale Environments (ORELSE) Survey, a systematic search for structure on scales greater than 10 h_70^-1 Mpc around 20 well-known clusters at redshifts of 0.6 < z < 1.3. The goal of the survey is to examine a statistical sample of dynamically active clusters and large-scale structures in order to quantify galaxy properties over the full range of local and global environments. We describe the survey design, the cluster sample, and our extensive observational data covering at least 25' around each target cluster. We use adaptively smoothed red galaxy density maps from our wide-field optical imaging to identify candidate groups/clusters and intermediate-density large-scale filaments/walls in each cluster field. Because photometric techniques (such as photometric redshifts, statistical overdensities, and richness estimates) can be highly uncertain, the crucial component of this survey is the unprecedented amount of spectroscopic coverage. We are using the wide-field, multiobject spectroscopic capabilities of the Deep Multiobject Imaging Spectrograph to obtain 100-200+ confirmed cluster members in each field. Our survey has already discovered the Cl 1604 supercluster at z ≈ 0.9, a structure which contains at least eight groups and clusters and spans 13 Mpc × 100 Mpc. Here, we present the results on the large-scale environments of two additional clusters, Cl 0023+0423 at z = 0.84 and RX J1821.6+6827 at z = 0.82, which highlight the diversity of global properties at these redshifts. The optically selected Cl 0023+0423 is a four-way group-group merger with constituent groups having measured velocity dispersions between 206 and 479 km s^-1. The galaxy population is dominated by blue, star-forming galaxies, with 80% of the confirmed members showing [O II] emission. The strength of the Hδ line in a composite spectrum of 138 members indicates a substantial contribution from recent

  13. The Observations of Redshift Evolution in Large-Scale Environments (ORELSE) Survey. I. The Survey Design and First Results on CL 0023+0423 at z = 0.84 and RX J1821.6+6827 at z = 0.82

    NASA Astrophysics Data System (ADS)

    Lubin, L. M.; Gal, R. R.; Lemaux, B. C.; Kocevski, D. D.; Squires, G. K.

    2009-06-01

We present the Observations of Redshift Evolution in Large-Scale Environments (ORELSE) Survey, a systematic search for structure on scales greater than 10 h_70^-1 Mpc around 20 well-known clusters at redshifts of 0.6 < z < 1.3. The goal of the survey is to examine a statistical sample of dynamically active clusters and large-scale structures in order to quantify galaxy properties over the full range of local and global environments. We describe the survey design, the cluster sample, and our extensive observational data covering at least 25' around each target cluster. We use adaptively smoothed red galaxy density maps from our wide-field optical imaging to identify candidate groups/clusters and intermediate-density large-scale filaments/walls in each cluster field. Because photometric techniques (such as photometric redshifts, statistical overdensities, and richness estimates) can be highly uncertain, the crucial component of this survey is the unprecedented amount of spectroscopic coverage. We are using the wide-field, multiobject spectroscopic capabilities of the Deep Multiobject Imaging Spectrograph to obtain 100-200+ confirmed cluster members in each field. Our survey has already discovered the Cl 1604 supercluster at z ≈ 0.9, a structure which contains at least eight groups and clusters and spans 13 Mpc × 100 Mpc. Here, we present the results on the large-scale environments of two additional clusters, Cl 0023+0423 at z = 0.84 and RX J1821.6+6827 at z = 0.82, which highlight the diversity of global properties at these redshifts. The optically selected Cl 0023+0423 is a four-way group-group merger with constituent groups having measured velocity dispersions between 206 and 479 km s^-1. The galaxy population is dominated by blue, star-forming galaxies, with 80% of the confirmed members showing [O II] emission. The strength of the Hδ line in a composite spectrum of 138 members indicates a substantial contribution from recent starbursts to the overall galaxy

  14. Large-Scale Reform Comes of Age

    ERIC Educational Resources Information Center

    Fullan, Michael

    2009-01-01

This article reviews the history of large-scale education reform and makes the case that large-scale or whole-system reform policies and strategies are becoming increasingly evident. The review briefly addresses the pre-1997 period, concluding that while pressure for reform was mounting, there were very few examples of deliberate or…

  15. Large-scale infrared scene projectors

    NASA Astrophysics Data System (ADS)

    Murray, Darin A.

    1999-07-01

Large-scale infrared scene projectors typically have unique opto-mechanical characteristics associated with their application. This paper outlines two large-scale zoom lens assemblies with different environmental and package constraints. Various challenges and their respective solutions are discussed and presented.

  16. The convergent validity of three surveys as alternative sources of health information to the 2011 UK census.

    PubMed

    Taylor, Joanna; Twigg, Liz; Moon, Graham

    2014-09-01

    Censuses have traditionally been a key source of localised information on the state of a nation's health. Many countries are now adopting alternative approaches to the traditional census, placing such information at risk. The purpose of this paper is to inform debate about whether existing social surveys could provide an adequate 'base' for alternative model-based small area estimates of health data in a post traditional census era. Using a case study of 2011 UK Census questions on self-assessed health and limiting long term illness, we examine the extent to which the results from three large-scale surveys - the Health Survey for England, the Crime Survey for England and Wales and the Integrated Household Survey - conform to census output. Particularly in the case of limiting long term illness, the question wording renders comparisons difficult. However, with the exception of the general health question from the Health Survey for England all three surveys meet tests for convergent validity. PMID:25016326

  17. Individual skill differences and large-scale environmental learning.

    PubMed

    Fields, Alexa W; Shelton, Amy L

    2006-05-01

    Spatial skills are known to vary widely among normal individuals. This project was designed to address whether these individual differences are differentially related to large-scale environmental learning from route (ground-level) and survey (aerial) perspectives. Participants learned two virtual environments (route and survey) with limited exposure and tested on judgments about relative locations of objects. They also performed a series of spatial and nonspatial component skill tests. With limited learning, performance after route encoding was worse than performance after survey encoding. Furthermore, performance after route and survey encoding appeared to be preferentially linked to perspective and object-based transformations, respectively. Together, the results provide clues to how different skills might be engaged by different individuals for the same goal of learning a large-scale environment. PMID:16719662

  18. The large-scale landslide risk classification in catchment scale

    NASA Astrophysics Data System (ADS)

    Liu, Che-Hsin; Wu, Tingyeh; Chen, Lien-Kuang; Lin, Sheng-Chi

    2013-04-01

The landslide disasters caused by Typhoon Morakot in 2009 resulted in heavy casualties, and the event is classified as a large-scale landslide on the basis of those casualty numbers. It also showed that surveys of large-scale landslide potential remain insufficient, even though such potential analysis indicates where attention should be focused. Accordingly, the authors reviewed the assessment methods used in different countries, such as Hong Kong, Italy, Japan and Switzerland, to clarify the methodology. The objects of assessment are areas susceptible to rock slides and dip slopes, together with the major landslide areas identified from historical records. Three scales of analysis, from country level down to slopeland, are required: basin, catchment, and slope. At the basin scale, ten spots were classified as having high large-scale landslide potential. The authors therefore focus on the catchment scale in this paper and employ a risk matrix to classify the potential. Two main indexes are used to classify large-scale landslide risk: the protected objects (constructions and transportation facilities) and the large-scale landslide susceptibility ratio, which is based on data for major landslide areas and for dip-slope and rock-slide areas. In total, 1,040 catchments were assessed and classified into high, medium, and low levels, comprising 11%, 51%, and 38% of catchments, respectively. This result identifies the catchments with a high proportion of protected objects or high large-scale landslide susceptibility, and serves as base material for the slopeland authorities when considering slopeland management and further investigation.
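The risk-matrix step described above combines two ordinal indexes into a single risk level per catchment. A minimal sketch, where the matrix entries and catchment data are assumptions for illustration, not the study's calibrated matrix:

```python
from collections import Counter

# Assumed 3x3 risk matrix: keys are (protected-objects index, susceptibility index).
MATRIX = {
    ("low", "low"): "low",     ("low", "medium"): "low",       ("low", "high"): "medium",
    ("medium", "low"): "low",  ("medium", "medium"): "medium", ("medium", "high"): "high",
    ("high", "low"): "medium", ("high", "medium"): "high",     ("high", "high"): "high",
}

def classify(protected_objects, susceptibility):
    """Look up a catchment's large-scale landslide risk level."""
    return MATRIX[(protected_objects, susceptibility)]

# Hypothetical catchments: (protected-objects index, susceptibility-ratio index).
catchments = [("high", "high"), ("medium", "low"), ("low", "medium")]
shares = Counter(classify(p, s) for p, s in catchments)  # level -> count
```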

  19. Large-scale investment in green space as an intervention for physical activity, mental and cardiometabolic health: study protocol for a quasi-experimental evaluation of a natural experiment

    PubMed Central

    Astell-Burt, Thomas; Feng, Xiaoqi; Kolt, Gregory S

    2016-01-01

Introduction ‘Green spaces’ such as public parks are regarded as determinants of health, but evidence tends to be based on cross-sectional designs. This protocol describes a study that will evaluate a large-scale investment in approximately 5280 hectares of green space stretching 27 km north to south in Western Sydney, Australia. Methods and analysis A Geographic Information System was used to identify 7272 participants in the 45 and Up Study baseline data (2006–2008) living within 5 km of the Western Sydney Parklands and some of the features that have been constructed since 2009, such as public access points, advertising billboards, walking and cycle tracks, BBQ stations, and children's playgrounds. These data were linked to information on a range of health and behavioural outcomes, with the second wave of data collection initiated by the Sax Institute in 2012 and expected to be completed by 2015. Multilevel models will be used to analyse potential change in physical activity, weight status, social contacts, mental and cardiometabolic health within a closed sample of residentially stable participants. Comparisons between persons with contrasting proximities to different areas of the Parklands will provide ‘treatment’ and ‘control’ groups within a ‘quasi-experimental’ study design. In line with expectations, baseline results prior to the enhancement of the Western Sydney Parklands indicated virtually no significant differences in the distribution of any of the outcomes with respect to proximity to green space preintervention. Ethics and dissemination Ethical approval was obtained for the 45 and Up Study from the University of New South Wales Human Research Ethics Committee. Ethics approval for this study was obtained from the University of Western Sydney Ethics Committee. Findings will be disseminated through partner organisations (the Western Sydney Parklands and the National Heart Foundation of Australia), as well as to policymakers in

  20. Synthesis of small and large scale dynamos

    NASA Astrophysics Data System (ADS)

    Subramanian, Kandaswamy

Using a closure model for the evolution of magnetic correlations, we uncover an interesting plausible saturated state of the small-scale fluctuation dynamo (SSD) and a novel analogy between quantum mechanical tunnelling and the generation of large-scale fields. Large-scale fields develop via the α-effect, but as magnetic helicity can only change on a resistive timescale, the time it takes to organize the field into large scales increases with magnetic Reynolds number. This is very similar to the results obtained from simulations using the full MHD equations.

  1. DESIGN OF LARGE-SCALE AIR MONITORING NETWORKS

    EPA Science Inventory

    The potential effects of air pollution on human health have received much attention in recent years. In the U.S. and other countries, there are extensive large-scale monitoring networks designed to collect data to inform the public of exposure risks to air pollution. A major crit...

  2. Considerations for Managing Large-Scale Clinical Trials.

    ERIC Educational Resources Information Center

    Tuttle, Waneta C.; And Others

    1989-01-01

    Research management strategies used effectively in a large-scale clinical trial to determine the health effects of exposure to Agent Orange in Vietnam are discussed, including pre-project planning, organization according to strategy, attention to scheduling, a team approach, emphasis on guest relations, cross-training of personnel, and preparing…

  3. Large-scale regions of antimatter

    SciTech Connect

    Grobov, A. V. Rubin, S. G.

    2015-07-15

A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  4. Washington State Survey of Adolescent Health Behaviors.

    ERIC Educational Resources Information Center

    Washington State Dept. of Social and Health Services, Olympia.

    The 1992 Washington State Survey of Adolescent Health Behaviors (WSSAHB) was created to collect information regarding a variety of adolescent health behaviors among students in the state of Washington. It expands on two previous administrations of a student tobacco, alcohol, and other drug survey and includes questions about medical care, safety,…

  5. NATIONAL EMPLOYER HEALTH INSURANCE SURVEY (NEHIS)

    EPA Science Inventory

    The National Employer Health Insurance Survey (NEHIS) was developed to produce estimates on employer-sponsored health insurance data in the United States. The NEHIS was the first Federal survey to represent all employers in the United States by State and obtain information on all...

  6. Estimating health expenditure shares from household surveys

    PubMed Central

    Brooks, Benjamin PC; Hanlon, Michael

    2013-01-01

    Abstract Objective To quantify the effects of household expenditure survey characteristics on the estimated share of a household’s expenditure devoted to health. Methods A search was conducted for all country surveys reporting data on health expenditure and total household expenditure. Data on total expenditure and health expenditure were extracted from the surveys to generate the health expenditure share (i.e. fraction of the household expenditure devoted to health). To do this the authors relied on survey microdata or survey reports to calculate the health expenditure share for the particular instrument involved. Health expenditure share was modelled as a function of the survey’s recall period, the number of health expenditure items, the number of total expenditure items, the data collection method and the placement of the health module within the survey. Data exists across space and time, so fixed effects for territory and year were included as well. The model was estimated by means of ordinary least squares regression with clustered standard errors. Findings A one-unit increase in the number of health expenditure questions was accompanied by a 1% increase in the estimated health expenditure share. A one-unit increase in the number of non-health expenditure questions resulted in a 0.2% decrease in the estimated share. Increasing the recall period by one month was accompanied by a 6% decrease in the health expenditure share. Conclusion The characteristics of a survey instrument examined in the study affect the estimate of the health expenditure share. Those characteristics need to be accounted for when comparing results across surveys within a territory and, ultimately, across territories. PMID:23825879
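The reported effects come from an OLS regression of the health expenditure share on survey characteristics. A single-regressor sketch of the underlying least-squares fit; the data points below are invented to mirror the sign of the recall-period effect, and the study's full model additionally includes territory and year fixed effects with clustered standard errors:

```python
def ols_slope(x, y):
    """Least-squares slope of y on x (simple regression with intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sum((a - mx) ** 2 for a in x)
    return num / den

# Hypothetical survey instruments: recall period in months vs. the health
# expenditure share each instrument yields (numbers are illustrative only).
recall = [1, 3, 6, 12]
share = [0.060, 0.052, 0.040, 0.018]
slope = ols_slope(recall, share)  # negative: longer recall, lower estimated share
```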

  7. Large-scale inhomogeneities and galaxy statistics

    NASA Technical Reports Server (NTRS)

    Schaeffer, R.; Silk, J.

    1984-01-01

The density fluctuations associated with the formation of large-scale cosmic pancake-like and filamentary structures are evaluated using the Zel'dovich approximation for the evolution of nonlinear inhomogeneities in the expanding universe. It is shown that the large-scale nonlinear density fluctuations in the galaxy distribution due to pancakes modify the standard scale-invariant correlation function xi(r) at scales comparable to the coherence length of adiabatic fluctuations. The typical contribution of pancakes and filaments to the J3 integral, and more generally to the moments of galaxy counts in a volume of approximately (15-40 h^-1 Mpc)^3, provides a statistical test for the existence of large-scale inhomogeneities. An application to several recent three-dimensional data sets shows that, despite large observational uncertainties over the relevant scales, characteristic features may be present that can be attributed to pancakes in most, but not all, of the various galaxy samples.

  8. Development of the adult and child complementary medicine questionnaires fielded on the National Health Interview Survey

    PubMed Central

    2013-01-01

    The 2002, 2007, and 2012 complementary medicine questionnaires fielded on the National Health Interview Survey provide the most comprehensive data on complementary medicine available for the United States. They filled the void for large-scale, nationally representative, publicly available datasets on the out-of-pocket costs, prevalence, and reasons for use of complementary medicine in the U.S. Despite their wide use, this is the first article describing the multi-faceted and largely qualitative processes undertaken to develop the surveys. We hope this in-depth description enables policy makers and researchers to better judge the content validity and utility of the questionnaires and their resultant publications. PMID:24267412

  9. Post-disaster mental health need assessment surveys - the challenge of improved future research.

    PubMed

    Kessler, Ronald C; Wittchen, Hans-Ulrich

    2008-12-01

    Disasters are common occurrences that are becoming increasingly prevalent throughout the world. The number of natural disasters either affecting more than 100 people or resulting in a call for international assistance increased from roughly 100 per year worldwide in the late 1960s to over 500 per year in the past decade. Population growth, environmental degradation, and global warming all play parts in accounting for these increases. There is also the possibility of a pandemic. This paper and the associated journal issue focus on a topic of growing worldwide importance: mental health needs assessment in the wake of large-scale disasters. Although natural and human-made disasters are known to have substantial effects on the mental health of the people who experience them, research shows that the prevalence of post-disaster psychopathology varies enormously from one disaster to another in ways that are difficult to predict merely by knowing the objective circumstances of the disaster. Mental health needs assessment surveys are consequently carried out after many large-scale natural and human-made disasters to provide information for service planners on the nature and magnitude of need for services. These surveys vary greatly, though, in the rigor with which they assess disaster-related stressors and post-disaster mental illness. Synthesis of findings across surveys is hampered by these inconsistencies. The typically limited focus of these surveys with regard to the inclusion of risk factors, follow-up assessments, and evaluations of treatment also limits insights from these surveys concerning post-disaster mental illness and treatment response. The papers in this issue discuss methodological issues in the design and implementation of post-disaster mental health needs assessment surveys aimed at improving on the quality of previous such surveys. The many recommendations in these papers will hopefully help to foster improvements in the next generation of post-disaster mental health needs assessment surveys.

  10. Large Scale Commodity Clusters for Lattice QCD

    SciTech Connect

    A. Pochinsky; W. Akers; R. Brower; J. Chen; P. Dreher; R. Edwards; S. Gottlieb; D. Holmgren; P. Mackenzie; J. Negele; D. Richards; J. Simone; W. Watson

    2002-06-01

    We describe the construction of large scale clusters for lattice QCD computing being developed under the umbrella of the U.S. DoE SciDAC initiative. We discuss the study of floating point and network performance that drove the design of the cluster, and present our plans for future multi-Terascale facilities.

  11. Management of large-scale technology

    NASA Technical Reports Server (NTRS)

    Levine, A.

    1985-01-01

    Two major themes are addressed in this assessment of the management of large-scale NASA programs: (1) how a high-technology agency fared in a decade marked by a rapid expansion of funds and manpower in the first half and an almost as rapid contraction in the second; and (2) how NASA combined central planning and control with decentralized project execution.

  12. A Large Scale Computer Terminal Output Controller.

    ERIC Educational Resources Information Center

    Tucker, Paul Thomas

    This paper describes the design and implementation of a large scale computer terminal output controller which supervises the transfer of information from a Control Data 6400 Computer to a PLATO IV data network. It discusses the cost considerations leading to the selection of educational television channels rather than telephone lines for…

  13. Large-scale CFB combustion demonstration project

    SciTech Connect

    Nielsen, P.T.; Hebb, J.L.; Aquino, R.

    1998-07-01

    The Jacksonville Electric Authority's large-scale CFB demonstration project is described. Given the early stage of project development, the paper focuses on the project organizational structure, its role within the Department of Energy's Clean Coal Technology Demonstration Program, and the projected environmental performance. A description of the CFB combustion process is included.

  14. Large-scale CFB combustion demonstration project

    SciTech Connect

    Nielsen, P.T.; Hebb, J.L.; Aquino, R.

    1998-04-01

    The Jacksonville Electric Authority's large-scale CFB demonstration project is described. Given the early stage of project development, the paper focuses on the project organizational structure, its role within the Department of Energy's Clean Coal Technology Demonstration Program, and the projected environmental performance. A description of the CFB combustion process is included.

  15. Evaluating Large-Scale Interactive Radio Programmes

    ERIC Educational Resources Information Center

    Potter, Charles; Naidoo, Gordon

    2009-01-01

    This article focuses on the challenges involved in conducting evaluations of interactive radio programmes in South Africa with large numbers of schools, teachers, and learners. It focuses on the role such large-scale evaluation has played during the South African radio learning programme's development stage, as well as during its subsequent…

  16. ARPACK: Solving large scale eigenvalue problems

    NASA Astrophysics Data System (ADS)

    Lehoucq, Rich; Maschhoff, Kristi; Sorensen, Danny; Yang, Chao

    2013-11-01

    ARPACK is a collection of Fortran77 subroutines designed to solve large scale eigenvalue problems. The package is designed to compute a few eigenvalues and corresponding eigenvectors of a general n by n matrix A. It is most appropriate for large sparse or structured matrices A, where structured means that a matrix-vector product w <- Av requires order n rather than the usual order n^2 floating point operations.
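ARPACK's implicitly restarted Arnoldi/Lanczos routines are most easily reached through SciPy's wrappers. A small sketch, assuming SciPy is installed; the matrix here is an illustrative sparse diagonal matrix, chosen so the correct answer is known in advance:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import eigsh  # wraps ARPACK's symmetric driver

# A 1000 x 1000 sparse matrix with known eigenvalues 1, 2, ..., 1000.
n = 1000
A = diags(np.arange(1, n + 1, dtype=float))

# Ask ARPACK for the 3 largest algebraic eigenvalues. Only matrix-vector
# products with A are ever performed, so A never needs to be densified.
vals, vecs = eigsh(A, k=3, which="LA")
```

For non-symmetric matrices the analogous wrapper is `scipy.sparse.linalg.eigs`.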

  17. Multidisciplinary eHealth Survey Evaluation Methods

    ERIC Educational Resources Information Center

    Karras, Bryant T.; Tufano, James T.

    2006-01-01

    This paper describes the development process of an evaluation framework for describing and comparing web survey tools. We believe that this approach will help shape the design, development, deployment, and evaluation of population-based health interventions. A conceptual framework for describing and evaluating web survey systems will enable the…

  18. Planned NLM/AHCPR large-scale vocabulary test: using UMLS technology to determine the extent to which controlled vocabularies cover terminology needed for health care and public health.

    PubMed Central

    Humphreys, B L; Hole, W T; McCray, A T; Fitzmaurice, J M

    1996-01-01

    The National Library of Medicine (NLM) and the Agency for Health Care Policy and Research (AHCPR) are sponsoring a test to determine the extent to which a combination of existing health-related terminologies covers vocabulary needed in health information systems. The test vocabularies are the 30 that are fully or partially represented in the 1996 edition of the Unified Medical Language System (UMLS) Metathesaurus, plus three planned additions: the portions of SNOMED International not in the 1996 Metathesaurus, the Read Clinical Classification, and the Logical Observation Identifiers Names and Codes (LOINC) system. These vocabularies are available to testers through a special interface to the Internet-based UMLS Knowledge Source Server. The test will determine the ability of the test vocabularies to serve as a source of controlled vocabulary for health data systems and applications. It should provide the basis for realistic resource estimates for developing and maintaining a comprehensive "standard" health vocabulary that is based on existing terminologies. PMID:8816351

  19. The Consortium on Health and Ageing: Network of Cohorts in Europe and the United States (CHANCES) project--design, population and data harmonization of a large-scale, international study.

    PubMed

    Boffetta, Paolo; Bobak, Martin; Borsch-Supan, Axel; Brenner, Hermann; Eriksson, Sture; Grodstein, Fran; Jansen, Eugene; Jenab, Mazda; Juerges, Hendrik; Kampman, Ellen; Kee, Frank; Kuulasmaa, Kari; Park, Yikyung; Tjonneland, Anne; van Duijn, Cornelia; Wilsgaard, Tom; Wolk, Alicja; Trichopoulos, Dimitrios; Bamia, Christina; Trichopoulou, Antonia

    2014-12-01

    There is a public health demand to prevent health conditions which lead to increased morbidity and mortality among the rapidly-increasing elderly population. Data for the incidence of such conditions exist in cohort studies worldwide, which, however, differ in various aspects. The Consortium on Health and Ageing: Network of Cohorts in Europe and the United States (CHANCES) project aims at harmonizing data from existing major longitudinal studies for the elderly whilst focussing on cardiovascular diseases, diabetes mellitus, cancer, fractures and cognitive impairment, in order to estimate their prevalence, incidence and cause-specific mortality, and to identify lifestyle, socioeconomic, and genetic determinants and biomarkers for the incidence of and mortality from these conditions. A survey instrument assessing ageing-related conditions of the elderly will also be developed. Fourteen cohort studies participate in CHANCES with 683,228 elderly participants (and 150,210 deaths), from 23 European and three non-European countries. So far, 287 variables on health conditions and a variety of exposures, including biomarkers and genetic data, have been harmonized. Different research hypotheses are investigated with meta-analyses. The results produced can help international organizations, governments and policy-makers to better understand the broader implications and consequences of ageing and thus make informed decisions. PMID:25504016

  20. NATIONAL MATERNAL AND INFANT HEALTH SURVEY (NMIHS)

    EPA Science Inventory

    The National Maternal and Infant Health Survey (NMIHS) provides data on maternal and infant health, including prenatal care, birth weight, fetal loss, and infant mortality. The objective of the NMIHS is to collect data needed by Federal, State, and private researchers to study fa...

  1. THE GUATEMALAN SURVEY OF FAMILY HEALTH

    EPA Science Inventory

    The Guatemalan Survey of Family Health, known as EGSF from its name in Spanish, was designed to examine the way in which rural Guatemalan families and individuals cope with childhood illness and pregnancy, and the role of ethnicity, poverty, and social support and health beliefs ...

  2. HISPANIC HEALTH AND NUTRITION EXAMINATION SURVEY (HHANES)

    EPA Science Inventory

    The Hispanic Health and Nutrition Examination Survey (HHANES) was a nationwide probability sample of approximately 16,000 persons, 6 months-74 years of age. Hispanics were included in past health and nutrition examinations, but neither in sufficient numbers to produce estimates o...

  3. Fractals and cosmological large-scale structure

    NASA Technical Reports Server (NTRS)

    Luo, Xiaochun; Schramm, David N.

    1992-01-01

    Observations of galaxy-galaxy and cluster-cluster correlations as well as other large-scale structure can be fit with a 'limited' fractal with dimension D of about 1.2. This is not a 'pure' fractal out to the horizon: the distribution shifts from power law to random behavior at some large scale. If the observed patterns and structures are formed through an aggregation growth process, the fractal dimension D can serve as an interesting constraint on the properties of the stochastic motion responsible for limiting the fractal structure. In particular, it is found that the observed fractal should have grown from two-dimensional sheetlike objects such as pancakes, domain walls, or string wakes. This result is generic and does not depend on the details of the growth process.
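The fractal dimension D discussed above can be estimated from pair counts: if the number of neighbor pairs within radius r scales as N(<r) ∝ r^D, then the slope of log N(<r) versus log r estimates D. A minimal numpy sketch on synthetic uniform 2D points (an illustrative assumption, not the galaxy data of the abstract; uniform points should give D near 2, whereas the clustered galaxy distribution gives D of about 1.2):

```python
import numpy as np

# 800 points distributed uniformly in the unit square.
rng = np.random.default_rng(42)
pts = rng.random((800, 2))

# All pairwise distances (upper triangle only, so each pair counts once).
diff = pts[:, None, :] - pts[None, :, :]
dist = np.sqrt((diff ** 2).sum(-1))[np.triu_indices(len(pts), k=1)]

# Cumulative pair counts N(<r) on log-spaced radii, then the log-log slope.
radii = np.logspace(-1.7, -0.7, 8)          # r from ~0.02 to ~0.2
counts = np.array([(dist < r).sum() for r in radii])
D = np.polyfit(np.log(radii), np.log(counts), 1)[0]
```

Edge effects at the larger radii bias the slope slightly below the true dimension; real analyses correct for the sample boundary.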

  4. Large scale processes in the solar nebula.

    NASA Astrophysics Data System (ADS)

    Boss, A. P.

    Most proposed chondrule formation mechanisms involve processes occurring inside the solar nebula, so the large scale (roughly 1 to 10 AU) structure of the nebula is of general interest for any chondrule-forming mechanism. Chondrules and Ca, Al-rich inclusions (CAIs) might also have been formed as a direct result of the large scale structure of the nebula, such as passage of material through high temperature regions. While recent nebula models do predict the existence of relatively hot regions, the maximum temperatures in the inner planet region may not be high enough to account for chondrule or CAI thermal processing, unless the disk mass is considerably greater than the minimum mass necessary to restore the planets to solar composition. Furthermore, it does not seem to be possible to achieve both rapid heating and rapid cooling of grain assemblages in such a large scale furnace. However, if the accretion flow onto the nebula surface is clumpy, as suggested by observations of variability in young stars, then clump-disk impacts might be energetic enough to launch shock waves which could propagate through the nebula to the midplane, thermally processing any grain aggregates they encounter, and leaving behind a trail of chondrules.

  5. Large-scale extraction of proteins.

    PubMed

    Cunha, Teresa; Aires-Barros, Raquel

    2002-01-01

    The production of foreign proteins using a selected host with the necessary posttranslational modifications is one of the key successes in modern biotechnology. This methodology allows the industrial production of proteins that otherwise are produced in small quantities. However, the separation and purification of these proteins from the fermentation media constitutes a major bottleneck for the widespread commercialization of recombinant proteins. The major production costs (50-90%) for a typical biological product reside in the purification strategy. There is a need for efficient, effective, and economic large-scale bioseparation techniques to achieve high purity and high recovery while maintaining the biological activity of the molecule. Aqueous two-phase systems (ATPS) allow process integration, since separation and concentration of the target protein are achieved simultaneously, with subsequent removal and recycling of the polymer. The ease of scale-up combined with the high partition coefficients obtained allows its potential application in large-scale downstream processing of proteins produced by fermentation. The equipment and the methodology for aqueous two-phase extraction of proteins on a large scale using mixer-settler and column contactors are described. The operation of the columns, either stagewise or differential, is summarized. A brief description of the methods used to account for mass transfer coefficients, hydrodynamic parameters of hold-up, drop size and velocity, back-mixing in the phases, and flooding performance, required for column design, is also provided. PMID:11876297

  6. Afghan Health Education Project: a community survey.

    PubMed

    Lipson, J G; Omidian, P A; Paul, S M

    1995-06-01

    This study assessed the health concerns and needs for health education in the Afghan refugee and immigrant community of the San Francisco Bay Area. The study used a telephone survey, seven community meetings and a survey administered to 196 Afghan families through face-to-face interviews. Data were analyzed qualitatively and statistically. Health problems of most concern are mental health problems and stress related to past refugee trauma and loss, current occupational and economic problems, and culture conflict. Physical health problems include heart disease, diabetes and dental problems. Needed health education topics include dealing with stress, heart health, nutrition, raising children in the United States (particularly adolescents), aging in the United States, and diabetes. Using coalition building and involving Afghans in their community assessment, we found that the Afghan community is eager for culture- and language-appropriate health education programs through videos, television, lectures, and written materials. Brief health education talks in community meetings and a health fair revealed enthusiasm and willingness to consider health promotion and disease-prevention practices. PMID:7596962

  7. Multiresolution comparison of precipitation datasets for large-scale models

    NASA Astrophysics Data System (ADS)

    Chun, K. P.; Sapriza Azuri, G.; Davison, B.; DeBeer, C. M.; Wheater, H. S.

    2014-12-01

    Gridded precipitation datasets are crucial for driving large-scale models used in weather forecasting and climate research. However, the quality of precipitation products is usually validated individually. Comparisons between gridded precipitation products, along with ground observations, provide another avenue for investigating how precipitation uncertainty affects the performance of large-scale models. In this study, using data from a set of precipitation gauges over British Columbia and Alberta, we evaluate several widely used North American gridded products, including the Canadian Gridded Precipitation Anomalies (CANGRD), the National Center for Environmental Prediction (NCEP) reanalysis, the Water and Global Change (WATCH) project, the thin plate spline smoothing algorithms (ANUSPLIN) and the Canadian Precipitation Analysis (CaPA). Based on verification criteria for various temporal and spatial scales, the results provide an assessment of possible applications for the various precipitation datasets. For long-term climate variation studies (~100 years), CANGRD, NCEP, WATCH and ANUSPLIN have different comparative advantages in terms of their resolution and accuracy. For synoptic and mesoscale precipitation patterns, CaPA provides appealing spatial coherence. In addition to the products comparison, various downscaling methods are also surveyed to explore new verification and bias-reduction methods for improving gridded precipitation outputs for large-scale models.

  8. Evaluations of treatment efficacy of depression from perspective of both patients' symptoms and general sense of mental health and wellbeing: A large scale, multi-centered, longitudinal study in China.

    PubMed

    Zeng, Qingzhi; Wang, Wei Chun; Fang, Yiru; Mellor, David; Mccabe, Marita; Byrne, Linda; Zuo, Sai; Xu, Yifeng

    2016-07-30

    Relying on the absence, presence, or level of symptomatology may not provide an adequate indication of the effects of treatment for depression, nor sufficient information for the development of treatment plans that meet patients' needs. Using a prospective, multi-centered, and observational design, the present study surveyed a large sample of outpatients with depression in China (n=9855). The 17-item Hamilton Rating Scale for Depression (HRSD-17) and the Remission Evaluation and Mood Inventory Tool (REMIT) were administered at baseline, two weeks, and four weeks, to assess patients' self-reported symptoms and general sense of mental health and wellbeing. Of the 9855 outpatients, 91.3% were diagnosed as experiencing moderate to severe depression. The patients reported significant improvement over time on both depressive symptoms and general sense of wellbeing after the four-week treatment. The effect sizes of change in general sense of wellbeing were lower than those in symptoms at both the two-week and four-week follow-up. Treatment effects on both general sense of wellbeing and depressive symptomatology were associated with demographic and clinical factors. The findings indicate that a focus on general sense of mental health and wellbeing, in addition to depressive symptomatology, will provide clinicians, researchers and patients themselves with a broader perspective on the status of patients. PMID:27156024

  9. Multitree Algorithms for Large-Scale Astrostatistics

    NASA Astrophysics Data System (ADS)

    March, William B.; Ozakin, Arkadas; Lee, Dongryeol; Riegel, Ryan; Gray, Alexander G.

    2012-03-01

    this number every week, resulting in billions of objects. At such scales, even linear-time analysis operations present challenges, particularly since statistical analyses are inherently interactive processes, requiring that computations complete within some reasonable human attention span. The quadratic (or worse) runtimes of straightforward implementations quickly become unbearable. Examples of applications. These analysis subroutines occur ubiquitously in astrostatistical work. We list just a few examples. The need to cross-match objects across different catalogs has led to various algorithms, which at some point perform an AllNN computation. 2-point and higher-order spatial correlations form the basis of spatial statistics, and are utilized in astronomy to compare the spatial structures of two datasets, such as an observed sample and a theoretical sample, forming the basis for two-sample hypothesis testing. Friends-of-friends clustering is often used to identify halos in data from astrophysical simulations. Minimum spanning tree properties have also been proposed as statistics of large-scale structure. Comparison of the distributions of different kinds of objects requires accurate density estimation, for which KDE is the overall statistical method of choice. The prediction of redshifts from optical data requires accurate regression, for which kernel regression is a powerful method. The identification of objects of various types in astronomy, such as stars versus galaxies, requires accurate classification, for which KDA is a powerful method. Overview. In this chapter, we briefly sketch the main ideas behind recent fast algorithms which achieve, for example, linear runtimes for pairwise-distance problems, or similarly dramatic reductions in computational growth. In some cases, the runtime orders for these algorithms are mathematically provable statements, while in others we have only conjectures backed by experimental observations for the time being.
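As an illustration of the AllNN primitive mentioned above: a space-partitioning tree brings the all-nearest-neighbors computation down from quadratic to roughly O(n log n). A sketch assuming SciPy is available; the random 3D points stand in for real catalog positions:

```python
import numpy as np
from scipy.spatial import cKDTree

# Illustrative "catalog": 10,000 random points in 3D.
rng = np.random.default_rng(1)
pts = rng.random((10_000, 3))

# Build the kd-tree once, then query every point against it.
tree = cKDTree(pts)
dists, idx = tree.query(pts, k=2)   # k=2: the closest hit is the point itself

nn_dist = dists[:, 1]               # distance to each point's nearest neighbor
nn_idx = idx[:, 1]                  # index of that neighbor
```

Cross-matching two catalogs follows the same pattern, with the tree built on one catalog and queried with the points of the other (and k=1).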

  10. Large-Scale PV Integration Study

    SciTech Connect

    Lu, Shuai; Etingov, Pavel V.; Diao, Ruisheng; Ma, Jian; Samaan, Nader A.; Makarov, Yuri V.; Guo, Xinxin; Hafen, Ryan P.; Jin, Chunlian; Kirkham, Harold; Shlatz, Eugene; Frantzis, Lisa; McClive, Timothy; Karlson, Gregory; Acharya, Dhruv; Ellis, Abraham; Stein, Joshua; Hansen, Clifford; Chadliev, Vladimir; Smart, Michael; Salgo, Richard; Sorensen, Rahn; Allen, Barbara; Idelchik, Boris

    2011-07-29

    This research effort evaluates the impact of large-scale photovoltaic (PV) and distributed generation (DG) output on NV Energy’s electric grid system in southern Nevada. It analyzes the ability of NV Energy’s generation to accommodate increasing amounts of utility-scale PV and DG, and the resulting cost of integrating variable renewable resources. The study was jointly funded by the United States Department of Energy and NV Energy, and conducted by a project team comprised of industry experts and research scientists from Navigant Consulting Inc., Sandia National Laboratories, Pacific Northwest National Laboratory and NV Energy.

  11. Large-scale planar lightwave circuits

    NASA Astrophysics Data System (ADS)

    Bidnyk, Serge; Zhang, Hua; Pearson, Matt; Balakrishnan, Ashok

    2011-01-01

    By leveraging advanced wafer processing and flip-chip bonding techniques, we have succeeded in hybrid integrating a myriad of active optical components, including photodetectors and laser diodes, with our planar lightwave circuit (PLC) platform. We have combined hybrid integration of active components with monolithic integration of other critical functions, such as diffraction gratings, on-chip mirrors, mode-converters, and thermo-optic elements. Further process development has led to the integration of polarization controlling functionality. Most recently, all these technological advancements have been combined to create large-scale planar lightwave circuits that comprise hundreds of optical elements integrated on chips less than a square inch in size.

  12. Neutrinos and large-scale structure

    SciTech Connect

    Eisenstein, Daniel J.

    2015-07-15

    I review the use of cosmological large-scale structure to measure properties of neutrinos and other relic populations of light relativistic particles. With experiments to measure the anisotropies of the cosmic microwave background and the clustering of matter at low redshift, we have now securely measured a relativistic background with density appropriate to the cosmic neutrino background. Our limits on the mass of the neutrino continue to shrink. Experiments coming in the next decade will greatly improve the available precision on searches for the energy density of novel relativistic backgrounds and the mass of neutrinos.

  13. Experimental Simulations of Large-Scale Collisions

    NASA Technical Reports Server (NTRS)

    Housen, Kevin R.

    2002-01-01

    This report summarizes research on the effects of target porosity on the mechanics of impact cratering. Impact experiments conducted on a centrifuge provide direct simulations of large-scale cratering on porous asteroids. The experiments show that large craters in porous materials form mostly by compaction, with essentially no deposition of material into the ejecta blanket that is a signature of cratering in less-porous materials. The ratio of ejecta mass to crater mass is shown to decrease with increasing crater size or target porosity. These results are consistent with the observation that large closely-packed craters on asteroid Mathilde appear to have formed without degradation to earlier craters.

  14. Colloquium: Large scale simulations on GPU clusters

    NASA Astrophysics Data System (ADS)

    Bernaschi, Massimo; Bisson, Mauro; Fatica, Massimiliano

    2015-06-01

    Graphics processing units (GPU) are currently used as a cost-effective platform for computer simulations and big-data processing. Large scale applications require that multiple GPUs work together, but the efficiency obtained with clusters of GPUs is, at times, sub-optimal because the GPU features are not exploited at their best. We describe how it is possible to achieve an excellent efficiency for applications in statistical mechanics, particle dynamics and network analysis by using suitable memory access patterns and mechanisms like CUDA streams, profiling tools, etc. Similar concepts and techniques may also be applied to other problems, like the solution of partial differential equations.

  15. Nonthermal Components in the Large Scale Structure

    NASA Astrophysics Data System (ADS)

    Miniati, Francesco

    2004-12-01

    I address the issue of nonthermal processes in the large scale structure of the universe. After reviewing the properties of cosmic shocks and their role as particle accelerators, I discuss the main observational results, from radio to γ-ray, and describe the processes that are thought to be responsible for the observed nonthermal emissions. Finally, I emphasize the important role of γ-ray astronomy for progress in the field. Non-detections at these photon energies have already allowed us to draw important conclusions. Future observations will tell us more about the physics of the intracluster medium, shock dissipation and CR acceleration.

  16. Large scale phononic metamaterials for seismic isolation

    SciTech Connect

    Aravantinos-Zafiris, N.; Sigalas, M. M.

    2015-08-14

    In this work, we numerically examine structures that could be characterized as large scale phononic metamaterials. These novel structures could have band gaps in the frequency spectrum of seismic waves when their dimensions are chosen appropriately, suggesting that they could be serious candidates for seismic isolation structures. Different, easy-to-fabricate structures made from construction materials such as concrete and steel were examined. The well-known finite difference time domain method is used in our calculations in order to calculate the band structures of the proposed metamaterials.

  17. Quantifying expert consensus against the existence of a secret, large-scale atmospheric spraying program

    NASA Astrophysics Data System (ADS)

    Shearer, Christine; West, Mick; Caldeira, Ken; Davis, Steven J.

    2016-08-01

    Nearly 17% of people in an international survey said they believed the existence of a secret large-scale atmospheric spraying program (SLAP) to be true or partly true. SLAP is commonly referred to as ‘chemtrails’ or ‘covert geoengineering’, and has led to a number of websites purporting to show evidence of widespread chemical spraying linked to negative impacts on human health and the environment. To address these claims, we surveyed two groups of experts—atmospheric chemists with expertise in condensation trails and geochemists working on atmospheric deposition of dust and pollution—to scientifically evaluate for the first time the claims of SLAP theorists. Results show that 76 of the 77 scientists (98.7%) who took part in this study said they had not encountered evidence of a SLAP, and that the data cited as evidence could be explained through other factors, including well-understood physics and chemistry associated with aircraft contrails and atmospheric aerosols. Our goal is not to sway those already convinced that there is a secret, large-scale spraying program—who often reject counter-evidence as further proof of their theories—but rather to establish a source of objective science that can inform public discourse.

  18. Large-scale data mining pilot project in human genome

    SciTech Connect

    Musick, R.; Fidelis, R.; Slezak, T.

    1997-05-01

    This whitepaper briefly describes a new, aggressive effort in large-scale data mining at Livermore National Labs. The implications of 'large-scale' will be clarified in a later section. In the short term, this effort will focus on several mission-critical questions of the Genome project. We will adapt current data mining techniques to the Genome domain, quantify the accuracy of inference results, and lay the groundwork for a more extensive effort in large-scale data mining. A major aspect of the approach is that it will be backed by a fully-staffed data warehousing effort in the human Genome area. The long-term goal is a strong applications-oriented research program in large-scale data mining. The tools and skill set gained will be directly applicable to a wide spectrum of tasks involving large spatial and multidimensional data. This includes applications in ensuring non-proliferation, stockpile stewardship, enabling Global Ecology (Materials Database Industrial Ecology), advancing the Biosciences (Human Genome Project), and supporting data for others (Battlefield Management, Health Care).

  19. Large-scale Globally Propagating Coronal Waves

    NASA Astrophysics Data System (ADS)

    Warmuth, Alexander

    2015-09-01

    Large-scale, globally propagating wave-like disturbances have been observed in the solar chromosphere and by inference in the corona since the 1960s. However, detailed analysis of these phenomena has only been conducted since the late 1990s. This was prompted by the availability of high-cadence coronal imaging data from numerous space-based instruments, which routinely show spectacular globally propagating bright fronts. Coronal waves, as these perturbations are usually referred to, have now been observed in a wide range of spectral channels, yielding a wealth of information. Many findings have supported the "classical" interpretation of the disturbances: fast-mode MHD waves or shocks that are propagating in the solar corona. However, observations that seemed inconsistent with this picture have stimulated the development of alternative models in which "pseudo waves" are generated by magnetic reconfiguration in the framework of an expanding coronal mass ejection. This has resulted in a vigorous debate on the physical nature of these disturbances. This review focuses on demonstrating how the numerous observational findings of the last one and a half decades can be used to constrain our models of large-scale coronal waves, and how a coherent physical understanding of these disturbances is finally emerging.

  20. Korea Community Health Survey Data Profiles.

    PubMed

    Kang, Yang Wha; Ko, Yun Sil; Kim, Yoo Jin; Sung, Kyoung Mi; Kim, Hyo Jin; Choi, Hyung Yun; Sung, Changhyun; Jeong, Eunkyeong

    2015-06-01

    In 2008, the Korea Centers for Disease Control and Prevention initiated the first nationwide survey, the Korea Community Health Survey (KCHS), to provide data that could be used to plan, implement, monitor, and evaluate community health promotion and disease prevention programs. This community-based cross-sectional survey has been conducted by 253 community health centers, 35 community universities, and 1500 interviewers. The KCHS standardized questionnaire was developed jointly by the Korea Centers for Disease Control and Prevention staff, a working group of the health indicators standardization subcommittee, and 16 metropolitan cities and provinces with 253 regional sites. The questionnaire covers a variety of topics related to health behaviors and prevention and is used to assess the prevalence of personal health practices and behaviors related to the leading causes of disease, including smoking, alcohol use, drinking and driving, high blood pressure control, physical activity, weight control, quality of life (European Quality of Life-5 Dimensions, European Quality of Life-Visual Analogue Scale, Korean Instrumental Activities of Daily Living), medical service, accidents, and injuries. The KCHS was administered by trained interviewers, and quality control was improved by the introduction of computer-assisted personal interviewing in 2010. The KCHS data allow a direct comparison of differences in health issues among provinces. Furthermore, the provinces can use these data for their own cost-effective health interventions to improve health promotion and disease prevention. For users and researchers throughout the world, microdata (in the form of SAS files) and analytic guidelines can be downloaded from the KCHS website (http://KCHS.cdc.go.kr/) in Korean. PMID:26430619

  1. Decision maker perceptions of resource allocation processes in Canadian health care organizations: a national survey

    PubMed Central

    2013-01-01

    Background Resource allocation is a key challenge for healthcare decision makers. While several case studies of organizational practice exist, there have been few large-scale cross-organization comparisons. Methods Between January and April 2011, we conducted an on-line survey of senior decision makers within regional health authorities (and closely equivalent organizations) across all Canadian provinces and territories. We received returns from 92 individual managers, from 60 of 89 organizations in total. The survey inquired about structures, process features, and behaviours related to organization-wide resource allocation decisions. We focus here on three main aspects: type of process, perceived fairness, and overall rating. Results About one-half of respondents indicated that their organization used a formal process for resource allocation, while the others reported that political or historical factors were predominant. Seventy percent (70%) of respondents self-reported that their resource allocation process was fair, and just over one-half assessed their process as ‘good’ or ‘very good’. This paper explores these findings in greater detail and assesses them in the context of the larger literature. Conclusion Data from this large-scale cross-jurisdictional survey help to illustrate common challenges and areas of positive performance among Canada’s health system leadership teams. PMID:23819598

  2. Large-scale magnetic topologies of early M dwarfs

    NASA Astrophysics Data System (ADS)

    Donati, J.-F.; Morin, J.; Petit, P.; Delfosse, X.; Forveille, T.; Aurière, M.; Cabanac, R.; Dintrans, B.; Fares, R.; Gastine, T.; Jardine, M. M.; Lignières, F.; Paletou, F.; Ramirez Velez, J. C.; Théado, S.

    2008-10-01

    We present here additional results of a spectropolarimetric survey of a small sample of stars ranging from spectral type M0 to M8, aimed at investigating observationally how dynamo processes operate in stars on both sides of the full convection threshold (spectral type M4). The present paper focuses on early M stars (M0-M3), that is, above the full convection threshold. Applying tomographic imaging techniques to time series of rotationally modulated circularly polarized profiles collected with the NARVAL spectropolarimeter, we determine the rotation period and reconstruct the large-scale magnetic topologies of six early M dwarfs. We find that early-M stars preferentially host large-scale fields with dominantly toroidal and non-axisymmetric poloidal configurations, along with significant differential rotation (and long-term variability); only the lowest-mass star of our subsample is found to host an almost fully poloidal, mainly axisymmetric large-scale field resembling those found in mid-M dwarfs. This abrupt change in the large-scale magnetic topologies of M dwarfs (occurring at spectral type M3) has no related signature in X-ray luminosities (measuring the total amount of magnetic flux); it thus suggests that the underlying dynamo processes become more efficient at producing large-scale fields (despite producing the same flux) at spectral types later than M3. We suspect that this change relates to the rapid decrease in the radiative cores of low-mass stars and to the simultaneous sharp increase of the convective turnover times (with decreasing stellar mass) that models predict to occur at M3; it may also be (at least partly) responsible for the reduced magnetic braking reported for fully convective stars. Based on observations obtained at the Télescope Bernard Lyot (TBL), operated by the Institut National des Sciences de l'Univers of the Centre National de la Recherche Scientifique of France.

  3. Quality of data in multiethnic health surveys.

    PubMed Central

    Pasick, R. J.; Stewart, S. L.; Bird, J. A.; D'Onofrio, C. N.

    2001-01-01

    OBJECTIVE: There has been insufficient research on the influence of ethno-cultural and language differences in public health surveys. Using data from three independent studies, the authors examine methods to assess data quality and to identify causes of problematic survey questions. METHODS: Qualitative and quantitative methods were used in this exploratory study, including secondary analyses of data from three baseline surveys (conducted in English, Spanish, Cantonese, Mandarin, and Vietnamese). Collection of additional data included interviews with investigators and interviewers; observations of item development; focus groups; think-aloud interviews; a test-retest assessment survey; and a pilot test of alternatively worded questions. RESULTS: The authors identify underlying causes for the 12 most problematic variables in three multiethnic surveys and describe them in terms of ethnic differences in reliability, validity, and cognitive processes (interpretation, memory retrieval, judgment formation, and response editing), and differences with regard to cultural appropriateness and translation problems. CONCLUSIONS: Multiple complex elements affect measurement in a multiethnic survey, many of which are neither readily observed nor understood through standard tests of data quality. Multiethnic survey questions are best evaluated using a variety of quantitative and qualitative methods that reveal different types and causes of problems. PMID:11889288

  4. [National Strategic Promotion for Large-Scale Clinical Cancer Research].

    PubMed

    Toyama, Senya

    2016-04-01

    The number of clinical research studies conducted by clinical cancer study groups has been decreasing recently in Japan. The stated reason is the abolition of donations to the groups from pharmaceutical companies after the Diovan scandal. However, I believe the fundamental problem is that a government-supported large-scale clinical cancer study system for evidence-based medicine (EBM) has not been fully established. Urgent establishment of such a system, based on a national strategy, is needed for cancer patients and for public health promotion. PMID:27220800

  5. Large scale water lens for solar concentration.

    PubMed

    Mondol, A S; Vogel, B; Bastian, G

    2015-06-01

    Properties of large scale water lenses for solar concentration were investigated. These lenses were built from readily available materials: normal tap water and hyper-elastic linear low-density polyethylene foil. Exposed to sunlight, the focal lengths and light intensities in the focal spot were measured and calculated. The optical properties were modeled with raytracing software based on the lens shape. We achieved a good match between experimental and theoretical data by considering the wavelength-dependent concentration factor, absorption, and focal length. The change in light concentration as a function of water volume was examined via the resulting load on the foil and the corresponding change of shape. The latter was extracted from images and modeled by a finite element simulation. PMID:26072893
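The focusing behaviour described above can be roughly sized with the thin-lens lensmaker's equation. A minimal sketch, assuming a plano-convex water lens with a single spherical surface; the function name and radii below are illustrative, not values from the paper:

```python
N_WATER = 1.333  # approximate refractive index of water in the visible

def focal_length_plano_convex(radius_of_curvature_m, n=N_WATER):
    """Thin-lens focal length for a plano-convex lens: f = R / (n - 1)."""
    return radius_of_curvature_m / (n - 1.0)

for R in (0.5, 1.0, 2.0):  # radii of curvature in metres (illustrative)
    print(f"R = {R:.1f} m -> f ~ {focal_length_plano_convex(R):.2f} m")
```

Because the water-filled foil sags under load, the real lens shape deviates from spherical, which is why the paper couples ray tracing to a finite element model instead of stopping at this estimate.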

  6. Large Scale Quantum Simulations of Nuclear Pasta

    NASA Astrophysics Data System (ADS)

    Fattoyev, Farrukh J.; Horowitz, Charles J.; Schuetrumpf, Bastian

    2016-03-01

    Complex and exotic nuclear geometries collectively referred to as ``nuclear pasta'' are expected to naturally exist in the crust of neutron stars and in supernovae matter. Using a set of self-consistent microscopic nuclear energy density functionals we present the first results of large scale quantum simulations of pasta phases at baryon densities 0.03 < ρ < 0.10 fm⁻³, proton fractions 0.05

  7. Large-scale simulations of reionization

    SciTech Connect

    Kohler, Katharina; Gnedin, Nickolay Y.; Hamilton, Andrew J.S.; /JILA, Boulder

    2005-11-01

    We use cosmological simulations to explore the large-scale effects of reionization. Since reionization is a process that involves a large dynamic range, from galaxies to rare bright quasars, we need to be able to cover a significant volume of the universe in our simulation without losing the important small scale effects from galaxies. Here we have taken an approach that uses clumping factors derived from small scale simulations to approximate the radiative transfer on the sub-cell scales. Using this technique, we can cover a simulation size up to 1280 h⁻¹ Mpc with 10 h⁻¹ Mpc cells. This allows us to construct synthetic spectra of quasars similar to observed spectra of SDSS quasars at high redshifts and compare them to the observational data. These spectra can then be analyzed for HII region sizes, the presence of the Gunn-Peterson trough, and the Lyman-α forest.
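The clumping-factor approach mentioned above folds unresolved small-scale structure into coarse cells via C = &lt;n²&gt; / &lt;n&gt;². A hedged sketch on a synthetic lognormal density field, purely for illustration; the paper derives its clumping factors from dedicated small-scale simulations:

```python
import numpy as np

rng = np.random.default_rng(1)
# Mock gas density samples: lognormal with log-std 0.8 (arbitrary toy choice).
n = np.exp(rng.normal(0.0, 0.8, size=64**3))

# Clumping factor C = <n^2> / <n>^2; C = 1 for a perfectly uniform medium.
C = np.mean(n**2) / np.mean(n) ** 2
print(f"clumping factor C = {C:.2f}")  # analytically exp(0.8^2) for a lognormal
```

Since recombinations scale as n², a cell with C > 1 consumes ionizing photons faster than its mean density alone would suggest, which is exactly what the sub-cell correction captures.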

  8. Large-scale databases of proper names.

    PubMed

    Conley, P; Burgess, C; Hage, D

    1999-05-01

    Few tools for research in proper names have been available--specifically, there is no large-scale corpus of proper names. Two corpora of proper names were constructed, one based on U.S. phone book listings, the other derived from a database of Usenet text. Name frequencies from both corpora were compared with human subjects' reaction times (RTs) to the proper names in a naming task. Regression analysis showed that the Usenet frequencies contributed to predictions of human RT, whereas phone book frequencies did not. In addition, semantic neighborhood density measures derived from the HAL corpus were compared with the subjects' RTs and found to be a better predictor of RT than was frequency in either corpus. These new corpora are freely available on line for download. Potentials for these corpora range from using the names as stimuli in experiments to using the corpus data in software applications. PMID:10495803
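The frequency-to-RT regression described above can be sketched as follows. The names, frequencies, and reaction times here are invented for illustration and are not data from either corpus:

```python
import numpy as np

# Hypothetical per-name corpus frequencies and mean naming RTs (ms).
freq = np.array([5.0, 20.0, 80.0, 300.0, 1200.0, 5000.0])
rt = np.array([720.0, 690.0, 660.0, 640.0, 615.0, 600.0])

# Regress RT on log frequency; frequency effects are typically log-linear.
slope, intercept = np.polyfit(np.log(freq), rt, 1)
pred = intercept + slope * np.log(freq)
r2 = 1.0 - np.sum((rt - pred) ** 2) / np.sum((rt - rt.mean()) ** 2)
print(f"slope = {slope:.1f} ms per log unit, R^2 = {r2:.2f}")
```

A negative slope (faster naming for more frequent names) is the pattern the Usenet corpus predicted and the phone book corpus did not.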

  9. Estimation of large-scale dimension densities.

    PubMed

    Raab, C; Kurths, J

    2001-07-01

    We propose a technique to calculate large-scale dimension densities in both higher-dimensional spatio-temporal systems and low-dimensional systems from only a few data points, where known methods usually have an unsatisfactory scaling behavior. This is mainly due to boundary and finite-size effects. With our rather simple method, we normalize boundary effects and get a significant correction of the dimension estimate. This straightforward approach is based on rather general assumptions. So even weak coherent structures obtained from small spatial couplings can be detected with this method, which is impossible by using the Lyapunov-dimension density. We demonstrate the efficiency of our technique for coupled logistic maps, coupled tent maps, the Lorenz attractor, and the Roessler attractor. PMID:11461376
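As a rough illustration of one of the test systems above, one can iterate a lattice of diffusively coupled logistic maps and compute a plain Grassberger-Procaccia correlation sum over its state vectors. This is only the conventional estimator that the authors improve upon, not their boundary-corrected density method; all parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
L, T, eps, r = 20, 400, 0.1, 4.0            # lattice size, steps, coupling, map parameter
x = rng.random(L)
series = np.empty((T, L))
for t in range(T):
    fx = r * x * (1.0 - x)                  # local logistic update
    # Diffusive nearest-neighbour coupling on a ring.
    x = (1.0 - eps) * fx + 0.5 * eps * (np.roll(fx, 1) + np.roll(fx, -1))
    series[t] = x

pts = series[100:]                          # discard the transient
# Correlation sum C(r0): fraction of state-vector pairs closer than r0.
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
iu = np.triu_indices(len(pts), k=1)
for r0 in (0.5, 1.0, 2.0):
    print(f"C({r0}) = {np.mean(d[iu] < r0):.3f}")
```

The scaling of C(r0) with r0 gives a dimension estimate; the boundary and finite-size effects that distort this scaling for small data sets are what the proposed normalization corrects.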

  10. The challenge of large-scale structure

    NASA Astrophysics Data System (ADS)

    Gregory, S. A.

    1996-03-01

    The tasks that I have assumed for myself in this presentation include three separate parts. The first, appropriate to the particular setting of this meeting, is to review the basic work of the founding of this field; the appropriateness comes from the fact that W. G. Tifft made immense contributions that are not often realized by the astronomical community. The second task is to outline the general tone of the observational evidence for large scale structures. (Here, in particular, I cannot claim to be complete. I beg forgiveness from any workers who are left out by my oversight for lack of space and time.) The third task is to point out some of the major aspects of the field that may represent the clues by which some brilliant sleuth will ultimately figure out how galaxies formed.

  11. Engineering management of large scale systems

    NASA Technical Reports Server (NTRS)

    Sanders, Serita; Gill, Tepper L.; Paul, Arthur S.

    1989-01-01

    The organization of high technology and engineering problem solving has given rise to an emerging concept: reasoning principles for integrating traditional engineering problem solving with system theory, management science, behavioral decision theory, and planning and design approaches can be incorporated into a methodological approach to solving problems with a long-range perspective. Long-range planning has great potential to improve productivity by using a systematic and organized approach. Thus, efficiency and cost effectiveness are the driving forces in promoting the organization of engineering problems. Aspects of systems engineering that provide an understanding of the management of large scale systems are broadly covered here. Due to the focus and application of the research, other significant factors (e.g., human behavior, decision making) are not emphasized but are considered.

  12. Large scale cryogenic fluid systems testing

    NASA Technical Reports Server (NTRS)

    1992-01-01

    NASA Lewis Research Center's Cryogenic Fluid Systems Branch (CFSB) within the Space Propulsion Technology Division (SPTD) has the ultimate goal of enabling the long term storage and in-space fueling/resupply operations for spacecraft and reusable vehicles in support of space exploration. Using analytical modeling, ground based testing, and on-orbit experimentation, the CFSB is studying three primary categories of fluid technology: storage, supply, and transfer. The CFSB is also investigating fluid handling, advanced instrumentation, and tank structures and materials. Ground based testing of large-scale systems is done using liquid hydrogen as a test fluid at the Cryogenic Propellant Tank Facility (K-site) at Lewis' Plum Brook Station in Sandusky, Ohio. A general overview of tests involving liquid transfer, thermal control, pressure control, and pressurization is given.

  13. Batteries for Large Scale Energy Storage

    SciTech Connect

    Soloveichik, Grigorii L.

    2011-07-15

    In recent years, with the deployment of renewable energy sources, advances in electrified transportation, and development in smart grids, the markets for large-scale stationary energy storage have grown rapidly. Electrochemical energy storage methods are strong candidate solutions due to their high energy density, flexibility, and scalability. This review provides an overview of mature and emerging technologies for secondary and redox flow batteries. New developments in the chemistry of secondary and flow batteries as well as regenerative fuel cells are also considered. Advantages and disadvantages of current and prospective electrochemical energy storage options are discussed. The most promising technologies in the short term are high-temperature sodium batteries with β”-alumina electrolyte, lithium-ion batteries, and flow batteries. Regenerative fuel cells and lithium metal batteries with high energy density require further research to become practical.

  14. Grid sensitivity capability for large scale structures

    NASA Technical Reports Server (NTRS)

    Nagendra, Gopal K.; Wallerstein, David V.

    1989-01-01

    The considerations and the resultant approach used to implement design sensitivity capability for grids into a large scale, general purpose finite element system (MSC/NASTRAN) are presented. The design variables are grid perturbations with a rather general linking capability. Moreover, shape and sizing variables may be linked together. The design is general enough to facilitate geometric modeling techniques for generating design variable linking schemes in an easy and straightforward manner. Test cases have been run and validated by comparison with the overall finite difference method. The linking of a design sensitivity capability for shape variables in MSC/NASTRAN with an optimizer would give designers a powerful, automated tool to carry out practical optimization design of real life, complicated structures.
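The validation step mentioned above, comparing analytic design sensitivities against overall finite differences, can be sketched on a toy "structure": the tip deflection of an end-loaded cantilever with its length as the design variable. This is a generic illustration of the check, not MSC/NASTRAN's grid-sensitivity machinery; all numbers are made up:

```python
import math

P, E, I = 1000.0, 210e9, 1e-6       # load [N], Young's modulus [Pa], inertia [m^4]

def deflection(L):
    """Tip deflection of an end-loaded cantilever: delta = P L^3 / (3 E I)."""
    return P * L**3 / (3.0 * E * I)

def analytic_sensitivity(L):
    """d(delta)/dL = P L^2 / (E I): the design sensitivity w.r.t. length."""
    return P * L**2 / (E * I)

L0, h = 2.0, 1e-6
fd = (deflection(L0 + h) - deflection(L0 - h)) / (2.0 * h)  # central difference
print(analytic_sensitivity(L0), fd)
```

Agreement between the two numbers validates the analytic derivative; in a finite element system the same comparison is run per design variable, which is exactly why the analytic route pays off at scale.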

  15. Estimation of large-scale dimension densities

    NASA Astrophysics Data System (ADS)

    Raab, Corinna; Kurths, Jürgen

    2001-07-01

    We propose a technique to calculate large-scale dimension densities in both higher-dimensional spatio-temporal systems and low-dimensional systems from only a few data points, where known methods usually have an unsatisfactory scaling behavior. This is mainly due to boundary and finite-size effects. With our rather simple method, we normalize boundary effects and get a significant correction of the dimension estimate. This straightforward approach is based on rather general assumptions. So even weak coherent structures obtained from small spatial couplings can be detected with this method, which is impossible by using the Lyapunov-dimension density. We demonstrate the efficiency of our technique for coupled logistic maps, coupled tent maps, the Lorenz attractor, and the Roessler attractor.

  16. Large-Scale Organization of Glycosylation Networks

    NASA Astrophysics Data System (ADS)

    Kim, Pan-Jun; Lee, Dong-Yup; Jeong, Hawoong

    2009-03-01

    Glycosylation is a highly complex process to produce a diverse repertoire of cellular glycans that are frequently attached to proteins and lipids. Glycans participate in fundamental biological processes including molecular trafficking and clearance, cell proliferation and apoptosis, developmental biology, immune response, and pathogenesis. N-linked glycans found on proteins are formed by sequential attachments of monosaccharides with the help of a relatively small number of enzymes. Many of these enzymes can accept multiple N-linked glycans as substrates, thus generating a large number of glycan intermediates and their intermingled pathways. Motivated by the quantitative methods developed in complex network research, we investigate the large-scale organization of such N-glycosylation pathways in a mammalian cell. The uncovered results give the experimentally-testable predictions for glycosylation process, and can be applied to the engineering of therapeutic glycoproteins.
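The network view described above, glycan intermediates as nodes and enzyme-catalyzed attachments as directed edges, can be sketched with a toy graph. The glycan names and reactions below are entirely hypothetical and only show the kind of degree statistics such an analysis starts from:

```python
from collections import defaultdict

# Hypothetical (substrate, product) pairs for enzyme-catalyzed attachments.
reactions = [
    ("G1", "G2"), ("G1", "G3"), ("G2", "G4"),
    ("G3", "G4"), ("G4", "G5"), ("G2", "G5"),
]

out_deg = defaultdict(int)  # substrates an enzyme step starts from
in_deg = defaultdict(int)   # products reachable by some attachment
for substrate, product in reactions:
    out_deg[substrate] += 1
    in_deg[product] += 1

for node in sorted(set(out_deg) | set(in_deg)):
    print(node, "in:", in_deg[node], "out:", out_deg[node])
```

Enzymes accepting multiple substrates show up as high out-degree nodes; the large-scale organization studied in the paper concerns the statistics of such degrees and pathway intermingling across the full N-glycosylation network.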

  17. Large-scale sequential quadratic programming algorithms

    SciTech Connect

    Eldersveld, S.K.

    1992-09-01

    The problem addressed is the general nonlinear programming problem: finding a local minimizer for a nonlinear function subject to a mixture of nonlinear equality and inequality constraints. The methods studied are in the class of sequential quadratic programming (SQP) algorithms, which have previously proved successful for problems of moderate size. Our goal is to devise an SQP algorithm that is applicable to large-scale optimization problems, using sparse data structures and storing less curvature information but maintaining the property of superlinear convergence. The main features are: (1) the use of a quasi-Newton approximation to the reduced Hessian of the Lagrangian function; only an estimate of the reduced Hessian matrix is required by our algorithm, and the impact of not having the full Hessian approximation available is studied and alternative estimates are constructed; (2) the use of a transformation matrix Q, which allows the QP gradient to be computed easily when only the reduced Hessian approximation is maintained; (3) the use of a reduced-gradient form of the basis for the null space of the working set; this choice of basis is more practical than an orthogonal null-space basis for large-scale problems, and the continuity condition for this choice is proven; (4) the use of incomplete solutions of quadratic programming subproblems, in which certain iterates generated by an active-set method for the QP subproblem are used in place of the QP minimizer to define the search direction for the nonlinear problem. An implementation of the new algorithm has been obtained by modifying the code MINOS. Results and comparisons with MINOS and NPSOL are given for the new algorithm on a set of 92 test problems.
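The core of each SQP iteration, solving the KKT system of a local QP subproblem, can be sketched on a tiny equality-constrained problem. This dense toy uses the exact Hessian and full steps; the large-scale algorithm above instead maintains a quasi-Newton reduced-Hessian approximation with sparse data structures:

```python
import numpy as np

def sqp_solve(x, iters=10):
    """Minimize x0^2 + x1^2 subject to x0 + x1 - 1 = 0 by full-step SQP."""
    lam = 0.0
    for _ in range(iters):
        g = 2.0 * x                        # gradient of the objective
        H = 2.0 * np.eye(2)                # Hessian of the Lagrangian (exact here)
        A = np.array([[1.0, 1.0]])         # constraint Jacobian
        c = np.array([x[0] + x[1] - 1.0])  # constraint residual
        # QP subproblem KKT system: [H A^T; A 0] [p; lam] = [-g; -c]
        K = np.block([[H, A.T], [A, np.zeros((1, 1))]])
        step = np.linalg.solve(K, np.concatenate([-g, -c]))
        x = x + step[:2]                   # take the full Newton-SQP step
        lam = step[2]
    return x, lam

x_opt, lam_opt = sqp_solve(np.array([3.0, -1.0]))
print(x_opt, lam_opt)  # converges to [0.5, 0.5], multiplier -1
```

Because the toy objective is quadratic and the constraint linear, one step lands on the optimum; for genuinely nonlinear problems the subproblems must be re-solved, which is where incomplete QP solutions and reduced-Hessian updates save work.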

  18. Supporting large-scale computational science

    SciTech Connect

    Musick, R

    1998-10-01

    A study has been carried out to determine the feasibility of using commercial database management systems (DBMSs) to support large-scale computational science. Conventional wisdom in the past has been that DBMSs are too slow for such data. Several events over the past few years have muddied the clarity of this mindset: (1) several commercial DBMS systems have demonstrated storage and ad-hoc query access to Terabyte data sets; (2) several large-scale science teams, such as EOSDIS [NAS91], high energy physics [MM97] and human genome [Kin93], have adopted (or make frequent use of) commercial DBMS systems as the central part of their data management scheme; (3) several major DBMS vendors have introduced their first object-relational products (ORDBMSs), which have the potential to support large, array-oriented data; (4) in some cases, performance is a moot issue, in particular if the performance of legacy applications is not reduced while new, albeit slow, capabilities are added to the system. The basic assessment is still that DBMSs do not scale to large computational data. However, many of the reasons have changed, and there is an expiration date attached to that prognosis. This document expands on this conclusion, identifies the advantages and disadvantages of various commercial approaches, and describes the studies carried out in exploring this area. The document is meant to be brief, technical and informative, rather than a motivational pitch. The conclusions within are very likely to become outdated within the next 5-7 years, as market forces will have a significant impact on the state of the art in scientific data management over the next decade.

  19. Supporting large-scale computational science

    SciTech Connect

    Musick, R., LLNL

    1998-02-19

    Business needs have driven the development of commercial database systems since their inception. As a result, there has been a strong focus on supporting many users, minimizing the potential corruption or loss of data, and maximizing performance metrics like transactions per second, or TPC-C and TPC-D results. It turns out that these optimizations have little to do with the needs of the scientific community, and in particular have little impact on improving the management and use of large-scale high-dimensional data. At the same time, there is an unanswered need in the scientific community for many of the benefits offered by a robust DBMS. For example, tying an ad-hoc query language such as SQL together with a visualization toolkit would be a powerful enhancement to current capabilities. Unfortunately, there has been little emphasis or discussion in the VLDB community on this mismatch over the last decade. The goal of the paper is to identify the specific issues that need to be resolved before large-scale scientific applications can make use of DBMS products. This topic is addressed in the context of an evaluation of commercial DBMS technology applied to the exploration of data generated by the Department of Energy's Accelerated Strategic Computing Initiative (ASCI). The paper describes the data being generated for ASCI as well as current capabilities for interacting with and exploring this data. The attraction of applying standard DBMS technology to this domain is discussed, as well as the technical and business issues that currently make this an infeasible solution.

  20. Introducing Large-Scale Innovation in Schools

    NASA Astrophysics Data System (ADS)

    Sotiriou, Sofoklis; Riviou, Katherina; Cherouvis, Stephanos; Chelioti, Eleni; Bogner, Franz X.

    2016-08-01

    Education reform initiatives tend to promise higher effectiveness in classrooms especially when emphasis is given to e-learning and digital resources. Practical changes in classroom realities or school organization, however, are lacking. A major European initiative entitled Open Discovery Space (ODS) examined the challenge of modernizing school education via a large-scale implementation of an open-scale methodology in using technology-supported innovation. The present paper describes this innovation scheme which involved schools and teachers all over Europe, embedded technology-enhanced learning into wider school environments and provided training to teachers. Our implementation scheme consisted of three phases: (1) stimulating interest, (2) incorporating the innovation into school settings and (3) accelerating the implementation of the innovation. The scheme's impact was monitored for a school year using five indicators: leadership and vision building, ICT in the curriculum, development of ICT culture, professional development support, and school resources and infrastructure. Based on about 400 schools, our study produced four results: (1) The growth in digital maturity was substantial, even for previously high scoring schools. This was even more important for indicators such as "vision and leadership" and "professional development." (2) The evolution of networking is presented graphically, showing the gradual growth of connections achieved. (3) These communities became core nodes, involving numerous teachers in sharing educational content and experiences: One out of three registered users (36 %) has shared his/her educational resources in at least one community. (4) Satisfaction scores ranged from 76 % (offer of useful support through teacher academies) to 87 % (good environment to exchange best practices). Initiatives such as ODS add substantial value to schools on a large scale.

  1. Introducing Large-Scale Innovation in Schools

    NASA Astrophysics Data System (ADS)

    Sotiriou, Sofoklis; Riviou, Katherina; Cherouvis, Stephanos; Chelioti, Eleni; Bogner, Franz X.

    2016-02-01

    Education reform initiatives tend to promise higher effectiveness in classrooms especially when emphasis is given to e-learning and digital resources. Practical changes in classroom realities or school organization, however, are lacking. A major European initiative entitled Open Discovery Space (ODS) examined the challenge of modernizing school education via a large-scale implementation of an open-scale methodology in using technology-supported innovation. The present paper describes this innovation scheme which involved schools and teachers all over Europe, embedded technology-enhanced learning into wider school environments and provided training to teachers. Our implementation scheme consisted of three phases: (1) stimulating interest, (2) incorporating the innovation into school settings and (3) accelerating the implementation of the innovation. The scheme's impact was monitored for a school year using five indicators: leadership and vision building, ICT in the curriculum, development of ICT culture, professional development support, and school resources and infrastructure. Based on about 400 schools, our study produced four results: (1) The growth in digital maturity was substantial, even for previously high scoring schools. This was even more important for indicators such as "vision and leadership" and "professional development." (2) The evolution of networking is presented graphically, showing the gradual growth of connections achieved. (3) These communities became core nodes, involving numerous teachers in sharing educational content and experiences: One out of three registered users (36 %) has shared his/her educational resources in at least one community. (4) Satisfaction scores ranged from 76 % (offer of useful support through teacher academies) to 87 % (good environment to exchange best practices). Initiatives such as ODS add substantial value to schools on a large scale.

  2. Indonesian survey looks at adolescent reproductive health.

    PubMed

    Achmad, S I; Westley, S B

    1999-10-01

    The Baseline Survey of Young Adult Reproductive Welfare in Indonesia, conducted from September to December 1998, provides information about young Indonesians on topics concerning work, education, marriage, family life, sexuality, fertility, and HIV/AIDS and other sexually transmitted diseases. The survey interviewed 4106 men and 3978 women aged 15-24 years in three provinces of Java. Survey findings showed that 42% of the women and 8% of the men are currently or have been married. There was a strong inverse relationship between marriage and schooling, which suggests that greater educational attainment and a higher average age at marriage are likely to go together. Although most young couples prefer to delay and space births, only half of currently married young women are using any type of contraception. These results indicate that there is a need for better reproductive health care as well as improved reproductive health education. Moreover, the current economic crisis has led to a decline in the use of the private sector for health care. Instead, young people are using the less-expensive government services, and young women are turning to pharmacies and midwives rather than to private doctors to obtain contraceptives. These findings have several policy implications, including the need for reproductive health programs that provide services needed by young people. PMID:12295693

  3. Solving large scale structure in ten easy steps with COLA

    NASA Astrophysics Data System (ADS)

    Tassev, Svetlin; Zaldarriaga, Matias; Eisenstein, Daniel J.

    2013-06-01

    We present the COmoving Lagrangian Acceleration (COLA) method: an N-body method for solving for Large Scale Structure (LSS) in a frame that is comoving with observers following trajectories calculated in Lagrangian Perturbation Theory (LPT). Unlike standard N-body methods, the COLA method can straightforwardly trade accuracy at small-scales in order to gain computational speed without sacrificing accuracy at large scales. This is especially useful for cheaply generating large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing, as those catalogs are essential for performing detailed error analysis for ongoing and future surveys of LSS. As an illustration, we ran a COLA-based N-body code on a box of size 100 Mpc/h with particles of mass ≈ 5 × 10⁹ Msolar/h. Running the code with only 10 timesteps was sufficient to obtain an accurate description of halo statistics down to halo masses of at least 10¹¹ Msolar/h. This is only at a modest speed penalty when compared to mocks obtained with LPT. A standard detailed N-body run is orders of magnitude slower than our COLA-based code. The speed-up we obtain with COLA is due to the fact that we calculate the large-scale dynamics exactly using LPT, while letting the N-body code solve for the small scales, without requiring it to capture exactly the internal dynamics of halos. Achieving a similar level of accuracy in halo statistics without the COLA method requires at least 3 times more timesteps than when COLA is employed.
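The LPT side of the COLA split can be illustrated with the 1D Zel'dovich approximation, in which particles are displaced from a uniform grid by a field Ψ obtained from the density via FFT (δ = -dΨ/dq, so Ψ_k = i δ_k / k). This sketch covers only the large-scale LPT piece, with an arbitrary toy power spectrum; COLA additionally evolves the small-scale residual with N-body timesteps:

```python
import numpy as np

rng = np.random.default_rng(42)
N, boxsize = 256, 100.0                      # particles, box size (toy units)
k = 2.0 * np.pi * np.fft.fftfreq(N, d=boxsize / N)
mask = k != 0

# White noise shaped by an arbitrary toy power spectrum P(k) ~ 1/|k|.
amp = np.zeros(N)
amp[mask] = np.abs(k[mask]) ** -0.5
delta_k = np.fft.fft(rng.standard_normal(N)) * amp

# Zel'dovich displacement: delta = -dPsi/dq  =>  Psi_k = i delta_k / k.
psi_k = np.zeros(N, dtype=complex)
psi_k[mask] = 1j * delta_k[mask] / k[mask]
psi = np.real(np.fft.ifft(psi_k))

q = np.arange(N) * boxsize / N               # Lagrangian (grid) positions
D = 1.0                                      # linear growth factor D(a)
x = (q + D * psi) % boxsize                  # Eulerian positions
print(x[:4])
```

In COLA the N-body solver then integrates only the residual displacement about this LPT trajectory, which is why a handful of timesteps suffices for accurate large-scale statistics.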

  4. Solving large scale structure in ten easy steps with COLA

    SciTech Connect

    Tassev, Svetlin; Zaldarriaga, Matias; Eisenstein, Daniel J. E-mail: matiasz@ias.edu

    2013-06-01

    We present the COmoving Lagrangian Acceleration (COLA) method: an N-body method for solving for Large Scale Structure (LSS) in a frame that is comoving with observers following trajectories calculated in Lagrangian Perturbation Theory (LPT). Unlike standard N-body methods, the COLA method can straightforwardly trade accuracy at small scales in order to gain computational speed without sacrificing accuracy at large scales. This is especially useful for cheaply generating large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing, as those catalogs are essential for performing detailed error analysis for ongoing and future surveys of LSS. As an illustration, we ran a COLA-based N-body code on a box of size 100 Mpc/h with particles of mass ≈ 5 × 10⁹ M☉/h. Running the code with only 10 timesteps was sufficient to obtain an accurate description of halo statistics down to halo masses of at least 10¹¹ M☉/h. This is only at a modest speed penalty when compared to mocks obtained with LPT. A standard detailed N-body run is orders of magnitude slower than our COLA-based code. The speed-up we obtain with COLA is due to the fact that we calculate the large-scale dynamics exactly using LPT, while letting the N-body code solve for the small scales, without requiring it to capture exactly the internal dynamics of halos. Achieving a similar level of accuracy in halo statistics without the COLA method requires at least 3 times more timesteps than when COLA is employed.

  5. Gravity and large-scale nonlocal bias

    NASA Astrophysics Data System (ADS)

    Chan, Kwan Chuen; Scoccimarro, Román; Sheth, Ravi K.

    2012-04-01

    For Gaussian primordial fluctuations the relationship between galaxy and matter overdensities, bias, is most often assumed to be local at the time of observation in the large-scale limit. This hypothesis is, however, unstable under time evolution; we provide proofs under several (increasingly realistic) sets of assumptions. In the simplest toy model galaxies are created locally and linearly biased at a single formation time, and subsequently move with the dark matter (no velocity bias) conserving their comoving number density (no merging). We show that, after this formation time, the bias becomes unavoidably nonlocal and nonlinear at large scales. We identify the nonlocal gravitationally induced fields in which the galaxy overdensity can be expanded, showing that they can be constructed out of the invariants of the deformation tensor (Galileons), the main signature of which is a quadrupole field in second-order perturbation theory. In addition, we show that this result persists if we include an arbitrary evolution of the comoving number density of tracers. We then include velocity bias, and show that new contributions appear; these are related to the breaking of Galilean invariance of the bias relation, a dipole field being the signature at second order. We test these predictions by studying the dependence of halo overdensities in cells of fixed dark matter density: measurements in simulations show that departures from the mean bias relation are strongly correlated with the nonlocal gravitationally induced fields identified by our formalism, suggesting that the halo distribution at the present time is indeed more closely related to the mass distribution at an earlier rather than present time. However, the nonlocality seen in the simulations is not fully captured by assuming local bias in Lagrangian space. The effects on nonlocal bias seen in the simulations are most important for the most biased halos, as expected from our predictions. Accounting for these

  6. Large-Scale Statistics for Cu Electromigration

    NASA Astrophysics Data System (ADS)

    Hauschildt, M.; Gall, M.; Hernandez, R.

    2009-06-01

    Even after the successful introduction of Cu-based metallization, the electromigration failure risk has remained one of the important reliability concerns for advanced process technologies. The observation of strong bimodality for the electron up-flow direction in dual-inlaid Cu interconnects has added complexity, but is now widely accepted. The failure voids can occur both within the via ("early" mode) or within the trench ("late" mode). More recently, bimodality has been reported also in down-flow electromigration, leading to very short lifetimes due to small, slit-shaped voids under vias. For a more thorough investigation of these early failure phenomena, specific test structures were designed based on the Wheatstone Bridge technique. The use of these structures enabled an increase of the tested sample size close to 675,000, allowing a direct analysis of electromigration failure mechanisms at the single-digit ppm regime. Results indicate that down-flow electromigration exhibits bimodality at very small percentage levels, not readily identifiable with standard testing methods. The activation energy for the down-flow early failure mechanism was determined to be 0.83±0.02 eV. Within the small error bounds of this large-scale statistical experiment, this value is deemed to be significantly lower than the usually reported activation energy of 0.90 eV for electromigration-induced diffusion along Cu/SiCN interfaces. Due to the advantages of the Wheatstone Bridge technique, we were also able to expand the experimental temperature range down to 150 °C, coming quite close to typical operating conditions up to 125 °C. As a result of the lowered activation energy, we conclude that the down-flow early failure mode may control the chip lifetime at operating conditions. The slit-like character of the early failure void morphology also raises concerns about the validity of the Blech-effect for this mechanism. A very small amount of Cu depletion may cause failure even before a
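
Why a 0.83 eV activation energy (versus the usual 0.90 eV) matters at operating temperature follows from the Arrhenius temperature term of Black's equation, MTTF ∝ exp(Ea/kT). A quick sketch; apart from the quoted activation energies and temperatures, nothing here is taken from the study:

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def arrhenius_accel(ea_ev, t_stress_c, t_use_c):
    """Acceleration factor between a stress and a use temperature for an
    Arrhenius-activated failure mechanism (temperature term of Black's eq.)."""
    t_stress = t_stress_c + 273.15  # stress temperature, K
    t_use = t_use_c + 273.15        # use temperature, K
    return math.exp((ea_ev / K_B) * (1.0 / t_use - 1.0 / t_stress))

# Extrapolating from a 150 C stress test to 125 C operation:
af_early = arrhenius_accel(0.83, 150.0, 125.0)  # early slit-void mode, ~4.2x
af_late = arrhenius_accel(0.90, 150.0, 125.0)   # interface-diffusion mode
# The lower-Ea mode is accelerated less by the stress test, so its relative
# importance grows as the temperature drops toward operating conditions.
```

This is the quantitative sense in which the lower activation energy lets the early failure mode dominate at use conditions even if it is rare at stress temperature.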

  7. Large scale digital atlases in neuroscience

    NASA Astrophysics Data System (ADS)

    Hawrylycz, M.; Feng, D.; Lau, C.; Kuan, C.; Miller, J.; Dang, C.; Ng, L.

    2014-03-01

    Imaging in neuroscience has revolutionized our current understanding of brain structure, architecture and increasingly its function. Many characteristics of morphology, cell type, and neuronal circuitry have been elucidated through methods of neuroimaging. Combining this data in a meaningful, standardized, and accessible manner is the scope and goal of the digital brain atlas. Digital brain atlases are used today in neuroscience to characterize the spatial organization of neuronal structures, for planning and guidance during neurosurgery, and as a reference for interpreting other data modalities such as gene expression and connectivity data. The field of digital atlases is extensive and in addition to atlases of the human includes high quality brain atlases of the mouse, rat, rhesus macaque, and other model organisms. Using techniques based on histology, structural and functional magnetic resonance imaging as well as gene expression data, modern digital atlases use probabilistic and multimodal techniques, as well as sophisticated visualization software to form an integrated product. Toward this goal, brain atlases form a common coordinate framework for summarizing, accessing, and organizing this knowledge and will undoubtedly remain a key technology in neuroscience in the future. Since the development of its flagship project of a genome wide image-based atlas of the mouse brain, the Allen Institute for Brain Science has used imaging as a primary data modality for many of its large scale atlas projects. We present an overview of Allen Institute digital atlases in neuroscience, with a focus on the challenges and opportunities for image processing and computation.

  8. Large-scale carbon fiber tests

    NASA Technical Reports Server (NTRS)

    Pride, R. A.

    1980-01-01

    A realistic release of carbon fibers was established by burning a minimum of 45 kg of carbon fiber composite aircraft structural components in each of five large scale, outdoor aviation jet fuel fire tests. This release was quantified by several independent assessments with various instruments developed specifically for these tests. The most likely values for the mass of single carbon fibers released ranged from 0.2 percent of the initial mass of carbon fiber for the source tests (zero wind velocity) to a maximum of 0.6 percent of the initial carbon fiber mass for dissemination tests (5 to 6 m/s wind velocity). Mean fiber lengths for fibers greater than 1 mm in length ranged from 2.5 to 3.5 mm. Mean diameters ranged from 3.6 to 5.3 micrometers which was indicative of significant oxidation. Footprints of downwind dissemination of the fire released fibers were measured to 19.1 km from the fire.

  9. Large-scale wind turbine structures

    NASA Technical Reports Server (NTRS)

    Spera, David A.

    1988-01-01

    The purpose of this presentation is to show how structural technology was applied in the design of modern wind turbines, which were recently brought to an advanced stage of development as sources of renewable power. Wind turbine structures present many difficult problems because they are relatively slender and flexible; subject to vibration and aeroelastic instabilities; acted upon by loads which are often nondeterministic; operated continuously with little maintenance in all weather; and dominated by life-cycle cost considerations. Progress in horizontal-axis wind turbines (HAWT) development was paced by progress in the understanding of structural loads, modeling of structural dynamic response, and designing of innovative structural response. During the past 15 years a series of large HAWTs was developed. This has culminated in the recent completion of the world's largest operating wind turbine, the 3.2 MW Mod-5B power plant installed on the island of Oahu, Hawaii. Some of the applications of structures technology to wind turbines will be illustrated by referring to the Mod-5B design. First, a video overview will be presented to provide familiarization with the Mod-5B project and the important components of the wind turbine system. Next, the structural requirements for large-scale wind turbines will be discussed, emphasizing the difficult fatigue-life requirements. Finally, the procedures used to design the structure will be presented, including the use of the fracture mechanics approach for determining allowable fatigue stresses.

  10. Large-scale wind turbine structures

    NASA Astrophysics Data System (ADS)

    Spera, David A.

    1988-05-01

    The purpose of this presentation is to show how structural technology was applied in the design of modern wind turbines, which were recently brought to an advanced stage of development as sources of renewable power. Wind turbine structures present many difficult problems because they are relatively slender and flexible; subject to vibration and aeroelastic instabilities; acted upon by loads which are often nondeterministic; operated continuously with little maintenance in all weather; and dominated by life-cycle cost considerations. Progress in horizontal-axis wind turbines (HAWT) development was paced by progress in the understanding of structural loads, modeling of structural dynamic response, and designing of innovative structural response. During the past 15 years a series of large HAWTs was developed. This has culminated in the recent completion of the world's largest operating wind turbine, the 3.2 MW Mod-5B power plant installed on the island of Oahu, Hawaii. Some of the applications of structures technology to wind turbines will be illustrated by referring to the Mod-5B design. First, a video overview will be presented to provide familiarization with the Mod-5B project and the important components of the wind turbine system. Next, the structural requirements for large-scale wind turbines will be discussed, emphasizing the difficult fatigue-life requirements. Finally, the procedures used to design the structure will be presented, including the use of the fracture mechanics approach for determining allowable fatigue stresses.

  11. Food appropriation through large scale land acquisitions

    NASA Astrophysics Data System (ADS)

    Rulli, Maria Cristina; D'Odorico, Paolo

    2014-05-01

    The increasing demand for agricultural products and the uncertainty of international food markets have recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of lack of modern technology. It is expected that in the long run large scale land acquisitions (LSLAs) for commercial farming will bring the technology required to close the existing crop yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with LSLAs. We show how up to 300-550 million people could be fed by crops grown in the acquired land, should these investments in agriculture improve crop production and close the yield gap. In contrast, about 190-370 million people could be supported by this land without closing the yield gap. These numbers raise some concern because the food produced in the acquired land is typically exported to other regions, while the target countries exhibit high levels of malnourishment. Conversely, if used for domestic consumption, the crops harvested in the acquired land could ensure food security to the local populations.
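
The headline numbers rest on a simple scaling: people fed = (area × yield × energy density) / per-capita energy need. A back-of-the-envelope sketch; every figure below is a stand-in assumption for illustration, not a value from the study's land-deal dataset:

```python
def people_fed(area_ha, yield_t_per_ha, kcal_per_kg, kcal_per_person_day=2400):
    """Rough population supportable by a cropped area on an annual basis.
    All inputs are illustrative placeholders, not values from the study."""
    annual_kcal = area_ha * yield_t_per_ha * 1000.0 * kcal_per_kg  # t -> kg
    return annual_kcal / (kcal_per_person_day * 365.0)

# e.g. ~40 Mha of acquired land at a 3 t/ha cereal yield and 3500 kcal/kg
# gives a figure inside the paper's 300-550 million range:
supported = people_fed(4.0e7, 3.0, 3500.0)  # ~4.8e8 people
```

Closing the yield gap enters simply as a higher `yield_t_per_ha`, which is why the fed-population estimate spans a range rather than a single number.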

  12. Large Scale Computer Simulation of Erythrocyte Membranes

    NASA Astrophysics Data System (ADS)

    Harvey, Cameron; Revalee, Joel; Laradji, Mohamed

    2007-11-01

    The cell membrane is crucial to the life of the cell. Apart from partitioning the inner and outer environment of the cell, it also acts as a support for complex and specialized molecular machinery, important both for the mechanical integrity of the cell and for its multitude of physiological functions. Due to its relative simplicity, the red blood cell has been a favorite experimental prototype for investigations of the structural and functional properties of the cell membrane. The erythrocyte membrane is a composite quasi-two-dimensional structure composed essentially of a self-assembled fluid lipid bilayer and a polymerized protein meshwork, referred to as the cytoskeleton or membrane skeleton. In the case of the erythrocyte, the polymer meshwork is mainly composed of spectrin, anchored to the bilayer through specialized proteins. Using a coarse-grained model of self-assembled lipid membranes, recently developed by us, with implicit solvent and soft-core potentials, we simulated large-scale red-blood-cell bilayers with dimensions of ∼10⁻¹ μm², with an explicit cytoskeleton. Our aim is to investigate the renormalization of the elastic properties of the bilayer due to the underlying spectrin meshwork.

  13. Simulating the large-scale structure of HI intensity maps

    NASA Astrophysics Data System (ADS)

    Seehars, Sebastian; Paranjape, Aseem; Witzemann, Amadeus; Refregier, Alexandre; Amara, Adam; Akeret, Joel

    2016-03-01

    Intensity mapping of neutral hydrogen (HI) is a promising observational probe of cosmology and large-scale structure. We present wide field simulations of HI intensity maps based on N-body simulations of a 2.6 Gpc/h box with 2048³ particles (particle mass 1.6 × 10¹¹ M☉/h). Using a conditional mass function to populate the simulated dark matter density field with halos below the mass resolution of the simulation (10⁸ M☉/h < M_halo < 10¹³ M☉/h), we assign HI to those halos according to a phenomenological halo to HI mass relation. The simulations span a redshift range of 0.35 ≲ z ≲ 0.9 in redshift bins of width Δz ≈ 0.05 and cover a quarter of the sky at an angular resolution of about 7'. We use the simulated intensity maps to study the impact of non-linear effects and redshift space distortions on the angular clustering of HI. Focusing on the autocorrelations of the maps, we apply and compare several estimators for the angular power spectrum and its covariance. We verify that these estimators agree with analytic predictions on large scales and study the validity of approximations based on Gaussian random fields, particularly in the context of the covariance. We discuss how our results and the simulated maps can be useful for planning and interpreting future HI intensity mapping surveys.
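
The Gaussian-random-field approximation for the power-spectrum covariance that such maps are used to test has a simple closed form. A minimal sketch of the standard cosmic-variance formula, with the quarter-sky coverage of the simulated maps as an example:

```python
def gaussian_cl_variance(c_ell, ell, f_sky=1.0, delta_ell=1, noise=0.0):
    """Gaussian-field approximation for the sample variance of an angular
    power spectrum estimate: var(C_ell) = 2 (C_ell + N_ell)^2 / nu, where
    nu = (2*ell + 1) * f_sky * delta_ell is the effective number of modes."""
    nu = (2 * ell + 1) * f_sky * delta_ell
    return 2.0 * (c_ell + noise) ** 2 / nu

# Covering a quarter of the sky quadruples the variance relative to full sky:
var_full = gaussian_cl_variance(1.0, 100, f_sky=1.0)
var_quarter = gaussian_cl_variance(1.0, 100, f_sky=0.25)
```

Departures of the simulated covariance from this formula would quantify the non-Gaussian and nonlinear effects the maps are designed to expose.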

  14. The effective field theory of cosmological large scale structures

    SciTech Connect

    Carrasco, John Joseph M.; Hertzberg, Mark P.; Senatore, Leonardo

    2012-09-20

    Large scale structure surveys will likely become the next leading cosmological probe. In our universe, matter perturbations are large on short distances and small at long scales, i.e. strongly coupled in the UV and weakly coupled in the IR. To make precise analytical predictions on large scales, we develop an effective field theory formulated in terms of an IR effective fluid characterized by several parameters, such as speed of sound and viscosity. These parameters, determined by the UV physics described by the Boltzmann equation, are measured from N-body simulations. We find that the speed of sound of the effective fluid is c_s² ≈ 10⁻⁶ c² and that the viscosity contributions are of the same order. The fluid describes all the relevant physics at long scales k and permits a manifestly convergent perturbative expansion in the size of the matter perturbations δ(k) for all the observables. As an example, we calculate the correction to the power spectrum at order δ(k)⁴. As a result, the predictions of the effective field theory are found to be in much better agreement with observation than standard cosmological perturbation theory, already reaching percent precision at this order up to a relatively short scale k ≃ 0.24 h Mpc⁻¹.

  15. Halo detection via large-scale Bayesian inference

    NASA Astrophysics Data System (ADS)

    Merson, Alexander I.; Jasche, Jens; Abdalla, Filipe B.; Lahav, Ofer; Wandelt, Benjamin; Jones, D. Heath; Colless, Matthew

    2016-08-01

    We present a proof-of-concept of a novel and fully Bayesian methodology designed to detect haloes of different masses in cosmological observations subject to noise and systematic uncertainties. Our methodology combines the previously published Bayesian large-scale structure inference algorithm, HAmiltonian Density Estimation and Sampling algorithm (HADES), and a Bayesian chain rule (the Blackwell-Rao estimator), which we use to connect the inferred density field to the properties of dark matter haloes. To demonstrate the capability of our approach, we construct a realistic galaxy mock catalogue emulating the wide-area 6-degree Field Galaxy Survey, which has a median redshift of approximately 0.05. Application of HADES to the catalogue provides us with accurately inferred three-dimensional density fields and corresponding quantification of uncertainties inherent to any cosmological observation. We then use a cosmological simulation to relate the amplitude of the density field to the probability of detecting a halo with mass above a specified threshold. With this information, we can sum over the HADES density field realisations to construct maps of detection probabilities and demonstrate the validity of this approach within our mock scenario. We find that the probability of successful detection of haloes in the mock catalogue increases as a function of the signal to noise of the local galaxy observations. Our proposed methodology can easily be extended to account for more complex scientific questions and is a promising novel tool to analyse the cosmic large-scale structure in observations.
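
The final step described above, summing over posterior density-field realisations to build detection-probability maps, is a Monte Carlo average of a calibrated conditional probability. A schematic sketch; in practice the conditional p(halo | δ) is calibrated from the cosmological simulation mentioned in the abstract, while here a toy sigmoid stands in for it:

```python
import math

def detection_probability_map(density_samples, p_halo_given_delta):
    """Blackwell-Rao style estimator: average the calibrated conditional
    detection probability over posterior density-field samples (HADES draws)."""
    n_samples = len(density_samples)
    n_cells = len(density_samples[0])
    return [sum(p_halo_given_delta(sample[c]) for sample in density_samples)
            / n_samples
            for c in range(n_cells)]

# Toy calibration: detection probability rises with the local overdensity.
p_cond = lambda delta: 1.0 / (1.0 + math.exp(-4.0 * (delta - 1.0)))
samples = [[0.0, 2.0], [0.2, 1.8]]  # two posterior draws over two cells
pmap = detection_probability_map(samples, p_cond)  # higher in the dense cell
```

Averaging over many posterior draws is what propagates the observational uncertainty quantified by HADES into the final detection map.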

  16. Nurse prescribing in mental health: national survey.

    PubMed

    Dobel-Ober, D; Brimblecombe, N; Bradley, E

    2010-08-01

    Mental health nurses can now train to become independent prescribers as well as supplementary prescribers. Independent nurse prescribing can potentially help to reorganize mental health services, increase access to medicines and improve service user information, satisfaction and concordance. However, mental health nursing has been slow to undertake prescribing roles, and there has been little work conducted to look at where nurse prescribing is proving successful, and those areas where it is less so. This survey was designed to collect information from directors of nursing in mental health trusts about the numbers of mental health prescribers in England, gather views about prescribing in practice, and elicit intentions with regards to the development of nurse prescribing. In some Trusts, the number of mental health nurse prescribers has increased to the point where wider impacts on workforce, the configuration of teams and services are inevitable. Currently, the way that prescribing is used within different organizations, services and teams varies and it is unclear which setting is most appropriate for the different modes of prescribing. Future work should focus on the impact of mental health nurse prescribing on service delivery, as well as on service users, colleagues and nurses themselves. PMID:20633075

  17. UAV Data Processing for Large Scale Topographical Mapping

    NASA Astrophysics Data System (ADS)

    Tampubolon, W.; Reinhardt, W.

    2014-06-01

    Large scale topographical mapping in third world countries remains a prominent challenge for the geospatial industry. On one side the demand is increasing significantly, while on the other hand it is constrained by the limited budgets available for mapping projects. Since the advent of Act Nr.4/yr.2011 on Geospatial Information in Indonesia, large scale topographical mapping has been a high priority for supporting nationwide development, e.g. detailed spatial planning. Large scale topographical mapping usually relies on conventional aerial survey campaigns to provide high resolution 3D geospatial data sources. Having grown widely as a leisure hobby, aero models in the form of the so-called Unmanned Aerial Vehicle (UAV) offer alternative semi-photogrammetric aerial data acquisition possibilities suitable for a relatively small Area of Interest (AOI), i.e. <5,000 hectares. For detailed spatial planning purposes in Indonesia this area size can serve as a mapping unit, since planning usually concentrates at the sub-district (kecamatan) level. In this paper different camera and processing software systems are analyzed to identify the optimal UAV data acquisition campaign components in combination with the data processing scheme. The selected AOI covers the cultural heritage site of Borobudur Temple, one of the Seven Wonders of the World. A detailed accuracy assessment concentrates in the first place on the object features of the temple. Feature compilation involving planimetric objects (2D) and digital terrain models (3D) is integrated in order to provide Digital Elevation Models (DEM) as the main product of the topographic mapping activity. Incorporating an optimal number of Ground Control Points (GCPs) in the UAV photo data processing increases the accuracy while maintaining a high resolution of 5 cm Ground Sampling Distance (GSD). Finally this result will be used as the benchmark for alternative geospatial
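
The 5 cm GSD target pins down the flight altitude through the usual photogrammetric scale relation, GSD = pixel pitch × altitude / focal length. A sketch with hypothetical camera parameters (the abstract does not specify pixel pitch or focal length):

```python
def ground_sampling_distance_cm(pixel_pitch_um, focal_mm, altitude_m):
    """GSD in cm: the footprint of one sensor pixel on the ground at nadir."""
    return (pixel_pitch_um * 1e-6) * altitude_m / (focal_mm * 1e-3) * 100.0

def altitude_for_gsd_m(gsd_cm, pixel_pitch_um, focal_mm):
    """Flight altitude above ground needed to hit a target GSD (inverse)."""
    return (gsd_cm / 100.0) * (focal_mm * 1e-3) / (pixel_pitch_um * 1e-6)

# e.g. a 4.4 um pixel pitch and a 16 mm lens (hypothetical values) would
# require flying roughly 180 m above the terrain for a 5 cm GSD:
alt = altitude_for_gsd_m(5.0, 4.4, 16.0)
```

The same relation explains why a small, low-flying UAV can match the resolution of a conventional aerial survey over a limited AOI.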

  18. An informal paper on large-scale dynamic systems

    NASA Technical Reports Server (NTRS)

    Ho, Y. C.

    1975-01-01

    Large scale systems are defined as systems requiring more than one decision maker to control the system. Decentralized control and decomposition are discussed for large scale dynamic systems. Information and many-person decision problems are analyzed.

  19. Sensitivity technologies for large scale simulation.

    SciTech Connect

    Collis, Samuel Scott; Bartlett, Roscoe Ainsworth; Smith, Thomas Michael; Heinkenschloss, Matthias; Wilcox, Lucas C.; Hill, Judith C.; Ghattas, Omar; Berggren, Martin Olof; Akcelik, Volkan; Ober, Curtis Curry; van Bloemen Waanders, Bart Gustaaf; Keiter, Eric Richard

    2005-01-01

    Sensitivity analysis is critically important to numerous analysis algorithms, including large scale optimization, uncertainty quantification,reduced order modeling, and error estimation. Our research focused on developing tools, algorithms and standard interfaces to facilitate the implementation of sensitivity type analysis into existing code and equally important, the work was focused on ways to increase the visibility of sensitivity analysis. We attempt to accomplish the first objective through the development of hybrid automatic differentiation tools, standard linear algebra interfaces for numerical algorithms, time domain decomposition algorithms and two level Newton methods. We attempt to accomplish the second goal by presenting the results of several case studies in which direct sensitivities and adjoint methods have been effectively applied, in addition to an investigation of h-p adaptivity using adjoint based a posteriori error estimation. A mathematical overview is provided of direct sensitivities and adjoint methods for both steady state and transient simulations. Two case studies are presented to demonstrate the utility of these methods. A direct sensitivity method is implemented to solve a source inversion problem for steady state internal flows subject to convection diffusion. Real time performance is achieved using novel decomposition into offline and online calculations. Adjoint methods are used to reconstruct initial conditions of a contamination event in an external flow. We demonstrate an adjoint based transient solution. In addition, we investigated time domain decomposition algorithms in an attempt to improve the efficiency of transient simulations. Because derivative calculations are at the root of sensitivity calculations, we have developed hybrid automatic differentiation methods and implemented this approach for shape optimization for gas dynamics using the Euler equations. 
The hybrid automatic differentiation method was applied to a first

  20. International space station. Large scale integration approach

    NASA Astrophysics Data System (ADS)

    Cohen, Brad

    The International Space Station is the most complex large scale integration program in development today. The approach developed for specification, subsystem development, and verification lays a firm basis on which future programs of this nature can be built. The International Space Station is composed of many critical items, hardware and software, built by numerous International Partners, NASA Institutions, and U.S. Contractors, and is launched over a period of five years. Each launch creates a unique configuration that must be safe, survivable, operable, and able to support ongoing assembly (assemblable) to arrive at the assembly-complete configuration in 2003. Integrating each of the modules into a viable spacecraft while continuing the assembly is a challenge in itself. Added to this challenge are the severe schedule constraints and the lack of an "Iron Bird", which prevents assembly and checkout of each on-orbit configuration prior to launch. This paper will focus on the following areas: 1) The specification development process, explaining how the requirements and specifications were derived using a modular concept driven by launch vehicle capability; each module is composed of components of subsystems rather than completed subsystems. 2) The approach to stage specifications (each stage consists of the launched module added to the current on-orbit spacecraft); specifically, how each launched module and stage ensures support of the current and future elements of the assembly. 3) The verification approach, which, due to the schedule constraints, is primarily analysis supported by testing; specifically, how the interfaces are ensured to mate and function on-orbit when they cannot be mated before launch. 4) Lessons learned: where can we improve this complex system design and integration task?

  1. Large Scale Flame Spread Environmental Characterization Testing

    NASA Technical Reports Server (NTRS)

    Clayman, Lauren K.; Olson, Sandra L.; Gokoghi, Suleyman A.; Brooker, John E.; Ferkul, Paul V.; Kacher, Henry F.

    2013-01-01

    Under the Advanced Exploration Systems (AES) Spacecraft Fire Safety Demonstration Project (SFSDP), as a risk mitigation activity in support of the development of a large-scale fire demonstration experiment in microgravity, flame-spread tests were conducted in normal gravity on thin, cellulose-based fuels in a sealed chamber. The primary objective of the tests was to measure pressure rise in a chamber as sample material, burning direction (upward/downward), total heat release, heat release rate, and heat loss mechanisms were varied between tests. A Design of Experiments (DOE) method was imposed to produce an array of tests from a fixed set of constraints, and a coupled response model was developed. Supplementary tests were run without experimental design to additionally vary select parameters such as initial chamber pressure. The starting chamber pressure for each test was set below atmospheric to prevent chamber overpressure. Bottom ignition, or upward propagating burns, produced rapid acceleratory turbulent flame spread. Pressure rise in the chamber increases as the amount of fuel burned increases, mainly because of the larger amount of heat generation and, to a much smaller extent, due to the increase in the gaseous number of moles. Top ignition, or downward propagating burns, produced a steady flame spread with a very small flat flame across the burning edge. Steady-state pressure is achieved during downward flame spread as the pressure rises and plateaus. This indicates that the heat generation by the flame matches the heat loss to the surroundings during the longer, slower downward burns. One heat loss mechanism included mounting a heat exchanger directly above the burning sample in the path of the plume to act as a heat sink and more efficiently dissipate the heat due to the combustion event. This proved an effective means for chamber overpressure mitigation for those tests producing the most total heat release and thus was determined to be a feasible mitigation

  2. Synchronization of coupled large-scale Boolean networks

    SciTech Connect

    Li, Fangfei

    2014-03-15

    This paper investigates the complete synchronization and partial synchronization of two large-scale Boolean networks. First, the aggregation algorithm for large-scale Boolean networks is reviewed. Second, the aggregation algorithm is applied to study the complete synchronization and partial synchronization of large-scale Boolean networks. Finally, an illustrative example is presented to show the efficiency of the proposed results.
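
Complete synchronization of a drive-response pair of Boolean networks can be checked exhaustively when the state space is small; a brute-force sketch below makes the definition concrete (the aggregation algorithm in the paper exists precisely to avoid this exponential enumeration for large networks):

```python
import itertools

def completely_synchronizes(f, g, n, horizon=32):
    """Check complete synchronization of a drive network x' = f(x) and a
    coupled response network y' = g(y, x): from every pair of initial
    states, the trajectories must coincide within the given horizon."""
    states = list(itertools.product((0, 1), repeat=n))
    for x0, y0 in itertools.product(states, repeat=2):
        x, y = x0, y0
        for _ in range(horizon):
            x, y = f(x), g(y, x)  # simultaneous update from old (x, y)
        if x != y:
            return False
    return True

# Drive: a 2-node rotation. A response that copies the drive's update
# synchronizes in one step; an uncoupled response does not.
rot = lambda x: (x[1], x[0])
coupled = completely_synchronizes(rot, lambda y, x: rot(x), 2)
uncoupled = completely_synchronizes(rot, lambda y, x: rot(y), 2)
```

Since an n-node pair has 4ⁿ initial-state combinations, this check is only feasible for toy sizes, which motivates the aggregation approach.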

  3. The Large-scale Structure of the Universe: Probes of Cosmology and Structure Formation

    NASA Astrophysics Data System (ADS)

    Noh, Yookyung

    The usefulness of large-scale structure as a probe of cosmology and structure formation is increasing as large deep surveys in multi-wavelength bands are becoming possible. The observational analysis of large-scale structure, guided by large-volume numerical simulations, is beginning to offer us complementary information and crosschecks of cosmological parameters estimated from the anisotropies in Cosmic Microwave Background (CMB) radiation. Understanding structure formation and evolution, and even galaxy formation history, is also being aided by observations of different redshift snapshots of the Universe, using various tracers of large-scale structure. This dissertation work covers aspects of large-scale structure from the baryon acoustic oscillation scale to that of large-scale filaments and galaxy clusters. First, I discuss the use of large-scale structure for high-precision cosmology. I investigate the reconstruction of the Baryon Acoustic Oscillation (BAO) peak within the context of Lagrangian perturbation theory, testing its validity in a large suite of cosmological volume N-body simulations. Then I consider galaxy clusters and the large-scale filaments surrounding them in a high resolution N-body simulation. I investigate the geometrical properties of galaxy cluster neighborhoods, focusing on the filaments connected to clusters. Using mock observations of galaxy clusters, I explore the correlations of scatter in galaxy cluster mass estimates from multi-wavelength observations and different measurement techniques. I also examine the sources of the correlated scatter by considering the intrinsic and environmental properties of clusters.

  4. Nonzero Density-Velocity Consistency Relations for Large Scale Structures.

    PubMed

    Rizzo, Luca Alberto; Mota, David F; Valageas, Patrick

    2016-08-19

    We present exact kinematic consistency relations for cosmological structures that do not vanish at equal times and can thus be measured in surveys. These rely on cross correlations between the density and velocity, or momentum, fields. Indeed, the uniform transport of small-scale structures by long-wavelength modes, which cannot be detected at equal times by looking at density correlations only, gives rise to a shift in the amplitude of the velocity field that could be measured. These consistency relations only rely on the weak equivalence principle and Gaussian initial conditions. They remain valid in the nonlinear regime and for biased galaxy fields. They can be used to constrain nonstandard cosmological scenarios or the large-scale galaxy bias. PMID:27588842

  5. Nonzero Density-Velocity Consistency Relations for Large Scale Structures

    NASA Astrophysics Data System (ADS)

    Rizzo, Luca Alberto; Mota, David F.; Valageas, Patrick

    2016-08-01

    We present exact kinematic consistency relations for cosmological structures that do not vanish at equal times and can thus be measured in surveys. These rely on cross correlations between the density and velocity, or momentum, fields. Indeed, the uniform transport of small-scale structures by long-wavelength modes, which cannot be detected at equal times by looking at density correlations only, gives rise to a shift in the amplitude of the velocity field that could be measured. These consistency relations only rely on the weak equivalence principle and Gaussian initial conditions. They remain valid in the nonlinear regime and for biased galaxy fields. They can be used to constrain nonstandard cosmological scenarios or the large-scale galaxy bias.

  6. Ecohydrological modeling for large-scale environmental impact assessment.

    PubMed

    Woznicki, Sean A; Nejadhashemi, A Pouyan; Abouali, Mohammad; Herman, Matthew R; Esfahanian, Elaheh; Hamaamin, Yaseen A; Zhang, Zhen

    2016-02-01

    Ecohydrological models are frequently used to assess the biological integrity of unsampled streams. These models vary in complexity and scale, and their utility depends on their final application. Tradeoffs are usually made in model scale, where large-scale models are useful for determining broad impacts of human activities on biological conditions, and regional-scale (e.g., watershed or ecoregion) models provide stakeholders greater detail at the individual stream reach level. Given these tradeoffs, the objective of this study was to develop large-scale stream health models with reach-level accuracy similar to regional-scale models, thereby allowing for impact assessments and improved decision-making capabilities. To accomplish this, four measures of biological integrity (Ephemeroptera, Plecoptera, and Trichoptera taxa (EPT), Family Index of Biotic Integrity (FIBI), Hilsenhoff Biotic Index (HBI), and fish Index of Biotic Integrity (IBI)) were modeled based on four thermal classes (cold, cold-transitional, cool, and warm) of streams that broadly dictate the distribution of aquatic biota in Michigan. The Soil and Water Assessment Tool (SWAT) was used to simulate streamflow and water quality in seven watersheds and the Hydrologic Index Tool was used to calculate 171 ecologically relevant flow regime variables. Unique variables were selected for each thermal class using a Bayesian variable selection method. The variables were then used in the development of adaptive neuro-fuzzy inference systems (ANFIS) models of EPT, FIBI, HBI, and IBI. ANFIS model accuracy improved when accounting for stream thermal class rather than developing a global model. PMID:26595397
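    Of the four indices modeled, the Hilsenhoff Biotic Index has the simplest closed form: an abundance-weighted mean of taxon pollution-tolerance values (0 = intolerant, 10 = very tolerant). A minimal sketch; the taxa, counts, and tolerance values below are hypothetical, not from the study:

```python
def hilsenhoff_biotic_index(counts, tolerance):
    """HBI = sum(n_i * t_i) / N, where n_i is the individual count of
    taxon i and t_i its pollution-tolerance value."""
    total = sum(counts.values())
    return sum(n * tolerance[taxon] for taxon, n in counts.items()) / total

# Hypothetical macroinvertebrate sample
counts = {"Ephemerella": 40, "Hydropsyche": 35, "Chironomus": 25}
tolerance = {"Ephemerella": 1, "Hydropsyche": 4, "Chironomus": 10}
hbi = hilsenhoff_biotic_index(counts, tolerance)
```

    Lower HBI values indicate better water quality; the dominance of tolerant Chironomus above pulls the index upward.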

  7. Large Scale Archaeological Satellite Classification and Data Mining Tools

    NASA Astrophysics Data System (ADS)

    Canham, Kelly

    Archaeological applications routinely use many different forms of remote sensing imagery, the exception being hyperspectral imagery (HSI). HSI tends to be utilized in a similar fashion to multispectral imagery (MSI) or processed to the point that it can be utilized similarly to MSI, thus reducing the benefits of HSI. However, for large-scale archaeological surveys, HSI data can be used to differentiate materials more accurately than MSI because of HSI's larger number of spectral bands. HSI also has the ability to identify multiple materials found within a single pixel (sub-pixel material mixing), which is traditionally not possible with MSI. The Zapotec people of Oaxaca, Mexico, lived in an environment that isolates the individual settlements by rugged mountain ranges and dramatically different ecosystems. The rugged mountains of Oaxaca make large-scale ground-based archaeological surveys expensive in terms of both time and money. The diverse ecosystems of Oaxaca make multispectral satellite imagery inadequate for local material identification. For these reasons, hyperspectral imagery was collected over Oaxaca, Mexico. Using HSI, investigations were conducted into how the Zapotec statehood was impacted by the environment, and conversely, how the environment impacted the statehood. Emphasis in this research is placed on identifying the number of pure materials present in the imagery, what these materials are, and identifying archaeological regions of interest using image processing techniques. The HSI processing techniques applied include a new spatially adaptive spectral unmixing approach (LoGlo) to identify pure materials across broad regions of Oaxaca, vegetation indices analysis, and spectral change detection algorithms. Verification of identified archaeological sites is completed using Geospatial Information System (GIS) tools, ground truth data, and high-resolution satellite MSI.
GIS tools are also used to analyze spatial trends in lost archaeological sites due
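    The sub-pixel material mixing the abstract mentions is conventionally modeled linearly: a pixel spectrum is a nonnegative, sum-to-one combination of endmember spectra. A minimal unmixing sketch with synthetic endmembers (the LoGlo approach itself is not reproduced here):

```python
import numpy as np

def unmix(pixel, endmembers):
    """Least-squares abundance estimate for one pixel under the linear
    mixing model pixel = endmembers @ abundances.
    endmembers: (n_bands, n_materials) matrix of pure spectra."""
    a, *_ = np.linalg.lstsq(endmembers, pixel, rcond=None)
    a = np.clip(a, 0.0, None)   # enforce nonnegativity
    return a / a.sum()          # enforce sum-to-one

rng = np.random.default_rng(0)
E = rng.uniform(0.0, 1.0, size=(50, 3))   # 50 bands, 3 synthetic materials
truth = np.array([0.6, 0.3, 0.1])         # ground-truth abundances
pixel = E @ truth                         # noiseless mixed pixel
abund = unmix(pixel, E)
```

    Real unmixing pipelines also estimate the endmembers from the scene and handle noise; this sketch assumes they are known.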

  8. Large-Scale Spacecraft Fire Safety Tests

    NASA Technical Reports Server (NTRS)

    Urban, David; Ruff, Gary A.; Ferkul, Paul V.; Olson, Sandra; Fernandez-Pello, A. Carlos; T'ien, James S.; Torero, Jose L.; Cowlard, Adam J.; Rouvreau, Sebastien; Minster, Olivier; Toth, Balazs; Legros, Guillaume; Eigenbrod, Christian; Smirnov, Nickolay; Fujita, Osamu; Jomaas, Grunde

    2014-01-01

    An international collaborative program is underway to address open issues in spacecraft fire safety. Because of limited access to long-term low-gravity conditions and the small volume generally allotted for these experiments, there have been relatively few experiments that directly study spacecraft fire safety under low-gravity conditions. Furthermore, none of these experiments have studied sample sizes and environment conditions typical of those expected in a spacecraft fire. The major constraint has been the size of the sample, with prior experiments limited to samples of the order of 10 cm in length and width or smaller. This lack of experimental data forces spacecraft designers to base their designs and safety precautions on 1-g understanding of flame spread, fire detection, and suppression. However, low-gravity combustion research has demonstrated substantial differences in flame behavior in low-gravity. This, combined with the differences caused by the confined spacecraft environment, necessitates practical scale spacecraft fire safety research to mitigate risks for future space missions. To address this issue, a large-scale spacecraft fire experiment is under development by NASA and an international team of investigators. This poster presents the objectives, status, and concept of this collaborative international project (Saffire). The project plan is to conduct fire safety experiments on three sequential flights of an unmanned ISS re-supply spacecraft (the Orbital Cygnus vehicle) after they have completed their delivery of cargo to the ISS and have begun their return journeys to Earth. On two flights (Saffire-1 and Saffire-3), the experiment will consist of a flame spread test involving a meter-scale sample ignited in the pressurized volume of the spacecraft and allowed to burn to completion while measurements are made. On one of the flights (Saffire-2), 9 smaller (5 x 30 cm) samples will be tested to evaluate NASA's material flammability screening tests.

  9. Sufficient observables for large-scale structure in galaxy surveys

    NASA Astrophysics Data System (ADS)

    Carron, J.; Szapudi, I.

    2014-03-01

    Beyond the linear regime, the power spectrum and higher order moments of the matter field no longer capture all cosmological information encoded in density fluctuations. While non-linear transforms have been proposed to extract this information lost to traditional methods, up to now, the way to generalize these techniques to discrete processes was unclear; ad hoc extensions had some success. We pointed out in Carron and Szapudi's paper that the logarithmic transform approximates extremely well the optimal `sufficient statistics', observables that extract all information from the (continuous) matter field. Building on these results, we generalize optimal transforms to discrete galaxy fields. We focus our calculations on the Poisson sampling of an underlying lognormal density field. We solve and test the one-point case in detail, and sketch out the sufficient observables for the multipoint case. Moreover, we present an accurate approximation to the sufficient observables in terms of the mean and spectrum of a non-linearly transformed field. We find that the corresponding optimal non-linear transformation is directly related to the maximum a posteriori Bayesian reconstruction of the underlying continuous field with a lognormal prior as put forward in the paper of Kitaura et al. Thus, simple recipes for realizing the sufficient observables can be built on previously proposed algorithms that have been successfully implemented and tested in simulations.
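    A quick numerical illustration of the setting described above: Poisson-sample a lognormal density field into discrete galaxy counts, then apply a logarithmic transform. The claim illustrated (not proven) is that log(1 + N) is far closer to Gaussian than the raw counts; the log-density variance and mean count per cell are arbitrary assumed values:

```python
import numpy as np

rng = np.random.default_rng(42)
sigma = 1.0      # std of the log-density (assumed)
nbar = 10.0      # mean galaxies per cell (assumed)
cells = 200_000

g = rng.normal(0.0, sigma, size=cells)
rho = np.exp(g - 0.5 * sigma**2)   # lognormal density field with mean 1
counts = rng.poisson(nbar * rho)   # discrete galaxy counts per cell

def skewness(x):
    """Sample skewness: third central moment over variance^1.5."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    return (x**3).mean() / (x**2).mean() ** 1.5

skew_raw = skewness(counts)             # strongly right-skewed
skew_log = skewness(np.log1p(counts))   # much closer to symmetric
```

    The optimal one-point observable derived in the paper is not exactly log(1 + N), but this transform captures the same Gaussianizing effect.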

  10. Large-Scale Survey for Tickborne Bacteria, Khammouan Province, Laos

    PubMed Central

    Vongphayloth, Khamsing; Vongsouvath, Malavanh; Grandadam, Marc; Brey, Paul T.; Newton, Paul N.; Sutherland, Ian W.; Dittrich, Sabine

    2016-01-01

    We screened 768 tick pools containing 6,962 ticks from Khammouan Province, Laos, by using quantitative real-time PCR and identified Rickettsia spp., Ehrlichia spp., and Borrelia spp. Sequencing of Rickettsia spp.–positive and Borrelia spp.–positive pools provided evidence for distinct genotypes. Our results identified bacteria with human disease potential in ticks in Laos. PMID:27532491

  11. Large-Scale Survey for Tickborne Bacteria, Khammouan Province, Laos.

    PubMed

    Taylor, Andrew J; Vongphayloth, Khamsing; Vongsouvath, Malavanh; Grandadam, Marc; Brey, Paul T; Newton, Paul N; Sutherland, Ian W; Dittrich, Sabine

    2016-09-01

    We screened 768 tick pools containing 6,962 ticks from Khammouan Province, Laos, by using quantitative real-time PCR and identified Rickettsia spp., Ehrlichia spp., and Borrelia spp. Sequencing of Rickettsia spp.-positive and Borrelia spp.-positive pools provided evidence for distinct genotypes. Our results identified bacteria with human disease potential in ticks in Laos. PMID:27532491
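    Pool-level PCR positives do not translate directly into a per-tick infection rate. Two standard estimators, sketched for the idealized case of equal-size pools (the survey's actual pool sizes varied, and the counts below are hypothetical, not the paper's results):

```python
def minimum_infection_rate(positive_pools, total_ticks):
    """MIR: assumes each positive pool contains exactly one infected tick,
    so it is a lower bound on prevalence."""
    return positive_pools / total_ticks

def mle_pool_prevalence(positive_pools, n_pools, pool_size):
    """Maximum-likelihood prevalence assuming a pool tests positive iff it
    contains at least one infected tick: p = 1 - (1 - x/k)^(1/m)."""
    return 1.0 - (1.0 - positive_pools / n_pools) ** (1.0 / pool_size)

# Hypothetical numbers: 768 pools of ~9 ticks each, 40 positive pools
mir = minimum_infection_rate(40, 6962)
mle = mle_pool_prevalence(40, 768, 9)
```

    The MLE exceeds the MIR whenever positive pools can plausibly hold more than one infected tick.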

  12. Large Scale Survey of Tocol Content in Barley Germplasm

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The tocochromanols (or tocols, compounds showing vitamin E activity) are biosynthesized exclusively by photosynthetic organisms. Although the precise benefits of vitamin E are not known, diets deficient in vitamin E appear to be associated with atherosclerosis and other cardiovascular disease. The...

  13. Linking Large-Scale Reading Assessments: Comment

    ERIC Educational Resources Information Center

    Hanushek, Eric A.

    2016-01-01

    E. A. Hanushek points out in this commentary that applied researchers in education have only recently begun to appreciate the value of international assessments, even though there are now 50 years of experience with these. Until recently, these assessments have been stand-alone surveys that have not been linked, and analysis has largely focused on…

  14. From Systematic Errors to Cosmology Using Large-Scale Structure

    NASA Astrophysics Data System (ADS)

    Huterer, Dragan

    We propose to carry out a two-pronged program to significantly improve links between galaxy surveys and constraints on primordial cosmology and fundamental physics. We will first develop the methodology to self-calibrate the survey, that is, determine the large-angle calibration systematics internally from the survey. We will use this information to correct biases that propagate from the largest to smaller angular scales. Our approach for tackling the systematics is very complementary to existing ones, in particular in the sense that it does not assume knowledge of specific systematic maps or templates. It is timely to undertake these analyses, since none of the currently known methods addresses the multiplicative effects of large-angle calibration errors that contaminate the small-scale signal and present one of the most significant sources of error in the large-scale structure. The second part of the proposal is to precisely quantify the statistical and systematic errors in the reconstruction of the Integrated Sachs-Wolfe (ISW) contribution to the cosmic microwave background (CMB) sky map using information from galaxy surveys. Unlike the ISW contributions to CMB power, the ISW map reconstruction has not been studied in detail to date. We will create a nimble plug-and-play pipeline to ascertain how reliably a map from an arbitrary LSS survey can be used to separate the late-time and early-time contributions to CMB anisotropy at large angular scales. We will pay particular attention to partial sky coverage, incomplete redshift information, finite redshift range, and imperfect knowledge of the selection function for the galaxy survey. Our work should serve as the departure point for a variety of implications in cosmology, including the physical origin of the large-angle CMB "anomalies".

  15. Probing the imprint of interacting dark energy on very large scales

    NASA Astrophysics Data System (ADS)

    Duniya, Didam G. A.; Bertacca, Daniele; Maartens, Roy

    2015-03-01

    The observed galaxy power spectrum acquires relativistic corrections from light-cone effects, and these corrections grow on very large scales. Future galaxy surveys in optical, infrared and radio bands will probe increasingly large wavelength modes and reach higher redshifts. In order to exploit the new data on large scales, an accurate analysis requires inclusion of the relativistic effects. This is especially the case for primordial non-Gaussianity and for extending tests of dark energy models to horizon scales. Here we investigate the latter, focusing on models where the dark energy interacts nongravitationally with dark matter. Interaction in the dark sector can also lead to large-scale deviations in the power spectrum. If the relativistic effects are ignored, the imprint of interacting dark energy will be incorrectly identified and thus lead to a bias in constraints on interacting dark energy on very large scales.

  16. NATIONAL SURVEY OF CHILDREN WITH SPECIAL HEALTH CARE NEEDS (CSHCN)

    EPA Science Inventory

    The National Survey of Children with Special Health Care Needs (CSHCN) was sponsored and funded by the Maternal and Child Health Bureau of the Health Resources and Services. Administration. The survey was conducted by the National Center for Health Statistics of the Centers for D...

  17. "Cosmological Parameters from Large Scale Structure"

    NASA Technical Reports Server (NTRS)

    Hamilton, A. J. S.

    2005-01-01

    This grant has provided primary support for graduate student Mark Neyrinck, and some support for the PI and for colleague Nick Gnedin, who helped co-supervise Neyrinck. This award had two major goals. First, to continue to develop and apply methods for measuring galaxy power spectra on large, linear scales, with a view to constraining cosmological parameters. And second, to begin to try to understand galaxy clustering at smaller, nonlinear scales well enough to constrain cosmology from those scales also. Under this grant, the PI and collaborators, notably Max Tegmark, continued to improve their technology for measuring power spectra from galaxy surveys at large, linear scales, and to apply the technology to surveys as the data become available. We believe that our methods are the best in the world. These measurements become the foundation from which we and other groups measure cosmological parameters.

  18. Illinois department of public health H1N1/A pandemic communications evaluation survey.

    SciTech Connect

    Walsh, D.; Decision and Information Sciences

    2010-09-16

    Because of heightened media coverage, a 24-hour news cycle and the potential miscommunication of health messages across all levels of government during the onset of the H1N1 influenza outbreak in spring 2009, the Illinois Department of Public Health (IDPH) decided to evaluate its H1N1 influenza A communications system. IDPH wanted to confirm its disease information and instructions were helping stakeholders prepare for and respond to a novel influenza outbreak. In addition, the time commitment involved in preparing, issuing, monitoring, updating, and responding to H1N1 federal guidelines/updates and media stories became a heavy burden for IDPH staff. The process and results of the H1N1 messaging survey represent a best practice that other health departments and emergency management agencies can replicate to improve coordination efforts with stakeholder groups during both emergency preparedness and response phases. Importantly, the H1N1 survey confirmed IDPH's messages were influencing stakeholders' decisions to activate their pandemic plans and initiate response operations. While there was some dissatisfaction with IDPH's delivery of information and communication tools, such as the fax system, this report should demonstrate to IDPH that its core partners believe it has the ability and expertise to issue timely and accurate instructions that can help them respond to a large-scale disease outbreak in Illinois. The conclusion will focus on three main areas: (1) the survey development process, (2) survey results: best practices and areas for improvement, and (3) recommendations: next steps.

  19. Large-scale assembly of colloidal particles

    NASA Astrophysics Data System (ADS)

    Yang, Hongta

    This study reports a simple, roll-to-roll compatible coating technology for producing three-dimensional highly ordered colloidal crystal-polymer composites, colloidal crystals, and macroporous polymer membranes. A vertically beveled doctor blade is utilized to shear align silica microsphere-monomer suspensions to form large-area composites in a single step. The polymer matrix and the silica microspheres can be selectively removed to create colloidal crystals and self-standing macroporous polymer membranes. The thickness of the shear-aligned crystal is correlated with the viscosity of the colloidal suspension and the coating speed, and the correlations can be qualitatively explained by adapting the mechanisms developed for conventional doctor blade coating. Five important research topics related to the application of large-scale three-dimensional highly ordered macroporous films by doctor blade coating are covered in this study. The first topic describes the invention in large area and low cost color reflective displays. This invention is inspired by the heat pipe technology. The self-standing macroporous polymer films exhibit brilliant colors which originate from the Bragg diffraction of visible light from the three-dimensional highly ordered air cavities. The colors can be easily changed by tuning the size of the air cavities to cover the whole visible spectrum. When the air cavities are filled with a solvent which has the same refractive index as that of the polymer, the macroporous polymer films become completely transparent due to the index matching. When the solvent trapped in the cavities is evaporated by in-situ heating, the sample color changes back to brilliant color. This process is highly reversible and reproducible for thousands of cycles. The second topic reports the achievement of rapid and reversible vapor detection by using 3-D macroporous photonic crystals.
Capillary condensation of a condensable vapor in the interconnected macropores leads to the
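    The structural color described above follows the Bragg-Snell relation: the reflection peak sits at lambda = 2 d111 sqrt(n_eff^2 - sin^2(theta)), with d111 ≈ sqrt(2/3)·D for close-packed cavities of diameter D and n_eff^2 the volume-weighted average of the squared refractive indices. A sketch with typical assumed values (not the study's measured parameters):

```python
import math

def bragg_peak_nm(sphere_diameter_nm, n_matrix, n_cavity,
                  cavity_fill=0.74, theta_deg=0.0):
    """Bragg-Snell estimate of the reflection peak for an fcc macroporous
    film viewed at angle theta from the (111) normal."""
    d111 = math.sqrt(2.0 / 3.0) * sphere_diameter_nm
    n_eff_sq = cavity_fill * n_cavity**2 + (1.0 - cavity_fill) * n_matrix**2
    return 2.0 * d111 * math.sqrt(n_eff_sq - math.sin(math.radians(theta_deg))**2)

# Air-filled 280 nm cavities in a polymer of index ~1.48: green reflection
green = bragg_peak_nm(280, 1.48, 1.00)
# Filling the cavities with a higher-index solvent red-shifts the peak;
# when the solvent index matches the polymer's, the index contrast vanishes
# and the film is transparent (the formula still returns a wavelength, but
# zero contrast means no diffracted intensity).
filled = bragg_peak_nm(280, 1.48, 1.36)
```
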

  20. Population generation for large-scale simulation

    NASA Astrophysics Data System (ADS)

    Hannon, Andrew C.; King, Gary; Morrison, Clayton; Galstyan, Aram; Cohen, Paul

    2005-05-01

    Computer simulation is used to research phenomena ranging from the structure of the space-time continuum to population genetics and future combat [1-3]. Multi-agent simulations in particular are now commonplace in many fields [4, 5]. By modeling populations whose complex behavior emerges from individual interactions, these simulations help to answer questions about effects where closed-form solutions are difficult to solve or impossible to derive [6]. To be useful, simulations must accurately model the relevant aspects of the underlying domain. In multi-agent simulation, this means that the modeling must include both the agents and their relationships. Typically, each agent can be modeled as a set of attributes drawn from various distributions (e.g., height, morale, intelligence and so forth). Though these can interact - for example, agent height is related to agent weight - they are usually independent. Modeling relations between agents, on the other hand, adds a new layer of complexity, and tools from graph theory and social network analysis are finding increasing application [7, 8]. Recognizing the role and proper use of these techniques, however, remains the subject of ongoing research. We recently encountered these complexities while building large-scale social simulations [9-11]. One of these, the Hats Simulator, is designed to be a lightweight proxy for intelligence analysis problems. Hats models a "society in a box" consisting of many simple agents, called hats. Hats gets its name from the classic spaghetti western, in which the heroes and villains are known by the color of the hats they wear. The Hats society also has its heroes and villains, but the challenge is to identify which color hat they should be wearing based on how they behave. There are three types of hats: benign hats, known terrorists, and covert terrorists. Covert terrorists look just like benign hats but act like terrorists. Population structure can make covert hat identification significantly more
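    The two-layer population model the abstract describes (agents as attribute vectors drawn from distributions, plus a relation graph) can be sketched in a few lines. The attribute names, distributions, and Erdos-Renyi wiring below are illustrative assumptions, not the Hats Simulator's actual model:

```python
import random

def generate_population(n, edge_prob, seed=0):
    """Generate n agents with independently drawn attributes, plus an
    undirected Erdos-Renyi relation graph with edge probability edge_prob."""
    rng = random.Random(seed)
    agents = [
        {
            "id": i,
            "height_cm": rng.gauss(170, 10),   # assumed distribution
            "morale": rng.uniform(0, 1),       # assumed distribution
            "covert": rng.random() < 0.02,     # rare covert agents
        }
        for i in range(n)
    ]
    edges = [
        (i, j)
        for i in range(n)
        for j in range(i + 1, n)
        if rng.random() < edge_prob
    ]
    return agents, edges

agents, edges = generate_population(200, 0.05)
```

    Realistic social simulations replace the independent edge model with structured generators (small-world, scale-free, or block models) to capture community structure.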

  1. A large-scale crop protection bioassay data set.

    PubMed

    Gaulton, Anna; Kale, Namrata; van Westen, Gerard J P; Bellis, Louisa J; Bento, A Patrícia; Davies, Mark; Hersey, Anne; Papadatos, George; Forster, Mark; Wege, Philip; Overington, John P

    2015-01-01

    ChEMBL is a large-scale drug discovery database containing bioactivity information primarily extracted from scientific literature. Due to the medicinal chemistry focus of the journals from which data are extracted, the data are currently of most direct value in the field of human health research. However, many of the scientific use-cases for the current data set are equally applicable in other fields, such as crop protection research: for example, identification of chemical scaffolds active against a particular target or endpoint, the de-convolution of the potential targets of a phenotypic assay, or the potential targets/pathways for safety liabilities. In order to broaden the applicability of the ChEMBL database and allow more widespread use in crop protection research, an extensive data set of bioactivity data of insecticidal, fungicidal and herbicidal compounds and assays was collated and added to the database. PMID:26175909

  2. Successful Physician Training Program for Large Scale EMR Implementation

    PubMed Central

    Stevens, L.A.; Mailes, E.S.; Goad, B.A.; Longhurst, C.A.

    2015-01-01

    Summary: End-user training is an essential element of electronic medical record (EMR) implementation and frequently suffers from minimal institutional investment. In addition, discussion of successful EMR training programs for physicians is limited in the literature. The authors describe a successful physician-training program at Stanford Children’s Health as part of a large scale EMR implementation. Evaluations of classroom training, obtained at the conclusion of each class, revealed high physician satisfaction with the program. Free-text comments from learners focused on duration and timing of training, the learning environment, quality of the instructors, and specificity of training to their role or department. Based upon participant feedback and institutional experience, best practice recommendations, including physician engagement, curricular design, and assessment of proficiency and recognition, are suggested for future provider EMR training programs. The authors strongly recommend the creation of coursework to group providers by common workflow. PMID:25848415

  3. Measuring Large-Scale Social Networks with High Resolution

    PubMed Central

    Stopczynski, Arkadiusz; Sekara, Vedran; Sapiezynski, Piotr; Cuttone, Andrea; Madsen, Mette My; Larsen, Jakob Eg; Lehmann, Sune

    2014-01-01

    This paper describes the deployment of a large-scale study designed to measure human interactions across a variety of communication channels, with high temporal resolution and spanning multiple years—the Copenhagen Networks Study. Specifically, we collect data on face-to-face interactions, telecommunication, social networks, location, and background information (personality, demographics, health, politics) for a densely connected population of 1 000 individuals, using state-of-the-art smartphones as social sensors. Here we provide an overview of the related work and describe the motivation and research agenda driving the study. Additionally, the paper details the data-types measured, and the technical infrastructure in terms of both backend and phone software, as well as an outline of the deployment procedures. We document the participant privacy procedures and their underlying principles. The paper is concluded with early results from data analysis, illustrating the importance of a multi-channel, high-resolution approach to data collection. PMID:24770359

  4. Star formation associated with a large-scale infrared bubble

    NASA Astrophysics Data System (ADS)

    Xu, Jin-Long; Ju, Bing-Gang

    2014-09-01

    Aims: To investigate how a large-scale infrared bubble centered at l = 53.9° and b = 0.2° forms, and to study if star formation is taking place at the periphery of the bubble, we performed a multiwavelength study. Methods: Using the data from the Galactic Ring Survey (GRS) and Galactic Legacy Infrared Mid-Plane Survey Extraordinaire (GLIMPSE), we performed a study of a large-scale infrared bubble with a size of about 16 pc at a distance of 2.0 kpc. We present the 12CO J = 1-0, 13CO J = 1-0, and C18O J = 1-0 observations of HII region G53.54-0.01 (Sh2-82) obtained at the Purple Mountain Observatory (PMO) 13.7 m radio telescope to investigate the detailed distribution of associated molecular material. In addition, we also used radio recombination line and VLA data. To select young stellar objects (YSOs) associated with this region, we used the GLIMPSE I catalog. Results: The large-scale infrared bubble shows a half-shell morphology at 8 μm. The H II regions of G53.54-0.01, G53.64+0.24, and G54.09-0.06 are situated on the bubble. Comparing the radio recombination line velocities and associated 13CO J = 1-0 components of the three H II regions, we found that the 8 μm emission associated with H II region G53.54-0.01 should belong to the foreground emission, and only overlap with the large-scale infrared bubble in the line of sight. Three extended green objects (EGOs, the candidate massive young stellar objects), as well as three H II regions and two small-scale bubbles are found located in the G54.09-0.06 complex, indicating an active massive star-forming region. Emission from C18O at J = 1-0 presents four cloud clumps on the northeastern border of H II region G53.54-0.01. By comparing the spectral profiles of 12CO J = 1-0, 13CO J = 1-0, and C18O J = 1-0 at the peak position of each clump, we found collected gas in three of the four clumps, the exception being the clump coinciding with a massive YSO (IRAS 19282+1814). Using the evolutive model of the H II region, we derived that

  5. Precision Measurement of Large Scale Structure

    NASA Technical Reports Server (NTRS)

    Hamilton, A. J. S.

    2001-01-01

    The purpose of this grant was to develop and to start to apply new precision methods for measuring the power spectrum and redshift distortions from the anticipated new generation of large redshift surveys. A highlight of work completed during the award period was the application of the new methods developed by the PI to measure the real space power spectrum and redshift distortions of the IRAS PSCz survey, published in January 2000. New features of the measurement include: (1) measurement of power over an unprecedentedly broad range of scales, 4.5 decades in wavenumber, from 0.01 to 300 h/Mpc; (2) at linear scales, not one but three power spectra are measured, the galaxy-galaxy, galaxy-velocity, and velocity-velocity power spectra; (3) at linear scales each of the three power spectra is decorrelated within itself, and disentangled from the other two power spectra (the situation is analogous to disentangling scalar and tensor modes in the Cosmic Microwave Background); and (4) at nonlinear scales the measurement extracts not only the real space power spectrum, but also the full line-of-sight pairwise velocity distribution in redshift space.
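    The basic operation behind the power-spectrum measurements described above can be sketched in a few lines with a toy periodic 1D estimator (real survey analyses add selection functions, window deconvolution, and shot-noise subtraction, none of which appear here; the Fourier convention is an arbitrary choice):

```python
import numpy as np

def power_spectrum(delta, boxsize):
    """Toy periodic 1D estimator: P(k) = |delta_k|^2 / L, with the
    dimensionful convention delta_k = (L/n) * FFT(delta)."""
    n = delta.size
    delta_k = np.fft.rfft(delta) * (boxsize / n)
    k = 2 * np.pi * np.fft.rfftfreq(n, d=boxsize / n)
    return k, np.abs(delta_k) ** 2 / boxsize

n, L = 512, 1000.0
x = np.arange(n) * L / n
A, mode = 0.05, 8
delta = A * np.sin(2 * np.pi * mode / L * x)
k, pk = power_spectrum(delta, L)
# A single sine mode appears as one spike, in bin `mode`, with amplitude
# A^2 * L / 4 under this convention.
```
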

  6. Parallel block schemes for large scale least squares computations

    SciTech Connect

    Golub, G.H.; Plemmons, R.J.; Sameh, A.

    1986-04-01

    Large scale least squares computations arise in a variety of scientific and engineering problems, including geodetic adjustments and surveys, medical image analysis, molecular structures, partial differential equations and substructuring methods in structural engineering. In each of these problems, matrices often arise which possess a block structure which reflects the local connection nature of the underlying physical problem. For example, such super-large nonlinear least squares computations arise in geodesy. Here the coordinates of positions are calculated by iteratively solving overdetermined systems of nonlinear equations by the Gauss-Newton method. The US National Geodetic Survey will complete this year (1986) the readjustment of the North American Datum, a problem which involves over 540 thousand unknowns and over 6.5 million observations (equations). The observation matrix for these least squares computations has a block angular form with 161 diagonal blocks, each containing 3 to 4 thousand unknowns. In this paper parallel schemes are suggested for the orthogonal factorization of matrices in block angular form and for the associated backsubstitution phase of the least squares computations. In addition, a parallel scheme for the calculation of certain elements of the covariance matrix for such problems is described. It is shown that these algorithms are ideally suited for multiprocessors with three levels of parallelism such as the Cedar system at the University of Illinois. 20 refs., 7 figs.
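    The block-angular reduction the abstract describes can be sketched as follows: QR-factor each diagonal block independently (the embarrassingly parallel step), collect the transformed rows that involve only the shared coupling unknowns y, solve that small reduced problem, then back-substitute for each block's local unknowns x_i. Dimensions below are toy-sized stand-ins, not the geodetic problem's:

```python
import numpy as np

def solve_block_angular(blocks):
    """Solve min sum_i ||A_i x_i + B_i y - b_i||^2 for a block-angular
    system. blocks: list of (A_i, B_i, b_i) with A_i of shape (m_i, n_i)
    and B_i of shape (m_i, p)."""
    reduced_rows, reduced_rhs, factors = [], [], []
    for A, B, b in blocks:                       # independent -> parallelizable
        Q, R = np.linalg.qr(A, mode="complete")  # full QR of the diagonal block
        n = A.shape[1]
        Bt, bt = Q.T @ B, Q.T @ b
        factors.append((R[:n], Bt[:n], bt[:n]))  # rows tied to x_i
        reduced_rows.append(Bt[n:])              # rows involving only y
        reduced_rhs.append(bt[n:])
    y, *_ = np.linalg.lstsq(np.vstack(reduced_rows),
                            np.concatenate(reduced_rhs), rcond=None)
    xs = [np.linalg.solve(R, c - T @ y) for R, T, c in factors]
    return xs, y

# Consistent toy system: 4 blocks, 3 local unknowns each, 2 shared unknowns
rng = np.random.default_rng(1)
shapes = [(rng.standard_normal((8, 3)), rng.standard_normal((8, 2)))
          for _ in range(4)]
x_true = [rng.standard_normal(3) for _ in range(4)]
y_true = rng.standard_normal(2)
blocks = [(A, B, A @ x + B @ y_true) for (A, B), x in zip(shapes, x_true)]
xs, y = solve_block_angular(blocks)
```

    Because each block's factorization touches only its own rows, the loop maps directly onto the multiprocessor parallelism the paper targets.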

  7. Detailed investigation of flowfields within large scale hypersonic inlet models

    NASA Technical Reports Server (NTRS)

    Seebaugh, W. R.; Doran, R. W.; Decarlo, J. P.

    1971-01-01

    Analytical and experimental investigations were conducted to determine the characteristics of the internal flows in model passages representative of hypersonic inlets and also sufficiently large for meaningful data to be obtained. Three large-scale inlet models, each having a different compression ratio, were designed to provide high performance and approximately uniform static-pressure distributions at the throat stations. A wedge forebody was used to simulate the flowfield conditions at the entrance of the internal passages, thus removing the actual vehicle forebody from consideration in the design of the wind-tunnel models. Tests were conducted in a 3.5-foot hypersonic wind tunnel at a nominal test Mach number of 7.4 and a freestream unit Reynolds number of 2,700,000 per foot. From flowfield survey data at the inlet entrance, the entering inviscid and viscous flow conditions were determined prior to the analysis of the data obtained in the internal passages. Detailed flowfield survey data were obtained near the centerlines of the internal passages to define the boundary-layer development on the internal surfaces and the internal shock-wave configuration. Finally, flowfield data were measured across the throats of the inlet models to evaluate the performance of the internal passages. These data and additional results from surface instrumentation and flow visualization studies were utilized to determine the internal flowfield patterns and the inlet performance.

  8. Probes of large-scale structure in the universe

    NASA Technical Reports Server (NTRS)

    Suto, Yasushi; Gorski, Krzysztof; Juszkiewicz, Roman; Silk, Joseph

    1988-01-01

    A general formalism is developed which shows that the gravitational instability theory for the origin of the large-scale structure of the universe is now capable of critically confronting observational results on cosmic background radiation angular anisotropies, large-scale bulk motions, and large-scale clumpiness in the galaxy counts. The results indicate that presently advocated cosmological models will have considerable difficulty in simultaneously explaining the observational results.

  9. School-Based Health Care State Policy Survey. Executive Summary

    ERIC Educational Resources Information Center

    National Assembly on School-Based Health Care, 2012

    2012-01-01

    The National Assembly on School-Based Health Care (NASBHC) surveys state public health and Medicaid offices every three years to assess state-level public policies and activities that promote the growth and sustainability of school-based health services. The FY2011 survey found 18 states (see map below) reporting investments explicitly dedicated…

  10. NATIONAL HEALTH INTERVIEW SURVEY ON DISABILITY - (NHIS-D)

    EPA Science Inventory

    National Health Interview Survey-Disability Survey was developed to collect data that can be used to understand disability, to develop public health policy, to produce simple prevalence estimates of selected health conditions, and to provide descriptive baseline statistics on the...

  11. Large-scale structural monitoring systems

    NASA Astrophysics Data System (ADS)

    Solomon, Ian; Cunnane, James; Stevenson, Paul

    2000-06-01

    Extensive structural health instrumentation systems have been installed on three long-span cable-supported bridges in Hong Kong. The quantities measured include environment and applied loads (such as wind, temperature, seismic and traffic loads) and the bridge response to these loadings (accelerations, displacements, and strains). Measurements from over 1000 individual sensors are transmitted to central computing facilities via local data acquisition stations and a fault-tolerant fiber-optic network, and are acquired and processed continuously. The data from the systems are used to provide information on structural load and response characteristics, comparison with design, optimization of inspection, and assurance of continued bridge health. Automated data processing and analysis provides information on important structural and operational parameters. Abnormal events are noted and logged automatically. Information of interest is automatically archived for post-processing. Novel aspects of the instrumentation system include a fluid-based high-accuracy long-span Level Sensing System to measure bridge deck profile and tower settlement. This paper provides an outline of the design and implementation of the instrumentation system. A description of the design and implementation of the data acquisition and processing procedures is also given. Examples of the use of similar systems in monitoring other large structures are discussed.

  12. A health survey of toll booth workers

    SciTech Connect

    Strauss, P.; Orris, P.; Buckley, L.

    1992-01-01

    The prevalence of respiratory and other health problems in a cohort of highway toll booth workers was surveyed by mailed questionnaire. Although the proportion of respondents was low (43.2%), a high prevalence of central nervous system complaints (headaches, irritability or anxiety, and unusual tiredness), mucous membrane irritation (eye irritation, nasal congestion, and dry throat), and musculoskeletal problems (joint and back pains) was found. We believe these symptoms reflect the acute irritant and central nervous system effects of exposure to motor vehicle exhaust. The musculoskeletal complaints are likely the result of bending, reaching, and leaning out of the toll booth. These results suggest the need for in-depth evaluation of the ventilation systems and of the ergonomic and job stressors of work at toll booths.

  13. Bacteriological survey of sixty health foods.

    PubMed Central

    Andrews, W H; Wilson, C R; Poelma, P L; Romero, A; Mislivec, P B

    1979-01-01

    A bacteriological survey was performed on 1,960 food samples encompassing 60 types of health foods available in the Baltimore-Washington, D.C., metropolitan area. No consistent bacteriological distinction (aerobic plate counts, total coliform and fecal coliform most probable numbers) was observed between foods labeled as organic (raised on soil with compost or nonchemical fertilizer and without application of pesticides, fungicides, and herbicides) and their counterpart food types bearing no such label. Types and numbers of samples containing Salmonella were: sunflower seeds, 4; soy flour, 3; soy protein powder, 2; soy milk powder, 1; dried active yeast, 1; brewers' yeast, 1; rye flour, 1; brown rice, 1; and alfalfa seeds, 1. The occurrence of this pathogen in three types of soybean products should warrant further investigation of soybean derivatives as potentially significant sources of Salmonella. PMID:572198

  14. [Blinding trachoma: results of a prevalence survey in 8 health districts in CAR].

    PubMed

    Yaya, G; Kemata, B; Youfegan Baanam, M; Bobossi-Serengbe, G

    2015-10-01

    Preventive and curative management of visual impairment is a public health priority in the Central African Republic. The lack of recent, reliable data on ocular pathologies in general, and on trachoma in particular, led the health authorities, in collaboration with their partners, to undertake an epidemiological survey to map the disease. The study was designed to assess the degree of endemicity in the most susceptible population group, children aged 1 to 9 years. As a first step, eight of the country's sixteen health districts were selected for the survey. The data collected will be used to assess the real needs for medical and surgical care and to develop an appropriate strategic plan for managing this condition on a large scale. This cross-sectional descriptive survey was carried out over one month, from November 23 to December 26, 2011, in eight health prefectures of the country. The sampling frame was the population of the eight health districts, based on the exhaustive list of villages and the demographic data from the national census of December 2003, adjusted by an annual growth rate of 2.5%. The administrative headquarters of the districts visited were excluded from the sampling frame. A two-stage random cluster survey was performed: twenty villages (clusters) were drawn in each health district with probability proportional to cumulative population size. In all, 12,800 children of both sexes aged 1 to 9 years were identified, of whom 11,287 (88.2%) were actually examined; the sex ratio was 1.11. The age distribution of the sampled children was comparable to that of the general population. Trachomatous inflammation was diagnosed as follicular (TF) in 26.9% and intense (TI) in 5.9% of the children. Six of the eight districts surveyed are endemic; three of them had prevalence rates of 32.3%, 47.1% and 54.3%, respectively. PMID:26277710

  15. Updating Geospatial Data from Large Scale Data Sources

    NASA Astrophysics Data System (ADS)

    Zhao, R.; Chen, J.; Wang, D.; Shang, Y.; Wang, Z.; Li, X.; Ai, T.

    2011-08-01

    In the past decades, many geospatial databases have been established at the national, regional and municipal levels around the world. It is now widely recognized that keeping these established geospatial databases up to date is critical to their value, so more and more effort has been devoted to their continuous updating. Currently, two main types of updating methods exist: direct updating with remote sensing images or field survey materials, and indirect updating with other already-updated data, such as newly updated larger-scale data. The former is fundamental, because the update data sources of both methods ultimately derive from field surveying and remote sensing; the latter is often more economical and faster. Therefore, after a larger-scale database is updated, the smaller-scale databases should be updated correspondingly in order to keep the multi-scale geospatial database consistent. In this situation, it is natural to apply map generalization technology to geospatial database updating; this is recognized as one of the most promising updating methods, especially in a collaborative updating environment in which databases at different scales are produced and maintained separately by organizations at different levels, as in China. This paper focuses on applying digital map generalization to the updating of geospatial databases from larger-scale data in a collaborative updating environment for SDI. The requirements for applying map generalization to spatial database updating are analyzed first, and a brief review of geospatial data updating based on digital map generalization is then given. Based on the requirements analysis and the review, we analyze the key factors for implementing the updating of geospatial data from large scale, including technical…

  16. A review of national health surveys in India.

    PubMed

    Dandona, Rakhi; Pandey, Anamika; Dandona, Lalit

    2016-04-01

    Several rounds of national health surveys have generated a vast amount of data in India since 1992. We describe and compare the key health information gathered, assess the availability of health data in the public domain, and review publications resulting from the National Family Health Survey (NFHS), the District Level Household Survey (DLHS) and the Annual Health Survey (AHS). We highlight issues that need attention to improve the usefulness of the surveys in monitoring changing trends in India's disease burden: (i) inadequate coverage of noncommunicable diseases, injuries and some major communicable diseases; (ii) modest comparability between surveys on the key themes of child and maternal mortality and immunization to understand trends over time; (iii) short time intervals between the most recent survey rounds; and (iv) delays in making individual-level data available for analysis in the public domain. We identified 337 publications using NFHS data; in contrast, only 48 and 3 publications used data from the DLHS and AHS, respectively. As national surveys are resource-intensive, it would be prudent to maximize their benefits. We suggest that India plan for a single major national health survey at five-year intervals in consultation with key stakeholders. This could cover additional major causes of the disease burden and their risk factors, as well as causes of death and adult mortality rate estimation. If done in a standardized manner, such a survey would provide useable and timely data to inform health interventions and facilitate assessment of their impact on population health. PMID:27034522

  17. Understanding Participation in E-Learning in Organizations: A Large-Scale Empirical Study of Employees

    ERIC Educational Resources Information Center

    Garavan, Thomas N.; Carbery, Ronan; O'Malley, Grace; O'Donnell, David

    2010-01-01

    Much remains unknown in the increasingly important field of e-learning in organizations. Drawing on a large-scale survey of employees (N = 557) who had opportunities to participate in voluntary e-learning activities, the factors influencing participation in e-learning are explored in this empirical paper. It is hypothesized that key variables…

  18. What Can We Really Expect from Large-Scale Voucher Programs?

    ERIC Educational Resources Information Center

    Corwin, Ronald G.; Dianda, Marcella R.

    1993-01-01

    Using data from a survey of private schools in California, this article concludes that a proposed large-scale voucher program would not significantly affect public school enrollment. Limited transportation, academic qualifications, socioeconomic status, and English proficiency will affect private school access. Only 1-5% of California's public…

  19. Strategic Leadership for Large-Scale Reform: The Case of England's National Literacy and Numeracy Strategy

    ERIC Educational Resources Information Center

    Leithwood, Kenneth; Jantzi, Doris; Earl, Lorna; Watson, Nancy; Levin, Benjamin; Fullan, Michael

    2004-01-01

    Both 'strategic' and 'distributed' forms of leadership are considered promising responses to the demands placed on school systems by large-scale reform initiatives. Using observation, interview and survey data collected as part of a larger evaluation of England's National Literacy and Numeracy Strategies, this study inquired about sources of…

  20. On the Estimation of Hierarchical Latent Regression Models for Large-Scale Assessments

    ERIC Educational Resources Information Center

    Li, Deping; Oranje, Andreas; Jiang, Yanlin

    2009-01-01

    To find population proficiency distributions, a two-level hierarchical linear model may be applied to large-scale survey assessments such as the National Assessment of Educational Progress (NAEP). The model and parameter estimation are developed and a simulation was carried out to evaluate parameter recovery. Subsequently, both a hierarchical and…

  1. A large-scale study of epilepsy in Ecuador: methodological aspects.

    PubMed

    Placencia, M; Suarez, J; Crespo, F; Sander, J W; Shorvon, S D; Ellison, R H; Cascante, S M

    1992-01-01

    The methodology of a large-scale study of epilepsy carried out in a highland area of northern Ecuador, South America, covering a population of 72,121 people, is presented. The study was carried out in two phases. The first, cross-sectional phase consisted of a house-to-house survey of all persons in this population, screening for epileptic seizures using a specially designed questionnaire. Possible cases identified by the screening were assessed in a cascade diagnostic procedure applied by general doctors and neurologists. Its objectives were: to establish a comprehensive epidemiological profile of epileptic seizures; to describe the clinical phenomenology of this condition in the community; to validate methods for the diagnosis and classification of epileptic seizures by a non-specialised team; and to ascertain the community's knowledge, attitudes and practices regarding epilepsy. A sample was selected in this phase in order to study the social aspects of epilepsy in the community. The second, longitudinal phase assessed the capability of non-specialist care in the treatment of epilepsy. It consisted of a prospective clinical trial of antiepileptic therapy in previously untreated patients using two standard antiepileptic drugs. Patients were followed for 12 months by a multidisciplinary team consisting of a primary health worker, a rural doctor, a neurologist, an anthropologist, and a psychologist. Standardised, reproducible instruments and methods were used. The study was carried out through cooperation among the medical profession, political agencies and the pharmaceutical industry at an international level. We consider it a model for further large-scale studies of this type. PMID:1495577

  2. The Challenge of Large-Scale Literacy Improvement

    ERIC Educational Resources Information Center

    Levin, Ben

    2010-01-01

    This paper discusses the challenge of making large-scale improvements in literacy in schools across an entire education system. Despite growing interest and rhetoric, there are very few examples of sustained, large-scale change efforts around school-age literacy. The paper reviews 2 instances of such efforts, in England and Ontario. After…

  3. INTERNATIONAL WORKSHOP ON LARGE-SCALE REFORESTATION: PROCEEDINGS

    EPA Science Inventory

    The purpose of the workshop was to identify major operational and ecological considerations needed to successfully conduct large-scale reforestation projects throughout the forested regions of the world. "Large-scale" for this workshop means projects where, by human effort, approx...

  4. Using Large-Scale Assessment Scores to Determine Student Grades

    ERIC Educational Resources Information Center

    Miller, Tess

    2013-01-01

    Many Canadian provinces provide guidelines for teachers to determine students' final grades by combining a percentage of students' scores from provincial large-scale assessments with their term scores. This practice is thought to hold students accountable by motivating them to put effort into completing the large-scale assessment, thereby…

  5. Nonlinear density fluctuation field theory for large scale structure

    NASA Astrophysics Data System (ADS)

    Zhang, Yang; Miao, Hai-Xing

    2009-05-01

    We develop an effective field theory of density fluctuations for a Newtonian self-gravitating N-body system in quasi-equilibrium and apply it to a homogeneous universe with small density fluctuations. Keeping the density fluctuations up to second order, we obtain the nonlinear field equation of the 2-pt correlation ξ(r), which contains the 3-pt correlation and formal ultra-violet divergences. By the Groth-Peebles hierarchical ansatz and mass renormalization, the equation becomes closed with two new terms beyond the Gaussian approximation, and their coefficients are taken as parameters. The analytic solution is obtained in terms of hypergeometric functions and checked numerically. With one single set of two fixed parameters, the correlation ξ(r) and the corresponding power spectrum P(k) simultaneously match the results from all the major surveys, such as APM, SDSS, 2dFGRS, and REFLEX. The model gives a unifying understanding of several seemingly unrelated features of large scale structure from a field-theoretical perspective. The theory is worth extending to study evolution effects in an expanding universe.

  6. Implicit solvers for large-scale nonlinear problems

    SciTech Connect

    Keyes, D E; Reynolds, D; Woodward, C S

    2006-07-13

    Computational scientists are grappling with increasingly complex, multi-rate applications that couple such physical phenomena as fluid dynamics, electromagnetics, radiation transport, chemical and nuclear reactions, and wave and material propagation in inhomogeneous media. Parallel computers with large storage capacities are paving the way for high-resolution simulations of coupled problems; however, hardware improvements alone will not prove enough to enable simulations based on brute-force algorithmic approaches. To accurately capture nonlinear couplings between dynamically relevant phenomena, often while stepping over rapid adjustments to quasi-equilibria, simulation scientists are increasingly turning to implicit formulations that require a discrete nonlinear system to be solved for each time step or steady state solution. Recent advances in iterative methods have made fully implicit formulations a viable option for solution of these large-scale problems. In this paper, we overview one of the most effective iterative methods, Newton-Krylov, for nonlinear systems and point to software packages with its implementation. We illustrate the method with an example from magnetically confined plasma fusion and briefly survey other areas in which implicit methods have bestowed important advantages, such as allowing high-order temporal integration and providing a pathway to sensitivity analyses and optimization. Lastly, we overview algorithm extensions under development motivated by current SciDAC applications.
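
    The Newton-Krylov idea surveyed here is that the Jacobian is never formed: the Krylov solver only needs Jacobian-vector products, which can be approximated by a finite difference of the residual function. A self-contained toy sketch follows — a minimal full-orthogonalization GMRES inside an undamped Newton loop, applied to an illustrative 2x2 nonlinear system; production codes of the kind the paper points to add preconditioning, restarts and globalization:

```python
import numpy as np

def gmres(matvec, b, tol=1e-10, maxiter=50):
    """Minimal GMRES (full Arnoldi) for the matrix-free linear solve."""
    beta = np.linalg.norm(b)
    if beta == 0:
        return np.zeros_like(b)
    V = [b / beta]
    H = np.zeros((maxiter + 1, maxiter))
    for j in range(maxiter):
        w = matvec(V[j])
        for i in range(j + 1):                 # Arnoldi orthogonalization
            H[i, j] = V[i] @ w
            w = w - H[i, j] * V[i]
        H[j + 1, j] = np.linalg.norm(w)
        e1 = np.zeros(j + 2); e1[0] = beta     # min ||beta*e1 - H y||
        y, *_ = np.linalg.lstsq(H[:j + 2, :j + 1], e1, rcond=None)
        if np.linalg.norm(H[:j + 2, :j + 1] @ y - e1) < tol * beta:
            return np.column_stack(V[:j + 1]) @ y
        if H[j + 1, j] == 0:                   # happy breakdown
            break
        V.append(w / H[j + 1, j])
    return np.column_stack(V[:j + 1]) @ y

def newton_krylov(F, x0, tol=1e-10, maxiter=20, eps=1e-7):
    """Jacobian-free Newton-Krylov: J(x) v is approximated by a finite
    difference of F, so the Jacobian matrix is never assembled."""
    x = x0.astype(float)
    for _ in range(maxiter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        Jv = lambda v: (F(x + eps * v) - Fx) / eps
        x = x + gmres(Jv, -Fx)
    return x

# Illustrative system: x0 + x1 = 3, x0 * x1 = 2, with root (2, 1).
def F(x):
    return np.array([x[0] + x[1] - 3.0, x[0] * x[1] - 2.0])

root = newton_krylov(F, np.array([2.5, 0.5]))
```

    The only access to the problem is through `F`; that is what makes the approach attractive for the coupled multi-physics residuals described above.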

  7. Approximate registration of point clouds with large scale differences

    NASA Astrophysics Data System (ADS)

    Novak, D.; Schindler, K.

    2013-10-01

    3D reconstruction of objects is a basic task in many fields, including surveying, engineering, entertainment and cultural heritage. The task is nowadays often accomplished with a laser scanner, which produces dense point clouds, but lacks accurate colour information, and lacks per-point accuracy measures. An obvious solution is to combine laser scanning with photogrammetric recording. In that context, the problem arises to register the two datasets, which feature large scale, translation and rotation differences. The absence of approximate registration parameters (3D translation, 3D rotation and scale) precludes the use of fine-registration methods such as ICP. Here, we present a method to register realistic photogrammetric and laser point clouds in a fully automated fashion. The proposed method decomposes the registration into a sequence of simpler steps: first, two rotation angles are determined by finding dominant surface normal directions, then the remaining parameters are found with RANSAC followed by ICP and scale refinement. These two steps are carried out at low resolution, before computing a precise final registration at higher resolution.
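
    The decomposition described above can be illustrated on synthetic data: a rotation estimated from dominant surface normals, followed by scale and translation recovery. The sketch below covers only the normal-alignment and scale/translation steps on a synthetic planar patch; the RANSAC and ICP refinement stages of the actual method are omitted, the in-plane rotation is unconstrained in general (here it happens to be zero by construction), and all data are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

def dominant_normal(points):
    """Direction of least variance (smallest principal axis): a proxy for
    the dominant surface normal used to fix two of the rotation angles."""
    u = points - points.mean(axis=0)
    return np.linalg.svd(u, full_matrices=False)[2][-1]

def oriented(n):
    """Resolve the up/down sign ambiguity of a normal by convention."""
    return n if n[2] >= 0 else -n

def rotation_between(a, b):
    """Minimal rotation taking unit vector a onto unit vector b (Rodrigues)."""
    v, c = np.cross(a, b), a @ b
    K = np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])
    return np.eye(3) + K + K @ K / (1.0 + c)

# Synthetic 'laser' cloud: a noisy planar patch ...
n_pts = 200
src = np.column_stack([rng.uniform(-1, 1, n_pts), rng.uniform(-1, 1, n_pts),
                       0.01 * rng.normal(size=n_pts)])
# ... and a 'photogrammetric' copy: rotated, scaled 3x and translated.
R_true = rotation_between(np.array([0.0, 0.0, 1.0]),
                          np.array([np.sin(0.7), 0.0, np.cos(0.7)]))
dst = 3.0 * src @ R_true.T + np.array([5.0, -2.0, 1.0])

# Step 1: rotation from the dominant surface normals.
R_est = rotation_between(oriented(dominant_normal(src)),
                         oriented(dominant_normal(dst)))
# Step 2: scale from the mean spread about the centroids, then translation.
scale = (np.linalg.norm(dst - dst.mean(0), axis=1).mean() /
         np.linalg.norm(src - src.mean(0), axis=1).mean())
aligned = scale * src @ R_est.T
aligned += dst.mean(0) - aligned.mean(0)
rms = np.sqrt(((aligned - dst) ** 2).sum(axis=1).mean())
```

    In the full pipeline this coarse alignment is exactly what provides the approximate parameters that ICP needs to converge.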

  8. Selection within households in health surveys

    PubMed Central

    Alves, Maria Cecilia Goi Porto; Escuder, Maria Mercedes Loureiro; Claro, Rafael Moreira; da Silva, Nilza Nunes

    2014-01-01

    OBJECTIVE To compare the efficiency and accuracy of sampling designs including and excluding the sampling of individuals within sampled households in health surveys. METHODS From a population survey conducted in the Baixada Santista Metropolitan Area, São Paulo, Southeastern Brazil, between 2006 and 2007, 1,000 samples were drawn for each design, and estimates for people aged 18 to 59 years and 18 years and over were calculated for each sample. In the first design, 40 census tracts, 12 households per tract, and one person per household were sampled. In the second, no sampling within the household was performed: 40 census tracts were sampled, with six households per tract for the 18 to 59-year age group and five or six for the 18-and-over age group. Precision and bias of the proportion estimates for 11 indicators were assessed in the two final sets of 1,000 selected samples for the two designs, and compared by means of relative measurements: coefficient of variation, bias/mean ratio, bias/standard error ratio, and relative mean square error. The comparison of costs contrasted the basic cost per person, the cost per household, and the numbers of people and households. RESULTS Bias was found to be negligible for both designs. Lower precision and higher costs were found for the design that included sampling of individuals within households. CONCLUSIONS The design excluding individual sampling achieved higher levels of efficiency and accuracy and, accordingly, should be the first choice for investigators. Sampling of household dwellers should be adopted when there are reasons related to the study subject that may bias individual responses if multiple dwellers answer the proposed questionnaire. PMID:24789641
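
    The comparison of designs can be mimicked on a toy population: repeated draws under each design yield the relative measurements used in the study (coefficient of variation, bias/mean ratio). A sketch with an invented population, invented sample sizes and a household-level random effect standing in for within-household correlation — not the survey's actual frame:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy population: 5,000 households with 1-4 adults; a household-level
# random effect makes dwellers of the same household correlated.
n_hh = 5000
hh_size = rng.integers(1, 5, size=n_hh)
starts = np.concatenate([[0], np.cumsum(hh_size)])
person_hh = np.repeat(np.arange(n_hh), hh_size)
hh_effect = rng.normal(0.0, 1.0, n_hh)
p_ind = 1.0 / (1.0 + np.exp(-(-1.0 + hh_effect[person_hh])))
outcome = rng.random(p_ind.size) < p_ind
true_prev = outcome.mean()

def simulate(design, n_draws=1000):
    est = np.empty(n_draws)
    for k in range(n_draws):
        if design == "one_per_household":
            hh = rng.choice(n_hh, 240, replace=False)
            # one random dweller per household, weighted by household size
            idx = starts[hh] + rng.integers(0, hh_size[hh])
            est[k] = np.average(outcome[idx], weights=hh_size[hh])
        else:  # every dweller of a smaller set of households
            hh = rng.choice(n_hh, 120, replace=False)
            est[k] = outcome[np.isin(person_hh, hh)].mean()
    return est

metrics = {}
for design in ("one_per_household", "all_dwellers"):
    e = simulate(design)
    metrics[design] = {
        "bias/mean": (e.mean() - true_prev) / e.mean(),
        "cv": e.std() / e.mean(),  # coefficient of variation (precision)
    }
```

    The same loop extended with per-draw cost accounting would reproduce the cost comparison; here only bias and precision are tabulated.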

  9. Generating mock data sets for large-scale Lyman-α forest correlation measurements

    SciTech Connect

    Font-Ribera, Andreu; McDonald, Patrick; Miralda-Escudé, Jordi

    2012-01-01

    Massive spectroscopic surveys of high-redshift quasars yield large numbers of correlated Lyα absorption spectra that can be used to measure large-scale structure. Simulations of these surveys are required to accurately interpret the measurements of correlations and correct for systematic errors. An efficient method to generate mock realizations of Lyα forest surveys is presented which generates a field over the lines of sight to the survey sources only, instead of having to generate it over the entire three-dimensional volume of the survey. The method can be calibrated to reproduce the power spectrum and one-point distribution function of the transmitted flux fraction, as well as the redshift evolution of these quantities, and is easily used for modeling any survey systematic effects. We present an example of how these mock surveys are applied to predict the measurement errors in a survey with similar parameters as the BOSS quasar survey in SDSS-III.
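
    The core trick — generating the field only at the sightline pixels — amounts to drawing a correlated Gaussian vector with a covariance evaluated at those pixel positions, then mapping it to transmitted flux. A small-scale sketch under assumed forms for the correlation function and the flux transformation (the real method uses calibrated models and far larger surveys; every number here is illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

# Pixels along a few parallel lines of sight (coordinates nominally in
# Mpc/h): the Gaussian field is generated only at these points, never
# over the survey's full 3D volume.
n_los, n_pix = 10, 80
x_t = rng.uniform(0, 100, n_los)        # transverse sightline positions
z_par = np.linspace(0, 200, n_pix)      # pixels along each sightline
pos = np.array([(xt, zp) for xt in x_t for zp in z_par])

# Correlated Gaussian field at the sightline pixels only, drawn via a
# Cholesky factor of an assumed exponential correlation function.
r = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
C = np.exp(-r / 10.0) + 1e-10 * np.eye(len(pos))
delta = np.linalg.cholesky(C) @ rng.normal(size=len(pos))

# Map to transmitted flux with a fluctuating Gunn-Peterson-style
# transformation F = exp(-tau), tau = A * exp(b * delta); A and b would
# be calibrated to the observed flux distribution and power spectrum.
flux = np.exp(-0.3 * np.exp(1.0 * delta)).reshape(n_los, n_pix)
```

    Because the covariance matrix scales with the number of sightline pixels rather than the survey volume, this is what makes full-survey mocks tractable.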

  10. A Review of International Large-Scale Assessments in Education: Assessing Component Skills and Collecting Contextual Data. PISA for Development

    ERIC Educational Resources Information Center

    Cresswell, John; Schwantner, Ursula; Waters, Charlotte

    2015-01-01

    This report reviews the major international and regional large-scale educational assessments, including international surveys, school-based surveys and household-based surveys. The report compares and contrasts the cognitive and contextual data collection instruments and implementation methods used by the different assessments in order to identify…

  11. Testing gravity using large-scale redshift-space distortions

    NASA Astrophysics Data System (ADS)

    Raccanelli, Alvise; Bertacca, Daniele; Pietrobon, Davide; Schmidt, Fabian; Samushia, Lado; Bartolo, Nicola; Doré, Olivier; Matarrese, Sabino; Percival, Will J.

    2013-11-01

    We use luminous red galaxies from the Sloan Digital Sky Survey (SDSS) II to test the cosmological structure growth in two alternatives to the standard Λ cold dark matter (ΛCDM)+general relativity (GR) cosmological model. We compare observed three-dimensional clustering in SDSS Data Release 7 (DR7) with theoretical predictions for the standard vanilla ΛCDM+GR model, unified dark matter (UDM) cosmologies and the normal branch Dvali-Gabadadze-Porrati (nDGP) model. In computing the expected correlations in UDM cosmologies, we derive a parametrized formula for the growth factor in these models. For our analysis we apply the methodology tested in Raccanelli et al. and use the measurements of Samushia et al. that account for survey geometry, non-linear and wide-angle effects and the distribution of pair orientation. We show that the estimate of the growth rate is potentially degenerate with wide-angle effects, meaning that extremely accurate measurements of the growth rate on large scales will need to take such effects into account. We use measurements of the zeroth and second-order moments of the correlation function from SDSS DR7 data and the Large Suite of Dark Matter Simulations (LasDamas), and perform a likelihood analysis to constrain the parameters of the models. Using information on the clustering up to r_max = 120 h^-1 Mpc, and after marginalizing over the bias, we find, for UDM models, a speed of sound c_∞ ≤ 6.1 × 10^-4, and, for the nDGP model, a cross-over scale r_c ≥ 340 Mpc, at the 95 per cent confidence level.

  12. Distribution probability of large-scale landslides in central Nepal

    NASA Astrophysics Data System (ADS)

    Timilsina, Manita; Bhandary, Netra P.; Dahal, Ranjan Kumar; Yatabe, Ryuichi

    2014-12-01

    Large-scale landslides in the Himalaya are defined as huge, deep-seated landslide masses that occurred in the geological past. They are widely distributed in the Nepal Himalaya. The steep topography and high local relief provide high potential for such failures, whereas the dynamic geology and adverse climatic conditions play a key role in the occurrence and reactivation of such landslides. The major geoscientific problems related to such large-scale landslides are 1) difficulties in their identification and delineation, 2) their role as sources of small-scale failures, and 3) reactivation. Only a few scientific publications have appeared concerning large-scale landslides in Nepal. In this context, the identification and quantification of large-scale landslides and their potential distribution are crucial. Therefore, this study explores the distribution of large-scale landslides in the Lesser Himalaya. It provides simple guidelines to identify large-scale landslides based on their typical characteristics and using a 3D schematic diagram. Based on the spatial distribution of landslides, geomorphological/geological parameters and logistic regression, an equation for the distribution probability of large-scale landslides is also derived. The equation is validated by applying it to another area: the area under the receiver operating characteristic curve for the landslide distribution probability in the new area is 0.699, and the distribution probability could explain > 65% of the existing landslides. Therefore, the regression equation can be applied to areas of the Lesser Himalaya of central Nepal with similar geological and geomorphological conditions.
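
    The probability model behind such a distribution equation is logistic regression on geomorphological predictors, validated by the area under the ROC curve. A sketch on synthetic data, with invented predictors and coefficients (not the study's fitted equation), fitting by plain gradient ascent and computing the AUC via the rank-sum identity:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic inventory: slope angle and local relief as the
# geomorphological predictors, with a known underlying relation.
n = 2000
X = np.column_stack([rng.uniform(10, 50, n),      # slope (degrees)
                     rng.uniform(100, 2000, n)])  # relief (m)
logit = -6.0 + 0.08 * X[:, 0] + 0.002 * X[:, 1]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

# Fit P(landslide) = 1 / (1 + exp(-(b0 + b1*slope + b2*relief)))
# by gradient ascent on the log-likelihood (features standardized).
Z = np.column_stack([np.ones(n), (X - X.mean(0)) / X.std(0)])
beta = np.zeros(3)
for _ in range(2000):
    p = 1 / (1 + np.exp(-Z @ beta))
    beta += 0.1 * Z.T @ (y - p) / n
p = 1 / (1 + np.exp(-Z @ beta))

def auc(y, s):
    """Area under the ROC curve via the Mann-Whitney rank-sum identity."""
    ranks = np.empty(n)
    ranks[np.argsort(s)] = np.arange(1, n + 1)
    n1 = y.sum()
    return (ranks[y == 1].sum() - n1 * (n1 + 1) / 2) / ((n - n1) * n1)

score = auc(y, p)
```

    An AUC around 0.7, as in the study's validation area, means the predicted probability ranks landslide cells above non-landslide cells about 70% of the time.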

  13. Survey on Continuing Education Needs for Health Professionals: Report.

    ERIC Educational Resources Information Center

    System Development Corp., Santa Monica, CA.

    The report documents the results of a 1967 survey of health professionals in the four-State Western Interstate Commission for Higher Education (WICHE) Mountain States Regional Medical Program (MS/RMP). Addressed to health professionals in each of the four States--Idaho, Montana, Nevada, and Wyoming--the survey focuses primarily on the…

  14. HARRISBURG TRI-COUNTY HEALTH MANPOWER SURVEY REPORT. PRELIMINARY.

    ERIC Educational Resources Information Center

    RATNER, MURIEL

    THE HARRISBURG AREA COMMUNITY COLLEGE COOPERATED WITH TWO HOSPITALS IN A SURVEY OF THE AREA'S NEEDS FOR HEALTH TECHNICIANS. DATA, COLLECTED BY QUESTIONNAIRE SURVEYS OF DOCTORS AND DENTISTS AND BY INTERVIEWS WITH ADMINISTRATORS OF HOSPITALS, NURSING HOMES AND PROFESSIONAL ORGANIZATIONS, INDICATED THAT (1) A 60-PERCENT INCREASE IN HEALTH MANPOWER…

  15. THIRD NATIONAL HEALTH AND NUTRITION EXAMINATION SURVEY (NHANES III)

    EPA Science Inventory

    The Third National Health and Nutrition Examination Survey (NHANES III), 1988-94, was conducted on a nationwide probability sample of approximately 33,994 persons 2 months and over. The survey was designed to obtain nationally representative information on the health and nutritio...

  16. Worksite Health Promotion Activities. 1992 National Survey. Summary Report.

    ERIC Educational Resources Information Center

    Public Health Service (DHHS), Rockville, MD. Office of Disease Prevention and Health Promotion.

    The survey reported in this document examined worksite health promotion and disease prevention activities in 1,507 private worksites in the United States. Specifically, the survey assessed policies, practices, services, facilities, information, and activities sponsored by employers to improve the health of their employees, and assessed health…

  17. Quasars as a Tracer of Large-scale Structures in the Distant Universe

    NASA Astrophysics Data System (ADS)

    Song, Hyunmi; Park, Changbom; Lietzen, Heidi; Einasto, Maret

    2016-08-01

    We study the dependence of the number density and properties of quasars on the background galaxy density using the currently largest spectroscopic data sets of quasars and galaxies. We construct a galaxy number density field smoothed over a variable smoothing scale of between approximately 10 and 20 h⁻¹ Mpc over the redshift range 0.46 < z < 0.59 using the Sloan Digital Sky Survey (SDSS) Data Release 12 (DR12) Constant MASS (CMASS) galaxies. The quasar sample is prepared from the SDSS-I/II DR7. We examine the correlation between the incidence of quasars and the large-scale background density, and the dependence of quasar properties such as bolometric luminosity, black hole mass, and Eddington ratio on the large-scale density. We find a monotonic correlation between the quasar number density and the large-scale galaxy number density, which is fitted well with a power-law relation, n_Q ∝ ρ_G^0.618. We detect weak dependences of quasar properties on the large-scale density, such as a positive correlation between black hole mass and density, and a negative correlation between luminosity and density. We discuss the possibility of using quasars as a tracer of large-scale structures at high redshifts, which may be useful for studies of the growth of structures in the high-redshift universe.
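
    The quoted power-law relation, n_Q ∝ ρ_G^0.618, is the kind of fit one obtains from a straight line in log-log space. A minimal sketch with synthetic data; the density bins, normalisation, and scatter are illustrative assumptions, not the paper's pipeline:

```python
import numpy as np

rng = np.random.default_rng(1)
rho_g = np.logspace(-1, 1, 30)   # mock background galaxy density bins
# Mock quasar number density following the quoted exponent, with log-normal scatter.
n_q = 2.5 * rho_g**0.618 * rng.lognormal(0, 0.05, rho_g.size)

# Least-squares fit of log n_Q = log A + alpha * log rho_G.
alpha, logA = np.polyfit(np.log(rho_g), np.log(n_q), 1)
print(f"fitted exponent alpha = {alpha:.3f}")
```

With low scatter the recovered exponent lands close to the input 0.618, which is all a log-log fit of binned densities can promise.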

  18. Ultra-large-scale Cosmology in Next-generation Experiments with Single Tracers

    NASA Astrophysics Data System (ADS)

    Alonso, David; Bull, Philip; Ferreira, Pedro G.; Maartens, Roy; Santos, Mário G.

    2015-12-01

    Future surveys of large-scale structure will be able to measure perturbations on the scale of the cosmological horizon, and so could potentially probe a number of novel relativistic effects that are negligibly small on sub-horizon scales. These effects leave distinctive signatures in the power spectra of clustering observables and, if measurable, would open a new window on relativistic cosmology. We quantify the size and detectability of the effects for the most relevant future large-scale structure experiments: spectroscopic and photometric galaxy redshift surveys, intensity mapping surveys of neutral hydrogen, and radio continuum surveys. Our forecasts show that next-generation experiments, reaching out to redshifts z ≃ 4, will not be able to detect previously undetected general-relativistic effects by using individual tracers of the density field, although the contribution of weak lensing magnification on large scales should be clearly detectable. We also perform a rigorous joint forecast for the detection of primordial non-Gaussianity through the excess power it produces in the clustering of biased tracers on large scales, finding that uncertainties of σ(f_NL) ∼ 1-2 should be achievable. We study the level of degeneracy of these large-scale effects with several tracer-dependent nuisance parameters, quantifying the minimal priors on the latter that are needed for an optimal measurement of the former. Finally, we discuss the systematic effects that must be mitigated to achieve this level of sensitivity, and some alternative approaches that should help to improve the constraints. The computational tools developed to carry out this study, which requires the full-sky computation of the theoretical angular power spectra for O(100) redshift bins, as well as realistic models of the luminosity function, are publicly available at http://intensitymapping.physics.ox.ac.uk/codes.html.

  19. The application of online surveys for workplace health research.

    PubMed

    Scriven, A; Smith-Ferrier, S

    2003-06-01

    Work has a synergistic relationship to health, and workplaces are increasingly the focus for both health promotion initiatives and health research. Electronic technologies are a common feature of many workplaces and as such are convenient low cost survey vehicles for the health researcher. The methods and the implications of employing Internet methods for health research are covered and a wide range of issues connected with online sampling and data collection are discussed. Because the workplace does not provide a neutral research environment, the security, anonymity and associated issues of using the two main forms of Internet surveys for workplace health research are examined and their advantages and disadvantages are debated. PMID:12852193

  20. Brief 75 Health Physics Enrollments and Degrees Survey, 2014 Data

    SciTech Connect

    None, None

    2015-03-05

    The 2014 survey includes degrees granted between September 1, 2013 and August 31, 2014. Enrollment information refers to the fall term 2014. Twenty-two academic programs were included in the survey universe, with all 22 programs providing data. Since 2009, data for two health physics programs located in engineering departments are also included in the nuclear engineering survey. The enrollments and degrees data includes students majoring in health physics or in an option program equivalent to a major.

  1. Brief 73 Health Physics Enrollments and Degrees Survey, 2013 Data

    SciTech Connect

    None, None

    2014-02-15

    The survey includes degrees granted between September 1, 2012 and August 31, 2013. Enrollment information refers to the fall term 2013. Twenty-two academic programs were included in the survey universe, with all 22 programs providing data. Since 2009, data for two health physics programs located in engineering departments are also included in the nuclear engineering survey. The enrollments and degrees data includes students majoring in health physics or in an option program equivalent to a major.

  2. Large-scale spatial population databases in infectious disease research

    PubMed Central

    2012-01-01

    Modelling studies on the spatial distribution and spread of infectious diseases are becoming increasingly detailed and sophisticated, with global risk mapping and epidemic modelling studies now popular. Yet, in deriving populations at risk of disease estimates, these spatial models must rely on existing global and regional datasets on population distribution, which are often based on outdated and coarse resolution data. Moreover, a variety of different methods have been used to model population distribution at large spatial scales. In this review we describe the main global gridded population datasets that are freely available for health researchers and compare their construction methods, and highlight the uncertainties inherent in these population datasets. We review their application in past studies on disease risk and dynamics, and discuss how the choice of dataset can affect results. Moreover, we highlight how the lack of contemporary, detailed and reliable data on human population distribution in low income countries is proving a barrier to obtaining accurate large-scale estimates of population at risk and constructing reliable models of disease spread, and suggest research directions required to further reduce these barriers. PMID:22433126

  3. Large-scale spatial population databases in infectious disease research.

    PubMed

    Linard, Catherine; Tatem, Andrew J

    2012-01-01

    Modelling studies on the spatial distribution and spread of infectious diseases are becoming increasingly detailed and sophisticated, with global risk mapping and epidemic modelling studies now popular. Yet, in deriving populations at risk of disease estimates, these spatial models must rely on existing global and regional datasets on population distribution, which are often based on outdated and coarse resolution data. Moreover, a variety of different methods have been used to model population distribution at large spatial scales. In this review we describe the main global gridded population datasets that are freely available for health researchers and compare their construction methods, and highlight the uncertainties inherent in these population datasets. We review their application in past studies on disease risk and dynamics, and discuss how the choice of dataset can affect results. Moreover, we highlight how the lack of contemporary, detailed and reliable data on human population distribution in low income countries is proving a barrier to obtaining accurate large-scale estimates of population at risk and constructing reliable models of disease spread, and suggest research directions required to further reduce these barriers. PMID:22433126

  4. Determining Environmental Impacts of Large Scale Irrigation in Turkey

    NASA Astrophysics Data System (ADS)

    Simpson, K.; Douglas, E. M.; Limbrunner, J. F.; Ozertan, G.

    2010-12-01

    In 1989, the Turkish government launched its most comprehensive regional development plan in history, entitled the Southeastern Anatolia Project (SAP), which focuses on improving the quality of life and income level within the most underdeveloped region in Turkey. This project aims to integrate sustainable human development through agriculture, industry, transportation, education, health and rural and urban infrastructure building. In May 2008, a new action plan was announced for the region which includes the designation of almost 800,000 hectares of previously unirrigated land to be opened for irrigation within the next five years. If not done in a sustainable manner, such a large-scale irrigation project could cause severe environmental impacts. The first objective of our research is to use computer simulations to reproduce the observed environmental impacts of irrigated agriculture in this arid region, primarily by simulating the effects of soil salinization. The second objective of our research is to estimate the soil salinization that could result from expanded irrigation and to suggest sustainable strategies for the newly irrigated land in Turkey in order to minimize these environmental impacts.

  5. EFFECTS OF LARGE-SCALE POULTRY FARMS ON AQUATIC MICROBIAL COMMUNITIES: A MOLECULAR INVESTIGATION.

    EPA Science Inventory

    The effects of large-scale poultry production operations on water quality and human health are largely unknown. Poultry litter is frequently applied as fertilizer to agricultural lands adjacent to large poultry farms. Run-off from the land introduces a variety of stressors into t...

  6. Needs, opportunities, and options for large scale systems research

    SciTech Connect

    Thompson, G.L.

    1984-10-01

    The Office of Energy Research was recently asked to perform a study of Large Scale Systems in order to facilitate the development of a true large systems theory. It was decided to ask experts in the fields of electrical engineering, chemical engineering and manufacturing/operations research for their ideas concerning large scale systems research. The author was asked to distribute a questionnaire among these experts to find out their opinions concerning recent accomplishments and future research directions in large scale systems research. He was also requested to convene a conference which included three experts in each area as panel members to discuss the general area of large scale systems research. The conference was held on March 26-27, 1984 in Pittsburgh with nine panel members and 15 other attendees. The present report is a summary of the ideas presented and the recommendations proposed by the attendees.

  7. Interpretation of large-scale deviations from the Hubble flow

    NASA Astrophysics Data System (ADS)

    Grinstein, B.; Politzer, H. David; Rey, S.-J.; Wise, Mark B.

    1987-03-01

    The theoretical expectation for large-scale streaming velocities relative to the Hubble flow is expressed in terms of statistical correlation functions. Only for objects that trace the mass would these velocities have a simple cosmological interpretation. If some biasing affects the objects' formation, then nonlinear gravitational evolution is essential to predicting the expected large-scale velocities, which also depend on the nature of the biasing.

  8. Large scale suppression of scalar power on a spatial condensation

    NASA Astrophysics Data System (ADS)

    Kouwn, Seyen; Kwon, O.-Kab; Oh, Phillial

    2015-03-01

    We consider a deformed single-field inflation model in terms of three SO(3)-symmetric moduli fields. We find that spatially linear solutions for the moduli fields induce a phase transition during the early stage of inflation and a suppression of the scalar power spectrum at large scales. This suppression can be an origin of the anomalies observed for large-scale perturbation modes in cosmological observations.

  9. Large-scale motions in a plane wall jet

    NASA Astrophysics Data System (ADS)

    Gnanamanickam, Ebenezer; Jonathan, Latim; Shibani, Bhatt

    2015-11-01

    The dynamic significance of large-scale motions in turbulent boundary layers has been the focus of several recent studies, primarily in canonical flows: zero-pressure-gradient boundary layers and flows within pipes and channels. This work presents an investigation into the large-scale motions in a boundary layer that serves as the prototypical flow field for flows with large-scale mixing and reactions, the plane wall jet. An experimental investigation is carried out in a plane wall jet facility designed to operate at friction Reynolds numbers Reτ > 1000, which allows for the development of a significant logarithmic region. The streamwise turbulent intensity across the boundary layer is decomposed into small-scale (less than one integral length scale δ) and large-scale components. The small-scale energy has a peak in the near-wall region associated with the near-wall turbulent cycle, as in canonical boundary layers. However, the large-scale eddies dominate, carrying significantly higher energy than the small scales across almost the entire boundary layer, even at the low to moderate Reynolds numbers under consideration. The large scales also appear to amplitude- and frequency-modulate the smaller scales across the entire boundary layer.

  10. Large scale modelling of bankfull flow: An example for Europe

    NASA Astrophysics Data System (ADS)

    Schneider, Christof; Flörke, Martina; Eisner, Stephanie; Voss, Frank

    2011-10-01

    Summary: Bankfull flow is a relevant parameter in large-scale modelling, especially for the analysis of environmental flows and flood-related hydrological processes. In our case, bankfull flow data were required within the SCENES project in order to analyse ecologically important inundation events at selected grid cells of a European raster. In practice, the determination of bankfull flow is a complex task even at the local scale. Following a literature survey of bankfull flow studies, this paper describes a method that can be applied to estimate bankfull flow on a global or continental grid-cell raster. The method is based on the partial duration series approach, taking into account a 40-year time series of daily discharge data modelled by the global water model WaterGAP. An increasing-threshold censoring procedure, a declustering scheme and the generalised Pareto distribution are applied. Modelled bankfull flow values are then validated by different efficiency criteria against bankfull flows observed at gauging stations in Europe. Thereby, the impacts of (i) the applied distribution function, (ii) the threshold setting in the partial duration series, (iii) the climate input data and (iv) using the annual maxima series are evaluated and compared with the proposed approach. The results show that bankfull flow can be reasonably estimated, with a high model efficiency (E1 = 0.71) and weighted correlation (ωr² = 0.90), albeit with a systematic overestimation of 22.8%. Finally, it turned out that in our study, which focuses on hydrological extremes, the use of daily climate input data is a basic requirement. While the choice of the distribution function had no significant impact on the final results, the threshold setting in the partial duration series was crucial.
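
    The partial-duration-series chain described above (threshold censoring, declustering, generalised Pareto fit, return level) can be sketched as follows. The synthetic discharge series, the 7-day declustering window, the method-of-moments fit (rather than the paper's fitting procedure), and the 1.5-year return period (a common bankfull convention, not stated in the abstract) are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
ndays = 40 * 365                                   # 40-year daily series, as in the study
q = rng.gamma(shape=2.0, scale=50.0, size=ndays)   # mock daily discharge (m^3/s)

u = np.quantile(q, 0.95)                           # censoring threshold
exceed_idx = np.flatnonzero(q > u)

# Simple declustering: keep only exceedance peaks separated by > 7 days.
excesses, last = [], -(10**9)
for i in exceed_idx:
    if i - last > 7:
        excesses.append(q[i] - u)
        last = i
excesses = np.array(excesses)

# Method-of-moments fit of the generalised Pareto distribution.
m, v = excesses.mean(), excesses.var()
xi = 0.5 * (1 - m**2 / v)                          # shape parameter
sigma = 0.5 * m * (m**2 / v + 1)                   # scale parameter

# Return level for an assumed 1.5-year event; lam = declustered peaks per year.
lam = excesses.size / 40.0
T = 1.5
bankfull = u + (sigma / xi) * ((lam * T) ** xi - 1)
print(f"threshold u = {u:.1f} m^3/s, bankfull estimate = {bankfull:.1f} m^3/s")
```

In the study this estimate would then be compared against observed bankfull flows at gauging stations via the efficiency criteria mentioned above.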

  11. Inflationary tensor fossils in large-scale structure

    SciTech Connect

    Dimastrogiovanni, Emanuela; Fasiello, Matteo; Jeong, Donghui; Kamionkowski, Marc

    2014-12-01

    Inflation models make specific predictions for a tensor-scalar-scalar three-point correlation, or bispectrum, between one gravitational-wave (tensor) mode and two density-perturbation (scalar) modes. This tensor-scalar-scalar correlation leads to a local power quadrupole, an apparent departure from statistical isotropy in our Universe, as well as characteristic four-point correlations in the current mass distribution in the Universe. So far, the predictions for these observables have been worked out only for single-clock models in which certain consistency conditions between the tensor-scalar-scalar correlation and tensor and scalar power spectra are satisfied. Here we review the requirements on inflation models for these consistency conditions to be satisfied. We then consider several examples of inflation models, such as non-attractor and solid-inflation models, in which these conditions are put to the test. In solid inflation the simplest consistency conditions are already violated whilst in the non-attractor model we find that, contrary to the standard scenario, the tensor-scalar-scalar correlator probes directly relevant model-dependent information. We work out the predictions for observables in these models. For non-attractor inflation we find an apparent local quadrupolar departure from statistical isotropy in large-scale structure but that this power quadrupole decreases very rapidly at smaller scales. The consistency of the CMB quadrupole with statistical isotropy then constrains the distance scale that corresponds to the transition from the non-attractor to attractor phase of inflation to be larger than the currently observable horizon. Solid inflation predicts clustering fossils signatures in the current galaxy distribution that may be large enough to be detectable with forthcoming, and possibly even current, galaxy surveys.

  12. A review of national health surveys in India

    PubMed Central

    Pandey, Anamika; Dandona, Lalit

    2016-01-01

    Several rounds of national health surveys have generated a vast amount of data in India since 1992. We describe and compare the key health information gathered, assess the availability of health data in the public domain, and review publications resulting from the National Family Health Survey (NFHS), the District Level Household Survey (DLHS) and the Annual Health Survey (AHS). We highlight issues that need attention to improve the usefulness of the surveys in monitoring changing trends in India’s disease burden: (i) inadequate coverage of noncommunicable diseases, injuries and some major communicable diseases; (ii) modest comparability between surveys on the key themes of child and maternal mortality and immunization to understand trends over time; (iii) short time intervals between the most recent survey rounds; and (iv) delays in making individual-level data available for analysis in the public domain. We identified 337 publications using NFHS data; in contrast, only 48 and three publications used data from the DLHS and AHS, respectively. As national surveys are resource-intensive, it would be prudent to maximize their benefits. We suggest that India plan for a single major national health survey at five-year intervals in consultation with key stakeholders. This could cover additional major causes of the disease burden and their risk factors, as well as causes of death and adult mortality rate estimation. If done in a standardized manner, such a survey would provide useable and timely data to inform health interventions and facilitate assessment of their impact on population health. PMID:27034522

  13. Asian American Field Survey: Re-Analysis of Health Data.

    ERIC Educational Resources Information Center

    Ito, Karen L.; So, Alvin

    Data from the Asian American Field Survey of 1973 were examined to determine health problems, methods of seeking and paying for health services, health insurance coverage, and frequency of medical examinations among Japanese, Chinese, Filipino, Korean, and Samoan families in the United States. The analysis indicated that the Chinese reported the…

  14. ADHD and Health Services Utilization in the National Health Interview Survey

    ERIC Educational Resources Information Center

    Cuffe, Steven P.; Moore, Charity G.; McKeown, Robert

    2009-01-01

    Objective: Describe the general health, comorbidities and health service use among U.S. children with ADHD. Method: The 2001 National Health Interview Survey (NHIS) contained the Strengths and Difficulties Questionnaire (SDQ; used to determine probable ADHD), data on medical problems, overall health, and health care utilization. Results: Asthma…

  15. CONSUMER ASSESSMENT OF HEALTH PLANS SURVEY (CAHPS)

    EPA Science Inventory

    This 5-year project has been used for consumers to identify the best health care plans and services for their needs. The goals of the Consumer Assessment of Health Plans (CAHPS?) are to (1) develop and test questionnaires that assess health plans and services, (2) produce easily ...

  16. Toward the intelligent use of health care consumer surveys.

    PubMed

    Allen, H M

    1995-01-01

    Consumer surveys are at a pivotal moment in health care. With demand for consumer-supplied data escalating in every sector of the industry, current opportunities for consumer surveys to demonstrate unique value in the marketplace are unparalleled. These opportunities, however, carry considerable risks, particularly with respect to performance report cards for competing health plans and providers. As investigators multiply in an area notably lacking in standardization, the chances increase that surveys will arrive at conflicting assessments of plans and providers. To resolve these inconsistencies, users will need to sharpen their understanding of the role of consumer surveys, the business and operational needs they can address, and how their results can be affected by methodology. This article discusses each of these issues with an eye toward promoting intelligent use of consumer surveys in the health care marketplace. PMID:10151590

  17. Unsaturated Hydraulic Conductivity for Evaporation in Large scale Heterogeneous Soils

    NASA Astrophysics Data System (ADS)

    Sun, D.; Zhu, J.

    2014-12-01

    In this study we aim to provide practical guidelines on how the commonly used simple averaging schemes (arithmetic, geometric, or harmonic mean) perform in simulating evaporation across a large-scale heterogeneous landscape. Previous studies on hydraulic property upscaling, which focused on steady-state flux exchanges, illustrated that an effective hydraulic property is usually more difficult to define for evaporation. This study focuses on upscaling the hydraulic properties of large-scale transient evaporation dynamics using the stream tube approach. Specifically, the two main objectives are: (1) to determine whether the three simple averaging schemes (i.e., arithmetic, geometric and harmonic means) of hydraulic parameters are appropriate for representing large-scale evaporation processes, and (2) to assess how the applicability of these simple averaging schemes depends on the time scale of the evaporation processes in heterogeneous soils. Multiple realizations of local evaporation processes are carried out using the HYDRUS-1D computational code (Simunek et al., 1998). The three averaging schemes of soil hydraulic parameters were used to simulate the cumulative flux exchange, which is then compared with the large-scale average cumulative flux. The sensitivity of the relative errors to the time frame of the evaporation processes is also discussed.
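
    The three averaging schemes compared in the study can be shown in a few lines. The conductivity values below are hypothetical stand-ins for a heterogeneous field; which mean best represents large-scale evaporation is exactly the question the abstract investigates.

```python
import numpy as np

Ks = np.array([0.5, 2.0, 10.0, 40.0])   # mock local saturated conductivities (cm/day)

arithmetic = Ks.mean()
geometric = np.exp(np.log(Ks).mean())
harmonic = Ks.size / np.sum(1.0 / Ks)

print(f"arithmetic = {arithmetic:.2f}, geometric = {geometric:.2f}, "
      f"harmonic = {harmonic:.2f}")
```

Since harmonic ≤ geometric ≤ arithmetic always holds, the choice of scheme alone can shift the effective parameter by an order of magnitude for strongly heterogeneous soils, which is why the study tests all three against the large-scale average flux.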

  18. EINSTEIN'S SIGNATURE IN COSMOLOGICAL LARGE-SCALE STRUCTURE

    SciTech Connect

    Bruni, Marco; Hidalgo, Juan Carlos; Wands, David

    2014-10-10

    We show how the nonlinearity of general relativity generates a characteristic non-Gaussian signal in cosmological large-scale structure that we calculate at all perturbative orders in a large-scale limit. Newtonian gravity and general relativity provide complementary theoretical frameworks for modeling large-scale structure in ΛCDM cosmology; a relativistic approach is essential to determine initial conditions, which can then be used in Newtonian simulations studying the nonlinear evolution of the matter density. Most inflationary models in the very early universe predict an almost Gaussian distribution for the primordial metric perturbation, ζ. However, we argue that it is the Ricci curvature of comoving-orthogonal spatial hypersurfaces, R, that drives structure formation at large scales. We show how the nonlinear relation between the spatial curvature, R, and the metric perturbation, ζ, translates into a specific non-Gaussian contribution to the initial comoving matter density that we calculate for the simple case of an initially Gaussian ζ. Our analysis shows the nonlinear signature of Einstein's gravity in large-scale structure.

  19. Numerical methods for large-scale, time-dependent partial differential equations

    NASA Technical Reports Server (NTRS)

    Turkel, E.

    1979-01-01

    A survey of numerical methods for time dependent partial differential equations is presented. The emphasis is on practical applications to large scale problems. A discussion of new developments in high order methods and moving grids is given. The importance of boundary conditions is stressed for both internal and external flows. A description of implicit methods is presented including generalizations to multidimensions. Shocks, aerodynamics, meteorology, plasma physics and combustion applications are also briefly described.

  20. Toward Improved Support for Loosely Coupled Large Scale Simulation Workflows

    SciTech Connect

    Boehm, Swen; Elwasif, Wael R; Naughton, III, Thomas J; Vallee, Geoffroy R

    2014-01-01

    High-performance computing (HPC) workloads are increasingly leveraging loosely coupled large scale simulations. Unfortunately, most large-scale HPC platforms, including Cray/ALPS environments, are designed for the execution of long-running jobs based on coarse-grained launch capabilities (e.g., one MPI rank per core on all allocated compute nodes). This assumption limits capability-class workload campaigns that require large numbers of discrete or loosely coupled simulations, and where time-to-solution is an untenable pacing issue. This paper describes the challenges related to the support of fine-grained launch capabilities that are necessary for the execution of loosely coupled large scale simulations on Cray/ALPS platforms. More precisely, we present the details of an enhanced runtime system to support this use case, and report on initial results from early testing on systems at Oak Ridge National Laboratory.
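
    A minimal illustration of the workload shape the paper targets: many independent short tasks launched under one allocation instead of a single long MPI job. This sketch uses a local thread pool as a stand-in; the enhanced Cray/ALPS runtime described in the paper is a cluster-level analogue, and `run_simulation` is a hypothetical placeholder for one discrete simulation instance.

```python
from concurrent.futures import ThreadPoolExecutor

def run_simulation(params: int) -> int:
    # Hypothetical stand-in for one loosely coupled simulation instance.
    return params * params

# Launch 100 fine-grained tasks concurrently under a single "allocation".
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(run_simulation, range(100)))
print(f"completed {len(results)} loosely coupled tasks")
```

On a real HPC platform each task would be a separate executable on its own subset of nodes, which is precisely the fine-grained launch capability coarse-grained schedulers lack.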

  1. Acoustic Studies of the Large Scale Ocean Circulation

    NASA Technical Reports Server (NTRS)

    Menemenlis, Dimitris

    1999-01-01

    Detailed knowledge of ocean circulation and its transport properties is prerequisite to an understanding of the earth's climate and of important biological and chemical cycles. Results from two recent experiments, THETIS-2 in the Western Mediterranean and ATOC in the North Pacific, illustrate the use of ocean acoustic tomography for studies of the large scale circulation. The attraction of acoustic tomography is its ability to sample and average the large-scale oceanic thermal structure, synoptically, along several sections, and at regular intervals. In both studies, the acoustic data are compared to, and then combined with, general circulation models, meteorological analyses, satellite altimetry, and direct measurements from ships. Both studies provide complete regional descriptions of the time-evolving, three-dimensional, large scale circulation, albeit with large uncertainties. The studies raise serious issues about existing ocean observing capability and provide guidelines for future efforts.

  2. A relativistic signature in large-scale structure

    NASA Astrophysics Data System (ADS)

    Bartolo, Nicola; Bertacca, Daniele; Bruni, Marco; Koyama, Kazuya; Maartens, Roy; Matarrese, Sabino; Sasaki, Misao; Verde, Licia; Wands, David

    2016-09-01

    In General Relativity, the constraint equation relating metric and density perturbations is inherently nonlinear, leading to an effective non-Gaussianity in the dark matter density field on large scales, even if the primordial metric perturbation is Gaussian. Intrinsic non-Gaussianity in the large-scale dark matter overdensity in GR is real and physical. However, the variance smoothed on a local physical scale is not correlated with the large-scale curvature perturbation, so that there is no relativistic signature in the galaxy bias when using the simplest model of bias. It is an open question whether the observable mass proxies such as luminosity or weak lensing correspond directly to the physical mass in the simple halo bias model. If not, there may be observables that encode this relativistic signature.

  3. Coupling between convection and large-scale circulation

    NASA Astrophysics Data System (ADS)

    Becker, T.; Stevens, B. B.; Hohenegger, C.

    2014-12-01

    The ultimate drivers of convection - radiation, tropospheric humidity and surface fluxes - are altered both by the large-scale circulation and by convection itself. A quantity to which all of these drivers contribute is moist static energy, or gross moist stability. Therefore, a variance analysis of the moist static energy budget in radiative-convective equilibrium helps in understanding the interaction of precipitating convection and the large-scale environment. In addition, this method provides insights concerning the impact of convective aggregation on this coupling. As a starting point, the interaction is analyzed with a general circulation model, but a model intercomparison study using a hierarchy of models is planned. Effective coupling parameters will be derived from cloud-resolving models, and these will in turn be related to assumptions used to parameterize convection in large-scale models.

  4. Large-scale current systems in the dayside Venus ionosphere

    NASA Technical Reports Server (NTRS)

    Luhmann, J. G.; Elphic, R. C.; Brace, L. H.

    1981-01-01

    The occasional observation of large-scale horizontal magnetic fields within the dayside ionosphere of Venus by the flux gate magnetometer on the Pioneer Venus orbiter suggests the presence of large-scale current systems. Using the measured altitude profiles of the magnetic field and the electron density and temperature, together with the previously reported neutral atmosphere density and composition, it is found that the local ionosphere can be described at these times by a simple steady state model which treats the unobserved quantities, such as the electric field, as parameters. When the model is appropriate, the altitude profiles of the ion and electron velocities and the currents along the satellite trajectory can be inferred. These results elucidate the configurations and sources of the ionospheric current systems which produce the observed large-scale magnetic fields, and in particular illustrate the effect of ion-neutral coupling in the determination of the current system at low altitudes.

  5. Do Large-Scale Topological Features Correlate with Flare Properties?

    NASA Astrophysics Data System (ADS)

    DeRosa, Marc L.; Barnes, Graham

    2016-05-01

    In this study, we aim to identify whether the presence or absence of particular topological features in the large-scale coronal magnetic field are correlated with whether a flare is confined or eruptive. To this end, we first determine the locations of null points, spine lines, and separatrix surfaces within the potential fields associated with the locations of several strong flares from the current and previous sunspot cycles. We then validate the topological skeletons against large-scale features in observations, such as the locations of streamers and pseudostreamers in coronagraph images. Finally, we characterize the topological environment in the vicinity of the flaring active regions and identify the trends involving their large-scale topologies and the properties of the associated flares.

  6. Magnetic Helicity and Large Scale Magnetic Fields: A Primer

    NASA Astrophysics Data System (ADS)

    Blackman, Eric G.

    2015-05-01

    Magnetic fields of laboratory, planetary, stellar, and galactic plasmas commonly exhibit significant order on large temporal or spatial scales compared to the otherwise random motions within the hosting system. Such ordered fields can be measured in the case of planets, stars, and galaxies, or inferred indirectly by the action of their dynamical influence, such as jets. Whether large scale fields are amplified in situ or a remnant from previous stages of an object's history is often debated for objects without a definitive magnetic activity cycle. Magnetic helicity, a measure of twist and linkage of magnetic field lines, is a unifying tool for understanding large scale field evolution for both mechanisms of origin. Its importance stems from its two basic properties: (1) magnetic helicity is typically better conserved than magnetic energy; and (2) the magnetic energy associated with a fixed amount of magnetic helicity is minimized when the system relaxes this helical structure to the largest scale available. Here I discuss how magnetic helicity has come to help us understand the saturation of and sustenance of large scale dynamos, the need for either local or global helicity fluxes to avoid dynamo quenching, and the associated observational consequences. I also discuss how magnetic helicity acts as a hindrance to turbulent diffusion of large scale fields, and thus a helper for fossil remnant large scale field origin models in some contexts. I briefly discuss the connection between large scale fields and accretion disk theory as well. The goal here is to provide a conceptual primer to help the reader efficiently penetrate the literature.

  7. New Mexico Adolescent Health Risks Survey.

    ERIC Educational Resources Information Center

    Antle, David

    To inform students of health risks (posed by behavior, environment, and genetics) and provide schools with collective risk appraisal information as a basis for planning/evaluating health and wellness initiatives, New Mexico administered the Teen Wellness Check in 1985 to 1,573 ninth-grade students from 7 New Mexico public schools. Subjects were…

  8. SURVEY OF THE PUBLIC HEALTH NUTRITION WORKFORCE

    EPA Science Inventory

    The Association of State and Territorial Public Health Nutrition Directors (ASTPHND), with support from a cooperative agreement with the U.S. Department of Agriculture (USDA), conducted a census of the professional and paraprofessional public health nutrition workforce in the sta...

  9. French Frigate Shoals reef health survey

    USGS Publications Warehouse

    Work, Thierry M.; Coles, Steve L.; Rameyer, Robert

    2002-01-01

    French Frigate Shoals consists of a large (31 nautical mile) fringing reef partially enclosing a lagoon. A basalt pinnacle (La Perouse Pinnacle) rises approximately halfway between the two ends of the arc of the fringing reef. Tern Island is situated at the northern end of the lagoon and is surrounded by a dredged ship channel. The lagoon becomes progressively shallower from west to east and harbors a variety of marine life including corals, fish, marine mammals, and sea turtles (Amerson 1971). In 2000, an interagency survey of the northwestern Hawaiian Islands was done to document the fauna and flora of FFS (Maragos and Gulko, 2002). During that survey, 38 stations were examined and 41 species of stony corals were documented, the most of any of the NW Hawaiian Islands (Maragos and Gulko 2002). At some of these stations, corals with abnormalities were observed. The present study aimed to expand on the 2000 survey by evaluating the lesions in areas where they were documented.

  10. The Evolution of Baryons in Cosmic Large Scale Structure

    NASA Astrophysics Data System (ADS)

    Snedden, Ali; Arielle Phillips, Lara; Mathews, Grant James; Coughlin, Jared; Suh, In-Saeng; Bhattacharya, Aparna

    2015-01-01

    The environments of galaxies play a critical role in their formation and evolution. We study these environments using cosmological simulations with star formation and supernova feedback included. From these simulations, we parse the large scale structure into clusters, filaments and voids using a segmentation algorithm adapted from medical imaging. We trace the star formation history, gas phase and metal evolution of the baryons in the intergalactic medium as function of structure. We find that our algorithm reproduces the baryon fraction in the intracluster medium and that the majority of star formation occurs in cold, dense filaments. We present the consequences this large scale environment has for galactic halos and galaxy evolution.

  11. Corridors Increase Plant Species Richness at Large Scales

    SciTech Connect

    Damschen, Ellen I.; Haddad, Nick M.; Orrock, John L.; Tewksbury, Joshua J.; Levey, Douglas J.

    2006-09-01

    Habitat fragmentation is one of the largest threats to biodiversity. Landscape corridors, which are hypothesized to reduce the negative consequences of fragmentation, have become common features of ecological management plans worldwide. Despite their popularity, there is little evidence documenting the effectiveness of corridors in preserving biodiversity at large scales. Using a large-scale replicated experiment, we showed that habitat patches connected by corridors retain more native plant species than do isolated patches, that this difference increases over time, and that corridors do not promote invasion by exotic species. Our results support the use of corridors in biodiversity conservation.

  12. Large-scale ER-damper for seismic protection

    NASA Astrophysics Data System (ADS)

    McMahon, Scott; Makris, Nicos

    1997-05-01

    A large scale electrorheological (ER) damper has been designed, constructed, and tested. The damper consists of a main cylinder and a piston rod that pushes an ER-fluid through a number of stationary annular ducts. This damper is a scaled- up version of a prototype ER-damper which has been developed and extensively studied in the past. In this paper, results from comprehensive testing of the large-scale damper are presented, and the proposed theory developed for predicting the damper response is validated.

  13. Clearing and Labeling Techniques for Large-Scale Biological Tissues

    PubMed Central

    Seo, Jinyoung; Choe, Minjin; Kim, Sung-Yon

    2016-01-01

    Clearing and labeling techniques for large-scale biological tissues enable simultaneous extraction of molecular and structural information with minimal disassembly of the sample, facilitating the integration of molecular, cellular and systems biology across different scales. Recent years have witnessed an explosive increase in the number of such methods and their applications, reflecting heightened interest in organ-wide clearing and labeling across many fields of biology and medicine. In this review, we provide an overview and comparison of existing clearing and labeling techniques and discuss challenges and opportunities in the investigations of large-scale biological systems. PMID:27239813

  14. Large-scale liquid scintillation detectors for solar neutrinos

    NASA Astrophysics Data System (ADS)

    Benziger, Jay B.; Calaprice, Frank P.

    2016-04-01

    Large-scale liquid scintillation detectors are capable of providing spectral yields of the low energy solar neutrinos. These detectors require > 100 tons of liquid scintillator with high optical and radiopurity. In this paper requirements for low-energy neutrino detection by liquid scintillation are specified and the procedures to achieve low backgrounds in large-scale liquid scintillation detectors for solar neutrinos are reviewed. The designs, operations and achievements of Borexino, KamLAND and SNO+ in measuring the low-energy solar neutrino fluxes are reviewed.

  15. Contribution of peculiar shear motions to large-scale structure

    NASA Technical Reports Server (NTRS)

    Mueler, Hans-Reinhard; Treumann, Rudolf A.

    1994-01-01

    Self-gravitating shear flow instability simulations in a cold dark matter-dominated expanding Einstein-de Sitter universe have been performed. When the shear flow speed exceeds a certain threshold, self-gravitating Kelvin-Helmholtz instability occurs, forming density voids and excesses along the shear flow layer which serve as seeds for large-scale structure formation. A possible mechanism for generating shear peculiar motions is velocity fluctuations induced by the density perturbations of the postinflation era. In this scenario, short scales grow earlier than large scales. A model of this kind may contribute to the cellular structure of the luminous mass distribution in the universe.

  16. Clearing and Labeling Techniques for Large-Scale Biological Tissues.

    PubMed

    Seo, Jinyoung; Choe, Minjin; Kim, Sung-Yon

    2016-06-30

    Clearing and labeling techniques for large-scale biological tissues enable simultaneous extraction of molecular and structural information with minimal disassembly of the sample, facilitating the integration of molecular, cellular and systems biology across different scales. Recent years have witnessed an explosive increase in the number of such methods and their applications, reflecting heightened interest in organ-wide clearing and labeling across many fields of biology and medicine. In this review, we provide an overview and comparison of existing clearing and labeling techniques and discuss challenges and opportunities in the investigations of large-scale biological systems. PMID:27239813

  17. Large-scale volcanism associated with coronae on Venus

    NASA Technical Reports Server (NTRS)

    Roberts, K. Magee; Head, James W.

    1993-01-01

    The formation and evolution of coronae on Venus are thought to be the result of mantle upwellings against the crust and lithosphere and subsequent gravitational relaxation. A variety of other features on Venus have been linked to processes associated with mantle upwelling, including shield volcanoes on large regional rises such as Beta, Atla and Western Eistla Regiones and extensive flow fields such as Mylitta and Kaiwan Fluctus near the Lada Terra/Lavinia Planitia boundary. Of these features, coronae appear to possess the smallest amounts of associated volcanism, although volcanism associated with coronae has only been qualitatively examined. An initial survey of coronae based on recent Magellan data indicated that only 9 percent of all coronae are associated with substantial amounts of volcanism, including interior calderas or edifices greater than 50 km in diameter and extensive, exterior radial flow fields. Sixty-eight percent of all coronae were found to have lesser amounts of volcanism, including interior flooding and associated volcanic domes and small shields; the remaining coronae were considered deficient in associated volcanism. It is possible that coronae are related to mantle plumes or diapirs that are lower in volume or in partial melt than those associated with the large shields or flow fields. Regional tectonics or variations in local crustal and thermal structure may also be significant in determining the amount of volcanism produced from an upwelling. It is also possible that flow fields associated with some coronae are sheet-like in nature and may not be readily identified. If coronae are associated with volcanic flow fields, then they may be a significant contributor to plains formation on Venus, as they number over 300 and are widely distributed across the planet. 
As a continuation of our analysis of large-scale volcanism on Venus, we have reexamined the known population of coronae and assessed quantitatively the scale of volcanism associated

  18. Summary Health Statistics for U.S. Children: National Health Interview Survey, 1999.

    ERIC Educational Resources Information Center

    Blackwell, Debra L.; Tonthat, Luong

    This report presents statistics from the 1999 National Health Interview Survey (NHIS) on selected health measures for children under 18 years of age, classified by sex, age, race/ethnicity, family structure, parent education, family income, poverty status, health insurance coverage, place of residence, region, and current health status. The NHIS…

  19. The California Health Interview Survey 2001: translation of a major survey for California's multiethnic population.

    PubMed Central

    Ponce, Ninez A.; Lavarreda, Shana Alex; Yen, Wei; Brown, E. Richard; DiSogra, Charles; Satter, Delight E.

    2004-01-01

    The cultural and linguistic diversity of the U.S. population presents challenges to the design and implementation of population-based surveys that serve to inform public policies. Information derived from such surveys may be less than representative if groups with limited or no English language skills are not included. The California Health Interview Survey (CHIS), first administered in 2001, is a population-based health survey of more than 55,000 California households. This article describes the process that the designers of CHIS 2001 underwent in culturally adapting the survey and translating it into an unprecedented number of languages: Spanish, Chinese, Vietnamese, Korean, and Khmer. The multiethnic and multilingual CHIS 2001 illustrates the importance of cultural and linguistic adaptation in raising the quality of population-based surveys, especially when the populations they intend to represent are as diverse as California's. PMID:15219795

  20. National Natality Survey/National Maternal and Infant Health Survey (NMIHS)

    Cancer.gov

    The survey provides data on socioeconomic and demographic characteristics of mothers, prenatal care, pregnancy history, occupational background, health status of mother and infant, and types and sources of medical care received.

  1. Health sciences library building projects, 1998 survey.

    PubMed Central

    Bowden, V M

    1999-01-01

    Twenty-eight health sciences library building projects are briefly described, including twelve new buildings and sixteen additions, remodelings, and renovations. The libraries range in size from 2,144 square feet to 190,000 gross square feet. Twelve libraries are described in detail. These include three hospital libraries, one information center sponsored by ten institutions, and eight academic health sciences libraries. PMID:10550027

  2. NATIONAL HEALTH AND NUTRITION EXAMINATION SURVEY (NHANES) 1999-2000

    EPA Science Inventory

    The National Health and Nutrition Examination Survey (NHANES) is conducted by the National Center for Health Statistics (NCHS), Centers for Disease Control and Prevention.

  Health Research Facilities: A Survey of Doctorate-Granting Institutions.

    ERIC Educational Resources Information Center

    Atelsek, Frank J.; Gomberg, Irene L.

    The survey data cover three broad categories: (1) the status of existing health research facilities at doctorate-granting institutions (including their current value, adequacy, and condition); (2) the volume of new construction in progress; and (3) the additions to health research facilities anticipated during the next 5 years…

  3. The Illinois 9th Grade Adolescent Health Survey. Full Report.

    ERIC Educational Resources Information Center

    Illinois State Board of Education, Springfield.

    A survey was conducted in Illinois to identify the risk of certain health problems among adolescents; to determine the health status of Illinois youth in relation to the Surgeon General's "Healthy People 2000 Objectives" and monitor progress toward national and state goals; and to help those working at national, state, and local levels develop…

  4. Taking the Pulse of Undergraduate Health Psychology: A Nationwide Survey

    ERIC Educational Resources Information Center

    Brack, Amy Badura; Kesitilwe, Kutlo; Ware, Mark E.

    2010-01-01

    We conducted a random national survey of 100 doctoral, 100 comprehensive, and 100 baccalaureate institutions to determine the current state of the undergraduate health psychology course. We found clear evidence of a maturing course with much greater commonality in name (health psychology), theoretical foundation (the biopsychosocial model), and…

  5. Licensed Practical Nurses in Occupational Health. An Initial Survey.

    ERIC Educational Resources Information Center

    Lee, Jane A.; And Others

    The study, conducted in 1971, assessed characteristics of licensed practical nurses (LPN's) who worked in occupational health nursing. The survey instrument, a questionnaire, was returned by 591 LPN's in occupational health and provided data related to: personal characteristics, work and setting, administrative and professional functioning,…

  6. Moon-based Earth Observation for Large Scale Geoscience Phenomena

    NASA Astrophysics Data System (ADS)

    Guo, Huadong; Liu, Guang; Ding, Yixing

    2016-07-01

    The capability of Earth observation for large, global-scale natural phenomena needs to be improved, and new observing platforms are expected. We have studied the concept of the Moon as an Earth-observation platform in recent years. Compared with man-made satellite platforms, Moon-based Earth observation can obtain multi-spherical, full-band, active and passive information, and offers the following advantages: a large observation range, variable view angles, long-term continuous observation, and an extra-long life cycle, with the characteristics of longevity, consistency, integrity, stability, and uniqueness. Moon-based Earth observation is suitable for monitoring large-scale geoscience phenomena, including large-scale atmospheric change, large-scale ocean change, large-scale land-surface dynamic change, and solid-Earth dynamic change. For the purpose of establishing a Moon-based Earth observation platform, we plan to study the following five aspects: mechanisms and models of the macroscopic Earth-science phenomena observable from the Moon; optimization of sensor parameters and methods for Moon-based Earth observation; site selection and the environment of Moon-based Earth observation; the Moon-based Earth observation platform itself; and a fundamental scientific framework for Moon-based Earth observation.

  7. Assuring Quality in Large-Scale Online Course Development

    ERIC Educational Resources Information Center

    Parscal, Tina; Riemer, Deborah

    2010-01-01

    Student demand for online education requires colleges and universities to rapidly expand the number of courses and programs offered online while maintaining high quality. This paper outlines two universities respective processes to assure quality in large-scale online programs that integrate instructional design, eBook custom publishing, Quality…

  8. Large-scale search for dark-matter axions

    SciTech Connect

    Hagmann, C.A., LLNL; Kinion, D.; Stoeffl, W.; Van Bibber, K.; Daw, E.J.; McBride, J.; Peng, H.; Rosenberg, L.J.; Xin, H.; Laveigne, J.; Sikivie, P.; Sullivan, N.S.; Tanner, D.B.; Moltz, D.M.; Powell, J.; Clarke, J.; Nezrick, F.A.; Turner, M.S.; Golubev, N.A.; Kravchuk, L.V.

    1998-01-01

    Early results from a large-scale search for dark matter axions are presented. In this experiment, axions constituting our dark-matter halo may be resonantly converted to monochromatic microwave photons in a high-Q microwave cavity permeated by a strong magnetic field. Sensitivity at the level of one important axion model (KSVZ) has been demonstrated.
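The resonant-conversion condition behind the cavity technique is simply that the cavity's resonant frequency match the photon energy of a converted axion, h·f = m_a·c². A quick conversion (illustrative numbers and function name, not values from the paper) shows why micro-eV axion masses put the search in the microwave band:

```python
# Planck constant in eV*s (CODATA value).
H_EV = 4.135667696e-15

def axion_frequency_ghz(mass_ueV):
    """Cavity frequency (GHz) resonant with an axion of the given mass
    in micro-eV, from h*f = m_a*c^2 (mass already expressed in eV)."""
    return mass_ueV * 1e-6 / H_EV / 1e9

print(round(axion_frequency_ghz(3.0), 3))  # ~0.725 GHz for a 3 micro-eV axion
```

Scanning an axion mass range therefore amounts to mechanically tuning the cavity frequency across the corresponding band.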

  9. Over-driven control for large-scale MR dampers

    NASA Astrophysics Data System (ADS)

    Friedman, A. J.; Dyke, S. J.; Phillips, B. M.

    2013-04-01

    As semi-active electro-mechanical control devices increase in scale for use in real-world civil engineering applications, their dynamics become increasingly complicated. Control designs that are able to take these characteristics into account will be more effective in achieving good performance. Large-scale magnetorheological (MR) dampers exhibit a significant time lag in their force-response to voltage inputs, reducing the efficacy of typical controllers designed for smaller scale devices where the lag is negligible. A new control algorithm is presented for large-scale MR devices that uses over-driving and back-driving of the commands to overcome the challenges associated with the dynamics of these large-scale MR dampers. An illustrative numerical example is considered to demonstrate the controller performance. Via simulations of the structure using several seismic ground motions, the merits of the proposed control strategy to achieve reductions in various response parameters are examined and compared against several accepted control algorithms. Experimental evidence is provided to validate the improved capabilities of the proposed controller in achieving the desired control force levels. Through real-time hybrid simulation (RTHS), the proposed controllers are also examined and experimentally evaluated in terms of their efficacy and robust performance. The results demonstrate that the proposed control strategy has superior performance over typical control algorithms when paired with a large-scale MR damper, and is robust for structural control applications.
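The core idea of over-driving and back-driving can be sketched in a few lines. This is a hedged illustration of the concept only, not the paper's algorithm: the damper responds to its voltage command with a first-order lag, so the command is pushed past the desired level in proportion to the remaining error, then clipped to the hardware range (the `gain` and `v_max` values below are hypothetical):

```python
def overdriven_command(v_des, v_now, v_max, gain=3.0):
    """Illustrative over-/back-driving of an MR damper voltage command.
    Pushes the command past the desired level, proportionally to the
    remaining error, then clips to the admissible range [0, v_max]."""
    v_cmd = v_des + gain * (v_des - v_now)
    return min(max(v_cmd, 0.0), v_max)

# Desired 5 V while the lagging device still sits at 2 V:
print(overdriven_command(5.0, 2.0, v_max=12.0))  # 5 + 3*(5-2) = 14, clipped to 12.0
```

Once the device output approaches the desired level, the correction term vanishes and the command settles at the steady-state value.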

  10. Large-scale search for dark-matter axions

    SciTech Connect

    Kinion, D; van Bibber, K

    2000-08-30

    We review the status of two ongoing large-scale searches for axions which may constitute the dark matter of our Milky Way halo. The experiments are based on the microwave cavity technique proposed by Sikivie, and mark a "second generation" relative to the original experiments performed by the Rochester-Brookhaven-Fermilab collaboration and the University of Florida group.

  11. Large-Scale Innovation and Change in UK Higher Education

    ERIC Educational Resources Information Center

    Brown, Stephen

    2013-01-01

    This paper reflects on challenges universities face as they respond to change. It reviews current theories and models of change management, discusses why universities are particularly difficult environments in which to achieve large scale, lasting change and reports on a recent attempt by the UK JISC to enable a range of UK universities to employ…

  12. Global smoothing and continuation for large-scale molecular optimization

    SciTech Connect

    More, J.J.; Wu, Zhijun

    1995-10-01

    We discuss the formulation of optimization problems that arise in the study of distance geometry, ionic systems, and molecular clusters. We show that continuation techniques based on global smoothing are applicable to these molecular optimization problems, and we outline the issues that must be resolved in the solution of large-scale molecular optimization problems.

  13. The Large-Scale Structure of Scientific Method

    ERIC Educational Resources Information Center

    Kosso, Peter

    2009-01-01

    The standard textbook description of the nature of science describes the proposal, testing, and acceptance of a theoretical idea almost entirely in isolation from other theories. The resulting model of science is a kind of piecemeal empiricism that misses the important network structure of scientific knowledge. Only the large-scale description of…

  14. Mixing Metaphors: Building Infrastructure for Large Scale School Turnaround

    ERIC Educational Resources Information Center

    Peurach, Donald J.; Neumerski, Christine M.

    2015-01-01

    The purpose of this analysis is to increase understanding of the possibilities and challenges of building educational infrastructure--the basic, foundational structures, systems, and resources--to support large-scale school turnaround. Building educational infrastructure often exceeds the capacity of schools, districts, and state education…

  15. Large-scale drift and Rossby wave turbulence

    NASA Astrophysics Data System (ADS)

    Harper, K. L.; Nazarenko, S. V.

    2016-08-01

    We study drift/Rossby wave turbulence described by the large-scale limit of the Charney-Hasegawa-Mima equation. We define the zonal and meridional regions as Z := {k : |k_y| > √3 |k_x|} and M := {k : |k_y| < √3 |k_x|}, respectively, where k = (k_x, k_y) lies in the plane perpendicular to the magnetic field, with k_x along the isopycnals and k_y along the plasma density gradient. We prove that the only types of resonant triads allowed are M ↔ M + Z and Z ↔ Z + Z. Therefore, if the spectrum of weak large-scale drift/Rossby turbulence is initially in Z, it will remain in Z indefinitely. We present a generalised Fjørtoft argument to find the transfer directions of the quadratic invariants in the two-dimensional k-space. Using direct numerical simulations, we test and confirm our theoretical predictions for weak large-scale drift/Rossby turbulence, and establish qualitative differences with the case of strong turbulence. We demonstrate that the qualitative features of the large-scale limit survive when the typical turbulent scale is only moderately greater than the Larmor/Rossby radius.
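The zonal/meridional partition of wavevector space used in this abstract is easy to encode directly from its definition. A minimal sketch (illustrative code, not the authors'):

```python
import math

def region(kx, ky):
    """Classify a wavevector into the zonal set Z = {k : |k_y| > sqrt(3)|k_x|}
    or the meridional set M = {k : |k_y| < sqrt(3)|k_x|}, as defined in the
    abstract; the dividing rays themselves are the boundary."""
    ratio = abs(ky) - math.sqrt(3) * abs(kx)
    if ratio > 0:
        return "Z"
    if ratio < 0:
        return "M"
    return "boundary"

print(region(1.0, 0.2))  # M: |k_y| small relative to |k_x|
print(region(0.2, 1.0))  # Z: |k_y| dominates
```

With this partition in hand, the invariance claim reads: a spectrum supported entirely on wavevectors classified `"Z"` stays in `"Z"` under weak resonant interactions, since no allowed triad transfers energy from Z into M.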

  16. Large Scale Field Campaign Contributions to Soil Moisture Remote Sensing

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Large-scale field experiments have been an essential component of soil moisture remote sensing for over two decades. They have provided test beds for both the technology and science necessary to develop and refine satellite mission concepts. The high degree of spatial variability of soil moisture an...

  17. Large-scale V/STOL testing. [in wind tunnels

    NASA Technical Reports Server (NTRS)

    Koenig, D. G.; Aiken, T. N.; Aoyagi, K.; Falarski, M. D.

    1977-01-01

    Several facets of large-scale testing of V/STOL aircraft configurations are discussed with particular emphasis on test experience in the Ames 40- by 80-foot wind tunnel. Examples of powered-lift test programs are presented in order to illustrate tradeoffs confronting the planner of V/STOL test programs. It is indicated that large-scale V/STOL wind-tunnel testing can sometimes compete with small-scale testing in the effort required (overall test time) and program costs, because a single large-scale model can support a number of different tests where several small-scale models would be required. The benefits of high, near-full-scale Reynolds numbers, more detailed configuration simulation, and a greater number and variety of onboard measurements increase rapidly with scale. Planning must be more detailed at large scale in order to balance the increased costs, as the number of measurements and model configuration variables grows, against the benefit of the larger amount of information coming out of a single test.

  18. Current Scientific Issues in Large Scale Atmospheric Dynamics

    NASA Technical Reports Server (NTRS)

    Miller, T. L. (Compiler)

    1986-01-01

    Topics in large scale atmospheric dynamics are discussed. Aspects of atmospheric blocking, the influence of transient baroclinic eddies on planetary-scale waves, cyclogenesis, the effects of orography on planetary scale flow, small scale frontal structure, and simulations of gravity waves in frontal zones are discussed.

  19. Large-Scale Machine Learning for Classification and Search

    ERIC Educational Resources Information Center

    Liu, Wei

    2012-01-01

    With the rapid development of the Internet, nowadays tremendous amounts of data including images and videos, up to millions or billions, can be collected for training machine learning models. Inspired by this trend, this thesis is dedicated to developing large-scale machine learning techniques for the purpose of making classification and nearest…

  1. Ecosystem resilience despite large-scale altered hydro climatic conditions

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Climate change is predicted to increase both drought frequency and duration, and when coupled with substantial warming, will establish a new hydroclimatological paradigm for many regions. Large-scale, warm droughts have recently impacted North America, Africa, Europe, Amazonia, and Australia result...

  2. Lessons from Large-Scale Renewable Energy Integration Studies: Preprint

    SciTech Connect

    Bird, L.; Milligan, M.

    2012-06-01

    In general, large-scale integration studies in Europe and the United States find that high penetrations of renewable generation are technically feasible with operational changes and increased access to transmission. This paper describes other key findings such as the need for fast markets, large balancing areas, system flexibility, and the use of advanced forecasting.

  3. Probabilistic Cuing in Large-Scale Environmental Search

    ERIC Educational Resources Information Center

    Smith, Alastair D.; Hood, Bruce M.; Gilchrist, Iain D.

    2010-01-01

    Finding an object in our environment is an important human ability that also represents a critical component of human foraging behavior. One type of information that aids efficient large-scale search is the likelihood of the object being in one location over another. In this study we investigated the conditions under which individuals respond to…

  4. Extracting Useful Semantic Information from Large Scale Corpora of Text

    ERIC Educational Resources Information Center

    Mendoza, Ray Padilla, Jr.

    2012-01-01

    Extracting and representing semantic information from large scale corpora is at the crux of computer-assisted knowledge generation. Semantic information depends on collocation extraction methods, mathematical models used to represent distributional information, and weighting functions which transform the space. This dissertation provides a…

  5. Efficient On-Demand Operations in Large-Scale Infrastructures

    ERIC Educational Resources Information Center

    Ko, Steven Y.

    2009-01-01

    In large-scale distributed infrastructures such as clouds, Grids, peer-to-peer systems, and wide-area testbeds, users and administrators typically desire to perform "on-demand operations" that deal with the most up-to-date state of the infrastructure. However, the scale and dynamism present in the operating environment make it challenging to…

  6. Newton iterative methods for large scale nonlinear systems

    SciTech Connect

    Walker, H.F.; Turner, K.

    1993-01-01

    Objective is to develop robust, efficient Newton iterative methods for general large scale problems well suited for discretizations of partial differential equations, integral equations, and other continuous problems. A concomitant objective is to develop improved iterative linear algebra methods. We first outline research on Newton iterative methods and then review work on iterative linear algebra methods. (DLC)
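The basic Newton iteration underlying this work can be stated in a few lines. A toy sketch (illustrative, not the authors' code): for genuinely large-scale problems, the dense direct linear solve below would be replaced by an iterative Krylov method with an inexact tolerance, which is exactly the coupling of Newton and iterative linear algebra the abstract describes.

```python
import numpy as np

def newton(F, J, x0, tol=1e-10, max_iter=50):
    """Newton iteration x_{k+1} = x_k - J(x_k)^{-1} F(x_k) for F(x) = 0.
    Large-scale variants solve the linear system only approximately."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        fx = F(x)
        if np.linalg.norm(fx) < tol:
            break
        x = x - np.linalg.solve(J(x), fx)  # direct solve; Krylov in practice
    return x

# Example: x0^2 + x1^2 = 2 and x0 = x1, with root (1, 1).
F = lambda x: np.array([x[0]**2 + x[1]**2 - 2.0, x[0] - x[1]])
J = lambda x: np.array([[2 * x[0], 2 * x[1]], [1.0, -1.0]])
root = newton(F, J, [2.0, 0.5])
print(root)  # approximately [1. 1.]
```

The quadratic local convergence of this iteration is what makes it attractive for discretized PDEs, provided each linear subproblem can be solved cheaply enough.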

  7. Implicit solution of large-scale radiation diffusion problems

    SciTech Connect

    Brown, P N; Graziani, F; Otero, I; Woodward, C S

    2001-01-04

    In this paper, we present an efficient solution approach for fully implicit, large-scale, nonlinear radiation diffusion problems. The fully implicit approach is compared to a semi-implicit solution method. Accuracy and efficiency are shown to be better for the fully implicit method on both one- and three-dimensional problems with tabular opacities taken from the LEOS opacity library.
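The stability advantage of the fully implicit approach can be seen in a linear, constant-coefficient stand-in for the radiation diffusion equation (a hedged sketch: the real problem is nonlinear, with opacities depending on the solution, which this toy omits). One backward-Euler step reduces to a single linear solve and remains bounded even for very large time steps:

```python
import numpy as np

def implicit_diffusion_step(u, D, dt, dx):
    """One backward-Euler (fully implicit) step of u_t = D u_xx on a 1D
    grid with fixed endpoint values: solve (I - dt*D*L) u_new = u_old,
    where L is the standard second-difference operator."""
    n = len(u)
    r = D * dt / dx**2
    A = np.eye(n) * (1 + 2 * r)
    for i in range(n - 1):
        A[i, i + 1] = A[i + 1, i] = -r
    # Pin the boundary rows so the endpoints stay fixed.
    A[0, :] = 0.0; A[0, 0] = 1.0
    A[-1, :] = 0.0; A[-1, -1] = 1.0
    return np.linalg.solve(A, u)

u = np.zeros(11); u[5] = 1.0  # initial spike
u1 = implicit_diffusion_step(u, D=1.0, dt=10.0, dx=0.1)  # huge dt, still stable
print(u1.max() <= 1.0)  # True: no explicit-scheme blow-up
```

An explicit step at this time-step-to-grid ratio would violate its stability limit catastrophically; the implicit solve pays for the extra linear algebra with unconditional stability, which is the trade the paper's comparison quantifies.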

  8. Resilience of Florida Keys coral communities following large scale disturbances

    EPA Science Inventory

    The decline of coral reefs in the Caribbean over the last 40 years has been attributed to multiple chronic stressors and episodic large-scale disturbances. This study assessed the resilience of coral communities in two different regions of the Florida Keys reef system between 199...

  9. Polymers in 2D Turbulence: Suppression of Large Scale Fluctuations

    NASA Astrophysics Data System (ADS)

    Amarouchene, Y.; Kellay, H.

    2002-08-01

    Small quantities of a long chain molecule or polymer affect two-dimensional turbulence in unexpected ways. Their presence inhibits the transfer of energy to large scales, suppressing those scales in the energy density spectrum. This also changes the spectral properties of a passive scalar, which turns out to be highly sensitive to the presence of energy transfers.

  10. Creating a Large-Scale, Third Generation, Distance Education Course.

    ERIC Educational Resources Information Center

    Weller, Martin James

    2000-01-01

    Outlines the course development of an introductory large-scale distance education course offered via the World Wide Web at the Open University in the United Kingdom. Topics include developing appropriate student skills; maintaining quality control; facilitating easy updating of material; ensuring student interaction; and making materials…

  11. Cosmic strings and the large-scale structure

    NASA Technical Reports Server (NTRS)

    Stebbins, Albert

    1988-01-01

    A possible problem for cosmic string models of galaxy formation is presented. If very large voids are common and if loop fragmentation is not much more efficient than presently believed, then it may be impossible for string scenarios to produce the observed large-scale structure with Omega sub 0 = 1 and without strong environmental biasing.

  12. International Large-Scale Assessments: What Uses, What Consequences?

    ERIC Educational Resources Information Center

    Johansson, Stefan

    2016-01-01

    Background: International large-scale assessments (ILSAs) are a much-debated phenomenon in education. Increasingly, their outcomes attract considerable media attention and influence educational policies in many jurisdictions worldwide. The relevance, uses and consequences of these assessments are often the focus of research scrutiny. Whilst some…

  13. Measurement, Sampling, and Equating Errors in Large-Scale Assessments

    ERIC Educational Resources Information Center

    Wu, Margaret

    2010-01-01

    In large-scale assessments, such as state-wide testing programs, national sample-based assessments, and international comparative studies, there are many steps involved in the measurement and reporting of student achievement. There are always sources of inaccuracies in each of the steps. It is of interest to identify the source and magnitude of…
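
    The abstract's point that inaccuracies enter at several steps can be made concrete: if the measurement, sampling, and equating error components are independent, their variances add, so the overall standard error of a reported statistic is the root sum of squares of the component standard errors. The numbers below are hypothetical, for illustration only.

```python
import math

def combined_se(se_measurement, se_sampling, se_equating):
    # Independent error components: variances add, standard errors do not.
    return math.sqrt(se_measurement**2 + se_sampling**2 + se_equating**2)

# Hypothetical component standard errors, in score points
total = combined_se(1.2, 2.0, 1.5)
```

Note that the largest component dominates: ignoring a sampling SE of 2.0 while reporting only a measurement SE of 1.2 would substantially understate the total uncertainty.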

  14. Large-Scale Networked Virtual Environments: Architecture and Applications

    ERIC Educational Resources Information Center

    Lamotte, Wim; Quax, Peter; Flerackers, Eddy

    2008-01-01

    Purpose: Scalability is an important research topic in the context of networked virtual environments (NVEs). This paper aims to describe the ALVIC (Architecture for Large-scale Virtual Interactive Communities) approach to NVE scalability. Design/methodology/approach: The setup and results from two case studies are shown: a 3-D learning environment…

  15. Decomposition and coordination of large-scale operations optimization

    NASA Astrophysics Data System (ADS)

    Cheng, Ruoyu

    Nowadays, highly integrated manufacturing has resulted in more and more large-scale industrial operations. As one of the most effective strategies to ensure high-level operations in modern industry, large-scale engineering optimization has garnered a great amount of interest from academic scholars and industrial practitioners. Large-scale optimization problems frequently occur in industrial applications, and many of them naturally present special structure or can be transformed to take special structure. Some decomposition and coordination methods have the potential to solve these problems at a reasonable speed. This thesis focuses on three classes of large-scale optimization problems: linear programming, quadratic programming, and mixed-integer programming problems. The main contributions include the design of structural complexity analysis for investigating the scaling behavior and computational efficiency of decomposition strategies, novel coordination techniques and algorithms to improve the convergence behavior of decomposition and coordination methods, and the development of a decentralized optimization framework that embeds the decomposition strategies in a distributed computing environment. The complexity study can provide fundamental guidelines for practical applications of the decomposition and coordination methods. In this thesis, several case studies imply the viability of the proposed decentralized optimization techniques for real industrial applications. A pulp mill benchmark problem is used to investigate the applicability of the LP/QP decentralized optimization strategies, while a truck allocation problem in the decision support of mining operations is used to study the MILP decentralized optimization strategies.
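
    A minimal sketch of the decomposition-and-coordination idea on a toy problem (the quadratic subproblems, step size, and multiplier update are illustrative assumptions, not the algorithms developed in the thesis): minimize (x - 1)^2 + (y - 3)^2 subject to the coupling constraint x + y = 2. Pricing the constraint with a multiplier makes the two subproblems independent; a coordinator then adjusts the price.

```python
# Dual decomposition: the coupling constraint x + y = 2 is priced by a
# multiplier lam; each subproblem is solved independently, and lam is
# updated by a subgradient step on the constraint violation.

def solve_sub1(lam):
    # argmin_x (x - 1)^2 + lam * x  (closed form for this toy quadratic)
    return 1.0 - lam / 2.0

def solve_sub2(lam):
    # argmin_y (y - 3)^2 + lam * y
    return 3.0 - lam / 2.0

lam = 0.0
for _ in range(200):
    x, y = solve_sub1(lam), solve_sub2(lam)
    lam += 0.5 * (x + y - 2.0)   # coordination step (subgradient ascent)
```

For this strongly convex example the coordination loop contracts to the optimum x = 0, y = 2 with multiplier lam = 2; in general (e.g. the MILP case) convergence of such schemes is exactly the kind of behavior the thesis studies.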

  16. Results from the 2010 National Survey on Drug Use and Health: Mental Health Findings

    ERIC Educational Resources Information Center

    Substance Abuse and Mental Health Services Administration, 2012

    2012-01-01

    This report presents results pertaining to mental health from the 2010 National Survey on Drug Use and Health (NSDUH), an annual survey of the civilian, noninstitutionalized population of the United States aged 12 years old or older. This report presents national estimates of the prevalence of past year mental disorders and past year mental health…

  17. Response of Tradewind Cumuli to Large-Scale Processes.

    NASA Astrophysics Data System (ADS)

    Soong, S.-T.; Ogura, Y.

    1980-09-01

    The two-dimensional slab-symmetric numerical cloud model used by Soong and Ogura (1973) for studying the evolution of an isolated cumulus cloud is extended to investigate the statistical properties of cumulus clouds which would be generated under a given large-scale forcing composed of the horizontal advection of temperature and water vapor mixing ratio, vertical velocity, sea surface temperature and radiative cooling. Random disturbances of small amplitude are introduced into the model at low levels to provide random motion for cloud formation.

    The model is applied to a case of suppressed weather conditions during BOMEX for the period 22-23 June 1969 when a nearly steady state prevailed. The composited temperature and mixing ratio profiles of these two days are used as initial conditions and the time-independent large-scale forcing terms estimated from the observations are applied to the model. The result of numerical integration shows that a number of small clouds start developing after 1 h. Some of them decay quickly, but some of them develop and reach the tradewind inversion. After a few hours of simulation, the vertical profiles of the horizontally averaged temperature and moisture are found to deviate only slightly from the observed profiles, indicating that the large-scale effect and the feedback effects of clouds on temperature and mixing ratio reach an equilibrium state. The three major components of the cloud feedback effect, i.e., condensation, evaporation and vertical fluxes associated with the clouds, are determined from the model output. The vertical profiles of vertical heat and moisture fluxes in the subcloud layer in the model are found to be in general agreement with the observations.

    Sensitivity tests of the model are made for different magnitudes of the large-scale vertical velocity. The most striking result is that the temperature and humidity in the cloud layer below the inversion do not change significantly in spite of a relatively large

  18. Development and Implementation of Culturally Tailored Offline Mobile Health Surveys

    PubMed Central

    2016-01-01

    Background: In low and middle income countries (LMICs), and other areas with low resources and unreliable access to the Internet, understanding the emerging best practices for the implementation of new mobile health (mHealth) technologies is needed for efficient and secure data management and for informing public health researchers. Innovations in mHealth technology can improve on previous methods, and dissemination of project development details and lessons learned during implementation is needed to inform stakeholders in both the United States and LMIC settings.

    Objective: The aims of this paper are to share implementation strategies and lessons learned from the development and implementation stages of two survey research projects using offline mobile technology, and to inform and prepare public health researchers and practitioners to implement new mobile technologies in survey research projects in LMICs.

    Methods: In 2015, two survey research projects were developed and piloted in Puerto Rico and pre-tested in Costa Rica to collect face-to-face data, get formative evaluation feedback, and test the feasibility of an offline mobile data collection process. Fieldwork in each setting involved survey development, back translation with cultural tailoring, ethical review and approvals, data collector training, and piloting survey implementation on mobile tablets.

    Results: Critical processes and workflows for survey research projects in low resource settings were identified and implemented. This included developing a secure mobile data platform tailored to each survey, establishing user accessibility, and training and eliciting feedback from data collectors and on-site LMIC project partners.

    Conclusions: Formative and process evaluation strategies are necessary and useful for the development and implementation of survey research projects using emerging mHealth technologies in LMICs and other low resource settings. Lessons learned include: (1) plan

  19. Epidemiology and Health Care Reform: The National Health Survey of 1935-1936

    PubMed Central

    2011-01-01

    The National Health Survey undertaken in 1935 and 1936 was the largest morbidity survey until that time. It was also the first national survey to focus on chronic disease and disability. The decision to conduct a survey of this magnitude was part of the larger strategy to reform health care in the United States. The focus on morbidity allowed reformers to argue that the health status of Americans was poor, despite falling mortality rates that suggested the opposite. The focus on chronic disease morbidity proved to be an especially effective way of demonstrating the poor health of the population and the strong links between poverty and illness. The survey, undertaken by a small group of reform-minded epidemiologists led by Edgar Sydenstricker, was made possible by the close interaction during the Depression of agencies and actors in the public health and social welfare sectors, a collaboration which produced new ways of thinking about disease burdens. PMID:21233434

  20. CUMULATIVE TRAUMAS AND RISK THRESHOLDS: 12-MONTH PTSD IN THE WORLD MENTAL HEALTH (WMH) SURVEYS

    PubMed Central

    Karam, Elie G.; Friedman, Matthew J.; Hill, Eric D.; Kessler, Ronald C.; McLaughlin, Katie A.; Petukhova, Maria; Sampson, Laura; Shahly, Victoria; Angermeyer, Matthias C.; Bromet, Evelyn J.; de Girolamo, Giovanni; de Graaf, Ron; Demyttenaere, Koen; Ferry, Finola; Florescu, Silvia E.; Haro, Josep Maria; He, Yanling; Karam, Aimee N.; Kawakami, Norito; Kovess-Masfety, Viviane; Medina-Mora, María Elena; Browne, Mark A. Oakley; Posada-Villa, José A.; Shalev, Arieh Y.; Stein, Dan J.; Viana, Maria Carmen; Zarkov, Zahari; Koenen, Karestan C.

    2014-01-01

    Background: Clinical research suggests that posttraumatic stress disorder (PTSD) patients exposed to multiple traumatic events (TEs) rather than a single TE have increased morbidity and dysfunction. Although epidemiological surveys in the United States and Europe also document high rates of multiple TE exposure, no population-based cross-national data have examined this issue.

    Methods: Data were analyzed from 20 population surveys in the World Health Organization World Mental Health Survey Initiative (n = 51,295, aged 18+). The Composite International Diagnostic Interview (3.0) assessed 12-month PTSD and other common DSM-IV disorders. Respondents with 12-month PTSD were assessed for single versus multiple TEs implicated in their symptoms. Associations were examined with age of onset (AOO), functional impairment, comorbidity, and PTSD symptom counts.

    Results: 19.8% of respondents with 12-month PTSD reported that their symptoms were associated with multiple TEs. Cases who associated their PTSD with four or more TEs had greater functional impairment, an earlier AOO, longer duration, higher comorbidity with mood and anxiety disorders, elevated hyper-arousal symptoms, higher proportional exposures to partner physical abuse and other types of physical assault, and lower proportional exposure to unexpected death of a loved one than cases with fewer associated TEs.

    Conclusions: A risk threshold was observed in this large-scale cross-national database wherein cases who associated their PTSD with four or more TEs presented a more “complex” clinical picture with substantially greater functional impairment and greater morbidity than other cases of PTSD. PTSD cases associated with four or more TEs may merit specific and targeted intervention strategies. Depression and Anxiety 31:130-142, 2014. PMID:23983056

  1. Effects of large-scale environment on the assembly history of central galaxies

    SciTech Connect

    Jung, Intae; Lee, Jaehyun; Yi, Sukyoung K.

    2014-10-10

    We examine whether large-scale environment affects the mass assembly history of central galaxies. To facilitate this, we constructed dark matter halo merger trees from a cosmological N-body simulation and calculated the formation and evolution of galaxies using a semi-analytic method. We confirm earlier results that smaller halos show a notable difference in formation time with a mild dependence on large-scale environment. However, using a semi-analytic model, we found that on average the growth rate of the stellar mass of central galaxies is largely insensitive to large-scale environment. Although our results show that the star formation rate (SFR) and the stellar mass of central galaxies in smaller halos are slightly affected by the assembly bias of halos, those galaxies are faint and the difference in the SFR is minute; it is therefore challenging to detect in real galaxies at current observational accuracy. Future galaxy surveys, such as the BigBOSS experiment and the Large Synoptic Survey Telescope, which are expected to provide observational data for fainter objects, will provide a chance to test our model predictions.

  2. Assessing response reliability of health interview surveys using reinterviews.

    PubMed

    Fabricant, S J; Harpham, T

    1993-01-01

    Data from interview surveys of households or health facilities are used to assess community parameters such as health status and factors related to the ability and willingness of individuals to pay for health services. Although the effect of sample size on confidence intervals is generally well understood by the survey designers and policy-makers who use the results, the typical survey is also subject to non-sampling errors whose magnitude may exceed that of the sampling errors. The non-sampling errors associated with surveys are only rarely assessed and reported, even though they may have a major effect on the interpretation of findings. The present study reports the non-sampling errors associated with a household survey in Sierra Leone by comparing the results of reinterviews with the responses given during the original interviews. Certain types of questions were subject to greater non-sampling errors than others. The findings should be of use to designers of similar surveys and to those who rely on such surveys for making policy decisions. PMID:8324853
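
    Reinterview agreement of the kind reported here is commonly summarized with a chance-corrected statistic such as Cohen's kappa. A self-contained sketch follows; the response data are invented for illustration, and the paper does not necessarily use this particular statistic.

```python
def cohens_kappa(a, b):
    # Chance-corrected agreement between original interview responses `a`
    # and reinterview responses `b` (equal-length category sequences).
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n                    # observed
    cats = set(a) | set(b)
    pe = sum((a.count(c) / n) * (b.count(c) / n) for c in cats)   # by chance
    return (po - pe) / (1 - pe)

# Hypothetical yes/no responses from eight respondents
interview   = ["yes", "yes", "no", "yes", "no", "no", "yes", "no"]
reinterview = ["yes", "yes", "no", "no",  "no", "no", "yes", "yes"]
kappa = cohens_kappa(interview, reinterview)
```

Raw percent agreement here is 75%, but kappa discounts the agreement expected by chance, which is why it is the more informative summary of non-sampling response error.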

  3. Behavioral Health in the Gulf Coast Region Following the Deepwater Horizon Oil Spill: Findings from Two Federal Surveys

    PubMed Central

    Gould, Deborah W.; Pemberton, Michael R.; Pierannunzi, Carol; Larson, Sharon

    2015-01-01

    This article summarizes findings from two large-scale, population-based surveys conducted by Substance Abuse and Mental Health Services Administration (SAMHSA) and Centers for Disease Control and Prevention (CDC) in the Gulf Coast region following the 2010 Deepwater Horizon oil spill, to measure the prevalence of mental and substance use disorders, chronic health conditions, and utilization of behavioral health services. Although many area residents undoubtedly experienced increased levels of anxiety and stress following the spill, findings suggest only modest or minimal changes in behavioral health at the aggregate level before and after the spill. The studies do not address potential long-term effects of the spill on physical and behavioral health nor did they target subpopulations that might have been most affected by the spill. Resources mobilized to reduce the economic and behavioral health impacts of the spill on coastal residents—including compensation for lost income from BP and increases in available mental health services—may have resulted in a reduction in potential mental health problems. PMID:25339594

  4. Behavioral health in the gulf coast region following the Deepwater Horizon oil spill: findings from two federal surveys.

    PubMed

    Gould, Deborah W; Teich, Judith L; Pemberton, Michael R; Pierannunzi, Carol; Larson, Sharon

    2015-01-01

    This article summarizes findings from two large-scale, population-based surveys conducted by Substance Abuse and Mental Health Services Administration (SAMHSA) and Centers for Disease Control and Prevention (CDC) in the Gulf Coast region following the 2010 Deepwater Horizon oil spill, to measure the prevalence of mental and substance use disorders, chronic health conditions, and utilization of behavioral health services. Although many area residents undoubtedly experienced increased levels of anxiety and stress following the spill, findings suggest only modest or minimal changes in behavioral health at the aggregate level before and after the spill. The studies do not address potential long-term effects of the spill on physical and behavioral health nor did they target subpopulations that might have been most affected by the spill. Resources mobilized to reduce the economic and behavioral health impacts of the spill on coastal residents-including compensation for lost income from BP and increases in available mental health services-may have resulted in a reduction in potential mental health problems. PMID:25339594

  5. Spin Alignments of Spiral Galaxies within the Large-scale Structure from SDSS DR7

    NASA Astrophysics Data System (ADS)

    Zhang, Youcai; Yang, Xiaohu; Wang, Huiyuan; Wang, Lei; Luo, Wentao; Mo, H. J.; van den Bosch, Frank C.

    2015-01-01

    Using a sample of spiral galaxies selected from the Sloan Digital Sky Survey Data Release 7 and Galaxy Zoo 2, we investigate the alignment of spin axes of spiral galaxies with their surrounding large-scale structure, which is characterized by the large-scale tidal field reconstructed from the data using galaxy groups above a certain mass threshold. We find that the spin axes only have weak tendencies to be aligned with (or perpendicular to) the intermediate (or minor) axis of the local tidal tensor. The signal is the strongest in a cluster environment where all three eigenvalues of the local tidal tensor are positive. Compared to the alignments between halo spins and the local tidal field obtained in N-body simulations, the above observational results are in best agreement with those for the spins of inner regions of halos, suggesting that the disk material traces the angular momentum of dark matter halos in the inner regions.
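
    The environment classification used above (a cluster environment is one where all three eigenvalues of the local tidal tensor are positive) can be sketched as follows. The zero threshold and the diagonal toy tensor are illustrative assumptions; the paper reconstructs the tidal field from galaxy groups rather than prescribing it.

```python
import numpy as np

def tidal_environment(T):
    # Classify a symmetric 3x3 tidal tensor by the signs of its eigenvalues:
    # 3 positive -> cluster, 2 -> filament, 1 -> sheet, 0 -> void.
    eigvals = np.linalg.eigvalsh(T)          # real, ascending order
    n_pos = int((eigvals > 0).sum())
    return ["void", "sheet", "filament", "cluster"][n_pos], eigvals

# Toy tensor with all eigenvalues positive: a cluster-like environment
T = np.diag([0.8, 0.3, 0.1])
env, lams = tidal_environment(T)
```

The intermediate and minor axes referred to in the abstract are the eigenvectors belonging to the middle and smallest eigenvalues of this tensor.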

  6. Resurrecting hot dark matter - Large-scale structure from cosmic strings and massive neutrinos

    NASA Technical Reports Server (NTRS)

    Scherrer, Robert J.

    1988-01-01

    These are the results of a numerical simulation of the formation of large-scale structure from cosmic-string loops in a universe dominated by massive neutrinos (hot dark matter). This model has several desirable features. The final matter distribution contains isolated density peaks embedded in a smooth background, producing a natural bias in the distribution of luminous matter. Because baryons can accrete onto the cosmic strings before the neutrinos, the galaxies will have baryon cores and dark neutrino halos. Galaxy formation in this model begins much earlier than in random-phase models. On large scales the distribution of clustered matter visually resembles the CfA survey, with large voids and filaments.

  7. SPIN ALIGNMENTS OF SPIRAL GALAXIES WITHIN THE LARGE-SCALE STRUCTURE FROM SDSS DR7

    SciTech Connect

    Zhang, Youcai; Yang, Xiaohu; Luo, Wentao; Wang, Huiyuan; Wang, Lei; Mo, H. J.; Van den Bosch, Frank C.

    2015-01-01

    Using a sample of spiral galaxies selected from the Sloan Digital Sky Survey Data Release 7 and Galaxy Zoo 2, we investigate the alignment of spin axes of spiral galaxies with their surrounding large-scale structure, which is characterized by the large-scale tidal field reconstructed from the data using galaxy groups above a certain mass threshold. We find that the spin axes only have weak tendencies to be aligned with (or perpendicular to) the intermediate (or minor) axis of the local tidal tensor. The signal is the strongest in a cluster environment where all three eigenvalues of the local tidal tensor are positive. Compared to the alignments between halo spins and the local tidal field obtained in N-body simulations, the above observational results are in best agreement with those for the spins of inner regions of halos, suggesting that the disk material traces the angular momentum of dark matter halos in the inner regions.

  8. Health literacy among young adults: a short survey tool for public health and health promotion research.

    PubMed

    Abel, Thomas; Hofmann, Karen; Ackermann, Sabine; Bucher, Sabine; Sakarya, Sibel

    2015-09-01

    Health literacy (HL) is context-specific. In public health and health promotion, HL in the private realm refers to individuals' knowledge and skills to prevent disease and to promote health in everyday life. However, there is a scarcity of measurement tools explicitly geared to private realm contexts. Our aim was to develop and test a short survey tool that captures different dimensions of HL in the context of family and friends. We used cross-sectional data from the Swiss Federal Surveys of Adolescents from 2010 to 2011, comprising 7983 males and 366 females between 18 and 25 years. HL was assessed through a set of eight items (self-reports). We used principal component analysis to explore the underlying factor structure among these items in the male sample and confirmatory factor analysis to verify the factor structure in the female sample. The results showed that the tested item set represented dimensions of functional, interactive and critical HL. Two sub-dimensions, understanding versus finding health-relevant information, denoted functional HL. Interactive and critical HL were each represented with two items. A sum score based on all eight items (Cronbach's α: 0.64) showed expected positive associations with own and parental education among males and females (p < 0.05). The short item set appears to be a feasible measurement tool to assess HL in the private realm. Its broader application in survey studies may help to improve our understanding of how this form of HL is distributed in the general population. PMID:24482542
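
    The internal-consistency statistic reported for the eight-item sum score (Cronbach's α = 0.64) can be computed from an items matrix as follows. This is a generic sketch; the five-respondent, three-item data below are invented for illustration.

```python
import numpy as np

def cronbach_alpha(items):
    # items: respondents x items matrix of scores.
    # alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical responses: 5 respondents answering 3 items
scores = [[1, 2, 2], [2, 3, 3], [3, 4, 3], [4, 5, 5], [2, 2, 1]]
alpha = cronbach_alpha(scores)
```

When items are perfectly correlated the statistic reaches 1; values around 0.6-0.7, as in the survey above, indicate modest internal consistency, which is common for short multidimensional item sets.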

  9. National workplace health promotion surveys: the Affordable Care Act and future surveys.

    PubMed

    DeJoy, David M; Dyal, Mari-Amanda; Padilla, Heather M; Wilson, Mark G

    2014-01-01

    This commentary reviews findings from the four previous national surveys of workplace health promotion activities (1985, 1992, 1999, and 2004, respectively) and offers recommendations for future surveys mandated under the Affordable Care Act of 2010. Future surveys should place greater emphasis on assessing program quality, reach, and effectiveness. Both employer and employee input should be sought. In addition, sampling plans should differentiate worksites from employers, and results should include public as well as private sector organizations. Ideas are offered for addressing these limitations and for creating a sustainable survey process and multifunctional database of results. PMID:24380423

  10. Large-scale quantification of CVD graphene surface coverage

    NASA Astrophysics Data System (ADS)

    Ambrosi, Adriano; Bonanni, Alessandra; Sofer, Zdeněk; Pumera, Martin

    2013-02-01

    The extraordinary properties demonstrated for graphene and graphene-related materials can be fully exploited when a large-scale fabrication procedure is made available. Chemical vapor deposition (CVD) of graphene on Cu and Ni substrates is one of the most promising procedures to synthesize large-area and good quality graphene films. Parallel to the fabrication process, a large-scale quality monitoring technique is equally crucial. We demonstrate here a rapid and simple methodology that is able to probe the effectiveness of the growth process over a large substrate area for both Ni and Cu substrates. This method is based on inherent electrochemical signals generated by the underlying metal catalysts when fractures or discontinuities of the graphene film are present. The method can be applied immediately after the CVD growth process without the need for any graphene transfer step and represents a powerful quality monitoring technique for the assessment of large-scale fabrication of graphene by the CVD process.

  11. LARGE-SCALE MOTIONS IN THE PERSEUS GALAXY CLUSTER

    SciTech Connect

    Simionescu, A.; Werner, N.; Urban, O.; Allen, S. W.; Fabian, A. C.; Sanders, J. S.; Mantz, A.; Nulsen, P. E. J.; Takei, Y.

    2012-10-01

    By combining large-scale mosaics of ROSAT PSPC, XMM-Newton, and Suzaku X-ray observations, we present evidence for large-scale motions in the intracluster medium of the nearby, X-ray bright Perseus Cluster. These motions are suggested by several alternating and interleaved X-ray bright, low-temperature, low-entropy arcs located along the east-west axis, at radii ranging from ~10 kpc to over a Mpc. Thermodynamic features qualitatively similar to these have previously been observed in the centers of cool-core clusters, and were successfully modeled as a consequence of the gas sloshing/swirling motions induced by minor mergers. Our observations indicate that such sloshing/swirling can extend out to larger radii than previously thought, on scales approaching the virial radius.

  12. Generating Large-Scale Longitudinal Data Resources for Aging Research

    PubMed Central

    Hofer, Scott M.

    2011-01-01

    Objectives. The need for large studies and the types of large-scale data resources (LSDRs) are discussed along with their general scientific utility, role in aging research, and affordability. The diversification of approaches to large-scale data resourcing is described in order to facilitate their use in aging research. Methods. The need for LSDRs is discussed in terms of (a) large sample size; (b) longitudinal design; (c) as platforms for additional investigator-initiated research projects; and (d) broad-based access to core genetic, biological, and phenotypic data. Discussion. It is concluded that a “lite-touch, lo-tech, lo-cost” approach to LSDRs is a viable strategy for the development of LSDRs and would enhance the likelihood of LSDRs being established which are dedicated to the wide range of important aging-related issues. PMID:21743049

  13. Lagrangian space consistency relation for large scale structure

    NASA Astrophysics Data System (ADS)

    Horn, Bart; Hui, Lam; Xiao, Xiao

    2015-09-01

    Consistency relations, which relate the squeezed limit of an (N+1)-point correlation function to an N-point function, are non-perturbative symmetry statements that hold even if the associated high momentum modes are deep in the nonlinear regime and astrophysically complex. Recently, Kehagias & Riotto and Peloso & Pietroni discovered a consistency relation applicable to large scale structure. We show that this can be recast into a simple physical statement in Lagrangian space: that the squeezed correlation function (suitably normalized) vanishes. This holds regardless of whether the correlation observables are at the same time or not, and regardless of whether multiple-streaming is present. The simplicity of this statement suggests that an analytic understanding of large scale structure in the nonlinear regime may be particularly promising in Lagrangian space.
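
    For reference, the Eulerian-space relation attributed above to Kehagias & Riotto and Peloso & Pietroni takes the following schematic form (written here from the general literature, not transcribed from this paper):

```latex
\lim_{\mathbf{q} \to 0}
\frac{\langle \delta_{\mathbf{q}}(\eta)\,
      \delta_{\mathbf{k}_1}(\eta_1) \cdots \delta_{\mathbf{k}_N}(\eta_N)
      \rangle'}{P_\delta(q, \eta)}
= - \sum_{i=1}^{N} \frac{D(\eta_i)}{D(\eta)}\,
    \frac{\mathbf{q} \cdot \mathbf{k}_i}{q^2}\,
    \langle \delta_{\mathbf{k}_1}(\eta_1) \cdots
            \delta_{\mathbf{k}_N}(\eta_N) \rangle'
```

Here D is the linear growth factor and primed correlators have the momentum-conserving delta function stripped. At equal times (all eta_i equal to eta) the growth factors cancel and momentum conservation makes the right-hand side vanish, which is the vanishing squeezed limit the abstract recasts in Lagrangian space.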

  14. Instrumentation Development for Large Scale Hypersonic Inflatable Aerodynamic Decelerator Characterization

    NASA Technical Reports Server (NTRS)

    Swanson, Gregory T.; Cassell, Alan M.

    2011-01-01

    Hypersonic Inflatable Aerodynamic Decelerator (HIAD) technology is currently being considered for multiple atmospheric entry applications as the limitations of traditional entry vehicles have been reached. The Inflatable Re-entry Vehicle Experiment (IRVE) has successfully demonstrated this technology as a viable candidate with a sub-orbital flight of a 3.0 m diameter vehicle. To further this technology, large-scale HIADs (6.0-8.5 m) must be developed and tested. To characterize the performance of large-scale HIAD technology, new instrumentation concepts must be developed to accommodate the flexible nature of the inflatable aeroshell. Many of the concepts under consideration for the HIAD FY12 subsonic wind tunnel test series are discussed below.

  15. The Large Scale Synthesis of Aligned Plate Nanostructures.

    PubMed

    Zhou, Yang; Nash, Philip; Liu, Tian; Zhao, Naiqin; Zhu, Shengli

    2016-01-01

    We propose a novel technique for the large-scale synthesis of aligned-plate nanostructures that are self-assembled and self-supporting. The synthesis technique involves developing nanoscale two-phase microstructures through discontinuous precipitation followed by selective etching to remove one of the phases. The method may be applied to any alloy system in which the discontinuous precipitation transformation goes to completion. The resulting structure may have many applications in catalysis, filtering and thermal management depending on the phase selection and added functionality through chemical reaction with the retained phase. The synthesis technique is demonstrated using the discontinuous precipitation of a γ' phase, (Ni, Co)3Al, followed by selective dissolution of the γ matrix phase. The production of the nanostructure requires heat treatments on the order of minutes and can be performed on a large scale making this synthesis technique of great economic potential. PMID:27439672

  16. Large-scale processes in the solar nebula

    NASA Technical Reports Server (NTRS)

    Boss, A. P.

    1994-01-01

    Theoretical models of the structure of a minimum mass solar nebula should be able to provide the physical context to help evaluate the efficacy of any mechanism proposed for the formation of chondrules or Ca, Al-rich inclusions (CAI's). These models generally attempt to use the equations of radiative hydrodynamics to calculate the large-scale structure of the solar nebula throughout the planet-forming region. In addition, it has been suggested that chondrules and CAI's (=Ch&CAI's) may have been formed as a direct result of large-scale nebula processing such as passage of material through high-temperature regions associated with the global structure of the nebula. In this report we assess the status of global models of solar nebula structure and of various related mechanisms that have been suggested for Ch and CAI formation.

  17. Planar Doppler Velocimetry for Large-Scale Wind Tunnel Applications

    NASA Technical Reports Server (NTRS)

    McKenzie, Robert L.

    1998-01-01

    Planar Doppler Velocimetry (PDV) concepts using a pulsed laser are described and the obtainable minimum resolved velocities in large-scale wind tunnels are evaluated. Velocity-field measurements are shown to be possible at ranges of tens of meters and with single pulse resolutions as low as 2 m/s. Velocity measurements in the flow of a low-speed, turbulent jet are reported that demonstrate the ability of PDV to acquire both average velocity fields and their fluctuation amplitudes, using procedures that are compatible with large-scale facility operations. The advantages of PDV over current Laser Doppler Anemometry and Particle Image Velocimetry techniques appear to be significant for applications to large facilities.

  18. Transparent and Flexible Large-scale Graphene-based Heater

    NASA Astrophysics Data System (ADS)

    Kang, Junmo; Lee, Changgu; Kim, Young-Jin; Choi, Jae-Boong; Hong, Byung Hee

    2011-03-01

    We report a transparent and flexible heater with high optical transmittance and low sheet resistance based on graphene films, showing outstanding thermal and electrical properties. The large-scale graphene films were grown on Cu foil by chemical vapor deposition and transferred to transparent substrates by multiple stacking. A wet chemical doping process enhanced the electrical properties, yielding a sheet resistance as low as 35 ohm/sq at 88.5% transmittance. The temperature response depends mainly on the dimensions and the sheet resistance of the graphene-based heater. We show that a 4x4 cm2 heater can reach 80 °C within 40 seconds, and a large-scale (9x9 cm2) heater shows uniform heating performance, as measured with a thermocouple and an infrared camera. These heaters would be very useful for defogging systems and smart windows.
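    The quoted sheet resistance translates directly into Joule heating power; a minimal back-of-envelope sketch (the 12 V drive voltage is an assumed illustrative value, not from the paper):

    ```python
    # Back-of-envelope Joule heating for a sheet heater.  The 35 ohm/sq
    # sheet resistance is from the abstract; the 12 V drive is assumed.
    def film_resistance(sheet_resistance_ohm_sq, length_cm, width_cm):
        """Resistance of a rectangular film: R = R_s * L / W."""
        return sheet_resistance_ohm_sq * length_cm / width_cm

    def joule_power(voltage_v, resistance_ohm):
        """Dissipated power P = V^2 / R."""
        return voltage_v ** 2 / resistance_ohm

    R = film_resistance(35.0, 4.0, 4.0)   # a square film: R equals R_s
    P = joule_power(12.0, R)
    print(R, round(P, 2))                 # 35.0 ohm, ~4.11 W
    ```

    Note that for a square film the resistance equals the sheet resistance regardless of size, which is why ohm/sq is the natural unit here.
    
    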

  19. Large Scale Diffuse X-ray Emission from Abell 3571

    NASA Technical Reports Server (NTRS)

    Molnar, Sandor M.; White, Nicholas E. (Technical Monitor)

    2001-01-01

    Observations of the Lyman alpha forest suggest that there are many more baryons at high redshift than we can find in the nearby Universe. The largest known concentration of baryons in the nearby Universe is the Shapley supercluster. We scanned the Shapley supercluster to search for large scale diffuse emission with the Rossi X-ray Timing Explorer (RXTE), and found some evidence for such emission. Large scale diffuse emission may be associated with the supercluster, or with the clusters of galaxies within the supercluster. In this paper we present results of scans near Abell 3571. We found that the sum of a cooling flow and an isothermal beta model adequately describes the X-ray emission from the cluster. Our results suggest that diffuse emission from A3571 extends out to about two virial radii. We briefly discuss the importance of determining the cut-off radius of the beta model.
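    The isothermal beta model mentioned above has a standard closed form for the surface-brightness profile; a minimal sketch (the beta = 2/3 value is a common illustrative choice, not the fitted value for A3571):

    ```python
    # Standard isothermal beta-model surface-brightness profile.
    # Parameter values below are illustrative, not the paper's fit.
    def beta_model_surface_brightness(r, s0, r_core, beta):
        """S(r) = S0 * (1 + (r/r_c)^2) ** (0.5 - 3*beta)."""
        return s0 * (1.0 + (r / r_core) ** 2) ** (0.5 - 3.0 * beta)

    # At one core radius with beta = 2/3, the brightness falls to
    # 2**-1.5 (about 35%) of its central value.
    print(beta_model_surface_brightness(1.0, 1.0, 1.0, 2.0 / 3.0))
    ```
    
    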

  20. Large Scale Deformation of the Western US Cordillera

    NASA Technical Reports Server (NTRS)

    Bennett, Richard A.

    2001-01-01

    Destructive earthquakes occur throughout the western US Cordillera (WUSC), not just within the San Andreas fault zone. But because we do not understand the present-day large-scale deformations of the crust throughout the WUSC, our ability to assess the potential for seismic hazards in this region remains severely limited. To address this problem, we are using a large collection of Global Positioning System (GPS) networks which spans the WUSC to precisely quantify present-day large-scale crustal deformations in a single uniform reference frame. Our work can roughly be divided into an analysis of the GPS observations to infer the deformation field across and within the entire plate boundary zone and an investigation of the implications of this deformation field regarding plate boundary dynamics.

  1. The Large Scale Synthesis of Aligned Plate Nanostructures

    PubMed Central

    Zhou, Yang; Nash, Philip; Liu, Tian; Zhao, Naiqin; Zhu, Shengli

    2016-01-01

    We propose a novel technique for the large-scale synthesis of aligned-plate nanostructures that are self-assembled and self-supporting. The synthesis technique involves developing nanoscale two-phase microstructures through discontinuous precipitation followed by selective etching to remove one of the phases. The method may be applied to any alloy system in which the discontinuous precipitation transformation goes to completion. The resulting structure may have many applications in catalysis, filtering and thermal management depending on the phase selection and added functionality through chemical reaction with the retained phase. The synthesis technique is demonstrated using the discontinuous precipitation of a γ′ phase, (Ni, Co)3Al, followed by selective dissolution of the γ matrix phase. The production of the nanostructure requires heat treatments on the order of minutes and can be performed on a large scale making this synthesis technique of great economic potential. PMID:27439672

  2. Large scale meteorological influence during the Geysers 1979 field experiment

    SciTech Connect

    Barr, S.

    1980-01-01

    A series of meteorological field measurements conducted during July 1979 near Cobb Mountain in Northern California reveals evidence of several scales of atmospheric circulation consistent with the climatic pattern of the area. The scales of influence are reflected in the structure of wind and temperature in vertically stratified layers at a given observation site. Large scale synoptic gradient flow dominates the wind field above about twice the height of the topographic ridge. Below that there is a mixture of effects with evidence of a diurnal sea breeze influence and a sublayer of katabatic winds. The July observations demonstrate that weak migratory circulations in the large scale synoptic meteorological pattern have a significant influence on the day-to-day gradient winds and must be accounted for in planning meteorological programs including tracer experiments.

  3. Large-scale quantum effects in biological systems

    NASA Astrophysics Data System (ADS)

    Mesquita, Marcus V.; Vasconcellos, Áurea R.; Luzzi, Roberto; Mascarenhas, Sergio

    Particular aspects of large-scale quantum effects in biological systems, such as biopolymers and also microtubules in the neuronal cytoskeleton, which may be relevant to brain functioning, are discussed. The microscopic (quantum mechanical) and macroscopic (quantum statistical mechanical) aspects, and the emergence of complex behavior, are described. The phenomenon consists of the large-scale coherent process of Fröhlich-Bose-Einstein condensation in open, sufficiently far-from-equilibrium biopolymers. Associated with this phenomenon is the presence of Schrödinger-Davydov solitons, which propagate undistorted and undamped when embedded in the Fröhlich-Bose-Einstein condensate, thus allowing for the transmission of signals over long distances, a question relevant to bioenergetics.

  4. GAIA: A WINDOW TO LARGE-SCALE MOTIONS

    SciTech Connect

    Nusser, Adi; Branchini, Enzo; Davis, Marc

    2012-08-10

    Using redshifts as a proxy for galaxy distances, estimates of the two-dimensional (2D) transverse peculiar velocities of distant galaxies could be obtained from future measurements of proper motions. We provide the mathematical framework for analyzing 2D transverse motions and show that they offer several advantages over traditional probes of large-scale motions. They are completely independent of any intrinsic relations between galaxy properties; hence, they are essentially free of selection biases. They are free from homogeneous and inhomogeneous Malmquist biases that typically plague distance indicator catalogs. They provide additional information to traditional probes that yield line-of-sight peculiar velocities only. Further, because of their 2D nature, fundamental questions regarding vorticity of large-scale flows can be addressed. Gaia, for example, is expected to provide proper motions of at least bright galaxies with high central surface brightness, making proper motions a likely contender for traditional probes based on current and future distance indicator measurements.

  5. Large-scale objective phenotyping of 3D facial morphology

    PubMed Central

    Hammond, Peter; Suttie, Michael

    2012-01-01

    Abnormal phenotypes have played significant roles in the discovery of gene function, but organized collection of phenotype data has been overshadowed by developments in sequencing technology. In order to study phenotypes systematically, large-scale projects with standardized objective assessment across populations are considered necessary. The report of the 2006 Human Variome Project meeting recommended documentation of phenotypes through electronic means by collaborative groups of computational scientists and clinicians using standard, structured descriptions of disease-specific phenotypes. In this report, we describe progress over the past decade in 3D digital imaging and shape analysis of the face, and future prospects for large-scale facial phenotyping. Illustrative examples are given throughout using a collection of 1107 3D face images of healthy controls and individuals with a range of genetic conditions involving facial dysmorphism. PMID:22434506

  6. Electron drift in a large scale solid xenon

    SciTech Connect

    Yoo, J.; Jaskierny, W. F.

    2015-08-21

    A study of charge drift in large-scale optically transparent solid xenon is reported. A pulsed high-power xenon light source is used to liberate electrons from a photocathode. The drift speeds of the electrons are measured using an 8.7 cm long electrode in both the liquid and solid phases of xenon. In the liquid phase (163 K), the drift speed is 0.193 ± 0.003 cm/μs, while in the solid phase (157 K) it is 0.397 ± 0.006 cm/μs at 900 V/cm over 8.0 cm of uniform electric field. This demonstrates that the electron drift speed in large-scale solid xenon is a factor of two faster than in the liquid.
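    The quoted numbers can be checked directly; a minimal sketch using only the values from the abstract:

    ```python
    # Values quoted in the abstract; a quick consistency check of the
    # "factor of two" claim and the implied transit times.
    v_liquid = 0.193          # cm/us at 163 K (liquid)
    v_solid = 0.397           # cm/us at 157 K (solid)
    drift_length = 8.0        # cm of uniform 900 V/cm field

    ratio = v_solid / v_liquid
    t_liquid = drift_length / v_liquid   # transit time, microseconds
    t_solid = drift_length / v_solid

    print(round(ratio, 2), round(t_liquid, 1), round(t_solid, 1))  # 2.06 41.5 20.2
    ```
    
    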

  7. Electron drift in a large scale solid xenon

    DOE PAGESBeta

    Yoo, J.; Jaskierny, W. F.

    2015-08-21

    A study of charge drift in large-scale optically transparent solid xenon is reported. A pulsed high-power xenon light source is used to liberate electrons from a photocathode. The drift speeds of the electrons are measured using an 8.7 cm long electrode in both the liquid and solid phases of xenon. In the liquid phase (163 K), the drift speed is 0.193 ± 0.003 cm/μs, while in the solid phase (157 K) it is 0.397 ± 0.006 cm/μs at 900 V/cm over 8.0 cm of uniform electric field. This demonstrates that the electron drift speed in large-scale solid xenon is a factor of two faster than in the liquid.

  8. Large-scale micropropagation system of plant cells.

    PubMed

    Honda, Hiroyuki; Kobayashi, Takeshi

    2004-01-01

    Plant micropropagation is an efficient method of propagating disease-free, genetically uniform plants in vitro in massive amounts. Scale-up of the whole micropropagation process requires economically feasible technology for large-scale production in appropriate bioreactors. It is necessary to design a suitable bioreactor configuration that can provide adequate mixing and mass transfer while minimizing the intensity of shear stress and hydrodynamic pressure. Automatic selection of embryogenic calli and regenerated plantlets using an image analysis system should be associated with the system. The aim of this chapter is to identify the problems related to large-scale plant micropropagation via somatic embryogenesis, and to summarize the micropropagation technology and computer-aided image analysis. Viscous-additive-supplemented culture, including our successful results on callus regeneration, is also introduced. PMID:15453194

  9. Quantum Noise in Large-Scale Coherent Nonlinear Photonic Circuits

    NASA Astrophysics Data System (ADS)

    Santori, Charles; Pelc, Jason S.; Beausoleil, Raymond G.; Tezak, Nikolas; Hamerly, Ryan; Mabuchi, Hideo

    2014-06-01

    A semiclassical simulation approach is presented for studying quantum noise in large-scale photonic circuits incorporating an ideal Kerr nonlinearity. A circuit solver is used to generate matrices defining a set of stochastic differential equations, in which the resonator field variables represent random samplings of the Wigner quasiprobability distributions. Although the semiclassical approach involves making a large-photon-number approximation, tests on one- and two-resonator circuits indicate satisfactory agreement between the semiclassical and full-quantum simulation results in the parameter regime of interest. The semiclassical model is used to simulate random errors in a large-scale circuit that contains 88 resonators and hundreds of components in total and functions as a four-bit ripple counter. The error rate as a function of on-state photon number is examined, and it is observed that the quantum fluctuation amplitudes do not increase as signals propagate through the circuit, an important property for scalability.
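    The semiclassical scheme described above integrates stochastic differential equations whose field variables sample the Wigner quasiprobability distribution. A toy single-resonator sketch (all parameter values and the exact vacuum-noise normalization are illustrative assumptions, not from the paper):

    ```python
    import numpy as np

    # Toy Euler-Maruyama integration of a single driven Kerr resonator,
    # with the complex field variable treated as a sample of the Wigner
    # distribution.  Parameters are assumed for illustration only.
    rng = np.random.default_rng(0)

    kappa, delta, chi, drive = 1.0, 0.0, 0.05, 2.0  # decay, detuning, Kerr, pump
    dt, n_steps = 1e-3, 5000

    a = 0.0 + 0.0j                                   # resonator field variable
    for _ in range(n_steps):
        drift = (-(kappa / 2) + 1j * delta - 1j * chi * abs(a) ** 2) * a + drive
        noise = np.sqrt(kappa * dt / 4) * (rng.normal() + 1j * rng.normal())
        a = a + drift * dt + noise                   # Euler-Maruyama step

    print(abs(a) ** 2)                               # instantaneous photon number
    ```

    Averaging such trajectories over many noise realizations is what recovers symmetrically-ordered quantum expectation values in this approach.
    
    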

  10. Large-scale flow generation in turbulent convection

    PubMed Central

    Krishnamurti, Ruby; Howard, Louis N.

    1981-01-01

    In a horizontal layer of fluid heated from below and cooled from above, cellular convection with horizontal length scale comparable to the layer depth occurs for small enough values of the Rayleigh number. As the Rayleigh number is increased, cellular flow disappears and is replaced by a random array of transient plumes. Upon further increase, these plumes drift in one direction near the bottom and in the opposite direction near the top of the layer with the axes of plumes tilted in such a way that horizontal momentum is transported upward via the Reynolds stress. With the onset of this large-scale flow, the largest scale of motion has increased from that comparable to the layer depth to a scale comparable to the layer width. The conditions for occurrence and determination of the direction of this large-scale circulation are described. PMID:16592996

  11. The Large Scale Synthesis of Aligned Plate Nanostructures

    NASA Astrophysics Data System (ADS)

    Zhou, Yang; Nash, Philip; Liu, Tian; Zhao, Naiqin; Zhu, Shengli

    2016-07-01

    We propose a novel technique for the large-scale synthesis of aligned-plate nanostructures that are self-assembled and self-supporting. The synthesis technique involves developing nanoscale two-phase microstructures through discontinuous precipitation followed by selective etching to remove one of the phases. The method may be applied to any alloy system in which the discontinuous precipitation transformation goes to completion. The resulting structure may have many applications in catalysis, filtering and thermal management depending on the phase selection and added functionality through chemical reaction with the retained phase. The synthesis technique is demonstrated using the discontinuous precipitation of a γ‧ phase, (Ni, Co)3Al, followed by selective dissolution of the γ matrix phase. The production of the nanostructure requires heat treatments on the order of minutes and can be performed on a large scale making this synthesis technique of great economic potential.

  12. Large-scale behavior and statistical equilibria in rotating flows

    NASA Astrophysics Data System (ADS)

    Mininni, P. D.; Dmitruk, P.; Matthaeus, W. H.; Pouquet, A.

    2011-01-01

    We examine long-time properties of the ideal dynamics of three-dimensional flows, in the presence or not of an imposed solid-body rotation and with or without helicity (velocity-vorticity correlation). In all cases, the results agree with the isotropic predictions stemming from statistical mechanics. No accumulation of excitation occurs in the large scales, although, in the dissipative rotating case, anisotropy and accumulation, in the form of an inverse cascade of energy, are known to occur. We attribute this latter discrepancy to the linearity of the term responsible for the emergence of inertial waves. At intermediate times, inertial energy spectra emerge that differ somewhat from classical wave-turbulence expectations and with a trace of large-scale excitation that goes away for long times. These results are discussed in the context of partial two dimensionalization of the flow undergoing strong rotation as advocated by several authors.

  13. The workshop on iterative methods for large scale nonlinear problems

    SciTech Connect

    Walker, H.F.; Pernice, M.

    1995-12-01

    The aim of the workshop was to bring together researchers working on large-scale applications with numerical specialists of various kinds. Applications that were addressed included reactive flows (combustion and other chemically reacting flows, tokamak modeling), porous media flows, cardiac modeling, chemical vapor deposition, image restoration, macromolecular modeling, and population dynamics. Numerical areas included Newton iterative (truncated Newton) methods, Krylov subspace methods, domain decomposition and other preconditioning methods, large-scale optimization and optimal control, and parallel implementations and software. This report offers a brief summary of workshop activities and information about the participants. Interested readers are encouraged to look into the online proceedings available at http://www.usi.utah.edu/logan.proceedings, where the material offered here is augmented with hypertext abstracts that include links to locations such as speakers' home pages, PostScript copies of talks and papers, cross-references to related talks, and other information about topics addressed at the workshop.

  14. Prototype Vector Machine for Large Scale Semi-Supervised Learning

    SciTech Connect

    Zhang, Kai; Kwok, James T.; Parvin, Bahram

    2009-04-29

    Practical data mining rarely falls exactly into the supervised learning scenario. Rather, the growing amount of unlabeled data poses a big challenge to large-scale semi-supervised learning (SSL). We note that the computational intensiveness of graph-based SSL arises largely from the manifold or graph regularization, which in turn leads to large models that are difficult to handle. To alleviate this, we propose the prototype vector machine (PVM), a highly scalable, graph-based algorithm for large-scale SSL. Our key innovation is the use of "prototype vectors" for efficient approximation of both the graph-based regularizer and the model representation. The choice of prototypes is grounded upon two important criteria: they not only perform effective low-rank approximation of the kernel matrix, but also span a model suffering the minimum information loss compared with the complete model. We demonstrate encouraging performance and appealing scaling properties of the PVM on a number of machine learning benchmark data sets.
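    The low-rank kernel approximation that prototypes provide can be sketched in a Nystrom-style form, K ≈ K_np K_pp⁻¹ K_npᵀ (the data, kernel, and random prototype selection below are illustrative assumptions, not the PVM's actual selection criteria):

    ```python
    import numpy as np

    # Nystrom-style low-rank kernel approximation from a small set of
    # prototypes.  Data, kernel width, and random prototype selection
    # are illustrative, not the PVM's actual criteria.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 2))                        # n data points
    prototypes = X[rng.choice(200, size=20, replace=False)]  # m prototypes

    def rbf(A, B, gamma=0.1):
        """Gaussian (RBF) kernel matrix between row sets A and B."""
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    K = rbf(X, X)                        # full n x n kernel matrix
    K_np = rbf(X, prototypes)            # n x m cross-kernel
    K_pp = rbf(prototypes, prototypes)   # m x m prototype kernel
    K_approx = K_np @ np.linalg.pinv(K_pp) @ K_np.T  # rank-m approximation

    rel_err = np.linalg.norm(K - K_approx) / np.linalg.norm(K)
    print(K_approx.shape, rel_err)       # relative Frobenius-norm error
    ```

    The practical point is cost: the full kernel is n x n, while everything the method needs scales with the much smaller n x m cross-kernel.
    
    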

  15. Village health survey of Sina Mala, Gongola State, Nigeria.

    PubMed

    Thompson, J S; Dixon, R A

    1993-09-01

    A survey of the environment, life-style, and health status, knowledge, attitudes and practices in the village of Sina Mala was carried out prior to the introduction of a village health post by a church-run rural health programme. In addition to the perceived needs of the villagers for a school, easier access to medicine and external assistance with well drilling, the study identified the need to train traditional midwives in hygienic delivery, to make local health workers more aware of onchocerciasis and to educate the community on sanitation and hygiene, including the harmful effects of the guinea corn beer. PMID:7839910

  16. Seasonal components of avian population change: Joint analysis of two large-scale monitoring programs

    USGS Publications Warehouse

    Link, W.A.; Sauer, J.R.

    2007-01-01

    We present a combined analysis of data from two large-scale surveys of bird populations. The North American Breeding Bird Survey is conducted each summer; the Christmas Bird Count is conducted in early winter. The temporal staggering of these surveys allows investigation of seasonal components of population change, which we illustrate with an examination of the effects of severe winters on the Carolina Wren (Thryothorus ludovicianus). Our analysis uses a hierarchical log-linear model with controls for survey-specific sampling covariates. Temporal change in population size is modeled seasonally, with covariates for winter severity. Overall, the winter-spring seasons are associated with 82% of the total population variation for Carolina Wrens, and an additional day of snow cover during winter-spring is associated with an incremental decline of 1.1% of the population.
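    The quoted per-day effect compounds over a season; a minimal check (the 30-day snow-cover figure is an illustrative assumption, not from the paper):

    ```python
    # The abstract's 1.1% incremental decline per extra day of snow
    # cover, compounded over a hypothetical 30 extra snow days.
    decline_per_snow_day = 0.011

    def remaining_fraction(extra_snow_days):
        """Fraction of the population remaining after compounding."""
        return (1.0 - decline_per_snow_day) ** extra_snow_days

    print(round(remaining_fraction(30), 3))  # 0.718 -> roughly a 28% drop
    ```
    
    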

  17. Large-Scale Optimization for Bayesian Inference in Complex Systems

    SciTech Connect

    Willcox, Karen; Marzouk, Youssef

    2013-11-12

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT-Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings with little effect on the computed posterior distribution. In the Texas-Georgia Tech component of the project, on the other hand, the full-order model is retained, but inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) is exploited to implicitly extract lower-dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. These two approaches can be thought of as "reduce then sample" and "sample then reduce." In fact, they are complementary and can be used in conjunction with each other; both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their computational speedups.

  18. Analysis plan for 1985 large-scale tests. Technical report

    SciTech Connect

    McMullan, F.W.

    1983-01-01

    The purpose of this effort is to assist DNA in planning for large-scale (upwards of 5000 tons) detonations of conventional explosives in the 1985 and beyond time frame. Primary research objectives were to investigate potential means to increase blast duration and peak pressures. This report identifies and analyzes several candidate explosives. It examines several charge designs and identifies advantages and disadvantages of each. Other factors including terrain and multiburst techniques are addressed as are test site considerations.

  19. Large-scale Alfvén vortices

    NASA Astrophysics Data System (ADS)

    Onishchenko, O. G.; Pokhotelov, O. A.; Horton, W.; Scullion, E.; Fedun, V.

    2015-12-01

    A new type of large-scale vortex structure of dispersionless Alfvén waves in collisionless plasma is investigated. It is shown that Alfvén waves can propagate in the form of Alfvén vortices of finite characteristic radius, characterised by magnetic flux ropes carrying orbital angular momentum. The structure of the toroidal and radial velocity, the fluid and magnetic field vorticity, and the longitudinal electric current in the plane orthogonal to the external magnetic field are discussed.

  20. Large-Scale Weather Disturbances in Mars’ Southern Extratropics

    NASA Astrophysics Data System (ADS)

    Hollingsworth, Jeffery L.; Kahre, Melinda A.

    2015-11-01

    Between late autumn and early spring, Mars’ middle and high latitudes within its atmosphere support strong mean thermal gradients between the tropics and poles. Observations from both the Mars Global Surveyor (MGS) and Mars Reconnaissance Orbiter (MRO) indicate that this strong baroclinicity supports intense, large-scale eastward traveling weather systems (i.e., transient synoptic-period waves). These extratropical weather disturbances are key components of the global circulation. Such wave-like disturbances act as agents in the transport of heat, momentum, and generalized scalar/tracer quantities (e.g., atmospheric dust, water vapor and ice clouds). The character of large-scale, traveling extratropical synoptic-period disturbances in Mars’ southern hemisphere during late winter through early spring is investigated using a moderately high-resolution Mars global climate model (Mars GCM). This Mars GCM imposes interactively lifted and radiatively active dust based on a threshold value of the surface stress. The model exhibits a reasonable "dust cycle" (i.e., globally averaged, a dustier atmosphere occurs during southern spring and summer). Compared to their northern-hemisphere counterparts, southern synoptic-period weather disturbances and accompanying frontal waves have smaller meridional and zonal scales, and are far less intense. Influences of the zonally asymmetric (i.e., east-west varying) topography on southern large-scale weather are examined. Simulations that adopt Mars’ full topography, compared to simulations that utilize synthetic topographies emulating key large-scale features of the southern middle latitudes, indicate that Mars’ transient barotropic/baroclinic eddies are highly influenced by the great impact basins of this hemisphere (e.g., Argyre and Hellas). The occurrence of a southern storm zone in late winter and early spring appears to be anchored to the western hemisphere via orographic influences from the Tharsis highlands and the Argyre impact basin.

  1. The Phoenix series large scale LNG pool fire experiments.

    SciTech Connect

    Simpson, Richard B.; Jensen, Richard Pearson; Demosthenous, Byron; Luketa, Anay Josephine; Ricks, Allen Joseph; Hightower, Marion Michael; Blanchat, Thomas K.; Helmick, Paul H.; Tieszen, Sheldon Robert; Deola, Regina Anne; Mercier, Jeffrey Alan; Suo-Anttila, Jill Marie; Miller, Timothy J.

    2010-12-01

    The increasing demand for natural gas could increase the number and frequency of Liquefied Natural Gas (LNG) tanker deliveries to ports across the United States. Because of the increasing number of shipments and the number of possible new facilities, concerns about hazards to the public and property from accidental, and even more importantly intentional, spills have increased. While improvements have been made over the past decade in assessing hazards from LNG spills, the existing experimental data are much smaller in size and scale than many postulated large accidental and intentional spills. Since the physics and hazards of a fire change with fire size, there are concerns about the adequacy of current hazard prediction techniques for large LNG spills and fires. To address these concerns, Congress funded the Department of Energy (DOE) in 2008 to conduct a series of laboratory and large-scale LNG pool fire experiments at Sandia National Laboratories (Sandia) in Albuquerque, New Mexico. This report presents the test data and results of both sets of fire experiments. A series of five reduced-scale (gas burner) tests (yielding 27 sets of data) were conducted in 2007 and 2008 at Sandia's Thermal Test Complex (TTC) to assess flame height to fire diameter ratios as a function of nondimensional heat release rates for extrapolation to large-scale LNG fires. The large-scale LNG pool fire experiments were conducted in a 120 m diameter pond specially designed and constructed in Sandia's Area III large-scale test complex. Two fire tests of LNG spills of 21 and 81 m in diameter were conducted in 2009 to improve the understanding of flame height, smoke production, and burn rate, and therefore the physics and hazards of large LNG spills and fires.

  2. Cosmic string and formation of large scale structure.

    NASA Astrophysics Data System (ADS)

    Fang, L.-Z.; Xiang, S.-P.

    Cosmic strings formed during phase transitions in the early universe may be the cause of galaxy formation and clustering. The advantage of the string model is that it can give a consistent explanation of all observed results related to large scale structure, such as the correlation functions of galaxies, clusters and superclusters, the existence of voids and/or bubbles, and the anisotropy of the cosmic background radiation. A systematic review of the string model is given.

  3. Large-scale Alfvén vortices

    SciTech Connect

    Onishchenko, O. G.; Horton, W.; Scullion, E.; Fedun, V.

    2015-12-15

    A new type of large-scale vortex structure of dispersionless Alfvén waves in collisionless plasma is investigated. It is shown that Alfvén waves can propagate in the form of Alfvén vortices of finite characteristic radius, characterised by magnetic flux ropes carrying orbital angular momentum. The structure of the toroidal and radial velocity, the fluid and magnetic field vorticity, and the longitudinal electric current in the plane orthogonal to the external magnetic field are discussed.

  4. Climate: large-scale warming is not urban.

    PubMed

    Parker, David E

    2004-11-18

    Controversy has persisted over the influence of urban warming on reported large-scale surface-air temperature trends. Urban heat islands occur mainly at night and are reduced in windy conditions. Here we show that, globally, temperatures over land have risen as much on windy nights as on calm nights, indicating that the observed overall warming is not a consequence of urban development. PMID:15549087

  5. Relic vector field and CMB large scale anomalies

    SciTech Connect

    Chen, Xingang; Wang, Yi

    2014-10-01

    We study the most general effects of relic vector fields on the inflationary background and density perturbations. Such effects are observable if the number of inflationary e-folds is close to the minimum requirement to solve the horizon problem. We show that this can potentially explain two CMB large scale anomalies: the quadrupole-octopole alignment and the quadrupole power suppression. We discuss its effect on the parity anomaly. We also provide an analytical template for more detailed data comparison.

  6. Turbulent large-scale structure effects on wake meandering

    NASA Astrophysics Data System (ADS)

    Muller, Y.-A.; Masson, C.; Aubrun, S.

    2015-06-01

    This work studies the effects of large-scale turbulent structures on wake meandering using Large Eddy Simulations (LES) over an actuator disk. Other potential sources of wake meandering, such as the instability mechanisms associated with tip vortices, are not treated in this study. A crucial element of efficient, pragmatic and successful simulations of large-scale turbulent structures in the Atmospheric Boundary Layer (ABL) is the generation of the stochastic turbulent atmospheric flow. This is an essential capability, since one source of wake meandering is these large (larger than the turbine diameter) turbulent structures. The unsteady wind turbine wake in the ABL is simulated using a combination of LES and actuator disk approaches. In order to dedicate the large majority of the available computing power to the wake, the ABL ground region of the flow is not part of the computational domain. Instead, mixed Dirichlet/Neumann boundary conditions are applied at all the computational surfaces except at the outlet. Prescribed values for the Dirichlet contribution of these boundary conditions are provided by a stochastic turbulent wind generator. This makes it possible to simulate large-scale turbulent structures (larger than the computational domain), leading to an efficient simulation technique for wake meandering. Since the stochastic wind generator includes shear, turbulence production is included in the analysis without the necessity of resolving the flow near the ground. The classical Smagorinsky sub-grid model is used. The resulting numerical methodology has been implemented in OpenFOAM. Comparisons with experimental measurements in porous-disk wakes have been undertaken, and the agreement is good. While temporal resolution in experimental measurements is high, the spatial resolution is often too low. LES numerical results provide a more complete spatial description of the flow. They tend to demonstrate that inflow low-frequency content, or large-scale turbulent structures, is …

  7. Space transportation booster engine thrust chamber technology, large scale injector

    NASA Technical Reports Server (NTRS)

    Schneider, J. A.

    1993-01-01

    The objective of the Large Scale Injector (LSI) program was to deliver a 21 inch diameter, 600,000 lbf thrust class injector to NASA/MSFC for hot fire testing. The hot fire test program would demonstrate the feasibility and integrity of the full scale injector, including combustion stability, chamber wall compatibility (thermal management), and injector performance. The 21 inch diameter injector was delivered in September of 1991.

  8. Supporting large scale applications on networks of workstations

    NASA Technical Reports Server (NTRS)

    Cooper, Robert; Birman, Kenneth P.

    1989-01-01

    Distributed applications on networks of workstations are an increasingly common way to satisfy computing needs. However, existing mechanisms for distributed programming exhibit poor performance and reliability as application size increases. Extension of the ISIS distributed programming system to support large scale distributed applications by providing hierarchical process groups is discussed. Incorporation of hierarchy in the program structure and exploitation of this to limit the communication and storage required in any one component of the distributed system is examined.

  9. A fast approach to generate large-scale topographic maps based on new Chinese vehicle-borne Lidar system

    NASA Astrophysics Data System (ADS)

    Youmei, Han; Bogang, Yang

    2014-03-01

    Large-scale topographic maps are important basic information for city and regional planning and management. Traditional large-scale mapping methods are mostly based on manual surveying and photogrammetry. Manual mapping is inefficient and limited by the environment, while photogrammetric methods (such as low-altitude aerial mapping) are an economical and effective way to map wide areas but do not work well for small areas due to the high cost in manpower and resources. In recent years, vehicle-borne LiDAR technology has developed rapidly, and its application in surveying and mapping is becoming a new topic. The main objective of this investigation is to explore the potential of a new Chinese vehicle-borne LiDAR system for fast production of large-scale topographic maps. It studied how to use this system's measurement technology to map large-scale topographic maps. After field data capture, maps can be produced in the office from the LiDAR data (point cloud) using software we developed ourselves. In addition, the detailed process and an accuracy analysis are presented for an actual case. The results show that this new technology provides a fast method to generate large-scale topographic maps that is highly efficient and accurate compared to traditional methods.

  10. Homogenization of Large-Scale Movement Models in Ecology

    USGS Publications Warehouse

    Garlick, M.J.; Powell, J.A.; Hooten, M.B.; McFarlane, L.R.

    2011-01-01

    A difficulty in using diffusion models to predict large-scale animal population dispersal is that individuals move differently based on local information (as opposed to gradients) in differing habitat types. This can be accommodated by using ecological diffusion. However, real environments are often spatially complex, limiting application of a direct approach. Homogenization for partial differential equations has long been applied to Fickian diffusion (in which average individual movement is organized along gradients of habitat and population density). We derive a homogenization procedure for ecological diffusion and apply it to a simple model for chronic wasting disease in mule deer. Homogenization allows us to determine the impact of small-scale (10-100 m) habitat variability on large-scale (10-100 km) movement. The procedure generates asymptotic equations for solutions on the large scale with parameters defined by small-scale variation. The simplicity of this homogenization procedure is striking when compared to the multi-dimensional homogenization procedure for Fickian diffusion, and the method will be equally straightforward for more complex models. © 2010 Society for Mathematical Biology.
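
    As a minimal one-dimensional sketch of the idea (not the paper's actual multi-dimensional procedure): homogenizing the ecological diffusion equation u_t = (μu)_xx replaces the spatially varying motility μ with a harmonic-type average, so slow habitat patches dominate the effective large-scale rate. The habitat mosaic values below are hypothetical.

```python
def harmonic_mean(mu):
    """Effective large-scale motility for 1D ecological diffusion
    u_t = (mu u)_xx: a harmonic average of the small-scale motility."""
    return len(mu) / sum(1.0 / m for m in mu)

# Hypothetical habitat mosaic on a fine grid (m^2/day):
# fast movement in open habitat, slow movement in dense cover.
mu = [100.0, 100.0, 5.0, 100.0]
mu_eff = harmonic_mean(mu)
```

    Note that the harmonic mean (about 17 here) lies far below the arithmetic mean (about 76): a single slow patch throttles large-scale spread, which is the qualitative point of the homogenization.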

  11. Dispersal Mutualism Incorporated into Large-Scale, Infrequent Disturbances

    PubMed Central

    Parker, V. Thomas

    2015-01-01

    Because of their influence on succession and other community interactions, large-scale, infrequent natural disturbances also should play a major role in mutualistic interactions. Using field data and experiments, I test whether mutualisms have been incorporated into large-scale wildfire by whether the outcomes of a mutualism depend on disturbance. In this study a seed dispersal mutualism is shown to depend on infrequent, large-scale disturbances. A dominant shrubland plant (Arctostaphylos species) produces seeds that make up a persistent soil seed bank and requires fire to germinate. In post-fire stands, I show that seedlings emerging from rodent caches dominate sites experiencing higher fire intensity. Field experiments show that rodents (Peromyscus californicus, P. boylii) do cache Arctostaphylos fruit and bury most seed caches to a sufficient depth to survive a killing heat pulse that a fire might drive into the soil. While the rodent dispersal and caching behavior itself has not changed compared to other habitats, the environmental transformation caused by wildfire converts the caching burial of seed from a dispersal process to a plant fire adaptive trait, and provides the context for stimulating subsequent life history evolution in the plant host. PMID:26151560

  12. Reliability assessment for components of large scale photovoltaic systems

    NASA Astrophysics Data System (ADS)

    Ahadi, Amir; Ghadimi, Noradin; Mirabbasi, Davar

    2014-10-01

    Photovoltaic (PV) systems have significantly shifted from independent power generation systems to large-scale grid-connected generation systems in recent years. The power output of PV systems is affected by the reliability of various components in the system. This study proposes an analytical approach to evaluate the reliability of large-scale, grid-connected PV systems. The fault tree method with an exponential probability distribution function is used to analyze the components of large-scale PV systems. The system is considered in its various sequential and parallel fault combinations in order to find all realistic ways in which the top or undesired events can occur. Additionally, the method can identify areas on which planned maintenance should focus. By monitoring the critical components of a PV system, it is possible not only to improve the reliability of the system, but also to optimize the maintenance costs. The latter is achieved by informing the operators about the status of system components. This approach can be used to ensure secure operation of the system through its flexibility in monitoring system applications. The implementation demonstrates that the proposed method is effective and efficient and can conveniently incorporate more system maintenance plans and diagnostic strategies.
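
    A minimal sketch of the kind of calculation described, assuming exponentially distributed component failures; the gate structure, component names, and failure rates below are hypothetical illustrations, not those of the study.

```python
import math

def reliability(lam, t):
    """Reliability of one component with exponential failure rate lam (1/h)."""
    return math.exp(-lam * t)

def series(rs):
    """Series logic (OR gate on failures): the system fails if any part fails."""
    r = 1.0
    for ri in rs:
        r *= ri
    return r

def parallel(rs):
    """Redundancy (AND gate on failures): the block fails only if all parts fail."""
    q = 1.0
    for ri in rs:
        q *= 1.0 - ri
    return 1.0 - q

t = 8760.0  # one year of operation, in hours
r_inv = reliability(2e-5, t)                   # hypothetical inverter
r_str = parallel([reliability(1e-5, t)] * 3)   # three redundant PV strings
r_trf = reliability(5e-6, t)                   # hypothetical transformer
r_sys = series([r_inv, r_str, r_trf])          # top event: loss of output
```

    The same traversal that evaluates the tree also reveals which gate contributes most to the top-event probability, which is how such an analysis points maintenance at critical components.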

  13. Robust regression for large-scale neuroimaging studies.

    PubMed

    Fritsch, Virgile; Da Mota, Benoit; Loth, Eva; Varoquaux, Gaël; Banaschewski, Tobias; Barker, Gareth J; Bokde, Arun L W; Brühl, Rüdiger; Butzek, Brigitte; Conrod, Patricia; Flor, Herta; Garavan, Hugh; Lemaitre, Hervé; Mann, Karl; Nees, Frauke; Paus, Tomas; Schad, Daniel J; Schümann, Gunter; Frouin, Vincent; Poline, Jean-Baptiste; Thirion, Bertrand

    2015-05-01

    Multi-subject datasets used in neuroimaging group studies have a complex structure, as they exhibit non-stationary statistical properties across regions and display various artifacts. While studies with small sample sizes can rarely be shown to deviate from standard hypotheses (such as the normality of the residuals) due to the poor sensitivity of normality tests with low degrees of freedom, large-scale studies (e.g. >100 subjects) exhibit more obvious deviations from these hypotheses and call for more refined models for statistical inference. Here, we demonstrate the benefits of robust regression as a tool for analyzing large neuroimaging cohorts. First, we use an analytic test based on robust parameter estimates; based on simulations, this procedure is shown to provide an accurate statistical control without resorting to permutations. Second, we show that robust regression yields more detections than standard algorithms using as an example an imaging genetics study with 392 subjects. Third, we show that robust regression can avoid false positives in a large-scale analysis of brain-behavior relationships with over 1500 subjects. Finally we embed robust regression in the Randomized Parcellation Based Inference (RPBI) method and demonstrate that this combination further improves the sensitivity of tests carried out across the whole brain. Altogether, our results show that robust procedures provide important advantages in large-scale neuroimaging group studies. PMID:25731989
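
    A minimal sketch of robust regression via iteratively reweighted least squares with Huber weights — one standard robust estimator, not necessarily the exact procedure of the paper — fitted to hypothetical data containing a single gross outlier.

```python
def huber_weight(r, delta=1.345):
    """Downweight residuals larger than delta (in scale units)."""
    a = abs(r)
    return 1.0 if a <= delta else delta / a

def wls_line(x, y, w):
    """Weighted least squares for y = a + b*x."""
    sw = sum(w)
    xm = sum(wi * xi for wi, xi in zip(w, x)) / sw
    ym = sum(wi * yi for wi, yi in zip(w, y)) / sw
    sxx = sum(wi * (xi - xm) ** 2 for wi, xi in zip(w, x))
    sxy = sum(wi * (xi - xm) * (yi - ym) for wi, xi, yi in zip(w, x, y))
    b = sxy / sxx
    return ym - b * xm, b

def huber_fit(x, y, iters=50):
    """Iteratively reweighted least squares with Huber weights."""
    w = [1.0] * len(x)
    for _ in range(iters):
        a, b = wls_line(x, y, w)
        resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]
        # robust scale estimate: 1.4826 * median absolute residual
        s = 1.4826 * sorted(abs(r) for r in resid)[len(resid) // 2] or 1.0
        w = [huber_weight(r / s) for r in resid]
    return a, b

# Hypothetical cohort-style data: y = 1 + 2x, with one corrupted observation.
x = list(range(10))
y = [1.0 + 2.0 * xi for xi in x]
y[5] += 50.0
a, b = huber_fit(x, y)
```

    Ordinary least squares on these data is visibly pulled toward the outlier, while the reweighted fit recovers the underlying slope, which is the practical advantage the abstract reports at cohort scale.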

  14. Large-scale flow experiments for managing river systems

    USGS Publications Warehouse

    Konrad, C.P.; Olden, J.D.; Lytle, D.A.; Melis, T.S.; Schmidt, J.C.; Bray, E.N.; Freeman, Mary C.; Gido, K.B.; Hemphill, N.P.; Kennard, M.J.; McMullen, L.E.; Mims, M.C.; Pyron, M.; Robinson, C.T.; Williams, J.G.

    2011-01-01

    Experimental manipulations of streamflow have been used globally in recent decades to mitigate the impacts of dam operations on river systems. Rivers are challenging subjects for experimentation, because they are open systems that cannot be isolated from their social context. We identify principles to address the challenges of conducting effective large-scale flow experiments. Flow experiments have both scientific and social value when they help to resolve specific questions about the ecological action of flow with a clear nexus to water policies and decisions. Water managers must integrate new information into operating policies for large-scale experiments to be effective. Modeling and monitoring can be integrated with experiments to analyze long-term ecological responses. Experimental design should include spatially extensive observations and well-defined, repeated treatments. Large-scale flow manipulations are only a part of dam operations that affect river systems. Scientists can ensure that experimental manipulations continue to be a valuable approach for the scientifically based management of river systems. © 2011 by American Institute of Biological Sciences. All rights reserved.

  15. Large-scale flow experiments for managing river systems

    USGS Publications Warehouse

    Konrad, Christopher P.; Olden, Julian D.; Lytle, David A.; Melis, Theodore S.; Schmidt, John C.; Bray, Erin N.; Freeman, Mary C.; Gido, Keith B.; Hemphill, Nina P.; Kennard, Mark J.; McMullen, Laura E.; Mims, Meryl C.; Pyron, Mark; Robinson, Christopher T.; Williams, John G.

    2011-01-01

    Experimental manipulations of streamflow have been used globally in recent decades to mitigate the impacts of dam operations on river systems. Rivers are challenging subjects for experimentation, because they are open systems that cannot be isolated from their social context. We identify principles to address the challenges of conducting effective large-scale flow experiments. Flow experiments have both scientific and social value when they help to resolve specific questions about the ecological action of flow with a clear nexus to water policies and decisions. Water managers must integrate new information into operating policies for large-scale experiments to be effective. Modeling and monitoring can be integrated with experiments to analyze long-term ecological responses. Experimental design should include spatially extensive observations and well-defined, repeated treatments. Large-scale flow manipulations are only a part of dam operations that affect river systems. Scientists can ensure that experimental manipulations continue to be a valuable approach for the scientifically based management of river systems.

  16. Large-scale flow generation by inhomogeneous helicity

    NASA Astrophysics Data System (ADS)

    Yokoi, N.; Brandenburg, A.

    2016-03-01

    The effect of kinetic helicity (velocity-vorticity correlation) on turbulent momentum transport is investigated. The turbulent kinetic helicity (pseudoscalar) enters the Reynolds stress (mirror-symmetric tensor) expression in the form of a helicity gradient as the coupling coefficient for the mean vorticity and/or the angular velocity (axial vector), which suggests the possibility of mean-flow generation in the presence of inhomogeneous helicity. This inhomogeneous helicity effect, which was previously confirmed at the level of a turbulence- or closure-model simulation, is examined with the aid of direct numerical simulations of rotating turbulence with nonuniform helicity sustained by an external forcing. The numerical simulations show that the spatial distribution of the Reynolds stress is in agreement with the helicity-related term coupled with the angular velocity, and that a large-scale flow is generated in the direction of angular velocity. Such a large-scale flow is not induced in the case of homogeneous turbulent helicity. This result confirms the validity of the inhomogeneous helicity effect in large-scale flow generation and suggests that a vortex dynamo is possible even in incompressible turbulence where there is no baroclinicity effect.

  17. Dispersal Mutualism Incorporated into Large-Scale, Infrequent Disturbances.

    PubMed

    Parker, V Thomas

    2015-01-01

    Because of their influence on succession and other community interactions, large-scale, infrequent natural disturbances also should play a major role in mutualistic interactions. Using field data and experiments, I test whether mutualisms have been incorporated into large-scale wildfire by whether the outcomes of a mutualism depend on disturbance. In this study a seed dispersal mutualism is shown to depend on infrequent, large-scale disturbances. A dominant shrubland plant (Arctostaphylos species) produces seeds that make up a persistent soil seed bank and requires fire to germinate. In post-fire stands, I show that seedlings emerging from rodent caches dominate sites experiencing higher fire intensity. Field experiments show that rodents (Peromyscus californicus, P. boylii) do cache Arctostaphylos fruit and bury most seed caches to a sufficient depth to survive a killing heat pulse that a fire might drive into the soil. While the rodent dispersal and caching behavior itself has not changed compared to other habitats, the environmental transformation caused by wildfire converts the caching burial of seed from a dispersal process to a plant fire adaptive trait, and provides the context for stimulating subsequent life history evolution in the plant host. PMID:26151560

  18. Large-scale quantification of CVD graphene surface coverage.

    PubMed

    Ambrosi, Adriano; Bonanni, Alessandra; Sofer, Zdeněk; Pumera, Martin

    2013-03-21

    The extraordinary properties demonstrated for graphene and graphene-related materials can be fully exploited when a large-scale fabrication procedure is made available. Chemical vapor deposition (CVD) of graphene on Cu and Ni substrates is one of the most promising procedures to synthesize large-area and good quality graphene films. Parallel to the fabrication process, a large-scale quality monitoring technique is equally crucial. We demonstrate here a rapid and simple methodology that is able to probe the effectiveness of the growth process over a large substrate area for both Ni and Cu substrates. This method is based on inherent electrochemical signals generated by the underlying metal catalysts when fractures or discontinuities of the graphene film are present. The method can be applied immediately after the CVD growth process without the need for any graphene transfer step and represents a powerful quality monitoring technique for the assessment of large-scale fabrication of graphene by the CVD process. PMID:23396554

  19. Photorealistic large-scale urban city model reconstruction.

    PubMed

    Poullis, Charalambos; You, Suya

    2009-01-01

    The rapid and efficient creation of virtual environments has become a crucial part of virtual reality applications. In particular, civil and defense applications often require and employ detailed models of operations areas for training, simulations of different scenarios, planning for natural or man-made events, monitoring, surveillance, games, and films. A realistic representation of the large-scale environments is therefore imperative for the success of such applications since it increases the immersive experience of its users and helps reduce the difference between physical and virtual reality. However, the task of creating such large-scale virtual environments still remains time-consuming, manual work. In this work, we propose a novel method for the rapid reconstruction of photorealistic large-scale virtual environments. First, a novel, extendible, parameterized geometric primitive is presented for the automatic identification and reconstruction of building structures. In addition, buildings with complex roofs containing complex linear and nonlinear surfaces are reconstructed interactively using a linear polygonal and a nonlinear primitive, respectively. Second, we present a rendering pipeline for the composition of photorealistic textures, which unlike existing techniques, can recover missing or occluded texture information by integrating multiple information captured from different optical sensors (ground, aerial, and satellite). PMID:19423889

  20. Large scale structure in universes dominated by cold dark matter

    NASA Technical Reports Server (NTRS)

    Bond, J. Richard

    1986-01-01

    The theory of Gaussian random density field peaks is applied to a numerical study of the large-scale structure developing from adiabatic fluctuations in models of biased galaxy formation in universes with Omega = 1, h = 0.5 dominated by cold dark matter (CDM). The angular anisotropy of the cross-correlation function demonstrates that the far-field regions of cluster-scale peaks are asymmetric, as recent observations indicate. These regions will generate pancakes or filaments upon collapse. One-dimensional singularities in the large-scale bulk flow should arise in these CDM models, appearing as pancakes in position space. They are too rare to explain the CfA bubble walls, but pancakes that are just turning around now are sufficiently abundant and would appear to be thin walls normal to the line of sight in redshift space. Large scale streaming velocities are significantly smaller than recent observations indicate. To explain the reported 700 km/s coherent motions, mass must be significantly more clustered than galaxies with a biasing factor of less than 0.4 and a nonlinear redshift at cluster scales greater than one for both massive neutrino and cold models.

  1. Line segment extraction for large scale unorganized point clouds

    NASA Astrophysics Data System (ADS)

    Lin, Yangbin; Wang, Cheng; Cheng, Jun; Chen, Bili; Jia, Fukai; Chen, Zhonggui; Li, Jonathan

    2015-04-01

    Line segment detection in images is already a well-investigated topic, although it has received considerably less attention in 3D point clouds. Benefiting from current LiDAR devices, large-scale point clouds are becoming increasingly common. Most human-made objects have flat surfaces. Line segments that occur where pairs of planes intersect give important information regarding the geometric content of point clouds, which is especially useful for automatic building reconstruction and segmentation. This paper proposes a novel method that is capable of accurately extracting plane intersection line segments from large-scale raw scan points. The 3D line-support region, namely, a point set near a straight linear structure, is extracted simultaneously. The 3D line-support region is fitted by our Line-Segment-Half-Planes (LSHP) structure, which provides a geometric constraint for a line segment, making the line segment more reliable and accurate. We demonstrate our method on the point clouds of large-scale, complex, real-world scenes acquired by LiDAR devices. We also demonstrate the application of 3D line-support regions and their LSHP structures on urban scene abstraction.
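
    The core geometric step — intersecting two fitted planes to obtain the supporting line of a candidate segment — can be sketched as follows. This is the generic textbook construction, not the paper's LSHP fitting; the plane parameters in the usage are hypothetical.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def cross(u, v):
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def plane_intersection(n1, d1, n2, d2, eps=1e-9):
    """Intersection line of planes n1.x = d1 and n2.x = d2.
    Returns (point, direction), or None if the planes are parallel."""
    direction = cross(n1, n2)
    if dot(direction, direction) < eps:
        return None
    # Solve for the point p = a*n1 + b*n2 satisfying both plane equations.
    n11, n22, n12 = dot(n1, n1), dot(n2, n2), dot(n1, n2)
    det = n11 * n22 - n12 * n12
    a = (d1 * n22 - d2 * n12) / det
    b = (d2 * n11 - d1 * n12) / det
    point = tuple(a * n1[i] + b * n2[i] for i in range(3))
    return point, direction

# Hypothetical planes from two fitted building facades: x + y = 1 and x - y = 0.
line = plane_intersection((1.0, 1.0, 0.0), 1.0, (1.0, -1.0, 0.0), 0.0)
```

    In a pipeline like the one described, the segment endpoints would then come from projecting the 3D line-support points onto this line.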

  2. Geospatial Optimization of Siting Large-Scale Solar Projects

    SciTech Connect

    Macknick, J.; Quinby, T.; Caulfield, E.; Gerritsen, M.; Diffendorfer, J.; Haines, S.

    2014-03-01

    Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.
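
    A minimal sketch of the weighted multi-criteria scoring such a siting tool performs; the sites, criteria, normalized scores, and stakeholder weights below are all hypothetical.

```python
# Candidate sites with criteria scores normalized to [0, 1]
# (higher is better for siting).
sites = {
    "A": {"solar": 0.90, "slope": 0.8, "grid_distance": 0.4, "habitat": 0.7},
    "B": {"solar": 0.70, "slope": 0.9, "grid_distance": 0.9, "habitat": 0.5},
    "C": {"solar": 0.95, "slope": 0.5, "grid_distance": 0.6, "habitat": 0.9},
}

def rank_sites(sites, weights):
    """Rank sites by a user-defined weighted sum of criteria scores."""
    total = sum(weights.values())
    scores = {
        name: sum(weights[c] * v for c, v in crit.items()) / total
        for name, crit in sites.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Two hypothetical stakeholders: equal weights vs. conservation-oriented weights.
neutral = rank_sites(sites, {"solar": 1, "slope": 1, "grid_distance": 1, "habitat": 1})
eco = rank_sites(sites, {"solar": 5, "slope": 1, "grid_distance": 1, "habitat": 5})
```

    Re-running the ranking with different weight vectors makes the stakeholder-dependence of siting recommendations explicit, which is the transparency the report argues existing guidelines lack.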

  3. How Large Scales Flows May Influence Solar Activity

    NASA Technical Reports Server (NTRS)

    Hathaway, D. H.

    2004-01-01

    Large scale flows within the solar convection zone are the primary drivers of the Sun's magnetic activity cycle and play important roles in shaping the Sun's magnetic field. Differential rotation amplifies the magnetic field through its shearing action and converts poloidal field into toroidal field. Poleward meridional flow near the surface carries magnetic flux that reverses the magnetic poles at about the time of solar maximum. The deeper, equatorward meridional flow can carry magnetic flux back toward the lower latitudes where it erupts through the surface to form tilted active regions that convert toroidal fields into oppositely directed poloidal fields. These axisymmetric flows are themselves driven by large scale convective motions. The effects of the Sun's rotation on convection produce velocity correlations that can maintain both the differential rotation and the meridional circulation. These convective motions can also influence solar activity directly by shaping the magnetic field pattern. While considerable theoretical advances have been made toward understanding these large scale flows, outstanding problems in matching theory to observations still remain.

  4. New Large-scale Control Strategies for Turbulent Boundary Layers

    NASA Astrophysics Data System (ADS)

    Schoppa, Wade; Hussain, Fazle

    1997-11-01

    Using direct numerical simulations of turbulent channel flow, we present robust strategies for drag reduction by prevention of streamwise vortex formation near the wall. Instability of lifted, vortex-free low-speed streaks is shown to generate new streamwise vortices, which dominate near-wall turbulence phenomena. The newly-found instability mechanism initiates streak waviness in the (x,z) plane which leads to ωx sheets. Streak waviness induces positive ∂u/∂x (i.e., positive VISA), which causes these sheets to then collapse via stretching (rather than roll up) into streamwise vortices. Significantly, the 3D features of the (instantaneous) instability-generated vortices agree well with the coherent structures educed (i.e. ensemble-averaged) from fully turbulent flow, suggesting the prevalence of this instability mechanism. The new control via large-scale streak manipulation exploits this crucial role of streak instability in vortex generation. An x-independent forcing with a z wavelength of 4 streak spacings, with an amplitude of only 5% of the centerline velocity, produces a significant sustained drag reduction: 20% for imposed counterrotating large-scale swirls and 50% for colliding spanwise wall jet-like forcing. These results suggest promising drag reduction strategies, involving large-scale (hence more durable) actuation and requiring no wall sensors or feedback logic.

  5. Lateral stirring of large-scale tracer fields by altimetry

    NASA Astrophysics Data System (ADS)

    Dencausse, Guillaume; Morrow, Rosemary; Rogé, Marine; Fleury, Sara

    2014-01-01

    Ocean surface fronts and filaments have a strong impact on the global ocean circulation and biogeochemistry. Surface Lagrangian advection with time-evolving altimetric geostrophic velocities can be used to simulate the submesoscale front and filament structures in large-scale tracer fields. We study this technique in the Southern Ocean region south of Tasmania, a domain marked by strong meso- to submesoscale features such as the fronts of the Antarctic Circumpolar Current (ACC). Starting with large-scale surface tracer fields that we stir with altimetric velocities, we determine `advected' fields which compare well with high-resolution in situ or satellite tracer data. We find that fine scales are best represented in a statistical sense after an optimal advection time of ˜2 weeks, with enhanced signatures of the ACC fronts and better spectral energy. The technique works best in moderate to high EKE regions where lateral advection dominates. This technique may be used to infer the distribution of unresolved small scales in any physical or biogeochemical surface tracer that is dominated by lateral advection. Submesoscale dynamics also impact the subsurface of the ocean, and the Lagrangian advection at depth shows promising results. Finally, we show that climatological tracer fields computed from the advected large-scale fields display improved fine-scale mean features, such as the ACC fronts, which can be useful in the context of ocean modelling.
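
    The Lagrangian advection step can be sketched as integrating particle trajectories through a prescribed geostrophic velocity field. Here an idealized steady solid-body vortex stands in for gridded altimetric velocities, and the two-week integration mirrors the optimal advection time reported above; all numerical values are hypothetical.

```python
import math

def velocity(x, y):
    """Hypothetical steady geostrophic velocity: solid-body rotation
    (a stand-in for a time-evolving altimetric velocity field)."""
    omega = 1.0e-5  # rotation rate, s^-1
    return -omega * y, omega * x

def advect(x, y, dt, steps):
    """Integrate one particle trajectory with 4th-order Runge-Kutta."""
    for _ in range(steps):
        k1 = velocity(x, y)
        k2 = velocity(x + 0.5 * dt * k1[0], y + 0.5 * dt * k1[1])
        k3 = velocity(x + 0.5 * dt * k2[0], y + 0.5 * dt * k2[1])
        k4 = velocity(x + dt * k3[0], y + dt * k3[1])
        x += dt * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]) / 6.0
        y += dt * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]) / 6.0
    return x, y

# Advect a tracer particle for ~2 weeks at hourly steps.
x, y = advect(50e3, 0.0, 3600.0, 24 * 14)
```

    In the actual technique, each particle carries the large-scale tracer value from its starting position, so that the advected field accumulates the fronts and filaments the velocity field stirs up.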

  6. Large scale anisotropy of UHECRs for the Telescope Array

    SciTech Connect

    Kido, E.

    2011-09-22

    The origin of Ultra High Energy Cosmic Rays (UHECRs) is one of the most interesting questions in astroparticle physics. Despite the efforts of previous measurements, there is no consensus on either the origin or the mechanism of UHECR generation and propagation. In this context, the Telescope Array (TA) experiment is expected to play an important role as the largest detector in the northern hemisphere, consisting of an array of surface particle detectors (SDs), fluorescence detectors (FDs), and other important calibration devices. We searched for large-scale anisotropy using the SD data of TA. UHECRs are expected to be restricted to the GZK horizon when the composition of UHECRs is protons, so the observed arrival directions are expected to exhibit local large-scale anisotropy if UHECR sources are astrophysical objects. We used the SD data set from 11 May 2008 to 7 September 2010 to search for large-scale anisotropy. The discrimination power between LSS and isotropy is not yet sufficient, but the statistics of TA are expected to discriminate between the two at about the 95% confidence level on average in the near future.

  7. Large-scale functional connectivity networks in the rodent brain.

    PubMed

    Gozzi, Alessandro; Schwarz, Adam J

    2016-02-15

    Resting-state functional Magnetic Resonance Imaging (rsfMRI) of the human brain has revealed multiple large-scale neural networks within a hierarchical and complex structure of coordinated functional activity. These distributed neuroanatomical systems provide a sensitive window on brain function and its disruption in a variety of neuropathological conditions. The study of macroscale intrinsic connectivity networks in preclinical species, where genetic and environmental conditions can be controlled and manipulated with high specificity, offers the opportunity to elucidate the biological determinants of these alterations. While rsfMRI methods are now widely used in human connectivity research, these approaches have only relatively recently been back-translated into laboratory animals. Here we review recent progress in the study of functional connectivity in rodent species, emphasising the ability of this approach to resolve large-scale brain networks that recapitulate neuroanatomical features of known functional systems in the human brain. These include, but are not limited to, a distributed set of regions identified in rats and mice that may represent a putative evolutionary precursor of the human default mode network (DMN). The impact and control of potential experimental and methodological confounds are also critically discussed. Finally, we highlight the enormous potential and some initial application of connectivity mapping in transgenic models as a tool to investigate the neuropathological underpinnings of the large-scale connectional alterations associated with human neuropsychiatric and neurological conditions. We conclude by discussing the translational potential of these methods in basic and applied neuroscience. PMID:26706448

  8. Upscaling of elastic properties for large scale geomechanical simulations

    NASA Astrophysics Data System (ADS)

    Chalon, F.; Mainguy, M.; Longuemare, P.; Lemonnier, P.

    2004-09-01

    Large-scale geomechanical simulations are increasingly used to model the compaction of stress-dependent reservoirs, predict the long-term integrity of underground radioactive waste disposals, and analyse the viability of hot dry rock geothermal sites. These large-scale simulations require the definition of homogeneous mechanical properties for each geomechanical cell, whereas the rock properties are expected to vary at a smaller scale. This paper therefore proposes a new methodology that makes it possible to define the equivalent mechanical properties of the geomechanical cells using the fine-scale information in the geological model. The methodology is implemented on a synthetic reservoir case, and two upscaling procedures providing the effective elastic properties of Hooke's law are tested. The first upscaling procedure is an analytical method for a perfectly stratified rock mass, whereas the second computes lower and upper bounds on the equivalent properties with no assumption on the small-scale heterogeneity distribution. Both procedures are applied to one geomechanical cell extracted from the reservoir structure. The results show that the analytical and numerical upscaling procedures provide accurate estimates of the effective parameters. Furthermore, a large-scale simulation using the homogenized properties of each geomechanical cell calculated with the analytical method demonstrates that the overall behaviour of the reservoir structure is well reproduced for two different loading cases.
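The idea of bracketing an effective stiffness with no assumption on fine-scale geometry can be sketched with the classical Voigt and Reuss averages. This is a simplified scalar stand-in for the full Hooke's-law tensor bounds used in such studies, and the layer moduli below are invented:

```python
def voigt_reuss_bounds(moduli, fractions):
    """Voigt (arithmetic) and Reuss (harmonic) averages of elastic moduli.

    For constituents with volume fractions `fractions` and moduli
    `moduli`, the Voigt average is an upper bound on the effective
    modulus and the Reuss average a lower bound, using only the volume
    fractions and no information about the fine-scale geometry.
    """
    assert abs(sum(fractions) - 1.0) < 1e-12, "fractions must sum to 1"
    voigt = sum(f * m for f, m in zip(fractions, moduli))
    reuss = 1.0 / sum(f / m for f, m in zip(fractions, moduli))
    return voigt, reuss

# Two-layer cell: a stiff layer (30 GPa) and a soft layer (10 GPa),
# equal volume fractions.
upper, lower = voigt_reuss_bounds([30.0, 10.0], [0.5, 0.5])
print(upper, lower)   # 20.0 and 15.0 GPa
```

For a perfectly stratified medium the two averages correspond to loading parallel versus perpendicular to the layering, which is why an analytical method exists for that special case.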

  9. Health inequalities in Armenia - analysis of survey results

    PubMed Central

    2012-01-01

    Introduction Prevailing sociopolitical and economic obstacles have been implicated in the inadequate utilization and delivery of the Armenian health care system. Methods A random survey of 1,000 local residents, from all administrative regions of Armenia, concerning health care services cost and satisfaction was conducted. Participation in the survey was voluntary and the information was collected using anonymous telephone interviews. Results The utilization of health care services was low, particularly in rural areas. This under-utilization of services correlated with low income of the population surveyed. The state-funded health care services are inadequate to ensure the availability of free-of-charge services even to economically disadvantaged groups. Continued reliance on direct out-of-pocket and illicit payments for medical services is a serious issue that plagues the healthcare, pharmaceutical, and medical technology sectors of Armenia. Conclusions Restructuring of the health care system to implement a cost-effective approach to the prevention and treatment of diseases, especially those that disproportionately affect the poor, should be undertaken. Public payments and increased subsidies for poor and lower-income groups through a compulsory health insurance system should be evaluated and included as appropriate in this health system redesign. Current medical services reimbursement practices undermine the principle of equity in financing and access. Measures designed to improve healthcare access and affordability for poor and disadvantaged households should be enacted. PMID:22695079

  10. Geochemical surveys in the United States in relation to health.

    USGS Publications Warehouse

    Tourtelot, H.A.

    1979-01-01

    Geochemical surveys in relation to health may be classified as having one, two, or three dimensions. One-dimensional surveys examine relations between concentrations of elements such as Pb in soils and other media and burdens of the same elements in humans, at a given time. The spatial distributions of element concentrations are not investigated. The primary objective of two-dimensional surveys is to map the distributions of element concentrations, commonly according to stratified random sampling designs based on either conceptual landscape units or artificial sampling strata, but systematic sampling intervals have also been used. Political units have defined sample areas that coincide with the units used to accumulate epidemiological data. Element concentrations affected by point sources have also been mapped. Background values, the location of natural or technological anomalies, and the geographic scale of variation for several elements are often determined. Three-dimensional surveys result when two-dimensional surveys are repeated to detect environmental changes.

  11. Neglected Value of Small Population-based Surveys: A Comparison with Demographic and Health Survey Data

    PubMed Central

    Langston, Anne C.; Sarriot, Eric G.

    2015-01-01

    ABSTRACT We believe that global health practice and evaluation operate with misleading assumptions about lack of reliability of small population-based health surveys (district level and below), leading managers and decision-makers to under-use this valuable information and programmatic tool and to rely on health information from large national surveys when neither timing nor available data meet their needs. This paper uses a unique opportunity for comparison between a knowledge, practice, and coverage (KPC) household survey and Rwanda Demographic and Health Survey (RDHS) carried out in overlapping timeframes to disprove these enduring suspicions. Our analysis shows that the KPC provides coverage estimates consistent with the RDHS estimates for the same geographic areas. We discuss cases of divergence between estimates. Application of the Lives Saved Tool to the KPC results also yields child mortality estimates comparable with DHS-measured mortality. We draw three main lessons from the study and conclude with recommendations for challenging unfounded assumptions against the value of small household coverage surveys, which can be a key resource in the arsenal of local health programmers. PMID:25995729

  12. Neglected value of small population-based surveys: a comparison with demographic and health survey data.

    PubMed

    Langston, Anne C; Prosnitz, Debra M; Sarriot, Eric G

    2015-03-01

    We believe that global health practice and evaluation operate with misleading assumptions about lack of reliability of small population-based health surveys (district level and below), leading managers and decision-makers to under-use this valuable information and programmatic tool and to rely on health information from large national surveys when neither timing nor available data meet their needs. This paper uses a unique opportunity for comparison between a knowledge, practice, and coverage (KPC) household survey and Rwanda Demographic and Health Survey (RDHS) carried out in overlapping timeframes to disprove these enduring suspicions. Our analysis shows that the KPC provides coverage estimates consistent with the RDHS estimates for the same geographic areas. We discuss cases of divergence between estimates. Application of the Lives Saved Tool to the KPC results also yields child mortality estimates comparable with DHS-measured mortality. We draw three main lessons from the study and conclude with recommendations for challenging unfounded assumptions against the value of small household coverage surveys, which can be a key resource in the arsenal of local health programmers. PMID:25995729
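The consistency check described here, comparing a coverage estimate from a small survey with the corresponding DHS estimate, can be sketched with a simple two-proportion z test. The sample sizes and coverage values below are invented, and the sketch ignores the design effects of cluster sampling, which a real analysis of either survey would have to include:

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """z statistic for the difference between two sample proportions,
    e.g. a coverage indicator from a small KPC survey (p1, n1) versus
    the same indicator from a DHS stratum (p2, n2).  Uses the pooled
    standard error and assumes simple random sampling."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1.0 - pooled) * (1.0 / n1 + 1.0 / n2))
    return (p1 - p2) / se

# Hypothetical numbers: 62% coverage in a 300-household KPC survey vs.
# 65% in a 1,200-household DHS stratum.
z = two_proportion_z(0.62, 300, 0.65, 1200)
print(abs(z) < 1.96)   # True: the two estimates are consistent at the 5% level
```

When |z| stays below 1.96, the small survey's estimate is statistically indistinguishable from the DHS figure, which is the pattern the paper reports for most indicators.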

  13. Robust large-scale parallel nonlinear solvers for simulations.

    SciTech Connect

    Bader, Brett William; Pawlowski, Roger Patrick; Kolda, Tamara Gibson

    2005-11-01

    This report documents research to develop robust and efficient solution techniques for solving large-scale systems of nonlinear equations. The most widely used method for solving systems of nonlinear equations is Newton's method. While much research has been devoted to augmenting Newton-based solvers (usually with globalization techniques), little has been devoted to exploring the application of different models. Our research has been directed at evaluating techniques using different models than Newton's method: a lower order model, Broyden's method, and a higher order model, the tensor method. We have developed large-scale versions of each of these models and have demonstrated their use in important applications at Sandia. Broyden's method replaces the Jacobian with an approximation, allowing codes that cannot evaluate a Jacobian or have an inaccurate Jacobian to converge to a solution. Limited-memory methods, which have been successful in optimization, allow us to extend this approach to large-scale problems. We compare the robustness and efficiency of Newton's method, modified Newton's method, the Jacobian-free Newton-Krylov method, and our limited-memory Broyden method. Comparisons are carried out for large-scale applications of fluid flow simulations and electronic circuit simulations. Results show that, in cases where the Jacobian was inaccurate or could not be computed, Broyden's method converged in some cases where Newton's method failed to converge. We identify conditions where Broyden's method can be more efficient than Newton's method. We also present modifications to a large-scale tensor method, originally proposed by Bouaricha, for greater efficiency, better robustness, and wider applicability. Tensor methods are an alternative to Newton-based methods and are based on computing a step based on a local quadratic model rather than a linear model. The advantage of Bouaricha's method is that it can use any existing linear solver, which makes it simple to write
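The core of Broyden's method, replacing the Jacobian with an approximation that is cheap to update, can be illustrated with a minimal dense sketch. This is not Sandia's solver; the limited-memory variant discussed in the report stores the rank-one updates implicitly instead of forming the matrix, and the test system below is invented:

```python
import numpy as np

def broyden_solve(f, x0, tol=1e-10, max_iter=50):
    """Solve f(x) = 0 with Broyden's 'good' method.

    Keeps a dense approximation B of the Jacobian and applies a
    rank-one secant update each step, so the true Jacobian is never
    evaluated -- the property that lets codes without Jacobians converge.
    """
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    B = np.eye(len(x))                    # initial Jacobian guess
    for _ in range(max_iter):
        if np.linalg.norm(fx) < tol:
            break
        s = np.linalg.solve(B, -fx)       # quasi-Newton step
        x_new = x + s
        fx_new = f(x_new)
        y = fx_new - fx
        # Rank-one update enforcing the secant condition B s = y.
        B += np.outer(y - B @ s, s) / (s @ s)
        x, fx = x_new, fx_new
    return x

# Hypothetical test system: weakly coupled nonlinear equations.
def residual(x):
    return np.array([x[0] + 0.1 * x[1] ** 2 - 1.0,
                     x[1] + 0.1 * x[0] ** 2 - 2.0])

root = broyden_solve(residual, [0.0, 0.0])
print(root, residual(root))
```

Starting from the identity as the Jacobian guess works here because the system is nearly linear; harder problems need the globalization techniques the report mentions.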

  14. Foundational perspectives on causality in large-scale brain networks.

    PubMed

    Mannino, Michael; Bressler, Steven L

    2015-12-01

    A profusion of recent work in cognitive neuroscience has been concerned with the endeavor to uncover causal influences in large-scale brain networks. However, despite the fact that many papers give a nod to the important theoretical challenges posed by the concept of causality, this explosion of research has generally not been accompanied by a rigorous conceptual analysis of the nature of causality in the brain. This review provides both a descriptive and prescriptive account of the nature of causality as found within and between large-scale brain networks. In short, it seeks to clarify the concept of causality in large-scale brain networks both philosophically and scientifically. This is accomplished by briefly reviewing the rich philosophical history of work on causality, especially focusing on contributions by David Hume, Immanuel Kant, Bertrand Russell, and Christopher Hitchcock. We go on to discuss the impact that various interpretations of modern physics have had on our understanding of causality. Throughout all this, a central focus is the distinction between theories of deterministic causality (DC), whereby causes uniquely determine their effects, and probabilistic causality (PC), whereby causes change the probability of occurrence of their effects. We argue that, given the topological complexity of its large-scale connectivity, the brain should be considered as a complex system and its causal influences treated as probabilistic in nature. We conclude that PC is well suited for explaining causality in the brain for three reasons: (1) brain causality is often mutual; (2) connectional convergence dictates that only rarely is the activity of one neuronal population uniquely determined by another one; and (3) the causal influences exerted between neuronal populations may not have observable effects. A number of different techniques are currently available to characterize causal influence in the brain. Typically, these techniques quantify the statistical

  15. Foundational perspectives on causality in large-scale brain networks

    NASA Astrophysics Data System (ADS)

    Mannino, Michael; Bressler, Steven L.

    2015-12-01

    A profusion of recent work in cognitive neuroscience has been concerned with the endeavor to uncover causal influences in large-scale brain networks. However, despite the fact that many papers give a nod to the important theoretical challenges posed by the concept of causality, this explosion of research has generally not been accompanied by a rigorous conceptual analysis of the nature of causality in the brain. This review provides both a descriptive and prescriptive account of the nature of causality as found within and between large-scale brain networks. In short, it seeks to clarify the concept of causality in large-scale brain networks both philosophically and scientifically. This is accomplished by briefly reviewing the rich philosophical history of work on causality, especially focusing on contributions by David Hume, Immanuel Kant, Bertrand Russell, and Christopher Hitchcock. We go on to discuss the impact that various interpretations of modern physics have had on our understanding of causality. Throughout all this, a central focus is the distinction between theories of deterministic causality (DC), whereby causes uniquely determine their effects, and probabilistic causality (PC), whereby causes change the probability of occurrence of their effects. We argue that, given the topological complexity of its large-scale connectivity, the brain should be considered as a complex system and its causal influences treated as probabilistic in nature. We conclude that PC is well suited for explaining causality in the brain for three reasons: (1) brain causality is often mutual; (2) connectional convergence dictates that only rarely is the activity of one neuronal population uniquely determined by another one; and (3) the causal influences exerted between neuronal populations may not have observable effects. A number of different techniques are currently available to characterize causal influence in the brain. Typically, these techniques quantify the statistical

  16. Beliefs about breastfeeding: a statewide survey of health professionals.

    PubMed

    Barnett, E; Sienkiewicz, M; Roholt, S

    1995-03-01

    A statewide project was implemented in 1993 to increase breastfeeding among low-income women in North Carolina through improved institutional policies and practices and professional lactation-management skills. A survey designed to ascertain professional beliefs about breastfeeding was mailed to 31 hospitals and 25 public health agencies. A total of 2209 health professionals completed the survey and met the study selection criteria. Nutritionists and pediatricians were most likely to have positive beliefs about breastfeeding, whereas hospital nurses were most likely to have negative beliefs. Personal breastfeeding experience contributed to positive beliefs. Professionals were least convinced of the emotional benefits of breastfeeding. Those with negative beliefs were most likely to advocate complete infant weaning from the breast before nine months of age. Although most health professionals had positive beliefs about breastfeeding, differences by profession, work environment, and personal breastfeeding experience indicate the need for comprehensive training in lactation management, and improvements in hospital and public health clinic environments. PMID:7741946

  17. [The Project on a Large Scale in its first times: time for a recall].

    PubMed

    Bassinello, Greicelene Aparecida Hespanhol; Bagnato, Maria Helena Salgado

    2009-01-01

    In this work, we reconstruct the first attempts to create the Program of Formation on a Large Scale, which trained elementary- and high-school-educated personnel for basic health services. We examined the program from its beginnings, drawing on documentary sources such as an interview with Izabel dos Santos, which conveyed the full meaning of this experience. In our investigation, we traced the purpose and procedures of the proposal at the national scale. In our view, this experience gave qualification a broader meaning: work itself, as a condition of the workers' formation process, was constituted as the methodological-pedagogical basis of qualification in the work environment, aimed at forming a critical professional. PMID:19768343

  18. Ethical and practical challenges to studying patients who opt out of large-scale biorepository research

    PubMed Central

    Rosenbloom, S Trent; Madison, Jennifer L; Brothers, Kyle B; Bowton, Erica A; Clayton, Ellen Wright; Malin, Bradley A; Roden, Dan M; Pulley, Jill

    2013-01-01

    Large-scale biorepositories that couple biologic specimens with electronic health records containing documentation of phenotypic expression can accelerate scientific research and discovery. However, differences between those subjects who participate in biorepository-based research and the population from which they are drawn may influence research validity. While an opt-out approach to biorepository-based research enhances inclusiveness, empirical research evaluating voluntariness, risk, and the feasibility of an opt-out approach is sparse, and factors influencing patients’ decisions to opt out are understudied. Determining why patients choose to opt out may help to improve voluntariness, however there may be ethical and logistical challenges to studying those who opt out. In this perspective paper, the authors explore what is known about research based on the opt-out model, describe a large-scale biorepository that leverages the opt-out model, and review specific ethical and logistical challenges to bridging the research gaps that remain. PMID:23886923

  19. Canada's health promotion survey as a milestone in public health research.

    PubMed

    Rootman, Irving; Warren, Reg; Catlin, Gary

    2010-01-01

    This commentary describes the contribution of the 1985 Canadian National Health Promotion Survey to the development of public health research and policy-making in Canada and argues that on the basis of that contribution, it should be considered to be a public health research milestone. In terms of research, among its contributions which subsequently have been adopted in other survey studies were: going beyond risk factors to operationalize concepts implicit in the Ottawa Charter for Health Promotion; empowering users to participate in knowledge translation, sharing and transfer; ensuring sufficient sample sizes for each jurisdiction to be able to confidently generalize to its population; establishing a model as well as questions for subsequent health surveys; encouraging widespread use of data through making them available early; and developing and using an explicit social marketing strategy to reach target audiences, including the general public. With regard to policy-making, among its contributions which have been adopted were: using survey data to develop and enhance healthy public policy initiatives; encouraging researchers to work with policy-makers in developing policies; using survey data to contribute to the evaluation of public health initiatives; engaging policy-makers in the development of surveys; and encouraging the use of survey data for advocacy. PMID:21370775

  20. 75 FR 62636 - Proposed Information Collection (Veterans Health Benefits Handbook Satisfaction Survey) Activity...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-12

    ... AFFAIRS Proposed Information Collection (Veterans Health Benefits Handbook Satisfaction Survey) Activity... forms of information technology. Title: Veterans Health Benefits Handbook Satisfaction Survey, VA Form... benefits information contained in Veterans Health Benefits handbook. DATES: Written comments...