Science.gov

Sample records for large-scale health survey

  1. Large Scale Surveys: Interagency Collaboration.

    ERIC Educational Resources Information Center

    Patrick, Edward; And Others

    Outlined and evaluated in this paper is the collaborative process used by Project 81 state educational agency (SEA), local educational agency (LEA) and support staff in planning and managing a survey to validate life-role competency statements at twelve pilot sites in Pennsylvania. The project is a Pennsylvania Department of Education…

  2. Survey on large scale system control methods

    NASA Technical Reports Server (NTRS)

    Mercadal, Mathieu

    1987-01-01

    The problems inherent in large-scale systems such as power networks, communication networks, and economic or ecological systems were studied. The increase in size and flexibility of future spacecraft has put those dynamical systems into the category of large scale systems, and tools specific to this class are being sought to design control systems that can guarantee more stability and better performance. Among several survey papers, reference was found to a thorough investigation of decentralized control methods. Especially helpful was the classification made of the different existing approaches for dealing with large scale systems. A very similar classification is used here, even though the papers surveyed differ somewhat from the ones reviewed in other papers. Special attention is given to the applicability of the existing methods to controlling large mechanical systems such as large space structures. Some recent developments are added to this survey.

  3. The XMM Large Scale Structure Survey

    NASA Astrophysics Data System (ADS)

    Pierre, Marguerite

    2005-10-01

    We propose to complete, by an additional 5 deg², the XMM-LSS Survey region overlying the Spitzer/SWIRE field. This field already has CFHTLS and Integral coverage, and will encompass about 10 deg². The resulting multi-wavelength medium-depth survey, which complements XMM and Chandra deep surveys, will provide a unique view of large-scale structure over a wide range of redshift, and will show active galaxies in the full range of environments. The complete coverage by optical and IR surveys provides high-quality photometric redshifts, so that cosmological results can quickly be extracted. In the spirit of a Legacy survey, we will make the raw X-ray data immediately public. Multi-band catalogues and images will also be made available on short time scales.

  4. Opportunities and challenges for the use of large-scale surveys in public health research: A comparison of the assessment of cancer screening behaviors

    PubMed Central

    Hamilton, Jada G.; Breen, Nancy; Klabunde, Carrie N.; Moser, Richard P.; Leyva, Bryan; Breslau, Erica S.; Kobrin, Sarah C.

    2014-01-01

    Large-scale surveys that assess cancer prevention and control behaviors are a readily-available, rich resource for public health researchers. Although these data are used by a subset of researchers who are familiar with them, their potential is not fully realized by the research community for reasons including lack of awareness of the data, and limited understanding of their content, methodology, and utility. Until now, no comprehensive resource existed to describe and facilitate use of these data. To address this gap and maximize use of these data, we catalogued the characteristics and content of four surveys that assessed cancer screening behaviors in 2005, the most recent year with concurrent periods of data collection: the National Health Interview Survey, Health Information National Trends Survey, Behavioral Risk Factor Surveillance System, and California Health Interview Survey. We documented each survey's characteristics, measures of cancer screening, and relevant correlates; examined how published studies (n=78) have used the surveys’ cancer screening data; and reviewed new cancer screening constructs measured in recent years. This information can guide researchers in deciding how to capitalize on the opportunities presented by these data resources. PMID:25300474

  5. Large scale survey of enteric viruses in river and waste water underlines the health status of the local population.

    PubMed

    Prevost, B; Lucas, F S; Goncalves, A; Richard, F; Moulin, L; Wurtzer, S

    2015-06-01

    Although enteric viruses constitute a major cause of acute waterborne diseases worldwide, environmental data about the occurrence and viral load of enteric viruses in water are not often available. In this study, enteric viruses (i.e., adenovirus, aichivirus, astrovirus, cosavirus, enterovirus, hepatitis A and E viruses, norovirus of genogroups I and II, rotavirus A and salivirus) were monitored in the Seine River and the origin of contamination was untangled. A total of 275 water samples were collected, twice a month for one year, from the river Seine, its tributaries and the major wastewater treatment plant (WWTP) effluents in the Paris agglomeration. All water samples were negative for hepatitis A and E viruses. Adenovirus, norovirus GI, norovirus GII and rotavirus A were the most prevalent and abundant populations in all water samples. The viral load and the detection frequency increased significantly between the samples collected the most upstream and the most downstream of the Paris urban area. The calculated viral fluxes clearly demonstrated the measurable impact of WWTP effluents on the viral contamination of the Seine River. The viral load was seasonal for almost all enteric viruses, in accordance with the gastroenteritis recordings provided by the French medical authorities. These results implied the existence of a close relationship between the health status of inhabitants and the viral contamination of WWTP effluents, and consequently of surface water. Subsequently, the regular analysis of wastewater could serve as a proxy for monitoring the human viruses circulating both in a population and in surface water.

  6. Survey Design for Large-Scale, Unstructured Resistivity Surveys

    NASA Astrophysics Data System (ADS)

    Labrecque, D. J.; Casale, D.

    2009-12-01

    In this paper, we discuss the issues in designing data collection strategies for large-scale, poorly structured resistivity surveys. Existing or proposed applications for these types of surveys include carbon sequestration, enhanced oil recovery monitoring, monitoring of leachate from working or abandoned mines, and mineral surveys. Electrode locations are generally dictated by land access, utilities, roads, existing wells, etc. Classical arrays such as the Wenner array or dipole-dipole arrays are not applicable if the electrodes cannot be placed in quasi-regular lines or grids. A new, far more generalized strategy is needed for building data collection schemes. Following the approach of earlier two-dimensional (2-D) survey designs, the proposed method begins by defining a base array. In 2-D design, this base array is often a standard dipole-dipole array. For unstructured three-dimensional (3-D) design, determining this base array is a multi-step process. The first step is to determine a set of base dipoles with similar characteristics. For example, the base dipoles may consist of electrode pairs trending within 30 degrees of north and with a length between 100 and 250 m. These dipoles are then combined into a trial set of arrays. This trial set of arrays is reduced by applying a series of filters based on criteria such as separation between the dipoles. Using the base array set, additional arrays are added and tested to determine the overall improvement in resolution and to determine an optimal set of arrays. Examples of the design process are shown for a proposed carbon sequestration monitoring system.
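
    The base-dipole filtering step described above is easy to illustrate. The sketch below is not from the paper: the electrode coordinates and thresholds are hypothetical, and it implements only the first step quoted in the abstract, keeping electrode pairs trending within 30 degrees of north and between 100 and 250 m long.

```python
import itertools
import math

def dipole_properties(p, q):
    """Return (length, azimuth from north in degrees, folded to 0-180) of dipole p-q."""
    dx, dy = q[0] - p[0], q[1] - p[1]
    length = math.hypot(dx, dy)
    azimuth = math.degrees(math.atan2(dx, dy)) % 180.0
    return length, azimuth

def base_dipoles(electrodes, min_len=100.0, max_len=250.0, max_dev=30.0):
    """Filter all electrode pairs down to a base-dipole set (hypothetical criteria)."""
    keep = []
    for p, q in itertools.combinations(electrodes, 2):
        length, az = dipole_properties(p, q)
        dev = min(az, 180.0 - az)  # deviation from the north-south line
        if min_len <= length <= max_len and dev <= max_dev:
            keep.append((p, q))
    return keep

# Example: an irregular electrode layout dictated by land access
electrodes = [(0, 0), (30, 140), (60, 290), (220, 40), (250, 200)]
print(base_dipoles(electrodes))
```

    A real design tool would then combine the surviving dipoles into trial arrays and apply the further separation and resolution filters the abstract describes.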

  7. Large Scale Survey Data in Career Development Research

    ERIC Educational Resources Information Center

    Diemer, Matthew A.

    2008-01-01

    Large scale survey datasets have been underutilized but offer numerous advantages for career development scholars, as they contain a wide range of career development constructs measured in large and diverse samples that are followed longitudinally. Constructs such as work salience, vocational expectations, educational expectations, work satisfaction, and…

  8. Interloper bias in future large-scale structure surveys

    NASA Astrophysics Data System (ADS)

    Pullen, Anthony R.; Hirata, Christopher M.; Doré, Olivier; Raccanelli, Alvise

    2016-02-01

    Next-generation spectroscopic surveys will map the large-scale structure of the observable universe, using emission line galaxies as tracers. While each survey will map the sky with a specific emission line, interloping emission lines can masquerade as the survey's intended emission line at different redshifts. Interloping lines from galaxies that are not removed can contaminate the power spectrum measurement, mixing correlations from various redshifts and diluting the true signal. We assess the potential for power spectrum contamination, finding that an interloper fraction worse than 0.2% could bias power spectrum measurements for future surveys by more than 10% of statistical errors, while also biasing power spectrum inferences. We also construct a formalism for predicting cosmological parameter measurement bias, demonstrating that a 0.15%-0.3% interloper fraction could bias the growth rate by more than 10% of the error, which can affect constraints on gravity from upcoming surveys. We use the COSMOS Mock Catalog (CMC), with the emission lines rescaled to better reproduce recent data, to predict potential interloper fractions for the Prime Focus Spectrograph (PFS) and the Wide-Field InfraRed Survey Telescope (WFIRST). We find that secondary line identification, or confirming galaxy redshifts by finding correlated emission lines, can remove interlopers for PFS. For WFIRST, we use the CMC to predict that the 0.2% target can be reached for the WFIRST Hα survey, but sensitive optical and near-infrared photometry will be required. For the WFIRST [O III] survey, the predicted interloper fractions reach several percent and their effects will have to be estimated and removed statistically (e.g., with deep training samples). These results are optimistic as the CMC does not capture the full set of correlations of galaxy properties in the real Universe, and they do not include blending effects. Mitigating interloper contamination will be crucial to the next generation of large-scale structure surveys.
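
    As a schematic of why even a small interloper fraction matters (our notation, not the paper's exact formalism): if a fraction f of targets are interlopers at the wrong redshift, the signal-interloper cross term vanishes because the two populations are spatially uncorrelated, leaving

```latex
% Schematic leading-order contamination model, assuming a small
% interloper fraction f and uncorrelated populations:
P_{\mathrm{obs}}(k) \;\simeq\; (1-f)^2\, P_{\mathrm{gal}}(k)
                     \;+\; f^2\, P_{\mathrm{int}}(\tilde{k})
```

    where the interloper power is evaluated at wavenumbers k̃ remapped by the ratio of the true to the assumed comoving distances. Even at f ~ 0.2%, the additive f² term and the (1-f)² suppression can shift the power spectrum by a non-negligible fraction of a next-generation survey's statistical error budget.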

  9. A Large Scale Survey of Molecular Clouds at Nagoya University

    NASA Astrophysics Data System (ADS)

    Mizuno, A.; Onishi, T.; Yamaguchi, N.; Hara, A.; Hayakawa, T.; Kato, S.; Mizuno, N.; Abe, R.; Saito, H.; Yamaguchi, R.; Mine, Y.; Moriguchi, Y.; Mano, S.; Matsunaga, K.; Tachihara, K.; Kawamura, A.; Yonekura, Y.; Ogawa, H.; Fukui, Y.

    1999-10-01

    Large-scale ¹²CO and ¹³CO (J=1-0) surveys have been carried out since 1990 using two 4-m radio telescopes at Nagoya University, in order to obtain a complete sample of the Galactic molecular clouds. The southern survey started in 1996 with one of the telescopes, named "NANTEN", installed at the Las Campanas Observatory in Chile. The observations, made at a grid spacing of 2′-8′ with a 2′.7 beam, allow us to identify and resolve the individual star-forming dense cores within 1-2 kpc of the Sun. The present sky coverage in ¹²CO and ¹³CO is ~ 7% and ~ 21%, respectively. The data are used to derive physical parameters of dense cores and to study the mass spectrum, morphology, and conditions for star formation. For example, the survey revealed that the cloud mass function is fairly universal across various regions (e.g., Yonekura et al. 1998, ApJS, 110, 21), and that star-forming clouds tend to be characterized by low M_vir/M_LTE (e.g., Kawamura et al. 1998, ApJS, 117, 387; Mizuno et al. 1999, PASJ, in press). The survey will provide an invaluable database of southern star- and planet-forming regions, one of the important scientific targets of ALMA.

  10. Characterizing unknown systematics in large scale structure surveys

    SciTech Connect

    Agarwal, Nishant; Ho, Shirley; Myers, Adam D.; Seo, Hee-Jong; Ross, Ashley J.; Bahcall, Neta; Brinkmann, Jonathan; Eisenstein, Daniel J.; Muna, Demitri; Palanque-Delabrouille, Nathalie; Yèche, Christophe; Petitjean, Patrick; Schneider, Donald P.; Streblyanska, Alina; Weaver, Benjamin A.

    2014-04-01

    Photometric large scale structure (LSS) surveys probe the largest volumes in the Universe, but are inevitably limited by systematic uncertainties. Imperfect photometric calibration leads to biases in our measurements of the density fields of LSS tracers such as galaxies and quasars, and, as a result, in cosmological parameter estimation. Earlier studies have proposed using cross-correlations between different redshift slices or cross-correlations between different surveys to reduce the effects of such systematics. In this paper we develop a method to characterize unknown systematics. We demonstrate that while we do not have sufficient information to correct for unknown systematics in the data, we can obtain an estimate of their magnitude. We define a parameter to estimate contamination from unknown systematics using cross-correlations between different redshift slices and propose discarding bins in the angular power spectrum that lie outside a certain contamination tolerance level. We show that this method improves estimates of the bias using simulated data and further apply it to photometric luminous red galaxies in the Sloan Digital Sky Survey as a case study.
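
    To make the idea concrete, in our own schematic notation rather than the paper's exact formalism: if the observed overdensity in redshift slice i is the true signal plus a coupling ε_i to a common unknown systematic template s, then for widely separated slices the cosmological cross-power is negligible and

```latex
% Schematic: observed slice overdensity with an unknown systematic s
\hat{\delta}_i = \delta_i + \epsilon_i s
\quad\Longrightarrow\quad
C_\ell^{ij,\,\mathrm{obs}} \;\approx\; \epsilon_i\,\epsilon_j\, C_\ell^{ss}
\qquad (i \neq j,\ \text{widely separated slices})
```

    so the measured cross-power between disjoint slices serves as an estimate of the contamination level, and angular power spectrum bins where it exceeds the chosen tolerance can be discarded.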

  11. Consent and widespread access to personal health information for the delivery of care: a large scale telephone survey of consumers' attitudes using vignettes in New Zealand

    PubMed Central

    Whiddett, Dick; Hunter, Inga; McDonald, Barry; Norris, Tony; Waldon, John

    2016-01-01

    Objectives In light of recent health policy, to examine factors which influence the public's willingness to consent to share their health information in a national electronic health record (EHR). Design Data were collected in a national telephone survey in 2008. Respondents were presented with vignettes that described situations in which their health information was shared, and were asked if they would consent to such sharing. The subset consisting of the 18 vignettes that covered providing care was reanalysed in depth using new statistical methods in 2016. Setting Adult population of New Zealand accessible by telephone landline. Participants 4209 adults aged 18+ years in the full data set, 2438 of whom are included in the selected subset. Main outcome measures For each of the 18 vignettes, we measured the percentage of respondents who would consent for their information to be shared, for two groups: those who did not consider that their records contained sensitive information, and those who did or refused to say. Results Rates of consent ranged from 89% (95% CI 87% to 92%) for sharing of information with hospital doctors and nurses to 51% (47% to 55%) for government agencies. Mixed-effects logistic regression was used to identify factors which had significant impact on consent. The role of the recipient and the level of detail influenced respondents' willingness to consent (p<0.0001 for both factors). Of the individual characteristics, the biggest impact was that respondents whose records contain sensitive information (or who refused to answer) were less willing to consent (p<0.0001). Conclusions A proportion of the population are reluctant to share their health information beyond doctors, nurses and paramedics, particularly when records contain sensitive information. These findings may have adverse implications for healthcare strategies based on widespread sharing of information. Further research is needed to understand and overcome people's ambivalence towards sharing their health information.

  12. Survey of decentralized control methods. [for large scale dynamic systems

    NASA Technical Reports Server (NTRS)

    Athans, M.

    1975-01-01

    An overview is presented of the types of problems that are being considered by control theorists in the area of dynamic large scale systems with emphasis on decentralized control strategies. Approaches that deal directly with decentralized decision making for large scale systems are discussed. It is shown that future advances in decentralized system theory are intimately connected with advances in the stochastic control problem with nonclassical information pattern. The basic assumptions and mathematical tools associated with the latter are summarized, and recommendations concerning future research are presented.

  13. A bibliographical survey of large-scale systems

    NASA Technical Reports Server (NTRS)

    Corliss, W. R.

    1970-01-01

    A limited, partly annotated bibliography was prepared on the subject of large-scale system control. Approximately 400 references are divided into thirteen application areas, such as large societal systems and large communication systems. A first-author index is provided.

  14. The Role of Plausible Values in Large-Scale Surveys

    ERIC Educational Resources Information Center

    Wu, Margaret

    2005-01-01

    In large-scale assessment programs such as NAEP, TIMSS and PISA, students' achievement data sets provided for secondary analysts contain so-called "plausible values." Plausible values are multiple imputations of the unobservable latent achievement for each student. In this article it has been shown how plausible values are used to: (1) address…
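
    For readers unfamiliar with the mechanics, the standard way a secondary analyst uses plausible values is to repeat the analysis once per plausible value and pool the results with Rubin's multiple-imputation combining rules. A minimal sketch, with hypothetical numbers not taken from the article:

```python
import numpy as np

def combine_plausible_values(estimates, variances):
    """Rubin's rules: pool one statistic computed once per plausible value.

    `estimates` holds the statistic of interest (e.g., a subgroup mean) from
    each plausible-value analysis; `variances` holds its sampling variance.
    """
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    m = len(estimates)
    point = estimates.mean()                    # combined point estimate
    within = variances.mean()                   # average sampling variance
    between = estimates.var(ddof=1)             # variance across imputations
    total = within + (1.0 + 1.0 / m) * between  # total imputation variance
    return point, np.sqrt(total)

# Five plausible-value analyses of the same mean score (hypothetical numbers)
est, se = combine_plausible_values(
    estimates=[502.1, 499.8, 503.4, 500.9, 501.7],
    variances=[4.0, 4.2, 3.9, 4.1, 4.0],
)
print(f"mean = {est:.1f} +/- {se:.1f}")
```

    Analyzing only one plausible value, or averaging them into a single score, understates the uncertainty that the between-imputation term captures.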

  15. [Privacy and public benefit in using large scale health databases].

    PubMed

    Yamamoto, Ryuichi

    2014-01-01

    In Japan, large-scale health databases have been constructed within a few years, such as the national health insurance claims and health checkup database (NDB) and the Japanese Sentinel project. But there are legal issues in striking an adequate balance between privacy and the public benefit of using such databases. The NDB operates under the act on health care for elderly persons, but this act says nothing about using the database for the general public benefit. Researchers who use it are therefore forced to devote considerable attention to anonymization and information security, which may hamper the research itself. The Japanese Sentinel project is a national project for detecting adverse drug reactions using large-scale distributed clinical databases of large hospitals. Although patients give broad advance consent to such uses for the public good, the use of insufficiently anonymized data is still under discussion. Generally speaking, researchers conducting studies for the public benefit will not infringe patients' privacy, but vague and complex legislative requirements for personal data protection may obstruct such research. Medical science does not progress without the use of clinical information, so adequate legislation that is simple and clear for both researchers and patients is strongly needed. In Japan, a specific act for balancing privacy and public benefit is now under discussion. The author recommends that researchers, including those in the field of pharmacology, pay attention to, participate in the discussion of, and make suggestions for such acts and regulations.

  16. Performance Health Monitoring of Large-Scale Systems

    SciTech Connect

    Rajamony, Ram

    2014-11-20

    This report details the progress made on the ASCR-funded project Performance Health Monitoring for Large Scale Systems. A large-scale application may not achieve its full performance potential due to degraded performance of even a single subsystem. Detecting performance faults, isolating them, and taking remedial action is critical for the scale of systems on the horizon. PHM aims to develop techniques and tools that can be used to identify and mitigate such performance problems. We accomplish this through two main aspects. The PHM framework encompasses diagnostics, system monitoring, fault isolation, and performance evaluation capabilities that indicate when a performance fault has been detected, whether due to an anomaly in the system itself or due to contention for shared resources between concurrently executing jobs. Software components called the PHM Control system then build upon the capabilities provided by the PHM framework to mitigate degradation caused by performance problems.

  17. A large-scale integrated aerogeophysical survey of Afghanistan

    NASA Astrophysics Data System (ADS)

    Brozena, J. M.; Childers, V. A.; Gardner, J. M.; Liang, R. T.; Bowles, J. H.; Abraham, J. D.

    2007-12-01

    A multi-sensor, multidisciplinary aerogeophysical survey of a major portion of Afghanistan was recently conducted by investigators from the Naval Research Laboratory and the U.S. Geological Survey. More than 110,000 line km of data tracks were flown aboard an NP-3D Orion aircraft. Sensor systems installed on the P-3 included dual gravimeters, scalar and vector magnetometers, a digital photogrammetric camera, a hyperspectral imager, and an L-band polarimetric synthetic aperture radar (SAR). Data from all sources were precisely co-registered to the ground by a combination of interferometric-mode Global Positioning System (GPS) and inertial measurements. The data from this integrated mapping mission support numerous basic and applied science efforts in Afghanistan, including resource assessment and exploration for oil, gas, and minerals; development of techniques for sensor fusion and automated analysis; and topics in crustal geophysics and geodesy. The data will also support civil infrastructure needs such as cadastral surveying; urban planning and development; pipeline, powerline, and road routing and construction; agriculture and hydrologic resource management; earthquake hazard analysis; and base maps for humanitarian relief missions.

  18. Large Scale Structure at 24 Microns in the SWIRE Survey

    NASA Astrophysics Data System (ADS)

    Masci, F. J.; SWIRE Team

    2006-12-01

    We present initial results of galaxy clustering at 24 μm by analyzing statistics of the projected galaxy distribution from counts-in-cells. This study focuses on the ELAIS-North1 SWIRE field. The sample covers ≃5.9 deg² and contains 24,715 sources detected at 24 μm to a 5.6σ limit of 250 μJy (in the lowest coverage regions). We have explored clustering as a function of 3.6-24 μm color and 24 μm flux density using angular-averaged two-point correlation functions derived from the variance of counts-in-cells on scales 0°.05-0°.7. Using a power-law parameterization, w₂(θ) = A(θ/deg)^(1-γ), we find [A, γ] = [(5.43±0.20)×10⁻⁴, 2.01±0.02] for the full sample (1σ errors throughout). We have inverted Limber's equation and estimated a spatial correlation length of r₀ = 3.32±0.19 h⁻¹ Mpc for the full sample, assuming stable clustering and a redshift model consistent with observed 24 μm counts. We also find that blue [fν(24)/fν(3.6) ≤ 5.5] and red [fν(24)/fν(3.6) ≥ 6.5] galaxies have the lowest and highest r₀ values respectively, implying that redder galaxies are more clustered (by a factor of ≈3 on scales ⪆0°.2). Overall, the clustering estimates are smaller than those derived from optical surveys, but in agreement with results from IRAS and ISO in the mid-infrared. This extends to higher redshifts the notion that infrared-selected surveys show weaker clustering than optical surveys.
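
    As an illustration of the power-law parameterization quoted above, the sketch below fits w₂(θ) = A(θ/deg)^(1-γ) to a set of made-up correlation estimates; only the functional form comes from the abstract, and the data points, errors, and starting values are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

def w2(theta_deg, A, gamma):
    """Power-law angular correlation function, theta in degrees."""
    return A * theta_deg ** (1.0 - gamma)

theta = np.array([0.05, 0.1, 0.2, 0.4, 0.7])          # degrees
w_est = np.array([11.0, 5.3, 2.6, 1.3, 0.75]) * 1e-3  # hypothetical w2 values
w_err = 0.1 * w_est                                   # hypothetical 1-sigma errors

popt, pcov = curve_fit(w2, theta, w_est, sigma=w_err,
                       absolute_sigma=True, p0=[5e-4, 2.0])
A, gamma = popt
print(f"A = {A:.2e}, gamma = {gamma:.2f}")
```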

  19. Large Scale CO Survey of the Northern Galaxy

    NASA Astrophysics Data System (ADS)

    Jiang, Zhibo; Li, Junyu

    2013-07-01

    We present the ¹²CO, ¹³CO and C¹⁸O line survey towards a region of L=(13.2, 16.3) and B=(-1.2, 0.2), as a part of the ambitious project, the Milky Way Image Scroll Painting. The CO line emissions show a number of velocity components, each showing different structure and morphology from the others. A common feature that shows up in every component is the filamentary structures. Meanwhile, a number of bubbles are found in some components, which are especially clear at 40 km s⁻¹ and 60 km s⁻¹. The velocity gradient of the 20 km s⁻¹ component is surprisingly large, and can be interpreted neither by simply assuming the global rotation of the spiral arms nor by assuming the local motion along the spiral arm.

  20. A Novel Electronic Data Collection System for Large-Scale Surveys of Neglected Tropical Diseases

    PubMed Central

    King, Jonathan D.; Buolamwini, Joy; Cromwell, Elizabeth A.; Panfel, Andrew; Teferi, Tesfaye; Zerihun, Mulat; Melak, Berhanu; Watson, Jessica; Tadesse, Zerihun; Vienneau, Danielle; Ngondi, Jeremiah; Utzinger, Jürg; Odermatt, Peter; Emerson, Paul M.

    2013-01-01

    -based technology was suitable for a large-scale health survey, saved time, provided more accurate geo-coordinates, and was preferred by recorders over standard paper-based questionnaires. PMID:24066147

  1. Large-Scale Environmental Influences on Aquatic Animal Health

    EPA Science Inventory

    In the latter portion of the 20th century, North America experienced numerous large-scale mortality events affecting a broad diversity of aquatic animals. Short-term forensic investigations of these events have sometimes characterized a causative agent or condition, but have rare...

  2. An Open-Source Galaxy Redshift Survey Simulator for next-generation Large Scale Structure Surveys

    NASA Astrophysics Data System (ADS)

    Seljak, Uros

    Galaxy redshift surveys produce three-dimensional maps of the galaxy distribution. On large scales these maps trace the underlying matter fluctuations in a relatively simple manner, so that the properties of the primordial fluctuations along with the overall expansion history and growth of perturbations can be extracted. The BAO standard ruler method to measure the expansion history of the universe using galaxy redshift surveys is thought to be robust to observational artifacts and understood theoretically with high precision. These same surveys can offer a host of additional information, including a measurement of the growth rate of large scale structure through redshift space distortions, the possibility of measuring the sum of neutrino masses, tighter constraints on the expansion history through the Alcock-Paczynski effect, and constraints on the scale-dependence and non-Gaussianity of the primordial fluctuations. Extracting this broadband clustering information hinges on both our ability to minimize and subtract observational systematics to the observed galaxy power spectrum, and our ability to model the broadband behavior of the observed galaxy power spectrum with exquisite precision. Rapid development on both fronts is required to capitalize on WFIRST's data set. We propose to develop an open-source computational toolbox that will propel development in both areas by connecting large scale structure modeling and instrument and survey modeling with the statistical inference process. We will use the proposed simulator to both tailor perturbation theory and fully non-linear models of the broadband clustering of WFIRST galaxies and discover novel observables in the non-linear regime that are robust to observational systematics and able to distinguish between a wide range of spatial and dynamic biasing models for the WFIRST galaxy redshift survey sources. We have demonstrated the utility of this approach in a pilot study of the SDSS-III BOSS galaxies, in which we

  3. Language Learning Motivation in China: Results of a Large-Scale Stratified Survey

    ERIC Educational Resources Information Center

    You, Chenjing; Dörnyei, Zoltán

    2016-01-01

    This article reports on the findings of a large-scale cross-sectional survey of the motivational disposition of English language learners in secondary schools and universities in China. The total sample involved over 10,000 students and was stratified according to geographical region and teaching contexts, selecting participants both from urban…

  4. The Use of Online Social Networks by Polish Former Erasmus Students: A Large-Scale Survey

    ERIC Educational Resources Information Center

    Bryla, Pawel

    2014-01-01

    There is an increasing role of online social networks in the life of young Poles. We conducted a large-scale survey among Polish former Erasmus students. We have received 2450 completed questionnaires from alumni of 115 higher education institutions all over Poland. 85.4% of our respondents reported they kept in touch with their former Erasmus…

  5. PERSPECTIVES ON LARGE-SCALE NATURAL RESOURCES SURVEYS WHEN CAUSE-EFFECT IS A POTENTIAL ISSUE

    EPA Science Inventory

    Our objective is to present a perspective on large-scale natural resource monitoring when cause-effect is a potential issue. We believe that the approach of designing a survey to meet traditional commodity production and resource state descriptive objectives is too restrictive an...

  6. An Alternative Way to Model Population Ability Distributions in Large-Scale Educational Surveys

    ERIC Educational Resources Information Center

    Wetzel, Eunike; Xu, Xueli; von Davier, Matthias

    2015-01-01

    In large-scale educational surveys, a latent regression model is used to compensate for the shortage of cognitive information. Conventionally, the covariates in the latent regression model are principal components extracted from background data. This operational method has several important disadvantages, such as the handling of missing data and…

  7. Can community hospitals survive without large scale health reform?

    PubMed

    Unland, James J

    2004-01-01

    This nation's not-for-profit community hospitals, numbering over 4000 and providing the largest percentage of all hospital services to the US population, are threatened as never before by erratic reimbursement, reduced capital access and, more recently, by physicians who now compete both by virtue of outpatient/ambulatory services and by starting "specialty hospitals." This article examines some of these trends and their implications, raising the issue of whether it is time for major restructuring of our reimbursement systems and other significant health reforms. PMID:15151196

  8. Horvitz-Thompson survey sample methods for estimating large-scale animal abundance

    USGS Publications Warehouse

    Samuel, M.D.; Garton, E.O.

    1994-01-01

    Large-scale surveys to estimate animal abundance can be useful for monitoring population status and trends, for measuring responses to management or environmental alterations, and for testing ecological hypotheses about abundance. However, large-scale surveys may be expensive and logistically complex. To ensure resources are not wasted on unattainable targets, the goals and uses of each survey should be specified carefully and alternative methods for addressing these objectives always should be considered. During survey design, the importance of each survey error component (spatial design, proportion of detected animals, precision in detection) should be considered carefully to produce a complete statistically based survey. Failure to address these three survey components may produce population estimates that are inaccurate (biased low), have unrealistic precision (too precise) and do not satisfactorily meet the survey objectives. Optimum survey design requires trade-offs in these sources of error relative to the costs of sampling plots and detecting animals on plots, considerations that are specific to the spatial logistics and survey methods. The Horvitz-Thompson estimators provide a comprehensive framework for considering all three survey components during the design and analysis of large-scale wildlife surveys. Problems of spatial and temporal (especially survey to survey) heterogeneity in detection probabilities have received little consideration, but failure to account for heterogeneity produces biased population estimates. The goal of producing unbiased population estimates is in conflict with the increased variation from heterogeneous detection in the population estimate. One solution to this conflict is to use an MSE-based approach to achieve a balance between bias reduction and increased variation. Further research is needed to develop methods that address spatial heterogeneity in detection, evaluate the effects of temporal heterogeneity on survey
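
    For concreteness, here is a minimal sketch of a Horvitz-Thompson abundance estimate of the kind the abstract discusses, with each plot count expanded by its inclusion probability and by an estimated detection probability; all numbers are hypothetical.

```python
import numpy as np

counts = np.array([12, 5, 0, 8, 3])         # animals detected on sampled plots
pi = np.array([0.1, 0.1, 0.05, 0.2, 0.1])   # plot inclusion probabilities
p_detect = 0.8                              # estimated detection probability

# Horvitz-Thompson estimator: expand each observed count by the inverse
# of its (inclusion probability x detection probability)
N_hat = np.sum(counts / (pi * p_detect))
print(f"Estimated abundance: {N_hat:.0f}")
```

    Heterogeneity of the kind the abstract warns about would enter here as plot- or survey-specific detection probabilities; treating them as a single constant when they vary is what biases the estimate.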

  9. Use of large-scale, multi-species surveys to monitor gyrfalcon and ptarmigan populations

    USGS Publications Warehouse

    Bart, Jonathan; Fuller, Mark; Smith, Paul; Dunn, Leah; Watson, Richard T.; Cade, Tom J.; Fuller, Mark; Hunt, Grainger; Potapov, Eugene

    2011-01-01

    We evaluated the ability of three large-scale, multi-species surveys in the Arctic to provide information on abundance and habitat relationships of Gyrfalcons (Falco rusticolus) and ptarmigan. The Program for Regional and International Shorebird Monitoring (PRISM) has surveyed birds widely across the arctic regions of Canada and Alaska since 2001. The Arctic Coastal Plain survey has collected abundance information on the North Slope of Alaska using fixed-wing aircraft since 1992. The Northwest Territories-Nunavut Bird Checklist has collected presence-absence information from little-known locations in northern Canada since 1995. All three surveys provide extensive information on Willow Ptarmigan (Lagopus lagopus) and Rock Ptarmigan (L. muta). For example, they show that ptarmigan are most abundant in western Alaska, next most abundant in northern Alaska and northwest Canada, and least abundant in the Canadian Archipelago. PRISM surveys were less successful in detecting Gyrfalcons, and the Arctic Coastal Plain Survey is largely outside the Gyrfalcon's breeding range. The Checklist Survey, however, reflects the expansive Gyrfalcon range in Canada. We suggest that collaboration by Gyrfalcon and ptarmigan biologists with the organizers of large scale surveys like the ones we investigated provides an opportunity for obtaining useful information on these species and their environment across large areas.

  10. Bayesian inference of the initial conditions from large-scale structure surveys

    NASA Astrophysics Data System (ADS)

    Leclercq, Florent

    2016-10-01

    Analysis of three-dimensional cosmological surveys has the potential to answer outstanding questions on the initial conditions from which structure appeared, and therefore on the very high energy physics at play in the early Universe. We report on recently proposed statistical data analysis methods designed to study the primordial large-scale structure via physical inference of the initial conditions in a fully Bayesian framework, and applications to the Sloan Digital Sky Survey data release 7. We illustrate how this approach led to a detailed characterization of the dynamic cosmic web underlying the observed galaxy distribution, based on the tidal environment.

  11. Characterising large-scale structure with the REFLEX II cluster survey

    NASA Astrophysics Data System (ADS)

    Chon, Gayoung

    2016-10-01

    We study the large-scale structure with superclusters from the REFLEX X-ray cluster survey together with cosmological N-body simulations. It is important to construct superclusters with criteria such that they are homogeneous in their properties. We lay out our theoretical concept, which takes the future evolution of superclusters into account in their definition, and show that the X-ray luminosity and halo mass functions of clusters in superclusters are top-heavy, differing from those of clusters in the field. We also show a promising aspect of using superclusters to study the local cluster bias and mass scaling relation with simulations.

  12. Measures of large-scale structure in the CfA redshift survey slices

    NASA Technical Reports Server (NTRS)

    De Lapparent, Valerie; Geller, Margaret J.; Huchra, John P.

    1991-01-01

    Variations of the counts-in-cells with cell size are used here to define two statistical measures of large-scale clustering in three 6° slices of the CfA redshift survey. A percolation criterion is used to estimate the filling factor f, which measures the fraction of the total volume in the survey occupied by the large-scale structures. For the full 18° slice of the CfA redshift survey, f ≈ 0.25 ± 0.05. After removing groups with more than five members from two of the slices, variations of the counts in occupied cells with cell size follow a power law with slope β ≈ 2.2 on scales of 1-10 h⁻¹ Mpc. Application of both this statistic and the percolation analysis to simulations suggests that a network of two-dimensional structures is a better description of the geometry of the clustering in the CfA slices than a network of one-dimensional structures. Counts-in-cells are also used to estimate the average galaxy surface density in sheets like the Great Wall at about 0.3 h² galaxies per Mpc².
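
    A structural sketch of the counts-in-cells machinery follows. It is illustrative only: the positions are uniform random points in a made-up box, so it demonstrates the mechanics of the measurement rather than the clustering signal of the CfA slices.

```python
import numpy as np

rng = np.random.default_rng(1)
xy = rng.uniform(0.0, 100.0, size=(5000, 2))  # mock galaxy positions (Mpc/h)

for cell in [1.0, 2.0, 5.0, 10.0]:
    nbins = int(100.0 / cell)
    # Count galaxies in square cells of the given size
    counts, _, _ = np.histogram2d(xy[:, 0], xy[:, 1],
                                  bins=[nbins, nbins],
                                  range=[[0, 100], [0, 100]])
    occupied = counts[counts > 0]
    filling = occupied.size / counts.size  # crude filling-factor analogue
    print(f"cell={cell:5.1f}  mean occupied count={occupied.mean():7.2f}  "
          f"filling={filling:.2f}")
```

    In a clustered sample, the scaling of the occupied-cell counts with cell size yields the power-law slope, and a percolation criterion applied to the occupied cells gives the filling factor quoted in the abstract.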

  13. Up-to-date radon-thoron discriminative detector for a large scale survey

    SciTech Connect

    Tokonami, Shinji; Takahashi, Hiroyuki; Kobayashi, Yosuke; Zhuo, Weihai; Hulber, Erik

    2005-11-15

    An up-to-date radon-thoron discriminative detector has been developed for conducting a large scale survey. Compared with our previous detector, some functional problems have been solved. The lowest and highest detection limits of the detector were estimated to be around 5 and 1000 Bq m⁻³ for radon, and 15 and 1000 Bq m⁻³ for thoron, respectively, with a 6 month exposure and several theoretical assumptions. Small indoor surveys were carried out in Japan and in Hungary using the present detector. Both sets of results demonstrated the presence of thoron indoors. Since measurements that do not discriminate between radon isotopes will result in risk estimates that differ from the actual situation, special attention should be paid to thoron, and its concentration should accordingly be measured along with the radon concentration.

  14. The Observatorio Astrofisico de Javalambre. A planned facility for large scale surveys

    NASA Astrophysics Data System (ADS)

    Moles, M.; Cenarro, A. J.; Cristóbal-Hornillos, D.; Gruel, N.; Marín Franch, A.; Valdivielso, L.; Viironen, K.

    2011-11-01

    All-sky surveys play a fundamental role in the development of astrophysics. The need for large-scale surveys comes from two basic motivations: one is to make an inventory of sources as complete as possible and allow for their classification in families; the other is to attack problems that demand the sampling of large volumes to give a detectable signal. New challenges, in particular in the domain of cosmology, are giving impulse to a new kind of large-scale survey, combining area coverage, depth, and spectral information accurate enough to recover the redshift and spectral energy distribution (SED) of the detected objects. New instruments are needed to satisfy the requirements of those large-scale surveys, in particular large-Etendue telescopes. The Observatorio Astrofisico de Javalambre, OAJ, project includes a telescope of 2.5 m aperture with a wide field of view, 3 degrees in diameter, and excellent image quality over the whole field. Taking into account that it is going to be fully devoted to carrying out surveys, it will be the highest effective Etendue telescope to date. The project is completed with a smaller, wide-field auxiliary telescope. The Observatory is being built at Pico del Buitre, Sierra de Javalambre, Teruel, a site with excellent seeing and low sky surface brightness. The institution in charge of the Observatory is the Centro de Estudios de Fisica del Cosmos de Aragon, CEFCA, a new center created in Teruel for the operation and scientific exploitation of the Javalambre Observatory. CEFCA will also be in charge of data management and archiving, and the data will be made accessible to the community. The first planned scientific project is a multi-narrow-band photometric survey covering 8,000 square degrees, designed to produce precise SEDs and photometric redshifts accurate at the 0.3% level. A total of 42 band-pass filters, each 100-120 Å wide and covering most of the optical spectral range, will be used. In this sense it is the development, at a much

  15. Large Scale eHealth Deployment in Europe: Insights from Concurrent Use of Standards.

    PubMed

    Eichelberg, Marco; Chronaki, Catherine

    2016-01-01

    Large-scale eHealth deployment projects face a major challenge when called to select the right set of standards and tools to achieve sustainable interoperability in an ecosystem including both legacy systems and new systems reflecting technological trends and progress. There is not a single standard that would cover all needs of an eHealth project, and there is a multitude of overlapping and perhaps competing standards that can be employed to define document formats, terminology, communication protocols mirroring alternative technical approaches and schools of thought. eHealth projects need to respond to the important question of how alternative or inconsistently implemented standards and specifications can be used to ensure practical interoperability and long-term sustainability in large scale eHealth deployment. In the eStandards project, 19 European case studies reporting from R&D and large-scale eHealth deployment and policy projects were analyzed. Although this study is not exhaustive, reflecting on the concepts, standards, and tools for concurrent use and the successes, failures, and lessons learned, this paper offers practical insights on how eHealth deployment projects can make the most of the available eHealth standards and tools and how standards and profile developing organizations can serve the users embracing sustainability and technical innovation. PMID:27577416

  16. Google Street View as an alternative method to car surveys in large-scale vegetation assessments.

    PubMed

    Deus, Ernesto; Silva, Joaquim S; Catry, Filipe X; Rocha, Miguel; Moreira, Francisco

    2015-10-01

    Car surveys (CS) are a common method for assessing the distribution of alien invasive plants. Google Street View (GSV), a free-access web technology where users may experience a virtual travel along roads, has been suggested as a cost-effective alternative to car surveys. We tested whether we could replicate the results of a countrywide survey conducted by car in Portugal using GSV as a remote sensing tool, with the aim of assessing the distribution of Eucalyptus globulus Labill. wildlings on roadsides adjacent to eucalypt stands. Georeferenced points gathered along the CS were used to create road transects visible as lines overlapping the road in the GSV environment, allowing the same sampling areas to be surveyed using both methods. This paper presents the results of the comparison between the two methods. Both methods produced similar models of plant abundance, selecting the same explanatory variables, in the same hierarchical order of importance and depicting a similar influence on plant abundance. Even though the GSV model had a lower performance and the GSV survey detected fewer plants, additional variables collected exclusively with GSV improved model performance and provided a new insight into additional factors influencing plant abundance. The survey using GSV required ca. 9% of the funds and 62% of the time needed to accomplish the CS. We conclude that GSV may be a cost-effective alternative to CS. We discuss some advantages and limitations of GSV as a survey method. We forecast that GSV may become a widespread tool in road ecology, particularly in large-scale vegetation assessments. PMID:27624742

  17. Public health concerns for neighbors of large-scale swine production operations.

    PubMed

    Thu, K M

    2002-05-01

    This article provides a review and critical synthesis of research related to public health concerns for neighbors exposed to emissions from large-scale swine production operations. The rapid industrialization of pork production in the 1990s produced a generation of confined animal feeding operations (CAFOs) of a size previously unseen in the U.S. Recent research and results from federally sponsored scientific symposia consistently indicate that neighbors of large-scale swine CAFOs can experience health problems at significantly higher rates than controlled comparison populations. Symptoms experienced by swine CAFO neighbors are generally oriented toward irritation of the respiratory tract and are consistent with the types of symptoms among interior confinement workers that have been well documented in the occupational health literature. However, additional exposure assessment research is required to elucidate the relationship of reported symptoms among swine CAFO neighbors and CAFO emissions. PMID:12046804

  18. Large-Scale Surveys of Snow Depth on Arctic Sea Ice from Operation IceBridge

    NASA Technical Reports Server (NTRS)

    Kurtz, Nathan T.; Farrell, Sinead L.

    2011-01-01

    We show the first results of a large-scale survey of snow depth on Arctic sea ice from NASA's Operation IceBridge snow radar system for the 2009 season and compare the data to climatological snow depth values established over the 1954-1991 time period. For multiyear ice, the mean radar-derived snow depth is 33.1 cm and the corresponding mean climatological snow depth is 33.4 cm. The small mean difference suggests consistency between contemporary estimates of snow depth and the historical climatology for the multiyear ice region of the Arctic. A 16.5 cm mean difference (climatology minus radar) is observed for first-year ice areas, suggesting that the increasingly seasonal sea ice cover of the Arctic Ocean has led to an overall loss of snow as the region has transitioned away from a dominantly multiyear ice cover.

  19. Large-scale internal structure in volcanogenic breakout flood deposits: Extensive GPR survey on volcaniclastic deposits

    NASA Astrophysics Data System (ADS)

    Kataoka, K.; Gomez, C. A.

    2012-12-01

    Large-scale outburst floods from volcanic lakes, such as caldera lakes or volcanically dammed river valleys, tend to be voluminous, with total discharges of 1-10s of km³ and peak discharges of 10,000s to 100,000s of m³ s⁻¹. Such large floods can travel long distances and leave sediments and bedforms/landforms extensively, with large-scale internal structures that are difficult to assess from single local sites. Moreover, the sediments and bedforms/landforms are sometimes untraceable, and outcrop information obtained by classical geological and geomorphological field surveys is limited to the dissected/terraced parts of the fan body, road cuts, and/or large quarries. Therefore, GPR (Ground Penetrating Radar), which uses the propagation of electromagnetic waves through media, seems best adapted for the appraisal of large-scale subsurface structures. Recently, studies applying GPR to volcanic deposits have successfully captured images of lava flows and volcaniclastic deposits and proved the usefulness of this method even in volcanic areas, which often encompass complicated stratigraphy and structures with variable material, grain size, and ferromagnetic content. Using GPR, the present study aims to understand the large-scale internal structures of volcanogenic flood deposits. The survey was carried out over two volcanogenic flood fan (or apron) deposits in northeast Japan, at Numazawa and Towada volcanoes. The 5 ka Numazawa flood deposits in the Tadami river catchment were emplaced by a breakout flood from an ignimbrite-dammed valley, leaving pumiceous gravelly sediments with meter-sized boulders in the flow path. At Towada volcano, a comparable flood event originating from a breach in the caldera rim emplaced the 13-15 ka Sanbongi fan deposits in the Oirase river valley, which are characterized by bouldery fan deposits. The GPR data were collected along 200 to 500 m long lateral and longitudinal transects, which were captured using a GPR Pulse

  1. Measuring Large-Scale Structure at z ~ 1 with the VIPERS galaxy survey

    NASA Astrophysics Data System (ADS)

    Guzzo, Luigi

    2016-10-01

    The VIMOS Public Extragalactic Redshift Survey (VIPERS) is the largest redshift survey ever conducted with the ESO telescopes. It has used the Very Large Telescope to collect nearly 100,000 redshifts from the general galaxy population at 0.5 < z < 1.2. With a combination of volume and high sampling density that is unique for these redshifts, it allows statistical measurements of galaxy clustering and related cosmological quantities to be obtained on an equal footing with classic results from local redshift surveys. At the same time, the simple magnitude-limited selection and the wealth of ancillary photometric data provide a general view of the galaxy population, its physical properties and the relation of the latter to large-scale structure. This paper presents an overview of the galaxy clustering results obtained so far, together with their cosmological implications. Most of these are based on the ~ 55,000 galaxies forming the first public data release (PDR-1). As of January 2015, observations and data reduction are complete and the final data set of more than 90,000 redshifts is being validated and made ready for the final investigations.

  2. Searching transients in large-scale surveys. A method based on the Abbe value

    NASA Astrophysics Data System (ADS)

    Mowlavi, N.

    2014-08-01

    Aims: A new method is presented to identify transient candidates in large-scale surveys based on the variability pattern in their light curves. Methods: The method is based on the Abbe value, Ab, which estimates the smoothness of a light curve, and on a newly introduced value called the excess Abbe, denoted excessAb, which estimates the regularity of the light curve variability pattern over the duration of the observations. Results: Based on simulated light curves, transients are shown to occupy a specific region in the excessAb versus Ab diagram, distinct from sources presenting pulsating-like features in their light curves or having featureless light curves. The method is tested on real light curves taken from the EROS-2 and OGLE-II surveys in a 0.50° × 0.17° field of the sky in the Large Magellanic Cloud centered at RA(J2000) = 5h25m56.5s and Dec(J2000) = -69d29m43.3s. The method identifies 43 EROS-2 transient candidates out of a total of 1300 variable stars, and 19 more OGLE-II candidates, 10 of which do not have any EROS-2 variable star matches and would need further confirmation to assess their reliability. The efficiency of the method is further tested by comparing the list of transient candidates with known Be stars in the literature. It is shown that all Be stars known in the studied field of view with detectable bursts or outbursts are successfully extracted by the method. In addition, four new transient candidates displaying bursts and/or outbursts are found in the field, of which at least two are good new Be candidates. Conclusions: The new method proves to be a potentially powerful tool to extract transient candidates from large-scale multi-epoch surveys. The better the photometric measurement uncertainties, the cleaner the list of detected transient candidates. In addition, the excessAb versus Ab diagram is shown to be a good diagnostic tool to check the data quality of multi-epoch photometric surveys. A trend of instrumental and/or data reduction origin
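
    For orientation, the Abbe value has a standard closed form, and the sketch below implements it together with one plausible reading of the excess Abbe: the mean Abbe value over sliding sub-intervals minus the global value. Treat the excessAb construction and the window length as illustrative rather than the paper's exact definition.

```python
import numpy as np

def abbe(y):
    """Abbe value: ratio of mean successive difference to variance (white noise ~ 1)."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    return (n * np.sum(np.diff(y) ** 2)
            / (2.0 * (n - 1) * np.sum((y - y.mean()) ** 2)))

def excess_abbe(y, window):
    """Mean Abbe value over sliding sub-intervals minus the global Abbe value."""
    y = np.asarray(y, dtype=float)
    subs = [abbe(y[i:i + window]) for i in range(0, len(y) - window + 1)]
    return np.mean(subs) - abbe(y)

rng = np.random.default_rng(0)
noise = rng.normal(size=400)   # featureless light curve: Ab ~ 1, excessAb ~ 0
burst = noise.copy()
burst[150:200] += 8.0          # transient-like burst lowers Ab, raises excessAb
print(abbe(noise), excess_abbe(noise, 50))
print(abbe(burst), excess_abbe(burst, 50))
```

    A featureless noise curve gives Ab near 1 and excessAb near 0, while a burst drives the global Ab down and the excessAb up; that separation in the excessAb versus Ab plane is what the classification exploits.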

  3. Ten key considerations for the successful implementation and adoption of large-scale health information technology

    PubMed Central

    Cresswell, Kathrin M; Bates, David W; Sheikh, Aziz

    2013-01-01

    The implementation of health information technology interventions is at the forefront of most policy agendas internationally. However, such undertakings are often far from straightforward as they require complex strategic planning accompanying the systemic organizational changes associated with such programs. Building on our experiences of designing and evaluating the implementation of large-scale health information technology interventions in the USA and the UK, we highlight key lessons learned in the hope of informing the on-going international efforts of policymakers, health directorates, healthcare management, and senior clinicians. PMID:23599226

  4. Inclusive constraints on unified dark matter models from future large-scale surveys

    NASA Astrophysics Data System (ADS)

    Camera, Stefano; Carbone, Carmelita; Moscardini, Lauro

    2012-03-01

    In the very last years, cosmological models where the properties of the dark components of the Universe — dark matter and dark energy — are accounted for by a single "dark fluid" have drawn increasing attention and interest. Amongst many proposals, Unified Dark Matter (UDM) cosmologies are promising candidates as effective theories. In these models, a scalar field with a non-canonical kinetic term in its Lagrangian mimics both the accelerated expansion of the Universe at late times and the clustering properties of the large-scale structure of the cosmos. However, UDM models also present peculiar behaviours, the most interesting one being the fact that the perturbations in the dark-matter component of the scalar field do have a non-negligible speed of sound. This gives rise to an effective Jeans scale for the Newtonian potential, below which the dark fluid does not cluster any more. This implies a growth of structures fairly different from that of the concordance ΛCDM model. In this paper, we demonstrate that forthcoming large-scale surveys will be able to discriminate between viable UDM models and ΛCDM to a good degree of accuracy. To this purpose, the planned Euclid satellite will be a powerful tool, since it will provide very accurate data on galaxy clustering and the weak lensing effect of cosmic shear. Finally, we also exploit the constraining power of the ongoing CMB Planck experiment. Although our approach is the most conservative, with the inclusion of only well-understood, linear dynamics, in the end we also show what could be done if some amount of non-linear information were included.

  5. Survey and analysis of selected jointly owned large-scale electric utility storage projects

    SciTech Connect

    Not Available

    1982-05-01

    The objective of this study was to examine and document the issues surrounding the curtailment in commercialization of large-scale electric storage projects. It was sensed that if these issues could be uncovered, then efforts might be directed toward clearing away these barriers and allowing these technologies to penetrate the market to their maximum potential. Joint ownership of these projects was seen as a possible solution to overcoming the major barriers, particularly economic barriers, of commercialization. Therefore, discussions with partners involved in four pumped storage projects took place to identify the difficulties and advantages of joint-ownership agreements. The four plants surveyed included Yards Creek (Public Service Electric and Gas and Jersey Central Power and Light); Seneca (Pennsylvania Electric and Cleveland Electric Illuminating Company); Ludington (Consumers Power and Detroit Edison); and Bath County (Virginia Electric Power Company and Allegheny Power System, Inc.). Also investigated were several pumped storage projects which were never completed. These included Blue Ridge (American Electric Power); Cornwall (Consolidated Edison); Davis (Allegheny Power System, Inc.); and Kittatinny Mountain (General Public Utilities). Institutional, regulatory, technical, environmental, economic, and special issues at each project were investigated, and the conclusions relative to each issue are presented. The major barriers preventing the growth of energy storage are the high cost of these systems in times of extremely high cost of capital, diminishing load growth, and regulatory influences which will not allow the building of large-scale storage systems due to environmental objections or other reasons. However, the future of energy storage looks viable despite difficult economic times for the utility industry. Joint ownership can ease some of the economic hardships for utilities which demonstrate a need for energy storage.

  6. Inclusive constraints on unified dark matter models from future large-scale surveys

    SciTech Connect

    Camera, Stefano; Carbone, Carmelita; Moscardini, Lauro (contact: carmelita.carbone@unibo.it)

    2012-03-01

    In recent years, cosmological models in which the properties of the dark components of the Universe (dark matter and dark energy) are accounted for by a single "dark fluid" have drawn increasing attention. Amongst many proposals, Unified Dark Matter (UDM) cosmologies are promising candidates as effective theories. In these models, a scalar field with a non-canonical kinetic term in its Lagrangian mimics both the accelerated expansion of the Universe at late times and the clustering properties of the large-scale structure of the cosmos. However, UDM models also exhibit peculiar behaviours, most notably that perturbations in the dark-matter component of the scalar field have a non-negligible speed of sound. This gives rise to an effective Jeans scale for the Newtonian potential, below which the dark fluid no longer clusters, implying a growth of structure appreciably different from that of the concordance ΛCDM model. In this paper, we demonstrate that forthcoming large-scale surveys will be able to discriminate between viable UDM models and ΛCDM to a good degree of accuracy. For this purpose, the planned Euclid satellite will be a powerful tool, since it will provide very accurate data on galaxy clustering and the weak-lensing effect of cosmic shear. Finally, we also exploit the constraining power of the ongoing Planck CMB experiment. Although our approach is conservative, including only well-understood linear dynamics, we also show what could be gained if some amount of non-linear information were included.
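
    To fix ideas about the mechanism described above, a schematic version of the relation can be written down; this is generic, correct only up to order-unity factors, and not the specific UDM expressions of the paper:

        % Effective sound speed and Jeans wavenumber of a dark fluid
        c_s^2 \equiv \frac{\delta p}{\delta \rho}, \qquad
        k_J(a) \sim \frac{a\,H(a)}{c_s(a)}

    Modes with k >> k_J oscillate acoustically and the Newtonian potential is suppressed, while modes with k << k_J grow as in standard CDM; letting c_s -> 0 recovers the ΛCDM limit k_J -> infinity, i.e., clustering on all scales.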

  7. Photometric Redshifts for the Dark Energy Survey and VISTA and Implications for Large Scale Structure

    SciTech Connect

    Banerji, Manda; Abdalla, Filipe B.; Lahav, Ofer; Lin, Huan (Fermilab)

    2007-11-01

    We conduct a detailed analysis of the photometric redshift requirements for the proposed Dark Energy Survey (DES) using two sets of mock galaxy simulations and an artificial neural network code, ANNz. In particular, we examine how optical photometry in the DES grizY bands can be complemented with near-infrared photometry from the planned VISTA Hemisphere Survey (VHS) in the JHKs bands in order to improve the photometric redshift estimate by a factor of two at z > 1. We draw attention to the effects of galaxy formation scenarios such as reddening on the photo-z estimate and, using our neural network code, calculate A_V for these reddened galaxies. We also examine the impact of using different training sets when calculating photometric redshifts. In particular, we find that using the ongoing DEEP2 and VVDS-Deep spectroscopic surveys to calibrate photometric redshifts for DES will prove effective. However, we need to be aware of uncertainties in the photometric redshift bias that arise when using different training sets, as these will translate into errors in the dark energy equation-of-state parameter, w. Furthermore, we show that the neural network error estimate on the photometric redshift may be used to remove outliers from our samples before any kind of cosmological analysis, in particular for large-scale structure experiments. By removing all galaxies with a 1σ photo-z scatter greater than 0.1 from our DES+VHS sample, we can constrain the galaxy power spectrum out to a redshift of 2 and reduce the fractional error on this power spectrum by ~15-20% compared to using the entire catalogue.
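
    As a toy illustration of the committee-based error estimate and the outlier cut described above (this is not the ANNz code; the catalogue, network sizes and band set below are invented), consider:

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)

        # Synthetic catalogue: 9 magnitudes standing in for grizY + JHKs,
        # loosely correlated with redshift.
        n = 2000
        z_true = rng.uniform(0.0, 2.0, n)
        mags = (20.0 + np.outer(z_true, rng.uniform(0.5, 1.5, 9))
                + rng.normal(0.0, 0.1, (n, 9)))

        train, test = slice(0, 1500), slice(1500, None)

        # Committee of networks; the scatter of their predictions acts as a
        # crude per-galaxy photo-z error estimate.
        committee = [MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=2000,
                                  random_state=seed).fit(mags[train], z_true[train])
                     for seed in range(5)]
        preds = np.array([m.predict(mags[test]) for m in committee])
        z_phot, z_err = preds.mean(axis=0), preds.std(axis=0)

        # Outlier cut in the spirit of the abstract: drop 1-sigma scatter > 0.1.
        keep = z_err < 0.1
        print(f"kept {keep.sum()}/{keep.size} galaxies after the error cut")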

  8. The WiggleZ Dark Energy Survey: the transition to large-scale cosmic homogeneity

    NASA Astrophysics Data System (ADS)

    Scrimgeour, Morag I.; Davis, Tamara; Blake, Chris; James, J. Berian; Poole, Gregory B.; Staveley-Smith, Lister; Brough, Sarah; Colless, Matthew; Contreras, Carlos; Couch, Warrick; Croom, Scott; Croton, Darren; Drinkwater, Michael J.; Forster, Karl; Gilbank, David; Gladders, Mike; Glazebrook, Karl; Jelliffe, Ben; Jurek, Russell J.; Li, I.-hui; Madore, Barry; Martin, D. Christopher; Pimbblet, Kevin; Pracy, Michael; Sharp, Rob; Wisnioski, Emily; Woods, David; Wyder, Ted K.; Yee, H. K. C.

    2012-09-01

    We have made the largest volume measurement to date of the transition to large-scale homogeneity in the distribution of galaxies. We use the WiggleZ survey, a spectroscopic survey of over 200 000 blue galaxies in a cosmic volume of ~1 h^-3 Gpc^3. A new method of defining the 'homogeneity scale' is presented, which is more robust than methods previously used in the literature, and which can be easily compared between different surveys. Due to the large cosmic depth of WiggleZ (up to z = 1), we are able to make the first measurement of the transition to homogeneity over a range of cosmic epochs. The mean number of galaxies N(< r) in spheres of comoving radius r is proportional to r^3 within 1 per cent, or equivalently the fractal dimension of the sample is within 1 per cent of D_2 = 3, at radii larger than 71 ± 8 h^-1 Mpc at z ~ 0.2, 70 ± 5 h^-1 Mpc at z ~ 0.4, 81 ± 5 h^-1 Mpc at z ~ 0.6 and 75 ± 4 h^-1 Mpc at z ~ 0.8. We demonstrate the robustness of our results against selection function effects, using a Λ cold dark matter (ΛCDM) N-body simulation and a suite of inhomogeneous fractal distributions. The results are in excellent agreement with both the ΛCDM N-body simulation and an analytical ΛCDM prediction. We can exclude a fractal distribution with fractal dimension below D_2 = 2.97 on scales from ~80 h^-1 Mpc up to the largest scales probed by our measurement, ~300 h^-1 Mpc, at 99.99 per cent confidence.
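
    A minimal counts-in-spheres sketch (on a synthetic homogeneous point set, not the WiggleZ catalogue) shows how N(<r) and the correlation dimension D_2 = d ln N / d ln r are measured:

        import numpy as np
        from scipy.spatial import cKDTree

        rng = np.random.default_rng(1)
        # Synthetic homogeneous 'galaxies' in a (1000 Mpc/h)^3 box.
        pts = rng.uniform(0.0, 1000.0, (20000, 3))
        tree = cKDTree(pts)

        radii = np.array([20.0, 40.0, 60.0, 80.0, 100.0])   # Mpc/h
        # Interior centres only, to sidestep edge corrections in this toy.
        interior = pts[np.all((pts > 100.0) & (pts < 900.0), axis=1)]
        centres = interior[rng.choice(len(interior), 500, replace=False)]

        # Mean counts-in-spheres N(<r) around the chosen centres.
        counts = np.array([tree.query_ball_point(centres, r, return_length=True).mean()
                           for r in radii])

        # Correlation dimension D2 = d ln N / d ln r; -> 3 for homogeneity.
        d2 = np.gradient(np.log(counts), np.log(radii))
        print(dict(zip(radii.tolist(), np.round(d2, 2).tolist())))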

  9. Large Scale Structure Studies: Final Results from a Rich Cluster Redshift Survey

    NASA Astrophysics Data System (ADS)

    Slinglend, K.; Batuski, D.; Haase, S.; Hill, J.

    1995-12-01

    The results from the COBE satellite show the existence of structure on scales of the order of 10% or more of the horizon scale of the universe. Rich clusters of galaxies from the Abell-ACO catalogs show evidence of structure on scales of 100 Mpc and hold the promise of confirming structure on the scale of the COBE result. Unfortunately, until now, redshift information has been unavailable for a large percentage of these clusters, so present knowledge of their three-dimensional distribution has quite large uncertainties. Our approach in this effort has been to use the MX multifiber spectrometer on the Steward 2.3m to measure redshifts of at least ten galaxies in each of 88 Abell cluster fields with richness class R >= 1 and mag10 <= 16.8 (estimated z <= 0.12) and zero or one previously measured redshifts. This work has resulted in a deeper, 95% complete and more reliable sample of 3-D positions of rich clusters. The primary intent of this survey has been to constrain theoretical models for the formation of the structure we see in the universe today through two-point spatial correlation function and other analyses of the large-scale structures traced by these clusters. In addition, we have obtained enough redshifts per cluster to greatly improve the quality and size of the sample of reliable cluster velocity dispersions available for use in other studies of cluster properties. These new data have also allowed the construction of an updated and more reliable supercluster candidate catalog. Our efforts have effectively doubled the volume traced by these clusters. Presented here is the resulting two-point spatial correlation function, as well as density plots and several other figures quantifying the large-scale structure in this much deeper and more complete sample. Also, with 10 or more redshifts in most of our cluster fields, we have investigated the extent of projection effects within the Abell catalog in an effort to quantify and understand how they may affect the Abell sample.
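
    The pair-counting behind a two-point spatial correlation function can be sketched in a few lines; the Landy-Szalay estimator below is a common choice, shown here on toy positions (the abstract does not state which estimator the authors used):

        import numpy as np
        from scipy.spatial import cKDTree

        def pair_counts(a, b, edges):
            # Cumulative pair counts at each edge, then differenced into bins.
            cum = cKDTree(a).count_neighbors(cKDTree(b), edges)
            return np.diff(cum).astype(float)

        rng = np.random.default_rng(2)
        data = rng.uniform(0.0, 500.0, (800, 3))    # toy cluster positions, Mpc/h
        rand = rng.uniform(0.0, 500.0, (8000, 3))   # random catalogue, same geometry

        edges = np.linspace(5.0, 100.0, 11)
        dd = pair_counts(data, data, edges) / (len(data) * (len(data) - 1))
        rr = pair_counts(rand, rand, edges) / (len(rand) * (len(rand) - 1))
        dr = pair_counts(data, rand, edges) / (len(data) * len(rand))

        xi = (dd - 2.0 * dr + rr) / rr              # Landy-Szalay estimator
        print(np.round(xi, 3))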

  10. Epidemiology of forest malaria in central Vietnam: a large scale cross-sectional survey

    PubMed Central

    Erhart, Annette; Thang, Ngo Duc; Van Ky, Phan; Tinh, Ta Thi; Van Overmeir, Chantal; Speybroeck, Niko; Obsomer, Valerie; Hung, Le Xuan; Thuan, Le Khanh; Coosemans, Marc; D'alessandro, Umberto

    2005-01-01

    In Vietnam, a large proportion of all malaria cases and deaths occurs in the central mountainous and forested part of the country. Indeed, forest malaria, despite intensive control activities, is still a major problem which raises several questions about its dynamics. A large-scale malaria morbidity survey to measure malaria endemicity and identify important risk factors was carried out in 43 villages situated in a forested area of Ninh Thuan province, south central Vietnam. Four thousand three hundred and six randomly selected individuals, aged 10–60 years, participated in the survey. Rag Lays (86%), traditionally living in the forest and practising "slash and burn" cultivation, represented the most common ethnic group. The overall parasite rate was 13.3% (range [0–42.3]), while Plasmodium falciparum seroprevalence was 25.5% (range [2.1–75.6]). Mapping of these two variables showed a patchy distribution, suggesting that risk factors other than remoteness and forest proximity modulated the human-vector interactions. This was confirmed by the results of the multivariate-adjusted analysis, showing that forest work was a significant risk factor for malaria infection, further increased by staying in the forest overnight (OR = 2.86; 95%CI [1.62; 5.07]). Rag Lays had a higher risk of malaria infection, which was inversely related to education level and socio-economic status. Women were less at risk than men (OR = 0.71; 95%CI [0.59; 0.86]), a possible consequence of different behaviour. This study confirms that malaria endemicity is still relatively high in this area and that the dynamics of transmission is constantly modulated by the behaviour of both humans and vectors. A well-targeted intervention reducing the "vector/forest worker" interaction, based on long-lasting insecticidal material, could be appropriate in this environment. PMID:16336671

  11. Epidemiology of forest malaria in central Vietnam: a large scale cross-sectional survey.

    PubMed

    Erhart, Annette; Ngo, Duc Thang; Phan, Van Ky; Ta, Thi Tinh; Van Overmeir, Chantal; Speybroeck, Niko; Obsomer, Valerie; Le, Xuan Hung; Le, Khanh Thuan; Coosemans, Marc; D'alessandro, Umberto

    2005-01-01

    In Vietnam, a large proportion of all malaria cases and deaths occurs in the central mountainous and forested part of the country. Indeed, forest malaria, despite intensive control activities, is still a major problem which raises several questions about its dynamics. A large-scale malaria morbidity survey to measure malaria endemicity and identify important risk factors was carried out in 43 villages situated in a forested area of Ninh Thuan province, south central Vietnam. Four thousand three hundred and six randomly selected individuals, aged 10-60 years, participated in the survey. Rag Lays (86%), traditionally living in the forest and practising "slash and burn" cultivation, represented the most common ethnic group. The overall parasite rate was 13.3% (range [0-42.3]), while Plasmodium falciparum seroprevalence was 25.5% (range [2.1-75.6]). Mapping of these two variables showed a patchy distribution, suggesting that risk factors other than remoteness and forest proximity modulated the human-vector interactions. This was confirmed by the results of the multivariate-adjusted analysis, showing that forest work was a significant risk factor for malaria infection, further increased by staying in the forest overnight (OR = 2.86; 95%CI [1.62; 5.07]). Rag Lays had a higher risk of malaria infection, which was inversely related to education level and socio-economic status. Women were less at risk than men (OR = 0.71; 95%CI [0.59; 0.86]), a possible consequence of different behaviour. This study confirms that malaria endemicity is still relatively high in this area and that the dynamics of transmission is constantly modulated by the behaviour of both humans and vectors. A well-targeted intervention reducing the "vector/forest worker" interaction, based on long-lasting insecticidal material, could be appropriate in this environment.
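
    For readers unfamiliar with how such odds ratios and confidence intervals are computed, a minimal sketch from a hypothetical 2x2 table (the counts below are invented, and the study itself reported multivariate-adjusted estimates):

        import numpy as np

        # Hypothetical table for infection by overnight forest stay:
        # rows exposed/unexposed, columns infected/not infected.
        a, b = 90, 310       # exposed:   infected, not infected
        c, d = 140, 1460     # unexposed: infected, not infected

        odds_ratio = (a * d) / (b * c)
        se_log_or = np.sqrt(1/a + 1/b + 1/c + 1/d)   # Wald SE of the log odds ratio
        lo, hi = np.exp(np.log(odds_ratio) + np.array([-1.96, 1.96]) * se_log_or)
        print(f"OR = {odds_ratio:.2f}, 95% CI [{lo:.2f}; {hi:.2f}]")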

  12. The Influence of Local and Large-Scale Environment on Galaxy Gas Reservoirs in the RESOLVE Survey

    NASA Astrophysics Data System (ADS)

    Stark, David V.; Kannappan, Sheila; Baker, Ashley; Berlind, Andreas A.; Burchett, Joseph; Eckert, Kathleen D.; Florez, Jonathan; Hall, Kirsten; Haynes, Martha P.; Giovanelli, Riccardo; Gonzalez, Roberto; Guynn, David; Hoversten, Erik A.; Leroy, Adam K.; Moffett, Amanda J.; Pisano, Daniel J.; Watson, Linda C.; Wei, Lisa H.; Resolve Team

    2015-01-01

    There is growing evidence to suggest galaxy gas reservoirs have been replenished over time, but a clear picture of how this process depends on local and large-scale environment is still an active area of research. I will present an analysis of galaxy gas content with respect to environment using the ~90% complete 21 cm census for the volume-limited RESOLVE survey, which yields an unbiased inventory of HI masses (or strong upper limits < 5-10% of the stellar mass) for ~1550 galaxies with baryonic mass greater than 10^9 M⊙ in >50,000 cubic Mpc of the z=0 universe. We quantify large-scale environment via identification of cosmic web filaments and walls using a modified friends-of-friends technique, while also using photometric redshifts to identify additional potential companions around each galaxy. Combining this powerful data set with estimates of HI profile asymmetries and star formation histories, we examine whether there are local or large-scale environments where cold gas accretion is more effective. Specifically, we investigate whether galaxy interactions can induce enhanced HI content. We also explore whether galaxies residing in large-scale filaments or walls, where simulations show large-scale gas flows, display signatures of enhanced gas accretion relative to other large-scale environments. This project is supported by NSF funding for the RESOLVE survey (AST-0955368), the GBT Student Observing Support program, and a UNC Royster Society of Fellows Dissertation Completion Fellowship.
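
    The plain friends-of-friends algorithm mentioned above (RESOLVE uses a modified variant whose refinements are not reproduced here) amounts to taking connected components of a fixed-linking-length graph:

        import numpy as np
        from scipy.spatial import cKDTree
        from scipy.sparse import coo_matrix
        from scipy.sparse.csgraph import connected_components

        rng = np.random.default_rng(3)
        pos = rng.uniform(0.0, 100.0, (3000, 3))   # toy galaxy positions, Mpc/h
        link = 1.5                                 # linking length, Mpc/h (assumed)

        # All pairs closer than the linking length define the graph edges.
        pairs = cKDTree(pos).query_pairs(link, output_type='ndarray')
        n = len(pos)
        graph = coo_matrix((np.ones(len(pairs)), (pairs[:, 0], pairs[:, 1])),
                           shape=(n, n))

        # Friends-of-friends groups = connected components of that graph.
        n_groups, labels = connected_components(graph, directed=False)
        print(f"{n_groups} groups; largest has {np.bincount(labels).max()} members")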

  13. The prevalence of medical services use. How comparable are the results of large-scale population surveys in Germany?

    PubMed Central

    Swart, Enno

    2012-01-01

    Background: The large-scale representative population surveys conducted by Germany’s Robert Koch Institute (RKI) contain questions pertaining to health and its determinants as well as the prevalence and frequency of outpatient services utilization. The same holds for the Socioeconomic Panel (SOEP, Sozio-ökonomisches Panel) and the Bertelsmann Healthcare Monitor (Gesundheitsmonitor) surveys. The purpose of this study is to examine the comparability of the instruments used in these surveys and of their results. Methods: The questions on outpatient care utilization examined in this study were taken from the public use files of the East-West Health Survey (Ost-West Survey; OW1991), the 1998 Federal National Health Survey (Bundesgesundheitssurvey; BGS1998), the 2003 Telephone Health Survey (TEL2003), and the 2009 German Health Update (Gesundheit in Deutschland aktuell; GEDA2009). The study also used data from the 26 waves of the SOEP (1984–2009) and the 16 waves of the Bertelsmann Healthcare Monitor (2001–2009). Results: In the OW1991 and the BGS1998, questions on outpatient services utilization differ by the types of physicians inquired about. The four-week prevalence of contact with general practitioners (GPs) was 29% in the OW1991; the twelve-month prevalence in the BGS1998 was 69%. The OW1991 and the BGS1998 also surveyed participants on the number of physician contacts made during those reference periods (average number of contacts: 1.8 over the previous four weeks (OW1991) and 4.9 over the previous 12 months (BGS1998)). The TEL2003 inquires into the three-month prevalence of contact with private practice physicians in general (63%) as well as the number of contacts with primary care physicians over the previous twelve months (88% with at least one contact, average number of contacts: 4.6, range: 1–92). In the GEDA2009 survey, 88% of participants reported having contacted a physician at least once over the previous twelve months and an average of 6

  14. EVALUATION OF A MEASUREMENT METHOD FOR FOREST VEGETATION IN A LARGE-SCALE ECOLOGICAL SURVEY

    EPA Science Inventory

    We evaluate a field method for determining species richness and canopy cover of vascular plants for the Forest Health Monitoring Program (FHM), an ecological survey of U.S. forests. Measurements are taken within 12 1-m2 quadrats on 1/15 ha plots in FHM. Species richness and cover...

  15. A Large-scale Survey of CRF55_01B from Men-Who-Have-Sex-with-Men in China: implying the Evolutionary History and Public Health Impact.

    PubMed

    Han, Xiaoxu; Takebe, Yutaka; Zhang, Weiqing; An, Minghui; Zhao, Bin; Hu, Qinghai; Xu, Junjie; Wu, Hao; Wu, Jianjun; Lu, Lin; Chen, Xi; Liang, Shu; Wang, Zhe; Yan, Hongjing; Fu, Jihua; Cai, Weiping; Zhuang, Minghua; Liao, Christina; Shang, Hong

    2015-01-01

    The HIV-1 epidemic among men who have sex with men (MSM) continues to expand in China, involving the co-circulation of several different lineages of HIV-1 strains, including subtype B and CRF01_AE. This expansion has created conditions that facilitate the generation of new recombinant strains. A molecular epidemiologic survey among MSM in 11 provinces/cities around China was conducted from 2008 to 2013. Based on pol nucleotide sequences, a total of 19 strains (1.95%) belonging to CRF55_01B were identified among 975 MSM in 7 provinces, with prevalences ranging from 1.5% to 12.5%. Near full-length genome (NFLG) sequences from six epidemiologically unlinked MSM were amplified to analyze the evolutionary history; an identical genome structure composed of CRF01_AE and subtype B, with four unique recombination breakpoints in the pol region, was identified. Bayesian molecular clock analyses for both the CRF01_AE and B segments indicated that the estimated time of the most recent common ancestor of CRF55_01B was around the year 2000. Our study found that CRF55_01B has spread throughout most provinces with high HIV-1 prevalence, and it highlights the importance of continual surveillance of dynamic changes in HIV-1 strains, the emergence of new recombinants, and the need for implementing effective prevention measures specifically targeting the MSM population in China.

  16. A Large-scale Survey of CRF55_01B from Men-Who-Have-Sex-with-Men in China: implying the Evolutionary History and Public Health Impact

    PubMed Central

    Han, Xiaoxu; Takebe, Yutaka; Zhang, Weiqing; An, Minghui; Zhao, Bin; Hu, Qinghai; Xu, Junjie; Wu, Hao; Wu, Jianjun; Lu, Lin; Chen, Xi; Liang, Shu; Wang, Zhe; Yan, Hongjing; Fu, Jihua; Cai, Weiping; Zhuang, Minghua; Liao, Christina; Shang, Hong

    2015-01-01

    The HIV-1 epidemic among men who have sex with men (MSM) continues to expand in China, involving the co-circulation of several different lineages of HIV-1 strains, including subtype B and CRF01_AE. This expansion has created conditions that facilitate the generation of new recombinant strains. A molecular epidemiologic survey among MSM in 11 provinces/cities around China was conducted from 2008 to 2013. Based on pol nucleotide sequences, a total of 19 strains (1.95%) belonging to CRF55_01B were identified among 975 MSM in 7 provinces, with prevalences ranging from 1.5% to 12.5%. Near full-length genome (NFLG) sequences from six epidemiologically unlinked MSM were amplified to analyze the evolutionary history; an identical genome structure composed of CRF01_AE and subtype B, with four unique recombination breakpoints in the pol region, was identified. Bayesian molecular clock analyses for both the CRF01_AE and B segments indicated that the estimated time of the most recent common ancestor of CRF55_01B was around the year 2000. Our study found that CRF55_01B has spread throughout most provinces with high HIV-1 prevalence, and it highlights the importance of continual surveillance of dynamic changes in HIV-1 strains, the emergence of new recombinants, and the need for implementing effective prevention measures specifically targeting the MSM population in China. PMID:26667846

  17. A process for creating multimetric indices for large-scale aquatic surveys

    EPA Science Inventory

    Differences in sampling and laboratory protocols, differences in techniques used to evaluate metrics, and differing scales of calibration and application prohibit the use of many existing multimetric indices (MMIs) in large-scale bioassessments. We describe an approach to develop...

  18. Ensuring Adequate Health and Safety Information for Decision Makers during Large-Scale Chemical Releases

    NASA Astrophysics Data System (ADS)

    Petropoulos, Z.; Clavin, C.; Zuckerman, B.

    2015-12-01

    The 2014 4-Methylcyclohexanemethanol (MCHM) spill in the Elk River of West Virginia highlighted existing gaps in emergency planning for, and response to, large-scale chemical releases in the United States. The Emergency Planning and Community Right-to-Know Act requires that facilities with hazardous substances provide Material Safety Data Sheets (MSDSs), which contain health and safety information on the hazardous substances. The MSDS produced by Eastman Chemical Company, the manufacturer of MCHM, listed "no data available" for various human toxicity subcategories, such as reproductive toxicity and carcinogenicity. As a result of incomplete toxicity data, the public and media received conflicting messages on the safety of the contaminated water from government officials, industry, and the public health community. Two days after the governor lifted the ban on water use, the health department partially retracted the ban by warning pregnant women to continue avoiding the contaminated water, which the Centers for Disease Control and Prevention deemed safe three weeks later. The response in West Virginia represents a failure in risk communication and calls into question whether government officials have sufficient information to support evidence-based decisions during future incidents. Research capabilities, like the National Science Foundation RAPID funding, can provide a solution to some of the data gaps, such as information on environmental fate in the case of the MCHM spill. In order to inform policy discussions on this issue, a methodology for assessing the outcomes of RAPID and similar National Institutes of Health grants in the context of emergency response is employed to examine the efficacy of research-based capabilities in enhancing public health decision making capacity. The results of this assessment highlight potential roles rapid scientific research can fill in ensuring adequate health and safety data is readily available for decision makers during large-scale

  19. ELISA: A small balloon Experiment for a Large Scale Survey in the Sub-millimeter

    NASA Astrophysics Data System (ADS)

    Bernard, J.-Ph.; Ristorcelli, I.; Stepnik, B.; Abergel, A.; Boulanger, F.; Giard, M.; Lagache, G.; Lamarre, J. M.; Meny, C.; Torre, J. P.; Armengaud, M.; Crussaire, J. P.; Leriche, B.; Longval, Y.

    2002-03-01

    HERSCHEL and the PLANCK space missions to be launched in 2007. The ELISA data will also be usable to help calibrate the observations of HERSCHEL and PLANCK and to plan the large-scale surveys to be undertaken with HERSCHEL. In line with these objectives, three flights of the ELISA experiment, including one from the Southern hemisphere, are foreseen in the period from 2004 to 2006. The ELISA project is carried out by an international collaboration including France (CESR, IAS, CEA, CNES), the Netherlands (SSD/ESTEC), Denmark (DSRI), England (QMW), the USA (JPL/Caltech), and Italy (ASI).

  20. Linking Errors in Trend Estimation in Large-Scale Surveys: A Case Study. Research Report. ETS RR-10-10

    ERIC Educational Resources Information Center

    Xu, Xueli; von Davier, Matthias

    2010-01-01

    One of the major objectives of large-scale educational surveys is reporting trends in academic achievement. For this purpose, a substantial number of items are carried from one assessment cycle to the next. The linking process that places academic abilities measured in different assessments on a common scale is usually based on a concurrent…

  1. Evaluating large-scale health programmes at a district level in resource-limited countries.

    PubMed

    Svoronos, Theodore; Mate, Kedar S

    2011-11-01

    Recent experience in evaluating large-scale global health programmes has highlighted the need to consider contextual differences between sites implementing the same intervention. Traditional randomized controlled trials are ill-suited for this purpose, as they are designed to identify whether an intervention works, not how, when and why it works. In this paper we review several evaluation designs that attempt to account for contextual factors that contribute to intervention effectiveness. Using these designs as a base, we propose a set of principles that may help to capture information on context. Finally, we propose a driver diagram, a tool traditionally used in implementation, that would allow evaluators to systematically monitor changing dynamics in project implementation and identify contextual variation across sites. We describe an implementation-related example from South Africa to underline the strengths of the tool. If used across multiple sites and multiple projects, the resulting driver diagrams could be pooled together to form a generalized theory for how, when and why a widely-used intervention works. Mechanisms similar to the driver diagram are urgently needed to complement existing evaluations of large-scale implementation efforts. PMID:22084529

  2. Evaluating large-scale health programmes at a district level in resource-limited countries.

    PubMed

    Svoronos, Theodore; Mate, Kedar S

    2011-11-01

    Recent experience in evaluating large-scale global health programmes has highlighted the need to consider contextual differences between sites implementing the same intervention. Traditional randomized controlled trials are ill-suited for this purpose, as they are designed to identify whether an intervention works, not how, when and why it works. In this paper we review several evaluation designs that attempt to account for contextual factors that contribute to intervention effectiveness. Using these designs as a base, we propose a set of principles that may help to capture information on context. Finally, we propose a driver diagram, a tool traditionally used in implementation, that would allow evaluators to systematically monitor changing dynamics in project implementation and identify contextual variation across sites. We describe an implementation-related example from South Africa to underline the strengths of the tool. If used across multiple sites and multiple projects, the resulting driver diagrams could be pooled together to form a generalized theory for how, when and why a widely-used intervention works. Mechanisms similar to the driver diagram are urgently needed to complement existing evaluations of large-scale implementation efforts.

  3. Large-scale organizational and managerial change in health care: a review of the literature.

    PubMed

    Ferlie, E

    1997-07-01

    This paper takes an overview of the organizational and managerial literature on recent large-scale change efforts within health care organizations. Such literature refers to issues of enhanced policy significance, as a succession of such changes has swept through health care at an international level. Interpretive and case-study methods have been widely employed in this field. While the literature is emergent, key empirical concerns can be identified: (1) Changing roles and relationships, with the rise of management and the challenge to clinical domination; some argue that radical deprofessionalization is now evident, while others take a more nuanced view. (2) The impact of marketization, with health care becoming more of a commodity; various models of a health care 'quasi market' have been formulated. (3) Understanding the process of change in health care organizations, reflected in the development of a management-of-change literature. New theoretical frameworks have been developed, notably 'the reform cycle' as a way of understanding progressive cycles of organizational reform, the impact on health care of the rise of the new public management, and the examination of the demedicalization thesis through the more generic literature on professions. The paper concludes with a discussion of what this research base could contribute to policy-making. PMID:10180380

  4. Human-Machine Cooperation in Large-Scale Multimedia Retrieval: A Survey

    ERIC Educational Resources Information Center

    Shirahama, Kimiaki; Grzegorzek, Marcin; Indurkhya, Bipin

    2015-01-01

    "Large-Scale Multimedia Retrieval" (LSMR) is the task to fast analyze a large amount of multimedia data like images or videos and accurately find the ones relevant to a certain semantic meaning. Although LSMR has been investigated for more than two decades in the fields of multimedia processing and computer vision, a more…

  5. A survey on routing protocols for large-scale wireless sensor networks.

    PubMed

    Li, Changle; Zhang, Hanxiao; Hao, Binbin; Li, Jiandong

    2011-01-01

    With the advances in micro-electronics, wireless sensor devices have become much smaller and more integrated, and large-scale wireless sensor networks (WSNs) based on the cooperation of large numbers of nodes have become a hot topic. "Large-scale" mainly means a large coverage area or a high node density. Accordingly, routing protocols must scale well as the network's extent and node density increase. A sensor node is normally energy-limited and cannot be recharged, so its energy consumption has a significant effect on the scalability of the protocol. To the best of our knowledge, the mainstream methods currently used to solve the energy problem in large-scale WSNs are hierarchical routing protocols. In a hierarchical routing protocol, all the nodes are divided into several groups with different assignment levels. The nodes at the high level are responsible for data aggregation and management work, and the low-level nodes for sensing their surroundings and collecting information. Hierarchical routing protocols have proved to be more energy-efficient than flat ones, in which all the nodes play the same role, especially in terms of data aggregation and the flooding of control packets. With a focus on the hierarchical structure, in this paper we provide an insight into routing protocols designed specifically for large-scale WSNs. According to their different objectives, the protocols are generally classified based on criteria such as control overhead reduction, energy consumption mitigation and energy balance. In order to give a comprehensive understanding of each protocol, we highlight their innovative ideas, describe the underlying principles in detail and analyze their advantages and disadvantages. Moreover, a comparison of the routing protocols is conducted to demonstrate the differences between them in terms of message complexity, memory requirements, localization, data aggregation, clustering manner and

  6. A Survey on Routing Protocols for Large-Scale Wireless Sensor Networks

    PubMed Central

    Li, Changle; Zhang, Hanxiao; Hao, Binbin; Li, Jiandong

    2011-01-01

    With the advances in micro-electronics, wireless sensor devices have become much smaller and more integrated, and large-scale wireless sensor networks (WSNs) based on the cooperation of large numbers of nodes have become a hot topic. “Large-scale” mainly means a large coverage area or a high node density. Accordingly, routing protocols must scale well as the network's extent and node density increase. A sensor node is normally energy-limited and cannot be recharged, so its energy consumption has a significant effect on the scalability of the protocol. To the best of our knowledge, the mainstream methods currently used to solve the energy problem in large-scale WSNs are hierarchical routing protocols. In a hierarchical routing protocol, all the nodes are divided into several groups with different assignment levels. The nodes at the high level are responsible for data aggregation and management work, and the low-level nodes for sensing their surroundings and collecting information. Hierarchical routing protocols have proved to be more energy-efficient than flat ones, in which all the nodes play the same role, especially in terms of data aggregation and the flooding of control packets. With a focus on the hierarchical structure, in this paper we provide an insight into routing protocols designed specifically for large-scale WSNs. According to their different objectives, the protocols are generally classified based on criteria such as control overhead reduction, energy consumption mitigation and energy balance. In order to give a comprehensive understanding of each protocol, we highlight their innovative ideas, describe the underlying principles in detail and analyze their advantages and disadvantages. Moreover, a comparison of the routing protocols is conducted to demonstrate the differences between them in terms of message complexity, memory requirements, localization, data aggregation, clustering manner
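
    To make the hierarchical idea concrete, here is a simplified sketch of LEACH-style randomized cluster-head rotation, one classic protocol in this family (parameter values are illustrative, and real LEACH adds energy-aware details omitted here):

        import random

        random.seed(7)
        P = 0.05          # target fraction of cluster heads per round
        nodes = [{'id': i, 'last_head': -1} for i in range(100)]

        def elect_cluster_heads(nodes, rnd):
            # One round of LEACH-style rotation: nodes that already served as
            # head in the current epoch of 1/P rounds sit out; the rest
            # volunteer with a probability that rises as the epoch progresses.
            period = int(1 / P)
            epoch_start = rnd - (rnd % period)
            heads = []
            for node in nodes:
                if node['last_head'] >= epoch_start:
                    continue
                threshold = P / (1 - P * (rnd % period))
                if random.random() < threshold:
                    node['last_head'] = rnd
                    heads.append(node['id'])
            return heads

        for rnd in range(5):
            print(f"round {rnd}: {len(elect_cluster_heads(nodes, rnd))} heads")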

  7. Public knowledge and preventive behavior during a large-scale Salmonella outbreak: results from an online survey in the Netherlands

    PubMed Central

    2014-01-01

    Background Food-borne Salmonella infections are a worldwide concern. During a large-scale outbreak, it is important that the public follows preventive advice. To increase compliance, insight into how the public gathers its knowledge and which factors determine whether or not an individual complies with preventive advice is crucial. Methods In 2012, contaminated salmon caused a large Salmonella Thompson outbreak in the Netherlands. During the outbreak, we conducted an online survey (n = 1,057) to assess the general public’s perceptions, knowledge, preventive behavior and sources of information. Results Respondents perceived Salmonella infections and the 2012 outbreak as severe (m = 4.21; five-point scale with 5 as severe). Their knowledge regarding common food sources, the incubation period and regular treatment of Salmonella (gastro-enteritis) was relatively low (e.g., only 28.7% knew that Salmonella is not normally treated with antibiotics). Preventive behavior differed widely, and the majority (64.7%) did not check for contaminated salmon at home. Most information about the outbreak was gathered through traditional media and news and newspaper websites. This was mostly determined by time spent on the medium. Social media played a marginal role. Wikipedia seemed a potentially important source of information. Conclusions To persuade the public to take preventive actions, public health organizations should deliver their message primarily through mass media. Wikipedia seems a promising instrument for educating the public about food-borne Salmonella. PMID:24479614

  8. Child Maltreatment Experience among Primary School Children: A Large Scale Survey in Selangor State, Malaysia

    PubMed Central

    Ahmed, Ayesha; Wan-Yuen, Choo; Marret, Mary Joseph; Guat-Sim, Cheah; Othman, Sajaratulnisah; Chinna, Karuthan

    2015-01-01

    Official reports of child maltreatment in Malaysia have increased persistently throughout the last decade. However, there is a lack of population surveys evaluating the actual burden of child maltreatment, its correlates and its consequences in the country. This cross-sectional study employed two-stage stratified cluster random sampling of public primary schools to survey 3509 ten- to twelve-year-old school children in Selangor state. It aimed to estimate the prevalence of parental physical and emotional maltreatment, parental neglect and teacher-inflicted physical maltreatment. It further aimed to examine the associations between child maltreatment and important socio-demographic factors, family functioning and symptoms of depression among children. Logistic regression on weighted samples was used to extend results to the population level. Three-quarters of 10–12 year olds reported at least one form of maltreatment, with parental physical maltreatment being the most common. Males had higher odds of maltreatment in general, except for emotional maltreatment. Ethnicity and parental conflict were key factors associated with maltreatment. The study contributes important evidence towards improving public health interventions for child maltreatment prevention in the country. PMID:25786214

  9. Child maltreatment experience among primary school children: a large scale survey in Selangor state, Malaysia.

    PubMed

    Ahmed, Ayesha; Wan-Yuen, Choo; Marret, Mary Joseph; Guat-Sim, Cheah; Othman, Sajaratulnisah; Chinna, Karuthan

    2015-01-01

    Official reports of child maltreatment in Malaysia have increased persistently throughout the last decade. However, there is a lack of population surveys evaluating the actual burden of child maltreatment, its correlates and its consequences in the country. This cross-sectional study employed two-stage stratified cluster random sampling of public primary schools to survey 3509 ten- to twelve-year-old school children in Selangor state. It aimed to estimate the prevalence of parental physical and emotional maltreatment, parental neglect and teacher-inflicted physical maltreatment. It further aimed to examine the associations between child maltreatment and important socio-demographic factors, family functioning and symptoms of depression among children. Logistic regression on weighted samples was used to extend results to the population level. Three-quarters of 10-12 year olds reported at least one form of maltreatment, with parental physical maltreatment being the most common. Males had higher odds of maltreatment in general, except for emotional maltreatment. Ethnicity and parental conflict were key factors associated with maltreatment. The study contributes important evidence towards improving public health interventions for child maltreatment prevention in the country. PMID:25786214

  10. Child maltreatment experience among primary school children: a large scale survey in Selangor state, Malaysia.

    PubMed

    Ahmed, Ayesha; Wan-Yuen, Choo; Marret, Mary Joseph; Guat-Sim, Cheah; Othman, Sajaratulnisah; Chinna, Karuthan

    2015-01-01

    Official reports of child maltreatment in Malaysia have increased persistently throughout the last decade. However, there is a lack of population surveys evaluating the actual burden of child maltreatment, its correlates and its consequences in the country. This cross-sectional study employed two-stage stratified cluster random sampling of public primary schools to survey 3509 ten- to twelve-year-old school children in Selangor state. It aimed to estimate the prevalence of parental physical and emotional maltreatment, parental neglect and teacher-inflicted physical maltreatment. It further aimed to examine the associations between child maltreatment and important socio-demographic factors, family functioning and symptoms of depression among children. Logistic regression on weighted samples was used to extend results to the population level. Three-quarters of 10-12 year olds reported at least one form of maltreatment, with parental physical maltreatment being the most common. Males had higher odds of maltreatment in general, except for emotional maltreatment. Ethnicity and parental conflict were key factors associated with maltreatment. The study contributes important evidence towards improving public health interventions for child maltreatment prevention in the country.

  11. The Muenster Red Sky Survey: Large-scale structures in the universe

    NASA Astrophysics Data System (ADS)

    Ungruhe, R.; Seitter, W. C.; Duerbeck, H. W.

    2003-01-01

    We present a large-scale galaxy catalogue for the red spectral region which covers an area of 5,000 square degrees. It contains positions, red magnitudes, radii, ellipticities and position angles of about 5.5 million galaxies. Together with the APM catalogue (4,300 square degrees) in the blue spectral region, this catalogue forms at present the largest coherent database for cosmological investigations in the southern hemisphere. 217 ESO Southern Sky Atlas R Schmidt plates with galactic latitudes below -45 degrees were digitized with the two PDS microdensitometers of the Astronomisches Institut Münster, with a step width of 15 microns, corresponding to 1.01 arcseconds per pixel. All data were stored on different storage media and are available for further investigations. Suitable search parameters had to be chosen in such a way that all objects are found on the plates, while the percentage of artificial objects remains as low as possible. Based on two reference areas on different plates, a search threshold of 140 PDS density units and a minimum number of four pixels per object were chosen. The detected objects were stored, according to size, in frames of different side length. Each object was investigated in its frame, and 18 object parameters were determined. The classification of objects into stars, galaxies and perturbed objects was done with an automatic procedure which makes use of combinations of computed object parameters. In the first step, the perturbed objects are removed from the catalogue. Double objects and noise objects can be excluded on the basis of symmetry properties, while for satellite trails a new classification criterion, based on apparent magnitude, effective radius and apparent ellipticity, was developed. For the remaining objects, a star/galaxy separation was carried out. For bright objects, the relation between apparent magnitude and effective radius serves as the discriminating property; for fainter objects, the relation between effective
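
    The magnitude-radius star/galaxy separation sketched in this abstract can be illustrated on synthetic detections (the locus fit below uses a labelled calibration subsample; the survey's actual procedure was automatic and multi-parameter):

        import numpy as np

        rng = np.random.default_rng(4)
        # Synthetic detections: stars on a tight magnitude-radius locus,
        # galaxies scattered to larger radii at fixed magnitude.
        n_star, n_gal = 1000, 400
        mag = np.concatenate([rng.uniform(14, 20, n_star),
                              rng.uniform(14, 20, n_gal)])
        r_star = 2.0 + 0.1 * (20 - mag[:n_star]) + rng.normal(0, 0.05, n_star)
        r_gal = 3.0 + 0.4 * (20 - mag[n_star:]) + rng.normal(0, 0.5, n_gal)
        radius = np.concatenate([r_star, r_gal])

        # Fit the stellar locus on the calibration stars, then classify objects
        # above the locus (plus a margin) as galaxies.
        slope, intercept = np.polyfit(mag[:n_star], r_star, 1)
        is_galaxy = radius > slope * mag + intercept + 0.3
        print(f"{is_galaxy.sum()} of {len(mag)} objects classified as galaxies")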

  12. Investigating genetic discrimination in Australia: a large-scale survey of clinical genetics clients.

    PubMed

    Taylor, S; Treloar, S; Barlow-Stewart, K; Stranger, M; Otlowski, M

    2008-07-01

    We report first results from the Australian Genetic Discrimination Project of clinical genetics services clients' perceptions and experiences regarding alleged differential treatment associated with having genetic information. Adults (n = 2667) who had presented from 1998 to 2003 regarding predictive or presymptomatic testing for designated mature-onset conditions were surveyed; 951/1185 respondents met inclusion criteria for current asymptomatic status. Neurological conditions and familial cancers were primary relevant conditions for 87% of asymptomatic respondents. Specific incidents of alleged negative treatment, reported by 10% (n = 93) of respondents, occurred in life insurance (42%), employment (5%), family (22%), social (11%) and health (20%) domains. Respondents where neuro-degenerative conditions were relevant were more likely overall to report incidents and significantly more likely to report incidents in the social domain. Most incidents in the post-test period occurred in the first year after testing. Only 15% of respondents knew where to complain officially if treated negatively because of genetics issues. Recommendations include the need for increased community and clinical education regarding genetic discrimination, for extended clinical genetics sector engagement and for co-ordinated monitoring, research and policy development at national levels in order for the full benefits of genetic testing technology to be realised. PMID:18492091

  13. Estimation of Large-Scale Organ Motion in B-Mode Ultrasound Image Sequences: A Survey.

    PubMed

    De Luca, Valeria; Székely, Gábor; Tanner, Christine

    2015-12-01

    Reviewed here are methods developed for following (i.e., tracking) structures in medical B-mode ultrasound time sequences during large-scale motion. The resulting motion estimation problem and its key components are defined. The main tracking approaches are described, and their strengths and weaknesses are discussed. Existing motion estimation methods, tested on multiple in vivo sequences, are categorized with respect to their clinical applications, namely, cardiac, respiratory and muscular motion. A large number of works in this field had to be discarded as thorough validation of the results was missing. The remaining relevant works identified indicate the possibility of reaching an average tracking accuracy up to 1-2 mm. Real-time performance can be achieved using several methods. Yet only very few of these have progressed to clinical practice. The latest trends include incorporation of complementary and prior information. Advances are expected from common evaluation databases and 4-D ultrasound scanning technologies.

  14. Multi-stage sampling for large scale natural resources surveys: A case study of rice and waterfowl

    USGS Publications Warehouse

    Stafford, J.D.; Reinecke, K.J.; Kaminski, R.M.; Gerard, P.D.

    2005-01-01

    Large-scale sample surveys to estimate abundance and distribution of organisms and their habitats are increasingly important in ecological studies. Multi-stage sampling (MSS) is especially suited to large-scale surveys because of the natural clustering of resources. To illustrate an application, we: (1) designed a stratified MSS to estimate late autumn abundance (kg/ha) of rice seeds in harvested fields as food for waterfowl wintering in the Mississippi Alluvial Valley (MAV); (2) investigated options for improving the MSS design; and (3) compared statistical and cost efficiency of MSS to simulated simple random sampling (SRS). During 2000-2002, we sampled 25-35 landowners per year, 1 or 2 fields per landowner per year, and measured seed mass in 10 soil cores collected within each field. Analysis of variance components and costs for each stage of the survey design indicated that collecting 10 soil cores per field was near the optimum of 11-15, whereas sampling >1 field per landowner provided few benefits because data from fields within landowners were highly correlated. Coefficients of variation (CV) of annual estimates of rice abundance ranged from 0.23 to 0.31 and were limited by variation among landowners and the number of landowners sampled. Design effects representing the statistical efficiency of MSS relative to SRS ranged from 3.2 to 9.0, and simulations indicated SRS would cost, on average, 1.4 times more than MSS because clustering of sample units in MSS decreased travel costs. We recommend MSS as a potential sampling strategy for large-scale natural resource surveys and specifically for future surveys of the availability of rice as food for waterfowl in the MAV and similar areas.
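
    The variance-component arithmetic behind such design effects can be sketched directly; the toy below mimics the "cores within landowners" structure with invented numbers:

        import numpy as np

        rng = np.random.default_rng(5)
        # Toy two-stage design: 30 landowners (primary units), 10 cores each.
        n_psu, m = 30, 10
        true_means = rng.normal(50.0, 15.0, n_psu)   # between-landowner spread
        y = true_means[:, None] + rng.normal(0.0, 10.0, (n_psu, m))

        cluster_means = y.mean(axis=1)
        msb = m * cluster_means.var(ddof=1)          # between-cluster mean square
        msw = y.var(axis=1, ddof=1).mean()           # within-cluster mean square
        rho = (msb - msw) / (msb + (m - 1) * msw)    # intraclass correlation

        deff = 1 + (m - 1) * rho                     # design effect vs. SRS
        print(f"rho = {rho:.2f}, design effect = {deff:.1f}")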

  15. Assessing the Hypothesis of Measurement Invariance in the Context of Large-Scale International Surveys

    ERIC Educational Resources Information Center

    Rutkowski, Leslie; Svetina, Dubravka

    2014-01-01

    In the field of international educational surveys, equivalence of achievement scale scores across countries has received substantial attention in the academic literature; however, only a relatively recent emphasis on scale score equivalence in nonachievement education surveys has emerged. Given the current state of research in multiple-group…

  16. A Survey of Residents' Perceptions of the Effect of Large-Scale Economic Developments on Perceived Safety, Violence, and Economic Benefits

    PubMed Central

    Fabio, Anthony; Geller, Ruth; Bazaco, Michael; Bear, Todd M.; Foulds, Abigail L.; Duell, Jessica; Sharma, Ravi

    2015-01-01

    Background. Emerging research highlights the promise of community- and policy-level strategies in preventing youth violence. Large-scale economic developments, such as sports and entertainment arenas and casinos, may improve the living conditions, economics, public health, and overall wellbeing of area residents and may influence rates of violence within communities. Objective. To assess the effect of community economic development efforts on neighborhood residents' perceptions of violence, safety, and economic benefits. Methods. Telephone survey in 2011 using a listed sample of randomly selected numbers in six Pittsburgh neighborhoods. Descriptive analyses examined measures of perceived violence and safety and economic benefit. Responses were compared across neighborhoods using chi-square tests for multiple comparisons. Survey results were compared to census and police data. Results. Residents in neighborhoods with the large-scale economic developments reported more casino-specific and arena-specific economic benefits. However, 42% of participants in the neighborhood with the entertainment arena felt there was an increase in crime, and 29% of respondents from the neighborhood with the casino felt there was an increase. In contrast, crime decreased in both neighborhoods. Conclusions. Large-scale economic developments directly influence residents' perceptions of violence, and these perceptions can diverge from actual violence rates. PMID:26273310

  17. A Strong-Lens Survey in AEGIS: the Influence of Large Scale Structure

    SciTech Connect

    Moustakas, Leonidas A.; Marshall, Phil J.; Newman, Jeffrey A.; Coil, Alison L.; Cooper, Michael C.; Davis, Marc; Fassnacht, Christopher D.; Guhathakurta, Puragra; Hopkins, Andrew; Koekemoer, Anton; Konidaris, Nicholas P.; Lotz, Jennifer M.; Willmer, Christopher N.A.

    2006-07-14

    We report on the results of a visual search for galaxy-scale strong gravitational lenses over 650 arcmin^2 of HST/ACS imaging in the Extended Groth Strip (EGS). These deep F606W- and F814W-band observations are in the DEEP2-EGS field. In addition to a previously-known Einstein Cross also found by our search (the "Cross", HSTJ141735+52264, with z_lens = 0.8106 and a published z_source = 3.40), we identify two new strong galaxy-galaxy lenses with multiple extended arcs. The first, HSTJ141820+52361 (the "Dewdrop"; z_lens = 0.5798), lenses two distinct extended sources into two pairs of arcs (z_source = 0.9818 by nebular [O II] emission), while the second, HSTJ141833+52435 (the "Anchor"; z_lens = 0.4625), produces a single pair of arcs (source redshift not yet known). Four less convincing arc/counter-arc and two-image lens candidates are also found and presented for completeness. All three definite lenses are fit reasonably well by simple singular isothermal ellipsoid models including external shear, giving χ²_ν values close to unity. Using the three-dimensional line-of-sight (LOS) information on galaxies from the DEEP2 data, we calculate the convergence and shear contributions κ_los and γ_los to each lens, assuming singular isothermal sphere halos truncated at 200 h^-1 kpc. These are compared against a robust measure of local environment, δ_3, a normalized density that uses the distance to the third nearest neighbor. We find that even strong lenses in demonstrably underdense local environments may be considerably affected by LOS contributions, which in turn, under the adopted assumptions, may be underestimates of the effect of large scale structure.
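
    The line-of-sight convergence bookkeeping can be made concrete for untruncated singular isothermal spheres (the paper truncates halos at 200 h^-1 kpc, which this toy omits; the offsets, dispersions and redshifts below are invented):

        import numpy as np
        from astropy.cosmology import FlatLambdaCDM

        cosmo = FlatLambdaCDM(H0=70, Om0=0.3)
        C_KMS = 299792.458

        def kappa_sis(sigma_v, z_halo, z_src, theta_arcsec):
            # SIS convergence kappa(theta) = theta_E / (2 theta), with
            # theta_E = 4 pi (sigma_v / c)^2 * D_ls / D_s.
            if z_halo >= z_src:
                return 0.0
            d_ls = cosmo.angular_diameter_distance_z1z2(z_halo, z_src).value
            d_s = cosmo.angular_diameter_distance(z_src).value
            theta_e_rad = 4.0 * np.pi * (sigma_v / C_KMS) ** 2 * d_ls / d_s
            theta_e_arcsec = np.degrees(theta_e_rad) * 3600.0
            return theta_e_arcsec / (2.0 * theta_arcsec)

        # Invented line-of-sight galaxies near a z_src = 3.40 source:
        # (offset in arcsec, velocity dispersion in km/s, redshift).
        los = [(30.0, 150.0, 0.4), (55.0, 200.0, 0.7), (80.0, 120.0, 1.1)]
        kappa_los = sum(kappa_sis(sv, zh, 3.40, th) for th, sv, zh in los)
        print(f"kappa_los ~ {kappa_los:.4f}")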

  18. Testing LSST Dither Strategies for Survey Uniformity and Large-scale Structure Systematics

    NASA Astrophysics Data System (ADS)

    Awan, Humna; Gawiser, Eric; Kurczynski, Peter; Jones, R. Lynne; Zhan, Hu; Padilla, Nelson D.; Muñoz Arancibia, Alejandra M.; Orsi, Alvaro; Cora, Sofía A.; Yoachim, Peter

    2016-09-01

    The Large Synoptic Survey Telescope (LSST) will survey the southern sky from 2022-2032 with unprecedented detail. Since the observing strategy can lead to artifacts in the data, we investigate the effects of telescope-pointing offsets (called dithers) on the r-band coadded 5σ depth yielded after the 10-year survey. We analyze this survey depth for several geometric patterns of dithers (e.g., random, hexagonal lattice, spiral) with amplitudes as large as the radius of the LSST field of view, implemented on different timescales (per season, per night, per visit). Our results illustrate that per night and per visit dither assignments are more effective than per season assignments. Also, we find that some dither geometries (e.g., hexagonal lattice) are particularly sensitive to the timescale on which the dithers are implemented, while others like random dithers perform well on all timescales. We then model the propagation of depth variations to artificial fluctuations in galaxy counts, which are a systematic for LSS studies. We calculate the bias in galaxy counts caused by the observing strategy accounting for photometric calibration uncertainties, dust extinction, and magnitude cuts; uncertainties in this bias limit our ability to account for structure induced by the observing strategy. We find that after 10 years of the LSST survey, the best dither strategies lead to uncertainties in this bias that are smaller than the minimum statistical floor for a galaxy catalog as deep as r < 27.5. A few of these strategies bring the uncertainties close to the statistical floor for r < 25.7 after the first year of survey.
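
    Generating candidate dither offsets of the kinds compared above is straightforward; here is a sketch for random and hexagonal-lattice patterns within the field-of-view radius (the 1.75-degree radius and 0.5-degree lattice spacing are assumptions of this toy):

        import numpy as np

        rng = np.random.default_rng(6)
        FOV_RADIUS = 1.75          # field-of-view radius in degrees (assumed)

        def random_dithers(n):
            # n offsets distributed uniformly over the field-of-view disc;
            # the sqrt gives uniform areal density.
            r = FOV_RADIUS * np.sqrt(rng.uniform(0.0, 1.0, n))
            theta = rng.uniform(0.0, 2.0 * np.pi, n)
            return np.column_stack([r * np.cos(theta), r * np.sin(theta)])

        def hex_lattice_dithers(spacing=0.5):
            # Hexagonal-lattice offsets clipped to the field-of-view disc.
            pts = np.array([(x + 0.5 * spacing * (j % 2),
                             j * spacing * np.sqrt(3) / 2)
                            for j in range(-7, 8)
                            for x in np.arange(-4.0, 4.0, spacing)])
            return pts[np.hypot(pts[:, 0], pts[:, 1]) <= FOV_RADIUS]

        print(np.round(random_dithers(5), 2))
        print(f"{len(hex_lattice_dithers())} hex-lattice offsets inside the FOV")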

  19. An Analysis of Rich Cluster Redshift Survey Data for Large Scale Structure Studies

    NASA Astrophysics Data System (ADS)

    Slinglend, K.; Batuski, D.; Haase, S.; Hill, J.

    1994-12-01

    The results from the COBE satellite show the existence of structure on scales of the order of 10% or more of the horizon scale of the universe. Rich clusters of galaxies from Abell's catalog show evidence of structure on scales of 100 Mpc and may hold the promise of confirming structure on the scale of the COBE result. However, many Abell clusters have zero or only one measured redshift, so present knowledge of their three-dimensional distribution has quite large uncertainties. The shortage of measured redshifts for these clusters may also mask a problem of projection effects corrupting the membership counts for the clusters. Our approach in this effort has been to use the MX multifiber spectrometer on the Steward 2.3m to measure redshifts of at least ten galaxies in each of 80 Abell cluster fields with richness class R >= 1 and mag10 <= 16.8 (estimated z <= 0.12) and zero or one measured redshifts. This work will result in a deeper, more complete (and reliable) sample of positions of rich clusters. Our primary intent for the sample is for two-point correlation and other studies of the large-scale structure traced by these clusters, in an effort to constrain theoretical models for structure formation. We are also obtaining enough redshifts per cluster so that a much better sample of reliable cluster velocity dispersions will be available for other studies of cluster properties. To date, we have collected such data for 64 clusters, and for most of them we have seven or more cluster members with redshifts, allowing for reliable velocity dispersion calculations. Velocity histograms and stripe density plots for several interesting cluster fields are presented, along with summary tables of cluster redshift results. Also, with 10 or more redshifts in most of our cluster fields (30 arcmin square, just about an 'Abell diameter' at z ~ 0.1), we have investigated the extent of projection effects within the Abell catalog in an effort to quantify and understand how this may affect

  20. Nonparametric Bayesian Multiple Imputation for Incomplete Categorical Variables in Large-Scale Assessment Surveys

    ERIC Educational Resources Information Center

    Si, Yajuan; Reiter, Jerome P.

    2013-01-01

    In many surveys, the data comprise a large number of categorical variables that suffer from item nonresponse. Standard methods for multiple imputation, like log-linear models or sequential regression imputation, can fail to capture complex dependencies and can be difficult to implement effectively in high dimensions. We present a fully Bayesian,…

  1. GLOBAL CLIMATE AND LARGE-SCALE INFLUENCES ON AQUATIC ANIMAL HEALTH

    EPA Science Inventory

    The last 3 decades have witnessed numerous large-scale mortality events of aquatic organisms in North America. Affected species range from ecologically-important sea urchins to commercially-valuable American lobsters and protected marine mammals. Short-term forensic investigation...

  2. The Muenster Red Sky Survey: Large-scale structures in the universe

    NASA Astrophysics Data System (ADS)

    Ungruhe, R.; Seitter, W. C.; Duerbeck, H. W.

    2003-01-01

    We present a large-scale galaxy catalogue for the red spectral region which covers an area of 5,000 square degrees. It contains positions, red magnitudes, radii, ellipticities and position angles of about 5.5 million galaxies. Together with the APM catalogue (4,300 square degrees) in the blue spectral region, this catalogue forms at present the largest coherent database for cosmological investigations in the southern hemisphere. 217 ESO Southern Sky Atlas R Schmidt plates with galactic latitudes south of -45 degrees were digitized with the two PDS microdensitometers of the Astronomisches Institut Münster, with a step width of 15 microns, corresponding to 1.01 arcseconds per pixel. All data were stored on different storage media and are available for further investigations. Suitable search parameters must be chosen in such a way that all objects are found on the plates, and that the percentage of artificial objects remains as low as possible. Based on two reference areas on different plates, a search threshold of 140 PDS density units and a minimum number of four pixels per object were chosen. The detected objects were stored, according to size, in frames of different side lengths. Each object was investigated in its frame, and 18 object parameters were determined. The classification of objects into stars, galaxies and perturbed objects was done with an automatic procedure which makes use of combinations of computed object parameters. In the first step, the perturbed objects are removed from the catalogue. Double objects and noise objects can be excluded on the basis of symmetry properties, while for satellite trails, a new classification criterion based on apparent magnitude, effective radius and apparent ellipticity was developed. For the remaining objects, a star/galaxy separation was carried out. For bright objects, the relation between apparent magnitude and effective radius serves as the discriminating property, for fainter objects, the relation between effective
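    For the bright-object star/galaxy separation described above (apparent magnitude versus effective radius), a hedged sketch: trace the stellar locus in magnitude bins and flag objects lying well above it. The binning, the MAD-based scatter estimate, and the 3σ cut are illustrative choices, not the catalogue's calibrated criteria.

```python
import numpy as np

def classify_star_galaxy(mag, r_eff, n_bins=20, n_sigma=3.0):
    """Flag objects as galaxies when their effective radius sits well above
    the (PSF-dominated) stellar locus traced in apparent-magnitude bins."""
    mag, r_eff = np.asarray(mag), np.asarray(r_eff)
    edges = np.linspace(mag.min(), mag.max(), n_bins + 1)
    idx = np.clip(np.digitize(mag, edges) - 1, 0, n_bins - 1)
    is_galaxy = np.zeros(mag.size, dtype=bool)
    for b in range(n_bins):
        sel = idx == b
        if sel.sum() < 10:                      # too few objects to trace the locus
            continue
        locus = np.median(r_eff[sel])           # assumes stars dominate the bin
        scatter = 1.4826 * np.median(np.abs(r_eff[sel] - locus))  # robust sigma (MAD)
        is_galaxy[sel] = r_eff[sel] > locus + n_sigma * scatter
    return is_galaxy
```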

  3. A Spatio-Temporally Explicit Random Encounter Model for Large-Scale Population Surveys

    PubMed Central

    Jousimo, Jussi; Ovaskainen, Otso

    2016-01-01

    Random encounter models can be used to estimate population abundance from indirect data collected by non-invasive sampling methods, such as track counts or camera-trap data. The classical Formozov–Malyshev–Pereleshin (FMP) estimator converts track counts into an estimate of mean population density, assuming that data on the daily movement distances of the animals are available. We utilize generalized linear models with spatio-temporal error structures to extend the FMP estimator into a flexible Bayesian modelling approach that estimates not only total population size, but also spatio-temporal variation in population density. We also introduce a weighting scheme to estimate density on habitats that are not covered by survey transects, assuming that movement data on a subset of individuals is available. We test the performance of spatio-temporal and temporal approaches by a simulation study mimicking the Finnish winter track count survey. The results illustrate how the spatio-temporal modelling approach is able to borrow information from observations made on neighboring locations and times when estimating population density, and that spatio-temporal and temporal smoothing models can provide improved estimates of total population size compared to the FMP method. PMID:27611683
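    The classical FMP estimator mentioned above is commonly written as D = (π/2)·x/(S·d), where x is the number of track crossings counted along S kilometres of transect and d is the species' mean daily movement distance. A minimal sketch (our naming; this hedged form should be checked against the original references):

```python
import math

def fmp_density(n_crossings, transect_km, daily_movement_km):
    """Formozov-Malyshev-Pereleshin estimator: D = (pi/2) * x / (S * d)."""
    return (math.pi / 2.0) * n_crossings / (transect_km * daily_movement_km)

# e.g. 24 crossings on 30 km of transect for a species moving ~6 km/day:
# fmp_density(24, 30, 6) -> ~0.21 animals per square kilometre
```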

  5. A satellite geodetic survey of large-scale deformation of volcanic centres in the central Andes.

    PubMed

    Pritchard, Matthew E; Simons, Mark

    2002-07-11

    Surface deformation in volcanic areas usually indicates movement of magma or hydrothermal fluids at depth. Stratovolcanoes tend to exhibit a complex relationship between deformation and eruptive behaviour. The characteristically long time spans between such eruptions require a long time series of observations to determine whether deformation without an eruption is common at a given edifice. Such studies, however, are logistically difficult to carry out in most volcanic arcs, as these tend to be remote regions with large numbers of volcanoes (hundreds to even thousands). Here we present a satellite-based interferometric synthetic aperture radar (InSAR) survey of the remote central Andes volcanic arc, a region formed by subduction of the Nazca oceanic plate beneath continental South America. Spanning the years 1992 to 2000, our survey reveals the background level of activity of about 900 volcanoes, 50 of which have been classified as potentially active. We find four centres of broad (tens of kilometres wide), roughly axisymmetric surface deformation. None of these centres are at volcanoes currently classified as potentially active, although two lie within about 10 km of volcanoes with known activity. Source depths inferred from the patterns of deformation lie between 5 and 17 km. In contrast to the four new sources found, we do not observe any deformation associated with recent eruptions of Lascar, Chile. PMID:12110886

  6. Taming of the Slew: Optimization of the Large Scale X-Ray Surveys with Observing Strategy

    NASA Technical Reports Server (NTRS)

    Ptak, Andrew

    2010-01-01

    We will discuss simulations intended to address the relative efficiency of observing large areas with a slew observing strategy as opposed to pointing at fields individually. We will emphasize observing with the Wide Field X-ray Telescope (WFXT) but will also discuss optimization of observing strategy with the IXO Wide-Field Imager (WFI) and eROSITA. The slew survey simulation is being implemented by translating the pointing direction along an arbitrary path, which addresses the impact of smoothing the telescope response during a given slew. The simulation software is also being designed to allow the visibility of the sky to be incorporated, in which case long-term observing plans could be developed to optimize the total sky coverage at a given depth and spatial resolution.
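    As a toy version of the simulation described above, the following sketch computes the exposure map of a constant-rate slew along one axis: a sky pixel is exposed for as long as the field of view takes to cross it. All parameters are illustrative, and the smoothing of the telescope response across the detector is ignored.

```python
import numpy as np

def slew_exposure(strip_deg=20.0, fov_deg=1.0, slew_rate_deg_s=0.05, pix_deg=0.01):
    """Exposure (seconds) per sky pixel for one constant-rate slew over a strip."""
    n_pix = int(strip_deg / pix_deg)
    centers = (np.arange(n_pix) + 0.5) * pix_deg
    exposure = np.zeros(n_pix)
    # a pixel at position c is exposed while |c - pointing| < fov/2,
    # with the pointing centre travelling from 0 to strip_deg
    t_cross = fov_deg / slew_rate_deg_s
    interior = (centers > fov_deg / 2) & (centers < strip_deg - fov_deg / 2)
    exposure[interior] = t_cross
    # pixels near the strip ends see the field of view for only part of that time
    overlap = np.minimum(centers, strip_deg - centers) + fov_deg / 2
    exposure[~interior] = np.clip(overlap[~interior], 0.0, fov_deg) / slew_rate_deg_s
    return centers, exposure

centers, exposure = slew_exposure()
```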

  7. Studying Displacement After a Disaster Using Large Scale Survey Methods: Sumatra After the 2004 Tsunami

    PubMed Central

    Gray, Clark; Frankenberg, Elizabeth; Gillespie, Thomas; Sumantri, Cecep; Thomas, Duncan

    2014-01-01

    Understanding of human vulnerability to environmental change has advanced in recent years, but measuring vulnerability and interpreting mobility across many sites differentially affected by change remains a significant challenge. Drawing on longitudinal data collected on the same respondents who were living in coastal areas of Indonesia before the 2004 Indian Ocean tsunami and were re-interviewed after the tsunami, this paper illustrates how the combination of population-based survey methods, satellite imagery and multivariate statistical analyses has the potential to provide new insights into vulnerability, mobility and impacts of major disasters on population well-being. The data are used to map and analyze vulnerability to post-tsunami displacement across the provinces of Aceh and North Sumatra and to compare patterns of migration after the tsunami between damaged areas and areas not directly affected by the tsunami. The comparison reveals that migration after a disaster is less selective overall than migration in other contexts. Gender and age, for example, are strong predictors of moving from undamaged areas but are not related to displacement in areas experiencing damage. In our analyses traditional predictors of vulnerability do not always operate in expected directions. Low levels of socioeconomic status and education were not predictive of moving after the tsunami, although for those who did move, they were predictive of displacement to a camp rather than a private home. This survey-based approach, though not without difficulties, is broadly applicable to many topics in human-environment research, and potentially opens the door to rigorous testing of new hypotheses in this literature. PMID:24839300

  9. Large-scale survey to describe acne management in Brazilian clinical practice

    PubMed Central

    Seité, Sophie; Caixeta, Clarice; Towersey, Loan

    2015-01-01

    Background Acne is a chronic disease of the pilosebaceous unit that mainly affects adolescents. It is the most common dermatological problem, affecting approximately 80% of teenagers between 12 and 18 years of age. Diagnosis is clinical and is based on the patient's age at the time the lesions first appear, and on the polymorphism, type, and anatomical location of the lesions. The right treatment for the right patient is key to treating acne safely. The aim of this investigational survey was to evaluate how Brazilian dermatologists in private practice currently manage acne. Materials and methods Dermatologists practicing in 12 states of Brazil were asked how they manage patients with grades I, II, III, and IV acne. Each dermatologist completed a written questionnaire about patient characteristics, acne severity, and the therapy they usually prescribe for each situation. Results In total, 596 dermatologists were interviewed. Adolescents were the most common acne population seen by the dermatologists, and the most common acne grade was grade II. The doctors could choose more than one type of treatment for each patient, and treatment choices varied according to acne severity. A great majority of dermatologists considered treatment with drugs as the first alternative for all acne grades, choosing either topical or oral presentation depending on the severity of the pathology. Dermocosmetics were chosen mostly as adjunctive therapy, and their inclusion in the treatment regimen decreased as acne grade increased. Conclusion This survey illustrates that Brazilian dermatologists employ complex treatment regimens to manage acne, choosing systemic drugs, particularly isotretinoin, even in some cases of grade I acne, and heavily prescribing antibiotics. Because complex regimens are harder for patients to comply with, this result notably raises the question of adherence, which is a key factor in successful treatment. PMID:26609243

  10. Climate, Water, and Human Health: Large Scale Hydroclimatic Controls in Forecasting Cholera Epidemics

    NASA Astrophysics Data System (ADS)

    Akanda, A. S.; Jutla, A. S.; Islam, S.

    2009-12-01

    Despite cholera having ravaged the continents through seven global pandemics in past centuries, the seasonal and interannual variability of cholera outbreaks remains a mystery. Previous studies have focused on the role of various environmental and climatic factors, but provided little or no predictive capability. Recent findings suggest a more prominent role of large-scale hydroclimatic extremes - droughts and floods - and attempt to explain the seasonality and the unique dual cholera peaks in the Bengal Delta region of South Asia. We investigate the seasonal and interannual nature of cholera epidemiology in three geographically distinct locations within the region to identify the larger-scale hydroclimatic controls that can set the ecological and environmental ‘stage’ for outbreaks and have significant memory on a seasonal scale. Here we show that two distinctly different, pre- and post-monsoon, cholera transmission mechanisms related to large-scale climatic controls prevail in the region. An implication of our findings is that extreme climatic events such as prolonged droughts, record floods, and major cyclones may cause major disruption in the ecosystem and trigger large epidemics. We postulate that a quantitative understanding of the large-scale hydroclimatic controls and dominant processes with significant system memory will form the basis for forecasting such epidemic outbreaks. A multivariate regression method using these predictor variables to develop probabilistic forecasts of cholera outbreaks will be explored. Forecasts from such a system with a seasonal lead time are likely to have a measurable impact on early cholera detection and prevention efforts in endemic regions.
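    The forecasting approach proposed above can be prototyped as a regression of outbreak occurrence on seasonal hydroclimatic predictors. A minimal sketch on synthetic data, with hypothetical predictors (pre-monsoon drought index, peak-discharge flood proxy, and prior-season incidence); none of the numbers come from the abstract.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
# 60 synthetic seasons x 3 standardized predictors
X = rng.normal(size=(60, 3))
# outbreak indicator generated from a known linear signal (for illustration only)
y = ((X @ np.array([0.9, 1.2, 0.6]) + rng.normal(scale=0.8, size=60)) > 0.5).astype(int)

model = LogisticRegression().fit(X[:-4], y[:-4])   # train on earlier seasons
p_outbreak = model.predict_proba(X[-4:])[:, 1]     # probabilistic forecasts
print(np.round(p_outbreak, 2))
```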

  11. Implementing large-scale workforce change: learning from 55 pilot sites of allied health workforce redesign in Queensland, Australia

    PubMed Central

    2013-01-01

    Background Increasingly, health workforces are undergoing high-level ‘re-engineering’ to help them better meet the needs of the population, workforce and service delivery. Queensland Health implemented a large scale 5-year workforce redesign program across more than 13 health-care disciplines. This study synthesized the findings from this program to identify and codify mechanisms associated with successful workforce redesign to help inform other large workforce projects. Methods This study used Inductive Logic Reasoning (ILR), a process that uses logic models as the primary functional tool to develop theories of change, which are subsequently validated through proposition testing. Initial theories of change were developed from a systematic review of the literature and synthesized using a logic model. These theories of change were then developed into propositions and subsequently tested empirically against documentary, interview, and survey data from 55 projects in the workforce redesign program. Results Three overarching principles were identified that optimized successful workforce redesign: (1) drivers for change need to be close to practice; (2) contexts need to be supportive both at the local levels and legislatively; and (3) mechanisms should include appropriate engagement, resources to facilitate change management, governance, and support structures. Attendance to these factors was uniformly associated with success of individual projects. Conclusions ILR is a transparent and reproducible method for developing and testing theories of workforce change. Despite the heterogeneity of projects, professions, and approaches used, a consistent set of overarching principles underpinned success of workforce change interventions. These concepts have been operationalized into a workforce change checklist. PMID:24330616

  14. SDSS-III Baryon Oscillation Spectroscopic Survey Data Release 12: galaxy target selection and large-scale structure catalogues

    NASA Astrophysics Data System (ADS)

    Reid, Beth; Ho, Shirley; Padmanabhan, Nikhil; Percival, Will J.; Tinker, Jeremy; Tojeiro, Rita; White, Martin; Eisenstein, Daniel J.; Maraston, Claudia; Ross, Ashley J.; Sánchez, Ariel G.; Schlegel, David; Sheldon, Erin; Strauss, Michael A.; Thomas, Daniel; Wake, David; Beutler, Florian; Bizyaev, Dmitry; Bolton, Adam S.; Brownstein, Joel R.; Chuang, Chia-Hsun; Dawson, Kyle; Harding, Paul; Kitaura, Francisco-Shu; Leauthaud, Alexie; Masters, Karen; McBride, Cameron K.; More, Surhud; Olmstead, Matthew D.; Oravetz, Daniel; Nuza, Sebastián E.; Pan, Kaike; Parejko, John; Pforr, Janine; Prada, Francisco; Rodríguez-Torres, Sergio; Salazar-Albornoz, Salvador; Samushia, Lado; Schneider, Donald P.; Scóccola, Claudia G.; Simmons, Audrey; Vargas-Magana, Mariana

    2016-01-01

    The Baryon Oscillation Spectroscopic Survey (BOSS), part of the Sloan Digital Sky Survey (SDSS) III project, has provided the largest survey of galaxy redshifts available to date, in terms of both the number of galaxy redshifts measured by a single survey, and the effective cosmological volume covered. Key to analysing the clustering of these data to provide cosmological measurements is understanding the detailed properties of this sample. Potential issues include variations in the target catalogue caused by changes either in the targeting algorithm or properties of the data used, the pattern of spectroscopic observations, the spatial distribution of targets for which redshifts were not obtained, and variations in the target sky density due to observational systematics. We document here the target selection algorithms used to create the galaxy samples that comprise BOSS. We also present the algorithms used to create large-scale structure catalogues for the final Data Release (DR12) samples and the associated random catalogues that quantify the survey mask. The algorithms are an evolution of those used by the BOSS team to construct catalogues from earlier data, and have been designed to accurately quantify the galaxy sample. The code used, designated MKSAMPLE, is released with this paper.
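    Random catalogues quantifying a survey mask, as described above, are typically built by drawing points uniformly on the sphere and thinning them by the local completeness. A minimal sketch of that idea; the completeness function here is a toy stand-in, not the BOSS mask.

```python
import numpy as np

def uniform_sky_randoms(n, seed=0):
    """Points uniform on the sphere: RA uniform, Dec from the arcsine law."""
    rng = np.random.default_rng(seed)
    ra = rng.uniform(0.0, 360.0, n)
    dec = np.degrees(np.arcsin(rng.uniform(-1.0, 1.0, n)))
    return ra, dec

def apply_mask(ra, dec, completeness, seed=1):
    """Keep each random with probability equal to the local survey completeness;
    `completeness(ra, dec)` is a survey-specific function."""
    rng = np.random.default_rng(seed)
    keep = rng.uniform(size=ra.size) < completeness(ra, dec)
    return ra[keep], dec[keep]

# toy mask: 80% completeness in a polar cap, zero elsewhere
cap = lambda ra, dec: np.where(dec > 30.0, 0.8, 0.0)
ra, dec = apply_mask(*uniform_sky_randoms(100_000), cap)
```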

  15. The Health System and Population Health Implications of Large-Scale Diabetes Screening in India: A Microsimulation Model of Alternative Approaches

    PubMed Central

    Basu, Sanjay; Millett, Christopher; Vijan, Sandeep; Hayward, Rodney A.; Kinra, Sanjay; Ahuja, Rahoul; Yudkin, John S.

    2015-01-01

    Background Like a growing number of rapidly developing countries, India has begun to develop a system for large-scale community-based screening for diabetes. We sought to identify the implications of using alternative screening instruments to detect people with undiagnosed type 2 diabetes among diverse populations across India. Methods and Findings We developed and validated a microsimulation model that incorporated data from 58 studies from across the country into a nationally representative sample of Indians aged 25–65 y old. We estimated the diagnostic and health system implications of three major survey-based screening instruments and random glucometer-based screening. Of the 567 million Indians eligible for screening, depending on which of four screening approaches is utilized, between 158 and 306 million would be expected to screen as “high risk” for type 2 diabetes, and be referred for confirmatory testing. Between 26 million and 37 million of these people would be expected to meet international diagnostic criteria for diabetes, but between 126 million and 273 million would be “false positives.” The ratio of false positives to true positives varied from 3.9 (when using random glucose screening) to 8.2 (when using a survey-based screening instrument) in our model. The cost per case found would be expected to be from US$5.28 (when using random glucose screening) to US$17.06 (when using a survey-based screening instrument), presenting a total cost of between US$169 and US$567 million. The major limitation of our analysis is its dependence on published cohort studies that are unlikely fully to capture the poorest and most rural areas of the country. Because these areas are thought to have the lowest diabetes prevalence, this may result in overestimation of the efficacy and health benefits of screening. Conclusions Large-scale community-based screening is anticipated to produce a large number of false-positive results, particularly if using currently
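    The reported yields follow from simple screening arithmetic: referrals split into true and false positives according to prevalence, sensitivity, and specificity. A sketch with illustrative inputs (not the paper's calibrated instrument parameters):

```python
def screening_yield(n_eligible, prevalence, sensitivity, specificity, cost_per_test):
    """Expected yield of a screen-and-refer strategy with confirmatory testing."""
    diseased = n_eligible * prevalence
    healthy = n_eligible - diseased
    true_pos = sensitivity * diseased           # referred and truly diabetic
    false_pos = (1.0 - specificity) * healthy   # referred but not diabetic
    return {
        "referred": true_pos + false_pos,
        "fp_to_tp_ratio": false_pos / true_pos,
        "cost_per_case_found": cost_per_test * (true_pos + false_pos) / true_pos,
    }

# Illustrative only: 567 million eligible, 6% undiagnosed prevalence,
# a 60%-sensitive / 75%-specific questionnaire, US$0.50 per confirmatory test.
print(screening_yield(567e6, 0.06, 0.60, 0.75, 0.50))
```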

  16. Cosmic cartography of the large-scale structure with Sloan Digital Sky Survey data release 6

    NASA Astrophysics Data System (ADS)

    Kitaura, Francisco S.; Jasche, Jens; Li, Cheng; Enßlin, Torsten A.; Metcalf, R. Benton; Wandelt, Benjamin D.; Lemson, Gerard; White, Simon D. M.

    2009-11-01

    We present the largest Wiener reconstruction of the cosmic density field made to date. The reconstruction is based on the Sloan Digital Sky Survey (SDSS) data release 6 covering the northern Galactic cap. We use a novel supersampling algorithm to suppress aliasing effects and a Krylov-space inversion method to enable high performance with high resolution. These techniques are implemented in the ARGO computer code. We reconstruct the field over a 500 Mpc cube with megaparsec-scale grid resolution, while accounting for both the angular and the radial selection functions of the SDSS and the shot noise, giving an effective resolution of the order of ~10 Mpc. In addition, we correct for the redshift distortions in the linear and non-linear regimes in an approximate way. We show that the commonly used method of inverse-weighting the galaxies by the corresponding selection function leads to excess noise in regions where the density of the observed galaxies is small. It is more accurate and conservative to adopt a Bayesian framework in which we model the galaxy selection/detection process as Poisson/binomial. This results in heavier smoothing in regions of reduced sampling density. Our results show a complex cosmic web structure with huge void regions, indicating that the recovered matter distribution is highly non-Gaussian. Filamentary structures are clearly visible on scales of up to ~20 Mpc. We also calculate the statistical distribution of density after smoothing the reconstruction with Gaussian kernels of different radii rS and find good agreement with a lognormal distribution for 10 Mpc <~ rS <~ 30 Mpc.
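    The Wiener filter at the core of this reconstruction weights each Fourier mode of the data by S/(S+N). A one-dimensional sketch of the operation (the actual reconstruction is three-dimensional and folds in the SDSS selection functions and noise model):

```python
import numpy as np

def wiener_filter_1d(data, signal_power, noise_power):
    """delta_WF(k) = S(k) / (S(k) + N(k)) * d(k), applied mode by mode."""
    dk = np.fft.rfft(data)
    k = np.fft.rfftfreq(data.size)
    weight = signal_power(k) / (signal_power(k) + noise_power(k))
    return np.fft.irfft(weight * dk, n=data.size)

# toy example: a red signal spectrum against white noise
S = lambda k: 1.0 / (1e-3 + k**2)
N = lambda k: np.full_like(k, 50.0)
rng = np.random.default_rng(1)
signal = np.cumsum(rng.normal(size=512))            # random-walk "density" field
recovered = wiener_filter_1d(signal + rng.normal(scale=5.0, size=512), S, N)
```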

  17. [No relationship between blood type and personality: evidence from large-scale surveys in Japan and the US].

    PubMed

    Nawata, Kengo

    2014-06-01

    Despite the widespread popular belief in Japan about a relationship between personality and ABO blood type, this association has not been empirically substantiated. This study provides more robust evidence that there is no relationship between blood type and personality, through a secondary analysis of large-scale survey data. Recent data (after 2000) were collected using large-scale random sampling from over 10,000 people in total from both Japan and the US. Effect sizes were calculated. Japanese datasets from 2004 (N = 2,878-2,938) and 2005 (N = 3,618-3,692), as well as one dataset from the US in 2004 (N = 3,037-3,092), were used. In all the datasets, 65 of 68 items yielded non-significant differences between blood groups. Effect sizes (eta²) were less than .003. This means that blood type explained less than 0.3% of the total variance in personality. These results show the non-relevance of blood type for personality.
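    The quoted effect size, eta squared, is the between-group share of the total variance. A minimal sketch with synthetic groups: under a null effect it settles near (k-1)/(N-1), i.e. well under 1% at these sample sizes.

```python
import numpy as np

def eta_squared(groups):
    """One-way effect size: between-group sum of squares / total sum of squares."""
    all_vals = np.concatenate(groups)
    grand = all_vals.mean()
    ss_between = sum(len(g) * (np.mean(g) - grand) ** 2 for g in groups)
    ss_total = ((all_vals - grand) ** 2).sum()
    return ss_between / ss_total

# four ABO groups with identical underlying score distributions
rng = np.random.default_rng(0)
groups = [rng.normal(3.0, 1.0, n) for n in (900, 700, 600, 300)]
print(f"eta^2 = {eta_squared(groups):.4f}")   # ~0.001: <0.3% of variance explained
```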

  18. Large-scale distribution of surface ozone mixing ratio in southern Mongolia: A survey

    NASA Astrophysics Data System (ADS)

    Meixner, F. X.; Behrendt, T.; Ermel, M.; Hempelmann, N.; Andreae, M. O.; Jöckel, P.

    2012-04-01

    For the first time, measurements of surface ozone mixing ratio have been performed from semi-arid steppe to the arid/hyper-arid southern Mongolian Gobi desert. During 12-29 August 2009, ozone mixing ratio was continuously measured from a mobile platform (a 4x4 Furgon SUV). The survey (3,060 km over 229,171 km²) started at the Mongolian capital Ulaan-Baatar (47.9582° N, 107.0190° E), headed south-west (Echin Gol, 43.2586° N, 99.0255° E), then eastward to Dalanzadgad (43.6061° N, 104.4445° E), and finally back to Ulaan-Baatar. Ambient air was sampled (approx. 1 l/min) through a 4 m long PTFE intake line along a forward-facing boom mounted on the vehicle roof. Ozone mixing ratio was measured by UV spectroscopy using a mobile dual-cell ozone analyzer (model 205, 2B Technologies, Boulder, U.S.A.). While ozone signals were measured every 5 seconds, 1-minute averages and standard deviations were calculated on-line and stored in the data logger. The standard deviations are used to identify and discriminate against unrealistically low or high ozone mixing ratios, caused by occasionally passing plumes of vehicle exhaust and/or biomass-burning gases, as well as gasoline vapour (at filling stations). Even under desert conditions, the temporal behaviour of ozone mixing ratio was characterized by considerable and regular diel variations. Minimum mixing ratios (15-25 ppb) occurred early in the morning (approx. 06:00 local), when surface depletion of ozone (by dry deposition) cannot be compensated by supply from the free troposphere due to the thermodynamic stability of the nocturnal boundary layer. Late in the afternoon (approx. 17:00 local), under conditions of a turbulently well-mixed convective boundary layer, maximum ozone mixing ratios (45-55 ppb) were reached. Daily amplitudes of the diel cycle of ozone mixing ratio ranged from about 30 ppb (steppe) and 20 ppb (arid desert) down to approx. 5 ppb in the hyper-arid Gobi desert (Shargyn Gobi). Ozone surface measurements were

  19. Effects on aquatic and human health due to large scale bioenergy crop expansion.

    PubMed

    Love, Bradley J; Einheuser, Matthew D; Nejadhashemi, A Pouyan

    2011-08-01

    In this study, the environmental impacts of large-scale bioenergy crops were evaluated using the Soil and Water Assessment Tool (SWAT). Daily pesticide concentration data for a study area consisting of four large watersheds located in Michigan (totaling 53,358 km²) were estimated over a six-year period (2000-2005). Model outputs for atrazine, bromoxynil, glyphosate, metolachlor, pendimethalin, sethoxydim, trifluralin, and 2,4-D were used to predict the possible long-term implications that large-scale bioenergy crop expansion may have on the bluegill (Lepomis macrochirus) and humans. Threshold toxicity levels were obtained for the bluegill and for human consumption for all pesticides being evaluated through an extensive literature review. Model output was compared to each toxicity level for the suggested exposure time (96 hours for bluegill and 24 hours for humans). The results suggest that traditional intensive row crops such as canola, corn and sorghum may negatively impact aquatic life, and in most cases affect the safe drinking water availability. The continuous corn rotation, the most representative rotation for current agricultural practices in a starch-based ethanol economy, delivers the highest concentrations of glyphosate to the stream. In addition, continuous canola contributed to a concentration of 1.11 ppm of trifluralin, a highly toxic herbicide, which is 8.7 times the 96-hour ecotoxicity threshold for bluegill and 21 times the safe drinking water level. Also during the period of study, continuous corn resulted in the impairment of 541,152 km of stream. However, there is promise with second-generation lignocellulosic bioenergy crops such as switchgrass, which resulted in a 171,667 km reduction in the total stream length exceeding the human threshold criteria, as compared to the base scenario. Results of this study may be useful in determining the suitability of bioenergy crop rotations and aid in decision making regarding the adaptation of large-scale
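    Comparing modelled concentrations with a 96-hour toxicity threshold, as done above, reduces to checking a rolling 96-hour mean against that threshold. A minimal sketch assuming an hourly concentration series (SWAT's daily output would first be interpolated); the synthetic series and the threshold value are illustrative only.

```python
import numpy as np

def exceedance_hours(conc_hourly, threshold, window_h=96):
    """Hours in which the rolling `window_h`-hour mean exceeds a toxicity threshold."""
    c = np.asarray(conc_hourly, dtype=float)
    rolling = np.convolve(c, np.ones(window_h) / window_h, mode="valid")
    return int((rolling > threshold).sum())

# e.g. a year of synthetic hourly concentrations (ppm) vs. a hypothetical 96-h LC50
rng = np.random.default_rng(3)
conc = rng.lognormal(mean=-3.0, sigma=1.0, size=8760)
print(exceedance_hours(conc, threshold=0.127))
```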

  20. The eROSITA/SRG All-Sky Survey: A new era of large-scale structure studies with AGN

    NASA Astrophysics Data System (ADS)

    Kolodzig, A.; Gilfanov, M.; Hütsi, G.; Sunyaev, R.

    2014-07-01

    The four-year X-ray all-sky survey (eRASS) of the eROSITA telescope aboard the Spektrum-Roentgen-Gamma (SRG) satellite will detect ~3 million active galactic nuclei (AGN) with a median redshift of z ≈ 1 and typical luminosity of L0.5-2.0 keV ~ 10^44 erg s^-1. We show that this unprecedented AGN sample, complemented with redshift information, will supply us with outstanding opportunities for large-scale structure research. For the first time with a sample of X-ray selected AGN, it will become possible to perform detailed redshift- and luminosity-resolved studies of the linear bias factor. These studies will dramatically improve our understanding of AGN environment, triggering mechanisms, growth of super-massive black holes and their co-evolution with dark matter halos. The eROSITA AGN sample will become a powerful cosmological probe. It will become possible to convincingly detect baryonic acoustic oscillations (BAOs) with ~8σ confidence in the 0.8 < z < 2.0 range, which is currently not covered by any existing BAO surveys. To exploit the full potential of the eRASS AGN sample, photometric and spectroscopic surveys of large areas and a sufficient depth will be needed.

  1. Large scale food retailing as an intervention for diet and health: quasi-experimental evaluation of a natural experiment

    PubMed Central

    Cummins, S.; Petticrew, M.; Higgins, C.; Findlay, A.; Sparks, L.

    2005-01-01

    Design: Prospective quasi-experimental design comparing baseline and follow-up data in an "intervention" community with a matched "comparison" community in Glasgow, UK. Participants: 412 men and women aged 16 or over for whom follow-up data on fruit and vegetable consumption and GHQ-12 were available. Main outcome measures: Fruit and vegetable consumption in portions per day, poor self-reported health, and poor psychological health (GHQ-12). Main results: Adjusting for age, sex, educational attainment, and employment status, there was no population impact on daily fruit and vegetable consumption, self-reported health, or psychological health. There was some evidence for a net reduction in the prevalence of poor psychological health for residents who directly engaged with the intervention. Conclusions: Government policy has advocated using large-scale food retailing as a social intervention to improve diet and health in poor communities. In contrast with a previous uncontrolled study, this study did not find evidence for a net intervention effect on fruit and vegetable consumption, although there was evidence for an improvement in psychological health for those who directly engaged with the intervention. Although definitive conclusions about the effect of large-scale retailing on diet and health in deprived communities cannot be drawn from non-randomised controlled study designs, evaluations of the impacts of natural experiments may offer the best opportunity to generate evidence about the health impacts of retail interventions in poor communities. PMID:16286490

  2. Implementation of a large-scale hospital information infrastructure for multi-unit health-care services.

    PubMed

    Yoo, Sun K; Kim, Dong Keun; Kim, Jung C; Park, Youn Jung; Chang, Byung Chul

    2008-01-01

    With the increase in demand for high quality medical services, the need for an innovative hospital information system has become essential. An improved system has been implemented in all hospital units of the Yonsei University Health System. Interoperability between multi-units required appropriate hardware infrastructure and software architecture. This large-scale hospital information system encompassed PACS (Picture Archiving and Communications Systems), EMR (Electronic Medical Records) and ERP (Enterprise Resource Planning). It involved two tertiary hospitals and 50 community hospitals. The monthly data production rate by the integrated hospital information system is about 1.8 TByte and the total quantity of data produced so far is about 60 TByte. Large scale information exchange and sharing will be particularly useful for telemedicine applications. PMID:18430292

  3. Bayesian non-linear large-scale structure inference of the Sloan Digital Sky Survey Data Release 7

    NASA Astrophysics Data System (ADS)

    Jasche, Jens; Kitaura, Francisco S.; Li, Cheng; Enßlin, Torsten A.

    2010-11-01

    In this work, we present the first non-linear, non-Gaussian full Bayesian large-scale structure analysis of the cosmic density field conducted so far. The density inference is based on the Sloan Digital Sky Survey (SDSS) Data Release 7, which covers the northern galactic cap. We employ a novel Bayesian sampling algorithm, which enables us to explore the extremely high dimensional non-Gaussian, non-linear lognormal Poissonian posterior of the three-dimensional density field conditional on the data. These techniques are efficiently implemented in the Hamiltonian Density Estimation and Sampling (HADES) computer algorithm and permit the precise recovery of poorly sampled objects and non-linear density fields. The non-linear density inference is performed on a 750-Mpc cube with roughly 3-Mpc grid resolution, while accounting for systematic effects, introduced by survey geometry and selection function of the SDSS, and the correct treatment of a Poissonian shot noise contribution. Our high-resolution results represent remarkably well the cosmic web structure of the cosmic density field. Filaments, voids and clusters are clearly visible. Further, we also conduct a dynamical web classification and estimate the web-type posterior distribution conditional on the SDSS data.
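    The lognormal Poissonian posterior named above combines a Poisson likelihood for galaxy counts with a lognormal prior on the density field; Hamiltonian sampling needs the log-posterior and its gradient. A drastically simplified sketch with an i.i.d. (diagonal) prior on s = log(1 + δ), ignoring the correlated prior, survey geometry, and selection treatment of the real analysis:

```python
import numpy as np

def neg_log_posterior(s, counts, mean_n, sigma2):
    """-log posterior for counts ~ Poisson(mean_n * exp(s)),
    with s_i ~ N(-sigma2/2, sigma2) i.i.d. (so that <exp(s)> = 1)."""
    lam = mean_n * np.exp(s)
    log_like = np.sum(counts * np.log(lam) - lam)     # Poisson terms (factorial dropped)
    log_prior = -0.5 * np.sum((s + sigma2 / 2.0) ** 2) / sigma2
    return -(log_like + log_prior)

def grad_neg_log_posterior(s, counts, mean_n, sigma2):
    """Gradient required by a Hamiltonian Monte Carlo sampler."""
    lam = mean_n * np.exp(s)
    return -(counts - lam) + (s + sigma2 / 2.0) / sigma2
```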

  4. A global survey of martian central mounds: Central mounds as remnants of previously more extensive large-scale sedimentary deposits

    NASA Astrophysics Data System (ADS)

    Bennett, Kristen A.; Bell, James F.

    2016-01-01

    We conducted a survey of central mounds within large (>25 km diameter) impact craters on Mars. We use mound locations, mound offsets within their host craters, and relative mound heights to address and extend various mound formation hypotheses. The results of this survey support the hypothesis that mound sediments once filled their host craters and were later eroded into the features we observe today. The majority of mounds are located near the boundaries of previously identified large-scale sedimentary deposits. We discuss the implications of the hypothesis that central mounds are part of previously more extensive sedimentary units that filled and overtopped underlying impact craters. In this scenario, as erosion of the sedimentary unit occurred, the sediment within impact craters was preserved slightly longer than the overlying sediment because it was sheltered by the crater walls. Our study also reveals that most mounds are offset from the center of their host crater in the same direction as the present regional winds (e.g., the mounds in Arabia Terra are offset towards the western portion of their craters). We propose that this implies that wind has been the dominant agent causing the erosion of central mounds. In our measurements, the mound offset (r) is normalized to each crater's radius, and the offset azimuth (θ) is defined such that 0° is north and 270° is west.
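    The offset measurements described above reduce to simple plane geometry once mound and crater centres are known. A small sketch assuming centres in a local map projection (kilometres east/north); the names are ours:

```python
import numpy as np

def mound_offset(crater_xy, mound_xy, crater_radius_km):
    """Radial offset r (normalised to crater radius) and azimuth theta,
    measured clockwise from north so that 0 = N, 90 = E, 180 = S, 270 = W."""
    dx = mound_xy[0] - crater_xy[0]   # eastward, km
    dy = mound_xy[1] - crater_xy[1]   # northward, km
    r = np.hypot(dx, dy) / crater_radius_km
    theta = np.degrees(np.arctan2(dx, dy)) % 360.0
    return r, theta

# a mound 5 km due west of the centre of a 25-km-radius crater:
# mound_offset((0, 0), (-5, 0), 25.0) -> (0.2, 270.0)
```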

  5. The SRG/eROSITA All-Sky Survey: A new era of large-scale structure studies with AGN

    NASA Astrophysics Data System (ADS)

    Kolodzig, Alexander; Gilfanov, Marat; Hütsi, Gert; Sunyaev, Rashid

    2015-08-01

    The four-year X-ray All-Sky Survey (eRASS) of the eROSITA telescope aboard the Spektrum-Roentgen-Gamma (SRG) satellite will detect about 3 million active galactic nuclei (AGN) with a median redshift of z ~ 1 and typical luminosity of L0.5-2.0 keV ~ 10^44 erg/s. We demonstrate that this unprecedented AGN sample, complemented with redshift information, will supply us with outstanding opportunities for large-scale structure (LSS) studies. We show that with this sample of X-ray selected AGN, it will become possible for the first time to perform detailed redshift- and luminosity-resolved studies of the AGN clustering. This enables us to put strong constraints on different AGN triggering/fueling models as a function of AGN environment, which will dramatically improve our understanding of super-massive black hole growth and its correlation with the co-evolving LSS. Further, the eRASS AGN sample will become a powerful cosmological probe. We demonstrate for the first time that, given the breadth and depth of eRASS, it will become possible to convincingly detect baryonic acoustic oscillations (BAOs) with ~8σ confidence in the 0.8 < z < 2.0 range, currently uncovered by any existing BAO survey. Finally, we discuss the requirements for follow-up missions and demonstrate that in order to fully exploit the potential of the eRASS AGN sample, photometric and spectroscopic surveys of large areas and a sufficient depth will be needed.

  6. AGN and QSOs in the eROSITA All-Sky Survey. II. The large-scale structure

    NASA Astrophysics Data System (ADS)

    Kolodzig, Alexander; Gilfanov, Marat; Hütsi, Gert; Sunyaev, Rashid

    2013-10-01

    The four-year X-ray all-sky survey (eRASS) of the eROSITA telescope aboard the Spektrum-Roentgen-Gamma satellite will detect about 3 million active galactic nuclei (AGN) with a median redshift of z ≈ 1 and a typical luminosity of L0.5-2.0 keV ~ 10^44 erg s^-1. We show that this unprecedented AGN sample, complemented with redshift information, will supply us with outstanding opportunities for large-scale structure research. For the first time, detailed redshift- and luminosity-resolved studies of the bias factor for X-ray selected AGN will become possible. The eRASS AGN sample will not only improve the redshift- and luminosity resolution of these studies, but will also expand their luminosity range beyond L0.5-2.0 keV ~ 10^44 erg s^-1, thus enabling a direct comparison of the clustering properties of luminous X-ray AGN and optical quasars. These studies will dramatically improve our understanding of the AGN environment, triggering mechanisms, the growth of supermassive black holes and their co-evolution with dark matter halos. The eRASS AGN sample will become a powerful cosmological probe. It will enable detecting baryonic acoustic oscillations (BAOs) for the first time with X-ray selected AGN. With the data from the entire extragalactic sky, BAO will be detected at a ≳10σ confidence level in the full redshift range and with ~8σ confidence in the 0.8 < z < 2.0 range, which is currently not covered by any existing BAO surveys. To exploit the full potential of the eRASS AGN sample, photometric and spectroscopic surveys of large areas and a sufficient depth will be needed.

  7. Awareness and Concern about Large-Scale Livestock and Poultry: Results from a Statewide Survey of Ohioans

    ERIC Educational Resources Information Center

    Sharp, Jeff; Tucker, Mark

    2005-01-01

    The development of large-scale livestock facilities has become a controversial issue in many regions of the U.S. in recent years. In this research, rural-urban differences in familiarity and concern about large-scale livestock facilities among Ohioans is examined as well as the relationship of social distance from agriculture and trust in risk…

  8. Pre- and Postnatal Influences on Preschool Mental Health: A Large-Scale Cohort Study

    ERIC Educational Resources Information Center

    Robinson, Monique; Oddy, Wendy H.; Li, Jianghong; Kendall, Garth E.; de Klerk, Nicholas H.; Silburn, Sven R.; Zubrick, Stephen R.; Newnham, John P.; Stanley, Fiona J.; Mattes, Eugen

    2008-01-01

    Background: Methodological challenges such as confounding have made the study of the early determinants of mental health morbidity problematic. This study aims to address these challenges in investigating antenatal, perinatal and postnatal risk factors for the development of mental health problems in pre-school children in a cohort of Western…

  9. Cosmology from large scale galaxy clustering and galaxy-galaxy lensing with Dark Energy Survey Science Verification data

    DOE PAGESBeta

    Kwan, J.

    2016-10-05

    Here, we present cosmological constraints from the Dark Energy Survey (DES) using a combined analysis of angular clustering of red galaxies and their cross-correlation with weak gravitational lensing of background galaxies. We use a 139 square degree contiguous patch of DES data from the Science Verification (SV) period of observations. Using large scale measurements, we constrain the matter density of the Universe as Ωm = 0.31 ± 0.09 and the clustering amplitude of the matter power spectrum as σ8 = 0.74 ± 0.13 after marginalizing over seven nuisance parameters and three additional cosmological parameters. This translates into S8 ≡ σ8(Ωm/0.3)^0.16 = 0.74 ± 0.12 for our fiducial lens redshift bin at 0.35 < z < 0.5, while S8 = 0.78 ± 0.09 using two bins over the range 0.2 < z < 0.5. We study the robustness of the results under changes in the data vectors, modelling and systematics treatment, including photometric redshift and shear calibration uncertainties, and find consistency in the derived cosmological parameters. We show that our results are consistent with previous cosmological analyses from DES and other data sets and conclude with a joint analysis of DES angular clustering and galaxy-galaxy lensing with Planck CMB data, baryon acoustic oscillations and type Ia supernova measurements.
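    The derived amplitude S8 is a one-line function of the marginalized constraints; a quick check of the quoted value:

```python
def s8(sigma8, omega_m, alpha=0.16, omega_ref=0.3):
    """S8 = sigma8 * (Omega_m / 0.3)**alpha, as defined in the abstract."""
    return sigma8 * (omega_m / omega_ref) ** alpha

print(round(s8(0.74, 0.31), 3))   # ~0.744, consistent with the quoted 0.74 +/- 0.12
```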

  11. Automation of Survey Data Processing, Documentation and Dissemination: An Application to Large-Scale Self-Reported Educational Survey.

    ERIC Educational Resources Information Center

    Shim, Eunjae; Shim, Minsuk K.; Felner, Robert D.

    Automation of the survey process has proved successful in many industries, yet it is still underused in educational research. This is largely due to the facts (1) that number crunching is usually carried out using software that was developed before information technology existed, and (2) that the educational research is to a great extent trapped…

  12. Explaining Large-Scale Policy Change in the Turkish Health Care System: Ideas, Institutions, and Political Actors.

    PubMed

    Agartan, Tuba I

    2015-10-01

    Explaining policy change has been one of the major concerns of the health care politics and policy development literature. This article aims to explain the specific dynamics of large-scale reforms introduced within the framework of the Health Transformation Program in Turkey. It argues that confluence of the three streams - problem, policy, and politics - with the exceptional political will of the Justice and Development Party's (JDP) leaders opened up a window of opportunity for a large-scale policy change. The article also underscores the contribution of recent ideational perspectives that help explain "why" political actors in Turkey would focus on health care reform, given that there are a number of issues waiting to be addressed in the policy agenda. Examining how political actors framed problems and policies deepens our understanding of the content of the reform initiatives as well as the construction of the need to reform. The article builds on the insights of both the ideational and institutionalist perspectives when it argues that the interests, aspirations, and fears of the JDP, alongside the peculiar characteristics of the institutional context, have shaped its priorities and determination to carry out this reform initiative. PMID:26195607

  14. Large-Scale Survey Findings Inform Patients’ Experiences in Using Secure Messaging to Engage in Patient-Provider Communication and Self-Care Management: A Quantitative Assessment

    PubMed Central

    Patel, Nitin R; Lind, Jason D; Antinori, Nicole

    2015-01-01

    Background Secure email messaging is part of a national transformation initiative in the United States to promote new models of care that support enhanced patient-provider communication. To date, only a limited number of large-scale studies have evaluated users’ experiences in using secure email messaging. Objective To quantitatively assess veteran patients’ experiences in using secure email messaging in a large patient sample. Methods A cross-sectional mail-delivered paper-and-pencil survey study was conducted with a sample of respondents identified as registered for the Veterans Health Administration’s Web-based patient portal (My HealtheVet) who had opted to use secure messaging. The survey collected demographic data and assessed computer and health literacy and secure messaging use. Analyses conducted on the survey data included frequencies and proportions, chi-square tests, and one-way analysis of variance. Results The majority of respondents (N=819) reported using secure messaging 6 months or longer (n=499, 60.9%). They reported secure messaging to be helpful for completing medication refills (n=546, 66.7%), managing appointments (n=343, 41.9%), looking up test results (n=350, 42.7%), and asking health-related questions (n=340, 41.5%). Notably, some respondents reported using secure messaging to address sensitive health topics (n=67, 8.2%). Survey responses indicated that younger age (P=.039) and higher levels of education (P=.025) and income (P=.003) were associated with more frequent use of secure messaging. Females were more likely to report using secure messaging more often, compared with their male counterparts (P=.098). Minorities were more likely to report using secure messaging more often, at least once a month, compared with nonminorities (P=.086). Individuals with higher levels of health literacy reported more frequent use of secure messaging (P=.007), greater satisfaction (P=.002), and indicated that secure messaging is a useful (P=.002) and easy
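    The association tests reported above (chi-square tests of use frequency against demographic group) have the following shape; the contingency-table values below are hypothetical, not the survey's data.

```python
import numpy as np
from scipy.stats import chi2_contingency

# hypothetical 2x2 table: secure-messaging use (rows) by education level (columns)
table = np.array([[310, 189],    # frequent users:   higher / lower education
                  [140, 180]])   # infrequent users: higher / lower education
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.4f}")
```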

  15. Monitoring and Evaluating the Transition of Large-Scale Programs in Global Health

    PubMed Central

    Bao, James; Rodriguez, Daniela C; Paina, Ligia; Ozawa, Sachiko; Bennett, Sara

    2015-01-01

    Purpose: Donors are increasingly interested in the transition and sustainability of global health programs as priorities shift and external funding declines. Systematic and high-quality monitoring and evaluation (M&E) of such processes is rare. We propose a framework and related guiding questions to systematize the M&E of global health program transitions. Methods: We conducted stakeholder interviews, searched the peer-reviewed and gray literature, gathered feedback from key informants, and reflected on author experiences to build a framework on M&E of transition and to develop guiding questions. Findings: The conceptual framework models transition as a process spanning pre-transition and transition itself and extending into sustained services and outcomes. Key transition domains include leadership, financing, programming, and service delivery, and relevant activities that drive the transition in these domains forward include sustaining a supportive policy environment, creating financial sustainability, developing local stakeholder capacity, communicating to all stakeholders, and aligning programs. Ideally transition monitoring would begin prior to transition processes being implemented and continue for some time after transition has been completed. As no set of indicators will be applicable across all types of health program transitions, we instead propose guiding questions and illustrative quantitative and qualitative indicators to be considered and adapted based on the transition domains identified as most important to the particular health program transition. The M&E of transition faces new and unique challenges, requiring measuring constructs to which evaluators may not be accustomed. Many domains hinge on measuring “intangibles” such as the management of relationships. Monitoring these constructs may require a compromise between rigorous data collection and the involvement of key stakeholders. Conclusion: Monitoring and evaluating transitions in global

  16. Engaging in large-scale digital health technologies and services. What factors hinder recruitment?

    PubMed

    O'Connor, Siobhan; Mair, Frances S; McGee-Lennon, Marilyn; Bouamrane, Matt-Mouley; O'Donnell, Kate

    2015-01-01

    Implementing consumer-oriented digital health products and services at scale is challenging, and a range of barriers can be encountered in reaching and recruiting users to these types of solutions. This paper describes implementers' experiences with the rollout of the Delivering Assisted Living Lifestyles at Scale (dallas) programme. The findings are based on qualitative analysis of baseline and midpoint interviews and project documentation. Eight main themes emerged as key factors that hindered participation, including how the dallas programme was designed and operationalised; constraints imposed by partnerships, technology, branding, and recruitment strategies; and challenges with the development cycle and organisational culture. PMID:25991155

  17. LARGE-SCALE STAR-FORMATION-DRIVEN OUTFLOWS AT 1 < z < 2 IN THE 3D-HST SURVEY

    SciTech Connect

    Lundgren, Britt F.; Van Dokkum, Pieter; Bezanson, Rachel; Momcheva, Ivelina; Nelson, Erica; Skelton, Rosalind E.; Wake, David; Whitaker, Katherine; Brammer, Gabriel; Franx, Marijn; Fumagalli, Mattia; Labbe, Ivo; Patel, Shannon; Da Cunha, Elizabete; Rix, Hans Walter; Schmidt, Kasper; Erb, Dawn K.; Fan Xiaohui; Kriek, Mariska; Marchesini, Danilo; and others

    2012-11-20

    We present evidence of large-scale outflows from three low-mass (log(M*/M☉) ≈ 9.75) star-forming (SFR > 4 M☉ yr⁻¹) galaxies observed at z = 1.24, z = 1.35, and z = 1.75 in the 3D-HST Survey. Each of these galaxies is located within a projected physical distance of 60 kpc around the sight line to the quasar SDSS J123622.93+621526.6, which exhibits well-separated strong (W_r(λ2796) ≳ 0.8 Å) Mg II absorption systems matching precisely the redshifts of the three galaxies. We derive the star formation surface densities from the Hα emission in the WFC3 G141 grism observations for the galaxies and find that in each case the star formation surface density well exceeds 0.1 M☉ yr⁻¹ kpc⁻², the typical threshold for starburst galaxies in the local universe. From a small but complete parallel census of the 0.65 < z < 2.6 galaxies with H₁₄₀ ≲ 24 proximate to the quasar sight line, we detect Mg II absorption associated with galaxies extending to physical distances of 130 kpc. We determine that the W_r > 0.8 Å Mg II covering fraction of star-forming galaxies at 1 < z < 2 may be as large as unity on scales extending to at least 60 kpc, providing early constraints on the typical extent of starburst-driven winds around galaxies at this redshift. Our observations additionally suggest that the azimuthal distribution of W_r > 0.4 Å Mg II absorbing gas around star-forming galaxies may evolve from z ≈ 2 to the present, consistent with recent observations of an increasing collimation of star-formation-driven outflows with time from z ≈ 3.
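
    For readers who want the arithmetic behind the starburst threshold quoted above, a worked version follows; the radius bound is derived here for illustration, not quoted from the paper.

```latex
% Worked check of the starburst threshold (illustrative derivation): a
% galaxy with SFR = 4 M_sun/yr exceeds Sigma_SFR = 0.1 M_sun/yr/kpc^2
% whenever its star-forming area is smaller than SFR / Sigma_threshold.
\Sigma_{\mathrm{SFR}} = \frac{\mathrm{SFR}}{\pi R^2}
  > 0.1\; M_\odot\,\mathrm{yr}^{-1}\,\mathrm{kpc}^{-2}
\quad\Longrightarrow\quad
\pi R^2 < \frac{4}{0.1} = 40\ \mathrm{kpc}^2
\quad\Longrightarrow\quad
R \lesssim 3.6\ \mathrm{kpc}.
```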

  18. Characteristics of Belgian “life-ending acts without explicit patient request”: a large-scale death certificate survey revisited

    PubMed Central

    Chambaere, Kenneth; Bernheim, Jan L.; Downar, James; Deliens, Luc

    2014-01-01

    Background “Life-ending acts without explicit patient request,” as identified in robust international studies, are central in current debates on physician-assisted dying. Despite their contentiousness, little attention has been paid to their actual characteristics and to what extent they truly represent nonvoluntary termination of life. Methods We analyzed the 66 cases of life-ending acts without explicit patient request identified in a large-scale survey of physicians certifying a representative sample of deaths (n = 6927) in Flanders, Belgium, in 2007. The characteristics we studied included physicians’ labelling of the act, treatment course and doses used, and patient involvement in the decision. Results In most cases (87.9%), physicians labelled their acts in terms of symptom treatment rather than in terms of ending life. By comparing drug combinations and doses of opioids used, we found that the life-ending acts were similar to intensified pain and symptom treatment and were distinct from euthanasia. In 45 cases, there was at least 1 characteristic inconsistent with the common understanding of the practice: either patients had previously expressed a wish for ending life (16/66, 24.4%), physicians reported that the administered doses had not been higher than necessary to relieve suffering (22/66, 33.3%), or both (7/66, 10.6%). Interpretation Most of the cases we studied did not fit the label of “nonvoluntary life-ending” for at least 1 of the following reasons: the drugs were administered with a focus on symptom control; a hastened death was highly unlikely; or the act was taken in accordance with the patient’s previously expressed wishes. Thus, we recommend a more nuanced view of life-ending acts without explicit patient request in the debate on physician-assisted dying. PMID:25485252

  19. Perspectives on clinical informatics: integrating large-scale clinical, genomic, and health information for clinical care.

    PubMed

    Choi, In Young; Kim, Tae-Min; Kim, Myung Shin; Mun, Seong K; Chung, Yeun-Jun

    2013-12-01

    The advances in electronic medical records (EMRs) and bioinformatics (BI) represent two significant trends in healthcare. The widespread adoption of EMR systems and the completion of the Human Genome Project developed the technologies for data acquisition, analysis, and visualization in two different domains. The massive amount of data from both clinical and biology domains is expected to provide personalized, preventive, and predictive healthcare services in the near future. The integrated use of EMR and BI data needs to consider four key informatics areas: data modeling, analytics, standardization, and privacy. Bioclinical data warehouses integrating heterogeneous patient-related clinical or omics data should be considered. The representative standardization effort by the Clinical Bioinformatics Ontology (CBO) aims to provide uniquely identified concepts to include molecular pathology terminologies. Since individual genome data are easily used to predict current and future health status, different safeguards to ensure confidentiality should be considered. In this paper, we focused on the informatics aspects of integrating the EMR community and BI community by identifying opportunities, challenges, and approaches to provide the best possible care service for our patients and the population.

  1. Practical experience from the Office of Adolescent Health's large scale implementation of an evidence-based Teen Pregnancy Prevention Program.

    PubMed

    Margolis, Amy Lynn; Roper, Allison Yvonne

    2014-03-01

    After 3 years of experience overseeing the implementation and evaluation of evidence-based teen pregnancy prevention programs in a diversity of populations and settings across the country, the Office of Adolescent Health (OAH) has learned numerous lessons through practical application and new experiences. These lessons and experiences are applicable to those working to implement evidence-based programs on a large scale. The lessons described in this paper focus on what it means for a program to be implementation ready, the role of the program developer in replicating evidence-based programs, the importance of a planning period to ensure quality implementation, the need to define and measure fidelity, and the conditions necessary to support rigorous grantee-level evaluation.

  2. Large-scale latitude distortions of the inner Milky Way disk from the Herschel/Hi-GAL Survey

    NASA Astrophysics Data System (ADS)

    Molinari, S.; Noriega-Crespo, A.; Bally, J.; Moore, T. J. T.; Elia, D.; Schisano, E.; Plume, R.; Swinyard, B.; Di Giorgio, A. M.; Pezzuto, S.; Benedettini, M.; Testi, L.

    2016-04-01

    -infrared catalogues are filtered according to criteria that primarily select Young Stellar Objects (YSOs). Conclusions: The distortions of the Galactic inner disk revealed by Herschel confirm previous findings from CO surveys and HII/OB source counts, but with much greater statistical significance, and are interpreted as large-scale bending modes of the plane. The lack of similar distortions in tracers of more evolved YSOs or stars rules out gravitational instabilities or satellite-induced perturbations, because these should act on both the diffuse and stellar disk components. We propose that the observed bends are caused by incoming flows of extra-planar gas, from the Galactic fountain or the Galactic halo, interacting with the gaseous disk. With a much lower cross-section, stars decouple from the gaseous ISM and relax into the stellar disk potential. The timescale required for the distortions to disappear between the diffuse ISM and the relatively evolved YSO stages is compatible with star formation timescales.

  3. Evaluating a Large-Scale Community-Based Intervention to Improve Pregnancy and Newborn Health Among the Rural Poor in India.

    PubMed

    Acharya, Arnab; Lalwani, Tanya; Dutta, Rahul; Rajaratnam, Julie Knoll; Ruducha, Jenny; Varkey, Leila Caleb; Wunnava, Sita; Menezes, Lysander; Taylor, Catharine; Bernson, Jeff

    2015-01-01

    Objectives. We evaluated the effectiveness of the Sure Start project, which was implemented in 7 districts of Uttar Pradesh, India, to improve maternal and newborn health. Methods. Interventions were implemented at 2 randomly assigned levels of intensity. Forty percent of the areas received a more intense intervention, including community-level meetings with expectant mothers. A baseline survey covered 12 000 women who completed a pregnancy in 2007; a follow-up survey was conducted in 2010 among women in the same villages. Our quantitative analyses provide an account of the project's impact. Results. We observed significant health improvements in both intervention areas over time; in the more intensive intervention areas, we found greater improvements in care-seeking and healthy behaviors. The more intensive intervention areas did not, however, experience a significantly greater decline in neonatal mortality. Conclusions. This study demonstrates that community-based efforts, especially mothers' group meetings designed to increase care-seeking and healthy behaviors, are effective and can be implemented at large scale. PMID:25393175

  4. Prevalence of disability in Manikganj district of Bangladesh: results from a large-scale cross-sectional survey

    PubMed Central

    Zaman, M Mostafa; Mashreky, Saidur Rahman

    2016-01-01

    Objective To conduct a comprehensive survey of disability to determine the prevalence and distribution of cause-specific disability among residents of the Manikganj district in Bangladesh. Methods The survey was conducted in Manikganj, a typical district in Bangladesh, in 2009. Data were collected from 37 030 individuals of all ages. Samples were drawn from 8905 households in urban and rural areas proportionate to population size. Three sets of interviewer-administered questionnaires were used, separately for the age groups 0–1 years, 2–10 years, and 11 years and above. For the age groups 0–1 years and 2–10 years, the parents or the head of the household were interviewed to obtain the responses. Impairments, activity limitations and restriction of participation were considered in defining disability, consistent with the International Classification of Functioning, Disability and Health framework. Results Overall, the age-standardised prevalence of disability per 1000 was 46.5 (95% CI 44.4 to 48.6). Prevalence was significantly higher among respondents living in rural areas (50.2; 95% CI 47.7 to 52.7) than in urban areas (31.0; 95% CI 27.0 to 35.0). Overall, female respondents had more disability (50.0; 95% CI 46.9 to 53.1) than male respondents (43.4; 95% CI 40.5 to 46.3). Educational deprivation was closely linked to a higher prevalence of disability. The most commonly reported prevalences (per 1000) for underlying causes of disability were 20.2 for illness, followed by 9.4 for congenital causes and 6.8 for injury; these patterns were consistent in males and females. Conclusions Disability is a common problem in this typical district of Bangladesh, and the findings are largely generalisable. Interventions at the community level, with special attention to the socioeconomically deprived, are warranted. PMID:27431897
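
    A minimal sketch of the direct age-standardisation used for prevalence estimates like those above appears next; the age bands, reference weights, and counts are hypothetical stand-ins, not the survey's data.

```python
# Direct age-standardisation sketch (hypothetical bands, weights, counts):
# standardised prevalence = sum_i w_i * p_i over age bands i, with a
# binomial-based standard error for the 95% CI.
import numpy as np

cases   = np.array([40, 120, 260, 480])         # disabled persons per band
persons = np.array([9000, 14000, 9000, 5030])   # sampled persons per band
weights = np.array([0.30, 0.35, 0.22, 0.13])    # reference age weights, sum 1

p_band = cases / persons                        # band-specific prevalence
p_std = (weights * p_band).sum() * 1000         # standardised, per 1000
se = np.sqrt((weights**2 * p_band * (1 - p_band) / persons).sum()) * 1000
print(f"age-standardised prevalence = {p_std:.1f} per 1000 "
      f"(95% CI {p_std - 1.96*se:.1f} to {p_std + 1.96*se:.1f})")
```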

  5. Large-scale monitoring of shorebird populations using count data and N-mixture models: Black Oystercatcher (Haematopus bachmani) surveys by land and sea

    USGS Publications Warehouse

    Lyons, James E.; Royle, J. Andrew; Thomas, Susan M.; Elliott-Smith, Elise; Evenson, Joseph R.; Kelly, Elizabeth G.; Milner, Ruth L.; Nysewander, David R.; Andres, Brad A.

    2012-01-01

    Large-scale monitoring of bird populations is often based on count data collected across spatial scales that may include multiple physiographic regions and habitat types. Monitoring at large spatial scales may require multiple survey platforms (e.g., from boats and land when monitoring coastal species) and multiple survey methods. It becomes especially important to explicitly account for detection probability when analyzing count data that have been collected using multiple survey platforms or methods. We evaluated a new analytical framework, N-mixture models, to estimate actual abundance while accounting for multiple detection biases. During May 2006, we made repeated counts of Black Oystercatchers (Haematopus bachmani) from boats in the Puget Sound area of Washington (n = 55 sites) and from land along the coast of Oregon (n = 56 sites). We used a Bayesian analysis of N-mixture models to (1) assess detection probability as a function of environmental and survey covariates and (2) estimate total Black Oystercatcher abundance during the breeding season in the two regions. Probability of detecting individuals during boat-based surveys was 0.75 (95% credible interval: 0.42–0.91) and was not influenced by tidal stage. Detection probability from surveys conducted on foot was 0.68 (0.39–0.90); the latter was not influenced by fog, wind, or number of observers but was ~35% lower during rain. The estimated population size was 321 birds (262–511) in Washington and 311 (276–382) in Oregon. N-mixture models provide a flexible framework for modeling count data and covariates in large-scale bird monitoring programs designed to understand population change.
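
    The sketch below illustrates the core of an N-mixture model on simulated repeated counts. The paper uses a Bayesian fit; this toy version instead maximizes the marginal likelihood directly, and all numbers are illustrative.

```python
# Minimal N-mixture model sketch (not the authors' code): counts y[i, j]
# at site i on visit j are Binomial(N_i, p), with latent abundance
# N_i ~ Poisson(lambda). The marginal likelihood sums the binomial terms
# over candidate values of N_i up to a truncation bound K.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(1)
n_sites, n_visits, true_lambda, true_p = 55, 3, 6.0, 0.75
N = rng.poisson(true_lambda, n_sites)                      # latent abundances
y = rng.binomial(N[:, None], true_p, (n_sites, n_visits))  # repeated counts

def neg_log_lik(theta, y, K=60):
    lam, p = np.exp(theta[0]), 1 / (1 + np.exp(-theta[1]))
    n_vals = np.arange(K + 1)                  # candidate abundances 0..K
    prior = stats.poisson.pmf(n_vals, lam)     # P(N = n)
    ll = 0.0
    for counts in y:                           # marginalize N site by site
        det = stats.binom.pmf(counts[:, None], n_vals[None, :], p).prod(axis=0)
        ll += np.log((det * prior).sum())
    return -ll

fit = optimize.minimize(neg_log_lik, x0=[np.log(5.0), 0.0], args=(y,))
lam_hat, p_hat = np.exp(fit.x[0]), 1 / (1 + np.exp(-fit.x[1]))
print(f"lambda ≈ {lam_hat:.2f}, detection p ≈ {p_hat:.2f}")
```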

  6. Adult Siblings of Individuals with Down Syndrome versus with Autism: Findings from a Large-Scale US Survey

    ERIC Educational Resources Information Center

    Hodapp, R. M.; Urbano, R. C.

    2007-01-01

    Background: As adults with Down syndrome live increasingly longer lives, their adult siblings will most likely assume caregiving responsibilities. Yet little is known about either the sibling relationship or the general functioning of these adult siblings. Using a national, web-based survey, this study compared adult siblings of individuals with…

  7. Assessing large-scale surveyor variability in the historic forest data of the original U.S. Public Land Survey

    USGS Publications Warehouse

    Manies, K.L.; Mladenoff, D.J.; Nordheim, E.V.

    2001-01-01

    The U.S. General Land Office Public Land Survey (PLS) records are a valuable resource for studying pre-European settlement vegetation. However, these data were taken for legal, not ecological, purposes. In turn, the instructions the surveyors followed affected the data collected. For this reason, it has been suggested that the PLS data may not truly represent the surveyed landscapes. This study examined the PLS data of northern Wisconsin, U.S.A., to determine the extent of variability among surveyors. We statistically tested for differences among surveyors in recorded tree species, size, location, and distance from the survey point. While we cannot rule out effects from other influences (e.g., environmental factors), we found evidence suggesting some level of surveyor bias for four of five variables, including tree species and size. The PLS data remain one of the best records of pre-European settlement vegetation available. However, based on our findings, we recommend that projects using PLS records examine these data carefully. This assessment should include not only the choice of variables to be studied but also the spatial extent at which the data will be examined.

  8. Abuse of Medications Employed for the Treatment of ADHD: Results From a Large-Scale Community Survey

    PubMed Central

    Bright, George M.

    2008-01-01

    Objective The objective is to assess abuse of prescription and illicit stimulants among individuals being treated for attention-deficit/hyperactivity disorder (ADHD). Methods A survey was distributed to patients enrolled in an ADHD treatment center. It included questions designed to gain information about demographics; ADHD treatment history; illicit drug use; and misuse of prescribed stimulant medications, including type of stimulant medication most frequently misused or abused, and how the stimulant was prepared and administered. Results A total of 545 subjects (89.2% with ADHD) were included in the survey. Results indicated that 14.3% of respondents abused prescription stimulants. Of these, 79.8% abused short-acting agents; 17.2% abused long-acting stimulants; 2.0% abused both short- and long-acting agents; and 1.0% abused other agents. The specific medications abused most often were mixed amphetamine salts (Adderall; 40.0%), mixed amphetamine salts extended release (Adderall XR; 14.2%), and methylphenidate (Ritalin; 15.0%), and the most common manner of stimulant abuse was crushing pills and snorting (75.0%). Survey results also showed that 39.1% of respondents used nonprescription stimulants, most often cocaine (62.2%), methamphetamine (4.8%), and both cocaine and amphetamine (31.1%). Choice of illicit drug was based on rapidity of high onset (43.5%), ease of acquisition (40.7%), ease of use (10.2%), and cost (5.5%). Conclusions The risks for abuse of prescription and illicit stimulants are elevated among individuals being treated in an ADHD clinic. Prescription agents used most often are those with pharmacologic and pharmacokinetic characteristics that provide a rapid high. This suggests that long-acting stimulant preparations that have been developed for the treatment of ADHD may have lower abuse potential than short-acting formulations. PMID:18596945

  9. Large-scale survey of Chinese precollege students' epistemological beliefs about physics: A progression or a regression?

    NASA Astrophysics Data System (ADS)

    Zhang, Ping; Ding, Lin

    2013-06-01

    This paper reports a cross-grade comparative study of Chinese precollege students' epistemological beliefs about physics using the Colorado Learning Attitudes about Science Survey (CLASS). Our students of interest are middle and high schoolers taking traditional lecture-based physics as a mandatory science course each year from the 8th grade to the 12th grade in China. The original CLASS was translated into Mandarin through a rigorous transadaptation process, and then it was administered as a pencil-and-paper in-class survey to a total of 1318 students across all five grade levels (8-12). Our results showed that although in general student epistemological beliefs became less expertlike after receiving more years of traditional instruction (a trend consistent with what was reported in the previous literature), the cross-grade change was not a monotonic decrease. Instead, students at grades 9 and 12 showed a slight positive shift in their beliefs as measured by CLASS. Particularly, when compared to the 8th graders, students at the 9th grade demonstrated a significant increase in their views about the conceptual nature of physics and problem-solving sophistication. We hypothesize that both pedagogical and nonpedagogical factors may have contributed to these positive changes. Our results cast light on the complex nature of the relationship between formal instruction and student epistemological beliefs.

  10. The large scale structure of the Universe revealed with high redshift emission-line galaxies: implications for future surveys

    NASA Astrophysics Data System (ADS)

    Antonino Orsi, Alvaro

    2015-08-01

    Nebular emission in galaxies traces their star-formation activity within the last 10 Myr or so. Hence, these objects are typically found in the outskirts of massive clusters; closer to the centres, environmental effects can effectively stop the star formation process. In this talk I discuss the nature of emission-line galaxies (ELGs) and its implications for their clustering properties. To account for the relevant physical ingredients that produce nebular emission, I combine semi-analytical models of galaxy formation with a radiative transfer code for Ly-alpha photons and the photoionization and shock code MAPPINGS-III. As a result, the clustering strength of ELGs is found to correlate weakly with line luminosity. Their 2D clustering displays a weak finger-of-god effect, and the clustering on linear scales is affected by assembly bias. I review the implications of the nature of this galaxy population for future large spectroscopic surveys targeting ELGs to extract cosmological results. In particular, I present forecasts for the ELG population in J-PAS, an 8000 deg² survey with 54 narrow-band filters covering the optical range, expected to start in 2016.

  11. Large scale dynamic systems

    NASA Technical Reports Server (NTRS)

    Doolin, B. F.

    1975-01-01

    Classes of large scale dynamic systems were discussed in the context of modern control theory. Specific examples discussed were in the technical fields of aeronautics, water resources and electric power.

  12. Evaluation of airborne geophysical surveys for large-scale mapping of contaminated mine pools: draft final report

    SciTech Connect

    Hammack, R. W.

    2006-12-28

    Decades of underground coal mining have left about 5,000 square miles of abandoned mine workings that are rapidly filling with water. The water quality of mine pools is often poor; environmental regulatory agencies are concerned because water from mine pools could contaminate diminishing surface and groundwater supplies. Mine pools are also a threat to the safety of current mining operations. Conversely, mine pools are a large, untapped water resource that, with treatment, could be used for a variety of industrial purposes. Others have proposed using mine pools in conjunction with heat pumps as a source of heating and cooling for large industrial facilities. The management or use of mine pool water requires accurate maps of mine pools. West Virginia University has predicted the likely location and volume of mine pools in the Pittsburgh Coalbed using existing mine maps, structure contour maps, and measured mine pool elevations. Unfortunately, mine maps reflect only the conditions at the time of mining, are not available for all mines, and do not always denote the maximum extent of mining. Since 1999, the National Energy Technology Laboratory (NETL) has been evaluating helicopter-borne electromagnetic sensing technologies for the detection and mapping of mine pools. Frequency-domain electromagnetic sensors are able to detect shallow mine pools (depth < 50 m) if there is sufficient contrast between the conductance of the mine pool and the conductance of the overburden. The mine pools (conductors) most confidently detected by this technology are overlain by thick, resistive sandstone layers. In 2003, a helicopter time-domain electromagnetic sensor was applied to mined areas in southwestern Virginia in an attempt to increase the depth of mine pool detection. This study failed because the mine pool targets were thin and not very conductive. Also, large areas of the surveys were degraded or made unusable by excessive amounts of cultural electromagnetic noise that obscured the

  13. Galaxy clustering on large scales.

    PubMed

    Efstathiou, G

    1993-06-01

    I describe some recent observations of large-scale structure in the galaxy distribution. The best constraints come from two-dimensional galaxy surveys and studies of angular correlation functions. Results from galaxy redshift surveys are much less precise but are consistent with the angular correlations, provided the distortions in mapping between real space and redshift space are relatively weak. The galaxy two-point correlation function, rich-cluster two-point correlation function, and galaxy-cluster cross-correlation function are all well described on large scales (≳ 20 h⁻¹ Mpc, where the Hubble constant H₀ = 100 h km s⁻¹ Mpc⁻¹; 1 pc = 3.09 × 10¹⁶ m) by the power spectrum of an initially scale-invariant, adiabatic, cold-dark-matter Universe with Γ = Ωh ≈ 0.2. I discuss how this fits in with the Cosmic Background Explorer (COBE) satellite detection of large-scale anisotropies in the microwave background radiation and other measures of large-scale structure in the Universe.
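
    For reference, the standard textbook relations behind the quantities in this abstract (not reproduced from the paper itself) are:

```latex
% The two-point correlation function is the Fourier transform of the
% power spectrum, and the shape of the CDM spectrum is controlled by the
% parameter Gamma = Omega h.
\xi(r) = \frac{1}{2\pi^2} \int_0^\infty P(k)\,\frac{\sin kr}{kr}\,k^2\,dk,
\qquad
\Gamma = \Omega h \approx 0.2 .
```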

  14. Galaxy evolution and large-scale structure in the far-infrared. II - The IRAS faint source survey

    NASA Technical Reports Server (NTRS)

    Lonsdale, Carol J.; Hacking, Perry B.; Conrow, T. P.; Rowan-Robinson, M.

    1990-01-01

    The new IRAS Faint Source Survey data base is used to confirm the conclusion of Hacking et al. (1987) that the 60 micron source counts fainter than about 0.5 Jy lie in excess of predictions based on nonevolving model populations. The existence of an anisotropy between the northern and southern Galactic caps discovered by Rowan-Robinson et al. (1986) and Needham and Rowan-Robinson (1988) is confirmed, and it is found to extend below their sensitivity limit to about 0.3 Jy in 60 micron flux density. The count anisotropy at f(60) > 0.3 Jy can be interpreted reasonably as due to the Local Supercluster; however, no single structure accounting for the fainter anisotropy can be easily identified in either the optical or the far-IR two-dimensional sky distributions. The far-IR galaxy sky distributions are considerably smoother than distributions from the published optical galaxy catalogs. It is likely that structures of the large size discussed here have been discriminated against in earlier studies due to insufficient volume sampling.

  16. Macro- and microstructural diversity of sea urchin teeth revealed by large-scale micro-computed tomography survey

    NASA Astrophysics Data System (ADS)

    Ziegler, Alexander; Stock, Stuart R.; Menze, Björn H.; Smith, Andrew B.

    2012-10-01

    Sea urchins (Echinodermata: Echinoidea) generally possess an intricate jaw apparatus that incorporates five teeth. Although echinoid teeth consist of calcite, their complex internal design results in biomechanical properties far superior to those of inorganic forms of the constituent material. While the individual elements (or microstructure) of echinoid teeth provide general insight into processes of biomineralization, the cross-sectional shape (or macrostructure) of echinoid teeth is useful for phylogenetic and biomechanical inferences. However, studies of sea urchin tooth macro- and microstructure have traditionally been limited to a few readily available species, effectively disregarding a potentially high degree of structural diversity that could be informative in a number of ways. Having scanned numerous sea urchin species using micro-computed tomography (µCT) and synchrotron µCT, we report a large variation in the macro- and microstructure of sea urchin teeth. In addition, we describe aberrant tooth shapes and apply 3D visualization protocols that permit accelerated visual access to the complex microstructure of sea urchin teeth. Our broad survey identifies key taxa for further in-depth study and integrates previously assembled data on fossil species into a more comprehensive systematic analysis of sea urchin teeth. In order to circumvent the imprecise, word-based description of tooth shape, we introduce shape analysis algorithms that permit a numerical, and therefore more objective, description of tooth macrostructure. Finally, we discuss how synchrotron µCT datasets permit virtual models of tooth microstructure to be generated, as well as the simulation of tooth mechanics based on finite element modeling.

  17. National Health Care Survey

    Cancer.gov

    This survey encompasses a family of health care provider surveys, including information about the facilities that supply health care, the services rendered, and the characteristics of the patients served.

  18. Research on the Second Region of Sino-German 6 cm Polarization Survey of the Galactic Plane and Large-scale Supernova Remnants

    NASA Astrophysics Data System (ADS)

    Xiao, L.

    2011-11-01

    Polarization observations provide a useful tool to study the properties of the interstellar medium (ISM). They directly show the orientation of large-scale magnetic fields and help us understand the structure of the large-scale magnetic field in our Galaxy and the evolution of supernova remnants (SNRs). Moreover, combined with polarization observations at other wavelengths, the Faraday rotation can be used to study the properties of the thermal electron density, filling factor, and regular and random magnetic fields in the ISM and SNRs. Previous polarization measurements, mostly conducted at low frequencies, were significantly influenced by the Faraday effects of the ISM; at 6 cm, measurements are much less affected, and polarized emission from larger distances can be detected. By studying Faraday screens, we can explore the physical parameters of the sources as well as the synchrotron emissivities of the Galaxy. The 6 cm total intensity measurements are the key data for clarifying the spectral behaviour of diffuse emission or individual objects at high frequencies, and they help us understand the distribution of relativistic electrons, the disk-halo interaction, and the evolution of late-stage SNRs. In August 2009, the 6 cm continuum and polarization survey of the Galactic plane was completed successfully using the 25 m radio telescope at Urumqi. The work presented in this thesis is mainly based on data analysis of the second survey region, with 60° ≤ l ≤ 129° and |b| ≤ 5°. We tried to compensate for the missing large-scale structures by extrapolating the WMAP K-band polarization data with a spectral index model and a simulation of the rotation measures (RMs). By comparing the maps pre- and post-"calibration", we studied the extended objects in this region. We analyzed the depolarization structure at the periphery of an HII region complex using a Faraday screen model, and studied the distribution of fluctuations in the entire survey region using structure functions

  19. Collective response to public health emergencies and large-scale disasters: putting hospitals at the core of community resilience.

    PubMed

    Paturas, James L; Smith, Deborah; Smith, Stewart; Albanese, Joseph

    2010-07-01

    Healthcare organisations are a critical part of a community's resilience and play a prominent role as the backbone of medical response to natural and manmade disasters. The importance of healthcare organisations, in particular hospitals, to remain operational extends beyond the necessity to sustain uninterrupted medical services for the community, in the aftermath of a large-scale disaster. Hospitals are viewed as safe havens where affected individuals go for shelter, food, water and psychosocial assistance, as well as to obtain information about missing family members or learn of impending dangers related to the incident. The ability of hospitals to respond effectively to high-consequence incidents producing a massive arrival of patients that disrupt daily operations requires surge capacity and capability. The activation of hospital emergency support functions provides an approach by which hospitals manage a short-term shortfall of hospital personnel through the reallocation of hospital employees, thereby obviating the reliance on external qualified volunteers for surge capacity and capability. Recent revisions to the Joint Commission's hospital emergency preparedness standard have impelled healthcare facilities to participate actively in community-wide planning, rather than confining planning exclusively to a single healthcare facility, in order to harmonise disaster management strategies and effectively coordinate the allocation of community resources and expertise across all local response agencies.

  20. The Vimos VLT Deep Survey. Stellar mass segregation and large-scale galaxy environment in the redshift range 0.2 < z < 1.4

    NASA Astrophysics Data System (ADS)

    Scodeggio, M.; Vergani, D.; Cucciati, O.; Iovino, A.; Franzetti, P.; Garilli, B.; Lamareille, F.; Bolzonella, M.; Pozzetti, L.; Abbas, U.; Marinoni, C.; Contini, T.; Bottini, D.; Le Brun, V.; Le Fèvre, O.; Maccagni, D.; Scaramella, R.; Tresse, L.; Vettolani, G.; Zanichelli, A.; Adami, C.; Arnouts, S.; Bardelli, S.; Cappi, A.; Charlot, S.; Ciliegi, P.; Foucaud, S.; Gavignaud, I.; Guzzo, L.; Ilbert, O.; McCracken, H. J.; Marano, B.; Mazure, A.; Meneux, B.; Merighi, R.; Paltani, S.; Pellò, R.; Pollo, A.; Radovich, M.; Zamorani, G.; Zucca, E.; Bondi, M.; Bongiorno, A.; Brinchmann, J.; de La Torre, S.; de Ravel, L.; Gregorini, L.; Memeo, P.; Perez-Montero, E.; Mellier, Y.; Temporin, S.; Walcher, C. J.

    2009-07-01

    Context: Hierarchical models of galaxy formation predict that the properties of a dark matter halo depend on the large-scale environment surrounding the halo. As a result of this correlation, we expect massive haloes to be present in larger numbers in overdense regions than in underdense ones. Given that a correlation exists between a galaxy's stellar mass and the mass of the hosting dark matter halo, segregation in dark matter halo mass should then result in segregation in the distribution of stellar mass in the galaxy population. Aims: In this work we study the distribution of galaxy stellar mass and rest-frame optical color as a function of the large-scale galaxy distribution using the VLT VIMOS Deep Survey sample, in order to verify the presence of segregation in the properties of the galaxy population. Methods: We use VVDS redshift measurements and multi-band photometric data to derive estimates of the stellar mass, rest-frame optical color, and large-scale galaxy density, on a scale of approximately 8 Mpc, for a sample of 5619 galaxies in the redshift range 0.2 < z < 1.4. However, when we consider only galaxies in narrow bins of stellar mass, in order to exclude the effects of stellar mass segregation on galaxy properties, we no longer observe any significant color segregation. Based on data obtained with the European Southern Observatory Very Large Telescope, Paranal, Chile, program 070.A-9007(A), and on data obtained at the Canada-France-Hawaii Telescope

  1. The CIDA-QUEST large-scale survey of Orion OB1: evidence for rapid disk dissipation in a dispersed stellar population.

    PubMed

    Briceño, C; Vivas, A K; Calvet, N; Hartmann, L; Pacheco, R; Herrera, D; Romero, L; Berlind, P; Sánchez, G; Snyder, J A; Andrews, P

    2001-01-01

    We are conducting a large-scale, multiepoch, optical photometric survey [Centro de Investigaciones de Astronomia-Quasar Equatorial Survey Team (CIDA-QUEST)] covering about 120 square degrees to identify the young low-mass stars in the Orion OB1 association. We present results for an area of 34 square degrees. Using photometric variability as our main selection criterion, as well as follow-up spectroscopy, we confirmed 168 previously unidentified pre-main sequence stars of about 0.6 to 0.9 solar masses (M☉), with ages of about 1 million to 3 million years (Ori OB1b) and about 3 million to 10 million years (Ori OB1a). The low-mass stars are spatially coincident with the high-mass (at least 3 M☉) members of the associations. Indicators of disk accretion such as Hα emission and near-infrared emission from dusty disks fall sharply from Ori OB1b to Ori OB1a, indicating that the timescale for disk dissipation, and possibly the onset of planet formation, is a few million years.

  2. A Deep Survey of Low-Redshift Absorbers and Their Connections with Galaxies: Probing the Roles of Dwarfs, Satellites, and Large-Scale Environment

    NASA Astrophysics Data System (ADS)

    Burchett, Joseph

    2014-10-01

    In the not-too-distant past, the study of galaxy evolution neglected the vast interface between the stars in a galaxy and intergalactic space, except for the dynamical effects of dark matter. Thanks to QSO absorption line spectroscopy and the Cosmic Origins Spectrograph (COS), the circumgalactic medium (CGM) has come into sharp focus as a rich ecosystem playing a vital role in the evolution of the host galaxy. However, attributing the gas detected in absorption to host dwarf galaxies detected in optical surveys around the sightline becomes very difficult very quickly with increasing redshift. In addition, both targeted UV spectroscopy and ground-based galaxy surveys are resource intensive, which complicates compiling large, statistically robust samples of very-low-redshift absorber/galaxy pairs. We propose a CGM study of unprecedented statistical power by exploiting the vast number of sightlines in the HST/COS archive located within the Sloan Digital Sky Survey (SDSS) footprint to compile an estimated sample of 586 absorbers at z < 0.015. This very-low-redshift criterion enables spectroscopic completeness down to L < 0.01 L* galaxies in publicly available optical imaging and spectroscopy. Our survey is uniquely poised to address the following questions: (1) What is the role of dwarf galaxies, which would be undetectable at higher redshift, in giving rise to the gas detected in QSO spectroscopy? (2) How do galaxy environment and large-scale structure affect the CGM, and what are the implications for environmental quenching of star formation? (3) How efficiently do feedback mechanisms expel metal-enriched gas to great distances into the galaxy halo and into the IGM?

  3. The VIMOS Public Extragalactic Redshift Survey (VIPERS). An unprecedented view of galaxies and large-scale structure at 0.5 < z < 1.2

    NASA Astrophysics Data System (ADS)

    Guzzo, L.; Scodeggio, M.; Garilli, B.; Granett, B. R.; Fritz, A.; Abbas, U.; Adami, C.; Arnouts, S.; Bel, J.; Bolzonella, M.; Bottini, D.; Branchini, E.; Cappi, A.; Coupon, J.; Cucciati, O.; Davidzon, I.; De Lucia, G.; de la Torre, S.; Franzetti, P.; Fumana, M.; Hudelot, P.; Ilbert, O.; Iovino, A.; Krywult, J.; Le Brun, V.; Le Fèvre, O.; Maccagni, D.; Małek, K.; Marulli, F.; McCracken, H. J.; Paioro, L.; Peacock, J. A.; Polletta, M.; Pollo, A.; Schlagenhaufer, H.; Tasca, L. A. M.; Tojeiro, R.; Vergani, D.; Zamorani, G.; Zanichelli, A.; Burden, A.; Di Porto, C.; Marchetti, A.; Marinoni, C.; Mellier, Y.; Moscardini, L.; Nichol, R. C.; Percival, W. J.; Phleps, S.; Wolk, M.

    2014-06-01

    We describe the construction and general features of VIPERS, the VIMOS Public Extragalactic Redshift Survey. This ESO Large Programme is using the Very Large Telescope with the aim of building a spectroscopic sample of ~100 000 galaxies with i_AB < 22.5 and 0.5 < z < 1.2. The survey covers a total area of ~24 deg² within the CFHTLS-Wide W1 and W4 fields. VIPERS is designed to address a broad range of problems in large-scale structure and galaxy evolution, thanks to a unique combination of volume (~5 × 10⁷ h⁻³ Mpc³) and sampling rate (~40%), comparable to state-of-the-art surveys of the local Universe, together with extensive multi-band optical and near-infrared photometry. Here we present the survey design, the selection of the source catalogue, and the development of the spectroscopic observations. We discuss in detail the overall selection function that results from the combination of the different constituents of the project. This includes the masks arising from the parent photometric sample and the spectroscopic instrumental footprint, together with the weights needed to account for the sampling and success rates of the observations. Using the catalogue of 53 608 galaxy redshifts composing the forthcoming VIPERS Public Data Release 1 (PDR-1), we provide a first assessment of the quality of the spectroscopic data. The stellar contamination is found to be only 3.2%, endorsing the quality of the star-galaxy separation process and fully confirming the original estimates based on the VVDS data, which also indicate a galaxy incompleteness from this process of only 1.4%. Using a set of 1215 repeated observations, we estimate an rms redshift error σ_z/(1 + z) = 4.7 × 10⁻⁴ and calibrate the internal spectral quality grading. Benefiting from the combination of size and detailed sampling of this dataset, we conclude by presenting a map showing in unprecedented detail the large-scale distribution of galaxies between 5 and 8 billion years ago. Based on observations
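
    A sketch of how an rms redshift error of this kind can be estimated from repeat observations follows; it is a generic illustration on simulated pairs, not VIPERS pipeline code.

```python
# Estimating per-measurement redshift scatter from repeat observations:
# for pairs (z1, z2) of the same galaxy, the scatter is the rms of
# (z1 - z2)/(1 + z), divided by sqrt(2) because two noisy measurements
# enter each difference. All numbers here are simulated.
import numpy as np

rng = np.random.default_rng(42)
z_true = rng.uniform(0.5, 1.2, 1215)      # 1215 repeated targets, as in the text
sigma = 4.7e-4                            # per-measurement scatter to recover
z1 = z_true + rng.normal(0, sigma * (1 + z_true))
z2 = z_true + rng.normal(0, sigma * (1 + z_true))

norm_diff = (z1 - z2) / (1 + 0.5 * (z1 + z2))
sigma_hat = norm_diff.std() / np.sqrt(2)
print(f"estimated sigma_z/(1+z) = {sigma_hat:.2e}")   # ≈ 4.7e-4
```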

  4. The Big Drink Debate: perceptions of the impact of price on alcohol consumption from a large scale cross-sectional convenience survey in north west England

    PubMed Central

    2011-01-01

    Background A large-scale survey was conducted in 2008 in north west England, a region with high levels of alcohol-related harm, during a regional 'Big Drink Debate' campaign. The aim of this paper is to explore perceptions of how alcohol consumption would change if alcohol prices were to increase or decrease. Methods A convenience survey of residents (≥ 18 years) of north west England measured demographics, income, alcohol consumption in the previous week, and opinions on how drinking behaviour would change under two pricing conditions: low prices and discounts, and increased alcohol prices (answering 'decrease', 'no change' or 'increase'). Multinomial logistic regression used three outcomes: 'completely elastic' (respondents consider that lower prices increase drinking and higher prices decrease drinking); 'lower price elastic' (lower prices increase drinking, higher prices have no effect); and 'price inelastic' (no change for either). Results Of 22,780 drinkers surveyed, 80.3% considered that lower alcohol prices and discounts would increase alcohol consumption, while 22.1% thought raising prices would decrease consumption, making lower price elasticity alone the most common outcome (62%). Compared to a high-income/high-drinking category, the lightest drinkers with a low income (adjusted odds ratio AOR = 1.78, 95% confidence interval CI 1.38-2.30) or medium income (AOR = 1.88, CI 1.47-2.41) were most likely to be lower price elastic. Females were more likely than males to be lower price elastic (65% vs 57%), while the reverse was true for complete elasticity (20% vs 26%, P < 0.001). Conclusions Lower pricing increases alcohol consumption, and the alcohol industry's continued focus on discounting sales encourages higher drinking levels. International evidence suggests increasing the price of alcohol reduces consumption, and one in five of the surveyed population agreed; more work is required to increase this agreement to achieve public
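
    A minimal sketch of the multinomial logistic regression described above, fitted to synthetic data, might look like the following; the predictors, codings, and effect sizes are assumptions, not the survey's.

```python
# Multinomial logistic regression sketch on synthetic data (illustrative
# only): outcome 0 = price inelastic, 1 = lower-price elastic,
# 2 = completely elastic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
female = rng.integers(0, 2, n)
income = rng.integers(0, 3, n)      # 0 low, 1 medium, 2 high
drinks = rng.integers(0, 3, n)      # drinking level in previous week
X = np.column_stack([female, income, drinks])

# Synthetic outcome loosely echoing the reported pattern (not real data).
logits = np.column_stack([0.2 * drinks, 0.5 * female - 0.3 * income, 0.1 * income])
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
y = np.array([rng.choice(3, p=p) for p in probs])

model = LogisticRegression(max_iter=1000).fit(X, y)   # lbfgs fits a multinomial model
print("coefficients (outcome x predictor):\n", model.coef_)
```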

  5. Estimates of occupational safety and health impacts resulting from large-scale production of major photovoltaic technologies

    SciTech Connect

    Owens, T.; Ungers, L.; Briggs, T.

    1980-08-01

    The purpose of this study is to estimate, both quantitatively and qualitatively, the worker and societal risks attributable to four photovoltaic cell (solar cell) production processes. Quantitative risk values were determined using statistics from the California semiconductor industry. The qualitative risk assessment was performed using a variety of both governmental and private sources of data. The occupational health statistics derived from the semiconductor industry were used to predict injury and fatality levels associated with photovoltaic cell manufacturing. The use of these statistics to characterize the two silicon processes described herein is defensible from the standpoint that many of the same process steps and materials are used in both the semiconductor and photovoltaic industries. These health statistics are less applicable to the gallium arsenide and cadmium sulfide manufacturing processes, primarily because of differences in the materials utilized. Although such differences tend to discourage any absolute comparisons among the four photovoltaic cell production processes, certain relative comparisons are warranted. To facilitate a risk comparison of the four processes, the number and severity of process-related chemical hazards were assessed. This qualitative hazard assessment addresses both the relative toxicity and the exposure potential of substances in the workplace. In addition to the worker-related hazards, estimates of process-related emissions and wastes are also provided.

  6. Health Occupations Survey.

    ERIC Educational Resources Information Center

    Willett, Lynn H.

    A survey was conducted to determine the need for health occupations personnel in the Moraine Valley Community College district, specifically to: (1) describe present employment for selected health occupations; (2) project health occupation employment to 1974; (3) identify the supply of applicants for the selected occupations; and (4) identify…

  7. Health risks from large-scale water pollution: Current trends and implications for improving drinking water quality in the lower Amu Darya drainage basin, Uzbekistan

    NASA Astrophysics Data System (ADS)

    Törnqvist, Rebecka; Jarsjö, Jerker

    2010-05-01

    Safe drinking water is a primary prerequisite to human health, well-being and development. Yet roughly one billion people around the world lack access to a safe drinking water supply. Health risk assessments are effective for evaluating the suitability of using various water sources as drinking water supplies. Additionally, knowledge of pollutant transport processes on relatively large scales is needed to identify effective management strategies for improving water resources of poor quality. The lower Amu Darya drainage basin close to the Aral Sea in Uzbekistan suffers from physical water scarcity and poor water quality, mainly due to the intensive agricultural production in the region, which requires extensive freshwater withdrawals and the use of fertilizers and pesticides. In addition, recurrent droughts in the region affect surface water availability. On average, 20% of the population in rural areas of Uzbekistan lacks access to improved drinking water sources, and the situation is even more severe in the lower Amu Darya basin. In this study, we consider health risks related to water-borne contaminants by dividing measured substance concentrations by health-risk-based guideline values from the World Health Organisation (WHO). In particular, we analyse novel results of water quality measurements performed in 2007 and 2008 in the Mejdurechye Reservoir (located in the downstream part of the Amu Darya river basin). We furthermore identify large-scale trends by comparing the Mejdurechye results to reported water quality results from a considerable stretch of the Amu Darya river basin, including drainage water, river water and groundwater. The results show that concentrations of cadmium and nitrite exceed the WHO health-risk-based guideline values in the Mejdurechye Reservoir. Furthermore, concentrations of the long-since-banned and highly toxic pesticides dichlorodiphenyltrichloroethane (DDT) and γ-hexachlorocyclohexane (γ-HCH) were detected in

  8. Is the universe homogeneous on large scale?

    NASA Astrophysics Data System (ADS)

    Zhu, Xingfen; Chu, Yaoquan

    Whether the distribution of matter in the universe is homogeneous or fractal on large scales has been vigorously debated in observational cosmology in recent years. Pietronero and his co-workers have strongly advocated that the fractal behaviour in the galaxy distribution extends to the largest scales observed (≈1000 h⁻¹ Mpc), with fractal dimension D ≈ 2. Most cosmologists who hold to the standard model, however, insist that the universe is homogeneous on large scales. The answer to whether the universe is homogeneous on large scales must await the results of the next generation of galaxy redshift surveys.
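
    For reference, the quantity under debate can be stated in its standard textbook form (not taken from the paper itself):

```latex
% If the mean number of galaxies within distance r of a typical galaxy
% grows as a power law, the exponent D is the correlation (fractal)
% dimension; D ≈ 2 is the fractal claim, while large-scale homogeneity
% requires D -> 3.
N(<r) \propto r^{D}, \qquad
D \approx 2 \ \text{(fractal claim)}, \qquad
D \to 3 \ \text{(homogeneity)}.
```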

  9. Seismic texture and amplitude analysis of large scale fluid escape pipes using time lapses seismic surveys: examples from the Loyal Field (Scotland, UK)

    NASA Astrophysics Data System (ADS)

    Maestrelli, Daniele; Jihad, Ali; Iacopini, David; Bond, Clare

    2016-04-01

    ) affected by large-scale fracturing (semblance image), and seem consistent with a suspended, non-fluidized mud/sand mixture flow. Near-, middle-, and far-offset amplitude analysis confirms that most of the amplitude anomalies within the pipe conduits and termini are only partly related to gas. An interpretation of the possible textures observed is proposed, with a discussion of the noise and artefacts induced by resolution and migration problems. Possible formation mechanisms for these pipes are discussed.

  10. Large scale tracking algorithms.

    SciTech Connect

    Hansen, Ross L.; Love, Joshua Alan; Melgaard, David Kennett; Karelitz, David B.; Pitts, Todd Alan; Zollweg, Joshua David; Anderson, Dylan Z.; Nandy, Prabal; Whitlow, Gary L.; Bender, Daniel A.; Byrne, Raymond Harry

    2015-01-01

    Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination but, unfortunately, will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied in detection algorithms also changes. For low-resolution sensors, "blob" tracking is the norm. For higher-resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.

  11. Why we need a large-scale open metadata initiative in health informatics - a vision paper on open data models for clinical phenotypes.

    PubMed

    Dugas, Martin

    2013-01-01

    Clinical phenotypes are very complex and not well described. For instance, more than 100,000 biomedical concepts are needed to describe the clinical properties of patients. At present, information systems dealing with clinical phenotype data are based on secret, heterogeneous and incompatible data models. This is the root cause of the well-known grand challenge of semantic interoperability in healthcare: data exchange and analysis across medical information systems have major limitations. This problem slows down medical progress and wastes the time of health care professionals. A large-scale open metadata initiative can foster exchange, discussion and consensus regarding data models for clinical phenotypes. This would be an important contribution to improving information systems in healthcare and to solving the grand challenge of semantic interoperability. PMID:23920688

  12. Assessment of human health risks for arsenic bioaccumulation in tilapia (Oreochromis mossambicus) and large-scale mullet (Liza macrolepis) from blackfoot disease area in Taiwan.

    PubMed

    Liao, C M; Ling, M P

    2003-08-01

    This paper applies probabilistic risk analysis methods to quantify arsenic (As) bioaccumulation in the cultured fish tilapia (Oreochromis mossambicus) and large-scale mullet (Liza macrolepis) in the blackfoot disease (BFD) area of Taiwan and to assess the range of exposures for the people who eat the contaminated fish. The models implemented include a probabilistic bioaccumulation model to account for As accumulation in fish and a human health exposure and risk model that accounts for the hazard quotient and lifetime risk for humans consuming contaminated fish. Results demonstrate that the ninety-fifth percentile of the hazard quotient for inorganic As ranged from 0.77 to 2.35 for Taipei city residents with fish consumption rates of 10-70 g/d, whereas it ranged from 1.86 to 6.09 for subsistence fishers in the BFD area with consumption rates of 48-143 g/d. The highest ninety-fifth percentile of potential health risk for inorganic As ranged from 1.92 × 10⁻⁴ to 5.25 × 10⁻⁴ for Taipei city residents eating tilapia harvested from Hsuehchia fish farms, with consumption rates of 10-70 g/d, whereas for subsistence fishers it was 7.36 × 10⁻⁴ to 1.12 × 10⁻³ with consumption rates of 48-143 g/d. These findings indicate that As exposure poses risks to residents and subsistence fishers, although these results arise under highly conservative conditions. We calculate the maximum allowable inorganic As residues associated with a standard unit risk; the resulting maximum target residues are 0.0019-0.0175 and 0.0023-0.0053 µg/g dry weight for tilapia and large-scale mullet, respectively, with consumption rates of 70-10 g/d, or 0.0009-0.0029 and 0.0011-0.0013 µg/g dry weight for consumption rates of 169-48 g/d.
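
    The hazard-quotient arithmetic underlying such assessments can be sketched as follows; the formula is the standard intake-over-reference-dose ratio, and the parameter values are hypothetical rather than the paper's.

```python
# Illustrative hazard-quotient arithmetic (standard risk-assessment form;
# all parameter values are hypothetical, not taken from the paper):
# HQ = (C * IR) / (BW * RfD).
C_as = 0.5     # inorganic As in fish, ug/g dry weight (hypothetical)
IR = 70.0      # fish ingestion rate, g/day
BW = 65.0      # body weight, kg
RfD = 3e-4     # oral reference dose for inorganic As, mg/kg/day (US EPA)

daily_dose = (C_as / 1000.0) * IR / BW   # mg/kg/day (ug -> mg conversion)
hq = daily_dose / RfD
print(f"daily dose = {daily_dose:.2e} mg/kg/day, HQ = {hq:.2f}")
# HQ > 1 indicates intake above the reference dose, i.e., potential risk.
```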

  13. Analysis and Modeling of Threatening Factors of Workforce’s Health in Large-Scale Workplaces: Comparison of Four Fitting Methods to Select the Optimum Technique

    PubMed Central

    Mohammadfam, Iraj; Soltanzadeh, Ahmad; Moghimbeigi, Abbas; Savareh, Behrouz Alizadeh

    2016-01-01

    Introduction Workforce is one of the pillars of development in any country. Therefore, the workforce’s health is very important, and analyzing its threatening factors is one of the fundamental steps for health planning. This study was the first part of a comprehensive study aimed at comparing fitting methods for analyzing and modeling the factors threatening health in occupational injuries. Methods In this study, 980 human occupational injuries in 10 Iranian large-scale workplaces over 10 years (2005–2014) were analyzed and modeled based on four fitting methods: linear regression, regression analysis, generalized linear model, and artificial neural networks (ANN), using IBM SPSS Modeler 14.2. Results The Accident Severity Rate (ASR) of occupational injuries was 557.47 ± 397.87. The mean age and work experience of injured workers were 27.82 ± 5.23 and 4.39 ± 3.65 years, respectively. Analysis of health-threatening factors showed that several factors, including age, quality of provided H&S training, number of workers, hazard identification (HAZID), periodic risk assessment, and periodic H&S training, significantly affected ASR. In addition, comparison of the four fitting methods showed that the correlation coefficient of ANN (R = 0.968) was the highest and the relative error of ANN (R.E = 0.063) the lowest among the fitting methods. Conclusion The findings of the present study indicated that, despite the suitability and effectiveness of all fitting methods in analyzing the severity of occupational injuries, ANN is the best fitting method for modeling the threatening factors of a workforce’s health. Furthermore, all fitting methods, especially ANN, should be considered more in analyzing and modeling occupational injuries and health-threatening factors, as well as in planning to provide and improve the workforce’s health. PMID:27053999
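    A minimal sketch of the model comparison described above, run on synthetic injury-severity data (not the study's dataset), with scikit-learn standing in for IBM SPSS Modeler; feature names and effect sizes are invented.

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        X = rng.uniform(0, 1, size=(980, 6))   # e.g. age, training quality, ...
        y = 500 + 300 * X[:, 0] - 200 * X[:, 3] ** 2 + rng.normal(0, 50, 980)

        for model in (LinearRegression(),
                      MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                                   random_state=0)):
            pred = model.fit(X, y).predict(X)
            r = np.corrcoef(y, pred)[0, 1]                    # correlation R
            rel_err = np.mean(np.abs(pred - y)) / np.mean(y)  # relative error
            print(type(model).__name__, round(r, 3), round(rel_err, 3))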

  14. Large-scale survey of rates of achieving targets for blood glucose, blood pressure, and lipids and prevalence of complications in type 2 diabetes (JDDM 40)

    PubMed Central

    Yokoyama, Hiroki; Oishi, Mariko; Takamura, Hiroshi; Yamasaki, Katsuya; Shirabe, Shin-ichiro; Uchida, Daigaku; Sugimoto, Hidekatsu; Kurihara, Yoshio; Araki, Shin-ichi; Maegawa, Hiroshi

    2016-01-01

    Objective The increasing prevalence of type 2 diabetes mellitus and rising patient bodyweight, alongside improving diabetes care, make it important to explore up-to-date rates of achieving treatment targets and the prevalence of complications. We investigated the prevalence of microvascular/macrovascular complications and the rates of achieving treatment targets through a large-scale multicenter-based cohort. Research design and methods A cross-sectional nationwide survey was performed on 9956 subjects with type 2 diabetes mellitus who consecutively attended primary care clinics. The prevalence of nephropathy, retinopathy, neuropathy, and macrovascular complications and the rates of achieving targets of glycated hemoglobin (HbA1c) <7.0%, blood pressure <130/80 mm Hg, and lipids of low-density/high-density lipoprotein cholesterol <3.1/≥1.0 mmol/L and non-high-density lipoprotein cholesterol <3.8 mmol/L were investigated. Results The rates of achieving targets for HbA1c, blood pressure, and lipids were 52.9%, 46.8% and 65.5%, respectively. The prevalence of each microvascular complication was ∼28%, with 6.4% of subjects having all three, while the prevalence of macrovascular complications was 12.6%. With increasing duration of diabetes, the rate of achieving the HbA1c target decreased and the prevalence of each complication increased despite increased use of diabetes medication. The prevalence of each complication decreased with the number of the 3 treatment targets achieved and was lower in subjects without macrovascular complications than in those with them. Adjustment for relevant covariates showed that the complications were closely inter-related, and that achievement of each target was significantly associated with being free of each complication. Conclusions Almost half of the subjects examined did not meet the recommended targets. The risk of each complication was significantly affected by 1 on-target treatment (inversely) and the

  15. Self-Assessments or Tests? Comparing Cross-National Differences in Patterns and Outcomes of Graduates' Skills Based on International Large-Scale Surveys

    ERIC Educational Resources Information Center

    Humburg, Martin; van der Velden, Rolf

    2015-01-01

    In this paper an analysis is carried out of whether objective tests and subjective self-assessments in international large-scale studies yield similar results when looking at cross-national differences in the effects of skills on earnings, and at skills patterns across countries, fields of study and gender. The findings indicate that subjective skills…

  16. The Development of the Older Persons and Informal Caregivers Survey Minimum DataSet (TOPICS-MDS): A Large-Scale Data Sharing Initiative

    PubMed Central

    Lutomski, Jennifer E.; Baars, Maria A. E.; Schalk, Bianca W. M.; Boter, Han; Buurman, Bianca M.; den Elzen, Wendy P. J.; Jansen, Aaltje P. D.; Kempen, Gertrudis I. J. M.; Steunenberg, Bas; Steyerberg, Ewout W.; Olde Rikkert, Marcel G. M.; Melis, René J. F.

    2013-01-01

    Introduction In 2008, the Ministry of Health, Welfare and Sport commissioned the National Care for the Elderly Programme. While numerous research projects in older persons’ health care were to be conducted under this national agenda, the Programme further advocated the development of The Older Persons and Informal Caregivers Survey Minimum DataSet (TOPICS-MDS), which would be integrated into all funded research protocols. In this context, we describe the TOPICS data sharing initiative (www.topics-mds.eu). Materials and Methods A working group drafted the TOPICS-MDS prototype, which was subsequently approved by a multidisciplinary panel. Using instruments validated for older populations, information was collected on demographics, morbidity, quality of life, functional limitations, mental health, social functioning and health service utilisation. For informal caregivers, information was collected on demographics, hours of informal care and quality of life (including subjective care-related burden). Results Between 2010 and 2013, a total of 41 research projects contributed data to TOPICS-MDS, resulting in preliminary data available for 32,310 older persons and 3,940 informal caregivers. The majority of studies sampled were from primary care settings and inclusion criteria differed across studies. Discussion TOPICS-MDS is a public data repository which contains essential data to better understand health challenges experienced by older persons and informal caregivers. Such findings are relevant for countries where increasing health-related expenditure has necessitated the evaluation of contemporary health care delivery. Although open sharing of data can be difficult to achieve in practice, proactively addressing issues of data protection, conflicting data analysis requests and funding limitations during the TOPICS-MDS developmental phase has fostered a data sharing culture. To date, TOPICS-MDS has been successfully incorporated into 41 research projects, thus supporting the

  17. A Numeric Scorecard Assessing the Mental Health Preparedness for Large-Scale Crises at College and University Campuses: A Delphi Study

    ERIC Educational Resources Information Center

    Burgin, Rick A.

    2012-01-01

    Large-scale crises continue to surprise, overwhelm, and shatter college and university campuses. While the devastation to physical plants and persons is often evident and is addressed with crisis management plans, the number of emotional casualties left in the wake of these large-scale crises may not be apparent and is often not addressed with…

  18. Rapid identification of viruses causing sugarcane mosaic by direct sequencing of RT-PCR products from crude extracts: a method for large scale virus surveys.

    PubMed

    Gómez, Maximiliano; Rago, Alejandro M; Serino, Germán

    2009-05-01

    Sugarcane mosaic virus (SCMV) and sorghum mosaic virus (SrMV) diversity studies are important for characterizing virus populations in sugarcane producing areas, enabling (i) identification of shifts in predominant strains, (ii) detection of associations between strains and specific varieties, and (iii) possibly exposure of the appearance of new strains which may affect the performance of varieties in a region. Recent studies have shown significant sequence variability within SCMV populations around the world, indicating that isolate identification would be best achieved by direct analysis of sequence data. Because virus sequence-based studies that require the characterization of large numbers of isolates may be impractical using standard sample preparation and processing methodology, a simple protocol was developed that yields quality sequence information and requires neither viral RNA purification nor cloning of RT-PCR products. Rapid virus release extracts are obtained by submerging a portion of leaf tissue in an extraction buffer, followed by a brief incubation at 95 degrees C. An aliquot of the extract is pipetted into an RT-PCR amplification mix for the detection of SCMV and SrMV coat protein gene fragments. RT-PCR fragments are sequenced directly using oligonucleotide primers similar to the RT-PCR primers, yielding sequence information of adequate quality. This rapid, cost-effective protocol is practical for large-scale virus diversity and evolutionary studies.

  19. Collecting reliable information about violence against women safely in household interviews: experience from a large-scale national survey in South Asia.

    PubMed

    Andersson, Neil; Cockcroft, Anne; Ansari, Noor; Omer, Khalid; Chaudhry, Ubaid Ullah; Khan, Amir; Pearson, Luwei

    2009-04-01

    This article describes the first national survey of violence against women in Pakistan, conducted from 2001 to 2004 and covering 23,430 women. The survey took account of methodological and ethical recommendations, ensuring privacy by having one person interview the mother-in-law while another interviewed the eligible woman in private. The training module for interviewers focused on empathy with respondents, notably increasing disclosure rates. Only 3% of women declined to participate, and 1% were not permitted to participate. Among women who disclosed physical violence, only one third had previously told anyone. Surveys of violence against women in Pakistan that do not use methods to minimize underreporting could seriously underestimate prevalence.

  20. The Cosmic Evolution Survey (COSMOS): A Large-Scale Structure at z=0.73 and the Relation of Galaxy Morphologies to Local Environment

    NASA Astrophysics Data System (ADS)

    Guzzo, L.; Cassata, P.; Finoguenov, A.; Massey, R.; Scoville, N. Z.; Capak, P.; Ellis, R. S.; Mobasher, B.; Taniguchi, Y.; Thompson, D.; Ajiki, M.; Aussel, H.; Böhringer, H.; Brusa, M.; Calzetti, D.; Comastri, A.; Franceschini, A.; Hasinger, G.; Kasliwal, M. M.; Kitzbichler, M. G.; Kneib, J.-P.; Koekemoer, A.; Leauthaud, A.; McCracken, H. J.; Murayama, T.; Nagao, T.; Rhodes, J.; Sanders, D. B.; Sasaki, S.; Shioya, Y.; Tasca, L.; Taylor, J. E.

    2007-09-01

    We have identified a large-scale structure at z ≈ 0.73 in the COSMOS field, coherently described by the distribution of galaxy photometric redshifts, an ACS weak-lensing convergence map, and the distribution of extended X-ray sources in a mosaic of XMM-Newton observations. The main peak seen in these maps corresponds to a rich cluster with T_X = 3.51 (+0.60/-0.46) keV and L_X = (1.56 ± 0.04) × 10^44 erg s^-1 (0.1-2.4 keV band). We estimate an X-ray mass within r_500 corresponding to M_500 ≈ 1.6 × 10^14 M_sun and a total lensing mass (extrapolated by fitting an NFW profile) M_NFW = (6 ± 3) × 10^15 M_sun. We use an automated morphological classification of all galaxies brighter than I_AB = 24 over the structure area to measure the fraction of early-type objects as a function of local projected density Σ_10, based on photometric redshifts derived from ground-based deep multiband photometry. We recover a robust morphology-density relation at this redshift, indicating, for comparable local densities, a smaller fraction of early-type galaxies than today. Interestingly, this difference is less strong at the highest densities and becomes more severe in intermediate environments. We also find, however, local "inversions" of the observed global relation, possibly driven by the large-scale environment. In particular, we find direct correspondence of a large concentration of disk galaxies to (the colder side of) a possible shock region detected in the X-ray temperature map and surface brightness distribution of the dominant cluster. We interpret this as potential evidence of shock-induced star formation in existing galaxy disks, during the ongoing merger between two subclusters. Our analysis reveals the value of combining various measures of the projected mass density to locate distant structures and their potential for elucidating the physical processes at work in the transformation of galaxy morphologies. Based on observations with the NASA/ESA Hubble Space Telescope, obtained at the Space
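    The local projected density Σ_10 used in morphology-density analyses is, in its usual form, 10 / (π d_10^2), with d_10 the projected distance to the 10th nearest neighbour inside a photometric-redshift slice. A brute-force sketch on synthetic positions (not COSMOS data):

        import numpy as np

        def sigma_n(positions, n=10):
            # positions: (N, 2) projected comoving coordinates in Mpc.
            d = np.linalg.norm(positions[:, None, :] - positions[None, :, :],
                               axis=-1)
            d.sort(axis=1)                      # column 0 is the zero self-distance
            return n / (np.pi * d[:, n] ** 2)   # galaxies per Mpc^2

        pos = np.random.default_rng(1).uniform(0, 50, size=(500, 2))
        print(sigma_n(pos)[:5])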

  1. A large scale survey of trace metal levels in coastal waters of the Western Mediterranean basin using caged mussels (Mytilus galloprovincialis).

    PubMed

    Benedicto, José; Andral, Bruno; Martínez-Gómez, Concepción; Guitart, Carlos; Deudero, Salud; Cento, Alessandro; Scarpato, Alfonso; Caixach, Josep; Benbrahim, Samir; Chouba, Lassaad; Boulahdid, Mostefa; Galgani, François

    2011-05-01

    A large-scale study of trace metal contamination (Hg, Cd, Pb and Ni) by means of caged mussels Mytilus galloprovincialis was undertaken along the coastal waters of the Western Mediterranean Sea within the context of the MYTILOS project. Individual mussels from a homogeneous population (shell size 50 ± 5 mm) obtained from an aquaculture farm were consecutively caged and deployed at 123 sites located in the Alborán, North-Western, South-Western and Tyrrhenian sub-basins for 12 weeks (April-July) in 2004, 2005 and 2006. After cage recovery, both the metal content in the whole mussel tissue and the allometric parameters were measured. Statistical analysis of the datasets showed significant differences between sub-basins in the concentrations of some metals and in the mussel condition index (CI). Linear regression models coupled to the CI were revisited for the data adjustment of certain trace metals (Hg, Cd and Ni), and four level categories were statistically derived to facilitate interregional comparison. Seawater masses surrounding coastal areas impacted by run-off from land, mineralised coasts and industrial activities displayed the highest concentration ranges (Hg: 0.15-0.31 mg kg(-1) dw; Cd: 1.97-2.11; Ni: 2.18-3.20 and Pb: 3.1-3.8), although the levels obtained at most of the sites fell within the moderate or low categories and could be considered baseline concentrations. However, a few sites currently considered little influenced by human activities showed high concentrations of Cd, Ni and Pb; these constitute new areas of concern. Overall, the active biomonitoring (ABM) approach made it possible to investigate trace metal contamination and thereby support policy makers in establishing regional strategies (particularly with regard to the European Marine Strategy Directive). PMID:21384032
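    A hedged sketch of the condition-index adjustment mentioned above: regress tissue concentration on CI, then standardise each site to the mean CI. Data and coefficients are synthetic, and the published models are more involved than this single-predictor version.

        import numpy as np

        rng = np.random.default_rng(2)
        ci = rng.normal(20, 3, 123)                         # condition index per site
        hg = 0.10 + 0.004 * ci + rng.normal(0, 0.02, 123)   # Hg, mg/kg dw

        slope, intercept = np.polyfit(ci, hg, 1)            # simple linear model
        hg_adj = hg + slope * (ci.mean() - ci)              # shift each site to mean CI
        print(round(slope, 4), hg_adj[:3])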

  2. Effectiveness of a large-scale health and nutritional education program on anemia in children younger than 5 years in Shifang, a heavily damaged area of Wenchuan earthquake.

    PubMed

    Yang, Fan; Wang, Chuan; Yang, Hui; Yang, Huiming; Yang, Sufei; Yu, Tao; Tang, Zhanghui; Ji, Qiaoyun; Li, Fengyi; Shi, Hua; Mao, Meng

    2015-03-01

    This study aimed to explore an ideal way to prevent anemia among children younger than 5 years after disasters, especially when health care facilities are insufficient. A preliminary survey was carried out involving 13 065 children younger than 5 years. Pretested questionnaires were used for data collection and hemoglobin levels were measured. After a 12-month intervention, an impact survey involving 2769 children was conducted. Results showed improvements in both feeding knowledge and practices related to anemia. The total prevalence of anemia decreased from 14.3% to 7.8% (P < .001), and the severity of anemia also declined. The mean hemoglobin concentration increased significantly from 118.8 ± 10.5 to 122.0 ± 9.9 g/L (P < .001). Thus, health and nutritional education, with multiparty cooperation, could be an ideal way to combat anemia after disasters, especially in less developed areas. The methods and experiences of this study may be well worth learning from and implementing.

  3. What Sort of Girl Wants to Study Physics after the Age of 16? Findings from a Large-Scale UK Survey

    ERIC Educational Resources Information Center

    Mujtaba, Tamjid; Reiss, Michael J.

    2013-01-01

    This paper investigates the characteristics of 15-year-old girls who express an intention to study physics post-16. This paper unpacks issues around within-girl group differences and similarities between boys and girls in survey responses about physics. The analysis is based on the year 10 (age 15 years) responses of 5,034 students from 137 UK…

  4. Is cost-related non-collection of prescriptions associated with a reduction in health? Findings from a large-scale longitudinal study of New Zealand adults

    PubMed Central

    Jatrana, Santosh; Richardson, Ken; Norris, Pauline; Crampton, Peter

    2015-01-01

    Objective To investigate whether cost-related non-collection of prescription medication is associated with a decline in health. Settings New Zealand Survey of Family, Income and Employment (SoFIE)-Health. Participants Data from 17 363 participants with at least two observations in three waves (2004–2005, 2006–2007, 2008–2009) of a panel study were analysed using fixed effects regression modelling. Primary outcome measures Self-rated health (SRH), physical health (PCS) and mental health scores (MCS) were the health measures used in this study. Results After adjusting for time-varying confounders, non-collection of prescription items was associated with a 0.11 (95% CI 0.07 to 0.15) unit worsening in SRH, a 1.00 (95% CI 0.61 to 1.40) unit decline in PCS and a 1.69 (95% CI 1.19 to 2.18) unit decline in MCS. The interaction of the main exposure with gender was significant for SRH and MCS. Non-collection of prescription items was associated with a decline in SRH of 0.18 (95% CI 0.11 to 0.25) units for males and 0.08 (95% CI 0.03 to 0.13) units for females, and a decrease in MCS of 2.55 (95% CI 1.67 to 3.42) and 1.29 (95% CI 0.70 to 1.89) units for males and females, respectively. The interaction of the main exposure with age was significant for SRH. For respondents aged 15–24 and 25–64 years, non-collection of prescription items was associated with a decline in SRH of 0.12 (95% CI 0.03 to 0.21) and 0.12 (95% CI 0.07 to 0.17) units, respectively, but for respondents aged 65 years and over, non-collection of prescription items had no significant effect on SRH. Conclusion Our results show that those who do not collect prescription medications because of cost have an increased risk of a subsequent decline in health. PMID:26553826
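    The fixed-effects (within) estimator used in panel analyses of this kind can be sketched on a synthetic panel as below; variable names are invented, and the study's models include time-varying confounders omitted here.

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(3)
        n, waves = 1000, 3
        df = pd.DataFrame({
            "pid": np.repeat(np.arange(n), waves),
            "noncollect": rng.integers(0, 2, n * waves).astype(float),
        })
        alpha = np.repeat(rng.normal(0, 1, n), waves)   # person fixed effects
        df["srh"] = (3.0 - 0.11 * df["noncollect"] + alpha
                     + rng.normal(0, 0.5, n * waves))

        # Demean outcome and exposure within person, then OLS through the origin:
        y = df["srh"] - df.groupby("pid")["srh"].transform("mean")
        x = df["noncollect"] - df.groupby("pid")["noncollect"].transform("mean")
        print((x * y).sum() / (x ** 2).sum())   # recovers roughly -0.11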

  5. A Short Survey on the State of the Art in Architectures and Platforms for Large Scale Data Analysis and Knowledge Discovery from Data

    SciTech Connect

    Begoli, Edmon

    2012-01-01

    Intended as a survey for practicing architects and researchers seeking an overview of the state-of-the-art architectures for data analysis, this paper provides an overview of the emerging data management and analytic platforms, including parallel databases, Hadoop-based systems, High Performance Computing (HPC) platforms and platforms popularly referred to as NoSQL platforms. Platforms are presented based on their relevance, the analysis they support and the data organization model they support.

  6. Very Large Scale Integration (VLSI).

    ERIC Educational Resources Information Center

    Yeaman, Andrew R. J.

    Very Large Scale Integration (VLSI), the state-of-the-art production techniques for computer chips, promises such powerful, inexpensive computing that, in the future, people will be able to communicate with computer devices in natural language or even speech. However, before full-scale VLSI implementation can occur, certain salient factors must be…

  7. A 1.85-m mm-submm Telescope for Large-Scale Molecular Gas Surveys in 12CO, 13CO, and C18O (J = 2-1)

    NASA Astrophysics Data System (ADS)

    Onishi, Toshikazu; Nishimura, Atsushi; Ota, Yuya; Hashizume, Akio; Kojima, Yoshiharu; Minami, Akihito; Tokuda, Kazuki; Touga, Shiori; Abe, Yasuhiro; Kaiden, Masahiro; Kimura, Kimihiro; Muraoka, Kazuyuki; Maezawa, Hiroyuki; Ogawa, Hideo; Dobashi, Kazuhito; Shimoikura, Tomomi; Yonekura, Yoshinori; Asayama, Shin'ichiro; Handa, Toshihiro; Nakajima, Taku; Noguchi, Takashi; Kuno, Nario

    2013-08-01

    We have developed a new mm-submm telescope with a diameter of 1.85 m installed at the Nobeyama Radio Observatory. The scientific goal is to precisely reveal the physical properties of molecular clouds in the Milky Way Galaxy by obtaining a large-scale distribution of molecular gas, which can also be compared with large-scale observations at various wavelengths. The target frequency is ~230 GHz; simultaneous observations of the J = 2-1 rotational lines of three carbon monoxide isotopologues (12CO, 13CO, C18O) are achieved with a beam size (HPBW) of 2.7'. In order to accomplish the simultaneous observations, we have developed waveguide-type sideband-separating SIS mixers to obtain spectra separately in the upper and lower sidebands. A Fourier digital spectrometer with a 1 GHz bandwidth and 16384 channels is installed, and the bandwidth of the spectrometer is divided into three parts, one for each of the three spectra; the IF system has been designed to inject these three lines into the spectrometer. A flexible observation system was created mainly in Python on Linux PCs, enabling effective OTF (On-The-Fly) scans for large-area mapping. The telescope is enclosed in a membrane-covered radome to prevent harmful effects of sunlight, strong wind, and precipitation, in order to minimize errors in the telescope pointing and to stabilize the receiver and the IF devices. Science operation started in November 2011, resulting in large-scale surveys of the Orion A/B clouds, Cygnus OB7, the Galactic plane, Taurus, and so on. We also updated the receiver system for dual-polarization observations.
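    The abstract notes the observation system was written in Python; the sketch below shows one plausible shape for an OTF raster-scan generator (serpentine rows, samples at the data-dump rate). All parameters and the function name are illustrative, not the observatory's actual code.

        import numpy as np

        def otf_raster(x0, y0, width, height, row_step, speed, dump_rate):
            # Yield (x, y) sample points in degrees; row_step ~ HPBW/2
            # (2.7'/2 here) gives Nyquist sampling across the scan rows.
            dx = speed / dump_rate          # spacing along a row, deg
            xs = np.arange(0.0, width + dx, dx)
            for i in range(int(height / row_step) + 1):
                row = xs if i % 2 == 0 else xs[::-1]   # boustrophedon scan
                for x in row:
                    yield (x0 + x, y0 + i * row_step)

        pts = list(otf_raster(83.8, -5.4, 1.0, 0.8,
                              row_step=1.35 / 60, speed=0.01, dump_rate=10))
        print(len(pts), pts[:2])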

  8. IRAM 30 m large scale survey of 12CO(2-1) and 13CO(2-1) emission in the Orion molecular cloud

    SciTech Connect

    Berné, O.; Cernicharo, J.; Marcelino, N.

    2014-11-01

    Using the IRAM 30 m telescope, we have surveyed a 1° × 0.8° part of the Orion molecular cloud in the 12CO and 13CO (2-1) lines with a maximal spatial resolution of ~11'' and spectral resolution of ~0.4 km s^-1. The cloud appears filamentary, clumpy, and with a complex kinematical structure. We derive an estimated mass of the cloud of 7700 M_sun (half of which is found in regions with visual extinctions A_V below ~10) and a dynamical age for the nebula of the order of 0.2 Myr. The energy balance suggests that magnetic fields play an important role in supporting the cloud, at large and small scales. According to our analysis, the turbulent kinetic energy in the molecular gas due to outflows is comparable to the turbulent kinetic energy resulting from the interaction of the cloud with the H II region. This latter feedback appears negative, i.e., the triggering of star formation by the H II region is inefficient in Orion. The reduced data as well as additional products such as the column density map are made available online (http://userpages.irap.omp.eu/~oberne/Olivier_Berne/Data).

  9. What Sort of Girl Wants to Study Physics After the Age of 16? Findings from a Large-scale UK Survey

    NASA Astrophysics Data System (ADS)

    Mujtaba, Tamjid; Reiss, Michael J.

    2013-11-01

    This paper investigates the characteristics of 15-year-old girls who express an intention to study physics post-16. This paper unpacks issues around within-girl group differences and similarities between boys and girls in survey responses about physics. The analysis is based on the year 10 (age 15 years) responses of 5,034 students from 137 UK schools as learners of physics during the academic year 2008-2009. A comparison between boys and girls indicates the pervasiveness of gender issues, with boys more likely to respond positively towards physics-specific constructs than girls. The analysis also indicates that girls and boys who expressed intentions to participate in physics post-16 gave similar responses towards their physics teachers and physics lessons and had comparable physics extrinsic motivation. Girls (regardless of their intention to participate in physics) were less likely than boys to be encouraged to study physics post-16 by teachers, family and friends. Despite this, there was a subset of girls who still intended to study physics post-16. The crucial difference between the girls who intended to study physics post-16 and those who did not is that the former had higher physics extrinsic motivation, more positive perceptions of physics teachers and lessons, greater competitiveness and a tendency to be less extroverted. This strongly suggests that higher extrinsic motivation in physics could be the crucial underlying key that encourages a subset of girls (as well as boys) to pursue physics post-16.

  10. A large-scale survey of the novel 15q24 microdeletion syndrome in autism spectrum disorders identifies an atypical deletion that narrows the critical region

    PubMed Central

    2010-01-01

    Background The 15q24 microdeletion syndrome has been recently described as a recurrent, submicroscopic genomic imbalance found in individuals with intellectual disability, typical facial appearance, hypotonia, and digital and genital abnormalities. Gene dosage abnormalities, including copy number variations (CNVs), have been identified in a significant fraction of individuals with autism spectrum disorders (ASDs). In this study we surveyed two ASD cohorts for 15q24 abnormalities to assess the frequency of genomic imbalances in this interval. Methods We screened 173 unrelated subjects with ASD from the Central Valley of Costa Rica and 1336 subjects with ASD from 785 independent families registered with the Autism Genetic Resource Exchange (AGRE) for CNVs across 15q24 using oligonucleotide arrays. Rearrangements were confirmed by array comparative genomic hybridization and quantitative PCR. Results Among the patients from Costa Rica, an atypical de novo deletion of 3.06 Mb in 15q23-q24.1 was detected in a boy with autism sharing many features with the other 13 subjects with the 15q24 microdeletion syndrome described to date. He exhibited intellectual disability, constant smiling, characteristic facial features (high anterior hairline, broad medial eyebrows, epicanthal folds, hypertelorism, full lower lip and protuberant, posteriorly rotated ears), single palmar crease, toe syndactyly and congenital nystagmus. The deletion breakpoints are atypical and lie outside previously characterized low copy repeats (69.838-72.897 Mb). Genotyping data revealed that the deletion had occurred in the paternal chromosome. Among the AGRE families, no large 15q24 deletions were observed. Conclusions From the current and previous studies, deletions in the 15q24 region represent rare causes of ASDs with an estimated frequency of 0.1 to 0.2% in individuals ascertained for ASDs, although the proportion might be higher in sporadic cases. These rates compare with a frequency of about 0.3% in

  11. Condition Monitoring of Large-Scale Facilities

    NASA Technical Reports Server (NTRS)

    Hall, David L.

    1999-01-01

    This document provides a summary of the research conducted for the NASA Ames Research Center under grant NAG2-1182 (Condition-Based Monitoring of Large-Scale Facilities). The information includes copies of view graphs presented at NASA Ames in the final Workshop (held during December of 1998), as well as a copy of a technical report provided to the COTR (Dr. Anne Patterson-Hine) subsequent to the workshop. The material describes the experimental design, collection of data, and analysis results associated with monitoring the health of large-scale facilities. In addition to this material, a copy of the Pennsylvania State University Applied Research Laboratory data fusion visual programming tool kit was also provided to NASA Ames researchers.

  12. An Integrative Structural Health Monitoring System for the Local/Global Responses of a Large-Scale Irregular Building under Construction

    PubMed Central

    Park, Hyo Seon; Shin, Yunah; Choi, Se Woon; Kim, Yousok

    2013-01-01

    In this study, a practical and integrative SHM system was developed and applied to a large-scale irregular building under construction, where many challenging issues exist. In the proposed sensor network, customized energy-efficient wireless sensing units (sensor nodes, repeater nodes, and master nodes) were employed and comprehensive communications from the sensor node to the remote monitoring server were conducted through wireless communications. The long-term (13-month) monitoring results recorded from a large number of sensors (75 vibrating wire strain gauges, 10 inclinometers, and three laser displacement sensors) indicated that the construction event exhibiting the largest influence on structural behavior was the removal of bents that were temporarily installed to support the free end of the cantilevered members during their construction. The safety of each member could be confirmed based on the quantitative evaluation of each response. Furthermore, it was also confirmed that the relation between these responses (i.e., deflection, strain, and inclination) can provide information about the global behavior of structures induced from specific events. Analysis of the measurement results demonstrates the proposed sensor network system is capable of automatic and real-time monitoring and can be applied and utilized for both the safety evaluation and precise implementation of buildings under construction. PMID:23860317

  13. An integrative structural health monitoring system for the local/global responses of a large-scale irregular building under construction.

    PubMed

    Park, Hyo Seon; Shin, Yunah; Choi, Se Woon; Kim, Yousok

    2013-07-15

    In this study, a practical and integrative SHM system was developed and applied to a large-scale irregular building under construction, where many challenging issues exist. In the proposed sensor network, customized energy-efficient wireless sensing units (sensor nodes, repeater nodes, and master nodes) were employed and comprehensive communications from the sensor node to the remote monitoring server were conducted through wireless communications. The long-term (13-month) monitoring results recorded from a large number of sensors (75 vibrating wire strain gauges, 10 inclinometers, and three laser displacement sensors) indicated that the construction event exhibiting the largest influence on structural behavior was the removal of bents that were temporarily installed to support the free end of the cantilevered members during their construction. The safety of each member could be confirmed based on the quantitative evaluation of each response. Furthermore, it was also confirmed that the relation between these responses (i.e., deflection, strain, and inclination) can provide information about the global behavior of structures induced from specific events. Analysis of the measurement results demonstrates the proposed sensor network system is capable of automatic and real-time monitoring and can be applied and utilized for both the safety evaluation and precise implementation of buildings under construction.
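    A minimal sketch of the safety-evaluation step such a monitoring server might run: compare each vibrating-wire strain reading against a member-specific allowable value and flag exceedances. Gauge IDs, limits and readings are invented for illustration.

        ALLOWABLE = {"VW01": 800e-6, "VW02": 650e-6}   # allowable strain per gauge

        def check(readings):
            # readings: iterable of (gauge_id, strain) from one polling cycle.
            alerts = []
            for gauge, strain in readings:
                limit = ALLOWABLE.get(gauge)
                if limit is not None and abs(strain) > limit:
                    alerts.append((gauge, strain, limit))
            return alerts

        print(check([("VW01", 520e-6), ("VW02", 700e-6)]))   # flags VW02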

  14. A large-scale field study examining effects of exposure to clothianidin seed-treated canola on honey bee colony health, development, and overwintering success

    PubMed Central

    Scott-Dupree, Cynthia D.; Sultan, Maryam; McFarlane, Andrew D.; Brewer, Larry

    2014-01-01

    In summer 2012, we initiated a large-scale field experiment in southern Ontario, Canada, to determine whether exposure to clothianidin seed-treated canola (oil seed rape) has any adverse impacts on honey bees. Colonies were placed in clothianidin seed-treated or control canola fields during bloom, and thereafter were moved to an apiary with no surrounding crops grown from seeds treated with neonicotinoids. Colony weight gain, honey production, pest incidence, bee mortality, number of adults, and amount of sealed brood were assessed in each colony throughout summer and autumn. Samples of honey, beeswax, pollen, and nectar were regularly collected, and samples were analyzed for clothianidin residues. Several of these endpoints were also measured in spring 2013. Overall, colonies were vigorous during and after the exposure period, and we found no effects of exposure to clothianidin seed-treated canola on any endpoint measures. Bees foraged heavily on the test fields during peak bloom and residue analysis indicated that honey bees were exposed to low levels (0.5–2 ppb) of clothianidin in pollen. Low levels of clothianidin were detected in a few pollen samples collected toward the end of the bloom from control hives, illustrating the difficulty of conducting a perfectly controlled field study with free-ranging honey bees in agricultural landscapes. Overwintering success did not differ significantly between treatment and control hives, and was similar to overwintering colony loss rates reported for the winter of 2012–2013 for beekeepers in Ontario and Canada. Our results suggest that exposure to canola grown from seed treated with clothianidin poses low risk to honey bees. PMID:25374790

  15. California's "5 a day--for better health!" campaign: an innovative population-based effort to effect large-scale dietary change.

    PubMed

    Foerster, S B; Kizer, K W; Disogra, L K; Bal, D G; Krieg, B F; Bunch, K L

    1995-01-01

    The annual toll of diet-related diseases in the United States is similar to that taken by tobacco, but less progress has been achieved in reaching the Public Health Service's Healthy People 2000 objectives for improving food consumption than for reducing tobacco use. In 1988, the California Department of Health Services embarked upon an innovative multi-year social marketing program to increase fruit and vegetable consumption. The 5 a Day--for Better Health! Campaign had several distinctive features, including its simple, positive, behavior-specific message to eat 5 servings of fruits and vegetables every day as part of a low-fat, high fiber diet; its use of mass media; its partnership between the state health department and the produce and supermarket industries; and its extensive use of point-of-purchase messages. Over its nearly three years of operation in California, the 5 a Day Campaign appears to have raised public awareness that fruits and vegetables help reduce cancer risk, increased fruit and vegetable consumption in major population segments, and created an ongoing partnership between public health and agribusiness that has allowed extension of the campaign to other population segments, namely children and Latino adults. In 1991 the campaign was adopted as a national initiative by the National Cancer Institute and the Produce for Better Health Foundation. By 1994, over 700 industry organizations and 48 states, territories, and the District of Columbia were licensed to participate. Preventive medicine practitioners and others involved in health promotion may build upon the 5 a Day Campaign experience in developing and implementing efforts to reach the nation's dietary goals.

  16. Large scale cluster computing workshop

    SciTech Connect

    Dane Skow; Alan Silverman

    2002-12-23

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near-term projects within High Energy Physics and other computing communities will deploy clusters of thousands of processors, used by hundreds to thousands of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes), and by implication to identify areas where some investment of money or effort is likely to be needed; (2) to compare and record experiences gained with such tools; (3) to produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP; and (4) to identify and connect groups with similar interests within HENP and the larger clustering community.

  17. Large Scale Magnetostrictive Valve Actuator

    NASA Technical Reports Server (NTRS)

    Richard, James A.; Holleman, Elizabeth; Eddleman, David

    2008-01-01

    Marshall Space Flight Center's Valves, Actuators and Ducts Design and Development Branch developed a large-scale magnetostrictive valve actuator. The potential advantages of this technology are faster, more efficient valve actuators that consume less power, provide precise position control, and deliver higher flow rates than conventional solenoid valves. Magnetostrictive materials change dimensions when a magnetic field is applied; this property is referred to as magnetostriction. Magnetostriction is caused by the alignment of the magnetic domains in the material's crystalline structure with the applied magnetic field lines. Typically, the material changes shape by elongating in the axial direction and constricting in the radial direction, resulting in no net change in volume. All hardware and testing is complete. This paper discusses the potential applications of the technology; gives an overview of the as-built actuator design; describes problems uncovered during development testing; reviews test data and evaluates weaknesses of the design; and discusses areas for improvement for future work. This actuator holds promise as a low-power, high-load, proportionally controlled actuator for valves requiring loads of 440 to 1500 newtons.

  18. Local gravity and large-scale structure

    NASA Technical Reports Server (NTRS)

    Juszkiewicz, Roman; Vittorio, Nicola; Wyse, Rosemary F. G.

    1990-01-01

    The magnitude and direction of the observed dipole anisotropy of the galaxy distribution can in principle constrain the amount of large-scale power present in the spectrum of primordial density fluctuations. This paper confronts the data, provided by a recent redshift survey of galaxies detected by the IRAS satellite, with the predictions of two cosmological models with very different levels of large-scale power: the biased Cold Dark Matter dominated model (CDM) and a baryon-dominated model (BDM) with isocurvature initial conditions. Model predictions are investigated for the Local Group peculiar velocity, v(R), induced by mass inhomogeneities distributed out to a given radius, R, for R less than about 10,000 km/s. Several convergence measures for v(R) are developed, which can become powerful cosmological tests when deep enough samples become available. For the present data sets, the CDM and BDM predictions are indistinguishable at the 2 sigma level and both are consistent with observations. A promising discriminant between cosmological models is the misalignment angle between v(R) and the apex of the dipole anisotropy of the microwave background.
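    The comparison rests on the standard linear-theory relation between the Local Group peculiar velocity and the surrounding density field, quoted here in its textbook form as a hedged reconstruction rather than the paper's exact expression:

        \mathbf{v}(R) = \frac{H_0\, f(\Omega)}{4\pi}
            \int_{|\mathbf{r}|<R} \delta(\mathbf{r})\,
            \frac{\hat{\mathbf{r}}}{r^2}\, \mathrm{d}^3 r,
        \qquad f(\Omega) \simeq \Omega^{0.6}

    If galaxies trace mass with linear bias b, the amplitude inferred from a galaxy survey scales as f/b, which is why the convergence of v(R) with depth can discriminate between models with different amounts of large-scale power.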

  19. Large Scale Nanolaminate Deformable Mirror

    SciTech Connect

    Papavasiliou, A; Olivier, S; Barbee, T; Miles, R; Chang, K

    2005-11-30

    This work concerns the development of a technology that uses Nanolaminate foils to form light-weight, deformable mirrors that are scalable over a wide range of mirror sizes. While MEMS-based deformable mirrors and spatial light modulators have considerably reduced the cost and increased the capabilities of adaptive optic systems, there has not been a way to utilize the advantages of lithography and batch-fabrication to produce large-scale deformable mirrors. This technology is made scalable by using fabrication techniques and lithography that are not limited to the sizes of conventional MEMS devices. Like many MEMS devices, these mirrors use parallel plate electrostatic actuators. This technology replicates that functionality by suspending a horizontal piece of nanolaminate foil over an electrode by electroplated nickel posts. This actuator is attached, with another post, to another nanolaminate foil that acts as the mirror surface. Most MEMS devices are produced with integrated circuit lithography techniques that are capable of very small line widths, but are not scalable to large sizes. This technology is very tolerant of lithography errors and can use coarser, printed circuit board lithography techniques that can be scaled to very large sizes. These mirrors use small, lithographically defined actuators and thin nanolaminate foils allowing them to produce deformations over a large area while minimizing weight. This paper will describe a staged program to develop this technology. First-principles models were developed to determine design parameters. Three stages of fabrication will be described starting with a 3 x 3 device using conventional metal foils and epoxy to a 10-across all-metal device with nanolaminate mirror surfaces.
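    The actuation principle is the standard parallel-plate electrostatic force; a back-of-envelope check with assumed geometry and drive voltage (values are illustrative, not the device's specifications):

        EPS0 = 8.854e-12    # vacuum permittivity, F/m

        def plate_force(voltage, area, gap):
            # Attractive force F = eps0 * A * V^2 / (2 d^2), in newtons.
            return EPS0 * area * voltage ** 2 / (2 * gap ** 2)

        # 200 V across a 1 cm^2 actuator pad with a 5-micron gap:
        print(plate_force(voltage=200.0, area=1e-4, gap=5e-6))   # ~0.7 N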

  20. Large-Scale Information Systems

    SciTech Connect

    D. M. Nicol; H. R. Ammerlahn; M. E. Goldsby; M. M. Johnson; D. E. Rhodes; A. S. Yoshimura

    2000-12-01

    Large enterprises are ever more dependent on their Large-Scale Information Systems (LSIS), computer systems that are distinguished architecturally by distributed components--data sources, networks, computing engines, simulations, human-in-the-loop control and remote access stations. These systems provide such capabilities as workflow, data fusion and distributed database access. The Nuclear Weapons Complex (NWC) contains many examples of LSIS components, a fact that motivates this research. However, most LSIS in use grew up from collections of separate subsystems that were not designed to be components of an integrated system. For this reason, they are often difficult to analyze and control. The problem is made more difficult by the size of a typical system, its diversity of information sources, and the institutional complexities associated with its geographic distribution across the enterprise. Moreover, there is no integrated approach for analyzing or managing such systems. Indeed, integrated development of LSIS is an active area of academic research. This work developed such an approach by simulating the various components of the LSIS and allowing the simulated components to interact with real LSIS subsystems. This research demonstrated two benefits. First, applying it to a particular LSIS provided a thorough understanding of the interfaces between the system's components. Second, it demonstrated how more rapid and detailed answers could be obtained to questions significant to the enterprise by interacting with the relevant LSIS subsystems through simulated components designed with those questions in mind. In a final, added phase of the project, investigations were made on extending this research to wireless communication networks in support of telemetry applications.

  1. Large-Scale Astrophysical Visualization on Smartphones

    NASA Astrophysics Data System (ADS)

    Becciani, U.; Massimino, P.; Costa, A.; Gheller, C.; Grillo, A.; Krokos, M.; Petta, C.

    2011-07-01

    Nowadays digital sky surveys and long-duration, high-resolution numerical simulations using high performance computing and grid systems produce multidimensional astrophysical datasets on the order of several petabytes. Sharing visualizations of such datasets within communities and collaborating research groups is of paramount importance for disseminating results and advancing astrophysical research. Moreover, educational and public outreach programs can benefit greatly from novel ways of presenting these datasets by promoting understanding of complex astrophysical processes, e.g., the formation of stars and galaxies. We have previously developed VisIVO Server, a grid-enabled platform for high-performance large-scale astrophysical visualization. This article reviews the latest developments on VisIVO Web, a custom designed web portal wrapped around VisIVO Server, then introduces VisIVO Smartphone, a gateway connecting VisIVO Web and data repositories for mobile astrophysical visualization. We discuss current work and summarize future developments.

  2. A Navajo health consumer survey.

    PubMed

    Stewart, T; May, P; Muneta, A

    1980-12-01

    The findings of a health consumer survey of 309 Navajo families in three areas of the Navajo Reservation are reported. The survey shows that access to facilities and lack of safe water and sanitary supplies are continuing problems for these families. The families show consistent use of Indian Health Service providers, particularly nurses, pharmacists and physicians, as well as traditional Navajo medicine practitioners. Only incidental utilization of private medical services is reported. Extended waiting times and translation from English to Navajo are major concerns in their contacts with providers. A surprisingly high availability of third-party insurance is noted. Comparisons are made between this database and selected national and regional surveys, and with family surveys from other groups assumed to be disadvantaged in obtaining health care. The comparisons indicate somewhat lower utilization rates and more problems in access to care for this Navajo sample. The discussion suggests that attitudes regarding free health care may eventually become a factor for Navajo people and other groups, that cultural considerations are often ignored or accepted as truisms in delivering care, and that the Navajo Reservation may serve as a unique microcosm of health care in the U.S. PMID:7464299

  3. Psychological Resilience after Hurricane Sandy: The Influence of Individual- and Community-Level Factors on Mental Health after a Large-Scale Natural Disaster

    PubMed Central

    Lowe, Sarah R.; Sampson, Laura; Gruebner, Oliver; Galea, Sandro

    2015-01-01

    Several individual-level factors are known to promote psychological resilience in the aftermath of disasters. Far less is known about the role of community-level factors in shaping postdisaster mental health. The purpose of this study was to explore the influence of both individual- and community-level factors on resilience after Hurricane Sandy. A representative sample of household residents (N = 418) from 293 New York City census tracts that were most heavily affected by the storm completed telephone interviews approximately 13–16 months postdisaster. Multilevel multivariable models explored the independent and interactive contributions of individual- and community-level factors to posttraumatic stress and depression symptoms. At the individual-level, having experienced or witnessed any lifetime traumatic event was significantly associated with higher depression and posttraumatic stress, whereas demographic characteristics (e.g., older age, non-Hispanic Black race) and more disaster-related stressors were significantly associated with higher posttraumatic stress only. At the community-level, living in an area with higher social capital was significantly associated with higher posttraumatic stress. Additionally, higher community economic development was associated with lower risk of depression only among participants who did not experience any disaster-related stressors. These results provide evidence that individual- and community-level resources and exposure operate in tandem to shape postdisaster resilience. PMID:25962178

  4. Psychological resilience after Hurricane Sandy: the influence of individual- and community-level factors on mental health after a large-scale natural disaster.

    PubMed

    Lowe, Sarah R; Sampson, Laura; Gruebner, Oliver; Galea, Sandro

    2015-01-01

    Several individual-level factors are known to promote psychological resilience in the aftermath of disasters. Far less is known about the role of community-level factors in shaping postdisaster mental health. The purpose of this study was to explore the influence of both individual- and community-level factors on resilience after Hurricane Sandy. A representative sample of household residents (N = 418) from 293 New York City census tracts that were most heavily affected by the storm completed telephone interviews approximately 13-16 months postdisaster. Multilevel multivariable models explored the independent and interactive contributions of individual- and community-level factors to posttraumatic stress and depression symptoms. At the individual-level, having experienced or witnessed any lifetime traumatic event was significantly associated with higher depression and posttraumatic stress, whereas demographic characteristics (e.g., older age, non-Hispanic Black race) and more disaster-related stressors were significantly associated with higher posttraumatic stress only. At the community-level, living in an area with higher social capital was significantly associated with higher posttraumatic stress. Additionally, higher community economic development was associated with lower risk of depression only among participants who did not experience any disaster-related stressors. These results provide evidence that individual- and community-level resources and exposure operate in tandem to shape postdisaster resilience.
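    A hedged sketch of the multilevel structure described above (individuals nested in census tracts, with a tract-level random intercept), fit with statsmodels on invented data; variable names are illustrative and fewer tracts are used than in the study so each group has several members.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(4)
        n, n_tracts = 418, 50
        tract = rng.integers(0, n_tracts, n)
        sc_tract = rng.normal(0, 1, n_tracts)    # tract-level social capital
        u_tract = rng.normal(0, 0.5, n_tracts)   # tract random intercepts
        df = pd.DataFrame({
            "tract": tract,
            "stressors": rng.poisson(2, n),
            "trauma": rng.integers(0, 2, n),
            "social_capital": sc_tract[tract],
        })
        df["pts"] = (0.4 * df["stressors"] + 0.8 * df["trauma"]
                     + 0.3 * df["social_capital"] + u_tract[tract]
                     + rng.normal(0, 1, n))

        model = smf.mixedlm("pts ~ stressors + trauma + social_capital",
                            df, groups=df["tract"])
        print(model.fit().summary())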

  5. Evaluating effective reaction rates of kinetically driven solutes in large-scale, anisotropic media: human health risk implications in CO2 leakage

    NASA Astrophysics Data System (ADS)

    Siirila, E. R.; Maxwell, R. M.

    2011-12-01

    The role of high and low hydraulic conductivity (K) regions in heterogeneous, stratified and non-stratified flow fields, and the subsequent effect of rate-dependent geochemical reactions, are investigated with regard to arsenic mobilized by CO2 leakage at a Carbon Capture and Storage (CCS) site. Following the methodology of previous work, human health risk is used as an endpoint for comparison via a two-stage or nested Monte Carlo scheme, explicitly considering joint uncertainty and variability for a hypothetical population of individuals. This study identifies geo-hydrologic conditions where solute reactions are either rate limited (non-reactive), in equilibrium (the linear equilibrium assumption, LEA, is appropriate), or are sensitive to time-dependent kinetic reaction rates. Potential interplay between multiple parameters (i.e. positive or negative feedbacks) is shown utilizing stochastic ensembles. In particular, the effect of preferential flow pathways and solute mixing at the field scale (macrodispersion) and sub-grid scale (local dispersion) is examined for varying degrees of stratification and regional groundwater velocities. Results show that the effective reaction rates of kinetic ensembles differ from those of LEA ensembles when local dispersion is included, resulting in an additive tailing effect of the solute plume, a retarded peak time, and an increased cancer risk. This discrepancy between kinetic and LEA ensembles is augmented in highly anisotropic media, especially at intermediate regional groundwater velocities. The distribution, magnitude, and associated uncertainty of cancer risk are controlled by these factors, but are also strongly dependent on the regional groundwater velocity. We demonstrate that the higher associated uncertainty of cancer risk in stratified domains is linked to higher aquifer connectivity and less macrodispersion in the flow field. This study has implications for CCS site selection and groundwater-driven risk assessment modeling.
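    The two-stage (nested) Monte Carlo scheme can be sketched as below: an outer loop samples uncertain parameters, an inner loop samples interindividual variability, and a percentile of the inner distribution is tracked across the outer loop. All distributions and constants are placeholders, not the study's inputs.

        import numpy as np

        rng = np.random.default_rng(5)
        N_UNCERT, N_VAR = 200, 1000
        SF = 1.5    # As cancer slope factor, (mg/kg/day)^-1 (assumed)

        p95 = []
        for _ in range(N_UNCERT):                         # uncertainty loop
            conc = rng.lognormal(-4.0, 1.0)               # well As conc., mg/L
            iw = rng.lognormal(0.7, 0.3, N_VAR)           # water intake, L/day
            bw = rng.normal(70, 12, N_VAR).clip(40, 120)  # body weight, kg
            risks = conc * iw / bw * SF                   # variability loop
            p95.append(np.percentile(risks, 95))

        # uncertainty distribution of the 95th-percentile individual risk
        print(np.percentile(p95, [5, 50, 95]))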

  6. Study Protocol for the Fukushima Health Management Survey

    PubMed Central

    Yasumura, Seiji; Hosoya, Mitsuaki; Yamashita, Shunichi; Kamiya, Kenji; Abe, Masafumi; Akashi, Makoto; Kodama, Kazunori; Ozasa, Kotaro

    2012-01-01

    and birth survey. This long-term large-scale epidemiologic study is expected to provide valuable data in the investigation of the health effects of low-dose radiation and disaster-related stress. PMID:22955043

  7. Curvature constraints from large scale structure

    NASA Astrophysics Data System (ADS)

    Di Dio, Enea; Montanari, Francesco; Raccanelli, Alvise; Durrer, Ruth; Kamionkowski, Marc; Lesgourgues, Julien

    2016-06-01

    We modified the CLASS code in order to include relativistic galaxy number counts in spatially curved geometries; we present the formalism and study the effect of relativistic corrections on spatial curvature. The new version of the code is now publicly available. Using a Fisher matrix analysis, we investigate how measurements of the spatial curvature parameter ΩK with future galaxy surveys are affected by relativistic effects, which influence observations of the large scale galaxy distribution. These effects include contributions from cosmic magnification, Doppler terms and terms involving the gravitational potential. As an application, we consider angle and redshift dependent power spectra, which are especially well suited for model independent cosmological constraints. We compute our results for a representative deep, wide and spectroscopic survey, and our results show the impact of relativistic corrections on spatial curvature parameter estimation. We show that constraints on the curvature parameter may be strongly biased if, in particular, cosmic magnification is not included in the analysis. Other relativistic effects turn out to be subdominant in the studied configuration. We analyze how the shift in the estimated best-fit value for the curvature and other cosmological parameters depends on the magnification bias parameter, and find that significant biases are to be expected if this term is not properly considered in the analysis.
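    For a Gaussian likelihood, a forecast of this kind reduces to the Fisher matrix F = J^T C^-1 J, with marginalised 1-sigma errors sqrt((F^-1)_ii); the sketch below uses random stand-in derivatives rather than CLASS output, and the parameter set is assumed.

        import numpy as np

        rng = np.random.default_rng(6)
        n_obs, n_par = 100, 3             # e.g. (Omega_K, h, bias) -- assumed
        J = rng.normal(size=(n_obs, n_par))           # d(observable)/d(theta)
        cov = np.diag(rng.uniform(0.5, 2.0, n_obs))   # observable covariance

        F = J.T @ np.linalg.inv(cov) @ J
        print(np.sqrt(np.diag(np.linalg.inv(F))))     # marginalised errors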

  8. Automating large-scale reactor systems

    SciTech Connect

    Kisner, R.A.

    1985-01-01

    This paper conveys a philosophy for developing automated large-scale control systems that behave in an integrated, intelligent, flexible manner. Methods for operating large-scale systems under varying degrees of equipment degradation are discussed, and a design approach that separates the effort into phases is suggested. 5 refs., 1 fig.

  9. Large-scale clustering of cosmic voids

    NASA Astrophysics Data System (ADS)

    Chan, Kwan Chuen; Hamaus, Nico; Desjacques, Vincent

    2014-11-01

    We study the clustering of voids using N-body simulations and simple theoretical models. The excursion-set formalism describes fairly well the abundance of voids identified with the watershed algorithm, although the void formation threshold required is quite different from the spherical collapse value. The void cross bias b_c is measured and its large-scale value is found to be consistent with the peak-background split results. A simple fitting formula for b_c is found. We model the void auto-power spectrum taking into account the void biasing and exclusion effect. A good fit to the simulation data is obtained for voids with radii ≳ 30 Mpc h^-1, especially when the void biasing model is extended to 1-loop order. However, the best-fit bias parameters do not agree well with the peak-background results. Being able to fit the void auto-power spectrum is particularly important not only because it is the direct observable in galaxy surveys, but also because our method enables us to treat the bias parameters as nuisance parameters, which are sensitive to the techniques used to identify voids.

  10. Young women's reproductive health survey.

    PubMed

    Lewis, H

    1987-08-12

    A survey of reproductive health issues was conducted on 15 year old Hutt Valley secondary school girls by means of a self-administered anonymous questionnaire. The prevalence of sexual intercourse in the sample was 29%. Sixteen percent of the sexually active respondents used no method of contraception. Knowledge of reproductive health facts and contraception was poor both amongst sexually experienced and inexperienced respondents. Twenty-six percent relied on peers for this information, with mothers, teachers and books being other important sources cited. Respondents requested more information on sexually transmitted diseases, contraception and sexual relationships. Most would like this information more readily accessible. Preferred sources of information mentioned were: parents, books, films/videos, family planning clinics and friends.

  11. Large-scale regions of antimatter

    SciTech Connect

    Grobov, A. V. Rubin, S. G.

    2015-07-15

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge during the inflation era.

  12. Large-scale investment in green space as an intervention for physical activity, mental and cardiometabolic health: study protocol for a quasi-experimental evaluation of a natural experiment

    PubMed Central

    Astell-Burt, Thomas; Feng, Xiaoqi; Kolt, Gregory S

    2016-01-01

    Introduction ‘Green spaces’ such as public parks are regarded as determinants of health, but evidence tends to be based on cross-sectional designs. This protocol describes a study that will evaluate a large-scale investment in approximately 5280 hectares of green space stretching 27 km north to south in Western Sydney, Australia. Methods and analysis A Geographic Information System was used to identify 7272 participants in the 45 and Up Study baseline data (2006–2008) living within 5 km of the Western Sydney Parklands and some of the features that have been constructed since 2009, such as public access points, advertising billboards, walking and cycle tracks, BBQ stations, and children's playgrounds. These data were linked to information on a range of health and behavioural outcomes, with the second wave of data collection initiated by the Sax Institute in 2012 and expected to be completed by 2015. Multilevel models will be used to analyse potential change in physical activity, weight status, social contacts, mental and cardiometabolic health within a closed sample of residentially stable participants. Comparisons between persons with contrasting proximities to different areas of the Parklands will provide ‘treatment’ and ‘control’ groups within a ‘quasi-experimental’ study design. In line with expectations, baseline results prior to the enhancement of the Western Sydney Parklands indicated virtually no significant differences in the distribution of any of the outcomes with respect to proximity to green space preintervention. Ethics and dissemination Ethical approval was obtained for the 45 and Up Study from the University of New South Wales Human Research Ethics Committee. Ethics approval for this study was obtained from the University of Western Sydney Ethics Committee. Findings will be disseminated through partner organisations (the Western Sydney Parklands and the National Heart Foundation of Australia), as well as to policymakers in
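
    A minimal sketch of the kind of multilevel model described, using Python's statsmodels; the file and column names here are hypothetical placeholders, and the actual specification belongs to the study protocol.

        import pandas as pd
        import statsmodels.formula.api as smf

        # One row per participant per survey wave (long format); columns assumed.
        df = pd.read_csv("45_and_up_linked.csv")

        # Random intercept per participant; the wave x proximity interaction plays
        # the role of the quasi-experimental "treatment" effect of the investment.
        model = smf.mixedlm(
            "physical_activity ~ wave * near_parklands + age + sex",
            data=df,
            groups=df["participant_id"],
        )
        print(model.fit().summary())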

  13. DESIGN OF LARGE-SCALE AIR MONITORING NETWORKS

    EPA Science Inventory

    The potential effects of air pollution on human health have received much attention in recent years. In the U.S. and other countries, there are extensive large-scale monitoring networks designed to collect data to inform the public of exposure risks to air pollution. A major crit...

  14. Considerations for Managing Large-Scale Clinical Trials.

    ERIC Educational Resources Information Center

    Tuttle, Waneta C.; And Others

    1989-01-01

    Research management strategies used effectively in a large-scale clinical trial to determine the health effects of exposure to Agent Orange in Vietnam are discussed, including pre-project planning, organization according to strategy, attention to scheduling, a team approach, emphasis on guest relations, cross-training of personnel, and preparing…

  15. Large-scale motions in the universe

    SciTech Connect

    Rubin, V.C.; Coyne, G.V.

    1988-01-01

    The present conference on the large-scale motions of the universe discusses topics on the problems of two-dimensional and three-dimensional structures, large-scale velocity fields, the motion of the local group, small-scale microwave fluctuations, ab initio and phenomenological theories, and properties of galaxies at high and low Z. Attention is given to the Pisces-Perseus supercluster, large-scale structure and motion traced by galaxy clusters, distances to galaxies in the field, the origin of the local flow of galaxies, the peculiar velocity field predicted by the distribution of IRAS galaxies, the effects of reionization on microwave background anisotropies, the theoretical implications of cosmological dipoles, and n-body simulations of a universe dominated by cold dark matter.

  16. Large-scale nanophotonic phased array.

    PubMed

    Sun, Jie; Timurdogan, Erman; Yaacobi, Ami; Hosseini, Ehsan Shah; Watts, Michael R

    2013-01-10

    Electromagnetic phased arrays at radio frequencies are well known and have enabled applications ranging from communications to radar, broadcasting and astronomy. The ability to generate arbitrary radiation patterns with large-scale phased arrays has long been pursued. Although it is extremely expensive and cumbersome to deploy large-scale radiofrequency phased arrays, optical phased arrays have a unique advantage in that the much shorter optical wavelength holds promise for large-scale integration. However, the short optical wavelength also imposes stringent requirements on fabrication. As a consequence, although optical phased arrays have been studied with various platforms and recently with chip-scale nanophotonics, all of the demonstrations so far are restricted to one-dimensional or small-scale two-dimensional arrays. Here we report the demonstration of a large-scale two-dimensional nanophotonic phased array (NPA), in which 64 × 64 (4,096) optical nanoantennas are densely integrated on a silicon chip within a footprint of 576 μm × 576 μm with all of the nanoantennas precisely balanced in power and aligned in phase to generate a designed, sophisticated radiation pattern in the far field. We also show that active phase tunability can be realized in the proposed NPA by demonstrating dynamic beam steering and shaping with an 8 × 8 array. This work demonstrates that a robust design, together with state-of-the-art complementary metal-oxide-semiconductor technology, allows large-scale NPAs to be implemented on compact and inexpensive nanophotonic chips. In turn, this enables arbitrary radiation pattern generation using NPAs and therefore extends the functionalities of phased arrays beyond conventional beam focusing and steering, opening up possibilities for large-scale deployment in applications such as communication, laser detection and ranging, three-dimensional holography and biomedical sciences, to name just a few.
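
    The steering principle can be illustrated numerically: in the Fraunhofer approximation the array factor is proportional to the two-dimensional discrete Fourier transform of the element excitations, so a linear phase ramp across the aperture shifts the far-field peak. The sketch below uses illustrative parameters, not the paper's design.

        import numpy as np

        # 64 x 64 array of unit-amplitude emitters with a linear phase ramp.
        N = 64
        n = np.arange(N)
        phase_x, phase_y = 0.3 * np.pi, 0.0  # per-element phase increments (rad)
        excitation = np.exp(1j * (phase_x * n[:, None] + phase_y * n[None, :]))

        # Far-field (array factor) ~ 2D DFT of the aperture excitation;
        # zero-padding to 512 x 512 refines the angular sampling.
        far_field = np.fft.fftshift(np.fft.fft2(excitation, s=(512, 512)))
        intensity = np.abs(far_field) ** 2
        print("beam peak at FFT bin:", np.unravel_index(intensity.argmax(), intensity.shape))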

  17. Large Scale Shape Optimization for Accelerator Cavities

    SciTech Connect

    Akcelik, Volkan; Lee, Lie-Quan; Li, Zenghai; Ng, Cho; Xiao, Li-Ling; Ko, Kwok; /SLAC

    2011-12-06

    We present a shape optimization method for designing accelerator cavities with large scale computations. The objective is to find the best accelerator cavity shape with the desired spectral response, such as specified frequencies of resonant modes, field profiles, and external Q values. The forward problem is the large scale Maxwell equation in the frequency domain. The design parameters are the CAD parameters defining the cavity shape. We develop scalable algorithms with a discrete adjoint approach and use the quasi-Newton method to solve the nonlinear optimization problem. Two realistic accelerator cavity design examples are presented.

  18. Sensitivity analysis for large-scale problems

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Whitworth, Sandra L.

    1987-01-01

    The development of efficient techniques for calculating sensitivity derivatives is studied. The objective is to present a computational procedure for calculating sensitivity derivatives as part of performing structural reanalysis for large-scale problems. The scope is limited to framed type structures. Both linear static analysis and free-vibration eigenvalue problems are considered.

  19. ARPACK: Solving large scale eigenvalue problems

    NASA Astrophysics Data System (ADS)

    Lehoucq, Rich; Maschhoff, Kristi; Sorensen, Danny; Yang, Chao

    2013-11-01

    ARPACK is a collection of Fortran77 subroutines designed to solve large scale eigenvalue problems. The package is designed to compute a few eigenvalues and corresponding eigenvectors of a general n by n matrix A. It is most appropriate for large sparse or structured matrices A, where structured means that a matrix-vector product w ← Av requires order n rather than the usual order n² floating-point operations.
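
    A minimal usage sketch through SciPy's front end to ARPACK (the tridiagonal Laplacian below is only an illustrative structured matrix):

        import scipy.sparse as sp
        from scipy.sparse.linalg import eigsh  # symmetric ARPACK driver

        # Sparse 1D Laplacian: its matrix-vector product costs O(n), not O(n^2).
        n = 100_000
        A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")

        # Six eigenvalues nearest zero, found via shift-invert mode.
        vals, vecs = eigsh(A, k=6, sigma=0.0)
        print(vals)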

  20. A Large Scale Computer Terminal Output Controller.

    ERIC Educational Resources Information Center

    Tucker, Paul Thomas

    This paper describes the design and implementation of a large scale computer terminal output controller which supervises the transfer of information from a Control Data 6400 Computer to a PLATO IV data network. It discusses the cost considerations leading to the selection of educational television channels rather than telephone lines for…

  1. Management of large-scale technology

    NASA Technical Reports Server (NTRS)

    Levine, A.

    1985-01-01

    Two major themes are addressed in this assessment of the management of large-scale NASA programs: (1) how a high technology agency fared through a decade marked by a rapid expansion of funds and manpower in the first half and an almost as rapid contraction in the second; and (2) how NASA combined central planning and control with decentralized project execution.

  2. Evaluating Large-Scale Interactive Radio Programmes

    ERIC Educational Resources Information Center

    Potter, Charles; Naidoo, Gordon

    2009-01-01

    This article focuses on the challenges involved in conducting evaluations of interactive radio programmes in South Africa with large numbers of schools, teachers, and learners. It focuses on the role such large-scale evaluation has played during the South African radio learning programme's development stage, as well as during its subsequent…

  3. Estimating health expenditure shares from household surveys

    PubMed Central

    Brooks, Benjamin PC; Hanlon, Michael

    2013-01-01

    Abstract Objective To quantify the effects of household expenditure survey characteristics on the estimated share of a household’s expenditure devoted to health. Methods A search was conducted for all country surveys reporting data on health expenditure and total household expenditure. Data on total expenditure and health expenditure were extracted from the surveys to generate the health expenditure share (i.e. fraction of the household expenditure devoted to health). To do this the authors relied on survey microdata or survey reports to calculate the health expenditure share for the particular instrument involved. Health expenditure share was modelled as a function of the survey’s recall period, the number of health expenditure items, the number of total expenditure items, the data collection method and the placement of the health module within the survey. Data exists across space and time, so fixed effects for territory and year were included as well. The model was estimated by means of ordinary least squares regression with clustered standard errors. Findings A one-unit increase in the number of health expenditure questions was accompanied by a 1% increase in the estimated health expenditure share. A one-unit increase in the number of non-health expenditure questions resulted in a 0.2% decrease in the estimated share. Increasing the recall period by one month was accompanied by a 6% decrease in the health expenditure share. Conclusion The characteristics of a survey instrument examined in the study affect the estimate of the health expenditure share. Those characteristics need to be accounted for when comparing results across surveys within a territory and, ultimately, across territories. PMID:23825879
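
    A sketch of the described specification in Python; the file and column names are hypothetical, but the structure (OLS with territory and year fixed effects, and standard errors clustered by territory) follows the abstract.

        import pandas as pd
        import statsmodels.formula.api as smf

        # One row per household expenditure survey; columns assumed for illustration.
        df = pd.read_csv("expenditure_surveys.csv")

        model = smf.ols(
            "health_share ~ recall_months + n_health_items + n_total_items"
            " + collection_method + module_placement + C(territory) + C(year)",
            data=df,
        )
        result = model.fit(cov_type="cluster", cov_kwds={"groups": df["territory"]})
        print(result.summary())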

  4. ASHA Survey of Health Curriculum Needs: Survey Results.

    ERIC Educational Resources Information Center

    Schneider, Livingston S.; Thier, Herbert D.

    The results of a survey conducted by the Ad hoc Committee to Study the Needs and Problems of the Classroom Teacher in Curriculum Development are reported. Questionnaires were sent to members of the American School Health Association (ASHA). The survey was composed of four sections: (1) background information on demographic data, institutional…

  5. Large-scale Advanced Propfan (LAP) program

    NASA Technical Reports Server (NTRS)

    Sagerser, D. A.; Ludemann, S. G.

    1985-01-01

    The propfan is an advanced propeller concept which maintains the high efficiencies traditionally associated with conventional propellers at the higher aircraft cruise speeds associated with jet transports. The large-scale advanced propfan (LAP) program extends the research done on 2 ft diameter propfan models to a 9 ft diameter article. The program includes design, fabrication, and testing of both an eight bladed, 9 ft diameter propfan, designated SR-7L, and a 2 ft diameter aeroelastically scaled model, SR-7A. The LAP program is complemented by the propfan test assessment (PTA) program, which takes the large-scale propfan and mates it with a gas generator and gearbox to form a propfan propulsion system and then flight tests this system on the wing of a Gulfstream 2 testbed aircraft.

  6. Fractals and cosmological large-scale structure

    NASA Technical Reports Server (NTRS)

    Luo, Xiaochun; Schramm, David N.

    1992-01-01

    Observations of galaxy-galaxy and cluster-cluster correlations as well as other large-scale structure can be fit with a 'limited' fractal with dimension D of about 1.2. This is not a 'pure' fractal out to the horizon: the distribution shifts from power law to random behavior at some large scale. If the observed patterns and structures are formed through an aggregation growth process, the fractal dimension D can serve as an interesting constraint on the properties of the stochastic motion responsible for limiting the fractal structure. In particular, it is found that the observed fractal should have grown from two-dimensional sheetlike objects such as pancakes, domain walls, or string wakes. This result is generic and does not depend on the details of the growth process.
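
    A dimension of the kind quoted can be estimated with a box-counting sketch like the one below; the random-walk test data are purely illustrative, not galaxy positions.

        import numpy as np

        def box_counting_dimension(points, scales):
            """Slope of log N(s) versus log(1/s), where N(s) is the number of
            occupied boxes of side s covering the point set."""
            counts = [len(np.unique(np.floor(points / s), axis=0)) for s in scales]
            slope, _ = np.polyfit(np.log(1.0 / scales), np.log(counts), 1)
            return slope

        rng = np.random.default_rng(0)
        walk = np.cumsum(rng.standard_normal((100_000, 3)), axis=0)  # mock data
        print("estimated D:", box_counting_dimension(walk, np.logspace(0.5, 2.5, 10)))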

  7. Large-scale instabilities of helical flows

    NASA Astrophysics Data System (ADS)

    Cameron, Alexandre; Alexakis, Alexandros; Brachet, Marc-Étienne

    2016-10-01

    Large-scale hydrodynamic instabilities of periodic helical flows of a given wave number K are investigated using three-dimensional Floquet numerical computations. In the Floquet formalism the unstable field is expanded in modes of different spatial periodicity. This allows us (i) to clearly distinguish large- from small-scale instabilities and (ii) to study modes of wave number q of arbitrarily large-scale separation q ≪ K. Different flows are examined, including flows that exhibit small-scale turbulence. The growth rate σ of the most unstable mode is measured as a function of the scale separation q/K ≪ 1 and the Reynolds number Re. It is shown that the growth rate follows the scaling σ ∝ q if an AKA effect [Frisch et al., Physica D: Nonlinear Phenomena 28, 382 (1987), 10.1016/0167-2789(87)90026-1] is present, or a negative eddy viscosity scaling σ ∝ q² in its absence. This holds both for the Re ≪ 1 regime, where previously derived asymptotic results are verified, and for Re = O(1), which is beyond their range of validity. Furthermore, for values of Re above a critical value Re_c beyond which small-scale instabilities are present, the growth rate becomes independent of q and the energy of the perturbation at large scales decreases with scale separation. The behavior of these large-scale instabilities is also examined in the nonlinear regime, where the largest scales of the system are found to be the most dominant energetically. These results are interpreted by low-order models.
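
    The two scalings can be discriminated numerically by fitting the exponent of σ(q) in log-log space; in the sketch below the growth rates are made-up values consistent with σ ∝ q².

        import numpy as np

        def growth_rate_exponent(q, sigma):
            """Fit sigma ~ q**alpha; alpha ~ 1 signals an AKA effect,
            alpha ~ 2 a negative-eddy-viscosity scaling."""
            alpha, _ = np.polyfit(np.log(q), np.log(sigma), 1)
            return alpha

        q = np.array([0.05, 0.1, 0.2, 0.4])                 # scale separations q/K
        sigma = np.array([2.6e-3, 1.0e-2, 4.1e-2, 1.6e-1])  # illustrative rates
        print("fitted exponent:", growth_rate_exponent(q, sigma))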

  8. Large-scale fibre-array multiplexing

    SciTech Connect

    Cheremiskin, I V; Chekhlova, T K

    2001-05-31

    The possibility of creating a fibre multiplexer/demultiplexer with large-scale multiplexing without any basic restrictions on the number of channels and the spectral spacing between them is shown. The operating capacity of a fibre multiplexer based on a four-fibre array ensuring a spectral spacing of 0.7 pm (≈ 10 GHz) between channels is demonstrated. (laser applications and other topics in quantum electronics)

  9. The Consortium on Health and Ageing: Network of Cohorts in Europe and the United States (CHANCES) project--design, population and data harmonization of a large-scale, international study.

    PubMed

    Boffetta, Paolo; Bobak, Martin; Borsch-Supan, Axel; Brenner, Hermann; Eriksson, Sture; Grodstein, Fran; Jansen, Eugene; Jenab, Mazda; Juerges, Hendrik; Kampman, Ellen; Kee, Frank; Kuulasmaa, Kari; Park, Yikyung; Tjonneland, Anne; van Duijn, Cornelia; Wilsgaard, Tom; Wolk, Alicja; Trichopoulos, Dimitrios; Bamia, Christina; Trichopoulou, Antonia

    2014-12-01

    There is a public health demand to prevent health conditions which lead to increased morbidity and mortality among the rapidly-increasing elderly population. Data for the incidence of such conditions exist in cohort studies worldwide, which, however, differ in various aspects. The Consortium on Health and Ageing: Network of Cohorts in Europe and the United States (CHANCES) project aims at harmonizing data from existing major longitudinal studies for the elderly whilst focussing on cardiovascular diseases, diabetes mellitus, cancer, fractures and cognitive impairment in order to estimate their prevalence, incidence and cause-specific mortality, and identify lifestyle, socioeconomic, and genetic determinants and biomarkers for the incidence of and mortality from these conditions. A survey instrument assessing ageing-related conditions of the elderly will be also developed. Fourteen cohort studies participate in CHANCES with 683,228 elderly (and 150,210 deaths), from 23 European and three non-European countries. So far, 287 variables on health conditions and a variety of exposures, including biomarkers and genetic data have been harmonized. Different research hypotheses are investigated with meta-analyses. The results which will be produced can help international organizations, governments and policy-makers to better understand the broader implications and consequences of ageing and thus make informed decisions.

  10. Development of the adult and child complementary medicine questionnaires fielded on the National Health Interview Survey.

    PubMed

    Stussman, Barbara J; Bethell, Christina D; Gray, Caroline; Nahin, Richard L

    2013-11-23

    The 2002, 2007, and 2012 complementary medicine questionnaires fielded on the National Health Interview Survey provide the most comprehensive data on complementary medicine available for the United States. They filled the void for large-scale, nationally representative, publicly available datasets on the out-of-pocket costs, prevalence, and reasons for use of complementary medicine in the U.S. Despite their wide use, this is the first article describing the multi-faceted and largely qualitative processes undertaken to develop the surveys. We hope this in-depth description enables policy makers and researchers to better judge the content validity and utility of the questionnaires and their resultant publications.

  11. Large-scale neuromorphic computing systems

    NASA Astrophysics Data System (ADS)

    Furber, Steve

    2016-10-01

    Neuromorphic computing covers a diverse range of approaches to information processing all of which demonstrate some degree of neurobiological inspiration that differentiates them from mainstream conventional computing systems. The philosophy behind neuromorphic computing has its origins in the seminal work carried out by Carver Mead at Caltech in the late 1980s. This early work influenced others to carry developments forward, and advances in VLSI technology supported steady growth in the scale and capability of neuromorphic devices. Recently, a number of large-scale neuromorphic projects have emerged, taking the approach to unprecedented scales and capabilities. These large-scale projects are associated with major new funding initiatives for brain-related research, creating a sense that the time and circumstances are right for progress in our understanding of information processing in the brain. In this review we present a brief history of neuromorphic engineering then focus on some of the principal current large-scale projects, their main features, how their approaches are complementary and distinct, their advantages and drawbacks, and highlight the sorts of capabilities that each can deliver to neural modellers.

  12. Large-scale neuromorphic computing systems.

    PubMed

    Furber, Steve

    2016-10-01

    Neuromorphic computing covers a diverse range of approaches to information processing all of which demonstrate some degree of neurobiological inspiration that differentiates them from mainstream conventional computing systems. The philosophy behind neuromorphic computing has its origins in the seminal work carried out by Carver Mead at Caltech in the late 1980s. This early work influenced others to carry developments forward, and advances in VLSI technology supported steady growth in the scale and capability of neuromorphic devices. Recently, a number of large-scale neuromorphic projects have emerged, taking the approach to unprecedented scales and capabilities. These large-scale projects are associated with major new funding initiatives for brain-related research, creating a sense that the time and circumstances are right for progress in our understanding of information processing in the brain. In this review we present a brief history of neuromorphic engineering then focus on some of the principal current large-scale projects, their main features, how their approaches are complementary and distinct, their advantages and drawbacks, and highlight the sorts of capabilities that each can deliver to neural modellers. PMID:27529195

  13. Large-Scale Visual Data Analysis

    NASA Astrophysics Data System (ADS)

    Johnson, Chris

    2014-04-01

    Modern high performance computers have speeds measured in petaflops and handle data set sizes measured in terabytes and petabytes. Although these machines offer enormous potential for solving very large-scale realistic computational problems, their effectiveness will hinge upon the ability of human experts to interact with their simulation results and extract useful information. One of the greatest scientific challenges of the 21st century is to effectively understand and make use of the vast amount of information being produced. Visual data analysis will be among our most important tools in helping to understand such large-scale information. Our research at the Scientific Computing and Imaging (SCI) Institute at the University of Utah has focused on innovative, scalable techniques for large-scale 3D visual data analysis. In this talk, I will present state-of-the-art visualization techniques, including scalable visualization algorithms and software, cluster-based visualization methods and innovative visualization techniques applied to problems in computational science, engineering, and medicine. I will conclude with an outline of future high-performance visualization research challenges and opportunities.

  14. Large scale processes in the solar nebula.

    NASA Astrophysics Data System (ADS)

    Boss, A. P.

    Most proposed chondrule formation mechanisms involve processes occurring inside the solar nebula, so the large scale (roughly 1 to 10 AU) structure of the nebula is of general interest for any chondrule-forming mechanism. Chondrules and Ca, Al-rich inclusions (CAIs) might also have been formed as a direct result of the large scale structure of the nebula, such as passage of material through high temperature regions. While recent nebula models do predict the existence of relatively hot regions, the maximum temperatures in the inner planet region may not be high enough to account for chondrule or CAI thermal processing, unless the disk mass is considerably greater than the minimum mass necessary to restore the planets to solar composition. Furthermore, it does not seem to be possible to achieve both rapid heating and rapid cooling of grain assemblages in such a large scale furnace. However, if the accretion flow onto the nebula surface is clumpy, as suggested by observations of variability in young stars, then clump-disk impacts might be energetic enough to launch shock waves which could propagate through the nebula to the midplane, thermally processing any grain aggregates they encounter, and leaving behind a trail of chondrules.

  15. Planned NLM/AHCPR large-scale vocabulary test: using UMLS technology to determine the extent to which controlled vocabularies cover terminology needed for health care and public health.

    PubMed Central

    Humphreys, B L; Hole, W T; McCray, A T; Fitzmaurice, J M

    1996-01-01

    The National Library of Medicine (NLM) and the Agency for Health Care Policy and Research (AHCPR) are sponsoring a test to determine the extent to which a combination of existing health-related terminologies covers vocabulary needed in health information systems. The test vocabularies are the 30 that are fully or partially represented in the 1996 edition of the Unified Medical Language System (UMLS) Metathesaurus, plus three planned additions: the portions of SNOMED International not in the 1996 Metathesaurus, the Read Clinical Classification, and the Logical Observation Identifiers, Names, and Codes (LOINC) system. These vocabularies are available to testers through a special interface to the Internet-based UMLS Knowledge Source Server. The test will determine the ability of the test vocabularies to serve as a source of controlled vocabulary for health data systems and applications. It should provide the basis for realistic resource estimates for developing and maintaining a comprehensive "standard" health vocabulary that is based on existing terminologies. PMID:8816351

  16. The 2013 Canadian Forces Mental Health Survey

    PubMed Central

    Bennett, Rachel E.; Boulos, David; Garber, Bryan G.; Jetly, Rakesh; Sareen, Jitender

    2016-01-01

    Objective: The 2013 Canadian Forces Mental Health Survey (CFMHS) collected detailed information on mental health problems, their impacts, occupational and nonoccupational determinants of mental health, and the use of mental health services from a random sample of 8200 serving personnel. The objective of this article is to provide a firm scientific foundation for understanding and interpreting the CFMHS findings. Methods: This narrative review first provides a snapshot of the Canadian Armed Forces (CAF), focusing on 2 key determinants of mental health: the deployment of more than 40,000 personnel in support of the mission in Afghanistan and the extensive renewal of the CAF mental health system. The findings of recent population-based CAF mental health research are reviewed, with a focus on findings from the very similar mental health survey done in 2002. Finally, key aspects of the methods of the 2013 CFMHS are presented. Results: The findings of 20 peer-reviewed publications using the 2002 mental health survey data are reviewed, along with those of 25 publications from other major CAF mental health research projects executed over the past decade. Conclusions: More than a decade of population-based mental health research in the CAF has provided a detailed picture of its mental health and use of mental health services. This knowledge base and the homology of the 2013 survey with the 2002 CAF survey and general population surveys in 2002 and 2012 will provide an unusual opportunity to use the CFMHS to situate mental health in the CAF in a historical and societal perspective. PMID:27270738

  17. Post-disaster mental health need assessment surveys - the challenge of improved future research.

    PubMed

    Kessler, Ronald C; Wittchen, Hans-Ulrich

    2008-12-01

    Disasters are very common occurrences, becoming increasingly prevalent throughout the world. The number of natural disasters either affecting more than 100 people or resulting in a call for international assistance increased from roughly 100 per year worldwide in the late 1960s to over 500 per year in the past decade. Population growth, environmental degradation, and global warming all play parts in accounting for these increases. There is also the possibility of a pandemic. This paper and the associated journal issue focus on a topic of growing worldwide importance: mental health needs assessment in the wake of large-scale disasters. Although natural and human-made disasters are known to have substantial effects on the mental health of the people who experience them, research shows that the prevalence of post-disaster psychopathology varies enormously from one disaster to another in ways that are difficult to predict merely by knowing the objective circumstances of the disaster. Mental health needs assessment surveys are consequently carried out after many large-scale natural and human-made disasters to provide information for service planners on the nature and magnitude of need for services. These surveys vary greatly, though, in the rigor with which they assess disaster-related stressors and post-disaster mental illness. Synthesis of findings across surveys is hampered by these inconsistencies. The typically limited focus of these surveys with regard to the inclusion of risk factors, follow-up assessments, and evaluations of treatment also limits insights from these surveys concerning post-disaster mental illness and treatment response. The papers in this issue discuss methodological issues in the design and implementation of post-disaster mental health needs assessment surveys aimed at improving on the quality of previous such surveys. The many recommendations in these papers will hopefully help to foster improvements in the next generation of post-disaster mental health needs assessment surveys.

  18. Multiresolution comparison of precipitation datasets for large-scale models

    NASA Astrophysics Data System (ADS)

    Chun, K. P.; Sapriza Azuri, G.; Davison, B.; DeBeer, C. M.; Wheater, H. S.

    2014-12-01

    Gridded precipitation datasets are crucial for driving the large-scale models used in weather forecasting and climate research. However, the quality of precipitation products is usually validated individually; comparisons between gridded precipitation products along with ground observations provide another avenue for investigating how precipitation uncertainty affects the performance of large-scale models. In this study, using data from a set of precipitation gauges over British Columbia and Alberta, we evaluate several widely used North American gridded products, including the Canadian Gridded Precipitation Anomalies (CANGRD), the National Center for Environmental Prediction (NCEP) reanalysis, the Water and Global Change (WATCH) project, the thin-plate spline smoothing algorithms (ANUSPLIN) and the Canadian Precipitation Analysis (CaPA). Based on verification criteria for various temporal and spatial scales, the results provide an assessment of possible applications for the various precipitation datasets. For long-term climate variation studies (~100 years), CANGRD, NCEP, WATCH and ANUSPLIN have different comparative advantages in terms of their resolution and accuracy. For synoptic and mesoscale precipitation patterns, CaPA offers appealing spatial coherence. In addition to the product comparison, various downscaling methods are also surveyed to explore new verification and bias-reduction methods for improving gridded precipitation outputs for large-scale models.
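
    A minimal sketch of the gauge-versus-grid verification this kind of comparison rests on; the data values below are illustrative, not CaPA output.

        import numpy as np

        def verification_scores(product, gauges):
            """Bias, RMSE and correlation of a gridded product against gauges."""
            product, gauges = np.asarray(product), np.asarray(gauges)
            return {
                "bias": np.mean(product - gauges),
                "rmse": np.sqrt(np.mean((product - gauges) ** 2)),
                "corr": np.corrcoef(product, gauges)[0, 1],
            }

        gauges  = np.array([42.0, 110.5, 8.2, 65.3, 90.1])  # monthly totals (mm)
        gridded = np.array([45.1, 104.0, 10.5, 61.7, 95.8])
        print(verification_scores(gridded, gauges))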

  1. The Cardiff health survey: teaching survey methodology by participation.

    PubMed

    Lewis, P A; Charny, M

    1987-01-01

    Medical students were taught survey methodology by participating in all phases of a large community survey. The survey examined health beliefs, knowledge and behaviour in a sample of 5150 people drawn from the electoral register of the City of Cardiff. The study achieved several educational objectives for the medical students: they met well people in their own homes and had an opportunity to get to know a community; by taking part in a study from the initial phases to the conclusion they could appreciate the context of the theoretical teaching they were being given concurrently in their undergraduate course; they learnt to analyse raw data and produce reports; and they gained insights into the health knowledge, behaviour, attitudes and beliefs of a population. In addition, the survey produced a substantial quantity of valuable data which staff and students are analysing and intend to publish. PMID:3423507

  2. Multidisciplinary eHealth Survey Evaluation Methods

    ERIC Educational Resources Information Center

    Karras, Bryant T.; Tufano, James T.

    2006-01-01

    This paper describes the development process of an evaluation framework for describing and comparing web survey tools. We believe that this approach will help shape the design, development, deployment, and evaluation of population-based health interventions. A conceptual framework for describing and evaluating web survey systems will enable the…

  3. Multitree Algorithms for Large-Scale Astrostatistics

    NASA Astrophysics Data System (ADS)

    March, William B.; Ozakin, Arkadas; Lee, Dongryeol; Riegel, Ryan; Gray, Alexander G.

    2012-03-01

    this number every week, resulting in billions of objects. At such scales, even linear-time analysis operations present challenges, particularly since statistical analyses are inherently interactive processes, requiring that computations complete within some reasonable human attention span. The quadratic (or worse) runtimes of straightforward implementations become quickly unbearable. Examples of applications. These analysis subroutines occur ubiquitously in astrostatistical work. We list just a few examples. The need to cross-match objects across different catalogs has led to various algorithms, which at some point perform an AllNN computation. 2-point and higher-order spatial correlations form the basis of spatial statistics, and are utilized in astronomy to compare the spatial structures of two datasets, such as an observed sample and a theoretical sample, forming the basis for two-sample hypothesis testing. Friends-of-friends clustering is often used to identify halos in data from astrophysical simulations. Minimum spanning tree properties have also been proposed as statistics of large-scale structure. Comparison of the distributions of different kinds of objects requires accurate density estimation, for which KDE is the overall statistical method of choice. The prediction of redshifts from optical data requires accurate regression, for which kernel regression is a powerful method. The identification of objects of various types in astronomy, such as stars versus galaxies, requires accurate classification, for which KDA is a powerful method. Overview. In this chapter, we will briefly sketch the main ideas behind recent fast algorithms which achieve, for example, linear runtimes for pairwise-distance problems, or similarly dramatic reductions in computational growth. In some cases, the runtime orders for these algorithms are mathematically provable statements, while in others we have only conjectures backed by experimental observations for the time being
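
    As one concrete example of the tree-based accelerations discussed, scikit-learn's KernelDensity evaluates a Gaussian KDE through KD-tree or ball-tree traversal instead of the naive O(N²) pairwise sum; the mock catalog below is illustrative.

        import numpy as np
        from sklearn.neighbors import KernelDensity  # tree-accelerated KDE

        rng = np.random.default_rng(1)
        catalog = rng.standard_normal((50_000, 2))  # mock sky positions

        kde = KernelDensity(kernel="gaussian", bandwidth=0.2).fit(catalog)
        print(np.exp(kde.score_samples(catalog[:10])))  # densities at 10 objects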

  4. Large-scale brightenings associated with flares

    NASA Technical Reports Server (NTRS)

    Mandrini, Cristina H.; Machado, Marcos E.

    1992-01-01

    It is shown that large-scale brightenings (LSBs) associated with solar flares, similar to the 'giant arches' discovered by Svestka et al. (1982) in images obtained by the SMM HXIS hours after the onset of two-ribbon flares, can also occur in association with confined flares in complex active regions. For these events, the link between the LSB and the underlying flare is clearly evident from the active-region magnetic field topology. The implications of these findings are discussed within the framework of the interacting loops of flares and the giant arch phenomenology.

  5. Large scale phononic metamaterials for seismic isolation

    SciTech Connect

    Aravantinos-Zafiris, N.; Sigalas, M. M.

    2015-08-14

    In this work, we numerically examine structures that could be characterized as large-scale phononic metamaterials. These novel structures could have band gaps in the frequency spectrum of seismic waves when their dimensions are chosen appropriately, suggesting that they could be serious candidates for seismic isolation structures. Different, easy-to-fabricate structures made from construction materials such as concrete and steel were examined. The well-known finite-difference time-domain method is used to calculate the band structures of the proposed metamaterials.
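
    The band-gap idea can be illustrated with a one-dimensional transfer-matrix (Rytov) dispersion relation rather than the FDTD calculation the authors perform; the layer thicknesses and material constants below are nominal assumptions, deliberately large so that the first gap falls near seismic frequencies.

        import numpy as np

        # Periodic concrete/steel stack; a frequency lies in a band gap when
        # |cos(q * Lambda)| as given by the Rytov relation exceeds 1.
        rho1, c1, d1 = 2400.0, 3500.0, 40.0   # concrete: kg/m^3, m/s, m
        rho2, c2, d2 = 7850.0, 5900.0, 10.0   # steel
        Z1, Z2 = rho1 * c1, rho2 * c2         # acoustic impedances

        f = np.linspace(0.1, 100.0, 5000)     # Hz
        w = 2 * np.pi * f
        k1, k2 = w / c1, w / c2
        rhs = (np.cos(k1 * d1) * np.cos(k2 * d2)
               - 0.5 * (Z1 / Z2 + Z2 / Z1) * np.sin(k1 * d1) * np.sin(k2 * d2))

        print("band-gap fraction of 0.1-100 Hz:", np.mean(np.abs(rhs) > 1.0))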

  6. Large-scale dynamics and global warming

    SciTech Connect

    Held, I.M. )

    1993-02-01

    Predictions of future climate change raise a variety of issues in large-scale atmospheric and oceanic dynamics. Several of these are reviewed in this essay, including the sensitivity of the circulation of the Atlantic Ocean to increasing freshwater input at high latitudes; the possibility of greenhouse cooling in the southern oceans; the sensitivity of monsoonal circulations to differential warming of the two hemispheres; the response of midlatitude storms to changing temperature gradients and increasing water vapor in the atmosphere; and the possible importance of positive feedback between the mean winds and eddy-induced heating in the polar stratosphere.

  7. Neutrinos and large-scale structure

    SciTech Connect

    Eisenstein, Daniel J.

    2015-07-15

    I review the use of cosmological large-scale structure to measure properties of neutrinos and other relic populations of light relativistic particles. With experiments to measure the anisotropies of the cosmic microwave background and the clustering of matter at low redshift, we now have securely measured a relativistic background with density appropriate to the cosmic neutrino background. Our limits on the mass of the neutrino continue to shrink. Experiments coming in the next decade will greatly improve the available precision on searches for the energy density of novel relativistic backgrounds and the mass of neutrinos.

  8. Experimental Simulations of Large-Scale Collisions

    NASA Technical Reports Server (NTRS)

    Housen, Kevin R.

    2002-01-01

    This report summarizes research on the effects of target porosity on the mechanics of impact cratering. Impact experiments conducted on a centrifuge provide direct simulations of large-scale cratering on porous asteroids. The experiments show that large craters in porous materials form mostly by compaction, with essentially no deposition of material into the ejecta blanket that is a signature of cratering in less-porous materials. The ratio of ejecta mass to crater mass is shown to decrease with increasing crater size or target porosity. These results are consistent with the observation that large closely-packed craters on asteroid Mathilde appear to have formed without degradation to earlier craters.

  9. Large-Scale PV Integration Study

    SciTech Connect

    Lu, Shuai; Etingov, Pavel V.; Diao, Ruisheng; Ma, Jian; Samaan, Nader A.; Makarov, Yuri V.; Guo, Xinxin; Hafen, Ryan P.; Jin, Chunlian; Kirkham, Harold; Shlatz, Eugene; Frantzis, Lisa; McClive, Timothy; Karlson, Gregory; Acharya, Dhruv; Ellis, Abraham; Stein, Joshua; Hansen, Clifford; Chadliev, Vladimir; Smart, Michael; Salgo, Richard; Sorensen, Rahn; Allen, Barbara; Idelchik, Boris

    2011-07-29

    This research effort evaluates the impact of large-scale photovoltaic (PV) and distributed generation (DG) output on NV Energy’s electric grid system in southern Nevada. It analyzes the ability of NV Energy’s generation to accommodate increasing amounts of utility-scale PV and DG, and the resulting cost of integrating variable renewable resources. The study was jointly funded by the United States Department of Energy and NV Energy, and conducted by a project team comprised of industry experts and research scientists from Navigant Consulting Inc., Sandia National Laboratories, Pacific Northwest National Laboratory and NV Energy.

  10. Measuring Cosmic Expansion and Large Scale Structure with Destiny

    NASA Technical Reports Server (NTRS)

    Benford, Dominic J.; Lauer, Tod R.

    2007-01-01

    Destiny is a simple, direct, low cost mission to determine the properties of dark energy by obtaining a cosmologically deep supernova (SN) type Ia Hubble diagram and by measuring the large-scale mass power spectrum over time. Its science instrument is a 1.65m space telescope, featuring a near-infrared survey camera/spectrometer with a large field of view. During its first two years, Destiny will detect, observe, and characterize 23000 SN Ia events over the redshift interval 0.4<z<1.7, and will subsequently conduct a survey covering >1000 square degrees to measure the large-scale mass power spectrum. The combination of surveys is much more powerful than either technique on its own, and will have over an order of magnitude greater sensitivity than will be provided by ongoing ground-based projects.

  11. Large-scale data mining pilot project in human genome

    SciTech Connect

    Musick, R.; Fidelis, R.; Slezak, T.

    1997-05-01

    This whitepaper briefly describes a new, aggressive effort in large-scale data mining at Livermore National Labs. The implications of 'large-scale' will be clarified in a later section. In the short term, this effort will focus on several mission-critical questions of the Genome project. We will adapt current data mining techniques to the Genome domain, quantify the accuracy of inference results, and lay the groundwork for a more extensive effort in large-scale data mining. A major aspect of the approach is that it will be coupled with a fully-staffed data warehousing effort in the human Genome area. The long-term goal is a strong applications-oriented research program in large-scale data mining. The tools and skill set gained will be directly applicable to a wide spectrum of tasks involving large spatial and multidimensional data. This includes applications in ensuring non-proliferation, stockpile stewardship, enabling Global Ecology (Materials Databases, Industrial Ecology), advancing the Biosciences (Human Genome Project), and supporting data needs for others (Battlefield Management, Health Care).

  12. Quantifying expert consensus against the existence of a secret, large-scale atmospheric spraying program

    NASA Astrophysics Data System (ADS)

    Shearer, Christine; West, Mick; Caldeira, Ken; Davis, Steven J.

    2016-08-01

    Nearly 17% of people in an international survey said they believed the existence of a secret large-scale atmospheric program (SLAP) to be true or partly true. SLAP is commonly referred to as ‘chemtrails’ or ‘covert geoengineering’, and has led to a number of websites purporting to show evidence of widespread chemical spraying linked to negative impacts on human health and the environment. To address these claims, we surveyed two groups of experts—atmospheric chemists with expertise in condensation trails and geochemists working on atmospheric deposition of dust and pollution—to scientifically evaluate for the first time the claims of SLAP theorists. Results show that 76 of the 77 scientists (98.7%) that took part in this study said they had not encountered evidence of a SLAP, and that the data cited as evidence could be explained through other factors, including well-understood physics and chemistry associated with aircraft contrails and atmospheric aerosols. Our goal is not to sway those already convinced that there is a secret, large-scale spraying program—who often reject counter-evidence as further proof of their theories—but rather to establish a source of objective science that can inform public discourse.

  13. Health Occupations Education. Survey of Critical Issues.

    ERIC Educational Resources Information Center

    American Vocational Association, Washington, DC. Health Occupations Education Div.

    A survey of the members of the American Vocational Association-Health Occupations Education (AVA-HOE) was conducted to identify critical issues concerning health occupations, establish the order of priority of these issues, and determine a position regarding each issue that was reflective of the opinion of the AVA-HOE members. Each member of the…

  14. Engineering management of large scale systems

    NASA Technical Reports Server (NTRS)

    Sanders, Serita; Gill, Tepper L.; Paul, Arthur S.

    1989-01-01

    The organization of high technology and engineering problem solving has given rise to an emerging concept. Reasoning principles for integrating traditional engineering problem solving with system theory, management sciences, behavioral decision theory, and planning and design approaches can be incorporated into a methodological approach to solving problems with a long range perspective. Long range planning has a great potential to improve productivity by using a systematic and organized approach. Thus, efficiency and cost effectiveness are the driving forces in promoting the organization of engineering problems. Aspects of systems engineering that provide an understanding of management of large scale systems are broadly covered here. Due to the focus and application of research, other significant factors (e.g., human behavior, decision making, etc.) are not emphasized but are considered.

  15. Large scale study of tooth enamel

    SciTech Connect

    Bodart, F.; Deconninck, G.; Martin, M.Th.

    1981-04-01

    Human tooth enamel contains traces of foreign elements. The presence of these elements is related to the history and the environment of the human body and can be considered as the signature of perturbations which occur during the growth of a tooth. A map of the distribution of these traces on a large scale sample of the population will constitute a reference for further investigations of environmental effects. One hundred eighty samples of teeth were first analysed using PIXE, backscattering and nuclear reaction techniques. The results were analysed using statistical methods. Correlations between O, F, Na, P, Ca, Mn, Fe, Cu, Zn, Pb and Sr were observed and cluster analysis was in progress. The techniques described in the present work have been developed in order to establish a method for the exploration of very large samples of the Belgian population.

  16. Batteries for Large Scale Energy Storage

    SciTech Connect

    Soloveichik, Grigorii L.

    2011-07-15

    In recent years, with the deployment of renewable energy sources, advances in electrified transportation, and development in smart grids, the markets for large-scale stationary energy storage have grown rapidly. Electrochemical energy storage methods are strong candidate solutions due to their high energy density, flexibility, and scalability. This review provides an overview of mature and emerging technologies for secondary and redox flow batteries. New developments in the chemistry of secondary and flow batteries as well as regenerative fuel cells are also considered. Advantages and disadvantages of current and prospective electrochemical energy storage options are discussed. The most promising technologies in the short term are high-temperature sodium batteries with β″-alumina electrolyte, lithium-ion batteries, and flow batteries. Regenerative fuel cells and lithium metal batteries with high energy density require further research to become practical.

  17. Large-scale databases of proper names.

    PubMed

    Conley, P; Burgess, C; Hage, D

    1999-05-01

    Few tools for research in proper names have been available--specifically, there is no large-scale corpus of proper names. Two corpora of proper names were constructed, one based on U.S. phone book listings, the other derived from a database of Usenet text. Name frequencies from both corpora were compared with human subjects' reaction times (RTs) to the proper names in a naming task. Regression analysis showed that the Usenet frequencies contributed to predictions of human RT, whereas phone book frequencies did not. In addition, semantic neighborhood density measures derived from the HAL corpus were compared with the subjects' RTs and found to be a better predictor of RT than was frequency in either corpus. These new corpora are freely available on line for download. Potentials for these corpora range from using the names as stimuli in experiments to using the corpus data in software applications. PMID:10495803

  18. Large Scale Quantum Simulations of Nuclear Pasta

    NASA Astrophysics Data System (ADS)

    Fattoyev, Farrukh J.; Horowitz, Charles J.; Schuetrumpf, Bastian

    2016-03-01

    Complex and exotic nuclear geometries, collectively referred to as 'nuclear pasta', are expected to naturally exist in the crust of neutron stars and in supernovae matter. Using a set of self-consistent microscopic nuclear energy density functionals, we present the first results of large scale quantum simulations of pasta phases at baryon densities 0.03 < ρ < 0.10 fm⁻³ and proton fractions 0.05

  19. Large-scale simulations of reionization

    SciTech Connect

    Kohler, Katharina; Gnedin, Nickolay Y.; Hamilton, Andrew J.S.; /JILA, Boulder

    2005-11-01

    We use cosmological simulations to explore the large-scale effects of reionization. Since reionization is a process that involves a large dynamic range--from galaxies to rare bright quasars--we need to be able to cover a significant volume of the universe in our simulation without losing the important small scale effects from galaxies. Here we have taken an approach that uses clumping factors derived from small scale simulations to approximate the radiative transfer on the sub-cell scales. Using this technique, we can cover a simulation size up to 1280 h⁻¹ Mpc with 10 h⁻¹ Mpc cells. This allows us to construct synthetic spectra of quasars similar to observed spectra of SDSS quasars at high redshifts and compare them to the observational data. These spectra can then be analyzed for HII region sizes, the presence of the Gunn-Peterson trough, and the Lyman-α forest.

  20. Large scale water lens for solar concentration.

    PubMed

    Mondol, A S; Vogel, B; Bastian, G

    2015-06-01

    Properties of large scale water lenses for solar concentration were investigated. These lenses were built from readily available materials, normal tap water and hyper-elastic linear low density polyethylene foil. Exposed to sunlight, the focal lengths and light intensities in the focal spot were measured and calculated. Their optical properties were modeled with a raytracing software based on the lens shape. We have achieved a good match of experimental and theoretical data by considering wavelength dependent concentration factor, absorption and focal length. The change in light concentration as a function of water volume was examined via the resulting load on the foil and the corresponding change of shape. The latter was extracted from images and modeled by a finite element simulation. PMID:26072893
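
    A back-of-the-envelope thin-lens sketch of such a lens; the radius of curvature and aperture are assumed values, not the paper's measurements.

        # Plano-convex water lens (n ~ 1.33) in the thin-lens approximation.
        n_water = 1.33
        R = 0.5              # m, curvature of the bulging foil surface (assumed)
        D = 1.0              # m, aperture diameter (assumed)
        theta_sun = 9.3e-3   # rad, angular diameter of the Sun

        f = R / (n_water - 1.0)      # focal length, 1/f = (n - 1)/R
        d_spot = f * theta_sun       # geometric image of the solar disk
        concentration = (D / d_spot) ** 2
        print(f"f = {f:.2f} m, spot = {100 * d_spot:.1f} cm, C ~ {concentration:.0f}x")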

  1. Large scale structures in transitional pipe flow

    NASA Astrophysics Data System (ADS)

    Hellström, Leo; Ganapathisubramani, Bharathram; Smits, Alexander

    2015-11-01

    We present a dual-plane snapshot POD analysis of transitional pipe flow at a Reynolds number of 3440, based on the pipe diameter. The time-resolved high-speed PIV data were simultaneously acquired in two planes, a cross-stream plane (2D-3C) and a streamwise plane (2D-2C) on the pipe centerline. The two light sheets were orthogonally polarized, allowing particles situated in each plane to be viewed independently. In the snapshot POD analysis, the modal energy is based on the cross-stream plane, while the POD modes are calculated using the dual-plane data. We present results on the emergence and decay of the energetic large scale motions during transition to turbulence, and compare these motions to those observed in fully developed turbulent flow. Supported under ONR Grant N00014-13-1-0174 and ERC Grant No. 277472.
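
    The snapshot POD itself reduces to a singular value decomposition of the mean-subtracted snapshot matrix; a minimal sketch with stand-in data follows.

        import numpy as np

        def snapshot_pod(snapshots):
            """snapshots: (n_points, n_snapshots), one column per PIV field.
            Returns spatial modes, modal energy fractions, temporal coefficients."""
            X = snapshots - snapshots.mean(axis=1, keepdims=True)  # remove mean flow
            U, s, Vt = np.linalg.svd(X, full_matrices=False)
            return U, s**2 / np.sum(s**2), np.diag(s) @ Vt

        rng = np.random.default_rng(2)
        X = rng.standard_normal((2000, 500))   # stand-in for dual-plane PIV data
        modes, energy, coeffs = snapshot_pod(X)
        print("energy in first 5 modes:", energy[:5])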

  2. Challenges in large scale distributed computing: bioinformatics.

    SciTech Connect

    Disz, T.; Kubal, M.; Olson, R.; Overbeek, R.; Stevens, R.; Mathematics and Computer Science; Univ. of Chicago; The Fellowship for the Interpretation of Genomes

    2005-01-01

    The amount of genomic data available for study is increasing at a rate similar to that of Moore's law. This deluge of data is challenging bioinformaticians to develop newer, faster and better algorithms for analysis and examination of this data. The growing availability of large scale computing grids coupled with high-performance networking is challenging computer scientists to develop better, faster methods of exploiting parallelism in these biological computations and deploying them across computing grids. In this paper, we describe two computations that are required to be run frequently and which require large amounts of computing resources to complete in a reasonable time. The data for these computations are very large and the sequential computational time can exceed thousands of hours. We show the importance and relevance of these computations, the nature of the data and parallelism and we show how we are meeting the challenge of efficiently distributing and managing these computations in the SEED project.

  3. The challenge of large-scale structure

    NASA Astrophysics Data System (ADS)

    Gregory, S. A.

    1996-03-01

    The tasks that I have assumed for myself in this presentation include three separate parts. The first, appropriate to the particular setting of this meeting, is to review the basic work of the founding of this field; the appropriateness comes from the fact that W. G. Tifft made immense contributions that are not often realized by the astronomical community. The second task is to outline the general tone of the observational evidence for large scale structures. (Here, in particular, I cannot claim to be complete. I beg forgiveness from any workers who are left out by my oversight for lack of space and time.) The third task is to point out some of the major aspects of the field that may represent the clues by which some brilliant sleuth will ultimately figure out how galaxies formed.

  4. Grid sensitivity capability for large scale structures

    NASA Technical Reports Server (NTRS)

    Nagendra, Gopal K.; Wallerstein, David V.

    1989-01-01

    The considerations and the resultant approach used to implement design sensitivity capability for grids into a large scale, general purpose finite element system (MSC/NASTRAN) are presented. The design variables are grid perturbations with a rather general linking capability. Moreover, shape and sizing variables may be linked together. The design is general enough to facilitate geometric modeling techniques for generating design variable linking schemes in an easy and straightforward manner. Test cases have been run and validated by comparison with the overall finite difference method. The linking of a design sensitivity capability for shape variables in MSC/NASTRAN with an optimizer would give designers a powerful, automated tool to carry out practical optimization design of real life, complicated structures.

  5. Large scale water lens for solar concentration.

    PubMed

    Mondol, A S; Vogel, B; Bastian, G

    2015-06-01

    Properties of large scale water lenses for solar concentration were investigated. These lenses were built from readily available materials: normal tap water and hyper-elastic linear low-density polyethylene foil. Exposed to sunlight, the focal lengths and light intensities in the focal spot were measured and calculated. Their optical properties were modeled with raytracing software based on the lens shape. We achieved a good match between experimental and theoretical data by considering the wavelength-dependent concentration factor, absorption, and focal length. The change in light concentration as a function of water volume was examined via the resulting load on the foil and the corresponding change of shape. The latter was extracted from images and modeled by a finite element simulation.

  6. Large-scale sequential quadratic programming algorithms

    SciTech Connect

    Eldersveld, S.K.

    1992-09-01

    The problem addressed is the general nonlinear programming problem: finding a local minimizer for a nonlinear function subject to a mixture of nonlinear equality and inequality constraints. The methods studied are in the class of sequential quadratic programming (SQP) algorithms, which have previously proved successful for problems of moderate size. Our goal is to devise an SQP algorithm that is applicable to large-scale optimization problems, using sparse data structures and storing less curvature information but maintaining the property of superlinear convergence. The main features are: 1. The use of a quasi-Newton approximation to the reduced Hessian of the Lagrangian function. Only an estimate of the reduced Hessian matrix is required by our algorithm. The impact of not having available the full Hessian approximation is studied and alternative estimates are constructed. 2. The use of a transformation matrix Q. This allows the QP gradient to be computed easily when only the reduced Hessian approximation is maintained. 3. The use of a reduced-gradient form of the basis for the null space of the working set. This choice of basis is more practical than an orthogonal null-space basis for large-scale problems. The continuity condition for this choice is proven. 4. The use of incomplete solutions of quadratic programming subproblems. Certain iterates generated by an active-set method for the QP subproblem are used in place of the QP minimizer to define the search direction for the nonlinear problem. An implementation of the new algorithm has been obtained by modifying the code MINOS. Results and comparisons with MINOS and NPSOL are given for the new algorithm on a set of 92 test problems.
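
    To make the SQP structure concrete, the sketch below performs one SQP iteration for an equality-constrained problem by solving the KKT system of the QP subproblem. It uses a dense solve and a user-supplied Hessian approximation H, whereas the algorithm described above maintains only a reduced Hessian with sparse, null-space-based linear algebra; all names here are illustrative.

        import numpy as np

        def sqp_step(x, lam, grad_f, J, c, H):
            # QP subproblem: min 0.5 dx'H dx + grad_f'dx  s.t.  c + J dx = 0
            n, m = len(x), len(c)
            KKT = np.block([[H, J.T],
                            [J, np.zeros((m, m))]])
            rhs = -np.concatenate([grad_f + J.T @ lam, c])
            sol = np.linalg.solve(KKT, rhs)
            dx, dlam = sol[:n], sol[n:]
            return x + dx, lam + dlam    # new iterate and multiplier estimate

    Large-scale variants such as the one in this abstract replace the dense KKT solve with a reduced-gradient null-space basis and quasi-Newton updates of the reduced Hessian, and may accept incomplete QP solutions as search directions.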

  7. Introducing Large-Scale Innovation in Schools

    NASA Astrophysics Data System (ADS)

    Sotiriou, Sofoklis; Riviou, Katherina; Cherouvis, Stephanos; Chelioti, Eleni; Bogner, Franz X.

    2016-08-01

    Education reform initiatives tend to promise higher effectiveness in classrooms especially when emphasis is given to e-learning and digital resources. Practical changes in classroom realities or school organization, however, are lacking. A major European initiative entitled Open Discovery Space (ODS) examined the challenge of modernizing school education via a large-scale implementation of an open-scale methodology in using technology-supported innovation. The present paper describes this innovation scheme which involved schools and teachers all over Europe, embedded technology-enhanced learning into wider school environments and provided training to teachers. Our implementation scheme consisted of three phases: (1) stimulating interest, (2) incorporating the innovation into school settings and (3) accelerating the implementation of the innovation. The scheme's impact was monitored for a school year using five indicators: leadership and vision building, ICT in the curriculum, development of ICT culture, professional development support, and school resources and infrastructure. Based on about 400 schools, our study produced four results: (1) The growth in digital maturity was substantial, even for previously high scoring schools. This was even more important for indicators such as "vision and leadership" and "professional development." (2) The evolution of networking is presented graphically, showing the gradual growth of connections achieved. (3) These communities became core nodes, involving numerous teachers in sharing educational content and experiences: One out of three registered users (36 %) has shared his/her educational resources in at least one community. (4) Satisfaction scores ranged from 76 % (offer of useful support through teacher academies) to 87 % (good environment to exchange best practices). Initiatives such as ODS add substantial value to schools on a large scale.

  8. Supporting large-scale computational science

    SciTech Connect

    Musick, R

    1998-10-01

    A study has been carried out to determine the feasibility of using commercial database management systems (DBMSs) to support large-scale computational science. Conventional wisdom in the past has been that DBMSs are too slow for such data. Several events over the past few years have muddied the clarity of this mindset: (1) several commercial DBMS systems have demonstrated storage and ad-hoc query access to Terabyte data sets; (2) several large-scale science teams, such as EOSDIS [NAS91], high energy physics [MM97] and human genome [Kin93], have adopted (or make frequent use of) commercial DBMS systems as the central part of their data management scheme; (3) several major DBMS vendors have introduced their first object-relational products (ORDBMSs), which have the potential to support large, array-oriented data; and (4) in some cases, performance is a moot issue, in particular when the performance of legacy applications is not reduced while new, albeit slow, capabilities are added to the system. The basic assessment is still that DBMSs do not scale to large computational data. However, many of the reasons have changed, and there is an expiration date attached to that prognosis. This document expands on this conclusion, identifies the advantages and disadvantages of various commercial approaches, and describes the studies carried out in exploring this area. The document is meant to be brief, technical and informative, rather than a motivational pitch. The conclusions within are very likely to become outdated within the next 5-7 years, as market forces will have a significant impact on the state of the art in scientific data management over the next decade.

  9. Supporting large-scale computational science

    SciTech Connect

    Musick, R., LLNL

    1998-02-19

    Business needs have driven the development of commercial database systems since their inception. As a result, there has been a strong focus on supporting many users, minimizing the potential corruption or loss of data, and maximizing performance metrics like transactions per second, or TPC-C and TPC-D results. It turns out that these optimizations have little to do with the needs of the scientific community, and in particular have little impact on improving the management and use of large-scale high-dimensional data. At the same time, there is an unanswered need in the scientific community for many of the benefits offered by a robust DBMS. For example, tying an ad-hoc query language such as SQL together with a visualization toolkit would be a powerful enhancement to current capabilities. Unfortunately, there has been little emphasis or discussion in the VLDB community on this mismatch over the last decade. The goal of the paper is to identify the specific issues that need to be resolved before large-scale scientific applications can make use of DBMS products. This topic is addressed in the context of an evaluation of commercial DBMS technology applied to the exploration of data generated by the Department of Energy's Accelerated Strategic Computing Initiative (ASCI). The paper describes the data being generated for ASCI as well as current capabilities for interacting with and exploring this data. The attraction of applying standard DBMS technology to this domain is discussed, as well as the technical and business issues that currently make this an infeasible solution.

  10. Afghan Health Education Project: a community survey.

    PubMed

    Lipson, J G; Omidian, P A; Paul, S M

    1995-06-01

    This study assessed the health concerns and needs for health education in the Afghan refugee and immigrant community of the San Francisco Bay Area. The study used a telephone survey, seven community meetings and a survey administered to 196 Afghan families through face-to-face interviews. Data were analyzed qualitatively and statistically. Health problems of most concern are mental health problems and stress related to past refugee trauma and loss, current occupational and economic problems, and culture conflict. Physical health problems include heart disease, diabetes and dental problems. Needed health education topics include dealing with stress, heart health, nutrition, raising children in the United States (particularly adolescents), aging in the United States, and diabetes. Using coalition building and involving Afghans in their community assessment, we found that the Afghan community is eager for culture- and language-appropriate health education programs through videos, television, lectures, and written materials. Brief health education talks in community meetings and a health fair revealed enthusiasm and willingness to consider health promotion and disease-prevention practices. PMID:7596962

  11. [National Strategic Promotion for Large-Scale Clinical Cancer Research].

    PubMed

    Toyama, Senya

    2016-04-01

    The number of clinical research studies conducted by clinical cancer study groups has been decreasing in Japan this year. The groups say the reason is the abolition of donations from pharmaceutical companies after the Diovan scandal. But I suppose the fundamental problem is that a government-supported large-scale clinical cancer study system for evidence-based medicine (EBM) has not been fully established. Urgent establishment of such a system, based on a national strategy, is needed for cancer patients and for public health promotion.

  12. Identification of Extremely Large Scale Structures in SDSS-III

    NASA Astrophysics Data System (ADS)

    Sankhyayan, Shishir; Bagchi, J.; Sarkar, P.; Sahni, V.; Jacob, J.

    2016-10-01

    We have initiated a search for, and detailed study of, large scale structures present in the universe using galaxy redshift surveys. In this process, we take the volume-limited sample of galaxies from Sloan Digital Sky Survey III and find very large structures even beyond a redshift of 0.2. One of the structures is greater than 600 Mpc in extent, which raises a question about the homogeneity scale of the universe. The shapes of adjacent voids and structures appear to be correlated, which supports the physical existence of the observed structures. Additional observational support includes the correlation of galaxy clusters and of the QSO distribution with the density peaks of the volume-limited sample of galaxies.
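
    Quoting structure sizes in Mpc requires converting the redshift and angular extent of a structure into comoving distances for an assumed cosmology. A minimal sketch with astropy (fiducial Planck15 parameters; the redshift bounds and angle are illustrative placeholders, not values from this work):

        import numpy as np
        from astropy.cosmology import Planck15 as cosmo

        z_near, z_far = 0.20, 0.25                   # assumed redshift extent
        d_near = cosmo.comoving_distance(z_near)
        d_far = cosmo.comoving_distance(z_far)
        radial_extent = d_far - d_near               # line-of-sight comoving size

        theta = np.deg2rad(10.0)                     # assumed angular extent [rad]
        transverse = cosmo.comoving_transverse_distance(z_near) * theta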

  13. Large-Scale Statistics for Cu Electromigration

    NASA Astrophysics Data System (ADS)

    Hauschildt, M.; Gall, M.; Hernandez, R.

    2009-06-01

    Even after the successful introduction of Cu-based metallization, the electromigration failure risk has remained one of the important reliability concerns for advanced process technologies. The observation of strong bimodality for the electron up-flow direction in dual-inlaid Cu interconnects has added complexity, but is now widely accepted. The failure voids can occur both within the via ("early" mode) or within the trench ("late" mode). More recently, bimodality has been reported also in down-flow electromigration, leading to very short lifetimes due to small, slit-shaped voids under vias. For a more thorough investigation of these early failure phenomena, specific test structures were designed based on the Wheatstone Bridge technique. The use of these structures enabled an increase of the tested sample size close to 675,000, allowing a direct analysis of electromigration failure mechanisms at the single-digit ppm regime. Results indicate that down-flow electromigration exhibits bimodality at very small percentage levels, not readily identifiable with standard testing methods. The activation energy for the down-flow early failure mechanism was determined to be 0.83±0.02 eV. Within the small error bounds of this large-scale statistical experiment, this value is deemed to be significantly lower than the usually reported activation energy of 0.90 eV for electromigration-induced diffusion along Cu/SiCN interfaces. Due to the advantages of the Wheatstone Bridge technique, we were also able to expand the experimental temperature range down to 150 °C, coming quite close to typical operating conditions up to 125 °C. As a result of the lowered activation energy, we conclude that the down-flow early failure mode may control the chip lifetime at operating conditions. The slit-like character of the early failure void morphology also raises concerns about the validity of the Blech effect for this mechanism. A very small amount of Cu depletion may cause failure even before a
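
    The practical weight of the measured activation energy is that it sets the thermal acceleration factor used to extrapolate stress-test lifetimes to operating temperature. A back-of-envelope sketch using the Arrhenius term of Black's equation with the Ea = 0.83 eV reported above (the current-density acceleration term is omitted, and the temperatures simply mirror the abstract):

        import numpy as np

        k_B = 8.617e-5                          # Boltzmann constant [eV/K]

        def arrhenius_af(Ea_eV, T_test_C, T_use_C):
            # lifetime(use) / lifetime(test) from the thermal term of Black's equation
            T_test, T_use = T_test_C + 273.15, T_use_C + 273.15
            return np.exp((Ea_eV / k_B) * (1.0 / T_use - 1.0 / T_test))

        arrhenius_af(0.83, 150.0, 125.0)        # ~4.2x headroom from 150 C tests
        arrhenius_af(0.90, 150.0, 125.0)        # ~4.7x if the usual 0.90 eV held

    A lower Ea yields less lifetime benefit when extrapolating to operating temperature, which is why the authors flag the early failure mode as potentially lifetime-limiting.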

  14. Solving large scale structure in ten easy steps with COLA

    SciTech Connect

    Tassev, Svetlin; Zaldarriaga, Matias; Eisenstein, Daniel J. E-mail: matiasz@ias.edu

    2013-06-01

    We present the COmoving Lagrangian Acceleration (COLA) method: an N-body method for solving for Large Scale Structure (LSS) in a frame that is comoving with observers following trajectories calculated in Lagrangian Perturbation Theory (LPT). Unlike standard N-body methods, the COLA method can straightforwardly trade accuracy at small-scales in order to gain computational speed without sacrificing accuracy at large scales. This is especially useful for cheaply generating large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing, as those catalogs are essential for performing detailed error analysis for ongoing and future surveys of LSS. As an illustration, we ran a COLA-based N-body code on a box of size 100 Mpc/h with particles of mass ≈ 5 × 10⁹ M_sun/h. Running the code with only 10 timesteps was sufficient to obtain an accurate description of halo statistics down to halo masses of at least 10¹¹ M_sun/h. This is only at a modest speed penalty when compared to mocks obtained with LPT. A standard detailed N-body run is orders of magnitude slower than our COLA-based code. The speed-up we obtain with COLA is due to the fact that we calculate the large-scale dynamics exactly using LPT, while letting the N-body code solve for the small scales, without requiring it to capture exactly the internal dynamics of halos. Achieving a similar level of accuracy in halo statistics without the COLA method requires at least 3 times more timesteps than when COLA is employed.

  15. Solving large scale structure in ten easy steps with COLA

    NASA Astrophysics Data System (ADS)

    Tassev, Svetlin; Zaldarriaga, Matias; Eisenstein, Daniel J.

    2013-06-01

    We present the COmoving Lagrangian Acceleration (COLA) method: an N-body method for solving for Large Scale Structure (LSS) in a frame that is comoving with observers following trajectories calculated in Lagrangian Perturbation Theory (LPT). Unlike standard N-body methods, the COLA method can straightforwardly trade accuracy at small-scales in order to gain computational speed without sacrificing accuracy at large scales. This is especially useful for cheaply generating large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing, as those catalogs are essential for performing detailed error analysis for ongoing and future surveys of LSS. As an illustration, we ran a COLA-based N-body code on a box of size 100 Mpc/h with particles of mass ≈ 5 × 10⁹ M_solar/h. Running the code with only 10 timesteps was sufficient to obtain an accurate description of halo statistics down to halo masses of at least 10¹¹ M_solar/h. This is only at a modest speed penalty when compared to mocks obtained with LPT. A standard detailed N-body run is orders of magnitude slower than our COLA-based code. The speed-up we obtain with COLA is due to the fact that we calculate the large-scale dynamics exactly using LPT, while letting the N-body code solve for the small scales, without requiring it to capture exactly the internal dynamics of halos. Achieving a similar level of accuracy in halo statistics without the COLA method requires at least 3 times more timesteps than when COLA is employed.

  16. Large scale digital atlases in neuroscience

    NASA Astrophysics Data System (ADS)

    Hawrylycz, M.; Feng, D.; Lau, C.; Kuan, C.; Miller, J.; Dang, C.; Ng, L.

    2014-03-01

    Imaging in neuroscience has revolutionized our current understanding of brain structure, architecture and increasingly its function. Many characteristics of morphology, cell type, and neuronal circuitry have been elucidated through methods of neuroimaging. Combining this data in a meaningful, standardized, and accessible manner is the scope and goal of the digital brain atlas. Digital brain atlases are used today in neuroscience to characterize the spatial organization of neuronal structures, for planning and guidance during neurosurgery, and as a reference for interpreting other data modalities such as gene expression and connectivity data. The field of digital atlases is extensive and, in addition to atlases of the human brain, includes high quality brain atlases of the mouse, rat, rhesus macaque, and other model organisms. Using techniques based on histology, structural and functional magnetic resonance imaging as well as gene expression data, modern digital atlases use probabilistic and multimodal techniques, as well as sophisticated visualization software to form an integrated product. Toward this goal, brain atlases form a common coordinate framework for summarizing, accessing, and organizing this knowledge and will undoubtedly remain a key technology in neuroscience in the future. Since the development of its flagship project of a genome wide image-based atlas of the mouse brain, the Allen Institute for Brain Science has used imaging as a primary data modality for many of its large scale atlas projects. We present an overview of Allen Institute digital atlases in neuroscience, with a focus on the challenges and opportunities for image processing and computation.

  17. Food appropriation through large scale land acquisitions

    NASA Astrophysics Data System (ADS)

    Rulli, Maria Cristina; D'Odorico, Paolo

    2014-05-01

    The increasing demand for agricultural products and the uncertainty of international food markets has recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of lack of modern technology. It is expected that in the long run large scale land acquisitions (LSLAs) for commercial farming will bring the technology required to close the existing crop yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with LSLAs. We show that up to 300-550 million people could be fed by crops grown on the acquired land, should these investments in agriculture improve crop production and close the yield gap. In contrast, about 190-370 million people could be supported by this land without closing the yield gap. These numbers raise some concern because the food produced on the acquired land is typically exported to other regions, while the target countries exhibit high levels of malnourishment. Conversely, if used for domestic consumption, the crops harvested on the acquired land could ensure food security for the local populations.
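
    The people-fed estimates rest on a simple energy-balance calculation: annual calories produced on the acquired land divided by per-capita dietary needs. A hedged sketch with illustrative inputs (not the paper's dataset values):

        # back-of-envelope "people fed" accounting (all inputs assumed)
        production_tonnes = 5.0e7        # annual crop output of the acquired land
        kcal_per_tonne = 3.3e6           # rough cereal-average energy content
        kcal_per_person_day = 2500.0     # assumed dietary requirement

        people_fed = production_tonnes * kcal_per_tonne / (kcal_per_person_day * 365.0)
        # ~1.8e8 people for these inputs; closing yield gaps raises production_tonnes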

  18. Large-scale carbon fiber tests

    NASA Technical Reports Server (NTRS)

    Pride, R. A.

    1980-01-01

    A realistic release of carbon fibers was established by burning a minimum of 45 kg of carbon fiber composite aircraft structural components in each of five large scale, outdoor aviation jet fuel fire tests. This release was quantified by several independent assessments with various instruments developed specifically for these tests. The most likely values for the mass of single carbon fibers released ranged from 0.2 percent of the initial mass of carbon fiber for the source tests (zero wind velocity) to a maximum of 0.6 percent of the initial carbon fiber mass for dissemination tests (5 to 6 m/s wind velocity). Mean fiber lengths for fibers greater than 1 mm in length ranged from 2.5 to 3.5 mm. Mean diameters ranged from 3.6 to 5.3 micrometers which was indicative of significant oxidation. Footprints of downwind dissemination of the fire released fibers were measured to 19.1 km from the fire.

  19. Simulations of Large Scale Structures in Cosmology

    NASA Astrophysics Data System (ADS)

    Liao, Shihong

    Large-scale structures are powerful probes for cosmology. Due to the long range and non-linear nature of gravity, the formation of cosmological structures is a very complicated problem. The only known viable solution is cosmological N-body simulations. In this thesis, we use cosmological N-body simulations to study structure formation, particularly dark matter haloes' angular momenta and the dark matter velocity field. The origin and evolution of angular momenta is an important ingredient for the formation and evolution of haloes and galaxies. We study the time evolution of the empirical angular momentum-mass relation for haloes to offer a more complete picture of its origin, its dependence on cosmological models, and its nonlinear evolution. We also show that haloes follow a simple universal specific angular momentum profile, which is useful in modelling haloes' angular momenta. The dark matter velocity field will become a powerful cosmological probe in the coming decades. However, theoretical predictions of the velocity field rely on N-body simulations and thus may be affected by numerical artefacts (e.g. finite box size, softening length and initial conditions). We study how such numerical effects affect the predicted pairwise velocities, and we propose a theoretical framework to understand and correct them. Our results will be useful for accurately comparing N-body simulations to observational data of pairwise velocities.

  20. Backscatter in Large-Scale Flows

    NASA Astrophysics Data System (ADS)

    Nadiga, Balu

    2009-11-01

    Downgradient mixing of potential vorticity and its variants is commonly employed to model the effects of unresolved geostrophic turbulence on resolved scales. This is motivated by the (inviscid and unforced) particle-wise conservation of potential vorticity and the mean forward or down-scale cascade of potential enstrophy in geostrophic turbulence. By examining the statistical distribution of the transfer of potential enstrophy from mean or filtered motions to eddy or sub-filter motions, we find that the mean forward cascade results from the forward-scatter being only slightly greater than the backscatter. Downgradient mixing ideas do not recognize such equitable mean-eddy or large scale-small scale interactions and consequently model only the mean effect of forward cascade; the importance of capturing the effects of backscatter (the forcing of resolved scales by unresolved scales) is only beginning to be recognized. While recent attempts to model the effects of backscatter on resolved scales have taken a stochastic approach, our analysis suggests that these effects are amenable to being modeled deterministically.

  1. Large scale molecular simulations of nanotoxicity.

    PubMed

    Jimenez-Cruz, Camilo A; Kang, Seung-gu; Zhou, Ruhong

    2014-01-01

    The widespread use of nanomaterials in biomedical applications has been accompanied by an increasing interest in understanding their interactions with tissues, cells, and biomolecules, and in particular, how they might affect the integrity of cell membranes and proteins. In this mini-review, we present a summary of some of the recent studies on this important subject, especially from the point of view of large scale molecular simulations. The carbon-based nanomaterials and noble metal nanoparticles are the main focus, with additional discussions on quantum dots and other nanoparticles as well. The driving forces for adsorption of fullerenes, carbon nanotubes, and graphene nanosheets onto proteins or cell membranes are found to be mainly hydrophobic interactions and the so-called π-π stacking (between aromatic rings), while for the noble metal nanoparticles the long-range electrostatic interactions play a bigger role. More interestingly, there is also growing evidence that nanotoxicity can have implications for the de novo design of nanomedicine. For example, the endohedral metallofullerenol Gd@C₈₂(OH)₂₂ has been shown to inhibit tumor growth and metastasis by inhibiting the enzyme MMP-9, and graphene has been shown to disrupt bacterial cell membranes by insertion/cutting as well as destructive extraction of lipid molecules. These recent findings have provided a better understanding of nanotoxicity at the molecular level and also suggested therapeutic potential by using the cytotoxicity of nanoparticles against cancer or bacterial cells.

  2. Large scale mechanical metamaterials as seismic shields

    NASA Astrophysics Data System (ADS)

    Miniaci, Marco; Krushynska, Anastasiia; Bosia, Federico; Pugno, Nicola M.

    2016-08-01

    Earthquakes represent one of the most catastrophic natural events affecting mankind. At present, a universally accepted risk mitigation strategy for seismic events remains to be proposed. Most approaches are based on vibration isolation of structures rather than on the remote shielding of incoming waves. In this work, we propose a novel approach to the problem and discuss the feasibility of a passive isolation strategy for seismic waves based on large-scale mechanical metamaterials, including, for the first time, numerical analysis of both surface and guided waves, soil dissipation effects, and full 3D simulations. The study focuses on realistic structures that can be effective in frequency ranges of interest for seismic waves, and optimal design criteria are provided, exploring different metamaterial configurations, combining phononic crystals and locally resonant structures and different ranges of mechanical properties. Dispersion analysis and full-scale 3D transient wave transmission simulations are carried out on finite size systems to assess the seismic wave amplitude attenuation in realistic conditions. Results reveal that both surface and bulk seismic waves can be considerably attenuated, making this strategy viable for the protection of civil structures against seismic risk. The proposed remote shielding approach could open up new perspectives in the field of seismology and in related areas of low-frequency vibration damping or blast protection.
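
    The band-gap reasoning behind such designs can be illustrated in one dimension with the classical transfer-matrix (Rytov) dispersion relation for a periodic bilayer: frequencies where the right-hand side below exceeds 1 in magnitude admit no propagating Bloch wave. The material values are illustrative stand-ins for soil and a stiff inclusion, not the paper's optimized designs.

        import numpy as np

        # layer 1 / layer 2: density [kg/m^3], wave speed [m/s], thickness [m] (assumed)
        rho1, c1, h1 = 1800.0, 300.0, 2.0
        rho2, c2, h2 = 2400.0, 3500.0, 2.0
        Z1, Z2 = rho1 * c1, rho2 * c2            # acoustic impedances

        f = np.linspace(0.5, 60.0, 2000)         # seismic frequency range [Hz]
        w = 2.0 * np.pi * f
        cos_qd = (np.cos(w * h1 / c1) * np.cos(w * h2 / c2)
                  - 0.5 * (Z1 / Z2 + Z2 / Z1) * np.sin(w * h1 / c1) * np.sin(w * h2 / c2))
        band_gap = np.abs(cos_qd) > 1.0          # True where Bloch waves are evanescent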

  3. Large-scale wind turbine structures

    NASA Technical Reports Server (NTRS)

    Spera, David A.

    1988-01-01

    The purpose of this presentation is to show how structural technology was applied in the design of modern wind turbines, which were recently brought to an advanced stage of development as sources of renewable power. Wind turbine structures present many difficult problems because they are relatively slender and flexible; subject to vibration and aeroelastic instabilities; acted upon by loads which are often nondeterministic; operated continuously with little maintenance in all weather; and dominated by life-cycle cost considerations. Progress in horizontal-axis wind turbines (HAWT) development was paced by progress in the understanding of structural loads, modeling of structural dynamic response, and designing of innovative structural response. During the past 15 years a series of large HAWTs was developed. This has culminated in the recent completion of the world's largest operating wind turbine, the 3.2 MW Mod-5B power plant installed on the island of Oahu, Hawaii. Some of the applications of structures technology to wind turbines will be illustrated by referring to the Mod-5B design. First, a video overview will be presented to provide familiarization with the Mod-5B project and the important components of the wind turbine system. Next, the structural requirements for large-scale wind turbines will be discussed, emphasizing the difficult fatigue-life requirements. Finally, the procedures used to design the structure will be presented, including the use of the fracture mechanics approach for determining allowable fatigue stresses.

  4. Large-scale wind turbine structures

    NASA Astrophysics Data System (ADS)

    Spera, David A.

    1988-05-01

    The purpose of this presentation is to show how structural technology was applied in the design of modern wind turbines, which were recently brought to an advanced stage of development as sources of renewable power. Wind turbine structures present many difficult problems because they are relatively slender and flexible; subject to vibration and aeroelastic instabilities; acted upon by loads which are often nondeterministic; operated continuously with little maintenance in all weather; and dominated by life-cycle cost considerations. Progress in horizontal-axis wind turbines (HAWT) development was paced by progress in the understanding of structural loads, modeling of structural dynamic response, and designing of innovative structural response. During the past 15 years a series of large HAWTs was developed. This has culminated in the recent completion of the world's largest operating wind turbine, the 3.2 MW Mod-5B power plant installed on the island of Oahu, Hawaii. Some of the applications of structures technology to wind turbines will be illustrated by referring to the Mod-5B design. First, a video overview will be presented to provide familiarization with the Mod-5B project and the important components of the wind turbine system. Next, the structural requirements for large-scale wind turbines will be discussed, emphasizing the difficult fatigue-life requirements. Finally, the procedures used to design the structure will be presented, including the use of the fracture mechanics approach for determining allowable fatigue stresses.

  5. Health Physics Enrollments and Degrees Survey, 2006 Data

    SciTech Connect

    Oak Ridge Institute for Science and Education

    2007-03-31

    This annual survey collects 2006 data on the number of health physics degrees awarded as well as the number of students enrolled in health physics academic programs. Thirty universities offer health physics degrees; all responded to the survey.

  6. Simulating the large-scale structure of HI intensity maps

    NASA Astrophysics Data System (ADS)

    Seehars, Sebastian; Paranjape, Aseem; Witzemann, Amadeus; Refregier, Alexandre; Amara, Adam; Akeret, Joel

    2016-03-01

    Intensity mapping of neutral hydrogen (HI) is a promising observational probe of cosmology and large-scale structure. We present wide field simulations of HI intensity maps based on N-body simulations of a 2.6 Gpc/h box with 2048³ particles (particle mass 1.6 × 10¹¹ M_solar/h). Using a conditional mass function to populate the simulated dark matter density field with halos below the mass resolution of the simulation (10⁸ M_solar/h < M_halo < 10¹³ M_solar/h), we assign HI to those halos according to a phenomenological halo to HI mass relation. The simulations span a redshift range of 0.35 ≲ z ≲ 0.9 in redshift bins of width Δz ≈ 0.05 and cover a quarter of the sky at an angular resolution of about 7'. We use the simulated intensity maps to study the impact of non-linear effects and redshift space distortions on the angular clustering of HI. Focusing on the autocorrelations of the maps, we apply and compare several estimators for the angular power spectrum and its covariance. We verify that these estimators agree with analytic predictions on large scales and study the validity of approximations based on Gaussian random fields, particularly in the context of the covariance. We discuss how our results and the simulated maps can be useful for planning and interpreting future HI intensity mapping surveys.
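
    A common validation step for such estimators is to check them against the known input spectrum of a Gaussian realization. A minimal full-sky sketch with healpy; the input spectrum below is an arbitrary smooth placeholder, not the simulated HI spectrum.

        import numpy as np
        import healpy as hp

        nside = 256
        lmax = 3 * nside - 1
        ell = np.arange(lmax + 1)
        cl_in = 1.0e-6 / (ell + 10.0) ** 2        # assumed smooth input spectrum

        m = hp.synfast(cl_in, nside, lmax=lmax)   # Gaussian random map realization
        cl_hat = hp.anafast(m, lmax=lmax)         # estimated angular power spectrum

        # Gaussian (cosmic-variance) scatter expected for a full-sky estimate:
        sigma_cl = np.sqrt(2.0 / (2.0 * ell + 1.0)) * cl_in

    Partial-sky coverage (a quarter of the sky here) additionally requires mask deconvolution or an fsky correction, which is where the approximations discussed in the abstract enter.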

  7. The effective field theory of cosmological large scale structures

    SciTech Connect

    Carrasco, John Joseph M.; Hertzberg, Mark P.; Senatore, Leonardo

    2012-09-20

    Large scale structure surveys will likely become the next leading cosmological probe. In our universe, matter perturbations are large on short distances and small at long scales, i.e. strongly coupled in the UV and weakly coupled in the IR. To make precise analytical predictions on large scales, we develop an effective field theory formulated in terms of an IR effective fluid characterized by several parameters, such as speed of sound and viscosity. These parameters, determined by the UV physics described by the Boltzmann equation, are measured from N-body simulations. We find that the speed of sound of the effective fluid is c_s² ≈ 10⁻⁶ c² and that the viscosity contributions are of the same order. The fluid describes all the relevant physics at long scales k and permits a manifestly convergent perturbative expansion in the size of the matter perturbations δ(k) for all the observables. As an example, we calculate the correction to the power spectrum at order δ(k)⁴. As a result, the predictions of the effective field theory are found to be in much better agreement with observation than standard cosmological perturbation theory, already reaching percent precision at this order up to a relatively short scale k ≃ 0.24 h Mpc⁻¹.
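
    Schematically, the effective-fluid parameters enter the power spectrum through a k² counterterm added to the one-loop prediction. The sketch below applies that correction in one common convention, P_EFT(k) = P_lin + P_1loop − 2 c_s² k² P_lin; normalizations of c_s² vary across the literature, so treat this as illustrative rather than the paper's exact expression.

        import numpy as np

        def p_eft(k, p_lin, p_1loop, cs2):
            # k in h/Mpc, spectra in (Mpc/h)^3, cs2 in (Mpc/h)^2;
            # cs2 is fit to N-body data and absorbs the UV sensitivity
            # of the one-loop integral
            return p_lin + p_1loop - 2.0 * cs2 * k**2 * p_lin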

  8. Halo detection via large-scale Bayesian inference

    NASA Astrophysics Data System (ADS)

    Merson, Alexander I.; Jasche, Jens; Abdalla, Filipe B.; Lahav, Ofer; Wandelt, Benjamin; Jones, D. Heath; Colless, Matthew

    2016-08-01

    We present a proof-of-concept of a novel and fully Bayesian methodology designed to detect haloes of different masses in cosmological observations subject to noise and systematic uncertainties. Our methodology combines the previously published Bayesian large-scale structure inference algorithm, HAmiltonian Density Estimation and Sampling algorithm (HADES), and a Bayesian chain rule (the Blackwell-Rao estimator), which we use to connect the inferred density field to the properties of dark matter haloes. To demonstrate the capability of our approach, we construct a realistic galaxy mock catalogue emulating the wide-area 6-degree Field Galaxy Survey, which has a median redshift of approximately 0.05. Application of HADES to the catalogue provides us with accurately inferred three-dimensional density fields and corresponding quantification of uncertainties inherent to any cosmological observation. We then use a cosmological simulation to relate the amplitude of the density field to the probability of detecting a halo with mass above a specified threshold. With this information, we can sum over the HADES density field realisations to construct maps of detection probabilities and demonstrate the validity of this approach within our mock scenario. We find that the probability of successful detection of haloes in the mock catalogue increases as a function of the signal to noise of the local galaxy observations. Our proposed methodology can easily be extended to account for more complex scientific questions and is a promising novel tool to analyse the cosmic large-scale structure in observations.
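
    The Blackwell-Rao step can be summarized in a few lines: the detection-probability map is the posterior average, over density-field samples, of a simulation-calibrated conditional probability. A sketch, with the calibration function assumed given (the sigmoid below is a toy placeholder):

        import numpy as np

        def detection_probability_map(delta_samples, p_detect_given_delta):
            # delta_samples: iterable of posterior density-field realizations
            # p_detect_given_delta: delta -> P(halo with M > M_min | delta),
            #   tabulated from an N-body simulation (assumed supplied)
            return np.mean([p_detect_given_delta(d) for d in delta_samples], axis=0)

        # toy calibration: a smooth monotone function of the local density
        p_cal = lambda d: 1.0 / (1.0 + np.exp(-(d - 2.0)))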

  9. Locally Biased Galaxy Formation and Large-Scale Structure

    NASA Astrophysics Data System (ADS)

    Narayanan, Vijay K.; Berlind, Andreas A.; Weinberg, David H.

    2000-01-01

    We examine the influence of the morphology-density relation and a wide range of simple models for biased galaxy formation on statistical measures of large-scale structure. We contrast the behavior of local biasing models, in which the efficiency of galaxy formation is determined by the density, geometry, or velocity dispersion of the local mass distribution, with that of nonlocal biasing models, in which galaxy formation is modulated coherently over scales larger than the galaxy correlation length. If morphological segregation of galaxies is governed by a local morphology-density relation, then the correlation function of E/S0 galaxies should be steeper and stronger than that of spiral galaxies on small scales, as observed, while on large scales the E/S0 and spiral galaxies should have correlation functions with the same shape but different amplitudes. Similarly, all of our local bias models produce scale-independent amplification of the correlation function and power spectrum in the linear and mildly nonlinear regimes; only a nonlocal biasing mechanism can alter the shape of the power spectrum on large scales. Moments of the biased galaxy distribution retain the hierarchical pattern of the mass moments, but biasing alters the values and scale dependence of the hierarchical amplitudes S₃ and S₄. Pair-weighted moments of the galaxy velocity distribution are sensitive to the details of the bias prescription even if galaxies have the same local velocity distribution as the underlying dark matter. The nonlinearity of the relation between galaxy density and mass density depends on the biasing prescription and the smoothing scale, and the scatter in this relation is a useful diagnostic of the physical parameters that determine the bias. While the assumption that galaxy formation is governed by local physics leads to some important simplifications on large scales, even local biasing is a multifaceted phenomenon whose impact cannot be described by a single parameter or

  10. An informal paper on large-scale dynamic systems

    NASA Technical Reports Server (NTRS)

    Ho, Y. C.

    1975-01-01

    Large scale systems are defined as systems requiring more than one decision maker to control the system. Decentralized control and decomposition are discussed for large scale dynamic systems. Information and many-person decision problems are analyzed.

  11. Korea Community Health Survey Data Profiles.

    PubMed

    Kang, Yang Wha; Ko, Yun Sil; Kim, Yoo Jin; Sung, Kyoung Mi; Kim, Hyo Jin; Choi, Hyung Yun; Sung, Changhyun; Jeong, Eunkyeong

    2015-06-01

    In 2008, Korea Centers for Disease Control and Prevention initiated the first nationwide survey, Korea Community Health Survey (KCHS), to provide data that could be used to plan, implement, monitor, and evaluate community health promotion and disease prevention programs. This community-based cross-sectional survey has been conducted by 253 community health centers, 35 community universities, and 1500 interviewers. The KCHS standardized questionnaire was developed jointly by the Korea Centers for Disease Control and Prevention staff, a working group of health indicators standardization subcommittee, and 16 metropolitan cities and provinces with 253 regional sites. The questionnaire covers a variety of topics related to health behaviors and prevention, which is used to assess the prevalence of personal health practices and behaviors related to the leading causes of disease, including smoking, alcohol use, drinking and driving, high blood pressure control, physical activity, weight control, quality of life (European Quality of Life-5 Dimensions, European Quality of Life-Visual Analogue Scale, Korean Instrumental Activities of Daily Living), medical service, accident, injury, etc. The KCHS was administered by trained interviewers, and the quality control of the KCHS was improved by the introduction of a computer-assisted personal interview in 2010. The KCHS data allow a direct comparison of the differences of health issues among provinces. Furthermore, the provinces can use these data for their own cost-effective health interventions to improve health promotion and disease prevention. For users and researchers throughout the world, microdata (in the form of SAS files) and analytic guidelines can be downloaded from the KCHS website (http://KCHS.cdc.go.kr/) in Korean. PMID:26430619

  12. Sensitivity technologies for large scale simulation.

    SciTech Connect

    Collis, Samuel Scott; Bartlett, Roscoe Ainsworth; Smith, Thomas Michael; Heinkenschloss, Matthias; Wilcox, Lucas C.; Hill, Judith C.; Ghattas, Omar; Berggren, Martin Olof; Akcelik, Volkan; Ober, Curtis Curry; van Bloemen Waanders, Bart Gustaaf; Keiter, Eric Richard

    2005-01-01

    Sensitivity analysis is critically important to numerous analysis algorithms, including large scale optimization, uncertainty quantification, reduced order modeling, and error estimation. Our research focused on developing tools, algorithms and standard interfaces to facilitate the implementation of sensitivity-type analysis into existing code; equally important, the work focused on ways to increase the visibility of sensitivity analysis. We attempt to accomplish the first objective through the development of hybrid automatic differentiation tools, standard linear algebra interfaces for numerical algorithms, time domain decomposition algorithms and two-level Newton methods. We attempt to accomplish the second goal by presenting the results of several case studies in which direct sensitivities and adjoint methods have been effectively applied, in addition to an investigation of h-p adaptivity using adjoint-based a posteriori error estimation. A mathematical overview is provided of direct sensitivities and adjoint methods for both steady state and transient simulations. Two case studies are presented to demonstrate the utility of these methods. A direct sensitivity method is implemented to solve a source inversion problem for steady state internal flows subject to convection diffusion. Real time performance is achieved using novel decomposition into offline and online calculations. Adjoint methods are used to reconstruct initial conditions of a contamination event in an external flow. We demonstrate an adjoint based transient solution. In addition, we investigated time domain decomposition algorithms in an attempt to improve the efficiency of transient simulations. Because derivative calculations are at the root of sensitivity calculations, we have developed hybrid automatic differentiation methods and implemented this approach for shape optimization for gas dynamics using the Euler equations. The hybrid automatic differentiation method was applied to a first
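
    The direct-versus-adjoint distinction the report builds on is easiest to see for a steady linear model A(p)u = b with a scalar objective J = cᵀu: the direct route needs one extra solve per parameter, the adjoint route one solve total. A self-contained sketch (dense linear algebra, illustrative only):

        import numpy as np

        def djdp_direct(A, dA_list, b, c):
            u = np.linalg.solve(A, b)
            # du/dp_i solves A (du/dp_i) = -(dA/dp_i) u: one solve per parameter
            return np.array([c @ np.linalg.solve(A, -dA @ u) for dA in dA_list])

        def djdp_adjoint(A, dA_list, b, c):
            u = np.linalg.solve(A, b)
            lam = np.linalg.solve(A.T, c)   # one adjoint solve serves all parameters
            return np.array([-lam @ (dA @ u) for dA in dA_list])

        # both return identical gradients; the adjoint route wins when the number
        # of parameters greatly exceeds the number of objectives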

  13. International space station. Large scale integration approach

    NASA Astrophysics Data System (ADS)

    Cohen, Brad

    The International Space Station is the most complex large scale integration program in development today. The approach developed for specification, subsystem development, and verification lays a firm basis on which future programs of this nature can be based. The International Space Station is composed of many critical items, hardware and software, built by numerous International Partners, NASA Institutions, and U.S. Contractors, and is launched over a period of five years. Each launch creates a unique configuration that must be safe, survivable, operable, and support ongoing assembly (assemblable) to arrive at the assembly-complete configuration in 2003. The approach to integrating each module into a viable spacecraft while continuing the assembly is a challenge in itself. Added to this challenge are the severe schedule constraints and lack of an "Iron Bird", which prevents assembly and checkout of each on-orbit configuration prior to launch. This paper will focus on the following areas: 1) Specification development process, explaining how the requirements and specifications were derived using a modular concept driven by launch vehicle capability; each module is composed of components of subsystems versus completed subsystems. 2) Approach to stage (each stage consists of the launched module added to the current on-orbit spacecraft) specifications; specifically, how each launched module and stage ensures support of the current and future elements of the assembly. 3) Verification approach: due to the schedule constraints, verification is primarily analysis supported by testing; specifically, how the interfaces are ensured to mate and function on-orbit when they cannot be mated before launch. 4) Lessons learned: where can we improve this complex system design and integration task?

  14. Large Scale Flame Spread Environmental Characterization Testing

    NASA Technical Reports Server (NTRS)

    Clayman, Lauren K.; Olson, Sandra L.; Gokoghi, Suleyman A.; Brooker, John E.; Ferkul, Paul V.; Kacher, Henry F.

    2013-01-01

    Under the Advanced Exploration Systems (AES) Spacecraft Fire Safety Demonstration Project (SFSDP), as a risk mitigation activity in support of the development of a large-scale fire demonstration experiment in microgravity, flame-spread tests were conducted in normal gravity on thin, cellulose-based fuels in a sealed chamber. The primary objective of the tests was to measure pressure rise in a chamber as sample material, burning direction (upward/downward), total heat release, heat release rate, and heat loss mechanisms were varied between tests. A Design of Experiments (DOE) method was imposed to produce an array of tests from a fixed set of constraints and a coupled response model was developed. Supplementary tests were run without experimental design to additionally vary select parameters such as initial chamber pressure. The starting chamber pressure for each test was set below atmospheric to prevent chamber overpressure. Bottom ignition, or upward propagating burns, produced rapid acceleratory turbulent flame spread. Pressure rise in the chamber increases as the amount of fuel burned increases, mainly because of the larger amount of heat generation and, to a much smaller extent, due to the increase in gaseous number of moles. Top ignition, or downward propagating burns, produced a steady flame spread with a very small flat flame across the burning edge. Steady-state pressure is achieved during downward flame spread as the pressure rises and plateaus. This indicates that the heat generation by the flame matches the heat loss to surroundings during the longer, slower downward burns. One heat loss mechanism included mounting a heat exchanger directly above the burning sample in the path of the plume to act as a heat sink and more efficiently dissipate the heat due to the combustion event. This proved an effective means for chamber overpressure mitigation for those tests producing the most total heat release and thus was determined to be a feasible mitigation
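
    The observation that pressure rise tracks heat generation follows from a constant-volume energy balance for an ideal gas, ΔP ≈ (γ − 1)Q/V, when the change in mole number is small (as the abstract notes). An order-of-magnitude sketch with assumed values, not the test chamber's actual dimensions:

        # constant-volume pressure rise from net heat retained by the chamber gas
        gamma = 1.4          # ratio of specific heats for air
        V = 0.5              # chamber volume [m^3] (assumed)
        Q = 50.0e3           # net heat added to the gas [J] (assumed)

        dP = (gamma - 1.0) * Q / V      # ~40 kPa for these inputs
        # heat sunk into a heat exchanger reduces Q and hence the peak pressure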

  15. Disaster triage systems for large-scale catastrophic events.

    PubMed

    Bostick, Nathan A; Subbarao, Italo; Burkle, Frederick M; Hsu, Edbert B; Armstrong, John H; James, James J

    2008-09-01

    Large-scale catastrophic events typically result in a scarcity of essential medical resources and accordingly necessitate the implementation of triage management policies to minimize preventable morbidity and mortality. Accomplishing this goal requires a reconceptualization of triage as a population-based systemic process that integrates care at all points of interaction between patients and the health care system. This system identifies at minimum 4 orders of contact: first order, the community; second order, prehospital; third order, facility; and fourth order, regional level. Adopting this approach will ensure that disaster response activities will occur in a comprehensive fashion that minimizes the patient care burden at each subsequent order of intervention and reduces the overall need to ration care. The seamless integration of all orders of intervention within this systems-based model of disaster-specific triage, coordinated through health emergency operations centers, can ensure that disaster response measures are undertaken in a manner that is effective, just, and equitable. PMID:18769264

  16. Synchronization of coupled large-scale Boolean networks

    SciTech Connect

    Li, Fangfei

    2014-03-15

    This paper investigates the complete synchronization and partial synchronization of two large-scale Boolean networks. First, the aggregation algorithm for large-scale Boolean networks is reviewed. Second, the aggregation algorithm is applied to study the complete synchronization and partial synchronization of large-scale Boolean networks. Finally, an illustrative example is presented to show the efficiency of the proposed results.
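
    For small networks, complete synchronization of a drive-response Boolean pair can be checked by exhaustive simulation rather than the aggregation algebra used above: run every joint initial state into its attractor and verify the two networks agree there. A brute-force sketch (illustrative only, exponential in network size):

        import itertools

        def synchronizes(n, f, g):
            # f: drive update, x -> x'; g: coupled response update, (x, y) -> y'
            # states are n-tuples of 0/1; N bounds transient plus cycle length
            N = 2 ** (2 * n)
            for x0 in itertools.product((0, 1), repeat=n):
                for y0 in itertools.product((0, 1), repeat=n):
                    x, y = x0, y0
                    for _ in range(N):           # burn in to the joint attractor
                        x, y = f(x), g(x, y)
                    for _ in range(N):           # check agreement on the attractor
                        if x != y:
                            return False
                        x, y = f(x), g(x, y)
            return True

        # example: a response driven entirely by the drive synchronizes completely
        f = lambda x: (x[1], x[0])               # drive: swap the two nodes
        g = lambda x, y: f(x)                    # response: copy the drive's next state
        synchronizes(2, f, g)                    # -> True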

  17. Quality of data in multiethnic health surveys.

    PubMed Central

    Pasick, R. J.; Stewart, S. L.; Bird, J. A.; D'Onofrio, C. N.

    2001-01-01

    OBJECTIVE: There has been insufficient research on the influence of ethno-cultural and language differences in public health surveys. Using data from three independent studies, the authors examine methods to assess data quality and to identify causes of problematic survey questions. METHODS: Qualitative and quantitative methods were used in this exploratory study, including secondary analyses of data from three baseline surveys (conducted in English, Spanish, Cantonese, Mandarin, and Vietnamese). Collection of additional data included interviews with investigators and interviewers; observations of item development; focus groups; think-aloud interviews; a test-retest assessment survey; and a pilot test of alternatively worded questions. RESULTS: The authors identify underlying causes for the 12 most problematic variables in three multiethnic surveys and describe them in terms of ethnic differences in reliability, validity, and cognitive processes (interpretation, memory retrieval, judgment formation, and response editing), and differences with regard to cultural appropriateness and translation problems. CONCLUSIONS: Multiple complex elements affect measurement in a multiethnic survey, many of which are neither readily observed nor understood through standard tests of data quality. Multiethnic survey questions are best evaluated using a variety of quantitative and qualitative methods that reveal different types and causes of problems. PMID:11889288
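
    One of the quantitative checks mentioned above, test-retest reliability, is easy to sketch per language group with Cohen's kappa; the response arrays below are made-up placeholders, not study data.

        from sklearn.metrics import cohen_kappa_score

        # test-retest answers to one item, by interview language (placeholder data)
        retest = {
            "English":   ([1, 0, 1, 1, 0, 1, 0, 1], [1, 0, 1, 0, 0, 1, 0, 1]),
            "Spanish":   ([1, 1, 0, 1, 0, 0, 1, 0], [0, 1, 0, 1, 1, 0, 1, 0]),
            "Cantonese": ([0, 0, 1, 1, 1, 0, 1, 1], [0, 1, 1, 1, 0, 0, 1, 1]),
        }
        for group, (t1, t2) in retest.items():
            print(group, round(cohen_kappa_score(t1, t2), 2))
        # a markedly lower kappa in one group flags a translation or
        # interpretation problem with that item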

  18. Nonzero Density-Velocity Consistency Relations for Large Scale Structures.

    PubMed

    Rizzo, Luca Alberto; Mota, David F; Valageas, Patrick

    2016-08-19

    We present exact kinematic consistency relations for cosmological structures that do not vanish at equal times and can thus be measured in surveys. These rely on cross correlations between the density and velocity, or momentum, fields. Indeed, the uniform transport of small-scale structures by long-wavelength modes, which cannot be detected at equal times by looking at density correlations only, gives rise to a shift in the amplitude of the velocity field that could be measured. These consistency relations only rely on the weak equivalence principle and Gaussian initial conditions. They remain valid in the nonlinear regime and for biased galaxy fields. They can be used to constrain nonstandard cosmological scenarios or the large-scale galaxy bias. PMID:27588842

  19. Nonzero Density-Velocity Consistency Relations for Large Scale Structures

    NASA Astrophysics Data System (ADS)

    Rizzo, Luca Alberto; Mota, David F.; Valageas, Patrick

    2016-08-01

    We present exact kinematic consistency relations for cosmological structures that do not vanish at equal times and can thus be measured in surveys. These rely on cross correlations between the density and velocity, or momentum, fields. Indeed, the uniform transport of small-scale structures by long-wavelength modes, which cannot be detected at equal times by looking at density correlations only, gives rise to a shift in the amplitude of the velocity field that could be measured. These consistency relations only rely on the weak equivalence principle and Gaussian initial conditions. They remain valid in the nonlinear regime and for biased galaxy fields. They can be used to constrain nonstandard cosmological scenarios or the large-scale galaxy bias.

  20. The Large-scale Structure of the Universe: Probes of Cosmology and Structure Formation

    NASA Astrophysics Data System (ADS)

    Noh, Yookyung

    The usefulness of large-scale structure as a probe of cosmology and structure formation is increasing as large deep surveys in multi-wavelength bands become possible. Observational analysis of large-scale structure, guided by large-volume numerical simulations, is beginning to offer complementary information and crosschecks of cosmological parameters estimated from the anisotropies in the Cosmic Microwave Background (CMB) radiation. Understanding of structure formation and evolution, and even of galaxy formation history, is also being aided by observations of different redshift snapshots of the Universe, using various tracers of large-scale structure. This dissertation work covers aspects of large-scale structure from the baryon acoustic oscillation scale to that of large-scale filaments and galaxy clusters. First, I discuss the use of large-scale structure for high-precision cosmology. I investigate the reconstruction of the Baryon Acoustic Oscillation (BAO) peak within the context of Lagrangian perturbation theory, testing its validity in a large suite of cosmological-volume N-body simulations. Then I consider galaxy clusters and the large-scale filaments surrounding them in a high-resolution N-body simulation. I investigate the geometrical properties of galaxy cluster neighborhoods, focusing on the filaments connected to clusters. Using mock observations of galaxy clusters, I explore the correlations of scatter in galaxy cluster mass estimates from multi-wavelength observations and different measurement techniques. I also examine the sources of the correlated scatter by considering the intrinsic and environmental properties of clusters.

  1. Ecohydrological modeling for large-scale environmental impact assessment.

    PubMed

    Woznicki, Sean A; Nejadhashemi, A Pouyan; Abouali, Mohammad; Herman, Matthew R; Esfahanian, Elaheh; Hamaamin, Yaseen A; Zhang, Zhen

    2016-02-01

    Ecohydrological models are frequently used to assess the biological integrity of unsampled streams. These models vary in complexity and scale, and their utility depends on their final application. Tradeoffs are usually made in model scale, where large-scale models are useful for determining broad impacts of human activities on biological conditions, and regional-scale (e.g. watershed or ecoregion) models provide stakeholders greater detail at the individual stream reach level. Given these tradeoffs, the objective of this study was to develop large-scale stream health models with reach-level accuracy similar to regional-scale models, thereby allowing for impact assessments and improved decision-making capabilities. To accomplish this, four measures of biological integrity (Ephemeroptera, Plecoptera, and Trichoptera taxa (EPT), Family Index of Biotic Integrity (FIBI), Hilsenhoff Biotic Index (HBI), and fish Index of Biotic Integrity (IBI)) were modeled based on four thermal classes (cold, cold-transitional, cool, and warm) of streams that broadly dictate the distribution of aquatic biota in Michigan. The Soil and Water Assessment Tool (SWAT) was used to simulate streamflow and water quality in seven watersheds and the Hydrologic Index Tool was used to calculate 171 ecologically relevant flow regime variables. Unique variables were selected for each thermal class using a Bayesian variable selection method. The variables were then used in development of adaptive neuro-fuzzy inference systems (ANFIS) models of EPT, FIBI, HBI, and IBI. ANFIS model accuracy improved when accounting for stream thermal class rather than developing a global model. PMID:26595397
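
    The study's central comparison, a single global model versus per-thermal-class models, can be sketched as follows. A random forest stands in for ANFIS (which has no standard scikit-learn implementation), and the flow-regime predictors are synthetic placeholders:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(800, 10))              # mock flow-regime variables
    thermal = rng.integers(0, 4, 800)           # cold / cold-transitional / cool / warm
    y = X[:, 0] + 0.5 * thermal * X[:, 1] + rng.normal(0, 0.3, 800)

    global_r2 = cross_val_score(RandomForestRegressor(), X, y, cv=5).mean()
    class_r2 = np.mean([
        cross_val_score(RandomForestRegressor(), X[thermal == c], y[thermal == c], cv=5).mean()
        for c in range(4)
    ])
    print(f"global R^2 {global_r2:.2f} vs per-class R^2 {class_r2:.2f}")
    ```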

  3. Large-Scale Spacecraft Fire Safety Tests

    NASA Technical Reports Server (NTRS)

    Urban, David; Ruff, Gary A.; Ferkul, Paul V.; Olson, Sandra; Fernandez-Pello, A. Carlos; T'ien, James S.; Torero, Jose L.; Cowlard, Adam J.; Rouvreau, Sebastien; Minster, Olivier; Toth, Balazs; Legros, Guillaume; Eigenbrod, Christian; Smirnov, Nickolay; Fujita, Osamu; Jomaas, Grunde

    2014-01-01

    An international collaborative program is underway to address open issues in spacecraft fire safety. Because of limited access to long-term low-gravity conditions and the small volume generally allotted for these experiments, there have been relatively few experiments that directly study spacecraft fire safety under low-gravity conditions. Furthermore, none of these experiments have studied sample sizes and environment conditions typical of those expected in a spacecraft fire. The major constraint has been the size of the sample, with prior experiments limited to samples of the order of 10 cm in length and width or smaller. This lack of experimental data forces spacecraft designers to base their designs and safety precautions on 1-g understanding of flame spread, fire detection, and suppression. However, low-gravity combustion research has demonstrated substantial differences in flame behavior in low-gravity. This, combined with the differences caused by the confined spacecraft environment, necessitates practical scale spacecraft fire safety research to mitigate risks for future space missions. To address this issue, a large-scale spacecraft fire experiment is under development by NASA and an international team of investigators. This poster presents the objectives, status, and concept of this collaborative international project (Saffire). The project plan is to conduct fire safety experiments on three sequential flights of an unmanned ISS re-supply spacecraft (the Orbital Cygnus vehicle) after they have completed their delivery of cargo to the ISS and have begun their return journeys to Earth. On two flights (Saffire-1 and Saffire-3), the experiment will consist of a flame spread test involving a meter-scale sample ignited in the pressurized volume of the spacecraft and allowed to burn to completion while measurements are made. On one of the flights (Saffire-2), 9 smaller (5 x 30 cm) samples will be tested to evaluate NASA's material flammability screening tests

  4. Linking Large-Scale Reading Assessments: Comment

    ERIC Educational Resources Information Center

    Hanushek, Eric A.

    2016-01-01

    E. A. Hanushek points out in this commentary that applied researchers in education have only recently begun to appreciate the value of international assessments, even though there are now 50 years of experience with these. Until recently, these assessments have been stand-alone surveys that have not been linked, and analysis has largely focused on…

  5. Indonesian survey looks at adolescent reproductive health.

    PubMed

    Achmad, S I; Westley, S B

    1999-10-01

    The Baseline Survey of Young Adult Reproductive Welfare in Indonesia, conducted from September to December 1998, provides information about young Indonesians on topics concerning work, education, marriage, family life, sexuality, fertility, and HIV/AIDS and other sexually transmitted diseases. The survey interviewed 4106 men and 3978 women aged 15-24 years in three provinces of Java. Survey findings showed that 42% of the women and 8% of the men are currently or have been married. There was a strong inverse relationship between marriage and schooling, which suggests that greater educational attainment and a higher average age at marriage are likely to go together. Although most young couples prefer to delay and space births, only half of currently married young women are using any type of contraception. These results indicate that there is a need for better reproductive health care as well as improved reproductive health education. Moreover, the current economic crisis has led to a decline in the use of the private sector for health care. Instead, young people are using the less-expensive government services, and young women are turning to pharmacies and midwives rather than to private doctors to obtain contraceptives. These findings have several policy implications, including the need for reproductive health programs that provide services needed by young people. PMID:12295693

  6. Large-Scale Survey for Tickborne Bacteria, Khammouan Province, Laos.

    PubMed

    Taylor, Andrew J; Vongphayloth, Khamsing; Vongsouvath, Malavanh; Grandadam, Marc; Brey, Paul T; Newton, Paul N; Sutherland, Ian W; Dittrich, Sabine

    2016-09-01

    We screened 768 tick pools containing 6,962 ticks from Khammouan Province, Laos, by using quantitative real-time PCR and identified Rickettsia spp., Ehrlichia spp., and Borrelia spp. Sequencing of Rickettsia spp.-positive and Borrelia spp.-positive pools provided evidence for distinct genotypes. Our results identified bacteria with human disease potential in ticks in Laos. PMID:27532491
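
    For pooled screening of this kind, a per-tick prevalence can be back-calculated from the fraction of positive pools. A minimal sketch, assuming equal pool sizes and a perfect assay; the study's pools vary in size, and the positive-pool count below is invented:

    ```python
    pools, pool_size, positive_pools = 768, 9, 40   # 9 ~ 6,962 ticks / 768 pools

    q = positive_pools / pools                 # observed fraction of positive pools
    p = 1 - (1 - q) ** (1 / pool_size)         # MLE of individual-tick prevalence
    print(f"estimated per-tick prevalence: {100 * p:.2f}%")
    ```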

  8. Survey of mental health of foreign students.

    PubMed

    Sam, D L; Eide, R

    1991-01-01

    The multifaceted nature of the problems foreign students face has led some researchers to conclude that these students tend to suffer from poor health during their overseas sojourn. This assertion is examined among foreign students at the University of Bergen by means of a questionnaire survey. Loneliness, tiredness, sadness and worrying were reported as a frequent source of problems by nearly one in four of over 300 respondents. Students reported a decline in their general state of health as well as a rise in the occurrence of syndrome-like tendencies resembling paranoia, anxiety, depression and somatic complaints. These tendencies were attributed to certain psychosocial factors such as information received regarding study opportunities, social contacts with other tenants in the hall of residence and future job opportunities. Scandinavian students on the whole tended to have better mental health than students from the other countries. The implications of impaired health among foreign students are discussed. PMID:2047794

  9. Large-scale assembly of colloidal particles

    NASA Astrophysics Data System (ADS)

    Yang, Hongta

    This study reports a simple, roll-to-roll compatible coating technology for producing three-dimensional highly ordered colloidal crystal-polymer composites, colloidal crystals, and macroporous polymer membranes. A vertically beveled doctor blade is utilized to shear align silica microsphere-monomer suspensions to form large-area composites in a single step. The polymer matrix and the silica microspheres can be selectively removed to create colloidal crystals and self-standing macroporous polymer membranes. The thickness of the shear-aligned crystal is correlated with the viscosity of the colloidal suspension and the coating speed, and the correlations can be qualitatively explained by adapting the mechanisms developed for conventional doctor blade coating. Five important research topics related to the application of large-scale three-dimensional highly ordered macroporous films by doctor blade coating are covered in this study. The first topic describes the invention of large-area, low-cost color reflective displays. This invention is inspired by heat pipe technology. The self-standing macroporous polymer films exhibit brilliant colors which originate from the Bragg diffraction of visible light from the three-dimensional highly ordered air cavities. The colors can be easily changed by tuning the size of the air cavities to cover the whole visible spectrum. When the air cavities are filled with a solvent which has the same refractive index as that of the polymer, the macroporous polymer films become completely transparent due to the index matching. When the solvent trapped in the cavities is evaporated by in-situ heating, the sample color changes back to brilliant color. This process is highly reversible and reproducible for thousands of cycles. The second topic reports the achievement of rapid and reversible vapor detection by using 3-D macroporous photonic crystals. Capillary condensation of a condensable vapor in the interconnected macropores leads to the
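
    The structural color described above follows the Bragg-Snell relation, so the reflection peak can be estimated from the cavity size and the effective refractive index. A back-of-the-envelope sketch with assumed values (not from the thesis):

    ```python
    import numpy as np

    cavity_diameter = 280e-9          # air-cavity diameter in m (assumed)
    n_air, n_polymer = 1.0, 1.49      # PMMA-like polymer (assumed)
    f_air = 0.74                      # close-packed cavity volume fraction

    d_111 = 0.816 * cavity_diameter   # fcc (111) plane spacing
    n_eff = np.sqrt(f_air * n_air**2 + (1 - f_air) * n_polymer**2)
    peak = 2 * d_111 * n_eff          # Bragg-Snell peak at normal incidence
    print(f"peak reflection near {peak * 1e9:.0f} nm")
    ```

    Filling the cavities with a solvent of index 1.49 drives n_eff to the polymer value everywhere, which is exactly the index-matching transparency the abstract describes.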

  10. Population generation for large-scale simulation

    NASA Astrophysics Data System (ADS)

    Hannon, Andrew C.; King, Gary; Morrison, Clayton; Galstyan, Aram; Cohen, Paul

    2005-05-01

    Computer simulation is used to research phenomena ranging from the structure of the space-time continuum to population genetics and future combat [1-3]. Multi-agent simulations in particular are now commonplace in many fields [4, 5]. By modeling populations whose complex behavior emerges from individual interactions, these simulations help to answer questions about effects where closed form solutions are difficult to solve or impossible to derive [6]. To be useful, simulations must accurately model the relevant aspects of the underlying domain. In multi-agent simulation, this means that the modeling must include both the agents and their relationships. Typically, each agent can be modeled as a set of attributes drawn from various distributions (e.g., height, morale, intelligence and so forth). Though these can interact - for example, agent height is related to agent weight - they are usually independent. Modeling relations between agents, on the other hand, adds a new layer of complexity, and tools from graph theory and social network analysis are finding increasing application [7, 8]. Recognizing the role and proper use of these techniques, however, remains the subject of ongoing research. We recently encountered these complexities while building large scale social simulations [9-11]. One of these, the Hats Simulator, is designed to be a lightweight proxy for intelligence analysis problems. Hats models a "society in a box" consisting of many simple agents, called hats. Hats gets its name from the classic spaghetti western, in which the heroes and villains are known by the color of the hats they wear. The Hats society also has its heroes and villains, but the challenge is to identify which color hat they should be wearing based on how they behave. There are three types of hats: benign hats, known terrorists, and covert terrorists. Covert terrorists look just like benign hats but act like terrorists. Population structure can make covert hat identification significantly more
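
    A toy population generator in the spirit of the description above: attribute draws per agent plus a sparse random relationship graph. The proportions and attribute choices are invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 1000
    agents = {
        "type": rng.choice(["benign", "known", "covert"], size=n, p=[0.90, 0.02, 0.08]),
        "morale": np.clip(rng.normal(0.5, 0.15, n), 0, 1),
    }
    # Sparse Erdos-Renyi ties; richer generators (small-world, scale-free) slot in here
    ties = np.triu(rng.random((n, n)) < 0.01, k=1)
    print(dict(zip(*np.unique(agents["type"], return_counts=True))), int(ties.sum()), "ties")
    ```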

  11. A large-scale crop protection bioassay data set.

    PubMed

    Gaulton, Anna; Kale, Namrata; van Westen, Gerard J P; Bellis, Louisa J; Bento, A Patrícia; Davies, Mark; Hersey, Anne; Papadatos, George; Forster, Mark; Wege, Philip; Overington, John P

    2015-01-01

    ChEMBL is a large-scale drug discovery database containing bioactivity information primarily extracted from scientific literature. Due to the medicinal chemistry focus of the journals from which data are extracted, the data are currently of most direct value in the field of human health research. However, many of the scientific use-cases for the current data set are equally applicable in other fields, such as crop protection research: for example, identification of chemical scaffolds active against a particular target or endpoint, the de-convolution of the potential targets of a phenotypic assay, or the potential targets/pathways for safety liabilities. In order to broaden the applicability of the ChEMBL database and allow more widespread use in crop protection research, an extensive data set of bioactivity data of insecticidal, fungicidal and herbicidal compounds and assays was collated and added to the database. PMID:26175909
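
    A hedged sketch of pulling a slice of these data over the public ChEMBL REST interface; the endpoint form follows the documented /chembl/api/data pattern, but the exact filter fields and response keys are assumptions and may differ between releases:

    ```python
    import requests

    resp = requests.get(
        "https://www.ebi.ac.uk/chembl/api/data/assay.json",
        params={"limit": 5},
        timeout=30,
    )
    resp.raise_for_status()
    for assay in resp.json().get("assays", []):   # response key assumed
        print(assay.get("assay_chembl_id"), assay.get("assay_type"))
    ```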

  13. Measuring Large-Scale Social Networks with High Resolution

    PubMed Central

    Stopczynski, Arkadiusz; Sekara, Vedran; Sapiezynski, Piotr; Cuttone, Andrea; Madsen, Mette My; Larsen, Jakob Eg; Lehmann, Sune

    2014-01-01

    This paper describes the deployment of a large-scale study designed to measure human interactions across a variety of communication channels, with high temporal resolution and spanning multiple years—the Copenhagen Networks Study. Specifically, we collect data on face-to-face interactions, telecommunication, social networks, location, and background information (personality, demographics, health, politics) for a densely connected population of 1,000 individuals, using state-of-the-art smartphones as social sensors. Here we provide an overview of the related work and describe the motivation and research agenda driving the study. Additionally, the paper details the data-types measured, and the technical infrastructure in terms of both backend and phone software, as well as an outline of the deployment procedures. We document the participant privacy procedures and their underlying principles. The paper is concluded with early results from data analysis, illustrating the importance of a multi-channel, high-resolution approach to data collection. PMID:24770359
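
    Turning time-stamped proximity observations into a weighted contact graph is the basic reduction behind such a study. A minimal sketch; the event list is a made-up stand-in for the real Bluetooth scans:

    ```python
    import networkx as nx

    events = [("a", "b", 1), ("a", "b", 2), ("b", "c", 2), ("a", "c", 5)]  # (user, user, t)

    g = nx.Graph()
    for u, v, _t in events:
        w = g.get_edge_data(u, v, {"weight": 0})["weight"]
        g.add_edge(u, v, weight=w + 1)   # edge weight = number of co-location events

    print(sorted(g.edges(data="weight")))
    ```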

  14. Successful Physician Training Program for Large Scale EMR Implementation

    PubMed Central

    Stevens, L.A.; Mailes, E.S.; Goad, B.A.; Longhurst, C.A.

    2015-01-01

    Summary: End-user training is an essential element of electronic medical record (EMR) implementation and frequently suffers from minimal institutional investment. In addition, discussion of successful EMR training programs for physicians is limited in the literature. The authors describe a successful physician-training program at Stanford Children’s Health as part of a large scale EMR implementation. Evaluations of classroom training, obtained at the conclusion of each class, revealed high physician satisfaction with the program. Free-text comments from learners focused on duration and timing of training, the learning environment, quality of the instructors, and specificity of training to their role or department. Based upon participant feedback and institutional experience, best practice recommendations, including physician engagement, curricular design, and assessment of proficiency and recognition, are suggested for future provider EMR training programs. The authors strongly recommend the creation of coursework to group providers by common workflow. PMID:25848415

  15. Modeling temporal relationships in large scale clinical associations

    PubMed Central

    Hanauer, David A; Ramakrishnan, Naren

    2013-01-01

    Objective: We describe an approach for modeling temporal relationships in a large scale association analysis of electronic health record data. The addition of temporal information can inform hypothesis generation and help to explain the relationships. We applied this approach on a dataset containing 41.2 million time-stamped International Classification of Diseases, Ninth Revision (ICD-9) codes from 1.6 million patients. Methods: We performed two independent analyses including a pairwise association analysis using a χ2 test and a temporal analysis using a binomial test. Data were visualized using network diagrams and reviewed for clinical significance. Results: We found nearly 400,000 highly associated pairs of ICD-9 codes with varying numbers of strong temporal associations ranging from ≥1 day to ≥10 years apart. Most of the findings were not considered clinically novel, although some, such as an association between Helicobacter pylori infection and diabetes, have recently been reported in the literature. The temporal analysis in our large cohort, however, revealed that diabetes usually preceded the diagnoses of H pylori, raising questions about possible cause and effect. Discussion: Such analyses have significant limitations, some of which are due to known problems with ICD-9 codes and others to potentially incomplete data even at a health system level. Nevertheless, large scale association analyses with temporal modeling can help provide a mechanism for novel discovery in support of hypothesis generation. Conclusions: Temporal relationships can provide an additional layer of meaning in identifying and interpreting clinical associations. PMID:23019240
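
    The two tests described, a χ2 test for pairwise association and a binomial test for a preferred temporal order, can be sketched on invented counts as follows:

    ```python
    from scipy.stats import chi2_contingency, binomtest

    # 2x2 contingency: patients with/without code A versus with/without code B
    table = [[1200, 300],     # A present: B present / B absent
             [400, 8100]]     # A absent:  B present / B absent
    chi2, p_assoc, dof, _ = chi2_contingency(table)

    # Of patients with both codes, how often does A precede B? Under the null
    # of no preferred order, successes ~ Binomial(n, 0.5).
    p_temporal = binomtest(900, n=1200, p=0.5).pvalue
    print(f"association p = {p_assoc:.1e}, temporal-order p = {p_temporal:.1e}")
    ```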

  16. Modelling large-scale halo bias using the bispectrum

    NASA Astrophysics Data System (ADS)

    Pollack, Jennifer E.; Smith, Robert E.; Porciani, Cristiano

    2012-03-01

    We study the relation between the density distribution of tracers for large-scale structure and the underlying matter distribution - commonly termed bias - in the Λ cold dark matter framework. In particular, we examine the validity of the local model of biasing at quadratic order in the matter density. This model is characterized by parameters b1 and b2. Using an ensemble of N-body simulations, we apply several statistical methods to estimate the parameters. We measure halo and matter fluctuations smoothed on various scales. We find that, whilst the fits are reasonably good, the parameters vary with smoothing scale. We argue that, for real-space measurements, owing to the mixing of wavemodes, no smoothing scale can be found for which the parameters are independent of smoothing. However, this is not the case in Fourier space. We measure halo and halo-mass power spectra and from these construct estimates of the effective large-scale bias as a guide for b1. We measure the configuration dependence of the halo bispectra Bhhh and reduced bispectra Qhhh for very large-scale k-space triangles. From these data, we constrain b1 and b2, taking into account the full bispectrum covariance matrix. Using the lowest order perturbation theory, we find that for Bhhh the best-fitting parameters are in reasonable agreement with one another as the triangle scale is varied, although the fits become poor as smaller scales are included. The same is true for Qhhh. The best-fitting values were found to depend on the discreteness correction. This led us to consider halo-mass cross-bispectra. The results from these statistics supported our earlier findings. We then developed a test to explore whether the inconsistency in the recovered bias parameters could be attributed to missing higher order corrections in the models. We prove that low-order expansions are not sufficiently accurate to model the data, even on scales k1 ∼ 0.04 h Mpc⁻¹. If robust inferences concerning bias are to be drawn
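
    The quadratic local bias model being tested can be written and fitted directly on smoothed fields. A synthetic sketch (the real analysis fits spectra and bispectra rather than field values):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    delta = rng.normal(0.0, 0.3, 100_000)        # smoothed matter field (mock)
    b1_true, b2_true = 1.5, -0.4
    delta_h = (b1_true * delta
               + 0.5 * b2_true * (delta**2 - delta.var())
               + rng.normal(0.0, 0.05, delta.size))   # mock halo field

    # Least-squares estimate of (b1, b2) in delta_h = b1 d + (b2/2)(d^2 - <d^2>)
    A = np.column_stack([delta, 0.5 * (delta**2 - delta.var())])
    b1_hat, b2_hat = np.linalg.lstsq(A, delta_h, rcond=None)[0]
    print(f"b1 = {b1_hat:.3f}, b2 = {b2_hat:.3f}")
    ```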

  17. Precision Measurement of Large Scale Structure

    NASA Technical Reports Server (NTRS)

    Hamilton, A. J. S.

    2001-01-01

    The purpose of this grant was to develop and to start to apply new precision methods for measuring the power spectrum and redshift distortions from the anticipated new generation of large redshift surveys. A highlight of work completed during the award period was the application of the new methods developed by the PI to measure the real space power spectrum and redshift distortions of the IRAS PSCz survey, published in January 2000. New features of the measurement include: (1) measurement of power over an unprecedentedly broad range of scales, 4.5 decades in wavenumber, from 0.01 to 300 h/Mpc; (2) at linear scales, not one but three power spectra are measured, the galaxy-galaxy, galaxy-velocity, and velocity-velocity power spectra; (3) at linear scales each of the three power spectra is decorrelated within itself, and disentangled from the other two power spectra (the situation is analogous to disentangling scalar and tensor modes in the Cosmic Microwave Background); and (4) at nonlinear scales the measurement extracts not only the real space power spectrum, but also the full line-of-sight pairwise velocity distribution in redshift space.
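
    The basic building block of such measurements, a periodic-box power spectrum estimator, fits in a few lines (without the survey geometry, decorrelation, and redshift-space machinery the grant developed). Grid size and box length are arbitrary here:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n, box = 64, 500.0                         # grid cells per side, Mpc/h
    delta = rng.normal(size=(n, n, n))         # stand-in density-contrast field

    dk = np.fft.rfftn(delta) * (box / n) ** 3  # approximate continuum Fourier transform
    kf = 2 * np.pi / box                       # fundamental mode
    kx = np.fft.fftfreq(n, d=1.0 / n) * kf
    kz = np.fft.rfftfreq(n, d=1.0 / n) * kf
    kmag = np.sqrt(kx[:, None, None]**2 + kx[None, :, None]**2 + kz[None, None, :]**2)

    edges = np.arange(kf, kf * n / 2, kf)      # spherical shells in k-space
    pk = [(np.abs(dk[(kmag >= lo) & (kmag < hi)])**2).mean() / box**3
          for lo, hi in zip(edges[:-1], edges[1:])]
    print(np.round(pk[:5], 1))
    ```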

  18. Parallel block schemes for large scale least squares computations

    SciTech Connect

    Golub, G.H.; Plemmons, R.J.; Sameh, A.

    1986-04-01

    Large scale least squares computations arise in a variety of scientific and engineering problems, including geodetic adjustments and surveys, medical image analysis, molecular structures, partial differential equations and substructuring methods in structural engineering. In each of these problems, matrices often arise which possess a block structure which reflects the local connection nature of the underlying physical problem. For example, such super-large nonlinear least squares computations arise in geodesy. Here the coordinates of positions are calculated by iteratively solving overdetermined systems of nonlinear equations by the Gauss-Newton method. The US National Geodetic Survey will complete this year (1986) the readjustment of the North American Datum, a problem which involves over 540 thousand unknowns and over 6.5 million observations (equations). The observation matrix for these least squares computations has a block angular form with 161 diagonal blocks, each containing 3 to 4 thousand unknowns. In this paper parallel schemes are suggested for the orthogonal factorization of matrices in block angular form and for the associated backsubstitution phase of the least squares computations. In addition, a parallel scheme for the calculation of certain elements of the covariance matrix for such problems is described. It is shown that these algorithms are ideally suited for multiprocessors with three levels of parallelism such as the Cedar system at the University of Illinois. 20 refs., 7 figs.
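
    The block-angular scheme lends itself to a compact sketch: each diagonal block is reduced independently (the step that parallelizes across processors), and a small dense system is then solved for the coupling unknowns. Sizes below are toy values:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    nblk, m, nloc, ncpl = 4, 60, 10, 3
    blocks = [(rng.normal(size=(m, nloc)),     # A_i: local columns
               rng.normal(size=(m, ncpl)),     # B_i: coupling columns
               rng.normal(size=m))             # b_i: observations
              for _ in range(nblk)]

    rows, rhs, factors = [], [], []
    for A_i, B_i, b_i in blocks:               # independent -> parallelizable
        Q, R = np.linalg.qr(A_i, mode="complete")
        factors.append((Q, R[:nloc]))
        rows.append(Q.T[nloc:] @ B_i)          # local unknowns eliminated
        rhs.append(Q.T[nloc:] @ b_i)

    # Reduced least-squares problem for the coupling unknowns, then back-substitution
    y = np.linalg.lstsq(np.vstack(rows), np.concatenate(rhs), rcond=None)[0]
    x_loc = [np.linalg.solve(R1, Q.T[:nloc] @ (b_i - B_i @ y))
             for (Q, R1), (A_i, B_i, b_i) in zip(factors, blocks)]
    print("coupling unknowns:", np.round(y, 3))
    ```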

  19. Probes of large-scale structure in the universe

    NASA Technical Reports Server (NTRS)

    Suto, Yasushi; Gorski, Krzysztof; Juszkiewicz, Roman; Silk, Joseph

    1988-01-01

    A general formalism is developed which shows that the gravitational instability theory for the origin of the large-scale structure of the universe is now capable of critically confronting observational results on cosmic background radiation angular anisotropies, large-scale bulk motions, and large-scale clumpiness in the galaxy counts. The results indicate that presently advocated cosmological models will have considerable difficulty in simultaneously explaining the observational results.

  20. Illinois department of public health H1N1/A pandemic communications evaluation survey.

    SciTech Connect

    Walsh, D.; Decision and Information Sciences

    2010-09-16

    Because of heightened media coverage, a 24-hour news cycle and the potential miscommunication of health messages across all levels of government during the onset of the H1N1 influenza outbreak in spring 2009, the Illinois Department of Public Health (IDPH) decided to evaluate its H1N1 influenza A communications system. IDPH wanted to confirm its disease information and instructions were helping stakeholders prepare for and respond to a novel influenza outbreak. In addition, the time commitment involved in preparing, issuing, monitoring, updating, and responding to H1N1 federal guidelines/updates and media stories became a heavy burden for IDPH staff. The process and results of the H1N1 messaging survey represent a best practice that other health departments and emergency management agencies can replicate to improve coordination efforts with stakeholder groups during both emergency preparedness and response phases. Importantly, the H1N1 survey confirmed IDPH's messages were influencing stakeholders' decisions to activate their pandemic plans and initiate response operations. While there was some dissatisfaction with IDPH's delivery of information and communication tools, such as the fax system, this report should demonstrate to IDPH that its core partners believe it has the ability and expertise to issue timely and accurate instructions that can help them respond to a large-scale disease outbreak in Illinois. The conclusion will focus on three main areas: (1) the survey development process, (2) survey results: best practices and areas for improvement, and (3) recommendations: next steps.

  1. Small Business and Health Care. Results of a Survey.

    ERIC Educational Resources Information Center

    Hall, Charles P., Jr.; Kuder, John M.

    A 1989 mail survey collected data regarding health insurance from 18,614 small business owners who were employer members of the National Federation of Independent Business. In all, 5,368 usable surveys were returned for a 29 percent response rate. Data were obtained on opinions about health care, health care markets, and general health policy;…

  2. Large-scale mouse knockouts and phenotypes.

    PubMed

    Ramírez-Solis, Ramiro; Ryder, Edward; Houghton, Richard; White, Jacqueline K; Bottomley, Joanna

    2012-01-01

    Standardized phenotypic analysis of mutant forms of every gene in the mouse genome will provide fundamental insights into mammalian gene function and advance human and animal health. The availability of the human and mouse genome sequences, the development of embryonic stem cell mutagenesis technology, the standardization of phenotypic analysis pipelines, and the paradigm-shifting industrialization of these processes have made this a realistic and achievable goal. The size of this enterprise will require global coordination to ensure economies of scale in both the generation and primary phenotypic analysis of the mutant strains, and to minimize unnecessary duplication of effort. To provide more depth to the functional annotation of the genome, effective mechanisms will also need to be developed to disseminate the information and resources produced to the wider community. Better models of disease, potential new drug targets with novel mechanisms of action, and completely unsuspected genotype-phenotype relationships covering broad aspects of biology will become apparent. To reach these goals, solutions to challenges in mouse production and distribution, as well as development of novel, ever more powerful phenotypic analysis modalities will be necessary. It is a challenging and exciting time to work in mouse genetics.

  3. Health sciences library building projects: 1994 survey.

    PubMed Central

    Ludwig, L

    1995-01-01

    Designing and building new or renovated space is time consuming and requires politically sensitive discussions concerning a number of both long-term and immediate planning issues. The Medical Library Association's fourth annual survey of library building projects identified ten health sciences libraries that are planning, expanding, or constructing new facilities. Two projects are in predesign stages, four represent new construction, and four involve renovations to existing libraries. The Texas Medical Association Library, the King Faisal Specialist Hospital and Research Centre Library, and the Northwestern University Galter Health Sciences Library illustrate how these libraries are being designed for the future and take into account areas of change produced by new information technologies, curricular trends, and new ways to deliver library services. PMID:7599586

  4. On the Estimation of Hierarchical Latent Regression Models for Large-Scale Assessments

    ERIC Educational Resources Information Center

    Li, Deping; Oranje, Andreas; Jiang, Yanlin

    2009-01-01

    To find population proficiency distributions, a two-level hierarchical linear model may be applied to large-scale survey assessments such as the National Assessment of Educational Progress (NAEP). The model and parameter estimation are developed and a simulation was carried out to evaluate parameter recovery. Subsequently, both a hierarchical and…
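
    A two-level sketch of the idea (students within schools) using a linear mixed model as a stand-in; NAEP's latent regression works with plausible values for an unobserved proficiency, whereas the synthetic score here is observed directly:

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(4)
    school = np.repeat(np.arange(50), 30)             # 50 schools x 30 students
    school_effect = rng.normal(0, 5, 50)[school]      # level-2 random intercept
    ses = rng.normal(0, 1, school.size)               # level-1 covariate
    score = 250 + 8 * ses + school_effect + rng.normal(0, 20, school.size)
    df = pd.DataFrame({"score": score, "ses": ses, "school": school})

    fit = smf.mixedlm("score ~ ses", df, groups=df["school"]).fit()
    print(fit.params[["Intercept", "ses"]])
    ```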

  5. Understanding Participation in E-Learning in Organizations: A Large-Scale Empirical Study of Employees

    ERIC Educational Resources Information Center

    Garavan, Thomas N.; Carbery, Ronan; O'Malley, Grace; O'Donnell, David

    2010-01-01

    Much remains unknown in the increasingly important field of e-learning in organizations. Drawing on a large-scale survey of employees (N = 557) who had opportunities to participate in voluntary e-learning activities, the factors influencing participation in e-learning are explored in this empirical paper. It is hypothesized that key variables…

  6. Safeguards instruments for Large-Scale Reprocessing Plants

    SciTech Connect

    Hakkila, E.A.; Case, R.S.; Sonnier, C.

    1993-06-01

    Between 1987 and 1992 a multi-national forum known as LASCAR (Large Scale Reprocessing Plant Safeguards) met to assist the IAEA in development of effective and efficient safeguards for large-scale reprocessing plants. The US provided considerable input for safeguards approaches and instrumentation. This paper reviews and updates instrumentation of importance in measuring plutonium and uranium in these facilities.

  7. The Challenge of Large-Scale Literacy Improvement

    ERIC Educational Resources Information Center

    Levin, Ben

    2010-01-01

    This paper discusses the challenge of making large-scale improvements in literacy in schools across an entire education system. Despite growing interest and rhetoric, there are very few examples of sustained, large-scale change efforts around school-age literacy. The paper reviews 2 instances of such efforts, in England and Ontario. After…

  8. A large-scale study of epilepsy in Ecuador: methodological aspects.

    PubMed

    Placencia, M; Suarez, J; Crespo, F; Sander, J W; Shorvon, S D; Ellison, R H; Cascante, S M

    1992-01-01

    The methodology is presented of a large-scale study of epilepsy carried out in a highland area in northern Ecuador, South America, covering a population of 72,121 people. The study was carried out in two phases: the first, a cross-sectional phase, consisted of a house-to-house survey of all persons in this population, screening for epileptic seizures using a specially designed questionnaire. Possible cases identified in screening were assessed in a cascade diagnostic procedure applied by general doctors and neurologists. Its objectives were: to establish a comprehensive epidemiological profile of epileptic seizures; to describe the clinical phenomenology of this condition in the community; to validate methods for diagnosis and classification of epileptic seizures by a non-specialised team; and to ascertain the community's knowledge, attitudes and practices regarding epilepsy. A sample was selected in this phase in order to study the social aspects of epilepsy in this community. The second phase, which was longitudinal, assessed the effectiveness of non-specialist care in the treatment of epilepsy. It consisted of a prospective clinical trial of antiepileptic therapy in untreated patients using two standard anti-epileptic drugs. Patients were followed for 12 months by a multidisciplinary team consisting of a primary health worker, rural doctor, neurologist, anthropologist, and psychologist. Standardised, reproducible instruments and methods were used. This study was carried out through co-operation between the medical profession, political agencies and the pharmaceutical industry, at an international level. We consider this a model for further large-scale studies of this type.

  9. Approximate registration of point clouds with large scale differences

    NASA Astrophysics Data System (ADS)

    Novak, D.; Schindler, K.

    2013-10-01

    3D reconstruction of objects is a basic task in many fields, including surveying, engineering, entertainment and cultural heritage. The task is nowadays often accomplished with a laser scanner, which produces dense point clouds, but lacks accurate colour information, and lacks per-point accuracy measures. An obvious solution is to combine laser scanning with photogrammetric recording. In that context, the problem arises to register the two datasets, which feature large scale, translation and rotation differences. The absence of approximate registration parameters (3D translation, 3D rotation and scale) precludes the use of fine-registration methods such as ICP. Here, we present a method to register realistic photogrammetric and laser point clouds in a fully automated fashion. The proposed method decomposes the registration into a sequence of simpler steps: first, two rotation angles are determined by finding dominant surface normal directions, then the remaining parameters are found with RANSAC followed by ICP and scale refinement. These two steps are carried out at low resolution, before computing a precise final registration at higher resolution.
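
    Once correspondences are available, the final scale + rotation + translation solve has a closed form (Umeyama 1991); the normal-direction and RANSAC steps described above exist precisely to supply those correspondences. A self-contained sketch:

    ```python
    import numpy as np

    def similarity_transform(src, dst):
        """Least-squares similarity transform mapping src onto dst."""
        mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
        cov = (dst - mu_d).T @ (src - mu_s) / len(src)
        U, D, Vt = np.linalg.svd(cov)
        S = np.eye(3)
        if np.linalg.det(U) * np.linalg.det(Vt) < 0:
            S[2, 2] = -1.0                    # guard against reflections
        R = U @ S @ Vt
        var_src = ((src - mu_s) ** 2).sum() / len(src)
        scale = np.trace(np.diag(D) @ S) / var_src
        return scale, R, mu_d - scale * R @ mu_s

    # Demo: recover a 40x scale difference between two mock clouds
    rng = np.random.default_rng(0)
    src = rng.normal(size=(200, 3))
    R_true, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    if np.linalg.det(R_true) < 0:
        R_true[:, 0] *= -1                    # keep a proper rotation
    dst = 40.0 * src @ R_true.T + np.array([5.0, -2.0, 1.0])
    s, R, t = similarity_transform(src, dst)
    print(f"recovered scale: {s:.1f}")
    ```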

  10. Implicit solvers for large-scale nonlinear problems

    SciTech Connect

    Keyes, D E; Reynolds, D; Woodward, C S

    2006-07-13

    Computational scientists are grappling with increasingly complex, multi-rate applications that couple such physical phenomena as fluid dynamics, electromagnetics, radiation transport, chemical and nuclear reactions, and wave and material propagation in inhomogeneous media. Parallel computers with large storage capacities are paving the way for high-resolution simulations of coupled problems; however, hardware improvements alone will not prove enough to enable simulations based on brute-force algorithmic approaches. To accurately capture nonlinear couplings between dynamically relevant phenomena, often while stepping over rapid adjustments to quasi-equilibria, simulation scientists are increasingly turning to implicit formulations that require a discrete nonlinear system to be solved for each time step or steady state solution. Recent advances in iterative methods have made fully implicit formulations a viable option for solution of these large-scale problems. In this paper, we overview one of the most effective iterative methods, Newton-Krylov, for nonlinear systems and point to software packages with its implementation. We illustrate the method with an example from magnetically confined plasma fusion and briefly survey other areas in which implicit methods have bestowed important advantages, such as allowing high-order temporal integration and providing a pathway to sensitivity analyses and optimization. Lastly, we overview algorithm extensions under development motivated by current SciDAC applications.
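
    SciPy ships a matrix-free implementation of this idea; a small illustration on a nonlinear Poisson-like system -u'' + u^3 = 1 with zero boundaries, where the Jacobian is never formed explicitly, only applied inside the Krylov solver:

    ```python
    import numpy as np
    from scipy.optimize import newton_krylov

    n = 100
    h = 1.0 / (n + 1)

    def residual(u):
        d2u = np.empty_like(u)
        d2u[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / h**2
        d2u[0] = (u[1] - 2 * u[0]) / h**2        # zero Dirichlet boundaries
        d2u[-1] = (u[-2] - 2 * u[-1]) / h**2
        return -d2u + u**3 - 1.0

    u = newton_krylov(residual, np.zeros(n), f_tol=1e-8)
    print(f"max u = {u.max():.4f}")
    ```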

  11. Weak lensing of large scale structure in the presence of screening

    SciTech Connect

    Tessore, Nicolas; Metcalf, R. Benton; Giocoli, Carlo

    2015-10-01

    A number of alternatives to general relativity exhibit gravitational screening in the non-linear regime of structure formation. We describe a set of algorithms that can produce weak lensing maps of large scale structure in such theories and can be used to generate mock surveys for cosmological analysis. By analysing a few basic statistics we indicate how these alternatives can be distinguished from general relativity with future weak lensing surveys.

  12. Reconstructing Information in Large-Scale Structure via Logarithmic Mapping

    NASA Astrophysics Data System (ADS)

    Szapudi, Istvan

    We propose to develop a new method to extract information from large-scale structure data combining two-point statistics and non-linear transformations; before, this information was available only with substantially more complex higher-order statistical methods. Initially, most of the cosmological information in large-scale structure lies in two-point statistics. With non-linear evolution, some of that useful information leaks into higher-order statistics. The PI and group have shown in a series of theoretical investigations how that leakage occurs, and explained the Fisher information plateau at smaller scales. This plateau means that even as more modes are added to the measurement of the power spectrum, the total cumulative information (loosely speaking the inverse errorbar) is not increasing. Recently we have shown in Neyrinck et al. (2009, 2010) that a logarithmic (and a related Gaussianization or Box-Cox) transformation on the non-linear Dark Matter or galaxy field reconstructs a surprisingly large fraction of this missing Fisher information of the initial conditions. This was predicted by the earlier wave mechanical formulation of gravitational dynamics by Szapudi & Kaiser (2003). The present proposal is focused on working out the theoretical underpinning of the method to a point that it can be used in practice to analyze data. In particular, one needs to deal with the usual real-life issues of galaxy surveys, such as complex geometry, discrete sampling (Poisson or sub-Poisson noise), bias (linear, or non-linear, deterministic, or stochastic), redshift distortions, projection effects for 2D samples, and the effects of photometric redshift errors. We will develop methods for weak lensing and Sunyaev-Zeldovich power spectra as well, the latter specifically targeting Planck. In addition, we plan to investigate the question of residual higher-order information after the non-linear mapping, and possible applications for cosmology. Our aim will be to work out
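
    The core of the proposal is easy to demonstrate: a lognormal stand-in for the evolved density field becomes exactly Gaussian under log(1 + δ). Skewness serves as a crude diagnostic here; the actual analyses compare Fisher information in power spectra:

    ```python
    import numpy as np
    from scipy.stats import skew

    rng = np.random.default_rng(6)
    g = rng.normal(0.0, 0.8, 1_000_000)
    delta = np.exp(g - 0.5 * 0.8**2) - 1.0     # lognormal density contrast, <delta> = 0
    mapped = np.log(1.0 + delta)               # the logarithmic mapping

    print(f"skewness: raw {skew(delta):.2f} -> log-mapped {skew(mapped):.2f}")
    ```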

  13. Testing gravity using large-scale redshift-space distortions

    NASA Astrophysics Data System (ADS)

    Raccanelli, Alvise; Bertacca, Daniele; Pietrobon, Davide; Schmidt, Fabian; Samushia, Lado; Bartolo, Nicola; Doré, Olivier; Matarrese, Sabino; Percival, Will J.

    2013-11-01

    We use luminous red galaxies from the Sloan Digital Sky Survey (SDSS) II to test the cosmological structure growth in two alternatives to the standard Λ cold dark matter (ΛCDM)+general relativity (GR) cosmological model. We compare observed three-dimensional clustering in SDSS Data Release 7 (DR7) with theoretical predictions for the standard vanilla ΛCDM+GR model, unified dark matter (UDM) cosmologies and the normal branch Dvali-Gabadadze-Porrati (nDGP). In computing the expected correlations in UDM cosmologies, we derive a parametrized formula for the growth factor in these models. For our analysis we apply the methodology tested in Raccanelli et al. and use the measurements of Samushia et al. that account for survey geometry, non-linear and wide-angle effects and the distribution of pair orientation. We show that the estimate of the growth rate is potentially degenerate with wide-angle effects, meaning that extremely accurate measurements of the growth rate on large scales will need to take such effects into account. We use measurements of the zeroth and second-order moments of the correlation function from SDSS DR7 data and the Large Suite of Dark Matter Simulations (LasDamas), and perform a likelihood analysis to constrain the parameters of the models. Using information on the clustering up to r_max = 120 h⁻¹ Mpc, and after marginalizing over the bias, we find, for UDM models, a speed of sound c∞ ≤ 6.1 × 10⁻⁴, and, for the nDGP model, a cross-over scale rc ≥ 340 Mpc, at the 95 per cent confidence level.
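
    At linear order, this kind of growth measurement rests on the Kaiser multipoles: the quadrupole-to-monopole ratio depends only on β = f/b, so it can be inverted for the growth rate once the bias is known. A worked sketch with illustrative β values:

    ```python
    def kaiser_p2_over_p0(beta):
        p0 = 1.0 + 2.0 * beta / 3.0 + beta**2 / 5.0   # monopole prefactor
        p2 = 4.0 * beta / 3.0 + 4.0 * beta**2 / 7.0   # quadrupole prefactor
        return p2 / p0

    for beta in (0.3, 0.5):
        print(f"beta = {beta}: P2/P0 = {kaiser_p2_over_p0(beta):.3f}")
    ```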

  14. Distribution probability of large-scale landslides in central Nepal

    NASA Astrophysics Data System (ADS)

    Timilsina, Manita; Bhandary, Netra P.; Dahal, Ranjan Kumar; Yatabe, Ryuichi

    2014-12-01

    Large-scale landslides in the Himalaya are defined as huge, deep-seated landslide masses that occurred in the geological past. They are widely distributed in the Nepal Himalaya. The steep topography and high local relief provide high potential for such failures, whereas the dynamic geology and adverse climatic conditions play a key role in the occurrence and reactivation of such landslides. The major geoscientific problems related to such large-scale landslides are 1) difficulties in their identification and delineation, 2) sources of small-scale failures, and 3) reactivation. Only a few scientific publications have been published concerning large-scale landslides in Nepal. In this context, the identification and quantification of large-scale landslides and their potential distribution are crucial. Therefore, this study explores the distribution of large-scale landslides in the Lesser Himalaya. It provides simple guidelines to identify large-scale landslides based on their typical characteristics and using a 3D schematic diagram. Based on the spatial distribution of landslides, geomorphological/geological parameters and logistic regression, an equation of large-scale landslide distribution is also derived. The equation is validated by applying it to another area. For the new area, the area under the receiver operating characteristic curve of the landslide distribution probability is 0.699, and the distribution probability could explain > 65% of existing landslides. Therefore, the regression equation can be applied to areas of the Lesser Himalaya of central Nepal with similar geological and geomorphological conditions.
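
    The validation step, logistic regression scored by the area under the ROC curve on a held-out region, can be sketched with synthetic predictors (the study reports an AUC of 0.699 for its validation area; the features and coefficients below are invented):

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(8)
    X = rng.normal(size=(2000, 4))            # e.g. slope, relief, curvature, lithology
    p = 1 / (1 + np.exp(-(0.8 * X[:, 0] - 0.5 * X[:, 1])))
    y = rng.random(2000) < p                  # landslide presence/absence

    model = LogisticRegression().fit(X[:1500], y[:1500])
    auc = roc_auc_score(y[1500:], model.predict_proba(X[1500:])[:, 1])
    print(f"held-out AUC = {auc:.3f}")
    ```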

  15. Non-Gaussianity and large-scale structure in a two-field inflationary model

    SciTech Connect

    Tseliakhovich, Dmitriy; Hirata, Christopher

    2010-08-15

    Single-field inflationary models predict nearly Gaussian initial conditions, and hence a detection of non-Gaussianity would be a signature of the more complex inflationary scenarios. In this paper we study the effect on the cosmic microwave background and on large-scale structure from primordial non-Gaussianity in a two-field inflationary model in which both the inflaton and curvaton contribute to the density perturbations. We show that in addition to the previously described enhancement of the galaxy bias on large scales, this setup results in large-scale stochasticity. We provide joint constraints on the local non-Gaussianity parameter f̃_NL and the ratio ξ of the amplitude of primordial perturbations due to the inflaton and curvaton using WMAP and Sloan Digital Sky Survey data.

  16. A Review of International Large-Scale Assessments in Education: Assessing Component Skills and Collecting Contextual Data. PISA for Development

    ERIC Educational Resources Information Center

    Cresswell, John; Schwantner, Ursula; Waters, Charlotte

    2015-01-01

    This report reviews the major international and regional large-scale educational assessments, including international surveys, school-based surveys and household-based surveys. The report compares and contrasts the cognitive and contextual data collection instruments and implementation methods used by the different assessments in order to identify…

  17. School-Based Health Care State Policy Survey. Executive Summary

    ERIC Educational Resources Information Center

    National Assembly on School-Based Health Care, 2012

    2012-01-01

    The National Assembly on School-Based Health Care (NASBHC) surveys state public health and Medicaid offices every three years to assess state-level public policies and activities that promote the growth and sustainability of school-based health services. The FY2011 survey found 18 states (see map below) reporting investments explicitly dedicated…

  18. Quasars as a Tracer of Large-scale Structures in the Distant Universe

    NASA Astrophysics Data System (ADS)

    Song, Hyunmi; Park, Changbom; Lietzen, Heidi; Einasto, Maret

    2016-08-01

    We study the dependence of the number density and properties of quasars on the background galaxy density using the currently largest spectroscopic data sets of quasars and galaxies. We construct a galaxy number density field smoothed over a variable smoothing scale of between approximately 10 and 20 h⁻¹ Mpc over the redshift range 0.46 < z < 0.59 using the Sloan Digital Sky Survey (SDSS) Data Release 12 (DR12) constant-mass (CMASS) galaxies. The quasar sample is prepared from the SDSS-I/II DR7. We examine the correlation of incidence of quasars with the large-scale background density and the dependence of quasar properties such as bolometric luminosity, black hole mass, and Eddington ratio on the large-scale density. We find a monotonic correlation between the quasar number density and large-scale galaxy number density, which is fitted well with a power-law relation, n_Q ∝ ρ_G^0.618. We detect weak dependences of quasar properties on the large-scale density such as a positive correlation between black hole mass and density, and a negative correlation between luminosity and density. We discuss the possibility of using quasars as a tracer of large-scale structures at high redshifts, which may be useful for studies of the growth of structures in the high-redshift universe.
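
    Fitting a relation like n_Q ∝ ρ_G^0.618 amounts to a straight-line fit in log-log space. A sketch on mock densities (the exponent is planted, then recovered):

    ```python
    import numpy as np

    rng = np.random.default_rng(9)
    rho_g = rng.uniform(0.2, 5.0, 500)                      # background galaxy density
    n_q = rho_g**0.618 * np.exp(rng.normal(0, 0.2, 500))    # mock quasar density

    slope, intercept = np.polyfit(np.log(rho_g), np.log(n_q), 1)
    print(f"fitted exponent: {slope:.3f}")
    ```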

  19. Ultra-large-scale Cosmology in Next-generation Experiments with Single Tracers

    NASA Astrophysics Data System (ADS)

    Alonso, David; Bull, Philip; Ferreira, Pedro G.; Maartens, Roy; Santos, Mário G.

    2015-12-01

    Future surveys of large-scale structure will be able to measure perturbations on the scale of the cosmological horizon, and so could potentially probe a number of novel relativistic effects that are negligibly small on sub-horizon scales. These effects leave distinctive signatures in the power spectra of clustering observables and, if measurable, would open a new window on relativistic cosmology. We quantify the size and detectability of the effects for the most relevant future large-scale structure experiments: spectroscopic and photometric galaxy redshift surveys, intensity mapping surveys of neutral hydrogen, and radio continuum surveys. Our forecasts show that next-generation experiments, reaching out to redshifts z ≃ 4, will not be able to detect previously undetected general-relativistic effects by using individual tracers of the density field, although the contribution of weak lensing magnification on large scales should be clearly detectable. We also perform a rigorous joint forecast for the detection of primordial non-Gaussianity through the excess power it produces in the clustering of biased tracers on large scales, finding that uncertainties of σ(f_NL) ∼ 1-2 should be achievable. We study the level of degeneracy of these large-scale effects with several tracer-dependent nuisance parameters, quantifying the minimal priors on the latter that are needed for an optimal measurement of the former. Finally, we discuss the systematic effects that must be mitigated to achieve this level of sensitivity, and some alternative approaches that should help to improve the constraints. The computational tools developed to carry out this study, which requires the full-sky computation of the theoretical angular power spectra for O(100) redshift bins, as well as realistic models of the luminosity function, are publicly available at http://intensitymapping.physics.ox.ac.uk/codes.html.
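
    Forecasts like σ(f_NL) ∼ 1-2 come from Fisher matrices of the form F_ij = Σ_k (∂C/∂θ_i)(∂C/∂θ_j)/Var(C). A deliberately schematic sketch; the observable model, the error bars, and the 0.01/k² scale-dependent-bias coefficient are all invented:

    ```python
    import numpy as np

    k = np.logspace(-3, -1, 40)

    def model(fnl, b):
        return (b + 0.01 * fnl / k**2) ** 2      # schematic scale-dependent bias

    theta0, eps = np.array([0.0, 2.0]), 1e-4     # fiducial (f_NL, bias)
    var = (0.05 * model(*theta0)) ** 2           # assumed 5% errors per bin

    derivs = []
    for i in range(2):                           # central finite differences
        d = np.zeros(2); d[i] = eps
        derivs.append((model(*(theta0 + d)) - model(*(theta0 - d))) / (2 * eps))

    F = np.array([[np.sum(derivs[i] * derivs[j] / var) for j in range(2)]
                  for i in range(2)])
    print(f"marginalised sigma(f_NL) ~ {np.sqrt(np.linalg.inv(F)[0, 0]):.1e}")
    ```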

  20. Bacteriological survey of sixty health foods.

    PubMed Central

    Andrews, W H; Wilson, C R; Poelma, P L; Romero, A; Mislivec, P B

    1979-01-01

    A bacteriological survey was performed on 1,960 food samples encompassing 60 types of health foods available in the Baltimore-Washington, D.C., metropolitan area. No consistent bacteriological distinction (aerobic plate counts, total coliform and fecal coliform most probable numbers) was observed between foods labeled as organic (raised on soil with compost or nonchemical fertilizer and without application of pesticides, fungicides, and herbicides) and their counterpart food types bearing no such label. Types and numbers of samples containing Salmonella were: sunflower seeds, 4; soy flour, 3; soy protein powder, 2; soy milk powder, 1; dried active yeast, 1; brewers' yeast, 1; rye flour, 1; brown rice, 1; and alfalfa seeds, 1. The occurrence of this pathogen in three types of soybean products should warrant further investigation of soybean derivatives as potentially significant sources of Salmonella. PMID:572198

  1. Virginia agricultural health and safety survey.

    PubMed

    Mariger, S C; Grisso, R D; Perumpral, J V; Sorenson, A W; Christensen, N K; Miller, R L

    2009-01-01

    This comprehensive study was conducted primarily to identify the common causes of agricultural injuries on active Virginia farms and to identify hazardous agricultural operations, exposure duration, and injuries associated with each hazardous operation. In addition, the influences of factors such as general health status of farmers, age, weight, and alcohol and tobacco use on injury were examined. This information will be used for the development of educational programs that will improve the safety of agricultural operations. The sample selected for the study included farms of 28 ha or more, operating on a full- or part-time basis. This stipulation was to ensure that all farms in the sample are active and that participants generated a major portion of their income from the farm. Of the 26,000 farms meeting this requirement, 1,650 were selected to participate in the study. A survey instrument was mailed to the farmers selected to collect the information needed for meeting the established objectives of the study. Approximately 19% of the surveys were returned. In terms of the percentage of injuries, livestock handling was the primary cause. This was followed by working in elevated locations, operating and repairing agricultural machinery, and heavy lifting. The activities carried out most frequently by the participants were: operating farm tractors, operating trucks/automobiles, using hand and power tools, and working with agricultural chemicals. The overall injury rate was 5.6 injuries per 100,000 h. The exposure to agricultural hazards appeared to have minimal or no effect on the health status of Virginia farmers. Farm workers in the 45 to 64 age group sustained the most injuries. Older, more experienced farmers reported fewer injuries because of limited exposure to hazards and work experience. The average age of Virginia farmers surveyed was 60. This is expected to rise because most respondents reported no plans to retire during the next five years. Based on the results

  2. Polymer Physics of the Large-Scale Structure of Chromatin.

    PubMed

    Bianco, Simona; Chiariello, Andrea Maria; Annunziatella, Carlo; Esposito, Andrea; Nicodemi, Mario

    2016-01-01

    We summarize the picture emerging from recently proposed models of polymer physics describing the general features of chromatin large scale spatial architecture, as revealed by microscopy and Hi-C experiments. PMID:27659986

  3. Large-scale anisotropy of the cosmic microwave background radiation

    NASA Technical Reports Server (NTRS)

    Silk, J.; Wilson, M. L.

    1981-01-01

    Inhomogeneities in the large-scale distribution of matter inevitably lead to the generation of large-scale anisotropy in the cosmic background radiation. The dipole, quadrupole, and higher order fluctuations expected in an Einstein-de Sitter cosmological model have been computed. The dipole and quadrupole anisotropies are comparable to the measured values, and impose important constraints on the allowable spectrum of large-scale matter density fluctuations. A significant dipole anisotropy is generated by the matter distribution on scales greater than approximately 100 Mpc. The large-scale anisotropy is insensitive to the ionization history of the universe since decoupling, and cannot easily be reconciled with a galaxy formation theory that is based on primordial adiabatic density fluctuations.

  4. Needs, opportunities, and options for large scale systems research

    SciTech Connect

    Thompson, G.L.

    1984-10-01

    The Office of Energy Research was recently asked to perform a study of Large Scale Systems in order to facilitate the development of a true large systems theory. It was decided to ask experts in the fields of electrical engineering, chemical engineering and manufacturing/operations research for their ideas concerning large scale systems research. The author was asked to distribute a questionnaire among these experts to find out their opinions concerning recent accomplishments and future research directions in large scale systems research. He was also requested to convene a conference which included three experts in each area as panel members to discuss the general area of large scale systems research. The conference was held on March 26-27, 1984, in Pittsburgh with nine panel members and 15 other attendees. The present report is a summary of the ideas presented and the recommendations proposed by the attendees.

  5. Large scale anomalies in the microwave background: causation and correlation.

    PubMed

    Aslanyan, Grigor; Easther, Richard

    2013-12-27

    Most treatments of large scale anomalies in the microwave sky are a posteriori, with unquantified look-elsewhere effects. We contrast these with physical models of specific inhomogeneities in the early Universe which can generate these apparent anomalies. Physical models predict correlations between candidate anomalies and the corresponding signals in polarization and large scale structure, reducing the impact of cosmic variance. We compute the apparent spatial curvature associated with large-scale inhomogeneities and show that it is typically small, allowing for a self-consistent analysis. As an illustrative example we show that a single large plane wave inhomogeneity can contribute to low-l mode alignment and odd-even asymmetry in the power spectra and the best-fit model accounts for a significant part of the claimed odd-even asymmetry. We argue that this approach can be generalized to provide a more quantitative assessment of potential large scale anomalies in the Universe.

  6. A review of national health surveys in India.

    PubMed

    Dandona, Rakhi; Pandey, Anamika; Dandona, Lalit

    2016-04-01

    Several rounds of national health surveys have generated a vast amount of data in India since 1992. We describe and compare the key health information gathered, assess the availability of health data in the public domain, and review publications resulting from the National Family Health Survey (NFHS), the District Level Household Survey (DLHS) and the Annual Health Survey (AHS). We highlight issues that need attention to improve the usefulness of the surveys in monitoring changing trends in India's disease burden: (i) inadequate coverage of noncommunicable diseases, injuries and some major communicable diseases; (ii) modest comparability between surveys on the key themes of child and maternal mortality and immunization to understand trends over time; (iii) short time intervals between the most recent survey rounds; and (iv) delays in making individual-level data available for analysis in the public domain. We identified 337 publications using NFHS data; in contrast, only 48 and three publications used data from the DLHS and AHS, respectively. As national surveys are resource-intensive, it would be prudent to maximize their benefits. We suggest that India plan for a single major national health survey at five-year intervals in consultation with key stakeholders. This could cover additional major causes of the disease burden and their risk factors, as well as causes of death and adult mortality rate estimation. If done in a standardized manner, such a survey would provide useable and timely data to inform health interventions and facilitate assessment of their impact on population health. PMID:27034522

  7. EFFECTS OF LARGE-SCALE POULTRY FARMS ON AQUATIC MICROBIAL COMMUNITIES: A MOLECULAR INVESTIGATION.

    EPA Science Inventory

    The effects of large-scale poultry production operations on water quality and human health are largely unknown. Poultry litter is frequently applied as fertilizer to agricultural lands adjacent to large poultry farms. Run-off from the land introduces a variety of stressors into t...

  8. Large-scale studies of marked birds in North America

    USGS Publications Warehouse

    Tautin, J.; Metras, L.; Smith, G.

    1999-01-01

    The first large-scale, co-operative, studies of marked birds in North America were attempted in the 1950s. Operation Recovery, which linked numerous ringing stations along the east coast in a study of autumn migration of passerines, and the Preseason Duck Ringing Programme in prairie states and provinces, conclusively demonstrated the feasibility of large-scale projects. The subsequent development of powerful analytical models and computing capabilities expanded the quantitative potential for further large-scale projects. Monitoring Avian Productivity and Survivorship, and Adaptive Harvest Management are current examples of truly large-scale programmes. Their exemplary success and the availability of versatile analytical tools are driving changes in the North American bird ringing programme. Both the US and Canadian ringing offices are modifying operations to collect more and better data to facilitate large-scale studies and promote a more project-oriented ringing programme. New large-scale programmes such as the Cornell Nest Box Network are on the horizon.

  9. A study of MLFMA for large-scale scattering problems

    NASA Astrophysics Data System (ADS)

    Hastriter, Michael Larkin

    This research is centered on computational electromagnetics with a focus on solving large-scale problems accurately in a timely fashion using first-principles physics. Error control of the translation operator in 3-D is shown. A parallel implementation of the multilevel fast multipole algorithm (MLFMA) was studied with respect to parallel efficiency and scaling. The large-scale scattering program (LSSP), based on the ScaleME library, was used to solve ultra-large-scale problems including a 200λ sphere with 20 million unknowns. As these large-scale problems were solved, techniques were developed to accurately estimate the memory requirements. Careful memory management is needed in order to solve these massive problems. The study of MLFMA in large-scale problems revealed significant errors that stemmed from inconsistencies in constants used by different parts of the algorithm. These were fixed to produce the most accurate data possible for large-scale surface scattering problems. Data was calculated on a missile-like target using both high frequency methods and MLFMA. This data was compared and analyzed to determine possible strategies to increase data acquisition speed and accuracy through hybridization of multiple computation methods.
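
    The abstract emphasizes careful estimation of memory requirements. A rough sketch of that kind of estimate, assuming O(N log N) MLFMA storage with illustrative constants (neither the constant factor nor the bytes-per-word figure comes from the study):

      import math

      def mlfma_memory_gb(n_unknowns: int, bytes_per_word: int = 16,
                          c_tree: float = 4.0) -> float:
          """Rough MLFMA memory estimate in GB.

          Assumes storage of roughly c_tree * N * log2(N) complex doubles
          (near-field blocks plus aggregation data); both constants are
          illustrative placeholders, not values from the study.
          """
          words = c_tree * n_unknowns * math.log2(n_unknowns)
          return words * bytes_per_word / 1e9

      # The 20-million-unknown sphere mentioned above, order of magnitude only:
      print(f"~{mlfma_memory_gb(20_000_000):.0f} GB")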

  10. Large-scale motions in a plane wall jet

    NASA Astrophysics Data System (ADS)

    Gnanamanickam, Ebenezer; Latim, Jonathan; Bhatt, Shibani

    2015-11-01

    The dynamic significance of large-scale motions in turbulent boundary layers has been the focus of several recent studies, primarily concerned with canonical flows: zero-pressure-gradient boundary layers and flows within pipes and channels. This work presents an investigation into the large-scale motions in a boundary layer that is used as the prototypical flow field for flows with large-scale mixing and reactions, the plane wall jet. An experimental investigation is carried out in a plane wall jet facility designed to operate at friction Reynolds numbers Reτ > 1000, which allows for the development of a significant logarithmic region. The streamwise turbulent intensity across the boundary layer is decomposed into small-scale (less than one integral length-scale δ) and large-scale components. The small-scale energy has a peak in the near-wall region associated with the near-wall turbulent cycle, as in canonical boundary layers. However, the large-scale eddies dominate, carrying significantly higher energy than the small scales across almost the entire boundary layer, even at the low to moderate Reynolds numbers under consideration. The large scales also appear to amplitude- and frequency-modulate the smaller scales across the entire boundary layer.
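
    The small-scale/large-scale split described above can be made concrete with a spectral filter at the integral length scale δ. A minimal sketch on a synthetic signal (the sharp Fourier cutoff at wavelength δ and the synthetic data are assumptions; real data would come from, e.g., hot-wire measurements):

      import numpy as np

      # Split a streamwise velocity signal at the integral scale delta:
      # wavelengths longer than delta -> large scale, shorter -> small scale.
      rng = np.random.default_rng(0)
      n, L, delta = 4096, 10.0, 1.0                    # arbitrary units
      x = np.linspace(0.0, L, n, endpoint=False)
      u = np.sin(2*np.pi*x/5.0) + 0.3*rng.standard_normal(n)

      k = np.fft.rfftfreq(n, d=L/n)                    # cycles per unit length
      u_hat = np.fft.rfft(u - u.mean())
      cutoff = 1.0 / delta
      large = np.fft.irfft(np.where(k < cutoff, u_hat, 0.0), n)
      small = np.fft.irfft(np.where(k >= cutoff, u_hat, 0.0), n)

      print(f"large-scale variance: {large.var():.3f}")
      print(f"small-scale variance: {small.var():.3f}")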

  11. Intracluster light in the Virgo cluster: large scale distribution

    NASA Astrophysics Data System (ADS)

    Castro-Rodriguéz, N.; Arnaboldi, M.; Aguerri, J. A. L.; Gerhard, O.; Okamura, S.; Yasuda, N.; Freeman, K. C.

    2009-11-01

    Aims: The intracluster light (ICL) is a faint diffuse stellar component of clusters made of stars that are not bound to individual galaxies. We have carried out a large scale study of this component in the nearby Virgo cluster. Methods: The diffuse light is traced using planetary nebulae (PNe). The surveyed areas were observed with a narrow-band filter centered on the [OIII]λ 5007 Å emission line redshifted to the Virgo cluster distance (the on-band image), and a broad-band filter (the off-band image). For some fields, additional narrow-band imaging data corresponding to the Hα emission were also obtained. The PNe are detected in the on-band image due to their strong emission in the [OIII]λ 5007 Å line, but disappear in the off-band image. The contribution of Ly-α emitters at z=3.14 is corrected statistically using blank field surveys when the Hα image at the field position is not available. Results: We have surveyed a total area of 3.3 square degrees in the Virgo cluster with eleven fields located at different radial distances. Those fields located at radii smaller than 80 arcmin from the cluster center contain most of the detected diffuse light. In this central region of the cluster, the ICL has a surface brightness in the range μB = 28.8-30 mag arcsec⁻², it is not uniformly distributed, and it represents about 7% of the total galaxy light in this area. At distances larger than 80 arcmin the ICL is confined to single fields and individual sub-structures, e.g. in the sub-clump B, the M 60/M 59 group. For several fields at 2 and 3 degrees from the Virgo cluster center we set only upper limits. Conclusions: These results indicate that the ICL is not homogeneously distributed in the Virgo core, and it is concentrated in the high density regions of the Virgo cluster, e.g. the cluster core and other sub-structures. Outside these regions, the ICL is confined within areas of ~100 kpc in size, where tidal effects may be at work. These observational results link the
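
    The on-band/off-band technique described above amounts to keeping sources detected in the [OIII] image that have no continuum counterpart. A schematic catalog cross-match (positions, match radius, and catalog sizes are all hypothetical placeholders):

      import numpy as np

      # Keep on-band [OIII] detections with no off-band (continuum)
      # counterpart within a match radius; all numbers are hypothetical.
      rng = np.random.default_rng(3)
      on_band = rng.uniform(0, 1000, size=(50, 2))       # (x, y) positions
      off_band = np.vstack([on_band[:20] + rng.normal(0, 0.3, (20, 2)),
                            rng.uniform(0, 1000, (200, 2))])

      match_radius = 1.5
      d2 = ((on_band[:, None, :] - off_band[None, :, :])**2).sum(-1)
      has_continuum = (d2 < match_radius**2).any(axis=1)

      pn_candidates = on_band[~has_continuum]
      print(f"{len(pn_candidates)} PN candidates of {len(on_band)} on-band sources")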

  12. Inflationary tensor fossils in large-scale structure

    SciTech Connect

    Dimastrogiovanni, Emanuela; Fasiello, Matteo; Jeong, Donghui; Kamionkowski, Marc

    2014-12-01

    Inflation models make specific predictions for a tensor-scalar-scalar three-point correlation, or bispectrum, between one gravitational-wave (tensor) mode and two density-perturbation (scalar) modes. This tensor-scalar-scalar correlation leads to a local power quadrupole, an apparent departure from statistical isotropy in our Universe, as well as characteristic four-point correlations in the current mass distribution in the Universe. So far, the predictions for these observables have been worked out only for single-clock models in which certain consistency conditions between the tensor-scalar-scalar correlation and tensor and scalar power spectra are satisfied. Here we review the requirements on inflation models for these consistency conditions to be satisfied. We then consider several examples of inflation models, such as non-attractor and solid-inflation models, in which these conditions are put to the test. In solid inflation the simplest consistency conditions are already violated whilst in the non-attractor model we find that, contrary to the standard scenario, the tensor-scalar-scalar correlator probes directly relevant model-dependent information. We work out the predictions for observables in these models. For non-attractor inflation we find an apparent local quadrupolar departure from statistical isotropy in large-scale structure but that this power quadrupole decreases very rapidly at smaller scales. The consistency of the CMB quadrupole with statistical isotropy then constrains the distance scale that corresponds to the transition from the non-attractor to attractor phase of inflation to be larger than the currently observable horizon. Solid inflation predicts clustering fossils signatures in the current galaxy distribution that may be large enough to be detectable with forthcoming, and possibly even current, galaxy surveys.

  13. A health survey of radiologic technologists.

    PubMed

    Boice, J D; Mandel, J S; Doody, M M; Yoder, R C; McGowan, R

    1992-01-15

    A health survey of more than 143,000 radiologic technologists is described. The population was identified from the 1982 computerized files of the American Registry of Radiologic Technologists, which was established in 1926. Inactive members were traced to obtain current addresses or death notifications. More than 6000 technologists were reported to have died. For all registrants who were alive when located, a detailed 16-page questionnaire was sent, covering occupational histories, medical conditions, and other personal and lifestyle characteristics. Nonrespondents were contacted by telephone to complete an abbreviated questionnaire. More than 104,000 responses were obtained. The overall response rate was 79%. Most technologists were female (76%), white (93%), and employed for an average of 12 years; 37% attended college, and approximately 50% never smoked cigarettes. Radiation exposure information was sought from employer records and commercial dosimetry companies. Technologists employed for the longest times had the highest estimated cumulative exposures, with approximately 9% having exposures greater than 5 cGy. There was a high correlation between cumulative occupational exposure and personal exposure to medical radiographs, related, in part, to the association of both factors with attained age. It is interesting that 10% of all technologists allowed others to practice taking radiographs on them during their training. Nearly 4% of the respondents reported having some type of cancer, mainly of the skin (1517), breast (665), and cervix (726). Prospective surveys will monitor cancer mortality rates through use of the National Death Index and cancer incidence through periodic mailings of questionnaires. This is the only occupational study of radiation workers who are primarily women, and it should provide new information on the possible risks associated with relatively low levels of exposure. PMID:1728391

  14. Occupational health survey of farm workers by camp health aides.

    PubMed

    Cameron, L; Lalich, N; Bauer, S; Booker, V; Bogue, H O; Samuels, S; Steege, A L

    2006-05-01

    Little is known about the magnitude of occupational health problems among migrant farm workers. A community-based cross-sectional survey was conducted in two migrant farm worker communities: Homestead, Florida, and Kankakee, Illinois. Camp Health Aides (CHAs) interviewed 425 workers about job tasks, personal protective equipment (PPE), field sanitation, work exposures, and selected health conditions. Limited provision of personal protective equipment was reported among those reporting early re-entry tasks: 35% in Kankakee and 42% in Homestead were provided gloves, and 22% in Homestead and 0% in Kankakee were provided protective clothing. About two-thirds were provided toilet facilities and water for hand-washing. Workers reported high prevalences of health conditions consistent with exposure to ergonomic hazards and pesticides. The prevalence of back pain in the past 12 months was 39% in Homestead and 24% in Kankakee. Among Homestead participants, 35% experienced eye symptoms, while 31% reported skin symptoms. These symptoms were less prevalent among Kankakee participants (16% for both eye and skin symptoms). Specific areas of concern included back pain associated with heavy lifting and ladder work; eye and skin irritation associated with fertilizer application tasks and with working in fields during or after spraying of chemicals, especially early re-entry of sprayed fields; and skin irritation associated with a lack of access to hand-washing facilities. In both Kankakee and Homestead, better adherence to safety standards is needed, as well as greater efforts to implement solutions that are available to help prevent work-related musculoskeletal problems. PMID:16724790

  15. Health sciences library building projects: 1995 survey.

    PubMed Central

    Ludwig, L

    1996-01-01

    The Medical Library Association's fifth annual survey of recent health sciences library building projects identified twenty-five libraries planning, expanding, or constructing new library facilities. None of the fifteen new library projects are free-standing structures; however, several occupy a major portion of the project space. Ten projects involve renovation of or addition to existing space. Information regarding size, cost of project, type of construction, completion date, and other factual data was provided for twelve projects. The remaining identified projects are in pre-design or early-design stages, or are awaiting funding approval. Library building projects for three hospital libraries, three academic medical libraries, and an association library are described. Each illustrates how considerations of economics and technology are changing the traditional library model from a centrally stored information depository housing a wide range of information under one roof, where users come to the information, into an electronic model gradually shifting from investment in the physical presence of resources to investment in creating work space for credible information specialists who help in-house and distanced users to obtain information electronically from any place and at any time. This new model includes a highly skilled library team to manage, filter, and package the information for users trained by these resident experts. PMID:8883981

  16. Worksite Health Promotion Activities. 1992 National Survey. Summary Report.

    ERIC Educational Resources Information Center

    Public Health Service (DHHS), Rockville, MD. Office of Disease Prevention and Health Promotion.

    The survey reported in this document examined worksite health promotion and disease prevention activities in 1,507 private worksites in the United States. Specifically, the survey assessed policies, practices, services, facilities, information, and activities sponsored by employers to improve the health of their employees, and assessed health…

  17. A survey on the current status of health care marketing.

    PubMed

    Gardner, S F; Paison, A R

    1985-01-01

    This article presents the results of a survey, conducted by Market-PULSE Measurement Systems, reflecting the growth of health care marketing and the marketing perspectives of health care professionals. The survey results echo the opinions of two groups of professionals: chief executive officers of hospitals with over 100 beds, and administrators as well as directors of marketing, planning, and public relations who attended a recent health services marketing conference. The survey, a telephone interview, was conducted to determine the degree to which hospitals are market oriented and the degree to which they use survey research. The following is an analysis of what the survey found.

  1. EINSTEIN'S SIGNATURE IN COSMOLOGICAL LARGE-SCALE STRUCTURE

    SciTech Connect

    Bruni, Marco; Hidalgo, Juan Carlos; Wands, David

    2014-10-10

    We show how the nonlinearity of general relativity generates a characteristic non-Gaussian signal in cosmological large-scale structure that we calculate at all perturbative orders in a large-scale limit. Newtonian gravity and general relativity provide complementary theoretical frameworks for modeling large-scale structure in ΛCDM cosmology; a relativistic approach is essential to determine initial conditions, which can then be used in Newtonian simulations studying the nonlinear evolution of the matter density. Most inflationary models in the very early universe predict an almost Gaussian distribution for the primordial metric perturbation, ζ. However, we argue that it is the Ricci curvature of comoving-orthogonal spatial hypersurfaces, R, that drives structure formation at large scales. We show how the nonlinear relation between the spatial curvature, R, and the metric perturbation, ζ, translates into a specific non-Gaussian contribution to the initial comoving matter density that we calculate for the simple case of an initially Gaussian ζ. Our analysis shows the nonlinear signature of Einstein's gravity in large-scale structure.
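
    The non-Gaussian contribution computed in this record is often compared against the standard local parametrization of primordial non-Gaussianity. As a reference sketch (the local ansatz below is standard in the literature; its use here is illustrative, not a quotation of the paper):

      \[
        \zeta(\mathbf{x}) = \zeta_{\mathrm{G}}(\mathbf{x})
          + \tfrac{3}{5} f_{\mathrm{NL}}
            \left[ \zeta_{\mathrm{G}}^{2}(\mathbf{x}) - \langle \zeta_{\mathrm{G}}^{2} \rangle \right],
      \]

    so the quadratic relation between R and an initially Gaussian ζ acts on the initial comoving matter density like an effective local f_NL of order unity.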

  2. Large-Scale Candidate Gene Analysis of HDL Particle Features

    PubMed Central

    Kaess, Bernhard M.; Tomaszewski, Maciej; Braund, Peter S.; Stark, Klaus; Rafelt, Suzanne; Fischer, Marcus; Hardwick, Robert; Nelson, Christopher P.; Debiec, Radoslaw; Huber, Fritz; Kremer, Werner; Kalbitzer, Hans Robert; Rose, Lynda M.; Chasman, Daniel I.; Hopewell, Jemma; Clarke, Robert; Burton, Paul R.; Tobin, Martin D.

    2011-01-01

    Background HDL cholesterol (HDL-C) is an established marker of cardiovascular risk with significant genetic determination. However, HDL particles are not homogenous, and refined HDL phenotyping may improve insight into regulation of HDL metabolism. We therefore assessed HDL particles by NMR spectroscopy and conducted a large-scale candidate gene association analysis. Methodology/Principal Findings We measured plasma HDL-C and determined mean HDL particle size and particle number by NMR spectroscopy in 2024 individuals from 512 British Caucasian families. Genotypes were 49,094 SNPs in >2,100 cardiometabolic candidate genes/loci as represented on the HumanCVD BeadChip version 2. False discovery rates (FDR) were calculated to account for multiple testing. Analyses on classical HDL-C revealed significant associations (FDR<0.05) only for CETP (cholesteryl ester transfer protein; lead SNP rs3764261: p = 5.6*10−15) and SGCD (sarcoglycan delta; rs6877118: p = 8.6*10−6). In contrast, analysis with HDL mean particle size yielded additional associations in LIPC (hepatic lipase; rs261332: p = 6.1*10−9), PLTP (phospholipid transfer protein, rs4810479: p = 1.7*10−8) and FBLN5 (fibulin-5; rs2246416: p = 6.2*10−6). The associations of SGCD and Fibulin-5 with HDL particle size could not be replicated in PROCARDIS (n = 3,078) and/or the Women's Genome Health Study (n = 23,170). Conclusions We show that refined HDL phenotyping by NMR spectroscopy can detect known genes of HDL metabolism better than analyses on HDL-C. PMID:21283740
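
    The FDR control mentioned above is commonly implemented with the Benjamini-Hochberg step-up procedure. A minimal sketch (the p-values are illustrative, and the assumption that a BH-type procedure was used is ours; the abstract does not specify the method):

      import numpy as np

      def benjamini_hochberg(pvals, alpha=0.05):
          """Boolean mask of discoveries under BH FDR control at level alpha."""
          p = np.asarray(pvals)
          m = p.size
          order = np.argsort(p)
          passed = p[order] <= alpha * np.arange(1, m + 1) / m
          k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
          mask = np.zeros(m, dtype=bool)
          mask[order[:k]] = True          # reject the k smallest p-values
          return mask

      # Illustrative p-values, including strong hits like the CETP/SGCD leads above.
      pvals = [5.6e-15, 8.6e-6, 0.003, 0.04, 0.2, 0.7]
      print(benjamini_hochberg(pvals))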

  3. [Colombia. Prevalence, Demography and Health Survey 1990].

    PubMed

    1991-06-01

    Colombia's 1990 Survey of Prevalence, Demography, and Health (EPDS) was intended to provide data on the total population and on the status of women's and children's health for use in planning and in formulating health and family planning policy. 7412 household interviews and 8647 individual interviews with women aged 15-49 years were completed. This document provides a brief description of the questionnaire, sample design, data processing, and survey results. More detailed works on each topic are expected to follow. After weighting, 74.8% of respondents were urban and 25.2% rural. 3.2% were illiterate, 36.6% had some primary education, 50.2% had secondary education, and 9.9% had higher education. Among all respondents and respondents currently in union respectively, 98.2% and 99.7% knew some contraceptive method, 94.1% and 97.9% knew some source of family planning, 57.6% and 86.0% had ever used a method, and 39.9% and 66.1% were currently using a method. Among all respondents and respondents currently in union respectively, 52.2% and 78.9% had ever used a modern method and 33.0% and 54.6% were currently using a modern method. Among women in union, 14.1% currently used pills, 12.4% IUDs, 2.2% injectables, 1.7% vaginal methods, 2.9% condoms, 20.9% female sterilization, 0.5% vasectomy, 11.5% some traditional method, 6.1% periodic abstinence, 4.8% withdrawal, and 0.5% other methods. Equal proportions of rural and urban women were sterilized. The prevalence of female sterilization declined with education and increased with family size. Modern methods were used by 57.5% of urban and 47.7% of rural women, 44.0% of illiterate women, and 51.8% of women with primary and 57.8% with secondary education. Among women in union, 10.9% wanted a child soon, 19.7% wanted one eventually, 3.6% were undecided, 42.6% did not want one, 21.4% were sterilized, and 1.2% were infertile. Among women giving birth in the past 5 years, the proportion having antitetanus vaccinations increased from 39% in 1986
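
    The "after weighting" figures above come from design-weighted estimation, in which each respondent contributes her sampling weight rather than counting equally. A minimal sketch (the responses and weights are toy values; real DHS-style weights ship with the survey microdata):

      import numpy as np

      using_modern_method = np.array([1, 0, 1, 1, 0, 1, 0, 1])   # 1 = yes
      weight = np.array([1.2, 0.8, 1.0, 1.5, 0.9, 1.1, 0.7, 1.3])

      prevalence = np.average(using_modern_method, weights=weight)
      print(f"weighted prevalence: {prevalence:.1%}")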

  4. Efficiency of workplace surveys conducted by Finnish occupational health services.

    PubMed

    Savinainen, Minna; Oksa, Panu

    2011-07-01

    In Finland, workplace surveys are used to identify and assess health risks and problems caused by work and to make suggestions for continuous improvement of the work environment. With the aid of the workplace survey, occupational health services can be tailored to a company. The aims of this study were to determine how occupational health professionals gather data via the workplace survey and the effect survey results have on companies. A total of 259 occupational health nurses and 108 occupational health physicians responded to the questionnaire: 84.2% were women and 15.8% were men. The mean age of the respondents was 48.8 years (range, 26 to 65 years). The workplace survey is usually initiated by occupational health nurses and foremen, and sometimes by occupational health physicians and occupational safety and health representatives. More than 90% of the surveys were followed by action proposals, and about 50% of these were implemented. The proposals implemented most often concerned personal protective equipment, and less often leadership. Survey respondents should have both the opportunity and the authority to affect resources, the work environment, work arrangements, and tools. Teamwork among occupational health and safety professionals, management, and employees is vital for cost-effectively solving today's complex problems at workplaces around the globe. PMID:21710956

  5. Brief 73 Health Physics Enrollments and Degrees Survey, 2013 Data

    SciTech Connect

    None, None

    2014-02-15

    The survey includes degrees granted between September 1, 2012 and August 31, 2013. Enrollment information refers to the fall term 2013. Twenty-two academic programs were included in the survey universe, with all 22 programs providing data. Since 2009, data for two health physics programs located in engineering departments are also included in the nuclear engineering survey. The enrollments and degrees data includes students majoring in health physics or in an option program equivalent to a major.

  6. Brief 75 Health Physics Enrollments and Degrees Survey, 2014 Data

    SciTech Connect

    None, None

    2015-03-05

    The 2014 survey includes degrees granted between September 1, 2013 and August 31, 2014. Enrollment information refers to the fall term 2014. Twenty-two academic programs were included in the survey universe, with all 22 programs providing data. Since 2009, data for two health physics programs located in engineering departments are also included in the nuclear engineering survey. The enrollments and degrees data includes students majoring in health physics or in an option program equivalent to a major.

  7. Toward Improved Support for Loosely Coupled Large Scale Simulation Workflows

    SciTech Connect

    Boehm, Swen; Elwasif, Wael R; Naughton, III, Thomas J; Vallee, Geoffroy R

    2014-01-01

    High-performance computing (HPC) workloads are increasingly leveraging loosely coupled large scale simulations. Unfortunately, most large-scale HPC platforms, including Cray/ALPS environments, are designed for the execution of long-running jobs based on coarse-grained launch capabilities (e.g., one MPI rank per core on all allocated compute nodes). This assumption limits capability-class workload campaigns that require large numbers of discrete or loosely coupled simulations, and where time-to-solution is an untenable pacing issue. This paper describes the challenges related to the support of fine-grained launch capabilities that are necessary for the execution of loosely coupled large scale simulations on Cray/ALPS platforms. More precisely, we present the details of an enhanced runtime system to support this use case, and report on initial results from early testing on systems at Oak Ridge National Laboratory.

  8. Do Large-Scale Topological Features Correlate with Flare Properties?

    NASA Astrophysics Data System (ADS)

    DeRosa, Marc L.; Barnes, Graham

    2016-05-01

    In this study, we aim to identify whether the presence or absence of particular topological features in the large-scale coronal magnetic field are correlated with whether a flare is confined or eruptive. To this end, we first determine the locations of null points, spine lines, and separatrix surfaces within the potential fields associated with the locations of several strong flares from the current and previous sunspot cycles. We then validate the topological skeletons against large-scale features in observations, such as the locations of streamers and pseudostreamers in coronagraph images. Finally, we characterize the topological environment in the vicinity of the flaring active regions and identify the trends involving their large-scale topologies and the properties of the associated flares.

  9. Acoustic Studies of the Large Scale Ocean Circulation

    NASA Technical Reports Server (NTRS)

    Menemenlis, Dimitris

    1999-01-01

    Detailed knowledge of ocean circulation and its transport properties is prerequisite to an understanding of the earth's climate and of important biological and chemical cycles. Results from two recent experiments, THETIS-2 in the Western Mediterranean and ATOC in the North Pacific, illustrate the use of ocean acoustic tomography for studies of the large scale circulation. The attraction of acoustic tomography is its ability to sample and average the large-scale oceanic thermal structure, synoptically, along several sections, and at regular intervals. In both studies, the acoustic data are compared to, and then combined with, general circulation models, meteorological analyses, satellite altimetry, and direct measurements from ships. Both studies provide complete regional descriptions of the time-evolving, three-dimensional, large scale circulation, albeit with large uncertainties. The studies raise serious issues about existing ocean observing capability and provide guidelines for future efforts.
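
    The sample-and-average property described above rests on the linearized relation between a ray's travel-time perturbation and the sound-speed (hence temperature) anomaly along its path. In standard notation (the temperature coefficient is a commonly quoted near-surface approximation, not a value from these experiments):

      \[
        \delta\tau \simeq -\int_{\Gamma} \frac{\delta c(\mathbf{x})}{c_0^{2}(\mathbf{x})}\,\mathrm{d}s,
        \qquad \delta c \approx a\,\delta T, \quad a \sim 4.6\ \mathrm{m\,s^{-1}\,{}^{\circ}\mathrm{C}^{-1}},
      \]

    so each measured travel time constrains the temperature anomaly averaged along the ray path Γ, which is what makes tomography naturally suited to large-scale thermal structure.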

  10. A relativistic signature in large-scale structure

    NASA Astrophysics Data System (ADS)

    Bartolo, Nicola; Bertacca, Daniele; Bruni, Marco; Koyama, Kazuya; Maartens, Roy; Matarrese, Sabino; Sasaki, Misao; Verde, Licia; Wands, David

    2016-09-01

    In General Relativity, the constraint equation relating metric and density perturbations is inherently nonlinear, leading to an effective non-Gaussianity in the dark matter density field on large scales, even if the primordial metric perturbation is Gaussian. Intrinsic non-Gaussianity in the large-scale dark matter overdensity in GR is real and physical. However, the variance smoothed on a local physical scale is not correlated with the large-scale curvature perturbation, so that there is no relativistic signature in the galaxy bias when using the simplest model of bias. It is an open question whether the observable mass proxies such as luminosity or weak lensing correspond directly to the physical mass in the simple halo bias model. If not, there may be observables that encode this relativistic signature.

  11. Coupling between convection and large-scale circulation

    NASA Astrophysics Data System (ADS)

    Becker, T.; Stevens, B. B.; Hohenegger, C.

    2014-12-01

    The ultimate drivers of convection - radiation, tropospheric humidity and surface fluxes - are altered both by the large-scale circulation and by convection itself. A quantity to which all drivers of convection contribute is the moist static energy or, relatedly, the gross moist stability. Therefore, a variance analysis of the moist static energy budget in radiative-convective equilibrium helps in understanding the interaction between precipitating convection and the large-scale environment. In addition, this method provides insights concerning the impact of convective aggregation on this coupling. As a starting point, the interaction is analyzed with a general circulation model, but a model intercomparison study using a hierarchy of models is planned. Effective coupling parameters will be derived from cloud-resolving models, and these will in turn be related to assumptions used to parameterize convection in large-scale models.
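
    The budget quantity underlying this analysis is the moist static energy; for reference, its standard definition (notation assumed here, not taken from the abstract):

      \[
        h = c_{p} T + g z + L_{v} q,
      \]

    where c_p is the specific heat at constant pressure, g the gravitational acceleration, L_v the latent heat of vaporization, and q the specific humidity. A variance analysis of the h budget then diagnoses which source terms (radiation, surface fluxes, advection) co-vary with precipitating convection.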

  12. Human pescadillo induces large-scale chromatin unfolding.

    PubMed

    Zhang, Hao; Fang, Yan; Huang, Cuifen; Yang, Xiao; Ye, Qinong

    2005-06-01

    The human pescadillo gene encodes a protein with a BRCT domain. Pescadillo plays an important role in DNA synthesis, cell proliferation and transformation. Since BRCT domains have been shown to induce large-scale chromatin unfolding, we tested the role of Pescadillo in the regulation of large-scale chromatin unfolding. To this end, we isolated the coding region of Pescadillo from human mammary MCF10A cells. Compared with the reported sequence, the isolated Pescadillo contains an in-frame deletion from amino acids 580 to 582. Targeting the Pescadillo to an amplified, lac operator-containing chromosome region in the mammalian genome results in large-scale chromatin decondensation. This unfolding activity maps to the BRCT domain of Pescadillo. These data provide a new clue to understanding the vital role of Pescadillo.

  13. Numerical methods for large-scale, time-dependent partial differential equations

    NASA Technical Reports Server (NTRS)

    Turkel, E.

    1979-01-01

    A survey of numerical methods for time dependent partial differential equations is presented. The emphasis is on practical applications to large scale problems. A discussion of new developments in high order methods and moving grids is given. The importance of boundary conditions is stressed for both internal and external flows. A description of implicit methods is presented including generalizations to multidimensions. Shocks, aerodynamics, meteorology, plasma physics and combustion applications are also briefly described.

  14. Magnetic Helicity and Large Scale Magnetic Fields: A Primer

    NASA Astrophysics Data System (ADS)

    Blackman, Eric G.

    2015-05-01

    Magnetic fields of laboratory, planetary, stellar, and galactic plasmas commonly exhibit significant order on large temporal or spatial scales compared to the otherwise random motions within the hosting system. Such ordered fields can be measured in the case of planets, stars, and galaxies, or inferred indirectly by the action of their dynamical influence, such as jets. Whether large scale fields are amplified in situ or are a remnant from previous stages of an object's history is often debated for objects without a definitive magnetic activity cycle. Magnetic helicity, a measure of twist and linkage of magnetic field lines, is a unifying tool for understanding large scale field evolution for both mechanisms of origin. Its importance stems from its two basic properties: (1) magnetic helicity is typically better conserved than magnetic energy; and (2) the magnetic energy associated with a fixed amount of magnetic helicity is minimized when the system relaxes this helical structure to the largest scale available. Here I discuss how magnetic helicity has come to help us understand the saturation and sustenance of large scale dynamos, the need for either local or global helicity fluxes to avoid dynamo quenching, and the associated observational consequences. I also discuss how magnetic helicity acts as a hindrance to turbulent diffusion of large scale fields, and thus as a helper for fossil remnant large scale field origin models in some contexts. I briefly discuss the connection between large scale fields and accretion disk theory as well. The goal here is to provide a conceptual primer to help the reader efficiently penetrate the literature.
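
    Property (2) above can be made quantitative with the spectral realizability bound relating magnetic energy and helicity. As a textbook-style sketch (not a formula quoted from the primer):

      \[
        H = \int \mathbf{A}\cdot\mathbf{B}\,\mathrm{d}V,
        \qquad E(k) \ge \tfrac{1}{2}\,k\,\lvert H(k)\rvert,
      \]

    so, at fixed total helicity, the magnetic energy is minimized when the helicity is carried by the smallest available wavenumber, i.e., when the helical structure relaxes to the largest scale of the system.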

  15. Clearing and Labeling Techniques for Large-Scale Biological Tissues

    PubMed Central

    Seo, Jinyoung; Choe, Minjin; Kim, Sung-Yon

    2016-01-01

    Clearing and labeling techniques for large-scale biological tissues enable simultaneous extraction of molecular and structural information with minimal disassembly of the sample, facilitating the integration of molecular, cellular and systems biology across different scales. Recent years have witnessed an explosive increase in the number of such methods and their applications, reflecting heightened interest in organ-wide clearing and labeling across many fields of biology and medicine. In this review, we provide an overview and comparison of existing clearing and labeling techniques and discuss challenges and opportunities in the investigations of large-scale biological systems. PMID:27239813

  16. Corridors Increase Plant Species Richness at Large Scales

    SciTech Connect

    Damschen, Ellen I.; Haddad, Nick M.; Orrock,John L.; Tewksbury, Joshua J.; Levey, Douglas J.

    2006-09-01

    Habitat fragmentation is one of the largest threats to biodiversity. Landscape corridors, which are hypothesized to reduce the negative consequences of fragmentation, have become common features of ecological management plans worldwide. Despite their popularity, there is little evidence documenting the effectiveness of corridors in preserving biodiversity at large scales. Using a large-scale replicated experiment, we showed that habitat patches connected by corridors retain more native plant species than do isolated patches, that this difference increases over time, and that corridors do not promote invasion by exotic species. Our results support the use of corridors in biodiversity conservation.

  17. Large-scale superfluid vortex rings at nonzero temperatures

    NASA Astrophysics Data System (ADS)

    Wacks, D. H.; Baggaley, A. W.; Barenghi, C. F.

    2014-12-01

    We numerically model experiments in which large-scale vortex rings—bundles of quantized vortex loops—are created in superfluid helium by a piston-cylinder arrangement. We show that the presence of a normal-fluid vortex ring together with the quantized vortices is essential to explain the coherence of these large-scale vortex structures at nonzero temperatures, as observed experimentally. Finally we argue that the interaction of superfluid and normal-fluid vortex bundles is relevant to recent investigations of superfluid turbulence.

  18. Childfeeding survey at Kimalewa Health Centre.

    PubMed

    Lavrijsen, G; Jansen, A A

    1983-07-01

    In July and August 1980 a child feeding survey was conducted at Kimalewa Health Center, Bokoli Location, Western Province, Kenya, to become acquainted with traditional child feeding patterns. Interviews were held at the Center with the help of a male health worker; 150 women were interviewed. The majority of the mothers breastfed their children on demand. One in five (21.4%) of all children taken off the breast was weaned during the first year of life. The main reasons for stopping breastfeeding were the feeling that the child was old enough or that the mother was pregnant again. Other reasons given were not enough milk, illness of the mother, refusal by the child, the belief that the bottle is better, and abscess of the breast. Most of the children were gradually weaned off the breast by giving them (more of) other foods. Abrupt ways to stop breastfeeding were painting the nipples with pili pili, sending the children to relatives, and the mother sleeping dressed. As to feeding practices, the most important weaning food was porridge. Uji and cow's milk appeared to be the first weaning foods. From 5-6 months onwards cow's milk was replaced by other foods. Fruit juice was given to a few babies only. Fresh fruit was more popular (oranges, lemons, and sweet banana) from 3 months onwards. No solid foods were introduced during the first 3 months, with the exception of beans. Only after 6 months were children given solid foods like ugali and vegetables. Milk intake increased with age. Fresh cow's milk mixed with some water, or cow's milk added to uji, were commonly used. 63% of the mothers thought that for 3-6 month old infants breastfeeding was best; 23% thought that bottle feeding was better than breastfeeding; 21 women did not know which of the two methods was better. There was confusion regarding the time solid foods should be introduced. Most mothers seemed to favor early introduction of solid foods; 14% of the mothers thought that breastfeeding should be stopped before the child

  19. Lessons that newborn screening in the USA can teach us about biobanking and large-scale genetic studies

    PubMed Central

    Tarini, Beth A; Lantos, John D

    2013-01-01

    The intent in establishing newborn screening programs was not to create and sustain large-scale genetic biobanks. Instead, newborn screening programs were designed as a public health program. As such, they have successfully screened millions of asymptomatic newborns for diseases that, undiagnosed and untreated, would cause disability or death. However, historical decisions on retention of residual samples and technological innovation have forced these programs and their proponents to confront the prospect of biobanking and the conduct of large-scale genetic studies. We suggest that the challenges facing newborn screening can provide important lessons for other biobanking and large-scale genetic testing endeavors. PMID:23599719

  1. Large-scale volcanism associated with coronae on Venus

    NASA Technical Reports Server (NTRS)

    Roberts, K. Magee; Head, James W.

    1993-01-01

    The formation and evolution of coronae on Venus are thought to be the result of mantle upwellings against the crust and lithosphere and subsequent gravitational relaxation. A variety of other features on Venus have been linked to processes associated with mantle upwelling, including shield volcanoes on large regional rises such as Beta, Atla and Western Eistla Regiones and extensive flow fields such as Mylitta and Kaiwan Fluctus near the Lada Terra/Lavinia Planitia boundary. Of these features, coronae appear to possess the smallest amounts of associated volcanism, although volcanism associated with coronae has only been qualitatively examined. An initial survey of coronae based on recent Magellan data indicated that only 9 percent of all coronae are associated with substantial amounts of volcanism, including interior calderas or edifices greater than 50 km in diameter and extensive, exterior radial flow fields. Sixty-eight percent of all coronae were found to have lesser amounts of volcanism, including interior flooding and associated volcanic domes and small shields; the remaining coronae were considered deficient in associated volcanism. It is possible that coronae are related to mantle plumes or diapirs that are lower in volume or in partial melt than those associated with the large shields or flow fields. Regional tectonics or variations in local crustal and thermal structure may also be significant in determining the amount of volcanism produced from an upwelling. It is also possible that flow fields associated with some coronae are sheet-like in nature and may not be readily identified. If coronae are associated with volcanic flow fields, then they may be a significant contributor to plains formation on Venus, as they number over 300 and are widely distributed across the planet. As a continuation of our analysis of large-scale volcanism on Venus, we have reexamined the known population of coronae and assessed quantitatively the scale of volcanism associated

  2. A review of national health surveys in India

    PubMed Central

    Pandey, Anamika; Dandona, Lalit

    2016-01-01

    Several rounds of national health surveys have generated a vast amount of data in India since 1992. We describe and compare the key health information gathered, assess the availability of health data in the public domain, and review publications resulting from the National Family Health Survey (NFHS), the District Level Household Survey (DLHS) and the Annual Health Survey (AHS). We highlight issues that need attention to improve the usefulness of the surveys in monitoring changing trends in India’s disease burden: (i) inadequate coverage of noncommunicable diseases, injuries and some major communicable diseases; (ii) modest comparability between surveys on the key themes of child and maternal mortality and immunization to understand trends over time; (iii) short time intervals between the most recent survey rounds; and (iv) delays in making individual-level data available for analysis in the public domain. We identified 337 publications using NFHS data; in contrast, only 48 and three publications used data from the DLHS and AHS, respectively. As national surveys are resource-intensive, it would be prudent to maximize their benefits. We suggest that India plan for a single major national health survey at five-year intervals in consultation with key stakeholders. This could cover additional major causes of the disease burden and their risk factors, as well as causes of death and adult mortality rate estimation. If done in a standardized manner, such a survey would provide useable and timely data to inform health interventions and facilitate assessment of their impact on population health. PMID:27034522

  3. Large-Scale Machine Learning for Classification and Search

    ERIC Educational Resources Information Center

    Liu, Wei

    2012-01-01

    With the rapid development of the Internet, nowadays tremendous amounts of data including images and videos, up to millions or billions, can be collected for training machine learning models. Inspired by this trend, this thesis is dedicated to developing large-scale machine learning techniques for the purpose of making classification and nearest…

  4. Newton Methods for Large Scale Problems in Machine Learning

    ERIC Educational Resources Information Center

    Hansen, Samantha Leigh

    2014-01-01

    The focus of this thesis is on practical ways of designing optimization algorithms for minimizing large-scale nonlinear functions with applications in machine learning. Chapter 1 introduces the overarching ideas in the thesis. Chapters 2 and 3 are geared towards supervised machine learning applications that involve minimizing a sum of loss…
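
    For orientation, the canonical building block of such methods is a Newton iteration on a regularized sum of losses. A minimal sketch for logistic loss (the data, regularization, and undamped step rule are illustrative assumptions, not the thesis's algorithms):

      import numpy as np

      # Newton's method for an L2-regularized logistic loss (a "sum of losses").
      rng = np.random.default_rng(1)
      X = rng.standard_normal((200, 5))
      y = (X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) > 0).astype(float)
      lam = 1e-3

      def sigmoid(z):
          return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

      w = np.zeros(5)
      for _ in range(10):
          p = sigmoid(X @ w)
          grad = X.T @ (p - y) / len(y) + lam * w
          H = (X.T * (p * (1 - p))) @ X / len(y) + lam * np.eye(5)
          w -= np.linalg.solve(H, grad)        # full Newton step, no line search

      print("fitted weights:", np.round(w, 2))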

  5. The Large-Scale Structure of Scientific Method

    ERIC Educational Resources Information Center

    Kosso, Peter

    2009-01-01

    The standard textbook description of the nature of science describes the proposal, testing, and acceptance of a theoretical idea almost entirely in isolation from other theories. The resulting model of science is a kind of piecemeal empiricism that misses the important network structure of scientific knowledge. Only the large-scale description of…

  6. Potential and issues in large scale flood inundation modelling

    NASA Astrophysics Data System (ADS)

    Di Baldassarre, Giuliano; Brandimarte, Luigia; Dottori, Francesco; Mazzoleni, Maurizio; Yan, Kun

    2015-04-01

    The last years have seen a growing research interest on large scale flood inundation modelling. Nowadays, modelling tools and datasets allow for analyzing flooding processes at regional, continental and even global scale with an increasing level of detail. As a result, several research works have already addressed this topic using different methodologies of varying complexity. The potential of these studies is certainly enormous. Large scale flood inundation modelling can provide valuable information in areas where few information and studies were previously available. They can provide a consistent framework for a comprehensive assessment of flooding processes in the river basins of world's large rivers, as well as impacts of future climate scenarios. To make the most of such a potential, we believe it is necessary, on the one hand, to understand strengths and limitations of the existing methodologies, and on the other hand, to discuss possibilities and implications of using large scale flood models for operational flood risk assessment and management. Where should researchers put their effort, in order to develop useful and reliable methodologies and outcomes? How the information coming from large scale flood inundation studies can be used by stakeholders? How should we use this information where previous higher resolution studies exist, or where official studies are available?

  7. Global smoothing and continuation for large-scale molecular optimization

    SciTech Connect

    More, J.J.; Wu, Zhijun

    1995-10-01

    We discuss the formulation of optimization problems that arise in the study of distance geometry, ionic systems, and molecular clusters. We show that continuation techniques based on global smoothing are applicable to these molecular optimization problems, and we outline the issues that must be resolved in the solution of large-scale molecular optimization problems.
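
    The global-smoothing continuation idea can be illustrated in one dimension: convolve the objective with a Gaussian, minimize the heavily smoothed version, then track the minimizer as the smoothing is reduced. A toy sketch (the objective, quadrature, and schedule are assumptions, not the paper's formulation):

      import numpy as np

      def f(x):
          return 0.05 * x**2 + np.sin(3.0 * x)          # toy multiminima objective

      def smoothed(x, sigma, nq=201):
          # Gaussian convolution of f, evaluated by simple quadrature.
          t = np.linspace(-4*sigma, 4*sigma, nq)
          w = np.exp(-t**2 / (2*sigma**2))
          return np.sum(w * f(x + t)) / w.sum()

      def descend(g, x, lr=0.05, iters=500, eps=1e-4):
          # Crude gradient descent with finite-difference derivatives.
          for _ in range(iters):
              x -= lr * (g(x + eps) - g(x - eps)) / (2*eps)
          return x

      x = 5.0                                           # deliberately poor start
      for sigma in [3.0, 1.5, 0.7, 0.3, 0.1, 1e-3]:     # continuation schedule
          x = descend(lambda z: smoothed(z, sigma), x)

      print(f"tracked minimizer: x = {x:.3f}, f(x) = {f(x):.3f}")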

  8. International Large-Scale Assessments: What Uses, What Consequences?

    ERIC Educational Resources Information Center

    Johansson, Stefan

    2016-01-01

    Background: International large-scale assessments (ILSAs) are a much-debated phenomenon in education. Increasingly, their outcomes attract considerable media attention and influence educational policies in many jurisdictions worldwide. The relevance, uses and consequences of these assessments are often the focus of research scrutiny. Whilst some…

  9. Current Scientific Issues in Large Scale Atmospheric Dynamics

    NASA Technical Reports Server (NTRS)

    Miller, T. L. (Compiler)

    1986-01-01

    Topics in large scale atmospheric dynamics are discussed. Aspects of atmospheric blocking, the influence of transient baroclinic eddies on planetary-scale waves, cyclogenesis, the effects of orography on planetary scale flow, small scale frontal structure, and simulations of gravity waves in frontal zones are discussed.

  10. Large-scale drift and Rossby wave turbulence

    NASA Astrophysics Data System (ADS)

    Harper, K. L.; Nazarenko, S. V.

    2016-08-01

    We study drift/Rossby wave turbulence described by the large-scale limit of the Charney–Hasegawa–Mima equation. We define the zonal and meridional regions as Z := {k : |k_y| > √3 k_x} and M := {k : |k_y| < √3 k_x} respectively, where k = (k_x, k_y) is in a plane perpendicular to the magnetic field such that k_x is along the isopycnals and k_y is along the plasma density gradient. We prove that the only types of resonant triads allowed are M ↔ M + Z and Z ↔ Z + Z. Therefore, if the spectrum of weak large-scale drift/Rossby turbulence is initially in Z it will remain in Z indefinitely. We present a generalised Fjørtoft's argument to find transfer directions for the quadratic invariants in the two-dimensional k-space. Using direct numerical simulations, we test and confirm our theoretical predictions for weak large-scale drift/Rossby turbulence, and establish qualitative differences with cases when turbulence is strong. We demonstrate that the qualitative features of the large-scale limit survive when the typical turbulent scale is only moderately greater than the Larmor/Rossby radius.
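
    The allowed triad types can be checked numerically. In the large-scale limit the trivially resonant part of the frequency cancels on any triad, leaving the condition Ω(k) = k_x|k|² with Ω(k3) = Ω(k1) + Ω(k2) and k3 = k1 + k2; that reduction, and the use of |k_x| on the region boundary, are our assumptions in this sketch:

      import numpy as np

      def omega(k):                       # reduced resonance function k_x * |k|^2
          return k[0] * (k[0]**2 + k[1]**2)

      def region(k):                      # zonal vs meridional, sqrt(3) boundary
          return "Z" if abs(k[1]) > np.sqrt(3.0) * abs(k[0]) else "M"

      rng = np.random.default_rng(2)
      combos = set()
      for _ in range(2000):
          k1 = rng.uniform(-1, 1, 2)
          th = rng.uniform(0, 2*np.pi)
          e = np.array([np.cos(th), np.sin(th)])
          g = lambda s: omega(k1) + omega(s*e) - omega(k1 + s*e)
          s_grid = np.linspace(1e-3, 3.0, 200)
          vals = np.array([g(s) for s in s_grid])
          for i in np.nonzero(np.sign(vals[:-1]) != np.sign(vals[1:]))[0]:
              a, b = s_grid[i], s_grid[i + 1]
              for _ in range(60):                      # bisect to the resonance
                  m = 0.5 * (a + b)
                  if np.sign(g(a)) != np.sign(g(m)):
                      b = m
                  else:
                      a = m
              k2 = 0.5 * (a + b) * e
              combos.add("".join(sorted(region(k) for k in (k1, k2, k1 + k2))))

      # Per the theorem above, only 'MMZ' and 'ZZZ' should appear.
      print("observed triad types:", sorted(combos))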

  11. Moon-based Earth Observation for Large Scale Geoscience Phenomena

    NASA Astrophysics Data System (ADS)

    Guo, Huadong; Liu, Guang; Ding, Yixing

    2016-07-01

    The capability of Earth observation for large, global-scale natural phenomena needs to be improved, and new observing platforms are expected. We have studied the concept of the Moon as an Earth observation platform in recent years. Compared with man-made satellite platforms, Moon-based Earth observation can obtain multi-spherical, full-band, active and passive information, and offers the following advantages: a large observation range, variable view angles, long-term continuous observation and an extra-long life cycle, with the characteristics of longevity, consistency, integrity, stability and uniqueness. Moon-based Earth observation is suitable for monitoring large-scale geoscience phenomena, including large-scale atmospheric change, ocean change, land-surface dynamics and solid-Earth dynamics. For the purpose of establishing a Moon-based Earth observation platform, we plan to study the following five aspects: mechanisms and models of Moon-based observation of macroscopic Earth-science phenomena; sensor parameter optimization and methods for Moon-based Earth observation; site selection and the environment of Moon-based Earth observation; the Moon-based Earth observation platform itself; and a fundamental scientific framework for Moon-based Earth observation.

  13. Resilience of Florida Keys coral communities following large scale disturbances

    EPA Science Inventory

    The decline of coral reefs in the Caribbean over the last 40 years has been attributed to multiple chronic stressors and episodic large-scale disturbances. This study assessed the resilience of coral communities in two different regions of the Florida Keys reef system between 199...

  14. Lessons from Large-Scale Renewable Energy Integration Studies: Preprint

    SciTech Connect

    Bird, L.; Milligan, M.

    2012-06-01

    In general, large-scale integration studies in Europe and the United States find that high penetrations of renewable generation are technically feasible with operational changes and increased access to transmission. This paper describes other key findings such as the need for fast markets, large balancing areas, system flexibility, and the use of advanced forecasting.

  15. Large-Scale Networked Virtual Environments: Architecture and Applications

    ERIC Educational Resources Information Center

    Lamotte, Wim; Quax, Peter; Flerackers, Eddy

    2008-01-01

    Purpose: Scalability is an important research topic in the context of networked virtual environments (NVEs). This paper aims to describe the ALVIC (Architecture for Large-scale Virtual Interactive Communities) approach to NVE scalability. Design/methodology/approach: The setup and results from two case studies are shown: a 3-D learning environment…

  16. Large-scale data analysis using the Wigner function

    NASA Astrophysics Data System (ADS)

    Earnshaw, R. A.; Lei, C.; Li, J.; Mugassabi, S.; Vourdas, A.

    2012-04-01

    Large-scale data are analysed using the Wigner function. It is shown that the 'frequency variable' provides important information, which is lost with other techniques. The method is applied to 'sentiment analysis' in data from social networks and also to financial data.
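
    As an illustration of the 'frequency variable' the abstract refers to, here is a minimal discrete Wigner-Ville sketch in Python. The chirp test signal and sizes are assumptions; the paper's application to sentiment and financial data is not reproduced.

```python
import numpy as np

def wigner_ville(s):
    # Discrete Wigner-Ville distribution: for each time index n, form
    # the instantaneous autocorrelation r[tau] = s[n+tau]*conj(s[n-tau])
    # and Fourier transform over the lag tau.
    s = np.asarray(s, dtype=complex)
    N = len(s)
    W = np.zeros((N, N))
    for n in range(N):
        taumax = min(n, N - 1 - n)
        r = np.zeros(N, dtype=complex)
        for tau in range(-taumax, taumax + 1):
            r[tau % N] = s[n + tau] * np.conj(s[n - tau])
        W[:, n] = np.real(np.fft.fft(r))
    return W

# A chirp: the frequency axis of the Wigner function tracks the
# linearly increasing instantaneous frequency, information a plain
# time-domain summary would miss.
t = np.linspace(0, 1, 256)
sig = np.exp(2j * np.pi * (5 * t + 40 * t**2))
print(wigner_ville(sig).shape)
```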

  17. Ecosystem resilience despite large-scale altered hydro climatic conditions

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Climate change is predicted to increase both drought frequency and duration, and when coupled with substantial warming, will establish a new hydroclimatological paradigm for many regions. Large-scale, warm droughts have recently impacted North America, Africa, Europe, Amazonia, and Australia result...

  18. Large-scale societal changes and intentionality - an uneasy marriage.

    PubMed

    Bodor, Péter; Fokas, Nikos

    2014-08-01

    Our commentary focuses on juxtaposing the proposed science of intentional change with facts and concepts pertaining to the level of large populations or changes on a worldwide scale. Although we find a unified evolutionary theory promising, we think that long-term and large-scale, scientifically guided - that is, intentional - social change is not only impossible, but also undesirable. PMID:25162863

  19. Implicit solution of large-scale radiation diffusion problems

    SciTech Connect

    Brown, P N; Graziani, F; Otero, I; Woodward, C S

    2001-01-04

    In this paper, we present an efficient solution approach for fully implicit, large-scale, nonlinear radiation diffusion problems. The fully implicit approach is compared to a semi-implicit solution method. Accuracy and efficiency are shown to be better for the fully implicit method on both one- and three-dimensional problems with tabular opacities taken from the LEOS opacity library.
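
    A minimal sketch of what 'fully implicit' means here, assuming a toy 1D nonlinear diffusion problem with a power-law coefficient in place of the tabular LEOS opacities. The grid, time step and D(u) are all assumptions for illustration.

```python
import numpy as np
from scipy.optimize import fsolve

# Backward-Euler (fully implicit) step for u_t = d/dx( D(u) du/dx ),
# with D(u) = u**3 standing in for a radiation-diffusion-like nonlinearity.
N, dx, dt = 50, 1.0 / 50, 1e-3

def D(u):
    return u**3

def residual(u_new, u_old):
    # Face-centered fluxes; end faces stay zero (zero-flux boundaries).
    flux = np.zeros(N + 1)
    Dm = D(0.5 * (u_new[1:] + u_new[:-1]))
    flux[1:-1] = Dm * (u_new[1:] - u_new[:-1]) / dx
    return (u_new - u_old) / dt - np.diff(flux) / dx

u = np.where(np.arange(N) < 5, 1.0, 0.1)   # hot slab on the left
for _ in range(10):
    # The nonlinear system at each step is handed to a Newton-like
    # solver; the paper studies exactly this kind of implicit solve.
    u = fsolve(residual, u, args=(u,))
print(u[:10])
```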

  20. Mixing Metaphors: Building Infrastructure for Large Scale School Turnaround

    ERIC Educational Resources Information Center

    Peurach, Donald J.; Neumerski, Christine M.

    2015-01-01

    The purpose of this analysis is to increase understanding of the possibilities and challenges of building educational infrastructure--the basic, foundational structures, systems, and resources--to support large-scale school turnaround. Building educational infrastructure often exceeds the capacity of schools, districts, and state education…

  1. Simulation and Analysis of Large-Scale Compton Imaging Detectors

    SciTech Connect

    Manini, H A; Lange, D J; Wright, D M

    2006-12-27

    We perform simulations of two types of large-scale Compton imaging detectors. The first type uses silicon and germanium detector crystals, and the second type uses silicon and CdZnTe (CZT) detector crystals. The simulations use realistic detector geometry and parameters. We analyze the performance of each type of detector, and we present results using receiver operating characteristics (ROC) curves.

  2. US National Large-scale City Orthoimage Standard Initiative

    USGS Publications Warehouse

    Zhou, G.; Song, C.; Benjamin, S.; Schickler, W.

    2003-01-01

    The early procedures and algorithms for national digital orthophoto generation in the National Digital Orthophoto Program (NDOP) were based on earlier USGS mapping operations, such as field control, aerotriangulation (derived in the early 1920s), quarter-quadrangle-centered extents (3.75 minutes of longitude and latitude), 1:40,000-scale aerial photographs, and 2.5-D digital elevation models. However, large-scale city orthophotos produced with these early procedures have revealed many shortcomings, e.g., ghost images, occlusions and shadows. Providing the technical base (algorithms and procedures) and experience needed for large-scale city digital orthophoto creation is therefore essential for the forthcoming national deployment of large-scale digital orthophotos and for the revision of the Standards for National Large-scale City Digital Orthophotos within the NDOP. This paper reports our initial research results, as follows: (1) high-precision 3D city DSM generation through LIDAR data processing; (2) extraction of spatial objects/features from surface-material information and high-accuracy 3D DSM data; (3) 3D city model development; (4) algorithm development for the generation of DTM-based and DBM-based orthophotos; (5) true orthophoto generation by merging DBM-based and DTM-based orthophotos; and (6) automatic mosaicking by optimizing and combining imagery from many perspectives.

  3. CACHE Guidelines for Large-Scale Computer Programs.

    ERIC Educational Resources Information Center

    National Academy of Engineering, Washington, DC. Commission on Education.

    The Computer Aids for Chemical Engineering Education (CACHE) guidelines identify desirable features of large-scale computer programs including running cost and running-time limit. Also discussed are programming standards, documentation, program installation, system requirements, program testing, and program distribution. Lists of types of…

  4. Over-driven control for large-scale MR dampers

    NASA Astrophysics Data System (ADS)

    Friedman, A. J.; Dyke, S. J.; Phillips, B. M.

    2013-04-01

    As semi-active electro-mechanical control devices increase in scale for use in real-world civil engineering applications, their dynamics become increasingly complicated. Control designs that are able to take these characteristics into account will be more effective in achieving good performance. Large-scale magnetorheological (MR) dampers exhibit a significant time lag in their force-response to voltage inputs, reducing the efficacy of typical controllers designed for smaller scale devices where the lag is negligible. A new control algorithm is presented for large-scale MR devices that uses over-driving and back-driving of the commands to overcome the challenges associated with the dynamics of these large-scale MR dampers. An illustrative numerical example is considered to demonstrate the controller performance. Via simulations of the structure using several seismic ground motions, the merits of the proposed control strategy to achieve reductions in various response parameters are examined and compared against several accepted control algorithms. Experimental evidence is provided to validate the improved capabilities of the proposed controller in achieving the desired control force levels. Through real-time hybrid simulation (RTHS), the proposed controllers are also examined and experimentally evaluated in terms of their efficacy and robust performance. The results demonstrate that the proposed control strategy has superior performance over typical control algorithms when paired with a large-scale MR damper, and is robust for structural control applications.
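
    The paper's algorithm is not reproduced here; the following Python sketch only illustrates the general idea of over-driving a command to overcome a first-order force-response lag. The lag model, gains and saturation limit are assumptions for illustration.

```python
import numpy as np

# Assumed first-order force-response lag: f_dot = (gain*v - f)/T,
# a simplified stand-in for a large-scale MR damper's dynamics.
T, dt, GAIN, V_MAX = 0.10, 0.001, 100.0, 10.0

def overdriven_voltage(f_des, f_meas, k_od=5.0):
    # Over-drive: command more than the steady-state voltage in
    # proportion to the force error, then saturate; back-driving is
    # the negative branch of the same rule.
    v_ss = f_des / GAIN
    v = v_ss + k_od * (f_des - f_meas) / GAIN
    return np.clip(v, 0.0, V_MAX)

f, hist = 0.0, []
for _ in range(500):
    f_des = 400.0                     # step command in newtons
    v = overdriven_voltage(f_des, f)
    f += dt * (GAIN * v - f) / T      # lagged device response
    hist.append(f)
print(hist[99], hist[-1])             # fast rise toward 400 N
```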

  5. Large-Scale Innovation and Change in UK Higher Education

    ERIC Educational Resources Information Center

    Brown, Stephen

    2013-01-01

    This paper reflects on challenges universities face as they respond to change. It reviews current theories and models of change management, discusses why universities are particularly difficult environments in which to achieve large scale, lasting change and reports on a recent attempt by the UK JISC to enable a range of UK universities to employ…

  6. Efficient On-Demand Operations in Large-Scale Infrastructures

    ERIC Educational Resources Information Center

    Ko, Steven Y.

    2009-01-01

    In large-scale distributed infrastructures such as clouds, Grids, peer-to-peer systems, and wide-area testbeds, users and administrators typically desire to perform "on-demand operations" that deal with the most up-to-date state of the infrastructure. However, the scale and dynamism present in the operating environment make it challenging to…

  7. Assuring Quality in Large-Scale Online Course Development

    ERIC Educational Resources Information Center

    Parscal, Tina; Riemer, Deborah

    2010-01-01

    Student demand for online education requires colleges and universities to rapidly expand the number of courses and programs offered online while maintaining high quality. This paper outlines two universities respective processes to assure quality in large-scale online programs that integrate instructional design, eBook custom publishing, Quality…

  8. Cosmic strings and the large-scale structure

    NASA Technical Reports Server (NTRS)

    Stebbins, Albert

    1988-01-01

    A possible problem for cosmic string models of galaxy formation is presented. If very large voids are common and if loop fragmentation is not much more efficient than presently believed, then it may be impossible for string scenarios to produce the observed large-scale structure with Omega sub 0 = 1 and without strong environmental biasing.

  9. Extracting Useful Semantic Information from Large Scale Corpora of Text

    ERIC Educational Resources Information Center

    Mendoza, Ray Padilla, Jr.

    2012-01-01

    Extracting and representing semantic information from large scale corpora is at the crux of computer-assisted knowledge generation. Semantic information depends on collocation extraction methods, mathematical models used to represent distributional information, and weighting functions which transform the space. This dissertation provides a…

  10. Improving the Utility of Large-Scale Assessments in Canada

    ERIC Educational Resources Information Center

    Rogers, W. Todd

    2014-01-01

    Principals and teachers do not use large-scale assessment results because the lack of distinct and reliable subtests prevents identifying strengths and weaknesses of students and instruction, the results arrive too late to be used, and principals and teachers need assistance to use the results to improve instruction so as to improve student…

  11. The National Adolescent Student Health Survey. A Report on the Health of America's Youth.

    ERIC Educational Resources Information Center

    American School Health Association, Kent, OH.

    The National Adolescent Student Health Survey (NASHS) was designed to assess students' health-related knowledge, attitudes, and behaviors in eight areas of critical importance to the health of youth. Two grade levels, eighth and tenth, were chosen to be the focus of the study. The survey provides a national profile of students at these two grade…

  12. Effects of large-scale environment on the assembly history of central galaxies

    SciTech Connect

    Jung, Intae; Lee, Jaehyun; Yi, Sukyoung K.

    2014-10-10

    We examine whether large-scale environment affects the mass assembly history of central galaxies. To facilitate this, we constructed dark matter halo merger trees from a cosmological N-body simulation and calculated the formation and evolution of galaxies using a semi-analytic method. We confirm earlier results that smaller halos show a notable difference in formation time with a mild dependence on large-scale environment. However, using a semi-analytic model, we found that on average the growth rate of the stellar mass of central galaxies is largely insensitive to large-scale environment. Although our results show that the star formation rate (SFR) and the stellar mass of central galaxies in smaller halos are slightly affected by the assembly bias of halos, those galaxies are faint and the difference in the SFR is minute, therefore it is challenging to detect it in real galaxies given the current observational accuracy. Future galaxy surveys, such as the BigBOSS experiment and the Large Synoptic Survey Telescope, which are expected to provide observational data for fainter objects, will provide a chance to test our model predictions.

  13. Gamma-ray bursts as a probe of large-scale structure in the universe

    NASA Technical Reports Server (NTRS)

    Lamb, D. Q.; Quashnock, Jean M.

    1993-01-01

    If gamma-ray bursts are cosmological in origin, the sources of the bursts are expected to trace the large-scale structure of luminous matter in the universe. We show that, if this is so and if the Burst and Transient Source Experiment yields the locations of approximately greater than 3000 gamma-ray bursts, it may be possible to use them to probe the structure of luminous matter on the largest scales known, consistent with recent determinations from pencil beam surveys and studies of superclusters. A positive result would provide compelling evidence that most gamma-ray bursts are cosmological in origin and would allow comparison between the distributions of luminous matter and dark matter on large scales. Conversely, a negative result might cast doubt on the cosmological origin of the bursts, provide evidence that the clustering of burst sources on large scales is less than that expected from pencil beam surveys and studies of superclusters, or indicate that gamma-ray bursts have some more exotic origin.

  14. ADHD and Health Services Utilization in the National Health Interview Survey

    ERIC Educational Resources Information Center

    Cuffe, Steven P.; Moore, Charity G.; McKeown, Robert

    2009-01-01

    Objective: Describe the general health, comorbidities and health service use among U.S. children with ADHD. Method: The 2001 National Health Interview Survey (NHIS) contained the Strengths and Difficulties Questionnaire (SDQ; used to determine probable ADHD), data on medical problems, overall health, and health care utilization. Results: Asthma…

  15. [Population surveys as management tools and health care models].

    PubMed

    Andrade, Flávia Reis de; Narvai, Paulo Capel

    2013-12-01

    The article briefly systematizes health care models, emphasizes the role of population surveys as a management tool and analyzes the specific case of the Brazilian Oral Health Survey (SBBrasil 2010) and its contribution to the consolidation of health care models consistent with the principles of the Sistema Único de Saúde (SUS, Public Health Care System). While in legal terms the SUS corresponds to a health care model, in the actual practice of public policy planning and health action the system gives rise to a care model that is the result not of legal texts or theoretical formulations, but rather of the praxis of the personnel involved. Bearing in mind that the management of day-to-day health affairs is a privileged space for the production and consolidation of health care models, it is necessary to stimulate and support the development of technical and operational skills which are different from those required for the management of care related to individual demands.

  16. The Use of Weighted Graphs for Large-Scale Genome Analysis

    PubMed Central

    Zhou, Fang; Toivonen, Hannu; King, Ross D.

    2014-01-01

    There is an acute need for better tools to extract knowledge from the growing flood of sequence data. For example, thousands of complete genomes have been sequenced, and their metabolic networks inferred. Such data should enable a better understanding of evolution. However, most existing network analysis methods are based on pair-wise comparisons, and these do not scale to thousands of genomes. Here we propose the use of weighted graphs as a data structure to enable large-scale phylogenetic analysis of networks. We have developed three types of weighted graph for enzymes: taxonomic (these summarize phylogenetic importance), isoenzymatic (these summarize enzymatic variety/redundancy), and sequence-similarity (these summarize sequence conservation); and we applied these types of weighted graph to survey prokaryotic metabolism. To demonstrate the utility of this approach we have compared and contrasted the large-scale evolution of metabolism in Archaea and Eubacteria. Our results provide evidence for limits to the contingency of evolution. PMID:24619061
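
    As a toy illustration of the weighted-graph idea, the sketch below builds a 'taxonomic'-style graph in which each edge is weighted by the fraction of genomes containing it, so a single graph summarizes many networks. The genome data and the exact weighting rule are assumptions for illustration, not the paper's construction.

```python
from collections import defaultdict

# Each genome's metabolic network is a set of enzyme-enzyme edges;
# the summary graph weights an edge by how many genomes contain it.
genomes = {
    "g1": {("E1", "E2"), ("E2", "E3")},
    "g2": {("E1", "E2")},
    "g3": {("E1", "E2"), ("E2", "E4")},
}

edge_weight = defaultdict(float)
for net in genomes.values():
    for edge in net:
        edge_weight[edge] += 1.0 / len(genomes)

# ('E1','E2') carries weight 1.0 (universal); rarer edges carry less,
# so one weighted graph replaces pairwise comparisons across genomes.
print(dict(edge_weight))
```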

  17. Resurrecting hot dark matter - Large-scale structure from cosmic strings and massive neutrinos

    NASA Technical Reports Server (NTRS)

    Scherrer, Robert J.

    1988-01-01

    These are the results of a numerical simulation of the formation of large-scale structure from cosmic-string loops in a universe dominated by massive neutrinos (hot dark matter). This model has several desirable features. The final matter distribution contains isolated density peaks embedded in a smooth background, producing a natural bias in the distribution of luminous matter. Because baryons can accrete onto the cosmic strings before the neutrinos, the galaxies will have baryon cores and dark neutrino halos. Galaxy formation in this model begins much earlier than in random-phase models. On large scales the distribution of clustered matter visually resembles the CfA survey, with large voids and filaments.

  18. Discussion on Investigation Methods for Large-scale Landslide Disasters on Taiwan

    NASA Astrophysics Data System (ADS)

    Chen, L. K.; Liu, C. H.; Lin, S. C.; Wu, T. Y.

    2012-04-01

    The catastrophic landslide disaster in Siao-Lin village induced by Typhoon Morakot drew attention to disaster management and prevention for large-scale landslide events, and prompted the authors to review current work on large-scale landslide susceptibility, especially the investigation methods. First, the authors examined the investigation methods for landslide susceptibility used in Taiwan, Hong Kong, and Japan. Areas of high susceptibility were identified from the current investigation data and classified at different scales: the basin scale, the sub-catchment scale, and the hillslope scale. The results establish classification principles for major landslide-susceptibility areas based on overlaid information on landslide frequency and vulnerable residents at these scales. Following this process and comparing against historical landslide disasters, 10 major basins were identified in this study. The methods can serve as good material for the slopeland authorities when a quick survey of landslide-susceptibility areas is necessary.

  19. SPIN ALIGNMENTS OF SPIRAL GALAXIES WITHIN THE LARGE-SCALE STRUCTURE FROM SDSS DR7

    SciTech Connect

    Zhang, Youcai; Yang, Xiaohu; Luo, Wentao; Wang, Huiyuan; Wang, Lei; Mo, H. J.; Van den Bosch, Frank C. E-mail: xyang@sjtu.edu.cn

    2015-01-01

    Using a sample of spiral galaxies selected from the Sloan Digital Sky Survey Data Release 7 and Galaxy Zoo 2, we investigate the alignment of spin axes of spiral galaxies with their surrounding large-scale structure, which is characterized by the large-scale tidal field reconstructed from the data using galaxy groups above a certain mass threshold. We find that the spin axes only have weak tendencies to be aligned with (or perpendicular to) the intermediate (or minor) axis of the local tidal tensor. The signal is the strongest in a cluster environment where all three eigenvalues of the local tidal tensor are positive. Compared to the alignments between halo spins and the local tidal field obtained in N-body simulations, the above observational results are in best agreement with those for the spins of inner regions of halos, suggesting that the disk material traces the angular momentum of dark matter halos in the inner regions.

  20. Challenges and Innovations in Surveying the Governmental Public Health Workforce

    PubMed Central

    Shah, Gulzar; Rider, Nikki; Beck, Angela; Castrucci, Brian C.; Harris, Jenine K.; Sellers, Katie; Varda, Danielle; Ye, Jiali; Erwin, Paul C.; Brownson, Ross C.

    2016-01-01

    Surveying governmental public health practitioners is a critical means of collecting data about public health organizations, their staff, and their partners. A greater focus on evidence-based practices, practice-based systems research, and evaluation has resulted in practitioners consistently receiving requests to participate in myriad surveys. This can result in a substantial survey burden for practitioners and declining response rates for researchers. This is potentially damaging to practitioners and researchers as well as the field of public health more broadly. We have examined recent developments in survey research, especially issues highly relevant for public health practice. We have also proposed a process by which researchers can engage with practitioners and practitioner groups on research questions of mutual interest. PMID:27715307

  1. Survey explores nurses' views of e-health tools.

    PubMed

    Wallis, Alison

    2012-03-01

    E-health is concerned with promoting the health and wellbeing of individuals, families and communities, and improving professional practice through the use of information management and information and communication technology. In autumn 2010 the RCN, supported by an information technology consultancy, carried out a survey of members' views on e-health to assess their involvement in, and readiness for, e-health developments and their knowledge of its benefits. A total of 1,313 nurses, midwives, healthcare support workers and pre-registration students from across the UK responded. This article describes ways in which nurse managers can influence the successful implementation of the survey recommendations.

  2. "The Health Educator" Readership Survey, 2011: Reporting the Results

    ERIC Educational Resources Information Center

    Bliss, Kadi; Ogletree, Roberta J.; Liefer, Maureen

    2011-01-01

    Readership surveys can help editors assess satisfaction with a journal as well as identify potential modifications to be made. The editorial staff of "The Health Educator" conducted an online readership survey in the summer of 2011. After a five-week data solicitation and collection period, a total of 504 Eta Sigma Gamma (ESG) members responded.…

  3. French Frigate Shoals reef health survey

    USGS Publications Warehouse

    Work, Thierry M.; Coles, Steve L.; Rameyer, Robert

    2002-01-01

    French Frigate Shoals consists of a large (31 nm) fringing reef partially enclosing a lagoon. A basalt pinnacle (La Perouse Pinnacle) rises approximately halfway between the two ends of the arcs of the fringing reef. Tern Island is situated at the northern end of the lagoon and is surrounded by a dredged ship channel. The lagoon becomes progressively shallower from west to east and harbors a variety of marine life including corals, fish, marine mammals, and sea turtles (Amerson 1971). In 2000, an interagency survey of the northwestern Hawaiian Islands was conducted to document the fauna and flora of FFS (Maragos and Gulko, 2002). During that survey, 38 stations were examined and 41 species of stony corals were documented, the most of any of the NW Hawaiian Islands (Maragos and Gulko 2002). At some of these stations, corals with abnormalities were observed. The present study aimed to expand on the 2000 survey by evaluating the lesions in the areas where they were documented.

  4. Real-time simulation of large-scale floods

    NASA Astrophysics Data System (ADS)

    Liu, Q.; Qin, Y.; Li, G. D.; Liu, Z.; Cheng, D. J.; Zhao, Y. H.

    2016-08-01

    Given the complexity of real-time water situations, the real-time simulation of large-scale floods is very important for flood-prevention practice. Model robustness and running efficiency are two critical factors in successful real-time flood simulation. This paper proposes a robust, two-dimensional shallow water model based on the unstructured Godunov-type finite volume method. A robust wet/dry front method is used to enhance numerical stability, and an adaptive method is proposed to improve running efficiency. The proposed model is used for large-scale flood simulation on real topography. Results compared to those of MIKE21 show the strong performance of the proposed model.
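
    A much-simplified illustration of the ingredients named above, assuming a 1D shallow-water system with a Rusanov flux and a simple dry-cell threshold. The paper's model is 2D, unstructured and Godunov-type; all parameters here are assumptions.

```python
import numpy as np

g, DRY = 9.81, 1e-6   # gravity; wet/dry depth threshold

def flux(h, hu):
    # Physical flux of the 1D shallow-water equations, guarded for dry cells.
    u = np.where(h > DRY, hu / np.maximum(h, DRY), 0.0)
    return np.array([hu, hu * u + 0.5 * g * h**2])

def step(h, hu, dx, dt):
    U = np.array([h, hu])
    FL, FR = flux(h[:-1], hu[:-1]), flux(h[1:], hu[1:])
    # Rusanov (local Lax-Friedrichs) interface flux with wave speed |u|+sqrt(gh).
    c = np.maximum(np.abs(hu[:-1]) / np.maximum(h[:-1], DRY) + np.sqrt(g * h[:-1]),
                   np.abs(hu[1:]) / np.maximum(h[1:], DRY) + np.sqrt(g * h[1:]))
    F = 0.5 * (FL + FR) - 0.5 * c * (U[:, 1:] - U[:, :-1])
    h[1:-1] -= dt / dx * (F[0, 1:] - F[0, :-1])
    hu[1:-1] -= dt / dx * (F[1, 1:] - F[1, :-1])
    h[h < DRY] = 0.0
    hu[h < DRY] = 0.0      # dry cells carry no momentum
    return h, hu

h = np.where(np.arange(200) < 100, 2.0, 0.0)   # dam break onto a dry bed
hu = np.zeros(200)
for _ in range(100):
    h, hu = step(h, hu, dx=1.0, dt=0.05)
print(h.max(), h[150])
```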

  5. Prototype Vector Machine for Large Scale Semi-Supervised Learning

    SciTech Connect

    Zhang, Kai; Kwok, James T.; Parvin, Bahram

    2009-04-29

    Practical data mining rarely falls exactly into the supervised learning scenario. Rather, the growing amount of unlabeled data poses a big challenge to large-scale semi-supervised learning (SSL). We note that the computational intensiveness of graph-based SSL arises largely from the manifold or graph regularization, which in turn leads to large models that are difficult to handle. To alleviate this, we propose the prototype vector machine (PVM), a highly scalable, graph-based algorithm for large-scale SSL. Our key innovation is the use of 'prototype vectors' for efficient approximation of both the graph-based regularizer and the model representation. The choice of prototypes is grounded upon two important criteria: they not only perform an effective low-rank approximation of the kernel matrix, but also span a model suffering the minimum information loss compared with the complete model. We demonstrate encouraging performance and appealing scaling properties of the PVM on a number of machine learning benchmark data sets.
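
    The PVM construction itself is in the paper; the sketch below only illustrates why prototypes help, via a Nyström-style low-rank approximation of an RBF kernel matrix. Data sizes, kernel and prototype count are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def rbf(A, B, gamma=0.5):
    # Gaussian (RBF) kernel matrix between rows of A and rows of B.
    d2 = ((A[:, None, :] - B[None, :, :])**2).sum(-1)
    return np.exp(-gamma * d2)

X = rng.standard_normal((2000, 5))            # labeled + unlabeled points
P = X[rng.choice(len(X), 50, replace=False)]  # 50 prototype vectors

# Nystrom-style approximation K ~ K_xp @ pinv(K_pp) @ K_xp.T: the full
# n x n kernel/graph matrix is never formed, which is the source of
# the scalability that prototype methods aim for.
K_xp = rbf(X, P)                              # n x m
K_pp = rbf(P, P)                              # m x m
approx_diag = np.einsum('ij,jk,ik->i', K_xp, np.linalg.pinv(K_pp), K_xp)
print(float(approx_diag.mean()))              # close to the exact K_ii = 1
```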

  6. The Large Scale Synthesis of Aligned Plate Nanostructures

    PubMed Central

    Zhou, Yang; Nash, Philip; Liu, Tian; Zhao, Naiqin; Zhu, Shengli

    2016-01-01

    We propose a novel technique for the large-scale synthesis of aligned-plate nanostructures that are self-assembled and self-supporting. The synthesis technique involves developing nanoscale two-phase microstructures through discontinuous precipitation followed by selective etching to remove one of the phases. The method may be applied to any alloy system in which the discontinuous precipitation transformation goes to completion. The resulting structure may have many applications in catalysis, filtering and thermal management depending on the phase selection and added functionality through chemical reaction with the retained phase. The synthesis technique is demonstrated using the discontinuous precipitation of a γ′ phase, (Ni, Co)3Al, followed by selective dissolution of the γ matrix phase. The production of the nanostructure requires heat treatments on the order of minutes and can be performed on a large scale making this synthesis technique of great economic potential. PMID:27439672

  8. Electron drift in a large scale solid xenon

    SciTech Connect

    Yoo, J.; Jaskierny, W. F.

    2015-08-21

    A study of charge drift in large-scale, optically transparent solid xenon is reported. A pulsed high-power xenon light source is used to liberate electrons from a photocathode. The drift speeds of the electrons are measured using an 8.7 cm long electrode in both the liquid and solid phases of xenon. In the liquid phase (163 K), the drift speed is 0.193 ± 0.003 cm/μs, while the drift speed in the solid phase (157 K) is 0.397 ± 0.006 cm/μs at 900 V/cm over 8.0 cm of uniform electric field. It is thus demonstrated that the electron drift speed in large-scale solid xenon is a factor of two faster than that in the liquid.

  9. Large scale meteorological influence during the Geysers 1979 field experiment

    SciTech Connect

    Barr, S.

    1980-01-01

    A series of meteorological field measurements conducted during July 1979 near Cobb Mountain in Northern California reveals evidence of several scales of atmospheric circulation consistent with the climatic pattern of the area. The scales of influence are reflected in the structure of wind and temperature in vertically stratified layers at a given observation site. Large scale synoptic gradient flow dominates the wind field above about twice the height of the topographic ridge. Below that there is a mixture of effects with evidence of a diurnal sea breeze influence and a sublayer of katabatic winds. The July observations demonstrate that weak migratory circulations in the large scale synoptic meteorological pattern have a significant influence on the day-to-day gradient winds and must be accounted for in planning meteorological programs including tracer experiments.

  10. GAIA: A WINDOW TO LARGE-SCALE MOTIONS

    SciTech Connect

    Nusser, Adi; Branchini, Enzo; Davis, Marc E-mail: branchin@fis.uniroma3.it

    2012-08-10

    Using redshifts as a proxy for galaxy distances, estimates of the two-dimensional (2D) transverse peculiar velocities of distant galaxies could be obtained from future measurements of proper motions. We provide the mathematical framework for analyzing 2D transverse motions and show that they offer several advantages over traditional probes of large-scale motions. They are completely independent of any intrinsic relations between galaxy properties; hence, they are essentially free of selection biases. They are free from homogeneous and inhomogeneous Malmquist biases that typically plague distance indicator catalogs. They provide additional information to traditional probes that yield line-of-sight peculiar velocities only. Further, because of their 2D nature, fundamental questions regarding vorticity of large-scale flows can be addressed. Gaia, for example, is expected to provide proper motions of at least bright galaxies with high central surface brightness, making proper motions a likely contender for traditional probes based on current and future distance indicator measurements.
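
    A worked example of the scale of such measurements, using the standard conversion between proper motion and transverse velocity; the example numbers are assumptions.

```python
def transverse_velocity_km_s(mu_uas_per_yr, distance_mpc):
    # Standard conversion: v_t [km/s] = 4.74 * mu [arcsec/yr] * d [pc].
    # The micro/mega prefixes cancel, so mu in microarcsec/yr with d in
    # Mpc uses the same constant 4.74.
    return 4.74 * mu_uas_per_yr * distance_mpc

# A 600 km/s transverse peculiar velocity at 10 Mpc corresponds to a
# proper motion of roughly 12.7 microarcsec/yr -- illustrating why only
# nearby, high-surface-brightness galaxies are within reach.
print(600 / (4.74 * 10))   # ~12.66 microarcsec/yr
```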

  11. Lagrangian space consistency relation for large scale structure

    SciTech Connect

    Horn, Bart; Hui, Lam; Xiao, Xiao E-mail: lh399@columbia.edu

    2015-09-01

    Consistency relations, which relate the squeezed limit of an (N+1)-point correlation function to an N-point function, are non-perturbative symmetry statements that hold even if the associated high momentum modes are deep in the nonlinear regime and astrophysically complex. Recently, Kehagias and Riotto and Peloso and Pietroni discovered a consistency relation applicable to large scale structure. We show that this can be recast into a simple physical statement in Lagrangian space: that the squeezed correlation function (suitably normalized) vanishes. This holds regardless of whether the correlation observables are at the same time or not, and regardless of whether multiple-streaming is present. The simplicity of this statement suggests that an analytic understanding of large scale structure in the nonlinear regime may be particularly promising in Lagrangian space.

  12. The workshop on iterative methods for large scale nonlinear problems

    SciTech Connect

    Walker, H.F.; Pernice, M.

    1995-12-01

    The aim of the workshop was to bring together researchers working on large-scale applications with numerical specialists of various kinds. Applications that were addressed included reactive flows (combustion and other chemically reacting flows, tokamak modeling), porous media flows, cardiac modeling, chemical vapor deposition, image restoration, macromolecular modeling, and population dynamics. Numerical areas included Newton iterative (truncated Newton) methods, Krylov subspace methods, domain decomposition and other preconditioning methods, large-scale optimization and optimal control, and parallel implementations and software. This report offers a brief summary of workshop activities and information about the participants. Interested readers are encouraged to look into the online proceedings available at http://www.usi.utah.edu/logan.proceedings, where the material offered here is augmented with hypertext abstracts that include links to locations such as speakers' home pages, PostScript copies of talks and papers, cross-references to related talks, and other information about topics addressed at the workshop.

  14. Large Scale Deformation of the Western U.S. Cordillera

    NASA Technical Reports Server (NTRS)

    Bennett, Richard A.

    2002-01-01

    The overall objective of the work that was conducted was to understand the present-day large-scale deformations of the crust throughout the western United States and in so doing to improve our ability to assess the potential for seismic hazards in this region. To address this problem, we used a large collection of Global Positioning System (GPS) networks which spans the region to precisely quantify present-day large-scale crustal deformations in a single uniform reference frame. Our results can roughly be divided into an analysis of the GPS observations to infer the deformation field across and within the entire plate boundary zone and an investigation of the implications of this deformation field regarding plate boundary dynamics.

  15. Large Scale Deformation of the Western US Cordillera

    NASA Technical Reports Server (NTRS)

    Bennett, Richard A.

    2001-01-01

    Destructive earthquakes occur throughout the western US Cordillera (WUSC), not just within the San Andreas fault zone. But because we do not understand the present-day large-scale deformations of the crust throughout the WUSC, our ability to assess the potential for seismic hazards in this region remains severely limited. To address this problem, we are using a large collection of Global Positioning System (GPS) networks which spans the WUSC to precisely quantify present-day large-scale crustal deformations in a single uniform reference frame. Our work can roughly be divided into an analysis of the GPS observations to infer the deformation field across and within the entire plate boundary zone and an investigation of the implications of this deformation field regarding plate boundary dynamics.

  16. Startup of large-scale projects casts spotlight on IGCC

    SciTech Connect

    Swanekamp, R.

    1996-06-01

    With several large-scale plants cranking up this year, integrated coal gasification/combined cycle (IGCC) appears poised for growth. The technology may eventually help coal reclaim its former prominence in new plant construction, but developers worldwide are eyeing other feedstocks, such as petroleum coke or residual oil. Of the so-called advanced clean-coal technologies, IGCC appears to be having a defining year. Of three large-scale demonstration plants in the US, one is well into startup, a second is expected to begin operating in the fall, and a third should start up by the end of the year; worldwide, over a dozen more projects are in the works. In Italy, for example, several large projects using petroleum coke or refinery residues as feedstocks are proceeding, apparently on a project-finance basis.

  17. Considerations of large scale impact and the early Earth

    NASA Technical Reports Server (NTRS)

    Grieve, R. A. F.; Parmentier, E. M.

    1985-01-01

    Bodies which have preserved portions of their earliest crust indicate that large-scale impact cratering was an important process in early surface and upper-crustal evolution. Large impact basins form the basic topographic, tectonic, and stratigraphic framework of the Moon, and impact was responsible for the characteristics of its second-order gravity field and upper-crustal seismic properties. The Earth's crustal evolution during the first 800 My of its history is conjectural. The lack of a very early crust may indicate that thermal and mechanical instabilities resulting from intense mantle convection and/or bombardment inhibited crustal preservation. Whatever the case, the potential effects of large-scale impact have to be considered in models of early Earth evolution. Preliminary models of the evolution of a large terrestrial impact basin were derived and are discussed in detail.

  18. New Mexico Adolescent Health Risks Survey.

    ERIC Educational Resources Information Center

    Antle, David

    To inform students of health risks (posed by behavior, environment, and genetics) and provide schools with collective risk appraisal information as a basis for planning/evaluating health and wellness initiatives, New Mexico administered the Teen Wellness Check in 1985 to 1,573 ninth-grade students from 7 New Mexico public schools. Subjects were…

  19. Report of Mental Health Survey Team.

    ERIC Educational Resources Information Center

    Atcheson, J. D.; And Others

    Three psychiatrists and a consulting psychologist investigated mental health problems in the Yukon and Northwest Territories. Specific purposes of the investigation were (1) to comment on the adequacy of existing mental health services and facilities, (2) to make recommendations for improvement of consulting services and facilities, (3) to consult…

  20. The large-scale anisotropy with the PAMELA calorimeter

    NASA Astrophysics Data System (ADS)

    Karelin, A.; Adriani, O.; Barbarino, G.; Bazilevskaya, G.; Bellotti, R.; Boezio, M.; Bogomolov, E.; Bongi, M.; Bonvicini, V.; Bottai, S.; Bruno, A.; Cafagna, F.; Campana, D.; Carbone, R.; Carlson, P.; Casolino, M.; Castellini, G.; De Donato, C.; De Santis, C.; De Simone, N.; Di Felice, V.; Formato, V.; Galper, A.; Koldashov, S.; Koldobskiy, S.; Krut'kov, S.; Kvashnin, A.; Leonov, A.; Malakhov, V.; Marcelli, L.; Martucci, M.; Mayorov, A.; Menn, W.; Mergé, M.; Mikhailov, V.; Mocchiutti, E.; Monaco, A.; Mori, N.; Munini, R.; Osteria, G.; Palma, F.; Panico, B.; Papini, P.; Pearce, M.; Picozza, P.; Ricci, M.; Ricciarini, S.; Sarkar, R.; Simon, M.; Scotti, V.; Sparvoli, R.; Spillantini, P.; Stozhkov, Y.; Vacchi, A.; Vannuccini, E.; Vasilyev, G.; Voronov, S.; Yurkin, Y.; Zampa, G.; Zampa, N.

    2015-10-01

    The large-scale anisotropy (or the so-called star-diurnal wave) has been studied using the calorimeter of the space-borne experiment PAMELA. The cosmic-ray anisotropy has been obtained for the Southern and Northern hemispheres simultaneously in the equatorial coordinate system for the time period 2006-2014. The dipole amplitude and phase have been measured for energies of 1-20 TeV n⁻¹.
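
    A minimal sketch of how a dipole amplitude and phase can be extracted from a rate binned in right ascension, on synthetic data. The binning and noise level are assumptions; PAMELA's actual analysis is more involved.

```python
import numpy as np

rng = np.random.default_rng(2)

# Fit the first (dipole) harmonic r(alpha) = m + a*cos(alpha) + b*sin(alpha)
# to an event rate binned in right ascension; amplitude and phase follow
# from (a, b). Synthetic data with an injected ~1e-3 dipole at phase 0.8 rad.
alpha = np.linspace(0, 2 * np.pi, 24, endpoint=False)
rate = 1.0 + 1e-3 * np.cos(alpha - 0.8) + 1e-4 * rng.standard_normal(24)

# Fourier projections over uniform bins: a = A*cos(phi), b = A*sin(phi).
a = 2 * np.mean(rate * np.cos(alpha))
b = 2 * np.mean(rate * np.sin(alpha))
amplitude, phase = np.hypot(a, b), np.arctan2(b, a)
print(amplitude, phase)   # recovers ~1e-3 and ~0.8 rad
```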

  1. Report on large scale molten core/magnesia interaction test

    SciTech Connect

    Chu, T.Y.; Bentz, J.H.; Arellano, F.E.; Brockmann, J.E.; Field, M.E.; Fish, J.D.

    1984-08-01

    A molten core/material interaction experiment was performed at the Large-Scale Melt Facility at Sandia National Laboratories. The experiment involved the release of 230 kg of core melt, heated to 2923 K, into a magnesia brick crucible. Descriptions of the facility and the melting technology, as well as results of the experiment, are presented. Preliminary evaluations of the results indicate that magnesia brick can be a suitable material for core ladle construction.

  2. Analysis plan for 1985 large-scale tests. Technical report

    SciTech Connect

    McMullan, F.W.

    1983-01-01

    The purpose of this effort is to assist DNA in planning for large-scale (upwards of 5000 tons) detonations of conventional explosives in the 1985 and beyond time frame. Primary research objectives were to investigate potential means to increase blast duration and peak pressures. This report identifies and analyzes several candidate explosives. It examines several charge designs and identifies advantages and disadvantages of each. Other factors including terrain and multiburst techniques are addressed as are test site considerations.

  3. Simulating Weak Lensing by Large-Scale Structure

    NASA Astrophysics Data System (ADS)

    Vale, Chris; White, Martin

    2003-08-01

    We model weak gravitational lensing of light by large-scale structure using ray tracing through N-body simulations. The method is described with particular attention paid to numerical convergence. We investigate some of the key approximations in the multiplane ray-tracing algorithm. Our simulated shear and convergence maps are used to explore how well standard assumptions about weak lensing hold, especially near large peaks in the lensing signal.

  4. The Phoenix series large scale LNG pool fire experiments.

    SciTech Connect

    Simpson, Richard B.; Jensen, Richard Pearson; Demosthenous, Byron; Luketa, Anay Josephine; Ricks, Allen Joseph; Hightower, Marion Michael; Blanchat, Thomas K.; Helmick, Paul H.; Tieszen, Sheldon Robert; Deola, Regina Anne; Mercier, Jeffrey Alan; Suo-Anttila, Jill Marie; Miller, Timothy J.

    2010-12-01

    The increasing demand for natural gas could increase the number and frequency of Liquefied Natural Gas (LNG) tanker deliveries to ports across the United States. Because of the increasing number of shipments and the number of possible new facilities, concerns about the safety of the public and property in the event of accidental, and more importantly intentional, spills have increased. While improvements have been made over the past decade in assessing hazards from LNG spills, the existing experimental data are much smaller in size and scale than many postulated large accidental and intentional spills. Since the physics and hazards of a fire change with fire size, there are concerns about the adequacy of current hazard-prediction techniques for large LNG spills and fires. To address these concerns, Congress funded the Department of Energy (DOE) in 2008 to conduct a series of laboratory and large-scale LNG pool fire experiments at Sandia National Laboratories (Sandia) in Albuquerque, New Mexico. This report presents the test data and results of both sets of fire experiments. A series of five reduced-scale (gas burner) tests (yielding 27 sets of data) were conducted in 2007 and 2008 at Sandia's Thermal Test Complex (TTC) to assess flame-height to fire-diameter ratios as a function of nondimensional heat release rates for extrapolation to large-scale LNG fires. The large-scale LNG pool fire experiments were conducted in a 120 m diameter pond specially designed and constructed in Sandia's Area III large-scale test complex. Two fire tests of LNG spills of 21 and 81 m in diameter were conducted in 2009 to improve the understanding of flame height, smoke production, and burn rate, and therefore the physics and hazards of large LNG spills and fires.
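
    For orientation on the quantities mentioned, here is a sketch computing the nondimensional heat release rate Q* and a flame-height estimate from the classical Thomas pool-fire correlation. The correlation is our choice for illustration (the report derives its own flame-height relationship from the burner tests), and the LNG burning rate and heat of combustion below are assumed textbook values.

```python
import numpy as np

# Air properties (SI units, c_p in kJ/(kg K)) and assumed LNG values.
rho_a, c_p, T_a, g = 1.2, 1.005, 293.0, 9.81
m_dot = 0.14      # assumed mass burning rate, kg/(m^2 s)
dH = 50.0e3       # assumed heat of combustion, kJ/kg

def q_star(D):
    # Q* = Q / (rho_a * c_p * T_a * sqrt(g*D) * D^2), Q in kW.
    Q = m_dot * dH * np.pi * D**2 / 4.0
    return Q / (rho_a * c_p * T_a * np.sqrt(g * D) * D**2)

def flame_height_ratio(D):
    # Thomas correlation: L/D = 42 * (m'' / (rho_a * sqrt(g*D)))**0.61.
    return 42.0 * (m_dot / (rho_a * np.sqrt(g * D)))**0.61

for D in (21.0, 81.0):   # the two Phoenix pool diameters
    print(D, q_star(D), flame_height_ratio(D))
```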

  5. Large-Scale Optimization for Bayesian Inference in Complex Systems

    SciTech Connect

    Willcox, Karen; Marzouk, Youssef

    2013-11-12

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT--Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas--Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their
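
    A toy sketch of the 'reduce then sample' idea: Metropolis sampling against a cheap surrogate standing in for an expensive forward model. The models, prior, observation and proposal scale are all assumptions for illustration, not the project's solvers.

```python
import numpy as np

rng = np.random.default_rng(4)

def forward_expensive(x):
    # Pretend this forward model costs hours per evaluation.
    return np.sin(x) + 0.1 * x**2

def forward_reduced(x):
    # Cheap surrogate (reduced-order stand-in), accurate near x ~ 0.
    return 0.9 * x + 0.1 * x**2

y_obs, sigma = 0.45, 0.05   # assumed observation and noise level

def log_post(x, forward):
    # Gaussian likelihood plus a standard-normal prior on x.
    return -0.5 * ((forward(x) - y_obs) / sigma)**2 - 0.5 * x**2

x, chain = 0.0, []
for _ in range(5000):
    x_new = x + 0.3 * rng.standard_normal()
    # Every accept/reject decision uses only the cheap surrogate.
    if np.log(rng.random()) < log_post(x_new, forward_reduced) - log_post(x, forward_reduced):
        x = x_new
    chain.append(x)
print(np.mean(chain), np.std(chain))
```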

  6. The Large-scale Structure of Scientific Method

    NASA Astrophysics Data System (ADS)

    Kosso, Peter

    2009-01-01

    The standard textbook description of the nature of science describes the proposal, testing, and acceptance of a theoretical idea almost entirely in isolation from other theories. The resulting model of science is a kind of piecemeal empiricism that misses the important network structure of scientific knowledge. Only the large-scale description of scientific method can reveal the global interconnectedness of scientific knowledge that is an essential part of what makes science scientific.

  7. Space transportation booster engine thrust chamber technology, large scale injector

    NASA Technical Reports Server (NTRS)

    Schneider, J. A.

    1993-01-01

    The objective of the Large Scale Injector (LSI) program was to deliver a 21 inch diameter, 600,000 lbf thrust class injector to NASA/MSFC for hot fire testing. The hot fire test program would demonstrate the feasibility and integrity of the full scale injector, including combustion stability, chamber wall compatibility (thermal management), and injector performance. The 21 inch diameter injector was delivered in September of 1991.

  8. Large-Scale Weather Disturbances in Mars’ Southern Extratropics

    NASA Astrophysics Data System (ADS)

    Hollingsworth, Jeffery L.; Kahre, Melinda A.

    2015-11-01

    Between late autumn and early spring, Mars’ middle and high latitudes within its atmosphere support strong mean thermal gradients between the tropics and poles. Observations from both the Mars Global Surveyor (MGS) and Mars Reconnaissance Orbiter (MRO) indicate that this strong baroclinicity supports intense, large-scale eastward traveling weather systems (i.e., transient synoptic-period waves). These extratropical weather disturbances are key components of the global circulation. Such wave-like disturbances act as agents in the transport of heat and momentum, and generalized scalar/tracer quantities (e.g., atmospheric dust, water-vapor and ice clouds). The character of large-scale, traveling extratropical synoptic-period disturbances in Mars' southern hemisphere during late winter through early spring is investigated using a moderately high-resolution Mars global climate model (Mars GCM). This Mars GCM imposes interactively lifted and radiatively active dust based on a threshold value of the surface stress. The model exhibits a reasonable "dust cycle" (i.e., globally averaged, a dustier atmosphere during southern spring and summer occurs). Compared to their northern-hemisphere counterparts, southern synoptic-period weather disturbances and accompanying frontal waves have smaller meridional and zonal scales, and are far less intense. Influences of the zonally asymmetric (i.e., east-west varying) topography on southern large-scale weather are examined. Simulations that adapt Mars’ full topography compared to simulations that utilize synthetic topographies emulating key large-scale features of the southern middle latitudes indicate that Mars’ transient barotropic/baroclinic eddies are highly influenced by the great impact basins of this hemisphere (e.g., Argyre and Hellas). The occurrence of a southern storm zone in late winter and early spring appears to be anchored to the western hemisphere via orographic influences from the Tharsis highlands, and the Argyre

  9. Multivariate Clustering of Large-Scale Scientific Simulation Data

    SciTech Connect

    Eliassi-Rad, T; Critchlow, T

    2003-06-13

    Simulations of complex scientific phenomena involve the execution of massively parallel computer programs. These simulation programs generate large-scale data sets over the spatio-temporal space. Modeling such massive data sets is an essential step in helping scientists discover new information from their computer simulations. In this paper, we present a simple but effective multivariate clustering algorithm for large-scale scientific simulation data sets. Our algorithm utilizes the cosine similarity measure to cluster the field variables in a data set. Field variables include all variables except the spatial (x, y, z) and temporal (time) variables. The exclusion of the spatial dimensions is important since "similar" characteristics could be located (spatially) far from each other. To scale our multivariate clustering algorithm for large-scale data sets, we take advantage of the geometrical properties of the cosine similarity measure. This allows us to reduce the modeling time from O(n^2) to O(n x g(f(u))), where n is the number of data points, f(u) is a function of the user-defined clustering threshold, and g(f(u)) is the number of data points satisfying f(u). We show that on average g(f(u)) is much less than n. Finally, even though spatial variables do not play a role in building clusters, it is desirable to associate each cluster with its correct spatial region. To achieve this, we present a linking algorithm for connecting each cluster to the appropriate nodes of the data set's topology tree (where the spatial information of the data set is stored). Our experimental evaluations on two large-scale simulation data sets illustrate the value of our multivariate clustering and linking algorithms.
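
    A minimal sketch of threshold-based clustering with the cosine similarity measure, in the spirit of the algorithm described. The greedy single-pass scheme, the threshold and the toy data are our simplifications, not the paper's exact algorithm.

```python
import numpy as np

def cosine_cluster(V, threshold=0.99):
    # Greedy threshold clustering of field variables by cosine similarity:
    # each variable joins the first cluster whose representative it matches
    # above the threshold, otherwise it starts a new cluster. Spatial
    # (x, y, z) and time columns are assumed to be excluded already.
    reps, labels = [], []
    for v in V:
        v = v / np.linalg.norm(v)
        for i, r in enumerate(reps):
            if v @ r >= threshold:
                labels.append(i)
                break
        else:
            labels.append(len(reps))
            reps.append(v)
    return labels

rng = np.random.default_rng(3)
base = rng.standard_normal(1000)
V = np.stack([base + 0.01 * rng.standard_normal(1000),   # a field variable
              base + 0.01 * rng.standard_normal(1000),   # a correlated twin
              rng.standard_normal(1000)])                # an unrelated field
print(cosine_cluster(V))   # e.g. [0, 0, 1]
```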

  11. Large-scale Alfvén vortices

    SciTech Connect

    Onishchenko, O. G.; Horton, W.; Scullion, E.; Fedun, V.

    2015-12-15

    A new type of large-scale vortex structure of dispersionless Alfvén waves in collisionless plasma is investigated. It is shown that Alfvén waves can propagate in the form of Alfvén vortices of finite characteristic radius, characterised by magnetic flux ropes carrying orbital angular momentum. The structure of the toroidal and radial velocity, the fluid and magnetic field vorticity, and the longitudinal electric current in the plane orthogonal to the external magnetic field is discussed.

  12. Relic vector field and CMB large scale anomalies

    SciTech Connect

    Chen, Xingang; Wang, Yi E-mail: yw366@cam.ac.uk

    2014-10-01

    We study the most general effects of relic vector fields on the inflationary background and density perturbations. Such effects are observable if the number of inflationary e-folds is close to the minimum requirement to solve the horizon problem. We show that this can potentially explain two CMB large scale anomalies: the quadrupole-octopole alignment and the quadrupole power suppression. We discuss its effect on the parity anomaly. We also provide an analytical template for more detailed data comparison.

  13. Large-scale Alfvén vortices

    NASA Astrophysics Data System (ADS)

    Onishchenko, O. G.; Pokhotelov, O. A.; Horton, W.; Scullion, E.; Fedun, V.

    2015-12-01

    A new type of large-scale vortex structure of dispersionless Alfvén waves in collisionless plasma is investigated. It is shown that Alfvén waves can propagate in the form of Alfvén vortices of finite characteristic radius, characterised by magnetic flux ropes carrying orbital angular momentum. The structure of the toroidal and radial velocity, the fluid and magnetic field vorticity, and the longitudinal electric current in the plane orthogonal to the external magnetic field is discussed.

  14. Turbulent large-scale structure effects on wake meandering

    NASA Astrophysics Data System (ADS)

    Muller, Y.-A.; Masson, C.; Aubrun, S.

    2015-06-01

    This work studies the effects of large-scale turbulent structures on wake meandering using Large Eddy Simulations (LES) over an actuator disk. Other potential sources of wake meandering, such as the instability mechanisms associated with tip vortices, are not treated in this study. A crucial element of efficient, pragmatic, and successful simulation of large-scale turbulent structures in the Atmospheric Boundary Layer (ABL) is the generation of the stochastic turbulent atmospheric flow. This is an essential capability, since one source of wake meandering is these large turbulent structures, larger than the turbine diameter. The unsteady wind turbine wake in the ABL is simulated using a combination of LES and actuator disk approaches. In order to dedicate the large majority of the available computing power to the wake, the ABL ground region of the flow is not part of the computational domain. Instead, mixed Dirichlet/Neumann boundary conditions are applied at all the computational surfaces except the outlet. Prescribed values for the Dirichlet contribution of these boundary conditions are provided by a stochastic turbulent wind generator. This makes it possible to simulate large-scale turbulent structures, larger than the computational domain, leading to an efficient simulation technique for wake meandering. Since the stochastic wind generator includes shear, turbulence production is included in the analysis without the need to resolve the flow near the ground. The classical Smagorinsky sub-grid model is used. The resulting numerical methodology has been implemented in OpenFOAM. Comparisons with experimental measurements in porous-disk wakes have been undertaken, and the agreement is good. While the temporal resolution of experimental measurements is high, their spatial resolution is often too low. LES numerical results provide a more complete spatial description of the flow. They tend to demonstrate that inflow low-frequency content - or large-scale turbulent structures - is

  15. A Cloud Computing Platform for Large-Scale Forensic Computing

    NASA Astrophysics Data System (ADS)

    Roussev, Vassil; Wang, Liqiang; Richard, Golden; Marziale, Lodovico

    The timely processing of massive digital forensic collections demands the use of large-scale distributed computing resources and the flexibility to customize the processing performed on the collections. This paper describes MPI MapReduce (MMR), an open implementation of the MapReduce processing model that outperforms traditional forensic computing techniques. MMR provides linear scaling for CPU-intensive processing and super-linear scaling for indexing-related workloads.
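
    For readers unfamiliar with the processing model behind MMR, the toy sketch below shows the map and reduce phases on a word-count task in plain Python. It illustrates only the generic MapReduce pattern, not MMR itself (which is built on MPI and targets forensic workloads such as indexing); the chunk data are stand-ins.

      from collections import Counter
      from multiprocessing import Pool

      def map_phase(chunk):
          # Map: emit partial token counts for one chunk of the collection.
          return Counter(chunk.split())

      def reduce_phase(partials):
          # Reduce: merge the partial counts produced by every mapper.
          total = Counter()
          for p in partials:
              total.update(p)
          return total

      if __name__ == "__main__":
          chunks = ["aa bb aa", "bb cc aa"]   # stand-in for file fragments
          with Pool() as pool:
              print(reduce_phase(pool.map(map_phase, chunks)))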

  16. Supporting large scale applications on networks of workstations

    NASA Technical Reports Server (NTRS)

    Cooper, Robert; Birman, Kenneth P.

    1989-01-01

    Distributed applications on networks of workstations are an increasingly common way to satisfy computing needs. However, existing mechanisms for distributed programming exhibit poor performance and reliability as application size increases. Extension of the ISIS distributed programming system to support large scale distributed applications by providing hierarchical process groups is discussed. Incorporation of hierarchy in the program structure and exploitation of this to limit the communication and storage required in any one component of the distributed system is examined.

  17. Geospatial Optimization of Siting Large-Scale Solar Projects

    SciTech Connect

    Macknick, J.; Quinby, T.; Caulfield, E.; Gerritsen, M.; Diffendorfer, J.; Haines, S.

    2014-03-01

    Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.
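
    The report does not spell out its optimization algorithm in this summary, but a common baseline for this kind of multi-criteria siting analysis is a weighted-sum suitability score over normalized raster layers. The Python sketch below is a hypothetical illustration of that baseline, not the NREL tool's method; the layer names and weights are invented.

      import numpy as np

      def suitability(layers, weights):
          # Weighted-sum multi-criteria score over raster layers.
          # `layers`: criterion name -> 2-D array already rescaled to [0, 1]
          # (e.g. solar resource, slope, distance to transmission).
          # `weights`: user-defined priority for each criterion.
          total = sum(weights.values())
          score = np.zeros_like(next(iter(layers.values())), dtype=float)
          for name, layer in layers.items():
              score += (weights[name] / total) * layer
          return score

      # Hypothetical example: three 2x2 criterion rasters.
      layers = {"solar": np.array([[0.9, 0.8], [0.4, 0.2]]),
                "grid":  np.array([[0.5, 0.9], [0.3, 0.1]]),
                "slope": np.array([[0.7, 0.6], [0.9, 0.8]])}
      print(suitability(layers, {"solar": 3.0, "grid": 2.0, "slope": 1.0}))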

  18. Homogenization of Large-Scale Movement Models in Ecology

    USGS Publications Warehouse

    Garlick, M.J.; Powell, J.A.; Hooten, M.B.; McFarlane, L.R.

    2011-01-01

    A difficulty in using diffusion models to predict large scale animal population dispersal is that individuals move differently based on local information (as opposed to gradients) in differing habitat types. This can be accommodated by using ecological diffusion. However, real environments are often spatially complex, limiting application of a direct approach. Homogenization for partial differential equations has long been applied to Fickian diffusion (in which average individual movement is organized along gradients of habitat and population density). We derive a homogenization procedure for ecological diffusion and apply it to a simple model for chronic wasting disease in mule deer. Homogenization allows us to determine the impact of small scale (10-100 m) habitat variability on large scale (10-100 km) movement. The procedure generates asymptotic equations for solutions on the large scale with parameters defined by small-scale variation. The simplicity of this homogenization procedure is striking when compared to the multi-dimensional homogenization procedure for Fickian diffusion, and the method will be equally straightforward for more complex models. © 2010 Society for Mathematical Biology.
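
    The distinction the abstract draws can be made explicit. In ecological diffusion the habitat-dependent motility sits inside the Laplacian, whereas Fickian diffusion organizes movement along gradients; in the form commonly stated (with u the population density and μ(x) the motility, both labels used here for illustration):

      \[
      \frac{\partial u}{\partial t} = \Delta\!\left[\mu(\mathbf{x})\,u\right]
      \quad\text{(ecological)}
      \qquad\text{vs.}\qquad
      \frac{\partial u}{\partial t} = \nabla\cdot\!\left[\mu(\mathbf{x})\,\nabla u\right]
      \quad\text{(Fickian)}
      \]

    Homogenization then replaces the rapidly varying μ(x) with an effective averaged coefficient valid on the large (10-100 km) scale.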

  19. Dispersal Mutualism Incorporated into Large-Scale, Infrequent Disturbances.

    PubMed

    Parker, V Thomas

    2015-01-01

    Because of their influence on succession and other community interactions, large-scale, infrequent natural disturbances also should play a major role in mutualistic interactions. Using field data and experiments, I test whether mutualisms have been incorporated into large-scale wildfire by whether the outcomes of a mutualism depend on disturbance. In this study a seed dispersal mutualism is shown to depend on infrequent, large-scale disturbances. A dominant shrubland plant (Arctostaphylos species) produces seeds that make up a persistent soil seed bank and requires fire to germinate. In post-fire stands, I show that seedlings emerging from rodent caches dominate sites experiencing higher fire intensity. Field experiments show that rodents (Peromyscus californicus, P. boylii) do cache Arctostaphylos fruit and bury most seed caches to a sufficient depth to survive a killing heat pulse that a fire might drive into the soil. While the rodent dispersal and caching behavior itself has not changed compared to other habitats, the environmental transformation caused by wildfire converts the caching burial of seed from a dispersal process to a plant fire adaptive trait, and provides the context for stimulating subsequent life history evolution in the plant host.

  20. Large scale anisotropy of UHECRs for the Telescope Array

    SciTech Connect

    Kido, E.

    2011-09-22

    The origin of Ultra High Energy Cosmic Rays (UHECRs) is one of the most interesting questions in astroparticle physics. Despite the efforts of previous measurements, there is as yet no consensus on either the origin of UHECRs or the mechanism of their generation and propagation. In this context, the Telescope Array (TA) experiment is expected to play an important role as the largest detector in the northern hemisphere, consisting of an array of surface particle detectors (SDs), fluorescence detectors (FDs), and other important calibration devices. We searched for large scale anisotropy using the SD data of TA. UHECRs are expected to be restricted to the GZK horizon when the composition of UHECRs is protons, so the observed arrival directions are expected to exhibit local large scale anisotropy if UHECR sources are astrophysical objects. We used the SD data set from 11 May 2008 to 7 September 2010 to search for large-scale anisotropy. The discrimination power between LSS and isotropy is not yet sufficient, but the statistics of TA are expected to discriminate between them at about the 95% confidence level on average in the near future.

  1. How Large Scale Flows May Influence Solar Activity

    NASA Technical Reports Server (NTRS)

    Hathaway, D. H.

    2004-01-01

    Large scale flows within the solar convection zone are the primary drivers of the Sun's magnetic activity cycle and play important roles in shaping the Sun's magnetic field. Differential rotation amplifies the magnetic field through its shearing action and converts poloidal field into toroidal field. Poleward meridional flow near the surface carries magnetic flux that reverses the magnetic poles at about the time of solar maximum. The deeper, equatorward meridional flow can carry magnetic flux back toward the lower latitudes where it erupts through the surface to form tilted active regions that convert toroidal fields into oppositely directed poloidal fields. These axisymmetric flows are themselves driven by large scale convective motions. The effects of the Sun's rotation on convection produce velocity correlations that can maintain both the differential rotation and the meridional circulation. These convective motions can also influence solar activity directly by shaping the magnetic field pattern. While considerable theoretical advances have been made toward understanding these large scale flows, outstanding problems in matching theory to observations still remain.

  2. Robust regression for large-scale neuroimaging studies.

    PubMed

    Fritsch, Virgile; Da Mota, Benoit; Loth, Eva; Varoquaux, Gaël; Banaschewski, Tobias; Barker, Gareth J; Bokde, Arun L W; Brühl, Rüdiger; Butzek, Brigitte; Conrod, Patricia; Flor, Herta; Garavan, Hugh; Lemaitre, Hervé; Mann, Karl; Nees, Frauke; Paus, Tomas; Schad, Daniel J; Schümann, Gunter; Frouin, Vincent; Poline, Jean-Baptiste; Thirion, Bertrand

    2015-05-01

    Multi-subject datasets used in neuroimaging group studies have a complex structure, as they exhibit non-stationary statistical properties across regions and display various artifacts. While studies with small sample sizes can rarely be shown to deviate from standard hypotheses (such as the normality of the residuals) due to the poor sensitivity of normality tests with low degrees of freedom, large-scale studies (e.g. >100 subjects) exhibit more obvious deviations from these hypotheses and call for more refined models for statistical inference. Here, we demonstrate the benefits of robust regression as a tool for analyzing large neuroimaging cohorts. First, we use an analytic test based on robust parameter estimates; based on simulations, this procedure is shown to provide an accurate statistical control without resorting to permutations. Second, we show that robust regression yields more detections than standard algorithms using as an example an imaging genetics study with 392 subjects. Third, we show that robust regression can avoid false positives in a large-scale analysis of brain-behavior relationships with over 1500 subjects. Finally we embed robust regression in the Randomized Parcellation Based Inference (RPBI) method and demonstrate that this combination further improves the sensitivity of tests carried out across the whole brain. Altogether, our results show that robust procedures provide important advantages in large-scale neuroimaging group studies. PMID:25731989
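
    As a small illustration of the robust-regression idea (though not the paper's full pipeline with RPBI), the following Python sketch compares ordinary least squares with a Huber M-estimator on data containing outliers, using statsmodels; the data here are synthetic.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      X = sm.add_constant(rng.normal(size=(200, 2)))        # design matrix
      y = X @ np.array([1.0, 2.0, -1.0]) + rng.normal(size=200)
      y[:10] += 15                                          # inject outliers

      ols = sm.OLS(y, X).fit()                              # classical fit
      rlm = sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit()  # robust (Huber) fit
      print("OLS:", ols.params)   # pulled toward the outliers
      print("RLM:", rlm.params)   # close to the true (1, 2, -1)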

  3. Large-scale flow experiments for managing river systems

    USGS Publications Warehouse

    Konrad, C.P.; Olden, J.D.; Lytle, D.A.; Melis, T.S.; Schmidt, J.C.; Bray, E.N.; Freeman, Mary C.; Gido, K.B.; Hemphill, N.P.; Kennard, M.J.; McMullen, L.E.; Mims, M.C.; Pyron, M.; Robinson, C.T.; Williams, J.G.

    2011-01-01

    Experimental manipulations of streamflow have been used globally in recent decades to mitigate the impacts of dam operations on river systems. Rivers are challenging subjects for experimentation, because they are open systems that cannot be isolated from their social context. We identify principles to address the challenges of conducting effective large-scale flow experiments. Flow experiments have both scientific and social value when they help to resolve specific questions about the ecological action of flow with a clear nexus to water policies and decisions. Water managers must integrate new information into operating policies for large-scale experiments to be effective. Modeling and monitoring can be integrated with experiments to analyze long-term ecological responses. Experimental design should include spatially extensive observations and well-defined, repeated treatments. Large-scale flow manipulations are only a part of dam operations that affect river systems. Scientists can ensure that experimental manipulations continue to be a valuable approach for the scientifically based management of river systems. © 2011 by American Institute of Biological Sciences. All rights reserved.

  4. Robust regression for large-scale neuroimaging studies.

    PubMed

    Fritsch, Virgile; Da Mota, Benoit; Loth, Eva; Varoquaux, Gaël; Banaschewski, Tobias; Barker, Gareth J; Bokde, Arun L W; Brühl, Rüdiger; Butzek, Brigitte; Conrod, Patricia; Flor, Herta; Garavan, Hugh; Lemaitre, Hervé; Mann, Karl; Nees, Frauke; Paus, Tomas; Schad, Daniel J; Schümann, Gunter; Frouin, Vincent; Poline, Jean-Baptiste; Thirion, Bertrand

    2015-05-01

    Multi-subject datasets used in neuroimaging group studies have a complex structure, as they exhibit non-stationary statistical properties across regions and display various artifacts. While studies with small sample sizes can rarely be shown to deviate from standard hypotheses (such as the normality of the residuals) due to the poor sensitivity of normality tests with low degrees of freedom, large-scale studies (e.g. >100 subjects) exhibit more obvious deviations from these hypotheses and call for more refined models for statistical inference. Here, we demonstrate the benefits of robust regression as a tool for analyzing large neuroimaging cohorts. First, we use an analytic test based on robust parameter estimates; based on simulations, this procedure is shown to provide an accurate statistical control without resorting to permutations. Second, we show that robust regression yields more detections than standard algorithms using as an example an imaging genetics study with 392 subjects. Third, we show that robust regression can avoid false positives in a large-scale analysis of brain-behavior relationships with over 1500 subjects. Finally we embed robust regression in the Randomized Parcellation Based Inference (RPBI) method and demonstrate that this combination further improves the sensitivity of tests carried out across the whole brain. Altogether, our results show that robust procedures provide important advantages in large-scale neuroimaging group studies.

  5. A visualization framework for large-scale virtual astronomy

    NASA Astrophysics Data System (ADS)

    Fu, Chi-Wing

    Motivated by advances in modern positional astronomy, this research attempts to digitally model the entire Universe through computer graphics technology. Our first challenge is space itself. The gigantic size of the Universe makes it impossible to put everything into a typical graphics system at its own scale; the graphics rendering process can easily fail because of limited computational precision. The second challenge is that the enormous amount of data could slow down the graphics; we need clever techniques to speed up the rendering. Third, since the Universe is dominated by empty space, objects are widely separated; this makes navigation difficult. We attempt to tackle these problems through various techniques designed to extend and optimize the conventional graphics framework, including the following: power homogeneous coordinates for large-scale spatial representations, generalized large-scale spatial transformations, and rendering acceleration via environment caching and object disappearance criteria. Moreover, we implemented an assortment of techniques for modeling and rendering a variety of astronomical bodies, ranging from the Earth up to faraway galaxies, and attempted to visualize cosmological time; a method we call the Lightcone representation was introduced to visualize the whole space-time of the Universe at a single glance. In addition, several navigation models were developed to handle the large-scale navigation problem. Our final results include a collection of visualization tools, two educational animations appropriate for planetarium audiences, and rendering techniques that advance the state of the art and can be transferred to practice in digital planetarium systems.

  6. Line segment extraction for large scale unorganized point clouds

    NASA Astrophysics Data System (ADS)

    Lin, Yangbin; Wang, Cheng; Cheng, Jun; Chen, Bili; Jia, Fukai; Chen, Zhonggui; Li, Jonathan

    2015-04-01

    Line segment detection in images is already a well-investigated topic, although it has received considerably less attention in 3D point clouds. Benefiting from current LiDAR devices, large-scale point clouds are becoming increasingly common. Most human-made objects have flat surfaces. Line segments that occur where pairs of planes intersect give important information regarding the geometric content of point clouds, which is especially useful for automatic building reconstruction and segmentation. This paper proposes a novel method that is capable of accurately extracting plane intersection line segments from large-scale raw scan points. The 3D line-support region, namely, a point set near a straight linear structure, is extracted simultaneously. The 3D line-support region is fitted by our Line-Segment-Half-Planes (LSHP) structure, which provides a geometric constraint for a line segment, making the line segment more reliable and accurate. We demonstrate our method on the point clouds of large-scale, complex, real-world scenes acquired by LiDAR devices. We also demonstrate the application of 3D line-support regions and their LSHP structures on urban scene abstraction.
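
    The geometric core of the method, the line along which two fitted planes intersect, is simple to state. The sketch below computes it with NumPy; plane_intersection is a hypothetical helper illustrating the geometry only, not the paper's LSHP extraction pipeline.

      import numpy as np

      def plane_intersection(n1, d1, n2, d2):
          # Line of intersection of the planes n1.x = d1 and n2.x = d2.
          # Returns (point, unit direction), or None for near-parallel planes.
          direction = np.cross(n1, n2)
          norm = np.linalg.norm(direction)
          if norm < 1e-12:
              return None                    # planes are (nearly) parallel
          # The rows n1, n2, direction are linearly independent, so this
          # 3x3 system picks the point on the line closest to the origin.
          A = np.vstack([n1, n2, direction])
          point = np.linalg.solve(A, np.array([d1, d2, 0.0]))
          return point, direction / norm

      # Example: the planes z = 0 and x = 0 intersect along the y-axis.
      print(plane_intersection(np.array([0.0, 0.0, 1.0]), 0.0,
                               np.array([1.0, 0.0, 0.0]), 0.0))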

  7. Impact of Large-scale Geological Architectures On Recharge

    NASA Astrophysics Data System (ADS)

    Troldborg, L.; Refsgaard, J. C.; Engesgaard, P.; Jensen, K. H.

    Geological and hydrogeological data constitute the basis for assessment of groundwater flow patterns and recharge zones. The accessibility and applicability of hard geological data is often a major obstacle in deriving plausible conceptual models. Nevertheless, focus is often on parameter uncertainty caused by the effect of geological heterogeneity due to lack of hard geological data, thus neglecting the possibility of alternative conceptualizations of the large-scale geological architecture. For a catchment in the eastern part of Denmark we have constructed different geological models based on different conceptualizations of the major geological trends and facies architecture. The geological models are equally plausible in a conceptual sense, and they are all calibrated to well head and river flow measurements. Comparison of differences in recharge zones, and subsequently well protection zones, emphasizes the importance of assessing large-scale geological architecture in hydrological modeling on the regional scale in a non-deterministic way. Geostatistical modeling carried out in a transitional probability framework shows the possibility of assessing multiple realizations of large-scale geological architecture from a combination of soft and hard geological information.

  8. Dispersal Mutualism Incorporated into Large-Scale, Infrequent Disturbances

    PubMed Central

    Parker, V. Thomas

    2015-01-01

    Because of their influence on succession and other community interactions, large-scale, infrequent natural disturbances also should play a major role in mutualistic interactions. Using field data and experiments, I test whether mutualisms have been incorporated into large-scale wildfire by whether the outcomes of a mutualism depend on disturbance. In this study a seed dispersal mutualism is shown to depend on infrequent, large-scale disturbances. A dominant shrubland plant (Arctostaphylos species) produces seeds that make up a persistent soil seed bank and requires fire to germinate. In post-fire stands, I show that seedlings emerging from rodent caches dominate sites experiencing higher fire intensity. Field experiments show that rodents (Peromyscus californicus, P. boylii) do cache Arctostaphylos fruit and bury most seed caches to a sufficient depth to survive a killing heat pulse that a fire might drive into the soil. While the rodent dispersal and caching behavior itself has not changed compared to other habitats, the environmental transformation caused by wildfire converts the caching burial of seed from a dispersal process to a plant fire adaptive trait, and provides the context for stimulating subsequent life history evolution in the plant host. PMID:26151560

  9. Reliability assessment for components of large scale photovoltaic systems

    NASA Astrophysics Data System (ADS)

    Ahadi, Amir; Ghadimi, Noradin; Mirabbasi, Davar

    2014-10-01

    Photovoltaic (PV) systems have significantly shifted from independent power generation systems to large-scale, grid-connected generation systems in recent years. The power output of PV systems is affected by the reliability of various components in the system. This study proposes an analytical approach to evaluate the reliability of large-scale, grid-connected PV systems. The fault tree method with an exponential probability distribution function is used to analyze the components of large-scale PV systems. The system is considered in the various sequential and parallel fault combinations in order to find all realistic ways in which the top or undesired events can occur. Additionally, it can identify areas that the planned maintenance should focus on. By monitoring the critical components of a PV system, it is possible not only to improve the reliability of the system, but also to optimize the maintenance costs. The latter is achieved by informing the operators about the status of the system's components. This approach can be used to ensure secure operation of the system through its flexibility in monitoring system applications. The implementation demonstrates that the proposed method is effective and efficient and can conveniently incorporate more system maintenance plans and diagnostic strategies.
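
    A minimal version of the quantitative machinery behind this approach, exponential component reliabilities combined through series (all components must work) and parallel (redundant) structures, is sketched below in Python; the failure rates and the system layout are hypothetical, not taken from the study.

      import math

      def reliability(rate, t):
          # Exponential component model: R(t) = exp(-lambda * t).
          return math.exp(-rate * t)

      def series(rs):
          # Series structure: every component must work.
          out = 1.0
          for r in rs:
              out *= r
          return out

      def parallel(rs):
          # Parallel redundancy: one surviving component suffices.
          fail = 1.0
          for r in rs:
              fail *= 1.0 - r
          return 1.0 - fail

      # Hypothetical layout: two redundant PV strings feeding one inverter.
      t = 8760.0                                    # one year, in hours
      string = series([reliability(2e-6, t),        # modules
                       reliability(5e-6, t)])       # string cabling
      system = series([parallel([string, string]),  # redundant strings
                       reliability(1e-5, t)])       # inverter
      print(f"System reliability after one year: {system:.4f}")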

  10. Dispersal Mutualism Incorporated into Large-Scale, Infrequent Disturbances.

    PubMed

    Parker, V Thomas

    2015-01-01

    Because of their influence on succession and other community interactions, large-scale, infrequent natural disturbances also should play a major role in mutualistic interactions. Using field data and experiments, I test whether mutualisms have been incorporated into large-scale wildfire by whether the outcomes of a mutualism depend on disturbance. In this study a seed dispersal mutualism is shown to depend on infrequent, large-scale disturbances. A dominant shrubland plant (Arctostaphylos species) produces seeds that make up a persistent soil seed bank and requires fire to germinate. In post-fire stands, I show that seedlings emerging from rodent caches dominate sites experiencing higher fire intensity. Field experiments show that rodents (Peromyscus californicus, P. boylii) do cache Arctostaphylos fruit and bury most seed caches to a sufficient depth to survive a killing heat pulse that a fire might drive into the soil. While the rodent dispersal and caching behavior itself has not changed compared to other habitats, the environmental transformation caused by wildfire converts the caching burial of seed from a dispersal process to a plant fire adaptive trait, and provides the context for stimulating subsequent life history evolution in the plant host. PMID:26151560

  11. Large-scale flow experiments for managing river systems

    USGS Publications Warehouse

    Konrad, Christopher P.; Olden, Julian D.; Lytle, David A.; Melis, Theodore S.; Schmidt, John C.; Bray, Erin N.; Freeman, Mary C.; Gido, Keith B.; Hemphill, Nina P.; Kennard, Mark J.; McMullen, Laura E.; Mims, Meryl C.; Pyron, Mark; Robinson, Christopher T.; Williams, John G.

    2011-01-01

    Experimental manipulations of streamflow have been used globally in recent decades to mitigate the impacts of dam operations on river systems. Rivers are challenging subjects for experimentation, because they are open systems that cannot be isolated from their social context. We identify principles to address the challenges of conducting effective large-scale flow experiments. Flow experiments have both scientific and social value when they help to resolve specific questions about the ecological action of flow with a clear nexus to water policies and decisions. Water managers must integrate new information into operating policies for large-scale experiments to be effective. Modeling and monitoring can be integrated with experiments to analyze long-term ecological responses. Experimental design should include spatially extensive observations and well-defined, repeated treatments. Large-scale flow manipulations are only a part of dam operations that affect river systems. Scientists can ensure that experimental manipulations continue to be a valuable approach for the scientifically based management of river systems.

  12. Seasonal components of avian population change: Joint analysis of two large-scale monitoring programs

    USGS Publications Warehouse

    Link, W.A.; Sauer, J.R.

    2007-01-01

    We present a combined analysis of data from two large-scale surveys of bird populations. The North American Breeding Bird Survey is conducted each summer; the Christmas Bird Count is conducted in early winter. The temporal staggering of these surveys allows investigation of seasonal components of population change, which we illustrate with an examination of the effects of severe winters on the Carolina Wren (Thryothorus ludovicianus). Our analysis uses a hierarchical log-linear model with controls for survey-specific sampling covariates. Temporal change in population size is modeled seasonally, with covariates for winter severity. Overall, the winter-spring seasons are associated with 82% of the total population variation for Carolina Wrens, and an additional day of snow cover during winter-spring is associated with an incremental decline of 1.1% of the population.

  13. Seasonal components of avian population change: joint analysis of two large-scale monitoring programs.

    PubMed

    Link, William A; Sauer, John R

    2007-01-01

    We present a combined analysis of data from two large-scale surveys of bird populations. The North American Breeding Bird Survey is conducted each summer; the Christmas Bird Count is conducted in early winter. The temporal staggering of these surveys allows investigation of seasonal components of population change, which we illustrate with an examination of the effects of severe winters on the Carolina Wren (Thryothorus ludovicianus). Our analysis uses a hierarchical log-linear model with controls for survey-specific sampling covariates. Temporal change in population size is modeled seasonally, with covariates for winter severity. Overall, the winter-spring seasons are associated with 82% of the total population variation for Carolina Wrens, and an additional day of snow cover during winter-spring is associated with an incremental decline of 1.1% of the population.
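
    One way to write down a seasonal-decomposition model of the kind described, offered here as a sketch rather than the authors' exact specification, is a log-linear model in which site effects, survey-specific sampling covariates, and cumulative seasonal increments (with a winter-severity covariate) combine additively on the log scale:

      \[
      \log \mathrm{E}\!\left[Y_{s,t}\right]
      = \alpha_s + \gamma^{\top} x_{s,t}
      + \sum_{\tau \le t}\left(\beta_{\mathrm{season}(\tau)} + \delta\, z_{\tau}\right)
      \]

    where Y_{s,t} is the count at site s in survey period t, x_{s,t} holds the sampling covariates, the seasonal increments β partition annual change into summer-to-winter and winter-to-spring components, and z_τ is winter severity (e.g. days of snow cover) in period τ.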

  14. Cosmology from Cosmic Microwave Background and large-scale structure

    NASA Astrophysics Data System (ADS)

    Xu, Yongzhong

    2003-10-01

    This dissertation consists of a series of studies, constituting four published papers, involving the Cosmic Microwave Background and the large scale structure, which help constrain Cosmological parameters and potential systematic errors. First, we present a method for comparing and combining maps with different resolutions and beam shapes, and apply it to the Saskatoon, QMAP and COBE/DMR data sets. Although the Saskatoon and QMAP maps detect signal at the 21σ and 40σ levels, respectively, their difference is consistent with pure noise, placing strong limits on possible systematic errors. In particular, we obtain quantitative upper limits on relative calibration and pointing errors. Splitting the combined data by frequency shows similar consistency between the Ka- and Q-bands, placing limits on foreground contamination. The visual agreement between the maps is equally striking. Our combined QMAP+Saskatoon map, nicknamed QMASK, is publicly available at www.hep.upenn.edu/~xuyz/qmask.html together with its 6495 x 6495 noise covariance matrix. This thoroughly tested data set covers a large enough area (648 square degrees—at the time, the largest degree-scale map available) to allow a statistical comparison with COBE/DMR, showing good agreement. By band-pass-filtering the QMAP and Saskatoon maps, we are also able to spatially compare them scale-by-scale to check for beam- and pointing-related systematic errors. Using the QMASK map, we then measure the cosmic microwave background (CMB) power spectrum on angular scales ℓ ~ 30-200 (1°-6°), and we test it for non-Gaussianity using morphological statistics known as Minkowski functionals. We conclude that the QMASK map is neither a very typical nor a very exceptional realization of a Gaussian random field. At least about 20% of the 1000 Gaussian Monte Carlo maps differ more than the QMASK map from the mean morphological parameters of the Gaussian fields. Finally, we compute the real-space power spectrum and the

  15. The California Health Interview Survey 2001: translation of a major survey for California's multiethnic population.

    PubMed Central

    Ponce, Ninez A.; Lavarreda, Shana Alex; Yen, Wei; Brown, E. Richard; DiSogra, Charles; Satter, Delight E.

    2004-01-01

    The cultural and linguistic diversity of the U.S. population presents challenges to the design and implementation of population-based surveys that serve to inform public policies. Information derived from such surveys may be less than representative if groups with limited or no English language skills are not included. The California Health Interview Survey (CHIS), first administered in 2001, is a population-based health survey of more than 55,000 California households. This article describes the process that the designers of CHIS 2001 underwent in culturally adapting the survey and translating it into an unprecedented number of languages: Spanish, Chinese, Vietnamese, Korean, and Khmer. The multiethnic and multilingual CHIS 2001 illustrates the importance of cultural and linguistic adaptation in raising the quality of population-based surveys, especially when the populations they intend to represent are as diverse as California's. PMID:15219795

  16. National Natality Survey/National Maternal and Infant Health Survey (NMIHS)

    Cancer.gov

    The survey provides data on socioeconomic and demographic characteristics of mothers, prenatal care, pregnancy history, occupational background, health status of mother and infant, and types and sources of medical care received.

  17. Foundational perspectives on causality in large-scale brain networks

    NASA Astrophysics Data System (ADS)

    Mannino, Michael; Bressler, Steven L.

    2015-12-01

    A profusion of recent work in cognitive neuroscience has been concerned with the endeavor to uncover causal influences in large-scale brain networks. However, despite the fact that many papers give a nod to the important theoretical challenges posed by the concept of causality, this explosion of research has generally not been accompanied by a rigorous conceptual analysis of the nature of causality in the brain. This review provides both a descriptive and prescriptive account of the nature of causality as found within and between large-scale brain networks. In short, it seeks to clarify the concept of causality in large-scale brain networks both philosophically and scientifically. This is accomplished by briefly reviewing the rich philosophical history of work on causality, especially focusing on contributions by David Hume, Immanuel Kant, Bertrand Russell, and Christopher Hitchcock. We go on to discuss the impact that various interpretations of modern physics have had on our understanding of causality. Throughout all this, a central focus is the distinction between theories of deterministic causality (DC), whereby causes uniquely determine their effects, and probabilistic causality (PC), whereby causes change the probability of occurrence of their effects. We argue that, given the topological complexity of its large-scale connectivity, the brain should be considered as a complex system and its causal influences treated as probabilistic in nature. We conclude that PC is well suited for explaining causality in the brain for three reasons: (1) brain causality is often mutual; (2) connectional convergence dictates that only rarely is the activity of one neuronal population uniquely determined by another one; and (3) the causal influences exerted between neuronal populations may not have observable effects. A number of different techniques are currently available to characterize causal influence in the brain. Typically, these techniques quantify the statistical

  18. Foundational perspectives on causality in large-scale brain networks.

    PubMed

    Mannino, Michael; Bressler, Steven L

    2015-12-01

    A profusion of recent work in cognitive neuroscience has been concerned with the endeavor to uncover causal influences in large-scale brain networks. However, despite the fact that many papers give a nod to the important theoretical challenges posed by the concept of causality, this explosion of research has generally not been accompanied by a rigorous conceptual analysis of the nature of causality in the brain. This review provides both a descriptive and prescriptive account of the nature of causality as found within and between large-scale brain networks. In short, it seeks to clarify the concept of causality in large-scale brain networks both philosophically and scientifically. This is accomplished by briefly reviewing the rich philosophical history of work on causality, especially focusing on contributions by David Hume, Immanuel Kant, Bertrand Russell, and Christopher Hitchcock. We go on to discuss the impact that various interpretations of modern physics have had on our understanding of causality. Throughout all this, a central focus is the distinction between theories of deterministic causality (DC), whereby causes uniquely determine their effects, and probabilistic causality (PC), whereby causes change the probability of occurrence of their effects. We argue that, given the topological complexity of its large-scale connectivity, the brain should be considered as a complex system and its causal influences treated as probabilistic in nature. We conclude that PC is well suited for explaining causality in the brain for three reasons: (1) brain causality is often mutual; (2) connectional convergence dictates that only rarely is the activity of one neuronal population uniquely determined by another one; and (3) the causal influences exerted between neuronal populations may not have observable effects. A number of different techniques are currently available to characterize causal influence in the brain. Typically, these techniques quantify the statistical

  19. Robust large-scale parallel nonlinear solvers for simulations.

    SciTech Connect

    Bader, Brett William; Pawlowski, Roger Patrick; Kolda, Tamara Gibson

    2005-11-01

    This report documents research to develop robust and efficient solution techniques for solving large-scale systems of nonlinear equations. The most widely used method for solving systems of nonlinear equations is Newton's method. While much research has been devoted to augmenting Newton-based solvers (usually with globalization techniques), little has been devoted to exploring the application of different models. Our research has been directed at evaluating techniques using different models than Newton's method: a lower order model, Broyden's method, and a higher order model, the tensor method. We have developed large-scale versions of each of these models and have demonstrated their use in important applications at Sandia. Broyden's method replaces the Jacobian with an approximation, allowing codes that cannot evaluate a Jacobian or have an inaccurate Jacobian to converge to a solution. Limited-memory methods, which have been successful in optimization, allow us to extend this approach to large-scale problems. We compare the robustness and efficiency of Newton's method, modified Newton's method, Jacobian-free Newton-Krylov method, and our limited-memory Broyden method. Comparisons are carried out for large-scale applications of fluid flow simulations and electronic circuit simulations. Results show that, in cases where the Jacobian was inaccurate or could not be computed, Broyden's method converged in some cases where Newton's method failed to converge. We identify conditions where Broyden's method can be more efficient than Newton's method. We also present modifications to a large-scale tensor method, originally proposed by Bouaricha, for greater efficiency, better robustness, and wider applicability. Tensor methods are an alternative to Newton-based methods and are based on computing a step based on a local quadratic model rather than a linear model. The advantage of Bouaricha's method is that it can use any existing linear solver, which makes it simple to write
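
    For reference, the classical ("good") Broyden iteration that the report's limited-memory variant builds on replaces the Jacobian with a secant approximation refined by rank-one updates, so the residual function never needs derivatives. A minimal dense Python sketch, not Sandia's limited-memory implementation, is:

      import numpy as np

      def broyden(F, x0, tol=1e-10, max_iter=50):
          # Broyden's "good" method with a dense approximate Jacobian B.
          x = np.asarray(x0, dtype=float)
          B = np.eye(len(x))                 # initial Jacobian approximation
          f = F(x)
          for _ in range(max_iter):
              if np.linalg.norm(f) < tol:
                  break
              s = np.linalg.solve(B, -f)     # quasi-Newton step
              x = x + s
              f_new = F(x)
              y = f_new - f
              B += np.outer(y - B @ s, s) / (s @ s)   # rank-one secant update
              f = f_new
          return x

      # Example: solve x0^2 + x1^2 = 1 together with x0 = x1.
      print(broyden(lambda x: np.array([x[0]**2 + x[1]**2 - 1.0,
                                        x[0] - x[1]]),
                    [1.0, 0.5]))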

  20. Health sciences library building projects, 1998 survey.

    PubMed Central

    Bowden, V M

    1999-01-01

    Twenty-eight health sciences library building projects are briefly described, including twelve new buildings and sixteen additions, remodelings, and renovations. The libraries range in size from 2,144 square feet to 190,000 gross square feet. Twelve libraries are described in detail. These include three hospital libraries, one information center sponsored by ten institutions, and eight academic health sciences libraries. PMID:10550027

  1. Taking the Pulse of Undergraduate Health Psychology: A Nationwide Survey

    ERIC Educational Resources Information Center

    Brack, Amy Badura; Kesitilwe, Kutlo; Ware, Mark E.

    2010-01-01

    We conducted a random national survey of 100 doctoral, 100 comprehensive, and 100 baccalaureate institutions to determine the current state of the undergraduate health psychology course. We found clear evidence of a maturing course with much greater commonality in name (health psychology), theoretical foundation (the biopsychosocial model), and…

  2. Health Research Facilities: A survey of Doctorate-Granting Institutions.

    ERIC Educational Resources Information Center

    Atelsek, Frank J.; Gomberg, Irene L.

    The survey data cover three broad categories: (1) the status of existing health research facilities at doctorate-granting institutions (including their current value, adequacy, and condition); (2) the volume of new construction in progress; and (3) the additions to health research facilities anticipated during the next 5 years…

  3. Licensed Practical Nurses in Occupational Health. An Initial Survey.

    ERIC Educational Resources Information Center

    Lee, Jane A.; And Others

    The study, conducted in 1971, assessed characteristics of licensed practical nurses (LPN's) who worked in occupational health nursing. The survey instrument, a questionnaire, was returned by 591 LPN's in occupational health and provided data related to: personal characteristics, work and setting, administrative and professional functioning,…

  4. The Illinois 9th Grade Adolescent Health Survey. Full Report.

    ERIC Educational Resources Information Center

    Illinois State Board of Education, Springfield.

    A survey was conducted in Illinois to identify the risk of certain health problems among adolescents; to determine the health status of Illinois youth in relation to the Surgeon General's "Healthy People 2000 Objectives" and monitor progress toward national and state goals; and to help those working at national, state, and local levels develop…

  5. Summary Health Statistics for U.S. Children: National Health Interview Survey, 1999.

    ERIC Educational Resources Information Center

    Blackwell, Debra L.; Tonthat, Luong

    This report presents statistics from the 1999 National Health Interview Survey (NHIS) on selected health measures for children under 18 years of age, classified by sex, age, race/ethnicity, family structure, parent education, family income, poverty status, health insurance coverage, place of residence, region, and current health status. The NHIS…

  6. Development and Implementation of Culturally Tailored Offline Mobile Health Surveys

    PubMed Central

    2016-01-01

    Background In low and middle income countries (LMICs), and other areas with low resources and unreliable access to the Internet, understanding the emerging best practices for the implementation of new mobile health (mHealth) technologies is needed for efficient and secure data management and for informing public health researchers. Innovations in mHealth technology can improve on previous methods, and dissemination of project development details and lessons learned during implementation is needed to inform stakeholders in both the United States and LMIC settings. Objective The aims of this paper are to share implementation strategies and lessons learned from the development and implementation stages of two survey research projects using offline mobile technology, and to inform and prepare public health researchers and practitioners to implement new mobile technologies in survey research projects in LMICs. Methods In 2015, two survey research projects were developed and piloted in Puerto Rico and pre-tested in Costa Rica to collect face-to-face data, gather formative evaluation feedback, and test the feasibility of an offline mobile data collection process. Fieldwork in each setting involved survey development, back translation with cultural tailoring, ethical review and approvals, data collector training, and piloting survey implementation on mobile tablets. Results Critical processes and workflows for survey research projects in low resource settings were identified and implemented. This included developing a secure mobile data platform tailored to each survey, establishing user accessibility, and training and eliciting feedback from data collectors and on-site LMIC project partners. Conclusions Formative and process evaluation strategies are necessary and useful for the development and implementation of survey research projects using emerging mHealth technologies in LMICs and other low resource settings. Lessons learned include: (1) plan

  7. CUMULATIVE TRAUMAS AND RISK THRESHOLDS: 12-MONTH PTSD IN THE WORLD MENTAL HEALTH (WMH) SURVEYS

    PubMed Central

    Karam, Elie G.; Friedman, Matthew J.; Hill, Eric D.; Kessler, Ronald C.; McLaughlin, Katie A.; Petukhova, Maria; Sampson, Laura; Shahly, Victoria; Angermeyer, Matthias C.; Bromet, Evelyn J.; de Girolamo, Giovanni; de Graaf, Ron; Demyttenaere, Koen; Ferry, Finola; Florescu, Silvia E.; Haro, Josep Maria; He, Yanling; Karam, Aimee N.; Kawakami, Norito; Kovess-Masfety, Viviane; Medina-Mora, María Elena; Browne, Mark A. Oakley; Posada-Villa, José A.; Shalev, Arieh Y.; Stein, Dan J.; Viana, Maria Carmen; Zarkov, Zahari; Koenen, Karestan C.

    2014-01-01

    Background Clinical research suggests that posttraumatic stress disorder (PTSD) patients exposed to multiple traumatic events (TEs) rather than a single TE have increased morbidity and dysfunction. Although epidemiological surveys in the United States and Europe also document high rates of multiple TE exposure, no population-based cross-national data have examined this issue. Methods Data were analyzed from 20 population surveys in the World Health Organization World Mental Health Survey Initiative (n = 51,295, aged 18+). The Composite International Diagnostic Interview (3.0) assessed 12-month PTSD and other common DSM-IV disorders. Respondents with 12-month PTSD were assessed for single versus multiple TEs implicated in their symptoms. Associations were examined with age of onset (AOO), functional impairment, comorbidity, and PTSD symptom counts. Results 19.8% of respondents with 12-month PTSD reported that their symptoms were associated with multiple TEs. Cases who associated their PTSD with four or more TEs had greater functional impairment, an earlier AOO, longer duration, higher comorbidity with mood and anxiety disorders, elevated hyper-arousal symptoms, higher proportional exposures to partner physical abuse and other types of physical assault, and lower proportional exposure to unexpected death of a loved one than cases with fewer associated TEs. Conclusions A risk threshold was observed in this large-scale cross-national database wherein cases who associated their PTSD with four or more TEs presented a more “complex” clinical picture with substantially greater functional impairment and greater morbidity than other cases of PTSD. PTSD cases associated with four or more TEs may merit specific and targeted intervention strategies. Depression and Anxiety 31:130–142, 2014. PMID:23983056

  8. Constraining dark energy evolution with gravitational lensing by large scale structures

    SciTech Connect

    Benabed, Karim; Waerbeke, Ludovic van

    2004-12-15

    We study the sensitivity of weak lensing by large scale structures as a probe of the evolution of dark energy. We explore a two-parameter model of dark energy evolution, inspired by tracking quintessence models. To this end, we compute the likelihood of a few fiducial models with varying and nonvarying equations of state. For the different models, we investigate the dark energy parameter degeneracies with the mass power spectrum shape Γ, normalization σ_8, and with the matter mean density Ω_M. We find that degeneracies are such that weak lensing turns out to be a good probe of dark energy evolution, even with limited knowledge of Γ, σ_8, and Ω_M. This result is a strong motivation for performing large scale structure simulations beyond the simple constant dark energy models, in order to calibrate the nonlinear regime accurately. Such calibration could then be used for any large scale structure tests of dark energy evolution. Prospects for the Canada France Hawaii Telescope Legacy Survey and the SuperNova Acceleration Probe are given. These results complement nicely the cosmic microwave background and supernovae constraints.
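
    The abstract does not state the paper's two-parameter form, but a widely used two-parameter equation of state, given here purely for illustration, is the Chevallier-Polarski-Linder parameterization, under which the dark energy density evolves as:

      \[
      w(a) = w_0 + w_a\,(1-a), \qquad a = \frac{1}{1+z},
      \qquad
      \rho_{\mathrm{DE}}(a) = \rho_{\mathrm{DE},0}\;
      a^{-3(1 + w_0 + w_a)}\, e^{-3 w_a (1-a)}
      \]

    Weak-lensing observables probe this evolution through the distance and growth factors that enter the convergence power spectrum, which is where the degeneracies with Γ, σ_8, and Ω_M arise.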

  9. TOPOLOGY OF A LARGE-SCALE STRUCTURE AS A TEST OF MODIFIED GRAVITY

    SciTech Connect

    Wang Xin; Chen Xuelei; Park, Changbom

    2012-03-01

    The genus of the isodensity contours is a robust measure of the topology of a large-scale structure, and it is relatively insensitive to nonlinear gravitational evolution, galaxy bias, and redshift-space distortion. We show that the growth of density fluctuations is scale dependent even in the linear regime in some modified gravity theories, which opens a new possibility of testing the theories observationally. We propose to use the genus of the isodensity contours, an intrinsic measure of the topology of the large-scale structure, as a statistic to be used in such tests. In Einstein's general theory of relativity, density fluctuations grow at the same rate on all scales in the linear regime, and the genus per comoving volume is almost conserved as structures grow homologously, so we expect that the genus-smoothing-scale relation is basically time independent. However, in some modified gravity models where structures grow with different rates on different scales, the genus-smoothing-scale relation should change over time. This can be used to test the gravity models with large-scale structure observations. We study the cases of the f(R) theory, DGP braneworld theory as well as the parameterized post-Friedmann models. We also forecast how the modified gravity models can be constrained with optical/IR or redshifted 21 cm radio surveys in the near future.
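
    The statistic in question has a closed form in the Gaussian case: for a Gaussian random field smoothed on a given scale, the genus per unit volume as a function of the density threshold ν (in units of the field's standard deviation) takes the well-known shape

      \[
      g(\nu) = A\,\left(1 - \nu^{2}\right) e^{-\nu^{2}/2}
      \]

    where the amplitude A depends only on the second moment of the smoothed power spectrum. As the abstract notes, in general relativity the genus-smoothing-scale relation is essentially time independent, while scale-dependent growth in modified gravity changes it over time, which is what the proposed test exploits.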

  10. Large-scale imprint of relativistic effects in the cosmic magnification

    NASA Astrophysics Data System (ADS)

    Duniya, Didam G. A.

    2016-05-01

    Apart from the known weak gravitational lensing effect, the cosmic magnification acquires relativistic corrections owing to Doppler, integrated Sachs-Wolfe, time-delay and other (local) gravitational potential effects, respectively. These corrections grow on very large scales and high redshifts z, which will be the reach of forthcoming surveys. In this work, these relativistic corrections are investigated in the magnification angular power spectrum, using both (standard) noninteracting dark energy (DE) and interacting DE (IDE). It is found that for noninteracting DE, the relativistic corrections can boost the magnification large-scale power by ~40% at z = 3, and the boost increases at lower z. It is also found that the IDE effect is sensitive to the relativistic corrections in the magnification power spectrum, particularly at low z, which will be crucial for constraints on IDE. Moreover, the results show that if relativistic corrections are not taken into account, this may lead to an incorrect estimate of the large-scale imprint of IDE in the cosmic magnification; including the relativistic corrections can enhance the true potential of the cosmic magnification as a cosmological probe.

  11. Classification of large-scale stellar spectra based on the non-linearly assembling learning machine

    NASA Astrophysics Data System (ADS)

    Liu, Zhongbao; Song, Lipeng; Zhao, Wenjuan

    2016-02-01

    An important problem with traditional classification methods is that they cannot deal with large-scale classification because of their very high time complexity. In order to solve this problem, inspired by the idea of collaborative management, the non-linearly assembling learning machine (NALM) is proposed and used in large-scale stellar spectral classification. In NALM, the large-scale dataset is first divided into several subsets; then a traditional classifier such as the support vector machine (SVM) runs on each subset; finally, the classification results on the subsets are assembled and the overall classification decision is obtained. In comparative experiments, we investigate the performance of NALM in stellar spectral subclass classification compared with SVM. We apply SVM and NALM respectively to classify the four subclasses of K-type spectra, three subclasses of F-type spectra and three subclasses of G-type spectra from the Sloan Digital Sky Survey (SDSS). The comparative results show that NALM performs much better than SVM in terms of classification accuracy and computation time.
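
    The assembling scheme lends itself to a compact sketch: split the training set, fit one SVM per subset, and combine the subset predictions. The Python sketch below uses simple majority voting as the assembling rule, which is a stand-in; the paper's non-linear assembling rule may differ, and integer class labels are assumed.

      import numpy as np
      from sklearn.svm import SVC

      def assemble_predict(X_train, y_train, X_test, n_subsets=10, seed=0):
          # Fit one SVM per random subset of the training data, then
          # combine the per-subset predictions by majority vote.
          # Assumes y_train holds non-negative integer class labels.
          rng = np.random.default_rng(seed)
          idx = rng.permutation(len(X_train))
          votes = []
          for part in np.array_split(idx, n_subsets):
              clf = SVC(kernel="rbf").fit(X_train[part], y_train[part])
              votes.append(clf.predict(X_test))
          votes = np.stack(votes)          # shape: (n_subsets, n_test)
          return np.array([np.bincount(col).argmax() for col in votes.T])

    Each subset fit touches only 1/n_subsets of the data, so the quadratic-in-sample-size cost of SVM training drops sharply, which is the source of the computation-time advantage the abstract reports.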

  12. Results from the 2010 National Survey on Drug Use and Health: Mental Health Findings

    ERIC Educational Resources Information Center

    Substance Abuse and Mental Health Services Administration, 2012

    2012-01-01

    This report presents results pertaining to mental health from the 2010 National Survey on Drug Use and Health (NSDUH), an annual survey of the civilian, noninstitutionalized population of the United States aged 12 years old or older. This report presents national estimates of the prevalence of past year mental disorders and past year mental health…

  13. Dual pricing of health sciences periodicals: a survey.

    PubMed Central

    Miller, D R; Jensen, J E

    1980-01-01

    A survey of dual pricing practices among publishers of health-related journals identified 281 periodicals with an average price differential of over 100% between individual and institutional subscription rates. Both the practice itself and the amount of the differential are increasing, indicating that journal subscriptions of health sciences libraries increasingly provide the financial support necessary for the publication of health sciences journals. Dual pricing is also correlated with copyright royalties. The problems that dual pricing creates for health sciences libraries' budgets are due in part to uncritical purchasing by libraries. Increased consumerism on the part of health science librarians is recommended. PMID:7437588

  14. Recent Developments in Language Assessment and the Case of Four Large-Scale Tests of ESOL Ability

    ERIC Educational Resources Information Center

    Stoynoff, Stephen

    2009-01-01

    This review article surveys recent developments and validation activities related to four large-scale tests of L2 English ability: the iBT TOEFL, the IELTS, the FCE, and the TOEIC. In addition to describing recent changes to these tests, the paper reports on validation activities that were conducted on the measures. The results of this research…

  15. EPIDEMIOLOGY and Health Care Reform The National Health Survey of 1935-1936

    PubMed Central

    2011-01-01

    The National Health Survey undertaken in 1935 and 1936 was the largest morbidity survey until that time. It was also the first national survey to focus on chronic disease and disability. The decision to conduct a survey of this magnitude was part of the larger strategy to reform health care in the United States. The focus on morbidity allowed reformers to argue that the health status of Americans was poor, despite falling mortality rates that suggested the opposite. The focus on chronic disease morbidity proved to be an especially effective way of demonstrating the poor health of the population and the strong links between poverty and illness. The survey, undertaken by a small group of reform-minded epidemiologists led by Edgar Sydenstricker, was made possible by the close interaction during the Depression of agencies and actors in the public health and social welfare sectors, a collaboration which produced new ways of thinking about disease burdens. PMID:21233434

  16. Behavioral health in the gulf coast region following the Deepwater Horizon oil spill: findings from two federal surveys.

    PubMed

    Gould, Deborah W; Teich, Judith L; Pemberton, Michael R; Pierannunzi, Carol; Larson, Sharon

    2015-01-01

    This article summarizes findings from two large-scale, population-based surveys conducted by Substance Abuse and Mental Health Services Administration (SAMHSA) and Centers for Disease Control and Prevention (CDC) in the Gulf Coast region following the 2010 Deepwater Horizon oil spill, to measure the prevalence of mental and substance use disorders, chronic health conditions, and utilization of behavioral health services. Although many area residents undoubtedly experienced increased levels of anxiety and stress following the spill, findings suggest only modest or minimal changes in behavioral health at the aggregate level before and after the spill. The studies do not address potential long-term effects of the spill on physical and behavioral health nor did they target subpopulations that might have been most affected by the spill. Resources mobilized to reduce the economic and behavioral health impacts of the spill on coastal residents, including compensation for lost income from BP and increases in available mental health services, may have resulted in a reduction in potential mental health problems. PMID:25339594

  17. Behavioral Health in the Gulf Coast Region Following the Deepwater Horizon Oil Spill: Findings from Two Federal Surveys

    PubMed Central

    Gould, Deborah W.; Pemberton, Michael R.; Pierannunzi, Carol; Larson, Sharon

    2015-01-01

    This article summarizes findings from two large-scale, population-based surveys conducted by Substance Abuse and Mental Health Services Administration (SAMHSA) and Centers for Disease Control and Prevention (CDC) in the Gulf Coast region following the 2010 Deepwater Horizon oil spill, to measure the prevalence of mental and substance use disorders, chronic health conditions, and utilization of behavioral health services. Although many area residents undoubtedly experienced increased levels of anxiety and stress following the spill, findings suggest only modest or minimal changes in behavioral health at the aggregate level before and after the spill. The studies do not address potential long-term effects of the spill on physical and behavioral health nor did they target subpopulations that might have been most affected by the spill. Resources mobilized to reduce the economic and behavioral health impacts of the spill on coastal residents—including compensation for lost income from BP and increases in available mental health services—may have resulted in a reduction in potential mental health problems. PMID:25339594

  18. Survey of Health Sciences CAI Materials.

    ERIC Educational Resources Information Center

    Kamp, Martin

    A project to develop an automated index of information about existing computerized instruction in the health sciences is reported and described. Methods of obtaining and indexing materials for the catalog are detailed. Entry and recovery techniques and selection of descriptors are described. Results to date show that the data base contains…

  19. Isocurvature modes and Baryon Acoustic Oscillations II: gains from combining CMB and Large Scale Structure

    SciTech Connect

    Carbone, Carmelita; Mangilli, Anna; Verde, Licia E-mail: anna.mangilli@icc.ub.edu

    2011-09-01

    We consider cosmological parameter estimation in the presence of a non-zero isocurvature contribution in the primordial perturbations. A previous analysis showed that even a tiny amount of isocurvature perturbation, if not accounted for, could affect standard rulers calibration from Cosmic Microwave Background observations such as those provided by the Planck mission, affect Baryon Acoustic Oscillations interpretation, and introduce biases in the recovered dark energy properties that are larger than forecasted statistical errors from future surveys. Extending that work, here we adopt a general fiducial cosmology which includes a varying dark energy equation of state parameter and curvature. Besides Baryon Acoustic Oscillations measurements, we include the information from the shape of the galaxy power spectrum and consider a joint analysis of a Planck-like Cosmic Microwave Background probe and a future, space-based, Large Scale Structure probe not too dissimilar from recently proposed surveys. We find that this allows one to break the degeneracies that affect the Cosmic Microwave Background and Baryon Acoustic Oscillations combination. As a result, most of the cosmological parameter systematic biases arising from an incorrect assumption on the isocurvature fraction parameter f_iso become negligible with respect to the statistical errors. We find that the Cosmic Microwave Background and Large Scale Structure combination gives a statistical error σ(f_iso) ∼ 0.008, even when curvature and a varying dark energy equation of state are included, which is smaller than the error obtained from the Cosmic Microwave Background alone when flatness and a cosmological constant are assumed. These results confirm the synergy and complementarity between Cosmic Microwave Background and Large Scale Structure, and the great potential of future and planned galaxy surveys.
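
    The error-shrinking effect of combining two probes can be reproduced with a toy Fisher-matrix forecast: inverting the sum of the Fisher matrices gives a much smaller marginalized error when one probe breaks the other's degeneracy. The 2x2 matrices below are invented for illustration and are not the forecasts of the paper.

```python
# Toy Fisher-matrix forecast: combining two probes breaks a degeneracy.
# The numbers are illustrative only, not the paper's actual Fisher matrices.
import numpy as np

# Parameters: (f_iso, w). The CMB-alone matrix is nearly degenerate along a
# direction mixing the two; the LSS matrix constrains that direction.
F_cmb = np.array([[ 400., -380.],
                  [-380.,  400.]])
F_lss = np.array([[ 900.,  200.],
                  [ 200.,  800.]])

def marginalized_sigma(F, i=0):
    """Marginalized 1-sigma error on parameter i: sqrt of (F^-1)_ii."""
    return np.sqrt(np.linalg.inv(F)[i, i])

print("sigma(f_iso), CMB alone :", marginalized_sigma(F_cmb))
print("sigma(f_iso), CMB + LSS :", marginalized_sigma(F_cmb + F_lss))
```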

  1. LARGE-SCALE CO2 TRANSPORTATION AND DEEP OCEAN SEQUESTRATION

    SciTech Connect

    Hamid Sarv

    1999-03-01

    The technical and economic feasibility of large-scale CO2 transportation and ocean sequestration at depths of 3000 meters or greater was investigated. Two options were examined for transporting and disposing of the captured CO2. In one case, CO2 was pumped from a land-based collection center through long pipelines laid on the ocean floor. Another case considered oceanic tanker transport of liquid carbon dioxide to an offshore floating structure for vertical injection to the ocean floor. In the latter case, a novel concept based on subsurface towing of a 3000-meter pipe and attaching it to the offshore structure was considered. Budgetary cost estimates indicate that for distances greater than 400 km, tanker transportation and offshore injection through a 3000-meter vertical pipe provide the best method for delivering liquid CO2 to deep ocean floor depressions. For shorter distances, CO2 delivery by parallel-laid subsea pipelines is more cost-effective. Estimated costs for 500-km transport and storage at a depth of 3000 meters by subsea pipelines and tankers were 1.5 and 1.4 dollars per ton of stored CO2, respectively. At these prices, the economics of ocean disposal are highly favorable. Future work should focus on addressing technical issues that are critical to the deployment of a large-scale CO2 transportation and disposal system. Pipe corrosion, structural design of the transport pipe, and dispersion characteristics of sinking CO2 effluent plumes have been identified as areas that require further attention. Our planned activities in the next phase include laboratory-scale corrosion testing, structural analysis of the pipeline, analytical and experimental simulations of CO2 discharge and dispersion, and a conceptual economic and engineering evaluation of large-scale implementation.
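
    The reported 400-km crossover follows from two cost curves, one distance-dominated (pipelines) and one dominated by fixed offshore costs (tankers). The linear models below are invented, with coefficients chosen only so that they roughly reproduce the quoted $1.5/t and $1.4/t at 500 km; they are not the report's cost model.

```python
# Back-of-the-envelope crossover between subsea pipelines and tanker transport.
# Cost functions and coefficients are invented for illustration.
def pipeline_cost(km):   # distance-dominated: cost grows with pipeline length
    return 0.5 + 0.002 * km          # $/t of stored CO2

def tanker_cost(km):     # offshore-platform heavy, mild distance dependence
    return 1.2 + 0.0004 * km         # $/t of stored CO2

for km in (200, 400, 500, 800):
    cheaper = "pipeline" if pipeline_cost(km) < tanker_cost(km) else "tanker"
    print(f"{km:4d} km: pipeline ${pipeline_cost(km):.2f}/t, "
          f"tanker ${tanker_cost(km):.2f}/t -> {cheaper}")
```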

  2. Large-Scale Hybrid Motor Testing. Chapter 10

    NASA Technical Reports Server (NTRS)

    Story, George

    2006-01-01

    Hybrid rocket motors can be successfully demonstrated at a small scale virtually anywhere. Many suitcase-sized portable test stands have been assembled to demonstrate the safety of hybrid rockets to audiences. These small show motors and small laboratory-scale motors can give comparative burn rate data for the development of different fuel/oxidizer combinations; however, the questions always asked when hybrids are proposed for large-scale applications are: how do they scale, and has that been shown in a large motor? To answer those questions, large-scale motor testing is required to verify the hybrid motor at its true size. The necessity of conducting large-scale hybrid rocket motor tests to validate the burn rate from small motors to application size has been documented in several places. Comparison of small-scale hybrid data to larger-scale data indicates that the fuel burn rate goes down with increasing port size, even at the same oxidizer flux. This trend holds for conventional hybrid motors with forward oxidizer injection and HTPB-based fuels. While the reason this occurs would make a great study in itself, it is not thoroughly understood at this time. Potential causes include the fact that, since hybrid combustion is boundary-layer driven, larger port sizes reduce the interaction (radiation, mixing, and heat transfer) from the core region of the port. This chapter focuses on some of the large, prototype-sized testing of hybrid motors. The largest motors tested have been AMROC's 250K-lbf thrust motor at Edwards Air Force Base and the Hybrid Propulsion Demonstration Program's 250K-lbf thrust motor at Stennis Space Center. Numerous smaller tests were performed to support the burn rate, stability, and scaling concepts that went into the development of those large motors.
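
    The port-size trend described above is often explored by fitting a power-law burn-rate correlation. The sketch below fits r = a·Gox^n·D^m to synthetic data by log-linear least squares; the data and the coefficients are invented, and a negative fitted m is what the reported trend would look like.

```python
# Fitting a hybrid burn-rate law r = a * Gox^n * D^m by log-linear least squares.
# Data and coefficients are synthetic, for illustration only.
import numpy as np

rng = np.random.default_rng(1)
Gox = rng.uniform(50, 400, 60)        # oxidizer mass flux, kg/(m^2 s)
D = rng.uniform(0.05, 0.5, 60)        # port diameter, m
r = 0.03 * Gox**0.6 * D**-0.15        # synthetic "measurements"
r *= rng.lognormal(sigma=0.05, size=60)   # multiplicative measurement noise

# log r = log a + n log Gox + m log D  ->  ordinary least squares
A = np.column_stack([np.ones_like(Gox), np.log(Gox), np.log(D)])
coef, *_ = np.linalg.lstsq(A, np.log(r), rcond=None)
a, n, m = np.exp(coef[0]), coef[1], coef[2]
print(f"a={a:.3f}, n={n:.2f}, m={m:.2f}  (m < 0: burn rate drops with port size)")
```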

  3. Statistical analysis of large-scale neuronal recording data

    PubMed Central

    Reed, Jamie L.; Kaas, Jon H.

    2010-01-01

    Relating stimulus properties to the response properties of individual neurons and neuronal networks is a major goal of sensory research. Many investigators implant electrode arrays in multiple brain areas and record from chronically implanted electrodes over time to answer a variety of questions. Technical challenges related to analyzing large-scale neuronal recording data are not trivial. Several analysis methods traditionally used by neurophysiologists do not account for dependencies in the data that are inherent in multi-electrode recordings. In addition, when neurophysiological data are not best modeled by the normal distribution and when the variables of interest may not be linearly related, extensions of the linear modeling techniques are recommended. A variety of methods exist to analyze correlated data, even when data are not normally distributed and the relationships are nonlinear. Here we review expansions of the Generalized Linear Model designed to address these data properties. Such methods are used in other research fields, and the application to large-scale neuronal recording data will enable investigators to determine the variable properties that convincingly contribute to the variances in the observed neuronal measures. Standard measures of neuron properties such as response magnitudes can be analyzed using these methods, and measures of neuronal network activity such as spike timing correlations can be analyzed as well. We have done just that in recordings from 100-electrode arrays implanted in the primary somatosensory cortex of owl monkeys. Here we illustrate how one such method, Generalized Estimating Equations analysis, can usefully be applied to large-scale neuronal recordings. PMID:20472395
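
    As a concrete illustration of the method named above, the sketch below fits a Poisson GEE with an exchangeable within-neuron working correlation using statsmodels, on synthetic spike-count data; the variable names and data are invented, not from the owl monkey recordings.

```python
# GEE on synthetic multi-electrode count data: spike counts from repeatedly
# sampled neurons are correlated within neuron, which GEE handles through a
# working covariance structure.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_neurons, n_trials = 50, 40
neuron = np.repeat(np.arange(n_neurons), n_trials)
stimulus = rng.uniform(0, 1, n_neurons * n_trials)
neuron_effect = np.repeat(rng.normal(0, 0.3, n_neurons), n_trials)
spikes = rng.poisson(np.exp(0.5 + 1.2 * stimulus + neuron_effect))
df = pd.DataFrame({"neuron": neuron, "stimulus": stimulus, "spikes": spikes})

# Poisson GEE with an exchangeable within-neuron correlation structure.
model = smf.gee("spikes ~ stimulus", groups="neuron", data=df,
                family=sm.families.Poisson(),
                cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().summary())
```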

  4. Statistical Modeling of Large-Scale Scientific Simulation Data

    SciTech Connect

    Eliassi-Rad, T; Baldwin, C; Abdulla, G; Critchlow, T

    2003-11-15

    With the advent of massively parallel computer systems, scientists are now able to simulate complex phenomena (e.g., explosions of stars). Such scientific simulations typically generate large-scale data sets over the spatio-temporal space. Unfortunately, the sheer sizes of the generated data sets make efficient exploration of them impossible. Constructing queriable statistical models is an essential step in helping scientists glean new insight from their computer simulations. We define queriable statistical models to be descriptive statistics that (1) summarize and describe the data within a user-defined modeling error, and (2) are able to answer complex range-based queries over the spatiotemporal dimensions. In this chapter, we describe systems that build queriable statistical models for large-scale scientific simulation data sets. In particular, we present our Ad-hoc Queries for Simulation (AQSim) infrastructure, which reduces the data storage requirements and query access times by (1) creating and storing queriable statistical models of the data at multiple resolutions, and (2) evaluating queries on these models of the data instead of the entire data set. Within AQSim, we focus on three simple but effective statistical modeling techniques. AQSim's first modeling technique (called the univariate mean modeler) computes the "true" (unbiased) mean of systematic partitions of the data. AQSim's second statistical modeling technique (called the univariate goodness-of-fit modeler) uses the Anderson-Darling goodness-of-fit method on systematic partitions of the data. Finally, AQSim's third statistical modeling technique (called the multivariate clusterer) utilizes the cosine similarity measure to cluster the data into similar groups. Our experimental evaluations on several scientific simulation data sets illustrate the value of using these statistical models on large-scale simulation data sets.
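
    Two of the three modelers can be sketched directly: the unbiased mean and an Anderson-Darling normality check computed over systematic partitions of a variable. The fixed-size block partitioning and the synthetic data below are assumptions for illustration, not AQSim's actual partitioning scheme.

```python
# Per-partition mean and Anderson-Darling goodness-of-fit, in the style of the
# AQSim modelers described above. Data and block size are synthetic choices.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
field = rng.normal(300.0, 25.0, 100_000)   # stand-in for one simulation variable

block = 10_000                             # systematic fixed-size partitions
for i, part in enumerate(field.reshape(-1, block)):
    ad = stats.anderson(part, dist="norm")             # goodness-of-fit modeler
    normal_ok = ad.statistic < ad.critical_values[2]   # 5% significance level
    print(f"partition {i}: mean={part.mean():.2f}  "
          f"A2={ad.statistic:.3f}  normal at 5%: {normal_ok}")
```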

  5. Infectious diseases in large-scale cat hoarding investigations.

    PubMed

    Polak, K C; Levy, J K; Crawford, P C; Leutenegger, C M; Moriello, K A

    2014-08-01

    Animal hoarders accumulate animals in over-crowded conditions without adequate nutrition, sanitation, and veterinary care. As a result, animals rescued from hoarding frequently have a variety of medical conditions including respiratory infections, gastrointestinal disease, parasitism, malnutrition, and other evidence of neglect. The purpose of this study was to characterize the infectious diseases carried by clinically affected cats and to determine the prevalence of retroviral infections among cats in large-scale cat hoarding investigations. Records were reviewed retrospectively from four large-scale seizures of cats from failed sanctuaries from November 2009 through March 2012. The number of cats seized in each case ranged from 387 to 697. Cats were screened for feline leukemia virus (FeLV) and feline immunodeficiency virus (FIV) in all four cases and for dermatophytosis in one case. A subset of cats exhibiting signs of upper respiratory disease or diarrhea had been tested for infections by PCR and fecal flotation for treatment planning. Mycoplasma felis (78%), calicivirus (78%), and Streptococcus equi subspecies zooepidemicus (55%) were the most common respiratory infections. Feline enteric coronavirus (88%), Giardia (56%), Clostridium perfringens (49%), and Tritrichomonas foetus (39%) were most common in cats with diarrhea. The seroprevalences of FeLV and FIV were 8% and 8%, respectively. In the one case in which cats with lesions suspicious for dermatophytosis were cultured for Microsporum canis, 69/76 lesional cats were culture-positive; of these, half were believed to be truly infected and half were believed to be fomite carriers. Cats from large-scale hoarding cases were at high risk for enteric and respiratory infections, retroviruses, and dermatophytosis. Case responders should be prepared for mass treatment of infectious diseases and should implement protocols to prevent transmission of feline or zoonotic infections during the emergency response and when

  6. Improving Design Efficiency for Large-Scale Heterogeneous Circuits

    NASA Astrophysics Data System (ADS)

    Gregerson, Anthony

    Despite increases in logic density, many Big Data applications must still be partitioned across multiple computing devices in order to meet their strict performance requirements. Among the most demanding of these applications is high-energy physics (HEP), which uses complex computing systems consisting of thousands of FPGAs and ASICs to process the sensor data created by experiments at particle accelerators such as the Large Hadron Collider (LHC). Designing such computing systems is challenging due to the scale of the systems, the exceptionally high-throughput and low-latency performance constraints that necessitate application-specific hardware implementations, the requirement that algorithms are efficiently partitioned across many devices, and the possible need to update the implemented algorithms during the lifetime of the system. In this work, we describe our research to develop flexible architectures for implementing such large-scale circuits on FPGAs. In particular, this work is motivated by (but not limited in scope to) high-energy physics algorithms for the Compact Muon Solenoid (CMS) experiment at the LHC. To make efficient use of logic resources in multi-FPGA systems, we introduce Multi-Personality Partitioning, a novel form of the graph partitioning problem, and present partitioning algorithms that can significantly improve resource utilization on heterogeneous devices while also reducing inter-chip connections. To reduce the high communication costs of Big Data applications, we also introduce Information-Aware Partitioning, a partitioning method that analyzes the data content of application-specific circuits, characterizes their entropy, and selects circuit partitions that enable efficient compression of data between chips. We employ our information-aware partitioning method to improve the performance of the hardware validation platform for evaluating new algorithms for the CMS experiment. Together, these research efforts help to improve the efficiency
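
    For readers unfamiliar with circuit partitioning, the sketch below bisects a synthetic netlist graph with networkx's standard Kernighan-Lin heuristic and counts the cut (inter-chip) edges. This is plain graph bisection on invented data, not the Multi-Personality or Information-Aware methods introduced in the dissertation.

```python
# Bisecting a synthetic "circuit" graph to minimize cut edges between chips.
import networkx as nx
from networkx.algorithms.community import kernighan_lin_bisection

# Synthetic netlist: two dense clusters of logic joined by a few nets.
G = nx.random_partition_graph([60, 60], p_in=0.2, p_out=0.01, seed=0)

part_a, part_b = kernighan_lin_bisection(G, seed=0)
cut = sum(1 for u, v in G.edges if (u in part_a) != (v in part_a))
print(f"|A|={len(part_a)}, |B|={len(part_b)}, cut edges={cut}")
```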

  7. Large-scale molten core/material interaction experiments

    SciTech Connect

    Chu, T.Y.

    1984-01-01

    The paper describes the facility and melting technology for large-scale molten core/material interaction experiments being carried out at Sandia National Laboratories. The facility is the largest of its kind anywhere. It is capable of producing core melts of up to 500 kg at a temperature of 3000 K. Results of a recent experiment involving the release of 230 kg of core melt into a magnesia brick crucible are discussed in detail. Data on the thermal and mechanical responses of the magnesia brick, heat flux partitioning, melt penetration, and gas and aerosol generation are presented.

  8. Laser Welding of Large Scale Stainless Steel Aircraft Structures

    NASA Astrophysics Data System (ADS)

    Reitemeyer, D.; Schultz, V.; Syassen, F.; Seefeld, T.; Vollertsen, F.

    In this paper a welding process for large-scale stainless steel structures is presented. The process was developed according to the requirements of an aircraft application; stringers are welded onto a skin sheet in a t-joint configuration. The 0.6 mm thick parts are welded with a thin-disc laser, and seam lengths of up to 1920 mm are demonstrated. The welding process causes angular distortions of the skin sheet which are compensated by a subsequent laser straightening process. Based on a model, straightening process parameters matching the induced welding distortion are predicted. The process combination is successfully applied to stringer-stiffened specimens.

  9. Large-scale genotoxicity assessments in the marine environment.

    PubMed

    Hose, J E

    1994-12-01

    There are a number of techniques for detecting genotoxicity in the marine environment, and many are applicable to large-scale field assessments. Certain tests can be used to evaluate responses in target organisms in situ while others utilize surrogate organisms exposed to field samples in short-term laboratory bioassays. Genotoxicity endpoints appear distinct from traditional toxicity endpoints, but some have chemical or ecotoxicologic correlates. One versatile endpoint, the frequency of anaphase aberrations, has been used in several large marine assessments to evaluate genotoxicity: in the New York Bight, in sediment from San Francisco Bay, and following the Exxon Valdez oil spill.

  10. Why large-scale seasonal streamflow forecasts are feasible

    NASA Astrophysics Data System (ADS)

    Bierkens, M. F.; Candogan Yossef, N.; Van Beek, L. P.

    2011-12-01

    Seasonal forecasts of precipitation and temperature, using either statistical or dynamic prediction, have been around for almost two decades. The skill of these forecasts differs in both space and time, with the highest skill in areas heavily influenced by SST anomalies, such as El Niño, or areas where land surface properties have a major impact on, e.g., monsoon strength, such as the vegetation cover of the Sahel region or the snow cover of the Tibetan plateau. However, the skill of seasonal forecasts is limited in most regions, with anomaly correlation coefficients varying between 0.2 and 0.5 for 1-3 month precipitation totals. This raises the question whether seasonal hydrological forecasting is feasible. Here, we make the case that it is. Using the example of statistical forecasts of NAO strength and related precipitation anomalies over Europe, we show that the skill of large-scale streamflow forecasts is generally much higher than that of the precipitation forecasts themselves, provided that the initial state of the system is accurately estimated. In the latter case, even the precipitation climatology can produce skillful results. This is due to the inertia of the hydrological system rooted in the storage of soil moisture, groundwater and snow pack, as corroborated by a recent study using snow observations for seasonal streamflow forecasting in the Western US. These examples suggest that for accurate seasonal hydrological forecasting, correct state estimation is more important than accurate seasonal meteorological forecasts. However, large-scale estimation of hydrological states is difficult, and validation of large-scale hydrological models often reveals large biases in, e.g., streamflow estimates. Fortunately, as shown with a validation study of the global model PCR-GLOBWB, these biases are of less importance when seasonal forecasts are evaluated in terms of their ability to reproduce anomalous flows and extreme events, i.e. by anomaly correlations or categorical quantile
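
    The skill measure quoted above, the anomaly correlation coefficient, is simply the correlation of forecast and observed departures from climatology. A minimal computation on synthetic data (the climatology value and noise levels are invented):

```python
# Anomaly correlation coefficient (ACC) on synthetic seasonal flows.
import numpy as np

rng = np.random.default_rng(0)
years = 30
climatology = 100.0                      # long-term mean seasonal flow (arb. units)
obs = climatology + rng.normal(0, 20, years)                   # observed flows
fcst = climatology + 0.5 * (obs - climatology) + rng.normal(0, 17, years)

# ACC: correlate departures from climatology, not the raw values.
acc = np.corrcoef(fcst - climatology, obs - climatology)[0, 1]
print(f"anomaly correlation coefficient: {acc:.2f}")
```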

  11. Towards large scale production and separation of carbon nanotubes

    NASA Astrophysics Data System (ADS)

    Alvarez, Noe T.

    Since their discovery, carbon nanotubes (CNTs) have boosted research on and applications of nanotechnology; however, many applications of CNTs are inaccessible because they depend upon large-scale CNT production and separations. Control over the type, chirality, and diameter of CNTs determines many of their physical properties, and such control is still not accessible. This thesis studies the fundamentals of scalable selective reactions of HiPCo CNTs as well as the early phase of routes to an inexpensive approach for large-scale CNT production. On the growth side, this thesis covers a complete wet-chemistry process of catalyst and catalyst-support deposition for the growth of vertically aligned (VA) CNTs. A wet-chemistry preparation process has significant importance for CNT synthesis through chemical vapor deposition (CVD). CVD is by far the most suitable and inexpensive process for large-scale CNT production when compared to other common processes such as laser ablation and arc discharge. However, its potential has been limited by low-yielding and difficult preparation processes for the catalyst and its support, which has reduced its competitiveness. The wet-chemistry process takes advantage of current nanoparticle technology to deposit the catalyst and the catalyst support as a thin film of nanoparticles, making the protocol simple compared to electron beam evaporation and sputtering processes. On the selective-reactions side, this thesis studies UV irradiation of individually dispersed HiPCo CNTs, which generates auto-selective reactions in the liquid phase with good control over diameter and chirality. This technique is ideal for large-scale, continuous-process separations of CNTs by diameter and type. Additionally, an innovative, simple catalyst deposition through abrasion is demonstrated: simple friction between the catalyst and the substrate deposits a high enough density of metal catalyst particles for successful CNT growth. This simple approach has

  12. Novel algorithm of large-scale simultaneous linear equations.

    PubMed

    Fujiwara, T; Hoshi, T; Yamamoto, S; Sogabe, T; Zhang, S-L

    2010-02-24

    We review our recently developed methods for solving large-scale simultaneous linear equations and their applications to electronic structure calculations in both one-electron and many-electron theory. The method is the shifted COCG (conjugate orthogonal conjugate gradient) method based on the Krylov subspace; the most important ingredients for applications are the shift equation and the seed-switching method, which greatly reduce the computational cost. Applications to nano-scale Si crystals and the double orbital extended Hubbard model are presented.
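
    The target problem is a family of shifted systems (A + σI)x = b sharing one right-hand side. The sketch below solves each shift independently with SciPy's CG for clarity; the point of shifted COCG, not implemented here, is that the Krylov subspace is shift-invariant, so all shifts can be handled at roughly the cost of a single solve.

```python
# Solving a family of shifted linear systems (A + sigma*I) x = b, the problem
# class that shifted Krylov methods such as shifted COCG target. Each shift is
# solved independently here; a shifted solver would reuse one Krylov sequence.
import numpy as np
from scipy.sparse import diags, identity
from scipy.sparse.linalg import cg

n = 2000
# Symmetric positive definite tridiagonal test matrix (1D Laplacian-like).
A = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

for sigma in (0.0, 0.1, 1.0):
    x, info = cg(A + sigma * identity(n, format="csr"), b)
    print(f"sigma={sigma:4.1f}  converged={info == 0}  ||x||={np.linalg.norm(x):.3e}")
```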

  13. Quantum computation for large-scale image classification

    NASA Astrophysics Data System (ADS)

    Ruan, Yue; Chen, Hanwu; Tan, Jianing; Li, Xi

    2016-10-01

    Due to the lack of an effective quantum feature extraction method, there is currently no effective way to perform quantum image classification or recognition. In this paper, for the first time, a global quantum feature extraction method based on Schmidt decomposition is proposed. A revised quantum learning algorithm is also proposed that classifies images by computing the Hamming distance between these features. Experimental results on the benchmark database Caltech 101, together with an analysis of the algorithm, yield an effective approach to large-scale image classification in the context of big data.
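
    A classical emulation of the feature pipeline: the Schmidt coefficients of an image state correspond to the singular values of the pixel matrix, which can be binarized and compared by Hamming distance. The thresholding rule and the synthetic two-class data below are illustrative assumptions, not the paper's procedure.

```python
# Schmidt-coefficient features via SVD, classified by Hamming distance
# (classical emulation on synthetic data; thresholding rule is an assumption).
import numpy as np

def schmidt_features(img, k=16):
    """Binarized global feature: which of the top-k normalized singular values
    (Schmidt coefficients) exceed a uniform 1/len(s) share of the total."""
    s = np.linalg.svd(img, compute_uv=False)
    s = s / s.sum()
    return (s[:k] > 1.0 / s.size).astype(np.uint8)

def hamming(a, b):
    return int(np.count_nonzero(a != b))

rng = np.random.default_rng(0)
# Two synthetic "classes": smooth (low-rank-ish) images vs white-noise images.
smooth = [rng.normal(0, 1, (32, 32)).cumsum(0).cumsum(1) for _ in range(20)]
noisy = [rng.normal(0, 1, (32, 32)) for _ in range(20)]
train = [(schmidt_features(im), 0) for im in smooth] + \
        [(schmidt_features(im), 1) for im in noisy]

query = rng.normal(0, 1, (32, 32)).cumsum(0).cumsum(1)   # a new "smooth" image
q = schmidt_features(query)
label = min(train, key=lambda t: hamming(q, t[0]))[1]
print("predicted class:", label)   # expected: 0
```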

  14. Generation of Large-Scale Winds in Horizontally Anisotropic Convection.

    PubMed

    von Hardenberg, J; Goluskin, D; Provenzale, A; Spiegel, E A

    2015-09-25

    We simulate three-dimensional, horizontally periodic Rayleigh-Bénard convection, confined between free-slip horizontal plates and rotating about a distant horizontal axis. When both the temperature difference between the plates and the rotation rate are sufficiently large, a strong horizontal wind is generated that is perpendicular to both the rotation vector and the gravity vector. The wind is turbulent, large-scale, and vertically sheared. Horizontal anisotropy, engendered here by rotation, appears necessary for such wind generation. Most of the kinetic energy of the flow resides in the wind, and the vertical turbulent heat flux is much lower on average than when there is no wind. PMID:26451558

  15. Large Scale Composite Manufacturing for Heavy Lift Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Stavana, Jacob; Cohen, Leslie J.; Houseal, Keth; Pelham, Larry; Lort, Richard; Zimmerman, Thomas; Sutter, James; Western, Mike; Harper, Robert; Stuart, Michael

    2012-01-01

    Risk reduction for large-scale composite manufacturing is an important goal in producing lightweight components for heavy lift launch vehicles. NASA and an industry team successfully employed a building-block approach using low-cost Automated Tape Layup (ATL) of autoclave and Out-of-Autoclave (OoA) prepregs. Several large, curved sandwich panels were fabricated at HITCO Carbon Composites. The aluminum honeycomb core sandwich panels are segments of a 1/16th arc of a 10-meter cylindrical barrel. Lessons learned highlight the manufacturing challenges of producing lightweight composite structures such as fairings for heavy lift launch vehicles.

  16. Search for Large Scale Anisotropies with the Pierre Auger Observatory

    NASA Astrophysics Data System (ADS)

    Bonino, R.; Pierre Auger Collaboration

    The Pierre Auger Observatory studies the nature and the origin of Ultra High Energy Cosmic Rays (>3×10^18 eV). Completed at the end of 2008, it has been continuously operating for more than six years. Using data collected from 1 January 2004 until 31 March 2009, we search for large scale anisotropies with two complementary analyses in different energy windows. No significant anisotropies are observed, resulting in bounds on the first harmonic amplitude at the 1% level at EeV energies.
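
    The quantity bounded above, the first harmonic amplitude in right ascension, comes from a standard Rayleigh analysis. A minimal version on synthetic isotropic arrival directions (event count and directions are invented):

```python
# First-harmonic (Rayleigh) analysis in right ascension on a synthetic,
# isotropic sky; the recovered amplitude should be consistent with zero.
import numpy as np

rng = np.random.default_rng(0)
N = 100_000
ra = rng.uniform(0.0, 2.0 * np.pi, N)    # right ascensions of events, radians

a = (2.0 / N) * np.sum(np.cos(ra))
b = (2.0 / N) * np.sum(np.sin(ra))
amplitude = np.hypot(a, b)               # first-harmonic amplitude r
phase = np.degrees(np.arctan2(b, a))
# Chance probability of a larger amplitude from an isotropic sky (Rayleigh):
p_iso = np.exp(-N * amplitude**2 / 4.0)
print(f"r = {amplitude:.4f}, phase = {phase:.1f} deg, P(>r | iso) = {p_iso:.2f}")
```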

  17. Large-Scale periodic solar velocities: An observational study

    NASA Technical Reports Server (NTRS)

    Dittmer, P. H.

    1977-01-01

    Observations of large-scale solar velocities were made using the mean field telescope and Babcock magnetograph of the Stanford Solar Observatory. Observations were made in the magnetically insensitive ion line at 5124 Å, with light from the center (limb) of the disk right (left) circularly polarized, so that the magnetograph measures the difference in wavelength between center and limb. Computer calculations are made of the wavelength difference produced by global pulsations for spherical harmonics up to second order and of the signal produced by displacing the solar image relative to polarizing optics or diffraction grating.

  18. Evaluation of uncertainty in large-scale fusion metrology

    NASA Astrophysics Data System (ADS)

    Zhang, Fumin; Qu, Xinghua; Wu, Hongyan; Ye, Shenghua

    2008-12-01

    The system for expressing uncertainty in conventional-scale measurement is well established; however, because of the variety of error sources, it is still hard to obtain the uncertainty of large-scale instruments by common methods. In this paper, the uncertainty is evaluated by Monte Carlo simulation. The point clouds created by this method are shown through computer visualization, and point-by-point analysis is made. Thus, in fusion measurement, apart from the uncertainty of every instrument being expressed directly, the contribution each error source makes to the overall uncertainty becomes easy to calculate. Finally, an application of this method to measuring a tunnel component is given.
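
    A minimal version of the Monte Carlo evaluation: draw the instrument's error sources, propagate each draw through the measurement model, and read the uncertainty from the simulated point cloud. The spherical (range, azimuth, elevation) model and the error magnitudes below are illustrative assumptions, not the paper's instrument parameters.

```python
# Monte Carlo uncertainty propagation for a single measured 3D point.
# Measurement model and noise levels are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_draws = 200_000

# Nominal spherical measurement of a target point (e.g., by a laser tracker).
R, az, el = 20.0, np.radians(30.0), np.radians(10.0)    # m, rad, rad
R_s = R + rng.normal(0, 0.5e-3, n_draws)                # 0.5 mm range noise
az_s = az + rng.normal(0, np.radians(0.002), n_draws)   # 2 mdeg angle noise
el_s = el + rng.normal(0, np.radians(0.002), n_draws)

# Propagate every draw through the spherical-to-Cartesian model.
x = R_s * np.cos(el_s) * np.cos(az_s)
y = R_s * np.cos(el_s) * np.sin(az_s)
z = R_s * np.sin(el_s)
cloud = np.column_stack([x, y, z])

sigma = cloud.std(axis=0)   # per-axis standard uncertainty of the point cloud
print("per-axis standard uncertainty (mm):", 1e3 * sigma)
```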

  19. Large-scale sodium spray fire code validation (SOFICOV) test

    SciTech Connect

    Jeppson, D.W.; Muhlestein, L.D.

    1985-01-01

    A large-scale sodium spray fire code validation test was performed in the HEDL 850-m³ Containment System Test Facility (CSTF) as part of the Sodium Spray Fire Code Validation (SOFICOV) program. Six hundred fifty-eight kilograms of sodium was sprayed into an air atmosphere over a period of 2400 s. The sodium spray droplet sizes and spray pattern distribution were estimated. The containment atmosphere temperature and pressure response, containment wall temperature response, and sodium reaction rate with oxygen were measured. These results are compared to post-test predictions made using the SPRAY and NACOM computer codes.

  20. Large scale obscuration and related climate effects open literature bibliography

    SciTech Connect

    Russell, N.A.; Geitgey, J.; Behl, Y.K.; Zak, B.D.

    1994-05-01

    Large scale obscuration and related climate effects of nuclear detonations first became a matter of concern in connection with the so-called "Nuclear Winter Controversy" in the early 1980s. Since then, the world has changed. Nevertheless, concern remains about the atmospheric effects of nuclear detonations, but the source of concern has shifted. Now it focuses less on global, and more on regional effects and their resulting impacts on the performance of electro-optical and other defense-related systems. This bibliography reflects the modified interest.