Science.gov

Sample records for large-scale health survey

  1. Survey on large scale system control methods

    NASA Technical Reports Server (NTRS)

    Mercadal, Mathieu

    1987-01-01

    The problems inherent to large-scale systems such as power networks, communication networks, and economic or ecological systems were studied. The increase in size and flexibility of future spacecraft has put those dynamical systems into the category of large-scale systems, and tools specific to this class are being sought to design control systems that can guarantee greater stability and better performance. Among several survey papers, reference was found to a thorough investigation of decentralized control methods. Especially helpful was its classification of the different existing approaches to dealing with large-scale systems. A very similar classification is used here, even though the papers surveyed differ somewhat from those reviewed in other papers. Special attention is given to the applicability of the existing methods to controlling large mechanical systems such as large space structures. Some recent developments are added to this survey.

  2. Improving Recent Large-Scale Pulsar Surveys

    NASA Astrophysics Data System (ADS)

    Cardoso, Rogerio Fernando; Ransom, S.

    2011-01-01

    Pulsars are unique in that they act as celestial laboratories for precise tests of gravity and other extreme physics (Kramer 2004). There are approximately 2000 known pulsars today, which is less than ten percent of the pulsars in the Milky Way predicted by theoretical models (Lorimer 2004). Of these 2000 known pulsars, approximately ten percent are known millisecond pulsars, objects used for their period stability for detailed physics tests and searches for gravitational radiation (Lorimer 2008). As the field and instrumentation progress, pulsar astronomers attempt to overcome observational biases and detect new pulsars, consequently discovering new millisecond pulsars. We attempt to improve large-scale pulsar surveys by examining three recent pulsar surveys. The first, the Green Bank Telescope 350 MHz Drift Scan, a low-frequency isotropic survey of the northern sky, has yielded a large number of candidates that were visually inspected and identified, resulting in over 34,000 candidates viewed, dozens of detections of known pulsars, and the discovery of a new low-flux pulsar, PSR J1911+22. The second, the PALFA survey, is a high-frequency survey of the galactic plane with the Arecibo telescope. We created a processing pipeline for the PALFA survey at the National Radio Astronomy Observatory in Charlottesville, VA, in addition to making needed modifications upon advice from the PALFA consortium. The third survey examined is a new GBT 820 MHz survey devoted to finding new millisecond pulsars by observing the target-rich environment of unidentified sources in the Fermi LAT catalogue. By approaching these three pulsar surveys at different stages, we seek to improve the success rates of large-scale surveys, and hence the possibility for ground-breaking work in both basic physics and astrophysics.

  3. Associations between adult attachment style and mental health care utilization: Findings from a large-scale national survey.

    PubMed

    Meng, Xiangfei; D'Arcy, Carl; Adams, G Camelia

    2015-09-30

    This study investigated the association between attachment style and the use of a range of mental health services, controlling for socio-demographic, physical, and psychological risk factors. Using a large nationally representative sample from the US National Comorbidity Survey Replication (NCS-R), a total of 5,645 participants (aged 18+) were included. The majority of participants reported their attachment as secure (63.5%), followed by avoidant (22.2%), unclassified (8.8%), and anxious (5.5%). The percentages using the different health services studied varied widely (1.1-31.1%). People with insecure (anxious and avoidant) attachment were more likely to report accessing a hotline, having had a session of psychological counselling or therapy, or getting a prescription or medicine for mental and behavioural problems. Individuals with anxious attachment only were also more likely to report the use of internet support groups or chat rooms. This is the first analysis to explore relationships between self-reported adult attachment style and a wide range of health care services. Insecurely attached individuals were more likely to use a wide range of health care services even after controlling for socio-demographic factors, psychiatric disorders, and chronic health conditions. These findings suggest that adult attachment plays an important role in the use of mental health care services.

  4. Opportunities and challenges for the use of large-scale surveys in public health research: A comparison of the assessment of cancer screening behaviors

    PubMed Central

    Hamilton, Jada G.; Breen, Nancy; Klabunde, Carrie N.; Moser, Richard P.; Leyva, Bryan; Breslau, Erica S.; Kobrin, Sarah C.

    2014-01-01

    Large-scale surveys that assess cancer prevention and control behaviors are a readily available, rich resource for public health researchers. Although these data are used by a subset of researchers who are familiar with them, their potential is not fully realized by the research community for reasons including lack of awareness of the data and limited understanding of their content, methodology, and utility. Until now, no comprehensive resource existed to describe and facilitate use of these data. To address this gap and maximize use of these data, we catalogued the characteristics and content of four surveys that assessed cancer screening behaviors in 2005, the most recent year with concurrent periods of data collection: the National Health Interview Survey, Health Information National Trends Survey, Behavioral Risk Factor Surveillance System, and California Health Interview Survey. We documented each survey's characteristics, measures of cancer screening, and relevant correlates; examined how published studies (n=78) have used the surveys' cancer screening data; and reviewed new cancer screening constructs measured in recent years. This information can guide researchers in deciding how to capitalize on the opportunities presented by these data resources. PMID:25300474

  5. Health Terrain: Visualizing Large Scale Health Data

    DTIC Science & Technology

    2015-12-01

    The promise of the benefits of fully integrated electronic health care systems can only be realized if the ... Language Processing techniques were carried out to process 325,791 clinical notes to extract new terms including diseases, symptoms, and mental and ... military electronic health record systems by allowing system-level integration of the human's visual capabilities into the overall health data based ...

  6. Large scale survey of enteric viruses in river and waste water underlines the health status of the local population.

    PubMed

    Prevost, B; Lucas, F S; Goncalves, A; Richard, F; Moulin, L; Wurtzer, S

    2015-06-01

    Although enteric viruses constitute a major cause of acute waterborne diseases worldwide, environmental data about the occurrence and viral load of enteric viruses in water are not often available. In this study, enteric viruses (i.e., adenovirus, aichivirus, astrovirus, cosavirus, enterovirus, hepatitis A and E viruses, norovirus of genogroups I and II, rotavirus A and salivirus) were monitored in the Seine River and the origin of contamination was untangled. A total of 275 water samples were collected, twice a month for one year, from the Seine River, its tributaries, and the major wastewater treatment plant (WWTP) effluents in the Paris agglomeration. All water samples were negative for hepatitis A and E viruses. Adenovirus (AdV), norovirus GI and GII (NVGI, NVGII) and rotavirus A (RV-A) were the most prevalent and abundant populations in all water samples. The viral load and the detection frequency increased significantly between the samples collected furthest upstream and furthest downstream of the Paris urban area. The calculated viral fluxes clearly demonstrated the measurable impact of WWTP effluents on the viral contamination of the Seine River. The viral load was seasonal for almost all enteric viruses, in accordance with the gastroenteritis recordings provided by the French medical authorities. These results imply a close relationship between the health status of inhabitants and the viral contamination of WWTP effluents, and consequently of surface water. The regular analysis of wastewater could therefore serve as a proxy for monitoring the human viruses circulating in both a population and surface water.
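    The flux bookkeeping behind an abstract like this can be sketched with a toy calculation: a viral flux through a river section is the concentration in the water times the volumetric flow. The function name, concentrations, and flow rates below are illustrative assumptions, not values from the study.

    ```python
    # Toy sketch of a viral flux calculation. All numbers are hypothetical,
    # chosen only to show the unit conversion from m3/s to litres/day.

    def viral_flux(concentration_gc_per_l, flow_m3_per_s):
        """Genome copies passing a river section per day."""
        litres_per_day = flow_m3_per_s * 1000 * 86400  # m3/s -> L/day
        return concentration_gc_per_l * litres_per_day

    # Comparing a hypothetical upstream and downstream section: the
    # difference estimates the load added by the urban area in between.
    upstream = viral_flux(1e3, 300)    # 1,000 gc/L at 300 m3/s
    downstream = viral_flux(5e3, 320)  # 5,000 gc/L at 320 m3/s
    print(f"added flux: {downstream - upstream:.3e} gc/day")
    ```

    Under these made-up numbers, the downstream section carries roughly five times the upstream flux, which is the kind of upstream/downstream contrast the study used to attribute contamination to WWTP effluents.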

  7. Health-Terrain: Visualizing Large Scale Health Data

    DTIC Science & Technology

    2015-04-01

    Award Number: W81XWH-13-1-0020. Title: Health-Terrain: Visualizing Large Scale Health Data. Principal Investigator: Fang, Shiaofen, Ph.D. Report date: April 2015; report type: Annual; dates covered: 7 MAR 2014 - 6 MAR 2015. ... (1) creating a concept space data model, which represents a schema tailored to support diverse visualizations and provides a uniform ontology that ...

  8. Large Scale Survey Data in Career Development Research

    ERIC Educational Resources Information Center

    Diemer, Matthew A.

    2008-01-01

    Large scale survey datasets have been underutilized but offer numerous advantages for career development scholars, as they contain numerous career development constructs with large and diverse samples that are followed longitudinally. Constructs such as work salience, vocational expectations, educational expectations, work satisfaction, and…

  9. Theoretical expectations for bulk flows in large-scale surveys

    NASA Technical Reports Server (NTRS)

    Feldman, Hume A.; Watkins, Richard

    1994-01-01

    We calculate the theoretical expectation for the bulk motion of a large-scale survey of the type recently carried out by Lauer and Postman. Included are the effects of survey geometry, errors in the distance measurements, clustering properties of the sample, and different assumed power spectra. We considered the power spectrum calculated from the Infrared Astronomical Satellite (IRAS)-QDOT survey, as well as spectra from hot + cold and standard cold dark matter models. We find that measurement uncertainty, sparse sampling, and clustering can lead to a much larger expectation for the bulk motion of a cluster sample than for the volume as a whole. However, our results suggest that the expected bulk motion is still inconsistent with that reported by Lauer and Postman at the 95%-97% confidence level.

  10. A bibliographical survey of large-scale systems

    NASA Technical Reports Server (NTRS)

    Corliss, W. R.

    1970-01-01

    A limited, partly annotated bibliography was prepared on the subject of large-scale system control. Approximately 400 references are divided into thirteen application areas, such as large societal systems and large communication systems. A first-author index is provided.

  11. Performance Health Monitoring of Large-Scale Systems

    SciTech Connect

    Rajamony, Ram

    2014-11-20

    This report details the progress made on the ASCR-funded project Performance Health Monitoring for Large Scale Systems. A large-scale application may not achieve its full performance potential due to degraded performance of even a single subsystem. Detecting performance faults, isolating them, and taking remedial action is critical for the scale of systems on the horizon. PHM aims to develop techniques and tools that can be used to identify and mitigate such performance problems. We accomplish this through two main components. The PHM framework encompasses diagnostics, system monitoring, fault isolation, and performance evaluation capabilities that indicate when a performance fault has been detected, whether due to an anomaly in the system itself or due to contention for shared resources between concurrently executing jobs. Software components called the PHM Control system then build upon the capabilities provided by the PHM framework to mitigate degradation caused by performance problems.
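    As one illustration of the kind of detection such a framework performs, a minimal anomaly check over a stream of performance samples can flag a degraded subsystem with a rolling z-score. The function, window size, and threshold below are hypothetical, not part of the PHM project's actual tooling.

    ```python
    # Minimal sketch: flag samples that deviate sharply from the recent
    # history of a performance metric (e.g. per-subsystem throughput).
    from statistics import mean, stdev

    def degraded(samples, window=8, threshold=3.0):
        """Return indices where a sample deviates more than `threshold`
        standard deviations from the preceding window's mean."""
        flags = []
        for i in range(window, len(samples)):
            ref = samples[i - window:i]
            mu, sigma = mean(ref), stdev(ref)
            if sigma > 0 and abs(samples[i] - mu) > threshold * sigma:
                flags.append(i)
        return flags

    # Steady throughput, then a sudden drop on one subsystem:
    metric = [100, 101, 99, 100, 102, 98, 100, 101, 100, 60]
    print(degraded(metric))  # -> [9]
    ```

    A real monitoring system would distinguish an in-system anomaly from cross-job contention, which needs correlated metrics from co-scheduled jobs rather than a single stream.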

  12. Large-scale Health Information Database and Privacy Protection

    PubMed Central

    YAMAMOTO, Ryuichi

    2016-01-01

    Japan was once progressive in the digitalization of healthcare fields but unfortunately has fallen behind in terms of the secondary use of data for public interest. There has recently been a trend to establish large-scale health databases in the nation, and a conflict between data use for public interest and privacy protection has surfaced as this trend has progressed. Databases for health insurance claims or for specific health checkups and guidance services were created according to the law that aims to ensure healthcare for the elderly; however, there is no mention in the act about using these databases for public interest in general. Thus, an initiative for such use must proceed carefully and attentively. The PMDA projects that collect a large amount of medical record information from large hospitals and the health database development project that the Ministry of Health, Labour and Welfare (MHLW) is working on will soon begin to operate according to a general consensus; however, the validity of this consensus can be questioned if issues of anonymity arise. The likelihood that researchers conducting a study for public interest would intentionally invade the privacy of their subjects is slim. However, patients could develop a sense of distrust about their data being used since legal requirements are ambiguous. Nevertheless, without using patients’ medical records for public interest, progress in medicine will grind to a halt. Proper legislation that is clear for both researchers and patients will therefore be highly desirable. A revision of the Act on the Protection of Personal Information is currently in progress. In reality, however, privacy is not something that laws alone can protect; it will also require guidelines and self-discipline. We now live in an information capitalization age. I will introduce the trends in legal reform regarding healthcare information and discuss some basics to help people properly face the issue of health big data and privacy.

  13. Large-scale Health Information Database and Privacy Protection.

    PubMed

    Yamamoto, Ryuichi

    2016-09-01

    Japan was once progressive in the digitalization of healthcare fields but unfortunately has fallen behind in terms of the secondary use of data for public interest. There has recently been a trend to establish large-scale health databases in the nation, and a conflict between data use for public interest and privacy protection has surfaced as this trend has progressed. Databases for health insurance claims or for specific health checkups and guidance services were created according to the law that aims to ensure healthcare for the elderly; however, there is no mention in the act about using these databases for public interest in general. Thus, an initiative for such use must proceed carefully and attentively. The PMDA projects that collect a large amount of medical record information from large hospitals and the health database development project that the Ministry of Health, Labour and Welfare (MHLW) is working on will soon begin to operate according to a general consensus; however, the validity of this consensus can be questioned if issues of anonymity arise. The likelihood that researchers conducting a study for public interest would intentionally invade the privacy of their subjects is slim. However, patients could develop a sense of distrust about their data being used since legal requirements are ambiguous. Nevertheless, without using patients' medical records for public interest, progress in medicine will grind to a halt. Proper legislation that is clear for both researchers and patients will therefore be highly desirable. A revision of the Act on the Protection of Personal Information is currently in progress. In reality, however, privacy is not something that laws alone can protect; it will also require guidelines and self-discipline. We now live in an information capitalization age. I will introduce the trends in legal reform regarding healthcare information and discuss some basics to help people properly face the issue of health big data and privacy.

  14. Large Scale Structure From Motion for Autonomous Underwater Vehicle Surveys

    DTIC Science & Technology

    2004-09-01

    from Figure 6-9 that most of the outliers correspond to the broad carpet waves. 6.3 Results: Bermuda survey. 6.3.1 Context: In August 2002 the SeaBED ... moves, power and/or size limitations lead to lighting patterns that are far from uniform (Figure 1-1). Also with the advent of autonomous underwater ... with strobed light sources rather than continuous lighting, and acquire low-overlap imagery in order to preserve power and cover greater distances

  15. Large-Scale Environmental Influences on Aquatic Animal Health

    EPA Science Inventory

    In the latter portion of the 20th century, North America experienced numerous large-scale mortality events affecting a broad diversity of aquatic animals. Short-term forensic investigations of these events have sometimes characterized a causative agent or condition, but have rare...

  16. Synthesizing Planetary Nebulae for Large Scale Surveys: Predictions for LSST

    NASA Astrophysics Data System (ADS)

    Vejar, George; Montez, Rodolfo; Morris, Margaret; Stassun, Keivan G.

    2017-01-01

    The short-lived planetary nebula (PN) phase of stellar evolution is characterized by a hot central star and a bright, ionized nebula. The PN phase forms after a low- to intermediate-mass star stops burning hydrogen in its core, ascends the asymptotic giant branch, and expels its outer layers of material into space. The exposed hot core produces ionizing UV photons and a fast stellar wind that sweeps up the surrounding material into a dense shell of ionized gas known as the PN. This fleeting stage of stellar evolution provides insight into rare atomic processes and the nucleosynthesis of elements in stars. The inherent brightness of PNe allows them to be used to obtain distances to nearby stellar systems via the PN luminosity function and as kinematic tracers in other galaxies. However, the prevalence of non-spherical morphologies of PNe challenges the current paradigm of PN formation. The role of binarity in the shaping of PNe has recently gained traction, ultimately suggesting that single stars might not form PNe. Searches for binary central stars have increased the binary fraction, but the current PN sample is incomplete. Future wide-field, multi-epoch surveys like the Large Synoptic Survey Telescope (LSST) can impact studies of PNe and improve our understanding of their origin and formation. Using a suite of Cloudy radiative transfer calculations, we study the detectability of PNe in the proposed LSST multiband observations. We compare our synthetic PNe to common sources (stars, galaxies, quasars) and establish discrimination techniques. Finally, we discuss follow-up strategies to verify new LSST-discovered PNe and use limiting distances to estimate the potential sample of PNe enabled by LSST.

  17. Large-scale structure in the Southern Sky Redshift Survey

    NASA Technical Reports Server (NTRS)

    Park, Changbom; Gott, J. R., III; Da Costa, L. N.

    1992-01-01

    The power spectrum from the Southern Sky Redshift Survey and the CfA samples is measured in order to explore the amplitude of fluctuations in the galaxy density. At lambda of less than or equal to 30/h Mpc the observed power spectrum is quite consistent with the standard CDM model. At larger scales the data indicate an excess of power over the standard CDM model. The observed power spectrum from these optical galaxy samples is in good agreement with that drawn from the sparsely sampled IRAS galaxies. The shape of the power spectrum is also studied by examining the relation between the genus per unit volume and the smoothing length. It is found that, over Gaussian smoothing scales from 6 to 14/h Mpc, the power spectrum has a slope of about -1. The topology of the galaxy density field is studied by measuring the shift of the genus curve from the Gaussian case. Over all smoothing scales studied, the observed genus curves are consistent with a random phase distribution of the galaxy density field, as predicted by the inflationary scenarios.

  18. An Open-Source Galaxy Redshift Survey Simulator for next-generation Large Scale Structure Surveys

    NASA Astrophysics Data System (ADS)

    Seljak, Uros

    Galaxy redshift surveys produce three-dimensional maps of the galaxy distribution. On large scales these maps trace the underlying matter fluctuations in a relatively simple manner, so that the properties of the primordial fluctuations along with the overall expansion history and growth of perturbations can be extracted. The BAO standard ruler method to measure the expansion history of the universe using galaxy redshift surveys is thought to be robust to observational artifacts and understood theoretically with high precision. These same surveys can offer a host of additional information, including a measurement of the growth rate of large scale structure through redshift space distortions, the possibility of measuring the sum of neutrino masses, tighter constraints on the expansion history through the Alcock-Paczynski effect, and constraints on the scale-dependence and non-Gaussianity of the primordial fluctuations. Extracting this broadband clustering information hinges on both our ability to minimize and subtract observational systematics to the observed galaxy power spectrum, and our ability to model the broadband behavior of the observed galaxy power spectrum with exquisite precision. Rapid development on both fronts is required to capitalize on WFIRST's data set. We propose to develop an open-source computational toolbox that will propel development in both areas by connecting large scale structure modeling and instrument and survey modeling with the statistical inference process. We will use the proposed simulator to both tailor perturbation theory and fully non-linear models of the broadband clustering of WFIRST galaxies and discover novel observables in the non-linear regime that are robust to observational systematics and able to distinguish between a wide range of spatial and dynamic biasing models for the WFIRST galaxy redshift survey sources. 
We have demonstrated the utility of this approach in a pilot study of the SDSS-III BOSS galaxies, in which we

  19. Language Learning Motivation in China: Results of a Large-Scale Stratified Survey

    ERIC Educational Resources Information Center

    You, Chenjing; Dörnyei, Zoltán

    2016-01-01

    This article reports on the findings of a large-scale cross-sectional survey of the motivational disposition of English language learners in secondary schools and universities in China. The total sample involved over 10,000 students and was stratified according to geographical region and teaching contexts, selecting participants both from urban…

  20. The Use of Online Social Networks by Polish Former Erasmus Students: A Large-Scale Survey

    ERIC Educational Resources Information Center

    Bryla, Pawel

    2014-01-01

    There is an increasing role of online social networks in the life of young Poles. We conducted a large-scale survey among Polish former Erasmus students. We have received 2450 completed questionnaires from alumni of 115 higher education institutions all over Poland. 85.4% of our respondents reported they kept in touch with their former Erasmus…

  1. PERSPECTIVES ON LARGE-SCALE NATURAL RESOURCES SURVEYS WHEN CAUSE-EFFECT IS A POTENTIAL ISSUE

    EPA Science Inventory

    Our objective is to present a perspective on large-scale natural resource monitoring when cause-effect is a potential issue. We believe that the approach of designing a survey to meet traditional commodity production and resource state descriptive objectives is too restrictive an...

  2. An Alternative Way to Model Population Ability Distributions in Large-Scale Educational Surveys

    ERIC Educational Resources Information Center

    Wetzel, Eunike; Xu, Xueli; von Davier, Matthias

    2015-01-01

    In large-scale educational surveys, a latent regression model is used to compensate for the shortage of cognitive information. Conventionally, the covariates in the latent regression model are principal components extracted from background data. This operational method has several important disadvantages, such as the handling of missing data and…

  3. Horvitz-Thompson survey sample methods for estimating large-scale animal abundance

    USGS Publications Warehouse

    Samuel, M.D.; Garton, E.O.

    1994-01-01

    Large-scale surveys to estimate animal abundance can be useful for monitoring population status and trends, for measuring responses to management or environmental alterations, and for testing ecological hypotheses about abundance. However, large-scale surveys may be expensive and logistically complex. To ensure resources are not wasted on unattainable targets, the goals and uses of each survey should be specified carefully and alternative methods for addressing these objectives always should be considered. During survey design, the importance of each survey error component (spatial design, proportion of detected animals, precision in detection) should be considered carefully to produce a complete statistically based survey. Failure to address these three survey components may produce population estimates that are inaccurate (biased low), have unrealistic precision (too precise) and do not satisfactorily meet the survey objectives. Optimum survey design requires trade-offs in these sources of error relative to the costs of sampling plots and detecting animals on plots, considerations that are specific to the spatial logistics and survey methods. The Horvitz-Thompson estimators provide a comprehensive framework for considering all three survey components during the design and analysis of large-scale wildlife surveys. Problems of spatial and temporal (especially survey to survey) heterogeneity in detection probabilities have received little consideration, but failure to account for heterogeneity produces biased population estimates. The goal of producing unbiased population estimates is in conflict with the increased variation from heterogeneous detection in the population estimate. One solution to this conflict is to use an MSE-based approach to achieve a balance between bias reduction and increased variation. Further research is needed to develop methods that address spatial heterogeneity in detection, evaluate the effects of temporal heterogeneity on survey
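    A minimal numerical sketch of the Horvitz-Thompson idea discussed above: each detected animal is weighted by the inverse of its detection probability, and each plot count by the inverse of the plot's inclusion probability. The plot layout, probabilities, and function name below are invented for illustration, not taken from the paper.

    ```python
    # Sketch of a Horvitz-Thompson abundance estimate from a plot survey
    # with imperfect detection. Illustrative values only.

    def horvitz_thompson_abundance(plots):
        """Each plot is (inclusion_prob, [detection_prob per animal seen])."""
        total = 0.0
        for inclusion_prob, detections in plots:
            # Expand each detected animal by 1/p (detection), then the
            # whole plot count by 1/pi (sampling design).
            plot_count = sum(1.0 / p for p in detections)
            total += plot_count / inclusion_prob
        return total

    # Three surveyed plots, each included with probability 0.1; animals
    # detected with probabilities between 0.5 and 0.8.
    plots = [
        (0.1, [0.8, 0.5]),        # 2 animals seen
        (0.1, [0.6]),             # 1 animal seen
        (0.1, [0.7, 0.7, 0.5]),   # 3 animals seen
    ]
    print(round(horvitz_thompson_abundance(plots), 1))  # -> 97.7
    ```

    Six animals seen expand to an estimate of nearly 98, showing how strongly both detection and inclusion probabilities drive the result; heterogeneity in those probabilities, if unmodeled, biases the estimate exactly as the abstract warns.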

  4. Use of large-scale, multi-species surveys to monitor gyrfalcon and ptarmigan populations

    USGS Publications Warehouse

    Bart, Jonathan; Fuller, Mark; Smith, Paul; Dunn, Leah; Watson, Richard T.; Cade, Tom J.; Fuller, Mark; Hunt, Grainger; Potapov, Eugene

    2011-01-01

    We evaluated the ability of three large-scale, multi-species surveys in the Arctic to provide information on abundance and habitat relationships of Gyrfalcons (Falco rusticolus) and ptarmigan. The Program for Regional and International Shorebird Monitoring (PRISM) has surveyed birds widely across the arctic regions of Canada and Alaska since 2001. The Arctic Coastal Plain survey has collected abundance information on the North Slope of Alaska using fixed-wing aircraft since 1992. The Northwest Territories-Nunavut Bird Checklist has collected presence-absence information from little-known locations in northern Canada since 1995. All three surveys provide extensive information on Willow Ptarmigan (Lagopus lagopus) and Rock Ptarmigan (L. muta). For example, they show that ptarmigan are most abundant in western Alaska, next most abundant in northern Alaska and northwest Canada, and least abundant in the Canadian Archipelago. PRISM surveys were less successful in detecting Gyrfalcons, and the Arctic Coastal Plain Survey is largely outside the Gyrfalcon's breeding range. The Checklist Survey, however, reflects the expansive Gyrfalcon range in Canada. We suggest that collaboration by Gyrfalcon and ptarmigan biologists with the organizers of large-scale surveys like the ones we investigated provides an opportunity for obtaining useful information on these species and their environment across large areas.

  5. Bayesian inference of the initial conditions from large-scale structure surveys

    NASA Astrophysics Data System (ADS)

    Leclercq, Florent

    2016-10-01

    Analysis of three-dimensional cosmological surveys has the potential to answer outstanding questions on the initial conditions from which structure appeared, and therefore on the very high energy physics at play in the early Universe. We report on recently proposed statistical data analysis methods designed to study the primordial large-scale structure via physical inference of the initial conditions in a fully Bayesian framework, and applications to the Sloan Digital Sky Survey data release 7. We illustrate how this approach led to a detailed characterization of the dynamic cosmic web underlying the observed galaxy distribution, based on the tidal environment.

  6. Characterising large-scale structure with the REFLEX II cluster survey

    NASA Astrophysics Data System (ADS)

    Chon, Gayoung

    2016-10-01

    We study the large-scale structure with superclusters from the REFLEX X-ray cluster survey together with cosmological N-body simulations. It is important to construct superclusters with criteria such that they are homogeneous in their properties. We lay out our theoretical concept considering future evolution of superclusters in their definition, and show that the X-ray luminosity and halo mass functions of clusters in superclusters are found to be top-heavy, different from those of clusters in the field. We also show a promising aspect of using superclusters to study the local cluster bias and mass scaling relation with simulations.

  7. The Observatorio Astrofisico de Javalambre. A planned facility for large scale surveys

    NASA Astrophysics Data System (ADS)

    Moles, M.; Cenarro, A. J.; Cristóbal-Hornillos, D.; Gruel, N.; Marín Franch, A.; Valdivielso, L.; Viironen, K.

    2011-11-01

    All-sky surveys play a fundamental role in the development of Astrophysics. The need for large-scale surveys comes from two basic motivations: one is to make an inventory of sources as complete as possible and allow for their classification in families. The other is to attack problems demanding the sampling of large volumes to give a detectable signal. New challenges, in particular in the domain of Cosmology, are giving impulse to a new kind of large-scale survey, combining area coverage, depth, and accurate enough spectral information to recover the redshift and spectral energy distribution (SED) of the detected objects. New instruments are needed to satisfy the requirements of those large-scale surveys, in particular large-Etendue telescopes. The Observatorio Astrofisico de Javalambre, OAJ, project includes a telescope of 2.5 m aperture, with a wide field of view, 3 degrees in diameter, and excellent image quality in the whole field. Taking into account that it is going to be fully devoted to carrying out surveys, it will be the highest effective-Etendue telescope to date. The project is completed with a smaller, wide-field auxiliary telescope. The Observatory is being built at Pico del Buitre, Sierra de Javalambre, Teruel, a site with excellent seeing and low sky surface brightness. The institution in charge of the Observatory is the Centro de Estudios de Fisica del Cosmos de Aragon, CEFCA, a new center created in Teruel for the operation and scientific exploitation of the Javalambre Observatory. CEFCA will also be in charge of the data management and archiving. The data will be made accessible to the community. The first planned scientific project is a multi-narrow-band photometric survey covering 8,000 square degrees, designed to produce precise SEDs and photometric redshifts accurate at the 0.3% level. A total of 42 band-pass filters, each 100-120 Å wide and together covering most of the optical spectral range, will be used. In this sense it is the development, at a much

  8. The future of primordial features with large-scale structure surveys

    NASA Astrophysics Data System (ADS)

    Chen, Xingang; Dvorkin, Cora; Huang, Zhiqi; Namjoo, Mohammad Hossein; Verde, Licia

    2016-11-01

    Primordial features are one of the most important extensions of the Standard Model of cosmology, providing a wealth of information on the primordial Universe, ranging from discrimination between inflation and alternative scenarios, to new particle detection, to fine structures in the inflationary potential. We study the prospects of future large-scale structure (LSS) surveys for detecting and constraining these features. We classify primordial feature models into several classes, and for each class we present a simple power spectrum template that encodes the essential physics. We study how well the most ambitious LSS surveys proposed to date, including both spectroscopic and photometric surveys, will be able to improve the constraints with respect to the current Planck data. We find that these LSS surveys will significantly improve the experimental sensitivity to feature signals that are oscillatory in scale, thanks to the 3D information. For a broad range of models, these surveys will be able to reduce the errors on the amplitudes of the features by a factor of 5 or more, including several interesting candidates identified in the recent Planck data. LSS surveys therefore offer an impressive opportunity for primordial feature discovery in the next decade or two. We also compare the advantages of the two types of surveys.
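    The "simple power spectrum template" idea can be sketched generically: feature models typically modulate a smooth power-law spectrum with an oscillation that is linear in k (sharp features in the potential) or linear in ln k (resonant, axion-monodromy-type models). The templates below are illustrative stand-ins with invented parameter values, not the paper's actual parameterizations.

    ```python
    import numpy as np

    def featured_pk(k, A, omega, phi, log_osc=False, ns=0.965, k_star=0.05):
        """Toy primordial power spectrum with an oscillatory feature.

        log_osc=False: oscillation linear in k (sharp-feature type).
        log_osc=True:  oscillation linear in ln k (resonance type).
        All values here are illustrative, not the paper's templates.
        """
        p0 = (k / k_star) ** (ns - 1.0)  # featureless power-law spectrum
        phase = omega * (np.log(k / k_star) if log_osc else k)
        return p0 * (1.0 + A * np.sin(phase + phi))

    k = np.logspace(-4, 0, 200)  # wavenumbers, h/Mpc
    pk_lin = featured_pk(k, A=0.03, omega=100.0, phi=0.0)
    pk_log = featured_pk(k, A=0.03, omega=10.0, phi=0.0, log_osc=True)
    ```

    Constraining such models then amounts to fitting the amplitude A (and frequency/phase) against survey power spectrum measurements.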

  9. Large Scale eHealth Deployment in Europe: Insights from Concurrent Use of Standards.

    PubMed

    Eichelberg, Marco; Chronaki, Catherine

    2016-01-01

    Large-scale eHealth deployment projects face a major challenge when called to select the right set of standards and tools to achieve sustainable interoperability in an ecosystem that includes both legacy systems and new systems reflecting technological trends and progress. No single standard covers all the needs of an eHealth project, and there is a multitude of overlapping and perhaps competing standards that can be employed to define document formats, terminology, and communication protocols, mirroring alternative technical approaches and schools of thought. eHealth projects need to answer the important question of how alternative or inconsistently implemented standards and specifications can be used to ensure practical interoperability and long-term sustainability in large-scale eHealth deployment. In the eStandards project, 19 European case studies reporting from R&D and large-scale eHealth deployment and policy projects were analyzed. Although this study is not exhaustive, by reflecting on the concepts, standards, and tools for concurrent use, and on the successes, failures, and lessons learned, this paper offers practical insights on how eHealth deployment projects can make the most of the available eHealth standards and tools, and how standards and profile developing organizations can serve users while embracing sustainability and technical innovation.

  10. Large-scale galaxy distribution in the Las Campanas Redshift Survey

    NASA Astrophysics Data System (ADS)

    Doroshkevich, A. G.; Tucker, D. L.; Fong, R.; Turchaninov, V.; Lin, H.

    2001-04-01

    We make use of three-dimensional clustering analysis, inertia tensor methods, and the minimal spanning tree technique to estimate some physical and statistical characteristics of the large-scale galaxy distribution and, in particular, of the sample of overdense regions seen in the Las Campanas Redshift Survey (LCRS). Our investigation provides additional evidence for a network of structures found in our core sampling analysis of the LCRS: a system of rich sheet-like structures, which in turn surround large underdense regions criss-crossed by a variety of filamentary structures. We find that the overdense regions contain ~40-50 per cent of LCRS galaxies and have proper sizes similar to those of nearby superclusters. The formation of such structures can be roughly described as a non-linear compression of protowalls of typical cross-sectional size ~20-25 h^-1 Mpc; this scale is ~5 times the conventional value for the onset of non-linear clustering, namely r0, the autocorrelation length for galaxies. The comparison with available simulations and theoretical estimates shows that the formation of structure elements with parameters similar to those observed is presently possible only in low-density cosmological models, Ωmh ~ 0.2-0.3, with a suitable large-scale bias between galaxies and dark matter.
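    The minimal spanning tree technique mentioned above can be illustrated with a toy example: build the MST over galaxy positions, then cut edges longer than a critical length so that the surviving connected components trace overdense structures. The point distribution and threshold here are invented for illustration, not taken from the LCRS analysis.

    ```python
    import numpy as np
    from scipy.sparse.csgraph import minimum_spanning_tree, connected_components
    from scipy.spatial import distance_matrix

    rng = np.random.default_rng(1)
    # two mock "walls" of galaxies, well separated (arbitrary units)
    wall1 = rng.normal([0.0, 0.0], 0.3, size=(60, 2))
    wall2 = rng.normal([8.0, 8.0], 0.3, size=(60, 2))
    pts = np.vstack([wall1, wall2])

    # MST over the complete pairwise-distance graph (sparse result)
    mst = minimum_spanning_tree(distance_matrix(pts, pts))

    # "separate" structures: drop MST edges longer than a critical length
    pruned = mst.copy()
    pruned.data[pruned.data > 2.0] = 0
    pruned.eliminate_zeros()
    n_groups, labels = connected_components(pruned, directed=False)
    print(n_groups)  # -> 2 overdense structures
    ```

    In practice the critical edge length is tied to the mean intergalaxy separation rather than fixed by hand.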

  11. Google Street View as an alternative method to car surveys in large-scale vegetation assessments.

    PubMed

    Deus, Ernesto; Silva, Joaquim S; Catry, Filipe X; Rocha, Miguel; Moreira, Francisco

    2015-10-01

    Car surveys (CS) are a common method for assessing the distribution of alien invasive plants. Google Street View (GSV), a free-access web technology in which users may take a virtual journey along roads, has been suggested as a cost-effective alternative to car surveys. We tested whether we could replicate the results of a countrywide car survey conducted in Portugal by using GSV as a remote sensing tool, with the aim of assessing the distribution of Eucalyptus globulus Labill. wildlings on roadsides adjacent to eucalypt stands. Georeferenced points gathered along the CS were used to create road transects, visible as lines overlapping the road in the GSV environment, allowing the same sampling areas to be surveyed with both methods. This paper presents the results of the comparison between the two methods. Both methods produced similar models of plant abundance, selecting the same explanatory variables, in the same hierarchical order of importance, and depicting a similar influence on plant abundance. Even though the GSV model had a lower performance and the GSV survey detected fewer plants, additional variables collected exclusively with GSV improved model performance and provided new insight into additional factors influencing plant abundance. The survey using GSV required ca. 9 % of the funds and 62 % of the time needed to accomplish the CS. We conclude that GSV may be a cost-effective alternative to CS. We discuss some advantages and limitations of GSV as a survey method. We forecast that GSV may become a widespread tool in road ecology, particularly in large-scale vegetation assessments.

  12. Large-Scale Surveys of Snow Depth on Arctic Sea Ice from Operation IceBridge

    NASA Technical Reports Server (NTRS)

    Kurtz, Nathan T.; Farrell, Sinead L.

    2011-01-01

    We show the first results of a large-scale survey of snow depth on Arctic sea ice from NASA's Operation IceBridge snow radar system for the 2009 season and compare the data to climatological snow depth values established over the 1954-1991 time period. For multiyear ice, the mean radar-derived snow depth is 33.1 cm and the corresponding mean climatological snow depth is 33.4 cm. The small mean difference suggests consistency between contemporary estimates of snow depth and the historical climatology for the multiyear ice region of the Arctic. A 16.5 cm mean difference (climatology minus radar) is observed for first-year ice areas, suggesting that the increasingly seasonal sea ice cover of the Arctic Ocean has led to an overall loss of snow as the region has transitioned away from a dominantly multiyear ice cover.

  13. Large-scale internal structure in volcanogenic breakout flood deposits: Extensive GPR survey on volcaniclastic deposits

    NASA Astrophysics Data System (ADS)

    Kataoka, K.; Gomez, C. A.

    2012-12-01

    Large-scale outburst floods from volcanic lakes, such as caldera lakes or volcanically dammed river valleys, tend to be voluminous, with total discharges of >1-10s of km3 and peak discharges of >10,000s to 100,000s m3 s-1. Such large floods can travel long distances, leaving sediments and bedforms/landforms over extensive areas, with large-scale internal structures that are difficult to assess from single local sites. Moreover, the sediments and bedforms/landforms are sometimes untraceable, and outcrop information obtained by classical geological and geomorphological field surveys is limited to the dissected/terraced parts of the fan body, road cuts and/or large quarries. Therefore GPR (Ground Penetrating Radar), which exploits the propagation of electromagnetic waves through media, seems best adapted to the appraisal of large-scale subsurface structures. Recently, studies applying GPR to volcanic deposits have successfully captured images of lava flows and volcaniclastic deposits and proved the usefulness of this method even in volcanic areas, which often encompass complicated stratigraphy and structures with variable material, grain size, and ferromagnetic content. Using GPR, the present study aims to understand the large-scale internal structures of volcanogenic flood deposits. The survey was carried out over two volcanogenic flood fan (or apron) deposits in northeast Japan, at Numazawa and Towada volcanoes. The 5 ka Numazawa flood deposits in the Tadami river catchment were emplaced by a breakout flood from an ignimbrite-dammed valley, leaving pumiceous gravelly sediments with meter-sized boulders in the flow path. At Towada volcano, a comparable flood event originating from a breach in the caldera rim emplaced the 13-15 ka Sanbongi fan deposits in the Oirase river valley, which are characterized by bouldery fan deposits. The GPR data were collected along 200 to 500 m long lateral and longitudinal transects, which were captured using a GPR Pulse

  14. Measuring Large-Scale Structure at z ~ 1 with the VIPERS galaxy survey

    NASA Astrophysics Data System (ADS)

    Guzzo, Luigi

    2016-10-01

    The VIMOS Public Extragalactic Redshift Survey (VIPERS) is the largest redshift survey ever conducted with the ESO telescopes. It has used the Very Large Telescope to collect nearly 100,000 redshifts from the general galaxy population at 0.5 < z < 1.2. With a combination of volume and high sampling density that is unique for these redshifts, it allows statistical measurements of galaxy clustering and related cosmological quantities to be obtained on an equal footing with classic results from local redshift surveys. At the same time, the simple magnitude-limited selection and the wealth of ancillary photometric data provide a general view of the galaxy population, its physical properties and the relation of the latter to large-scale structure. This paper presents an overview of the galaxy clustering results obtained so far, together with their cosmological implications. Most of these are based on the ~ 55,000 galaxies forming the first public data release (PDR-1). As of January 2015, observations and data reduction are complete and the final data set of more than 90,000 redshifts is being validated and made ready for the final investigations.

  15. Ten key considerations for the successful implementation and adoption of large-scale health information technology.

    PubMed

    Cresswell, Kathrin M; Bates, David W; Sheikh, Aziz

    2013-06-01

    The implementation of health information technology interventions is at the forefront of most policy agendas internationally. However, such undertakings are often far from straightforward as they require complex strategic planning accompanying the systemic organizational changes associated with such programs. Building on our experiences of designing and evaluating the implementation of large-scale health information technology interventions in the USA and the UK, we highlight key lessons learned in the hope of informing the on-going international efforts of policymakers, health directorates, healthcare management, and senior clinicians.

  16. Searching transients in large-scale surveys. A method based on the Abbe value

    NASA Astrophysics Data System (ADS)

    Mowlavi, N.

    2014-08-01

    Aims: A new method is presented to identify transient candidates in large-scale surveys based on the variability pattern in their light curves. Methods: The method is based on the Abbe value, Ab, that estimates the smoothness of a light curve, and on a newly introduced value called the excess Abbe and denoted excessAb, that estimates the regularity of the light curve variability pattern over the duration of the observations. Results: Based on simulated light curves, transients are shown to occupy a specific region in the {diagram} diagram, distinct from sources presenting pulsating-like features in their light curves or having featureless light curves. The method is tested on real light curves taken from EROS-2 and OGLE-II surveys in a 0.50° × 0.17° field of the sky in the Large Magellanic Cloud centered at RA(J2000) = 5h25m56.5s and Dec(J2000) = -69d29m43.3s. The method identifies 43 EROS-2 transient candidates out of a total of 1300 variable stars, and 19 more OGLE-II candidates, 10 of which do not have any EROS-2 variable star matches and which would need further confirmation to assess their reliability. The efficiency of the method is further tested by comparing the list of transient candidates with known Be stars in the literature. It is shown that all Be stars known in the studied field of view with detectable bursts or outbursts are successfully extracted by the method. In addition, four new transient candidates displaying bursts and/or outbursts are found in the field, of which at least two are good new Be candidates. Conclusions: The new method proves to be a potentially powerful tool to extract transient candidates from large-scale multi-epoch surveys. The better the photometric measurement uncertainties are, the cleaner the list of detected transient candidates is. In addition, the diagram diagram is shown to be a good diagnostic tool to check the data quality of multi-epoch photometric surveys. A trend of instrumental and/or data reduction origin
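    The Abbe value itself is a standard smoothness statistic (a scaled von Neumann ratio): the mean squared successive difference divided by twice the variance, so white noise gives values near 1 while smooth, correlated variability gives values near 0. A minimal sketch, assuming the common definition rather than any survey-specific variant:

    ```python
    import numpy as np

    def abbe(y):
        """Abbe value: n/(2(n-1)) * sum((y[i+1]-y[i])^2) / sum((y[i]-mean)^2).

        ~1 for an uncorrelated (featureless) time series;
        << 1 for a smooth, slowly varying light curve.
        """
        y = np.asarray(y, dtype=float)
        n = len(y)
        num = np.sum(np.diff(y) ** 2)
        den = np.sum((y - y.mean()) ** 2)
        return n / (2.0 * (n - 1)) * num / den

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 10.0, 500)
    noisy = rng.normal(size=t.size)                       # featureless light curve
    smooth = np.sin(t) + 0.05 * rng.normal(size=t.size)   # coherent variability

    print(abbe(noisy))   # close to 1
    print(abbe(smooth))  # much smaller than 1
    ```

    The excess Abbe of the paper compares this global value with Abbe values computed on shorter sub-intervals, which is what distinguishes transient (localized) variability from persistent patterns.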

  17. Survey and analysis of selected jointly owned large-scale electric utility storage projects

    SciTech Connect

    Not Available

    1982-05-01

    The objective of this study was to examine and document the issues surrounding the curtailment in commercialization of large-scale electric storage projects. It was sensed that if these issues could be uncovered, then efforts might be directed toward clearing away these barriers and allowing these technologies to penetrate the market to their maximum potential. Joint ownership of these projects was seen as a possible solution to overcoming the major barriers, particularly economic barriers, of commercialization. Therefore, discussions with partners involved in four pumped storage projects took place to identify the difficulties and advantages of joint-ownership agreements. The four plants surveyed included Yards Creek (Public Service Electric and Gas and Jersey Central Power and Light); Seneca (Pennsylvania Electric and Cleveland Electric Illuminating Company); Ludington (Consumers Power and Detroit Edison); and Bath County (Virginia Electric Power Company and Allegheny Power System, Inc.). Also investigated were several pumped storage projects which were never completed. These included Blue Ridge (American Electric Power); Cornwall (Consolidated Edison); Davis (Allegheny Power System, Inc.); and Kittatinny Mountain (General Public Utilities). Institutional, regulatory, technical, environmental, economic, and special issues at each project were investigated, and the conclusions relative to each issue are presented. The major barriers preventing the growth of energy storage are the high cost of these systems in times of extremely high cost of capital, diminishing load growth, and regulatory influences which will not allow the building of large-scale storage systems due to environmental objections or other reasons. However, the future for energy storage looks viable despite difficult economic times for the utility industry. Joint ownership can ease some of the economic hardships for utilities which demonstrate a need for energy storage.

  18. Inclusive constraints on unified dark matter models from future large-scale surveys

    SciTech Connect

    Camera, Stefano; Carbone, Carmelita; Moscardini, Lauro

    2012-03-01

    In recent years, cosmological models in which the properties of the dark components of the Universe (dark matter and dark energy) are accounted for by a single "dark fluid" have drawn increasing attention and interest. Amongst many proposals, Unified Dark Matter (UDM) cosmologies are promising candidates as effective theories. In these models, a scalar field with a non-canonical kinetic term in its Lagrangian mimics both the accelerated expansion of the Universe at late times and the clustering properties of the large-scale structure of the cosmos. However, UDM models also present peculiar behaviours, the most interesting being that the perturbations in the dark-matter component of the scalar field have a non-negligible speed of sound. This gives rise to an effective Jeans scale for the Newtonian potential, below which the dark fluid no longer clusters. This implies a growth of structures fairly different from that of the concordance ΛCDM model. In this paper, we demonstrate that forthcoming large-scale surveys will be able to discriminate between viable UDM models and ΛCDM to a good degree of accuracy. For this purpose, the planned Euclid satellite will be a powerful tool, since it will provide very accurate data on galaxy clustering and the weak lensing effect of cosmic shear. Finally, we also exploit the constraining power of the ongoing CMB Planck experiment. Although our approach is the most conservative, including only well-understood, linear dynamics, in the end we also show what could be done if some amount of non-linear information were included.

  19. Large-scale Analysis of Counseling Conversations: An Application of Natural Language Processing to Mental Health

    PubMed Central

    Althoff, Tim; Clark, Kevin; Leskovec, Jure

    2016-01-01

    Mental illness is one of the most pressing public health issues of our time. While counseling and psychotherapy can be effective treatments, our knowledge about how to conduct successful counseling conversations has been limited due to a lack of large-scale data with labeled conversation outcomes. In this paper, we present a large-scale, quantitative study on the discourse of text-message-based counseling conversations. We develop a set of novel computational discourse analysis methods to measure how various linguistic aspects of conversations are correlated with conversation outcomes. Applying techniques such as sequence-based conversation models, language model comparisons, message clustering, and psycholinguistics-inspired word frequency analyses, we discover actionable conversation strategies that are associated with better conversation outcomes.
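    One of the simplest techniques named above, comparing word frequencies between outcome groups, can be sketched as follows. The messages, the add-one smoothing, and the function name are illustrative, not the paper's actual method, which adds proper normalization and significance testing.

    ```python
    from collections import Counter

    def relative_rates(msgs_good, msgs_bad, min_count=2):
        """Ratio of per-word usage rates in positive- vs negative-outcome
        conversations. Ratios > 1 mark words over-represented in good outcomes.
        A toy sketch: real analyses use smoothing and significance tests.
        """
        good = Counter(w for m in msgs_good for w in m.lower().split())
        bad = Counter(w for m in msgs_bad for w in m.lower().split())
        n_good, n_bad = sum(good.values()), sum(bad.values())
        return {w: (good[w] / n_good) / ((bad[w] + 1) / (n_bad + 1))
                for w in good if good[w] >= min_count}

    # tiny invented corpora standing in for labeled conversations
    rates = relative_rates(
        ["thank you for talking with me", "that plan helps thank you"],
        ["nothing helps", "i do not know"],
    )
    ```

    On real data the same comparison would run over millions of messages, with the word list filtered through psycholinguistic categories.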

  20. Photometric Redshifts for the Dark Energy Survey and VISTA and Implications for Large Scale Structure

    SciTech Connect

    Banerji, Manda; Abdalla, Filipe B.; Lahav, Ofer; Lin, Huan (Fermilab)

    2007-11-01

    We conduct a detailed analysis of the photometric redshift requirements for the proposed Dark Energy Survey (DES) using two sets of mock galaxy simulations and an artificial neural network code, ANNz. In particular, we examine how optical photometry in the DES grizY bands can be complemented with near-infrared photometry from the planned VISTA Hemisphere Survey (VHS) in the JHKs bands in order to improve the photometric redshift estimate by a factor of two at z > 1. We draw attention to the effects of galaxy formation scenarios such as reddening on the photo-z estimate and, using our neural network code, calculate Av for these reddened galaxies. We also look at the impact of using different training sets when calculating photometric redshifts. In particular, we find that using the ongoing DEEP2 and VVDS-Deep spectroscopic surveys to calibrate photometric redshifts for DES will prove effective. However, we need to be aware of uncertainties in the photometric redshift bias that arise when using different training sets, as these will translate into errors in the dark energy equation of state parameter, w. Furthermore, we show that the neural network error estimate on the photometric redshift may be used to remove outliers from our samples before any kind of cosmological analysis, in particular for large-scale structure experiments. By removing all galaxies with a 1σ photo-z scatter greater than 0.1 from our DES+VHS sample, we can constrain the galaxy power spectrum out to a redshift of 2 and reduce the fractional error on this power spectrum by ~15-20% compared to using the entire catalogue.
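    The outlier-removal step described above, cutting galaxies whose per-object photo-z error estimate exceeds 0.1, can be mocked up on synthetic data. The error distribution and sample sizes below are invented for illustration; the point is only that clipping on the estimated error shrinks the residual scatter of the retained sample.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 10_000
    z_true = rng.uniform(0.2, 2.0, n)
    # mock per-galaxy 1-sigma photo-z error estimates (invented distribution)
    sigma = rng.lognormal(mean=-2.8, sigma=0.8, size=n)
    z_phot = z_true + sigma * rng.normal(size=n)

    keep = sigma < 0.1  # clip objects whose error estimate exceeds 0.1
    scatter_all = np.std(z_phot - z_true)
    scatter_cut = np.std(z_phot[keep] - z_true[keep])
    ```

    A real pipeline would apply the cut to the neural-network error estimate, then propagate the cleaned sample into the power spectrum analysis.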

  1. Conducting Large-Scale Surveys in Secondary Schools: The Case of the Youth On Religion (YOR) Project

    ERIC Educational Resources Information Center

    Madge, Nicola; Hemming, Peter J.; Goodman, Anthony; Goodman, Sue; Kingston, Sarah; Stenson, Kevin; Webster, Colin

    2012-01-01

    There are few published articles on conducting large-scale surveys in secondary schools, and this paper seeks to fill this gap. Drawing on the experiences of the Youth On Religion project, it discusses the politics of gaining access to these schools and the considerations leading to the adoption and administration of an online survey. It is…

  2. A Large-scale Survey of CRF55_01B from Men-Who-Have-Sex-with-Men in China: implying the Evolutionary History and Public Health Impact

    PubMed Central

    Han, Xiaoxu; Takebe, Yutaka; Zhang, Weiqing; An, Minghui; Zhao, Bin; Hu, Qinghai; Xu, Junjie; Wu, Hao; Wu, Jianjun; Lu, Lin; Chen, Xi; Liang, Shu; Wang, Zhe; Yan, Hongjing; Fu, Jihua; Cai, Weiping; Zhuang, Minghua; Liao, Christina; Shang, Hong

    2015-01-01

    The HIV-1 epidemic among men-who-have-sex-with-men (MSM) continues to expand in China, involving the co-circulation of several different lineages of HIV-1 strains, including subtype B and CRF01_AE. This expansion has created conditions that facilitate the generation of new recombinant strains. A molecular epidemiologic survey among MSM in 11 provinces/cities around China was conducted from 2008 to 2013. Based on pol nucleotide sequences, a total of 19 strains (1.95%) belonging to CRF55_01B were identified from 975 MSM in 7 provinces, with prevalence ranging from 1.5% to 12.5%. Near full-length genome (NFLG) sequences from six epidemiologically unlinked MSM were amplified to analyze the evolutionary history; an identical genome structure composed of CRF01_AE and subtype B, with four unique recombination breakpoints in the pol region, was identified. Bayesian molecular clock analyses for both the CRF01_AE and B segments indicated that the estimated time of the most recent common ancestor of CRF55_01B was around the year 2000. Our study found that CRF55_01B has spread throughout most provinces with high HIV-1 prevalence, and highlights the importance of continual surveillance of dynamic changes in HIV-1 strains, the emergence of new recombinants, and the need for implementing effective prevention measures specifically targeting the MSM population in China. PMID:26667846

  3. A Large-scale Survey of CRF55_01B from Men-Who-Have-Sex-with-Men in China: implying the Evolutionary History and Public Health Impact.

    PubMed

    Han, Xiaoxu; Takebe, Yutaka; Zhang, Weiqing; An, Minghui; Zhao, Bin; Hu, Qinghai; Xu, Junjie; Wu, Hao; Wu, Jianjun; Lu, Lin; Chen, Xi; Liang, Shu; Wang, Zhe; Yan, Hongjing; Fu, Jihua; Cai, Weiping; Zhuang, Minghua; Liao, Christina; Shang, Hong

    2015-12-15

    The HIV-1 epidemic among men-who-have-sex-with-men (MSM) continues to expand in China, involving the co-circulation of several different lineages of HIV-1 strains, including subtype B and CRF01_AE. This expansion has created conditions that facilitate the generation of new recombinant strains. A molecular epidemiologic survey among MSM in 11 provinces/cities around China was conducted from 2008 to 2013. Based on pol nucleotide sequences, a total of 19 strains (1.95%) belonging to CRF55_01B were identified from 975 MSM in 7 provinces, with prevalence ranging from 1.5% to 12.5%. Near full-length genome (NFLG) sequences from six epidemiologically unlinked MSM were amplified to analyze the evolutionary history; an identical genome structure composed of CRF01_AE and subtype B, with four unique recombination breakpoints in the pol region, was identified. Bayesian molecular clock analyses for both the CRF01_AE and B segments indicated that the estimated time of the most recent common ancestor of CRF55_01B was around the year 2000. Our study found that CRF55_01B has spread throughout most provinces with high HIV-1 prevalence, and highlights the importance of continual surveillance of dynamic changes in HIV-1 strains, the emergence of new recombinants, and the need for implementing effective prevention measures specifically targeting the MSM population in China.

  4. A process for creating multimetric indices for large-scale aquatic surveys

    EPA Science Inventory

    Differences in sampling and laboratory protocols, differences in techniques used to evaluate metrics, and differing scales of calibration and application prohibit the use of many existing multimetric indices (MMIs) in large-scale bioassessments. We describe an approach to develop...

  5. Ensuring Adequate Health and Safety Information for Decision Makers during Large-Scale Chemical Releases

    NASA Astrophysics Data System (ADS)

    Petropoulos, Z.; Clavin, C.; Zuckerman, B.

    2015-12-01

    The 2014 4-Methylcyclohexanemethanol (MCHM) spill into the Elk River of West Virginia highlighted existing gaps in emergency planning for, and response to, large-scale chemical releases in the United States. The Emergency Planning and Community Right-to-Know Act requires that facilities with hazardous substances provide Material Safety Data Sheets (MSDSs), which contain health and safety information on the hazardous substances. The MSDS produced by Eastman Chemical Company, the manufacturer of MCHM, listed "no data available" for various human toxicity subcategories, such as reproductive toxicity and carcinogenicity. As a result of incomplete toxicity data, the public and media received conflicting messages on the safety of the contaminated water from government officials, industry, and the public health community. Two days after the governor lifted the ban on water use, the health department partially retracted the ban by warning pregnant women to continue avoiding the contaminated water, which the Centers for Disease Control and Prevention deemed safe three weeks later. The response in West Virginia represents a failure in risk communication and calls into question whether government officials have sufficient information to support evidence-based decisions during future incidents. Research capabilities, like the National Science Foundation RAPID funding, can provide a solution to some of the data gaps, such as information on environmental fate in the case of the MCHM spill. In order to inform policy discussions on this issue, a methodology for assessing the outcomes of RAPID and similar National Institutes of Health grants in the context of emergency response is employed to examine the efficacy of research-based capabilities in enhancing public health decision-making capacity. The results of this assessment highlight potential roles rapid scientific research can fill in ensuring adequate health and safety data are readily available for decision makers during large-scale

  6. Linking Errors in Trend Estimation in Large-Scale Surveys: A Case Study. Research Report. ETS RR-10-10

    ERIC Educational Resources Information Center

    Xu, Xueli; von Davier, Matthias

    2010-01-01

    One of the major objectives of large-scale educational surveys is reporting trends in academic achievement. For this purpose, a substantial number of items are carried from one assessment cycle to the next. The linking process that places academic abilities measured in different assessments on a common scale is usually based on a concurrent…

  7. Evaluating large-scale health programmes at a district level in resource-limited countries

    PubMed Central

    Mate, Kedar S

    2011-01-01

    Recent experience in evaluating large-scale global health programmes has highlighted the need to consider contextual differences between sites implementing the same intervention. Traditional randomized controlled trials are ill-suited for this purpose, as they are designed to identify whether an intervention works, not how, when and why it works. In this paper we review several evaluation designs that attempt to account for contextual factors that contribute to intervention effectiveness. Using these designs as a base, we propose a set of principles that may help to capture information on context. Finally, we propose a tool, called a driver diagram, traditionally used in implementation work, that would allow evaluators to systematically monitor changing dynamics in project implementation and identify contextual variation across sites. We describe an implementation-related example from South Africa to underline the strengths of the tool. If used across multiple sites and multiple projects, the resulting driver diagrams could be pooled together to form a generalized theory of how, when and why a widely-used intervention works. Mechanisms similar to the driver diagram are urgently needed to complement existing evaluations of large-scale implementation efforts. PMID:22084529

  8. Health impacts of large-scale floods: governmental decision-making and resilience of the citizens.

    PubMed

    Fundter, Dick Q P; Jonkman, Bas; Beerman, Steve; Goemans, Corsmas L P M; Briggs, Rosanna; Coumans, Frits; Lahaye, Jan Willem; Bierens, Joost

    2008-01-01

    During the 15th World Congress on Disaster and Emergency Medicine in Amsterdam, May 2007 (15WCDEM), a targeted agenda program (TAP) about the public health aspects of large-scale floods was organized. The main goal of the TAP was the establishment of an overview of issues that would help governmental decision-makers to develop policies to increase the resilience of citizens during floods. During the meetings, it became clear that citizens have a natural resistance to evacuation, which results in deaths due to drowning and in injuries. Recently, communication and education programs have been developed that may increase awareness that timely evacuation is important and can be life-saving. After a flood, health problems persist over prolonged periods, including increased death rates during the first year after a flood and a higher incidence of chronic illnesses that last for decades after the flood recedes. Population-based resilience (bottom-up) and governmental responsibility (top-down) must be combined to prepare regions for the health impact of evacuations and floods. More research data are needed to better understand the health impact and the consequences of relocating health infrastructure after evacuations. A better understanding of the consequences of floods will support governmental decision-making to mitigate the health impact. A top-10 priority action list was formulated.

  9. High prevalence of caprine arthritis encephalitis virus (CAEV) in Taiwan revealed by large-scale serological survey

    PubMed Central

    YANG, Wei-Cheng; CHEN, Hui-Yu; WANG, Chi-Young; PAN, Hung-Yu; WU, Cheng-Wei; HSU, Yun-Hsiu; SU, Jui-Chuan; CHAN, Kun-Wei

    2016-01-01

    In this study, a large-scale serological survey of caprine arthritis encephalitis virus (CAEV) infection was conducted between March 2011 and October 2012, in which 3,437 goat blood or milk samples were collected from 65 goat farms throughout Taiwan. A commercial ELISA kit was used to detect antibodies against CAEV. The overall seropositive rate was 61.7% (2,120/3,437) among goats, and 98.5% (64/65) of goat farms had at least one seropositive animal. These results provide the first large-scale serological evidence of CAEV infection, indicating that the disease is widespread in Taiwan. PMID:27916786

  10. Human-Machine Cooperation in Large-Scale Multimedia Retrieval: A Survey

    ERIC Educational Resources Information Center

    Shirahama, Kimiaki; Grzegorzek, Marcin; Indurkhya, Bipin

    2015-01-01

    "Large-Scale Multimedia Retrieval" (LSMR) is the task of quickly analyzing a large amount of multimedia data, such as images or videos, and accurately finding the items relevant to a certain semantic meaning. Although LSMR has been investigated for more than two decades in the fields of multimedia processing and computer vision, a more…

  11. Health risks from large-scale water pollution: trends in Central Asia.

    PubMed

    Törnqvist, Rebecka; Jarsjö, Jerker; Karimov, Bakhtiyor

    2011-02-01

    Limited data on the pollution status of spatially extensive water systems constrain health-risk assessments at the basin scale. Using a recipient measurement approach in a terminal water body, we show that agricultural and industrial pollutants in groundwater-surface water systems of the Aral Sea Drainage Basin (covering the main part of Central Asia) yield cumulative health hazards above guideline values in downstream surface waters, due to high concentrations of copper, arsenic, nitrite, and to a certain extent dichlorodiphenyltrichloroethane (DDT). Considering these high-impact contaminants, we furthermore perform trend analyses of their upstream spatial-temporal distribution, investigating dominant large-scale spreading mechanisms. The ratio between parent DDT and its degradation products showed that discharges into or deposition onto surface waters are likely to be recent or ongoing. In river water, copper concentrations peak during the spring season, after thawing and snowmelt. The high spatial variability of arsenic concentrations in river water could reflect its local presence in the topsoil of nearby agricultural fields. Overall, groundwaters were associated with much higher health risks than surface waters. Health risks can therefore increase considerably if the downstream population must switch to groundwater-based drinking-water supplies during surface-water shortages. Arid regions are generally vulnerable to this problem due to ongoing irrigation expansion and climate change.
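    "Cumulative health hazards above guideline values" are conventionally expressed as a hazard index, the sum of concentration-to-guideline ratios, with HI > 1 flagging concern. The sketch below uses illustrative concentrations and WHO-style guideline values, not the study's data:

```python
# Guideline values in mg/L (WHO-style drinking-water guidelines; treat these
# as assumptions for illustration, not the paper's reference values).
GUIDELINES = {"copper": 2.0, "arsenic": 0.01, "nitrite": 3.0, "DDT": 0.001}

def hazard_index(concentrations):
    """Sum of concentration/guideline ratios over the measured contaminants."""
    return sum(c / GUIDELINES[name] for name, c in concentrations.items())

# Hypothetical downstream sample, mg/L.
sample = {"copper": 1.0, "arsenic": 0.008, "nitrite": 1.5, "DDT": 0.0004}
hi = hazard_index(sample)
print(f"HI = {hi:.2f}")  # HI > 1 indicates a cumulative hazard above guidelines
```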

  12. A Survey on Routing Protocols for Large-Scale Wireless Sensor Networks

    PubMed Central

    Li, Changle; Zhang, Hanxiao; Hao, Binbin; Li, Jiandong

    2011-01-01

    With advances in micro-electronics, wireless sensor devices have become much smaller and more integrated, and large-scale wireless sensor networks (WSNs), based on cooperation among a significant number of nodes, have become a hot topic. "Large-scale" mainly means a large area or high density of a network. Accordingly, routing protocols must scale well as the network scope extends and node density increases. A sensor node is normally energy-limited and cannot be recharged, so its energy consumption has a significant effect on the scalability of the protocol. To the best of our knowledge, the mainstream methods currently used to solve the energy problem in large-scale WSNs are hierarchical routing protocols. In a hierarchical routing protocol, all nodes are divided into several groups with different assignment levels. Nodes at the high level are responsible for data aggregation and management work, while low-level nodes sense their surroundings and collect information. Hierarchical routing protocols have proved to be more energy-efficient than flat ones, in which all nodes play the same role, especially in terms of data aggregation and the flooding of control packets. Focusing on the hierarchical structure, in this paper we provide an insight into routing protocols designed specifically for large-scale WSNs. According to their different objectives, the protocols are generally classified by criteria such as control-overhead reduction, energy-consumption mitigation and energy balance. To give a comprehensive understanding of each protocol, we highlight their innovative ideas, describe the underlying principles in detail and analyze their advantages and disadvantages. Moreover, a comparison of the routing protocols is conducted to demonstrate the differences between them in terms of message complexity, memory requirements, localization, data aggregation, clustering manner
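    As a minimal sketch of the hierarchical idea described above, the classic LEACH protocol rotates the energy-hungry cluster-head role among nodes using a probabilistic election threshold. This is a generic textbook example, not one of the specific protocols surveyed:

```python
import random

def leach_threshold(p, rnd, was_head_recently):
    """LEACH election threshold T(n) = p / (1 - p * (r mod 1/p)).
    Nodes that served as cluster head in the current rotation epoch
    are excluded until every node has taken a turn."""
    if was_head_recently:
        return 0.0
    return p / (1 - p * (rnd % int(1 / p)))

random.seed(1)
p = 0.1  # desired fraction of cluster heads per round
heads = [n for n in range(100)
         if random.random() < leach_threshold(p, rnd=0, was_head_recently=False)]
print(len(heads))  # roughly p * 100 nodes elect themselves cluster head
```

    The threshold grows as a rotation epoch progresses, so nodes that have not yet served become increasingly likely to be elected, balancing energy drain across the network.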

  13. Public knowledge and preventive behavior during a large-scale Salmonella outbreak: results from an online survey in the Netherlands

    PubMed Central

    2014-01-01

    Background: Food-borne Salmonella infections are a worldwide concern. During a large-scale outbreak, it is important that the public follows preventive advice. To increase compliance, insight into how the public gathers its knowledge and which factors determine whether or not an individual complies with preventive advice is crucial. Methods: In 2012, contaminated salmon caused a large Salmonella Thompson outbreak in the Netherlands. During the outbreak, we conducted an online survey (n = 1,057) to assess the general public's perceptions, knowledge, preventive behavior and sources of information. Results: Respondents perceived Salmonella infections and the 2012 outbreak as severe (m = 4.21 on a five-point scale with 5 as severe). Their knowledge regarding common food sources, the incubation period and regular treatment of Salmonella (gastro-enteritis) was relatively low (e.g., only 28.7% knew that Salmonella is not normally treated with antibiotics). Preventive behavior differed widely, and the majority (64.7%) did not check for contaminated salmon at home. Most information about the outbreak was gathered through traditional media and news and newspaper websites, mostly determined by time spent on the medium. Social media played a marginal role. Wikipedia seemed a potentially important source of information. Conclusions: To persuade the public to take preventive actions, public health organizations should deliver their message primarily through mass media. Wikipedia seems a promising instrument for educating the public about food-borne Salmonella. PMID:24479614

  14. Detected Galaxies and Large Scale Structure in the Arecibo L-band Feed Array Zone of Avoidance Survey (ALFAZOA)

    NASA Astrophysics Data System (ADS)

    Henning, Patricia A.; Sanchez-Barrantes, Monica; McIntyre, Travis; Minchin, Robert F.; Momjian, Emmanuel; Butcher, Zhon; Rosenberg, Jessica L.; Schneider, Stephen E.; Staveley-Smith, Lister; van Driel, Wim; Ramatsoku, Mpati; Koribalski, Baerbel; Spears, Brady

    2017-01-01

    While large, systematic redshift surveys of galaxies have been conducted for decades, the lack of information behind the Milky Way (the Zone of Avoidance) contributes uncertainty to our picture of dynamics in the local universe. Controversy persists over the dipole calculated from galaxy and redshift surveys compared to the CMB. Depth in redshift space is an issue, as is incomplete sky mapping, even in supposedly all-sky redshift surveys. For instance, the wide-angle 2MASS Redshift Survey retains a gap of 5-8 deg around the Galactic plane. Fortunately, there is no ZOA at 21 cm, except for velocities occupied by the Galaxy. This long-wavelength spectral line passes unimpeded through dust and is unaffected by stellar confusion. With immediate redshift determination, a 21-cm survey produces a three-dimensional map of the distribution of obscured galaxies which contain HI. It traces large-scale structure right across the Galactic plane and identifies obscured mass overdensities relevant to flow-field studies. ALFAZOA is a blind HI survey for galaxies behind the Milky Way covering more than 1,000 square degrees of the Arecibo sky. It proceeds in two phases: shallow (completed) and deep (ongoing). The shallow survey (rms ~5-7 mJy) mapped the region within Galactic longitude l = 30-75 deg and latitude b = -10 to +10 deg, detecting several hundred galaxies to about 12,000 km/s and tracing large-scale structure across the plane. The deep survey (rms ~1 mJy), covering both the inner (Galactic longitude 30-75 deg, latitude ±2 deg) and outer (longitude 175-207 deg, latitude +1 to -2 deg) Galaxy, is ongoing, with detections reaching to 18,000 km/s. Analysis of detections to date, and the large-scale structure mapped, will be presented.

  15. Child Maltreatment Experience among Primary School Children: A Large Scale Survey in Selangor State, Malaysia

    PubMed Central

    Ahmed, Ayesha; Wan-Yuen, Choo; Marret, Mary Joseph; Guat-Sim, Cheah; Othman, Sajaratulnisah; Chinna, Karuthan

    2015-01-01

    Official reports of child maltreatment in Malaysia have increased persistently throughout the last decade. However, there is a lack of population surveys evaluating the actual burden of child maltreatment, its correlates and its consequences in the country. This cross-sectional study employed two-stage stratified cluster random sampling of public primary schools to survey 3,509 ten- to twelve-year-old school children in Selangor state. It aimed to estimate the prevalence of parental physical and emotional maltreatment, parental neglect and teacher-inflicted physical maltreatment. It further aimed to examine the associations between child maltreatment and important socio-demographic factors, family functioning and symptoms of depression among children. Logistic regression on weighted samples was used to extend results to the population level. Three quarters of 10-12 year olds reported at least one form of maltreatment, with parental physical maltreatment being most common. Males had higher odds of maltreatment in general, except for emotional maltreatment. Ethnicity and parental conflict were key factors associated with maltreatment. The study contributes important evidence towards improving public health interventions for child maltreatment prevention in the country. PMID:25786214

  17. Workplace Bullying and Sleep Disturbances: Findings from a Large Scale Cross-Sectional Survey in the French Working Population

    PubMed Central

    Niedhammer, Isabelle; David, Simone; Degioanni, Stéphanie; Drummond, Anne; Philip, Pierre

    2009-01-01

    Study Objectives: The purpose of this study was to explore the associations between workplace bullying, the characteristics of workplace bullying, and sleep disturbances in a large sample of employees of the French working population. Design: Workplace bullying, evaluated using the validated instrument developed by Leymann, and sleep disturbances, as well as covariates, were measured using a self-administered questionnaire. Covariates included age, marital status, presence of children, education, occupation, working hours, night work, physical and chemical exposures at work, self-reported health, and depressive symptoms. Statistical analysis was performed using logistic regression and was carried out separately for men and women. Setting: General working population. Participants: The study population consisted of a random sample of 3,132 men and 4,562 women from the working population in the southeast of France. Results: Workplace bullying was strongly associated with sleep disturbances. Past exposure to bullying also increased the risk for this outcome. The more frequent the exposure to bullying, the higher the risk of experiencing sleep disturbances. Observing someone else being bullied in the workplace was also associated with the outcome. Adjustment for covariates did not modify the results. Additional adjustment for self-reported health and depressive symptoms diminished the magnitude of the associations, which remained significant. Conclusions: The prevalence of workplace bullying (around 10%) was found to be high in this study, as was the impact of this major job-related stressor on sleep disturbances. Although no conclusion about causality could be drawn from this cross-sectional study, the findings suggest that the contribution of workplace bullying to the burden of sleep disturbances may be substantial. Citation: Niedhammer I; David S; Degioanni S; Drummond A; Philip P. Workplace bullying and sleep disturbances: findings from a large scale cross-sectional survey in the French working population.

  18. The Muenster Red Sky Survey: Large-scale structures in the universe

    NASA Astrophysics Data System (ADS)

    Ungruhe, R.; Seitter, W. C.; Duerbeck, H. W.

    2003-01-01

    We present a large-scale galaxy catalogue for the red spectral region which covers an area of 5,000 square degrees. It contains positions, red magnitudes, radii, ellipticities and position angles of about 5.5 million galaxies. Together with the APM catalogue (4,300 square degrees) in the blue spectral region, this catalogue forms at present the largest coherent database for cosmological investigations in the southern hemisphere. 217 ESO Southern Sky Atlas R Schmidt plates with galactic latitudes below -45 degrees were digitized with the two PDS microdensitometers of the Astronomisches Institut Münster, with a step width of 15 microns, corresponding to 1.01 arcseconds per pixel. All data were stored on different storage media and are available for further investigations. Suitable search parameters must be chosen in such a way that all objects are found on the plates and the percentage of artificial objects remains as low as possible. Based on two reference areas on different plates, a search threshold of 140 PDS density units and a minimum number of four pixels per object were chosen. The detected objects were stored, according to size, in frames of different sizes. Each object was investigated in its frame, and 18 object parameters were determined. The classification of objects into stars, galaxies and perturbed objects was done with an automatic procedure which makes use of combinations of computed object parameters. In the first step, the perturbed objects are removed from the catalogue. Double objects and noise objects can be excluded on the basis of symmetry properties, while for satellite trails, a new classification criterion based on apparent magnitude, effective radius and apparent ellipticity was developed. For the remaining objects, a star/galaxy separation was carried out. For bright objects, the relation between apparent magnitude and effective radius serves as the discriminating property, for fainter objects, the relation between effective

  19. Multi-stage sampling for large scale natural resources surveys: A case study of rice and waterfowl

    USGS Publications Warehouse

    Stafford, J.D.; Reinecke, K.J.; Kaminski, R.M.; Gerard, P.D.

    2005-01-01

    Large-scale sample surveys to estimate abundance and distribution of organisms and their habitats are increasingly important in ecological studies. Multi-stage sampling (MSS) is especially suited to large-scale surveys because of the natural clustering of resources. To illustrate an application, we: (1) designed a stratified MSS to estimate late autumn abundance (kg/ha) of rice seeds in harvested fields as food for waterfowl wintering in the Mississippi Alluvial Valley (MAV); (2) investigated options for improving the MSS design; and (3) compared statistical and cost efficiency of MSS to simulated simple random sampling (SRS). During 2000-2002, we sampled 25-35 landowners per year, 1 or 2 fields per landowner per year, and measured seed mass in 10 soil cores collected within each field. Analysis of variance components and costs for each stage of the survey design indicated that collecting 10 soil cores per field was near the optimum of 11-15, whereas sampling >1 field per landowner provided few benefits because data from fields within landowners were highly correlated. Coefficients of variation (CV) of annual estimates of rice abundance ranged from 0.23 to 0.31 and were limited by variation among landowners and the number of landowners sampled. Design effects representing the statistical efficiency of MSS relative to SRS ranged from 3.2 to 9.0, and simulations indicated SRS would cost, on average, 1.4 times more than MSS because clustering of sample units in MSS decreased travel costs. We recommend MSS as a potential sampling strategy for large-scale natural resource surveys and specifically for future surveys of the availability of rice as food for waterfowl in the MAV and similar areas.
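    The finding that about 10 cores per field was near the optimum of 11-15 follows from the classical two-stage sampling trade-off between per-field and per-core costs. A hedged sketch using Cochran's optimum-allocation formula, with assumed cost and variance values (not the paper's):

```python
import math

def optimal_cores_per_field(cost_field, cost_core, var_within, var_between):
    """Cochran's optimum number of secondary units per primary unit in
    two-stage sampling: m = sqrt((c1 / c2) * (s2^2 / s1^2)), where c1 is
    the cost of adding a field, c2 the cost of one core, s2^2 the
    within-field variance and s1^2 the between-field variance component."""
    return math.sqrt((cost_field / cost_core) * (var_within / var_between))

# Illustrative: reaching a field costs ~100x one extra soil core, and the
# within-field variance is 1.5x the between-field component.
m = optimal_cores_per_field(cost_field=100.0, cost_core=1.0,
                            var_within=1.5, var_between=1.0)
print(round(m))  # lands inside the 11-15 range the authors report
```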

  20. A Survey of Residents' Perceptions of the Effect of Large-Scale Economic Developments on Perceived Safety, Violence, and Economic Benefits

    PubMed Central

    Fabio, Anthony; Geller, Ruth; Bazaco, Michael; Bear, Todd M.; Foulds, Abigail L.; Duell, Jessica; Sharma, Ravi

    2015-01-01

    Background. Emerging research highlights the promise of community- and policy-level strategies in preventing youth violence. Large-scale economic developments, such as sports and entertainment arenas and casinos, may improve the living conditions, economics, public health, and overall wellbeing of area residents and may influence rates of violence within communities. Objective. To assess the effect of community economic development efforts on neighborhood residents' perceptions on violence, safety, and economic benefits. Methods. Telephone survey in 2011 using a listed sample of randomly selected numbers in six Pittsburgh neighborhoods. Descriptive analyses examined measures of perceived violence and safety and economic benefit. Responses were compared across neighborhoods using chi-square tests for multiple comparisons. Survey results were compared to census and police data. Results. Residents in neighborhoods with the large-scale economic developments reported more casino-specific and arena-specific economic benefits. However, 42% of participants in the neighborhood with the entertainment arena felt there was an increase in crime, and 29% of respondents from the neighborhood with the casino felt there was an increase. In contrast, crime decreased in both neighborhoods. Conclusions. Large-scale economic developments have a direct influence on the perception of violence, despite actual violence rates. PMID:26273310

  1. A large scale geophysical survey in the archaeological site of Europos (northern Greece)

    NASA Astrophysics Data System (ADS)

    Tsokas, G. N.; Giannopoulos, A.; Tsourlos, P.; Vargemezis, G.; Tealby, J. M.; Sarris, A.; Papazachos, C. B.; Savopoulou, T.

    1994-04-01

    The results of a large-scale exploration of an archaeological site by geophysical means are presented and discussed. The operation took place at the site where the ruins of the ancient city of Europos, in northern Greece, are buried. Resistivity prospecting was employed to detect the remnants of wall foundations where the main urban complex of the ancient city once stood. The data were transformed into an image depicting the spatial variation of resistivity in a manner that resembles the plan view of the ruins that could have been drawn if an excavation had taken place. This image revealed the urban plan of the latest period of the city's life. Trial excavations verified the geophysical result. Magnetic prospecting in the same area complemented the resistivity data. The exact locations of fire hearths, kilns and remnants of collapsed roofs were spotted. Magnetic gradient measurements were taken in an area outside the main complex of the ancient city and revealed the locations of several kilns. One of these locations was excavated and a pottery kiln was discovered. The resistivity prospecting in one of the graveyards of the ancient city showed anomalies which were expected and corresponded to monumental tombs. A few of these locations were excavated and large burial structures were revealed. Ground-probing radar profiles were measured over tombs that showed pronounced resistivity anomalies but have not yet been unearthed. The relatively high resolving ability of the method assisted the interpretation, in the sense that a few attributes were added. In the case presented, it was concluded that a particular tomb consists of two rooms and that it is roofless.

  2. A Strong-Lens Survey in AEGIS: the Influence of Large Scale Structure

    SciTech Connect

    Moustakas, Leonidas A.; Marshall, Phil J.; Newman, Jeffrey A.; Coil, Alison L.; Cooper, Michael C.; Davis, Marc; Fassnacht, Christopher D.; Guhathakurta, Puragra; Hopkins, Andrew; Koekemoer, Anton; Konidaris, Nicholas P.; Lotz, Jennifer M.; Willmer, Christopher N.A.

    2006-07-14

    We report on the results of a visual search for galaxy-scale strong gravitational lenses over 650 arcmin² of HST/ACS imaging in the Extended Groth Strip (EGS). These deep F606W- and F814W-band observations are in the DEEP2-EGS field. In addition to a previously known Einstein Cross also found by our search (the "Cross", HSTJ141735+52264, with z_lens = 0.8106 and a published z_source = 3.40), we identify two new strong galaxy-galaxy lenses with multiple extended arcs. The first, HSTJ141820+52361 (the "Dewdrop"; z_lens = 0.5798), lenses two distinct extended sources into two pairs of arcs (z_source = 0.9818 by nebular [O II] emission), while the second, HSTJ141833+52435 (the "Anchor"; z_lens = 0.4625), produces a single pair of arcs (source redshift not yet known). Four less convincing arc/counter-arc and two-image lens candidates are also found and presented for completeness. All three definite lenses are fit reasonably well by simple singular isothermal ellipsoid models including external shear, giving χ²_ν values close to unity. Using the three-dimensional line-of-sight (LOS) information on galaxies from the DEEP2 data, we calculate the convergence and shear contributions κ_los and γ_los to each lens, assuming singular isothermal sphere halos truncated at 200 h⁻¹ kpc. These are compared against a robust measure of local environment, δ_3, a normalized density that uses the distance to the third nearest neighbor. We find that even strong lenses in demonstrably underdense local environments may be considerably affected by LOS contributions, which in turn, under the adopted assumptions, may be underestimates of the effect of large-scale structure.

  3. GLOBAL CLIMATE AND LARGE-SCALE INFLUENCES ON AQUATIC ANIMAL HEALTH

    EPA Science Inventory

    The last 3 decades have witnessed numerous large-scale mortality events of aquatic organisms in North America. Affected species range from ecologically-important sea urchins to commercially-valuable American lobsters and protected marine mammals. Short-term forensic investigation...

  4. Testing LSST Dither Strategies for Survey Uniformity and Large-scale Structure Systematics

    NASA Astrophysics Data System (ADS)

    Awan, Humna; Gawiser, Eric; Kurczynski, Peter; Jones, R. Lynne; Zhan, Hu; Padilla, Nelson D.; Muñoz Arancibia, Alejandra M.; Orsi, Alvaro; Cora, Sofía A.; Yoachim, Peter

    2016-09-01

    The Large Synoptic Survey Telescope (LSST) will survey the southern sky from 2022-2032 with unprecedented detail. Since the observing strategy can lead to artifacts in the data, we investigate the effects of telescope-pointing offsets (called dithers) on the r-band coadded 5σ depth yielded after the 10-year survey. We analyze this survey depth for several geometric patterns of dithers (e.g., random, hexagonal lattice, spiral) with amplitudes as large as the radius of the LSST field of view, implemented on different timescales (per season, per night, per visit). Our results illustrate that per night and per visit dither assignments are more effective than per season assignments. Also, we find that some dither geometries (e.g., hexagonal lattice) are particularly sensitive to the timescale on which the dithers are implemented, while others like random dithers perform well on all timescales. We then model the propagation of depth variations to artificial fluctuations in galaxy counts, which are a systematic for LSS studies. We calculate the bias in galaxy counts caused by the observing strategy accounting for photometric calibration uncertainties, dust extinction, and magnitude cuts; uncertainties in this bias limit our ability to account for structure induced by the observing strategy. We find that after 10 years of the LSST survey, the best dither strategies lead to uncertainties in this bias that are smaller than the minimum statistical floor for a galaxy catalog as deep as r < 27.5. A few of these strategies bring the uncertainties close to the statistical floor for r < 25.7 after the first year of survey.
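    As a small illustration of one geometric pattern above, random dithers can be drawn uniformly from a disc whose radius matches the LSST field-of-view radius (~1.75 deg). The sampling scheme below is a generic sketch, not the paper's implementation:

```python
import math
import random

def random_dither(max_radius_deg=1.75):
    """Uniform random pointing offset within a disc of the given radius.
    The sqrt on the radial draw makes the points uniform in area, not
    clustered toward the centre."""
    r = max_radius_deg * math.sqrt(random.random())
    theta = 2 * math.pi * random.random()
    return r * math.cos(theta), r * math.sin(theta)

random.seed(42)
offsets = [random_dither() for _ in range(1000)]
# Every offset stays within the field-of-view radius.
assert all(math.hypot(dx, dy) <= 1.75 for dx, dy in offsets)
```

    Applying such an offset per visit or per night (rather than per season) corresponds to the timescales the paper finds most effective at smoothing the coadded depth.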

  5. Children's Attitudes about Nuclear War: Results of Large-Scale Surveys of Adolescents.

    ERIC Educational Resources Information Center

    Doctor, Ronald M.; And Others

    A three-section survey instrument was developed to provide descriptive and expressive information about teenagers' attitudes and fear reactions related to the nuclear threat. The first section consisted of one open-ended statement, "Write down your three greatest worries." The second section consisted of 20 areas of potential worry or…

  6. Nonparametric Bayesian Multiple Imputation for Incomplete Categorical Variables in Large-Scale Assessment Surveys

    ERIC Educational Resources Information Center

    Si, Yajuan; Reiter, Jerome P.

    2013-01-01

    In many surveys, the data comprise a large number of categorical variables that suffer from item nonresponse. Standard methods for multiple imputation, like log-linear models or sequential regression imputation, can fail to capture complex dependencies and can be difficult to implement effectively in high dimensions. We present a fully Bayesian,…

  7. Auxiliary Method Or Sophisticated Field Method? - Thoughts On The Use of Large Scale Magnetometer Surveying In Archaeology

    NASA Astrophysics Data System (ADS)

    Posselt, M.

    The contribution presents large-scale magnetometer surveys at early neolithic sites (Linearbandkeramik, 5500-4900 B.C.) in Germany. They serve to supply a prospect for the future use of magnetometer surveys in archaeological research. It is claimed that it ought to be a major goal to make geophysical survey, and magnetometer survey especially, an independent field method in archaeology, used as routinely as excavation or fieldwalking. Geophysical surveying, especially magnetometer survey, is used in archaeology for planning excavations of sites in detail, with the aim of saving time and costs. Furthermore, some sites are investigated by geophysical methods for research reasons when it is impossible or not necessary to excavate them. Though geophysics has given precious aid to archaeology in many cases, it still lacks the acknowledgement necessary to develop its complete potential as an archaeological field method. One of the basic reasons for this situation is the evaluation geophysics' results suffer from traditional archaeology. Occasionally geophysics and excavations produce seemingly inconsistent results. Such contradictions are then assigned to a lack of reliability of geophysical survey techniques. Wrongly, as this paper shows. The seeming contradiction of the results can be explained by a misunderstanding of the possibilities and restrictions of geophysics. Both field methods - excavation and survey - have their very own fundamentals and allow a restricted set of statements. Therefore a comparison of both methods needs to consider their differences in detail. The contribution is illustrated with the survey maps of neolithic longhouses, which have been detected in several recent projects in the western center of Germany. The maps of these houses produced by magnetometer survey show many of the fine structures the archaeologist knows from the excavation of comparable sites. For the first time postholes

  8. The Asthma Mobile Health Study, a large-scale clinical observational study using ResearchKit.

    PubMed

    Chan, Yu-Feng Yvonne; Wang, Pei; Rogers, Linda; Tignor, Nicole; Zweig, Micol; Hershman, Steven G; Genes, Nicholas; Scott, Erick R; Krock, Eric; Badgeley, Marcus; Edgar, Ron; Violante, Samantha; Wright, Rosalind; Powell, Charles A; Dudley, Joel T; Schadt, Eric E

    2017-04-01

    The feasibility of using mobile health applications to conduct observational clinical studies requires rigorous validation. Here, we report initial findings from the Asthma Mobile Health Study, a research study, including recruitment, consent, and enrollment, conducted entirely remotely by smartphone. We achieved secure bidirectional data flow between investigators and 7,593 participants from across the United States, including many with severe asthma. Our platform enabled prospective collection of longitudinal, multidimensional data (e.g., surveys, devices, geolocation, and air quality) in a subset of users over the 6-month study period. Consistent trending and correlation of interrelated variables support the quality of data obtained via this method. We detected increased reporting of asthma symptoms in regions affected by heat, pollen, and wildfires. Potential challenges with this technology include selection bias, low retention rates, reporting bias, and data security. These issues require attention to realize the full potential of mobile platforms in research and patient care.

  9. The DOHA algorithm: a new recipe for cotrending large-scale transiting exoplanet survey light curves

    NASA Astrophysics Data System (ADS)

    Mislis, D.; Pyrzas, S.; Alsubai, K. A.; Tsvetanov, Z. I.; Vilchez, N. P. E.

    2017-03-01

    We present DOHA, a new algorithm for cotrending photometric light curves obtained by transiting exoplanet surveys. The algorithm employs a novel approach to the traditional 'differential photometry' technique, by selecting the most suitable comparison star for each target light curve, using a two-step correlation search. Extensive tests on real data reveal that DOHA corrects both intra-night variations and long-term systematics affecting the data. Statistical studies conducted on a sample of ∼9500 light curves from the Qatar Exoplanet Survey reveal that DOHA-corrected light curves show an rms improvement of a factor of ∼2, compared to the raw light curves. In addition, we show that the transit detection probability in our sample can increase considerably, even up to a factor of 7, after applying DOHA.
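    The core of the approach described above, selecting the most suitable comparison light curve before differencing, can be sketched as follows. This single-step correlation pick is a simplification of DOHA's actual two-step search, and the data are synthetic:

```python
import numpy as np

def best_comparison(target, comps):
    """Index of the comparison light curve most correlated with the target
    (a simplified stand-in for DOHA's two-step correlation search)."""
    corrs = [abs(np.corrcoef(target, c)[0, 1]) for c in comps]
    return int(np.argmax(corrs))

rng = np.random.default_rng(0)
trend = np.sin(np.linspace(0, 4 * np.pi, 500))     # shared systematic trend
target = trend + 0.1 * rng.standard_normal(500)
comps = [trend + 0.1 * rng.standard_normal(500),   # shares the target's trend
         0.1 * rng.standard_normal(500)]           # pure noise
i = best_comparison(target, comps)
corrected = target - comps[i]   # differencing in magnitude space removes the trend
print(i, corrected.std() < target.std())
```

    Removing the best-correlated comparison suppresses both intra-night variations and long-term systematics shared by the two stars, which is the mechanism behind the ~2x rms improvement the authors report.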

  10. A composite large-scale CO survey at high galactic latitudes in the second quadrant

    NASA Technical Reports Server (NTRS)

    Heithausen, A.; Stacy, J. G.; De Vries, H. W.; Mebold, U.; Thaddeus, P.

    1993-01-01

    Surveys undertaken in the 2nd quadrant of the Galaxy with the CfA 1.2 m telescope have been combined to produce a map covering about 620 sq deg in the 2.6 mm CO(J = 1 - 0) line at high galactic latitudes. There is CO emission from molecular 'cirrus' clouds in about 13 percent of the region surveyed. The CO clouds are grouped together into three major cloud complexes with 29 individual members. All clouds are associated with infrared emission at 100 microns, although there is no one-to-one correlation between the corresponding intensities. CO emission is detected in all bright and dark Lynds' nebulae cataloged in that region; however, not all CO clouds are visible on optical photographs as reflection or absorption features. The clouds are probably local. At an adopted distance of 240 pc, cloud sizes range from 0.1 to 30 pc and cloud masses from 1 to 1600 solar masses. The molecular cirrus clouds contribute between 0.4 and 0.8 solar masses per square parsec to the surface density of molecular gas in the galactic plane. Only 26 percent of the 'infrared-excess clouds' in the area surveyed actually show CO, and about 2/3 of the clouds detected in CO do not show an infrared excess.

  11. A large-scale survey on sharp injuries among hospital-based healthcare workers in China

    PubMed Central

    Gao, Xiaodong; Hu, Bijie; Suo, Yao; Lu, Qun; Chen, Baiyi; Hou, Tieying; Qin, Jin’ai; Huang, Wenzhi; Zong, Zhiyong

    2017-01-01

    A multi-center survey on sharp injuries (SIs) among hospital-based healthcare workers (HCWs) in seven provinces of China between August and December 2011 was performed. In each province, HCWs from at least 30 hospitals were surveyed by completing a SI report form adapted from the EPINet. The HCWs who declared SIs during the period were interviewed by local infection control practitioners. The survey included 361 hospitals and 206,711 HCWs, most of whom were nurses (47.5%) or doctors (28.4%). In the previous month, 17,506 SI incidents were declared by 13,110 (6.3%) HCWs, corresponding to 1,032 incidents per 1,000 HCWs per year and 121.3 per 100 occupied beds per year. The majority of the SIs was caused by a hollow-bore needle (63.0%). The source patient was identified in 73.4% of all SIs but only 4.4% of all exposures involved a source patient who tested positive for HBV (3.3%), HCV (0.4%) or HIV (0.1%). Only 4.6% of SIs were reported to the infection control team in the hospitals. In conclusion, the rate of SI among HCWs is high in China and SI represents a severe but largely neglected problem. Awareness and safety climate should be promoted to protect the safety of HCWs in China. PMID:28205607

  12. A large-scale survey on sharp injuries among hospital-based healthcare workers in China.

    PubMed

    Gao, Xiaodong; Hu, Bijie; Suo, Yao; Lu, Qun; Chen, Baiyi; Hou, Tieying; Qin, Jin'ai; Huang, Wenzhi; Zong, Zhiyong

    2017-02-16

    A multi-center survey on sharp injuries (SIs) among hospital-based healthcare workers (HCWs) in seven provinces of China between August and December 2011 was performed. In each province, HCWs from at least 30 hospitals were surveyed by completing a SI report form adapted from the EPINet. The HCWs who declared SIs during the period were interviewed by local infection control practitioners. The survey included 361 hospitals and 206,711 HCWs, most of whom were nurses (47.5%) or doctors (28.4%). In the previous month, 17,506 SI incidents were declared by 13,110 (6.3%) HCWs, corresponding to 1,032 incidents per 1,000 HCWs per year and 121.3 per 100 occupied beds per year. The majority of the SIs was caused by a hollow-bore needle (63.0%). The source patient was identified in 73.4% of all SIs but only 4.4% of all exposures involved a source patient who tested positive for HBV (3.3%), HCV (0.4%) or HIV (0.1%). Only 4.6% of SIs were reported to the infection control team in the hospitals. In conclusion, the rate of SI among HCWs is high in China and SI represents a severe but largely neglected problem. Awareness and safety climate should be promoted to protect the safety of HCWs in China.

  13. Large Scale Variability Surveys from Venezuela: Orion OB1 and beyond

    NASA Astrophysics Data System (ADS)

    Briceño, C.; Calvet, N.; Vivas, A. K.; Hartmann, L.

    We present our scheme and initial results for variability surveys spanning hundreds of square degrees near the celestial equator, carried out with an 8k x 8k CCD Mosaic Camera optimized for drift-scanning, installed on the 1m Schmidt telescope at the Venezuela National Astronomical Observatory. In the Orion OB1 association, one of the nearest and most active regions of star formation, we are conducting a 120 sq. deg. VRI-Halpha survey to map the low-mass young stellar population in this region. The absence of dust and gas around the young stars in the ~10 Myr Ori OB 1a sub-association suggests that star formation is a rapid process, and that molecular clouds do not last more than a few million years after the first stars are born. The lack of accretion indicators or near-IR emission from inner dusty disks among stars in Ori OB 1a suggests that significant disk dissipation has occurred in a few Myr, possibly due to the coagulation/agglomeration of dust particles into larger bodies like planetesimals or planets. The results of our variability surveys will be made available through a massive database equipped with web-based data mining tools, as part of the effort leading to the International Virtual Observatory.

  14. Climate, Water, and Human Health: Large Scale Hydroclimatic Controls in Forecasting Cholera Epidemics

    NASA Astrophysics Data System (ADS)

    Akanda, A. S.; Jutla, A. S.; Islam, S.

    2009-12-01

    Despite ravaging the continents through seven global pandemics in past centuries, the seasonal and interannual variability of cholera outbreaks remains a mystery. Previous studies have focused on the role of various environmental and climatic factors, but provided little or no predictive capability. Recent findings suggest a more prominent role of large-scale hydroclimatic extremes - droughts and floods - and attempt to explain the seasonality and the unique dual cholera peaks in the Bengal Delta region of South Asia. We investigate the seasonal and interannual nature of cholera epidemiology in three geographically distinct locations within the region to identify the larger-scale hydroclimatic controls that can set the ecological and environmental ‘stage’ for outbreaks and have significant memory on a seasonal scale. Here we show that two distinctly different, pre- and post-monsoon, cholera transmission mechanisms related to large-scale climatic controls prevail in the region. An implication of our findings is that extreme climatic events such as prolonged droughts, record floods, and major cyclones may cause major disruption in the ecosystem and trigger large epidemics. We postulate that a quantitative understanding of the large-scale hydroclimatic controls and dominant processes with significant system memory will form the basis for forecasting such epidemic outbreaks. A multivariate regression method using these predictor variables to develop probabilistic forecasts of cholera outbreaks will be explored. Forecasts from such a system with a seasonal lead-time are likely to have a measurable impact on early cholera detection and prevention efforts in endemic regions.
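    The abstract leaves the multivariate method unspecified; one minimal way to turn hydroclimatic predictors into probabilistic outbreak forecasts is a logistic regression. The sketch below fits one by plain gradient descent on synthetic drought/flood indices; the predictors, coefficients, and data are all invented for illustration.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Logistic regression via batch gradient descent (no regularisation)."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

# Synthetic seasonal predictors (hypothetical drought and flood indices).
rng = np.random.default_rng(0)
n = 400
drought = rng.standard_normal(n)
flood = rng.standard_normal(n)
X = np.column_stack([np.ones(n), drought, flood])
logit = -1.0 + 1.5 * drought + 1.0 * flood          # assumed true model
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

w = fit_logistic(X, y)
p_dry = 1.0 / (1.0 + np.exp(-w @ [1.0, 2.0, 0.0]))  # severe-drought season
p_avg = 1.0 / (1.0 + np.exp(-w @ [1.0, 0.0, 0.0]))  # average season
print(p_dry > p_avg)
```

    The fitted model assigns a higher outbreak probability to the extreme-drought season, the qualitative behaviour the authors postulate for hydroclimatically driven epidemics.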

  15. A Spatio-Temporally Explicit Random Encounter Model for Large-Scale Population Surveys

    PubMed Central

    Jousimo, Jussi; Ovaskainen, Otso

    2016-01-01

    Random encounter models can be used to estimate population abundance from indirect data collected by non-invasive sampling methods, such as track counts or camera-trap data. The classical Formozov–Malyshev–Pereleshin (FMP) estimator converts track counts into an estimate of mean population density, assuming that data on the daily movement distances of the animals are available. We utilize generalized linear models with spatio-temporal error structures to extend the FMP estimator into a flexible Bayesian modelling approach that estimates not only total population size, but also spatio-temporal variation in population density. We also introduce a weighting scheme to estimate density on habitats that are not covered by survey transects, assuming that movement data on a subset of individuals is available. We test the performance of spatio-temporal and temporal approaches by a simulation study mimicking the Finnish winter track count survey. The results illustrate how the spatio-temporal modelling approach is able to borrow information from observations made on neighboring locations and times when estimating population density, and that spatio-temporal and temporal smoothing models can provide improved estimates of total population size compared to the FMP method. PMID:27611683
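    For reference, the classical FMP estimator converts a track count x over total transect length L into a density via D = (pi/2) * x / (L * d), where d is the mean daily movement distance. A minimal sketch with hypothetical numbers:

```python
import math

def fmp_density(crossings, transect_km, daily_move_km):
    """Formozov-Malyshev-Pereleshin track-count density estimate.

    D = (pi/2) * x / (L * d): x track crossings counted over total
    transect length L, with d the mean daily movement distance.
    """
    return (math.pi / 2.0) * crossings / (transect_km * daily_move_km)

# Hypothetical winter track count: 24 crossings on 40 km of transect,
# animals moving about 3 km per day.
print(fmp_density(24, 40.0, 3.0))  # animals per square km
```

    The Bayesian extension in the paper replaces this single ratio with a spatio-temporal model, but the same ingredients (counts, effort, movement) enter the likelihood.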

  16. Digital Archiving of People Flow by Recycling Large-Scale Social Survey Data of Developing Cities

    NASA Astrophysics Data System (ADS)

    Sekimoto, Y.; Watanabe, A.; Nakamura, T.; Horanont, T.

    2012-07-01

    Data on people flow have become increasingly important in the field of business, including the areas of marketing and public services. Although mobile phones enable a person's position to be located to a certain degree, it is a challenge to acquire sufficient data from people with mobile phones. In order to grasp people flow in its entirety, it is important to establish a practical method of reconstructing people flow from various kinds of existing fragmentary spatio-temporal data, such as social survey data. For example, although typical Person Trip Survey data collected by the public sector record only fragmentary spatio-temporal positions, the data are attractive because the sample size is large enough to estimate the entire flow of people. In this study, we apply our proposed basic method to Japan International Cooperation Agency (JICA) PT data pertaining to developing cities around the world, and we propose some correction methods to resolve the difficulties in applying it stably to many cities and to infrastructure data.

  17. Large Scale Structures in the Las Campanas Redshift Survey and in Simulations

    NASA Astrophysics Data System (ADS)

    Müller, V.; Doroshkevich, A. G.; Retzlaff, J.; Turchaninov, V.

    1999-06-01

    The large supercluster structures obvious in recent galaxy redshift surveys are quantified using a one-dimensional cluster analysis (core sampling) and a three-dimensional cluster analysis based on the minimal spanning tree. The comparison with the LCRS reveals promising, stable results. At a mean overdensity of about ten, the supercluster systems form huge wall-like structures comprising about 40% of all galaxies. The overdense clusters have a low mean transverse velocity dispersion of about 400 km/s, i.e. they look quite narrow in redshift space. We performed N-body simulations with large box sizes for six cosmological scenarios. The quantitative analysis shows that the observed structures can be understood best in low-density models with Ω_m <= 0.5, with or without a cosmological constant.
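    The minimal-spanning-tree cluster analysis can be illustrated in miniature: build the Euclidean MST of the galaxy positions, cut every edge longer than a linking length, and count the surviving components as "superclusters". This is a generic sketch on mock points, not the authors' code; the linking length and point sets are invented.

```python
import numpy as np

def mst_edges(points):
    """Prim's algorithm: return the N-1 edges of the Euclidean MST."""
    n = len(points)
    dist = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    in_tree = np.zeros(n, bool)
    in_tree[0] = True
    best = dist[0].copy()       # cheapest link from the tree to each node
    parent = np.zeros(n, int)
    edges = []
    for _ in range(n - 1):
        j = np.argmin(np.where(in_tree, np.inf, best))
        edges.append((parent[j], j, best[j]))
        in_tree[j] = True
        upd = dist[j] < best
        best[upd] = dist[j][upd]
        parent[upd] = j
    return edges

def clusters_from_mst(points, max_link):
    """Cut MST edges longer than max_link; count remaining components."""
    n = len(points)
    label = list(range(n))
    def find(i):
        while label[i] != i:
            label[i] = label[label[i]]
            i = label[i]
        return i
    for a, b, w in mst_edges(points):
        if w <= max_link:
            label[find(a)] = find(b)
    return len({find(i) for i in range(n)})

# Two well-separated mock "superclusters" of 30 points each.
rng = np.random.default_rng(1)
pts = np.vstack([rng.normal(0, 1, (30, 3)), rng.normal(20, 1, (30, 3))])
print(clusters_from_mst(pts, max_link=5.0))
```

    Varying the linking length sweeps the effective overdensity threshold, which is how the wall-like structures in the paper are isolated.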

  18. Large-scale survey to describe acne management in Brazilian clinical practice

    PubMed Central

    Seité, Sophie; Caixeta, Clarice; Towersey, Loan

    2015-01-01

    Background Acne is a chronic disease of the pilosebaceous unit that mainly affects adolescents. It is the most common dermatological problem, affecting approximately 80% of teenagers between 12 and 18 years of age. Diagnosis is clinical and is based on the patient’s age at the time the lesions first appear, and on its polymorphism, type of lesions, and their anatomical location. The right treatment for the right patient is key to treating acne safely. The aim of this investigational survey was to evaluate how Brazilian dermatologists in private practice currently manage acne. Materials and methods Dermatologists practicing in 12 states of Brazil were asked how they manage patients with grades I, II, III, and IV acne. Each dermatologist completed a written questionnaire about patient characteristics, acne severity, and the therapy they usually prescribe for each situation. Results In total, 596 dermatologists were interviewed. Adolescents were the most common acne population seen by dermatologists, and the most common acne grade was grade II. The doctors could choose more than one type of treatment for each patient, and treatment choices varied according to acne severity. A great majority of dermatologists considered treatment with drugs as the first alternative for all acne grades, choosing either topical or oral presentation depending on the pathology severity. Dermocosmetics were chosen mostly as adjunctive therapy, and their inclusion in the treatment regimen decreased as acne grades increased. Conclusion This survey illustrates that Brazilian dermatologists employ complex treatment regimens to manage acne, choosing systemic drugs, particularly isotretinoin, even in some cases of grade I acne, and prescribing antibiotics heavily. Because complex regimens are harder for patients to comply with, this result notably raises the question of adherence, which is a key factor in successful treatment. PMID:26609243

  19. Studying Displacement After a Disaster Using Large Scale Survey Methods: Sumatra After the 2004 Tsunami

    PubMed Central

    Gray, Clark; Frankenberg, Elizabeth; Gillespie, Thomas; Sumantri, Cecep; Thomas, Duncan

    2014-01-01

    Understanding of human vulnerability to environmental change has advanced in recent years, but measuring vulnerability and interpreting mobility across many sites differentially affected by change remains a significant challenge. Drawing on longitudinal data collected on the same respondents who were living in coastal areas of Indonesia before the 2004 Indian Ocean tsunami and were re-interviewed after the tsunami, this paper illustrates how the combination of population-based survey methods, satellite imagery and multivariate statistical analyses has the potential to provide new insights into vulnerability, mobility and impacts of major disasters on population well-being. The data are used to map and analyze vulnerability to post-tsunami displacement across the provinces of Aceh and North Sumatra and to compare patterns of migration after the tsunami between damaged areas and areas not directly affected by the tsunami. The comparison reveals that migration after a disaster is less selective overall than migration in other contexts. Gender and age, for example, are strong predictors of moving from undamaged areas but are not related to displacement in areas experiencing damage. In our analyses traditional predictors of vulnerability do not always operate in expected directions. Low levels of socioeconomic status and education were not predictive of moving after the tsunami, although for those who did move, they were predictive of displacement to a camp rather than a private home. This survey-based approach, though not without difficulties, is broadly applicable to many topics in human-environment research, and potentially opens the door to rigorous testing of new hypotheses in this literature. PMID:24839300

  20. SDSS-III Baryon Oscillation Spectroscopic Survey data release 12: Galaxy target selection and large-scale structure catalogues

    SciTech Connect

    Reid, Beth; Ho, Shirley; Padmanabhan, Nikhil; Percival, Will J.; Tinker, Jeremy; Tojeiro, Rita; White, Martin; Eisenstein, Daniel J.; Maraston, Claudia; Ross, Ashley J.; Sanchez, Ariel G.; Schlegel, David; Sheldon, Erin; Strauss, Michael A.; Thomas, Daniel; Wake, David; Beutler, Florian; Bizyaev, Dmitry; Bolton, Adam S.; Brownstein, Joel R.; Chuang, Chia-Hsun; Dawson, Kyle; Harding, Paul; Kitaura, Francisco-Shu; Leauthaud, Alexie; Masters, Karen; McBride, Cameron K.; More, Surhud; Olmstead, Matthew D.; Oravetz, Daniel; Nuza, Sebastian E.; Pan, Kaike; Parejko, John; Pforr, Janine; Prada, Francisco; Rodriguez-Torres, Sergio; Salazar-Albornoz, Salvador; Samushia, Lado; Schneider, Donald P.; Scoccola, Claudia G.; Simmons, Audrey; Vargas-Magana, Mariana

    2015-11-17

    The Baryon Oscillation Spectroscopic Survey (BOSS), part of the Sloan Digital Sky Survey (SDSS) III project, has provided the largest survey of galaxy redshifts available to date, in terms of both the number of galaxy redshifts measured by a single survey, and the effective cosmological volume covered. Key to analysing the clustering of these data to provide cosmological measurements is understanding the detailed properties of this sample. Potential issues include variations in the target catalogue caused by changes either in the targeting algorithm or properties of the data used, the pattern of spectroscopic observations, the spatial distribution of targets for which redshifts were not obtained, and variations in the target sky density due to observational systematics. We document here the target selection algorithms used to create the galaxy samples that comprise BOSS. We also present the algorithms used to create large-scale structure catalogues for the final Data Release (DR12) samples and the associated random catalogues that quantify the survey mask. The algorithms are an evolution of those used by the BOSS team to construct catalogues from earlier data, and have been designed to accurately quantify the galaxy sample. Furthermore, the code used, designated mksample, is released with this paper.
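    Random catalogues like those produced by mksample encode the survey mask by sampling points uniformly on the sphere within the observed footprint. A minimal sketch for a rectangular footprint (RA uniform, Dec uniform in sin Dec); a real mask would additionally reject points falling in veto regions, and all bounds below are illustrative:

```python
import numpy as np

def sky_randoms(n, ra_range=(0.0, 360.0), dec_range=(-10.0, 60.0), seed=0):
    """Points uniform on the celestial sphere within a rectangular footprint.

    RA is drawn uniformly; Dec is drawn uniformly in sin(Dec) so that the
    surface density is constant per unit solid angle.
    """
    rng = np.random.default_rng(seed)
    ra = rng.uniform(*ra_range, n)
    sin_lo, sin_hi = np.sin(np.radians(dec_range))
    dec = np.degrees(np.arcsin(rng.uniform(sin_lo, sin_hi, n)))
    return ra, dec

ra, dec = sky_randoms(100000)
print(ra.min() >= 0.0, dec.max() <= 60.0)
```

    Clustering estimators then compare pair counts in the galaxy sample against pair counts in such a random catalogue, so any angular systematics in the mask must be reproduced in the randoms.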

  1. SDSS-III Baryon Oscillation Spectroscopic Survey data release 12: Galaxy target selection and large-scale structure catalogues

    DOE PAGES

    Reid, Beth; Ho, Shirley; Padmanabhan, Nikhil; ...

    2015-11-17

    The Baryon Oscillation Spectroscopic Survey (BOSS), part of the Sloan Digital Sky Survey (SDSS) III project, has provided the largest survey of galaxy redshifts available to date, in terms of both the number of galaxy redshifts measured by a single survey, and the effective cosmological volume covered. Key to analysing the clustering of these data to provide cosmological measurements is understanding the detailed properties of this sample. Potential issues include variations in the target catalogue caused by changes either in the targeting algorithm or properties of the data used, the pattern of spectroscopic observations, the spatial distribution of targets for which redshifts were not obtained, and variations in the target sky density due to observational systematics. We document here the target selection algorithms used to create the galaxy samples that comprise BOSS. We also present the algorithms used to create large-scale structure catalogues for the final Data Release (DR12) samples and the associated random catalogues that quantify the survey mask. The algorithms are an evolution of those used by the BOSS team to construct catalogues from earlier data, and have been designed to accurately quantify the galaxy sample. Furthermore, the code used, designated mksample, is released with this paper.

  2. New ultracool subdwarfs identified in large-scale surveys using Virtual Observatory tools

    NASA Astrophysics Data System (ADS)

    Lodieu, N.; Espinoza Contreras, M.; Zapatero Osorio, M. R.; Solano, E.; Aberasturi, M.; Martín, E. L.; Rodrigo, C.

    2017-02-01

    Aims: We aim to develop an efficient method to search for late-type subdwarfs (metal-depleted dwarfs with spectral types ≥M5) to improve the current statistics. Our objectives are to improve our knowledge of metal-poor low-mass dwarfs, bridge the gap between the late-M and L types, determine their surface density, and understand the impact of metallicity on the stellar and substellar mass function. Methods: We carried out a search cross-matching the Sloan Digital Sky Survey (SDSS) Data Release 7 (DR7) and the Two Micron All Sky Survey (2MASS), and different releases of SDSS and the United Kingdom InfraRed Telescope (UKIRT) Infrared Deep Sky Survey (UKIDSS), using STILTS, Aladin, and TOPCAT, developed as part of the Virtual Observatory tools. We considered different photometric and proper motion criteria for our selection. We identified 29 and 71 late-type subdwarf candidates in each cross-correlation over 8826 and 3679 sq. deg, respectively (2312 sq. deg overlap). We obtained our own low-resolution optical spectra for 71 of our candidates: 26 were observed with the Gran Telescopio de Canarias (GTC; R 350, λλ5000-10 000 Å), six with the Nordic Optical Telescope (NOT; R 450, λλ5000-10 700 Å), and 39 with the Very Large Telescope (VLT; R 350, λλ6000-11 000 Å). We also retrieved spectra for 30 of our candidates from the SDSS spectroscopic database (R 2000 and λλ 3800-9400 Å); nine of these 30 candidates also have an independent spectrum from our follow-up. We classified 92 candidates based on 101 optical spectra using two methods: spectral indices and comparison with templates of known subdwarfs. Results: We developed an efficient photometric and proper motion search methodology to identify metal-poor M dwarfs. We confirmed 86% and 94% of the candidates as late-type subdwarfs from the SDSS vs. 2MASS and SDSS vs. UKIDSS cross-matches, respectively. These subdwarfs have spectral types ranging between M5 and L0.5 and SDSS magnitudes in the r = 19.4-23.3 mag range
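    The positional cross-match performed here with STILTS/TOPCAT reduces, in essence, to finding each source's nearest counterpart within a matching radius. A brute-force sketch on toy coordinates (real tools use indexed sky joins; the radius and catalogues below are invented):

```python
import numpy as np

def radec_to_xyz(ra_deg, dec_deg):
    """Unit vectors on the celestial sphere from RA/Dec in degrees."""
    ra, dec = np.radians(ra_deg), np.radians(dec_deg)
    return np.column_stack([np.cos(dec) * np.cos(ra),
                            np.cos(dec) * np.sin(ra),
                            np.sin(dec)])

def crossmatch(ra1, dec1, ra2, dec2, radius_arcsec):
    """Index of nearest catalogue-2 source within radius, else -1."""
    x1, x2 = radec_to_xyz(ra1, dec1), radec_to_xyz(ra2, dec2)
    cossep = np.clip(x1 @ x2.T, -1.0, 1.0)
    sep = np.degrees(np.arccos(cossep)) * 3600.0  # separations in arcsec
    nearest = sep.argmin(axis=1)
    dist = sep[np.arange(len(ra1)), nearest]
    return np.where(dist <= radius_arcsec, nearest, -1)

# Toy catalogues: second catalogue offset by 1 arcsec in RA.
ra1 = np.array([10.0, 20.0, 30.0])
dec1 = np.array([0.0, 5.0, -5.0])
off = 1.0 / 3600.0
m = crossmatch(ra1, dec1, ra1 + off, dec1, radius_arcsec=2.0)
print(m)  # → [0 1 2]
```

    In the paper, the proper-motion criteria are applied on top of such positional matches, since subdwarfs reveal themselves through large apparent motion between epochs.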

  3. Effects on aquatic and human health due to large scale bioenergy crop expansion.

    PubMed

    Love, Bradley J; Einheuser, Matthew D; Nejadhashemi, A Pouyan

    2011-08-01

    In this study, the environmental impacts of large-scale bioenergy crops were evaluated using the Soil and Water Assessment Tool (SWAT). Daily pesticide concentration data for a study area consisting of four large watersheds located in Michigan (totaling 53,358 km²) were estimated over a six-year period (2000-2005). Model outputs for atrazine, bromoxynil, glyphosate, metolachlor, pendimethalin, sethoxydim, trifluralin, and 2,4-D were used to predict the possible long-term implications that large-scale bioenergy crop expansion may have on the bluegill (Lepomis macrochirus) and humans. Threshold toxicity levels were obtained for the bluegill and for human consumption for all pesticides being evaluated through an extensive literature review. Model output was compared to each toxicity level for the suggested exposure time (96-hour for bluegill and 24-hour for humans). The results suggest that traditional intensive row crops such as canola, corn and sorghum may negatively impact aquatic life, and in most cases affect the safe drinking water availability. The continuous corn rotation, the most representative rotation for current agricultural practices for a starch-based ethanol economy, delivers the highest concentrations of glyphosate to the stream. In addition, continuous canola contributed to a concentration of 1.11 ppm of trifluralin, a highly toxic herbicide, which is 8.7 times the 96-hour ecotoxicity of bluegills and 21 times the safe drinking water level. Also during the period of study, continuous corn resulted in the impairment of 541,152 km of stream. However, there is promise with second-generation lignocellulosic bioenergy crops such as switchgrass, which resulted in a 171,667 km reduction in total stream length that exceeds the human threshold criteria, as compared to the base scenario. Results of this study may be useful in determining the suitability of bioenergy crop rotations and aid in decision making regarding the adaptation of large-scale

  4. [No relationship between blood type and personality: evidence from large-scale surveys in Japan and the US].

    PubMed

    Nawata, Kengo

    2014-06-01

    Despite the widespread popular belief in Japan about a relationship between personality and ABO blood type, this association has not been empirically substantiated. This study provides more robust evidence that there is no relationship between blood type and personality, through a secondary analysis of large-scale survey data. Recent data (after 2000) were collected using large-scale random sampling from over 10,000 people in total from both Japan and the US. Effect sizes were calculated. Japanese datasets from 2004 (N = 2,878-2,938) and 2005 (N = 3,618-3,692), as well as one dataset from the US in 2004 (N = 3,037-3,092), were used. In all the datasets, 65 of 68 items yielded non-significant differences between blood groups. Effect sizes (eta2) were less than .003. This means that blood type explained less than 0.3% of the total variance in personality. These results show the non-relevance of blood type for personality.
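    The effect size reported here, eta-squared, is the between-group share of the total variance. The sketch below computes it on simulated personality-item scores for four blood-type groups with identical true means (the scores are synthetic, not the survey data), illustrating how small eta2 stays when group means coincide:

```python
import numpy as np

def eta_squared(groups):
    """eta^2 = between-group sum of squares / total sum of squares."""
    allv = np.concatenate(groups)
    grand = allv.mean()
    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    ss_total = ((allv - grand) ** 2).sum()
    return ss_between / ss_total

# Simulated scores for four blood-type groups with identical true means.
rng = np.random.default_rng(7)
groups = [rng.normal(3.0, 1.0, 800) for _ in range(4)]
e = eta_squared(groups)
print(round(e, 5))
```

    With no true group differences, the expected eta2 is roughly (k-1)/(N-1), here under 0.001, consistent with the sub-0.3% variance shares the study reports.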

  5. Large-scale distribution of surface ozone mixing ratio in southern Mongolia: A survey

    NASA Astrophysics Data System (ADS)

    Meixner, F. X.; Behrendt, T.; Ermel, M.; Hempelmann, N.; Andreae, M. O.; Jöckel, P.

    2012-04-01

    For the first time, measurements of surface ozone mixing ratio have been performed from semi-arid steppe to the arid/hyper-arid southern Mongolian Gobi desert. During 12-29 August 2009, ozone mixing ratio was continuously measured from a mobile platform (4x4 Furgon SUV). The survey (3060 km / 229,171 km²) started at the Mongolian capital Ulaan-Baatar (47.9582° N, 107.0190° E), heading south-west (Echin Gol, 43.2586° N, 99.0255° E), then eastward to Dalanzadgad (43.6061° N, 104.4445° E), and finally back to Ulaan-Baatar. Ambient air was sampled (approx. 1 l/min) through a 4 m long PTFE intake line along a forward-facing boom mounted on the roof of the SUV. Ozone mixing ratio was measured by UV spectroscopy using a mobile dual-cell ozone analyzer (model 205, 2B Technologies, Boulder, U.S.A.). While ozone signals were measured every 5 seconds, 1 minute averages and standard deviations were calculated on-line and stored in the data logger. The latter are used to identify and discriminate against unrealistically low or high ozone mixing ratios caused by occasionally passing plumes of vehicle exhaust and/or biomass burning gases, as well as gasoline (at filling stations). Even under desert conditions, the temporal behaviour of ozone mixing ratio was characterized by considerable and regular diel variations. Minimum mixing ratios (15-25 ppb) occurred early in the morning (approx. 06:00 local), when surface depletion of ozone (by dry deposition) cannot be compensated by supply from the free troposphere due to the thermodynamic stability of the nocturnal boundary layer. Late in the afternoon (approx. 17:00 local), under conditions of a turbulently well-mixed convective boundary layer, maximum ozone mixing ratios (45-55 ppb) were reached. Daily amplitudes of the diel cycle of ozone mixing ratio ranged from about 30 ppb (steppe) and 20 ppb (arid desert) down to approx. 5 ppb (hyper-arid Gobi desert (Shargyn Gobi)). Ozone surface measurements were

  6. Examining Agencies' Satisfaction with Electronic Record Management Systems in e-Government: A Large-Scale Survey Study

    NASA Astrophysics Data System (ADS)

    Hsu, Fang-Ming; Hu, Paul Jen-Hwa; Chen, Hsinchun; Hu, Han-Fen

    While e-government is advancing and maturing steadily, advanced technological capabilities alone cannot guarantee agencies’ realizing the full benefits of the enabling computer-based systems. This study analyzes information systems in e-government settings by examining agencies’ satisfaction with an electronic record management system (ERMS). Specifically, we investigate key satisfaction determinants that include regulatory compliance, job relevance, and satisfaction with support services for using the ERMS. We test our model and the hypotheses in it, using a large-scale survey that involves a total of 1,652 government agencies in Taiwan. Our results show significant effects of regulatory compliance on job relevance and satisfaction with support services, which in turn determine government agencies’ satisfaction with an ERMS. Our data exhibit a reasonably good fit to our model, which can explain a significant portion of the variance in agencies’ satisfaction with an ERMS. Our findings have several important implications to research and practice, which are also discussed.

  7. Implementation of a large-scale hospital information infrastructure for multi-unit health-care services.

    PubMed

    Yoo, Sun K; Kim, Dong Keun; Kim, Jung C; Park, Youn Jung; Chang, Byung Chul

    2008-01-01

    With the increase in demand for high-quality medical services, an innovative hospital information system has become essential. An improved system has been implemented in all hospital units of the Yonsei University Health System. Interoperability between multiple units required appropriate hardware infrastructure and software architecture. This large-scale hospital information system encompassed PACS (Picture Archiving and Communications Systems), EMR (Electronic Medical Records) and ERP (Enterprise Resource Planning). It involved two tertiary hospitals and 50 community hospitals. The monthly data production rate of the integrated hospital information system is about 1.8 TByte, and the total quantity of data produced so far is about 60 TByte. Large-scale information exchange and sharing will be particularly useful for telemedicine applications.

  8. The SRG/eROSITA All-Sky Survey: A new era of large-scale structure studies with AGN

    NASA Astrophysics Data System (ADS)

    Kolodzig, Alexander; Gilfanov, Marat; Hütsi, Gert; Sunyaev, Rashid

    2015-08-01

    The four-year X-ray All-Sky Survey (eRASS) of the eROSITA telescope aboard the Spektrum-Roentgen-Gamma (SRG) satellite will detect about 3 million active galactic nuclei (AGN) with a median redshift of z~1 and a typical luminosity of L(0.5-2.0 keV) ~ 10^44 erg/s. We demonstrate that this unprecedented AGN sample, complemented with redshift information, will supply us with outstanding opportunities for large-scale structure (LSS) studies. We show that with this sample of X-ray selected AGN, it will become possible for the first time to perform detailed redshift- and luminosity-resolved studies of AGN clustering. This enables us to put strong constraints on different AGN triggering/fueling models as a function of AGN environment, which will dramatically improve our understanding of super-massive black hole growth and its correlation with the co-evolving LSS. Further, the eRASS AGN sample will become a powerful cosmological probe. We demonstrate for the first time that, given the breadth and depth of eRASS, it will become possible to convincingly detect baryonic acoustic oscillations (BAOs) with ~8σ confidence in the 0.8 < z < 2.0 range, currently not covered by any existing BAO survey. Finally, we discuss the requirements for follow-up missions and demonstrate that, in order to fully exploit the potential of the eRASS AGN sample, photometric and spectroscopic surveys of large areas and sufficient depth will be needed.

  9. Awareness and Concern about Large-Scale Livestock and Poultry: Results from a Statewide Survey of Ohioans

    ERIC Educational Resources Information Center

    Sharp, Jeff; Tucker, Mark

    2005-01-01

    The development of large-scale livestock facilities has become a controversial issue in many regions of the U.S. in recent years. In this research, rural-urban differences in familiarity and concern about large-scale livestock facilities among Ohioans is examined as well as the relationship of social distance from agriculture and trust in risk…

  10. Pre- and Postnatal Influences on Preschool Mental Health: A Large-Scale Cohort Study

    ERIC Educational Resources Information Center

    Robinson, Monique; Oddy, Wendy H.; Li, Jianghong; Kendall, Garth E.; de Klerk, Nicholas H.; Silburn, Sven R.; Zubrick, Stephen R.; Newnham, John P.; Stanley, Fiona J.; Mattes, Eugen

    2008-01-01

    Background: Methodological challenges such as confounding have made the study of the early determinants of mental health morbidity problematic. This study aims to address these challenges in investigating antenatal, perinatal and postnatal risk factors for the development of mental health problems in pre-school children in a cohort of Western…

  11. Cosmology from large-scale galaxy clustering and galaxy–galaxy lensing with Dark Energy Survey Science Verification data

    SciTech Connect

    Kwan, J.; Sánchez, C.; Clampitt, J.; Blazek, J.; Crocce, M.; Jain, B.; Zuntz, J.; Amara, A.; Becker, M. R.; Bernstein, G. M.; Bonnett, C.; DeRose, J.; Dodelson, S.; Eifler, T. F.; Gaztanaga, E.; Giannantonio, T.; Gruen, D.; Hartley, W. G.; Kacprzak, T.; Kirk, D.; Krause, E.; MacCrann, N.; Miquel, R.; Park, Y.; Ross, A. J.; Rozo, E.; Rykoff, E. S.; Sheldon, E.; Troxel, M. A.; Wechsler, R. H.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Benoit-Lévy, A.; Brooks, D.; Burke, D. L.; Rosell, A. Carnero; Carrasco Kind, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; Desai, S.; Diehl, H. T.; Dietrich, J. P.; Doel, P.; Evrard, A. E.; Fernandez, E.; Finley, D. A.; Flaugher, B.; Fosalba, P.; Frieman, J.; Gerdes, D. W.; Gruendl, R. A.; Gutierrez, G.; Honscheid, K.; James, D. J.; Jarvis, M.; Kuehn, K.; Lahav, O.; Lima, M.; Maia, M. A. G.; Marshall, J. L.; Martini, P.; Melchior, P.; Mohr, J. J.; Nichol, R. C.; Nord, B.; Plazas, A. A.; Reil, K.; Romer, A. K.; Roodman, A.; Sanchez, E.; Scarpine, V.; Sevilla-Noarbe, I.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Vikram, V.; Walker, A. R.

    2016-10-05

    We present cosmological constraints from the Dark Energy Survey (DES) using a combined analysis of angular clustering of red galaxies and their cross-correlation with weak gravitational lensing of background galaxies. We use a 139 square degree contiguous patch of DES data from the Science Verification (SV) period of observations. Using large scale measurements, we constrain the matter density of the Universe as $\\Omega_m = 0.31 \\pm 0.09$ and the clustering amplitude of the matter power spectrum as $\\sigma_8 = 0.74 \\pm 0.13$ after marginalizing over seven nuisance parameters and three additional cosmological parameters. This translates into $S_8 \\equiv \\sigma_8(\\Omega_m/0.3)^{0.16} = 0.74 \\pm 0.12$ for our fiducial lens redshift bin at 0.35 < z < 0.5, while $S_8 = 0.78 \\pm 0.09$ using two bins over the range 0.2 < z < 0.5. We study the robustness of the results under changes in the data vectors, modelling and systematics treatment, including photometric redshift and shear calibration uncertainties, and find consistency in the derived cosmological parameters. We show that our results are consistent with previous cosmological analyses from DES and other data sets and conclude with a joint analysis of DES angular clustering and galaxy-galaxy lensing with Planck CMB data, Baryon Acoustic Oscillations and Type Ia supernova measurements.

  12. Cosmology from large scale galaxy clustering and galaxy-galaxy lensing with Dark Energy Survey Science Verification data

    DOE PAGES

    Kwan, J.

    2016-10-05

    Here, we present cosmological constraints from the Dark Energy Survey (DES) using a combined analysis of angular clustering of red galaxies and their cross-correlation with weak gravitational lensing of background galaxies. We use a 139 square degree contiguous patch of DES data from the Science Verification (SV) period of observations. Using large scale measurements, we constrain the matter density of the Universe as Ωm = 0.31 ± 0.09 and the clustering amplitude of the matter power spectrum as σ8 = 0.74 ± 0.13 after marginalizing over seven nuisance parameters and three additional cosmological parameters. This translates into S8 ≡ σ8(Ωm/0.3)^0.16 = 0.74 ± 0.12 for our fiducial lens redshift bin at 0.35 < z < 0.5, while S8 = 0.78 ± 0.09 using two bins over the range 0.2 < z < 0.5. We study the robustness of the results under changes in the data vectors, modelling and systematics treatment, including photometric redshift and shear calibration uncertainties, and find consistency in the derived cosmological parameters. We show that our results are consistent with previous cosmological analyses from DES and other data sets and conclude with a joint analysis of DES angular clustering and galaxy-galaxy lensing with Planck CMB data, Baryon Acoustic Oscillations and Type Ia supernova measurements.

  13. Cosmology from large-scale galaxy clustering and galaxy-galaxy lensing with Dark Energy Survey Science Verification data

    NASA Astrophysics Data System (ADS)

    Kwan, J.; Sánchez, C.; Clampitt, J.; Blazek, J.; Crocce, M.; Jain, B.; Zuntz, J.; Amara, A.; Becker, M. R.; Bernstein, G. M.; Bonnett, C.; DeRose, J.; Dodelson, S.; Eifler, T. F.; Gaztanaga, E.; Giannantonio, T.; Gruen, D.; Hartley, W. G.; Kacprzak, T.; Kirk, D.; Krause, E.; MacCrann, N.; Miquel, R.; Park, Y.; Ross, A. J.; Rozo, E.; Rykoff, E. S.; Sheldon, E.; Troxel, M. A.; Wechsler, R. H.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Benoit-Lévy, A.; Brooks, D.; Burke, D. L.; Rosell, A. Carnero; Carrasco Kind, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; Desai, S.; Diehl, H. T.; Dietrich, J. P.; Doel, P.; Evrard, A. E.; Fernandez, E.; Finley, D. A.; Flaugher, B.; Fosalba, P.; Frieman, J.; Gerdes, D. W.; Gruendl, R. A.; Gutierrez, G.; Honscheid, K.; James, D. J.; Jarvis, M.; Kuehn, K.; Lahav, O.; Lima, M.; Maia, M. A. G.; Marshall, J. L.; Martini, P.; Melchior, P.; Mohr, J. J.; Nichol, R. C.; Nord, B.; Plazas, A. A.; Reil, K.; Romer, A. K.; Roodman, A.; Sanchez, E.; Scarpine, V.; Sevilla-Noarbe, I.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Vikram, V.; Walker, A. R.; DES Collaboration

    2017-02-01

    We present cosmological constraints from the Dark Energy Survey (DES) using a combined analysis of angular clustering of red galaxies and their cross-correlation with weak gravitational lensing of background galaxies. We use a 139 deg2 contiguous patch of DES data from the Science Verification (SV) period of observations. Using large-scale measurements, we constrain the matter density of the Universe as Ωm = 0.31 ± 0.09 and the clustering amplitude of the matter power spectrum as σ8 = 0.74 ± 0.13 after marginalizing over seven nuisance parameters and three additional cosmological parameters. This translates into S8 ≡ σ8(Ωm/0.3)^0.16 = 0.74 ± 0.12 for our fiducial lens redshift bin at 0.35 < z < 0.5, while S8 = 0.78 ± 0.09 using two bins over the range 0.2 < z < 0.5. We study the robustness of the results under changes in the data vectors, modelling and systematics treatment, including photometric redshift and shear calibration uncertainties, and find consistency in the derived cosmological parameters. We show that our results are consistent with previous cosmological analyses from DES and other data sets and conclude with a joint analysis of DES angular clustering and galaxy-galaxy lensing with Planck Cosmic Microwave Background data, baryon acoustic oscillations and Type Ia supernova measurements.
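    The quoted S8 figure is a deterministic function of the other two fitted parameters. A quick check of the fiducial central values, as a minimal sketch (the exponent 0.16 is the one quoted in the abstract):

    ```python
    # S8 parameterizes the lensing/clustering amplitude degeneracy:
    # S8 = sigma_8 * (Omega_m / 0.3)**alpha, with alpha = 0.16 in this analysis.
    def s8(sigma8, omega_m, alpha=0.16):
        return sigma8 * (omega_m / 0.3) ** alpha

    # Central values from the fiducial DES SV fit quoted above.
    print(round(s8(0.74, 0.31), 2))  # → 0.74, matching the quoted S8
    ```

    Because (0.31/0.3)^0.16 is within 1% of unity, S8 and σ8 coincide at two decimal places for this fit, which is why the abstract quotes the same central value for both.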

  14. Health Benefits from Large-Scale Ozone Reduction in the United States

    PubMed Central

    Berman, Jesse D.; Fann, Neal; Hollingsworth, John W.; Pinkerton, Kent E.; Rom, William N.; Szema, Anthony M.; Breysse, Patrick N.; White, Ronald H.

    2012-01-01

    Background: Exposure to ozone has been associated with adverse health effects, including premature mortality and cardiopulmonary and respiratory morbidity. In 2008, the U.S. Environmental Protection Agency (EPA) lowered the primary (health-based) National Ambient Air Quality Standard (NAAQS) for ozone to 75 ppb, expressed as the fourth-highest daily maximum 8-hr average over a 24-hr period. Based on recent monitoring data, U.S. ozone levels still exceed this standard in numerous locations, resulting in avoidable adverse health consequences. Objectives: We sought to quantify the potential human health benefits from achieving the current primary NAAQS standard of 75 ppb and two alternative standard levels, 70 and 60 ppb, which represent the range recommended by the U.S. EPA Clean Air Scientific Advisory Committee (CASAC). Methods: We applied health impact assessment methodology to estimate numbers of deaths and other adverse health outcomes that would have been avoided during 2005, 2006, and 2007 if the current (or lower) NAAQS ozone standards had been met. Estimated reductions in ozone concentrations were interpolated according to geographic area and year, and concentration–response functions were obtained or derived from the epidemiological literature. Results: We estimated that annual numbers of avoided ozone-related premature deaths would have ranged from 1,410 to 2,480 at 75 ppb to 2,450 to 4,130 at 70 ppb, and 5,210 to 7,990 at 60 ppb. Acute respiratory symptoms would have been reduced by 3 million cases and school-loss days by 1 million cases annually if the current 75-ppb standard had been attained. Substantially greater health benefits would have resulted if the CASAC-recommended range of standards (70–60 ppb) had been met. Conclusions: Attaining a more stringent primary ozone standard would significantly reduce ozone-related premature mortality and morbidity. PMID:22809899
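    The avoided-death estimates above come from health impact assessment arithmetic: a concentration-response coefficient from the epidemiological literature applied to baseline mortality and the exposed population. A minimal sketch of the standard log-linear form with purely illustrative inputs (the coefficient, baseline rate, and population below are assumptions, not the study's values):

    ```python
    import math

    def avoided_deaths(beta, delta_ozone_ppb, baseline_rate, population):
        """Avoided premature deaths for an ozone reduction of delta_ozone_ppb,
        using the common log-linear concentration-response form
        dy = y0 * (1 - exp(-beta * dx)) applied to the exposed population."""
        return baseline_rate * (1.0 - math.exp(-beta * delta_ozone_ppb)) * population

    # Illustrative numbers only (NOT the study's inputs): beta per ppb of ozone,
    # an annual all-cause baseline mortality rate, and an exposed population.
    print(round(avoided_deaths(beta=0.0005, delta_ozone_ppb=5.0,
                               baseline_rate=0.008, population=1_000_000)))  # → 20
    ```

    Scaling such stratum-level estimates over monitors, years, and counties is what produces ranges like the 1,410 to 2,480 avoided deaths reported for the 75-ppb standard.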

  15. Automation of Survey Data Processing, Documentation and Dissemination: An Application to Large-Scale Self-Reported Educational Survey.

    ERIC Educational Resources Information Center

    Shim, Eunjae; Shim, Minsuk K.; Felner, Robert D.

    Automation of the survey process has proved successful in many industries, yet it is still underused in educational research. This is largely due to two facts: (1) number crunching is usually carried out using software developed before modern information technology existed, and (2) educational research is to a great extent trapped…

  16. Monitoring and Evaluating the Transition of Large-Scale Programs in Global Health

    PubMed Central

    Bao, James; Rodriguez, Daniela C; Paina, Ligia; Ozawa, Sachiko; Bennett, Sara

    2015-01-01

    Purpose: Donors are increasingly interested in the transition and sustainability of global health programs as priorities shift and external funding declines. Systematic and high-quality monitoring and evaluation (M&E) of such processes is rare. We propose a framework and related guiding questions to systematize the M&E of global health program transitions. Methods: We conducted stakeholder interviews, searched the peer-reviewed and gray literature, gathered feedback from key informants, and reflected on author experiences to build a framework on M&E of transition and to develop guiding questions. Findings: The conceptual framework models transition as a process spanning pre-transition and transition itself and extending into sustained services and outcomes. Key transition domains include leadership, financing, programming, and service delivery, and relevant activities that drive the transition in these domains forward include sustaining a supportive policy environment, creating financial sustainability, developing local stakeholder capacity, communicating to all stakeholders, and aligning programs. Ideally transition monitoring would begin prior to transition processes being implemented and continue for some time after transition has been completed. As no set of indicators will be applicable across all types of health program transitions, we instead propose guiding questions and illustrative quantitative and qualitative indicators to be considered and adapted based on the transition domains identified as most important to the particular health program transition. The M&E of transition faces new and unique challenges, requiring measuring constructs to which evaluators may not be accustomed. Many domains hinge on measuring “intangibles” such as the management of relationships. Monitoring these constructs may require a compromise between rigorous data collection and the involvement of key stakeholders. Conclusion: Monitoring and evaluating transitions in global

  17. A literature review for large-scale health information system project planning, implementation and evaluation.

    PubMed

    Sligo, Judith; Gauld, Robin; Roberts, Vaughan; Villa, Luis

    2017-01-01

    Information technology is perceived as a potential panacea for healthcare organisations to manage pressure to improve services in the face of increased demand. However, the implementation and evaluation of health information systems (HIS) is plagued with problems, and implementation shortcomings and failures are rife. HIS implementation is complex and relies on organisational, structural, technological, and human factors to be successful. It also requires reflective, nuanced, multidimensional evaluation to provide ongoing feedback to ensure success. This article provides a comprehensive review of the literature about evaluating and implementing HIS, detailing the challenges and recommendations for both evaluators and healthcare organisations. The factors that inhibit or promote successful HIS implementation are identified, and effective evaluation strategies are described with the goal of informing teams evaluating complex HIS.

  18. LARGE-SCALE STAR-FORMATION-DRIVEN OUTFLOWS AT 1 < z < 2 IN THE 3D-HST SURVEY

    SciTech Connect

    Lundgren, Britt F.; Van Dokkum, Pieter; Bezanson, Rachel; Momcheva, Ivelina; Nelson, Erica; Skelton, Rosalind E.; Wake, David; Whitaker, Katherine; Brammer, Gabriel; Franx, Marijn; Fumagalli, Mattia; Labbe, Ivo; Patel, Shannon; Da Cunha, Elizabete; Rix, Hans Walter; Schmidt, Kasper; Erb, Dawn K.; Fan Xiaohui; Kriek, Mariska; Marchesini, Danilo; and others

    2012-11-20

    We present evidence of large-scale outflows from three low-mass (log(M*/M⊙) ≈ 9.75) star-forming (SFR > 4 M⊙ yr⁻¹) galaxies observed at z = 1.24, z = 1.35, and z = 1.75 in the 3D-HST Survey. Each of these galaxies is located within a projected physical distance of 60 kpc around the sight line to the quasar SDSS J123622.93+621526.6, which exhibits well-separated strong (W_r(λ2796) ≳ 0.8 Å) Mg II absorption systems matching precisely the redshifts of the three galaxies. We derive the star formation surface densities from the Hα emission in the WFC3 G141 grism observations for the galaxies and find that in each case the star formation surface density well exceeds 0.1 M⊙ yr⁻¹ kpc⁻², the typical threshold for starburst galaxies in the local universe. From a small but complete parallel census of the 0.65 < z < 2.6 galaxies with H_140 ≲ 24 proximate to the quasar sight line, we detect Mg II absorption associated with galaxies extending to physical distances of 130 kpc. We determine that the W_r > 0.8 Å Mg II covering fraction of star-forming galaxies at 1 < z < 2 may be as large as unity on scales extending to at least 60 kpc, providing early constraints on the typical extent of starburst-driven winds around galaxies at this redshift. Our observations additionally suggest that the azimuthal distribution of W_r > 0.4 Å Mg II absorbing gas around star-forming galaxies may evolve from z ≈ 2 to the present, consistent with recent observations of an increasing collimation of star-formation-driven outflows with time from z ≈ 3.

  19. Cosmology from large scale galaxy clustering and galaxy-galaxy lensing with Dark Energy Survey Science Verification data

    SciTech Connect

    Kwan, J.

    2016-10-05

    Here, we present cosmological constraints from the Dark Energy Survey (DES) using a combined analysis of angular clustering of red galaxies and their cross-correlation with weak gravitational lensing of background galaxies. We use a 139 square degree contiguous patch of DES data from the Science Verification (SV) period of observations. Using large scale measurements, we constrain the matter density of the Universe as Ωm = 0.31 ± 0.09 and the clustering amplitude of the matter power spectrum as σ8 = 0.74 ± 0.13 after marginalizing over seven nuisance parameters and three additional cosmological parameters. This translates into S8 ≡ σ8(Ωm/0.3)^0.16 = 0.74 ± 0.12 for our fiducial lens redshift bin at 0.35 < z < 0.5, while S8 = 0.78 ± 0.09 using two bins over the range 0.2 < z < 0.5. We study the robustness of the results under changes in the data vectors, modelling and systematics treatment, including photometric redshift and shear calibration uncertainties, and find consistency in the derived cosmological parameters. We show that our results are consistent with previous cosmological analyses from DES and other data sets and conclude with a joint analysis of DES angular clustering and galaxy-galaxy lensing with Planck CMB data, Baryon Acoustic Oscillations and Type Ia supernova measurements.

  20. Perspectives on clinical informatics: integrating large-scale clinical, genomic, and health information for clinical care.

    PubMed

    Choi, In Young; Kim, Tae-Min; Kim, Myung Shin; Mun, Seong K; Chung, Yeun-Jun

    2013-12-01

    The advances in electronic medical records (EMRs) and bioinformatics (BI) represent two significant trends in healthcare. The widespread adoption of EMR systems and the completion of the Human Genome Project developed the technologies for data acquisition, analysis, and visualization in two different domains. The massive amount of data from both clinical and biology domains is expected to provide personalized, preventive, and predictive healthcare services in the near future. The integrated use of EMR and BI data needs to consider four key informatics areas: data modeling, analytics, standardization, and privacy. Bioclinical data warehouses integrating heterogeneous patient-related clinical or omics data should be considered. The representative standardization effort by the Clinical Bioinformatics Ontology (CBO) aims to provide uniquely identified concepts to include molecular pathology terminologies. Since individual genome data are easily used to predict current and future health status, different safeguards to ensure confidentiality should be considered. In this paper, we focused on the informatics aspects of integrating the EMR community and BI community by identifying opportunities, challenges, and approaches to provide the best possible care service for our patients and the population.

  1. Perspectives on Clinical Informatics: Integrating Large-Scale Clinical, Genomic, and Health Information for Clinical Care

    PubMed Central

    Choi, In Young; Kim, Tae-Min; Kim, Myung Shin; Mun, Seong K.

    2013-01-01

    The advances in electronic medical records (EMRs) and bioinformatics (BI) represent two significant trends in healthcare. The widespread adoption of EMR systems and the completion of the Human Genome Project developed the technologies for data acquisition, analysis, and visualization in two different domains. The massive amount of data from both clinical and biology domains is expected to provide personalized, preventive, and predictive healthcare services in the near future. The integrated use of EMR and BI data needs to consider four key informatics areas: data modeling, analytics, standardization, and privacy. Bioclinical data warehouses integrating heterogeneous patient-related clinical or omics data should be considered. The representative standardization effort by the Clinical Bioinformatics Ontology (CBO) aims to provide uniquely identified concepts to include molecular pathology terminologies. Since individual genome data are easily used to predict current and future health status, different safeguards to ensure confidentiality should be considered. In this paper, we focused on the informatics aspects of integrating the EMR community and BI community by identifying opportunities, challenges, and approaches to provide the best possible care service for our patients and the population. PMID:24465229

  2. Large-Scale Disasters

    NASA Astrophysics Data System (ADS)

    Gad-El-Hak, Mohamed

    "Extreme" events - including climatic events, such as hurricanes, tornadoes, and drought - can cause massive disruption to society, including large death tolls and property damage in the billions of dollars. Events in recent years have shown the importance of being prepared and that countries need to work together to help alleviate the resulting pain and suffering. This volume presents a review of the broad research field of large-scale disasters. It establishes a common framework for predicting, controlling and managing both manmade and natural disasters. There is a particular focus on events caused by weather and climate change. Other topics include air pollution, tsunamis, disaster modeling, the use of remote sensing and the logistics of disaster management. It will appeal to scientists, engineers, first responders and health-care professionals, in addition to graduate students and researchers who have an interest in the prediction, prevention or mitigation of large-scale disasters.

  3. Practical experience from the Office of Adolescent Health's large scale implementation of an evidence-based Teen Pregnancy Prevention Program.

    PubMed

    Margolis, Amy Lynn; Roper, Allison Yvonne

    2014-03-01

    After 3 years of experience overseeing the implementation and evaluation of evidence-based teen pregnancy prevention programs in a diversity of populations and settings across the country, the Office of Adolescent Health (OAH) has learned numerous lessons through practical application and new experiences. These lessons and experiences are applicable to those working to implement evidence-based programs on a large scale. The lessons described in this paper focus on what it means for a program to be implementation ready, the role of the program developer in replicating evidence-based programs, the importance of a planning period to ensure quality implementation, the need to define and measure fidelity, and the conditions necessary to support rigorous grantee-level evaluation.

  4. Large-scale latitude distortions of the inner Milky Way disk from the Herschel/Hi-GAL Survey

    NASA Astrophysics Data System (ADS)

    Molinari, S.; Noriega-Crespo, A.; Bally, J.; Moore, T. J. T.; Elia, D.; Schisano, E.; Plume, R.; Swinyard, B.; Di Giorgio, A. M.; Pezzuto, S.; Benedettini, M.; Testi, L.

    2016-04-01

    -infrared catalogues are filtered according to criteria that primarily select Young Stellar Objects (YSOs). Conclusions: The distortions of the Galactic inner disk revealed by Herschel confirm previous findings from CO surveys and HII/OB source counts but with much greater statistical significance and are interpreted as large-scale bending modes of the plane. The lack of similar distortions in tracers of more evolved YSOs or stars rules out gravitational instabilities or satellite-induced perturbations, because they should act on both the diffuse and stellar disk components. We propose that the observed bends are caused by incoming flows of extra-planar gas from the Galactic fountain or the Galactic halo interacting with the gaseous disk. With a much lower cross-section, stars decouple from the gaseous ISM and relax into the stellar disk potential. The timescale required for the disappearance of the distortions from the diffuse ISM to the relatively evolved YSO stages are compatible with star formation timescales.

  5. Evaluating a Large-Scale Community-Based Intervention to Improve Pregnancy and Newborn Health Among the Rural Poor in India

    PubMed Central

    Lalwani, Tanya; Dutta, Rahul; Rajaratnam, Julie Knoll; Ruducha, Jenny; Varkey, Leila Caleb; Wunnava, Sita; Menezes, Lysander; Taylor, Catharine; Bernson, Jeff

    2015-01-01

    Objectives. We evaluated the effectiveness of the Sure Start project, which was implemented in 7 districts of Uttar Pradesh, India, to improve maternal and newborn health. Methods. Interventions were implemented at 2 randomly assigned levels of intensity. Forty percent of the areas received a more intense intervention, including community-level meetings with expectant mothers. A baseline survey consisted of 12 000 women who completed pregnancy in 2007; a follow-up survey was conducted for women in 2010 in the same villages. Our quantitative analyses provide an account of the project’s impact. Results. We observed significant health improvements in both intervention areas over time; in the more intensive intervention areas, we found greater improvements in care-seeking and healthy behaviors. The more intensive intervention areas did not experience a significantly greater decline in neonatal mortality. Conclusions. This study demonstrates that community-based efforts, especially mothers’ group meetings designed to increase care-seeking and healthy behaviors, are effective and can be implemented at large scale. PMID:25393175

  6. Large scale profiles of galaxies at z=0-2 studied by stacking the HSC SSP survey data

    NASA Astrophysics Data System (ADS)

    Kubo, Mariko; Ouchi, Masami; Shibuya, Takatoshi

    2017-03-01

    We are carrying out a study of the evolution of radial surface brightness profiles of galaxies from z = 0 to 2 by stacking analysis using data collected by the Hyper Suprime-Cam (HSC) Subaru Strategic Program (SSP). This will allow us to constrain the large-scale average profiles of various galaxy populations at high redshift. From the stacking analysis of galaxies selected based on their photometric redshifts, we successfully detected the outer components of galaxies at z > 1 extending to at least ~80 kpc, which implies an early formation for the galaxy outskirts.

  7. Prevalence of disability in Manikganj district of Bangladesh: results from a large-scale cross-sectional survey

    PubMed Central

    Zaman, M Mostafa; Mashreky, Saidur Rahman

    2016-01-01

    Objective To conduct a comprehensive survey on disability to determine the prevalence and distribution of cause-specific disability among residents of the Manikganj district in Bangladesh. Methods The survey was conducted in Manikganj, a typical district in Bangladesh, in 2009. Data were collected from 37 030 individuals of all ages. Samples were drawn from 8905 households from urban and rural areas proportionate to population size. Three sets of interviewer-administered questionnaires were used separately for age groups 0–1 years, 2–10 years and 11 years and above to collect data. For the age groups 0–1 years and 2–10 years, the parents or the head of the household were interviewed to obtain the responses. Impairments, activity limitations and restriction of participation were considered in defining disability consistent with the International Classification of Functioning, Disability and Health framework. Results Overall, age-standardised prevalence of disability per 1000 was 46.5 (95% CI 44.4 to 48.6). Prevalence was significantly higher among respondents living in rural areas (50.2; 95% CI 47.7 to 52.7) than in urban areas (31.0; 95% CI 27.0 to 35.0). Overall, female respondents had more disability (50.0; 95% CI 46.9 to 53.1) than male respondents (43.4; 95% CI 40.5 to 46.3). Educational deprivation was closely linked to higher prevalence of disability. Commonly reported prevalences (per 1000) for underlying causes of disability were 20.2 for illness, followed by 9.4 for congenital causes and 6.8 for injury, and these were consistent in males and females. Conclusions Disability is a common problem in this typical district of Bangladesh, which is largely generalisable. Interventions at community level with special attention to the socioeconomically deprived are warranted. PMID:27431897
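    The age-standardised prevalences quoted above come from direct standardisation: stratum-specific rates weighted by a standard population. A minimal sketch of the arithmetic, with made-up strata (the age cut-points, prevalences, and weights below are illustrative assumptions, not the survey's data):

    ```python
    # Direct age standardisation: weighted sum of stratum-specific prevalences.
    # All numbers here are illustrative, not taken from the Manikganj survey.
    strata = [
        # (prevalence per 1000 in stratum, standard-population weight)
        (10.0, 0.35),   # 0-14 years
        (30.0, 0.40),   # 15-44 years
        (80.0, 0.18),   # 45-64 years
        (150.0, 0.07),  # 65+ years
    ]

    # Weights must sum to 1 so the result is again a rate per 1000.
    assert abs(sum(w for _, w in strata) - 1.0) < 1e-9

    standardised = sum(prev * w for prev, w in strata)
    print(round(standardised, 1))  # → 40.4 per 1000
    ```

    Standardisation like this is what makes the rural/urban and male/female comparisons in the abstract fair despite different age structures in those groups.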

  8. Large-scale monitoring of shorebird populations using count data and N-mixture models: Black Oystercatcher (Haematopus bachmani) surveys by land and sea

    USGS Publications Warehouse

    Lyons, James E.; Royle, J. Andrew; Thomas, Susan M.; Elliott-Smith, Elise; Evenson, Joseph R.; Kelly, Elizabeth G.; Milner, Ruth L.; Nysewander, David R.; Andres, Brad A.

    2012-01-01

    Large-scale monitoring of bird populations is often based on count data collected across spatial scales that may include multiple physiographic regions and habitat types. Monitoring at large spatial scales may require multiple survey platforms (e.g., from boats and land when monitoring coastal species) and multiple survey methods. It becomes especially important to explicitly account for detection probability when analyzing count data that have been collected using multiple survey platforms or methods. We evaluated a new analytical framework, N-mixture models, to estimate actual abundance while accounting for multiple detection biases. During May 2006, we made repeated counts of Black Oystercatchers (Haematopus bachmani) from boats in the Puget Sound area of Washington (n = 55 sites) and from land along the coast of Oregon (n = 56 sites). We used a Bayesian analysis of N-mixture models to (1) assess detection probability as a function of environmental and survey covariates and (2) estimate total Black Oystercatcher abundance during the breeding season in the two regions. Probability of detecting individuals during boat-based surveys was 0.75 (95% credible interval: 0.42–0.91) and was not influenced by tidal stage. Detection probability from surveys conducted on foot was 0.68 (0.39–0.90); the latter was not influenced by fog, wind, or number of observers but was ~35% lower during rain. The estimated population size was 321 birds (262–511) in Washington and 311 (276–382) in Oregon. N-mixture models provide a flexible framework for modeling count data and covariates in large-scale bird monitoring programs designed to understand population change.
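    The N-mixture framework described above treats each site's true abundance as a latent Poisson variable and each repeat count as a binomial thinning of it. A minimal simulation sketch of that data-generating model (the abundance, detection, and survey-effort numbers are illustrative assumptions, not values from the oystercatcher surveys):

    ```python
    import math
    import random

    random.seed(42)

    # Illustrative parameters: mean abundance per site, per-individual
    # detection probability, number of sites, and repeat visits per site.
    lam, p, n_sites, n_visits = 6.0, 0.7, 500, 3

    def rpois(lam):
        """Draw from Poisson(lam) via Knuth's method (fine for small lam)."""
        L, k, prod = math.exp(-lam), 0, 1.0
        while True:
            prod *= random.random()
            if prod <= L:
                return k
            k += 1

    true_N, max_counts = [], []
    for _ in range(n_sites):
        N = rpois(lam)                                        # latent abundance
        counts = [sum(random.random() < p for _ in range(N))  # Binomial(N, p)
                  for _ in range(n_visits)]                   # repeated visits
        true_N.append(N)
        max_counts.append(max(counts))

    # Even the best single count underestimates abundance when p < 1,
    # which is why the model estimates N and p jointly from the repeats.
    print(sum(max_counts) / n_sites < sum(true_N) / n_sites)
    ```

    Fitting the model inverts this simulation: the repeated counts identify p, which in turn corrects the abundance estimate, exactly the adjustment that raised the raw boat and shore counts to the 321- and 311-bird totals reported above.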

  9. Large-Scale Survey of Chinese Precollege Students' Epistemological Beliefs about Physics: A Progression or a Regression?

    ERIC Educational Resources Information Center

    Zhang, Ping; Ding, Lin

    2013-01-01

    This paper reports a cross-grade comparative study of Chinese precollege students' epistemological beliefs about physics by using the Colorado Learning Attitudes Survey about Sciences (CLASS). Our students of interest are middle and high schoolers taking traditional lecture-based physics as a mandatory science course each year from the 8th grade…

  10. Adult Siblings of Individuals with Down Syndrome versus with Autism: Findings from a Large-Scale US Survey

    ERIC Educational Resources Information Center

    Hodapp, R. M.; Urbano, R. C.

    2007-01-01

    Background: As adults with Down syndrome live increasingly longer lives, their adult siblings will most likely assume caregiving responsibilities. Yet little is known about either the sibling relationship or the general functioning of these adult siblings. Using a national, web-based survey, this study compared adult siblings of individuals with…

  11. Exploring Students' Concepts of Feedback as Articulated in Large-Scale Surveys: A Useful Proxy and Some Encouraging Nuances

    ERIC Educational Resources Information Center

    Carver, Mark

    2016-01-01

    Surveys asking Higher Education students about feedback tend to find similar results: feedback should be prompt, specific, understandable and regular. Efforts to improve the feedback experience therefore emphasises that feedback be more frequent, detailed and turnaround times reduced. However, indications that students misunderstand key phrases in…

  12. A Large-scale Spectroscopic Survey of Methanol and OH Line Emission from the Galactic Center: Observations and Data

    NASA Astrophysics Data System (ADS)

    Cotton, W. D.; Yusef-Zadeh, F.

    2016-11-01

    Class I methanol masers are collisionally pumped and are generally correlated with outflows in star-forming sites in the Galaxy. Using the Very Large Array in its A-array configuration, we present a spectral line survey to identify methanol J = 4(-1) → 3(0) E emission at 36.169 GHz. Over 900 pointings were used to cover a region 66′ × 13′ along the inner Galactic plane. A shallow survey of OH at 1612, 1665, 1667, and 1720 MHz was also carried out over the area covered by our methanol survey. We provide a catalog of 2240 methanol masers with narrow line widths of ~1 km s⁻¹, spatial resolutions of ~0.14″ × 0.05″, and rms noise of ~20 mJy beam⁻¹ per channel. Lower limits on the brightness temperature range from 27,000 to 10,000,000 K, showing that the emission is of non-thermal origin. We also provide a list of 23 OH (1612 MHz), 14 OH (1665 MHz), 5 OH (1667 MHz), and 5 OH (1720 MHz) masers. The origin of such a large number of methanol masers is not clear. Many methanol masers appear to be associated with infrared dark clouds, though it appears unlikely that the entire population of these masers traces the early phase of star formation in the Galactic center.

  13. Abuse of Medications Employed for the Treatment of ADHD: Results From a Large-Scale Community Survey

    PubMed Central

    Bright, George M.

    2008-01-01

Objective: To assess abuse of prescription and illicit stimulants among individuals being treated for attention-deficit/hyperactivity disorder (ADHD). Methods: A survey was distributed to patients enrolled in an ADHD treatment center. It included questions designed to gain information about demographics; ADHD treatment history; illicit drug use; and misuse of prescribed stimulant medications, including the type of stimulant medication most frequently misused or abused and how the stimulant was prepared and administered. Results: A total of 545 subjects (89.2% with ADHD) were included in the survey. Results indicated that 14.3% of respondents abused prescription stimulants. Of these, 79.8% abused short-acting agents; 17.2% abused long-acting stimulants; 2.0% abused both short- and long-acting agents; and 1.0% abused other agents. The specific medications abused most often were mixed amphetamine salts (Adderall; 40.0%), mixed amphetamine salts extended release (Adderall XR; 14.2%), and methylphenidate (Ritalin; 15.0%), and the most common manner of stimulant abuse was crushing pills and snorting (75.0%). Survey results also showed that 39.1% of respondents used nonprescription stimulants, most often cocaine (62.2%), methamphetamine (4.8%), and both cocaine and amphetamine (31.1%). Choice of illicit drug was based on rapidity of high onset (43.5%), ease of acquisition (40.7%), ease of use (10.2%), and cost (5.5%). Conclusions: The risks for abuse of prescription and illicit stimulants are elevated among individuals being treated in an ADHD clinic. The prescription agents used most often are those with pharmacologic and pharmacokinetic characteristics that provide a rapid high. This suggests that long-acting stimulant preparations developed for the treatment of ADHD may have lower abuse potential than short-acting formulations. PMID:18596945

  14. Large scale dynamic systems

    NASA Technical Reports Server (NTRS)

    Doolin, B. F.

    1975-01-01

    Classes of large scale dynamic systems were discussed in the context of modern control theory. Specific examples discussed were in the technical fields of aeronautics, water resources and electric power.

  15. Large-scale survey of Chinese precollege students' epistemological beliefs about physics: A progression or a regression?

    NASA Astrophysics Data System (ADS)

    Zhang, Ping; Ding, Lin

    2013-06-01

This paper reports a cross-grade comparative study of Chinese precollege students' epistemological beliefs about physics, using the Colorado Learning Attitudes about Science Survey (CLASS). Our students of interest are middle and high schoolers taking traditional lecture-based physics as a mandatory science course each year from the 8th grade to the 12th grade in China. The original CLASS was translated into Mandarin through a rigorous transadaptation process, and it was then administered as a pencil-and-paper in-class survey to a total of 1318 students across all five grade levels (8-12). Our results showed that although student epistemological beliefs in general became less expertlike after more years of traditional instruction (a trend consistent with the previous literature), the cross-grade change was not a monotonic decrease. Instead, students at grades 9 and 12 showed a slight positive shift in their beliefs as measured by CLASS. In particular, compared to the 8th graders, students at the 9th grade demonstrated a significant increase in their views about the conceptual nature of physics and in problem-solving sophistication. We hypothesize that both pedagogical and nonpedagogical factors may have contributed to these positive changes. Our results cast light on the complex relationship between formal instruction and student epistemological beliefs.

  16. The large scale structure of the Universe revealed with high redshift emission-line galaxies: implications for future surveys

    NASA Astrophysics Data System (ADS)

    Antonino Orsi, Alvaro

    2015-08-01

Nebular emission in galaxies traces their star-formation activity within the last ~10 Myr. Hence, these objects are typically found in the outskirts of massive clusters, where environmental effects can otherwise shut down the star formation process. In this talk I discuss the nature of emission-line galaxies (ELGs) and the implications for their clustering properties. To account for the relevant physical ingredients that produce nebular emission, I combine semi-analytical models of galaxy formation with a radiative transfer code for Ly-alpha photons and the photoionization and shock code MAPPINGS-III. As a result, the clustering strength of ELGs is found to correlate weakly with line luminosity. Their 2D clustering also displays a weak finger-of-god effect, and their clustering on linear scales is affected by assembly bias. I review the implications of the nature of this galaxy population for future large spectroscopic surveys targeting ELGs to extract cosmological results. In particular, I present forecasts for the ELG population in J-PAS, an 8000 deg² survey with 54 narrow-band filters covering the optical range, expected to start in 2016.

  17. Large-scale serological survey of caprine arthritis-encephalitis virus (CAEV) in Korean black goats (Capra hircus aegagrus).

    PubMed

    Oem, Jae-Ku; Chung, Joon-Yee; Byun, Jae-Won; Kim, Ha-Young; Kwak, Dongmi; Jung, Byeong Yeal

    2012-12-01

    A national serological survey of caprine arthritis-encephalitis virus (CAEV) infection was conducted using an enzyme-linked immunosorbent assay (ELISA) and an agar gel immunodiffusion (AGID) test. A total of 658 black goats of various breeds were sampled from 59 farms in three regions of Korea. The CAEV-positive goats were predominantly detected in the Southern region (n=17) as compared with the Northern (n=1) and Central regions (n=0) (χ(2)=6.26, P=0.044). Among 658 goats tested, 18 were positive in both ELISA and AGID, indicating a CAEV prevalence of 2.73% (95% confidence interval: 1.74-4.28). These results indicate that CAEV is present in Korean black goats.
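The quoted interval (1.74-4.28% around 18/658) is consistent with a Wilson score interval for a binomial proportion; a minimal sketch (the `wilson_interval` helper is our own illustration, not code from the study):

```python
from math import sqrt

def wilson_interval(k, n, z=1.96):
    """Wilson score confidence interval for a binomial proportion k/n."""
    p = k / n
    centre = p + z * z / (2 * n)          # shifted point estimate
    halfwidth = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    denom = 1 + z * z / n
    return (centre - halfwidth) / denom, (centre + halfwidth) / denom

# 18 seropositive goats out of 658 sampled
lo, hi = wilson_interval(18, 658)
print(f"prevalence {100 * 18 / 658:.1f}%, 95% CI {100 * lo:.2f}-{100 * hi:.2f}%")
```

Unlike the simpler Wald interval, the Wilson form behaves well for small proportions such as this one.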

18. A Large-Scale, Low-Frequency Murchison Widefield Array Survey of Galactic H ii Regions between 260° < l < 340°

    NASA Astrophysics Data System (ADS)

    Hindson, L.; Johnston-Hollitt, M.; Hurley-Walker, N.; Callingham, J. R.; Su, H.; Morgan, J.; Bell, M.; Bernardi, G.; Bowman, J. D.; Briggs, F.; Cappallo, R. J.; Deshpande, A. A.; Dwarakanath, K. S.; For, B.-Q.; Gaensler, B. M.; Greenhill, L. J.; Hancock, P.; Hazelton, B. J.; Kapińska, A. D.; Kaplan, D. L.; Lenc, E.; Lonsdale, C. J.; Mckinley, B.; McWhirter, S. R.; Mitchell, D. A.; Morales, M. F.; Morgan, E.; Oberoi, D.; Offringa, A.; Ord, S. M.; Procopio, P.; Prabu, T.; Shankar, N. Udaya; Srivani, K. S.; Staveley-Smith, L.; Subrahmanyan, R.; Tingay, S. J.; Wayth, R. B.; Webster, R. L.; Williams, A.; Williams, C. L.; Wu, C.; Zheng, Q.

    2016-05-01

We have compiled a catalogue of H ii regions detected with the Murchison Widefield Array between 72 and 231 MHz. The multiple frequency bands provided by the Murchison Widefield Array allow us to identify the characteristic spectrum generated by the thermal bremsstrahlung process in H ii regions. We detect 306 H ii regions in the range 260° < l < 340° and report the positions, sizes, peak and integrated flux densities, and spectral indices of these H ii regions. By identifying the point at which H ii regions transition from the optically thin to the optically thick regime, we derive physical properties, including the electron density, ionised gas mass, and ionising photon flux, for 61 H ii regions. This catalogue of H ii regions represents the most extensive and uniform low-frequency survey of H ii regions in the Galaxy to date.
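The thin/thick transition used above to derive physical properties can be sketched with the standard power-law approximation for the free-free optical depth, τ ≈ 3.28×10⁻⁷ (Te/10⁴ K)⁻¹·³⁵ (ν/GHz)⁻²·¹ (EM/pc cm⁻⁶); the temperature and emission measure below are illustrative values, not results from this survey:

```python
def freefree_tau(nu_ghz, te_k, em_pc_cm6):
    """Free-free optical depth, standard power-law approximation."""
    return 3.28e-7 * (te_k / 1e4) ** -1.35 * nu_ghz ** -2.1 * em_pc_cm6

def turnover_ghz(te_k, em_pc_cm6):
    """Frequency at which tau = 1, i.e. the optically thin/thick transition."""
    return (3.28e-7 * (te_k / 1e4) ** -1.35 * em_pc_cm6) ** (1 / 2.1)

# Illustrative H ii region: Te = 8000 K, EM = 1e6 pc cm^-6.
# Below the turnover S_nu rises as ~nu^2 (thick); above it, ~nu^-0.1 (thin).
print(f"turnover ~ {turnover_ghz(8000.0, 1e6):.2f} GHz")
```

Locating this turnover within the observed bands is what lets the emission measure, and hence electron density, be constrained.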

  19. Characterizing Companions to Low-Mass Stars: A Large-Scale, Volume-Limited Survey of Local M-dwarfs

    NASA Astrophysics Data System (ADS)

    Ward-Duong, Kimberly; Patience, J.; De Rosa, R.; Rajan, A.

    2013-01-01

    M-dwarfs constitute the major fraction of stars within both the solar neighborhood and nearby star-forming regions. However, key M-dwarf companion characteristics - including multiplicity fraction, mass ratios, and separation distributions - are less certain for field stars, due to limited sample sizes and non-uniform selection criteria. Studies of star-forming regions often compare results to solar-type field stars due to the extensive population statistics available for G-dwarfs, but field M-dwarfs represent a more analogous population for comparison due to their prevalence. We present results on a stellar and substellar companion study covering separations from ~1 - 10,000 AU, based on a volume-limited survey of ~300 M-dwarfs within 15 pc. Our study constrains the frequency of binary companions and the shape of the companion separation and mass ratio distributions. Diffraction-limited, mid-to-near infrared archival data were obtained from the Very Large Telescope, Hubble Space Telescope, and Canada-France-Hawaii Telescope, to detect nearby companions to M-dwarfs from ~1 to 100 AU. To supplement the high-resolution data, wide-field archival plates were searched for companions with separations of 100 to 10,000 AU. The all-sky survey data include multiple epochs, and follow up observations at higher resolution will allow us to confirm or reject the new companion candidates detected during our analysis. These multi-epoch observations provide confirmation of common proper motions, thereby minimizing background contamination and providing comprehensive statistics for M-star binaries. Preliminary analysis of an initial subset of the sample suggests a lower limit to the multiplicity of 23 ± 7% within the restricted separation range. Characterizations of the binary frequency for M-dwarfs provide crucial insights into the low-mass star formation environment, and hold additional implications for the frequency and evolutionary histories of their associated disks and

  20. Galaxy clustering on large scales.

    PubMed Central

    Efstathiou, G

    1993-01-01

I describe some recent observations of large-scale structure in the galaxy distribution. The best constraints come from two-dimensional galaxy surveys and studies of angular correlation functions. Results from galaxy redshift surveys are much less precise but are consistent with the angular correlations, provided the distortions in mapping between real-space and redshift-space are relatively weak. The galaxy two-point correlation function, rich-cluster two-point correlation function, and galaxy-cluster cross-correlation function are all well described on large scales (≳ 20 h⁻¹ Mpc, where the Hubble constant H₀ = 100h km s⁻¹ Mpc⁻¹; 1 pc = 3.09 × 10¹⁶ m) by the power spectrum of an initially scale-invariant, adiabatic, cold-dark-matter Universe with Γ = Ωh ≈ 0.2. I discuss how this fits in with the Cosmic Background Explorer (COBE) satellite detection of large-scale anisotropies in the microwave background radiation and other measures of large-scale structure in the Universe. PMID:11607400

  1. Evaluation of airborne geophysical surveys for large-scale mapping of contaminated mine pools: draft final report

    SciTech Connect

    Hammack, R. W.

    2006-12-28

Decades of underground coal mining have left about 5,000 square miles of abandoned mine workings that are rapidly filling with water. The water quality of mine pools is often poor; environmental regulatory agencies are concerned because water from mine pools could contaminate diminishing surface and groundwater supplies. Mine pools are also a threat to the safety of current mining operations. Conversely, mine pools are a large, untapped water resource that, with treatment, could be used for a variety of industrial purposes. Others have proposed using mine pools in conjunction with heat pumps as a source of heating and cooling for large industrial facilities. The management or use of mine pool water requires accurate maps of mine pools. West Virginia University has predicted the likely location and volume of mine pools in the Pittsburgh Coalbed using existing mine maps, structure contour maps, and measured mine pool elevations. Unfortunately, mine maps only reflect conditions at the time of mining, are not available for all mines, and do not always denote the maximum extent of mining. Since 1999, the National Energy Technology Laboratory (NETL) has been evaluating helicopter-borne electromagnetic sensing technologies for the detection and mapping of mine pools. Frequency-domain electromagnetic sensors are able to detect shallow mine pools (depth < 50 m) if there is sufficient contrast between the conductance of the mine pool and the conductance of the overburden. The mine pools (conductors) most confidently detected by this technology are overlain by thick, resistive sandstone layers. In 2003, a helicopter time-domain electromagnetic sensor was applied to mined areas in southwestern Virginia in an attempt to increase the depth of mine pool detection. This study failed because the mine pool targets were thin and not very conductive.
Also, large areas of the surveys were degraded or made unusable by excessive amounts of cultural electromagnetic noise that obscured the

  2. Enterobius vermicularis infection in schoolchildren: a large-scale survey 6 years after a population-based control.

    PubMed

    Wang, L-C; Hwang, K-P; Chen, E-R

    2010-01-01

Pinworm infection remains prevalent in children in many parts of the world. This study was designed to determine the prevalence of this infection in schoolchildren in Taiwan after the termination of the 15-year population-based control project in 2001. Our results showed that 2.4% of 118,190 children in 385 primary schools were found to have enterobiasis by two-consecutive-day adhesive cellophane perianal swabs. The prevalences differed significantly across the 25 counties/cities surveyed (0.6-6.6%). A significantly higher prevalence was found in boys (2.6%) than in girls (2.2%), and the prevalence decreased by grade, from 3.8% in grade 1 to 1.0% in grade 6. Among the primary schools, 9.1% had positive rates higher than 10%. In addition, pinworm infection was found to be significantly associated with the socioeconomic status, personal hygiene and sanitary conditions of the children. The results indicate that the overall prevalence of enterobiasis has remained at a low level since the control programme was transferred to the local governments.

  3. Galaxy evolution and large-scale structure in the far-infrared. II - The IRAS faint source survey

    NASA Astrophysics Data System (ADS)

    Lonsdale, Carol J.; Hacking, Perry B.; Conrow, T. P.; Rowan-Robinson, M.

    1990-07-01

The new IRAS Faint Source Survey data base is used to confirm the conclusion of Hacking et al. (1987) that the 60 micron source counts fainter than about 0.5 Jy lie in excess of predictions based on nonevolving model populations. The existence of an anisotropy between the northern and southern Galactic caps discovered by Rowan-Robinson et al. (1986) and Needham and Rowan-Robinson (1988) is confirmed, and it is found to extend below their sensitivity limit to about 0.3 Jy in 60 micron flux density. The count anisotropy at f(60) greater than 0.3 can be interpreted reasonably as due to the Local Supercluster; however, no one structure accounting for the fainter anisotropy can be easily identified in either optical or far-IR two-dimensional sky distributions. The far-IR galaxy sky distributions are considerably smoother than distributions from the published optical galaxy catalogs. It is likely that structures of the large size discussed here have been discriminated against in earlier studies due to insufficient volume sampling.

  4. Galaxy evolution and large-scale structure in the far-infrared. II. The IRAS faint source survey

    SciTech Connect

Lonsdale, C. J.; Hacking, P. B.; Conrow, T. P.; Rowan-Robinson, M. (Queen Mary College, London)

    1990-07-01

The new IRAS Faint Source Survey data base is used to confirm the conclusion of Hacking et al. (1987) that the 60 micron source counts fainter than about 0.5 Jy lie in excess of predictions based on nonevolving model populations. The existence of an anisotropy between the northern and southern Galactic caps discovered by Rowan-Robinson et al. (1986) and Needham and Rowan-Robinson (1988) is confirmed, and it is found to extend below their sensitivity limit to about 0.3 Jy in 60 micron flux density. The count anisotropy at f(60) greater than 0.3 can be interpreted reasonably as due to the Local Supercluster; however, no one structure accounting for the fainter anisotropy can be easily identified in either optical or far-IR two-dimensional sky distributions. The far-IR galaxy sky distributions are considerably smoother than distributions from the published optical galaxy catalogs. It is likely that structures of the large size discussed here have been discriminated against in earlier studies due to insufficient volume sampling. 105 refs.

  5. Galaxy evolution and large-scale structure in the far-infrared. II - The IRAS faint source survey

    NASA Technical Reports Server (NTRS)

    Lonsdale, Carol J.; Hacking, Perry B.; Conrow, T. P.; Rowan-Robinson, M.

    1990-01-01

The new IRAS Faint Source Survey data base is used to confirm the conclusion of Hacking et al. (1987) that the 60 micron source counts fainter than about 0.5 Jy lie in excess of predictions based on nonevolving model populations. The existence of an anisotropy between the northern and southern Galactic caps discovered by Rowan-Robinson et al. (1986) and Needham and Rowan-Robinson (1988) is confirmed, and it is found to extend below their sensitivity limit to about 0.3 Jy in 60 micron flux density. The count anisotropy at f(60) greater than 0.3 can be interpreted reasonably as due to the Local Supercluster; however, no one structure accounting for the fainter anisotropy can be easily identified in either optical or far-IR two-dimensional sky distributions. The far-IR galaxy sky distributions are considerably smoother than distributions from the published optical galaxy catalogs. It is likely that structures of the large size discussed here have been discriminated against in earlier studies due to insufficient volume sampling.

6. Macro- and microstructural diversity of sea urchin teeth revealed by large-scale micro-computed tomography survey

    NASA Astrophysics Data System (ADS)

    Ziegler, Alexander; Stock, Stuart R.; Menze, Björn H.; Smith, Andrew B.

    2012-10-01

Sea urchins (Echinodermata: Echinoidea) generally possess an intricate jaw apparatus that incorporates five teeth. Although echinoid teeth consist of calcite, their complex internal design results in biomechanical properties far superior to those of inorganic forms of the constituent material. While the individual elements (or microstructure) of echinoid teeth provide general insight into processes of biomineralization, the cross-sectional shape (or macrostructure) of echinoid teeth is useful for phylogenetic and biomechanical inferences. However, studies of sea urchin tooth macro- and microstructure have traditionally been limited to a few readily available species, effectively disregarding a potentially high degree of structural diversity that could be informative in a number of ways. Having scanned numerous sea urchin species using micro-computed tomography (µCT) and synchrotron µCT, we report a large variation in macro- and microstructure of sea urchin teeth. In addition, we describe aberrant tooth shapes and apply 3D visualization protocols that permit accelerated visual access to the complex microstructure of sea urchin teeth. Our broad survey identifies key taxa for further in-depth study and integrates previously assembled data on fossil species into a more comprehensive systematic analysis of sea urchin teeth. In order to circumvent the imprecise, word-based description of tooth shape, we introduce shape analysis algorithms that will permit the numerical and therefore more objective description of tooth macrostructure. Finally, we discuss how synchrotron µCT datasets permit virtual models of tooth microstructure to be generated as well as the simulation of tooth mechanics based on finite element modeling.

  7. National Health Care Survey

    Cancer.gov

    This survey encompasses a family of health care provider surveys, including information about the facilities that supply health care, the services rendered, and the characteristics of the patients served.

  8. The VIMOS-VLT Deep Survey First Epoch observations: evolution of galaxies, large scale structures and AGNs over 90% of the current age of the Universe

    NASA Astrophysics Data System (ADS)

    Le Fèvre, O.; Vettolani, G.; Bottini, D.; Garilli, B.; Le Brun, V.; Maccagni, D.; Picat, J.-P.; Scaramella, R.; Scodeggio, M.; Tresse, L.; Zanichelli, A.; Adami, C.; Arnaboldi, M.; Arnouts, S.; Bardelli, S.; Bolzonella, M.; Cappi, A.; Charlot, S.; Ciliegi, P.; Contini, T.; Franzetti, P.; Foucaud, S.; Gavignaud, I.; Guzzo, L.; Ilbert, O.; Iovino, A.; McCracken, H.-J.; Marano, B.; Marinoni, C.; Mazure, A.; Meneux, B.; Merighi, R.; Paltani, S.; Pellò, R.; Pollo, A.; Pozzetti, L.; Radovich, M.; Zamorani, G.; Zucca, E.; Bondi, M.; Bongiorno, A.; Busarello, G.; Lamareille, F.; Mathez, G.; Mellier, Y.; Merluzzi, P.; Ripepi, V.; Rizzo, D.

    2005-03-01

The VIMOS VLT Deep Survey (VVDS) is a major redshift survey of the distant Universe, aimed at studying the evolution of galaxies, large scale structures and AGNs over more than 90% of the age of the Universe. A total of 41,000 spectra have been observed so far. From the first epoch observations conducted with VIMOS, we have assembled ~11,000 redshifts for galaxies with 0 ≤ z ≤ 5 selected with magnitude I_AB ≤ 24 in an area 3.1 times the area of the full Moon. We present evidence for a strong evolution of the luminosity of galaxies and show that galaxies are already distributed in dense structures at z ~ 1.5. The high redshift population of ~1000 galaxies with 1.4 ≤ z ≤ 5 appears to be more numerous than previously believed. As the survey continues, we are assembling multi-wavelength data in collaboration with other teams (GALEX, Spitzer-SWIRE, XMM-LSS, VLA), as well as expanding to larger scales (~100 Mpc) to probe the Universe in an unprecedented way.

  9. [The benefit of large-scale cohort studies for health research: the example of the German National Cohort].

    PubMed

    Ahrens, Wolfgang; Jöckel, K-H

    2015-08-01

The prospective nature of large-scale epidemiological multi-purpose cohort studies with long observation periods facilitates the search for complex causes of diseases, the analysis of the natural history of diseases, and the identification of novel pre-clinical markers of disease. The German National Cohort (GNC) is a population-based, highly standardised and in-depth phenotyped cohort. It is intended to create the basis for new strategies for risk assessment and identification, early diagnosis, and prevention of multifactorial diseases. The GNC is the largest population-based cohort study in Germany to date. In 2014, the examination of 200,000 women and men aged 20-69 years started in 18 study centers. The study facilitates the investigation of the etiology of chronic diseases in relation to lifestyle, genetic, socioeconomic, psychosocial and environmental factors. In this way, the GNC creates the basis for the development of methods for the early diagnosis and prevention of these diseases. Cardiovascular and respiratory diseases, cancer, diabetes, neurodegenerative and psychiatric diseases, and musculoskeletal and infectious diseases are the focus of this study. Owing to its sheer size, the study might be characterized as a Big Data project; we argue that this is not the case.

  10. Collective response to public health emergencies and large-scale disasters: putting hospitals at the core of community resilience.

    PubMed

    Paturas, James L; Smith, Deborah; Smith, Stewart; Albanese, Joseph

    2010-07-01

    Healthcare organisations are a critical part of a community's resilience and play a prominent role as the backbone of medical response to natural and manmade disasters. The importance of healthcare organisations, in particular hospitals, to remain operational extends beyond the necessity to sustain uninterrupted medical services for the community, in the aftermath of a large-scale disaster. Hospitals are viewed as safe havens where affected individuals go for shelter, food, water and psychosocial assistance, as well as to obtain information about missing family members or learn of impending dangers related to the incident. The ability of hospitals to respond effectively to high-consequence incidents producing a massive arrival of patients that disrupt daily operations requires surge capacity and capability. The activation of hospital emergency support functions provides an approach by which hospitals manage a short-term shortfall of hospital personnel through the reallocation of hospital employees, thereby obviating the reliance on external qualified volunteers for surge capacity and capability. Recent revisions to the Joint Commission's hospital emergency preparedness standard have impelled healthcare facilities to participate actively in community-wide planning, rather than confining planning exclusively to a single healthcare facility, in order to harmonise disaster management strategies and effectively coordinate the allocation of community resources and expertise across all local response agencies.

  11. The Vimos VLT Deep Survey. Stellar mass segregation and large-scale galaxy environment in the redshift range 0.2 < z < 1.4

    NASA Astrophysics Data System (ADS)

    Scodeggio, M.; Vergani, D.; Cucciati, O.; Iovino, A.; Franzetti, P.; Garilli, B.; Lamareille, F.; Bolzonella, M.; Pozzetti, L.; Abbas, U.; Marinoni, C.; Contini, T.; Bottini, D.; Le Brun, V.; Le Fèvre, O.; Maccagni, D.; Scaramella, R.; Tresse, L.; Vettolani, G.; Zanichelli, A.; Adami, C.; Arnouts, S.; Bardelli, S.; Cappi, A.; Charlot, S.; Ciliegi, P.; Foucaud, S.; Gavignaud, I.; Guzzo, L.; Ilbert, O.; McCracken, H. J.; Marano, B.; Mazure, A.; Meneux, B.; Merighi, R.; Paltani, S.; Pellò, R.; Pollo, A.; Radovich, M.; Zamorani, G.; Zucca, E.; Bondi, M.; Bongiorno, A.; Brinchmann, J.; de La Torre, S.; de Ravel, L.; Gregorini, L.; Memeo, P.; Perez-Montero, E.; Mellier, Y.; Temporin, S.; Walcher, C. J.

    2009-07-01

Context: Hierarchical models of galaxy formation predict that the properties of a dark matter halo depend on the large-scale environment surrounding the halo. As a result of this correlation, we expect massive haloes to be present in larger numbers in overdense regions than in underdense ones. Given that a correlation exists between a galaxy's stellar mass and the mass of the hosting dark matter halo, the segregation in dark matter halo mass should then result in a segregation in the distribution of stellar mass in the galaxy population. Aims: In this work we study the distribution of galaxy stellar mass and rest-frame optical color as a function of the large-scale galaxy distribution using the VLT VIMOS Deep Survey sample, in order to verify the presence of segregation in the properties of the galaxy population. Methods: We use VVDS redshift measurements and multi-band photometric data to derive estimates of the stellar mass, rest-frame optical color, and large-scale galaxy density, on a scale of approximately 8 Mpc, for a sample of 5619 galaxies in the redshift range 0.2 < z < 1.4. However, when we consider only galaxies in narrow bins of stellar mass, in order to exclude the effects of stellar mass segregation on galaxy properties, we no longer observe any significant color segregation. Based on data obtained with the European Southern Observatory Very Large Telescope, Paranal, Chile, program 070.A-9007(A), and on data obtained at the Canada-France-Hawaii Telescope.

  12. The CIDA-QUEST large-scale survey of Orion OB1: evidence for rapid disk dissipation in a dispersed stellar population.

    PubMed

    Briceño, C; Vivas, A K; Calvet, N; Hartmann, L; Pacheco, R; Herrera, D; Romero, L; Berlind, P; Sánchez, G; Snyder, J A; Andrews, P

    2001-01-05

We are conducting a large-scale, multiepoch, optical photometric survey [Centro de Investigaciones de Astronomía-Quasar Equatorial Survey Team (CIDA-QUEST)] covering about 120 square degrees to identify the young low-mass stars in the Orion OB1 association. We present results for an area of 34 square degrees. Using photometric variability as our main selection criterion, as well as follow-up spectroscopy, we confirmed 168 previously unidentified pre-main-sequence stars of about 0.6 to 0.9 solar masses (M⊙), with ages of about 1 million to 3 million years (Ori OB1b) and about 3 million to 10 million years (Ori OB1a). The low-mass stars are spatially coincident with the high-mass (at least 3 M⊙) members of the associations. Indicators of disk accretion such as Hα emission and near-infrared emission from dusty disks fall sharply from Ori OB1b to Ori OB1a, indicating that the time scale for disk dissipation, and possibly the onset of planet formation, is a few million years.

  13. A Deep Survey of Low-Redshift Absorbers and Their Connections with Galaxies: Probing the Roles of Dwarfs, Satellites, and Large-Scale Environment

    NASA Astrophysics Data System (ADS)

    Burchett, Joseph

    2014-10-01

In the not-too-distant past, the study of galaxy evolution neglected the vast interface between the stars in a galaxy and intergalactic space, except for the dynamical effects of dark matter. Thanks to QSO absorption line spectroscopy and the Cosmic Origins Spectrograph (COS), the circumgalactic medium (CGM) has come into sharp focus as a rich ecosystem playing a vital role in the evolution of the host galaxy. However, attributing the gas detected in absorption to host dwarf galaxies detected in optical surveys around the sightline becomes very difficult very quickly with increasing redshift. In addition, both targeted UV spectroscopy and ground-based galaxy surveys are resource intensive, which complicates compiling large, statistically robust samples of very-low-redshift absorber/galaxy pairs. We propose a CGM study of unprecedented statistical power by exploiting the vast number of sightlines in the HST/COS archive located within the Sloan Digital Sky Survey (SDSS) footprint to compile an estimated sample of 586 absorbers at z < 0.015. This very-low-redshift criterion enables spectroscopic completeness down to L < 0.01 L* galaxies in publicly available optical imaging and spectroscopy. Our survey is uniquely poised to address the following questions: (1) What is the role of dwarf galaxies, which would be undetectable at higher redshift, in giving rise to the gas detected in QSO spectroscopy? (2) How do galaxy environment and large-scale structure affect the CGM, and what are the implications for environmental quenching of star formation? (3) How efficiently do feedback mechanisms expel metal-enriched gas to great distances into the galaxy halo and into the IGM?

14. Large scale pulsar surveys, new pulsar discoveries, and the observability of pulsar beams strongly bent by the Sgr A* black hole

    NASA Astrophysics Data System (ADS)

    Stovall, Kevin

Pulsars are useful tools for a wide range of topics, including but not limited to the detection of gravitational waves; tests of theories of gravity; population studies of pulsars, neutron stars, and binary systems; and analysis of Galactic structure. For the detection of gravitational waves, large numbers of extremely fast pulsars with periods of a few milliseconds, distributed across a large range of angular separations, are needed. For population and Galactic structure studies, large numbers of pulsars distributed throughout the Galaxy are necessary. Large numbers of pulsar discoveries are also needed to find the rare, exotic systems useful for tests of theories of gravity. As all of these efforts require the discovery of large numbers of pulsars, a significant effort has been made over the past few years, and will continue into the foreseeable future, to detect many more new radio pulsars through large scale pulsar surveys. The surveys related to this work include the Pulsar Arecibo L-Band Feed Array (PALFA) survey, the Green Bank 350 MHz Drift Scan Survey, the Arecibo 327 MHz Drift Scan Survey (AO327), and the Green Bank North Celestial Cap (GBNCC) survey. Data analysis from each of these surveys has resulted or will result in millions of pulsar candidates to be combed through, in some way, in order to find new radio pulsars. Here we discuss these surveys and the data analysis pipelines for two of them (AO327 and GBNCC). We also introduce a web-based software system called ARCC Explorer, which enables researchers of varying levels, including high school and undergraduate students, to assist in the discovery process. In addition, we give discovery or timing solutions for 93 new pulsars discovered as a direct result of this work. One particularly interesting, but not yet detected, pulsar system is the pulsar-black hole system.
Attempts have been made (and are still ongoing) to detect pulsars orbiting the black hole at the Galactic center.

  15. Evolution of clustering length, large-scale bias, and host halo mass at 2 < z < 5 in the VIMOS Ultra Deep Survey (VUDS)⋆

    NASA Astrophysics Data System (ADS)

    Durkalec, A.; Le Fèvre, O.; Pollo, A.; de la Torre, S.; Cassata, P.; Garilli, B.; Le Brun, V.; Lemaux, B. C.; Maccagni, D.; Pentericci, L.; Tasca, L. A. M.; Thomas, R.; Vanzella, E.; Zamorani, G.; Zucca, E.; Amorín, R.; Bardelli, S.; Cassarà, L. P.; Castellano, M.; Cimatti, A.; Cucciati, O.; Fontana, A.; Giavalisco, M.; Grazian, A.; Hathi, N. P.; Ilbert, O.; Paltani, S.; Ribeiro, B.; Schaerer, D.; Scodeggio, M.; Sommariva, V.; Talia, M.; Tresse, L.; Vergani, D.; Capak, P.; Charlot, S.; Contini, T.; Cuby, J. G.; Dunlop, J.; Fotopoulou, S.; Koekemoer, A.; López-Sanjuan, C.; Mellier, Y.; Pforr, J.; Salvato, M.; Scoville, N.; Taniguchi, Y.; Wang, P. W.

    2015-11-01

    We investigate the evolution of galaxy clustering for galaxies in the redshift range 2.0 < z < 5.0 in the VIMOS Ultra Deep Survey (VUDS). We present the projected (real-space) two-point correlation function wp(rp) measured using 3022 galaxies with robust spectroscopic redshifts in two independent fields (COSMOS and VVDS-02h) covering in total 0.8 deg^2. We quantify how the scale-dependent clustering amplitude r0 changes with redshift, making use of mock samples to evaluate and correct the survey selection function. Using a power-law model ξ(r) = (r/r0)^-γ we find that the correlation function for the general population is best fit by a model with a clustering length r0 = 3.95 (+0.48, -0.54) h^-1 Mpc and slope γ = 1.8 (+0.02, -0.06) at z ~ 2.5, and r0 = 4.35 ± 0.60 h^-1 Mpc and γ = 1.6 (+0.12, -0.13) at z ~ 3.5. We use these clustering parameters to derive the large-scale linear galaxy bias b_PL between galaxies and dark matter. We find b_PL = 2.68 ± 0.22 at redshift z ~ 3 (assuming σ8 = 0.8), significantly higher than found at intermediate and low redshifts for similarly general galaxy populations. We fit a halo occupation distribution (HOD) model to the data and obtain an average halo mass at redshift z ~ 3 of Mh = 10^(11.75 ± 0.23) h^-1 M⊙. From this fit we confirm that the large-scale linear galaxy bias is relatively high, at b_HOD = 2.82 ± 0.27. Comparing these measurements with similar measurements at lower redshifts, we infer that the star-forming population of galaxies at z ~ 3 should evolve into the massive and bright (Mr < -21.5) galaxy population, which typically occupies haloes of mass ⟨Mh⟩ = 10^13.9 h^-1 M⊙ at redshift z = 0. Based on data obtained with the European Southern Observatory Very Large Telescope, Paranal, Chile, under Large Programme 185.A-0791. Appendices are available in electronic form at http://www.aanda.org
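For a power-law correlation function ξ(r) = (r/r0)^-γ, the projected correlation function has the standard closed form wp(rp) = rp (r0/rp)^γ Γ(1/2) Γ((γ-1)/2) / Γ(γ/2). A minimal sketch evaluating this relation with the quoted z ~ 2.5 best-fit values (the function name and structure are illustrative, not from the paper):

```python
import math

def wp_powerlaw(rp, r0=3.95, slope=1.8):
    """Projected two-point correlation function wp(rp), in h^-1 Mpc,
    for a power-law xi(r) = (r/r0)**-slope (rp and r0 in h^-1 Mpc)."""
    # Amplitude factor Gamma(1/2) * Gamma((slope-1)/2) / Gamma(slope/2)
    amp = (math.gamma(0.5) * math.gamma((slope - 1.0) / 2.0)
           / math.gamma(slope / 2.0))
    return rp * (r0 / rp) ** slope * amp

# Clustering amplitude falls off with projected separation rp.
for rp in (1.0, 5.0, 10.0):
    print(f"wp({rp:4.1f} h^-1 Mpc) = {wp_powerlaw(rp):7.2f}")
```

At rp = r0 the power-law factor is unity, so wp reduces to r0 times the Gamma-function amplitude.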

  16. Surveying Galaxy Proto-clusters in Emission: A Large-scale Structure at z = 2.44 and the Outlook for HETDEX

    NASA Astrophysics Data System (ADS)

    Chiang, Yi-Kuan; Overzier, Roderik A.; Gebhardt, Karl; Finkelstein, Steven L.; Chiang, Chi-Ting; Hill, Gary J.; Blanc, Guillermo A.; Drory, Niv; Chonis, Taylor S.; Zeimann, Gregory R.; Hagen, Alex; Schneider, Donald P.; Jogee, Shardha; Ciardullo, Robin; Gronwall, Caryl

    2015-07-01

    Galaxy proto-clusters at z ≳ 2 provide a direct probe of the rapid mass assembly and galaxy growth of present-day massive clusters. Because of the need for precise galaxy redshifts for density mapping and the prevalence of star formation before quenching, nearly all the proto-clusters known to date were confirmed by spectroscopy of galaxies with strong emission lines. Therefore, large emission-line galaxy surveys provide an efficient way to identify proto-clusters directly. Here we report the discovery of a large-scale structure at z = 2.44 in the Hobby-Eberly Telescope Dark Energy Experiment (HETDEX) Pilot Survey. On a scale of a few tens of comoving Mpc, this structure shows a complex overdensity of Lyα emitters (LAEs), which coincides with broadband-selected galaxies in the COSMOS/UltraVISTA photometric and zCOSMOS spectroscopic catalogs, as well as with overdensities of intergalactic gas revealed in the Lyα absorption maps of Lee et al. We construct mock LAE catalogs to predict the cosmic evolution of this structure. We find that such an overdensity should have already broken away from the Hubble flow, and part of the structure will collapse to form a galaxy cluster of 10^(14.5 ± 0.4) M⊙ by z = 0. The structure contains a higher median stellar mass of broadband-selected galaxies, a boost of extended Lyα nebulae, and a marginal excess of active galactic nuclei relative to the field, supporting a scenario of accelerated galaxy evolution in cluster progenitors. Based on the correlation between galaxy overdensity and the z = 0 descendant halo mass calibrated in the simulation, we predict that several hundred 1.9 < z < 3.5 proto-clusters with z = 0 mass greater than 10^14.5 M⊙ will be discovered in the 8.5 Gpc^3 of space surveyed by HETDEX.

  17. The VIMOS Public Extragalactic Redshift Survey (VIPERS). An unprecedented view of galaxies and large-scale structure at 0.5 < z < 1.2

    NASA Astrophysics Data System (ADS)

    Guzzo, L.; Scodeggio, M.; Garilli, B.; Granett, B. R.; Fritz, A.; Abbas, U.; Adami, C.; Arnouts, S.; Bel, J.; Bolzonella, M.; Bottini, D.; Branchini, E.; Cappi, A.; Coupon, J.; Cucciati, O.; Davidzon, I.; De Lucia, G.; de la Torre, S.; Franzetti, P.; Fumana, M.; Hudelot, P.; Ilbert, O.; Iovino, A.; Krywult, J.; Le Brun, V.; Le Fèvre, O.; Maccagni, D.; Małek, K.; Marulli, F.; McCracken, H. J.; Paioro, L.; Peacock, J. A.; Polletta, M.; Pollo, A.; Schlagenhaufer, H.; Tasca, L. A. M.; Tojeiro, R.; Vergani, D.; Zamorani, G.; Zanichelli, A.; Burden, A.; Di Porto, C.; Marchetti, A.; Marinoni, C.; Mellier, Y.; Moscardini, L.; Nichol, R. C.; Percival, W. J.; Phleps, S.; Wolk, M.

    2014-06-01

    We describe the construction and general features of VIPERS, the VIMOS Public Extragalactic Redshift Survey. This ESO Large Programme is using the Very Large Telescope with the aim of building a spectroscopic sample of ~100 000 galaxies with iAB < 22.5 and 0.5 < z < 1.2. The survey covers a total area of ~24 deg^2 within the CFHTLS-Wide W1 and W4 fields. VIPERS is designed to address a broad range of problems in large-scale structure and galaxy evolution, thanks to a unique combination of volume (~5 × 10^7 h^-3 Mpc^3) and sampling rate (~40%), comparable to state-of-the-art surveys of the local Universe, together with extensive multi-band optical and near-infrared photometry. Here we present the survey design, the selection of the source catalogue and the development of the spectroscopic observations. We discuss in detail the overall selection function that results from the combination of the different constituents of the project. This includes the masks arising from the parent photometric sample and the spectroscopic instrumental footprint, together with the weights needed to account for the sampling and success rates of the observations. Using the catalogue of 53 608 galaxy redshifts composing the forthcoming VIPERS Public Data Release 1 (PDR-1), we provide a first assessment of the quality of the spectroscopic data. The stellar contamination is found to be only 3.2%, endorsing the quality of the star-galaxy separation process and fully confirming the original estimates based on the VVDS data, which also indicate a galaxy incompleteness from this process of only 1.4%. Using a set of 1215 repeated observations, we estimate an rms redshift error σz/(1 + z) = 4.7 × 10^-4 and calibrate the internal spectral quality grading. Benefiting from the combination of size and detailed sampling of this dataset, we conclude by presenting a map showing in unprecedented detail the large-scale distribution of galaxies between 5 and 8 billion years ago. Based on observations

  18. Understanding uncertainties in non-linear population trajectories: a Bayesian semi-parametric hierarchical approach to large-scale surveys of coral cover.

    PubMed

    Vercelloni, Julie; Caley, M Julian; Kayal, Mohsen; Low-Choy, Samantha; Mengersen, Kerrie

    2014-01-01

    Recently, attempts to improve decision making in species management have focussed on uncertainties associated with modelling temporal fluctuations in populations. Reducing model uncertainty is challenging; while larger samples improve estimation of species trajectories and reduce statistical errors, they typically amplify variability in observed trajectories. In particular, traditional modelling approaches aimed at estimating population trajectories usually do not account well for nonlinearities and uncertainties associated with multi-scale observations characteristic of large spatio-temporal surveys. We present a Bayesian semi-parametric hierarchical model for simultaneously quantifying uncertainties associated with model structure and parameters, and scale-specific variability over time. We estimate uncertainty across a four-tiered spatial hierarchy of coral cover from the Great Barrier Reef. Coral variability is well described; however, our results show that, in the absence of additional model specifications, conclusions regarding coral trajectories become highly uncertain when considering multiple reefs, suggesting that management should focus more at the scale of individual reefs. The approach presented facilitates the description and estimation of population trajectories and associated uncertainties when variability cannot be attributed to specific causes and origins. We argue that our model can unlock value contained in large-scale datasets, provide guidance for understanding sources of uncertainty, and support better informed decision making.

  19. Understanding Uncertainties in Non-Linear Population Trajectories: A Bayesian Semi-Parametric Hierarchical Approach to Large-Scale Surveys of Coral Cover

    PubMed Central

    Vercelloni, Julie; Caley, M. Julian; Kayal, Mohsen; Low-Choy, Samantha; Mengersen, Kerrie

    2014-01-01

    Recently, attempts to improve decision making in species management have focussed on uncertainties associated with modelling temporal fluctuations in populations. Reducing model uncertainty is challenging; while larger samples improve estimation of species trajectories and reduce statistical errors, they typically amplify variability in observed trajectories. In particular, traditional modelling approaches aimed at estimating population trajectories usually do not account well for nonlinearities and uncertainties associated with multi-scale observations characteristic of large spatio-temporal surveys. We present a Bayesian semi-parametric hierarchical model for simultaneously quantifying uncertainties associated with model structure and parameters, and scale-specific variability over time. We estimate uncertainty across a four-tiered spatial hierarchy of coral cover from the Great Barrier Reef. Coral variability is well described; however, our results show that, in the absence of additional model specifications, conclusions regarding coral trajectories become highly uncertain when considering multiple reefs, suggesting that management should focus more at the scale of individual reefs. The approach presented facilitates the description and estimation of population trajectories and associated uncertainties when variability cannot be attributed to specific causes and origins. We argue that our model can unlock value contained in large-scale datasets, provide guidance for understanding sources of uncertainty, and support better informed decision making. PMID:25364915

  20. Effects of a large-scale unconditional cash transfer program on mental health outcomes of young people in Kenya

    PubMed Central

    Thirumurthy, Harsha; Halpern, Carolyn Tucker; Pettifor, Audrey; Handa, Sudhanshu

    2015-01-01

    Purpose This study investigates the causal effect of Kenya's unconditional cash transfer program on mental health outcomes of young people. Methods Selected Locations in Kenya were randomly assigned to receive unconditional cash transfers in the first phase of Kenya's Cash Transfer Program for Orphans and Vulnerable Children (CT-OVC). In intervention Locations, low-income households and those with OVCs began receiving monthly cash transfers of $20 in 2007. In 2011, four years after program onset, data were collected on the psychosocial status of youth aged 15-24 from households in intervention and control Locations (N = 1960). The primary outcome variable was an indicator of depressive symptoms using the 10-question Center for Epidemiologic Studies Depression Scale (CES-D10). Secondary outcomes included hope and physical health measures. Logistic regression models that adjusted for individual and household characteristics were used to determine the effect of the cash transfer program. Results The cash transfer reduced the odds of depressive symptoms by 24 percent among young persons living in households that received cash transfers. Further analysis by gender and age revealed that the effects were only significant for young men and were larger among men aged 20-24 and orphans. Conclusions This study provides evidence that poverty-targeted unconditional cash transfer programs can improve the mental health of young people in low-income countries. PMID:26576822

  1. A large-scale, rapid public health response to rabies in an organ recipient and the previously undiagnosed organ donor.

    PubMed

    Wallace, R M; Stanek, D; Griese, S; Krulak, D; Vora, N M; Pacha, L; Kan, V; Said, M; Williams, C; Burgess, T H; Clausen, S S; Austin, C; Gabel, J; Lehman, M; Finelli, L N; Selvaggi, G; Joyce, P; Gordin, F; Benator, D; Bettano, A; Cersovsky, S; Blackmore, C; Jones, S V; Buchanan, B D; Fernandez, A I; Dinelli, D; Agnes, K; Clark, A; Gill, J; Irmler, M; Blythe, D; Mitchell, K; Whitman, T J; Zapor, M J; Zorich, S; Witkop, C; Jenkins, P; Mora, P; Droller, D; Turner, S; Dunn, L; Williams, P; Richards, C; Ewing, G; Chapman, K; Corbitt, C; Girimont, T; Franka, R; Recuenco, S; Blanton, J D; Feldman, K A

    2014-12-01

    This article describes and contrasts the public health response to two human rabies cases: one organ recipient diagnosed within days of symptom onset and the transplant donor who was diagnosed 18 months post-symptom onset. In response to an organ-transplant-related rabies case diagnosed in 2013, organ donor and recipient investigations were conducted by multiple public health agencies. Persons with potential exposure to infectious patient materials were assessed for rabies virus exposure. An exposure investigation was conducted to determine the source of the organ donor's infection. Over 100 persons from more than 20 agencies spent over 2700 h conducting contact investigations in healthcare, military and community settings. The 564 persons assessed included 417 healthcare workers [5.8% recommended for post-exposure prophylaxis (PEP)], 96 community contacts (15.6% recommended for PEP), 30 autopsy personnel (50% recommended for PEP), and 21 other persons (4.8% recommended for PEP). Donor contacts accounted for 188 of those assessed, with 20.2% recommended for PEP, compared with 5.6% of the 306 recipient contacts. Human rabies cases result in substantial use of public health and medical resources, especially when diagnosis is delayed. Although rare, clinicians should consider rabies in cases of encephalitis of unexplained aetiology, particularly for cases that may result in organ donation.

  2. Testing deviations from ΛCDM with growth rate measurements from six large-scale structure surveys at z = 0.06-1

    NASA Astrophysics Data System (ADS)

    Alam, Shadab; Ho, Shirley; Silvestri, Alessandra

    2016-03-01

    We use measurements from the Planck satellite mission and galaxy redshift surveys over the last decade to test three of the basic assumptions of the standard model of cosmology, ΛCDM (Λ cold dark matter): the spatial curvature of the universe, the nature of dark energy and the laws of gravity on large scales. We obtain improved constraints on several scenarios that violate one or more of these assumptions. We measure w0 = -0.94 ± 0.17 (18 per cent measurement) and 1 + wa = 1.16 ± 0.36 (31 per cent measurement) for models with a time-dependent equation of state, which is an improvement over current best constraints. In the context of modified gravity, we consider popular scalar-tensor models as well as a parametrization of the growth factor. In the case of one-parameter f(R) gravity models with a ΛCDM background, we constrain B0 < 1.36 × 10^-5 (1σ C.L.), which is an improvement by a factor of 4 on the current best constraint. We provide the very first constraint on the coupling parameters of general scalar-tensor theory and a stringent constraint on the only free coupling parameter of Chameleon models. We also derive constraints on extended Chameleon models, improving the constraint on the coupling by a factor of 6 on the current best. The constraints on the coupling parameter for the Chameleon model rule out the value β1 = 4/3 required for f(R) gravity. We also measure γ = 0.612 ± 0.072 (11.7 per cent measurement) for the growth index parametrization. We improve all the current constraints by combining results from various galaxy redshift surveys in a coherent way, which includes a careful treatment of the scale dependence introduced by modified gravity.
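The growth index parametrization quoted above relates the linear growth rate to the matter density via f(z) ≈ Ωm(z)^γ, with γ ≈ 0.55 expected for general relativity with a ΛCDM background. A minimal sketch of the relation for a flat ΛCDM expansion history (Ωm0 = 0.31 is an assumed fiducial value, not from the paper):

```python
def growth_rate(z, omega_m0=0.31, gamma_index=0.612):
    """Linear growth rate f = dlnD/dlna under the parametrization
    f(z) = Omega_m(z)**gamma_index, for a flat LCDM background."""
    e2 = omega_m0 * (1.0 + z) ** 3 + (1.0 - omega_m0)  # (H(z)/H0)^2
    omega_m_z = omega_m0 * (1.0 + z) ** 3 / e2         # Omega_m(z)
    return omega_m_z ** gamma_index

# Since Omega_m(z) < 1, a larger gamma gives slower growth at low z
# than the GR expectation gamma ~ 0.55.
print(growth_rate(0.0), growth_rate(0.0, gamma_index=0.55))
```

At high redshift Ωm(z) → 1 and f → 1 regardless of γ, so the parametrization discriminates between models mainly at low z.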

  3. Estimates of occupational safety and health impacts resulting from large-scale production of major photovoltaic technologies

    SciTech Connect

    Owens, T.; Ungers, L.; Briggs, T.

    1980-08-01

    The purpose of this study is to estimate, both quantitatively and qualitatively, the worker and societal risks attributable to four photovoltaic cell (solar cell) production processes. Quantitative risk values were determined using statistics from the California semiconductor industry. The qualitative risk assessment was performed using a variety of both governmental and private sources of data. The occupational health statistics derived from the semiconductor industry were used to predict injury and fatality levels associated with photovoltaic cell manufacturing. The use of these statistics to characterize the two silicon processes described herein is defensible from the standpoint that many of the same process steps and materials are used in both the semiconductor and photovoltaic industries. These health statistics are less applicable to the gallium arsenide and cadmium sulfide manufacturing processes, primarily because of differences in the materials utilized. Although such differences tend to discourage any absolute comparisons among the four photovoltaic cell production processes, certain relative comparisons are warranted. To facilitate a risk comparison of the four processes, the number and severity of process-related chemical hazards were assessed. This qualitative hazard assessment addresses both the relative toxicity and the exposure potential of substances in the workplace. In addition to the worker-related hazards, estimates of process-related emissions and wastes are also provided.

  4. Health risks from large-scale water pollution: Current trends and implications for improving drinking water quality in the lower Amu Darya drainage basin, Uzbekistan

    NASA Astrophysics Data System (ADS)

    Törnqvist, Rebecka; Jarsjö, Jerker

    2010-05-01

    Safe drinking water is a primary prerequisite to human health, well-being and development. Yet roughly one billion people around the world lack access to a safe drinking water supply. Health risk assessments are effective for evaluating the suitability of using various water sources as drinking water supply. Additionally, knowledge of pollutant transport processes on relatively large scales is needed to identify effective management strategies for improving water resources of poor quality. The lower Amu Darya drainage basin close to the Aral Sea in Uzbekistan suffers from physical water scarcity and poor water quality. This is mainly due to the intensive agricultural production in the region, which requires extensive freshwater withdrawals and use of fertilizers and pesticides. In addition, recurrent droughts in the region affect the surface water availability. On average 20% of the population in rural areas in Uzbekistan lack access to improved drinking water sources, and the situation is even more severe in the lower Amu Darya basin. In this study, we consider health risks related to water-borne contaminants by dividing measured substance concentrations by health-risk-based guideline values from the World Health Organisation (WHO). In particular, we analyse novel results of water quality measurements performed in 2007 and 2008 in the Mejdurechye Reservoir (located in the downstream part of the Amu Darya river basin). We furthermore identify large-scale trends by comparing the Mejdurechye results to reported water quality results from a considerable stretch of the Amu Darya river basin, including drainage water, river water and groundwater. The results show that concentrations of cadmium and nitrite exceed the WHO health-risk based guideline values in Mejdurechye Reservoir. Furthermore, concentrations of the long-banned and highly toxic pesticides dichlorodiphenyltrichloroethane (DDT) and γ-hexachlorocyclohexane (γ-HCH) were detected in
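The screening approach described, dividing a measured concentration by the corresponding WHO guideline value and flagging quotients above 1, can be sketched as follows. The sample concentrations below are made-up illustrative numbers, and the guideline values should be checked against the current WHO Guidelines for Drinking-water Quality:

```python
# Health-based drinking-water guideline values in mg/L
# (illustrative; verify against the current WHO edition).
GUIDELINES_MG_L = {"cadmium": 0.003, "nitrite": 3.0}

def hazard_quotients(measured_mg_l):
    """Return concentration/guideline ratios for known substances;
    a quotient > 1 flags an exceedance of the guideline value."""
    return {s: c / GUIDELINES_MG_L[s]
            for s, c in measured_mg_l.items() if s in GUIDELINES_MG_L}

# Hypothetical reservoir sample (not data from the study).
sample = {"cadmium": 0.006, "nitrite": 4.5}
for substance, hq in hazard_quotients(sample).items():
    flag = "EXCEEDS guideline" if hq > 1.0 else "within guideline"
    print(f"{substance}: HQ = {hq:.2f} ({flag})")
```

The quotient is dimensionless, so measurements and guidelines must share units before dividing.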

  5. Health Occupations Survey.

    ERIC Educational Resources Information Center

    Willett, Lynn H.

    A survey was conducted to determine the need for health occupations personnel in the Moraine Valley Community College district, specifically to: (1) describe present employment for selected health occupations; (2) project health occupation employment to 1974; (3) identify the supply of applicants for the selected occupations; and (4) identify…

  6. Seismic texture and amplitude analysis of large scale fluid escape pipes using time lapses seismic surveys: examples from the Loyal Field (Scotland, UK)

    NASA Astrophysics Data System (ADS)

    Maestrelli, Daniele; Jihad, Ali; Iacopini, David; Bond, Clare

    2016-04-01

    ) affected by large-scale fractures (semblance image), and seem consistent with a suspended, non-fluidized mud/sand mixture flow. Near-Middle-Far offset amplitude analysis confirms that most of the amplitude anomalies within the pipe conduits and termini are only partly related to gas. An interpretation of the observed textures is proposed, with a discussion of the noise and artefacts induced by resolution and migration problems. Possible formation mechanisms for these pipes are discussed.

  7. Large scale traffic simulations

    SciTech Connect

    Nagel, K.; Barrett, C.L. |; Rickert, M. |

    1997-04-01

    Large scale microscopic (i.e. vehicle-based) traffic simulations pose high demands on computational speed in at least two application areas: (i) real-time traffic forecasting, and (ii) long-term planning applications (where repeated "looping" between the microsimulation and the simulated planning of individual persons' behavior is necessary). As a rough number, a real-time simulation of an area such as Los Angeles (ca. 1 million travellers) will need a computational speed of much higher than 1 million "particle" (= vehicle) updates per second. This paper reviews how this problem is approached in different projects and how these approaches depend both on the specific questions and on the prospective user community. The approaches range from highly parallel and vectorizable single-bit implementations on parallel supercomputers for Statistical Physics questions, via more realistic implementations on coupled workstations, to more complicated driving dynamics implemented again on parallel supercomputers. 45 refs., 9 figs., 1 tab.

  8. Large scale tracking algorithms

    SciTech Connect

    Hansen, Ross L.; Love, Joshua Alan; Melgaard, David Kennett; Karelitz, David B.; Pitts, Todd Alan; Zollweg, Joshua David; Anderson, Dylan Z.; Nandy, Prabal; Whitlow, Gary L.; Bender, Daniel A.; Byrne, Raymond Harry

    2015-01-01

    Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination but will unfortunately also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied towards detection algorithms also changes. For low-resolution sensors, "blob" tracking is the norm. For higher-resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.

  9. The large-scale distribution of galaxies

    NASA Technical Reports Server (NTRS)

    Geller, Margaret J.

    1989-01-01

    The spatial distribution of galaxies in the universe is characterized on the basis of the six completed strips of the Harvard-Smithsonian Center for Astrophysics redshift-survey extension. The design of the survey is briefly reviewed, and the results are presented graphically. Vast low-density voids similar to the void in Bootes are found, almost completely surrounded by thin sheets of galaxies. Also discussed are the implications of the results for the survey sampling problem, the two-point correlation function of the galaxy distribution, the possibility of detecting large-scale coherent flows, theoretical models of large-scale structure, and the identification of groups and clusters of galaxies.

  10. The Personal Health Survey

    ERIC Educational Resources Information Center

    Thorne, Frederick C.

    1978-01-01

    The Personal Health Survey (PHS) is a 200-item inventory designed to sample symptomatology as subjective experiences from the 12 principal domains of organ system and psychophysiological functioning. This study investigates the factorial validity of the empirically constructed scales. (Author)

  11. Prevalence of Trachoma in Unity State, South Sudan: Results from a Large-Scale Population-Based Survey and Potential Implications for Further Surveys

    PubMed Central

    Edwards, Tansy; Smith, Jennifer; Sturrock, Hugh J. W.; Kur, Lucia W.; Sabasio, Anthony; Finn, Timothy P.; Lado, Mounir; Haddad, Danny; Kolaczinski, Jan H.

    2012-01-01

    Background Large parts of South Sudan are thought to be trachoma-endemic but baseline data are limited. This study aimed to estimate prevalence for planning trachoma interventions in Unity State, to identify risk factors and to investigate the effect of different sampling approaches on study conclusions. Methods and Findings The survey area was defined as one domain of eight counties in Unity State. Across the area, 40 clusters (villages) were randomly selected proportional to the county population size in a population-based prevalence survey. The simplified grading scheme was used to classify clinical signs of trachoma. The unadjusted prevalence of trachoma inflammation-follicular (TF) in children aged 1–9 years was 70.5% (95% CI: 68.6–72.3). After adjusting for age, sex, county and clustering of cases at household and village level the prevalence was 71.0% (95% CI: 69.9–72.1). The prevalence of trachomatous trichiasis (TT) in adults was 15.1% (95% CI: 13.4–17.0) and 13.5% (95% CI: 12.0–15.1) before and after adjustment, respectively. We estimate that 700,000 people (the entire population of Unity State) require antibiotic treatment and approximately 54,178 people require TT surgery. Risk factor analyses confirmed child-level associations with TF and highlighted that older adults living in poverty are at higher risk of TT. Conditional simulations, testing the alternatives of sampling 20 or 60 villages over the same area, indicated that sampling of only 20 villages would have provided an acceptable level of precision for state-level prevalence estimation to inform intervention decisions in this hyperendemic setting. Conclusion Trachoma poses an enormous burden on the population of Unity State. Comprehensive control is urgently required to avoid preventable blindness and should be initiated across the state now. In other parts of South Sudan suspected to be highly trachoma endemic, counties should be combined into larger survey areas to generate the

  12. Large-scale survey of rates of achieving targets for blood glucose, blood pressure, and lipids and prevalence of complications in type 2 diabetes (JDDM 40)

    PubMed Central

    Yokoyama, Hiroki; Oishi, Mariko; Takamura, Hiroshi; Yamasaki, Katsuya; Shirabe, Shin-ichiro; Uchida, Daigaku; Sugimoto, Hidekatsu; Kurihara, Yoshio; Araki, Shin-ichi; Maegawa, Hiroshi

    2016-01-01

    Objective The growing population with type 2 diabetes mellitus and increasing patient bodyweight, alongside improving diabetes care, make it important to explore up-to-date rates of achieving treatment targets and the prevalence of complications. We investigated the prevalence of microvascular/macrovascular complications and rates of achieving treatment targets through a large-scale multicenter-based cohort. Research design and methods A cross-sectional nationwide survey was performed on 9956 subjects with type 2 diabetes mellitus who consecutively attended primary care clinics. The prevalence of nephropathy, retinopathy, neuropathy, and macrovascular complications and the rates of achieving targets of glycated hemoglobin (HbA1c) <7.0%, blood pressure <130/80 mm Hg, and lipids of low-density/high-density lipoprotein cholesterol <3.1/≥1.0 mmol/L and non-high-density lipoprotein cholesterol <3.8 mmol/L were investigated. Results The rates of achieving targets for HbA1c, blood pressure, and lipids were 52.9%, 46.8% and 65.5%, respectively. The prevalence of microvascular complications was ∼28% each, with 6.4% of subjects having all three microvascular complications, while the prevalence of macrovascular complications was 12.6%. With increasing duration of diabetes, the rate of achieving the HbA1c target decreased and the prevalence of each complication increased despite increased use of diabetes medication. The prevalence of each complication decreased with the number of the 3 treatment targets achieved and was lower in subjects without macrovascular complications than in those with them. Adjustment for considerable covariates showed that the complications were closely inter-related and that achievement of each target was significantly associated with being free of each complication. Conclusions Almost half of the subjects examined did not meet the recommended targets. The risk of each complication was significantly affected by 1 on-target treatment (inversely) and the

  13. Self-Assessments or Tests? Comparing Cross-National Differences in Patterns and Outcomes of Graduates' Skills Based on International Large-Scale Surveys

    ERIC Educational Resources Information Center

    Humburg, Martin; van der Velden, Rolf

    2015-01-01

    In this paper an analysis is carried out whether objective tests and subjective self-assessments in international large-scale studies yield similar results when looking at cross-national differences in the effects of skills on earnings, and skills patterns across countries, fields of study and gender. The findings indicate that subjective skills…

  14. A Numeric Scorecard Assessing the Mental Health Preparedness for Large-Scale Crises at College and University Campuses: A Delphi Study

    ERIC Educational Resources Information Center

    Burgin, Rick A.

    2012-01-01

Large-scale crises continue to surprise, overwhelm, and shatter college and university campuses. While the devastation to physical plants and persons is often evident and is addressed in crisis management plans, the number of emotional casualties left in the wake of these large-scale crises may not be apparent and is often not addressed with…

  15. A large-scale CO survey of the Rosette Molecular Cloud: assessing the effects of O stars on surrounding molecular gas

    NASA Astrophysics Data System (ADS)

    Dent, W. R. F.; Hovey, G. J.; Dewdney, P. E.; Burgess, T. A.; Willis, A. G.; Lightfoot, J. F.; Jenness, T.; Leech, J.; Matthews, H. E.; Heyer, M.; Poulton, C. J.

    2009-06-01

We present a new large-scale survey of the J = 3-2 12CO emission covering 4.8 deg^2 around the Rosette Nebula. The results reveal the complex dynamics of the molecular gas in this region. We identify about 2000 compact gas clumps having a mass distribution given by dN/dM ~ M^-1.8, with no dependence of the power-law index on distance from the central O stars. A detailed study of a number of the clumps in the inner region shows that most exhibit velocity gradients in the range 1-3 km s^-1 pc^-1, generally directed away from the exciting nebula. The magnitude of the velocity gradient decreases with distance from the central O stars, and we compare the apparent clump acceleration with a photoionized gas acceleration model. For most clumps outside the central nebula, the model predicts lifetimes of a few 10^5 yr. In one of the most extended of these clumps, however, a near-constant velocity gradient can be measured over 1.7 pc, which is difficult to explain with radiatively driven models of clump acceleration. As well as the individual accelerated clumps, an unresolved limb-brightened rim lies at the interface between the central nebular cavity and the Rosette Molecular Cloud. Extending over 4 pc along the edge of the nebula, this region is thought to be at an earlier phase of disruption than the accelerating compact globules. Blueshifted gas clumps around the nebula are in all cases associated with dark absorbing optical globules, indicating that this material lies in front of the nebula and has been accelerated towards us. Redshifted gas shows little evidence of associated line-of-sight dark clouds, indicating that the dominant bulk molecular gas motion throughout the region is expansion away from the O stars. In addition, we find evidence that many of the clumps lie in a molecular ring with an expansion velocity of 30 km s^-1 and a radius of 11 pc. The dynamical time-scale derived for this structure (~10^6 yr) is similar to the age of the nebula as a whole (2 × 10^6 yr). The J = 3
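The clump mass spectrum quoted above (dN/dM ~ M^-1.8) is the kind of power law that can be estimated directly from a clump catalogue. A minimal sketch using synthetic masses and the standard maximum-likelihood estimator for a continuous power law (an illustration, not the authors' actual fitting procedure):

```python
import math
import random

def ml_powerlaw_index(masses, m_min):
    """Maximum-likelihood estimate of alpha for dN/dM ~ M^-alpha,
    the standard continuous power-law estimator."""
    logs = [math.log(m / m_min) for m in masses if m >= m_min]
    return 1.0 + len(logs) / sum(logs)

# Synthetic clump masses drawn from dN/dM ~ M^-1.8 via inverse-CDF sampling
random.seed(42)
alpha_true, m_min = 1.8, 1.0
masses = [m_min * (1.0 - random.random()) ** (-1.0 / (alpha_true - 1.0))
          for _ in range(2000)]

alpha_hat = ml_powerlaw_index(masses, m_min)
print(f"recovered index: {alpha_hat:.2f}")
```

The inverse-CDF draw reproduces a dN/dM ~ M^-1.8 spectrum above m_min, and with a couple of thousand clumps, comparable to the ~2000 identified in the survey, the estimator recovers an index near 1.8.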

  16. Virtual reality for health care: a survey.

    PubMed

    Moline, J

    1997-01-01

    This report surveys the state of the art in applications of virtual environments and related technologies for health care. Applications of these technologies are being developed for health care in the following areas: surgical procedures (remote surgery or telepresence, augmented or enhanced surgery, and planning and simulation of procedures before surgery); medical therapy; preventive medicine and patient education; medical education and training; visualization of massive medical databases; skill enhancement and rehabilitation; and architectural design for health-care facilities. To date, such applications have improved the quality of health care, and in the future they will result in substantial cost savings. Tools that respond to the needs of present virtual environment systems are being refined or developed. However, additional large-scale research is necessary in the following areas: user studies, use of robots for telepresence procedures, enhanced system reality, and improved system functionality.

  17. Collecting reliable information about violence against women safely in household interviews: experience from a large-scale national survey in South Asia.

    PubMed

    Andersson, Neil; Cockcroft, Anne; Ansari, Noor; Omer, Khalid; Chaudhry, Ubaid Ullah; Khan, Amir; Pearson, Luwei

    2009-04-01

    This article describes the first national survey of violence against women in Pakistan from 2001 to 2004 covering 23,430 women. The survey took account of methodological and ethical recommendations, ensuring privacy of interviews through one person interviewing the mother-in-law while another interviewed the eligible woman privately. The training module for interviewers focused on empathy with respondents, notably increasing disclosure rates. Only 3% of women declined to participate, and 1% were not permitted to participate. Among women who disclosed physical violence, only one third had previously told anyone. Surveys of violence against women in Pakistan not using methods to minimize underreporting could seriously underestimate prevalence.

  18. Large-scale structural optimization

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.

    1983-01-01

Problems encountered by aerospace designers in attempting to optimize whole aircraft are discussed, along with possible solutions. Large-scale optimization, as opposed to component-by-component optimization, is hindered by computational costs, software inflexibility, concentration on a single design methodology rather than trade-offs, and the incompatibility of large-scale optimization with single-program, single-computer methods. The software problem can be approached by placing the full analysis outside the optimization loop and performing it only periodically. Problem-dependent software, which embodies the definitions of design variables, objective function and design constraints, can be separated from the generic code using a systems programming technique. Trade-off algorithms can be used at the design points to obtain quantitative answers. Finally, decomposing the large-scale problem into independent subproblems allows systematic optimization of the problems by an organization of people and machines.

  19. Is cost-related non-collection of prescriptions associated with a reduction in health? Findings from a large-scale longitudinal study of New Zealand adults

    PubMed Central

    Jatrana, Santosh; Richardson, Ken; Norris, Pauline; Crampton, Peter

    2015-01-01

    Objective To investigate whether cost-related non-collection of prescription medication is associated with a decline in health. Settings New Zealand Survey of Family, Income and Employment (SoFIE)-Health. Participants Data from 17 363 participants with at least two observations in three waves (2004–2005, 2006–2007, 2008–2009) of a panel study were analysed using fixed effects regression modelling. Primary outcome measures Self-rated health (SRH), physical health (PCS) and mental health scores (MCS) were the health measures used in this study. Results After adjusting for time-varying confounders, non-collection of prescription items was associated with a 0.11 (95% CI 0.07 to 0.15) unit worsening in SRH, a 1.00 (95% CI 0.61 to 1.40) unit decline in PCS and a 1.69 (95% CI 1.19 to 2.18) unit decline in MCS. The interaction of the main exposure with gender was significant for SRH and MCS. Non-collection of prescription items was associated with a decline in SRH of 0.18 (95% CI 0.11 to 0.25) units for males and 0.08 (95% CI 0.03 to 0.13) units for females, and a decrease in MCS of 2.55 (95% CI 1.67 to 3.42) and 1.29 (95% CI 0.70 to 1.89) units for males and females, respectively. The interaction of the main exposure with age was significant for SRH. For respondents aged 15–24 and 25–64 years, non-collection of prescription items was associated with a decline in SRH of 0.12 (95% CI 0.03 to 0.21) and 0.12 (95% CI 0.07 to 0.17) units, respectively, but for respondents aged 65 years and over, non-collection of prescription items had no significant effect on SRH. Conclusion Our results show that those who do not collect prescription medications because of cost have an increased risk of a subsequent decline in health. PMID:26553826
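The fixed-effects (within) regression used in panel analyses like this one can be sketched in a few lines: each person's outcome and exposure are demeaned across waves, so stable individual characteristics drop out and only within-person change identifies the effect. A minimal illustration with hypothetical data (not SoFIE-Health values):

```python
def fixed_effects_slope(panel):
    """Within estimator: regress person-demeaned outcome on
    person-demeaned exposure across all waves.
    panel: dict person_id -> list of (exposure, outcome) waves."""
    num = den = 0.0
    for waves in panel.values():
        xbar = sum(x for x, _ in waves) / len(waves)
        ybar = sum(y for _, y in waves) / len(waves)
        for x, y in waves:
            num += (x - xbar) * (y - ybar)
            den += (x - xbar) ** 2
    return num / den

# Hypothetical two-wave panel: exposure = 1 if a prescription went
# uncollected that wave, outcome = a health score.
panel = {
    1: [(0, 50.0), (1, 48.9)],  # score falls 1.1 in the non-collection wave
    2: [(0, 45.0), (1, 44.1)],  # score falls 0.9
    3: [(0, 55.0), (0, 55.2)],  # never exposed: contributes nothing to the slope
}
print(fixed_effects_slope(panel))
```

Person 3's level difference from the others never enters the estimate, which is the point of the design: time-invariant confounders cancel, and the slope averages the within-person changes of the exposed.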

  20. What Sort of Girl Wants to Study Physics after the Age of 16? Findings from a Large-Scale UK Survey

    ERIC Educational Resources Information Center

    Mujtaba, Tamjid; Reiss, Michael J.

    2013-01-01

    This paper investigates the characteristics of 15-year-old girls who express an intention to study physics post-16. This paper unpacks issues around within-girl group differences and similarities between boys and girls in survey responses about physics. The analysis is based on the year 10 (age 15 years) responses of 5,034 students from 137 UK…

  1. A Short Survey on the State of the Art in Architectures and Platforms for Large Scale Data Analysis and Knowledge Discovery from Data

    SciTech Connect

    Begoli, Edmon

    2012-01-01

Intended as a survey for practicing architects and researchers seeking an overview of the state-of-the-art architectures for data analysis, this paper provides an overview of the emerging data management and analytic platforms, including parallel databases, Hadoop-based systems, High Performance Computing (HPC) platforms, and platforms popularly referred to as NoSQL platforms. Platforms are presented based on their relevance, the analyses they support, and their data organization models.

  2. Large-Scale Aerosol Modeling and Analysis

    DTIC Science & Technology

    2008-09-30

    aerosol species up to six days in advance anywhere on the globe. NAAPS and COAMPS are particularly useful for forecasts of dust storms in areas...impact cloud processes globally. With increasing dust storms due to climate change and land use changes in desert regions, the impact of the...bacteria in large-scale dust storms is expected to significantly impact warm ice cloud formation, human health, and ecosystems globally. In Niemi et al

  3. The Cosmic Large-Scale Structure in X-rays (CLASSIX) Cluster Survey. I. Probing galaxy cluster magnetic fields with line of sight rotation measures

    NASA Astrophysics Data System (ADS)

    Böhringer, Hans; Chon, Gayoung; Kronberg, Philipp P.

    2016-11-01

To search for a signature of an intracluster magnetic field, we compare measurements of Faraday rotation of polarised extragalactic radio sources in the lines of sight of galaxy clusters with those outside. To this end, we correlated a catalogue of 1383 rotation measures of extragalactic polarised radio sources with galaxy clusters from the CLASSIX survey (combining REFLEX II and NORAS II) detected by their X-ray emission in the ROSAT All-Sky Survey. The survey covers 8.25 ster of the sky at |bII| ≥ 20°. We compared the rotation measures in the lines of sight of clusters within their projected radii of r500 with those outside and found a significant excess of the dispersion of the rotation measures in the cluster regions. Since the observed rotation measure is the result of Faraday rotation in several presumably uncorrelated magnetised cells of the intracluster medium, the observations correspond to quantities averaged over several magnetic field directions and strengths. Therefore the interesting quantity is the dispersion or standard deviation of the rotation measure for an ensemble of clusters. In the analysis of the observations we found a standard deviation of the rotation measure inside r500 of about 120 (± 21) rad m^-2, compared to about 56 (± 8) rad m^-2 outside. Correcting for the effect of the Galaxy with the mean rotation measure in a region of 10 deg radius in the outskirts of the clusters does not change this outcome. We show that the most X-ray luminous, and thus most massive, clusters contribute most to the observed excess rotation measure. Modelling the electron density distribution in the intracluster medium with a self-similar model based on the REXCESS Survey, we found that the dispersion of the rotation measure increases with the column density, and we deduce a magnetic field value of about 2-6 (l/10 kpc)^-1/2 μG assuming a constant magnetic field strength, where l is the size of the coherently magnetised intracluster medium
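The step from an excess RM dispersion to a field strength can be illustrated with the usual random-walk model: for uncorrelated cells of size l along a path L, sigma_RM ≈ 0.812 n_e B sqrt(l L) (n_e in cm^-3, B in μG, lengths in pc), which inverts to give B from the measured dispersions. A rough sketch with illustrative values, where the electron density and path length are assumptions for the example, not the paper's actual model:

```python
import math

# Random-walk estimate of a cluster magnetic field from the excess
# rotation-measure dispersion:
#   sigma_RM ~= 0.812 * n_e[cm^-3] * B[uG] * sqrt(l[pc] * L[pc])
# so B = sqrt(sigma_in^2 - sigma_out^2) / (0.812 * n_e * sqrt(l * L)).
# n_e, cell size and path length below are illustrative assumptions.

def b_field_from_rm(sigma_in, sigma_out, n_e, cell_pc, path_pc):
    excess = math.sqrt(sigma_in**2 - sigma_out**2)  # rad m^-2
    return excess / (0.812 * n_e * math.sqrt(cell_pc * path_pc))

B = b_field_from_rm(sigma_in=120.0, sigma_out=56.0,
                    n_e=1e-3,           # electron density, cm^-3 (assumed)
                    cell_pc=10_000.0,   # 10 kpc coherence cells (assumed)
                    path_pc=500_000.0)  # ~0.5 Mpc path length (assumed)
print(f"B ~ {B:.1f} uG")
```

With these assumed values the excess dispersion sqrt(120^2 - 56^2) ≈ 106 rad m^-2 maps to a field of order a few μG, the same scale as the 2-6 (l/10 kpc)^-1/2 μG quoted in the abstract.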

  4. A 1.85-m mm-submm Telescope for Large-Scale Molecular Gas Surveys in 12CO, 13CO, and C18O (J = 2-1)

    NASA Astrophysics Data System (ADS)

    Onishi, Toshikazu; Nishimura, Atsushi; Ota, Yuya; Hashizume, Akio; Kojima, Yoshiharu; Minami, Akihito; Tokuda, Kazuki; Touga, Shiori; Abe, Yasuhiro; Kaiden, Masahiro; Kimura, Kimihiro; Muraoka, Kazuyuki; Maezawa, Hiroyuki; Ogawa, Hideo; Dobashi, Kazuhito; Shimoikura, Tomomi; Yonekura, Yoshinori; Asayama, Shin'ichiro; Handa, Toshihiro; Nakajima, Taku; Noguchi, Takashi; Kuno, Nario

    2013-08-01

We have developed a new mm-submm telescope with a diameter of 1.85 m, installed at the Nobeyama Radio Observatory. The scientific goal is to precisely reveal the physical properties of molecular clouds in the Milky Way Galaxy by obtaining the large-scale distribution of molecular gas, which can also be compared with large-scale observations at various wavelengths. The target frequency is ~230 GHz; simultaneous observations of the J = 2-1 rotational lines of three carbon monoxide isotopes (12CO, 13CO, C18O) are achieved with a beam size (HPBW) of 2.7'. In order to accomplish the simultaneous observations, we have developed waveguide-type sideband-separating SIS mixers to obtain spectra separately in the upper and lower sidebands. A Fourier digital spectrometer with a 1 GHz bandwidth and 16384 channels is installed, and the bandwidth of the spectrometer is divided into three parts, one for each of the three spectra; the IF system has been designed to inject these three lines into the spectrometer. A flexible observation system was created mainly in Python on Linux PCs, enabling effective OTF (On-The-Fly) scans for large-area mapping. The telescope is enclosed in a membrane-covered radome to prevent the harmful effects of sunlight, strong wind, and precipitation, in order to minimize errors in telescope pointing and to stabilize the receiver and the IF devices. In 2011 November we started science operation, resulting in large-scale surveys of the Orion A/B clouds, Cygnus OB7, the Galactic Plane, Taurus, and so on. We have also updated the receiver system for dual-polarization observations.

  5. What Sort of Girl Wants to Study Physics After the Age of 16? Findings from a Large-scale UK Survey

    NASA Astrophysics Data System (ADS)

    Mujtaba, Tamjid; Reiss, Michael J.

    2013-11-01

This paper investigates the characteristics of 15-year-old girls who express an intention to study physics post-16. This paper unpacks issues around within-girl group differences and similarities between boys and girls in survey responses about physics. The analysis is based on the year 10 (age 15 years) responses of 5,034 students from 137 UK schools as learners of physics during the academic year 2008-2009. A comparison between boys and girls indicates the pervasiveness of gender issues, with boys more likely to respond positively towards physics-specific constructs than girls. The analysis also indicates that girls and boys who expressed intentions to participate in physics post-16 gave similar responses towards their physics teachers and physics lessons and had comparable physics extrinsic motivation. Girls (regardless of their intention to participate in physics) were less likely than boys to be encouraged to study physics post-16 by teachers, family and friends. Despite this, there was a subset of girls still intending to study physics post-16. The crucial difference between the girls who intended to study physics post-16 and those who did not is that girls who intend to study physics post-16 had higher physics extrinsic motivation, more positive perceptions of physics teachers and lessons, greater competitiveness and a tendency to be less extrovert. This strongly suggests that higher extrinsic motivation in physics could be the crucial underlying key that encourages a subset of girls (as well as boys) in wanting to pursue physics post-16.

  6. IRAM 30 m Large Scale Survey of 12CO(2-1) and 13CO(2-1) Emission in the Orion Molecular Cloud

    NASA Astrophysics Data System (ADS)

    Berné, O.; Marcelino, N.; Cernicharo, J.

    2014-11-01

Using the IRAM 30 m telescope, we have surveyed a 1° × 0.8° part of the Orion molecular cloud in the 12CO and 13CO (2-1) lines with a maximal spatial resolution of ~11'' and spectral resolution of ~0.4 km s-1. The cloud appears filamentary, clumpy, and with a complex kinematical structure. We derive an estimated mass of the cloud of 7700 M⊙ (half of which is found in regions with visual extinctions AV below ~10) and a dynamical age for the nebula of the order of 0.2 Myr. The energy balance suggests that magnetic fields play an important role in supporting the cloud, at large and small scales. According to our analysis, the turbulent kinetic energy in the molecular gas due to outflows is comparable to turbulent kinetic energy resulting from the interaction of the cloud with the H II region. This latter feedback appears negative, i.e., the triggering of star formation by the H II region is inefficient in Orion. The reduced data as well as additional products such as the column density map are made available online (http://userpages.irap.omp.eu/~oberne/Olivier_Berne/Data).

  7. Modified gravity and large scale flows, a review

    NASA Astrophysics Data System (ADS)

    Mould, Jeremy

    2017-02-01

    Large scale flows have been a challenging feature of cosmography ever since galaxy scaling relations came on the scene 40 years ago. The next generation of surveys will offer a serious test of the standard cosmology.

  8. Condition Monitoring of Large-Scale Facilities

    NASA Technical Reports Server (NTRS)

    Hall, David L.

    1999-01-01

    This document provides a summary of the research conducted for the NASA Ames Research Center under grant NAG2-1182 (Condition-Based Monitoring of Large-Scale Facilities). The information includes copies of view graphs presented at NASA Ames in the final Workshop (held during December of 1998), as well as a copy of a technical report provided to the COTR (Dr. Anne Patterson-Hine) subsequent to the workshop. The material describes the experimental design, collection of data, and analysis results associated with monitoring the health of large-scale facilities. In addition to this material, a copy of the Pennsylvania State University Applied Research Laboratory data fusion visual programming tool kit was also provided to NASA Ames researchers.

  9. A large-scale survey of the novel 15q24 microdeletion syndrome in autism spectrum disorders identifies an atypical deletion that narrows the critical region

    PubMed Central

    2010-01-01

Background The 15q24 microdeletion syndrome has been recently described as a recurrent, submicroscopic genomic imbalance found in individuals with intellectual disability, typical facial appearance, hypotonia, and digital and genital abnormalities. Gene dosage abnormalities, including copy number variations (CNVs), have been identified in a significant fraction of individuals with autism spectrum disorders (ASDs). In this study we surveyed two ASD cohorts for 15q24 abnormalities to assess the frequency of genomic imbalances in this interval. Methods We screened 173 unrelated subjects with ASD from the Central Valley of Costa Rica and 1336 subjects with ASD from 785 independent families registered with the Autism Genetic Resource Exchange (AGRE) for CNVs across 15q24 using oligonucleotide arrays. Rearrangements were confirmed by array comparative genomic hybridization and quantitative PCR. Results Among the patients from Costa Rica, an atypical de novo deletion of 3.06 Mb in 15q23-q24.1 was detected in a boy with autism sharing many features with the other 13 subjects with the 15q24 microdeletion syndrome described to date. He exhibited intellectual disability, constant smiling, characteristic facial features (high anterior hairline, broad medial eyebrows, epicanthal folds, hypertelorism, full lower lip and protuberant, posteriorly rotated ears), single palmar crease, toe syndactyly and congenital nystagmus. The deletion breakpoints are atypical and lie outside previously characterized low copy repeats (69.838-72.897 Mb). Genotyping data revealed that the deletion had occurred in the paternal chromosome. Among the AGRE families, no large 15q24 deletions were observed. Conclusions From the current and previous studies, deletions in the 15q24 region represent rare causes of ASDs with an estimated frequency of 0.1 to 0.2% in individuals ascertained for ASDs, although the proportion might be higher in sporadic cases. These rates compare with a frequency of about 0.3% in

  10. An integrative structural health monitoring system for the local/global responses of a large-scale irregular building under construction.

    PubMed

    Park, Hyo Seon; Shin, Yunah; Choi, Se Woon; Kim, Yousok

    2013-07-15

In this study, a practical and integrative SHM system was developed and applied to a large-scale irregular building under construction, where many challenging issues exist. In the proposed sensor network, customized energy-efficient wireless sensing units (sensor nodes, repeater nodes, and master nodes) were employed and comprehensive communications from the sensor node to the remote monitoring server were conducted through wireless communications. The long-term (13-month) monitoring results recorded from a large number of sensors (75 vibrating wire strain gauges, 10 inclinometers, and three laser displacement sensors) indicated that the construction event exhibiting the largest influence on structural behavior was the removal of bents that were temporarily installed to support the free end of the cantilevered members during their construction. The safety of each member could be confirmed based on the quantitative evaluation of each response. Furthermore, it was also confirmed that the relation between these responses (i.e., deflection, strain, and inclination) can provide information about the global behavior of structures induced by specific events. Analysis of the measurement results demonstrates that the proposed sensor network system is capable of automatic and real-time monitoring and can be applied and utilized for both the safety evaluation and precise implementation of buildings under construction.

  11. An Integrative Structural Health Monitoring System for the Local/Global Responses of a Large-Scale Irregular Building under Construction

    PubMed Central

    Park, Hyo Seon; Shin, Yunah; Choi, Se Woon; Kim, Yousok

    2013-01-01

In this study, a practical and integrative SHM system was developed and applied to a large-scale irregular building under construction, where many challenging issues exist. In the proposed sensor network, customized energy-efficient wireless sensing units (sensor nodes, repeater nodes, and master nodes) were employed and comprehensive communications from the sensor node to the remote monitoring server were conducted through wireless communications. The long-term (13-month) monitoring results recorded from a large number of sensors (75 vibrating wire strain gauges, 10 inclinometers, and three laser displacement sensors) indicated that the construction event exhibiting the largest influence on structural behavior was the removal of bents that were temporarily installed to support the free end of the cantilevered members during their construction. The safety of each member could be confirmed based on the quantitative evaluation of each response. Furthermore, it was also confirmed that the relation between these responses (i.e., deflection, strain, and inclination) can provide information about the global behavior of structures induced by specific events. Analysis of the measurement results demonstrates that the proposed sensor network system is capable of automatic and real-time monitoring and can be applied and utilized for both the safety evaluation and precise implementation of buildings under construction. PMID:23860317

  12. Large-scale screening of nasal swabs for Bacillus anthracis: descriptive summary and discussion of the National Institutes of Health's experience.

    PubMed

    Kiratisin, Pattarachai; Fukuda, Caroline D; Wong, Alexandra; Stock, Frida; Preuss, Jeanne C; Ediger, Laura; Brahmbhatt, Trupti N; Fischer, Steven H; Fedorko, Daniel P; Witebsky, Frank G; Gill, Vee J

    2002-08-01

    In October 2001, a letter containing a large number of anthrax spores was sent through the Brentwood post office in Washington, D.C., to a United States Senate office on Capitol Hill, resulting in contamination in both places. Several thousand people who worked at these sites were screened for spore exposure by collecting nasal swab samples. We describe here a screening protocol which we, as a level A laboratory, used on very short notice to process a large number of specimens (3,936 swabs) in order to report preliminary results as quickly as possible. Six isolates from our screening met preliminary criteria for Bacillus anthracis identification and were referred for definitive testing. Although none of the isolates was later confirmed to be B. anthracis, we studied these isolates further to define their biochemical characteristics and 16S rRNA sequences. Four of the six isolates were identified as Bacillus megaterium, one was identified as Bacillus cereus, and one was an unidentifiable Bacillus sp. Our results suggest that large-scale nasal-swab screening for potential exposure to anthrax spores, particularly if not done immediately postexposure, may not be very effective for detecting B. anthracis but may detect a number of Bacillus spp. that are phenotypically very similar to B. anthracis.

  13. A large-scale field study examining effects of exposure to clothianidin seed-treated canola on honey bee colony health, development, and overwintering success

    PubMed Central

Cutler, G. Christopher; Scott-Dupree, Cynthia D.; Sultan, Maryam; McFarlane, Andrew D.; Brewer, Larry

    2014-01-01

    In summer 2012, we initiated a large-scale field experiment in southern Ontario, Canada, to determine whether exposure to clothianidin seed-treated canola (oil seed rape) has any adverse impacts on honey bees. Colonies were placed in clothianidin seed-treated or control canola fields during bloom, and thereafter were moved to an apiary with no surrounding crops grown from seeds treated with neonicotinoids. Colony weight gain, honey production, pest incidence, bee mortality, number of adults, and amount of sealed brood were assessed in each colony throughout summer and autumn. Samples of honey, beeswax, pollen, and nectar were regularly collected, and samples were analyzed for clothianidin residues. Several of these endpoints were also measured in spring 2013. Overall, colonies were vigorous during and after the exposure period, and we found no effects of exposure to clothianidin seed-treated canola on any endpoint measures. Bees foraged heavily on the test fields during peak bloom and residue analysis indicated that honey bees were exposed to low levels (0.5–2 ppb) of clothianidin in pollen. Low levels of clothianidin were detected in a few pollen samples collected toward the end of the bloom from control hives, illustrating the difficulty of conducting a perfectly controlled field study with free-ranging honey bees in agricultural landscapes. Overwintering success did not differ significantly between treatment and control hives, and was similar to overwintering colony loss rates reported for the winter of 2012–2013 for beekeepers in Ontario and Canada. Our results suggest that exposure to canola grown from seed treated with clothianidin poses low risk to honey bees. PMID:25374790

  14. A large-scale field study examining effects of exposure to clothianidin seed-treated canola on honey bee colony health, development, and overwintering success.

    PubMed

    Cutler, G Christopher; Scott-Dupree, Cynthia D; Sultan, Maryam; McFarlane, Andrew D; Brewer, Larry

    2014-01-01

    In summer 2012, we initiated a large-scale field experiment in southern Ontario, Canada, to determine whether exposure to clothianidin seed-treated canola (oil seed rape) has any adverse impacts on honey bees. Colonies were placed in clothianidin seed-treated or control canola fields during bloom, and thereafter were moved to an apiary with no surrounding crops grown from seeds treated with neonicotinoids. Colony weight gain, honey production, pest incidence, bee mortality, number of adults, and amount of sealed brood were assessed in each colony throughout summer and autumn. Samples of honey, beeswax, pollen, and nectar were regularly collected, and samples were analyzed for clothianidin residues. Several of these endpoints were also measured in spring 2013. Overall, colonies were vigorous during and after the exposure period, and we found no effects of exposure to clothianidin seed-treated canola on any endpoint measures. Bees foraged heavily on the test fields during peak bloom and residue analysis indicated that honey bees were exposed to low levels (0.5-2 ppb) of clothianidin in pollen. Low levels of clothianidin were detected in a few pollen samples collected toward the end of the bloom from control hives, illustrating the difficulty of conducting a perfectly controlled field study with free-ranging honey bees in agricultural landscapes. Overwintering success did not differ significantly between treatment and control hives, and was similar to overwintering colony loss rates reported for the winter of 2012-2013 for beekeepers in Ontario and Canada. Our results suggest that exposure to canola grown from seed treated with clothianidin poses low risk to honey bees.

  15. The Kaiser Permanente Northern California Adult Member Health Survey

    PubMed Central

    Gordon, Nancy; Lin, Teresa

    2016-01-01

    Introduction The Kaiser Permanente Northern California (KPNC) Member Health Survey (MHS) is used to describe sociodemographic and health-related characteristics of the adult membership of this large, integrated health care delivery system to monitor trends over time, identify health disparities, and conduct research. Objective To provide an overview of the KPNC MHS and share findings that illustrate how survey statistics and data have been and can be used for research and programmatic purposes. Methods The MHS is a large-scale, institutional review board-approved survey of English-speaking KPNC adult members. The confidential survey has been conducted by mail triennially starting in 1993 with independent age-sex and geographically stratified random samples, with an option for online completion starting in 2005. The full survey sample and survey data are linkable at the individual level to Health Plan and geocoded data. Respondents are assigned weighting factors for their survey year and additional weighting factors for analysis of pooled survey data. Results Statistics from the 1999, 2002, 2005, 2008, and 2011 surveys show trends in sociodemographic and health-related characteristics and access to the Internet and e-mail for the adult membership aged 25 to 79 years and for 6 age-sex subgroups. Pooled data from the 2008 and 2011 surveys show many significant differences in these characteristics across the 5 largest race/ethnic groups in KPNC (non-Hispanic whites, blacks, Latinos, Filipinos, and Chinese). Conclusion The KPNC MHS has yielded unique insights and provides an opportunity for researchers and public health organizations outside of KPNC to leverage our survey-generated statistics and collaborate on epidemiologic and health services research studies. PMID:27548806

  16. Large-scale PACS implementation.

    PubMed

    Carrino, J A; Unkel, P J; Miller, I D; Bowser, C L; Freckleton, M W; Johnson, T G

    1998-08-01

    The transition to filmless radiology is a much more formidable task than issuing the request for proposal to purchase a picture archiving and communications system (PACS). The Department of Defense and the Veterans Administration have been pioneers in the transformation of medical diagnostic imaging to the electronic environment. Many civilian sites are expected to implement large-scale PACS in the next five to ten years. This presentation will relate the empirical insights gleaned at our institution from a large-scale PACS implementation. Our PACS integration was introduced into a fully operational department (not a new hospital) in which work flow had to continue with minimal impact. Impediments to user acceptance will be addressed. The critical components of this enormous task will be discussed. The topics covered during this session will include issues such as phased implementation, DICOM (digital imaging and communications in medicine) standard-based interaction of devices, hospital information system (HIS)/radiology information system (RIS) interfaces, user approval, networking, workstation deployment and backup procedures. The presentation will make specific suggestions regarding the implementation team, operating instructions, quality control (QC), training and education. The concept of identifying key functional areas is relevant to transitioning the facility to be entirely on line. Special attention must be paid to specific functional areas such as the operating rooms and trauma rooms, where the clinical requirements may not match the PACS capabilities. The printing of films may be necessary in certain circumstances. The integration of teleradiology and remote clinics into a PACS is a salient topic with respect to the overall role of the radiologists providing rapid consultation. A Web-based server allows a clinician to review images and reports on a desktop (personal) computer and thus reduce the number of dedicated PACS review workstations. This session

  17. Large-Scale Sequence Comparison.

    PubMed

    Lal, Devi; Verma, Mansi

    2017-01-01

    There are millions of sequences deposited in genomic databases, and it is an important task to categorize them according to their structural and functional roles. Sequence comparison is a prerequisite for proper categorization of both DNA and protein sequences, and helps in assigning a putative or hypothetical structure and function to a given sequence. There are various methods available for comparing sequences, with alignment being the first and foremost, both for sequences with a small number of base pairs and for large-scale genome comparison. Various tools are available for performing pairwise comparison of large sequences. The best-known tools either perform global alignment or generate local alignments between the two sequences. In this chapter we first provide basic information regarding sequence comparison. This is followed by a description of the PAM and BLOSUM matrices that form the basis of sequence comparison. We also give a practical overview of currently available methods such as BLAST and FASTA, followed by a description and overview of tools available for genome comparison, including LAGAN, MUMmer, BLASTZ, and AVID.

  18. Large Scale Magnetostrictive Valve Actuator

    NASA Technical Reports Server (NTRS)

    Richard, James A.; Holleman, Elizabeth; Eddleman, David

    2008-01-01

    Marshall Space Flight Center's Valves, Actuators and Ducts Design and Development Branch developed a large-scale magnetostrictive valve actuator. The potential advantages of this technology are faster, more efficient valve actuators that consume less power, provide precise position control, and deliver higher flow rates than conventional solenoid valves. Magnetostrictive materials change dimensions when a magnetic field is applied; this property is referred to as magnetostriction. Magnetostriction is caused by the alignment of the magnetic domains in the material's crystalline structure with the applied magnetic field lines. Typically, the material changes shape by elongating in the axial direction and constricting in the radial direction, resulting in no net change in volume. All hardware and testing are complete. This paper will discuss the potential applications of the technology; give an overview of the as-built actuator design; describe problems that were uncovered during development testing; review test data and evaluate weaknesses of the design; and discuss areas for improvement for future work. This actuator holds promise as a low-power, high-load, proportionally controlled actuator for valves requiring 440 to 1500 newtons of load.

  19. Large scale cluster computing workshop

    SciTech Connect

    Dane Skow; Alan Silverman

    2002-12-23

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near-term projects within High Energy Physics and other computing communities will deploy clusters of thousands of processors to be used by hundreds to thousands of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes) and, by implication, to identify areas where some investment of money or effort is likely to be needed; (2) to compare and record experiences gained with such tools; (3) to produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP; and (4) to identify and connect groups with similar interests within HENP and the larger clustering community.

  20. Local gravity and large-scale structure

    NASA Technical Reports Server (NTRS)

    Juszkiewicz, Roman; Vittorio, Nicola; Wyse, Rosemary F. G.

    1990-01-01

    The magnitude and direction of the observed dipole anisotropy of the galaxy distribution can in principle constrain the amount of large-scale power present in the spectrum of primordial density fluctuations. This paper confronts the data, provided by a recent redshift survey of galaxies detected by the IRAS satellite, with the predictions of two cosmological models with very different levels of large-scale power: the biased Cold Dark Matter dominated model (CDM) and a baryon-dominated model (BDM) with isocurvature initial conditions. Model predictions are investigated for the Local Group peculiar velocity, v(R), induced by mass inhomogeneities distributed out to a given radius, R, for R less than about 10,000 km/s. Several convergence measures for v(R) are developed, which can become powerful cosmological tests when deep enough samples become available. For the present data sets, the CDM and BDM predictions are indistinguishable at the 2 sigma level and both are consistent with observations. A promising discriminant between cosmological models is the misalignment angle between v(R) and the apex of the dipole anisotropy of the microwave background.

  1. Large-Scale Information Systems

    SciTech Connect

    D. M. Nicol; H. R. Ammerlahn; M. E. Goldsby; M. M. Johnson; D. E. Rhodes; A. S. Yoshimura

    2000-12-01

    Large enterprises are ever more dependent on their Large-Scale Information Systems (LSIS), computer systems that are distinguished architecturally by distributed components: data sources, networks, computing engines, simulations, human-in-the-loop control and remote access stations. These systems provide such capabilities as workflow, data fusion and distributed database access. The Nuclear Weapons Complex (NWC) contains many examples of LSIS components, a fact that motivates this research. However, most LSIS in use grew up from collections of separate subsystems that were not designed to be components of an integrated system. For this reason, they are often difficult to analyze and control. The problem is made more difficult by the size of a typical system, its diversity of information sources, and the institutional complexities associated with its geographic distribution across the enterprise. Moreover, there is no integrated approach for analyzing or managing such systems. Indeed, integrated development of LSIS is an active area of academic research. This work developed such an approach by simulating the various components of the LSIS and allowing the simulated components to interact with real LSIS subsystems. This research demonstrated two benefits. First, applying it to a particular LSIS provided a thorough understanding of the interfaces between the system's components. Second, it demonstrated how more rapid and detailed answers could be obtained to questions significant to the enterprise by interacting with the relevant LSIS subsystems through simulated components designed with those questions in mind. In a final, added phase of the project, investigations were made on extending this research to wireless communication networks in support of telemetry applications.

  2. Voids in the Large-Scale Structure

    NASA Astrophysics Data System (ADS)

    El-Ad, Hagai; Piran, Tsvi

    1997-12-01

    Voids are the most prominent feature of the large-scale structure of the universe. Still, their incorporation into quantitative analysis has been relatively recent, owing essentially to the lack of an objective tool to identify the voids and to quantify them. To overcome this, we present here the VOID FINDER algorithm, a novel tool for objectively quantifying voids in the galaxy distribution. The algorithm first classifies galaxies as either wall galaxies or field galaxies. Then, it identifies voids in the wall-galaxy distribution. Voids are defined as continuous volumes that do not contain any wall galaxies. The voids must be thicker than an adjustable limit, which is refined in successive iterations. In this way, we identify the same regions that would be recognized as voids by eye. Small breaches in the walls are ignored, avoiding artificial connections between neighboring voids. We test the algorithm using Voronoi tessellations. By appropriate scaling of the parameters with the selection function, we apply it to two redshift surveys, the dense SSRS2 and the full-sky IRAS 1.2 Jy. Both surveys show similar properties: ~50% of the volume is filled by voids. The voids have a scale of at least 40 h-1 Mpc and an average underdensity of -0.9. Faint galaxies do not fill the voids, but they do populate them more than bright ones. These results suggest that both optically and IRAS-selected galaxies delineate the same large-scale structure. Comparison with the recovered mass distribution further suggests that the observed voids in the galaxy distribution correspond well to underdense regions in the mass distribution. This confirms the gravitational origin of the voids.
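    The wall/field classification and void identification steps described in this abstract can be sketched in a much-simplified, two-dimensional toy version. The real VOID FINDER works on three-dimensional redshift surveys with parameters scaled by the selection function; all names, thresholds, and the grid-based flood fill below are our illustrative assumptions, not the published implementation.

    ```python
    from collections import deque

    def find_voids(galaxies, grid_n=10, box=100.0, link_r=15.0, k=2, min_cells=4):
        """Toy 2D sketch of the VOID FINDER idea (illustrative parameters):
        1. wall/field split: a galaxy is a 'wall' galaxy if at least k other
           galaxies lie within link_r of it; otherwise it is a 'field' galaxy.
        2. voids: connected empty regions of the wall-galaxy distribution,
           found here by flood-filling grid cells containing no wall galaxy;
           regions smaller than min_cells (small breaches) are ignored.
        """
        def dist2(a, b):
            return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

        wall = [g for g in galaxies
                if sum(dist2(g, h) <= link_r ** 2 for h in galaxies if h != g) >= k]

        cell = box / grid_n
        occupied = {(int(x // cell), int(y // cell)) for x, y in wall}

        voids, seen = [], set()
        for start in ((i, j) for i in range(grid_n) for j in range(grid_n)):
            if start in occupied or start in seen:
                continue
            region, queue = [], deque([start])
            seen.add(start)
            while queue:  # flood-fill one connected empty region
                ci, cj = queue.popleft()
                region.append((ci, cj))
                for nb in ((ci + 1, cj), (ci - 1, cj), (ci, cj + 1), (ci, cj - 1)):
                    if (0 <= nb[0] < grid_n and 0 <= nb[1] < grid_n
                            and nb not in occupied and nb not in seen):
                        seen.add(nb)
                        queue.append(nb)
            if len(region) >= min_cells:  # ignore small breaches in the walls
                voids.append(region)
        return wall, voids

    # A dense 'wall' of galaxies along y = 10 plus one isolated field galaxy:
    gals = [(float(x), 10.0) for x in range(0, 100, 5)] + [(50.0, 80.0)]
    wall, voids = find_voids(gals)
    ```

    As in the abstract, only wall galaxies define voids: the isolated field galaxy is classified as field and does not break up the large empty region it sits in.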

  3. The cost of large-scale school health programmes which deliver anthelmintics to children in Ghana and Tanzania. The Partnership for Child Development.

    PubMed

    1999-07-30

    It has been argued that the delivery of anthelmintics to school-children through existing education infrastructure can be one of the most cost-effective approaches to controlling parasitic worm infection. This paper examines the actual costs of a combination of mass and selective treatment for schistosomiasis using praziquantel and mass treatment for intestinal nematodes using albendazole, as an integral part of school health programmes reaching 80,442 pupils in 577 schools in Volta Region, Ghana, and 109,099 pupils in 350 schools in Tanga Region, Tanzania. The analysis shows that financial delivery costs per child treated using praziquantel, which involved a dose related to body mass and prior screening at the school level, were US$ 0.67 in Ghana and US$ 0.21 in Tanzania, while the delivery costs for albendazole, which was given as a fixed dose to all children, were US$ 0.04 in Ghana and US$ 0.03 in Tanzania. The higher unit costs in Ghana reflect the epidemiology of infection; overall, fixed costs were similar in both countries, but fewer children required treatment in Ghana. Analysis of economic costs, which include the cost of unpaid days of labour, indicates that the financial costs are increased in Ghana by 78% and in Tanzania by 44%. It is these additional costs which are avoided by integration into an existing infrastructure. It is concluded that: the base cost of delivering a universal, standard, school-based health intervention can be as low as US$ 0.03 per child treated; that even a slight increase in the complexity of delivery can have a significant impact on the cost of intervention; and that the use of the education infrastructure does indeed offer significant savings in delivery costs.
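    The relationship between financial and economic delivery costs reported above is simple arithmetic: the economic cost adds an unpaid-labour mark-up to the financial cost. A minimal sketch using the figures quoted in the abstract (the helper name is ours, not the paper's):

    ```python
    def economic_cost(financial_cost_usd, markup):
        """Economic cost per child = financial delivery cost plus the value
        of unpaid labour days, expressed as a fractional mark-up."""
        return financial_cost_usd * (1.0 + markup)

    # Albendazole delivery cost per child treated (US$), applying the
    # reported mark-ups of 78% (Ghana) and 44% (Tanzania):
    ghana = economic_cost(0.04, 0.78)
    tanzania = economic_cost(0.03, 0.44)
    print(f"Ghana: ${ghana:.4f} per child, Tanzania: ${tanzania:.4f} per child")
    ```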

  4. A Navajo health consumer survey.

    PubMed

    Stewart, T; May, P; Muneta, A

    1980-12-01

    The findings of a health consumer survey of 309 Navajo families in three areas of the Navajo Reservation are reported. The survey shows that access to facilities and lack of safe water and sanitary supplies are continuing problems for these families. The families show consistent use of Indian Health Service providers, particularly nurses, pharmacists and physicians, as well as traditional Navajo medicine practitioners. Only incidental utilization of private medical services is reported. Extended waiting times and translation from English to Navajo are major concerns in their contacts with providers. A surprisingly high availability of third-party insurance is noted. Comparisons are made between this data base and selected national and regional surveys, and with family surveys from other groups assumed to be disadvantaged in obtaining health care. The comparisons indicate somewhat lower utilization rates and more problems in access to care for this Navajo sample. The discussion suggests that attitudes regarding free health care eventually may be a factor for Navajo people and other groups, that cultural considerations are often ignored or accepted as truisms in delivering care, and that the Navajo Reservation may serve as a unique microcosm of health care in the U.S.

  5. Psychological Resilience after Hurricane Sandy: The Influence of Individual- and Community-Level Factors on Mental Health after a Large-Scale Natural Disaster

    PubMed Central

    Lowe, Sarah R.; Sampson, Laura; Gruebner, Oliver; Galea, Sandro

    2015-01-01

    Several individual-level factors are known to promote psychological resilience in the aftermath of disasters. Far less is known about the role of community-level factors in shaping postdisaster mental health. The purpose of this study was to explore the influence of both individual- and community-level factors on resilience after Hurricane Sandy. A representative sample of household residents (N = 418) from 293 New York City census tracts that were most heavily affected by the storm completed telephone interviews approximately 13–16 months postdisaster. Multilevel multivariable models explored the independent and interactive contributions of individual- and community-level factors to posttraumatic stress and depression symptoms. At the individual-level, having experienced or witnessed any lifetime traumatic event was significantly associated with higher depression and posttraumatic stress, whereas demographic characteristics (e.g., older age, non-Hispanic Black race) and more disaster-related stressors were significantly associated with higher posttraumatic stress only. At the community-level, living in an area with higher social capital was significantly associated with higher posttraumatic stress. Additionally, higher community economic development was associated with lower risk of depression only among participants who did not experience any disaster-related stressors. These results provide evidence that individual- and community-level resources and exposure operate in tandem to shape postdisaster resilience. PMID:25962178

  6. Psychological resilience after Hurricane Sandy: the influence of individual- and community-level factors on mental health after a large-scale natural disaster.

    PubMed

    Lowe, Sarah R; Sampson, Laura; Gruebner, Oliver; Galea, Sandro

    2015-01-01

    Several individual-level factors are known to promote psychological resilience in the aftermath of disasters. Far less is known about the role of community-level factors in shaping postdisaster mental health. The purpose of this study was to explore the influence of both individual- and community-level factors on resilience after Hurricane Sandy. A representative sample of household residents (N = 418) from 293 New York City census tracts that were most heavily affected by the storm completed telephone interviews approximately 13-16 months postdisaster. Multilevel multivariable models explored the independent and interactive contributions of individual- and community-level factors to posttraumatic stress and depression symptoms. At the individual-level, having experienced or witnessed any lifetime traumatic event was significantly associated with higher depression and posttraumatic stress, whereas demographic characteristics (e.g., older age, non-Hispanic Black race) and more disaster-related stressors were significantly associated with higher posttraumatic stress only. At the community-level, living in an area with higher social capital was significantly associated with higher posttraumatic stress. Additionally, higher community economic development was associated with lower risk of depression only among participants who did not experience any disaster-related stressors. These results provide evidence that individual- and community-level resources and exposure operate in tandem to shape postdisaster resilience.

  7. Accuracy of Electronic Health Record Data for Identifying Stroke Cases in Large-Scale Epidemiological Studies: A Systematic Review from the UK Biobank Stroke Outcomes Group

    PubMed Central

    Woodfield, Rebecca; Grant, Ian; Sudlow, Cathie L. M.

    2015-01-01

    Objective Long-term follow-up of population-based prospective studies is often achieved through linkages to coded regional or national health care data. Our knowledge of the accuracy of such data is incomplete. To inform methods for identifying stroke cases in UK Biobank (a prospective study of 503,000 UK adults recruited in middle-age), we systematically evaluated the accuracy of these data for stroke and its main pathological types (ischaemic stroke, intracerebral haemorrhage, subarachnoid haemorrhage), determining the optimum codes for case identification. Methods We sought studies published from 1990-November 2013, which compared coded data from death certificates, hospital admissions or primary care with a reference standard for stroke or its pathological types. We extracted information on a range of study characteristics and assessed study quality with the Quality Assessment of Diagnostic Studies tool (QUADAS-2). To assess accuracy, we extracted data on positive predictive values (PPV) and—where available—on sensitivity, specificity, and negative predictive values (NPV). Results 37 of 39 eligible studies assessed accuracy of International Classification of Diseases (ICD)-coded hospital or death certificate data. They varied widely in their settings, methods, reporting, quality, and in the choice and accuracy of codes. Although PPVs for stroke and its pathological types ranged from 6–97%, appropriately selected, stroke-specific codes (rather than broad cerebrovascular codes) consistently produced PPVs >70%, and in several studies >90%. The few studies with data on sensitivity, specificity and NPV showed higher sensitivity of hospital versus death certificate data for stroke, with specificity and NPV consistently >96%. Few studies assessed either primary care data or combinations of data sources. Conclusions Particular stroke-specific codes can yield high PPVs (>90%) for stroke/stroke types. Inclusion of primary care data and combining data sources should
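    The four accuracy measures extracted in this review are simple functions of a two-by-two validation table comparing coded data against the reference standard. A minimal sketch (the counts below are hypothetical, not taken from any study in the review):

    ```python
    def accuracy_measures(tp, fp, fn, tn):
        """Diagnostic-accuracy measures for coded-data validation:
        PPV  - proportion of code-positive cases that are true strokes
        NPV  - proportion of code-negative cases truly stroke-free
        sensitivity - proportion of true strokes captured by the codes
        specificity - proportion of non-strokes correctly left uncoded
        """
        return {
            "ppv": tp / (tp + fp),
            "npv": tn / (tn + fn),
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
        }

    # Hypothetical counts: 90 true strokes coded as stroke, 10 false
    # positives, 30 missed strokes, 870 true negatives.
    m = accuracy_measures(tp=90, fp=10, fn=30, tn=870)
    ```

    With these invented counts the PPV is 0.90 and sensitivity 0.75, mirroring the review's point that stroke-specific codes can achieve high PPV even when some strokes are missed.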

  8. Coverage of Large-Scale Food Fortification of Edible Oil, Wheat Flour, and Maize Flour Varies Greatly by Vehicle and Country but Is Consistently Lower among the Most Vulnerable: Results from Coverage Surveys in 8 Countries.

    PubMed

    Aaron, Grant J; Friesen, Valerie M; Jungjohann, Svenja; Garrett, Greg S; Neufeld, Lynnette M; Myatt, Mark

    2017-04-12

    Background: Large-scale food fortification (LSFF) of commonly consumed food vehicles is widely implemented in low- and middle-income countries. Many programs have monitoring information gaps and most countries fail to assess program coverage. Objective: The aim of this work was to present LSFF coverage survey findings (overall and in vulnerable populations) from 18 programs (7 wheat flour, 4 maize flour, and 7 edible oil programs) conducted in 8 countries between 2013 and 2015. Methods: A Fortification Assessment Coverage Toolkit (FACT) was developed to standardize the assessments. Three indicators were used to assess the relations between coverage and vulnerability: 1) poverty, 2) poor dietary diversity, and 3) rural residence. Three measures of coverage were assessed: 1) consumption of the vehicle, 2) consumption of a fortifiable vehicle, and 3) consumption of a fortified vehicle. Individual program performance was assessed based on the following: 1) achieving overall coverage ≥50%, 2) achieving coverage of ≥75% in ≥1 vulnerable group, and 3) achieving equity in coverage for ≥1 vulnerable group. Results: Coverage varied widely by food vehicle and country. Only 2 of the 18 LSFF programs assessed met all 3 program performance criteria. The 2 main program bottlenecks were a poor choice of vehicle and failure to fortify a fortifiable vehicle (i.e., absence of fortification). Conclusions: The results highlight the importance of sound program design and routine monitoring and evaluation. There is strong evidence of the impact and cost-effectiveness of LSFF; however, impact can only be achieved when the necessary activities and processes during program design and implementation are followed. The FACT approach fills an important gap in the availability of standardized tools. The LSFF programs assessed here need to be re-evaluated to determine whether to further invest in the programs, whether other vehicles are appropriate, and whether other approaches are needed.
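    The three coverage measures above are nested proportions over household records: each measure can only be as high as the one before it. A minimal sketch with invented field names (the actual FACT instruments and data format differ):

    ```python
    def coverage_rates(households):
        """Three nested coverage measures, as proportions of households:
        1. consumes the food vehicle at all
        2. consumes a fortifiable (industrially processed) form of it
        3. consumes a form of the vehicle confirmed to be fortified
        """
        n = len(households)
        vehicle = sum(1 for h in households if h["consumes_vehicle"])
        fortifiable = sum(1 for h in households
                          if h["consumes_vehicle"] and h["fortifiable"])
        fortified = sum(1 for h in households
                        if h["consumes_vehicle"] and h["fortifiable"]
                        and h["fortified"])
        return {"vehicle": vehicle / n,
                "fortifiable": fortifiable / n,
                "fortified": fortified / n}

    # Four hypothetical household records:
    sample = [
        {"consumes_vehicle": True,  "fortifiable": True,  "fortified": True},
        {"consumes_vehicle": True,  "fortifiable": True,  "fortified": False},
        {"consumes_vehicle": True,  "fortifiable": False, "fortified": False},
        {"consumes_vehicle": False, "fortifiable": False, "fortified": False},
    ]
    rates = coverage_rates(sample)
    ```

    The second record illustrates the "failure to fortify a fortifiable vehicle" bottleneck: it counts toward fortifiable coverage but not fortified coverage.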

  9. Study Protocol for the Fukushima Health Management Survey

    PubMed Central

    Yasumura, Seiji; Hosoya, Mitsuaki; Yamashita, Shunichi; Kamiya, Kenji; Abe, Masafumi; Akashi, Makoto; Kodama, Kazunori; Ozasa, Kotaro

    2012-01-01

    and birth survey. This long-term large-scale epidemiologic study is expected to provide valuable data in the investigation of the health effects of low-dose radiation and disaster-related stress. PMID:22955043

  10. Large-Scale Assessments and Educational Policies in Italy

    ERIC Educational Resources Information Center

    Damiani, Valeria

    2016-01-01

    Despite Italy's extensive participation in most large-scale assessments, their actual influence on Italian educational policies is less easy to identify. The present contribution aims at highlighting and explaining reasons for the weak and often inconsistent relationship between international surveys and policy-making processes in Italy.…

  11. THE OBSERVATIONS OF REDSHIFT EVOLUTION IN LARGE-SCALE ENVIRONMENTS (ORELSE) SURVEY. I. THE SURVEY DESIGN AND FIRST RESULTS ON CL 0023+0423 AT z = 0.84 AND RX J1821.6+6827 AT z = 0.82

    SciTech Connect

    Lubin, L. M.; Lemaux, B. C.; Kocevski, D. D.; Gal, R. R.; Squires, G. K.

    2009-06-15

    We present the Observations of Redshift Evolution in Large-Scale Environments (ORELSE) Survey, a systematic search for structure on scales greater than 10 h70-1 Mpc around 20 well-known clusters at redshifts of 0.6 < z < 1.3. The goal of the survey is to examine a statistical sample of dynamically active clusters and large-scale structures in order to quantify galaxy properties over the full range of local and global environments. We describe the survey design, the cluster sample, and our extensive observational data covering at least 25' around each target cluster. We use adaptively smoothed red galaxy density maps from our wide-field optical imaging to identify candidate groups/clusters and intermediate-density large-scale filaments/walls in each cluster field. Because photometric techniques (such as photometric redshifts, statistical overdensities, and richness estimates) can be highly uncertain, the crucial component of this survey is the unprecedented amount of spectroscopic coverage. We are using the wide-field, multiobject spectroscopic capabilities of the Deep Multiobject Imaging Spectrograph to obtain 100-200+ confirmed cluster members in each field. Our survey has already discovered the Cl 1604 supercluster at z ≈ 0.9, a structure which contains at least eight groups and clusters and spans 13 Mpc × 100 Mpc. Here, we present the results on the large-scale environments of two additional clusters, Cl 0023+0423 at z = 0.84 and RX J1821.6+6827 at z = 0.82, which highlight the diversity of global properties at these redshifts. The optically selected Cl 0023+0423 is a four-way group-group merger with constituent groups having measured velocity dispersions between 206 and 479 km s-1. The galaxy population is dominated by blue, star-forming galaxies, with 80% of the confirmed members showing [O II] emission. The strength of the Hδ line in a composite spectrum of 138 members indicates a substantial contribution from recent

  12. The Observations of Redshift Evolution in Large-Scale Environments (ORELSE) Survey. I. The Survey Design and First Results on CL 0023+0423 at z = 0.84 and RX J1821.6+6827 at z = 0.82

    NASA Astrophysics Data System (ADS)

    Lubin, L. M.; Gal, R. R.; Lemaux, B. C.; Kocevski, D. D.; Squires, G. K.

    2009-06-01

    We present the Observations of Redshift Evolution in Large-Scale Environments (ORELSE) Survey, a systematic search for structure on scales greater than 10 h70-1 Mpc around 20 well-known clusters at redshifts of 0.6 < z < 1.3. The goal of the survey is to examine a statistical sample of dynamically active clusters and large-scale structures in order to quantify galaxy properties over the full range of local and global environments. We describe the survey design, the cluster sample, and our extensive observational data covering at least 25' around each target cluster. We use adaptively smoothed red galaxy density maps from our wide-field optical imaging to identify candidate groups/clusters and intermediate-density large-scale filaments/walls in each cluster field. Because photometric techniques (such as photometric redshifts, statistical overdensities, and richness estimates) can be highly uncertain, the crucial component of this survey is the unprecedented amount of spectroscopic coverage. We are using the wide-field, multiobject spectroscopic capabilities of the Deep Multiobject Imaging Spectrograph to obtain 100-200+ confirmed cluster members in each field. Our survey has already discovered the Cl 1604 supercluster at z ≈ 0.9, a structure which contains at least eight groups and clusters and spans 13 Mpc × 100 Mpc. Here, we present the results on the large-scale environments of two additional clusters, Cl 0023+0423 at z = 0.84 and RX J1821.6+6827 at z = 0.82, which highlight the diversity of global properties at these redshifts. The optically selected Cl 0023+0423 is a four-way group-group merger with constituent groups having measured velocity dispersions between 206 and 479 km s-1. The galaxy population is dominated by blue, star-forming galaxies, with 80% of the confirmed members showing [O II] emission. The strength of the Hδ line in a composite spectrum of 138 members indicates a substantial contribution from recent starbursts to the overall galaxy

  13. Implementation factors affecting the large-scale deployment of digital health and well-being technologies: A qualitative study of the initial phases of the 'Living-It-Up' programme.

    PubMed

    Agbakoba, Ruth; McGee-Lennon, Marilyn; Bouamrane, Matt-Mouley; Watson, Nicholas; Mair, Frances S

    2016-12-01

    Little is known about the factors which facilitate or impede the large-scale deployment of health and well-being consumer technologies. The Living-It-Up project is a large-scale digital intervention led by NHS 24, aiming to transform health and well-being services delivery throughout Scotland. We conducted a qualitative study of the factors affecting the implementation and deployment of the Living-It-Up services. We collected a range of data during the initial phase of deployment, including semi-structured interviews (N = 6); participant observation sessions (N = 5) and meetings with key stakeholders (N = 3). We used Normalisation Process Theory as an explanatory framework to interpret the social processes at play during the initial phases of deployment. Initial findings illustrate that it is clear - and perhaps not surprising - that the size and diversity of the Living-It-Up consortium made implementation processes more complex within a 'multi-stakeholder' environment. To overcome these barriers, there is a need to clearly define roles, tasks and responsibilities among the consortium partners. Furthermore, varying levels of expectations and requirements, as well as diverse cultures and ways of working, must be effectively managed. Factors which facilitated implementation included extensive stakeholder engagement, such as co-design activities, which can contribute to increased 'buy-in' from users in the long term. An important lesson from the Living-It-Up initiative is that attempting to co-design innovative digital services while at the same time recruiting large numbers of users is likely to generate conflicting implementation priorities which hinder - or at least substantially slow down - the effective rollout of services at scale. The deployment of Living-It-Up services is ongoing, but our results to date suggest that - in order to be successful - the roll-out of digital health and well-being technologies at scale requires a delicate and pragmatic trade

  14. Implementation factors affecting the large-scale deployment of digital health and well-being technologies: A qualitative study of the initial phases of the ‘Living-It-Up’ programme

    PubMed Central

    Agbakoba, Ruth; McGee-Lennon, Marilyn; Bouamrane, Matt-Mouley; Watson, Nicholas; Mair, Frances S

    2015-01-01

    Little is known about the factors which facilitate or impede the large-scale deployment of health and well-being consumer technologies. The Living-It-Up project is a large-scale digital intervention led by NHS 24, aiming to transform health and well-being services delivery throughout Scotland. We conducted a qualitative study of the factors affecting the implementation and deployment of the Living-It-Up services. We collected a range of data during the initial phase of deployment, including semi-structured interviews (N = 6); participant observation sessions (N = 5) and meetings with key stakeholders (N = 3). We used the Normalisation Process Theory as an explanatory framework to interpret the social processes at play during the initial phases of deployment. Initial findings illustrate that it is clear − and perhaps not surprising − that the size and diversity of the Living-It-Up consortium made implementation processes more complex within a ‘multi-stakeholder’ environment. To overcome these barriers, there is a need to clearly define roles, tasks and responsibilities among the consortium partners. Furthermore, varying levels of expectations and requirements, as well as diverse cultures and ways of working, must be effectively managed. Factors which facilitated implementation included extensive stakeholder engagement, such as co-design activities, which can contribute to an increased ‘buy-in’ from users in the long term. An important lesson from the Living-It-Up initiative is that attempting to co-design innovative digital services, but at the same time, recruiting large numbers of users is likely to generate conflicting implementation priorities which hinder − or at least substantially slow down − the effective rollout of services at scale. The deployment of Living-It-Up services is ongoing, but our results to date suggest that − in order to be successful − the roll-out of digital health and well-being technologies at scale requires a delicate

  15. National health surveys and health policy: impact of the Jamaica Health and Lifestyle Surveys and the Reproductive Health Surveys.

    PubMed

    Ferguson, T S; Tulloch-Reid, M K; Gordon-Strachan, G; Hamilton, P; Wilks, R J

    2012-07-01

    Over the last six decades, comprehensive national health surveys have become important data-gathering mechanisms to inform countries on their health status and provide information for health policy and programme planning. Developing countries have only recently begun such surveys and Jamaica has been at the forefront of this effort. Jamaica's Reproductive Health Surveys and the programme response to their findings have resulted in an almost 50% reduction in fertility rates over three decades as well as a 40% reduction in unmet contraceptive needs and a 40% reduction in unplanned pregnancies over the last two decades. The Jamaica Health and Lifestyle Surveys have served to reinforce the major burden that non-communicable diseases place on the society and the extent to which these are driven by unhealthy lifestyles. These surveys have shown that obesity, hypertension, diabetes and dyslipidaemia affect approximately 50%, 25%, 10% and 10% of the adult population, respectively. These surveys have documented low rates of treatment and control for these chronic non-communicable diseases despite two major policy initiatives, the National Programme for the Promotion of Healthy Lifestyles and the creation of the National Health Fund which subsidizes healthcare provision for chronic diseases. In order to maximize the uptake of the findings of future surveys into effective health policy, there will need to be effective collaborations between academia, policy-makers, regional and international health agencies, non-government organizations and civil society. Such collaborations should take into account the social, political and economic issues, thus ensuring a more comprehensive approach to health policy and resulting in improvement of the nation's health status and, by extension, national development.

  16. Large-scale investment in green space as an intervention for physical activity, mental and cardiometabolic health: study protocol for a quasi-experimental evaluation of a natural experiment

    PubMed Central

    Astell-Burt, Thomas; Feng, Xiaoqi; Kolt, Gregory S

    2016-01-01

    Introduction ‘Green spaces’ such as public parks are regarded as determinants of health, but the supporting evidence tends to be based on cross-sectional designs. This protocol describes a study that will evaluate a large-scale investment in approximately 5280 hectares of green space stretching 27 km north to south in Western Sydney, Australia. Methods and analysis A Geographic Information System was used to identify 7272 participants in the 45 and Up Study baseline data (2006–2008) living within 5 km of the Western Sydney Parklands and some of the features that have been constructed since 2009, such as public access points, advertising billboards, walking and cycle tracks, BBQ stations, and children's playgrounds. These data were linked to information on a range of health and behavioural outcomes, with the second wave of data collection initiated by the Sax Institute in 2012 and expected to be completed by 2015. Multilevel models will be used to analyse potential change in physical activity, weight status, social contacts, and mental and cardiometabolic health within a closed sample of residentially stable participants. Comparisons between persons with contrasting proximities to different areas of the Parklands will provide ‘treatment’ and ‘control’ groups within a ‘quasi-experimental’ study design. In line with expectations, baseline results prior to the enhancement of the Western Sydney Parklands indicated virtually no significant differences in the distribution of any of the outcomes with respect to proximity to green space preintervention. Ethics and dissemination Ethical approval was obtained for the 45 and Up Study from the University of New South Wales Human Research Ethics Committee. Ethics approval for this study was obtained from the University of Western Sydney Ethics Committee. Findings will be disseminated through partner organisations (the Western Sydney Parklands and the National Heart Foundation of Australia), as well as to policymakers in
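
    The ‘treatment versus control’ logic of this quasi-experimental design can be sketched as a simple difference-in-differences comparison. The groups, outcome and numbers below are hypothetical, and the actual protocol specifies multilevel models rather than this bare group-means estimator:

    ```python
    # Difference-in-differences sketch for a natural experiment: compare the
    # before/after change in an outcome (e.g. weekly minutes of physical
    # activity) for residents near the green space ("treatment") against
    # residents farther away ("control"). All numbers are illustrative.

    def did_estimate(treat_pre, treat_post, ctrl_pre, ctrl_post):
        """Return the difference-in-differences estimate from group means."""
        mean = lambda xs: sum(xs) / len(xs)
        return (mean(treat_post) - mean(treat_pre)) - (mean(ctrl_post) - mean(ctrl_pre))

    # Hypothetical samples: treatment gains 30 min/week, control gains 10.
    treat_pre  = [120, 140, 100, 160]   # mean 130
    treat_post = [150, 170, 130, 190]   # mean 160
    ctrl_pre   = [125, 135, 115, 145]   # mean 130
    ctrl_post  = [135, 145, 125, 155]   # mean 140

    effect = did_estimate(treat_pre, treat_post, ctrl_pre, ctrl_post)
    print(effect)  # 20.0: the change attributable to proximity, assuming parallel trends
    ```

    The estimator nets out both the pre-existing level difference between groups and any background time trend common to both.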

  17. Large Scale Metal Additive Techniques Review

    SciTech Connect

    Nycz, Andrzej; Adediran, Adeola I; Noakes, Mark W; Love, Lonnie J

    2016-01-01

    In recent years, additive manufacturing has made long strides toward becoming a mainstream production technology. Particularly strong progress has been made in large-scale polymer deposition. However, large-scale metal additive manufacturing has not yet reached parity with large-scale polymer processes. This paper is a review of metal additive techniques in the context of building large structures. Current commercial devices can print metal parts on the order of several cubic feet, compared to hundreds of cubic feet on the polymer side. To follow the polymer progress path, several factors are considered: potential to scale, economy, environmental friendliness, material properties, feedstock availability, robustness of the process, quality and accuracy, potential for defects, and post-processing, as well as potential applications. This paper focuses on the current state of the art of large-scale metal additive technology, with an emphasis on expanding the geometric limits.

  18. Large-scale regions of antimatter

    SciTech Connect

    Grobov, A. V. Rubin, S. G.

    2015-07-15

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  19. DESIGN OF LARGE-SCALE AIR MONITORING NETWORKS

    EPA Science Inventory

    The potential effects of air pollution on human health have received much attention in recent years. In the U.S. and other countries, there are extensive large-scale monitoring networks designed to collect data to inform the public of exposure risks to air pollution. A major crit...

  20. The convergent validity of three surveys as alternative sources of health information to the 2011 UK census.

    PubMed

    Taylor, Joanna; Twigg, Liz; Moon, Graham

    2014-09-01

    Censuses have traditionally been a key source of localised information on the state of a nation's health. Many countries are now adopting alternative approaches to the traditional census, placing such information at risk. The purpose of this paper is to inform debate about whether existing social surveys could provide an adequate 'base' for alternative model-based small area estimates of health data in a post-traditional-census era. Using a case study of the 2011 UK Census questions on self-assessed health and limiting long-term illness, we examine the extent to which the results from three large-scale surveys - the Health Survey for England, the Crime Survey for England and Wales and the Integrated Household Survey - conform to census output. Particularly in the case of limiting long-term illness, differences in question wording render comparisons difficult. However, with the exception of the general health question from the Health Survey for England, all three surveys meet tests for convergent validity.
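
    The idea behind a convergent validity check can be illustrated with a toy comparison of area-level prevalence estimates: if survey and census figures move together across areas, the survey is a plausible substitute base. The figures below are invented, and the paper's actual statistical tests may differ:

    ```python
    import numpy as np

    # Convergent validity sketch: do survey-based prevalence estimates track
    # census figures across areas? A high correlation and a small mean
    # difference support using the survey as a census substitute.
    census = np.array([0.18, 0.22, 0.15, 0.30, 0.25, 0.20])  # LLTI prevalence by area
    survey = np.array([0.17, 0.24, 0.14, 0.29, 0.27, 0.19])  # survey estimates, same areas

    r = np.corrcoef(census, survey)[0, 1]       # agreement in ranking/level
    mean_diff = float(np.mean(survey - census)) # systematic bias, if any
    print(round(r, 2), round(mean_diff, 3))
    ```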

  1. The National Adolescent Student Health Survey: Survey Replication Booklet.

    ERIC Educational Resources Information Center

    American School Health Association, Kent, OH.

    The National Adolescent Student Health Survey (NASHS), initiated in 1985, is conducted to examine the health-related knowledge, practices, and attitudes of the nation's youth in the following health areas: AIDS; Nutrition; Consumer Health; Sexually Transmitted Disease; Drug and Alcohol Use; Suicide; Injury Prevention; and Violence. Findings…

  2. Large-scale cortical networks and cognition.

    PubMed

    Bressler, S L

    1995-03-01

    The well-known parcellation of the mammalian cerebral cortex into a large number of functionally distinct cytoarchitectonic areas presents a problem for understanding the complex cortical integrative functions that underlie cognition. How do cortical areas having unique individual functional properties cooperate to accomplish these complex operations? Do neurons distributed throughout the cerebral cortex act together in large-scale functional assemblages? This review examines the substantial body of evidence supporting the view that complex integrative functions are carried out by large-scale networks of cortical areas. Pathway tracing studies in non-human primates have revealed widely distributed networks of interconnected cortical areas, providing an anatomical substrate for large-scale parallel processing of information in the cerebral cortex. Functional coactivation of multiple cortical areas has been demonstrated by neurophysiological studies in non-human primates and several different cognitive functions have been shown to depend on multiple distributed areas by human neuropsychological studies. Electrophysiological studies on interareal synchronization have provided evidence that active neurons in different cortical areas may become not only coactive, but also functionally interdependent. The computational advantages of synchronization between cortical areas in large-scale networks have been elucidated by studies using artificial neural network models. Recent observations of time-varying multi-areal cortical synchronization suggest that the functional topology of a large-scale cortical network is dynamically reorganized during visuomotor behavior.

  3. Large-scale nanophotonic phased array.

    PubMed

    Sun, Jie; Timurdogan, Erman; Yaacobi, Ami; Hosseini, Ehsan Shah; Watts, Michael R

    2013-01-10

    Electromagnetic phased arrays at radio frequencies are well known and have enabled applications ranging from communications to radar, broadcasting and astronomy. The ability to generate arbitrary radiation patterns with large-scale phased arrays has long been pursued. Although it is extremely expensive and cumbersome to deploy large-scale radiofrequency phased arrays, optical phased arrays have a unique advantage in that the much shorter optical wavelength holds promise for large-scale integration. However, the short optical wavelength also imposes stringent requirements on fabrication. As a consequence, although optical phased arrays have been studied with various platforms and recently with chip-scale nanophotonics, all of the demonstrations so far are restricted to one-dimensional or small-scale two-dimensional arrays. Here we report the demonstration of a large-scale two-dimensional nanophotonic phased array (NPA), in which 64 × 64 (4,096) optical nanoantennas are densely integrated on a silicon chip within a footprint of 576 μm × 576 μm with all of the nanoantennas precisely balanced in power and aligned in phase to generate a designed, sophisticated radiation pattern in the far field. We also show that active phase tunability can be realized in the proposed NPA by demonstrating dynamic beam steering and shaping with an 8 × 8 array. This work demonstrates that a robust design, together with state-of-the-art complementary metal-oxide-semiconductor technology, allows large-scale NPAs to be implemented on compact and inexpensive nanophotonic chips. In turn, this enables arbitrary radiation pattern generation using NPAs and therefore extends the functionalities of phased arrays beyond conventional beam focusing and steering, opening up possibilities for large-scale deployment in applications such as communication, laser detection and ranging, three-dimensional holography and biomedical sciences, to name just a few.
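
    The principle of precise phase alignment for beam steering can be sketched for a single row of such an array: a linear phase ramp across the elements moves the main lobe to a chosen angle. This is a generic textbook illustration, not the authors' chip design; the wavelength, element pitch and count are assumptions:

    ```python
    import numpy as np

    # Beam steering in a uniform linear phased array (illustrative sketch).
    # A phase ramp phi_n = -2*pi*n*(d/lambda)*sin(theta0) on the n-th element
    # steers the main lobe to angle theta0.
    lam = 1.55e-6               # optical wavelength (m), hypothetical
    d = 0.5 * lam               # half-wavelength element pitch (no grating lobes)
    N = 64                      # elements in one row (cf. a 64 x 64 array)
    theta0 = np.deg2rad(10.0)   # desired steering angle

    n = np.arange(N)
    phases = -2 * np.pi * n * (d / lam) * np.sin(theta0)  # per-element phases

    # Array factor: coherent sum of element contributions vs observation angle.
    thetas = np.deg2rad(np.linspace(-30, 30, 6001))
    af = np.abs(np.exp(1j * (2 * np.pi * (d / lam) * np.outer(np.sin(thetas), n)
                             + phases)).sum(axis=1))

    peak = np.rad2deg(thetas[np.argmax(af)])
    print(round(peak, 2))       # main lobe lands at ~10 degrees
    ```

    At the half-wavelength pitch chosen here the steered lobe is unique over the visible range; larger pitches introduce grating lobes, which is one reason dense integration matters.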

  4. Management of large-scale technology

    NASA Technical Reports Server (NTRS)

    Levine, A.

    1985-01-01

    Two major themes are addressed in this assessment of the management of large-scale NASA programs: (1) how a high-technology agency fared through a decade marked by rapid expansion of funds and manpower in the first half and almost as rapid contraction in the second; and (2) how NASA combined central planning and control with decentralized project execution.

  5. Evaluating Large-Scale Interactive Radio Programmes

    ERIC Educational Resources Information Center

    Potter, Charles; Naidoo, Gordon

    2009-01-01

    This article focuses on the challenges involved in conducting evaluations of interactive radio programmes in South Africa with large numbers of schools, teachers, and learners. It focuses on the role such large-scale evaluation has played during the South African radio learning programme's development stage, as well as during its subsequent…

  6. National Health Interview Survey (NHIS)

    EPA Pesticide Factsheets

    The NHIS collects data on a broad range of health topics through personal household interviews. The results of NHIS provide data to track health status, health care access, and progress toward achieving national health objectives.

  7. Washington State Survey of Adolescent Health Behaviors.

    ERIC Educational Resources Information Center

    Washington State Dept. of Social and Health Services, Olympia.

    The 1992 Washington State Survey of Adolescent Health Behaviors (WSSAHB) was created to collect information regarding a variety of adolescent health behaviors among students in the state of Washington. It expands on two previous administrations of a student tobacco, alcohol, and other drug survey and includes questions about medical care, safety,…

  8. Estimating health expenditure shares from household surveys

    PubMed Central

    Brooks, Benjamin PC; Hanlon, Michael

    2013-01-01

    Abstract Objective To quantify the effects of household expenditure survey characteristics on the estimated share of a household’s expenditure devoted to health. Methods A search was conducted for all country surveys reporting data on health expenditure and total household expenditure. Data on total expenditure and health expenditure were extracted from the surveys to generate the health expenditure share (i.e. fraction of the household expenditure devoted to health). To do this the authors relied on survey microdata or survey reports to calculate the health expenditure share for the particular instrument involved. Health expenditure share was modelled as a function of the survey’s recall period, the number of health expenditure items, the number of total expenditure items, the data collection method and the placement of the health module within the survey. Data exists across space and time, so fixed effects for territory and year were included as well. The model was estimated by means of ordinary least squares regression with clustered standard errors. Findings A one-unit increase in the number of health expenditure questions was accompanied by a 1% increase in the estimated health expenditure share. A one-unit increase in the number of non-health expenditure questions resulted in a 0.2% decrease in the estimated share. Increasing the recall period by one month was accompanied by a 6% decrease in the health expenditure share. Conclusion The characteristics of a survey instrument examined in the study affect the estimate of the health expenditure share. Those characteristics need to be accounted for when comparing results across surveys within a territory and, ultimately, across territories. PMID:23825879
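
    The modelling approach described, regressing the health expenditure share on survey-design characteristics, can be sketched with simulated data. This minimal version omits the territory/year fixed effects and clustered standard errors used in the actual study, and all coefficients are invented:

    ```python
    import numpy as np

    # Minimal OLS sketch: health expenditure share regressed on survey-design
    # characteristics (recall period, number of health expenditure items).
    rng = np.random.default_rng(0)
    n = 500
    recall = rng.integers(1, 13, size=n)      # recall period in months
    n_health_q = rng.integers(1, 40, size=n)  # number of health expenditure items

    # Assumed data-generating process: share falls with longer recall and
    # rises with more health items (directions match the paper's findings).
    share = 0.08 - 0.004 * recall + 0.0008 * n_health_q + rng.normal(0, 0.002, n)

    X = np.column_stack([np.ones(n), recall, n_health_q])
    beta, *_ = np.linalg.lstsq(X, share, rcond=None)
    print(np.round(beta, 4))  # roughly [0.08, -0.004, 0.0008]
    ```

    A fixed-effects version would simply append dummy columns for territory and year to `X`; clustered standard errors require adjusting the variance estimate, not the point estimates.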

  9. Fukushima Health Management Survey and Related Issues.

    PubMed

    Yasumura, Seiji; Abe, Masafumi

    2017-03-01

    After the Great East Japan Earthquake on March 11, 2011, the Tokyo Electric Power Company Fukushima Daiichi Nuclear Power Plant accident occurred. The Fukushima prefectural government decided to launch the Fukushima Health Management Survey; Fukushima Medical University was entrusted to design and implement the survey. The survey process and development is described from the standpoint of its background and aim. An overview of the basic survey and 4 detailed surveys is briefly provided. Issues related to the survey are discussed from the perspective of supporting the Fukushima residents.

  10. Large-scale Advanced Propfan (LAP) program

    NASA Technical Reports Server (NTRS)

    Sagerser, D. A.; Ludemann, S. G.

    1985-01-01

    The propfan is an advanced propeller concept which maintains the high efficiencies traditionally associated with conventional propellers at the higher aircraft cruise speeds associated with jet transports. The large-scale advanced propfan (LAP) program extends the research done on 2 ft diameter propfan models to a 9 ft diameter article. The program includes design, fabrication, and testing of both an eight bladed, 9 ft diameter propfan, designated SR-7L, and a 2 ft diameter aeroelastically scaled model, SR-7A. The LAP program is complemented by the propfan test assessment (PTA) program, which takes the large-scale propfan and mates it with a gas generator and gearbox to form a propfan propulsion system and then flight tests this system on the wing of a Gulfstream 2 testbed aircraft.

  11. The predictive power of airborne gamma ray survey data on the locations of domestic radon hazards in Norway: A strong case for utilizing airborne data in large-scale radon potential mapping.

    PubMed

    Smethurst, M A; Watson, R J; Baranwal, V C; Rudjord, A L; Finne, I

    2017-01-01

    corresponding RP map of the Oslo area has no unclassified parts. We used statistics of proportions to add 95% confidence limits to estimates of RP on our predictive maps, offering public health strategists an objective measure of uncertainty in the model. The geological and AGRS RP maps were further compared in terms of their performances in correctly classifying local areas known to be radon affected and less affected. Both maps were accurate in their predictions; however the AGRS map out-performed the geology map in its ability to offer confident predictions of RP for all of the local areas tested. We compared the AGRS RP map with the 2015 distribution of population in the Oslo area to determine the likely impact of radon contamination on the population. 11.4% of the population currently reside in the area classified as radon affected. 34% of ground floor living spaces in this affected area are expected to exceed the maximum limit of 200 Bq/m³, while 8.4% of similar spaces outside the affected area exceed this same limit, indicating that the map is very efficient at separating areas with quite different radon contamination profiles. The usefulness of the AGRS RP map in guiding new indoor radon surveys in the Oslo area was also examined. It is shown that indoor measuring programmes targeted on elevated RP areas could be as much as 6 times more efficient at identifying ground floor living spaces above the radon action level compared with surveys based on a random sampling strategy. Also, targeted measuring using the AGRS RP map as a guide makes it practical to search for the worst affected homes in the Oslo area: 10% of the incidences of very high radon contamination in ground floor living spaces (≥800 Bq/m³) are concentrated in just 1.2% of the populated part of the area.
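
    A standard way to attach 95% confidence limits to an estimated proportion, as the RP maps do, is a binomial interval such as the Wilson score interval. This is an illustrative sketch with invented counts; the paper's exact method and figures are not reproduced here:

    ```python
    import math

    # "Statistics of proportions": 95% confidence limits for an estimated
    # radon-affected fraction, via the Wilson score interval (one standard
    # choice for binomial proportions).

    def wilson_ci(successes, n, z=1.96):
        """95% Wilson score confidence interval for a binomial proportion."""
        p = successes / n
        denom = 1 + z**2 / n
        centre = (p + z**2 / (2 * n)) / denom
        half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
        return centre - half, centre + half

    # Hypothetical map cell: 34 of 100 measured dwellings exceed 200 Bq/m3.
    lo, hi = wilson_ci(34, 100)
    print(round(lo, 3), round(hi, 3))  # 0.255 0.437
    ```

    Unlike the simple normal approximation, the Wilson interval stays inside [0, 1] and behaves sensibly for small counts, which matters for sparsely measured map cells.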

  12. Large-scale instabilities of helical flows

    NASA Astrophysics Data System (ADS)

    Cameron, Alexandre; Alexakis, Alexandros; Brachet, Marc-Étienne

    2016-10-01

    Large-scale hydrodynamic instabilities of periodic helical flows of a given wave number K are investigated using three-dimensional Floquet numerical computations. In the Floquet formalism the unstable field is expanded in modes of different spatial periodicity. This allows us (i) to clearly distinguish large-scale from small-scale instabilities and (ii) to study modes of wave number q at arbitrarily large scale separation q ≪ K. Different flows are examined, including flows that exhibit small-scale turbulence. The growth rate σ of the most unstable mode is measured as a function of the scale separation q/K ≪ 1 and the Reynolds number Re. It is shown that the growth rate follows the scaling σ ∝ q if an AKA effect [Frisch et al., Physica D: Nonlinear Phenomena 28, 382 (1987), 10.1016/0167-2789(87)90026-1] is present, or a negative-eddy-viscosity scaling σ ∝ q² in its absence. This holds both for the Re ≪ 1 regime, where previously derived asymptotic results are verified, and for Re = O(1), which is beyond their range of validity. Furthermore, for values of Re above a critical value Re_Sc beyond which small-scale instabilities are present, the growth rate becomes independent of q and the energy of the perturbation at large scales decreases with scale separation. The behavior of these large-scale instabilities is also examined in the nonlinear regime, where the largest scales of the system are found to be the most dominant energetically. These results are interpreted by low-order models.
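
    The two candidate scalings, σ ∝ q for an AKA effect and σ ∝ q² for negative eddy viscosity, can be distinguished by fitting the exponent in log-log space. A minimal sketch on synthetic growth rates (the data and prefactor are invented):

    ```python
    import numpy as np

    # Recover the scaling exponent alpha in sigma ~ q**alpha from growth rates
    # measured at several scale separations. alpha = 1 would indicate an AKA
    # effect; alpha = 2, a negative-eddy-viscosity scaling.
    q = np.array([0.01, 0.02, 0.04, 0.08, 0.16])
    sigma = 0.5 * q**2          # synthetic "negative eddy viscosity" data

    # Fit log(sigma) = log(c) + alpha * log(q) by least squares.
    alpha, logc = np.polyfit(np.log(q), np.log(sigma), 1)
    print(round(alpha, 3))      # 2.0: quadratic scaling, no AKA effect
    ```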

  13. The Consortium on Health and Ageing: Network of Cohorts in Europe and the United States (CHANCES) project--design, population and data harmonization of a large-scale, international study.

    PubMed

    Boffetta, Paolo; Bobak, Martin; Borsch-Supan, Axel; Brenner, Hermann; Eriksson, Sture; Grodstein, Fran; Jansen, Eugene; Jenab, Mazda; Juerges, Hendrik; Kampman, Ellen; Kee, Frank; Kuulasmaa, Kari; Park, Yikyung; Tjonneland, Anne; van Duijn, Cornelia; Wilsgaard, Tom; Wolk, Alicja; Trichopoulos, Dimitrios; Bamia, Christina; Trichopoulou, Antonia

    2014-12-01

    There is a public health demand to prevent health conditions which lead to increased morbidity and mortality among the rapidly-increasing elderly population. Data for the incidence of such conditions exist in cohort studies worldwide, which, however, differ in various aspects. The Consortium on Health and Ageing: Network of Cohorts in Europe and the United States (CHANCES) project aims at harmonizing data from existing major longitudinal studies for the elderly whilst focussing on cardiovascular diseases, diabetes mellitus, cancer, fractures and cognitive impairment in order to estimate their prevalence, incidence and cause-specific mortality, and identify lifestyle, socioeconomic, and genetic determinants and biomarkers for the incidence of and mortality from these conditions. A survey instrument assessing ageing-related conditions of the elderly will be also developed. Fourteen cohort studies participate in CHANCES with 683,228 elderly (and 150,210 deaths), from 23 European and three non-European countries. So far, 287 variables on health conditions and a variety of exposures, including biomarkers and genetic data have been harmonized. Different research hypotheses are investigated with meta-analyses. The results which will be produced can help international organizations, governments and policy-makers to better understand the broader implications and consequences of ageing and thus make informed decisions.

  14. Large-Scale Molecular Gas Survey in 12CO, 13CO and C18O (J=2-1) with the Osaka 1.85m mm-submm Telescope

    NASA Astrophysics Data System (ADS)

    Onishi, Toshikazu; Nishimura, Atsushi; Tokuda, Kazuki; Harada, Ryohei; Dobashi, Kazuhito; Shimoikura, Tomomi; Kimura, Kimihiro; Ogawa, Hideo

    2015-08-01

    Molecular clouds are sites of star formation, and rotational transition lines of carbon monoxide (CO) have been widely used to investigate their distribution, physical properties, and kinematics in order to understand the star formation process in the Galaxy and external galaxies. Although the J=1-0 lines of CO are powerful tools for measuring the mass of the molecular content of the interstellar medium, other transitions with different critical densities for excitation are needed to investigate the local density and temperature, which are important for determining the evolutionary status of molecular clouds. We have thus developed a new mm-submm telescope with a diameter of 1.85m installed at the Nobeyama Radio Observatory (Onishi et al. 2013). The scientific goal is to precisely reveal the physical properties of molecular clouds in the Galaxy by obtaining a large-scale distribution of molecular gas, which can also be compared with large-scale observations at various wavelengths. The target frequency is ~230 GHz; simultaneous observations in the J=2-1 lines of 12CO, 13CO, and C18O are achieved with a beam size (HPBW) of 2.7 arcmin. Currently, about 1500 square degrees are covered, including the galactic plane (L = 5° ~ 220° with |B| ≤ 1°) and star forming regions (Orion, Taurus, Cygnus OB7/X, Ophiuchus, Aquila and so on). The observations of Orion A and B were compared with the J=1-0 data of 12CO, 13CO, and C18O at the same angular resolution to derive the spatial distributions of the physical properties of the molecular gas (Nishimura et al. 2015). We then explored the large velocity gradient formalism to determine the gas density and temperature using line combinations of 12CO(2-1), 13CO(2-1), and 13CO(1-0). We found that this line combination can effectively constrain the density and temperature of the molecular cloud on this size scale, which can be compared with the star formation activity there. These J=2-1 data of the Galactic molecular clouds will be precious for the comparison with

  15. Economically viable large-scale hydrogen liquefaction

    NASA Astrophysics Data System (ADS)

    Cardella, U.; Decker, L.; Klein, H.

    2017-02-01

    The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large-scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase to a multiple of today's typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs optimized for capital expenditure. New concepts must ensure manageable plant complexity and flexible operability. In the phase of process development and selection, the dimensioning of key equipment for large-scale liquefiers, such as turbines, compressors and heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview of the approach, challenges and preliminary results in the development of efficient and economically viable concepts for large-scale hydrogen liquefaction.

  16. Large-Scale Visual Data Analysis

    NASA Astrophysics Data System (ADS)

    Johnson, Chris

    2014-04-01

    Modern high performance computers have speeds measured in petaflops and handle data set sizes measured in terabytes and petabytes. Although these machines offer enormous potential for solving very large-scale realistic computational problems, their effectiveness will hinge upon the ability of human experts to interact with their simulation results and extract useful information. One of the greatest scientific challenges of the 21st century is to effectively understand and make use of the vast amount of information being produced. Visual data analysis will be among our most important tools in helping to understand such large-scale information. Our research at the Scientific Computing and Imaging (SCI) Institute at the University of Utah has focused on innovative, scalable techniques for large-scale 3D visual data analysis. In this talk, I will present state-of-the-art visualization techniques, including scalable visualization algorithms and software, cluster-based visualization methods, and innovative visualization techniques applied to problems in computational science, engineering, and medicine. I will conclude with an outline of future challenges and opportunities in high performance visualization research.

  17. Large-scale neuromorphic computing systems

    NASA Astrophysics Data System (ADS)

    Furber, Steve

    2016-10-01

    Neuromorphic computing covers a diverse range of approaches to information processing all of which demonstrate some degree of neurobiological inspiration that differentiates them from mainstream conventional computing systems. The philosophy behind neuromorphic computing has its origins in the seminal work carried out by Carver Mead at Caltech in the late 1980s. This early work influenced others to carry developments forward, and advances in VLSI technology supported steady growth in the scale and capability of neuromorphic devices. Recently, a number of large-scale neuromorphic projects have emerged, taking the approach to unprecedented scales and capabilities. These large-scale projects are associated with major new funding initiatives for brain-related research, creating a sense that the time and circumstances are right for progress in our understanding of information processing in the brain. In this review we present a brief history of neuromorphic engineering then focus on some of the principal current large-scale projects, their main features, how their approaches are complementary and distinct, their advantages and drawbacks, and highlight the sorts of capabilities that each can deliver to neural modellers.

  18. Planned NLM/AHCPR large-scale vocabulary test: using UMLS technology to determine the extent to which controlled vocabularies cover terminology needed for health care and public health.

    PubMed Central

    Humphreys, B L; Hole, W T; McCray, A T; Fitzmaurice, J M

    1996-01-01

    The National Library of Medicine (NLM) and the Agency for Health Care Policy and Research (AHCPR) are sponsoring a test to determine the extent to which a combination of existing health-related terminologies covers the vocabulary needed in health information systems. The test vocabularies are the 30 that are fully or partially represented in the 1996 edition of the Unified Medical Language System (UMLS) Metathesaurus, plus three planned additions: the portions of SNOMED International not in the 1996 Metathesaurus; the Read Clinical Classification; and the Logical Observation Identifiers, Names, and Codes (LOINC) system. These vocabularies are available to testers through a special interface to the Internet-based UMLS Knowledge Source Server. The test will determine the ability of the test vocabularies to serve as a source of controlled vocabulary for health data systems and applications. It should provide the basis for realistic resource estimates for developing and maintaining a comprehensive "standard" health vocabulary that is based on existing terminologies. PMID:8816351
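
    The coverage question at the heart of such a test can be sketched as simple set arithmetic over term lists: what fraction of the terms a health data system needs already appears in the combined vocabularies? The vocabularies and terms below are invented placeholders, not actual UMLS content:

    ```python
    # Coverage sketch: fraction of needed terms found in a union of
    # controlled vocabularies. All term lists here are illustrative.
    metathesaurus = {"myocardial infarction", "hypertension", "asthma",
                     "diabetes mellitus"}
    snomed_extra = {"chest pain", "dyspnea"}     # portions not in Metathesaurus
    loinc = {"serum glucose", "hemoglobin a1c"}
    combined = metathesaurus | snomed_extra | loinc

    needed = {"hypertension", "chest pain", "serum glucose",
              "tension headache", "dyspnea"}     # terms a system must encode
    covered = needed & combined
    coverage = len(covered) / len(needed)
    print(coverage)   # 0.8: four of five needed terms are covered
    ```

    A real test must also handle synonymy and lexical variation (e.g. matching "heart attack" to "myocardial infarction"), which exact set membership ignores.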

  19. Development of the adult and child complementary medicine questionnaires fielded on the National Health Interview Survey

    PubMed Central

    2013-01-01

    The 2002, 2007, and 2012 complementary medicine questionnaires fielded on the National Health Interview Survey provide the most comprehensive data on complementary medicine available for the United States. They filled the void for large-scale, nationally representative, publicly available datasets on the out-of-pocket costs, prevalence, and reasons for use of complementary medicine in the U.S. Despite their wide use, this is the first article describing the multi-faceted and largely qualitative processes undertaken to develop the surveys. We hope this in-depth description enables policy makers and researchers to better judge the content validity and utility of the questionnaires and their resultant publications. PMID:24267412

  20. Development of the adult and child complementary medicine questionnaires fielded on the National Health Interview Survey.

    PubMed

    Stussman, Barbara J; Bethell, Christina D; Gray, Caroline; Nahin, Richard L

    2013-11-23

    The 2002, 2007, and 2012 complementary medicine questionnaires fielded on the National Health Interview Survey provide the most comprehensive data on complementary medicine available for the United States. They filled the void for large-scale, nationally representative, publicly available datasets on the out-of-pocket costs, prevalence, and reasons for use of complementary medicine in the U.S. Despite their wide use, this is the first article describing the multi-faceted and largely qualitative processes undertaken to develop the surveys. We hope this in-depth description enables policy makers and researchers to better judge the content validity and utility of the questionnaires and their resultant publications.

  1. The 2013 Canadian Forces Mental Health Survey

    PubMed Central

    Bennett, Rachel E.; Boulos, David; Garber, Bryan G.; Jetly, Rakesh; Sareen, Jitender

    2016-01-01

    Objective: The 2013 Canadian Forces Mental Health Survey (CFMHS) collected detailed information on mental health problems, their impacts, occupational and nonoccupational determinants of mental health, and the use of mental health services from a random sample of 8200 serving personnel. The objective of this article is to provide a firm scientific foundation for understanding and interpreting the CFMHS findings. Methods: This narrative review first provides a snapshot of the Canadian Armed Forces (CAF), focusing on 2 key determinants of mental health: the deployment of more than 40,000 personnel in support of the mission in Afghanistan and the extensive renewal of the CAF mental health system. The findings of recent population-based CAF mental health research are reviewed, with a focus on findings from the very similar mental health survey done in 2002. Finally, key aspects of the methods of the 2013 CFMHS are presented. Results: The findings of 20 peer-reviewed publications using the 2002 mental health survey data are reviewed, along with those of 25 publications from other major CAF mental health research projects executed over the past decade. Conclusions: More than a decade of population-based mental health research in the CAF has provided a detailed picture of its mental health and use of mental health services. This knowledge base and the homology of the 2013 survey with the 2002 CAF survey and general population surveys in 2002 and 2012 will provide an unusual opportunity to use the CFMHS to situate mental health in the CAF in a historical and societal perspective. PMID:27270738

  2. Multitree Algorithms for Large-Scale Astrostatistics

    NASA Astrophysics Data System (ADS)

    March, William B.; Ozakin, Arkadas; Lee, Dongryeol; Riegel, Ryan; Gray, Alexander G.

    2012-03-01

    this number every week, resulting in billions of objects. At such scales, even linear-time analysis operations present challenges, particularly since statistical analyses are inherently interactive processes, requiring that computations complete within some reasonable human attention span. The quadratic (or worse) runtimes of straightforward implementations quickly become unbearable. Examples of applications. These analysis subroutines occur ubiquitously in astrostatistical work. We list just a few examples. The need to cross-match objects across different catalogs has led to various algorithms, which at some point perform an AllNN computation. 2-point and higher-order spatial correlations form the basis of spatial statistics, and are utilized in astronomy to compare the spatial structures of two datasets, such as an observed sample and a theoretical sample, forming the basis for two-sample hypothesis testing. Friends-of-friends clustering is often used to identify halos in data from astrophysical simulations. Minimum spanning tree properties have also been proposed as statistics of large-scale structure. Comparison of the distributions of different kinds of objects requires accurate density estimation, for which KDE is the overall statistical method of choice. The prediction of redshifts from optical data requires accurate regression, for which kernel regression is a powerful method. The identification of objects of various types in astronomy, such as stars versus galaxies, requires accurate classification, for which KDA is a powerful method. Overview. In this chapter, we will briefly sketch the main ideas behind recent fast algorithms which achieve, for example, linear runtimes for pairwise-distance problems, or similarly dramatic reductions in computational growth. In some cases, the runtime orders for these algorithms are mathematically provable statements, while in others we have only conjectures backed by experimental observations for the time being.
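    The pairwise-distance problems described above can be illustrated with a small sketch (assuming NumPy and SciPy are available): estimating a two-point correlation function with tree-based pair counting via scipy.spatial.cKDTree rather than an explicit O(n²) distance loop. The natural estimator DD/RR − 1 used here is one simple choice among several in practice, and the uniform toy samples are purely illustrative:

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
data = rng.uniform(0, 1, size=(2000, 3))  # "observed" points in a unit cube
rand = rng.uniform(0, 1, size=(2000, 3))  # random comparison sample

def two_point_xi(data, rand, radii):
    """Natural estimator xi(r) = DD/RR - 1 using tree-based pair counting."""
    dtree, rtree = cKDTree(data), cKDTree(rand)
    # cumulative ordered-pair counts within each radius; self-pairs are
    # included in both DD and RR and approximately cancel in the ratio
    dd = dtree.count_neighbors(dtree, radii).astype(float)
    rr = rtree.count_neighbors(rtree, radii).astype(float)
    n_d, n_r = len(data), len(rand)  # rescale for unequal sample sizes
    return (dd / (n_d * n_d)) / (rr / (n_r * n_r)) - 1.0

radii = np.array([0.05, 0.1, 0.2])
xi = two_point_xi(data, rand, radii)
```

    Since both samples are uniform here, xi(r) comes out close to zero at every radius; a clustered observed sample would give positive values on the clustering scale.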

  3. Experimental Simulations of Large-Scale Collisions

    NASA Technical Reports Server (NTRS)

    Housen, Kevin R.

    2002-01-01

    This report summarizes research on the effects of target porosity on the mechanics of impact cratering. Impact experiments conducted on a centrifuge provide direct simulations of large-scale cratering on porous asteroids. The experiments show that large craters in porous materials form mostly by compaction, with essentially no deposition of material into the ejecta blanket that is a signature of cratering in less-porous materials. The ratio of ejecta mass to crater mass is shown to decrease with increasing crater size or target porosity. These results are consistent with the observation that large closely-packed craters on asteroid Mathilde appear to have formed without degradation to earlier craters.

  4. Large-Scale PV Integration Study

    SciTech Connect

    Lu, Shuai; Etingov, Pavel V.; Diao, Ruisheng; Ma, Jian; Samaan, Nader A.; Makarov, Yuri V.; Guo, Xinxin; Hafen, Ryan P.; Jin, Chunlian; Kirkham, Harold; Shlatz, Eugene; Frantzis, Lisa; McClive, Timothy; Karlson, Gregory; Acharya, Dhruv; Ellis, Abraham; Stein, Joshua; Hansen, Clifford; Chadliev, Vladimir; Smart, Michael; Salgo, Richard; Sorensen, Rahn; Allen, Barbara; Idelchik, Boris

    2011-07-29

    This research effort evaluates the impact of large-scale photovoltaic (PV) and distributed generation (DG) output on NV Energy’s electric grid system in southern Nevada. It analyzes the ability of NV Energy’s generation to accommodate increasing amounts of utility-scale PV and DG, and the resulting cost of integrating variable renewable resources. The study was jointly funded by the United States Department of Energy and NV Energy, and conducted by a project team comprised of industry experts and research scientists from Navigant Consulting Inc., Sandia National Laboratories, Pacific Northwest National Laboratory and NV Energy.

  5. What is a large-scale dynamo?

    NASA Astrophysics Data System (ADS)

    Nigro, G.; Pongkitiwanichakul, P.; Cattaneo, F.; Tobias, S. M.

    2017-01-01

    We consider kinematic dynamo action in a sheared helical flow at moderate to high values of the magnetic Reynolds number (Rm). We find exponentially growing solutions which, for large enough shear, take the form of a coherent part embedded in incoherent fluctuations. We argue that at large Rm large-scale dynamo action should be identified by the presence of structures coherent in time, rather than those at large spatial scales. We further argue that although the growth rate is determined by small-scale processes, the period of the coherent structures is set by mean-field considerations.

  6. Neutrinos and large-scale structure

    SciTech Connect

    Eisenstein, Daniel J.

    2015-07-15

    I review the use of cosmological large-scale structure to measure properties of neutrinos and other relic populations of light relativistic particles. With experiments to measure the anisotropies of the cosmic microwave background and the clustering of matter at low redshift, we now have securely measured a relativistic background with density appropriate to the cosmic neutrino background. Our limits on the mass of the neutrino continue to shrink. Experiments coming in the next decade will greatly improve the available precision on searches for the energy density of novel relativistic backgrounds and the mass of neutrinos.

  7. Large-scale planar lightwave circuits

    NASA Astrophysics Data System (ADS)

    Bidnyk, Serge; Zhang, Hua; Pearson, Matt; Balakrishnan, Ashok

    2011-01-01

    By leveraging advanced wafer processing and flip-chip bonding techniques, we have succeeded in hybrid integrating a myriad of active optical components, including photodetectors and laser diodes, with our planar lightwave circuit (PLC) platform. We have combined hybrid integration of active components with monolithic integration of other critical functions, such as diffraction gratings, on-chip mirrors, mode-converters, and thermo-optic elements. Further process development has led to the integration of polarization controlling functionality. Most recently, all these technological advancements have been combined to create large-scale planar lightwave circuits that comprise hundreds of optical elements integrated on chips less than a square inch in size.

  8. Colloquium: Large scale simulations on GPU clusters

    NASA Astrophysics Data System (ADS)

    Bernaschi, Massimo; Bisson, Mauro; Fatica, Massimiliano

    2015-06-01

    Graphics processing units (GPU) are currently used as a cost-effective platform for computer simulations and big-data processing. Large scale applications require that multiple GPUs work together but the efficiency obtained with cluster of GPUs is, at times, sub-optimal because the GPU features are not exploited at their best. We describe how it is possible to achieve an excellent efficiency for applications in statistical mechanics, particle dynamics and networks analysis by using suitable memory access patterns and mechanisms like CUDA streams, profiling tools, etc. Similar concepts and techniques may be applied also to other problems like the solution of Partial Differential Equations.

  9. Large scale phononic metamaterials for seismic isolation

    SciTech Connect

    Aravantinos-Zafiris, N.; Sigalas, M. M.

    2015-08-14

    In this work, we numerically examine structures that could be characterized as large scale phononic metamaterials. These novel structures could have band gaps in the frequency spectrum of seismic waves when their dimensions are chosen appropriately, suggesting that they could be serious candidates for seismic isolation structures. Different and easy-to-fabricate structures were examined, made from construction materials such as concrete and steel. The well-known finite difference time domain method is used in our calculations in order to calculate the band structures of the proposed metamaterials.

  10. Large-scale Heterogeneous Network Data Analysis

    DTIC Science & Technology

    2012-07-31

    Data for Multi-Player Influence Maximization on Social Networks." KDD 2012 (Demo). Po-Tzu Chang, Yen-Chieh Huang, Cheng-Lun Yang, Shou-De Lin, Pu-Jen Cheng. "Learning-Based Time-Sensitive Re-Ranking for Web Search." SIGIR 2012 (poster). Hung-Che Lai, Cheng-Te Li, Yi-Chen Lo, and Shou-De Lin. "Exploiting and Evaluating MapReduce for Large-Scale Graph Mining." ASONAM 2012 (Full, 16% acceptance ratio). Hsun-Ping Hsieh, Cheng-Te Li, and Shou

  11. Pregnancy and Birth Survey of the Fukushima Health Management Survey.

    PubMed

    Ishii, Kayoko; Goto, Aya; Ota, Misao; Yasumura, Seiji; Fujimori, Keiya

    2017-03-01

    The Pregnancy and Birth Survey was started by Fukushima Medical University as part of the Fukushima Health Management Survey in 2011 in order to assess the physical and mental health of mothers and provide parenting support (telephone counseling) for those in need. The present study reviewed the major findings from 4 annual surveys conducted from 2011 to 2014. Overall proportions of preterm deliveries, low birth weight infants, and congenital anomalies in the first year were almost the same as those in national surveillance data. The prevalence of depressive symptoms among the mothers held steady at about 25% over the 4 years. Regarding the content of parenting counseling, the proportion of mothers who voiced concerns about radiation decreased each year. This survey should be continued to provide support to mothers in Fukushima.

  12. The Cardiff health survey: teaching survey methodology by participation.

    PubMed

    Lewis, P A; Charny, M

    1987-01-01

    Medical students were taught survey methodology by participating in all phases of a large community survey. The survey examined health beliefs, knowledge and behaviour in a sample of 5150 people drawn from the electoral register of the City of Cardiff. The study achieved several educational objectives for the medical students: they met well people in their own homes and had an opportunity to get to know a community; by taking part in a study from the initial phases to the conclusion they could appreciate the context of the theoretical teaching they were being given concurrently in their undergraduate course; they learnt to analyse raw data and produce reports; and they gained insights into the health knowledge, behaviour, attitudes and beliefs of a population. In addition, the survey produced a substantial quantity of valuable data which staff and students are analysing and intend to publish.

  13. Retention of memory for large-scale spaces.

    PubMed

    Ishikawa, Toru

    2013-01-01

    This study empirically examined the retention of large-scale spatial memory, taking different types of spatial knowledge and levels of sense of direction into consideration. A total of 38 participants learned a route from a video and conducted spatial tasks immediately after learning the route and after 2 weeks or 3 months had passed. Results showed that spatial memory decayed over time, at a faster rate for the first 2-week period than for the subsequent period of up to 3 months, although it was not completely forgotten even after 3 months. The rate of forgetting differed depending on the type of knowledge, with landmark and route knowledge deteriorating at a much faster rate than survey knowledge. Sense of direction affected both the acquisition and the retention of survey knowledge. Survey knowledge by people with a good sense of direction was more accurate and decayed much less than that by people with a poor sense of direction.

  14. Multidisciplinary eHealth Survey Evaluation Methods

    ERIC Educational Resources Information Center

    Karras, Bryant T.; Tufano, James T.

    2006-01-01

    This paper describes the development process of an evaluation framework for describing and comparing web survey tools. We believe that this approach will help shape the design, development, deployment, and evaluation of population-based health interventions. A conceptual framework for describing and evaluating web survey systems will enable the…

  15. Determination of the Average Native Background and the Light-Induced EPR Signals and their Variation in the Teeth Enamel Based on Large-Scale Survey of the Population.

    PubMed

    Ivannikov, Alexander I; Khailov, Artem M; Orlenko, Sergey P; Skvortsov, Valeri G; Stepanenko, Valeri F; Zhumadilov, Kassym Sh; Williams, Benjamin B; Flood, Ann B; Swartz, Harold M

    2016-12-01

    The aim of the study is to determine the average intensity and variation of the native background signal amplitude (NSA) and of the solar light-induced signal amplitude (LSA) in electron paramagnetic resonance (EPR) spectra of tooth enamel for different kinds of teeth and different groups of people. These values are necessary for determination of the intensity of the radiation-induced signal amplitude (RSA) by subtraction of the expected NSA and LSA from the total signal amplitude measured in L-band for in vivo EPR dosimetry. Variation of these signals should be taken into account when estimating the uncertainty of the estimated RSA. A new analysis of several hundred EPR spectra that were measured earlier at X-band in a large-scale examination of the population of the Central Russia was performed. Based on this analysis, the average values and the variation (standard deviation, SD) of the amplitude of the NSA for the teeth from different positions, as well as LSA in outer enamel of the front teeth for different population groups, were determined. To convert data acquired at X-band to values corresponding to the conditions of measurement at L-band, the experimental dependencies of the intensities of the RSA, LSA and NSA on the m.w. power, measured at both X and L-band, were analysed. For the two central upper incisors, which are mainly used in in vivo dosimetry, the mean LSA annual rate induced only in the outer side enamel and its variation were obtained as 10 ± 2 (SD = 8) mGy y(-1), the same for X- and L-bands (results are presented as the mean ± error of mean). Mean NSA in enamel and its variation for the upper incisors was calculated at 2.0 ± 0.2 (SD = 0.5) Gy, relative to the calibrated RSA dose-response to gamma radiation measured under non-power saturation conditions at X-band. Assuming the same value for L-band under non-power saturating conditions, then for in vivo measurements at L-band at 25 mW (power saturation conditions), a mean NSA and its
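    The dose-reconstruction arithmetic described above — subtracting the expected native (NSA) and light-induced (LSA) contributions from the total measured signal, with their variations feeding the uncertainty — can be sketched as follows. This is an illustrative calculation, not the authors' actual procedure: the default NSA and LSA values are the means reported in the abstract, and combining standard deviations in quadrature is an assumption of independent errors:

```python
import math

def estimate_rsa(total_gy, total_sd, nsa_gy=2.0, nsa_sd=0.5,
                 lsa_rate_gy_per_yr=0.010, lsa_rate_sd=0.008, age_yr=30.0):
    """Estimate the radiation-induced signal amplitude (RSA) dose by
    subtracting the expected native background (NSA) and accumulated
    light-induced (LSA) contributions from the total measured dose,
    combining the standard deviations in quadrature (assumed independent)."""
    lsa = lsa_rate_gy_per_yr * age_yr       # accumulated light-induced dose
    lsa_sd = lsa_rate_sd * age_yr
    rsa = total_gy - nsa_gy - lsa
    rsa_sd = math.sqrt(total_sd**2 + nsa_sd**2 + lsa_sd**2)
    return rsa, rsa_sd

# hypothetical tooth measured at 3.0 +/- 0.3 Gy total for a 30-year-old donor
rsa, rsa_sd = estimate_rsa(3.0, 0.3)
```

    With these illustrative numbers, the background terms (2.0 Gy NSA plus 0.3 Gy accumulated LSA) dominate the subtraction, which is why their variation must be included in the uncertainty of the reconstructed dose.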

  16. Large-scale Intelligent Transportation Systems simulation

    SciTech Connect

    Ewing, T.; Canfield, T.; Hannebutte, U.; Levine, D.; Tentner, A.

    1995-06-01

    A prototype computer system has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS) capable of running on massively parallel computers and distributed (networked) computer systems. The prototype includes the modelling of instrumented "smart" vehicles with in-vehicle navigation units capable of optimal route planning, and of Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (displaying the position and attributes of instrumented vehicles), and can provide 2-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on ANL's IBM SP-X parallel computer system for large-scale problems. A novel feature of our design is that vehicles will be represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.

  17. Large-scale Globally Propagating Coronal Waves.

    PubMed

    Warmuth, Alexander

    Large-scale, globally propagating wave-like disturbances have been observed in the solar chromosphere and by inference in the corona since the 1960s. However, detailed analysis of these phenomena has only been conducted since the late 1990s. This was prompted by the availability of high-cadence coronal imaging data from numerous space-based instruments, which routinely show spectacular globally propagating bright fronts. Coronal waves, as these perturbations are usually referred to, have now been observed in a wide range of spectral channels, yielding a wealth of information. Many findings have supported the "classical" interpretation of the disturbances: fast-mode MHD waves or shocks that are propagating in the solar corona. However, observations that seemed inconsistent with this picture have stimulated the development of alternative models in which "pseudo waves" are generated by magnetic reconfiguration in the framework of an expanding coronal mass ejection. This has resulted in a vigorous debate on the physical nature of these disturbances. This review focuses on demonstrating how the numerous observational findings of the last one and a half decades can be used to constrain our models of large-scale coronal waves, and how a coherent physical understanding of these disturbances is finally emerging.

  18. Quantifying expert consensus against the existence of a secret, large-scale atmospheric spraying program

    NASA Astrophysics Data System (ADS)

    Shearer, Christine; West, Mick; Caldeira, Ken; Davis, Steven J.

    2016-08-01

    Nearly 17% of people in an international survey said they believed the existence of a secret large-scale atmospheric program (SLAP) to be true or partly true. SLAP is commonly referred to as ‘chemtrails’ or ‘covert geoengineering’, and has led to a number of websites purporting to show evidence of widespread chemical spraying linked to negative impacts on human health and the environment. To address these claims, we surveyed two groups of experts—atmospheric chemists with expertise in condensation trails and geochemists working on atmospheric deposition of dust and pollution—to scientifically evaluate for the first time the claims of SLAP theorists. Results show that 76 of the 77 scientists (98.7%) that took part in this study said they had not encountered evidence of a SLAP, and that the data cited as evidence could be explained through other factors, including well-understood physics and chemistry associated with aircraft contrails and atmospheric aerosols. Our goal is not to sway those already convinced that there is a secret, large-scale spraying program—who often reject counter-evidence as further proof of their theories—but rather to establish a source of objective science that can inform public discourse.

  19. Large-scale data mining pilot project in human genome

    SciTech Connect

    Musick, R.; Fidelis, R.; Slezak, T.

    1997-05-01

    This whitepaper briefly describes a new, aggressive effort in large-scale data mining at Livermore National Labs. The implications of "large-scale" will be clarified in a later section. In the short term, this effort will focus on several mission-critical questions of the Genome project. We will adapt current data mining techniques to the Genome domain, quantify the accuracy of inference results, and lay the groundwork for a more extensive effort in large-scale data mining. A major aspect of the approach is that it will be coupled with a fully-staffed data warehousing effort in the human Genome area. The long term goal is a strong applications-oriented research program in large-scale data mining. The tools and skill set gained will be directly applicable to a wide spectrum of tasks involving large spatial and multidimensional data. This includes applications in ensuring non-proliferation, stockpile stewardship, enabling Global Ecology (Materials Database Industrial Ecology), advancing the Biosciences (Human Genome Project), and supporting data for others (Battlefield Management, Health Care).

  20. Large-scale structure in the universe. Proceedings. Conference, London (UK), 25 - 26 Mar 1998.

    NASA Astrophysics Data System (ADS)

    1999-01-01

    The following topics were dealt with: Universe: large-scale structure, early Universe: quantum fluctuations, microwave background radiation studies, the Sloan Digital Sky Survey, the 2dF Galaxy Redshift Survey, galaxy clustering evolution, the CNOC2 Field Galaxy Redshift Survey, quasar clustering.

  1. Ethical Issues in School Health: A Survey.

    ERIC Educational Resources Information Center

    Richardson, Glenn E.; Jose, Nancy

    1983-01-01

    The need for a code of ethics for health educators is discussed, and results of a survey of school health educators' opinions on curriculum-related ethical issues are reported. Ethical issues of concern include use of scare tactics, efforts to change behavior and attitudes, and appropriate subject matter. (PP)

  2. California Community Colleges Health Services Survey.

    ERIC Educational Resources Information Center

    McIntyre, Chuck

    In 1990, a telephone survey was conducted of health services offered by California's community colleges. Statewide, 42 of the 71 districts in California levied a health service fee, 18 districts offered services without charge, and 11 offered no service. Districts operating programs collected an average of $15.81 in student fees per credit average…

  3. Hispanic Health Care Survey of Southeastern Wisconsin.

    ERIC Educational Resources Information Center

    Kvasnica, Barbara; And Others

    The results of a study on the health care needs and utilization patterns of Hispanic (primarily Mexican American) families in southeastern Wisconsin are presented in this report. The methodology of the study, which included two surveys in a 9 county area, is described. Findings of the two studies, one focusing on health services utilization by…

  4. The Impact of Large Scale Environments on Cluster Entropy Profiles

    NASA Astrophysics Data System (ADS)

    Trierweiler, Isabella; Su, Yuanyuan

    2017-01-01

    We perform a systematic analysis of 21 clusters imaged by the Suzaku satellite to determine the relation between the richness of cluster environments and entropy at large radii. Entropy profiles for clusters are expected to follow a power-law, but Suzaku observations show that the entropy profiles of many clusters are significantly flattened beyond 0.3 Rvir. While the entropy at the outskirts of clusters is thought to be highly dependent on the large scale cluster environment, the exact nature of the environment/entropy relation is unclear. Using the Sloan Digital Sky Survey and 6dF Galaxy Survey, we study the 20 Mpc large scale environment for all clusters in our sample. We find no strong relation between the entropy deviations at the virial radius and the total luminosity of the cluster surroundings, indicating that accretion and mergers have a more complex and indirect influence on the properties of the gas at large radii. We see a possible anti-correlation between virial temperature and richness of the cluster environment and find that density excess appears to play a larger role in the entropy flattening than temperature, suggesting that clumps of gas can lower entropy.

  5. Survey of health planning proposals.

    PubMed

    Krakauer, R S

    1994-02-01

    It is important that physicians participate in the debate and planning process that will ultimately guide how we reform the way health care is financed and delivered in the United States. Herein is offered a perspective on the problem, one which is not necessarily appreciated by health planners. While we deliver the best quality of care in the world to most of our population, our system has been severely criticized because we fail to provide for access to a substantial minority of our population. Additionally, the cost of the product is considerably greater than that in comparable countries. Attempts to control costs without diminishing quality have introduced expensive complexities into our system without any real success in cutting costs. Several proposals have been advanced to address the issues of cost and access. One of these is a single payer system, common in Europe and Canada, whereby a single agent or group of agents finances all health care through universal rules and means. A system operating in Hawaii is a simple employer mandate to provide health insurance. A uniquely American plan is the Jackson Hole Plan or Managed Competition (now called "Managed Cooperation"). This system is currently popular among national health planners, and involves a defined minimum managed health plan offered by various groups of providers to employees and individuals through health plan purchasing cooperatives. This plan is interesting, but has not been implemented in any jurisdiction, and it is not certain it would accomplish its goals in practice since it is difficult to predict behavior of all parties to such a system.

  6. Engineering management of large scale systems

    NASA Technical Reports Server (NTRS)

    Sanders, Serita; Gill, Tepper L.; Paul, Arthur S.

    1989-01-01

    The organization of high technology and engineering problem solving has given rise to an emerging concept. Reasoning principles for integrating traditional engineering problem solving with system theory, management sciences, behavioral decision theory, and planning and design approaches can be incorporated into a methodological approach to solving problems with a long range perspective. Long range planning has a great potential to improve productivity by using a systematic and organized approach. Thus, efficiency and cost effectiveness are the driving forces in promoting the organization of engineering problems. Aspects of systems engineering that provide an understanding of management of large scale systems are broadly covered here. Due to the focus and application of research, other significant factors (e.g., human behavior, decision making, etc.) are not emphasized but are considered.

  7. Large-scale parametric survival analysis.

    PubMed

    Mittal, Sushil; Madigan, David; Cheng, Jerry Q; Burd, Randall S

    2013-10-15

    Survival analysis has been a topic of active statistical research in the past few decades, with applications spread across several areas. Traditional applications usually consider data with only a small number of predictors and a few hundred or thousand observations. Recent advances in data acquisition techniques and computation power have led to considerable interest in analyzing very-high-dimensional data where the number of predictor variables and the number of observations range between 10^4 and 10^6. In this paper, we present a tool for performing large-scale regularized parametric survival analysis using a variant of the cyclic coordinate descent method. Through our experiments on two real data sets, we show that applying regularized models to high-dimensional data avoids overfitting and can provide improved predictive performance and calibration over corresponding low-dimensional models.
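    A minimal sketch of the general approach — cyclic coordinate descent for a regularized parametric survival model — might look like the following, assuming an exponential (constant-hazard) model with an L2 penalty and NumPy available. The authors' actual tool, model family, and update rules may differ; this only illustrates the one-coordinate-at-a-time structure of the method:

```python
import numpy as np

def exp_survival_cd(X, time, event, alpha=1.0, n_cycles=50):
    """Fit an exponential (constant-hazard) survival model with an L2 penalty
    by cyclic coordinate descent. Hazard for subject i is exp(X[i] @ beta);
    event[i] is 1 for an observed failure, 0 for a censored observation."""
    n, p = X.shape
    beta = np.zeros(p)
    eta = X @ beta  # linear predictor, kept incrementally up to date
    for _ in range(n_cycles):
        for j in range(p):
            mu = time * np.exp(eta)  # expected event count per subject
            grad = X[:, j] @ (event - mu) - alpha * beta[j]
            hess = -(X[:, j] ** 2) @ mu - alpha
            step = np.clip(grad / hess, -0.5, 0.5)  # damped Newton step
            beta[j] -= step
            eta -= X[:, j] * step  # O(n) update instead of recomputing X @ beta
    return beta

# Toy data: exponential event times whose hazard depends on two covariates,
# with independent exponential censoring.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))
true_beta = np.array([0.8, -0.5, 0.0])
t_event = rng.exponential(1.0 / np.exp(X @ true_beta))
t_censor = rng.exponential(2.0, size=500)
time = np.minimum(t_event, t_censor)
event = (t_event <= t_censor).astype(float)
beta_hat = exp_survival_cd(X, time, event)
```

    The key property exploited by coordinate descent at scale is the incremental update of the linear predictor: each coordinate move costs O(n) rather than O(np), which is what makes 10^4 to 10^6 predictors tractable.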

  8. Primer design for large scale sequencing.

    PubMed

    Haas, S; Vingron, M; Poustka, A; Wiemann, S

    1998-06-15

    We have developed PRIDE, a primer design program that automatically designs primers in single contigs or whole sequencing projects to extend the already known sequence and to double strand single-stranded regions. The program is fully integrated into the Staden package (GAP4) and accessible with a graphical user interface. PRIDE uses a fuzzy logic-based system to calculate primer qualities. The computational performance of PRIDE is enhanced by using suffix trees to store the huge amount of data being produced. A test set of 110 sequencing primers and 11 PCR primer pairs has been designed on genomic templates, cDNAs and sequences containing repetitive elements to analyze PRIDE's success rate. The high performance of PRIDE, combined with its minimal requirement of user interaction and its fast algorithm, make this program useful for the large scale design of primers, especially in large sequencing projects.
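    PRIDE's fuzzy-logic quality scoring is only summarized in the abstract above, but the flavor of such scoring can be sketched with standard primer heuristics. GC fraction and the Wallace-rule melting temperature are well-known rules of thumb; the triangular membership functions, their ranges, and the min-combination below are illustrative assumptions, not PRIDE's actual criteria:

```python
def gc_content(primer):
    """Fraction of G/C bases in the primer."""
    p = primer.upper()
    return (p.count("G") + p.count("C")) / len(p)

def wallace_tm(primer):
    """Rule-of-thumb melting temperature (Wallace rule): 2(A+T) + 4(G+C) degrees C.
    Only reasonable for short oligos (roughly 14-20 nt)."""
    p = primer.upper()
    at = p.count("A") + p.count("T")
    gc = p.count("G") + p.count("C")
    return 2 * at + 4 * gc

def triangular(x, lo, peak, hi):
    """Triangular fuzzy membership: 0 outside [lo, hi], rising to 1 at peak."""
    if x <= lo or x >= hi:
        return 0.0
    return (x - lo) / (peak - lo) if x < peak else (hi - x) / (hi - peak)

def primer_quality(primer):
    """Combine fuzzy memberships for GC content and Tm with min (fuzzy AND),
    using illustrative target ranges (GC 35-65% peaking at 50%; Tm 50-66 C
    peaking at 58 C)."""
    return min(triangular(gc_content(primer), 0.35, 0.5, 0.65),
               triangular(wallace_tm(primer), 50.0, 58.0, 66.0))

score = primer_quality("ACGTACGTACGTACGTAC")  # 18-mer, 50% GC, Tm 54 C
```

    Taking the minimum of the memberships means a primer is only as good as its worst property, which is the usual conjunction semantics in fuzzy scoring systems.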

  9. Large scale preparation of pure phycobiliproteins.

    PubMed

    Padgett, M P; Krogmann, D W

    1987-01-01

    This paper describes simple procedures for the purification of large amounts of phycocyanin and allophycocyanin from the cyanobacterium Microcystis aeruginosa. A homogeneous natural bloom of this organism provided hundreds of kilograms of cells. Large samples of cells were broken by freezing and thawing. Repeated extraction of the broken cells with distilled water released phycocyanin first, then allophycocyanin, and provides supporting evidence for the current models of phycobilisome structure. The very low ionic strength of the aqueous extracts allowed allophycocyanin release in a particulate form so that this protein could be easily concentrated by centrifugation. Other proteins in the extract were enriched and concentrated by large scale membrane filtration. The biliproteins were purified to homogeneity by chromatography on DEAE cellulose. Purity was established by HPLC and by N-terminal amino acid sequence analysis. The proteins were examined for stability at various pHs and exposures to visible light.

  10. Large-Scale Organization of Glycosylation Networks

    NASA Astrophysics Data System (ADS)

    Kim, Pan-Jun; Lee, Dong-Yup; Jeong, Hawoong

    2009-03-01

    Glycosylation is a highly complex process to produce a diverse repertoire of cellular glycans that are frequently attached to proteins and lipids. Glycans participate in fundamental biological processes including molecular trafficking and clearance, cell proliferation and apoptosis, developmental biology, immune response, and pathogenesis. N-linked glycans found on proteins are formed by sequential attachments of monosaccharides with the help of a relatively small number of enzymes. Many of these enzymes can accept multiple N-linked glycans as substrates, thus generating a large number of glycan intermediates and their intermingled pathways. Motivated by the quantitative methods developed in complex network research, we investigate the large-scale organization of such N-glycosylation pathways in a mammalian cell. The uncovered results give the experimentally-testable predictions for glycosylation process, and can be applied to the engineering of therapeutic glycoproteins.

  11. Efficient, large scale separation of coal macerals

    SciTech Connect

    Dyrkacz, G.R.; Bloomquist, C.A.A.

    1988-01-01

    The authors believe that the separation of macerals by continuous flow centrifugation offers a simple technique for the large scale separation of macerals. With relatively little cost (approximately $10K), it provides an opportunity for obtaining quite pure maceral fractions. Although they have not completely worked out all the nuances of this separation system, they believe that the problems they have indicated can be minimized to pose only minor inconvenience. It cannot be said that this system completely bypasses the disagreeable tedium or time involved in separating macerals, nor will it by itself overcome the mental inertia required to make maceral separation an accepted necessary fact in fundamental coal science. However, they find their particular brand of continuous flow centrifugation is considerably faster than sink/float separation, can provide a good quality product with even one separation cycle, and permits the handling of more material than a conventional sink/float centrifuge separation.

  12. Large scale cryogenic fluid systems testing

    NASA Technical Reports Server (NTRS)

    1992-01-01

    NASA Lewis Research Center's Cryogenic Fluid Systems Branch (CFSB) within the Space Propulsion Technology Division (SPTD) has the ultimate goal of enabling the long term storage and in-space fueling/resupply operations for spacecraft and reusable vehicles in support of space exploration. Using analytical modeling, ground based testing, and on-orbit experimentation, the CFSB is studying three primary categories of fluid technology: storage, supply, and transfer. The CFSB is also investigating fluid handling, advanced instrumentation, and tank structures and materials. Ground based testing of large-scale systems is done using liquid hydrogen as a test fluid at the Cryogenic Propellant Tank Facility (K-site) at Lewis' Plum Brook Station in Sandusky, Ohio. A general overview of tests involving liquid transfer, thermal control, pressure control, and pressurization is given.

  13. Large-scale optimization of neuron arbors

    NASA Astrophysics Data System (ADS)

    Cherniak, Christopher; Changizi, Mark; Won Kang, Du

    1999-05-01

    At the global as well as local scales, some of the geometry of types of neuron arbors (both dendrites and axons) appears to be self-organizing: Their morphogenesis behaves like flowing water, that is, fluid dynamically; waterflow in branching networks in turn acts like a tree composed of cords under tension, that is, vector mechanically. Branch diameters and angles and junction sites conform significantly to this model. The result is that such neuron tree samples globally minimize their total volume, rather than, for example, surface area or branch length. In addition, the arbors perform well at generating the cheapest topology interconnecting their terminals: their large-scale layouts are among the best of all such possible connecting patterns, coming within about 5% of optimum. This model also applies comparably to arterial and river networks.

  14. Grid sensitivity capability for large scale structures

    NASA Technical Reports Server (NTRS)

    Nagendra, Gopal K.; Wallerstein, David V.

    1989-01-01

    The considerations and the resultant approach used to implement design sensitivity capability for grids into a large scale, general purpose finite element system (MSC/NASTRAN) are presented. The design variables are grid perturbations with a rather general linking capability. Moreover, shape and sizing variables may be linked together. The design is general enough to facilitate geometric modeling techniques for generating design variable linking schemes in an easy and straightforward manner. Test cases have been run and validated by comparison with the overall finite difference method. The linking of a design sensitivity capability for shape variables in MSC/NASTRAN with an optimizer would give designers a powerful, automated tool to carry out practical optimization design of real life, complicated structures.

  15. Large Scale Quantum Simulations of Nuclear Pasta

    NASA Astrophysics Data System (ADS)

    Fattoyev, Farrukh J.; Horowitz, Charles J.; Schuetrumpf, Bastian

    2016-03-01

    Complex and exotic nuclear geometries collectively referred to as ``nuclear pasta'' are expected to naturally exist in the crust of neutron stars and in supernovae matter. Using a set of self-consistent microscopic nuclear energy density functionals we present the first results of large scale quantum simulations of pasta phases at baryon densities 0.03 < ρ < 0.10 fm⁻³, proton fractions 0.05

  16. Primer design for large scale sequencing.

    PubMed Central

    Haas, S; Vingron, M; Poustka, A; Wiemann, S

    1998-01-01

    We have developed PRIDE, a primer design program that automatically designs primers in single contigs or whole sequencing projects to extend the already known sequence and to double strand single-stranded regions. The program is fully integrated into the Staden package (GAP4) and accessible with a graphical user interface. PRIDE uses a fuzzy logic-based system to calculate primer qualities. The computational performance of PRIDE is enhanced by using suffix trees to store the huge amount of data being produced. A test set of 110 sequencing primers and 11 PCR primer pairs has been designed on genomic templates, cDNAs and sequences containing repetitive elements to analyze PRIDE's success rate. The high performance of PRIDE, combined with its minimal requirement of user interaction and its fast algorithm, makes this program useful for the large scale design of primers, especially in large sequencing projects. PMID:9611248
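
    The idea of scoring candidate primers with fuzzy membership functions can be illustrated with a much-simplified toy. The Wallace-rule melting temperature and the triangular memberships below are generic stand-ins and assumptions for illustration, not PRIDE's actual rules or thresholds:

    ```python
    # Toy primer-quality scoring, loosely in the spirit of a fuzzy-logic
    # quality function. All thresholds here are illustrative assumptions.

    def wallace_tm(primer: str) -> int:
        """Wallace rule: Tm ~ 2(A+T) + 4(G+C), reasonable for short primers."""
        at = primer.count("A") + primer.count("T")
        gc = primer.count("G") + primer.count("C")
        return 2 * at + 4 * gc

    def triangular(x: float, lo: float, peak: float, hi: float) -> float:
        """Fuzzy membership: 1.0 at `peak`, falling linearly to 0 at lo/hi."""
        if x <= lo or x >= hi:
            return 0.0
        if x <= peak:
            return (x - lo) / (peak - lo)
        return (hi - x) / (hi - peak)

    def primer_quality(primer: str) -> float:
        """Combine Tm and GC-content memberships (here: a simple product)."""
        tm_score = triangular(wallace_tm(primer), 50.0, 58.0, 66.0)
        gc = (primer.count("G") + primer.count("C")) / len(primer)
        gc_score = triangular(gc, 0.35, 0.50, 0.65)
        return tm_score * gc_score

    candidates = ["ATGCGTACGTTAGCGATCGA",   # balanced GC, moderate Tm
                  "AAAAAATTTTTTAAAAATTT",   # AT-only: Tm far too low
                  "GCGCGCGCGCGCGCGCGCGC"]   # GC-only: Tm far too high
    best = max(candidates, key=primer_quality)
    print(best)  # the balanced primer scores highest
    ```

    A real design tool would add further criteria (self-complementarity, uniqueness against the template, 3' stability) as additional fuzzy terms.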

  17. Large scale study of tooth enamel

    SciTech Connect

    Bodart, F.; Deconninck, G.; Martin, M.Th.

    1981-04-01

    Human tooth enamel contains traces of foreign elements. The presence of these elements is related to the history and the environment of the human body and can be considered as the signature of perturbations which occur during the growth of a tooth. A map of the distribution of these traces in a large-scale sample of the population will constitute a reference for further investigations of environmental effects. One hundred eighty samples of teeth were first analysed using PIXE, backscattering and nuclear reaction techniques. The results were analysed using statistical methods. Correlations between O, F, Na, P, Ca, Mn, Fe, Cu, Zn, Pb and Sr were observed and cluster analysis was in progress. The techniques described in the present work have been developed in order to establish a method for the exploration of very large samples of the Belgian population.

  18. Modeling the Internet's large-scale topology

    PubMed Central

    Yook, Soon-Hyung; Jeong, Hawoong; Barabási, Albert-László

    2002-01-01

    Network generators that capture the Internet's large-scale topology are crucial for the development of efficient routing protocols and modeling Internet traffic. Our ability to design realistic generators is limited by the incomplete understanding of the fundamental driving forces that affect the Internet's evolution. By combining several independent databases capturing the time evolution, topology, and physical layout of the Internet, we identify the universal mechanisms that shape the Internet's router and autonomous system level topology. We find that the physical layout of nodes forms a fractal set, determined by population density patterns around the globe. The placement of links is driven by competition between preferential attachment and linear distance dependence, a marked departure from the currently used exponential laws. The universal parameters that we extract significantly restrict the class of potentially correct Internet models and indicate that the networks created by all available topology generators are fundamentally different from the current Internet. PMID:12368484
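
    The link-placement mechanism described above, preferential attachment competing with a linear distance cost, can be sketched as a toy generator. The seed geometry, node count, and attachment weight are illustrative assumptions, not the paper's fitted parameters:

    ```python
    import math
    import random

    # Toy spatial growth model: each new node at a random 2-D position links
    # to an existing node with probability proportional to degree / distance
    # (preferential attachment vs. linear distance cost). Illustrative only.

    random.seed(42)

    nodes = [(random.random(), random.random()) for _ in range(3)]  # seed nodes
    degree = [1, 1, 1]

    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1]) + 1e-9  # avoid div-by-zero

    for _ in range(200):
        new = (random.random(), random.random())
        # Attachment weight: degree over linear distance.
        weights = [degree[i] / dist(new, nodes[i]) for i in range(len(nodes))]
        target = random.choices(range(len(nodes)), weights=weights, k=1)[0]
        degree[target] += 1
        nodes.append(new)
        degree.append(1)

    print(max(degree))  # a few early, well-placed nodes acquire high degree
    ```

    Replacing the 1/distance factor with an exponential decay reproduces the "exponential laws" the paper argues against, which makes this a convenient sandbox for comparing the two.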

  19. Supporting large-scale computational science

    SciTech Connect

    Musick, R

    1998-10-01

    A study has been carried out to determine the feasibility of using commercial database management systems (DBMSs) to support large-scale computational science. Conventional wisdom in the past has been that DBMSs are too slow for such data. Several events over the past few years have muddied the clarity of this mindset: (1) several commercial DBMS systems have demonstrated storage of, and ad-hoc query access to, Terabyte data sets; (2) several large-scale science teams, such as EOSDIS [NAS91], high energy physics [MM97] and human genome [Kin93], have adopted (or make frequent use of) commercial DBMS systems as the central part of their data management scheme; (3) several major DBMS vendors have introduced their first object-relational products (ORDBMSs), which have the potential to support large, array-oriented data; and (4) in some cases, performance is a moot issue, in particular if the performance of legacy applications is not reduced while new, albeit slow, capabilities are added to the system. The basic assessment is still that DBMSs do not scale to large computational data. However, many of the reasons have changed, and there is an expiration date attached to that prognosis. This document expands on this conclusion, identifies the advantages and disadvantages of various commercial approaches, and describes the studies carried out in exploring this area. The document is meant to be brief, technical and informative, rather than a motivational pitch. The conclusions within are very likely to become outdated within the next 5-7 years, as market forces will have a significant impact on the state of the art in scientific data management over the next decade.

  20. Supporting large-scale computational science

    SciTech Connect

    Musick, R., LLNL

    1998-02-19

    Business needs have driven the development of commercial database systems since their inception. As a result, there has been a strong focus on supporting many users, minimizing the potential corruption or loss of data, and maximizing performance metrics like transactions per second, or TPC-C and TPC-D results. It turns out that these optimizations have little to do with the needs of the scientific community, and in particular have little impact on improving the management and use of large-scale high-dimensional data. At the same time, there is an unanswered need in the scientific community for many of the benefits offered by a robust DBMS. For example, tying an ad-hoc query language such as SQL together with a visualization toolkit would be a powerful enhancement to current capabilities. Unfortunately, there has been little emphasis or discussion in the VLDB community on this mismatch over the last decade. The goal of the paper is to identify the specific issues that need to be resolved before large-scale scientific applications can make use of DBMS products. This topic is addressed in the context of an evaluation of commercial DBMS technology applied to the exploration of data generated by the Department of Energy's Accelerated Strategic Computing Initiative (ASCI). The paper describes the data being generated for ASCI as well as current capabilities for interacting with and exploring this data. The attraction of applying standard DBMS technology to this domain is discussed, as well as the technical and business issues that currently make this an infeasible solution.

  1. Introducing Large-Scale Innovation in Schools

    NASA Astrophysics Data System (ADS)

    Sotiriou, Sofoklis; Riviou, Katherina; Cherouvis, Stephanos; Chelioti, Eleni; Bogner, Franz X.

    2016-08-01

    Education reform initiatives tend to promise higher effectiveness in classrooms especially when emphasis is given to e-learning and digital resources. Practical changes in classroom realities or school organization, however, are lacking. A major European initiative entitled Open Discovery Space (ODS) examined the challenge of modernizing school education via a large-scale implementation of an open-scale methodology in using technology-supported innovation. The present paper describes this innovation scheme which involved schools and teachers all over Europe, embedded technology-enhanced learning into wider school environments and provided training to teachers. Our implementation scheme consisted of three phases: (1) stimulating interest, (2) incorporating the innovation into school settings and (3) accelerating the implementation of the innovation. The scheme's impact was monitored for a school year using five indicators: leadership and vision building, ICT in the curriculum, development of ICT culture, professional development support, and school resources and infrastructure. Based on about 400 schools, our study produced four results: (1) The growth in digital maturity was substantial, even for previously high scoring schools. This was even more important for indicators such as "vision and leadership" and "professional development." (2) The evolution of networking is presented graphically, showing the gradual growth of connections achieved. (3) These communities became core nodes, involving numerous teachers in sharing educational content and experiences: One out of three registered users (36 %) has shared his/her educational resources in at least one community. (4) Satisfaction scores ranged from 76 % (offer of useful support through teacher academies) to 87 % (good environment to exchange best practices). Initiatives such as ODS add substantial value to schools on a large scale.

  2. Large-scale sequential quadratic programming algorithms

    SciTech Connect

    Eldersveld, S.K.

    1992-09-01

    The problem addressed is the general nonlinear programming problem: finding a local minimizer for a nonlinear function subject to a mixture of nonlinear equality and inequality constraints. The methods studied are in the class of sequential quadratic programming (SQP) algorithms, which have previously proved successful for problems of moderate size. Our goal is to devise an SQP algorithm that is applicable to large-scale optimization problems, using sparse data structures and storing less curvature information but maintaining the property of superlinear convergence. The main features are: 1. The use of a quasi-Newton approximation to the reduced Hessian of the Lagrangian function. Only an estimate of the reduced Hessian matrix is required by our algorithm. The impact of not having available the full Hessian approximation is studied and alternative estimates are constructed. 2. The use of a transformation matrix Q. This allows the QP gradient to be computed easily when only the reduced Hessian approximation is maintained. 3. The use of a reduced-gradient form of the basis for the null space of the working set. This choice of basis is more practical than an orthogonal null-space basis for large-scale problems. The continuity condition for this choice is proven. 4. The use of incomplete solutions of quadratic programming subproblems. Certain iterates generated by an active-set method for the QP subproblem are used in place of the QP minimizer to define the search direction for the nonlinear problem. An implementation of the new algorithm has been obtained by modifying the code MINOS. Results and comparisons with MINOS and NPSOL are given for the new algorithm on a set of 92 test problems.
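
    The core SQP step, solving a quadratic subproblem via the KKT system, can be sketched for a toy equality-constrained problem. This minimal dense-linear-algebra version is an illustration only; it omits the features the abstract actually studies (quasi-Newton reduced-Hessian approximations, sparse data structures, inequality handling, incomplete QP solutions):

    ```python
    import numpy as np

    # Toy equality-constrained SQP: minimize x0^2 + x1^2  s.t.  x0 + x1 = 1.
    # Each iteration solves the Newton/KKT system of the QP subproblem:
    #   [H  A^T] [p  ]   [-g]
    #   [A   0 ] [lam] = [-c]

    def f_grad(x):
        return 2.0 * x          # gradient of f(x) = x0^2 + x1^2

    def f_hess(x):
        return 2.0 * np.eye(2)  # exact Hessian (a quasi-Newton estimate in practice)

    def c(x):
        return np.array([x[0] + x[1] - 1.0])  # equality constraint c(x) = 0

    def c_jac(x):
        return np.array([[1.0, 1.0]])         # constraint Jacobian A

    x = np.array([3.0, -1.0])                 # infeasible starting point
    for _ in range(10):
        g, H, A, r = f_grad(x), f_hess(x), c_jac(x), c(x)
        kkt = np.block([[H, A.T], [A, np.zeros((1, 1))]])
        sol = np.linalg.solve(kkt, np.concatenate([-g, -r]))
        x = x + sol[:2]                       # take the full SQP step

    print(x)  # converges to (0.5, 0.5)
    ```

    For this quadratic objective with a linear constraint a single step already lands on the minimizer; the interesting algorithmic work in the abstract is precisely about retaining fast convergence when the exact Hessian is too expensive to form.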

  3. Large scale structure of the globular cluster population in Coma

    NASA Astrophysics Data System (ADS)

    Gagliano, Alexander T.; O'Neill, Conor; Madrid, Juan P.

    2016-01-01

    A search for globular cluster candidates in the Coma Cluster was carried out using Hubble Space Telescope data taken with the Advanced Camera for Surveys. We combine different observing programs including the Coma Treasury Survey in order to obtain the large scale distribution of globular clusters in Coma. Globular cluster candidates were selected through careful morphological inspection and a detailed analysis of their magnitude and colors in the two available wavebands, F475W (Sloan g) and F814W (I). Color Magnitude Diagrams, radial density plots and density maps were then created to characterize the globular cluster population in Coma. Preliminary results reveal the structure of the intergalactic globular cluster system throughout Coma, drawn from one of the largest globular cluster catalogues to date. The spatial distribution of globular clusters shows clear overdensities, or bridges, between Coma galaxies. It also becomes evident that galaxies of similar luminosity have vastly different numbers of associated globular clusters.

  4. Testing LSST Dither Strategies for Large-scale Structure Systematics

    NASA Astrophysics Data System (ADS)

    Awan, Humna; Gawiser, Eric J.; Kurczynski, Peter

    2017-01-01

    The Large Synoptic Survey Telescope (LSST) will start a ten-year survey of the southern sky in 2022. Since the telescope observing strategy can lead to artifacts in the observed data, we undertake an investigation of implementing large telescope-pointing offsets (called dithers) as a means to minimize the induced artifacts. We implement various types of dithers, varying in both implementation timescale and the dither geometry, and examine their effects on the r-band coadded depth after the 10-year survey. Then we propagate the depth fluctuations to galaxy-count fluctuations, which are a systematic for large-scale structure studies. We show that the observing strategies induce window function uncertainties which set a constraint on the level of information we can extract from an optimized survey to precisely measure Baryonic Acoustic Oscillations at high redshifts. We find that the best dither strategies lead to window function uncertainties well below the minimum statistical uncertainty after the full ten-year survey, hence not requiring any systematics correction methods. While the systematics level is considerably higher after the first year of the survey, dithering can play a critical role in reducing it. We also explore different cadences, and demonstrate that the best dither strategies minimize the window function uncertainties for various cadences.
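
    The intuition that pointing offsets smooth out coverage (and hence coadded-depth) fluctuations can be sketched with a toy coverage simulation. The patch size, field-of-view radius, and offset distribution below are illustrative assumptions, not LSST parameters:

    ```python
    import math
    import random

    # Toy sketch: repeatedly "observe" a patch of sky with a circular field
    # of view, with and without random pointing offsets (dithers), and
    # compare the pixel-to-pixel spread in visit counts, a crude stand-in
    # for coadded-depth fluctuations.

    random.seed(1)
    N = 40          # sky patch is N x N pixels
    FOV = 12        # field-of-view radius, in pixels
    VISITS = 200    # number of observing visits

    def observe(centers):
        """Count how many visits cover each pixel."""
        counts = [[0] * N for _ in range(N)]
        for cx, cy in centers:
            for i in range(N):
                for j in range(N):
                    if (i - cx) ** 2 + (j - cy) ** 2 <= FOV ** 2:
                        counts[i][j] += 1
        return counts

    def spread(counts):
        """Standard deviation of visit counts across all pixels."""
        flat = [c for row in counts for c in row]
        mean = sum(flat) / len(flat)
        return math.sqrt(sum((c - mean) ** 2 for c in flat) / len(flat))

    fixed = [(N / 2, N / 2)] * VISITS                 # no dithering
    dithered = [(N / 2 + random.uniform(-6, 6),       # random offset per visit
                 N / 2 + random.uniform(-6, 6)) for _ in range(VISITS)]

    s_fixed, s_dithered = spread(observe(fixed)), spread(observe(dithered))
    print(s_fixed, s_dithered)  # dithering smooths the coverage pattern
    ```

    The survey analysis itself works with realistic cadences and hexagonal focal-plane geometry; this sketch only shows why dithering reduces the spatial structure of the coverage map.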

  5. Determinants of Cancer Screening Disparities Among Asian Americans: A Systematic Review of Public Health Surveys.

    PubMed

    Jun, Jungmi; Nan, Xiaoli

    2017-04-05

    We conducted a systematic analysis of 24 peer-reviewed literary works that examined Asian Americans' breast, cervical, and colon cancer screening, focusing on empirical findings from large-scale public health surveys (i.e., NHIS, CHIS, HINTS, BRFSS). We provide an overview of relevant research in terms of study characteristics, samples, predictor/covariate of cancer screenings, and key findings. Our analysis indicates that Asian Americans' cancer screening rates are lower than for non-Hispanic Whites for all cancer types in four large-scale public health surveys throughout 17 study years. Acculturation and healthcare access were two significant factors in explaining Asian Americans' cancer screening rates. Cancer fatalism and family cancer history emerged as potential factors that may account for more variances. However, the screening disparities between Asian Americans and whites persist even after adjusting all covariates, including SES, acculturation, healthcare access, health status, and health perception/literacy. More individual and cultural factors should be identified to address these disparities.

  6. Solving large scale structure in ten easy steps with COLA

    SciTech Connect

    Tassev, Svetlin; Zaldarriaga, Matias; Eisenstein, Daniel J. E-mail: matiasz@ias.edu

    2013-06-01

    We present the COmoving Lagrangian Acceleration (COLA) method: an N-body method for solving for Large Scale Structure (LSS) in a frame that is comoving with observers following trajectories calculated in Lagrangian Perturbation Theory (LPT). Unlike standard N-body methods, the COLA method can straightforwardly trade accuracy at small-scales in order to gain computational speed without sacrificing accuracy at large scales. This is especially useful for cheaply generating large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing, as those catalogs are essential for performing detailed error analysis for ongoing and future surveys of LSS. As an illustration, we ran a COLA-based N-body code on a box of size 100 Mpc/h with particles of mass ≈ 5 × 10⁹ M_sun/h. Running the code with only 10 timesteps was sufficient to obtain an accurate description of halo statistics down to halo masses of at least 10¹¹ M_sun/h. This is only at a modest speed penalty when compared to mocks obtained with LPT. A standard detailed N-body run is orders of magnitude slower than our COLA-based code. The speed-up we obtain with COLA is due to the fact that we calculate the large-scale dynamics exactly using LPT, while letting the N-body code solve for the small scales, without requiring it to capture exactly the internal dynamics of halos. Achieving a similar level of accuracy in halo statistics without the COLA method requires at least 3 times more timesteps than when COLA is employed.
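
    The change of frame at the heart of COLA, evolving particles relative to their LPT trajectories, can be sketched in one dimension, where the Zel'dovich (1LPT) solution is exact before shell crossing and the residual motion therefore vanishes. Units and the growth-factor scaling below are deliberately oversimplified toy assumptions, not the paper's implementation:

    ```python
    import numpy as np

    # Minimal 1-D sketch of the COLA frame: particles are evolved *relative
    # to* their Zel'dovich (1LPT) trajectories, so the large-scale growth is
    # exact by construction no matter how few timesteps are taken.

    n = 64
    L = 1.0
    q = np.linspace(0.0, L, n, endpoint=False)       # Lagrangian coordinates
    psi = 0.02 * np.sin(2.0 * np.pi * q / L)         # 1LPT displacement field

    def x_lpt(a):
        """Zel'dovich positions at scale factor a (toy growth factor D(a) = a)."""
        return q + a * psi

    # COLA integrates only the residual x_res = x - x_LPT. In this linear
    # 1-D toy the Zel'dovich solution is exact, so the residual force is
    # identically zero and "ten easy steps" (or any number) suffice.
    x_res = np.zeros(n)
    v_res = np.zeros(n)
    for a in np.linspace(0.1, 1.0, 10):              # ten coarse timesteps
        force_res = np.zeros(n)                      # exact-LPT toy: no residual force
        v_res += force_res
        x_res += v_res

    x_final = x_lpt(1.0) + x_res
    print(np.allclose(x_final, q + psi))
    ```

    In the real method the residual force is the full N-body force minus the LPT acceleration; it is small on large scales (hence the coarse timesteps) and only becomes important inside collapsing structures.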

  7. Solving large scale structure in ten easy steps with COLA

    NASA Astrophysics Data System (ADS)

    Tassev, Svetlin; Zaldarriaga, Matias; Eisenstein, Daniel J.

    2013-06-01

    We present the COmoving Lagrangian Acceleration (COLA) method: an N-body method for solving for Large Scale Structure (LSS) in a frame that is comoving with observers following trajectories calculated in Lagrangian Perturbation Theory (LPT). Unlike standard N-body methods, the COLA method can straightforwardly trade accuracy at small-scales in order to gain computational speed without sacrificing accuracy at large scales. This is especially useful for cheaply generating large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing, as those catalogs are essential for performing detailed error analysis for ongoing and future surveys of LSS. As an illustration, we ran a COLA-based N-body code on a box of size 100 Mpc/h with particles of mass ≈ 5 × 10⁹ M_solar/h. Running the code with only 10 timesteps was sufficient to obtain an accurate description of halo statistics down to halo masses of at least 10¹¹ M_solar/h. This is only at a modest speed penalty when compared to mocks obtained with LPT. A standard detailed N-body run is orders of magnitude slower than our COLA-based code. The speed-up we obtain with COLA is due to the fact that we calculate the large-scale dynamics exactly using LPT, while letting the N-body code solve for the small scales, without requiring it to capture exactly the internal dynamics of halos. Achieving a similar level of accuracy in halo statistics without the COLA method requires at least 3 times more timesteps than when COLA is employed.

  8. Large-scale wind turbine structures

    NASA Technical Reports Server (NTRS)

    Spera, David A.

    1988-01-01

    The purpose of this presentation is to show how structural technology was applied in the design of modern wind turbines, which were recently brought to an advanced stage of development as sources of renewable power. Wind turbine structures present many difficult problems because they are relatively slender and flexible; subject to vibration and aeroelastic instabilities; acted upon by loads which are often nondeterministic; operated continuously with little maintenance in all weather; and dominated by life-cycle cost considerations. Progress in horizontal-axis wind turbines (HAWT) development was paced by progress in the understanding of structural loads, modeling of structural dynamic response, and designing of innovative structural response. During the past 15 years a series of large HAWTs was developed. This has culminated in the recent completion of the world's largest operating wind turbine, the 3.2 MW Mod-5B power plant installed on the island of Oahu, Hawaii. Some of the applications of structures technology to wind turbines will be illustrated by referring to the Mod-5B design. First, a video overview will be presented to provide familiarization with the Mod-5B project and the important components of the wind turbine system. Next, the structural requirements for large-scale wind turbines will be discussed, emphasizing the difficult fatigue-life requirements. Finally, the procedures used to design the structure will be presented, including the use of the fracture mechanics approach for determining allowable fatigue stresses.

  9. Large-scale tides in general relativity

    NASA Astrophysics Data System (ADS)

    Ip, Hiu Yan; Schmidt, Fabian

    2017-02-01

    Density perturbations in cosmology, i.e. spherically symmetric adiabatic perturbations of a Friedmann-Lemaître-Robertson-Walker (FLRW) spacetime, are locally exactly equivalent to a different FLRW solution, as long as their wavelength is much larger than the sound horizon of all fluid components. This fact is known as the "separate universe" paradigm. However, no such relation is known for anisotropic adiabatic perturbations, which correspond to an FLRW spacetime with large-scale tidal fields. Here, we provide a closed, fully relativistic set of evolutionary equations for the nonlinear evolution of such modes, based on the conformal Fermi (CFC) frame. We show explicitly that the tidal effects are encoded by the Weyl tensor, and are hence entirely different from an anisotropic Bianchi I spacetime, where the anisotropy is sourced by the Ricci tensor. In order to close the system, certain higher derivative terms have to be dropped. We show that this approximation is equivalent to the local tidal approximation of Hui and Bertschinger [1]. We also show that this very simple set of equations matches the exact evolution of the density field at second order, but fails at third and higher order. This provides a useful, easy-to-use framework for computing the fully relativistic growth of structure at second order.

  10. Large-scale autostereoscopic outdoor display

    NASA Astrophysics Data System (ADS)

    Reitterer, Jörg; Fidler, Franz; Saint Julien-Wallsee, Ferdinand; Schmid, Gerhard; Gartner, Wolfgang; Leeb, Walter; Schmid, Ulrich

    2013-03-01

    State-of-the-art autostereoscopic displays are often limited in size, effective brightness, number of 3D viewing zones, and maximum 3D viewing distances, all of which are mandatory requirements for large-scale outdoor displays. Conventional autostereoscopic indoor concepts like lenticular lenses or parallax barriers cannot simply be adapted for these screens due to the inherent loss of effective resolution and brightness, which would reduce both image quality and sunlight readability. We have developed a modular autostereoscopic multi-view laser display concept with sunlight readable effective brightness, theoretically up to several thousand 3D viewing zones, and maximum 3D viewing distances of up to 60 meters. For proof-of-concept purposes a prototype display with two pixels was realized. Due to various manufacturing tolerances each individual pixel has slightly different optical properties, and hence the 3D image quality of the display has to be calculated stochastically. In this paper we present the corresponding stochastic model, we evaluate the simulation and measurement results of the prototype display, and we calculate the achievable autostereoscopic image quality to be expected for our concept.

  11. Large scale digital atlases in neuroscience

    NASA Astrophysics Data System (ADS)

    Hawrylycz, M.; Feng, D.; Lau, C.; Kuan, C.; Miller, J.; Dang, C.; Ng, L.

    2014-03-01

    Imaging in neuroscience has revolutionized our current understanding of brain structure, architecture and increasingly its function. Many characteristics of morphology, cell type, and neuronal circuitry have been elucidated through methods of neuroimaging. Combining this data in a meaningful, standardized, and accessible manner is the scope and goal of the digital brain atlas. Digital brain atlases are used today in neuroscience to characterize the spatial organization of neuronal structures, for planning and guidance during neurosurgery, and as a reference for interpreting other data modalities such as gene expression and connectivity data. The field of digital atlases is extensive and, in addition to atlases of the human brain, includes high-quality brain atlases of the mouse, rat, rhesus macaque, and other model organisms. Using techniques based on histology, structural and functional magnetic resonance imaging as well as gene expression data, modern digital atlases use probabilistic and multimodal techniques, as well as sophisticated visualization software to form an integrated product. Toward this goal, brain atlases form a common coordinate framework for summarizing, accessing, and organizing this knowledge and will undoubtedly remain a key technology in neuroscience in the future. Since the development of its flagship project of a genome wide image-based atlas of the mouse brain, the Allen Institute for Brain Science has used imaging as a primary data modality for many of its large scale atlas projects. We present an overview of Allen Institute digital atlases in neuroscience, with a focus on the challenges and opportunities for image processing and computation.

  12. Food appropriation through large scale land acquisitions

    NASA Astrophysics Data System (ADS)

    Rulli, Maria Cristina; D'Odorico, Paolo

    2014-05-01

    The increasing demand for agricultural products and the uncertainty of international food markets have recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of lack of modern technology. It is expected that in the long run large scale land acquisitions (LSLAs) for commercial farming will bring the technology required to close the existing crops yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with LSLAs. We show how up to 300-550 million people could be fed by crops grown in the acquired land, should these investments in agriculture improve crop production and close the yield gap. In contrast, about 190-370 million people could be supported by this land without closing the yield gap. These numbers raise some concern because the food produced in the acquired land is typically exported to other regions, while the target countries exhibit high levels of malnourishment. Conversely, if used for domestic consumption, the crops harvested in the acquired land could ensure food security to the local populations.
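
    The people-fed estimate described above amounts to simple calorie bookkeeping: land area times yield times food-energy content, divided by per-capita calorie demand. The numbers below are purely illustrative assumptions (not the paper's dataset or coefficients), chosen only to show the arithmetic:

    ```python
    # Back-of-envelope version of the kind of estimate described above,
    # with hypothetical inputs.

    area_ha = 40e6             # acquired cropland, hectares (assumed)
    yield_t_per_ha = 4.0       # closed-gap cereal yield, tonnes/ha/year (assumed)
    kcal_per_tonne = 3.0e6     # rough food-energy content of cereals (assumed)
    kcal_per_person_day = 2500.0

    kcal_per_year = area_ha * yield_t_per_ha * kcal_per_tonne
    people_fed = kcal_per_year / (kcal_per_person_day * 365.0)
    print(round(people_fed / 1e6), "million people")  # order of magnitude only
    ```

    With these illustrative inputs the estimate lands at roughly half a billion people, the same order of magnitude as the paper's closed-gap range; the study's actual figures come from deal-level data on crops, areas, and country-specific yields.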

  13. Large scale mechanical metamaterials as seismic shields

    NASA Astrophysics Data System (ADS)

    Miniaci, Marco; Krushynska, Anastasiia; Bosia, Federico; Pugno, Nicola M.

    2016-08-01

    Earthquakes represent one of the most catastrophic natural events affecting mankind. At present, a universally accepted risk mitigation strategy for seismic events remains to be proposed. Most approaches are based on vibration isolation of structures rather than on the remote shielding of incoming waves. In this work, we propose a novel approach to the problem and discuss the feasibility of a passive isolation strategy for seismic waves based on large-scale mechanical metamaterials, including, for the first time, numerical analysis of both surface and guided waves and soil dissipation effects, in full 3D simulations. The study focuses on realistic structures that can be effective in frequency ranges of interest for seismic waves, and optimal design criteria are provided, exploring different metamaterial configurations, combining phononic crystals and locally resonant structures and different ranges of mechanical properties. Dispersion analysis and full-scale 3D transient wave transmission simulations are carried out on finite size systems to assess the seismic wave amplitude attenuation in realistic conditions. Results reveal that both surface and bulk seismic waves can be considerably attenuated, making this strategy viable for the protection of civil structures against seismic risk. The proposed remote shielding approach could open up new perspectives in the field of seismology and in related areas of low-frequency vibration damping or blast protection.

  14. Large-scale carbon fiber tests

    NASA Technical Reports Server (NTRS)

    Pride, R. A.

    1980-01-01

    A realistic release of carbon fibers was established by burning a minimum of 45 kg of carbon fiber composite aircraft structural components in each of five large scale, outdoor aviation jet fuel fire tests. This release was quantified by several independent assessments with various instruments developed specifically for these tests. The most likely values for the mass of single carbon fibers released ranged from 0.2 percent of the initial mass of carbon fiber for the source tests (zero wind velocity) to a maximum of 0.6 percent of the initial carbon fiber mass for dissemination tests (5 to 6 m/s wind velocity). Mean fiber lengths for fibers greater than 1 mm in length ranged from 2.5 to 3.5 mm. Mean diameters ranged from 3.6 to 5.3 micrometers which was indicative of significant oxidation. Footprints of downwind dissemination of the fire released fibers were measured to 19.1 km from the fire.

  15. Demographic and health surveys: a profile.

    PubMed

    Corsi, Daniel J; Neuman, Melissa; Finlay, Jocelyn E; Subramanian, S V

    2012-12-01

    Demographic and Health Surveys (DHS) are comparable nationally representative household surveys that have been conducted in more than 85 countries worldwide since 1984. The DHS were initially designed to expand on demographic, fertility and family planning data collected in the World Fertility Surveys and Contraceptive Prevalence Surveys, and continue to provide an important resource for the monitoring of vital statistics and population health indicators in low- and middle-income countries. The DHS collect a wide range of objective and self-reported data with a strong focus on indicators of fertility, reproductive health, maternal and child health, mortality, nutrition and self-reported health behaviours among adults. Key advantages of the DHS include high response rates, national coverage, high quality interviewer training, standardized data collection procedures across countries and consistent content over time, allowing comparability across populations cross-sectionally and over time. Data from DHS facilitate epidemiological research focused on monitoring of prevalence, trends and inequalities. A variety of robust observational data analysis methods have been used, including cross-sectional designs, repeated cross-sectional designs, spatial and multilevel analyses, intra-household designs and cross-comparative analyses. In this profile, we present an overview of the DHS along with an introduction to the potential scope for these data in contributing to the field of micro- and macro-epidemiology. DHS datasets are available for researchers through MEASURE DHS at www.measuredhs.com.
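
    As a minimal illustration of how DHS-style analyses handle complex survey design, the sketch below computes a design-weighted prevalence on synthetic data; the strata, weights, and indicator are hypothetical, not actual DHS variables.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a DHS-style household survey: a binary health
# indicator plus a design weight per respondent (all values hypothetical).
n = 10_000
stratum = rng.integers(0, 2, n)                  # 0 = urban, 1 = rural
indicator = rng.random(n) < np.where(stratum == 1, 0.30, 0.15)
weight = np.where(stratum == 1, 2.0, 0.8)        # corrects rural oversampling

# Design-weighted prevalence, as used for national survey indicators.
weighted = np.sum(weight * indicator) / np.sum(weight)
unweighted = indicator.mean()

# The gap shows why design weights matter whenever sampling probabilities
# differ across strata.
print(f"weighted {weighted:.3f} vs unweighted {unweighted:.3f}")
```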

  16. Health-Terrain: Visualizing Large Scale Health Data

    DTIC Science & Technology

    2014-12-01

    Figure 8. The Theme River visualization for Hepatitis A, B, C and D Ring graph In order to view more detailed patient level data, we developed...a new patient visualization method called Ring Graph. In Ring Graph, each patient is modeled as a point in a radial coordinate system. The radial...space is subdivided into multiple rings , each of which represents one visualization term that was selected from the association map. These terms are

  17. Health-Terrain: Visualizing Large Scale Health Data

    DTIC Science & Technology

    2014-04-01

    environmental factors and genetic information (family history). Natural Language Processing ( NLP ) provides a means to augment the NCD data analytics with the...information discovered from these clinical reports. NLP techniques were carried out to process 325791 clinical notes that contain patient...notes from XML format to simple text format and sentence splitting. Advanced level NLP was applied in the form of named entity recognition (NER) for

  18. The effective field theory of cosmological large scale structures

    SciTech Connect

    Carrasco, John Joseph M.; Hertzberg, Mark P.; Senatore, Leonardo

    2012-09-20

    Large scale structure surveys will likely become the next leading cosmological probe. In our universe, matter perturbations are large on short distances and small at long scales, i.e. strongly coupled in the UV and weakly coupled in the IR. To make precise analytical predictions on large scales, we develop an effective field theory formulated in terms of an IR effective fluid characterized by several parameters, such as speed of sound and viscosity. These parameters, determined by the UV physics described by the Boltzmann equation, are measured from N-body simulations. We find that the speed of sound of the effective fluid is c_s² ≈ 10⁻⁶c² and that the viscosity contributions are of the same order. The fluid describes all the relevant physics at long scales k and permits a manifestly convergent perturbative expansion in the size of the matter perturbations δ(k) for all the observables. As an example, we calculate the correction to the power spectrum at order δ(k)⁴. As a result, the predictions of the effective field theory are found to be in much better agreement with observation than standard cosmological perturbation theory, already reaching percent precision at this order up to a relatively short scale k ≃ 0.24 h Mpc⁻¹.
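
    The structure of the leading correction can be sketched numerically: the speed-of-sound counterterm modifies the linear spectrum by a term proportional to -c_s² k² P_lin(k), negligible in the IR and growing toward the nonlinear scale. The toy spectrum shape and the value of cs2 below are illustrative, not the paper's measured values.

```python
import numpy as np

def p_lin(k, A=2e4, k0=0.02, n=0.96):
    """Toy smooth linear matter power spectrum (illustrative shape only)."""
    return A * (k / k0) ** n / (1 + (k / 0.1) ** 2.5)

# Leading EFT-of-LSS counterterm, schematically -2 c_s^2 k^2 P_lin(k).
# cs2 plays the role of a fitted speed-of-sound parameter in (Mpc/h)^2;
# the value is purely illustrative.
cs2 = 1.0

k = np.linspace(0.01, 0.3, 50)            # h/Mpc
p_ctr = -2.0 * cs2 * k**2 * p_lin(k)

# The correction is k^2-suppressed on large scales, mirroring the
# derivative expansion of the effective fluid.
ratio = np.abs(p_ctr) / p_lin(k)
print(f"|counterterm|/P_lin: {ratio[0]:.2e} at k={k[0]} -> {ratio[-1]:.2e} at k={k[-1]}")
```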

  19. Halo detection via large-scale Bayesian inference

    NASA Astrophysics Data System (ADS)

    Merson, Alexander I.; Jasche, Jens; Abdalla, Filipe B.; Lahav, Ofer; Wandelt, Benjamin; Jones, D. Heath; Colless, Matthew

    2016-08-01

    We present a proof-of-concept of a novel and fully Bayesian methodology designed to detect haloes of different masses in cosmological observations subject to noise and systematic uncertainties. Our methodology combines the previously published Bayesian large-scale structure inference algorithm, HAmiltonian Density Estimation and Sampling algorithm (HADES), and a Bayesian chain rule (the Blackwell-Rao estimator), which we use to connect the inferred density field to the properties of dark matter haloes. To demonstrate the capability of our approach, we construct a realistic galaxy mock catalogue emulating the wide-area 6-degree Field Galaxy Survey, which has a median redshift of approximately 0.05. Application of HADES to the catalogue provides us with accurately inferred three-dimensional density fields and corresponding quantification of uncertainties inherent to any cosmological observation. We then use a cosmological simulation to relate the amplitude of the density field to the probability of detecting a halo with mass above a specified threshold. With this information, we can sum over the HADES density field realisations to construct maps of detection probabilities and demonstrate the validity of this approach within our mock scenario. We find that the probability of successful detection of haloes in the mock catalogue increases as a function of the signal to noise of the local galaxy observations. Our proposed methodology can easily be extended to account for more complex scientific questions and is a promising novel tool to analyse the cosmic large-scale structure in observations.
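
    The final step, summing over density-field realisations to build detection-probability maps, amounts to a Monte Carlo marginalisation over posterior uncertainty. The sketch below uses a synthetic grid and an assumed sigmoid calibration curve in place of the simulation-derived relation between overdensity and halo detection.

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in for HADES-style output: S posterior realisations of the density
# field on a small grid, scattered around a synthetic "true" field.
S, nx, ny = 200, 16, 16
truth = rng.normal(0.0, 1.0, (nx, ny))
samples = truth + rng.normal(0.0, 0.5, (S, nx, ny))   # posterior scatter

def p_detect(delta, delta_c=1.0, width=0.3):
    """Assumed smooth calibration curve: probability that a cell with
    overdensity delta hosts a halo above the mass threshold (illustrative)."""
    return 1.0 / (1.0 + np.exp(-(delta - delta_c) / width))

# Marginalise over posterior uncertainty: average the per-realisation
# detection probability over all density samples, cell by cell.
prob_map = p_detect(samples).mean(axis=0)
print(prob_map.shape, float(prob_map.max()))
```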

  20. Programme coverage, condom use and STI treatment among FSWs in a large-scale HIV prevention programme: results from cross-sectional surveys in 22 districts in southern India

    PubMed Central

    Gautam, Abhishek; Goswami, Prabuddhagopal; Kallam, Srinivasan; Adhikary, Rajatashuvra; Mainkar, Mandar K; Ramesh, Banadakoppa M; Morineau, Guy; George, Bitra; Paranjape, Ramesh S

    2010-01-01

    Objective This paper evaluates the Avahan programme's coverage of female sex workers (FSWs), its focus on high-risk FSWs and intermediate outcomes. Methods Data from the first round of a cross-sectional survey, the Integrated Behavioral and Biological Assessment (IBBA), conducted in 22 districts, were aggregated into district categories: Solo, where Avahan was the sole service provider covering all FSWs, and Major or Minor, where Avahan was not the sole provider but intended coverage was >50% or ≤50% of FSWs, respectively. Multivariate logistic regression was applied to compare exposure by district categories, vulnerability factors and intermediate outcomes associated with exposure. Results Reported exposure, evaluated on the basis of having received any of three core services, was higher in Solo (75%) compared with Minor (66%) districts. Logistic regression showed that FSWs in Solo districts were more likely to be exposed (adjusted odds ratio (AOR)=1.5; 95% CI 1.20 to 1.86) compared with FSWs in Minor districts. Multivariate analysis in Solo districts revealed that FSWs with ≥15 clients in the past week had a higher chance of being exposed to core services (AOR=1.56; 95% CI 1.03 to 2.35). Exposure to the three services in Solo Avahan districts was significantly associated with correct knowledge of condom use (AOR=1.36; 95% CI 1.05 to 1.78), consistent condom use with occasional clients (AOR=3.17; 95% CI 2.17 to 4.63) and regular clients (AOR=2.47; 95% CI 1.86 to 3.28), and STI treatment-seeking behaviour (AOR=3.00; 95% CI 1.94 to 4.65). Conclusions Higher coverage of FSWs was achieved in districts where Avahan was the only intervention compared with districts having multiple and longstanding non-Avahan programmes. Exposure in Solo districts was associated with intermediate outcomes; this needs to be further evaluated in comparison with non-Avahan areas and substantiated with data from the next IBBA round. PMID:20167734
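
    The adjusted odds ratios above come from multivariate logistic regression; the unadjusted analogue can be computed directly from a 2x2 table with a Woolf confidence interval, as sketched below on illustrative counts (not the IBBA data).

```python
import numpy as np

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Woolf 95% CI from a 2x2 table:
    a, b = exposed group with / without the outcome,
    c, d = comparison group with / without the outcome."""
    or_ = (a * d) / (b * c)
    se_log = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo, hi = np.exp(np.log(or_) + np.array([-z, z]) * se_log)
    return or_, lo, hi

# Illustrative counts only: service exposure among FSWs in Solo vs
# Minor districts (numbers invented to mirror the 75% vs 66% pattern).
or_, lo, hi = odds_ratio_ci(a=750, b=250, c=660, d=340)
print(f"OR = {or_:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

    A multivariate model additionally conditions on vulnerability factors, which is why the reported AORs can differ from this crude ratio.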

  1. UAV Data Processing for Large Scale Topographical Mapping

    NASA Astrophysics Data System (ADS)

    Tampubolon, W.; Reinhardt, W.

    2014-06-01

    Large scale topographical mapping in third world countries is a prominent challenge for the geospatial industry today. On one side the demand is increasing significantly, while on the other hand it is constrained by the limited budgets available for mapping projects. Since the advent of Act No. 4/2011 on Geospatial Information in Indonesia, large scale topographical mapping has been a high priority for supporting nationwide development, e.g. detailed spatial planning. Usually large scale topographical mapping relies on conventional aerial survey campaigns to provide high resolution 3D geospatial data sources. Originally a leisure hobby, model aircraft in the form of the so-called Unmanned Aerial Vehicle (UAV) offer an alternative, semi-photogrammetric means of aerial data acquisition suitable for relatively small Areas of Interest (AOI), i.e. <5,000 hectares. For detailed spatial planning purposes in Indonesia this area size can be used as a mapping unit, since planning usually concentrates on the sub-district (kecamatan) level. In this paper different camera and processing software systems are analyzed to identify the optimal UAV data acquisition campaign components in combination with the data processing scheme. The selected AOI covers the cultural heritage site of Borobudur Temple, one of the Seven Wonders of the World. A detailed accuracy assessment concentrates first on the object features of the temple itself. Feature compilation involving planimetric objects (2D) and digital terrain models (3D) is integrated in order to provide Digital Elevation Models (DEM) as the main interest of the topographic mapping activity. Incorporating the optimal number of Ground Control Points (GCPs) in the UAV photo data processing increases the accuracy at a high resolution of 5 cm Ground Sampling Distance (GSD). Finally this result will be used as the benchmark for alternative geospatial
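
    The quoted 5 cm resolution follows from the standard ground sampling distance relation, GSD = pixel pitch x flying height / focal length. The camera parameters below are hypothetical, chosen only to show the arithmetic.

```python
def ground_sampling_distance(pixel_size_um, focal_mm, altitude_m):
    """GSD in cm/pixel for a nadir frame camera: pixel pitch (converted
    from micrometres to centimetres) times the scale factor H/f."""
    return pixel_size_um * 1e-4 * altitude_m / (focal_mm * 1e-3)

# Hypothetical UAV setup: 4.4 um pixel pitch, 15 mm lens. Around 170 m
# flying height this yields roughly the 5 cm GSD quoted in the record.
for h in (100, 170, 300):
    print(f"{h:4d} m  ->  {ground_sampling_distance(4.4, 15.0, h):.1f} cm")
```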

  2. Health Physics Enrollments and Degrees Survey, 2006 Data

    SciTech Connect

    Oak Ridge Institute for Science and Education

    2007-03-31

    This annual survey collects 2006 data on the number of health physics degrees awarded as well as the number of students enrolled in health physics academic programs. Thirty universities offer health physics degrees; all responded to the survey.

  3. Sensitivity technologies for large scale simulation.

    SciTech Connect

    Collis, Samuel Scott; Bartlett, Roscoe Ainsworth; Smith, Thomas Michael; Heinkenschloss, Matthias; Wilcox, Lucas C.; Hill, Judith C.; Ghattas, Omar; Berggren, Martin Olof; Akcelik, Volkan; Ober, Curtis Curry; van Bloemen Waanders, Bart Gustaaf; Keiter, Eric Richard

    2005-01-01

    Sensitivity analysis is critically important to numerous analysis algorithms, including large scale optimization, uncertainty quantification, reduced-order modeling, and error estimation. Our research focused on developing tools, algorithms and standard interfaces to facilitate the implementation of sensitivity-type analysis into existing codes; equally important, the work focused on ways to increase the visibility of sensitivity analysis. We attempt to accomplish the first objective through the development of hybrid automatic differentiation tools, standard linear algebra interfaces for numerical algorithms, time domain decomposition algorithms and two-level Newton methods. We attempt to accomplish the second goal by presenting the results of several case studies in which direct sensitivities and adjoint methods have been effectively applied, in addition to an investigation of h-p adaptivity using adjoint-based a posteriori error estimation. A mathematical overview is provided of direct sensitivities and adjoint methods for both steady state and transient simulations. Two case studies are presented to demonstrate the utility of these methods. A direct sensitivity method is implemented to solve a source inversion problem for steady state internal flows subject to convection diffusion. Real-time performance is achieved using a novel decomposition into offline and online calculations. Adjoint methods are used to reconstruct initial conditions of a contamination event in an external flow. We demonstrate an adjoint-based transient solution. In addition, we investigated time domain decomposition algorithms in an attempt to improve the efficiency of transient simulations. Because derivative calculations are at the root of sensitivity calculations, we have developed hybrid automatic differentiation methods and implemented this approach for shape optimization for gas dynamics using the Euler equations. The hybrid automatic differentiation method was applied to a first
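
    The adjoint idea the report applies to PDE problems can be shown on a small linear steady-state system: one extra linear solve yields the gradient of an objective, verified here against finite differences. The matrices are generic illustrations, not the report's flow problems.

```python
import numpy as np

# Steady-state model A(p) u = b with objective J(p) = c^T u(p).
# Adjoint method: solve A^T lam = c once, then dJ/dp = -lam^T (dA/dp) u
# (b taken independent of p here).
def solve(p):
    A = np.array([[2.0 + p, -1.0], [-1.0, 3.0]])
    b = np.array([1.0, 2.0])
    return A, b, np.linalg.solve(A, b)

c = np.array([1.0, 1.0])
p0 = 0.5
A, b, u = solve(p0)
lam = np.linalg.solve(A.T, c)                 # single adjoint solve
dA_dp = np.array([[1.0, 0.0], [0.0, 0.0]])    # only A[0,0] depends on p
dJ_adj = -lam @ (dA_dp @ u)

# Cross-check with a central finite-difference derivative of J.
eps = 1e-6
dJ_fd = (c @ solve(p0 + eps)[2] - c @ solve(p0 - eps)[2]) / (2 * eps)
print(dJ_adj, dJ_fd)
```

    The payoff is that one adjoint solve gives the full gradient regardless of how many parameters p contains, which is what makes the approach attractive for large scale optimization.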

  4. Large Scale Flame Spread Environmental Characterization Testing

    NASA Technical Reports Server (NTRS)

    Clayman, Lauren K.; Olson, Sandra L.; Gokoglu, Suleyman A.; Brooker, John E.; Ferkul, Paul V.; Kacher, Henry F.

    2013-01-01

    Under the Advanced Exploration Systems (AES) Spacecraft Fire Safety Demonstration Project (SFSDP), as a risk mitigation activity in support of the development of a large-scale fire demonstration experiment in microgravity, flame-spread tests were conducted in normal gravity on thin, cellulose-based fuels in a sealed chamber. The primary objective of the tests was to measure pressure rise in the chamber as sample material, burning direction (upward/downward), total heat release, heat release rate, and heat loss mechanisms were varied between tests. A Design of Experiments (DOE) method was imposed to produce an array of tests from a fixed set of constraints, and a coupled response model was developed. Supplementary tests were run without experimental design to additionally vary select parameters such as initial chamber pressure. The starting chamber pressure for each test was set below atmospheric to prevent chamber overpressure. Bottom ignition, or upward propagating burns, produced rapid, acceleratory, turbulent flame spread. Pressure rise in the chamber increases as the amount of fuel burned increases, mainly because of the larger amount of heat generation and, to a much smaller extent, because of the increase in the number of moles of gas. Top ignition, or downward propagating burns, produced a steady flame spread with a very small flat flame across the burning edge. Steady-state pressure is achieved during downward flame spread as the pressure rises and plateaus. This indicates that the heat generation by the flame matches the heat loss to the surroundings during the longer, slower downward burns. One heat loss mechanism included mounting a heat exchanger directly above the burning sample in the path of the plume to act as a heat sink and more efficiently dissipate the heat from the combustion event. This proved an effective means of chamber overpressure mitigation for those tests producing the most total heat release and thus was determined to be a feasible mitigation
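
    The observed link between heat generation, heat loss, and chamber pressure follows from a constant-volume ideal-gas energy balance, sketched below with hypothetical numbers (the small mole-number contribution noted in the record is neglected).

```python
def pressure_rise_kpa(q_net_kj, volume_m3, gamma=1.4):
    """Ideal-gas, constant-volume energy balance for a sealed chamber:
    dU = Q_net with U = P V / (gamma - 1)  =>  dP = (gamma - 1) Q_net / V.
    Q_net is heat released minus heat lost to walls or a heat exchanger."""
    return (gamma - 1) * q_net_kj / volume_m3   # kJ/m^3 == kPa

# Hypothetical numbers: 50 kJ net heat release in a 0.5 m^3 chamber.
print(f"{pressure_rise_kpa(50.0, 0.5):.0f} kPa rise")
```

    The relation makes both experimental observations plausible: more fuel burned means larger Q_net and a larger pressure rise, while an effective heat sink reduces Q_net and hence the overpressure.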

  5. Synchronization of coupled large-scale Boolean networks

    NASA Astrophysics Data System (ADS)

    Li, Fangfei

    2014-03-01

    This paper investigates the complete synchronization and partial synchronization of two large-scale Boolean networks. First, the aggregation algorithm for large-scale Boolean networks is reviewed. Second, the aggregation algorithm is applied to study the complete synchronization and partial synchronization of large-scale Boolean networks. Finally, an illustrative example is presented to show the efficiency of the proposed results.
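
    A complete-synchronization check can be illustrated on a toy drive-response pair of identical 3-node Boolean networks. The wiring below is a minimal construction of ours (not the paper's aggregation algorithm), chosen so the synchronization error provably dies out along the chain, and the check is exhaustive over all initial-state pairs.

```python
from itertools import product

# Identical 3-node Boolean networks; the response copies the drive's
# first node each step (unidirectional coupling).
def step(state, driver_node1=None):
    x1, x2, x3 = state
    new = (not x1, x1, x2 or x1)
    if driver_node1 is not None:        # coupling: overwrite node 1
        new = (driver_node1, new[1], new[2])
    return new

def synchronizes(x0, y0, horizon=10):
    """True if the response state reaches and matches the drive state."""
    x, y = x0, y0
    for _ in range(horizon):
        x_next = step(x)
        y = step(y, driver_node1=x_next[0])
        x = x_next
        if x == y:
            return True
    return False

# Complete synchronization: check every pair of initial states exhaustively.
states = list(product((False, True), repeat=3))
print(all(synchronizes(x0, y0) for x0 in states for y0 in states))
```

    For this wiring the error provably vanishes within three steps: node 1 is copied directly, node 2 depends only on node 1, and node 3 only on nodes 1 and 2.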

  6. Synchronization of coupled large-scale Boolean networks

    SciTech Connect

    Li, Fangfei

    2014-03-15

    This paper investigates the complete synchronization and partial synchronization of two large-scale Boolean networks. First, the aggregation algorithm for large-scale Boolean networks is reviewed. Second, the aggregation algorithm is applied to study the complete synchronization and partial synchronization of large-scale Boolean networks. Finally, an illustrative example is presented to show the efficiency of the proposed results.

  7. Colorado Health Occupations Manpower Survey, 1972.

    ERIC Educational Resources Information Center

    Colorado State Dept. of Employment, Denver. Research and Analysis Section.

    This study was conducted to supply information for vocational education planners concerning the employment needs of the health services industry in Colorado. It should also provide some indication of the demand for trained workers in the occupations surveyed by coordinating expected company expansion and replacement needs with the number to be…

  8. The Large-scale Structure of the Universe: Probes of Cosmology and Structure Formation

    NASA Astrophysics Data System (ADS)

    Noh, Yookyung

    The usefulness of large-scale structure as a probe of cosmology and structure formation is increasing as large, deep surveys in multi-wavelength bands become possible. Observational analyses of large-scale structure, guided by large-volume numerical simulations, are beginning to offer complementary information and cross-checks of cosmological parameters estimated from the anisotropies in the Cosmic Microwave Background (CMB) radiation. Understanding structure formation and evolution, and even galaxy formation history, is also being aided by observations of different redshift snapshots of the Universe, using various tracers of large-scale structure. This dissertation covers aspects of large-scale structure from the baryon acoustic oscillation scale to that of large-scale filaments and galaxy clusters. First, I discuss the use of large-scale structure for high-precision cosmology. I investigate the reconstruction of the Baryon Acoustic Oscillation (BAO) peak within the context of Lagrangian perturbation theory, testing its validity in a large suite of cosmological-volume N-body simulations. Then I consider galaxy clusters and the large-scale filaments surrounding them in a high-resolution N-body simulation. I investigate the geometrical properties of galaxy cluster neighborhoods, focusing on the filaments connected to clusters. Using mock observations of galaxy clusters, I explore the correlations of scatter in galaxy cluster mass estimates from multi-wavelength observations and different measurement techniques. I also examine the sources of the correlated scatter by considering the intrinsic and environmental properties of clusters.

  9. Large Scale Archaeological Satellite Classification and Data Mining Tools

    NASA Astrophysics Data System (ADS)

    Canham, Kelly

    Archaeological applications routinely use many different forms of remote sensing imagery, the exception being hyperspectral imagery (HSI). HSI tends to be utilized in a similar fashion to multispectral imagery (MSI), or processed to the point that it can be utilized similarly to MSI, thus reducing the benefits of HSI. However, for large scale archaeological surveys, HSI data can be used to differentiate materials more accurately than MSI because of HSI's larger number of spectral bands. HSI also has the ability to identify multiple materials found within a single pixel (sub-pixel material mixing), which is traditionally not possible with MSI. The Zapotec people of Oaxaca, Mexico, lived in an environment that isolates the individual settlements by rugged mountain ranges and dramatically different ecosystems. The rugged mountains of Oaxaca make large scale ground-based archaeological surveys expensive in terms of both time and money. The diverse ecosystems of Oaxaca make multispectral satellite imagery inadequate for local material identification. For these reasons hyperspectral imagery was collected over Oaxaca, Mexico. Using HSI, investigations were conducted into how the Zapotec statehood was impacted by the environment and, conversely, how the environment was impacted by the statehood. Emphasis in this research is placed on identifying the number of pure materials present in the imagery, what these materials are, and identifying archaeological regions of interest using image processing techniques. The HSI processing techniques applied include a new spatially adaptive spectral unmixing approach (LoGlo) to identify pure materials across broad regions of Oaxaca, vegetation indices analysis, and spectral change detection algorithms. Verification of identified archaeological sites is completed using Geographic Information System (GIS) tools, ground truth data, and high-resolution satellite MSI. GIS tools are also used to analyze spatial trends in lost archaeological sites due
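
    Sub-pixel material mixing is conventionally handled with a linear mixing model solved under a non-negativity constraint. The sketch below uses generic synthetic endmembers (not LoGlo, and not real Oaxaca spectra) with SciPy's NNLS solver.

```python
import numpy as np
from scipy.optimize import nnls

# Linear mixing model for a single HSI pixel: observed spectrum =
# endmember matrix E (bands x materials) times abundance vector a, a >= 0.
bands = 30
wl = np.linspace(0.4, 2.5, bands)                 # micrometres
E = np.column_stack([
    np.exp(-(wl - 0.55) ** 2 / 0.02),             # "vegetation"-like peak
    0.2 + 0.25 * wl,                              # "soil"-like ramp
    np.full(bands, 0.9),                          # bright flat "plaster"
])

true_abund = np.array([0.5, 0.3, 0.2])
pixel = E @ true_abund + np.random.default_rng(1).normal(0, 0.005, bands)

# Non-negative least squares recovers the sub-pixel material fractions.
est, resid = nnls(E, pixel)
print(np.round(est, 3), f"residual {resid:.4f}")
```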

  10. Ecohydrological modeling for large-scale environmental impact assessment.

    PubMed

    Woznicki, Sean A; Nejadhashemi, A Pouyan; Abouali, Mohammad; Herman, Matthew R; Esfahanian, Elaheh; Hamaamin, Yaseen A; Zhang, Zhen

    2016-02-01

    Ecohydrological models are frequently used to assess the biological integrity of unsampled streams. These models vary in complexity and scale, and their utility depends on their final application. Tradeoffs are usually made in model scale, where large-scale models are useful for determining broad impacts of human activities on biological conditions, and regional-scale (e.g. watershed or ecoregion) models provide stakeholders greater detail at the individual stream reach level. Given these tradeoffs, the objective of this study was to develop large-scale stream health models with reach-level accuracy similar to regional-scale models, thereby allowing for impact assessments and improved decision-making capabilities. To accomplish this, four measures of biological integrity (Ephemeroptera, Plecoptera, and Trichoptera taxa (EPT), Family Index of Biotic Integrity (FIBI), Hilsenhoff Biotic Index (HBI), and fish Index of Biotic Integrity (IBI)) were modeled based on four thermal classes (cold, cold-transitional, cool, and warm) of streams that broadly dictate the distribution of aquatic biota in Michigan. The Soil and Water Assessment Tool (SWAT) was used to simulate streamflow and water quality in seven watersheds, and the Hydrologic Index Tool was used to calculate 171 ecologically relevant flow regime variables. Unique variables were selected for each thermal class using a Bayesian variable selection method. The variables were then used in the development of adaptive neuro-fuzzy inference system (ANFIS) models of EPT, FIBI, HBI, and IBI. ANFIS model accuracy improved when accounting for stream thermal class rather than developing a global model.
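
    An ANFIS model tunes the parameters of a first-order Sugeno fuzzy system. A minimal hand-set example of such a system, with one hypothetical flow-regime input and two rules, looks like the sketch below; the membership centres and consequents are illustrative, not fitted to any stream data.

```python
import numpy as np

def gaussmf(x, c, s):
    """Gaussian membership function, the kind ANFIS typically tunes."""
    return np.exp(-0.5 * ((x - c) / s) ** 2)

def sugeno(x):
    """Two-rule first-order Sugeno system: firing-strength-weighted
    average of linear consequents (all parameters illustrative)."""
    w1 = gaussmf(x, c=0.2, s=0.15)      # rule 1: "low flow variability"
    w2 = gaussmf(x, c=0.8, s=0.15)      # rule 2: "high flow variability"
    f1 = 0.9 - 0.2 * x                  # consequent: healthy biotic index
    f2 = 0.5 - 0.4 * x                  # consequent: degraded index
    return (w1 * f1 + w2 * f2) / (w1 + w2)

xs = np.linspace(0.0, 1.0, 5)
print(np.round(sugeno(xs), 3))
```

    Training an ANFIS amounts to adjusting the membership parameters (c, s) and consequent coefficients against observed index values.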

  11. Probing large-scale structure with radio observations

    NASA Astrophysics Data System (ADS)

    Brown, Shea D.

    This thesis focuses on detecting magnetized relativistic plasma in the intergalactic medium (IGM) of filamentary large-scale structure (LSS) by observing synchrotron emission emitted by structure formation shocks. Little is known about the IGM beyond the largest clusters of galaxies, and synchrotron emission holds enormous promise as a means of probing magnetic fields and relativistic particle populations in these low density regions. I first report on observations taken at the Very Large Array and the Westerbork Synthesis Radio Telescope of the diffuse radio source 0809+39. I use these observations to demonstrate that 0809+39 is likely the first "radio relic" discovered that is not associated with a rich X-ray-emitting cluster of galaxies. I then demonstrate that an unconventional reprocessing of the NVSS polarization survey can reveal structures on scales from 15' to hundreds of degrees, far larger than the nominal shortest-baseline scale. This yields hundreds of new diffuse sources as well as the identification of a new nearby galactic loop. These observations also highlight the major obstacle that diffuse galactic foreground emission poses for any search for large-scale, low surface-brightness extragalactic emission. I therefore explore the cross-correlation of diffuse radio emission with optical tracers of LSS as a means of statistically detecting the presence of magnetic fields in the low-density regions of the cosmic web. This initial study with the Bonn 1.4 GHz radio survey yields an upper limit of 0.2 mG for large-scale filament magnetic fields. Finally, I report on new Green Bank Telescope and Westerbork Synthesis Radio Telescope observations of the famous Coma cluster of galaxies. Major findings include an extension to the Coma cluster radio relic source 1253+275 which makes its total extent ~2 Mpc, as well as a sharp edge, or "front", on the western side of the radio halo which shows a strong correlation with merger activity associated with an

  12. Large-Scale Spacecraft Fire Safety Tests

    NASA Technical Reports Server (NTRS)

    Urban, David; Ruff, Gary A.; Ferkul, Paul V.; Olson, Sandra; Fernandez-Pello, A. Carlos; T'ien, James S.; Torero, Jose L.; Cowlard, Adam J.; Rouvreau, Sebastien; Minster, Olivier; Toth, Balazs; Legros, Guillaume; Eigenbrod, Christian; Smirnov, Nickolay; Fujita, Osamu; Jomaas, Grunde

    2014-01-01

    An international collaborative program is underway to address open issues in spacecraft fire safety. Because of limited access to long-term low-gravity conditions and the small volume generally allotted for these experiments, there have been relatively few experiments that directly study spacecraft fire safety under low-gravity conditions. Furthermore, none of these experiments have studied sample sizes and environment conditions typical of those expected in a spacecraft fire. The major constraint has been the size of the sample, with prior experiments limited to samples of the order of 10 cm in length and width or smaller. This lack of experimental data forces spacecraft designers to base their designs and safety precautions on 1-g understanding of flame spread, fire detection, and suppression. However, low-gravity combustion research has demonstrated substantial differences in flame behavior in low-gravity. This, combined with the differences caused by the confined spacecraft environment, necessitates practical scale spacecraft fire safety research to mitigate risks for future space missions. To address this issue, a large-scale spacecraft fire experiment is under development by NASA and an international team of investigators. This poster presents the objectives, status, and concept of this collaborative international project (Saffire). The project plan is to conduct fire safety experiments on three sequential flights of an unmanned ISS re-supply spacecraft (the Orbital Cygnus vehicle) after they have completed their delivery of cargo to the ISS and have begun their return journeys to Earth. On two flights (Saffire-1 and Saffire-3), the experiment will consist of a flame spread test involving a meter-scale sample ignited in the pressurized volume of the spacecraft and allowed to burn to completion while measurements are made. On one of the flights (Saffire-2), 9 smaller (5 x 30 cm) samples will be tested to evaluate NASA's material flammability screening tests

  13. Social surveys and health policy implications for national health insurance.

    PubMed Central

    Aday, L A; Andersen, R; Anderson, O W

    1977-01-01

    The authors explore the utility of applying social survey data (a) to evaluate the impact of existing health programs and (b) to rank-order priorities concerning future health care policies. Based on national survey data from 1963, 1970, and 1976, they concluded that although Medicare and Medicaid have enabled more people to see a physician than ever before, a large proportion of the population still registers dissatisfaction with the health care they received--particularly with respect to their out-of-pocket costs for obtaining it. However, national health insurance options favored by the majority of the population--particularly those who can best afford the cost of care--suggest preferences for programs that incorporate some mix of existing modes of financing rather than those that provide for substantial restructuring of the current system. PMID:337340

  14. Linking Large-Scale Reading Assessments: Comment

    ERIC Educational Resources Information Center

    Hanushek, Eric A.

    2016-01-01

    E. A. Hanushek points out in this commentary that applied researchers in education have only recently begun to appreciate the value of international assessments, even though there are now 50 years of experience with them. Until recently, these assessments have been stand-alone surveys that have not been linked, and analysis has largely focused on…

  15. Large-Scale Survey for Tickborne Bacteria, Khammouan Province, Laos

    PubMed Central

    Vongphayloth, Khamsing; Vongsouvath, Malavanh; Grandadam, Marc; Brey, Paul T.; Newton, Paul N.; Sutherland, Ian W.; Dittrich, Sabine

    2016-01-01

    We screened 768 tick pools containing 6,962 ticks from Khammouan Province, Laos, by using quantitative real-time PCR and identified Rickettsia spp., Ehrlichia spp., and Borrelia spp. Sequencing of Rickettsia spp.–positive and Borrelia spp.–positive pools provided evidence for distinct genotypes. Our results identified bacteria with human disease potential in ticks in Laos. PMID:27532491

  16. Sufficient observables for large-scale structure in galaxy surveys

    NASA Astrophysics Data System (ADS)

    Carron, J.; Szapudi, I.

    2014-03-01

    Beyond the linear regime, the power spectrum and higher order moments of the matter field no longer capture all cosmological information encoded in density fluctuations. While non-linear transforms have been proposed to extract this information lost to traditional methods, up to now, the way to generalize these techniques to discrete processes was unclear; ad hoc extensions had some success. We pointed out in Carron and Szapudi's paper that the logarithmic transform approximates extremely well the optimal "sufficient statistics", observables that extract all information from the (continuous) matter field. Building on these results, we generalize optimal transforms to discrete galaxy fields. We focus our calculations on the Poisson sampling of an underlying lognormal density field. We solve and test the one-point case in detail, and sketch out the sufficient observables for the multipoint case. Moreover, we present an accurate approximation to the sufficient observables in terms of the mean and spectrum of a non-linearly transformed field. We find that the corresponding optimal non-linear transformation is directly related to the maximum a posteriori Bayesian reconstruction of the underlying continuous field with a lognormal prior as put forward in the paper of Kitaura et al. Thus, simple recipes for realizing the sufficient observables can be built on previously proposed algorithms that have been successfully implemented and tested in simulations.
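
    The effect of the logarithmic transform is easy to demonstrate: on a lognormal field, A = ln(1 + delta) is exactly Gaussian, so the strong skewness that pushes information into high-order moments of delta disappears. A toy one-point version:

```python
import numpy as np

rng = np.random.default_rng(3)

# Lognormal toy density field: 1 + delta = exp(g - var/2) with g Gaussian,
# so <delta> = 0. The log transform A = ln(1 + delta) is (up to a constant)
# the Gaussianizing map associated with the sufficient statistics of the
# continuous field.
sigma_g = 1.0
g = rng.normal(0.0, sigma_g, 200_000)
delta = np.exp(g - sigma_g**2 / 2) - 1.0
A = np.log1p(delta)

def skew(x):
    """Sample skewness (third standardized moment)."""
    x = x - x.mean()
    return np.mean(x**3) / np.mean(x**2) ** 1.5

# The raw field is strongly skewed (information leaks to high moments);
# the transformed field is Gaussian, so low-order moments suffice.
print(f"skew(delta) = {skew(delta):.2f}, skew(A) = {skew(A):.3f}")
```

    The harder step the paper addresses is that real galaxy fields are discrete Poisson samples of this continuous field, where the plain log transform is no longer optimal.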

  17. "Cosmological Parameters from Large Scale Structure"

    NASA Technical Reports Server (NTRS)

    Hamilton, A. J. S.

    2005-01-01

This grant has provided primary support for graduate student Mark Neyrinck, and some support for the PI and for colleague Nick Gnedin, who helped co-supervise Neyrinck. This award had two major goals: first, to continue to develop and apply methods for measuring galaxy power spectra on large, linear scales, with a view to constraining cosmological parameters; and second, to begin trying to understand galaxy clustering at smaller, non-linear scales well enough to constrain cosmology from those scales also. Under this grant, the PI and collaborators, notably Max Tegmark, continued to improve their technology for measuring power spectra from galaxy surveys at large, linear scales, and to apply the technology to surveys as the data become available. We believe that our methods are the best in the world. These measurements become the foundation from which we and other groups measure cosmological parameters.

  18. Population generation for large-scale simulation

    NASA Astrophysics Data System (ADS)

    Hannon, Andrew C.; King, Gary; Morrison, Clayton; Galstyan, Aram; Cohen, Paul

    2005-05-01

Computer simulation is used to research phenomena ranging from the structure of the space-time continuum to population genetics and future combat.1-3 Multi-agent simulations in particular are now commonplace in many fields.4, 5 By modeling populations whose complex behavior emerges from individual interactions, these simulations help to answer questions about effects for which closed-form solutions are difficult or impossible to derive.6 To be useful, simulations must accurately model the relevant aspects of the underlying domain. In multi-agent simulation, this means that the modeling must include both the agents and their relationships. Typically, each agent can be modeled as a set of attributes drawn from various distributions (e.g., height, morale, intelligence and so forth). Though these can interact - for example, agent height is related to agent weight - they are usually independent. Modeling relations between agents, on the other hand, adds a new layer of complexity, and tools from graph theory and social network analysis are finding increasing application.7, 8 Recognizing the role and proper use of these techniques, however, remains the subject of ongoing research. We recently encountered these complexities while building large-scale social simulations.9-11 One of these, the Hats Simulator, is designed to be a lightweight proxy for intelligence analysis problems. Hats models a "society in a box" consisting of many simple agents, called hats. Hats gets its name from the classic spaghetti western, in which the heroes and villains are known by the color of the hats they wear. The Hats society also has its heroes and villains, but the challenge is to identify which color hat they should be wearing based on how they behave. There are three types of hats: benign hats, known terrorists, and covert terrorists. Covert terrorists look just like benign hats but act like terrorists. Population structure can make covert hat identification significantly more
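The population-generation pattern described above, independent attribute draws plus a relationship graph, can be sketched as follows. All names, role proportions, and the linking rule are illustrative assumptions, not the Hats Simulator's actual parameters.

```python
import random

random.seed(7)

# Agents as attribute bundles drawn from independent distributions.
N = 1000
ROLES = ["benign", "known_terrorist", "covert_terrorist"]
WEIGHTS = [0.90, 0.05, 0.05]  # hypothetical role proportions

agents = [
    {
        "id": i,
        "role": random.choices(ROLES, WEIGHTS)[0],
        "skill": random.gauss(50, 10),  # e.g. an ability score
    }
    for i in range(N)
]

# Relationships: each agent links to up to k random others,
# an Erdos-Renyi-like stand-in for real social structure.
k = 4
edges = {(a["id"], random.randrange(N)) for a in agents for _ in range(k)}
edges = {(i, j) for (i, j) in edges if i != j}  # drop self-loops

print(len(agents), len(edges))
```

A realistic simulation would replace the uniform random links with a generator that reproduces observed network properties (clustering, degree distribution), which is exactly where the graph-theoretic tools cited above enter.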

  19. Large-scale assembly of colloidal particles

    NASA Astrophysics Data System (ADS)

    Yang, Hongta

This study reports a simple, roll-to-roll compatible coating technology for producing three-dimensional highly ordered colloidal crystal-polymer composites, colloidal crystals, and macroporous polymer membranes. A vertically beveled doctor blade is utilized to shear align silica microsphere-monomer suspensions to form large-area composites in a single step. The polymer matrix and the silica microspheres can be selectively removed to create colloidal crystals and self-standing macroporous polymer membranes. The thickness of the shear-aligned crystal is correlated with the viscosity of the colloidal suspension and the coating speed, and the correlations can be qualitatively explained by adapting the mechanisms developed for conventional doctor blade coating. Five important research topics related to the application of large-scale three-dimensional highly ordered macroporous films by doctor blade coating are covered in this study. The first topic describes an invention in large-area, low-cost color reflective displays, inspired by heat pipe technology. The self-standing macroporous polymer films exhibit brilliant colors which originate from the Bragg diffraction of visible light from the three-dimensional highly ordered air cavities. The colors can be easily changed by tuning the size of the air cavities to cover the whole visible spectrum. When the air cavities are filled with a solvent which has the same refractive index as that of the polymer, the macroporous polymer films become completely transparent due to the index matching. When the solvent trapped in the cavities is evaporated by in-situ heating, the sample color changes back to its brilliant color. This process is highly reversible and reproducible for thousands of cycles. The second topic reports the achievement of rapid and reversible vapor detection by using 3-D macroporous photonic crystals.
Capillary condensation of a condensable vapor in the interconnected macropores leads to the

  20. Higher Education Teachers' Descriptions of Their Own Learning: A Large-Scale Study of Finnish Universities of Applied Sciences

    ERIC Educational Resources Information Center

    Töytäri, Aija; Piirainen, Arja; Tynjälä, Päivi; Vanhanen-Nuutinen, Liisa; Mäki, Kimmo; Ilves, Vesa

    2016-01-01

    In this large-scale study, higher education teachers' descriptions of their own learning were examined with qualitative analysis involving application of principles of phenomenographic research. This study is unique: it is unusual to use large-scale data in qualitative studies. The data were collected through an e-mail survey sent to 5960 teachers…

  1. Successful Physician Training Program for Large Scale EMR Implementation

    PubMed Central

    Stevens, L.A.; Mailes, E.S.; Goad, B.A.; Longhurst, C.A.

    2015-01-01

    Summary End-user training is an essential element of electronic medical record (EMR) implementation and frequently suffers from minimal institutional investment. In addition, discussion of successful EMR training programs for physicians is limited in the literature. The authors describe a successful physician-training program at Stanford Children’s Health as part of a large scale EMR implementation. Evaluations of classroom training, obtained at the conclusion of each class, revealed high physician satisfaction with the program. Free-text comments from learners focused on duration and timing of training, the learning environment, quality of the instructors, and specificity of training to their role or department. Based upon participant feedback and institutional experience, best practice recommendations, including physician engagement, curricular design, and assessment of proficiency and recognition, are suggested for future provider EMR training programs. The authors strongly recommend the creation of coursework to group providers by common workflow. PMID:25848415

  2. Measuring Large-Scale Social Networks with High Resolution

    PubMed Central

    Stopczynski, Arkadiusz; Sekara, Vedran; Sapiezynski, Piotr; Cuttone, Andrea; Madsen, Mette My; Larsen, Jakob Eg; Lehmann, Sune

    2014-01-01

This paper describes the deployment of a large-scale study designed to measure human interactions across a variety of communication channels, with high temporal resolution and spanning multiple years—the Copenhagen Networks Study. Specifically, we collect data on face-to-face interactions, telecommunication, social networks, location, and background information (personality, demographics, health, politics) for a densely connected population of 1 000 individuals, using state-of-the-art smartphones as social sensors. Here we provide an overview of the related work and describe the motivation and research agenda driving the study. Additionally, the paper details the data-types measured, and the technical infrastructure in terms of both backend and phone software, as well as an outline of the deployment procedures. We document the participant privacy procedures and their underlying principles. The paper is concluded with early results from data analysis, illustrating the importance of a multi-channel, high-resolution approach to data collection. PMID:24770359

  3. A large-scale crop protection bioassay data set

    NASA Astrophysics Data System (ADS)

    Gaulton, Anna; Kale, Namrata; van Westen, Gerard J. P.; Bellis, Louisa J.; Bento, A. Patrícia; Davies, Mark; Hersey, Anne; Papadatos, George; Forster, Mark; Wege, Philip; Overington, John P.

    2015-07-01

    ChEMBL is a large-scale drug discovery database containing bioactivity information primarily extracted from scientific literature. Due to the medicinal chemistry focus of the journals from which data are extracted, the data are currently of most direct value in the field of human health research. However, many of the scientific use-cases for the current data set are equally applicable in other fields, such as crop protection research: for example, identification of chemical scaffolds active against a particular target or endpoint, the de-convolution of the potential targets of a phenotypic assay, or the potential targets/pathways for safety liabilities. In order to broaden the applicability of the ChEMBL database and allow more widespread use in crop protection research, an extensive data set of bioactivity data of insecticidal, fungicidal and herbicidal compounds and assays was collated and added to the database.

  4. A large-scale crop protection bioassay data set

    PubMed Central

    Gaulton, Anna; Kale, Namrata; van Westen, Gerard J. P.; Bellis, Louisa J.; Bento, A. Patrícia; Davies, Mark; Hersey, Anne; Papadatos, George; Forster, Mark; Wege, Philip; Overington, John P.

    2015-01-01

    ChEMBL is a large-scale drug discovery database containing bioactivity information primarily extracted from scientific literature. Due to the medicinal chemistry focus of the journals from which data are extracted, the data are currently of most direct value in the field of human health research. However, many of the scientific use-cases for the current data set are equally applicable in other fields, such as crop protection research: for example, identification of chemical scaffolds active against a particular target or endpoint, the de-convolution of the potential targets of a phenotypic assay, or the potential targets/pathways for safety liabilities. In order to broaden the applicability of the ChEMBL database and allow more widespread use in crop protection research, an extensive data set of bioactivity data of insecticidal, fungicidal and herbicidal compounds and assays was collated and added to the database. PMID:26175909

  5. A large-scale crop protection bioassay data set.

    PubMed

    Gaulton, Anna; Kale, Namrata; van Westen, Gerard J P; Bellis, Louisa J; Bento, A Patrícia; Davies, Mark; Hersey, Anne; Papadatos, George; Forster, Mark; Wege, Philip; Overington, John P

    2015-01-01

    ChEMBL is a large-scale drug discovery database containing bioactivity information primarily extracted from scientific literature. Due to the medicinal chemistry focus of the journals from which data are extracted, the data are currently of most direct value in the field of human health research. However, many of the scientific use-cases for the current data set are equally applicable in other fields, such as crop protection research: for example, identification of chemical scaffolds active against a particular target or endpoint, the de-convolution of the potential targets of a phenotypic assay, or the potential targets/pathways for safety liabilities. In order to broaden the applicability of the ChEMBL database and allow more widespread use in crop protection research, an extensive data set of bioactivity data of insecticidal, fungicidal and herbicidal compounds and assays was collated and added to the database.

  6. Human and animal health surveys among pastoralists.

    PubMed

    Schelling, E; Greter, H; Kessely, H; Abakar, M F; Ngandolo, B N; Crump, L; Bold, B; Kasymbekov, J; Baljinnyam, Z; Fokou, G; Zinsstag, J; Bonfoh, B; Hattendorf, J; Béchir, M

    2016-11-01

Valid human and livestock health surveys, including longitudinal follow-up, are feasible among mobile pastoralists and provide fundamental information to agencies for interventions that are responsive to realities and effective in addressing the needs of pastoralists. However, pastoralists are often excluded from studies, surveillance systems and health programmes. The occurrence of preventable and treatable diseases such as perinatal tetanus, measles and tuberculosis is indicative of limited access to health providers and information. It is difficult for health services to include effective outreach with their available financial and human resources. One consequence is that maternal mortality rates among pastoralists are unacceptably high. Environmental determinants such as the quality of water and the pasture ecosystems further influence the morbidity of pastoralists. In the Sahel, the nutritional status of pastoralist children is seasonally better than that of settled children; but pastoralist women tend to have higher acute malnutrition rates. Pastoralist women are more vulnerable than men to exclusion from health services for different context-specific reasons. Evidence-based control measures can be assessed in cluster surveys with simultaneous assessments of health among people and livestock, where data on costs of disease and interventions are also collected. These provide important arguments for governmental and non-governmental agencies for intervention development. New, integrated One Health surveillance systems making use of mobile technology and taking into account local concepts and the experiences and priorities of pastoralist communities, combined with sound field data, are essential to develop and provide adapted human and animal health services that are inclusive for mobile pastoralist communities and allow them to maintain their mobile way of life.

  7. Modelling large-scale halo bias using the bispectrum

    NASA Astrophysics Data System (ADS)

    Pollack, Jennifer E.; Smith, Robert E.; Porciani, Cristiano

    2012-03-01

We study the relation between the density distribution of tracers for large-scale structure and the underlying matter distribution - commonly termed bias - in the Λ cold dark matter framework. In particular, we examine the validity of the local model of biasing at quadratic order in the matter density. This model is characterized by parameters b1 and b2. Using an ensemble of N-body simulations, we apply several statistical methods to estimate the parameters. We measure halo and matter fluctuations smoothed on various scales. We find that, whilst the fits are reasonably good, the parameters vary with smoothing scale. We argue that, for real-space measurements, owing to the mixing of wavemodes, no smoothing scale can be found for which the parameters are independent of smoothing. However, this is not the case in Fourier space. We measure halo and halo-mass power spectra and from these construct estimates of the effective large-scale bias as a guide for b1. We measure the configuration dependence of the halo bispectra Bhhh and reduced bispectra Qhhh for very large-scale k-space triangles. From these data, we constrain b1 and b2, taking into account the full bispectrum covariance matrix. Using the lowest order perturbation theory, we find that for Bhhh the best-fitting parameters are in reasonable agreement with one another as the triangle scale is varied, although the fits become poor as smaller scales are included. The same is true for Qhhh. The best-fitting values were found to depend on the discreteness correction. This led us to consider halo-mass cross-bispectra. The results from these statistics supported our earlier findings. We then developed a test to explore whether the inconsistency in the recovered bias parameters could be attributed to missing higher order corrections in the models. We prove that low-order expansions are not sufficiently accurate to model the data, even on scales k1 ~ 0.04 h Mpc^-1. If robust inferences concerning bias are to be drawn
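The local quadratic bias model at the heart of this analysis, delta_h = b1*delta + (b2/2)*delta^2, can be illustrated with a toy least-squares recovery of b1 and b2 on mock fields. The field variance, noise level, and true bias values are assumptions for illustration; the paper constrains the parameters from power spectra and bispectra, not cell-by-cell regression.

```python
import numpy as np

rng = np.random.default_rng(0)

# Mock smoothed matter overdensity, and a halo field obeying the
# local quadratic bias model plus stochastic noise.
b1_true, b2_true = 1.5, -0.4
delta = rng.normal(0.0, 0.3, 50_000)
delta_h = (b1_true * delta
           + 0.5 * b2_true * delta**2
           + rng.normal(0.0, 0.05, delta.size))

# Least-squares estimate of (b1, b2) from the design matrix
# [delta, delta^2 / 2].
A = np.column_stack([delta, 0.5 * delta**2])
(b1_hat, b2_hat), *_ = np.linalg.lstsq(A, delta_h, rcond=None)
print(round(b1_hat, 2), round(b2_hat, 2))
```

On mock data the fit recovers the input values; the abstract's point is that on real simulations the recovered parameters drift with smoothing scale, signalling missing higher-order terms.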

  8. Simulating the large-scale structure of HI intensity maps

    SciTech Connect

Seehars, Sebastian; Paranjape, Aseem; Witzemann, Amadeus; Refregier, Alexandre; Amara, Adam; Akeret, Joel

    2016-03-01

Intensity mapping of neutral hydrogen (HI) is a promising observational probe of cosmology and large-scale structure. We present wide field simulations of HI intensity maps based on N-body simulations of a 2.6 Gpc/h box with 2048^3 particles (particle mass 1.6 × 10^11 M_⊙/h). Using a conditional mass function to populate the simulated dark matter density field with halos below the mass resolution of the simulation (10^8 M_⊙/h < M_halo < 10^13 M_⊙/h), we assign HI to those halos according to a phenomenological halo to HI mass relation. The simulations span a redshift range of 0.35 ≲ z ≲ 0.9 in redshift bins of width Δz ≈ 0.05 and cover a quarter of the sky at an angular resolution of about 7'. We use the simulated intensity maps to study the impact of non-linear effects and redshift space distortions on the angular clustering of HI. Focusing on the autocorrelations of the maps, we apply and compare several estimators for the angular power spectrum and its covariance. We verify that these estimators agree with analytic predictions on large scales and study the validity of approximations based on Gaussian random fields, particularly in the context of the covariance. We discuss how our results and the simulated maps can be useful for planning and interpreting future HI intensity mapping surveys.
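The halo-to-HI assignment step can be illustrated with a toy model. The mass-function slope, the power-law relation M_HI = A (M_halo / 10^10)^alpha, and the values of A and alpha are assumptions for illustration, not the paper's calibrated relation.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy halo catalogue: inverse-CDF sample of dn/dM ∝ M^-2
# between 1e8 and 1e13 Msun/h (illustrative slope and limits).
n_halos = 100_000
m_min, m_max = 1e8, 1e13
u = rng.random(n_halos)
m_halo = 1.0 / (1.0 / m_min - u * (1.0 / m_min - 1.0 / m_max))

# Phenomenological HI assignment (A, alpha are placeholders).
A, alpha = 1e8, 0.6
m_hi = A * (m_halo / 1e10) ** alpha

print(f"{m_halo.min():.1e} {m_halo.max():.1e}")
```

Gridding the resulting M_HI values onto sky pixels per redshift bin, and converting HI mass to brightness temperature, would then yield the intensity maps whose angular clustering the abstract analyzes.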

  9. Generating intrinsic dipole anisotropy in the large scale structures

    NASA Astrophysics Data System (ADS)

    Ghosh, Shamik

    2014-03-01

There have been recent reports of an unexpectedly large velocity dipole in the NRAO VLA Sky Survey (NVSS) data. We investigate whether the reported excess in the NVSS dipole can be of cosmological origin. We assume a long wavelength inhomogeneous scalar perturbation of the form α sin(κz) and study its effects on the matter density contrasts. Assuming an ideal fluid model, we calculate, in the linear regime, the contribution of the inhomogeneous mode to the density contrast. We calculate the expected dipole in the large scale structure (LSS) for two cases: first, assuming that the mode is still superhorizon everywhere, and second, assuming the mode is subhorizon but has crossed the horizon deep in matter domination and is subhorizon everywhere in the region of the survey (NVSS). In both cases, we find that such an inhomogeneous scalar perturbation is sufficient to generate the reported values of dipole anisotropy in LSS. For the superhorizon modes, we find values which are consistent with both cosmic microwave background and NVSS results. We also predict signatures for the model which can be tested by future observations.

  10. Large-scale structural monitoring systems

    NASA Astrophysics Data System (ADS)

    Solomon, Ian; Cunnane, James; Stevenson, Paul

    2000-06-01

Extensive structural health instrumentation systems have been installed on three long-span cable-supported bridges in Hong Kong. The quantities measured include environmental and applied loads (such as wind, temperature, seismic and traffic loads) and the bridge response to these loadings (accelerations, displacements, and strains). Measurements from over 1000 individual sensors are transmitted to central computing facilities via local data acquisition stations and a fault-tolerant fiber-optic network, and are acquired and processed continuously. The data from the systems are used to provide information on structural load and response characteristics, comparison with design, optimization of inspection, and assurance of continued bridge health. Automated data processing and analysis provides information on important structural and operational parameters. Abnormal events are noted and logged automatically. Information of interest is automatically archived for post-processing. Novel aspects of the instrumentation system include a fluid-based high-accuracy long-span Level Sensing System to measure bridge deck profile and tower settlement. This paper provides an outline of the design and implementation of the instrumentation system. A description of the design and implementation of the data acquisition and processing procedures is also given. Examples of the use of similar systems in monitoring other large structures are discussed.

  11. Neutrino footprint in large scale structure

    NASA Astrophysics Data System (ADS)

    Garay, Carlos Peña; Verde, Licia; Jimenez, Raul

    2017-03-01

Recent constraints on the sum of neutrino masses, inferred by analyzing cosmological data, show that detecting a non-zero neutrino mass is within reach of forthcoming cosmological surveys. Such a measurement will imply a direct determination of the absolute neutrino mass scale. Physically, the measurement relies on constraining the shape of the matter power spectrum below the neutrino free streaming scale: massive neutrinos erase power at these scales. However, detection of a lack of small-scale power from cosmological data could also be due to a host of other effects. It is therefore of paramount importance to validate neutrinos as the source of power suppression at small scales. We show that, independently of the hierarchy, neutrinos always show a footprint on large, linear scales; the exact location and properties are fully specified by the measured power suppression (an astrophysical measurement) and the atmospheric neutrino mass splitting (a neutrino oscillation experiment measurement). This feature cannot be easily mimicked by systematic uncertainties in the cosmological data analysis or modifications in the cosmological model. Therefore the measurement of such a feature, up to a 1% relative change in the power spectrum for extreme differences in the mass eigenstate ratios, is a smoking gun for confirming the determination of the absolute neutrino mass scale from cosmological observations. It also demonstrates the synergy between astrophysics and particle physics experiments.

  12. [Acknowledgement and satisfaction survey on mental health].

    PubMed

    Pedraza, María S; Noriega, Nicolás H

    2015-01-01

Knowing patients' opinions about mental healthcare services is highly important in daily clinical work settings, especially in the interior regions of the country. Since information on the subject is very scarce, gathering further data would be helpful to design active policies in this field. With this aim in mind, a work team composed of a psychologist and an immunologist in the city of Villa María in Córdoba, Argentina, conducted a survey to enquire about the population's knowledge of mental healthcare services and their degree of satisfaction. The survey drew on a database from a healthcare consultancy, and data collection was carried out by means of electronic mails containing mostly closed questions. Two relevant conclusions were drawn: the first was that only 1% of the subjects answered the survey, which would reveal little interest in it. The second was that, in spite of the investigators' assumptions, most of the population surveyed (over 50%) had thorough knowledge of the use of mental healthcare services.

  13. Illinois department of public health H1N1/A pandemic communications evaluation survey.

    SciTech Connect

    Walsh, D.; Decision and Information Sciences

    2010-09-16

Because of heightened media coverage, a 24-hour news cycle and the potential miscommunication of health messages across all levels of government during the onset of the H1N1 influenza outbreak in spring 2009, the Illinois Department of Public Health (IDPH) decided to evaluate its H1N1 influenza A communications system. IDPH wanted to confirm that its disease information and instructions were helping stakeholders prepare for and respond to a novel influenza outbreak. In addition, the time commitment involved in preparing, issuing, monitoring, updating, and responding to H1N1 federal guidelines/updates and media stories became a heavy burden for IDPH staff. The process and results of the H1N1 messaging survey represent a best practice that other health departments and emergency management agencies can replicate to improve coordination efforts with stakeholder groups during both emergency preparedness and response phases. Importantly, the H1N1 survey confirmed that IDPH's messages were influencing stakeholders' decisions to activate their pandemic plans and initiate response operations. While there was some dissatisfaction with IDPH's delivery of information and communication tools, such as the fax system, this report should demonstrate to IDPH that its core partners believe it has the ability and expertise to issue timely and accurate instructions that can help them respond to a large-scale disease outbreak in Illinois. The conclusion will focus on three main areas: (1) the survey development process, (2) survey results: best practices and areas for improvement, and (3) recommendations: next steps.

  14. Use of domestic fuels for large-scale space heating and for district heating

    SciTech Connect

    Seppaelae, R.; Asplund, D.

    1980-01-01

The aim of the study was to survey the heating systems for large-scale space heating and district heating with domestic fuels, in use or under development in Finland, and to study alternative technico-economic applications in the size class of 0.5 - 5 MW.

  15. Strategic Leadership for Large-Scale Reform: The Case of England's National Literacy and Numeracy Strategy

    ERIC Educational Resources Information Center

    Leithwood, Kenneth; Jantzi, Doris; Earl, Lorna; Watson, Nancy; Levin, Benjamin; Fullan, Michael

    2004-01-01

    Both 'strategic' and 'distributed' forms of leadership are considered promising responses to the demands placed on school systems by large-scale reform initiatives. Using observation, interview and survey data collected as part of a larger evaluation of England's National Literacy and Numeracy Strategies, this study inquired about sources of…

  16. Understanding Participation in E-Learning in Organizations: A Large-Scale Empirical Study of Employees

    ERIC Educational Resources Information Center

    Garavan, Thomas N.; Carbery, Ronan; O'Malley, Grace; O'Donnell, David

    2010-01-01

    Much remains unknown in the increasingly important field of e-learning in organizations. Drawing on a large-scale survey of employees (N = 557) who had opportunities to participate in voluntary e-learning activities, the factors influencing participation in e-learning are explored in this empirical paper. It is hypothesized that key variables…

  17. Information Tailoring Enhancements for Large-Scale Social Data

    DTIC Science & Technology

    2016-06-15

Intelligent Automation Incorporated, Progress Report No. 3: Information Tailoring Enhancements for Large-Scale Social Data. Submitted in accordance with... The system also gathers information about entities from all news articles and displays it on over one million entity pages [5][6], and the information is made

  18. Recovering the full velocity and density fields from large-scale redshift-distance samples

    NASA Technical Reports Server (NTRS)

    Bertschinger, Edmund; Dekel, Avishai

    1989-01-01

    A new method for extracting the large-scale three-dimensional velocity and mass density fields from measurements of the radial peculiar velocities is presented. Galaxies are assumed to trace the velocity field rather than the mass. The key assumption made is that the Lagrangian velocity field has negligible vorticity, as might be expected from perturbations that grew by gravitational instability. By applying the method to cosmological N-body simulations, it is demonstrated that it accurately reconstructs the velocity field. This technique promises a direct determination of the mass density field and the initial conditions for the formation of large-scale structure from galaxy peculiar velocity surveys.

  19. Weak lensing of large scale structure in the presence of screening

    SciTech Connect

Tessore, Nicolas; Metcalf, R. Benton; Giocoli, Carlo

    2015-10-01

    A number of alternatives to general relativity exhibit gravitational screening in the non-linear regime of structure formation. We describe a set of algorithms that can produce weak lensing maps of large scale structure in such theories and can be used to generate mock surveys for cosmological analysis. By analysing a few basic statistics we indicate how these alternatives can be distinguished from general relativity with future weak lensing surveys.

  20. Reconstructing Information in Large-Scale Structure via Logarithmic Mapping

    NASA Astrophysics Data System (ADS)

    Szapudi, Istvan

We propose to develop a new method to extract information from large-scale structure data combining two-point statistics and non-linear transformations; before, this information was available only with substantially more complex higher-order statistical methods. Initially, most of the cosmological information in large-scale structure lies in two-point statistics. With non-linear evolution, some of that useful information leaks into higher-order statistics. The PI and group have shown in a series of theoretical investigations how that leakage occurs, and explained the Fisher information plateau at smaller scales. This plateau means that even as more modes are added to the measurement of the power spectrum, the total cumulative information (loosely speaking, the inverse error bar) is not increasing. Recently we have shown in Neyrinck et al. (2009, 2010) that a logarithmic (and a related Gaussianization or Box-Cox) transformation on the non-linear dark matter or galaxy field reconstructs a surprisingly large fraction of this missing Fisher information of the initial conditions. This was predicted by the earlier wave mechanical formulation of gravitational dynamics by Szapudi & Kaiser (2003). The present proposal is focused on working out the theoretical underpinning of the method to a point that it can be used in practice to analyze data. In particular, one needs to deal with the usual real-life issues of galaxy surveys, such as complex geometry, discrete sampling (Poisson or sub-Poisson noise), bias (linear or non-linear, deterministic or stochastic), redshift distortions, projection effects for 2D samples, and the effects of photometric redshift errors. We will develop methods for weak lensing and Sunyaev-Zeldovich power spectra as well, the latter specifically targeting Planck. In addition, we plan to investigate the question of residual higher-order information after the non-linear mapping, and possible applications for cosmology.
Our aim will be to work out
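The core logarithmic mapping is simple to demonstrate on a mock lognormal field: log(1 + δ) Gaussianizes the one-point distribution, which is the mechanism behind the recovered Fisher information. This is a toy illustration with an assumed field variance; real analyses work with the power spectrum of the transformed field.

```python
import numpy as np

rng = np.random.default_rng(1)

# Mock non-linear density contrast: lognormal with zero mean,
# a common approximation to the evolved matter field.
sigma_g = 1.0
g = rng.normal(0.0, sigma_g, 200_000)
delta = np.exp(g - 0.5 * sigma_g**2) - 1.0  # <delta> = 0, strongly skewed

def skewness(x):
    x = x - x.mean()
    return (x**3).mean() / (x**2).mean() ** 1.5

# The logarithmic mapping restores a Gaussian one-point PDF,
# pushing information back into two-point statistics.
log_field = np.log1p(delta)

print(round(skewness(delta), 1), round(skewness(log_field), 2))
```

The untransformed field has large positive skewness while the mapped field is (here exactly, in practice approximately) Gaussian; the proposal's real-life complications enter when the field is discretely sampled, biased, and redshift-distorted.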

  1. Testing gravity using large-scale redshift-space distortions

    NASA Astrophysics Data System (ADS)

    Raccanelli, Alvise; Bertacca, Daniele; Pietrobon, Davide; Schmidt, Fabian; Samushia, Lado; Bartolo, Nicola; Doré, Olivier; Matarrese, Sabino; Percival, Will J.

    2013-11-01

    We use luminous red galaxies from the Sloan Digital Sky Survey (SDSS) II to test the cosmological structure growth in two alternatives to the standard Λ cold dark matter (ΛCDM)+general relativity (GR) cosmological model. We compare observed three-dimensional clustering in SDSS Data Release 7 (DR7) with theoretical predictions for the standard vanilla ΛCDM+GR model, unified dark matter (UDM) cosmologies and the normal branch Dvali-Gabadadze-Porrati (nDGP) model. In computing the expected correlations in UDM cosmologies, we derive a parametrized formula for the growth factor in these models. For our analysis we apply the methodology tested in Raccanelli et al. and use the measurements of Samushia et al. that account for survey geometry, non-linear and wide-angle effects and the distribution of pair orientation. We show that the estimate of the growth rate is potentially degenerate with wide-angle effects, meaning that extremely accurate measurements of the growth rate on large scales will need to take such effects into account. We use measurements of the zeroth and second-order moments of the correlation function from SDSS DR7 data and the Large Suite of Dark Matter Simulations (LasDamas), and perform a likelihood analysis to constrain the parameters of the models. Using information on the clustering up to r_max = 120 h⁻¹ Mpc, and after marginalizing over the bias, we find, for UDM models, a speed of sound c∞ ≤ 6.1 × 10⁻⁴, and, for the nDGP model, a cross-over scale r_c ≥ 340 Mpc, at the 95 per cent confidence level.

  2. Distribution probability of large-scale landslides in central Nepal

    NASA Astrophysics Data System (ADS)

    Timilsina, Manita; Bhandary, Netra P.; Dahal, Ranjan Kumar; Yatabe, Ryuichi

    2014-12-01

    Large-scale landslides in the Himalaya are defined as huge, deep-seated landslide masses that occurred in the geological past. They are widely distributed in the Nepal Himalaya. The steep topography and high local relief provide high potential for such failures, whereas the dynamic geology and adverse climatic conditions play a key role in the occurrence and reactivation of such landslides. The major geoscientific problems related to such large-scale landslides are 1) difficulties in their identification and delineation, 2) their role as sources of small-scale failures, and 3) their reactivation. Only a few scientific publications exist concerning large-scale landslides in Nepal. In this context, the identification and quantification of large-scale landslides and their potential distribution are crucial. Therefore, this study explores the distribution of large-scale landslides in the Lesser Himalaya. It provides simple guidelines to identify large-scale landslides based on their typical characteristics and using a 3D schematic diagram. Based on the spatial distribution of landslides, geomorphological/geological parameters and logistic regression, an equation for large-scale landslide distribution is also derived. The equation is validated by applying it to another area: the area under the receiver operating characteristic curve of the landslide distribution probability in the new area is 0.699, and the distribution probability could explain > 65% of existing landslides. Therefore, the regression equation can be applied to areas of the Lesser Himalaya of central Nepal with similar geological and geomorphological conditions.
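
The workflow this abstract describes, fitting a logistic regression on geomorphological predictors and validating it with the area under the ROC curve, can be sketched in a few lines of numpy. Everything below is hypothetical: the predictors, coefficients, and synthetic labels are invented for illustration and do not reproduce the paper's parameters or its 0.699 result.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000

# Hypothetical geomorphological predictors (units and ranges invented).
slope = rng.uniform(0.0, 45.0, n)       # slope angle, degrees
relief = rng.uniform(0.0, 1500.0, n)    # local relief, m
X = np.column_stack([np.ones(n), slope / 45.0, relief / 1500.0])

# Synthetic landslide labels drawn from an assumed logistic model.
true_logit = -2.0 + 2.5 * X[:, 1] + 1.5 * X[:, 2]
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))).astype(float)

# Fit logistic regression by plain gradient ascent on the log-likelihood.
w = np.zeros(3)
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w += 0.5 * X.T @ (y - p) / n

# Area under the ROC curve via the rank-sum (Mann-Whitney) statistic.
def roc_auc(score, y):
    order = np.argsort(score)
    ranks = np.empty(n)
    ranks[order] = np.arange(1, n + 1)
    n_pos = y.sum()
    n_neg = n - n_pos
    return (ranks[y == 1.0].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

auc_val = roc_auc(X @ w, y)
print(f"AUC = {auc_val:.3f}")
```

As in the paper's validation step, the fitted probability surface is judged by how well it ranks observed landslide cells above non-landslide cells, which is exactly what the AUC measures.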

  3. Probing for Dark Energy Perturbations using the CMB and Large Scale Structure?

    NASA Astrophysics Data System (ADS)

    Bean, Rachel; Doré, Olivier

    2004-12-01

    We review the implications of having a non-trivial dark energy component in the universe and the potential for detecting such a component through the matter power spectrum and the ISW effect. We adopt a phenomenological approach and consider the mysterious dark energy to be a cosmic fluid. It is thus fully characterized, up to linear order, by its equation of state and its speed of sound. Whereas the equation of state has been widely studied in the literature, less interest has been devoted to the speed of sound. Its observational consequences come predominantly from very large scale modes of dark matter perturbations (k < 0.01 h Mpc⁻¹). Since these modes have hardly been probed so far by large scale galaxy surveys, we investigate the joint constraints that can be placed on these two quantities using the recent CMB fluctuation measurements by WMAP as well as the recently measured CMB-large scale structure cross-correlation.

  4. Non-Gaussianity and large-scale structure in a two-field inflationary model

    SciTech Connect

    Tseliakhovich, Dmitriy; Hirata, Christopher

    2010-08-15

    Single-field inflationary models predict nearly Gaussian initial conditions, and hence a detection of non-Gaussianity would be a signature of the more complex inflationary scenarios. In this paper we study the effect on the cosmic microwave background and on large-scale structure from primordial non-Gaussianity in a two-field inflationary model in which both the inflaton and curvaton contribute to the density perturbations. We show that in addition to the previously described enhancement of the galaxy bias on large scales, this setup results in large-scale stochasticity. We provide joint constraints on the local non-Gaussianity parameter f̃_NL and the ratio ξ of the amplitude of primordial perturbations due to the inflaton and curvaton using WMAP and Sloan Digital Sky Survey data.

  5. Non-Gaussianity and Large Scale Structure in a two-field Inflationary model

    SciTech Connect

    Tseliakhovich, D.; Slosar, A.; Hirata, C.

    2010-08-30

    Single-field inflationary models predict nearly Gaussian initial conditions, and hence a detection of non-Gaussianity would be a signature of the more complex inflationary scenarios. In this paper we study the effect on the cosmic microwave background and on large-scale structure from primordial non-Gaussianity in a two-field inflationary model in which both the inflaton and curvaton contribute to the density perturbations. We show that in addition to the previously described enhancement of the galaxy bias on large scales, this setup results in large-scale stochasticity. We provide joint constraints on the local non-Gaussianity parameter f̃_NL and the ratio ξ of the amplitude of primordial perturbations due to the inflaton and curvaton using WMAP and Sloan Digital Sky Survey data.

  6. A Review of International Large-Scale Assessments in Education: Assessing Component Skills and Collecting Contextual Data. PISA for Development

    ERIC Educational Resources Information Center

    Cresswell, John; Schwantner, Ursula; Waters, Charlotte

    2015-01-01

    This report reviews the major international and regional large-scale educational assessments, including international surveys, school-based surveys and household-based surveys. The report compares and contrasts the cognitive and contextual data collection instruments and implementation methods used by the different assessments in order to identify…

  7. School-Based Health Care State Policy Survey. Executive Summary

    ERIC Educational Resources Information Center

    National Assembly on School-Based Health Care, 2012

    2012-01-01

    The National Assembly on School-Based Health Care (NASBHC) surveys state public health and Medicaid offices every three years to assess state-level public policies and activities that promote the growth and sustainability of school-based health services. The FY2011 survey found 18 states (see map below) reporting investments explicitly dedicated…

  8. Determining Environmental Impacts of Large Scale Irrigation in Turkey

    NASA Astrophysics Data System (ADS)

    Simpson, K.; Douglas, E. M.; Limbrunner, J. F.; Ozertan, G.

    2010-12-01

    In 1989, the Turkish government launched its most comprehensive regional development plan in history, entitled the Southeastern Anatolia Project (SAP), which focuses on improving the quality of life and income level within the most underdeveloped region in Turkey. This project aims to integrate sustainable human development through agriculture, industry, transportation, education, health and rural and urban infrastructure building. In May 2008, a new action plan was announced for the region which includes the designation of almost 800,000 hectares of previously unirrigated land to be opened for irrigation within the next five years. If not done in a sustainable manner, such a large-scale irrigation project could cause severe environmental impacts. The first objective of our research is to use computer simulations to reproduce the observed environmental impacts of irrigated agriculture in this arid region, primarily by simulating the effects of soil salinization. The second objective is to estimate the soil salinization that could result from expanded irrigation and to suggest sustainable strategies for the newly irrigated land in Turkey in order to minimize these environmental impacts.

  9. A health survey of toll booth workers

    SciTech Connect

    Strauss, P.; Orris, P.; Buckley, L.

    1992-01-01

    The prevalence of respiratory and other health problems in a cohort of highway toll booth workers was surveyed by mailed questionnaire. Although the proportion of respondents was low (43.2%), a high prevalence of central nervous system complaints (headaches, irritability or anxiety, and unusual tiredness), mucous membrane irritation (eye irritation, nasal congestion, and dry throat), and musculoskeletal problems (joint and back pains) was found. We believe these symptoms reflect the acute irritant and central nervous system effects of exposure to motor vehicle exhaust. The musculoskeletal complaints are likely the result of bending, reaching, and leaning out of the toll booth. These results suggest the need for in-depth evaluation of the ventilation systems and of the ergonomic and job stressors of work at toll booths.

  10. Learning networks for sustainable, large-scale improvement.

    PubMed

    McCannon, C Joseph; Perla, Rocco J

    2009-05-01

    Large-scale improvement efforts known as improvement networks offer structured opportunities for exchange of information and insights into the adaptation of clinical protocols to a variety of settings.

  11. Needs, opportunities, and options for large scale systems research

    SciTech Connect

    Thompson, G.L.

    1984-10-01

    The Office of Energy Research was recently asked to perform a study of Large Scale Systems in order to facilitate the development of a true large systems theory. It was decided to ask experts in the fields of electrical engineering, chemical engineering and manufacturing/operations research for their ideas concerning large scale systems research. The author was asked to distribute a questionnaire among these experts to find out their opinions concerning recent accomplishments and future research directions in large scale systems research. He was also requested to convene a conference which included three experts in each area as panel members to discuss the general area of large scale systems research. The conference was held on March 26-27, 1984 in Pittsburgh with nine panel members and 15 other attendees. The present report is a summary of the ideas presented and the recommendations proposed by the attendees.

  12. EFFECTS OF LARGE-SCALE POULTRY FARMS ON AQUATIC MICROBIAL COMMUNITIES: A MOLECULAR INVESTIGATION.

    EPA Science Inventory

    The effects of large-scale poultry production operations on water quality and human health are largely unknown. Poultry litter is frequently applied as fertilizer to agricultural lands adjacent to large poultry farms. Run-off from the land introduces a variety of stressors into t...

  13. Bayesian hierarchical model for large-scale covariance matrix estimation.

    PubMed

    Zhu, Dongxiao; Hero, Alfred O

    2007-12-01

    Many bioinformatics problems implicitly depend on estimating a large-scale covariance matrix. The traditional approaches tend to give rise to high variance and low accuracy due to "overfitting." We cast the large-scale covariance matrix estimation problem into the Bayesian hierarchical model framework, and introduce dependency between covariance parameters. We demonstrate the advantages of our approaches over the traditional approaches using simulations and OMICS data analysis.
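
The bias-variance motivation behind regularized covariance estimation can be seen in a minimal sketch. This is not the authors' Bayesian hierarchical model; it substitutes the simplest possible alternative, a fixed linear shrinkage of the sample covariance toward a diagonal target, and the "true" covariance and shrinkage weight are invented for illustration. In the p > n regime typical of omics data, even this crude regularization beats the raw sample covariance.

```python
import numpy as np

rng = np.random.default_rng(2)
p, n = 50, 20   # more variables than samples, as in omics data

# Hypothetical "true" covariance with AR(1)-style correlations.
idx = np.arange(p)
Sigma = 0.3 ** np.abs(idx[:, None] - idx[None, :])
X = rng.normal(size=(n, p)) @ np.linalg.cholesky(Sigma).T

S = np.cov(X, rowvar=False)      # noisy sample covariance
target = np.diag(np.diag(S))     # structured shrinkage target

lam = 0.5                        # fixed shrinkage weight (illustrative)
S_shrunk = (1.0 - lam) * S + lam * target

def frob_err(M):
    """Frobenius-norm error against the true covariance."""
    return np.linalg.norm(M - Sigma)

print(frob_err(S), frob_err(S_shrunk))
```

A hierarchical Bayesian model, as in the paper, effectively learns the amount and structure of this shrinkage from the data instead of fixing it by hand.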

  14. A study of MLFMA for large-scale scattering problems

    NASA Astrophysics Data System (ADS)

    Hastriter, Michael Larkin

    This research is centered in computational electromagnetics with a focus on solving large-scale problems accurately in a timely fashion using first-principles physics. Error control of the translation operator in 3-D is shown. A parallel implementation of the multilevel fast multipole algorithm (MLFMA) was studied in terms of parallel efficiency and scaling. The large-scale scattering program (LSSP), based on the ScaleME library, was used to solve ultra-large-scale problems including a 200λ sphere with 20 million unknowns. As these large-scale problems were solved, techniques were developed to accurately estimate the memory requirements. Careful memory management is needed in order to solve these massive problems. The study of MLFMA in large-scale problems revealed significant errors that stemmed from inconsistencies in constants used by different parts of the algorithm. These were fixed to produce the most accurate data possible for large-scale surface scattering problems. Data was calculated on a missile-like target using both high frequency methods and MLFMA. This data was compared and analyzed to determine possible strategies to increase data acquisition speed and accuracy through hybridization of multiple computation methods.

  15. A review of national health surveys in India.

    PubMed

    Dandona, Rakhi; Pandey, Anamika; Dandona, Lalit

    2016-04-01

    Several rounds of national health surveys have generated a vast amount of data in India since 1992. We describe and compare the key health information gathered, assess the availability of health data in the public domain, and review publications resulting from the National Family Health Survey (NFHS), the District Level Household Survey (DLHS) and the Annual Health Survey (AHS). We highlight issues that need attention to improve the usefulness of the surveys in monitoring changing trends in India's disease burden: (i) inadequate coverage of noncommunicable diseases, injuries and some major communicable diseases; (ii) modest comparability between surveys on the key themes of child and maternal mortality and immunization to understand trends over time; (iii) short time intervals between the most recent survey rounds; and (iv) delays in making individual-level data available for analysis in the public domain. We identified 337 publications using NFHS data; in contrast, only 48 and three publications used data from the DLHS and AHS, respectively. As national surveys are resource-intensive, it would be prudent to maximize their benefits. We suggest that India plan for a single major national health survey at five-year intervals in consultation with key stakeholders. This could cover additional major causes of the disease burden and their risk factors, as well as causes of death and adult mortality rate estimation. If done in a standardized manner, such a survey would provide useable and timely data to inform health interventions and facilitate assessment of their impact on population health.

  16. Design and methodology of a mixed methods follow-up study to the 2014 Ghana Demographic and Health Survey

    PubMed Central

    Staveteig, Sarah; Aryeetey, Richmond; Anie-Ansah, Michael; Ahiadeke, Clement; Ortiz, Ladys

    2017-01-01

    ABSTRACT Background: The intended meaning behind responses to standard questions posed in large-scale health surveys are not always well understood. Systematic follow-up studies, particularly those which pose a few repeated questions followed by open-ended discussions, are well positioned to gauge stability and consistency of data and to shed light on the intended meaning behind survey responses. Such follow-up studies require extensive coordination and face challenges in protecting respondent confidentiality during the process of recontacting and reinterviewing participants. Objectives: We describe practical field strategies for undertaking a mixed methods follow-up study during a large-scale health survey. Methods: The study was designed as a mixed methods follow-up study embedded within the 2014 Ghana Demographic and Health Survey (GDHS). The study was implemented in 13 clusters. Android tablets were used to import reference data from the parent survey and to administer the questionnaire, which asked a mixture of closed- and open-ended questions on reproductive intentions, decision-making, and family planning. Results: Despite a number of obstacles related to recontacting respondents and concern about respondent fatigue, over 92 percent of the selected sub-sample were successfully recontacted and reinterviewed; all consented to audio recording. A confidential linkage between GDHS data, follow-up tablet data, and audio transcripts was successfully created for the purpose of analysis. Conclusions: We summarize the challenges in follow-up study design, including ethical considerations, sample size, auditing, filtering, successful use of tablets, and share lessons learned for future such follow-up surveys. PMID:28145817

  17. Inflationary tensor fossils in large-scale structure

    SciTech Connect

    Dimastrogiovanni, Emanuela; Fasiello, Matteo; Jeong, Donghui; Kamionkowski, Marc E-mail: mrf65@case.edu E-mail: kamion@jhu.edu

    2014-12-01

    Inflation models make specific predictions for a tensor-scalar-scalar three-point correlation, or bispectrum, between one gravitational-wave (tensor) mode and two density-perturbation (scalar) modes. This tensor-scalar-scalar correlation leads to a local power quadrupole, an apparent departure from statistical isotropy in our Universe, as well as characteristic four-point correlations in the current mass distribution in the Universe. So far, the predictions for these observables have been worked out only for single-clock models in which certain consistency conditions between the tensor-scalar-scalar correlation and tensor and scalar power spectra are satisfied. Here we review the requirements on inflation models for these consistency conditions to be satisfied. We then consider several examples of inflation models, such as non-attractor and solid-inflation models, in which these conditions are put to the test. In solid inflation the simplest consistency conditions are already violated whilst in the non-attractor model we find that, contrary to the standard scenario, the tensor-scalar-scalar correlator probes directly relevant model-dependent information. We work out the predictions for observables in these models. For non-attractor inflation we find an apparent local quadrupolar departure from statistical isotropy in large-scale structure but that this power quadrupole decreases very rapidly at smaller scales. The consistency of the CMB quadrupole with statistical isotropy then constrains the distance scale that corresponds to the transition from the non-attractor to attractor phase of inflation to be larger than the currently observable horizon. Solid inflation predicts clustering fossils signatures in the current galaxy distribution that may be large enough to be detectable with forthcoming, and possibly even current, galaxy surveys.

  18. Cosmological parameters from large scale structure - geometric versus shape information

    SciTech Connect

    Hamann, Jan; Hannestad, Steen; Lesgourgues, Julien; Rampf, Cornelius; Wong, Yvonne Y.Y. E-mail: sth@phys.au.dk E-mail: rampf@physik.rwth-aachen.de

    2010-07-01

    The matter power spectrum as derived from large scale structure (LSS) surveys contains two important and distinct pieces of information: an overall smooth shape and the imprint of baryon acoustic oscillations (BAO). We investigate the separate impact of these two types of information on cosmological parameter estimation for current data, and show that for the simplest cosmological models, the broad-band shape information currently contained in the SDSS DR7 halo power spectrum (HPS) is by far superseded by geometric information derived from the baryonic features. An immediate corollary is that contrary to popular beliefs, the upper limit on the neutrino mass m{sub ν} presently derived from LSS combined with cosmic microwave background (CMB) data does not in fact arise from the possible small-scale power suppression due to neutrino free-streaming, if we limit the model framework to minimal ΛCDM+m{sub ν}. However, in more complicated models, such as those extended with extra light degrees of freedom and a dark energy equation of state parameter w differing from -1, shape information becomes crucial for the resolution of parameter degeneracies. This conclusion will remain true even when data from the Planck spacecraft are combined with SDSS DR7 data. In the course of our analysis, we update both the BAO likelihood function by including an exact numerical calculation of the time of decoupling, as well as the HPS likelihood, by introducing a new dewiggling procedure that generalises the previous approach to models with an arbitrary sound horizon at decoupling. These changes allow a consistent application of the BAO and HPS data sets to a much wider class of models, including the ones considered in this work. All the cases considered here are compatible with the conservative 95%-bounds Σm{sub ν} < 1.16eV, N{sub eff} = 4.8±2.0.
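
The separation this abstract relies on, an overall smooth broadband shape versus the BAO imprint, can be illustrated with a toy spectrum. The sketch below is a schematic stand-in, not the authors' generalized dewiggling procedure: the power law, oscillation amplitude, and polynomial order are all invented for illustration.

```python
import numpy as np

# Toy P(k): a smooth power law times a small BAO-like oscillation
# (amplitudes and scales are illustrative, not fitted to any data).
k = np.linspace(0.02, 0.3, 200)            # h/Mpc
P_broadband = 2.0e4 * (k / 0.05) ** -1.5
wiggles_true = 0.05 * np.sin(150.0 * k)    # oscillation for a sound horizon ~150 Mpc/h
P = P_broadband * (1.0 + wiggles_true)

# "Dewiggling" toy: fit a low-order polynomial to log P vs log k; the fit
# captures the broadband shape, and the residual isolates the oscillation.
coef = np.polyfit(np.log(k), np.log(P), 3)
P_smooth = np.exp(np.polyval(coef, np.log(k)))
wiggles_est = P / P_smooth - 1.0

corr = np.corrcoef(wiggles_est, wiggles_true)[0, 1]
print(f"correlation with injected wiggles: {corr:.3f}")
```

The point of the split is the one the abstract makes: geometric (BAO) information lives in the extracted wiggles, while shape information lives in the smooth component.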

  19. Large-scale filaments associated with Milky Way spiral arms

    NASA Astrophysics Data System (ADS)

    Wang, Ke; Testi, Leonardo; Ginsburg, Adam; Walmsley, C. Malcolm; Molinari, Sergio; Schisano, Eugenio

    2015-07-01

    The ubiquity of filamentary structure at various scales throughout the Galaxy has triggered a renewed interest in their formation, evolution, and role in star formation. The largest filaments can reach up to Galactic scale as part of the spiral arm structure. However, such large-scale filaments are hard to identify systematically due to limitations in identification methodology (i.e. as extinction features). We present a new approach to directly search for the largest, coldest, and densest filaments in the Galaxy, making use of sensitive Herschel Hi-GAL (Herschel Infrared Galactic Plane Survey) data complemented by spectral line cubes. We present a sample of the nine most prominent Herschel filaments, including six identified from a pilot search field plus three from outside the field. These filaments measure 37-99 pc long and 0.6-3.0 pc wide with masses of (0.5-8.3) × 10⁴ M⊙, and beam-averaged (28 arcsec, or 0.4-0.7 pc) peak H2 column densities of (1.7-9.3) × 10²² cm⁻². The bulk of the filaments are relatively cold (17-21 K), while some local clumps have dust temperatures up to 25-47 K. All the filaments are located within ≲60 pc of the Galactic mid-plane. Comparing the filaments to a recent spiral arm model incorporating the latest parallax measurements, we find that 7/9 of them reside within arms, but most are close to arm edges. These filaments are comparable in length to the Galactic scale height and therefore are not simply part of a grander turbulent cascade.
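
The quoted sizes, column densities, and masses are mutually consistent, which a quick order-of-magnitude check makes explicit. The sketch below models a filament as a uniform slab; the particular length, width, and mean column density are illustrative picks from the quoted ranges, not a re-measurement.

```python
import numpy as np

# Physical constants (cgs).
pc = 3.086e18        # cm per parsec
m_H = 1.674e-24      # hydrogen atom mass, g
M_sun = 1.989e33     # solar mass, g
mu_H2 = 2.8          # mean molecular weight per H2 molecule, including He

# Hypothetical filament: 50 pc long, 1 pc wide, mean N(H2) ~ 1e22 cm^-2
# (a mean column somewhat below the quoted beam-averaged peaks).
length, width = 50.0 * pc, 1.0 * pc   # cm
N_H2 = 1.0e22                         # cm^-2

# Mass of a uniform slab: M = mu * m_H * N(H2) * area.
mass_msun = mu_H2 * m_H * N_H2 * length * width / M_sun
print(f"{mass_msun:.1e} Msun")
```

The result lands at roughly 10⁴ M⊙, in line with the (0.5-8.3) × 10⁴ M⊙ range the survey reports.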

  20. Recursive architecture for large-scale adaptive system

    NASA Astrophysics Data System (ADS)

    Hanahara, Kazuyuki; Sugiyama, Yoshihiko

    1994-09-01

    'Large scale' is one of the major trends in recent engineering research and development, especially in the field of aerospace structural systems. The term refers to the overall size of an artifact, but it usually also implies a large number of constituent components. A large scale system deployed in remote space or in the deep sea should be adaptive as well as robust by itself, because control and maintenance by human operators are difficult at such distances. One approach to realizing such a large scale, adaptive and robust system is to build it as an assemblage of components that are each adaptive by themselves. In this case, the robustness of the system can be achieved by using a large number of such components together with suitable adaptation and maintenance strategies. Such systems have attracted considerable research interest, with studies reported on topics such as decentralized motion control, configuration algorithms and the characteristics of structural elements. In this article, a recursive architecture concept is developed and discussed towards the realization of a large scale system consisting of a number of uniform adaptive components. We propose an adaptation strategy based on the architecture and its implementation by means of hierarchically connected processing units. The robustness of the system and its restoration from degeneration of a processing unit are also discussed. Two- and three-dimensional adaptive truss structures are conceptually designed based on the recursive architecture.

  1. EINSTEIN'S SIGNATURE IN COSMOLOGICAL LARGE-SCALE STRUCTURE

    SciTech Connect

    Bruni, Marco; Hidalgo, Juan Carlos; Wands, David

    2014-10-10

    We show how the nonlinearity of general relativity generates a characteristic non-Gaussian signal in cosmological large-scale structure that we calculate at all perturbative orders in a large-scale limit. Newtonian gravity and general relativity provide complementary theoretical frameworks for modeling large-scale structure in ΛCDM cosmology; a relativistic approach is essential to determine initial conditions, which can then be used in Newtonian simulations studying the nonlinear evolution of the matter density. Most inflationary models in the very early universe predict an almost Gaussian distribution for the primordial metric perturbation, ζ. However, we argue that it is the Ricci curvature of comoving-orthogonal spatial hypersurfaces, R, that drives structure formation at large scales. We show how the nonlinear relation between the spatial curvature, R, and the metric perturbation, ζ, translates into a specific non-Gaussian contribution to the initial comoving matter density that we calculate for the simple case of an initially Gaussian ζ. Our analysis shows the nonlinear signature of Einstein's gravity in large-scale structure.

  2. Process monitoring during manufacturing of large-scale composite parts

    NASA Astrophysics Data System (ADS)

    Heider, Dirk; Eckel, Douglas A., II; Don, Roderic C.; Fink, Bruce K.; Gillespie, John W., Jr.

    1999-01-01

    One of the inherent problems with the processing of composites is the development of internal stresses and the resulting warpage, which results in out-of-tolerance components. This investigation examines possible fiber-optic sensor methods that can be applied to measure internal strain, and thus residual stress, during production. Extrinsic Fabry-Perot interferometers (EFPIs) and Bragg gratings are utilized to monitor the strain behavior during manufacturing of large-scale composite parts. Initially, a 24 in X 18 in X 1 in thick part was manufactured using the vacuum-assisted resin transfer molding (VARTM) technique. In this part, one Bragg grating, multiple thermocouples and a resin flow sensor (SMARTweave) were integrated to measure the flow and cure behavior during production. An AGEMA thermal image camera verified the temperature history on the part surface. In addition, several EFPIs and Bragg gratings were implemented into three 13 ft X 32 ft X 20.3 in civilian bridge deck test specimens manufactured with the VARTM process. The Bragg gratings showed great promise in capturing the changes in strain due to residual stress during cure. The actual implementation of fiber optics into large composite parts is a challenge, and the problems of sensor survivability in these parts are addressed in this study. The fiber optic measurements, in combination with SMARTweave's ability to monitor flow, could lead to a sensor system that allows feedback for process control of the VARTM technique. In addition, the optical fibers will be used for health monitoring during the lifetime of the part.

  3. Survey on Continuing Education Needs for Health Professionals: Report.

    ERIC Educational Resources Information Center

    System Development Corp., Santa Monica, CA.

    The report documents the results of a 1967 survey of health professionals in the four-State Western Interstate Commission for Higher Education (WICHE) Mountain States Regional Medical Program (MS/RMP). Addressed to health professionals in each of the four States--Idaho, Montana, Nevada, and Wyoming--the survey focuses primarily on the…

  4. HARRISBURG TRI-COUNTY HEALTH MANPOWER SURVEY REPORT. PRELIMINARY.

    ERIC Educational Resources Information Center

    RATNER, MURIEL

    THE HARRISBURG AREA COMMUNITY COLLEGE COOPERATED WITH TWO HOSPITALS IN A SURVEY OF THE AREA'S NEEDS FOR HEALTH TECHNICIANS. DATA, COLLECTED BY QUESTIONNAIRE SURVEYS OF DOCTORS AND DENTISTS AND BY INTERVIEWS WITH ADMINISTRATORS OF HOSPITALS, NURSING HOMES AND PROFESSIONAL ORGANIZATIONS, INDICATED THAT (1) A 60-PERCENT INCREASE IN HEALTH MANPOWER…

  5. Worksite Health Promotion Activities. 1992 National Survey. Summary Report.

    ERIC Educational Resources Information Center

    Public Health Service (DHHS), Rockville, MD. Office of Disease Prevention and Health Promotion.

    The survey reported in this document examined worksite health promotion and disease prevention activities in 1,507 private worksites in the United States. Specifically, the survey assessed policies, practices, services, facilities, information, and activities sponsored by employers to improve the health of their employees, and assessed health…

  6. Numerical methods for large-scale, time-dependent partial differential equations

    NASA Technical Reports Server (NTRS)

    Turkel, E.

    1979-01-01

    A survey of numerical methods for time dependent partial differential equations is presented. The emphasis is on practical applications to large scale problems. A discussion of new developments in high order methods and moving grids is given. The importance of boundary conditions is stressed for both internal and external flows. A description of implicit methods is presented including generalizations to multidimensions. Shocks, aerodynamics, meteorology, plasma physics and combustion applications are also briefly described.
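
The survey's emphasis on implicit methods for large scale problems comes down to stability: explicit schemes are bound by a time-step limit that shrinks with the spatial grid, while implicit schemes are not. A minimal 1-D heat equation example (grid and step sizes chosen for illustration, deliberately violating the explicit limit dt ≤ dx²/2) makes the contrast concrete:

```python
import numpy as np

# Explicit vs. implicit Euler for the 1-D heat equation u_t = u_xx
# with homogeneous Dirichlet boundaries, a classic stiff toy problem.
N = 50
dx = 1.0 / N
x = np.linspace(0.0, 1.0, N + 1)
u_exp = np.sin(np.pi * x)[1:-1].copy()   # interior values only
u_imp = u_exp.copy()

dt = 5.0e-4   # exceeds the explicit stability limit dx^2 / 2 = 2e-4

# Second-difference Laplacian on the interior points.
main = -2.0 * np.ones(N - 1)
off = np.ones(N - 2)
L = (np.diag(main) + np.diag(off, 1) + np.diag(off, -1)) / dx**2
I = np.eye(N - 1)

for _ in range(200):
    u_exp = u_exp + dt * (L @ u_exp)            # forward Euler: unstable here
    u_imp = np.linalg.solve(I - dt * L, u_imp)  # backward Euler: unconditionally stable

print(np.max(np.abs(u_exp)), np.max(np.abs(u_imp)))
```

The explicit solution is destroyed by exponentially amplified round-off in the high-frequency modes, while the implicit solution decays smoothly, which is why implicit methods (and their multidimensional generalizations discussed in the survey) are preferred for stiff, large scale problems despite the cost of a linear solve per step.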

  7. Modulation analysis of large-scale discrete vortices.

    PubMed

    Cisneros, Luis A; Minzoni, Antonmaria A; Panayotaros, Panayotis; Smyth, Noel F

    2008-09-01

    The behavior of large-scale vortices governed by the discrete nonlinear Schrödinger equation is studied. Using a discrete version of modulation theory, it is shown how vortices are trapped and stabilized by the self-consistent Peierls-Nabarro potential that they generate in the lattice. Large-scale circular and polygonal vortices are studied away from the anticontinuum limit, which is the limit considered in previous studies. In addition numerical studies are performed on large-scale, straight structures, and it is found that they are stabilized by a nonconstant mean level produced by standing waves generated at the ends of the structure. Finally, numerical evidence is produced for long-lived, localized, quasiperiodic structures.

  8. Large-scale simulations of complex physical systems

    NASA Astrophysics Data System (ADS)

    Belić, A.

    2007-04-01

    Scientific computing has become a tool as vital as experimentation and theory for dealing with scientific challenges of the twenty-first century. Large scale simulations and modelling serve as heuristic tools in a broad problem-solving process. High-performance computing facilities make possible the first step in this process - a view of new and previously inaccessible domains in science and the building up of intuition regarding the new phenomenology. The final goal of this process is to translate this newly found intuition into better algorithms and new analytical results. In this presentation we give an outline of the research themes pursued at the Scientific Computing Laboratory of the Institute of Physics in Belgrade regarding large-scale simulations of complex classical and quantum physical systems, and present recent results obtained in the large-scale simulations of granular materials and path integrals.

  9. Large-scale velocity structures in turbulent thermal convection.

    PubMed

    Qiu, X L; Tong, P

    2001-09-01

    A systematic study of large-scale velocity structures in turbulent thermal convection is carried out in three different aspect-ratio cells filled with water. Laser Doppler velocimetry is used to measure the velocity profiles and statistics over varying Rayleigh numbers Ra and at various spatial positions across the whole convection cell. Large velocity fluctuations are found both in the central region and near the cell boundary. Despite the large velocity fluctuations, the flow field still maintains a large-scale quasi-two-dimensional structure, which rotates in a coherent manner. This coherent single-roll structure scales with Ra and can be divided into three regions in the rotation plane: (1) a thin viscous boundary layer, (2) a fully mixed central core region with a constant mean velocity gradient, and (3) an intermediate plume-dominated buffer region. The experiment reveals a unique driving mechanism for the large-scale coherent rotation in turbulent convection.

  10. Acoustic Studies of the Large Scale Ocean Circulation

    NASA Technical Reports Server (NTRS)

    Menemenlis, Dimitris

    1999-01-01

    Detailed knowledge of ocean circulation and its transport properties is prerequisite to an understanding of the earth's climate and of important biological and chemical cycles. Results from two recent experiments, THETIS-2 in the Western Mediterranean and ATOC in the North Pacific, illustrate the use of ocean acoustic tomography for studies of the large scale circulation. The attraction of acoustic tomography is its ability to sample and average the large-scale oceanic thermal structure, synoptically, along several sections, and at regular intervals. In both studies, the acoustic data are compared to, and then combined with, general circulation models, meteorological analyses, satellite altimetry, and direct measurements from ships. Both studies provide complete regional descriptions of the time-evolving, three-dimensional, large scale circulation, albeit with large uncertainties. The studies raise serious issues about existing ocean observing capability and provide guidelines for future efforts.

  11. A relativistic signature in large-scale structure

    NASA Astrophysics Data System (ADS)

    Bartolo, Nicola; Bertacca, Daniele; Bruni, Marco; Koyama, Kazuya; Maartens, Roy; Matarrese, Sabino; Sasaki, Misao; Verde, Licia; Wands, David

    2016-09-01

    In General Relativity, the constraint equation relating metric and density perturbations is inherently nonlinear, leading to an effective non-Gaussianity in the dark matter density field on large scales-even if the primordial metric perturbation is Gaussian. Intrinsic non-Gaussianity in the large-scale dark matter overdensity in GR is real and physical. However, the variance smoothed on a local physical scale is not correlated with the large-scale curvature perturbation, so that there is no relativistic signature in the galaxy bias when using the simplest model of bias. It is an open question whether the observable mass proxies such as luminosity or weak lensing correspond directly to the physical mass in the simple halo bias model. If not, there may be observables that encode this relativistic signature.

  12. Toward Improved Support for Loosely Coupled Large Scale Simulation Workflows

    SciTech Connect

    Boehm, Swen; Elwasif, Wael R; Naughton, III, Thomas J; Vallee, Geoffroy R

    2014-01-01

    High-performance computing (HPC) workloads are increasingly leveraging loosely coupled large scale simulations. Unfortunately, most large-scale HPC platforms, including Cray/ALPS environments, are designed for the execution of long-running jobs based on coarse-grained launch capabilities (e.g., one MPI rank per core on all allocated compute nodes). This assumption limits capability-class workload campaigns that require large numbers of discrete or loosely coupled simulations, and where time-to-solution is an untenable pacing issue. This paper describes the challenges related to the support of fine-grained launch capabilities that are necessary for the execution of loosely coupled large scale simulations on Cray/ALPS platforms. More precisely, we present the details of an enhanced runtime system to support this use case, and report on initial results from early testing on systems at Oak Ridge National Laboratory.

  13. PKI security in large-scale healthcare networks.

    PubMed

    Mantas, Georgios; Lymberopoulos, Dimitrios; Komninos, Nikos

    2012-06-01

    During the past few years, many PKIs (public key infrastructures) have been proposed for healthcare networks in order to ensure secure communication services and exchange of data among healthcare professionals. However, these healthcare PKIs face a plethora of challenges, especially those deployed over large-scale healthcare networks. In this paper, we propose a PKI to ensure security in a large-scale Internet-based healthcare network connecting a wide spectrum of healthcare units geographically distributed within a wide region. Furthermore, the proposed PKI addresses the trust issues that arise in a large-scale healthcare network comprising multi-domain PKIs.

  14. Efficiency of workplace surveys conducted by Finnish occupational health services.

    PubMed

    Savinainen, Minna; Oksa, Panu

    2011-07-01

    In Finland, workplace surveys are used to identify and assess health risks and problems caused by work and make suggestions for continuous improvement of the work environment. With the aid of the workplace survey, occupational health services can be tailored to a company. The aims of this study were to determine how occupational health professionals gather data via the workplace survey and the effect survey results have on companies. A total of 259 occupational health nurses and 108 occupational health physicians responded to the questionnaire: 84.2% were women and 15.8% were men. The mean age of the respondents was 48.8 years (range, 26 to 65 years). Usually occupational health nurses and foremen and sometimes occupational health physicians and occupational safety and health representatives initiate the workplace survey. More than 90% of the surveys were followed by action proposals, and about 50% of these were implemented. The proposals implemented most often concerned personal protective equipment and less often leadership. Survey respondents should have both the opportunity and the authority to affect resources, the work environment, work arrangements, and tools. Teamwork among occupational health and safety professionals, management, and employees is vital for cost-effectively solving today's complex problems at workplaces around the globe.

  15. Brief 73 Health Physics Enrollments and Degrees Survey, 2013 Data

    SciTech Connect

    None, None

    2014-02-15

    The survey includes degrees granted between September 1, 2012 and August 31, 2013. Enrollment information refers to the fall term 2013. Twenty-two academic programs were included in the survey universe, with all 22 programs providing data. Since 2009, data for two health physics programs located in engineering departments are also included in the nuclear engineering survey. The enrollments and degrees data includes students majoring in health physics or in an option program equivalent to a major.

  16. Brief 75 Health Physics Enrollments and Degrees Survey, 2014 Data

    SciTech Connect

    None, None

    2015-03-05

    The 2014 survey includes degrees granted between September 1, 2013 and August 31, 2014. Enrollment information refers to the fall term 2014. Twenty-two academic programs were included in the survey universe, with all 22 programs providing data. Since 2009, data for two health physics programs located in engineering departments are also included in the nuclear engineering survey. The enrollments and degrees data includes students majoring in health physics or in an option program equivalent to a major.

  17. Magnetic Helicity and Large Scale Magnetic Fields: A Primer

    NASA Astrophysics Data System (ADS)

    Blackman, Eric G.

    2015-05-01

    Magnetic fields of laboratory, planetary, stellar, and galactic plasmas commonly exhibit significant order on large temporal or spatial scales compared to the otherwise random motions within the hosting system. Such ordered fields can be measured in the case of planets, stars, and galaxies, or inferred indirectly by the action of their dynamical influence, such as jets. Whether large scale fields are amplified in situ or a remnant from previous stages of an object's history is often debated for objects without a definitive magnetic activity cycle. Magnetic helicity, a measure of twist and linkage of magnetic field lines, is a unifying tool for understanding large scale field evolution for both mechanisms of origin. Its importance stems from its two basic properties: (1) magnetic helicity is typically better conserved than magnetic energy; and (2) the magnetic energy associated with a fixed amount of magnetic helicity is minimized when the system relaxes this helical structure to the largest scale available. Here I discuss how magnetic helicity has come to help us understand the saturation of and sustenance of large scale dynamos, the need for either local or global helicity fluxes to avoid dynamo quenching, and the associated observational consequences. I also discuss how magnetic helicity acts as a hindrance to turbulent diffusion of large scale fields, and thus a helper for fossil remnant large scale field origin models in some contexts. I briefly discuss the connection between large scale fields and accretion disk theory as well. The goal here is to provide a conceptual primer to help the reader efficiently penetrate the literature.

  18. [Colombia. Prevalence, Demography and Health Survey 1990].

    PubMed

    1991-06-01

    Colombia's 1990 Survey of Prevalence, Demography, and Health (EPDS) was intended to provide data on the total population and on the status of women's and children's health for use in planning and in formulating health and family planning policy. 7412 household interviews and 8647 individual interviews with women aged 15-49 years were completed. This document provides a brief description of the questionnaire, sample design, data processing, and survey results. More detailed works on each topic are expected to follow. After weighting, 74.8% of respondents were urban and 25.2% rural. 3.2% were illiterate, 36.6% had some primary education, 50.2% had secondary educations, and 9.9% had higher educations. Among all respondents and respondents currently in union respectively, 98.2% and 99.7% knew some contraceptive method, 94.1% and 97.9% knew some source of family planning, 57.6% and 86.0% had ever used a method, and 39.9% and 66.1% were currently using a method. Among all respondents and respondents currently in union respectively, 52.2% and 78.9% had ever used a modern method and 33.0% and 54.6% were currently using a modern method. Among women in union, 14.1% currently used pills, 12.4% IUDs, 2.2% injectables, 1.7% vaginal methods, 2.9% condoms, 20.9% female sterilization, 0.5% vasectomy, 11.5% some traditional method, 6.1% periodic abstinence, 4.8% withdrawal, and 0.5% others. Equal proportions of rural and urban women were sterilized. The prevalence of female sterilization declined with education and increased with family size. Modern methods were used by 57.5% of urban and 47.7% of rural women, 44.0% of illiterate women, and 51.8% of women with primary and 57.8% with secondary educations. Among women in union, 10.9% wanted a child soon, 19.7% wanted one eventually, 3.6% were undecided, 42.6% did not want one, 21.4% were sterilized, and 1.2% were infertile. 
Among women giving birth in the past 5 years, the proportion having antitetanus vaccinations increased from 39% in 1986

  19. Large scale purification of RNA nanoparticles by preparative ultracentrifugation.

    PubMed

    Jasinski, Daniel L; Schwartz, Chad T; Haque, Farzin; Guo, Peixuan

    2015-01-01

    Purification of large quantities of supramolecular RNA complexes is of paramount importance due to the large quantities of RNA needed and the purity requirements for in vitro and in vivo assays. Purification is generally carried out by liquid chromatography (HPLC), polyacrylamide gel electrophoresis (PAGE), or agarose gel electrophoresis (AGE). Here, we describe an efficient method for the large-scale purification of RNA prepared by in vitro transcription using T7 RNA polymerase by cesium chloride (CsCl) equilibrium density gradient ultracentrifugation and the large-scale purification of RNA nanoparticles by sucrose gradient rate-zonal ultracentrifugation or cushioned sucrose gradient rate-zonal ultracentrifugation.

  20. The Evolution of Baryons in Cosmic Large Scale Structure

    NASA Astrophysics Data System (ADS)

    Snedden, Ali; Arielle Phillips, Lara; Mathews, Grant James; Coughlin, Jared; Suh, In-Saeng; Bhattacharya, Aparna

    2015-01-01

    The environments of galaxies play a critical role in their formation and evolution. We study these environments using cosmological simulations with star formation and supernova feedback included. From these simulations, we parse the large scale structure into clusters, filaments and voids using a segmentation algorithm adapted from medical imaging. We trace the star formation history, gas phase and metal evolution of the baryons in the intergalactic medium as function of structure. We find that our algorithm reproduces the baryon fraction in the intracluster medium and that the majority of star formation occurs in cold, dense filaments. We present the consequences this large scale environment has for galactic halos and galaxy evolution.

  1. [Issues of large scale tissue culture of medicinal plant].

    PubMed

    Lv, Dong-Mei; Yuan, Yuan; Zhan, Zhi-Lai

    2014-09-01

    In order to increase the yield and quality of medicinal plants and enhance the competitiveness of the medicinal plant industry in our country, this paper analyzes the status, problems and countermeasures of large-scale tissue culture of medicinal plants. Although biotechnology is one of the most efficient and promising means of medicinal plant production, it still faces problems such as the stability of the material, the safety of transgenic medicinal plants and the optimization of culture conditions. Establishing a perfect evaluation system according to the characteristics of the medicinal plant is the key measure to assure the sustainable development of large-scale tissue culture of medicinal plants.

  2. Large-Scale Graph Processing Analysis using Supercomputer Cluster

    NASA Astrophysics Data System (ADS)

    Vildario, Alfrido; Fitriyani; Nugraha Nurkahfi, Galih

    2017-01-01

    Graph processing is widely used in various sectors such as automotive, traffic, image processing and many more. These applications produce graphs of large-scale dimension, so processing requires long computation times and high-specification resources. This research addresses the analysis of large-scale graph processing on a supercomputer cluster. We implemented graph processing using the Breadth-First Search (BFS) algorithm on the single-destination shortest-path problem. The parallel BFS implementation with the Message Passing Interface (MPI) used the supercomputer cluster at the High Performance Computing Laboratory for Computational Science, Telkom University, and the Stanford Large Network Dataset Collection. The results showed that the implementation gives an average speedup of more than 30 times and an efficiency of almost 90%.
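The abstract's parallel MPI code is not reproduced in the record, but the underlying computation can be illustrated with a minimal serial sketch: a BFS shortest-path search on an unweighted graph, plus the standard speedup/efficiency metrics the authors report. The graph and timing numbers below are invented for illustration.

```python
from collections import deque

def bfs_shortest_path(adj, source, dest):
    """Breadth-first search for a shortest path (in hops) from `source`
    to `dest` in an unweighted graph given as an adjacency list
    {node: [neighbors]}.  Returns the path as a list, or None."""
    parent = {source: None}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        if node == dest:
            # Reconstruct the path by walking parent pointers back.
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        for nbr in adj.get(node, []):
            if nbr not in parent:
                parent[nbr] = node
                queue.append(nbr)
    return None  # dest unreachable from source

def speedup_and_efficiency(t_serial, t_parallel, n_procs):
    """Classic parallel-performance metrics: speedup S = T1/Tp,
    efficiency E = S / p."""
    s = t_serial / t_parallel
    return s, s / n_procs

graph = {0: [1, 2], 1: [3], 2: [3], 3: [4]}
print(bfs_shortest_path(graph, 0, 4))            # → [0, 1, 3, 4]
print(speedup_and_efficiency(30.0, 1.0, 32))     # → (30.0, 0.9375)
```

A parallel version would partition the vertex set across MPI ranks and exchange frontier vertices each level; the serial queue-based loop above is the per-rank core of that scheme.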

  3. Corridors Increase Plant Species Richness at Large Scales

    SciTech Connect

    Damschen, Ellen I.; Haddad, Nick M.; Orrock,John L.; Tewksbury, Joshua J.; Levey, Douglas J.

    2006-09-01

    Habitat fragmentation is one of the largest threats to biodiversity. Landscape corridors, which are hypothesized to reduce the negative consequences of fragmentation, have become common features of ecological management plans worldwide. Despite their popularity, there is little evidence documenting the effectiveness of corridors in preserving biodiversity at large scales. Using a large-scale replicated experiment, we showed that habitat patches connected by corridors retain more native plant species than do isolated patches, that this difference increases over time, and that corridors do not promote invasion by exotic species. Our results support the use of corridors in biodiversity conservation.

  4. Clearing and Labeling Techniques for Large-Scale Biological Tissues

    PubMed Central

    Seo, Jinyoung; Choe, Minjin; Kim, Sung-Yon

    2016-01-01

    Clearing and labeling techniques for large-scale biological tissues enable simultaneous extraction of molecular and structural information with minimal disassembly of the sample, facilitating the integration of molecular, cellular and systems biology across different scales. Recent years have witnessed an explosive increase in the number of such methods and their applications, reflecting heightened interest in organ-wide clearing and labeling across many fields of biology and medicine. In this review, we provide an overview and comparison of existing clearing and labeling techniques and discuss challenges and opportunities in the investigations of large-scale biological systems. PMID:27239813

  5. Large-scale search for dark-matter axions

    SciTech Connect

    Kinion, D; van Bibber, K

    2000-08-30

    We review the status of two ongoing large-scale searches for axions which may constitute the dark matter of our Milky Way halo. The experiments are based on the microwave cavity technique proposed by Sikivie and mark a "second generation" relative to the original experiments performed by the Rochester-Brookhaven-Fermilab collaboration and the University of Florida group.

  6. Measurement, Sampling, and Equating Errors in Large-Scale Assessments

    ERIC Educational Resources Information Center

    Wu, Margaret

    2010-01-01

    In large-scale assessments, such as state-wide testing programs, national sample-based assessments, and international comparative studies, there are many steps involved in the measurement and reporting of student achievement. There are always sources of inaccuracies in each of the steps. It is of interest to identify the source and magnitude of…

  7. Resilience of Florida Keys coral communities following large scale disturbances

    EPA Science Inventory

    The decline of coral reefs in the Caribbean over the last 40 years has been attributed to multiple chronic stressors and episodic large-scale disturbances. This study assessed the resilience of coral communities in two different regions of the Florida Keys reef system between 199...

  8. Large-Scale Machine Learning for Classification and Search

    ERIC Educational Resources Information Center

    Liu, Wei

    2012-01-01

    With the rapid development of the Internet, nowadays tremendous amounts of data including images and videos, up to millions or billions, can be collected for training machine learning models. Inspired by this trend, this thesis is dedicated to developing large-scale machine learning techniques for the purpose of making classification and nearest…

  9. Efficient On-Demand Operations in Large-Scale Infrastructures

    ERIC Educational Resources Information Center

    Ko, Steven Y.

    2009-01-01

    In large-scale distributed infrastructures such as clouds, Grids, peer-to-peer systems, and wide-area testbeds, users and administrators typically desire to perform "on-demand operations" that deal with the most up-to-date state of the infrastructure. However, the scale and dynamism present in the operating environment make it challenging to…

  10. Assuring Quality in Large-Scale Online Course Development

    ERIC Educational Resources Information Center

    Parscal, Tina; Riemer, Deborah

    2010-01-01

    Student demand for online education requires colleges and universities to rapidly expand the number of courses and programs offered online while maintaining high quality. This paper outlines two universities respective processes to assure quality in large-scale online programs that integrate instructional design, eBook custom publishing, Quality…

  11. Improving the Utility of Large-Scale Assessments in Canada

    ERIC Educational Resources Information Center

    Rogers, W. Todd

    2014-01-01

    Principals and teachers do not use large-scale assessment results because the lack of distinct and reliable subtests prevents identifying strengths and weaknesses of students and instruction, the results arrive too late to be used, and principals and teachers need assistance to use the results to improve instruction so as to improve student…

  12. Current Scientific Issues in Large Scale Atmospheric Dynamics

    NASA Technical Reports Server (NTRS)

    Miller, T. L. (Compiler)

    1986-01-01

    Topics in large scale atmospheric dynamics are discussed. Aspects of atmospheric blocking, the influence of transient baroclinic eddies on planetary-scale waves, cyclogenesis, the effects of orography on planetary scale flow, small scale frontal structure, and simulations of gravity waves in frontal zones are discussed.

  13. Large-Scale Innovation and Change in UK Higher Education

    ERIC Educational Resources Information Center

    Brown, Stephen

    2013-01-01

    This paper reflects on challenges universities face as they respond to change. It reviews current theories and models of change management, discusses why universities are particularly difficult environments in which to achieve large scale, lasting change and reports on a recent attempt by the UK JISC to enable a range of UK universities to employ…

  14. Mixing Metaphors: Building Infrastructure for Large Scale School Turnaround

    ERIC Educational Resources Information Center

    Peurach, Donald J.; Neumerski, Christine M.

    2015-01-01

    The purpose of this analysis is to increase understanding of the possibilities and challenges of building educational infrastructure--the basic, foundational structures, systems, and resources--to support large-scale school turnaround. Building educational infrastructure often exceeds the capacity of schools, districts, and state education…

  15. Probabilistic Cuing in Large-Scale Environmental Search

    ERIC Educational Resources Information Center

    Smith, Alastair D.; Hood, Bruce M.; Gilchrist, Iain D.

    2010-01-01

    Finding an object in our environment is an important human ability that also represents a critical component of human foraging behavior. One type of information that aids efficient large-scale search is the likelihood of the object being in one location over another. In this study we investigated the conditions under which individuals respond to…

  16. Newton Methods for Large Scale Problems in Machine Learning

    ERIC Educational Resources Information Center

    Hansen, Samantha Leigh

    2014-01-01

    The focus of this thesis is on practical ways of designing optimization algorithms for minimizing large-scale nonlinear functions with applications in machine learning. Chapter 1 introduces the overarching ideas in the thesis. Chapters 2 and 3 are geared towards supervised machine learning applications that involve minimizing a sum of loss…
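The thesis itself is not reproduced in this record, but the core idea of a Newton step for minimizing a loss can be sketched in a few lines. This is a toy one-dimensional least-squares problem invented for illustration, not code from the thesis; for a quadratic loss, a single Newton step lands on the minimizer.

```python
def newton_minimize_1d(grad, hess, w0, iters=10):
    """Scalar Newton's method for minimization: w <- w - f'(w) / f''(w)."""
    w = w0
    for _ in range(iters):
        w -= grad(w) / hess(w)
    return w

# Least-squares loss f(w) = (1/n) * sum((w*x - y)^2) for a toy dataset
# that is exactly fit at w = 2.
xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]
n = len(xs)
grad = lambda w: (2.0 / n) * sum((w * x - y) * x for x, y in zip(xs, ys))
hess = lambda w: (2.0 / n) * sum(x * x for x in xs)

print(newton_minimize_1d(grad, hess, w0=0.0, iters=1))  # → 2.0
```

Large-scale variants (e.g., limited-memory or Hessian-free methods) replace the exact second derivative with cheap curvature approximations, since forming the full Hessian is infeasible for models with many parameters.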

  17. Large-Scale Physical Separation of Depleted Uranium from Soil

    DTIC Science & Technology

    2012-09-01

    ERDC/EL TR-12-25, Army Range Technology Program: Large-Scale Physical Separation of Depleted Uranium from Soil.

  18. Lessons from Large-Scale Renewable Energy Integration Studies: Preprint

    SciTech Connect

    Bird, L.; Milligan, M.

    2012-06-01

    In general, large-scale integration studies in Europe and the United States find that high penetrations of renewable generation are technically feasible with operational changes and increased access to transmission. This paper describes other key findings such as the need for fast markets, large balancing areas, system flexibility, and the use of advanced forecasting.

  19. Computational Complexity, Efficiency and Accountability in Large Scale Teleprocessing Systems.

    DTIC Science & Technology

    1980-12-01

    COMPLEXITY, EFFICIENCY AND ACCOUNTABILITY IN LARGE SCALE TELEPROCESSING SYSTEMS. DAAG29-78-C-0036, Stanford University, John T. Gill, Martin E. Hellman. ...solve but easy to check. We have also suggested how such random tapes can be simulated by deterministically generating "pseudorandom" numbers by a

  20. Large-scale silicon optical switches for optical interconnection

    NASA Astrophysics Data System (ADS)

    Qiao, Lei; Tang, Weijie; Chu, Tao

    2016-11-01

    Large-scale optical switches are in great demand for building optical interconnections in data centers and high-performance computers (HPCs). Silicon optical switches have the advantages of being compact and CMOS-process compatible, and can easily be monolithically integrated. However, there are difficulties in constructing silicon optical switches with large port counts. One of them is the non-uniformity of the switch units in large-scale silicon optical switches, which arises from fabrication error and causes confusion in finding the units' optimum operating points. In this paper, we propose a method to detect the optimum operating point in a large-scale switch with a limited number of built-in power monitors. We also propose methods for improving the unbalanced crosstalk of the cross/bar states in silicon electro-optical MZI switches and for reducing insertion losses. Our recent progress in large-scale silicon optical switches, including 64 × 64 thermal-optical and 32 × 32 electro-optical switches, will be introduced. To the best of our knowledge, both are the largest-scale silicon optical switches of their respective types. The switches were fabricated on 340-nm SOI substrates with CMOS 180-nm processes. The crosstalk of the 32 × 32 electro-optic switch was -19.2 dB to -25.1 dB, while that of the 64 × 64 thermal-optic switch was -30 dB to -48.3 dB.
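As a quick aid to interpreting the reported figures, crosstalk quoted in dB converts to a linear power ratio via 10^(dB/10). A minimal sketch, using the bounds stated in the abstract:

```python
def db_to_linear(db):
    """Convert a power ratio expressed in decibels to a linear ratio."""
    return 10 ** (db / 10)

# Crosstalk bounds reported in the abstract, converted to linear leakage
# fractions (e.g., -30 dB means 0.1% of the power leaks to the wrong port).
for label, db in [("32x32 electro-optic, worst", -19.2),
                  ("32x32 electro-optic, best", -25.1),
                  ("64x64 thermal-optic, worst", -30.0),
                  ("64x64 thermal-optic, best", -48.3)]:
    print(f"{label}: {db_to_linear(db):.2e}")
```

The conversion makes the scale of the difference concrete: -19.2 dB corresponds to roughly 1.2% leaked power, while -48.3 dB is about 15 parts per million.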

  1. The large scale microwave background anisotropy in decaying particle cosmology

    SciTech Connect

    Panek, M.

    1987-06-01

    We investigate the large-scale anisotropy of the microwave background radiation in cosmological models with decaying particles. The observed value of the quadrupole moment combined with other constraints gives an upper limit on the redshift of the decay, z_d < 3-5. 12 refs., 2 figs.

  2. The Large-Scale Structure of Scientific Method

    ERIC Educational Resources Information Center

    Kosso, Peter

    2009-01-01

    The standard textbook description of the nature of science describes the proposal, testing, and acceptance of a theoretical idea almost entirely in isolation from other theories. The resulting model of science is a kind of piecemeal empiricism that misses the important network structure of scientific knowledge. Only the large-scale description of…

  3. Ecosystem resilience despite large-scale altered hydro climatic conditions

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Climate change is predicted to increase both drought frequency and duration, and when coupled with substantial warming, will establish a new hydroclimatological paradigm for many regions. Large-scale, warm droughts have recently impacted North America, Africa, Europe, Amazonia, and Australia result...

  4. US National Large-scale City Orthoimage Standard Initiative

    USGS Publications Warehouse

    Zhou, G.; Song, C.; Benjamin, S.; Schickler, W.

    2003-01-01

    The early procedures and algorithms for national digital orthophoto generation in the National Digital Orthophoto Program (NDOP) were based on earlier USGS mapping operations, such as field control, aerotriangulation (derived in the early 1920's), the quarter-quadrangle-centered format (3.75 minutes of longitude and latitude in geographic extent), 1:40,000 aerial photographs, and 2.5D digital elevation models. However, large-scale city orthophotos produced with the early procedures have disclosed many shortcomings, e.g., ghost images, occlusions, and shadows. Thus, providing the technical base (algorithms, procedures) and the experience needed for large-scale city digital orthophoto creation is essential for the near-future national deployment of large-scale digital orthophotos and the revision of the Standards for National Large-scale City Digital Orthophoto in the NDOP. This paper reports our initial research results as follows: (1) high-precision 3D city DSM generation through LIDAR data processing, (2) spatial object/feature extraction through surface material information and high-accuracy 3D DSM data, (3) 3D city model development, (4) algorithm development for generation of DTM-based and DBM-based orthophotos, (5) true orthophoto generation by merging DBM-based and DTM-based orthophotos, and (6) automatic mosaicking by optimizing and combining imagery from many perspectives.

  5. Developing and Understanding Methods for Large-Scale Nonlinear Optimization

    DTIC Science & Technology

    2006-07-24

    algorithms for large-scale unconstrained and constrained optimization problems, including limited-memory methods for problems with many thousands... Published in peer-reviewed journals: E. Eskow, B. Bader, R. Byrd, S. Crivelli, T. Head-Gordon, V. Lamberti and R. Schnabel, "An optimization approach to the

  6. International Large-Scale Assessments: What Uses, What Consequences?

    ERIC Educational Resources Information Center

    Johansson, Stefan

    2016-01-01

    Background: International large-scale assessments (ILSAs) are a much-debated phenomenon in education. Increasingly, their outcomes attract considerable media attention and influence educational policies in many jurisdictions worldwide. The relevance, uses and consequences of these assessments are often the focus of research scrutiny. Whilst some…

  7. Extracting Useful Semantic Information from Large Scale Corpora of Text

    ERIC Educational Resources Information Center

    Mendoza, Ray Padilla, Jr.

    2012-01-01

    Extracting and representing semantic information from large scale corpora is at the crux of computer-assisted knowledge generation. Semantic information depends on collocation extraction methods, mathematical models used to represent distributional information, and weighting functions which transform the space. This dissertation provides a…

  8. Large scale structure of the sun's radio corona

    NASA Technical Reports Server (NTRS)

    Kundu, M. R.

    1986-01-01

    Results of studies of large scale structures of the corona at long radio wavelengths are presented, using data obtained with the multifrequency radioheliograph of the Clark Lake Radio Observatory. It is shown that features corresponding to coronal streamers and coronal holes are readily apparent in the Clark Lake maps.

  9. Moon-based Earth Observation for Large Scale Geoscience Phenomena

    NASA Astrophysics Data System (ADS)

    Guo, Huadong; Liu, Guang; Ding, Yixing

    2016-07-01

    The capability of Earth observation for large, global-scale natural phenomena needs to be improved, and new observing platforms are needed. We have studied the concept of the Moon as an Earth-observation platform in recent years. Compared with man-made satellite platforms, Moon-based Earth observation can obtain multi-spherical, full-band, active and passive information, which offers the following advantages: a large observation range, variable view angles, long-term continuous observation, and an extra-long life cycle, with the characteristics of longevity, consistency, integrity, stability and uniqueness. Moon-based Earth observation is suitable for monitoring large-scale geoscience phenomena, including large-scale atmospheric change, large-scale ocean change, large-scale land-surface dynamic change, solid-Earth dynamic change, etc. For the purpose of establishing a Moon-based Earth observation platform, we plan to study the following five aspects: mechanisms and models of the macroscopic Earth-science phenomena observable from the Moon; optimization of sensor parameters and methods for Moon-based Earth observation; site selection and the environment of Moon-based Earth observation; the Moon-based Earth observation platform itself; and a fundamental scientific framework for Moon-based Earth observation.

  10. Large-scale screening by the automated Wassermann reaction

    PubMed Central

    Wagstaff, W.; Firth, R.; Booth, J. R.; Bowley, C. C.

    1969-01-01

    In view of the drawbacks in the use of the Kahn test for large-scale screening of blood donors, mainly those of human error through work overload and fatiguability, an attempt was made to adapt an existing automated complement-fixation technique for this purpose. This paper reports the successful results of that adaptation. PMID:5776559

  11. Large-scale societal changes and intentionality - an uneasy marriage.

    PubMed

    Bodor, Péter; Fokas, Nikos

    2014-08-01

    Our commentary focuses on juxtaposing the proposed science of intentional change with facts and concepts pertaining to the level of large populations or changes on a worldwide scale. Although we find a unified evolutionary theory promising, we think that long-term and large-scale, scientifically guided - that is, intentional - social change is not only impossible, but also undesirable.

  12. A review of national health surveys in India

    PubMed Central

    Pandey, Anamika; Dandona, Lalit

    2016-01-01

    Several rounds of national health surveys have generated a vast amount of data in India since 1992. We describe and compare the key health information gathered, assess the availability of health data in the public domain, and review publications resulting from the National Family Health Survey (NFHS), the District Level Household Survey (DLHS) and the Annual Health Survey (AHS). We highlight issues that need attention to improve the usefulness of the surveys in monitoring changing trends in India’s disease burden: (i) inadequate coverage of noncommunicable diseases, injuries and some major communicable diseases; (ii) modest comparability between surveys on the key themes of child and maternal mortality and immunization to understand trends over time; (iii) short time intervals between the most recent survey rounds; and (iv) delays in making individual-level data available for analysis in the public domain. We identified 337 publications using NFHS data; in contrast, only 48 and three publications used data from the DLHS and the AHS, respectively. As national surveys are resource-intensive, it would be prudent to maximize their benefits. We suggest that India plan for a single major national health survey at five-year intervals in consultation with key stakeholders. This could cover additional major causes of the disease burden and their risk factors, as well as causes of death and adult mortality rate estimation. If done in a standardized manner, such a survey would provide useable and timely data to inform health interventions and facilitate assessment of their impact on population health. PMID:27034522

  13. Effects of large-scale environment on the assembly history of central galaxies

    SciTech Connect

    Jung, Intae; Lee, Jaehyun; Yi, Sukyoung K.

    2014-10-10

    We examine whether large-scale environment affects the mass assembly history of central galaxies. To facilitate this, we constructed dark matter halo merger trees from a cosmological N-body simulation and calculated the formation and evolution of galaxies using a semi-analytic method. We confirm earlier results that smaller halos show a notable difference in formation time with a mild dependence on large-scale environment. However, using a semi-analytic model, we found that on average the growth rate of the stellar mass of central galaxies is largely insensitive to large-scale environment. Although our results show that the star formation rate (SFR) and the stellar mass of central galaxies in smaller halos are slightly affected by the assembly bias of halos, those galaxies are faint and the difference in the SFR is minute, therefore it is challenging to detect it in real galaxies given the current observational accuracy. Future galaxy surveys, such as the BigBOSS experiment and the Large Synoptic Survey Telescope, which are expected to provide observational data for fainter objects, will provide a chance to test our model predictions.

  14. ADHD and Health Services Utilization in the National Health Interview Survey

    ERIC Educational Resources Information Center

    Cuffe, Steven P.; Moore, Charity G.; McKeown, Robert

    2009-01-01

    Objective: Describe the general health, comorbidities and health service use among U.S. children with ADHD. Method: The 2001 National Health Interview Survey (NHIS) contained the Strengths and Difficulties Questionnaire (SDQ; used to determine probable ADHD), data on medical problems, overall health, and health care utilization. Results: Asthma…

  15. Resurrecting hot dark matter - Large-scale structure from cosmic strings and massive neutrinos

    NASA Technical Reports Server (NTRS)

    Scherrer, Robert J.

    1988-01-01

    These are the results of a numerical simulation of the formation of large-scale structure from cosmic-string loops in a universe dominated by massive neutrinos (hot dark matter). This model has several desirable features. The final matter distribution contains isolated density peaks embedded in a smooth background, producing a natural bias in the distribution of luminous matter. Because baryons can accrete onto the cosmic strings before the neutrinos, the galaxies will have baryon cores and dark neutrino halos. Galaxy formation in this model begins much earlier than in random-phase models. On large scales the distribution of clustered matter visually resembles the CfA survey, with large voids and filaments.

  16. Challenges and Innovations in Surveying the Governmental Public Health Workforce.

    PubMed

    Leider, Jonathon P; Shah, Gulzar; Rider, Nikki; Beck, Angela; Castrucci, Brian C; Harris, Jenine K; Sellers, Katie; Varda, Danielle; Ye, Jiali; Erwin, Paul C; Brownson, Ross C

    2016-11-01

    Surveying governmental public health practitioners is a critical means of collecting data about public health organizations, their staff, and their partners. A greater focus on evidence-based practices, practice-based systems research, and evaluation has resulted in practitioners consistently receiving requests to participate in myriad surveys. This can result in a substantial survey burden for practitioners and declining response rates for researchers. This is potentially damaging to practitioners and researchers as well as the field of public health more broadly. We have examined recent developments in survey research, especially issues highly relevant for public health practice. We have also proposed a process by which researchers can engage with practitioners and practitioner groups on research questions of mutual interest.

  17. Challenges and Innovations in Surveying the Governmental Public Health Workforce

    PubMed Central

    Shah, Gulzar; Rider, Nikki; Beck, Angela; Castrucci, Brian C.; Harris, Jenine K.; Sellers, Katie; Varda, Danielle; Ye, Jiali; Erwin, Paul C.; Brownson, Ross C.

    2016-01-01

    Surveying governmental public health practitioners is a critical means of collecting data about public health organizations, their staff, and their partners. A greater focus on evidence-based practices, practice-based systems research, and evaluation has resulted in practitioners consistently receiving requests to participate in myriad surveys. This can result in a substantial survey burden for practitioners and declining response rates for researchers. This is potentially damaging to practitioners and researchers as well as the field of public health more broadly. We have examined recent developments in survey research, especially issues highly relevant for public health practice. We have also proposed a process by which researchers can engage with practitioners and practitioner groups on research questions of mutual interest. PMID:27715307

  18. The Large Scale Synthesis of Aligned Plate Nanostructures

    NASA Astrophysics Data System (ADS)

    Zhou, Yang; Nash, Philip; Liu, Tian; Zhao, Naiqin; Zhu, Shengli

    2016-07-01

    We propose a novel technique for the large-scale synthesis of aligned-plate nanostructures that are self-assembled and self-supporting. The synthesis technique involves developing nanoscale two-phase microstructures through discontinuous precipitation followed by selective etching to remove one of the phases. The method may be applied to any alloy system in which the discontinuous precipitation transformation goes to completion. The resulting structure may have many applications in catalysis, filtering and thermal management depending on the phase selection and added functionality through chemical reaction with the retained phase. The synthesis technique is demonstrated using the discontinuous precipitation of a γ′ phase, (Ni, Co)3Al, followed by selective dissolution of the γ matrix phase. The production of the nanostructure requires heat treatments on the order of minutes and can be performed on a large scale making this synthesis technique of great economic potential.

  19. Large-scale linear nonparallel support vector machine solver.

    PubMed

    Tian, Yingjie; Ping, Yuan

    2014-02-01

    Twin support vector machines (TWSVMs), the representative nonparallel-hyperplane classifiers, have shown advantages over standard SVMs in some respects. However, they still have serious defects restricting their further study and real applications: (1) they must compute and store inverse matrices before training, which is intractable for many applications where the data have a huge number of instances as well as features; (2) TWSVMs lose sparseness by using a quadratic loss function to keep the proximal hyperplane close to the class itself. This paper proposes a sparse linear nonparallel support vector machine, termed L1-NPSVM, to deal with large-scale data, based on an efficient solver, the dual coordinate descent (DCD) method. Both theoretical analysis and experiments indicate that our method is not only suitable for large-scale problems, but also performs as well as TWSVMs and SVMs.
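The DCD solver the paper builds on can be shown in miniature. Below is a minimal sketch of dual coordinate descent for a plain linear hinge-loss SVM (the classical DCD update, not the L1-NPSVM formulation itself); the data, C, and epoch count are illustrative.

```python
import numpy as np

def dcd_linear_svm(X, y, C=1.0, epochs=50):
    """Dual coordinate descent for a linear L1-loss (hinge) SVM.

    Minimizes 0.5*||w||^2 + C * sum_i max(0, 1 - y_i * w.x_i) via its
    dual, updating one dual variable alpha_i at a time.
    """
    n, d = X.shape
    alpha = np.zeros(n)
    w = np.zeros(d)
    q = (X * X).sum(axis=1)  # diagonal of the Gram matrix, x_i . x_i
    for _ in range(epochs):
        for i in range(n):
            if q[i] == 0.0:
                continue
            g = y[i] * w.dot(X[i]) - 1.0                   # dual gradient
            a_new = min(max(alpha[i] - g / q[i], 0.0), C)  # project onto [0, C]
            w += (a_new - alpha[i]) * y[i] * X[i]          # keep w = sum alpha_i y_i x_i
            alpha[i] = a_new
    return w

# toy linearly separable data
X = np.array([[2.0, 1.0], [1.5, 2.0], [-1.0, -1.5], [-2.0, -0.5]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w = dcd_linear_svm(X, y)
```

    The property that makes DCD scale is that w is maintained incrementally, so each coordinate update costs O(d) regardless of the number of instances, and no matrix inversion is needed.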

  20. Prototype Vector Machine for Large Scale Semi-Supervised Learning

    SciTech Connect

    Zhang, Kai; Kwok, James T.; Parvin, Bahram

    2009-04-29

    Practical data mining rarely falls exactly into the supervised learning scenario. Rather, the growing amount of unlabeled data poses a big challenge to large-scale semi-supervised learning (SSL). We note that the computational intensiveness of graph-based SSL arises largely from the manifold or graph regularization, which in turn leads to large models that are difficult to handle. To alleviate this, we propose the prototype vector machine (PVM), a highly scalable, graph-based algorithm for large-scale SSL. Our key innovation is the use of "prototype vectors" for efficient approximation of both the graph-based regularizer and the model representation. The choice of prototypes is grounded upon two important criteria: they not only perform an effective low-rank approximation of the kernel matrix, but also span a model suffering the minimum information loss compared with the complete model. We demonstrate encouraging performance and appealing scaling properties of the PVM on a number of machine learning benchmark data sets.
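The low-rank kernel approximation that prototypes enable can be sketched with a Nyström-style construction (an illustrative assumption; the PVM's actual prototype selection and model fitting are more involved):

```python
import numpy as np

def rbf(A, B, gamma=0.1):
    """Pairwise RBF kernel between the rows of A and the rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                    # data pool (mostly unlabeled)
P = X[rng.choice(200, size=20, replace=False)]   # 20 "prototype vectors"

K_nm = rbf(X, P)                                 # n x m cross-kernel
K_mm = rbf(P, P)                                 # m x m prototype kernel
K_approx = K_nm @ np.linalg.pinv(K_mm) @ K_nm.T  # low-rank approximation of K

K_full = rbf(X, X)
rel_err = np.linalg.norm(K_full - K_approx) / np.linalg.norm(K_full)
```

    Storing K_nm and K_mm costs O(nm + m²) rather than the O(n²) of the full kernel matrix, which is what makes graph-based regularization tractable at scale.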

  1. Electron drift in a large scale solid xenon

    SciTech Connect

    Yoo, J.; Jaskierny, W. F.

    2015-08-21

    A study of charge drift in large-scale, optically transparent solid xenon is reported. A pulsed high-power xenon light source is used to liberate electrons from a photocathode. The drift speeds of the electrons are measured using an 8.7 cm long electrode in both the liquid and solid phases of xenon. In the liquid phase (163 K) the drift speed is 0.193 ± 0.003 cm/μs, while in the solid phase (157 K) it is 0.397 ± 0.006 cm/μs, at 900 V/cm over 8.0 cm of uniform electric field. Furthermore, the electron drift speed in large-scale solid xenon is demonstrated to be a factor of two faster than that in the liquid.
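The quoted numbers can be checked directly; a small sketch of the implied arithmetic (transit times over the 8.0 cm uniform-field drift region, and the solid-to-liquid speed ratio):

```python
drift_length_cm = 8.0    # uniform-field drift region quoted in the abstract
v_liquid = 0.193         # cm/us in liquid xenon at 163 K
v_solid = 0.397          # cm/us in solid xenon at 157 K

t_liquid_us = drift_length_cm / v_liquid   # ~41.5 us transit time in liquid
t_solid_us = drift_length_cm / v_solid     # ~20.2 us transit time in solid
ratio = v_solid / v_liquid                 # ~2.06, the quoted "factor two"
```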

  2. Instrumentation Development for Large Scale Hypersonic Inflatable Aerodynamic Decelerator Characterization

    NASA Technical Reports Server (NTRS)

    Swanson, Gregory T.; Cassell, Alan M.

    2011-01-01

    Hypersonic Inflatable Aerodynamic Decelerator (HIAD) technology is currently being considered for multiple atmospheric entry applications as the limitations of traditional entry vehicles have been reached. The Inflatable Re-entry Vehicle Experiment (IRVE) has successfully demonstrated this technology as a viable candidate with a 3.0 m diameter vehicle sub-orbital flight. To further this technology, large-scale HIADs (6.0-8.5 m) must be developed and tested. To characterize the performance of large-scale HIAD technology, new instrumentation concepts must be developed to accommodate the flexible nature of the inflatable aeroshell. Many of the concepts under consideration for the HIAD FY12 subsonic wind tunnel test series are discussed below.

  3. The workshop on iterative methods for large scale nonlinear problems

    SciTech Connect

    Walker, H.F.; Pernice, M.

    1995-12-01

    The aim of the workshop was to bring together researchers working on large scale applications with numerical specialists of various kinds. Applications that were addressed included reactive flows (combustion and other chemically reacting flows, tokamak modeling), porous media flows, cardiac modeling, chemical vapor deposition, image restoration, macromolecular modeling, and population dynamics. Numerical areas included Newton iterative (truncated Newton) methods, Krylov subspace methods, domain decomposition and other preconditioning methods, large scale optimization and optimal control, and parallel implementations and software. This report offers a brief summary of workshop activities and information about the participants. Interested readers are encouraged to look into the online proceedings available at http://www.usi.utah.edu/logan.proceedings, where the material offered here is augmented with hypertext abstracts that include links to locations such as speakers' home pages, PostScript copies of talks and papers, cross-references to related talks, and other information about topics addressed at the workshop.

  4. GAIA: A WINDOW TO LARGE-SCALE MOTIONS

    SciTech Connect

    Nusser, Adi; Branchini, Enzo; Davis, Marc E-mail: branchin@fis.uniroma3.it

    2012-08-10

    Using redshifts as a proxy for galaxy distances, estimates of the two-dimensional (2D) transverse peculiar velocities of distant galaxies could be obtained from future measurements of proper motions. We provide the mathematical framework for analyzing 2D transverse motions and show that they offer several advantages over traditional probes of large-scale motions. They are completely independent of any intrinsic relations between galaxy properties; hence, they are essentially free of selection biases. They are free from homogeneous and inhomogeneous Malmquist biases that typically plague distance indicator catalogs. They provide additional information to traditional probes that yield line-of-sight peculiar velocities only. Further, because of their 2D nature, fundamental questions regarding vorticity of large-scale flows can be addressed. Gaia, for example, is expected to provide proper motions of at least bright galaxies with high central surface brightness, making proper motions a likely contender for traditional probes based on current and future distance indicator measurements.

  5. The Large Scale Synthesis of Aligned Plate Nanostructures

    PubMed Central

    Zhou, Yang; Nash, Philip; Liu, Tian; Zhao, Naiqin; Zhu, Shengli

    2016-01-01

    We propose a novel technique for the large-scale synthesis of aligned-plate nanostructures that are self-assembled and self-supporting. The synthesis technique involves developing nanoscale two-phase microstructures through discontinuous precipitation followed by selective etching to remove one of the phases. The method may be applied to any alloy system in which the discontinuous precipitation transformation goes to completion. The resulting structure may have many applications in catalysis, filtering and thermal management depending on the phase selection and added functionality through chemical reaction with the retained phase. The synthesis technique is demonstrated using the discontinuous precipitation of a γ′ phase, (Ni, Co)3Al, followed by selective dissolution of the γ matrix phase. The production of the nanostructure requires heat treatments on the order of minutes and can be performed on a large scale making this synthesis technique of great economic potential. PMID:27439672

  6. Long gradient mode and large-scale structure observables

    NASA Astrophysics Data System (ADS)

    Allahyari, Alireza; Firouzjaee, Javad T.

    2017-03-01

    We extend the study of long-mode perturbations to other large-scale observables such as cosmic rulers, galaxy-number counts, and halo bias. The long mode is a pure gradient mode that is still outside an observer's horizon. We insist that gradient-mode effects on observables vanish. It is also crucial that the expressions for observables are relativistic. This allows us to show that the effects of a gradient mode on the large-scale observables vanish identically in a relativistic framework. To study the potential modulation effect of the gradient mode on halo bias, we derive a consistency condition to the first order in gradient expansion. We find that the matter variance at a fixed physical scale is not modulated by the long gradient mode perturbations when the consistency condition holds. This shows that the contribution of long gradient modes to bias vanishes in this framework.

  7. Lagrangian space consistency relation for large scale structure

    SciTech Connect

    Horn, Bart; Hui, Lam; Xiao, Xiao E-mail: lh399@columbia.edu

    2015-09-01

    Consistency relations, which relate the squeezed limit of an (N+1)-point correlation function to an N-point function, are non-perturbative symmetry statements that hold even if the associated high momentum modes are deep in the nonlinear regime and astrophysically complex. Recently, Kehagias and Riotto and Peloso and Pietroni discovered a consistency relation applicable to large scale structure. We show that this can be recast into a simple physical statement in Lagrangian space: that the squeezed correlation function (suitably normalized) vanishes. This holds regardless of whether the correlation observables are at the same time or not, and regardless of whether multiple-streaming is present. The simplicity of this statement suggests that an analytic understanding of large scale structure in the nonlinear regime may be particularly promising in Lagrangian space.

  8. Large Scale Deformation of the Western US Cordillera

    NASA Technical Reports Server (NTRS)

    Bennett, Richard A.

    2001-01-01

    Destructive earthquakes occur throughout the western US Cordillera (WUSC), not just within the San Andreas fault zone. But because we do not understand the present-day large-scale deformations of the crust throughout the WUSC, our ability to assess the potential for seismic hazards in this region remains severely limited. To address this problem, we are using a large collection of Global Positioning System (GPS) networks which spans the WUSC to precisely quantify present-day large-scale crustal deformations in a single uniform reference frame. Our work can roughly be divided into an analysis of the GPS observations to infer the deformation field across and within the entire plate boundary zone and an investigation of the implications of this deformation field regarding plate boundary dynamics.

  9. In the fast lane: large-scale bacterial genome engineering.

    PubMed

    Fehér, Tamás; Burland, Valerie; Pósfai, György

    2012-07-31

    The last few years have witnessed rapid progress in bacterial genome engineering. The long-established, standard ways of DNA synthesis, modification, transfer into living cells, and incorporation into genomes have given way to more effective, large-scale, robust genome modification protocols. Expansion of these engineering capabilities is due to several factors. Key advances include: (i) progress in oligonucleotide synthesis and in vitro and in vivo assembly methods, (ii) optimization of recombineering techniques, (iii) introduction of parallel, large-scale, combinatorial, and automated genome modification procedures, and (iv) rapid identification of the modifications by barcode-based analysis and sequencing. Combination of the brute force of these techniques with sophisticated bioinformatic design and modeling opens up new avenues not only for the analysis of gene functions and cellular network interactions, but also for engineering more effective producer strains. This review presents a summary of recent technological advances in bacterial genome engineering.

  10. Electron drift in a large scale solid xenon

    DOE PAGES

    Yoo, J.; Jaskierny, W. F.

    2015-08-21

    A study of charge drift in large-scale, optically transparent solid xenon is reported. A pulsed high-power xenon light source is used to liberate electrons from a photocathode. The drift speeds of the electrons are measured using an 8.7 cm long electrode in both the liquid and solid phases of xenon. In the liquid phase (163 K) the drift speed is 0.193 ± 0.003 cm/μs, while in the solid phase (157 K) it is 0.397 ± 0.006 cm/μs, at 900 V/cm over 8.0 cm of uniform electric field. Furthermore, the electron drift speed in large-scale solid xenon is demonstrated to be a factor of two faster than that in the liquid.

  11. LARGE-SCALE MOTIONS IN THE PERSEUS GALAXY CLUSTER

    SciTech Connect

    Simionescu, A.; Werner, N.; Urban, O.; Allen, S. W.; Fabian, A. C.; Sanders, J. S.; Mantz, A.; Nulsen, P. E. J.; Takei, Y.

    2012-10-01

    By combining large-scale mosaics of ROSAT PSPC, XMM-Newton, and Suzaku X-ray observations, we present evidence for large-scale motions in the intracluster medium of the nearby, X-ray bright Perseus Cluster. These motions are suggested by several alternating and interleaved X-ray bright, low-temperature, low-entropy arcs located along the east-west axis, at radii ranging from ~10 kpc to over a Mpc. Thermodynamic features qualitatively similar to these have previously been observed in the centers of cool-core clusters, and were successfully modeled as a consequence of the gas sloshing/swirling motions induced by minor mergers. Our observations indicate that such sloshing/swirling can extend out to larger radii than previously thought, on scales approaching the virial radius.

  12. The CLASSgal code for relativistic cosmological large scale structure

    SciTech Connect

    Dio, Enea Di; Montanari, Francesco; Durrer, Ruth; Lesgourgues, Julien E-mail: Francesco.Montanari@unige.ch E-mail: Ruth.Durrer@unige.ch

    2013-11-01

    We present accurate and efficient computations of large scale structure observables, obtained with a modified version of the CLASS code which is made publicly available. This code includes all relativistic corrections and computes both the power spectrum C_ℓ(z_1, z_2) and the corresponding correlation function ξ(θ, z_1, z_2) of the matter density and the galaxy number fluctuations in linear perturbation theory. For Gaussian initial perturbations, these quantities contain the full information encoded in the large scale matter distribution at the level of linear perturbation theory. We illustrate the usefulness of our code for cosmological parameter estimation through a few simple examples.
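The two outputs are related by the standard Legendre expansion, ξ(θ, z_1, z_2) = Σ_ℓ (2ℓ+1)/(4π) C_ℓ(z_1, z_2) P_ℓ(cos θ). A minimal sketch of that transform in plain NumPy (not the CLASSgal code itself; the toy spectrum is illustrative):

```python
import numpy as np

def xi_from_cl(theta_rad, cl):
    """Angular correlation function from an angular power spectrum:
    xi(theta) = sum_l (2l+1)/(4*pi) * C_l * P_l(cos(theta))."""
    ell = np.arange(len(cl))
    coeffs = (2 * ell + 1) / (4 * np.pi) * np.asarray(cl)
    # legval evaluates the Legendre series sum_l coeffs[l] * P_l(x)
    return np.polynomial.legendre.legval(np.cos(theta_rad), coeffs)

# toy check: a pure monopole C_0 = 4*pi gives xi(theta) = 1 at every angle
cl_mono = np.zeros(4)
cl_mono[0] = 4 * np.pi
```

    A second sanity check on the normalization: a pure dipole C_1 = 4π/3 makes the same routine return cos θ.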

  13. A Cloud Computing Platform for Large-Scale Forensic Computing

    NASA Astrophysics Data System (ADS)

    Roussev, Vassil; Wang, Liqiang; Richard, Golden; Marziale, Lodovico

    The timely processing of massive digital forensic collections demands the use of large-scale distributed computing resources and the flexibility to customize the processing performed on the collections. This paper describes MPI MapReduce (MMR), an open implementation of the MapReduce processing model that outperforms traditional forensic computing techniques. MMR provides linear scaling for CPU-intensive processing and super-linear scaling for indexing-related workloads.
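The MapReduce model that MMR implements can be shown in miniature (plain Python, not the MPI implementation; the keyword-search workload is a hypothetical stand-in for a forensic indexing job):

```python
from collections import Counter
from functools import reduce

# evidence "chunks" and search terms are illustrative
chunks = [
    "invoice transfer invoice",
    "transfer account",
    "invoice account account",
]
keywords = {"invoice", "transfer"}

def map_chunk(chunk):
    # map phase: each chunk independently emits per-keyword counts
    return Counter(w for w in chunk.split() if w in keywords)

def merge_counts(a, b):
    # reduce phase: combine per-chunk counts by key
    return a + b

hits = reduce(merge_counts, map(map_chunk, chunks), Counter())
# hits counts 3 occurrences of "invoice" and 2 of "transfer"
```

    The map phase is embarrassingly parallel across chunks, which is the source of the linear scaling claimed for CPU-intensive processing.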

  14. Large-Scale Weather Disturbances in Mars’ Southern Extratropics

    NASA Astrophysics Data System (ADS)

    Hollingsworth, Jeffery L.; Kahre, Melinda A.

    2015-11-01

    Between late autumn and early spring, Mars’ middle and high latitudes within its atmosphere support strong mean thermal gradients between the tropics and poles. Observations from both the Mars Global Surveyor (MGS) and Mars Reconnaissance Orbiter (MRO) indicate that this strong baroclinicity supports intense, large-scale eastward traveling weather systems (i.e., transient synoptic-period waves). These extratropical weather disturbances are key components of the global circulation. Such wave-like disturbances act as agents in the transport of heat and momentum, and generalized scalar/tracer quantities (e.g., atmospheric dust, water-vapor and ice clouds). The character of large-scale, traveling extratropical synoptic-period disturbances in Mars' southern hemisphere during late winter through early spring is investigated using a moderately high-resolution Mars global climate model (Mars GCM). This Mars GCM imposes interactively lifted and radiatively active dust based on a threshold value of the surface stress. The model exhibits a reasonable "dust cycle" (i.e., globally averaged, a dustier atmosphere during southern spring and summer occurs). Compared to their northern-hemisphere counterparts, southern synoptic-period weather disturbances and accompanying frontal waves have smaller meridional and zonal scales, and are far less intense. Influences of the zonally asymmetric (i.e., east-west varying) topography on southern large-scale weather are examined. Simulations that adapt Mars’ full topography compared to simulations that utilize synthetic topographies emulating key large-scale features of the southern middle latitudes indicate that Mars’ transient barotropic/baroclinic eddies are highly influenced by the great impact basins of this hemisphere (e.g., Argyre and Hellas). The occurrence of a southern storm zone in late winter and early spring appears to be anchored to the western hemisphere via orographic influences from the Tharsis highlands, and the Argyre

  15. A Holistic Management Architecture for Large-Scale Adaptive Networks

    DTIC Science & Technology

    2007-09-01

    [Front matter only; no abstract recovered.] "A Holistic Management Architecture for Large-Scale Adaptive Networks," master's thesis by Michael R. Clement, Naval Postgraduate School, September 2007; thesis advisor Alex Bordetsky.

  16. Large-Scale Optimization for Bayesian Inference in Complex Systems

    SciTech Connect

    Willcox, Karen; Marzouk, Youssef

    2013-11-12

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT-Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their

  17. Large-scale detection of recombination in nucleotide sequences

    NASA Astrophysics Data System (ADS)

    Chan, Cheong Xin; Beiko, Robert G.; Ragan, Mark A.

    2008-01-01

    Genetic recombination following a genetic transfer event can produce heterogeneous phylogenetic histories within sets of genes that share a common ancestral origin. Delineating recombination events will enhance our understanding of genome evolution. However, the task of detecting recombination is not trivial, because more recent evolutionary changes can obscure such events from detection. In this paper, we demonstrate the use of a two-phase strategy for detecting recombination events in a large-scale dataset.

  18. Multimodel Design of Large Scale Systems with Multiple Decision Makers.

    DTIC Science & Technology

    1982-08-01

    [Front matter and acknowledgments only; no abstract recovered.] "Multimodel Design of Large Scale Systems with Multiple Decision Makers," doctoral thesis, August 1982. The acknowledgments thank Professors W. R. Perkins, P. V. Kokotovic, T. Basar, and T. N. Trick; the thesis concludes with Chapter 7, which summarizes the results obtained, outlines the main contributions, and indicates directions for future research.

  19. Turbulent amplification of large-scale magnetic fields

    NASA Technical Reports Server (NTRS)

    Montgomery, D.; Chen, H.

    1984-01-01

    Previously-introduced methods for analytically estimating the effects of small-scale turbulent fluctuations on large-scale dynamics are extended to fully three-dimensional magnetohydrodynamics. The problem becomes algebraically tractable in the presence of sufficiently large spectral gaps. The calculation generalizes 'alpha dynamo' calculations, except that the velocity fluctuations and magnetic fluctuations are treated on an independent and equal footing. Earlier expressions for the 'alpha coefficients' of turbulent magnetic field amplification are recovered as a special case.

  20. Space transportation booster engine thrust chamber technology, large scale injector

    NASA Technical Reports Server (NTRS)

    Schneider, J. A.

    1993-01-01

    The objective of the Large Scale Injector (LSI) program was to deliver a 21 inch diameter, 600,000 lbf thrust class injector to NASA/MSFC for hot fire testing. The hot fire test program would demonstrate the feasibility and integrity of the full scale injector, including combustion stability, chamber wall compatibility (thermal management), and injector performance. The 21 inch diameter injector was delivered in September of 1991.

  1. Relic vector field and CMB large scale anomalies

    SciTech Connect

    Chen, Xingang; Wang, Yi E-mail: yw366@cam.ac.uk

    2014-10-01

    We study the most general effects of relic vector fields on the inflationary background and density perturbations. Such effects are observable if the number of inflationary e-folds is close to the minimum requirement for solving the horizon problem. We show that this can potentially explain two CMB large-scale anomalies: the quadrupole-octopole alignment and the quadrupole power suppression. We discuss its effect on the parity anomaly. We also provide an analytical template for more detailed data comparison.

  2. The large-scale anisotropy with the PAMELA calorimeter

    NASA Astrophysics Data System (ADS)

    Karelin, A.; Adriani, O.; Barbarino, G.; Bazilevskaya, G.; Bellotti, R.; Boezio, M.; Bogomolov, E.; Bongi, M.; Bonvicini, V.; Bottai, S.; Bruno, A.; Cafagna, F.; Campana, D.; Carbone, R.; Carlson, P.; Casolino, M.; Castellini, G.; De Donato, C.; De Santis, C.; De Simone, N.; Di Felice, V.; Formato, V.; Galper, A.; Koldashov, S.; Koldobskiy, S.; Krut'kov, S.; Kvashnin, A.; Leonov, A.; Malakhov, V.; Marcelli, L.; Martucci, M.; Mayorov, A.; Menn, W.; Mergé, M.; Mikhailov, V.; Mocchiutti, E.; Monaco, A.; Mori, N.; Munini, R.; Osteria, G.; Palma, F.; Panico, B.; Papini, P.; Pearce, M.; Picozza, P.; Ricci, M.; Ricciarini, S.; Sarkar, R.; Simon, M.; Scotti, V.; Sparvoli, R.; Spillantini, P.; Stozhkov, Y.; Vacchi, A.; Vannuccini, E.; Vasilyev, G.; Voronov, S.; Yurkin, Y.; Zampa, G.; Zampa, N.

    2015-10-01

    The large-scale anisotropy (or the so-called star-diurnal wave) has been studied using the calorimeter of the spaceborne experiment PAMELA. The cosmic-ray anisotropy has been obtained for the Southern and Northern hemispheres simultaneously in the equatorial coordinate system for the period 2006-2014. The dipole amplitude and phase have been measured for energies of 1-20 TeV per nucleon.

  3. Supporting large scale applications on networks of workstations

    NASA Technical Reports Server (NTRS)

    Cooper, Robert; Birman, Kenneth P.

    1989-01-01

    Distributed applications on networks of workstations are an increasingly common way to satisfy computing needs. However, existing mechanisms for distributed programming exhibit poor performance and reliability as application size increases. Extension of the ISIS distributed programming system to support large scale distributed applications by providing hierarchical process groups is discussed. Incorporation of hierarchy in the program structure and exploitation of this to limit the communication and storage required in any one component of the distributed system is examined.

  4. The Phoenix series large scale LNG pool fire experiments.

    SciTech Connect

    Simpson, Richard B.; Jensen, Richard Pearson; Demosthenous, Byron; Luketa, Anay Josephine; Ricks, Allen Joseph; Hightower, Marion Michael; Blanchat, Thomas K.; Helmick, Paul H.; Tieszen, Sheldon Robert; Deola, Regina Anne; Mercier, Jeffrey Alan; Suo-Anttila, Jill Marie; Miller, Timothy J.

    2010-12-01

    The increasing demand for natural gas could increase the number and frequency of Liquefied Natural Gas (LNG) tanker deliveries to ports across the United States. Because of the increasing number of shipments and the number of possible new facilities, concerns about the safety of the public and property from accidental, and even more importantly intentional, spills have increased. While improvements have been made over the past decade in assessing hazards from LNG spills, the existing experimental data is much smaller in size and scale than many postulated large accidental and intentional spills. Since the physics and hazards of a fire change with fire size, there are concerns about the adequacy of current hazard prediction techniques for large LNG spills and fires. To address these concerns, Congress funded the Department of Energy (DOE) in 2008 to conduct a series of laboratory and large-scale LNG pool fire experiments at Sandia National Laboratories (Sandia) in Albuquerque, New Mexico. This report presents the test data and results of both sets of fire experiments. A series of five reduced-scale (gas burner) tests (yielding 27 sets of data) were conducted in 2007 and 2008 at Sandia's Thermal Test Complex (TTC) to assess flame height to fire diameter ratios as a function of nondimensional heat release rates, for extrapolation to large-scale LNG fires. The large-scale LNG pool fire experiments were conducted in a 120 m diameter pond specially designed and constructed in Sandia's Area III large-scale test complex. Two fire tests of LNG spills 21 and 81 m in diameter were conducted in 2009 to improve the understanding of flame height, smoke production, and burn rate, and therefore the physics and hazards of large LNG spills and fires.
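The flame-height-versus-diameter scaling mentioned above can be illustrated with a standard pool-fire correlation. The report's own curve fits are not reproduced here, so this minimal sketch uses the widely cited Heskestad correlation as a stand-in, with illustrative ambient parameters:

```python
import math

def q_star(Q, D, rho=1.2, cp=1005.0, T_inf=293.0, g=9.81):
    """Nondimensional heat release rate:
    Q* = Q / (rho * cp * T_inf * sqrt(g * D) * D^2),
    with Q in watts and pool diameter D in metres."""
    return Q / (rho * cp * T_inf * math.sqrt(g * D) * D**2)

def flame_height(Q, D):
    """Mean flame height from the Heskestad correlation
    L/D = 3.7 * Q*^(2/5) - 1.02 (an illustrative stand-in here)."""
    return D * (3.7 * q_star(Q, D) ** 0.4 - 1.02)

# At fixed total heat release, Q* (and hence L/D) falls as D grows,
# which is why small-scale fire data cannot simply be scaled up.
```

This is why the reduced-scale burner tests measured L/D across a range of Q* before extrapolating to the 21 m and 81 m pools.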

  5. Information Tailoring Enhancements for Large Scale Social Data

    DTIC Science & Technology

    2016-03-15

    Information Tailoring Enhancements for Large-Scale Social Data. Progress Report No. 2. Reporting Period: December 16, 2015 – March 15, 2016. Contract No. N00014-15-P-5138. Sponsored by ONR... Intelligent Automation Incorporated. Submitted in accordance with... robustness. We improved the (i) messaging architecture, (ii) data redundancy, and (iii) service availability of the Scraawl computational framework.

  6. Host Immunity via Mutable Virtualized Large-Scale Network Containers

    DTIC Science & Technology

    2016-07-25

    system for host immunity that combines virtualization, emulation, and mutable network configurations. This system is deployed on a single host, and... entire IPv4 address space within 45 minutes from a single machine. Second, when... URL, and we call it a URL marker. A URL marker records the information about its parent web page's URL and the ID of the user who collected the URL. Thus, when...

  7. Concurrent Programming Using Actors: Exploiting Large-Scale Parallelism,

    DTIC Science & Technology

    1985-10-07

    9. PERFORMING ORGANIZATION NAME AND ADDRESS: Artificial Intelligence Laboratory, 545 Technology Square... 10. PROGRAM ELEMENT, PROJECT, TASK, AREA & WORK UNIT NUMBERS... CONCURRENT PROGRAMMING USING ACTORS: EXPLOITING LARGE-SCALE PARALLELISM(U), MIT Cambridge Artificial Intelligence Laboratory, G. Agha et al.... MASSACHUSETTS INSTITUTE OF TECHNOLOGY, ARTIFICIAL INTELLIGENCE...

  8. Developing and Understanding Methods for Large Scale Nonlinear Optimization

    DTIC Science & Technology

    2001-12-01

    development of new algorithms for large-scale unconstrained and constrained optimization problems, including limited-memory methods for problems with... "analysis of tensor and SQP methods for singular constrained optimization", to appear in SIAM Journal on Optimization. Published in peer-reviewed... Mathematica, Vol. III, Journal der Deutschen Mathematiker-Vereinigung, 1998. S. Crivelli, B. Bader, R. Byrd, E. Eskow, V. Lamberti, R. Schnabel and T...

  9. Large-scale Alfvén vortices

    SciTech Connect

    Onishchenko, O. G.; Horton, W.; Scullion, E.; Fedun, V.

    2015-12-15

    A new type of large-scale vortex structure of dispersionless Alfvén waves in collisionless plasma is investigated. It is shown that Alfvén waves can propagate in the form of Alfvén vortices of finite characteristic radius, characterised by magnetic flux ropes carrying orbital angular momentum. The structures of the toroidal and radial velocity, the fluid and magnetic-field vorticity, and the longitudinal electric current in the plane orthogonal to the external magnetic field are discussed.

  10. Analysis plan for 1985 large-scale tests. Technical report

    SciTech Connect

    McMullan, F.W.

    1983-01-01

    The purpose of this effort is to assist DNA in planning for large-scale (upwards of 5000 tons) detonations of conventional explosives in the 1985 and beyond time frame. Primary research objectives were to investigate potential means to increase blast duration and peak pressures. This report identifies and analyzes several candidate explosives. It examines several charge designs and identifies advantages and disadvantages of each. Other factors including terrain and multiburst techniques are addressed as are test site considerations.

  11. "The Health Educator" Readership Survey, 2011: Reporting the Results

    ERIC Educational Resources Information Center

    Bliss, Kadi; Ogletree, Roberta J.; Liefer, Maureen

    2011-01-01

    Readership surveys can help editors assess satisfaction with a journal as well as identify potential modifications to be made. The editorial staff of "The Health Educator" conducted an online readership survey in the summer of 2011. After a five-week data solicitation and collection period, a total of 504 Eta Sigma Gamma (ESG) members responded.…

  12. Equivalent common path method in large-scale laser comparator

    NASA Astrophysics Data System (ADS)

    He, Mingzhao; Li, Jianshuang; Miao, Dongjing

    2015-02-01

    The large-scale laser comparator is the main standard device providing accurate, reliable, and traceable measurements for high-precision large-scale line and 3D measurement instruments. It is mainly composed of a guide rail, a motion control system, an environmental parameter monitoring system, and a displacement measurement system. In the laser comparator, the main error sources are the temperature distribution, the straightness of the guide rail, and the pitch and yaw of the measuring carriage. To minimize the measurement uncertainty, an equivalent common optical path scheme is proposed and implemented. Three laser interferometers are adjusted to be parallel with the guide rail. The displacement along an arbitrary virtual optical path is calculated from the three measured displacements, without knowledge of the carriage orientations at the start and end positions. The orientation of the air-floating carriage is calculated from the displacements of the three optical paths and the positions of the three retroreflectors, which are precisely measured by a Laser Tracker. A fourth laser interferometer is used in the virtual optical path as a reference to verify this compensation method. This paper analyzes the effect of rail straightness on the displacement measurement. The proposed method, through experimental verification, can improve the measurement uncertainty of the large-scale laser comparator.
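The "virtual optical path" idea above amounts to fitting an affine displacement field through the three interferometer readings (valid for a rigid carriage with small pitch and yaw) and evaluating it at any point of the carriage cross-section. A minimal sketch, with hypothetical geometry and function names:

```python
import numpy as np

def virtual_displacement(points_xy, displacements, virtual_xy):
    """Fit an affine displacement field d(x, y) = a + b*x + c*y through the
    displacements measured by three interferometers at cross-section
    positions points_xy (3x2 array), then evaluate it at virtual_xy.
    Assumes a rigid carriage and small rotation angles (pitch/yaw)."""
    A = np.column_stack([np.ones(3), points_xy[:, 0], points_xy[:, 1]])
    a, b, c = np.linalg.solve(A, displacements)  # affine coefficients
    x, y = virtual_xy
    return a + b * x + c * y
```

The coefficients b and c are proportional to the carriage yaw and pitch, which is how the same three readings also yield the carriage orientation.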

  13. Dispersal Mutualism Incorporated into Large-Scale, Infrequent Disturbances

    PubMed Central

    Parker, V. Thomas

    2015-01-01

    Because of their influence on succession and other community interactions, large-scale, infrequent natural disturbances should also play a major role in mutualistic interactions. Using field data and experiments, I test whether mutualisms have been incorporated into large-scale wildfire by asking whether the outcomes of a mutualism depend on disturbance. In this study, a seed-dispersal mutualism is shown to depend on infrequent, large-scale disturbances. A dominant shrubland plant (Arctostaphylos species) produces seeds that make up a persistent soil seed bank and require fire to germinate. In post-fire stands, I show that seedlings emerging from rodent caches dominate sites that experienced higher fire intensity. Field experiments show that rodents (Peromyscus californicus, P. boylii) do cache Arctostaphylos fruit and bury most seed caches to a depth sufficient to survive a killing heat pulse that a fire might drive into the soil. While the rodent dispersal and caching behavior itself is unchanged compared to other habitats, the environmental transformation caused by wildfire converts the caching burial of seed from a dispersal process to a plant fire-adaptive trait, and provides the context for stimulating subsequent life-history evolution in the plant host. PMID:26151560

  14. Impact of Large-scale Geological Architectures On Recharge

    NASA Astrophysics Data System (ADS)

    Troldborg, L.; Refsgaard, J. C.; Engesgaard, P.; Jensen, K. H.

    Geological and hydrogeological data constitute the basis for assessment of groundwater flow patterns and recharge zones. The accessibility and applicability of hard geological data is often a major obstacle in deriving plausible conceptual models. Nevertheless, focus is often on the parameter uncertainty caused by geological heterogeneity due to the lack of hard geological data, thus neglecting the possibility of alternative conceptualizations of the large-scale geological architecture. For a catchment in the eastern part of Denmark, we have constructed different geological models based on different conceptualizations of the major geological trends and facies architecture. The geological models are equally plausible in a conceptual sense, and they are all calibrated to well-head and river-flow measurements. Comparison of differences in recharge zones, and subsequently in well protection zones, emphasizes the importance of assessing large-scale geological architecture in regional-scale hydrological modeling in a non-deterministic way. Geostatistical modeling carried out in a transitional-probability framework shows the possibility of assessing multiple realizations of large-scale geological architecture from a combination of soft and hard geological information.

  15. Line segment extraction for large scale unorganized point clouds

    NASA Astrophysics Data System (ADS)

    Lin, Yangbin; Wang, Cheng; Cheng, Jun; Chen, Bili; Jia, Fukai; Chen, Zhonggui; Li, Jonathan

    2015-04-01

    Line segment detection in images is already a well-investigated topic, although it has received considerably less attention in 3D point clouds. Benefiting from current LiDAR devices, large-scale point clouds are becoming increasingly common. Most human-made objects have flat surfaces. Line segments that occur where pairs of planes intersect give important information regarding the geometric content of point clouds, which is especially useful for automatic building reconstruction and segmentation. This paper proposes a novel method that is capable of accurately extracting plane intersection line segments from large-scale raw scan points. The 3D line-support region, namely, a point set near a straight linear structure, is extracted simultaneously. The 3D line-support region is fitted by our Line-Segment-Half-Planes (LSHP) structure, which provides a geometric constraint for a line segment, making the line segment more reliable and accurate. We demonstrate our method on the point clouds of large-scale, complex, real-world scenes acquired by LiDAR devices. We also demonstrate the application of 3D line-support regions and their LSHP structures on urban scene abstraction.

  16. Reliability assessment for components of large scale photovoltaic systems

    NASA Astrophysics Data System (ADS)

    Ahadi, Amir; Ghadimi, Noradin; Mirabbasi, Davar

    2014-10-01

    Photovoltaic (PV) systems have significantly shifted from independent power generation systems to large-scale grid-connected generation systems in recent years. The power output of PV systems is affected by the reliability of the various components in the system. This study proposes an analytical approach to evaluate the reliability of large-scale, grid-connected PV systems. The fault-tree method with an exponential probability distribution function is used to analyze the components of large-scale PV systems. The system is considered under various sequential and parallel fault combinations in order to find all realistic ways in which the top or undesired events can occur. Additionally, the method can identify areas on which planned maintenance should focus. By monitoring the critical components of a PV system, it is possible not only to improve the reliability of the system, but also to optimize the maintenance costs. The latter is achieved by informing the operators about the status of system components. This approach can be used to ensure secure operation of the system through its flexibility in monitoring system applications. The implementation demonstrates that the proposed method is effective and efficient and can conveniently incorporate additional system maintenance plans and diagnostic strategies.

  17. Learning Short Binary Codes for Large-scale Image Retrieval.

    PubMed

    Liu, Li; Yu, Mengyang; Shao, Ling

    2017-03-01

    Large-scale visual information retrieval has become an active research area in this big-data era. Recently, hashing/binary-coding algorithms have proved effective for scalable retrieval applications. Most existing hashing methods require relatively long binary codes (i.e., over hundreds of bits, sometimes even thousands of bits) to achieve reasonable retrieval accuracy. However, for some realistic and unique applications, such as on wearable or mobile devices, only short binary codes can be used for efficient image retrieval, due to the limitation of computational resources or bandwidth on these devices. In this paper, we propose a novel unsupervised hashing approach called min-cost ranking (MCR) specifically for learning powerful short binary codes (i.e., usually shorter than 100 bits) for scalable image retrieval tasks. By exploring the discriminative ability of each dimension of the data, MCR can generate a one-bit binary code for each dimension and simultaneously rank the discriminative separability of each bit according to the proposed cost function. Only the top-ranked bits with minimum cost values are then selected and grouped together to compose the final salient binary codes. Extensive experimental results on large-scale retrieval demonstrate that MCR can achieve performance comparable to state-of-the-art hashing algorithms but with significantly shorter codes, leading to much faster large-scale retrieval.
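The per-dimension bit generation and cost-based bit ranking can be sketched as below. The paper's actual cost function is not reproduced here, so a simple bit-balance proxy stands in for it; treat this purely as an illustration of the generate-rank-select pipeline, not as the MCR algorithm itself:

```python
import numpy as np

def short_binary_codes(X, n_bits):
    """Sketch of the MCR idea: one candidate bit per data dimension
    (thresholding at the per-dimension median), then keep only the bits
    with the lowest cost. Here cost is a toy proxy (negative bit variance,
    so balanced bits rank first); the paper's cost function is different."""
    medians = np.median(X, axis=0)
    bits = (X > medians).astype(np.uint8)   # one candidate bit per dimension
    p = bits.mean(axis=0)                   # fraction of ones per bit
    cost = -(p * (1.0 - p))                 # lower cost = more balanced bit
    top = np.argsort(cost)[:n_bits]         # select the top-ranked bits
    return bits[:, top], top
```

Retrieval then compares the resulting short codes by Hamming distance, which is where the speed advantage of codes under 100 bits comes from.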

  18. Homogenization of Large-Scale Movement Models in Ecology

    USGS Publications Warehouse

    Garlick, M.J.; Powell, J.A.; Hooten, M.B.; McFarlane, L.R.

    2011-01-01

    A difficulty in using diffusion models to predict large scale animal population dispersal is that individuals move differently based on local information (as opposed to gradients) in differing habitat types. This can be accommodated by using ecological diffusion. However, real environments are often spatially complex, limiting application of a direct approach. Homogenization for partial differential equations has long been applied to Fickian diffusion (in which average individual movement is organized along gradients of habitat and population density). We derive a homogenization procedure for ecological diffusion and apply it to a simple model for chronic wasting disease in mule deer. Homogenization allows us to determine the impact of small scale (10-100 m) habitat variability on large scale (10-100 km) movement. The procedure generates asymptotic equations for solutions on the large scale with parameters defined by small-scale variation. The simplicity of this homogenization procedure is striking when compared to the multi-dimensional homogenization procedure for Fickian diffusion, and the method will be equally straightforward for more complex models. © 2010 Society for Mathematical Biology.
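For orientation, the formal difference between the two operators mentioned above is where the motility coefficient sits; this is why ecological diffusion needs its own homogenization procedure (the effective large-scale coefficient depends on small-scale averages of the motility, with the precise averaging derived in the paper and not reproduced here):

```latex
% Ecological diffusion: motility \mu(x) sits inside the operator,
% so individuals respond to local conditions rather than gradients
\frac{\partial u}{\partial t} = \nabla^{2}\!\left[\mu(x)\,u\right]

% Fickian diffusion: flux proportional to the gradient of density
\frac{\partial u}{\partial t} = \nabla\cdot\!\left[\mu(x)\,\nabla u\right]
```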

  19. Large-scale quantization from local correlations in space plasmas

    NASA Astrophysics Data System (ADS)

    Livadiotis, George; McComas, David J.

    2014-05-01

    This study examines the large-scale quantization that can characterize the phase space of certain physical systems. Plasmas are such systems where large-scale quantization, ħ*, is caused by Debye shielding that structures correlations between particles. The value of ħ* is constant—some 12 orders of magnitude larger than the Planck constant—across a wide range of space plasmas, from the solar wind in the inner heliosphere to the distant plasma in the inner heliosheath and the local interstellar medium. This paper develops the foundation and advances the understanding of the concept of plasma quantization; in particular, we (i) show the analogy of plasma to Planck quantization, (ii) show the key points of plasma quantization, (iii) construct some basic quantum mechanical concepts for the large-scale plasma quantization, (iv) investigate the correlation between plasma parameters that implies plasma quantization, when it is approximated by a relation between the magnetosonic energy and the plasma frequency, (v) analyze typical space plasmas throughout the heliosphere and show the constancy of plasma quantization over many orders of magnitude in plasma parameters, (vi) analyze Advanced Composition Explorer (ACE) solar wind measurements to develop another measurement of the value of ħ*, and (vii) apply plasma quantization to derive unknown plasma parameters when some key observable is missing.

  20. Channel capacity of next generation large scale MIMO systems

    NASA Astrophysics Data System (ADS)

    Alshammari, A.; Albdran, S.; Matin, M.

    2016-09-01

    The information rate that can be transferred over a given bandwidth is limited by information theory. Capacity depends on many factors, such as the signal-to-noise ratio (SNR), channel state information (CSI), and the spatial correlation in the propagation environment. It is very important to increase spectral efficiency in order to meet the growing demand for wireless services. Thus, multiple-input multiple-output (MIMO) technology has been developed and applied in most wireless standards, and it has been very successful in increasing capacity and reliability. As demand continues to increase, attention is now shifting toward large-scale MIMO, which has the potential to bring orders-of-magnitude improvements in spectral and energy efficiency. It has been shown that users' channels decorrelate as the number of antennas increases. As a result, inter-user interference can be avoided, since energy can be focused in precise directions. This paper investigates the limits of channel capacity for large-scale MIMO. We study the relation between spectral efficiency and the number of antennas N. We use a time-division duplex (TDD) system in order to obtain CSI from a training sequence in the uplink; the same CSI is used for the downlink because the channel is reciprocal. Spectral efficiency is measured for a channel model that accounts for small-scale fading while ignoring the effect of large-scale fading. It is shown that the spectral efficiency can be improved significantly, compared to single-antenna systems, under ideal circumstances.
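The capacity limit being studied is the standard MIMO log-det formula. A minimal sketch, assuming perfect CSI at the receiver and equal power allocation across the transmit antennas (the paper's TDD training and fading model are not reproduced here):

```python
import numpy as np

def mimo_capacity(H, snr):
    """Channel capacity in bits/s/Hz for channel matrix H (Nr x Nt) with
    uniform power allocation: C = log2 det(I + (snr / Nt) * H * H^H).
    Assumes perfect receiver CSI; snr is linear (not dB)."""
    nr, nt = H.shape
    G = np.eye(nr) + (snr / nt) * H @ H.conj().T
    _, logdet = np.linalg.slogdet(G)  # numerically stable log-determinant
    return logdet / np.log(2)
```

Evaluating this for growing antenna counts N (with i.i.d. fading entries in H) reproduces the roughly linear growth of spectral efficiency with min(Nr, Nt) that motivates large-scale MIMO.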

  1. Sparse approximation through boosting for learning large scale kernel machines.

    PubMed

    Sun, Ping; Yao, Xin

    2010-06-01

    Recently, sparse approximation has become a preferred method for learning large-scale kernel machines. This technique attempts to represent the solution with only a subset of the original data points, also known as basis vectors, which are usually chosen one by one with a forward selection procedure based on some selection criterion. The computational complexity of several resultant algorithms scales as O(NM^2) in time and O(NM) in memory, where N is the number of training points and M is the number of basis vectors as well as the number of forward-selection steps. For some large-scale data sets, obtaining a better solution sometimes requires including more basis vectors, which means that M is not trivial in this situation. However, limited computational resources (e.g., memory) prevent us from including too many vectors. To handle this dilemma, we propose to add an ensemble of basis vectors, instead of only one, at each forward step. The proposed method, closely related to gradient boosting, can decrease the required number M of forward steps significantly, and thus a large fraction of the computational cost is saved. Numerical experiments on three large-scale regression tasks and a classification problem demonstrate the effectiveness of the proposed approach.
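The "ensemble of basis vectors per forward step" idea can be sketched with a toy residual-correlation selector. The paper's actual selection criterion and boosting weights are more refined; the function below is an illustrative stand-in that only shows how adding a batch of vectors per step cuts the number of forward-selection rounds:

```python
import numpy as np

def select_basis_ensemble(K, y, n_basis, batch=5):
    """Toy forward selection for a kernel machine: at each step, add a small
    ensemble ('batch') of the basis vectors whose kernel columns correlate
    most with the current residual, instead of one vector at a time, then
    refit the coefficients on all chosen columns by least squares."""
    chosen = []
    residual = y.copy()
    while len(chosen) < n_basis:
        scores = np.abs(K.T @ residual)      # correlation with residual
        scores[chosen] = -np.inf             # never re-pick a basis vector
        chosen.extend(np.argsort(scores)[-batch:])
        cols = K[:, chosen]
        alpha, *_ = np.linalg.lstsq(cols, y, rcond=None)
        residual = y - cols @ alpha
    return chosen[:n_basis], residual
```

With batch size B, reaching M basis vectors takes M/B refits instead of M, which is the source of the claimed savings.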

  2. Alteration of Large-Scale Chromatin Structure by Estrogen Receptor

    PubMed Central

    Nye, Anne C.; Rajendran, Ramji R.; Stenoien, David L.; Mancini, Michael A.; Katzenellenbogen, Benita S.; Belmont, Andrew S.

    2002-01-01

    The estrogen receptor (ER), a member of the nuclear hormone receptor superfamily important in human physiology and disease, recruits coactivators which modify local chromatin structure. Here we describe effects of ER on large-scale chromatin structure as visualized in live cells. We targeted ER to gene-amplified chromosome arms containing large numbers of lac operator sites either directly, through a lac repressor-ER fusion protein (lac rep-ER), or indirectly, by fusing lac repressor with the ER interaction domain of the coactivator steroid receptor coactivator 1. Significant decondensation of large-scale chromatin structure, comparable to that produced by the ∼150-fold-stronger viral protein 16 (VP16) transcriptional activator, was produced by ER in the absence of estradiol using both approaches. Addition of estradiol induced a partial reversal of this unfolding by green fluorescent protein-lac rep-ER but not by wild-type ER recruited by a lac repressor-SRC570-780 fusion protein. The chromatin decondensation activity did not require transcriptional activation by ER nor did it require ligand-induced coactivator interactions, and unfolding did not correlate with histone hyperacetylation. Ligand-induced coactivator interactions with helix 12 of ER were necessary for the partial refolding of chromatin in response to estradiol using the lac rep-ER tethering system. This work demonstrates that when tethered or recruited to DNA, ER possesses a novel large-scale chromatin unfolding activity. PMID:11971975

  3. Geospatial Optimization of Siting Large-Scale Solar Projects

    SciTech Connect

    Macknick, J.; Quinby, T.; Caulfield, E.; Gerritsen, M.; Diffendorfer, J.; Haines, S.

    2014-03-01

    Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.

  4. Assessing salivary cortisol in large-scale, epidemiological research.

    PubMed

    Adam, Emma K; Kumari, Meena

    2009-11-01

    Salivary cortisol measures are increasingly being incorporated into large-scale, population-based, or epidemiological research, in which participants are selected to be representative of particular communities or populations of interest and sample sizes are on the order of hundreds to tens of thousands of participants. These approaches to studying salivary cortisol provide important advantages but pose a set of challenges. The representative nature of sampling and the large sample sizes associated with population-based research offer high generalizability and power, and the ability to examine cortisol functioning in relation to: (a) a wide range of social environments; (b) a diverse array of individuals and groups; and (c) a broad set of pre-disease and disease outcomes. The greater importance of high response rates (to maintain generalizability) and the higher costs associated with this type of large-scale research, however, require special adaptations of existing ambulatory cortisol protocols. These include using the most efficient sample collection protocol possible that still adequately addresses the specific cortisol-related questions at hand, and ensuring the highest possible response and compliance rates among those individuals invited to participate. Examples of choices made, response rates obtained, and results obtained from existing epidemiological cortisol studies are offered, as are suggestions for the modeling and interpretation of salivary cortisol data obtained in large-scale epidemiological research.

  5. Large-scale investigation of genomic markers for severe periodontitis.

    PubMed

    Suzuki, Asami; Ji, Guijin; Numabe, Yukihiro; Ishii, Keisuke; Muramatsu, Masaaki; Kamoi, Kyuichi

    2004-09-01

    The purpose of the present study was to investigate genomic markers for periodontitis, using large-scale single-nucleotide polymorphism (SNP) association studies comparing healthy volunteers and patients with periodontitis. Genomic DNA was obtained from 19 healthy volunteers and 22 patients with severe periodontitis, all of whom were Japanese. The subjects were genotyped at 637 SNPs in 244 genes on a large scale, using the TaqMan polymerase chain reaction (PCR) system. Statistically significant differences in allele and genotype frequencies were analyzed with Fisher's exact test. We found statistically significant differences (P < 0.01) between the healthy volunteers and the patients with severe periodontitis in the following genes: gonadotropin-releasing hormone 1 (GNRH1), phosphatidylinositol 3-kinase regulatory 1 (PIK3R1), dipeptidylpeptidase 4 (DPP4), fibrinogen-like 2 (FGL2), and calcitonin receptor (CALCR). These results suggest that SNPs in the GNRH1, PIK3R1, DPP4, FGL2, and CALCR genes are genomic markers for severe periodontitis. Our findings indicate the necessity of analyzing SNPs in genes on a large scale (i.e., a genome-wide approach) to identify genomic markers for periodontitis.
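The Fisher's exact test used above can be computed directly from the hypergeometric distribution for a 2x2 allele-count table. A self-contained sketch (the study's actual count tables are not reproduced, so the numbers in the test are made up):

```python
from math import comb

def fisher_exact_p(a, b, c, d):
    """Two-sided Fisher's exact test p-value for the 2x2 table
    [[a, b], [c, d]] (e.g. allele counts in cases vs. controls):
    with margins fixed, sum the hypergeometric probabilities of all
    tables at least as extreme (no more probable) than the observed one."""
    n = a + b + c + d
    row1, col1 = a + b, a + c

    def hyper(x):
        # P(cell a == x) under the hypergeometric null
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    p_obs = hyper(a)
    lo, hi = max(0, row1 + col1 - n), min(row1, col1)
    probs = [hyper(x) for x in range(lo, hi + 1)]
    return sum(p for p in probs if p <= p_obs + 1e-12)
```

Applying such a test at 637 SNPs is why the P < 0.01 threshold matters: without it (or a multiple-testing correction), several loci would appear significant by chance alone.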

  6. Large-scale biodiversity patterns in freshwater phytoplankton.

    PubMed

    Stomp, Maayke; Huisman, Jef; Mittelbach, Gary G; Litchman, Elena; Klausmeier, Christopher A

    2011-11-01

    Our planet shows striking gradients in the species richness of plants and animals, from high biodiversity in the tropics to low biodiversity in polar and high-mountain regions. Recently, similar patterns have been described for some groups of microorganisms, but the large-scale biogeographical distribution of freshwater phytoplankton diversity is still largely unknown. We examined the species diversity of freshwater phytoplankton sampled from 540 lakes and reservoirs distributed across the continental United States and found strong latitudinal, longitudinal, and altitudinal gradients in phytoplankton biodiversity, demonstrating that microorganisms can show substantial geographic variation in biodiversity. Detailed analysis using structural equation models indicated that these large-scale biodiversity gradients in freshwater phytoplankton diversity were mainly driven by local environmental factors, although there were residual direct effects of latitude, longitude, and altitude as well. Specifically, we found that phytoplankton species richness was an increasing saturating function of lake chlorophyll a concentration, increased with lake surface area and possibly increased with water temperature, resembling effects of productivity, habitat area, and temperature on diversity patterns commonly observed for macroorganisms. In turn, these local environmental factors varied along latitudinal, longitudinal, and altitudinal gradients. These results imply that changes in land use or climate that affect these local environmental factors are likely to have major impacts on large-scale biodiversity patterns of freshwater phytoplankton.

  7. A model of plasma heating by large-scale flow

    NASA Astrophysics Data System (ADS)

    Pongkitiwanichakul, P.; Cattaneo, F.; Boldyrev, S.; Mason, J.; Perez, J. C.

    2015-12-01

    In this work, we study the process of energy dissipation triggered by a slow large-scale motion of a magnetized conducting fluid. Our consideration is motivated by the problem of heating the solar corona, which is believed to be governed by fast reconnection events set off by the slow motion of magnetic field lines anchored in the photospheric plasma. To elucidate the physics governing the disruption of the imposed laminar motion and the energy transfer to small scales, we propose a simplified model where the large-scale motion of magnetic field lines is prescribed not at the footpoints but rather imposed volumetrically. As a result, the problem can be treated numerically with an efficient, highly accurate spectral method, allowing us to use a resolution and statistical ensemble exceeding those of the previous work. We find that, even though the large-scale deformations are slow, they eventually lead to reconnection events that drive a turbulent state at smaller scales. The small-scale turbulence displays many of the universal features of field-guided magnetohydrodynamic turbulence like a well-developed inertial range spectrum. Based on these observations, we construct a phenomenological model that gives the scalings of the amplitude of the fluctuations and the energy-dissipation rate as functions of the input parameters. We find good agreement between the numerical results and the predictions of the model.

  8. Large-scale flow generation by inhomogeneous helicity.

    PubMed

    Yokoi, N; Brandenburg, A

    2016-03-01

    The effect of kinetic helicity (velocity-vorticity correlation) on turbulent momentum transport is investigated. The turbulent kinetic helicity (pseudoscalar) enters the Reynolds stress (mirror-symmetric tensor) expression in the form of a helicity gradient as the coupling coefficient for the mean vorticity and/or the angular velocity (axial vector), which suggests the possibility of mean-flow generation in the presence of inhomogeneous helicity. This inhomogeneous helicity effect, which was previously confirmed at the level of a turbulence- or closure-model simulation, is examined with the aid of direct numerical simulations of rotating turbulence with nonuniform helicity sustained by an external forcing. The numerical simulations show that the spatial distribution of the Reynolds stress is in agreement with the helicity-related term coupled with the angular velocity, and that a large-scale flow is generated in the direction of angular velocity. Such a large-scale flow is not induced in the case of homogeneous turbulent helicity. This result confirms the validity of the inhomogeneous helicity effect in large-scale flow generation and suggests that a vortex dynamo is possible even in incompressible turbulence where there is no baroclinicity effect.

  9. Large-scale flow experiments for managing river systems

    USGS Publications Warehouse

    Konrad, Christopher P.; Olden, Julian D.; Lytle, David A.; Melis, Theodore S.; Schmidt, John C.; Bray, Erin N.; Freeman, Mary C.; Gido, Keith B.; Hemphill, Nina P.; Kennard, Mark J.; McMullen, Laura E.; Mims, Meryl C.; Pyron, Mark; Robinson, Christopher T.; Williams, John G.

    2011-01-01

    Experimental manipulations of streamflow have been used globally in recent decades to mitigate the impacts of dam operations on river systems. Rivers are challenging subjects for experimentation, because they are open systems that cannot be isolated from their social context. We identify principles to address the challenges of conducting effective large-scale flow experiments. Flow experiments have both scientific and social value when they help to resolve specific questions about the ecological action of flow with a clear nexus to water policies and decisions. Water managers must integrate new information into operating policies for large-scale experiments to be effective. Modeling and monitoring can be integrated with experiments to analyze long-term ecological responses. Experimental design should include spatially extensive observations and well-defined, repeated treatments. Large-scale flow manipulations are only a part of dam operations that affect river systems. Scientists can ensure that experimental manipulations continue to be a valuable approach for the scientifically based management of river systems.

  10. French Frigate Shoals reef health survey

    USGS Publications Warehouse

    Work, Thierry M.; Coles, Steve L.; Rameyer, Robert

    2002-01-01

    French Frigate Shoals (FFS) consists of a large (31 nautical mile) fringing reef partially enclosing a lagoon. A basalt pinnacle (La Perouse Pinnacle) rises approximately halfway between the two ends of the arcs of the fringing reef. Tern Island is situated at the northern end of the lagoon and is surrounded by a dredged ship channel. The lagoon becomes progressively shallower from west to east and harbors a variety of marine life, including corals, fish, marine mammals, and sea turtles (Amerson 1971). In 2000, an interagency survey of the Northwestern Hawaiian Islands was done to document the fauna and flora of FFS (Maragos and Gulko 2002). During that survey, 38 stations were examined and 41 species of stony corals were documented, the most of any of the Northwestern Hawaiian Islands (Maragos and Gulko 2002). At some of these stations, corals with abnormalities were observed. The present study aimed to expand on the 2000 survey by evaluating the lesions in areas where they were documented.

  11. New Mexico Adolescent Health Risks Survey.

    ERIC Educational Resources Information Center

    Antle, David

    To inform students of health risks (posed by behavior, environment, and genetics) and provide schools with collective risk appraisal information as a basis for planning/evaluating health and wellness initiatives, New Mexico administered the Teen Wellness Check in 1985 to 1,573 ninth-grade students from 7 New Mexico public schools. Subjects were…

  12. Health sciences library building projects, 1998 survey.

    PubMed

    Bowden, V M

    1999-10-01

    Twenty-eight health sciences library building projects are briefly described, including twelve new buildings and sixteen additions, remodelings, and renovations. The libraries range in size from 2,144 square feet to 190,000 gross square feet. Twelve libraries are described in detail. These include three hospital libraries, one information center sponsored by ten institutions, and eight academic health sciences libraries.

  13. New probes of Cosmic Microwave Background large-scale anomalies

    NASA Astrophysics Data System (ADS)

    Aiola, Simone

    Fifty years of Cosmic Microwave Background (CMB) data have played a crucial role in constraining the parameters of the ΛCDM model, in which Dark Energy, Dark Matter, and Inflation are the three most important pillars not yet understood. Inflation prescribes an isotropic universe on large scales, and it generates spatially correlated density fluctuations over the whole Hubble volume. CMB temperature fluctuations on scales larger than a degree in the sky, affected by modes on super-horizon scales at the time of recombination, are a clean snapshot of the universe after inflation. In addition, the accelerated expansion of the universe, driven by Dark Energy, leaves a barely detectable imprint on the large-scale temperature sky at late times. Such fundamental predictions have been tested with current CMB data and found to be in tension with what we expect from our simple ΛCDM model. Is this tension just a random fluke or a fundamental issue with the present model? In this thesis, we present a new framework to probe the lack of large-scale correlations in the temperature sky using CMB polarization data. Our analysis shows that if a suppression in the CMB polarization correlations is detected, it will provide compelling evidence for new physics on super-horizon scales. To further analyze the statistical properties of the CMB temperature sky, we constrain the degree of statistical anisotropy of the CMB in the context of the observed large-scale dipole power asymmetry. We find evidence for a scale-dependent dipolar modulation at 2.5σ. To isolate late-time signals from the primordial ones, we test the anomalously high Integrated Sachs-Wolfe effect signal generated by superstructures in the universe. We find that the detected signal is in tension with the expectations from ΛCDM at the 2.5σ level, which is somewhat smaller than what has been previously argued. To conclude, we describe the current status of CMB observations on small scales, highlighting the

  14. Robust large-scale parallel nonlinear solvers for simulations.

    SciTech Connect

    Bader, Brett William; Pawlowski, Roger Patrick; Kolda, Tamara Gibson

    2005-11-01

    This report documents research to develop robust and efficient solution techniques for solving large-scale systems of nonlinear equations. The most widely used method for solving systems of nonlinear equations is Newton's method. While much research has been devoted to augmenting Newton-based solvers (usually with globalization techniques), little has been devoted to exploring the application of different models. Our research has been directed at evaluating techniques using models other than Newton's method: a lower-order model, Broyden's method, and a higher-order model, the tensor method. We have developed large-scale versions of each of these models and have demonstrated their use in important applications at Sandia. Broyden's method replaces the Jacobian with an approximation, allowing codes that cannot evaluate a Jacobian, or that have an inaccurate Jacobian, to converge to a solution. Limited-memory methods, which have been successful in optimization, allow us to extend this approach to large-scale problems. We compare the robustness and efficiency of Newton's method, modified Newton's method, the Jacobian-free Newton-Krylov method, and our limited-memory Broyden method. Comparisons are carried out for large-scale applications of fluid flow simulations and electronic circuit simulations. Results show that, in cases where the Jacobian was inaccurate or could not be computed, Broyden's method converged in some cases where Newton's method failed to converge. We identify conditions where Broyden's method can be more efficient than Newton's method. We also present modifications to a large-scale tensor method, originally proposed by Bouaricha, for greater efficiency, better robustness, and wider applicability. Tensor methods are an alternative to Newton-based methods and compute a step from a local quadratic model rather than a linear model. The advantage of Bouaricha's method is that it can use any existing linear solver, which makes it simple to write
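    Broyden's core idea, replacing the Jacobian with a secant approximation that is cheaply updated at each iteration, can be sketched in a few lines. This is a minimal illustration on a toy 2-D system, not the limited-memory Sandia implementation described above; the function names and toy system are invented for the example.

    ```python
    import numpy as np

    def fd_jacobian(F, x, eps=1e-7):
        """Forward-difference Jacobian, evaluated once to seed Broyden's method."""
        f0 = F(x)
        J = np.empty((len(f0), len(x)))
        for j in range(len(x)):
            xp = x.copy()
            xp[j] += eps
            J[:, j] = (F(xp) - f0) / eps
        return J

    def broyden_solve(F, x0, tol=1e-12, max_iter=50):
        """Broyden's 'good' method: after one finite-difference seed, the
        Jacobian approximation B is refreshed by rank-one secant updates,
        so the true Jacobian is never evaluated during the iteration."""
        x = np.array(x0, dtype=float)
        B = fd_jacobian(F, x)
        f = F(x)
        for _ in range(max_iter):
            if np.linalg.norm(f) < tol:
                break
            s = np.linalg.solve(B, -f)          # quasi-Newton step
            x = x + s
            f_new = F(x)
            # enforce the secant condition B_new @ s = f_new - f (rank-one update)
            B += np.outer(f_new - f - B @ s, s) / (s @ s)
            f = f_new
        return x

    # Toy system (not from the report): x^2 + y^2 = 2 and x = y, root at (1, 1).
    F = lambda v: np.array([v[0] ** 2 + v[1] ** 2 - 2.0, v[0] - v[1]])
    root = broyden_solve(F, [1.5, 1.0])
    ```

    A limited-memory variant would store the rank-one update vectors instead of the dense matrix B, which is what makes the approach viable at large scale.
    
    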

  15. Foundational perspectives on causality in large-scale brain networks.

    PubMed

    Mannino, Michael; Bressler, Steven L

    2015-12-01

    A profusion of recent work in cognitive neuroscience has been concerned with the endeavor to uncover causal influences in large-scale brain networks. However, despite the fact that many papers give a nod to the important theoretical challenges posed by the concept of causality, this explosion of research has generally not been accompanied by a rigorous conceptual analysis of the nature of causality in the brain. This review provides both a descriptive and prescriptive account of the nature of causality as found within and between large-scale brain networks. In short, it seeks to clarify the concept of causality in large-scale brain networks both philosophically and scientifically. This is accomplished by briefly reviewing the rich philosophical history of work on causality, especially focusing on contributions by David Hume, Immanuel Kant, Bertrand Russell, and Christopher Hitchcock. We go on to discuss the impact that various interpretations of modern physics have had on our understanding of causality. Throughout all this, a central focus is the distinction between theories of deterministic causality (DC), whereby causes uniquely determine their effects, and probabilistic causality (PC), whereby causes change the probability of occurrence of their effects. We argue that, given the topological complexity of its large-scale connectivity, the brain should be considered as a complex system and its causal influences treated as probabilistic in nature. We conclude that PC is well suited for explaining causality in the brain for three reasons: (1) brain causality is often mutual; (2) connectional convergence dictates that only rarely is the activity of one neuronal population uniquely determined by another one; and (3) the causal influences exerted between neuronal populations may not have observable effects. A number of different techniques are currently available to characterize causal influence in the brain. Typically, these techniques quantify the statistical

  16. Foundational perspectives on causality in large-scale brain networks

    NASA Astrophysics Data System (ADS)

    Mannino, Michael; Bressler, Steven L.

    2015-12-01

    A profusion of recent work in cognitive neuroscience has been concerned with the endeavor to uncover causal influences in large-scale brain networks. However, despite the fact that many papers give a nod to the important theoretical challenges posed by the concept of causality, this explosion of research has generally not been accompanied by a rigorous conceptual analysis of the nature of causality in the brain. This review provides both a descriptive and prescriptive account of the nature of causality as found within and between large-scale brain networks. In short, it seeks to clarify the concept of causality in large-scale brain networks both philosophically and scientifically. This is accomplished by briefly reviewing the rich philosophical history of work on causality, especially focusing on contributions by David Hume, Immanuel Kant, Bertrand Russell, and Christopher Hitchcock. We go on to discuss the impact that various interpretations of modern physics have had on our understanding of causality. Throughout all this, a central focus is the distinction between theories of deterministic causality (DC), whereby causes uniquely determine their effects, and probabilistic causality (PC), whereby causes change the probability of occurrence of their effects. We argue that, given the topological complexity of its large-scale connectivity, the brain should be considered as a complex system and its causal influences treated as probabilistic in nature. We conclude that PC is well suited for explaining causality in the brain for three reasons: (1) brain causality is often mutual; (2) connectional convergence dictates that only rarely is the activity of one neuronal population uniquely determined by another one; and (3) the causal influences exerted between neuronal populations may not have observable effects. A number of different techniques are currently available to characterize causal influence in the brain. Typically, these techniques quantify the statistical

  17. The California Health Interview Survey 2001: translation of a major survey for California's multiethnic population.

    PubMed Central

    Ponce, Ninez A.; Lavarreda, Shana Alex; Yen, Wei; Brown, E. Richard; DiSogra, Charles; Satter, Delight E.

    2004-01-01

    The cultural and linguistic diversity of the U.S. population presents challenges to the design and implementation of population-based surveys that serve to inform public policies. Information derived from such surveys may be less than representative if groups with limited or no English language skills are not included. The California Health Interview Survey (CHIS), first administered in 2001, is a population-based health survey of more than 55,000 California households. This article describes the process that the designers of CHIS 2001 underwent in culturally adapting the survey and translating it into an unprecedented number of languages: Spanish, Chinese, Vietnamese, Korean, and Khmer. The multiethnic and multilingual CHIS 2001 illustrates the importance of cultural and linguistic adaptation in raising the quality of population-based surveys, especially when the populations they intend to represent are as diverse as California's. PMID:15219795

  18. 75 FR 51843 - In the Matter of Certain Large Scale Integrated Circuit Semiconductor Chips and Products...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-23

    ... Matter of Certain Large Scale Integrated Circuit Semiconductor Chips and Products Containing the Same... certain large scale integrated circuit semiconductor chips and products containing same by reason...

  19. National Natality Survey/National Maternal and Infant Health Survey (NMIHS)

    Cancer.gov

    The survey provides data on socioeconomic and demographic characteristics of mothers, prenatal care, pregnancy history, occupational background, health status of mother and infant, and types and sources of medical care received.

  20. Licensed Practical Nurses in Occupational Health. An Initial Survey.

    ERIC Educational Resources Information Center

    Lee, Jane A.; And Others

    The study, conducted in 1971, assessed characteristics of licensed practical nurses (LPN's) who worked in occupational health nursing. The survey instrument, a questionnaire, was returned by 591 LPN's in occupational health and provided data related to: personal characteristics, work and setting, administrative and professional functioning,…

  1. Health Research Facilities: A survey of Doctorate-Granting Institutions.

    ERIC Educational Resources Information Center

    Atelsek, Frank J.; Gomberg, Irene L.

    The survey data cover three broad categories: (1) the status of existing health research facilities at doctorate-granting institutions (including their current value, adequacy, and condition); (2) the volume of new construction in progress; and (3) the additions to health research facilities anticipated during the next 5 years…

  2. Student Opinions About Health Services at Miami. Survey Report.

    ERIC Educational Resources Information Center

    Keller, Michael J.

    A random sample of Miami University undergraduate and graduate students were surveyed to determine their opinions about health care at the university. Most of the questions dealt with the university's student health service and satisfaction with the quality of medical treatment at the facility, perception of the staff's performance and interest in…

  3. Taking the Pulse of Undergraduate Health Psychology: A Nationwide Survey

    ERIC Educational Resources Information Center

    Brack, Amy Badura; Kesitilwe, Kutlo; Ware, Mark E.

    2010-01-01

    We conducted a random national survey of 100 doctoral, 100 comprehensive, and 100 baccalaureate institutions to determine the current state of the undergraduate health psychology course. We found clear evidence of a maturing course with much greater commonality in name (health psychology), theoretical foundation (the biopsychosocial model), and…

  4. TOPOLOGY OF A LARGE-SCALE STRUCTURE AS A TEST OF MODIFIED GRAVITY

    SciTech Connect

    Wang Xin; Chen Xuelei; Park, Changbom

    2012-03-01

    The genus of the isodensity contours is a robust measure of the topology of a large-scale structure, and it is relatively insensitive to nonlinear gravitational evolution, galaxy bias, and redshift-space distortion. We show that the growth of density fluctuations is scale dependent even in the linear regime in some modified gravity theories, which opens a new possibility of testing the theories observationally. We propose to use the genus of the isodensity contours, an intrinsic measure of the topology of the large-scale structure, as a statistic to be used in such tests. In Einstein's general theory of relativity, density fluctuations grow at the same rate on all scales in the linear regime, and the genus per comoving volume is almost conserved as structures grow homologously, so we expect that the genus-smoothing-scale relation is basically time independent. However, in some modified gravity models where structures grow with different rates on different scales, the genus-smoothing-scale relation should change over time. This can be used to test the gravity models with large-scale structure observations. We study the cases of the f(R) theory, DGP braneworld theory as well as the parameterized post-Friedmann models. We also forecast how the modified gravity models can be constrained with optical/IR or redshifted 21 cm radio surveys in the near future.

  5. Summary Health Statistics for U.S. Children: National Health Interview Survey, 1999.

    ERIC Educational Resources Information Center

    Blackwell, Debra L.; Tonthat, Luong

    This report presents statistics from the 1999 National Health Interview Survey (NHIS) on selected health measures for children under 18 years of age, classified by sex, age, race/ethnicity, family structure, parent education, family income, poverty status, health insurance coverage, place of residence, region, and current health status. The NHIS…

  6. Health sciences library building projects, 1998 survey.

    PubMed Central

    Bowden, V M

    1999-01-01

    Twenty-eight health sciences library building projects are briefly described, including twelve new buildings and sixteen additions, remodelings, and renovations. The libraries range in size from 2,144 square feet to 190,000 gross square feet. Twelve libraries are described in detail. These include three hospital libraries, one information center sponsored by ten institutions, and eight academic health sciences libraries. PMID:10550027

  7. CUMULATIVE TRAUMAS AND RISK THRESHOLDS: 12-MONTH PTSD IN THE WORLD MENTAL HEALTH (WMH) SURVEYS

    PubMed Central

    Karam, Elie G.; Friedman, Matthew J.; Hill, Eric D.; Kessler, Ronald C.; McLaughlin, Katie A.; Petukhova, Maria; Sampson, Laura; Shahly, Victoria; Angermeyer, Matthias C.; Bromet, Evelyn J.; de Girolamo, Giovanni; de Graaf, Ron; Demyttenaere, Koen; Ferry, Finola; Florescu, Silvia E.; Haro, Josep Maria; He, Yanling; Karam, Aimee N.; Kawakami, Norito; Kovess-Masfety, Viviane; Medina-Mora, María Elena; Browne, Mark A. Oakley; Posada-Villa, José A.; Shalev, Arieh Y.; Stein, Dan J.; Viana, Maria Carmen; Zarkov, Zahari; Koenen, Karestan C.

    2014-01-01

    Background Clinical research suggests that posttraumatic stress disorder (PTSD) patients exposed to multiple traumatic events (TEs) rather than a single TE have increased morbidity and dysfunction. Although epidemiological surveys in the United States and Europe also document high rates of multiple TE exposure, no population-based cross-national data have examined this issue. Methods Data were analyzed from 20 population surveys in the World Health Organization World Mental Health Survey Initiative (n = 51,295, aged 18+). The Composite International Diagnostic Interview (version 3.0) assessed 12-month PTSD and other common DSM-IV disorders. Respondents with 12-month PTSD were assessed for single versus multiple TEs implicated in their symptoms. Associations were examined with age of onset (AOO), functional impairment, comorbidity, and PTSD symptom counts. Results 19.8% of respondents with 12-month PTSD reported that their symptoms were associated with multiple TEs. Cases who associated their PTSD with four or more TEs had greater functional impairment, an earlier AOO, longer duration, higher comorbidity with mood and anxiety disorders, elevated hyperarousal symptoms, higher proportional exposures to partner physical abuse and other types of physical assault, and lower proportional exposure to unexpected death of a loved one than cases with fewer associated TEs. Conclusions A risk threshold was observed in this large-scale cross-national database wherein cases who associated their PTSD with four or more TEs presented a more “complex” clinical picture, with substantially greater functional impairment and greater morbidity than other cases of PTSD. PTSD cases associated with four or more TEs may merit specific and targeted intervention strategies. Depression and Anxiety 31:130–142, 2014. PMID:23983056

  8. Recent Developments in Language Assessment and the Case of Four Large-Scale Tests of ESOL Ability

    ERIC Educational Resources Information Center

    Stoynoff, Stephen

    2009-01-01

    This review article surveys recent developments and validation activities related to four large-scale tests of L2 English ability: the iBT TOEFL, the IELTS, the FCE, and the TOEIC. In addition to describing recent changes to these tests, the paper reports on validation activities that were conducted on the measures. The results of this research…

  9. Results from the 2010 National Survey on Drug Use and Health: Mental Health Findings

    ERIC Educational Resources Information Center

    Substance Abuse and Mental Health Services Administration, 2012

    2012-01-01

    This report presents results pertaining to mental health from the 2010 National Survey on Drug Use and Health (NSDUH), an annual survey of the civilian, noninstitutionalized population of the United States aged 12 years old or older. This report presents national estimates of the prevalence of past year mental disorders and past year mental health…

  10. Development and Implementation of Culturally Tailored Offline Mobile Health Surveys

    PubMed Central

    2016-01-01

    Background In low- and middle-income countries (LMICs), and other areas with low resources and unreliable access to the Internet, understanding the emerging best practices for the implementation of new mobile health (mHealth) technologies is needed for efficient and secure data management and for informing public health researchers. Innovations in mHealth technology can improve on previous methods, and dissemination of project development details and lessons learned during implementation is needed to inform stakeholders in both the United States and LMIC settings. Objective The aims of this paper are to share implementation strategies and lessons learned from the development and implementation stages of two survey research projects using offline mobile technology, and to inform and prepare public health researchers and practitioners to implement new mobile technologies in survey research projects in LMICs. Methods In 2015, two survey research projects were developed and piloted in Puerto Rico and pre-tested in Costa Rica to collect face-to-face data, gather formative evaluation feedback, and test the feasibility of an offline mobile data collection process. Fieldwork in each setting involved survey development, back-translation with cultural tailoring, ethical review and approvals, data collector training, and piloting survey implementation on mobile tablets. Results Critical processes and workflows for survey research projects in low-resource settings were identified and implemented. These included developing a secure mobile data platform tailored to each survey, establishing user accessibility, and training and eliciting feedback from data collectors and on-site LMIC project partners. Conclusions Formative and process evaluation strategies are necessary and useful for the development and implementation of survey research projects using emerging mHealth technologies in LMICs and other low-resource settings. Lessons learned include: (1) plan

  11. Behavioral Health in the Gulf Coast Region Following the Deepwater Horizon Oil Spill: Findings from Two Federal Surveys

    PubMed Central

    Gould, Deborah W.; Pemberton, Michael R.; Pierannunzi, Carol; Larson, Sharon

    2015-01-01

    This article summarizes findings from two large-scale, population-based surveys conducted by the Substance Abuse and Mental Health Services Administration (SAMHSA) and the Centers for Disease Control and Prevention (CDC) in the Gulf Coast region following the 2010 Deepwater Horizon oil spill, to measure the prevalence of mental and substance use disorders, chronic health conditions, and utilization of behavioral health services. Although many area residents undoubtedly experienced increased levels of anxiety and stress following the spill, findings suggest only modest or minimal changes in behavioral health at the aggregate level before and after the spill. The studies do not address potential long-term effects of the spill on physical and behavioral health, nor did they target subpopulations that might have been most affected by the spill. Resources mobilized to reduce the economic and behavioral health impacts of the spill on coastal residents, including compensation for lost income from BP and increases in available mental health services, may have resulted in a reduction in potential mental health problems. PMID:25339594

  12. Prevalence and Correlates of Metabolic Syndrome in Chinese Children: The China Health and Nutrition Survey

    PubMed Central

    Song, Peige; Yu, Jinyue; Chang, Xinlei; Wang, Manli; An, Lin

    2017-01-01

    Metabolic syndrome (MetS) is generally defined as a cluster of metabolically related cardiovascular risk factors, often associated with insulin resistance, elevated blood pressure, and abdominal obesity. During the past decades, MetS has become a major public health issue worldwide in both adults and children. In this study, data from the China Health and Nutrition Survey (CHNS) were used to assess the prevalence of MetS based on both the National Cholesterol Education Program Adult Treatment Panel III (NCEP-ATPIII) guidelines and the International Diabetes Federation (IDF) criteria, and to evaluate its possible correlates. A total of 831 children aged 7–18 years were included in this study, and 28 children were classified as having MetS as defined by the modified NCEP-ATPIII definition, yielding an overall prevalence of 3.37%. Elevated blood pressure was the most frequent MetS component. The results of logistic regression models revealed that increased body mass index (BMI), hyperuricemia, and insulin resistance (IR) were all associated with the presence of MetS. To conclude, our study revealed the prevalence of MetS in Chinese children at the national level. Further large-scale studies are still needed to identify better MetS criteria for the general paediatric population in China. PMID:28106792
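    The kind of analysis described in the last step can be sketched as a logistic regression of MetS status (0/1) on candidate correlates, with exponentiated coefficients read as odds ratios. The data below are synthetic stand-ins (hypothetical standardized BMI, uric-acid, and HOMA-IR scores), not the CHNS data.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(42)
    n = 2000
    # Synthetic standardized predictors standing in for BMI, serum uric acid,
    # and HOMA-IR (hypothetical; for illustration only).
    X = rng.normal(size=(n, 3))
    # Toy data-generating model: all three correlates raise MetS risk.
    true_log_odds = -3.0 + 0.9 * X[:, 0] + 0.6 * X[:, 1] + 0.8 * X[:, 2]
    y = (rng.random(n) < 1.0 / (1.0 + np.exp(-true_log_odds))).astype(int)

    # Fit the regression and read each correlate's odds ratio per 1-SD increase.
    model = LogisticRegression(max_iter=1000).fit(X, y)
    odds_ratios = np.exp(model.coef_[0])
    ```

    In practice the published models would also adjust for covariates such as age and sex, and report confidence intervals alongside the odds ratios.
    
    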

  13. Dual pricing of health sciences periodicals: a survey.

    PubMed Central

    Miller, D R; Jensen, J E

    1980-01-01

    A survey of dual pricing practices among publishers of health-related journals identified 281 periodicals with an average price differential of over 100% between individual and institutional subscription rates. Both the practice itself and the amount of the differential are increasing, indicating that journal subscriptions of health sciences libraries increasingly provide the financial support necessary for the publication of health sciences journals. Dual pricing is also correlated with copyright royalties. The problems that dual pricing creates for health sciences libraries' budgets are due in part to uncritical purchasing by libraries. Increased consumerism on the part of health science librarians is recommended. PMID:7437588

  14. Large-Scale Hybrid Motor Testing. Chapter 10

    NASA Technical Reports Server (NTRS)

    Story, George

    2006-01-01

    Hybrid rocket motors can be successfully demonstrated at a small scale virtually anywhere. There have been many suitcase-sized portable test stands assembled for demonstration of hybrids; they show audiences the safety of hybrid rockets. These small show motors and small laboratory-scale motors can give comparative burn-rate data for development of different fuel/oxidizer combinations; however, the questions always asked when hybrids are mentioned for large-scale applications are: how do they scale, and has it been shown in a large motor? To answer those questions, large-scale motor testing is required to verify the hybrid motor at its true size. The necessity of conducting large-scale hybrid rocket motor tests to validate the burn rate from small motors to application size has been documented in several places. Comparison of small-scale hybrid data to larger-scale data indicates that the fuel burn rate goes down with increasing port size, even at the same oxidizer flux. This trend holds for conventional hybrid motors with forward oxidizer injection and HTPB-based fuels. While the reason this occurs would make a great paper, study, or thesis, it is not thoroughly understood at this time. Potential causes include the fact that, since hybrid combustion is boundary-layer driven, larger port sizes reduce the interaction (radiation, mixing, and heat transfer) from the core region of the port. This chapter focuses on some of the large, prototype-sized testing of hybrid motors. The largest motors tested have been AMROC's 250K-lbf-thrust motor at Edwards Air Force Base and the Hybrid Propulsion Demonstration Program's 250K-lbf-thrust motor at Stennis Space Center. Numerous smaller tests were performed to support the burn-rate, stability, and scaling concepts that went into the development of those large motors.

  15. Large-scale smart passive system for civil engineering applications

    NASA Astrophysics Data System (ADS)

    Jung, Hyung-Jo; Jang, Dong-Doo; Lee, Heon-Jae; Cho, Sang-Won

    2008-03-01

    The smart passive system consisting of a magnetorheological (MR) damper and an electromagnetic induction (EMI) part has recently been proposed. The EMI part generates the input current for the MR damper from the vibration of a structure, according to Faraday's law of electromagnetic induction. The control performance of the smart passive system has been demonstrated mainly by numerical simulations, which verified that the system can effectively reduce the structural responses of civil engineering structures such as buildings and bridges. However, experimental validation of the system has so far been insufficient. In this paper, the feasibility of applying the smart passive system to real-scale structures is investigated. To this end, a large-scale smart passive system is designed, manufactured, and tested. The system consists of a large-capacity MR damper, with a maximum force of approximately +/-10,000 N, a maximum stroke of +/-35 mm, and a maximum current of 3 A, and a large-scale EMI part designed to generate sufficient induced current for the damper. The applicability of the smart passive system to large real-scale structures is examined through a series of shaking table tests, in which the induced current of the EMI part is measured under various sinusoidal excitation inputs. According to the test results, the large-scale EMI part shows promise of generating sufficient current or power to change the damping characteristics of the large-capacity MR damper.
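    The EMI part's operating principle, Faraday's law, can be illustrated with a back-of-the-envelope estimate. All parameter values below (turns, flux amplitude, excitation frequency, coil resistance) are hypothetical, not the parameters of the device described above.

```python
import math

# Faraday's-law sketch: a coil of N turns sees a sinusoidal flux
# Phi(t) = Phi0 * sin(2*pi*f*t), so the peak EMF is N * Phi0 * 2*pi*f,
# and the peak current into the damper coil is EMF / R.
def peak_induced_current(turns, flux_amplitude_wb, freq_hz, coil_resistance_ohm):
    peak_emf = turns * flux_amplitude_wb * 2 * math.pi * freq_hz  # volts
    return peak_emf / coil_resistance_ohm                         # amperes

i_peak = peak_induced_current(turns=500, flux_amplitude_wb=2e-4,
                              freq_hz=2.0, coil_resistance_ohm=5.0)
print(round(i_peak, 3))  # ~0.251 A
```

    The sketch shows why excitation frequency matters: doubling the structural vibration frequency doubles the induced current available to the damper.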

  16. Infectious diseases in large-scale cat hoarding investigations.

    PubMed

    Polak, K C; Levy, J K; Crawford, P C; Leutenegger, C M; Moriello, K A

    2014-08-01

    Animal hoarders accumulate animals in overcrowded conditions without adequate nutrition, sanitation, and veterinary care. As a result, animals rescued from hoarding situations frequently have a variety of medical conditions, including respiratory infections, gastrointestinal disease, parasitism, malnutrition, and other evidence of neglect. The purpose of this study was to characterize the infectious diseases carried by clinically affected cats and to determine the prevalence of retroviral infections among cats in large-scale cat hoarding investigations. Records were reviewed retrospectively from four large-scale seizures of cats from failed sanctuaries between November 2009 and March 2012. The number of cats seized in each case ranged from 387 to 697. Cats were screened for feline leukemia virus (FeLV) and feline immunodeficiency virus (FIV) in all four cases and for dermatophytosis in one case. A subset of cats exhibiting signs of upper respiratory disease or diarrhea had been tested for infections by PCR and fecal flotation for treatment planning. Mycoplasma felis (78%), calicivirus (78%), and Streptococcus equi subspecies zooepidemicus (55%) were the most common respiratory infections. Feline enteric coronavirus (88%), Giardia (56%), Clostridium perfringens (49%), and Tritrichomonas foetus (39%) were most common in cats with diarrhea. The seroprevalence of both FeLV and FIV was 8%. In the one case in which cats with lesions suspicious for dermatophytosis were cultured for Microsporum canis, 69/76 lesional cats were culture-positive; of these, half were believed to be truly infected and half were believed to be fomite carriers. Cats from large-scale hoarding cases were at high risk of enteric and respiratory infections, retroviruses, and dermatophytosis. Case responders should be prepared for mass treatment of infectious diseases and should implement protocols to prevent transmission of feline or zoonotic infections during the emergency response and when
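    When reporting prevalences like those above, a confidence interval conveys the sampling uncertainty. The sketch below computes a 95% Wilson score interval for a hypothetical 8% seroprevalence (40 of 500 cats); the counts are illustrative, not taken from the study.

```python
import math

def wilson_ci(positives, n, z=1.96):
    """95% Wilson score interval for a prevalence estimate."""
    p = positives / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return centre - half, centre + half

# hypothetical: 8% seroprevalence observed in 500 cats tested
lo, hi = wilson_ci(40, 500)
print(round(lo, 3), round(hi, 3))  # roughly 0.059 to 0.107
```

    At these sample sizes an observed 8% is compatible with a true prevalence anywhere from about 6% to 11%, which matters when planning mass treatment.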

  17. Statistical analysis of large-scale neuronal recording data

    PubMed Central

    Reed, Jamie L.; Kaas, Jon H.

    2010-01-01

    Relating stimulus properties to the response properties of individual neurons and neuronal networks is a major goal of sensory research. Many investigators implant electrode arrays in multiple brain areas and record from chronically implanted electrodes over time to answer a variety of questions. Technical challenges related to analyzing large-scale neuronal recording data are not trivial. Several analysis methods traditionally used by neurophysiologists do not account for dependencies in the data that are inherent in multi-electrode recordings. In addition, when neurophysiological data are not best modeled by the normal distribution and when the variables of interest may not be linearly related, extensions of the linear modeling techniques are recommended. A variety of methods exist to analyze correlated data, even when data are not normally distributed and the relationships are nonlinear. Here we review expansions of the Generalized Linear Model designed to address these data properties. Such methods are used in other research fields, and the application to large-scale neuronal recording data will enable investigators to determine the variable properties that convincingly contribute to the variances in the observed neuronal measures. Standard measures of neuron properties such as response magnitudes can be analyzed using these methods, and measures of neuronal network activity such as spike timing correlations can be analyzed as well. We have done just that in recordings from 100-electrode arrays implanted in the primary somatosensory cortex of owl monkeys. Here we illustrate how one example method, Generalized Estimating Equations analysis, is a useful method to apply to large-scale neuronal recordings. PMID:20472395
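    One way to see why correlated multi-electrode data demand methods like Generalized Estimating Equations is the design effect: within-cluster correlation shrinks the effective sample size, so analyses that treat all observations as independent overstate their evidence. The sketch below uses the standard design-effect formula with hypothetical numbers; it illustrates the statistical issue, not the authors' analysis.

```python
# Design-effect sketch: with m observations per cluster (e.g. trials per
# electrode) and intraclass correlation ICC, the effective sample size is
# n_eff = n / (1 + (m - 1) * ICC). All numbers below are hypothetical.
def effective_sample_size(n_total, cluster_size, icc):
    design_effect = 1 + (cluster_size - 1) * icc
    return n_total / design_effect

# 100 electrodes x 50 trials = 5000 observations, ICC = 0.2 within electrode
print(round(effective_sample_size(5000, 50, 0.2)))  # far fewer than 5000
```

    With an ICC of 0.2, the 5000 nominal observations carry the information of only a few hundred independent ones, which is the dependency GEE-type models account for explicitly.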

  18. The Large-Scale Current System During Auroral Substorms

    NASA Astrophysics Data System (ADS)

    Gjerloev, Jesper

    2015-04-01

    The substorm process has been discussed for more than four decades, and new empirical large-scale models continue to be published. The continued activity implies both the importance and the complexity of the problem. We recently published a new model of the large-scale substorm current system (Gjerloev and Hoffman, JGR, 2014). Based on data from >100 ground magnetometers (obtained from SuperMAG), 116 isolated substorms, global auroral images (obtained by the Polar VIS Earth Camera), and a careful normalization technique, we derived an empirical model of the ionospheric equivalent current system. Our model yields some unexpected features that appear inconsistent with the classical single-current-wedge system. One of these features is a distinct latitudinal shift of the westward electrojet (WEJ) current between the pre- and post-midnight regions, and we find evidence that these two WEJ regions are quasi-disconnected. This, and other observational facts, led us to propose a modified 3D current system configuration that consists of two wedge-type systems: a current wedge in the pre-midnight region (bulge current wedge) and another in the post-midnight region (oval current wedge). The two wedge systems are shifted in latitude but overlap in local time in the midnight region. Our model is at considerable variance with previous global models and conceptual schematics of the large-scale substorm current system. We speculate that the data coverage, methodologies, and techniques used in these previous global studies are the cause of the differences in solutions. In this presentation we present our model, compare it with other published models, and discuss possible causes for the differences.

  19. Improving Design Efficiency for Large-Scale Heterogeneous Circuits

    NASA Astrophysics Data System (ADS)

    Gregerson, Anthony

    Despite increases in logic density, many Big Data applications must still be partitioned across multiple computing devices in order to meet their strict performance requirements. Among the most demanding of these applications is high-energy physics (HEP), which uses complex computing systems consisting of thousands of FPGAs and ASICs to process the sensor data created by experiments at particle accelerators such as the Large Hadron Collider (LHC). Designing such computing systems is challenging due to the scale of the systems, the exceptionally high-throughput and low-latency performance constraints that necessitate application-specific hardware implementations, the requirement that algorithms be efficiently partitioned across many devices, and the possible need to update the implemented algorithms during the lifetime of the system. In this work, we describe our research into developing flexible architectures for implementing such large-scale circuits on FPGAs. In particular, this work is motivated by (but not limited in scope to) high-energy physics algorithms for the Compact Muon Solenoid (CMS) experiment at the LHC. To make efficient use of logic resources in multi-FPGA systems, we introduce Multi-Personality Partitioning, a novel form of the graph partitioning problem, and present partitioning algorithms that can significantly improve resource utilization on heterogeneous devices while also reducing inter-chip connections. To reduce the high communication costs of Big Data applications, we also introduce Information-Aware Partitioning, a partitioning method that analyzes the data content of application-specific circuits, characterizes their entropy, and selects circuit partitions that enable efficient compression of data between chips. We employ our information-aware partitioning method to improve the performance of the hardware validation platform for evaluating new algorithms for the CMS experiment. Together, these research efforts help to improve the efficiency of designing and implementing large-scale heterogeneous circuits.
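    A minimal illustration of the objective behind partitioning methods like those above is a greedy assignment of circuit nodes to devices that favors keeping connected nodes together, subject to a per-device capacity. This toy sketch is not the Multi-Personality or Information-Aware algorithm described in the work; it only shows the basic objective (few inter-chip edges, balanced device loads).

```python
# Toy greedy partitioner: place each node on the device whose partition
# already contains the most of its neighbors, subject to a capacity limit.
def greedy_partition(edges, n_nodes, n_parts, capacity):
    adj = [set() for _ in range(n_nodes)]
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    part = [-1] * n_nodes
    sizes = [0] * n_parts
    for node in range(n_nodes):
        best, best_gain = None, -1
        for p in range(n_parts):
            if sizes[p] >= capacity:
                continue
            gain = sum(1 for nb in adj[node] if part[nb] == p)
            if gain > best_gain:
                best, best_gain = p, gain
        part[node] = best
        sizes[best] += 1
    cut = sum(1 for u, v in edges if part[u] != part[v])
    return part, cut

# two triangles joined by one bridge edge: the bridge is the only cut
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
part, cut = greedy_partition(edges, n_nodes=6, n_parts=2, capacity=3)
print(cut)  # 1
```

    Real tools add refinement passes and, as in the work above, heterogeneous resource models and data-entropy objectives on top of this basic cut-minimization goal.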

  20. LARGE-SCALE CO2 TRANSPORTATION AND DEEP OCEAN SEQUESTRATION

    SciTech Connect

    Hamid Sarv

    1999-03-01

    The technical and economic feasibility of large-scale CO2 transportation and ocean sequestration at depths of 3000 meters or greater was investigated. Two options were examined for transporting and disposing of the captured CO2. In one case, CO2 was pumped from a land-based collection center through long pipelines laid on the ocean floor. Another case considered oceanic tanker transport of liquid carbon dioxide to an offshore floating structure for vertical injection to the ocean floor. In the latter case, a novel concept based on subsurface towing of a 3000-meter pipe and attaching it to the offshore structure was considered. Budgetary cost estimates indicate that for distances greater than 400 km, tanker transportation and offshore injection through a 3000-meter vertical pipe provide the best method for delivering liquid CO2 to deep ocean floor depressions. For shorter distances, CO2 delivery by parallel-laid subsea pipelines is more cost-effective. Estimated costs for 500-km transport and storage at a depth of 3000 meters by subsea pipelines and tankers were 1.5 and 1.4 dollars per ton of stored CO2, respectively. At these prices, the economics of ocean disposal are highly favorable. Future work should focus on addressing technical issues that are critical to the deployment of a large-scale CO2 transportation and disposal system. Pipe corrosion, structural design of the transport pipe, and dispersion characteristics of sinking CO2 effluent plumes have been identified as areas that require further attention. Our planned activities in the next phase include laboratory-scale corrosion testing, structural analysis of the pipeline, analytical and experimental simulations of CO2 discharge and dispersion, and a conceptual economic and engineering evaluation of large-scale implementation.
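    The pipeline-versus-tanker trade-off above reduces to comparing two roughly linear cost models in distance: pipelines have low fixed cost but a higher per-kilometer cost, tankers the reverse. The coefficients below are hypothetical, chosen only so the models reproduce the quoted $1.5 and $1.4 per ton at 500 km and a crossover near 400 km; they are not from the study.

```python
# Break-even distance between two linear cost models c(d) = fixed + rate * d,
# in dollars per ton of stored CO2. All coefficients are hypothetical.
def break_even_distance(fixed_pipe, per_km_pipe, fixed_tanker, per_km_tanker):
    return (fixed_tanker - fixed_pipe) / (per_km_pipe - per_km_tanker)

d = break_even_distance(fixed_pipe=0.5, per_km_pipe=0.002,
                        fixed_tanker=0.9, per_km_tanker=0.001)
print(round(d))  # 400 km: pipelines cheaper below, tankers cheaper above
```

    The structure of the comparison (not the invented coefficients) is what carries over: any fixed-plus-per-distance cost pair yields a single crossover distance of this form.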

  1. EPIDEMIOLOGY and Health Care Reform The National Health Survey of 1935-1936

    PubMed Central

    2011-01-01

    The National Health Survey undertaken in 1935 and 1936 was the largest morbidity survey until that time. It was also the first national survey to focus on chronic disease and disability. The decision to conduct a survey of this magnitude was part of the larger strategy to reform health care in the United States. The focus on morbidity allowed reformers to argue that the health status of Americans was poor, despite falling mortality rates that suggested the opposite. The focus on chronic disease morbidity proved to be an especially effective way of demonstrating the poor health of the population and the strong links between poverty and illness. The survey, undertaken by a small group of reform-minded epidemiologists led by Edgar Sydenstricker, was made possible by the close interaction during the Depression of agencies and actors in the public health and social welfare sectors, a collaboration which produced new ways of thinking about disease burdens. PMID:21233434

  2. Large-Scale periodic solar velocities: An observational study

    NASA Technical Reports Server (NTRS)

    Dittmer, P. H.

    1977-01-01

    Observations of large-scale solar velocities were made using the mean field telescope and Babcock magnetograph of the Stanford Solar Observatory. Observations were made in the magnetically insensitive ion line at 5124 A, with light from the center (limb) of the disk right (left) circularly polarized, so that the magnetograph measures the difference in wavelength between center and limb. Computer calculations are made of the wavelength difference produced by global pulsations for spherical harmonics up to second order and of the signal produced by displacing the solar image relative to polarizing optics or diffraction grating.

  3. Large-scale sodium spray fire code validation (SOFICOV) test

    SciTech Connect

    Jeppson, D.W.; Muhlestein, L.D.

    1985-01-01

    A large-scale sodium spray fire code validation test was performed in the HEDL 850-m³ Containment System Test Facility (CSTF) as part of the Sodium Spray Fire Code Validation (SOFICOV) program. Six hundred fifty-eight kilograms of sodium was sprayed into an air atmosphere over a period of 2400 s. The sodium spray droplet sizes and spray pattern distribution were estimated. The containment atmosphere temperature and pressure response, containment wall temperature response, and sodium reaction rate with oxygen were measured. These results are compared to post-test predictions using the SPRAY and NACOM computer codes.

  4. Large scale obscuration and related climate effects open literature bibliography

    SciTech Connect

    Russell, N.A.; Geitgey, J.; Behl, Y.K.; Zak, B.D.

    1994-05-01

    Large-scale obscuration and related climate effects of nuclear detonations first became a matter of concern in connection with the so-called "Nuclear Winter Controversy" in the early 1980s. Since then, the world has changed. Nevertheless, concern remains about the atmospheric effects of nuclear detonations, but the source of concern has shifted. Now it focuses less on global effects and more on regional effects and their resulting impacts on the performance of electro-optical and other defense-related systems. This bibliography reflects the modified interest.

  5. Water-based scintillators for large-scale liquid calorimetry

    SciTech Connect

    Winn, D.R.; Raftery, D.

    1985-02-01

    We have investigated primary and secondary solvent intermediates in search of a recipe to create a bulk liquid scintillator with water as the bulk solvent and common fluors as the solutes. As we are not concerned with energy resolution below 1 MeV in large-scale experiments, light-output at the 10% level of high-quality organic solvent based scintillators is acceptable. We have found encouraging performance from industrial surfactants as primary solvents for PPO and POPOP. This technique may allow economical and environmentally safe bulk scintillator for kiloton-sized high energy calorimetry.

  6. Enabling Large-Scale Biomedical Analysis in the Cloud

    PubMed Central

    Lin, Ying-Chih; Yu, Chin-Sheng; Lin, Yen-Jen

    2013-01-01

    Recent progress in high-throughput instrumentation has led to an astonishing growth in both the volume and complexity of biomedical data collected from various sources. This planet-scale data poses serious challenges to storage and computing technologies. Cloud computing is a promising alternative because it jointly addresses storage and high-performance computing for large-scale data. This work briefly introduces data-intensive computing systems and summarizes existing cloud-based resources in bioinformatics. These developments and applications should help biomedical researchers make this vast amount of diverse data meaningful and usable. PMID:24288665

  7. Large-Scale Measurement of Absolute Protein Glycosylation Stoichiometry.

    PubMed

    Sun, Shisheng; Zhang, Hui

    2015-07-07

    Protein glycosylation is one of the most important protein modifications. Glycosylation site occupancy alteration has been implicated in human diseases and cancers. However, current glycoproteomic methods focus on the identification and quantification of glycosylated peptides and glycosylation sites but not glycosylation occupancy or glycoform stoichiometry. Here we describe a method for large-scale determination of the absolute glycosylation stoichiometry using three independent relative ratios. Using this method, we determined 117 absolute N-glycosylation occupancies in OVCAR-3 cells. Finally, we investigated the possible functions and the determinants for partial glycosylation.

  8. Large scale mortality of nestling ardeids caused by nematode infection.

    PubMed

    Wiese, J H; Davidson, W R; Nettles, V F

    1977-10-01

    During the summer of 1976, an epornitic of verminous peritonitis caused by Eustrongylides ignotus resulted in large-scale mortality of young herons and egrets on Pea Patch Island, Delaware. Mortality was highest (84%) in snowy egret nestlings (Egretta thula) and less severe in great egrets (Casmerodius albus), Louisiana herons (Hydranassa tricolor), little blue herons (Florida caerulea), and black-crowned night herons (Nycticorax nycticorax). Most deaths occurred within the first 4 weeks after hatching. Migration of E. ignotus resulted in multiple perforations of the visceral organs, escape of intestinal contents into the body cavity, and subsequent bacterial peritonitis. Killifish (Fundulus heteroclitus) served as the source of infective larvae.

  9. Integrated High Accuracy Portable Metrology for Large Scale Structural Testing

    NASA Astrophysics Data System (ADS)

    Klaas, Andrej; Richardson, Paul; Burguete, Richard; Harris, Linden

    2014-06-01

    As the performance and accuracy of analysis tools increase, bespoke solutions are more regularly being requested to perform high-accuracy measurements on structural tests to validate these methods. These can include optical methods and full-field techniques in place of more traditional point measurements. As each test is unique, it presents its own individual challenges. In this paper, two recent large-scale tests performed by Airbus will be presented, and the metrology solutions identified for them will be discussed.

  10. Large-scale normal fluid circulation in helium superflows

    NASA Astrophysics Data System (ADS)

    Galantucci, Luca; Sciacca, Michele; Barenghi, Carlo F.

    2017-01-01

    We perform fully coupled numerical simulations of helium II pure superflows in a channel, with vortex-line density typical of experiments. Peculiar to our model is the computation of the back-reaction of the superfluid vortex motion on the normal fluid and the presence of solid boundaries. We recover the uniform vortex-line density experimentally measured employing second sound resonators and we show that pure superflow in helium II is associated with a large-scale circulation of the normal fluid which can be detected using existing particle-tracking visualization techniques.

  11. Large-scale genotoxicity assessments in the marine environment.

    PubMed Central

    Hose, J E

    1994-01-01

    There are a number of techniques for detecting genotoxicity in the marine environment, and many are applicable to large-scale field assessments. Certain tests can be used to evaluate responses in target organisms in situ while others utilize surrogate organisms exposed to field samples in short-term laboratory bioassays. Genotoxicity endpoints appear distinct from traditional toxicity endpoints, but some have chemical or ecotoxicologic correlates. One versatile end point, the frequency of anaphase aberrations, has been used in several large marine assessments to evaluate genotoxicity in the New York Bight, in sediment from San Francisco Bay, and following the Exxon Valdez oil spill. PMID:7713029

  12. Towards large scale production and separation of carbon nanotubes

    NASA Astrophysics Data System (ADS)

    Alvarez, Noe T.

    Since their discovery, carbon nanotubes (CNTs) have boosted research on and applications of nanotechnology; however, many applications of CNTs remain inaccessible because they depend upon large-scale CNT production and separation. Control of CNT type, chirality, and diameter determines many of their physical properties, and such control is still not accessible. This thesis studies the fundamentals of scalable selective reactions of HiPCo CNTs as well as the early phases of routes to an inexpensive approach for large-scale CNT production. On the growth side, this thesis covers a complete wet-chemistry process of catalyst and catalyst-support deposition for growth of vertically aligned (VA) CNTs. A wet-chemistry preparation process is of significant importance for CNT synthesis through chemical vapor deposition (CVD). CVD is by far the most suitable and inexpensive process for large-scale CNT production compared to other common processes such as laser ablation and arc discharge. However, its potential has been limited by low-yielding and difficult preparation processes for the catalyst and its support, which have reduced its competitiveness. The wet-chemistry process takes advantage of current nanoparticle technology to deposit the catalyst and the catalyst support as a thin film of nanoparticles, making the protocol simple compared to electron-beam evaporation and sputtering. On the selective-reactions side, this thesis studies UV irradiation of individually dispersed HiPCo CNTs, which generates auto-selective reactions in the liquid phase with good control over diameter and chirality. This technique is well suited to large-scale, continuous separation of CNTs by diameter and type. Additionally, an innovative, simple catalyst deposition through abrasion is demonstrated: simple friction between the catalyst and the substrate deposits a high enough density of metal catalyst particles for successful CNT growth. This simple approach has

  13. Clusters as cornerstones of large-scale structure.

    NASA Astrophysics Data System (ADS)

    Gottlöber, S.; Retzlaff, J.; Turchaninov, V.

    1997-04-01

    Galaxy clusters are one of the best tracers of large-scale structure in the Universe on scales well above 100 Mpc. The authors investigate here the clustering properties of a redshift sample of Abell/ACO clusters and compare the observational sample with mock samples constructed from N-body simulations on the basis of four different cosmological models. The authors discuss the power spectrum, the Minkowski functionals, and the void statistics of these samples and conclude that the SCDM and TCDM models are ruled out, whereas the ΛCDM and BSI models are in agreement with the observational data.

  14. Large-Scale Patterns of Filament Channels and Filaments

    NASA Astrophysics Data System (ADS)

    Mackay, Duncan

    2016-07-01

    In this review the properties and large-scale patterns of filament channels and filaments will be considered. Initially, the global formation locations of filament channels and filaments are discussed, along with their hemispheric pattern. Next, observations of the formation of filament channels and filaments are described where two opposing views are considered. Finally, the wide range of models that have been constructed to consider the formation of filament channels and filaments over long time-scales are described, along with the origin of the hemispheric pattern of filaments.

  15. Quantum computation for large-scale image classification

    NASA Astrophysics Data System (ADS)

    Ruan, Yue; Chen, Hanwu; Tan, Jianing; Li, Xi

    2016-10-01

    Due to the lack of an effective quantum feature extraction method, there is currently no effective way to perform quantum image classification or recognition. In this paper, for the first time, a global quantum feature extraction method based on Schmidt decomposition is proposed. A revised quantum learning algorithm is also proposed that classifies images by computing the Hamming distance between these features. From experimental results on the benchmark database Caltech 101 and an analysis of the algorithm, an effective approach to large-scale image classification is derived and proposed for big-data settings.
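    The classification step described, nearest neighbor by Hamming distance over extracted binary features, can be sketched classically as follows. The feature vectors and labels are invented for illustration; the quantum feature extraction itself is not reproduced here.

```python
# Nearest-neighbor classification by Hamming distance over binary features.
def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def classify(features, training_set):
    """training_set: list of (feature_vector, label) pairs."""
    return min(training_set, key=lambda item: hamming(features, item[0]))[1]

train = [((0, 0, 1, 1), "cat"), ((1, 1, 0, 0), "plane")]
print(classify((0, 1, 1, 1), train))  # "cat": distance 1 vs distance 3
```

    The quantum variant described above aims to evaluate such distances over many stored features at once; the decision rule itself is this simple minimum.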

  16. Large Scale Composite Manufacturing for Heavy Lift Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Stavana, Jacob; Cohen, Leslie J.; Houseal, Keth; Pelham, Larry; Lort, Richard; Zimmerman, Thomas; Sutter, James; Western, Mike; Harper, Robert; Stuart, Michael

    2012-01-01

    Risk reduction for large-scale composite manufacturing is an important goal in producing lightweight components for heavy-lift launch vehicles. NASA and an industry team successfully employed a building-block approach using low-cost Automated Tape Layup (ATL) of autoclave and Out-of-Autoclave (OoA) prepregs. Several large, curved sandwich panels were fabricated at HITCO Carbon Composites. The aluminum honeycomb core sandwich panels are segments of a 1/16th arc from a 10-meter cylindrical barrel. Lessons learned highlight the manufacturing challenges that must be met to produce lightweight composite structures such as fairings for heavy-lift launch vehicles.

  17. Large-scale genotoxicity assessments in the marine environment

    SciTech Connect

    Hose, J.E.

    1994-12-01

    There are a number of techniques for detecting genotoxicity in the marine environment, and many are applicable to large-scale field assessments. Certain tests can be used to evaluate responses in target organisms in situ while others utilize surrogate organisms exposed to field samples in short-term laboratory bioassays. Genotoxicity endpoints appear distinct from traditional toxicity endpoints, but some have chemical or ecotoxicologic correlates. One versatile end point, the frequency of anaphase aberrations, has been used in several large marine assessments to evaluate genotoxicity in the New York Bight, in sediment from San Francisco Bay, and following the Exxon Valdez oil spill. 31 refs., 2 tabs.

  18. Solving Large-scale Eigenvalue Problems in SciDAC Applications

    SciTech Connect

    Yang, Chao

    2005-06-29

    Large-scale eigenvalue problems arise in a number of DOE applications. This paper provides an overview of recent developments in eigenvalue computation in the context of two SciDAC applications. We emphasize the importance of Krylov subspace methods and point out their limitations. We discuss the value of alternative approaches that are more amenable to the use of preconditioners, and report progress in using multi-level algebraic sub-structuring techniques to speed up eigenvalue calculations. In addition to methods for linear eigenvalue problems, we also examine new approaches to solving two types of non-linear eigenvalue problems arising from SciDAC applications.
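    As context for the Krylov-subspace discussion, the simplest iterative eigenvalue solver is power iteration, of which Krylov methods are a far more efficient generalization. A minimal sketch on a small symmetric matrix:

```python
# Power iteration for the dominant eigenvalue of a small symmetric matrix.
def power_iteration(A, iters=200):
    n = len(A)
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = max(abs(x) for x in w)
        v = [x / norm for x in w]
    # Rayleigh quotient estimate of the dominant eigenvalue
    Av = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
    num = sum(Av[i] * v[i] for i in range(n))
    den = sum(v[i] * v[i] for i in range(n))
    return num / den

A = [[2.0, 1.0], [1.0, 2.0]]
print(round(power_iteration(A), 6))  # 3.0, the dominant eigenvalue of A
```

    Power iteration discards each iterate after use; Krylov methods such as Lanczos keep the whole generated subspace, which is what makes them effective on the large sparse problems the paper addresses.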

  19. Analysis Plan for 1985 Large-Scale Tests.

    DTIC Science & Technology

    1983-01-01

    Report documentation keywords: large-scale blasting agents, multiburst, ANFO, shock waves. The table of contents covers multiburst techniques, test site considerations, and candidate explosives, including bulk (loose) ANFO, bagged ANFO, APEX 1360, nitric acid and nitropropane, nitropropane nitrate (NPN), DBA-22M, and a hardening emulsion.

  20. Frequency domain multiplexing for large-scale bolometer arrays

    SciTech Connect

    Spieler, Helmuth

    2002-05-31

    The development of planar fabrication techniques for superconducting transition-edge sensors has brought large-scale arrays of 1000 pixels or more to the realm of practicality. This raises the problem of reading out a large number of sensors with a tractable number of connections. A possible solution is frequency-domain multiplexing. I summarize basic principles, present various circuit topologies, and discuss design trade-offs, noise performance, cross-talk and dynamic range. The design of a practical device and its readout system is described with a discussion of fabrication issues, practical limits and future prospects.
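    The lock-in demodulation at the heart of frequency-domain multiplexing can be sketched numerically: each sensor's signal amplitude-modulates its own carrier, the carriers share one line, and multiplying by the matching reference and averaging recovers each amplitude. The frequencies and amplitudes below are arbitrary illustrations, not parameters of any real readout.

```python
import math

# Lock-in recovery of one channel from a frequency-multiplexed line:
# multiply by the matching reference carrier and average (x2 for amplitude).
def demodulate(samples, freq, dt):
    acc = sum(s * math.sin(2 * math.pi * freq * i * dt)
              for i, s in enumerate(samples))
    return 2 * acc / len(samples)

dt = 1e-5          # 100 kHz sampling
n = 100000         # one second of samples (integer periods of both carriers)
f1, f2 = 1000.0, 1500.0
a1, a2 = 0.7, 0.3
line = [a1 * math.sin(2 * math.pi * f1 * i * dt)
        + a2 * math.sin(2 * math.pi * f2 * i * dt)
        for i in range(n)]
print(round(demodulate(line, f1, dt), 3),
      round(demodulate(line, f2, dt), 3))  # 0.7 0.3
```

    Because the carriers are orthogonal over the averaging window, each demodulation rejects the other channel, which is what lets one wire carry many sensors.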