Transformation of Public Kindergartens in Shenzhen: Internet Study of Public Views
ERIC Educational Resources Information Center
Li, Hui; Wang, X. Christine
2008-01-01
Since the early 1990s, the central government of China has been gradually privatizing early childhood education. As the main precursor of this large-scale reform in the country, Shenzhen has witnessed several innovative as well as radical changes. The existing public early childhood settings in the city were therefore forced to transform into…
Environmental Response Laboratory Network
The ERLN is a national network of laboratories that can be ramped up as needed to support large-scale environmental responses. It integrates the capabilities of existing public- and private-sector labs, providing consistent capacity and quality data.
Abebe, Gumataw K; Chalak, Ali; Abiad, Mohamad G
2017-07-01
Food safety is a key public health issue worldwide. This study aims to characterise existing governance mechanisms - governance structures (GSs) and food safety management systems (FSMSs) - and analyse the alignment thereof in detecting food safety hazards, based on empirical evidence from Lebanon. Firm-to-firm and public baseline are the dominant FSMSs applied on a large scale, while chain-wide FSMSs are observed only on a small scale. Most transactions involving farmers are relational and market-based, in contrast to (large-scale) processors, which opt for hierarchical GSs. Large-scale processors use a combination of FSMSs and GSs to minimise food safety hazards, albeit with a potential increase in coordination costs; this is an important feature of modern food supply chains. The econometric analysis reveals that contract period, on-farm inspection and experience have significant effects in minimising food safety hazards. However, the potential to implement farm-level FSMSs is influenced by the formality of the contract, herd size, trading partner choice, and experience. Public baseline FSMSs appear effective in controlling food safety hazards; however, this may not be viable due to the scarcity of public resources. We suggest that public policies focus on long-lasting governance mechanisms by introducing incentive schemes, and on farm-level FSMSs by providing loans and education to farmers. © 2016 Society of Chemical Industry.
ERIC Educational Resources Information Center
Grobe, Cary; Campbell, Elaine
1990-01-01
Attempted to discover patterns of alcohol, drug, and tobacco use among public school children in New Brunswick using Provincial School Drug Survey (PSDS), an existing large-scale assessment. Recoded variables in PSDS dataset to derive profiles of typical tobacco, cannabis, and alcohol users. Found increase in predictive accuracy of regression…
Increasing returns to scale: The solution to the second-order social dilemma
Ye, Hang; Chen, Shu; Luo, Jun; Tan, Fei; Jia, Yongmin; Chen, Yefeng
2016-01-01
Humans benefit from extensive cooperation; however, the existence of free-riders may cause cooperation to collapse. This is called the social dilemma. It has been shown that punishing free-riders is an effective way of resolving this problem. Because punishment is costly, this gives rise to the second-order social dilemma. Without exception, existing solutions rely on some stringent assumptions. This paper proposes, under very mild conditions, a simple model of a public goods game featuring increasing returns to scale. We find that punishers stand out and even dominate the population provided that the degree of increasing returns to scale is large enough; consequently, the second-order social dilemma dissipates. Historical evidence shows that people are more willing to cooperate with others and punish defectors when they suffer from either internal or external menaces. During the prehistoric age, the abundance of contributors was decisive in joint endeavours such as fighting floods, defending territory, and hunting. These situations serve as favourable examples of public goods games in which the degrees of increasing returns to scale are undoubtedly very large. Our findings show that natural selection has endowed humankind with a tendency to pursue justice and punish defection that deviates from social norms. PMID:27535087
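A minimal Python sketch of the kind of model this abstract describes: a public goods game whose multiplication factor rises with the number of contributors. The group size, cost, and return parameters are illustrative assumptions, and the punishment stage is omitted for brevity; this is not the authors' exact model.

```python
# Public goods game with increasing returns to scale (illustrative sketch).
import random

N = 4             # group size (assumed)
c = 1.0           # contribution cost (assumed)
r0, k = 1.2, 0.8  # baseline return and increasing-returns slope (assumed)

def payoffs(strategies):
    """strategies: list of 'C' (contribute) or 'D' (free-ride)."""
    n_c = sum(s == 'C' for s in strategies)
    # The multiplication factor increases with the number of contributors,
    # which is what "increasing returns to scale" means here.
    r = r0 + k * n_c / N
    share = r * c * n_c / N          # equal share of the multiplied pot
    return [share - (c if s == 'C' else 0.0) for s in strategies]

group = [random.choice('CD') for _ in range(N)]
print(group, payoffs(group))
```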
Comparing Public, Private, and Market Schools: The International Evidence
ERIC Educational Resources Information Center
Coulson, Andrew J.
2009-01-01
Would large-scale, free-market reforms improve educational outcomes for American children? This question cannot be reliably answered by looking exclusively at domestic evidence, much less by looking exclusively at existing "school choice" programs. Though many such programs have been implemented around the United States, none has created…
Support for solar energy: Examining sense of place and utility-scale development in California
Carlisle, Juliet E.; Kane, Stephanie L.; Solan, David; ...
2014-08-20
As solar costs have declined, PV systems have experienced considerable growth since 2003, especially in China, Japan, Germany, and the U.S. Thus, a more nuanced understanding of a particular public's attitudes toward utility-scale solar development, as it arrives in a market and region, is warranted and will likely be instructive for other areas in the world where this type of development will occur in the near future. Using data collected from a 2013 telephone survey (N=594) in six Southern Californian counties selected based on existing and proposed solar developments and available suitable land, we examine public attitudes toward solar energy and the construction of large-scale solar facilities, testing whether attitudes toward such developments are the result of sense of place and attachment to place. Overall, we have mixed results. Place attachment and sense of place fail to produce significant effects except in terms of perceived positive benefits. That is, respondents interpret the change resulting from large-scale solar development in a positive way insofar as perceived positive economic impacts are positively related to support for nearby large-scale construction.
Large-scale virtual screening on public cloud resources with Apache Spark.
Capuccini, Marco; Ahmed, Laeeq; Schaal, Wesley; Laure, Erwin; Spjuth, Ola
2017-01-01
Structure-based virtual screening is an in-silico method to screen a target receptor against a virtual molecular library. Applying docking-based screening to large molecular libraries can be computationally expensive; however, it constitutes a trivially parallelizable task. Most of the available parallel implementations are based on the message passing interface, relying on low-failure-rate hardware and fast network connections. Google's MapReduce revolutionized large-scale analysis, enabling the processing of massive datasets on commodity hardware and cloud resources, providing transparent scalability and fault tolerance at the software level. Open source implementations of MapReduce include Apache Hadoop and the more recent Apache Spark. We developed a method to run existing docking-based screening software on distributed cloud resources, utilizing the MapReduce approach. We benchmarked our method, which is implemented in Apache Spark, docking a publicly available target receptor against approximately 2.2 million compounds. The performance experiments show a good parallel efficiency (87%) when running in a public cloud environment. Our method enables parallel structure-based virtual screening on public cloud resources or commodity computer clusters. The degree of scalability that we achieve allows for trying out our method on relatively small libraries first and then scaling to larger libraries. Our implementation is named Spark-VS and it is freely available as open source from GitHub (https://github.com/mcapuccini/spark-vs).
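The MapReduce pattern the abstract describes can be illustrated with a short PySpark sketch: partition a SMILES library across workers, apply a docking function to each compound, and keep the best-scoring hits. The file path and the `run_docking` placeholder are assumptions for illustration, not the actual Spark-VS API.

```python
# Hedged sketch of the map/reduce pattern behind Spark-VS-style screening.
from pyspark import SparkContext

def run_docking(smiles):
    # In a real pipeline this would invoke docking software (e.g. via
    # subprocess); here we fabricate a score so the sketch is self-contained.
    return (smiles, float(len(smiles) % 10))  # placeholder "docking score"

sc = SparkContext(appName="vs-sketch")
library = sc.textFile("compounds.smi")        # one SMILES string per line (placeholder path)
scores = library.map(run_docking)             # trivially parallel "map" step
top = scores.takeOrdered(100, key=lambda kv: -kv[1])  # "reduce": keep best hits
print(top[:3])
sc.stop()
```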
SLIDE - a web-based tool for interactive visualization of large-scale -omics data.
Ghosh, Soumita; Datta, Abhik; Tan, Kaisen; Choi, Hyungwon
2018-06-28
Data visualization is often regarded as a post hoc step for verifying statistically significant results in the analysis of high-throughput data sets. This common practice leaves a large amount of raw data behind, from which more information can be extracted. However, existing solutions do not provide capabilities to explore large-scale raw datasets using biologically sensible queries, nor do they allow user interaction based real-time customization of graphics. To address these drawbacks, we have designed an open-source, web-based tool called Systems-Level Interactive Data Exploration, or SLIDE, to visualize large-scale -omics data interactively. SLIDE's interface makes it easier for scientists to explore quantitative expression data at multiple resolutions on a single screen. SLIDE is publicly available under the BSD license, both as an online version and as a stand-alone version, at https://github.com/soumitag/SLIDE. Supplementary information is available at Bioinformatics online.
Karim, Md Rezaul; Michel, Audrey; Zappa, Achille; Baranov, Pavel; Sahay, Ratnesh; Rebholz-Schuhmann, Dietrich
2017-04-16
Data workflow systems (DWFSs) enable bioinformatics researchers to combine components for data access and data analytics, and to share the final data analytics approach with their collaborators. Increasingly, such systems have to cope with large-scale data, such as full genomes (about 200 GB each), public fact repositories (about 100 TB of data) and 3D imaging data at even larger scales. As moving the data becomes cumbersome, the DWFS needs to embed its processes into a cloud infrastructure, where the data are already hosted. As the standardized public data play an increasingly important role, the DWFS needs to comply with Semantic Web technologies. This advancement to DWFS would reduce overhead costs and accelerate the progress in bioinformatics research based on large-scale data and public resources, as researchers would require less specialized IT knowledge for the implementation. Furthermore, the high data growth rates in bioinformatics research drive the demand for parallel and distributed computing, which then imposes a need for scalability and high-throughput capabilities onto the DWFS. As a result, requirements for data sharing and access to public knowledge bases suggest that compliance of the DWFS with Semantic Web standards is necessary. In this article, we will analyze the existing DWFS with regard to their capabilities toward public open data use as well as large-scale computational and human interface requirements. We untangle the parameters for selecting a preferable solution for bioinformatics research with particular consideration to using cloud services and Semantic Web technologies. Our analysis leads to research guidelines and recommendations toward the development of future DWFS for the bioinformatics research community. © The Author 2017. Published by Oxford University Press.
NASA Astrophysics Data System (ADS)
Shearer, Christine; West, Mick; Caldeira, Ken; Davis, Steven J.
2016-08-01
Nearly 17% of people in an international survey said they believed the existence of a secret large-scale atmospheric program (SLAP) to be true or partly true. SLAP is commonly referred to as ‘chemtrails’ or ‘covert geoengineering’, and has led to a number of websites purported to show evidence of widespread chemical spraying linked to negative impacts on human health and the environment. To address these claims, we surveyed two groups of experts—atmospheric chemists with expertise in condensation trails and geochemists working on atmospheric deposition of dust and pollution—to scientifically evaluate for the first time the claims of SLAP theorists. Results show that 76 of the 77 scientists (98.7%) who took part in this study said they had not encountered evidence of a SLAP, and that the data cited as evidence could be explained through other factors, including well-understood physics and chemistry associated with aircraft contrails and atmospheric aerosols. Our goal is not to sway those already convinced that there is a secret, large-scale spraying program—who often reject counter-evidence as further proof of their theories—but rather to establish a source of objective science that can inform public discourse.
New Approaches for Very Large-Scale Integer Programming
2016-06-24
…existing algorithms. This research has been presented at several conferences and has appeared, or will appear, in archival journals.
Co-governing decentralised water systems: an analytical framework.
Yu, C; Brown, R; Morison, P
2012-01-01
Current discourses in urban water management emphasise a diversity of water sources and scales of infrastructure for resilience and adaptability. During the last two decades, in particular, various small-scale systems emerged and developed, so that the debate has largely moved from centralised versus decentralised water systems toward governing integrated and networked systems of provision and consumption in which small-scale technologies are embedded in large-scale centralised infrastructures. However, while centralised systems have established boundaries of ownership and management, decentralised water systems (such as stormwater harvesting technologies at the street, allotment, and house scales) do not; the viability of adopting and/or continuing to use decentralised water systems is therefore challenged. This paper brings together insights from the literature on public sector governance, co-production and the social practices model to develop an analytical framework for co-governing such systems. The framework provides urban water practitioners with guidance when designing co-governance arrangements for decentralised water systems so that these systems continue to exist, and become widely adopted, within the established urban water regime.
NASA Technical Reports Server (NTRS)
Shostak, A. B.
1973-01-01
The question of how ready the public is for the implementation of large-scale programs of technological change is considered. Four vital aspects of the issue are discussed: (1) the ways in which the public mis-perceives the change process, (2) the ways in which recent history impacts on public attitudes, (3) the ways in which the public divides among itself, and (4) the fundamentals of public attitudes towards change. It is concluded that nothing is so critical in the 1970s to securing public approval for large-scale planned change projects as is change-agents' approval of the public.
Challenges Concerning the Energy-Dependency of the Telecom Infrastructure
NASA Astrophysics Data System (ADS)
Fickert, Lothar; Malleck, Helmut; Wakolbinger, Christian
Industry worldwide depends on Information and Communication Technology (ICT). During large-scale blackouts of the public electricity supply, telephone services and Internet connections are massively reduced in their functions, leading to cascading effects. Following an analysis of selected, typical failure situations, counter-measures to re-establish the public electricity supply to consumers in Austria are identified; this can also serve as an example for other countries. Based on the existing public electricity supply system, a sensitivity analysis is carried out both in the power sector and in the ICT sector, for the mobile and the fixed networks. As a possible new solution, "smart grids" or "microgrids" and the controlled operation of decentralized stable islands are investigated.
Coevolving agent strategies and network topology for the public goods games
NASA Astrophysics Data System (ADS)
Zhang, C. Y.; Zhang, J. L.; Xie, G. M.; Wang, L.
2011-03-01
Much of human cooperation remains an evolutionary riddle. We study coevolutionary public goods games in structured populations, where players can change from an unproductive public goods game to a productive one by evaluating the productivity of the games. In our model, each individual participates in games organized by its neighborhood plus by itself. Coevolution here refers to an evolutionary process entailing both the deletion of existing links and the addition of new links between agents, accompanying the evolution of their strategies. Furthermore, we investigate the effects of the time-scale separation of strategy and structure on the cooperation level. This study presents the following: Foremost, we observe that high cooperation levels in public goods interactions are attained by the entangled coevolution of strategy and structure. The results also confirm that the resulting networks show many features of real systems, such as cooperative behavior and hierarchical clustering. The heterogeneity of the interaction network is held responsible for the observed promotion of cooperation. We hope our work may offer an explanation for the origin of large-scale cooperative behavior among unrelated individuals.
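A hedged Python sketch of the coevolution mechanism described (deleting links alongside adding new ones as strategies evolve), using networkx; the rewiring probability and update rule are illustrative assumptions, not the paper's exact dynamics.

```python
# Coevolving network sketch: agents may cut a link to a defecting neighbour
# and rewire to a randomly chosen other node, so topology and strategy co-evolve.
import random
import networkx as nx

G = nx.erdos_renyi_graph(50, 0.1, seed=1)
strategy = {v: random.choice('CD') for v in G}   # 'C' cooperate / 'D' defect

def rewire_step(G, strategy, p_rewire=0.3):      # p_rewire is an assumption
    for v in list(G):
        defectors = [u for u in G[v] if strategy[u] == 'D']
        if defectors and random.random() < p_rewire:
            u = random.choice(defectors)
            G.remove_edge(v, u)                  # deletion of an existing link
            candidates = [w for w in G if w != v and not G.has_edge(v, w)]
            if candidates:
                G.add_edge(v, random.choice(candidates))  # addition of a new link

rewire_step(G, strategy)
print(nx.number_of_edges(G))
```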
Parker, Cindy L; Everly, George S; Barnett, Daniel J; Links, Jonathan M
2006-01-01
A full-scale public health response to disasters must attend to both the physical and mental health needs of affected communities. Public health preparedness efforts can be greatly expanded to address the latter set of needs, particularly in light of the high ratio of psychological to physical casualties that often rapidly overwhelms existing mental health response resources in a large-scale emergency. Psychological first aid--the provision of basic psychological care in the short-term aftermath of a traumatic event--is a mental health response skill set that public health personnel can readily acquire with proper training. The application of psychological first aid by public health workers can significantly augment front-line community-based mental health responses during the crisis phase of an event. To help achieve this augmented response, we have developed a set of psychological first aid intervention competencies for public health personnel. These competencies, empirically grounded and based on best practice models and consensus statements from leading mental health organizations, represent a necessary step for developing a public health workforce that can better respond to the psychological needs of impacted populations in disasters.
Reforming primary healthcare: from public policy to organizational change.
Gilbert, Frédéric; Denis, Jean-Louis; Lamothe, Lise; Beaulieu, Marie-Dominique; D'amour, Danielle; Goudreau, Johanne
2015-01-01
Governments everywhere are implementing reform to improve primary care. However, the existence of a high degree of professional autonomy makes large-scale change difficult to achieve. The purpose of this paper is to elucidate the change dynamics and the involvement of professionals in a primary healthcare reform initiative carried out in the Canadian province of Quebec. An empirical approach was used to investigate change processes from the inception of a public policy to the execution of changes in professional practices. The data were analysed from a multi-level, combined contextualist-processual perspective. Results are based on a longitudinal multiple-case study of five family medicine groups, which was informed by over 100 interviews, questionnaires, and documentary analysis. The results illustrate the multiple processes observed with the introduction of planned large-scale change in primary care services. The analysis of change content revealed that similar post-change states concealed variations between groups in the scale of their respective changes. The analysis also demonstrated more precisely how change evolved through the introduction of "intermediate change" and how cycles of prescribed and emergent mechanisms distinctively drove change process and change content, from the emergence of the public policy to the change in primary care service delivery. This research was conducted among a limited number of early policy adopters. However, given the international interest in turning to the medical profession to improve primary care, the results offer avenues for both policy development and implementation. The findings offer practical insights for those studying and managing large-scale transformations. They provide a better understanding of how deliberate reforms coexist with professional autonomy through an intertwining of change content and processes. This research is one of few studies to examine a primary care reform from emergence to implementation using a longitudinal multi-level design.
Shrimpton, Roger; du Plessis, Lisanne M; Delisle, Hélène; Blaney, Sonia; Atwood, Stephen J; Sanders, David; Margetts, Barrie; Hughes, Roger
2016-08-01
Objective: To describe why and how capacity-building systems for scaling up nutrition programmes should be constructed in low- and middle-income countries (LMIC). Design: Position paper with task force recommendations based on literature review and joint experience of global nutrition programmes, public health nutrition (PHN) workforce size, organization, and pre-service and in-service training. Setting: The review is global but the recommendations are made for LMIC scaling up multisectoral nutrition programmes. Subjects: The multitude of PHN workers, be they in the health, agriculture, education, social welfare, or water and sanitation sector, as well as the community workers who ensure outreach and coverage of nutrition-specific and -sensitive interventions. Results: Overnutrition and undernutrition problems affect at least half of the global population, especially those in LMIC. Programme guidance exists for undernutrition and overnutrition, and priority for scaling up multisectoral programmes for tackling undernutrition in LMIC is growing. Guidance on how to organize and scale up such programmes is scarce, however, and estimates of existing PHN workforce numbers - although poor - suggest they are also inadequate. Pre-service nutrition training for a PHN workforce is mostly clinical and/or food science oriented, and in-service nutrition training is largely restricted to infant and young child nutrition. Conclusions: Unless increased priority and funding is given to building capacity for scaling up nutrition programmes in LMIC, maternal and child undernutrition rates are likely to remain high and nutrition-related non-communicable diseases to escalate. A hybrid distance learning model for PHN workforce managers' in-service training is urgently needed in LMIC.
Reengineering Real-Time Software Systems
1993-09-09
…reengineering existing large-scale (or real-time) systems; systems designed prior to or during the advent of applied software engineering (SE) (Parnas 1979, Freeman 1980). Advisor: Yutaka Kanayama. Approved for public release; distribution is unlimited.
Kyriacou, Demetrios N; Dobrez, Debra; Parada, Jorge P; Steinberg, Justin M; Kahn, Adam; Bennett, Charles L; Schmitt, Brian P
2012-09-01
Rapid public health response to a large-scale anthrax attack would reduce overall morbidity and mortality. However, there is uncertainty about the optimal cost-effective response strategy based on timing of intervention, public health resources, and critical care facilities. We conducted a decision analytic study to compare response strategies to a theoretical large-scale anthrax attack on the Chicago metropolitan area beginning either Day 2 or Day 5 after the attack. These strategies correspond to the policy options set forth by the Anthrax Modeling Working Group for population-wide responses to a large-scale anthrax attack: (1) postattack antibiotic prophylaxis, (2) postattack antibiotic prophylaxis and vaccination, (3) preattack vaccination with postattack antibiotic prophylaxis, and (4) preattack vaccination with postattack antibiotic prophylaxis and vaccination. Outcomes were measured in costs, lives saved, quality-adjusted life-years (QALYs), and incremental cost-effectiveness ratios (ICERs). We estimated that postattack antibiotic prophylaxis of all 1,390,000 anthrax-exposed people beginning on Day 2 after attack would result in 205,835 infected victims, 35,049 fulminant victims, and 28,612 deaths. Only 6,437 (18.5%) of the fulminant victims could be saved with the existing critical care facilities in the Chicago metropolitan area. Mortality would increase to 69,136 if the response strategy began on Day 5. Including postattack vaccination with antibiotic prophylaxis of all exposed people reduces mortality and is cost-effective for both Day 2 (ICER=$182/QALY) and Day 5 (ICER=$1,088/QALY) response strategies. Increasing ICU bed availability significantly reduces mortality for all response strategies. We conclude that postattack antibiotic prophylaxis and vaccination of all exposed people is the optimal cost-effective response strategy for a large-scale anthrax attack. Our findings support the US government's plan to provide antibiotic prophylaxis and vaccination for all exposed people within 48 hours of the recognition of a large-scale anthrax attack. Future policies should consider expanding critical care capacity to allow for the rescue of more victims.
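The cost-effectiveness metric used here, the ICER, is simply the incremental cost of one strategy over another divided by the incremental QALYs gained. A small Python illustration with hypothetical cost and QALY totals (not the study's data) that reproduces the reported $182/QALY Day-2 figure:

```python
# ICER = (cost_new - cost_base) / (QALY_new - QALY_base)
def icer(cost_new, cost_base, qaly_new, qaly_base):
    return (cost_new - cost_base) / (qaly_new - qaly_base)

# Hypothetical inputs: a strategy costing $18.2M more while gaining 100,000 QALYs.
print(f"${icer(118_200_000, 100_000_000, 1_100_000, 1_000_000):,.0f}/QALY")
# -> $182/QALY, matching the Day-2 response figure reported in the abstract
```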
Experimental effects of climate messages vary geographically
NASA Astrophysics Data System (ADS)
Zhang, Baobao; van der Linden, Sander; Mildenberger, Matto; Marlon, Jennifer R.; Howe, Peter D.; Leiserowitz, Anthony
2018-05-01
Social science scholars routinely evaluate the efficacy of diverse climate frames using local convenience or nationally representative samples [1-5]. For example, previous research has focused on communicating the scientific consensus on climate change, which has been identified as a 'gateway' cognition to other key beliefs about the issue [6-9]. Importantly, although these efforts reveal average public responsiveness to particular climate frames, they do not describe variation in message effectiveness at the spatial and political scales relevant for climate policymaking. Here we use a small-area estimation method to map geographical variation in public responsiveness to information about the scientific consensus as part of a large-scale randomized national experiment (n = 6,301). Our survey experiment finds that, on average, public perception of the consensus increases by 16 percentage points after message exposure. However, substantial spatial variation exists across the United States at state and local scales. Crucially, responsiveness is highest in more conservative parts of the country, leading to national convergence in perceptions of the climate science consensus across diverse political geographies. These findings not only advance a geographical understanding of how the public engages with information about scientific agreement, but will also prove useful for policymakers, practitioners and scientists engaged in climate change mitigation and adaptation.
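The core quantity behind the geographic maps, the message's treatment effect estimated within each geography, can be sketched with simulated data in Python; real small-area estimation (e.g., multilevel regression and post-stratification) is considerably more involved, and all numbers below are simulated, not the study's data.

```python
# Per-geography treatment effect of a randomized message (simulated sketch).
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 6301
df = pd.DataFrame({
    "state": rng.choice(["CA", "TX", "WY"], n),   # placeholder geographies
    "treated": rng.integers(0, 2, n),             # randomized message exposure
})
# Simulated outcome: perceived consensus (0-100), +16 points if treated
df["perceived"] = 60 + 16 * df["treated"] + rng.normal(0, 10, n)

effect_by_state = df.groupby("state").apply(
    lambda g: g.loc[g.treated == 1, "perceived"].mean()
            - g.loc[g.treated == 0, "perceived"].mean()
)
print(effect_by_state)  # geographic variation around the ~16-point average
```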
Lee, Ching-Pei; Lin, Chih-Jen
2014-04-01
Linear rankSVM is one of the widely used methods for learning to rank. Although its performance may be inferior to nonlinear methods such as kernel rankSVM and gradient boosting decision trees, linear rankSVM is useful to quickly produce a baseline model. Furthermore, following its recent development for classification, linear rankSVM may give competitive performance for large and sparse data. A great deal of work has studied linear rankSVM. The focus is on the computational efficiency when the number of preference pairs is large. In this letter, we systematically study existing works, discuss their advantages and disadvantages, and propose an efficient algorithm. We discuss different implementation issues and extensions with detailed experiments. Finally, we develop a robust linear rankSVM tool for public use.
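One common formulation of linear rankSVM reduces ranking to classification via the pairwise transform: each preference pair becomes a difference vector with a ±1 label, on which a linear SVM is trained. A toy, hedged sketch of that idea (not the authors' implementation or tool):

```python
# Pairwise-transform view of linear rankSVM on synthetic data.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
scores = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0])       # hidden ranking model

# Build preference pairs (i, j): x_i preferred over x_j iff score_i > score_j.
i, j = rng.integers(0, 200, 1000), rng.integers(0, 200, 1000)
keep = scores[i] != scores[j]
Xp = X[i[keep]] - X[j[keep]]                            # difference vectors
yp = np.sign(scores[i[keep]] - scores[j[keep]])         # +/-1 preference labels

w = LinearSVC(C=1.0).fit(Xp, yp).coef_.ravel()          # learned ranking weights
print(np.corrcoef(X @ w, scores)[0, 1])                 # should be close to 1
```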
Hernandez-Villafuerte, Karla; Sussex, Jon; Robin, Enora; Guthrie, Sue; Wooding, Steve
2017-02-02
Publicly funded biomedical and health research is expected to achieve the best return possible for taxpayers and for society generally. It is therefore important to know whether such research is more productive if concentrated into a small number of 'research groups' or dispersed across many. We undertook a systematic rapid evidence assessment focused on the research question: do economies of scale and scope exist in biomedical and health research? In other words, is that research more productive per unit of cost if more of it, or a wider variety of it, is done in one location? We reviewed English language literature without date restriction to the end of 2014. To help us to classify and understand that literature, we first undertook a review of econometric literature discussing models for analysing economies of scale and/or scope in research generally (not limited to biomedical and health research). We found a large and disparate literature. We reviewed 60 empirical studies of (dis-)economies of scale and/or scope in biomedical and health research, or in categories of research including or overlapping with biomedical and health research. This literature is varied in methods and findings. At the level of universities or research institutes, studies more often point to positive economies of scale than to diseconomies of scale or constant returns to scale in biomedical and health research. However, all three findings exist in the literature, along with inverse U-shaped relationships. At the level of individual research units, laboratories or projects, the numbers of studies are smaller and evidence is mixed. Concerning economies of scope, the literature more often suggests positive economies of scope than diseconomies, but the picture is again mixed. The effect of varying the scope of activities by a research group was less often reported than the effect of scale and the results were more mixed. The absence of predominant findings for or against the existence of economies of scale or scope implies a continuing need for case by case decisions when distributing research funding, rather than a general policy either to concentrate funding in a few centres or to disperse it across many.
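One standard econometric device in this literature is a log-log cost (or production) regression, where an estimated cost elasticity below one indicates economies of scale. A hedged sketch on simulated data; the model form and variables are illustrative, not drawn from the reviewed studies:

```python
# Scale-economies test via a log-log regression (simulated, illustrative).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
log_output = rng.normal(10, 1, 300)                    # e.g. log research output
log_cost = 0.8 * log_output + rng.normal(0, 0.2, 300)  # true elasticity 0.8

model = sm.OLS(log_cost, sm.add_constant(log_output)).fit()
elasticity = model.params[1]
print(f"cost elasticity = {elasticity:.2f} -> "
      f"{'economies' if elasticity < 1 else 'diseconomies'} of scale")
```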
Editorial opinion: public dissemination of raw turbulence data
NASA Astrophysics Data System (ADS)
Sillero, Juan A.; Jiménez, Javier
2016-04-01
Many of the papers in this issue deal with processing of pre-existing large-scale turbulence data. We argue here that there is a certain urgency to the discussion of whether raw data should be made publicly available within the turbulence community, and of which are the best procedures, technology and rules for possible dissemination. Besides expressing the personal opinion that such sharing would be advantageous for the field, the urgency mostly arises from the danger that funding agencies and other institutions would otherwise set standards without proper community input. The experience of the Madrid School of Aeronautics with the dissemination of numerical simulation results is briefly reviewed, including the present technological solutions and usage statistics.
A Commercialization Roadmap for Carbon-Negative Energy Systems
NASA Astrophysics Data System (ADS)
Sanchez, D.
2016-12-01
The Intergovernmental Panel on Climate Change (IPCC) envisages the need for large-scale deployment of net-negative CO2 emissions technologies, such as bioenergy with carbon capture and storage (BECCS), by mid-century to meet stringent climate mitigation goals and yield a net drawdown of atmospheric carbon. Yet there are few commercial deployments of BECCS outside of niche markets, creating uncertainty about commercialization pathways and sustainability impacts at scale. This uncertainty is exacerbated by the absence of a strong policy framework, such as high carbon prices and research coordination. Here, we propose a strategy for the potential commercial deployment of BECCS. This roadmap proceeds via three steps: 1) capture and utilization of biogenic CO2 from existing bioenergy facilities, notably ethanol fermentation, 2) thermochemical co-conversion of biomass and fossil fuels, particularly coal, and 3) dedicated, large-scale BECCS. Although biochemical conversion is a proven first market for BECCS, this trajectory alone is unlikely to drive commercialization of BECCS at the gigatonne scale. In contrast to biochemical conversion, thermochemical conversion of coal and biomass enables large-scale production of fuels and electricity with a wide range of carbon intensities, process efficiencies and process scales. Aside from systems integration, the barriers are primarily technical, involving large-scale biomass logistics, gasification and gas cleaning. Key uncertainties around large-scale BECCS deployment are not limited to commercialization pathways; rather, they include physical constraints on biomass cultivation or CO2 storage, as well as social barriers, including public acceptance of new technologies and conceptions of renewable and fossil energy, which co-conversion systems confound. Despite sustainability risks, this commercialization strategy presents a pathway where energy suppliers, manufacturers and governments could transition from laggards to leaders in climate change mitigation efforts.
Large-scale annotation of small-molecule libraries using public databases.
Zhou, Yingyao; Zhou, Bin; Chen, Kaisheng; Yan, S Frank; King, Frederick J; Jiang, Shumei; Winzeler, Elizabeth A
2007-01-01
While many large publicly accessible databases provide excellent annotation for biological macromolecules, the same is not true for small chemical compounds. Commercial data sources also fail to encompass an annotation interface for large numbers of compounds and tend to be cost-prohibitive for wide availability among biomedical researchers. Therefore, using annotation information for the selection of lead compounds from a modern day high-throughput screening (HTS) campaign presently occurs only on a very limited scale. The recent rapid expansion of the NIH PubChem database provides an opportunity to link existing biological databases with compound catalogs and provides relevant information that potentially could improve the information garnered from large-scale screening efforts. Using the 2.5 million compound collection at the Genomics Institute of the Novartis Research Foundation (GNF) as a model, we determined that approximately 4% of the library contained compounds with potential annotation in such databases as PubChem and the World Drug Index (WDI) as well as related databases such as the Kyoto Encyclopedia of Genes and Genomes (KEGG) and ChemIDplus. Furthermore, the exact structure match analysis showed 32% of GNF compounds can be linked to third party databases via PubChem. We also showed annotations such as MeSH (medical subject headings) terms can be applied to in-house HTS databases in identifying signature biological inhibition profiles of interest as well as expediting the assay validation process. The automated annotation of thousands of screening hits in batch is becoming feasible and has the potential to play an essential role in the hit-to-lead decision making process.
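The exact-structure matching step described can be illustrated by mapping a SMILES string to a PubChem CID through PubChem's public PUG REST service. The endpoint form follows PubChem's public documentation, but treat the details here as an illustrative sketch rather than the authors' pipeline:

```python
# Look up a PubChem CID for an in-house compound by exact structure (SMILES).
import requests

def pubchem_cid(smiles):
    # PUG REST endpoint form per PubChem's public docs; SMILES with special
    # characters may need POST/URL-encoding in practice.
    url = ("https://pubchem.ncbi.nlm.nih.gov/rest/pug/"
           f"compound/smiles/{smiles}/cids/JSON")
    r = requests.get(url, timeout=30)
    r.raise_for_status()
    cids = r.json().get("IdentifierList", {}).get("CID", [])
    return cids[0] if cids else None   # exact-structure match, as in the paper

print(pubchem_cid("CC(=O)OC1=CC=CC=C1C(=O)O"))  # aspirin -> CID 2244
```

Once a CID is in hand, it can be joined against annotation sources such as KEGG or ChemIDplus, which is the linkage the abstract describes.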
ERIC Educational Resources Information Center
Nolin, Anna P.
2014-01-01
This study explored the role of professional learning communities for district leadership implementing large-scale technology initiatives such as 1:1 implementations (one computing device for every student). The existing literature regarding technology leadership is limited, as is literature on how districts use existing collaborative structures…
ERIC Educational Resources Information Center
Urban Inst., Washington, DC.
This last of a three-volume report of a study done to assess the feasibility of large-scale, countercyclical public job creation covers the findings regarding the priorities among projects, indirect employment effects, skill imbalances, and administrative issues; and summarizes the overall findings, conclusions, and recommendations. (Volume 1,…
An interactive web-based system using cloud for large-scale visual analytics
NASA Astrophysics Data System (ADS)
Kaseb, Ahmed S.; Berry, Everett; Rozolis, Erik; McNulty, Kyle; Bontrager, Seth; Koh, Youngsol; Lu, Yung-Hsiang; Delp, Edward J.
2015-03-01
The number of network cameras has grown rapidly in recent years. Thousands of public network cameras provide a tremendous amount of visual information about the environment. There is a need to analyze this valuable information for a better understanding of the world around us. This paper presents an interactive web-based system that enables users to execute image analysis and computer vision techniques on a large scale to analyze the data from more than 65,000 worldwide cameras. This paper focuses on how to use both the system's website and Application Programming Interface (API). Given a computer program that analyzes a single frame, the user needs to make only slight changes to the existing program and choose the cameras to analyze. The system handles the heterogeneity of the geographically distributed cameras, e.g., different brands and resolutions. The system allocates and manages Amazon EC2 and Windows Azure cloud resources to meet the analysis requirements.
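The usage model, a user-supplied function that analyzes one frame, which the platform then runs over many cameras, can be sketched as below; the camera URLs are placeholders and the loop stands in for the system's actual website/API:

```python
# Per-frame analysis model: the user writes analyze_frame(); the platform
# (emulated here by a plain loop) applies it across many camera streams.
import cv2  # OpenCV

def analyze_frame(frame):
    """User-supplied single-frame analysis, e.g. simple edge density."""
    edges = cv2.Canny(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), 100, 200)
    return float((edges > 0).mean())

camera_urls = ["http://example.org/cam1.mjpg"]  # placeholder camera streams
for url in camera_urls:
    cap = cv2.VideoCapture(url)
    ok, frame = cap.read()
    if ok:
        print(url, analyze_frame(frame))
    cap.release()
```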
NASA Technical Reports Server (NTRS)
Nguyen, D. T.; Watson, Willie R. (Technical Monitor)
2005-01-01
The overall objectives of this research work are to formulate and validate efficient parallel algorithms, and to efficiently design and implement computer software, for solving large-scale acoustic problems arising from the unified frameworks of finite element procedures. The adopted parallel Finite Element (FE) Domain Decomposition (DD) procedures should take full advantage of the multiple processing capabilities offered by most modern high-performance computing platforms for efficient parallel computation. To achieve this objective, the formulation needs to integrate efficient sparse (and dense) assembly techniques, hybrid (or mixed) direct and iterative equation solvers, proper preconditioning strategies, unrolling strategies, and effective processor communication schemes. Finally, the numerical performance of the developed parallel finite element procedures will be evaluated by solving a series of structural and acoustic (symmetric and unsymmetric) problems on different computing platforms. Comparisons with existing "commercialized" and/or "public domain" software are also included, whenever possible.
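A minimal sketch of the sparse-assembly plus hybrid direct/iterative solution strategy mentioned above (not the authors' code): a 1-D finite-element-style stiffness system solved with conjugate gradients, preconditioned by an incomplete-LU factorization.

```python
# Sparse assembly + hybrid direct/iterative solve (illustrative sketch).
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 1000
# FE-style sparse assembly: tridiagonal stiffness matrix of a 1-D Laplacian
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
b = np.ones(n)

ilu = spla.spilu(A)                                # "direct" factorization piece
M = spla.LinearOperator((n, n), matvec=ilu.solve)  # used as a preconditioner
x, info = spla.cg(A, b, M=M)                       # iterative solver piece
print(info, np.linalg.norm(A @ x - b))             # info == 0 means converged
```

In a full domain-decomposition code the factorization would be done per subdomain with an iterative solve coupling the interfaces, but the direct-plus-iterative pairing is the same.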
Kim, Kye-Hyun; Park, Eun-Cheol
2012-01-01
The Korean National Health Insurance (NHI) system was an unprecedented accomplishment that was achieved in a short period of time. In this study, we sought to identify gaps between physicians and the public with respect to attitudes toward the NHI system in Korea. The study population was derived from the 2008 Korean Medical Association Survey, which was conducted to investigate satisfaction with and perceptions of the NHI system among physicians (n = 961) and the public (n = 935). Only 6.5% of the physicians were satisfied with NHI system, and 71.5% were dissatisfied with it. In contrast, 28.3% of the public were satisfied with the NHI system, and 21.4% were dissatisfied. The level of dissatisfaction expressed by physicians (2.03 ± 0.91 on a five-point scale) was also higher than that expressed by the public (3.06 ± 0.84). Despite rapid growth of NHI system, a large gap in satisfaction exists between physicians and the public. PMID:22690087
Yarkoni, Tal
2012-01-01
Traditional pre-publication peer review of scientific output is a slow, inefficient, and unreliable process. Efforts to replace or supplement traditional evaluation models with open evaluation platforms that leverage advances in information technology are slowly gaining traction, but remain in the early stages of design and implementation. Here I discuss a number of considerations relevant to the development of such platforms. I focus particular attention on three core elements that next-generation evaluation platforms should strive to emphasize, including (1) open and transparent access to accumulated evaluation data, (2) personalized and highly customizable performance metrics, and (3) appropriate short-term incentivization of the userbase. Because all of these elements have already been successfully implemented on a large scale in hundreds of existing social web applications, I argue that development of new scientific evaluation platforms should proceed largely by adapting existing techniques rather than engineering entirely new evaluation mechanisms. Successful implementation of open evaluation platforms has the potential to substantially advance both the pace and the quality of scientific publication and evaluation, and the scientific community has a vested interest in shifting toward such models as soon as possible. PMID:23060783
The 'dirty downside' of global sporting events: focus on human trafficking for sexual exploitation.
Finkel, R; Finkel, M L
2015-01-01
Human trafficking is a complex human rights and public health issue. Quantifying human trafficking for sexual exploitation at large global sporting events has proven to be elusive given the clandestine nature of the industry. This piece examines the issue from a public health perspective, via a literature review of the most comprehensive studies published on the topic. A PubMed search was done using the MeSH terms 'human trafficking', 'sex trafficking' and 'human rights abuses'; subheadings included 'statistics and numerical data', 'legislation and jurisprudence', 'prevention and control', and 'therapy'. Only papers published in English were reviewed. The search showed that very few well-designed empirical studies have been conducted on the topic, and only one pertinent systematic review was identified. Findings show a high prevalence of physical violence among trafficked women compared to non-trafficked women. Sexually transmitted infections and HIV/AIDS are prevalent, and preventive care is virtually non-existent. This is not to say that human trafficking for sex, as well as forced sexual exploitation, does not occur. It almost certainly exists, but to what extent is the big question. It is a hidden problem on a global scale, in plain view, with tremendous public health implications. Copyright © 2014 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
DEXTER: Disease-Expression Relation Extraction from Text.
Gupta, Samir; Dingerdissen, Hayley; Ross, Karen E; Hu, Yu; Wu, Cathy H; Mazumder, Raja; Vijay-Shanker, K
2018-01-01
Gene expression levels affect biological processes and play a key role in many diseases. Characterizing expression profiles is useful for clinical research, and diagnostics and prognostics of diseases. There are currently several high-quality databases that capture gene expression information in the context of disease, obtained mostly from large-scale studies using technologies such as microarrays and next-generation sequencing. The scientific literature is another rich source of information on gene expression-disease relationships that have been captured not only in large-scale studies but also in thousands of small-scale studies. Expression information obtained from literature through manual curation can extend expression databases. While many of the existing databases include information from literature, they are limited by the time-consuming nature of manual curation and have difficulty keeping up with the explosion of publications in the biomedical field. In this work, we describe an automated text-mining tool, Disease-Expression Relation Extraction from Text (DEXTER), to extract information from literature on gene and microRNA expression in the context of disease. One of the motivations in developing DEXTER was to extend the BioXpress database, a cancer-focused gene expression database that includes data derived from large-scale experiments and manual curation of publications. The literature-based portion of BioXpress lags behind significantly compared to expression information obtained from large-scale studies and can benefit from our text-mined results. We have conducted two different evaluations to measure the accuracy of our text-mining tool and achieved average F-scores of 88.51% and 81.81% for the two evaluations, respectively. Also, to demonstrate the ability to extract rich expression information in different disease-related scenarios, we used DEXTER to extract information on differential expression for 2024 genes in lung cancer, 115 glycosyltransferases in 62 cancers and 826 microRNAs in 171 cancers. All extractions using DEXTER are integrated into the literature-based portion of BioXpress. Database URL: http://biotm.cis.udel.edu/DEXTER.
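A toy illustration of the rule-based extraction idea behind such a tool: match "<gene> ... up/down-regulated ... in <disease>" patterns in abstract text. DEXTER itself uses proper entity recognition and richer linguistic patterns; this regex is only a simplified stand-in.

```python
# Simplified gene-expression-disease relation extraction (illustrative only).
import re

PATTERN = re.compile(
    r"(?P<gene>[A-Z0-9]{2,})\s+(?:is|was)\s+"
    r"(?P<dir>up-?regulated|down-?regulated|overexpressed)\s+in\s+"
    r"(?P<disease>[a-z ]+?cancer)", re.IGNORECASE)

text = "We found that EGFR was upregulated in lung cancer tissues."
for m in PATTERN.finditer(text):
    print(m.group("gene"), m.group("dir"), m.group("disease"))
# -> EGFR upregulated lung cancer
```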
Large-Scale 1:1 Computing Initiatives: An Open Access Database
ERIC Educational Resources Information Center
Richardson, Jayson W.; McLeod, Scott; Flora, Kevin; Sauers, Nick J.; Kannan, Sathiamoorthy; Sincar, Mehmet
2013-01-01
This article details the spread and scope of large-scale 1:1 computing initiatives around the world. What follows is a review of the existing literature around 1:1 programs followed by a description of the large-scale 1:1 database. Main findings include: 1) the XO and the Classmate PC dominate large-scale 1:1 initiatives; 2) if professional…
Hamilton, Jada G.; Breen, Nancy; Klabunde, Carrie N.; Moser, Richard P.; Leyva, Bryan; Breslau, Erica S.; Kobrin, Sarah C.
2014-01-01
Large-scale surveys that assess cancer prevention and control behaviors are a readily available, rich resource for public health researchers. Although these data are used by a subset of researchers who are familiar with them, their potential is not fully realized by the research community for reasons including lack of awareness of the data and limited understanding of their content, methodology, and utility. Until now, no comprehensive resource existed to describe and facilitate use of these data. To address this gap and maximize use of these data, we catalogued the characteristics and content of four surveys that assessed cancer screening behaviors in 2005, the most recent year with concurrent periods of data collection: the National Health Interview Survey, Health Information National Trends Survey, Behavioral Risk Factor Surveillance System, and California Health Interview Survey. We documented each survey's characteristics, measures of cancer screening, and relevant correlates; examined how published studies (n=78) have used the surveys' cancer screening data; and reviewed new cancer screening constructs measured in recent years. This information can guide researchers in deciding how to capitalize on the opportunities presented by these data resources. PMID:25300474
Survey on large scale system control methods
NASA Technical Reports Server (NTRS)
Mercadal, Mathieu
1987-01-01
The problems inherent to large-scale systems such as power networks, communication networks, and economic or ecological systems were studied. The increase in size and flexibility of future spacecraft has put those dynamical systems into the category of large-scale systems, and tools specific to this class of systems are being sought to design control systems that can guarantee more stability and better performance. Among several survey papers, reference was found to a thorough investigation of decentralized control methods. Especially helpful was the classification made of the different existing approaches for dealing with large-scale systems. A very similar classification is used here, even though the papers surveyed are somewhat different from the ones reviewed in other papers. Special attention is given to the applicability of the existing methods to controlling large mechanical systems like large space structures. Some recent developments are added to this survey.
Zamora, Gerardo; Flores-Urrutia, Mónica Crissel; Mayén, Ana-Lucia
2016-09-01
Fortification of staple foods with vitamins and minerals is an effective approach to increase micronutrient intake and improve nutritional status. The specific use of condiments and seasonings as vehicles in large-scale fortification programs is a relatively new public health strategy. This paper underscores equity considerations for the implementation of large-scale fortification of condiments and seasonings as a public health strategy by examining nonexhaustive examples of programmatic experiences and pilot projects in various settings. An overview of conceptual elements in implementation research and equity is presented, followed by an examination of equity considerations for five implementation strategies: (1) enhancing the capabilities of the public sector, (2) improving the performance of implementing agencies, (3) strengthening the capabilities and performance of frontline workers, (4) empowering communities and individuals, and (5) supporting multiple stakeholders engaged in improving health. Finally, specific considerations related to intersectoral action are considered. Large-scale fortification of condiments and seasonings cannot be a standalone strategy and needs to be implemented with concurrent and coordinated public health strategies, which should be informed by a health equity lens. © 2016 New York Academy of Sciences.
TARGET Publication Guidelines | Office of Cancer Genomics
Like other NCI large-scale genomics initiatives, TARGET is a community resource project and data are made available rapidly after validation for use by other researchers. To act in accord with the Fort Lauderdale principles and support the continued prompt public release of large-scale genomic data prior to publication, researchers who plan to prepare manuscripts containing descriptions of TARGET pediatric cancer data that would be of comparable scope to an initial TARGET disease-specific comprehensive, global analysis publication, and journal editors who receive such manuscripts, are…
Analyzing large scale genomic data on the cloud with Sparkhit
Huang, Liren; Krüger, Jan
2018-01-01
Motivation: The increasing amount of next-generation sequencing data poses a fundamental challenge for large-scale genomic analytics. Existing tools use different distributed computational platforms to scale out bioinformatics workloads. However, the scalability of these tools is not efficient. Moreover, they have heavy run-time overheads when pre-processing large amounts of data. To address these limitations, we have developed Sparkhit: a distributed bioinformatics framework built on top of the Apache Spark platform. Results: Sparkhit integrates a variety of analytical methods. It is implemented in the Spark extended MapReduce model. It runs 92–157 times faster than MetaSpark on metagenomic fragment recruitment and 18–32 times faster than Crossbow on data pre-processing. We analyzed 100 terabytes of data across four genomic projects in the cloud in 21 h, which includes the run times of cluster deployment and data downloading. Furthermore, our application on the entire Human Microbiome Project shotgun sequencing data was completed in 2 h, presenting an approach to easily associate large amounts of public datasets with reference data. Availability and implementation: Sparkhit is freely available at: https://rhinempi.github.io/sparkhit/. Contact: asczyrba@cebitec.uni-bielefeld.de. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:29253074
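The Spark-style data pre-processing pattern Sparkhit accelerates can be sketched in PySpark as below; the file path and quality metric are placeholders, not Sparkhit's actual API.

```python
# Distributed read filtering in the Spark extended MapReduce model (sketch).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("preprocess-sketch").getOrCreate()
sc = spark.sparkContext

reads = sc.textFile("reads.txt")   # placeholder path: one read per line

def base_quality(read):
    # Stand-in metric: fraction of unambiguous bases in the read.
    return sum(b in "ACGT" for b in read) / max(len(read), 1)

clean = reads.filter(lambda r: base_quality(r) > 0.9)  # parallel filter step
print(clean.count())
spark.stop()
```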
Penders, Bart; Vos, Rein; Horstman, Klasien
2009-11-01
Solving complex problems in large-scale research programmes requires cooperation and division of labour. Simultaneously, large-scale problem solving also gives rise to unintended side effects. Based upon 5 years of researching two large-scale nutrigenomic research programmes, we argue that problems are fragmented in order to be solved. These sub-problems are given priority for practical reasons and in the process of solving them, various changes are introduced in each sub-problem. Combined with additional diversity as a result of interdisciplinarity, this makes reassembling the original and overall goal of the research programme less likely. In the case of nutrigenomics and health, this produces a diversification of health. As a result, the public health goal of contemporary nutrition science is not reached in the large-scale research programmes we studied. Large-scale research programmes are very successful in producing scientific publications and new knowledge; however, in reaching their political goals they often are less successful.
Critical Issues in Large-Scale Assessment: A Resource Guide.
ERIC Educational Resources Information Center
Redfield, Doris
The purpose of this document is to provide practical guidance and support for the design, development, and implementation of large-scale assessment systems that are grounded in research and best practice. Information is included about existing large-scale testing efforts, including national testing programs, state testing programs, and…
DOT National Transportation Integrated Search
2016-08-31
A major challenge for achieving large-scale adoption of EVs is an accessible infrastructure for the communities. The societal benefits of large-scale adoption of EVs cannot be realized without adequate deployment of publicly accessible charging stati...
Assessing Large-Scale Public Job Creation. R&D Monograph 67.
ERIC Educational Resources Information Center
Employment and Training Administration (DOL), Washington, DC.
To assess the feasibility of large-scale, countercyclical public job creation, a study was initiated. Job creation program activities were examined in terms of how many activities could be undertaken; what would be their costs; and what would be their characteristics (labor-intensity, skill-mix, and political acceptability) that might contribute…
Evaluating the Health Impact of Large-Scale Public Policy Changes: Classical and Novel Approaches
Basu, Sanjay; Meghani, Ankita; Siddiqi, Arjumand
2018-01-01
Large-scale public policy changes are often recommended to improve public health. Despite varying widely—from tobacco taxes to poverty-relief programs—such policies present a common dilemma to public health researchers: how to evaluate their health effects when randomized controlled trials are not possible. Here, we review the state of knowledge and experience of public health researchers who rigorously evaluate the health consequences of large-scale public policy changes. We organize our discussion by detailing approaches to address three common challenges of conducting policy evaluations: distinguishing a policy effect from time trends in health outcomes or preexisting differences between policy-affected and -unaffected communities (using difference-in-differences approaches); constructing a comparison population when a policy affects a population for whom a well-matched comparator is not immediately available (using propensity score or synthetic control approaches); and addressing unobserved confounders by utilizing quasi-random variations in policy exposure (using regression discontinuity, instrumental variables, or near-far matching approaches). PMID:28384086
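The first approach the review names, difference-in-differences, is easy to make concrete. The sketch below uses synthetic data and a statsmodels formula; it is a toy under invented parameters, not the authors' analysis.

```python
# Minimal difference-in-differences (DiD) sketch with synthetic data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),   # policy-affected community?
    "post": rng.integers(0, 2, n),      # observed after the policy change?
})
# Simulated health outcome; the true policy effect is -2.0.
df["outcome"] = (10 + 1.5 * df["treated"] + 0.5 * df["post"]
                 - 2.0 * df["treated"] * df["post"] + rng.normal(0, 1, n))

# The coefficient on the interaction term is the DiD estimate: it nets out
# both the pre-existing group difference and the shared time trend.
model = smf.ols("outcome ~ treated * post", data=df).fit()
print(model.params["treated:post"])
```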
Contracting out of health services in developing countries.
McPake, B; Banda, E E
1994-03-01
Contracting out is emerging as a common policy issue in a number of developing countries. The theoretical case for contracting out suggests many advantages in combining public finance with private provision. However, practical difficulties such as those of ensuring that competition takes place between potential contractors, that competition leads to efficiency and that contracts and the process of contracting are effectively managed, suggest that such advantages may not always be realized. Most countries are likely only to contemplate restricted contracting of small-scale non-clinical services in the short term. Prerequisites of more extensive models appear to be the development of information systems and human resources to that end. Some urban areas of larger countries may have the existing preconditions for more successful large-scale contracting.
Existence and control of Legionella bacteria in building water systems: A review.
Springston, John P; Yocavitch, Liana
2017-02-01
Legionellae are waterborne bacteria which are capable of causing potentially fatal Legionnaires' disease (LD), as well as Pontiac Fever. Public concern about Legionella exploded following the 1976 outbreak at the American Legion conference in Philadelphia, where 221 attendees contracted pneumonia and 34 died. Since that time, a variety of different control methods and strategies have been developed and implemented in an effort to eradicate Legionella from building water systems. Despite these efforts, the incidence of LD has been steadily increasing in the U.S. for more than a decade. Public health and occupational hygiene professionals have maintained an active debate regarding best practices for management and control of Legionella. Professional opinion remains divided with respect to the relative merits of performing routine sampling for Legionella, vs. the passive, reactive approach that has been largely embraced by public health officials and facility owners. Given the potential risks and ramifications associated with waiting to assess systems for Legionella until after disease has been identified and confirmed, a proactive approach of periodic testing for Legionella, along with proper water treatment, is the best approach to avoiding large-scale disease outbreaks.
NASA Astrophysics Data System (ADS)
Nakata, Mitsuhiko; Tanimoto, Shunsuke; Ishida, Shuichi; Ohsumi, Michio; Hoshikuma, Jun-ichi
2017-10-01
Bridge foundations are at risk of damage from liquefaction-induced lateral spreading of the ground, and once damaged they take a long time to restore. It is therefore important to assess appropriately the seismic behavior of foundations on liquefiable ground. In this study, shaking table tests of 1/10-scale models were conducted on the large-scale shaking table at the Public Works Research Institute, Japan, to investigate the seismic behavior of a pile-supported bridge abutment on liquefiable ground. The shaking table tests were conducted for three types of model: two model an existing bridge built without design provisions for liquefaction, and the third models a bridge designed to the current Japanese design specifications for highway bridges. As a result, the bending strains of the piles of the abutment designed to the current specifications were less than those of the existing bridge.
Cloud computing for genomic data analysis and collaboration.
Langmead, Ben; Nellore, Abhinav
2018-04-01
Next-generation sequencing has made major strides in the past decade. Studies based on large sequencing data sets are growing in number, and public archives for raw sequencing data have been doubling in size every 18 months. Leveraging these data requires researchers to use large-scale computational resources. Cloud computing, a model whereby users rent computers and storage from large data centres, is a solution that is gaining traction in genomics research. Here, we describe how cloud computing is used in genomics for research and large-scale collaborations, and argue that its elasticity, reproducibility and privacy features make it ideally suited for the large-scale reanalysis of publicly available archived data, including privacy-protected data.
Multiple scales in metapopulations of public goods producers
NASA Astrophysics Data System (ADS)
Bauer, Marianne; Frey, Erwin
2018-04-01
Multiple scales in metapopulations can give rise to paradoxical behavior: in a conceptual model for a public goods game, the species associated with a fitness cost due to the public good production can be stabilized in the well-mixed limit due to the mere existence of these scales. The scales in this model involve a length scale corresponding to separate patches, coupled by mobility, and separate time scales for reproduction and interaction with a local environment. Contrary to the well-mixed high mobility limit, we find that for low mobilities, the interaction rate progressively stabilizes this species due to stochastic effects, and that the formation of spatial patterns is not crucial for this stabilization.
Tuan, Nguyen Thanh; Alayon, Silvia; Do, Tran Thanh; Ngan, Tran Thi; Hajeebhoy, Nemat
2015-01-01
Little information is available about how to build a monitoring system to measure the output of preventive nutrition interventions, such as counselling on infant and young child feeding. This paper describes the Alive & Thrive Vietnam (A&T) project experience in nesting a large-scale project monitoring system into the existing public health information system (e.g. using its system and resources), and in using monitoring data to strengthen service delivery in 15 provinces with A&T franchises. From January 2012 to April 2014, the 780 A&T franchises provided 1,700,000 counselling contacts (~3/4 by commune franchises). In commune franchises in April 2014, 80% of mothers who were pregnant or had children under two years old had visited the counselling service at least once, and 87% of clients had visited the service previously. Monitoring data are used to track the progress of the project, make decisions, provide background for a costing study and advocate for the integration of nutrition counselling indicators into the nationwide health information system. With careful attention to the needs of stakeholders at multiple levels, clear data quality assurance measures and strategic feedback mechanisms, it is feasible to monitor the scale-up of nutrition programmes through the existing routine health information system.
A large-scale forest fragmentation experiment: the Stability of Altered Forest Ecosystems Project
Ewers, Robert M.; Didham, Raphael K.; Fahrig, Lenore; Ferraz, Gonçalo; Hector, Andy; Holt, Robert D.; Kapos, Valerie; Reynolds, Glen; Sinun, Waidi; Snaddon, Jake L.; Turner, Edgar C.
2011-01-01
Opportunities to conduct large-scale field experiments are rare, but provide a unique opportunity to reveal the complex processes that operate within natural ecosystems. Here, we review the design of existing, large-scale forest fragmentation experiments. Based on this review, we develop a design for the Stability of Altered Forest Ecosystems (SAFE) Project, a new forest fragmentation experiment to be located in the lowland tropical forests of Borneo (Sabah, Malaysia). The SAFE Project represents an advance on existing experiments in that it: (i) allows discrimination of the effects of landscape-level forest cover from patch-level processes; (ii) is designed to facilitate the unification of a wide range of data types on ecological patterns and processes that operate over a wide range of spatial scales; (iii) has greater replication than existing experiments; (iv) incorporates an experimental manipulation of riparian corridors; and (v) embeds the experimentally fragmented landscape within a wider gradient of land-use intensity than do existing projects. The SAFE Project represents an opportunity for ecologists across disciplines to participate in a large initiative designed to generate a broad understanding of the ecological impacts of tropical forest modification. PMID:22006969
Large scale geologic sequestration (GS) of carbon dioxide poses a novel set of challenges for regulators. This paper focuses on the unique needs of large scale GS projects in light of the existing regulatory regimes in the United States and Canada and identifies several differen...
Large Scale PEM Electrolysis to Enable Renewable Hydrogen Fuel Production
2010-02-10
[Presentation slides approved for public release (Distribution A); the extracted text is slide residue. Recoverable content: a PEM fuel cell anode/cathode diagram under electric load, a balance-of-plant (BOP) system description, and a note that Proton's PEM cell stack for the UK Vanguard submarines enabled a new product launch (C-Series).]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, Eric J
The ResStock analysis tool is helping states, municipalities, utilities, and manufacturers identify which home upgrades save the most energy and money. Across the country there's a vast diversity in the age, size, construction practices, installed equipment, appliances, and resident behavior of the housing stock, not to mention the range of climates. These variations have hindered the accuracy of predicting savings for existing homes. Researchers at the National Renewable Energy Laboratory (NREL) developed ResStock. It's a versatile tool that takes a new approach to large-scale residential energy analysis by combining large public and private data sources, statistical sampling, detailed subhourly building simulations, and high-performance computing. This combination achieves unprecedented granularity and, most importantly, accuracy in modeling the diversity of the single-family housing stock.
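A toy sketch of the statistical-sampling idea mentioned above, assuming invented probability tables and a placeholder savings model rather than ResStock's real inputs or simulation engine:

```python
# Draw representative synthetic homes from characteristic distributions,
# then aggregate per-home savings estimates. All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(1)
vintages = rng.choice(["pre-1960", "1960-1990", "post-1990"],
                      size=10_000, p=[0.3, 0.45, 0.25])
# Older homes are assumed less likely to already have attic insulation.
insulated = rng.random(10_000) < np.where(vintages == "pre-1960", 0.2, 0.6)

# Placeholder "simulation": upgrades save more in older, uninsulated homes.
savings = np.where(insulated, 50.0,
                   np.where(vintages == "pre-1960", 600.0, 350.0))
print(f"mean annual savings per home: ${savings.mean():.0f}")
```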
A process for creating multimetric indices for large-scale aquatic surveys
Differences in sampling and laboratory protocols, differences in techniques used to evaluate metrics, and differing scales of calibration and application prohibit the use of many existing multimetric indices (MMIs) in large-scale bioassessments. We describe an approach to develop...
Managing Risk and Uncertainty in Large-Scale University Research Projects
ERIC Educational Resources Information Center
Moore, Sharlissa; Shangraw, R. F., Jr.
2011-01-01
Both publicly and privately funded research projects managed by universities are growing in size and scope. Complex, large-scale projects (over $50 million) pose new management challenges and risks for universities. This paper explores the relationship between project success and a variety of factors in large-scale university projects. First, we…
Digital Archiving of People Flow by Recycling Large-Scale Social Survey Data of Developing Cities
NASA Astrophysics Data System (ADS)
Sekimoto, Y.; Watanabe, A.; Nakamura, T.; Horanont, T.
2012-07-01
Data on people flow has become increasingly important in the field of business, including the areas of marketing and public services. Although mobile phones enable a person's position to be located to a certain degree, it is a challenge to acquire sufficient data from people with mobile phones. In order to grasp people flow in its entirety, it is important to establish a practical method of reconstructing people flow from various kinds of existing fragmentary spatio-temporal data, such as social survey data. For example, although typical Person Trip Survey data collected by the public sector record only fragmentary spatio-temporal positions, the data are attractive because the sample size is large enough to estimate the entire flow of people. In this study, we apply our proposed basic method to Japan International Cooperation Agency (JICA) PT data pertaining to developing cities around the world, and we propose some correction methods to resolve the difficulties in applying it to many cities and, stably, to infrastructure data.
Evaluation of Large-Scale Public-Sector Reforms: A Comparative Analysis
ERIC Educational Resources Information Center
Breidahl, Karen N.; Gjelstrup, Gunnar; Hansen, Hanne Foss; Hansen, Morten Balle
2017-01-01
Research on the evaluation of large-scale public-sector reforms is rare. This article sets out to fill that gap in the evaluation literature and argues that it is of vital importance since the impact of such reforms is considerable and they change the context in which evaluations of other and more delimited policy areas take place. In our…
Xu, Rong; Li, Li; Wang, QuanQiu
2013-01-01
Motivation: Systems approaches to studying phenotypic relationships among diseases are emerging as an active area of research for both novel disease gene discovery and drug repurposing. Currently, systematic study of disease phenotypic relationships on a phenome-wide scale is limited because large-scale machine-understandable disease–phenotype relationship knowledge bases are often unavailable. Here, we present an automatic approach to extract disease–manifestation (D-M) pairs (one specific type of disease–phenotype relationship) from the wide body of published biomedical literature. Data and Methods: Our method leverages external knowledge and limits the amount of human effort required. For the text corpus, we used 119 085 682 MEDLINE sentences (21 354 075 citations). First, we used D-M pairs from existing biomedical ontologies as prior knowledge to automatically discover D-M–specific syntactic patterns. We then extracted additional pairs from MEDLINE using the learned patterns. Finally, we analysed correlations between disease manifestations and disease-associated genes and drugs to demonstrate the potential of this newly created knowledge base in disease gene discovery and drug repurposing. Results: In total, we extracted 121 359 unique D-M pairs with a high precision of 0.924. Among the extracted pairs, 120 419 (99.2%) have not been captured in existing structured knowledge sources. We have shown that disease manifestations correlate positively with both disease-associated genes and drug treatments. Conclusions: The main contribution of our study is the creation of a large-scale and accurate D-M phenotype relationship knowledge base. This unique knowledge base, when combined with existing phenotypic, genetic and proteomic datasets, can have profound implications in our deeper understanding of disease etiology and in rapid drug repurposing. Availability: http://nlp.case.edu/public/data/DMPatternUMLS/ Contact: rxx@case.edu PMID:23828786
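The pattern-based extraction step is the heart of the method described above. Here is a hedged toy sketch of the idea; the pattern, sentences, and output names are invented, not the paper's learned patterns.

```python
# Toy pattern-based relation extraction: a seed-derived syntactic pattern
# is applied to new sentences to harvest disease-manifestation (D-M) pairs.
import re

pattern = re.compile(r"patients with ([\w\s]+?) (?:often )?present with ([\w\s]+?)[.,]")

sentences = [
    "Patients with Marfan syndrome present with ectopia lentis.",
    "Patients with cystic fibrosis often present with chronic cough, and ...",
]
pairs = []
for s in sentences:
    m = pattern.search(s.lower())
    if m:
        pairs.append((m.group(1).strip(), m.group(2).strip()))
print(pairs)  # [('marfan syndrome', 'ectopia lentis'), ('cystic fibrosis', 'chronic cough')]
```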
NASA Astrophysics Data System (ADS)
Vucinic, Dean; Deen, Danny; Oanta, Emil; Batarilo, Zvonimir; Lacor, Chris
This paper focuses on visualization and manipulation of graphical content in distributed network environments. The developed graphical middleware and 3D desktop prototypes were specialized for situational awareness. This research was done in the LArge Scale COllaborative decision support Technology (LASCOT) project, which explored and combined software technologies to support a human-centred decision support system for crisis management (earthquake, tsunami, flooding, airplane or oil-tanker incidents, chemical, radio-active or other pollutant spreading, etc.). Our state-of-the-art review did not identify any publicly available large-scale distributed application of this kind; existing proprietary solutions rely on conventional technologies and 2D representations. Our challenge was to apply the "latest" available technologies, such as Java3D, X3D and SOAP, compatible with average computer graphics hardware. The selected technologies are integrated, and we demonstrate the flow of data originating from heterogeneous data sources, interoperability across different operating systems, and 3D visual representations that enhance end-user interaction.
Goater, Sarah; Derne, Bonnie; Weinstein, Philip
2011-01-01
Background Emerging environmental pressures resulting from climate change and globalization challenge the capacity of health information systems (HIS) in the Pacific to inform future policy and public health interventions. Ciguatera, a globally common marine food-borne illness, is used here to illustrate specific HIS challenges in the Pacific and how these might be overcome proactively to meet the changing surveillance needs resulting from environmental change. Objectives We review and highlight inefficiencies in the reactive nature of existing HIS in the Pacific to collect, collate, and communicate ciguatera fish poisoning data currently used to inform public health intervention. Further, we review the capacity of existing HIS to respond to new data needs associated with shifts in ciguatera disease burden likely to result from coral reef habitat disruption. Discussion Improved knowledge on the ecological drivers of ciguatera prevalence at local and regional levels is needed, combined with enhanced surveillance techniques and data management systems, to capture environmental drivers as well as health outcomes data. Conclusions The capacity of public HIS to detect and prevent future outbreaks is largely dependent on the future development of governance strategies that promote proactive surveillance and health action. Accordingly, we present an innovative framework from which to stimulate scientific debate on how this might be achieved by using existing larger scale data sets and multidisciplinary collaborations. PMID:21163721
NASA Astrophysics Data System (ADS)
Petropoulos, Z.; Clavin, C.; Zuckerman, B.
2015-12-01
The 2014 4-Methylcyclohexanemethanol (MCHM) spill in the Elk River of West Virginia highlighted existing gaps in emergency planning for, and response to, large-scale chemical releases in the United States. The Emergency Planning and Community Right-to-Know Act requires that facilities with hazardous substances provide Material Safety Data Sheets (MSDSs), which contain health and safety information on the hazardous substances. The MSDS produced by Eastman Chemical Company, the manufacturer of MCHM, listed "no data available" for various human toxicity subcategories, such as reproductive toxicity and carcinogenicity. As a result of incomplete toxicity data, the public and media received conflicting messages on the safety of the contaminated water from government officials, industry, and the public health community. Two days after the governor lifted the ban on water use, the health department partially retracted the ban by warning pregnant women to continue avoiding the contaminated water, which the Centers for Disease Control and Prevention deemed safe three weeks later. The response in West Virginia represents a failure in risk communication and calls into question whether government officials have sufficient information to support evidence-based decisions during future incidents. Research capabilities, like the National Science Foundation RAPID funding, can provide a solution to some of the data gaps, such as information on environmental fate in the case of the MCHM spill. In order to inform policy discussions on this issue, a methodology for assessing the outcomes of RAPID and similar National Institutes of Health grants in the context of emergency response is employed to examine the efficacy of research-based capabilities in enhancing public health decision-making capacity. The results of this assessment highlight the potential roles rapid scientific research can fill in ensuring adequate health and safety data are readily available for decision makers during large-scale chemical releases.
Hermjakob, Henning; Montecchi-Palazzi, Luisa; Bader, Gary; Wojcik, Jérôme; Salwinski, Lukasz; Ceol, Arnaud; Moore, Susan; Orchard, Sandra; Sarkans, Ugis; von Mering, Christian; Roechert, Bernd; Poux, Sylvain; Jung, Eva; Mersch, Henning; Kersey, Paul; Lappe, Michael; Li, Yixue; Zeng, Rong; Rana, Debashis; Nikolski, Macha; Husi, Holger; Brun, Christine; Shanker, K; Grant, Seth G N; Sander, Chris; Bork, Peer; Zhu, Weimin; Pandey, Akhilesh; Brazma, Alvis; Jacq, Bernard; Vidal, Marc; Sherman, David; Legrain, Pierre; Cesareni, Gianni; Xenarios, Ioannis; Eisenberg, David; Steipe, Boris; Hogue, Chris; Apweiler, Rolf
2004-02-01
A major goal of proteomics is the complete description of the protein interaction network underlying cell physiology. A large number of small scale and, more recently, large-scale experiments have contributed to expanding our understanding of the nature of the interaction network. However, the necessary data integration across experiments is currently hampered by the fragmentation of publicly available protein interaction data, which exists in different formats in databases, on authors' websites or sometimes only in print publications. Here, we propose a community standard data model for the representation and exchange of protein interaction data. This data model has been jointly developed by members of the Proteomics Standards Initiative (PSI), a work group of the Human Proteome Organization (HUPO), and is supported by major protein interaction data providers, in particular the Biomolecular Interaction Network Database (BIND), Cellzome (Heidelberg, Germany), the Database of Interacting Proteins (DIP), Dana Farber Cancer Institute (Boston, MA, USA), the Human Protein Reference Database (HPRD), Hybrigenics (Paris, France), the European Bioinformatics Institute's (EMBL-EBI, Hinxton, UK) IntAct, the Molecular Interactions (MINT, Rome, Italy) database, the Protein-Protein Interaction Database (PPID, Edinburgh, UK) and the Search Tool for the Retrieval of Interacting Genes/Proteins (STRING, EMBL, Heidelberg, Germany).
Glacial sediment causing regional-scale elevated arsenic in drinking water.
Erickson, Melinda L; Barnes, Randal J
2005-01-01
In the upper Midwest, USA, elevated arsenic concentrations in public drinking water systems are associated with the lateral extent of northwest provenance late Wisconsin-aged drift. Twelve percent of public water systems located within the footprint of this drift (212 of 1764) exceed 10 microg/L arsenic, which is the U.S. EPA's drinking water standard. Outside of the footprint, only 2.4% of public water systems (52 of 2182) exceed 10 microg/L arsenic. Both glacial drift aquifers and shallow bedrock aquifers overlain by northwest provenance late Wisconsin-aged sediment are affected by arsenic contamination. Evidence suggests that the distinct physical characteristics of northwest provenance late Wisconsin-aged drift--its fine-grained matrix and entrained organic carbon that fosters biological activity--cause the geochemical conditions necessary to mobilize arsenic via reductive mechanisms such as reductive desorption and reductive dissolution of metal oxides. This study highlights an important and often unrecognized phenomenon: high-arsenic sediment is not necessary to cause arsenic-impacted ground water--when "impacted" is now defined as >10 microg/L. This analysis also demonstrates the scientific and economic value of using existing large but imperfect statewide data sets to observe and characterize regional-scale environmental problems.
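The two exceedance rates reported above imply roughly a five-fold difference in risk between systems inside and outside the drift footprint; a few lines confirm the arithmetic:

```python
# Recompute the exceedance rates from the counts given in the abstract
# and the implied risk ratio between the two groups of water systems.
inside = 212 / 1764    # systems within the drift footprint over 10 ug/L
outside = 52 / 2182    # systems outside the footprint over 10 ug/L
print(f"{inside:.1%} vs {outside:.1%}; risk ratio ~ {inside / outside:.1f}x")
# -> 12.0% vs 2.4%; risk ratio ~ 5.0x
```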
48 CFR 852.236-71 - Specifications and drawings for construction.
Code of Federal Regulations, 2012 CFR
2012-10-01
.... (b) Large scale drawings supersede small scale drawings. (c) Dimensions govern in all cases. Scaling of drawings may be done only for general location and general size of items. (d) Dimensions shown of existing work and all dimensions required for work that is to connect with existing work shall be verified...
48 CFR 852.236-71 - Specifications and drawings for construction.
Code of Federal Regulations, 2014 CFR
2014-10-01
.... (b) Large scale drawings supersede small scale drawings. (c) Dimensions govern in all cases. Scaling of drawings may be done only for general location and general size of items. (d) Dimensions shown of existing work and all dimensions required for work that is to connect with existing work shall be verified...
48 CFR 852.236-71 - Specifications and drawings for construction.
Code of Federal Regulations, 2013 CFR
2013-10-01
.... (b) Large scale drawings supersede small scale drawings. (c) Dimensions govern in all cases. Scaling of drawings may be done only for general location and general size of items. (d) Dimensions shown of existing work and all dimensions required for work that is to connect with existing work shall be verified...
48 CFR 852.236-71 - Specifications and drawings for construction.
Code of Federal Regulations, 2011 CFR
2011-10-01
.... (b) Large scale drawings supersede small scale drawings. (c) Dimensions govern in all cases. Scaling of drawings may be done only for general location and general size of items. (d) Dimensions shown of existing work and all dimensions required for work that is to connect with existing work shall be verified...
48 CFR 852.236-71 - Specifications and drawings for construction.
Code of Federal Regulations, 2010 CFR
2010-10-01
.... (b) Large scale drawings supersede small scale drawings. (c) Dimensions govern in all cases. Scaling of drawings may be done only for general location and general size of items. (d) Dimensions shown of existing work and all dimensions required for work that is to connect with existing work shall be verified...
Bi-Force: large-scale bicluster editing and its application to gene expression data biclustering
Sun, Peng; Speicher, Nora K.; Röttger, Richard; Guo, Jiong; Baumbach, Jan
2014-01-01
Abstract The explosion of biological data has dramatically reformed today's biological research. The need to integrate and analyze high-dimensional biological data on a large scale is driving the development of novel bioinformatics approaches. Biclustering, also known as 'simultaneous clustering' or 'co-clustering', has been successfully utilized to discover local patterns in gene expression data and similar biomedical data types. Here, we contribute a new heuristic, 'Bi-Force', based on the weighted bicluster editing model, to perform biclustering on arbitrary sets of biological entities, given any kind of pairwise similarities. We first evaluated the power of Bi-Force to solve dedicated bicluster editing problems by comparing Bi-Force with two existing algorithms in the BiCluE software package. We then followed the biclustering evaluation protocol in a recent review paper from Eren et al. (2013) (A comparative analysis of biclustering algorithms for gene expression data. Brief. Bioinform., 14:279–292.) and compared Bi-Force against eight existing tools: FABIA, QUBIC, Cheng and Church, Plaid, BiMax, Spectral, xMOTIFs and ISA. To this end, a suite of synthetic datasets as well as nine large gene expression datasets from Gene Expression Omnibus were analyzed. All resulting biclusters were subsequently investigated by Gene Ontology enrichment analysis to evaluate their biological relevance. The distinct theoretical foundation of Bi-Force (bicluster editing) is more powerful than strict biclustering, and Bi-Force thus outperformed existing tools, at least when following the evaluation protocols of Eren et al. Bi-Force is implemented in Java and integrated into the open source software package BiCluE. The software as well as all used datasets are publicly available at http://biclue.mpi-inf.mpg.de. PMID:24682815
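To make the weighted bicluster-editing objective concrete, here is a minimal sketch assuming a toy similarity table; it illustrates the cost function, not Bi-Force's force-based heuristic.

```python
# Weighted bicluster editing: given signed similarities between rows (genes)
# and columns (conditions) of a bipartite graph, the cost of a candidate
# biclustering is the similarity lost by deleting edges between biclusters
# plus the penalty for adding absent edges inside them. Toy data only.
sim = {("g1", "c1"): 0.9, ("g1", "c2"): 0.8, ("g2", "c1"): 0.7,
       ("g2", "c2"): -0.4, ("g3", "c3"): 0.95, ("g1", "c3"): -0.6}

def editing_cost(biclusters):
    """Sum |s| over the edge edits needed to make each bicluster complete
    and all biclusters disconnected (s > 0: existing edge, s < 0: absent)."""
    cost = 0.0
    member = {v: i for i, (rows, cols) in enumerate(biclusters) for v in rows | cols}
    for (g, c), s in sim.items():
        same = member.get(g) == member.get(c)
        if same and s < 0:        # must add an absent edge
            cost += -s
        elif not same and s > 0:  # must delete an existing edge
            cost += s
    return cost

candidate = [({"g1", "g2"}, {"c1", "c2"}), ({"g3"}, {"c3"})]
print(editing_cost(candidate))  # 0.4: only the absent edge g2-c2 must be added
```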
ERIC Educational Resources Information Center
Hamilton, Mary
2017-01-01
This paper examines how international, large-scale skills assessments (ILSAs) engage with the broader societies they seek to serve and improve. It looks particularly at the discursive work that is done by different interest groups and the media through which the findings become part of public conversations and are translated into usable form in…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ebrahimi, Fatima
Magnetic fields are observed to exist on all scales in many astrophysical sources such as stars, galaxies, and accretion discs. Understanding the origin of large scale magnetic fields, whereby the field emerges on spatial scales large compared to the fluctuations, has been a particularly long standing challenge. Our physics objectives are: 1) what are the minimum ingredients for large-scale dynamo growth? 2) could a large-scale magnetic field grow out of turbulence and be sustained despite the presence of dissipation? These questions are fundamental for understanding the large-scale dynamo in both laboratory and astrophysical plasmas. Here, we report major new findings in the area of Large-Scale Dynamo (magnetic field generation).
Barbera, J; Macintyre, A; Gostin, L; Inglesby, T; O'Toole, T; DeAtley, C; Tonat, K; Layton, M
2001-12-05
Concern for potential bioterrorist attacks causing mass casualties has increased recently. Particular attention has been paid to scenarios in which a biological agent capable of person-to-person transmission, such as smallpox, is intentionally released among civilians. Multiple public health interventions are possible to effect disease containment in this context. One disease control measure that has been regularly proposed in various settings is the imposition of large-scale or geographic quarantine on the potentially exposed population. Although large-scale quarantine has not been implemented in recent US history, it has been used on a small scale in biological hoaxes, and it has been invoked in federally sponsored bioterrorism exercises. This article reviews the scientific principles that are relevant to the likely effectiveness of quarantine, the logistic barriers to its implementation, legal issues that a large-scale quarantine raises, and possible adverse consequences that might result from quarantine action. Imposition of large-scale quarantine (compulsory sequestration of groups of possibly exposed persons, or human confinement within certain geographic areas to prevent the spread of contagious disease) should not be considered a primary public health strategy in most imaginable circumstances. In the majority of contexts, other less extreme public health actions are likely to be more effective and create fewer unintended adverse consequences than quarantine. Actions and areas for future research, policy development, and response planning efforts are provided.
Development of a large-scale transportation optimization course.
DOT National Transportation Integrated Search
2011-11-01
"In this project, a course was developed to introduce transportation and logistics applications of large-scale optimization to graduate students. This report details what : similar courses exist in other universities, and the methodology used to gath...
Gething, Peter W; Patil, Anand P; Hay, Simon I
2010-04-01
Risk maps estimating the spatial distribution of infectious diseases are required to guide public health policy from local to global scales. The advent of model-based geostatistics (MBG) has allowed these maps to be generated in a formal statistical framework, providing robust metrics of map uncertainty that enhances their utility for decision-makers. In many settings, decision-makers require spatially aggregated measures over large regions such as the mean prevalence within a country or administrative region, or national populations living under different levels of risk. Existing MBG mapping approaches provide suitable metrics of local uncertainty--the fidelity of predictions at each mapped pixel--but have not been adapted for measuring uncertainty over large areas, due largely to a series of fundamental computational constraints. Here the authors present a new efficient approximating algorithm that can generate for the first time the necessary joint simulation of prevalence values across the very large prediction spaces needed for global scale mapping. This new approach is implemented in conjunction with an established model for P. falciparum allowing robust estimates of mean prevalence at any specified level of spatial aggregation. The model is used to provide estimates of national populations at risk under three policy-relevant prevalence thresholds, along with accompanying model-based measures of uncertainty. By overcoming previously unchallenged computational barriers, this study illustrates how MBG approaches, already at the forefront of infectious disease mapping, can be extended to provide large-scale aggregate measures appropriate for decision-makers.
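A hedged numerical sketch of why joint (rather than pixel-by-pixel) simulation is needed for aggregate summaries follows; the covariance model, logit link, and all numbers are illustrative assumptions, not the authors' P. falciparum model.

```python
# Draw correlated prevalence surfaces, then summarise the population-weighted
# mean across draws; per-pixel marginals alone would misstate this interval.
import numpy as np

rng = np.random.default_rng(2)
n_pix, n_draws = 500, 1000
pop = rng.uniform(100, 5000, n_pix)            # population per pixel (toy)

# Correlated field on the logit scale with exponential-decay covariance.
d = np.abs(np.subtract.outer(np.arange(n_pix), np.arange(n_pix)))
cov = 0.5 * np.exp(-d / 50.0)
draws = rng.multivariate_normal(np.full(n_pix, -1.5), cov, size=n_draws)
prev = 1 / (1 + np.exp(-draws))                # prevalence per pixel, per draw

agg = prev @ pop / pop.sum()                   # population-weighted mean per draw
print(np.percentile(agg, [2.5, 50, 97.5]))     # aggregate credible interval
```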
Spherical 3D isotropic wavelets
NASA Astrophysics Data System (ADS)
Lanusse, F.; Rassat, A.; Starck, J.-L.
2012-04-01
Context. Future cosmological surveys will provide 3D large scale structure maps with large sky coverage, for which a 3D spherical Fourier-Bessel (SFB) analysis in spherical coordinates is natural. Wavelets are particularly well-suited to the analysis and denoising of cosmological data, but a spherical 3D isotropic wavelet transform does not currently exist to analyse spherical 3D data. Aims: The aim of this paper is to present a new formalism for a spherical 3D isotropic wavelet, i.e. one based on the SFB decomposition of a 3D field and accompany the formalism with a public code to perform wavelet transforms. Methods: We describe a new 3D isotropic spherical wavelet decomposition based on the undecimated wavelet transform (UWT) described in Starck et al. (2006). We also present a new fast discrete spherical Fourier-Bessel transform (DSFBT) based on both a discrete Bessel transform and the HEALPIX angular pixelisation scheme. We test the 3D wavelet transform and as a toy-application, apply a denoising algorithm in wavelet space to the Virgo large box cosmological simulations and find we can successfully remove noise without much loss to the large scale structure. Results: We have described a new spherical 3D isotropic wavelet transform, ideally suited to analyse and denoise future 3D spherical cosmological surveys, which uses a novel DSFBT. We illustrate its potential use for denoising using a toy model. All the algorithms presented in this paper are available for download as a public code called MRS3D at http://jstarck.free.fr/mrs3d.html
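For reference, one common convention for the spherical Fourier-Bessel pair on which such isotropic wavelets are built, with j_ell the spherical Bessel functions of the first kind and Y_ellm the spherical harmonics; normalisations vary between papers, so treat this as a plausible form rather than MRS3D's exact one:

```latex
% A common SFB convention; the normalisation is one standard choice.
f_{\ell m}(k) = \sqrt{\frac{2}{\pi}} \int f(\mathbf{r})\, k\, j_\ell(k r)\, Y_{\ell m}^{*}(\theta,\varphi)\, \mathrm{d}^{3}\mathbf{r},
\qquad
f(\mathbf{r}) = \sqrt{\frac{2}{\pi}} \sum_{\ell=0}^{\infty} \sum_{m=-\ell}^{\ell} \int_{0}^{\infty} f_{\ell m}(k)\, k\, j_\ell(k r)\, Y_{\ell m}(\theta,\varphi)\, \mathrm{d}k
```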
Hill, Elizabeth M; Turner, Emma L; Martin, Richard M; Donovan, Jenny L
2013-06-04
Opt-in consent is usually required for research, but is known to introduce selection bias. This is a particular problem for large scale epidemiological studies using only pre-collected health data. Most previous studies have shown that members of the public value opt-in consent and can perceive research without consent as an invasion of privacy. Past research has suggested that people are generally unaware of research processes and existing safeguards, and that education may increase the acceptability of research without prior informed consent, but this recommendation has not been formally evaluated. Our objectives were to determine the range of public opinion about the use of existing medical data for research and to explore views about consent to a secondary review of medical records for research. We also investigated the effect of the provision of detailed information about the potential effect of selection bias on public acceptability of the use of data for research. We carried out a systematic review of existing literature on public attitudes to secondary use of existing health records identified by searching PubMed (1966-present), Embase (1974-present) and reference lists of identified studies to provide a general overview, followed by a qualitative focus group study with 19 older men recruited from rural and suburban primary care practices in the UK to explore key issues in detail. The systematic review identified twenty-seven relevant papers and the findings suggested that males and older people were more likely to consent to a review of their medical data. Many studies noted participants' lack of knowledge about research processes and existing safeguards and this was reflected in the focus groups. Focus group participants became more accepting of the use of pre-collected medical data without consent after being given information about selection bias and research processes. All participants were keen to contribute to NHS-related research but some were concerned about data-sharing for commercial gain and the potential misuse of information. Increasing public education about research and specific targeted information provision could promote trust in research processes and safeguards, which in turn could increase the acceptability of research without specific consent where the need for consent would lead to biased findings and impede research necessary to improve public health.
Public Health Crisis in War and Conflict - Health Security in Aggregate.
Quinn, John; Zelený, Tomáš; Subramaniam, Rammika; Bencko, Vladimír
2017-03-01
Public health status of populations is multifactorial and, among other factors, is linked to war and conflict. A public health crisis can erupt when states go to war or are invaded; health security may be reduced for affected populations. This study reviews in aggregate multiple indices of human security, human development, and state legitimacy in order to describe a predictable global health portrait. The paradigm shift from large global powers to non-state actors and proxies, which exert regional influence through scaled conflict, presents major global health challenges for policy makers. Small-scale conflict with large-scale violence threatens health security for at-risk populations. The paper concludes that health security is directly proportional to state security. Copyright© by the National Institute of Public Health, Prague 2017
Mackintosh, Maureen; Channon, Amos; Karan, Anup; Selvaraj, Sakthivel; Cavagnero, Eleonora; Zhao, Hongwen
2016-08-06
Private health care in low-income and middle-income countries is very extensive and very heterogeneous, ranging from itinerant medicine sellers, through millions of independent practitioners-both unlicensed and licensed-to corporate hospital chains and large private insurers. Policies for universal health coverage (UHC) must address this complex private sector. However, no agreed measures exist to assess the scale and scope of the private health sector in these countries, and policy makers tasked with managing and regulating mixed health systems struggle to identify the key features of their private sectors. In this report, we propose a set of metrics, drawn from existing data that can form a starting point for policy makers to identify the structure and dynamics of private provision in their particular mixed health systems; that is, to identify the consequences of specific structures, the drivers of change, and levers available to improve efficiency and outcomes. The central message is that private sectors cannot be understood except within their context of mixed health systems since private and public sectors interact. We develop an illustrative and partial country typology, using the metrics and other country information, to illustrate how the scale and operation of the public sector can shape the private sector's structure and behaviour, and vice versa. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Eichhubl, Peter; Frohlich, Cliff; Gale, Julia; Olson, Jon; Fan, Zhiqiang; Gono, Valerie
2014-05-01
Induced seismicity during or following the subsurface injection of waste fluids such as well stimulation flow-back and production fluids has recently received heightened public and industry attention. It is understood that induced seismicity occurs by reactivation of existing faults that are generally present in the injection intervals. We seek to address the question of why fluid injection triggers earthquakes in some areas and not in others, with the aim of improving injection methods that optimize injection volume and cost while avoiding induced seismicity. A GIS database has been built of natural and induced earthquakes in four hydrocarbon-producing basins: the Fort Worth Basin, South Texas, East Texas/Louisiana, and the Williston Basin. These areas are associated with disposal from the Barnett, Eagle Ford, Bakken, and Haynesville Shales, respectively. In each region we analyzed data that had been collected using temporary seismographs of the National Science Foundation's USArray Transportable Array. Injection well locations, formations, histories, and volumes are also mapped using public and licensed datasets. Faults are mapped at a range of scales for selected areas that show different levels of seismic activity, and scaling relationships are used to extrapolate between the seismic and wellbore scales. Reactivation potential of these faults is assessed using fault occurrence and in-situ stress conditions, identifying areas of high and low fault reactivation potential. A correlation analysis between fault reactivation potential, induced seismicity, and fluid injection will use spatial statistics to quantify the probability of seismic fault reactivation for a given injection pressure in the studied reservoirs. The limiting conditions inducing fault reactivation will be compared to actual injection parameters (volume, rate, injection duration and frequency) where available. The objective of this project is a statistical reservoir- to basin-scale assessment of fault reactivation and seismicity induced by fluid injection. By assessing the occurrence of earthquakes (M>2) evenly across large geographic regions, this project differs from previous studies of injection-induced seismicity that focused on earthquakes large enough to cause public concern in well-populated areas. The understanding of triggered seismicity gained through this project is expected to allow improved design strategies for waste fluid injection for industry and public decision makers.
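Fault-reactivation screening of this kind is typically framed with a Coulomb failure criterion, sketched below as an assumed framing rather than a quotation from the project: tau is the shear stress resolved on the fault, sigma_n the fault-normal compression, p the pore pressure (which injection raises), and mu the friction coefficient.

```latex
% Standard Coulomb failure-stress criterion: slip is promoted when
\Delta \mathrm{CFS} \;=\; \Delta\tau \;-\; \mu\,\bigl(\Delta\sigma_n - \Delta p\bigr) \;>\; 0
```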
Goodman, Angela; Sanguinito, Sean; Levine, Jonathan S.
2016-09-28
Carbon storage resource estimation in subsurface saline formations plays an important role in establishing the scale of carbon capture and storage activities for governmental policy and commercial project decision-making. Prospective CO2 resource estimation of large regions or subregions, such as a basin, occurs at the initial screening stages of a project using only limited publicly available geophysical data, i.e. prior to project-specific site selection data generation. As the scale of investigation is narrowed and selected areas and formations are identified, prospective CO2 resource estimation can be refined and uncertainty narrowed when site-specific geophysical data are available. Here, we refine the United States Department of Energy – National Energy Technology Laboratory (US-DOE-NETL) methodology as the scale of investigation is narrowed from very large regional assessments down to selected areas and formations that may be developed for commercial storage. In addition, we present a new notation that explicitly identifies differences between data availability and data sources used for geologic parameters and efficiency factors as the scale of investigation is narrowed. This CO2 resource estimation method is available for screening formations in a tool called CO2-SCREEN.
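The US-DOE-NETL prospective estimate rests on a volumetric relation of the following form, where A_t is the total assessed area, h_g the gross formation thickness, phi_tot the total porosity, rho the CO2 density at reservoir conditions, and E_saline the storage efficiency factor the abstract distinguishes from the geologic parameters; treat this rendering as a reference sketch of the published method.

```latex
% Volumetric relation at the core of the DOE-NETL saline methodology:
G_{\mathrm{CO_2}} \;=\; A_t \, h_g \, \phi_{tot} \, \rho \, E_{\mathrm{saline}}
```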
Process for Low Cost Domestic Production of LIB Cathode Materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thurston, Anthony
The objective of the research was to determine the best low cost method for the large scale production of the Nickel-Cobalt-Manganese (NCM) layered cathode materials. The research and development focused on scaling up the licensed technology from Argonne National Laboratory in BASF's battery material pilot plant in Beachwood, Ohio. Since BASF did not have experience with the large scale production of the NCM cathode materials, a significant amount of development was needed to support BASF's already existing research program. During the three year period BASF was able to develop and validate production processes for the NCM 111, 523 and 424 materials as well as begin development of the High Energy NCM. BASF also used this time period to provide free cathode material samples to numerous manufacturers, OEMs and research companies in order to validate the materials. The success of the project can be demonstrated by the construction of the production plant in Elyria, Ohio and the successful operation of that facility. The benefit of the project to the public will begin to be apparent as soon as material from the production plant is being used in electric vehicles.
WikiPEATia - a web based platform for assembling peatland data through ‘crowd sourcing’
NASA Astrophysics Data System (ADS)
Wisser, D.; Glidden, S.; Fieseher, C.; Treat, C. C.; Routhier, M.; Frolking, S. E.
2009-12-01
The Earth System Science community is realizing that peatlands are an important and unique terrestrial ecosystem that has not yet been well-integrated into large-scale earth system analyses. A major hurdle is the lack of accessible, geospatial data of peatland distribution, coupled with data on peatland properties (e.g., vegetation composition, peat depth, basal dates, soil chemistry, peatland class) at the global scale. This data, however, is available at the local scale. Although a comprehensive global database on peatlands probably lags similar data on more economically important ecosystems such as forests, grasslands, and croplands, a large amount of field data has been collected over the past several decades. A few efforts have been made to map peatlands at large scales, but existing data have not been assembled into a single geospatial database that is publicly accessible, or do not depict data with the level of detail needed by the Earth System Science community. A global peatland database would contribute to advances in a number of research fields such as hydrology, vegetation and ecosystem modeling, permafrost modeling, and earth system modeling. We present a Web 2.0 approach that uses state-of-the-art webserver and innovative online mapping technologies and is designed to create such a global database through 'crowd-sourcing'. Primary functions of the online system include form-driven textual user input of peatland research metadata, spatial data input of peatland areas via a mapping interface, database editing and querying capabilities, as well as advanced visualization and data analysis tools. WikiPEATia provides an integrated information technology platform for assembling, integrating, and posting peatland-related geospatial datasets, and it facilitates and encourages research community involvement. A successful effort will make existing peatland data much more useful to the research community, and will help to identify significant data gaps.
NASA Technical Reports Server (NTRS)
Weinberg, B. C.; Mcdonald, H.
1986-01-01
The existence of large scale coherent structures in turbulent shear flows has been well documented. Discrepancies between experimental and computational data suggest a necessity to understand the roles they play in mass and momentum transport. Using conditional sampling and averaging on coincident two-component velocity and concentration velocity experimental data for swirling and nonswirling coaxial jets, triggers for identifying the structures were examined. Concentration fluctuation was found to be an adequate trigger or indicator for the concentration-velocity data, but no suitable detector was located for the two-component velocity data. The large scale structures are found in the region where the largest discrepancies exist between model and experiment. The traditional gradient transport model does not fit in this region as a result of these structures. The large scale motion was found to be responsible for a large percentage of the axial mass transport. The large scale structures were found to convect downstream at approximately the mean velocity of the overall flow in the axial direction. The radial mean velocity of the structures was found to be substantially greater than that of the overall flow.
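A minimal sketch of conditional sampling with a concentration-fluctuation trigger, the detection approach described above; the synthetic signals, correlation, and threshold are invented for illustration.

```python
# Conditional averaging: average a velocity signal only over instants when
# the concentration fluctuation exceeds a trigger threshold, and compare
# with the conventional (unconditional) long-time average.
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(20_000)
c = rng.normal(0, 1, t.size)              # concentration fluctuation c'
u = 0.6 * c + rng.normal(0, 1, t.size)    # axial velocity correlated with c'

trigger = c > 2.0 * c.std()               # "structure present" when c' is large
u_conditional = u[trigger].mean()         # conditional average inside structures
u_conventional = u.mean()                 # unconditional average
print(u_conditional, u_conventional)      # structures carry excess axial transport
```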
[Privacy and public benefit in using large scale health databases].
Yamamoto, Ryuichi
2014-01-01
In Japan, large-scale health databases have been constructed within a few years, such as the national claims and health checkup database (NDB) and the Japanese Sentinel project. But some legal issues remain in striking an adequate balance between privacy and public benefit in the use of such databases. The NDB operates under the act on health care for elderly persons, but this act says nothing about using the database for the general public benefit. Therefore researchers who use this database are forced to pay great attention to anonymization and information security, which may disturb the research work itself. The Japanese Sentinel project is a national project for detecting adverse drug reactions using large-scale distributed clinical databases of large hospitals. Although patients give future consent for such general purposes for the public good, the use of insufficiently anonymized data is still under discussion. Generally speaking, research for the public benefit will not infringe patients' privacy, but vague and complex legislative requirements on personal data protection may hamper such research. Medical science does not progress without using clinical information; therefore adequate legislation that is simple and clear for both researchers and patients is strongly required. In Japan, a specific act for balancing privacy and public benefit is now under discussion. The author recommends that researchers, including those in the field of pharmacology, pay attention to, participate in the discussion of, and make suggestions for such acts and regulations.
Using social network analysis to understand Missouri's system of public health emergency planners.
Harris, Jenine K; Clements, Bruce
2007-01-01
Effective response to large-scale public health threats requires well-coordinated efforts among individuals and agencies. While guidance is available to help states put emergency planning programs into place, little has been done to evaluate the human infrastructure that facilitates successful implementation of these programs. This study examined the human infrastructure of the Missouri public health emergency planning system in 2006. The Center for Emergency Response and Terrorism (CERT) at the Missouri Department of Health and Senior Services has responsibility for planning, guiding, and funding statewide emergency response activities. Thirty-two public health emergency planners working primarily in county health departments contract with CERT to support statewide preparedness. We surveyed the planners to determine whom they communicate with, work with, seek expertise from, and exchange guidance with regarding emergency preparedness in Missouri. Most planners communicated regularly with planners in their region but seldom with planners outside their region. Planners also reported working with an average of 12 local entities (e.g., emergency management, hospitals/clinics). Planners identified the following leaders in Missouri's public health emergency preparedness system: local public health emergency planners, state epidemiologists, the state vaccine and grant coordinator, regional public health emergency planners, State Emergency Management Agency area coordinators, the state Strategic National Stockpile coordinator, and Federal Bureau of Investigation Weapons of Mass Destruction coordinators. Generally, planners listed few federal-level or private-sector individuals in their emergency preparedness networks. While Missouri public health emergency planners maintain large and varied emergency preparedness networks, there are opportunities for strengthening existing ties and seeking additional connections.
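The network measures such a study relies on are straightforward to compute. A toy networkx sketch follows; the names and ties are invented, not the Missouri survey data.

```python
# Build the "works with" graph and rank actors by degree and betweenness
# centrality, two common ways of identifying leaders in a preparedness network.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("planner_A", "planner_B"), ("planner_A", "SEMA_coordinator"),
    ("planner_B", "state_epidemiologist"), ("planner_C", "planner_A"),
    ("planner_C", "SNS_coordinator"), ("state_epidemiologist", "planner_C"),
])
degree = dict(G.degree())
betweenness = nx.betweenness_centrality(G)
for n in sorted(G, key=betweenness.get, reverse=True):
    print(f"{n}: degree={degree[n]}, betweenness={betweenness[n]:.2f}")
```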
Generation of large-scale magnetic fields by small-scale dynamo in shear flows
Squire, J.; Bhattacharjee, A.
2015-10-20
We propose a new mechanism for a turbulent mean-field dynamo in which the magnetic fluctuations resulting from a small-scale dynamo drive the generation of large-scale magnetic fields. This is in stark contrast to the common idea that small-scale magnetic fields should be harmful to large-scale dynamo action. These dynamos occur in the presence of a large-scale velocity shear and do not require net helicity, resulting from off-diagonal components of the turbulent resistivity tensor as the magnetic analogue of the "shear-current" effect. Furthermore, given the inevitable existence of nonhelical small-scale magnetic fields in turbulent plasmas, as well as the generic nature of velocity shear, the suggested mechanism may help explain the generation of large-scale magnetic fields across a wide range of astrophysical objects.
Community Consultation and Public Disclosure: Preliminary Results From A New Model
Ramsey, Cornelia A.; Quearry, Bonnie; Ripley, Elizabeth
2011-01-01
Emergency medicine research conducted under the exception from informed consent (EFIC) regulation enables critical scientific advancements. When EFIC is proposed, there is a requirement for broad community consultation and public disclosure regarding the risks and benefits of the study. At the present time, no clear guidelines or standards exist for conducting and evaluating the community consultation and public disclosure. This preliminary study tested the feasibility and acceptability of a new approach to community consultation and public disclosure for a large-scale EFIC trial by engaging community members in designing and conducting the strategies. The authors enrolled key community members (called Community Advocates for Research, or CARs) to use community-based participatory methods to design and implement community consultation and public disclosure. By partnering with community members who represent target populations for the research study, this new approach has demonstrated a feasible community consultation and public disclosure plan with greater community participation and less cost than previous studies. In a community survey, the percentage of community members reporting having heard about the EFIC trial more than doubled after employing the new approach. This article discusses initial implementation and results. PMID:21729187
Data management strategies for multinational large-scale systems biology projects.
Wruck, Wasco; Peuker, Martin; Regenbrecht, Christian R A
2014-01-01
Good accessibility of publicly funded research data is essential to secure an open scientific system and eventually becomes mandatory [Wellcome Trust will Penalise Scientists Who Don't Embrace Open Access. The Guardian 2012]. With the use of high-throughput methods in many research areas from physics to systems biology, large data collections are increasingly important as raw material for research. Here, we present strategies worked out by international and national institutions targeting open access to publicly funded research data via incentives or obligations to share data. Funding organizations such as the British Wellcome Trust have therefore developed data sharing policies and request commitment to data management and sharing in grant applications. Increased citation rates are a profound argument for sharing publication data. Pre-publication sharing might be rewarded by a data citation credit system via digital object identifiers (DOIs), which have initially been in use for data objects. Besides policies and incentives, good practice in data management is indispensable. However, appropriate systems for data management of large-scale projects, for example in systems biology, are hard to find. Here, we give an overview of a selection of open-source data management systems that have been employed successfully in large-scale projects.
Toren, Katelynne Gardner; Elsenboss, Carina; Narita, Masahiro
2017-01-01
Public Health—Seattle and King County, a metropolitan health department in western Washington, experiences rates of tuberculosis (TB) that are 1.6 times higher than state and national averages. The department's TB Control Program uses public health emergency management tools and capabilities, sustained with Centers for Disease Control and Prevention grant funding, to manage large-scale complex case investigations. We describe 3 contact investigations in large congregate settings that the TB Control Program conducted in 2015 and 2016. The program managed the investigations using public health emergency management tools, with support from the Preparedness Program. The 3 investigations encompassed medical evaluation of more than 1600 people, used more than 100 workers, identified nearly 30 individuals with latent TB infection, and prevented an estimated 3 cases of active disease. These incidents exemplify how investments in public health emergency preparedness can enhance health outcomes in traditional areas of public health. PMID:28892445
Agtini, M D; Ochiai, R L; Soeharno, R; Lee, H J; Sundoro, J; Hadinegoro, S R; Han, O P; Tana, L; Halim, F X S; Ghani, L; Delima; Lestari, W; Sintawati, F X; Kusumawardani, N; Malik, R; Santoso, T S; Nadjib, M; Soeroso, S; Wangsasaputra, F; Ali, M; Ivanoff, B; Galindo, C M; Pang, T; Clemens, J D; Suwandono, A; Acosta, C J
2006-11-01
To report results on coverage, safety and logistics of a large-scale, school-based Vi polysaccharide immunization campaign in North Jakarta. Of 443 primary schools in North Jakarta, Indonesia, 18 public schools were randomly selected for this study. Exclusion criteria were a fever of 37.5 °C or higher at the time of vaccination or a known history of hypersensitivity to any vaccine. Adverse events were monitored and recorded for 1 month after immunization. Because this was a pilot programme, resource use was tracked in detail. During the February 2004 vaccination campaign, 4828 students were immunized (91% of the target population); another 394 students (7%) were vaccinated during mop-up programmes. Informed consent was obtained for 98% of the target population. In all, 34 adverse events were reported, corresponding to seven events per 1000 doses injected; none was serious. The manufacturer-recommended cold chain was maintained throughout the programme. This demonstration project in two sub-districts of North Jakarta shows that a large-scale, school-based typhoid fever Vi polysaccharide vaccination campaign is logistically feasible, safe and minimally disruptive to regular school activities when used in the context of an existing successful immunization platform. The project had high parental acceptance. Nonetheless, policy-relevant questions still need to be answered before implementing a widespread Vi polysaccharide vaccine programme in Indonesia.
Large Scale Density Estimation of Blue and Fin Whales (LSD)
2014-09-30
DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. The project develops an approach to estimating blue and fin whale density that is effective over large spatial scales and is designed to cope with spatial variation in animal density.
Large Scale Pedagogical Transformation as Widespread Cultural Change in Mexican Public Schools
ERIC Educational Resources Information Center
Rincón-Gallardo, Santiago
2016-01-01
This article examines how and under what conditions a new pedagogy can spread at scale using the Learning Community Project (LCP) in Mexico as a case study. Started as a small-scale, grassroots pedagogical change initiative in a handful of public schools, LCP evolved over an 8-year period into a national policy that spread its pedagogy of tutorial…
Geospatial Optimization of Siting Large-Scale Solar Projects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Macknick, Jordan; Quinby, Ted; Caulfield, Emmet
2014-03-01
Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.
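The multi-criteria evaluation such a tool performs can be illustrated as a weighted-overlay suitability calculation on raster layers. The numpy sketch below uses hypothetical criteria, weights, and toy grids, not the tool's actual inputs.

```python
# A sketch of user-weighted, multi-criteria site suitability scoring
# on toy raster layers (hypothetical data, not the GIS tool's inputs).
import numpy as np

rng = np.random.default_rng(0)
shape = (100, 100)  # a toy 100x100 grid of candidate cells

# Normalized criterion layers in [0, 1]; higher is better for siting.
solar_resource = rng.random(shape)      # e.g., annual irradiance
grid_proximity = rng.random(shape)      # closeness to transmission
env_compatibility = rng.random(shape)   # inverse of env. sensitivity

# User-defined weights, as in an interactive siting interface.
weights = {"solar": 0.5, "grid": 0.3, "env": 0.2}

suitability = (weights["solar"] * solar_resource
               + weights["grid"] * grid_proximity
               + weights["env"] * env_compatibility)

# Report the best-scoring cell as a candidate site.
best = np.unravel_index(np.argmax(suitability), shape)
print(f"best cell {best}, score {suitability[best]:.3f}")
```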
MycoDB, a global database of plant response to mycorrhizal fungi.
Chaudhary, V Bala; Rúa, Megan A; Antoninka, Anita; Bever, James D; Cannon, Jeffery; Craig, Ashley; Duchicela, Jessica; Frame, Alicia; Gardes, Monique; Gehring, Catherine; Ha, Michelle; Hart, Miranda; Hopkins, Jacob; Ji, Baoming; Johnson, Nancy Collins; Kaonongbua, Wittaya; Karst, Justine; Koide, Roger T; Lamit, Louis J; Meadow, James; Milligan, Brook G; Moore, John C; Pendergast, Thomas H; Piculell, Bridget; Ramsby, Blake; Simard, Suzanne; Shrestha, Shubha; Umbanhowar, James; Viechtbauer, Wolfgang; Walters, Lawrence; Wilson, Gail W T; Zee, Peter C; Hoeksema, Jason D
2016-05-10
Plants form belowground associations with mycorrhizal fungi in one of the most common symbioses on Earth. However, few large-scale generalizations exist for the structure and function of mycorrhizal symbioses, as the nature of this relationship varies from mutualistic to parasitic and is largely context-dependent. We announce the public release of MycoDB, a database of 4,010 studies (from 438 unique publications) to aid in multi-factor meta-analyses elucidating the ecological and evolutionary context in which mycorrhizal fungi alter plant productivity. Over 10 years with nearly 80 collaborators, we compiled data on the response of plant biomass to mycorrhizal fungal inoculation, including meta-analysis metrics and 24 additional explanatory variables that describe the biotic and abiotic context of each study. We also include phylogenetic trees for all plants and fungi in the database. To our knowledge, MycoDB is the largest ecological meta-analysis database. We aim to share these data to highlight significant gaps in mycorrhizal research and encourage synthesis to explore the ecological and evolutionary generalities that govern mycorrhizal functioning in ecosystems.
Big Data in Medicine is Driving Big Changes
Verspoor, K.
2014-01-01
Objectives: To summarise current research that takes advantage of "Big Data" in health and biomedical informatics applications. Methods: Survey of trends in this work, and exploration of literature describing how large-scale structured and unstructured data sources are being used to support applications from clinical decision making and health policy, to drug design and pharmacovigilance, and further to systems biology and genetics. Results: The survey highlights ongoing development of powerful new methods for turning that large-scale, and often complex, data into information that provides new insights into human health, in a range of different areas. Consideration of this body of work identifies several important paradigm shifts that are facilitated by Big Data resources and methods: in clinical and translational research, from hypothesis-driven research to data-driven research, and in medicine, from evidence-based practice to practice-based evidence. Conclusions: The increasing scale and availability of large quantities of health data require strategies for data management, data linkage, and data integration beyond the limits of many existing information systems, and substantial effort is underway to meet those needs. As our ability to make sense of that data improves, the value of the data will continue to increase. Health systems, genetics and genomics, population and public health: all areas of biomedicine stand to benefit from Big Data and the associated technologies. PMID:25123716
[Current state of measures to deal with natural disasters at public universities].
Hirouchi, Tomoko; Tanka, Mamoru; Shimada, Ikuko; Yoshimoto, Yoshinobu; Sato, Atsushi
2012-03-01
The responsibility of a university after a large-scale natural disaster is to secure the safety of students' and local residents' lives. The present study investigated the current state of measures at public universities to deal with natural disasters in coordination with the local community. A survey was administered at 77 public universities in Japan from March 25 to May 10, 2011. The survey included questions on the existence of local disaster evacuation sites, a disaster manual, disaster equipment storage, emergency drinking water, and food storage. A total of 51% of universities had designated local evacuation sites. For the remaining questions, the percentages of universities with and without each measure were, respectively, 42% and 57% for disaster manuals; 55% and 33% for disaster equipment; 32% and 13% for drinking water storage; and 26% and 7% for emergency food storage. A majority of public universities have not created disaster manuals, regardless of whether they have a local evacuation site. The survey results also indicated that most universities have no storage of disaster equipment or emergency supplies.
Multi-scale chromatin state annotation using a hierarchical hidden Markov model
NASA Astrophysics Data System (ADS)
Marco, Eugenio; Meuleman, Wouter; Huang, Jialiang; Glass, Kimberly; Pinello, Luca; Wang, Jianrong; Kellis, Manolis; Yuan, Guo-Cheng
2017-04-01
Chromatin-state analysis is widely applied in studies of development and disease. However, existing methods operate at a single length scale and therefore cannot distinguish large domains from isolated elements of the same type. To overcome this limitation, we present a hierarchical hidden Markov model, diHMM, to systematically annotate chromatin states at multiple length scales. We apply diHMM to analyse a public ChIP-seq data set. diHMM not only accurately captures nucleosome-level information, but identifies domain-level states that vary in nucleosome-level state composition, spatial distribution and functionality. The domain-level states recapitulate known patterns such as super-enhancers, bivalent promoters and Polycomb repressed regions, and identify additional patterns whose biological functions are not yet characterized. By integrating chromatin-state information with gene expression and Hi-C data, we identify context-dependent functions of nucleosome-level states. Thus, diHMM provides a powerful tool for investigating the role of higher-order chromatin structure in gene regulation.
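One way to build intuition for multi-scale annotation is to decode over a composite state space in which each hidden state pairs a domain-level label with a nucleosome-level label, so that domain membership constrains local transitions. The numpy sketch below runs standard log-space Viterbi over such a composite space; the states, transition and emission probabilities are invented toy values, and this is a conceptual illustration, not diHMM's algorithm.

```python
# Generic Viterbi decoding over a composite (domain, nucleosome) state
# space -- an illustration of multi-scale annotation, not diHMM itself.
import numpy as np

states = [("active_domain", "enhancer"), ("active_domain", "promoter"),
          ("repressed_domain", "polycomb")]
n = len(states)

# Hypothetical transition matrix: domain switches are rare, so mass
# concentrates on pairs sharing the same domain-level state.
A = np.array([[0.70, 0.25, 0.05],
              [0.25, 0.70, 0.05],
              [0.05, 0.05, 0.90]])
# Hypothetical emission probabilities over 2 observed chromatin marks.
B = np.array([[0.8, 0.2],
              [0.6, 0.4],
              [0.1, 0.9]])
pi = np.array([0.4, 0.4, 0.2])
obs = [0, 0, 1, 1, 1, 0]  # a toy sequence of observed marks

# Log-space Viterbi recursion with backpointers.
logA, logB, logpi = np.log(A), np.log(B), np.log(pi)
V = np.zeros((len(obs), n))
ptr = np.zeros((len(obs), n), dtype=int)
V[0] = logpi + logB[:, obs[0]]
for t in range(1, len(obs)):
    scores = V[t - 1][:, None] + logA        # scores[i, j]: i -> j
    ptr[t] = scores.argmax(axis=0)
    V[t] = scores.max(axis=0) + logB[:, obs[t]]

path = [int(V[-1].argmax())]
for t in range(len(obs) - 1, 0, -1):
    path.append(int(ptr[t][path[-1]]))
for t, s in enumerate(reversed(path)):
    print(t, states[s])
```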
NASA Technical Reports Server (NTRS)
Brondum, D. C.; Bennett, J. C.
1986-01-01
The existence of large-scale coherent structures in turbulent shear flows has been well documented. Discrepancies between experimental and computational data suggest a need to understand the roles these structures play in mass and momentum transport. Using conditional sampling and averaging on coincident two-component velocity and concentration-velocity experimental data for swirling and nonswirling coaxial jets, triggers for identifying the structures were examined. Concentration fluctuation was found to be an adequate trigger, or indicator, for the concentration-velocity data, but no suitable detector was found for the two-component velocity data. The large-scale structures occur in the region where the largest discrepancies exist between model and experiment; the traditional gradient transport model fails in this region as a result of these structures. The large-scale motion was found to be responsible for a large percentage of this transport, moving downstream at approximately the mean velocity of the overall flow in the axial direction. The radial mean velocity of the structures was found to be substantially greater than that of the overall flow.
Wireless Technology Infrastructures for Authentication of Patients: PKI that Rings
Sax, Ulrich; Kohane, Isaac; Mandl, Kenneth D.
2005-01-01
As the public interest in consumer-driven electronic health care applications rises, so do concerns about the privacy and security of these applications. Achieving a balance between providing the necessary security while promoting user acceptance is a major obstacle in large-scale deployment of applications such as personal health records (PHRs). Robust and reliable forms of authentication are needed for PHRs, as the record will often contain sensitive and protected health information, including the patient's own annotations. Since the health care industry per se is unlikely to succeed at single-handedly developing and deploying a large scale, national authentication infrastructure, it makes sense to leverage existing hardware, software, and networks. This report proposes a new model for authentication of users to health care information applications, leveraging wireless mobile devices. Cell phones are widely distributed, have high user acceptance, and offer advanced security protocols. The authors propose harnessing this technology for the strong authentication of individuals by creating a registration authority and an authentication service, and examine the problems and promise of such a system. PMID:15684133
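One concrete way to realize the proposed phone-based authentication is a time-based one-time password (TOTP) as the second factor. The sketch below uses the pyotp library as a generic illustration; it is not the authors' registration authority or authentication service.

```python
# A generic sketch of phone-based strong authentication using
# time-based one-time passwords (pyotp) -- an illustration of the
# idea, not the authors' registration/authentication service.
import pyotp

# At registration, the authority provisions a shared secret to the
# user's phone (e.g., as a QR code scanned by an authenticator app).
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

# On the phone: the app displays a 6-digit code valid for ~30 s.
code_shown_on_phone = totp.now()

# At the PHR server: verify the code the user typed in.
if totp.verify(code_shown_on_phone):
    print("second factor accepted; grant access to the record")
else:
    print("authentication failed")
```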
NASA Astrophysics Data System (ADS)
Yin, Mingqiang; Hu, Wen; Wu, Haonan
2018-03-01
China's vigorous promotion of large-scale, collective operation in traditional farming has brought substantial negative externalities in the form of environmental pollution. Pollution drives a wedge between social cost and private cost: its cost is not borne by the polluting enterprises, for whom it is an external cost. This paper takes a pig farm in Chongqing as an example and selects the COD and TN indicators in its wastewater as the focus of the analysis. We explore the equilibrium point between the private cost borne by Party A and the public welfare cost imposed by environmental pollution, and test the reasonableness and accuracy of the existing norms. Building on existing research, the paper closes by exploring a better solution based on Pigouvian theory and the delimitation of property rights.
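The "equilibrium point" referred to above is, in Pigouvian terms, the abatement level at which marginal abatement cost equals marginal damage; the Pigouvian tax is set to the marginal damage at that point. A toy numerical sketch with hypothetical linear cost curves (not the paper's COD/TN data):

```python
# Toy Pigouvian equilibrium: choose the abatement level where marginal
# abatement cost equals marginal damage (hypothetical curves, not the
# paper's COD/TN data).
import numpy as np

abatement = np.linspace(0.0, 1.0, 1001)        # fraction of pollution removed
marginal_abatement_cost = 50.0 * abatement     # rises as cleanup deepens
marginal_damage = 40.0 * (1.0 - abatement)     # falls as pollution shrinks

# Social optimum: the crossing point of the two marginal curves.
i = np.argmin(np.abs(marginal_abatement_cost - marginal_damage))
print(f"optimal abatement: {abatement[i]:.2f}")
print(f"Pigouvian tax rate (= marginal damage there): {marginal_damage[i]:.2f}")
```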
Rueckl, Martin; Lenzi, Stephen C; Moreno-Velasquez, Laura; Parthier, Daniel; Schmitz, Dietmar; Ruediger, Sten; Johenning, Friedrich W
2017-01-01
The measurement of activity in vivo and in vitro has shifted from electrical to optical methods. While the indicators for imaging activity have improved significantly over the last decade, tools for analysing optical data have not kept pace. Most available analysis tools are limited in their flexibility and applicability to datasets obtained at different spatial scales. Here, we present SamuROI (Structured analysis of multiple user-defined ROIs), an open source Python-based analysis environment for imaging data. SamuROI simplifies exploratory analysis and visualization of image series of fluorescence changes in complex structures over time and is readily applicable at different spatial scales. In this paper, we show the utility of SamuROI in Ca2+-imaging based applications at three spatial scales: the micro-scale (i.e., sub-cellular compartments including cell bodies, dendrites and spines); the meso-scale (i.e., whole cell and population imaging with single-cell resolution); and the macro-scale (i.e., imaging of changes in bulk fluorescence in large brain areas, without cellular resolution). The software described here provides a graphical user interface for intuitive data exploration and region of interest (ROI) management that can be used interactively within Jupyter Notebook: a publicly available interactive Python platform that allows simple integration of our software with existing tools for automated ROI generation and post-processing, as well as custom analysis pipelines. SamuROI software, source code and installation instructions are publicly available on GitHub and documentation is available online. SamuROI reduces the energy barrier for manual exploration and semi-automated analysis of spatially complex Ca2+ imaging datasets, particularly when these have been acquired at different spatial scales.
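The core operation behind this kind of ROI-based exploration is extracting a fluorescence trace per region and normalizing it as ΔF/F. The numpy sketch below illustrates that step generically; it does not use SamuROI's actual API, and the movie and ROI are synthetic.

```python
# Generic ROI trace extraction and dF/F normalization for an image
# series -- an illustration of the core operation, not SamuROI's API.
import numpy as np

rng = np.random.default_rng(1)
movie = rng.poisson(100, size=(500, 64, 64)).astype(float)  # t, y, x

# A boolean mask per ROI (here one hypothetical 5x5 somatic ROI).
roi_mask = np.zeros((64, 64), dtype=bool)
roi_mask[20:25, 30:35] = True

# Mean fluorescence inside the ROI at each time point.
trace = movie[:, roi_mask].mean(axis=1)

# dF/F relative to a baseline (here the 20th percentile of the trace).
f0 = np.percentile(trace, 20)
dff = (trace - f0) / f0
print(f"baseline F0={f0:.1f}, peak dF/F={dff.max():.3f}")
```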
Ecosystem management: A comparison of greater yellowstone and georges bank
NASA Astrophysics Data System (ADS)
Burroughs, Richard H.; Clark, Tim W.
1995-09-01
Ecosystem management links human activities with the functioning of natural environments over large spatial and temporal scales. Our examination of Greater Yellowstone and Georges Bank shows that similarities exist between human uses, administrative characteristics, and some biophysical features. Each region faces growing pressure to replace traditional extractive uses with more sustainable extractive or noncommodity uses, coupled with concern about endangered species. Ecosystem management as a set of practical guidelines for making decisions under evolving expectations is far from complete, and it embodies new demands on individuals and institutions. In each system these challenges are considered relative to: the public's symbolic understanding of the management challenge, ecosystem management ambiguities, information availability, information use, administrative setting, and the learning capabilities of governance organizations. Progress in making ecosystem management operational may occur as refinements in content and approach make it an increasingly attractive option for resource users, the public, and government officials.
Human3.6M: Large Scale Datasets and Predictive Methods for 3D Human Sensing in Natural Environments.
Ionescu, Catalin; Papava, Dragos; Olaru, Vlad; Sminchisescu, Cristian
2014-07-01
We introduce a new dataset, Human3.6M, of 3.6 million accurate 3D human poses, acquired by recording the performance of 5 female and 6 male subjects, under 4 different viewpoints, for training realistic human sensing systems and for evaluating the next generation of human pose estimation models and algorithms. Besides increasing the size of the datasets in the current state-of-the-art by several orders of magnitude, we also aim to complement such datasets with a diverse set of motions and poses encountered as part of typical human activities (taking photos, talking on the phone, posing, greeting, eating, etc.), with additional synchronized image, human motion capture, and time of flight (depth) data, and with accurate 3D body scans of all the subject actors involved. We also provide controlled mixed reality evaluation scenarios where 3D human models are animated using motion capture and inserted using correct 3D geometry, in complex real environments, viewed with moving cameras, and under occlusion. Finally, we provide a set of large-scale statistical models and detailed evaluation baselines for the dataset illustrating its diversity and the scope for improvement by future work in the research community. Our experiments show that our best large-scale model can leverage our full training set to obtain a 20% improvement in performance compared to a training set of the scale of the largest existing public dataset for this problem. Yet the potential for improvement by leveraging higher-capacity, more complex models with our large dataset is substantially greater and should stimulate future research. The dataset, together with code for the associated large-scale learning models, features, visualization tools, as well as the evaluation server, is available online at http://vision.imar.ro/human3.6m.
Taking Stock: Existing Resources for Assessing a New Vision of Science Learning
ERIC Educational Resources Information Center
Alonzo, Alicia C.; Ke, Li
2016-01-01
A new vision of science learning described in the "Next Generation Science Standards"--particularly the science and engineering practices and their integration with content--pose significant challenges for large-scale assessment. This article explores what might be learned from advances in large-scale science assessment and…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-17
... Integrated Circuit Semiconductor Chips and Products Containing the Same; Notice of a Commission Determination... certain large scale integrated circuit semiconductor chips and products containing same by reason of... existence of a domestic industry. The Commission's notice of investigation named several respondents...
Ontology-based meta-analysis of global collections of high-throughput public data.
Kupershmidt, Ilya; Su, Qiaojuan Jane; Grewal, Anoop; Sundaresh, Suman; Halperin, Inbal; Flynn, James; Shekar, Mamatha; Wang, Helen; Park, Jenny; Cui, Wenwu; Wall, Gregory D; Wisotzkey, Robert; Alag, Satnam; Akhtari, Saeid; Ronaghi, Mostafa
2010-09-29
The investigation of the interconnections between the molecular and genetic events that govern biological systems is essential if we are to understand the development of disease and design effective novel treatments. Microarray and next-generation sequencing technologies have the potential to provide this information. However, taking full advantage of these approaches requires that biological connections be made across large quantities of highly heterogeneous genomic datasets. Leveraging the increasingly huge quantities of genomic data in the public domain is fast becoming one of the key challenges in the research community today. We have developed a novel data mining framework that enables researchers to use this growing collection of public high-throughput data to investigate any set of genes or proteins. The connectivity between molecular states across thousands of heterogeneous datasets from microarrays and other genomic platforms is determined through a combination of rank-based enrichment statistics, meta-analyses, and biomedical ontologies. We address data quality concerns through dataset replication and meta-analysis and ensure that the majority of the findings are derived using multiple lines of evidence. As an example of our strategy and the utility of this framework, we apply our data mining approach to explore the biology of brown fat within the context of the thousands of publicly available gene expression datasets. Our work presents a practical strategy for organizing, mining, and correlating global collections of large-scale genomic data to explore normal and disease biology. Using a hypothesis-free approach, we demonstrate how a data-driven analysis across very large collections of genomic data can reveal novel discoveries and evidence to support existing hypotheses.
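The rank-based enrichment statistic at the heart of such a framework can be illustrated with a Mann-Whitney U test asking whether a query gene set ranks systematically high in a dataset's differential-expression ordering. A minimal sketch with hypothetical gene ranks (the brown-fat marker genes are real, the ranks are invented; this is not the platform's implementation):

```python
# Minimal rank-based enrichment test: do query genes sit unusually
# high in a dataset's ranked gene list? (Hypothetical ranks; an
# illustration of the statistic, not the platform's implementation.)
from scipy.stats import mannwhitneyu

# Ranks of genes in one dataset (1 = most up-regulated).
ranks = {"UCP1": 3, "PPARGC1A": 7, "PRDM16": 12, "ACTB": 5400,
         "GAPDH": 6100, "TP53": 2900, "MYC": 850, "EGFR": 4100}

query = {"UCP1", "PPARGC1A", "PRDM16"}       # e.g., a brown-fat gene set
in_set = [r for g, r in ranks.items() if g in query]
background = [r for g, r in ranks.items() if g not in query]

# One-sided test: query ranks smaller (better) than background ranks.
stat, p = mannwhitneyu(in_set, background, alternative="less")
print(f"U={stat}, p={p:.4f}")
```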
DESIGN OF LARGE-SCALE AIR MONITORING NETWORKS
The potential effects of air pollution on human health have received much attention in recent years. In the U.S. and other countries, there are extensive large-scale monitoring networks designed to collect data to inform the public of exposure risks to air pollution. A major crit...
Ramkumar, Prem N; Muschler, George F; Spindler, Kurt P; Harris, Joshua D; McCulloch, Patrick C; Mont, Michael A
2017-04-01
The recent private-public partnership to unlock and utilize all available health data has large-scale implications for public health and personalized medicine, especially within orthopedics. Today, consumer-based technologies such as smartphones and "wearables" store tremendous amounts of personal health data (known as "mHealth") that, when processed and contextualized, have the potential to open new windows of insight for orthopedic surgeons about their patients. In the present report, the landscape, role, and future technical considerations of mHealth and open architecture are defined, with particular examples in lower extremity arthroplasty. A limitation of the current mHealth landscape is the fragmentation and lack of interconnectivity between the myriad of available apps. The importance of the currently lacking open mHealth architecture is underscored by the promise of improved research, increased workflow efficiency, and value capture for the orthopedic surgeon. There exists an opportunity to leverage existing mobile health data for orthopedic surgeons, particularly those specializing in lower extremity arthroplasty, by transforming patient small data into insightful big data through the implementation of "open" architecture that affords universal data standards and a global interconnected network. Copyright © 2016 Elsevier Inc. All rights reserved.
He, Guizhen; Zhang, Lei; Lu, Yonglong
2009-09-01
Large-scale public infrastructure projects have featured in China's modernization course since the early 1980s. During the early stages of China's rapid economic development, public attention focused on the economic and social impact of high-profile construction projects. In recent years, however, we have seen a shift in public concern toward the environmental and ecological effects of such projects, and today governments are required to provide valid environmental impact assessments prior to allowing large-scale construction. The official requirement for the monitoring of environmental conditions has led to an increased number of debates in recent years regarding the effectiveness of Environmental Impact Assessments (EIAs) and Governmental Environmental Audits (GEAs) as environmental safeguards in instances of large-scale construction. Although EIA and GEA are conducted by different institutions and have different goals and enforcement potential, these two practices can be closely related in terms of methodology. This article cites the construction of the Qinghai-Tibet Railway as an instance in which EIA and GEA offer complementary approaches to environmental impact management. This study concludes that the GEA approach can serve as an effective follow-up to the EIA and establishes that the EIA lays a base for conducting future GEAs. The relationship that emerges through a study of the Railway's construction calls for more deliberate institutional arrangements and cooperation if the two practices are to be used in concert to optimal effect.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duan, Sisi; Nicely, Lucas D; Zhang, Haibin
Modern large-scale networks require the ability to withstand arbitrary failures (i.e., Byzantine failures). Byzantine reliable broadcast algorithms can be used to reliably disseminate information in the presence of Byzantine failures. We design a novel Byzantine reliable broadcast protocol for loosely connected and synchronous networks. While previous such protocols all assume correct senders, our protocol is the first to handle Byzantine senders. To achieve this goal, we have developed new techniques for fault detection and fault tolerance. Our protocol is efficient, and under normal circumstances, no expensive public-key cryptographic operations are used. We implement and evaluate our protocol, demonstrating that our protocol has high throughput and is superior to the existing protocols in uncivil executions.
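For reference, the message-counting thresholds of the classic Bracha reliable broadcast protocol (which, unlike the work above, assumes a correct sender and n = 3f + 1 nodes) can be sketched as follows; this is a generic baseline, not the authors' protocol.

```python
# Threshold logic of classic Bracha reliable broadcast (n = 3f + 1).
# A generic reference sketch, not the authors' protocol.
class BrachaNode:
    def __init__(self, n, f):
        self.n, self.f = n, f
        self.echoes, self.readies = set(), set()
        self.sent_ready = self.delivered = False

    def on_echo(self, sender_id):
        self.echoes.add(sender_id)
        # Enough ECHOes from distinct nodes: support the value.
        if len(self.echoes) >= (self.n + self.f) // 2 + 1:
            self._send_ready()

    def on_ready(self, sender_id):
        self.readies.add(sender_id)
        # f + 1 READYs: at least one is honest, so amplify.
        if len(self.readies) >= self.f + 1:
            self._send_ready()
        # 2f + 1 READYs: safe to deliver exactly once.
        if len(self.readies) >= 2 * self.f + 1 and not self.delivered:
            self.delivered = True
            print("delivered")

    def _send_ready(self):
        if not self.sent_ready:
            self.sent_ready = True
            print("broadcast READY")  # network send elided

# Example: n=4, f=1; three ECHOes trigger READY, three READYs deliver.
node = BrachaNode(n=4, f=1)
for peer in (1, 2, 3):
    node.on_echo(peer)
for peer in (1, 2, 3):
    node.on_ready(peer)
```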
Large-scale Labeled Datasets to Fuel Earth Science Deep Learning Applications
NASA Astrophysics Data System (ADS)
Maskey, M.; Ramachandran, R.; Miller, J.
2017-12-01
Deep learning has revolutionized computer vision and natural language processing with various algorithms scaled using high-performance computing. However, generic large-scale labeled datasets such as the ImageNet are the fuel that drives the impressive accuracy of deep learning results. Large-scale labeled datasets already exist in domains such as medical science, but creating them in the Earth science domain is a challenge. While there are ways to apply deep learning using limited labeled datasets, there is a need in the Earth sciences for creating large-scale labeled datasets for benchmarking and scaling deep learning applications. At the NASA Marshall Space Flight Center, we are using deep learning for a variety of Earth science applications where we have encountered the need for large-scale labeled datasets. We will discuss our approaches for creating such datasets and why these datasets are just as valuable as deep learning algorithms. We will also describe successful usage of these large-scale labeled datasets with our deep learning based applications.
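On the consumption side, such a labeled dataset typically reaches a deep learning framework through a standard dataset abstraction. A hedged PyTorch sketch, assuming a hypothetical layout of .npy patches indexed by a CSV of (path, class) rows:

```python
# A generic PyTorch Dataset over labeled Earth-science image patches
# (hypothetical directory layout and label file, for illustration).
import csv

import numpy as np
import torch
from torch.utils.data import Dataset

class LabeledPatchDataset(Dataset):
    """Patches stored as .npy arrays, labels in a CSV of (path, class)."""

    def __init__(self, index_csv):
        with open(index_csv) as fh:
            self.items = [(row[0], int(row[1])) for row in csv.reader(fh)]

    def __len__(self):
        return len(self.items)

    def __getitem__(self, i):
        path, label = self.items[i]
        patch = np.load(path).astype(np.float32)  # e.g., (bands, H, W)
        return torch.from_numpy(patch), label

# Usage: wrap in a DataLoader for mini-batch training.
# loader = torch.utils.data.DataLoader(LabeledPatchDataset("index.csv"),
#                                      batch_size=64, shuffle=True)
```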
NASA's Hyperwall Revealing the Big Picture
NASA Technical Reports Server (NTRS)
Sellers, Piers
2011-01-01
NASA's hyperwall is a sophisticated visualization tool used to display large datasets. The hyperwall, or video wall, is capable of displaying multiple high-definition data visualizations and/or images simultaneously across an arrangement of screens. Functioning as a key component at many NASA exhibits, the hyperwall is used to help explain phenomena, ideas, or examples of world change. The traveling version of the hyperwall is typically comprised of nine 42-50" flat-screen monitors arranged in a 3x3 array. However, it is not limited to this monitor size or number; screen sizes can be as large as 52" and the arrangement can include more than nine monitors. Generally, NASA satellite and model data are used to highlight particular themes in atmospheric, land, and ocean science. Many of the existing hyperwall stories reveal change across space and time, while others display large-scale still images accompanied by descriptive, story-telling captions. Hyperwall content on a variety of Earth Science topics already exists and is made available to the public at: eospso.gsfc.nasa.gov/hyperwall. Keynote and PowerPoint presentations as well as Summary of Story files are available for download for each existing topic. New hyperwall content and accompanying files will continue to be developed to promote scientific literacy across a diverse group of audience members. NASA invites the use of content accessible through this website but requests that users acknowledge any and all data sources referenced in the content being used.
Bi-Force: large-scale bicluster editing and its application to gene expression data biclustering.
Sun, Peng; Speicher, Nora K; Röttger, Richard; Guo, Jiong; Baumbach, Jan
2014-05-01
The explosion of biological data has dramatically reformed today's biological research. The need to integrate and analyze high-dimensional biological data on a large scale is driving the development of novel bioinformatics approaches. Biclustering, also known as 'simultaneous clustering' or 'co-clustering', has been successfully utilized to discover local patterns in gene expression data and similar biomedical data types. Here, we contribute a new heuristic, 'Bi-Force', based on the weighted bicluster editing model, which performs biclustering on arbitrary sets of biological entities, given any kind of pairwise similarities. We first evaluated the power of Bi-Force to solve dedicated bicluster editing problems by comparing Bi-Force with two existing algorithms in the BiCluE software package. We then followed the biclustering evaluation protocol of a recent review paper by Eren et al. (2013) (A comparative analysis of biclustering algorithms for gene expression data. Brief. Bioinform., 14:279-292) and compared Bi-Force against eight existing tools: FABIA, QUBIC, Cheng and Church, Plaid, BiMax, Spectral, xMOTIFs and ISA. To this end, a suite of synthetic datasets as well as nine large gene expression datasets from Gene Expression Omnibus were analyzed. All resulting biclusters were subsequently investigated by Gene Ontology enrichment analysis to evaluate their biological relevance. The distinct theoretical foundation of Bi-Force (bicluster editing) is more powerful than strict biclustering. We thus outperformed existing tools with Bi-Force, at least when following the evaluation protocols of Eren et al. Bi-Force is implemented in Java and integrated into the open source software package of BiCluE. The software as well as all used datasets are publicly available at http://biclue.mpi-inf.mpg.de. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
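Bi-Force itself is implemented in Java, but the general biclustering task it addresses can be illustrated in a few lines with scikit-learn's SpectralBiclustering on a synthetic expression matrix; this is a stand-in for the concept, not the weighted bicluster editing model.

```python
# Generic biclustering on a toy expression matrix using scikit-learn's
# SpectralBiclustering -- a stand-in illustration, not Bi-Force's
# weighted bicluster editing model.
import numpy as np
from sklearn.cluster import SpectralBiclustering
from sklearn.datasets import make_checkerboard

# Toy matrix with a planted checkerboard of 3x3 biclusters.
data, rows, cols = make_checkerboard(shape=(120, 80), n_clusters=(3, 3),
                                     noise=5, random_state=0)

model = SpectralBiclustering(n_clusters=(3, 3), method="log",
                             random_state=0)
model.fit(data)

# Each gene (row) and condition (column) receives a bicluster label.
print("row cluster sizes:", np.bincount(model.row_labels_))
print("column cluster sizes:", np.bincount(model.column_labels_))
```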
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-05
... Integrated Circuit Semiconductor Chips and Products Containing Same; Notice of Investigation AGENCY: U.S... of certain large scale integrated circuit semiconductor chips and products containing same by reason... alleges that an industry in the United States exists as required by subsection (a)(2) of section 337. The...
On the Large-Scaling Issues of Cloud-based Applications for Earth Science Data
NASA Astrophysics Data System (ADS)
Hua, H.
2016-12-01
Next-generation science data systems are needed to address the incoming flood of data from new missions such as NASA's SWOT and NISAR, whose SAR data volumes and data throughput rates are orders of magnitude larger than those of present-day missions. Existing missions, such as OCO-2, may also require fast turn-around times for processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the processing needs. Additionally, traditional on-premise hardware procurement is already limited by facilities capacity constraints for these new missions. Experience has shown that embracing efficient cloud computing approaches for large-scale science data systems requires more than just moving existing code to cloud environments: at large cloud scales, we need to deal with scaling and cost issues. We present our experiences deploying multiple instances of our hybrid-cloud computing science data system (HySDS) to support large-scale processing of Earth Science data products. We will explore optimization approaches to getting the best performance out of hybrid-cloud computing, as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer 75-90% cost savings but with an unpredictable computing environment driven by market forces.
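Spot-market processing of the kind mentioned above is typically driven programmatically. A minimal boto3 sketch of a one-time spot request is shown below; the AMI ID, instance type, key name, and price cap are hypothetical placeholders, not HySDS configuration.

```python
# Minimal sketch of requesting a one-time EC2 spot instance with boto3
# (hypothetical AMI ID, instance type, key name, and price cap).
import boto3

ec2 = boto3.client("ec2", region_name="us-west-2")
response = ec2.request_spot_instances(
    InstanceCount=1,
    Type="one-time",
    SpotPrice="0.25",  # bid cap in USD/hour; market price may be lower
    LaunchSpecification={
        "ImageId": "ami-0123456789abcdef0",   # hypothetical worker AMI
        "InstanceType": "c5.4xlarge",
        "KeyName": "my-keypair",
    },
)
request_id = response["SpotInstanceRequests"][0]["SpotInstanceRequestId"]
print("spot request submitted:", request_id)
```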
NASA Astrophysics Data System (ADS)
Newcomer, Adam
Increasing demand for electricity and an aging fleet of generators are the principal drivers behind an increasing need for a large amount of capital investments in the US electric power sector in the near term. The decisions (or lack thereof) by firms, regulators and policy makers in response to this challenge have long lasting consequences, incur large economic and environmental risks, and must be made despite large uncertainties about the future operating and business environment. Capital investment decisions are complex: rates of return are not guaranteed; significant uncertainties about future environmental legislation and regulations exist at both the state and national levels---particularly about carbon dioxide emissions; there is an increasing number of shareholder mandates requiring public utilities to reduce their exposure to potentially large losses from stricter environmental regulations; and there are significant concerns about electricity and fuel price levels, supplies, and security. Large scale, low carbon electricity generation facilities using coal, such as integrated gasification combined cycle (IGCC) facilities coupled with carbon capture and sequestration (CCS) technologies, have been technically proven but are unprofitable in the current regulatory and business environment where there is no explicit or implicit price on carbon dioxide emissions. The paper examines two separate scenarios that are actively discussed by policy and decision makers at corporate, state and national levels: a future US electricity system where coal plays a role; and one where the role of coal is limited or nonexistent. The thesis intends to provide guidance for firms and policy makers and outline applications and opportunities for public policies and for private investment decisions to limit financial risks of electricity generation capital investments under carbon constraints.
Numerical Simulations of Vortical Mode Stirring: Effects of Large Scale Shear and Strain
2015-09-30
M.-Pascale Lelong, NorthWest Research Associates. DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. The goal is to develop parameterizations that can be implemented in larger-scale ocean models, incorporating the effects of local ambient conditions including latitude. Results were presented in a talk at the Nonlinear Effects in Internal Waves Conference.
The Expanded Large Scale Gap Test
1987-03-01
NSWC TR 86-32: The Expanded Large Scale Gap Test, by T. P. Liddiard and D. Price, Research and Technology Department, March 1987. Approved for public release. The expanded test was developed to reduce the spread in the LSGT 50% gap value that arises with the worst charges, such as those with the highest or lowest densities.
Balkányi, László
2002-01-01
To develop information systems (IS) in the changing environment of the health sector, a simple but thorough model that avoids the techno-jargon of informatics can be useful to top management. A platform-neutral, extensible, transparent conceptual model should be established. Limitations of current methods lead to a simple but comprehensive mapping in the form of a three-dimensional cube. The three 'orthogonal' views are (a) organizational functionality, (b) organizational structures and (c) information technology. Each of the cube's sides is described according to its nature. This approach makes it possible to define any IS component as a certain point, layer or domain of the cube, and also enables management to label all IS components independently of any supplier(s) and/or any specific platform. The model handles changes in organizational structure, business functionality and the supporting information system independently of each other. Practical applications extend to (a) planning complex new ISs, (b) guiding the development of multi-vendor, multi-site ISs, (c) supporting large-scale public procurement procedures and the subsequent contracting and implementation phases by establishing a platform-neutral reference, and (d) keeping an exhaustive inventory of an existing large-scale system that handles the non-tangible aspects of the IS.
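The cube lends itself naturally to a concrete data structure in which every IS component is a point indexed by its coordinates along the three views. A brief sketch, with invented axis values rather than the paper's taxonomy:

```python
# A sketch of the three-dimensional cube model: each IS component is a
# point indexed by (functionality, structure, technology). Axis values
# here are invented examples, not the paper's taxonomy.
from dataclasses import dataclass

@dataclass(frozen=True)
class ISComponent:
    functionality: str   # organizational function served
    structure: str       # organizational unit involved
    technology: str      # IT layer, platform-neutral

inventory = [
    ISComponent("patient admission", "outpatient clinic", "application"),
    ISComponent("patient admission", "outpatient clinic", "database"),
    ISComponent("lab reporting", "central laboratory", "middleware"),
]

# Because the three axes are independent, an organizational change only
# re-labels one coordinate; components can be filtered along any axis.
admission = [c for c in inventory if c.functionality == "patient admission"]
print(len(admission), "components support patient admission")
```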
NASA Astrophysics Data System (ADS)
Williamson, C. E.; Weathers, K. C.; Knoll, L. B.; Brentrup, J.
2012-12-01
Recent rapid advances in sensor technology and cyberinfrastructure have enabled the development of numerous environmental observatories ranging from local networks at field stations and marine laboratories (FSML) to continental scale observatories such as the National Ecological Observatory Network (NEON) to global scale observatories such as the Global Lake Ecological Observatory Network (GLEON). While divergent goals underlie the initial development of these observatories, and they are often designed to serve different communities, many opportunities for synergies exist. In addition, the use of existing infrastructure may enhance the cost-effectiveness of building and maintaining large scale observatories. For example, FSMLs are established facilities with the staff and infrastructure to host sensor nodes of larger networks. Many field stations have existing staff and long-term databases as well as smaller sensor networks that are the product of a single or small group of investigators with a unique data management system embedded in a local or regional community. These field station based facilities and data are a potentially untapped gold mine for larger continental and global scale observatories; common ecological and environmental challenges centered on understanding the impacts of changing climate, land use, and invasive species often underlie these efforts. The purpose of this talk is to stimulate a dialog on the challenges of merging efforts across these different spatial and temporal scales, as well as addressing how to develop synergies among observatory networks with divergent roots and philosophical approaches. For example, FSMLs have existing long-term databases and facilities, while NEON has sparse past data but a well-developed template and closely coordinated team working in a coherent format across a continental scale. GLEON on the other hand is a grass-roots network of experts in science, information technology, and engineering with a common goal of building a scalable network around the world to understand and predict how lakes respond to global change. Creating synergies among networks at these divergent scales requires open discussions ranging from data collection and management to data serving and sharing. Coordination of these efforts can provide an additional opportunity to educate both students and the public in innovative new ways about the broader continental to global scale of ecological and environmental challenges that they have observed in their more local ecosystems.
MetaMap: An atlas of metatranscriptomic reads in human disease-related RNA-seq data.
Simon, L M; Karg, S; Westermann, A J; Engel, M; Elbehery, A H A; Hense, B; Heinig, M; Deng, L; Theis, F J
2018-06-12
With the advent of the age of big data in bioinformatics, large volumes of data and high-performance computing power enable researchers to perform re-analyses of publicly available datasets at an unprecedented scale. Ever more studies implicate the microbiome in both normal human physiology and a wide range of diseases. RNA sequencing technology (RNA-seq) is commonly used to infer global eukaryotic gene expression patterns under defined conditions, including human disease-related contexts, but its generic nature also enables the detection of microbial and viral transcripts. We developed a bioinformatic pipeline to screen existing human RNA-seq datasets for the presence of microbial and viral reads by re-inspecting the non-human-mapping read fraction. We validated this approach by recapitulating outcomes from 6 independent controlled infection experiments of cell line models and comparison with an alternative metatranscriptomic mapping strategy. We then applied the pipeline to close to 150 terabytes of publicly available raw RNA-seq data from >17,000 samples from >400 studies relevant to human disease using state-of-the-art high-performance computing systems. The resulting data of this large-scale re-analysis are made available in the presented MetaMap resource. Our results demonstrate that common human RNA-seq data, including those archived in public repositories, might contain valuable information to correlate microbial and viral detection patterns with diverse diseases. The presented MetaMap database thus provides a rich resource for hypothesis generation towards the role of the microbiome in human disease. Additionally, code to process new datasets and perform statistical analyses is made available at https://github.com/theislab/MetaMap.
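The pipeline's first step, re-inspecting the non-human-mapping read fraction, can be approximated by filtering a human-aligned BAM on the "unmapped" flag with pysam. The sketch below is one plausible re-implementation of that screening step, with hypothetical file names, not the authors' exact code.

```python
# Extract the non-human-mapping read fraction from a human-aligned BAM
# for downstream metatranscriptomic classification. One plausible
# re-implementation of the screening step, not the authors' exact code.
import pysam

with pysam.AlignmentFile("sample_vs_human.bam", "rb") as bam, \
     pysam.AlignmentFile("unmapped.bam", "wb", template=bam) as out:
    total = kept = 0
    for read in bam.fetch(until_eof=True):
        total += 1
        if read.is_unmapped:     # candidate microbial/viral read
            out.write(read)
            kept += 1

print(f"{kept}/{total} reads unmapped to human; "
      "classify them against microbial/viral references next")
```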
Science for What Public? Addressing Equity in American Science Museums and Science Centers
ERIC Educational Resources Information Center
Feinstein, Noah Weeth; Meshoulam, David
2014-01-01
Science museums and science centers exist (in large part) to bring science to the public. But what public do they serve? The challenge of equity is embodied by the gulf that separates a museum's actual public and the more diverse publics that comprise our society. Yet despite growing scholarly interest in museums and science centers, few…
Preferential pathways in complex fracture systems and their influence on large scale transport
NASA Astrophysics Data System (ADS)
Willmann, M.; Mañé, R.; Tyukhova, A.
2017-12-01
Many subsurface applications in complex fracture systems require large-scale predictions. Precise predictions are difficult because preferential pathways exist at different scales. The intrinsic complexity of fracture systems increases within fractured sedimentary formations, because the coupling between fractures and matrix must also be taken into account. This interplay between the fracture system and the sedimentary matrix is strongly controlled by the actual aperture of an individual fracture, and an effective aperture cannot easily be determined because of the preferential pathways along the fracture plane. We investigate the influence of these preferential pathways on large-scale solute transport and upscale the aperture. By explicitly modeling flow and particle tracking in individual fractures, we develop a new effective transport aperture weighted by the aperture along the preferential paths: a Lagrangian aperture. We show that this new aperture is consistently larger than existing definitions of effective flow and transport apertures. Finally, we apply our results to a fractured sedimentary formation in Northern Switzerland.
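One way to make the "Lagrangian aperture" idea concrete, in our own illustrative notation (the paper's precise definition may differ), is to average the local aperture along advective particle trajectories rather than over the fracture plane:

$$ b_{\mathrm{L}} \;=\; \Big\langle \frac{1}{\tau}\int_0^{\tau} b\big(\mathbf{x}_p(t)\big)\,\mathrm{d}t \Big\rangle_p, $$

where $\mathbf{x}_p(t)$ is the path of particle $p$ with travel time $\tau$, $b$ is the local aperture field, and $\langle\cdot\rangle_p$ averages over tracked particles. Because particles preferentially sample wide, fast channels, $b_{\mathrm{L}}$ exceeds arithmetic-mean and flux-weighted (Eulerian) apertures, consistent with the result stated above.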
Yap, Kien-Pong; Ho, Wing S; Gan, Han M; Chai, Lay C; Thong, Kwai L
2016-01-01
Typhoid fever, caused by Salmonella enterica serovar Typhi, remains an important public health burden in Southeast Asia and other endemic countries. Various genotyping methods have been applied to study the genetic variations of this human-restricted pathogen. Multilocus sequence typing (MLST) is one of the widely accepted methods, and recently, there is a growing interest in the re-application of MLST in the post-genomic era. In this study, we provide the global MLST distribution of S. Typhi utilizing 1,826 publicly available S. Typhi genome sequences in addition to performing conventional MLST on S. Typhi strains isolated from various endemic regions spanning over a century. Our global MLST analysis confirms the predominance of two sequence types (ST1 and ST2) co-existing in the endemic regions. Interestingly, S. Typhi strains with ST8 are currently confined within the African continent. Comparative genomic analyses of ST8 and other rare STs with genomes of ST1/ST2 revealed unique mutations in important virulence genes such as flhB, sipC, and tviD that may explain the variations that differentiate between seemingly successful (widespread) and unsuccessful (poor dissemination) S. Typhi populations. Large-scale whole-genome phylogeny demonstrated evidence of phylogeographical structuring and showed that ST8 may have diverged from the earlier ancestral population of ST1 and ST2, which later lost some of its fitness advantages, leading to poor worldwide dissemination. In response to the unprecedented increase in genomic data, this study demonstrates and highlights the utility of large-scale genome-based MLST as a quick and effective approach to narrow the scope of in-depth comparative genomic analysis and consequently provide new insights into the fine scale of pathogen evolution and population structure.
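For readers unfamiliar with MLST: a sequence type (ST) is simply a label for the combination of allele numbers observed at seven housekeeping loci. The toy sketch below shows the lookup logic; the allele profiles in the table are invented placeholders, not the real S. Typhi scheme.

```python
# Minimal sketch: assign an MLST sequence type from a 7-locus allele profile.
# The profile-to-ST table is illustrative only, not the actual scheme.
PROFILES = {
    (1, 1, 1, 1, 1, 1, 5): "ST1",
    (1, 1, 1, 1, 1, 1, 1): "ST2",
    (2, 1, 1, 1, 1, 1, 5): "ST8",
}

def assign_st(alleles: tuple) -> str:
    """Return the ST for a tuple of allele numbers, or flag a novel profile."""
    return PROFILES.get(alleles, "novel ST (submit profile to the MLST database)")

print(assign_st((1, 1, 1, 1, 1, 1, 5)))  # -> ST1
```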
Placebo use in vaccine trials: Recommendations of a WHO expert panel
Rid, Annette; Saxena, Abha; Baqui, Abdhullah H.; Bhan, Anant; Bines, Julie; Bouesseau, Marie-Charlotte; Caplan, Arthur; Colgrove, James; Dhai, Ames; Gomez-Diaz, Rita; Green, Shane K.; Kang, Gagandeep; Lagos, Rosanna; Loh, Patricia; London, Alex John; Mulholland, Kim; Neels, Pieter; Pitisuttithum, Punee; Sarr, Samba Cor; Selgelid, Michael; Sheehan, Mark; Smith, Peter G.
2014-01-01
Vaccines are among the most cost-effective interventions against infectious diseases. Many candidate vaccines targeting neglected diseases in low- and middle-income countries are now progressing to large-scale clinical testing. However, controversy surrounds the appropriate design of vaccine trials and, in particular, the use of unvaccinated controls (with or without placebo) when an efficacious vaccine already exists. This paper specifies four situations in which placebo use may be acceptable, provided that the study question cannot be answered in an active-controlled trial design; the risks of delaying or foregoing an efficacious vaccine are mitigated; the risks of using a placebo control are justified by the social and public health value of the research; and the research is responsive to local health needs. The four situations are: (1) developing a locally affordable vaccine, (2) evaluating the local safety and efficacy of an existing vaccine, (3) testing a new vaccine when an existing vaccine is considered inappropriate for local use (e.g. based on epidemiologic or demographic factors), and (4) determining the local burden of disease. PMID:24768580
Architecture and Programming Models for High Performance Intensive Computation
2016-06-29
Tarras-Wahlberg, N H
2002-06-01
This paper considers technical measures and policy initiatives needed to improve environmental management in the Portovelo-Zaruma mining district of southern Ecuador. In this area, gold is mined by a large number of small-scale and artisanal operators, and discharges of cyanide and metal-laden tailings have had a severe impact on the shared Ecuadorian-Peruvian Puyango river system. It is shown to be technically possible to confine mining waste and tailings at a reasonable cost. However, the complex topography of the mining district forces tailings management to be communal, where all operators are connected to one central tailings impoundment. This, in turn, implies two things: (i) that a large number of operators must agree to pool resources to bring such a facility into reality; and (ii) that miners must move away from rudimentary operations that survive on a day-to-day basis, towards bigger, mechanized and longer-term sustainable operations that are based on proven ore reserves. It is deemed unlikely that existing environmental regulations and the provision of technical solutions will be sufficient to resolve the environmental problems. Important impediments relate to the limited financial resources available to each individual miner and the problems of pooling these resources, and to the fact that the main impacts of pollution are suffered downstream of the mining district and, hence, do not affect the miners themselves. Three policy measures are therefore suggested. First, the enforcement of existing regulations must be improved, and this may be achieved by the strengthening of the central authority charged with supervision and control of mining activities. Second, local government involvement and local public participation in environmental management needs to be promoted. Third, a clear policy should be defined which promotes the reorganisation of small operations into larger units that are strong enough to sustain rational exploration and environmental obligations. The case study suggests that mining policy in lesser-developed countries should develop to enable small-scale and artisanal miners to form entities that are of a sufficiently large scale to allow adequate and cost-effective environmental protection.
ERIC Educational Resources Information Center
Strietholt, Rolf; Scherer, Ronny
2018-01-01
The present paper aims to discuss how data from international large-scale assessments (ILSAs) can be utilized and combined, even with other existing data sources, in order to monitor educational outcomes and study the effectiveness of educational systems. We consider different purposes of linking data, namely, extending outcomes measures,…
NASA Technical Reports Server (NTRS)
1985-01-01
The goal of defining a CO2 laser transmitter approach suited to Shuttle Coherent Atmospheric Lidar Experiment (SCALE) requirements is discussed. The adaptation of the existing WINDVAN system to the shuttle environment is addressed. The size, weight, reliability, and efficiency of the existing WINDVAN system are largely compatible with SCALE requirements. Repackaging is needed for compatibility with vacuum and thermal environments. Changes are required to ensure survival through launch and landing mechanical, vibration, and acoustic loads. Existing WINDVAN thermal management approaches, which depend on convection, need to be upgraded for zero-gravity operations.
2013-01-01
The Prostate, Lung, Colorectal, and Ovarian (PLCO) Cancer Screening Trial is a large-scale research effort conducted by the National Cancer Institute. PLCO offers an example of coordinated research by both the extramural and intramural communities of the National Institutes of Health. The purpose of this article is to describe the PLCO research resource and how it is managed and to assess the productivity and the costs associated with this resource. Such an in-depth analysis of a single large-scale project can shed light on questions such as how large-scale projects should be managed, what metrics should be used to assess productivity, and how costs can be compared with productivity metrics. A comprehensive publication analysis identified 335 primary research publications resulting from research using PLCO data and biospecimens from 2000 to 2012. By the end of 2012, a total of 9679 citations (excluding self-citations) had resulted from this body of research publications, with an average of 29.7 citations per article and an h index of 45, which is comparable with other large-scale studies, such as the Nurses’ Health Study. In terms of impact on public health, PLCO trial results have been used by the US Preventive Services Task Force in making recommendations concerning prostate and ovarian cancer screening. The overall cost of PLCO was $454 million over 20 years, adjusted to 2011 dollars, with approximately $37 million for the collection, processing, and storage of biospecimens, including blood samples, buccal cells, and pathology tissues. PMID:24115361
Hatala, Jeffrey J; Fields, Tina T
2015-05-01
Obesity rates in the southern US states are higher than in other states. Historically, large-scale community-based interventions in the United States have not proven successful. With local public health agencies (LPHAs) tasked with prevention, their role in obesity prevention is important, yet little research exists regarding what predicts the participation of LPHAs. Cross-sectional data from the 2008 National Association of City and County Health Officials profile study and two public health conceptual frameworks were used to assess structural and environmental predictors of LPHA participation in obesity prevention. The predictors were compared between southern and nonsouthern states. Univariate and weighted logistic regressions were performed. Analysis revealed that more LPHAs in southern states were engaged in nearly all of the 10 essential public health functions related to obesity prevention compared with nonsouthern states. Presence of community-based organizations and staffing levels were the only significant variables in two of the six logistic regression models. This study provides insights into the success rates of the obesity prevention efforts of LPHAs in southern and nonsouthern states. Future research is needed to understand why and how certain structural elements and any additional factors influence LPHA participation in obesity prevention.
Spatiotemporal property and predictability of large-scale human mobility
NASA Astrophysics Data System (ADS)
Zhang, Hai-Tao; Zhu, Tao; Fu, Dongfei; Xu, Bowen; Han, Xiao-Pu; Chen, Duxin
2018-04-01
Spatiotemporal characteristics of human mobility emerging from complexity on the individual scale have been extensively studied because of their potential applications in human behavior prediction and recommendation and in the control of epidemic spreading. We collect and investigate a comprehensive data set of human activities on large geographical scales, covering both website browsing and mobile tower visits. Numerical results show that the degree of activity decays as a power law, indicating that human behaviors are reminiscent of the scale-free random walks known as Lévy flights. More significantly, this study suggests that human activities on large geographical scales have specific non-Markovian characteristics, such as a two-segment power-law distribution of dwelling time and high predictability. Furthermore, a scale-free featured mobility model with two essential ingredients, i.e., preferential return and exploration, and a Gaussian distribution assumption on the exploration tendency parameter is proposed, which outperforms existing human mobility models under scenarios of large geographical scales.
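The proposed model belongs to the exploration/preferential-return family, so a hedged sketch of that mechanism is shown below; the parameter values and the Gaussian draw are illustrative, not the authors' fitted values.

```python
import numpy as np

def epr_trajectory(steps: int, rho: float = 0.6, gamma: float = 0.21,
                   sigma: float = 0.1, rng=np.random.default_rng(0)):
    """Exploration/preferential-return walk (illustrative parameters only).

    With probability rho_i * S**(-gamma) the walker explores a new location,
    where S is the number of distinct locations visited so far; otherwise it
    returns to a known location chosen proportionally to past visit counts.
    The individual exploration tendency rho_i is drawn from a Gaussian.
    """
    rho_i = max(1e-3, rng.normal(rho, sigma))  # Gaussian exploration tendency
    visits = {0: 1}          # location id -> visit count
    next_id, trace = 1, [0]
    for _ in range(steps):
        S = len(visits)
        if rng.random() < rho_i * S ** (-gamma):
            loc, next_id = next_id, next_id + 1        # explore a new location
        else:                                          # preferential return
            locs, counts = zip(*visits.items())
            loc = int(rng.choice(locs, p=np.array(counts) / sum(counts)))
        visits[loc] = visits.get(loc, 0) + 1
        trace.append(loc)
    return trace

print(epr_trajectory(20))
```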
PKI security in large-scale healthcare networks.
Mantas, Georgios; Lymberopoulos, Dimitrios; Komninos, Nikos
2012-06-01
During the past few years, many public key infrastructures (PKIs) have been proposed for healthcare networks in order to ensure secure communication services and exchange of data among healthcare professionals. However, these healthcare PKIs face a plethora of challenges, particularly when deployed over large-scale healthcare networks. In this paper, we propose a PKI to ensure security in a large-scale Internet-based healthcare network connecting a wide spectrum of healthcare units geographically distributed within a wide region. Furthermore, the proposed PKI addresses the trust issues that arise in a large-scale healthcare network comprising multiple PKI domains.
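As one concrete ingredient of such an infrastructure, the sketch below checks a single link of a certificate chain with the Python cryptography package. It assumes RSA certificates with PKCS#1 v1.5 signatures; a production healthcare PKI would additionally need full path building, revocation checking, and, for multi-domain trust, bridge or cross-certificates.

```python
from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import padding

def verify_link(cert_pem: bytes, issuer_pem: bytes) -> bool:
    """Check that `cert` was signed by `issuer` (RSA certificates assumed)."""
    cert = x509.load_pem_x509_certificate(cert_pem)
    issuer = x509.load_pem_x509_certificate(issuer_pem)
    try:
        issuer.public_key().verify(
            cert.signature,
            cert.tbs_certificate_bytes,      # the signed portion of the cert
            padding.PKCS1v15(),
            cert.signature_hash_algorithm,
        )
        return True
    except Exception:  # InvalidSignature, unexpected key type, etc.
        return False
```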
Haugum, Mona; Danielsen, Kirsten; Iversen, Hilde Hestad; Bjertnaes, Oyvind
2014-12-01
An important goal for national and large-scale surveys of user experiences is quality improvement. However, large-scale surveys are normally conducted by a professional external surveyor, creating an institutionalized division between the measurement of user experiences and the quality work that is performed locally. The aim of this study was to identify and describe scientific studies related to the use of national and large-scale surveys of user experiences in local quality work. Ovid EMBASE, Ovid MEDLINE, Ovid PsycINFO and the Cochrane Database of Systematic Reviews were searched for scientific publications on user experiences and satisfaction that address the extent to which data from national and other large-scale user-experience surveys are used for local quality work in the health services. Themes of interest were identified and a narrative analysis was undertaken. Thirteen publications were included; they differed substantially in several characteristics. The results show that large-scale surveys of user experiences are used in local quality work. The types of follow-up activity varied considerably, from conducting a follow-up analysis of user-experience survey data to information sharing and more systematic efforts to use the data as a basis for improving the quality of care. This review shows that large-scale surveys of user experiences are used in local quality work. However, there is a need for more, better and standardized research in this field. The considerable variation in follow-up activities points to the need for systematic guidance on how to use data in local quality work. © The Author 2014. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.
Possible evidence for the existence of antimatter on a cosmological scale in the universe.
NASA Technical Reports Server (NTRS)
Stecker, F. W.; Morgan, D. L., Jr.; Bredekamp, J.
1971-01-01
Initial results are presented of a detailed calculation of the cosmological gamma-ray spectrum from matter-antimatter annihilation in the universe. The similarity between the calculated spectrum and the present observations of the gamma-ray background spectrum above 1 MeV suggests that such observations may be evidence of the existence of antimatter on a large scale in the universe.
Generation of large-scale magnetic fields by small-scale dynamo in shear flows
NASA Astrophysics Data System (ADS)
Squire, Jonathan; Bhattacharjee, Amitava
2015-11-01
A new mechanism for turbulent mean-field dynamo is proposed, in which the magnetic fluctuations resulting from a small-scale dynamo drive the generation of large-scale magnetic fields. This is in stark contrast to the common idea that small-scale magnetic fields should be harmful to large-scale dynamo action. These dynamos occur in the presence of large-scale velocity shear and do not require net helicity, resulting from off-diagonal components of the turbulent resistivity tensor as the magnetic analogue of the ``shear-current'' effect. The dynamo is studied using a variety of computational and analytic techniques, both when the magnetic fluctuations arise self-consistently through the small-scale dynamo and in lower Reynolds number regimes. Given the inevitable existence of non-helical small-scale magnetic fields in turbulent plasmas, as well as the generic nature of velocity shear, the suggested mechanism may help to explain generation of large-scale magnetic fields across a wide range of astrophysical objects. This work was supported by a Procter Fellowship at Princeton University, and the US Department of Energy Grant DE-AC02-09-CH11466.
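A schematic of the shear-current coupling described above, in our own illustrative notation rather than the paper's exact formulation: for a mean shear flow $\bar{U} = S x\,\hat{y}$ and horizontally averaged field $\bar{\mathbf{B}}(z,t)$, the minimal mean-field equations read

$$ \partial_t \bar{B}_x = -\eta_{yx}\,\partial_z^2 \bar{B}_y + \eta_t\,\partial_z^2 \bar{B}_x, \qquad \partial_t \bar{B}_y = S\,\bar{B}_x + \eta_t\,\partial_z^2 \bar{B}_y. $$

For Fourier modes $\propto e^{ikz}$ this yields a growth rate $\gamma \simeq |k|\sqrt{|S\,\eta_{yx}|} - \eta_t k^2$ when the product $S\,\eta_{yx}$ has the sign required for growth (the sign convention depends on how $\eta_{yx}$ is defined), so a nonzero off-diagonal turbulent resistivity can amplify the mean field without any net helicity.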
The lost summer: Community experiences of large wildfires in Trinity County, California
Emily J. Davis; Cassandra Moseley; Pamela Jakes; Max Nielsen-Pincus
2011-01-01
As wildfires are increasing in scale and duration, and communities are increasingly located where these wildfires are occurring, we need a clearer understanding of how large wildfires affect economic and social well being. These wildfires can have complex impacts on rural public lands communities. They can threaten homes, public health, and livelihoods. Wildfires can...
2014-01-01
Background: Food-borne Salmonella infections are a worldwide concern. During a large-scale outbreak, it is important that the public follows preventive advice. To increase compliance, insight into how the public gathers its knowledge and which factors determine whether or not an individual complies with preventive advice is crucial. Methods: In 2012, contaminated salmon caused a large Salmonella Thompson outbreak in the Netherlands. During the outbreak, we conducted an online survey (n = 1,057) to assess the general public’s perceptions, knowledge, preventive behavior and sources of information. Results: Respondents perceived Salmonella infections and the 2012 outbreak as severe (m = 4.21; five-point scale with 5 as severe). Their knowledge regarding common food sources, the incubation period and regular treatment of Salmonella (gastro-enteritis) was relatively low (e.g., only 28.7% knew that Salmonella is not normally treated with antibiotics). Preventive behavior differed widely, and the majority (64.7%) did not check for contaminated salmon at home. Most information about the outbreak was gathered through traditional media and news and newspaper websites. This was mostly determined by time spent on the medium. Social media played a marginal role. Wikipedia seemed a potentially important source of information. Conclusions: To persuade the public to take preventive actions, public health organizations should deliver their message primarily through mass media. Wikipedia seems a promising instrument for educating the public about food-borne Salmonella. PMID:24479614
2015-09-30
Large Scale Density Estimation of Blue and Fin Whales: Utilizing Sparse Array Data to Develop and Implement a New Method for Estimating Blue and Fin Whale Density. Len Thomas & Danielle Harris, Centre… The goal is to develop and implement a new method for estimating blue and fin whale density that is effective over large spatial scales and is designed to cope…
NASA Astrophysics Data System (ADS)
Parajuli, Sagar Prasad; Yang, Zong-Liang; Lawrence, David M.
2016-06-01
Large amounts of mineral dust are injected into the atmosphere during dust storms, which are common in the Middle East and North Africa (MENA) where most of the global dust hotspots are located. In this work, we present simulations of dust emission using the Community Earth System Model Version 1.2.2 (CESM 1.2.2) and evaluate how well it captures the spatio-temporal characteristics of dust emission in the MENA region with a focus on large-scale dust storm mobilization. We explicitly focus our analysis on the model's two major input parameters that affect the vertical mass flux of dust: surface winds and the soil erodibility factor. We analyze dust emissions in simulations with both prognostic CESM winds and with CESM winds that are nudged towards ERA-Interim reanalysis values. Simulations with three existing erodibility maps and a new observation-based erodibility map are also conducted. We compare the simulated results with MODIS satellite data, MACC reanalysis data, AERONET station data, and CALIPSO 3-d aerosol profile data. The dust emission simulated by CESM, when driven by nudged reanalysis winds, compares reasonably well with observations on daily to monthly time scales despite CESM being a global General Circulation Model. However, considerable bias exists around known high dust source locations in northwest/northeast Africa and over the Arabian Peninsula where recurring large-scale dust storms are common. The new observation-based erodibility map, which can represent anthropogenic dust sources that are not directly represented by existing erodibility maps, shows improved performance in terms of the simulated dust optical depth (DOD) and aerosol optical depth (AOD) compared to existing erodibility maps although the performance of different erodibility maps varies by region.
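For orientation, the two inputs discussed above typically enter dust emission schemes multiplicatively. One widely used saltation-based form (after Marticorena and Bergametti, 1995; shown here as a sketch, not necessarily CESM's exact implementation) is

$$ F \;\propto\; S\,\frac{\rho_a}{g}\,u_*^3 \left(1 + \frac{u_{*t}}{u_*}\right)\left(1 - \frac{u_{*t}^2}{u_*^2}\right), \qquad u_* > u_{*t}, $$

with $F$ the vertical dust mass flux, $S$ the soil erodibility factor, $\rho_a$ the air density, $u_*$ the friction velocity derived from the surface winds, and $u_{*t}$ the threshold friction velocity. The cubic wind dependence is why biases in simulated storm winds dominate the emission error, motivating the nudged-wind experiments above.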
NASA Technical Reports Server (NTRS)
Zapata, R. N.; Humphris, R. R.; Henderson, K. C.
1974-01-01
Based on the premises that (1) magnetic suspension techniques can play a useful role in large-scale aerodynamic testing and (2) superconductor technology offers the only practical hope for building large-scale magnetic suspensions, an all-superconductor three-component magnetic suspension and balance facility was built as a prototype and was tested successfully. Quantitative extrapolations of design and performance characteristics of this prototype system to larger systems compatible with existing and planned high Reynolds number facilities have been made and show that this experimental technique should be particularly attractive when used in conjunction with large cryogenic wind tunnels.
NASA Technical Reports Server (NTRS)
Zapata, R. N.; Humphris, R. R.; Henderson, K. C.
1975-01-01
Based on the premises that magnetic suspension techniques can play a useful role in large-scale aerodynamic testing, and that superconductor technology offers the only practical hope for building large-scale magnetic suspensions, an all-superconductor 3-component magnetic suspension and balance facility was built as a prototype and tested successfully. Quantitative extrapolations of design and performance characteristics of this prototype system to larger systems compatible with existing and planned high Reynolds number facilities at Langley Research Center were made and show that this experimental technique should be particularly attractive when used in conjunction with large cryogenic wind tunnels.
Investigating a link between large and small-scale chaos features on Europa
NASA Astrophysics Data System (ADS)
Tognetti, L.; Rhoden, A.; Nelson, D. M.
2017-12-01
Chaos is one of the most recognizable, and studied, features on Europa's surface. Most models of chaos formation invoke liquid water at shallow depths within the ice shell; the liquid destabilizes the overlying ice layer, breaking it into mobile rafts and destroying pre-existing terrain. This class of model has been applied to both large-scale chaos like Conamara and small-scale features (i.e. microchaos), which are typically <10 km in diameter. Currently unknown, however, is whether both large-scale and small-scale features are produced together, e.g. through a network of smaller sills linked to a larger liquid water pocket. If microchaos features do form as satellites of large-scale chaos features, we would expect a drop off in the number density of microchaos with increasing distance from the large chaos feature; the trend should not be observed in regions without large-scale chaos features. Here, we test the hypothesis that large chaos features create "satellite" systems of smaller chaos features. Either outcome will help us better understand the relationship between large-scale chaos and microchaos. We focus first on regions surrounding the large chaos features Conamara and Murias (e.g. the Mitten). We map all chaos features within 90,000 sq km of the main chaos feature and assign each one a ranking (High Confidence, Probable, or Low Confidence) based on the observed characteristics of each feature. In particular, we look for a distinct boundary, loss of preexisting terrain, the existence of rafts or blocks, and the overall smoothness of the feature. We also note features that are chaos-like but lack sufficient characteristics to be classified as chaos. We then apply the same criteria to map microchaos features in regions of similar area (~90,000 sq km) that lack large chaos features. By plotting the distribution of microchaos with distance from the center point of the large chaos feature or the mapping region (for the cases without a large feature), we determine whether there is a distinct signature linking large-scale chaos features with nearby microchaos. We discuss the implications of these results on the process of chaos formation and the extent of liquid water within Europa's ice shell.
ERIC Educational Resources Information Center
Feuer, Michael J.
2011-01-01
Few arguments about education are as effective at galvanizing public attention and motivating political action as those that compare the performance of students with their counterparts in other countries and that connect academic achievement to economic performance. Because data from international large-scale assessments (ILSA) have a powerful…
Influencing Public School Policy in the United States: The Role of Large-Scale Assessments
ERIC Educational Resources Information Center
Schmidt, William H.; Burroughs, Nathan A.
2016-01-01
The authors review the influence of state, national and international large-scale assessments (LSAs) on education policy and research. They distinguish between two main uses of LSAs: as a means for conducting research that informs educational reform and LSAs as a tool for implementing standards and enforcing accountability. The authors discuss the…
The changing role and legitimate boundaries of epidemiology: community-based prevention programmes.
Tuomilehto, J; Puska, P
1987-01-01
Epidemiology is the basic science of public health. It combines medical and social sciences, both of which are developing with new inventions. Therefore, the role of epidemiology and its boundaries are also changing over time. An important role of epidemiology is to develop and implement community-based control programmes for major diseases in the community. Such programmes are essential for large-scale public health policy. It is necessary that epidemiological research be able to test new methods of disease prevention and health promotion as freely as possible. The first community-based control programme for cardiovascular diseases, the North Karelia Project, is reviewed against this background. At present, it is still possible to define the boundaries of epidemiology geographically and culturally; in the future, however, it will become more difficult. There is no doubt that epidemiology will remain the basic science of public health, but the scope of public health problems is growing much wider. These include the prevention of the final epidemic--the destruction of our planet by nuclear bombs. In the control of existing epidemics and in the prevention of new ones, the boundaries of epidemiology cannot stay rigid; they must change as new facts about emerging public health problems are identified.
ERIC Educational Resources Information Center
Abramovich, Samuel; Schunn, Christian
2012-01-01
Ultra-large-scale interactive systems on the Internet have begun to change how teachers prepare for instruction, particularly in regards to resource selection. Consequently, it is important to look at how teachers are currently selecting resources beyond content or keyword search. We conducted a two-part observational study of an existing popular…
ERIC Educational Resources Information Center
Li, Ying; Jiao, Hong; Lissitz, Robert W.
2012-01-01
This study investigated the application of multidimensional item response theory (IRT) models to validate test structure and dimensionality. Multiple content areas or domains within a single subject often exist in large-scale achievement tests. Such areas or domains may cause multidimensionality or local item dependence, which both violate the…
Implicitly restarted Arnoldi/Lanczos methods for large scale eigenvalue calculations
NASA Technical Reports Server (NTRS)
Sorensen, Danny C.
1996-01-01
Eigenvalues and eigenfunctions of linear operators are important to many areas of applied mathematics. The ability to approximate these quantities numerically is becoming increasingly important in a wide variety of applications. This increasing demand has fueled interest in the development of new methods and software for the numerical solution of large-scale algebraic eigenvalue problems. In turn, the existence of these new methods and software, along with the dramatically increased computational capabilities now available, has enabled the solution of problems that would not even have been posed five or ten years ago. Until very recently, software for large-scale nonsymmetric problems was virtually non-existent. Fortunately, the situation is improving rapidly. The purpose of this article is to provide an overview of the numerical solution of large-scale algebraic eigenvalue problems. The focus will be on a class of methods called Krylov subspace projection methods. The well-known Lanczos method is the premier member of this class. The Arnoldi method generalizes the Lanczos method to the nonsymmetric case. A recently developed variant of the Arnoldi/Lanczos scheme called the Implicitly Restarted Arnoldi Method is presented here in some depth. This method is highlighted because of its suitability as a basis for software development.
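The Implicitly Restarted Arnoldi Method described here underlies the ARPACK library, which SciPy exposes as scipy.sparse.linalg.eigs; a minimal usage sketch on a random sparse nonsymmetric matrix:

```python
import numpy as np
from scipy.sparse import random as sprandom
from scipy.sparse.linalg import eigs  # ARPACK: implicitly restarted Arnoldi

# A large sparse nonsymmetric matrix (random, for illustration only).
A = sprandom(5000, 5000, density=1e-3, format="csr", random_state=0)

# Compute the 6 eigenvalues of largest magnitude and their eigenvectors,
# without ever forming a dense 5000 x 5000 matrix.
vals, vecs = eigs(A, k=6, which="LM")
print(np.sort_complex(vals))
```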
Katul, Gabriel G; Porporato, Amilcare; Nikora, Vladimir
2012-12-01
The existence of a "-1" power-law scaling at low wavenumbers in the longitudinal velocity spectrum of wall-bounded turbulence has been explained by multiple mechanisms; however, experimental support has not been uniform across laboratory studies. This letter shows that Heisenberg's eddy-viscosity approach can provide a theoretical framework that bridges these multiple mechanisms and explains the elusiveness of the "-1" power law in some experiments. Novel theoretical outcomes are conjectured about the role of intermittency and very-large-scale motions in modifying the k⁻¹ scaling.
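Written out, the scaling in question is commonly expressed as

$$ E_{11}(k) \;=\; C\,u_*^{2}\,k^{-1}, \qquad \frac{1}{\delta} \;\ll\; k \;\ll\; \frac{1}{z}, $$

where $E_{11}$ is the longitudinal velocity spectrum, $u_*$ the friction velocity, $z$ the wall distance, $\delta$ the outer length scale, and $C$ an order-one constant. The narrowness of this overlap range at laboratory Reynolds numbers is one reason the power law can be hard to observe.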
NASA Astrophysics Data System (ADS)
Rassat, A.; Starck, J.-L.; Dupé, F.-X.
2013-09-01
Context. Although there is currently a debate over the significance of the claimed large-scale anomalies in the cosmic microwave background (CMB), their existence is not totally dismissed. In parallel to the debate over their statistical significance, recent work has also focussed on masks and secondary anisotropies as potential sources of these anomalies. Aims: In this work we investigate simultaneously the impact of the method used to account for masked regions as well as the impact of the integrated Sachs-Wolfe (ISW) effect, which is the large-scale secondary anisotropy most likely to affect the CMB anomalies. In this sense, our work is an update of previous works. Our aim is to identify trends in CMB data from different years and with different mask treatments. Methods: We reconstruct the ISW signal due to 2 Micron All-Sky Survey (2MASS) and NRAO VLA Sky Survey (NVSS) galaxies, effectively reconstructing the low-redshift ISW signal out to z ~ 1. We account for regions of missing data using the sparse inpainting technique. We test sparse inpainting of the CMB, large scale structure and ISW and find that it constitutes a bias-free reconstruction method suitable to study large-scale statistical isotropy and the ISW effect. Results: We focus on three large-scale CMB anomalies: the low quadrupole, the quadrupole/octopole alignment, and the octopole planarity. After sparse inpainting, the low quadrupole becomes more anomalous, whilst the quadrupole/octopole alignment becomes less anomalous. The significance of the low quadrupole is unchanged after subtraction of the ISW effect, while the trend amongst the CMB maps is that both the low quadrupole and the quadrupole/octopole alignment have reduced significance, yet other hypotheses remain possible as well (e.g. exotic physics). Our results also suggest that both of these anomalies may be due to the quadrupole alone. While the octopole planarity significance is reduced after inpainting and after ISW subtraction, we do not find that it was very anomalous to start with. In the spirit of participating in reproducible research, we make all codes and resulting products which constitute the main results of this paper public here: http://www.cosmostat.org/anomaliesCMB.html
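The low-quadrupole statistic discussed above can be illustrated with healpy. A minimal sketch, assuming a full-sky inpainted map stored in a hypothetical file cmb_inpainted.fits (the file name and units are placeholders, and this is not the authors' pipeline):

```python
import numpy as np
import healpy as hp

cmb = hp.read_map("cmb_inpainted.fits")   # hypothetical inpainted map (e.g. in muK)
cl = hp.anafast(cmb, lmax=10)             # angular power spectrum C_l, l = 0..10

ell = np.arange(cl.size)
dl = ell * (ell + 1) * cl / (2 * np.pi)   # the usual D_l convention
print("quadrupole D_2 =", dl[2])          # compare against the LCDM prediction
```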
Self-similar solutions of stationary Navier-Stokes equations
NASA Astrophysics Data System (ADS)
Shi, Zuoshunhua
2018-02-01
In this paper, we mainly study the existence of self-similar solutions of the stationary Navier-Stokes equations for dimension $n = 3, 4$. For $n = 3$, if the external force is axisymmetric, scaling invariant, $C^{1,\alpha}$ continuous away from the origin and small enough on the sphere $S^2$, we shall prove that there exists a family of axisymmetric self-similar solutions which can be arbitrarily large in the class $C^{3,\alpha}_{\mathrm{loc}}(\mathbb{R}^3 \setminus \{0\})$. Moreover, for axisymmetric external forces without swirl, corresponding to this family, the momentum flux of the flow along the symmetry axis can take any real number. However, there are no regular ($U \in C^{3,\alpha}_{\mathrm{loc}}(\mathbb{R}^3 \setminus \{0\})$) axisymmetric self-similar solutions provided that the external force is a large multiple of some scaling-invariant axisymmetric $F$ which cannot be driven by a potential. In the case of dimension 4, there always exists at least one self-similar solution to the stationary Navier-Stokes equations with any scaling-invariant external force in $L^{4/3,\infty}(\mathbb{R}^4)$.
CFD Script for Rapid TPS Damage Assessment
NASA Technical Reports Server (NTRS)
McCloud, Peter
2013-01-01
This grid-generation script creates unstructured CFD grids for rapid thermal protection system (TPS) damage aeroheating assessments. The existing manual solution is cumbersome, open to errors, and slow. The invention takes a large-scale geometry grid and its large-scale CFD solution, and creates an unstructured patch grid that models the TPS damage. The flow-field boundary condition for the patch grid is then interpolated from the large-scale CFD solution. This speeds up the generation of CFD grids and solutions in the modeling of TPS damage and its aeroheating assessment. The process was successfully utilized during STS-134.
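A hedged sketch of the interpolation step described above, using SciPy rather than the actual NASA tooling; the grids and field values are randomly generated stand-ins.

```python
import numpy as np
from scipy.interpolate import griddata

# Hypothetical inputs: nodes of the large-scale grid with a solved field,
# and boundary nodes of the local TPS-damage patch grid.
rng = np.random.default_rng(0)
coarse_xyz = rng.random((20000, 3))            # large-scale grid nodes
coarse_T = 1000.0 + 500.0 * coarse_xyz[:, 0]   # e.g. a temperature field
patch_boundary = rng.random((500, 3))          # patch-grid boundary nodes

# Interpolate the large-scale solution onto the patch boundary to serve as
# the flow-field boundary condition, with nearest-neighbour fallback for
# query points that fall outside the convex hull of the coarse nodes.
bc = griddata(coarse_xyz, coarse_T, patch_boundary, method="linear")
outside = np.isnan(bc)
bc[outside] = griddata(coarse_xyz, coarse_T,
                       patch_boundary[outside], method="nearest")
```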
Li, Zhijin; Vogelmann, Andrew M.; Feng, Sha; ...
2015-01-20
We produce fine-resolution, three-dimensional fields of meteorological and other variables for the U.S. Department of Energy’s Atmospheric Radiation Measurement (ARM) Southern Great Plains site. The Community Gridpoint Statistical Interpolation system is implemented in a multiscale data assimilation (MS-DA) framework that is used within the Weather Research and Forecasting model at a cloud-resolving resolution of 2 km. The MS-DA algorithm uses existing reanalysis products and constrains fine-scale atmospheric properties by assimilating high-resolution observations. A set of experiments shows that the data assimilation analysis realistically reproduces the intensity, structure, and time evolution of clouds and precipitation associated with a mesoscale convective system. Evaluations also show that the large-scale forcing derived from the fine-resolution analysis has an overall accuracy comparable to the existing ARM operational product. For enhanced applications, the fine-resolution fields are used to characterize the contribution of subgrid variability to the large-scale forcing and to derive hydrometeor forcing, which are presented in companion papers.
Relay discovery and selection for large-scale P2P streaming.
Zhang, Chengwei; Wang, Angela Yunxian; Hei, Xiaojun
2017-01-01
In peer-to-peer networks, application relays have been commonly used to provide various networking services. The service performance often improves significantly if a relay is selected appropriately based on its network location. In this paper, we studied the location-aware relay discovery and selection problem for large-scale P2P streaming networks. In these large-scale and dynamic overlays, it incurs significant communication and computation cost to discover a sufficiently large relay candidate set and further to select one relay with good performance. The network location can be measured directly or indirectly with the tradeoffs between timeliness, overhead and accuracy. Based on a measurement study and the associated error analysis, we demonstrate that indirect measurements, such as King and Internet Coordinate Systems (ICS), can only achieve a coarse estimation of peers' network location and those methods based on pure indirect measurements cannot lead to a good relay selection. We also demonstrate that there exists significant error amplification of the commonly used "best-out-of-K" selection methodology using three RTT data sets publicly available. We propose a two-phase approach to achieve efficient relay discovery and accurate relay selection. Indirect measurements are used to narrow down a small number of high-quality relay candidates and the final relay selection is refined based on direct probing. This two-phase approach enjoys an efficient implementation using the Distributed-Hash-Table (DHT). When the DHT is constructed, the node keys carry the location information and they are generated scalably using indirect measurements, such as the ICS coordinates. The relay discovery is achieved efficiently utilizing the DHT-based search. We evaluated various aspects of this DHT-based approach, including the DHT indexing procedure, key generation under peer churn and message costs.
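A minimal sketch of the two-phase selection logic described above; probe_rtt stands in for a direct network probe, and the coordinates stand in for ICS estimates. Names and the shortlist size are illustrative.

```python
import math
from typing import Callable, Dict, Tuple

Coord = Tuple[float, float]

def select_relay(my_coord: Coord, candidates: Dict[str, Coord],
                 probe_rtt: Callable[[str], float], k: int = 5) -> str:
    """Phase 1: shortlist k peers by coarse coordinate distance (indirect).
    Phase 2: refine by direct RTT probes of the shortlist only, so the
    probing cost stays O(k) rather than O(len(candidates))."""
    shortlist = sorted(
        candidates,
        key=lambda peer: math.dist(my_coord, candidates[peer]),
    )[:k]
    return min(shortlist, key=probe_rtt)
```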
[Development of the role scale for municipal supervising public health nurses].
Hatono, Yoko; Suzuki, Hiroko; Masaki, Naoko
2013-05-01
As public health nurses are becoming increasingly decentralized in municipalities, recommendations for allocating supervising public health nurses are being made. This study aimed to develop a scale for measuring the implementation of the role of municipal supervising public health nurses and to test its reliability and validity. Scale items were developed using results of a qualitative inductive analysis of interview data, and the items were then revised following an examination of content validity by experts, resulting in a provisional scale of 17 items. A self-administered, written questionnaire was then completed by supervising public health nurses or public health nurses holding the most senior positions in all municipalities nationwide, with the exception of three prefectures in the Tohoku region (total 1,621 locations). In total, 1,036 responses were received, and 931 were used for analysis (valid response rate = 57.4%). Of these, 406 were completed by supervising public health nurses. After deleting one item as a result of item analysis and conducting principal component analysis, factor analysis was conducted using the major factor method and Promax rotation. One item with high loading on multiple factors was deleted, resulting in a scale comprising 15 items and 3 factors. The cumulative contribution ratio was 56.10%. The three factors were labeled "Promotion of health activities across the whole locality," "Coordination as a PHN role leader," and "Development of the skills of public health nurses". The reliability coefficient of the RMSP (Role Scale for Municipal Supervising Public Health Nurses) as a whole was 0.84 using the split-half method (Spearman-Brown formula) and 0.91 using Cronbach's alpha, confirming internal consistency. In terms of validity, the correlation of RMSP scores with two criterion measures (strength of awareness of the role as a supervising public health nurse and confidence as a supervising public health nurse) and with scores on existing scales assessing management abilities was examined, and a significant correlation (P < 0.01) was obtained. Additionally, in areas without supervising public health nurses, the RMSP scores of local public health nurses, matched by rank and years of service, were compared with those of supervising public health nurses; the supervising public health nurses scored higher. The developed scale was found to be reliable and valid for measuring the implementation of supervising public health nurses' role.
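For reference, the split-half figure quoted above uses the Spearman-Brown correction,

$$ r_{SB} = \frac{2\,r_{12}}{1 + r_{12}}, $$

where $r_{12}$ is the correlation between the two halves of the test; the reported $r_{SB} = 0.84$ corresponds to a half-test correlation of about $r_{12} \approx 0.72$.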
Think global, act local: Preserving the global commons
Hauser, Oliver P.; Hendriks, Achim; Rand, David G.; Nowak, Martin A.
2016-01-01
Preserving global public goods, such as the planet’s ecosystem, depends on large-scale cooperation, which is difficult to achieve because the standard reciprocity mechanisms weaken in large groups. Here we demonstrate a method by which reciprocity can maintain cooperation in a large-scale public goods game (PGG). In a first experiment, participants in groups of on average 39 people play one round of a Prisoner’s Dilemma (PD) with their two nearest neighbours on a cyclic network after each PGG round. We observe that people engage in “local-to-global” reciprocity, leveraging local interactions to enforce global cooperation: Participants reduce PD cooperation with neighbours who contribute little in the PGG. In response, low PGG contributors increase their contributions if both neighbours defect in the PD. In a control condition, participants do not know their neighbours’ PGG contribution and thus cannot link play in the PD to the PGG. In the control we observe a sharp decline of cooperation in the PGG, while in the treatment condition global cooperation is maintained. In a second experiment, we demonstrate the scalability of this effect: in a 1,000-person PGG, participants in the treatment condition successfully sustain public contributions. Our findings suggest that this simple “local-to-global” intervention facilitates large-scale cooperation. PMID:27808222
ERIC Educational Resources Information Center
Säljö, Roger; Radišic, Jelena
2018-01-01
Public discussion on the quality of education in different corners of the world very much relies on the data provided by the international large-scale assessment (ILSA) studies. While aware of different methodological keystones and technicalities embedded in these, the idea behind this special issue is to contribute to the understanding of how…
ERIC Educational Resources Information Center
Tobin, Kenneth; Fraser, Barry J.
Large scale assessments of educational progress can be useful tools to judge the effectiveness of educational programs and assessments. This document contains papers presented at the research seminar on this topic held at the Western Australian Institute of Technology in November, 1984. It is the fifth in a series of publications of papers…
ERIC Educational Resources Information Center
Flanagan, Helen E.; Perry, Adrienne; Freeman, Nancy L.
2012-01-01
File review data were used to explore the impact of a large-scale publicly funded Intensive Behavioral Intervention (IBI) program for young children with autism. Outcomes were compared for 61 children who received IBI and 61 individually matched children from a waitlist comparison group. In addition, predictors of better cognitive outcomes were…
In Search of the Eco-Teacher: Public School Edition
ERIC Educational Resources Information Center
Blenkinsop, Sean
2014-01-01
This paper uses an innovative building-less Canadian public elementary school and its accompanying large-scale research project to consider the characteristics that might be required of a teacher interested in working in an emergent, environmental, place- and community-based experiential public school setting. The six characteristics considered…
Public Education Finances: 1949-1985.
ERIC Educational Resources Information Center
Denzau, Arthur
As a part of an overall study of large-scale communications satellite systems for education, an estimate was made of the amount of money available for media-technology for the next five to fifteen years. Information was gathered on public educational expenditures in the United States. Public elementary and secondary school expenditures were…
Relationship between Recall of Sex Education and College Students' Sexual Attitudes and Behavior
ERIC Educational Resources Information Center
Walcott, Christy M.; Chenneville, Tiffany; Tarquini, Sarah
2011-01-01
Sexuality education programs have been prevalent in U.S. public schools for decades (Cornblatt, 2009). Although strong public support exists for school-based sexuality education (e.g., Albert, 2007; Dailard, 2001), there is a great divide among parents, policy makers, and the public at large regarding how to prevent sexual activity (and its…
Mother Nature versus human nature: public compliance with evacuation and quarantine.
Manuell, Mary-Elise; Cukor, Jeffrey
2011-04-01
Effectively controlling the spread of contagious illnesses has become a critical focus of disaster planning. It is likely that quarantine will be a key part of the overall public health strategy utilised during a pandemic, an act of bioterrorism or other emergencies involving contagious agents. While the United States lacks recent experience of large-scale quarantines, it has considerable accumulated experience of large-scale evacuations. Risk perception, life circumstance, work-related issues, and the opinions of influential family, friends and credible public spokespersons all play a role in determining compliance with an evacuation order. Although the comparison is not reported elsewhere to our knowledge, this review of the principal factors affecting compliance with evacuations demonstrates many similarities with those likely to occur during a quarantine. Accurate identification and understanding of barriers to compliance allows for improved planning to protect the public more effectively. © 2011 The Author(s). Disasters © Overseas Development Institute, 2011.
Research directions in large scale systems and decentralized control
NASA Technical Reports Server (NTRS)
Tenney, R. R.
1980-01-01
Control theory provides a well-established framework for dealing with automatic decision problems and a set of techniques for automatic decision making that exploit special structure, but it does not deal well with complexity. The potential exists for combining control-theoretic and knowledge-based concepts into a unified approach. The elements of control theory are diagrammed, including modern control and large-scale systems.
Reinventing intention: ‘self-harm’ and the ‘cry for help’ in post-war Britain
Millard, Chris
2013-01-01
Purpose of review To sketch out how contemporary Anglophone literature on self-damaging behaviour negotiates serious conceptual difficulties around intention, and to demonstrate (in the British context) how the large-scale emergence of this type of behaviour is made possible by new forms of psychological provision at district general hospitals. Recent findings In the past decade there has been increasing public awareness of ‘self-harm’. Despite the view that ‘self-harm’ has always existed, the British roots of the current ‘epidemic’ can be traced to changes in the organisation of mental healthcare in the post-war period. These changes make possible new understandings of the story behind physical injuries, and allow these readings to be aggregated and projected onto a national, epidemic scale. Summary The increasing provision of psychiatric expertise in general hospitals makes possible new interpretations of self injury – as psychosocial communication, or affect self-regulation – and creates the phenomenon of ‘self-harm’ as we understand it today. PMID:23037964
Standard methods for sampling freshwater fishes: Opportunities for international collaboration
Bonar, Scott A.; Mercado-Silva, Norman; Hubert, Wayne A.; Beard, Douglas; Dave, Göran; Kubečka, Jan; Graeb, Brian D. S.; Lester, Nigel P.; Porath, Mark T.; Winfield, Ian J.
2017-01-01
With publication of Standard Methods for Sampling North American Freshwater Fishes in 2009, the American Fisheries Society (AFS) recommended standard procedures for North America. To explore interest in standardizing at intercontinental scales, a symposium attended by international specialists in freshwater fish sampling was convened at the 145th Annual AFS Meeting in Portland, Oregon, in August 2015. Participants represented all continents except Australia and Antarctica and were employed by state and federal agencies, universities, nongovernmental organizations, and consulting businesses. Currently, standardization is practiced mostly in North America and Europe. Participants described how standardization has been important for management of long-term data sets, promoting fundamental scientific understanding, and assessing efficacy of large spatial scale management strategies. Academics indicated that standardization has been useful in fisheries education because time previously used to teach how sampling methods are developed is now more devoted to diagnosis and treatment of problem fish communities. Researchers reported that standardization allowed increased sample size for method validation and calibration. Group consensus was to retain continental standards where they currently exist but to further explore international and intercontinental standardization, specifically identifying where synergies and bridges exist, and identify means to collaborate with scientists where standardization is limited but interest and need occur.
Social Uptake of Scientific Understanding of Seismic Hazard in Sumatra and Cascadia
NASA Astrophysics Data System (ADS)
Shannon, R.; McCloskey, J.; Guyer, C.; McDowell, S.; Steacy, S.
2007-12-01
The importance of science within hazard mitigation cannot be overstated. Robust mitigation policies rely strongly on a sound understanding of the science underlying potential natural disasters and the transference of that knowledge from the scientific community to the general public via governments and policy makers. We aim to investigate how and why the public's knowledge, perceptions, response, adjustments and values towards science have changed throughout two decades of research conducted in areas along and adjacent to the Sumatran and Cascadia subduction zones. We will focus on two countries subject to the same potential hazard, but which encompass starkly contrasting political, economic, social and environmental settings. The transfer of scientific knowledge into the public/social arena is a complex process, the success of which is reflected in a community's ability to withstand large-scale devastating events. Although no one could have foreseen the magnitude of the 2004 Boxing Day tsunami, the social devastation generated underscored the stark absence of mitigation measures in the nations most heavily affected. It furthermore emphasized the need for the design and implementation of disaster preparedness measures. A survey of existing literature has already established timelines for major events and public policy changes in the case study areas. Clear evidence exists of the link between scientific knowledge and its subsequent translation into public policy, particularly in the Cascadia context. The initiation of the National Tsunami Hazard Mitigation Program following the Cape Mendocino earthquake in 1992 embodies this link. Despite a series of environmental disasters with recorded widespread fatalities dating back to the mid 1900s and a heightened impetus for scientific research into tsunami/earthquake hazard following the 2004 Boxing Day tsunami, the translation of science into the public realm is not widely obvious in the Sumatran context. This research aims to further investigate how the enhanced understanding of earthquake and tsunami hazards is being used to direct hazard mitigation strategies and enables direct comparison with the scientific and public policy developments in Cascadia.
Large-scale dynamo growth rates from numerical simulations and implications for mean-field theories
NASA Astrophysics Data System (ADS)
Park, Kiwan; Blackman, Eric G.; Subramanian, Kandaswamy
2013-05-01
Understanding large-scale magnetic field growth in turbulent plasmas in the magnetohydrodynamic limit is a goal of magnetic dynamo theory. In particular, assessing how well large-scale helical field growth and saturation in simulations match those predicted by existing theories is important for progress. Using numerical simulations of isotropically forced turbulence without large-scale shear with its implications, we focus on several additional aspects of this comparison: (1) Leading mean-field dynamo theories which break the field into large and small scales predict that large-scale helical field growth rates are determined by the difference between kinetic helicity and current helicity with no dependence on the nonhelical energy in small-scale magnetic fields. Our simulations show that the growth rate of the large-scale field from fully helical forcing is indeed unaffected by the presence or absence of small-scale magnetic fields amplified in a precursor nonhelical dynamo. However, because the precursor nonhelical dynamo in our simulations produced fields that were strongly subequipartition with respect to the kinetic energy, we cannot yet rule out the potential influence of stronger nonhelical small-scale fields. (2) We have identified two features in our simulations which cannot be explained by the most minimalist versions of two-scale mean-field theory: (i) fully helical small-scale forcing produces significant nonhelical large-scale magnetic energy and (ii) the saturation of the large-scale field growth is time delayed with respect to what minimalist theory predicts. We comment on desirable generalizations to the theory in this context and future desired work.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Xiaoliang; Stauffer, Philip H.
This effort is designed to expedite learning from existing and planned large demonstration projects and their associated research through effective knowledge sharing among participants in the US and China.
Requirements for global elimination of hepatitis B: a modelling study.
Nayagam, Shevanthi; Thursz, Mark; Sicuri, Elisa; Conteh, Lesong; Wiktor, Stefan; Low-Beer, Daniel; Hallett, Timothy B
2016-12-01
Despite the existence of effective prevention and treatment interventions, hepatitis B virus (HBV) infection continues to cause nearly 1 million deaths each year. WHO aspires to global control and elimination of HBV infection. We aimed to evaluate the potential impact of public health interventions against HBV, propose targets for reducing incidence and mortality, and identify the key developments required to achieve them. We developed a simulation model of the global HBV epidemic, incorporating data on the natural history of HBV, prevalence, mortality, vaccine coverage, treatment dynamics, and demographics. We estimated the impact on the worldwide epidemic of current interventions, of scaling up existing interventions for prevention of infection, and of introducing wide-scale population screening and treatment interventions. Vaccination of infants and neonates is already driving a large decrease in new infections; vaccination had already prevented 210 million new chronic infections by 2015 and will have averted 1·1 million deaths by 2030. However, without scale-up of existing interventions, our model showed that there will be a cumulative 63 million new cases of chronic infection and 17 million HBV-related deaths between 2015 and 2030 because of ongoing transmission in some regions and poor access to treatment for people already infected. A target of a 90% reduction in new chronic infections and a 65% reduction in mortality could be achieved by scaling up the coverage of infant vaccination (to 90% of infants), birth-dose vaccination (to 80% of neonates), use of peripartum antivirals (to 80% of hepatitis B e antigen-positive mothers), and population-wide testing and treatment (to 80% of eligible people). These interventions would avert 7·3 million deaths between 2015 and 2030, including 1·5 million cancer deaths. An elimination threshold for incidence of new chronic infections would be reached by 2090 worldwide. The annual cost would peak at US$7·5 billion worldwide ($3·4 billion in low-income and lower-middle-income countries) but decrease rapidly, and this decrease would be accelerated if a cure is developed. Scale-up of vaccination coverage, innovations in scalable options for prevention of mother-to-child transmission, and ambitious population-wide testing and treatment are needed to eliminate HBV as a major public health threat. Achievement of these targets could make a major contribution to one of the Sustainable Development Goals of combating hepatitis. Funding: Medical Research Council. Copyright © 2016 The Author(s). Published by Elsevier Ltd. This is an Open Access article under the CC BY license.
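To make the coverage scale-up arithmetic concrete, here is a deliberately crude illustration in the spirit of (but far simpler than) the simulation model described above. Every parameter value is a placeholder, not an estimate from the study:

```python
# Crude two-route sketch of annual new chronic HBV infections, NOT the
# authors' model; all numbers below are illustrative placeholders.

def new_chronic_infections(births, risk_perinatal, risk_childhood,
                           birth_dose_cov, infant_cov,
                           birth_dose_eff=0.85, infant_eff=0.95):
    """Split incidence into a perinatal route (blocked by birth-dose
    vaccination) and an early-childhood route (blocked by the infant series)."""
    perinatal = births * risk_perinatal * (1 - birth_dose_cov * birth_dose_eff)
    childhood = births * risk_childhood * (1 - infant_cov * infant_eff)
    return perinatal + childhood

# Hypothetical region: 10 million births/year, lumped per-birth risks of
# chronic infection of 2% (perinatal) and 3% (childhood) absent vaccination.
baseline = new_chronic_infections(1e7, 0.02, 0.03, birth_dose_cov=0.4, infant_cov=0.84)
scaledup = new_chronic_infections(1e7, 0.02, 0.03, birth_dose_cov=0.8, infant_cov=0.9)
print(f"new chronic infections: {baseline:,.0f} -> {scaledup:,.0f} "
      f"({1 - scaledup / baseline:.0%} reduction)")
```

Even this toy version reproduces the qualitative message: much of the remaining burden is reached only by pushing birth-dose and infant coverage toward the 80-90% targets.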
Ice Accretion Test Results for Three Large-Scale Swept-Wing Models in the NASA Icing Research Tunnel
NASA Technical Reports Server (NTRS)
Broeren, Andy; Potapczuk, Mark; Lee, Sam; Malone, Adam; Paul, Ben; Woodard, Brian
2016-01-01
The design and certification of modern transport airplanes for flight in icing conditions increasingly relies on three-dimensional numerical simulation tools for ice accretion prediction. There is currently no publicly available, high-quality ice accretion database upon which to evaluate the performance of icing simulation tools for large-scale swept wings that are representative of modern commercial transport airplanes. The purpose of this presentation is to present the results of a series of icing wind tunnel test campaigns whose aim was to provide an ice accretion database for large-scale, swept wings.
USDA-ARS's Scientific Manuscript database
Relevant data about subsurface water flow and solute transport at the relatively large scales that are of interest to the public are inherently laborious to collect and in most cases simply impossible to obtain. Upscaling, in which fine-scale models and data are used to predict changes at the coarser scales, is the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Masada, Youhei; Sano, Takayoshi, E-mail: ymasada@auecc.aichi-edu.ac.jp, E-mail: sano@ile.osaka-u.ac.jp
We report the first successful simulation of spontaneous formation of surface magnetic structures from a large-scale dynamo by strongly stratified thermal convection in Cartesian geometry. The large-scale dynamo observed in our strongly stratified model has physical properties similar to those in earlier weakly stratified convective dynamo simulations, indicating that the α²-type mechanism is responsible for the dynamo. In addition to the large-scale dynamo, we find that large-scale structures of the vertical magnetic field are spontaneously formed in the convection zone (CZ) surface only in cases with a strongly stratified atmosphere. The organization of the vertical magnetic field proceeds in the upper CZ within tens of convective turnover times, and band-like bipolar structures recurrently appear in the dynamo-saturated stage. We consider several candidates that could be the origin of the surface magnetic structure formation, and then suggest the existence of an as-yet-unknown mechanism for the self-organization of the large-scale magnetic structure, which should be inherent in the strongly stratified convective atmosphere.
Large-scale retrieval for medical image analytics: A comprehensive review.
Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting
2018-01-01
Over the past decades, medical image analytics has been greatly facilitated by the explosion of digital imaging techniques, with huge amounts of medical images produced with ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of tackling such huge amounts of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to the major processes in the pipeline, including feature representation, feature indexing, searching, etc. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, with a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
Chockalingam, Sriram; Aluru, Maneesha; Aluru, Srinivas
2016-09-19
Pre-processing of microarray data is a well-studied problem. Furthermore, all popular platforms come with their own recommended best practices for differential analysis of genes. However, for genome-scale network inference using microarray data collected from large public repositories, these methods filter out a considerable number of genes. This is primarily due to the effects of aggregating a diverse array of experiments with different technical and biological scenarios. Here we introduce a pre-processing pipeline suitable for inferring genome-scale gene networks from large microarray datasets. We show that partitioning of the available microarray datasets according to biological relevance into tissue- and process-specific categories significantly extends the limits of downstream network construction. We demonstrate the effectiveness of our pre-processing pipeline by inferring genome-scale networks for the model plant Arabidopsis thaliana using two different construction methods and a collection of 11,760 Affymetrix ATH1 microarray chips. Our pre-processing pipeline and the datasets used in this paper are made available at http://alurulab.cc.gatech.edu/microarray-pp.
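A hedged sketch of the partitioning idea (tissue- and process-specific subsets), not the authors' released pipeline; the metadata file and column names here are hypothetical:

```python
import pandas as pd

# One row per microarray chip, with curated biological annotations
# (hypothetical file and columns, standing in for the real metadata).
meta = pd.read_csv("ath1_chip_metadata.csv")

# Split the compendium into biologically coherent partitions and infer one
# network per partition, instead of a single network from the whole mixture.
partitions = {
    name: frame["chip_id"].tolist()
    for name, frame in meta.groupby(["tissue", "process"])
}
for name, chips in partitions.items():
    print(name, len(chips))  # e.g. ('root', 'abiotic stress') 412
```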
ERIC Educational Resources Information Center
Wilkin, John P.
2017-01-01
The 1961 Copyright Office study on renewals, authored by Barbara Ringer, has exerted an outsized influence on discussions of the U.S. 1923-1963 public domain. As more concrete data emerge from initiatives such as the large-scale determination process in the Copyright Review Management System (CRMS) project, questions are raised about the reliability…
On the Interactions Between Planetary and Mesoscale Dynamics in the Oceans
NASA Astrophysics Data System (ADS)
Grooms, I.; Julien, K. A.; Fox-Kemper, B.
2011-12-01
Multiple-scales asymptotic methods are used to investigate the interaction of planetary and mesoscale dynamics in the oceans. We find three regimes. In the first, the slow, large-scale planetary flow sets up a baroclinically unstable background which leads to vigorous mesoscale eddy generation, but the eddy dynamics do not affect the planetary dynamics. In the second, the planetary flow feels the effects of the eddies, but appears to be unable to generate them. The first two regimes rely on horizontally isotropic large-scale dynamics. In the third regime, large-scale anisotropy, as exists for example in the Antarctic Circumpolar Current and in western boundary currents, allows the large-scale dynamics to both generate and respond to mesoscale eddies. We also discuss how the investigation may be brought to bear on the problem of parameterization of unresolved mesoscale dynamics in ocean general circulation models.
Assessment of a Solar System Walk
ERIC Educational Resources Information Center
LoPresto, Michael C.; Murrell, Steven R.; Kirchner, Brian
2010-01-01
The idea of sending students and the general public on a walk through a scale model of the solar system in an attempt to instill an appreciation of the relative scales of the sizes of the objects compared to the immense distances between them is certainly not new. A good number of such models exist, including one on the National Mall in…
Lifetime evaluation of large format CMOS mixed signal infrared devices
NASA Astrophysics Data System (ADS)
Linder, A.; Glines, Eddie
2015-09-01
New large-scale foundry processes continue to produce reliable products, and new large-scale devices continue to use industry best practice to screen for failure mechanisms and validate their long lifetimes. Failure-in-time (FIT) analysis, in conjunction with foundry qualification information, can be used to evaluate large-format device lifetimes, and is a helpful tool when zero-failure life tests are typical. The reliability of the device is estimated by applying the failure rate to the use conditions. JEDEC publications remain the industry-accepted methods.
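As an illustration of how a zero-failure life test still yields a lifetime estimate, the standard chi-squared upper bound on the failure rate can be computed as follows. This is the generic JEDEC-style calculation, not code from the paper, and the sample sizes and acceleration factor are invented:

```python
from scipy.stats import chi2

def fit_upper_bound(device_hours, failures=0, confidence=0.60, accel=1.0):
    """Upper confidence bound on failure rate, in FIT (failures per 1e9
    device-hours); accel is the stress-to-use acceleration factor."""
    chi_sq = chi2.ppf(confidence, 2 * (failures + 1))
    lam = chi_sq / (2.0 * device_hours * accel)  # failures per device-hour
    return lam * 1e9

# 77 devices * 1000 h of accelerated life test, acceleration factor 50,
# zero observed failures:
print(f"{fit_upper_bound(77 * 1000, accel=50):.0f} FIT upper bound")
```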
NASA Astrophysics Data System (ADS)
Ramachandran, R.; Murphy, K. J.; Baynes, K.; Lynnes, C.
2016-12-01
With the volume of Earth observation data expanding rapidly, cloud computing is quickly changing the way Earth observation data are processed, analyzed, and visualized. The cloud infrastructure provides the flexibility to scale up to large volumes of data and to handle high-velocity data streams efficiently. Having freely available Earth observation data collocated on a cloud infrastructure creates opportunities for innovation and value-added data re-use in ways unforeseen by the original data provider. These innovations spur new industries and applications and spawn new scientific pathways that were previously limited by data volume and computational infrastructure issues. NASA, in collaboration with Amazon, Google, and Microsoft, has jointly developed a set of recommendations to enable efficient transfer of Earth observation data from existing data systems to a cloud computing infrastructure. These recommendations provide guidelines against which all data providers can evaluate existing data systems and remediate any issues uncovered, enabling efficient search, access, and use of large volumes of data. Additionally, the guidelines ensure that all cloud providers use a common methodology for bulk-downloading data from data providers, preventing the data providers from having to build custom capabilities to meet the needs of individual cloud providers. The intent is to share these recommendations with other Federal agencies and organizations that serve Earth observation data. Their adoption will also benefit data users interested in moving large volumes of data from data systems to any other location, including the cloud providers themselves, cloud users such as scientists, and other users working in high-performance computing environments.
Effects of Interim Assessments on Student Achievement: Evidence from a Large-Scale Experiment
ERIC Educational Resources Information Center
Konstantopoulos, Spyros; Miller, Shazia R.; van der Ploeg, Arie; Li, Wei
2016-01-01
We use data from a large-scale, school-level randomized experiment conducted in 2010-2011 in public schools in Indiana. Our sample includes more than 30,000 students in 70 schools. We examine the impact of two interim assessment programs (i.e., mCLASS in Grades K-2 and Acuity in Grades 3-8) on mathematics and reading achievement. Two-level models…
Development of Novel Therapeutics for Neglected Tropical Disease Leishmaniasis
2015-10-01
Approved for public release; distribution unlimited. We undertook planning of a kick-off coordination meeting. A low-dose infection model of CL was validated... A large-scale synthesis of PEN was optimized, and in vitro studies revealed that PEN alters the parasite lipidome. Further studies were... Keywords: pentalinonsterol (PEN), Leishmania, cutaneous leishmaniasis (CL), treatment. Accomplishments: • undertook planning of kick-off coordination meeting • large-scale synthesis of
ERIC Educational Resources Information Center
Gasim, Gamal; Stevens, Tara; Zebidi, Amira
2012-01-01
All undergraduate students are required by state law to take six credited hours in political science. This study will help us identify if differences exist in self-determination among students enrolled in American Public Policy and American Government at a large, Southwestern public university. Because some types of motivation are associated with…
What Googling Trends Tell Us About Public Interest in Earthquakes
NASA Astrophysics Data System (ADS)
Tan, Y. J.; Maharjan, R.
2017-12-01
Previous studies have shown that immediately after large earthquakes, there is a period of increased public interest. This represents a window of opportunity for science communication and disaster relief fundraising efforts to reach more people. However, how public interest varies for different earthquakes has not been quantified systematically on a global scale. We analyze how global search interest for the term "earthquake" on Google varies following earthquakes of magnitude ≥ 5.5 from 2004 to 2016. We find that there is a spike in search interest after large earthquakes followed by an exponential temporal decay. Preliminary results suggest that the period of increased search interest scales with death toll and correlates with the period of increased media coverage. This suggests that the relationship between the period of increased public interest in earthquakes and death toll might be an effect of differences in media coverage. However, public interest never remains elevated for more than three weeks. Therefore, to take advantage of this short period of increased public interest, science communication and disaster relief fundraising efforts have to act promptly following devastating earthquakes.
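The post-event relaxation described here is straightforward to model as an exponential decay. A hedged sketch, with synthetic numbers standing in for the actual Google Trends series (not the authors' code):

```python
import numpy as np
from scipy.optimize import curve_fit

days = np.arange(0, 21)  # days after the mainshock
interest = (100 * np.exp(-days / 4.0) + 5
            + np.random.default_rng(0).normal(0, 2, days.size))  # synthetic

def decay(t, amplitude, tau, baseline):
    return amplitude * np.exp(-t / tau) + baseline

(amplitude, tau, baseline), _ = curve_fit(decay, days, interest, p0=(100, 5, 5))
print(f"e-folding time of search interest: {tau:.1f} days")
```

Fitting tau per event would then allow the scaling against death toll and media coverage described above.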
Capabilities of the Large-Scale Sediment Transport Facility
2016-04-01
experiments in wave/current environments. INTRODUCTION: The LSTF (Figure 1) is a large-scale laboratory facility capable of simulating conditions comparable to low-wave-energy coasts. The facility was constructed to address deficiencies in existing methods for calculating longshore sediment transport. The LSTF consists of a 30 m wide, 50 m long, 1.4 m deep basin. Waves are generated by four digitally controlled wave makers capable of producing
Clinical terminology support for a national ambulatory practice outcomes research network.
Ricciardi, Thomas N; Lieberman, Michael I; Kahn, Michael G; Masarie, F E
2005-01-01
The Medical Quality Improvement Consortium (MQIC) is a nationwide collaboration of 74 healthcare delivery systems, consisting of 3755 clinicians, who contribute de-identified clinical data from the same commercial electronic medical record (EMR) for quality reporting, outcomes research and clinical research in public health and practice benchmarking. Despite the existence of a common, centrally-managed, shared terminology for core concepts (medications, problem lists, observation names), a substantial "back-end" information management process is required to ensure terminology and data harmonization for creating multi-facility clinically-acceptable queries and comparable results. We describe the information architecture created to support terminology harmonization across this data-sharing consortium and discuss the implications for large scale data sharing envisioned by proponents for the national adoption of ambulatory EMR systems.
Mono-isotope Prediction for Mass Spectra Using Bayes Network.
Li, Hui; Liu, Chunmei; Rwebangira, Mugizi Robert; Burge, Legand
2014-12-01
Mass spectrometry is one of the most widely utilized methods for studying protein functions and components. The challenge of mono-isotope pattern recognition from large-scale protein mass spectral data calls for computational algorithms and tools to speed up the analysis and improve the analytic results. We utilized a naïve Bayes network as the classifier, under the assumption that the selected features are independent, to predict mono-isotope patterns from mass spectrometry data. Mono-isotopes detected from validated theoretical spectra were used as prior information in the Bayes method. Three main features extracted from the dataset were employed as independent variables in our model. The application of the proposed algorithm to the public Mo dataset demonstrates that our naïve Bayes classifier is advantageous over existing methods in both accuracy and sensitivity.
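A minimal sketch of the classification setup: a naive Bayes model over three independent features per candidate peak. The feature names and synthetic data are hypothetical; the paper's actual features and the Mo dataset are not reproduced here:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(1)
# Hypothetical columns: intensity ratio to the next isotope peak,
# m/z spacing error, and peak width.
X = rng.normal(size=(500, 3))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # 1 = mono-isotopic peak

# Class priors could encode mono-isotope frequencies taken from validated
# theoretical spectra, mirroring the prior information described above.
model = GaussianNB(priors=[0.6, 0.4])
model.fit(X[:400], y[:400])
print("held-out accuracy:", model.score(X[400:], y[400:]))
```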
The co-evolution of social institutions, demography, and large-scale human cooperation.
Powers, Simon T; Lehmann, Laurent
2013-11-01
Human cooperation is typically coordinated by institutions, which determine the outcome structure of the social interactions individuals engage in. Explaining the Neolithic transition from small- to large-scale societies involves understanding how these institutions co-evolve with demography. We study this using a demographically explicit model of institution formation in a patch-structured population. Each patch supports both social and asocial niches. Social individuals create an institution, at a cost to themselves, by negotiating how much of the costly public good provided by cooperators is invested into sanctioning defectors. The remainder of their public good is invested in technology that increases carrying capacity, such as irrigation systems. We show that social individuals can invade a population of asocials, and form institutions that support high levels of cooperation. We then demonstrate conditions where the co-evolution of cooperation, institutions, and demographic carrying capacity creates a transition from small- to large-scale social groups. © 2013 John Wiley & Sons Ltd/CNRS.
76 FR 67256 - Advisory Group to the Commissioner of Internal Revenue; Renewal of Charter
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-31
... public, the tax professional community, small and large businesses, international, wage and investment... Revenue Service officials and representatives of the public to discuss relevant tax administration issues. As an advisory body designed to focus on broad policy matters, the IRSAC reviews existing tax policy...
USDA-ARS's Scientific Manuscript database
NASA's SMAP satellite, launched in January of 2015, produces estimates of average volumetric soil moisture at 3, 9, and 36-kilometer scales. The calibration and validation process of these estimates requires the generation of an identically-scaled soil moisture product from existing in-situ networ...
Cross-indexing of binary SIFT codes for large-scale image search.
Liu, Zhen; Li, Houqiang; Zhang, Liyan; Zhou, Wengang; Tian, Qi
2014-05-01
In recent years, there has been growing interest in mapping visual features into compact binary codes for applications on large-scale image collections. Encoding high-dimensional data as compact binary codes reduces the memory cost of storage, and it benefits computational efficiency, since similarity can be measured efficiently by Hamming distance. In this paper, we propose a novel flexible scale invariant feature transform (SIFT) binarization (FSB) algorithm for large-scale image search. The FSB algorithm explores the magnitude patterns of the SIFT descriptor. It is unsupervised, and the generated binary codes are demonstrated to be distance-preserving. Besides, we propose a new searching strategy to find target features based on cross-indexing in the binary SIFT space and the original SIFT space. We evaluate our approach on two publicly released data sets. The experiments on a large-scale partial-duplicate image retrieval system demonstrate the effectiveness and efficiency of the proposed algorithm.
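The Hamming-distance advantage is easy to see in code. A generic illustration of XOR-plus-popcount matching on packed binary codes (not the paper's FSB algorithm):

```python
import numpy as np

def hamming(query: np.ndarray, database: np.ndarray) -> np.ndarray:
    """Hamming distances between one packed-uint8 code and a database of codes."""
    return np.unpackbits(query ^ database, axis=1).sum(axis=1)

rng = np.random.default_rng(0)
database = rng.integers(0, 256, size=(100_000, 32), dtype=np.uint8)  # 256-bit codes
query = rng.integers(0, 256, size=(1, 32), dtype=np.uint8)

nearest = int(np.argmin(hamming(query, database)))
print("nearest code index:", nearest)
```

Each comparison touches only 32 bytes per code, which is what lets memory cost and lookup speed scale to large collections.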
Towards a New Assessment of Urban Areas from Local to Global Scales
NASA Astrophysics Data System (ADS)
Bhaduri, B. L.; Roy Chowdhury, P. K.; McKee, J.; Weaver, J.; Bright, E.; Weber, E.
2015-12-01
Since the early 2000s, starting with NASA MODIS, satellite-based remote sensing has facilitated the collection of imagery with medium spatial resolution but high (daily) temporal resolution. This trend continues with an increasing number of sensors and data products. Increasing spatial and temporal resolutions of remotely sensed data archives, from both public and commercial sources, have significantly enhanced the quality of mapping and change data products. However, even with automation of such analysis on evolving computing platforms, rates of data processing have been suboptimal, largely because of the ever-increasing pixel-to-processor ratio coupled with limitations of the computing architectures. Novel approaches utilizing spatiotemporal data mining techniques and computational architectures have emerged that demonstrate the potential for sustained and geographically scalable landscape monitoring to become operational. We exemplify this challenge with two broad research initiatives on High Performance Geocomputation at Oak Ridge National Laboratory: (a) mapping global settlement distribution and (b) developing national critical infrastructure databases. Our present effort, on large GPU-based architectures, to exploit high-resolution (1 m or less) satellite and airborne imagery for extracting settlements at global scale is yielding an understanding of human settlement patterns and urban areas at unprecedented resolution. Comparison of this urban land cover database with existing national and global land cover products, at various geographic scales in selected parts of the world, is revealing intriguing patterns and insights for urban assessment. Early results from the USA, Taiwan, and Egypt indicate closer agreement (5-10%) in urban area assessments among databases at larger, aggregated geographic extents. However, spatial variability at local scales can differ significantly (over 50% disagreement).
DTMiner: identification of potential disease targets through biomedical literature mining
Xu, Dong; Zhang, Meizhuo; Xie, Yanping; Wang, Fan; Chen, Ming; Zhu, Kenny Q.; Wei, Jia
2016-01-01
Motivation: Biomedical researchers often search through massive catalogues of literature to look for potential relationships between genes and diseases. Given the rapid growth of biomedical literature, automatic relation extraction, a crucial technology in biomedical literature mining, has shown great potential to support research of gene-related diseases. Existing work in this field has produced datasets that are limited both in scale and accuracy. Results: In this study, we propose a reliable and efficient framework that takes large biomedical literature repositories as inputs, identifies credible relationships between diseases and genes, and presents possible genes related to a given disease and possible diseases related to a given gene. The framework incorporates named entity recognition (NER), which identifies occurrences of genes and diseases in texts; association detection, whereby we extract and evaluate features from gene–disease pairs; and ranking algorithms that estimate how closely the pairs are related. The F1-score of the NER phase is 0.87, which is higher than existing studies. The association detection phase takes drastically less time than previous work while maintaining a comparable F1-score of 0.86. The end-to-end result achieves a 0.259 F1-score for the top 50 genes associated with a disease, which performs better than previous work. In addition, we released a web service for public use of the dataset. Availability and Implementation: The implementation of the proposed algorithms is publicly available at http://gdr-web.rwebox.com/public_html/index.php?page=download.php. The web service is available at http://gdr-web.rwebox.com/public_html/index.php. Contact: jenny.wei@astrazeneca.com or kzhu@cs.sjtu.edu.cn. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27506226
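As a stand-in for the pipeline described above, the toy below performs dictionary-based entity matching and co-occurrence scoring of gene-disease pairs. The real system uses trained NER and ranking models; the dictionaries and sentences here are invented:

```python
from collections import Counter
from itertools import product

GENES = {"BRCA1", "TP53", "EGFR"}
DISEASES = {"breast cancer", "lung cancer"}

def extract_pairs(sentence: str):
    """Yield (gene, disease) pairs co-mentioned in one sentence."""
    text = sentence.lower()
    genes = [g for g in GENES if g.lower() in text]
    diseases = [d for d in DISEASES if d in text]
    return product(genes, diseases)

corpus = [
    "BRCA1 mutations are strongly associated with breast cancer risk.",
    "EGFR inhibitors are first-line therapy in lung cancer.",
    "TP53 and BRCA1 are frequently mutated in breast cancer.",
]
scores = Counter(pair for s in corpus for pair in extract_pairs(s))
for (gene, disease), n in scores.most_common():
    print(f"{gene} -- {disease}: {n} supporting sentence(s)")
```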
Lohr, E.W.; Love, S.K.
1954-01-01
Investigations by others have shown that a definite relationship exists between fluoride in drinking water and the incidence of dental caries in the teeth of children. A total of about 85 percent of the population served by the large public supplies receive water having a fluoride concentration in the range of 0.0 to 0.5 part per million. Few large public supplies contain fluoride in concentrations in excess of 3 parts per million. A total of 155 places of those included in the report received fluoridated water in 1955.
Lavis, John N
2006-01-01
Public policymakers must contend with a particular set of institutional arrangements that govern what can be done to address any given issue, pressure from a variety of interest groups about what they would like to see done to address any given issue, and a range of ideas (including research evidence) about how best to address any given issue. Rarely do processes exist that can get optimally packaged high-quality and high-relevance research evidence into the hands of public policymakers when they most need it, which is often in hours and days, not months and years. In Canada, a variety of efforts have been undertaken to address the factors that have been found to increase the prospects for research use, including the production of systematic reviews that meet the shorter term (but not urgent) needs of public policymakers and encouraging partnerships between researchers and policymakers that allow for their interaction around the tasks of asking and answering relevant questions. Much less progress has been made in making available research evidence to inform the urgent needs of public policymakers and in addressing attitudinal barriers and capacity limitations. In the future, knowledge-translation processes, particularly push efforts and efforts to facilitate user pull, should be undertaken on a sufficiently large scale and with a sufficiently rigorous evaluation so that robust conclusions can be drawn about their effectiveness.
Watterson, Andrew; Dinan, William
2018-04-04
Unconventional oil and gas extraction (UOGE) including fracking for shale gas is underway in North America on a large scale, and in Australia and some other countries. It is viewed as a major source of global energy needs by proponents. Critics consider fracking and UOGE an immediate and long-term threat to global, national, and regional public health and climate. Rarely have governments brought together relatively detailed assessments of direct and indirect public health risks associated with fracking and weighed these against potential benefits to inform a national debate on whether to pursue this energy route. The Scottish government has now done so in a wide-ranging consultation underpinned by a variety of reports on unconventional gas extraction including fracking. This paper analyses the Scottish government approach from inception to conclusion, and from procedures to outcomes. The reports commissioned by the Scottish government include a comprehensive review dedicated specifically to public health as well as reports on climate change, economic impacts, transport, geology, and decommissioning. All these reports are relevant to public health, and taken together offer a comprehensive review of existing evidence. The approach is unique globally when compared with UOGE assessments conducted in the USA, Australia, Canada, and England. The review process builds a useful evidence base although it is not without flaws. The process approach, if not the content, offers a framework that may have merits globally.
Equipment Efficiency for Healthy School Meals. [Videotape].
ERIC Educational Resources Information Center
National Food Service Management Inst., University, MS.
A satellite seminar on large-scale food production equipment discusses ways child nutrition personnel can maximize use of existing equipment, considers research related to use of existing equipment, explains plan reviews for equipment selection and purchase, and explores new equipment options. Examples illustrate use of planning or modernizing…
NASA Astrophysics Data System (ADS)
Yan, Peng; Lu, Wenbo; Zhang, Jing; Zou, Yujun; Chen, Ming
2017-04-01
Ground vibration, as the most critical public hazard of blasting, has received much attention from the community. Many countries established national standards to suppress vibration impact on structures, but a world-accepted blasting vibration criterion on human safety is still missing. In order to evaluate human response to the vibration from blasting excavation of a large-scale rock slope in China, this study aims to suggest a revised criterion. The vibration frequency was introduced to improve the existing single-factor (peak particle velocity) standard recommended by the United States Bureau of Mines (USBM). The feasibility of the new criterion was checked based on field vibration monitoring and investigation of human reactions. Moreover, the air overpressure or blast effects on human beings have also been discussed. The result indicates that the entire zone of influence can be divided into three subzones: severe-annoyance, light-annoyance and perception zone according to the revised safety standard. Both the construction company and local residents have provided positive comments on this influence degree assessment, which indicates that the presented criterion is suitable for evaluating human response to nearby blasts. Nevertheless, this specific criterion needs more field tests and verifications before it can be
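A two-factor check of this kind reduces to comparing the measured peak particle velocity (PPV) against a frequency-dependent limit. The sketch below shows the general shape of such a criterion; the breakpoints, limits, and zone thresholds are illustrative placeholders, not the paper's values:

```python
def ppv_limit_mm_s(frequency_hz: float) -> float:
    """Allowable PPV as a function of dominant vibration frequency
    (placeholder piecewise curve in the spirit of USBM-style limits)."""
    if frequency_hz < 10:        # low frequencies: people most sensitive
        return 5.0
    elif frequency_hz < 50:      # limit rises roughly with frequency
        return 5.0 + 0.5 * (frequency_hz - 10)
    return 25.0                  # high-frequency cap

def influence_zone(ppv_mm_s: float, frequency_hz: float) -> str:
    limit = ppv_limit_mm_s(frequency_hz)
    if ppv_mm_s > 2 * limit:
        return "severe-annoyance zone"
    if ppv_mm_s > limit:
        return "light-annoyance zone"
    return "perception zone"

print(influence_zone(ppv_mm_s=12.0, frequency_hz=20))  # light-annoyance zone
```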
Large Scale Density Estimation of Blue and Fin Whales (LSD)
2015-09-30
DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. Large Scale Density Estimation of Blue and Fin Whales ... sensors, or both. The goal of this research is to develop and implement a new method for estimating blue and fin whale density that is effective over ... develop and implement a density estimation methodology for quantifying blue and fin whale abundance from passive acoustic data recorded on sparse
A family of conjugate gradient methods for large-scale nonlinear equations.
Feng, Dexiang; Sun, Min; Wang, Xueyong
2017-01-01
In this paper, we present a family of conjugate gradient projection methods for solving large-scale nonlinear equations. At each iteration, the method requires low storage, and the subproblem can be solved easily. Compared with existing solution methods for the problem, its global convergence is established without requiring Lipschitz continuity of the underlying mapping. Preliminary numerical results are reported to show the efficiency of the proposed method.
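For readers unfamiliar with this class of methods, the sketch below shows the generic shape of a derivative-free conjugate gradient projection iteration: a backtracking line search to a trial point, followed by a hyperplane projection in the style of Solodov and Svaiter. It is an illustrative reconstruction under assumed parameter choices, not the paper's algorithm:

```python
import numpy as np

def cg_projection(F, x0, sigma=0.01, rho=0.5, tol=1e-8, max_iter=500):
    x, d = x0.copy(), -F(x0)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        # Line search: find z = x + a*d with -F(z)^T d >= sigma * a * ||d||^2
        a = 1.0
        while True:
            z = x + a * d
            if (-F(z)) @ d >= sigma * a * (d @ d) or a < 1e-12:
                break
            a *= rho
        Fz = F(z)
        # Project x onto the hyperplane {y : F(z)^T (y - z) = 0}
        x = x - ((Fz @ (x - z)) / (Fz @ Fz)) * Fz
        # Conjugate-gradient-style direction update (Fletcher-Reeves-type beta)
        Fx_new = F(x)
        beta = (Fx_new @ Fx_new) / (Fx @ Fx)
        d = -Fx_new + beta * d
    return x

F = lambda x: x + np.sin(x)          # monotone test mapping with root at 0
print(cg_projection(F, np.full(5, 3.0)))
```

Only evaluations of F and a handful of vectors are needed per iteration, which is the low-storage property the abstract emphasizes.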
Viewpoint: observations on scaled average bioequivalence.
Patterson, Scott D; Jones, Byron
2012-01-01
The two one-sided tests procedure (TOST) has been used for average bioequivalence testing since 1992 and is required when marketing new formulations of an approved drug. TOST is known to require comparatively large numbers of subjects to demonstrate bioequivalence for highly variable drugs, defined as those drugs having intra-subject coefficients of variation greater than 30%. However, TOST has been shown to protect public health when multiple generic formulations enter the marketplace following patent expiration. Recently, scaled average bioequivalence (SABE) has been proposed by multiple regulatory agencies as an alternative statistical analysis procedure for such products. SABE testing requires that a three-period partial replicate cross-over or full replicate cross-over design be used. Following a brief summary of SABE analysis methods applied to existing data, we consider three statistical ramifications of the proposed additional decision rules and the potential impact of implementation of scaled average bioequivalence in the marketplace, using simulation. It is found that a constraint being applied is biased, that bias may also result from the common problem of missing data, and that the SABE methods allow for much greater changes in exposure when generic-generic switching occurs in the marketplace. Copyright © 2011 John Wiley & Sons, Ltd.
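For context, the SABE decision rule referenced in regulatory guidance is usually written as follows; this is general background on the criterion, not a formula taken from the paper:

```latex
% Average bioequivalence (TOST): the 90% CI for the log-scale difference
% must lie within fixed limits,
\ln 0.80 \;\le\; \mu_T - \mu_R \;\le\; \ln 1.25 .
% SABE widens the limits in proportion to the within-subject
% standard deviation of the reference product, \sigma_{WR}:
\frac{(\mu_T - \mu_R)^2}{\sigma_{WR}^2} \;\le\; \left(\frac{\ln 1.25}{\sigma_{W0}}\right)^{2}
```

where σ_W0 is a regulatory constant (0.25 in the FDA implementation), applied together with a point-estimate constraint on the geometric mean ratio.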
Evidence for the timing of sea-level events during MIS 3
NASA Astrophysics Data System (ADS)
Siddall, M.
2005-12-01
Four large sea-level peaks of millennial-scale duration occur during MIS 3. In addition, smaller peaks may exist close to the sensitivity limit of existing methods for deriving sea level during these periods. Millennial-scale changes in temperature during MIS 3 are well documented across much of the planet and are linked in some unknown yet fundamental way to changes in ice volume and sea level. It is therefore highly likely that the timing of the sea-level events during MIS 3 will prove to be a 'Rosetta Stone' for understanding millennial-scale climate variability. I will review observational and mechanistic arguments for the variation of sea level on Antarctic, Greenland and absolute time scales.
NASA Astrophysics Data System (ADS)
Zhang, Yangyue; Hu, Ruifeng; Zheng, Xiaojing
2018-04-01
Dust particles can remain suspended in the atmospheric boundary layer, motions of which are primarily determined by turbulent diffusion and gravitational settling. Little is known about the spatial organizations of suspended dust concentration and how turbulent coherent motions contribute to the vertical transport of dust particles. Numerous studies in recent years have revealed that large- and very-large-scale motions in the logarithmic region of laboratory-scale turbulent boundary layers also exist in the high Reynolds number atmospheric boundary layer, but their influence on dust transport is still unclear. In this study, numerical simulations of dust transport in a neutral atmospheric boundary layer based on an Eulerian modeling approach and large-eddy simulation technique are performed to investigate the coherent structures of dust concentration. The instantaneous fields confirm the existence of very long meandering streaks of dust concentration, with alternating high- and low-concentration regions. A strong negative correlation between the streamwise velocity and concentration and a mild positive correlation between the vertical velocity and concentration are observed. The spatial length scales and inclination angles of concentration structures are determined, compared with their flow counterparts. The conditionally averaged fields vividly depict that high- and low-concentration events are accompanied by a pair of counter-rotating quasi-streamwise vortices, with a downwash inside the low-concentration region and an upwash inside the high-concentration region. Through the quadrant analysis, it is indicated that the vertical dust transport is closely related to the large-scale roll modes, and ejections in high-concentration regions are the major mechanisms for the upward motions of dust particles.
NASA Technical Reports Server (NTRS)
Mjolsness, Eric; Castano, Rebecca; Mann, Tobias; Wold, Barbara
2000-01-01
We provide preliminary evidence that existing algorithms for inferring small-scale gene regulation networks from gene expression data can be adapted to large-scale gene expression data coming from hybridization microarrays. The essential steps are (1) clustering many genes by their expression time-course data into a minimal set of clusters of co-expressed genes, (2) theoretically modeling the various conditions under which the time-courses are measured using a continuous-time analog recurrent neural network for the cluster mean time-courses, (3) fitting such a regulatory model to the cluster mean time-courses by simulated annealing with weight decay, and (4) analysing several such fits for commonalities in the circuit parameter sets, including the connection matrices. This procedure can be used to assess the adequacy of existing and future gene expression time-course data sets for determining transcriptional regulatory relationships such as coregulation.
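A compact sketch of steps (1)-(3): cluster the time-courses, then fit a small recurrent model to the cluster means. A generic minimizer stands in for simulated annealing here, and all sizes and names are illustrative rather than the authors' setup:

```python
import numpy as np
from sklearn.cluster import KMeans
from scipy.optimize import minimize

rng = np.random.default_rng(0)
expression = rng.normal(size=(200, 12))   # 200 genes x 12 time points (synthetic)
k = 4
means = KMeans(n_clusters=k, n_init=10, random_state=0).fit(expression).cluster_centers_

def model_error(theta):
    W = theta[:k * k].reshape(k, k)       # cluster-to-cluster regulation weights
    b = theta[k * k:]
    x = means[:, 0]
    err = 0.0
    for t in range(means.shape[1] - 1):   # analog recurrent dynamics: dx/dt ~ tanh(Wx+b) - x
        x = x + 0.5 * (np.tanh(W @ x + b) - x)
        err += np.sum((x - means[:, t + 1]) ** 2)
    return err + 0.1 * np.sum(theta ** 2)  # weight decay, as in step (3)

theta0 = rng.normal(scale=0.1, size=k * k + k)
fit = minimize(model_error, theta0, method="Nelder-Mead", options={"maxiter": 2000})
print("fit error:", round(fit.fun, 3))
```

Step (4) would then compare the fitted connection matrices W across several such runs.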
Measuring public opinion on alcohol policy: a factor analytic study of a US probability sample.
Latimer, William W; Harwood, Eileen M; Newcomb, Michael D; Wagenaar, Alexander C
2003-03-01
Public opinion has been one factor affecting change in policies designed to reduce underage alcohol use. Extant research, however, has been criticized for using single survey items of unknown reliability to define adult attitudes on alcohol policy issues. The present investigation addresses a critical gap in the literature by deriving scales on public attitudes, knowledge, and concerns pertinent to alcohol policies designed to reduce underage drinking using a US probability sample survey of 7021 adults. Five attitudinal scales were derived from exploratory and confirmatory factor analyses addressing policies to: (1) regulate alcohol marketing, (2) regulate alcohol consumption in public places, (3) regulate alcohol distribution, (4) increase alcohol taxes, and (5) regulate youth access. The scales exhibited acceptable psychometric properties and were largely consistent with a rational framework which guided the survey construction.
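A minimal sketch of the scale-derivation workflow (factor analysis with rotation, then respondent-level scores), run on synthetic responses; the item counts and five-factor target simply mirror the abstract and are not the study's data:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_respondents, n_items, n_factors = 1000, 20, 5
latent = rng.normal(size=(n_respondents, n_factors))          # true attitudes
loadings = rng.normal(scale=0.8, size=(n_factors, n_items))
items = latent @ loadings + rng.normal(scale=0.5, size=(n_respondents, n_items))

fa = FactorAnalysis(n_components=n_factors, rotation="varimax")
scores = fa.fit_transform(items)            # factor scores per respondent
print("loading matrix shape:", fa.components_.shape)  # (5 factors, 20 items)
```

In practice the exploratory fit would be followed by a confirmatory model and reliability checks, as the authors did.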
NASA Astrophysics Data System (ADS)
Silvis, Maurits H.; Remmerswaal, Ronald A.; Verstappen, Roel
2017-01-01
We study the construction of subgrid-scale models for large-eddy simulation of incompressible turbulent flows. In particular, we aim to consolidate a systematic approach to constructing subgrid-scale models, based on the idea that it is desirable for subgrid-scale models to be consistent with the mathematical and physical properties of the Navier-Stokes equations and the turbulent stresses. To that end, we first discuss in detail the symmetries of the Navier-Stokes equations, and the near-wall scaling behavior, realizability and dissipation properties of the turbulent stresses. We furthermore summarize the requirements that subgrid-scale models have to satisfy in order to preserve these important mathematical and physical properties. In this fashion, a framework of model constraints arises that we apply to analyze the behavior of a number of existing subgrid-scale models that are based on the local velocity gradient. We show that these subgrid-scale models do not satisfy all the desired properties, after which we explain that this is partly due to incompatibilities between model constraints and limitations of velocity-gradient-based subgrid-scale models. However, we also reason that the current framework shows that there is room for improvement in the properties and, hence, the behavior of existing subgrid-scale models. We furthermore show how compatible model constraints can be combined to construct new subgrid-scale models that have desirable properties built into them. We provide a few examples of such new models, of which a new model of eddy-viscosity type, based on the vortex-stretching magnitude, is successfully tested in large-eddy simulations of decaying homogeneous isotropic turbulence and turbulent plane-channel flow.
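As background for the class of closures analyzed here, a generic eddy-viscosity model takes the following form; the particular choice of coefficient function suggested below is illustrative, not the paper's definition:

```latex
\tau^{\mathrm{mod}}_{ij} - \tfrac{1}{3}\,\delta_{ij}\,\tau^{\mathrm{mod}}_{kk}
  = -2\,\nu_e\,\bar{S}_{ij} ,
\qquad
\nu_e = (C\Delta)^2\, q(\nabla\bar{u}) ,
```

where S̄_ij is the resolved rate-of-strain tensor, Δ the filter width, and q a scalar measure built from the resolved velocity gradient; a vortex-stretching-based choice would build q from |S̄ω̄|/|ω̄|, selected so that ν_e exhibits the desired near-wall scaling.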
Cosmic microwave background anomalies in an open universe.
Liddle, Andrew R; Cortês, Marina
2013-09-13
We argue that the observed large-scale cosmic microwave anomalies, discovered by WMAP and confirmed by the Planck satellite, are most naturally explained in the context of a marginally open universe. Particular focus is placed on the dipole power asymmetry, via an open universe implementation of the large-scale gradient mechanism of Erickcek et al. Open inflation models, which are motivated by the string landscape and which can excite "supercurvature" perturbation modes, can explain the presence of a very-large-scale perturbation that leads to a dipole modulation of the power spectrum measured by a typical observer. We provide a specific implementation of the scenario which appears compatible with all existing constraints.
Field-aligned currents and large-scale magnetospheric electric fields
NASA Technical Reports Server (NTRS)
Dangelo, N.
1979-01-01
The existence of field-aligned currents (FAC) at northern and southern high latitudes was confirmed by a number of observations, most clearly by experiments on the TRIAD and ISIS 2 satellites. The high-latitude FAC system is used to relate what is presently known about the large-scale pattern of high-latitude ionospheric electric fields and their relation to solar wind parameters. Recently a simplified model was presented for polar cap electric fields. The model is of considerable help in visualizing the large-scale features of FAC systems. A summary of the FAC observations is given. The simplified model is used to visualize how the FAC systems are driven by their generators.
Reciprocity and the Emergence of Power Laws in Social Networks
NASA Astrophysics Data System (ADS)
Schnegg, Michael
Research in network science has shown that many naturally occurring and technologically constructed networks are scale free, meaning that a power-law degree distribution emerges from a growth model in which each new node attaches to the existing network with a probability proportional to its number of links (degree). Little is known about whether the same principles of local attachment and global properties apply to societies as well. Empirical evidence from six ethnographic case studies shows that complex social networks have significantly lower scaling exponents γ ~ 1 than has been assumed in the past. Apparently, humans do not simply look for the most prominent players to interact with. Moreover, cooperation in humans is characterized by reciprocity, the tendency to give to those from whom one has received in the past. The two variables, reciprocity and the scaling exponent, are negatively correlated (r = -0.767, sig = 0.075). If we include this effect in simulations of growing networks, degree distributions emerge that are much closer to those empirically observed. While the proportion of nodes with small degrees decreases drastically as we introduce reciprocity, the scaling exponent is more robust and changes only when a relatively large proportion of attachment decisions follow this rule. If social networks are less scale free than previously assumed, this has far-reaching implications for policy makers, public health programs and marketing alike.
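A toy version of such a growth simulation is easy to set up. Below, each new node attaches preferentially, and with some probability the chosen partner "gives back" an extra reciprocal tie; this rule is a crude stand-in for the author's model, and the parameter values are illustrative:

```python
import random
from collections import Counter

def grow(n_nodes=5000, m=2, p_reciprocity=0.6, seed=0):
    random.seed(seed)
    endpoints = [0, 1]        # edge-endpoint multiset: sampling it is degree-biased
    degree = Counter({0: 1, 1: 1})
    for new in range(2, n_nodes):
        for _ in range(m):
            partner = random.choice(endpoints)   # preferential attachment
            while partner == new:                # avoid self-loops
                partner = random.choice(endpoints)
            degree[new] += 1
            degree[partner] += 1
            endpoints += [new, partner]
            if random.random() < p_reciprocity:  # partner reciprocates
                degree[partner] += 1
                degree[new] += 1
                endpoints += [partner, new]
    return degree

degrees = grow()
print("max degree:", max(degrees.values()))
```

Comparing the degree distributions at p_reciprocity = 0 and 0.6 shows how reciprocal ties thin out the low-degree end, in line with the effect described above.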
Marcellin, Patrick; Kutala, Blaise K
2018-02-01
Chronic liver diseases (CLDs) represent an important, and certainly underestimated, global public health problem. CLDs are highly prevalent and silent, and are related to different, sometimes associated, causes. The distribution of the causes of these diseases is slowly changing, and within the next decade the proportion of virus-induced CLDs will certainly decrease significantly while the proportion of NASH-related CLDs will increase. There is an urgent need for effective global actions, including education, prevention and early diagnosis, to manage and treat CLDs, thus preventing cirrhosis-related morbidity and mortality. Our role is to increase the awareness of the public, healthcare professionals and public health authorities to encourage active policies for early management that will decrease the short- and long-term public health burden of these diseases. Because necroinflammation is the key mechanism in the progression of CLDs, it should be detected early. Thus, large-scale screening for CLDs is needed. ALT levels are an easy and inexpensive marker of liver necroinflammation and could be the first-line tool in this process. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
III. FROM SMALL TO BIG: METHODS FOR INCORPORATING LARGE SCALE DATA INTO DEVELOPMENTAL SCIENCE.
Davis-Kean, Pamela E; Jager, Justin
2017-06-01
For decades, developmental science has been based primarily on relatively small-scale data collections with children and families. Part of the reason for the dominance of this type of data collection is the complexity of collecting cognitive and social data on infants and small children. These small data sets are limited both in their power to detect differences and in the demographic diversity needed to generalize clearly and broadly. Thus, in this chapter we discuss the value of using existing large-scale data sets to test the complex questions of child development, and how to develop future large-scale data sets that are both representative and able to answer the important questions of developmental scientists. © 2017 The Society for Research in Child Development, Inc.
Derivation of large-scale cellular regulatory networks from biological time series data.
de Bivort, Benjamin L
2010-01-01
Pharmacological agents and other perturbants of cellular homeostasis appear to nearly universally affect the activity of many genes, proteins, and signaling pathways. While this is due in part to nonspecificity of action of the drug or cellular stress, the large-scale self-regulatory behavior of the cell may also be responsible, as this typically means that when a cell switches states, dozens or hundreds of genes will respond in concert. If many genes act collectively in the cell during state transitions, rather than every gene acting independently, models of the cell can be created that are comprehensive of the action of all genes, using existing data, provided that the functional units in the model are collections of genes. Techniques to develop these large-scale cellular-level models are provided in detail, along with methods of analyzing them, and a brief summary of major conclusions about large-scale cellular networks to date.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gordon, A.; Lubliner, M.; Howard, L.
2014-04-01
This project analyzes the cost effectiveness of energy-savings measures installed by a large public housing authority in Salishan, a community in Tacoma, Washington. The research focuses on the modeled and measured energy usage of the first six phases of construction and compares the energy usage of those phases to phase 7. Market-ready energy solutions were also evaluated to improve the efficiency of new and existing (built since 2001) affordable housing in the marine climate of Washington State.
Policy challenges and approaches for the conservation of mangrove forests in Southeast Asia.
Friess, Daniel A; Thompson, Benjamin S; Brown, Ben; Amir, A Aldrie; Cameron, Clint; Koldewey, Heather J; Sasmito, Sigit D; Sidik, Frida
2016-10-01
Many drivers of mangrove forest loss operate over large scales and are most effectively addressed by policy interventions. However, conflicting or unclear policy objectives exist at multiple tiers of government, resulting in contradictory management decisions. To address this, we considered four approaches that are being used increasingly or could be deployed in Southeast Asia to ensure sustainable livelihoods and biodiversity conservation. First, a stronger incorporation of mangroves into marine protected areas (that currently focus largely on reefs and fisheries) could resolve some policy conflicts and ensure that mangroves do not fall through a policy gap. Second, examples of community and government comanagement exist, but achieving comanagement at scale will be important in reconciling stakeholders and addressing conflicting policy objectives. Third, private-sector initiatives could protect mangroves through existing and novel mechanisms in degraded areas and areas under future threat. Finally, payments for ecosystem services (PES) hold great promise for mangrove conservation, with carbon PES schemes (known as blue carbon) attracting attention. Although barriers remain to the implementation of PES, the potential to implement them at multiple scales exists. Closing the gap between mangrove conservation policies and action is crucial to the improved protection and management of this imperiled coastal ecosystem and to the livelihoods that depend on them. © 2016 Society for Conservation Biology.
Ward identities and consistency relations for the large scale structure with multiple species
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peloso, Marco; Pietroni, Massimo, E-mail: peloso@physics.umn.edu, E-mail: pietroni@pd.infn.it
2014-04-01
We present fully nonlinear consistency relations for the squeezed bispectrum of Large Scale Structure. These relations hold when the matter component of the Universe is composed of one or more species, and generalize those obtained in [1,2] in the single-species case. The multi-species relations apply to the standard dark matter + baryons scenario, as well as to the case in which some of the fields are auxiliary quantities describing a particular population, such as dark matter halos or a specific galaxy class. If a large-scale velocity bias exists between the different populations, new terms appear in the consistency relations with respect to the single-species case. As an illustration, we discuss two physical cases in which such a velocity bias can exist: (1) a new long-range scalar force in the dark matter sector (resulting in a violation of the equivalence principle in the dark matter-baryon system), and (2) the distribution of dark matter halos relative to that of the underlying dark matter field.
Rebling, Johannes; Estrada, Héctor; Gottschalk, Sven; Sela, Gali; Zwack, Michael; Wissmeyer, Georg; Ntziachristos, Vasilis; Razansky, Daniel
2018-04-19
A critical link exists between pathological changes of the cerebral vasculature and diseases affecting brain function. Microscopic techniques have played an indispensable role in the study of neurovascular anatomy and function. Yet, investigations are often hindered by suboptimal trade-offs between the spatiotemporal resolution, field of view (FOV) and type of contrast offered by existing optical microscopy techniques. We present a hybrid dual-wavelength optoacoustic (OA) biomicroscope capable of rapid transcranial visualization of large-scale cerebral vascular networks. The system offers 3-dimensional views of the morphology and oxygenation status of the cerebral vasculature with single-capillary resolution and an FOV exceeding 6 × 8 mm², thus covering the entire cortical vasculature in mice. The large-scale OA imaging capacity is complemented by simultaneously acquired pulse-echo ultrasound (US) biomicroscopy scans of the mouse skull. The new approach holds great potential to provide better insights into cerebrovascular function and facilitate efficient studies into neurological and vascular abnormalities of the brain. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Volkov, Vadim
2014-01-01
This brief opinion proposes measures to increase efficiency and exclude errors in biomedical research under the current, rapidly changing conditions. Rapid change in biology began with the description of the three-dimensional structure of DNA 60 years ago; today biology progresses by interacting with computer science and nanoscience, together with the introduction of robotic stations for the acquisition of large-scale arrays of data. These changes have had a growing influence on the entire research and scientific community. Future advances demand short-term measures to ensure error-proof and efficient development. These can include the fast publishing of negative results, the publishing of detailed methods papers, and the removal of the strict link between career progression and publication activity, especially for younger researchers. Further development of theoretical and systems biology, together with the use of multiple experimental methods for biological experiments, could also be helpful on the timescale of years and decades. With regard to the links between science and society, it is reasonable to compare the two systems, to find and describe the features specific to biology, and to integrate it into the existing stream of social life and financial flows. This will raise the level of scientific research and have positive effects for both biology and society. Several examples are given for further discussion.
Hutchinson, Marie; Jackson, Debra
2015-03-01
Health-care and public sector institutions are high-risk settings for workplace bullying. Despite growing acknowledgement of the scale and consequence of this pervasive problem, there has been little critical examination of the institutional power dynamics that enable bullying. In the aftermath of large-scale failures in care standards in public sector healthcare institutions, which were characterised by managerial bullying, attention to the nexus between bullying, power and institutional failures is warranted. In this study, employing Foucault's framework of power, we illuminate bullying as a feature of structures of power and knowledge in public sector institutions. Our analysis draws upon the experiences of a large sample (n = 3345) of workers in Australian public sector agencies - the type with which most nurses in the public setting will be familiar. In foregrounding these power dynamics, we provide further insight into how cultures that are antithetical to institutional missions can arise and seek to broaden the debate on the dynamics of care failures within public sector institutions. Understanding the practices of power in public sector institutions, particularly in the context of ongoing reform, has important implications for nursing. © 2014 John Wiley & Sons Ltd.
Interagency Collaborative Team Model for Capacity Building to Scale-Up Evidence-Based Practice
Hurlburt, Michael; Aarons, Gregory A; Fettes, Danielle; Willging, Cathleen; Gunderson, Lara; Chaffin, Mark J
2015-01-01
Background: System-wide scale up of evidence-based practice (EBP) is a complex process. Yet, few strategic approaches exist to support EBP implementation and sustainment across a service system. Building on the Exploration, Preparation, Implementation, and Sustainment (EPIS) implementation framework, we developed and are testing the Interagency Collaborative Team (ICT) process model to implement an evidence-based child neglect intervention (i.e., SafeCare®) within a large children’s service system. The ICT model emphasizes the role of local agency collaborations in creating structural supports for successful implementation. Methods: We describe the ICT model and present preliminary qualitative results from use of the implementation model in one large scale EBP implementation. Qualitative interviews were conducted to assess challenges in building system, organization, and home visitor collaboration and capacity to implement the EBP. Data collection and analysis centered on EBP implementation issues, as well as the experiences of home visitors under the ICT model. Results: Six notable issues relating to implementation process emerged from participant interviews, including: (a) initial commitment and collaboration among stakeholders, (b) leadership, (c) communication, (d) practice fit with local context, (e) ongoing negotiation and problem solving, and (f) early successes. These issues highlight strengths and areas for development in the ICT model. Conclusions: Use of the ICT model led to sustained and widespread use of SafeCare in one large county. Although some aspects of the implementation model may benefit from enhancement, qualitative findings suggest that the ICT process generates strong structural supports for implementation and creates conditions in which tensions between EBP structure and local contextual variations can be resolved in ways that support the expansion and maintenance of an EBP while preserving potential for public health benefit. PMID:27512239
Groundwater Variability in a Sandstone Catchment and Linkages with Large-scale Climatic Circulation
NASA Astrophysics Data System (ADS)
Hannah, D. M.; Lavers, D. A.; Bradley, C.
2015-12-01
Groundwater is a crucial water resource that sustains river ecosystems and provides public water supply. Furthermore, during periods of prolonged high rainfall, groundwater-dominated catchments can be subject to protracted flooding. Climate change and associated projected increases in the frequency and intensity of hydrological extremes have implications for groundwater levels. This study builds on previous research undertaken on a Chalk catchment by investigating groundwater variability in a UK sandstone catchment: the Tern in Shropshire. In contrast to the Chalk, sandstone is characterised by a more lagged response to precipitation inputs; and, as such, it is important to determine the groundwater behaviour and its links with the large-scale climatic circulation to improve process understanding of recharge, groundwater level and river flow responses to hydroclimatological drivers. Precipitation, river discharge and groundwater levels for borehole sites in the Tern basin over 1974-2010 are analysed as the target variables; and we use monthly gridded reanalysis data from the Twentieth Century Reanalysis Project (20CR). First, groundwater variability is evaluated and associations with precipitation/discharge are explored using monthly concurrent and lagged correlation analyses. Second, gridded 20CR reanalysis data are used in composite and correlation analyses to identify the regions of strongest climate-groundwater association. Results show that reasonably strong climate-groundwater connections exist in the Tern basin, with a lag of several months. These lags are associated primarily with the time taken for recharge waters to percolate through to the groundwater table. The uncovered patterns improve knowledge of large-scale climate forcing of groundwater variability and may provide a basis to inform seasonal prediction of groundwater levels, which would be useful for strategic water resource planning.
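The concurrent and lagged correlation step can be sketched as follows (a minimal illustration: synthetic monthly series stand in for the borehole and 20CR data, and the built-in four-month lag is an arbitrary choice):

```python
import numpy as np
import pandas as pd

# Synthetic stand-ins for monthly precipitation and groundwater level, 1974-2010.
rng = np.random.default_rng(0)
idx = pd.date_range("1974-01", "2010-12", freq="MS")
precip = pd.Series(rng.gamma(2.0, 30.0, len(idx)), index=idx)
# Groundwater responds to smoothed precipitation with a ~4-month delay plus noise.
gw = 0.6 * precip.shift(4).rolling(3).mean() + rng.normal(0, 5, len(idx))

# Concurrent (lag 0) and lagged Pearson correlations; the lag with the strongest
# correlation indicates the recharge delay.
for lag in range(13):
    r = gw.corr(precip.shift(lag))
    print(f"lag {lag:2d} months: r = {r:.2f}")
```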
Ecological fire use for ecological fire management: Managing large wildfires by design
Timothy Ingalsbee
2015-01-01
Past fire exclusion policies and fire suppression actions have led to a historic "fire deficit" on public wildlands. These sociocultural actions have produced unprecedented environmental changes that have created conditions conducive to more frequent large-scale wildfires. Politicians, the news media, and agency officials portray large wildland fires as...
Gravity versus radiation models: on the importance of scale and heterogeneity in commuting flows.
Masucci, A Paolo; Serras, Joan; Johansson, Anders; Batty, Michael
2013-08-01
We test the recently introduced radiation model against the gravity model for the system composed of England and Wales, both for commuting patterns and for public transportation flows. The analysis is performed both at macroscopic scales, i.e., at the national scale, and at microscopic scales, i.e., at the city level. It is shown that the thermodynamic limit assumption for the original radiation model significantly underestimates the commuting flows for large cities. We then generalize the radiation model, introducing the correct normalization factor for finite systems. We show that even though the gravity model has a better overall performance, the parameter-free radiation model gives competitive results, especially at large scales.
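A minimal sketch of the radiation model with a finite-size normalization of the kind described (we read the correction as a 1/(1 - m_i/M) factor; the numbers below are toy values, so treat the snippet as illustrative):

```python
def radiation_flux(T_i, m_i, n_j, s_ij, M=None):
    # T_i: commuters leaving origin i; m_i, n_j: origin/destination populations;
    # s_ij: population inside the circle of radius r_ij around i (excluding m_i, n_j).
    p = (m_i * n_j) / ((m_i + s_ij) * (m_i + n_j + s_ij))
    if M is not None:            # finite-system normalization (M = total population)
        p /= (1.0 - m_i / M)
    return T_i * p

# The correction matters when m_i is a sizeable share of M, i.e., for large cities.
print(radiation_flux(1000, m_i=2e6, n_j=5e5, s_ij=3e6))            # thermodynamic limit
print(radiation_flux(1000, m_i=2e6, n_j=5e5, s_ij=3e6, M=5.6e7))   # finite-size corrected
```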
On a Game of Large-Scale Projects Competition
NASA Astrophysics Data System (ADS)
Nikonov, Oleg I.; Medvedeva, Marina A.
2009-09-01
The paper is devoted to game-theoretical control problems motivated by economic decision-making situations arising in the realization of large-scale projects, such as designing and putting into operation new gas or oil pipelines. A non-cooperative two-player game is considered, with payoff functions of a special type for which standard existence theorems and algorithms for finding Nash equilibrium solutions are not applicable. The paper is based on and develops the results obtained in [1]-[5].
Superfluid-like turbulence in cosmology
NASA Technical Reports Server (NTRS)
Gradwohl, Ben-Ami
1991-01-01
A network of vortices in a superfluid system exhibits turbulent behavior. It is argued that the universe may have experienced such a phase of superfluid-like turbulence due to the existence of a coherent state with non-topological charge and a network of global strings. The unique feature of a distribution of turbulent domains is that it can yield non-gravitationally induced large-scale coherent velocities. It may be difficult, however, to relate these velocities to the observed large-scale bulk motion.
Supporting large scale applications on networks of workstations
NASA Technical Reports Server (NTRS)
Cooper, Robert; Birman, Kenneth P.
1989-01-01
Distributed applications on networks of workstations are an increasingly common way to satisfy computing needs. However, existing mechanisms for distributed programming exhibit poor performance and reliability as application size increases. Extension of the ISIS distributed programming system to support large scale distributed applications by providing hierarchical process groups is discussed. Incorporation of hierarchy in the program structure and exploitation of this to limit the communication and storage required in any one component of the distributed system is examined.
Validating Bayesian truth serum in large-scale online human experiments.
Frank, Morgan R; Cebrian, Manuel; Pickard, Galen; Rahwan, Iyad
2017-01-01
Bayesian truth serum (BTS) is an exciting new method for improving honesty and information quality in multiple-choice surveys, but, despite the method's mathematical reliance on large sample sizes, the existing literature about BTS focuses only on small experiments. Given the prevalence of online survey platforms, such as Amazon's Mechanical Turk, which facilitate surveys with hundreds or thousands of participants, BTS must be effective in large-scale experiments if it is to become a readily accepted tool in real-world applications. We demonstrate that BTS quantifiably improves honesty in large-scale online surveys where the "honest" distribution of answers is known in expectation on aggregate. Furthermore, we explore a marketing application where "honest" answers cannot be known, but find that BTS treatment impacts the resulting distributions of answers. PMID:28494000
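For reference, a compact sketch of the BTS scoring rule (Prelec's "surprisingly common" criterion; the epsilon guards and the toy data are our implementation choices, not from this study):

```python
import numpy as np

def bts_scores(answers, predictions, alpha=1.0):
    # answers: (N,) chosen option per respondent (0..K-1)
    # predictions: (N, K) each respondent's predicted population distribution
    N, K = predictions.shape
    eps = 1e-9
    xbar = np.bincount(answers, minlength=K) / N            # observed answer shares
    pbar = np.exp(np.log(predictions + eps).mean(axis=0))   # geometric-mean predictions
    info = np.log((xbar[answers] + eps) / (pbar[answers] + eps))            # information score
    pred = alpha * (xbar * np.log((predictions + eps) / (xbar + eps))).sum(axis=1)  # prediction score
    return info + pred   # truth-telling maximizes expected score in equilibrium

rng = np.random.default_rng(0)
ans = rng.integers(0, 3, size=1000)
preds = rng.dirichlet(np.ones(3), size=1000)
print(bts_scores(ans, preds)[:5])
```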
Edwards, Nick
2003-05-15
It can be difficult to reach people in their teens and early 20s with overt public health messages. Borrowing some of the messages and media of 'alternative culture' marketing can be more effective. It is possible to organise large-scale events cheaply with the right funding and entertainment partners.
Guidance For The Bioremediation Of Oil-Contaminated Wetlands, Marshes, And Marine Shorelines
Marine shorelines are important public and ecological resources that serve as a home to a variety of wildlife and provide public recreation. Marine oil spills, particularly large-scale spill accidents, have posed great threats and caused extensive damage to the marine coastal environment.
Yi, Tianzhu; He, Zhihua; He, Feng; Dong, Zhen; Wu, Manqing
2017-01-01
This paper presents an efficient and precise imaging algorithm for the large bandwidth sliding spotlight synthetic aperture radar (SAR). The existing sub-aperture processing method based on the baseband azimuth scaling (BAS) algorithm cannot cope with the high-order phase coupling along the range and azimuth dimensions. This coupling problem causes defocusing along the range and azimuth dimensions. This paper proposes a generalized chirp scaling (GCS)-BAS processing algorithm, which is based on the GCS algorithm. It successfully mitigates the defocusing along the range dimension of a sub-aperture of the large bandwidth sliding spotlight SAR, as well as the high-order phase coupling along the range and azimuth dimensions. Additionally, azimuth focusing can be achieved by this azimuth scaling method. Simulation results demonstrate the ability of the GCS-BAS algorithm to process large bandwidth sliding spotlight SAR data. Great improvements in focus depth and imaging accuracy are obtained via the GCS-BAS algorithm. PMID:28555057
O'Connor, Ben L; Hamada, Yuki; Bowen, Esther E; Grippo, Mark A; Hartmann, Heidi M; Patton, Terri L; Van Lonkhuyzen, Robert A; Carr, Adrianne E
2014-11-01
Large areas of public lands administered by the Bureau of Land Management and located in arid regions of the southwestern United States are being considered for the development of utility-scale solar energy facilities. Land-disturbing activities in these desert, alluvium-filled valleys have the potential to adversely affect the hydrologic and ecologic functions of ephemeral streams. Regulation and management of ephemeral streams typically falls under a spectrum of federal, state, and local programs, but scientifically based guidelines for protecting ephemeral streams with respect to land-development activities are largely nonexistent. This study developed an assessment approach for quantifying the sensitivity to land disturbance of ephemeral stream reaches located in proposed solar energy zones (SEZs). The ephemeral stream assessment approach used publicly-available geospatial data on hydrology, topography, surficial geology, and soil characteristics, as well as high-resolution aerial imagery. These datasets were used to inform a professional judgment-based score index of potential land disturbance impacts on selected critical functions of ephemeral streams, including flow and sediment conveyance, ecological habitat value, and groundwater recharge. The total sensitivity scores (sum of scores for the critical stream functions of flow and sediment conveyance, ecological habitats, and groundwater recharge) were used to identify highly sensitive stream reaches to inform decisions on developable areas in SEZs. Total sensitivity scores typically reflected the scores of the individual stream functions; some exceptions pertain to groundwater recharge and ecological habitats. The primary limitations of this assessment approach were the lack of high-resolution identification of ephemeral stream channels in the existing National Hydrography Dataset, and the lack of mechanistic processes describing potential impacts on ephemeral stream functions at the watershed scale. The primary strength of this assessment approach is that it allows watershed-scale planning for low-impact development in arid ecosystems; the qualitative scoring of potential impacts can also be adjusted to accommodate new geospatial data, and to allow for expert and stakeholder input into decisions regarding the identification and potential avoidance of highly sensitive stream reaches.
After Fukushima: managing the consequences of a radiological release.
Fitzgerald, Joe; Wollner, Samuel B; Adalja, Amesh A; Morhard, Ryan; Cicero, Anita; Inglesby, Thomas V
2012-06-01
Even amidst the devastation following the earthquake and tsunami in Japan that killed more than 20,000 people, it was the accident at the Fukushima Daiichi nuclear power plant that led the country's prime minister, Naoto Kan, to fear for "the very existence of the Japanese nation." While accidents that result in mass radiological releases have been rare throughout the operating histories of existing nuclear power plants, the growing number of plants worldwide increases the likelihood that such releases will occur again in the future. Nuclear power is an important source of energy in the U.S. and will be for the foreseeable future. Accidents far smaller in scale than the one in Fukushima could have major societal consequences. Given the extensive, ongoing Nuclear Regulatory Commission (NRC) and industry assessment of nuclear power plant safety and preparedness issues, the Center for Biosecurity of UPMC focused on offsite policies and plans intended to reduce radiation exposure to the public in the aftermath of an accident. This report provides an assessment of Japan's efforts at nuclear consequence management; identifies concerns with current U.S. policies and practices for "outside the fence" management of such an event in the U.S.; and makes recommendations for steps that can be taken to strengthen U.S. government, industry, and community response to large-scale accidents at nuclear power plants.
Reynolds number trend of hierarchies and scale interactions in turbulent boundary layers.
Baars, W J; Hutchins, N; Marusic, I
2017-03-13
Small-scale velocity fluctuations in turbulent boundary layers are often coupled with the larger-scale motions. Studying the nature and extent of this scale interaction allows for a statistically representative description of the small scales over a time scale of the larger, coherent scales. In this study, we consider temporal data from hot-wire anemometry at Reynolds numbers ranging from Re_τ ≈ 2800 to 22 800, in order to reveal how the scale interaction varies with Reynolds number. Large-scale conditional views of the representative amplitude and frequency of the small-scale turbulence, relative to the large-scale features, complement the existing consensus on large-scale modulation of the small-scale dynamics in the near-wall region. Modulation is a type of scale interaction, where the amplitude of the small-scale fluctuations is continuously proportional to the near-wall footprint of the large-scale velocity fluctuations. Aside from this amplitude modulation phenomenon, we reveal the influence of the large-scale motions on the characteristic frequency of the small scales, known as frequency modulation. From the wall-normal trends in the conditional averages of the small-scale properties, it is revealed how the near-wall modulation transitions to an intermittent-type scale arrangement in the log-region. On average, the amplitude of the small-scale velocity fluctuations only deviates from its mean value in a confined temporal domain, the duration of which is fixed in terms of the local Taylor time scale. These concentrated temporal regions are centred on the internal shear layers of the large-scale uniform momentum zones, which exhibit regions of positive and negative streamwise velocity fluctuations. With an increasing scale separation at high Reynolds numbers, this interaction pattern encompasses the features found in studies on internal shear layers and concentrated vorticity fluctuations in high-Reynolds-number wall turbulence. This article is part of the themed issue 'Toward the development of high-fidelity models of wall turbulence at large Reynolds number'. © 2017 The Author(s).
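The amplitude-modulation diagnostic referred to here can be sketched with a Hilbert-transform envelope, in the style common to this literature (the synthetic signal and filter settings below are placeholders, not the paper's experimental parameters):

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs, f_cut = 10_000.0, 50.0                    # sample rate and scale-separation cutoff [Hz]
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)
# Synthetic velocity signal: a slow "large scale" whose level modulates fast noise.
u = np.sin(2 * np.pi * 5 * t) + 0.3 * (1 + 0.5 * np.sin(2 * np.pi * 5 * t)) * rng.standard_normal(t.size)

b, a = butter(4, f_cut / (fs / 2), btype="low")
u_L = filtfilt(b, a, u)                       # large-scale component
u_S = u - u_L                                 # small-scale residual
env = filtfilt(b, a, np.abs(hilbert(u_S)))    # low-passed small-scale envelope

print("amplitude-modulation coefficient:", np.corrcoef(u_L, env)[0, 1])
```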
Humphreys, B L; Hole, W T; McCray, A T; Fitzmaurice, J M
1996-01-01
The National Library of Medicine (NLM) and the Agency for Health Care Policy and Research (AHCPR) are sponsoring a test to determine the extent to which a combination of existing health-related terminologies covers vocabulary needed in health information systems. The test vocabularies are the 30 that are fully or partially represented in the 1996 edition of the Unified Medical Language System (UMLS) Metathesaurus, plus three planned additions: the portions of SNOMED International not in the 1996 Metathesaurus Read Clinical Classification, and the Logical Observations Identifiers, Names, and Codes (LOINC) system. These vocabularies are available to testers through a special interface to the Internet-based UMLS Knowledge Source Server. The test will determine the ability of the test vocabularies to serve as a source of controlled vocabulary for health data systems and applications. It should provide the basis for realistic resource estimates for developing and maintaining a comprehensive "standard" health vocabulary that is based on existing terminologies. PMID:8816351
Soil Water Content Sensors as a Method of Measuring Ice Depth
NASA Astrophysics Data System (ADS)
Whitaker, E.; Reed, D. E.; Desai, A. R.
2015-12-01
Lake ice depth provides important information about local and regional climate change, weather patterns, and recreational safety, as well as impacting in situ ecology and carbon cycling. However, it is challenging to measure ice depth continuously from a remote location, as existing methods are too large, expensive, and/or time-intensive. Therefore, we present a novel application that reduces the size and cost issues by using soil water content reflectometer sensors. Analysis of sensors deployed in an environmental chamber using a scale model of a lake demonstrated their value as accurate measures of the change in ice depth over any time period, through measurement of the liquid-to-solid phase change. A robust correlation exists between volumetric water content over time and environmental temperature. This relationship allows us to convert volumetric water content into ice depth. An array of these sensors will be placed in Lake Mendota, Madison, Wisconsin in winter 2015-2016, to create a temporally high-resolution ice depth record, which will be used for ecological or climatological studies while also being transmitted to the public to increase recreational safety.
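A hypothetical sketch of the conversion (the threshold, sensor depths, and synthetic data are assumptions; the calibration against environmental temperature described above would replace the crude threshold used here):

```python
import numpy as np

depths = np.array([0.05, 0.10, 0.20, 0.35, 0.50])   # sensor depths below surface [m], assumed
hours = np.arange(240)                               # 10 days of hourly readings
# Synthetic VWC traces: each sensor drops from ~0.9 (liquid) to ~0.1 (ice)
# at a progressively later hour the deeper it sits.
freeze_hour = 40 * np.arange(1, depths.size + 1)
vwc = np.where(hours[None, :] < freeze_hour[:, None], 0.9, 0.1)

def ice_depth(vwc, depths, threshold=0.3):
    """Ice depth per time step = depth of the deepest sensor whose volumetric
    water content reads 'frozen' (below threshold), marking the phase change."""
    frozen = vwc < threshold                         # (n_sensors, n_times) boolean
    return np.where(frozen, depths[:, None], 0.0).max(axis=0)

print(ice_depth(vwc, depths)[::48])                  # ice depth sampled every two days
```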
Performance-Oriented Privacy-Preserving Data Integration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pon, R K; Critchlow, T
2004-09-15
Current solutions to integrating private data with public data have provided useful privacy metrics, such as relative information gain, that can be used to evaluate alternative approaches. Unfortunately, they have not addressed critical performance issues, especially when the public database is very large. The use of hashes and noise yields better performance than existing techniques while still making it difficult for unauthorized entities to distinguish which data items truly exist in the private database. As we show here, leveraging the uncertainty introduced by collisions caused by hashing and the injection of noise, we present a technique for performing a relational join operation between a massive public table and a relatively smaller private one.
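An illustrative sketch of the hashing-plus-noise idea (our own toy construction, not the paper's protocol; the salt, hash truncation, and noise volume are arbitrary choices):

```python
import hashlib
import random

SALT = b"shared-secret-salt"   # hypothetical salt shared by the private side

def h(key: str, bits: int = 16) -> int:
    # Truncated salted hash: fewer bits mean more collisions and more ambiguity.
    d = hashlib.sha256(SALT + key.encode()).digest()
    return int.from_bytes(d[:4], "big") % (1 << bits)

private_keys = {"alice", "bob", "carol"}
noise = {random.randrange(1 << 16) for _ in range(1000)}     # injected noise hashes
published = {h(k) for k in private_keys} | noise             # what leaves the private side

# The public side joins against the hash set; collisions and noise keep true
# membership in the private table ambiguous.
public_table = [("alice", "row1"), ("dave", "row2"), ("erin", "row3")]
candidates = [row for row in public_table if h(row[0]) in published]
print(candidates)   # a superset of the true join result, filtered privately later
```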
Could the electroweak scale be linked to the large scale structure of the Universe?
NASA Technical Reports Server (NTRS)
Chakravorty, Alak; Massarotti, Alessandro
1991-01-01
We study a model where the domain walls are generated through a cosmological phase transition involving a scalar field. We assume the existence of a coupling between the scalar field and dark matter and show that the interaction between domain walls and dark matter leads to an energy dependent reflection mechanism. For a simple Yukawa coupling, we find that the vacuum expectation value of the scalar field is θ ≈ 30 GeV - 1 TeV, in order for the model to be successful in the formation of large scale 'pancake' structures.
NASA Astrophysics Data System (ADS)
Manfredi, Sabato
2016-06-01
Large-scale dynamic systems are becoming highly pervasive, with applications ranging from systems biology and environmental monitoring to sensor networks and power systems. They are characterised by high dimensionality, complexity, and uncertainty in the node dynamics/interactions, and they demand increasingly computationally intensive methods for analysis and control design as the network size and node/interaction complexity grow. It is therefore a challenging problem to find scalable computational methods for the distributed control design of large-scale networks. In this paper, we investigate the robust distributed stabilisation problem of large-scale nonlinear multi-agent systems (MASs) composed of non-identical (heterogeneous) linear dynamical systems coupled by uncertain nonlinear time-varying interconnections. By employing Lyapunov stability theory and the linear matrix inequality (LMI) technique, new conditions are given for the distributed control design of large-scale MASs that can be easily solved with the MATLAB toolbox. The stabilisability of each node dynamic is a sufficient assumption for designing a globally stabilising distributed control. The proposed approach improves on some of the existing LMI-based results for MASs by both overcoming their computational limits and extending the applicative scenario to large-scale nonlinear heterogeneous MASs. Additionally, the proposed LMI conditions are further reduced in computational requirements in the case of weakly heterogeneous MASs, a common scenario in real applications where the network nodes and links are affected by parameter uncertainties. One of the main advantages of the proposed approach is that it allows a move from a centralised towards a distributed computing architecture, so that the expensive computational workload spent solving LMIs may be shared among processors located at the networked nodes, increasing the scalability of the approach with the network size. Finally, a numerical example shows the applicability of the proposed method and its advantage in terms of computational complexity when compared with existing approaches.
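As a single-node illustration of the LMI machinery invoked here (a minimal sketch in Python/CVXPY rather than the MATLAB toolbox; the system matrices are arbitrary examples, and the paper's distributed, interconnection-robust conditions are considerably richer):

```python
import numpy as np
import cvxpy as cp

A = np.array([[0.0, 1.0], [2.0, -1.0]])   # unstable node dynamics (toy example)
B = np.array([[0.0], [1.0]])
n, m = B.shape

P = cp.Variable((n, n), symmetric=True)   # Lyapunov matrix
Y = cp.Variable((m, n))                   # change of variables Y = K @ P

# Standard stabilising state-feedback LMI: P > 0, A P + P A' + B Y + Y' B' < 0.
eps = 1e-3
constraints = [
    P >> eps * np.eye(n),
    A @ P + P @ A.T + B @ Y + Y.T @ B.T << -eps * np.eye(n),
]
cp.Problem(cp.Minimize(0), constraints).solve()

K = Y.value @ np.linalg.inv(P.value)      # stabilising gain, u = K x
print("closed-loop eigenvalues:", np.linalg.eigvals(A + B @ K))
```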
Why do large and small scales couple in a turbulent boundary layer?
NASA Astrophysics Data System (ADS)
Bandyopadhyay, Promode R.
2011-11-01
Correlation measurement, which is not definitive, suggests that large and small scales in a turbulent boundary layer (TBL) couple. A TBL is modeled as a jungle of interacting nonlinear oscillators to explore the origin of the coupling. These oscillators have the inherent property of self-sustainability, disturbance rejection, and of self-referential phase reset whereby several oscillators can phase align (or have constant phase difference between them) when an "external" impulse is applied. Consequently, these properties of a TBL are accounted for: self-sustainability, return of the wake component after a disturbance is removed, and the formation of the 18° large structures, which are composed of a sequential train of hairpin vortices. The nonlinear ordinary differential equations of the oscillators are solved using an analog circuit for rapid solution. The post-bifurcation limit cycles are determined. A small scale and a large scale are akin to two different oscillators. The state variables from the two disparate interacting oscillators are shown to couple and the small scales appear at certain regions of the phase of the large scale. The coupling is a consequence of the nonlinear oscillatory behavior. Although state planes exist where the disparate scales appear de-superposed, all scales in a TBL are in fact coupled and they cannot be monochromatically isolated.
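As a toy rendering of this idea (not the author's circuit equations, which the abstract does not give; coupling strength and frequencies below are assumptions), two coupled van der Pol oscillators with disparate natural frequencies can stand in for a "large scale" and a "small scale":

```python
import numpy as np
from scipy.integrate import solve_ivp

mu, w1, w2, c = 1.0, 1.0, 8.0, 0.4   # nonlinearity, slow/fast frequencies, coupling

def rhs(t, s):
    x1, v1, x2, v2 = s
    dv1 = mu * (1 - x1**2) * v1 - w1**2 * x1 + c * (x2 - x1)   # slow oscillator
    dv2 = mu * (1 - x2**2) * v2 - w2**2 * x2 + c * (x1 - x2)   # fast oscillator
    return [v1, dv1, v2, dv2]

sol = solve_ivp(rhs, (0, 200), [0.1, 0.0, 0.2, 0.0], max_step=0.01)
x1, x2 = sol.y[0], sol.y[2]
# After the transient, both settle onto limit cycles; the fast oscillator's
# envelope becomes organised by the slow one, a crude analogue of scale coupling.
print(np.corrcoef(x1, np.abs(x2))[0, 1])
```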
What Will the Neighbors Think? Building Large-Scale Science Projects Around the World
Jones, Craig; Mrotzek, Christian; Toge, Nobu; Sarno, Doug
2017-12-22
Public participation is an essential ingredient for turning the International Linear Collider into a reality. Wherever the proposed particle accelerator is sited in the world, its neighbors -- in any country -- will have something to say about hosting a 35-kilometer-long collider in their backyards. When it comes to building large-scale physics projects, almost every laboratory has a story to tell. Three case studies from Japan, Germany and the US will be presented to examine how community relations are handled in different parts of the world. How do particle physics laboratories interact with their local communities? How do neighbors react to building large-scale projects in each region? How can the lessons learned from past experiences help in building the next big project? These and other questions will be discussed to engage the audience in an active dialogue about how a large-scale project like the ILC can be a good neighbor.
Brown, Kenneth H; Baker, Shawn K
2009-03-01
This paper summarizes the results of the foregoing reviews of the impact of different intervention strategies designed to enhance zinc nutrition, including supplementation, fortification, and dietary diversification or modification. Current evidence indicates a beneficial impact of such interventions on zinc status and zinc-related functional outcomes. Preventive zinc supplementation reduces the incidence of diarrhea and acute lower respiratory tract infection among young children, decreases mortality of children over 12 months of age, and increases growth velocity. Therapeutic zinc supplementation during episodes of diarrhea reduces the duration and severity of illness. Zinc fortification increases zinc intake and total absorbed zinc, and recent studies are beginning to confirm a positive impact of zinc fortification on indicators of population zinc status. To assist with the development of zinc intervention programs, more information is needed on the prevalence of zinc deficiency in different countries, and rigorous evaluations of the effectiveness of large-scale zinc intervention programs should be planned. Recommended steps for scaling up zinc intervention programs, with or without other micronutrients, are described. In summary, there is now clear evidence of the benefit of selected interventions to reduce the risk of zinc deficiency, and a global commitment is urgently needed to conduct systematic assessments of population zinc status and to develop interventions to control zinc deficiency in the context of existing public health and nutrition programs.
Three controversies over item disclosure in medical licensure examinations.
Park, Yoon Soo; Yang, Eunbae B
2015-01-01
In response to views on the public's right to know, there is growing attention to item disclosure - release of items, answer keys, and performance data to the public - in medical licensure examinations and its potential impact on the test's ability to measure competence and select qualified candidates. Recent debates on this issue have sparked legislative action internationally, including in South Korea, with prior discussions among North American countries dating back over three decades. The purpose of this study is to identify and analyze three issues associated with item disclosure in medical licensure examinations - 1) fairness and validity, 2) impact on passing levels, and 3) utility of item disclosure - by synthesizing existing literature in relation to standards in testing. Historically, the controversy over item disclosure has centered on fairness and validity. Proponents of item disclosure stress test takers' right to know, while opponents argue from a validity perspective. Item disclosure may bias item characteristics, such as difficulty and discrimination, and has consequences for setting passing levels. To date, there has been limited research on the utility of item disclosure for large-scale testing. These issues require ongoing and careful consideration.
Pandemic influenza A (H1N1) 2009 vaccination in children: a UK perspective.
de Whalley, Philip C S; Pollard, Andrew J
2013-03-01
Pandemic H1N1 influenza infection was common in the UK in 2009 and children were particularly vulnerable. Most cases were mild or subclinical, but there was significant mortality, predominantly in those with pre-existing disease. Despite the rapid development of monovalent pandemic vaccines, and the fast-tracked approval process, these products were not available for large-scale use until the end of the second wave of infection. Vaccine uptake was relatively low, both among children and health-care workers. The monovalent pandemic vaccines and the 2010/2011 trivalent seasonal influenza vaccines were immunogenic and effective, and they probably reduced the impact of the third wave of infection. Vaccines containing novel adjuvants enabled antigen sparing, but safety concerns could limit the future use of these adjuvanted influenza vaccines in children. Public perceptions that the threat of the pandemic was exaggerated by the authorities, and concerns about vaccine safety, might prompt an inadequate response to the next influenza pandemic, potentially compromising public health. © 2012 The Authors. Journal of Paediatrics and Child Health © 2012 Paediatrics and Child Health Division (Royal Australasian College of Physicians).
Pozzebon, Julie A; Visser, Beth A; Ashton, Michael C; Lee, Kibeom; Goldberg, Lewis R
2010-03-01
We investigated the psychometric properties of the Oregon Vocational Interest Scales (ORVIS), a brief public-domain alternative to commercial inventories, in a large community sample and in a college sample. In both samples, we examined the factor structure, scale intercorrelations, and personality correlates of the ORVIS, and in the community sample, we also examined the correlations of the ORVIS scales with cognitive abilities and with the scales of a longer, proprietary interest survey. In both samples, all 8 scales (Leadership, Organization, Altruism, Creativity, Analysis, Producing, Adventuring, and Erudition) showed wide variation in scores, high internal-consistency reliabilities, and a pattern of high convergent and low discriminant correlations with the scales of the proprietary interest survey. Overall, the results support the construct validity of the scales, which are recommended for use in research on vocational interests and other individual differences.
Strategic behavior and governance challenges in self-organized coupled natural-human systems
NASA Astrophysics Data System (ADS)
Muneepeerakul, R.; Anderies, J. M.
2017-12-01
Successful and sustainable coupling of human societies and natural systems requires effective governance, which depends on the existence of proper infrastructure (both hard and soft). In recent decades, much attention has been paid to what has allowed many small-scale self-organized coupled natural-human systems around the world to persist for centuries, thanks in large part to the work of Elinor Ostrom and colleagues. In this work, we mathematically operationalize a conceptual framework developed from this body of work by way of a stylized model. The model captures the interplay between replicator dynamics within the population, dynamics of natural resources, and threshold characteristics of public infrastructure. The model analysis reveals conditions for long-term sustainability and collapse of the coupled systems as well as other tradeoffs and potential pitfalls in governing these systems.
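A stylised numerical sketch of the three ingredients named above (our own construction for illustration; the functional forms, parameters, and threshold rule are assumptions, not the authors' model):

```python
from scipy.integrate import solve_ivp

r, K, harvest, delta, theta = 0.5, 1.0, 0.6, 0.15, 0.4   # assumed parameters

def rhs(t, s):
    x, R, I = s                                  # contributor share, resource, infrastructure
    eff = 1.0 if I > theta else 0.2              # infrastructure only works above a threshold
    pi_c = eff * R - 0.1                         # contributor payoff (contribution cost 0.1)
    pi_d = 0.5 * eff * R                         # free-rider payoff
    dx = x * (1 - x) * (pi_c - pi_d)             # replicator dynamics
    dR = r * R * (1 - R / K) - harvest * x * R   # logistic resource dynamics with harvest
    dI = 0.3 * x - delta * I                     # contributions build, decay erodes
    return [dx, dR, dI]

sol = solve_ivp(rhs, (0, 200), [0.4, 0.8, 0.5], max_step=0.1)
print(sol.y[:, -1])   # long-run state: sustained cooperation or collapse
```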
Colvin, Christopher J.
2014-01-01
The HIV epidemic is widely recognised as having prompted one of the most remarkable intersections ever of illness, science and activism. The production, circulation, use and evaluation of empirical scientific ‘evidence’ played a central part in activists’ engagement with AIDS science. Previous activist engagement with evidence focused on the social and biomedical responses to HIV in the global North as well as challenges around ensuring antiretroviral treatment (ART) was available in the global South. More recently, however, with the roll-out and scale-up of large public-sector ART programmes and new multi-dimensional prevention efforts, the relationships between evidence and activism have been changing. Scale-up of these large-scale treatment and prevention programmes represents an exciting new opportunity while bringing with it a host of new challenges. This paper examines what new forms of evidence and activism will be required to address the challenges of the scaling-up era of HIV treatment and prevention. It reviews some recent controversies around evidence and HIV scale-up and describes the different forms of evidence and activist strategies that will be necessary for a robust response to these new challenges. PMID:24498918
The Pros and Cons of Existing Formula Financing Systems and a Suggested New Approach.
ERIC Educational Resources Information Center
Van Wijk, Alfons P.; Levine, Jack B.
Because the Colleges of Applied Arts and Technology (CAATS) of Ontario are largely dependent on the Ontario Government for financial support, it is important to justify their level of expenditure, and prove that allotted public funds are effectively and efficiently managed. After a brief discussion of the existing resource allocation process in…
Deployment, Design, and Commercialization of Carbon-Negative Energy Systems
NASA Astrophysics Data System (ADS)
Sanchez, Daniel Lucio
Climate change mitigation requires gigaton-scale carbon dioxide removal technologies, yet few examples exist beyond niche markets. This dissertation informs large-scale implementation of bioenergy with carbon capture and sequestration (BECCS), a carbon-negative energy technology. It builds on existing literature with a novel focus on deployment, design, commercialization, and communication of BECCS. BECCS, combined with aggressive renewable deployment and fossil emission reductions, can enable a carbon-negative power system in Western North America by 2050, with up to 145% emissions reduction from 1990 levels. BECCS complements other sources of renewable energy, and can be deployed in a manner consistent with regional policies and design considerations. The amount of biomass resource available limits the level of fossil CO2 emissions that can still satisfy carbon emissions caps. Offsets produced by BECCS are more valuable to the power system than the electricity it provides. Implied costs of carbon for BECCS are relatively low ($75/ton CO2 at scale) for a capital-intensive technology. Optimal scales for BECCS are an order of magnitude larger than proposed scales found in existing literature. Deviations from the optimal scale have little effect on overall systems costs - suggesting that other factors, including regulatory, political, or logistical considerations, may ultimately have a greater influence on plant size than the techno-economic factors considered. The flexibility of thermochemical conversion enables a viable transition pathway for firms, utilities and governments to achieve net-negative CO2 emissions in production of electricity and fuels given increasingly stringent climate policy. Primary research, development (R&D), and deployment needs are in large-scale biomass logistics, gasification, gas cleaning, and geological CO2 storage. R&D programs, subsidies, and policy that recognize co-conversion processes can support this pathway to commercialization. Here, firms can embrace a gradual transition pathway to deep decarbonization, limiting economic dislocation and increasing transfer of knowledge between the fossil and renewable sectors. Global cumulative capital investment needs for BECCS through 2050 are over $1.9 trillion (2015$, 4% real interest rate) for scenarios likely to limit global warming to 2 °C. This scenario envisions deployment of as much as 24 GW/yr of BECCS by 2040 in the electricity sector. To achieve these rates of deployment within 15-20 years, governments and firms must commit to research, development, and deployment on an unprecedented scale. Three primary issues complicate emissions accounting for BECCS: cross-sector CO2 accounting, regrowth, and timing. Switchgrass integration decreases lifecycle greenhouse gas impacts of co-conversion systems with CCS, across a wide range of land-use change scenarios. Risks at commercial scale include adverse effects on food security, land conservation, social equity, and biodiversity, as well as competition for water resources. This dissertation argues for an iterative risk management approach to BECCS sustainability, with standards being updated as more knowledge is gained through deployment. Sustainability impacts and public opposition to BECCS may be reduced with transparent measurement and communication. Commercial-scale deployment is dependent on the coordination of a wide range of actors, many with different incentives and worldviews.
Despite these coordination challenges, this dissertation calls on governments, industry incumbents, and emerging players to research, support, and deploy BECCS.
Instructional Supervision in Public Secondary Schools in Kenya
ERIC Educational Resources Information Center
Wanzare, Zachariah
2012-01-01
This article reports some findings of study regarding practices and procedures of internal instructional supervision in public secondary schools in Kenya. The findings are part of a large-scale project undertaken in Kenya to determine the perceptions of headteachers, teachers and senior government education officers regarding the practices of…
When Transparency Obscures: The Political Spectacle of Accountability
ERIC Educational Resources Information Center
Koyama, Jill; Kania, Brian
2014-01-01
In the United States (US), an increase in standardization, quantification, competition, and large-scale comparison--cornerstones of neoliberal accountability--have been accompanied by devices of transparency, through which various forms of school data are made available to the public. Such public reporting, we are told by politicians and education…
Incorporating Learning Theory into Existing Systems Engineering Models
2013-09-01
[Fragmentary record] Master's thesis by Valentine Leo, September 2013; thesis advisor Gary O. Langford (co-advisor not recorded). The surviving fragments indicate a classification of learning theories into behaviorism, cognitivism, constructivism, and connectivism for incorporation into existing systems engineering models.
2015-02-11
A similar risk-based approach may be appropriate for deploying military personnel. e) If DoD were to consider implementing a large-scale pre… quality of existing spirometry programs prior to considering a larger-scale pre-deployment effort. Identifying an accelerated decrease in spirometry… baseline spirometry on a wider scale. e) Conduct pre-deployment baseline spirometry if there is a significant risk of exposure to a pulmonary hazard based…
Interactive Exploration on Large Genomic Datasets.
Tu, Eric
2016-01-01
The prevalence of large genomics datasets has made the need to explore this data more important. Large sequencing projects like the 1000 Genomes Project [1], which reconstructed the genomes of 2,504 individuals sampled from 26 populations, have produced over 200TB of publicly available data. Meanwhile, existing genomic visualization tools have been unable to scale with the growing amount of larger, more complex data. This difficulty is acute when viewing large regions (over 1 megabase, or 1,000,000 bases of DNA), or when concurrently viewing multiple samples of data. While genomic processing pipelines have shifted towards using distributed computing techniques, such as with ADAM [4], genomic visualization tools have not. In this work we present Mango, a scalable genome browser built on top of ADAM that can run both locally and on a cluster. Mango presents a combination of different optimizations that can be combined in a single application to drive novel genomic visualization techniques over terabytes of genomic data. By building visualization on top of a distributed processing pipeline, we can perform visualization queries over large regions that are not possible with current tools, and decrease the time for viewing large data sets. Mango is part of the Big Data Genomics project at University of California-Berkeley [25] and is published under the Apache 2 license. Mango is available at https://github.com/bigdatagenomics/mango.
Group Size Effect on Cooperation in One-Shot Social Dilemmas II: Curvilinear Effect.
Capraro, Valerio; Barcelo, Hélène
2015-01-01
In a world in which many pressing global issues require large-scale cooperation, understanding the group size effect on cooperative behavior is a topic of central importance. Yet the nature of this effect remains largely unknown, with lab experiments variously finding it to be positive, negative, or null, and field experiments suggesting that it is instead curvilinear. Here we shed light on this apparent contradiction by considering a novel class of public goods games inspired by the realistic scenario in which the natural output limits of the public good imply that the benefit of cooperation increases fast for early contributions and then decelerates. We report on a large lab experiment providing evidence that, in this case, group size has a curvilinear effect on cooperation, with intermediate-size groups cooperating more than smaller groups and more than larger groups. In doing so, our findings help fill the gap between lab experiments and field experiments and suggest concrete ways to promote large-scale cooperation among people.
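The benefit structure the authors posit can be illustrated in a few lines (the saturating functional form and parameters below are our assumptions, chosen only to show the fast-then-decelerating pattern):

```python
import numpy as np

def benefit(m, B=10.0, k=0.8):
    # Total public-good benefit as a function of the number of contributors m:
    # rises steeply at first, then saturates toward B.
    return B * (1 - np.exp(-k * m))

# Marginal benefit of one more contributor: large for early contributions,
# small once the good approaches its natural output limit.
for m in range(8):
    print(m, "->", round(benefit(m + 1) - benefit(m), 3))
```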
NASA Astrophysics Data System (ADS)
Wüest, Robert; Nebiker, Stephan
2018-05-01
In this paper we present an app framework for augmenting large-scale walkable maps and orthoimages in museums or public spaces using standard smartphones and tablets. We first introduce a novel approach for using huge orthoimage mosaic floor prints covering several hundred square meters as natural Augmented Reality (AR) markers. We then present a new app architecture and subsequent tests in the Swissarena of the Swiss National Transport Museum in Lucerne demonstrating the capabilities of accurately tracking and augmenting different map topics, including dynamic 3d data such as live air traffic. The resulting prototype was tested with everyday visitors of the museum to get feedback on the usability of the AR app and to identify pitfalls when using AR in the context of a potentially crowded museum. The prototype is to be rolled out to the public after successful testing and optimization of the app. We were able to show that AR apps on standard smartphone devices can dramatically enhance the interactive use of large-scale maps for different purposes such as education or serious gaming in a museum context.
A Value-Added Estimate of Higher Education Quality of US States
ERIC Educational Resources Information Center
Zhang, Lei
2009-01-01
States differ substantially in higher education policies. Little is known about the effects of state policies on the performance of public colleges and universities, largely because no clear measures of college quality exist. In this paper, I estimate the average quality of public colleges of US states based on the value-added to individuals'…
Are Selective Private and Public Colleges Affordable?
ERIC Educational Resources Information Center
Karikari, John A.; Dezhbakhsh, Hashem
2013-01-01
We examine college affordability under the existing pricing and financial aid system that awards both non-need-based and need-based aid. Using data on freshmen attending a large number of selective private and public colleges in the USA, we find that the prices students actually pay for college have increased over time. Need-based grant aid has…
Charter Schools' Performance and Accountability: A Disconnect. Policy Brief
ERIC Educational Resources Information Center
Bracey, Gerald W.
2005-01-01
This report argues that evidence exists for the case that the charter school movement is largely a failed reform. The report puts the charter school movement in the context of dissatisfaction with public schools and the public sector in general. It then describes the claims for charters made by the early charter school advocates, emphasizing the…
Kelley, James J; Maor, Shay; Kim, Min Kyung; Lane, Anatoliy; Lun, Desmond S
2017-08-15
Visualization of metabolites, reactions and pathways in genome-scale metabolic networks (GEMs) can assist in understanding cellular metabolism. Three attributes are desirable in software used for visualizing GEMs: (i) automation, since GEMs can be quite large; (ii) production of understandable maps that provide ease in identification of pathways, reactions and metabolites; and (iii) visualization of the entire network to show how pathways are interconnected. No software currently exists for visualizing GEMs that satisfies all three characteristics, but MOST-Visualization, an extension of the software package MOST (Metabolic Optimization and Simulation Tool), satisfies (i), and by using a pre-drawn overview map of metabolism based on the Roche map satisfies (ii) and comes close to satisfying (iii). MOST is distributed for free under the GNU General Public License. The software and full documentation are available at http://most.ccib.rutgers.edu/. dslun@rutgers.edu. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Yitan; Xu, Yanxun; Helseth, Donald L.
Background: Genetic interactions play a critical role in cancer development. Existing knowledge about cancer genetic interactions is incomplete, especially lacking evidence derived from large-scale cancer genomics data. The Cancer Genome Atlas (TCGA) produces multimodal measurements across genomics and features of thousands of tumors, which provide an unprecedented opportunity to investigate the interplay of genes in cancer. Methods: We introduce Zodiac, a computational tool and resource to integrate existing knowledge about cancer genetic interactions with new information contained in TCGA data. It is an evolution of existing knowledge by treating it as a prior graph, integrating it with a likelihood model derived from a Bayesian graphical model fitted to TCGA data, and producing a posterior graph as updated and data-enhanced knowledge. In short, Zodiac realizes "Prior interaction map + TCGA data → Posterior interaction map." Results: Zodiac provides molecular interactions for about 200 million pairs of genes. All the results are generated from a big-data analysis and organized into a comprehensive database allowing customized search. In addition, Zodiac provides data processing and analysis tools that allow users to customize the prior networks and update the genetic pathways of their interest. Zodiac is publicly available at www.compgenome.org/ZODIAC. Conclusions: Zodiac recapitulates and extends existing knowledge of molecular interactions in cancer. It can be used to explore novel gene-gene interactions, transcriptional regulation, and other types of molecular interplays in cancer.
Wood, Fiona; Kowalczuk, Jenny; Elwyn, Glyn; Mitchell, Clive; Gallacher, John
2011-08-01
Population based genetics studies are dependent on large numbers of individuals in the pursuit of small effect sizes. Recruiting and consenting a large number of participants is both costly and time consuming. We explored whether an online consent process for large-scale genetics studies is acceptable for prospective participants using an example online genetics study. We conducted semi-structured interviews with 42 members of the public stratified by age group, gender and newspaper readership (a measure of social status). Respondents were asked to use a website designed to recruit for a large-scale genetic study. After using the website a semi-structured interview was conducted to explore opinions and any issues they would have. Responses were analysed using thematic content analysis. The majority of respondents said they would take part in the research (32/42). Those who said they would decline to participate saw fewer benefits from the research, wanted more information and expressed a greater number of concerns about the study. Younger respondents had concerns over time commitment. Middle aged respondents were concerned about privacy and security. Older respondents were more altruistic in their motivation to participate. Common themes included trust in the authenticity of the website, security of personal data, curiosity about their own genetic profile, operational concerns and a desire for more information about the research. Online consent to large-scale genetic studies is likely to be acceptable to the public. The online consent process must establish trust quickly and effectively by asserting authenticity and credentials, and provide access to a range of information to suit different information preferences.
Planetary-Scale Geospatial Data Analysis Techniques in Google's Earth Engine Platform (Invited)
NASA Astrophysics Data System (ADS)
Hancher, M.
2013-12-01
Geoscientists have more and more access to new tools for large-scale computing. With any tool, some tasks are easy and other tasks hard. It is natural to look to new computing platforms to increase the scale and efficiency of existing techniques, but there is a more exciting opportunity to discover and develop a new vocabulary of fundamental analysis idioms that are made easy and effective by these new tools. Google's Earth Engine platform is a cloud computing environment for earth data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog includes a nearly complete archive of scenes from Landsat 4, 5, 7, and 8 that have been processed by the USGS, as well as a wide variety of other remotely-sensed and ancillary data products. Earth Engine supports a just-in-time computation model that enables real-time preview during algorithm development and debugging, as well as during experimental data analysis and open-ended data exploration. Data processing operations are performed in parallel across many computers in Google's datacenters. The platform automatically handles many traditionally-onerous data management tasks, such as data format conversion, reprojection, resampling, and associating image metadata with pixel data. Early applications of Earth Engine have included the development of Google's global cloud-free fifteen-meter base map and global multi-decadal time-lapse animations, as well as numerous large and small experimental analyses by scientists from a range of academic, government, and non-governmental institutions, working in a wide variety of application areas including forestry, agriculture, urban mapping, and species habitat modeling. Patterns in the successes and failures of these early efforts have begun to emerge, sketching the outlines of a new set of simple and effective approaches to geospatial data analysis.
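As a concrete illustration of the just-in-time computation model, a minimal sketch using the Earth Engine Python client is shown below; the Landsat collection ID is an assumption and may differ from the catalog's current naming.

    import ee  # Google Earth Engine Python client

    ee.Authenticate()  # one-time, browser-based login
    ee.Initialize()

    # Build a *description* of a computation; nothing is executed yet.
    # The collection ID below is an assumption for illustration.
    collection = (ee.ImageCollection('LANDSAT/LT05/C02/T1_L2')
                  .filterDate('2010-01-01', '2010-12-31'))
    composite = collection.median()  # per-pixel median composite

    # Work runs in parallel in Google's datacenters only when a result
    # is actually requested, e.g. here or when map tiles are drawn.
    print(composite.bandNames().getInfo())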
Large-Scale Event Extraction from Literature with Multi-Level Gene Normalization
Wei, Chih-Hsuan; Hakala, Kai; Pyysalo, Sampo; Ananiadou, Sophia; Kao, Hung-Yu; Lu, Zhiyong; Salakoski, Tapio; Van de Peer, Yves; Ginter, Filip
2013-01-01
Text mining for the life sciences aims to aid database curation, knowledge summarization and information retrieval through the automated processing of biomedical texts. To provide comprehensive coverage and enable full integration with existing biomolecular database records, it is crucial that text mining tools scale up to millions of articles and that their analyses can be unambiguously linked to information recorded in resources such as UniProt, KEGG, BioGRID and NCBI databases. In this study, we investigate how fully automated text mining of complex biomolecular events can be augmented with a normalization strategy that identifies biological concepts in text, mapping them to identifiers at varying levels of granularity, ranging from canonicalized symbols to unique genes and proteins and broad gene families. To this end, we have combined two state-of-the-art text mining components, previously evaluated on two community-wide challenges, and have extended and improved upon these methods by exploiting their complementary nature. Using these systems, we perform normalization and event extraction to create a large-scale resource that is publicly available, unique in semantic scope, and covers all 21.9 million PubMed abstracts and 460 thousand PubMed Central open access full-text articles. This dataset contains 40 million biomolecular events involving 76 million gene/protein mentions, linked to 122 thousand distinct genes from 5032 species across the full taxonomic tree. Detailed evaluations and analyses reveal promising results for application of this data in database and pathway curation efforts. The main software components used in this study are released under an open-source license. Further, the resulting dataset is freely accessible through a novel API, providing programmatic and customized access (http://www.evexdb.org/api/v001/). Finally, to allow for large-scale bioinformatic analyses, the entire resource is available for bulk download from http://evexdb.org/download/, under the Creative Commons – Attribution – Share Alike (CC BY-SA) license. PMID:23613707
Large-scale data analysis of power grid resilience across multiple US service regions
NASA Astrophysics Data System (ADS)
Ji, Chuanyi; Wei, Yun; Mei, Henry; Calzada, Jorge; Carey, Matthew; Church, Steve; Hayes, Timothy; Nugent, Brian; Stella, Gregory; Wallace, Matthew; White, Joe; Wilcox, Robert
2016-05-01
Severe weather events frequently result in large-scale power failures, affecting millions of people for extended durations. However, the lack of comprehensive, detailed failure and recovery data has impeded large-scale resilience studies. Here, we analyse data from four major service regions representing Upstate New York during Super Storm Sandy and daily operations. Using non-stationary spatiotemporal random processes that relate infrastructural failures to recoveries and cost, our data analysis shows that local power failures have a disproportionately large non-local impact on people (that is, the top 20% of failures interrupted 84% of services to customers). A large number (89%) of small failures, represented by the bottom 34% of customers and commonplace devices, resulted in 56% of the total cost of 28 million customer interruption hours. Our study shows that extreme weather does not cause, but rather exacerbates, existing vulnerabilities, which are obscured in daily operations.
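The reported concentration (the top 20% of failures interrupting 84% of services) is characteristic of heavy-tailed failure-size distributions. The sketch below illustrates this with synthetic Pareto-distributed failure sizes; the shape parameter is illustrative and is not fitted to the study's data.

    import numpy as np

    # Synthetic failure sizes (customers interrupted per failure) from a
    # heavy-tailed Pareto distribution; the shape parameter is purely
    # illustrative, not fitted to the study's data.
    rng = np.random.default_rng(0)
    sizes = np.sort(rng.pareto(1.2, 100_000) + 1.0)

    # Share of all interruptions caused by the largest 20% of failures.
    top20_share = sizes[int(0.8 * sizes.size):].sum() / sizes.sum()
    print(f"Top 20% of failures cause {top20_share:.0%} of interruptions")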
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hurd, Alan J.
2016-04-29
While the stated reason for asking this question is “to understand better our ability to warn policy makers in the unlikely event of an unanticipated SRM geoengineering deployment or large-scale field experiment”, my colleagues and I felt that motives would be important context because the scale of any meaningful SRM deployment would be so large that covert deployment seems impossible. However, several motives emerged that suggest a less-than-global effort might be important.
Crash test and evaluation of temporary wood sign support system for large guide signs.
DOT National Transportation Integrated Search
2016-07-01
The objective of this research task was to evaluate the impact performance of a temporary wood sign support system for large guide signs. It was desired to use existing TxDOT sign hardware in the design to the extent possible. The full-scale cras...
SHEAR-DRIVEN DYNAMO WAVES IN THE FULLY NONLINEAR REGIME
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pongkitiwanichakul, P.; Nigro, G.; Cattaneo, F.
2016-07-01
Large-scale dynamo action is well understood when the magnetic Reynolds number (Rm) is small, but becomes problematic in the astrophysically relevant large-Rm limit, since the fluctuations may control the operation of the dynamo, obscuring the large-scale behavior. Recent works by Tobias and Cattaneo demonstrated numerically the existence of large-scale dynamo action in the form of dynamo waves driven by strongly helical turbulence and shear. Their calculations were carried out in the kinematic regime, in which the back-reaction of the Lorentz force on the flow is neglected. Here, we have undertaken a systematic extension of their work to the fully nonlinear regime. Helical turbulence and large-scale shear are produced self-consistently by prescribing body forces that, in the kinematic regime, drive flows that resemble the original velocity used by Tobias and Cattaneo. We have found four different solution types in the nonlinear regime for various ratios of the fluctuating velocity to the shear and Reynolds numbers. Some of the solutions are in the form of propagating waves. Some solutions show large-scale helical magnetic structure. Both waves and structures are permanent only when the kinetic helicity is non-zero on average.
Measuring happiness in large population
NASA Astrophysics Data System (ADS)
Wenas, Annabelle; Sjahputri, Smita; Takwin, Bagus; Primaldhi, Alfindra; Muhamad, Roby
2016-01-01
The ability to know the emotional states of large numbers of people is important, for example, to ensure the effectiveness of public policies. In this study, we propose a measure of happiness that can be used on large-scale populations, based on the analysis of Indonesian-language lexicons. Here, we incorporate human assessment of Indonesian words, then quantify happiness on a large scale of texts gathered from Twitter conversations. We used two psychological constructs to measure happiness: valence and arousal. We found that Indonesian words have a tendency towards positive emotions. We also identified several happiness patterns across days of the week, hours of the day, and selected conversation topics.
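At its simplest, lexicon-based scoring of this kind reduces to averaging the valence and arousal ratings of the words a text contains. The sketch below uses an invented three-word lexicon; the study's actual Indonesian lexicon and weighting scheme are not reproduced here.

    # Minimal valence/arousal scoring with an invented toy lexicon.
    # The study used human-rated Indonesian words; these entries and
    # ratings are placeholders for illustration only.
    LEXICON = {
        "senang": (8.2, 5.9),  # happy
        "sedih":  (2.1, 4.4),  # sad
        "marah":  (1.8, 7.6),  # angry
    }

    def score_text(text):
        hits = [LEXICON[w] for w in text.lower().split() if w in LEXICON]
        if not hits:
            return None  # no lexicon words found
        valence = sum(v for v, _ in hits) / len(hits)
        arousal = sum(a for _, a in hits) / len(hits)
        return valence, arousal

    print(score_text("hari ini saya senang sekali"))  # (8.2, 5.9)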
A public health perspective on the U.S. response to the Fukushima radiological emergency.
Whitcomb, Robert C; Ansari, Armin J; Buzzell, Jennifer J; McCurley, M Carol; Miller, Charles W; Smith, James M; Evans, D Lynn
2015-03-01
On 11 March 2011, northern Japan was struck first by a magnitude 9.0 earthquake off the eastern coast and then by an ensuing tsunami. At the Fukushima Dai-ichi Nuclear Power Plant (NPP), these twin disasters initiated a cascade of events that led to radionuclide releases. Radioactive material from Japan was subsequently transported to locations around the globe, including the U.S. The levels of radioactive material that arrived in the U.S. were never large enough to cause health effects, but the presence of this material in the environment was enough to require a response from the public health community. Events during the response illustrated some U.S. preparedness challenges that previously had been anticipated and others that were newly identified. Some of these challenges include the following: (1) capacity, including radiation health experts, for monitoring potentially exposed people for radioactive contamination is limited and may not be adequate at the time of a large-scale radiological incident; (2) there is no public health authority to detain people contaminated with radioactive materials; (3) public health and medical capacities for response to radiation emergencies are limited; (4) public health communications regarding radiation emergencies can be improved to enhance public health response; (5) national and international exposure standards for radiation measurements (and units) and protective action guides lack uniformity; (6) access to radiation emergency monitoring data can be limited; and (7) the Strategic National Stockpile may not be currently prepared to meet the public health need for KI (potassium iodide) in the case of a surge in demand from a large-scale radiation emergency. Members of the public health community can draw on this experience to improve public health preparedness.
Quantitative controls on submarine slope failure morphology
Lee, H.J.; Schwab, W.C.; Edwards, B.D.; Kayen, R.E.
1991-01-01
The concept of the steady-state of deformation can be applied to predicting the ultimate form a landslide will take. The steady-state condition, defined by a line in void ratio-effective stress space, exists at large levels of strain and remolding. Conceptually, if sediment initially exists with void ratio-effective stress conditions above the steady-state line, the sediment shear strength will decrease during a transient loading event, such as an earthquake or storm. If the reduced shear strength existing at the steady state is less than the downslope shear stress induced by gravity, then large-scale internal deformation, disintegration, and flow will occur. -from Authors
A multi-scale network method for two-phase flow in porous media
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khayrat, Karim, E-mail: khayratk@ifd.mavt.ethz.ch; Jenny, Patrick
Pore-network models of porous media are useful in the study of pore-scale flow in porous media. In order to extract macroscopic properties from flow simulations in pore-networks, it is crucial that the networks are large enough to be considered representative elementary volumes. However, existing two-phase network flow solvers are limited to relatively small domains. For this purpose, a multi-scale pore-network (MSPN) method, which takes into account flow-rate effects and can simulate larger domains compared to existing methods, was developed. In our solution algorithm, a large pore network is partitioned into several smaller sub-networks. The algorithm to advance the fluid interfaces within each subnetwork consists of three steps. First, a global pressure problem on the network is solved approximately using the multiscale finite volume (MSFV) method. Next, the fluxes across the subnetworks are computed. Lastly, using fluxes as boundary conditions, a dynamic two-phase flow solver is used to advance the solution in time. Simulation results of drainage scenarios at different capillary numbers and unfavourable viscosity ratios are presented and used to validate the MSPN method against solutions obtained by an existing dynamic network flow solver.
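The three-step advance described in the abstract can be summarized as a simple loop. The sketch below uses trivial stub functions as hypothetical stand-ins for the components named in the abstract; it is not the authors' implementation.

    # Sketch of one MSPN time step; the three helpers are hypothetical
    # stubs for the components named in the abstract.
    def solve_global_pressure_msfv(network):
        return {pore: 0.0 for pore in network}          # stub MSFV solve

    def compute_interface_fluxes(network, pressure):
        return {sub_id: 0.0 for sub_id in ("A", "B")}   # stub fluxes

    def advance_two_phase(subnetwork, boundary_flux, dt):
        pass                                            # stub dynamic solver

    def advance_mspn(network, subnetworks, dt):
        # Step 1: approximate global pressure with the MSFV method.
        pressure = solve_global_pressure_msfv(network)
        # Step 2: fluxes across subnetwork boundaries.
        fluxes = compute_interface_fluxes(network, pressure)
        # Step 3: advance each subnetwork with a dynamic two-phase
        # solver, using the fluxes as boundary conditions.
        for sub_id, sub in subnetworks.items():
            advance_two_phase(sub, fluxes[sub_id], dt)

    advance_mspn({"p1": None}, {"A": object(), "B": object()}, dt=0.01)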
Tropical agriculture is a major threat to biodiversity worldwide. In addition to the direct impacts of converting native vegetation to agriculture, this process is accompanied by a wider set of human-induced disturbances, many of which are poorly addressed by existing environment...
Development of Affordable, Low-Carbon Hydrogen Supplies at an Industrial Scale
ERIC Educational Resources Information Center
Roddy, Dermot J.
2008-01-01
An existing industrial hydrogen generation and distribution infrastructure is described, and a number of large-scale investment projects are outlined. All of these projects have the potential to generate significant volumes of low-cost, low-carbon hydrogen. The technologies concerned range from gasification of coal with carbon capture and storage…
Bellman Ford algorithm - in Routing Information Protocol (RIP)
NASA Astrophysics Data System (ADS)
Krianto Sulaiman, Oris; Mahmud Siregar, Amir; Nasution, Khairuddin; Haramaini, Tasliyah
2018-04-01
In a large-scale network, routing must handle a large number of users, and one solution is to use a routing protocol. Routing protocols are of two types, static and dynamic: static routes are entered manually by the network administrator, while dynamic routes are formed automatically from the existing network. Dynamic routing is efficient for extensive networks because routes are formed automatically. Routing Information Protocol (RIP) is a dynamic routing protocol that uses the Bellman-Ford algorithm, which searches for the best path through the network by leveraging the cost of each link; with the Bellman-Ford algorithm, RIP can optimize existing networks.
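A minimal sketch of the distance computation at RIP's core is shown below. Real RIP runs this logic in distributed, distance-vector form between neighboring routers and caps metrics at 15 hops; both simplifications are deliberate here, and the example topology is invented.

    # Minimal Bellman-Ford over directed links (u, v, cost).
    def bellman_ford(nodes, links, source):
        dist = {n: float("inf") for n in nodes}
        dist[source] = 0
        for _ in range(len(nodes) - 1):      # relax all links |V|-1 times
            for u, v, cost in links:
                if dist[u] + cost < dist[v]:
                    dist[v] = dist[u] + cost
        return dist

    links = [("A", "B", 1), ("B", "C", 2), ("A", "C", 5)]
    print(bellman_ford({"A", "B", "C"}, links, "A"))
    # distances: A=0, B=1, C=3 (the direct A->C link, cost 5, loses)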
Optimization of large matrix calculations for execution on the Cray X-MP vector supercomputer
NASA Technical Reports Server (NTRS)
Hornfeck, William A.
1988-01-01
A considerable volume of large computational codes was developed for NASA over the past twenty-five years. This code represents algorithms developed for machines of an earlier generation. With the emergence of the vector supercomputer as a viable, commercially available machine, an opportunity exists to evaluate optimization strategies to improve the efficiency of existing software, primarily because of architectural differences between the latest generation of large-scale machines and the earlier, mostly uniprocessor, machines. A software package being used by NASA to perform computations on large matrices is described, and a strategy for conversion to the Cray X-MP vector supercomputer is also described.
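The loop-to-vector rewrite strategy can be illustrated by a modern analogy in NumPy; the original work targeted Fortran on the Cray X-MP, not Python, so this is only a hedged illustration of the idea.

    import numpy as np

    # Analogy for the loop-to-vector rewrite: scalar-style inner loops
    # versus a single vectorized operation dispatched to optimized
    # kernels. The original effort converted Fortran loops for the Cray.
    n = 256
    A = np.random.rand(n, n)
    x = np.random.rand(n)

    # Scalar-style loops (what older uniprocessor code looked like):
    y = np.zeros(n)
    for i in range(n):
        for j in range(n):
            y[i] += A[i, j] * x[j]

    # Vectorized form: one matrix-vector product.
    y_vec = A @ x
    assert np.allclose(y, y_vec)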
De Ferrari, Aldo; Gentille, Cesar; Davalos, Long; Huayanay, Leandro; Malaga, German
2014-01-01
Background: The interaction between physicians and the pharmaceutical industry influences physicians' attitudes and prescribing behavior. Although largely studied in the US, this topic has not been well studied in resource-poor settings, where a close relationship between physicians and industry still exists. Objective: To describe physician interactions with and attitudes towards the pharmaceutical industry in a public general hospital in Lima, Peru. Design: Descriptive, cross-sectional study through an anonymous, self-filled questionnaire distributed among faculty and trainee physicians of five different clinical departments working in a Peruvian public general hospital. A transcultural validation of an existing Spanish questionnaire was performed. Exposure to marketing activities, motivations to contact pharmaceutical representatives and attitudes towards industry were studied. Collected data was analyzed by degree of training, clinical department, gender and teaching status. Attitudes were measured on a four-point Likert scale. Results: 155 physicians completed the survey, of which 148 were included in the study sample. 94.5% of attending physicians reported ongoing encounters with pharmaceutical representatives. The most common industry-related activities were receiving medical samples (91.2%), promotional material (87.8%) and attending meetings in restaurants (81.8%). Respondents considered medical samples and continuing medical education the most ethically acceptable benefits. We found significant differences between attendings and residents, and teaching and non-teaching attendings. An association between the amount of encounters with pharmaceutical representatives, and attitudes towards industry and acceptance of medical samples was found. Conclusions: A close physician-industry relationship exists in the population under study. The contact is established mainly through pharmaceutical representatives. Medical samples are the most received and ethically accepted benefit. The attitudes of physicians on the ethical standards of acceptance of medical samples and other benefits are closely related with their exposure to the pharmaceutical industry. Future studies could explore the motivations of physicians working in resource-poor settings to maintain a close relationship with industry. PMID:24978481
GIS Representation of Coal-Bearing Areas in North, Central, and South America
Tewalt, Susan J.; Kinney, Scott A.; Merrill, Matthew D.
2008-01-01
Worldwide coal consumption and international coal trade are projected to increase in the next several decades (Energy Information Administration, 2007). A search of existing literature indicates that in the Western Hemisphere, coal resources are known to occur in about 30 countries. The need exists to be able to depict these areas in a digital format for use in Geographic Information System (GIS) applications at small scales (large areas) and in visual presentations. Existing surficial geology GIS layers of the appropriate geologic age have been used as an approximation to depict the extent of coal-bearing areas in North, Central, and South America, as well as Greenland (fig. 1). Global surficial geology GIS data were created by the U.S. Geological Survey (USGS) for use in world petroleum assessments (Hearn and others, 2003). These USGS publications served as the major sources for the selection and creation of polygons to represent coal-bearing areas. Additional publications and maps by various countries and agencies were also used as sources of coal locations. GIS geologic polygons were truncated where literature or hardcopy maps did not indicate the presence of coal. The depicted areas are not adequate for use in coal resource calculations, as they were not adjusted for geologic structure and do not include coal at depth. Additionally, some coal areas in Central America could not be represented by the mapped surficial geology and are shown only as points based on descriptions or depictions from scientific publications or available maps. The provided GIS files are intended to serve as a backdrop for display of coal information. Three attributes of the coal that are represented by the polygons or points include geologic age (or range of ages), published rank (or range of ranks), and information source (published sources for age, rank, or physical location, or GIS geology base).
From Product to Place-Spatializing governance in a commodified landscape.
van Oosten, Cora; Moeliono, Moira; Wiersum, Freerk
2018-07-01
This article analyzes the potential for landscape governance in large-scale commodity landscapes in Indonesia. It conceptualizes landscape governance as the spatialization of governance, which entails the interplay between natural-spatial conditions of place, public-private actor constellations, and policy responses. The article presents the case of a commodified oil palm landscape in West Kalimantan, where a potentially new type of landscape governance is emerging out of the experimental activities of an ecologically responsible commercial enterprise. It describes the development of a multifunctional concession as a process of productive bricolage involving the creative combination of different land uses within a single productive space. It also describes how such a multifunctional concession does not fit into existing policies, which are sectorally defined and embedded in sticky institutional frames. The formation of new public-private institutional arrangements needed for the development of multifunctional concessions is a difficult process, as it requires an alignment of contrasting discourses and an integration of sectorally-defined policy frames. If successful, it might facilitate the transition from multifunctional concessions to multifunctional landscapes. Such a fundamental change in land use and production relations however requires intensive stakeholder engagement and policy dialog. Indonesia's continuous decentralization process offers opportunities for this, as it increasingly provides institutional space at the landscape level, for public and private actors to explore common concerns, and craft public-private arrangements specific to the landscape.
CAS-viewer: web-based tool for splicing-guided integrative analysis of multi-omics cancer data.
Han, Seonggyun; Kim, Dongwook; Kim, Youngjun; Choi, Kanghoon; Miller, Jason E; Kim, Dokyoon; Lee, Younghee
2018-04-20
The Cancer Genome Atlas (TCGA) project is a public resource that provides transcriptomic, DNA sequence, methylation, and clinical data for 33 cancer types. Transforming the large size and high complexity of TCGA cancer genome data into integrated knowledge can be useful to promote cancer research. Alternative splicing (AS) is a key regulatory mechanism of genes in human cancer development and in the interaction with epigenetic factors. Therefore, AS-guided integration of existing TCGA data sets will make it easier to gain insight into the genetic architecture of cancer risk and related outcomes. Tools already exist for analyzing and visualizing alternative mRNA splicing patterns in large-scale RNA-seq experiments. However, these existing web-based tools are limited to analyzing individual TCGA data sets at a time, such as transcriptomic information only. We implemented CAS-viewer (integrative analysis of Cancer genome data based on Alternative Splicing), a web-based tool leveraging multi-cancer omics data from TCGA. It illustrates alternative mRNA splicing patterns along with methylation, miRNAs, and SNPs, and then provides an analysis tool to link differential transcript expression ratios to methylation, miRNA, and splicing regulatory elements for 33 cancer types. Moreover, one can analyze AS patterns with clinical data to identify potential transcripts associated with different survival outcomes for each cancer. CAS-viewer is a web-based application for transcript isoform-driven integration of multi-omics data in multiple cancer types and will aid in the visualization and possible discovery of biomarkers for cancer by integrating multi-omics data from TCGA.
Lefevre, Sjannie; McKenzie, David J; Nilsson, Göran E
2017-09-01
Some recent modelling papers projecting smaller fish sizes and catches in a warmer future are based on erroneous assumptions regarding (i) the scaling of gills with body mass and (ii) the energetic cost of 'maintenance'. Assumption (i) posits that insurmountable geometric constraints prevent respiratory surface areas from growing as fast as body volume. It is argued that these constraints explain allometric scaling of energy metabolism, whereby larger fishes have relatively lower mass-specific metabolic rates. Assumption (ii) concludes that when fishes reach a certain size, basal oxygen demands will not be met, because of assumption (i). We here demonstrate unequivocally, by applying accepted physiological principles with reference to the existing literature, that these assumptions are not valid. Gills are folded surfaces, where the scaling of surface area to volume is not constrained by spherical geometry. The gill surface area can, in fact, increase linearly in proportion to gill volume and body mass. We cite the large body of evidence demonstrating that respiratory surface areas in fishes reflect metabolic needs, not vice versa, which explains the large interspecific variation in scaling of gill surface areas. Finally, we point out that future studies basing their predictions on models should incorporate factors for scaling of metabolic rate and for temperature effects on metabolism, which agree with measured values, and should account for interspecific variation in scaling and temperature effects. It is possible that some fishes will become smaller in the future, but to make reliable predictions the underlying mechanisms need to be identified and sought elsewhere than in geometric constraints on gill surface area. Furthermore, to ensure that useful information is conveyed to the public and policymakers about the possible effects of climate change, it is necessary to improve communication and congruity between fish physiologists and fisheries scientists. © 2017 John Wiley & Sons Ltd.
Pozzebon, Julie A.; Visser, Beth A.; Ashton, Michael C.; Lee, Kibeom; Goldberg, Lewis R.
2009-01-01
We investigated the psychometric properties of the Oregon Vocational Interest Scales (ORVIS), a brief public-domain alternative to commercial inventories, in a large community sample and in a college sample. In both samples, we examined the factor structure, scale intercorrelations, and personality correlates of the ORVIS, and in the community sample we also examined the correlations of the ORVIS scales with cognitive abilities and with the scales of a longer, proprietary interest survey. In both samples, all eight scales—Leadership, Organization, Altruism, Creativity, Analysis, Producing, Adventuring, and Erudition—showed wide variation in scores, high internal-consistency reliabilities, and a pattern of high convergent and low discriminant correlations with the scales of the proprietary interest survey. Overall, the results support the construct validity of the scales, which are recommended for use in research on vocational interests and other individual differences. PMID:20155566
Energy spectrum of tearing mode turbulence in sheared background field
NASA Astrophysics Data System (ADS)
Hu, Di; Bhattacharjee, Amitava; Huang, Yi-Min
2018-06-01
The energy spectrum of tearing mode turbulence in a sheared background magnetic field is studied in this work. We consider the scenario where the nonlinear interaction of overlapping large-scale modes excites a broad spectrum of small-scale modes, generating tearing mode turbulence. The spectrum of such turbulence is of interest since it is relevant to the small-scale back-reaction on the large-scale field. The turbulence we discuss here differs from traditional MHD turbulence mainly in two aspects. One is the existence of many linearly stable small-scale modes which cause an effective damping during the energy cascade. The other is the scale-independent anisotropy induced by the large-scale modes tilting the sheared background field, as opposed to the scale-dependent anisotropy frequently encountered in traditional critically balanced turbulence theories. Due to these two differences, the energy spectrum deviates from a simple power law and takes the form of a power law multiplied by an exponential falloff. Numerical simulations are carried out using visco-resistive MHD equations to verify our theoretical predictions, and a reasonable agreement is found between the numerical results and our model.
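In schematic form, a power law multiplied by an exponential falloff corresponds to a spectrum such as the one below, where the exponent \alpha and cutoff wavenumber k_c are generic placeholders, since the abstract does not give the fitted values:

    E(k) \propto k^{-\alpha}\, e^{-k/k_c}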
Soil organic carbon across scales.
O'Rourke, Sharon M; Angers, Denis A; Holden, Nicholas M; McBratney, Alex B
2015-10-01
Mechanistic understanding of scale effects is important for interpreting the processes that control the global carbon cycle. Greater attention should be given to scale in soil organic carbon (SOC) science so that we can devise better policy to protect/enhance existing SOC stocks and ensure sustainable use of soils. Global issues such as climate change require consideration of SOC stock changes at the global and biosphere scale, but human interaction occurs at the landscape scale, with consequences at the pedon, aggregate and particle scales. This review evaluates our understanding of SOC across all these scales in the context of the processes involved in SOC cycling at each scale and with emphasis on stabilizing SOC. Current synergy between science and policy is explored at each scale to determine how well each is represented in the management of SOC. An outline of how SOC might be integrated into a framework of soil security is examined. We conclude that SOC processes at the biosphere to biome scales are not well understood. Instead, SOC has come to be viewed as a large-scale pool subject to carbon flux. Better understanding exists for SOC processes operating at the scales of the pedon, aggregate and particle. At the landscape scale, the influence of large- and small-scale processes has the greatest interaction and is exposed to the greatest modification through agricultural management. Policy implemented at regional or national scale tends to focus at the landscape scale without due consideration of the larger scale factors controlling SOC or the impacts of policy for SOC at the smaller SOC scales. What is required is a framework that can be integrated across a continuum of scales to optimize SOC management. © 2015 John Wiley & Sons Ltd.
Watterson, Andrew
2018-01-01
Unconventional oil and gas extraction (UOGE) including fracking for shale gas is underway in North America on a large scale, and in Australia and some other countries. It is viewed as a major source of global energy needs by proponents. Critics consider fracking and UOGE an immediate and long-term threat to global, national, and regional public health and climate. Rarely have governments brought together relatively detailed assessments of direct and indirect public health risks associated with fracking and weighed these against potential benefits to inform a national debate on whether to pursue this energy route. The Scottish government has now done so in a wide-ranging consultation underpinned by a variety of reports on unconventional gas extraction including fracking. This paper analyses the Scottish government approach from inception to conclusion, and from procedures to outcomes. The reports commissioned by the Scottish government include a comprehensive review dedicated specifically to public health as well as reports on climate change, economic impacts, transport, geology, and decommissioning. All these reports are relevant to public health, and taken together offer a comprehensive review of existing evidence. The approach is unique globally when compared with UOGE assessments conducted in the USA, Australia, Canada, and England. The review process builds a useful evidence base although it is not without flaws. The process approach, if not the content, offers a framework that may have merits globally. PMID:29617318
Mesoscale Predictability and Error Growth in Short Range Ensemble Forecasts
NASA Astrophysics Data System (ADS)
Gingrich, Mark
Although it was originally suggested that small-scale, unresolved errors corrupt forecasts at all scales through an inverse error cascade, some authors have proposed that those mesoscale circulations resulting from stationary forcing on the larger scale may inherit the predictability of the large-scale motions. Further, the relative contributions of large- and small-scale uncertainties in producing error growth in the mesoscales remain largely unknown. Here, 100 member ensemble forecasts are initialized from an ensemble Kalman filter (EnKF) to simulate two winter storms impacting the East Coast of the United States in 2010. Four verification metrics are considered: the local snow water equivalence, total liquid water, and 850 hPa temperatures representing mesoscale features; and the sea level pressure field representing a synoptic feature. It is found that while the predictability of the mesoscale features can be tied to the synoptic forecast, significant uncertainty existed on the synoptic scale at lead times as short as 18 hours. Therefore, mesoscale details remained uncertain in both storms due to uncertainties at the large scale. Additionally, the ensemble perturbation kinetic energy did not show an appreciable upscale propagation of error for either case. Instead, the initial condition perturbations from the cycling EnKF were maximized at large scales and immediately amplified at all scales without requiring initial upscale propagation. This suggests that relatively small errors in the synoptic-scale initialization may have more importance in limiting predictability than errors in the unresolved, small-scale initial conditions.
Initiation process of earthquakes and its implications for seismic hazard reduction strategy.
Kanamori, H
1996-04-30
For the average citizen and the public, "earthquake prediction" means "short-term prediction," a prediction of a specific earthquake on a relatively short time scale. Such prediction must specify the time, place, and magnitude of the earthquake in question with sufficiently high reliability. For this type of prediction, one must rely on some short-term precursors. Examinations of strain changes just before large earthquakes suggest that consistent detection of such precursory strain changes cannot be expected. Other precursory phenomena such as foreshocks and nonseismological anomalies do not occur consistently either. Thus, reliable short-term prediction would be very difficult. Although short-term predictions with large uncertainties could be useful for some areas if their social and economic environments can tolerate false alarms, such predictions would be impractical for most modern industrialized cities. A strategy for effective seismic hazard reduction is to take full advantage of the recent technical advancements in seismology, computers, and communication. In highly industrialized communities, rapid earthquake information is critically important for emergency services agencies, utilities, communications, financial companies, and media to make quick reports and damage estimates and to determine where emergency response is most needed. Long-term forecast, or prognosis, of earthquakes is important for development of realistic building codes, retrofitting existing structures, and land-use planning, but the distinction between short-term and long-term predictions needs to be clearly communicated to the public to avoid misunderstanding.
Late-time cosmological phase transitions
NASA Technical Reports Server (NTRS)
Schramm, David N.
1991-01-01
It is shown that the potential galaxy formation and large-scale structure problems of objects existing at high redshifts ($z \gtrsim 5$), structures existing on scales of 100 Mpc as well as velocity flows on such scales, and minimal microwave anisotropies ($\Delta T/T \lesssim 10^{-5}$) can be solved if the seeds needed to generate structure form in a vacuum phase transition after decoupling. It is argued that the basic physics of such a phase transition is no more exotic than that utilized in the more traditional GUT-scale phase transitions, and that, just as in the GUT case, significant random Gaussian fluctuations and/or topological defects can form. Scale lengths of approximately 100 Mpc for large-scale structure as well as approximately 1 Mpc for galaxy formation occur naturally. Possible support for new physics that might be associated with such a late-time transition comes from the preliminary results of the SAGE solar neutrino experiment, implying neutrino flavor mixing with values similar to those required for a late-time transition. It is also noted that a see-saw model for the neutrino masses might also imply a tau neutrino mass that is an ideal hot dark matter candidate. However, in general either hot or cold dark matter can be consistent with a late-time transition.
Knowledge based word-concept model estimation and refinement for biomedical text mining.
Jimeno Yepes, Antonio; Berlanga, Rafael
2015-02-01
Text mining of scientific literature has been essential for setting up large public biomedical databases, which are being widely used by the research community. In the biomedical domain, the existence of a large number of terminological resources and knowledge bases (KB) has enabled a myriad of machine learning methods for different text mining related tasks. Unfortunately, KBs have not been devised for text mining tasks but for human interpretation, thus performance of KB-based methods is usually lower when compared to supervised machine learning methods. The disadvantage of supervised methods, though, is that they require labeled training data and are therefore not useful for large-scale biomedical text mining systems. KB-based methods do not have this limitation. In this paper, we describe a novel method to generate word-concept probabilities from a KB, which can serve as a basis for several text mining tasks. This method not only takes into account the underlying patterns within the descriptions contained in the KB but also those in texts available from large unlabeled corpora such as MEDLINE. The parameters of the model have been estimated without training data. Patterns from MEDLINE have been built using MetaMap for entity recognition and related using co-occurrences. The word-concept probabilities were evaluated on the task of word sense disambiguation (WSD). The results showed that our method obtained a higher degree of accuracy than other state-of-the-art approaches when evaluated on the MSH WSD data set. We also evaluated our method on the task of document ranking using MEDLINE citations. These results also showed an increase in performance over existing baseline retrieval approaches. Copyright © 2014 Elsevier Inc. All rights reserved.
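At its simplest, estimating word-concept probabilities from co-occurrence data reduces to a smoothed relative frequency. The sketch below is a deliberate simplification with invented concept identifiers; it is not the paper's MetaMap-based estimation from MEDLINE.

    from collections import Counter, defaultdict

    # Smoothed maximum-likelihood estimate of P(concept | word) from
    # (word, concept) co-occurrence pairs. Concept IDs are invented.
    def word_concept_probs(pairs, concepts, alpha=1.0):
        counts = defaultdict(Counter)
        for word, concept in pairs:
            counts[word][concept] += 1
        probs = {}
        for word, ccounts in counts.items():
            total = sum(ccounts.values()) + alpha * len(concepts)
            probs[word] = {c: (ccounts[c] + alpha) / total for c in concepts}
        return probs

    pairs = [("cold", "concept_1"), ("cold", "concept_1"), ("cold", "concept_2")]
    print(word_concept_probs(pairs, {"concept_1", "concept_2"}))
    # e.g. {'cold': {'concept_1': 0.6, 'concept_2': 0.4}}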
Towards the prevention of lead exposure in South Africa: contemporary and emerging challenges.
Mathee, Angela
2014-12-01
The prevention of lead exposure continues to constitute a major public health challenge in developed countries. In well-resourced countries major lead exposure reduction interventions have resulted in significant improvements in childhood blood lead distributions. In developing countries on the other hand, while lead exposure and poisoning remain serious public health concerns, a range of prevailing factors and circumstances, such as poverty, a large informal sector, competing public health challenges, low levels of awareness of lead hazards and weak capacity to enforce legislation, contribute to an increase in the scale and intensity of the challenge, and limit the prospects of comparable success in the foreseeable future. This paper collates available information to illustrate that despite some progress, a wide range of sources of lead exist in South Africa, and that certain settings and groups continue to be at high risk of lead exposure. Lead exposure in relation to paint, mining, lead melting in subsistence fishing communities, the consumption of Ayurvedic medicines and food production is described, and discussed with regard to the key factors hindering efforts to prevent lead poisoning and exposure in South Africa and many other developing countries. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.
Wilson, Todd; Chang, Arthur; Berro, Andre; Still, Aaron; Brown, Clive; Demma, Andrew; Nemhauser, Jeffrey; Martin, Colleen; Salame-Alfie, Adela; Fisher-Tyler, Frieda; Smith, Lee; Grady-Erickson, Onalee; Alvarado-Ramy, Francisco; Brunette, Gary; Ansari, Armin; McAdam, David; Marano, Nina
2012-10-01
On March 11, 2011, a magnitude 9.0 earthquake and subsequent tsunami damaged nuclear reactors at the Fukushima Daiichi complex in Japan, resulting in radionuclide release. In response, US officials augmented existing radiological screening at its ports of entry (POEs) to detect and decontaminate travelers contaminated with radioactive materials. During March 12 to 16, radiation screening protocols detected 3 travelers from Japan with external radioactive material contamination at 2 air POEs. Beginning March 23, federal officials collaborated with state and local public health and radiation control authorities to enhance screening and decontamination protocols at POEs. Approximately 543 000 (99%) travelers arriving directly from Japan at 25 US airports were screened for radiation contamination from March 17 to April 30, and no traveler was detected with contamination sufficient to require a large-scale public health response. The response highlighted synergistic collaboration across government levels and leveraged screening methods already in place at POEs, leading to rapid protocol implementation. Policy development, planning, training, and exercising response protocols and the establishment of federal authority to compel decontamination of travelers are needed for future radiological responses. Comparison of resource-intensive screening costs with the public health yield should guide policy decisions, given the historically low frequency of contaminated travelers arriving during radiological disasters.
The stigma of mental illness in children and adolescents: A systematic review.
Kaushik, Anya; Kostaki, Evgenia; Kyriakopoulos, Marinos
2016-09-30
One in ten children and adolescents suffer with mental health difficulties at any given time, yet less than one third seek treatment. Untreated mental illness predisposes to longstanding individual difficulties and presents a great public health burden. Large scale initiatives to reduce stigmatization of mental illness, identified as a key deterrent to treatment, have been disappointing. This indicates the need for a clearer understanding of the stigmatizing processes faced by young people, so that more effective interventions are employed. A systematic review of the literature, assessing public stigma and self-stigma (i.e. internalized public stigma) specifically in children and adolescents with mental health difficulties (YP-MHD), was conducted. Forty-two studies were identified, confirming that stigmatization of YP-MHD is a universal and disabling problem, present amongst both children and adults. There was some variation by diagnosis and gender, and stigmatization was for the most part unaffected by labelling. Self-stigmatization led to more secrecy and an avoidance of interventions. The findings confirm that stigmatization of mental illness is poorly understood due to a lack of research and methodological discrepancies between existing studies. Implications for the findings are discussed, and suggestions made for future research. Copyright © 2016. Published by Elsevier Ireland Ltd.
ERIC Educational Resources Information Center
Datnow, Amanda; Hubbard, Lea; Woody, Elisabeth
In 1997, California became the first state to conduct large-scale experimentation with single gender public education. This longitudinal study examined the impact of single gender academies in six California districts, focusing on equity implications. Data from observations and interviews with educators, policymakers, and students indicated that…
A Strong Future for Public Library Use and Employment
ERIC Educational Resources Information Center
Griffiths, Jose-Marie; King, Donald W.
2011-01-01
The latest and most comprehensive assessment of public librarians' education and career paths to date, this important volume reports on a large-scale research project performed by authors Jose-Marie Griffiths and Donald W. King. Presented in collaboration with the Office for Research and Statistics (ORS), the book includes an examination of trends…
Scale-invariant properties of public-debt growth
NASA Astrophysics Data System (ADS)
Petersen, A. M.; Podobnik, B.; Horvatic, D.; Stanley, H. E.
2010-05-01
Public debt is one of the important economic variables that quantitatively describes a nation's economy. Because bankruptcy is a risk faced even by institutions as large as governments (e.g., Iceland), national debt should be strictly controlled with respect to national wealth. Also, the problem of eliminating extreme poverty in the world is closely connected to the study of extremely poor debtor nations. We analyze the time evolution of national public debt and find "convergence": initially less-indebted countries increase their debt more quickly than initially more-indebted countries. We also analyze the public debt-to-GDP ratio $\mathcal{R}$, a proxy for default risk, and approximate the probability density function $P(\mathcal{R})$ with a Gamma distribution, which can be used to establish thresholds for sustainable debt. We also observe "convergence" in $\mathcal{R}$: countries with initially small $\mathcal{R}$ increase their $\mathcal{R}$ more quickly than countries with initially large $\mathcal{R}$. The scaling relationships for debt and $\mathcal{R}$ have practical applications, e.g. the Maastricht Treaty requires members of the European Monetary Union to maintain $\mathcal{R} < 0.6$.
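The Gamma approximation referred to above has the generic form below, with shape $k$ and scale $\theta$ as placeholders, since the abstract does not report the fitted parameter values:

    P(\mathcal{R}) = \frac{\mathcal{R}^{k-1}\, e^{-\mathcal{R}/\theta}}{\Gamma(k)\, \theta^{k}}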
O'Connor, M; van den Hove, S
2001-09-14
We outline the potential of participative governance and risk management as applied to technological choices in the nuclear sector within the European Union (EU). Well-conducted public participation, stakeholder consultation and deliberation procedures can enhance the policy process and improve the robustness of strategies dealing with high-stakes investment and risk management challenges. Key nuclear issues now confronting EU member states are: public concern with large-scale environmental and health issues; the Chernobyl accident (and others less catastrophic), whose effect has been to erode public confidence and trust in the nuclear sector; the maturity of nuclear plants, hence the emerging prominence of waste transportation, reprocessing and disposal issues as part of historical liability within the EU; and the nuclear energy heritage of central and eastern European candidate countries for EU accession. The obligatory management of inherited technological risks and uncertainties on large temporal and geographical scales is a novel feature of technology assessment and governance. Progress in the nuclear sector will aid the development of methodologies for technological foresight and risk governance in fields other than the nuclear alone.
Die-off rates of Cryptosporidium parvum oocysts in a swine lagoon and in a spray field
USDA-ARS?s Scientific Manuscript database
Background: Because of several large-scale outbreaks of cryptosporidiosis in humans, Cryptosporidium has become a public health concern. Commercial swine operations apply large volumes of effluent from lagoons to spray fields as a waste management practice. This effluent is a source of Cryptosporidi...
USDA-ARS?s Scientific Manuscript database
The amount of microarray gene expression data in public repositories has been increasing exponentially for the last couple of decades. High-throughput microarray data integration and analysis has become a critical step in exploring the large amount of expression data for biological discovery. Howeve...
ERIC Educational Resources Information Center
Bold, Tessa; Kimenyi, Mwangi; Mwabu, Germano; Sandefur, Justin
2013-01-01
Existing studies from the United States, Latin America and Asia provide scant evidence that private schools dramatically improve academic performance relative to public schools. Using data from Kenya--a poor country with weak public institutions--we find a large effect of private schooling on test scores, equivalent to one full standard deviation.…
ERIC Educational Resources Information Center
Allied Health Professions Projects, Los Angeles, CA.
Twenty-eight committee members, representing educational institutions, professional associations, public agencies, and the public-at-large, participated in a meeting to provide guidance in a 4-year project undertaken by UCLA to develop exemplary instructional programs for the continuing education of existing allied health personnel and for the…
ERIC Educational Resources Information Center
Dobbins, Michael
2015-01-01
This article places developments in Polish public higher education (HE) in the broader context of the literature on HE governance and, in particular, marketization. The Polish case stands out due to the parallel existence of prestigious large universities with long histories of scientific advancement and the largest number of private HE…
Panigrahi, Priyabrata; Jere, Abhay; Anamika, Krishanpal
2018-01-01
Gene fusion is a chromosomal rearrangement event which plays a significant role in cancer due to the oncogenic potential of the chimeric protein generated through fusions. At present, many databases are available in the public domain that provide detailed information about known gene fusion events and their functional roles. Existing gene fusion detection tools, based on analysis of transcriptomics data, usually report a large number of fusion genes as potential candidates, which could be known, novel, or false positives. Manual annotation of these putative genes is time-consuming. We have developed a web platform, FusionHub, which acts as an integrated search engine interfacing various fusion gene databases and simplifies large-scale annotation of fusion genes in a seamless way. In addition, FusionHub provides three ways of visualizing fusion events: circular view, domain architecture view and network view. Design of potential siRNA molecules through an ensemble method is another utility integrated in FusionHub that could aid in siRNA-based targeted therapy. FusionHub is freely available at https://fusionhub.persistent.co.in.
Angermeier, Paul L.; Frimpong, Emmanuel A.
2009-01-01
The need for integrated and widely accessible sources of species traits data to facilitate studies of ecology, conservation, and management has motivated development of traits databases for various taxa. In spite of the increasing number of traits-based analyses of freshwater fishes in the United States, no consolidated database of traits of this group exists publicly, and much useful information on these species is documented only in obscure sources. The largely inaccessible and unconsolidated traits information makes large-scale analysis involving many fishes and/or traits particularly challenging. FishTraits is a database of >100 traits for 809 (731 native and 78 exotic) fish species found in freshwaters of the conterminous United States, including 37 native families and 145 native genera. The database contains information on four major categories of traits: (1) trophic ecology, (2) body size and reproductive ecology (life history), (3) habitat associations, and (4) salinity and temperature tolerances. Information on geographic distribution and conservation status is also included. Together, we refer to the traits, distribution, and conservation status information as attributes. Many sources were consulted to compile attributes, including state and regional species accounts and other databases.
Creating a biopower agenda through grassroots organizing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hauter, W.
1995-11-01
Biomass electricity provides both opportunities for strengthening the rural economy and advancing environmental goals. However, while large scale biomass development can be done in a manner that both furthers economic development and helps prevent environmental degradation, its commercialization requires a complex coordination of activities between utilities and farmers. Inherent problems exist in creating parallel development of a resource base and technological advancements. In fact, an understanding of the anthropology of biopower is necessary in order to advance it on a large scale. The Union of Concerned Scientists (UCS) published a report on renewable electricity, released in March 1992, that has been used as a foundation for state-based work promoting renewables. In several Midwestern states, such as Nebraska, Minnesota, and Wisconsin, we have used classic grassroots organizing skills to educate the public and key constituencies about the benefits of biomass. Besides working directly with utilities to promote biomass development, we also have a legislative agenda that helps create a climate favorable to biopower. This paper will focus on the grassroots aspect of our campaigns. It will also include an overview of some anthropological work that the author has done in communities with farmers. The main tool for this has been focus groups. We have found that people can be organized around biomass issues and that a grassroots base furthers biomass development.
On the statistical mechanics of the 2D stochastic Euler equation
NASA Astrophysics Data System (ADS)
Bouchet, Freddy; Laurie, Jason; Zaboronski, Oleg
2011-12-01
The dynamics of vortices and large-scale structures is qualitatively very different in two-dimensional flows than in their three-dimensional counterparts, owing to the presence of multiple integrals of motion. These are believed to be responsible for a variety of phenomena observed in Euler flow, such as the formation of large-scale coherent structures, the existence of meta-stable states, and random abrupt changes in the topology of the flow. In this paper we study the stochastic dynamics of a finite-dimensional approximation of the 2D Euler flow based on the Lie algebra su(N), which preserves all integrals of motion. In particular, we exploit the rich algebraic structure responsible for Euler's conservation laws to calculate the invariant measures, explore their properties, and study the approach to equilibrium. Unexpectedly, we find deep connections between equilibrium measures of finite-dimensional su(N) truncations of the stochastic Euler equations and random matrix models. Our work can be regarded as a preparation for addressing the questions of large-scale structures, meta-stability, and the dynamics of random transitions between different flow topologies in stochastic 2D Euler flows.
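For orientation, the standard su(N) truncation of this kind is Zeitlin's sine-bracket model; assuming the usual conventions (the paper's normalization may differ), the truncated vorticity modes evolve as

\[
\dot{\omega}_{\mathbf{m}} = \sum_{\mathbf{n} \neq 0} \frac{\sin\big(\epsilon\,(\mathbf{m} \times \mathbf{n})\big)}{\epsilon\,|\mathbf{n}|^{2}}\,\omega_{\mathbf{n}}\,\omega_{\mathbf{m}-\mathbf{n}}, \qquad \epsilon = \frac{2\pi}{N},
\]

where \(\mathbf{m} \times \mathbf{n} = m_{1}n_{2} - m_{2}n_{1}\). As \(N \to \infty\) the sine bracket tends to the canonical Poisson bracket and the 2D Euler equation is recovered, while at finite \(N\) the truncation retains su(N) analogues of the conserved Casimirs, which is what makes the invariant measures tractable.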
The USNO Astrometry Department
New instrumentation and methods, such as large-scale CCD measuring devices and speckle and radio interferometry, are being developed. Results of the observational programs are published in the Naval Observatory Publications and in refereed journals.
Self-sustaining processes at all scales in wall-bounded turbulent shear flows
NASA Astrophysics Data System (ADS)
Cossu, Carlo; Hwang, Yongyun
2017-03-01
We collect and discuss the results of our recent studies which show evidence of the existence of a whole family of self-sustaining motions in wall-bounded turbulent shear flows with scales ranging from those of buffer-layer streaks to those of large-scale and very-large-scale motions in the outer layer. The statistical and dynamical features of this family of self-sustaining motions, which are associated with streaks and quasi-streamwise vortices, are consistent with those of Townsend's attached eddies. Motions at each relevant scale are able to sustain themselves in the absence of forcing from larger- or smaller-scale motions by extracting energy from the mean flow via a coherent lift-up effect. The coherent self-sustaining process is embedded in a set of invariant solutions of the filtered Navier-Stokes equations which take into full account the Reynolds stresses associated with the residual smaller-scale motions.
Forced Alignment for Understudied Language Varieties: Testing Prosodylab-Aligner with Tongan Data
ERIC Educational Resources Information Center
Johnson, Lisa M.; Di Paolo, Marianna; Bell, Adrian
2018-01-01
Automated alignment of transcriptions to audio files expedites the process of preparing data for acoustic analysis. Unfortunately, the benefits of auto-alignment have generally been available only to researchers studying majority languages, for which large corpora exist and for which acoustic models have been created by large-scale research…
ERIC Educational Resources Information Center
Greene, Jennifer C.; Kellogg, Theodore
Statewide assessment data available from two school years, two grade levels, and five sources (achievement tests; student, principal, and teacher questionnaires; and principal interviews), were aggregated to more closely investigate the relationship between student/school characteristics and student achievement. To organize this large number of…
Chemical Warfare and Medical Response During World War I
Fitzgerald, Gerard J.
2008-01-01
The first large-scale use of a traditional weapon of mass destruction (chemical, biological, or nuclear) involved the successful deployment of chemical weapons during World War I (1914–1918). Historians now refer to the Great War as the chemist’s war because of the scientific and engineering mobilization efforts by the major belligerents. The development, production, and deployment of war gases such as chlorine, phosgene, and mustard created a new and complex public health threat that endangered not only soldiers and civilians on the battlefield but also chemical workers on the home front involved in the large-scale manufacturing processes. The story of chemical weapons research and development during that war provides useful insights for current public health practitioners faced with a possible chemical weapons attack against civilian or military populations. PMID:18356568
Chemical warfare and medical response during World War I.
Fitzgerald, Gerard J
2008-04-01
The first large-scale use of a traditional weapon of mass destruction (chemical, biological, or nuclear) involved the successful deployment of chemical weapons during World War I (1914-1918). Historians now refer to the Great War as the chemist's war because of the scientific and engineering mobilization efforts by the major belligerents. The development, production, and deployment of war gases such as chlorine, phosgene, and mustard created a new and complex public health threat that endangered not only soldiers and civilians on the battlefield but also chemical workers on the home front involved in the large-scale manufacturing processes. The story of chemical weapons research and development during that war provides useful insights for current public health practitioners faced with a possible chemical weapons attack against civilian or military populations.
GenomeDiagram: a python package for the visualization of large-scale genomic data.
Pritchard, Leighton; White, Jennifer A; Birch, Paul R J; Toth, Ian K
2006-03-01
We present GenomeDiagram, a flexible, open-source Python module for the visualization of large-scale genomic, comparative genomic and other data with reference to a single chromosome or other biological sequence. GenomeDiagram may be used to generate publication-quality vector graphics, rastered images and in-line streamed graphics for webpages. The package integrates with datatypes from the BioPython project, and is available for Windows, Linux and Mac OS X systems. GenomeDiagram is freely available as source code (under GNU Public License) at http://bioinf.scri.ac.uk/lp/programs.html, and requires Python 2.3 or higher, and recent versions of the ReportLab and BioPython packages. A user manual, example code and images are available at http://bioinf.scri.ac.uk/lp/programs.html.
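A minimal usage sketch, assuming the interface as later distributed inside Biopython (Bio.Graphics.GenomeDiagram, which absorbed this module); it requires Biopython and ReportLab, and the input file name is illustrative:

```python
# Draw all CDS features of a GenBank record as a single-track linear diagram.
from Bio import SeqIO
from Bio.Graphics import GenomeDiagram
from reportlab.lib import colors

record = SeqIO.read("genome.gb", "genbank")  # hypothetical input file

diagram = GenomeDiagram.Diagram("Example genome")
track = diagram.new_track(1, name="CDS features", greytrack=True)
features = track.new_set()

# Color arrows by strand so gene orientation is visible at a glance.
for feature in record.features:
    if feature.type == "CDS":
        color = colors.blue if feature.location.strand == 1 else colors.lightblue
        features.add_feature(feature, sigil="ARROW", color=color, label=True)

diagram.draw(format="linear", pagesize="A4", fragments=4, start=0, end=len(record))
diagram.write("genome_diagram.pdf", "PDF")  # publication-quality vector output
```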
NASA Astrophysics Data System (ADS)
D'Aranno, Peppe J. V.; Marsella, Maria; Scifoni, Silvia; Scutti, Marianna; Sonnessa, Alberico; Bonano, Manuela
2015-10-01
Remote sensing data play an important role in environmental monitoring because they provide systematic information over very large areas and long periods of time. Such information must be analyzed, validated, and incorporated into proper modeling tools in order to become useful for risk assessment. These approaches have already been applied in the field of natural hazard evaluation (e.g., for monitoring seismic and volcanic areas and landslides). However, not enough attention has been devoted to the development of validated methods for quantitative analysis of civil structures. This work is dedicated to the comprehensive use of ERS/ENVISAT data from the ESA SAR archive to detect deformation trends and perform back-analysis of the investigated structures, useful for calibrating damage assessment models. After this preliminary analysis, SAR data from the new satellite missions (i.e., COSMO-SkyMed) were adopted to monitor the evolution of existing surface deformation processes and to detect new occurrences. The specific objective was to set up a data processing and analysis chain tailored to a service that sustains the safe maintenance of the built environment, including critical constructions such as public buildings (schools, hospitals, etc.), strategic infrastructure (dams, highways, etc.), and cultural heritage sites. The analysis of the test area, in the southeastern sector of Rome, provided three different levels and sub-levels of products, from metropolitan-area scale (territorial analysis) and settlement scale (aggregated analysis) to single-structure scale (damage degree associated with the structure).
Pressing needs of biomedical text mining in biocuration and beyond: opportunities and challenges.
Singhal, Ayush; Leaman, Robert; Catlett, Natalie; Lemberger, Thomas; McEntyre, Johanna; Polson, Shawn; Xenarios, Ioannis; Arighi, Cecilia; Lu, Zhiyong
2016-01-01
Text mining in the biomedical sciences is rapidly transitioning from small-scale evaluation to large-scale application. In this article, we argue that text-mining technologies have become essential tools in real-world biomedical research. We describe four large-scale applications of text mining, as showcased during a recent panel discussion at the BioCreative V Challenge Workshop. We draw on these applications as case studies to characterize common requirements for successfully applying text-mining techniques to practical biocuration needs. We note that system 'accuracy' remains a challenge and identify several additional common difficulties and potential research directions, including (i) the 'scalability' issue arising from the increasing need to mine information from millions of full-text articles, (ii) the 'interoperability' issue of integrating various text-mining systems into existing curation workflows, and (iii) the 'reusability' issue, namely the difficulty of applying trained systems to text genres not seen during development. We then describe related efforts within the text-mining community, with a special focus on the BioCreative series of challenge workshops. We believe that focusing on the near-term challenges identified in this work will amplify the opportunities afforded by the continued adoption of text-mining tools. Finally, in order to sustain the curation ecosystem and have text-mining systems adopted for practical benefits, we call for increased collaboration between text-mining researchers and various stakeholders, including researchers, publishers, and biocurators. Published by Oxford University Press 2016. This work is written by US Government employees and is in the public domain in the US.
The Use of Weighted Graphs for Large-Scale Genome Analysis
Zhou, Fang; Toivonen, Hannu; King, Ross D.
2014-01-01
There is an acute need for better tools to extract knowledge from the growing flood of sequence data. For example, thousands of complete genomes have been sequenced, and their metabolic networks inferred. Such data should enable a better understanding of evolution. However, most existing network analysis methods are based on pair-wise comparisons, and these do not scale to thousands of genomes. Here we propose the use of weighted graphs as a data structure to enable large-scale phylogenetic analysis of networks. We have developed three types of weighted graph for enzymes: taxonomic (these summarize phylogenetic importance), isoenzymatic (these summarize enzymatic variety/redundancy), and sequence-similarity (these summarize sequence conservation); and we applied these types of weighted graph to survey prokaryotic metabolism. To demonstrate the utility of this approach we have compared and contrasted the large-scale evolution of metabolism in Archaea and Eubacteria. Our results provide evidence for limits to the contingency of evolution. PMID:24619061
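A hypothetical sketch of the underlying idea, building one weighted summary graph from many per-genome networks so that analyses scale with the number of enzymes rather than with pair-wise genome comparisons; the input format and weighting rule are illustrative assumptions:

```python
# Summarize many genome-scale networks as one weighted graph:
# edge weight = fraction of genomes in which the enzyme pair co-occurs.
import networkx as nx

genomes = {  # toy input: genome -> list of enzyme-enzyme edges
    "genome_A": [("ec:1.1.1.1", "ec:1.2.1.3"), ("ec:1.2.1.3", "ec:2.3.1.9")],
    "genome_B": [("ec:1.1.1.1", "ec:1.2.1.3")],
}

summary = nx.Graph()
for edges in genomes.values():
    for u, v in edges:
        if summary.has_edge(u, v):
            summary[u][v]["count"] += 1
        else:
            summary.add_edge(u, v, count=1)

n_genomes = len(genomes)
for u, v, data in summary.edges(data=True):
    data["weight"] = data["count"] / n_genomes  # phylogenetic prevalence

print(sorted(summary.edges(data="weight")))
```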
Self-sustaining processes at all scales in wall-bounded turbulent shear flows
Hwang, Yongyun
2017-01-01
We collect and discuss the results of our recent studies which show evidence of the existence of a whole family of self-sustaining motions in wall-bounded turbulent shear flows with scales ranging from those of buffer-layer streaks to those of large-scale and very-large-scale motions in the outer layer. The statistical and dynamical features of this family of self-sustaining motions, which are associated with streaks and quasi-streamwise vortices, are consistent with those of Townsend’s attached eddies. Motions at each relevant scale are able to sustain themselves in the absence of forcing from larger- or smaller-scale motions by extracting energy from the mean flow via a coherent lift-up effect. The coherent self-sustaining process is embedded in a set of invariant solutions of the filtered Navier–Stokes equations which take into full account the Reynolds stresses associated with the residual smaller-scale motions. This article is part of the themed issue ‘Toward the development of high-fidelity models of wall turbulence at large Reynolds number’. PMID:28167581
Self-sustaining processes at all scales in wall-bounded turbulent shear flows.
Cossu, Carlo; Hwang, Yongyun
2017-03-13
We collect and discuss the results of our recent studies which show evidence of the existence of a whole family of self-sustaining motions in wall-bounded turbulent shear flows with scales ranging from those of buffer-layer streaks to those of large-scale and very-large-scale motions in the outer layer. The statistical and dynamical features of this family of self-sustaining motions, which are associated with streaks and quasi-streamwise vortices, are consistent with those of Townsend's attached eddies. Motions at each relevant scale are able to sustain themselves in the absence of forcing from larger- or smaller-scale motions by extracting energy from the mean flow via a coherent lift-up effect. The coherent self-sustaining process is embedded in a set of invariant solutions of the filtered Navier-Stokes equations which take into full account the Reynolds stresses associated with the residual smaller-scale motions. This article is part of the themed issue 'Toward the development of high-fidelity models of wall turbulence at large Reynolds number'. © 2017 The Author(s).
NASA Astrophysics Data System (ADS)
Madriz Aguilar, José Edgar; Bellini, Mauricio
2009-08-01
Considering a five-dimensional (5D) Riemannian spacetime with a particular stationary Ricci-flat metric, we obtain, in the framework of the induced matter theory, an effective 4D static and spherically symmetric metric which gives ordinary gravitational solutions on small (planetary and astrophysical) scales, but repulsive (antigravitational) forces on very large (cosmological) scales with ω=-1. Our approach describes dark energy, dark matter, and ordinary matter in a unified manner. We illustrate the theory with two examples, the solar system and the Great Attractor. From the geometrical point of view, these results follow from the assumption that there exists a confining force that makes it possible for test particles to move on a given 4D hypersurface.
Apolinário-Hagen, Jennifer; Vehreschild, Viktor; Alkoudmani, Ramez M
2017-02-23
Despite the advanced development of evidence-based psychological treatment services, help-seeking persons with mental health problems often fail to receive appropriate professional help. Internet-delivered psychotherapy has thus been suggested as an efficient strategy to overcome barriers to accessing mental health care on a large scale. However, previous research has indicated poor public acceptability as an issue for the dissemination of Internet-delivered therapies. Currently, little is known about the expectations of and attitudes toward Internet-delivered therapies in the general population. This is especially the case for countries such as Germany, where electronic mental health (e-mental health) treatment services are planned to be implemented in routine care. This pilot study aimed to determine the expectations and attitudes toward Internet-based psychotherapy in the general population in Germany. Furthermore, it aimed to explore the associations between attitudes toward Internet-based therapies and perceived stress. To assess public attitudes toward Internet-based psychotherapy, we conducted both Web-based and paper-and-pencil surveys using a self-developed 14-item questionnaire (Cronbach alpha=.89). Psychological distress was measured by employing a visual analogue scale (VAS) and the 20-item German version of the Perceived Stress Questionnaire (PSQ). In addition, we conducted exploratory factor analysis (principal axis factor analysis with promax rotation). Spearman's rank correlations were used to determine the associations between attitudes toward Internet-based therapies and perceived stress. Descriptive analyses revealed that most respondents (N=1558; female: 78.95%, 1230/1558) indicated that they were not aware of the existence of Internet-delivered therapies (83.46%, 1141/1367). The average age was 32 years (standard deviation, SD 10.9; range 16-76). Through exploratory factor analysis, we identified 3 dimensions of public attitudes toward Internet-based therapies, which we labeled "usefulness or helpfulness," "relative advantage or comparability," and "accessibility or access to health care." Analyses revealed negative views about Internet-based therapies on most domains, such as perceived helpfulness. The study findings further indicated ambivalent attitudes: although most respondents agreed with statements on expected improvements in health care (eg, expanded access), we observed low intentions to use Internet-delivered therapies in the future in the case of mental health problems. This pilot study showed deficient "e-awareness" and rather negative or ambivalent attitudes toward Internet-delivered therapies in the German-speaking general population. However, research targeting determinants of the large-scale adoption of Internet-based psychotherapy is still in its infancy. Thus, further research is required to explore the "black box" of public attitudes toward Internet-delivered therapies with representative samples, validated measures, and longitudinal survey designs. ©Jennifer Apolinário-Hagen, Viktor Vehreschild, Ramez M Alkoudmani. Originally published in JMIR Mental Health (http://mental.jmir.org), 23.02.2017.
Apolinário-Hagen, Jennifer; Vehreschild, Viktor; Alkoudmani, Ramez M
2017-01-01
Background Despite the advanced development of evidence-based psychological treatment services, help-seeking persons with mental health problems often fail to receive appropriate professional help. Internet-delivered psychotherapy has thus been suggested as an efficient strategy to overcome barriers to accessing mental health care on a large scale. However, previous research has indicated poor public acceptability as an issue for the dissemination of Internet-delivered therapies. Currently, little is known about the expectations of and attitudes toward Internet-delivered therapies in the general population. This is especially the case for countries such as Germany, where electronic mental health (e-mental health) treatment services are planned to be implemented in routine care. Objective This pilot study aimed to determine the expectations and attitudes toward Internet-based psychotherapy in the general population in Germany. Furthermore, it aimed to explore the associations between attitudes toward Internet-based therapies and perceived stress. Methods To assess public attitudes toward Internet-based psychotherapy, we conducted both Web-based and paper-and-pencil surveys using a self-developed 14-item questionnaire (Cronbach alpha=.89). Psychological distress was measured by employing a visual analogue scale (VAS) and the 20-item German version of the Perceived Stress Questionnaire (PSQ). In addition, we conducted exploratory factor analysis (principal axis factor analysis with promax rotation). Spearman’s rank correlations were used to determine the associations between attitudes toward Internet-based therapies and perceived stress. Results Descriptive analyses revealed that most respondents (N=1558; female: 78.95%, 1230/1558) indicated that they were not aware of the existence of Internet-delivered therapies (83.46%, 1141/1367). The average age was 32 years (standard deviation, SD 10.9; range 16-76). Through exploratory factor analysis, we identified 3 dimensions of public attitudes toward Internet-based therapies, which we labeled “usefulness or helpfulness,” “relative advantage or comparability,” and “accessibility or access to health care.” Analyses revealed negative views about Internet-based therapies on most domains, such as perceived helpfulness. The study findings further indicated ambivalent attitudes: although most respondents agreed with statements on expected improvements in health care (eg, expanded access), we observed low intentions to use Internet-delivered therapies in the future in the case of mental health problems. Conclusions This pilot study showed deficient “e-awareness” and rather negative or ambivalent attitudes toward Internet-delivered therapies in the German-speaking general population. However, research targeting determinants of the large-scale adoption of Internet-based psychotherapy is still in its infancy. Thus, further research is required to explore the “black box” of public attitudes toward Internet-delivered therapies with representative samples, validated measures, and longitudinal survey designs. PMID:28232298
Organization and scaling in water supply networks
NASA Astrophysics Data System (ADS)
Cheng, Likwan; Karney, Bryan W.
2017-12-01
Public water supply is one of society's most vital resources and most costly infrastructures. Traditional concepts of these networks capture their engineering identity as isolated, deterministic hydraulic units, but overlook their physics identity as related entities in a probabilistic, geographic ensemble, characterized by size organization and property scaling. Although discoveries of allometric scaling in natural supply networks (organisms and rivers) raised the prospect of similar findings in anthropogenic supplies, so far no such finding has been reported in public water or related civic resource supplies. Examining an empirical ensemble of large number and wide size range, we show that water supply networks possess self-organized size abundance and theory-explained allometric scaling in spatial, infrastructural, and resource- and emission-flow properties. These discoveries establish a scaling physics for water supply networks and may lead to novel applications in resource- and jurisdiction-scale water governance.
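Allometric scaling here refers to power-law relations between a network property and network size; in generic form (the abstract does not quote its fitted exponents),

\[
Y = Y_{0}\,M^{\beta}, \qquad \log Y = \log Y_{0} + \beta \log M,
\]

where \(M\) is a measure of network size (e.g., population served), \(Y\) a spatial, infrastructural, or flow property, and an exponent \(\beta \neq 1\) signals economies or diseconomies of scale; on a log-log plot the relation appears as a straight line of slope \(\beta\).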
Resources for Functional Genomics Studies in Drosophila melanogaster
Mohr, Stephanie E.; Hu, Yanhui; Kim, Kevin; Housden, Benjamin E.; Perrimon, Norbert
2014-01-01
Drosophila melanogaster has become a system of choice for functional genomic studies. Many resources, including online databases and software tools, are now available to support design or identification of relevant fly stocks and reagents or analysis and mining of existing functional genomic, transcriptomic, proteomic, etc. datasets. These include large community collections of fly stocks and plasmid clones, “meta” information sites like FlyBase and FlyMine, and an increasing number of more specialized reagents, databases, and online tools. Here, we introduce key resources useful to plan large-scale functional genomics studies in Drosophila and to analyze, integrate, and mine the results of those studies in ways that facilitate identification of highest-confidence results and generation of new hypotheses. We also discuss ways in which existing resources can be used and might be improved and suggest a few areas of future development that would further support large- and small-scale studies in Drosophila and facilitate use of Drosophila information by the research community more generally. PMID:24653003
Cosmic string induced peculiar velocities
NASA Technical Reports Server (NTRS)
Van Dalen, Anthony; Schramm, David N.
1988-01-01
This paper considers the scenario of a flat universe with a network of heavy cosmic strings as the source of the primordial fluctuation spectrum. The joint probability of finding streaming velocities of at least 600 km/s on large scales and local peculiar velocities of less than 800 km/s is calculated. It is shown how the effects of loops breaking up and being born with a spectrum of sizes can be estimated. It is found that to obtain large-scale streaming velocities of at least 600 km/s, either a large value of βGμ must exist or the effects of loop fissioning and the details of loop production must be considerable.
Friction Stir Welding of Large Scale Cryogenic Tanks for Aerospace Applications
NASA Technical Reports Server (NTRS)
Russell, Carolyn; Ding, R. Jeffrey
1998-01-01
The Marshall Space Flight Center (MSFC) has established a facility for the joining of large-scale aluminum cryogenic propellant tanks using the friction stir welding process. Longitudinal welds, approximately five meters in length, have been made by retrofitting an existing vertical fusion weld system, designed to fabricate tank barrel sections ranging from two to ten meters in diameter. The structural design requirements of the tooling, clamping and travel system will be described in this presentation along with process controls and real-time data acquisition developed for this application. The approach to retrofitting other large welding tools at MSFC with the friction stir welding process will also be discussed.
Large-scale Density Structures in Magneto-rotational Disk Turbulence
NASA Astrophysics Data System (ADS)
Youdin, Andrew; Johansen, A.; Klahr, H.
2009-01-01
Turbulence generated by the magneto-rotational instability (MRI) is a strong candidate to drive accretion flows in disks, including sufficiently ionized regions of protoplanetary disks. The MRI is often studied in local shearing boxes, which model a small section of the disk at high resolution. I will present simulations of large, stratified shearing boxes which extend up to 10 gas scale-heights across. These simulations are a useful bridge to fully global disk simulations. We find that MRI turbulence produces large-scale, axisymmetric density perturbations. These structures are part of a zonal flow --- analogous to the banded flow in Jupiter's atmosphere --- which survives in near geostrophic balance for tens of orbits. The launching mechanism is large-scale magnetic tension generated by an inverse cascade. We demonstrate the robustness of these results by careful study of various box sizes, grid resolutions, and microscopic diffusion parameterizations. These gas structures can trap solid material (in the form of large dust or ice particles) with important implications for planet formation. Resolved disk images at mm-wavelengths (e.g. from ALMA) will verify or constrain the existence of these structures.
Kennedy, Amy E.; Khoury, Muin J.; Ioannidis, John P.A.; Brotzman, Michelle; Miller, Amy; Lane, Crystal; Lai, Gabriel Y.; Rogers, Scott D.; Harvey, Chinonye; Elena, Joanne W.; Seminara, Daniela
2017-01-01
Background We report on the establishment of a web-based Cancer Epidemiology Descriptive Cohort Database (CEDCD). The CEDCD’s goals are to enhance awareness of resources, facilitate interdisciplinary research collaborations, and support existing cohorts for the study of cancer-related outcomes. Methods Comprehensive descriptive data were collected from large cohorts established to study cancer as primary outcome using a newly developed questionnaire. These included an inventory of baseline and follow-up data, biospecimens, genomics, policies, and protocols. Additional descriptive data extracted from publicly available sources were also collected. This information was entered in a searchable and publicly accessible database. We summarized the descriptive data across cohorts and reported the characteristics of this resource. Results As of December 2015, the CEDCD includes data from 46 cohorts representing more than 6.5 million individuals (29% ethnic/racial minorities). Overall, 78% of the cohorts have collected blood at least once, 57% at multiple time points, and 46% collected tissue samples. Genotyping has been performed by 67% of the cohorts, while 46% have performed whole-genome or exome sequencing in subsets of enrolled individuals. Information on medical conditions other than cancer has been collected in more than 50% of the cohorts. More than 600,000 incident cancer cases and more than 40,000 prevalent cases are reported, with 24 cancer sites represented. Conclusions The CEDCD assembles detailed descriptive information on a large number of cancer cohorts in a searchable database. Impact Information from the CEDCD may assist the interdisciplinary research community by facilitating identification of well-established population resources and large-scale collaborative and integrative research. PMID:27439404
A high-resolution European dataset for hydrologic modeling
NASA Astrophysics Data System (ADS)
Ntegeka, Victor; Salamon, Peter; Gomes, Goncalo; Sint, Hadewij; Lorini, Valerio; Thielen, Jutta
2013-04-01
There is an increasing demand for large-scale hydrological models, not only for modeling the impact of climate change on water resources but also for disaster risk assessments and flood or drought early warning systems. These large-scale models need to be calibrated and verified against large amounts of observations in order to judge their capability to predict the future. However, the creation of large-scale datasets is challenging because it requires collection, harmonization, and quality checking of large amounts of observations. For this reason, only a limited number of such datasets exist. In this work, we present a pan-European, high-resolution gridded dataset of meteorological observations (EFAS-Meteo) designed to drive a large-scale hydrological model. Similar European and global gridded datasets already exist, such as HadGHCND (Caesar et al., 2006), the JRC MARS-STAT database (van der Goot and Orlandi, 2003) and the E-OBS gridded dataset (Haylock et al., 2008). However, none of those provide a similarly high spatial resolution and/or a complete set of variables to force a hydrologic model. EFAS-Meteo contains daily maps of precipitation, surface temperature (mean, minimum and maximum), wind speed and vapour pressure at a spatial grid resolution of 5 x 5 km for the period 1 January 1990 - 31 December 2011. It furthermore contains radiation, calculated with a staggered approach depending on the availability of sunshine duration, cloud cover and minimum and maximum temperature, and evapotranspiration (potential evapotranspiration, bare soil and open water evapotranspiration), with the potential evapotranspiration calculated using the Penman-Monteith equation with the above-mentioned meteorological variables. The dataset was created as part of the development of the European Flood Awareness System (EFAS) and has been continuously updated in recent years. The dataset variables are used as inputs to the hydrological calibration and validation of EFAS, as well as for establishing long-term discharge "proxy" climatologies which can in turn be used for statistical analysis to derive return periods and other time-series derivatives. In addition, this dataset will be used to assess climatological trends in Europe. Unfortunately, to date no baseline dataset at the European scale exists against which to test the quality of the data presented here; a comparison against other existing datasets can therefore only be an indication of data quality. Owing to availability, the comparison was made for precipitation and temperature only, arguably the most important meteorological drivers of hydrologic models. A variety of analyses was undertaken at country scale against data reported to EUROSTAT and the E-OBS dataset. The comparison revealed that while the datasets showed overall similar temporal and spatial patterns, there were some differences in magnitude, especially for precipitation. It is not straightforward to identify the specific cause of these differences, but in most cases the comparatively low observation station density appears to be the principal reason for the differences in magnitude.
Prisoner reentry: a public health or public safety issue for social work practice?
Patterson, George T
2013-01-01
A significant literature identifies the policy, economic, health, and social challenges that confront released prisoners. This literature also describes the public health and public safety risks associated with prisoner reentry, provides recommendations for improving the reentry process, and describes the effectiveness of prison-based programs on recidivism rates. Public health and public safety risks are particularly significant in communities where large numbers of prisoners are released and few evidence-based services exist. The purpose of this article is to describe the public health and public safety risks that released prisoners experience when they reenter communities, and to discuss the social justice issues relevant for social work practice.
Program on application of communications satellites to educational development
NASA Technical Reports Server (NTRS)
Morgan, R. P.; Singh, J. P.
1971-01-01
Interdisciplinary research in needs analysis, communications technology studies, and systems synthesis is reported. Existing and planned educational telecommunications services are studied and library utilization of telecommunications is described. Preliminary estimates are presented of ranges of utilization of educational telecommunications services for 1975 and 1985: instructional and public television, computer-aided instruction, computing resources, and information resource sharing for various educational levels and purposes. Communications technology studies include transmission schemes for still-picture television, use of Gunn effect devices, and TV receiver front ends for direct satellite reception at 12 GHz. Two major studies in the systems synthesis project concern (1) organizational and administrative aspects of a large-scale instructional satellite system to be used with schools and (2) an analysis of future development of instructional television, with emphasis on the use of video tape recorders and cable television. A communications satellite system synthesis program developed for NASA is now operational on the university IBM 360-50 computer.
Planning of Eka Hospital Pekanbaru wastewater recycling facility
NASA Astrophysics Data System (ADS)
Jecky, A.; Andrio, D.; Sasmita, A.
2018-04-01
To comply with Ministry of Public Works Regulation No. 06/2011, which requires large-scale water users to conserve water resources, Eka Hospital Pekanbaru has to improve its sewage treatment plant (STP) through wastewater recycling. The effluent from the plant can be used for landscape gardening and other non-potable activities. The wastewater recycling design was carried out by analyzing the existing condition of the sewage treatment plant, determining the effluent quality standards for wastewater recycling, selecting alternative treatment technologies and processes, designing the treatment units, and analyzing the economic aspects. The recycling facility design uses a combination of cartridge filtration, ultrafiltration membranes, and disinfection by chlorination. The wastewater recycling capacity is approximately 75 m3/day, or 75% of the STP effluent. The estimated cost for installation of the wastewater recycling facility is Rp 111,708,000, and operation and maintenance costs are approximately Rp 2,498,000 per month.
NASA Astrophysics Data System (ADS)
McMurdie, L. A.; Houze, R.; Zagrodnik, J.; Rowe, A.; DeHart, J.; Barnes, H.
2016-12-01
Successful and sustainable coupling of human societies and natural systems requires effective governance, which depends on the existence of proper infrastructure (both hard and soft). In recent decades, much attention has been paid to what has allowed many small-scale, self-organized coupled natural-human systems around the world to persist for centuries, thanks in large part to the work of Elinor Ostrom and colleagues. In this work, we mathematically operationalize a conceptual framework developed from this body of work by way of a stylized model. The model captures the interplay between replicator dynamics within the population, the dynamics of natural resources, and the threshold characteristics of public infrastructure. The model analysis reveals conditions for long-term sustainability and collapse of the coupled systems, as well as other tradeoffs and potential pitfalls in governing these systems.
Uribe-Sánchez, Andrés; Savachkin, Alex
2011-01-01
As recently pointed out by the Institute of Medicine, existing pandemic mitigation models lack dynamic decision support capability. We develop a large-scale simulation-driven optimization model for generating dynamic predictive distributions of vaccines and antivirals over a network of regional pandemic outbreaks. The model incorporates measures of morbidity, mortality, and social distancing, translated into the cost of lost productivity and medical expenses. The performance of the strategy is compared to that of the reactive myopic policy, using a sample outbreak in Florida, USA, with an affected population of over four million. The comparison is implemented at different levels of vaccine and antiviral availability and administration capacity. Sensitivity analysis is performed to assess the impact of the variability of some critical factors on policy performance. The model is intended to support public health policy making for effective distribution of limited mitigation resources. PMID:23074658
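As a toy illustration of the decision structure (not the authors' actual model, which couples a full epidemic simulation with optimization), a predictive policy might allocate a limited stockpile across regions in proportion to simulated expected burden:

```python
# Hypothetical proportional allocation of a limited stockpile across regions.
# Region names and burden figures are illustrative; rounding is ignored.
def allocate(stockpile, predicted_burden):
    """predicted_burden: dict mapping region -> expected cases from simulation."""
    total = sum(predicted_burden.values())
    return {region: int(stockpile * burden / total)
            for region, burden in predicted_burden.items()}

print(allocate(100_000, {"region_A": 4000, "region_B": 1000, "region_C": 5000}))
```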
The Four Cs of disaster partnering: communication, cooperation, coordination and collaboration.
Martin, Eric; Nolte, Isabelle; Vitolo, Emma
2016-10-01
Public, nonprofit and private organisations respond to large-scale disasters domestically and overseas. Critics of these assistance efforts, as well as those involved, often cite poor interorganisational partnering as an obstacle to successful disaster response. Observers frequently call for 'more' and 'better' partnering. We found important qualitative distinctions existed within partnering behaviours. We identified four different types of interorganisational partnering activities often referred to interchangeably: communication, cooperation, coordination and collaboration-the Four Cs. We derived definitions of the Four Cs from the partnering literature. We then tested them in a case study of the response to the 2010 Haiti earthquake. We suggest that the Four Cs are distinct activities, that organisations are typically strong or weak in one or more for various reasons, and that the four terms represent a continuum of increased interorganisational embeddedness in partnering activities. © 2016 The Author(s). Disasters © Overseas Development Institute, 2016.
Population Policy: Abortion and Modern Contraception Are Substitutes.
Miller, Grant; Valente, Christine
2016-08-01
A longstanding debate exists in population policy about the relationship between modern contraception and abortion. Although theory predicts that they should be substitutes, the empirical evidence is difficult to interpret. What is required is a large-scale intervention that alters the supply (or full price) of one or the other and, importantly, that does so in isolation (reproductive health programs often bundle primary health care and family planning-and in some instances, abortion services). In this article, we study Nepal's 2004 legalization of abortion provision and subsequent expansion of abortion services, an unusual and rapidly implemented policy meeting these requirements. Using four waves of rich individual-level data representative of fertile-age Nepalese women, we find robust evidence of substitution between modern contraception and abortion. This finding has important implications for public policy and foreign aid, suggesting that an effective strategy for reducing expensive and potentially unsafe abortions may be to expand the supply of modern contraceptives.
Reflections on social activism in otolaryngology.
Kopelovich, Jonathan C
2014-03-01
What is "social activism" to you? For older otolaryngologists, the term is likely to signify the tumult of the 1960s. For incoming generations, this connotation is outdated. Rather, it more broadly reflects concerted efforts to improve the public good. Some ally with existing institutions to work toward incremental progress. Some start new organizations, using technological tools to build networks, marshal resources, and leapfrog hurdles. Countering these efforts are the ever-changing challenges of practicing otolaryngology today: electronic health records, shifting incentives, and changes in the practice model. Employment by large conglomerates is more common, decreasing our visibility as community leaders. Burnout is a recognized "hazard," and budding otolaryngologists are particularly susceptible. Adding one more thing, like social activism, to a full plate seems counterintuitive. But it shouldn't be. You don't need a "bigger" plate to get involved in social causes. Start simple. Find a partner. Scale up. You'll find it rewarding.
Brysbaert, Marc; Stevens, Michaël; Mandera, Paweł; Keuleers, Emmanuel
2016-01-01
Based on an analysis of the literature and a large scale crowdsourcing experiment, we estimate that an average 20-year-old native speaker of American English knows 42,000 lemmas and 4,200 non-transparent multiword expressions, derived from 11,100 word families. The numbers range from 27,000 lemmas for the lowest 5% to 52,000 for the highest 5%. Between the ages of 20 and 60, the average person learns 6,000 extra lemmas or about one new lemma every 2 days. The knowledge of the words can be as shallow as knowing that the word exists. In addition, people learn tens of thousands of inflected forms and proper nouns (names), which account for the substantially high numbers of ‘words known’ mentioned in other publications. PMID:27524974
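The quoted learning rate follows directly from the figures above:

\[
\frac{(60 - 20)\ \text{years} \times 365\ \text{days/year}}{6{,}000\ \text{lemmas}} \approx 2.4\ \text{days per new lemma},
\]

consistent with the rounded "one new lemma every 2 days".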
Shear-driven dynamo waves at high magnetic Reynolds number.
Tobias, S M; Cattaneo, F
2013-05-23
Astrophysical magnetic fields often display remarkable organization, despite being generated by dynamo action driven by turbulent flows at high conductivity. An example is the eleven-year solar cycle, which shows spatial coherence over the entire solar surface. The difficulty in understanding the emergence of this large-scale organization is that whereas at low conductivity (measured by the magnetic Reynolds number, Rm) dynamo fields are well organized, at high Rm their structure is dominated by rapidly varying small-scale fluctuations. This arises because the smallest scales have the highest rate of strain, and can amplify magnetic field most efficiently. Therefore most of the effort to find flows whose large-scale dynamo properties persist at high Rm has been frustrated. Here we report high-resolution simulations of a dynamo that can generate organized fields at high Rm; indeed, the generation mechanism, which involves the interaction between helical flows and shear, only becomes effective at large Rm. The shear does not enhance generation at large scales, as is commonly thought; instead it reduces generation at small scales. The solution consists of propagating dynamo waves, whose existence was postulated more than 60 years ago and which have since been used to model the solar cycle.
Acquiring synaesthesia: insights from training studies
Rothen, Nicolas; Meier, Beat
2014-01-01
Synaesthesia denotes a condition of remarkable individual differences in experience, characterized by specific additional experiences in response to normal sensory input. Synaesthesia seems to (i) run in families, which suggests a genetic component, (ii) be associated with marked structural and functional neural differences, and (iii) usually be reported to exist from early childhood. Hence, synaesthesia is generally regarded as a congenital phenomenon. However, most synaesthetic experiences are triggered by cultural artifacts (e.g., letters, musical sounds). Evidence suggests that synaesthetic experiences are triggered by the conceptual representation of their inducer stimuli. Cases have been identified in which specific synaesthetic associations are related to prior experiences, and large-scale studies show that grapheme-color associations in synaesthesia are not completely random. Hence, a learning component is inherently involved in the development of specific synaesthetic associations. Researchers have hypothesized that associative learning is the critical mechanism. Recently, it has become of scientific and public interest whether synaesthetic experiences may be acquired by means of associative training procedures and whether the gains of these trainings are associated with cognitive benefits similar to those of genuine synaesthetic experiences. In order to shed light on these issues and inform synaesthesia researchers and the interested public alike, we provide a comprehensive literature review on developmental aspects of synaesthesia and specific training procedures in non-synaesthetes. In the light of a clear working definition of synaesthesia, we come to the conclusion that synaesthesia can potentially be learned with the appropriate training. PMID:24624072
Axions, neutrinos and strings: The formation of structure in an SO(10) universe
NASA Technical Reports Server (NTRS)
Stecker, F. W.
1984-01-01
In a class of grand unified theories containing SO(10), cosmologically significant axion and neutrino energy densities are obtainable naturally. To obtain large-scale structure, both components of dark matter are considered to exist with comparable energy densities, and both inflationary and non-inflationary scenarios are considered, as well as scenarios with and without vacuum strings. It is shown that inflation may be compatible with recent observations of the mass density within galaxy clusters and superclusters, especially if strings are present.
Axions, neutrinos and strings - The formation of structure in an SO(10) universe
NASA Technical Reports Server (NTRS)
Stecker, F. W.
1986-01-01
In a class of grand unified theories containing SO(10), cosmologically significant axion and neutrino energy densities are obtainable naturally. To obtain large-scale structure, both components of dark matter are considered to exist with comparable energy densities, and both inflationary and non-inflationary scenarios are considered, as well as scenarios with and without vacuum strings. It is shown that inflation may be compatible with recent observations of the mass density within galaxy clusters and superclusters, especially if strings are present.
Large Scale Winter Time Disturbances in Meteor Winds over Central and Eastern Europe
NASA Technical Reports Server (NTRS)
Greisiger, K. M.; Portnyagin, Y. I.; Lysenko, I. A.
1984-01-01
Daily zonal wind data from the four pre-MAP winters 1978/79 to 1981/82, obtained over Central and Eastern Europe by the radar meteor method, were studied. Available temperature and satellite radiance data for the middle and upper stratosphere were used for comparison, as well as wind data from Canada. The existence or nonexistence of coupling between the observed large-scale zonal wind disturbances in the upper mesopause region (90 to 100 km) and corresponding events in the stratosphere is discussed.
Design of a V/STOL propulsion system for a large-scale fighter model
NASA Technical Reports Server (NTRS)
Willis, W. S.
1981-01-01
Modifications were made to the existing large-scale STOL fighter model to simulate a V/STOL configuration. Modifications included the substitution of two-dimensional lift/cruise exhaust nozzles in the nacelles and the addition of a third J97 engine in the fuselage to supply a remote exhaust nozzle simulating a Remote Augmented Lift System. A preliminary design of the inlet and exhaust ducting for the third engine was developed, and a detailed design was completed of the hot exhaust ducting and remote nozzle.
Where Does the River Run? Lessons from a Semi-Arid River
NASA Astrophysics Data System (ADS)
Meixner, T.; Soto, C. D.; Richter, H.; Uhlman, K.
2009-12-01
Spatial data sets to assess the nature of stream-groundwater interactions and the resulting power-law/fractal structure of travel time distributions are rare. Spatial data sets can be collected using high technology or by use of a large number of field assistants. The labor-intensive way is expensive unless the public can be enlisted as citizen scientists to gather large, robust spatial data sets cheaply. Such an effort requires public interest and the ability of a few to organize the work at a basin if not regional scale. The San Pedro basin offers such an opportunity for citizen science because of the water resource restrictions of the basin's semi-arid climate. Since 1999 The Nature Conservancy, in cooperation with the Upper San Pedro Partnership, the public at large, and various university and federal science agency participants, has been mapping where the San Pedro River has water present versus where it is dry. This mapping has used an army of volunteers armed with GPS units, clipboards, and their eyes to determine whether a given 10 m reach of the river is wet or dry. These wet/dry mapping data now exist for 11 annual surveys. These data are unique and enable an investigation of the hydrologic connectedness of flowing waters within this system. Analysis of these data reveals several important findings. The total river area that is wet is strongly correlated with stream flow as observed at three USGS gauges. The correlation is strongest, however, for 90-day and 1-year average flows rather than more local-in-time observations such as the daily, 7-day, or monthly mean flow at the gauges. This result indicates that where the river is flowing depends on long-term hydrologic conditions. The length of river reach that is mapped as wet or dry is indicative of the travel distance, and thus time, that water travels in the surface (wet) and subsurface (dry) of the river system. The reach length that is mapped as wet follows a power-law function (slope ≈ -0.64), indicating that the fractal travel time distributions observed by others for catchments (Kirchner et al. 2001), for local to regional scale flow patterns (Cardenas 2008), and for stream solute transport (Haggerty et al. 2005) may have their origin in the fundamental nature of stream-groundwater interactions in flowing water systems.
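On one common reading of the reported slope (the abstract does not specify the exact functional form fitted), the frequency of wet reaches of length \(\ell\) would follow

\[
N(\ell) \propto \ell^{-0.64},
\]

which plots as a straight line of slope \(-0.64\) on log-log axes, the usual signature of power-law (fractal) structure.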
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peterson, J. R.; Peng, E.; Ahmad, Z.
2015-05-15
We present a comprehensive methodology for the simulation of astronomical images from optical survey telescopes. We use a photon Monte Carlo approach to construct images by sampling photons from models of astronomical source populations, and then simulating those photons through the system as they interact with the atmosphere, telescope, and camera. We demonstrate that all physical effects for optical light that determine the shapes, locations, and brightnesses of individual stars and galaxies can be accurately represented in this formalism. By using large-scale grid computing, modern processors, and an efficient implementation that can produce 400,000 photons s^-1, we demonstrate that even very large optical surveys can now be simulated. We demonstrate that we are able to (1) construct kilometer-scale phase screens necessary for wide-field telescopes, (2) reproduce atmospheric point-spread function moments using a fast novel hybrid geometric/Fourier technique for non-diffraction-limited telescopes, (3) accurately reproduce the expected spot diagrams for complex aspheric optical designs, and (4) recover the system effective area predicted from analytic photometry integrals. This new code, the Photon Simulator (PhoSim), is publicly available. We have implemented the Large Synoptic Survey Telescope design, and the code can be extended to other telescopes. We expect that, because of the comprehensive physics implemented in PhoSim, it will be used by the community to plan future observations, interpret detailed existing observations, and quantify systematics related to various astronomical measurements. Future development and validation by comparisons with real data will continue to improve the fidelity and usability of the code.
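A toy sketch of the photon Monte Carlo idea described above: draw photons and accept or reject each through a chain of throughput terms standing in for atmosphere, optics, and detector. This illustrates the sampling approach only; names and values are assumptions, not PhoSim's code.

```python
# Toy photon Monte Carlo: count photons surviving successive throughputs.
import random

def simulate_photons(n_photons, throughputs):
    """Return the number of photons surviving all throughput stages."""
    detected = 0
    for _ in range(n_photons):
        # A photon survives only if it passes every stage independently.
        if all(random.random() < t for t in throughputs):
            detected += 1
    return detected

# Illustrative throughputs: atmosphere 0.8, optics 0.9, detector QE 0.85.
print(simulate_photons(100_000, [0.8, 0.9, 0.85]))  # expect roughly 61,200
```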
Comparative Approaches to Genetic Discrimination: Chasing Shadows?
Joly, Yann; Feze, Ida Ngueng; Song, Lingqiao; Knoppers, Bartha M
2017-05-01
Genetic discrimination (GD) is one of the most pervasive issues associated with genetic research and its large-scale implementation. An increasing number of countries have adopted public policies to address this issue. Our research presents a worldwide comparative review and typology of these approaches. We conclude with suggestions for public policy development. Copyright © 2017 Elsevier Ltd. All rights reserved.
Living Room vs. Concert Hall: Patterns of Music Consumption in Flanders
ERIC Educational Resources Information Center
Roose, Henk; Stichele, Alexander Vander
2010-01-01
In this article we probe the interplay between public and private music consumption using a large-scale survey of the Flemish population in Belgium. We analyze whether public and private music consumption have different correlates and to what extent there is convergence between the genres that people listen to at home and at concerts. Results show…
Recommendations for open data science.
Gymrek, Melissa; Farjoun, Yossi
2016-01-01
Life science research increasingly relies on large-scale computational analyses. However, the code and data used for these analyses are often lacking in publications. To maximize scientific impact, reproducibility, and reuse, it is crucial that these resources are made publicly available and are fully transparent. We provide recommendations for improving the openness of data-driven studies in life sciences.
ERIC Educational Resources Information Center
Pizmony-Levy, Oren; Bjorklund, Peter, Jr.
2018-01-01
One of the overarching goals of international large-scale assessments (ILSA) is to inform public discourse about the quality of education in different countries. To fulfil this function, the Organisation for Economic Co-operation and Development (OECD), for example, raises awareness of the Program for International Student Assessment (PISA)…
The Common Core State Standards: School Reform at Three Suburban Middle Schools
ERIC Educational Resources Information Center
Morante-Brock, Sandra
2014-01-01
A growing body of research supports the idea that large scale school reform efforts often fail to create sustained change within the public school sector. Proponents of school reform argue that implementing school reform, effectively and with fidelity, can work to ensure the success of reform initiatives in public education. When implementing deep…
Assignment of boundary conditions in embedded ground water flow models
Leake, S.A.
1998-01-01
Many small-scale ground water models are too small to incorporate distant aquifer boundaries. If a larger-scale model exists for the area of interest, flow and head values can be specified for boundaries in the smaller-scale model using values from the larger-scale model. Flow components along rows and columns of a large-scale block-centered finite-difference model can be interpolated to compute horizontal flow across any segment of a perimeter of a small-scale model. Head at cell centers of the larger-scale model can be interpolated to compute head at points on a model perimeter. Simple linear interpolation is proposed for horizontal interpolation of horizontal-flow components. Bilinear interpolation is proposed for horizontal interpolation of head values. The methods of interpolation provided satisfactory boundary conditions in tests using models of hypothetical aquifers.
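A minimal sketch of the two interpolation rules named above, under assumed conventions (fractional offsets measured from the lower-left cell center; names are illustrative):

```python
# Linear and bilinear interpolation for embedded-model boundary values.

def linear(v0, v1, f):
    """Flow interpolated between two neighboring values; f in [0, 1]."""
    return v0 + f * (v1 - v0)

def bilinear(h00, h10, h01, h11, fx, fy):
    """Head at a point inside a rectangle of four cell-center heads."""
    return (h00 * (1 - fx) * (1 - fy) + h10 * fx * (1 - fy)
            + h01 * (1 - fx) * fy + h11 * fx * fy)

# Example: boundary point 30% of the way across in x, 60% in y.
print(linear(5.0, 7.0, 0.3))                       # 5.6
print(bilinear(10.0, 12.0, 11.0, 13.0, 0.3, 0.6))  # 11.2
```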
NASA Astrophysics Data System (ADS)
Garmay, Yu; Shvetsov, A; Karelov, D; Lebedev, D; Radulescu, A; Petukhov, M; Isaev-Ivanov, V
2012-02-01
Based on X-ray crystallographic data available in the Protein Data Bank, we have built molecular dynamics (MD) models of the homologous recombinases RecA from E. coli and D. radiodurans. The functional form of the RecA enzyme, known to be a long helical filament, was approximated by a trimer simulated in a periodic water box. The MD trajectories were analyzed in terms of large-scale conformational motions that could be detectable by neutron and X-ray scattering techniques. The analysis revealed that large-scale RecA monomer dynamics can be described in terms of relative motions of 7 subdomains. Motion of the C-terminal domain was the major contributor to the overall dynamics of the protein. Principal component analysis (PCA) of the MD trajectories in atom coordinate space showed that rotation of the C-terminal domain is correlated with conformational changes in the central domain and the N-terminal domain, which forms the monomer-monomer interface. Thus, even though the C-terminal domain is relatively far from the interface, its orientation is correlated with large-scale filament conformation. PCA of the trajectories in main-chain dihedral angle coordinate space indicates the co-existence of several different large-scale conformations of the modeled trimer. To clarify the relationship of independent domain orientation with large-scale filament conformation, we have analyzed independent domain motion and its implications for the filament geometry.
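Coordinate-space PCA of an MD trajectory, as used above to extract collective motions, reduces to an eigendecomposition of the coordinate covariance matrix. A minimal sketch follows, run on synthetic data rather than the authors' RecA trajectories; the rigid-body alignment and dihedral-space analysis steps are omitted.

```python
import numpy as np

def pca_modes(traj, n_modes=3):
    """PCA of an MD trajectory.

    traj : array of shape (n_frames, n_coords), e.g. flattened
           main-chain atom coordinates after rigid-body alignment.
    Returns eigenvalues (variance per mode) and mode vectors.
    """
    X = traj - traj.mean(axis=0)            # remove the mean structure
    cov = X.T @ X / (len(X) - 1)            # coordinate covariance matrix
    vals, vecs = np.linalg.eigh(cov)        # symmetric matrix -> eigh
    order = np.argsort(vals)[::-1]          # largest variance first
    return vals[order][:n_modes], vecs[:, order][:, :n_modes]

# Toy trajectory: 200 frames, 30 coordinates, one dominant collective motion
rng = np.random.default_rng(0)
mode = rng.normal(size=30)
traj = rng.normal(0, 0.1, (200, 30)) + np.outer(np.sin(np.linspace(0, 6, 200)), mode)
vals, vecs = pca_modes(traj)
print(vals)                                 # variance captured by the top modes
```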
Water Resources Implications of Cellulosic Biofuel Production at a Regional Scale
NASA Astrophysics Data System (ADS)
Christopher, S. F.; Schoenholtz, S. H.; Nettles, J. E.
2011-12-01
Recent increases in oil prices, a strong national interest in greater energy independence, and concern over the role of fossil fuels in global climate change have led to a dramatic expansion in the use of alternative renewable energy sources in the U.S. The U.S. government has mandated production of 36 billion gallons of renewable fuels by 2022, of which 16 billion gallons are required to be cellulosic biofuels. Production of cellulosic biomass offers a promising alternative to corn-based systems because large-scale production of corn-based ethanol often requires irrigation and is associated with increased erosion, excess sediment export, and enhanced leaching of nitrogen and phosphorus. Although cultivation of switchgrass using standard agricultural practices is one option being considered for production of cellulosic biomass, intercropping cellulosic biofuel crops within managed forests could provide feedstock without primary land use change or the water quality impacts associated with annual crops. Catchlight Energy LLC is examining the feasibility and sustainability of intercropping switchgrass in loblolly pine plantations in the southeastern U.S. Ongoing research is determining efficient operational techniques and the information needed to evaluate effects of these practices on water resources in small watershed-scale (~25 ha) studies. Three sets of four to five sub-watersheds are fully instrumented and currently collecting calibration data in North Carolina, Alabama, and Mississippi. These watershed studies will provide detailed information to understand processes and guide management decisions. However, environmental implications of cellulosic systems need to be examined at a regional scale. We used the Soil and Water Assessment Tool (SWAT), a physically-based hydrologic model, to examine water quantity effects of various land use change scenarios ranging from switchgrass intercropping on a small percentage of managed pine forest land to conversion of all managed forested land to switchgrass. The regional-scale SWAT model was successfully run and calibrated on the ~5 million ha Tombigbee Watershed located in Mississippi and Alabama. Publicly available datasets were used as input to the model and for calibration. To improve calibration statistics, five tree age classes (0-4 yr, 4-10 yr, 10-17 yr, 17-24 yr, 24-30 yr) were added to the model to more appropriately represent existing forested systems in the region, which are not included in the standard SWAT set-up. Our results will be essential to public policy makers as they influence and plan for large-scale production of cellulosic biofuels while sustaining water quality and quantity.
Evaluation Findings from High School Reform Efforts in Baltimore
ERIC Educational Resources Information Center
Smerdon, Becky; Cohen, Jennifer
2009-01-01
The Baltimore City Public School System (BCPSS) is one of the first urban districts in the country to undertake large-scale high school reform, phasing in small learning communities by opening new high schools and transforming large, comprehensive high schools into small high schools. With support from the Bill & Melinda Gates Foundation, a…
ResStock Analysis Tool | Buildings | NREL
Energy and cost savings for U.S. homes. ResStock supports large-scale residential energy analysis by combining large public and private data sources; this approach has uncovered $49 billion in potential annual utility bill savings through cost-effective energy efficiency.
Flemming, Shauna St Clair; Redmond, Nakeva; Williamson, Dana Hz; Thompson, Nancy J; Perryman, Jennie P; Patzer, Rachel E; Arriola, Kimberly Jacob
2018-04-01
Increasing public commitment to organ donation is critical to improving donor kidney availability for end-stage renal disease patients desiring transplant. This study surveyed African Americans (N = 1339), measuring perceived pros relative to cons of organ donation, to evaluate an existing Transtheoretical Model decisional balance scale and associations between decisional balance and expressing donation intentions. Findings supported the existing scale structure. More positive decisional balance ratios were associated with 1.76 times the odds of expressing intentions (95% confidence interval = 1.52-2.04). Pros were more strongly linked to donation intentions than cons. Greater understanding of organ donation decision-making is valuable for informing interventions that encourage donation.
Measuring Bystander Attitudes and Behavior to Prevent Sexual Violence
ERIC Educational Resources Information Center
McMahon, Sarah; Allen, Christopher T.; Postmus, Judy L.; McMahon, Sheila M.; Peterson, N. Andrew; Lowe Hoffman, Melanie
2014-01-01
Objective: The purpose of this study is to further investigate the factor structure and strength of the Bystander Attitude Scale-Revised and Bystander Behavior Scale-Revised (BAS-R and BBS-R). Participants: First-year students (N = 4,054) at a large public university in the Northeast completed a survey in 2010 as part of a larger longitudinal…
ERIC Educational Resources Information Center
Gallagher, Mary Jean; Malloy, John; Ryerson, Rachel
2016-01-01
This paper offers an insiders' perspective on the large-scale, system-wide educational change undertaken in Ontario, Canada from 2003 to the present. The authors, Ministry and school system leaders intimately involved in this change process, explore how Ontario has come to be internationally recognized as an equitable, high-achieving, and…
High Familial Correlation in Methylphenidate Response and Side Effect Profile.
Gazer-Snitovsky, Michal; Brand-Gothelf, Ayelet; Dubnov-Raz, Gal; Weizman, Abraham; Gothelf, Doron
2015-04-21
To examine whether a familial tendency exists in the clinical response to methylphenidate, nineteen stimulant-naive sibling or parent-child pairs with ADHD were prescribed immediate-release methylphenidate and comprehensively evaluated at baseline, Week 2, and Week 4, using the ADHD Rating Scale IV, the Clinical Global Impression Scale, and the Barkley Side Effects Rating Scale. We found significant intraclass correlations in family members' response to immediate-release methylphenidate and in side effect profile, including emotional symptoms and loss of appetite and weight. Family history of response to methylphenidate should be taken into account when treating ADHD. © 2015 SAGE Publications.
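For readers unfamiliar with the statistic, an intraclass correlation for paired family members can be computed from a one-way ANOVA decomposition. The sketch below implements ICC(1) on toy data; it is illustrative only and not necessarily the ICC variant used in the study.

```python
import numpy as np

def icc1(groups):
    """One-way random-effects ICC(1) for equal-sized groups.

    groups : array (n_groups, k) of scores, e.g. a response measure
             per family-member pair (k = 2).
    """
    groups = np.asarray(groups, float)
    n, k = groups.shape
    grand = groups.mean()
    msb = k * ((groups.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # between-group MS
    msw = ((groups - groups.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Toy data: 19 pairs whose responses share a family-level component
rng = np.random.default_rng(1)
family = rng.normal(0, 2, (19, 1))
pairs = family + rng.normal(0, 1, (19, 2))
print(round(icc1(pairs), 2))
```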
Angels and Demons: The Science Behind the Scenes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Graf, Norman
Does antimatter really exist? How and why do scientists produce and use it? Does CERN exist, and is there an underground complex deep beneath the Swiss/French border? Is truth stranger than fiction? Find out at the coming public lecture. On Tuesday, May 12, SLAC physicist Norman Graf will discuss the real science behind Angels & Demons, Dan Brown's blockbuster novel and the basis of an upcoming Tom Hanks movie. Graf's talk is one in a series of public lectures across the U.S., Canada and Puerto Rico to share the science of antimatter and the Large Hadron Collider, and the excitement of particle physics research.
Gap Analysis and Conservation Network for Freshwater Wetlands in Central Yangtze Ecoregion
Xiaowen, Li; Haijin, Zhuge; Li, Mengdi
2013-01-01
The Central Yangtze Ecoregion contains a large area of internationally important freshwater wetlands and supports a huge number of endangered waterbirds; however, these unique wetlands and the biodiversity they support are under the constant threats of human development pressures, and the prevailing conservation strategies generated based on the local scale cannot adequately be used as guidelines for ecoregion-based conservation initiatives for Central Yangtze at the broad scale. This paper aims at establishing and optimizing an ecological network for freshwater wetland conservation in the Central Yangtze Ecoregion based on large-scale gap analysis. A group of focal species and GIS-based extrapolation technique were employed to identify the potential habitats and conservation gaps, and the optimized conservation network was then established by combining existing protective system and identified conservation gaps. Our results show that only 23.49% of the potential habitats of the focal species have been included in the existing nature reserves in the Central Yangtze Ecoregion. To effectively conserve over 80% of the potential habitats for the focal species by optimizing the existing conservation network for the freshwater wetlands in Central Yangtze Ecoregion, it is necessary to establish new wetland nature reserves in 22 county units across Hubei, Anhui, and Jiangxi provinces. PMID:24062632
Herbst, Kobus; Law, Matthew; Geldsetzer, Pascal; Tanser, Frank; Harling, Guy; Bärnighausen, Till
2015-11-01
Health and demographic surveillance systems (HDSS), in conjunction with HIV treatment cohorts, have made important contributions to our understanding of the impact of HIV treatment and treatment-related interventions in sub-Saharan Africa. The purpose of this review is to describe and discuss innovations in data collection and data linkage that will create new opportunities to establish the impacts of HIV treatment, as well as policies affecting the treatment cascade, on population health and economic and social outcomes. Novel approaches to routine collection of biomarkers, behavioural data, spatial data, social network information, migration events and mobile phone records can significantly strengthen the potential of HDSS to generate exposure and outcome data for causal analysis of HIV treatment impact and policies affecting the HIV treatment cascade. Additionally, by linking HDSS data to health service administration, education and welfare service records, researchers can substantially broaden opportunities to establish how HIV treatment affects health and economic outcomes when delivered through public sector health systems and at scale. As the HIV treatment scale-up in sub-Saharan Africa enters its second decade, it is becoming increasingly important to understand the long-term causal impacts of large-scale HIV treatment and related policies on broader population health outcomes, such as noncommunicable diseases, as well as on economic and social outcomes, such as family welfare and children's educational attainment. By collecting novel data and linking existing data to public sector records, HDSS can create near-unique opportunities to contribute to this research agenda.
White, Mark; Wells, John S G; Butterworth, Tony
2014-09-01
To examine the literature related to a large-scale quality improvement initiative, the 'Productive Ward: Releasing Time to Care', providing a bibliometric profile that tracks the level of interest and the scale of roll-out and adoption, and discussing the implications for sustainability. Productive Ward: Releasing Time to Care (aka Productive Ward) is probably one of the most ambitious quality improvement efforts undertaken by the UK NHS. Politically and financially supported, its main driver was the NHS Institute for Innovation and Improvement. The NHS Institute closed in early 2013, leaving a void of resources, knowledge and expertise. UK roll-out of the initiative is well established and has arguably peaked. International interest in the initiative, however, continues to develop. A comprehensive literature review was undertaken to identify the literature related to the Productive Ward and its implementation (January 2006-June 2013). A bibliometric analysis examined the trends and identified and measured interest, spread and uptake. Overall distribution patterns identify a declining trend of interest, with reduced numbers of grey literature and evaluation publications. However, detailed examination of the data shows no reduction in peer-reviewed outputs. There is some evidence that international uptake of the initiative continues to generate publications and create interest. Sustaining this initiative in the UK will require re-energising, a new focus and financing. The transition period created by the closure of its creator may well contribute to further reduced levels of interest and publication outputs in the UK. However, international implementation, evaluation and associated publications could serve to attract professional/academic interest in this well-established, positively reported, quality improvement initiative. This paper provides nurses and ward teams involved in quality improvement programmes with a detailed, current-state examination and analysis of the Productive Ward literature, highlighting the bibliometric patterns of this large-scale, international, quality improvement programme. It serves to disseminate updated publication information to those in clinical practice who are involved in the Productive Ward or a similar quality improvement initiative. © 2014 John Wiley & Sons Ltd.
Hieu, Nguyen Trong; Brochier, Timothée; Tri, Nguyen-Huu; Auger, Pierre; Brehmer, Patrice
2014-09-01
We consider a fishery model with two sites: (1) a marine protected area (MPA) where fishing is prohibited and (2) an area where the fish population is harvested. We assume that fish can migrate from the MPA to the fishing area at a very fast time scale and that fish spatial organisation can change from small to large clusters of schools at a fast time scale. The growth of the fish population and the catch are assumed to occur at a slow time scale. The complete model is a system of five ordinary differential equations with three time scales. We take advantage of the time scales, using aggregation of variables methods, to derive a reduced model governing the total fish density and fishing effort at the slow time scale. We analyze this aggregated model and show that under some conditions, there exists an equilibrium corresponding to a sustainable fishery. Our results suggest that in small pelagic fisheries the yield is maximum for a fish population distributed among both small and large clusters of schools.
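The flavor of the aggregated slow model can be conveyed with a standard stock-effort system. The sketch below is illustrative only: a two-equation logistic-growth/effort model with assumed parameters, not the paper's five-equation system or its exact reduction.

```python
from scipy.integrate import solve_ivp

# Illustrative aggregated slow dynamics: total fish density n and fishing
# effort E, after the fast site/cluster variables have been aggregated
# into effective parameters r, K, q (all values assumed).
r, K, q, p, c = 1.0, 10.0, 0.3, 1.0, 0.9

def slow_dynamics(t, y):
    n, E = y
    dn = r * n * (1 - n / K) - q * n * E      # logistic growth minus catch
    dE = E * (p * q * n - c)                  # effort follows profitability
    return [dn, dE]

sol = solve_ivp(slow_dynamics, (0, 200), [5.0, 0.5])
n_eq, E_eq = sol.y[0, -1], sol.y[1, -1]
# A sustainable fishery corresponds to a positive interior equilibrium.
print(f"equilibrium: n = {n_eq:.2f}, E = {E_eq:.2f}")
```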
Impact resistant boron/aluminum composites for large fan blades
NASA Technical Reports Server (NTRS)
Oller, T. L.; Salemme, C. T.; Bowden, J. H.; Doble, G. S.; Melnyk, P.
1977-01-01
Blade-like specimens were subjected to static ballistic impact testing to determine their relative FOD (foreign object damage) impact resistance levels. It was determined that a plus or minus 15 deg layup exhibited good impact resistance. The design of a large solid boron/aluminum fan blade was conducted based on the FOD test results. The CF6 fan blade was used as a baseline for these design studies. The solid boron/aluminum fan blade design was used to fabricate two blades. This effort enabled assessment of the scale-up of existing blade manufacturing details for the fabrication of a large B/Al fan blade. Existing CF6 fan blade tooling was modified for use in fabricating these blades.
Austin, Melissa A.; Hair, Marilyn S.; Fullerton, Stephanie M.
2012-01-01
Scientific research has shifted from studies conducted by single investigators to the creation of large consortia. Genetic epidemiologists, for example, now collaborate extensively for genome-wide association studies (GWAS). The effect has been a stream of confirmed disease-gene associations. However, effects on human subjects oversight, data-sharing, publication and authorship practices, research organization and productivity, and intellectual property remain to be examined. The aim of this analysis was to identify all research consortia that had published the results of a GWAS analysis since 2005, characterize them, determine which have publicly accessible guidelines for research practices, and summarize the policies in these guidelines. A review of the National Human Genome Research Institute’s Catalog of Published Genome-Wide Association Studies identified 55 GWAS consortia as of April 1, 2011. These consortia were comprised of individual investigators, research centers, studies, or other consortia and studied 48 different diseases or traits. Only 14 (25%) were found to have publicly accessible research guidelines on consortia websites. The available guidelines provide information on organization, governance, and research protocols; half address institutional review board approval. Details of publication, authorship, data-sharing, and intellectual property vary considerably. Wider access to consortia guidelines is needed to establish appropriate research standards with broad applicability to emerging forms of large-scale collaboration. PMID:22491085
Over the past three decades, a number of researchers in the fields of environmental justice (EJ) and environmental public health have highlighted the existence of regional and local scale differences in exposure to air pollution, as well as calculated health risk and impacts of a...
Precision enhancement of pavement roughness localization with connected vehicles
NASA Astrophysics Data System (ADS)
Bridgelall, R.; Huang, Y.; Zhang, Z.; Deng, F.
2016-02-01
Transportation agencies rely on the accurate localization and reporting of roadway anomalies that could pose serious hazards to the traveling public. However, the cost and technical limitations of present methods prevent their scaling to all roadways. Connected vehicles with on-board accelerometers and conventional geospatial position receivers offer an attractive alternative because of their potential to monitor all roadways in real-time. The conventional global positioning system is ubiquitous and essentially free to use but it produces impractically large position errors. This study evaluated the improvement in precision achievable by augmenting the conventional geo-fence system with a standard speed bump or an existing anomaly at a pre-determined position to establish a reference inertial marker. The speed sensor subsequently generates position tags for the remaining inertial samples by computing their path distances relative to the reference position. The error model and a case study using smartphones to emulate connected vehicles revealed that the precision in localization improves from tens of metres to sub-centimetre levels, and the accuracy of measuring localized roughness more than doubles. The research results demonstrate that transportation agencies will benefit from using the connected vehicle method to achieve precision and accuracy levels that are comparable to existing laser-based inertial profilers.
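The re-anchoring idea is simple to express in code: integrate wheel speed to path distance, then shift the distances so the detected reference marker (e.g. a speed bump at a surveyed position) coincides with its known location. The variable names and data below are hypothetical, not from the study.

```python
import numpy as np

def tag_positions(t, v, i_ref, x_ref):
    """Assign path-distance positions to inertial samples.

    t     : sample timestamps (s)
    v     : vehicle speed samples (m/s)
    i_ref : index of the sample where the reference inertial marker
            was detected
    x_ref : known path distance of that marker (m)
    """
    # cumulative path distance by trapezoidal integration of speed
    x = np.concatenate(([0.0], np.cumsum(0.5 * (v[1:] + v[:-1]) * np.diff(t))))
    return x - x[i_ref] + x_ref               # re-anchor at the marker

t = np.arange(0, 10, 0.01)
v = 15.0 + 0.5 * np.sin(t)                    # ~54 km/h with small variation
print(tag_positions(t, v, i_ref=300, x_ref=1000.0)[0])  # first sample's position
```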
Flexible sampling large-scale social networks by self-adjustable random walk
NASA Astrophysics Data System (ADS)
Xu, Xiao-Ke; Zhu, Jonathan J. H.
2016-12-01
Online social networks (OSNs) have become an increasingly attractive gold mine for academic and commercial researchers. However, research on OSNs faces a number of difficult challenges. One bottleneck lies in the massive quantity and frequent unavailability of OSN population data. Sampling thus becomes perhaps the only feasible solution. How to draw samples that can represent the underlying OSNs has remained a formidable task for a number of conceptual and methodological reasons. In particular, most empirically-driven studies on network sampling are confined to simulated data or sub-graph data, which are fundamentally different from real and complete-graph OSNs. In the current study, we propose a flexible sampling method, called Self-Adjustable Random Walk (SARW), and test it against the population data of a real large-scale OSN. We evaluate the strengths of the sampling method in comparison with four prevailing methods: uniform, breadth-first search (BFS), random walk (RW), and revised RW (i.e., MHRW) sampling. The method mixes both induced-edge and external-edge information of sampled nodes in the same sampling process. Our results show that the SARW sampling method is able to generate unbiased samples of OSNs with maximal precision and minimal cost. The study is helpful for the practice of OSN research by providing a much-needed sampling tool, for the methodological development of large-scale network sampling through comparative evaluation of existing methods, and for the theoretical understanding of human networks by highlighting discrepancies and contradictions between existing knowledge/assumptions and large-scale real OSN data.
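For context, the plain random-walk baseline that SARW refines can be written in a few lines; the SARW adjustment itself (self-tuning transition probabilities) is not reproduced here. The sketch also shows the well-known degree bias of plain RW that motivates corrected samplers.

```python
import random

def random_walk_sample(adj, start, n_samples, seed=0):
    """Classic random-walk sampling of an undirected graph.

    adj : dict mapping node -> list of neighbours.
    """
    rng = random.Random(seed)
    node, sample = start, []
    while len(sample) < n_samples:
        sample.append(node)
        node = rng.choice(adj[node])          # uniform step to a neighbour
    return sample

# Toy graph: a star plus a triangle; plain RW oversamples high-degree nodes
adj = {0: [1, 2, 3, 4], 1: [0], 2: [0], 3: [0, 4], 4: [0, 3]}
s = random_walk_sample(adj, start=0, n_samples=1000)
print({n: s.count(n) / len(s) for n in adj})  # visit frequencies track degree
```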
Helping organizations help others: organization development as a facilitator of social change.
Boyd, Neil M
2011-01-01
This article explores organization development (OD) interventions and their likelihood of increasing social change outcomes in public agencies. The central argument of this work is that public and nonprofit organizations can deliver better social outcomes by systematically engaging in OD interventions. An in-depth survey was conducted in 3 agencies of the Commonwealth of Pennsylvania at the end of the gubernatorial administration of Tom Ridge (1995-2002). During his administration, Governor Ridge led the agencies of Pennsylvania government through a large-scale change effort to improve the efficiency and effectiveness of service delivery to the citizens of the Commonwealth of Pennsylvania. The change effort was a remarkable event for the Commonwealth because no other governor in the history of the state had attempted to conceptualize and deliver a comprehensive large-scale change management initiative. The successes and setbacks served as a fertile context to shed light on the following research question: Do OD interventions increase the likelihood that public organizations will deliver better social outcomes? This question is important in that public organizations may need to engage in organization development activities to improve their internal operations, which in turn may help them provide exemplary social outcomes to those whom they serve. In short, organization development interventions might allow public organizations to help themselves to help others.
NASA Astrophysics Data System (ADS)
Keller, M. M.
2015-12-01
The Large Scale Biosphere-Atmosphere Experiment in Amazonia (LBA) is an international continental-scale effort led by Brazil to understand how land use change and climate change affect the role of Amazonia in the Earth system. During the first decade of studies (1998-2007), LBA researchers generated new understanding of Amazonia and published over 1000 papers. However, most LBA participants agree that training and education of a large cohort of scientists, especially students from Brazil, was the greatest contribution of LBA. I analyzed bibliographic data from the NASA-supported component project known as LBA-ECO. This component covered a large cross-section of the LBA subject areas, highlighting land use and land cover change, carbon cycling, nutrient cycling, and other aspects of terrestrial and aquatic ecology. I reviewed the complete bibliography of peer-reviewed papers reported by LBA-ECO researchers (http://www.lbaeco.org/cgi-bin/web/investigations/lbaeco_refs.pl). The researchers reported 691 contributions from 1996 through 2013, of which 24 were theses that were removed from further analysis. For the remaining 667 papers and book chapters, I tallied the first authors, separating categories for Brazilians, all students, and Brazilian students. Numerically, LBA-ECO production of papers peaked in 2004. Publication by Brazilians, students, and Brazilian students generally followed the same pattern as publication overall. However, student and Brazilian student contributions as first authors made up clearly increasing proportions of the papers from project initiation through peak publication. Brazilian students averaged more than 20% of all first authorships from 2003 to 2010, and more than half of all student publications had Brazilians as first authors. Foreign researchers, some initially reluctant to invest in Brazilian students, almost universally adopted the belief that the greatest legacy of LBA would be its contribution to building a cadre of environmental researchers and professionals for the Amazon region. This belief was transformed into a commitment through pressure from NASA management and through the leadership of the LBA-ECO research team, leading to LBA's greatest legacy.
MrEnt: an editor for publication-quality phylogenetic tree illustrations.
Zuccon, Alessandro; Zuccon, Dario
2014-09-01
We developed MrEnt, a Windows-based, user-friendly software package that allows the production of complex, high-resolution, publication-quality phylogenetic trees in a few steps, directly from the analysis output. The program recognizes the standard Nexus tree format and the annotated tree files produced by BEAST and MrBayes. MrEnt combines in a single application a large suite of tree manipulation functions (e.g. handling of multiple trees, tree rotation, character mapping, node collapsing, compression of large clades, handling of time scale and error bars for chronograms) with drawing tools typical of standard graphic editors, including handling of graphic elements and images. The tree illustration can be printed or exported in several standard formats suitable for journal publication, PowerPoint presentation or Web publication. © 2014 John Wiley & Sons Ltd.
Three controversies over item disclosure in medical licensure examinations
Park, Yoon Soo; Yang, Eunbae B.
2015-01-01
In response to views on the public's right to know, there is growing attention to item disclosure – the release of items, answer keys, and performance data to the public – in medical licensure examinations, and to its potential impact on a test's ability to measure competence and select qualified candidates. Recent debates on this issue have sparked legislative action internationally, including in South Korea, with prior discussions among North American countries dating back over three decades. The purpose of this study is to identify and analyze three issues associated with item disclosure in medical licensure examinations – 1) fairness and validity, 2) impact on passing levels, and 3) utility of item disclosure – by synthesizing the existing literature in relation to standards in testing. Historically, the controversy over item disclosure has centered on fairness and validity. Proponents of item disclosure stress test takers' right to know, while opponents argue from a validity perspective. Item disclosure may bias item characteristics, such as difficulty and discrimination, and has consequences for setting passing levels. To date, there has been limited research on the utility of item disclosure for large-scale testing. These issues require ongoing and careful consideration. PMID:26374693
Issues of public policy in the USA raised by amniocentesis.
Etzioni, A.
1976-01-01
Amniocentesis, a procedure which can detect during pregnancy whether the fetus is affected by Down syndrome or other serious chromosomal defects, would, if given to all pregnant women aged 40 and over, save both human suffering and economic loss to the community. The procedure is not at present widely used, for various reasons, not all of them medical; if the test result is positive, abortion is the remedy. The author describes an important clinical trial being conducted in the USA at the present time but suggests that an educational programme should be undertaken to inform the public of the existence of this procedure and its applications even before the results of the American large-scale trial can be known and evaluated. Amniocentesis and its use, Professor Etzioni concludes, is not the only genetic tool which should be reviewed in a manner that would give an overall picture. He compares those who are concerned with these matters to the citizens of Britain when they saw the first steam engine: they did not perceive the social changes--the industrial revolution--that would follow. In our time a 'genetic revolution' may not be long delayed. PMID:822165
Falco: a quick and flexible single-cell RNA-seq processing framework on the cloud.
Yang, Andrian; Troup, Michael; Lin, Peijie; Ho, Joshua W K
2017-03-01
Single-cell RNA-seq (scRNA-seq) is increasingly used in a range of biomedical studies. Nonetheless, current RNA-seq analysis tools are not specifically designed to efficiently process scRNA-seq data due to their limited scalability. Here we introduce Falco, a cloud-based framework to enable parallelization of existing RNA-seq processing pipelines using the big data technologies Apache Hadoop and Apache Spark for massively parallel analysis of large-scale transcriptomic data. Using two public scRNA-seq datasets and two popular RNA-seq alignment/feature quantification pipelines, we show that the same processing pipeline runs 2.6-145.4 times faster using Falco than on a highly optimized standalone computer. Falco also allows users to utilize low-cost spot instances of Amazon Web Services, providing a ∼65% reduction in the cost of analysis. Falco is available under a GNU General Public License at https://github.com/VCCRI/Falco/. j.ho@victorchang.edu.au. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
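Falco's actual code is available at the URL above; as a generic illustration of the underlying idea, the sketch below distributes a per-cell processing step over a cluster with Apache Spark. The processing function is a self-contained stub under assumed names, not Falco's pipeline stage.

```python
from pyspark import SparkContext

def align_and_count(cell_fastq):
    """Hypothetical per-cell step: align reads, return (cell_id, counts).
    In a real pipeline this stage would wrap an existing aligner and
    quantifier; here it is a stub so the sketch stays self-contained."""
    cell_id, reads = cell_fastq
    return cell_id, {"n_reads": len(reads)}

sc = SparkContext(appName="scRNAseqSketch")
cells = [(f"cell_{i}", ["ACGT"] * (i + 1)) for i in range(100)]  # toy input
results = sc.parallelize(cells).map(align_and_count).collect()   # runs in parallel
print(results[:3])
sc.stop()
```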
NASA Astrophysics Data System (ADS)
Du, Shihong; Zhang, Fangli; Zhang, Xiuyuan
2015-07-01
While most existing studies have focused on extracting geometric information about buildings, only a few have concentrated on semantic information. The lack of semantic information cannot satisfy many demands for resolving environmental and social issues. This study presents an approach to semantically classify buildings into much finer categories than those of existing studies by learning a random forest (RF) classifier from a large number of imbalanced samples with high-dimensional features. First, a two-level segmentation mechanism combining GIS and VHR imagery produces single image objects at a large scale and intra-object components at a small scale. Second, a semi-supervised method chooses a large number of unbiased samples by considering the spatial proximity and intra-cluster similarity of buildings. Third, two important improvements are made to the RF classifier: a voting-distribution ranked rule for reducing the influence of imbalanced samples on classification accuracy, and a feature importance measurement for evaluating each feature's contribution to the recognition of each category. Fourth, semantic classification of urban buildings is conducted in practice for the city of Beijing, and the results demonstrate that the proposed approach is effective and accurate. The seven categories used in the study are finer than those in existing work and more helpful for studying many environmental and social problems.
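As a rough off-the-shelf analogue of this setup (the paper's voting-distribution ranked rule and custom importance measure are not reproduced here), a class-weighted random forest with standard feature importances might look like this:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.datasets import make_classification

# Imbalanced, high-dimensional toy problem standing in for building samples
X, y = make_classification(n_samples=2000, n_features=50, n_informative=10,
                           n_classes=3, weights=[0.8, 0.15, 0.05],
                           n_clusters_per_class=1, random_state=0)

# class_weight="balanced" is the stock counterpart of a custom voting rule
# for imbalanced samples; the ranked voting rule itself would require
# modifying how per-tree votes are aggregated.
rf = RandomForestClassifier(n_estimators=200, class_weight="balanced",
                            random_state=0).fit(X, y)
top = np.argsort(rf.feature_importances_)[::-1][:5]
print("top features:", top)                   # per-feature contribution
```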
ERIC Educational Resources Information Center
Hemmings, Philip
2006-01-01
This paper looks at ways of ensuring Czech regions and municipalities are fully motivated to make efficiency improvements in public service provision and so help achieve countrywide fiscal sustainability. The very large number of small municipalities in the Czech Republic means that scale economies are difficult to exploit and the policy options…
The Progressive Era: The Limits of Reform. Public Issues Series.
ERIC Educational Resources Information Center
Giese, James R.
This booklet is part of a series designed to help students take and defend a position on public issues. In this unit, the progressive era, a major reform period in U.S. history that stretched from about 1900 to 1915 is discussed. The book suggests that large scale reform is difficult to achieve because reformers often assume that their interests,…
ERIC Educational Resources Information Center
McEntee, Marie; Mortimer, Claire
2013-01-01
This article examines two large-scale public communication campaigns to explore the appropriateness and effectiveness of using one-way communication in contentious environmental issues. The findings show while one-way communication can be successfully employed in contentious issues, it is not appropriate for all contexts and may contribute to…
ProMotE: an efficient algorithm for counting independent motifs in uncertain network topologies.
Ren, Yuanfang; Sarkar, Aisharjya; Kahveci, Tamer
2018-06-26
Identifying motifs in biological networks is essential in uncovering key functions served by these networks. Finding non-overlapping motif instances is however a computationally challenging task. The fact that biological interactions are uncertain events further complicates the problem, as it makes the existence of an embedding of a given motif an uncertain event as well. In this paper, we develop a novel method, ProMotE (Probabilistic Motif Embedding), to count non-overlapping embeddings of a given motif in probabilistic networks. We utilize a polynomial model to capture the uncertainty. We develop three strategies to scale our algorithm to large networks. Our experiments demonstrate that our method scales to large networks in practical time with high accuracy where existing methods fail. Moreover, our experiments on cancer and degenerative disease networks show that our method helps in uncovering key functional characteristics of biological networks.
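To see why edge uncertainty complicates counting, consider the simplest case: with independent edge probabilities, the expected number of (possibly overlapping) triangle embeddings is a sum of products of edge probabilities. The sketch below computes this naive expectation; ProMotE's contribution is the much harder non-overlapping count, which this sketch does not attempt.

```python
from itertools import combinations

def expected_triangles(p_edge):
    """Expected number of triangle embeddings in a probabilistic graph.

    p_edge : dict mapping frozenset({u, v}) -> edge existence probability.
    With independent edges, each triangle exists with probability equal
    to the product of its three edge probabilities; expectation is linear,
    so overlapping embeddings simply add up.
    """
    nodes = sorted({v for e in p_edge for v in e})
    total = 0.0
    for a, b, c in combinations(nodes, 3):
        total += (p_edge.get(frozenset({a, b}), 0.0)
                  * p_edge.get(frozenset({b, c}), 0.0)
                  * p_edge.get(frozenset({a, c}), 0.0))
    return total

p = {frozenset(e): 0.9 for e in [(1, 2), (2, 3), (1, 3), (3, 4)]}
print(expected_triangles(p))                  # 0.9**3 = 0.729
```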
Batch effects in single-cell RNA-sequencing data are corrected by matching mutual nearest neighbors.
Haghverdi, Laleh; Lun, Aaron T L; Morgan, Michael D; Marioni, John C
2018-06-01
Large-scale single-cell RNA sequencing (scRNA-seq) data sets that are produced in different laboratories and at different times contain batch effects that may compromise the integration and interpretation of the data. Existing scRNA-seq analysis methods incorrectly assume that the composition of cell populations is either known or identical across batches. We present a strategy for batch correction based on the detection of mutual nearest neighbors (MNNs) in the high-dimensional expression space. Our approach does not rely on predefined or equal population compositions across batches; instead, it requires only that a subset of the population be shared between batches. We demonstrate the superiority of our approach compared with existing methods by using both simulated and real scRNA-seq data sets. Using multiple droplet-based scRNA-seq data sets, we demonstrate that our MNN batch-effect-correction method can be scaled to large numbers of cells.
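The pair-detection step at the heart of the method can be sketched directly: find each batch's k nearest neighbours in the other batch and keep the mutual pairs. This shows only the anchor-finding stage on toy data; the published method additionally computes and smooths per-cell correction vectors, which are omitted here.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def mutual_nn_pairs(A, B, k=20):
    """Find mutual nearest-neighbour pairs between two batches.

    A, B : (cells x genes) expression matrices.
    A pair (i, j) is kept when j is among the k NNs of A[i] in B and
    i is among the k NNs of B[j] in A: the anchor cells assumed to
    belong to the population shared between batches.
    """
    k = min(k, len(A), len(B))
    nn_ab = NearestNeighbors(n_neighbors=k).fit(B).kneighbors(A, return_distance=False)
    nn_ba = NearestNeighbors(n_neighbors=k).fit(A).kneighbors(B, return_distance=False)
    in_ba = [set(row) for row in nn_ba]
    return [(i, j) for i, row in enumerate(nn_ab) for j in row if i in in_ba[j]]

rng = np.random.default_rng(0)
shared = rng.normal(size=(100, 30))
A = shared + rng.normal(0, 0.1, (100, 30))
B = shared + 2.0 + rng.normal(0, 0.1, (100, 30))   # constant batch shift
print(len(mutual_nn_pairs(A, B, k=5)))
```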
Large Scale Landslide Database System Established for the Reservoirs in Southern Taiwan
NASA Astrophysics Data System (ADS)
Tsai, Tsai-Tsung; Tsai, Kuang-Jung; Shieh, Chjeng-Lun
2017-04-01
The severe impact of Typhoon Morakot on southern Taiwan awakened public awareness of large-scale landslide disasters. Such disasters produce large quantities of sediment that degrade the operating functions of reservoirs. To reduce the risk of these disasters within the study area, establishing a database for hazard mitigation and disaster prevention is necessary. Real-time data and extensive archives of engineering data, environmental information, photos, and video will not only help people make appropriate decisions but also offer significant potential for value-added processing. This study defined basic data formats and standards for the various types of data collected for these reservoirs and then provided a management platform based on those formats and standards. To ensure practicality and convenience, the large-scale landslide disaster database system both provides and receives information, so that users can work with it on different types of devices. Because information technology progresses extremely quickly, even the most modern system can become outdated at any time; to provide long-term service, the system therefore reserves the possibility of user-defined data formats/standards and a user-defined system structure. The system established by this study was built on the HTML5 standard and uses responsive web design, allowing users to easily operate and extend the database system.
Tectonic inheritance, reactivation and long term fault weakening processes
NASA Astrophysics Data System (ADS)
Holdsworth, Bob
2017-04-01
This talk gives a geological review of weakening processes in faults and their long-term effect on reactivation and tectonic inheritance during crustal deformation. Examples will be drawn from the Atlantic margins, N America, Japan and the Alps. Tectonic inheritance and reactivation are fundamentally controlled by the processes of stress concentration and shear localisation manifested at all scales in the continental lithosphere. Lithosphere-scale controls include crustal thickness, thermal age and the boundary conditions imposed by the causative plate tectonic processes during extension. At the other end of the scale range, grain-scale controls include local environmental controls (depth, stress, strain rate), rock composition, grainsize, fabric intensity and the presence of fluids or melt. Intermediate-scale geometric controls are largely related to the size, orientation and interconnectivity of pre-existing anisotropies. If reactivation of pre-existing structures occurs, it likely requires a combination of processes across all three scale ranges to be favourable. This can make the unequivocal recognition of inheritance and reactivation difficult. Large (e.g. crustal-scale) pre-existing structures are especially important due to their ability to efficiently concentrate stress and localise strain. For big faults (San Andreas, Great Glen, Median Tectonic Line), detailed studies of the associated exposed fault rocks indicate that reactivation is linked to the development of strongly anisotropic phyllosilicate-rich fault rocks that are weak (e.g. friction coefficients as low as 0.2 or less) under a broad range of deformation conditions. In the case of pre-existing regional dyke swarms (S Atlantic, NW Scotland) - which may themselves track deep mantle fabrics at depth - multiple reactivation of dyke margins is widespread and may preclude reactivation of favourably oriented local basement fabrics. In a majority of cases, pre-existing structures in the crust are significantly oblique (<70°) to far field stress orientations. As a result, even quite modest amounts of reactivation will inevitably lead to transtensional/transpressional strains involving variable components of strike-slip and extension or shortening. The occurrence of bulk non-coaxial, non-plane strain leads to strain partitioning and/or (non-Andersonian) multimodal fracturing where the deformation cannot be described or reconstructed in single 2D cross-sectional or map view. Further complications can arise due to repeated seismogenic rupturing of larger offset faults leading to local stress transfer and reactivation of widely distributed smaller pre-existing structures in the wall rocks (e.g. Adamello Massif, Alps). The Atlantic margins demonstrate that pre-existing structures can influence deformation patterns across a range of scales, but such reactivation should never be assumed to be the norm. In many cases, the scales of faulting and displacement magnitudes associated with these reactivation events are modest compared to the regional-scale deformation of the margin. However, reactivation most certainly does influence the kilometre and smaller-scale complexity of faults, fractures and folds. It will therefore impact significantly on the development of geological architectures and their economic importance, e.g. location and nature of fluid channelways, trap geometries, reservoir performance, etc.
Strategies and Exemplars for Public Outreach Events: Planning, Implementation, Evaluation
NASA Astrophysics Data System (ADS)
Cobb, W. H.; Buxner, S.; Shipp, S. S.; Shebby, S.
2015-12-01
Introduction: Each year the National Aeronautics and Space Administration (NASA) sponsors a variety of public outreach events to share information with educators, students, and the general public. These events are designed to increase interest in and awareness of the mission and goals of NASA. Planning and implementation best practices gleaned from the NASA SMD Education review of large-scale events, "Best Practices in Outreach Events," will be shared, along with outcomes from an event that utilized these strategies: i C Ceres, celebrating the Dawn mission's arrival at the dwarf planet Ceres. The best practices included can be pertinent for all event organizers and evaluators regardless of event size. Background: The literature review focused on identifying evaluations of large-scale public outreach events and, within these evaluations, identifying best practices. The following criteria were used to identify journal articles and reports for potential inclusion: public, science-related events open to adults and children; events with more than 1,000 attendees; events that occurred during the last 5 years; evaluations that included information on data collected from visitors and/or volunteers; and evaluations that specified the type of data collected, methodology, and associated results. Planning and Implementation Best Practices: The literature review revealed key considerations for planning and implementing large-scale events. A summary of related best practices follows: 1) advertise the event; 2) use and advertise access to scientists; 3) recruit scientists using these findings; 4) ensure that the event is group-friendly and particularly child-friendly; 5) target specific event outcomes. Best Practices Informing Real-world Planning, Implementation and Evaluation: The Dawn mission's collaborative design of the i C Ceres series, including in-person, interactive events geared to families and live presentations, will be shared. Outcomes and lessons learned arising from these events and their evaluation will be imparted, with a focus on the family event and, in particular, the evidence that scientist participation was a key driver of the event's impact and success.
Gething, Peter W; Noor, Abdisalan M; Gikandi, Priscilla W; Ogara, Esther A A; Hay, Simon I; Nixon, Mark S; Snow, Robert W; Atkinson, Peter M
2006-06-01
Reliable and timely information on disease-specific treatment burdens within a health system is critical for the planning and monitoring of service provision. Health management information systems (HMIS) exist to address this need at national scales across Africa but are failing to deliver adequate data because of widespread underreporting by health facilities. Faced with this inadequacy, vital public health decisions often rely on crudely adjusted regional and national estimates of treatment burdens. This study has taken the example of presumed malaria in outpatients within the largely incomplete Kenyan HMIS database and has defined a geostatistical modelling framework that can predict values for all data that are missing through space and time. The resulting complete set can then be used to define treatment burdens for presumed malaria at any level of spatial and temporal aggregation. Validation of the model has shown that these burdens are quantified to an acceptable level of accuracy at the district, provincial, and national scale. The modelling framework presented here provides, to our knowledge for the first time, reliable information from imperfect HMIS data to support evidence-based decision-making at national and sub-national levels.
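As a generic illustration of this kind of space-time imputation (not the authors' geostatistical model), a Gaussian-process regression over facility coordinates and reporting month can predict missing facility-months with uncertainty. All data and kernel settings below are assumed toy values.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Toy stand-in for the prediction step: facilities at (x, y) with month t;
# a share of facility-months is unreported and predicted from the rest.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (300, 3))                      # (x, y, t), rescaled
signal = np.sin(4 * X[:, 0]) + np.cos(4 * X[:, 1]) + 0.5 * X[:, 2]
y = signal + rng.normal(0, 0.1, 300)                 # e.g. log treatment counts

reported = rng.random(300) < 0.7                     # ~30% missing
gp = GaussianProcessRegressor(kernel=RBF([0.2, 0.2, 0.2]) + WhiteKernel(0.01))
gp.fit(X[reported], y[reported])
pred, sd = gp.predict(X[~reported], return_std=True) # impute with uncertainty
print(float(np.mean(np.abs(pred - y[~reported]))))   # mean absolute error
```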
Drosg, B; Wirthensohn, T; Konrad, G; Hornbachner, D; Resch, C; Wäger, F; Loderer, C; Waltenberger, R; Kirchmayr, R; Braun, R
2008-01-01
A comparison of stillage treatment options for large-scale bioethanol plants was based on the data of an existing plant producing approximately 200,000 t/yr of bioethanol and 1,400,000 t/yr of stillage. Animal feed production--the state-of-the-art technology at the plant--was compared to anaerobic digestion. The latter was simulated in two different scenarios: digestion in small-scale biogas plants in the surrounding area versus digestion in a large-scale biogas plant at the bioethanol production site. Emphasis was placed on a holistic simulation balancing chemical parameters and applying logistics algorithms to compare the efficiency of the stillage treatment solutions. For central anaerobic digestion, different digestate handling solutions were considered because of the large amount of digestate. For land application, a minimum of 36,000 ha of available agricultural area would be needed, along with 600,000 m(3) of storage volume. Secondly, membrane purification of the digestate was investigated, consisting of a decanter, microfiltration, and reverse osmosis. As a third option, aerobic wastewater treatment of the digestate was discussed. The final outcome was an economic evaluation of the three mentioned stillage treatment options, as a guide to stillage management for operators of large-scale bioethanol plants. Copyright IWA Publishing 2008.
ERIC Educational Resources Information Center
Binfet, John Tyler; Gadermann, Anne M.; Schonert-Reichl, Kimberly A.
2016-01-01
In this study, we sought to create and validate a brief measure to assess students' perceptions of kindness in school. Participants included 1,753 students in Grades 4 to 8 attending public schools in a large school district in southern British Columbia. The School Kindness Scale (SKS) demonstrated a unidimensional factor structure and adequate…
NASA Astrophysics Data System (ADS)
Pan, Zhenying; Yu, Ye Feng; Valuckas, Vytautas; Yap, Sherry L. K.; Vienne, Guillaume G.; Kuznetsov, Arseniy I.
2018-05-01
Cheap large-scale fabrication of ordered nanostructures is important for multiple applications in photonics and biomedicine including optical filters, solar cells, plasmonic biosensors, and DNA sequencing. Existing methods are either expensive or have strict limitations on the feature size and fabrication complexity. Here, we present a laser-based technique, plasmonic nanoparticle lithography, which is capable of rapid fabrication of large-scale arrays of sub-50 nm holes on various substrates. It is based on near-field enhancement and melting induced under ordered arrays of plasmonic nanoparticles, which are brought into contact or in close proximity to a desired material and acting as optical near-field lenses. The nanoparticles are arranged in ordered patterns on a flexible substrate and can be attached and removed from the patterned sample surface. At optimized laser fluence, the nanohole patterning process does not create any observable changes to the nanoparticles and they have been applied multiple times as reusable near-field masks. This resist-free nanolithography technique provides a simple and cheap solution for large-scale nanofabrication.
Scale-dependent feedbacks between patch size and plant reproduction in desert grassland
Svejcar, Lauren N.; Bestelmeyer, Brandon T.; Duniway, Michael C.; James, Darren K.
2015-01-01
Theoretical models suggest that scale-dependent feedbacks between plant reproductive success and plant patch size govern transitions from highly to sparsely vegetated states in drylands, yet there is scant empirical evidence for these mechanisms. Scale-dependent feedback models suggest that an optimal patch size exists for growth and reproduction of plants and that a threshold patch organization exists below which positive feedbacks between vegetation and resources can break down, leading to critical transitions. We examined the relationship between patch size and plant reproduction using an experiment in a Chihuahuan Desert grassland. We tested the hypothesis that reproductive effort and success of a dominant grass (Bouteloua eriopoda) would vary predictably with patch size. We found that focal plants in medium-sized patches featured higher rates of grass reproductive success than when plants occupied either large patch interiors or small patches. These patterns support the existence of scale-dependent feedbacks in Chihuahuan Desert grasslands and indicate an optimal patch size for reproductive effort and success in B. eriopoda. We discuss the implications of these results for detecting ecological thresholds in desert grasslands.
Learning of Multimodal Representations With Random Walks on the Click Graph.
Wu, Fei; Lu, Xinyan; Song, Jun; Yan, Shuicheng; Zhang, Zhongfei Mark; Rui, Yong; Zhuang, Yueting
2016-02-01
In multimedia information retrieval, most classic approaches tend to represent different modalities of media in the same feature space. With the click data collected from users' searching behavior, existing approaches take either one-to-one paired data (text-image pairs) or ranking examples (text-query-image and/or image-query-text ranking lists) as training examples, which do not make full use of the click data, particularly the implicit connections among the data objects. In this paper, we treat the click data as a large click graph, in which vertices are images/text queries and edges indicate the clicks between an image and a query. We consider learning a multimodal representation from the perspective of encoding the explicit/implicit relevance relationships between the vertices in the click graph. By minimizing both the truncated random walk loss and the distance between the learned representation of vertices and their corresponding deep neural network output, the proposed model, named multimodal random walk neural network (MRW-NN), can be applied not only to learn robust representations of the existing multimodal data in the click graph, but also to deal with unseen queries and images to support cross-modal retrieval. We evaluate the latent representation learned by MRW-NN on the public large-scale click log data set Clickture and further show that MRW-NN achieves much better cross-modal retrieval performance on unseen queries/images than the other state-of-the-art methods.
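The random-walk ingredient is easy to sketch: walks over the click graph surface implicit query-image connections (e.g. two images never co-clicked together but linked through shared queries) that one-to-one pairs miss. Co-visited vertices within a walk then serve as positive training examples for a representation learner. The toy graph and walk parameters below are illustrative assumptions.

```python
import random

def truncated_walks(adj, walk_len=4, n_walks=2, seed=0):
    """Generate truncated random walks on a bipartite click graph.

    adj : dict node -> list of neighbours, where query nodes connect
          to the images clicked for them and vice versa.
    """
    rng = random.Random(seed)
    walks = []
    for _ in range(n_walks):
        for start in adj:
            walk, node = [start], start
            for _ in range(walk_len - 1):
                node = rng.choice(adj[node])   # alternate query/image hops
                walk.append(node)
            walks.append(walk)
    return walks

clicks = {"q:red shoes": ["img1", "img2"], "q:sneakers": ["img2", "img3"],
          "img1": ["q:red shoes"], "img2": ["q:red shoes", "q:sneakers"],
          "img3": ["q:sneakers"]}
print(truncated_walks(clicks)[:3])
```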
NASA Astrophysics Data System (ADS)
Aliseda, Alberto; Bourgoin, Mickael; Eswirp Collaboration
2014-11-01
We present preliminary results from a recent grid turbulence experiment conducted at the ONERA wind tunnel in Modane, France. The ESWIRP Collaboration was conceived to probe the smallest scales of a canonical turbulent flow at very high Reynolds numbers. To achieve this, the largest scales of the turbulence need to be extremely big so that, even with the large separation of scales, the smallest scales remain well above the spatial and temporal resolution of the instruments. The ONERA wind tunnel in Modane (8 m diameter test section) was chosen as a limit of the biggest large scales achievable in a laboratory setting. A giant inflatable grid (M = 0.8 m) was conceived to induce slowly-decaying homogeneous isotropic turbulence in a large region of the test section, with minimal structural risk. An international team of researchers collected hot wire anemometry, ultrasound anemometry, resonant cantilever anemometry, fast pitot tube anemometry, cold wire thermometry and high-speed particle tracking data of this canonical turbulent flow. While analysis of this large database, which will become publicly available over the next 2 years, has only started, the Taylor-scale Reynolds number is estimated to be between 400 and 800, with Kolmogorov scales as large as a few mm. The ESWIRP Collaboration is formed by an international team of scientists to investigate experimentally the smallest scales of turbulence. It was funded by the European Union to take advantage of the largest wind tunnel in Europe for fundamental research.
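For orientation, scale estimates of this kind follow from standard isotropic-turbulence relations. The input values below are assumptions chosen only to land near the reported range; they are not measured ESWIRP data.

```python
# Back-of-envelope turbulence scales for air (illustrative values).
nu = 1.4e-5                 # kinematic viscosity of air (m^2/s)
u_rms = 0.15                # assumed fluctuation velocity (m/s)
L = 4.0                     # assumed integral length scale (m)

eps = u_rms**3 / L                       # dissipation-rate estimate
eta = (nu**3 / eps) ** 0.25              # Kolmogorov length scale
lam = (15 * nu * u_rms**2 / eps) ** 0.5  # Taylor microscale (isotropic)
Re_lambda = u_rms * lam / nu             # Taylor-scale Reynolds number
print(f"eta = {eta * 1e3:.2f} mm, Re_lambda = {Re_lambda:.0f}")
# With these assumed inputs: eta ~ 1.3 mm, Re_lambda ~ 800.
```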
NASA Astrophysics Data System (ADS)
Marlon, J. R.; Howe, P. D.; Leiserowitz, A.
2013-12-01
For climate change communication to be most effective, messages should be targeted to the characteristics of local audiences. In the U.S., 'Six Americas' have been identified among the public based on their response to the climate change issue. The distribution of these different 'publics' varies between states and communities, yet data about public opinion at the sub-national scale remains scarce. In this presentation, we describe a methodology to statistically downscale results from national-level surveys about the Six Americas, climate literacy, and other aspects of public opinion to smaller areas, including states, metropolitan areas, and counties. The method utilizes multilevel regression with poststratification (MRP) to model public opinion at various scales using a large national-level survey dataset. We present state and county-level estimates of two key beliefs about climate change: belief that climate change is happening, and belief in the scientific consensus about climate change. We further present estimates of how the Six Americas vary across the U.S.
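The poststratification half of MRP is a census-weighted average of cell-level model predictions. A minimal sketch with hypothetical numbers (the multilevel regression that produces the per-cell predictions is fitted elsewhere):

```python
import numpy as np

# Demographic cells of one hypothetical county: the model predicts the
# probability of the belief for each cell; the census supplies counts.
p_cell = np.array([0.78, 0.66, 0.55, 0.49])   # modelled P(belief) per cell
n_cell = np.array([1200, 800, 400, 600])      # county population per cell

county_estimate = np.sum(p_cell * n_cell) / n_cell.sum()
print(f"county share believing climate change is happening: {county_estimate:.2f}")
```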
NASA Astrophysics Data System (ADS)
Ray, R. K.; Syed, T. H.; Saha, Dipankar; Sarkar, B. C.; Patre, A. K.
2017-12-01
Extracted groundwater, 90% of which is used for irrigated agriculture, is central to the socio-economic development of India. A lack of regulation, or of implementation of regulations, alongside unrecorded extraction, often leads to overexploitation of large-scale common-pool resources like groundwater. Inevitably, management of groundwater extraction (draft) for irrigation is critical for sustainability of aquifers and the society at large. However, existing assessments of groundwater draft, which are mostly available at large spatial scales, are inadequate for managing groundwater resources that are primarily exploited by stakeholders at much finer scales. This study presents an estimate, projection and analysis of fine-scale groundwater draft in the Seonath-Kharun interfluve of central India. Using field surveys of instantaneous discharge from irrigation wells and boreholes, annual groundwater draft for irrigation in this area is estimated to be 212 × 10⁶ m³, most of which (89%) is withdrawn during the non-monsoon season. However, the density of wells/boreholes, and consequent extraction of groundwater, is controlled by the existing hydrogeological conditions. Based on trends in the number of abstraction structures (1982-2011), groundwater draft for the year 2020 is projected to be approximately 307 × 10⁶ m³; hence, groundwater draft for irrigation in the study area is predicted to increase by ~44% within a span of 8 years. Central to the work presented here is the approach for estimation and prediction of groundwater draft at finer scales, which can be extended to critical groundwater zones of the country.
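The projected growth quoted above can be checked directly from the abstract's own figures:

```python
draft_now = 212e6   # m^3/yr, survey-based estimate (8 years before 2020)
draft_2020 = 307e6  # m^3/yr, projected from 1982-2011 well-count trends
print(f"{(draft_2020 - draft_now) / draft_now:.1%}")  # 44.8%, the ~44% quoted
```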
Collections and user tools for utilization of persistent identifiers in cyberinfrastructures
NASA Astrophysics Data System (ADS)
Weigel, T.
2014-12-01
The main use of persistent identifiers (PIDs) for data objects has so far been for formal publication and citation purposes, with a focus on long-term availability and trust. This core use case has now evolved and broadened to include basic data management tasks, as identifiers are increasingly seen as a possible anchor element in the deluge of data for purposes of large-scale automation of tasks. The European Data Infrastructure (EUDAT), for instance, uses PIDs in its back-end services, and distinctly so for entities where the identifier may be more persistent than a resource with limited lifetime. Despite breaking with the traditional metaphor, this offers new opportunities for data management and end-user tools, but it also requires a clearly demonstrated benefit from value-added services, because en masse identifier assignment does not come at zero cost. There are several obstacles to overcome when establishing identifiers at large scale. The administration of large numbers of identifiers can be cumbersome if they are treated in an isolated manner; here, identifier collections can enable automated mass operations on groups of associated objects. Several use cases rely on base information that is rapidly available from the identifier systems without the need to retrieve the objects themselves, yet they will not work efficiently if that information is not consistently typed. Tools that span cyberinfrastructures and address scientific end-users unaware of the varying back-ends must overcome such obstacles. The Working Group on PID Information Types of the Research Data Alliance (RDA) has developed an interface specification and prototype to access and manipulate typed base information. Concrete prototypes for identifier collections exist as well. We will present first data- and provenance-tracking tools that make extensive use of these recent developments and address user needs that span from administrative tasks to individual end-user services, with particular focus on data available from the Earth System Grid Federation (ESGF). We will compare the tools along their respective use cases with existing approaches and discuss benefits and limitations.
Automated array assembly, phase 2
NASA Technical Reports Server (NTRS)
Carbajal, B. G.
1979-01-01
Tasks of scaling up the tandem junction cell (TJC) from 2 cm x 2 cm to 6.2 cm and the assembly of several modules using these large-area TJCs are described. The scale-up of the TJC was based on using the existing process and carrying out the design activities necessary to increase the cell area to an acceptably large area. The design was carried out using available device models, then verified, and sample large-area TJCs were fabricated. Mechanical and process problems occurred, causing a schedule slippage that resulted in contract expiration before enough large-area TJCs had been fabricated to populate the sample tandem junction modules (TJMs). A TJM design was carried out in which the module interconnects served to augment the current-collecting buses on the cell. No sample TJMs were assembled due to the shortage of large-area TJCs.
Self Assembly and Pyroelectric Poling for Organics
2015-07-06
…ozone or nitrogen oxides) and energetic species from corona discharge. These problems can strongly inhibit efficient poling and large-scale… poling techniques. Although contact and corona poling protocols have been well established for decades, some challenging problems exist. In… contact poling, severe charge injection from metal electrodes often results in large currents that cause dielectric breakdown of films. Corona poling…
Architecting for Large Scale Agile Software Development: A Risk-Driven Approach
2013-05-01
…addressed aspect of scale in agile software development. Practices such as Scrum of Scrums are meant to address orchestration of multiple development… owner, Scrum master) have differing responsibilities from the roles in the existing phase-based waterfall program structures. Such differences may… Scrum. Communication with both internal and external stakeholders must be open, and documentation should not be used as a substitute for communication.
Sustainable Model for Public Health Emergency Operations Centers for Global Settings.
Balajee, S Arunmozhi; Pasi, Omer G; Etoundi, Alain Georges M; Rzeszotarski, Peter; Do, Trang T; Hennessee, Ian; Merali, Sharifa; Alroy, Karen A; Phu, Tran Dac; Mounts, Anthony W
2017-10-01
Capacity to receive, verify, analyze, assess, and investigate public health events is essential for epidemic intelligence. Public health Emergency Operations Centers (PHEOCs) can serve as epidemic intelligence hubs by 1) having the capacity to receive, analyze, and visualize multiple data streams, including surveillance, and 2) maintaining a trained workforce that can analyze and interpret data from real-time emerging events. Such PHEOCs could be physically located within a ministry of health epidemiology, surveillance, or equivalent department rather than exist as stand-alone spaces; they can serve as operational hubs during non-outbreak times and, in emergencies, scale up according to the traditional Incident Command System structure.
Aggregate measures of ecosystem services: Can we take the pulse of nature?
Meyerson, L.A.; Baron, Jill S.; Melillo, J.M.; Naiman, R.J.; O'Malley, R.I.; Orians, G.; Palmer, Margaret A.; Pfaff, Alexander S.P.; Running, S.W.; Sala, O.E.
2005-01-01
National scale aggregate indicators of ecosystem services are useful for stimulating and supporting a broad public discussion about trends in the provision of these services. There are important considerations involved in producing an aggregate indicator, including whether the scientific and technological capacity exists, how to address varying perceptions of the societal importance of different services, and how to communicate information about these services to both decision makers and the general public. Although the challenges are formidable, they are not insurmountable. Quantification of ecosystem services and dissemination of information to decision makers and the public is critical for the responsible and sustainable management of natural resources.
Initiation process of earthquakes and its implications for seismic hazard reduction strategy.
Kanamori, H
1996-01-01
For the average citizen and the public, "earthquake prediction" means "short-term prediction," a prediction of a specific earthquake on a relatively short time scale. Such prediction must specify the time, place, and magnitude of the earthquake in question with sufficiently high reliability. For this type of prediction, one must rely on some short-term precursors. Examinations of strain changes just before large earthquakes suggest that consistent detection of such precursory strain changes cannot be expected. Other precursory phenomena such as foreshocks and nonseismological anomalies do not occur consistently either. Thus, reliable short-term prediction would be very difficult. Although short-term predictions with large uncertainties could be useful for some areas if their social and economic environments can tolerate false alarms, such predictions would be impractical for most modern industrialized cities. A strategy for effective seismic hazard reduction is to take full advantage of the recent technical advancements in seismology, computers, and communication. In highly industrialized communities, rapid earthquake information is critically important for emergency services agencies, utilities, communications, financial companies, and media to make quick reports and damage estimates and to determine where emergency response is most needed. Long-term forecast, or prognosis, of earthquakes is important for development of realistic building codes, retrofitting existing structures, and land-use planning, but the distinction between short-term and long-term predictions needs to be clearly communicated to the public to avoid misunderstanding. PMID:11607657
NASA Astrophysics Data System (ADS)
Nelson, Jonathan M.; Shimizu, Yasuyuki; Abe, Takaaki; Asahi, Kazutake; Gamou, Mineyuki; Inoue, Takuya; Iwasaki, Toshiki; Kakinuma, Takaharu; Kawamura, Satomi; Kimura, Ichiro; Kyuka, Tomoko; McDonald, Richard R.; Nabi, Mohamed; Nakatsugawa, Makoto; Simões, Francisco R.; Takebayashi, Hiroshi; Watanabe, Yasunori
2016-07-01
This paper describes a new, public-domain interface for modeling flow, sediment transport and morphodynamics in rivers and other geophysical flows. The interface is named after the International River Interface Cooperative (iRIC), the group that constructed the interface and many of the current solvers included in iRIC. The interface is entirely free to any user and currently houses thirteen models ranging from simple one-dimensional models through three-dimensional large-eddy simulation models. Solvers are only loosely coupled to the interface so it is straightforward to modify existing solvers or to introduce other solvers into the system. Six of the most widely-used solvers are described in detail including example calculations to serve as an aid for users choosing what approach might be most appropriate for their own applications. The example calculations range from practical computations of bed evolution in natural rivers to highly detailed predictions of the development of small-scale bedforms on an initially flat bed. The remaining solvers are also briefly described. Although the focus of most solvers is coupled flow and morphodynamics, several of the solvers are also specifically aimed at providing flood inundation predictions over large spatial domains. Potential users can download the application, solvers, manuals, and educational materials including detailed tutorials at www.i-ric.org. The iRIC development group encourages scientists and engineers to use the tool and to consider adding their own methods to the iRIC suite of tools.
A Large Scale Dynamical System Immune Network Model with Finite Connectivity
NASA Astrophysics Data System (ADS)
Uezu, T.; Kadono, C.; Hatchett, J.; Coolen, A. C. C.
We study a model of an idiotypic immune network which was introduced by N. K. Jerne. It is known that in immune systems there generally exist several kinds of immune cells which can recognize any particular antigen. Taking this fact into account and assuming that each cell interacts with only a finite number of other cells, we analyze a large scale immune network via both numerical simulations and statistical mechanical methods, and show that the distribution of the concentrations of antibodies becomes non-trivial for a range of values of the strength of the interaction and the connectivity.
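The abstract does not give the model's equations, but the qualitative setup, antibody concentrations evolving on a sparsely connected interaction graph, can be illustrated with a generic relaxational dynamics. This stand-in is not the authors' model:

```python
import numpy as np

rng = np.random.default_rng(0)
N, c = 500, 5                       # N cell types, each coupled to ~c others
J = np.zeros((N, N))
for i in range(N):                  # sparse symmetric +/- couplings
    for j in rng.choice(N, size=c, replace=False):
        J[i, j] = J[j, i] = rng.choice([-1.0, 1.0])

x = rng.random(N)                   # antibody concentrations
dt = 0.05
for _ in range(2000):               # relaxational dynamics with saturation
    x += dt * (-x + np.tanh(J @ x))

# For strong enough couplings the stationary concentration distribution
# becomes non-trivial (multi-modal); inspect np.histogram(x, bins=20).
```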
Friction-Stir Welding of Large Scale Cryogenic Fuel Tanks for Aerospace Applications
NASA Technical Reports Server (NTRS)
Jones, Clyde S., III; Venable, Richard A.
1998-01-01
The Marshall Space Flight Center has established a facility for the joining of large-scale aluminum-lithium alloy 2195 cryogenic fuel tanks using the friction-stir welding process. Longitudinal welds, approximately five meters in length, were made possible by retrofitting an existing vertical fusion weld system, designed to fabricate tank barrel sections ranging from two to ten meters in diameter. The structural design requirements of the tooling, clamping and the spindle travel system will be described in this paper. Process controls and real-time data acquisition will also be described, and were critical elements contributing to successful weld operation.
Large-Scale Demonstration of Liquid Hydrogen Storage with Zero Boiloff for In-Space Applications
NASA Technical Reports Server (NTRS)
Hastings, L. J.; Bryant, C. B.; Flachbart, R. H.; Holt, K. A.; Johnson, E.; Hedayat, A.; Hipp, B.; Plachta, D. W.
2010-01-01
Cryocooler and passive insulation technology advances have substantially improved prospects for zero-boiloff cryogenic storage. Therefore, a cooperative effort by NASA's Ames Research Center, Glenn Research Center, and Marshall Space Flight Center (MSFC) was implemented to develop zero-boiloff concepts for in-space cryogenic storage. Described herein is one program element - a large-scale, zero-boiloff demonstration using the MSFC multipurpose hydrogen test bed (MHTB). A commercial cryocooler was interfaced with an existing MHTB spray bar mixer and insulation system in a manner that enabled a balance between incoming and extracted thermal energy.
Enabling Large-Scale Biomedical Analysis in the Cloud
Lin, Ying-Chih; Yu, Chin-Sheng; Lin, Yen-Jen
2013-01-01
Recent progress in high-throughput instrumentation has led to an astonishing growth in both the volume and complexity of biomedical data collected from various sources. These planet-size data bring serious challenges to storage and computing technologies. Cloud computing is an attractive way to crack this nut because it jointly provides scalable storage and high-performance computing for large-scale data. This work briefly introduces data-intensive computing systems and summarizes existing cloud-based resources in bioinformatics. These developments and applications should help biomedical research make its vast amount of diverse data meaningful and usable. PMID:24288665
Measures of Agreement Between Many Raters for Ordinal Classifications
Nelson, Kerrie P.; Edwards, Don
2015-01-01
Screening and diagnostic procedures often require a physician's subjective interpretation of a patient's test result using an ordered categorical scale to define the patient's disease severity. Due to wide variability observed between physicians’ ratings, many large-scale studies have been conducted to quantify agreement between multiple experts’ ordinal classifications in common diagnostic procedures such as mammography. However, very few statistical approaches are available to assess agreement in these large-scale settings. Existing summary measures of agreement rely on extensions of Cohen's kappa [1 - 5]. These are prone to prevalence and marginal distribution issues, become increasingly complex for more than three experts or are not easily implemented. Here we propose a model-based approach to assess agreement in large-scale studies based upon a framework of ordinal generalized linear mixed models. A summary measure of agreement is proposed for multiple experts assessing the same sample of patients’ test results according to an ordered categorical scale. This measure avoids some of the key flaws associated with Cohen's kappa and its extensions. Simulation studies are conducted to demonstrate the validity of the approach with comparison to commonly used agreement measures. The proposed methods are easily implemented using the software package R and are applied to two large-scale cancer agreement studies. PMID:26095449
How Big is Too Big for Hubs: Marginal Profitability in Hub-and-Spoke Networks
NASA Technical Reports Server (NTRS)
Ross, Leola B.; Schmidt, Stephen J.
1997-01-01
Increasing the scale of hub operations at major airports has led to concerns about congestion at excessively large hubs. In this paper, we estimate the marginal cost of adding spokes to an existing hub network. We observe entry/non-entry decisions on potential spokes from existing hubs, and estimate both a variable profit function for providing service in markets using that spoke as well as the fixed costs of providing service to the spoke. We let the fixed costs depend upon the scale of operations at the hub, and find the hub size at which spoke service costs are minimized.
Johannessen, Liv Karen; Obstfelder, Aud; Lotherington, Ann Therese
2013-05-01
The purpose of this paper is to explore the making and scaling of information infrastructures, as well as how the conditions for scaling a component may change for the vendor. The first research question is how the making and scaling of a healthcare information infrastructure can be done and by whom. The second question is what scope for manoeuvre there might be for vendors aiming to expand their market. This case study is based on an interpretive approach, whereby data is gathered through participant observation and semi-structured interviews. A case study of the making and scaling of an electronic system for general practitioners ordering laboratory services from hospitals is described as comprising two distinct phases. The first may be characterized as an evolving phase, when development, integration and implementation were achieved in small steps, and the vendor, together with end users, had considerable freedom to create the solution according to the users' needs. The second phase was characterized by a large-scale procurement process over which regional healthcare authorities exercised much more control and the needs of groups other than the end users influenced the design. The making and scaling of healthcare information infrastructures is not simply a process of evolution, in which the end users use and change the technology. It also consists of large steps, during which different actors, including vendors and healthcare authorities, may make substantial contributions. This process requires work, negotiation and strategies. The conditions for the vendor may change dramatically, from considerable freedom and close relationships with users and customers in the small-scale development, to losing control of the product and being required to engage in more formal relations with customers in the wider public healthcare market. Onerous procurement processes may be one of the reasons why large-scale implementation of information projects in healthcare is difficult and slow. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Massive superclusters as a probe of the nature and amplitude of primordial density fluctuations
NASA Technical Reports Server (NTRS)
Kaiser, N.; Davis, M.
1985-01-01
It is pointed out that correlation studies of galaxy positions have been widely used in the search for information about the large-scale matter distribution. The study of rare condensations on large scales provides an approach to extend the existing knowledge of large-scale structure into the weakly clustered regime. Shane (1975) provides a description of several apparent massive condensations within the Shane-Wirtanen catalog, taking into account the Serpens-Virgo cloud and the Corona cloud. In the present study, a description is given of a model for estimating the frequency of condensations which evolve from initially Gaussian fluctuations. This model is applied to the Corona cloud to estimate its 'rareness' and thereby estimate the rms density contrast on this mass scale. An attempt is made to find a conflict between the density fluctuations derived from the Corona cloud and independent constraints. A comparison is conducted of the estimate and the density fluctuations predicted to arise in a universe dominated by cold dark matter.
NASA Astrophysics Data System (ADS)
Postadjian, T.; Le Bris, A.; Sahbi, H.; Mallet, C.
2017-05-01
Semantic classification is a core remote sensing task as it provides the fundamental input for land-cover map generation. The very recent literature has shown the superior performance of deep convolutional neural networks (DCNN) for many classification tasks including the automatic analysis of Very High Spatial Resolution (VHR) geospatial images. Most of the recent initiatives have focused on very high discrimination capacity combined with accurate object boundary retrieval. Current architectures are therefore well tailored to urban areas over restricted extents but are not designed for large-scale purposes. This paper presents an end-to-end automatic processing chain, based on DCNNs, that aims at performing large-scale classification of VHR satellite images (here SPOT 6/7). Since this work assesses, through various experiments, the potential of DCNNs for country-scale VHR land-cover map generation, a simple yet effective architecture is proposed that efficiently discriminates the main classes of interest (namely buildings, roads, water, crops, and vegetated areas) by exploiting existing VHR land-cover maps for training.
Interplay between Public Attention and Public Emotion toward Multiple Social Issues on Twitter
Peng, Tai-Quan; Sun, Guodao; Wu, Yingcai
2017-01-01
This study aims to elucidate the intricate interplay between public attention and public emotion toward multiple social issues. A theoretical framework is developed based on three perspectives including endogenous affect hypothesis, affect transfer hypothesis, and affective intelligence theory. Large-scale longitudinal data with 265 million tweets on five social issues are analyzed using a time series analytical approach. Public attention on social issues can influence public emotion on the issue per se. Social issues interact with one another to attract public attention in both cooperative and competitive ways. Instead of a direct transfer from public emotion to public attention, the public emotion toward a social issue moderates the interaction between the issue and other issue(s). PMID:28081117
Public Health's Approach to Systemic Racism: a Systematic Literature Review.
Castle, Billie; Wendel, Monica; Kerr, Jelani; Brooms, Derrick; Rollins, Aaron
2018-05-04
Recently, public health has acknowledged racism as a social determinant of health. Much evidence exists on the impact of individual-level racism and discrimination, with little to no examination of racism from the standpoint of systems and structures. The purpose of this systematic literature review is to analyze the extent to which public health currently addresses systemic racism in the published literature. Utilizing the PRISMA guidelines, this review examines three widely used databases to examine published literature covering the topic as well as implications for future research and practice. A total of 85 articles were included in the review analysis after meeting study criteria. Across numerous articles, the terms racism and systemic racism are largely absent. A critical need exists for an examination of the historical impact of systemic racism on the social determinants of health and health of marginalized populations.
Weapons and Minority Youth Violence.
ERIC Educational Resources Information Center
Northrop, Daphne; Hamrick, Kim
Weapons violence is a major public health problem that especially impacts minority youth. Interventions designed to reduce weapon use by youth are categorized as educational/behavioral change, legal, and technological/environmental. Few educational programs currently exist, but those that do largely concern firearm safety courses, public…
Status of the database on microorganism inactivation in environmental media (DIMEM)
USDA-ARS?s Scientific Manuscript database
Inactivation of pathogenic and indicator microorganisms is the essential component of their environmental fate which needs to be considered in environmental microbiology models. Existing data from a large number of inactivation experiments are dispersed across numerous publications with varying avai...
Mechanical and Statistical Evidence of Human-Caused Earthquakes - A Global Data Analysis
NASA Astrophysics Data System (ADS)
Klose, C. D.
2012-12-01
The causality of large-scale geoengineering activities and the occurrence of earthquakes with magnitudes of up to M=8 is discussed, and mechanical and statistical evidence is provided. The earthquakes were caused by artificial water reservoir impoundments, underground and open-pit mining, coastal management, hydrocarbon production and fluid injections/extractions. The presented global earthquake catalog has been recently published in the Journal of Seismology and is available to the public at www.cdklose.com. The data show evidence that geomechanical relationships exist with statistical significance between a) seismic moment magnitudes of observed earthquakes, b) anthropogenic mass shifts on the Earth's crust, and c) lateral distances of the earthquake hypocenters to the locations of the mass shifts. Research findings depend on uncertainties, in particular, of source parameter estimations of seismic events before instrumental recording. First analyses, however, indicate that small- to medium-size earthquakes (
Human Disease-Drug Network Based on Genomic Expression Profiles
Hu, Guanghui; Agarwal, Pankaj
2009-01-01
Background: Drug repositioning offers the possibility of faster development times and reduced risks in drug discovery. With the rapid development of high-throughput technologies and ever-increasing accumulation of whole genome-level datasets, an increasing number of diseases and drugs can be comprehensively characterized by the changes they induce in gene expression, protein, metabolites and phenotypes. Methodology/Principal Findings: We performed a systematic, large-scale analysis of genomic expression profiles of human diseases and drugs to create a disease-drug network. A network of 170,027 significant interactions was extracted from the ∼24.5 million comparisons between ∼7,000 publicly available transcriptomic profiles. The network includes 645 disease-disease, 5,008 disease-drug, and 164,374 drug-drug relationships. At least 60% of the disease-disease pairs were in the same disease area as determined by the Medical Subject Headings (MeSH) disease classification tree. The remaining can drive a molecular level nosology by discovering relationships between seemingly unrelated diseases, such as a connection between bipolar disorder and hereditary spastic paraplegia, and a connection between actinic keratosis and cancer. Among the 5,008 disease-drug links, connections with negative scores suggest new indications for existing drugs, such as the use of some antimalaria drugs for Crohn's disease, and a variety of existing drugs for Huntington's disease; while the positive scoring connections can aid in drug side effect identification, such as tamoxifen's undesired carcinogenic property. From the ∼37K drug-drug relationships, we discover relationships that aid in target and pathway deconvolution, such as 1) KCNMA1 as a potential molecular target of lobeline, and 2) both apoptotic DNA fragmentation and G2/M DNA damage checkpoint regulation as potential pathway targets of daunorubicin. Conclusions/Significance: We have automatically generated thousands of disease and drug expression profiles using GEO datasets, and constructed a large scale disease-drug network for effective and efficient drug repositioning as well as drug target/pathway identification. PMID:19657382
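The scoring idea behind the disease-drug links, namely that a drug whose expression signature anticorrelates with a disease signature is a repositioning candidate, can be sketched as follows; the profiles and their size are made up for illustration, standing in for the GEO-derived signatures.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
# log fold-changes over 200 genes: disease vs. normal, drug vs. vehicle
disease_sig = rng.normal(size=200)
drug_sig = -0.6 * disease_sig + rng.normal(scale=0.5, size=200)

# a strongly negative rank correlation suggests the drug may reverse the
# disease expression state, i.e. a repositioning hypothesis
rho, p = spearmanr(disease_sig, drug_sig)
print(f"score = {rho:.2f} (p = {p:.1e})")
```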
Shen, Hong-Bin; Chou, Kuo-Chen
2007-04-20
Proteins may simultaneously exist at, or move between, two or more different subcellular locations. Proteins with multiple locations or dynamic features of this kind are particularly interesting because they may have some very special biological functions intriguing to investigators in both basic research and drug discovery. For instance, among the 6408 human protein entries that have experimentally observed subcellular location annotations in the Swiss-Prot database (version 50.7, released 19-Sept-2006), 973 (approximately 15%) have multiple location sites. The total number of human protein entries (except those annotated with "fragment" or those with less than 50 amino acids) in the same database is 14,370, leaving a gap of (14,370 − 6408) = 7962 entries for which no knowledge is available about their subcellular locations. Although one can use the computational approach to predict the desired information for the gap, so far all the existing methods for predicting human protein subcellular localization are limited to the case of a single location site only. To overcome this barrier, a new ensemble classifier, named Hum-mPLoc, was developed that can deal with the case of multiple location sites as well. Hum-mPLoc is freely accessible to the public as a web server at http://202.120.37.186/bioinf/hum-multi. Meanwhile, for the convenience of people working in the relevant areas, Hum-mPLoc has been used to identify all human protein entries in the Swiss-Prot database that do not have subcellular location annotations or are annotated as being uncertain. The large-scale results thus obtained have been deposited in a downloadable file prepared with Microsoft Excel and named "Tab_Hum-mPLoc.xls". This file is available at the same website and will be updated twice a year to include new entries of human proteins and reflect the continuous development of Hum-mPLoc.
Across the Great Divide: The Effects of Technology in Secondary Biology Classrooms
NASA Astrophysics Data System (ADS)
Worley, Johnny Howard, II
This study investigates the relationship between technology use and student achievement in public high school across North Carolina. The purpose of this study was to determine whether a digital divide (differences in technology utilization based on student demographics of race/ethnicity, gender, socioeconomic status, and municipality) exists among schools and whether those differences relate to student achievement in high school biology classrooms. The study uses North Carolina end-of-course (EOC) data for biology to analyze student demographic data and assessment results from the 2010-2011 school year from the North Carolina Department of Public Instruction. The data analyses use descriptive and factorial univariate statistics to determine the existence of digital divides and their effects on biology achievement. Analysis of these data described patterns of technology use to determine whether potential variances resulted in a digital divide. Specific technology uses were identified in the data and then their impact on biology achievement scores within various demographic groups was examined. Research findings revealed statistically significant variations of use within different population groups. Despite being statistically significant, the relevance of the association in the variations was minimal at best -- based on the effect scale established by Cohen (1988). Additional factorial univariate analyses were employed to determine potential relationships between technology use and student achievement. The data revealed that technology use did not influence the variation of student achievement scale scores as much as race/ethnicity and socioeconomic status. White students outperformed Hispanic students by an average of three scale score points and Black students by an average of six scale score points. Technology use alone averaged less than a one point difference in mean scale scores, and only when interacting with race, gender, and/or SES did the mean difference increase. However, this increase within the context of the biology scale score range was negligible. This study contributes to the existing body of research on the effects of technology use on student achievement and its influence within various student demographic groups and municipalities. The study also provides additional research information for effective technology utilization, implementation, and instruction in educational environments.
Place-based planning: innovations and applications from four western forests.
Jennifer O. Farnum; Linda E. Kruger
2008-01-01
Place-based planning is an emergent method of public lands planning that aims to redefine the scale at which planning occurs, using place meanings and place values to guide planning processes. Despite the approach's growing popularity, there exist few published accounts of place-based approaches. To provide practitioners and researchers with such examples, the...
An Assessment of Factors Affecting the Organizational Climate of Hawaii Elementary Schools.
ERIC Educational Resources Information Center
Amin, Aminullah; Glenn, Charles E.
The problem undertaken by this study was to determine what, if any, relationships exist between the organizational climates of selected schools, as measured by the Organizational Climate Description Questionnaire (OCDQ), and the scores of teachers on Form E of the Rokeach Dogmatism Scale. The study was conducted in eight Hawaii public elementary…
NASA Astrophysics Data System (ADS)
Weltzin, J. F.; Scully, R. A.; Bayer, J.
2016-12-01
Individual natural resource monitoring programs have evolved in response to different organizational mandates, jurisdictional needs, issues and questions. We are establishing a collaborative forum for large-scale, long-term monitoring programs to identify opportunities where collaboration could yield efficiency in monitoring design, implementation, analyses, and data sharing. We anticipate these monitoring programs will have similar requirements - e.g. survey design, standardization of protocols and methods, information management and delivery - that could be met by enterprise tools to promote sustainability, efficiency and interoperability of information across geopolitical boundaries or organizational cultures. MonitoringResources.org, a project of the Pacific Northwest Aquatic Monitoring Partnership, provides an on-line suite of enterprise tools focused on aquatic systems in the Pacific Northwest region of the United States. We will build on and expand this existing capacity to support continental-scale monitoring of both aquatic and terrestrial systems. The current stakeholder group is focused on programs led by bureaus within the Department of the Interior, but the tools will be readily and freely available to a broad variety of other stakeholders. Here, we report the results of two initial stakeholder workshops focused on (1) establishing a collaborative forum of large-scale monitoring programs, (2) identifying and prioritizing shared needs, (3) evaluating existing enterprise resources, (4) defining priorities for development of enhanced capacity for MonitoringResources.org, and (5) identifying a small number of pilot projects that can be used to define and test development requirements for specific monitoring programs.
NASA Astrophysics Data System (ADS)
Tohidi, Ali; Gollner, Michael J.; Xiao, Huahua
2018-01-01
Fire whirls present a powerful intensification of combustion, long studied in the fire research community because of the dangers they present during large urban and wildland fires. However, their destructive power has hidden many features of their formation, growth, and propagation. Therefore, most of what is known about fire whirls comes from scale modeling experiments in the laboratory. Both the methods of formation, which are dominated by wind and geometry, and the inner structure of the whirl, including velocity and temperature fields, have been studied at this scale. Quasi-steady fire whirls directly over a fuel source form the bulk of current experimental knowledge, although many other cases exist in nature. The structure of fire whirls has yet to be reliably measured at large scales; however, scaling laws have been relatively successful in modeling the conditions for formation from small to large scales. This review surveys the state of knowledge concerning the fluid dynamics of fire whirls, including the conditions for their formation, their structure, and the mechanisms that control their unique state. We highlight recent discoveries and survey potential avenues for future research, including using the properties of fire whirls for efficient remediation and energy generation.
Selected Papers on Low-Energy Antiprotons and Possible Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Noble, Robert
1998-09-19
The only realistic means by which to create a facility at Fermilab to produce large amounts of low energy antiprotons is to use resources which already exist. There is simply too little money and manpower at this point in time to generate new accelerators on a time scale before the turn of the century. Therefore, innovation is required to modify existing equipment to provide the services required by experimenters.
ERIC Educational Resources Information Center
King, Gary; Gakidou, Emmanuela; Ravishankar, Nirmala; Moore, Ryan T.; Lakin, Jason; Vargas, Manett; Tellez-Rojo, Martha Maria; Avila, Juan Eugenio Hernandez; Avila, Mauricio Hernandez; Llamas, Hector Hernandez
2007-01-01
We develop an approach to conducting large-scale randomized public policy experiments intended to be more robust to the political interventions that have ruined some or all parts of many similar previous efforts. Our proposed design is insulated from selection bias in some circumstances even if we lose observations; our inferences can still be…
ERIC Educational Resources Information Center
Rosen, Andrew S.
2018-01-01
Student evaluations of teaching are widely adopted across academic institutions, but there are many underlying trends and biases that can influence their interpretation. Publicly accessible web-based student evaluations of teaching are of particular relevance, due to their widespread use by students in the course selection process and the quantity…
Issues for bringing digital libraries into public use
NASA Technical Reports Server (NTRS)
Flater, David W.; Yesha, Yelena
1993-01-01
In much the same way that the field of artificial intelligence produced a cult which fervently believed that computers would soon think like human beings, the existence of electronic books has resurrected the paperless society as a utopian vision to some, an apocalyptic horror to others. In this essay we have attempted to provide realistic notions of what digital libraries are likely to become if they are a popular success. E-books are capable of subsuming most of the media we use today and have the potential for added functionality by being interactive. The environmental impact of having millions more computers will be offset to some degree, perhaps even exceeded, by the fact that televisions, stereos, VCR's, CD players, newspapers, magazines, and books will become part of the computer system or be made redundant. On the whole, large-scale use of digital libraries is likely to be a winning proposition. Whether or not this comes to pass depends on the directions taken by today's researchers and software developers. By involving the public, the effort being put into digital libraries can be leveraged into something which is big enough to make a real change for the better. If digital libraries remain the exclusive property of government, universities, and large research firms, then large parts of the world will remain without digital libraries for years to come, just as they have remained without digital phone service for far too long. If software companies try to scuttle the project by patenting crucial algorithms and using proprietary data formats, all of us will suffer. Let us reverse the errors of the past and create a truly open digital library system.
Kennedy, Amy E; Khoury, Muin J; Ioannidis, John P A; Brotzman, Michelle; Miller, Amy; Lane, Crystal; Lai, Gabriel Y; Rogers, Scott D; Harvey, Chinonye; Elena, Joanne W; Seminara, Daniela
2016-10-01
We report on the establishment of a web-based Cancer Epidemiology Descriptive Cohort Database (CEDCD). The CEDCD's goals are to enhance awareness of resources, facilitate interdisciplinary research collaborations, and support existing cohorts for the study of cancer-related outcomes. Comprehensive descriptive data were collected from large cohorts established to study cancer as primary outcome using a newly developed questionnaire. These included an inventory of baseline and follow-up data, biospecimens, genomics, policies, and protocols. Additional descriptive data extracted from publicly available sources were also collected. This information was entered in a searchable and publicly accessible database. We summarized the descriptive data across cohorts and reported the characteristics of this resource. As of December 2015, the CEDCD includes data from 46 cohorts representing more than 6.5 million individuals (29% ethnic/racial minorities). Overall, 78% of the cohorts have collected blood at least once, 57% at multiple time points, and 46% collected tissue samples. Genotyping has been performed by 67% of the cohorts, while 46% have performed whole-genome or exome sequencing in subsets of enrolled individuals. Information on medical conditions other than cancer has been collected in more than 50% of the cohorts. More than 600,000 incident cancer cases and more than 40,000 prevalent cases are reported, with 24 cancer sites represented. The CEDCD assembles detailed descriptive information on a large number of cancer cohorts in a searchable database. Information from the CEDCD may assist the interdisciplinary research community by facilitating identification of well-established population resources and large-scale collaborative and integrative research. Cancer Epidemiol Biomarkers Prev; 25(10); 1392-401. ©2016 AACR.
Australia's economic transition, unemployment, suicide and mental health needs.
Myles, Nicholas; Large, Matthew; Myles, Hannah; Adams, Robert; Liu, Dennis; Galletly, Cherrie
2017-02-01
There have been substantial changes in workforce and employment patterns in Australia over the past 50 years as a result of economic globalisation. This has resulted in substantial reduction in employment in the manufacturing industry often with large-scale job losses in concentrated sectors and communities. Large-scale job loss events receive significant community attention. To what extent these mass unemployment events contribute to increased psychological distress, mental illness and suicide in affected individuals warrants further consideration. Here we undertake a narrative review of published job loss literature. We discuss the impact that large-scale job loss events in the manufacturing sector may have on population mental health, with particular reference to contemporary trends in the Australian economy. We also provide a commentary on the expected outcomes of future job loss events in this context and the implications for Australian public mental health care services. Job loss due to plant closure results in a doubling of psychological distress that peaks 9 months following the unemployment event. The link between job loss and increased rates of mental illness and suicide is less clear. The threat of impending job loss and the social context in which job loss occurs has a significant bearing on psychological outcomes. The implications for Australian public mental health services are discussed.
Simulation research on the process of large scale ship plane segmentation intelligent workshop
NASA Astrophysics Data System (ADS)
Xu, Peng; Liao, Liangchuang; Zhou, Chao; Xue, Rui; Fu, Wei
2017-04-01
The large-scale ship plane-segmentation intelligent workshop is a new development, and no research in related fields exists domestically or abroad. The mode of production must be transformed from the existing Industry 2.0 (or partial Industry 3.0) paradigm of "human-brain analysis and judgment + machine manufacturing" to "machine analysis and judgment + machine manufacturing". In this transformation, a great number of tasks must be settled in both management and technology, such as workshop structure evolution, development of intelligent equipment, and changes in the business model; along with them comes the reformation of the whole workshop. The process simulation in this project verifies the general layout and process flow of the large-scale ship plane-segmentation intelligent workshop and analyzes its working efficiency, which is significant for the next step of the transformation.
A PRACTICAL ONTOLOGY FOR THE LARGE-SCALE MODELING OF SCHOLARLY ARTIFACTS AND THEIR USAGE
DOE Office of Scientific and Technical Information (OSTI.GOV)
RODRIGUEZ, MARKO A.; BOLLEN, JOHAN; VAN DE SOMPEL, HERBERT
2007-01-30
The large-scale analysis of scholarly artifact usage is constrained primarily by current practices in usage data archiving, privacy issues concerned with the dissemination of usage data, and the lack of a practical ontology for modeling the usage domain. As a remedy to the third constraint, this article presents a scholarly ontology that was engineered to represent those classes for which large-scale bibliographic and usage data exist, supports usage research, and whose instantiation is scalable to the order of 50 million articles along with their associated artifacts (e.g. authors and journals) and an accompanying 1 billion usage events. The real-world instantiation of the presented abstract ontology is a semantic network model of the scholarly community which lends the scholarly process to statistical analysis and computational support. The authors present the ontology, discuss its instantiation, and provide some example inference rules for calculating various scholarly artifact metrics.
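To make the notion of instantiating such an ontology concrete, here is a small RDF sketch in the same spirit; the namespace and the class and property names (Article, UsageEvent, hasAuthor, and so on) are invented for illustration and are not the authors' published schema.

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, XSD

SCH = Namespace("http://example.org/scholarly#")   # invented namespace
g = Graph()

article = URIRef("http://example.org/article/42")
author = URIRef("http://example.org/author/7")
event = URIRef("http://example.org/usage/abc123")

g.add((article, RDF.type, SCH.Article))
g.add((author, RDF.type, SCH.Author))
g.add((article, SCH.hasAuthor, author))
g.add((event, RDF.type, SCH.UsageEvent))           # e.g. one full-text download
g.add((event, SCH.refersTo, article))
g.add((event, SCH.occurredAt,
       Literal("2007-01-30T12:00:00", datatype=XSD.dateTime)))

print(g.serialize(format="turtle"))
```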
Sub-Selective Quantization for Learning Binary Codes in Large-Scale Image Search.
Li, Yeqing; Liu, Wei; Huang, Junzhou
2018-06-01
Recently with the explosive growth of visual content on the Internet, large-scale image search has attracted intensive attention. It has been shown that mapping high-dimensional image descriptors to compact binary codes can lead to considerable efficiency gains in both storage and performing similarity computation of images. However, most existing methods still suffer from expensive training devoted to large-scale binary code learning. To address this issue, we propose a sub-selection based matrix manipulation algorithm, which can significantly reduce the computational cost of code learning. As case studies, we apply the sub-selection algorithm to several popular quantization techniques including cases using linear and nonlinear mappings. Crucially, we can justify the resulting sub-selective quantization by proving its theoretic properties. Extensive experiments are carried out on three image benchmarks with up to one million samples, corroborating the efficacy of the sub-selective quantization method in terms of image retrieval.
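The core trick, learning the quantizer from a sub-selected subset of rows and then encoding everything, can be sketched with a simplified PCA-plus-sign quantizer standing in for the paper's method; the descriptor matrix and sizes below are made up.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100_000, 128)).astype(np.float32)  # image descriptors

m = 5_000                                    # sub-selected training rows only
sub = X[rng.choice(len(X), size=m, replace=False)]
mu = sub.mean(axis=0)
# top-64 principal directions estimated from the small subset
_, _, Vt = np.linalg.svd(sub - mu, full_matrices=False)
W = Vt[:64].T                                # 128-dim -> 64-bit projection

codes = (X - mu) @ W > 0                     # binary codes for the full set
packed = np.packbits(codes, axis=1)          # 8 bytes per image
```

The training cost scales with the subset size m rather than with the full collection, which is the efficiency argument the abstract makes.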
Gaussian processes for personalized e-health monitoring with wearable sensors.
Clifton, Lei; Clifton, David A; Pimentel, Marco A F; Watkinson, Peter J; Tarassenko, Lionel
2013-01-01
Advances in wearable sensing and communications infrastructure have allowed the widespread development of prototype medical devices for patient monitoring. However, such devices have not penetrated into clinical practice, primarily due to a lack of research into "intelligent" analysis methods that are sufficiently robust to support large-scale deployment. Existing systems are typically plagued by large false-alarm rates, and an inability to cope with sensor artifact in a principled manner. This paper has two aims: 1) proposal of a novel, patient-personalized system for analysis and inference in the presence of data uncertainty, typically caused by sensor artifact and data incompleteness; 2) demonstration of the method using a large-scale clinical study in which 200 patients have been monitored using the proposed system. This latter provides much-needed evidence that personalized e-health monitoring is feasible within an actual clinical environment, at scale, and that the method is capable of improving patient outcomes via personalized healthcare.
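A minimal sketch of the per-patient idea using an off-the-shelf GP regressor follows; it is a simplification with invented vital-sign samples, not the paper's model or alerting thresholds.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

t = np.array([0, 5, 10, 15, 25, 30, 40, 45], dtype=float)[:, None]  # minutes
hr = np.array([71, 72, 74, 73, 75, 74, 90, 76], dtype=float)        # heart rate

# WhiteKernel absorbs sensor noise/artifact; RBF models the smooth trend
gp = GaussianProcessRegressor(
    kernel=1.0 * RBF(length_scale=10.0) + WhiteKernel(noise_level=1.0),
    normalize_y=True)
gp.fit(t, hr)

mu, sd = gp.predict(t, return_std=True)
z = np.abs(hr - mu) / sd          # distance from the posterior mean, in sigmas
print(np.round(z, 1))             # the 90 bpm sample yields the largest z-score
```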
The Wrong Tool for the Job: Diabetes Public Health Programs and Practice Guidelines
López, Andrea; Black, Karen; Schillinger, Dean
2011-01-01
We surveyed state diabetes programs to determine whether they develop and disseminate diabetes guidelines. We found they largely disseminate clinical practice guidelines developed from subspecialty organizations, do not prioritize among the many recommendations contained in diabetes guidelines, and have not adapted guidelines to focus on population rather than individual health. An opportunity exists for state diabetes control programs to better align guidelines with public health goals. PMID:21852653
Word diffusion and climate science.
Bentley, R Alexander; Garnett, Philip; O'Brien, Michael J; Brock, William A
2012-01-01
As public and political debates often demonstrate, a substantial disjoint can exist between the findings of science and the impact it has on the public. Using climate-change science as a case example, we reconsider the role of scientists in the information-dissemination process, our hypothesis being that important keywords used in climate science follow "boom and bust" fashion cycles in public usage. Representing this public usage through extraordinary new data on word frequencies in books published up to the year 2008, we show that a classic two-parameter social-diffusion model closely fits the comings and goings of many keywords over generational or longer time scales. We suggest that the fashions of word usage contributes an empirical, possibly regular, correlate to the impact of climate science on society.
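A "classic two-parameter social-diffusion model" can be illustrated with the Bass model, one common choice; the keyword series below is synthetic, not the paper's book-frequency data.

```python
import numpy as np
from scipy.optimize import curve_fit

def bass_rate(t, p, q):
    """Adoption-rate form of the Bass diffusion model (rises, then falls)."""
    e = np.exp(-(p + q) * t)
    return (p + q) ** 2 / p * e / (1 + (q / p) * e) ** 2

years = np.arange(40, dtype=float)           # years since the word "took off"
rng = np.random.default_rng(2)
freq = bass_rate(years, 0.01, 0.35) * (1 + 0.1 * rng.normal(size=years.size))

(p, q), _ = curve_fit(bass_rate, years, freq, p0=(0.05, 0.2),
                      bounds=([1e-4, 1e-4], [1.0, 1.0]))
print(f"p = {p:.3f} (innovation), q = {q:.3f} (imitation)")
```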
Multiscale structure of time series revealed by the monotony spectrum.
Vamoş, Călin
2017-03-01
Observation of complex systems produces time series with specific dynamics at different time scales. The majority of the existing numerical methods for multiscale analysis first decompose the time series into several simpler components and the multiscale structure is given by the properties of their components. We present a numerical method which describes the multiscale structure of arbitrary time series without decomposing them. It is based on the monotony spectrum defined as the variation of the mean amplitude of the monotonic segments with respect to the mean local time scale during successive averagings of the time series, the local time scales being the durations of the monotonic segments. The maxima of the monotony spectrum indicate the time scales which dominate the variations of the time series. We show that the monotony spectrum can correctly analyze a diversity of artificial time series and can discriminate the existence of deterministic variations at large time scales from the random fluctuations. As an application we analyze the multifractal structure of some hydrological time series.
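The procedure as described can be sketched directly: split the series into monotonic segments, record mean segment amplitude against mean segment duration, smooth, and repeat. A compact rendering under those assumptions (the exact averaging kernel is a choice made here, not taken from the paper):

```python
import numpy as np

def monotonic_segments(x):
    """(start, end) index pairs of the maximal monotonic segments of x."""
    d = np.sign(np.diff(x))
    d[d == 0] = 1                                  # treat ties as continuing
    turns = np.flatnonzero(d[1:] != d[:-1]) + 1
    bounds = np.concatenate(([0], turns, [len(x) - 1]))
    return list(zip(bounds[:-1], bounds[1:]))

def monotony_spectrum(x, n_steps=30):
    """(mean local time scale, mean amplitude) after each averaging pass."""
    pts = []
    for _ in range(n_steps):
        segs = monotonic_segments(x)
        scale = np.mean([b - a for a, b in segs])
        amp = np.mean([abs(x[b] - x[a]) for a, b in segs])
        pts.append((scale, amp))
        x = np.convolve(x, [0.25, 0.5, 0.25], mode="valid")  # one averaging
    return np.array(pts)

# Noisy slow oscillation: averaging suppresses the short noisy segments and
# the spectrum moves toward the oscillation's own scale (roughly 250-sample
# monotonic runs) and amplitude (roughly 2).
t = np.arange(4000)
rng = np.random.default_rng(3)
x = np.sin(2 * np.pi * t / 500) + 0.3 * rng.normal(size=t.size)
print(monotony_spectrum(x)[[0, 9, 29]])
```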
ERIC Educational Resources Information Center
Behrman, Joanna
2017-01-01
Technologies such as electrical appliances entered American households on a large scale only after many decades of promotion to the public. The genre of "household physics" textbooks was one such form of promotion that was directed towards assumed white, female and largely middle-class home economics students. Published from the 1910s to…
Pharmacogenomic agreement between two cancer cell line data sets.
2015-12-03
Large cancer cell line collections broadly capture the genomic diversity of human cancers and provide valuable insight into anti-cancer drug response. Here we show substantial agreement and biological consilience between drug sensitivity measurements and their associated genomic predictors from two publicly available large-scale pharmacogenomics resources: The Cancer Cell Line Encyclopedia and the Genomics of Drug Sensitivity in Cancer databases.
The Snowmastodon Project: cutting-edge science on the blade of a bulldozer
Pigati, Jeffery S.; Miller, Ian M.; Johnson, Kirk R.
2015-01-01
Cutting-edge science happens at a variety of scales, from the individual and intimate to the large-scale and collaborative. The publication of a special issue of Quaternary Research in Nov. 2014 dedicated to the scientific findings of the “Snowmastodon Project” highlights what can be done when natural history museums, governmental agencies, and academic institutions work toward a common goal.
Uscher-Pines, Lori; Bookbinder, Sylvia H; Miro, Suzanne; Burke, Thomas
2007-01-01
Although public health agencies routinely operate hotlines to communicate key messages to the public, they are rarely evaluated to improve hotline management. Since its creation in 2003, the New Jersey Department of Health & Senior Services' Emergency Communications Center has confronted two large-scale incidents that have tested its capabilities in this area. The influenza vaccine shortage of 2004 and the April 2005 TOPOFF 3 full-scale bioterrorism exercise provided both real-life and simulated crisis situations from which to derive general insights into the strengths and weaknesses of hotline administration. This article identifies problems in the areas of staff and message management by analyzing call volume data and the qualitative observations of group feedback sessions and semistructured interviews with hotline staff. It also makes recommendations based on lessons learned to improve future hotline operations in public health emergencies.
Onshore industrial wind turbine locations for the United States up to March 2014
Diffendorfer, James E.; Kramer, Louisa; Ancona, Zachary H.; Garrity, Christopher P.
2015-01-01
Wind energy is a rapidly growing form of renewable energy in the United States. While summary information on the total amount of installed capacity is available by state, a free, centralized, national, turbine-level, geospatial dataset useful for scientific research, land and resource management, and other uses did not exist. Available in multiple formats and in a web application, these public domain data provide industrial-scale onshore wind turbine locations in the United States up to March 2014, corresponding facility information, and turbine technical specifications. Wind turbine records have been collected and compiled from various public sources, digitized or position verified from aerial imagery, and quality assured and quality controlled. Technical specifications for turbines were assigned based on the wind turbine make and model as described in public literature. In some cases, turbines were not seen in imagery, or turbine information did not exist or was difficult to obtain. Uncertainty associated with these is recorded in a confidence rating.
Supplementary Education: Global Growth, Japan's Experience, Canada's Future
ERIC Educational Resources Information Center
Dierkes, Julian
2008-01-01
Supplementary education is on the rise globally, taking many different forms, from private tutors to small schools and large corporations. These providers exist outside conventional public and private school systems, offering remedial education and tutoring, parallel instruction to conventional schools, and accelerated or more advanced…
Sze, Sing-Hoi; Parrott, Jonathan J; Tarone, Aaron M
2017-12-06
While the continued development of high-throughput sequencing has facilitated studies of entire transcriptomes in non-model organisms, the incorporation of an increasing number of RNA-Seq libraries has made de novo transcriptome assembly difficult. Although algorithms that can assemble large amounts of RNA-Seq data are available, they are generally very memory-intensive and can only be used to construct small assemblies. We develop a divide-and-conquer strategy that allows these algorithms to be utilized, by subdividing a large RNA-Seq data set into small libraries. Each individual library is assembled independently by an existing algorithm, and a merging algorithm is developed to combine these assemblies by picking a subset of high-quality transcripts to form a large transcriptome. Compared to existing algorithms that return a single assembly directly, this strategy achieves accuracy comparable to or better than that of memory-efficient algorithms that can process large amounts of RNA-Seq data, and accuracy comparable to or slightly below that of memory-intensive algorithms that can only construct small assemblies. Our divide-and-conquer strategy thus allows memory-intensive de novo transcriptome assembly algorithms to be used to construct large assemblies.
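The merging step lends itself to a compact illustration. The following Python sketch (hypothetical helper names; the published merging algorithm is more elaborate) keeps one highest-scoring transcript per cluster of near-identical sequences drawn from the independent sub-assemblies:

```python
# Minimal sketch of the divide-and-conquer merge, assuming the caller supplies
# a quality score and a clustering key (e.g., derived from sequence similarity).
def merge_assemblies(assemblies, quality, cluster_key):
    """assemblies: iterable of transcript lists, one per sub-library.
    quality: transcript -> numeric score (higher is better).
    cluster_key: transcript -> identifier of its near-identical cluster."""
    best = {}
    for assembly in assemblies:
        for transcript in assembly:
            key = cluster_key(transcript)
            # Retain only the highest-quality representative per cluster.
            if key not in best or quality(transcript) > quality(best[key]):
                best[key] = transcript
    return list(best.values())
```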
Devlin, Joseph C; Battaglia, Thomas; Blaser, Martin J; Ruggles, Kelly V
2018-06-25
Exploration of large data sets, such as shotgun metagenomic sequence or expression data, by biomedical experts and medical professionals remains a major bottleneck in the scientific discovery process. Although tools for this purpose exist for 16S ribosomal RNA sequencing analysis, there is a growing but still insufficient number of user-friendly interactive visualization workflows for easy data exploration and figure generation. The development of such platforms is necessary to accelerate and streamline microbiome laboratory research. We developed the Workflow Hub for Automated Metagenomic Exploration (WHAM!) as a web-based interactive tool capable of user-directed data visualization and statistical analysis of annotated shotgun metagenomic and metatranscriptomic data sets. WHAM! includes exploratory and hypothesis-based gene and taxa search modules for visualizing differences in microbial taxa and gene family expression across experimental groups, and for creating publication-quality figures without the need for a command line interface or in-house bioinformatics. WHAM! is an interactive and customizable tool for downstream metagenomic and metatranscriptomic analysis, providing a user-friendly interface that allows easy data exploration by microbiome and ecological experts to facilitate discovery in multi-dimensional and large-scale data sets.
Internet Resources for Radio Astronomy
NASA Astrophysics Data System (ADS)
Andernach, H.
A subjective overview of Internet resources for radio-astronomical information is presented. Basic observing techniques and their implications for the interpretation of publicly available radio data are described, followed by a discussion of existing radio surveys, their level of optical identification, and the nomenclature of radio sources. Various collections of source catalogues and databases for integrated radio source parameters are reviewed and compared, as well as the web interfaces for interrogating the current and ongoing large-area surveys. Links to radio observatories with archives of raw (uv-) data are presented, as well as services providing images, both of individual objects and of extracts ("cutouts") from large-scale surveys. While the emphasis is on radio continuum data, a brief list of sites providing spectral line data and atomic or molecular information is included. The major radio telescopes and surveys under construction or planning are outlined. A summary is given of a search for previously unknown optically bright radio sources, performed by students as an exercise using Internet resources only. Over 200 different links are mentioned and were verified, but despite the attempt to make this report up-to-date, it can only provide a snapshot of the situation as of mid-1998.
NASA Astrophysics Data System (ADS)
Hussain, M.; Chen, D.
2014-11-01
Buildings, the basic unit of an urban landscape, host most of its socio-economic activities and play an important role in the creation of urban land-use patterns. The spatial arrangement of different building types creates varied urban land-use clusters which can provide insight into the relationships between social, economic, and living spaces. The classification of such urban clusters can help in policy-making and resource management. In many countries, including the UK, no national-level cadastral database containing information on individual building types exists in the public domain. In this paper, we present a framework for inferring the functional types of buildings based on the analysis of their form (e.g., geometrical properties such as area and perimeter, and layout) and spatial relationships, drawing on a large topographic and address-based GIS database. Machine learning algorithms along with exploratory spatial analysis techniques are used to create the classification rules. The classification is extended to two further levels based on the functions (use) of buildings derived from address-based data. The developed methodology was applied to the Manchester metropolitan area using the Ordnance Survey's MasterMap®, a large-scale topographic and address-based dataset available for the UK.
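As a rough illustration of this kind of form-based inference (a sketch with invented features and labels, not the paper's pipeline, which works from MasterMap® data at much larger scale), a standard classifier can be trained on simple footprint metrics:

```python
# Hypothetical example: classify building footprints into functional types
# from simple form metrics. The feature columns are assumptions for illustration.
from sklearn.tree import DecisionTreeClassifier

# Each row: [area_m2, perimeter_m, compactness, neighbours_within_50m]
X_train = [
    [90, 40, 0.71, 12],    # terraced house
    [2400, 260, 0.45, 2],  # warehouse
    [160, 56, 0.64, 9],    # semi-detached house
    [1800, 220, 0.47, 3],  # factory unit
]
y_train = ["residential", "industrial", "residential", "industrial"]

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
print(clf.predict([[2000, 240, 0.43, 1]]))  # -> ['industrial']
```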
Angermeier, Paul L.; Frimpong, Emmanuel A.
2011-01-01
The need for integrated and widely accessible sources of species traits data to facilitate studies of ecology, conservation, and management has motivated development of traits databases for various taxa. In spite of the increasing number of traits-based analyses of freshwater fishes in the United States, no consolidated database of traits of this group exists publicly, and much useful information on these species is documented only in obscure sources. The largely inaccessible and unconsolidated traits information makes large-scale analysis involving many fishes and/or traits particularly challenging. We have compiled a database of > 100 traits for 809 (731 native and 78 nonnative) fish species found in freshwaters of the conterminous United States, including 37 native families and 145 native genera. The database, named Fish Traits, contains information on four major categories of traits: (1) trophic ecology; (2) body size, reproductive ecology, and life history; (3) habitat preferences; and (4) salinity and temperature tolerances. Information on geographic distribution and conservation status was also compiled. The database enhances many opportunities for conducting research on fish species traits and constitutes the first step toward establishing a central repository for a continually expanding set of traits of North American fishes.
Social learning in cooperative dilemmas
Lamba, Shakti
2014-01-01
Helping is a cornerstone of social organization and commonplace in human societies. A major challenge for the evolutionary sciences is to explain how cooperation is maintained in large populations with high levels of migration, conditions under which cooperators can be exploited by selfish individuals. Cultural group selection models posit that such large-scale cooperation evolves via selection acting on populations among which behavioural variation is maintained by the cultural transmission of cooperative norms. These models assume that individuals acquire cooperative strategies via social learning. This assumption remains empirically untested. Here, I test this by investigating whether individuals employ conformist or payoff-biased learning in public goods games conducted in 14 villages of a forager–horticulturist society, the Pahari Korwa of India. Individuals did not show a clear tendency to conform or to be payoff-biased and are highly variable in their use of social learning. This variation is partly explained by both individual and village characteristics. The tendency to conform decreases and to be payoff-biased increases as the value of the modal contribution increases. These findings suggest that the use of social learning in cooperative dilemmas is contingent on individuals' circumstances and environments, and question the existence of stably transmitted cultural norms of cooperation. PMID:24870041
Merlo, Lisa J.; Stone, Amanda M.; Bibbey, Alex
2013-01-01
This study aimed to develop and assess the psychometric properties of an English language measure of problematic mobile phone use. Participants were recruited from a university campus, health science center, and other public locations. The sample included 244 individuals (68.4% female) aged 18–75. Results supported a unidimensional factor structure for the 20-item self-report Problematic Use of Mobile Phones (PUMP) Scale. Internal consistency was excellent (α = 0.94). Strong correlations (r = .76, P < .001) were found between the PUMP Scale and an existing scale of cellular phone dependency that was validated in Asia, as well as items assessing frequency and intensity of mobile phone use. Results provide preliminary support for the use of the PUMP Scale to measure problematic use of mobile phones. PMID:24826371
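For reference, the internal-consistency statistic reported above (Cronbach's α = 0.94) is computed from the item-score matrix as follows. This is the generic formula, shown with simulated placeholder data rather than the study's responses:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)

# Simulated 244 respondents x 20 Likert items (random data, so alpha is near
# zero here; the PUMP Scale's actual responses yielded alpha = 0.94).
scores = np.random.default_rng(0).integers(1, 6, size=(244, 20))
print(round(cronbach_alpha(scores), 2))
```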
Neuroscience-related research in Ghana: a systematic evaluation of direction and capacity.
Quansah, Emmanuel; Karikari, Thomas K
2016-02-01
Neurological and neuropsychiatric diseases account for considerable healthcare, economic and social burdens in Ghana. In order to effectively address these burdens, appropriately-trained scientists who conduct high-impact neuroscience research will be needed. Additionally, research directions should be aligned with national research priorities. However, to provide information about current neuroscience research productivity and direction, the existing capacity and focus need to be identified. This would allow opportunities for collaborative research and training to be properly explored and developmental interventions to be better targeted. In this study, we sought to evaluate the existing capacity and direction of neuroscience-related research in Ghana. To do this, we examined publications reporting research investigations authored by scientists affiliated with Ghanaian institutions in specific areas of neuroscience over the last two decades (1995-2015). 127 articles that met our inclusion criteria were systematically evaluated in terms of research foci, annual publication trends and author affiliations. The most actively-researched areas identified include neurocognitive impairments in non-nervous system disorders, depression and suicide, epilepsy and seizures, neurological impact of substance misuse, and neurological disorders. These studies were mostly hospital and community-based surveys. About 60% of these articles were published in the last seven years, suggesting a recent increase in research productivity. However, data on experimental and clinical research outcomes were particularly lacking. We suggest that future investigations should focus on the following specific areas where information was lacking: large-scale disease epidemiology, effectiveness of diagnostic platforms and therapeutic treatments, and the genetic, genomic and molecular bases of diseases.
ERIC Educational Resources Information Center
Shim, Eunjae; Shim, Minsuk K.; Felner, Robert D.
Automation of the survey process has proved successful in many industries, yet it is still underused in educational research. This is largely due to two facts: (1) number crunching is usually carried out using software that was developed before modern information technology existed, and (2) educational research is to a great extent trapped…
Physical habitat monitoring strategy (PHAMS) for reach-scale restoration effectiveness monitoring
Jones, Krista L.; O'Daniel, Scott J.; Beechie, Tim J.; Zakrajsek, John; Webster, John G.
2015-04-14
Habitat restoration efforts by the Confederated Tribes of the Umatilla Indian Reservation (CTUIR) have shifted from the site scale (1-10 meters) to the reach scale (100-1,000 meters). This shift was in response to the growing scientific emphasis on process-based restoration and to support from the 2007 Accords Agreement with the Bonneville Power Administration. With the increased size of restoration projects, the CTUIR and other agencies need applicable monitoring methods for assessing large-scale changes in river and floodplain habitats following restoration. The goal of the Physical Habitat Monitoring Strategy is to outline methods that are useful for capturing reach-scale changes in surface and groundwater hydrology, geomorphology, hydrologic connectivity, and riparian vegetation at restoration projects. The Physical Habitat Monitoring Strategy aims to avoid duplication with existing regional effectiveness monitoring protocols by identifying complementary reach-scale metrics and methods that may improve the ability of the CTUIR and others to detect instream and riparian changes at large restoration projects.
Scales of Heterogeneities in the Continental Crust and Upper Mantle
NASA Astrophysics Data System (ADS)
Tittgemeyer, M.; Wenzel, F.; Ryberg, T.; Fuchs, K.
1999-09-01
A seismological characterization of the crust and upper mantle can refer to large-scale averages of seismic velocities or to fluctuations of elastic parameters. Large is understood here relative to the wavelength used to probe the earth. In this paper we try to characterize the crust and upper mantle by the fluctuations in media properties rather than by their average velocities. As such it becomes evident that different scales of heterogeneities prevail in different layers of the crust and mantle. Although we cannot provide final models or an explanation of why these different scales exist, we believe that the scales of inhomogeneities carry significant information regarding the tectonic processes that have affected the lower crust, the lithospheric and the sublithospheric upper mantle. We focus on four different types of small-scale inhomogeneities: (1) the characteristics of the lower crust, (2) velocity fluctuations in the uppermost mantle, (3) scattering in the lowermost lithosphere, and (4) heterogeneities in the mantle transition zone.
Akita, Yasuyuki; Baldasano, Jose M; Beelen, Rob; Cirach, Marta; de Hoogh, Kees; Hoek, Gerard; Nieuwenhuijsen, Mark; Serre, Marc L; de Nazelle, Audrey
2014-04-15
In recognition that intraurban exposure gradients may be as large as between-city variations, recent air pollution epidemiologic studies have become increasingly interested in capturing within-city exposure gradients. In addition, because of the rapidly accumulating health data, recent studies also need to handle large study populations distributed over large geographic domains. Even though several modeling approaches have been introduced, a consistent modeling framework that captures within-city exposure variability and is applicable to large geographic domains is still missing. To address these needs, we proposed a modeling framework based on the Bayesian Maximum Entropy method that integrates monitoring data and outputs from existing air quality models based on Land Use Regression (LUR) and Chemical Transport Models (CTM). The framework was applied to estimate the yearly average NO2 concentrations over the region of Catalunya in Spain. By jointly accounting for global-scale variability in concentration through the CTM output and intraurban-scale variability through the LUR output, the proposed framework outperformed more conventional approaches.
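To make the data-fusion intuition concrete, here is a drastically simplified stand-in, plain inverse-variance weighting in Python, not the Bayesian Maximum Entropy method itself; all numbers are invented. A coarse CTM estimate and a fine-scale LUR estimate are combined according to their assumed error variances:

```python
import numpy as np

def fuse_estimates(ctm_no2, lur_no2, ctm_var, lur_var):
    """Precision-weighted average of two NO2 estimates (ug/m3).
    A toy analogue of combining global-scale CTM output with
    intraurban-scale LUR output; NOT the BME method itself."""
    w_ctm, w_lur = 1.0 / ctm_var, 1.0 / lur_var
    return (w_ctm * ctm_no2 + w_lur * lur_no2) / (w_ctm + w_lur)

# The fused value leans toward the lower-variance (here, LUR) estimate.
print(fuse_estimates(np.array([28.0]), np.array([35.0]), 16.0, 4.0))  # [33.6]
```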
Estimation of regional-scale groundwater flow properties in the Bengal Basin of India and Bangladesh
Michael, H.A.; Voss, C.I.
2009-01-01
Quantitative evaluation of management strategies for the long-term supply of safe drinking groundwater from the Bengal Basin aquifer (India and Bangladesh) requires estimation of the large-scale hydrogeologic properties that control flow. The Basin consists of a stratified, heterogeneous sequence of sediments with aquitards that may separate aquifers locally, but evidence does not support the existence of regional confining units. Considered at a large scale, the Basin may be aptly described as a single aquifer with higher horizontal than vertical hydraulic conductivity. Though data are sparse, estimation of regional-scale aquifer properties is possible from three existing data types: hydraulic heads, 14C concentrations, and driller logs. Estimation is carried out with inverse groundwater modeling using measured heads, by model calibration using estimated water ages based on 14C, and by statistical analysis of driller logs. Similar estimates of hydraulic conductivities result from all three data types; a resulting typical value of vertical anisotropy (ratio of horizontal to vertical conductivity) is 10^4. The vertical anisotropy estimate is supported by simulation of flow through geostatistical fields consistent with driller log data. The high estimated value of vertical anisotropy in hydraulic conductivity indicates that even disconnected aquitards, if numerous, can strongly control the equivalent hydraulic parameters of an aquifer system. © US Government 2009.
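The closing point, that many thin aquitards dominate the equivalent vertical conductivity, follows from the textbook averaging rules for layered media: the equivalent horizontal conductivity is the thickness-weighted arithmetic mean of the layer conductivities, while the vertical one is the harmonic mean. A short sketch (invented layer values, not the Basin data):

```python
import numpy as np

def equivalent_conductivities(k, thickness):
    """Equivalent horizontal (arithmetic-mean) and vertical (harmonic-mean)
    hydraulic conductivity of a horizontally layered sequence."""
    k = np.asarray(k, dtype=float)
    thickness = np.asarray(thickness, dtype=float)
    kh = np.sum(k * thickness) / thickness.sum()   # flow along layers
    kv = thickness.sum() / np.sum(thickness / k)   # flow across layers
    return kh, kv

# 50 sand layers (10 m/d, 10 m thick) interleaved with thin clay aquitards
# (1e-4 m/d, 0.5 m thick): the thin clays control the vertical direction.
kh, kv = equivalent_conductivities([10, 1e-4] * 50, [10, 0.5] * 50)
print(f"anisotropy ratio kh/kv ~ {kh / kv:.0f}")  # on the order of 10^3-10^4
```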
NASA Astrophysics Data System (ADS)
Ji, H.; Bhattacharjee, A.; Prager, S.; Daughton, W. S.; Bale, S. D.; Carter, T. A.; Crocker, N.; Drake, J. F.; Egedal, J.; Sarff, J.; Wallace, J.; Belova, E.; Ellis, R.; Fox, W. R., II; Heitzenroeder, P.; Kalish, M.; Jara-Almonte, J.; Myers, C. E.; Que, W.; Ren, Y.; Titus, P.; Yamada, M.; Yoo, J.
2014-12-01
A new intermediate-scale plasma experiment, called the Facility for Laboratory Reconnection Experiments or FLARE, is under construction at Princeton as a joint project by five universities and two national labs to study magnetic reconnection in regimes directly relevant to space, solar and astrophysical plasmas. The currently existing small-scale experiments have focused on the single X-line reconnection process in plasmas either with small effective sizes or at low Lundquist numbers, both of which are typically very large in natural plasmas. The new regimes involve multiple X-lines as guided by a reconnection "phase diagram", in which different coupling mechanisms from the global system scale to the local dissipation scale are classified into different reconnection phases [H. Ji & W. Daughton, Phys. Plasmas 18, 111207 (2011)]. The design of the FLARE device is based on the existing Magnetic Reconnection Experiment (MRX) at Princeton (http://mrx.pppl.gov) and is intended to provide experimental access to the new phases involving multiple X-lines at large effective sizes and high Lundquist numbers, directly relevant to space and solar plasmas. The motivating major physics questions, the construction status, and the planned collaborative research, especially with the space and solar research communities, will be discussed.
NASA Astrophysics Data System (ADS)
Ji, Hantao; Bhattacharjee, A.; Prager, S.; Daughton, W.; Bale, Stuart D.; Carter, T.; Crocker, N.; Drake, J.; Egedal, J.; Sarff, J.; Fox, W.; Jara-Almonte, J.; Myers, C.; Ren, Y.; Yamada, M.; Yoo, J.
2015-04-01
A new intermediate-scale plasma experiment, called the Facility for Laboratory Reconnection Experiments or FLARE (flare.pppl.gov), is under construction at Princeton as a joint project by five universities and two national labs to study magnetic reconnection in regimes directly relevant to heliophysical and astrophysical plasmas. The currently existing small-scale experiments have focused on the single X-line reconnection process in plasmas either with small effective sizes or at low Lundquist numbers, both of which are typically very large in natural plasmas. The new regimes involve multiple X-lines as guided by a reconnection "phase diagram", in which different coupling mechanisms from the global system scale to the local dissipation scale are classified into different reconnection phases [H. Ji & W. Daughton, Phys. Plasmas 18, 111207 (2011)]. The design of the FLARE device is based on the existing Magnetic Reconnection Experiment (MRX) (mrx.pppl.gov) and is intended to provide experimental access to the new phases involving multiple X-lines at large effective sizes and high Lundquist numbers, directly relevant to magnetospheric, solar wind, and solar coronal plasmas. After a brief summary of recent laboratory results on the topic of magnetic reconnection, the motivating major physics questions, the construction status, and the planned collaborative research, especially with the heliophysics communities, will be discussed.
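For readers outside the field, the Lundquist number invoked in both FLARE abstracts is the standard dimensionless ratio of the resistive diffusion time to the Alfvén crossing time (textbook definition, not taken from the abstracts themselves):

```latex
S = \frac{L\, v_A}{\eta}, \qquad v_A = \frac{B}{\sqrt{\mu_0 \rho}},
```

where \(L\) is the system size, \(v_A\) the Alfvén speed, \(\eta\) the magnetic diffusivity, \(B\) the magnetic field strength, and \(\rho\) the mass density. Natural plasmas routinely reach \(S \gg 10^6\), far beyond what existing small-scale experiments achieve, which is the gap FLARE is designed to narrow.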
Thermal non-equilibrium effect of small-scale structures in compressible turbulence
NASA Astrophysics Data System (ADS)
Li, Shi-Yi; Li, Qi-Bing
2018-05-01
The thermal non-equilibrium effect of the small-scale structures in canonical two-dimensional turbulence is studied. Comparative studies of the Unified Gas Kinetic Scheme (UGKS) and GKS-Navier-Stokes (NS) for Taylor-Green flow with initial Ma = 1, Kn = 0.01, and for decaying isotropic turbulence with initial Ma_t = 1, Reλ = 20, show that a discrepancy exists at both small and large scales, extending beyond the dissipation range to 10η and reaching 8% in the SGS energy transfer of the decaying isotropic turbulence. This illustrates the necessity of resolving the kinetic scales even at a moderate Reλ = 20.
Micron-scale lens array having diffracting structures
Goldberg, Kenneth A
2013-10-29
A novel micron-scale lens, a microlens, is engineered to concentrate light efficiently onto an area of interest, such as a small, light-sensitive detector element in an integrated electronic device. Existing microlens designs imitate the form of large-scale lenses and are less effective at small sizes. The microlenses described herein have been designed to accommodate diffraction effects, which dominate the behavior of light at small length scales. Thus a new class of light-concentrating optical elements with much higher relative performance has been created. Furthermore, the new designs are much easier to fabricate than previous designs.
Resource Management for Distributed Parallel Systems
NASA Technical Reports Server (NTRS)
Neuman, B. Clifford; Rao, Santosh
1993-01-01
Multiprocessor systems should exist in the larger context of distributed systems, allowing multiprocessor resources to be shared by those that need them. Unfortunately, typical multiprocessor resource management techniques do not scale to large networks. The Prospero Resource Manager (PRM) is a scalable resource allocation system that supports the allocation of processing resources in large networks and multiprocessor systems. To manage resources in such distributed parallel systems, PRM employs three types of managers: system managers, job managers, and node managers. Multiple independent instances of each type of manager exist, reducing bottlenecks. The complexity of each manager is further reduced because each is designed to utilize information at an appropriate level of abstraction.
The XMM Large Scale Structure Survey
NASA Astrophysics Data System (ADS)
Pierre, Marguerite
2005-10-01
We propose to complete, by an additional 5 deg², the XMM-LSS Survey region overlying the Spitzer/SWIRE field. This field already has CFHTLS and Integral coverage, and will encompass about 10 deg². The resulting multi-wavelength medium-depth survey, which complements XMM and Chandra deep surveys, will provide a unique view of large-scale structure over a wide range of redshift, and will show active galaxies in the full range of environments. The complete coverage by optical and IR surveys provides high-quality photometric redshifts, so that cosmological results can quickly be extracted. In the spirit of a Legacy survey, we will make the raw X-ray data immediately public. Multi-band catalogues and images will also be made available on short time scales.
NASA Astrophysics Data System (ADS)
Khandelwal, A.; Karpatne, A.; Kumar, V.
2017-12-01
In this paper, we present novel methods for producing surface water maps at 30-meter spatial resolution and daily temporal resolution. These methods use MODIS spectral data from Terra (available daily since 2000) to produce daily maps at 250-meter and 500-meter resolution, and then refine them using the relative elevation ordering of pixels at 30-meter resolution. The key component of these methods is the use of the elevation structure (relative elevation ordering) of a water body. Elevation structure is not explicitly available at the desired resolution for most water bodies in the world, and hence it is estimated using our previous work, which exploits the history of imperfect labels. We present a new technique that uses elevation structure (unlike existing pixel-based methods) to enforce temporal consistency in surface water extents, since lake area on nearby dates is likely to be very similar. This greatly improves the quality of the MODIS-scale land/water labels, because daily MODIS data can contain a large amount of missing or poor-quality data due to clouds and other factors. The quality of these maps is further improved using an elevation-based resolution-refinement approach that draws on elevation structure estimated at Landsat scale. Under the assumption that elevation structure does not change over time, it provides a very effective way to transfer information between datasets even when they are not observed concurrently. In this work, we derive elevation structure at Landsat scale from monthly water extent maps spanning 1984-2015, publicly available through a joint effort of Google Earth Engine and the European Commission's Joint Research Centre (JRC). This elevation structure is then used to refine the spatial resolution of the MODIS-scale maps from 2000 onwards. We present an analysis of these methods on a large and diverse set of water bodies across the world.
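The elevation-ordering refinement admits a compact sketch (hypothetical function and array shapes, not the authors' implementation): given each fine pixel's relative elevation rank and the water fraction observed at coarse resolution, label the lowest-lying pixels as water.

```python
import numpy as np

def refine_water_extent(elevation_rank, coarse_water_fraction):
    """Label the lowest-elevation fine pixels as water so the flooded area
    matches the coarse-resolution water fraction (ties may slightly enlarge
    the mask). elevation_rank: 2-D array of relative elevation order."""
    n_water = int(round(coarse_water_fraction * elevation_rank.size))
    if n_water == 0:
        return np.zeros_like(elevation_rank, dtype=bool)
    threshold = np.sort(elevation_rank, axis=None)[n_water - 1]
    return elevation_rank <= threshold

rank = np.array([[3, 1],
                 [2, 4]])
print(refine_water_extent(rank, 0.5))  # the two lowest pixels become water
```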
Predicting protein functions from redundancies in large-scale protein interaction networks
NASA Technical Reports Server (NTRS)
Samanta, Manoj Pratim; Liang, Shoudan
2003-01-01
Interpreting data from large-scale protein interaction experiments has been a challenging task because of the widespread presence of random false positives. Here, we present a network-based statistical algorithm that overcomes this difficulty and allows us to derive the functions of unannotated proteins from large-scale interaction data. Our algorithm uses the insight that if two proteins share a significantly larger number of common interaction partners than expected at random, they have close functional associations. Analysis of publicly available data from Saccharomyces cerevisiae reveals >2,800 reliable functional associations, 29% of which involve at least one unannotated protein. By further analyzing these associations, we derive tentative functions for 81 unannotated proteins with high certainty. Our method is not overly sensitive to the false positives present in the data. Even after adding 50% randomly generated interactions to the measured data set, we are able to recover almost all (approximately 89%) of the original associations.
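The shared-partner test at the heart of this algorithm can be sketched with a hypergeometric tail probability (a minimal reimplementation sketch, not the authors' code; the function name is hypothetical):

```python
from scipy.stats import hypergeom

def association_pvalue(partners_a, partners_b, n_proteins):
    """P(two proteins share at least the observed number of interaction
    partners by chance), given their partner sets and the network size."""
    shared = len(partners_a & partners_b)
    # hypergeom.sf(k-1, M, n, N) = P(X >= k) for population M, n marked, N drawn.
    return hypergeom.sf(shared - 1, n_proteins, len(partners_a), len(partners_b))

# Two proteins with 20 and 30 partners in a 6,000-protein network, sharing 8:
print(association_pvalue(set(range(20)), set(range(12, 42)), 6000))  # tiny p
```

Pairs whose p-value falls below a stringent threshold are treated as functionally associated, which is how functions propagate from annotated to unannotated proteins.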
NASA Technical Reports Server (NTRS)
Morgan, R. P.; Singh, J. P.; Rothenberg, D.; Robinson, B. E.
1975-01-01
The needs to be served, the subsectors in which the system might be used, the technology employed, and the prospects for future utilization of an educational telecommunications delivery system are described and analyzed. Educational subsectors are analyzed with emphasis on the current status and trends within each subsector. Issues which affect future development, and prospects for future use of media, technology, and large-scale electronic delivery within each subsector are included. Information on technology utilization is presented. Educational telecommunications services are identified and grouped into categories: public television and radio, instructional television, computer aided instruction, computer resource sharing, and information resource sharing. Technology based services, their current utilization, and factors which affect future development are stressed. The role of communications satellites in providing these services is discussed. Efforts to analyze and estimate future utilization of large-scale educational telecommunications are summarized. Factors which affect future utilization are identified. Conclusions are presented.
Fat-soluble vitamins as disease modulators in multiple sclerosis.
Torkildsen, Ø; Løken-Amsrud, K I; Wergeland, S; Myhr, K-M; Holmøy, T
2013-01-01
Fat-soluble vitamins (A, D, E and K) have properties that could be relevant as modulators of disease activity in multiple sclerosis (MS). We performed a systematic search on PubMed and Medline up to May 2012, using the search strings 'vitamin A', 'retinol', 'retinal', 'carotenoids', 'vitamin D', 'vitamin E', 'alpha-tocopherol' and 'vitamin K' in conjunction with 'multiple sclerosis', 'animal model' and 'experimental autoimmune encephalitis (EAE)'. In addition, the reference lists of the publications identified were examined for further citations of relevance. There is comprehensive evidence from epidemiological, observational, and experimental studies that vitamin D may be beneficial in MS. Results from small-scale clinical studies are inconclusive, and large-scale, adequately powered, randomized, controlled trials are still lacking. For vitamin D, Oxford Centre for Evidence-Based Medicine level 2c evidence exists for a positive therapeutic effect. Evidence from animal models indicates that all the examined fat-soluble vitamins could have potential as modulators of disease activity in MS. For vitamins A and E, level 4 and 5 evidence exists for a modulatory effect in MS; for vitamin K, too few studies have been conducted to indicate an effect in humans. Vitamin D is a promising candidate as a modulator of disease activity in MS, and controlled studies are currently being conducted. All the fat-soluble vitamins have, however, been demonstrated to be effective in different animal models of the disease, and vitamins A and E have biological properties that could be relevant to MS pathogenesis. Thus, vitamins A and E seem to be promising candidates for future case-control and cohort studies. © 2012 John Wiley & Sons A/S.
Gray, Kathleen
2016-01-01
Health informatics has a major role to play in optimising the management and use of data, information and knowledge in health systems. As health systems undergo digital transformation, it is important to consider informatics approaches not only to curriculum content but also to the design of learning environments and learning activities for health professional learning and development. An example of such an informatics approach is the use of large-scale, integrated public health platforms on the Internet as part of health professional learning and development. This article describes selected examples of such platforms, with a focus on how they may influence the direction of health professional learning and development. Significance for public health: The landscape of healthcare systems, public health systems, health research systems and professional education systems is fragmented, with many gaps and silos. More sophistication in the management of health data, information, and knowledge, based on public health informatics expertise, is needed to tackle key issues of prevention, promotion and policy-making. Platform technologies represent an emerging large-scale, highly integrated informatics approach to public health, combining the technologies of the Internet, the web, the cloud, social technologies, remote sensing and/or mobile apps into an online infrastructure that can allow more synergies in work within and across these systems. Health professional curricula need updating so that the health workforce has a deep and critical understanding of the way that platform technologies are becoming the foundation of the health sector. PMID:27190977
The NASA Technical Report Server
NASA Technical Reports Server (NTRS)
Nelson, Michael L.; Gottlich, Gretchen L.; Bianco, David J.; Paulson, Sharon S.; Binkley, Robert L.; Kellogg, Yvonne D.; Beaumont, Chris J.; Schmunk, Robert B.; Kurtz, Michael J.; Accomazzi, Alberto
1995-01-01
The National Aeronautics and Space Act of 1958 established NASA and charged it to "provide for the widest practicable and appropriate dissemination of information concerning its activities and the results thereof." The search for innovative methods to distribute NASA's information led a grass-roots team to create the NASA Technical Report Server (NTRS), which uses the World Wide Web and other popular Internet-based information systems as search engines. The NTRS is an inter-center effort which provides uniform access to various distributed publication servers residing on the Internet. Users have immediate desktop access to technical publications from NASA centers and institutes. The NTRS comprises several units, some constructed especially for inclusion in NTRS, and others that are existing NASA publication services that NTRS reuses. This paper presents the NTRS architecture, usage metrics, and the lessons learned while implementing and maintaining the service. The NTRS is largely constructed with freely available software running on existing hardware, and the resulting additional exposure for the body of literature it contains ensures that NASA's institutional knowledge base will continue to receive the widest practicable and appropriate dissemination.
Ice Shape Scaling for Aircraft in SLD Conditions
NASA Technical Reports Server (NTRS)
Anderson, David N.; Tsao, Jen-Ching
2008-01-01
This paper summarizes recent NASA research into the scaling of SLD conditions, drawing on data from both SLD and Appendix C tests. Scaling results obtained by applying existing scaling methods for size and test-condition scaling are reviewed, and large-feather-growth issues, including scaling approaches, are discussed briefly. The material applies only to unprotected, unswept geometries. Within the limits of the conditions tested to date, the results show that the similarity parameters needed for Appendix C scaling can also be used for SLD scaling, and no additional parameters are required. These results were based on visual comparisons of reference and scale ice shapes. Nearly all of the experimental results presented were obtained in sea-level tunnels. The currently recommended methods to scale model size, icing limit, and test conditions are described.
What initial condition of inflation would suppress the large-scale CMB spectrum?
Chen, Pisin; Lin, Yu -Hsiang
2016-01-08
There is an apparent power deficit relative to the ΛCDM prediction of the cosmic microwave background spectrum at large scales, which, though not yet statistically significant, persists from WMAP to Planck data. Proposals that invoke some form of initial condition for the inflation have been made to address this apparent power suppression, albeit with conflicting conclusions. By studying the curvature perturbations of a scalar field in the Friedmann-Lemaître-Robertson-Walker universe parameterized by the equation of state parameter w, we find that the large-scale spectrum at the end of inflation reflects the superhorizon spectrum of the initial state. The large-scale spectrum is suppressed if the universe begins with the adiabatic vacuum in a superinflation (w < -1) or positive-pressure (w > 0) era. In the latter case, there is however no causal mechanism to establish the initial adiabatic vacuum. On the other hand, as long as the universe begins with the adiabatic vacuum in an era with -1 < w < 0, even if there exists an intermediate positive-pressure era, the large-scale spectrum would be enhanced rather than suppressed. Finally, we calculate the spectrum of a two-stage inflation model with a two-field potential and show that the result agrees with that obtained from the ad hoc single-field analysis.
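For orientation, the w-parameterization above rests on the standard FLRW background relations (textbook results, not taken from the paper itself): the scale factor and comoving Hubble radius evolve as

```latex
a(t) \propto t^{\,2/[3(1+w)]} \quad (w \neq -1), \qquad
(aH)^{-1} \propto a^{(1+3w)/2},
```

so the comoving Hubble radius shrinks for w < -1/3 (modes exit the horizon, as during inflation) and grows for w > -1/3, which is why the pre-inflationary equation of state leaves an imprint on the superhorizon spectrum.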
Digital Scholarship and Open Access
ERIC Educational Resources Information Center
Losoff, Barbara; Pence, Harry E.
2010-01-01
Open access publications provide scholars with unrestricted access to the "conversation" that is the basis for the advancement of knowledge. The large number of open access journals, archives, and depositories already in existence demonstrates the technical and economic viability of providing unrestricted access to the literature that is the…
Clickstream data yields high-resolution maps of science.
Bollen, Johan; Van de Sompel, Herbert; Hagberg, Aric; Bettencourt, Luis; Chute, Ryan; Rodriguez, Marko A; Balakireva, Lyudmila
2009-01-01
Intricate maps of science have been created from citation data to visualize the structure of scientific activity. However, most scientific publications are now accessed online. Scholarly web portals record detailed log data at a scale that exceeds the number of all existing citations combined. Such log data is recorded immediately upon publication and keeps track of the sequences of user requests (clickstreams) that are issued by a variety of users across many different domains. Given these advantages of log datasets over citation data, we investigate whether they can produce high-resolution, more current maps of science. Over the course of 2007 and 2008, we collected nearly 1 billion user interactions recorded by the scholarly web portals of some of the most significant publishers, aggregators and institutional consortia. The resulting reference data set covers a significant part of world-wide use of scholarly web portals in 2006, and provides a balanced coverage of the humanities, social sciences, and natural sciences. A journal clickstream model, i.e. a first-order Markov chain, was extracted from the sequences of user interactions in the logs. The clickstream model was validated by comparing it to the Getty Research Institute's Architecture and Art Thesaurus. The resulting model was visualized as a journal network that outlines the relationships between various scientific domains and clarifies the connection of the social sciences and humanities to the natural sciences. Maps of science resulting from large-scale clickstream data provide a detailed, contemporary view of scientific activity and correct the underrepresentation of the social sciences and humanities that is commonly found in citation data.
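The journal clickstream model named above, a first-order Markov chain, can be sketched in a few lines (a hypothetical reimplementation; the study's pipeline additionally cleans and sessionizes roughly a billion log events):

```python
from collections import defaultdict

def clickstream_markov(sessions):
    """Estimate first-order Markov transition probabilities between journals
    from ordered per-session click sequences."""
    counts = defaultdict(lambda: defaultdict(int))
    for session in sessions:
        for src, dst in zip(session, session[1:]):
            counts[src][dst] += 1
    # Row-normalize the transition counts into probabilities.
    return {src: {dst: n / sum(dsts.values()) for dst, n in dsts.items()}
            for src, dsts in counts.items()}

sessions = [["J. Neurosci.", "Nature", "J. Neurosci."],
            ["Nature", "Science"]]
print(clickstream_markov(sessions))  # e.g. P(Science | Nature) = 0.5
```

Visualizing the resulting transition network as a graph is what yields the "map of science" the abstract describes.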
Secure and interoperable communication infrastructures for PPDR organisations
NASA Astrophysics Data System (ADS)
Müller, Wilmuth; Marques, Hugo; Pereira, Luis; Rodriguez, Jonathan; Brouwer, Frank; Bouwers, Bert; Politis, Ilias; Lykourgiotis, Asimakis; Ladas, Alexandros; Adigun, Olayinka; Jelenc, David
2016-05-01
The growing number of events affecting public safety and security (PS&S) on a regional scale with potential to grow up to large scale cross border disasters puts an increased pressure on agencies and organisation responsible for PS&S. In order to respond timely and in an adequate manner to such events, Public Protection and Disaster Relief (PPDR) organisations need to cooperate, align their procedures and activities, share the needed information and be interoperable. Existing PPDR/PMR technologies such as TETRA, TETRAPOL or P25, do not currently provide broadband capability nor is expected such technologies to be upgraded in the future. This presents a major limitation in supporting new services and information flows. Furthermore, there is no known standard that addresses interoperability of these technologies. In this contribution the design of a next generation communication infrastructure for PPDR organisations which fulfills the requirements of secure and seamless end-to-end communication and interoperable information exchange within the deployed communication networks is presented. Based on Enterprise Architecture of PPDR organisations, a next generation PPDR network that is backward compatible with legacy communication technologies is designed and implemented, capable of providing security, privacy, seamless mobility, QoS and reliability support for mission-critical Private Mobile Radio (PMR) voice and broadband data services. The designed solution provides a robust, reliable, and secure mobile broadband communications system for a wide variety of PMR applications and services on PPDR broadband networks, including the ability of inter-system, interagency and cross-border operations with emphasis on interoperability between users in PMR and LTE.
Restoring the Power Projection Capabilities of the U.S. Armed Forces
2017-02-16
Testimony presented before the Armed Services Committee on February 16, 2017. For more information on this publication, visit www.rand.org/pubs/testimonies/CT464.html. …planning prior to Russia's attacks on Ukraine did not take account of the need to deter large-scale aggression against the North Atlantic Treaty…
ERIC Educational Resources Information Center
O'Neil, Bryan L.
The purpose of the practicum was to determine the societal factors existing in the accounting industry and accounting education, with the aim of integrating the changing regulations and environment of the industry into the classroom at Castleton State College (Vermont). A group of certified public accountants were surveyed by Likert scale to learn…
Institutional obstacles to expansion of world food production.
Crosson, P R
1975-05-09
It was argued that over the near-to-medium term (roughly to the mid-1980s) there is enough potential for growth in existing Green Revolution technology and in the technical capacity of farmers that institutions affecting these two sources of increased food production probably will not be seriously constraining. The principal bottlenecks will likely be found among those institutions affecting farmers' incentives to innovate. There is impressive evidence that when other conditions for innovation are favorable, the supply of marketing services, for both inputs and outputs, is quite elastic. This seems to include the supply of funds from rural saving and informal credit sources, although the evidence is less clear in this respect.

The situation concerning price relations and availability of inputs appears mixed. If national income growth targets are achieved, then the growth in total demand for food in the LDC's should be fast enough to support incentive prices for farmers. This advantage could be lost, however, if governments adopt policies to suppress food prices to keep down the cost of living. The price of fertilizers is expected to fall from the high levels of 1974, the amount of the fall depending in good measure on the success of the LDC's in increasing fertilizer production. Historically, their efforts to expand capacity have been relatively inefficient. Moreover, many countries still lack adequate capacity to produce the HYV's and pesticides. Even with good progress in expanding domestic production of inputs, imports will continue to be an important source of supply. Maintenance of present high prices of petroleum products could be a major obstacle to financing these imports on the necessary scale because of the drain it would place on available foreign exchange. I conclude, on balance, that prices and availability of fertilizers, pesticides, and seeds could have important negative effects on farmers' incentives to adopt Green Revolution technology. Rigidities in water management institutions may be even more limiting, for reasons noted in the previous section.

The role of existing land tenure institutions is not clear. The tentative conclusion, however, is that over the near-to-medium term their maintenance will not be a major obstacle to the further spread of the Green Revolution. Over the longer term, it could become more seriously limiting. The reason is that continued expansion of food production will eventually require the invention and adoption of new technologies and a higher level of technical and managerial skill than most farmers in the LDC's now possess. This will require substantial investments in domestic research and extension institutions and in rural education. In countries where a small class of large landowners wields substantial political power, these investments may not occur on the necessary scale, because the large farmers have their own means of acquiring the technology and little perceived interest in supporting the upgrading of the skills of small farmers.

This review of institutional obstacles to expansion of food production in the LDC's must end on a tentative note. The review does suggest some observations about the process of institutional change, however. There is impressive evidence of strong latent potential in the private sector of the LDC's for mobilizing the resources and effort needed for agricultural progress when the private economic rewards for doing so are high.
Under these circumstances, needed changes in the institutions required to mobilize the resources and direct the effort seem relatively easy to achieve. Institutional resistance is stronger in situations where influential interests perceive change as a threat, or where there is no direct personal economic reward to change, as in the typical public institution. The latter point is particularly important because the performance of public institutions is critical. Development of new technology, the fundamental condition for continued long-term growth, is basically a public responsibility, because the gains from adoption usually cannot be sufficiently captured by private institutions to justify their assuming the cost. Although private firms often have incentives to impart technical knowledge to farmers as a way of widening the market for their products, the broadening and strengthening of the institutional structures concerned with both the general and technical education of farmers is a public responsibility. This is true also of the development of large irrigation systems, both because of the scale of the needed investments and because of the potential for social conflict in water management. The lack of a well-defined mechanism linking the responses of public institutions to the large social payoffs of increased public investment in irrigation, new technology, and the technical abilities of farmers may prove in the long run to be the most important single obstacle to adequate growth of food production in the LDC's.