Sample records for bringing large-scale multiple

  1. The feasibility of using 'bring your own device' (BYOD) technology for electronic data capture in multicentre medical audit and research.

    PubMed

    Faulds, M C; Bauchmuller, K; Miller, D; Rosser, J H; Shuker, K; Wrench, I; Wilson, P; Mills, G H

    2016-01-01

    Large-scale audit and research projects demand robust, efficient systems for accurate data collection, handling and analysis. We utilised a multiplatform 'bring your own device' (BYOD) electronic data collection app to capture observational audit data on theatre efficiency across seven hospital Trusts in South Yorkshire in June-August 2013. None of the participating hospitals had a dedicated information governance policy for bring your own device. Data were collected by 17 investigators for 392 individual theatre lists, capturing 14,148 individual data points, 12,852 (91%) of which were transmitted to a central database on the day of collection without any loss of data. BYOD technology enabled accurate collection of a large volume of secure data across multiple NHS organisations over a short period of time. Bring your own device technology provides a method for collecting real-time audit, research and quality improvement data within healthcare systems without compromising patient data protection. © 2015 The Association of Anaesthetists of Great Britain and Ireland.

  2. Multi-color electron microscopy by element-guided identification of cells, organelles and molecules.

    PubMed

    Scotuzzi, Marijke; Kuipers, Jeroen; Wensveen, Dasha I; de Boer, Pascal; Hagen, Kees C W; Hoogenboom, Jacob P; Giepmans, Ben N G

    2017-04-07

    Cellular complexity is unraveled at nanometer resolution using electron microscopy (EM), but interpretation of macromolecular functionality is hampered by the difficulty in interpreting grey-scale images and the unidentified molecular content. We perform large-scale EM on mammalian tissue complemented with energy-dispersive X-ray analysis (EDX) to allow EM-data analysis based on elemental composition. Endogenous elements, labels (gold and cadmium-based nanoparticles) as well as stains are analyzed at ultrastructural resolution. This provides a wide palette of colors to paint the traditional grey-scale EM images for composition-based interpretation. Our proof-of-principle application of EM-EDX reveals that endocrine and exocrine vesicles exist in single cells in Islets of Langerhans. This highlights how elemental mapping reveals unbiased, biomedically relevant information. Broad application of EM-EDX will further allow experimental analysis on large-scale tissue using endogenous elements, multiple stains, and multiple markers and thus brings nanometer-scale 'color-EM' as a promising tool to unravel molecular (de)regulation in biomedicine.

  3. Multi-color electron microscopy by element-guided identification of cells, organelles and molecules

    PubMed Central

    Scotuzzi, Marijke; Kuipers, Jeroen; Wensveen, Dasha I.; de Boer, Pascal; Hagen, Kees (C.) W.; Hoogenboom, Jacob P.; Giepmans, Ben N. G.

    2017-01-01

    Cellular complexity is unraveled at nanometer resolution using electron microscopy (EM), but interpretation of macromolecular functionality is hampered by the difficulty in interpreting grey-scale images and the unidentified molecular content. We perform large-scale EM on mammalian tissue complemented with energy-dispersive X-ray analysis (EDX) to allow EM-data analysis based on elemental composition. Endogenous elements, labels (gold and cadmium-based nanoparticles) as well as stains are analyzed at ultrastructural resolution. This provides a wide palette of colors to paint the traditional grey-scale EM images for composition-based interpretation. Our proof-of-principle application of EM-EDX reveals that endocrine and exocrine vesicles exist in single cells in Islets of Langerhans. This highlights how elemental mapping reveals unbiased, biomedically relevant information. Broad application of EM-EDX will further allow experimental analysis on large-scale tissue using endogenous elements, multiple stains, and multiple markers and thus brings nanometer-scale ‘color-EM’ as a promising tool to unravel molecular (de)regulation in biomedicine. PMID:28387351

  4. Beyond the Cell: Using Multiscalar Topics to Bring Interdisciplinarity into Undergraduate Cellular Biology Courses

    PubMed Central

    Weber, Carolyn F.

    2016-01-01

    Western science has grown increasingly reductionistic and, in parallel, the undergraduate life sciences curriculum has become disciplinarily fragmented. While reductionistic approaches have led to landmark discoveries, many of the most exciting scientific advances in the late 20th century have occurred at disciplinary interfaces; work at these interfaces is necessary to manage the world’s looming problems, particularly those that are rooted in cellular-level processes but have ecosystem- and even global-scale ramifications (e.g., nonsustainable agriculture, emerging infectious diseases). Managing such problems requires comprehending whole scenarios and their emergent properties as sums of their multiple facets and complex interrelationships, which usually integrate several disciplines across multiple scales (e.g., time, organization, space). This essay discusses bringing interdisciplinarity into undergraduate cellular biology courses through the use of multiscalar topics. Discussing how cellular-level processes impact large-scale phenomena makes them relevant to everyday life and unites diverse disciplines (e.g., sociology, cell biology, physics) as facets of a single system or problem, emphasizing their connections to core concepts in biology. I provide specific examples of multiscalar topics and discuss preliminary evidence that using such topics may increase students’ understanding of the cell’s position within an ecosystem and how cellular biology interfaces with other disciplines. PMID:27146162

  5. Cloud-Scale Genomic Signals Processing for Robust Large-Scale Cancer Genomic Microarray Data Analysis.

    PubMed

    Harvey, Benjamin Simeon; Ji, Soo-Yeon

    2017-01-01

    As microarray data available to scientists continues to increase in size and complexity, it has become overwhelmingly important to find multiple ways to bring forth oncological inference to the bioinformatics community through the analysis of large-scale cancer genomic (LSCG) DNA and mRNA microarray data that is useful to scientists. Though there have been many attempts to elucidate the issue of bringing forth biological interpretation by means of wavelet preprocessing and classification, there has not been a research effort that focuses on a cloud-scale distributed parallel (CSDP) separable 1-D wavelet decomposition technique for denoising through differential expression thresholding and classification of LSCG microarray data. This research presents a novel methodology that utilizes a CSDP separable 1-D method for wavelet-based transformation in order to initialize a threshold which will retain significantly expressed genes through the denoising process for robust classification of cancer patients. Additionally, the overall study was implemented and encompassed within a CSDP environment. The utilization of cloud computing and wavelet-based thresholding for denoising was used for the classification of samples within the Global Cancer Map, Cancer Cell Line Encyclopedia, and The Cancer Genome Atlas. The results proved that separable 1-D parallel distributed wavelet denoising in the cloud and differential expression thresholding increased the computational performance and enabled the generation of higher quality LSCG microarray datasets, which led to more accurate classification results.
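
    The central step described above, a separable 1-D wavelet decomposition followed by thresholding for denoising, can be sketched in a few lines with PyWavelets. The wavelet ('db4'), decomposition level, and universal threshold below are illustrative assumptions, not the authors' settings; a real CSDP deployment would run this per data partition across cloud workers.

```python
# Minimal sketch of 1-D wavelet denoising of one expression profile.
# Wavelet, level, and threshold rule are assumptions for illustration.
import numpy as np
import pywt

def wavelet_denoise(profile, wavelet="db4", level=3):
    coeffs = pywt.wavedec(profile, wavelet, level=level)
    # Universal (VisuShrink) threshold estimated from the finest detail band.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thresh = sigma * np.sqrt(2.0 * np.log(len(profile)))
    denoised = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft")
                              for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[: len(profile)]

rng = np.random.default_rng(0)
noisy = np.sin(np.linspace(0, 8, 1024)) + 0.3 * rng.standard_normal(1024)
denoised = wavelet_denoise(noisy)  # genes surviving the threshold feed the classifier
```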

  6. The relativistic feedback discharge model of terrestrial gamma ray flashes

    NASA Astrophysics Data System (ADS)

    Dwyer, Joseph R.

    2012-02-01

    As thunderclouds charge, the large-scale fields may approach the relativistic feedback threshold, above which the production of relativistic runaway electron avalanches becomes self-sustaining through the generation of backward propagating runaway positrons and backscattered X-rays. Positive intracloud (IC) lightning may force the large-scale electric fields inside thunderclouds above the relativistic feedback threshold, causing the number of runaway electrons, and the resulting X-ray and gamma ray emission, to grow exponentially, producing very large fluxes of energetic radiation. As the flux of runaway electrons increases, ionization eventually causes the electric field to discharge, bringing the field below the relativistic feedback threshold again and reducing the flux of runaway electrons. These processes are investigated with a new model that includes the production, propagation, diffusion, and avalanche multiplication of runaway electrons; the production and propagation of X-rays and gamma rays; and the production, propagation, and annihilation of runaway positrons. In this model, referred to as the relativistic feedback discharge model, the large-scale electric fields are calculated self-consistently from the charge motion of the drifting low-energy electrons and ions, produced from the ionization of air by the runaway electrons, including two- and three-body attachment and recombination. Simulation results show that when relativistic feedback is considered, bright gamma ray flashes are a natural consequence of upward +IC lightning propagating in large-scale thundercloud fields. Furthermore, these flashes have the same time structures, including both single and multiple pulses, intensities, angular distributions, current moments, and energy spectra as terrestrial gamma ray flashes, and produce large current moments that should be observable in radio waves.
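
    Schematically (assumed notation, not the paper's full transport model), runaway electrons multiply over an avalanche length, and feedback by backward-propagating positrons and backscattered X-rays multiplies the avalanche flux by a fixed factor per feedback cycle:

```latex
% Avalanche multiplication over an avalanche length \lambda:
N(z) = N_0 \, e^{z/\lambda}
% With a feedback factor \gamma per cycle of duration \tau, the flux grows as
N(t) \propto \gamma^{\,t/\tau} = e^{(t/\tau)\ln\gamma},
% so the discharge is self-sustaining when \gamma \ge 1; ionization then
% relaxes the field until \gamma falls back below unity.
```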

  7. Forming an ad-hoc nearby storage, based on IKAROS and social networking services

    NASA Astrophysics Data System (ADS)

    Filippidis, Christos; Cotronis, Yiannis; Markou, Christos

    2014-06-01

    We present an ad-hoc "nearby" storage, based on IKAROS and social networking services, such as Facebook. By design, IKAROS is capable of increasing or decreasing the number of nodes of the I/O system instance on the fly, without bringing everything down or losing data. IKAROS is capable of deciding the file partition distribution schema by taking into account requests from the user or an application, as well as a domain or a Virtual Organization policy. In this way, it is possible to form multiple instances of smaller-capacity, higher-bandwidth storage utilities capable of responding in an ad-hoc manner. This approach, focusing on flexibility, can scale both up and down and so can provide more cost-effective infrastructures for both large-scale and smaller-size systems. A set of experiments is performed comparing IKAROS with PVFS2, using multiple client requests under the HPC IOR benchmark and MPICH2.
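
    The abstract does not spell out IKAROS's placement policy; as a flavor of on-the-fly file-partition distribution over an elastic node set, the toy sketch below round-robins chunks across whatever nodes are currently registered. All names are hypothetical and this is not the IKAROS API.

```python
# Toy partition placement across an elastic node set; illustrative only.
from typing import Dict, List, Tuple

class AdHocStore:
    def __init__(self, nodes: List[str], chunk_size: int = 4 * 1024 * 1024):
        self.nodes = list(nodes)              # may grow or shrink between writes
        self.chunk_size = chunk_size
        self.placement: Dict[str, List[Tuple[int, str]]] = {}

    def add_node(self, node: str) -> None:
        self.nodes.append(node)               # scale up on the fly

    def remove_node(self, node: str) -> None:
        self.nodes.remove(node)               # scale down (a real system migrates chunks)

    def place(self, path: str, size: int) -> List[Tuple[int, str]]:
        n_chunks = -(-size // self.chunk_size)            # ceiling division
        plan = [(i, self.nodes[i % len(self.nodes)]) for i in range(n_chunks)]
        self.placement[path] = plan
        return plan

store = AdHocStore(["peer-1", "peer-2"])
store.add_node("peer-3")                      # e.g. a node advertised via a social link
print(store.place("/data/run42.dat", 10 * 1024 * 1024))
```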

  8. Multiple scales and phases in discrete chains with application to folded proteins

    NASA Astrophysics Data System (ADS)

    Sinelnikova, A.; Niemi, A. J.; Nilsson, Johan; Ulybyshev, M.

    2018-05-01

    Chiral heteropolymers such as large globular proteins can simultaneously support multiple length scales. The interplay between the different scales brings about conformational diversity, determines the phase properties of the polymer chain, and governs the structure of the energy landscape. Most importantly, multiple scales produce complex dynamics that enable proteins to sustain live matter. However, at the moment there is incomplete understanding of how to identify and distinguish the various scales that determine the structure and dynamics of a complex protein. Here we address this impending problem. We develop a methodology with the potential to systematically identify different length scales, in the general case of a linear polymer chain. For this we introduce and analyze the properties of an order parameter that can both reveal the presence of different length scales and can also probe the phase structure. We first develop our concepts in the case of chiral homopolymers. We introduce a variant of Kadanoff's block-spin transformation to coarse grain piecewise linear chains, such as the Cα backbone of a protein. We derive analytically, and then verify numerically, a number of properties that the order parameter can display, in the case of a chiral polymer chain. In particular, we propose that in the case of a chiral heteropolymer the order parameter can reveal traits of several different phases, contingent on the length scale at which it is scrutinized. We confirm that this is the case with crystallographic protein structures in the Protein Data Bank. Thus our results suggest relations between the scales, the phases, and the complexity of folding pathways.
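
    As a cartoon of the block-spin idea applied to a piecewise linear chain, one can repeatedly replace pairs of consecutive backbone vertices by their midpoints and track how geometric statistics change with the blocking level. The decimation below is generic, not the authors' exact transformation or order parameter.

```python
# Generic block-spin style coarse graining of a 3-D chain: each level
# replaces consecutive vertex pairs by their midpoints. Cartoon only.
import numpy as np

def block_transform(chain: np.ndarray) -> np.ndarray:
    n = (len(chain) // 2) * 2                  # drop a trailing odd vertex
    return 0.5 * (chain[0:n:2] + chain[1:n:2])

def mean_bond_length(chain: np.ndarray) -> float:
    return float(np.linalg.norm(np.diff(chain, axis=0), axis=1).mean())

rng = np.random.default_rng(1)
chain = np.cumsum(rng.standard_normal((1024, 3)), axis=0)  # random-walk "backbone"
for level in range(5):
    print(level, round(mean_bond_length(chain), 3))  # scale-dependent statistic
    chain = block_transform(chain)
```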

  9. Dynamic DNA Methylation Controls Glutamate Receptor Trafficking and Synaptic Scaling

    PubMed Central

    Sweatt, J. David

    2016-01-01

    Hebbian plasticity, including LTP and LTD, has long been regarded as important for local circuit refinement in the context of memory formation and stabilization. However, circuit development and stabilization additionally rely on non-Hebbian, homeostatic forms of plasticity such as synaptic scaling. Synaptic scaling is induced by chronic increases or decreases in neuronal activity. Synaptic scaling is associated with cell-wide adjustments in postsynaptic receptor density, and can occur in a multiplicative manner resulting in preservation of relative synaptic strengths across the entire neuron's population of synapses. Both active DNA methylation and de-methylation have been validated as crucial regulators of gene transcription during learning, and synaptic scaling is known to be transcriptionally dependent. However, it has been unclear whether homeostatic forms of plasticity such as synaptic scaling are regulated via epigenetic mechanisms. This review describes exciting recent work that has demonstrated a role for active changes in neuronal DNA methylation and demethylation as a controller of synaptic scaling and glutamate receptor trafficking. These findings bring together three major categories of memory-associated mechanisms that were previously largely considered separately: DNA methylation, homeostatic plasticity, and glutamate receptor trafficking. PMID:26849493

  10. Large-scale standardized phenotyping of strawberry in RosBREED

    USDA-ARS?s Scientific Manuscript database

    A large, multi-institutional, international, research project with the goal of bringing genomicists and plant breeders together was funded by USDA-NIFA Specialty Crop Research Initiative. Apple, cherry, peach, and strawberry are the Rosaceous crops included in the project. Many (900+) strawberry g...

  11. Large Scale Triboelectric Nanogenerator and Self-Powered Pressure Sensor Array Using Low Cost Roll-to-Roll UV Embossing

    PubMed Central

    Dhakar, Lokesh; Gudla, Sudeep; Shan, Xuechuan; Wang, Zhiping; Tay, Francis Eng Hock; Heng, Chun-Huat; Lee, Chengkuo

    2016-01-01

    Triboelectric nanogenerators (TENGs) have emerged as a potential solution for mechanical energy harvesting over conventional mechanisms such as piezoelectric and electromagnetic, due to easy fabrication, high efficiency and wider choice of materials. Traditional fabrication techniques used to realize TENGs involve plasma etching, soft lithography and nanoparticle deposition for higher performance. But lack of truly scalable fabrication processes still remains a critical challenge and bottleneck in the path of bringing TENGs to commercial production. In this paper, we demonstrate fabrication of large scale triboelectric nanogenerator (LS-TENG) using roll-to-roll ultraviolet embossing to pattern polyethylene terephthalate sheets. These LS-TENGs can be used to harvest energy from human motion and vehicle motion from embedded devices in floors and roads, respectively. LS-TENG generated a power density of 62.5 mW m⁻². Using roll-to-roll processing technique, we also demonstrate a large scale triboelectric pressure sensor array with pressure detection sensitivity of 1.33 V kPa⁻¹. The large scale pressure sensor array has applications in self-powered motion tracking, posture monitoring and electronic skin applications. This work demonstrates scalable fabrication of TENGs and self-powered pressure sensor arrays, which will lead to extremely low cost and bring them closer to commercial production. PMID:26905285

  12. Roadmap for Scaling and Multifractals in Geosciences: still a long way to go ?

    NASA Astrophysics Data System (ADS)

    Schertzer, Daniel; Lovejoy, Shaun

    2010-05-01

    The interest in scale symmetries (scaling) in Geosciences has never lessened since the first pioneering EGS session on chaos and fractals 22 years ago. The corresponding NP activities have been steadily increasing, covering a wider and wider diversity of geophysical phenomena and range of space-time scales. Whereas interest was initially largely focused on atmospheric turbulence, rain and clouds at small scales, it has quickly broadened to much larger scales and to much wider scale ranges, to include ocean sciences, solid earth and space physics. Indeed, the scale problem being ubiquitous in Geosciences, it is indispensable to share the efforts and the resulting knowledge as much as possible. There have been numerous achievements which have followed from the exploration of larger and larger datasets with finer and finer resolutions, from both modelling and theoretical discussions, particularly on formalisms for intermittency, anisotropy and scale symmetry, and multiple scaling (multifractals) vs. simple scaling. We are now way beyond the early pioneering but tentative attempts using crude estimates of unique scaling exponents to bring some credence to the fact that scale symmetries are key to most nonlinear geoscience problems. Nowadays, we need to better demonstrate that scaling brings effective solutions to geosciences and therefore to society. A large part of the answer corresponds to our capacity to create much more universal and flexible tools to multifractally analyse, in straightforward and reliable manners, complex and complicated systems such as the climate. Preliminary steps in this direction are already quite encouraging: they show that such approaches explain both the difficulty of classical techniques to find trends in climate scenarios (particularly for extremes) and resolve them with the help of scaling estimators. The question of the reliability and accuracy of these methods is not trivial. After discussing these important, but rather short term issues, we will point out more general questions, which can be put together into the following provocative question: how to convert the classical time-evolving deterministic PDEs into dynamical multifractal systems? We will argue that this corresponds to an already active field of research, which includes: multifractals as generic solutions of nonlinear PDEs (exact results for the 1D Burgers equation and a few other caricatures of the Navier-Stokes equations, prospects for 3D Burgers equations), cascade structures of numerical weather models, links between multifractal processes and random dynamical systems, and the challenging debate on the most relevant stochastic multifractal formalism, whereas there is already a rather general consensus about the deterministic one.
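
    One standard entry point to such analyses is the trace-moment estimate: coarse-grain a flux over boxes of increasing size, average its q-th moments, and read scaling exponents K(q) from log-log slopes, with nonlinear K(q) the signature of multifractality. The sketch below runs on a synthetic multiplicative cascade; it is generic and not tied to any dataset discussed above.

```python
# Generic trace-moment estimate of K(q): <eps_lambda^q> ~ lambda^{K(q)},
# with lambda the scale ratio. Illustrative sketch, not the authors' code.
import numpy as np

def cascade(levels, rng):
    x = np.ones(1)
    for _ in range(levels):
        w = rng.lognormal(mean=-0.045, sigma=0.3, size=2 * len(x))
        x = np.repeat(x, 2) * w          # binary multiplicative cascade
    return x

def trace_moments(field, qs, n_levels=10):
    x = np.asarray(field, dtype=float) / np.mean(field)   # normalized flux
    lambdas, moments = [], {q: [] for q in qs}
    for _ in range(n_levels):
        lambdas.append(len(x))           # scale ratio lambda = L / box size
        for q in qs:
            moments[q].append(np.mean(x ** q))
        x = 0.5 * (x[0::2] + x[1::2])    # coarse grain by a factor of 2
    return np.array(lambdas), moments

rng = np.random.default_rng(2)
lambdas, mom = trace_moments(cascade(14, rng), qs=[0.5, 1.0, 2.0, 3.0])
for q, m in mom.items():
    K_q = np.polyfit(np.log(lambdas), np.log(m), 1)[0]
    print(f"q={q}: K(q) ~ {K_q:.3f}")     # K(1) ~ 0 for a conserved flux
```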

  13. New design for interfacing computers to the Octopus network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sloan, L.J.

    1977-03-14

    The Lawrence Livermore Laboratory has several large-scale computers which are connected to the Octopus network. Several difficulties arise in providing adequate resources along with reliable performance. To alleviate some of these problems a new method of bringing large computers into the Octopus environment is proposed.

  14. Scaling the Pyramid Model across Complex Systems Providing Early Care for Preschoolers: Exploring How Models for Decision Making May Enhance Implementation Science

    ERIC Educational Resources Information Center

    Johnson, LeAnne D.

    2017-01-01

    Bringing effective practices to scale across large systems requires attending to how information and belief systems come together in decisions to adopt, implement, and sustain those practices. Statewide scaling of the Pyramid Model, a framework for positive behavior intervention and support, across different types of early childhood programs…

  15. Cloud-scale genomic signals processing classification analysis for gene expression microarray data.

    PubMed

    Harvey, Benjamin; Soo-Yeon Ji

    2014-01-01

    As microarray data available to scientists continues to increase in size and complexity, it has become overwhelmingly important to find multiple ways to bring inference through analysis of DNA/mRNA sequence data that is useful to scientists. Though there have been many attempts to elucidate the issue of bringing forth biological inference by means of wavelet preprocessing and classification, there has not been a research effort that focuses on a cloud-scale classification analysis of microarray data using wavelet thresholding in a cloud environment to identify significantly expressed features. This paper proposes a novel methodology that uses wavelet-based denoising to initialize a threshold for determination of significantly expressed genes for classification. Additionally, this research was implemented and encompassed within a cloud-based distributed processing environment. The utilization of cloud computing and wavelet thresholding was used for the classification of 14 tumor classes from the Global Cancer Map (GCM). The results proved to be more accurate than using a predefined p-value for differential expression classification. This novel methodology analyzed wavelet-based threshold features of gene expression in a cloud environment, classifying the expression of samples by analyzing gene patterns that inform us of biological processes, and enabling researchers to face the present and forthcoming challenges that may arise in the functional genomic analysis of large microarray datasets.

  16. [New Concept for Surviving Sepsis: from Phenomenon to Essence].

    PubMed

    Liao, Xue-Lian; Xie, Zhi-Chao; Kang, Yan

    2016-07-01

    Sepsis is a critical clinical syndrome that has puzzled the medical profession for many years. Recently, the results from several large-scale trials challenged the necessity of early goal directed therapy (EGDT) in the surviving sepsis bundle. These trials were not opposed to EGDT, but they bring the new concept that it is essential to guide therapy with multiple monitoring measures in order to minimize injury while guaranteeing safety. Deeper understanding of the pathogenesis of sepsis has given rise to the update of its definition based on vital organ dysfunction. The importance of dynamic monitoring in defining sepsis also needs to be emphasized. Developing more effective monitoring measures could provide better treatments and thus improve the prognosis of septic patients. Copyright© by Editorial Board of Journal of Sichuan University (Medical Science Edition).

  17. Managing data from multiple disciplines, scales, and sites to support synthesis and modeling

    USGS Publications Warehouse

    Olson, R. J.; Briggs, J. M.; Porter, J.H.; Mah, Grant R.; Stafford, S.G.

    1999-01-01

    The synthesis and modeling of ecological processes at multiple spatial and temporal scales involves bringing together and sharing data from numerous sources. This article describes a data and information system model that facilitates assembling, managing, and sharing diverse data from multiple disciplines, scales, and sites to support integrated ecological studies. Cross-site scientific-domain working groups coordinate the development of data associated with their particular scientific working group, including decisions about data requirements, data to be compiled, data formats, derived data products, and schedules across the sites. The Web-based data and information system consists of nodes for each working group plus a central node that provides data access, project information, data query, and other functionality. The approach incorporates scientists and computer experts in the working groups and provides incentives for individuals to submit documented data to the data and information system.

  18. Correlative Tomography

    PubMed Central

    Burnett, T. L.; McDonald, S. A.; Gholinia, A.; Geurts, R.; Janus, M.; Slater, T.; Haigh, S. J.; Ornek, C.; Almuaili, F.; Engelberg, D. L.; Thompson, G. E.; Withers, P. J.

    2014-01-01

    Increasingly researchers are looking to bring together perspectives across multiple scales, or to combine insights from different techniques, for the same region of interest. To this end, correlative microscopy has already yielded substantial new insights in two dimensions (2D). Here we develop correlative tomography where the correlative task is somewhat more challenging because the volume of interest is typically hidden beneath the sample surface. We have threaded together x-ray computed tomography, serial section FIB-SEM tomography, electron backscatter diffraction and finally TEM elemental analysis all for the same 3D region. This has allowed observation of the competition between pitting corrosion and intergranular corrosion at multiple scales revealing the structural hierarchy, crystallography and chemistry of veiled corrosion pits in stainless steel. With automated correlative workflows and co-visualization of the multi-scale or multi-modal datasets the technique promises to provide insights across biological, geological and materials science that are impossible using either individual or multiple uncorrelated techniques. PMID:24736640

  19. Frequency-encoded photonic qubits for scalable quantum information processing

    DOE PAGES

    Lukens, Joseph M.; Lougovski, Pavel

    2016-12-21

    Among the objectives for large-scale quantum computation is the quantum interconnect: a device that uses photons to interface qubits that otherwise could not interact. However, the current approaches require photons indistinguishable in frequency—a major challenge for systems experiencing different local environments or of different physical compositions altogether. Here, we develop an entirely new platform that actually exploits such frequency mismatch for processing quantum information. Labeled “spectral linear optical quantum computation” (spectral LOQC), our protocol offers favorable linear scaling of optical resources and enjoys an unprecedented degree of parallelism, as an arbitrary N-qubit quantum gate may be performed in parallel on multiple N-qubit sets in the same linear optical device. Not only does spectral LOQC offer new potential for optical interconnects, but it also brings the ubiquitous technology of high-speed fiber optics to bear on photonic quantum information, making wavelength-configurable and robust optical quantum systems within reach.

  20. Frequency-encoded photonic qubits for scalable quantum information processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lukens, Joseph M.; Lougovski, Pavel

    Among the objectives for large-scale quantum computation is the quantum interconnect: a device that uses photons to interface qubits that otherwise could not interact. However, the current approaches require photons indistinguishable in frequency—a major challenge for systems experiencing different local environments or of different physical compositions altogether. Here, we develop an entirely new platform that actually exploits such frequency mismatch for processing quantum information. Labeled “spectral linear optical quantum computation” (spectral LOQC), our protocol offers favorable linear scaling of optical resources and enjoys an unprecedented degree of parallelism, as an arbitrary N-qubit quantum gate may be performed in parallel on multiple N-qubit sets in the same linear optical device. Not only does spectral LOQC offer new potential for optical interconnects, but it also brings the ubiquitous technology of high-speed fiber optics to bear on photonic quantum information, making wavelength-configurable and robust optical quantum systems within reach.

  21. Anomalous time delays and quantum weak measurements in optical micro-resonators

    PubMed Central

    Asano, M.; Bliokh, K. Y.; Bliokh, Y. P.; Kofman, A. G.; Ikuta, R.; Yamamoto, T.; Kivshar, Y. S.; Yang, L.; Imoto, N.; Özdemir, Ş.K.; Nori, F.

    2016-01-01

    Quantum weak measurements, wavepacket shifts and optical vortices are universal wave phenomena, which originate from fine interference of multiple plane waves. These effects have attracted considerable attention in both classical and quantum wave systems. Here we report on a phenomenon that brings together all the above topics in a simple one-dimensional scalar wave system. We consider inelastic scattering of Gaussian wave packets with parameters close to a zero of the complex scattering coefficient. We demonstrate that the scattered wave packets experience anomalously large time and frequency shifts in such near-zero scattering. These shifts reveal close analogies with the Goos–Hänchen beam shifts and quantum weak measurements of the momentum in a vortex wavefunction. We verify our general theory by an optical experiment using the near-zero transmission (near-critical coupling) of Gaussian pulses propagating through a nano-fibre with a side-coupled toroidal micro-resonator. Measurements demonstrate the amplification of the time delays from the typical inverse-resonator-linewidth scale to the pulse-duration scale. PMID:27841269
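
    The amplification mechanism admits a compact statement (notation assumed here; see the paper for the exact expressions): for a transmission coefficient t(ω) = |t(ω)|e^{iφ(ω)}, the group delay of a narrowband pulse is the frequency derivative of the phase, and it grows without bound as the carrier approaches a zero of t, exactly as a weak value diverges when the postselection overlap vanishes.

```latex
% Group delay through t(\omega) = |t(\omega)| e^{i\varphi(\omega)}:
\tau_g(\omega_0) = \left. \frac{d\varphi}{d\omega} \right|_{\omega_0}
                 = \operatorname{Im}\!\left[ \frac{d}{d\omega} \ln t(\omega) \right]_{\omega_0}
% which diverges as t(\omega_0) \to 0, in analogy with the weak value
A_w = \frac{\langle f | \hat{A} | i \rangle}{\langle f | i \rangle}
% becoming anomalously large for nearly orthogonal pre-/postselection.
```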

  22. A characterization of workflow management systems for extreme-scale applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia

    The automation of the execution of computational tasks is at the heart of improving scientific productivity. Over the last years, scientific workflows have been established as an important abstraction that captures data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today’s computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper also surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.

  23. A characterization of workflow management systems for extreme-scale applications

    DOE PAGES

    Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia; ...

    2017-02-16

    The automation of the execution of computational tasks is at the heart of improving scientific productivity. Over the last years, scientific workflows have been established as an important abstraction that captures data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today’s computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper also surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.

  24. Spike-train communities: finding groups of similar spike trains.

    PubMed

    Humphries, Mark D

    2011-02-09

    Identifying similar spike-train patterns is a key element in understanding neural coding and computation. For single neurons, similar spike patterns evoked by stimuli are evidence of common coding. Across multiple neurons, similar spike trains indicate potential cell assemblies. As recording technology advances, so does the urgent need for grouping methods to make sense of large-scale datasets of spike trains. Existing methods require specifying the number of groups in advance, limiting their use in exploratory analyses. I derive a new method from network theory that solves this key difficulty: it self-determines the maximum number of groups in any set of spike trains, and groups them to maximize intragroup similarity. This method brings revealing new insights into the encoding of aversive stimuli by dopaminergic neurons, and the organization of spontaneous neural activity in cortex. I show that the characteristic pause response of a rat's dopaminergic neuron depends on the state of the superior colliculus: when it is inactive, aversive stimuli invoke a single pattern of dopaminergic neuron spiking; when active, multiple patterns occur, yet the spike timing in each is reliable. In spontaneous multineuron activity from the cortex of an anesthetized cat, I show the existence of neural ensembles that evolve in membership and characteristic timescale of organization during global slow oscillations. I validate these findings by showing that the method is both remarkably reliable at detecting known groups and able to detect large-scale organization of dynamics in a model of the striatum.
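
    A minimal stand-in for this idea (not Humphries' exact algorithm) is to smooth each spike train into a rate profile, build a weighted similarity graph, and let a modularity-based community method fix the number of groups from the data:

```python
# Illustrative grouping of spike trains via a similarity graph and
# modularity communities; the group count emerges from the data.
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

def smooth(train, n_bins=1000, width=10):
    binned = np.zeros(n_bins)
    binned[train] = 1.0                     # spike times as bin indices
    kernel = np.exp(-0.5 * (np.arange(-3 * width, 3 * width + 1) / width) ** 2)
    return np.convolve(binned, kernel, mode="same")

rng = np.random.default_rng(3)
base_a, base_b = rng.choice(1000, 40), rng.choice(1000, 40)
trains = [np.unique(np.clip(b + rng.integers(-3, 4, b.size), 0, 999))
          for b in [base_a] * 5 + [base_b] * 5]   # two jittered assemblies
rates = np.array([smooth(t) for t in trains])

G = nx.Graph()
for i in range(len(rates)):
    for j in range(i + 1, len(rates)):
        sim = rates[i] @ rates[j] / (np.linalg.norm(rates[i]) * np.linalg.norm(rates[j]))
        if sim > 0.2:                       # sparsify weak links
            G.add_edge(i, j, weight=sim)
print([sorted(c) for c in greedy_modularity_communities(G, weight="weight")])
```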

  25. Insights and Challenges to Integrating Data from Diverse Ecological Networks

    NASA Astrophysics Data System (ADS)

    Peters, D. P. C.

    2014-12-01

    Many of the most dramatic and surprising effects of global change occur across large spatial extents, from regions to continents, that impact multiple ecosystem types across a range of interacting spatial and temporal scales. The ability of ecologists and inter-disciplinary scientists to understand and predict these dynamics depends, in large part, on existing site-based research infrastructures that developed in response to historic events. Integrating these diverse sources of data is critical to addressing these broad-scale questions. A conceptual approach is presented to synthesize and integrate diverse sources and types of data from different networks of research sites. This approach focuses on developing derived data products through spatial and temporal aggregation that allow datasets collected with different methods to be compared. The approach is illustrated through the integration, analysis, and comparison of hundreds of long-term datasets from 50 ecological sites in the US that represent ecosystem types commonly found globally. New insights were found by comparing multiple sites using common derived data. In addition to "bringing to light" many dark data in a standardized, open access, easy-to-use format, a suite of lessons were learned that can be applied to up-and-coming research networks in the US and internationally. These lessons will be described along with the challenges, including cyber-infrastructure, cultural, and behavioral constraints associated with the use of big and little data, that may keep ecologists and inter-disciplinary scientists from taking full advantage of the vast amounts of existing and yet-to-be exposed data.
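
    The "derived data product" notion, aggregating heterogeneous site measurements to a common spatial or temporal grain before comparison, is straightforward to express with pandas; the column names and the monthly grain below are assumptions for illustration, not the project's actual variables.

```python
# Sketch of deriving a comparable product from two sites that sampled a
# variable with different frequencies: aggregate both to a monthly grain.
import pandas as pd

daily = pd.DataFrame({
    "date": pd.date_range("2020-01-01", periods=120, freq="D"),
    "npp": 1.0,                      # site A: daily sensor record (hypothetical)
}).set_index("date")

weekly = pd.DataFrame({
    "date": pd.date_range("2020-01-01", periods=17, freq="W"),
    "npp": 7.0,                      # site B: weekly field sampling (hypothetical)
}).set_index("date")

derived = pd.concat({
    "site_a": daily["npp"].resample("MS").sum(),
    "site_b": weekly["npp"].resample("MS").sum(),
}, axis=1)                           # one comparable monthly product
print(derived)
```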

  26. Hurricane Isidore

    Atmospheric Science Data Center

    2013-04-18

    ... 20, 2002. After bringing large-scale flooding to western Cuba, Isidore was upgraded (on September 21) from a tropical storm to a ... Yucatan Peninsula, the hurricane caused major destruction and left hundreds of thousands of people homeless. Although weakened after ...

  27. REDUCING THE WASTE STREAM: BRINGING ENVIRONMENTAL, ECONOMICAL, AND EDUCATIONAL COMPOSTING TO A LIBERAL ARTS COLLEGE

    EPA Science Inventory

    The Northfield, Minnesota area contains three institutions that produce a large amount of compostable food waste. St. Olaf College uses a large-scale on-site composting machine that effectively transforms the food waste to compost, but the system requires an immense start-up c...

  28. Achieving Excellence: Bringing Effective Literacy Pedagogy to Scale in Ontario's Publicly-Funded Education System

    ERIC Educational Resources Information Center

    Gallagher, Mary Jean; Malloy, John; Ryerson, Rachel

    2016-01-01

    This paper offers an insiders' perspective on the large-scale, system-wide educational change undertaken in Ontario, Canada from 2003 to the present. The authors, Ministry and school system leaders intimately involved in this change process, explore how Ontario has come to be internationally recognized as an equitable, high-achieving, and…

  29. An Update on ToxCast™

    EPA Pesticide Factsheets

    In its first phase, ToxCast™ is profiling over 300 well-characterized chemicals (primarily pesticides) in over 400 HTS endpoints. These endpoints include biochemical assays of protein function, cell-based transcriptional reporter assays, multi-cell interaction assays, transcriptomics on primary cell cultures, and developmental assays in zebrafish embryos. Almost all of the compounds being examined in Phase 1 of ToxCast™ have been tested in traditional toxicology tests, including developmental toxicity, multi-generation studies, and sub-chronic and chronic rodent bioassays. Lessons learned to date for ToxCast: large amounts of quality HTS data can be economically obtained; large-scale data sets will be required to understand potential for biological activity; there is value in having multiple assays with overlapping coverage of biological pathways and a variety of methodologies; concentration-response will be important for ultimate interpretation; data transparency will be important for acceptance; metabolic capabilities and coverage of developmental toxicity pathways will need additional attention; the gold standard needs to be defined; and partnerships are needed to bring critical mass and expertise.

  30. Extending Technologies among Small-Scale Farmers in Meru, Kenya: Ingredients for Success in Farmer Groups

    ERIC Educational Resources Information Center

    Davis, Kristin; Franzel, Steven; Hildebrand, Peter; Irani, Tracy; Place, Nick

    2004-01-01

    Agricultural extension is evolving worldwide, and there is much emphasis today on community-based mechanisms of dissemination in order to bring sustainable change. The goal of this study was to examine the factors that make farmer groups successful in dissemination of information and technologies. A mixed-methods, multiple-stage approach was used…

  31. PHOTOVOLTAICS AND THE ENVIRONMENT 1998. REPORT ON THE WORKSHOP PHOTOVOLTAICS AND THE ENVIRONMENT 1999

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    FTHENAKIS,V.; ZWEIBEL,K.; MOSKOWITZ,P.

    1999-02-01

    The objective of the workshop "Photovoltaics and the Environment" was to bring together PV manufacturers and industry analysts to define EH&S issues related to the large-scale commercialization of PV technologies.

  32. A Mixed-dimensional Model for Determining the Impact of Permafrost Polygonal Ground Degradation on Arctic Hydrology.

    NASA Astrophysics Data System (ADS)

    Coon, E.; Jan, A.; Painter, S. L.; Moulton, J. D.; Wilson, C. J.

    2017-12-01

    Many permafrost-affected regions in the Arctic manifest a polygonal patterned ground, which contains large carbon stores and is vulnerable to climate change as warming temperatures drive melting ice wedges, polygon degradation, and thawing of the underlying carbon-rich soils. Understanding the fate of this carbon is difficult. The system is controlled by complex, nonlinear physics coupling biogeochemistry, thermal-hydrology, and geomorphology, and there is a strong spatial scale separation between microtopography (at the scale of an individual polygon) and the scale of landscape change (at the scale of many thousands of polygons). Physics-based models have come a long way, and are now capable of representing the diverse set of processes, but only on individual polygons or a few polygons. Empirical models have been used to upscale across land types, including ecotypes evolving from low-centered (pristine) polygons to high-centered (degraded) polygons, and do so over large spatial extent, but are limited in their ability to discern causal process mechanisms. Here we present a novel strategy that looks to use physics-based models across scales, bringing together multiple capabilities to capture polygon degradation under a warming climate and its impacts on thermal-hydrology. We use fine-scale simulations on individual polygons to motivate a mixed-dimensional strategy that couples one-dimensional columns representing each individual polygon through two-dimensional surface flow. A subgrid model is used to incorporate the effects of surface microtopography on surface flow; this model is described and calibrated to fine-scale simulations. And critically, a subsidence model that tracks volume loss in bulk ice wedges is used to alter the subsurface structure and subgrid parameters, enabling the inclusion of the feedbacks associated with polygon degradation. This combined strategy results in a model that is able to capture the key features of polygon permafrost degradation, but in a simulation across a large spatial extent of polygonal tundra.

  33. Exploring culture in the world of international nutrition and nutrition sciences.

    PubMed

    Centrone Stefani, Monique; Humphries, Debbie L

    2013-09-01

    This symposium was organized to bring insights from the social sciences into the awareness of nutrition scientists committed to developing and implementing effective nutrition interventions internationally. The symposium explored three different areas in the field where a more precise analysis of culture could enhance the effectiveness of nutrition science: 1) in the implementation of nutrition science research in the field; 2) in the collaboration of multiple stakeholders working to enhance nutrition in a national setting; and 3) in the language and discussions used to frame proposed changes in large scale food and nutrition security policy transnationally. Three social scientists, Monique Centrone Stefani, Lucy Jarosz, and David Pelletier were invited to share insights from their respective disciplines and respondents from within the field of nutrition provided initial reflections to better understand such perspectives. The symposium's interdisciplinary nature was designed to illustrate the challenge of multiple perspectives and methodologies and to advance understanding that could derive from such an exchange for those in the field of international nutrition seeking to decrease global hunger and malnutrition.

  34. Scaling a Survey Course in Extreme Weather

    NASA Astrophysics Data System (ADS)

    Samson, P. J.

    2013-12-01

    "Extreme Weather" is a survey-level course offered at the University of Michigan that is broadcast via the web and serves as a research testbed to explore best practices for large class conduct. The course has led to the creation of LectureTools, a web-based student response and note-taking system that has been shown to increase student engagement dramatically in multiple courses by giving students more opportunities to participate in class. Included in this is the capacity to pose image-based questions (see image where question was "Where would you expect winds from the south") as well as multiple choice, ordered list, free response and numerical questions. Research in this class has also explored differences in learning outcomes from those who participate remotely versus those who physically come to class and found little difference. Moreover the technologies used allow instructors to conduct class from wherever they are while the students can still answer questions and engage in class discussion from wherever they are. This presentation will use LectureTools to demonstrate its features. Attendees are encouraged to bring a mobile device to the session to participate.

  35. Bringing Policy and Practice to the Table: Young Women's Nutritional Experiences in an Ontario Secondary School

    ERIC Educational Resources Information Center

    Gray, Sarah K.

    2015-01-01

    In recent years, media, health organizations and researchers have raised concern over the health of Canadian children and adolescents. Stakeholders have called on the government to confront the problem. Schools are seen as an ideal location for developing and implementing large-scale interventions because of the ease of access to large groups of…

  36. Large-Scale Document Automation: The Systems Integration Issue.

    ERIC Educational Resources Information Center

    Kalthoff, Robert J.

    1985-01-01

    Reviews current technologies for electronic imaging and its recording and transmission, including digital recording, optical data disks, automated image-delivery micrographics, high-density-magnetic recording, and new developments in telecommunications and computers. The role of the document automation systems integrator, who will bring these…

  37. Dependable Trend Measurement Is Not Just IRT Scaling: Commentary on "Linking Large-Scale Reading Assessments: Measuring International Trends over 40 Years"

    ERIC Educational Resources Information Center

    Mullis, Ina V. S.; Martin, Michael O.

    2016-01-01

    Linking IEA's international reading assessments across 40 years is an interesting endeavor from several perspectives. Being able to examine trends in reading achievement at the 4th grade over such a long period and relate these to policy changes during that time span is an attractive idea. However, this work brings to the fore many thorny issues…

  38. Shifting Expectations: Bringing STEM to Scale through Expanded Learning Systems

    ERIC Educational Resources Information Center

    Donner, Jessica; Wang, Yvonne

    2013-01-01

    Expanded learning opportunities, such as afterschool and summer programs, are particularly well positioned to help address the science, technology, engineering, and mathematics (STEM) education crisis. A large percentage of youth participating in afterschool programs are members of groups traditionally underrepresented in STEM fields. Additionally,…

  39. Evaluation of Music And Astronomy Under The Stars: Bringing Science To New Audiences At Music Events

    NASA Astrophysics Data System (ADS)

    Lubowich, D.; Torff, B.

    2014-07-01

    Evaluations were conducted of the 2009-2012 NASA-funded Music and Astronomy Under the Stars (MAUS) program at outdoor concerts (see the separate MAUS poster at this meeting). MAUS promoted lifelong learning by providing opportunities for the public to look through telescopes, participate in hands-on activities, and view posters, banners, and videos at events where large numbers of people are gathered. Surveys were given to 1.6% of the concertgoers at MAUS events with the participants expressing their level of agreement on a four-point scale with the following statements: “The astronomy at this event has been an enjoyable experience;” “It has been easy to comprehend the astronomy at this event;” “This event has helped me learn new things about astronomy;” “This event has made me want to learn more about astronomy;” and “This event has increased my interest in science.” On a scale where 1 = strongly disagree, 2 = disagree, 3 = agree, and 4 = strongly agree, MAUS received high ratings (>3.34/4) on all outcomes. MAUS successfully reached people at different concerts who had little interest in science. MAUS appealed to concert attendees of both genders, all ages, multiple levels of education, and all musical tastes. MAUS positively influenced the public's knowledge of and interest in astronomy. The high ratings from virtually all respondents indicate that the gains were not restricted to science enthusiasts. The data strongly supports the conclusion that MAUS—bringing astronomy to people at musical events—is effective!

  40. Accuracy improvement in laser stripe extraction for large-scale triangulation scanning measurement system

    NASA Astrophysics Data System (ADS)

    Zhang, Yang; Liu, Wei; Li, Xiaodong; Yang, Fan; Gao, Peng; Jia, Zhenyuan

    2015-10-01

    Large-scale triangulation scanning measurement systems are widely used to measure the three-dimensional profile of large-scale components and parts. The accuracy and speed of the laser stripe center extraction are essential for guaranteeing the accuracy and efficiency of the measuring system. However, in the process of large-scale measurement, multiple factors can cause deviation of the laser stripe center, including the spatial light intensity distribution, material reflectivity characteristics, and spatial transmission characteristics. A center extraction method is proposed for improving the accuracy of the laser stripe center extraction based on image evaluation of Gaussian fitting structural similarity and analysis of the multiple source factors. First, according to the features of the gray distribution of the laser stripe, evaluation of the Gaussian fitting structural similarity is estimated to provide a threshold value for center compensation. Then using the relationships between the gray distribution of the laser stripe and the multiple source factors, a compensation method of center extraction is presented. Finally, measurement experiments for a large-scale aviation composite component are carried out. The experimental results for this specific implementation verify the feasibility of the proposed center extraction method and the improved accuracy for large-scale triangulation scanning measurements.
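
    The baseline operation underneath such methods is a per-column Gaussian fit of the stripe's gray-level profile, with the fitted mean taken as the center and the goodness of fit gating any later compensation. The sketch below shows only this baseline using scipy's curve_fit; the paper's multi-factor compensation model is not reproduced.

```python
# Per-column Gaussian fit for laser-stripe center extraction, plus a
# goodness-of-fit gate. Baseline sketch only; no compensation model.
import numpy as np
from scipy.optimize import curve_fit

def gaussian(y, amp, mu, sigma, offset):
    return amp * np.exp(-0.5 * ((y - mu) / sigma) ** 2) + offset

def stripe_centers(image: np.ndarray, r2_gate: float = 0.9) -> np.ndarray:
    rows = np.arange(image.shape[0], dtype=float)
    centers = np.full(image.shape[1], np.nan)
    for col in range(image.shape[1]):
        profile = image[:, col].astype(float)
        p0 = [float(np.ptp(profile)), rows[np.argmax(profile)], 2.0,
              float(profile.min())]
        try:
            popt, _ = curve_fit(gaussian, rows, profile, p0=p0, maxfev=2000)
        except RuntimeError:
            continue                          # fit failed: leave NaN
        resid = profile - gaussian(rows, *popt)
        r2 = 1.0 - resid.var() / profile.var()
        if r2 >= r2_gate:                     # only trust well-fit columns
            centers[col] = popt[1]            # fitted mean = stripe center
    return centers

rng = np.random.default_rng(4)
img = gaussian(np.arange(64)[:, None], 200, 30.5, 2.5, 10) + rng.normal(0, 3, (64, 128))
print(np.nanmean(stripe_centers(img)))        # close to the true center, 30.5
```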

  41. Bringing the Virtual Astronomical Observatory to the Education Community

    NASA Astrophysics Data System (ADS)

    Lawton, B.; Eisenhamer, B.; Mattson, B. J.; Raddick, M. J.

    2012-08-01

    The Virtual Observatory (VO) is an international effort to bring a large-scale electronic integration of astronomy data, tools, and services to the global community. The Virtual Astronomical Observatory (VAO) is the U.S. NSF- and NASA-funded VO effort that seeks to put efficient astronomical tools in the hands of U.S. astronomers, students, educators, and public outreach leaders. These tools will make use of data collected by the multitude of ground- and space-based missions over the previous decades. The Education and Public Outreach (EPO) program for the VAO will be led by the Space Telescope Science Institute in collaboration with the High Energy Astrophysics Science Archive Research Center (HEASARC) EPO program and Johns Hopkins University. VAO EPO efforts seek to bring technology, real-world astronomical data, and the story of the development and infrastructure of the VAO to the general public and education community. Our EPO efforts will be structured to provide uniform access to VAO information, enabling educational and research opportunities across multiple wavelengths and time-series data sets. The VAO team recognizes that the VO has already built many tools for EPO purposes, such as Microsoft's World Wide Telescope, SDSS Sky Server, Aladin, and a multitude of citizen-science tools available from Zooniverse. However, it is not enough to simply provide tools. Tools must meet the needs of the education community and address national education standards in order to be broadly utilized. To determine which tools the VAO will incorporate into the EPO program, needs assessments will be conducted with educators across the U.S.

  42. Conserving tigers in working landscapes.

    PubMed

    Chanchani, Pranav; Noon, Barry R; Bailey, Larissa L; Warrier, Rekha A

    2016-06-01

    Tiger (Panthera tigris) conservation efforts in Asia are focused on protected areas embedded in human-dominated landscapes. A system of protected areas is an effective conservation strategy for many endangered species if the network is large enough to support stable metapopulations. The long-term conservation of tigers requires that the species be able to meet some of its life-history needs beyond the boundaries of small protected areas and within the working landscape, including multiple-use forests with logging and high human use. However, understanding of factors that promote or limit the occurrence of tigers in working landscapes is incomplete. We assessed the relative influence of protection status, prey occurrence, extent of grasslands, intensity of human use, and patch connectivity on tiger occurrence in the 5400 km² Central Terai Landscape of India, adjacent to Nepal. Two observer teams independently surveyed 1009 km of forest trails and water courses distributed across 60 cells of 166 km² each. In each cell, the teams recorded detection of tiger signs along evenly spaced trail segments. We used occupancy models that permitted multiscale analysis of spatially correlated data to estimate cell-scale occupancy and segment-scale habitat use by tigers as a function of management and environmental covariates. Prey availability and habitat quality, rather than protected-area designation, influenced tiger occupancy. Tiger occupancy was low in some protected areas in India that were connected to extensive areas of tiger habitat in Nepal, which brings into question the efficacy of current protection and management strategies in both India and Nepal. At a finer spatial scale, tiger habitat use was high in trail segments associated with abundant prey and large grasslands, but it declined as human and livestock use increased. We speculate that riparian grasslands may provide tigers with critical refugia from human activity in the daytime and thereby promote tiger occurrence in some multiple-use forests. Restrictions on human use in high-quality tiger habitat in multiple-use forests may complement existing protected areas and collectively promote the persistence of tiger populations in working landscapes. © 2015 Society for Conservation Biology.
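
    The occupancy models used here are multiscale and handle spatial correlation; the simplest member of the family (single-season occupancy with imperfect detection, MacKenzie-style) already shows how non-detections enter the likelihood. The sketch below fits that basic model to simulated data, not the tiger survey itself.

```python
# Simplest single-season occupancy model: sites are occupied with
# probability psi; an occupied site is detected on each of K surveys
# with probability p. Maximum-likelihood fit on simulated data.
import numpy as np
from scipy.optimize import minimize
from scipy.special import comb

rng = np.random.default_rng(5)
n_sites, K, psi_true, p_true = 60, 4, 0.6, 0.4
z = rng.random(n_sites) < psi_true                 # latent occupancy state
d = rng.binomial(K, p_true * z)                    # detections per site

def negloglik(theta):
    psi, p = 1 / (1 + np.exp(-theta))              # logit -> probability
    lik_det = psi * comb(K, d) * p**d * (1 - p) ** (K - d)
    lik_nondet = (1 - psi) + psi * (1 - p) ** K    # never-detected mixture
    lik = np.where(d > 0, lik_det, lik_nondet)
    return -np.sum(np.log(lik))

fit = minimize(negloglik, x0=[0.0, 0.0], method="Nelder-Mead")
print(1 / (1 + np.exp(-fit.x)))                    # estimates of psi and p
```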

  43. Enabling Large-Scale Biomedical Analysis in the Cloud

    PubMed Central

    Lin, Ying-Chih; Yu, Chin-Sheng; Lin, Yen-Jen

    2013-01-01

    Recent progress in high-throughput instrumentations has led to an astonishing growth in both volume and complexity of biomedical data collected from various sources. The planet-size data brings serious challenges to the storage and computing technologies. Cloud computing is an alternative to crack the nut because it gives concurrent consideration to enable storage and high-performance computing on large-scale data. This work briefly introduces the data intensive computing system and summarizes existing cloud-based resources in bioinformatics. These developments and applications would facilitate biomedical research to make the vast amount of diversification data meaningful and usable. PMID:24288665

  4. Elastic "I Think": Stretching over L1 and L2

    ERIC Educational Resources Information Center

    Zhang, Grace Q.; Sabet, Peyman G. P.

    2016-01-01

    While there has been insightful research on the commonly used expression "I think" (IT), this study introduces a non-conventional and innovative conception of elasticity (Zhang 2011), bringing together several properties of IT. Drawn on large-scale naturally occurring classroom data with a rare combination of linguistically and…

  45. A new way to protect privacy in large-scale genome-wide association studies.

    PubMed

    Kamm, Liina; Bogdanov, Dan; Laur, Sven; Vilo, Jaak

    2013-04-01

    Increased availability of various genotyping techniques has initiated a race for finding genetic markers that can be used in diagnostics and personalized medicine. Although many genetic risk factors are known, key causes of common diseases with complex heritage patterns are still unknown. Identification of such complex traits requires a targeted study over a large collection of data. Ideally, such studies bring together data from many biobanks. However, data aggregation on such a large scale raises many privacy issues. We show how to conduct such studies without violating privacy of individual donors and without leaking the data to third parties. The presented solution has provable security guarantees. Supplementary data are available at Bioinformatics online.
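
    For context, a standard building block of the secure multi-party computation frameworks used for such studies (e.g., Sharemind) is additive secret sharing: each donor's value is split into random shares so that no single party ever sees an individual genotype, yet aggregate statistics can still be computed. The sketch below is a minimal illustration under that assumption; the party count, field modulus, and toy genotype data are ours, not the authors' protocol.

        # Minimal sketch of additive secret sharing over a prime field;
        # real GWAS MPC systems add secure arithmetic protocols and
        # provable security guarantees on top of this primitive.
        import random

        PRIME = 2**61 - 1  # field modulus (illustrative choice)

        def share(value, n_parties=3):
            """Split `value` into n additive shares that sum to it mod PRIME."""
            shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
            shares.append((value - sum(shares)) % PRIME)
            return shares

        def reconstruct(shares):
            return sum(shares) % PRIME

        # Each donor's genotype dosage is shared; parties accumulate shares
        # locally, so only the aggregate allele count is ever reconstructed.
        donors = [0, 1, 2, 1, 0, 2]  # toy genotype dosages
        party_totals = [0, 0, 0]
        for g in donors:
            for i, s in enumerate(share(g)):
                party_totals[i] = (party_totals[i] + s) % PRIME

        assert reconstruct(party_totals) == sum(donors)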

  6. Wetlands as large-scale nature-based solutions: status and future challenges for research and management

    NASA Astrophysics Data System (ADS)

    Thorslund, Josefin; Jarsjö, Jerker; Destouni, Georgia

    2017-04-01

    Wetlands are often considered as nature-based solutions that can provide a multitude of services of great social, economic and environmental value to humankind. The services may include recreation, greenhouse gas sequestration, contaminant retention, coastal protection, groundwater level and soil moisture regulation, flood regulation and biodiversity support. Changes in land use, water use and climate can all impact wetland functions and occur at scales extending well beyond the local scale of an individual wetland. However, in practical applications, management decisions usually regard and focus on individual wetland sites and local conditions. To understand the potential usefulness and services of wetlands as larger-scale nature-based solutions, e.g. for mitigating negative impacts from large-scale change pressures, one needs to understand the combined function of multiple wetlands at the relevant large scales. We here systematically investigate if and to what extent research so far has addressed the large-scale dynamics of landscape systems with multiple wetlands, which are likely to be relevant for understanding impacts of regional to global change. Our investigation regards key changes and impacts of relevance for nature-based solutions, such as large-scale nutrient and pollution retention, flow regulation and coastal protection. Although such large-scale knowledge is still limited, evidence suggests that the aggregated functions and effects of multiple wetlands in the landscape can differ considerably from those observed at individual wetlands. Such scale differences may have important implications for wetland function-effect predictability and management under large-scale change pressures and impacts, such as those of climate change.

  7. Fuzzy adaptive strong tracking scaled unscented Kalman filter for initial alignment of large misalignment angles

    NASA Astrophysics Data System (ADS)

    Li, Jing; Song, Ningfang; Yang, Gongliu; Jiang, Rui

    2016-07-01

    In the initial alignment process of a strapdown inertial navigation system (SINS), large misalignment angles introduce nonlinearity, which is usually handled with the scaled unscented Kalman filter (SUKF). In this paper, the problem of large misalignment angles in SINS alignment is further investigated, and the strong tracking scaled unscented Kalman filter (STSUKF) is proposed with fixed parameters to improve convergence speed; these parameters, however, are hand-constructed and uncertain in real applications. To further improve alignment stability and reduce the burden of parameter selection, this paper proposes a fuzzy adaptive strategy combined with STSUKF (FUZZY-STSUKF). An initial alignment scheme for large misalignment angles based on FUZZY-STSUKF is then designed and verified by simulations and a turntable experiment. The results show that the scheme improves the accuracy and convergence speed of SINS initial alignment compared with schemes based on SUKF and STSUKF.
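
    For readers unfamiliar with the machinery, the core of any scaled UKF is the scaled sigma-point transform; the sketch below shows its standard Wan-van der Merwe form, a generic building block rather than the paper's FUZZY-STSUKF (the strong-tracking and fuzzy-adaptation layers sit on top of this).

        # Scaled sigma-point generation for the unscented transform.
        # Generic textbook form; not the paper's FUZZY-STSUKF itself.
        import numpy as np

        def scaled_sigma_points(mean, cov, alpha=1e-3, beta=2.0, kappa=0.0):
            n = mean.size
            lam = alpha**2 * (n + kappa) - n
            S = np.linalg.cholesky((n + lam) * cov)   # matrix square root
            pts = [mean] + [mean + S[:, i] for i in range(n)] \
                         + [mean - S[:, i] for i in range(n)]
            wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))  # mean weights
            wc = wm.copy()                                   # covariance weights
            wm[0] = lam / (n + lam)
            wc[0] = lam / (n + lam) + (1 - alpha**2 + beta)
            return np.array(pts), wm, wc

        # Propagating the points through a nonlinear model and re-weighting
        # gives the predicted mean/covariance without linearizing the model.
        pts, wm, wc = scaled_sigma_points(np.zeros(3), np.eye(3))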

  8. Robust decentralized hybrid adaptive output feedback fuzzy control for a class of large-scale MIMO nonlinear systems and its application to AHS.

    PubMed

    Huang, Yi-Shao; Liu, Wei-Ping; Wu, Min; Wang, Zheng-Wu

    2014-09-01

    This paper presents a novel observer-based decentralized hybrid adaptive fuzzy control scheme for a class of large-scale continuous-time multiple-input multiple-output (MIMO) uncertain nonlinear systems whose state variables are unmeasurable. The scheme integrates fuzzy logic systems, state observers, and strictly positive real conditions to deal with three issues in the control of a large-scale MIMO uncertain nonlinear system: algorithm design, controller singularity, and transient response. The design of the hybrid adaptive fuzzy controller is then extended to a general large-scale uncertain nonlinear system. It is shown that the resulting closed-loop large-scale system remains asymptotically stable and the tracking error converges to zero. Simulations demonstrate the advantages of the scheme. Copyright © 2014. Published by Elsevier Ltd.

  9. Experimental two-dimensional quantum walk on a photonic chip

    PubMed Central

    Lin, Xiao-Feng; Feng, Zhen; Chen, Jing-Yuan; Gao, Jun; Sun, Ke; Wang, Chao-Yue; Lai, Peng-Cheng; Xu, Xiao-Yun; Wang, Yao; Qiao, Lu-Feng; Yang, Ai-Lin

    2018-01-01

    Quantum walks, by virtue of coherent superposition and quantum interference, offer exponential advantages over their classical counterparts in applications such as quantum search and quantum simulation. This quantum-enhanced power is closely tied to the state space of the walk, which can be expanded by enlarging the photon number and/or the dimensions of the evolution network; the former is considerably challenging due to the probabilistic generation of single photons and multiplicative loss. We demonstrate a two-dimensional continuous-time quantum walk by using the external geometry of photonic waveguide arrays, rather than the internal degrees of freedom of photons. Using femtosecond laser direct writing, we construct a large-scale three-dimensional structure that forms a two-dimensional lattice with up to 49 × 49 nodes on a photonic chip. We demonstrate spatial two-dimensional quantum walks using heralded single photons and single-photon-level imaging. We analyze the quantum transport properties via the ballistic evolution pattern and the variance profile, which agree well with simulation results. We further reveal the transient nature that is unique to quantum walks beyond one dimension. An architecture that allows a quantum walk to freely evolve in all directions and at a large scale, combined with defect and disorder control, may bring about powerful and versatile quantum walk machines for classically intractable problems. PMID:29756040

  10. Experimental two-dimensional quantum walk on a photonic chip.

    PubMed

    Tang, Hao; Lin, Xiao-Feng; Feng, Zhen; Chen, Jing-Yuan; Gao, Jun; Sun, Ke; Wang, Chao-Yue; Lai, Peng-Cheng; Xu, Xiao-Yun; Wang, Yao; Qiao, Lu-Feng; Yang, Ai-Lin; Jin, Xian-Min

    2018-05-01

    Quantum walks, by virtue of coherent superposition and quantum interference, offer exponential advantages over their classical counterparts in applications such as quantum search and quantum simulation. This quantum-enhanced power is closely tied to the state space of the walk, which can be expanded by enlarging the photon number and/or the dimensions of the evolution network; the former is considerably challenging due to the probabilistic generation of single photons and multiplicative loss. We demonstrate a two-dimensional continuous-time quantum walk by using the external geometry of photonic waveguide arrays, rather than the internal degrees of freedom of photons. Using femtosecond laser direct writing, we construct a large-scale three-dimensional structure that forms a two-dimensional lattice with up to 49 × 49 nodes on a photonic chip. We demonstrate spatial two-dimensional quantum walks using heralded single photons and single-photon-level imaging. We analyze the quantum transport properties via the ballistic evolution pattern and the variance profile, which agree well with simulation results. We further reveal the transient nature that is unique to quantum walks beyond one dimension. An architecture that allows a quantum walk to freely evolve in all directions and at a large scale, combined with defect and disorder control, may bring about powerful and versatile quantum walk machines for classically intractable problems.
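
    A toy numerical analogue of the ballistic spread these papers measure is easy to reproduce: a continuous-time quantum walk evolves an initial state under the lattice adjacency Hamiltonian. The sketch below uses an arbitrary small lattice and evolution time (our illustrative choices, not the experiment's parameters).

        # Continuous-time quantum walk on an L x L lattice:
        # |psi(t)> = exp(-iHt)|psi(0)>, with H the adjacency matrix.
        # Toy analogue of the waveguide-array chip, not its simulation code.
        import numpy as np
        from scipy.linalg import expm

        L = 11                 # lattice side (toy value; the chip reaches 49 x 49)
        N = L * L
        idx = lambda x, y: x * L + y

        H = np.zeros((N, N))
        for x in range(L):
            for y in range(L):
                if x + 1 < L:
                    H[idx(x, y), idx(x + 1, y)] = H[idx(x + 1, y), idx(x, y)] = 1.0
                if y + 1 < L:
                    H[idx(x, y), idx(x, y + 1)] = H[idx(x, y + 1), idx(x, y)] = 1.0

        psi0 = np.zeros(N, dtype=complex)
        psi0[idx(L // 2, L // 2)] = 1.0     # walker injected at the center

        psi_t = expm(-1j * H * 2.0) @ psi0  # evolve for t = 2 (arbitrary units)
        prob = np.abs(psi_t) ** 2           # output probability distribution

        # Positional variance grows ~ t^2 (ballistic) vs ~ t for a classical walk.
        xs = np.repeat(np.arange(L), L)     # x-coordinate of each node
        var_x = (prob * xs**2).sum() - (prob * xs).sum() ** 2
        print(f"total prob {prob.sum():.3f}, positional variance {var_x:.2f}")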

  11. CERN and LHC - Their Place in Global Science

    ScienceCinema

    None

    2018-01-09

    The Large Hadron Collider (LHC) is the largest scientific instrument in the world. It brings into collision intense beams of protons and ions to explore the structure of matter and investigate the forces of nature at an unprecedented energy scale, thus serving a community of some 7,000 particle physicists from all over the world.

  12. Implementing Assessment Engineering in the Uniform Certified Public Accountant (CPA) Examination

    ERIC Educational Resources Information Center

    Burke, Matthew; Devore, Richard; Stopek, Josh

    2013-01-01

    This paper describes efforts to bring principled assessment design to a large-scale, high-stakes licensure examination by employing the frameworks of Assessment Engineering (AE), the Revised Bloom's Taxonomy (RBT), and Cognitive Task Analysis (CTA). The Uniform CPA Examination is practice-oriented and focuses on the skills of accounting. In…

  13. Bringing Abstract Academic Integrity and Ethical Concepts into Real-Life Situations

    ERIC Educational Resources Information Center

    Kwong, Theresa; Wong, Eva; Yue, Kevin

    2017-01-01

    This paper reports the learning analytics on the initial stages of a large-scale, government-funded project which inducts university students in Hong Kong into consideration of academic integrity and ethics through mobile Augmented Reality (AR) learning trails--Trails of Integrity and Ethics (TIEs)--accessed on smart devices. The trails immerse…

  14. The Infrastructure of Accountability: Data Use and the Transformation of American Education

    ERIC Educational Resources Information Center

    Anagnostopoulos, Dorothea, Ed.; Rutledge, Stacey A., Ed.; Jacobsen, Rebecca, Ed.

    2013-01-01

    "The Infrastructure of Accountability" brings together leading and emerging scholars who set forth an ambitious conceptual framework for understanding the full impact of large-scale, performance-based accountability systems on education. Over the past 20 years, schools and school systems have been utterly reshaped by the demands of…

  15. Mission Impossible? Leadership Responsibility without Authority for Initiatives To Reorganise Schools.

    ERIC Educational Resources Information Center

    Wallace, Mike

    This paper explores how characteristics of complex educational change may virtually dictate the leadership strategies adopted by those charged with bringing about change. The change in question here is the large-scale reorganization of local education authorities (LEAs) across England. The article focuses on how across-the-board initiatives to…

  16. Bringing Open Educational Practice to a Research-Intensive University: Prospects and Challenges

    ERIC Educational Resources Information Center

    Masterman, Elizabeth

    2016-01-01

    This article describes a small-scale study that explored the relationship between the pedagogical practices characterised as "open" and the existing model of undergraduate teaching and learning at a large research-intensive university (RIU). The aim was to determine the factors that might enable (conversely impede) the greater uptake of…

  17. Research to Real Life, 2006: Innovations in Deaf-Blindness

    ERIC Educational Resources Information Center

    Leslie, Gail, Ed.

    2006-01-01

    This publication presents several projects that support children who are deaf-blind. These projects are: (1) Learning To Learn; (2) Project SALUTE; (3) Project SPARKLE; (4) Bringing It All Back Home; (5) Project PRIIDE; and (6) Including Students With Deafblindness In Large Scale Assessment Systems. Each project lists components, key practices,…

  18. Unlocking Flexibility: Integrated Optimization and Control of Multienergy Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dall'Anese, Emiliano; Mancarella, Pierluigi; Monti, Antonello

    Electricity, natural gas, water, and district heating/cooling systems are predominantly planned and operated independently. However, it is increasingly recognized that integrated optimization and control of such systems at multiple spatiotemporal scales can bring significant socioeconomic, operational efficiency, and environmental benefits. Accordingly, the concept of the multi-energy system is gaining considerable attention, with the overarching objectives of 1) uncovering fundamental gains (and potential drawbacks) that emerge from the integrated operation of multiple systems and 2) developing holistic yet computationally affordable optimization and control methods that maximize operational benefits, while 3) acknowledging intrinsic interdependencies and quality-of-service requirements for each provider.

  19. WHO WOULD EAT IN A WORLD WITHOUT PHOSPHORUS? A GLOBAL DYNAMIC MODEL

    NASA Astrophysics Data System (ADS)

    Dumas, M.

    2009-12-01

    Phosphorus is an indispensable and non-substitutable resource, as agriculture is impossible if soils do not hold adequate amounts of this nutrient. Phosphorus is also considered to be a non-renewable and increasingly scarce resource, as phosphate rock reserves - as one measure of availability amongst others - are estimated to last for 50 to 100 years at current rates of consumption. How would food production decline in different parts of the world in the scenario of a sudden shortage in phosphorus? To answer this question and explore management scenarios, I present a probabilistic model of the structure and dynamics of the global P cycle in the world’s agro-ecosystems. The model proposes an original solution to the challenge of capturing the large-scale aggregate dynamics of multiple micro-scale soil cycling processes. Furthermore, it integrates the essential natural processes with a model of human-managed flows, thereby bringing together several decades of research and measurements from soil science, plant nutrition and long-term agricultural experiments from around the globe. In this paper, I present the model, the first simulation results and the implications for long-term sustainable management of phosphorus and soil fertility.

  20. Livestock First Reached Southern Africa in Two Separate Events.

    PubMed

    Sadr, Karim

    2015-01-01

    After several decades of research on the subject, we now know when the first livestock reached southern Africa, but the question of how they got there remains contentious. Debate centres on whether they were brought with a large migration of Khoe-speakers who originated from East Africa; or whether the livestock were traded down-the-line among hunter-gatherer communities; or indeed whether there was a long history of diverse small-scale population movements in this part of the world, one or more of which 'infiltrated' livestock into southern Africa. A new analysis of the distribution of stone toolkits from a sizeable sample of sub-equatorial African Later Stone Age sites, coupled with existing knowledge of the distribution of the earliest livestock remains and ceramic vessels, has allowed us to isolate two separate infiltration events that brought the first livestock into southern Africa just over 2000 years ago; one infiltration was along the Atlantic seaboard and another entered the middle reaches of the Limpopo River Basin. These findings agree well with the latest results of genetic research, which together indicate that multiple, small-scale infiltrations probably were responsible for bringing the first livestock into southern Africa.

  1. Forum: The Rise of International Large-Scale Assessments and Rationales for Participation

    ERIC Educational Resources Information Center

    Addey, Camilla; Sellar, Sam; Steiner-Khamsi, Gita; Lingard, Bob; Verger, Antoni

    2017-01-01

    This Forum discusses the significant growth of international large-scale assessments (ILSAs) since the mid-1990s. Addey and Sellar's contribution ("A Framework for Analysing the Multiple Rationales for Participating in International Large-Scale Assessments") outlines a framework of rationales for participating in ILSAs and examines the…

  2. Geospatial Optimization of Siting Large-Scale Solar Projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Macknick, Jordan; Quinby, Ted; Caulfield, Emmet

    2014-03-01

    Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.
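
    To make the multi-criteria idea concrete, here is a minimal sketch of weighted-overlay site scoring of the kind such GIS tools perform; the criterion layers, weights, and normalization below are illustrative assumptions, not the NREL tool's actual algorithm or data.

        # Weighted multi-criteria overlay: score = sum_i w_i * normalized_criterion_i.
        # Toy stand-in for a user-driven siting tool; not NREL's implementation.
        import numpy as np

        rng = np.random.default_rng(0)
        shape = (100, 100)  # raster grid of candidate cells

        # Illustrative criterion rasters (synthetic values).
        solar_resource = rng.uniform(4, 8, shape)   # kWh/m^2/day; higher is better
        slope = rng.uniform(0, 20, shape)           # percent; lower is better
        dist_to_lines = rng.uniform(0, 50, shape)   # km to transmission; lower is better

        def normalize(x, invert=False):
            """Rescale a raster to [0, 1], optionally inverting 'lower is better'."""
            z = (x - x.min()) / (x.max() - x.min())
            return 1 - z if invert else z

        weights = {"solar": 0.5, "slope": 0.2, "grid": 0.3}  # user-defined priorities
        score = (weights["solar"] * normalize(solar_resource)
                 + weights["slope"] * normalize(slope, invert=True)
                 + weights["grid"] * normalize(dist_to_lines, invert=True))

        best = np.unravel_index(score.argmax(), shape)
        print("highest-ranked cell:", best, "score:", round(float(score[best]), 3))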

  3. A Step in the Right Direction: Learning Walk Brings Districts Together to Examine Teacher Evaluation and Support Roles

    ERIC Educational Resources Information Center

    Armstrong, Anthony

    2012-01-01

    In a private room at the back of a busy restaurant just outside of Tampa, Florida, drinks and appetizers went unnoticed as 12 diners sat around one large table and engaged in multiple rapid-fire conversations about professional learning. On one side of each conversation were representatives of Memphis (Tennessee) City Schools. The district was in…

  4. Explorative Function in Williams Syndrome Analyzed through a Large-Scale Task with Multiple Rewards

    ERIC Educational Resources Information Center

    Foti, F.; Petrosini, L.; Cutuli, D.; Menghini, D.; Chiarotti, F.; Vicari, S.; Mandolesi, L.

    2011-01-01

    This study aimed to evaluate spatial function in subjects with Williams syndrome (WS) by using a large-scale task with multiple rewards and comparing the spatial abilities of WS subjects with those of mental age-matched control children. In the present spatial task, WS participants had to explore an open space to search nine rewards placed in…

  5. FEATURE 3, LARGE GUN POSITION, SHOWING MULTIPLE COMPARTMENTS, VIEW FACING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    FEATURE 3, LARGE GUN POSITION, SHOWING MULTIPLE COMPARTMENTS, VIEW FACING SOUTH (with scale stick). - Naval Air Station Barbers Point, Anti-Aircraft Battery Complex-Large Gun Position, East of Coral Sea Road, northwest of Hamilton Road, Ewa, Honolulu County, HI

  6. Relationships between Event-Related Potentials and Behavioral and Scholastic Measures of Reading Ability: A Large-Scale, Cross-Sectional Study

    ERIC Educational Resources Information Center

    Khalifian, Negin; Stites, Mallory C.; Laszlo, Sarah

    2016-01-01

    In the cognitive, computational, neuropsychological, and educational literatures, it is established that children approach text in unique ways, and that even adult readers can differ in the strategies they bring to reading. In the developmental event-related potential (ERP) literature, however, children with differing degrees of reading ability…

  7. Wind Farms in Rural Areas: How Far Do Community Benefits from Wind Farms Represent a Local Economic Development Opportunity?

    ERIC Educational Resources Information Center

    Munday, Max; Bristow, Gill; Cowell, Richard

    2011-01-01

    Although the large-scale deployment of renewable technologies can bring significant, localised economic and environmental changes, there has been remarkably little empirical investigation of the rural development implications. This paper seeks to redress this through an analysis of the economic development opportunities surrounding wind energy…

  8. Guide to Datasets for Research and Policymaking in Child Care and Early Education

    ERIC Educational Resources Information Center

    Romero, Mariajose; Douglas-Hall, Ayana

    2009-01-01

    This Guide is an annotated bibliography of existing large scale data sets that provide useful information to policymakers, researchers, state administrators, and others in the field of child care and early education. The Guide follows an ecological approach to research and policy in the field: it brings attention not only to children themselves,…

  9. Acquisition of electroencephalographic data in a large regional hospital - Bringing the brain waves to the computer.

    NASA Technical Reports Server (NTRS)

    Low, M. D.; Baker, M.; Ferguson, R.; Frost, J. D., Jr.

    1972-01-01

    This paper describes a complete electroencephalographic acquisition and transmission system, designed to meet the needs of a large hospital with multiple critical care patient monitoring units. The system provides rapid and prolonged access to a centralized recording and computing area from remote locations within the hospital complex, and from locations in other hospitals and other cities. The system includes quick-on electrode caps, amplifier units and cable transmission for access from within the hospital, and EEG digitization and telephone transmission for access from other hospitals or cities.

  10. SchizConnect: Mediating Neuroimaging Databases on Schizophrenia and Related Disorders for Large-Scale Integration

    PubMed Central

    Wang, Lei; Alpert, Kathryn I.; Calhoun, Vince D.; Cobia, Derin J.; Keator, David B.; King, Margaret D.; Kogan, Alexandr; Landis, Drew; Tallis, Marcelo; Turner, Matthew D.; Potkin, Steven G.; Turner, Jessica A.; Ambite, Jose Luis

    2015-01-01

    SchizConnect (www.schizconnect.org) is built to address the issues of multiple data repositories in schizophrenia neuroimaging studies. It includes a level of mediation—translating across data sources—so that the user can place one query, e.g. for diffusion images from male individuals with schizophrenia, and find out from across participating data sources how many datasets there are, as well as downloading the imaging and related data. The current version handles the Data Usage Agreements across different studies, as well as interpreting database-specific terminologies into a common framework. New data repositories can also be mediated to bring immediate access to existing datasets. Compared with centralized, upload data-sharing models, SchizConnect is a unique, virtual database with a focus on schizophrenia and related disorders that can mediate live data as information is updated at each data source. It is our hope that SchizConnect can facilitate testing new hypotheses through aggregated datasets, promoting discovery related to the mechanisms underlying schizophrenic dysfunction. PMID:26142271

  11. Research on unit commitment with large-scale wind power connected power system

    NASA Astrophysics Data System (ADS)

    Jiao, Ran; Zhang, Baoqun; Chi, Zhongjun; Gong, Cheng; Ma, Longfei; Yang, Bing

    2017-01-01

    Large-scale integration of wind power into the power grid brings severe challenges to economic dispatch because of the stochastic volatility of wind generation. This paper reviews unit commitment with wind farms in two parts: modeling and solution methods. Formulations are classified by objective function and constraints, and their structures and characteristics are summarized. Finally, the open issues and likely directions of future research and development are discussed, with a view to the requirements of electricity markets, energy-saving generation dispatch and smart grids, and as a reference for researchers and practitioners in this field.
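
    A toy formulation may make the underlying problem concrete. The sketch below uses the open-source PuLP modeler (our choice, not one named in the survey) with made-up unit data, and stands in a simple per-hour commitment cost for true start-up logic; the surveyed models add reserves, ramping limits, and stochastic wind scenarios.

        # Toy unit-commitment MILP: two thermal units, three hours, fixed wind
        # forecast. Illustrative formulation only, not a surveyed model.
        import pulp

        hours = range(3)
        demand = [120, 180, 150]          # MW
        wind = [30, 10, 40]               # MW forecast (treated as certain here)
        units = {"U1": dict(pmax=100, cost=20, commit=50),
                 "U2": dict(pmax=80,  cost=35, commit=30)}

        prob = pulp.LpProblem("toy_uc", pulp.LpMinimize)
        on = pulp.LpVariable.dicts("on", (units, hours), cat="Binary")
        p = pulp.LpVariable.dicts("p", (units, hours), lowBound=0)

        # Fuel cost plus a per-hour commitment cost (a stand-in for start-up cost).
        prob += pulp.lpSum(units[u]["cost"] * p[u][t] + units[u]["commit"] * on[u][t]
                           for u in units for t in hours)
        for t in hours:
            prob += pulp.lpSum(p[u][t] for u in units) + wind[t] >= demand[t]
            for u in units:
                prob += p[u][t] <= units[u]["pmax"] * on[u][t]

        prob.solve(pulp.PULP_CBC_CMD(msg=False))
        for u in units:
            print(u, [round(p[u][t].value(), 1) for t in hours])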

  12. The first in situ observation of torsional Alfvén waves during the interaction of large-scale magnetic clouds

    NASA Astrophysics Data System (ADS)

    Raghav, Anil N.; Kule, Ankita

    2018-05-01

    Large-scale magnetic clouds such as coronal mass ejections (CMEs) are fundamental drivers of space weather. The interaction of multiple CMEs in interplanetary space affects their dynamic evolution and geo-effectiveness. Complex and merged multiple magnetic clouds appear as the in situ signature of interacting CMEs. Alfvén waves have been speculated to be one of the major possible energy exchange/dissipation mechanisms during the interaction, but no observational evidence had been reported in the literature. Case studies of CME-CME collision events suggest that the magnetic and thermal energy of the CME is converted into kinetic energy. Moreover, magnetic reconnection is thought to be responsible for the merging of multiple magnetic clouds. Here, we present unambiguous evidence of sunward torsional Alfvén waves in the interaction region after the super-elastic collision of multiple CMEs. The Walén relation is used to confirm the presence of Alfvén waves in the interaction region of multiple CMEs/magnetic clouds. We conclude that Alfvén waves and magnetic reconnection are the possible energy exchange/dissipation mechanisms during collisions of large-scale magnetic clouds. This study has significant implications not only for CME-magnetosphere interactions but also for the interstellar medium, where interactions of large-scale magnetic clouds are possible.
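
    The Walén test referenced here compares velocity fluctuations with Alfvén-velocity fluctuations, dV ≈ ±dV_A with V_A = B/√(μ₀ρ); a regression slope near ±1 indicates Alfvénic fluctuations (negative for sunward-propagating waves in anti-sunward field). The sketch below applies the test to synthetic single-component data; the density and field values are illustrative, not the event's measurements.

        # Walen test: Alfvenic fluctuations satisfy dV = +/- dV_A, with
        # V_A = B / sqrt(mu0 * rho). Synthetic single-component illustration.
        import numpy as np

        mu0 = 4e-7 * np.pi            # vacuum permeability (SI)
        rng = np.random.default_rng(1)

        n = 500
        rho = 8e-21 * np.ones(n)      # proton mass density, kg/m^3 (~5 cm^-3)
        dB = 2e-9 * np.sin(np.linspace(0, 20 * np.pi, n))   # field fluctuation, T
        dV_A = dB / np.sqrt(mu0 * rho)                      # Alfven speed fluct., m/s
        dV = -dV_A + 0.1 * dV_A.std() * rng.standard_normal(n)  # slope ~ -1 case

        slope = np.polyfit(dV_A, dV, 1)[0]
        corr = np.corrcoef(dV_A, dV)[0, 1]
        print(f"Walen slope {slope:+.2f}, correlation {corr:+.2f}")  # near -1, -1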

  13. BAMBUS: a new inelastic multiplexed neutron spectrometer for PANDA

    NASA Astrophysics Data System (ADS)

    Lim, J. A.; Siemensmeyer, K.; Čermák, P.; Lake, B.; Schneidewind, A.; Inosov, D. S.

    2015-03-01

    We report on plans for a multiplexed neutron analyser option for the PANDA spectrometer. The key design concept is to have many analysers positioned to give a large coverage in the scattering plane, and multiple arcs of these analysers to measure different energy transfers simultaneously. The main goal is to bring intensity gains and improved reciprocal-space and energy mapping capabilities to the existing cold triple-axis spectrometer.

  14. Framing Innovation: The Impact of the Superintendent's Technology Infrastructure Decisions on the Acceptance of Large-Scale Technology Initiatives

    ERIC Educational Resources Information Center

    Arnold, Erik P.

    2014-01-01

    A multiple-case qualitative study of five school districts that had implemented various large-scale technology initiatives was conducted to describe what superintendents do to gain acceptance of those initiatives. The large-scale technology initiatives in the five participating districts included 1:1 District-Provided Device laptop and tablet…

  15. The JASMIN Analysis Platform - bridging the gap between traditional climate data practices and data-centric analysis paradigms

    NASA Astrophysics Data System (ADS)

    Pascoe, Stephen; Iwi, Alan; kershaw, philip; Stephens, Ag; Lawrence, Bryan

    2014-05-01

    The advent of large-scale data and the consequent analysis problems have led to two new challenges for the research community: how to share such data to get the maximum value and how to carry out efficient analysis. Solving both challenges requires a form of parallelisation: the first is social parallelisation (involving trust and information sharing), the second data parallelisation (involving new algorithms and tools). The JASMIN infrastructure supports both kinds of parallelism by providing a multi-tenant environment with petabyte-scale storage, VM provisioning and batch cluster facilities. The JASMIN Analysis Platform (JAP) is an analysis software layer for JASMIN which emphasises ease of transition from a researcher's local environment to JASMIN. JAP brings together tools traditionally used by multiple communities and configures them to work together, enabling users to move analysis from their local environment to JASMIN without rewriting code. JAP also provides facilities to exploit JASMIN's parallel capabilities whilst maintaining a familiar analysis environment wherever possible. Modern open-source analysis tools typically have many dependent packages, increasing the installation burden on system administrators. When you consider a suite of tools, often with both common and conflicting dependencies, analysis pipelines can become locked to a particular installation simply because of the effort required to reconstruct the dependency tree. JAP addresses this problem by providing a consistent suite of RPMs compatible with RedHat Enterprise Linux and CentOS 6.4. Researchers can install JAP locally, either as RPMs or through a pre-built VM image, giving them the confidence that moving analysis to JASMIN will not disrupt their environment. Analysis parallelisation is in its infancy in the climate sciences, with few tools capable of exploiting any parallel environment beyond manual scripting of the use of multiple processors. JAP begins to bridge this gap through a variety of higher-level tools for parallelisation and job scheduling, such as IPython-parallel and MPI support for interactive analysis languages. We find that enabling even simple parallelisation of workflows, together with the state-of-the-art I/O performance of JASMIN storage, provides many users with the large increases in efficiency they need to scale their analyses to contemporary data volumes and tackle new, previously inaccessible problems.

  16. High-End Computing for Incompressible Flows

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan; Kiris, Cetin

    2001-01-01

    The objective of the First MIT Conference on Computational Fluid and Solid Mechanics (June 12-14, 2001) is to bring together industry and academia (and government) to nurture the next generation in computational mechanics. The objective of the current talk, 'High-End Computing for Incompressible Flows', is to discuss some of the current issues in large scale computing for mission-oriented tasks.

  17. Large Scale Landslide Database System Established for the Reservoirs in Southern Taiwan

    NASA Astrophysics Data System (ADS)

    Tsai, Tsai-Tsung; Tsai, Kuang-Jung; Shieh, Chjeng-Lun

    2017-04-01

    Typhoon Morakot's severe impact on southern Taiwan awakened public awareness of large-scale landslide disasters. Such disasters produce large quantities of sediment that degrade the operating functions of reservoirs. To reduce the risk of these disasters within the study area, the establishment of a database for hazard mitigation and disaster prevention is necessary. Real-time data and extensive archives of engineering data, environmental information, photographs, and video not only help people make appropriate decisions but also offer substantial scope for processing and value-adding. This study defined basic data formats and standards for the various types of data collected about these reservoirs and provided a management platform based on them. For practicality and convenience, the large-scale landslide disaster database system can both provide and receive information, so users can work with it on different types of devices. Information technology advances extremely quickly, and even the most modern system can become outdated at any time; to provide long-term service, the system therefore preserves the possibility of user-defined data formats/standards and user-defined system structure. The system established by this study is based on the HTML5 standard and uses responsive web design technology, so users can easily operate and extend this large-scale landslide disaster database system.

  18. Do large-scale assessments measure students' ability to integrate scientific knowledge?

    NASA Astrophysics Data System (ADS)

    Lee, Hee-Sun

    2010-03-01

    Large-scale assessments are used to diagnose the current status of student achievement in science and to compare students across schools, states, and countries. For efficiency, multiple-choice items and dichotomously scored open-ended items are used pervasively in large-scale assessments such as the Trends in International Mathematics and Science Study (TIMSS). This study investigated how well these items measure secondary school students' ability to integrate scientific knowledge. The study collected responses of 8,400 students to 116 multiple-choice and 84 open-ended items and applied an Item Response Theory analysis based on the Rasch Partial Credit Model. Results indicate that most multiple-choice items and dichotomously scored open-ended items can determine whether students hold normative ideas about science topics, but cannot measure whether students integrate multiple pieces of relevant science ideas. Only when the scoring rubric is redesigned to capture subtle nuances of students' open-ended responses do open-ended items become a valid and reliable tool to assess students' knowledge-integration ability.
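
    For reference, the Rasch Partial Credit Model used in that analysis gives the probability of scoring in category k as a function of ability θ and item step difficulties δ_j: P(X=k|θ) ∝ exp(Σ_{j≤k}(θ − δ_j)). A minimal sketch with illustrative parameter values (not the study's calibrated items) follows.

        # Rasch Partial Credit Model: P(X=k | theta) for an item with step
        # difficulties delta_1..delta_m (values below are illustrative).
        import numpy as np

        def pcm_probs(theta, deltas):
            """Category probabilities 0..m for ability `theta`."""
            steps = np.concatenate(([0.0], theta - np.asarray(deltas)))
            logits = np.cumsum(steps)             # cumulative step logits
            expu = np.exp(logits - logits.max())  # numerically stable softmax
            return expu / expu.sum()

        # A 3-category item (scores 0, 1, 2) with steps at -0.5 and +1.0 logits:
        for theta in (-1.0, 0.0, 1.5):
            print(theta, np.round(pcm_probs(theta, [-0.5, 1.0]), 3))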

  19. Exploring asynchronous brainstorming in large groups: a field comparison of serial and parallel subgroups.

    PubMed

    de Vreede, Gert-Jan; Briggs, Robert O; Reiter-Palmon, Roni

    2010-04-01

    The aim of this study was to compare the results of two different modes of using multiple groups (instead of one large group) to identify problems and develop solutions. Many of the complex problems facing organizations today require the use of very large groups or collaborations of groups from multiple organizations. There are many logistical problems associated with the use of such large groups, including the difficulty of bringing everyone together at the same time and location. A field study involving two different organizations compared the productivity and satisfaction of groups. The approaches included (a) multiple small groups, each completing the entire process from start to end, with the results combined at the end (parallel mode); and (b) multiple subgroups, each building on the work of previous subgroups (serial mode). Groups using the serial mode produced more elaborations than parallel groups, whereas parallel groups produced more unique ideas than serial groups. No significant differences were found in satisfaction with process and outcomes between the two modes. The preferred mode depends on the type of task facing the group. Parallel groups are better suited to tasks for which a variety of new ideas is needed, whereas serial groups are best suited when elaboration and in-depth thinking about the solution are required. Results of this research can guide the development of facilitated sessions of large groups or "teams of teams."

  20. Context-dependence of long-term responses of terrestrial gastropod populations to large-scale disturbance.

    Treesearch

    Christopher P. Bloch; Michael R. Willig

    2006-01-01

    Large-scale natural disturbances, such as hurricanes, can have profound effects on animal populations. Nonetheless, generalizations about the effects of disturbance are elusive, and few studies consider long-term responses of a single population or community to multiple large-scale disturbance events. In the last 20 y, two major hurricanes (Hugo and Georges) have struck...

  1. Spherical cows in the sky with fab four

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaloper, Nemanja; Sandora, McCullen, E-mail: kaloper@physics.ucdavis.edu, E-mail: mesandora@ucdavis.edu

    2014-05-01

    We explore spherically symmetric static solutions in a subclass of unitary scalar-tensor theories of gravity, called the 'Fab Four' models. The weak field large distance solutions may be phenomenologically viable, but only if the Gauss-Bonnet term is negligible. Only in this limit will the Vainshtein mechanism work consistently. Further, classical constraints and unitarity bounds constrain the models quite tightly. Nevertheless, in the limits where the range of individual terms at large scales is respectively Kinetic Braiding, Horndeski, and Gauss-Bonnet, the horizon scale effects may occur while the theory satisfies Solar system constraints and, marginally, unitarity bounds. On the other hand, to bring the cutoff down to below a millimeter constrains all the coupling scales such that 'Fab Fours' can't be heard outside of the Solar system.

  2. Shifting from Stewardship to Analytics of Massive Science Data

    NASA Astrophysics Data System (ADS)

    Crichton, D. J.; Doyle, R.; Law, E.; Hughes, S.; Huang, T.; Mahabal, A.

    2015-12-01

    Currently, the analysis of large data collections relies on traditional computational and data analysis approaches, which require users to bring data to their desktops and perform local analysis. Data collection, archiving, and analysis for future remote sensing missions, be they earth science satellites, planetary robotic missions, or massive radio observatories, may not scale as more capable instruments stress existing architectural approaches and systems with more continuous data streams, data from multiple observational platforms, and measurements and models from different agencies. A new paradigm is needed to increase the productivity and effectiveness of scientific data analysis. This paradigm must recognize that architectural choices, data processing, management, and analysis are interrelated and must be carefully coordinated in any system that aims to allow efficient, interactive scientific exploration and discovery of massive data collections. Future observational systems, including satellite and airborne experiments, and research in climate modeling will significantly increase data volumes, requiring new methodological approaches to data analytics in which users can interact with the data more effectively and apply automated mechanisms for data reduction and fusion across these massive repositories. This presentation will discuss architecture, use cases, and approaches for developing a big-data analytics strategy across multiple science disciplines.

  3. Feasibility of Floating Platform Systems for Wind Turbines: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Musial, W.; Butterfield, S.; Boone, A.

    This paper provides a general technical description of several types of floating platforms for wind turbines. Platform topologies are classified into multiple- or single-turbine floaters and by mooring method. Platforms using catenary mooring systems are contrasted with vertical mooring systems, and the advantages and disadvantages of each are discussed. Specific anchor types are described in detail. A rough cost comparison is performed for two different platform architectures using a generic 5-MW wind turbine. One is a Dutch tri-floater design using a catenary mooring system; the other is a mono-column tension-leg platform developed at the National Renewable Energy Laboratory. Cost estimates showed that single-unit production cost is $7.1 M for the Dutch tri-floater and $6.5 M for the NREL TLP concept. However, value engineering, multiple-unit series production, and platform/turbine system optimization can lower the unit platform costs to $4.26 M and $2.88 M, respectively, with significant potential to reduce cost further through system optimization. These foundation costs are within the range necessary to bring the cost of energy down to the DOE target range of $0.05/kWh for large-scale deployment of offshore floating wind turbines.

  4. Co-governing decentralised water systems: an analytical framework.

    PubMed

    Yu, C; Brown, R; Morison, P

    2012-01-01

    Current discourses in urban water management emphasise a diversity of water sources and scales of infrastructure for resilience and adaptability. During the last 2 decades, in particular, various small-scale systems emerged and developed so that the debate has largely moved from centralised versus decentralised water systems toward governing integrated and networked systems of provision and consumption where small-scale technologies are embedded in large-scale centralised infrastructures. However, while centralised systems have established boundaries of ownership and management, decentralised water systems (such as stormwater harvesting technologies for the street, allotment/house scales) do not, therefore the viability for adoption and/or continued use of decentralised water systems is challenged. This paper brings together insights from the literature on public sector governance, co-production and social practices model to develop an analytical framework for co-governing such systems. The framework provides urban water practitioners with guidance when designing co-governance arrangements for decentralised water systems so that these systems continue to exist, and become widely adopted, within the established urban water regime.

  5. Parallel Clustering Algorithm for Large-Scale Biological Data Sets

    PubMed Central

    Wang, Minchao; Zhang, Wu; Ding, Wang; Dai, Dongbo; Zhang, Huiran; Xie, Hao; Chen, Luonan; Guo, Yike; Xie, Jiang

    2014-01-01

    Background: The recent explosion of biological data brings a great challenge for traditional clustering algorithms. With the increasing scale of data sets, much larger memory and longer runtimes are required for cluster identification. The affinity propagation algorithm outperforms many other classical clustering algorithms and is widely applied in biological research. However, its time and space complexity become a great bottleneck when handling large-scale data sets. Moreover, the similarity matrix, whose construction takes long runtime, is required before running the affinity propagation algorithm, since the algorithm clusters data sets based on the similarities between data pairs. Methods: Two types of parallel architectures are proposed in this paper to accelerate the similarity matrix construction and the affinity propagation algorithm. A shared-memory architecture is used to construct the similarity matrix, and a distributed system is used for the affinity propagation algorithm because of its large memory size and great computing capacity. An appropriate scheme of data partition and reduction is designed to minimize the global communication cost among processes. Results: A speedup of 100 is gained with 128 cores. The runtime is reduced from several hours to a few seconds, which indicates that the parallel algorithm is capable of handling large-scale data sets effectively. The parallel affinity propagation also achieves good performance when clustering large-scale gene (microarray) data and detecting families in large protein superfamilies. PMID:24705246
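
    As a point of reference for the serial baseline being parallelized, here is a minimal sketch using the scikit-learn implementation of affinity propagation on synthetic data; this is a different, single-machine implementation, not the authors' MPI/shared-memory code.

        # Serial affinity propagation via scikit-learn, shown as the kind of
        # baseline the paper parallelizes; not the authors' implementation.
        import numpy as np
        from sklearn.cluster import AffinityPropagation
        from sklearn.datasets import make_blobs

        X, _ = make_blobs(n_samples=300, centers=4, random_state=0)

        # The algorithm exchanges "responsibility"/"availability" messages over
        # an n x n similarity matrix, so memory grows quadratically with n --
        # exactly the bottleneck motivating the distributed version.
        ap = AffinityPropagation(damping=0.9, random_state=0).fit(X)
        print("clusters found:", len(ap.cluster_centers_indices_))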

  6. Networks and landscapes: a framework for setting goals and evaluating performance at the large landscape scale

    Treesearch

    R Patrick Bixler; Shawn Johnson; Kirk Emerson; Tina Nabatchi; Melly Reuling; Charles Curtin; Michele Romolini; Morgan Grove

    2016-01-01

    The objective of large landscape conservation is to mitigate complex ecological problems through interventions at multiple and overlapping scales. Implementation requires coordination among a diverse network of individuals and organizations to integrate local-scale conservation activities with broad-scale goals. This requires an understanding of the governance options...

  7. Multi-scale pixel-based image fusion using multivariate empirical mode decomposition.

    PubMed

    Rehman, Naveed ur; Ehsan, Shoaib; Abdullah, Syed Muhammad Umer; Akhtar, Muhammad Jehanzaib; Mandic, Danilo P; McDonald-Maier, Klaus D

    2015-05-08

    A novel scheme to perform the fusion of multiple images using the multivariate empirical mode decomposition (MEMD) algorithm is proposed. Standard multi-scale fusion techniques make a priori assumptions regarding input data, whereas standard univariate empirical mode decomposition (EMD)-based fusion techniques suffer from inherent mode mixing and mode misalignment issues, characterized respectively by either a single intrinsic mode function (IMF) containing multiple scales or the same indexed IMFs corresponding to multiple input images carrying different frequency information. We show that MEMD overcomes these problems by being fully data adaptive and by aligning common frequency scales from multiple channels, thus enabling their comparison at a pixel level and subsequent fusion at multiple data scales. We then demonstrate the potential of the proposed scheme on a large dataset of real-world multi-exposure and multi-focus images and compare the results against those obtained from standard fusion algorithms, including the principal component analysis (PCA), discrete wavelet transform (DWT) and non-subsampled contourlet transform (NCT). A variety of image fusion quality measures are employed for the objective evaluation of the proposed method. We also report the results of a hypothesis testing approach on our large image dataset to identify statistically-significant performance differences.

  8. Multi-Scale Pixel-Based Image Fusion Using Multivariate Empirical Mode Decomposition

    PubMed Central

    Rehman, Naveed ur; Ehsan, Shoaib; Abdullah, Syed Muhammad Umer; Akhtar, Muhammad Jehanzaib; Mandic, Danilo P.; McDonald-Maier, Klaus D.

    2015-01-01

    A novel scheme to perform the fusion of multiple images using the multivariate empirical mode decomposition (MEMD) algorithm is proposed. Standard multi-scale fusion techniques make a priori assumptions regarding input data, whereas standard univariate empirical mode decomposition (EMD)-based fusion techniques suffer from inherent mode mixing and mode misalignment issues, characterized respectively by either a single intrinsic mode function (IMF) containing multiple scales or the same indexed IMFs corresponding to multiple input images carrying different frequency information. We show that MEMD overcomes these problems by being fully data adaptive and by aligning common frequency scales from multiple channels, thus enabling their comparison at a pixel level and subsequent fusion at multiple data scales. We then demonstrate the potential of the proposed scheme on a large dataset of real-world multi-exposure and multi-focus images and compare the results against those obtained from standard fusion algorithms, including the principal component analysis (PCA), discrete wavelet transform (DWT) and non-subsampled contourlet transform (NCT). A variety of image fusion quality measures are employed for the objective evaluation of the proposed method. We also report the results of a hypothesis testing approach on our large image dataset to identify statistically-significant performance differences. PMID:26007714
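
    Neither MEMD implementation is reproduced here, but the DWT baseline the papers compare against is easy to sketch. The snippet below performs wavelet-domain max-magnitude fusion of two images using PyWavelets; the wavelet, level, fusion rule, and toy inputs are our illustrative choices, and this is the comparison baseline, not the proposed MEMD method.

        # DWT-based image fusion (max-magnitude rule): one of the standard
        # baselines the papers compare MEMD fusion against.
        import numpy as np
        import pywt

        def dwt_fuse(img_a, img_b, wavelet="db2", level=3):
            ca = pywt.wavedec2(img_a, wavelet, level=level)
            cb = pywt.wavedec2(img_b, wavelet, level=level)
            fused = [(ca[0] + cb[0]) / 2.0]   # average the approximation band
            for (ha, va, da), (hb, vb, db) in zip(ca[1:], cb[1:]):
                # keep the larger-magnitude coefficient in each detail band
                pick = lambda x, y: np.where(np.abs(x) >= np.abs(y), x, y)
                fused.append((pick(ha, hb), pick(va, vb), pick(da, db)))
            return pywt.waverec2(fused, wavelet)

        # Toy multi-focus pair: each input has lost detail in a different half.
        rng = np.random.default_rng(0)
        scene = rng.random((128, 128))
        left_blur = scene.copy();  left_blur[:, :64] = 0.5   # detail lost on left
        right_blur = scene.copy(); right_blur[:, 64:] = 0.5  # detail lost on right
        out = dwt_fuse(left_blur, right_blur)
        print("fused shape:", out.shape)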

  9. Hardware-Assisted Large-Scale Neuroevolution for Multiagent Learning

    DTIC Science & Technology

    2014-12-30

    This DURIP equipment award was used to purchase, install, and bring on-line two Berkeley Emulation Engines (BEEs) and two mini-BEE machines to establish an FPGA-based high-performance multiagent training platform and its associated software. This acquisition of BEE4-W… Keywords: Probabilistic Domain Transformation; Hardware-Assisted; FPGA; BEE; Hive Brain; Multiagent.

  10. Bringing Effective Instructional Practice to Scale in American Schools: Lessons from the Long Beach Unified School District

    ERIC Educational Resources Information Center

    Zavadsky, Heather

    2016-01-01

    Workforce and societal needs have changed significantly over the past few decades while educational approaches have remained largely the same over the past 50 years. Walk into any random classroom in the United States and you will likely see instruction being delivered to students in straight rows by teachers through lecture style. It is possible…

  11. Exact-Differential Large-Scale Traffic Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanai, Masatoshi; Suzumura, Toyotaro; Theodoropoulos, Georgios

    2015-01-01

    Analyzing large-scale traffic by simulation requires repeated execution with various patterns of scenarios or parameters. Such repeated execution introduces substantial redundancy, because the change from one scenario to the next is usually minor, for example blocking a single road or changing the speed limit on several roads. In this paper, we propose a new redundancy-reduction technique, called exact-differential simulation, which simulates only the changed parts of later scenarios while producing exactly the same results as a whole simulation. The paper consists of two main efforts: (i) a key idea and algorithm for exact-differential simulation, and (ii) a method to build large-scale traffic simulation on top of it. In experiments with a Tokyo traffic simulation, exact-differential simulation reduces elapsed time by a factor of 7.26 on average, and by 2.26 even in the worst case, compared with running the whole simulation.

  12. Inquiry-Based Educational Design for Large-Scale High School Astronomy Projects Using Real Telescopes

    NASA Astrophysics Data System (ADS)

    Fitzgerald, Michael; McKinnon, David H.; Danaia, Lena

    2015-12-01

    In this paper, we outline the theory behind the educational design used to implement a large-scale high school astronomy education project. This design was created in response to ineffective educational design in the early stages of the project. The new design follows an iterative improvement model in which the materials and general approach evolve in response to solicited feedback. The improvement cycle concentrates on avoiding overly positive self-evaluation, addressing relevant external school and community factors, and backward mapping from clearly set goals. Limiting factors, including time, resources, support and the potential for failure in the classroom, are dealt with as far as possible in the large-scale design, giving teachers the best chance of successful implementation in their real-world classrooms. The actual approach adopted following the principles of this design, which has seen success in bringing real astronomical data and access to telescopes into the high school classroom, is also outlined.

  13. Large-scale monitoring of shorebird populations using count data and N-mixture models: Black Oystercatcher (Haematopus bachmani) surveys by land and sea

    USGS Publications Warehouse

    Lyons, James E.; Royle, J. Andrew; Thomas, Susan M.; Elliott-Smith, Elise; Evenson, Joseph R.; Kelly, Elizabeth G.; Milner, Ruth L.; Nysewander, David R.; Andres, Brad A.

    2012-01-01

    Large-scale monitoring of bird populations is often based on count data collected across spatial scales that may include multiple physiographic regions and habitat types. Monitoring at large spatial scales may require multiple survey platforms (e.g., from boats and land when monitoring coastal species) and multiple survey methods. It becomes especially important to explicitly account for detection probability when analyzing count data that have been collected using multiple survey platforms or methods. We evaluated a new analytical framework, N-mixture models, to estimate actual abundance while accounting for multiple detection biases. During May 2006, we made repeated counts of Black Oystercatchers (Haematopus bachmani) from boats in the Puget Sound area of Washington (n = 55 sites) and from land along the coast of Oregon (n = 56 sites). We used a Bayesian analysis of N-mixture models to (1) assess detection probability as a function of environmental and survey covariates and (2) estimate total Black Oystercatcher abundance during the breeding season in the two regions. Probability of detecting individuals during boat-based surveys was 0.75 (95% credible interval: 0.42–0.91) and was not influenced by tidal stage. Detection probability from surveys conducted on foot was 0.68 (0.39–0.90); the latter was not influenced by fog, wind, or number of observers but was ~35% lower during rain. The estimated population size was 321 birds (262–511) in Washington and 311 (276–382) in Oregon. N-mixture models provide a flexible framework for modeling count data and covariates in large-scale bird monitoring programs designed to understand population change.
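
    For readers unfamiliar with N-mixture models, the core likelihood marginalizes over the unknown site abundance N: repeated counts are y_it ~ Binomial(N_i, p) with N_i ~ Poisson(λ), and N_i is summed out up to a truncation bound. The sketch below fits a toy maximum-likelihood version to simulated counts via a crude grid search; the paper itself used a Bayesian analysis with covariates, which this does not reproduce.

        # N-mixture model: y_it ~ Binomial(N_i, p), N_i ~ Poisson(lambda),
        # with the latent N_i summed out up to truncation bound K.
        import numpy as np
        from scipy.stats import poisson, binom

        rng = np.random.default_rng(7)
        n_sites, n_visits = 55, 3
        true_lam, true_p, K = 6.0, 0.7, 60

        N = rng.poisson(true_lam, n_sites)                         # latent abundances
        y = rng.binomial(N[:, None], true_p, (n_sites, n_visits))  # repeated counts

        def nll(lam, p):
            """Negative log-likelihood, marginalizing over N at each site."""
            total = 0.0
            for i in range(n_sites):
                Ns = np.arange(y[i].max(), K + 1)      # feasible abundances
                lik_N = poisson.pmf(Ns, lam)
                lik_y = np.prod(binom.pmf(y[i][None, :], Ns[:, None], p), axis=1)
                total -= np.log(np.sum(lik_N * lik_y))
            return total

        # Crude grid search over (lambda, p), for illustration only.
        grid = [(l, q) for l in np.linspace(3, 10, 15)
                       for q in np.linspace(0.3, 0.95, 14)]
        lam_hat, p_hat = min(grid, key=lambda g: nll(*g))
        print(f"lambda_hat={lam_hat:.2f}, p_hat={p_hat:.2f}")  # near 6.0 and 0.7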

  14. Recording large-scale neuronal ensembles with silicon probes in the anesthetized rat.

    PubMed

    Schjetnan, Andrea Gomez Palacio; Luczak, Artur

    2011-10-19

    Large scale electrophysiological recordings from neuronal ensembles offer the opportunity to investigate how the brain orchestrates the wide variety of behaviors from the spiking activity of its neurons. One of the most effective methods to monitor spiking activity from a large number of neurons in multiple local neuronal circuits simultaneously is by using silicon electrode arrays. Action potentials produce large transmembrane voltage changes in the vicinity of cell somata. These output signals can be measured by placing a conductor in close proximity of a neuron. If there are many active (spiking) neurons in the vicinity of the tip, the electrode records combined signal from all of them, where contribution of a single neuron is weighted by its 'electrical distance'. Silicon probes are ideal recording electrodes to monitor multiple neurons because of a large number of recording sites (+64) and a small volume. Furthermore, multiple sites can be arranged over a distance of millimeters, thus allowing for the simultaneous recordings of neuronal activity in the various cortical layers or in multiple cortical columns (Fig. 1). Importantly, the geometrically precise distribution of the recording sites also allows for the determination of the spatial relationship of the isolated single neurons. Here, we describe an acute, large-scale neuronal recording from the left and right forelimb somatosensory cortex simultaneously in an anesthetized rat with silicon probes (Fig. 2).

  15. Recording Large-scale Neuronal Ensembles with Silicon Probes in the Anesthetized Rat

    PubMed Central

    Schjetnan, Andrea Gomez Palacio; Luczak, Artur

    2011-01-01

    Large scale electrophysiological recordings from neuronal ensembles offer the opportunity to investigate how the brain orchestrates the wide variety of behaviors from the spiking activity of its neurons. One of the most effective methods to monitor spiking activity from a large number of neurons in multiple local neuronal circuits simultaneously is by using silicon electrode arrays. Action potentials produce large transmembrane voltage changes in the vicinity of cell somata. These output signals can be measured by placing a conductor in close proximity of a neuron. If there are many active (spiking) neurons in the vicinity of the tip, the electrode records combined signal from all of them, where contribution of a single neuron is weighted by its 'electrical distance'. Silicon probes are ideal recording electrodes to monitor multiple neurons because of a large number of recording sites (+64) and a small volume. Furthermore, multiple sites can be arranged over a distance of millimeters, thus allowing for the simultaneous recordings of neuronal activity in the various cortical layers or in multiple cortical columns (Fig. 1). Importantly, the geometrically precise distribution of the recording sites also allows for the determination of the spatial relationship of the isolated single neurons. Here, we describe an acute, large-scale neuronal recording from the left and right forelimb somatosensory cortex simultaneously in an anesthetized rat with silicon probes (Fig. 2). PMID:22042361

  16. Ascertaining Validity in the Abstract Realm of PMESII Simulation Models: An Analysis of the Peace Support Operations Model (PSOM)

    DTIC Science & Technology

    2009-06-01

    The simulation is the campaign-level Peace Support Operations Model (PSOM). This thesis provides a quantitative analysis of PSOM. The results are based ... multiple potential outcomes; further development and analysis is required before the model is used for large-scale analysis.

  17. Mining the Mind Research Network: A Novel Framework for Exploring Large Scale, Heterogeneous Translational Neuroscience Research Data Sources

    PubMed Central

    Bockholt, Henry J.; Scully, Mark; Courtney, William; Rachakonda, Srinivas; Scott, Adam; Caprihan, Arvind; Fries, Jill; Kalyanam, Ravi; Segall, Judith M.; de la Garza, Raul; Lane, Susan; Calhoun, Vince D.

    2009-01-01

    A neuroinformatics (NI) system is critical to brain imaging research in order to shorten the time between study conception and results. Such a NI system is required to scale well when large numbers of subjects are studied. Further, when multiple sites participate in research projects organizational issues become increasingly difficult. Optimized NI applications mitigate these problems. Additionally, NI software enables coordination across multiple studies, leveraging advantages potentially leading to exponential research discoveries. The web-based, Mind Research Network (MRN), database system has been designed and improved through our experience with 200 research studies and 250 researchers from seven different institutions. The MRN tools permit the collection, management, reporting and efficient use of large scale, heterogeneous data sources, e.g., multiple institutions, multiple principal investigators, multiple research programs and studies, and multimodal acquisitions. We have collected and analyzed data sets on thousands of research participants and have set up a framework to automatically analyze the data, thereby making efficient, practical data mining of this vast resource possible. This paper presents a comprehensive framework for capturing and analyzing heterogeneous neuroscience research data sources that has been fully optimized for end-users to perform novel data mining. PMID:20461147

  18. Cosmology with CLASS

    NASA Astrophysics Data System (ADS)

    Watts, Duncan; CLASS Collaboration

    2018-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) will use large-scale measurements of the polarized cosmic microwave background (CMB) to constrain the physics of inflation, reionization, and massive neutrinos. The experiment is designed to characterize the largest scales, which are inaccessible to most ground-based experiments, and to remove Galactic foregrounds from the CMB maps. In this dissertation talk, I present simulations of CLASS data and demonstrate their ability to constrain the simplest single-field models of inflation and to reduce the uncertainty of the optical depth to reionization, τ, to near the cosmic variance limit, significantly improving on current constraints. These constraints will bring a qualitative shift in our understanding of standard ΛCDM cosmology. In particular, CLASS's measurement of τ breaks cosmological parameter degeneracies. Probes of large scale structure (LSS) test the effect of neutrino free-streaming at small scales, which depends on the mass of the neutrinos. CLASS's τ measurement, when combined with next-generation LSS and BAO measurements, will enable a 4σ detection of neutrino mass, compared with 2σ without CLASS data. I will also briefly discuss the CLASS experiment's measurements of circular polarization of the CMB and the implications of the first such near-all-sky map.

  19. Nano-multiplication region avalanche photodiodes and arrays

    NASA Technical Reports Server (NTRS)

    Zheng, Xinyu (Inventor); Pain, Bedabrata (Inventor); Cunningham, Thomas J. (Inventor)

    2011-01-01

    An avalanche photodiode with a nano-scale reach-through structure comprising n-doped and p-doped regions, formed on a silicon island on an insulator, so that the avalanche photodiode may be electrically isolated from other circuitry on other silicon islands on the same silicon chip as the avalanche photodiode. For some embodiments, multiplied holes generated by an avalanche reduce the electric field in the depletion region of the n-doped and p-doped regions to bring about self-quenching of the avalanche photodiode. Other embodiments are described and claimed.

  20. Flower-like BiOI microsphere/Ni@C nanocapsule hybrid composites and their efficient microwave absorbing activity

    NASA Astrophysics Data System (ADS)

    Liu, Xianguo; Yu, Jieyi; Cui, Caiyun; Sun, Yuping; Li, Xiaolong; Li, Zhenxing

    2018-07-01

    At present, microwave absorbers are prepared by dispersing absorbing nanomaterials in a binder, which can lead to aggregation of the nanomaterials in the binder and degrade the achievable absorption performance. Hybrid micro/nano-scale structures are beneficial for buffering agglomeration and for constructing multiple interfaces. Here, Ni@C nanocapsules are conjugated onto flower-like BiOI microspheres, forming micro/nano-scale hybrid composites. The multiple interfaces between BiOI microspheres and Ni@C nanocapsules bring enhanced dielectric loss and an increased attenuation constant, resulting in enhanced absorption capacity (the optimal reflection loss reaches −61.35 dB), an increased effective absorption bandwidth (the maximum effective bandwidth, fEmax, is 5.86 GHz) and a reduced absorber thickness (the thickness corresponding to fEmax is 1.7 mm). This study highlights a simple idea for optimizing electromagnetic absorbing performance, which is of great significance for the development of microwave absorbers.
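
    For readers unfamiliar with how reflection loss figures such as −61.35 dB are computed, the conventional metal-backed single-layer (transmission-line) model is sketched below. The formula is standard, but the material parameters are placeholders, not the measured BiOI/Ni@C values.

    ```python
    # Conventional metal-backed single-layer (transmission-line) model for
    # reflection loss. The formula is standard; eps_r, mu_r and d below are
    # placeholders, NOT the measured BiOI/Ni@C parameters.
    import numpy as np

    c = 3e8                                  # speed of light (m/s)
    f = np.linspace(2e9, 18e9, 400)          # frequency sweep (Hz)
    eps_r = 8.0 - 2.5j                       # complex permittivity (assumed)
    mu_r = 1.1 - 0.3j                        # complex permeability (assumed)
    d = 1.7e-3                               # absorber thickness (m)

    # Normalized input impedance of the absorber layer on a metal backing.
    z_in = np.sqrt(mu_r / eps_r) * np.tanh(
        1j * 2 * np.pi * f * d / c * np.sqrt(mu_r * eps_r))
    rl_db = 20 * np.log10(np.abs((z_in - 1) / (z_in + 1)))  # reflection loss (dB)
    print(f"minimum RL: {rl_db.min():.1f} dB at {f[rl_db.argmin()] / 1e9:.1f} GHz")
    ```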

  1. Strategies for Multi-Modal Analysis

    NASA Astrophysics Data System (ADS)

    Hexemer, Alexander; Wang, Cheng; Pandolfi, Ronald; Kumar, Dinesh; Venkatakrishnan, Singanallur; Sethian, James; Camera Team

    This section on soft materials will be dedicated to discussing the extraction of the chemical distribution and spatial arrangement of constituent elements and functional groups at multiple length scales and, thus, the examination of collective dynamics, transport, and electronic ordering phenomena. Traditional measures of structure in soft materials have relied heavily on scattering- and imaging-based techniques due to their capacity to measure nanoscale dimensions and to monitor structure under conditions of dynamic stress loading. Special attention will focus on the application of resonant x-ray scattering, contrast-varied neutron scattering, analytical transmission electron microscopy, and their combinations. This session aims to bring together experts in the scattering and electron microscopy fields to discuss recent advances in selectively characterizing the structural architectures of complex soft materials, which often have multiple components spanning a wide range of length scales and multiple functionalities, and thus hopes to foster novel ideas for deciphering a higher level of structural complexity in soft materials in the future. CAMERA, Early Career Award.

  2. The atmospheric implications of radiation belt remediation

    NASA Astrophysics Data System (ADS)

    Rodger, C. J.; Clilverd, M. A.; Ulich, Th.; Verronen, P. T.; Turunen, E.; Thomson, N. R.

    2006-08-01

    High altitude nuclear explosions (HANEs) and geomagnetic storms can produce large scale injections of relativistic particles into the inner radiation belts. It is recognised that these large increases in >1 MeV trapped electron fluxes can shorten the operational lifetime of low Earth orbiting satellites, threatening a large, valuable population. Therefore, studies are being undertaken to bring about practical human control of the radiation belts, termed "Radiation Belt Remediation" (RBR). Here we consider the upper atmospheric consequences of an RBR system operating over either 1 or 10 days. The RBR-forced neutral chemistry changes, leading to NOx enhancements and Ox depletions, are significant during the timescale of the precipitation but are generally not long-lasting. The magnitudes, time-scales, and altitudes of these changes are no more significant than those observed during large solar proton events. In contrast, RBR-operation will lead to unusually intense HF blackouts for about the first half of the operation time, producing large scale disruptions to radio communication and navigation systems. While the neutral atmosphere changes are not particularly important, HF disruptions could be an important area for policy makers to consider, particularly for the remediation of natural injections.

  3. A Surface Plasmon Enhanced Infrared Photodetector Based on InAs Quantum Dots

    DTIC Science & Technology

    2010-01-01

    ...performance of the QD infrared detector to a level that is comparable to the widely used, conventional MCT infrared detector. Acknowledgment. S.Y.L. gratefully... amenable to large-scale fabrication and, more importantly, does not degrade the noise current characteristics of the photodetector. We believe that this... demonstration would bring the performance of QD-based infrared detectors to a level suitable for emerging surveillance and medical diagnostic applications.

  4. Knowledge discovery from high-frequency stream nitrate concentrations: hydrology and biology contributions.

    PubMed

    Aubert, Alice H; Thrun, Michael C; Breuer, Lutz; Ultsch, Alfred

    2016-08-30

    High-frequency, in-situ monitoring provides large environmental datasets. These datasets will likely bring new insights into landscape functioning and process-scale understanding. However, tailoring data analysis methods is necessary. Here, we detach our analysis from the usual temporal analysis performed in hydrology to determine whether it is possible to infer general rules regarding hydrochemistry from available large datasets. We combined a 2-year in-stream nitrate concentration time series (time resolution of 15 min) with concurrent hydrological, meteorological and soil moisture data. We removed the low-frequency variations through low-pass filtering, which suppressed seasonality. We then analyzed the high-frequency variability component using Pareto Density Estimation, which, to our knowledge, has not previously been applied in hydrology. The resulting distribution of nitrate concentrations revealed three normally distributed modes: low, medium and high. Studying the environmental conditions for each mode revealed the main control of nitrate concentration: the saturation state of the riparian zone. We found low nitrate concentrations under conditions of hydrological connectivity and dominant denitrifying biological processes, and high nitrate concentrations under hydrological recession conditions and dominant nitrifying biological processes. These results generalize our understanding of hydro-biogeochemical nitrate flux controls and bring useful information to the development of nitrogen process-based models at the landscape scale.
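
    A minimal sketch of the pipeline described above follows, with two loudly flagged substitutions: a Butterworth low-pass filter stands in for the paper's (unspecified) filter, and a three-component Gaussian mixture stands in for Pareto Density Estimation, which has no off-the-shelf SciPy/scikit-learn implementation.

    ```python
    # Sketch of the described pipeline under stated substitutions: estimate the
    # slow seasonal component with a Butterworth low-pass filter, subtract it,
    # and look for modes in the high-frequency residual with a Gaussian mixture.
    import numpy as np
    from scipy.signal import butter, filtfilt
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(2)
    fs = 96                                    # samples/day at 15-min resolution
    t = np.arange(2 * 365 * fs)                # two years of synthetic data
    season = 1.0 + 0.5 * np.sin(2 * np.pi * t / (365 * fs))
    modes = rng.choice([0.8, 1.5, 2.4], t.size, p=[0.5, 0.3, 0.2])
    nitrate = season + modes + 0.1 * rng.standard_normal(t.size)

    b, a = butter(4, (1 / 30) / (fs / 2), btype="low")  # ~1 cycle per 30 days
    trend = filtfilt(b, a, nitrate)                     # slow (seasonal) part
    high_freq = nitrate - trend                         # high-frequency part

    gmm = GaussianMixture(n_components=3, random_state=0)
    gmm.fit(high_freq.reshape(-1, 1))
    print("mode means:", np.sort(gmm.means_.ravel()))   # low / medium / high
    ```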

  5. Negative emissions: Part 1—research landscape and synthesis

    NASA Astrophysics Data System (ADS)

    Minx, Jan C.; Lamb, William F.; Callaghan, Max W.; Fuss, Sabine; Hilaire, Jérôme; Creutzig, Felix; Amann, Thorben; Beringer, Tim; de Oliveira Garcia, Wagner; Hartmann, Jens; Khanna, Tarun; Lenzi, Dominic; Luderer, Gunnar; Nemet, Gregory F.; Rogelj, Joeri; Smith, Pete; Vicente, Jose Luis Vicente; Wilcox, Jennifer; del Mar Zamora Dominguez, Maria

    2018-06-01

    With the Paris Agreement’s ambition of limiting climate change to well below 2 °C, negative emission technologies (NETs) have moved into the limelight of discussions in climate science and policy. Despite several assessments, the current knowledge on NETs is still diffuse and incomplete, but also growing fast. Here, we synthesize a comprehensive body of NETs literature, using scientometric tools and performing an in-depth assessment of the quantitative and qualitative evidence therein. We clarify the role of NETs in climate change mitigation scenarios, their ethical implications, as well as the challenges involved in bringing the various NETs to the market and scaling them up in time. There are six major findings arising from our assessment: first, keeping warming below 1.5 °C requires the large-scale deployment of NETs, but this dependency can still be kept to a minimum for the 2 °C warming limit. Second, accounting for economic and biophysical limits, we identify relevant potentials for all NETs except ocean fertilization. Third, any single NET is unlikely to sustainably achieve the large NETs deployment observed in many 1.5 °C and 2 °C mitigation scenarios. Yet, portfolios of multiple NETs, each deployed at modest scales, could be invaluable for reaching the climate goals. Fourth, a substantial gap exists between the upscaling and rapid diffusion of NETs implied in scenarios and progress in actual innovation and deployment. If NETs are required at the scales currently discussed, the resulting urgency of implementation is currently neither reflected in science nor policy. Fifth, NETs face severe barriers to implementation and are only weakly incentivized so far. Finally, we identify distinct ethical discourses relevant for NETs, but highlight the need to root them firmly in the available evidence in order to render such discussions relevant in practice.

  6. Seemingly unrelated intervention time series models for effectiveness evaluation of large scale environmental remediation.

    PubMed

    Ip, Ryan H L; Li, W K; Leung, Kenneth M Y

    2013-09-15

    Large scale environmental remediation projects applied to sea water always involve large amounts of capital investment. Rigorous effectiveness evaluations of such projects are, therefore, necessary and essential for policy review and future planning. This study investigates the effectiveness of environmental remediation using three different Seemingly Unrelated Regression (SUR) time series models with intervention effects: Model (1) assumes no correlation within or across variables; Model (2) assumes no correlation across variables but allows correlations within each variable across different sites; and Model (3) allows all possible correlations among variables (i.e., an unrestricted model). The results suggested that the unrestricted SUR model is the most reliable one, consistently having the smallest variations of the estimated model parameters. We discuss our results with reference to marine water quality management in Hong Kong while bringing managerial issues into consideration. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Data Basin Aquatic Center: expanding access to aquatic conservation data, analysis tools, people and practical answers

    NASA Astrophysics Data System (ADS)

    Osborne-Gowey, J.; Strittholt, J.; Bergquist, J.; Ward, B. C.; Sheehan, T.; Comendant, T.; Bachelet, D. M.

    2009-12-01

    The world’s aquatic resources are experiencing anthropogenic pressures on an unprecedented scale and aquatic organisms are experiencing widespread population changes and ecosystem-scale habitat alterations. Climate change is likely to exacerbate these threats, in some cases reducing the range of native North American fishes by 20-100% (depending on the location of the population and the model assumptions). Scientists around the globe are generating large volumes of data that vary in quality, format, supporting documentation, and accessibility. Moreover, diverse models are being run at various temporal and spatial scales as scientists attempt to understand previous (and project future) human impacts to aquatic species and their habitats. Conservation scientists often struggle to synthesize this wealth of information for developing practical on-the-ground management strategies. As a result, the best available science is often not utilized in the decision-making and adaptive management processes. As aquatic conservation problems around the globe become more serious and the demand to solve them grows more urgent, scientists and land-use managers need a new way to bring strategic, science-based, and action-oriented approaches to aquatic conservation. The Conservation Biology Institute (CBI), with partners such as ESRI, is developing an Aquatic Center as part of a dynamic, web-based resource (Data Basin; http://databasin.org) that centralizes usable aquatic datasets and provides analytical tools to visualize, analyze, and communicate findings for practical applications. To illustrate its utility, we present example datasets of varying spatial scales and synthesize multiple studies to arrive at novel solutions to aquatic threats.

  8. Topics in geophysical fluid dynamics: Atmospheric dynamics, dynamo theory, and climate dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghil, M.; Childress, S.

    1987-01-01

    This text is the first study to apply systematically the successive bifurcations approach to complex time-dependent processes in large scale atmospheric dynamics, geomagnetism, and theoretical climate dynamics. The presentation of recent results on planetary-scale phenomena in the earth's atmosphere, ocean, cryosphere, mantle and core provides an integral account of mathematical theory and methods together with physical phenomena and processes. The authors address a number of problems in rapidly developing areas of geophysics, bringing into closer contact the modern tools of nonlinear mathematics and the novel problems of global change in the environment.

  9. Photometry of icy satellites: How important is multiple scattering in diluting shadows?

    NASA Technical Reports Server (NTRS)

    Buratti, B.; Veverka, J.

    1984-01-01

    Voyager observations have shown that the photometric properties of icy satellites are influenced significantly by large-scale roughness elements on the surfaces. While recent progress was made in treating the photometric effects of macroscopic roughness, it is still the case that even the most complete models do not account for the effects of multiple scattering fully. Multiple scattering dilutes shadows caused by large-scale features, yet for any specific model it is difficult to calculate the amount of dilution as a function of albedo. Accordingly, laboratory measurements were undertaken using the Cornell Goniometer to evaluate the magnitude of the effect.

  10. SchizConnect: Mediating neuroimaging databases on schizophrenia and related disorders for large-scale integration.

    PubMed

    Wang, Lei; Alpert, Kathryn I; Calhoun, Vince D; Cobia, Derin J; Keator, David B; King, Margaret D; Kogan, Alexandr; Landis, Drew; Tallis, Marcelo; Turner, Matthew D; Potkin, Steven G; Turner, Jessica A; Ambite, Jose Luis

    2016-01-01

    SchizConnect (www.schizconnect.org) is built to address the issues of multiple data repositories in schizophrenia neuroimaging studies. It includes a level of mediation--translating across data sources--so that the user can place one query, e.g. for diffusion images from male individuals with schizophrenia, and find out from across participating data sources how many datasets there are, as well as downloading the imaging and related data. The current version handles the Data Usage Agreements across different studies, as well as interpreting database-specific terminologies into a common framework. New data repositories can also be mediated to bring immediate access to existing datasets. Compared with centralized, upload data sharing models, SchizConnect is a unique, virtual database with a focus on schizophrenia and related disorders that can mediate live data as information is being updated at each data source. It is our hope that SchizConnect can facilitate testing new hypotheses through aggregated datasets, promoting discovery related to the mechanisms underlying schizophrenic dysfunction. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bazilian, Morgan; Pedersen, Ascha Lychett; Pless, Jacquelyn

    Shale gas resource potential in China is assessed to be large, and its development could have wide-ranging economic, environmental, and energy security implications. Although commercial scale shale gas development has not yet begun in China, it holds the potential to change the global energy landscape. Chinese decision-makers are wrestling with the challenges associated with bringing the potential to reality: geologic complexity; infrastructure and logistical difficulties; technological, institutional, social and market development issues; and environmental impacts, including greenhouse gas emissions, impacts on water availability and quality, and air pollution. This paper briefly examines the current situation and outlook for shale gas in China, and explores existing and potential avenues for international cooperation. We find that despite some barriers to large-scale development, Chinese shale gas production has the potential to grow rapidly over the medium-term.

  12. High-efficiency nanostructured silicon solar cells on a large scale realized through the suppression of recombination channels.

    PubMed

    Zhong, Sihua; Huang, Zengguang; Lin, Xingxing; Zeng, Yang; Ma, Yechi; Shen, Wenzhong

    2015-01-21

    Nanostructured silicon solar cells show great potential for new-generation photovoltaics due to their ability to approach ideal light-trapping. However, the nanofeatured morphology that brings about the optical benefits also introduces new recombination channels, and severe deterioration in the electrical performance even outweighs the gain in optics in most attempts. This Research News article aims to review the recent progress in the suppression of carrier recombination in silicon nanostructures, with the emphasis on the optimization of surface morphology and controllable nanostructure height and emitter doping concentration, as well as application of dielectric passivation coatings, providing design rules to realize high-efficiency nanostructured silicon solar cells on a large scale. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Modeling relief demands in an emergency supply chain system under large-scale disasters based on a queuing network.

    PubMed

    He, Xinhua; Hu, Wenfa

    2014-01-01

    This paper presents a multiple-rescue model for an emergency supply chain system under uncertainty in the large-scale affected area of a disaster. The proposed methodology takes into consideration that the rescue demands caused by a large-scale disaster are scattered across several locations; the servers are arranged in multiple echelons (resource depots, distribution centers, and rescue center sites) located in different places but coordinated within one emergency supply chain system; depending on the types of rescue demands, one or more distinct servers dispatch emergency resources along different vehicle routes, and emergency rescue services queue in multiple rescue-demand locations. This emergency system is modeled as a minimal queuing response time model of location and allocation. A solution to this complex mathematical problem is developed based on a genetic algorithm. Finally, a case study of an emergency supply chain system operating in Shanghai is discussed. The results demonstrate the robustness and applicability of the proposed model.

  14. Modeling Relief Demands in an Emergency Supply Chain System under Large-Scale Disasters Based on a Queuing Network

    PubMed Central

    He, Xinhua

    2014-01-01

    This paper presents a multiple-rescue model for an emergency supply chain system under uncertainty in the large-scale affected area of a disaster. The proposed methodology takes into consideration that the rescue demands caused by a large-scale disaster are scattered across several locations; the servers are arranged in multiple echelons (resource depots, distribution centers, and rescue center sites) located in different places but coordinated within one emergency supply chain system; depending on the types of rescue demands, one or more distinct servers dispatch emergency resources along different vehicle routes, and emergency rescue services queue in multiple rescue-demand locations. This emergency system is modeled as a minimal queuing response time model of location and allocation. A solution to this complex mathematical problem is developed based on a genetic algorithm. Finally, a case study of an emergency supply chain system operating in Shanghai is discussed. The results demonstrate the robustness and applicability of the proposed model. PMID:24688367
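
    To make the solution approach concrete, the toy genetic algorithm below chooses k depot sites to minimize a crude travel-plus-queue response-time proxy. The paper's actual queuing-network objective and chromosome encoding are richer, so every number and the fitness function here are illustrative assumptions.

    ```python
    # Toy genetic algorithm for the location-allocation flavor of the model:
    # pick k depot sites minimizing a crude travel-plus-queue response proxy.
    import numpy as np

    rng = np.random.default_rng(3)
    n_cand, k, n_demand = 20, 4, 60
    cand = rng.uniform(0, 100, (n_cand, 2))        # candidate depot coordinates
    demand = rng.uniform(0, 100, (n_demand, 2))    # rescue-demand locations

    def fitness(mask):
        depots = cand[mask]
        dist = np.linalg.norm(demand[:, None] - depots[None], axis=2)
        nearest_depot = dist.argmin(axis=1)
        load = np.bincount(nearest_depot, minlength=len(depots))
        # travel time + crude congestion (queue) penalty per assigned depot
        return (dist.min(axis=1) + 0.5 * load[nearest_depot]).mean()

    def random_mask():
        m = np.zeros(n_cand, bool)
        m[rng.choice(n_cand, k, replace=False)] = True
        return m

    pop = [random_mask() for _ in range(40)]
    for _ in range(100):                           # evolve: select + mutate
        pop.sort(key=fitness)
        pop = pop[:20]
        for parent in pop[:20]:
            child = parent.copy()
            on, off = np.flatnonzero(child), np.flatnonzero(~child)
            child[rng.choice(on)] = False          # swap one depot in/out
            child[rng.choice(off)] = True
            pop.append(child)
    best = min(pop, key=fitness)
    print("best depots:", np.flatnonzero(best), "response:", round(fitness(best), 2))
    ```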

  15. Ultrafast frequency-agile terahertz devices using methylammonium lead halide perovskites

    PubMed Central

    Chanana, Ashish; Liu, Xiaojie; Vardeny, Zeev Valy

    2018-01-01

    The ability to control the response of metamaterial structures can facilitate the development of new terahertz devices, with applications in spectroscopy and communications. We demonstrate ultrafast frequency-agile terahertz metamaterial devices that enable such a capability, in which multiple perovskites can be patterned in each unit cell with micrometer-scale precision. To accomplish this, we developed a fabrication technique that shields already deposited perovskites from organic solvents, allowing for multiple perovskites to be patterned in close proximity. By doing so, we demonstrate tuning of the terahertz resonant response that is based not only on the optical pump fluence but also on the optical wavelength. Because polycrystalline perovskites have subnanosecond photocarrier recombination lifetimes, switching between resonances can occur on an ultrafast time scale. The use of multiple perovskites allows for new functionalities that are not possible using a single semiconducting material. For example, by patterning one perovskite in the gaps of split-ring resonators and bringing a uniform thin film of a second perovskite in close proximity, we demonstrate tuning of the resonant response using one optical wavelength and suppression of the resonance using a different optical wavelength. This general approach offers new capabilities for creating tunable terahertz devices. PMID:29736416

  16. Ultrafast frequency-agile terahertz devices using methylammonium lead halide perovskites.

    PubMed

    Chanana, Ashish; Liu, Xiaojie; Zhang, Chuang; Vardeny, Zeev Valy; Nahata, Ajay

    2018-05-01

    The ability to control the response of metamaterial structures can facilitate the development of new terahertz devices, with applications in spectroscopy and communications. We demonstrate ultrafast frequency-agile terahertz metamaterial devices that enable such a capability, in which multiple perovskites can be patterned in each unit cell with micrometer-scale precision. To accomplish this, we developed a fabrication technique that shields already deposited perovskites from organic solvents, allowing for multiple perovskites to be patterned in close proximity. By doing so, we demonstrate tuning of the terahertz resonant response that is based not only on the optical pump fluence but also on the optical wavelength. Because polycrystalline perovskites have subnanosecond photocarrier recombination lifetimes, switching between resonances can occur on an ultrafast time scale. The use of multiple perovskites allows for new functionalities that are not possible using a single semiconducting material. For example, by patterning one perovskite in the gaps of split-ring resonators and bringing a uniform thin film of a second perovskite in close proximity, we demonstrate tuning of the resonant response using one optical wavelength and suppression of the resonance using a different optical wavelength. This general approach offers new capabilities for creating tunable terahertz devices.

  17. Resilience of Florida Keys coral communities following large scale disturbances

    EPA Science Inventory

    The decline of coral reefs in the Caribbean over the last 40 years has been attributed to multiple chronic stressors and episodic large-scale disturbances. This study assessed the resilience of coral communities in two different regions of the Florida Keys reef system between 199...

  18. INTERDISCIPLINARY PHYSICS AND RELATED AREAS OF SCIENCE AND TECHNOLOGY Trajectory Control of Scale-Free Dynamical Networks with Exogenous Disturbances

    NASA Astrophysics Data System (ADS)

    Yang, Hong-Yong; Zhang, Shun; Zong, Guang-Deng

    2011-01-01

    In this paper, the trajectory control of multi-agent dynamical systems with exogenous disturbances is studied. Assuming that the agents compose a scale-free network topology, the disturbance-rejection performance of low-degree and high-degree nodes is analyzed. First, the consensus of multi-agent systems without disturbances is studied by designing a pinning control strategy on a subset of agents, where the pinning control brings the agents' states to an expected consensus track. Then, the influence of disturbances is considered: disturbance-observer-based control (DOBC) is developed to estimate disturbances generated by an exogenous system. Asymptotic consensus of the multi-agent systems with disturbances under the composite controller can be achieved for the scale-free network topology. Finally, the results are verified by analyzing examples of multi-agent systems with scale-free network topology and exogenous disturbances. Under the DOBC with the designed parameters, the trajectory convergence of the multi-agent systems is studied by pinning two classes of nodes. We find that pinning a high-degree node yields stronger robustness to exogenous disturbances than pinning a low-degree node.
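
    A minimal numerical sketch of pinning control on a scale-free network is given below (without the disturbance observer): a few pinned agents are driven toward a reference state and diffusive coupling pulls the rest along. The network size, gains, and the choice of pinned nodes are assumptions for illustration.

    ```python
    # Minimal pinning-control sketch on a scale-free (Barabasi-Albert) network:
    # dynamics x' = -L x - K (x - x_ref), with K nonzero only on pinned nodes.
    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(4)
    G = nx.barabasi_albert_graph(50, 2, seed=4)     # scale-free topology
    L = nx.laplacian_matrix(G).toarray().astype(float)

    degrees = np.array([d for _, d in G.degree()])
    pinned = np.argsort(degrees)[-3:]               # pin the highest-degree nodes
    K = np.zeros(50)
    K[pinned] = 5.0                                 # pinning gains (assumed)

    x = rng.uniform(-5, 5, 50)                      # initial agent states
    x_ref, dt = 2.0, 0.01                           # expected consensus track
    for _ in range(2000):                           # forward-Euler integration
        x = x + dt * (-L @ x - K * (x - x_ref))
    print("max deviation from reference:", np.abs(x - x_ref).max())
    ```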

  19. Science Competencies That Go Unassessed

    ERIC Educational Resources Information Center

    Gilmer, Penny J.; Sherdan, Danielle M.; Oosterhof, Albert; Rohani, Faranak; Rouby, Aaron

    2011-01-01

    Present large-scale assessments require the use of item formats, such as multiple choice, that can be administered and scored efficiently. This limits competencies that can be measured by these assessments. An alternative approach to large-scale assessments is being investigated that would include the use of complex performance assessments. As…

  20. Effects of Ensemble Configuration on Estimates of Regional Climate Uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldenson, N.; Mauger, G.; Leung, L. R.

    Internal variability in the climate system can contribute substantial uncertainty in climate projections, particularly at regional scales. Internal variability can be quantified using large ensembles of simulations that are identical but for perturbed initial conditions. Here we compare methods for quantifying internal variability. Our study region spans the west coast of North America, which is strongly influenced by El Niño and other large-scale dynamics through their contribution to large-scale internal variability. Using a statistical framework to simultaneously account for multiple sources of uncertainty, we find that internal variability can be quantified consistently using a large ensemble or an ensemble of opportunity that includes small ensembles from multiple models and climate scenarios. The latter also produce estimates of uncertainty due to model differences. We conclude that projection uncertainties are best assessed using small single-model ensembles from as many model-scenario pairings as computationally feasible, which has implications for ensemble design in large modeling efforts.

  1. Effects of trade openness and market scale on different regions

    NASA Astrophysics Data System (ADS)

    Tian, Renqu; Yang, Zisheng

    2017-04-01

    This paper revisits the relationship between growth, trade openness and market scale. Empirical studies have shown that the problem of lopsided regional development in China is increasingly serious, while greater trade openness and market scale bring about more economic growth. We use a province-level dataset of gross domestic product and socio-economic indicators, together with panel ordinary least squares and instrumental-variables estimation techniques, to explore the effects of trade openness and regional market scale on the three major economic regions. The results indicate: first, the impact of market scale and trade openness on economic growth is found to be positive; second, the overall regional disparity is owing to trade openness, market scale and macroeconomic policies; third, the midland and western regions should take advantage of their geographical locations and resources to expand exports and narrow the regional difference.

  2. A worldwide analysis of the impact of forest cover change on annual runoff across multiple spatial scales

    NASA Astrophysics Data System (ADS)

    Zhang, M.; Liu, S.

    2017-12-01

    Despite extensive studies on hydrological responses to forest cover change in small watersheds, the hydrological responses to forest change and associated mechanisms across multiple spatial scales have not been fully understood. This review thus examined about 312 watersheds worldwide to provide a generalized framework to evaluate hydrological responses to forest cover change and to identify the contribution of spatial scale, climate, forest type and hydrological regime in determining the intensity of forest change related hydrological responses in small (<1000 km2) and large watersheds (≥1000 km2). Key findings include: 1) the increase in annual runoff associated with forest cover loss is statistically significant at multiple spatial scales whereas the effect of forest cover gain is statistically inconsistent; 2) the sensitivity of annual runoff to forest cover change tends to attenuate as watershed size increases only in large watersheds; 3) annual runoff is more sensitive to forest cover change in water-limited watersheds than in energy-limited watersheds across all spatial scales; and 4) small mixed forest-dominated watersheds or large snow-dominated watersheds are more hydrologically resilient to forest cover change. These findings improve the understanding of hydrological response to forest cover change at different spatial scales and provide a scientific underpinning to future watershed management in the context of climate change and increasing anthropogenic disturbances.

  3. Linear precoding based on polynomial expansion: reducing complexity in massive MIMO.

    PubMed

    Mueller, Axel; Kammoun, Abla; Björnson, Emil; Debbah, Mérouane

    Massive multiple-input multiple-output (MIMO) techniques have the potential to bring tremendous improvements in spectral efficiency to future communication systems. Counterintuitively, the practical issues of having uncertain channel knowledge, high propagation losses, and implementing optimal non-linear precoding are solved more or less automatically by enlarging system dimensions. However, the computational precoding complexity grows with the system dimensions. For example, the close-to-optimal and relatively "antenna-efficient" regularized zero-forcing (RZF) precoding is very complicated to implement in practice, since it requires fast inversions of large matrices in every coherence period. Motivated by the high performance of RZF, we propose to replace the matrix inversion and multiplication by a truncated polynomial expansion (TPE), thereby obtaining the new TPE precoding scheme which is more suitable for real-time hardware implementation and significantly reduces the delay to the first transmitted symbol. The degree of the matrix polynomial can be adapted to the available hardware resources and enables smooth transition between simple maximum ratio transmission and more advanced RZF. By deriving new random matrix results, we obtain a deterministic expression for the asymptotic signal-to-interference-and-noise ratio (SINR) achieved by TPE precoding in massive MIMO systems. Furthermore, we provide a closed-form expression for the polynomial coefficients that maximizes this SINR. To maintain a fixed per-user rate loss as compared to RZF, the polynomial degree does not need to scale with the system, but it should be increased with the quality of the channel knowledge and the signal-to-noise ratio.
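
    The core trick can be sketched numerically: RZF requires the inverse of H^H H + ξI, and TPE replaces it with a truncated matrix polynomial. The snippet below uses a simple scaled Neumann series in place of the SINR-optimized coefficients derived in the paper, so it only demonstrates convergence of the approximation, not the optimized scheme.

    ```python
    # Sketch of truncated polynomial expansion (TPE) precoding versus RZF.
    # RZF needs (H^H H + xi I)^(-1); TPE replaces the inverse with a truncated
    # polynomial. A plain alpha-scaled Neumann series stands in for the paper's
    # SINR-optimal coefficients.
    import numpy as np

    rng = np.random.default_rng(5)
    M, Ku, xi, J = 64, 8, 0.1, 4                    # antennas, users, reg., degree
    H = (rng.standard_normal((Ku, M)) + 1j * rng.standard_normal((Ku, M))) / np.sqrt(2)

    A = H.conj().T @ H + xi * np.eye(M)             # matrix to "invert"
    W_rzf = np.linalg.solve(A, H.conj().T)          # exact RZF precoder

    alpha = 2 / (np.linalg.norm(A, 2) + xi)         # guarantees series convergence
    T = alpha * np.eye(M)                           # degree-0 polynomial in A
    B = np.eye(M) - alpha * A
    for _ in range(J):                              # add higher-degree terms:
        T = alpha * np.eye(M) + B @ T               # T_{j+1} = alpha*I + B T_j
    W_tpe = T @ H.conj().T                          # polynomial precoder

    err = np.linalg.norm(W_tpe - W_rzf) / np.linalg.norm(W_rzf)
    print(f"relative precoder error at degree {J}: {err:.3f}")
    ```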

  4. Evidence and AIDS activism: HIV scale-up and the contemporary politics of knowledge in global public health

    PubMed Central

    Colvin, Christopher J.

    2014-01-01

    The HIV epidemic is widely recognised as having prompted one of the most remarkable intersections ever of illness, science and activism. The production, circulation, use and evaluation of empirical scientific ‘evidence’ played a central part in activists’ engagement with AIDS science. Previous activist engagement with evidence focused on the social and biomedical responses to HIV in the global North as well as challenges around ensuring antiretroviral treatment (ART) was available in the global South. More recently, however, with the roll-out and scale-up of large public-sector ART programmes and new multi-dimensional prevention efforts, the relationships between evidence and activism have been changing. Scale-up of these large-scale treatment and prevention programmes represents an exciting new opportunity while bringing with it a host of new challenges. This paper examines what new forms of evidence and activism will be required to address the challenges of the scaling-up era of HIV treatment and prevention. It reviews some recent controversies around evidence and HIV scale-up and describes the different forms of evidence and activist strategies that will be necessary for a robust response to these new challenges. PMID:24498918

  5. Modelling DC responses of 3D complex fracture networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beskardes, Gungor Didem; Weiss, Chester Joseph

    Here, the determination of the geometrical properties of fractures plays a critical role in many engineering problems to assess the current hydrological and mechanical states of geological media and to predict their future states. However, numerical modeling of geoelectrical responses in realistic fractured media has been challenging due to the explosive computational cost imposed by the explicit discretizations of fractures at multiple length scales, which often brings about a tradeoff between computational efficiency and geologic realism. Here, we use the hierarchical finite element method to model electrostatic response of realistically complex 3D conductive fracture networks with minimal computational cost.

  6. Modelling DC responses of 3D complex fracture networks

    DOE PAGES

    Beskardes, Gungor Didem; Weiss, Chester Joseph

    2018-03-01

    Here, the determination of the geometrical properties of fractures plays a critical role in many engineering problems to assess the current hydrological and mechanical states of geological media and to predict their future states. However, numerical modeling of geoelectrical responses in realistic fractured media has been challenging due to the explosive computational cost imposed by the explicit discretizations of fractures at multiple length scales, which often brings about a tradeoff between computational efficiency and geologic realism. Here, we use the hierarchical finite element method to model electrostatic response of realistically complex 3D conductive fracture networks with minimal computational cost.

  7. Evolution of accelerometer methods for physical activity research.

    PubMed

    Troiano, Richard P; McClain, James J; Brychta, Robert J; Chen, Kong Y

    2014-07-01

    The technology and application of current accelerometer-based devices in physical activity (PA) research allow the capture and storage or transmission of large volumes of raw acceleration signal data. These rich data not only provide opportunities to improve PA characterisation, but also bring logistical and analytic challenges. We discuss how researchers and developers from multiple disciplines are responding to the analytic challenges and how advances in data storage, transmission and big data computing will minimise logistical challenges. These new approaches also bring the need for several paradigm shifts for PA researchers, including a shift from count-based approaches and regression calibrations for PA energy expenditure (PAEE) estimation to activity characterisation and EE estimation based on features extracted from raw acceleration signals. Furthermore, a collaborative approach towards analytic methods is proposed to facilitate PA research, which requires a shift away from multiple independent calibration studies. Finally, we make the case for a distinction between PA represented by accelerometer-based devices and PA assessed by self-report. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
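
    The shift from count-based processing to feature extraction on raw acceleration signals can be illustrated with a small sketch; the epoch length, sampling rate, and the three features below are an arbitrary minimal choice, not a recommended feature set.

    ```python
    # Illustrative epoch-level feature extraction from a raw acceleration
    # signal (synthetic vector magnitude); features are a minimal example.
    import numpy as np

    rng = np.random.default_rng(6)
    fs, epoch_s = 80, 10                           # Hz, epoch length (s)
    t = np.arange(fs * 60)                         # one minute of data
    acc = 1.0 + 0.3 * np.sin(2 * np.pi * 2.0 * t / fs) \
          + 0.05 * rng.standard_normal(t.size)    # ~2 Hz movement + noise

    def epoch_features(sig):
        spec = np.abs(np.fft.rfft(sig - sig.mean()))
        freqs = np.fft.rfftfreq(sig.size, 1 / fs)
        return {
            "mean": sig.mean(),                    # gravity-dominated level
            "sd": sig.std(),                       # movement intensity
            "dom_freq": freqs[spec.argmax()],      # dominant movement frequency
        }

    epochs = acc.reshape(-1, fs * epoch_s)         # non-overlapping epochs
    features = [epoch_features(e) for e in epochs]
    print(features[0])
    ```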

  8. Evaluation of Large-scale Data to Detect Irregularity in Payment for Medical Services. An Extended Use of Benford's Law.

    PubMed

    Park, Junghyun A; Kim, Minki; Yoon, Seokjoon

    2016-05-17

    Sophisticated anti-fraud systems for the healthcare sector have been built based on several statistical methods. Although existing methods have been developed to detect fraud in the healthcare sector, these algorithms consume considerable time and cost, and lack a theoretical basis for handling large-scale data. Based on mathematical theory, this study proposes a new approach to using Benford's Law, closely examining individual-level data to identify specific fees for in-depth analysis. We extended the mathematical theory to demonstrate the manner in which large-scale data conform to Benford's Law. Then, we empirically tested its applicability using actual large-scale healthcare data from Korea's Health Insurance Review and Assessment (HIRA) National Patient Sample (NPS). For Benford's Law, we considered the mean absolute deviation (MAD) formula to test the large-scale data. We conducted our study on 32 diseases, comprising 25 representative diseases and 7 DRG-regulated diseases. We performed an empirical test on the 25 diseases, showing the applicability of Benford's Law to large-scale data in the healthcare industry. For the seven DRG-regulated diseases, we examined the individual-level data to identify specific fees for in-depth analysis. Among the eight categories of medical costs, we assessed the strength of certain irregularities based on the details of each DRG-regulated disease. Using the degree of abnormality, we propose priority actions to be taken by government health departments and private insurance institutions to bring unnecessary medical expenses under control. However, when we detect deviations from Benford's Law, relatively high contamination ratios are required at conventional significance levels.
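
    A minimal version of the first-digit Benford test with the MAD statistic mentioned above reads as follows; the synthetic lognormal "fees" and the absence of the paper's decision thresholds are assumptions of this sketch.

    ```python
    # First-digit Benford conformity check via mean absolute deviation (MAD).
    # The claim amounts below are synthetic; the paper's thresholds and
    # fee-level drill-down are not reproduced.
    import numpy as np

    def benford_mad(amounts):
        amounts = np.asarray(amounts, dtype=float)
        amounts = amounts[amounts > 0]
        # leading digit: scale every amount into [1, 10) and truncate
        first = (amounts / 10 ** np.floor(np.log10(amounts))).astype(int)
        observed = np.bincount(first, minlength=10)[1:10] / first.size
        expected = np.log10(1 + 1 / np.arange(1, 10))   # Benford P(d)
        return np.abs(observed - expected).mean()

    rng = np.random.default_rng(7)
    claims = rng.lognormal(mean=8, sigma=1.2, size=100_000)  # synthetic fees
    print(f"MAD = {benford_mad(claims):.5f}")  # small values suggest conformity
    ```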

  9. Report of the Fermilab ILC Citizens' Task Force

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Fermi National Accelerator Laboratory convened the ILC Citizens' Task Force to provide guidance and advice to the laboratory to ensure that community concerns and ideas are included in all public aspects of planning and design for a proposed future accelerator, the International Linear Collider. In this report, the members of the Task Force describe the process they used to gather and analyze information on all aspects of the proposed accelerator and its potential location at Fermilab in northern Illinois. They present the conclusions and recommendations they reached as a result of the learning process and their subsequent discussions and deliberations. While the Task Force was charged to provide guidance on the ILC, it became clear during the process that the high cost of the proposed accelerator made a near-term start for the project at Fermilab unlikely. Nevertheless, based on a year of extensive learning and dialogue, the Task Force developed a series of recommendations for Fermilab to consider as the laboratory develops all successor projects to the Tevatron. The Task Force recognizes that bringing a next-generation particle physics project to Fermilab will require both a large international effort and the support of the local community. While the Task Force developed its recommendations in response to the parameters of a future ILC, the principles they set forth apply directly to any large project that may be conceived at Fermilab, or at other laboratories, in the future. With this report, the Task Force fulfills its task of guiding Fermilab from the perspective of the local community on how to move forward with a large-scale project while building positive relationships with surrounding communities. The report summarizes the benefits, concerns and potential impacts of bringing a large-scale scientific project to northern Illinois.

  10. Fire management over large landscapes: a hierarchical approach

    Treesearch

    Kenneth G. Boykin

    2008-01-01

    Management planning for fires becomes increasingly difficult as scale increases. Stratification provides land managers with multiple scales in which to prepare plans. Using statistical techniques, Geographic Information Systems (GIS), and meetings with land managers, we divided a large landscape of over 2 million acres (White Sands Missile Range) into parcels useful in...

  11. Modeling Booklet Effects for Nonequivalent Group Designs in Large-Scale Assessment

    ERIC Educational Resources Information Center

    Hecht, Martin; Weirich, Sebastian; Siegle, Thilo; Frey, Andreas

    2015-01-01

    Multiple matrix designs are commonly used in large-scale assessments to distribute test items to students. These designs comprise several booklets, each containing a subset of the complete item pool. Besides reducing the test burden of individual students, using various booklets allows aligning the difficulty of the presented items to the assumed…

  12. Framing Innovation: The Role of Distributed Leadership in Gaining Acceptance of Large-Scale Technology Initiatives

    ERIC Educational Resources Information Center

    Turner, Henry J.

    2014-01-01

    This dissertation of practice utilized a multiple case-study approach to examine distributed leadership within five school districts that were attempting to gain acceptance of a large-scale 1:1 technology initiative. Using frame theory and distributed leadership theory as theoretical frameworks, this study interviewed each district's…

  13. Plasmonic nanoparticle lithography: Fast resist-free laser technique for large-scale sub-50 nm hole array fabrication

    NASA Astrophysics Data System (ADS)

    Pan, Zhenying; Yu, Ye Feng; Valuckas, Vytautas; Yap, Sherry L. K.; Vienne, Guillaume G.; Kuznetsov, Arseniy I.

    2018-05-01

    Cheap large-scale fabrication of ordered nanostructures is important for multiple applications in photonics and biomedicine, including optical filters, solar cells, plasmonic biosensors, and DNA sequencing. Existing methods are either expensive or have strict limitations on feature size and fabrication complexity. Here, we present a laser-based technique, plasmonic nanoparticle lithography, which is capable of rapid fabrication of large-scale arrays of sub-50 nm holes on various substrates. It is based on near-field enhancement and melting induced under ordered arrays of plasmonic nanoparticles, which are brought into contact with or in close proximity to a desired material and act as optical near-field lenses. The nanoparticles are arranged in ordered patterns on a flexible substrate and can be attached to and removed from the patterned sample surface. At optimized laser fluence, the nanohole patterning process does not create any observable changes to the nanoparticles, and they have been applied multiple times as reusable near-field masks. This resist-free nanolithography technique provides a simple and cheap solution for large-scale nanofabrication.

  14. Data-based discharge extrapolation: estimating annual discharge for a partially gauged large river basin from its small sub-basins

    NASA Astrophysics Data System (ADS)

    Gong, L.

    2013-12-01

    Large-scale hydrological models and land surface models are by far the only tools for assessing future water resources in climate change impact studies. Those models estimate discharge with large uncertainties, due to the complex interaction between climate and hydrology, the limited quality and availability of data, as well as model uncertainties. A new purely data-based scale-extrapolation method is proposed to estimate water resources for a large basin solely from selected small sub-basins, which are typically two orders of magnitude smaller than the large basin. Those small sub-basins contain sufficient information, not only on climate and land surface, but also on hydrological characteristics, for the large basin. In the Baltic Sea drainage basin, the best discharge estimation for the gauged area was achieved with sub-basins that cover 2-4% of the gauged area. There exist multiple sets of sub-basins that resemble the climate and hydrology of the basin equally well. Those multiple sets estimate annual discharge for the gauged area consistently well, with 5% average error. The scale-extrapolation method is completely data-based; therefore it does not force any modelling error into the prediction. The multiple predictions are expected to bracket the inherent variations and uncertainties of the climate and hydrology of the basin. The method can be applied in both un-gauged basins and un-gauged periods with uncertainty estimation.
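
    The scale-extrapolation idea can be sketched as a subset search: keep small sub-basin sets whose area-weighted climate resembles the whole basin, then scale their specific discharge to the full gauged area. All data below are synthetic, and the similarity criterion is an assumed stand-in for the paper's selection procedure.

    ```python
    # Toy scale extrapolation: accept sub-basin subsets that are small (<~4% of
    # area) yet climatically similar to the basin, then scale their specific
    # discharge (runoff per unit area) up to the full gauged area.
    import numpy as np

    rng = np.random.default_rng(8)
    n_sub = 200
    area = rng.lognormal(4, 1, n_sub)                  # sub-basin areas (km^2)
    precip = rng.normal(700, 120, n_sub)               # mean annual precip (mm)
    runoff = 0.5 * precip + rng.normal(0, 30, n_sub)   # specific discharge (mm)

    basin_area = area.sum()
    basin_precip = np.average(precip, weights=area)
    true_q = np.average(runoff, weights=area) * basin_area  # "true" volume

    estimates = []
    for _ in range(500):                               # candidate subsets
        idx = rng.choice(n_sub, 6, replace=False)
        if area[idx].sum() > 0.04 * basin_area:        # subset must stay small
            continue
        if abs(np.average(precip[idx], weights=area[idx]) - basin_precip) > 30:
            continue                                   # climate must match basin
        q_spec = np.average(runoff[idx], weights=area[idx])
        estimates.append(q_spec * basin_area)          # extrapolate to basin
    err = 100 * abs(np.mean(estimates) - true_q) / true_q
    print(f"{len(estimates)} subsets kept; mean extrapolation error {err:.1f}%")
    ```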

  15. Novel patch modelling method for efficient simulation and prediction uncertainty analysis of multi-scale groundwater flow and transport processes

    NASA Astrophysics Data System (ADS)

    Sreekanth, J.; Moore, Catherine

    2018-04-01

    The application of global sensitivity and uncertainty analysis techniques to groundwater models of deep sedimentary basins are typically challenged by large computational burdens combined with associated numerical stability issues. The highly parameterized approaches required for exploring the predictive uncertainty associated with the heterogeneous hydraulic characteristics of multiple aquifers and aquitards in these sedimentary basins exacerbate these issues. A novel Patch Modelling Methodology is proposed for improving the computational feasibility of stochastic modelling analysis of large-scale and complex groundwater models. The method incorporates a nested groundwater modelling framework that enables efficient simulation of groundwater flow and transport across multiple spatial and temporal scales. The method also allows different processes to be simulated within different model scales. Existing nested model methodologies are extended by employing 'joining predictions' for extrapolating prediction-salient information from one model scale to the next. This establishes a feedback mechanism supporting the transfer of information from child models to parent models as well as parent models to child models in a computationally efficient manner. This feedback mechanism is simple and flexible and ensures that while the salient small scale features influencing larger scale prediction are transferred back to the larger scale, this does not require the live coupling of models. This method allows the modelling of multiple groundwater flow and transport processes using separate groundwater models that are built for the appropriate spatial and temporal scales, within a stochastic framework, while also removing the computational burden associated with live model coupling. The utility of the method is demonstrated by application to an actual large scale aquifer injection scheme in Australia.

  16. Deep convolutional neural network based antenna selection in multiple-input multiple-output system

    NASA Astrophysics Data System (ADS)

    Cai, Jiaxin; Li, Yan; Hu, Ying

    2018-03-01

    Antenna selection in wireless communication systems has attracted increasing attention due to the challenge of keeping a balance between communication performance and computational complexity in large-scale Multiple-Input Multiple-Output antenna systems. Recently, deep learning based methods have achieved promising performance for large-scale data processing and analysis in many application fields. This paper is the first attempt to introduce the deep learning technique into the field of Multiple-Input Multiple-Output antenna selection in wireless communications. First, labels for the attenuation-coefficient channel matrices are generated by minimizing the key performance indicator of the training antenna systems. Then, a deep convolutional neural network that explicitly exploits the massive latent cues of the attenuation coefficients is learned on the training antenna systems. Finally, we use the trained deep convolutional neural network to classify the channel matrix labels of test antennas and select the optimal antenna subset. Simulation results demonstrate that our method can achieve better performance than the state-of-the-art baselines for data-driven wireless antenna selection.
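
    A hedged PyTorch sketch of this pipeline is shown below. The paper does not specify its architecture or key performance indicator, so a toy norm-based labeling rule (best single transmit antenna by column energy) and a two-layer CNN are assumed purely for illustration.

    ```python
    # Sketch of CNN-based antenna selection: treat the channel matrix as an
    # image and learn to predict an antenna-subset label. The labeling rule
    # and architecture are assumptions, not the paper's exact design.
    import torch
    import torch.nn as nn

    torch.manual_seed(0)
    n_rx, n_tx, n_classes, n_train = 4, 8, 8, 2000
    H = torch.randn(n_train, 1, n_rx, n_tx)             # channel "images"
    # Toy label: the transmit antenna with largest column energy stands in
    # for the KPI-minimizing subset label of the paper.
    labels = H.squeeze(1).pow(2).sum(dim=1).argmax(dim=1)

    model = nn.Sequential(
        nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        nn.Flatten(),
        nn.Linear(32 * n_rx * n_tx, n_classes),
    )
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    for _ in range(20):                                 # short full-batch training
        opt.zero_grad()
        loss = loss_fn(model(H), labels)
        loss.backward()
        opt.step()
    pred = model(H).argmax(dim=1)
    print("training accuracy:", (pred == labels).float().mean().item())
    ```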

  17. Integrating Data Streams from in-situ Measurements, Social Networks and Satellite Earth Observation to Augment Operational Flood Monitoring and Forecasting: the 2017 Hurricane Season in the Americas as a Large-scale Test Case

    NASA Astrophysics Data System (ADS)

    Matgen, P.; Pelich, R.; Brangbour, E.; Bruneau, P.; Chini, M.; Hostache, R.; Schumann, G.; Tamisier, T.

    2017-12-01

    Hurricanes Harvey, Irma and Maria generated large streams of heterogeneous data, coming notably from three main sources: imagery (satellite and aircraft), in-situ measurement stations and social media. Interpreting these data streams brings critical information to develop, validate and update prediction models. The study addresses existing gaps in the joint extraction of disaster risk information from multiple data sources and its usefulness for reducing the predictive uncertainty of large-scale flood inundation models. Satellite EO data, most notably the free-of-charge data streams generated by the Copernicus program, provided a wealth of high-resolution imagery covering the large areas affected. Our study focuses on the mapping of flooded areas from a sequence of Sentinel-1 SAR imagery using a classification algorithm recently implemented on the European Space Agency's Grid Processing On Demand environment. The end-to-end processing chain provided fast access to all relevant imagery and effective processing for near-real-time analyses. The classification algorithm was applied to pairs of images to rapidly and automatically detect, record and disseminate all observable changes of water bodies. Disaster information was also retrieved from photos as well as texts contributed on social networks, and the study shows how this information may complement EO and in-situ data and augment information content. As social media data are noisy and difficult to geo-localize, different techniques are being developed to automatically infer associated semantics and geotags. The presentation provides a cross-comparison between the hazard information obtained from the three data sources. We provide examples of how the generated database of geo-localized disaster information was integrated into a large-scale hydrodynamic model of the Colorado River emptying into Matagorda Bay on the Gulf of Mexico in order to reduce its predictive uncertainty. We describe the success of these efforts as well as the current limitations in fulfilling the needs of decision-makers. Finally, we also reflect on how these recent developments can leverage the implementation of a more effective response to flood disasters worldwide and can support global initiatives such as the Global Flood Partnership.
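
    The pair-wise change detection at the heart of the flood-mapping step can be illustrated with a log-ratio threshold between two backscatter images, exploiting the fact that open water strongly lowers SAR backscatter. The operational algorithm on the G-POD environment is more sophisticated; the threshold and synthetic imagery below are assumptions.

    ```python
    # Simple stand-in for SAR flood mapping between two acquisitions: flooded
    # pixels darken strongly, so a log-ratio threshold flags the change.
    import numpy as np

    rng = np.random.default_rng(9)
    shape = (512, 512)
    before = rng.gamma(4.0, 0.05, shape)           # speckle-like backscatter
    after = before.copy()
    flood_truth = np.zeros(shape, bool)
    flood_truth[200:350, 100:400] = True           # synthetic inundated patch
    after[flood_truth] *= 0.15                     # water lowers backscatter

    log_ratio = 10 * np.log10(after / before)      # change in dB between images
    flood_map = log_ratio < -5.0                   # assumed threshold (dB)

    agreement = (flood_map == flood_truth).mean()
    print(f"flagged fraction: {flood_map.mean():.3f}, agreement: {agreement:.3f}")
    ```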

  18. Idealized modeling of convective organization with changing sea surface temperatures using multiple equilibria in weak temperature gradient simulations

    NASA Astrophysics Data System (ADS)

    Sentić, Stipo; Sessions, Sharon L.

    2017-06-01

    The weak temperature gradient (WTG) approximation is a method of parameterizing the influences of the large scale on local convection in limited domain simulations. WTG simulations exhibit multiple equilibria in precipitation; depending on the initial moisture content, simulations can precipitate or remain dry for otherwise identical boundary conditions. We use a hypothesized analogy between multiple equilibria in precipitation in WTG simulations, and dry and moist regions of organized convection to study tropical convective organization. We find that the range of wind speeds that support multiple equilibria depends on sea surface temperature (SST). Compared to the present SST, low SSTs support a narrower range of multiple equilibria at higher wind speeds. In contrast, high SSTs exhibit a narrower range of multiple equilibria at low wind speeds. This suggests that at high SSTs, organized convection might occur with lower surface forcing. To characterize convection at different SSTs, we analyze the change in relationships between precipitation rate, atmospheric stability, moisture content, and the large-scale transport of moist entropy and moisture with increasing SSTs. We find an increase in large-scale export of moisture and moist entropy from dry simulations with increasing SST, which is consistent with a strengthening of the up-gradient transport of moisture from dry regions to moist regions in organized convection. Furthermore, the changes in diagnostic relationships with SST are consistent with more intense convection in precipitating regions of organized convection for higher SSTs.

  19. Nonparametric Bayesian Multiple Imputation for Incomplete Categorical Variables in Large-Scale Assessment Surveys

    ERIC Educational Resources Information Center

    Si, Yajuan; Reiter, Jerome P.

    2013-01-01

    In many surveys, the data comprise a large number of categorical variables that suffer from item nonresponse. Standard methods for multiple imputation, like log-linear models or sequential regression imputation, can fail to capture complex dependencies and can be difficult to implement effectively in high dimensions. We present a fully Bayesian,…
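
    The abstract is truncated, but the model family it names (fully Bayesian mixture modeling of multivariate categorical data) can be sketched. Below is a minimal finite latent-class Gibbs sweep with imputation of missing cells, a fixed-K stand-in for the nonparametric Dirichlet-process version; all dimensions and priors are chosen purely for illustration:

      import numpy as np

      rng = np.random.default_rng(0)
      N, J, C, K = 200, 5, 3, 10       # respondents, items, categories, classes
      X = rng.integers(0, C, (N, J))   # stand-in categorical survey data
      miss = rng.random((N, J)) < 0.1  # item-nonresponse mask

      pi = np.full(K, 1.0 / K)                  # latent-class weights
      phi = rng.dirichlet(np.ones(C), (K, J))   # phi[k, j, c] = P(x_j = c | k)

      for sweep in range(100):
          # 1. sample class membership z_i from the observed cells only
          logp = np.tile(np.log(pi), (N, 1))
          for j in range(J):
              obs = ~miss[:, j]
              logp[obs] += np.log(phi[:, j, X[obs, j]].T)
          p = np.exp(logp - logp.max(1, keepdims=True))
          p /= p.sum(1, keepdims=True)
          z = (p.cumsum(1) > rng.random((N, 1))).argmax(1)

          # 2. impute missing cells from their class-conditional multinomials
          for i, j in zip(*np.nonzero(miss)):
              X[i, j] = rng.choice(C, p=phi[z[i], j])

          # 3. conjugate Dirichlet updates for weights and item probabilities
          pi = rng.dirichlet(1.0 + np.bincount(z, minlength=K))
          for k in range(K):
              for j in range(J):
                  phi[k, j] = rng.dirichlet(1.0 + np.bincount(X[z == k, j], minlength=C))

      # Keeping the imputed X from several well-separated sweeps yields the
      # multiple completed datasets used for downstream analysis.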

  20. Seeing the forest for the trees: hybridity and social-ecological symbols, rituals and resilience in postdisaster contexts

    Treesearch

    Keith G. Tidball

    2014-01-01

    The role of community-based natural resources management in the form of "greening" after large scale system shocks and surprises is argued to provide multiple benefits via engagement with living elements of social-ecological systems and subsequent enhanced resilience at multiple scales. The importance of so-called social-ecological symbols, especially the...

  1. Nanomanufacturing : nano-structured materials made layer-by-layer.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cox, James V.; Cheng, Shengfeng; Grest, Gary Stephen

    Large-scale, high-throughput production of nano-structured materials (i.e. nanomanufacturing) is a strategic area in manufacturing, with markets projected to exceed $1T by 2015. Nanomanufacturing is still in its infancy; process/product developments are costly and only touch on potential opportunities enabled by growing nanoscience discoveries. The greatest promise for high-volume manufacturing lies in age-old coating and imprinting operations. For materials with tailored nm-scale structure, imprinting/embossing must be achieved at high speeds (roll-to-roll) and/or over large areas (batch operation) with feature sizes less than 100 nm. Dispersion coatings with nanoparticles can also tailor structure through self- or directed-assembly. Layering films structured with these processes have tremendous potential for efficient manufacturing of microelectronics, photovoltaics and other topical nano-structured devices. This project is designed to perform the requisite R and D to bring Sandia's technology base in computational mechanics to bear on this scale-up problem. Project focus is enforced by addressing a promising imprinting process currently being commercialized.

  2. Measures of Agreement Between Many Raters for Ordinal Classifications

    PubMed Central

    Nelson, Kerrie P.; Edwards, Don

    2015-01-01

    Screening and diagnostic procedures often require a physician's subjective interpretation of a patient's test result using an ordered categorical scale to define the patient's disease severity. Due to wide variability observed between physicians’ ratings, many large-scale studies have been conducted to quantify agreement between multiple experts’ ordinal classifications in common diagnostic procedures such as mammography. However, very few statistical approaches are available to assess agreement in these large-scale settings. Existing summary measures of agreement rely on extensions of Cohen's kappa [1 - 5]. These are prone to prevalence and marginal distribution issues, become increasingly complex for more than three experts or are not easily implemented. Here we propose a model-based approach to assess agreement in large-scale studies based upon a framework of ordinal generalized linear mixed models. A summary measure of agreement is proposed for multiple experts assessing the same sample of patients’ test results according to an ordered categorical scale. This measure avoids some of the key flaws associated with Cohen's kappa and its extensions. Simulation studies are conducted to demonstrate the validity of the approach with comparison to commonly used agreement measures. The proposed methods are easily implemented using the software package R and are applied to two large-scale cancer agreement studies. PMID:26095449
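
    For context, here is a minimal computation of one of the kappa-style summaries the abstract critiques (Fleiss' kappa for many raters); the proposed model-based alternative would require fitting an ordinal generalized linear mixed model, which is beyond a short sketch:

      import numpy as np

      def fleiss_kappa(counts):
          """counts[i, c] = number of raters placing subject i in category c."""
          n = counts.sum(axis=1)                     # raters per subject
          p_cat = counts.sum(axis=0) / counts.sum()  # overall category shares
          P_i = (counts * (counts - 1)).sum(axis=1) / (n * (n - 1))
          P_e = (p_cat ** 2).sum()                   # chance agreement
          return (P_i.mean() - P_e) / (1.0 - P_e)

      # 10 subjects rated by 6 raters on a 4-point ordinal severity scale
      rng = np.random.default_rng(1)
      ratings = rng.integers(0, 4, (10, 6))
      counts = np.stack([np.bincount(r, minlength=4) for r in ratings])
      print(fleiss_kappa(counts))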

  3. Functional Connectivity in Multiple Cortical Networks Is Associated with Performance Across Cognitive Domains in Older Adults.

    PubMed

    Shaw, Emily E; Schultz, Aaron P; Sperling, Reisa A; Hedden, Trey

    2015-10-01

    Intrinsic functional connectivity MRI has become a widely used tool for measuring integrity in large-scale cortical networks. This study examined multiple cortical networks using Template-Based Rotation (TBR), a method that applies a priori network and nuisance component templates defined from an independent dataset to test datasets of interest. A priori templates were applied to a test dataset of 276 older adults (ages 65-90) from the Harvard Aging Brain Study to examine the relationship between multiple large-scale cortical networks and cognition. Factor scores derived from neuropsychological tests represented processing speed, executive function, and episodic memory. Resting-state BOLD data were acquired in two 6-min acquisitions on a 3-Tesla scanner and processed with TBR to extract individual-level metrics of network connectivity in multiple cortical networks. All results controlled for data quality metrics, including motion. Connectivity in multiple large-scale cortical networks was positively related to all cognitive domains, with a composite measure of general connectivity positively associated with general cognitive performance. Controlling for the correlations between networks, the frontoparietal control network (FPCN) and executive function demonstrated the only significant association, suggesting specificity in this relationship. Further analyses found that the FPCN mediated the relationships of the other networks with cognition, suggesting that this network may play a central role in understanding individual variation in cognition during aging.

  4. Multi-scale occupancy estimation and modelling using multiple detection methods

    USGS Publications Warehouse

    Nichols, James D.; Bailey, Larissa L.; O'Connell, Allan F.; Talancy, Neil W.; Grant, Evan H. Campbell; Gilbert, Andrew T.; Annand, Elizabeth M.; Husband, Thomas P.; Hines, James E.

    2008-01-01

    Occupancy estimation and modelling based on detection–nondetection data provide an effective way of exploring change in a species’ distribution across time and space in cases where the species is not always detected with certainty. Today, many monitoring programmes target multiple species, or life stages within a species, requiring the use of multiple detection methods. When multiple methods or devices are used at the same sample sites, animals can be detected by more than one method. We develop occupancy models for multiple detection methods that permit simultaneous use of data from all methods for inference about method-specific detection probabilities. Moreover, the approach permits estimation of occupancy at two spatial scales: the larger scale corresponds to species’ use of a sample unit, whereas the smaller scale corresponds to presence of the species at the local sample station or site. We apply the models to data collected on two different vertebrate species: striped skunks Mephitis mephitis and red salamanders Pseudotriton ruber. For striped skunks, large-scale occupancy estimates were consistent between two sampling seasons. Small-scale occupancy probabilities were slightly lower in the late winter/spring when skunks tend to conserve energy, and movements are limited to males in search of females for breeding. There was strong evidence of method-specific detection probabilities for skunks. As anticipated, large- and small-scale occupancy areas completely overlapped for red salamanders. The analyses provided weak evidence of method-specific detection probabilities for this species. Synthesis and applications. Increasingly, many studies are utilizing multiple detection methods at sampling locations. The modelling approach presented here makes efficient use of detections from multiple methods to estimate occupancy probabilities at two spatial scales and to compare detection probabilities associated with different detection methods. The models can be viewed as another variation of Pollock's robust design and may be applicable to a wide variety of scenarios where species occur in an area but are not always near the sampled locations. The estimation approach is likely to be especially useful in multispecies conservation programmes by providing efficient estimates using multiple detection devices and by providing device-specific detection probability estimates for use in survey design.
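
    The two-scale mixture structure described above can be written down compactly. The sketch below gives a single-season likelihood for one sample unit, with hypothetical parameter names psi (unit-scale use), theta (station-scale presence) and method-specific detection probabilities p; this is a simplification for illustration, not the authors' exact parameterization:

      import numpy as np

      def site_likelihood(y, psi, theta, p):
          """Single-season likelihood for one sample unit.
          y: (S, M) 0/1 detections at S stations by M methods.
          psi: Pr(unit used); theta: Pr(present at a station | unit used);
          p[m]: Pr(detection by method m | present at the station)."""
          det = np.prod(np.where(y == 1, p, 1.0 - p), axis=1)   # per station
          station = theta * det + (1.0 - theta) * (y.sum(axis=1) == 0)
          return psi * np.prod(station) + (1.0 - psi) * float(y.sum() == 0)

      y = np.array([[1, 0], [0, 0], [0, 1]])   # 3 stations, 2 methods
      print(site_likelihood(y, psi=0.8, theta=0.5, p=np.array([0.6, 0.4])))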

  5. Image Tiling for Profiling Large Objects

    NASA Technical Reports Server (NTRS)

    Venkataraman, Ajit; Schock, Harold; Mercer, Carolyn R.

    1992-01-01

    Three-dimensional surface measurements of large objects are required in a variety of industrial processes. The nature of these measurements is changing as optical instruments are beginning to replace conventional contact probes scanned over the objects. A common characteristic of optical surface profilers is the trade-off between measurement accuracy and field of view. In order to measure a large object with high accuracy, multiple views are required. An accurate transformation between the different views is needed to bring about their registration. In this paper, we demonstrate how the transformation parameters can be obtained precisely by choosing control points which lie in the overlapping regions of the images. A good starting point for the transformation parameters is obtained by having a knowledge of the scanner position. The selection of the control points is independent of the object geometry. By successively recording multiple views and obtaining transformations with respect to a single coordinate system, a complete physical model of an object can be obtained. Since all data are in the same coordinate system, they can be used for building automatic models for free-form surfaces.
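
    The registration step described here, recovering the transformation between views from matched control points in the overlap, has a classical closed-form solution. A minimal sketch, assuming a rigid (rotation plus translation) model and the SVD-based Kabsch solution rather than whatever solver the paper used:

      import numpy as np

      def rigid_transform(P, Q):
          """Least-squares R, t with Q ~= R @ P + t, from matched control
          points P, Q of shape (3, n) taken from the overlapping region."""
          cp, cq = P.mean(axis=1, keepdims=True), Q.mean(axis=1, keepdims=True)
          U, _, Vt = np.linalg.svd((Q - cq) @ (P - cp).T)
          D = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])  # exclude reflections
          R = U @ D @ Vt
          return R, cq - R @ cp

      # Synthetic check: recover a known rotation about z plus a translation
      rng = np.random.default_rng(2)
      P = rng.random((3, 10))
      c, s = np.cos(0.3), np.sin(0.3)
      Rz = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
      t_true = np.array([[1.0], [2.0], [0.5]])
      R, t = rigid_transform(P, Rz @ P + t_true)
      print(np.allclose(R, Rz), np.allclose(t, t_true))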

  6. Intrinsic fluctuations of the proton saturation momentum scale in high multiplicity p+p collisions

    DOE PAGES

    McLerran, Larry; Tribedy, Prithwish

    2015-11-02

    High multiplicity events in p+p collisions are studied using the theory of the Color Glass Condensate. Here, we show that intrinsic fluctuations of the proton saturation momentum scale are needed in addition to the sub-nucleonic color charge fluctuations to explain the very high multiplicity tail of distributions in p+p collisions. It is presumed that the origin of such intrinsic fluctuations is non-perturbative in nature. Classical Yang-Mills simulations using the IP-Glasma model are performed to make quantitative estimations. Furthermore, we find that fluctuations as large as O(1) of the average values of the saturation momentum scale can lead to rare high multiplicity events seen in p+p data at RHIC and LHC energies. Using the available data on multiplicity distributions we try to constrain the distribution of the proton saturation momentum scale and make predictions for the multiplicity distribution in 13 TeV p+p collisions.

  7. Geospatial optimization of siting large-scale solar projects

    USGS Publications Warehouse

    Macknick, Jordan; Quinby, Ted; Caulfield, Emmet; Gerritsen, Margot; Diffendorfer, James E.; Haines, Seth S.

    2014-01-01

    guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.

  8. Quantitative nanoscopy: Tackling sampling limitations in (S)TEM imaging of polymers and composites.

    PubMed

    Gnanasekaran, Karthikeyan; Snel, Roderick; de With, Gijsbertus; Friedrich, Heiner

    2016-01-01

    Sampling limitations in electron microscopy raise the question of whether the analysis of a bulk material is representative, especially when analyzing hierarchical morphologies that extend over multiple length scales. We tackled this problem by automatically acquiring a large series of partially overlapping (S)TEM images with sufficient resolution, subsequently stitched together to generate a large-area map using an in-house developed acquisition toolbox (TU/e Acquisition ToolBox) and stitching module (TU/e Stitcher). In addition, we show that quantitative image analysis of the large-scale maps provides representative information that can be related to the synthesis and process conditions of hierarchical materials, which moves electron microscopy analysis towards becoming a bulk characterization tool. We demonstrate the power of such an analysis by examining two different multi-phase materials that are structured over multiple length scales. Copyright © 2015 Elsevier B.V. All rights reserved.
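
    At the heart of stitching partially overlapping tiles is estimating the translational offset between neighbouring images. Below is a toy sketch of one common approach, phase correlation; the abstract does not describe the TU/e Stitcher internals, so this is a generic stand-in:

      import numpy as np

      def phase_correlation_offset(a, b):
          """Integer shift (dy, dx) such that a ~= np.roll(b, (dy, dx), (0, 1))."""
          F = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
          r = np.fft.ifft2(F / (np.abs(F) + 1e-12)).real
          dy, dx = np.unravel_index(np.argmax(r), r.shape)
          dy = dy - a.shape[0] if dy > a.shape[0] // 2 else dy   # signed shifts
          dx = dx - a.shape[1] if dx > a.shape[1] // 2 else dx
          return dy, dx

      rng = np.random.default_rng(3)
      tile = rng.random((256, 256))
      shifted = np.roll(tile, (17, -23), axis=(0, 1))  # simulated tile offset
      print(phase_correlation_offset(shifted, tile))   # -> (17, -23)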

  9. Physical activity correlates with neurological impairment and disability in multiple sclerosis.

    PubMed

    Motl, Robert W; Snook, Erin M; Wynn, Daniel R; Vollmer, Timothy

    2008-06-01

    This study examined the correlation of physical activity with neurological impairment and disability in persons with multiple sclerosis (MS). Eighty individuals with MS wore an accelerometer for 7 days and completed the Symptom Inventory (SI), Performance Scales (PS), and Expanded Disability Status Scale. There were large negative correlations between the accelerometer and SI (r = -0.56; rho = -0.58) and Expanded Disability Status Scale (r = -0.60; rho = -0.69) and a moderate negative correlation between the accelerometer and PS (r = -0.39; rho = -0.48) indicating that physical activity was associated with reduced neurological impairment and disability. Such findings provide a preliminary basis for using an accelerometer and the SI and PS as outcome measures in large-scale prospective and experimental examinations of the effect of physical activity behavior on disability and dependence in MS.

  10. GPU Accelerated DG-FDF Large Eddy Simulator

    NASA Astrophysics Data System (ADS)

    Inkarbekov, Medet; Aitzhan, Aidyn; Sammak, Shervin; Givi, Peyman; Kaltayev, Aidarkhan

    2017-11-01

    A GPU-accelerated simulator is developed and implemented for large eddy simulation (LES) of turbulent flows. The filtered density function (FDF) is utilized for modeling of the subgrid-scale quantities. The filtered transport equations are solved via a discontinuous Galerkin (DG) method and the FDF is simulated via a particle-based Lagrangian Monte-Carlo (MC) method. It is demonstrated that the GPU simulations are of the order of 100 times faster than the CPU-based calculations. This brings LES of turbulent flows to a new level, facilitating efficient simulation of more complex problems. The work at Al-Farabi Kazakh National University is sponsored by MoES of RK under Grant 3298/GF-4.

  11. Mind the gap - tip leakage vortex in axial turbines

    NASA Astrophysics Data System (ADS)

    Dreyer, M.; Decaix, J.; Münch-Alligné, C.; Farhat, M.

    2014-03-01

    The trend of designing large Kaplan turbines with ever-increasing output power is bringing the cavitation erosion issue to the fore. Due to the flow in the gap between the runner and the discharge ring, axial turbine blades may develop so-called tip leakage vortex (TLV) cavitation, with negative consequences. Such vortices may interact strongly with the wake of guide vanes, leading to their multiple collapses and rebounds. If the vortex trajectory remains close to the blade tip, these collapses may lead to severe erosion. One is still unable today to predict its occurrence and development in axial turbines with acceptable accuracy. Numerical flow simulations as well as the current scale-up rules from small to large scales are unreliable. The present work addresses this problem in a simplified case study representing TLV cavitation, to better understand its sensitivity to the gap width. A Naca0009 hydrofoil is used as a generic blade in the test section of the EPFL cavitation tunnel. A sliding mounting support allowing an adjustable gap between the blade tip and the wall was manufactured. The vortex trajectory is visualized with a high-speed camera and appropriate lighting. The three-dimensional velocity field induced by the TLV is investigated using stereo particle image velocimetry. We have taken the vortex wandering into account in the image processing to obtain accurate measurements of the vortex properties. The measurements were performed in three planes located downstream of the hydrofoil for different values of the flow velocity, the incidence angle and the gap width. The results clearly reveal a strong influence of the gap width on both the trajectory and the intensity of the tip leakage vortex.

  12. The build up of the correlation between halo spin and the large-scale structure

    NASA Astrophysics Data System (ADS)

    Wang, Peng; Kang, Xi

    2018-01-01

    Both simulations and observations have confirmed that the spin of haloes/galaxies is correlated with the large-scale structure (LSS) with a mass dependence, such that the spin of low-mass haloes/galaxies tends to be parallel to the LSS, while that of massive haloes/galaxies tends to be perpendicular to it. It is still unclear how this mass dependence is built up over time. We use N-body simulations to trace the evolution of the halo spin-LSS correlation and find that at early times the spin of all halo progenitors is parallel to the LSS. As time goes on, mass collapse around massive haloes becomes more isotropic; in particular, the recent mass accretion along the slowest collapsing direction is significant and turns the halo spin perpendicular to the LSS. Adopting the fractional anisotropy (FA) parameter to describe the degree of anisotropy of the large-scale environment, we find that the spin-LSS correlation is a strong function of the environment, such that a higher FA (more anisotropic environment) leads to an aligned signal, and a lower anisotropy leads to a misaligned signal. In general, our results show that the spin-LSS correlation is a combined consequence of mass flow and halo growth within the cosmic web. Our predicted environmental dependence between spin and large-scale structure can be further tested using galaxy surveys.
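
    The abstract does not define the fractional anisotropy it uses; the standard definition, assuming FA is computed from the eigenvalues \lambda_1 \ge \lambda_2 \ge \lambda_3 of the local tidal/deformation tensor, is

      \[
        \mathrm{FA} \;=\; \sqrt{\tfrac{1}{2}}\;
        \frac{\sqrt{(\lambda_1-\lambda_2)^2 + (\lambda_2-\lambda_3)^2 + (\lambda_3-\lambda_1)^2}}
             {\sqrt{\lambda_1^2 + \lambda_2^2 + \lambda_3^2}},
      \]

    which runs from 0 for a fully isotropic environment to 1 for a maximally anisotropic one.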

  13. Calving distributions of individual bulls in multiple-sire pastures

    USDA-ARS?s Scientific Manuscript database

    The objective of this project was to quantify patterns in the calving rate of sires in multiple-sire pastures over seven years at a large-scale cow-calf operation. Data consisted of reproductive and genomic records from multiple-sire breeding pastures (n=33) at the United States Meat Animal Research...

  14. High Performance Semantic Factoring of Giga-Scale Semantic Graph Databases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joslyn, Cliff A.; Adolf, Robert D.; Al-Saffar, Sinan

    2010-10-04

    As semantic graph database technology grows to address components ranging from extant large triple stores to SPARQL endpoints over SQL-structured relational databases, it will become increasingly important to be able to bring high performance computational resources to bear on their analysis, interpretation, and visualization, especially with respect to their innate semantic structure. Our research group built a novel high performance hybrid system comprising computational capability for semantic graph database processing utilizing the large multithreaded architecture of the Cray XMT platform, conventional clusters, and large data stores. In this paper we describe that architecture, and present the results of deploying it for the analysis of the Billion Triple dataset with respect to its semantic factors.

  15. Evaluation of an index of biotic integrity approach used to assess biological condition in western U.S. streams and rivers at varying spatial scales

    USGS Publications Warehouse

    Meador, M.R.; Whittier, T.R.; Goldstein, R.M.; Hughes, R.M.; Peck, D.V.

    2008-01-01

    Consistent assessments of biological condition are needed across multiple ecoregions to provide a greater understanding of the spatial extent of environmental degradation. However, consistent assessments at large geographic scales are often hampered by lack of uniformity in data collection, analyses, and interpretation. The index of biotic integrity (IBI) has been widely used in eastern and central North America, where fish assemblages are complex and largely composed of native species, but IBI development has been hindered in the western United States because of relatively low fish species richness and greater relative abundance of alien fishes. Approaches to developing IBIs rarely provide a consistent means of assessing biological condition across multiple ecoregions. We conducted an evaluation of IBIs recently proposed for three ecoregions of the western United States using an independent data set covering a large geographic scale. We standardized the regional IBIs and developed biological condition criteria, assessed the responsiveness of IBIs to basin-level land uses, and assessed their precision and concordance with basin-scale IBIs. Standardized IBI scores from 318 sites in the western United States comprising mountain, plains, and xeric ecoregions were significantly related to combined urban and agricultural land uses. Standard deviations and coefficients of variation revealed relatively low variation in IBI scores based on multiple sampling reaches at sites. A relatively high degree of corroboration with independent, locally developed IBIs indicates that the regional IBIs are robust across large geographic scales, providing precise and accurate assessments of biological condition for western U.S. streams. © Copyright by the American Fisheries Society 2008.

  16. Analytical Assessment of the Relationship between 100MWp Large-scale Grid-connected Photovoltaic Plant Performance and Meteorological Parameters

    NASA Astrophysics Data System (ADS)

    Sheng, Jie; Zhu, Qiaoming; Cao, Shijie; You, Yang

    2017-05-01

    This paper studies the relationship between the photovoltaic power generation of a large-scale “fishing and PV complementary” grid-tied photovoltaic system and meteorological parameters, using multi-time-scale power data from the photovoltaic power station and meteorological data over the same whole-year period. The results indicate that PV power generation correlates most significantly with global solar irradiation, followed by diurnal temperature range, sunshine hours, daily maximum temperature and daily average temperature. Across the months, the maximum monthly average power generation appears in August, which is related to the higher global solar irradiation and longer sunshine hours in that month. However, the maximum daily average power generation appears in October, because the drop in temperature improves the efficiency of the PV panels. A comparison of monthly average performance ratio (PR) against monthly average temperature shows that the larger values of monthly average PR appear in April and October, while PR is smaller in summer when temperatures are higher. The results indicate that temperature has a great influence on the performance ratio of a large-scale grid-tied PV power system, and that it is important to adopt effective measures to decrease the temperature of the PV plant.
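
    The correlation ranking reported above is straightforward to reproduce on daily data. A minimal sketch assuming one year of daily records in a pandas DataFrame; all column names and the synthetic numbers are illustrative, not the station's data:

      import numpy as np
      import pandas as pd

      rng = np.random.default_rng(4)
      n = 365
      irr = rng.gamma(5.0, 3.0, n)                  # daily global irradiation
      t_min = rng.normal(14.0, 6.0, n)
      df = pd.DataFrame({
          "generation_MWh": 0.9 * irr + rng.normal(0.0, 2.0, n),
          "global_irradiation": irr,
          "sunshine_hours": 0.4 * irr + rng.normal(0.0, 1.0, n),
          "t_max": t_min + rng.gamma(4.0, 2.0, n),
          "t_min": t_min,
      })
      df["t_mean"] = (df["t_max"] + df["t_min"]) / 2.0
      df["diurnal_range"] = df["t_max"] - df["t_min"]

      # Rank meteorological drivers by correlation with daily output
      corr = df.corr(numeric_only=True)["generation_MWh"].drop("generation_MWh")
      print(corr.sort_values(ascending=False))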

  17. Analyzing the Validity of Relationship Banking through Agent-based Modeling

    NASA Astrophysics Data System (ADS)

    Nishikido, Yukihito; Takahashi, Hiroshi

    This article analyzes the validity of relationship banking through agent-based modeling. In the analysis, we especially focus on the relationship between economic conditions and both lenders' and borrowers' behaviors. Intensive experiments yielded the following findings: (1) relationship banking contributes to reducing bad loans; (2) relationship banking is more effective than transaction banking in enhancing market growth when borrowers' sales scale is large; (3) keener competition among lenders may bring inefficiency to the market.

  18. Ending America’s Energy Insecurity: How Electric Vehicles Can Drive the Solution to Energy Independence

    DTIC Science & Technology

    2011-12-01

    …road oil, aviation gasoline, kerosene, lubricants, naphtha-type jet fuel, pentanes plus, petrochemical feedstocks, special naphthas, still gas (refinery gas), waxes, miscellaneous products, and crude oil burned as fuel. [Figure 2: Uses of Oil (EIA, 2010a, p. 148)] There have been efforts in the past to bring about the adoption of EVs or other zero-emissions vehicles.

  19. Bridging the Gap Between Theory and Practice: Structure and Randomization in Large Scale Combinatorial Search

    DTIC Science & Technology

    2012-01-17

    …real-world problems. Our work brings together techniques from constraint programming, mathematical programming, and satisfiability in a symbiotic way to… power-law search tree model for complete or exact methods (see [9] for a detailed description of this work and…

  20. Automatic item generation implemented for measuring artistic judgment aptitude.

    PubMed

    Bezruczko, Nikolaus

    2014-01-01

    Automatic item generation (AIG) is a broad class of methods that are being developed to address psychometric issues arising from internet and computer-based testing. In general, issues emphasize efficiency, validity, and diagnostic usefulness of large scale mental testing. Rapid prominence of AIG methods and their implicit perspective on mental testing is bringing painful scrutiny to many sacred psychometric assumptions. This report reviews basic AIG ideas, then presents conceptual foundations, image model development, and operational application to artistic judgment aptitude testing.

  1. A new resource for developing and strengthening large-scale community health worker programs.

    PubMed

    Perry, Henry; Crigler, Lauren; Lewin, Simon; Glenton, Claire; LeBan, Karen; Hodgins, Steve

    2017-01-12

    Large-scale community health worker programs are now growing in importance around the world in response to the resurgence of interest and growing evidence of the importance of community-based primary health care for improving the health of populations in resource-constrained, high-mortality settings. These programs, because of their scale and operational challenges, merit special consideration by the global health community, national policy-makers, and program implementers. A new online resource is now available to assist in that effort: Developing and Strengthening Community Health Worker Programs at Scale: A Reference Guide and Case Studies for Program Managers and Policymakers ( http://www.mchip.net/CHWReferenceGuide ). This CHW Reference Guide is the product of 27 different collaborators who, collectively, have a formidable breadth and depth of experience and knowledge about CHW programming around the world. It provides a thoughtful discussion about the many operational issues that large-scale CHW programs need to address as they undergo the process of development, expansion or strengthening. Detailed case studies of 12 national CHW programs are included in the Appendix-the most current and complete cases studies as a group that are currently available. Future articles in this journal will highlight many of the themes in the CHW Reference Guide and provide an update of recent advances and experiences. These articles will serve, we hope, to (1) increase awareness about the CHW Reference Guide and its usefulness and (2) connect a broader audience to the critical importance of strengthening large-scale CHW programs for the health benefits that they can bring to underserved populations around the world.

  2. HAlign-II: efficient ultra-large multiple sequence alignment and phylogenetic tree reconstruction with distributed and parallel computing.

    PubMed

    Wan, Shixiang; Zou, Quan

    2017-01-01

    Multiple sequence alignment (MSA) plays a key role in biological sequence analyses, especially in phylogenetic tree construction. The extreme increase in next-generation sequencing output has created a shortage of efficient approaches for ultra-large biological sequence alignment that can cope with different sequence types. Distributed and parallel computing represents a crucial technique for accelerating ultra-large (e.g. files of more than 1 GB) sequence analyses. Based on HAlign and the Spark distributed computing system, we implement a highly cost-efficient and time-efficient HAlign-II tool to address ultra-large multiple biological sequence alignment and phylogenetic tree construction. Experiments on large-scale DNA and protein data sets, with files of more than 1 GB, showed that HAlign-II saves both time and space and outperforms current software tools. HAlign-II can efficiently carry out MSA and construct phylogenetic trees with ultra-large numbers of biological sequences, shows extremely high memory efficiency, and scales well with increases in computing resources. HAlign-II provides a user-friendly web server based on our distributed computing infrastructure, and was released with open-source code and datasets at http://lab.malab.cn/soft/halign.

  3. Bringing School Reform to Scale: Five Award-Winning Urban Districts. Educational Innovations

    ERIC Educational Resources Information Center

    Zavadsky, Heather

    2009-01-01

    "Bringing School Reform to Scale" looks in detail at five school districts that have been honored in recent years by The Broad Foundation, whose annual award is granted "each year to the urban school districts that demonstrate the greatest overall performance and improvement in student achievement while reducing achievement gaps among poor and…

  4. Pollutant Transport and Fate: Relations Between Flow-paths and Downstream Impacts of Human Activities

    NASA Astrophysics Data System (ADS)

    Thorslund, J.; Jarsjo, J.; Destouni, G.

    2017-12-01

    The quality of freshwater resources is increasingly impacted by human activities. Humans also extensively change the structure of landscapes, which may alter natural hydrological processes. To manage and maintain freshwater of good quality, it is critical to understand how pollutants are released into, transported and transformed within the hydrological system. Some key scientific questions include: What are the net downstream impacts of pollutants across different hydroclimatic and human disturbance conditions, and on different scales? What are the functions within and between components of the landscape, such as wetlands, in mitigating pollutant load delivery to downstream recipients? We explore these questions by synthesizing results from several relevant case studies of intensely human-impacted hydrological systems. These case study sites have been specifically evaluated in terms of the net impact of human activities on pollutant input to the aquatic system, as well as flow-path distributions through wetlands as a potential ecosystem service of pollutant mitigation. Results show that although individual wetlands have high retention capacity, efficient net retention effects were not always achieved at the larger landscape scale. Evidence suggests that the function of wetlands as mitigation solutions for pollutant loads is largely controlled by large-scale parallel and circular flow-paths, through which multiple wetlands are interconnected in the landscape. To achieve net mitigation effects at large scale, a large fraction of the polluted large-scale flows must be transported through multiple connected wetlands. Although such large-scale flow interactions are critical for assessing water pollution spreading and fate through the landscape, our synthesis shows a frequent lack of knowledge at such scales. We suggest ways forward for addressing the mismatch between the large scales at which key pollutant pressures and water quality changes take place and the relatively small scale at which most studies and implementations are currently made. These suggestions can help bridge critical knowledge gaps, as needed for improving water quality predictions and mitigation solutions under human and environmental changes.

  5. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions.

    PubMed

    Pavlacky, David C; Lukacs, Paul M; Blakesley, Jennifer A; Skorkowsky, Robert C; Klute, David S; Hahn, Beth A; Dreitz, Victoria J; George, T Luke; Hanni, David J

    2017-01-01

    Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous statistical design and analyses ensures reliable knowledge about bird populations that is relevant and integral to bird conservation at multiple scales.

  6. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions

    PubMed Central

    Hahn, Beth A.; Dreitz, Victoria J.; George, T. Luke

    2017-01-01

    Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer’s sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer’s sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous statistical design and analyses ensures reliable knowledge about bird populations that is relevant and integral to bird conservation at multiple scales. PMID:29065128

  7. Increase in Synchronization of Autonomic Rhythms between Individuals When Listening to Music

    PubMed Central

    Bernardi, Nicolò F.; Codrons, Erwan; di Leo, Rita; Vandoni, Matteo; Cavallaro, Filippo; Vita, Giuseppe; Bernardi, Luciano

    2017-01-01

    In light of theories postulating a role for music in forming emotional and social bonds, here we investigated whether endogenous rhythms synchronize between multiple individuals when listening to music. Cardiovascular and respiratory recordings were taken from multiple individuals (musically trained or music-naïve) simultaneously, at rest and during a live concert comprising music excerpts with varying degrees of complexity of the acoustic envelope. Inter-individual synchronization of cardiorespiratory rhythms showed a subtle but reliable increase while passively listening to music compared to baseline. The low-level auditory features of the music were largely responsible for creating or disrupting such synchronism, explaining ~80% of its variance, over and beyond subjective musical preferences and previous musical training. Listening to simple rhythms and melodies, which largely dominate the choice of music during rituals and mass events, brings individuals together in terms of their physiological rhythms, which could explain why music is widely used to favor social bonds. PMID:29089898

  8. Towards AI-powered personalization in MOOC learning

    NASA Astrophysics Data System (ADS)

    Yu, Han; Miao, Chunyan; Leung, Cyril; White, Timothy John

    2017-12-01

    Massive Open Online Courses (MOOCs) represent a form of large-scale learning that is changing the landscape of higher education. In this paper, we offer a perspective on how advances in artificial intelligence (AI) may enhance learning and research on MOOCs. We focus on emerging AI techniques including how knowledge representation tools can enable students to adjust the sequence of learning to fit their own needs; how optimization techniques can efficiently match community teaching assistants to MOOC mediation tasks to offer personal attention to learners; and how virtual learning companions with human traits such as curiosity and emotions can enhance learning experience on a large scale. These new capabilities will also bring opportunities for educational researchers to analyse students' learning skills and uncover points along learning paths where students with different backgrounds may require different help. Ethical considerations related to the application of AI in MOOC education research are also discussed.

  9. The Saskatchewan River Basin - a large scale observatory for water security research (Invited)

    NASA Astrophysics Data System (ADS)

    Wheater, H. S.

    2013-12-01

    The 336,000 km2 Saskatchewan River Basin (SaskRB) in Western Canada illustrates many of the issues of Water Security faced world-wide. It poses globally-important science challenges due to the diversity in its hydro-climate and ecological zones. With one of the world's more extreme climates, it embodies environments of global significance, including the Rocky Mountains (source of the major rivers in Western Canada), the Boreal Forest (representing 30% of Canada's land area) and the Prairies (home to 80% of Canada's agriculture). Management concerns include: provision of water resources to more than three million inhabitants, including indigenous communities; balancing competing needs for water between different uses, such as urban centres, industry, agriculture, hydropower and environmental flows; issues of water allocation between upstream and downstream users in the three prairie provinces; managing the risks of flood and droughts; and assessing water quality impacts of discharges from major cities and intensive agricultural production. Superimposed on these issues is the need to understand and manage uncertain water futures, including effects of economic growth and environmental change, in a highly fragmented water governance environment. Key science questions focus on understanding and predicting the effects of land and water management and environmental change on water quantity and quality. To address the science challenges, observational data are necessary across multiple scales. This requires focussed research at intensively monitored sites and small watersheds to improve process understanding and fine-scale models. To understand large-scale effects on river flows and quality, land-atmosphere feedbacks, and regional climate, integrated monitoring, modelling and analysis is needed at large basin scale. And to support water management, new tools are needed for operational management and scenario-based planning that can be implemented across multiple scales and multiple jurisdictions. The SaskRB has therefore been developed as a large scale observatory, now a Regional Hydroclimate Project of the World Climate Research Programme's GEWEX project, and is available to contribute to the emerging North American Water Program. State-of-the-art hydro-ecological experimental sites have been developed for the key biomes, and a river and lake biogeochemical research facility, focussed on impacts of nutrients and exotic chemicals. Data are integrated at SaskRB scale to support the development of improved large scale climate and hydrological modelling products, the development of DSS systems for local, provincial and basin-scale management, and the development of related social science research, engaging stakeholders in the research and exploring their values and priorities for water security. The observatory provides multiple scales of observation and modelling required to develop: a) new climate, hydrological and ecological science and modelling tools to address environmental change in key environments, and their integrated effects and feedbacks at large catchment scale, b) new tools needed to support river basin management under uncertainty, including anthropogenic controls on land and water management and c) the place-based focus for the development of new transdisciplinary science.

  10. CROSS-SCALE CORRELATIONS AND THE DESIGN AND ANALYSIS OF AVIAN HABITAT SELECTION STUDIES

    EPA Science Inventory

    It has long been suggested that birds select habitat hierarchically, progressing from coarser to finer spatial scales. This hypothesis, in conjunction with the realization that many organisms likely respond to environmental patterns at multiple spatial scales, has led to a large ...

  11. Comparing multi-module connections in membrane chromatography scale-up.

    PubMed

    Yu, Zhou; Karkaria, Tishtar; Espina, Marianela; Hunjun, Manjeet; Surendran, Abera; Luu, Tina; Telychko, Julia; Yang, Yan-Ping

    2015-07-20

    Membrane chromatography is increasingly used for protein purification in the biopharmaceutical industry. Membrane adsorbers are often pre-assembled by manufacturers as ready-to-use modules. In large-scale protein manufacturing settings, the use of multiple membrane modules for a single batch is often required due to the large quantity of feed material. The question as to how multiple modules can be connected to achieve optimum separation and productivity has been previously approached using model proteins and mass transport theories. In this study, we compare the performance of multiple membrane modules in series and in parallel in the production of a protein antigen. Series connection was shown to provide superior separation compared to parallel connection in the context of competitive adsorption. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Eucalyptus plantations for energy production in Hawaii. 1980 annual report, January 1980-December 1980

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitesell, C. D.

    1980-01-01

    In 1980, 200 acres of eucalyptus trees were planted for a research and development biomass energy plantation, bringing the total area under cultivation to 300 acres. Of this total acreage, 90 acres or 30% was planted in experimental plots. The remaining 70% of the cultivated area was closely monitored to determine the economic cost/benefit ratio of large-scale biomass energy production. In the large-scale plantings, standard field practices were set up for all phases of production: nursery, clearing, planting, weed control and fertilization. These practices were constantly evaluated for potential improvements in efficiency and reduced cost. Promising experimental treatments were implemented on a large scale to test their effectiveness under field production conditions. In the experimental areas all scheduled data collection in 1980 has been completed and most measurements have been keypunched and analyzed. Soil samples and leaf samples have been analyzed for nutrient concentrations. Crop logging procedures have been set up to monitor tree growth through plant tissue analysis. An intensive computer search on biomass, nursery practices, harvesting equipment and herbicide applications has been completed through the services of the US Forest Service.

  13. Density-dependent clustering: I. Pulling back the curtains on motions of the BAO peak

    NASA Astrophysics Data System (ADS)

    Neyrinck, Mark C.; Szapudi, István; McCullagh, Nuala; Szalay, Alexander S.; Falck, Bridget; Wang, Jie

    2018-05-01

    The most common statistic used to analyze large-scale structure surveys is the correlation function, or power spectrum. Here, we show how `slicing' the correlation function on local density brings sensitivity to interesting non-Gaussian features in the large-scale structure, such as the expansion or contraction of baryon acoustic oscillations (BAO) according to the local density. The sliced correlation function measures the large-scale flows that smear out the BAO, instead of just correcting them as reconstruction algorithms do. Thus, we expect the sliced correlation function to be useful in constraining the growth factor, and modified gravity theories that involve the local density. Out of the studied cases, we find that the run of the BAO peak location with density is best revealed when slicing on a ~40 h⁻¹ Mpc filtered density. But slicing on a ~100 h⁻¹ Mpc filtered density may be most useful in distinguishing between underdense and overdense regions, whose BAO peaks are separated by a substantial ~5 h⁻¹ Mpc at z = 0. We also introduce `curtain plots' showing how local densities drive particle motions toward or away from each other over the course of an N-body simulation.
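
    A toy sketch of the density-sliced estimator on a random catalog: pairs are binned by separation, with one pair member restricted to a local-density slice. Real analyses use tree-based pair counting and proper estimators (e.g. Landy-Szalay); everything below is an illustrative stand-in:

      import numpy as np

      rng = np.random.default_rng(5)
      L, n = 100.0, 400
      pos = rng.random((n, 3)) * L                 # toy periodic "catalog"

      d = pos[:, None, :] - pos[None, :, :]
      d -= L * np.round(d / L)                     # minimum-image convention
      r = np.sqrt((d ** 2).sum(-1))                # all pairwise separations
      dens = (r < 10.0).sum(1) - 1                 # neighbours within R = 10

      edges = np.linspace(5.0, 50.0, 10)           # separation bins
      shell = 4.0 / 3.0 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)

      def sliced_xi(mask):
          """Correlation with the *first* pair member in the density slice,
          relative to the pair counts expected for an unclustered field."""
          counts, _ = np.histogram(r[mask], bins=edges)
          return counts / (mask.sum() * n * shell / L ** 3) - 1.0

      half = dens > np.median(dens)
      print("xi | low-density slice :", sliced_xi(~half).round(3))   # ~0 here
      print("xi | high-density slice:", sliced_xi(half).round(3))    # ~0 here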

  14. Existence of k⁻¹ power-law scaling in the equilibrium regions of wall-bounded turbulence explained by Heisenberg's eddy viscosity.

    PubMed

    Katul, Gabriel G; Porporato, Amilcare; Nikora, Vladimir

    2012-12-01

    The existence of a "-1" power-law scaling at low wavenumbers in the longitudinal velocity spectrum of wall-bounded turbulence was explained by multiple mechanisms; however, experimental support has not been uniform across laboratory studies. This letter shows that Heisenberg's eddy viscosity approach can provide a theoretical framework that bridges these multiple mechanisms and explains the elusiveness of the "-1" power law in some experiments. Novel theoretical outcomes are conjectured about the role of intermittency and very-large scale motions in modifying the k⁻¹ scaling.
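
    For reference, Heisenberg's spectral eddy viscosity, the quantity the letter builds on, is usually written as

      \[
        \nu_t(k) \;=\; C_H \int_k^{\infty} \sqrt{\frac{E(k')}{k'^{3}}}\;\mathrm{d}k',
        \qquad
        E(k) \;\propto\; u_*^2\, k^{-1}
      \]

    in the low-wavenumber equilibrium region, where E(k) is the energy spectrum, C_H a constant and u_* the friction velocity; these standard forms are quoted here for context rather than taken from the letter itself.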

  15. Applicability of SCAR markers to food genomics: olive oil traceability.

    PubMed

    Pafundo, Simona; Agrimonti, Caterina; Maestri, Elena; Marmiroli, Nelson

    2007-07-25

    DNA analysis with molecular markers has opened a shortcut toward a genomic comprehension of complex organisms. The availability of micro-DNA extraction methods, coupled with selective amplification of the smallest extracted fragments with molecular markers, could equally bring a breakthrough in food genomics: the identification of original components in food. Amplified fragment length polymorphisms (AFLPs) have been instrumental in plant genomics because they may allow rapid and reliable analysis of multiple and potentially polymorphic sites. Nevertheless, their direct application to the analysis of DNA extracted from food matrixes is complicated by the low quality of DNA extracted: its high degradation and the presence of inhibitors of enzymatic reactions. The conversion of an AFLP fragment to a robust and specific single-locus PCR-based marker, therefore, could extend the use of molecular markers to large-scale analysis of complex agro-food matrixes. In the present study is reported the development of sequence characterized amplified regions (SCARs) starting from AFLP profiles of monovarietal olive oils analyzed on agarose gel; one of these was used to identify differences among 56 olive cultivars. All the developed markers were purposefully amplified in olive oils to apply them to olive oil traceability.

  16. Hierarchical modeling and robust synthesis for the preliminary design of large scale complex systems

    NASA Astrophysics Data System (ADS)

    Koch, Patrick Nathan

    Large-scale complex systems are characterized by multiple interacting subsystems and the analysis of multiple disciplines. The design and development of such systems inevitably requires the resolution of multiple conflicting objectives. The size of complex systems, however, prohibits the development of comprehensive system models, and thus these systems must be partitioned into their constituent parts. Because simultaneous solution of individual subsystem models is often not manageable iteration is inevitable and often excessive. In this dissertation these issues are addressed through the development of a method for hierarchical robust preliminary design exploration to facilitate concurrent system and subsystem design exploration, for the concurrent generation of robust system and subsystem specifications for the preliminary design of multi-level, multi-objective, large-scale complex systems. This method is developed through the integration and expansion of current design techniques: (1) Hierarchical partitioning and modeling techniques for partitioning large-scale complex systems into more tractable parts, and allowing integration of subproblems for system synthesis, (2) Statistical experimentation and approximation techniques for increasing both the efficiency and the comprehensiveness of preliminary design exploration, and (3) Noise modeling techniques for implementing robust preliminary design when approximate models are employed. The method developed and associated approaches are illustrated through their application to the preliminary design of a commercial turbofan turbine propulsion system; the turbofan system-level problem is partitioned into engine cycle and configuration design and a compressor module is integrated for more detailed subsystem-level design exploration, improving system evaluation.

  17. From a meso- to micro-scale connectome: array tomography and mGRASP

    PubMed Central

    Rah, Jong-Cheol; Feng, Linqing; Druckmann, Shaul; Lee, Hojin; Kim, Jinhyun

    2015-01-01

    Mapping mammalian synaptic connectivity has long been an important goal of neuroscience because knowing how neurons and brain areas are connected underpins an understanding of brain function. Meeting this goal requires advanced techniques with single synapse resolution and large-scale capacity, especially at multiple scales tethering the meso- and micro-scale connectome. Among several advanced LM-based connectome technologies, Array Tomography (AT) and mammalian GFP-Reconstitution Across Synaptic Partners (mGRASP) can provide relatively high-throughput mapping synaptic connectivity at multiple scales. AT- and mGRASP-assisted circuit mapping (ATing and mGRASPing), combined with techniques such as retrograde virus, brain clearing techniques, and activity indicators will help unlock the secrets of complex neural circuits. Here, we discuss these useful new tools to enable mapping of brain circuits at multiple scales, some functional implications of spatial synaptic distribution, and future challenges and directions of these endeavors. PMID:26089781

  18. Applying Multidimensional Item Response Theory Models in Validating Test Dimensionality: An Example of K-12 Large-Scale Science Assessment

    ERIC Educational Resources Information Center

    Li, Ying; Jiao, Hong; Lissitz, Robert W.

    2012-01-01

    This study investigated the application of multidimensional item response theory (IRT) models to validate test structure and dimensionality. Multiple content areas or domains within a single subject often exist in large-scale achievement tests. Such areas or domains may cause multidimensionality or local item dependence, which both violate the…
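
    The abstract is truncated, but the model family it names can be stated. A compensatory multidimensional two-parameter logistic IRT model (a common choice, assumed here rather than taken from the article) gives the probability of a correct response to item j by examinee i as

      \[
        P(X_{ij}=1 \mid \boldsymbol{\theta}_i)
        \;=\; \frac{1}{1 + \exp\!\left[-(\mathbf{a}_j^{\top}\boldsymbol{\theta}_i + d_j)\right]},
      \]

    where \theta_i is the latent ability vector, a_j the item discrimination vector and d_j the item intercept; dimensionality checks then compare model fit across different numbers of latent dimensions.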

  19. An Illustrative Guide to the Minerva Framework

    NASA Astrophysics Data System (ADS)

    Flom, Erik; Leonard, Patrick; Hoeffel, Udo; Kwak, Sehyun; Pavone, Andrea; Svensson, Jakob; Krychowiak, Maciej; Wendelstein 7-X Team Collaboration

    2017-10-01

    Modern physics experiments require tracking and modelling data and their associated uncertainties on a large scale, as well as the combined implementation of multiple independent data streams for sophisticated modelling and analysis. The Minerva Framework offers a centralized, user-friendly method of large-scale physics modelling and scientific inference. Currently used by teams at multiple large-scale fusion experiments including the Joint European Torus (JET) and Wendelstein 7-X (W7-X), the Minerva framework provides a forward-model friendly architecture for developing and implementing models for large-scale experiments. One aspect of the framework involves so-called data sources, which are nodes in the graphical model. These nodes are supplied with engineering and physics parameters. When end-user level code calls a node, it is checked network-wide against its dependent nodes for changes since its last implementation and returns version-specific data. Here, a filterscope data node is used as an illustrative example of the Minerva Framework's data management structure and its further application to Bayesian modelling of complex systems. This work has been carried out within the framework of the EUROfusion Consortium and has received funding from the Euratom research and training programme 2014-2018 under Grant Agreement No. 633053.
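
    A minimal sketch of the dependency-checking behaviour described above (a node recomputes only when an upstream node has changed since its last evaluation, and returns version-specific data otherwise). The class and method names are invented for illustration and are not Minerva's API:

      class Node:
          """Toy versioned node in a dependency graph: recompute only when an
          upstream node has changed since the last evaluation."""
          def __init__(self, name, compute, parents=()):
              self.name, self.compute, self.parents = name, compute, list(parents)
              self.version, self._seen, self._cache = 0, None, None

          def set(self, compute):                  # a changed input or model
              self.compute, self._cache = compute, None
              self.version += 1

          def value(self):
              vals = [p.value() for p in self.parents]      # refresh upstream
              upstream = {p.name: p.version for p in self.parents}
              if self._cache is None or upstream != self._seen:
                  self._seen = upstream
                  self._cache = self.compute(*vals)
                  self.version += 1
              return self._cache

      # An engineering parameter feeding a derived, filterscope-like data node
      gain = Node("gain", lambda: 2.0)
      signal = Node("signal", lambda: [1.0, 2.0, 3.0])
      calibrated = Node("calibrated", lambda g, s: [g * x for x in s],
                        parents=[gain, signal])

      print(calibrated.value())   # computed once: [2.0, 4.0, 6.0]
      gain.set(lambda: 3.0)       # upstream change bumps gain's version
      print(calibrated.value())   # recomputed:    [3.0, 6.0, 9.0]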

  20. Education and Outreach with the Virtual Astronomical Observatory

    NASA Astrophysics Data System (ADS)

    Lawton, Brandon L.; Eisenhamer, B.; Raddick, M. J.; Mattson, B. J.; Harris, J.

    2012-01-01

    The Virtual Observatory (VO) is an international effort to bring a large-scale electronic integration of astronomy data, tools, and services to the global community. The Virtual Astronomical Observatory (VAO) is the U.S. NSF- and NASA-funded VO effort that seeks to put efficient astronomical tools in the hands of U.S. astronomers, students, educators, and public outreach leaders. These tools will make use of data collected by the multitude of ground- and space-based missions over the previous decades. Many future missions will also be incorporated into the VAO tools when they launch. The Education and Public Outreach (E/PO) program for the VAO is led by the Space Telescope Science Institute in collaboration with the HEASARC E/PO program and Johns Hopkins University. VAO E/PO efforts seek to bring technology, real-world astronomical data, and the story of the development and infrastructure of the VAO to the general public, formal education, and informal education communities. Our E/PO efforts will be structured to provide uniform access to VAO information, enabling educational opportunities across multiple wavelengths and time-series data sets. The VAO team recognizes that many VO programs have built powerful tools for E/PO purposes, such as Microsoft's World Wide Telescope, SDSS Sky Server, Aladin, and a multitude of citizen-science tools available from Zooniverse. We are building partnerships with Microsoft, Zooniverse, and NASA's Night Sky Network to leverage the communities and tools that already exist to meet the needs of our audiences. Our formal education program is standards-based and aims to give teachers the tools to use real astronomical data to teach the STEM subjects. To determine which tools the VAO will incorporate into the formal education program, needs assessments will be conducted with educators across the U.S.

  1. Improved Small Baseline processing by means of CAESAR eigen-interferograms decomposition

    NASA Astrophysics Data System (ADS)

    Verde, Simona; Reale, Diego; Pauciullo, Antonio; Fornaro, Gianfranco

    2018-05-01

    The Component extrAction and sElection SAR (CAESAR) is a method, recently proposed in the multibaseline interferometric SAR framework, for the selection and filtering of scattering mechanisms. Its strength lies in the possibility of selecting and extracting multiple dominant scattering mechanisms, even when they interfere in the same pixel, starting from the interferogram-generation stage, and of filtering decorrelation phase noise. Up to now, the validation of CAESAR has been addressed in the framework of SAR Tomography for the model-based detection of Persistent Scatterers (PSs). In this paper we investigate the effectiveness of using CAESAR eigen-interferograms in classical multi-baseline DInSAR processing based on the Small BAseline Subset (SBAS) strategy, typically adopted to extract large-scale distributed deformation and the atmospheric phase screen. Such components are also exploited for the calibration of the full-resolution data for PS or tomographic analysis. By using COSMO-SkyMed (CSK) SAR data, it is demonstrated that dominant-scattering-component filtering effectively improves the monitoring of distributed, spatially decorrelated areas (e.g., bare soil, rocks) and brings to light man-made structures with dominant backscattering characteristics embedded in highly temporally decorrelated scenarios, such as isolated asphalt roads and blocks of buildings in non-urban areas. Moreover, it is shown that, thanks to the separation of multiple scattering components by CAESAR, the layover mitigation in low-topography eigen-interferograms relieves Phase Unwrapping (PhU) errors in urban areas due to abrupt height variations.

  2. Water shortage risk assessment considering large-scale regional transfers: a copula-based uncertainty case study in Lunan, China.

    PubMed

    Gao, Xueping; Liu, Yinzhu; Sun, Bowen

    2018-06-05

    The risk of water shortage caused by uncertainties, such as frequent drought, varied precipitation, multiple water resources, and differing water demands, brings new challenges to water transfer projects. Uncertainty affects both the transferred water and the local surface water; therefore, the relationship between them should be thoroughly studied to prevent water shortage. For more effective water management, an uncertainty-based water shortage risk assessment model (UWSRAM) is developed to study the combined effect of multiple water resources and analyze the degree of shortage under uncertainty. The UWSRAM combines copula-based Monte Carlo stochastic simulation with a chance-constrained programming-stochastic multiobjective optimization model, using the Lunan water-receiving area in China as an example. Statistical copula functions are employed to estimate the joint probability of available transferred water and local surface water and to sample from the multivariate probability distribution; these samples are used as inputs for the optimization model. The approach reveals the distribution of water shortage, emphasizes the importance of improving and updating the management of transferred water and local surface water, and examines their combined influence on water shortage risk assessment. Applying the UWSRAM, the possible available water and shortages can be calculated, along with the corresponding allocation measures under different water-availability levels and violation probabilities. The UWSRAM is valuable for understanding the overall multi-source water supply and degree of water shortage, adapting to the uncertainty surrounding water resources, establishing effective water resource planning policies for managers and achieving sustainable development.
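
    A minimal sketch of the copula step, assuming (purely for illustration) a Gaussian copula, gamma marginals, and a fixed demand; the abstract does not commit to these particular choices. Correlated availabilities of transferred and local water are sampled, then shortage statistics are estimated:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        rho, n = 0.6, 100_000                                    # assumed correlation, sample size
        z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
        u = stats.norm.cdf(z)                                    # uniform marginals: the copula

        transfer = stats.gamma(a=4.0, scale=25.0).ppf(u[:, 0])   # assumed marginal (10^6 m^3)
        local = stats.gamma(a=3.0, scale=30.0).ppf(u[:, 1])      # assumed marginal (10^6 m^3)

        demand = 180.0                                           # assumed demand (10^6 m^3)
        shortage = np.maximum(demand - (transfer + local), 0.0)
        print("P(shortage) =", (shortage > 0).mean())
        print("mean shortage | shortage =", shortage[shortage > 0].mean())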

  3. Strong-lensing analysis of MACS J0717.5+3745 from Hubble Frontier Fields observations: How well can the mass distribution be constrained?

    NASA Astrophysics Data System (ADS)

    Limousin, M.; Richard, J.; Jullo, E.; Jauzac, M.; Ebeling, H.; Bonamigo, M.; Alavi, A.; Clément, B.; Giocoli, C.; Kneib, J.-P.; Verdugo, T.; Natarajan, P.; Siana, B.; Atek, H.; Rexroth, M.

    2016-04-01

    We present a strong-lensing analysis of MACSJ0717.5+3745 (hereafter MACS J0717), based on the full depth of the Hubble Frontier Field (HFF) observations, which brings the number of multiply imaged systems to 61, ten of which have been spectroscopically confirmed. The total number of images comprised in these systems rises to 165, compared to 48 images in 16 systems before the HFF observations. Our analysis uses a parametric mass reconstruction technique, as implemented in the Lenstool software, and the subset of the 132 most secure multiple images to constrain a mass distribution composed of four large-scale mass components (spatially aligned with the four main light concentrations) and a multitude of galaxy-scale perturbers. We find a superposition of cored isothermal mass components to provide a good fit to the observational constraints, resulting in a very shallow mass distribution for the smooth (large-scale) component. Given the implications of such a flat mass profile, we investigate whether a model composed of "peaky" non-cored mass components can also reproduce the observational constraints. We find that such a non-cored mass model reproduces the observational constraints equally well, in the sense that both models give comparable total rms. Although the total (smooth dark matter component plus galaxy-scale perturbers) mass distributions of both models are consistent, as are the integrated two-dimensional mass profiles, we find that the smooth and the galaxy-scale components are very different. We conclude that, even in the HFF era, the generic degeneracy between smooth and galaxy-scale components is not broken, in particular in such a complex galaxy cluster. Consequently, insights into the mass distribution of MACS J0717 remain limited, emphasizing the need for additional probes beyond strong lensing. Our findings also have implications for estimates of the lensing magnification. We show that the amplification difference between the two models is larger than the error associated with either model, and that this additional systematic uncertainty is approximately the difference in magnification obtained by the different groups of modelers using pre-HFF data. This uncertainty decreases the area of the image plane where we can reliably study the high-redshift Universe by 50 to 70%.

  4. Quantification of Treatment Effect Modification on Both an Additive and Multiplicative Scale

    PubMed Central

    Girerd, Nicolas; Rabilloud, Muriel; Pibarot, Philippe; Mathieu, Patrick; Roy, Pascal

    2016-01-01

    Background In both observational and randomized studies, associations with overall survival are by and large assessed on a multiplicative scale using the Cox model. However, clinicians and clinical researchers have an ardent interest in assessing the absolute benefit associated with treatments. In older patients, some studies have reported a lower relative treatment effect, which might translate into a similar or even greater absolute treatment effect given their high baseline hazard for clinical events. Methods The effect of treatment and the effect modification of treatment were respectively assessed using a multiplicative and an additive hazard model in an analysis adjusted for propensity score in the context of coronary surgery. Results The multiplicative model yielded a lower relative hazard reduction with bilateral internal thoracic artery grafting in older patients (Hazard ratio for interaction/year = 1.03, 95%CI: 1.00 to 1.06, p = 0.05) whereas the additive model reported a similar absolute hazard reduction with increasing age (Delta for interaction/year = 0.10, 95%CI: -0.27 to 0.46, p = 0.61). The number needed to treat derived from the propensity score-adjusted multiplicative model was remarkably similar at the end of the follow-up in patients aged ≤ 60 and in patients aged > 70. Conclusions The present example demonstrates that a lower treatment effect in older patients on a relative scale can nevertheless translate into a similar treatment effect on an additive scale due to large baseline hazard differences. Importantly, absolute risk reduction, either crude or adjusted, can be calculated from multiplicative survival models. We advocate for a wider use of the absolute scale, especially using additive hazard models, to assess treatment effect and treatment effect modification. PMID:27045168
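
    The arithmetic behind this conclusion is easy to illustrate with invented numbers (not taken from the study): a constant absolute hazard reduction δ looks like a weaker relative effect wherever the baseline hazard h_0 is higher,

        \[
        \mathrm{HR} = \frac{h_0 - \delta}{h_0}, \qquad \delta = 0.005\,\mathrm{yr}^{-1}:
        \quad h_0 = 0.01\,\mathrm{yr}^{-1} \Rightarrow \mathrm{HR} = 0.5, \qquad
        h_0 = 0.05\,\mathrm{yr}^{-1} \Rightarrow \mathrm{HR} = 0.9 .
        \]

    so the same δ is a 50% relative reduction in low-risk patients but only a 10% reduction in high-risk patients, while the absolute benefit, and hence the number needed to treat, is unchanged.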

  5. Moving Forward: Strengthening Your State's Capacity to Bring Innovation to Scale. Policy Bulletin

    ERIC Educational Resources Information Center

    Clancy, M. Colleen

    2013-01-01

    Evidence-based innovations developed locally can have a powerful and broad impact on a state's student success agenda, but only when a system is in place for accelerating the diffusion of innovation across institutional lines. State leadership is key to bringing effective practices to scale. That is what is happening in North Carolina, Texas, and…

  6. Orthographic and Phonological Neighborhood Databases across Multiple Languages.

    PubMed

    Marian, Viorica

    2017-01-01

    The increased globalization of science and technology and the growing number of bilinguals and multilinguals in the world have made research with multiple languages a mainstay for scholars who study human function and especially those who focus on language, cognition, and the brain. Such research can benefit from large-scale databases and online resources that describe and measure lexical, phonological, orthographic, and semantic information. The present paper discusses currently available resources and underscores the need for tools that enable measurements both within and across multiple languages. A general review of language databases is followed by a targeted introduction to databases of orthographic and phonological neighborhoods. A specific focus on CLEARPOND illustrates how databases can be used to assess and compare neighborhood information across languages, to develop research materials, and to provide insight into broad questions about language. As an example of how using large-scale databases can answer questions about language, a closer look at neighborhood effects on lexical access reveals that not only orthographic but also phonological neighborhoods can influence visual lexical access, both within and across languages. We conclude that capitalizing upon large-scale linguistic databases can advance, refine, and accelerate scientific discoveries about the human linguistic capacity.
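
    As a concrete illustration of what such a database computes, the sketch below counts one classic orthographic neighborhood measure, Coltheart's N (same-length words differing by exactly one substituted letter), over a toy lexicon. CLEARPOND's own metrics also cover additions, deletions, and phonological neighbors, so this is only the simplest case:

        from collections import defaultdict

        lexicon = {"cat", "cot", "cut", "car", "bat", "can", "dog", "cog", "cap"}

        def neighbors(word, lexicon):
            # Index every word under each "one letter wildcarded" pattern.
            index = defaultdict(set)
            for w in lexicon:
                for i in range(len(w)):
                    index[(w[:i], w[i + 1:])].add(w)
            out = set()
            for i in range(len(word)):
                out |= index[(word[:i], word[i + 1:])]
            out.discard(word)
            return out

        print(sorted(neighbors("cat", lexicon)))
        # ['bat', 'can', 'cap', 'car', 'cot', 'cut']  -> Coltheart's N = 6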

  7. [Does the care for the fear of falling bring a profit to community living elderly people who had experienced falls?].

    PubMed

    Landrot, Marion De Rogalski; Perrot, Catherine; Blanc, Patricia; Beauchet, Olivier; Blanchon, Marie Ange; Gonthier, Régis

    2007-09-01

    Falls are common in old people and have multiple consequences, physical but also psychological, with a fear of falling that results in reduced activities of everyday life, loss of autonomy and entry into dependence. The aim of the study was to evaluate the benefit of taking the fear of falling into account in the care of old people who had experienced falls. Old people who had experienced falls and had a good cognitive status were followed in a day hospital during one year. An evaluation including a specific assessment of the role of the psychological factor, the photolangage, was performed before and after multidisciplinary rehabilitation. We used the ADL, IADL, SF-36 and SAFE rating scales, and verbal and analogical scales of the fear of falling. Fifteen patients were included (mean age 85 ± 5.7 years). The majority were women living alone, with a good nutritional status, a moderate renal insufficiency, and comorbidity involving polymedication. Scores on the ADL and IADL scales showed a consolidation of the patients' autonomy, with a slight but significant improvement of the IADL scores (p < 0.05). All scales assessing the fear of falling (visual analogical and verbal scales, SAFE) showed a statistically significant improvement (p < 0.001). The SF-36 scale, exploring the quality of life perceived by the patients, showed a great deterioration immediately after falling, and a statistically significant improvement on seven of the eight subscales after rehabilitation. The global physical score (GCV) was improved in a nonsignificant way, whereas the global psychic score (MCS) progressed in a statistically significant way (p < 0.001). This pilot study shows that multidisciplinary rehabilitation and appropriate management of the fear of falling bring a benefit in terms of quality of life and preservation of autonomy in old people living in the community who had experienced falls.

  8. 2012/13 abnormal cold winter in Japan associated with Large-scale Atmospheric Circulation and Local Sea Surface Temperature over the Sea of Japan

    NASA Astrophysics Data System (ADS)

    Ando, Y.; Ogi, M.; Tachibana, Y.

    2013-12-01

    In Japan, wintertime cold waves have social, economic, psychological and political impacts because of the shortage of nuclear power stations in the post-Fukushima era: the colder the winter, the more electricity is needed. Wintertime weather in Japan and its prediction have come under the world spotlight. The winter of 2012/13 in Japan was abnormally cold, and such cold winters have now persisted for 3 years. The wintertime climate of Japan is governed by a few dominant modes of the large-scale atmospheric circulation. Yasunaka and Hanawa (2008) demonstrated that two dominant modes, the Arctic Oscillation (AO) and the Western Pacific (WP) pattern, account for about 65% of the interannual variation of the wintertime mean surface air temperature of Japan. A negative AO brings about a cold winter in Japan, and so does a negative WP. Looking back at the winter of 2012/13, both a negative AO and a negative WP persisted from October through December. If the previous studies told the whole story, it should have been extremely cold from October through December. In December, in accordance with those studies, it was indeed colder than normal. Contrary to expectation, however, October and November were warmer than normal. This discrepancy signifies that an additional, hidden circumstance that heats Japan overwhelmed the large-scale atmospheric circulations that cool it. In this study, we therefore seek an additional cause of the wintertime climate of Japan, focusing particularly on 2012 as well as on the AO and WP. We found that anomalously warm ocean temperatures surrounding Japan overwhelmed the influences of the AO and WP. Unlike an inland climate, an island climate can be strongly influenced by the surrounding ocean temperature, suggesting that large-scale atmospheric patterns alone do not determine the climate of islands. [Figure: (a) time series of the 5-day running mean AO index (as defined by Ogi et al. 2004, who called it the SVNAM index), with the conventional AO index for reference; (b) the 5-day running mean WP index; (c) area-averaged surface air temperature anomalies in Japan; (d) air temperature anomalies; (e) heat flux anomalies; and (f) sea surface temperature anomalies. The boxed area on the Sea of Japan indicates the area over which (d)-(f) were calculated.]

  9. Rainbow: a tool for large-scale whole-genome sequencing data analysis using cloud computing.

    PubMed

    Zhao, Shanrong; Prenger, Kurt; Smith, Lance; Messina, Thomas; Fan, Hongtao; Jaeger, Edward; Stephens, Susan

    2013-06-27

    Technical improvements have decreased sequencing costs and, as a result, the size and number of genomic datasets have increased rapidly. Because of the lower cost, large amounts of sequence data are now being produced by small to midsize research groups. Crossbow is a software tool that can detect single nucleotide polymorphisms (SNPs) in whole-genome sequencing (WGS) data from a single subject; however, Crossbow has a number of limitations when applied to multiple subjects from large-scale WGS projects. The data storage and CPU resources that are required for large-scale whole genome sequencing data analyses are too large for many core facilities and individual laboratories to provide. To help meet these challenges, we have developed Rainbow, a cloud-based software package that can assist in the automation of large-scale WGS data analyses. Here, we evaluated the performance of Rainbow by analyzing 44 different whole-genome-sequenced subjects. Rainbow has the capacity to process genomic data from more than 500 subjects in two weeks using cloud computing provided by the Amazon Web Service. The time includes the import and export of the data using Amazon Import/Export service. The average cost of processing a single sample in the cloud was less than 120 US dollars. Compared with Crossbow, the main improvements incorporated into Rainbow include the ability: (1) to handle BAM as well as FASTQ input files; (2) to split large sequence files for better load balance downstream; (3) to log run metrics during data processing and to monitor multiple Amazon Elastic Compute Cloud (EC2) instances; and (4) to merge SOAPsnp outputs for multiple individuals into a single file to facilitate downstream genome-wide association studies. Rainbow is a scalable, cost-effective, and open-source tool for large-scale WGS data analysis. For human WGS data sequenced by either the Illumina HiSeq 2000 or HiSeq 2500 platforms, Rainbow can be used straight out of the box. Rainbow is available for third-party implementation and use, and can be downloaded from http://s3.amazonaws.com/jnj_rainbow/index.html.
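
    As an illustration of improvement (2), splitting sequence files for downstream load balance, the sketch below chunks a FASTQ file on record boundaries (four lines per read). This is a generic reconstruction, not Rainbow's actual code; the file name and chunk size are invented:

        import itertools

        def split_fastq(path, reads_per_chunk=500_000):
            """Write path.partNNNN files, each holding reads_per_chunk complete reads."""
            with open(path) as fh:
                chunk = 0
                while True:
                    lines = list(itertools.islice(fh, 4 * reads_per_chunk))
                    if not lines:
                        break
                    with open(f"{path}.part{chunk:04d}", "w") as out:
                        out.writelines(lines)
                    chunk += 1
            return chunk

        # e.g. split_fastq("sample_R1.fastq") before dispatching chunks to cloud workers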

  10. Rainbow: a tool for large-scale whole-genome sequencing data analysis using cloud computing

    PubMed Central

    2013-01-01

    Background Technical improvements have decreased sequencing costs and, as a result, the size and number of genomic datasets have increased rapidly. Because of the lower cost, large amounts of sequence data are now being produced by small to midsize research groups. Crossbow is a software tool that can detect single nucleotide polymorphisms (SNPs) in whole-genome sequencing (WGS) data from a single subject; however, Crossbow has a number of limitations when applied to multiple subjects from large-scale WGS projects. The data storage and CPU resources that are required for large-scale whole genome sequencing data analyses are too large for many core facilities and individual laboratories to provide. To help meet these challenges, we have developed Rainbow, a cloud-based software package that can assist in the automation of large-scale WGS data analyses. Results Here, we evaluated the performance of Rainbow by analyzing 44 different whole-genome-sequenced subjects. Rainbow has the capacity to process genomic data from more than 500 subjects in two weeks using cloud computing provided by the Amazon Web Service. The time includes the import and export of the data using Amazon Import/Export service. The average cost of processing a single sample in the cloud was less than 120 US dollars. Compared with Crossbow, the main improvements incorporated into Rainbow include the ability: (1) to handle BAM as well as FASTQ input files; (2) to split large sequence files for better load balance downstream; (3) to log run metrics during data processing and to monitor multiple Amazon Elastic Compute Cloud (EC2) instances; and (4) to merge SOAPsnp outputs for multiple individuals into a single file to facilitate downstream genome-wide association studies. Conclusions Rainbow is a scalable, cost-effective, and open-source tool for large-scale WGS data analysis. For human WGS data sequenced by either the Illumina HiSeq 2000 or HiSeq 2500 platforms, Rainbow can be used straight out of the box. Rainbow is available for third-party implementation and use, and can be downloaded from http://s3.amazonaws.com/jnj_rainbow/index.html. PMID:23802613

  11. Boussinesq approximation of the Cahn-Hilliard-Navier-Stokes equations.

    PubMed

    Vorobev, Anatoliy

    2010-11-01

    We use the Cahn-Hilliard approach to model the slow dissolution dynamics of binary mixtures. An important peculiarity of the Cahn-Hilliard-Navier-Stokes equations is the necessity to use the full continuity equation even for a binary mixture of two incompressible liquids due to dependence of mixture density on concentration. The quasicompressibility of the governing equations brings a short time-scale (quasiacoustic) process that may not affect the slow dynamics but may significantly complicate the numerical treatment. Using the multiple-scale method we separate the physical processes occurring on different time scales and, ultimately, derive the equations with the filtered-out quasiacoustics. The derived equations represent the Boussinesq approximation of the Cahn-Hilliard-Navier-Stokes equations. This approximation can be further employed as a universal theoretical model for an analysis of slow thermodynamic and hydrodynamic evolution of the multiphase systems with strongly evolving and diffusing interfacial boundaries, i.e., for the processes involving dissolution/nucleation, evaporation/condensation, solidification/melting, polymerization, etc.
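
    For orientation, a commonly used incompressible-limit form of the Cahn-Hilliard-Navier-Stokes system is shown below (standard notation, not necessarily the paper's exact formulation; the quasi-compressible variant studied here instead retains the full continuity equation because the density ρ depends on the concentration C):

        \[
        \begin{aligned}
        \partial_t C + \mathbf{u}\cdot\nabla C &= \nabla\cdot\left(M\,\nabla\mu\right), &
        \mu &= f_0'(C) - \epsilon^2 \nabla^2 C, \\
        \rho\left(\partial_t \mathbf{u} + \mathbf{u}\cdot\nabla\mathbf{u}\right)
          &= -\nabla p + \eta\,\nabla^2\mathbf{u} + \mu\,\nabla C, &
        \partial_t \rho + \nabla\cdot(\rho\,\mathbf{u}) &= 0 .
        \end{aligned}
        \]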

  12. Dilemma strength as a framework for advancing evolutionary game theory. Reply to comments on "Universal scaling for the dilemma strength in evolutionary games"

    NASA Astrophysics Data System (ADS)

    Wang, Zhen; Kokubo, Satoshi; Jusup, Marko; Tanimoto, Jun

    2015-09-01

    While comprehensive reviews of the literature, by gathering in one place most of the relevant information, undoubtedly steer the development of every scientific field, we found that the comments in response to a review article can be as informative as the review itself, if not more. Namely, reading through the comments on the ideas expressed in Ref. [1], we could identify a number of pressing problems for evolutionary game theory, indicating just how much space there still is for major advances and breakthroughs. In an attempt to bring a sense of order to a multitude of opinions, we roughly classified the comments into three categories, i.e. those concerned with: (i) the universality of scaling in heterogeneous topologies, including empirical dynamic networks [2-8], (ii) the universality of scaling for more general game setups, such as the inclusion of multiple strategies and external features [4,9-11], and (iii) experimental confirmations of the theoretical developments [2,12,13].

  13. Scaling Fiber Lasers to Large Mode Area: An Investigation of Passive Mode-Locking Using a Multi-Mode Fiber

    PubMed Central

    Ding, Edwin; Lefrancois, Simon; Kutz, Jose Nathan; Wise, Frank W.

    2011-01-01

    The mode-locking of dissipative soliton fiber lasers using large mode area fiber supporting multiple transverse modes is studied experimentally and theoretically. The averaged mode-locking dynamics in a multi-mode fiber are studied using a distributed model. The co-propagation of multiple transverse modes is governed by a system of coupled Ginzburg–Landau equations. Simulations show that stable and robust mode-locked pulses can be produced. However, the mode-locking can be destabilized by excessive higher-order mode content. Experiments using large core step-index fiber, photonic crystal fiber, and chirally-coupled core fiber show that mode-locking can be significantly disturbed in the presence of higher-order modes, resulting in lower maximum single-pulse energies. In practice, spatial mode content must be carefully controlled to achieve full pulse energy scaling. This paper demonstrates that mode-locking performance is very sensitive to the presence of multiple waveguide modes when compared to systems such as amplifiers and continuous-wave lasers. PMID:21731106
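
    The coupled Ginzburg-Landau description lends itself to a compact numerical sketch. The toy below integrates two linearly coupled one-dimensional cubic Ginzburg-Landau equations with a split-step Fourier scheme; all coefficients are invented, and the model is a stand-in for the paper's multi-mode laser equations, not a reproduction of them. The linear (gain plus dispersion) part is advanced exactly in Fourier space; the cubic term and the mode coupling are advanced with an explicit Euler step:

        import numpy as np

        # dA_k/dt = A_k + (1 + i b) d^2A_k/dx^2 - (1 + i c)|A_k|^2 A_k + kappa * A_other
        n, L, dt, steps = 256, 50.0, 0.01, 2000
        b, c, kappa = 0.5, -1.2, 0.05
        k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)
        lin = np.exp(dt * (1.0 - (1.0 + 1j * b) * k**2))   # exact linear propagator

        rng = np.random.default_rng(1)
        A1 = 0.1 * rng.standard_normal(n) + 0j             # two transverse-mode envelopes
        A2 = 0.1 * rng.standard_normal(n) + 0j

        for _ in range(steps):
            N1 = -(1.0 + 1j * c) * np.abs(A1) ** 2 * A1 + kappa * A2
            N2 = -(1.0 + 1j * c) * np.abs(A2) ** 2 * A2 + kappa * A1
            A1, A2 = A1 + dt * N1, A2 + dt * N2            # nonlinear + coupling step
            A1 = np.fft.ifft(lin * np.fft.fft(A1))         # linear step in Fourier space
            A2 = np.fft.ifft(lin * np.fft.fft(A2))

        print("mode energies:", np.mean(np.abs(A1) ** 2), np.mean(np.abs(A2) ** 2))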

  14. Scaling Fiber Lasers to Large Mode Area: An Investigation of Passive Mode-Locking Using a Multi-Mode Fiber.

    PubMed

    Ding, Edwin; Lefrancois, Simon; Kutz, Jose Nathan; Wise, Frank W

    2011-01-01

    The mode-locking of dissipative soliton fiber lasers using large mode area fiber supporting multiple transverse modes is studied experimentally and theoretically. The averaged mode-locking dynamics in a multi-mode fiber are studied using a distributed model. The co-propagation of multiple transverse modes is governed by a system of coupled Ginzburg-Landau equations. Simulations show that stable and robust mode-locked pulses can be produced. However, the mode-locking can be destabilized by excessive higher-order mode content. Experiments using large core step-index fiber, photonic crystal fiber, and chirally-coupled core fiber show that mode-locking can be significantly disturbed in the presence of higher-order modes, resulting in lower maximum single-pulse energies. In practice, spatial mode content must be carefully controlled to achieve full pulse energy scaling. This paper demonstrates that mode-locking performance is very sensitive to the presence of multiple waveguide modes when compared to systems such as amplifiers and continuous-wave lasers.

  15. Evaluation of nucleus segmentation in digital pathology images through large scale image synthesis

    NASA Astrophysics Data System (ADS)

    Zhou, Naiyun; Yu, Xiaxia; Zhao, Tianhao; Wen, Si; Wang, Fusheng; Zhu, Wei; Kurc, Tahsin; Tannenbaum, Allen; Saltz, Joel; Gao, Yi

    2017-03-01

    Digital histopathology images with more than 1 gigapixel are drawing more and more attention in the clinical, biomedical research, and computer vision fields. Among the multiple observable features spanning multiple scales in pathology images, nuclear morphology is one of the central criteria for diagnosis and grading. As a result it is also the most studied target in image computing. A large number of research papers have been devoted to the problem of extracting nuclei from digital pathology images, which is the foundation of any further correlation study. However, the validation and evaluation of nucleus extraction have not yet been formulated rigorously and systematically. Some studies report a human-verified segmentation with thousands of nuclei, whereas a single whole-slide image may contain up to a million. The main obstacle lies in the difficulty of obtaining such a large number of validated nuclei, which is essentially an impossible task for a pathologist. We propose a systematic validation and evaluation approach based on large-scale image synthesis. This could facilitate more quantitatively validated studies for the current and future histopathology image analysis field.
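
    The evaluation-by-synthesis idea can be made concrete in a few lines: draw a ground-truth mask whose nuclei are known exactly because they were generated, perturb it to play the role of a segmenter's output, and score the agreement. The sketch below is illustrative only (the paper's synthesis pipeline is far richer), using the Dice coefficient as the metric:

        import numpy as np

        rng = np.random.default_rng(7)
        H = W = 256
        yy, xx = np.mgrid[0:H, 0:W]
        truth = np.zeros((H, W), bool)
        for _ in range(40):                                  # 40 synthetic elliptical nuclei
            cy, cx = rng.integers(10, H - 10, 2)
            ry, rx = rng.integers(3, 8, 2)
            truth |= ((yy - cy) / ry) ** 2 + ((xx - cx) / rx) ** 2 <= 1.0

        pred = np.roll(truth, shift=(1, 1), axis=(0, 1))     # fake segmentation: 1-px offset

        dice = 2 * (truth & pred).sum() / (truth.sum() + pred.sum())
        print(f"Dice = {dice:.3f}")                          # known truth -> exact score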

  16. An NCME Instructional Module on Booklet Designs in Large-Scale Assessments of Student Achievement: Theory and Practice

    ERIC Educational Resources Information Center

    Frey, Andreas; Hartig, Johannes; Rupp, Andre A.

    2009-01-01

    In most large-scale assessments of student achievement, several broad content domains are tested. Because more items are needed to cover the content domains than can be presented in the limited testing time to each individual student, multiple test forms or booklets are utilized to distribute the items to the students. The construction of an…

  17. Classification Accuracy of Oral Reading Fluency and Maze in Predicting Performance on Large-Scale Reading Assessments

    ERIC Educational Resources Information Center

    Decker, Dawn M.; Hixson, Michael D.; Shaw, Amber; Johnson, Gloria

    2014-01-01

    The purpose of this study was to examine whether using a multiple-measure framework yielded better classification accuracy than oral reading fluency (ORF) or maze alone in predicting pass/fail rates for middle-school students on a large-scale reading assessment. Participants were 178 students in Grades 7 and 8 from a Midwestern school district.…

  18. Big Software for Big Data: Scaling Up Photometry for LSST (Abstract)

    NASA Astrophysics Data System (ADS)

    Rawls, M.

    2017-06-01

    (Abstract only) The Large Synoptic Survey Telescope (LSST) will capture mosaics of the sky every few nights, each containing more data than your computer's hard drive can store. As a result, the software to process these images is as critical to the science as the telescope and the camera. I discuss the algorithms and software being developed by the LSST Data Management team to handle such a large volume of data. All of our work is open source and available to the community. Once LSST comes online, our software will produce catalogs of objects and a stream of alerts. These will bring exciting new opportunities for follow-up observations and collaborations with LSST scientists.

  19. A successful trap design for capturing large terrestrial snakes

    Treesearch

    Shirley J. Burgdorf; D. Craig Rudolph; Richard N. Conner; Daniel Saenz; Richard R. Schaefer

    2005-01-01

    Large scale trapping protocols for snakes can be expensive and require large investments of personnel and time. Typical methods, such as pitfall and small funnel traps, are not useful or suitable for capturing large snakes. A method was needed to survey multiple blocks of habitat for the Louisiana Pine Snake (Pituophis ruthveni), throughout its...

  20. First ERTS-1 results in southeastern France: Geology, sedimentology, pollution at sea

    NASA Technical Reports Server (NTRS)

    Fontanel, A.; Guillemot, J.; Guy, M.

    1973-01-01

    Results obtained by four ERTS projects in southeastern France are summarized. With regard to geology, ERTS photos of the Western Alps are very useful for tectonic interpretation because large features are clearly visible on these photographs even though they are often hidden by small complicated structures when studied on large-scale documents. The 18-day repetition coverage was not obtained, and time-varying sedimentological surveys were therefore impossible. Nevertheless, it was possible to delineate the variations of the shorelines in the Rhone Delta over a period covering the last 8,000 years. Some instances of industries discharging pollutant products at sea were detected, as well as very large anomalies of unknown origin. Some examples of coherent optical processing have been carried out in order to bring out tectonic features in the Alps.

  1. High performance semantic factoring of giga-scale semantic graph databases.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    al-Saffar, Sinan; Adolf, Bob; Haglin, David

    2010-10-01

    As semantic graph database technology grows to address components ranging from extant large triple stores to SPARQL endpoints over SQL-structured relational databases, it will become increasingly important to be able to bring high performance computational resources to bear on their analysis, interpretation, and visualization, especially with respect to their innate semantic structure. Our research group built a novel high performance hybrid system comprising computational capability for semantic graph database processing utilizing the large multithreaded architecture of the Cray XMT platform, conventional clusters, and large data stores. In this paper we describe that architecture and present the results of deploying it for the analysis of the Billion Triple dataset with respect to its semantic factors, including basic properties, connected components, namespace interaction, and typed paths.
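
    One analysis named above, connected components over a triple store, reduces to union-find on the subject and object columns. A toy sketch of the principle (the actual work ran on the massively multithreaded Cray XMT against billions of triples):

        def find(parent, x):
            while parent.setdefault(x, x) != x:
                parent[x] = parent[parent[x]]      # path halving
                x = parent[x]
            return x

        def components(triples):
            parent = {}
            for s, _p, o in triples:               # predicates ignored for connectivity
                rs, ro = find(parent, s), find(parent, o)
                if rs != ro:
                    parent[rs] = ro
            groups = {}
            for node in parent:
                groups.setdefault(find(parent, node), set()).add(node)
            return list(groups.values())

        triples = [("a", "knows", "b"), ("b", "cites", "c"), ("x", "knows", "y")]
        print(components(triples))                 # [{'a', 'b', 'c'}, {'x', 'y'}]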

  2. Population-wide mortality in multiple forest types in western North America: onset, extent, and severity of impacts as indicators of climatic influence

    Treesearch

    J. D. Shaw; J. N. Long; M. T. Thompson; R. J. DeRose

    2010-01-01

    A complex of drought, insects, and disease is causing widespread mortality in multiple forest types across western North America. These forest types range from dry Pinus-Juniperus woodlands to moist, montane Picea-Abies forests. Although large-scale mortality events are known from the past and considered part of natural cycles, recent events have largely been...

  3. Multimode resource-constrained multiple project scheduling problem under fuzzy random environment and its application to a large scale hydropower construction project.

    PubMed

    Xu, Jiuping; Feng, Cuiying

    2014-01-01

    This paper presents an extension of the multimode resource-constrained project scheduling problem for a large scale construction project where multiple parallel projects and a fuzzy random environment are considered. By taking into account the most typical goals in project management, a cost/weighted makespan/quality trade-off optimization model is constructed. To deal with the uncertainties, a hybrid crisp approach is used to transform the fuzzy random parameters into fuzzy variables that are subsequently defuzzified using an expected value operator with an optimistic-pessimistic index. Then a combinatorial-priority-based hybrid particle swarm optimization algorithm is developed to solve the proposed model, where the combinatorial particle swarm optimization and priority-based particle swarm optimization are designed to assign modes to activities and to schedule activities, respectively. Finally, the results and analysis of a practical example at a large scale hydropower construction project are presented to demonstrate the practicality and efficiency of the proposed model and optimization method.
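
    The defuzzification step can be illustrated on a triangular fuzzy number (r1, r2, r3). One common form of the optimistic-pessimistic expected value operator, assumed here (the paper's exact operator may differ in detail), weights the lower and upper halves of the support by an index lambda, with lambda = 0.5 recovering the credibilistic expected value (r1 + 2*r2 + r3)/4:

        def expected_value(r1, r2, r3, lam=0.5):
            """Lambda-weighted expected value of a triangular fuzzy number."""
            assert r1 <= r2 <= r3 and 0.0 <= lam <= 1.0
            return 0.5 * ((1 - lam) * (r1 + r2) + lam * (r2 + r3))

        # A fuzzy activity duration "about 10 days, between 8 and 14":
        print(expected_value(8, 10, 14, lam=0.5))   # 10.5, the neutral estimate
        print(expected_value(8, 10, 14, lam=0.8))   # 11.4, weighting the upper tail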

  4. Methods, caveats and the future of large-scale microelectrode recordings in the non-human primate

    PubMed Central

    Dotson, Nicholas M.; Goodell, Baldwin; Salazar, Rodrigo F.; Hoffman, Steven J.; Gray, Charles M.

    2015-01-01

    Cognitive processes play out on massive brain-wide networks, which produce widely distributed patterns of activity. Capturing these activity patterns requires tools that are able to simultaneously measure activity from many distributed sites with high spatiotemporal resolution. Unfortunately, current techniques with adequate coverage do not provide the requisite spatiotemporal resolution. Large-scale microelectrode recording devices, with dozens to hundreds of microelectrodes capable of simultaneously recording from nearly as many cortical and subcortical areas, provide a potential way to minimize these tradeoffs. However, placing hundreds of microelectrodes into a behaving animal is a highly risky and technically challenging endeavor that has only been pursued by a few groups. Recording activity from multiple electrodes simultaneously also introduces several statistical and conceptual dilemmas, such as the multiple comparisons problem and the uncontrolled stimulus response problem. In this perspective article, we discuss some of the techniques that we, and others, have developed for collecting and analyzing large-scale data sets, and address the future of this emerging field. PMID:26578906
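
    For the multiple comparisons problem mentioned above, one standard remedy (generic, not specific to this paper) is false-discovery-rate control across the many channels or channel pairs tested. A minimal Benjamini-Hochberg procedure:

        import numpy as np

        def benjamini_hochberg(pvals, alpha=0.05):
            """Return a boolean mask of hypotheses rejected at FDR level alpha."""
            p = np.asarray(pvals, float)
            order = np.argsort(p)
            m = len(p)
            passed = p[order] <= alpha * np.arange(1, m + 1) / m
            k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
            reject = np.zeros(m, bool)
            reject[order[:k]] = True
            return reject

        pvals = [0.001, 0.008, 0.039, 0.041, 0.20, 0.74]     # e.g. per-channel tests
        print(benjamini_hochberg(pvals))                     # only the two smallest survive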

  5. Design Sketches For Optical Crossbar Switches Intended For Large-Scale Parallel Processing Applications

    NASA Astrophysics Data System (ADS)

    Hartmann, Alfred; Redfield, Steve

    1989-04-01

    This paper discusses the design of large-scale (1000×1000) optical crossbar switching networks for use in parallel processing supercomputers. Alternative design sketches for an optical crossbar switching network are presented using free-space optical transmission with either a beam spreading/masking model or a beam steering model for internodal communications. The performance of alternative multiple-access channel communications protocols (unslotted and slotted ALOHA and carrier sense multiple access, CSMA) is compared with the performance of the classic arbitrated-bus crossbar of conventional electronic parallel computing. These comparisons indicate an almost inverse relationship between ease of implementation and speed of operation. Practical issues of optical system design are addressed, and an optically addressed, composite spatial light modulator design is presented for fabrication at arbitrarily large scale. The wide range of switch architecture, communications protocol, optical systems design, device fabrication, and system performance problems presented by these design sketches poses a serious challenge to practical exploitation of highly parallel optical interconnects in advanced computer designs.
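
    The textbook throughputs behind the ALOHA comparison are worth recalling (standard results for offered load G and normalized throughput S; the paper's figures concern the optical switch specifically):

        \[
        S_{\text{pure ALOHA}} = G\,e^{-2G} \quad \left(\max \approx 0.184 \text{ at } G = \tfrac{1}{2}\right),
        \qquad
        S_{\text{slotted ALOHA}} = G\,e^{-G} \quad \left(\max \approx 0.368 \text{ at } G = 1\right).
        \]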

  6. Multimode Resource-Constrained Multiple Project Scheduling Problem under Fuzzy Random Environment and Its Application to a Large Scale Hydropower Construction Project

    PubMed Central

    Xu, Jiuping

    2014-01-01

    This paper presents an extension of the multimode resource-constrained project scheduling problem for a large scale construction project where multiple parallel projects and a fuzzy random environment are considered. By taking into account the most typical goals in project management, a cost/weighted makespan/quality trade-off optimization model is constructed. To deal with the uncertainties, a hybrid crisp approach is used to transform the fuzzy random parameters into fuzzy variables that are subsequently defuzzified using an expected value operator with an optimistic-pessimistic index. Then a combinatorial-priority-based hybrid particle swarm optimization algorithm is developed to solve the proposed model, where the combinatorial particle swarm optimization and priority-based particle swarm optimization are designed to assign modes to activities and to schedule activities, respectively. Finally, the results and analysis of a practical example at a large scale hydropower construction project are presented to demonstrate the practicality and efficiency of the proposed model and optimization method. PMID:24550708

  7. Large-scale dynamics associated with clustering of extratropical cyclones affecting Western Europe

    NASA Astrophysics Data System (ADS)

    Pinto, Joaquim G.; Gómara, Iñigo; Masato, Giacomo; Dacre, Helen F.; Woollings, Tim; Caballero, Rodrigo

    2015-04-01

    Some recent winters in Western Europe have been characterized by the occurrence of multiple extratropical cyclones following a similar path. The occurrence of such cyclone clusters leads to large socio-economic impacts due to damaging winds, storm surges, and floods. Recent studies have statistically characterized the clustering of extratropical cyclones over the North Atlantic and Europe and hypothesized potential physical mechanisms responsible for their formation. Here we analyze 4 months characterized by multiple cyclones over Western Europe (February 1990, January 1993, December 1999, and January 2007). The evolution of the eddy driven jet stream, Rossby wave-breaking, and upstream/downstream cyclone development are investigated to infer the role of the large-scale flow and to determine if clustered cyclones are related to each other. Results suggest that optimal conditions for the occurrence of cyclone clusters are provided by a recurrent extension of an intensified eddy driven jet toward Western Europe lasting at least 1 week. Multiple Rossby wave-breaking occurrences on both the poleward and equatorward flanks of the jet contribute to the development of these anomalous large-scale conditions. The analysis of the daily weather charts reveals that upstream cyclone development (secondary cyclogenesis, where new cyclones are generated on the trailing fronts of mature cyclones) is strongly related to cyclone clustering, with multiple cyclones developing on a single jet streak. The present analysis permits a deeper understanding of the physical reasons leading to the occurrence of cyclone families over the North Atlantic, enabling a better estimation of the associated cumulative risk over Europe.

  8. Effects of multiple-scale driving on turbulence statistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Hyunju; Cho, Jungyeon, E-mail: hyunju527@gmail.com, E-mail: jcho@cnu.ac.kr

    2014-01-01

    Turbulence is ubiquitous in astrophysical fluids such as the interstellar medium and the intracluster medium. In turbulence studies, it is customary to assume that fluid is driven on a single scale. However, in astrophysical fluids, there can be many different driving mechanisms that act on different scales. If there are multiple energy-injection scales, the process of energy cascade and turbulence dynamo will be different compared with the case of a single energy-injection scale. In this work, we perform three-dimensional incompressible/compressible magnetohydrodynamic turbulence simulations. We drive turbulence in Fourier space in two wavenumber ranges, 2 ≤ k ≤ √12 (large scale) and 15 ≲ k ≲ 26 (small scale). We inject a different amount of energy in each range by changing the amplitude of the forcing in that range. We present the time evolution of the kinetic and magnetic energy densities and discuss the turbulence dynamo in the presence of energy injection at two scales. We show how kinetic, magnetic, and density spectra are affected by the two-scale energy injection and we discuss the observational implications. In the case ε_L < ε_S, where ε_L and ε_S are the energy-injection rates at the large and small scales, respectively, our results show that even a tiny amount of large-scale energy injection can significantly change the properties of the turbulence. On the other hand, when ε_L ≳ ε_S, the small-scale driving does not influence the turbulence statistics much unless ε_L ∼ ε_S.

  9. Satellite-Scale Snow Water Equivalent Assimilation into a High-Resolution Land Surface Model

    NASA Technical Reports Server (NTRS)

    De Lannoy, Gabrielle J.M.; Reichle, Rolf H.; Houser, Paul R.; Arsenault, Kristi R.; Verhoest, Niko E.C.; Paulwels, Valentijn R.N.

    2009-01-01

    An ensemble Kalman filter (EnKF) is used in a suite of synthetic experiments to assimilate coarse-scale (25 km) snow water equivalent (SWE) observations (typical of satellite retrievals) into fine-scale (1 km) model simulations. Coarse-scale observations are assimilated directly using an observation operator for mapping between the coarse and fine scales or, alternatively, after disaggregation (re-gridding) to the fine-scale model resolution prior to data assimilation. In either case, observations are assimilated either simultaneously or independently for each location. Results indicate that assimilating disaggregated fine-scale observations independently (method 1D-F1) is less efficient than assimilating a collection of neighboring disaggregated observations (method 3D-Fm). Direct assimilation of coarse-scale observations is superior to a priori disaggregation. Independent assimilation of individual coarse-scale observations (method 3D-C1) can bring the overall mean analyzed field close to the truth, but does not necessarily improve estimates of the fine-scale structure. There is a clear benefit to simultaneously assimilating multiple coarse-scale observations (method 3D-Cm) even when the entire domain is observed, indicating that underlying spatial error correlations can be exploited to improve SWE estimates. Method 3D-Cm avoids artificial transitions at the coarse observation pixel boundaries and can reduce the RMSE by 60% when compared to the open loop in this study.
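
    All four methods share the generic EnKF analysis step, written here in standard notation (the paper's implementation details differ by method). Each fine-scale ensemble member x_i^f is updated with coarse observations y_i, where H maps (aggregates) the 1-km state to the 25-km observation scale:

        \[
        \mathbf{x}_i^{a} = \mathbf{x}_i^{f}
          + \mathbf{K}\left(\mathbf{y}_i - \mathbf{H}\,\mathbf{x}_i^{f}\right),
        \qquad
        \mathbf{K} = \mathbf{P}^{f}\mathbf{H}^{\mathsf{T}}
          \left(\mathbf{H}\,\mathbf{P}^{f}\mathbf{H}^{\mathsf{T}} + \mathbf{R}\right)^{-1},
        \]

    with P^f the ensemble sample covariance and R the observation-error covariance. The methods differ in whether H aggregates fine-scale states directly or the observations are first disaggregated, and in whether K is built from a single observation or from many neighboring ones.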

  10. Raccoon spatial requirements and multi-scale habitat selection within an intensively managed central Appalachian forest

    USGS Publications Warehouse

    Owen, Sheldon F.; Berl, Jacob L.; Edwards, John W.; Ford, W. Mark; Wood, Petra Bohall

    2015-01-01

    We studied a raccoon (Procyon lotor) population within a managed central Appalachian hardwood forest in West Virginia to investigate the effects of intensive forest management on raccoon spatial requirements and habitat selection. Raccoon home-range (95% utilization distribution) and core-area (50% utilization distribution) size differed between sexes with males maintaining larger (2×) home ranges and core areas than females. Home-range and core-area size did not differ between seasons for either sex. We used compositional analysis to quantify raccoon selection of six different habitat types at multiple spatial scales. Raccoons selected riparian corridors (riparian management zones [RMZ]) and intact forests (> 70 y old) at the core-area spatial scale. RMZs likely were used by raccoons because they provided abundant denning resources (i.e., large-diameter trees) as well as access to water. Habitat composition associated with raccoon foraging locations indicated selection for intact forests, riparian areas, and regenerating harvest (stands <10 y old). Although raccoons were able to utilize multiple habitat types for foraging resources, a selection of intact forest and RMZs at multiple spatial scales indicates the need of mature forest (with large-diameter trees) for this species in managed forests in the central Appalachians.

  11. Automatic Focus Adjustment of a Microscope

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terrance

    2005-01-01

    AUTOFOCUS is a computer program for use in a control system that automatically adjusts the position of an instrument arm carrying a microscope equipped with an electronic camera. In the original intended application of AUTOFOCUS, the imaging microscope would be carried by an exploratory robotic vehicle on a remote planet, but AUTOFOCUS could also be adapted to similar applications on Earth. Initially, control software other than AUTOFOCUS brings the microscope to a position above a target to be imaged. The instrument arm is then moved to lower the microscope toward the target: nominally, the target is approached from a starting distance of 3 cm in 10 steps of 3 mm each. After each step, the image in the camera is subjected to a wavelet transform, which is used to evaluate the texture in the image at multiple scales and thereby determine whether and by how much the microscope is approaching focus. A focus measure is derived from the transform and used to guide the arm to bring the microscope to the focal height. When the analysis reveals that the microscope is in focus, image data are recorded and transmitted.
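
    The focus measure can be sketched with an off-the-shelf wavelet transform: detail-band energy across scales rises as texture sharpens. The fragment below illustrates the idea and is not the flight code; the Gaussian blur ladder stands in for the 10-step approach of the arm:

        import numpy as np
        import pywt
        from scipy.ndimage import gaussian_filter

        def focus_measure(img, levels=3):
            """Sum of squared detail coefficients over a multi-level 2-D wavelet transform."""
            coeffs = pywt.wavedec2(img.astype(float), "db2", level=levels)
            return sum(float(np.sum(d * d)) for band in coeffs[1:] for d in band)

        rng = np.random.default_rng(0)
        scene = rng.standard_normal((64, 64))                   # stand-in textured target
        frames = [gaussian_filter(scene, s) for s in np.linspace(3.0, 0.1, 10)]
        best = max(range(len(frames)), key=lambda i: focus_measure(frames[i]))
        print("sharpest frame:", best)                          # 9, the least blurred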

  12. Convergence Science in a Nano World

    PubMed Central

    Cady, Nathaniel

    2013-01-01

    Convergence is a new paradigm that brings together critical advances in the life sciences, physical sciences and engineering. Going beyond traditional “interdisciplinary” studies, “convergence” describes the culmination of truly integrated research and development, yielding revolutionary advances in both scientific research and new technologies. At its core, nanotechnology embodies these elements of convergence science by bringing together multiple disciplines with the goal of creating innovative and groundbreaking technologies. In the biological and biomedical sciences, nanotechnology research has resulted in dramatic improvements in sensors, diagnostics, imaging, and even therapeutics. In particular, there is a current push to examine the interface between the biological world and micro/nano-scale systems. For example, my laboratory is developing novel strategies for spatial patterning of biomolecules, electrical and optical biosensing, nanomaterial delivery systems, cellular patterning techniques, and the study of cellular interactions with nano-structured surfaces. In this seminar, I will give examples of how convergent research is being applied to three major areas of biological research: cancer diagnostics, microbiology, and DNA-based biosensing. These topics will be presented as case studies, showing the benefits (and challenges) of multi-disciplinary, convergent research and development.

  13. Large-scale retrieval for medical image analytics: A comprehensive review.

    PubMed

    Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting

    2018-01-01

    Over the past decades, medical image analytics has been greatly facilitated by the explosion of digital imaging techniques, in which huge amounts of medical images were produced with ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of tackling the huge amounts of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at a large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to major processes in the pipeline, including feature representation, feature indexing, searching, etc. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, with a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. A HIERARCHIAL STOCHASTIC MODEL OF LARGE SCALE ATMOSPHERIC CIRCULATION PATTERNS AND MULTIPLE STATION DAILY PRECIPITATION

    EPA Science Inventory

    A stochastic model of weather states and concurrent daily precipitation at multiple precipitation stations is described. Four algorithms are investigated for classification of daily weather states: k-means, fuzzy clustering, principal components, and principal components coupled with ...
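
    A toy version of the k-means variant of the weather-state step (sizes and features invented): cluster daily large-scale circulation fields into a few discrete states, which would then condition the precipitation model at the stations.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)
        days, gridpoints = 3650, 120                   # ten years of daily pressure fields
        fields = rng.standard_normal((days, gridpoints))

        states = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(fields)
        print("days per weather state:", np.bincount(states))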

  15. Selecting habitat to survive: the impact of road density on survival in a large carnivore.

    PubMed

    Basille, Mathieu; Van Moorter, Bram; Herfindal, Ivar; Martin, Jodie; Linnell, John D C; Odden, John; Andersen, Reidar; Gaillard, Jean-Michel

    2013-01-01

    Habitat selection studies generally assume that animals select habitat and food resources at multiple scales to maximise their fitness. However, animals sometimes prefer habitats of apparently low quality, especially when considering the costs associated with spatially heterogeneous human disturbance. We used spatial variation in human disturbance, and its consequences for lynx survival, a direct fitness component, to test the Hierarchical Habitat Selection hypothesis in a population of Eurasian lynx Lynx lynx in southern Norway. Data from 46 lynx monitored with telemetry indicated that a high proportion of forest strongly reduced the risk of mortality from legal hunting at the home range scale, while increasing road density strongly increased that risk at the finer scale within the home range. We found hierarchical effects of the impact of human disturbance, with a higher road density at the large scale reinforcing its negative impact at the fine scale. Conversely, we demonstrated that lynx shifted their habitat selection to avoid areas with the highest road densities within their home ranges, thus supporting a compensatory mechanism at the fine scale that enables lynx to mitigate the impact of large-scale disturbance. Human impact, positively associated with high road accessibility, was thus a stronger driver of lynx space use at the finer scale, with home range characteristics nevertheless constraining habitat selection. Our study demonstrates the truly hierarchical nature of habitat selection, which aims at maximising fitness by selecting against limiting factors at multiple spatial scales, and indicates that scale-specific heterogeneity of the environment drives individual spatial behaviour by means of trade-offs across spatial scales.

  16. On the Interactions Between Planetary and Mesoscale Dynamics in the Oceans

    NASA Astrophysics Data System (ADS)

    Grooms, I.; Julien, K. A.; Fox-Kemper, B.

    2011-12-01

    Multiple-scales asymptotic methods are used to investigate the interaction of planetary and mesoscale dynamics in the oceans. We find three regimes. In the first, the slow, large-scale planetary flow sets up a baroclinically unstable background which leads to vigorous mesoscale eddy generation, but the eddy dynamics do not affect the planetary dynamics. In the second, the planetary flow feels the effects of the eddies, but appears to be unable to generate them. The first two regimes rely on horizontally isotropic large-scale dynamics. In the third regime, large-scale anisotropy, as exists for example in the Antarctic Circumpolar Current and in western boundary currents, allows the large-scale dynamics to both generate and respond to mesoscale eddies. We also discuss how the investigation may be brought to bear on the problem of parameterization of unresolved mesoscale dynamics in ocean general circulation models.

  17. Spatial, temporal, and hybrid decompositions for large-scale vehicle routing with time windows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bent, Russell W

    This paper studies the use of decomposition techniques to quickly find high-quality solutions to large-scale vehicle routing problems with time windows. It considers an adaptive decomposition scheme which iteratively decouples a routing problem based on the current solution. Earlier work considered vehicle-based decompositions that partition the vehicles across the subproblems. The subproblems can then be optimized independently and merged easily. This paper argues that vehicle-based decompositions, although very effective on various problem classes, also have limitations. In particular, they do not accommodate temporal decompositions and may produce spatial decompositions that are not focused enough. This paper then proposes customer-based decompositions, which generalize vehicle-based decouplings and allow for focused spatial and temporal decompositions. Experimental results on class R2 of the extended Solomon benchmarks demonstrate the benefits of the customer-based adaptive decomposition scheme and its spatial, temporal, and hybrid instantiations. In particular, they show that customer-based decompositions bring significant benefits over large neighborhood search in contrast to vehicle-based decompositions.

  18. Quantitative Serum Nuclear Magnetic Resonance Metabolomics in Large-Scale Epidemiology: A Primer on -Omic Technologies

    PubMed Central

    Kangas, Antti J; Soininen, Pasi; Lawlor, Debbie A; Davey Smith, George; Ala-Korpela, Mika

    2017-01-01

    Detailed metabolic profiling in large-scale epidemiologic studies has uncovered novel biomarkers for cardiometabolic diseases and clarified the molecular associations of established risk factors. A quantitative metabolomics platform based on nuclear magnetic resonance spectroscopy has found widespread use, already profiling over 400,000 blood samples. Over 200 metabolic measures are quantified per sample; in addition to many biomarkers routinely used in epidemiology, the method simultaneously provides fine-grained lipoprotein subclass profiling and quantification of circulating fatty acids, amino acids, gluconeogenesis-related metabolites, and many other molecules from multiple metabolic pathways. Here we focus on applications of magnetic resonance metabolomics for quantifying circulating biomarkers in large-scale epidemiology. We highlight the molecular characterization of risk factors, use of Mendelian randomization, and the key issues of study design and analyses of metabolic profiling for epidemiology. We also detail how integration of metabolic profiling data with genetics can enhance drug development. We discuss why quantitative metabolic profiling is becoming widespread in epidemiology and biobanking. Although large-scale applications of metabolic profiling are still novel, it seems likely that comprehensive biomarker data will contribute to etiologic understanding of various diseases and abilities to predict disease risks, with the potential to translate into multiple clinical settings. PMID:29106475

  19. Large-scale road safety programmes in low- and middle-income countries: an opportunity to generate evidence.

    PubMed

    Hyder, Adnan A; Allen, Katharine A; Peters, David H; Chandran, Aruna; Bishai, David

    2013-01-01

    The growing burden of road traffic injuries, which kill over 1.2 million people yearly, falls mostly on low- and middle-income countries (LMICs). Despite this, evidence generation on the effectiveness of road safety interventions in LMIC settings remains scarce. This paper explores a scientific approach for evaluating road safety programmes in LMICs and introduces such a road safety multi-country initiative, the Road Safety in 10 Countries Project (RS-10). By building on existing evaluation frameworks, we develop a scientific approach for evaluating large-scale road safety programmes in LMIC settings. This draws on '13 lessons' of large-scale programme evaluation: defining the evaluation scope; selecting study sites; maintaining objectivity; developing an impact model; utilising multiple data sources; using multiple analytic techniques; maximising external validity; ensuring an appropriate time frame; the importance of flexibility and a stepwise approach; continuous monitoring; providing feedback to implementers and policy-makers; promoting the uptake of evaluation results; and understanding evaluation costs. The use of relatively new approaches for the evaluation of real-world programmes allows for the production of relevant knowledge. The RS-10 project affords an important opportunity to scientifically test these approaches on a real-world, large-scale road safety evaluation and to generate new knowledge for the field of road safety.

  20. Bioinspired Wood Nanotechnology for Functional Materials.

    PubMed

    Berglund, Lars A; Burgert, Ingo

    2018-05-01

    It is a challenging task to realize the vision of hierarchically structured nanomaterials for large-scale applications. Herein, the biomaterial wood as a large-scale biotemplate for functionalization at multiple scales is discussed, to provide an increased property range to this renewable and CO2-storing bioresource, which is available at low cost and in large quantities. The Progress Report reviews the emerging field of functional wood materials in view of the specific features of the structural template and novel nanotechnological approaches for the development of wood-polymer composites and wood-mineral hybrids for advanced property profiles and new functions. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Bringing to Life Transformative Ideas: A Blueprint for Trustees

    ERIC Educational Resources Information Center

    Whitman, Janet

    2012-01-01

    To bring major initiatives to fruition, trustees, administrators, faculty members, and donors must all be effectively engaged. By broadening a project's impact, the concerns of board members and other constituencies may be addressed and resolved to the satisfaction of all. An institution must provide sufficient time and multiple opportunities for…

  2. Validity and reliability of a pilot scale for assessment of multiple system atrophy symptoms.

    PubMed

    Matsushima, Masaaki; Yabe, Ichiro; Takahashi, Ikuko; Hirotani, Makoto; Kano, Takahiro; Horiuchi, Kazuhiro; Houzen, Hideki; Sasaki, Hidenao

    2017-01-01

    Multiple system atrophy (MSA) is a rare, progressive neurodegenerative disorder for which a brief yet sensitive scale is required for use in clinical trials and general screening. We previously compared several scales for the assessment of MSA symptoms and devised an eight-item pilot scale with a large standardized response mean (handwriting, finger taps, transfers, standing with feet together, turning trunk, turning 360°, gait, body sway). The aim of the present study was to investigate the validity and reliability of this simple pilot scale for the assessment of multiple system atrophy symptoms. Thirty-two patients with MSA (15 male/17 female; 20 cerebellar subtype [MSA-C]/12 parkinsonian subtype [MSA-P]) were prospectively registered between January 1, 2014 and February 28, 2015. Patients were evaluated by two independent raters using the Unified MSA Rating Scale (UMSARS), the Scale for the Assessment and Rating of Ataxia (SARA), and the pilot scale. Correlations among UMSARS, SARA, and pilot scale scores, as well as intraclass correlation coefficients (ICCs) and Cronbach's alpha coefficients, were calculated. Pilot scale scores correlated significantly with scores for UMSARS Parts I, II, and IV as well as with SARA scores. Intra-rater and inter-rater ICCs and Cronbach's alpha coefficients remained high (>0.94) for all measures. These results indicate the validity and reliability of the eight-item pilot scale, particularly for the assessment of symptoms in patients with early-stage multiple system atrophy.
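
    For readers unfamiliar with the reliability statistics quoted above, a minimal Python sketch of Cronbach's alpha on synthetic data shaped like this study (32 patients, 8 items) follows; the data and the resulting value are illustrative, not the study's.

        import numpy as np

        def cronbach_alpha(scores):
            """Cronbach's alpha; `scores` is a subjects-by-items matrix."""
            scores = np.asarray(scores, dtype=float)
            k = scores.shape[1]
            item_var = scores.var(axis=0, ddof=1).sum()   # sum of per-item variances
            total_var = scores.sum(axis=1).var(ddof=1)    # variance of total scores
            return k / (k - 1) * (1.0 - item_var / total_var)

        rng = np.random.default_rng(0)
        latent = rng.normal(size=(32, 1))                 # one latent severity per patient
        items = latent + 0.5 * rng.normal(size=(32, 8))   # eight correlated items
        print(f"alpha = {cronbach_alpha(items):.2f}")     # high for well-correlated items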

  3. Propagating stress-pulses and wiggling transition revealed in string dynamics

    NASA Astrophysics Data System (ADS)

    Yao, Zhenwei

    2018-02-01

    Understanding string dynamics yields insights into the intricate dynamic behaviors of various filamentary thin structures in nature and industry covering multiple length scales. In this work, we investigate the planar dynamics of a flexible string where one end is free and the other end is subject to transverse and longitudinal motions. Under transverse harmonic motion, we reveal the propagating pulse structure in the stress profile over the string, and analyze its role in bringing the system into a chaotic state. For a string where one end is under longitudinal uniform acceleration, we identify the wiggling transition, derive the analytical wiggling solution from the string equations, and present the phase diagram.
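
    The 'string equations' referred to are presumably the standard equations of an inextensible flexible string (an assumption; the paper's exact formulation may differ), in which the planar position \mathbf{r}(s,t), parameterized by arclength s, obeys

        \rho\,\partial_t^2 \mathbf{r} = \partial_s\!\left( T\,\partial_s \mathbf{r} \right),
        \qquad |\partial_s \mathbf{r}| = 1,

    with the tension T(s,t) determined implicitly by the inextensibility constraint, T = 0 at the free end, and the prescribed transverse or longitudinal motion at the driven end; the propagating stress pulses then appear as travelling features in T(s,t).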

  4. Controlling Guessing Bias in the Dichotomous Rasch Model Applied to a Large-Scale, Vertically Scaled Testing Program

    ERIC Educational Resources Information Center

    Andrich, David; Marais, Ida; Humphry, Stephen Mark

    2016-01-01

    Recent research has shown how the statistical bias in Rasch model difficulty estimates induced by guessing in multiple-choice items can be eliminated. Using vertical scaling of a high-profile national reading test, it is shown that the dominant effect of removing such bias is a nonlinear change in the unit of scale across the continuum. The…

  5. Megatux

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-09-25

    The Megatux platform enables the emulation of large scale (multi-million node) distributed systems. In particular, it allows for the emulation of large-scale networks interconnecting a very large number of emulated computer systems. It does this by leveraging virtualization and associated technologies to allow hundreds of virtual computers to be hosted on a single moderately sized server or workstation. Virtualization technology provided by modern processors allows for multiple guest OSs to run at the same time, sharing the hardware resources. The Megatux platform can be deployed on a single PC, a small cluster of a few boxes or a large cluster of computers. With a modest cluster, the Megatux platform can emulate complex organizational networks. By using virtualization, we emulate the hardware, but run actual software enabling large scale without sacrificing fidelity.

  6. Dynamical tuning for MPC using population games: A water supply network application.

    PubMed

    Barreiro-Gomez, Julian; Ocampo-Martinez, Carlos; Quijano, Nicanor

    2017-07-01

    Model predictive control (MPC) is a suitable strategy for the control of large-scale systems that have multiple design requirements, e.g., multiple physical and operational constraints. Moreover, an MPC controller is able to deal with multiple control objectives by considering them within the cost function, which requires determining a proper prioritization for each of the objectives. Furthermore, when the system has time-varying parameters and/or disturbances, the appropriate prioritization might vary over time as well. This situation leads to the need for a dynamical tuning methodology. This paper addresses the dynamical tuning issue by using evolutionary game theory. The advantages of the proposed method are highlighted and tested on a large-scale water supply network with periodic time-varying disturbances. Finally, results are compared against a multi-objective MPC controller that uses static tuning. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
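
    A minimal formalization of the tuning problem (notation assumed here, not taken from the paper): the multi-objective MPC cost is a weighted sum whose prioritization weights live on a simplex, and dynamical tuning lets those weights evolve, e.g., under replicator-type population dynamics:

        \min_{\mathbf{u}} \; \sum_{i=1}^{m} w_i(t)\, J_i(\mathbf{x}, \mathbf{u}),
        \qquad w_i(t) \ge 0, \quad \sum_{i=1}^{m} w_i(t) = 1,
        \qquad
        \dot{w}_i = w_i \big( f_i(\mathbf{w}) - \bar{f}(\mathbf{w}) \big),

    where f_i is a fitness assigned to objective i (e.g., its current degree of constraint violation) and \bar{f} is the population-average fitness; static tuning is the special case of constant w_i.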

  7. Symmetry associated with symmetry break: Revisiting ants and humans escaping from multiple-exit rooms

    NASA Astrophysics Data System (ADS)

    Ji, Q.; Xin, C.; Tang, S. X.; Huang, J. P.

    2018-02-01

    Crowd panic has incurred massive injuries and deaths throughout the world, and thus understanding it is particularly important. It is now common knowledge that crowd panic induces a "symmetry break" in which some exits are jammed while others are underutilized. Amazingly, here we show, by experiment, simulation and theory, that a class of symmetry patterns appears for ants and humans escaping from multiple-exit rooms even while the symmetry break exists. Our symmetry pattern is described by the fact that the ratio between the ensemble-averaged numbers of ants or humans escaping from different exits is equal to the ratio between the widths of the exits. The mechanism lies in the heterogeneous preferences of agents with limited information achieving a Nash equilibrium. This work offers new insights into how to improve public safety, because large public areas are always equipped with multiple exits, and it also brings an ensemble-averaging method for seeking symmetry associated with symmetry breaking.
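
    Stated compactly, the reported symmetry pattern is

        \frac{\langle N_i \rangle}{\langle N_j \rangle} = \frac{w_i}{w_j}
        \qquad \text{for all exits } i, j,

    where \langle N_i \rangle is the ensemble-averaged number of escapers using exit i and w_i is that exit's width; individual realizations can still show jammed and underused exits (the symmetry break), and the pattern emerges only on ensemble averaging.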

  8. Implementation of mechanism of action biology-driven early drug development for children with cancer.

    PubMed

    Pearson, Andrew D J; Herold, Ralf; Rousseau, Raphaël; Copland, Chris; Bradley-Garelik, Brigid; Binner, Debbie; Capdeville, Renaud; Caron, Hubert; Carleer, Jacqueline; Chesler, Louis; Geoerger, Birgit; Kearns, Pamela; Marshall, Lynley V; Pfister, Stefan M; Schleiermacher, Gudrun; Skolnik, Jeffrey; Spadoni, Cesare; Sterba, Jaroslav; van den Berg, Hendrick; Uttenreuther-Fischer, Martina; Witt, Olaf; Norga, Koen; Vassal, Gilles

    2016-07-01

    An urgent need remains for new paediatric oncology drugs to cure children who die from cancer and to reduce drug-related sequelae in survivors. In 2007, the European Paediatric Regulation came into law, requiring industry to create paediatric drug (all types of medicinal products) development programmes alongside those for adults. Unfortunately, paediatric drug development is still largely centred on adult conditions rather than on a mechanism of action (MoA)-based model, even though this would be more logical for childhood tumours, as these have far fewer non-synonymous coding mutations than adult malignancies. Recent large-scale sequencing by the International Genome Consortium and the Paediatric Cancer Genome Project has further shown that the genetic and epigenetic repertoire of driver mutations in specific childhood malignancies differs from more common adult-type malignancies. To bring about much-needed change, a Paediatric Platform, ACCELERATE, was proposed in 2013 by the Cancer Drug Development Forum, Innovative Therapies for Children with Cancer, the European Network for Cancer Research in Children and Adolescents and the European Society for Paediatric Oncology. The Platform, comprising multiple stakeholders in paediatric oncology, has three working groups, one with responsibility for promoting and developing high-quality MoA-informed paediatric drug development programmes, including specific measures for adolescents. Key is the establishment of a freely accessible aggregated database of paediatric biological tumour drug targets, to be aligned with an aggregated pipeline of drugs. This will enable the prioritisation and conduct of early phase clinical paediatric trials to evaluate these drugs against promising therapeutic targets and to generate clinical paediatric efficacy and safety data in an accelerated time frame. Through this work, the Platform seeks to ensure that potentially effective drugs, where the MoA is known and thought to be relevant to paediatric malignancies, are evaluated in early phase clinical trials, and that this approach to generating pre-clinical and clinical data is systematically pursued by academia, sponsors, industry, and regulatory bodies to bring new paediatric oncology drugs to front-line therapy more rapidly. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Measuring the impact of multiple sclerosis on psychosocial functioning: the development of a new self-efficacy scale.

    PubMed

    Airlie, J; Baker, G A; Smith, S J; Young, C A

    2001-06-01

    To develop a scale to measure self-efficacy in neurologically impaired patients with multiple sclerosis and to assess the scale's psychometric properties. Cross-sectional questionnaire study in a clinical setting, with the retest questionnaire returned by mail after completion at home. Regional multiple sclerosis (MS) outpatient clinic or the Clinical Trials Unit (CTU) at a large neuroscience centre in the UK. One hundred persons with MS attending the Walton Centre for Neurology and Neurosurgery and Clatterbridge Hospital, Wirral, as outpatients. Cognitively impaired patients were excluded at an initial clinic assessment. Patients were asked to provide demographic data and complete the self-efficacy scale along with the following validated scales: Hospital Anxiety and Depression Scale, Rosenberg Self-Esteem Scale, Impact, Stigma and Mastery Scales, and the Rankin Scale. The Rankin Scale and Barthel Index were also assessed by the physician. A new 11-item self-efficacy scale was constructed, consisting of two domains, control and personal agency. The internal consistency of the scale was confirmed using Cronbach's alpha analysis (alpha = 0.81). The test-retest reliability of the scale over two weeks was acceptable, with an intraclass correlation coefficient of 0.79. Construct validity was investigated using Pearson's product moment correlation coefficient, resulting in significant correlations with depression (r = -0.52), anxiety (r = -0.50) and mastery (r = 0.73). Multiple regression analysis demonstrated that these factors accounted for 70% of the variance in scores on the self-efficacy scale, with scores on mastery, anxiety and perceived disability being independently significant. Assessment of the psychometric properties of this new self-efficacy scale suggests that it possesses good validity and reliability in patients with multiple sclerosis.

  10. Modal Testing of the NPSAT1 Engineering Development Unit

    DTIC Science & Technology

    2012-07-01

    [Garbled DTIC excerpt. The German passage is the thesis declaration, stating that the Master's thesis was prepared independently using only the cited sources and aids. The recoverable abstract fragments note that natural frequencies are indicated by large values (peaks) of the first complex mode indicator function (CMIF) plotted on a logarithmic scale, that multiple modes can be detected from the structure's behavior, and that Ewins even states "that no large-scale modal test should be permitted to proceed until some preliminary SDOF analyses have" been performed.]

  11. Large-eddy simulation using the finite element method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCallen, R.C.; Gresho, P.M.; Leone, J.M. Jr.

    1993-10-01

    In a large-eddy simulation (LES) of turbulent flows, the large-scale motion is calculated explicitly, while the small-scale motion is modeled (i.e., approximated with semi-empirical relations). Typically, finite difference or spectral numerical schemes are used to generate an LES; the use of finite element methods (FEM) has been far less prominent. In this study, we demonstrate that FEM in combination with LES provides a viable tool for the study of turbulent, separating channel flows, specifically the flow over a two-dimensional backward-facing step. The combination of these methodologies brings together the advantages of each: LES provides a high degree of accuracy with a minimum of empiricism for turbulence modeling, and FEM provides a robust way to simulate flow in very complex domains of practical interest. Such a combination should prove very valuable to the engineering community.
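
    For context, and in standard LES notation rather than anything specific to this study: filtering the incompressible Navier-Stokes equations yields resolved-scale equations with an unclosed subgrid stress \tau_{ij}, commonly closed with a semi-empirical eddy-viscosity model such as Smagorinsky's,

        \partial_t \bar{u}_i + \partial_j(\bar{u}_i \bar{u}_j)
          = -\frac{1}{\rho}\,\partial_i \bar{p} + \nu \nabla^2 \bar{u}_i - \partial_j \tau_{ij},
        \qquad
        \tau_{ij} - \tfrac{1}{3}\tau_{kk}\delta_{ij}
          \approx -2\,(C_s \Delta)^2 |\bar{S}|\,\bar{S}_{ij},

    where \bar{S}_{ij} = \tfrac{1}{2}(\partial_j \bar{u}_i + \partial_i \bar{u}_j) is the resolved strain rate, \Delta the filter width, and C_s an empirical constant; the FEM supplies the spatial discretization of these filtered equations on geometrically complex domains.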

  12. A Review of Control Strategy of the Large-scale of Electric Vehicles Charging and Discharging Behavior

    NASA Astrophysics Data System (ADS)

    Kong, Lingyu; Han, Jiming; Xiong, Wenting; Wang, Hao; Shen, Yaqi; Li, Ying

    2017-05-01

    Large-scale access of electric vehicles to the grid will bring huge challenges to its safe operation, so it is important to control the charging and discharging of electric vehicles. First, in terms of power quality and network losses, this paper points out the influence of electric vehicle charging behaviour on the grid. It then surveys and summarizes control strategies for electric vehicle charging and discharging, divided into direct and indirect control: direct control strategies manage charging behaviour by controlling the charging and discharging power of the vehicles, whereas indirect control strategies operate by controlling the price of charging and discharging. Finally, for the convenience of the reader, the paper outlines a complete research method for studying such control strategies, taking into consideration their adaptability and possible failure modes, and puts forward suggestions on key areas for future research.

  13. Challenges in large scale quantum mechanical calculations: Challenges in large scale quantum mechanical calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ratcliff, Laura E.; Mohr, Stephan; Huhs, Georg

    2016-11-07

    During the past decades, quantum mechanical methods have undergone an amazing transition from pioneering investigations of experts into a wide range of practical applications, made by a vast community of researchers. First principles calculations of systems containing up to a few hundred atoms have become a standard in many branches of science. The sizes of the systems which can be simulated have increased even further during recent years, and quantum-mechanical calculations of systems up to many thousands of atoms are nowadays possible. This opens up new appealing possibilities, in particular for interdisciplinary work, bridging together communities of different needs and sensibilities. In this review we will present the current status of this topic, and will also give an outlook on the vast multitude of applications, challenges and opportunities stimulated by electronic structure calculations, making this field an important working tool and bringing together researchers of many different domains.

  14. Falcon: Visual analysis of large, irregularly sampled, and multivariate time series data in additive manufacturing

    DOE PAGES

    Steed, Chad A.; Halsey, William; Dehoff, Ryan; ...

    2017-02-16

    Flexible visual analysis of long, high-resolution, and irregularly sampled time series data from multiple sensor streams is a challenge in several domains. In the field of additive manufacturing, this capability is critical for realizing the full potential of large-scale 3D printers. Here, we propose a visual analytics approach that helps additive manufacturing researchers acquire a deep understanding of patterns in log and imagery data collected by 3D printers. Our specific goals include discovering patterns related to defects and system performance issues, optimizing build configurations to avoid defects, and increasing production efficiency. We introduce Falcon, a new visual analytics system that allows users to interactively explore large, time-oriented data sets from multiple linked perspectives. Falcon provides overviews, detailed views, and unique segmented time series visualizations, all with adjustable scale options. To illustrate the effectiveness of Falcon at providing thorough and efficient knowledge discovery, we present a practical case study involving experts in additive manufacturing and data from a large-scale 3D printer. The techniques described are applicable to the analysis of any quantitative time series, though the focus of this paper is on additive manufacturing.

  15. Falcon: Visual analysis of large, irregularly sampled, and multivariate time series data in additive manufacturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A.; Halsey, William; Dehoff, Ryan

    Flexible visual analysis of long, high-resolution, and irregularly sampled time series data from multiple sensor streams is a challenge in several domains. In the field of additive manufacturing, this capability is critical for realizing the full potential of large-scale 3D printers. Here, we propose a visual analytics approach that helps additive manufacturing researchers acquire a deep understanding of patterns in log and imagery data collected by 3D printers. Our specific goals include discovering patterns related to defects and system performance issues, optimizing build configurations to avoid defects, and increasing production efficiency. We introduce Falcon, a new visual analytics system that allows users to interactively explore large, time-oriented data sets from multiple linked perspectives. Falcon provides overviews, detailed views, and unique segmented time series visualizations, all with adjustable scale options. To illustrate the effectiveness of Falcon at providing thorough and efficient knowledge discovery, we present a practical case study involving experts in additive manufacturing and data from a large-scale 3D printer. The techniques described are applicable to the analysis of any quantitative time series, though the focus of this paper is on additive manufacturing.

  16. The SCALE-UP Project

    NASA Astrophysics Data System (ADS)

    Beichner, Robert

    2015-03-01

    The Student Centered Active Learning Environment with Upside-down Pedagogies (SCALE-UP) project was developed nearly 20 years ago as an economical way to provide collaborative, interactive instruction even for large enrollment classes. Nearly all research-based pedagogies have been designed with fairly high faculty-student ratios. The economics of introductory courses at large universities often precludes that situation, so SCALE-UP was created as a way to facilitate highly collaborative active learning with large numbers of students served by only a few faculty and assistants. It enables those students to learn and succeed not only in acquiring content, but also to practice important 21st century skills like problem solving, communication, and teamsmanship. The approach was initially targeted at undergraduate science and engineering students taking introductory physics courses in large enrollment sections. It has since expanded to multiple content areas, including chemistry, math, engineering, biology, business, nursing, and even the humanities. Class sizes range from 24 to over 600. Data collected from multiple sites around the world indicates highly successful implementation at more than 250 institutions. NSF support was critical for initial development and dissemination efforts. Generously supported by NSF (9752313, 9981107) and FIPSE (P116B971905, P116B000659).

  17. A critical view on the eco-friendliness of small hydroelectric installations.

    PubMed

    Premalatha, M; Tabassum-Abbasi; Abbasi, Tasneem; Abbasi, S A

    2014-05-15

    Renewable energy sources are widely perceived as 'clean', 'green', and 'inexhaustible'. In recent years the spectre of global warming and ocean acidification, attributed primarily to fossil fuel burning, has brought renewable energy to the forefront of most climate change mitigation strategies. There is strong advocacy for large-scale substitution of conventional energy sources with renewables on the premise that such a move would substantially reduce environmental degradation and global warming. These sentiments are being echoed by scientists and policy makers as well as environmental activists all over the world. 'Small hydro', which generally denotes hydroelectric power projects with capacities of 25 MW or lower, is one renewable energy option believed to be clean and sustainable, even as its bigger version, large hydro, is known to cause several strongly adverse environmental impacts. This paper brings out that the prevailing perception of the 'eco-friendliness' of small hydro is mainly due to the fact that it has been used only to a very small extent so far; once it is deployed at a scale comparable to fossil fuel use, the resulting impacts would be substantially adverse. The purpose is not to denigrate small hydro, still less to advocate the use of fossil fuels. It is, rather, to bring home the point that a much more realistic and elaborate assessment of the likely direct and indirect impacts of extensive utilization of this energy source than has been done hitherto is necessary. Copyright © 2013 Elsevier B.V. All rights reserved.

  18. Recent surface cooling in the Yellow and East China Seas and the associated North Pacific climate regime shift

    NASA Astrophysics Data System (ADS)

    Kim, Yong Sun; Jang, Chan Joo; Yeh, Sang-Wook

    2018-03-01

    The Yellow and East China Seas (YECS) are widely believed to have experienced robust, basin-scale warming over the last few decades. However, the warming reached a peak in the late 1990s and was followed by a significant cooling trend. In this study, we investigated the characteristics of this low-frequency sea surface temperature (SST) variance and its dynamic relationship with large-scale climate variability through cyclostationary orthogonal function analysis for the 1982-2014 period. Both the surface winds regressed on the primary mode of YECS SST and the trends in air-sea heat fluxes demonstrate that the intensification of northerly winds in winter contributes largely to the recent cooling trend by increasing heat loss to the atmosphere. As a localized oceanic response to these winds, the upwind flow appears to bring warm waters and partially counteracts the basin-scale cooling, thereby weakening the cooling trend along the central trough of the Yellow Sea. In the context of large-scale climate variability, the strong relationship between YECS SST variability and the Pacific Decadal Oscillation (PDO) weakened considerably during the recent cooling period after the late 1990s, as the PDO signals appeared to be confined within the eastern basin of the North Pacific in association with the regime shift. In addition to this decoupling of the YECS SST from the PDO, the intensifying Siberian High pressure system likely caused the enhanced northerly winds, leading to the recent cooling trend. These findings highlight the relative roles of the PDO and the Siberian High in shaping YECS SST variance through changes in the large-scale atmospheric circulation and the attendant oceanic advection.

  19. Large-scale assessment of benthic communities across multiple marine protected areas using an autonomous underwater vehicle.

    PubMed

    Ferrari, Renata; Marzinelli, Ezequiel M; Ayroza, Camila Rezende; Jordan, Alan; Figueira, Will F; Byrne, Maria; Malcolm, Hamish A; Williams, Stefan B; Steinberg, Peter D

    2018-01-01

    Marine protected areas (MPAs) are designed to reduce threats to biodiversity and ecosystem functioning from anthropogenic activities. Assessment of MPA effectiveness requires synchronous sampling of protected and non-protected areas at multiple spatial and temporal scales. We used an autonomous underwater vehicle to map benthic communities in replicate 'no-take' and 'general-use' (fishing allowed) zones within three MPAs along 7° of latitude. We recorded 92 taxa and 38 morpho-groups across the three large MPAs. We found that important habitat-forming biota (e.g. massive sponges) were more prevalent and abundant in no-take zones, while short ephemeral algae were more abundant in general-use zones, suggesting potential short-term effects of zoning (5-10 years). Yet, short-term effects of zoning were not detected at the community level (community structure or composition), while community structure varied significantly among MPAs. We conclude that by allowing rapid, simultaneous assessments at multiple spatial scales, autonomous underwater vehicles are useful for documenting changes in marine communities and identifying adequate scales at which to manage them. This study advanced knowledge of marine benthic communities and their conservation in three ways. First, we quantified benthic biodiversity and abundance, generating the first baseline of these benthic communities against which the effectiveness of three large MPAs can be assessed. Second, we identified the taxonomic resolution necessary to assess both short- and long-term effects of MPAs, concluding that coarse taxonomic resolution is sufficient given that analyses of community structure at different taxonomic levels were generally consistent. Yet, the observed differences were taxa-specific and may not have been evident using our broader taxonomic classifications; a classification of mid to high taxonomic resolution may be necessary to determine zoning effects on key taxa. Third, we provide an example of statistical analyses and a sampling design that, once temporal sampling is incorporated, will be useful for detecting changes in marine benthic communities across multiple spatial and temporal scales.

  20. Large-scale assessment of benthic communities across multiple marine protected areas using an autonomous underwater vehicle

    PubMed Central

    Ayroza, Camila Rezende; Jordan, Alan; Figueira, Will F.; Byrne, Maria; Malcolm, Hamish A.; Williams, Stefan B.; Steinberg, Peter D.

    2018-01-01

    Marine protected areas (MPAs) are designed to reduce threats to biodiversity and ecosystem functioning from anthropogenic activities. Assessment of MPA effectiveness requires synchronous sampling of protected and non-protected areas at multiple spatial and temporal scales. We used an autonomous underwater vehicle to map benthic communities in replicate ‘no-take’ and ‘general-use’ (fishing allowed) zones within three MPAs along 7° of latitude. We recorded 92 taxa and 38 morpho-groups across the three large MPAs. We found that important habitat-forming biota (e.g. massive sponges) were more prevalent and abundant in no-take zones, while short ephemeral algae were more abundant in general-use zones, suggesting potential short-term effects of zoning (5–10 years). Yet, short-term effects of zoning were not detected at the community level (community structure or composition), while community structure varied significantly among MPAs. We conclude that by allowing rapid, simultaneous assessments at multiple spatial scales, autonomous underwater vehicles are useful for documenting changes in marine communities and identifying adequate scales at which to manage them. This study advanced knowledge of marine benthic communities and their conservation in three ways. First, we quantified benthic biodiversity and abundance, generating the first baseline of these benthic communities against which the effectiveness of three large MPAs can be assessed. Second, we identified the taxonomic resolution necessary to assess both short- and long-term effects of MPAs, concluding that coarse taxonomic resolution is sufficient given that analyses of community structure at different taxonomic levels were generally consistent. Yet, the observed differences were taxa-specific and may not have been evident using our broader taxonomic classifications; a classification of mid to high taxonomic resolution may be necessary to determine zoning effects on key taxa. Third, we provide an example of statistical analyses and a sampling design that, once temporal sampling is incorporated, will be useful for detecting changes in marine benthic communities across multiple spatial and temporal scales. PMID:29547656

  1. Scale interaction and arrangement in a turbulent boundary layer perturbed by a wall-mounted cylindrical element

    NASA Astrophysics Data System (ADS)

    Tang, Zhanqi; Jiang, Nan

    2018-05-01

    This study reports modifications of scale interaction and arrangement in a turbulent boundary layer perturbed by a wall-mounted circular cylinder. Hot-wire measurements were taken at multiple streamwise and wall-normal locations downstream of the cylindrical element. The streamwise fluctuating signals were decomposed into large-, small-, and dissipative-scale signatures by corresponding cutoff filters. The scale interaction under the cylindrical perturbation was elaborated by comparing the small- and dissipative-scale amplitude/frequency modulation effects downstream of the cylinder element with the results observed in the unperturbed case. We found that the large-scale fluctuations exert a stronger amplitude modulation on both the small and dissipative scales in the near-wall region. At wall-normal positions around the cylinder height, the small-scale amplitude modulation coefficients are redistributed by the cylinder wake. A similar observation was noted for small-scale frequency modulation; the dissipative-scale frequency modulation, however, seems to be independent of the cylindrical perturbation. The phase relationships indicate that the cylindrical perturbation shortens the time shifts between the small- and dissipative-scale variations (amplitude and frequency) and the large-scale fluctuations. The dependence of these phase relationships on the integral time scale is also discussed. Furthermore, the discrepancy between the small- and dissipative-scale time shifts relative to the large-scale motions is examined, indicating that the small-scale amplitude/frequency leads the dissipative scales.
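
    The decomposition-and-modulation analysis described above can be illustrated with a short Python sketch on a synthetic signal (the sampling rate, cutoff frequency, and modulation depth are assumptions for illustration, not the experiment's values): the signal is split at a cutoff, and the amplitude-modulation coefficient is the correlation between the large-scale component and the low-passed envelope of the small-scale component.

        import numpy as np
        from scipy.signal import butter, filtfilt, hilbert

        fs, f_cut = 10_000.0, 50.0                # sampling rate and cutoff (assumed)
        t = np.arange(0, 5, 1 / fs)
        large = np.sin(2 * np.pi * 5 * t)         # synthetic large-scale motion
        small = (1 + 0.5 * large) * np.sin(2 * np.pi * 500 * t)  # AM small scales
        u = large + 0.1 * small                   # composite "hot-wire" signal

        b, a = butter(4, f_cut / (fs / 2), "low")
        u_L = filtfilt(b, a, u)                     # large-scale component
        u_S = u - u_L                               # small-scale component
        env = filtfilt(b, a, np.abs(hilbert(u_S)))  # low-passed small-scale envelope

        R = np.corrcoef(u_L, env)[0, 1]           # amplitude-modulation coefficient
        print(f"AM coefficient R = {R:.2f}")      # near 1 for this synthetic signal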

  2. Integral criteria for large-scale multiple fingerprint solutions

    NASA Astrophysics Data System (ADS)

    Ushmaev, Oleg S.; Novikov, Sergey O.

    2004-08-01

    We propose the definition and analysis of an optimal integral similarity score criterion for large-scale multimodal civil ID systems. First, the general properties of score distributions for genuine and impostor matches for different systems and input devices are investigated. The empirical statistics were taken from real biometric tests. We then carry out the analysis of simultaneous score distributions for a number of combined biometric tests, primarily for multiple-fingerprint solutions. Explicit and approximate relations for the optimal integral score, which provides the least value of the FRR while the FAR is predefined, have been obtained. The results of a real multiple-fingerprint test show good correspondence with the theoretical results over a wide range of False Acceptance and False Rejection Rates.
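
    The criterion analyzed above, the least FRR at a predefined FAR, is straightforward to evaluate empirically; the following Python sketch does so for synthetic Gaussian score distributions (an illustrative assumption; real integral-score distributions are empirical):

        import numpy as np

        rng = np.random.default_rng(1)
        impostor = rng.normal(0.0, 1.0, 100_000)  # synthetic impostor scores
        genuine = rng.normal(4.0, 1.2, 100_000)   # synthetic genuine scores

        def frr_at_far(genuine, impostor, far=1e-3):
            # Threshold set so the false-acceptance rate equals `far`:
            thr = np.quantile(impostor, 1.0 - far)
            # Fraction of genuine comparisons rejected at that threshold:
            return float((genuine < thr).mean())

        print(f"FRR at FAR = 1e-3: {frr_at_far(genuine, impostor):.4f}")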

  3. A Vision for the Future of Environmental Research: Creating Environmental Intelligence Centers

    NASA Astrophysics Data System (ADS)

    Barron, E. J.

    2002-12-01

    The nature of the environmental issues facing our nation demands a capability that allows us to enhance economic vitality, maintain environmental quality, and limit threats to life and property through more fundamental understanding of the Earth. It is "advanced" knowledge of how the system may respond that gives environmental information most of its power and utility. This fact is evident in the demand for new forecasting products, involving air quality, energy demand, water quality and quantity, ultraviolet radiation, and human health indexes. As we demonstrate feasibility and benefit, society is likely to demand a growing number of new operational forecast products on prediction time scales of days to decades into the future. The driving forces that govern our environment are widely recognized, involving primarily weather and climate, patterns of land use and land cover, and resource use with its associated waste products. The importance of these driving forces has been demonstrated by a decade of research on greenhouse gas emissions, ozone depletion and deforestation, and through the birth of Earth System Science. But, there are also major challenges. We find the strongest intersection between human activity, environmental stresses, system interactions and human decision-making in regional analysis coupled to larger spatial scales. In addition, most regions are influenced by multiple-stresses. Multiple, cumulative, and interactive stresses are clearly the most difficult to understand and hence the most difficult to assess and to manage. Currently, we are incapable of addressing these issues in a truly integrated fashion at global scales. The lack of an ability to combine global and regional forcing and to assess the response of the system to multiple stresses at the spatial and temporal scales of interest to humans limits our ability to assess the impacts of specific human perturbations, to assess advantages and risks, and to enhance economic and societal well being in the context of global, national and regional stewardship. These societal needs lead to a vision that uses a regional framework as a stepping-stone to a comprehensive national or global capability. The development of a comprehensive regional framework depends on a new approach to environmental research - the creation of regional Environmental Intelligence Centers. A key objective is to bring a demanding level of discipline to "forecasting" in a broad arena of environmental issues. The regional vision described above is designed to address a broad range of current and future environmental issues by creating a capability based on integrating diverse observing systems, making data readily accessible, developing an increasingly comprehensive predictive capability at the spatial and temporal scales appropriate for examining societal issues, and creating a vigorous intersection with decision-makers. With demonstrated success over a few large-scale regions of the U.S., this strategy will very likely grow into a national capability that far exceeds current capabilities.

  4. SANDO syndrome in a cohort of 107 patients with CPEO and mitochondrial DNA deletions.

    PubMed

    Hanisch, Frank; Kornhuber, Malte; Alston, Charlotte L; Taylor, Robert W; Deschauer, Marcus; Zierz, Stephan

    2015-06-01

    The sensory ataxic neuropathy with dysarthria and ophthalmoparesis (SANDO) syndrome is a subgroup of mitochondrial chronic progressive external ophthalmoplegia (CPEO)-plus disorders associated with multiple mitochondrial DNA (mtDNA) deletions. There has been no systematic survey of SANDO in patients with CPEO with either single or multiple large-scale mtDNA deletions. In this retrospective analysis, we characterised the frequency and the genetic and clinical phenotypes of 107 index patients with mitochondrial CPEO (n=66 patients with single and n=41 patients with multiple mtDNA deletions) and assessed these for clinical evidence of a SANDO phenotype. Patients with multiple mtDNA deletions were additionally screened for mutations in the nuclear-encoded POLG, SLC25A4, PEO1 and RRM2B genes. The clinical, histological and genetic data of 11 patients with SANDO were further analysed. None of the 66 patients with single, large-scale mtDNA deletions fulfilled the clinical criteria for SANDO syndrome. In contrast, 9 of 41 patients (22%) with multiple mtDNA deletions and two additional family members fulfilled the clinical criteria for SANDO. Within this subgroup, multiple mtDNA deletions were associated with the following nuclear mutations: POLG (n=6), PEO1 (n=2), unidentified (n=2). The combination of sensory ataxic neuropathy with ophthalmoparesis (SANO) was observed in 70% of patients with multiple mtDNA deletions but in only 4% of those with single deletions. The combination of CPEO and sensory ataxic neuropathy (SANO, incomplete SANDO) was found in 43% of patients with multiple mtDNA deletions but not in patients with single deletions. The SANDO syndrome seems to indicate a cluster of symptoms within the wide range of multisystemic symptoms associated with mitochondrial CPEO. SANO seems to be the most frequent phenotype associated with multiple mtDNA deletions in our cohort, but is not, or only rarely, associated with single, large-scale mtDNA deletions. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  5. Managing aquatic ecosystems and water resources under multiple stress--an introduction to the MARS project.

    PubMed

    Hering, Daniel; Carvalho, Laurence; Argillier, Christine; Beklioglu, Meryem; Borja, Angel; Cardoso, Ana Cristina; Duel, Harm; Ferreira, Teresa; Globevnik, Lidija; Hanganu, Jenica; Hellsten, Seppo; Jeppesen, Erik; Kodeš, Vit; Solheim, Anne Lyche; Nõges, Tiina; Ormerod, Steve; Panagopoulos, Yiannis; Schmutz, Stefan; Venohr, Markus; Birk, Sebastian

    2015-01-15

    Water resources globally are affected by a complex mixture of stressors resulting from a range of drivers, including urban and agricultural land use, hydropower generation and climate change. Understanding how stressors interfere and impact upon ecological status and ecosystem services is essential for developing effective River Basin Management Plans and shaping future environmental policy. This paper details the nature of these problems for Europe's water resources and the need to find solutions at a range of spatial scales. In terms of the latter, we describe the aims and approaches of the EU-funded project MARS (Managing Aquatic ecosystems and water Resources under multiple Stress) and the conceptual and analytical framework that it is adopting to provide this knowledge, understanding and tools needed to address multiple stressors. MARS is operating at three scales: At the water body scale, the mechanistic understanding of stressor interactions and their impact upon water resources, ecological status and ecosystem services will be examined through multi-factorial experiments and the analysis of long time-series. At the river basin scale, modelling and empirical approaches will be adopted to characterise relationships between multiple stressors and ecological responses, functions, services and water resources. The effects of future land use and mitigation scenarios in 16 European river basins will be assessed. At the European scale, large-scale spatial analysis will be carried out to identify the relationships amongst stress intensity, ecological status and service provision, with a special focus on large transboundary rivers, lakes and fish. The project will support managers and policy makers in the practical implementation of the Water Framework Directive (WFD), of related legislation and of the Blueprint to Safeguard Europe's Water Resources by advising the 3rd River Basin Management Planning cycle, the revision of the WFD and by developing new tools for diagnosing and predicting multiple stressors. Copyright © 2014. Published by Elsevier B.V.

  6. Large-scale recovery of an endangered amphibian despite ongoing exposure to multiple stressors

    USGS Publications Warehouse

    Knapp, Roland A.; Fellers, Gary M.; Kleeman, Patrick M.; Miller, David A. W.; Vrendenburg, Vance T.; Rosenblum, Erica Bree; Briggs, Cheryl J.

    2016-01-01

    Amphibians are one of the most threatened animal groups, with 32% of species at risk for extinction. Given this imperiled status, is the disappearance of a large fraction of the Earth’s amphibians inevitable, or are some declining species more resilient than is generally assumed? We address this question in a species that is emblematic of many declining amphibians, the endangered Sierra Nevada yellow-legged frog (Rana sierrae). Based on >7,000 frog surveys conducted across Yosemite National Park over a 20-y period, we show that, after decades of decline and despite ongoing exposure to multiple stressors, including introduced fish, the recently emerged disease chytridiomycosis, and pesticides, R. sierrae abundance increased sevenfold during the study and at a rate of 11% per year. These increases occurred in hundreds of populations throughout Yosemite, providing a rare example of amphibian recovery at an ecologically relevant spatial scale. Results from a laboratory experiment indicate that these increases may be in part because of reduced frog susceptibility to chytridiomycosis. The disappearance of nonnative fish from numerous water bodies after cessation of stocking also contributed to the recovery. The large-scale increases in R. sierrae abundance that we document suggest that, when habitats are relatively intact and stressors are reduced in their importance by active management or species’ adaptive responses, declines of some amphibians may be partially reversible, at least at a regional scale. Other studies conducted over similarly large temporal and spatial scales are critically needed to provide insight and generality about the reversibility of amphibian declines at a global scale.

  7. Large-scale recovery of an endangered amphibian despite ongoing exposure to multiple stressors.

    PubMed

    Knapp, Roland A; Fellers, Gary M; Kleeman, Patrick M; Miller, David A W; Vredenburg, Vance T; Rosenblum, Erica Bree; Briggs, Cheryl J

    2016-10-18

    Amphibians are one of the most threatened animal groups, with 32% of species at risk for extinction. Given this imperiled status, is the disappearance of a large fraction of the Earth's amphibians inevitable, or are some declining species more resilient than is generally assumed? We address this question in a species that is emblematic of many declining amphibians, the endangered Sierra Nevada yellow-legged frog (Rana sierrae). Based on >7,000 frog surveys conducted across Yosemite National Park over a 20-y period, we show that, after decades of decline and despite ongoing exposure to multiple stressors, including introduced fish, the recently emerged disease chytridiomycosis, and pesticides, R. sierrae abundance increased sevenfold during the study and at a rate of 11% per year. These increases occurred in hundreds of populations throughout Yosemite, providing a rare example of amphibian recovery at an ecologically relevant spatial scale. Results from a laboratory experiment indicate that these increases may be in part because of reduced frog susceptibility to chytridiomycosis. The disappearance of nonnative fish from numerous water bodies after cessation of stocking also contributed to the recovery. The large-scale increases in R. sierrae abundance that we document suggest that, when habitats are relatively intact and stressors are reduced in their importance by active management or species' adaptive responses, declines of some amphibians may be partially reversible, at least at a regional scale. Other studies conducted over similarly large temporal and spatial scales are critically needed to provide insight and generality about the reversibility of amphibian declines at a global scale.

  8. Large-scale recovery of an endangered amphibian despite ongoing exposure to multiple stressors

    PubMed Central

    Knapp, Roland A.; Fellers, Gary M.; Kleeman, Patrick M.; Miller, David A. W.; Rosenblum, Erica Bree; Briggs, Cheryl J.

    2016-01-01

    Amphibians are one of the most threatened animal groups, with 32% of species at risk for extinction. Given this imperiled status, is the disappearance of a large fraction of the Earth’s amphibians inevitable, or are some declining species more resilient than is generally assumed? We address this question in a species that is emblematic of many declining amphibians, the endangered Sierra Nevada yellow-legged frog (Rana sierrae). Based on >7,000 frog surveys conducted across Yosemite National Park over a 20-y period, we show that, after decades of decline and despite ongoing exposure to multiple stressors, including introduced fish, the recently emerged disease chytridiomycosis, and pesticides, R. sierrae abundance increased sevenfold during the study and at a rate of 11% per year. These increases occurred in hundreds of populations throughout Yosemite, providing a rare example of amphibian recovery at an ecologically relevant spatial scale. Results from a laboratory experiment indicate that these increases may be in part because of reduced frog susceptibility to chytridiomycosis. The disappearance of nonnative fish from numerous water bodies after cessation of stocking also contributed to the recovery. The large-scale increases in R. sierrae abundance that we document suggest that, when habitats are relatively intact and stressors are reduced in their importance by active management or species’ adaptive responses, declines of some amphibians may be partially reversible, at least at a regional scale. Other studies conducted over similarly large temporal and spatial scales are critically needed to provide insight and generality about the reversibility of amphibian declines at a global scale. PMID:27698128

  9. On Road Study of Colorado Front Range Greenhouse Gases Distribution and Sources

    NASA Astrophysics Data System (ADS)

    Petron, G.; Hirsch, A.; Trainer, M. K.; Karion, A.; Kofler, J.; Sweeney, C.; Andrews, A.; Kolodzey, W.; Miller, B. R.; Miller, L.; Montzka, S. A.; Kitzis, D. R.; Patrick, L.; Frost, G. J.; Ryerson, T. B.; Robers, J. M.; Tans, P.

    2008-12-01

    The Global Monitoring Division and the Chemical Sciences Division of the NOAA Earth System Research Laboratory teamed up over the summer of 2008 to experiment with a new measurement strategy for characterizing greenhouse gas distributions and sources in the Colorado Front Range. Combining expertise in greenhouse gas measurements and in intensive campaigns for local- to regional-scale air quality studies, we built the 'Hybrid Lab'. A continuous CO2 and CH4 cavity ring-down spectroscopic analyzer (Picarro, Inc.), a CO gas-filter correlation instrument (Thermo Environmental, Inc.) and a continuous UV absorption ozone monitor (2B Technologies, Inc., model 202SC) were installed securely onboard a 2006 Toyota Prius hybrid vehicle, with an inlet bringing in outside air from a few meters above the ground. To better characterize point and distributed sources, air samples were taken with a Portable Flask Package (PFP) for later multiple-species analysis in the lab. A GPS unit hooked up to the ozone analyzer and another installed on the PFP kept track of our location, allowing us to map measured concentrations onto the driving route using Google Earth. The Hybrid Lab went out on several drives in the vicinity of the NOAA Boulder Atmospheric Observatory (BAO) tall tower located in Erie, CO, covering areas from Boulder, Denver, Longmont, Fort Collins and Greeley. Enhancements in CO2 and CO and destruction of ozone mainly reflect emissions from traffic. Methane enhancements, however, are clearly correlated with nearby point sources (landfill, feedlot, natural gas compressor ...) or with larger-scale air masses advected from NE Colorado, where oil and gas drilling operations are widespread. The multiple-species analysis (hydrocarbons, CFCs, HFCs) of the air samples collected along the way brings insightful information about the methane sources at play. We will present results of the analysis and interpretation of the Hybrid Lab Front Range Study and conclude with perspectives on how we will adapt the measurement strategy to study anthropogenic CO2 emissions in the Denver Basin.

  10. Instrumentation Development for Large Scale Hypersonic Inflatable Aerodynamic Decelerator Characterization

    NASA Technical Reports Server (NTRS)

    Swanson, Gregory T.; Cassell, Alan M.

    2011-01-01

    Hypersonic Inflatable Aerodynamic Decelerator (HIAD) technology is currently being considered for multiple atmospheric entry applications as the limitations of traditional entry vehicles have been reached. The Inflatable Re-entry Vehicle Experiment (IRVE) has successfully demonstrated this technology as a viable candidate with a 3.0 m diameter vehicle sub-orbital flight. To further this technology, large-scale HIADs (6.0-8.5 m) must be developed and tested. To characterize the performance of large-scale HIAD technology, new instrumentation concepts must be developed to accommodate the flexible nature of the inflatable aeroshell. Many of the concepts under consideration for the HIAD FY12 subsonic wind tunnel test series are discussed below.

  11. Evaluating large-scale health programmes at a district level in resource-limited countries.

    PubMed

    Svoronos, Theodore; Mate, Kedar S

    2011-11-01

    Recent experience in evaluating large-scale global health programmes has highlighted the need to consider contextual differences between sites implementing the same intervention. Traditional randomized controlled trials are ill-suited for this purpose, as they are designed to identify whether an intervention works, not how, when and why it works. In this paper we review several evaluation designs that attempt to account for contextual factors that contribute to intervention effectiveness. Using these designs as a base, we propose a set of principles that may help to capture information on context. Finally, we propose a tool, called a driver diagram, traditionally used in implementation, that would allow evaluators to systematically monitor changing dynamics in project implementation and identify contextual variation across sites. We describe an implementation-related example from South Africa to underline the strengths of the tool. If used across multiple sites and multiple projects, the resulting driver diagrams could be pooled together to form a generalized theory of how, when and why a widely-used intervention works. Mechanisms similar to the driver diagram are urgently needed to complement existing evaluations of large-scale implementation efforts.
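
    One hypothetical way to encode a driver diagram so that diagrams from multiple sites can be pooled and compared (the field names and example content are our own, not the paper's):

        from dataclasses import dataclass, field

        @dataclass
        class Driver:
            """A node in a driver diagram: the aim or a (secondary) driver."""
            name: str
            children: list["Driver"] = field(default_factory=list)
            interventions: list[str] = field(default_factory=list)

        aim = Driver("Reduce facility-level mortality", children=[
            Driver("Reliable triage", interventions=["triage checklist"]),
            Driver("Staff capacity", children=[
                Driver("Training coverage", interventions=["on-site mentoring"]),
            ]),
        ])

        def walk(node, depth=0):
            """Print the tree; per-site versions can be diffed to expose contextual variation."""
            print("  " * depth + node.name, node.interventions or "")
            for child in node.children:
                walk(child, depth + 1)

        walk(aim)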

  12. 3D plasmonic nanoantennas integrated with MEA biosensors

    NASA Astrophysics Data System (ADS)

    Dipalo, Michele; Messina, Gabriele C.; Amin, Hayder; La Rocca, Rosanna; Shalabaeva, Victoria; Simi, Alessandro; Maccione, Alessandro; Zilio, Pierfrancesco; Berdondini, Luca; de Angelis, Francesco

    2015-02-01

    Neuronal signaling in brain circuits occurs at multiple scales ranging from molecules and cells to large neuronal assemblies. However, current sensing neurotechnologies are not designed for parallel access of signals at multiple scales. With the aim of combining nanoscale molecular sensing with electrical neural activity recordings within large neuronal assemblies, in this work three-dimensional (3D) plasmonic nanoantennas are integrated with multielectrode arrays (MEA). Nanoantennas are fabricated by fast ion beam milling on optical resist; gold is deposited on the nanoantennas in order to connect them electrically to the MEA microelectrodes and to obtain plasmonic behavior. The optical properties of these 3D nanostructures are studied through finite element method (FEM) simulations that show a high electromagnetic field enhancement. This plasmonic enhancement is confirmed by surface-enhanced Raman spectroscopy of a dye performed in liquid, which presents an enhancement of almost 100 times the incident field amplitude at resonant excitation. Finally, the reported MEA devices are tested on cultured rat hippocampal neurons. Neurons develop by extending branches on the nanostructured electrodes and extracellular action potentials are recorded over multiple days in vitro. Raman spectra of living neurons cultured on the nanoantennas are also acquired. These results highlight that these nanostructures could be potential candidates for combining electrophysiological measures of large networks with simultaneous spectroscopic investigations at the molecular level.

  13. Exploiting the potential of vector control for disease prevention.

    PubMed

    Townson, H; Nathan, M B; Zaim, M; Guillet, P; Manga, L; Bos, R; Kindhauser, M

    2005-12-01

    Although vector control has proven highly effective in preventing disease transmission, it is not being used to its full potential, thereby depriving disadvantaged populations of the benefits of well tried and tested methods. Following the discovery of synthetic residual insecticides in the 1940s, large-scale programmes succeeded in bringing many of the important vector-borne diseases under control. By the late 1960s, most vector-borne diseases--with the exception of malaria in Africa--were no longer considered to be of primary public health importance. The result was that control programmes lapsed, resources dwindled, and specialists in vector control disappeared from public health units. Within two decades, many important vector-borne diseases had re-emerged or spread to new areas. The time has come to restore vector control to its key role in the prevention of disease transmission, albeit with an increased emphasis on multiple measures, whether pesticide-based or involving environmental modification, and with a strengthened managerial and operational capacity. Integrated vector management provides a sound conceptual framework for deployment of cost-effective and sustainable methods of vector control. This approach allows for full consideration of the complex determinants of disease transmission, including local disease ecology, the role of human activity in increasing risks of disease transmission, and the socioeconomic conditions of affected communities.

  14. Exploiting the potential of vector control for disease prevention.

    PubMed Central

    Townson, H.; Nathan, M. B.; Zaim, M.; Guillet, P.; Manga, L.; Bos, R.; Kindhauser, M.

    2005-01-01

    Although vector control has proven highly effective in preventing disease transmission, it is not being used to its full potential, thereby depriving disadvantaged populations of the benefits of well tried and tested methods. Following the discovery of synthetic residual insecticides in the 1940s, large-scale programmes succeeded in bringing many of the important vector-borne diseases under control. By the late 1960s, most vector-borne diseases--with the exception of malaria in Africa--were no longer considered to be of primary public health importance. The result was that control programmes lapsed, resources dwindled, and specialists in vector control disappeared from public health units. Within two decades, many important vector-borne diseases had re-emerged or spread to new areas. The time has come to restore vector control to its key role in the prevention of disease transmission, albeit with an increased emphasis on multiple measures, whether pesticide-based or involving environmental modification, and with a strengthened managerial and operational capacity. Integrated vector management provides a sound conceptual framework for deployment of cost-effective and sustainable methods of vector control. This approach allows for full consideration of the complex determinants of disease transmission, including local disease ecology, the role of human activity in increasing risks of disease transmission, and the socioeconomic conditions of affected communities. PMID:16462987

  15. Distributed Electrical Energy Systems: Needs, Concepts, Approaches and Vision (in Chinese)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Yingchen; Zhang, Jun; Gao, Wenzhong

    Intelligent distributed electrical energy systems (IDEES) are characterized by a vast number of system components, diversified component types, and difficulties in operation and management, so that the traditional centralized power system management approach no longer fits their operation. Thus, it is believed that blockchain technology is one of the important feasible technical paths for building future large-scale distributed electrical energy systems. An IDEES inherently has both social and technical characteristics; as a result, a distributed electrical energy system needs to be divided into multiple layers, and at each layer a blockchain is utilized to model and manage its logical and physical functionalities. The blockchains at different layers coordinate with each other to achieve successful operation of the IDEES. Specifically, the multi-layer blockchains, named the 'blockchain group', consist of a distributed data access and service blockchain, an intelligent property management blockchain, a power system analysis blockchain, an intelligent contract operation blockchain, and an intelligent electricity trading blockchain. It is expected that the blockchain group can self-organize into a complex, autonomous and distributed IDEES. In this complex system, frequent and in-depth interactions and computing will give rise to intelligence, which is expected to bring stable, reliable and efficient electrical energy production, transmission and consumption.

  16. Grid Computing Environment using a Beowulf Cluster

    NASA Astrophysics Data System (ADS)

    Alanis, Fransisco; Mahmood, Akhtar

    2003-10-01

    Custom-made Beowulf clusters built from PCs are currently replacing expensive supercomputers for complex scientific computations. At the University of Texas - Pan American, we built an 8 Gflops Beowulf cluster for HEP research using RedHat Linux 7.3 and the LAM-MPI middleware. We will describe how we built and configured the cluster, which we have named the Sphinx Beowulf Cluster, present the results of our cluster benchmark studies, and show run-time plots of several parallel application codes that were compiled in C on the cluster using the LAM-XMPI graphical user environment. We will demonstrate a "simple" prototype grid environment, in which we submit and run parallel jobs remotely across multiple cluster nodes over the internet from the presentation room at Texas Tech University. The Sphinx Beowulf Cluster will be used for Monte Carlo grid test-bed studies for the LHC-ATLAS high energy physics experiment. The Grid is a new IT concept for the next generation of the "Super Internet" for high-performance computing. The Grid will allow scientists worldwide to view and analyze the huge amounts of data flowing from the large-scale experiments in high energy physics, and is expected to bring together geographically and organizationally dispersed computational resources, such as CPUs, storage systems, communication systems, and data sources.
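
    As an illustration of the kind of LAM/MPI-style parallel job such a cluster runs, here is a minimal sketch using Python's mpi4py bindings (our own example; the original application codes were compiled C programs, and the file name below is hypothetical). Each rank integrates a stride of a quadrature and rank 0 reduces the partial sums, the classic parallel estimate of pi:

        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()
        size = comm.Get_size()

        # Each rank sums f(x) = 4/(1+x^2) over its own stride of [0, 1];
        # the reduced total approximates pi.
        n = 1_000_000
        h = 1.0 / n
        local = h * sum(4.0 / (1.0 + ((i + 0.5) * h) ** 2)
                        for i in range(rank, n, size))

        pi = comm.reduce(local, op=MPI.SUM, root=0)
        if rank == 0:
            print(f"pi ~= {pi:.10f} on {size} processes")

    A job like this would be launched across the nodes with, for example, "mpirun -np 8 python pi_demo.py".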

  17. Increasing fruit, vegetable and water consumption in summer day camps--3-year findings of the healthy lunchbox challenge.

    PubMed

    Beets, Michael W; Tilley, Falon; Weaver, Robert G; Turner-McGrievy, Gabrielle M; Moore, Justin B

    2014-10-01

    The objective of this study was to describe the 3-year outcomes (2011-2013) of the healthy lunchbox challenge (HLC), delivered in US-based summer day camps (SDCs; 8-10 hours day(-1), 10-11 weeks summer(-1)) to increase the number of children and staff bringing fruit, vegetables and water (FVW) each day. A single-group pre-test with multiple post-test design was used in four large-scale SDCs serving more than 550 children day(-1) (6-12 years). The percentage of foods/beverages brought by children/staff, staff promotion of healthy eating and children's consumption of FVW were assessed via direct observation over 98 days across three summers. For children (3308 observations), bringing of fruit and vegetables increased (by 11-16%); no changes in FVW were observed for staff (398 observations). Reductions in unhealthy foods/beverages (e.g. soda/pop and chips) were observed for both children and staff (by 10-38%). Staff role modeling of unhealthy eating/drinking initially decreased but increased by 2013. The majority of children who brought fruit/vegetables consumed them. The HLC can influence the foods/beverages brought to SDCs. Enhancements are required to further increase FVW brought and consumed. © The Author 2014. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  18. Equating in Small-Scale Language Testing Programs

    ERIC Educational Resources Information Center

    LaFlair, Geoffrey T.; Isbell, Daniel; May, L. D. Nicolas; Gutierrez Arvizu, Maria Nelly; Jamieson, Joan

    2017-01-01

    Language programs need multiple test forms for secure administrations and effective placement decisions, but can they have confidence that scores on alternate test forms have the same meaning? In large-scale testing programs, various equating methods are available to ensure the comparability of forms. The choice of equating method is informed by…

  19. Future of applied watershed science at regional scales

    Treesearch

    Lee Benda; Daniel Miller; Steve Lanigan; Gordon Reeves

    2009-01-01

    Resource managers must deal increasingly with land use and conservation plans applied at large spatial scales (watersheds, landscapes, states, regions) involving multiple interacting federal agencies and stakeholders. Access to a geographically focused and application-oriented database would allow users in different locations and with different concerns to quickly...

  20. Spotted Towhee population dynamics in a riparian restoration context

    Treesearch

    Stacy L. Small; Frank R. Thompson III; Geoffery R. Geupel; John Faaborg

    2007-01-01

    We investigated factors at multiple scales that might influence nest predation risk for Spotted Towhees (Pipilo maculatus) along the Sacramento River, California, within the context of large-scale riparian habitat restoration. We used the logistic-exposure method and Akaike's information criterion (AIC) for model selection to compare predator...

  1. Process, pattern and scale: hydrogeomorphology and plant diversity in forested wetlands across multiple spatial scales

    NASA Astrophysics Data System (ADS)

    Alexander, L.; Hupp, C. R.; Forman, R. T.

    2002-12-01

    Many geodisturbances occur across large spatial scales, spanning entire landscapes and creating ecological phenomena in their wake. Ecological study at large scales poses special problems: (1) large-scale studies require large-scale resources, and (2) sampling is not always feasible at the appropriate scale, so researchers rely on data collected at smaller scales to interpret patterns across broad regions. A criticism of landscape ecology is that findings at small spatial scales are "scaled up" and applied indiscriminately across larger spatial scales. In this research, landscape scaling is addressed through process-pattern relationships between hydrogeomorphic processes and patterns of plant diversity in forested wetlands. The research addresses: (1) whether patterns and relationships between hydrogeomorphic, vegetation, and spatial variables can transcend scale; and (2) whether data collected at small spatial scales can be used to describe patterns and relationships across larger spatial scales. Field measurements of hydrologic, geomorphic, spatial, and vegetation data were collected or calculated for fifteen 1-ha sites on forested floodplains of six Chesapeake Bay Coastal Plain streams over a total area of about 20,000 km2. Hydroperiod (days/yr), floodplain surface elevation range (m), discharge (m3/s), stream power (kg-m/s2), sediment deposition (mm/yr), relative position downstream and other variables were used in multivariate analyses to explain differences in species richness, tree diversity (Shannon-Wiener diversity index H'), and plant community composition at four spatial scales. Data collected at the plot (400-m2) and site (c. 1-ha) scales are applied to and tested at the river-watershed and regional spatial scales. Results indicate that plant species richness and tree diversity can be described by hydrogeomorphic conditions at all scales, but are best described at the site scale. Data collected at plot and site scales are tested for spatial heterogeneity across the Chesapeake Bay Coastal Plain using a geostatistical variogram, and multiple regression analysis is used to relate plant diversity, spatial, and hydrogeomorphic variables across Coastal Plain regions and hydrologic regimes. Results indicate that relationships between hydrogeomorphic processes and patterns of plant diversity at finer scales can proxy relationships at coarser scales in some, but not all, cases. Findings also suggest that data collected at small scales can be used to describe trends across broader scales under limited conditions.

  2. New type of hill-top inflation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barvinsky, A.O.; Department of Physics, Tomsk State University, Lenin Ave. 36, Tomsk 634050; Department of Physics and Astronomy, Pacific Institute for Theoretical Physics, University of British Columbia, 6224 Agricultural Road, Vancouver, BC V6T 1Z1

    2016-01-20

    We suggest a new type of hill-top inflation originating from the initial conditions in the form of the microcanonical density matrix for the cosmological model with a large number of quantum fields conformally coupled to gravity. Initial conditions for inflation are set up by cosmological instantons describing underbarrier oscillations in the vicinity of the inflaton potential maximum. These periodic oscillations of the inflaton field and cosmological scale factor are obtained within the approximation of two coupled oscillators subject to the slow roll regime in the Euclidean time. This regime is characterized by rapid oscillations of the scale factor on the background of a slowly varying inflaton, which guarantees smallness of slow roll parameters ϵ and η of the following inflation stage. A hill-like shape of the inflaton potential is shown to be generated by logarithmic loop corrections to the tree-level asymptotically shift-invariant potential in the non-minimal Higgs inflation model and R²-gravity. The solution to the problem of hierarchy between the Planckian scale and the inflation scale is discussed within the concept of conformal higher spin fields, which also suggests the mechanism bringing the model below the gravitational cutoff and, thus, protecting it from large graviton loop corrections.

  3. New type of hill-top inflation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barvinsky, A.O.; Nesterov, D.V.; Kamenshchik, A.Yu., E-mail: barvin@td.lpi.ru, E-mail: Alexander.Kamenshchik@bo.infn.it, E-mail: nesterov@td.lpi.ru

    2016-01-01

    We suggest a new type of hill-top inflation originating from the initial conditions in the form of the microcanonical density matrix for the cosmological model with a large number of quantum fields conformally coupled to gravity. Initial conditions for inflation are set up by cosmological instantons describing underbarrier oscillations in the vicinity of the inflaton potential maximum. These periodic oscillations of the inflaton field and cosmological scale factor are obtained within the approximation of two coupled oscillators subject to the slow roll regime in the Euclidean time. This regime is characterized by rapid oscillations of the scale factor on the background of a slowly varying inflaton, which guarantees smallness of slow roll parameters ε and η of the following inflation stage. A hill-like shape of the inflaton potential is shown to be generated by logarithmic loop corrections to the tree-level asymptotically shift-invariant potential in the non-minimal Higgs inflation model and R²-gravity. The solution to the problem of hierarchy between the Planckian scale and the inflation scale is discussed within the concept of conformal higher spin fields, which also suggests the mechanism bringing the model below the gravitational cutoff and, thus, protecting it from large graviton loop corrections.

  4. Multiple Sclerosis, Personal Stories | NIH MedlinePlus the Magazine

    MedlinePlus

    Feature: Multiple Sclerosis Personal Stories: Nicole Lemelle, Iris Young, Michael Anthony, John Cantú ... "Better," an Internet video series that brings the story of MS to life through the eyes of ...

  5. Large-scale atomistic simulations demonstrate dominant alloy disorder effects in GaBixAs1 -x/GaAs multiple quantum wells

    NASA Astrophysics Data System (ADS)

    Usman, Muhammad

    2018-04-01

    Bismide semiconductor materials and heterostructures are considered a promising candidate for the design and implementation of photonic, thermoelectric, photovoltaic, and spintronic devices. This work presents a detailed theoretical study of the electronic and optical properties of strongly coupled GaBixAs1 -x /GaAs multiple quantum well (MQW) structures. Based on a systematic set of large-scale atomistic tight-binding calculations, our results reveal that the impact of atomic-scale fluctuations in alloy composition is stronger than the interwell coupling effect, and plays an important role in the electronic and optical properties of the investigated MQW structures. Independent of QW geometry parameters, alloy disorder leads to a strong confinement of charge carriers, a large broadening of the hole energies, and a red-shift in the ground-state transition wavelength. Polarization-resolved optical transition strengths exhibit a striking effect of disorder, where the inhomogeneous broadening could exceed an order of magnitude for MQWs, in comparison to a factor of about 3 for single QWs. The strong influence of alloy disorder effects persists when small variations in the size and composition of MQWs typically expected in a realistic experimental environment are considered. The presented results highlight the limited scope of continuum methods and emphasize on the need for large-scale atomistic approaches to design devices with tailored functionalities based on the novel properties of bismide materials.

  6. Cloud-based MOTIFSIM: Detecting Similarity in Large DNA Motif Data Sets.

    PubMed

    Tran, Ngoc Tam L; Huang, Chun-Hsi

    2017-05-01

    We developed the cloud-based MOTIFSIM on the Amazon Web Services (AWS) cloud. The tool extends our web-based tool version 2.0, which was developed around a novel algorithm for detecting similarity in multiple DNA motif data sets. The cloud-based version further allows researchers to exploit the computing resources available from AWS to detect similarity in multiple large-scale DNA motif data sets resulting from next-generation sequencing technology. The tool is highly scalable, as the underlying AWS resources can be expanded on demand.
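
    The record does not spell out MOTIFSIM's algorithm, so as a hedged illustration only, here is one common way motif similarity is scored: slide one position weight matrix (PWM) along the other and take the best mean column-wise Pearson correlation. All names and matrices below are invented for the example.

        import numpy as np

        def column_corr(a, b):
            """Mean Pearson correlation between aligned PWM columns (4 x L arrays)."""
            return float(np.mean([np.corrcoef(a[:, i], b[:, i])[0, 1]
                                  for i in range(a.shape[1])]))

        def motif_similarity(p, q):
            """Slide the shorter PWM along the longer; return the best alignment score."""
            if p.shape[1] > q.shape[1]:
                p, q = q, p
            w = p.shape[1]
            return max(column_corr(p, q[:, off:off + w])
                       for off in range(q.shape[1] - w + 1))

        # Toy PWMs over A, C, G, T (each column sums to 1)
        pwm1 = np.array([[.7, .1, .1], [.1, .7, .1], [.1, .1, .7], [.1, .1, .1]])
        pwm2 = np.array([[.6, .1, .1, .2], [.2, .6, .1, .2],
                         [.1, .2, .6, .3], [.1, .1, .2, .3]])
        print(f"similarity = {motif_similarity(pwm1, pwm2):.3f}")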

  7. Validating Bayesian truth serum in large-scale online human experiments.

    PubMed

    Frank, Morgan R; Cebrian, Manuel; Pickard, Galen; Rahwan, Iyad

    2017-01-01

    Bayesian truth serum (BTS) is an exciting new method for improving honesty and information quality in multiple-choice surveys, but, despite the method's mathematical reliance on large sample sizes, the existing literature on BTS focuses only on small experiments. Given the prevalence of online survey platforms, such as Amazon's Mechanical Turk, which facilitate surveys with hundreds or thousands of participants, BTS must be effective in large-scale experiments if it is to become a readily accepted tool in real-world applications. We demonstrate that BTS quantifiably improves honesty in large-scale online surveys where the "honest" distribution of answers is known in expectation on aggregate. Furthermore, we explore a marketing application where "honest" answers cannot be known, and find that BTS treatment affects the resulting distributions of answers.
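
    As background for the scoring rule being validated, here is a hedged sketch of Bayesian truth serum scoring (after Prelec, Science 2004); the variable names and toy data are our own, not the study's. Each respondent's score is an information score (how "surprisingly common" their own answer is) plus a weighted prediction score (how well they predicted the empirical answer distribution):

        import numpy as np

        def bts_scores(answers, predictions, alpha=1.0, eps=1e-9):
            """answers: (n,) ints in [0, m); predictions: (n, m) rows summing to 1."""
            n, m = predictions.shape
            xbar = np.bincount(answers, minlength=m) / n           # empirical answer frequencies
            ybar = np.exp(np.log(predictions + eps).mean(axis=0))  # geometric-mean predictions
            info = np.log((xbar[answers] + eps) / ybar[answers])   # "surprisingly common" term
            pred = alpha * (xbar * np.log((predictions + eps) / (xbar + eps))).sum(axis=1)
            return info + pred

        rng = np.random.default_rng(0)
        answers = rng.integers(0, 3, size=1000)            # toy data: 1000 respondents, 3 choices
        predictions = rng.dirichlet(np.ones(3), size=1000)
        print(bts_scores(answers, predictions)[:5])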

  8. Validating Bayesian truth serum in large-scale online human experiments

    PubMed Central

    Frank, Morgan R.; Cebrian, Manuel; Pickard, Galen; Rahwan, Iyad

    2017-01-01

    Bayesian truth serum (BTS) is an exciting new method for improving honesty and information quality in multiple-choice surveys, but, despite the method’s mathematical reliance on large sample sizes, the existing literature on BTS focuses only on small experiments. Given the prevalence of online survey platforms, such as Amazon’s Mechanical Turk, which facilitate surveys with hundreds or thousands of participants, BTS must be effective in large-scale experiments if it is to become a readily accepted tool in real-world applications. We demonstrate that BTS quantifiably improves honesty in large-scale online surveys where the “honest” distribution of answers is known in expectation on aggregate. Furthermore, we explore a marketing application where “honest” answers cannot be known, and find that BTS treatment affects the resulting distributions of answers. PMID:28494000

  9. Automatic Scoring of Paper-and-Pencil Figural Responses. Research Report.

    ERIC Educational Resources Information Center

    Martinez, Michael E.; And Others

    Large-scale testing is dominated by the multiple-choice question format. Widespread use of the format is due, in part, to the ease with which multiple-choice items can be scored automatically. This paper examines automatic scoring procedures for an alternative item type: figural response. Figural response items call for the completion or…

  10. Multiple Object Retrieval in Image Databases Using Hierarchical Segmentation Tree

    ERIC Educational Resources Information Center

    Chen, Wei-Bang

    2012-01-01

    The purpose of this research is to develop a new visual information analysis, representation, and retrieval framework for automatic discovery of salient objects of user's interest in large-scale image databases. In particular, this dissertation describes a content-based image retrieval framework which supports multiple-object retrieval. The…

  11. Comparing Learner Community Behavior in Multiple Presentations of a Massive Open Online Course

    ERIC Educational Resources Information Center

    Gallagher, Silvia Elena; Savage, Timothy

    2015-01-01

    Massive Online Open Courses (MOOCs) can create large scale communities of learners who collaborate, interact and discuss learning materials and activities. MOOCs are often delivered multiple times with similar content to different cohorts of learners. However, research into the differences of learner communication, behavior and expectation between…

  12. Comparing Learner Community Behavior in Multiple Presentations of a Massive Open Online Course

    ERIC Educational Resources Information Center

    Gallagher, Silvia Elena; Savage, Timothy

    2016-01-01

    Massive Online Open Courses (MOOCs) can create large scale communities of learners who collaborate, interact and discuss learning materials and activities. MOOCs are often delivered multiple times with similar content to different cohorts of learners. However, research into the differences of learner communication, behavior and expectation between…

  13. Coupled Multiple-Response versus Free-Response Conceptual Assessment: An Example from Upper-Division Physics

    ERIC Educational Resources Information Center

    Wilcox, Bethany R.; Pollock, Steven J.

    2014-01-01

    Free-response research-based assessments, like the Colorado Upper-division Electrostatics Diagnostic (CUE), provide rich, fine-grained information about students' reasoning. However, because of the difficulties inherent in scoring these assessments, the majority of the large-scale conceptual assessments in physics are multiple choice. To increase…

  14. Floating Data and the Problem with Illustrating Multiple Regression.

    ERIC Educational Resources Information Center

    Sachau, Daniel A.

    2000-01-01

    Discusses how to introduce basic concepts of multiple regression by creating a large-scale, three-dimensional regression model using the classroom walls and floor. Addresses teaching points that should be covered and reveals student reaction to the model. Finds that the greatest benefit of the model is the low fear, walk-through, nonmathematical…

  15. SUMMIT (Serially Unified Multicenter Multiple Sclerosis Investigation): creating a repository of deeply phenotyped contemporary multiple sclerosis cohorts.

    PubMed

    Bove, Riley; Chitnis, Tanuja; Cree, Bruce Ac; Tintoré, Mar; Naegelin, Yvonne; Uitdehaag, Bernard Mj; Kappos, Ludwig; Khoury, Samia J; Montalban, Xavier; Hauser, Stephen L; Weiner, Howard L

    2017-08-01

    There is a pressing need for robust longitudinal cohort studies in the modern treatment era of multiple sclerosis. Our objective is to build a multiple sclerosis (MS) cohort repository that captures the variability of disability accumulation and provides the depth of characterization (clinical, radiologic, genetic, biospecimens) required to adequately model and ultimately predict a patient's course. Serially Unified Multicenter Multiple Sclerosis Investigation (SUMMIT) is an international multi-center, prospectively enrolled cohort with over a decade of comprehensive follow-up on more than 1000 patients from two large North American academic MS centers: Brigham and Women's Hospital (Comprehensive Longitudinal Investigation of Multiple Sclerosis at the Brigham and Women's Hospital (CLIMB)) and the University of California, San Francisco (Expression/genomics, Proteomics, Imaging, and Clinical (EPIC)). It is bringing online more than 2500 patients from additional international MS centers: Basel (Universitätsspital Basel (UHB)), VU University Medical Center MS Center Amsterdam (MSCA), Multiple Sclerosis Center of Catalonia-Vall d'Hebron Hospital (Barcelona clinically isolated syndrome (CIS) cohort), and the American University of Beirut Medical Center (AUBMC-Multiple Sclerosis Interdisciplinary Research (AMIR)). We provide evidence for harmonization of two of the initial cohorts in terms of the characterization of demographic, disease, and treatment-related variables; demonstrate several proof-of-principle analyses examining genetic and radiologic predictors of disease progression; and discuss the steps involved in expanding SUMMIT into a repository accessible to the broader scientific community.

  16. Production of microbial biosurfactants by solid-state cultivation.

    PubMed

    Krieger, Nadia; Camilios Neto, Doumit; Mitchell, David Alexander

    2010-01-01

    In recent years biosurfactants have attracted attention because of their low toxicity, biodegradability and ecological acceptability. However, their use is currently extremely limited due to their high cost in relation to that of chemical surfactants. Solid-state cultivation represents an alternative technology for biosurfactant production that can bring two important advantages: firstly, it allows the use of inexpensive substrates and, secondly, it avoids the problem of foaming that complicates submerged cultivation processes for biosurfactant production. In this chapter we show that, despite its potential, to date relatively little attention has been given to solid-state cultivation for biosurfactant production. We also note that this cultivation technique brings its own challenges, such as the selection of a bioreactor type that will allow adequate heat removal, of substrates with appropriate physico-chemical properties and of methods for monitoring of the cultivation process and recovering the biosurfactants from the fermented solid. With suitable efforts in research, solid-state cultivation can be used for large-scale production of biosurfactants.

  17. Snapshot Hyperspectral Volumetric Microscopy

    NASA Astrophysics Data System (ADS)

    Wu, Jiamin; Xiong, Bo; Lin, Xing; He, Jijun; Suo, Jinli; Dai, Qionghai

    2016-04-01

    The comprehensive analysis of biological specimens brings a demand for capturing the spatial, temporal and spectral dimensions of visual information together. However, such high-dimensional video acquisition faces major challenges in achieving large data throughput and effective multiplexing. Here, we report snapshot hyperspectral volumetric microscopy, which computationally reconstructs hyperspectral profiles for high-resolution volumes of ~1000 μm × 1000 μm × 500 μm at video rate using a novel four-dimensional (4D) deconvolution algorithm. We validated the proposed approach both with numerical simulations for quantitative evaluation and with various experimental results on the prototype system. Different applications, such as biological component analysis in bright field and spectral unmixing of multiple fluorescent labels, are demonstrated. The experiments on moving fluorescent beads and GFP-labelled drosophila larvae indicate the great potential of our method for observing multiple fluorescent markers in dynamic specimens.

  18. Development of a Knowledgebase (MetRxn) of Metabolites, Reactions and Atom Mappings to Accelerate Discovery and Redesign

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maranas, Costas D.

    With advances in DNA sequencing and genome annotation techniques, the breadth of metabolic knowledge across all kingdoms of life is increasing. The construction of genome-scale models (GSMs) facilitates this distillation of knowledge by systematically accounting for reaction stoichiometry and directionality, gene-to-protein-to-reaction relationships, reaction localization among cellular organelles, metabolite transport costs and routes, transcriptional regulation, and biomass composition. Genome-scale reconstructions now span all kingdoms of life, from microbes to whole-plant models, and have become indispensable for driving informed metabolic designs and interventions. A key barrier to the pace of this development is our inability to utilize metabolite/reaction information from databases such as BRENDA [1], KEGG [2], MetaCyc [3], etc. due to incompatibilities of representation, duplications, and errors. Duplicate entries constitute a major impediment, where the same metabolite is found under multiple names across databases and models, which significantly slows down the collating of information from multiple data sources. This can also lead to serious modeling errors such as charge/mass imbalances [4,5], which can thwart model predictive abilities such as identifying synthetic lethal gene pairs and quantifying metabolic flows. Hence, we created the MetRxn database [6], which takes the next step in integrating data from multiple sources and formats to automatically create a standardized knowledgebase. We subsequently deployed this resource to bring about new paradigms in genome-scale metabolic model reconstruction, metabolic flux elucidation through MFA, modeling of microbial communities, and pathway prospecting. This research has enabled the PI's group to continue building upon research milestones and reach new ones (see the list of MetRxn-related publications below).

  19. Combining heterogenous features for 3D hand-held object recognition

    NASA Astrophysics Data System (ADS)

    Lv, Xiong; Wang, Shuang; Li, Xiangyang; Jiang, Shuqiang

    2014-10-01

    Object recognition has wide applications in human-machine interaction and multimedia retrieval. However, due to visual polysemy and concept polymorphism, obtaining reliable recognition results from 2D images is still a great challenge. Recently, with the emergence and easy availability of RGB-D equipment such as Kinect, this challenge can be relieved because the depth channel brings more information. A special and important case of object recognition is hand-held object recognition, as the hand is a direct and natural channel for both human-human and human-machine interaction. In this paper, we study the problem of 3D object recognition by combining heterogeneous features with different modalities and extraction techniques. Although hand-crafted features preserve low-level information such as shape and color, they are weaker at representing high-level semantic information than automatically learned features, especially deep features. Deep features have shown great advantages in large-scale dataset recognition but are not always as robust to rotation or scale variance as hand-crafted features. In this paper, we propose a method to combine hand-crafted point cloud features and deep features learned from the RGB and depth channels. First, hand-held object segmentation is implemented using depth cues and human skeleton information. Second, we combine the extracted heterogeneous 3D features in different stages using linear concatenation and multiple kernel learning (MKL). A trained model is then used to recognize 3D hand-held objects. Experimental results validate the effectiveness and generalization ability of the proposed method.
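
    As a hedged sketch of the two fusion strategies named above: linear concatenation, and a fixed-weight stand-in for MKL (true MKL learns the kernel weights). The features here are random stand-ins for the deep RGB and hand-crafted depth descriptors, not the paper's data.

        import numpy as np
        from sklearn.metrics.pairwise import rbf_kernel
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        X_rgb = rng.normal(size=(200, 64))   # stand-in deep features (RGB channel)
        X_dep = rng.normal(size=(200, 32))   # stand-in hand-crafted features (depth channel)
        y = rng.integers(0, 2, size=200)     # toy binary object labels

        # (a) early fusion: linear concatenation of heterogeneous features
        X_cat = np.hstack([X_rgb, X_dep])

        # (b) kernel-level fusion: weighted sum of per-modality RBF kernels,
        # fed to an SVM as a precomputed kernel
        K = 0.6 * rbf_kernel(X_rgb) + 0.4 * rbf_kernel(X_dep)
        clf = SVC(kernel="precomputed").fit(K, y)
        print("train accuracy:", clf.score(K, y))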

  20. Study of Multiple Scale Physics of Magnetic Reconnection on the FLARE (Facility for Laboratory Reconnection Experiments)

    NASA Astrophysics Data System (ADS)

    Ji, H.; Bhattacharjee, A.; Prager, S.; Daughton, W. S.; Bale, S. D.; Carter, T. A.; Crocker, N.; Drake, J. F.; Egedal, J.; Sarff, J.; Wallace, J.; Chen, Y.; Cutler, R.; Fox, W. R., II; Heitzenroeder, P.; Kalish, M.; Jara-Almonte, J.; Myers, C. E.; Ren, Y.; Yamada, M.; Yoo, J.

    2015-12-01

    The FLARE device (flare.pppl.gov) is a new intermediate-scale plasma experiment under construction at Princeton to study magnetic reconnection in regimes directly relevant to space, solar and astrophysical plasmas. The existing small-scale experiments have focused on the single X-line reconnection process, either with small effective sizes or at low Lundquist numbers, whereas both are typically very large in natural plasmas. The configuration of the FLARE device is designed to provide experimental access to the new regimes involving multiple X-lines, as guided by a reconnection "phase diagram" [Ji & Daughton, PoP (2011)]. Most of the major components of the FLARE device have been designed and are under construction. The device will be assembled and installed in 2016, followed by commissioning and operation in 2017. The planned research on FLARE as a user facility will be discussed, on topics including the multiple-scale nature of magnetic reconnection from global fluid scales down to ion and electron kinetic scales. Results from scoping simulations based on particle and fluid codes, and possible comparative research with space measurements, will be presented.

  1. Effect of B-ring substitution pattern on binding mode of propionamide selective androgen receptor modulators.

    PubMed

    Bohl, Casey E; Wu, Zengru; Chen, Jiyun; Mohler, Michael L; Yang, Jun; Hwang, Dong Jin; Mustafa, Suni; Miller, Duane D; Bell, Charles E; Dalton, James T

    2008-10-15

    Selective androgen receptor modulators (SARMs) are essentially prostate-sparing androgens, which provide therapeutic potential in osteoporosis, male hormone replacement, and muscle wasting. Herein we report crystal structures of the androgen receptor (AR) ligand-binding domain (LBD) complexed to a series of potent synthetic nonsteroidal SARMs with a substituted pendant arene referred to as the B-ring. We found that hydrophilic B-ring para-substituted analogs exhibit an additional region of hydrogen bonding not seen with steroidal compounds and that multiple halogen substitutions affect the B-ring conformation and aromatic interactions with Trp741. This information elucidates the interactions important for high AR binding affinity and provides new insight for structure-based drug design.

  2. Assessing Leading ERP-SAP Implementation in Leading Firms in Indonesia

    NASA Astrophysics Data System (ADS)

    Syaiful, B.; Gunawan, W.

    2017-01-01

    Enterprise resource planning (ERP) can bring critical capabilities to an organisation; however, implementing such capabilities is often beset with problems. Implementing ERP-SAP in Indonesian enterprises still faces tremendous challenges, with failure rates that can exceed 80% of cases. The article examines, from a practical perspective, the common problems consultants face when dealing with their clients. It presents multiple case studies of leading enterprises in Indonesia, such as KS (largest steel producer), GEM (large mining producer), and HS (large retailer), with the aim of identifying the root problems of SAP implementation. The outcome of the study is expected to provide consultants with a guideline for understanding the ERP implementation process at their clients and effective solutions to cope with it.

  3. A genetically stable rooting protocol for propagating a threatened medicinal plant—Celastrus paniculatus

    PubMed Central

    Phulwaria, Mahendra; Rai, Manoj K.; Patel, Ashok Kumar; Kataria, Vinod; Shekhawat, N. S.

    2012-01-01

    Celastrus paniculatus, belonging to the family Celastraceae, is an important medicinal plant of India. Owing to ever-increasing demand from the pharmaceutical industry, the species is being overexploited, thereby threatening its stock in the wild. Poor seed viability coupled with low germination restricts its propagation through sexual means. Thus, alternative approaches such as in vitro techniques are highly desirable for large-scale propagation of this medicinally important plant. Nodal segments, obtained from a 12-year-old mature plant, were used as explants for multiple shoot induction. Shoot multiplication was achieved by repeated transfer of mother explants and subculturing of in vitro produced shoot clumps on Murashige and Skoog (MS) medium supplemented with various concentrations of 6-benzylaminopurine (BAP), alone or in combination with an auxin (indole-3-acetic acid (IAA) or α-naphthalene acetic acid (NAA)). The maximum number of shoots (47.75 ± 2.58) was observed on MS medium supplemented with BAP (0.5 mg L−1) and IAA (0.1 mg L−1). In vitro raised shoots were rooted under ex vitro conditions after treating them with indole-3-butyric acid (300 mg L−1) for 3 min. Over 95% of plantlets acclimatized successfully. The genetic fidelity of the regenerated plants was assessed using random amplified polymorphic DNA. No polymorphism was detected between the regenerated plants and the mother plant, confirming the genetic fidelity of the in vitro raised plantlets. The protocol described could be effectively employed for large-scale multiplication of C. paniculatus, and its commercial application could be realized for large-scale multiplication and supply to the State Forest Department.

  4. Fine-scale characteristics of interplanetary sector

    NASA Technical Reports Server (NTRS)

    Behannon, K. W.; Neubauer, F. M.; Barnstoff, H.

    1980-01-01

    The structure of the interplanetary sector boundaries observed by Helios 1 within sector transition regions was studied. Such regions consist in some cases of intermediate (nonspiral) average field orientations, as well as a number of large-angle directional discontinuities (DD's) on the fine scale (time scales under 1 hour). Such DD's are found to be more similar to tangential than rotational discontinuities, to be oriented on average more nearly perpendicular than parallel to the ecliptic plane, and usually to be accompanied by a large dip (about 80%) in B; with a most probable thickness of 3 x 10^4 km, they are significantly thicker than those previously studied. It is hypothesized that the observed structures represent multiple traversals of the global heliospheric current sheet due to local fluctuations in the position of the sheet. There is evidence that such fluctuations are sometimes produced by wavelike motions or surface corrugations of scale length 0.05 - 0.1 AU superimposed on the large-scale structure.

  5. Homogenization techniques for population dynamics in strongly heterogeneous landscapes.

    PubMed

    Yurk, Brian P; Cobbold, Christina A

    2018-12-01

    An important problem in spatial ecology is to understand how population-scale patterns emerge from individual-level birth, death, and movement processes. These processes, which depend on local landscape characteristics, vary spatially and may exhibit sharp transitions through behavioural responses to habitat edges, leading to discontinuous population densities. Such systems can be modelled using reaction-diffusion equations with interface conditions that capture local behaviour at patch boundaries. In this work we develop a novel homogenization technique to approximate the large-scale dynamics of the system. We illustrate our approach, which also generalizes to multiple species, with an example of logistic growth within a periodic environment. We find that population persistence and the large-scale population carrying capacity is influenced by patch residence times that depend on patch preference, as well as movement rates in adjacent patches. The forms of the homogenized coefficients yield key theoretical insights into how large-scale dynamics arise from the small-scale features.
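
    A minimal sketch of the model class described, in our own notation (the paper's exact interface conditions may differ): discontinuous densities arise because the density, not only the flux, can jump at a habitat edge.

        % Logistic reaction-diffusion with patch-dependent coefficients and
        % interface conditions at habitat edges x_j:
        \begin{align*}
          \partial_t u &= \partial_x\bigl(d(x)\,\partial_x u\bigr)
                          + r(x)\,u\Bigl(1 - \frac{u}{K(x)}\Bigr),
                          \qquad x \neq x_j,\\
          u(x_j^-) &= k_j\,u(x_j^+),\qquad
          d(x_j^-)\,\partial_x u(x_j^-) = d(x_j^+)\,\partial_x u(x_j^+).
        \end{align*}
        % Homogenization replaces d, r, K over a periodic landscape with
        % effective constants, so the large-scale dynamics obey
        % \partial_t u = \bar{d}\,\partial_x^2 u + \bar{r}\,u(1 - u/\bar{K}).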

  6. Nanoparticles from renewable polymers

    PubMed Central

    Wurm, Frederik R.; Weiss, Clemens K.

    2014-01-01

    The use of polymers from natural resources can bring many benefits to novel polymeric nanoparticle systems. Such polymers have a variety of beneficial properties, such as biodegradability and biocompatibility, and they are readily available on a large scale and at low cost. As fossil fuel reserves decrease, their application becomes more attractive, even though their characterization is in many cases more challenging due to structural complexity, whether from a broad distribution of molecular weights (polysaccharides, polyesters, lignin) or from complex structure (proteins, lignin). This review summarizes different sources and methods for the preparation of biopolymer-based nanoparticle systems for various applications. PMID:25101259

  7. The role of the independent clinical laboratory in new assay development and commercialization.

    PubMed

    Ellis, David G

    2003-01-01

    Most would agree that these are exciting times in the field of laboratory medicine. As the body of scientific knowledge expands and research activities, such as those catalyzed by the sequencing of the human genome, bring us closer to the promise of personalized medicine, the clinical laboratory industry will have increasing opportunities to partner with owners of intellectual property to develop and commercialize new diagnostic tests. The large, independent clinical laboratories are particularly well positioned to commercialize important new tests, with their broad market penetration, infrastructure, and the scale to run esoteric tests cost-effectively.

  8. Sizing of complex structure by the integration of several different optimal design algorithms

    NASA Technical Reports Server (NTRS)

    Sobieszczanski, J.

    1974-01-01

    Practical design of large-scale structures can be accomplished with the aid of the digital computer by bringing together in one computer program algorithms of nonlinear mathematical programming and optimality criteria with weight-strength and other so-called engineering methods. Applications of this approach to aviation structures are discussed, with a detailed description of how the total problem of structural sizing can be broken down into subproblems for best utilization of each algorithm and for efficient organization of the program into iterative loops. Typical results are examined for a number of examples.

  9. Architectural Strategies for Enabling Data-Driven Science at Scale

    NASA Astrophysics Data System (ADS)

    Crichton, D. J.; Law, E. S.; Doyle, R. J.; Little, M. M.

    2017-12-01

    The analysis of large data collections from NASA or other agencies is often executed through traditional computational and data analysis approaches, which require users to bring data to their desktops and perform local data analysis. Alternatively, data are hauled to large computational environments that provide centralized data analysis via traditional High Performance Computing (HPC). Scientific data archives, however, are not only growing massive, but are also becoming highly distributed. Neither traditional approach provides a good solution for optimizing analysis into the future. Assumptions across the NASA mission and science data lifecycle, which historically assume that all data can be collected, transmitted, processed, and archived, will not scale as more capable instruments stress legacy-based systems. New paradigms are needed to increase the productivity and effectiveness of scientific data analysis. This paradigm must recognize that architectural and analytical choices are interrelated, and must be carefully coordinated in any system that aims to allow efficient, interactive scientific exploration and discovery to exploit massive data collections, from point of collection (e.g., onboard) to analysis and decision support. The most effective approach to analyzing a distributed set of massive data may involve some exploration and iteration, putting a premium on the flexibility afforded by the architectural framework. The framework should enable scientist users to assemble workflows efficiently, manage the uncertainties related to data analysis and inference, and optimize deep-dive analytics to enhance scalability. In many cases, this "data ecosystem" needs to be able to integrate multiple observing assets, ground environments, archives, and analytics, evolving from stewardship of measurements of data to using computational methodologies to better derive insight from the data that may be fused with other sets of data. This presentation will discuss architectural strategies, including a 2015-2016 NASA AIST Study on Big Data, for evolving scientific research towards massively distributed data-driven discovery. It will include example use cases across earth science, planetary science, and other disciplines.

  10. Using systems thinking to support clinical system transformation.

    PubMed

    Best, Allan; Berland, Alex; Herbert, Carol; Bitz, Jennifer; van Dijk, Marlies W; Krause, Christina; Cochrane, Douglas; Noel, Kevin; Marsden, Julian; McKeown, Shari; Millar, John

    2016-05-16

    Purpose - The British Columbia Ministry of Health's Clinical Care Management initiative was used as a case study to better understand large-scale change (LSC) within BC's health system. Using a complex system framework, the purpose of this paper is to examine mechanisms that enable and constrain the implementation of clinical guidelines across various clinical settings. Design/methodology/approach - Researchers applied a general model of complex adaptive systems plus two specific conceptual frameworks (realist evaluation and system dynamics mapping) to define and study enablers and constraints. Focus group sessions and interviews with clinicians, executives, managers and board members were validated through an online survey. Findings - The functional themes for managing large-scale clinical change included: creating a context to prepare clinicians for health system transformation initiatives; promoting shared clinical leadership; strengthening knowledge management, strategic communications and opportunities for networking; and clearing pathways through the complexity of a multilevel, dynamic system. Research limitations/implications - The action research methodology was designed to guide continuing improvement of implementation. A sample of initiatives was selected; it was not intended to compare and contrast facilitators and barriers across all initiatives and regions. Similarly, evaluating the results or process of guideline implementation was outside the scope; the methods were designed to enable conversations at multiple levels - policy, management and practice - about how to improve implementation. The study is best seen as a case study of LSC, offering a possible model for replication by others and a tool to shape further dialogue. Practical implications - Recommended action-oriented strategies included engaging local champions; supporting local adaptation for implementation of clinical guidelines; strengthening local teams to guide implementation; reducing change fatigue; ensuring adequate resources; providing consistent communication especially for front-line care providers; and supporting local teams to demonstrate the clinical value of the guidelines to their colleagues. Originality/value - Bringing a complex systems perspective to clinical guideline implementation resulted in a clear understanding of the challenges involved in LSC.

  11. Returning migrant characteristics and labor market demand in Greece.

    PubMed

    Petras, E M; Kousis, M

    1988-01-01

    Immigrants who repatriate bring with them modern work skills which many observers in labor exporting regions describe as a great contribution to the mother country. Using data from 2 samples of Greek repatriates as well as projections of industrial labor force demands in Greece for the 1980s, this article challenges this concept. The authors find that the uneven regional development and stunted industrial growth which pushed these workers abroad are also responsible for the narrowly limited employment options which they face once they repatriate. For the urban repatriate, the market is limited to unemployment, the urban informal sector and scattered jobs, while for the rural repatriate, small-scale agriculture, multiple job holdings and unemployment are the only viable options.

  12. Advances toward field application of 3D hydraulic tomography

    NASA Astrophysics Data System (ADS)

    Cardiff, M. A.; Barrash, W.; Kitanidis, P. K.

    2011-12-01

    Hydraulic tomography (HT) is a technique that shows great potential for aquifer characterization and one that holds the promise of producing 3D hydraulic property distributions, given suitable equipment. First suggested over 15 years ago, HT assimilates distributed aquifer pressure (head) response data collected during a series of multiple pumping tests to produce estimates of aquifer property variability. Unlike traditional curve-matching analyses, which assume homogeneity or "effective" parameters within the radius of influence of a hydrologic test, HT analysis relies on numerical models with detailed heterogeneity in order to invert for the highly resolved 3D parameter distribution that jointly fits all data. Several numerical and laboratory investigations of characterization using HT have shown that property distributions can be accurately estimated between observation locations when experiments are correctly designed, a property not always shared by other, simpler 1D characterization approaches such as partially-penetrating slug tests. HT may represent one of the best methods available for obtaining detailed 3D aquifer property descriptions, especially in deep or "hard" aquifer materials, where direct-push methods may not be feasible. However, to date HT has not been widely adopted at contaminated field sites. We believe that the current perceived impediments to HT adoption center on four key issues: 1) a paucity in the scientific literature of proven, cross-validated 3D field applications; 2) a lack of guidelines and best practices for performing field 3D HT experiments; 3) the practical difficulty and time commitment associated with installing a large number of high-accuracy sampling locations and running a large number of pumping tests; and 4) the computational difficulty associated with solving large-scale inverse problems for parameter identification. In this talk, we present current results in 3D HT research that address these four issues and thus bring HT closer to field practice. Topics to be discussed include: improving field efficiency through design and implementation of new modular, easily-installed equipment for 3D HT; validating field-scale 3D HT through application and cross-validation at the Boise Hydrogeophysical Research Site; developing guidelines for HT implementation based on field experience, numerical modeling, and a comprehensive literature review of the past 15 years of HT research; and applying novel, fast numerical methods for large-scale HT data analysis. The results presented will focus on the application of 3D HT, but we also hope to provide insights on aquifer characterization that stimulate thought on the issue of continually updating aquifer characteristic estimates while recognizing uncertainties and providing guidance for future data collection.

  13. Large-scale diversity of slope fishes: pattern inconsistency between multiple diversity indices.

    PubMed

    Gaertner, Jean-Claude; Maiorano, Porzia; Mérigot, Bastien; Colloca, Francesco; Politou, Chrissi-Yianna; Gil De Sola, Luis; Bertrand, Jacques A; Murenu, Matteo; Durbec, Jean-Pierre; Kallianiotis, Argyris; Mannini, Alessandro

    2013-01-01

    Large-scale studies focused on the diversity of continental slope ecosystems are still rare, usually restricted to a limited number of diversity indices and mainly based on the empirical comparison of heterogeneous local data sets. In contrast, we investigate large-scale fish diversity on the basis of multiple diversity indices and using 1454 standardized trawl hauls collected throughout the upper and middle slope of the whole northern Mediterranean Sea (36°3'- 45°7' N; 5°3'W - 28°E). We have analyzed (1) the empirical relationships between a set of 11 diversity indices in order to assess their degree of complementarity/redundancy and (2) the consistency of spatial patterns exhibited by each of the complementary groups of indices. Regarding species richness, our results contrasted both the traditional view based on the hump-shaped theory for bathymetric pattern and the commonly-admitted hypothesis of a large-scale decreasing trend correlated with a similar gradient of primary production in the Mediterranean Sea. More generally, we found that the components of slope fish diversity we analyzed did not always show a consistent pattern of distribution according either to depth or to spatial areas, suggesting that they are not driven by the same factors. These results, which stress the need to extend the number of indices traditionally considered in diversity monitoring networks, could provide a basis for rethinking not only the methodological approach used in monitoring systems, but also the definition of priority zones for protection. Finally, our results call into question the feasibility of properly investigating large-scale diversity patterns using a widespread approach in ecology, which is based on the compilation of pre-existing heterogeneous and disparate data sets, in particular when focusing on indices that are very sensitive to sampling design standardization, such as species richness.
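
    The record does not list the 11 indices used, but as a hedged illustration, here are a few standard diversity indices such a study typically computes from a per-haul species-abundance vector (the example data are invented):

        import numpy as np

        def diversity_indices(counts):
            """Standard indices from one haul's species-abundance vector."""
            counts = np.asarray(counts, dtype=float)
            counts = counts[counts > 0]
            p = counts / counts.sum()
            s = len(counts)                          # species richness S
            h = -(p * np.log(p)).sum()               # Shannon-Wiener H'
            d = 1.0 - (p ** 2).sum()                 # Gini-Simpson index 1-D
            j = h / np.log(s) if s > 1 else 0.0      # Pielou's evenness J'
            return {"S": s, "H'": h, "1-D": d, "J'": j}

        print(diversity_indices([120, 45, 12, 3, 3, 1]))

    Comparing several such indices per haul is exactly where pattern inconsistency can emerge: richness and evenness need not co-vary along depth or longitude.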

  14. Large scale modulation of high frequency acoustic waves in periodic porous media.

    PubMed

    Boutin, Claude; Rallu, Antoine; Hans, Stephane

    2012-12-01

    This paper deals with the description of the modulation at large scale of high frequency acoustic waves in gas saturated periodic porous media. High frequencies mean local dynamics at the pore scale and therefore absence of scale separation in the usual sense of homogenization. However, although the pressure is spatially varying in the pores (according to periodic eigenmodes), the mode amplitude can present a large scale modulation, thereby introducing another type of scale separation to which the asymptotic multi-scale procedure applies. The approach is first presented on a periodic network of inter-connected Helmholtz resonators. The equations governing the modulations carried by periodic eigenmodes, at frequencies close to their eigenfrequency, are derived. The number of cells on which the carrying periodic mode is defined is therefore a parameter of the modeling. In a second part, the asymptotic approach is developed for periodic porous media saturated by a perfect gas. Using the "multicells" periodic condition, one obtains the family of equations governing the amplitude modulation at large scale of high frequency waves. The significant difference between modulations of simple and multiple mode are evidenced and discussed. The features of the modulation (anisotropy, width of frequency band) are also analyzed.
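
    In our own notation (a sketch, not the paper's exact formulation), the ansatz behind such a description separates a periodic carrier eigenmode from a slowly varying envelope:

        % Near a cell eigenfrequency \omega_n, the pore pressure is a periodic
        % eigenmode \phi_n modulated by an envelope A on the large scale X:
        \[
          p(x,t) \simeq A(X)\,\phi_n(x)\,e^{\mathrm{i}\omega t},
          \qquad X = \varepsilon x,\quad \varepsilon \ll 1,
        \]
        % and the asymptotic procedure yields an equation governing A(X) alone,
        % which describes the large-scale modulation of the high-frequency wave.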

  15. Business Architecture Development at Public Administration - Insights from Government EA Method Engineering Project in Finland

    NASA Astrophysics Data System (ADS)

    Valtonen, Katariina; Leppänen, Mauri

    Governments worldwide are concerned with the efficient production of services to customers. To improve the quality of services and to make service production more efficient, information and communication technology (ICT) is widely exploited in public administration (PA). Succeeding in this exploitation calls for large-scale planning which embraces issues from the strategic to the technological level. In this planning the notion of enterprise architecture (EA) is commonly applied. One of the sub-architectures of EA is business architecture (BA). BA planning is challenging in PA due to a large number of stakeholders, a wide set of customers, and solid and hierarchical organizational structures. To support EA planning in Finland, a project to engineer a government EA (GEA) method was launched. In this chapter, we analyze the discussions and outputs of the project workshops and reflect the emerging issues against current e-government literature. We bring forth insights into and suggestions for government BA and its development.

  16. GLAD: a system for developing and deploying large-scale bioinformatics grid.

    PubMed

    Teo, Yong-Meng; Wang, Xianbing; Ng, Yew-Kwong

    2005-03-01

    Grid computing is used to solve large-scale bioinformatics problems involving gigabyte-scale databases by distributing the computation across multiple platforms. Until now, developing bioinformatics grid applications has been extremely tedious: one must design and implement the component algorithms and parallelization techniques for different classes of problems, and access remotely located sequence database files of varying formats across the grid. In this study, we propose a grid programming toolkit, GLAD (Grid Life sciences Applications Developer), which facilitates the development and deployment of bioinformatics applications on a grid. GLAD has been developed using ALiCE (Adaptive scaLable Internet-based Computing Engine), a Java-based grid middleware that exploits task-based parallelism. Two benchmark bioinformatics applications, distributed sequence comparison and distributed progressive multiple sequence alignment, have been developed using GLAD.
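
    GLAD and ALiCE are Java-based; purely for illustration, the sketch below shows the task-based decomposition such a toolkit relies on, using Python's standard library as a stand-in for the grid middleware (all names hypothetical):

        from concurrent.futures import ProcessPoolExecutor
        from itertools import combinations

        def score(pair):
            """Toy scoring task standing in for an alignment kernel that a
            grid worker would execute."""
            (name_a, a), (name_b, b) = pair
            return name_a, name_b, sum(x == y for x, y in zip(a, b))

        sequences = {"seq1": "ACGTACGT", "seq2": "ACGAACGA", "seq3": "TTGTACGT"}

        if __name__ == "__main__":
            tasks = list(combinations(sequences.items(), 2))  # one task per pair
            with ProcessPoolExecutor() as pool:               # task-based parallelism
                for a, b, s in pool.map(score, tasks):
                    print(f"{a} vs {b}: {s} identical positions")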

  17. Revised Kuppuswamy's Socioeconomic Status Scale: Explained and Updated.

    PubMed

    Sharma, Rahul

    2017-10-15

    Some facets of the Kuppuswamy socioeconomic status scale can create confusion about how to classify respondents and require explanation, and the scale needs some minor updates to stay current. This article explains these points and provides a revised scale that allows for real-time updating.

  18. Revised Kuppuswamy's Socioeconomic Status Scale: Explained and Updated.

    PubMed

    Sharma, Rahul

    2017-08-26

    Some facets of the Kuppuswamy socioeconomic status scale can create confusion about how to classify respondents and require explanation, and the scale needs some minor updates to stay current. This article explains these points and provides a revised scale that allows for real-time updating.

  19. A new large-scale manufacturing platform for complex biopharmaceuticals.

    PubMed

    Vogel, Jens H; Nguyen, Huong; Giovannini, Roberto; Ignowski, Jolene; Garger, Steve; Salgotra, Anil; Tom, Jennifer

    2012-12-01

    Complex biopharmaceuticals, such as recombinant blood coagulation factors, address critical medical needs and represent a growing multibillion-dollar market. For commercial manufacturing of such molecules, which are sometimes inherently unstable, it is important to minimize product residence time in non-ideal milieu in order to obtain acceptable yields and consistently high product quality. Continuous perfusion cell culture minimizes residence time in the bioreactor, but also brings unique challenges in product recovery, which require innovative solutions. In order to maximize yield, process efficiency, and facility and equipment utilization, we have developed, scaled up and successfully implemented a new integrated manufacturing platform at commercial scale. This platform consists of a (semi-)continuous cell separation process, based on a disposable flow path and integrated with the upstream perfusion operation, followed by membrane chromatography on large-scale adsorber capsules in rapid cycling mode. Implementation of the platform at commercial scale for a new product candidate led to a yield improvement of 40% compared with the conventional process technology, while product quality has been shown to be more consistently high. Over 1,000,000 L of cell culture harvest have been processed with a 100% success rate to date, demonstrating the robustness of the new platform process in GMP manufacturing. While membrane chromatography is well established for polishing in flow-through mode, this is its first commercial-scale application for bind/elute chromatography in the biopharmaceutical industry and demonstrates its potential, in particular, for manufacturing of potent, low-dose biopharmaceuticals. Copyright © 2012 Wiley Periodicals, Inc.

  20. Evaluation of an index of biotic integrity approach to assess fish assemblage condition in Western USA streams and rivers at varying spatial scales

    EPA Science Inventory

    Consistent assessments of biological condition are needed across multiple ecoregions to provide a greater understanding of the spatial extent of environmental degradation. However, consistent assessments at large geographic scales are often hampered by lack of uniformity in data ...

  1. Local-scale invasion pathways and small founder numbers in introduced Sacramento pikeminnow (Ptychocheilus grandis)

    Treesearch

    Andrew P. Kinziger; Rodney J. Nakamoto; Bret C. Harvey

    2014-01-01

    Given the general pattern of invasions with severe ecological consequences commonly resulting from multiple introductions of large numbers of individuals on the intercontinental scale, we explored an example of a highly successful, ecologically significant invader introduced over a short distance, possibly via minimal propagule pressure. The Sacramento pikeminnow (

  2. Taking the pulse of a continent: Expanding site-based research infrastructure for regional- to continental-scale ecology

    USDA-ARS?s Scientific Manuscript database

    Many of the most dramatic and surprising effects of global change on ecological systems will occur across large spatial extents, from regions to continents. Multiple ecosystem types will be impacted across a range of interacting spatial and temporal scales. The ability of ecologists to understand an...

  3. MONITORING COASTAL RESOURCES AT MULTIPLE SPATIAL AND TEMPORAL SCALES: LESSONS FROM EMAP 2001 EMAP SYMPOSIUM, APRIL 24-27, PENSACOLA BEACH, FL

    EPA Science Inventory

    In 1990, EMAP's Coastal Monitoring Program conducted its first regional sampling program in the Virginian Province. This first effort focused only at large spatial scales (regional) with some stratification to examine estuarine types. In the ensuing decade, EMAP-Coastal has condu...

  4. Making Visible Teacher Reports of Their Teaching Experiences: The Early Childhood Teacher Experiences Scale

    ERIC Educational Resources Information Center

    Fantuzzo, John; Perlman, Staci; Sproul, Faith; Minney, Ashley; Perry, Marlo A.; Li, Feifei

    2012-01-01

    The study developed multiple independent scales of early childhood teacher experiences (ECTES). ECTES was co-constructed with preschool, kindergarten, and first grade teachers in a large urban school district. Demographic, ECTES, and teaching practices data were collected from 584 teachers. Factor analyses documented three teacher experience…

  5. Relative Costs of Various Types of Assessments.

    ERIC Educational Resources Information Center

    Wheeler, Patricia H.

    Issues of the relative costs of multiple choice tests and alternative types of assessment are explored. Before alternative assessments in large-scale or small-scale programs are used, attention must be given to cost considerations and the resources required to develop and implement the assessment. Major categories of cost to be considered are…

  6. Large area nanoimprint by substrate conformal imprint lithography (SCIL)

    NASA Astrophysics Data System (ADS)

    Verschuuren, Marc A.; Megens, Mischa; Ni, Yongfeng; van Sprang, Hans; Polman, Albert

    2017-06-01

    Releasing the potential of advanced material properties by controlled structuring of materials at sub-100-nm length scales, for applications such as integrated circuits, nano-photonics, (bio-)sensors, lasers, optical security, etc., requires new technology to fabricate nano-patterns over large areas (from cm² to 200 mm wafers up to display sizes) in a cost-effective manner. Conventional high-end optical lithography, such as steppers/scanners, is highly capital-intensive and not flexible towards substrate types. Nanoimprint has had the potential for over 20 years to provide a cost-effective, flexible method for large-area nano-patterning. Over the last 3-4 years, nanoimprint has made great progress towards volume production. The main accelerator has been the switch from rigid to wafer-scale soft stamps and tool improvements for step-and-repeat patterning. In this paper, we discuss substrate conformal imprint lithography (SCIL), which combines the nanometer resolution, low pattern distortion, and overlay alignment traditionally reserved for rigid stamps with the flexibility and robustness of soft stamps. This was made possible by the combination of a new soft stamp material and an inorganic resist with an innovative imprint method. Finally, a volume production solution is presented, which can pattern up to 60 wafers per hour.

  7. Numerically modelling the large scale coronal magnetic field

    NASA Astrophysics Data System (ADS)

    Panja, Mayukh; Nandi, Dibyendu

    2016-07-01

    The solar corona spews out vast amounts of magnetized plasma into the heliosphere, which has a direct impact on the Earth's magnetosphere. It is therefore important that we develop an understanding of the dynamics of the solar corona. With our present technology it has not been possible to generate 3D magnetic maps of the solar corona; this warrants the use of numerical simulations to study the coronal magnetic field. A very popular method of doing this is to extrapolate the photospheric magnetic field using NLFF or PFSS codes. However, the extrapolations at different time intervals are completely independent of each other and do not capture the temporal evolution of magnetic fields. On the other hand, full MHD simulations of the global coronal field, apart from being computationally very expensive, would be physically less transparent, owing to the large number of free parameters that are typically used in such codes. This brings us to the magnetofrictional model, which is relatively simpler and computationally more economic. We have developed a magnetofrictional model in 3D spherical polar coordinates to study the large-scale global coronal field. Here we present studies of changing connectivities between active regions in response to photospheric motions.
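
    For orientation, the standard magnetofrictional prescription replaces the momentum equation by a frictional velocity proportional to the Lorentz force and evolves the field with the induction equation; the generic form is sketched below (the authors' exact coefficients are not given in the abstract):

        % Generic magnetofrictional prescription (schematic):
        \mathbf{v} = \frac{1}{\nu}\, \mathbf{J} \times \mathbf{B},
        \qquad \mathbf{J} = \frac{\nabla \times \mathbf{B}}{\mu_0},
        \qquad \frac{\partial \mathbf{B}}{\partial t}
             = \nabla \times (\mathbf{v} \times \mathbf{B}).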

  8. Large-scale data analysis of power grid resilience across multiple US service regions

    NASA Astrophysics Data System (ADS)

    Ji, Chuanyi; Wei, Yun; Mei, Henry; Calzada, Jorge; Carey, Matthew; Church, Steve; Hayes, Timothy; Nugent, Brian; Stella, Gregory; Wallace, Matthew; White, Joe; Wilcox, Robert

    2016-05-01

    Severe weather events frequently result in large-scale power failures, affecting millions of people for extended durations. However, the lack of comprehensive, detailed failure and recovery data has impeded large-scale resilience studies. Here, we analyse data from four major service regions representing Upstate New York during Super Storm Sandy and daily operations. Using non-stationary spatiotemporal random processes that relate infrastructural failures to recoveries and cost, our data analysis shows that local power failures have a disproportionally large non-local impact on people (that is, the top 20% of failures interrupted 84% of services to customers). A large number (89%) of small failures, represented by the bottom 34% of customers and commonplace devices, resulted in 56% of the total cost of 28 million customer interruption hours. Our study shows that extreme weather does not cause, but rather exacerbates, existing vulnerabilities, which are obscured in daily operations.
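
    The "top 20% of failures interrupted 84% of services" figure is a concentration statistic; a minimal sketch of how such a number is computed, on synthetic heavy-tailed outage sizes (data purely illustrative, not the study's):

        import numpy as np

        rng = np.random.default_rng(0)
        # Synthetic failure sizes (customers affected); heavy-tailed,
        # as outage data tends to be
        failures = rng.pareto(1.5, size=10_000) + 1.0

        order = np.argsort(failures)[::-1]            # largest failures first
        share = np.cumsum(failures[order]) / failures.sum()

        top20 = int(0.20 * failures.size)
        print(f"top 20% of failures -> {share[top20 - 1]:.0%} of customer impact")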

  9. Cooperation without culture? The null effect of generalized trust on intentional homicide: a cross-national panel analysis, 1995-2009.

    PubMed

    Robbins, Blaine

    2013-01-01

    Sociologists, political scientists, and economists all suggest that culture plays a pivotal role in the development of large-scale cooperation. In this study, I used generalized trust as a measure of culture to explore if and how culture impacts intentional homicide, my operationalization of cooperation. I compiled multiple cross-national data sets and used pooled time-series linear regression, single-equation instrumental-variables linear regression, and fixed- and random-effects estimation techniques on an unbalanced panel of 118 countries and 232 observations spread over a 15-year time period. Results suggest that culture and large-scale cooperation form a tenuous relationship, while economic factors such as development, inequality, and geopolitics appear to drive large-scale cooperation.
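
    The fixed-effects estimation described here typically corresponds to a two-way fixed-effects specification of the following generic form (notation mine, not necessarily the study's exact model):

        % Two-way fixed-effects panel model (schematic):
        y_{it} = \beta\,\mathrm{Trust}_{it} + \mathbf{x}_{it}'\boldsymbol{\gamma}
                 + \alpha_i + \delta_t + \varepsilon_{it},

    where the country effects alpha_i absorb time-invariant heterogeneity and the period effects delta_t absorb common shocks, so beta is identified from within-country variation in generalized trust.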

  10. A visualization tool to support decision making in environmental and biological planning

    USGS Publications Warehouse

    Romañach, Stephanie S.; McKelvy, James M.; Conzelmann, Craig; Suir, Kevin J.

    2014-01-01

    Large-scale ecosystem management involves consideration of many factors for informed decision making. The EverVIEW Data Viewer is a cross-platform desktop decision support tool to help decision makers compare simulation model outputs from competing plans for restoring Florida's Greater Everglades. The integration of NetCDF metadata conventions into EverVIEW allows end-users from multiple institutions within and beyond the Everglades restoration community to share information and tools. Our development process incorporates continuous interaction with targeted end-users for increased likelihood of adoption. One of EverVIEW's signature features is side-by-side map panels, which can be used to simultaneously compare species or habitat impacts from alternative restoration plans. Other features include examination of potential restoration plan impacts across multiple geographic or tabular displays, and animation through time. As a result of an iterative, standards-driven approach, EverVIEW is relevant to large-scale planning beyond Florida, and is used in multiple biological planning efforts in the United States.
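
    Reading model output that follows NetCDF metadata conventions is straightforward with the netCDF4 library; the file and variable names below are hypothetical, chosen only to illustrate the pattern:

        from netCDF4 import Dataset  # pip install netCDF4

        # File and variable names here are hypothetical; EverVIEW itself reads
        # model output that follows the NetCDF metadata conventions noted above.
        with Dataset("restoration_plan_A.nc") as nc:
            depth = nc.variables["water_depth"]   # e.g. a (time, y, x) grid
            print(depth.units, depth.shape)       # metadata travels with the data
            snapshot = depth[0, :, :]             # first time step, full grid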

  11. Method for identifying subsurface fluid migration and drainage pathways in and among oil and gas reservoirs using 3-D and 4-D seismic imaging

    DOEpatents

    Anderson, R.N.; Boulanger, A.; Bagdonas, E.P.; Xu, L.; He, W.

    1996-12-17

    The invention utilizes 3-D and 4-D seismic surveys as a means of deriving information useful in petroleum exploration and reservoir management. The methods use both single seismic surveys (3-D) and multiple seismic surveys separated in time (4-D) of a region of interest to determine large scale migration pathways within sedimentary basins, and fine scale drainage structure and oil-water-gas regions within individual petroleum producing reservoirs. Such structure is identified using pattern recognition tools which define the regions of interest. The 4-D seismic data sets may be used for data completion for large scale structure where time intervals between surveys do not allow for dynamic evolution. The 4-D seismic data sets also may be used to find variations over time of small scale structure within individual reservoirs which may be used to identify petroleum drainage pathways, oil-water-gas regions and, hence, attractive drilling targets. After spatial orientation, and amplitude and frequency matching of the multiple seismic data sets, High Amplitude Event (HAE) regions consistent with the presence of petroleum are identified using seismic attribute analysis. High Amplitude Regions are grown and interconnected to establish plumbing networks on the large scale and reservoir structure on the small scale. Small scale variations over time between seismic surveys within individual reservoirs are identified and used to identify drainage patterns and bypassed petroleum to be recovered. The location of such drainage patterns and bypassed petroleum may be used to site wells. 22 figs.
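
    As a rough illustration of the region-growing step (not the patent's actual algorithm), high-amplitude voxels can be thresholded and connected into candidate HAE regions:

        import numpy as np
        from scipy import ndimage

        def grow_high_amplitude_regions(amplitude, threshold):
            """Label connected high-amplitude regions in a seismic attribute
            cube; a stand-in for the HAE growing step described above."""
            mask = amplitude > threshold      # seed voxels consistent with petroleum
            labels, n = ndimage.label(mask)   # connect neighbouring seeds into regions
            return labels, n

        cube = np.random.default_rng(1).normal(size=(50, 50, 50))  # toy attribute cube
        labels, n = grow_high_amplitude_regions(cube, threshold=2.5)
        print(f"{n} connected high-amplitude regions")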

  12. Method for identifying subsurface fluid migration and drainage pathways in and among oil and gas reservoirs using 3-D and 4-D seismic imaging

    DOEpatents

    Anderson, Roger N.; Boulanger, Albert; Bagdonas, Edward P.; Xu, Liqing; He, Wei

    1996-01-01

    The invention utilizes 3-D and 4-D seismic surveys as a means of deriving information useful in petroleum exploration and reservoir management. The methods use both single seismic surveys (3-D) and multiple seismic surveys separated in time (4-D) of a region of interest to determine large scale migration pathways within sedimentary basins, and fine scale drainage structure and oil-water-gas regions within individual petroleum producing reservoirs. Such structure is identified using pattern recognition tools which define the regions of interest. The 4-D seismic data sets may be used for data completion for large scale structure where time intervals between surveys do not allow for dynamic evolution. The 4-D seismic data sets also may be used to find variations over time of small scale structure within individual reservoirs which may be used to identify petroleum drainage pathways, oil-water-gas regions and, hence, attractive drilling targets. After spatial orientation, and amplitude and frequency matching of the multiple seismic data sets, High Amplitude Event (HAE) regions consistent with the presence of petroleum are identified using seismic attribute analysis. High Amplitude Regions are grown and interconnected to establish plumbing networks on the large scale and reservoir structure on the small scale. Small scale variations over time between seismic surveys within individual reservoirs are identified and used to identify drainage patterns and bypassed petroleum to be recovered. The location of such drainage patterns and bypassed petroleum may be used to site wells.

  13. 2013 Dade W. Moeller lecture: medical countermeasures against radiological terrorism.

    PubMed

    Moulder, John E

    2014-08-01

    Soon after the 9-11 attacks, politicians and scientists began to question our ability to cope with a large-scale radiological terrorism incident. The outline of what was needed was fairly obvious: the ability to prevent such an attack, methods to cope with the medical consequences, the ability to clean up afterward, and the tools to figure out who perpetrated the attack and bring them to justice. The medical response needed three components: the technology to determine rapidly the radiation doses received by a large number of people, methods for alleviating acute hematological radiation injuries, and therapies for mitigation and treatment of chronic radiation injuries. Research done to date has shown that a realistic medical response plan is scientifically possible, but the regulatory and financial barriers to achieving this may currently be insurmountable.

  14. Scale Up in Education. Volume 1: Ideas in Principle

    ERIC Educational Resources Information Center

    Schneider, Barbara Ed.; McDonald, Sarah-Kathryn Ed.

    2006-01-01

    "Scale Up in Education, Volume 1: Ideas in Principle" examines the challenges of "scaling up" from a multidisciplinary perspective. It brings together contributions from disciplines that routinely take promising innovations to scale, including medicine, business, engineering, computing, and education. Together the contributors explore appropriate…

  15. epiDMS: Data Management and Analytics for Decision-Making From Epidemic Spread Simulation Ensembles.

    PubMed

    Liu, Sicong; Poccia, Silvestro; Candan, K Selçuk; Chowell, Gerardo; Sapino, Maria Luisa

    2016-12-01

    Carefully calibrated large-scale computational models of epidemic spread represent a powerful tool to support the decision-making process during epidemic emergencies. Epidemic models are being increasingly used for generating forecasts of the spatial-temporal progression of epidemics at different spatial scales and for assessing the likely impact of different intervention strategies. However, the management and analysis of simulation ensembles stemming from large-scale computational models pose challenges, particularly when dealing with multiple interdependent parameters, spanning multiple layers and geospatial frames, affected by complex dynamic processes operating at different resolutions. We describe and illustrate with examples a novel epidemic simulation data management system, epiDMS, that was developed to address the challenges that arise from the need to generate, search, visualize, and analyze, in a scalable manner, large volumes of epidemic simulation ensembles and observations during the progression of an epidemic. epiDMS is a publicly available system that facilitates management and analysis of large epidemic simulation ensembles. epiDMS aims to fill an important hole in decision-making during healthcare emergencies by enabling critical services with significant economic and health impact. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.

  16. Dynamic effective connectivity in cortically embedded systems of recurrently coupled synfire chains.

    PubMed

    Trengove, Chris; Diesmann, Markus; van Leeuwen, Cees

    2016-02-01

    As a candidate mechanism of neural representation, large numbers of synfire chains can efficiently be embedded in a balanced recurrent cortical network model. Here we study a model in which multiple synfire chains of variable strength are randomly coupled together to form a recurrent system. The system can be implemented both as a large-scale network of integrate-and-fire neurons and as a reduced model. The latter has binary-state pools as basic units but is otherwise isomorphic to the large-scale model, and provides an efficient tool for studying its behavior. Both the large-scale system and its reduced counterpart are able to sustain ongoing endogenous activity in the form of synfire waves, the proliferation of which is regulated by negative feedback caused by collateral noise. Within this equilibrium, diverse repertoires of ongoing activity are observed, including meta-stability and multiple steady states. These states arise in concert with an effective connectivity structure (ECS). The ECS admits a family of effective connectivity graphs (ECGs), parametrized by the mean global activity level. Of these graphs, the strongly connected components and their associated out-components account to a large extent for the observed steady states of the system. These results imply a notion of dynamic effective connectivity as governing neural computation with synfire chains, and related forms of cortical circuitry with complex topologies.
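
    A toy version of the reduced binary-pool dynamics can convey the idea of a synfire wave propagating along a chain; the parameters and update rule below are illustrative, not the paper's model:

        import numpy as np

        rng = np.random.default_rng(42)
        n_pools, steps = 20, 30
        threshold = 0.5      # fraction of upstream drive needed to ignite a pool
        noise = 0.05         # collateral background noise

        state = np.zeros((steps, n_pools))
        state[0, 0] = 1.0    # stimulate the first pool to launch a wave

        for t in range(1, steps):
            drive = np.roll(state[t - 1], 1)   # each pool driven by its predecessor
            drive[0] = 0.0                     # open chain: no wrap-around
            state[t] = (drive + rng.normal(0, noise, n_pools)) > threshold

        print(state.astype(int))               # wave travelling down the chain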

  17. Conjugate-Gradient Algorithms For Dynamics Of Manipulators

    NASA Technical Reports Server (NTRS)

    Fijany, Amir; Scheid, Robert E.

    1993-01-01

    Algorithms for serial and parallel computation of the forward dynamics of multiple-link robotic manipulators by the conjugate-gradient method have been developed. The parallel algorithms have the potential to speed up computations on multiple linked, specialized processors implemented in very-large-scale integrated circuits. Such processors could be used to simulate dynamics, possibly faster than in real time, for purposes of planning and control.
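
    For reference, the serial conjugate-gradient iteration at the core of such algorithms is standard; the sketch below solves M a = tau for a small symmetric positive-definite matrix standing in for a joint-space inertia matrix (usage illustrative, not the NASA code):

        import numpy as np

        def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
            """Solve A x = b for symmetric positive-definite A."""
            x = np.zeros(b.size)
            r = b - A @ x                    # initial residual
            p = r.copy()                     # initial search direction
            rs = r @ r
            for _ in range(max_iter or b.size):
                Ap = A @ p
                alpha = rs / (p @ Ap)        # optimal step along p
                x += alpha * p
                r -= alpha * Ap
                rs_new = r @ r
                if np.sqrt(rs_new) < tol:
                    break
                p = r + (rs_new / rs) * p    # conjugate next direction
                rs = rs_new
            return x

        # Example: a small SPD system standing in for an inertia matrix
        M = np.array([[4.0, 1.0], [1.0, 3.0]])
        tau = np.array([1.0, 2.0])
        print(conjugate_gradient(M, tau))    # joint accelerations a = M^-1 tau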

  18. Multi-resource and multi-scale approaches for meeting the challenge of managing multiple species

    Treesearch

    Frank R. Thompson; Deborah M. Finch; John R. Probst; Glen D. Gaines; David S. Dobkin

    1999-01-01

    The large number of Neotropical migratory bird (NTMB) species and their diverse habitat requirements create conflicts and difficulties for land managers and conservationists. We provide examples of assessments or conservation efforts that attempt to address the problem of managing for multiple NTMB species. We advocate approaches at a variety of spatial and geographic...

  19. Protein cage assembly across multiple length scales.

    PubMed

    Aumiller, William M; Uchida, Masaki; Douglas, Trevor

    2018-05-21

    Within the materials science community, proteins with cage-like architectures are being developed as versatile nanoscale platforms for use in protein nanotechnology. Much effort has been focused on the functionalization of protein cages with biological and non-biological moieties to bring about new properties of not only individual protein cages, but collective bulk-scale assemblies of protein cages. In this review, we report on the current understanding of protein cage assembly, both of the cages themselves from individual subunits, and the assembly of the individual protein cages into higher order structures. We start by discussing the key properties of natural protein cages (for example: size, shape and structure) followed by a review of some of the mechanisms of protein cage assembly and the factors that influence it. We then explore the current approaches for functionalizing protein cages, on the interior or exterior surfaces of the capsids. Lastly, we explore the emerging area of higher order assemblies created from individual protein cages and their potential for new and exciting collective properties.

  20. Scaling Agile Infrastructure to People

    NASA Astrophysics Data System (ADS)

    Jones, B.; McCance, G.; Traylen, S.; Barrientos Arias, N.

    2015-12-01

    When CERN migrated its infrastructure away from homegrown fabric management tools to emerging industry-standard open-source solutions, the immediate technical challenges and motivation were clear. The move to a multi-site Cloud Computing model meant that the tool chains that were growing around this ecosystem would be a good choice; the challenge was to leverage them. The use of open-source tools brings challenges other than merely how to deploy them. Homegrown software, for all the deficiencies identified at the outset of the project, has the benefit of growing with the organization. This paper will examine what challenges there were in adapting open-source tools to the needs of the organization, particularly in the areas of multi-group development and security. Additionally, the increase in scale of the plant required changes to how Change Management was organized and managed. Continuous Integration techniques are used in order to manage the rate of change across multiple groups, and the tools and workflow for this will be examined.

  1. Towards a New Food System Assessment: AgMIP Coordinated Global and Regional Assessments of Climate Change

    NASA Technical Reports Server (NTRS)

    Rosenzweig, Cynthia E.; Thorburn, Peter

    2017-01-01

    Agricultural stakeholders need more credible information on which to base adaptation and mitigation policy decisions. In order to provide this, we must improve the rigor of agricultural modelling. Ensemble approaches can be used to address scale issues and integrated teams can overcome disciplinary silos. The AgMIP Coordinated Global and Regional Assessments of Climate Change and Food Security (CGRA) has the goal to link agricultural systems models using common protocols and scenarios to significantly improve understanding of climate effects on crops, livestock and livelihoods across multiple scales. The AgMIP CGRA assessment brings together experts in climate, crop, livestock, economics, and food security to develop Protocols to guide the process throughout the assessment. Scenarios are designed to consistently combine elements of intertwined storylines of future society including, socioeconomic development, greenhouse gas concentrations, and specific pathways of agricultural sector development. Through these approaches, AgMIP partners around the world are providing an evidence base for their stakeholders as they make decisions and investments.

  2. TernaryNet: faster deep model inference without GPUs for medical 3D segmentation using sparse and binary convolutions.

    PubMed

    Heinrich, Mattias P; Blendowski, Max; Oktay, Ozan

    2018-05-30

    Deep convolutional neural networks (DCNN) are currently ubiquitous in medical imaging. While their versatility and high-quality results for common image analysis tasks including segmentation, localisation and prediction are astonishing, the large representational power comes at the cost of highly demanding computational effort. This limits their practical applications for image-guided interventions and diagnostic (point-of-care) support using mobile devices without graphics processing units (GPU). We propose a new scheme that approximates both trainable weights and neural activations in deep networks by ternary values and tackles the open question of backpropagation when dealing with non-differentiable functions. Our solution enables the removal of the expensive floating-point matrix multiplications throughout any convolutional neural network and replaces them by energy- and time-preserving binary operators and population counts. We evaluate our approach for the segmentation of the pancreas in CT. Here, our ternary approximation within a fully convolutional network leads to more than 90% memory reductions and high accuracy (without any post-processing) with a Dice overlap of 71.0% that comes close to the one obtained when using networks with high-precision weights and activations. We further provide a concept for sub-second inference without GPUs and demonstrate significant improvements in comparison with binary quantisation and without our proposed ternary hyperbolic tangent continuation. We present a key enabling technique for highly efficient DCNN inference without GPUs that will help to bring the advances of deep learning to practical clinical applications. It also has great promise for improving accuracies in large-scale medical data retrieval.
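
    The core weight-ternarization idea can be sketched in a few lines; the 0.7 * mean|w| threshold is a common heuristic from the ternary-network literature, not necessarily this paper's exact rule:

        import numpy as np

        def ternarize(w, delta_scale=0.7):
            """Map weights to {-1, 0, +1} with a magnitude threshold
            (heuristic threshold; illustrative, not the paper's rule)."""
            delta = delta_scale * np.abs(w).mean()
            return np.sign(w) * (np.abs(w) > delta)

        w = np.random.default_rng(0).normal(size=(4, 4)).astype(np.float32)
        wt = ternarize(w)
        x = np.ones(4, dtype=np.float32)

        # With ternary weights, a matrix product reduces to additions and
        # subtractions (popcount-style tricks on real hardware).
        print(wt @ x)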

  3. A comparison of policy and direct practice stakeholder perceptions of factors affecting evidence-based practice implementation using concept mapping

    PubMed Central

    2011-01-01

    Background The goal of this study was to assess potential differences between administrators/policymakers and those involved in direct practice regarding factors believed to be barriers or facilitating factors to evidence-based practice (EBP) implementation in a large public mental health service system in the United States. Methods Participants included mental health system county officials, agency directors, program managers, clinical staff, administrative staff, and consumers. As part of concept mapping procedures, brainstorming groups were conducted with each target group to identify specific factors believed to be barriers or facilitating factors to EBP implementation in a large public mental health system. Statements were sorted by similarity and rated by each participant in regard to their perceived importance and changeability. Multidimensional scaling, cluster analysis, descriptive statistics and t-tests were used to analyze the data. Results A total of 105 statements were distilled into 14 clusters using concept-mapping procedures. Perceptions of importance of factors affecting EBP implementation varied between the two groups, with those involved in direct practice assigning significantly higher ratings to the importance of Clinical Perceptions and the impact of EBP implementation on clinical practice. Consistent with previous studies, financial concerns (costs, funding) were rated among the most important and least likely to change by both groups. Conclusions EBP implementation is a complex process, and different stakeholders may hold different opinions regarding the relative importance of the impact of EBP implementation. Implementation efforts must include input from stakeholders at multiple levels to bring divergent and convergent perspectives to light. PMID:21899754

  4. A comparison of policy and direct practice stakeholder perceptions of factors affecting evidence-based practice implementation using concept mapping.

    PubMed

    Green, Amy E; Aarons, Gregory A

    2011-09-07

    The goal of this study was to assess potential differences between administrators/policymakers and those involved in direct practice regarding factors believed to be barriers or facilitating factors to evidence-based practice (EBP) implementation in a large public mental health service system in the United States. Participants included mental health system county officials, agency directors, program managers, clinical staff, administrative staff, and consumers. As part of concept mapping procedures, brainstorming groups were conducted with each target group to identify specific factors believed to be barriers or facilitating factors to EBP implementation in a large public mental health system. Statements were sorted by similarity and rated by each participant in regard to their perceived importance and changeability. Multidimensional scaling, cluster analysis, descriptive statistics and t-tests were used to analyze the data. A total of 105 statements were distilled into 14 clusters using concept-mapping procedures. Perceptions of importance of factors affecting EBP implementation varied between the two groups, with those involved in direct practice assigning significantly higher ratings to the importance of Clinical Perceptions and the impact of EBP implementation on clinical practice. Consistent with previous studies, financial concerns (costs, funding) were rated among the most important and least likely to change by both groups. EBP implementation is a complex process, and different stakeholders may hold different opinions regarding the relative importance of the impact of EBP implementation. Implementation efforts must include input from stakeholders at multiple levels to bring divergent and convergent perspectives to light.
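
    For orientation, the multidimensional scaling and clustering steps of concept mapping look roughly like the following sketch (synthetic dissimilarities; scikit-learn used for illustration, not the study's software):

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.manifold import MDS

        # Toy co-sorting dissimilarity matrix for 6 statements; in a real
        # concept-mapping study this comes from participants' card sorts.
        rng = np.random.default_rng(3)
        d = rng.random((6, 6))
        d = (d + d.T) / 2.0                  # symmetric dissimilarities
        np.fill_diagonal(d, 0.0)             # zero self-dissimilarity

        coords = MDS(n_components=2, dissimilarity="precomputed",
                     random_state=0).fit_transform(d)   # 2-D statement map
        clusters = KMeans(n_clusters=2, n_init=10,
                          random_state=0).fit_predict(coords)
        print(clusters)                      # cluster label per statement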

  5. PLINK: A Tool Set for Whole-Genome Association and Population-Based Linkage Analyses

    PubMed Central

    Purcell, Shaun; Neale, Benjamin; Todd-Brown, Kathe; Thomas, Lori; Ferreira, Manuel A. R.; Bender, David; Maller, Julian; Sklar, Pamela; de Bakker, Paul I. W.; Daly, Mark J.; Sham, Pak C.

    2007-01-01

    Whole-genome association studies (WGAS) bring new computational, as well as analytic, challenges to researchers. Many existing genetic-analysis tools are not designed to handle such large data sets in a convenient manner and do not necessarily exploit the new opportunities that whole-genome data bring. To address these issues, we developed PLINK, an open-source C/C++ WGAS tool set. With PLINK, large data sets comprising hundreds of thousands of markers genotyped for thousands of individuals can be rapidly manipulated and analyzed in their entirety. As well as providing tools to make the basic analytic steps computationally efficient, PLINK also supports some novel approaches to whole-genome data that take advantage of whole-genome coverage. We introduce PLINK and describe the five main domains of function: data management, summary statistics, population stratification, association analysis, and identity-by-descent estimation. In particular, we focus on the estimation and use of identity-by-state and identity-by-descent information in the context of population-based whole-genome studies. This information can be used to detect and correct for population stratification and to identify extended chromosomal segments that are shared identical by descent between very distantly related individuals. Analysis of the patterns of segmental sharing has the potential to map disease loci that contain multiple rare variants in a population-based linkage analysis. PMID:17701901
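
    The identity-by-state statistic mentioned above has a simple form; the sketch below computes pairwise mean IBS from a 0/1/2 genotype matrix (an illustration of the statistic, not PLINK's implementation):

        import numpy as np

        def ibs_matrix(genotypes):
            """Mean identity-by-state similarity between individuals.
            genotypes: (n_individuals, n_snps) minor-allele counts in {0, 1, 2}."""
            g = np.asarray(genotypes, dtype=float)
            n = g.shape[0]
            ibs = np.ones((n, n))
            for i in range(n):
                for j in range(i + 1, n):
                    # per-SNP IBS: 2 shared alleles -> 1.0, 1 -> 0.5, 0 -> 0.0
                    ibs[i, j] = ibs[j, i] = 1.0 - np.abs(g[i] - g[j]).mean() / 2.0
            return ibs

        g = np.array([[0, 1, 2, 1], [0, 1, 2, 2], [2, 1, 0, 0]])
        print(ibs_matrix(g))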

  6. Non-linear scale interactions in a forced turbulent boundary layer

    NASA Astrophysics Data System (ADS)

    Duvvuri, Subrahmanyam; McKeon, Beverley

    2015-11-01

    A strong phase-organizing influence exerted by a single synthetic large-scale spatio-temporal mode on directly-coupled (through triadic interactions) small scales in a turbulent boundary layer forced by a spatially-impulsive dynamic wall-roughness patch was previously demonstrated by the authors (J. Fluid Mech. 2015, vol. 767, R4). The experimental set-up was later enhanced to allow for simultaneous forcing of multiple scales in the flow. Results and analysis are presented from a new set of novel experiments where two distinct large scales are forced in the flow by a dynamic wall-roughness patch. The internal non-linear forcing of two other scales with triadic consistency to the artificially forced large scales, corresponding to sum and difference in wavenumbers, is dominated by the latter. This allows for a forcing-response (input-output) type analysis of the two triadic scales, and naturally lends itself to a resolvent operator based model (e.g. McKeon & Sharma, J. Fluid Mech. 2010, vol. 658, pp. 336-382) of the governing Navier-Stokes equations. The support of AFOSR (grant #FA 9550-12-1-0469, program manager D. Smith) is gratefully acknowledged.
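
    The sum and difference scales arise directly from the quadratic nonlinearity of the Navier-Stokes equations; schematically (notation mine, not the authors'):

        % Quadratic nonlinearity generates triadic interactions: forcing
        % modes at wavenumbers k_1 and k_2 drives the sum and difference
        % wavenumbers, since
        e^{i k_1 x}\, e^{\pm i k_2 x} = e^{i (k_1 \pm k_2) x}
        \quad\Longrightarrow\quad
        (\mathbf{u}\cdot\nabla)\mathbf{u} \;\text{forces}\;
        u_{k_1 + k_2} \;\text{and}\; u_{k_1 - k_2}.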

  7. Low-Cost Nested-MIMO Array for Large-Scale Wireless Sensor Applications.

    PubMed

    Zhang, Duo; Wu, Wen; Fang, Dagang; Wang, Wenqin; Cui, Can

    2017-05-12

    In modern communication and radar applications, large-scale sensor arrays have increasingly been used to improve the performance of a system. However, the hardware cost and circuit power consumption scale linearly with the number of sensors, which makes the whole system expensive and power-hungry. This paper presents a low-cost nested multiple-input multiple-output (MIMO) array, which is capable of providing O(2N²) degrees of freedom (DOF) with O(N) physical sensors. The sensor locations of the proposed array have closed-form expressions. Thus, the aperture size and number of DOF can be predicted as a function of the total number of sensors. Additionally, with the help of time-sequence-phase-weighting (TSPW) technology, only one receiver channel is required for sampling the signals received by all of the sensors, which is conducive to reducing the hardware cost and power consumption. Numerical simulation results demonstrate the effectiveness and superiority of the proposed array.
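
    The closed-form sensor placement follows the standard two-level nested-array construction, which this design builds on; a sketch of the positions and the resulting difference co-array (the paper's MIMO/TSPW extension is omitted):

        import numpy as np

        def nested_array(n1, n2, d=1):
            """Two-level nested array positions (in units of d):
            inner ULA 1..n1 at spacing d, outer ULA at spacing (n1+1)*d.
            Standard construction, illustrative of the closed form."""
            inner = np.arange(1, n1 + 1) * d
            outer = np.arange(1, n2 + 1) * (n1 + 1) * d
            return np.concatenate([inner, outer])

        pos = nested_array(3, 3)                      # 6 physical sensors
        lags = np.unique((pos[:, None] - pos[None, :]).ravel())
        print(len(pos), "sensors ->", len(lags), "distinct co-array lags")

    With 6 sensors this yields 23 consecutive lags, the O(N²)-from-O(N) effect the abstract describes.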

  8. Low-Cost Nested-MIMO Array for Large-Scale Wireless Sensor Applications

    PubMed Central

    Zhang, Duo; Wu, Wen; Fang, Dagang; Wang, Wenqin; Cui, Can

    2017-01-01

    In modern communication and radar applications, large-scale sensor arrays have increasingly been used to improve the performance of a system. However, the hardware cost and circuit power consumption scale linearly with the number of sensors, which makes the whole system expensive and power-hungry. This paper presents a low-cost nested multiple-input multiple-output (MIMO) array, which is capable of providing O(2N²) degrees of freedom (DOF) with O(N) physical sensors. The sensor locations of the proposed array have closed-form expressions. Thus, the aperture size and number of DOF can be predicted as a function of the total number of sensors. Additionally, with the help of time-sequence-phase-weighting (TSPW) technology, only one receiver channel is required for sampling the signals received by all of the sensors, which is conducive to reducing the hardware cost and power consumption. Numerical simulation results demonstrate the effectiveness and superiority of the proposed array. PMID:28498329

  9. Moving to stay in place: behavioral mechanisms for coexistence of African large carnivores.

    PubMed

    Vanak, Abi Tamim; Fortin, Daniel; Thaker, Maria; Ogden, Monika; Owen, Cailey; Greatwood, Sophie; Slotow, Rob

    2013-11-01

    Most ecosystems have multiple predator species that not only compete for shared prey, but also pose direct threats to each other. These intraguild interactions are key drivers of carnivore community structure, with ecosystem-wide cascading effects. Yet, behavioral mechanisms for coexistence of multiple carnivore species remain poorly understood. The challenges of studying large, free-ranging carnivores have resulted in mainly coarse-scale examination of behavioral strategies without information about all interacting competitors. We overcame some of these challenges by examining the concurrent fine-scale movement decisions of almost all individuals of four large mammalian carnivore species in a closed terrestrial system. We found that the intensity of intraguild interactions did not follow a simple hierarchical allometric pattern, because spatial and behavioral tactics of subordinate species changed with threat and resource levels across seasons. Lions (Panthera leo) were generally unrestricted and anchored themselves in areas rich in not only their principal prey, but also, during periods of resource limitation (dry season), rich in the main prey for other carnivores. Because of this, the greatest cost (potential intraguild predation) for subordinate carnivores was spatially coupled with the highest potential benefit of resource acquisition (prey-rich areas), especially in the dry season. Leopard (P. pardus) and cheetah (Acinonyx jubatus) overlapped with the home range of lions but minimized their risk using fine-scaled avoidance behaviors and restricted resource acquisition tactics. The cost of intraguild competition was most apparent for cheetahs, especially during the wet season, as areas with energetically rewarding large prey (wildebeest) were avoided when they overlapped highly with the activity areas of lions. Contrary to expectation, the smallest species (African wild dog, Lycaon pictus) did not avoid only lions, but also used multiple tactics to minimize encountering all other competitors. Intraguild competition thus forced wild dogs into areas with the lowest resource availability year round. Coexistence of multiple carnivore species has typically been explained by dietary niche separation, but our multi-scaled movement results suggest that differences in resource acquisition may instead be a consequence of avoiding intraguild competition. We generate a more realistic representation of hierarchical behavioral interactions that may ultimately drive spatially explicit trophic structures of multi-predator communities.

  10. The Internet As a Large-Scale Complex System

    NASA Astrophysics Data System (ADS)

    Park, Kihong; Willinger, Walter

    2005-06-01

    The Internet may be viewed as a "complex system" with diverse features and many components that can give rise to unexpected emergent phenomena, revealing much about its own engineering. This book brings together chapter contributions from a workshop held at the Santa Fe Institute in March 2001. This volume captures a snapshot of some features of the Internet that may be fruitfully approached using a complex systems perspective, meaning using interdisciplinary tools and methods to tackle the subject area. The Internet penetrates the socioeconomic fabric of everyday life; a broader and deeper grasp of the Internet may be needed to meet the challenges facing the future. The resulting empirical data have already proven to be invaluable for gaining novel insights into the network's spatio-temporal dynamics, and can be expected to become even more important when trying to explain the Internet's complex and emergent behavior in terms of elementary networking-based mechanisms. The discoveries of fractal or self-similar network traffic traces, power-law behavior in network topology and World Wide Web connectivity are instances of unsuspected, emergent system traits. Another important factor at the heart of fair, efficient, and stable sharing of network resources is user behavior. Network systems, when inhabited by selfish or greedy users, take on the traits of a noncooperative multi-party game, and their stability and efficiency are integral to understanding the overall system and its dynamics. Lastly, fault-tolerance and robustness of large-scale network systems can exhibit spatial and temporal correlations whose effective analysis and management may benefit from rescaling techniques applied in certain physical and biological systems. The present book brings together several of the leading workers involved in the analysis of complex systems with the future development of the Internet.

  11. Ensemble Kalman filtering in presence of inequality constraints

    NASA Astrophysics Data System (ADS)

    van Leeuwen, P. J.

    2009-04-01

    Kalman filtering in the presence of constraints is an active area of research. Based on the Gaussian assumption for the probability-density functions, it looks hard to bring extra constraints into the formalism. On the other hand, in geophysical systems we often encounter constraints related to e.g. the underlying physics or chemistry, which are violated by the Gaussian assumption. For instance, concentrations are always non-negative, model layers have non-negative thickness, and sea-ice concentration is between 0 and 1. Several methods to bring inequality constraints into the Kalman-filter formalism have been proposed. One of them is probability density function (pdf) truncation, in which the Gaussian mass from the non-allowed part of the variables is equally distributed over the allowed part of the pdf, as proposed by Shimada et al. 1998. However, a problem with this method is that the probability that e.g. the sea-ice concentration is zero, is zero! The new method proposed here does not have this drawback. It assumes that the probability-density function is a truncated Gaussian, but the truncated mass is not distributed equally over all allowed values of the variables; instead it is put into a delta distribution at the truncation point. This delta distribution can easily be handled in Bayes' theorem, leading to posterior probability density functions that are also truncated Gaussians with delta distributions at the truncation location. In this way a much better representation of the system is obtained, while still keeping most of the benefits of the Kalman-filter formalism. The full Kalman filter is prohibitively expensive in large-scale systems, but efficient implementation is possible in ensemble variants of the Kalman filter. Applications to low-dimensional systems and large-scale systems will be discussed.
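
    In an ensemble implementation, the proposed treatment has a particularly simple reading: members pushed outside the bounds are moved onto the boundary, which places exactly a delta of probability mass there. A minimal sketch, assuming a scalar sea-ice concentration constrained to [0, 1]:

        import numpy as np

        rng = np.random.default_rng(7)

        # Toy analysis ensemble of sea-ice concentrations; a Gaussian update
        # can push members outside the physical bounds.
        ens = rng.normal(0.1, 0.15, size=500)

        # Moving out-of-bounds members onto the boundary concentrates their
        # probability mass there (truncated Gaussian + delta at the bound;
        # a minimal sketch, not the paper's full filter).
        constrained = np.clip(ens, 0.0, 1.0)

        print(f"P(concentration == 0) = {np.mean(constrained == 0.0):.2f}")  # nonzero!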

  12. Parenting Multiples

    MedlinePlus

    ... Every book on parenting will tell you that life forever changes after the birth of a child. So parents of twins (or triplets or more!) can feel as if they've left the hospital and arrived home on a different planet. The arrival of multiple newborns can bring medical, ...

  13. The maintenance of cooperation in multiplex networks with limited and partible resources of agents

    NASA Astrophysics Data System (ADS)

    Li, Zhaofeng; Shen, Bi; Jiang, Yichuan

    2017-02-01

    In this paper, we try to explain the maintenance of cooperation in multiplex networks where agents have limited and partible resources: defection brings larger short-term benefits, and cooperative agents may become defective because of the unaffordable costs of cooperative behaviors performed in multiple layers simultaneously. Recent studies have identified the positive effects of multiple layers on evolutionary cooperation but generally overlook the maximum costs agents can bear in these synchronous games. By utilizing network effects and designing evolutionary mechanisms, cooperative behaviors become prevalent in public goods games, and agents can allocate personal resources across multiple layers. First, we generalize degree diversity to multiplex networks to improve the prospects for cooperation. Second, to prevent agents from allocating all their resources to one layer, a greedy-first mechanism is proposed, in which agents prefer to add additional investment to the higher-payoff layer. We find that greedy-first agents can sustain cooperative behaviors in multiplex networks when one layer is a scale-free network and degree differences between conjoint nodes increase. Our work may help to explain the emergence of cooperation in the absence of individual reputation and punishment mechanisms.
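
    For concreteness, a toy linear public goods game in one layer, and an agent splitting a fixed budget across two layers, might look as follows (parameters and rules illustrative, not the paper's model):

        import numpy as np

        def pgg_payoff(contributions, r=1.6):
            """Linear public goods game in one layer: contributions are
            multiplied by r and shared equally among the group."""
            contributions = np.asarray(contributions, dtype=float)
            pot = r * contributions.sum() / contributions.size
            return pot - contributions        # payoff of each agent

        # An agent with a fixed budget splits it across two layers; a
        # greedy-first agent would shift the split toward the layer that
        # currently pays more.
        budget, split = 1.0, 0.7
        layer1 = pgg_payoff([split * budget, 0.5, 0.5])
        layer2 = pgg_payoff([(1 - split) * budget, 0.4, 0.6])
        print(layer1[0] + layer2[0])          # agent 0's total payoff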

  14. Reconstructing evolutionary trees in parallel for massive sequences.

    PubMed

    Zou, Quan; Wan, Shixiang; Zeng, Xiangxiang; Ma, Zhanshan Sam

    2017-12-14

    Building evolutionary trees for massive collections of unaligned DNA sequences is crucial but challenging, and reconstructing an evolutionary tree for ultra-large sequence sets is hard. Massive multiple sequence alignment is likewise challenging and time/space-consuming. Hadoop and Spark, developed recently, bring new opportunities to these classical computational biology problems. In this paper, we solve multiple sequence alignment and evolutionary reconstruction in parallel. HPTree, the tool developed in this paper, can process big DNA sequence files quickly. It works well on files larger than 1 GB and achieves better performance than other evolutionary reconstruction tools. Users can run HPTree to reconstruct evolutionary trees on computer clusters or cloud platforms (e.g., Amazon Cloud). HPTree could help with population evolution research and metagenomics analysis. In this paper, we employ the Hadoop and Spark platforms and design an evolutionary tree reconstruction software tool for unaligned massive DNA sequences. Clustering and multiple sequence alignment are performed in parallel, and a neighbour-joining model is employed for the evolutionary tree building. We have released our software together with its source code at http://lab.malab.cn/soft/HPtree/ .

  15. Trans-National Scale-Up of Services in Global Health

    PubMed Central

    Shahin, Ilan; Sohal, Raman; Ginther, John; Hayden, Leigh; MacDonald, John A.; Mossman, Kathryn; Parikh, Himanshu; McGahan, Anita; Mitchell, Will; Bhattacharyya, Onil

    2014-01-01

    Background Scaling up innovative healthcare programs offers a means to improve access, quality, and health equity across multiple health areas. Despite large numbers of promising projects, little is known about successful efforts to scale up. This study examines trans-national scale, whereby a program operates in two or more countries. Trans-national scale is a distinct measure that reflects opportunities to replicate healthcare programs in multiple countries, thereby providing services to broader populations. Methods Based on the Center for Health Market Innovations (CHMI) database of nearly 1200 health programs, the study contrasts 116 programs that have achieved trans-national scale with 1,068 single-country programs. Data was collected on the programs' health focus, service activity, legal status, and funding sources, as well as the programs' locations (rural v. urban emphasis), and founding year; differences are reported with statistical significance. Findings This analysis examines 116 programs that have achieved trans-national scale (TNS) across multiple disease areas and activity types. Compared to 1,068 single-country programs, we find that trans-nationally scaled programs are more donor-reliant; more likely to focus on targeted health needs such as HIV/AIDS, TB, malaria, or family planning rather than provide more comprehensive general care; and more likely to engage in activities that support healthcare services rather than provide direct clinical care. Conclusion This work, based on a large data set of health programs, reports on trans-national scale with comparison to single-country programs. The work is a step towards understanding when programs are able to replicate their services as they attempt to expand health services for the poor across countries and health areas. A subset of these programs should be the subject of case studies to understand factors that affect the scaling process, particularly seeking to identify mechanisms that lead to improved health outcomes. PMID:25375328

  16. The Social Life of a Data Base

    NASA Technical Reports Server (NTRS)

    Linde, Charlotte; Wales, Roxana; Clancy, Dan (Technical Monitor)

    2002-01-01

    This paper presents the complex social life of a large data base. The topics include: 1) Social Construction of Mechanisms of Memory; 2) Data Bases: The Invisible Memory Mechanism; 3) The Human in the Machine; 4) Data of the Study: A Large-Scale Problem Reporting Data Base; 5) The PRACA Study; 6) Description of PRACA; 7) PRACA and Paper; 8) Multiple Uses of PRACA; 9) The Work of PRACA; 10) Multiple Forms of Invisibility; 11) Such Systems are Everywhere; and 12) Two Morals to the Story. This paper is in viewgraph form.

  17. MRMPROBS: a data assessment and metabolite identification tool for large-scale multiple reaction monitoring based widely targeted metabolomics.

    PubMed

    Tsugawa, Hiroshi; Arita, Masanori; Kanazawa, Mitsuhiro; Ogiwara, Atsushi; Bamba, Takeshi; Fukusaki, Eiichiro

    2013-05-21

    We developed a new software program, MRMPROBS, for widely targeted metabolomics using the large-scale multiple reaction monitoring (MRM) mode. The strategy has become increasingly popular for the simultaneous analysis of up to several hundred metabolites at high sensitivity, selectivity, and quantitative capability. However, the traditional method of assessing measured metabolomics data without probabilistic criteria is not only time-consuming but also subjective and ad hoc. Our program overcomes these problems by detecting and identifying metabolites automatically, by separating isomeric metabolites, and by removing background noise using a probabilistic score defined as the odds ratio from an optimized multivariate logistic regression model. Our software program also provides a user-friendly graphical interface to curate and organize data matrices and to apply principal component analyses and statistical tests. For a demonstration, we conducted a widely targeted metabolome analysis (152 metabolites) of propagating Saccharomyces cerevisiae measured at 15 time points by gas and liquid chromatography coupled to triple quadrupole mass spectrometry. MRMPROBS is a useful and practical tool for the assessment of large-scale MRM data available to any instrument or any experimental condition.
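
    The scoring idea, a posterior probability (and odds) for each candidate peak from a logistic regression over peak-quality features, can be sketched as follows; the features and data below are invented for illustration and are not MRMPROBS' actual model:

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Toy peak features (retention-time error, peak shape, co-elution
        # score) with manual true/false annotations.
        rng = np.random.default_rng(5)
        X_true = rng.normal([0.1, 0.9, 0.8], 0.1, size=(50, 3))
        X_false = rng.normal([0.6, 0.4, 0.3], 0.2, size=(50, 3))
        X = np.vstack([X_true, X_false])
        y = np.array([1] * 50 + [0] * 50)

        model = LogisticRegression().fit(X, y)
        p = model.predict_proba([[0.15, 0.85, 0.75]])[0, 1]
        print(f"posterior = {p:.2f}, odds = {p / (1 - p):.1f}")  # odds-style score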

  18. A linear concatenation strategy to construct 5'-enriched amplified cDNA libraries using multiple displacement amplification.

    PubMed

    Gadkar, Vijay J; Filion, Martin

    2013-06-01

    In various experimental systems, limiting available amounts of RNA may prevent a researcher from performing large-scale analyses of gene transcripts. One way to circumvent this is to 'pre-amplify' the starting RNA/cDNA, so that sufficient amounts are available for any downstream analysis. In the present study, we report the development of a novel protocol for constructing amplified cDNA libraries using the Phi29 DNA polymerase based multiple displacement amplification (MDA) system. Using as little as 200 ng of total RNA, we developed a linear concatenation strategy to make the single-stranded cDNA template amenable for MDA. The concatenation, made possible by the template switching property of the reverse transcriptase enzyme, resulted in the amplified cDNA library with intact 5' ends. MDA generated micrograms of template, allowing large-scale polymerase chain reaction analyses or other large-scale downstream applications. As the amplified cDNA library contains intact 5' ends, it is also compatible with 5' RACE analyses of specific gene transcripts. Empirical validation of this protocol is demonstrated on a highly characterized (tomato) and an uncharacterized (corn gromwell) experimental system.

  19. Differences in intermittent and continuous fecal shedding patterns between natural and experimental Mycobacterium avium subspecies paratuberculosis infections in cattle

    USDA-ARS?s Scientific Manuscript database

    The objective of this paper is to study shedding patterns of cows infected with Mycobacterium avium subsp. paratuberculosis (MAP). While multiple single-farm studies of MAP dynamics have been reported, there is no large-scale meta-analysis of both natural and experimental infections. Large difference...

  20. Electrochemical micro/nano-machining: principles and practices.

    PubMed

    Zhan, Dongping; Han, Lianhuan; Zhang, Jie; He, Quanfeng; Tian, Zhao-Wu; Tian, Zhong-Qun

    2017-03-06

    Micro/nano-machining (MNM) is becoming the cutting edge of high-tech manufacturing because of the increasing industrial demand for supersmooth surfaces and functional three-dimensional micro/nano-structures (3D-MNS) in ultra-large-scale integrated circuits, microelectromechanical systems, miniaturized total analysis systems, precision optics, and so on. Taking advantage of the absence of tool wear and surface stress, as well as environmental friendliness, simple operation, and low cost, electrochemical micro/nano-machining (EC-MNM) has an irreplaceable role in MNM. This comprehensive review presents the state of the art of EC-MNM techniques for direct writing, surface planarization and polishing, and 3D-MNS fabrication. The key point of EC-MNM is to confine electrochemical reactions to the micro/nano-meter scale. This review brings together various solutions to this "confined reaction" problem, ranging from electrochemical principles through technical characteristics to relevant applications.

  1. A climate-change adaptation framework to reduce continental-scale vulnerability across conservation reserves

    Treesearch

    D.R. Magness; J.M. Morton; F. Huettmann; F.S. Chapin; A.D. McGuire

    2011-01-01

    Rapid climate change, in conjunction with other anthropogenic drivers, has the potential to cause mass species extinction. To minimize this risk, conservation reserves need to be coordinated at multiple spatial scales because the climate envelopes of many species may shift rapidly across large geographic areas. In addition, novel species assemblages and ecological...

  2. Impact of Accumulated Error on Item Response Theory Pre-Equating with Mixed Format Tests

    ERIC Educational Resources Information Center

    Keller, Lisa A.; Keller, Robert; Cook, Robert J.; Colvin, Kimberly F.

    2016-01-01

    The equating of tests is an essential process in high-stakes, large-scale testing conducted over multiple forms or administrations. By adjusting for differences in difficulty and placing scores from different administrations of a test on a common scale, equating allows scores from these different forms and administrations to be directly compared…

  3. A multi-scale assessment of population connectivity in African lions (Panthera leo) in response to landscape change

    Treesearch

    Samuel A. Cushman; Nicholas B. Elliot; David W. Macdonald; Andrew J. Loveridge

    2015-01-01

    Habitat loss and fragmentation are among the major drivers of population declines and extinction, particularly in large carnivores. Connectivity models provide practical tools for assessing fragmentation effects and developing mitigation or conservation responses. To be useful to conservation practitioners, connectivity models need to incorporate multiple scales and...

  4. Managing landscapes at multiple scales for sustainability of ecosystem functions (Preface)

    Treesearch

    R.A. Birdsey; R. Lucas; Y. Pan; G. Sun; E.J. Gustafson; A.H.  Perera

    2010-01-01

    The science of landscape ecology is a rapidly evolving academic field with an emphasis on studying large-scale spatial heterogeneity created by natural influences and human activities. These advances have important implications for managing and conserving natural resources. At a September 2008 IUFRO conference in Chengdu, Sichuan, P.R. China, we highlighted both the...

  5. Discovering New Diseases to Accelerate Precision Medicine.

    PubMed

    Macrae, Calum A

    2017-01-01

    A rate-limiting step in multiple areas of medicine is the limited number of discrete disorders that current technologies are able to identify. Most clinical disease entities are aggregates of large numbers of discrete biological processes that simply happen to share one or two common features. We have begun to translate a wide range of new technologies to the clinic in an effort to improve the resolution and the efficiency of bedside diagnostics with a view to improving drug trials, genetic studies, and the effectiveness of the clinician in a digital environment. The general trajectory for change that new technologies will bring is outlined with some specific examples of areas where such change has already begun to occur.

  6. Query-Adaptive Hash Code Ranking for Large-Scale Multi-View Visual Search.

    PubMed

    Liu, Xianglong; Huang, Lei; Deng, Cheng; Lang, Bo; Tao, Dacheng

    2016-10-01

    Hash-based nearest neighbor search has become attractive in many applications. However, the quantization in hashing usually degenerates the discriminative power when using Hamming distance ranking. Moreover, for large-scale visual search, existing hashing methods cannot directly support efficient search over data with multiple sources, even though the literature has shown that adaptively incorporating complementary information from diverse sources or views can significantly boost search performance. To address these problems, this paper proposes a novel and generic approach to building multiple hash tables with multiple views and generating fine-grained ranking results at the bitwise and tablewise levels. For each hash table, a query-adaptive bitwise weighting is introduced to alleviate the quantization loss by simultaneously exploiting the quality of hash functions and their complementarity for nearest neighbor search. From the tablewise aspect, multiple hash tables are built for different data views as a joint index, over which a query-specific rank fusion is proposed to rerank all results from the bitwise ranking by diffusing in a graph. Comprehensive experiments on image search over three well-known benchmarks show that the proposed method achieves up to 17.11% and 20.28% performance gains on single and multiple table search over the state-of-the-art methods.
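
    The bitwise-weighting idea can be sketched in a few lines: instead of plain Hamming distance, each disagreeing bit contributes a per-query weight, so higher-quality bits dominate the ranking. The toy example below uses random codes and assumed weights; it is not the paper's exact formulation.

        import numpy as np

        def weighted_hamming(query, codes, weights):
            """Weighted Hamming distance between one query and many codes."""
            disagree = np.bitwise_xor(query, codes)  # (n_items, n_bits) of 0/1
            return disagree @ weights                # (n_items,) distances

        rng = np.random.default_rng(0)
        codes = rng.integers(0, 2, size=(1000, 64))  # 1000 binary hash codes
        query = rng.integers(0, 2, size=64)
        weights = rng.random(64)  # assumed per-query bit-quality weights

        ranking = np.argsort(weighted_hamming(query, codes, weights))
        print("top-10 neighbours:", ranking[:10])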

  7. Large-Scale Flows and Magnetic Fields Produced by Rotating Convection in a Quasi-Geostrophic Model of Planetary Cores

    NASA Astrophysics Data System (ADS)

    Guervilly, C.; Cardin, P.

    2017-12-01

    Convection is the main heat transport process in the liquid cores of planets. The convective flows are thought to be turbulent and constrained by rotation (corresponding to high Reynolds numbers Re and low Rossby numbers Ro). Under these conditions, and in the absence of magnetic fields, the convective flows can produce coherent Reynolds stresses that drive persistent large-scale zonal flows. The formation of large-scale flows has crucial implications for the thermal evolution of planets and the generation of large-scale magnetic fields. In this work, we explore this problem with numerical simulations using a quasi-geostrophic approximation to model convective and zonal flows at Re ~ 10^4 and Ro ~ 10^-4 for Prandtl numbers relevant for liquid metals (Pr ~ 0.1). The formation of intense multiple zonal jets strongly affects the convective heat transport, leading to the formation of a mean temperature staircase. We also study the generation of magnetic fields by the quasi-geostrophic flows at low magnetic Prandtl numbers.

  8. Chronic, Wireless Recordings of Large Scale Brain Activity in Freely Moving Rhesus Monkeys

    PubMed Central

    Schwarz, David A.; Lebedev, Mikhail A.; Hanson, Timothy L.; Dimitrov, Dragan F.; Lehew, Gary; Meloy, Jim; Rajangam, Sankaranarayani; Subramanian, Vivek; Ifft, Peter J.; Li, Zheng; Ramakrishnan, Arjun; Tate, Andrew; Zhuang, Katie; Nicolelis, Miguel A.L.

    2014-01-01

    Advances in techniques for recording large-scale brain activity contribute to both the elucidation of neurophysiological principles and the development of brain-machine interfaces (BMIs). Here we describe a neurophysiological paradigm for performing tethered and wireless large-scale recordings based on movable volumetric three-dimensional (3D) multielectrode implants. This approach allowed us to isolate up to 1,800 units per animal and simultaneously record the extracellular activity of close to 500 cortical neurons, distributed across multiple cortical areas, in freely behaving rhesus monkeys. The method is expandable, in principle, to thousands of simultaneously recorded channels. It also allows increased recording longevity (5 consecutive years), and recording of a broad range of behaviors, e.g. social interactions, and BMI paradigms in freely moving primates. We propose that wireless large-scale recordings could have a profound impact on basic primate neurophysiology research, while providing a framework for the development and testing of clinically relevant neuroprostheses. PMID:24776634

  9. Economically viable large-scale hydrogen liquefaction

    NASA Astrophysics Data System (ADS)

    Cardella, U.; Decker, L.; Klein, H.

    2017-02-01

    The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase by a multiple of today’s typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs, optimized for capital expenditure. New concepts must ensure a manageable plant complexity and flexible operability. In the phase of process development and selection, a dimensioning of key equipment for large scale liquefiers, such as turbines and compressors as well as heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview of the approach, challenges and preliminary results in the development of efficient as well as economically viable concepts for large-scale hydrogen liquefaction.

  10. High-Speed Interrogation for Large-Scale Fiber Bragg Grating Sensing

    PubMed Central

    Hu, Chenyuan; Bai, Wei

    2018-01-01

    A high-speed interrogation scheme for large-scale fiber Bragg grating (FBG) sensing arrays is presented. This technique employs parallel computing and pipeline control to modulate the incident light and demodulate the reflected sensing signal. One electro-optic modulator (EOM) and one semiconductor optical amplifier (SOA) were used to generate a phase delay to filter the reflected spectrum from multiple candidate FBGs with the same optical path difference (OPD). Experimental results showed that the fastest interrogation delay time for the proposed method was only about 27.2 μs for a single FBG interrogation, and the system scanning period was limited only by the optical transmission delay in the sensing fiber, owing to the multiple simultaneous central wavelength calculations. Furthermore, the proposed FPGA-based technique had a verified FBG wavelength demodulation stability of ±1 pm without averaging. PMID:29495263
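
    For illustration, a central wavelength can be recovered from a sampled reflection spectrum with a thresholded spectral centroid, a common demodulation scheme; the synthetic peak below is invented for the sketch, and this is not the paper's FPGA algorithm.

        import numpy as np

        wl = np.linspace(1549.5, 1550.5, 201)             # nm, sampling grid
        spectrum = np.exp(-((wl - 1550.02) / 0.05) ** 2)  # synthetic FBG peak

        mask = spectrum > 0.1 * spectrum.max()            # isolate the peak
        center = np.sum(wl[mask] * spectrum[mask]) / np.sum(spectrum[mask])
        print(f"central wavelength = {center:.4f} nm")    # ~1550.0200 nm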

  11. High-Speed Interrogation for Large-Scale Fiber Bragg Grating Sensing.

    PubMed

    Hu, Chenyuan; Bai, Wei

    2018-02-24

    A high-speed interrogation scheme for large-scale fiber Bragg grating (FBG) sensing arrays is presented. This technique employs parallel computing and pipeline control to modulate the incident light and demodulate the reflected sensing signal. One electro-optic modulator (EOM) and one semiconductor optical amplifier (SOA) were used to generate a phase delay to filter the reflected spectrum from multiple candidate FBGs with the same optical path difference (OPD). Experimental results showed that the fastest interrogation delay time for the proposed method was only about 27.2 μs for a single FBG interrogation, and the system scanning period was limited only by the optical transmission delay in the sensing fiber, owing to the multiple simultaneous central wavelength calculations. Furthermore, the proposed FPGA-based technique had a verified FBG wavelength demodulation stability of ±1 pm without averaging.

  12. DistributedFBA.jl: High-level, high-performance flux balance analysis in Julia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heirendt, Laurent; Thiele, Ines; Fleming, Ronan M. T.

    Flux balance analysis and its variants are widely used methods for predicting steady-state reaction rates in biochemical reaction networks. The exploration of high dimensional networks with such methods is currently hampered by software performance limitations. DistributedFBA.jl is a high-level, high-performance, open-source implementation of flux balance analysis in Julia. It is tailored to solve multiple flux balance analyses on a subset or all the reactions of large and huge-scale networks, on any number of threads or nodes.
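
    The single-LP core of flux balance analysis is easy to sketch: maximize an objective flux c^T v subject to the steady-state constraint S v = 0 and flux bounds. The toy Python/SciPy example below uses an assumed 2-metabolite, 3-reaction chain; DistributedFBA.jl solves many such LPs in parallel in Julia.

        import numpy as np
        from scipy.optimize import linprog

        # Assumed toy network: 2 metabolites x 3 reactions in a chain.
        S = np.array([[1, -1,  0],
                      [0,  1, -1]])
        bounds = [(0, 10)] * 3        # lower/upper flux bounds per reaction
        c = np.array([0, 0, 1])       # objective: flux through reaction 3

        # linprog minimizes, so negate the objective to maximize c^T v.
        res = linprog(-c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
        print("optimal fluxes:", res.x)  # expected: [10. 10. 10.]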

  13. DistributedFBA.jl: High-level, high-performance flux balance analysis in Julia

    DOE PAGES

    Heirendt, Laurent; Thiele, Ines; Fleming, Ronan M. T.

    2017-01-16

    Flux balance analysis and its variants are widely used methods for predicting steady-state reaction rates in biochemical reaction networks. The exploration of high dimensional networks with such methods is currently hampered by software performance limitations. DistributedFBA.jl is a high-level, high-performance, open-source implementation of flux balance analysis in Julia. It is tailored to solve multiple flux balance analyses on a subset or all the reactions of large and huge-scale networks, on any number of threads or nodes.

  14. Large-scale P2P network based distributed virtual geographic environment (DVGE)

    NASA Astrophysics Data System (ADS)

    Tan, Xicheng; Yu, Liang; Bian, Fuling

    2007-06-01

    The virtual geographic environment (VGE) has attracted broad attention as a kind of software information system that helps us understand and analyze the real geographic environment, and it has expanded into an application service system for distributed environments, the distributed virtual geographic environment (DVGE), with some early success. However, constrained by the massive data volumes of VGE, limited network bandwidth, heavy request loads, and economic factors, DVGE still faces challenges and problems that prevent it from providing the public with high-quality service under the current network model. The rapid development of peer-to-peer (P2P) network technology offers new solutions to these challenges: P2P networks can effectively publish and search network resources, enabling efficient information sharing. Accordingly, this paper proposes a large-scale P2P extension of DVGE and presents a detailed study of its network framework, routing mechanism, and data management on a P2P network.

  15. Exploiting multi-scale parallelism for large scale numerical modelling of laser wakefield accelerators

    NASA Astrophysics Data System (ADS)

    Fonseca, R. A.; Vieira, J.; Fiuza, F.; Davidson, A.; Tsung, F. S.; Mori, W. B.; Silva, L. O.

    2013-12-01

    A new generation of laser wakefield accelerators (LWFA), supported by the extreme accelerating fields generated in the interaction of PW-class lasers and underdense targets, promises the production of high quality electron beams in short distances for multiple applications. Achieving this goal will rely heavily on numerical modelling to further understand the underlying physics and identify optimal regimes, but large scale modelling of these scenarios is computationally heavy and requires the efficient use of state-of-the-art petascale supercomputing systems. We discuss the main difficulties involved in running these simulations and the new developments implemented in the OSIRIS framework to address these issues, ranging from multi-dimensional dynamic load balancing and hybrid distributed/shared memory parallelism to the vectorization of the PIC algorithm. We present the results of the OASCR Joule Metric program on the issue of large scale modelling of LWFA, demonstrating speedups of over 1 order of magnitude on the same hardware. Finally, scalability to over ~10^6 cores and sustained performance of over ~2 PFlops are demonstrated, opening the way for large scale modelling of LWFA scenarios.

  16. Utilization of Large Scale Surface Models for Detailed Visibility Analyses

    NASA Astrophysics Data System (ADS)

    Caha, J.; Kačmařík, M.

    2017-11-01

    This article demonstrates the utilization of large scale surface models with small spatial resolution and high accuracy, acquired from Unmanned Aerial Vehicle scanning, for visibility analyses. The importance of large scale data for visibility analyses on the local scale, where the detail of the surface model is the most defining factor, is described. The focus is not only on classic Boolean visibility, as usually determined within GIS, but also on so-called extended viewsheds that aim to provide more information about visibility. The case study with examples of visibility analyses was performed on the river Opava, near the city of Ostrava (Czech Republic). The multiple Boolean viewshed analysis and the global horizon viewshed were calculated to determine the most prominent features and visibility barriers of the surface. Besides that, an extended viewshed showing the angle difference above the local horizon, which describes the angular height of the target area above a barrier, is presented. The case study proved that large scale models are an appropriate data source for visibility analyses at the local level. The discussion summarizes possible future applications and further development directions of visibility analyses.
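
    A minimal sketch of the underlying Boolean visibility test, assuming a rasterized digital surface model and nearest-neighbour ray sampling: a target cell is visible if no intermediate cell subtends a larger elevation angle than the target itself. The extended viewsheds discussed above build on the same elevation-angle bookkeeping.

        import numpy as np

        def visible(dsm, observer, target, eye_height=1.7):
            """Boolean line-of-sight from `observer` to `target` on a DSM grid."""
            (r0, c0), (r1, c1) = observer, target
            z0 = dsm[r0, c0] + eye_height
            n = max(abs(r1 - r0), abs(c1 - c0))
            horizon = -np.inf  # steepest elevation angle seen along the ray
            for i in range(1, n + 1):
                r = round(r0 + (r1 - r0) * i / n)  # nearest-neighbour sampling
                c = round(c0 + (c1 - c0) * i / n)
                angle = (dsm[r, c] - z0) / np.hypot(r - r0, c - c0)
                if i == n:               # reached the target cell
                    return angle >= horizon
                horizon = max(horizon, angle)
            return True  # target coincides with the observer

        dsm = np.array([[2.0, 2.0, 2.0, 2.0],
                        [2.0, 2.0, 5.0, 2.0],
                        [2.0, 2.0, 2.0, 2.0]])
        print(visible(dsm, (1, 0), (1, 3)))  # False: the 5 m obstacle blocks the ray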

  17. Weak gravitational lensing due to large-scale structure of the universe

    NASA Technical Reports Server (NTRS)

    Jaroszynski, Michal; Park, Changbom; Paczynski, Bohdan; Gott, J. Richard, III

    1990-01-01

    The effect of the large-scale structure of the universe on the propagation of light rays is studied. The development of the large-scale density fluctuations in the omega = 1 universe is calculated within the cold dark matter scenario using a smooth particle approximation. The propagation of about 10^6 random light rays between the redshift z = 5 and the observer was followed. It is found that the effect of shear is negligible, and the amplification of single images is dominated by the matter in the beam. The spread of amplifications is very small. Therefore, the filled-beam approximation is very good for studies of strong lensing by galaxies or clusters of galaxies. In the simulation, the column density was averaged over a comoving area of approximately (1/h Mpc)^2. No case of strong gravitational lensing was found, i.e., no 'over-focused' image that would suggest that a few images might be present. Therefore, the large-scale structure of the universe as it is presently known does not produce multiple images with gravitational lensing on a scale larger than clusters of galaxies.

  18. Using occupancy estimation to assess the effectiveness of a regional multiple-species conservation plan: bats in the Pacific Northwest

    Treesearch

    Theodore Weller

    2008-01-01

    Regional conservation plans are increasingly used to plan for and protect biodiversity at large spatial scales; however, the means of quantitatively evaluating their effectiveness are rarely specified. Multiple-species approaches, particularly those that employ site-occupancy estimation, have been proposed as robust and efficient alternatives for assessing the status of...

  19. Item Response Theory with Covariates (IRT-C): Assessing Item Recovery and Differential Item Functioning for the Three-Parameter Logistic Model

    ERIC Educational Resources Information Center

    Tay, Louis; Huang, Qiming; Vermunt, Jeroen K.

    2016-01-01

    In large-scale testing, the use of multigroup approaches is limited for assessing differential item functioning (DIF) across multiple variables as DIF is examined for each variable separately. In contrast, the item response theory with covariate (IRT-C) procedure can be used to examine DIF across multiple variables (covariates) simultaneously. To…
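
    For reference, the three-parameter logistic model named in the title gives the probability of a correct response to item i as a function of ability θ; some treatments include a scaling constant D ≈ 1.7 in the exponent.

        % 3PL model: discrimination a_i, difficulty b_i, and
        % pseudo-guessing (lower asymptote) c_i.
        P_i(\theta) = c_i + \frac{1 - c_i}{1 + e^{-a_i(\theta - b_i)}}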

  20. Organizational Communication in Emergencies: Using Multiple Channels and Sources to Combat Noise and Capture Attention

    ERIC Educational Resources Information Center

    Stephens, Keri K.; Barrett, Ashley K.; Mahometa, Michael J.

    2013-01-01

    This study relies on information theory, social presence, and source credibility to uncover what best helps people grasp the urgency of an emergency. We surveyed a random sample of 1,318 organizational members who received multiple notifications about a large-scale emergency. We found that people who received 3 redundant messages coming through at…

  1. Designing a Stage-Sensitive Written Assessment of Elementary Students' Scheme for Multiplicative Reasoning

    ERIC Educational Resources Information Center

    Hodkowski, Nicola M.; Gardner, Amber; Jorgensen, Cody; Hornbein, Peter; Johnson, Heather L.; Tzur, Ron

    2016-01-01

    In this paper we examine the application of Tzur's (2007) fine-grained assessment to the design of an assessment measure of a particular multiplicative scheme so that non-interview, 'good enough' data can be obtained (on a large scale) to infer elementary students' reasoning. We outline three design principles that surfaced through our recent…

  2. The Development of Multiple-Choice Items Consistent with the AP Chemistry Curriculum Framework to More Accurately Assess Deeper Understanding

    ERIC Educational Resources Information Center

    Domyancich, John M.

    2014-01-01

    Multiple-choice questions are an important part of large-scale summative assessments, such as the advanced placement (AP) chemistry exam. However, past AP chemistry exam items often lacked the ability to test conceptual understanding and higher-order cognitive skills. The redesigned AP chemistry exam shows a distinctive shift in item types toward…

  3. Conceptual Design and Dynamics Testing and Modeling of a Mars Tumbleweed Rover

    NASA Technical Reports Server (NTRS)

    Calhoun, Philip C.; Harris, Steven B.; Raiszadeh, Behzad; Zaleski, Kristina D.

    2005-01-01

    The NASA Langley Research Center has been developing a novel concept for a Mars planetary rover called the Mars Tumbleweed. This concept utilizes the wind to propel the rover along the Mars surface, giving it the potential to cover vast distances not possible with current Mars rover technology. This vehicle, in its deployed configuration, must be large and lightweight to provide the ratio of drag force to rolling resistance necessary to initiate motion from rest on the Mars surface. One Tumbleweed design concept that satisfies these considerations is called the Eggbeater-Dandelion. This paper describes the basic design considerations and a proposed dynamics model of the concept for use in simulation studies. It includes a summary of rolling/bouncing dynamics tests that used videogrammetry to better understand, characterize, and validate the dynamics model assumptions, especially the effective rolling resistance in bouncing/rolling dynamic conditions. The dynamics test used cameras to capture the motion of 32 targets affixed to a test article's outer structure. Proper placement of the cameras and alignment of their respective fields of view provided adequate image resolution of multiple targets along the trajectory as the test article proceeded down the ramp. Image processing of the frames from multiple cameras was used to determine the target positions. Position data from a set of these test runs was compared with results of a three-dimensional, flexible dynamics model. Model input parameters were adjusted to match the test data for the runs conducted. The process presented herein provided the means to characterize the dynamics and validate the simulation of the Eggbeater-Dandelion concept. The simulation model was used to demonstrate full-scale Tumbleweed motion from a stationary condition on a flat, sloped terrain using representative Mars environment parameters.

  4. Polychaete richness and abundance enhanced in anthropogenically modified estuaries despite high concentrations of toxic contaminants.

    PubMed

    Dafforn, Katherine A; Kelaher, Brendan P; Simpson, Stuart L; Coleman, Melinda A; Hutchings, Pat A; Clark, Graeme F; Knott, Nathan A; Doblin, Martina A; Johnston, Emma L

    2013-01-01

    Ecological communities are increasingly exposed to multiple chemical and physical stressors, but distinguishing anthropogenic impacts from other environmental drivers remains challenging. Rarely are multiple stressors investigated in replicated studies over large spatial scales (>1000 km) or supported with manipulations that are necessary to interpret ecological patterns. We measured the composition of sediment infaunal communities in relation to anthropogenic and natural stressors at multiple sites within seven estuaries. We observed increases in the richness and abundance of polychaete worms in heavily modified estuaries with severe metal contamination, but no changes in the diversity or abundance of other taxa. Estuaries in which toxic contaminants were elevated also showed evidence of organic enrichment. We hypothesised that the observed response of polychaetes was not a 'positive' response to toxic contamination or a reduction in biotic competition, but due to high levels of nutrients in heavily modified estuaries driving productivity in the water column and enriching the sediment over large spatial scales. We deployed defaunated field-collected sediments from the surveyed estuaries in a small scale experiment, but observed no effects of sediment characteristics (toxic or enriching). Furthermore, invertebrate recruitment instead reflected the low diversity and abundance observed during field surveys of this relatively 'pristine' estuary. This suggests that differences observed in the survey are not a direct consequence of sediment characteristics (even severe metal contamination) but are related to parameters that covary with estuary modification such as enhanced productivity from nutrient inputs and the diversity of the local species pool. This has implications for the interpretation of diversity measures in large-scale monitoring studies in which the observed patterns may be strongly influenced by many factors that covary with anthropogenic modification.

  5. Polychaete Richness and Abundance Enhanced in Anthropogenically Modified Estuaries Despite High Concentrations of Toxic Contaminants

    PubMed Central

    Dafforn, Katherine A.; Kelaher, Brendan P.; Simpson, Stuart L.; Coleman, Melinda A.; Hutchings, Pat A.; Clark, Graeme F.; Knott, Nathan A.; Doblin, Martina A.; Johnston, Emma L.

    2013-01-01

    Ecological communities are increasingly exposed to multiple chemical and physical stressors, but distinguishing anthropogenic impacts from other environmental drivers remains challenging. Rarely are multiple stressors investigated in replicated studies over large spatial scales (>1000 km) or supported with manipulations that are necessary to interpret ecological patterns. We measured the composition of sediment infaunal communities in relation to anthropogenic and natural stressors at multiple sites within seven estuaries. We observed increases in the richness and abundance of polychaete worms in heavily modified estuaries with severe metal contamination, but no changes in the diversity or abundance of other taxa. Estuaries in which toxic contaminants were elevated also showed evidence of organic enrichment. We hypothesised that the observed response of polychaetes was not a ‘positive’ response to toxic contamination or a reduction in biotic competition, but due to high levels of nutrients in heavily modified estuaries driving productivity in the water column and enriching the sediment over large spatial scales. We deployed defaunated field-collected sediments from the surveyed estuaries in a small scale experiment, but observed no effects of sediment characteristics (toxic or enriching). Furthermore, invertebrate recruitment instead reflected the low diversity and abundance observed during field surveys of this relatively ‘pristine’ estuary. This suggests that differences observed in the survey are not a direct consequence of sediment characteristics (even severe metal contamination) but are related to parameters that covary with estuary modification such as enhanced productivity from nutrient inputs and the diversity of the local species pool. This has implications for the interpretation of diversity measures in large-scale monitoring studies in which the observed patterns may be strongly influenced by many factors that covary with anthropogenic modification. PMID:24098816

  6. Cooperation without Culture? The Null Effect of Generalized Trust on Intentional Homicide: A Cross-National Panel Analysis, 1995–2009

    PubMed Central

    Robbins, Blaine

    2013-01-01

    Sociologists, political scientists, and economists all suggest that culture plays a pivotal role in the development of large-scale cooperation. In this study, I used generalized trust as a measure of culture to explore if and how culture impacts intentional homicide, my operationalization of cooperation. I compiled multiple cross-national data sets and used pooled time-series linear regression, single-equation instrumental-variables linear regression, and fixed- and random-effects estimation techniques on an unbalanced panel of 118 countries and 232 observations spread over a 15-year time period. Results suggest that culture and large-scale cooperation form a tenuous relationship, while economic factors such as development, inequality, and geopolitics appear to drive large-scale cooperation. PMID:23527211

  7. Large Scale GW Calculations on the Cori System

    NASA Astrophysics Data System (ADS)

    Deslippe, Jack; Del Ben, Mauro; da Jornada, Felipe; Canning, Andrew; Louie, Steven

    The NERSC Cori system, powered by 9000+ Intel Xeon-Phi processors, represents one of the largest HPC systems for open-science in the United States and the world. We discuss the optimization of the GW methodology for this system, including both node level and system-scale optimizations. We highlight multiple large scale (thousands of atoms) case studies and discuss both absolute application performance and comparison to calculations on more traditional HPC architectures. We find that the GW method is particularly well suited for many-core architectures due to the ability to exploit a large amount of parallelism across many layers of the system. This work was supported by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Materials Sciences and Engineering Division, as part of the Computational Materials Sciences Program.

  8. Lagrangian space consistency relation for large scale structure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horn, Bart; Hui, Lam; Xiao, Xiao

    Consistency relations, which relate the squeezed limit of an (N+1)-point correlation function to an N-point function, are non-perturbative symmetry statements that hold even if the associated high momentum modes are deep in the nonlinear regime and astrophysically complex. Recently, Kehagias & Riotto and Peloso & Pietroni discovered a consistency relation applicable to large scale structure. We show that this can be recast into a simple physical statement in Lagrangian space: that the squeezed correlation function (suitably normalized) vanishes. This holds regardless of whether the correlation observables are at the same time or not, and regardless of whether multiple-streaming is present. Furthermore, the simplicity of this statement suggests that an analytic understanding of large scale structure in the nonlinear regime may be particularly promising in Lagrangian space.

  9. Lagrangian space consistency relation for large scale structure

    DOE PAGES

    Horn, Bart; Hui, Lam; Xiao, Xiao

    2015-09-29

    Consistency relations, which relate the squeezed limit of an (N+1)-point correlation function to an N-point function, are non-perturbative symmetry statements that hold even if the associated high momentum modes are deep in the nonlinear regime and astrophysically complex. Recently, Kehagias & Riotto and Peloso & Pietroni discovered a consistency relation applicable to large scale structure. We show that this can be recast into a simple physical statement in Lagrangian space: that the squeezed correlation function (suitably normalized) vanishes. This holds regardless of whether the correlation observables are at the same time or not, and regardless of whether multiple-streaming is present. Furthermore, the simplicity of this statement suggests that an analytic understanding of large scale structure in the nonlinear regime may be particularly promising in Lagrangian space.
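
    In Eulerian space, the relation discovered by Kehagias & Riotto and Peloso & Pietroni takes, schematically, the following unequal-time form (conventions vary between references; primes denote correlators with the overall momentum-conserving delta function stripped off):

        \lim_{q \to 0}
          \langle \delta_{\mathbf{q}}(\tau)\, \delta_{\mathbf{k}_1}(\tau_1)
                  \cdots \delta_{\mathbf{k}_N}(\tau_N) \rangle'
          = -P_\delta(q,\tau) \sum_{i=1}^{N} \frac{D(\tau_i)}{D(\tau)}\,
            \frac{\mathbf{k}_i \cdot \mathbf{q}}{q^2}\,
          \langle \delta_{\mathbf{k}_1}(\tau_1) \cdots
                  \delta_{\mathbf{k}_N}(\tau_N) \rangle'

    At equal times the growth factors cancel and momentum conservation removes the leading 1/q contribution, which is the triviality that the Lagrangian-space formulation above makes manifest.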

  10. Testing the equivalence principle on cosmological scales

    NASA Astrophysics Data System (ADS)

    Bonvin, Camille; Fleury, Pierre

    2018-05-01

    The equivalence principle, which is one of the main pillars of general relativity, is very well tested in the Solar system; however, its validity is more uncertain on cosmological scales, or when dark matter is concerned. This article shows that relativistic effects in the large-scale structure can be used to directly test whether dark matter satisfies Euler's equation, i.e. whether its free fall is characterised by geodesic motion, just like baryons and light. After having proposed a general parametrisation for deviations from Euler's equation, we perform Fisher-matrix forecasts for future surveys like DESI and the SKA, and show that such deviations can be constrained with a precision of order 10%. Deviations from Euler's equation cannot be tested directly with standard methods like redshift-space distortions and gravitational lensing, since these observables are not sensitive to the time component of the metric. Our analysis shows therefore that relativistic effects bring new and complementary constraints to alternative theories of gravity.

  11. Curvature, metric and parametrization of origami tessellations: theory and application to the eggbox pattern.

    PubMed

    Nassar, H; Lebée, A; Monasse, L

    2017-01-01

    Origami tessellations are particular textured morphing shell structures. Their unique folding and unfolding mechanisms on a local scale aggregate and bring on large changes in shape, curvature and elongation on a global scale. The existence of these global deformation modes allows for origami tessellations to fit non-trivial surfaces thus inspiring applications across a wide range of domains including structural engineering, architectural design and aerospace engineering. The present paper suggests a homogenization-type two-scale asymptotic method which, combined with standard tools from differential geometry of surfaces, yields a macroscopic continuous characterization of the global deformation modes of origami tessellations and other similar periodic pin-jointed trusses. The outcome of the method is a set of nonlinear differential equations governing the parametrization, metric and curvature of surfaces that the initially discrete structure can fit. The theory is presented through a case study of a fairly generic example: the eggbox pattern. The proposed continuous model predicts correctly the existence of various fittings that are subsequently constructed and illustrated.

  12. Curvature, metric and parametrization of origami tessellations: theory and application to the eggbox pattern

    NASA Astrophysics Data System (ADS)

    Nassar, H.; Lebée, A.; Monasse, L.

    2017-01-01

    Origami tessellations are particular textured morphing shell structures. Their unique folding and unfolding mechanisms on a local scale aggregate and bring on large changes in shape, curvature and elongation on a global scale. The existence of these global deformation modes allows for origami tessellations to fit non-trivial surfaces thus inspiring applications across a wide range of domains including structural engineering, architectural design and aerospace engineering. The present paper suggests a homogenization-type two-scale asymptotic method which, combined with standard tools from differential geometry of surfaces, yields a macroscopic continuous characterization of the global deformation modes of origami tessellations and other similar periodic pin-jointed trusses. The outcome of the method is a set of nonlinear differential equations governing the parametrization, metric and curvature of surfaces that the initially discrete structure can fit. The theory is presented through a case study of a fairly generic example: the eggbox pattern. The proposed continuous model predicts correctly the existence of various fittings that are subsequently constructed and illustrated.

  13. Multiple Intelligences or Multiply Misleading: The Critic's View of the Multiple Intelligences Theory

    ERIC Educational Resources Information Center

    Peariso, Jamon F.

    2008-01-01

    Howard Gardner's Multiple Intelligences (MI) theory has been widely accepted in the field of education for the past two decades. Most educators have been subjected to the MI theory and to the many issues that its implementation in the classroom brings. This is often done without ever looking at or being presented the critic's view or research on…

  14. MANGO Imager Network Observations of Geomagnetic Storm Impact on Midlatitude 630 nm Airglow Emissions

    NASA Astrophysics Data System (ADS)

    Kendall, E. A.; Bhatt, A.

    2017-12-01

    The Midlatitude Allsky-imaging Network for GeoSpace Observations (MANGO) is a network of imagers filtered at 630 nm spread across the continental United States. MANGO is used to image large-scale airglow and aurora features and observes the generation, propagation, and dissipation of medium- and large-scale wave activity in the subauroral, mid- and low-latitude thermosphere. The network consists of seven all-sky imagers providing continuous coverage over the United States and extending south into Mexico, and it sees high levels of medium- and large-scale wave activity due to both neutral and geomagnetic storm forcing. The geomagnetic storm observations largely fall into two categories: Stable Auroral Red (SAR) arcs and large-scale traveling ionospheric disturbances (LSTIDs). In addition, less often observed effects include anomalous airglow brightening, bright swirls, and frozen-in traveling structures. We will present an analysis of multiple events observed over four years of MANGO network operation. We will provide both statistics on the cumulative observations and a case study of the "Memorial Day Storm" on May 27, 2017.

  15. SLIDE - a web-based tool for interactive visualization of large-scale -omics data.

    PubMed

    Ghosh, Soumita; Datta, Abhik; Tan, Kaisen; Choi, Hyungwon

    2018-06-28

    Data visualization is often regarded as a post hoc step for verifying statistically significant results in the analysis of high-throughput data sets. This common practice leaves a large amount of raw data behind, from which more information can be extracted. However, existing solutions do not provide capabilities to explore large-scale raw datasets using biologically sensible queries, nor do they allow real-time customization of graphics based on user interaction. To address these drawbacks, we have designed an open-source, web-based tool called Systems-Level Interactive Data Exploration, or SLIDE, to visualize large-scale -omics data interactively. SLIDE's interface makes it easier for scientists to explore quantitative expression data at multiple resolutions on a single screen. SLIDE is publicly available under the BSD license both as an online version and as a stand-alone version at https://github.com/soumitag/SLIDE. Supplementary information is available at Bioinformatics online.

  16. Tropical warming and the dynamics of endangered primates.

    PubMed

    Wiederholt, Ruscena; Post, Eric

    2010-04-23

    Many primate species are severely threatened, but little is known about the effects of global warming and the associated intensification of El Niño events on primate populations. Here, we document the influences of the El Niño southern oscillation (ENSO) and hemispheric climatic variability on the population dynamics of four genera of ateline (neotropical, large-bodied) primates. All ateline genera experienced either an immediate or a lagged negative effect of El Niño events. ENSO events were also found to influence primate resource levels through neotropical arboreal phenology. Furthermore, frugivorous primates showed a high degree of interspecific population synchrony over large scales across Central and South America attributable to the recent trends in large-scale climate. These results highlight the role of large-scale climatic variation and trends in ateline primate population dynamics, and emphasize that global warming could pose additional threats to the persistence of multiple species of endangered primates.

  17. HSTDEK: Developing a methodology for construction of large-scale, multi-use knowledge bases

    NASA Technical Reports Server (NTRS)

    Freeman, Michael S.

    1987-01-01

    The primary research objectives of the Hubble Space Telescope Design/Engineering Knowledgebase (HSTDEK) are to develop a methodology for constructing and maintaining large scale knowledge bases which can be used to support multiple applications. To ensure the validity of its results, this research is being pursued in the context of a real-world system, the Hubble Space Telescope. The HSTDEK objectives are described in detail. The history and motivation of the project are briefly described. The technical challenges faced by the project are outlined.

  18. Inventory and analysis of natural vegetation and related resources from space and high altitude photography

    NASA Technical Reports Server (NTRS)

    Poulton, C. E.

    1972-01-01

    A multiple sampling technique was developed whereby spacecraft photographs supported by aircraft photographs could be used to quantify plant communities. Large scale (1:600 to 1:2,400) color infrared aerial photographs were required to identify shrub and herbaceous species. These photos were used to successfully estimate a herbaceous standing crop biomass. Microdensitometry was used to discriminate among specific plant communities and individual plant species. Large scale infrared photography was also used to estimate mule deer deaths and population density of northern pocket gophers.

  19. Visualization of the Eastern Renewable Generation Integration Study: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gruchalla, Kenny; Novacheck, Joshua; Bloom, Aaron

    The Eastern Renewable Generation Integration Study (ERGIS) explores the operational impacts of the widespread adoption of wind and solar photovoltaic (PV) resources in the U.S. Eastern Interconnection and Quebec Interconnection (collectively, EI). In order to understand some of the economic and reliability challenges of managing hundreds of gigawatts of wind and PV generation, we developed state-of-the-art tools, data, and models for simulating power system operations using hourly unit commitment and 5-minute economic dispatch over an entire year. Using NREL's high-performance computing capabilities and new methodologies to model operations, we found that the EI, as simulated with evolutionary change in 2026, could balance the variability and uncertainty of wind and PV at a 5-minute level under a variety of conditions. A large-scale display and a combination of multiple coordinated views and small multiples were used to visually analyze the four large, highly multivariate scenarios with high spatial and temporal resolutions.

  20. NASA Applications of Molecular Nanotechnology

    NASA Technical Reports Server (NTRS)

    Globus, Al; Bailey, David; Han, Jie; Jaffe, Richard; Levit, Creon; Merkle, Ralph; Srivastava, Deepak

    1998-01-01

    Laboratories throughout the world are rapidly gaining atomically precise control over matter. As this control extends to an ever wider variety of materials, processes and devices, opportunities for applications relevant to NASA's missions will be created. This document surveys a number of future molecular nanotechnology capabilities of aerospace interest. Computer applications, launch vehicle improvements, and active materials appear to be of particular interest. We also list a number of applications for each of NASA's enterprises. If advanced molecular nanotechnology can be developed, almost all of NASA's endeavors will be radically improved. In particular, a sufficiently advanced molecular nanotechnology can arguably bring large scale space colonization within our grasp.

  1. Nanotechnology in vascular tissue engineering: from nanoscaffolding towards rapid vessel biofabrication.

    PubMed

    Mironov, Vladimir; Kasyanov, Vladimir; Markwald, Roger R

    2008-06-01

    The existing methods of biofabrication for vascular tissue engineering are still bioreactor-based, extremely expensive, laborious and time consuming; furthermore, they are not automated, although automation would be essential for an economically successful large-scale commercialization. The advances in nanotechnology can bring additional functionality to vascular scaffolds, optimize the internal vascular graft surface and even help to direct the differentiation of stem cells into the vascular cell phenotype. The development of rapid nanotechnology-based methods of vascular tissue biofabrication represents one of the most important recent technological breakthroughs in vascular tissue engineering because it dramatically accelerates vascular tissue assembly and, importantly, also eliminates the need for a bioreactor-based scaffold cellularization process.

  2. What does physics have to do with cancer?

    PubMed Central

    Michor, Franziska; Liphardt, Jan; Ferrari, Mauro; Widom, Jonathan

    2013-01-01

    Large-scale cancer genomics, proteomics and RNA-sequencing efforts are currently mapping in fine detail the genetic and biochemical alterations that occur in cancer. However, it is becoming clear that it is difficult to integrate and interpret these data and to translate them into treatments. This difficulty is compounded by the recognition that cancer cells evolve, and that initiation, progression and metastasis are influenced by a wide variety of factors. To help tackle this challenge, the US National Cancer Institute Physical Sciences-Oncology Centers initiative is bringing together physicists, cancer biologists, chemists, mathematicians and engineers. How are we beginning to address cancer from the perspective of the physical sciences? PMID:21850037

  3. RR Lyrae stars and the horizontal branch of NGC 5904 (M5)

    NASA Astrophysics Data System (ADS)

    Arellano Ferro, A.; Luna, A.; Bramich, D. M.; Giridhar, Sunetra; Ahumada, J. A.; Muneer, S.

    2016-05-01

    We report the distance and [Fe/H] value for the globular cluster NGC 5904 (M5) derived from the Fourier decomposition of the light curves of selected RRab and RRc stars. The aim in doing this was to bring these parameters into the homogeneous scales established by our previous work on numerous other globular clusters, allowing a direct comparison of the horizontal branch luminosity in clusters with a wide range of metallicities. Our CCD photometry of the large variable star population of this cluster is used to discuss light curve peculiarities, like Blazhko modulations, on an individual basis. New Blazhko variables are reported.

  4. No Computer Left Behind

    ERIC Educational Resources Information Center

    Cohen, Daniel J.; Rosenzweig, Roy

    2006-01-01

    The combination of the Web and the cell phone forecasts the end of the inexpensive technologies of multiple-choice tests and grading machines. These technological developments are likely to bring the multiple-choice test to the verge of obsolescence, mounting a substantial challenge to the presentation of history and other disciplines.

  5. Mach Number effects on turbulent superstructures in wall bounded flows

    NASA Astrophysics Data System (ADS)

    Kaehler, Christian J.; Bross, Matthew; Scharnowski, Sven

    2017-11-01

    Planar and three-dimensional flow field measurements along a flat-plate boundary layer in the Trisonic Wind Tunnel Munich (TWM) are examined with the aim to characterize the scaling, spatial organization, and topology of large scale turbulent superstructures in compressible flow. This facility is ideal for this investigation as the ratio of boundary layer thickness to test section spanwise extent is around 1/25, ensuring minimal sidewall and corner effects on turbulent structures in the center of the test section. A major difficulty in the experimental investigation of large scale features is the sheer size of the superstructures, which can extend over many boundary layer thicknesses. Using multiple PIV systems, it was possible to capture the full spatial extent of large-scale structures over a range of Mach numbers from Ma = 0.3 - 3. To calculate the average large-scale structure length and spacing, the acquired vector fields were analyzed by statistical multi-point methods that show large scale structures with a correlation length of around 10 boundary layer thicknesses over the range of Mach numbers investigated. Furthermore, the average spacing between high and low momentum structures is on the order of a boundary layer thickness. This work is supported by the Priority Programme SPP 1881 Turbulent Superstructures of the Deutsche Forschungsgemeinschaft.

  6. A neuromorphic implementation of multiple spike-timing synaptic plasticity rules for large-scale neural networks

    PubMed Central

    Wang, Runchun M.; Hamilton, Tara J.; Tapson, Jonathan C.; van Schaik, André

    2015-01-01

    We present a neuromorphic implementation of multiple synaptic plasticity learning rules, which include both Spike Timing Dependent Plasticity (STDP) and Spike Timing Dependent Delay Plasticity (STDDP). We present a fully digital implementation as well as a mixed-signal implementation, both of which use a novel dynamic-assignment time-multiplexing approach and support up to 2^26 (64M) synaptic plasticity elements. Rather than implementing dedicated synapses for particular types of synaptic plasticity, we implemented a more generic synaptic plasticity adaptor array that is separate from the neurons in the neural network. Each adaptor performs synaptic plasticity according to the arrival times of the pre- and post-synaptic spikes assigned to it, and sends out a weighted or delayed pre-synaptic spike to the post-synaptic neuron in the neural network. This strategy provides great flexibility for building complex large-scale neural networks, as a neural network can be configured for multiple synaptic plasticity rules without changing its structure. We validate the proposed neuromorphic implementations with measurement results and illustrate that the circuits are capable of performing both STDP and STDDP. We argue that it is practical to scale the work presented here up to 2^36 (64G) synaptic adaptors on a current high-end FPGA platform. PMID:26041985
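
    A toy pair-based STDP rule of the kind the adaptor array implements: pre-before-post potentiates and post-before-pre depresses, both decaying exponentially with the spike-time gap. The amplitudes and time constant below are illustrative defaults, not the paper's hardware parameters.

        import numpy as np

        A_PLUS, A_MINUS, TAU = 0.01, 0.012, 20.0  # amplitudes; tau in ms

        def stdp_dw(t_pre, t_post):
            """Weight change for a single pre/post spike pair (times in ms)."""
            dt = t_post - t_pre
            if dt > 0:
                return A_PLUS * np.exp(-dt / TAU)   # potentiation
            if dt < 0:
                return -A_MINUS * np.exp(dt / TAU)  # depression
            return 0.0

        w = 0.5
        for t_pre, t_post in [(10.0, 15.0), (40.0, 35.0)]:
            w = float(np.clip(w + stdp_dw(t_pre, t_post), 0.0, 1.0))
            print(f"pair ({t_pre}, {t_post}) ms -> w = {w:.4f}")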

  7. The 2013 Dade W. Moeller Lecture: Medical Countermeasures Against Radiological Terrorism

    PubMed Central

    Moulder, John E.

    2014-01-01

    Soon after the 9–11 attacks, politicians and scientists began to question our ability to cope with a large-scale radiological terrorism incident. The outline of what was needed was fairly obvious: the ability to prevent such an attack; methods to cope with the medical consequences; the ability to clean up afterwards; and the tools to figure out who perpetrated the attack and bring them to justice. The medical response needed three components: the technology to rapidly determine the radiation doses received by a large number of people, methods for alleviating acute hematological radiation injuries, and therapies for mitigation and treatment of chronic radiation injuries. Research done to date has shown that a realistic medical response plan is scientifically possible, but the regulatory and financial barriers to achieving this may currently be insurmountable. PMID:24978287

  8. The Spanish royal philanthropic expedition to bring smallpox vaccination to the New World and Asia in the 19th century.

    PubMed

    Franco-Paredes, Carlos; Lammoglia, Lorena; Santos-Preciado, José Ignacio

    2005-11-01

    The New World was ravaged by smallpox for several centuries after the Spanish conquest. Jenner's discovery of the smallpox vaccine made possible the prevention and control of smallpox epidemics. In response to a large outbreak of smallpox in the Spanish colonies, King Charles IV appointed Francisco Xavier de Balmis to lead an expedition that would introduce Jenner's vaccine to these colonies. During the journey, the vaccine was kept viable by passing it from arm to arm in orphaned children, who were brought along expressly for that purpose and remained under the care of the orphanage's director. This expedition was the first large scale mass vaccination of its kind. The historic legacy of this pioneering event in international health should be revisited in the current era of persistent inequalities in global health.

  9. Cascade-based attacks on complex networks

    NASA Astrophysics Data System (ADS)

    Motter, Adilson E.; Lai, Ying-Cheng

    2002-12-01

    We live in a modern world supported by large, complex networks. Examples range from financial markets to communication and transportation systems. In many realistic situations the flow of physical quantities in the network, as characterized by the loads on nodes, is important. We show that for such networks where loads can redistribute among the nodes, intentional attacks can lead to a cascade of overload failures, which can in turn cause the entire network, or a substantial part of it, to collapse. This is relevant for real-world networks that possess a highly heterogeneous distribution of loads, such as the Internet and power grids. We demonstrate that the heterogeneity of these networks makes them particularly vulnerable to attacks, in that a large-scale cascade may be triggered by disabling a single key node. This raises obvious concerns about the security of such systems.
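
    The load-redistribution mechanism described above (the Motter-Lai model) can be sketched with networkx: node load is taken as betweenness centrality, capacity is (1 + alpha) times the initial load, and removing a single key node triggers successive overload failures. The parameters and random graph are illustrative.

        import networkx as nx

        def cascade(G, removed, alpha=0.2):
            """Remove `removed`, then iteratively fail overloaded nodes."""
            capacity = {n: (1 + alpha) * load
                        for n, load in nx.betweenness_centrality(G).items()}
            H = G.copy()
            H.remove_node(removed)
            while True:
                load = nx.betweenness_centrality(H)
                failed = [n for n in H if load[n] > capacity[n]]
                if not failed:
                    return H
                H.remove_nodes_from(failed)

        G = nx.barabasi_albert_graph(200, 2, seed=1)  # heterogeneous network
        hub = max(G.degree, key=lambda d: d[1])[0]    # highest-degree key node
        survivors = cascade(G, hub)
        print(f"{survivors.number_of_nodes()} of {G.number_of_nodes()} nodes survive")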

  10. An algorithm for continuum modeling of rocks with multiple embedded nonlinearly-compliant joints [Continuum modeling of elasto-plastic media with multiple embedded nonlinearly-compliant joints

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hurley, R. C.; Vorobiev, O. Y.; Ezzedine, S. M.

    Here, we present a numerical method for modeling the mechanical effects of nonlinearly-compliant joints in elasto-plastic media. The method uses a series of strain-rate and stress update algorithms to determine joint closure, slip, and solid stress within computational cells containing multiple “embedded” joints. This work facilitates efficient modeling of nonlinear wave propagation in large spatial domains containing a large number of joints that affect bulk mechanical properties. We implement the method within the massively parallel Lagrangian code GEODYN-L and provide verification and examples. We highlight the ability of our algorithms to capture joint interactions and multiple weakness planes within individual computational cells, as well as its computational efficiency. We also discuss the motivation for developing the proposed technique: to simulate large-scale wave propagation during the Source Physics Experiments (SPE), a series of underground explosions conducted at the Nevada National Security Site (NNSS).

  11. An algorithm for continuum modeling of rocks with multiple embedded nonlinearly-compliant joints [Continuum modeling of elasto-plastic media with multiple embedded nonlinearly-compliant joints

    DOE PAGES

    Hurley, R. C.; Vorobiev, O. Y.; Ezzedine, S. M.

    2017-04-06

    Here, we present a numerical method for modeling the mechanical effects of nonlinearly-compliant joints in elasto-plastic media. The method uses a series of strain-rate and stress update algorithms to determine joint closure, slip, and solid stress within computational cells containing multiple “embedded” joints. This work facilitates efficient modeling of nonlinear wave propagation in large spatial domains containing a large number of joints that affect bulk mechanical properties. We implement the method within the massively parallel Lagrangian code GEODYN-L and provide verification and examples. We highlight the ability of our algorithms to capture joint interactions and multiple weakness planes within individual computational cells, as well as its computational efficiency. We also discuss the motivation for developing the proposed technique: to simulate large-scale wave propagation during the Source Physics Experiments (SPE), a series of underground explosions conducted at the Nevada National Security Site (NNSS).

  12. Using Sunlight and Cell Networks to Bring Fleet Tracking to Small Scale Fisheries

    NASA Astrophysics Data System (ADS)

    Garren, M.; Selbie, H.; Suchomel, D.; McDonald, W.; Solomon, D.

    2016-12-01

    Traditionally, the efforts of small scale fisheries have not been easily incorporated into the global picture of fishing effort and activity. That means that the activities of the vast majority (~90%) of fishing vessels in the world have remained unquantified and largely opaque. With newly developed technology that harnesses solar power and cost-effective cellular networks to transmit data, it is becoming possible to provide vessel tracking systems on a large scale for vessels of all sizes. Furthermore, capitalizing on the relatively inexpensive cellular networks to transfer the data enables data of much higher granularity to be captured. By recording a vessel's position every few seconds, instead of minutes to hours as is typical of most satellite-based systems, we are able to resolve a diverse array of behaviors happening at sea, including when and where fishing occurred and what type of fishing gear was used. These high-granularity data are incredibly useful but also challenging to manage and mine. New approaches for handling and processing this continuous data stream of vessel positions are being developed to extract the most informative and actionable pieces of information for a variety of audiences, including governing agencies, industry supply chains seeking transparency, non-profit organizations supporting conservation efforts, academic researchers, and the fishers and boat owners themselves.
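
    A hedged sketch of the kind of processing such high-granularity tracks invite: deriving speed from consecutive GPS fixes and flagging slow segments as candidate fishing activity. The haversine helper, the speed threshold, and the toy track are assumptions for illustration, not the authors' pipeline.

```python
# Sketch: derive vessel speed from consecutive GPS fixes and flag slow
# segments as candidate fishing activity. The threshold is an assumption.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(a))

def flag_fishing(track, max_fishing_knots=2.5):
    """track: list of (unix_time_s, lat, lon). Returns per-segment flags."""
    flags = []
    for (t0, la0, lo0), (t1, la1, lo1) in zip(track, track[1:]):
        knots = haversine_km(la0, lo0, la1, lo1) / ((t1 - t0) / 3600.0) / 1.852
        flags.append(("fishing?" if knots < max_fishing_knots else "transit", knots))
    return flags

track = [(0, 8.50, -83.30), (10, 8.5001, -83.3001), (20, 8.5008, -83.3010)]
for label, knots in flag_fishing(track):
    print(f"{label}: {knots:.2f} kn")
```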

  13. Spatial, spectral and temporal patterns of tropical forest cover change as observed with multiple scales of optical satellite data.

    Treesearch

    D.J. Hayes; W.B. Cohen

    2006-01-01

    This article describes the development of a methodology for scaling observations of changes in tropical forest cover to large areas at high temporal frequency from coarse-resolution satellite imagery. The approach for estimating proportional forest cover change as a continuous variable is based on a regression model that relates multispectral, multitemporal Moderate...

  14. Monitoring water use and crop condition in California vineyards at multiple scales using multi-sensor satellite data fusion

    USDA-ARS?s Scientific Manuscript database

    Recent weather patterns have left California’s agricultural areas in severe drought. Given the reduced water availability in much of California it is critical to be able to measure water use and crop condition over large areas, but also in fine detail at scales of individual fields to support water...

  15. Bird Habitat Conservation at Various Scales in the Atlantic Coast Joint Venture

    Treesearch

    Andrew Milliken; Craig Watson; Chuck Hayes

    2005-01-01

    The Atlantic Coast Joint Venture is a partnership focused on the conservation of habitats for migratory birds within the Atlantic Flyway/Atlantic Coast Region from Maine south to Puerto Rico. In order to be effective in planning and implementing conservation in this large and diverse area, the joint venture must work at multiple spatial scales, from the largest...

  16. Exploring Google Earth Engine platform for big data processing: classification of multi-temporal satellite imagery for crop mapping

    NASA Astrophysics Data System (ADS)

    Shelestov, Andrii; Lavreniuk, Mykola; Kussul, Nataliia; Novikov, Alexei; Skakun, Sergii

    2017-02-01

    Many applied problems arising in agricultural monitoring and food security require reliable crop maps at national or global scale. Large scale crop mapping requires processing and management of large amounts of heterogeneous satellite imagery acquired by various sensors that consequently leads to a “Big Data” problem. The main objective of this study is to explore efficiency of using the Google Earth Engine (GEE) platform when classifying multi-temporal satellite imagery with potential to apply the platform for a larger scale (e.g., country level) and multiple sensors (e.g., Landsat-8 and Sentinel-2). In particular, multiple state-of-the-art classifiers available in the GEE platform are compared to produce a high resolution (30 m) crop classification map for a large territory (~28,100 km2 and 1.0 M ha of cropland). Though this study does not involve large volumes of data, it does address the efficiency of the GEE platform to effectively execute complex workflows of satellite data processing required for large-scale applications such as crop mapping. The study discusses strengths and weaknesses of classifiers, assesses accuracies that can be achieved with different classifiers for the Ukrainian landscape, and compares them to the benchmark classifier using a neural network approach that was developed in our previous studies. The study is carried out for the Joint Experiment of Crop Assessment and Monitoring (JECAM) test site in Ukraine covering the Kyiv region (North of Ukraine) in 2013. We found that Google Earth Engine (GEE) provides very good performance in terms of enabling access to the remote sensing products through the cloud platform and providing pre-processing; however, in terms of classification accuracy, the neural network based approach outperformed support vector machine (SVM), decision tree and random forest classifiers available in GEE.
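
    For orientation, a minimal sketch of a supervised classification workflow in the GEE Python API is shown below. The image collection ID, band list, and training-asset path are placeholders, and ee.Classifier.smileRandomForest reflects the current API, which postdates this study; the paper's SVM would correspond to ee.Classifier.libsvm.

```python
# Sketch of a supervised crop classification in the GEE Python API.
# Asset IDs, bands, and the 'class' property are placeholders; the
# smileRandomForest name is from the current API and may postdate the study.
import ee

ee.Initialize()

composite = (ee.ImageCollection('LANDSAT/LC08/C02/T1_TOA')
             .filterDate('2013-04-01', '2013-10-31')
             .filterBounds(ee.Geometry.Point(30.5, 50.4))   # Kyiv region
             .median())
bands = ['B2', 'B3', 'B4', 'B5', 'B6', 'B7']

training_points = ee.FeatureCollection('users/example/jecam_training')  # placeholder
samples = composite.select(bands).sampleRegions(
    collection=training_points, properties=['class'], scale=30)

classifier = ee.Classifier.smileRandomForest(100).train(
    features=samples, classProperty='class', inputProperties=bands)

crop_map = composite.select(bands).classify(classifier)     # 30 m crop map
```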

  17. Untangling the causes of a decadal-scale drought: a case study in southeast Australia.

    NASA Astrophysics Data System (ADS)

    Lewis, Sophie; Gallant, Ailie

    2017-04-01

    Prolonged droughts on the order of multiple years to a decade have recently afflicted many parts of highly populated regions around the globe, for example, the southwest United States and southeast Australia. However, the causes of these droughts remain unclear. A significant contribution from natural decadal-scale climate variability is likely, but there is also conflicting evidence of any contribution from anthropogenic climate change. This work aims to untangle the causes of a 13-year drought in southeast Australia spanning 1997-2009. A suite of historical and control simulations from fully coupled GCMs contained in the CMIP5 archive are employed, and the potential contributions of random climate variability, SST forcing and anthropogenic forcing to the drought are examined. It is likely that random, decadal-scale variability played a significant role in producing the prolonged rainfall deficits across southeast Australia. These were reinforced by several years with El Niño-like conditions, which commonly induce drought in the region, and a lack of La Niña conditions, which are more likely to bring rain. Evidence of a contribution of anthropogenic forcing to the drought is limited.

  18. Facilitating the transition from physiology to hospital wards through an interdisciplinary case study of septic shock.

    PubMed

    Li, Albert S; Berger, Kenneth I; Schwartz, David R; Slater, William R; Goldfarb, David S

    2014-04-12

    In order to develop clinical reasoning, medical students must be able to integrate knowledge across traditional subject boundaries and multiple disciplines. At least two dimensions of integration have been identified: horizontal integration, bringing together different disciplines in considering a topic; and vertical integration, bridging basic science and clinical practice. Much attention has been focused on curriculum overhauls, but our approach is to facilitate horizontal and vertical integration on a smaller scale through an interdisciplinary case study discussion and then to assess its utility. An interdisciplinary case study discussion about a critically ill patient was implemented at the end of an organ system-based, basic sciences module at New York University School of Medicine. Three clinical specialists-a cardiologist, a pulmonologist, and a nephrologist-jointly led a discussion about a complex patient in the intensive care unit with multiple medical problems secondary to septic shock. The discussion emphasized the physiologic underpinnings behind the patient's presentation and the physiologic considerations across the various systems in determining proper treatment. The discussion also highlighted the interdependence between the cardiovascular, respiratory, and renal systems, which were initially presented in separate units. After the session students were given a brief, anonymous three-question free-response questionnaire in which they were asked to evaluate and freely comment on the exercise. Students not only took away physiological principles but also gained an appreciation for various thematic lessons for bringing basic science to the bedside, especially horizontal and vertical integration. The response of the participants was overwhelmingly positive with many indicating that the exercise integrated the material across organ systems, and strengthened their appreciation of the role of physiology in understanding disease presentations and guiding appropriate therapy. Horizontal and vertical integration can be presented effectively through a single-session case study, with complex patient cases involving multiple organ systems providing students opportunities to integrate their knowledge across organ systems while emphasizing the importance of physiology in clinical reasoning. Furthermore, having several clinicians from different specialties discuss the case together can reinforce the matter of integration across multiple organ systems and disciplines in students' minds.

  19. Facilitating the transition from physiology to hospital wards through an interdisciplinary case study of septic shock

    PubMed Central

    2014-01-01

    Background In order to develop clinical reasoning, medical students must be able to integrate knowledge across traditional subject boundaries and multiple disciplines. At least two dimensions of integration have been identified: horizontal integration, bringing together different disciplines in considering a topic; and vertical integration, bridging basic science and clinical practice. Much attention has been focused on curriculum overhauls, but our approach is to facilitate horizontal and vertical integration on a smaller scale through an interdisciplinary case study discussion and then to assess its utility. Methods An interdisciplinary case study discussion about a critically ill patient was implemented at the end of an organ system-based, basic sciences module at New York University School of Medicine. Three clinical specialists—a cardiologist, a pulmonologist, and a nephrologist—jointly led a discussion about a complex patient in the intensive care unit with multiple medical problems secondary to septic shock. The discussion emphasized the physiologic underpinnings behind the patient’s presentation and the physiologic considerations across the various systems in determining proper treatment. The discussion also highlighted the interdependence between the cardiovascular, respiratory, and renal systems, which were initially presented in separate units. After the session students were given a brief, anonymous three-question free-response questionnaire in which they were asked to evaluate and freely comment on the exercise. Results Students not only took away physiological principles but also gained an appreciation for various thematic lessons for bringing basic science to the bedside, especially horizontal and vertical integration. The response of the participants was overwhelmingly positive with many indicating that the exercise integrated the material across organ systems, and strengthened their appreciation of the role of physiology in understanding disease presentations and guiding appropriate therapy. Conclusions Horizontal and vertical integration can be presented effectively through a single-session case study, with complex patient cases involving multiple organ systems providing students opportunities to integrate their knowledge across organ systems while emphasizing the importance of physiology in clinical reasoning. Furthermore, having several clinicians from different specialties discuss the case together can reinforce the matter of integration across multiple organ systems and disciplines in students’ minds. PMID:24725336

  20. ProteinInferencer: Confident protein identification and multiple experiment comparison for large scale proteomics projects.

    PubMed

    Zhang, Yaoyang; Xu, Tao; Shan, Bing; Hart, Jonathan; Aslanian, Aaron; Han, Xuemei; Zong, Nobel; Li, Haomin; Choi, Howard; Wang, Dong; Acharya, Lipi; Du, Lisa; Vogt, Peter K; Ping, Peipei; Yates, John R

    2015-11-03

    Shotgun proteomics generates valuable information from large-scale and target protein characterizations, including protein expression, protein quantification, protein post-translational modifications (PTMs), protein localization, and protein-protein interactions. Typically, peptides derived from proteolytic digestion, rather than intact proteins, are analyzed by mass spectrometers because peptides are more readily separated, ionized and fragmented. The amino acid sequences of peptides can be interpreted by matching the observed tandem mass spectra to theoretical spectra derived from a protein sequence database. Identified peptides serve as surrogates for their proteins and are often used to establish what proteins were present in the original mixture and to quantify protein abundance. Two major issues exist for assigning peptides to their originating protein. The first issue is maintaining a desired false discovery rate (FDR) when comparing or combining multiple large datasets generated by shotgun analysis and the second issue is properly assigning peptides to proteins when homologous proteins are present in the database. Herein we demonstrate a new computational tool, ProteinInferencer, which can be used for protein inference with both small- or large-scale data sets to produce a well-controlled protein FDR. In addition, ProteinInferencer introduces confidence scoring for individual proteins, which makes protein identifications evaluable. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015. Published by Elsevier B.V.
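
    The abstract does not spell out ProteinInferencer's scoring model, but the protein-level FDR that tools of this kind control is conventionally estimated by target-decoy competition. A generic sketch, with invented field names:

```python
# Generic target-decoy FDR estimate at the protein level: sort proteins by
# score, then accept the largest score cutoff whose decoy/target ratio stays
# under the desired FDR. Field names are illustrative, not ProteinInferencer's.
def proteins_at_fdr(proteins, max_fdr=0.01):
    """proteins: list of (score, is_decoy). Returns accepted target count."""
    ranked = sorted(proteins, key=lambda p: p[0], reverse=True)
    targets = decoys = accepted = 0
    for score, is_decoy in ranked:
        decoys += is_decoy
        targets += not is_decoy
        if targets and decoys / targets <= max_fdr:
            accepted = targets            # this cutoff still satisfies the FDR
    return accepted

import random
random.seed(0)
pool = [(random.gauss(3, 1), False) for _ in range(900)] + \
       [(random.gauss(0, 1), True) for _ in range(900)]
print(proteins_at_fdr(pool, max_fdr=0.01), "target proteins at 1% FDR")
```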

  1. Marine Research Infrastructure collaboration in the COOPLUS project framework - Promoting synergies for marine ecosystems studies

    NASA Astrophysics Data System (ADS)

    Beranzoli, L.; Best, M.; Embriaco, D.; Favali, P.; Juniper, K.; Lo Bue, N.; Lara-Lopez, A.; Materia, P.; Ó Conchubhair, D.; O'Rourke, E.; Proctor, R.; Weller, R. A.

    2017-12-01

    Understanding the effects of multiple drivers on marine ecosystems at various scales, from regional (climate and ocean circulation) to local (seafloor gas emissions and harmful underwater noise), requires long time-series of integrated and standardised datasets. Large-scale research infrastructures for ocean observation are able to provide such time-series for a variety of physical parameters of ocean processes (mass and energy exchanges among the surface, water column and benthic boundary layer) that constitute important and necessary measures of environmental conditions and change/development over time. Information deduced from these data is essential for the study, modelling and prediction of marine ecosystem changes and can reveal and potentially confirm deterioration and threats. The COOPLUS European Commission project brings together research infrastructures (RIs) with the aim of coordinating multilateral cooperation among RIs and identifying common priorities, actions, instruments and resources. COOPLUS will produce a Strategic Research and Innovation Agenda (SRIA), a shared roadmap for mid- to long-term collaboration. In particular, the marine RIs collaborating in COOPLUS, namely the European Multidisciplinary Seafloor and water column Observatory (EMSO, Europe), the Ocean Observatories Initiative (OOI, USA), Ocean Networks Canada (ONC), and the Integrated Marine Observing System (IMOS, Australia), can represent a source of important data for researchers of marine ecosystems. The RIs can then, in turn, receive suggestions from researchers for implementing new measurements and stimulating cross-cutting collaborations, data integration and standardisation from their user community. This poster provides a description of EMSO, OOI, ONC and IMOS for the benefit of marine ecosystem studies and presents examples of where the analyses of time-series have revealed noteworthy environmental conditions, temporal trends and events.

  2. Historical nectar assessment reveals the fall and rise of Britain in bloom

    PubMed Central

    Baude, Mathilde; Kunin, William E.; Boatman, Nigel D.; Conyers, Simon; Davies, Nancy; Gillespie, Mark A. K.; Morton, R. Daniel; Smart, Simon M.; Memmott, Jane

    2015-01-01

    Summary There is considerable concern over declines in insect pollinator communities and potential impacts on the pollination of crops and wildflowers [1–4]. Among the multiple pressures facing pollinators [2–4], decreasing floral resources due to habitat loss and degradation has been suggested as a key contributing factor [2–8]. However, a lack of quantitative data has hampered testing for historical changes in floral resources. Here we show that overall floral rewards can be estimated at a national scale by combining vegetation surveys and direct nectar measurements. We find evidence for substantial losses in nectar resources in England and Wales between the 1930s and 1970s; however, total nectar provision in Great Britain as a whole had stabilised by 1978, and increased from 1998 to 2007. These findings concur with trends in pollinator diversity, which declined in the mid-20th century [9] but stabilised more recently [10]. The diversity of nectar sources declined from 1978 to 1990 but stabilised thereafter at low levels, with four plant species accounting for over 50% of national nectar provision in 2007. Calcareous grassland, broadleaved woodland and neutral grassland are the habitats that produce the greatest amount of nectar per unit area from the most diverse sources, whereas arable land is the poorest in both respects. While agri-environment schemes add resources to arable landscapes, their national contribution is low. Due to their large area, improved grasslands could add substantially to national nectar provision if they were managed to increase floral resource provision. This national-scale assessment of floral resource provision brings new insights into the links between plant and pollinator declines, and offers considerable opportunities for conservation. PMID:26842058

  3. Mapping Tropical Forest Mosaics with C- and L-band SAR: First Results from Osa Peninsula, Costa Rica

    NASA Astrophysics Data System (ADS)

    Pinto, N.; Hensley, S.; Aguilar-Amuchastegui, N.; Broadbent, E. N.; Ahmed, R.

    2016-12-01

    In tropical countries, economic incentives and improved infrastructure are creating forest mosaics where small-scale farming and industrial plantations are embedded within and potentially replacing native ecosystems. Practices such as agroforestry, slash-and-burn cultivation, and oil palm monocultures bring widely different impacts on carbon stocks. Characterizing these production systems is not only critical to ascribe deforestation to particular drivers, but also essential to understand the impact of macroeconomic scenarios, national policies, and land tenure schemes on carbon fluxes. The last decade has experienced a dramatic improvement in the extent and consistency of tree cover and gross deforestation products from optical imagery. At the same time, recent work shows that Synthetic Aperture Radar (SAR) can complement optical data and reveal structural types that cannot be easily resolved with reflectance measurements alone. While these results demonstrate the validity of sensor fusion methodologies, they typically rely on local classifications or even manual delineation and as such they cannot support large-scale investigations. Furthermore, there have been few attempts to exploit PolInSAR or multiple wavelengths that can provide critical information to resolve natural and anthropogenic land cover types. We report results from our research at Costa Rica's Osa Peninsula. This site is ideal for algorithm development as it includes a highly diverse tropical forest within Corcovado National Park, as well as agroforestry zones, mangroves, and palm plantations. We first integrate SAR backscatter and coherence data from NASA's L-band UAVSAR, JAXA's ALOS/PALSAR, and ESA's Sentinel to produce a map of structural types. Second, we assess whether coherence measurements and PolInSAR retrievals can be used to resolve forest stand differences at 30m resolution and distinguish between primary and secondary forest sites.

  4. Helping Children Learn Mathematics through Multiple Intelligences and Standards for School Mathematics.

    ERIC Educational Resources Information Center

    Adams, Thomasenia Lott

    2001-01-01

    Focuses on the National Council of Teachers of Mathematics 2000 process-oriented standards of problem solving, reasoning and proof, communication, connections, and representation as providing a framework for using the multiple intelligences that children bring to mathematics learning. Presents ideas for mathematics lessons and activities to…

  5. Uplink Downlink Rate Balancing and Throughput Scaling in FDD Massive MIMO Systems

    NASA Astrophysics Data System (ADS)

    Bergel, Itsik; Perets, Yona; Shamai, Shlomo

    2016-05-01

    In this work we extend the concept of uplink-downlink rate balancing to frequency division duplex (FDD) massive MIMO systems. We consider a base station with a large number of antennas serving many single-antenna users. We first show that any unused capacity in the uplink can be traded off for higher throughput in the downlink in a system that uses either dirty paper (DP) coding or linear zero-forcing (ZF) precoding. We then also study the scaling of the system throughput with the number of antennas in cases of linear beamforming (BF) precoding, ZF precoding, and DP coding. We show that the downlink throughput is proportional to the logarithm of the number of antennas. While this logarithmic scaling is lower than the linear scaling of the rate in the uplink, it can still bring significant throughput gains. For example, we demonstrate through analysis and simulation that increasing the number of antennas from 4 to 128 will increase the throughput by more than a factor of 5. We also show that a logarithmic scaling of downlink throughput as a function of the number of receive antennas can be achieved even when the number of transmit antennas only increases logarithmically with the number of receive antennas.
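
    A toy numeric check of the logarithmic scaling claim: if the per-user downlink rate grows roughly like log2(1 + M x SNR) with M antennas, then going from 4 to 128 antennas multiplies throughput severalfold at low SNR. The rate formula and SNR value below are illustrative assumptions, not the paper's system model.

```python
# Toy illustration of logarithmic downlink scaling with antenna count M,
# assuming a per-user rate ~ log2(1 + M * snr). The formula and SNR are
# illustrative assumptions, not the paper's system model.
import math

def rate_bits_per_hz(m_antennas, snr=0.1):
    return math.log2(1.0 + m_antennas * snr)

for m in (4, 128):
    print(f"M = {m:3d}: {rate_bits_per_hz(m):.2f} bit/s/Hz")
print(f"gain: x{rate_bits_per_hz(128) / rate_bits_per_hz(4):.1f}")
```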

  6. Cosmos as Resonant Harmonies ˜ Singing International Year of Astronomy, 2009 ˜ the cultural significance of our new encounter with the universe

    NASA Astrophysics Data System (ADS)

    Perkins, Kala

    2008-04-01

    UN Int'l Year of Astronomy (IYA), 2009 will celebrate 400 yrs. since Galileo's quests. Bringing the unifying dimensions of cosmos to the global community, sharing the wonder and calling forth the unparalleled ability of astronomy to dwarf our disputations, open our hearts to Einstein's ``cosmological feeling'' and propel us on this collective global adventure, is the nexus of intent. IYA is a global effort to bring the human creative endeavor into harmonic interplay with the universe that is singing us. We are cosmos creating ourselves, taking the reins of our inherent potency and wondering how law and logos emerge into the entangled formulas and phenomenology of cosmic reason and reality. How is our cosmic encounter affecting our socio-cultural identity and psychology? What harmonies are emerging in researchers in response to our penetration into cosmic etudes of black holes, large-scale flows and stellar dynamics? We are learning to creatively resonate with the universe. Some excellent ideas being brewed for communicating the cosmos to students and the public will be explored.

  7. Fractal markets: Liquidity and investors on different time horizons

    NASA Astrophysics Data System (ADS)

    Li, Da-Ye; Nishimura, Yusaku; Men, Ming

    2014-08-01

    In this paper, we propose a new agent-based model to study the source of liquidity and the “emergent” phenomenon in a financial market with fractal structure. The model rests on the fractal market hypothesis and agents with different time horizons of investments. What is interesting is that, though the agent-based model reveals that the interaction between these heterogeneous agents affects the stability and liquidity of the financial market, the real-world market lacks detailed data to bring this to light, since it is difficult to identify and distinguish investors with different time horizons empirically. The results show that in a relatively short period of time the fractal market provides liquidity from investors with different horizons, and the market gains stability when the market structure changes from uniformity to diversification. In the real world, the fractal structure with a finite range of horizons can only stabilize the market within limits. With finite maximum horizons, greater diversity of investors and the fractal structure will not necessarily bring more stability to the market, which might come with greater fluctuation on large time scales.

  8. Does the Position of Response Options in Multiple-Choice Tests Matter?

    ERIC Educational Resources Information Center

    Hohensinn, Christine; Baghaei, Purya

    2017-01-01

    In large scale multiple-choice (MC) tests alternate forms of a test may be developed to prevent cheating by changing the order of items or by changing the position of the response options. The assumption is that since the content of the test forms is the same, the order of items or the positions of the response options do not have any effect on…

  9. A Stratified Study of Students' Understanding of Basic Optics Concepts in Different Contexts Using Two-Tier Multiple-Choice Items

    ERIC Educational Resources Information Center

    Chu, Hye-Eun; Treagust, David F.; Chandrasegaran, A. L.

    2009-01-01

    A large scale study involving 1786 year 7-10 Korean students from three school districts in Seoul was undertaken to evaluate their understanding of basic optics concepts using a two-tier multiple-choice diagnostic instrument consisting of four pairs of items, each of which evaluated the same concept in two different contexts. The instrument, which…

  10. Multiple Independent File Parallel I/O with HDF5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, M. C.

    2016-07-13

    The HDF5 library has supported the I/O requirements of HPC codes at Lawrence Livermore National Labs (LLNL) since the late 1990s. In particular, HDF5 used in the Multiple Independent File (MIF) parallel I/O paradigm has supported LLNL codes' scalable I/O requirements and has recently been gainfully used at scales as large as O(10^6) parallel tasks.
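
    In the MIF paradigm, N tasks are split into M groups, each group shares one file, and a baton is passed so that only one task per group writes at a time. A hedged sketch using mpi4py with serial HDF5 (h5py); the file names and group count are arbitrary illustrative choices, not the LLNL implementation.

```python
# Sketch of the Multiple Independent File (MIF) paradigm: ranks are split
# into ngroups groups, each group shares one HDF5 file, and a baton is
# passed so only one rank per group writes at a time. Illustrative only.
from mpi4py import MPI
import h5py
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
ngroups = 4                                    # number of files (M in MIF)
group = rank % ngroups
prev = rank - ngroups                          # previous rank in my group
nxt = rank + ngroups                           # next rank in my group

if prev >= 0:
    comm.recv(source=prev, tag=group)          # wait for the baton

mode = 'w' if prev < 0 else 'a'                # first writer creates the file
with h5py.File(f'dump_{group:03d}.h5', mode) as f:
    f.create_dataset(f'rank_{rank:05d}', data=np.arange(10) + rank)

if nxt < size:
    comm.send(None, dest=nxt, tag=group)       # pass the baton on
```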

  11. Quantifying the Hierarchical Order in Self-Aligned Carbon Nanotubes from Atomic to Micrometer Scale.

    PubMed

    Meshot, Eric R; Zwissler, Darwin W; Bui, Ngoc; Kuykendall, Tevye R; Wang, Cheng; Hexemer, Alexander; Wu, Kuang Jen J; Fornasiero, Francesco

    2017-06-27

    Fundamental understanding of structure-property relationships in hierarchically organized nanostructures is crucial for the development of new functionality, yet quantifying structure across multiple length scales is challenging. In this work, we used nondestructive X-ray scattering to quantitatively map the multiscale structure of hierarchically self-organized carbon nanotube (CNT) "forests" across 4 orders of magnitude in length scale, from 2.0 Å to 1.5 μm. Fully resolved structural features include the graphitic honeycomb lattice and interlayer walls (atomic), CNT diameter (nano), as well as the greater CNT ensemble (meso) and large corrugations (micro). Correlating orientational order across hierarchical levels revealed a cascading decrease as we probed finer structural feature sizes with enhanced sensitivity to small-scale disorder. Furthermore, we established qualitative relationships for single-, few-, and multiwall CNT forest characteristics, showing that multiscale orientational order is directly correlated with number density spanning 10^9-10^12 cm^-2, yet order is inversely proportional to CNT diameter, number of walls, and atomic defects. Lastly, we captured and quantified ultralow-q meridional scattering features and built a phenomenological model of the large-scale CNT forest morphology, which predicted and confirmed that these features arise due to microscale corrugations along the vertical forest direction. Providing detailed structural information at multiple length scales is important for design and synthesis of CNT materials as well as other hierarchically organized nanostructures.

  12. Multilevel water governance and problems of scale: setting the stage for a broader debate.

    PubMed

    Moss, Timothy; Newig, Jens

    2010-07-01

    Environmental governance and management are facing a multiplicity of challenges related to spatial scales and multiple levels of governance. Water management is a field particularly sensitive to issues of scale because the hydrological system with its different scalar levels from small catchments to large river basins plays such a prominent role. It thus exemplifies fundamental issues and dilemmas of scale in modern environmental management and governance. In this introductory article to an Environmental Management special feature on "Multilevel Water Governance: Coping with Problems of Scale," we delineate our understanding of problems of scale and the dimensions of scalar politics that are central to water resource management. We provide an overview of the contributions to this special feature, concluding with a discussion of how scalar research can usefully challenge conventional wisdom on water resource management. We hope that this discussion of water governance stimulates a broader debate and inquiry relating to the scalar dimensions of environmental governance and management in general.

  13. Time to "go large" on biofilm research: advantages of an omics approach.

    PubMed

    Azevedo, Nuno F; Lopes, Susana P; Keevil, Charles W; Pereira, Maria O; Vieira, Maria J

    2009-04-01

    In nature, the biofilm mode of life is of great importance in the cell cycle for many microorganisms. Perhaps because of biofilm complexity and variability, the characterization of a given microbial system, in terms of biofilm formation potential, structure and associated physiological activity, in a large-scale, standardized and systematic manner has been hindered by the absence of high-throughput methods. This outlook is now starting to change as new methods involving the utilization of microtiter-plates and automated spectrophotometry and microscopy systems are being developed to perform large-scale testing of microbial biofilms. Here, we evaluate if the time is ripe to start an integrated omics approach, i.e., the generation and interrogation of large datasets, to biofilms--"biofomics". This omics approach would bring much needed insight into how biofilm formation ability is affected by a number of environmental, physiological and mutational factors and how these factors interplay between themselves in a standardized manner. This could then lead to the creation of a database where biofilm signatures are identified and interrogated. Nevertheless, and before embarking on such an enterprise, the selection of a versatile, robust, high-throughput biofilm growing device and of appropriate methods for biofilm analysis will have to be performed. Whether such device and analytical methods are already available, particularly for complex heterotrophic biofilms is, however, very debatable.

  14. 3D plasmonic nanoantennas integrated with MEA biosensors.

    PubMed

    Dipalo, Michele; Messina, Gabriele C; Amin, Hayder; La Rocca, Rosanna; Shalabaeva, Victoria; Simi, Alessandro; Maccione, Alessandro; Zilio, Pierfrancesco; Berdondini, Luca; De Angelis, Francesco

    2015-02-28

    Neuronal signaling in brain circuits occurs at multiple scales ranging from molecules and cells to large neuronal assemblies. However, current sensing neurotechnologies are not designed for parallel access of signals at multiple scales. With the aim of combining nanoscale molecular sensing with electrical neural activity recordings within large neuronal assemblies, in this work three-dimensional (3D) plasmonic nanoantennas are integrated with multielectrode arrays (MEA). Nanoantennas are fabricated by fast ion beam milling on optical resist; gold is deposited on the nanoantennas in order to connect them electrically to the MEA microelectrodes and to obtain plasmonic behavior. The optical properties of these 3D nanostructures are studied through finite element method (FEM) simulations that show a high electromagnetic field enhancement. This plasmonic enhancement is confirmed by surface-enhanced Raman spectroscopy of a dye performed in liquid, which presents an enhancement of almost 100 times the incident field amplitude at resonant excitation. Finally, the reported MEA devices are tested on cultured rat hippocampal neurons. Neurons develop by extending branches on the nanostructured electrodes and extracellular action potentials are recorded over multiple days in vitro. Raman spectra of living neurons cultured on the nanoantennas are also acquired. These results highlight that these nanostructures could be potential candidates for combining electrophysiological measures of large networks with simultaneous spectroscopic investigations at the molecular level.

  15. Hierarchical drivers of reef-fish metacommunity structure.

    PubMed

    MacNeil, M Aaron; Graham, Nicholas A J; Polunin, Nicholas V C; Kulbicki, Michel; Galzin, René; Harmelin-Vivien, Mireille; Rushton, Steven P

    2009-01-01

    Coral reefs are highly complex ecological systems, where multiple processes interact across scales in space and time to create assemblages of exceptionally high biodiversity. Despite the increasing frequency of hierarchically structured sampling programs used in coral-reef science, little progress has been made in quantifying the relative importance of processes operating across multiple scales. The vast majority of reef studies are conducted, or at least analyzed, at a single spatial scale, ignoring the implicitly hierarchical structure of the overall system in favor of small-scale experiments or large-scale observations. Here we demonstrate how alpha diversity (mean local number of species), beta diversity (degree of species dissimilarity among local sites), and gamma diversity (overall species richness) vary with spatial scale, and using a hierarchical, information-theoretic approach, we evaluate the relative importance of site-, reef-, and atoll-level processes driving the fish metacommunity structure among 10 atolls in French Polynesia. Process-based models, representing well-established hypotheses about drivers of reef-fish community structure, were assembled into a candidate set of 12 hierarchical linear models. Variation in fish abundance, biomass, and species richness was unevenly distributed among transect, reef, and atoll levels, establishing the relative contribution of variation at these spatial scales to the structure of the metacommunity. Reef-fish biomass, species richness, and the abundance of most functional-groups corresponded primarily with transect-level habitat diversity and atoll-lagoon size, whereas detritivore and grazer abundances were largely correlated with potential covariates of larval dispersal. Our findings show that (1) within-transect and among-atoll factors primarily drive the relationship between alpha and gamma diversity in this reef-fish metacommunity; (2) habitat is the primary correlate with reef-fish metacommunity structure at multiple spatial scales; and (3) inter-atoll connectedness was poorly correlated with the nonrandom clustering of reef-fish species. These results demonstrate the importance of modeling hierarchical data and processes in understanding reef-fish metacommunity structure.

  16. Multiple Fault Isolation in Redundant Systems

    NASA Technical Reports Server (NTRS)

    Pattipati, Krishna R.; Patterson-Hine, Ann; Iverson, David

    1997-01-01

    Fault diagnosis in large-scale systems that are products of modern technology presents formidable challenges to manufacturers and users. This is due to the large number of failure sources in such systems and the need to quickly isolate and rectify failures with minimal down time. In addition, for fault-tolerant systems and systems with infrequent opportunity for maintenance (e.g., Hubble telescope, space station), the assumption of at most a single fault in the system is unrealistic. In this project, we have developed novel block and sequential diagnostic strategies to isolate multiple faults in the shortest possible time without making the unrealistic single fault assumption.

  17. Multiple Fault Isolation in Redundant Systems

    NASA Technical Reports Server (NTRS)

    Pattipati, Krishna R.

    1997-01-01

    Fault diagnosis in large-scale systems that are products of modern technology presents formidable challenges to manufacturers and users. This is due to the large number of failure sources in such systems and the need to quickly isolate and rectify failures with minimal down time. In addition, for fault-tolerant systems and systems with infrequent opportunity for maintenance (e.g., Hubble telescope, space station), the assumption of at most a single fault in the system is unrealistic. In this project, we have developed novel block and sequential diagnostic strategies to isolate multiple faults in the shortest possible time without making the unrealistic single fault assumption.

  18. Dislocation Multiplication by Single Cross Slip for FCC at Submicron Scales

    NASA Astrophysics Data System (ADS)

    Cui, Yi-Nan; Liu, Zhan-Li; Zhuang, Zhuo

    2013-04-01

    The operation mechanism of single cross slip multiplication (SCSM) is investigated by studying the response of one dislocation loop expanding in a face-centered-cubic (FCC) single crystal using three-dimensional discrete dislocation dynamics (3D-DDD) simulation. The results show that SCSM can trigger highly correlated dislocation generation in a short time, which may shed some light on understanding the large strain bursts observed experimentally. Furthermore, we find that there is a critical stress and material size for the operation of SCSM, which agree with those required to trigger large strain bursts in the compression tests of FCC micropillars.

  19. Community and occupational health concerns in pork production: a review.

    PubMed

    Donham, K J

    2010-04-01

    Public concerns relative to adverse consequences of large-scale livestock production have been increasingly voiced since the late 1960s. Numerous regional, national, and international conferences have been held on the subject since 1994. This paper provides a review of the literature on the community and occupational health concerns of large-scale livestock production with a focus on pork production. The industry has recognized the concerns of the public, and the national and state pork producer groups are including these issues as an important component of their research and policy priorities. One reason large-scale livestock production has raised concern is that a significant component of the industry has separated from traditional family farming and has developed like other industries in management, structure, and concentration. The magnitude of the problem cited by environmental groups has often been criticized by the pork production industry for lack of science-based evidence to document environmental concerns. In addition to general environmental concerns, occupational health of workers has become more relevant because many operations now are employing more than 10 employees, which brings many operations in the United States under the scrutiny of the US Occupational Safety and Health Administration. In this paper, the scientific literature is reviewed relative to the science basis of occupational and environmental impacts on community and worker health. Further, recommendations are made to help promote sustainability of the livestock industry within the context of maintaining good stewardship of our environmental and human capital.

  20. New PHOBOS results on event-by-event fluctuations

    NASA Astrophysics Data System (ADS)

    Alver, B.; Back, B. B.; Baker, M. D.; Ballintijn, M.; Barton, D. S.; Betts, R. R.; Bickley, A. A.; Bindel, R.; Budzanowski, A.; Busza, W.; Carroll, A.; Chai, Z.; Chetluru, V.; Decowski, M. P.; García, E.; Gburek, T.; George, N.; Gulbrandsen, K.; Gushue, S.; Halliwell, C.; Hamblen, J.; Heintzelman, G. A.; Henderson, C.; Harnarine, I.; Hofman, D. J.; Hollis, R. S.; Hołyński, R.; Holzman, B.; Iordanova, A.; Johnson, E.; Kane, J. L.; Khan, N.; Kucewicz, W.; Kulinich, P.; Kuo, C. M.; Li, W.; Lin, W. T.; Loizides, C.; Manly, S.; Mignerey, A. C.; Nouicer, R.; Olszewski, A.; Pak, R.; Park, I. C.; Reed, C.; Remsberg, L. P.; Reuter, M.; Richardson, E.; Roland, C.; Roland, G.; Rosenberg, L.; Sagerer, J.; Sarin, P.; Sawicki, P.; Sedykh, I.; Skulski, W.; Smith, C. E.; Stankiewicz, M. A.; Steinberg, P.; Stephans, G. S. F.; Sukhanov, A.; Szostak, A.; Tang, J.-L.; Tonjes, M. B.; Trzupek, A.; Vale, C.; van Nieuwenhuizen, G. J.; Vaurynovich, S. S.; Verdier, R.; Veres, G. I.; Walters, P.; Wenger, E.; Willhelm, D.; Wolfs, F. L. H.; Wosiek, B.; Woźniak, K.; Wuosmaa, A. H.; Wyngaardt, S.; Wysłouch, B.

    2006-04-01

    We present new results from the PHOBOS experiment at RHIC on event-by-event fluctuations of particle multiplicities and angular distributions in nucleus-nucleus collisions at RHIC. Our data for Au+Au collisions at √s_NN = 200 GeV show that at a level of 10^-4 or less, no rare, large-amplitude fluctuations in the total multiplicity distributions or the shape of the pseudorapidity distributions are observed. We however find significant short-range multiplicity correlations in these data, that can be described as particle production in clusters. In Cu+Cu collisions, we observe large final-state azimuthal anisotropies v2. A common scaling behavior for Cu+Cu and Au+Au for these anisotropies emerges when fluctuations in the initial state geometry are taken into account.

  1. Bioregional monitoring design and occupancy estimation for two Sierra Nevadan amphibian taxa

    EPA Science Inventory

    Land-management agencies need quantitative, statistically rigorous monitoring data, often at large spatial and temporal scales, to support resource-management decisions. Monitoring designs typically must accommodate multiple ecological, logistical, political, and economic objec...

  2. Coral mass spawning predicted by rapid seasonal rise in ocean temperature

    PubMed Central

    Maynard, Jeffrey A.; Edwards, Alasdair J.; Guest, James R.; Rahbek, Carsten

    2016-01-01

    Coral spawning times have been linked to multiple environmental factors; however, to what extent these factors act as generalized cues across multiple species and large spatial scales is unknown. We used a unique dataset of coral spawning from 34 reefs in the Indian and Pacific Oceans to test if month of spawning and peak spawning month in assemblages of Acropora spp. can be predicted by sea surface temperature (SST), photosynthetically available radiation, wind speed, current speed, rainfall or sunset time. Contrary to the classic view that high mean SST initiates coral spawning, we found rapid increases in SST to be the best predictor in both cases (month of spawning: R2 = 0.73, peak: R2 = 0.62). Our findings suggest that a rapid increase in SST provides the dominant proximate cue for coral mass spawning over large geographical scales. We hypothesize that coral spawning is ultimately timed to ensure optimal fertilization success. PMID:27170709
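
    A small sketch of the kind of cross-reef regression the abstract reports: spawning month regressed on the rate of seasonal SST rise. The data below are synthetic placeholders, and treating month as a simple continuous response is a deliberate simplification.

```python
# Sketch of the kind of model the abstract reports: regress spawning month
# on the rate of seasonal SST rise across reefs. Data are synthetic.
import numpy as np

rng = np.random.default_rng(3)
sst_rise = rng.uniform(0.2, 1.2, size=34)                # degC/month, 34 reefs
month = 2.0 + 3.0 * sst_rise + rng.normal(0, 0.5, 34)    # synthetic response

slope, intercept = np.polyfit(sst_rise, month, 1)        # least-squares fit
pred = slope * sst_rise + intercept
r2 = 1 - ((month - pred) ** 2).sum() / ((month - month.mean()) ** 2).sum()
print(f"slope = {slope:.2f} months per (degC/month), R^2 = {r2:.2f}")
```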

  3. Climate change mitigation and adaptation in the land use sector: from complementarity to synergy.

    PubMed

    Duguma, Lalisa A; Minang, Peter A; van Noordwijk, Meine

    2014-09-01

    Currently, mitigation and adaptation measures are handled separately, due to differences in priorities for the measures and segregated planning and implementation policies at international and national levels. There is a growing argument that synergistic approaches to adaptation and mitigation could bring substantial benefits at multiple scales in the land use sector. Nonetheless, efforts to implement synergies between adaptation and mitigation measures are rare due to the weak conceptual framing of the approach and constraining policy issues. In this paper, we explore the attributes of synergy and the necessary enabling conditions and discuss, as an example, experience with the Ngitili system in Tanzania that serves both adaptation and mitigation functions. An in-depth look into the current practices suggests that more emphasis is laid on complementarity-i.e., mitigation projects providing adaptation co-benefits and vice versa rather than on synergy. Unlike complementarity, synergy should emphasize functionally sustainable landscape systems in which adaptation and mitigation are optimized as part of multiple functions. We argue that the current practice of seeking co-benefits (complementarity) is a necessary but insufficient step toward addressing synergy. Moving forward from complementarity will require a paradigm shift from current compartmentalization between mitigation and adaptation to systems thinking at landscape scale. However, enabling policy, institutional, and investment conditions need to be developed at global, national, and local levels to achieve synergistic goals.

  4. Drought in the Horn of Africa: attribution of a damaging and repeating extreme event

    NASA Astrophysics Data System (ADS)

    Marthews, Toby; Otto, Friederike; Mitchell, Daniel; Dadson, Simon; Jones, Richard

    2015-04-01

    We have applied detection and attribution techniques to the severe drought that hit the Horn of Africa in 2014. The short rains failed in late 2013 in Kenya, South Sudan, Somalia and southern Ethiopia, leading to a very dry growing season January to March 2014, and subsequently to the current drought in many agricultural areas of the sub-region. We have made use of the weather@home project, which uses publicly-volunteered distributed computing to provide a large ensemble of simulations sufficient to sample regional climate uncertainty. Based on this, we have estimated the occurrence rates of the kinds of the rare and extreme events implicated in this large-scale drought. From land surface model runs based on these ensemble simulations, we have estimated the impacts of climate anomalies during this period and therefore we can reliably identify some factors of the ongoing drought as attributable to human-induced climate change. The UNFCCC's Adaptation Fund is attempting to support projects that bring about an adaptation to "the adverse effects of climate change", but in order to formulate such projects we need a much clearer way to assess how much climate change is human-induced and how much is a consequence of climate anomalies and large-scale teleconnections, which can only be provided by robust attribution techniques.
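
    A minimal sketch of the occurrence-rate comparison at the heart of such attribution work: estimate the probability of exceeding a drought threshold in "factual" and counterfactual "natural" ensembles, then form the risk ratio and the fraction of attributable risk (FAR). The synthetic ensembles and the threshold are placeholders, not weather@home output.

```python
# Sketch of probabilistic event attribution: compare how often a rainfall
# deficit exceeds a threshold in ensembles with and without anthropogenic
# forcing. The synthetic ensembles and threshold are placeholders.
import numpy as np

rng = np.random.default_rng(42)
natural = rng.normal(loc=300.0, scale=60.0, size=10000)   # mm, counterfactual
factual = rng.normal(loc=285.0, scale=60.0, size=10000)   # mm, with forcing

threshold = 200.0                                         # drought if below
p0 = np.mean(natural < threshold)                         # occurrence rate, natural
p1 = np.mean(factual < threshold)                         # occurrence rate, factual

print(f"risk ratio   RR  = {p1 / p0:.2f}")
print(f"attributable FAR = {1.0 - p0 / p1:.2f}")          # fraction of attributable risk
```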

  5. Transparent and Flexible Large-scale Graphene-based Heater

    NASA Astrophysics Data System (ADS)

    Kang, Junmo; Lee, Changgu; Kim, Young-Jin; Choi, Jae-Boong; Hong, Byung Hee

    2011-03-01

    We report the application of a transparent and flexible heater with high optical transmittance and low sheet resistance using graphene films, showing outstanding thermal and electrical properties. The large-scale graphene films were grown on Cu foil by chemical vapor deposition methods, and transferred to transparent substrates by multiple stacking. The wet chemical doping process enhanced the electrical properties, showing a sheet resistance as low as 35 ohm/sq with 88.5% transmittance. The temperature response usually depends on the dimension and the sheet resistance of the graphene-based heater. We show that a 4x4 cm2 heater can reach 80 °C within 40 seconds and a large-scale (9x9 cm2) heater shows uniform heating performance, which was measured using a thermocouple and an infra-red camera. These heaters would be very useful for defogging systems and smart windows.
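
    A back-of-envelope check of what a 35 ohm/sq film implies for Joule heating, using R = Rs x L/W (so a square film has R = Rs) and P = V^2/R. The 12 V drive voltage is an assumed example, not a value from the paper.

```python
# Back-of-envelope Joule heating for a sheet heater: the resistance of a
# rectangular film is R = Rs * L / W, so a square film has R = Rs.
# The 12 V drive voltage is an assumed example, not from the paper.
def film_resistance(rs_ohm_sq, length, width):
    return rs_ohm_sq * length / width

rs = 35.0                                        # ohm/sq, from the abstract
r = film_resistance(rs, length=4.0, width=4.0)   # 4x4 cm square film
v = 12.0                                         # assumed drive voltage (V)
print(f"R = {r:.0f} ohm, P = {v * v / r:.1f} W")  # P = V^2 / R
```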

  6. Lagrangian space consistency relation for large scale structure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horn, Bart; Hui, Lam; Xiao, Xiao, E-mail: bh2478@columbia.edu, E-mail: lh399@columbia.edu, E-mail: xx2146@columbia.edu

    Consistency relations, which relate the squeezed limit of an (N+1)-point correlation function to an N-point function, are non-perturbative symmetry statements that hold even if the associated high momentum modes are deep in the nonlinear regime and astrophysically complex. Recently, Kehagias and Riotto and Peloso and Pietroni discovered a consistency relation applicable to large scale structure. We show that this can be recast into a simple physical statement in Lagrangian space: that the squeezed correlation function (suitably normalized) vanishes. This holds regardless of whether the correlation observables are at the same time or not, and regardless of whether multiple-streaming is present. The simplicity of this statement suggests that an analytic understanding of large scale structure in the nonlinear regime may be particularly promising in Lagrangian space.
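
    For context, the squeezed-limit consistency relation of Kehagias-Riotto and Peloso-Pietroni is usually written as below; this is the standard literature form (normalization conventions vary between papers), not a quotation of this paper. At equal times, momentum conservation forces the right-hand side to vanish, which is the statement the authors recast in Lagrangian space.

```latex
% Squeezed-limit consistency relation for large scale structure (standard
% literature form; normalization conventions vary between papers).
\lim_{q \to 0}
\frac{\langle \delta_{\mathbf q}(t)\,
      \delta_{\mathbf k_1}(t_1) \cdots \delta_{\mathbf k_N}(t_N) \rangle'}
     {P_\delta(q, t)}
= - \sum_{a=1}^{N} \frac{D(t_a)}{D(t)}\,
    \frac{\mathbf k_a \cdot \mathbf q}{q^2}\,
  \langle \delta_{\mathbf k_1}(t_1) \cdots \delta_{\mathbf k_N}(t_N) \rangle'
\qquad
\Bigl(\text{equal times: } \textstyle\sum_a \mathbf k_a = 0
      \;\Rightarrow\; \text{RHS} = 0\Bigr)
```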

  7. The proximal-to-distal sequence in upper-limb motions on multiple levels and time scales.

    PubMed

    Serrien, Ben; Baeyens, Jean-Pierre

    2017-10-01

    The proximal-to-distal sequence is a phenomenon that can be observed in a large variety of motions of the upper limbs in both humans and other mammals. The mechanisms behind this sequence are not completely understood and motor control theories able to explain this phenomenon are currently incomplete. The aim of this narrative review is to take a theoretical constraints-led approach to the proximal-to-distal sequence and provide a broad multidisciplinary overview of relevant literature. This sequence exists at multiple levels (brain, spine, muscles, kinetics and kinematics) and on multiple time scales (motion, motor learning and development, growth and possibly even evolution). We hypothesize that the proximodistal spatiotemporal direction on each time scale and level provides part of the organismic constraints that guide the dynamics at the other levels and time scales. The constraint-led approach in this review may serve as a first onset towards integration of evidence and a framework for further experimentation to reveal the dynamics of the proximal-to-distal sequence. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Large-Scale Diversity of Slope Fishes: Pattern Inconsistency between Multiple Diversity Indices

    PubMed Central

    Gaertner, Jean-Claude; Colloca, Francesco; Politou, Chrissi-Yianna; Gil De Sola, Luis; Bertrand, Jacques A.; Murenu, Matteo; Durbec, Jean-Pierre; Kallianiotis, Argyris; Mannini, Alessandro

    2013-01-01

    Large-scale studies focused on the diversity of continental slope ecosystems are still rare, usually restricted to a limited number of diversity indices and mainly based on the empirical comparison of heterogeneous local data sets. In contrast, we investigate large-scale fish diversity on the basis of multiple diversity indices and using 1454 standardized trawl hauls collected throughout the upper and middle slope of the whole northern Mediterranean Sea (36°3′- 45°7′ N; 5°3′W - 28°E). We have analyzed (1) the empirical relationships between a set of 11 diversity indices in order to assess their degree of complementarity/redundancy and (2) the consistency of spatial patterns exhibited by each of the complementary groups of indices. Regarding species richness, our results contrasted both the traditional view based on the hump-shaped theory for bathymetric pattern and the commonly-admitted hypothesis of a large-scale decreasing trend correlated with a similar gradient of primary production in the Mediterranean Sea. More generally, we found that the components of slope fish diversity we analyzed did not always show a consistent pattern of distribution according either to depth or to spatial areas, suggesting that they are not driven by the same factors. These results, which stress the need to extend the number of indices traditionally considered in diversity monitoring networks, could provide a basis for rethinking not only the methodological approach used in monitoring systems, but also the definition of priority zones for protection. Finally, our results call into question the feasibility of properly investigating large-scale diversity patterns using a widespread approach in ecology, which is based on the compilation of pre-existing heterogeneous and disparate data sets, in particular when focusing on indices that are very sensitive to sampling design standardization, such as species richness. PMID:23843962
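
    A small sketch of why a battery of indices can disagree: compute richness, Shannon, and Simpson diversity per haul and check their rank correlations. The abundance matrix below is a synthetic placeholder, not the trawl data.

```python
# Sketch: compute three common diversity indices per haul and check how
# strongly they agree. The abundance matrix is a synthetic placeholder.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(7)
hauls = rng.poisson(lam=rng.gamma(1.0, 3.0, size=(50, 40)))  # 50 hauls x 40 species

def indices(counts):
    counts = counts[counts > 0]
    p = counts / counts.sum()
    richness = counts.size                 # alpha: local species count
    shannon = -(p * np.log(p)).sum()       # weights rare species more
    simpson = 1.0 - (p ** 2).sum()         # weights dominant species more
    return richness, shannon, simpson

vals = np.array([indices(h) for h in hauls if h.sum() > 0])
rho_rs, _ = spearmanr(vals[:, 0], vals[:, 1])   # richness vs Shannon
rho_ss, _ = spearmanr(vals[:, 1], vals[:, 2])   # Shannon vs Simpson
print(f"rank correlations: richness-Shannon {rho_rs:.2f}, Shannon-Simpson {rho_ss:.2f}")
```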

  9. Meta-analysis of prediction model performance across multiple studies: Which scale helps ensure between-study normality for the C-statistic and calibration measures?

    PubMed

    Snell, Kym Ie; Ensor, Joie; Debray, Thomas Pa; Moons, Karel Gm; Riley, Richard D

    2017-01-01

    If individual participant data are available from multiple studies or clusters, then a prediction model can be externally validated multiple times. This allows the model's discrimination and calibration performance to be examined across different settings. Random-effects meta-analysis can then be used to quantify overall (average) performance and heterogeneity in performance. This typically assumes a normal distribution of 'true' performance across studies. We conducted a simulation study to examine this normality assumption for various performance measures relating to a logistic regression prediction model. We simulated data across multiple studies with varying degrees of variability in baseline risk or predictor effects and then evaluated the shape of the between-study distribution in the C-statistic, calibration slope, calibration-in-the-large, and E/O statistic, and possible transformations thereof. We found that a normal between-study distribution was usually reasonable for the calibration slope and calibration-in-the-large; however, the distributions of the C-statistic and E/O were often skewed across studies, particularly in settings with large variability in the predictor effects. Normality was vastly improved when using the logit transformation for the C-statistic and the log transformation for E/O, and therefore we recommend these scales to be used for meta-analysis. An illustrated example is given using a random-effects meta-analysis of the performance of QRISK2 across 25 general practices.
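
    A sketch of the recommended practice: logit-transform the per-study C-statistics, pool them with a DerSimonian-Laird random-effects model, and back-transform the pooled estimate. The study values and variances below are invented placeholders.

```python
# Sketch: random-effects meta-analysis of C-statistics on the logit scale
# (DerSimonian-Laird), then back-transform the pooled estimate. The study
# C-statistics and their variances are invented placeholders.
import numpy as np

c = np.array([0.72, 0.68, 0.75, 0.80, 0.71])        # per-study C-statistics
var_c = np.array([0.0004, 0.0006, 0.0005, 0.0003, 0.0007])

y = np.log(c / (1 - c))                             # logit transform
var_y = var_c / (c * (1 - c)) ** 2                  # delta-method variance

w = 1.0 / var_y                                     # fixed-effect weights
ybar = (w * y).sum() / w.sum()
q = (w * (y - ybar) ** 2).sum()                     # Cochran's Q
tau2 = max(0.0, (q - (len(y) - 1)) / (w.sum() - (w ** 2).sum() / w.sum()))

w_re = 1.0 / (var_y + tau2)                         # random-effects weights
pooled_logit = (w_re * y).sum() / w_re.sum()
pooled_c = 1.0 / (1.0 + np.exp(-pooled_logit))      # back-transform
print(f"tau^2 = {tau2:.4f}, pooled C = {pooled_c:.3f}")
```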

  10. Postcards from the Field: Using the Web to Bring Near-Real Time Field Work to the Public

    NASA Astrophysics Data System (ADS)

    Genyuk, J.; Johnson, R. M.; Gardiner, L.; Russell, R.; Bergman, J.; Lagrave, M.; Hatheway, B.; Foster, S.; Araujo-Pradere, E. A.

    2007-12-01

    Field work is one of the aspects of a career in the geosciences that can make it so attractive to students and the public. The chance to go to exciting places, and to see amazing things, while making new discoveries is almost too good to be true. The "Postcards from the Field" capability, developed and implemented in the Windows to the Universe website project in 2006, is now providing a new ability to bring this excitement to a large and global audience online. Windows to the Universe is an extremely popular interdisciplinary Earth and space science educational website, with over 20 million visitors per year, including a large following of students and educators. The website is composed of over 7000 interlinked web pages spanning the geosciences, at three levels of sophistication, in English and Spanish. Our Postcards from the Field capability, which was originally developed in support of a major field campaign in Mexico City in 2006 (the Megacity Initiative: Local and Global Research Observations campaign - MILAGRO), has now been generalized to support submissions from researchers engaged in multiple field campaigns. To date, in addition to postcards submitted during the MILAGRO campaign, we have hosted postcards from researchers and educators studying the life cycle of Adelie penguins in the Antarctic, the East Pacific Rise as a component of the RIDGE2000 program, and storm formation in Europe as a component of the Convective and Orographically-induced Precipitation Study (COPS). We are now expanding our postcard lines to include submissions from researchers engaged in the IPY and educators engaged with the ANDRILL (ANtarctic Geologic DRILLing) Research Immersion for Science Educators program. This presentation describes this new capability, its ease of use, and our vision for how it can be used to bring the excitement of field research to the public, students, and educators online.

  11. A new conceptual framework for water and sediment connectivity

    NASA Astrophysics Data System (ADS)

    Keesstra, Saskia; Cerdà, Artemi; Parsons, Tony; Nunes, Joao Pedro; Saco, Patricia

    2016-04-01

    For many years scientists have tried to understand, describe and quantify sediment transport on multiple scales; from the geomorphological work triggered by a single thunderstorm to landscape evolution on geological time scales, and from particles and soil aggregates up to the continental scale. In the last two decades, a new concept called connectivity (Baartman et al., 2013; Bracken et al., 2013, 2015; Parsons et al., 2015) has been used by the scientific community to describe the connection between the different scales at which sediment redistribution along the watershed is studied: pedon, slope tram, slope, watersheds, and basins. This concept is seen as a means to describe and quantify the results of processes influencing the transport of sediment on all these scales. Therefore the concept of connectivity and the way scales are used in the design of a measurement and monitoring scheme are interconnected (Cerdà et al., 2012), which shows that connectivity is not only a tool for process understanding, but also a tool to measure processes on multiple scales. This research aims to describe catchment system dynamics from a connectivity point of view. This conceptual framework can be helpful for looking at catchment systems and synthesizing which data need to be taken into account when measuring or modelling water and sediment transfer in catchment systems. Identifying common patterns and generalities will help discover physical reasons for differences in responses and interactions between these processes. We describe a conceptual framework which is meant to bring a better understanding of the system dynamics of a catchment in terms of water and sediment transfer by breaking the system dynamics apart into stocks (the system state at a given moment) and flows (the system fluxes). Breaking apart the internal system dynamics that determine the behaviour of the catchment system is, in our opinion, a way to bring better insight into the concepts of hydrological and sediment connectivity as described in previous research by Bracken et al. (2013, 2015). By looking at the individual parts of the system, it becomes more manageable and less conceptual, which is important because we have to indicate where research on connectivity should focus. With this approach, processes and feedbacks in the catchment system can be pulled apart and studied separately, making the system understandable and measurable, which will enable parameterization of models with actual measured data. The approach we took in describing water and sediment transfer is to first assess how they work in a system in dynamic equilibrium. After describing this, an assessment is made of how such dynamic equilibria can be taken out of balance by an external push. Baartman, J.E.M., Masselink, R.H., Keesstra, S.D., Temme, A.J.A.M., 2013. Linking landscape morphological complexity and sediment connectivity. Earth Surface Processes and Landforms 38: 1457-1471. Bracken, L.J., Wainwright, J., Ali, G.A., Tetzlaff, D., Smith, M.W., Reaney, S.M., and Roy, A.G. 2013. Concepts of hydrological connectivity: research approaches, pathways and future agendas. Earth Science Reviews, 119, 17-34. Bracken, L.J., Turnbull, L., Wainwright, J. and Bogaart, P. 2015. Sediment connectivity: a framework for understanding sediment transfer at multiple scales. Earth Surface Processes and Landforms. Cerdà, A., Brazier, R., Nearing, M., and de Vente, J. 2012. Scales and erosion. Catena, 102, 1-2. doi:10.1016/j.catena.2011.09.006. Parsons, A.J., Bracken, L., Poeppl, R., Wainwright, J., Keesstra, S.D., 2015. Editorial: Introduction to special issue on connectivity in water and sediment dynamics. Earth Surface Processes and Landforms. DOI: 10.1002/esp.3714.

  12. Population cycles are highly correlated over long time series and large spatial scales in two unrelated species: Greater sage-grouse and cottontail rabbits

    USGS Publications Warehouse

    Fedy, B.C.; Doherty, K.E.

    2011-01-01

    Animal species across multiple taxa demonstrate multi-annual population cycles, which have long been of interest to ecologists. Correlated population cycles between species that do not share a predator-prey relationship are particularly intriguing and challenging to explain. We investigated annual population trends of greater sage-grouse (Centrocercus urophasianus) and cottontail rabbits (Sylvilagus sp.) across Wyoming to explore the possibility of correlations between unrelated species, over multiple cycles, very large spatial areas, and relatively southern latitudes for cycling species. We analyzed sage-grouse lek counts and annual hunter harvest indices from 1982 to 2007. We show that greater sage-grouse, currently listed as warranted but precluded under the US Endangered Species Act, and cottontails have highly correlated cycles (r = 0.77). We explore possible mechanistic hypotheses to explain the synchronous population cycles. Our research highlights the importance of control populations in both adaptive management and impact studies. Furthermore, we demonstrate the functional value of these indices (lek counts and hunter harvest) for tracking broad-scale fluctuations in the species. This level of highly correlated long-term cycling has not previously been documented between two non-related species, over a long time-series, very large spatial scale, and within more southern latitudes. © 2010 US Government.

  13. Integrating concept ontology and multitask learning to achieve more effective classifier training for multilevel image annotation.

    PubMed

    Fan, Jianping; Gao, Yuli; Luo, Hangzai

    2008-03-01

    In this paper, we have developed a new scheme for automatically achieving multilevel annotation of large-scale image collections. To represent the various visual properties of the images more completely, both global and local visual features are extracted for image content representation. To tackle the problem of huge intraconcept visual diversity, multiple types of kernels are integrated to characterize the diverse visual similarity relationships between the images more precisely, and a multiple kernel learning algorithm is developed for SVM image classifier training. To address the problem of huge interconcept visual similarity, a novel multitask learning algorithm is developed to learn the correlated classifiers for the sibling image concepts under the same parent concept and enhance their discrimination and adaptation power significantly. To tackle the problem of huge intraconcept visual diversity for the image concepts at the higher levels of the concept ontology, a novel hierarchical boosting algorithm is developed to learn their ensemble classifiers hierarchically. In order to assist users in selecting more effective hypotheses for image classifier training, we have developed a novel hyperbolic framework for large-scale image visualization and interactive hypothesis assessment. Our experiments on large-scale image collections have yielded very positive results.

  14. Ecology for the shrinking city (JA)

    EPA Pesticide Factsheets

    This article brings together the concepts of shrinking cities—the hundreds of cities worldwide experiencing long-term population loss—and ecology for the city. Ecology for the city is the application of a social–ecological understanding to shaping urban form and function along sustainable trajectories. Ecology for the shrinking city therefore acknowledges that urban transformations to sustainable trajectories may be quite different in shrinking cities as compared with growing cities. Shrinking cities are well poised for transformations, because shrinking is perceived as a crisis and can mobilize the social capacity to change. Ecology is particularly well suited to contribute solutions because of the extent of vacant land in shrinking cities that can be leveraged for ecosystem-services provisioning. A crucial role of an ecology for the shrinking city is identifying innovative pathways that create locally desired amenities that provide ecosystem services and contribute to urban sustainability at multiple scales.

  15. Using a Large-scale Neural Model of Cortical Object Processing to Investigate the Neural Substrate for Managing Multiple Items in Short-term Memory.

    PubMed

    Liu, Qin; Ulloa, Antonio; Horwitz, Barry

    2017-11-01

    Many cognitive and computational models have been proposed to help understand working memory. In this article, we present a simulation study of cortical processing of visual objects during several working memory tasks using an extended version of a previously constructed large-scale neural model [Tagamets, M. A., & Horwitz, B. Integrating electrophysiological and anatomical experimental data to create a large-scale model that simulates a delayed match-to-sample human brain imaging study. Cerebral Cortex, 8, 310-320, 1998]. The original model consisted of arrays of Wilson-Cowan-type neuronal populations representing primary and secondary visual cortices, inferotemporal (IT) cortex, and pFC. We added a module representing entorhinal cortex, which functions as a gating module. We successfully implemented multiple working memory tasks using the same model and produced neuronal patterns in visual cortex, IT cortex, and pFC that match experimental findings. These working memory tasks can include distractor stimuli or can require that multiple items be retained in mind during a delay period (Sternberg's task). Besides electrophysiology data and behavioral data, we also generated fMRI BOLD time series from our simulation. Our results support the involvement of IT cortex in working memory maintenance and suggest the cortical architecture underlying the neural mechanisms mediating particular working memory tasks. Furthermore, we noticed that, during simulations of memorizing a list of objects, the first and last items in the sequence were recalled best, which may point to the neural mechanism behind the well-known primacy and recency effects.
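
    The building block of the simulation, a Wilson-Cowan excitatory-inhibitory population pair, is easy to sketch. The toy integration below uses illustrative coupling weights and time constants, not the parameters of the published model.

    ```python
    import numpy as np

    def sigmoid(x, a=1.2, theta=2.8):
        # Sigmoidal population response function
        return 1.0 / (1.0 + np.exp(-a * (x - theta)))

    # Illustrative weights and time constants (ms); not the paper's values
    w_ee, w_ei, w_ie, w_ii = 12.0, 10.0, 9.0, 3.0
    tau_e, tau_i, dt = 10.0, 20.0, 0.1
    stim = lambda t: 1.5 if 100 <= t < 300 else 0.0   # brief input pulse

    E, I, trace = 0.1, 0.1, []
    for step in range(10000):                          # 1 s of activity
        t = step * dt
        dE = (-E + sigmoid(w_ee * E - w_ei * I + stim(t))) / tau_e
        dI = (-I + sigmoid(w_ie * E - w_ii * I)) / tau_i
        E, I = E + dt * dE, I + dt * dI                # forward Euler step
        trace.append(E)
    print(f"peak excitatory activity: {max(trace):.3f}")
    ```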

  16. Virtual Systems Pharmacology (ViSP) software for simulation from mechanistic systems-level models.

    PubMed

    Ermakov, Sergey; Forster, Peter; Pagidala, Jyotsna; Miladinov, Marko; Wang, Albert; Baillie, Rebecca; Bartlett, Derek; Reed, Mike; Leil, Tarek A

    2014-01-01

    Multiple software programs are available for designing and running large-scale system-level pharmacology models used in the drug development process. Depending on the problem, scientists may be forced to use several modeling tools, which can increase model development time, IT costs, and so on. Therefore, it is desirable to have a single platform that allows setting up and running large-scale simulations for models that have been developed with different modeling tools. We developed a workflow and a software platform in which a model file is compiled into a self-contained executable that is no longer dependent on the software that was used to create the model. At the same time, the full model specification is preserved by presenting all model parameters as input parameters for the executable. This platform was implemented as a model-agnostic, therapeutic-area-agnostic, web-based application with a database back-end that can be used to configure, manage and execute large-scale simulations for multiple models by multiple users. The user interface is designed to be easily configurable to reflect the specifics of the model and the user's particular needs, and the back-end database has been implemented to store and manage all aspects of the systems, such as Models, Virtual Patients, User Interface Settings, and Results. The platform can be adapted and deployed on an existing cluster or cloud computing environment. Its use was demonstrated with a metabolic disease systems pharmacology model that simulates the effects of two antidiabetic drugs, metformin and fasiglifam, in type 2 diabetes mellitus patients.
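
    The central design idea, a self-contained executable whose full model specification is exposed through input parameters, can be illustrated with a toy command-line wrapper. The one-compartment kinetic model and all parameter names below are hypothetical stand-ins, not ViSP code.

    ```python
    import argparse

    def run_model(dose, kel, t_end, dt=0.1):
        """Toy one-compartment model standing in for a compiled systems model."""
        c, series, steps = 0.0, [], int(t_end / dt)
        for i in range(steps):
            if i % int(24.0 / dt) == 0:   # once-daily dosing
                c += dose
            c -= kel * c * dt             # first-order elimination
            series.append(c)
        return series

    if __name__ == "__main__":
        # Every model parameter is an input to the executable, mirroring
        # the design described in the abstract
        p = argparse.ArgumentParser()
        p.add_argument("--dose", type=float, default=500.0)
        p.add_argument("--kel", type=float, default=0.1)
        p.add_argument("--t-end", type=float, default=240.0)
        args = p.parse_args()
        out = run_model(args.dose, args.kel, args.t_end)
        print(f"final concentration: {out[-1]:.2f}")
    ```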

  17. Virtual Systems Pharmacology (ViSP) software for simulation from mechanistic systems-level models

    PubMed Central

    Ermakov, Sergey; Forster, Peter; Pagidala, Jyotsna; Miladinov, Marko; Wang, Albert; Baillie, Rebecca; Bartlett, Derek; Reed, Mike; Leil, Tarek A.

    2014-01-01

    Multiple software programs are available for designing and running large-scale system-level pharmacology models used in the drug development process. Depending on the problem, scientists may be forced to use several modeling tools, which can increase model development time, IT costs, and so on. Therefore, it is desirable to have a single platform that allows setting up and running large-scale simulations for models that have been developed with different modeling tools. We developed a workflow and a software platform in which a model file is compiled into a self-contained executable that is no longer dependent on the software that was used to create the model. At the same time, the full model specification is preserved by presenting all model parameters as input parameters for the executable. This platform was implemented as a model-agnostic, therapeutic-area-agnostic, web-based application with a database back-end that can be used to configure, manage and execute large-scale simulations for multiple models by multiple users. The user interface is designed to be easily configurable to reflect the specifics of the model and the user's particular needs, and the back-end database has been implemented to store and manage all aspects of the systems, such as Models, Virtual Patients, User Interface Settings, and Results. The platform can be adapted and deployed on an existing cluster or cloud computing environment. Its use was demonstrated with a metabolic disease systems pharmacology model that simulates the effects of two antidiabetic drugs, metformin and fasiglifam, in type 2 diabetes mellitus patients. PMID:25374542

  18. Gauge invariance and kaon production in deep inelastic scattering at low scales

    NASA Astrophysics Data System (ADS)

    Guerrero, Juan V.; Accardi, Alberto

    2018-06-01

    This paper focuses on hadron mass effects in calculations of semi-inclusive kaon production in lepton-deuteron deeply inelastic scattering at HERMES and COMPASS kinematics. In the collinear factorization framework, the corresponding cross section is shown to factorize, at leading order and leading twist, into products of parton distributions and fragmentation functions evaluated in terms of kaon- and nucleon-mass-dependent scaling variables, and to respect gauge invariance. It is found that hadron mass corrections for integrated kaon multiplicities sizably reduce the apparent large discrepancy between measurements of K+ + K- multiplicities performed by the two collaborations, and fully reconcile their K+/K- ratios.
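
    The abstract does not spell out the scaling variables. A standard choice for the nucleon-mass-dependent variable in target-mass corrections is the Nachtmann variable, reproduced here as a plausible but unconfirmed reconstruction, with an analogous kaon-mass-dependent variable entering on the fragmentation side:

    ```latex
    \xi = \frac{2x}{1 + \sqrt{1 + 4x^{2} M_N^{2}/Q^{2}}}
    ```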

  19. Negative Binomial Fits to Multiplicity Distributions from Central Collisions of (16)O+Cu at 14.6A GeV/c and Intermittency

    NASA Technical Reports Server (NTRS)

    Tannenbaum, M. J.

    1994-01-01

    The concept of "Intermittency" was introduced by Bialas and Peschanski to try to explain the "large" fluctuations of multiplicity in restricted intervals of rapidity or pseudorapidity. A formalism was proposed to study non-statistical (more precisely, non-Poisson) fluctuations as a function of the size of the rapidity interval, and it was further suggested that the "spikes" in the rapidity fluctuations were evidence of fractal or intermittent behavior, in analogy to turbulence in fluid dynamics, which is characterized by self-similar fluctuations at all scales, that is, the absence of a well-defined length scale.
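
    The standard intermittency analysis computes scaled factorial moments F_q over shrinking (pseudo)rapidity bins; a power-law rise of F_q signals non-Poisson, self-similar fluctuations, while a flat F_q, as for the purely Poisson toy data below, does not. A minimal sketch:

    ```python
    import numpy as np

    def factorial_moment(counts, q):
        """Scaled factorial moment <n(n-1)...(n-q+1)> / <n>**q,
        averaged over all bins and events."""
        n = counts.astype(float)
        num = np.ones_like(n)
        for k in range(q):
            num *= n - k
        return num.mean() / n.mean() ** q

    rng = np.random.default_rng(0)
    # Toy events: Poisson multiplicities, rapidity uniform in [-2, 2]
    events = [rng.uniform(-2, 2, size=rng.poisson(60)) for _ in range(500)]

    for m in (4, 8, 16, 32):  # successively finer rapidity binning
        counts = np.concatenate(
            [np.histogram(ev, bins=m, range=(-2, 2))[0] for ev in events])
        print(f"{m:3d} bins: F2 = {factorial_moment(counts, 2):.3f}")
    ```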

  20. Large Interstellar Polarisation Survey: The Dust Elongation When Combining Optical-Submm Polarisation

    NASA Astrophysics Data System (ADS)

    Siebenmorgen, Ralf; Voshchinnikov, N.; Bagnulo, S.; Cox, N.; Cami, J.

    2017-10-01

    The Planck mission has shown that the dust properties of the diffuse ISM vary on large scales; here we present variability on small scales. We present FORS spectro-polarimetry obtained by the Large Interstellar Polarisation Survey along 60 sight-lines. We fit these data, combined with extinction data, using a silicate and carbon dust model with grain sizes ranging from the molecular to the sub-micrometre domain. Large silicates of prolate shape account for the observed polarisation. For 37 sight-lines we complement our data set with UVES high-resolution spectra that establish the presence of single or multiple clouds along individual sight-lines. We find correlations of extinction and of the Serkowski parameters with the dust model parameters, and we find that the presence of multiple clouds depolarises the incoming radiation. However, there is a degeneracy in the dust model between alignment efficiency and the elongation of the grains. This degeneracy can be broken by combining polarisation data from the optical to the submm. This is of wide general interest as it improves the accuracy of deriving dust masses. We show that a flat IR/submm polarisation spectrum with substantial polarisation is predicted from dust models.
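
    The Serkowski parameters referred to above come from fitting the empirical Serkowski law, P(lambda) = P_max exp(-K ln^2(lambda_max/lambda)), to broadband polarimetry. A small sketch with hypothetical measurements:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def serkowski(wav_um, p_max, lam_max, k):
        # Serkowski law: P(lam) = P_max * exp(-K * ln^2(lam_max / lam))
        return p_max * np.exp(-k * np.log(lam_max / wav_um) ** 2)

    # Hypothetical polarisation (percent) vs wavelength (micrometres)
    wav = np.array([0.36, 0.44, 0.55, 0.64, 0.79])
    pol = np.array([1.9, 2.4, 2.6, 2.5, 2.2])

    (p_max, lam_max, k), _ = curve_fit(serkowski, wav, pol, p0=[2.6, 0.55, 1.15])
    print(f"P_max = {p_max:.2f}%  lambda_max = {lam_max:.2f} um  K = {k:.2f}")
    ```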

  1. Architecting for Large Scale Agile Software Development: A Risk-Driven Approach

    DTIC Science & Technology

    2013-05-01

    addressed aspect of scale in agile software development. Practices such as Scrum of Scrums are meant to address orchestration of multiple development...owner, Scrum master) have differing responsibilities from the roles in the existing phase-based waterfall program structures. Such differences may... Scrum . Communication with both internal and external stakeholders must be open and documentation should not be used as a substitute for communication

  2. Using Galaxy to Perform Large-Scale Interactive Data Analyses

    PubMed Central

    Hillman-Jackson, Jennifer; Clements, Dave; Blankenberg, Daniel; Taylor, James; Nekrutenko, Anton

    2014-01-01

    Innovations in biomedical research technologies continue to provide experimental biologists with novel and increasingly large genomic and high-throughput data resources to be analyzed. As creating and obtaining data has become easier, the key decision faced by many researchers is a practical one: where and how should an analysis be performed? Datasets are large, and analysis tool set-up and use are riddled with complexities outside the scope of core research activities. The authors believe that Galaxy provides a powerful solution that simplifies data acquisition and analysis in an intuitive Web application, granting all researchers access to key informatics tools previously only available to computational specialists working in Unix-based environments. We will demonstrate through a series of biomedically relevant protocols how Galaxy specifically brings together (1) data retrieval from public and private sources, for example, UCSC's Eukaryote and Microbial Genome Browsers, (2) custom tools (wrapped Unix functions, format standardization/conversions, interval operations), and (3) third-party analysis tools. PMID:22700312

  3. Large-region acoustic source mapping using a movable array and sparse covariance fitting.

    PubMed

    Zhao, Shengkui; Tuna, Cagdas; Nguyen, Thi Ngoc Tho; Jones, Douglas L

    2017-01-01

    Large-region acoustic source mapping is important for city-scale noise monitoring. Approaches using a single-position measurement scheme to scan large regions using small arrays cannot provide clean acoustic source maps, while deploying large arrays spanning the entire region of interest is prohibitively expensive. A multiple-position measurement scheme is applied to scan large regions at multiple spatial positions using a movable array of small size. Based on the multiple-position measurement scheme, a sparse-constrained multiple-position vectorized covariance matrix fitting approach is presented. In the proposed approach, the overall sample covariance matrix of the incoherent virtual array is first estimated using the multiple-position array data and then vectorized using the Khatri-Rao (KR) product. A linear model is then constructed for fitting the vectorized covariance matrix and a sparse-constrained reconstruction algorithm is proposed for recovering source powers from the model. The user parameter settings are discussed. The proposed approach is tested on a 30 m × 40 m region and a 60 m × 40 m region using simulated and measured data. Much cleaner acoustic source maps and lower sound pressure level errors are obtained compared to the beamforming approaches and the previous sparse approach [Zhao, Tuna, Nguyen, and Jones, Proc. IEEE Intl. Conf. on Acoustics, Speech and Signal Processing (ICASSP) (2016)].
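
    The covariance-fitting step can be sketched compactly: vectorize the sample covariance, build a Khatri-Rao dictionary that maps candidate source powers to that vector, and recover the powers with a sparsity-promoting solver. The toy below uses a single 8-sensor linear array and non-negative least squares as a stand-in for the paper's sparse-constrained reconstruction algorithm.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(1)
    pos = np.arange(8) * 0.5                       # half-wavelength spacing
    grid = np.linspace(-np.pi / 2, np.pi / 2, 61)  # candidate directions
    A = np.exp(1j * 2 * np.pi * np.outer(pos, np.sin(grid)))  # steering matrix

    # Simulate two sources plus noise and form the sample covariance
    p_true = np.zeros(len(grid)); p_true[20], p_true[45] = 4.0, 1.0
    S = np.sqrt(p_true)[:, None] * rng.standard_normal((len(grid), 200))
    N = 0.1 * (rng.standard_normal((8, 200)) + 1j * rng.standard_normal((8, 200)))
    X = A @ S + N
    R = X @ X.conj().T / 200

    # Khatri-Rao model: vec(R) ~ sum_i p_i * (conj(a_i) kron a_i)
    KR = np.stack([np.kron(A[:, i].conj(), A[:, i]) for i in range(len(grid))],
                  axis=1)
    M = np.vstack([KR.real, KR.imag])              # real-valued linear model
    y = np.concatenate([R.flatten("F").real, R.flatten("F").imag])
    p_hat, _ = nnls(M, y)                          # non-negativity in lieu of sparsity
    print("recovered directions (rad):", grid[p_hat > 0.5 * p_hat.max()])
    ```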

  4. Multiple-point principle with a scalar singlet extension of the standard model

    DOE PAGES

    Haba, Naoyuki; Ishida, Hiroyuki; Okada, Nobuchika; ...

    2017-01-21

    Here, we suggest a scalar singlet extension of the standard model, in which the multiple-point principle (MPP) condition of a vanishing Higgs potential at the Planck scale is realized. Although there have been many attempts to realize the MPP at the Planck scale, realizing it while maintaining naturalness is quite difficult. This model can easily achieve the MPP at the Planck scale without large Higgs mass corrections. It is worth noting that the electroweak symmetry can be radiatively broken in our model. From the naturalness point of view, the singlet scalar mass should be of O(1 TeV) or less. Also, we consider a right-handed neutrino extension of the model for neutrino mass generation. The extension does not affect the MPP scenario, and might preserve naturalness with the new particle mass scale beyond a TeV, thanks to accidental cancellation of Higgs mass corrections.

  5. Dewey on Educating Vocation: Bringing Adult Learning to the University

    ERIC Educational Resources Information Center

    Keitges, Mark

    2016-01-01

    This paper addresses Dewey's complex notion of vocation--particularly his idea of multiple vocational activities--and relates it to educating for vocation in colleges and universities. The author argues that higher educators can best respect a student's autonomy as a chooser--with multiple potential vocations--by giving him or her multiple…

  6. Taiwanese Married Women's Lived Experience of Zen Meditation

    ERIC Educational Resources Information Center

    Kang, Hsin-Ru

    2014-01-01

    Due to the impact of Confucianism on Taiwanese society, Taiwanese married women play multiple family roles, including daughter-in-law, wife, mother, and working woman. Playing these multiple roles often brings Taiwanese married women burdens and stress. It is reported that Zen meditation improves people's physical and mental wellbeing.…

  7. The Relationship between Teachers' Computer Self-Efficacy and Technology Integration in a School District's Bring Your Own Technology Initiative

    ERIC Educational Resources Information Center

    Ellis, Ashley F.

    2014-01-01

    The purpose of this mixed methods program evaluation study was to investigate the ways in which one public school district and its teachers implemented a Bring Your Own Technology (BYOT) initiative. This study also measured teachers' computer self-efficacy, as measured by Cassidy and Eachus' (2002) Computer User Self-Efficacy Scale, and…

  8. Towards methodical modelling: Differences between the structure and output dynamics of multiple conceptual models

    NASA Astrophysics Data System (ADS)

    Knoben, Wouter; Woods, Ross; Freer, Jim

    2016-04-01

    Conceptual hydrologic models consist of a particular arrangement of stores, fluxes and transformation functions that represents a catchment's spatial and temporal dynamics, depending on the modeller's choices and intended use. They have the advantages of being computationally efficient, being relatively easy model structures to reconfigure and having relatively low input data demands. This makes them well-suited for large-scale and large-sample hydrology, where appropriately representing the dominant hydrologic functions of a catchment is a main concern. Given these requirements, the number of parameters in the model cannot be too high, to avoid equifinality and identifiability issues. This limits the number and complexity of dominant hydrologic processes the model can represent. Specific purposes and places thus require a specific model, and this has led to an abundance of conceptual hydrologic models. No structured overview of these models exists, and there is no clear method for selecting appropriate model structures for different catchments. This study is a first step towards creating an overview of the elements that make up conceptual models, which may later assist a modeller in finding an appropriate model structure for a given catchment. To this end, this study brings together over 30 past and present conceptual models. The reviewed model structures are simply different configurations of three basic model elements (stores, fluxes and transformation functions), depending on the hydrologic processes the models are intended to represent; a toy configuration is sketched below. Differences also exist in the inner workings of the stores, fluxes and transformations, i.e. the mathematical formulations that describe each model element's intended behaviour. We investigate the hypothesis that different model structures can produce similar behavioural simulations. This can clarify the overview of model elements by grouping elements which are similar, which can improve model structure selection.
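
    The three basic elements named above can be made concrete with a one-bucket toy model: a single store, two outgoing fluxes, and simple transformation functions. The structure, not the specific equations, is the point; the parameter values are arbitrary.

    ```python
    import numpy as np

    def bucket_model(precip, pet, s_max=150.0, k=0.05):
        """One store (s), two fluxes (overflow, drainage) and two
        transformation functions (moisture-limited ET, linear drainage)."""
        s, q = 0.0, []
        for p, e in zip(precip, pet):
            s += p                          # precipitation fills the store
            s -= min(s, e * s / s_max)      # moisture-limited evaporation
            overflow = max(0.0, s - s_max)  # saturation-excess flux
            s = min(s, s_max)
            drain = k * s                   # linear drainage flux
            s -= drain
            q.append(overflow + drain)
        return np.array(q)

    rng = np.random.default_rng(7)
    precip = rng.exponential(2.0, 365)      # toy daily forcing (mm)
    runoff = bucket_model(precip, pet=np.full(365, 3.0))
    print(f"runoff ratio: {runoff.sum() / precip.sum():.2f}")
    ```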

  9. Hierarchical Address Event Routing for Reconfigurable Large-Scale Neuromorphic Systems.

    PubMed

    Park, Jongkil; Yu, Theodore; Joshi, Siddharth; Maier, Christoph; Cauwenberghs, Gert

    2017-10-01

    We present a hierarchical address-event routing (HiAER) architecture for scalable communication of neural and synaptic spike events between neuromorphic processors, implemented with five Xilinx Spartan-6 field-programmable gate arrays and four custom analog neuromorphic integrated circuits serving 262k neurons and 262M synapses. The architecture extends the single-bus address-event representation protocol to a hierarchy of multiple nested buses, routing events across increasing scales of spatial distance. The HiAER protocol provides individually programmable axonal delay in addition to strength for each synapse, lending itself toward biologically plausible neural network architectures, and scales across a range of hierarchies suitable for multichip and multiboard systems in reconfigurable large-scale neuromorphic systems. We show approximately linear scaling of net global synaptic event throughput with the number of routing nodes in the network, at 3.6×10^7 synaptic events per second per 16k-neuron node in the hierarchy.
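
    The routing principle, nested buses that carry an address-event only as far up the hierarchy as its destination requires, can be sketched in a few lines. This is a structural illustration, not the HiAER FPGA implementation.

    ```python
    class Router:
        """Node in a routing tree over a contiguous neuron address range;
        events under this node go down, all others escalate to the parent."""
        def __init__(self, lo, hi, children=()):
            self.lo, self.hi, self.children, self.parent = lo, hi, children, None
            for c in children:
                c.parent = self

        def route(self, dst, hops=0):
            if self.lo <= dst < self.hi:
                for c in self.children:
                    if c.lo <= dst < c.hi:
                        return c.route(dst, hops + 1)
                return hops                          # leaf delivers the spike
            return self.parent.route(dst, hops + 1)  # escalate to a wider bus

    # Two boards of two 4-neuron chips each; addresses nest hierarchically
    chips = [Router(i * 4, (i + 1) * 4) for i in range(4)]
    root = Router(0, 16, (Router(0, 8, chips[:2]), Router(8, 16, chips[2:])))
    print(chips[0].route(dst=13))  # spike from chip 0 crosses the board bus
    ```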

  10. Workshop Report on Additive Manufacturing for Large-Scale Metal Components - Development and Deployment of Metal Big-Area-Additive-Manufacturing (Large-Scale Metals AM) System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Babu, Sudarsanam Suresh; Love, Lonnie J.; Peter, William H.

    Additive manufacturing (AM) is considered an emerging technology that is expected to transform the way industry can make low-volume, high-value complex structures. This disruptive technology promises to replace legacy manufacturing methods for the fabrication of existing components in addition to bringing new innovation for new components with increased functional and mechanical properties. This report outlines the outcome of a workshop on large-scale metal additive manufacturing held at Oak Ridge National Laboratory (ORNL) on March 11, 2016. The charter for the workshop was outlined by the Department of Energy (DOE) Advanced Manufacturing Office program manager. The status and impact of the Big Area Additive Manufacturing (BAAM) system for polymer matrix composites was presented as the background motivation for the workshop. Following this, the extension of the underlying technology to low-cost metals was proposed with the following goals: (i) high deposition rates (approaching 100 lb/h); (ii) low cost (<$10/lb) for steel, iron, aluminum, and nickel, as well as higher-cost titanium; (iii) large components (major axis greater than 6 ft); and (iv) compliance with property requirements. The above concept was discussed in depth by representatives from different industrial sectors including welding, metal fabrication machinery, energy, construction, aerospace and heavy manufacturing. In addition, DOE's newly launched High Performance Computing for Manufacturing (HPC4MFG) program was reviewed. This program will apply thermo-mechanical models to elucidate deeper understanding of the interactions between design, process, and materials during additive manufacturing. Following these presentations, all the attendees took part in a brainstorming session in which everyone identified the top 10 challenges in large-scale metal AM from their own perspective. The feedback was analyzed and grouped into different categories including: (i) CAD-to-PART software, (ii) selection of energy source, (iii) systems development, (iv) material feedstock, (v) process planning, (vi) residual stress and distortion, (vii) post-processing, (viii) qualification of parts, (ix) supply chain and (x) business case. Furthermore, an open innovation network methodology was proposed to accelerate the development and deployment of new large-scale metal additive manufacturing technology with the goal of creating a new generation of high-deposition-rate equipment, affordable feedstocks, and large metallic components to enhance America's economic competitiveness.

  11. Large-scale quarantine following biological terrorism in the United States: scientific examination, logistic and legal limits, and possible consequences.

    PubMed

    Barbera, J; Macintyre, A; Gostin, L; Inglesby, T; O'Toole, T; DeAtley, C; Tonat, K; Layton, M

    2001-12-05

    Concern for potential bioterrorist attacks causing mass casualties has increased recently. Particular attention has been paid to scenarios in which a biological agent capable of person-to-person transmission, such as smallpox, is intentionally released among civilians. Multiple public health interventions are possible to effect disease containment in this context. One disease control measure that has been regularly proposed in various settings is the imposition of large-scale or geographic quarantine on the potentially exposed population. Although large-scale quarantine has not been implemented in recent US history, it has been used on a small scale in biological hoaxes, and it has been invoked in federally sponsored bioterrorism exercises. This article reviews the scientific principles that are relevant to the likely effectiveness of quarantine, the logistic barriers to its implementation, legal issues that a large-scale quarantine raises, and possible adverse consequences that might result from quarantine action. Imposition of large-scale quarantine-compulsory sequestration of groups of possibly exposed persons or human confinement within certain geographic areas to prevent spread of contagious disease-should not be considered a primary public health strategy in most imaginable circumstances. In the majority of contexts, other less extreme public health actions are likely to be more effective and create fewer unintended adverse consequences than quarantine. Actions and areas for future research, policy development, and response planning efforts are provided.

  12. On the Large-Scaling Issues of Cloud-based Applications for Earth Science Data

    NASA Astrophysics Data System (ADS)

    Hua, H.

    2016-12-01

    Next generation science data systems are needed to address the incoming flood of data from new missions such as NASA's SWOT and NISAR, whose SAR data volumes and data throughput rates are orders of magnitude larger than those of present day missions. Existing missions, such as OCO-2, may also require rapid turn-around for processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the processing needs. Additionally, traditional means of procuring hardware on-premise are already limited by facilities capacity constraints for these new missions. Experience has shown that embracing efficient cloud computing approaches for large-scale science data systems requires more than just moving existing code to cloud environments. At large cloud scales, we need to deal with scaling and cost issues. We present our experiences deploying multiple instances of our hybrid-cloud computing science data system (HySDS) to support large-scale processing of Earth Science data products. We will explore optimization approaches for getting the best performance out of hybrid-cloud computing, as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer 75%-90% cost savings but with an unpredictable computing environment driven by market forces.
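
    As a concrete illustration of tapping the spot market, the sketch below requests transient worker capacity through boto3. The AMI ID, instance type, and bid are placeholders, and this is not the HySDS tooling itself, which must additionally handle instances being reclaimed by the market mid-run.

    ```python
    import boto3

    ec2 = boto3.client("ec2", region_name="us-west-2")
    resp = ec2.request_spot_instances(
        SpotPrice="0.25",            # maximum bid in USD/hour (placeholder)
        InstanceCount=20,            # burst of identical workers
        Type="one-time",
        LaunchSpecification={
            "ImageId": "ami-0123456789abcdef0",   # hypothetical worker AMI
            "InstanceType": "c5.4xlarge",
        },
    )
    for req in resp["SpotInstanceRequests"]:
        print(req["SpotInstanceRequestId"], req["State"])
    ```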

  13. Metal stack optimization for low-power and high-density for N7-N5

    NASA Astrophysics Data System (ADS)

    Raghavan, P.; Firouzi, F.; Matti, L.; Debacker, P.; Baert, R.; Sherazi, S. M. Y.; Trivkovic, D.; Gerousis, V.; Dusa, M.; Ryckaert, J.; Tokei, Z.; Verkest, D.; McIntyre, G.; Ronse, K.

    2016-03-01

    One of the key challenges in scaling logic down to N7 and N5 is the requirement of self-aligned multiple patterning for the metal stack. This comes at a large backend-of-line cost, and therefore careful stack optimization is required. The various layers in the stack serve different purposes, so the choice of pitch and number of layers is critical. Furthermore, at the ultra-scaled dimensions of N7 and N5, the number of patterning options is much larger, ranging from multiple litho-etch (LE) and EUV to SADP/SAQP, and the right choice among them is also needed. Patterning techniques that use a full grating of wires, such as SADP/SAQP, introduce a high level of metal dummies into the design. This implies a large capacitance penalty, and therefore large performance and power penalties; this is often mitigated with extra masking strategies. This paper discusses a holistic view of metal stack optimization from the standard cell level all the way to routing, and the corresponding trade-offs that exist in this space.

  14. Large Interstellar Polarisation Survey. II. UV/optical study of cloud-to-cloud variations of dust in the diffuse ISM

    NASA Astrophysics Data System (ADS)

    Siebenmorgen, R.; Voshchinnikov, N. V.; Bagnulo, S.; Cox, N. L. J.; Cami, J.; Peest, C.

    2018-03-01

    It is well known that the dust properties of the diffuse interstellar medium exhibit variations towards different sight-lines on large scales. We have investigated the variability of the dust characteristics on small scales, and from cloud to cloud. We use low-resolution spectro-polarimetric data obtained in the context of the Large Interstellar Polarisation Survey (LIPS) towards 59 sight-lines in the Southern Hemisphere, and we fit these data using a dust model composed of silicate and carbon particles with sizes from the molecular to the sub-micrometre domain. Large (≥6 nm) silicates of prolate shape account for the observed polarisation. For 32 sight-lines we complement our data set with UVES archive high-resolution spectra, which enable us to establish the presence of a single cloud or of multiple clouds towards individual sight-lines. We find that the majority of these 35 sight-lines intersect two or more clouds, while eight of them are dominated by a single absorbing cloud. We confirm several correlations of extinction and of the parameters of the Serkowski law with dust parameters, but we also find previously undetected correlations between these parameters that are valid only in single-cloud sight-lines. We find that interstellar polarisation from multiple-cloud sight-lines is smaller than from single-cloud sight-lines, showing that the presence of a second or further clouds depolarises the incoming radiation. We find large variations of the dust characteristics from cloud to cloud. However, when we average a sufficiently large number of clouds in single-cloud or multiple-cloud sight-lines, we always retrieve similar mean dust parameters. The typical dust abundances of the single-cloud cases are [C]/[H] = 92 ppm and [Si]/[H] = 20 ppm.

  15. Large Scale Ionospheric Response During March 17, 2013 Geomagnetic Storm: Reanalysis Based on Multiple Satellites Observations and TIEGCM Simulations

    NASA Astrophysics Data System (ADS)

    Yue, X.; Wang, W.; Schreiner, W. S.; Kuo, Y. H.; Lei, J.; Liu, J.; Burns, A. G.; Zhang, Y.; Zhang, S.

    2015-12-01

    Based on slant total electron content (TEC) observations made by ~10 satellites and ~450 ground IGS GNSS stations, we constructed a 4-D ionospheric electron density reanalysis during the March 17, 2013 geomagnetic storm. Four main large-scale ionospheric disturbances are identified from the reanalysis: (1) a positive storm during the initial phase; (2) a storm enhanced density (SED) structure in both the northern and southern hemispheres; (3) a large positive storm during the main phase; and (4) a significant negative storm at middle and low latitudes during the recovery phase. We then ran the NCAR TIEGCM with the Heelis empirical electric potential model as the high-latitude input. The TIEGCM reproduces three of the four large-scale structures (all except the SED) very well. We further analyzed the altitudinal variations of these large-scale disturbances and found several notable features, such as the altitude variation of the SED and the rotation of the positive/negative storm phase with local time. These structures could not be identified clearly with traditionally used data sources, which have either no global coverage or no vertical resolution. The drivers, such as neutral wind/density and electric fields from the TIEGCM simulations, are also analyzed to self-consistently explain the identified disturbance features.

  16. Genetic structuring of northern myotis (Myotis septentrionalis) at multiple spatial scales

    USGS Publications Warehouse

    Johnson, Joshua B.; Roberts, James H.; King, Timothy L.; Edwards, John W.; Ford, W. Mark; Ray, David A.

    2014-01-01

    Although groups of bats may be genetically distinguishable at large spatial scales, the effects of forest disturbances, particularly permanent land use conversions, on fine-scale population structure and gene flow of summer aggregations of philopatric bat species are less clear. We genotyped and analyzed variation at 10 nuclear DNA microsatellite markers in 182 individuals of the forest-dwelling northern myotis (Myotis septentrionalis) at multiple spatial scales, from within first-order watersheds scaling up to larger regional areas in West Virginia and New York. Our results indicate that groups of northern myotis were genetically indistinguishable at any spatial scale we considered, and the collective population maintained high genetic diversity. It is likely that the ability to migrate, exploit small forest patches, and use networks of mating sites located throughout the Appalachian Mountains, Interior Highlands, and elsewhere in the hibernation range has allowed northern myotis to maintain high genetic diversity and gene flow regardless of forest disturbances at local and regional spatial scales. A consequence of maintaining high gene flow might be the potential to minimize genetic founder effects following the population declines currently caused by the enzootic White-nose Syndrome.

  17. GC13I-0857: Designing a Frost Forecasting Service for Small Scale Tea Farmers in East Africa

    NASA Technical Reports Server (NTRS)

    Adams, Emily C.; Wanjohi, James Nyaga; Ellenburg, Walter Lee; Limaye, Ashutosh S.; Mugo, Robinson M.; Flores Cordova, Africa Ixmucane; Irwin, Daniel; Case, Jonathan; Malaso, Susan; Sedah, Absae

    2017-01-01

    Kenya is the third largest tea exporter in the world, producing 10% of the world's black tea. Sixty percent of this production comes largely from small-scale tea holders, with an average farm size of 1.04 acres and an annual net income of $1,075. According to a recent evaluation, a typical frost event in the tea growing region causes about $200 in losses, which can be catastrophic for a smallholder farm. A 72-hour frost forecast would provide these small-scale tea farmers with enough notice to reduce losses by approximately $80 annually. With this knowledge, SERVIR, a joint NASA-USAID initiative that brings Earth observations for improved decision making in developing countries, sought to design a frost monitoring and forecasting service that would provide farmers with enough lead time to react to and protect against a forecasted frost occurrence on their farm. SERVIR Eastern and Southern Africa, through its implementing partner, the Regional Centre for Mapping of Resources for Development (RCMRD), designed a service that included multiple stakeholder engagement events whereby stakeholders from the tea industry value chain were invited to share their experiences so that the exact needs and flow of information could be identified. This unique event enabled the design of a service that fit the specifications of the stakeholders. The monitoring service component uses the MODIS Land Surface Temperature product to identify frost occurrences in near-real time. The prediction component, currently under testing, uses the 2-m air temperature, relative humidity, and 10-m wind speed from a series of high-resolution Weather Research and Forecasting (WRF) numerical weather prediction model runs over eastern Kenya as inputs to a frost prediction algorithm. Accuracy and sensitivity of the algorithm are being assessed with observations collected from the farmers using a smartphone app developed specifically to report frost occurrences, and with data shared through the partner network developed at the stakeholder engagement meeting. This presentation will illustrate the efficacy of our frost forecasting algorithm, and a way forward for incorporating these forecasts in a meaningful way for the key decision makers - the small-scale farmers of East Africa.
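
    The abstract names the WRF inputs (2-m temperature, relative humidity, 10-m wind) but not the algorithm itself, so the rule below is only a plausible stand-in: radiative frost is favoured on cold, calm, near-saturated nights. The thresholds are invented for illustration.

    ```python
    import math

    def frost_risk(t2m_c, rh_pct, wind10m_ms):
        """Toy frost-risk screen from WRF-style fields; thresholds are
        illustrative, not the operational service's algorithm."""
        a, b = 17.625, 243.04                  # Magnus dew-point constants
        g = math.log(rh_pct / 100.0) + a * t2m_c / (b + t2m_c)
        dew_c = b * g / (a - g)                # dew-point temperature (C)
        if t2m_c <= 2.0 and wind10m_ms <= 2.0 and t2m_c - dew_c <= 2.0:
            return "high"
        if t2m_c <= 4.0 and wind10m_ms <= 3.0:
            return "moderate"
        return "low"

    print(frost_risk(t2m_c=1.5, rh_pct=92.0, wind10m_ms=1.0))  # -> high
    ```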

  18. Designing a Frost Forecasting Service for Small Scale Tea Farmers in East Africa

    NASA Astrophysics Data System (ADS)

    Adams, E. C.; Nyaga, J. W.; Ellenburg, W. L.; Limaye, A. S.; Mugo, R. M.; Flores Cordova, A. I.; Irwin, D.; Case, J.; Malaso, S.; Sedah, A.

    2017-12-01

    Kenya is the third largest tea exporter in the world, producing 10% of the world's black tea. Sixty percent of this production comes largely from small-scale tea holders, with an average farm size of 1.04 acres and an annual net income of $1,075. According to a recent evaluation, a typical frost event in the tea growing region causes about $200 in losses, which can be catastrophic for a smallholder farm. A 72-hour frost forecast would provide these small-scale tea farmers with enough notice to reduce losses by approximately $80 annually. With this knowledge, SERVIR, a joint NASA-USAID initiative that brings Earth observations for improved decision making in developing countries, sought to design a frost monitoring and forecasting service that would provide farmers with enough lead time to react to and protect against a forecasted frost occurrence on their farm. SERVIR Eastern and Southern Africa, through its implementing partner, the Regional Centre for Mapping of Resources for Development (RCMRD), designed a service that included multiple stakeholder engagement events whereby stakeholders from the tea industry value chain were invited to share their experiences so that the exact needs and flow of information could be identified. This unique event enabled the design of a service that fit the specifications of the stakeholders. The monitoring service component uses the MODIS Land Surface Temperature product to identify frost occurrences in near-real time. The prediction component, currently under testing, uses the 2-m air temperature, relative humidity, and 10-m wind speed from a series of high-resolution Weather Research and Forecasting (WRF) numerical weather prediction model runs over eastern Kenya as inputs to a frost prediction algorithm. Accuracy and sensitivity of the algorithm are being assessed with observations collected from the farmers using a smartphone app developed specifically to report frost occurrences, and with data shared through the partner network developed at the stakeholder engagement meeting. This presentation will illustrate the efficacy of our frost forecasting algorithm, and a way forward for incorporating these forecasts in a meaningful way for the key decision makers - the small-scale farmers of East Africa.

  19. Multi-scale temporal and spatial variation in genotypic composition of Cladophora-borne Escherichia coli populations in Lake Michigan.

    PubMed

    Badgley, Brian D; Ferguson, John; Vanden Heuvel, Amy; Kleinheinz, Gregory T; McDermott, Colleen M; Sandrin, Todd R; Kinzelman, Julie; Junion, Emily A; Byappanahalli, Muruleedhara N; Whitman, Richard L; Sadowsky, Michael J

    2011-01-01

    High concentrations of Escherichia coli in mats of Cladophora in the Great Lakes have raised concern over the continued use of this bacterium as an indicator of microbial water quality. Determining the impacts of these environmentally abundant E. coli, however, necessitates a better understanding of their ecology. In this study, the population structure of 4285 Cladophora-borne E. coli isolates, obtained over multiple three-day periods from Lake Michigan Cladophora mats in 2007-2009, was examined by using DNA fingerprint analyses. In contrast to previous studies that have been done using isolates from attached Cladophora obtained over large time scales and distances, the extensive sampling done here on free-floating mats over successive days at multiple sites provided a large dataset that allowed for a detailed examination of changes in population structure over a wide range of spatial and temporal scales. While Cladophora-borne E. coli populations were highly diverse and consisted of many unique isolates, multiple clonal groups were also present and accounted for approximately 33% of all isolates examined. Patterns in population structure were also evident. At the broadest scales, E. coli populations showed some temporal clustering when examined by year, but did not show good spatial distinction among sites. E. coli population structure also showed significant patterns at much finer temporal scales. Populations were distinct on an individual mat basis at a given site, and on individual days within a single mat. Results of these studies indicate that Cladophora-borne E. coli populations consist of a mixture of stable, and possibly naturalized, strains that persist during the life of the mat, and more unique, transient strains that can change over rapid time scales. It is clear that further study of microbial processes at fine spatial and temporal scales is needed, and that caution must be taken when interpolating short-term microbial dynamics from results obtained from weekly or monthly samples. Copyright © 2010 Elsevier Ltd. All rights reserved.

  20. Multi-scale temporal and spatial variation in genotypic composition of Cladophora-borne Escherichia coli populations in Lake Michigan

    USGS Publications Warehouse

    Badgley, B.D.; Ferguson, J.; Heuvel, A.V.; Kleinheinz, G.T.; McDermott, C.M.; Sandrin, T.R.; Kinzelman, J.; Junion, E.A.; Byappanahalli, M.N.; Whitman, R.L.; Sadowsky, M.J.

    2011-01-01

    High concentrations of Escherichia coli in mats of Cladophora in the Great Lakes have raised concern over the continued use of this bacterium as an indicator of microbial water quality. Determining the impacts of these environmentally abundant E. coli, however, necessitates a better understanding of their ecology. In this study, the population structure of 4285 Cladophora-borne E. coli isolates, obtained over multiple three-day periods from Lake Michigan Cladophora mats in 2007-2009, was examined by using DNA fingerprint analyses. In contrast to previous studies that have been done using isolates from attached Cladophora obtained over large time scales and distances, the extensive sampling done here on free-floating mats over successive days at multiple sites provided a large dataset that allowed for a detailed examination of changes in population structure over a wide range of spatial and temporal scales. While Cladophora-borne E. coli populations were highly diverse and consisted of many unique isolates, multiple clonal groups were also present and accounted for approximately 33% of all isolates examined. Patterns in population structure were also evident. At the broadest scales, E. coli populations showed some temporal clustering when examined by year, but did not show good spatial distinction among sites. E. coli population structure also showed significant patterns at much finer temporal scales. Populations were distinct on an individual mat basis at a given site, and on individual days within a single mat. Results of these studies indicate that Cladophora-borne E. coli populations consist of a mixture of stable, and possibly naturalized, strains that persist during the life of the mat, and more unique, transient strains that can change over rapid time scales. It is clear that further study of microbial processes at fine spatial and temporal scales is needed, and that caution must be taken when interpolating short-term microbial dynamics from results obtained from weekly or monthly samples.

  1. Generalized Master Equation with Non-Markovian Multichromophoric Förster Resonance Energy Transfer for Modular Exciton Densities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jang, Seogjoo; Hoyer, Stephan; Fleming, Graham

    2014-10-31

    A generalized master equation (GME) governing quantum evolution of modular exciton density (MED) is derived for large scale light harvesting systems composed of weakly interacting modules of multiple chromophores. The GME-MED offers a practical framework to incorporate real time coherent quantum dynamics calculations of small length scales into dynamics over large length scales, and also provides a non-Markovian generalization and rigorous derivation of the Pauli master equation employing multichromophoric Förster resonance energy transfer rates. A test of the GME-MED for four sites of the Fenna-Matthews-Olson complex demonstrates how coherent dynamics of excitonic populations over coupled chromophores can be accurately described by transitions between subgroups (modules) of delocalized excitons. Application of the GME-MED to the exciton dynamics between a pair of light harvesting complexes in purple bacteria demonstrates its promise as a computationally efficient tool to investigate large scale exciton dynamics in complex environments.
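
    In the Markovian limit, the equation the GME-MED generalizes is the Pauli master equation for module populations P_m, with k_mn the multichromophoric FRET rate from module n to module m; the non-Markovian GME replaces the rates with memory-kernel convolutions (the kernel notation K_mn here is our assumption):

    ```latex
    % Pauli (Markovian) limit:
    \frac{dP_m}{dt} = \sum_{n \neq m} \left[ k_{mn} P_n(t) - k_{nm} P_m(t) \right]
    % Non-Markovian generalization with memory kernels:
    \frac{dP_m}{dt} = \sum_{n \neq m} \int_0^{t}
        \left[ K_{mn}(t-\tau) P_n(\tau) - K_{nm}(t-\tau) P_m(\tau) \right] d\tau
    ```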

  2. Assessing the influence of rater and subject characteristics on measures of agreement for ordinal ratings.

    PubMed

    Nelson, Kerrie P; Mitani, Aya A; Edwards, Don

    2017-09-10

    Widespread inconsistencies are commonly observed between physicians' ordinal classifications in screening test results such as mammography. These discrepancies have motivated large-scale agreement studies where many raters contribute ratings. The primary goal of these studies is to identify factors related to physicians and patients' test results, which may lead to stronger consistency between raters' classifications. While ordered categorical scales are frequently used to classify screening test results, very few statistical approaches exist to model agreement between multiple raters. Here we develop a flexible and comprehensive approach to assess the influence of rater and subject characteristics on agreement between multiple raters' ordinal classifications in large-scale agreement studies. Our approach is based upon the class of generalized linear mixed models. Novel summary model-based measures are proposed to assess agreement between all, or a subgroup of, raters, such as experienced physicians. Hypothesis tests are described to formally identify factors, such as physicians' level of experience, that play an important role in improving consistency of ratings between raters. We demonstrate how unique characteristics of individual raters can be assessed via conditional modes generated during the modeling process. Simulation studies are presented to demonstrate the performance of the proposed methods and summary measure of agreement. The methods are applied to a large-scale mammography agreement study to investigate the effects of rater and patient characteristics on the strength of agreement between radiologists. Copyright © 2017 John Wiley & Sons, Ltd.
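
    The modelling idea can be sketched by simulation: ordinal ratings arise from a latent-variable cumulative-probit mixed model with subject and rater random effects, and a model-based agreement summary is then computed. The variance components, cutpoints, and the simple pairwise exact-agreement summary below are illustrative assumptions, not the authors' exact measures.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    cutpoints = np.array([-1.0, 0.0, 1.2])   # four ordinal categories
    sigma_subj, sigma_rater = 1.0, 0.4       # latent-scale variance components

    def simulate_ratings(n_subjects=200, n_raters=10):
        """Latent score = subject effect + rater effect + standard noise,
        thresholded at the cutpoints to give ordinal categories."""
        subj = rng.normal(0, sigma_subj, n_subjects)
        rater = rng.normal(0, sigma_rater, n_raters)
        latent = subj[:, None] + rater[None, :] \
                 + rng.normal(0, 1, (n_subjects, n_raters))
        return np.digitize(latent, cutpoints)

    y = simulate_ratings()
    n_raters = y.shape[1]
    # Model-based summary: chance that two randomly chosen raters agree
    # exactly on a random subject (one of several possible measures)
    pairs = [(i, j) for i in range(n_raters) for j in range(i + 1, n_raters)]
    agree = np.mean([np.mean(y[:, i] == y[:, j]) for i, j in pairs])
    print(f"pairwise exact agreement: {agree:.3f}")
    ```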

  3. Low frequency steady-state brain responses modulate large scale functional networks in a frequency-specific means.

    PubMed

    Wang, Yi-Feng; Long, Zhiliang; Cui, Qian; Liu, Feng; Jing, Xiu-Juan; Chen, Heng; Guo, Xiao-Nan; Yan, Jin H; Chen, Hua-Fu

    2016-01-01

    Neural oscillations are essential for brain functions. Research has suggested that the frequency of neural oscillations is lower for more integrative and remote communications. In this vein, some resting-state studies have suggested that large scale networks function in the very low frequency range (<1 Hz). However, it is difficult to determine the frequency characteristics of brain networks because both resting-state studies and conventional frequency tagging approaches cannot simultaneously capture multiple large scale networks in controllable cognitive activities. In this preliminary study, we aimed to examine whether large scale networks can be modulated by task-induced low frequency steady-state brain responses (lfSSBRs) in a frequency-specific pattern. In a revised attention network test, lfSSBRs were evoked in the triple network system and the sensory-motor system, indicating that large scale networks can be modulated in a frequency tagging manner. Furthermore, inter- and intranetwork synchronization and coherence were increased at the fundamental frequency and the first harmonic rather than at other frequency bands, indicating frequency-specific modulation of information communication. However, there was no difference among attention conditions, indicating that lfSSBRs modulate the general attention state much more strongly than they distinguish between attention conditions. This study provides insights into the advantages and mechanism of lfSSBRs. More importantly, it paves a new way to investigate frequency-specific large scale brain activities. © 2015 Wiley Periodicals, Inc.

  4. Integrated Forecast-Decision Systems For River Basin Planning and Management

    NASA Astrophysics Data System (ADS)

    Georgakakos, A. P.

    2005-12-01

    A central application of climatology, meteorology, and hydrology is the generation of reliable forecasts for water resources management. In principle, effective use of forecasts could improve water resources management by providing extra protection against floods, mitigating the adverse effects of droughts, generating more hydropower, facilitating recreational activities, and minimizing the impacts of extreme events on the environment and the ecosystems. In practice, however, realization of these benefits depends on three requisite elements. First is the skill and reliability of forecasts. Second is the existence of decision support methods/systems with the ability to properly utilize forecast information. And third is the capacity of the institutional infrastructure to incorporate the information provided by the decision support systems into the decision making processes. This presentation discusses several decision support systems (DSS) using ensemble forecasting that have been developed by the Georgia Water Resources Institute for river basin management. These DSS are currently operational in Africa, Europe, and the US and address integrated water resources and energy planning and management in river basins with multiple water uses, multiple relevant temporal and spatial scales, and multiple decision makers. The article discusses the methods used and advocates that the design, development, and implementation of effective forecast-decision support systems must bring together disciplines, people, and institutions necessary to address today's complex water resources challenges.

  5. Multiple-basin energy landscapes for large-amplitude conformational motions of proteins: Structure-based molecular dynamics simulations

    PubMed Central

    Okazaki, Kei-ichi; Koga, Nobuyasu; Takada, Shoji; Onuchic, Jose N.; Wolynes, Peter G.

    2006-01-01

    Biomolecules often undergo large-amplitude motions when they bind or release other molecules. Unlike macroscopic machines, these biomolecular machines can partially disassemble (unfold) and then reassemble (fold) during such transitions. Here we put forward a minimal structure-based model, the “multiple-basin model,” that can directly be used for molecular dynamics simulation of even very large biomolecular systems so long as the endpoints of the conformational change are known. We investigate the model by simulating large-scale motions of four proteins: glutamine-binding protein, S100A6, dihydrofolate reductase, and HIV-1 protease. The mechanisms of conformational transition depend on the protein basin topologies and change with temperature near the folding transition. The conformational transition rate varies linearly with driving force over a fairly large range. This linearity appears to be a consequence of partial unfolding during the conformational transition. PMID:16877541
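
    The "multiple-basin" coupling can be sketched as the lower eigenvalue of a 2x2 matrix that mixes two single-basin, structure-based energies. The toy implementation below assumes that form; the coupling strength delta and basin offset dv are illustrative parameters, not values from the paper.

    ```python
    import numpy as np

    def multiple_basin_potential(v1, v2, delta, dv=0.0):
        """Lower eigenvalue of [[V1 + dV, delta], [delta, V2]].

        v1, v2 : single-basin (e.g., Go-model) energies biased toward the
                 two endpoint structures; delta couples the basins and
                 smooths the barrier; dv biases one basin over the other.
        """
        a, b = v1 + dv, v2
        return 0.5 * (a + b) - np.sqrt(0.25 * (a - b) ** 2 + delta ** 2)

    # 1-D toy example: two harmonic basins along a reaction coordinate x.
    x = np.linspace(-2, 2, 401)
    v_open, v_closed = (x + 1) ** 2, (x - 1) ** 2
    v_mb = multiple_basin_potential(v_open, v_closed, delta=0.3)
    barrier = v_mb[len(x) // 2] - v_mb.min()   # value at x=0 minus minimum
    print(f"barrier between basins ~ {barrier:.2f} (toy units)")
    ```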

  6. Efficient and Extensible Quasi-Explicit Modular Nonlinear Multiscale Battery Model: GH-MSMD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Gi-Heon; Smith, Kandler; Lawrence-Simon, Jake

    Complex physics and long computation time hinder the adoption of computer aided engineering models in the design of large-format battery cells and systems. A modular, efficient battery simulation model -- the multiscale multidomain (MSMD) model -- was previously introduced to aid the scale-up of Li-ion material and electrode designs to complete cell and pack designs, capturing electrochemical interplay with 3-D electronic current pathways and thermal response. Here, this paper enhances the computational efficiency of the MSMD model using a separation of time-scales principle to decompose model field variables. The decomposition provides a quasi-explicit linkage between the multiple length-scale domains and thus reduces time-consuming nested iteration when solving model equations across multiple domains. In addition to particle-, electrode- and cell-length scales treated in the previous work, the present formulation extends to bus bar- and multi-cell module-length scales. We provide example simulations for several variants of GH electrode-domain models.

  7. Efficient and Extensible Quasi-Explicit Modular Nonlinear Multiscale Battery Model: GH-MSMD

    DOE PAGES

    Kim, Gi-Heon; Smith, Kandler; Lawrence-Simon, Jake; ...

    2017-03-24

    Complex physics and long computation time hinder the adoption of computer aided engineering models in the design of large-format battery cells and systems. A modular, efficient battery simulation model -- the multiscale multidomain (MSMD) model -- was previously introduced to aid the scale-up of Li-ion material and electrode designs to complete cell and pack designs, capturing electrochemical interplay with 3-D electronic current pathways and thermal response. Here, this paper enhances the computational efficiency of the MSMD model using a separation of time-scales principle to decompose model field variables. The decomposition provides a quasi-explicit linkage between the multiple length-scale domains and thus reduces time-consuming nested iteration when solving model equations across multiple domains. In addition to particle-, electrode- and cell-length scales treated in the previous work, the present formulation extends to bus bar- and multi-cell module-length scales. We provide example simulations for several variants of GH electrode-domain models.

  8. Multiple mechanisms generate a universal scaling with dissipation for the air-water gas transfer velocity

    NASA Astrophysics Data System (ADS)

    Katul, Gabriel; Liu, Heping

    2017-02-01

    A large corpus of field and laboratory experiments supports the finding that the water-side transfer velocity k_L of sparingly soluble gases near air-water interfaces scales as k_L ~ (νε)^(1/4), where ν is the kinematic water viscosity and ε is the mean turbulent kinetic energy dissipation rate. Originally predicted from surface renewal theory, this scaling appears to hold for marine and coastal systems and across many environmental conditions. It is shown that multiple approaches to representing the effects of turbulence on k_L lead to this expression when the Kolmogorov microscale is assumed to be the most efficient transporting eddy near the interface. The approaches considered range from simplified surface renewal schemes with distinct models for renewal durations, to scaling and dimensional considerations, to a new structure function approach derived using analogies between scalar and momentum transfer. The work offers a new perspective as to why the aforementioned 1/4 scaling is robust.
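
    A minimal numeric illustration of the 1/4-power scaling stated above. The dimensionless prefactor alpha is an assumed placeholder (practical formulations also carry a Schmidt-number dependence), not a value from the paper.

    ```python
    # Illustration of k_L ~ (nu * eps)**0.25; (nu*eps)**(1/4) has units of
    # velocity (m/s), so the prefactor is dimensionless.
    nu = 1.0e-6          # kinematic viscosity of water, m^2/s
    alpha = 0.4          # hypothetical dimensionless prefactor

    for eps in (1e-8, 1e-6, 1e-4):   # dissipation rates, m^2/s^3
        k_l = alpha * (nu * eps) ** 0.25
        print(f"eps = {eps:.0e} m^2/s^3  ->  k_L ~ {k_l:.2e} m/s")
    ```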

  9. The long-term demographic role of community-based family planning in rural Bangladesh.

    PubMed

    Phillips, J F; Hossain, M B; Arends-Kuenning, M

    1996-01-01

    Experimental studies demonstrating the effectiveness of nonclinical distribution of contraceptives are typically conducted in settings where contraceptive use is low and unmet need is extensive. Determining the long-term role of active outreach programs after initial demand is met represents an increasingly important policy issue in Asia, where contraceptive prevalence is high and fixed service points are conveniently available. This article examines the long-term rationale for household family planning in Bangladesh, where growing use of contraceptives, rapid fertility decline, and normative change in reproductive preferences are in progress, bringing into question the rationale for large-scale deployment of paid outreach workers. Longitudinal data are analyzed that record outreach encounters and contraceptive use dynamics in a large rural population. Findings demonstrate that outreach has a continuing impact on program effectiveness, even after a decade of household visitation. The policy implications of this finding are reviewed.

  10. Performance of distributed multiscale simulations

    PubMed Central

    Borgdorff, J.; Ben Belgacem, M.; Bona-Casas, C.; Fazendeiro, L.; Groen, D.; Hoenen, O.; Mizeranschi, A.; Suter, J. L.; Coster, D.; Coveney, P. V.; Dubitzky, W.; Hoekstra, A. G.; Strand, P.; Chopard, B.

    2014-01-01

    Multiscale simulations model phenomena across natural scales using monolithic or component-based code, running on local or distributed resources. In this work, we investigate the performance of distributed multiscale computing of component-based models, guided by six multiscale applications with different characteristics and from several disciplines. Three modes of distributed multiscale computing are identified: supplementing local dependencies with large-scale resources, load distribution over multiple resources, and load balancing of small- and large-scale resources. We find that the first mode has the apparent benefit of increasing simulation speed, and the second mode can increase simulation speed if local resources are limited. Depending on resource reservation and model coupling topology, the third mode may result in a reduction of resource consumption. PMID:24982258

  11. Building International Research Partnerships in the North Atlantic-Arctic Region

    NASA Astrophysics Data System (ADS)

    Benway, Heather M.; Hofmann, Eileen; St. John, Michael

    2014-09-01

    The North Atlantic-Arctic region, which is critical to the health and socioeconomic well being of North America and Europe, is susceptible to climate-driven changes in circulation, biogeochemistry, and marine ecosystems. The need for strong investment in the study of biogeochemical and ecosystem processes and interactions with physical processes over a range of time and space scales in this region was clearly stated in the 2013 Galway Declaration, an intergovernmental statement on Atlantic Ocean cooperation (http://europa.eu/rapid/press-release_IP-13-459_en.htm). Subsequently, a workshop was held to bring together researchers from the United States, Canada, and Europe with expertise across multiple disciplines to discuss an international research initiative focused on key features, processes, and ecosystem services (e.g., Atlantic Meridional Overturning Circulation, spring bloom dynamics, fisheries, etc.) and associated sensitivities to climate changes.

  12. Hydrological Forecasting Practices in Brazil

    NASA Astrophysics Data System (ADS)

    Fan, Fernando; Paiva, Rodrigo; Collischonn, Walter; Ramos, Maria-Helena

    2016-04-01

    This work reviews current hydrological and flood forecasting practices in Brazil, including the main forecast applications, the different techniques currently employed, and the institutions involved in forecast generation. A brief overview of Brazil is provided, including aspects related to its geography, climate, hydrology, and flood hazards. A general discussion of Brazilian practices in short- and medium-range hydrological forecasting is presented. Detailed examples are given of hydrological forecasting systems, based on the large-scale hydrological model MGB-IPH, that are operational or in a research/pre-operational phase. Finally, we offer some observations on the current state of forecasting practice in Brazil and on the perspectives for the future.

  13. Exploiting Locality in Quantum Computation for Quantum Chemistry.

    PubMed

    McClean, Jarrod R; Babbush, Ryan; Love, Peter J; Aspuru-Guzik, Alán

    2014-12-18

    Accurate prediction of chemical and material properties from first-principles quantum chemistry is a challenging task on traditional computers. Recent developments in quantum computation offer a route toward highly accurate solutions with polynomial cost; however, this solution still carries a large overhead. In this Perspective, we aim to bring together known results about the locality of physical interactions from quantum chemistry with ideas from quantum computation. We show that the utilization of spatial locality combined with the Bravyi-Kitaev transformation offers an improvement in the scaling of known quantum algorithms for quantum chemistry, and we provide numerical examples to help illustrate this point. We combine these developments to improve the outlook for the future of quantum chemistry on quantum computers.

  14. How can the impact of PACS on inpatient length of hospital stay be established?

    NASA Astrophysics Data System (ADS)

    Bryan, Stirling; Muris, Nicole; Keen, Justin; Weatherburn, Gwyneth C.; Buxton, Martin J.

    1994-05-01

    Many have argued that the introduction of a large-scale PACS system into a hospital will bring about reductions in the length of inpatient hospital stays. There is currently no convincing empirical evidence to support such claims. As part of the independent evaluation exercise being undertaken alongside the Hammersmith Hospital PACS implementation, an assessment is being made of the impact of PACS on length of stay for selected patient groups. This paper reports the general research methods being employed to undertake this assessment and provides some baseline results from the analysis of total hip replacement patients and total knee replacement patients treated prior to the introduction of PACS.

  15. Towards a cyberinfrastructure for the biological sciences: progress, visions and challenges.

    PubMed

    Stein, Lincoln D

    2008-09-01

    Biology is an information-driven science. Large-scale data sets from genomics, physiology, population genetics and imaging are driving research at a dizzying rate. Simultaneously, interdisciplinary collaborations among experimental biologists, theorists, statisticians and computer scientists have become the key to making effective use of these data sets. However, too many biologists have trouble accessing and using these electronic data sets and tools effectively. A 'cyberinfrastructure' is a combination of databases, network protocols and computational services that brings people, information and computational tools together to perform science in this information-driven world. This article reviews the components of a biological cyberinfrastructure, discusses current and pending implementations, and notes the many challenges that lie ahead.

  16. Road Damage Extraction from Post-Earthquake UAV Images Assisted by Vector Data

    NASA Astrophysics Data System (ADS)

    Chen, Z.; Dou, A.

    2018-04-01

    Extraction of road damage information after an earthquake is an urgent mission. Unmanned Aerial Vehicles (UAVs) can rapidly acquire images of stricken areas. This paper puts forward a novel method to detect road damage and proposes a coefficient to assess road accessibility. With the assistance of vector road data, the method is tested on image data from the Jiuzhaigou Ms7.0 earthquake. First, the image is clipped according to a vector buffer. Then, a large-scale segmentation is applied to remove irrelevant objects. Third, statistics of road features are analyzed and damage information is extracted. Combined with the on-field investigation, the extraction results prove effective.

  17. A peer-to-peer music sharing system based on query-by-humming

    NASA Astrophysics Data System (ADS)

    Wang, Jianrong; Chang, Xinglong; Zhao, Zheng; Zhang, Yebin; Shi, Qingwei

    2007-09-01

    Today, the dominant traffic in peer-to-peer (P2P) networks is still multimedia files, including large numbers of music files. Research in Music Information Retrieval (MIR) has produced many encouraging results for music search. Nevertheless, research on MIR-based music search in P2P networks remains insufficient. Query by Humming (QBH) is one MIR technology that has been studied for years. In this paper, we present a server-based P2P music sharing system built on QBH and integrated with a Hierarchical Index Structure (HIS) that strengthens the relation between surface data and latent information. The HIS evolves automatically according to the music-related items carried by each peer, such as MIDI files, lyrics, and so forth. Instead of adding a large amount of redundancy, the system generates a compact index serving multiple kinds of search input, which substantially improves on the traditional keyword-based text search mode. As network bandwidth and speed cease to be bottlenecks of Internet service, end users are increasingly concerned with the accessibility and accuracy of the information the Internet provides.

  18. Memory: Ironing Out a Wrinkle in Time.

    PubMed

    Miller, Adam M P; Frankland, Paul W; Josselyn, Sheena A

    2018-05-21

    Individual hippocampal neurons encode time over seconds, whereas large-scale changes in population activity of hippocampal neurons encode time over minutes and days. New research shows how the hippocampus represents these multiple timescales simultaneously. Copyright © 2018 Elsevier Ltd. All rights reserved.

  19. Qualis-SIS: automated standard curve generation and quality assessment for multiplexed targeted quantitative proteomic experiments with labeled standards.

    PubMed

    Mohammed, Yassene; Percy, Andrew J; Chambers, Andrew G; Borchers, Christoph H

    2015-02-06

    Multiplexed targeted quantitative proteomics typically utilizes multiple reaction monitoring and allows the optimized quantification of a large number of proteins. One challenge, however, is the large amount of data that needs to be reviewed, analyzed, and interpreted. Different vendors provide software for their instruments, which determine the recorded responses of the heavy and endogenous peptides and perform the response-curve integration. Bringing multiplexed data together and generating standard curves is often an off-line step accomplished, for example, with spreadsheet software. This can be laborious, as it requires determining the concentration levels that meet the required accuracy and precision criteria in an iterative process. We present here a computer program, Qualis-SIS, that generates standard curves from multiplexed MRM experiments and determines analyte concentrations in biological samples. Multiple level-removal algorithms and acceptance criteria for concentration levels are implemented. When used to apply the standard curve to new samples, the software flags each measurement according to its quality. From the user's perspective, the data processing is instantaneous due to the reactivity paradigm used, and the user can download the results of the stepwise calculations for further processing, if necessary. This allows for more consistent data analysis and can dramatically accelerate the downstream data analysis.
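
    A minimal sketch of the generic workflow the abstract describes: fit a linear standard curve of measured response ratio against known concentration, back-calculate each level, and flag levels against acceptance criteria. The data and the accuracy/CV thresholds are placeholders; this is not the Qualis-SIS implementation.

    ```python
    import numpy as np

    # Hypothetical calibration data: known concentrations (fmol/uL) and
    # light/heavy response ratios from three replicate MRM runs per level.
    conc = np.array([0.5, 1, 5, 10, 50, 100], dtype=float)
    ratios = np.array([[0.049, 0.102, 0.51, 1.04, 5.2, 10.4],
                       [0.055, 0.098, 0.48, 0.99, 4.9, 10.1],
                       [0.060, 0.110, 0.53, 1.02, 5.1, 10.6]])

    slope, intercept = np.polyfit(conc, ratios.mean(axis=0), 1)  # linear curve

    # Back-calculate each level and flag it against assumed quality criteria
    # (accuracy within 80-120%, CV < 20%); both thresholds are placeholders.
    back = (ratios - intercept) / slope
    accuracy = back.mean(axis=0) / conc * 100.0
    cv = back.std(axis=0, ddof=1) / back.mean(axis=0) * 100.0
    for c, a, v in zip(conc, accuracy, cv):
        ok = 80.0 <= a <= 120.0 and v < 20.0
        print(f"{c:6.1f}: accuracy {a:6.1f}%  CV {v:5.1f}%  {'PASS' if ok else 'FAIL'}")
    ```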

  20. Bringing quality improvement into the intensive care unit.

    PubMed

    McMillan, Tracy R; Hyzy, Robert C

    2007-02-01

    During the last several years, many governmental and nongovernmental organizations have championed the application of the principles of quality improvement to the practice of medicine, particularly in the area of critical care. To review the breadth of approaches to quality improvement in the intensive care unit, including measures such as mortality and length of stay, and the use of protocols, bundles, and the role of large, multiple-hospital collaboratives. Several agencies have participated in the application of the quality movement to medicine, culminating in the development of standards such as the intensive care unit core measures of the Joint Commission on Accreditation of Healthcare Organizations. Although "zero defects" may not be possible in all measurable variables of quality in the intensive care unit, several measures, such as catheter-related bloodstream infections, can be significantly reduced through the implementation of improved processes of care, such as care bundles. Large, multiple-center, quality improvement collaboratives, such as the Michigan Keystone Intensive Care Unit Project, may be particularly effective in improving the quality of care by creating a "bandwagon effect" within a geographic region. The quality revolution is having a significant effect in the critical care unit and is likely to be facilitated by the transition to the electronic medical record.

  1. False Discovery Control in Large-Scale Spatial Multiple Testing

    PubMed Central

    Sun, Wenguang; Reich, Brian J.; Cai, T. Tony; Guindani, Michele; Schwartzman, Armin

    2014-01-01

    Summary: This article develops a unified theoretical and computational framework for false discovery control in multiple testing of spatial signals. We consider both point-wise and cluster-wise spatial analyses, and derive oracle procedures which optimally control the false discovery rate, false discovery exceedance and false cluster rate, respectively. A data-driven finite approximation strategy is developed to mimic the oracle procedures on a continuous spatial domain. Our multiple testing procedures are asymptotically valid and can be effectively implemented using Bayesian computational algorithms for analysis of large spatial data sets. Numerical results show that the proposed procedures lead to more accurate error control and better power performance than conventional methods. We demonstrate our methods by analyzing the time trends in tropospheric ozone in the eastern US. PMID:25642138
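
    For context, a sketch of the conventional baseline that such procedures are compared against: the standard Benjamini-Hochberg step-up rule applied to a flattened grid of p-values. The spatially adaptive oracle procedures of the article are not reproduced here, and the example data are invented.

    ```python
    import numpy as np

    def benjamini_hochberg(pvals, alpha=0.05):
        """Standard BH step-up rule; returns a boolean rejection mask.

        This is the conventional baseline, not the spatially adaptive
        oracle procedures developed in the article.
        """
        p = np.asarray(pvals, dtype=float)
        m = p.size
        order = np.argsort(p)
        passed = p[order] <= alpha * np.arange(1, m + 1) / m
        reject = np.zeros(m, dtype=bool)
        if passed.any():
            k = np.nonzero(passed)[0].max()     # largest passing rank
            reject[order[:k + 1]] = True
        return reject

    # Toy example: 900 null p-values and 100 signals skewed toward zero.
    rng = np.random.default_rng(1)
    p = np.concatenate([rng.uniform(size=900), rng.beta(0.1, 5.0, size=100)])
    print("rejections:", int(benjamini_hochberg(p).sum()))
    ```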

  2. A unifying framework for systems modeling, control systems design, and system operation

    NASA Technical Reports Server (NTRS)

    Dvorak, Daniel L.; Indictor, Mark B.; Ingham, Michel D.; Rasmussen, Robert D.; Stringfellow, Margaret V.

    2005-01-01

    Current engineering practice in the analysis and design of large-scale multi-disciplinary control systems is typified by some form of decomposition, whether functional, physical, or discipline-based, that enables multiple teams to work in parallel and in relative isolation. Too often, the resulting system after integration is an awkward marriage of different control and data mechanisms with poor end-to-end accountability. System of systems engineering, which faces this problem on a large scale, cries out for a unifying framework to guide analysis, design, and operation. This paper describes such a framework based on a state-, model-, and goal-based architecture for semi-autonomous control systems that guides analysis and modeling, shapes control system software design, and directly specifies operational intent. This paper illustrates the key concepts in the context of a large-scale, concurrent, globally distributed system of systems: NASA's proposed Array-based Deep Space Network.

  3. The Triggering of Large-Scale Waves by CME Initiation

    NASA Astrophysics Data System (ADS)

    Forbes, Terry

    Studies of the large-scale waves generated at the onset of a coronal mass ejection (CME) can provide important information about the processes in the corona that trigger and drive CMEs. The size of the region where the waves originate can indicate the location of the magnetic forces that drive the CME outward, and the rate at which compressive waves steepen into shocks can provide a measure of how the driving forces develop in time. However, in practice it is difficult to separate the effects of wave formation from wave propagation. The problem is particularly acute for the corona because of the multiplicity of wave modes (e.g. slow versus fast MHD waves) and the highly nonuniform structure of the solar atmosphere. At the present time large-scale numerical simulations provide the best hope for deconvolving wave propagation and formation effects from one another.

  4. From Fibrils to Toughness: Multi-Scale Mechanics of Fibrillating Interfaces in Stretchable Electronics

    PubMed Central

    van der Sluis, Olaf; Vossen, Bart; Geers, Marc

    2018-01-01

    Metal-elastomer interfacial systems, often encountered in stretchable electronics, demonstrate remarkably high interface fracture toughness values. Evidently, a large gap exists between the rather small adhesion energy levels at the microscopic scale ('intrinsic adhesion') and the large measured macroscopic work-of-separation. This energy gap is closed here by unravelling the underlying dissipative mechanisms through a systematic numerical/experimental multi-scale approach. This self-contained contribution collects and reviews previously published results and addresses the remaining open questions by providing new and independent results obtained from an alternative experimental set-up. In particular, the experimental studies on Cu-PDMS (Poly(dimethylsiloxane)) samples conclusively reveal the essential role of fibrillation mechanisms at the micro-meter scale during the metal-elastomer delamination process. The micro-scale numerical analyses on single and multiple fibrils show that the dynamic release of the stored elastic energy by multiple fibril fracture, including the interaction with the adjacent deforming bulk PDMS and its highly nonlinear behaviour, provide a mechanistic understanding of the high work-of-separation. An experimentally validated quantitative relation between the macroscopic work-of-separation and peel front height is established from the simulation results. Finally, it is shown that a micro-mechanically motivated shape of the traction-separation law in cohesive zone models is essential to describe the delamination process in fibrillating metal-elastomer systems in a physically meaningful way. PMID:29393908

  5. Counting on β-Diversity to Safeguard the Resilience of Estuaries

    PubMed Central

    de Juan, Silvia; Thrush, Simon F.; Hewitt, Judi E.

    2013-01-01

    Coastal ecosystems are often stressed by non-point source and cumulative effects that can lead to local-scale community homogenisation and a concomitant loss of large-scale ecological connectivity. Here we investigate the use of β-diversity as a measure of both community heterogeneity and ecological connectivity. To understand the consequences of different environmental scenarios on heterogeneity and connectivity, it is necessary to understand the scale at which different environmental factors affect β-diversity. We sampled macrofauna from intertidal sites in nine estuaries from New Zealand’s North Island that represented different degrees of stress derived from land-use. We used multiple regression models to identify relationships between β-diversity and local sediment variables, factors related to the estuarine and catchment hydrodynamics and morphology and land-based stressors. At local scales, we found higher β-diversity at sites with a relatively high total richness. At larger scales, β-diversity was positively related to γ-diversity, suggesting that a large regional species pool was linked with large-scale heterogeneity in these systems. Local environmental heterogeneity influenced β-diversity at both local and regional scales, although variables at the estuarine and catchment scales were both needed to explain large scale connectivity. The estuaries expected a priori to be the most stressed exhibited higher variance in community dissimilarity between sites and connectivity to the estuary species pool. This suggests that connectivity and heterogeneity metrics could be used to generate early warning signals of cumulative stress. PMID:23755252
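
    One common way to compute a β-diversity summary of the kind used in such analyses is the mean pairwise Bray-Curtis dissimilarity among sites. The sketch below assumes that metric and invented abundance data; it is not necessarily the article's exact formulation.

    ```python
    import numpy as np
    from scipy.spatial.distance import pdist

    # Hypothetical site-by-species abundance matrix (rows = intertidal sites).
    rng = np.random.default_rng(2)
    abundance = rng.poisson(3.0, size=(9, 20))

    # Beta-diversity summary: mean pairwise Bray-Curtis dissimilarity among
    # sites; higher values indicate greater community heterogeneity.
    beta = pdist(abundance, metric="braycurtis").mean()
    print(f"mean pairwise Bray-Curtis dissimilarity: {beta:.3f}")
    ```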

  6. Earthquake precursors: spatial-temporal gravity changes before the great earthquakes in the Sichuan-Yunnan area

    NASA Astrophysics Data System (ADS)

    Zhu, Yi-Qing; Liang, Wei-Feng; Zhang, Song

    2018-01-01

    Using multiple-scale mobile gravity data in the Sichuan-Yunnan area, we systematically analyzed the relationships between spatial-temporal gravity changes and the 2014 Ludian, Yunnan Province Ms6.5 earthquake and the 2014 Kangding Ms6.3, 2013 Lushan Ms7.0, and 2008 Wenchuan Ms8.0 earthquakes in Sichuan Province. Our main results are as follows. (1) Before the occurrence of large earthquakes, gravity anomalies occur in a large area around the epicenters. The directions of gravity change gradient belts usually agree roughly with the directions of the main fault zones of the study area. Such gravity changes might reflect the increase of crustal stress, as well as the significant active tectonic movements and surface deformations along fault zones, during the period of gestation of great earthquakes. (2) Continuous significant changes of the multiple-scale gravity fields, as well as greater gravity changes with larger time scales, can be regarded as medium-range precursors of large earthquakes. The subsequent large earthquakes always occur in the area where the gravity changes greatly. (3) The spatial-temporal gravity changes are very useful in determining the epicenter of coming large earthquakes. The large gravity networks are useful to determine the general areas of coming large earthquakes. However, the local gravity networks with high spatial-temporal resolution are suitable for determining the location of epicenters. Therefore, denser gravity observation networks are necessary for better forecasts of the epicenters of large earthquakes. (4) Using gravity changes from mobile observation data, we made medium-range forecasts of the Kangding, Ludian, Lushan, and Wenchuan earthquakes, with especially successful forecasts of the location of their epicenters. Based on the above discussions, we emphasize that medium-/long-term potential for large earthquakes might exist nowadays in some areas with significant gravity anomalies in the study region. Thus, the monitoring should be strengthened.

  7. Numerical Simulation of The Mediterranean Sea Using Diecast: Interaction Between Basin, Sub-basin and Local Scale Features and Natural Variability.

    NASA Astrophysics Data System (ADS)

    Fernández, V.; Dietrich, D. E.; Haney, R. L.; Tintoré, J.

    In situ and satellite data obtained during the last ten years have shown that the circulation in the Mediterranean Sea is extremely complex in space, with significant features ranging from the mesoscale to sub-basin and basin scales, and highly variable in time, with mesoscale to seasonal and interannual signals. In addition, the steep bottom topography and the atmospheric conditions that vary from one sub-basin to another cause the circulation to be composed of numerous energetic and narrow coastal currents, density fronts, and mesoscale structures that interact at the sub-basin scale with the large-scale circulation. To simulate these features numerically and understand them better, an ocean model with high grid resolution, low numerical dispersion, and low physical dissipation is required. We present results from a 1/8-degree horizontal resolution numerical simulation of the Mediterranean Sea using the DieCAST ocean model, which meets the above requirements: it is stable with low general dissipation and uses accurate fourth-order approximations with low numerical dispersion. The simulations are carried out with climatological surface forcing using monthly mean winds and relaxation towards climatological values of temperature and salinity. The model reproduces the main features of the basin-scale circulation, as well as the seasonal variability of sub-basin-scale currents that are well documented by observations in straits and channels. In addition, DieCAST brings out natural fronts and eddies that usually do not appear in numerical simulations of the Mediterranean and that lead to a natural interannual variability. The role of this intrinsic variability in the general circulation will be discussed.

  8. An Investigation of the Generalizability and Dependability of Direct Behavior Rating Single Item Scales (DBR-SIS) to Measure Academic Engagement and Disruptive Behavior of Middle School Students

    ERIC Educational Resources Information Center

    Chafouleas, Sandra M.; Briesch, Amy M.; Riley-Tillman, T. Chris; Christ, Theodore J.; Black, Anne C.; Kilgus, Stephen P.

    2010-01-01

    A total of 4 raters, including 2 teachers and 2 research assistants, used Direct Behavior Rating Single Item Scales (DBR-SIS) to measure the academic engagement and disruptive behavior of 7 middle school students across multiple occasions. Generalizability study results for the full model revealed modest to large magnitudes of variance associated…

  9. Characterizing Impacts of Land Grabbing on Terrestrial Vegetation and Ecohydrologic change in Mozambique through Multiple-sensor Remote Sensing and Models

    NASA Astrophysics Data System (ADS)

    Flores, A. N.; Lakshmi, V.; Al-Barakat, R.; Maksimowicz, M.

    2017-12-01

    Land grabbing, the acquisition of large areas of land by external entities, results from interactions of complex global economic, social, and political processes. These transactions are controversial because they can result in large-scale disruptions to historical land uses, including increased intensity of agricultural practices and significant conversions in land cover. These large-scale disruptions have the potential to impact surface water and energy balance because vegetation controls the partitioning of incoming energy into latent and sensible heat fluxes and of precipitation into runoff and infiltration. Because large-scale land acquisitions can impact local ecosystem services, it is important to document the changes in terrestrial vegetation associated with these acquisitions in order to support assessment of the associated impacts on regional surface water and energy balance, the spatiotemporal scales of those changes, and interactions and feedbacks with other processes, particularly in the atmosphere. We use remote sensing data from multiple satellite platforms to diagnose and characterize changes in terrestrial vegetation and ecohydrology in Mozambique during periods that bracket significant land acquisitions. The Advanced Very High Resolution Radiometer (AVHRR) sensor provides long-term continuous data that can document historical seasonal cycles of vegetation greenness. These data are augmented with analyses from Landsat multispectral data, which provide significantly higher spatial resolution. Here we quantify the spatiotemporal changes in vegetation associated with periods of significant land acquisitions in Mozambique. This analysis complements a suite of land-atmosphere modeling experiments designed to deduce potential changes in land surface water and energy budgets associated with these acquisitions. This work advances understanding of the telecouplings between global economic and political forcings and regional hydrology and climate.

  10. FLARE (Facility for Laboratory Reconnection Experiments): A Major Next-Step for Laboratory Studies of Magnetic Reconnection

    NASA Astrophysics Data System (ADS)

    Ji, H.; Bhattacharjee, A.; Prager, S.; Daughton, W. S.; Bale, S. D.; Carter, T. A.; Crocker, N.; Drake, J. F.; Egedal, J.; Sarff, J.; Wallace, J.; Belova, E.; Ellis, R.; Fox, W. R., II; Heitzenroeder, P.; Kalish, M.; Jara-Almonte, J.; Myers, C. E.; Que, W.; Ren, Y.; Titus, P.; Yamada, M.; Yoo, J.

    2014-12-01

    A new intermediate-scale plasma experiment, called the Facility for Laboratory Reconnection Experiments or FLARE, is under construction at Princeton as a joint project by five universities and two national labs to study magnetic reconnection in regimes directly relevant to space, solar and astrophysical plasmas. The currently existing small-scale experiments have been focusing on the single X-line reconnection process in plasmas either with small effective sizes or at low Lundquist numbers, both of which are typically very large in natural plasmas. These new regimes involve multiple X-lines as guided by a reconnection "phase diagram", in which different coupling mechanisms from the global system scale to the local dissipation scale are classified into different reconnection phases [H. Ji & W. Daughton, Phys. Plasmas 18, 111207 (2011)]. The design of the FLARE device is based on the existing Magnetic Reconnection Experiment (MRX) at Princeton (http://mrx.pppl.gov) and is to provide experimental access to the new phases involving multiple X-lines at large effective sizes and high Lundquist numbers, directly relevant to space and solar plasmas. The motivating major physics questions, the construction status, and the planned collaborative research especially with space and solar research communities will be discussed.

  11. FLARE (Facility for Laboratory Reconnection Experiments): A Major Next-Step for Laboratory Studies of Magnetic Reconnection

    NASA Astrophysics Data System (ADS)

    Ji, Hantao; Bhattacharjee, A.; Prager, S.; Daughton, W.; Bale, Stuart D.; Carter, T.; Crocker, N.; Drake, J.; Egedal, J.; Sarff, J.; Fox, W.; Jara-Almonte, J.; Myers, C.; Ren, Y.; Yamada, M.; Yoo, J.

    2015-04-01

    A new intermediate-scale plasma experiment, called the Facility for Laboratory Reconnection Experiments or FLARE (flare.pppl.gov), is under construction at Princeton as a joint project by five universities and two national labs to study magnetic reconnection in regimes directly relevant to heliophysical and astrophysical plasmas. The currently existing small-scale experiments have been focusing on the single X-line reconnection process in plasmas either with small effective sizes or at low Lundquist numbers, both of which are typically very large in natural plasmas. These new regimes involve multiple X-lines as guided by a reconnection "phase diagram", in which different coupling mechanisms from the global system scale to the local dissipation scale are classified into different reconnection phases [H. Ji & W. Daughton, Phys. Plasmas 18, 111207 (2011)]. The design of the FLARE device is based on the existing Magnetic Reconnection Experiment (MRX) (mrx.pppl.gov) and is to provide experimental access to the new phases involving multiple X-lines at large effective sizes and high Lundquist numbers, directly relevant to magnetospheric, solar wind, and solar coronal plasmas. After a brief summary of recent laboratory results on the topic of magnetic reconnection, the motivating major physics questions, the construction status, and the planned collaborative research especially with heliophysics communities will be discussed.

  12. Community assembly of the ferns of Florida.

    PubMed

    Sessa, Emily B; Chambers, Sally M; Li, Daijiang; Trotta, Lauren; Endara, Lorena; Burleigh, J Gordon; Baiser, Benjamin

    2018-03-01

    Many ecological and evolutionary processes shape the assembly of organisms into local communities from a regional pool of species. We analyzed phylogenetic and functional diversity to understand community assembly of the ferns of Florida at two spatial scales. We built a phylogeny for 125 of the 141 species of ferns in Florida using five chloroplast markers. We calculated mean pairwise dissimilarity (MPD) and mean nearest taxon distance (MNTD) from phylogenetic distances and functional trait data for both spatial scales and compared the results to null models to assess significance. Our results for over vs. underdispersion in functional and phylogenetic diversity differed depending on spatial scale and metric considered. At the county scale, MPD revealed evidence for phylogenetic overdispersion, while MNTD revealed phylogenetic and functional underdispersion, and at the conservation area scale, MPD revealed phylogenetic and functional underdispersion while MNTD revealed evidence only of functional underdispersion. Our results are consistent with environmental filtering playing a larger role at the smaller, conservation area scale. The smaller spatial units are likely composed of fewer local habitat types that are selecting for closely related species, with the larger-scale units more likely to be composed of multiple habitat types that bring together a larger pool of species from across the phylogeny. Several aspects of fern biology, including their unique physiology and water relations and the importance of the independent gametophyte stage of the life cycle, make ferns highly sensitive to local, microhabitat conditions. © 2018 The Authors. American Journal of Botany is published by Wiley Periodicals, Inc. on behalf of the Botanical Society of America.
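
    A sketch of the MPD calculation with a simple label-shuffling null model, of the kind used to judge over- vs. underdispersion. The distance matrix, community, and null-model choice below are illustrative assumptions, not the authors' data or exact procedure.

    ```python
    import numpy as np

    def mpd(dist, present):
        """Mean pairwise distance among the species present in a community."""
        idx = np.flatnonzero(present)
        sub = dist[np.ix_(idx, idx)]
        return sub[np.triu_indices_from(sub, k=1)].mean()

    def ses_mpd(dist, present, n_null=999, seed=0):
        """Standardized effect size of MPD against random draws from the pool.

        Negative values suggest clustering (underdispersion), positive values
        overdispersion, relative to the regional species pool.
        """
        rng = np.random.default_rng(seed)
        obs = mpd(dist, present)
        k, n = int(present.sum()), present.size
        null = np.empty(n_null)
        for i in range(n_null):
            draw = np.zeros(n, dtype=bool)
            draw[rng.choice(n, size=k, replace=False)] = True
            null[i] = mpd(dist, draw)
        return (obs - null.mean()) / null.std(ddof=1)

    # Toy regional pool of 40 species with random symmetric distances.
    rng = np.random.default_rng(3)
    d = rng.random((40, 40)); d = (d + d.T) / 2; np.fill_diagonal(d, 0.0)
    community = np.zeros(40, dtype=bool); community[:10] = True
    print(f"SES(MPD) = {ses_mpd(d, community):.2f}")
    ```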

  13. Record Balkan floods of 2014 linked to planetary wave resonance.

    PubMed

    Stadtherr, Lisa; Coumou, Dim; Petoukhov, Vladimir; Petri, Stefan; Rahmstorf, Stefan

    2016-04-01

    In May 2014, the Balkans were hit by a Vb-type cyclone that brought disastrous flooding and severe damage to Bosnia and Herzegovina, Serbia, and Croatia. Vb cyclones migrate from the Mediterranean, where they absorb warm and moist air, to the north, often causing flooding in central/eastern Europe. Extreme rainfall events are increasing on a global scale, and both thermodynamic and dynamical mechanisms play a role. While thermodynamic aspects are generally well understood, there is large uncertainty associated with current and future changes in dynamics. We study the climatic and meteorological factors that influenced the catastrophic flooding in the Balkans, with a focus on large-scale circulation. We show that the Vb cyclone was unusually stationary, bringing extreme rainfall for several consecutive days, and that this situation was likely linked to a quasi-stationary circumglobal Rossby wave train. We provide evidence that this quasi-stationary wave was amplified by wave resonance. Statistical analysis of daily spring rainfall over the Balkan region reveals significant upward trends over 1950-2014, especially in the high quantiles relevant for flooding events. These changes cannot be explained by simple thermodynamic arguments, and we thus argue that dynamical processes likely played a role in increasing flood risks over the Balkans.

  14. LEMON - LHC Era Monitoring for Large-Scale Infrastructures

    NASA Astrophysics Data System (ADS)

    Marian, Babik; Ivan, Fedorko; Nicholas, Hook; Hector, Lansdale Thomas; Daniel, Lenkes; Miroslav, Siket; Denis, Waldron

    2011-12-01

    At the present time computer centres are facing a massive rise in virtualization and cloud computing, as these solutions bring advantages to service providers and consolidate computer centre resources. As a result, however, monitoring complexity is increasing. Computer centre management requires not only monitoring servers, network equipment and associated software, but also collecting additional environment and facilities data (e.g. temperature, power consumption, cooling efficiency, etc.) to have a good overview of infrastructure performance. The LHC Era Monitoring (Lemon) system addresses these requirements for a very large-scale infrastructure. The Lemon agent, which collects data on every client and forwards the samples to the central measurement repository, provides a flexible interface that allows rapid development of new sensors. The system can also report on behalf of remote devices such as switches and power supplies. Online and historical data can be visualized via a web-based interface or retrieved via command-line tools. The Lemon Alarm System component can be used for notifying the operator about error situations. In this article, an overview of Lemon monitoring is provided together with a description of the CERN LEMON production instance. No direct comparison is made with other monitoring tools.

  15. Volunteers Oriented Interface Design for the Remote Navigation of Rescue Robots at Large-Scale Disaster Sites

    NASA Astrophysics Data System (ADS)

    Yang, Zhixiao; Ito, Kazuyuki; Saijo, Kazuhiko; Hirotsune, Kazuyuki; Gofuku, Akio; Matsuno, Fumitoshi

    This paper aims to construct an efficient interface, similar to those widely used in daily life, to meet the needs of the many volunteer rescuers operating rescue robots at large-scale disaster sites. The developed system includes a force-feedback steering wheel interface and an artificial neural network (ANN) based mouse-screen interface. The former consists of a force-feedback steering control and a wall of six monitors; it provides manual operation, similar to driving a car, for navigating a rescue robot. The latter consists of a mouse and a camera view displayed on a monitor; it provides semi-autonomous operation, by mouse clicking, for navigating a rescue robot. Experimental results show that a novice volunteer can skillfully navigate a tank rescue robot through either interface after 20 to 30 minutes of learning its operation. The steering wheel interface achieves high navigation speed in open areas, regardless of the terrain and surface conditions of a disaster site. The mouse-screen interface is good at precise navigation in complex structures, while placing little strain on operators. The two interfaces can be switched between at any time, providing a combined, efficient navigation method.

  16. Occupational health and safety of workers in agriculture and horticulture.

    PubMed

    Lundqvist, P

    2000-01-01

    Working in agriculture and horticulture gives considerable job satisfaction. The tasks are often interesting; you can see the result of your own work, watch your crop grow and mature; you have an affinity with nature and can follow the changes in the seasons. However, today it is a dangerous work environment fraught with occupational injuries and diseases due to hazardous situations and to physiological, physical, biological, chemical, psychological, and sociological factors. The ongoing rapid development may, on the other hand, bring about many changes during the next decades with more farmers and growers switching to organic production. Moreover, increased awareness of animal welfare also may lead to improved working conditions. Large-scale operations with fewer family-operated agricultural businesses might mean fewer injuries among children and older farmers. A consequence of large-scale operations may also be better regulation of working conditions. The greater use of automation technology eliminates many harmful working postures and movements when milking cows and carrying out other tasks. Information technology offers people the opportunity to gain more knowledge about their work. Labeling food produced in a worker-friendly work environment may give the consumers a chance to be involved in the process.

  17. Beowulf Distributed Processing and the United States Geological Survey

    USGS Publications Warehouse

    Maddox, Brian G.

    2002-01-01

    Introduction: In recent years, the United States Geological Survey's (USGS) National Mapping Discipline (NMD) has expanded its scientific and research activities. Work is being conducted in areas such as emergency response research, scientific visualization, urban prediction, and other simulation activities. Custom-produced digital data have become essential for these types of activities. High-resolution, remotely sensed datasets are also seeing increased use. Unfortunately, the NMD is also finding that it lacks the resources required to perform some of these activities. Many of these projects require large amounts of computer processing resources. Complex urban-prediction simulations, for example, involve large amounts of processor-intensive calculations on large amounts of input data. This project was undertaken to learn and understand the concepts of distributed processing. Experience was needed in developing these types of applications. The idea was that this type of technology could significantly aid the needs of the NMD scientific and research programs. Porting a numerically intensive application currently being used by an NMD science program to run in a distributed fashion would demonstrate the usefulness of this technology. There are several benefits that this type of technology can bring to the USGS's research programs. Projects can be performed that were previously impossible due to a lack of computing resources. Other projects can be performed on a larger scale than previously possible. For example, distributed processing can enable urban dynamics research to perform simulations on larger areas without making huge sacrifices in resolution. The processing can also be done in a more reasonable amount of time than with traditional single-threaded methods (a scaled version of Chester County, Pennsylvania, took about fifty days to finish its first calibration phase with a single-threaded program). This paper has several goals regarding distributed processing technology. It will describe the benefits of the technology. Real data about a distributed application will be presented as an example of the benefits that this technology can bring to USGS scientific programs. Finally, some of the issues with distributed processing that relate to USGS work will be discussed.
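
    The tile-parallel idea described above can be sketched in a few lines. This toy version assumes Python's multiprocessing on a single machine as a stand-in for a Beowulf cluster, with an invented per-tile computation.

    ```python
    import numpy as np
    from multiprocessing import Pool

    def process_tile(tile):
        """Stand-in for a processor-intensive per-tile calculation
        (e.g., one growth iteration of an urban-dynamics model)."""
        return np.sqrt(tile.astype(float)).sum()

    if __name__ == "__main__":
        # Hypothetical raster split into independent 500x500 tiles.
        raster = np.arange(4_000_000).reshape(2000, 2000)
        tiles = [raster[r:r + 500, c:c + 500]
                 for r in range(0, 2000, 500) for c in range(0, 2000, 500)]
        with Pool(processes=4) as pool:        # workers ~ cluster nodes
            partials = pool.map(process_tile, tiles)
        print(f"aggregate result: {sum(partials):.3e}")
    ```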

  18. Fast Generation of Ensembles of Cosmological N-Body Simulations via Mode-Resampling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schneider, M D; Cole, S; Frenk, C S

    2011-02-14

    We present an algorithm for quickly generating multiple realizations of N-body simulations to be used, for example, for cosmological parameter estimation from surveys of large-scale structure. Our algorithm uses a new method to resample the large-scale (Gaussian-distributed) Fourier modes in a periodic N-body simulation box in a manner that properly accounts for the nonlinear mode-coupling between large and small scales. We find that our method for adding new large-scale mode realizations recovers the nonlinear power spectrum to sub-percent accuracy on scales larger than about half the Nyquist frequency of the simulation box. Using 20 N-body simulations, we obtain a power spectrum covariance matrix estimate that matches the estimator from Takahashi et al. (from 5000 simulations) with < 20% errors in all matrix elements. Comparing the rates of convergence, we determine that our algorithm requires approximately 8 times fewer simulations to achieve a given error tolerance in estimates of the power spectrum covariance matrix. The degree of success of our algorithm indicates that we understand the main physical processes that give rise to the correlations in the matter power spectrum. Namely, the large-scale Fourier modes modulate both the degree of structure growth through the variation in the effective local matter density and also the spatial frequency of small-scale perturbations through large-scale displacements. We expect our algorithm to be useful for noise modeling when constraining cosmological parameters from weak lensing (cosmic shear) and galaxy surveys, rescaling summary statistics of N-body simulations for new cosmological parameter values, and any applications where the influence of Fourier modes larger than the simulation size must be accounted for.
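
    A naive sketch of the first step only: redrawing the large-scale Gaussian Fourier modes of a periodic box below a cutoff wavenumber. The algorithm's key contribution, propagating the nonlinear coupling of the new large-scale modes into the small scales, is deliberately omitted here, and the box size and cutoff are invented.

    ```python
    import numpy as np

    def resample_large_modes(field, k_cut, seed=0):
        """Redraw Fourier modes of a cubic periodic box with |k| < k_cut.

        The FFT of fresh real white noise supplies new Gaussian mode values
        with proper Hermitian symmetry; each is scaled so its expected power
        matches the original mode's power. The paper's nonlinear large-to-
        small-scale coupling step is NOT reproduced in this toy version.
        """
        rng = np.random.default_rng(seed)
        fk = np.fft.fftn(field)
        noise_k = np.fft.fftn(rng.normal(size=field.shape))
        k1d = np.fft.fftfreq(field.shape[0])
        kx, ky, kz = np.meshgrid(k1d, k1d, k1d, indexing="ij")
        mask = np.sqrt(kx**2 + ky**2 + kz**2) < k_cut
        mask[0, 0, 0] = False                  # leave the mean untouched
        # White-noise modes have E|noise_k|^2 = field.size, so normalize.
        fk[mask] = noise_k[mask] * np.abs(fk[mask]) / np.sqrt(field.size)
        return np.fft.ifftn(fk).real

    rng = np.random.default_rng(4)
    delta = rng.normal(size=(64, 64, 64))
    delta_new = resample_large_modes(delta, k_cut=0.05)
    print(f"field rms before / after: {delta.std():.3f} / {delta_new.std():.3f}")
    ```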

  19. Regression Models for the Analysis of Longitudinal Gaussian Data from Multiple Sources

    PubMed Central

    O’Brien, Liam M.; Fitzmaurice, Garrett M.

    2006-01-01

    We present a regression model for the joint analysis of longitudinal multiple source Gaussian data. Longitudinal multiple source data arise when repeated measurements are taken from two or more sources, and each source provides a measure of the same underlying variable and on the same scale. This type of data generally produces a relatively large number of observations per subject; thus estimation of an unstructured covariance matrix often may not be possible. We consider two methods by which parsimonious models for the covariance can be obtained for longitudinal multiple source data. The methods are illustrated with an example of multiple informant data arising from a longitudinal interventional trial in psychiatry. PMID:15726666
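
    One common parsimonious approach to such data is to stack the sources in long format and fit a linear mixed model with subject-level random effects. The sketch below assumes that structure (via statsmodels) and simulated data; it is not necessarily the authors' exact covariance model.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Simulated long-format data: each subject measured repeatedly by two
    # sources (e.g., self-report and informant) on the same scale.
    rng = np.random.default_rng(5)
    rows = []
    for subj in range(60):
        u = rng.normal(0, 1)                  # subject random effect
        for time in range(4):
            truth = 10 + 0.5 * time + u
            rows.append((subj, time, "self", truth + rng.normal(0, 1)))
            rows.append((subj, time, "informant", truth - 0.3 + rng.normal(0, 1)))
    df = pd.DataFrame(rows, columns=["subject", "time", "source", "y"])

    # Random-intercept model with time and source effects: one simple,
    # parsimonious covariance structure for multiple-source data.
    fit = smf.mixedlm("y ~ time + source", df, groups=df["subject"]).fit()
    print(fit.summary())
    ```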

  20. The BioLexicon: a large-scale terminological resource for biomedical text mining

    PubMed Central

    2011-01-01

    Background: Due to the rapidly expanding body of biomedical literature, biologists require increasingly sophisticated and efficient systems to help them to search for relevant information. Such systems should account for the multiple written variants used to represent biomedical concepts, and allow the user to search for specific pieces of knowledge (or events) involving these concepts, e.g., protein-protein interactions. Such functionality requires access to detailed information about words used in the biomedical literature. Existing databases and ontologies often have a specific focus and are oriented towards human use. Consequently, biological knowledge is dispersed amongst many resources, which often do not attempt to account for the large and frequently changing set of variants that appear in the literature. Additionally, such resources typically do not provide information about how terms relate to each other in texts to describe events.

    Results: This article provides an overview of the design, construction and evaluation of a large-scale lexical and conceptual resource for the biomedical domain, the BioLexicon. The resource can be exploited by text mining tools at several levels, e.g., part-of-speech tagging, recognition of biomedical entities, and the extraction of events in which they are involved. As such, the BioLexicon must account for real usage of words in biomedical texts. In particular, the BioLexicon gathers together different types of terms from several existing data resources into a single, unified repository, and augments them with new term variants automatically extracted from biomedical literature. Extraction of events is facilitated through the inclusion of biologically pertinent verbs (around which events are typically organized) together with information about typical patterns of grammatical and semantic behaviour, which are acquired from domain-specific texts. In order to foster interoperability, the BioLexicon is modelled using the Lexical Markup Framework, an ISO standard.

    Conclusions: The BioLexicon contains over 2.2 M lexical entries and over 1.8 M terminological variants, as well as over 3.3 M semantic relations, including over 2 M synonymy relations. Its exploitation can benefit both application developers and users. We demonstrate some such benefits by describing integration of the resource into a number of different tools, and evaluating improvements in performance that this can bring. PMID:21992002
