Ice Accretion Test Results for Three Large-Scale Swept-Wing Models in the NASA Icing Research Tunnel
NASA Technical Reports Server (NTRS)
Broeren, Andy; Potapczuk, Mark; Lee, Sam; Malone, Adam; Paul, Ben; Woodard, Brian
2016-01-01
The design and certification of modern transport airplanes for flight in icing conditions increasingly relies on three-dimensional numerical simulation tools for ice accretion prediction. There is currently no publicly available, high-quality ice accretion database against which to evaluate the performance of icing simulation tools for large-scale swept wings representative of modern commercial transport airplanes. This presentation summarizes the results of a series of icing wind tunnel test campaigns whose aim was to provide an ice accretion database for large-scale, swept wings.
NASA Technical Reports Server (NTRS)
Doolin, B. F.
1975-01-01
Classes of large scale dynamic systems were discussed in the context of modern control theory. Specific examples discussed were in the technical fields of aeronautics, water resources and electric power.
Intra-reach headwater fish assemblage structure
McKenna, James E.
2017-01-01
Large-scale conservation efforts can take advantage of modern large databases and regional modeling and assessment methods. However, these broad-scale efforts often assume uniform average habitat conditions and/or species assemblages within stream reaches.
StePS: Stereographically Projected Cosmological Simulations
NASA Astrophysics Data System (ADS)
Rácz, Gábor; Szapudi, István; Csabai, István; Dobos, László
2018-05-01
StePS (Stereographically Projected Cosmological Simulations) compactifies the infinite spatial extent of the Universe into a finite sphere with isotropic boundary conditions to simulate the evolution of the large-scale structure. This eliminates the need for periodic boundary conditions, which are a numerical convenience unsupported by observation and which modify the law of force on large scales in an unrealistic fashion. StePS uses stereographic projection for space compactification and a naive O(N²) force calculation; this arrives at a correlation function of the same quality more quickly than standard (tree or P3M) algorithms with similar spatial and mass resolution. The O(N²) force calculation is easy to adapt to modern graphics cards, hence StePS can function as a high-speed prediction tool for modern large-scale surveys.
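As an illustration of the naive pairwise force summation referred to above, the following is a minimal sketch only, not the actual StePS implementation; the softening length, the G = 1 unit choice, and the particle setup are assumptions introduced here for the example.

```python
import numpy as np

def direct_sum_accelerations(pos, mass, eps=0.05):
    """Naive O(N^2) pairwise gravitational accelerations in G = 1 units.

    pos  : (N, 3) particle positions
    mass : (N,)   particle masses
    eps  : Plummer softening length (assumed value, not from the paper)
    """
    acc = np.zeros_like(pos)
    for i in range(len(mass)):
        d = pos - pos[i]                             # displacements to every particle
        r2 = np.einsum("ij,ij->i", d, d) + eps**2    # softened squared distances
        r2[i] = np.inf                               # exclude self-interaction
        acc[i] = np.sum((mass / r2**1.5)[:, None] * d, axis=0)
    return acc

# Toy usage: 1000 equal-mass particles in a unit box.
rng = np.random.default_rng(0)
positions = rng.random((1000, 3))
masses = np.full(1000, 1.0 / 1000)
accelerations = direct_sum_accelerations(positions, masses)
```

Every particle interacts with every other particle, so the cost scales as N²; the loop body is a dense, regular computation, which is why such direct summation maps well onto graphics cards.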
Bakker, Elisabeth S; Gill, Jacquelyn L; Johnson, Christopher N; Vera, Frans W M; Sandom, Christopher J; Asner, Gregory P; Svenning, Jens-Christian
2016-01-26
Until recently in Earth history, very large herbivores (mammoths, ground sloths, diprotodons, and many others) occurred in most of the World's terrestrial ecosystems, but the majority have gone extinct as part of the late-Quaternary extinctions. How has this large-scale removal of large herbivores affected landscape structure and ecosystem functioning? In this review, we combine paleo-data with information from modern exclosure experiments to assess the impact of large herbivores (and their disappearance) on woody species, landscape structure, and ecosystem functions. In modern landscapes characterized by intense herbivory, woody plants can persist by defending themselves or by association with defended species, can persist by growing in places that are physically inaccessible to herbivores, or can persist where high predator activity limits foraging by herbivores. At the landscape scale, different herbivore densities and assemblages may result in dynamic gradients in woody cover. The late-Quaternary extinctions were natural experiments in large-herbivore removal; the paleoecological record shows evidence of widespread changes in community composition and ecosystem structure and function, consistent with modern exclosure experiments. We propose a conceptual framework that describes the impact of large herbivores on woody plant abundance mediated by herbivore diversity and density, predicting that herbivore suppression of woody plants is strongest where herbivore diversity is high. We conclude that the decline of large herbivores induces major alterations in landscape structure and ecosystem functions.
The one scale that rules them all
NASA Astrophysics Data System (ADS)
Ouellette, Jennifer
2017-05-01
There are very real constraints on how large a complex organism can grow. This is the essence of all modern-day scaling laws, and the subject of Geoffrey West's provocative new book Scale: the Universal Laws of Life and Death in Organisms, Cities and Companies
Discovery of Newer Therapeutic Leads for Prostate Cancer
2009-06-01
promising plant extracts and then prepare large-scale quantities of the plant extracts using supercritical fluid extraction techniques and use this...quantities of the plant extracts using supercritical fluid extraction techniques. Large scale plant collections were conducted for 14 of the top 20...material for bioassay-guided fractionation of the biologically active constituents using modern chromatography techniques. The chemical structures of
Use of a Modern Polymerization Pilot-Plant for Undergraduate Control Projects.
ERIC Educational Resources Information Center
Mendoza-Bustos, S. A.; And Others
1991-01-01
Described is a project where students gain experience in handling large volumes of hazardous materials, process start up and shut down, equipment failures, operational variations, scaling up, equipment cleaning, and run-time scheduling while working in a modern pilot plant. Included are the system design, experimental procedures, and results. (KR)
Parallel Index and Query for Large Scale Data Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chou, Jerry; Wu, Kesheng; Ruebel, Oliver
2011-07-18
Modern scientific datasets present numerous data management and analysis challenges. State-of-the-art index and query technologies are critical for facilitating interactive exploration of large datasets, but numerous challenges remain in designing a system for processing general scientific datasets. The system needs to be able to run on distributed multi-core platforms, efficiently utilize the underlying I/O infrastructure, and scale to massive datasets. We present FastQuery, a novel software framework that addresses these challenges. FastQuery utilizes a state-of-the-art index and query technology (FastBit) and is designed to process massive datasets on modern supercomputing platforms. We apply FastQuery to the processing of a massive 50 TB dataset generated by a large-scale accelerator modeling code. We demonstrate the scalability of the tool to 11,520 cores. Motivated by the scientific need to search for interesting particles in this dataset, we use our framework to reduce search time from hours to tens of seconds.
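FastBit answers selection queries through compressed bitmap indexes. The following is a minimal, single-machine sketch of the bitmap-index idea only, not the FastQuery or FastBit API; the bin edges, variable names, and synthetic data are assumptions made for the example.

```python
import numpy as np

def build_bitmap_index(values, bin_edges):
    """One boolean bitmap per bin: an uncompressed toy stand-in for a bitmap index."""
    bin_ids = np.digitize(values, bin_edges)               # bin number of every record
    return {b: bin_ids == b for b in range(len(bin_edges) + 1)}

def range_query(values, index, bin_edges, lo, hi):
    """Select records with lo <= value < hi by OR-ing candidate bin bitmaps."""
    lo_bin, hi_bin = np.digitize([lo, hi], bin_edges)
    mask = np.zeros(len(values), dtype=bool)
    for b in range(lo_bin, hi_bin + 1):                    # cheap bitwise ORs
        mask |= index[b]
    # Refine against raw values; only the two boundary bins actually need this.
    return mask & (values >= lo) & (values < hi)

# Toy usage on one million synthetic "energy" values.
rng = np.random.default_rng(1)
energy = rng.exponential(scale=2.0, size=1_000_000)
edges = np.linspace(0.0, 10.0, 21)
index = build_bitmap_index(energy, edges)
hits = range_query(energy, index, edges, 3.5, 7.0)
print(hits.sum(), "records selected")
```

A production index compresses the bitmaps and partitions both the data and the index across processes, which is the kind of parallelism that allows query evaluation to scale to thousands of cores.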
Research directions in large scale systems and decentralized control
NASA Technical Reports Server (NTRS)
Tenney, R. R.
1980-01-01
Control theory provides a well established framework for dealing with automatic decision problems and a set of techniques for automatic decision making which exploit special structure, but it does not deal well with complexity. The potential exists for combining control theoretic and knowledge based concepts into a unified approach. The elements of control theory are diagrammed, including modern control and large scale systems.
Contractual Duration and Investment Incentives: Evidence from Large Scale Production Units in China
NASA Astrophysics Data System (ADS)
Li, Fang; Feng, Shuyi; D'Haese, Marijke; Lu, Hualiang; Qu, Futian
2017-04-01
Large Scale Production Units have become an important force in the supply of agricultural commodities and in agricultural modernization in China. Contractual duration in farmland transfers to Large Scale Production Units can be considered to reflect land tenure security. Theoretically, long-term tenancy contracts can encourage Large Scale Production Units to increase long-term investments by ensuring stability of land rights or by favoring access to credit. Using a unique field survey dataset from Jiangsu and Jiangxi Provinces at the Large Scale Production Unit and plot levels, this study examines the effect of contractual duration on Large Scale Production Units' soil conservation behaviours. An instrumental-variable (IV) method is applied to account for the endogeneity of contractual duration and unobserved household heterogeneity. Results indicate that farmland transfer contract duration significantly and positively affects land-improving investments. Policies aimed at improving transaction platforms and intermediary organizations in farmland transfer, so as to facilitate Large Scale Production Units' access to farmland under long-term tenancy contracts, may therefore play an important role in improving soil quality and land productivity.
Neural data science: accelerating the experiment-analysis-theory cycle in large-scale neuroscience.
Paninski, L; Cunningham, J P
2018-06-01
Modern large-scale multineuronal recording methodologies, including multielectrode arrays, calcium imaging, and optogenetic techniques, produce single-neuron resolution data of a magnitude and precision that were the realm of science fiction twenty years ago. The major bottlenecks in systems and circuit neuroscience no longer lie simply in collecting data from large neural populations but in understanding these data: developing novel scientific questions, with corresponding analysis techniques and experimental designs to fully harness these new capabilities and meaningfully interrogate these questions. Advances in methods for signal processing, network analysis, dimensionality reduction, and optimal control, developed in lockstep with advances in experimental neurotechnology, promise major breakthroughs in multiple fundamental neuroscience problems. These trends are clear in a broad array of subfields of modern neuroscience; this review focuses on recent advances in methods for analyzing neural time-series data with single-neuronal precision.
The spatial and temporal domains of modern ecology.
Estes, Lyndon; Elsen, Paul R; Treuer, Timothy; Ahmed, Labeeb; Caylor, Kelly; Chang, Jason; Choi, Jonathan J; Ellis, Erle C
2018-05-01
To understand ecological phenomena, it is necessary to observe their behaviour across multiple spatial and temporal scales. Since this need was first highlighted in the 1980s, technology has opened previously inaccessible scales to observation. To help to determine whether there have been corresponding changes in the scales observed by modern ecologists, we analysed the resolution, extent, interval and duration of observations (excluding experiments) in 348 studies that have been published between 2004 and 2014. We found that observational scales were generally narrow, because ecologists still primarily use conventional field techniques. In the spatial domain, most observations had resolutions ≤1 m 2 and extents ≤10,000 ha. In the temporal domain, most observations were either unreplicated or infrequently repeated (>1 month interval) and ≤1 year in duration. Compared with studies conducted before 2004, observational durations and resolutions appear largely unchanged, but intervals have become finer and extents larger. We also found a large gulf between the scales at which phenomena are actually observed and the scales those observations ostensibly represent, raising concerns about observational comprehensiveness. Furthermore, most studies did not clearly report scale, suggesting that it remains a minor concern. Ecologists can better understand the scales represented by observations by incorporating autocorrelation measures, while journals can promote attentiveness to scale by implementing scale-reporting standards.
DOT National Transportation Integrated Search
2013-01-01
The simulator was once a very expensive, large-scale mechanical device for training military pilots or astronauts. Modern computers, linking sophisticated software and large-screen displays, have yielded simulators for the desktop or configured as sm...
Fracking in Tight Shales: What Is It, What Does It Accomplish, and What Are Its Consequences?
NASA Astrophysics Data System (ADS)
Norris, J. Quinn; Turcotte, Donald L.; Moores, Eldridge M.; Brodsky, Emily E.; Rundle, John B.
2016-06-01
Fracking is a popular term referring to hydraulic fracturing when it is used to extract hydrocarbons. We distinguish between low-volume traditional fracking and the high-volume modern fracking used to recover large volumes of hydrocarbons from shales. Shales are fine-grained rocks with low granular permeabilities. During the formation of oil and gas, large fluid pressures are generated. These pressures result in natural fracking, and the resulting fracture permeability allows oil and gas to escape, reducing the fluid pressures. These fractures may subsequently be sealed by mineral deposition, resulting in tight shale formations. The objective of modern fracking is to reopen these fractures and/or create new fractures on a wide range of scales. Modern fracking has had a major impact on the availability of oil and gas globally; however, there are serious environmental objections to modern fracking, which should be weighed carefully against its benefits.
NASA Astrophysics Data System (ADS)
Kröger, Knut; Creutzburg, Reiner
2013-05-01
The aim of this paper is to show the usefulness of modern forensic software tools for processing large-scale digital investigations. In particular, we focus on the new version of Nuix 4.2 and compare it with AccessData FTK 4.2, X-Ways Forensics 16.9 and Guidance Encase Forensic 7 regarding its performance, functionality, usability and capability. We will show how these software tools work with large forensic images and how capable they are in examining complex and big data scenarios.
NASA Astrophysics Data System (ADS)
Bryant, Gerald
2015-04-01
Large-scale soft-sediment deformation features in the Navajo Sandstone have been a topic of interest for nearly 40 years, ever since they were first explored as a criterion for discriminating between marine and continental processes in the depositional environment. For much of this time, evidence for large-scale sediment displacements was commonly attributed to processes of mass wasting, that is, gravity-driven movements of surficial sand. These slope failures were attributed to the inherent susceptibility of dune sand responding to environmental triggers such as earthquakes, floods, impacts, and the differential loading associated with dune topography. During the last decade, a new wave of research has focused on the event significance of deformation features in more detail, revealing a broad diversity of large-scale deformation morphologies. This research has led to a better appreciation of subsurface dynamics in the early Jurassic deformation events recorded in the Navajo Sandstone, including the important role of intrastratal sediment flow. This report documents two illustrative examples of large-scale sediment displacements represented in extensive outcrops of the Navajo Sandstone along the Utah/Arizona border. Architectural relationships in these outcrops provide definitive constraints that enable the recognition of a large-scale sediment outflow at one location and an equally large-scale subsurface flow at the other. At both sites, evidence for associated processes of liquefaction appears at depths of at least 40 m below the original depositional surface, which is nearly an order of magnitude greater than has commonly been reported from modern settings. The surficial mass-flow feature displays attributes that are consistent with much smaller-scale sediment eruptions (sand volcanoes) that are often documented from modern earthquake zones, including the development of hydraulic pressure from localized subsurface liquefaction and the subsequent escape of fluidized sand toward the unconfined conditions of the surface. The origin of the forces that produced the lateral, subsurface movement of a large body of sand at the other site is not readily apparent. The various constraints on modeling the generation of the lateral force required to produce the observed displacement are considered here, along with photodocumentation of key outcrop relationships.
Perspectives on integrated modeling of transport processes in semiconductor crystal growth
NASA Technical Reports Server (NTRS)
Brown, Robert A.
1992-01-01
The wide range of length and time scales involved in industrial scale solidification processes is demonstrated here by considering the Czochralski process for the growth of large diameter silicon crystals that become the substrate material for modern microelectronic devices. The scales range in time from microseconds to thousands of seconds and in space from microns to meters. The physics and chemistry needed to model processes on these different length scales are reviewed.
Icing Simulation Research Supporting the Ice-Accretion Testing of Large-Scale Swept-Wing Models
NASA Technical Reports Server (NTRS)
Yadlin, Yoram; Monnig, Jaime T.; Malone, Adam M.; Paul, Bernard P.
2018-01-01
The work summarized in this report is a continuation of NASA's Large-Scale, Swept-Wing Test Articles Fabrication; Research and Test Support for NASA IRT contract (NNC10BA05 -NNC14TA36T) performed by Boeing under the NASA Research and Technology for Aerospace Propulsion Systems (RTAPS) contract. In the study conducted under RTAPS, a series of icing tests in the Icing Research Tunnel (IRT) have been conducted to characterize ice formations on large-scale swept wings representative of modern commercial transport airplanes. The outcome of that campaign was a large database of ice-accretion geometries that can be used for subsequent aerodynamic evaluation in other experimental facilities and for validation of ice-accretion prediction codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2012-09-25
The Megatux platform enables the emulation of large scale (multi-million node) distributed systems. In particular, it allows for the emulation of large-scale networks interconnecting a very large number of emulated computer systems. It does this by leveraging virtualization and associated technologies to allow hundreds of virtual computers to be hosted on a single moderately sized server or workstation. Virtualization technology provided by modern processors allows for multiple guest OSs to run at the same time, sharing the hardware resources. The Megatux platform can be deployed on a single PC, a small cluster of a few boxes, or a large cluster of computers. With a modest cluster, the Megatux platform can emulate complex organizational networks. By using virtualization, we emulate the hardware but run actual software, enabling large scale without sacrificing fidelity.
Early Gender Test Score Gaps across OECD Countries
ERIC Educational Resources Information Center
Bedard, Kelly; Cho, Insook
2010-01-01
The results reported in this paper contribute to the debate about gender skill gaps in at least three ways. First, we document the large differences in early gender gaps across developed countries using a large scale, modern, representative data source. Second, we show that countries with pro-female sorting, countries that place girls in classes…
Applications of Parallel Process HiMAP for Large Scale Multidisciplinary Problems
NASA Technical Reports Server (NTRS)
Guruswamy, Guru P.; Potsdam, Mark; Rodriguez, David; Kwak, Dochay (Technical Monitor)
2000-01-01
HiMAP is a three level parallel middleware that can be interfaced to a large scale global design environment for code independent, multidisciplinary analysis using high fidelity equations. Aerospace technology needs are rapidly changing. Computational tools compatible with the requirements of national programs such as space transportation are needed. Conventional computation tools are inadequate for modern aerospace design needs. Advanced, modular computational tools are needed, such as those that incorporate the technology of massively parallel processors (MPP).
Obreht, Igor; Hambach, Ulrich; Veres, Daniel; Zeeden, Christian; Bösken, Janina; Stevens, Thomas; Marković, Slobodan B; Klasen, Nicole; Brill, Dominik; Burow, Christoph; Lehmkuhl, Frank
2017-07-19
Understanding the past dynamics of large-scale atmospheric systems is crucial for our knowledge of the palaeoclimate conditions in Europe. Southeastern Europe currently lies at the border between Atlantic, Mediterranean, and continental climate zones. Past changes in the relative influence of associated atmospheric systems must have been recorded in the region's palaeoarchives. By comparing high-resolution grain-size, environmental magnetic and geochemical data from two loess-palaeosol sequences in the Lower Danube Basin with other Eurasian palaeorecords, we reconstructed past climatic patterns over Southeastern Europe and the related interaction of the prevailing large-scale circulation modes over Europe, especially during late Marine Isotope Stage 3 (40,000-27,000 years ago). We demonstrate that during this time interval, the intensification of the Siberian High had a crucial influence on European climate causing the more continental conditions over major parts of Europe, and a southwards shift of the Westerlies. Such a climatic and environmental change, combined with the Campanian Ignimbrite/Y-5 volcanic eruption, may have driven the Anatomically Modern Human dispersal towards Central and Western Europe, pointing to a corridor over the Eastern European Plain as an important pathway in their dispersal.
Souza, Juliana M DE; Galaverna, Renan; Souza, Aline A N DE; Brocksom, Timothy J; Pastre, Julio C; Souza, Rodrigo O M A DE; Oliveira, Kleber T DE
2018-01-01
We present a comprehensive review of the advent and impact of continuous flow chemistry in the synthesis of natural products and drugs, important pharmaceutical products that have driven a revolution in modern healthcare. We detail the beginnings of modern drugs and the large-scale batch mode of production, both chemical and microbiological. The introduction of modern continuous flow chemistry is then presented, both as a technological tool for enabling organic chemistry and as a fundamental research endeavor. This part details the syntheses of bioactive natural products and commercial drugs.
Low-cost production of solar-cell panels
NASA Technical Reports Server (NTRS)
Bickler, D. B.; Gallagher, B. D.; Sanchez, L. E.
1980-01-01
A large-scale production model combines the most modern manufacturing techniques to produce silicon solar-cell panels at low cost by 1982. The model proposes a facility capable of operating around the clock with an annual production capacity of 20 W of solar-cell panels.
Large-Scale Fabrication of Silicon Nanowires for Solar Energy Applications.
Zhang, Bingchang; Jie, Jiansheng; Zhang, Xiujuan; Ou, Xuemei; Zhang, Xiaohong
2017-10-11
The development of silicon (Si) materials during past decades has boosted up the prosperity of the modern semiconductor industry. In comparison with the bulk-Si materials, Si nanowires (SiNWs) possess superior structural, optical, and electrical properties and have attracted increasing attention in solar energy applications. To achieve the practical applications of SiNWs, both large-scale synthesis of SiNWs at low cost and rational design of energy conversion devices with high efficiency are the prerequisite. This review focuses on the recent progresses in large-scale production of SiNWs, as well as the construction of high-efficiency SiNW-based solar energy conversion devices, including photovoltaic devices and photo-electrochemical cells. Finally, the outlook and challenges in this emerging field are presented.
Adjoint-Based Aerodynamic Design of Complex Aerospace Configurations
NASA Technical Reports Server (NTRS)
Nielsen, Eric J.
2016-01-01
An overview of twenty years of adjoint-based aerodynamic design research at NASA Langley Research Center is presented. Adjoint-based algorithms provide a powerful tool for efficient sensitivity analysis of complex large-scale computational fluid dynamics (CFD) simulations. Unlike alternative approaches for which computational expense generally scales with the number of design parameters, adjoint techniques yield sensitivity derivatives of a simulation output with respect to all input parameters at the cost of a single additional simulation. With modern large-scale CFD applications often requiring millions of compute hours for a single analysis, the efficiency afforded by adjoint methods is critical in realizing a computationally tractable design optimization capability for such applications.
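The cost argument in the abstract can be made explicit with the standard discrete-adjoint relations; the notation below (state Q, design variables D, residual R, output J) is generic and is an assumption of this sketch rather than the specific formulation used at NASA Langley.

```latex
% Steady flow solution defined implicitly by the residual equations R(Q, D) = 0,
% output of interest J(Q, D), design variables D = (D_1, ..., D_n).
\left[\frac{\partial R}{\partial Q}\right]^{T}\Lambda
    = -\left[\frac{\partial J}{\partial Q}\right]^{T}
    \qquad\text{(one adjoint solve, independent of } n\text{)}

\frac{\mathrm{d}J}{\mathrm{d}D_i}
    = \frac{\partial J}{\partial D_i}
    + \Lambda^{T}\frac{\partial R}{\partial D_i},
    \qquad i = 1,\dots,n
```

A single linear adjoint solve for Λ, roughly the cost of one additional flow analysis, yields the derivative of J with respect to every design variable, whereas forward (direct) differentiation would require one linear solve per design variable.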
A parallel orbital-updating based plane-wave basis method for electronic structure calculations
NASA Astrophysics Data System (ADS)
Pan, Yan; Dai, Xiaoying; de Gironcoli, Stefano; Gong, Xin-Gao; Rignanese, Gian-Marco; Zhou, Aihui
2017-11-01
Motivated by the recently proposed parallel orbital-updating approach in real space method [1], we propose a parallel orbital-updating based plane-wave basis method for electronic structure calculations, for solving the corresponding eigenvalue problems. In addition, we propose two new modified parallel orbital-updating methods. Compared to the traditional plane-wave methods, our methods allow for two-level parallelization, which is particularly interesting for large scale parallelization. Numerical experiments show that these new methods are more reliable and efficient for large scale calculations on modern supercomputers.
Large-scale shell-model calculations for 32-39P isotopes
NASA Astrophysics Data System (ADS)
Srivastava, P. C.; Hirsch, J. G.; Ermamatov, M. J.; Kota, V. K. B.
2012-10-01
In this work, the structure of 32-39P isotopes is described in the framework of state-of-the-art large-scale shell-model calculations, employing the code ANTOINE with three modern effective interactions: SDPF-U, SDPF-NR, and the extended pairing plus quadrupole-quadrupole-type forces with inclusion of the monopole interaction (EPQQM). Protons are restricted to fill the sd shell, while neutrons are active in the sd-pf valence space. Results for positive- and negative-parity level energies and electromagnetic observables are compared with the available experimental data.
Kidane, A.; Hepelwa, A.; Tingum, E.; Hu, T.W.
2016-01-01
In this study an attempt is made to compare the efficiency of tobacco leaf production with that of three other crops – maize, groundnut and rice – commonly grown by Tanzanian small-scale farmers. The paper compares the prevalence of tobacco use in Africa with that of the developed world; while there has been a decline in the latter, there appears to be an increase in the former. The economic benefits and costs of tobacco production and consumption in Tanzania are also compared. Using nationally representative large-scale data, we observe that the modern agricultural inputs allotted to tobacco were much greater than those allotted to maize, groundnut and rice. Using a frontier production approach, the study shows that the efficiencies of tobacco, maize, groundnut and rice were 75.3%, 68.5%, 64.5% and 46.5%, respectively. Despite the massive infusion of agricultural inputs allotted to it, tobacco production is still only 75.3% efficient: tobacco farmers could have produced the same amount by utilizing only 75.3% of realized inputs. The relatively high efficiency of tobacco can only be explained by the large-scale allocation of modern agricultural inputs such as fertilizer, better seeds, credit facilities and easy access to markets. The situation is likely to be reversed if more inputs were directed to basic food crops such as maize, rice and groundnut. Tanzania's policy of food security and poverty alleviation can only be achieved by allocating more modern inputs to basic necessities such as maize and rice.
NASA Astrophysics Data System (ADS)
Ladd, Matthew; Viau, Andre
2013-04-01
Paleoclimate reconstructions rely on the accuracy of modern climate datasets for calibration of fossil records under the assumption of climate normality through time, which means that the modern climate operates in a similar manner as over the past 2,000 years. In this study, we show how using different modern climate datasets have an impact on a pollen-based reconstruction of mean temperature of the warmest month (MTWA) during the past 2,000 years for North America. The modern climate datasets used to explore this research question include the: Whitmore et al., (2005) modern climate dataset; North American Regional Reanalysis (NARR); National Center For Environmental Prediction (NCEP); European Center for Medium Range Weather Forecasting (ECMWF) ERA-40 reanalysis; WorldClim, Global Historical Climate Network (GHCN) and New et al., which is derived from the CRU dataset. Results show that some caution is advised in using the reanalysis data on large-scale reconstructions. Station data appears to dampen out the variability of the reconstruction produced using station based datasets. The reanalysis or model-based datasets are not recommended for paleoclimate large-scale North American reconstructions as they appear to lack some of the dynamics observed in station datasets (CRU) which resulted in warm-biased reconstructions as compared to the station-based reconstructions. The Whitmore et al. (2005) modern climate dataset appears to be a compromise between CRU-based datasets and model-based datasets except for the ERA-40. In addition, an ultra-high resolution gridded climate dataset such as WorldClim may only be useful if the pollen calibration sites in North America have at least the same spatial precision. We reconstruct the MTWA to within +/-0.01°C by using an average of all curves derived from the different modern climate datasets, demonstrating the robustness of the procedure used. It may be that the use of an average of different modern datasets may reduce the impact of uncertainty of paleoclimate reconstructions, however, this is yet to be determined with certainty. Future evaluation using for example the newly developed Berkeley earth surface temperature datasets should be tested against the paleoclimate record.
Tropospheric transport differences between models using the same large-scale meteorological fields
NASA Astrophysics Data System (ADS)
Orbe, Clara; Waugh, Darryn W.; Yang, Huang; Lamarque, Jean-Francois; Tilmes, Simone; Kinnison, Douglas E.
2017-01-01
The transport of chemicals is a major uncertainty in the modeling of tropospheric composition. A common approach is to transport gases using the winds from meteorological analyses, either using them directly in a chemical transport model or by constraining the flow in a general circulation model. Here we compare the transport of idealized tracers in several different models that use the same meteorological fields taken from Modern-Era Retrospective analysis for Research and Applications (MERRA). We show that, even though the models use the same meteorological fields, there are substantial differences in their global-scale tropospheric transport related to large differences in parameterized convection between the simulations. Furthermore, we find that the transport differences between simulations constrained with the same large-scale flow are larger than differences between free-running simulations, which have differing large-scale flow but much more similar convective mass fluxes. Our results indicate that more attention needs to be paid to convective parameterizations in order to understand large-scale tropospheric transport in models, particularly in simulations constrained with analyzed winds.
Code modernization and modularization of APEX and SWAT watershed simulation models
USDA-ARS?s Scientific Manuscript database
SWAT (Soil and Water Assessment Tool) and APEX (Agricultural Policy / Environmental eXtender) are respectively large and small watershed simulation models derived from EPIC Environmental Policy Integrated Climate), a field-scale agroecology simulation model. All three models are coded in FORTRAN an...
Introducing Large-Scale Innovation in Schools
ERIC Educational Resources Information Center
Sotiriou, Sofoklis; Riviou, Katherina; Cherouvis, Stephanos; Chelioti, Eleni; Bogner, Franz X.
2016-01-01
Education reform initiatives tend to promise higher effectiveness in classrooms especially when emphasis is given to e-learning and digital resources. Practical changes in classroom realities or school organization, however, are lacking. A major European initiative entitled Open Discovery Space (ODS) examined the challenge of modernizing school…
History of Education in Modern and Contemporary Europe: New Sources and Lines of Research
ERIC Educational Resources Information Center
Sani, Roberto
2013-01-01
by Albisetti focuses primarily on the nineteenth century, and on some large-scale trends and issues, such as those relating to education and secondary instruction for women. Discussing this issue implies--especially in the diverse and heterogeneous context of…
Imaging detectors and electronics—a view of the future
NASA Astrophysics Data System (ADS)
Spieler, Helmuth
2004-09-01
Imaging sensors and readout electronics have made tremendous strides in the past two decades. The application of modern semiconductor fabrication techniques and the introduction of customized monolithic integrated circuits have made large-scale imaging systems routine in high-energy physics. This technology is now finding its way into other areas, such as space missions, synchrotron light sources, and medical imaging. I review current developments and discuss the promise and limits of new technologies. Several detector systems are described as examples of future trends. The discussion emphasizes semiconductor detector systems, but I also include recent developments for large-scale superconducting detector arrays.
REVIEWS OF TOPICAL PROBLEMS: Large-scale star formation in galaxies
NASA Astrophysics Data System (ADS)
Efremov, Yurii N.; Chernin, Artur D.
2003-01-01
A brief review is given of the history of modern ideas on the ongoing star formation process in the gaseous disks of galaxies. Recent studies demonstrate the key role of the interplay between the gas self-gravitation and its turbulent motions. The large scale supersonic gas flows create structures of enhanced density which then give rise to the gravitational condensation of gas into stars and star clusters. Formation of star clusters, associations and complexes is considered, as well as the possibility of isolated star formation. Special emphasis is placed on star formation under the action of ram pressure.
Autonomous Energy Grids | Grid Modernization | NREL
control themselves using advanced machine learning and simulation to create resilient, reliable, and affordable optimized energy systems. Current frameworks to monitor, control, and optimize large-scale energy of optimization theory, control theory, big data analytics, and complex system theory and modeling to
Quantitative properties of clustering within modern microscopic nuclear models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Volya, A.; Tchuvil’sky, Yu. M., E-mail: tchuvl@nucl-th.sinp.msu.ru
2016-09-15
A method for studying cluster spectroscopic properties of nuclear fragmentation, such as spectroscopic amplitudes, cluster form factors, and spectroscopic factors, is developed on the basis of modern precision nuclear models that take into account the mixing of large-scale shell-model configurations. Alpha-cluster channels are considered as an example. A mathematical proof of the need for taking into account the channel-wave-function renormalization generated by exchange terms of the antisymmetrization operator (Fliessbach effect) is given. Examples where this effect is confirmed by a high quality of the description of experimental data are presented. By and large, the method in question extends substantially the possibilities for studying clustering phenomena in nuclei and for improving the quality of their description.
Scalable Automated Model Search
2014-05-20
machines. Categories and Subject Descriptors Big Data [Distributed Computing]: Large scale optimization 1. INTRODUCTION Modern scientific and...from Continuum Analytics[1], and Apache Spark 0.8.1. Additionally, we made use of Hadoop 1.0.4 configured on local disks as our data store for the large...Borkar et al. Hyracks: A flexible and extensible foundation for data-intensive computing. In ICDE, 2011. [16] J. Canny and H. Zhao. Big data
Cormode, Graham; Dasgupta, Anirban; Goyal, Amit; Lee, Chi Hoon
2018-01-01
Many modern applications of AI such as web search, mobile browsing, image processing, and natural language processing rely on finding similar items from a large database of complex objects. Due to the very large scale of data involved (e.g., users' queries from commercial search engines), computing such near or nearest neighbors is a non-trivial task, as the computational cost grows significantly with the number of items. To address this challenge, we adopt Locality Sensitive Hashing (a.k.a. LSH) methods and evaluate four variants in a distributed computing environment (specifically, Hadoop). We identify several optimizations which improve performance and are suitable for deployment in very large scale settings. The experimental results demonstrate that our variants of LSH achieve robust performance with better recall compared with "vanilla" LSH, even when using the same amount of space.
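As a point of reference for how LSH trades exactness for speed, here is a minimal single-machine sketch of one common LSH family, random-hyperplane (cosine) hashing with banded buckets. It is not one of the four Hadoop variants evaluated in the paper; the dimensionality, number of bits, and banding parameters are assumptions chosen for the example.

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(42)

def simhash_signatures(X, n_bits=64):
    """Random-hyperplane LSH: each bit is the sign of a projection onto a random plane."""
    planes = rng.standard_normal((X.shape[1], n_bits))
    return X @ planes > 0                      # (n_items, n_bits) boolean signatures

def band_buckets(sigs, bands=8):
    """LSH banding: items whose signatures agree on an entire band share a bucket."""
    rows = sigs.shape[1] // bands
    buckets = defaultdict(set)
    for i in range(sigs.shape[0]):
        for b in range(bands):
            key = (b,) + tuple(sigs[i, b * rows:(b + 1) * rows])
            buckets[key].add(i)
    return buckets

def candidate_neighbors(sigs, buckets, query_id, bands=8):
    """Items sharing at least one band bucket with the query: the near-neighbor candidates."""
    rows = sigs.shape[1] // bands
    out = set()
    for b in range(bands):
        key = (b,) + tuple(sigs[query_id, b * rows:(b + 1) * rows])
        out |= buckets.get(key, set())
    out.discard(query_id)
    return out

# Toy usage: 10,000 random 100-dimensional vectors.
X = rng.standard_normal((10_000, 100))
sigs = simhash_signatures(X)
buckets = band_buckets(sigs)
print(len(candidate_neighbors(sigs, buckets, query_id=0)), "candidates for item 0")
```

Only the candidates returned by the hash lookup need an exact distance computation, so the per-query cost no longer grows with the full database size; distributing the buckets across machines, as the Hadoop variants above do, follows the same logic.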
An Introduction to the Computerized Adaptive Testing
ERIC Educational Resources Information Center
Tian, Jian-quan; Miao, Dan-min; Zhu, Xia; Gong, Jing-jing
2007-01-01
Computerized adaptive testing (CAT) has unsurpassable advantages over traditional testing. It has become the mainstream in large scale examinations in modern society. This paper gives a brief introduction to CAT including differences between traditional testing and CAT, the principles of CAT, psychometric theory and computer algorithms of CAT, the…
Retooling Education: Testing and the Liberal Arts
ERIC Educational Resources Information Center
Jackson, Robert L.
2007-01-01
The motivation and methodology for measuring intelligence have changed repeatedly in the modern history of large-scale student testing. Test makers have always sought to identify raw aptitude for cultivation, but they have never figured out how to promote excellence while preserving equality. They've settled for egalitarianism, which gives rise to…
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Critical infrastructures around the world are at constant risk from earthquakes. Most of these critical structures were designed using archaic seismic simulation methods built for the early digital computers of the 1970s. Idaho National Laboratory's Seismic Research Group is working to modernize these simulation methods through computational research and large-scale laboratory experiments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waddell, Lucas; Muldoon, Frank; Henry, Stephen Michael
In order to effectively plan the management and modernization of their large and diverse fleets of vehicles, Program Executive Office Ground Combat Systems (PEO GCS) and Program Executive Office Combat Support and Combat Service Support (PEO CS&CSS) commissioned the development of a large-scale portfolio planning optimization tool. This software, the Capability Portfolio Analysis Tool (CPAT), creates a detailed schedule that optimally prioritizes the modernization or replacement of vehicles within the fleet - respecting numerous business rules associated with fleet structure, budgets, industrial base, research and testing, etc., while maximizing overall fleet performance through time. This paper contains a thorough documentation of the terminology, parameters, variables, and constraints that comprise the fleet management mixed integer linear programming (MILP) mathematical formulation. This paper, which is an update to the original CPAT formulation document published in 2015 (SAND2015-3487), covers the formulation of important new CPAT features.
Compactified cosmological simulations of the infinite universe
NASA Astrophysics Data System (ADS)
Rácz, Gábor; Szapudi, István; Csabai, István; Dobos, László
2018-06-01
We present a novel N-body simulation method that compactifies the infinite spatial extent of the Universe into a finite sphere with isotropic boundary conditions to follow the evolution of the large-scale structure. Our approach eliminates the need for periodic boundary conditions, a mere numerical convenience which is not supported by observation and which modifies the law of force on large scales in an unrealistic fashion. We demonstrate that our method outclasses standard simulations executed on workstation-scale hardware in dynamic range; it is balanced in following a comparable number of high and low k modes, and its fundamental geometry and topology match observations. Our approach is also capable of simulating an expanding, infinite universe in static coordinates with Newtonian dynamics. The price of these achievements is that most of the simulated volume has smoothly varying mass and spatial resolution, an approximation that carries different systematics than periodic simulations. Our initial implementation of the method is called StePS, which stands for Stereographically Projected Cosmological Simulations. It uses stereographic projection for space compactification and a naive O(N²) force calculation, which nevertheless arrives at a correlation function of the same quality faster than any standard (tree or P3M) algorithm with similar spatial and mass resolution. The O(N²) force calculation is easy to adapt to modern graphics cards, hence our code can function as a high-speed prediction tool for modern large-scale surveys. To learn about the limits of the respective methods, we compare StePS with GADGET-2 running matching initial conditions.
Population Policy: Abortion and Modern Contraception Are Substitutes.
Miller, Grant; Valente, Christine
2016-08-01
A longstanding debate exists in population policy about the relationship between modern contraception and abortion. Although theory predicts that they should be substitutes, the empirical evidence is difficult to interpret. What is required is a large-scale intervention that alters the supply (or full price) of one or the other and, importantly, that does so in isolation (reproductive health programs often bundle primary health care and family planning-and in some instances, abortion services). In this article, we study Nepal's 2004 legalization of abortion provision and subsequent expansion of abortion services, an unusual and rapidly implemented policy meeting these requirements. Using four waves of rich individual-level data representative of fertile-age Nepalese women, we find robust evidence of substitution between modern contraception and abortion. This finding has important implications for public policy and foreign aid, suggesting that an effective strategy for reducing expensive and potentially unsafe abortions may be to expand the supply of modern contraceptives.
Topological Properties of Some Integrated Circuits for Very Large Scale Integration Chip Designs
NASA Astrophysics Data System (ADS)
Swanson, S.; Lanzerotti, M.; Vernizzi, G.; Kujawski, J.; Weatherwax, A.
2015-03-01
This talk presents topological properties of integrated circuits for Very Large Scale Integration (VLSI) chip designs. Such circuits can be implemented in very large scale integrated circuits, for example those in high-performance microprocessors. Prior work considered basic combinational logic functions and produced a mathematical framework based on algebraic topology for integrated circuits composed of logic gates. Prior work also produced an historically equivalent interpretation of Mr. E. F. Rent's work for today's complex circuitry in modern high-performance microprocessors, where a heuristic linear relationship was observed between the number of connections and the number of logic gates. This talk will examine the topological properties and connectivity of more complex, functionally equivalent integrated circuits. The views expressed in this article are those of the author and do not reflect the official policy or position of the United States Air Force, the Department of Defense, or the U.S. Government.
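For context on the heuristic relationship attributed to E. F. Rent mentioned above: it is commonly quoted as a power law between the number of external connections T of a circuit block and the number of logic gates G it contains, which appears as a straight line on log-log axes. The symbols below are the conventional ones and are an assumption of this note, not notation taken from the talk itself.

```latex
% Rent's rule in its commonly quoted form:
%   T = external connections (terminals), G = logic gates in the block,
%   t = average terminals per gate, 0 < p < 1 the Rent exponent.
T = t\,G^{\,p}
\qquad\Longleftrightarrow\qquad
\log T = \log t + p \log G
```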
Sedimentary processes of the Bagnold Dunes: Implications for the eolian rock record of Mars
NASA Astrophysics Data System (ADS)
Ewing, R. C.; Lapotre, M. G. A.; Lewis, K. W.; Day, M.; Stein, N.; Rubin, D. M.; Sullivan, R.; Banham, S.; Lamb, M. P.; Bridges, N. T.; Gupta, S.; Fischer, W. W.
2017-12-01
The Mars Science Laboratory rover Curiosity visited two active wind-blown sand dunes within Gale crater, Mars, which provided the first ground-based opportunity to compare Martian and terrestrial eolian dune sedimentary processes and study a modern analog for the Martian eolian rock record. Orbital and rover images of these dunes reveal terrestrial-like and uniquely Martian processes. The presence of grainfall, grainflow, and impact ripples resembled terrestrial dunes. Impact ripples were present on all dune slopes and had a size and shape similar to their terrestrial counterpart. Grainfall and grainflow occurred on dune and large-ripple lee slopes. Lee slopes were 29° where grainflows were present and 33° where grainfall was present. These slopes are interpreted as the dynamic and static angles of repose, respectively. Grain size measured on an undisturbed impact ripple ranges between 50 μm and 350 μm with an intermediate axis mean size of 113 μm (median: 103 μm). Dissimilar to dune eolian processes on Earth, large, meter-scale ripples were present on all dune slopes. Large ripples had nearly symmetric to strongly asymmetric topographic profiles and heights ranging between 12 cm and 28 cm. The composite observations of the modern sedimentary processes highlight that the Martian eolian rock record is likely different from its terrestrial counterpart because of the large ripples, which are expected to engender a unique scale of cross stratification. More broadly, however, in the Bagnold Dune Field as on Earth, dune-field pattern dynamics and basin-scale boundary conditions will dictate the style and distribution of sedimentary processes.
Transforming Power Systems Through Global Collaboration
DOE Office of Scientific and Technical Information (OSTI.GOV)
2017-06-01
Ambitious and integrated policy and regulatory frameworks are crucial to achieve power system transformation. The 21st Century Power Partnership -- a multilateral initiative of the Clean Energy Ministerial -- serves as a platform for public-private collaboration to advance integrated solutions for the large-scale deployment of renewable energy in combination with energy efficiency and grid modernization.
"Large"- vs small-scale friction control in turbulent channel flow
NASA Astrophysics Data System (ADS)
Canton, Jacopo; Örlü, Ramis; Chin, Cheng; Schlatter, Philipp
2017-11-01
We reconsider the "large-scale" control scheme proposed by Hussain and co-workers (Phys. Fluids 10, 1049-1051, 1998, and Phys. Rev. Fluids 2, 62601, 2017), using new direct numerical simulations (DNS). The DNS are performed in a turbulent channel at friction Reynolds numbers Reτ of up to 550 in order to eliminate low-Reynolds-number effects. The purpose of the present contribution is to re-assess this control method in the light of more modern developments in the field, in particular also related to the discovery of (very) large-scale motions. The goals of the paper are as follows: First, we want to better characterise the physics of the control and assess which external contributions (vortices, forcing, wall motion) are actually needed. Then, we investigate the optimal parameters and, finally, determine which aspects of this control technique actually scale in outer units and can therefore be of use in practical applications. In addition to discussing the mentioned drag-reduction effects, the present contribution will also address the potential effect of the naturally occurring large-scale motions on frictional drag, and give indications on the physical processes for potential drag reduction possible at all Reynolds numbers.
Constraining Modern and Historic Mercury Emissions From Gold Mining
NASA Astrophysics Data System (ADS)
Strode, S. A.; Jaeglé, L.; Selin, N. E.; Sunderland, E.
2007-12-01
Mercury emissions from both historic gold and silver mining and modern small-scale gold mining are highly uncertain. Historic mercury emissions can affect the modern atmosphere through reemission from land and ocean, and quantifying mercury emissions from historic gold and silver mining can help constrain modern mining sources. While estimates of mercury emissions during historic gold rushes exceed modern anthropogenic mercury emissions in North America, sediment records in many regions do not show a strong gold rush signal. We use the GEOS-Chem chemical transport model to determine the spatial footprint of mercury emissions from mining and compare model runs from gold rush periods to sediment and ice core records of historic mercury deposition. Based on records of gold and silver production, we include mercury emissions from North and South American mining of 1900 Mg/year in 1880, compared to modern global anthropogenic emissions of 3400 Mg/year. Including this large mining source in GEOS-Chem leads to an overestimate of the modeled 1880 to preindustrial enhancement ratio compared to the sediment core record. We conduct sensitivity studies to constrain the level of mercury emissions from modern and historic mining that is consistent with the deposition records for different regions.
Graham, Jay P; Leibler, Jessica H; Price, Lance B; Otte, Joachim M; Pfeiffer, Dirk U; Tiensin, T; Silbergeld, Ellen K
2008-01-01
Understanding interactions between animals and humans is critical in preventing outbreaks of zoonotic disease. This is particularly important for avian influenza. Food animal production has been transformed since the 1918 influenza pandemic. Poultry and swine production have changed from small-scale methods to industrial-scale operations. There is substantial evidence of pathogen movement between and among these industrial facilities, release to the external environment, and exposure to farm workers, which challenges the assumption that modern poultry production is more biosecure and biocontained as compared with backyard or small holder operations in preventing introduction and release of pathogens. An analysis of data from the Thai government investigation in 2004 indicates that the odds of H5N1 outbreaks and infections were significantly higher in large-scale commercial poultry operations as compared with backyard flocks. These data suggest that successful strategies to prevent or mitigate the emergence of pandemic avian influenza must consider risk factors specific to modern industrialized food animal production.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lawton, Craig R.
2015-01-01
The military is undergoing a significant transformation as it modernizes for the information age and adapts to address an emerging asymmetric threat beyond traditional cold war era adversaries. Techniques such as traditional large-scale, joint services war gaming analysis are no longer adequate to support program evaluation activities and mission planning analysis at the enterprise level because the operating environment is evolving too quickly. New analytical capabilities are necessary to address modernization of the Department of Defense (DoD) enterprise. This presents significant opportunity to Sandia in supporting the nation at this transformational enterprise scale. Although Sandia has significant experience with engineering system of systems (SoS) and Complex Adaptive System of Systems (CASoS), significant fundamental research is required to develop modeling, simulation and analysis capabilities at the enterprise scale. This report documents an enterprise modeling framework which will enable senior level decision makers to better understand their enterprise and required future investments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Diaz, H.F.; Andrews, J.T.; Short, S.K.
The characteristic anomaly patterns of modern surface temperature and precipitation are compared to tree-ring indices (0-300 yr) and fossil pollen (0-6000 yr) variations in northern North America. The data base consists of 245 climate stations, 55 tree-ring chronologies, 153 modern pollen collections, and 39 fossil pollen sites. A few areas exhibit relatively high climatic sensitivity, displaying generally consistent patterns during alternate warm and cold periods, regardless of time scales. The surface changes are related to the redistribution (i.e., changes in the mean position and strength) of the planetary-scale waves and to north-south shifts in the mean boundary of the Arctic Front. The zone where the largest changes occur is typically located along the mean present-day boundary between Arctic and Pacific airstreams. Establishing plausible relationships between vegetation responses and concomitant changes in atmospheric circulation patterns increases our confidence that the paleoclimatic signals are indeed related to large-scale circulation changes.
Large Scale Landslide Database System Established for the Reservoirs in Southern Taiwan
NASA Astrophysics Data System (ADS)
Tsai, Tsai-Tsung; Tsai, Kuang-Jung; Shieh, Chjeng-Lun
2017-04-01
Typhoon Morakot's severe impact on southern Taiwan awakened public awareness of large-scale landslide disasters. Large-scale landslide disasters produce large quantities of sediment, which negatively affect the operating functions of reservoirs. In order to reduce the risk of these disasters within the study area, the establishment of a database for hazard mitigation and disaster prevention is necessary. Real-time data and numerous archives of engineering data, environmental information, photos, and video will not only help people make appropriate decisions, but also provide material for further processing and value-added use. The study sought to define basic data formats/standards from the various types of data collected about these reservoirs and then to provide a management platform based on these formats/standards. Meanwhile, to ensure practicality and convenience, the large-scale landslide disaster database system is built with the ability both to provide and to receive information, so that users can use the system on different types of devices. IT technology progresses extremely quickly, and even the most modern system may become outdated at any time. In order to provide long-term service, the system reserves the possibility of user-defined data formats/standards and a user-defined system structure. The system established by this study is based on the HTML5 standard and uses responsive web design, so that users can easily operate and further develop this large-scale landslide disaster database system.
Bukowski, Beth E; Baker, William L
2013-04-01
Sagebrush landscapes provide habitat for Sage-Grouse and other sagebrush obligates, yet historical fire regimes and the structure of historical sagebrush landscapes are poorly known, hampering ecological restoration and management. To remedy this, General Land Office Survey (GLO) survey notes were used to reconstruct over two million hectares of historical vegetation for four sagebrush-dominated (Artemisia spp.) study areas in the western United States. Reconstructed vegetation was analyzed for fire indicators used to identify historical fires and reconstruct historical fire regimes. Historical fire-size distributions were inverse-J shaped, and one fire > 100 000 ha was identified. Historical fire rotations were estimated at 171-342 years for Wyoming big sagebrush (A. tridentata ssp. wyomingensis) and 137-217 years for mountain big sagebrush (A. tridentata ssp. vaseyana). Historical fire and patch sizes were significantly larger in Wyoming big sagebrush than mountain big sagebrush, and historical fire rotations were significantly longer in Wyoming big sagebrush than mountain big sagebrush. Historical fire rotations in Wyoming were longer than those in other study areas. Fine-scale mosaics of burned and unburned area and larger unburned inclusions within fire perimeters were less common than in modern fires. Historical sagebrush landscapes were dominated by large, contiguous areas of sagebrush, though large grass-dominated areas and finer-scale mosaics of grass and sagebrush were also present in smaller amounts. Variation in sagebrush density was a common source of patchiness, and areas classified as "dense" made up 24.5% of total sagebrush area, compared to 16.3% for "scattered" sagebrush. Results suggest significant differences in historical and modern fire regimes. Modern fire rotations in Wyoming big sagebrush are shorter than historical fire rotations. Results also suggest that historical sagebrush landscapes would have fluctuated, because of infrequent episodes of large fires and long periods of recovery and maturity. Due to fragmentation of sagebrush landscapes, the large, contiguous expanses of sagebrush that dominated historically are most at risk and in need of conservation, including both dense and scattered sagebrush. Fire suppression in Wyoming big sagebrush may also be advisable, as modern fire rotations are shorter than their historical counterparts.
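Fire rotation, as used above, is the time required to burn an area equal to the whole study area. A minimal arithmetic sketch (with assumed round numbers, not the study's GLO-derived values) illustrates the calculation:

```python
# Illustrative fire-rotation calculation; all numbers are hypothetical, not from the study.
period_years = 40          # length of the reconstruction window (assumed)
study_area_ha = 500_000    # sagebrush area examined (assumed)
area_burned_ha = 75_000    # total area burned during the window (assumed)

# rotation = period * (study area / area burned during that period)
fire_rotation_years = period_years * study_area_ha / area_burned_ha
print(f"Estimated fire rotation: {fire_rotation_years:.0f} years")  # -> 267 years
```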
Extracellular matrix motion and early morphogenesis
Loganathan, Rajprasad; Rongish, Brenda J.; Smith, Christopher M.; Filla, Michael B.; Czirok, Andras; Bénazéraf, Bertrand
2016-01-01
For over a century, embryologists who studied cellular motion in early amniotes generally assumed that morphogenetic movement reflected migration relative to a static extracellular matrix (ECM) scaffold. However, as we discuss in this Review, recent investigations reveal that the ECM is also moving during morphogenesis. Time-lapse studies show how convective tissue displacement patterns, as visualized by ECM markers, contribute to morphogenesis and organogenesis. Computational image analysis distinguishes between cell-autonomous (active) displacements and convection caused by large-scale (composite) tissue movements. Modern quantification of large-scale ‘total’ cellular motion and the accompanying ECM motion in the embryo demonstrates that a dynamic ECM is required for generation of the emergent motion patterns that drive amniote morphogenesis. PMID:27302396
ERIC Educational Resources Information Center
Vincent, Jack E.
Part of a large scale research project to test various theories with regard to their ability to analyze international relations, this monograph presents data on the application of distance theory to patterns of cooperation among nations. Distance theory implies that international relations systems (nations, organizations, individuals, etc.) can be…
ERIC Educational Resources Information Center
Schlenker, Richard M.; And Others
Information is presented about the problems involved in using sea water in the steam propulsion systems of large, modern ships. Discussions supply background chemical information concerning the problems of corrosion, scale buildup, and sludge production. Suggestions are given for ways to maintain a good water treatment program to effectively deal…
ERIC Educational Resources Information Center
McGann, Sean T.; Frost, Raymond D.; Matta, Vic; Huang, Wayne
2007-01-01
Information Systems (IS) departments are facing challenging times as enrollments decline and the field evolves, thus necessitating large-scale curriculum changes. Our experience shows that many IS departments are in such a predicament as they have not evolved content quickly enough to keep it relevant, they do a poor job coordinating curriculum…
Deeper Look at Student Learning of Quantum Mechanics: The Case of Tunneling
ERIC Educational Resources Information Center
McKagan, S. B.; Perkins, K. K.; Wieman, C. E.
2008-01-01
We report on a large-scale study of student learning of quantum tunneling in four traditional and four transformed modern physics courses. In the transformed courses, which were designed to address student difficulties found in previous research, students still struggle with many of the same issues found in other courses. However, the reasons for…
Housing first on a large scale: Fidelity strengths and challenges in the VA's HUD-VASH program.
Kertesz, Stefan G; Austin, Erika L; Holmes, Sally K; DeRussy, Aerin J; Van Deusen Lukas, Carol; Pollio, David E
2017-05-01
Housing First (HF) combines permanent supportive housing and supportive services for homeless individuals and removes traditional treatment-related preconditions for housing entry. There has been little research describing strengths and shortfalls of HF implementation outside of research demonstration projects. The U.S. Department of Veterans Affairs (VA) has transitioned to an HF approach in a supportive housing program serving over 85,000 persons. This offers a naturalistic window to study fidelity when HF is adopted on a large scale. We operationalized HF into 20 criteria grouped into 5 domains. We assessed 8 VA medical centers twice (1 year apart), scoring each criterion using a scale ranging from 1 (low fidelity) to 4 (high fidelity). There were 2 HF domains (no preconditions and rapidly offering permanent housing) for which high fidelity was readily attained. There was uneven progress in prioritizing the most vulnerable clients for housing support. Two HF domains (sufficient supportive services and a modern recovery philosophy) had considerably lower fidelity. Interviews suggested that operational issues such as shortfalls in staffing and training likely hindered performance in these 2 domains. In this ambitious national HF program, the largest to date, we found substantial fidelity in focusing on permanent housing and removal of preconditions to housing entry. Areas of concern included the adequacy of supportive services and adequacy in deployment of a modern recovery philosophy. Under real-world conditions, large-scale implementation of HF is likely to require significant additional investment in client service supports to assure that results are concordant with those found in research studies. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Single-chip microprocessor that communicates directly using light
NASA Astrophysics Data System (ADS)
Sun, Chen; Wade, Mark T.; Lee, Yunsup; Orcutt, Jason S.; Alloatti, Luca; Georgas, Michael S.; Waterman, Andrew S.; Shainline, Jeffrey M.; Avizienis, Rimas R.; Lin, Sen; Moss, Benjamin R.; Kumar, Rajesh; Pavanello, Fabio; Atabaki, Amir H.; Cook, Henry M.; Ou, Albert J.; Leu, Jonathan C.; Chen, Yu-Hsin; Asanović, Krste; Ram, Rajeev J.; Popović, Miloš A.; Stojanović, Vladimir M.
2015-12-01
Data transport across short electrical wires is limited by both bandwidth and power density, which creates a performance bottleneck for semiconductor microchips in modern computer systems—from mobile phones to large-scale data centres. These limitations can be overcome by using optical communications based on chip-scale electronic-photonic systems enabled by silicon-based nanophotonic devices. However, combining electronics and photonics on the same chip has proved challenging, owing to microchip manufacturing conflicts between electronics and photonics. Consequently, current electronic-photonic chips are limited to niche manufacturing processes and include only a few optical devices alongside simple circuits. Here we report an electronic-photonic system on a single chip integrating over 70 million transistors and 850 photonic components that work together to provide logic, memory, and interconnect functions. This system is a realization of a microprocessor that uses on-chip photonic devices to directly communicate with other chips using light. To integrate electronics and photonics at the scale of a microprocessor chip, we adopt a ‘zero-change’ approach to the integration of photonics. Instead of developing a custom process to enable the fabrication of photonics, which would complicate or eliminate the possibility of integration with state-of-the-art transistors at large scale and at high yield, we design optical devices using a standard microelectronics foundry process that is used for modern microprocessors. This demonstration could represent the beginning of an era of chip-scale electronic-photonic systems with the potential to transform computing system architectures, enabling more powerful computers, from network infrastructure to data centres and supercomputers.
Single-chip microprocessor that communicates directly using light.
Sun, Chen; Wade, Mark T; Lee, Yunsup; Orcutt, Jason S; Alloatti, Luca; Georgas, Michael S; Waterman, Andrew S; Shainline, Jeffrey M; Avizienis, Rimas R; Lin, Sen; Moss, Benjamin R; Kumar, Rajesh; Pavanello, Fabio; Atabaki, Amir H; Cook, Henry M; Ou, Albert J; Leu, Jonathan C; Chen, Yu-Hsin; Asanović, Krste; Ram, Rajeev J; Popović, Miloš A; Stojanović, Vladimir M
2015-12-24
Data transport across short electrical wires is limited by both bandwidth and power density, which creates a performance bottleneck for semiconductor microchips in modern computer systems--from mobile phones to large-scale data centres. These limitations can be overcome by using optical communications based on chip-scale electronic-photonic systems enabled by silicon-based nanophotonic devices. However, combining electronics and photonics on the same chip has proved challenging, owing to microchip manufacturing conflicts between electronics and photonics. Consequently, current electronic-photonic chips are limited to niche manufacturing processes and include only a few optical devices alongside simple circuits. Here we report an electronic-photonic system on a single chip integrating over 70 million transistors and 850 photonic components that work together to provide logic, memory, and interconnect functions. This system is a realization of a microprocessor that uses on-chip photonic devices to directly communicate with other chips using light. To integrate electronics and photonics at the scale of a microprocessor chip, we adopt a 'zero-change' approach to the integration of photonics. Instead of developing a custom process to enable the fabrication of photonics, which would complicate or eliminate the possibility of integration with state-of-the-art transistors at large scale and at high yield, we design optical devices using a standard microelectronics foundry process that is used for modern microprocessors. This demonstration could represent the beginning of an era of chip-scale electronic-photonic systems with the potential to transform computing system architectures, enabling more powerful computers, from network infrastructure to data centres and supercomputers.
Siver, Peter A; Jo, Bok Yeon; Kim, Jong Im; Shin, Woongghi; Lott, Anne Marie; Wolfe, Alexander P
2015-06-01
Heterokont algae of the class Synurophyceae, characterized by distinctive siliceous scales that cover the surface of the cell, are ecologically important in inland waters, yet their evolutionary history remains enigmatic. We explore phylogenetic relationships within this group of algae relative to geologic time, with a focus on evolution of siliceous components. We combined an expansive five-gene and time-calibrated molecular phylogeny of synurophyte algae with an extensive array of fossil specimens from the middle Eocene to infer evolutionary trends within the group. The group originated in the Jurassic approximately 157 million years ago (Ma), with the keystone genera Mallomonas and Synura diverging during the Early Cretaceous at 130 Ma. Mallomonas further splits into two major subclades, signaling the evolution of the V-rib believed to aid in the spacing and organization of scales on the cell covering. Synura also diverges into two primary subclades, separating taxa with forward-projecting spines on the scale from those with a keel positioned on the scale proper. Approximately one third of the fossil species are extinct, whereas the remaining taxa are linked to modern congeners. The taxonomy of synurophytes, which relies extensively on the morphology of the siliceous components, is largely congruent with molecular analyses. Scales of extinct synurophytes were significantly larger than those of modern taxa and may have played a role in their demise. In contrast, many fossil species linked to modern lineages were smaller in the middle Eocene, possibly reflecting growth in the greenhouse climatic state that characterized this geologic interval. © 2015 Botanical Society of America, Inc.
Mechanisation of large-scale agricultural fields in developing countries - a review.
Onwude, Daniel I; Abdulstter, Rafia; Gomes, Chandima; Hashim, Norhashila
2016-09-01
Mechanisation of large-scale agricultural fields often requires the application of modern technologies such as mechanical power, automation, control and robotics. These technologies are generally associated with relatively well developed economies. The application of these technologies in some developing countries in Africa and Asia is limited by factors such as technology compatibility with the environment, availability of resources to facilitate the technology adoption, cost of technology purchase, government policies, adequacy of technology and appropriateness in addressing the needs of the population. As a result, many of the available resources have been used inadequately by farmers, who continue to rely mostly on conventional means of agricultural production, using traditional tools and equipment in most cases. This has led to low productivity and high production costs, among other problems. Therefore, this paper attempts to evaluate the application of present day technology and its limitations to the advancement of large-scale mechanisation in developing countries of Africa and Asia. Particular emphasis is given to a general understanding of the various levels of mechanisation, present day technology, its management and application to large-scale agricultural fields. This review also emphasizes a future outlook that will enable a gradual, evolutionary and sustainable technological change. The study concludes that large-scale agricultural farm mechanisation for sustainable food production in Africa and Asia must be anchored on a coherent strategy based on the actual needs and priorities of the large-scale farmers. © 2016 Society of Chemical Industry.
Sedimentary processes of the Bagnold Dunes: Implications for the eolian rock record of Mars.
Ewing, R C; Lapotre, M G A; Lewis, K W; Day, M; Stein, N; Rubin, D M; Sullivan, R; Banham, S; Lamb, M P; Bridges, N T; Gupta, S; Fischer, W W
2017-12-01
The Mars Science Laboratory rover Curiosity visited two active wind-blown sand dunes within Gale crater, Mars, which provided the first ground-based opportunity to compare Martian and terrestrial eolian dune sedimentary processes and study a modern analog for the Martian eolian rock record. Orbital and rover images of these dunes reveal terrestrial-like and uniquely Martian processes. The presence of grainfall, grainflow, and impact ripples resembled terrestrial dunes. Impact ripples were present on all dune slopes and had a size and shape similar to their terrestrial counterpart. Grainfall and grainflow occurred on dune and large-ripple lee slopes. Lee slopes were ~29° where grainflows were present and ~33° where grainfall was present. These slopes are interpreted as the dynamic and static angles of repose, respectively. Grain size measured on an undisturbed impact ripple ranges between 50 μm and 350 μm with an intermediate axis mean size of 113 μm (median: 103 μm). Dissimilar to dune eolian processes on Earth, large, meter-scale ripples were present on all dune slopes. Large ripples had nearly symmetric to strongly asymmetric topographic profiles and heights ranging between 12 cm and 28 cm. The composite observations of the modern sedimentary processes highlight that the Martian eolian rock record is likely different from its terrestrial counterpart because of the large ripples, which are expected to engender a unique scale of cross stratification. More broadly, however, in the Bagnold Dune Field as on Earth, dune-field pattern dynamics and basin-scale boundary conditions will dictate the style and distribution of sedimentary processes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stevens, E.J.; McNeilly, G.S.
The existing National Center for Atmospheric Research (NCAR) code in the Hamburg Oceanic Carbon Cycle Circulation Model and the Hamburg Large-Scale Geostrophic Ocean General Circulation Model was modernized and reduced in size while still producing an equivalent end result. A reduction in the size of the existing code from more than 50,000 lines to approximately 7,500 lines in the new code has made the new code much easier to maintain. The existing code in the Hamburg model uses legacy NCAR (including even emulated CALCOMP subroutines) graphics to display graphical output. The new code uses only current (version 3.1) NCAR subroutines.
An Illustrative Guide to the Minerva Framework
NASA Astrophysics Data System (ADS)
Flom, Erik; Leonard, Patrick; Hoeffel, Udo; Kwak, Sehyun; Pavone, Andrea; Svensson, Jakob; Krychowiak, Maciej; Wendelstein 7-X Team Collaboration
2017-10-01
Modern physics experiments require tracking and modelling data and their associated uncertainties on a large scale, as well as the combined implementation of multiple independent data streams for sophisticated modelling and analysis. The Minerva Framework offers a centralized, user-friendly method of large-scale physics modelling and scientific inference. Currently used by teams at multiple large-scale fusion experiments including the Joint European Torus (JET) and Wendelstein 7-X (W7-X), the Minerva framework provides a forward-model friendly architecture for developing and implementing models for large-scale experiments. One aspect of the framework involves so-called data sources, which are nodes in the graphical model. These nodes are supplied with engineering and physics parameters. When end-user level code calls a node, it is checked network-wide against its dependent nodes for changes since its last implementation and returns version-specific data. Here, a filterscope data node is used as an illustrative example of the Minerva Framework's data management structure and its further application to Bayesian modelling of complex systems. This work has been carried out within the framework of the EUROfusion Consortium and has received funding from the Euratom research and training programme 2014-2018 under Grant Agreement No. 633053.
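To make the node-based idea concrete, here is a minimal, hypothetical sketch of a versioned data-source node that recomputes only when it or an upstream node has changed; the class and method names are illustrative and are not the actual Minerva API.

```python
# Minimal sketch of a versioned data-source node in a dependency graph, in the spirit of
# the framework described above. Names are hypothetical, not the real Minerva interface.
class Node:
    def __init__(self, name, compute, parents=()):
        self.name, self.compute, self.parents = name, compute, list(parents)
        self.version = 0
        self._cache = None
        self._cached_versions = None

    def set_data(self):
        # Call after this node's engineering/physics parameters change.
        self.version += 1

    def get(self):
        upstream = tuple(p.get() for p in self.parents)
        versions = tuple(p.version for p in self.parents) + (self.version,)
        if versions != self._cached_versions:   # recompute only if anything changed
            self._cache = self.compute(*upstream)
            self._cached_versions = versions
        return self._cache

# Example: a "filterscope" node depending on a calibration node (both hypothetical).
calibration = Node("calibration", compute=lambda: 1.05)
filterscope = Node("filterscope",
                   compute=lambda cal: [cal * x for x in (0.2, 0.4, 0.8)],
                   parents=[calibration])
print(filterscope.get())
```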
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood, E.; Burton, E.; Duran, A.
Understanding the real-world power demand of modern automobiles is of critical importance to engineers using modeling and simulation to inform the intelligent design of increasingly efficient powertrains. Increased use of global positioning system (GPS) devices has made large scale data collection of vehicle speed (and associated power demand) a reality. While the availability of real-world GPS data has improved the industry's understanding of in-use vehicle power demand, relatively little attention has been paid to the incremental power requirements imposed by road grade. This analysis quantifies the incremental efficiency impacts of real-world road grade by appending high fidelity elevation profiles to GPS speed traces and performing a large simulation study. Employing a large real-world dataset from the National Renewable Energy Laboratory's Transportation Secure Data Center, vehicle powertrain simulations are performed with and without road grade under five vehicle models. Aggregate results of this study suggest that road grade could be responsible for 1% to 3% of fuel use in light-duty automobiles.
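As a rough illustration of why grade matters, the incremental tractive power for a climbing vehicle can be estimated from a point-mass model, P = m g v sin(theta); the parameter values below are assumptions for illustration, not values from the NREL dataset or simulations.

```python
# Illustrative estimate of the extra tractive power imposed by road grade, assuming a
# simple point-mass vehicle model (all parameter values are assumptions).
import math

mass_kg = 1600.0        # vehicle mass (assumed)
speed_mps = 25.0        # vehicle speed, ~90 km/h (assumed)
grade_percent = 3.0     # road grade (assumed)
g = 9.81

theta = math.atan(grade_percent / 100.0)
p_grade_kw = mass_kg * g * speed_mps * math.sin(theta) / 1000.0
print(f"Added power from a {grade_percent:.0f}% grade: {p_grade_kw:.1f} kW")  # ~11.8 kW
```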
Researching, Evaluating, and Choosing a Backup Service in the Cloud
ERIC Educational Resources Information Center
Hastings, Robin
2012-01-01
Backups are a modern fact of life. Every organization that has any kind of computing technology (and that is all of them these days) needs to back up its data in case of technological or user errors. Traditionally, large-scale backups have been done via an internal or external tape drive that takes magnetic tapes (minicassettes, essentially) and…
Microbial desulfurization of coal
NASA Technical Reports Server (NTRS)
Dastoor, M. N.; Kalvinskas, J. J.
1978-01-01
Experiments indicate that several sulfur-oxidizing bacteria strains have been very efficient in desulfurizing coal. Process occurs at room temperature and does not require large capital investments or high energy inputs. Process may expand use of abundant reserves of high-sulfur bituminous coal, which is currently restricted due to environmental pollution. On practical scale, process may be integrated with modern coal-slurry transportation lines.
NASA Technical Reports Server (NTRS)
Redmon, John W.; Shirley, Michael C.; Kinard, Paul S.
2012-01-01
This paper presents a method for performing large-scale design integration, taking a classical 2D drawing envelope and interface approach and applying it to modern three dimensional computer aided design (3D CAD) systems. Today, the paradigm often used when performing design integration with 3D models involves a digital mockup of an overall vehicle, in the form of a massive, fully detailed, CAD assembly; therefore, adding unnecessary burden and overhead to design and product data management processes. While fully detailed data may yield a broad depth of design detail, pertinent integration features are often obscured under the excessive amounts of information, making them difficult to discern. In contrast, the envelope and interface method results in a reduction in both the amount and complexity of information necessary for design integration while yielding significant savings in time and effort when applied to today's complex design integration projects. This approach, combining classical and modern methods, proved advantageous during the complex design integration activities of the Ares I vehicle. Downstream processes, benefiting from this approach by reducing development and design cycle time, include: Creation of analysis models for the Aerodynamic discipline; Vehicle to ground interface development; Documentation development for the vehicle assembly.
How institutions shaped the last major evolutionary transition to large-scale human societies
Powers, Simon T.; van Schaik, Carel P.; Lehmann, Laurent
2016-01-01
What drove the transition from small-scale human societies centred on kinship and personal exchange, to large-scale societies comprising cooperation and division of labour among untold numbers of unrelated individuals? We propose that the unique human capacity to negotiate institutional rules that coordinate social actions was a key driver of this transition. By creating institutions, humans have been able to move from the default ‘Hobbesian’ rules of the ‘game of life’, determined by physical/environmental constraints, into self-created rules of social organization where cooperation can be individually advantageous even in large groups of unrelated individuals. Examples include rules of food sharing in hunter–gatherers, rules for the usage of irrigation systems in agriculturalists, property rights and systems for sharing reputation between mediaeval traders. Successful institutions create rules of interaction that are self-enforcing, providing direct benefits both to individuals that follow them, and to individuals that sanction rule breakers. Forming institutions requires shared intentionality, language and other cognitive abilities largely absent in other primates. We explain how cooperative breeding likely selected for these abilities early in the Homo lineage. This allowed anatomically modern humans to create institutions that transformed the self-reliance of our primate ancestors into the division of labour of large-scale human social organization. PMID:26729937
Panoptes: web-based exploration of large scale genome variation data.
Vauterin, Paul; Jeffery, Ben; Miles, Alistair; Amato, Roberto; Hart, Lee; Wright, Ian; Kwiatkowski, Dominic
2017-10-15
The size and complexity of modern large-scale genome variation studies demand novel approaches for exploring and sharing the data. In order to unlock the potential of these data for a broad audience of scientists with various areas of expertise, a unified exploration framework is required that is accessible, coherent and user-friendly. Panoptes is an open-source software framework for collaborative visual exploration of large-scale genome variation data and associated metadata in a web browser. It relies on technology choices that allow it to operate in near real-time on very large datasets. It can be used to browse rich, hybrid content in a coherent way, and offers interactive visual analytics approaches to assist the exploration. We illustrate its application using genome variation data of Anopheles gambiae, Plasmodium falciparum and Plasmodium vivax. Freely available at https://github.com/cggh/panoptes, under the GNU Affero General Public License. paul.vauterin@gmail.com. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
Reeves, Anthony P; Xie, Yiting; Liu, Shuang
2017-04-01
With the advent of fully automated image analysis and modern machine learning methods, there is a need for very large image datasets having documented segmentations for both computer algorithm training and evaluation. This paper presents a method and implementation for facilitating such datasets that addresses the critical issue of size scaling for algorithm validation and evaluation; current evaluation methods that are usually used in academic studies do not scale to large datasets. This method includes protocols for the documentation of many regions in very large image datasets; the documentation may be incrementally updated by new image data and by improved algorithm outcomes. This method has been used for 5 years in the context of chest health biomarkers from low-dose chest CT images that are now being used with increasing frequency in lung cancer screening practice. The lung scans are segmented into over 100 different anatomical regions, and the method has been applied to a dataset of over 20,000 chest CT images. Using this framework, the computer algorithms have been developed to achieve over 90% acceptable image segmentation on the complete dataset.
Extreme-Scale De Novo Genome Assembly
DOE Office of Scientific and Technical Information (OSTI.GOV)
Georganas, Evangelos; Hofmeyr, Steven; Egan, Rob
De novo whole genome assembly reconstructs genomic sequence from short, overlapping, and potentially erroneous DNA segments and is one of the most important computations in modern genomics. This work presents HipMer, a high-quality end-to-end de novo assembler designed for extreme scale analysis, via efficient parallelization of the Meraculous code. Genome assembly software has many components, each of which stresses different components of a computer system. This chapter explains the computational challenges involved in each step of the HipMer pipeline, the key distributed data structures, and communication costs in detail. We present performance results of assembling the human genome and the large hexaploid wheat genome on large supercomputers up to tens of thousands of cores.
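The first stage of assemblers in the Meraculous/HipMer family is k-mer analysis over the input reads. The toy sketch below shows serial k-mer counting only; the real pipeline distributes this step with partitioned hash tables and filters erroneous k-mers, which is omitted here.

```python
# Toy k-mer counting pass, purely illustrative of the first stage of de novo assembly.
from collections import Counter

def count_kmers(reads, k):
    counts = Counter()
    for read in reads:
        for i in range(len(read) - k + 1):
            counts[read[i:i + k]] += 1
    return counts

reads = ["ACGTACGTGG", "CGTACGTGGA", "GTACGTGGAT"]   # hypothetical short reads
kmers = count_kmers(reads, k=5)
# High-count k-mers seed contig construction; low-count k-mers are treated as likely errors.
print(kmers.most_common(3))
```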
Extracellular matrix motion and early morphogenesis.
Loganathan, Rajprasad; Rongish, Brenda J; Smith, Christopher M; Filla, Michael B; Czirok, Andras; Bénazéraf, Bertrand; Little, Charles D
2016-06-15
For over a century, embryologists who studied cellular motion in early amniotes generally assumed that morphogenetic movement reflected migration relative to a static extracellular matrix (ECM) scaffold. However, as we discuss in this Review, recent investigations reveal that the ECM is also moving during morphogenesis. Time-lapse studies show how convective tissue displacement patterns, as visualized by ECM markers, contribute to morphogenesis and organogenesis. Computational image analysis distinguishes between cell-autonomous (active) displacements and convection caused by large-scale (composite) tissue movements. Modern quantification of large-scale 'total' cellular motion and the accompanying ECM motion in the embryo demonstrates that a dynamic ECM is required for generation of the emergent motion patterns that drive amniote morphogenesis. © 2016. Published by The Company of Biologists Ltd.
Cosmological neutrino simulations at extreme scale
Emberson, J. D.; Yu, Hao-Ran; Inman, Derek; ...
2017-08-01
Constraining neutrino mass remains an elusive challenge in modern physics. Precision measurements are expected from several upcoming cosmological probes of large-scale structure. Achieving this goal relies on an equal level of precision from theoretical predictions of neutrino clustering. Numerical simulations of the non-linear evolution of cold dark matter and neutrinos play a pivotal role in this process. We incorporate neutrinos into the cosmological N-body code CUBEP3M and discuss the challenges associated with pushing to the extreme scales demanded by the neutrino problem. We highlight code optimizations made to exploit modern high performance computing architectures and present a novel method of data compression that reduces the phase-space particle footprint from 24 bytes in single precision to roughly 9 bytes. We scale the neutrino problem to the Tianhe-2 supercomputer and provide details of our production run, named TianNu, which uses 86% of the machine (13,824 compute nodes). With a total of 2.97 trillion particles, TianNu is currently the world’s largest cosmological N-body simulation and improves upon previous neutrino simulations by two orders of magnitude in scale. We finish with a discussion of the unanticipated computational challenges that were encountered during the TianNu runtime.
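One plausible way to shrink a particle's phase-space footprint, sketched below with assumed box and mesh sizes, is to store positions as small fixed-point offsets within a coarse mesh cell; this illustrates the general idea only and is not necessarily the exact scheme used in TianNu.

```python
# Sketch of fixed-point position compression relative to a coarse mesh (illustrative only).
import numpy as np

box, ncell = 100.0, 64                  # box size (Mpc/h, assumed) and coarse mesh per side
cell = box / ncell

pos = np.random.uniform(0, box, size=(5, 3)).astype(np.float32)  # 3 x 4-byte floats per particle
cell_index = np.floor(pos / cell).astype(np.uint16)  # in practice implied by the cell a particle occupies
frac = pos / cell - cell_index                        # fractional position inside the cell, in [0, 1)
offset = np.minimum(np.floor(frac * 256), 255).astype(np.uint8)  # 3 x 1-byte fixed-point offsets

# Decompression recovers positions to within half a quantization step (cell / 512).
pos_approx = (cell_index + (offset + 0.5) / 256.0) * cell
print(float(np.abs(pos_approx - pos).max()), "maximum position error")
```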
Foundational perspectives on causality in large-scale brain networks
NASA Astrophysics Data System (ADS)
Mannino, Michael; Bressler, Steven L.
2015-12-01
A profusion of recent work in cognitive neuroscience has been concerned with the endeavor to uncover causal influences in large-scale brain networks. However, despite the fact that many papers give a nod to the important theoretical challenges posed by the concept of causality, this explosion of research has generally not been accompanied by a rigorous conceptual analysis of the nature of causality in the brain. This review provides both a descriptive and prescriptive account of the nature of causality as found within and between large-scale brain networks. In short, it seeks to clarify the concept of causality in large-scale brain networks both philosophically and scientifically. This is accomplished by briefly reviewing the rich philosophical history of work on causality, especially focusing on contributions by David Hume, Immanuel Kant, Bertrand Russell, and Christopher Hitchcock. We go on to discuss the impact that various interpretations of modern physics have had on our understanding of causality. Throughout all this, a central focus is the distinction between theories of deterministic causality (DC), whereby causes uniquely determine their effects, and probabilistic causality (PC), whereby causes change the probability of occurrence of their effects. We argue that, given the topological complexity of its large-scale connectivity, the brain should be considered as a complex system and its causal influences treated as probabilistic in nature. We conclude that PC is well suited for explaining causality in the brain for three reasons: (1) brain causality is often mutual; (2) connectional convergence dictates that only rarely is the activity of one neuronal population uniquely determined by another one; and (3) the causal influences exerted between neuronal populations may not have observable effects. A number of different techniques are currently available to characterize causal influence in the brain. Typically, these techniques quantify the statistical likelihood that a change in the activity of one neuronal population affects the activity in another. We argue that these measures access the inherently probabilistic nature of causal influences in the brain, and are thus better suited for large-scale brain network analysis than are DC-based measures. Our work is consistent with recent advances in the philosophical study of probabilistic causality, which originated from inherent conceptual problems with deterministic regularity theories. It also resonates with concepts of stochasticity that were involved in establishing modern physics. In summary, we argue that probabilistic causality is a conceptually appropriate foundation for describing neural causality in the brain.
Foundational perspectives on causality in large-scale brain networks.
Mannino, Michael; Bressler, Steven L
2015-12-01
A profusion of recent work in cognitive neuroscience has been concerned with the endeavor to uncover causal influences in large-scale brain networks. However, despite the fact that many papers give a nod to the important theoretical challenges posed by the concept of causality, this explosion of research has generally not been accompanied by a rigorous conceptual analysis of the nature of causality in the brain. This review provides both a descriptive and prescriptive account of the nature of causality as found within and between large-scale brain networks. In short, it seeks to clarify the concept of causality in large-scale brain networks both philosophically and scientifically. This is accomplished by briefly reviewing the rich philosophical history of work on causality, especially focusing on contributions by David Hume, Immanuel Kant, Bertrand Russell, and Christopher Hitchcock. We go on to discuss the impact that various interpretations of modern physics have had on our understanding of causality. Throughout all this, a central focus is the distinction between theories of deterministic causality (DC), whereby causes uniquely determine their effects, and probabilistic causality (PC), whereby causes change the probability of occurrence of their effects. We argue that, given the topological complexity of its large-scale connectivity, the brain should be considered as a complex system and its causal influences treated as probabilistic in nature. We conclude that PC is well suited for explaining causality in the brain for three reasons: (1) brain causality is often mutual; (2) connectional convergence dictates that only rarely is the activity of one neuronal population uniquely determined by another one; and (3) the causal influences exerted between neuronal populations may not have observable effects. A number of different techniques are currently available to characterize causal influence in the brain. Typically, these techniques quantify the statistical likelihood that a change in the activity of one neuronal population affects the activity in another. We argue that these measures access the inherently probabilistic nature of causal influences in the brain, and are thus better suited for large-scale brain network analysis than are DC-based measures. Our work is consistent with recent advances in the philosophical study of probabilistic causality, which originated from inherent conceptual problems with deterministic regularity theories. It also resonates with concepts of stochasticity that were involved in establishing modern physics. In summary, we argue that probabilistic causality is a conceptually appropriate foundation for describing neural causality in the brain. Copyright © 2015 Elsevier B.V. All rights reserved.
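The probabilistic-causality view can be made concrete with a toy calculation on synthetic binarized activity: a "cause" raises the probability of a subsequent "effect" without uniquely determining it. The data and coupling strength below are synthetic and purely illustrative.

```python
# Toy illustration of probabilistic (rather than deterministic) causal influence between
# two neuronal populations; all data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
a = rng.random(10_000) < 0.3                        # binarized activity of population A
b = rng.random(10_000) < 0.1 + 0.4 * np.roll(a, 1)  # B is more likely to fire after A fires

p_b = b[1:].mean()
p_b_given_a = b[1:][a[:-1]].mean()
print(f"P(B) = {p_b:.2f}, P(B | A fired one step earlier) = {p_b_given_a:.2f}")
```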
The diurnal interaction between convection and peninsular-scale forcing over South Florida
NASA Technical Reports Server (NTRS)
Cooper, H. J.; Simpson, J.; Garstang, M.
1982-01-01
One of the outstanding problems in modern meteorology is that of describing in detail the manner in which larger scales of motion interact with, influence and are influenced by successively smaller scales of motion. The present investigation is concerned with a study of the diurnal evolution of convection, the interaction between the peninsular-scale convergence and convection, and the role of the feedback produced by the cloud-scale downdrafts in the maintenance of the convection. Attention is given to the analysis, the diurnal cycle of the network area-averaged divergence, convective-scale divergence, convective mass transports, and the peninsular scale divergence. The links established in the investigation between the large scale (peninsular), the mesoscale (network), and the convective scale (cloud) are found to be of fundamental importance to the understanding of the initiation, maintenance, and decay of deep precipitating convection and to its theoretical parameterization.
Introduction to Methods of Approximation in Physics and Astronomy
NASA Astrophysics Data System (ADS)
van Putten, Maurice H. P. M.
2017-04-01
Modern astronomy reveals an evolving Universe rife with transient sources, mostly discovered - few predicted - in multi-wavelength observations. Our window of observations now includes electromagnetic radiation, gravitational waves and neutrinos. For the practicing astronomer, these are highly interdisciplinary developments that pose a novel challenge to be well-versed in astroparticle physics and data analysis. In realizing the full discovery potential of these multimessenger approaches, the latter increasingly involves high-performance supercomputing. These lecture notes developed out of lectures on mathematical-physics in astronomy to advanced undergraduate and beginning graduate students. They are organised to be largely self-contained, starting from basic concepts and techniques in the formulation of problems and methods of approximation commonly used in computation and numerical analysis. This includes root finding, integration, signal detection algorithms involving the Fourier transform and examples of numerical integration of ordinary differential equations and some illustrative aspects of modern computational implementation. In the applications, considerable emphasis is put on fluid dynamical problems associated with accretion flows, as these are responsible for a wealth of high energy emission phenomena in astronomy. The topics chosen are largely aimed at phenomenological approaches, to capture main features of interest by effective methods of approximation at a desired level of accuracy and resolution. Formulated in terms of a system of algebraic, ordinary or partial differential equations, this may be pursued by perturbation theory through expansions in a small parameter or by direct numerical computation. Successful application of these methods requires a robust understanding of asymptotic behavior, errors and convergence. In some cases, the number of degrees of freedom may be reduced, e.g., for the purpose of (numerical) continuation or to identify secular behavior. For instance, secular evolution of orbital parameters may derive from averaging over essentially periodic behavior on relatively short, orbital periods. When the original number of degrees of freedom is large, averaging over dynamical time scales may lead to a formulation in terms of a system in approximately thermodynamic equilibrium subject to evolution on a secular time scale by a regular or singular perturbation. In modern astrophysics and cosmology, gravitation is being probed across an increasingly broad range of scales and more accurately so than ever before. These observations probe weak gravitational interactions below what is encountered in our solar system by many orders of magnitude. These observations hereby probe (curved) spacetime at low energy scales that may reveal novel properties hitherto unanticipated in the classical vacuum of Newtonian mechanics and Minkowski spacetime. Dark energy and dark matter encountered on the scales of galaxies and beyond, therefore, may be, in part, revealing our ignorance of the vacuum at the lowest energy scales encountered in cosmology. In this context, our application of Newtonian mechanics to globular clusters, galaxies and cosmology is an approximation assuming a classical vacuum, ignoring the potential for hidden low energy scales emerging on cosmological scales. Given our ignorance of the latter, this poses a challenge in the potential for unknown systematic deviations. If of quantum mechanical origin, such deviations are often referred to as anomalies. 
While they are small in traditional, macroscopic Newtonian experiments in the laboratory, the same is not a given in the limit of arbitrarily weak gravitational interactions. We hope this selection of introductory material is useful and kindles the reader's interest to become a creative member of modern astrophysics and cosmology.
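As one concrete example of the numerical methods covered in such notes, the sketch below integrates a simple ordinary differential equation with the classical fourth-order Runge-Kutta scheme and shows the expected h^4 convergence of the global error; the test problem is chosen only for illustration.

```python
# Classical RK4 integration of dy/dt = -y, with the global error shrinking roughly as h^4.
import numpy as np

def rk4(f, y0, t0, t1, n):
    h, y, t = (t1 - t0) / n, y0, t0
    for _ in range(n):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h * k1 / 2)
        k3 = f(t + h / 2, y + h * k2 / 2)
        k4 = f(t + h, y + h * k3)
        y += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        t += h
    return y

f = lambda t, y: -y
for n in (10, 20, 40):
    err = abs(rk4(f, 1.0, 0.0, 1.0, n) - np.exp(-1))
    print(f"n = {n:3d}  error = {err:.2e}")   # error drops ~16x for each halving of h
```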
Automation of Hessian-Based Tubularity Measure Response Function in 3D Biomedical Images.
Dzyubak, Oleksandr P; Ritman, Erik L
2011-01-01
The blood vessels and nerve trees consist of tubular objects interconnected into a complex tree- or web-like structure that spans a range of structural scales, from 5 μm diameter capillaries to the 3 cm aorta. This large range of scales presents two major problems; one is just making the measurements, and the other is the exponential increase of component numbers with decreasing scale. With the remarkable increase in the volume imaged by, and resolution of, modern day 3D imagers, it is almost impossible to manually track the complex multiscale parameters from those large image data sets. In addition, manual tracking is quite subjective and unreliable. We propose a solution for automation of an adaptive nonsupervised system for tracking tubular objects based on a multiscale framework and use of a Hessian-based object shape detector incorporating National Library of Medicine Insight Segmentation and Registration Toolkit (ITK) image processing libraries.
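A multiscale Hessian-based tubularity response can be sketched with scikit-image's Frangi vesselness filter (assuming a recent scikit-image release) as a stand-in for the ITK-based pipeline described above; the synthetic volume and scale range are illustrative only.

```python
# Multiscale Hessian-based tubularity (vesselness) response on a synthetic 3D volume,
# using scikit-image's Frangi filter as an illustrative stand-in for the ITK pipeline.
import numpy as np
from skimage.filters import frangi

# Synthetic volume containing a bright tube of radius ~3 voxels along the x-axis.
z, y, x = np.mgrid[0:64, 0:64, 0:64]
volume = (np.hypot(z - 32, y - 32) < 3).astype(float)
volume += 0.05 * np.random.default_rng(0).standard_normal(volume.shape)

# Evaluate the vesselness response over several scales (sigmas); the maximum over scales
# highlights tubes of different radii, mimicking the multiscale framework.
response = frangi(volume, sigmas=(1, 2, 3, 4), black_ridges=False)
print(response.shape, float(response.max()))
```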
Medical image classification based on multi-scale non-negative sparse coding.
Zhang, Ruijie; Shen, Jian; Wei, Fushan; Li, Xiong; Sangaiah, Arun Kumar
2017-11-01
With the rapid development of modern medical imaging technology, medical image classification has become more and more important in medical diagnosis and clinical practice. Conventional medical image classification algorithms usually neglect the semantic gap problem between low-level features and high-level image semantics, which largely degrades classification performance. To solve this problem, we propose a multi-scale non-negative sparse coding based medical image classification algorithm. Firstly, medical images are decomposed into multiple scale layers, so that diverse visual details can be extracted from different scale layers. Secondly, for each scale layer, a non-negative sparse coding model with Fisher discriminative analysis is constructed to obtain a discriminative sparse representation of medical images. Then, the obtained multi-scale non-negative sparse coding features are combined to form a multi-scale feature histogram as the final representation of a medical image. Finally, an SVM classifier is used to conduct medical image classification. The experimental results demonstrate that our proposed algorithm can effectively utilize multi-scale and contextual spatial information of medical images, reduce the semantic gap to a large degree and improve medical image classification performance. Copyright © 2017 Elsevier B.V. All rights reserved.
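A minimal sketch of the multi-scale non-negative coding idea is shown below, using scikit-learn's DictionaryLearning with positivity constraints as a stand-in for the paper's formulation; the Fisher discriminative term is omitted, and the image, patch size and dictionary size are placeholder assumptions.

```python
# Minimal sketch of multi-scale non-negative sparse coding for image description.
import numpy as np
from sklearn.decomposition import DictionaryLearning
from sklearn.feature_extraction.image import extract_patches_2d

rng = np.random.default_rng(0)
image = rng.random((64, 64))                      # placeholder for a medical image

histograms = []
for scale in (1, 2):                              # simple two-level scale pyramid (assumed)
    img = image[::scale, ::scale]
    patches = extract_patches_2d(img, (6, 6), max_patches=200, random_state=0)
    X = patches.reshape(len(patches), -1)
    dico = DictionaryLearning(n_components=16, alpha=1.0, max_iter=10,
                              fit_algorithm="cd", transform_algorithm="lasso_cd",
                              positive_code=True, positive_dict=True, random_state=0)
    codes = dico.fit_transform(X)                 # non-negative sparse codes per patch
    histograms.append(codes.sum(axis=0))          # pool codes into a per-scale histogram

feature = np.concatenate(histograms)              # final multi-scale representation
print(feature.shape)                              # (32,) -> would be fed to an SVM classifier
```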
Ferrier, Ken; Perron, J. Taylor; Mukhopadhyay, Sujoy; Rosener, Matt; Stock, Jonathan; Slosberg, Michelle; Huppert, Kimberly L.
2013-01-01
Erosion of volcanic ocean islands creates dramatic landscapes, modulates Earth’s carbon cycle, and delivers sediment to coasts and reefs. Because many volcanic islands have large climate gradients and minimal variations in lithology and tectonic history, they are excellent natural laboratories for studying climatic effects on the evolution of topography. Despite concerns that modern sediment fluxes to island coasts may exceed long-term fluxes, little is known about how erosion rates and processes vary across island interiors, how erosion rates are influenced by the strong climate gradients on many islands, and how modern island erosion rates compare to long-term rates. Here, we present new measurements of erosion rates over 5 yr to 5 m.y. timescales on the Hawaiian island of Kaua‘i, across which mean annual precipitation ranges from 0.5 to 9.5 m/yr. Eroded rock volumes from basins across Kaua‘i indicate that million-year-scale erosion rates are correlated with modern mean annual precipitation and range from 8 to 335 t km⁻² yr⁻¹. In Kaua‘i’s Hanalei River basin, ³He concentrations in detrital olivines imply millennial-scale erosion rates of >126 to >390 t km⁻² yr⁻¹ from olivine-bearing hillslopes, while fluvial suspended sediment fluxes measured from 2004 to 2009 plus estimates of chemical and bed-load fluxes imply basin-averaged erosion rates of 545 ± 128 t km⁻² yr⁻¹. Mapping of landslide scars in satellite imagery of the Hanalei basin from 2004 and 2010 implies landslide-driven erosion rates of 30–47 t km⁻² yr⁻¹. These measurements imply that modern erosion rates in the Hanalei basin are no more than 2.3 ± 0.6 times faster than millennial-scale erosion rates, and, to the extent that modern precipitation patterns resemble long-term patterns, they are consistent with a link between precipitation rates and long-term erosion rates.
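The basin-averaged rate quoted above is essentially a mass flux divided by drainage area. A back-of-the-envelope version with assumed round numbers (not the actual Hanalei measurements) looks like this:

```python
# Conversion from sediment flux to a basin-averaged erosion rate in t km^-2 yr^-1;
# all input numbers are assumptions for illustration.
suspended_t_per_yr = 25_000.0    # suspended sediment flux (assumed)
chemical_t_per_yr = 5_000.0      # chemical denudation flux (assumed)
bedload_t_per_yr = 3_000.0       # bed-load flux (assumed)
basin_area_km2 = 60.0            # drainage area (assumed)

erosion_rate = (suspended_t_per_yr + chemical_t_per_yr + bedload_t_per_yr) / basin_area_km2
print(f"Basin-averaged erosion rate: {erosion_rate:.0f} t km^-2 yr^-1")  # -> 550
```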
An S_N Algorithm for Modern Architectures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, Randal Scott
2016-08-29
LANL discrete ordinates transport packages are required to perform large, computationally intensive time-dependent calculations on massively parallel architectures, where even a single such calculation may need many months to complete. While KBA methods scale out well to very large numbers of compute nodes, we are limited by practical constraints on the number of such nodes we can actually apply to any given calculation. Instead, we describe a modified KBA algorithm that allows realization of the reductions in solution time offered by both the current, and future, architectural changes within a compute node.
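The KBA approach exploits the wavefront structure of a transport sweep: a cell can be solved once its upwind neighbours are done, so cells on the same diagonal are independent. The toy 2D sketch below only records that ordering and is illustrative, not the LANL production algorithm.

```python
# Toy 2D wavefront (sweep) ordering of the kind underlying KBA-style transport solvers:
# each cell becomes ready one step after its upwind neighbours for a given sweep angle.
import numpy as np

nx, ny = 6, 4
wavefront = np.zeros((nx, ny), dtype=int)
for i in range(nx):
    for j in range(ny):
        upwind = [wavefront[i - 1, j] if i > 0 else -1,
                  wavefront[i, j - 1] if j > 0 else -1]
        wavefront[i, j] = max(upwind) + 1
print(wavefront)   # equals i + j: cells on the same diagonal can be processed concurrently
```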
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents the computer printout of an analysis of data on international conflict over a three-year period. Part of a large scale research project to test various theories with regard to their power in analyzing international relations, this monograph presents data on the application of discriminant function analysis to 'topdog'…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents the computer printout of an analysis of data on international conflict over a three-year period. Part of a large scale research project to test various theories with regard to their power in analyzing international relations, this monograph presents data on the application of discriminant function analysis to combined…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents an analysis of data on international cooperation over a three-year period. Part of a large scale research project to test various theories with regard to their power in analyzing international relations, this monograph presents the computer printout of data on the application of second stage factor analysis of 'underdog'…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents findings from an analysis of data on international cooperation over a three-year period. Computer printout of the analysis is included. Part of a large scale research project to test various theories with regard to their ability to analyze international relations, this monograph reports on the testing of relative status…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents the computer printout of an analysis of data on international cooperation over a three-year period. Part of a large scale research project to test various theories with regard to their power in analyzing international relations, this monograph presents data on the application of discriminant function analysis of combined…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents the computer printout of an analysis of data on international conflict over a three-year period. Part of a large scale research project to test various theories with regard to their power in analyzing international relations, this monograph presents data on the application of discriminant function analysis of 'underdog'…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents the computer printout of an analysis of data on international cooperation over a three-year period. Part of a large scale research project to test various theories with regard to their power in analyzing international relations, this monograph presents data on the application of discriminant function analysis to combined…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents the computer printout of an analysis of data on international conflict over a three-year period. Part of a large scale research project to test various theories with regard to their power in analyzing international relations, this monograph presents data on the application of second stage factor analysis of combined…
ERIC Educational Resources Information Center
Truijen, K. J. P.; Sleegers, P. J. C.; Meelissen, M. R. M.; Nieuwenhuis, A. F. M.
2013-01-01
Purpose: At a time when secondary vocational education is implementing competence-based education (CBE) on a large scale, to adapt to the needs of students and of the labour market in a modern society, many vocational schools have recognised that interdisciplinary teacher teams are an important condition for this implementation. In order to…
A comparative study of modern and fossil cone scales and seeds of conifers: A geochemical approach
Artur, Stankiewicz B.; Mastalerz, Maria; Kruge, M.A.; Van Bergen, P. F.; Sadowska, A.
1997-01-01
Modern cone scales and seeds of Pinus strobus and Sequoia sempervirens, and their fossil (Upper Miocene, c. 6 Ma) counterparts Pinus leitzii and Sequoia langsdorfi have been studied using pyrolysis-gas chromatography/mass spectrometry (Py-GC/MS), electron-microprobe and scanning electron microscopy. Microscopic observations revealed only minor microbial activity and high-quality structural preservation of the fossil material. The pyrolysates of both modern genera showed the presence of ligno-cellulose characteristic of conifers. However, the abundance of (alkylated)phenols and 1,2-benzenediols in modern S. sempervirens suggests the presence of non-hydrolysable tannins or abundant polyphenolic moieties not previously reported in modern conifers. The marked differences between the pyrolysis products of both modern genera are suggested to be of chemosystematic significance. The fossil samples also contained ligno-cellulose which exhibited only partial degradation, primarily of the carbohydrate constituents. Comparison between the fossil cone scale and seed pyrolysates indicated that the ligno-cellulose complex present in the seeds is chemically more resistant than that in the cone scales. Principal component analysis (PCA) of the pyrolysis data allowed for the determination of the discriminant functions used to assess the extent of degradation and the chemosystematic differences between both genera and between cone scales and seeds. Elemental composition (C, O, S), obtained using electron-microprobe, corroborated the pyrolysis results. Overall, the combination of chemical, microscopic and statistical methods allowed for a detailed characterization and chemosystematic interpretations of modern and fossil conifer cone scales and seeds.
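The statistical step can be illustrated with a generic principal component analysis of a samples-by-pyrolysis-products abundance matrix; the matrix below is random placeholder data, not the study's Py-GC/MS measurements.

```python
# Generic PCA of a samples x pyrolysis-products abundance matrix (placeholder data only).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.random((12, 30))                       # 12 samples x 30 pyrolysate abundances (assumed)

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
# The first two component scores would then be inspected for groupings by genus,
# tissue type (cone scale vs. seed) and degree of degradation.
print(scores.shape)                            # (12, 2)
```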
How do you modernize a health service? A realist evaluation of whole-scale transformation in London.
Greenhalgh, Trisha; Humphrey, Charlotte; Hughes, Jane; Macfarlane, Fraser; Butler, Ceri; Pawson, Ray
2009-06-01
Large-scale, whole-systems interventions in health care require imaginative approaches to evaluation that go beyond assessing progress against predefined goals and milestones. This project evaluated a major change effort in inner London, funded by a charitable donation of approximately $21 million, which spanned four large health care organizations, covered three services (stroke, kidney, and sexual health), and sought to "modernize" these services with a view to making health care more efficient, effective, and patient centered. This organizational case study draws on the principles of realist evaluation, a largely qualitative approach that is centrally concerned with testing and refining program theories by exploring the complex and dynamic interaction among context, mechanism, and outcome. This approach used multiple data sources and methods in a pragmatic and reflexive manner to build a picture of the case and follow its fortunes over the three-year study period. The methods included ethnographic observation, semistructured interviews, and scrutiny of documents and other contemporaneous materials. As well as providing ongoing formative feedback to the change teams in specific areas of activity, we undertook a more abstract, interpretive analysis, which explored the context-mechanism-outcome relationship using the guiding question "what works, for whom, under what circumstances?" In this example of large-scale service transformation, numerous projects and subprojects emerged, fed into one another, and evolved over time. Six broad mechanisms appeared to be driving the efforts of change agents: integrating services across providers, finding and using evidence, involving service users in the modernization effort, supporting self-care, developing the workforce, and extending the range of services. Within each of these mechanisms, different teams chose widely differing approaches and met with differing success. The realist analysis of the fortunes of different subprojects identified aspects of context and mechanism that accounted for observed outcomes (both intended and unintended). This study was one of the first applications of realist evaluation to a large-scale change effort in health care. Even when an ambitious change program shifts from its original goals and meets unforeseen challenges (indeed, precisely because the program morphs and adapts over time), realist evaluation can draw useful lessons about how particular preconditions make particular outcomes more likely, even though it cannot produce predictive guidance or a simple recipe for success. Noting recent calls by others for the greater use of realist evaluation in health care, this article considers some of the challenges and limitations of this method in the light of this experience and suggests that its use will require some fundamental changes in the worldview of some health services researchers.
Lembke-Jene, Lester; Tiedemann, Ralf; Nürnberg, Dirk; Gong, Xun; Lohmann, Gerrit
2018-05-22
The Pacific hosts the largest oxygen minimum zones (OMZs) in the world ocean, which are thought to intensify and expand under future climate change, with significant consequences for marine ecosystems, biogeochemical cycles, and fisheries. At present, no deep ventilation occurs in the North Pacific due to a persistent halocline, but relatively better-oxygenated subsurface North Pacific Intermediate Water (NPIW) mitigates OMZ development in lower latitudes. Over the past decades, instrumental data show decreasing oxygenation in NPIW; however, long-term variations in middepth ventilation are potentially large, obscuring anthropogenic influences against millennial-scale natural background shifts. Here, we use paleoceanographic proxy evidence from the Okhotsk Sea, the foremost North Pacific ventilation region, to show that its modern oxygenated pattern is a relatively recent feature, with little to no ventilation before six thousand years ago, constituting an apparent Early-Middle Holocene (EMH) threshold or "tipping point." Complementary paleomodeling results likewise indicate a warmer, saltier EMH NPIW, different from its modern conditions. During the EMH, the Okhotsk Sea switched from a modern oxygenation source to a sink, through a combination of sea ice loss, higher water temperatures, and remineralization rates, inhibiting ventilation. We estimate a strongly decreased EMH NPIW oxygenation of ∼30 to 50%, and increased middepth Pacific nutrient concentrations and carbon storage. Our results (i) imply that under past or future warmer-than-present conditions, oceanic biogeochemical feedback mechanisms may change or even switch direction, and (ii) provide constraints on the high-latitude North Pacific's influence on mesopelagic ventilation dynamics, with consequences for large oceanic regions. Copyright © 2018 the Author(s). Published by PNAS.
Herculano-Houzel, Suzana; Kaas, Jon H.
2011-01-01
Gorillas and orangutans are primates at least as large as humans, but their brains amount to about one third of the size of the human brain. This discrepancy has been used as evidence that the human brain is about 3 times larger than it should be for a primate species of its body size. In contrast to the view that the human brain is special in its size, we have suggested that it is the great apes that might have evolved bodies that are unusually large, on the basis of our recent finding that the cellular composition of the human brain matches that expected for a primate brain of its size, making the human brain a linearly scaled-up primate brain in its number of cells. To investigate whether the brain of great apes also conforms to the primate cellular scaling rules identified previously, we determine the numbers of neuronal and other cells that compose the orangutan and gorilla cerebella, use these numbers to calculate the size of the brain and of the cerebral cortex expected for these species, and show that these match the sizes described in the literature. Our results suggest that the brains of great apes also scale linearly in their numbers of neurons like other primate brains, including humans. The conformity of great apes and humans to the linear cellular scaling rules that apply to other primates that diverged earlier in primate evolution indicates that prehistoric Homo species as well as other hominins must have had brains that conformed to the same scaling rules, irrespective of their body size. We then used those scaling rules and published estimated brain volumes for various hominin species to predict the numbers of neurons that composed their brains. We predict that Homo heidelbergensis and Homo neanderthalensis had brains with approximately 80 billion neurons, within the range of variation found in modern Homo sapiens. We propose that while the cellular scaling rules that apply to the primate brain have remained stable in hominin evolution (since they apply to simians, great apes and modern humans alike), the Colobinae and Pongidae lineages favored marked increases in body size rather than brain size from the common ancestor with the Homo lineage, while the Homo lineage seems to have favored a large brain instead of a large body, possibly due to the metabolic limitations to having both. PMID:21228547
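As a rough, back-of-the-envelope illustration of the kind of prediction described above, the sketch below extrapolates neuron numbers linearly from a modern-human reference point, following the abstract's claim that the human brain is a "linearly scaled-up primate brain." This is a toy calculation, not the authors' fitted cellular scaling rules, and the brain volumes and the 86-billion-neuron reference are illustrative values.

```python
# Toy consistency check (not the authors' fitted scaling rules): if primate
# brains are "linearly scaled-up" in cell numbers, neuron count should grow
# roughly in proportion to brain volume.  All values below are illustrative.
MODERN_HUMAN_NEURONS = 86e9    # often-cited estimate for Homo sapiens
MODERN_HUMAN_VOLUME_CC = 1400  # illustrative average brain volume, cm^3

def predicted_neurons(brain_volume_cc):
    """Linear extrapolation from the modern-human reference point."""
    return MODERN_HUMAN_NEURONS * brain_volume_cc / MODERN_HUMAN_VOLUME_CC

for species, volume_cc in [("H. heidelbergensis", 1250),
                           ("H. neanderthalensis", 1420)]:
    print(f"{species}: ~{predicted_neurons(volume_cc) / 1e9:.0f} billion neurons")
```

With these assumed volumes the extrapolation lands near 80 billion neurons, consistent with the range quoted in the abstract.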
MicroEcos: Micro-Scale Explorations of Large-Scale Late Pleistocene Ecosystems
NASA Astrophysics Data System (ADS)
Gellis, B. S.
2017-12-01
Pollen data can inform the reconstruction of early floral environments by grounding artistic representations of what early terrestrial ecosystems looked like and how existing landscapes have evolved. For example, what did the Bighorn Basin look like when large ice sheets covered modern Canada, the Yellowstone Plateau had an ice cap, and the Bighorn Mountains were mantled with alpine glaciers? MicroEcos is an immersive, multimedia project that aims to strengthen human-nature connections through the understanding and appreciation of biological ecosystems. The collected pollen data elucidate Late Pleistocene flora that are visible in the fossil record and have been illustrated and described in the botanical literature. The project aims to make scientific data accessible and interesting to all audiences through a series of interactive digital sculptures, large-scale photography, and field-based videography. While driven by scientific data, it is rooted in artistic and outreach-based practices, including digital design, illustration, photography, video, and sound design. Using 3D modeling and printing technology, MicroEcos centers on a series of 3D-printed models of the Last Canyon rock shelter on the Wyoming-Montana border, the Little Windy Hill pond site in Wyoming's Medicine Bow National Forest, and the Natural Trap Cave site in Wyoming's Big Horn Basin. These digital, interactive 3D sculptures give audiences glimpses of three-dimensional Late Pleistocene environments and help create dialogue about how grass-, sagebrush-, and spruce-based ecosystems form. To help audiences contextualize how MicroEcos bridges notions of time, space, and place, modern photography and videography of the Last Canyon, Little Windy Hill, and Natural Trap Cave sites surround these 3D digital reconstructions.
Sedimentary processes of the Bagnold Dunes: Implications for the eolian rock record of Mars
Lapotre, M. G. A.; Lewis, K. W.; Day, M.; Stein, N.; Rubin, D. M.; Sullivan, R.; Banham, S.; Lamb, M. P.; Bridges, N. T.; Gupta, S.; Fischer, W. W.
2017-01-01
The Mars Science Laboratory rover Curiosity visited two active wind-blown sand dunes within Gale crater, Mars, which provided the first ground-based opportunity to compare Martian and terrestrial eolian dune sedimentary processes and study a modern analog for the Martian eolian rock record. Orbital and rover images of these dunes reveal terrestrial-like and uniquely Martian processes. As on terrestrial dunes, grainfall, grainflow, and impact ripples were present. Impact ripples were present on all dune slopes and had a size and shape similar to their terrestrial counterparts. Grainfall and grainflow occurred on dune and large-ripple lee slopes. Lee slopes were ~29° where grainflows were present and ~33° where grainfall was present. These slopes are interpreted as the dynamic and static angles of repose, respectively. Grain size measured on an undisturbed impact ripple ranges between 50 μm and 350 μm, with an intermediate-axis mean size of 113 μm (median: 103 μm). Dissimilar to dune eolian processes on Earth, large, meter-scale ripples were present on all dune slopes. Large ripples had nearly symmetric to strongly asymmetric topographic profiles and heights ranging between 12 cm and 28 cm. The composite observations of the modern sedimentary processes highlight that the Martian eolian rock record is likely different from its terrestrial counterpart because of the large ripples, which are expected to engender a unique scale of cross stratification. More broadly, however, in the Bagnold Dune Field as on Earth, dune-field pattern dynamics and basin-scale boundary conditions will dictate the style and distribution of sedimentary processes. PMID:29497590
Predicting spatio-temporal failure in large scale observational and micro scale experimental systems
NASA Astrophysics Data System (ADS)
de las Heras, Alejandro; Hu, Yong
2006-10-01
Forecasting has become an essential part of modern thought, but the practical limitations are still manifold. We addressed future rates of change by comparing models that take into account time and models that focus more on space. Cox regression confirmed that linear change can be safely assumed in the short term. Spatially explicit Poisson regression provided a ceiling value for the number of deforestation spots. With several observed and estimated rates available, we chose to forecast using the more robust assumptions. A Markov-chain cellular automaton thus projected 5-year deforestation in the Amazonian Arc of Deforestation, showing that even a stable rate of change would largely deplete the forest area. More generally, the resolution and implementation of the existing models could explain many of the modelling difficulties still affecting forecasting.
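As a minimal sketch of the kind of Markov-chain cellular automaton mentioned above, the toy model below clears forested cells with a probability that rises with the number of already deforested neighbors and iterates it for five annual steps. The transition probabilities, neighborhood rule, and periodic boundaries are assumptions for illustration, not the authors' calibrated model.

```python
import numpy as np

def ca_step(forest, p_base=0.01, p_neighbor=0.05, rng=None):
    """One annual step of a toy Markov-chain cellular automaton.
    forest: 2D boolean array, True = forested cell.  A forested cell is
    cleared with a probability that grows with the number of already
    deforested neighbors (illustrative parameters, periodic boundaries)."""
    rng = np.random.default_rng(0) if rng is None else rng
    cleared = (~forest).astype(int)
    neighbors = (np.roll(cleared, 1, 0) + np.roll(cleared, -1, 0) +
                 np.roll(cleared, 1, 1) + np.roll(cleared, -1, 1))
    p_clear = np.clip(p_base + p_neighbor * neighbors, 0.0, 1.0)
    return forest & (rng.random(forest.shape) >= p_clear)

rng = np.random.default_rng(42)
forest = np.ones((200, 200), dtype=bool)
forest[100, 100] = False                 # seed a single deforestation spot
for _ in range(5):                       # 5-year projection horizon
    forest = ca_step(forest, rng=rng)
print("forest fraction after 5 steps:", round(forest.mean(), 3))
```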
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chiang, Nai-Yuan; Zavala, Victor M.
We present a filter line-search algorithm that does not require inertia information of the linear system. This feature enables the use of a wide range of linear algebra strategies and libraries, which is essential to tackle large-scale problems on modern computing architectures. The proposed approach performs curvature tests along the search step to detect negative curvature and to trigger convexification. We prove that the approach is globally convergent and we implement the approach within a parallel interior-point framework to solve large-scale and highly nonlinear problems. Our numerical tests demonstrate that the inertia-free approach is as efficient as inertia detection via symmetric indefinite factorizations. We also demonstrate that the inertia-free approach can lead to reductions in solution time because it reduces the amount of convexification needed.
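A minimal sketch of the idea described above is given below: rather than computing the inertia of the KKT matrix, test curvature along the computed step and add a diagonal shift (convexification) if the test fails. This is a toy dense illustration with assumed parameters, not the authors' parallel interior-point implementation.

```python
import numpy as np

def inertia_free_step(W, g, delta0=1e-4, rho=10.0, curv_tol=1e-8):
    """Toy Newton-type step with an inertia-free curvature test.
    W: symmetric (possibly indefinite) Hessian approximation; g: gradient.
    If curvature along the step is not sufficiently positive, increase the
    diagonal regularization (convexification) and re-solve."""
    n = len(g)
    delta = 0.0
    while True:
        M = W + delta * np.eye(n)
        d = np.linalg.solve(M, -g)
        if d @ (M @ d) >= curv_tol * (d @ d):   # curvature test along the step
            return d, delta
        delta = delta0 if delta == 0.0 else rho * delta  # trigger convexification

W = np.array([[1.0, 0.0], [0.0, -2.0]])   # indefinite model Hessian
g = np.array([0.0, 1.0])
d, delta = inertia_free_step(W, g)
print("step:", d, " regularization used:", delta)
```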
Astronomical Optical Interferometry. I. Methods and Instrumentation
NASA Astrophysics Data System (ADS)
Jankov, S.
2010-12-01
The previous decade has seen the achievement of large interferometric projects including 8-10 m telescopes and 100 m class baselines. Modern computer and control technology has enabled the interferometric combination of light from separate telescopes also in the visible and infrared regimes. Imaging with milli-arcsecond (mas) resolution and astrometry with micro-arcsecond (μas) precision have thus become reality. Here, I review the methods and instrumentation corresponding to the current state in the field of astronomical optical interferometry. First, this review summarizes the development from the pioneering works of Fizeau and Michelson. Next, the fundamental observables are described, followed by the discussion of the basic design principles of modern interferometers. The basic interferometric techniques such as speckle and aperture masking interferometry, aperture synthesis and nulling interferometry are discussed as well. Using the experience of past and existing facilities to illustrate important points, I consider particularly the new generation of large interferometers that has been recently commissioned (most notably, the CHARA, Keck, VLT and LBT Interferometers). Finally, I discuss the longer-term future of optical interferometry, including the possibilities of new large-scale ground-based projects and prospects for space interferometry.
Topics in geophysical fluid dynamics: Atmospheric dynamics, dynamo theory, and climate dynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ghil, M.; Childress, S.
1987-01-01
This text is the first study to apply systematically the successive bifurcations approach to complex time-dependent processes in large scale atmospheric dynamics, geomagnetism, and theoretical climate dynamics. The presentation of recent results on planetary-scale phenomena in the earth's atmosphere, ocean, cryosphere, mantle and core provides an integral account of mathematical theory and methods together with physical phenomena and processes. The authors address a number of problems in rapidly developing areas of geophysics, bringing into closer contact the modern tools of nonlinear mathematics and the novel problems of global change in the environment.
Composite turbine blade design options for Claude (open) cycle OTEC power systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Penney, T R
1985-11-01
Small-scale turbine rotors made from composites offer several technical advantages for a Claude (open) cycle ocean thermal energy conversion (OTEC) power system. Westinghouse Electric Corporation has designed a composite turbine rotor/disk using state-of-the-art analysis methods for large-scale (100-MWe) open cycle OTEC applications. Near-term demonstrations using conventional low-pressure turbine blade shapes in composite material would establish feasibility and lend modern credibility to the open cycle OTEC power system. Applying composite blades to low-pressure turbomachinery also potentially improves on the reliability of conventional metal blades, which are affected by stress corrosion.
Browne, J
2009-01-01
Charles Darwin's experimental investigations show him to have been a superb practical researcher. These skills are often underestimated today when assessing Darwin's achievement in the Origin of Species and his other books. Supported by a private income, he turned his house and gardens into a Victorian equivalent of a modern research station. Darwin participated actively in the exchange of scientific information via letters and much of his research was also carried out through correspondence. Although this research was relatively small scale in practice, it was large scale in intellectual scope. Darwin felt he had a strong desire to understand or explain whatever he observed.
NASA Astrophysics Data System (ADS)
Sanan, P.; Tackley, P. J.; Gerya, T.; Kaus, B. J. P.; May, D.
2017-12-01
StagBL is an open-source parallel solver and discretization library for geodynamic simulation, encapsulating and optimizing operations essential to staggered-grid finite volume Stokes flow solvers. It provides a parallel staggered-grid abstraction with a high-level interface in C and Fortran. On top of this abstraction, tools are available to define boundary conditions and interact with particle systems. Tools and examples to efficiently solve Stokes systems defined on the grid are provided in small (direct solver), medium (simple preconditioners), and large (block factorization and multigrid) model regimes. By working directly with leading application codes (StagYY, I3ELVIS, and LaMEM) and providing an API and examples to integrate with others, StagBL aims to become a community tool supplying scalable, portable, reproducible performance toward novel science in regional- and planet-scale geodynamics and planetary science. By implementing kernels used by many research groups beneath a uniform abstraction layer, the library will enable optimization for modern hardware, thus reducing community barriers to large- or extreme-scale parallel simulation on modern architectures. In particular, the library will include CPU-, Manycore-, and GPU-optimized variants of matrix-free operators and multigrid components. The common layer provides a framework upon which to introduce innovative new tools. StagBL will leverage p4est to provide distributed adaptive meshes, and incorporate a multigrid convergence analysis tool. These options, in addition to a wealth of solver options provided by an interface to PETSc, will make the most modern solution techniques available from a common interface. StagBL in turn provides a PETSc interface, DMStag, to its central staggered grid abstraction. We present public version 0.5 of StagBL, including preliminary integration with application codes and demonstrations with its own demonstration application, StagBLDemo. Central to StagBL is the notion of an uninterrupted pipeline from toy/teaching codes to high-performance, extreme-scale solves. StagBLDemo replicates the functionality of an advanced MATLAB-style regional geodynamics code, thus providing users with a concrete procedure to exceed the performance and scalability limitations of smaller-scale tools.
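For readers unfamiliar with the staggered-grid layout such a library abstracts, the sketch below shows the conventional 2D arrangement (pressure at cell centers, velocity components on cell faces) and a finite-volume divergence on it. It is a shapes-only illustration in Python under assumed grid sizes; it is not StagBL's actual C/Fortran API.

```python
import numpy as np

# Minimal illustration of a 2D staggered ("MAC") grid: pressure at cell
# centers, vx on x-normal faces, vy on y-normal faces.  Not StagBL's API.
nx, ny = 8, 6                        # number of cells (illustrative)
p  = np.zeros((ny,     nx))          # cell centers
vx = np.zeros((ny,     nx + 1))      # x-normal faces
vy = np.zeros((ny + 1, nx))          # y-normal faces

def divergence(vx, vy, dx=1.0, dy=1.0):
    """Finite-volume divergence at cell centers from face velocities."""
    return (vx[:, 1:] - vx[:, :-1]) / dx + (vy[1:, :] - vy[:-1, :]) / dy

print(divergence(vx, vy).shape)      # (6, 8): one value per cell, like p
```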
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents findings from an analysis of data on international conflict over a three-year period. Part of a large scale research project to test various theories with regard to their power in analyzing international relations, this monograph presents the computer printout of data on the application of discriminant function analysis of…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents findings from an analysis of data on international conflict over a three-year period. Computer printout of the analysis is included. Part of a large scale research project to test various theories with regard to their ability to analyze international relations, this monograph reports on the testing of relative status field…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents findings from an analysis of data on international cooperation over a three-year period. Part of a large scale research project to test various theories with regard to their ability to analyze international relations, this monograph reports on the testing of relative status field theory on WEIS conflict data for 1966-1969…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents findings from an analysis of data on international cooperation over a three-year period. Part of a large scale research project to test various theories with regard to their power in analyzing international relations, this monograph reports on the testing of relative status field theory on WEIS conflict data for 1966-1969…
ERIC Educational Resources Information Center
Vincent, Jack E.
Part of a large scale research project to test various theories with regard to their ability to analyze international relations, this computer printout presents data on the application of social field theory to patterns of conflict among nations. Social field theory implies that international relations is a field which consists of all the…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents findings from an analysis of data on international cooperation over a three-year period. Part of a large scale research project to test various theories with regard to their ability to analyze international relations, this monograph reports on the testing of relative status field theory on WEIS conflict data for 1966-1969…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents findings from an analysis of data on international conflict over a three-year period. Computer printout of the analysis is included. Part of a large scale research project to test various theories with regard to their ability to analyze international relations, this monograph reports on the testing of relative status field…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents findings on international conflict over a three-year period. Part of a large scale research project to test various theories with regard to their power in analyzing international relations, this monograph presents a computer printout of data regarding 'topdog' behavior among nations with regard to economic development and…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents findings from an analysis of data on international conflict over a three-year period. Part of a large scale research project to test various theories with regard to their ability to analyze international relations, this monograph reports on the testing of relative status field theory on WEIS conflict data for 1966-1969 for…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents findings from an analysis of data on international cooperation over a three-year period. Part of a large scale research project to test various theories with regard to their power in analyzing international relations, this monograph presents the computer printout of data on the application of discriminant function analysis…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents findings from an analysis of data on international cooperation over a three-year period. Part of a large scale research project to test various theories with regard to their ability to analyze international relations, this monograph reports on the testing of relative status field theory on WEIS conflict data for 1966-1969…
Research on computer-aided design of modern marine power systems
NASA Astrophysics Data System (ADS)
Ding, Dongdong; Zeng, Fanming; Chen, Guojun
2004-03-01
To make the MPS (Marine Power System) design process easier and more economical, a new CAD scheme is put forward that takes advantage of VR (Virtual Reality) and AI (Artificial Intelligence) technologies. This CAD system can shorten the design period and greatly reduce the reliance on designers' experience. Key issues, such as the selection of hardware and software for such a system, are also discussed.
Reeves, Anthony P.; Xie, Yiting; Liu, Shuang
2017-01-01
With the advent of fully automated image analysis and modern machine learning methods, there is a need for very large image datasets having documented segmentations for both computer algorithm training and evaluation. This paper presents a method and implementation for facilitating such datasets that addresses the critical issue of size scaling for algorithm validation and evaluation; current evaluation methods that are usually used in academic studies do not scale to large datasets. This method includes protocols for the documentation of many regions in very large image datasets; the documentation may be incrementally updated by new image data and by improved algorithm outcomes. This method has been used for 5 years in the context of chest health biomarkers from low-dose chest CT images that are now being used with increasing frequency in lung cancer screening practice. The lung scans are segmented into over 100 different anatomical regions, and the method has been applied to a dataset of over 20,000 chest CT images. Using this framework, the computer algorithms have been developed to achieve over 90% acceptable image segmentation on the complete dataset. PMID:28612037
Multi-scale approaches for high-speed imaging and analysis of large neural populations
Ahrens, Misha B.; Yuste, Rafael; Peterka, Darcy S.; Paninski, Liam
2017-01-01
Progress in modern neuroscience critically depends on our ability to observe the activity of large neuronal populations with cellular spatial and high temporal resolution. However, two bottlenecks constrain efforts towards fast imaging of large populations. First, the resulting large video data is challenging to analyze. Second, there is an explicit tradeoff between imaging speed, signal-to-noise, and field of view: with current recording technology we cannot image very large neuronal populations with simultaneously high spatial and temporal resolution. Here we describe multi-scale approaches for alleviating both of these bottlenecks. First, we show that spatial and temporal decimation techniques based on simple local averaging provide order-of-magnitude speedups in spatiotemporally demixing calcium video data into estimates of single-cell neural activity. Second, once the shapes of individual neurons have been identified at fine scale (e.g., after an initial phase of conventional imaging with standard temporal and spatial resolution), we find that the spatial/temporal resolution tradeoff shifts dramatically: after demixing we can accurately recover denoised fluorescence traces and deconvolved neural activity of each individual neuron from coarse scale data that has been spatially decimated by an order of magnitude. This offers a cheap method for compressing this large video data, and also implies that it is possible to either speed up imaging significantly, or to “zoom out” by a corresponding factor to image order-of-magnitude larger neuronal populations with minimal loss in accuracy or temporal resolution. PMID:28771570
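A minimal sketch of the spatial and temporal decimation by simple local averaging described above is given below, applied to a movie of shape (time, height, width). The decimation factors and movie dimensions are illustrative assumptions.

```python
import numpy as np

def decimate(movie, t_factor=4, s_factor=4):
    """Decimate a calcium-imaging movie by simple local averaging.
    movie: array of shape (T, H, W).  Dimensions are trimmed to multiples
    of the decimation factors; factors here are illustrative."""
    T, H, W = movie.shape
    T, H, W = (T // t_factor) * t_factor, (H // s_factor) * s_factor, (W // s_factor) * s_factor
    m = movie[:T, :H, :W].reshape(T // t_factor, t_factor,
                                  H // s_factor, s_factor,
                                  W // s_factor, s_factor)
    return m.mean(axis=(1, 3, 5))

movie = np.random.rand(400, 128, 128).astype(np.float32)   # (time, y, x)
small = decimate(movie)                  # ~64x fewer voxels to demix
print(movie.shape, "->", small.shape)    # (400, 128, 128) -> (100, 32, 32)
```

After demixing on such coarsened data, per-neuron traces can be recovered at the original resolution, which is the tradeoff shift the abstract describes.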
Modern pollen deposition in Long Island Sound
Beuning, Kristina R.M.; Fransen, Lindsey; Nakityo, Berna; Mecray, Ellen L.; Buchholtz ten Brink, Marilyn R.
2000-01-01
Palynological analyses of 20 surface sediment samples collected from Long Island Sound show a pollen assemblage dominated by Carya, Betula, Pinus, Quercus, Tsuga, and Ambrosia, as is consistent with the regional vegetation. No trends in relative abundance of these pollen types occur either from west to east or associated with modern riverine inputs throughout the basin. Despite the large-scale, long-term removal of fine-grained sediment from winnowed portions of the eastern Sound, the composition of the pollen and spore component of the sedimentary matrix conforms to a basin-wide homogeneous signal. These results strongly support the use of select regional palynological boundaries as chronostratigraphic tools to provide a framework for interpretation of the late glacial and Holocene history of the Long Island Sound basin sediments.
SOURCE EXPLORER: Towards Web Browser Based Tools for Astronomical Source Visualization and Analysis
NASA Astrophysics Data System (ADS)
Young, M. D.; Hayashi, S.; Gopu, A.
2014-05-01
As a new generation of large format, high-resolution imagers comes online (ODI, DECAM, LSST, etc.), we are faced with the daunting prospect of astronomical images containing upwards of hundreds of thousands of identifiable sources. Visualizing and interacting with such large datasets using traditional astronomical tools appears infeasible, and a new approach is required. We present here a method for the display and analysis of arbitrarily large source datasets using dynamically scaling levels of detail, enabling scientists to rapidly move from large-scale spatial overviews down to the level of individual sources and everything in-between. Based on the recognized standards of HTML5+JavaScript, we enable observers and archival users to interact with their images and sources from any modern computer without having to install specialized software. We demonstrate the ability to produce large-scale source lists from the images themselves, as well as to overlay data from publicly available sources (2MASS, GALEX, SDSS, etc.) or user-provided source lists. A high-availability cluster of computational nodes allows us to produce these source maps on demand, customized based on user input. User-generated source lists and maps are persistent across sessions and are available for further plotting, analysis, refinement, and culling.
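To illustrate the "dynamically scaling level of detail" idea in a concrete way, the sketch below returns a binned density map when a viewport contains too many sources and individual sources once the count falls below a threshold. It is written in Python for brevity (the tool itself is HTML5+JavaScript), and the function name, grid size, and threshold are assumptions, not the tool's implementation.

```python
import numpy as np

def viewport_payload(ra, dec, view, max_points=2000, grid=64):
    """Return individual sources or a binned density map for a viewport.
    ra, dec: catalog coordinates (deg); view = (ra_min, ra_max, dec_min, dec_max).
    Threshold and grid size are illustrative."""
    ra_min, ra_max, dec_min, dec_max = view
    sel = (ra >= ra_min) & (ra < ra_max) & (dec >= dec_min) & (dec < dec_max)
    if sel.sum() <= max_points:                       # zoomed in: send sources
        return {"type": "sources", "ra": ra[sel], "dec": dec[sel]}
    density, _, _ = np.histogram2d(ra[sel], dec[sel], bins=grid,
                                   range=[[ra_min, ra_max], [dec_min, dec_max]])
    return {"type": "density", "map": density}        # zoomed out: send a heatmap

rng = np.random.default_rng(1)
ra, dec = rng.uniform(0, 1, 300_000), rng.uniform(0, 1, 300_000)
print(viewport_payload(ra, dec, (0, 1, 0, 1))["type"])        # 'density'
print(viewport_payload(ra, dec, (0, 0.02, 0, 0.02))["type"])  # 'sources'
```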
Multiple Fault Isolation in Redundant Systems
NASA Technical Reports Server (NTRS)
Pattipati, Krishna R.; Patterson-Hine, Ann; Iverson, David
1997-01-01
Fault diagnosis in large-scale systems that are products of modern technology presents formidable challenges to manufacturers and users. This is due to the large number of failure sources in such systems and the need to quickly isolate and rectify failures with minimal downtime. In addition, for fault-tolerant systems and systems with infrequent opportunity for maintenance (e.g., the Hubble telescope, the space station), the assumption of at most a single fault in the system is unrealistic. In this project, we have developed novel block and sequential diagnostic strategies to isolate multiple faults in the shortest possible time without making the unrealistic single fault assumption.
Kernel methods for large-scale genomic data analysis
Xing, Eric P.; Schaid, Daniel J.
2015-01-01
Machine learning, particularly kernel methods, has been demonstrated as a promising new tool to tackle the challenges imposed by today's explosive data growth in genomics. Kernel methods provide a practical and principled approach to learning how a large number of genetic variants are associated with complex phenotypes, helping to reveal the complexity in the relationship between genetic markers and the outcome of interest. In this review, we highlight the key role they can play in modern genomic data processing, especially with regard to integration with classical methods for gene prioritization, prediction, and data fusion. PMID:25053743
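A toy example of the kernel idea is sketched below: many variants are summarized into a genotype similarity (kernel) matrix, which is then used in a kernel ridge regression of a phenotype. This is a generic illustration under simulated data and assumed parameters, not a specific published method from the review.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 5000                            # samples, variants (illustrative)
G = rng.binomial(2, 0.3, size=(n, p)).astype(float)        # 0/1/2 genotypes
y = G[:, :10] @ rng.normal(size=10) + rng.normal(size=n)   # 10 causal variants

Gc = (G - G.mean(0)) / (G.std(0) + 1e-12)   # standardize each variant
K = Gc @ Gc.T / p                           # linear genotype kernel (n x n)

lam = 1.0                                   # ridge penalty (illustrative)
alpha = np.linalg.solve(K + lam * np.eye(n), y)   # kernel ridge regression
y_hat = K @ alpha
print("in-sample correlation:", np.corrcoef(y, y_hat)[0, 1].round(2))
```

The key point is that the model works with the n-by-n similarity matrix rather than the n-by-p variant matrix, which is what makes kernel approaches attractive when p is very large.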
The CPAT 2.0.2 Domain Model - How CPAT 2.0.2 "Thinks" From an Analyst Perspective.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waddell, Lucas; Muldoon, Frank; Melander, Darryl J.
To help effectively plan the management and modernization of their large and diverse fleets of vehicles, the Program Executive Office Ground Combat Systems (PEO GCS) and the Program Executive Office Combat Support and Combat Service Support (PEO CS&CSS) commissioned the development of a large-scale portfolio planning optimization tool. This software, the Capability Portfolio Analysis Tool (CPAT), creates a detailed schedule that optimally prioritizes the modernization or replacement of vehicles within the fleet, respecting numerous business rules associated with fleet structure, budgets, industrial base, research and testing, etc., while maximizing overall fleet performance through time. This report contains a description of the organizational fleet structure and a thorough explanation of the business rules that the CPAT formulation follows involving performance, scheduling, production, and budgets. This report, which is an update to the original CPAT domain model published in 2015 (SAND2015-4009), covers important new CPAT features.
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph is a computer printout which presents findings from an analysis of data on international cooperation over a three-year period. Part of a large scale research project to test various theories with regard to their ability to analyze international relations, this monograph presents the computer printout of data on the application of…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph is a computer printout which presents findings from an analysis of data on international conflict over a three-year period. Part of a large scale research project to test various theories with regard to their ability to analyze international relations, this monograph presents the computer printout of data on the application of…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph presents findings from an analysis of data on international cooperation over a three-year period. A computer printout of the analysis is included. The document is part of a large scale research project to test various theories with regard to their power in analyzing international relations. In this monograph, an inventory is…
Men and Arms in the Middle East: The Human Factor in Military Modernization
1979-06-01
countries under study supports their abilities to wield military power effectively, their large-scale reliance on importation of military technologies...statistics, and on quality from area experts. In many cases, we were unable to arrive at numerical estimates of the sources of supply. Likely future...government agencies); on-the-job training (as in the case of counterpart programs); and the direct importation of both military and civilian labor
Open source tools for large-scale neuroscience.
Freeman, Jeremy
2015-06-01
New technologies for monitoring and manipulating the nervous system promise exciting biology but pose challenges for analysis and computation. Solutions can be found in the form of modern approaches to distributed computing, machine learning, and interactive visualization. But embracing these new technologies will require a cultural shift: away from independent efforts and proprietary methods and toward an open source and collaborative neuroscience. Copyright © 2015 The Author. Published by Elsevier Ltd. All rights reserved.
2009-09-01
his schedule is. I learned most from our informal discussions and collaboration with other industry professionals. Amela was instrumental in allowing...me to effectively analyze, structure and critique my work. I take many professional lessons learned from Amela with me as I leave NPS. Thanks to...observers began learning about maneuver warfare in a large-scale battle. The demonstration was recognized as a huge success after General von Muffling
Impacts of Modernizing Urban Stormwater Systems on Nutrient and Carbon Dynamics
NASA Astrophysics Data System (ADS)
Filippelli, G. M.; Jacinthe, P. A.; Druschel, G.
2015-12-01
Over 200 cities throughout the U.S. are undergoing the painful and expensive transition from Combined Sewer Overflows (CSOs) to modern stormwater systems. The infrastructure of CSOs is frequently a century old, with a design adapted to stormwater conditions of smaller, more pervious cities. Normal rainfall events of less than 1 cm per hour can now exceed the CSO capacities in many urban sub-watersheds, leading to streamwater conditions that exceed human health standards for pathogens. Although much focus has been placed on the plumbing aspects of urban stormwater modernization, less attention has been paid to the local, and indeed regional, implications of changes in nutrient and carbon dynamics. Indianapolis, Indiana, with a metropolitan population of over 1 million, is a case study of CSO modernization. Most CSO systems in the city were built almost 100 years ago, and the city has experienced classic patterns of growth of impervious surface area, population growth, and enhanced use of chemical fertilizers. The result of these changes has been frequent failure of the CSO system and release of sewage water into suburban and urban streams, rivers, and reservoirs. Driven largely by modern environmental regulations, the city is now "footing the bill" for a century of poor planning and growth, with the real costs seen by ratepayers in the form of steeply growing wastewater fees. The mitigation approach to this problem is largely one of subsurface engineering on a mega scale, with less attention (i.e., money) placed on complementary land-use and nutrient management efforts on the surface. Several examples illustrate the relatively straightforward nature of changing plumbing, in contrast to the complex effects of these changes on nutrient pathways, and the implications that this has for oxygenation, nutrient cycling, and carbon release/sequestration dynamics in riparian and urban reservoir systems.
Modernizing Evolutionary Anthropology : Introduction to the Special Issue.
Mattison, Siobhán M; Sear, Rebecca
2016-12-01
Evolutionary anthropology has traditionally focused on the study of small-scale, largely self-sufficient societies. The increasing rarity of these societies underscores the importance of such research yet also suggests the need to understand the processes by which such societies are being lost (what we call "modernization") and the effects of these processes on human behavior and biology. In this article, we discuss recent efforts by evolutionary anthropologists to incorporate modernization into their research and the challenges and rewards that follow. Advantages include that these studies allow for explicit testing of hypotheses that explore how behavior and biology change in conjunction with changes in social, economic, and ecological factors. In addition, modernization often provides a source of "natural experiments" since it may proceed in a piecemeal fashion through a population. Challenges arise, however, in association with reduced variability in fitness proxies such as fertility, and with the increasing use of relatively novel methodologies in evolutionary anthropology, such as the analysis of secondary data. Confronting these challenges will require careful consideration but will lead to an improved understanding of humanity. We conclude that the study of modernization offers the prospect of developing a richer evolutionary anthropology, by encompassing ultimate and proximate explanations for behavior expressed across the full range of human societies.
Low-Cost Nested-MIMO Array for Large-Scale Wireless Sensor Applications.
Zhang, Duo; Wu, Wen; Fang, Dagang; Wang, Wenqin; Cui, Can
2017-05-12
In modern communication and radar applications, large-scale sensor arrays have increasingly been used to improve the performance of a system. However, the hardware cost and circuit power consumption scale linearly with the number of sensors, which makes the whole system expensive and power-hungry. This paper presents a low-cost nested multiple-input multiple-output (MIMO) array, which is capable of providing O(2N^2) degrees of freedom (DOF) with O(N) physical sensors. The sensor locations of the proposed array have closed-form expressions. Thus, the aperture size and number of DOF can be predicted as a function of the total number of sensors. Additionally, with the help of time-sequence-phase-weighting (TSPW) technology, only one receiver channel is required for sampling the signals received by all of the sensors, which is conducive to reducing the hardware cost and power consumption. Numerical simulation results demonstrate the effectiveness and superiority of the proposed array.
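To make the O(N^2)-DOF-from-O(N)-sensors idea concrete, the sketch below builds the classic two-level nested array (in the sense of Pal and Vaidyanathan) and counts the distinct lags of its difference co-array. The element positions and DOF count of the MIMO array proposed in the paper may differ; this only illustrates the underlying nested-array concept.

```python
import numpy as np

def nested_positions(n1, n2, d=1):
    """Two-level nested array: a dense ULA of n1 sensors at spacing d,
    followed by a sparse ULA of n2 sensors at spacing (n1 + 1) * d."""
    inner = np.arange(1, n1 + 1) * d
    outer = np.arange(1, n2 + 1) * (n1 + 1) * d
    return np.concatenate([inner, outer])

pos = nested_positions(4, 4)                    # 8 physical sensors
lags = np.unique(pos[:, None] - pos[None, :])   # difference co-array
print("sensors:", len(pos), " distinct co-array lags:", len(lags))  # 8 -> 39
```

With 8 physical sensors the co-array fills 39 consecutive lags, i.e., roughly N^2/2 + N degrees of freedom, which is the scaling the abstract refers to.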
Climate-driven C4 plant distributions in China: divergence in C4 taxa
Wang, Renzhong; Ma, Linna
2016-01-01
There have been debates on the driving factors of C4 plant expansion, such as the pCO2 decline in the late Miocene versus warmer climate and precipitation in large-scale modern ecosystems. These disputes stem mainly from the lack of direct evidence and of extensive data analysis. Here we use mass flora data to explore the driving factors of C4 distribution and the divergent patterns of different C4 taxa at the continental scale in China. The results show that mean annual climate variables drive the C4 distribution in present-day vegetation. Mean annual temperature is the critical constraint on total C4 plants, and precipitation gradients seem to have much less impact. Grass and sedge C4 plants are largely restricted by mean annual temperature and precipitation, respectively, while chenopod C4 plants are strongly restricted by aridity in China. Separate regression analyses succeed in detecting divergent climate distribution patterns of C4 taxa at a global scale. PMID:27302686
Extra-metabolic energy use and the rise in human hyper-density
NASA Astrophysics Data System (ADS)
Burger, Joseph R.; Weinberger, Vanessa P.; Marquet, Pablo A.
2017-03-01
Humans, like all organisms, are subject to fundamental biophysical laws. Van Valen predicted that, because of zero-sum dynamics, all populations of all species in a given environment flux the same amount of energy on average. Damuth’s ’energetic equivalence rule’ supported Van Valen´s conjecture by showing a tradeoff between few big animals per area with high individual metabolic rates compared to abundant small species with low energy requirements. We use metabolic scaling theory to compare variation in densities and individual energy use in human societies to other land mammals. We show that hunter-gatherers occurred at densities lower than the average for a mammal of our size. Most modern humans, in contrast, concentrate in large cities at densities up to four orders of magnitude greater than hunter-gatherers, yet consume up to two orders of magnitude more energy per capita. Today, cities across the globe flux greater energy than net primary productivity on a per area basis. This is possible by importing enormous amounts of energy and materials required to sustain hyper-dense, modern humans. The metabolic rift with nature created by modern cities fueled largely by fossil energy poses formidable challenges for establishing a sustainable relationship on a rapidly urbanizing, yet finite planet.
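The energetic equivalence logic invoked above can be illustrated with a few lines of arithmetic: individual metabolic rate scales roughly as M^(3/4) while population density scales roughly as M^(-3/4), so population energy flux per unit area is approximately independent of body mass. The normalization constants below are arbitrary, chosen only to show the cancellation.

```python
# Illustration of Damuth's energetic equivalence rule: B ~ M^(3/4) and
# D ~ M^(-3/4), so the population energy flux per unit area, E = B * D,
# is roughly independent of body mass M.  Constants are illustrative.
for mass_kg in (0.01, 1.0, 100.0, 10_000.0):
    B = 4.0 * mass_kg ** 0.75        # individual metabolic rate (arbitrary units)
    D = 50.0 * mass_kg ** -0.75      # individuals per unit area (arbitrary units)
    print(f"M = {mass_kg:>8} kg  ->  B*D = {B * D:.1f} (size-independent)")
```

The abstract's point is that modern urban humans break this equivalence by importing extra-metabolic (mostly fossil) energy, so their areal energy flux far exceeds what body size alone would predict.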
Modern Quaternary plant lineages promote diversity through facilitation of ancient Tertiary lineages
Valiente-Banuet, Alfonso; Rumebe, Adolfo Vital; Verdú, Miguel; Callaway, Ragan M.
2006-01-01
One of the most important floristic sorting periods to affect modern plant communities occurred during the shift from the wet Tertiary period to the unusually dry Quaternary, when most global deserts developed. During this time, a wave of new plant species emerged, presumably in response to the new climate. Interestingly, most Tertiary species that have been tracked through the fossil record did not disappear but remained relatively abundant despite the development of a much more unfavorable climate for species adapted to moist conditions. Here we find, by integrating paleobotanical, ecological, and phylogenetic analyses, that a large number of ancient Tertiary species in Mediterranean-climate ecosystems appear to have been preserved by the facilitative or “nurse” effects of modern Quaternary species. Our results indicate that these interdependent relationships among plants have played a central role in the preservation of the global biodiversity and provided a mechanism for stabilizing selection and the conservation of ecological traits over evolutionary time scales. PMID:17068126
Evans, William Douglas; Johnson, Michael; Jagoe, Kirstie; Charron, Dana; Young, Bonnie N; Rahman, A S M Mashiur; Omolloh, Daniel; Ipe, Julie
2017-12-22
Nearly three billion people worldwide burn solid fuels and kerosene in open fires and inefficient stoves to cook, light, and heat their homes. Cleaner-burning stoves reduce emissions and can have positive health, climate, and women's empowerment benefits. This article reports on the protocol and baseline data from the evaluation of four behavior change communication (BCC) campaigns carried out in lower- to middle-income countries aimed at promoting the sale and use of cleaner-burning stoves. Interventions implemented in Bangladesh, Kenya, and Nigeria are using a range of BCC methods including mass media, digital media, outdoor advertising, and inter-personal communication. The mixed methods evaluation comprises three large-scale surveys: one pre-BCC and two follow-ups, along with smaller scale assessments of stove uptake and patterns of use. Baseline results revealed varying levels of awareness of previous promotions and positive attitudes and beliefs about modern (i.e., relatively clean-burning) cookstoves. Differences in cookstove preferences and behaviors by gender, socio-demographics, media use, and country/region were observed that may affect outcomes. Across all three countries, cost (lack of funds) was a key perceived barrier to buying a cleaner-burning stove. Future multivariate analyses will examine potential dose-response effects of BCC on cookstove uptake and patterns of use. BCC campaigns have the potential to promote modern cookstoves at scale. More research is needed on campaign effectiveness and on how to optimize messages and channels. This evaluation builds on a limited evidence base in the field.
Stephenson, Rob; Bartel, Doris; Rubardt, Marcie
2012-01-01
Using samples of reproductive aged men and women from rural Ethiopia and Kenya, this study examines the associations between two scales measuring balances of power and equitable attitudes within relationships and modern contraceptive use. The scales are developed from the Sexual and Reproductive Power Scale (SRPS) and Gender Equitable Male (GEM) scale, which were originally developed to measure relationship power (SRPS) among women and gender equitable attitudes (GEM) among men. With the exception of Ethiopian women, a higher score on the balance of power scale was associated with significantly higher odds of reporting modern contraceptive use. For men and women in both countries, a higher score on the equitable attitudes scale was associated with significantly higher odds of reporting modern contraceptive use. However, only the highest categories of the scales are associated with contraceptive use, suggesting a threshold effect in the relationships between power, equity and contraceptive use. The results presented here demonstrate how elements of the GEM and SRPS scales can be used to create scales measuring balances of power and equitable attitudes within relationships that are associated with self-reporting of modern contraceptive use in two resource-poor settings. However, further work with larger sample sizes is needed to confirm these findings, and to examine the extent to which these scales can be applied to other social and cultural contexts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Horiuchi, Shunsaku, E-mail: horiuchi@vt.edu
2016-06-21
The cold dark matter paradigm has been extremely successful in explaining the large-scale structure of the Universe. However, it continues to face issues when confronted by observations on sub-Galactic scales. A major caveat, now being addressed, has been the incomplete treatment of baryon physics. We first summarize the small-scale issues surrounding cold dark matter and discuss the solutions explored by modern state-of-the-art numerical simulations including treatment of baryonic physics. We identify the "too big to fail" problem in field galaxies as among the best targets to study modifications to dark matter, and discuss the particular connection with sterile neutrino warm dark matter. We also discuss how the recently detected anomalous 3.55 keV X-ray lines, when interpreted as sterile neutrino dark matter decay, provide a very good description of small-scale observations of the Local Group.
Confirmation of general relativity on large scales from weak lensing and galaxy velocities.
Reyes, Reinabelle; Mandelbaum, Rachel; Seljak, Uros; Baldauf, Tobias; Gunn, James E; Lombriser, Lucas; Smith, Robert E
2010-03-11
Although general relativity underlies modern cosmology, its applicability on cosmological length scales has yet to be stringently tested. Such a test has recently been proposed, using a quantity, E(G), that combines measures of large-scale gravitational lensing, galaxy clustering and structure growth rate. The combination is insensitive to 'galaxy bias' (the difference between the clustering of visible galaxies and invisible dark matter) and is thus robust to the uncertainty in this parameter. Modified theories of gravity generally predict values of E(G) different from the general relativistic prediction because, in these theories, the 'gravitational slip' (the difference between the two potentials that describe perturbations in the gravitational metric) is non-zero, which leads to changes in the growth of structure and the strength of the gravitational lensing effect. Here we report that E(G) = 0.39 +/- 0.06 on length scales of tens of megaparsecs, in agreement with the general relativistic prediction of E(G) approximately 0.4. The measured value excludes a model within the tensor-vector-scalar gravity theory, which modifies both Newtonian and Einstein gravity. However, the relatively large uncertainty still permits models within f(R) theory, which is an extension of general relativity. A fivefold decrease in uncertainty is needed to rule out these models.
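For orientation, the following sketch shows how the quoted general relativistic expectation E(G) ≈ 0.4 arises on linear scales; the cosmological parameter values and effective redshift used here are illustrative assumptions, not taken from the paper.

```latex
% Illustrative origin of the GR expectation E_G ~ 0.4 (values assumed):
\[
  E_G(z) \;=\; \frac{\Omega_{m,0}}{f(z)}, \qquad f(z) \simeq \Omega_m(z)^{0.55},
\]
\[
  \Omega_m(z) = \frac{\Omega_{m,0}(1+z)^3}{\Omega_{m,0}(1+z)^3 + 1 - \Omega_{m,0}}
  \approx 0.42 \quad (\Omega_{m,0}\approx 0.25,\ z\approx 0.3),
\]
\[
  f \approx 0.42^{0.55} \approx 0.62, \qquad E_G \approx 0.25/0.62 \approx 0.40 .
\]
```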
Food security through large scale investments in agriculture
NASA Astrophysics Data System (ADS)
Rulli, M.; D'Odorico, P.
2013-12-01
Most of the human appropriation of freshwater resources is for food production. There is some concern that in the near future the finite freshwater resources available on Earth might not be sufficient to meet the increasing human demand for agricultural products. In the late 1700s Malthus argued that in the long run humanity would not have enough resources to feed itself. Malthus' analysis, however, did not account for the emergence of technological innovations that could increase the rate of food production. Modern and contemporary history has seen at least three major technological advances that have increased humans' access to food, namely, the industrial revolution, the green revolution, and the intensification of global trade. Here we argue that a fourth revolution has just started to happen. It involves foreign direct investments in agriculture, which intensify the crop yields of potentially highly productive agricultural lands by introducing the use of more modern technologies. The increasing demand for agricultural products and the uncertainty of international food markets have recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of lack of modern technology. It is expected that in the long run large-scale land acquisitions for commercial farming will bring the technology required to close the existing yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of verified land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with large-scale land acquisitions. We show that, at gap closure, up to about 290-470 million people could be fed by crops grown on this land, compared with the 200-300 million people who can be supported at current crop yields. These numbers raise some concern because many of the target countries exhibit high malnourishment levels. If used for domestic consumption, the crops harvested on the acquired land could ensure food security for the local populations.
Image segmentation evaluation for very-large datasets
NASA Astrophysics Data System (ADS)
Reeves, Anthony P.; Liu, Shuang; Xie, Yiting
2016-03-01
With the advent of modern machine learning methods and fully automated image analysis, there is a need for very large image datasets with documented segmentations, both for computer algorithm training and for evaluation. Current approaches of visual inspection and manual marking do not scale well to big data. We present a new approach that depends on fully automated algorithm outcomes for segmentation documentation, requires no manual marking, and provides quantitative evaluation for computer algorithms. The documentation of new image segmentations and new algorithm outcomes is achieved by visual inspection. The burden of visual inspection on large datasets is minimized by (a) customized visualizations for rapid review and (b) reducing the number of cases to be reviewed through analysis of quantitative segmentation evaluation. This method has been applied to a dataset of 7,440 whole-lung CT images for 6 different segmentation algorithms designed to fully automatically facilitate the measurement of a number of very important quantitative image biomarkers. The results indicate that we could achieve 93% to 99% successful segmentation for these algorithms on this relatively large image database. The presented evaluation method may be scaled to much larger image databases.
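A minimal sketch of the triage idea described above, limiting visual inspection to low-agreement cases; the metric (Dice overlap between two automated segmentations) and the threshold are assumptions for illustration, not the authors' actual pipeline:

import numpy as np

def dice(a, b):
    # Dice overlap between two boolean segmentation masks.
    a = np.asarray(a, dtype=bool)
    b = np.asarray(b, dtype=bool)
    denom = a.sum() + b.sum()
    return 1.0 if denom == 0 else 2.0 * np.logical_and(a, b).sum() / denom

def cases_needing_review(masks_run1, masks_run2, threshold=0.95):
    # Flag only cases where two automated segmentations disagree, so that
    # visual inspection is restricted to a small subset of the dataset.
    return [i for i, (a, b) in enumerate(zip(masks_run1, masks_run2))
            if dice(a, b) < threshold]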
Evseeva, T I; Maĭstrenko, T A; Geras'kin, S A; Belykh, E S; Umarov, M A; Sergeeva, I Iu; Sergeev, V Iu
2008-01-01
Results of an assessment of the modern radioecological situation at the "Chagan" nuclear explosion site, based on large-scale cartographic studies (1:25000) of a 4 km2 test area, are presented. Maximum gamma-irradiation doses were observed on the mound of ground surrounding the crater and on the radioactive fallout traces extending to the northeast and to the southwest of the crater. Based on data on artificial radionuclide specific activity, most of the soil samples were classified as radioactive waste according to IAEA (1996) and OSPORB (1999). Natural decrease of soil radioactivity to a safe level through the radioactive decay of 60Co, 137Cs, 90Sr, 152Eu and 154Eu and the accumulation and decay of 241Am will not occur in the studied area within the next 60 years.
GDC 2: Compression of large collections of genomes
Deorowicz, Sebastian; Danek, Agnieszka; Niemiec, Marcin
2015-01-01
Falling prices of high-throughput genome sequencing are changing the landscape of modern genomics. A number of large-scale projects aimed at sequencing many human genomes are in progress. Genome sequencing is also becoming an important aid in personalized medicine. One significant side effect of this change is the necessity of storing and transferring huge amounts of genomic data. In this paper we deal with the problem of compressing large collections of complete genomic sequences. We propose an algorithm that is able to compress the collection of 1092 human diploid genomes about 9,500 times. This result is about 4 times better than what is offered by other existing compressors. Moreover, our algorithm is very fast, as it processes the data at a speed of 200 MB/s on a modern workstation. As a consequence, the proposed algorithm allows storing complete genomic collections at low cost, e.g., the examined collection of 1092 human genomes needs only about 700 MB when compressed, which compares to about 6.7 TB of uncompressed FASTA files. The source code is available at http://sun.aei.polsl.pl/REFRESH/index.php?page=projects&project=gdc&subpage=about. PMID:26108279
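As a quick sanity check on the quoted figures (using the rounded sizes from the abstract, so the arithmetic is approximate):

\[ \frac{6.7\ \mathrm{TB}}{700\ \mathrm{MB}} \approx \frac{6.7\times10^{12}\ \mathrm{B}}{7\times10^{8}\ \mathrm{B}} \approx 9{,}600, \]

consistent with the stated roughly 9,500-fold compression.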
Effects of rotation on coolant passage heat transfer. Volume 1: Coolant passages with smooth walls
NASA Technical Reports Server (NTRS)
Hajek, T. J.; Wagner, J. H.; Johnson, B. V.; Higgins, A. W.; Steuber, G. D.
1991-01-01
An experimental program was conducted to investigate heat transfer and pressure loss characteristics of rotating multipass passages, for configurations and dimensions typical of modern turbine blades. The immediate objective was the generation of a data base of heat transfer and pressure loss data required to develop heat transfer correlations and to assess computational fluid dynamic techniques for rotating coolant passages. Experiments were conducted in a smooth wall large scale heat transfer model.
Nonlinear Maps for Design of Discrete Time Models of Neuronal Network Dynamics
2016-02-29
...neuronal model in the form of difference equations that generates neuronal states in discrete moments of time. In this approach, the time step can be made...propose to use modern DSP ideas to develop new efficient approaches to the design of such discrete-time models for studies of large-scale neuronal
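The report's specific model is not given here, but a well-known member of this class of discrete-time (map-based) neuron models is the Rulkov map; the sketch below is illustrative only, with parameter values chosen as plausible assumptions rather than taken from the report:

import numpy as np

def rulkov_map(n_steps, alpha=4.3, mu=0.001, sigma=0.1, x0=-1.0, y0=-3.0):
    # Two-variable map-based neuron: x is the fast (membrane-like) variable,
    # y is the slow variable; the pair is iterated once per discrete time step.
    x = np.empty(n_steps)
    y = np.empty(n_steps)
    x[0], y[0] = x0, y0
    for n in range(n_steps - 1):
        x[n + 1] = alpha / (1.0 + x[n] ** 2) + y[n]
        y[n + 1] = y[n] - mu * (x[n] - sigma)
    return x, y

# Example: generate 10,000 discrete-time states of a single model neuron.
x, y = rulkov_map(10_000)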
China Report, Red Flag, Number 11, 1 June 1986.
1986-08-06
and that he hoped it would attract the attention of the whole society. I feel that abiding by the principle of "gearing toward modernization, gearing toward the world and gearing toward the future," and training children into a generation of new communist persons is an especially important...association as tantamount to the collectivization model of the past, thinking that it is a move toward "large-scale operation, a high degree of
Climatic controls on the pace of glacier erosion
NASA Astrophysics Data System (ADS)
Koppes, Michele; Hallet, Bernard; Rignot, Eric; Mouginot, Jeremie; Wellner, Julia; Love, Katherine
2016-04-01
Mountain ranges worldwide have undergone large-scale modification due to the erosive action of ice, yet the mechanisms that control the timing of this modification and the rate at which ice erodes remain poorly understood. Available data report a wide range of erosion rates from individual ice masses over varying timescales, suggesting that modern erosion rates exceed orogenic rates by 2-3 orders of magnitude. These modern rates are presumed to be due to dynamic acceleration of the ice masses during deglaciation and retreat. Recent numerical models have focused on replicating the processes that produce the geomorphic signatures of glacial landscapes. Central to these models is a simple quantitative index that relates erosion rate to ice dynamics and to climate. To provide such an index, we examined explicitly the factors controlling modern glacier erosion rates across climatic regimes. Holding tectonic history, bedrock lithology and glacier hypsometries relatively constant across a latitudinal transect from Patagonia to the Antarctic Peninsula, we find that modern, basin-averaged erosion rates vary by three orders of magnitude, from 1 to >10 mm yr-1 for temperate tidewater glaciers to 0.01 to <0.1 mm yr-1 for polar outlet glaciers, largely as a function of temperature and basal thermal regime. Erosion rates also increase non-linearly with both the sliding speed and the ice flux through the ELA, in accord with theory. The general relationship between ice dynamics and erosion suggests that the erosion rate scales non-linearly with basal sliding speed, with an exponent n ≈ 2-2.62. Notably, erosion rates decrease by over two orders of magnitude between temperate and polar glaciers with similar ice discharge rates. The difference in erosion rates between temperate and colder glaciers of similar shape and size is primarily related to the abundance of meltwater accessing the bed. Since all glaciers worldwide have experienced colder-than-current climatic conditions, the 100-fold decrease in long-term relative to modern erosion rates may in part reflect the temporal averaging of warm- and cold-based conditions over the lifecycle of these glaciers. Higher temperatures and precipitation rates at the end of glaciations favor the production of water from rainfall, surface melting and internal melting, which promotes sliding, erosion and sediment production and evacuation from under the ice. Hence, climatic variation, more than the extent of ice cover or ice volume, controls the pace at which glaciers shape mountains.
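The scaling stated above is commonly written as a power-law erosion rule; the coefficient K below is a placeholder introduced here for illustration, not a value from the study:

\[ \dot{e} = K\, u_s^{\,n}, \qquad n \approx 2\text{-}2.62, \]

where \dot{e} is the basin-averaged erosion rate and u_s the basal sliding speed. The strong nonlinearity means that modest increases in sliding, for example from abundant meltwater reaching the bed, translate into order-of-magnitude increases in erosion.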
NASA Astrophysics Data System (ADS)
Debnath, Lokenath
2010-09-01
This article is essentially devoted to a brief historical introduction to Euler's formula for polyhedra, topology, the theory of graphs, and networks, with many examples from the real world. The celebrated Königsberg seven-bridge problem and some basic properties of graphs and networks, useful for understanding the macroscopic behaviour of real physical systems, are included. We also mention some important modern applications of graph theory and network problems, from transportation to telecommunications. Graphs or networks are effectively used as powerful tools in industrial, electrical and civil engineering, and in communication networks for the planning of business and industry. Graph theory and combinatorics can be used to understand the changes that occur in many large and complex scientific, technical and medical systems. With the advent of fast large computers and the ubiquitous Internet, itself a very large network of computers, large-scale complex optimization problems can be modelled in terms of graphs or networks and then solved by algorithms available in graph theory. Many large and more complex combinatorial problems dealing with the possible arrangements of situations of various kinds, and with computing the number and properties of such arrangements, can be formulated in terms of networks. The Knight's tour problem, Hamilton's tour problem, the problem of magic squares, the Euler Graeco-Latin squares problem and their modern developments in the twentieth century are also included.
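For reference, the polyhedron formula mentioned above is the standard result

\[ V - E + F = 2, \]

for any convex polyhedron with V vertices, E edges and F faces; for a cube, 8 - 12 + 6 = 2. It is stated here for convenience rather than quoted from the article.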
The cosmological principle is not in the sky
NASA Astrophysics Data System (ADS)
Park, Chan-Gyung; Hyun, Hwasu; Noh, Hyerim; Hwang, Jai-chan
2017-08-01
The homogeneity of the matter distribution at large scales, known as the cosmological principle, is a central assumption in the standard cosmological model. This assumption is testable, however, and thus need not remain a principle. Here we perform a test for spatial homogeneity using the Sloan Digital Sky Survey Luminous Red Galaxies (LRG) sample by counting galaxies within a specified volume, with the radius scale varying up to 300 h-1 Mpc. We directly confront the large-scale structure data with the definition of spatial homogeneity by comparing the averages and dispersions of galaxy number counts with the ranges allowed for a homogeneous random distribution. The LRG sample shows significantly larger dispersions of number counts than the random catalogues up to the 300 h-1 Mpc scale, and even the average lies far outside the range allowed in the random distribution; the deviations are statistically impossible to realize in a random distribution. This implies that the cosmological principle does not hold even at such large scales. The same analysis of mock galaxies derived from an N-body simulation, however, suggests that the LRG sample is consistent with the current paradigm of cosmology; thus the simulation is also not homogeneous on that scale. We conclude that the cosmological principle is neither in the observed sky nor demanded to be there by the standard cosmological world model. This reveals the nature of the cosmological principle adopted in the modern cosmology paradigm, and opens a new field of research in theoretical cosmology.
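A minimal sketch of the counts-in-cells comparison described above, using a uniform random reference catalogue; the box size, sample sizes and sphere placement are assumptions for illustration, not the paper's actual pipeline:

import numpy as np

def counts_in_spheres(points, centers, radius):
    # Number of catalogue points falling within `radius` of each centre.
    counts = np.empty(len(centers), dtype=int)
    for i, c in enumerate(centers):
        counts[i] = np.sum(np.linalg.norm(points - c, axis=1) < radius)
    return counts

rng = np.random.default_rng(0)
box = 1000.0                                   # h^-1 Mpc, illustrative box size
galaxies = rng.uniform(0, box, (50_000, 3))    # stand-in for an LRG-like sample
randoms = rng.uniform(0, box, (50_000, 3))     # homogeneous random catalogue
centers = rng.uniform(300, box - 300, (200, 3))  # spheres kept away from edges

for r in (100.0, 200.0, 300.0):                # h^-1 Mpc
    n_gal = counts_in_spheres(galaxies, centers, r)
    n_ran = counts_in_spheres(randoms, centers, r)
    # Homogeneity would make the mean and dispersion of the two agree within
    # the scatter expected for a random (Poisson-like) distribution.
    print(r, n_gal.mean(), n_gal.std(), n_ran.mean(), n_ran.std())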
Tuberculosis and the role of war in the modern era.
Drobniewski, F A; Verlander, N Q
2000-12-01
Tuberculosis (TB) remains a major global health problem; historically, major wars have increased TB notifications. This study evaluated whether modern conflicts worldwide affected TB notifications between 1975 and 1995. Dates of conflicts were obtained and matched with national TB notification data reported to the World Health Organization. Overall notification rates were calculated pre and post conflict. Poisson regression analysis was applied to all conflicts with sufficient data for detailed trend analysis. Thirty-six conflicts were identified, for which 3-year population and notification data were obtained. Overall crude TB notification rates were 81.9 and 105.1/100,000 pre and post start of conflict in these countries. Sufficient data existed in 16 countries to apply Poisson regression analysis to model 5-year pre and post start of conflict trends. This analysis indicated that the risk of presenting with TB in any country 2.5 years after the outbreak of conflict relative to 2.5 years before the outbreak was 1.016 (95%CI 0.9435-1.095). The modelling suggested that in the modern era war may not significantly damage efforts to control TB in the long term. This might be due to the limited scale of most of these conflicts compared to the large-scale civilian disruption associated with 'world wars'. The management of TB should be considered in planning post-conflict refugee and reconstruction programmes.
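A minimal numerical sketch of the pre/post comparison underlying such an analysis; the counts and person-years below are invented placeholders, and a single-covariate Poisson model reduces to this crude rate ratio, so this is not the authors' full trend-adjusted regression:

import numpy as np

# Hypothetical TB notifications and person-years, 2.5 years before/after conflict.
cases_pre, pyears_pre = 8_190, 10_000_000
cases_post, pyears_post = 10_510, 10_000_000

rate_pre = cases_pre / pyears_pre * 100_000    # notifications per 100,000
rate_post = cases_post / pyears_post * 100_000
rr = rate_post / rate_pre                      # crude rate ratio (relative risk)

# Approximate 95% CI on the log rate ratio for two Poisson counts.
se = np.sqrt(1 / cases_pre + 1 / cases_post)
ci = np.exp(np.log(rr) + np.array([-1.96, 1.96]) * se)
print(rate_pre, rate_post, rr, ci)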
NASA Astrophysics Data System (ADS)
Griffiths, John D.
2015-12-01
The modern understanding of the brain as a large, complex network of interacting elements is a natural consequence of the Neuron Doctrine [1,2] that has been bolstered in recent years by the tools and concepts of connectomics. In this abstracted, network-centric view, the essence of neural and cognitive function derives from the flows of activity and information - or, more generally, causal influence - between network elements. The appropriate characterization of causality in neural systems, therefore, is a question at the very heart of systems neuroscience.
Psychometric Properties and Normative Data for a Swedish Version of the Modern Health Worries Scale.
Palmquist, Eva; Petrie, Keith J; Nordin, Steven
2017-02-01
The modern health worries (MHW) scale was developed to assess individuals' worries about aspects of modernity and technology affecting personal health. The aim of this study was to psychometrically evaluate a Swedish version of the MHW scale and to provide Swedish normative data. Data were collected as part of the Västerbotten Environmental Health Study, which has a random sample of 3406 Swedish adults (18-79 years). The Swedish version of the MHW scale showed excellent internal consistency and satisfactory convergent validity. A four-factor structure consistent with the original version was confirmed. The model showed invariance across age and sex. A slightly positively skewed and platykurtic distribution was found. Normative data for the general population and for combinations of specific age groups (young, middle aged, and elderly) and sex are presented. The psychometric properties of the Swedish version of the MHW scale suggest that use of this instrument is appropriate for assessing worries about modernity in Swedish-speaking and similar populations. The scale now has the advantage of good normative data being available. MHW may hold importance for understanding and predicting the development of functional disorders, such as idiopathic environmental intolerance and other medically unexplained conditions.
Can a science-based definition of acupuncture improve clinical outcomes?
Priebe, Ted; Stumpf, Steven H; Zalunardo, Rod
2017-05-01
Research on acupuncture has been muddled by attempts to bridge the ancient with the modern. Barriers to effectiveness research are reflected in recurring conflicts that include disagreement on the use of the most basic terms, a lack of standard intervention controls, and the absence of functional measures for assessing treatment effect. Acupuncture research has stalled at the "placebo barrier", wherein acupuncture is "no better than placebo." The most widely recognized comparative effectiveness research in acupuncture does not compare acupuncture treatment protocols within groups, thereby mutating large-scale effectiveness studies into large-scale efficacy trials. Too often, research in acupuncture attempts to tie outcomes to traditional belief systems, thereby limiting the usefulness of the research. The acupuncture research paradigm needs to focus more closely on a scientific definition of treatments and outcomes that compares protocols in terms of prevalent clinical issues, such as relative effectiveness for treating pain.
Batteries for electric road vehicles.
Goodenough, John B; Braga, M Helena
2018-01-15
The dependence of modern society on the energy stored in a fossil fuel is not sustainable. An immediate challenge is to eliminate the polluting gases emitted from the roads of the world by replacing road vehicles powered by the internal combustion engine with those powered by rechargeable batteries. These batteries must be safe and competitive in cost, performance, driving range between charges, and convenience. The competitive performance of an electric car has been demonstrated, but the cost of fabrication, management to ensure safety, and a short cycle life have prevented large-scale penetration of the all-electric road vehicle into the market. Low-cost, safe all-solid-state cells from which dendrite-free alkali-metal anodes can be plated are now available; they have an operating temperature range from -20 °C to 80 °C and they permit the design of novel high-capacity, high-voltage cathodes providing fast charge/discharge rates. Scale-up to large multicell batteries is feasible.
NASA Technical Reports Server (NTRS)
Nguyen, D. T.; Watson, Willie R. (Technical Monitor)
2005-01-01
The overall objectives of this research work are to formulate and validate efficient parallel algorithms, and to efficiently design and implement computer software for solving large-scale acoustic problems arising from the unified framework of finite element procedures. The adopted parallel Finite Element (FE) Domain Decomposition (DD) procedures should take full advantage of the multiple processing capabilities offered by most modern high-performance computing platforms for efficient parallel computation. To achieve this objective, the formulation needs to integrate efficient sparse (and dense) assembly techniques, hybrid (or mixed) direct and iterative equation solvers, proper preconditioning strategies, unrolling strategies, and effective interprocessor communication schemes. Finally, the numerical performance of the developed parallel finite element procedures will be evaluated by solving a series of structural and acoustic (symmetric and unsymmetric) problems on different computing platforms. Comparisons with existing "commercial" and/or "public domain" software are also included, whenever possible.
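A minimal serial sketch of the iterative-solver-plus-preconditioner ingredient mentioned above, here a Jacobi-preconditioned conjugate gradient solve on a small sparse system; it illustrates the idea only and is not the authors' parallel domain-decomposition implementation:

import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import cg, LinearOperator

# Assemble a small sparse symmetric positive-definite test matrix
# (a 1-D Laplacian stands in for an assembled FE system).
n = 1000
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

# Jacobi (diagonal) preconditioner: approximate A^-1 by 1/diag(A).
inv_diag = 1.0 / A.diagonal()
M = LinearOperator((n, n), matvec=lambda x: inv_diag * x)

x, info = cg(A, b, M=M, maxiter=5000)
print(info, np.linalg.norm(A @ x - b))   # info == 0 indicates convergence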
Fault-tolerant Control of a Cyber-physical System
NASA Astrophysics Data System (ADS)
Roxana, Rusu-Both; Eva-Henrietta, Dulf
2017-10-01
Cyber-physical systems represent a new emerging field in automatic control. Fault handling is a key component, because modern, large-scale processes must meet high standards of performance, reliability and safety. Fault propagation in large-scale chemical processes can lead to loss of production, energy and raw materials, and even to environmental hazard. The present paper develops a multi-agent fault-tolerant control architecture using robust fractional-order controllers for a (13C) cryogenic separation column cascade. The JADE (Java Agent DEvelopment Framework) platform was used to implement the multi-agent fault-tolerant control system, while the operational model of the process was implemented in the Matlab/SIMULINK environment. The MACSimJX (Multiagent Control Using Simulink with Jade Extension) toolbox was used to link the control system and the process model. In order to verify the performance and to prove the feasibility of the proposed control architecture, several fault simulation scenarios were performed.
Probing the brain with molecular fMRI.
Ghosh, Souparno; Harvey, Peter; Simon, Jacob C; Jasanoff, Alan
2018-06-01
One of the greatest challenges of modern neuroscience is to incorporate our growing knowledge of molecular and cellular-scale physiology into integrated, organismic-scale models of brain function in behavior and cognition. Molecular-level functional magnetic resonance imaging (molecular fMRI) is a new technology that can help bridge these scales by mapping defined microscopic phenomena over large, optically inaccessible regions of the living brain. In this review, we explain how MRI-detectable imaging probes can be used to sensitize noninvasive imaging to mechanistically significant components of neural processing. We discuss how a combination of innovative probe design, advanced imaging methods, and strategies for brain delivery can make molecular fMRI an increasingly successful approach for spatiotemporally resolved studies of diverse neural phenomena, perhaps eventually in people.
Rethinking the history of modern agriculture: British pig production, c.1910-65.
Woods, Abigail
2012-01-01
This article uses a study of pig production in Britain, c.1910-65, to rethink the history of modern agriculture and its implications for human-animal relationships. Drawing on literature written by and for pig producers and experts, it challenges existing portrayals of a unidirectional, post-Second World War shift from traditional small-scale mixed farming to large, specialized, intensive systems. Rather, 'factory-style' pig production was already established in Britain by the 1930s, and its fortunes waxed and waned over time in relation to different kinds of outdoor production, which was still prominent in the mid-1960s. In revealing that the progressive proponents of both indoor and outdoor methods regarded them as modern and efficient, but defined and pursued these values in quite different ways, the article argues for a more historically situated understanding of agricultural modernity. Analysis reveals that regardless of their preferred production system, leading experts and producers were keen to develop what they considered to be natural methods that reflected the pig's natural needs and desires. They perceived pigs as active, sentient individuals, and believed that working in harmony with their natures was essential, even if this was, ultimately, for commercial ends. Such views contradict received accounts of modern farming as a utilitarian enterprise, concerned only with dominating and manipulating nature. They are used to argue that a romantic, moral view of the pig did not simply pre-date or emerge in opposition to modern agriculture, but, rather, was integral to it.
NASA Astrophysics Data System (ADS)
Sincavage, R.; Betka, P. M.; Seeber, L.; Steckler, M. S.; Zoramthara, C.
2017-12-01
The closure of an ocean basin involves the interplay of tectonics and sedimentology, whereby thick successions of fluvio-deltaic and shallow marine sediment accumulate in the closing gap between the subduction zone and passive margin. The transition from subduction to collision involves processes that are inherently time-transgressive and co-evolve to influence the nature of the developing tectonic wedge. The Indo-Burman Ranges (IBR) of eastern India present a unique opportunity to examine this scenario on a variety of spatial (10^-2 to 10^5 m^2) and temporal (1 a to 10 Ma) scales. Recent field mapping campaigns in the IBR have illuminated analogous depositional environments expressed in the Neogene outcrops of the IBR and the Holocene sediment archive of the Ganges-Brahmaputra-Meghna delta (GBMD). Six distinct lithofacies are present in shallow-marine to fluvial strata of the IBR, containing sedimentary structures that reflect depositional environments correlative with the modern delta. Cyclical alternations of fine sands and silts in packages on the order of 15-20 cm thick define part of the shallow-marine section (M2 facies) that we interpret to represent the foreset beds of the ancient subaqueous delta. The overall scale and sedimentary structures of M2 compare favorably with modern foreset deposits in the Bay of Bengal. Tan-orange, medium-grained, well-sorted fluvial sandstones that contain large-scale (1-10 m) tabular and trough cross-bedding represent large-river channel deposits (F2 facies) that overlie the shallow-marine strata. F2 deposits bear a striking resemblance in scale and character to bar deposits along the modern Jamuna River. Preliminary grain size analyses of the F2 facies yield grain size distributions that are remarkably consistent with Brahmaputra-sourced mid-Holocene sediments from Sylhet basin within the GBMD. Current research on the GBMD has revealed quantifiable trends in bed thicknesses, downstream fining, and grain size within fluvial deposits. These data will be coupled with ongoing structural and geo- and thermochronology field studies of the IBR that aim to continue to reveal the structural and stratigraphic evolution of this geologically active and densely populated region.
Modernization of US Nuclear Forces: Costs in Perspective
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tapia-Jimenez, D.
This short research paper addresses two topics that have emerged in the debate about whether, when, and how to modernize U.S. nuclear forces. The first topic relates to the size and scale of the planned nuclear force, with some critics of the modernization plan arguing that the United States is simply replicating the Cold War force for a very different era. The second topic relates to the cost of the modernization effort, with some critics arguing that the cost is unaffordable. This paper begins with a review of the changes in the size and scale of U.S. nuclear forces since the Cold War. It then examines the expected costs of modernization in a comparative perspective.
NASA Astrophysics Data System (ADS)
Tang, Shuaiqi; Zhang, Minghua
2015-08-01
Atmospheric vertical velocities and advective tendencies are essential large-scale forcing data to drive single-column models (SCMs), cloud-resolving models (CRMs), and large-eddy simulations (LESs). However, they cannot be directly measured from field measurements or easily calculated with great accuracy. In the Atmospheric Radiation Measurement Program (ARM), a constrained variational algorithm (1-D constrained variational analysis (1DCVA)) has been used to derive large-scale forcing data over a sounding network domain with the aid of flux measurements at the surface and top of the atmosphere (TOA). The 1DCVA algorithm is now extended into three dimensions (3DCVA) along with other improvements to calculate gridded large-scale forcing data, diabatic heating sources (Q1), and moisture sinks (Q2). Results are presented for a midlatitude cyclone case study on 3 March 2000 at the ARM Southern Great Plains site. These results are used to evaluate the diabatic heating fields in the available products such as Rapid Update Cycle, ERA-Interim, National Centers for Environmental Prediction Climate Forecast System Reanalysis, Modern-Era Retrospective Analysis for Research and Applications, Japanese 55-year Reanalysis, and North American Regional Reanalysis. We show that although the analyses/reanalyses generally capture the atmospheric state of the cyclone, their biases in the derivative terms (Q1 and Q2) at the regional scale of a few hundred kilometers are large, and all analyses/reanalyses tend to underestimate the subgrid-scale upward transport of moist static energy in the lower troposphere. The 3DCVA-gridded large-scale forcing data are physically consistent with the spatial distribution of surface and TOA measurements of radiation, precipitation, latent and sensible heat fluxes, and clouds, and are thus better suited to force SCMs, CRMs, and LESs. Possible applications of the 3DCVA are discussed.
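For readers unfamiliar with the notation, Q1 and Q2 are conventionally defined following the standard Yanai-type budget formulation; the overbars denote grid-scale averages, and this form is quoted as general background rather than from the paper itself:

\[ Q_1 = \frac{\partial \bar{s}}{\partial t} + \bar{\mathbf{v}} \cdot \nabla \bar{s} + \bar{\omega}\,\frac{\partial \bar{s}}{\partial p}, \qquad Q_2 = -L \left( \frac{\partial \bar{q}}{\partial t} + \bar{\mathbf{v}} \cdot \nabla \bar{q} + \bar{\omega}\,\frac{\partial \bar{q}}{\partial p} \right), \]

where s = c_p T + gz is the dry static energy, q the water vapour mixing ratio, omega the pressure velocity and L the latent heat of vaporization; Q1 is the apparent heat source and Q2 the apparent moisture sink.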
Multilevel water governance and problems of scale: setting the stage for a broader debate.
Moss, Timothy; Newig, Jens
2010-07-01
Environmental governance and management are facing a multiplicity of challenges related to spatial scales and multiple levels of governance. Water management is a field particularly sensitive to issues of scale because the hydrological system with its different scalar levels from small catchments to large river basins plays such a prominent role. It thus exemplifies fundamental issues and dilemmas of scale in modern environmental management and governance. In this introductory article to an Environmental Management special feature on "Multilevel Water Governance: Coping with Problems of Scale," we delineate our understanding of problems of scale and the dimensions of scalar politics that are central to water resource management. We provide an overview of the contributions to this special feature, concluding with a discussion of how scalar research can usefully challenge conventional wisdom on water resource management. We hope that this discussion of water governance stimulates a broader debate and inquiry relating to the scalar dimensions of environmental governance and management in general.
Modern Climate Analogues of Late-Quaternary Paleoclimates for the Western United States.
NASA Astrophysics Data System (ADS)
Mock, Cary Jeffrey
This study examined spatial variations of modern and late-Quaternary climates for the western United States. Synoptic climatological analyses of the modern record identified the predominant climatic controls that normally produce the principal modes of spatial climatic variability. They also provided a modern standard to assess past climates. Maps of the month-to-month changes in 500 mb heights, sea-level pressure, temperature, and precipitation illustrated how different climatic controls govern the annual cycle of climatic response. The patterns of precipitation ratios, precipitation bar graphs, and the seasonal precipitation maximum provided additional insight into how different climatic controls influence spatial climatic variations. Synoptic-scale patterns from general circulation model (GCM) simulations or from analyses of climatic indices were used as the basis for finding modern climate analogues for 18 ka and 9 ka. Composite anomaly maps of atmospheric circulation, precipitation, and temperature were compared with effective moisture maps compiled from proxy data to infer how the patterns, which were evident from the proxy data, were generated. The analyses of the modern synoptic climatology indicate that smaller-scale climatic controls must be considered along with larger-scale ones in order to explain patterns of spatial climate heterogeneity. Climatic extremes indicate that changes in the spatial patterns of precipitation seasonality are the exception rather than the rule, reflecting the strong influence of smaller-scale controls. Modern climate analogues for both 18 ka and 9 ka clearly depict the dry Northwest/wet Southwest contrast that is suggested by GCM simulations and paleoclimatic evidence. 18 ka analogues also show the importance of smaller-scale climatic controls in explaining spatial climatic variation in the Northwest and northern Great Plains. 9 ka analogues provide climatological explanations for patterns of spatial heterogeneity over several mountainous areas as suggested by paleoclimatic evidence. Modern analogues of past climates supplement modeling approaches by providing information below the resolution of model simulations. Analogues can be used to examine the controls of spatial paleoclimatic variation if sufficient instrumental data and paleoclimatic evidence are available, and if one carefully exercises uniformitarianism when extrapolating modern relationships to the past.
Extreme reaction times determine fluctuation scaling in human color vision
NASA Astrophysics Data System (ADS)
Medina, José M.; Díaz, José A.
2016-11-01
In modern mental chronometry, human reaction time defines the time elapsed from stimulus presentation until a response occurs and represents a reference paradigm for investigating stochastic latency mechanisms in color vision. Here we examine the statistical properties of extreme reaction times and whether they support fluctuation scaling in the skewness-kurtosis plane. Reaction times were measured for visual stimuli across the cardinal directions of the color space. For all subjects, the results show that very large reaction times deviate from the right tail of reaction time distributions, suggesting the existence of dragon-king events. The results also indicate that extreme reaction times are correlated and shape fluctuation scaling over a wide range of stimulus conditions. The scaling exponent was higher for achromatic than for isoluminant stimuli, suggesting distinct generative mechanisms. Our findings open a new perspective for studying failure modes in sensory-motor communications and in complex networks.
Acceleration and propagation of ultrahigh energy cosmic rays
NASA Astrophysics Data System (ADS)
Lemoine, Martin
2013-02-01
The origin of the highest energy cosmic rays represents one of the most conspicuous enigmas of modern astrophysics, in spite of gigantic experimental efforts over the past fifty years and of active theoretical research. The past decade has seen exciting experimental results, most particularly the detection of a cut-off at the expected position for the long-sought Greisen-Zatsepin-Kuzmin suppression, as well as evidence for large-scale anisotropies. This paper summarizes and discusses recent achievements in this field.
NASA Astrophysics Data System (ADS)
Galison, Peter
2010-02-01
Secrecy in matters of national defense goes back far past antiquity. But our modern form of national secrecy owes a huge amount to the large-scale, systematic, and technical system of scientific secrecy that began in the Radar and Manhattan Projects of World War II and came to its current form in the Cold War. Here I would like to capture some of this trajectory and to present some of the paradoxes and deep conundrums that our secrecy system offers us in the post-Cold War world.
ΛGR Centennial: Cosmic Web in Dark Energy Background
NASA Astrophysics Data System (ADS)
Chernin, A. D.
The basic building blocks of the Cosmic Web are groups and clusters of galaxies, superclusters (pancakes) and filaments embedded in the universal dark energy background. The background produces antigravity, and the antigravity effect is strong in groups, clusters and superclusters. Antigravity is very weak in filaments, where matter (dark matter and baryons) produces gravity that dominates the internal dynamics of the filament. The gravity-antigravity interplay on large scales is a grandiose phenomenon predicted by ΛGR theory and seen in modern observations of the Cosmic Web.
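In this picture, the scale at which dark-energy antigravity overtakes the gravity of a mass concentration is often characterized by a zero-gravity radius; the expression below is the standard form of that estimate and is given as background rather than quoted from the abstract:

\[ R_{\mathrm{ZG}} = \left( \frac{3 M}{8 \pi \rho_\Lambda} \right)^{1/3}, \]

where M is the total (dark plus baryonic) matter mass of the group or cluster and rho_Lambda the dark energy density. Gravity dominates inside R_ZG and antigravity outside, which is why the effect matters for groups, clusters and superclusters while remaining negligible inside matter-dominated filaments.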
He, Guizhen; Zhang, Lei; Lu, Yonglong
2009-09-01
Large-scale public infrastructure projects have featured in China's modernization course since the early 1980s. During the early stages of China's rapid economic development, public attention focused on the economic and social impact of high-profile construction projects. In recent years, however, we have seen a shift in public concern toward the environmental and ecological effects of such projects, and today governments are required to provide valid environmental impact assessments prior to allowing large-scale construction. The official requirement for the monitoring of environmental conditions has led to an increased number of debates in recent years regarding the effectiveness of Environmental Impact Assessments (EIAs) and Governmental Environmental Audits (GEAs) as environmental safeguards in instances of large-scale construction. Although EIA and GEA are conducted by different institutions and have different goals and enforcement potential, these two practices can be closely related in terms of methodology. This article cites the construction of the Qinghai-Tibet Railway as an instance in which EIA and GEA offer complementary approaches to environmental impact management. This study concludes that the GEA approach can serve as an effective follow-up to the EIA and establishes that the EIA lays a base for conducting future GEAs. The relationship that emerges through a study of the Railway's construction calls for more deliberate institutional arrangements and cooperation if the two practices are to be used in concert to optimal effect.
Trace: a high-throughput tomographic reconstruction engine for large-scale datasets.
Bicer, Tekin; Gürsoy, Doğa; Andrade, Vincent De; Kettimuthu, Rajkumar; Scullin, William; Carlo, Francesco De; Foster, Ian T
2017-01-01
Modern synchrotron light sources and detectors produce data at such scale and complexity that large-scale computation is required to unleash their full power. One of the widely used imaging techniques that generates data at tens of gigabytes per second is computed tomography (CT). Although CT experiments result in rapid data generation, the analysis and reconstruction of the collected data may require hours or even days of computation time with a medium-sized workstation, which hinders the scientific progress that relies on the results of analysis. We present Trace, a data-intensive computing engine that we have developed to enable high-performance implementation of iterative tomographic reconstruction algorithms for parallel computers. Trace provides fine-grained reconstruction of tomography datasets using both (thread-level) shared memory and (process-level) distributed memory parallelization. Trace utilizes a special data structure called replicated reconstruction object to maximize application performance. We also present the optimizations that we apply to the replicated reconstruction objects and evaluate them using tomography datasets collected at the Advanced Photon Source. Our experimental evaluations show that our optimizations and parallelization techniques can provide 158× speedup using 32 compute nodes (384 cores) over a single-core configuration and decrease the end-to-end processing time of a large sinogram (with 4501 × 1 × 22,400 dimensions) from 12.5 h to <5 min per iteration. The proposed tomographic reconstruction engine can efficiently process large-scale tomographic data using many compute nodes and minimize reconstruction times.
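A minimal sketch of the replicated-object idea described above, in which each worker accumulates partial updates into its own copy of the reconstruction and the copies are reduced afterwards; this illustrates the data structure only (with a trivial stand-in for the per-projection update) and is not the Trace engine's actual API:

import numpy as np
from concurrent.futures import ThreadPoolExecutor

def process_projections(projections, recon_shape):
    # Each worker gets a private replica, so no locking is needed while it
    # accumulates contributions from its share of the projections.
    replica = np.zeros(recon_shape)
    for p in projections:
        replica += p            # stand-in for a real backprojection update
    return replica

n_workers = 4
recon_shape = (256, 256)
rng = np.random.default_rng(1)
projections = [rng.random(recon_shape) for _ in range(64)]
chunks = [projections[i::n_workers] for i in range(n_workers)]

with ThreadPoolExecutor(max_workers=n_workers) as pool:
    replicas = list(pool.map(process_projections, chunks, [recon_shape] * n_workers))

# Reduction step: combine the per-worker replicas into one reconstruction.
reconstruction = np.sum(replicas, axis=0)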
Military Health System Transformation Implications on Health Information Technology Modernization.
Khan, Saad
2018-03-01
With the recent passage of the National Defense Authorization Act for Fiscal Year 2017, Congress has triggered groundbreaking Military Health System organizational restructuring with the Defense Health Agency assuming responsibility for managing all hospitals and clinics owned by the Army, Navy, and Air Force. This is a major shift toward a modern value-based managed care system, which will require much greater military-civilian health care delivery integration to be in place by October 2018. Just before the National Defense Authorization Act for Fiscal Year 2017 passage, the Department of Defense had already begun a seismic shift and awarded a contract for the new Military Health System-wide electronic health record system. In this perspective, we discuss the implications of the intersection of two large-scope and large-scale initiatives, health system transformation, and information technology modernization, being rolled out in the largest and most complex federal agency and potential risk mitigating steps. The Military Health System will require an expanded unified clinical leadership to spearhead short-term transformation; furthermore, developing, organizing, and growing a cadre of informatics expertise to expand the use and diffusion of novel solutions such as health information exchanges, data analytics, and others to transcend organizational barriers are still needed to achieve the long-term aim of health system reform as envisioned by the National Defense Authorization Act for Fiscal Year 2017.
Implementing WebGL and HTML5 in Macromolecular Visualization and Modern Computer-Aided Drug Design.
Yuan, Shuguang; Chan, H C Stephen; Hu, Zhenquan
2017-06-01
Web browsers have long been recognized as potential platforms for remote macromolecule visualization. However, the difficulty of transferring large-scale data to clients and the lack of native support for hardware-accelerated applications in the local browser have undermined the feasibility of such utilities. With the introduction of WebGL and HTML5 technologies in recent years, it is now possible to exploit the power of a graphics-processing unit (GPU) from a browser without any third-party plugin. Many new tools have been developed for biological molecule visualization and modern drug discovery. In contrast to traditional offline tools, WebGL- and HTML5-based tools feature real-time computing, interactive data analysis, and cross-platform analyses, facilitating biological research in a more efficient and user-friendly way.
What Can Modern River Profiles Tell Us about Orogenic Processes and Orogen Evolution?
NASA Astrophysics Data System (ADS)
Whipple, K. X.
2008-12-01
Numerous lines of evidence from theory, numerical simulations, and physical experiments suggest that orogen evolution is strongly coupled to atmospheric processes through the interrelationships among climate, topography, and erosion rate. In terms of orogenic processes and orogen evolution, these relationships are most important at the regional scale (mean topographic gradient, mean relief above surrounding plains) largely because crustal deformation is most sensitive to erosional unloading averaged over sufficiently long wavelengths. For this reason, and because above moderate erosion rates (> 0.2 mm/yr) hillslope form becomes decoupled from erosion rate, attention has focused on the river network, and even on particularly large rivers. We now have data that demonstrates a monotonic relationship between erosion rate and the channel steepness index (slope normalized for differences in drainage area) in a variety of field settings. Consequently, study of modern river profiles can yield useful information on recent and on-going patterns of rock uplift. It is not yet possible, however, to quantitatively isolate expected climatic and lithologic influences on this relationship. A combination of field studies and theoretical analyses are beginning to reveal the timescale of landscape response, and thus the topographic memory of past conditions. At orogen scale, river profile response to a change in rock uplift rate is on the order of 1-10 Myr. Because of these long response times, the modern profiles of large rivers and their major tributaries can potentially preserve an interpretable record of rock uplift rates since the Miocene and are insensitive to short-term climatic fluctuations. Only significant increases in rock uplift rate, however, are likely to leave a clear topographic signature. Strategies have been developed to differentiate between temporal and spatial (tectonic, climatic, or lithologic) influences on channel profile form, especially where spatially distributed data on recent incision rates is available. A more difficult question is one of cause and effect. Only in some circumstances is it possible to determine whether rivers are steep in response to localized rock uplift or whether localized rock uplift occurs in response to rapidly incising steep rivers.
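The channel steepness index referred to above is conventionally defined through the slope-area relation; the form below is the standard definition used in the literature, and the reference concavity value is a typical choice rather than one stated in the abstract:

\[ S = k_{sn}\, A^{-\theta_{\mathrm{ref}}} \quad\Longleftrightarrow\quad k_{sn} = S\, A^{\theta_{\mathrm{ref}}}, \]

where S is local channel slope, A upstream drainage area, and theta_ref a fixed reference concavity (commonly near 0.45). Holding theta_ref fixed is what makes k_sn a slope "normalized for differences in drainage area", and it is this index that correlates monotonically with erosion rate.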
Complementary approaches to diagnosing marine diseases: a union of the modern and the classic
Burge, Colleen A.; Friedman, Carolyn S.; Getchell, Rodman G.; House, Marcia; Lafferty, Kevin D.; Mydlarz, Laura D.; Prager, Katherine C.; Sutherland, Kathryn P.; Renault, Tristan; Kiryu, Ikunari; Vega-Thurber, Rebecca
2016-01-01
Linking marine epizootics to a specific aetiology is notoriously difficult. Recent diagnostic successes show that marine disease diagnosis requires both modern, cutting-edge technology (e.g. metagenomics, quantitative real-time PCR) and more classic methods (e.g. transect surveys, histopathology and cell culture). Here, we discuss how this combination of traditional and modern approaches is necessary for rapid and accurate identification of marine diseases, and emphasize how sole reliance on any one technology or technique may lead disease investigations astray. We present diagnostic approaches at different scales, from the macro (environment, community, population and organismal scales) to the micro (tissue, organ, cell and genomic scales). We use disease case studies from a broad range of taxa to illustrate diagnostic successes from combining traditional and modern diagnostic methods. Finally, we recognize the need for increased capacity of centralized databases, networks, data repositories and contingency plans for diagnosis and management of marine disease.
Abebe, Gumataw K; Chalak, Ali; Abiad, Mohamad G
2017-07-01
Food safety is a key public health issue worldwide. This study aims to characterise existing governance mechanisms - governance structures (GSs) and food safety management systems (FSMSs) - and analyse their alignment in detecting food safety hazards, based on empirical evidence from Lebanon. Firm-to-firm and public baseline are the dominant FSMSs applied at a large scale, while chain-wide FSMSs are observed only at a small scale. Most transactions involving farmers are relational and market-based, in contrast to (large-scale) processors, which opt for hierarchical GSs. Large-scale processors use a combination of FSMSs and GSs to minimise food safety hazards, albeit with a potential increase in coordination costs; this is an important feature of modern food supply chains. The econometric analysis reveals that contract period, on-farm inspection and experience have significant effects in minimising food safety hazards. However, the potential to implement farm-level FSMSs is influenced by the formality of the contract, herd size, trading partner choice, and experience. Public baseline FSMSs appear effective in controlling food safety hazards; however, this may not be viable due to the scarcity of public resources. We suggest public policies focus on long-lasting governance mechanisms by introducing incentive schemes, and on farm-level FSMSs by providing loans and education to farmers.
C3: A Command-line Catalogue Cross-matching tool for modern astrophysical survey data
NASA Astrophysics Data System (ADS)
Riccio, Giuseppe; Brescia, Massimo; Cavuoti, Stefano; Mercurio, Amata; di Giorgio, Anna Maria; Molinari, Sergio
2017-06-01
In the current data-driven science era, data analysis techniques must evolve quickly to cope with data whose dimensions have grown to the petabyte scale. In particular, because modern astrophysics is based on multi-wavelength data organized into large catalogues, it is crucial that astronomical catalogue cross-matching methods, which depend strongly on catalogue size, ensure efficiency, reliability and scalability. Furthermore, multi-band data are archived and reduced in different ways, so that the resulting catalogues may differ from each other in format, resolution, data structure, etc., thus requiring the highest generality of cross-matching features. We present C3 (Command-line Catalogue Cross-match), a multi-platform application designed to efficiently cross-match massive catalogues from modern surveys. Conceived as a stand-alone command-line process or as a module within a generic data reduction/analysis pipeline, it provides maximum flexibility in terms of portability, configuration, coordinates and cross-matching types, ensuring high performance by using a multi-core parallel processing paradigm and a sky-partitioning algorithm.
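A minimal sketch of the underlying cross-match operation that such tools scale up to massive catalogues, here using astropy's nearest-neighbour sky matching; the mock catalogues and the 1-arcsecond tolerance are assumptions for illustration, and this is not C3's own interface:

import numpy as np
from astropy.coordinates import SkyCoord
import astropy.units as u

# Two small mock catalogues; in practice these would be read from survey files.
rng = np.random.default_rng(2)
cat1 = SkyCoord(ra=rng.uniform(0, 1, 500) * u.deg,
                dec=rng.uniform(0, 1, 500) * u.deg)
cat2 = SkyCoord(ra=rng.uniform(0, 1, 800) * u.deg,
                dec=rng.uniform(0, 1, 800) * u.deg)

# For each source in cat1, find its nearest neighbour in cat2.
idx, sep2d, _ = cat1.match_to_catalog_sky(cat2)

# Keep only matches closer than an assumed 1-arcsecond tolerance.
good = sep2d < 1.0 * u.arcsec
print(good.sum(), "matches within 1 arcsec")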
Automation of large scale transient protein expression in mammalian cells
Zhao, Yuguang; Bishop, Benjamin; Clay, Jordan E.; Lu, Weixian; Jones, Margaret; Daenke, Susan; Siebold, Christian; Stuart, David I.; Yvonne Jones, E.; Radu Aricescu, A.
2011-01-01
Traditional mammalian expression systems rely on the time-consuming generation of stable cell lines; this is difficult to accommodate within a modern structural biology pipeline. Transient transfections are a fast, cost-effective solution, but require skilled cell culture scientists, making man-power a limiting factor in a setting where numerous samples are processed in parallel. Here we report a strategy employing a customised CompacT SelecT cell culture robot allowing the large-scale expression of multiple protein constructs in a transient format. Successful protocols have been designed for automated transient transfection of human embryonic kidney (HEK) 293T and 293S GnTI− cells in various flask formats. Protein yields obtained by this method were similar to those produced manually, with the added benefit of reproducibility, regardless of user. Automation of cell maintenance and transient transfection allows the expression of high quality recombinant protein in a completely sterile environment with limited support from a cell culture scientist. The reduction in human input has the added benefit of enabling continuous cell maintenance and protein production, features of particular importance to structural biology laboratories, which typically use large quantities of pure recombinant proteins, and often require rapid characterisation of a series of modified constructs. This automated method for large scale transient transfection is now offered as a Europe-wide service via the P-cube initiative. PMID:21571074
Low-Cost and Large-Area Electronics, Roll-to-Roll Processing and Beyond
NASA Astrophysics Data System (ADS)
Wiesenhütter, Katarzyna; Skorupa, Wolfgang
In the following chapter, the authors conduct a literature survey of current advances in state-of-the-art low-cost, flexible electronics. A new emerging trend in the design of modern semiconductor devices dedicated to scaling up, rather than reducing, their dimensions is presented. To realize volume manufacturing, alternative semiconductor materials with superior performance, fabricated by innovative processing methods, are essential. This review provides readers with a general overview of the material and technology evolution in the area of macroelectronics. Herein, the term macroelectronics (MEs) refers to electronic systems that can cover a large area of flexible media. In stark contrast to well-established micro- and nano-scale semiconductor devices, where property improvement is associated with downscaling the dimensions of the functional elements, in macroelectronic systems the overall size defines the ultimate performance (Sun and Rogers, Adv. Mater. 19:1897-1916, 2007).
The Impact of Continental Configuration on Global Response to Large Igneous Province Eruptions
NASA Astrophysics Data System (ADS)
Stellmann, J.; West, A. J.; Ridgwell, A.; Becker, T. W.
2017-12-01
The impact of Large Igneous Province eruptions as recorded in the geologic record varies widely; some eruptions cause global warming, large-scale ocean acidification and anoxia, and mass extinctions, while others cause some or none of these phenomena. There are several potential factors which may determine the global response to a Large Igneous Province eruption; here we consider continental configuration. The arrangement of continents controls the extent of shallow seas, ocean circulation and planetary albedo, all factors which impact global climate and its response to sudden changes in greenhouse gas concentrations. To assess the potential impact of continental configuration, a suite of simulated eruptions was carried out using the cGENIE Earth system model in two end-member continental configurations: the end-Permian supercontinent and the modern. Eruptions simulated are comparable to an individual pulse of a Large Igneous Province eruption, with total CO2 emissions of 1,000 or 10,000 GtC erupted over 1,000 or 10,000 years, spanning eruption rates of 0.1-10 GtC/yr. Global response is characterized by measuring the magnitude and duration of changes to atmospheric concentration of CO2, saturation state of calcite and ocean oxygen levels. Preliminary model results show that the end-Permian continental configuration and conditions (radiative balance, ocean chemistry) lead to smaller-magnitude and shorter-duration changes in atmospheric pCO2 and ocean calcite saturation state following the simulated eruption than the modern configuration.
Species composition and morphologic variation of Porites in the Gulf of California
NASA Astrophysics Data System (ADS)
López-Pérez, R. A.
2013-09-01
Morphometric analysis of corallite calices confirmed that from the late Miocene to the Recent, four species of Porites have inhabited the Gulf of California: the extinct Porites carrizensis, the locally extirpated Porites lobata and the extant Porites sverdrupi and Porites panamensis. Furthermore, large-scale spatial and temporal phenotypic plasticity was observed in the dominant species P. panamensis. Canonical discriminant analysis and ANOVA demonstrated that the calice structures of P. panamensis experienced size reduction between the late Pleistocene and Recent. Similarly, PERMANOVA, regression and correlation analyses demonstrated that across the 800 km north to south in the gulf, P. panamensis populations displayed a similar reduction in calice structures. Based on correlation analysis with environmental data, these large spatial changes are likely related to changes in nutrient concentration and sea surface temperature. As such, the large-scale spatial and temporal phenotypic variation recorded in populations of P. panamensis in the Gulf of California is likely related to optimization of corallite performance (energy acquisition) within various environmental scenarios. These findings may have relevance to modern conservation efforts within this ecologically dominant genus.
Layer-by-layer assembly of two-dimensional materials into wafer-scale heterostructures
NASA Astrophysics Data System (ADS)
Kang, Kibum; Lee, Kan-Heng; Han, Yimo; Gao, Hui; Xie, Saien; Muller, David A.; Park, Jiwoong
2017-10-01
High-performance semiconductor films with vertical compositions that are designed to atomic-scale precision provide the foundation for modern integrated circuitry and novel materials discovery. One approach to realizing such films is sequential layer-by-layer assembly, whereby atomically thin two-dimensional building blocks are vertically stacked, and held together by van der Waals interactions. With this approach, graphene and transition-metal dichalcogenides--which represent one- and three-atom-thick two-dimensional building blocks, respectively--have been used to realize previously inaccessible heterostructures with interesting physical properties. However, no large-scale assembly method exists at present that maintains the intrinsic properties of these two-dimensional building blocks while producing pristine interlayer interfaces, thus limiting the layer-by-layer assembly method to small-scale proof-of-concept demonstrations. Here we report the generation of wafer-scale semiconductor films with a very high level of spatial uniformity and pristine interfaces. The vertical composition and properties of these films are designed at the atomic scale using layer-by-layer assembly of two-dimensional building blocks under vacuum. We fabricate several large-scale, high-quality heterostructure films and devices, including superlattice films with vertical compositions designed layer-by-layer, batch-fabricated tunnel device arrays with resistances that can be tuned over four orders of magnitude, band-engineered heterostructure tunnel diodes, and millimetre-scale ultrathin membranes and windows. The stacked films are detachable, suspendable and compatible with water or plastic surfaces, which will enable their integration with advanced optical and mechanical systems.
Layer-by-layer assembly of two-dimensional materials into wafer-scale heterostructures.
Kang, Kibum; Lee, Kan-Heng; Han, Yimo; Gao, Hui; Xie, Saien; Muller, David A; Park, Jiwoong
2017-10-12
High-performance semiconductor films with vertical compositions that are designed to atomic-scale precision provide the foundation for modern integrated circuitry and novel materials discovery. One approach to realizing such films is sequential layer-by-layer assembly, whereby atomically thin two-dimensional building blocks are vertically stacked, and held together by van der Waals interactions. With this approach, graphene and transition-metal dichalcogenides--which represent one- and three-atom-thick two-dimensional building blocks, respectively--have been used to realize previously inaccessible heterostructures with interesting physical properties. However, no large-scale assembly method exists at present that maintains the intrinsic properties of these two-dimensional building blocks while producing pristine interlayer interfaces, thus limiting the layer-by-layer assembly method to small-scale proof-of-concept demonstrations. Here we report the generation of wafer-scale semiconductor films with a very high level of spatial uniformity and pristine interfaces. The vertical composition and properties of these films are designed at the atomic scale using layer-by-layer assembly of two-dimensional building blocks under vacuum. We fabricate several large-scale, high-quality heterostructure films and devices, including superlattice films with vertical compositions designed layer-by-layer, batch-fabricated tunnel device arrays with resistances that can be tuned over four orders of magnitude, band-engineered heterostructure tunnel diodes, and millimetre-scale ultrathin membranes and windows. The stacked films are detachable, suspendable and compatible with water or plastic surfaces, which will enable their integration with advanced optical and mechanical systems.
New evidence for the barrier reef model, Permian Capitan Reef complex, New Mexico
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kirkland, B.L.; Moore, C.H. Jr.
1990-05-01
Recent paleontologic and petrologic observations suggest that the Capitan Formation was deposited as an organic or ecologic reef that acted as an emergent barrier to incoming wave energy. In outcrops in the Guadalupe Mountains and within Carlsbad Caverns, massive reef boundstone contains a highly diverse assemblage of frame-building and binding organisms. In modern reefs, diversity among frame builders decreases dramatically with depth. Marine cement is abundant in reef boundstone, but limited in back-reef grainstone and packstone. This cementation pattern is similar to that observed in modern emergent barrier reef systems. Based on comparison with modern analogs, these dasyclad-dominated back-reef sediments and their associated biota are indicative of shallow, hypersaline conditions. Few of these dasyclads exhibit broken or abraded segments and some thallus sections are still articulated, suggesting that low-energy, hypersaline conditions occurred immediately shelfward of the reef. In addition, large-scale topographic features, such as possible spur and groove structures between Walnut Canyon and Rattlesnake Canyon, and facies geometries, such as the reef to shelf transition, resemble those found in modern shallow-water reefs. The organisms that formed the Capitan Reef appear to have lived in, and responded to, physical and chemical conditions similar to those that control the geometry of modern shallow-water reefs. Like their modern counterparts, they seem to have strongly influenced adjacent environments. In light of this evidence, consideration should be given to either modifying or abandoning the marginal mound model in favor of the originally proposed barrier reef model.
McKell, Allison O.; Rippinger, Christine M.; McAllen, John K.; Akopov, Asmik; Kirkness, Ewen F.; Payne, Daniel C.; Edwards, Kathryn M.; Chappell, James D.; Patton, John T.
2012-01-01
Group A rotaviruses (RVs) are 11-segmented, double-stranded RNA viruses and are primary causes of gastroenteritis in young children. Despite their medical relevance, the genetic diversity of modern human RVs is poorly understood, and the impact of vaccine use on circulating strains remains unknown. In this study, we report the complete genome sequence analysis of 58 RVs isolated from children with severe diarrhea and/or vomiting at Vanderbilt University Medical Center (VUMC) in Nashville, TN, during the years spanning community vaccine implementation (2005 to 2009). The RVs analyzed include 36 G1P[8], 18 G3P[8], and 4 G12P[8] Wa-like genogroup 1 strains with VP6-VP1-VP2-VP3-NSP1-NSP2-NSP3-NSP4-NSP5/6 genotype constellations of I1-R1-C1-M1-A1-N1-T1-E1-H1. By constructing phylogenetic trees, we identified 2 to 5 subgenotype alleles for each gene. The results show evidence of intragenogroup gene reassortment among the cocirculating strains. However, several isolates from different seasons maintained identical allele constellations, consistent with the notion that certain RV clades persisted in the community. By comparing the genes of VUMC RVs to those of other archival and contemporary RV strains for which sequences are available, we defined phylogenetic lineages and verified that the diversity of the strains analyzed in this study reflects that seen in other regions of the world. Importantly, the VP4 and VP7 proteins encoded by VUMC RVs and other contemporary strains show amino acid changes in or near neutralization domains, which might reflect antigenic drift of the virus. Thus, this large-scale, comparative genomic study of modern human RVs provides significant insight into how this pathogen evolves during its spread in the community. PMID:22696651
Pardikes, Nicholas A; Shapiro, Arthur M; Dyer, Lee A; Forister, Matthew L
2015-11-01
Understanding the spatial and temporal scales at which environmental variation affects populations of plants and animals is an important goal for modern population biology, especially in the context of shifting climatic conditions. The El Niño Southern Oscillation (ENSO) generates climatic extremes of interannual variation, and has been shown to have significant effects on the diversity and abundance of a variety of terrestrial taxa. However, studies that have investigated the influence of such large-scale climate phenomena have often been limited in spatial and taxonomic scope. We used 23 years (1988-2010) of a long-term butterfly monitoring data set to explore associations between variation in population abundance of 28 butterfly species and variation in ENSO-derived sea surface temperature anomalies (SSTA) across 10 sites that encompass an elevational range of 2750 m in the Sierra Nevada mountain range of California. Our analysis detected a positive, regional effect of increased SSTA on butterfly abundance (wetter and warmer years predict more butterfly observations), yet the influence of SSTA on butterfly abundances varied along the elevational gradient, and also differed greatly among the 28 species. Migratory species had the strongest relationships with ENSO-derived SSTA, suggesting that large-scale climate indices are particularly valuable for understanding biotic-abiotic relationships of the most mobile species. In general, however, the ecological effects of large-scale climatic factors are context dependent between sites and species. Our results illustrate the power of long-term data sets for revealing pervasive yet subtle climatic effects, but also caution against expectations derived from exemplar species or single locations in the study of biotic-abiotic interactions.
[The vegetation adventivisation through perspective of modern ecological ideas].
Mirkin, B M; Naumova, L G
2002-01-01
The results of studies of vegetation adventivisation (an increase in the proportion of invasive species) are consistent with modern ecological theory, which denies general universal laws. Different traits of invasive species play different roles under various ecological conditions and at various temporal and spatial scales. The invasibility of communities under various conditions is determined by a combination of biotic and abiotic factors, although most invasive species are characterized by high seed production, well-developed vegetative propagation, wind pollination, high plasticity, efficient use of resources, and low consumption by herbivores. A definition of an "ideal invasive species" or an "ideal invasible community" is therefore impossible. The regularities of vegetation adventivisation can be observed clearly only at very large scales.
Ancient genomes revisit the ancestry of domestic and Przewalski's horses.
Gaunitz, Charleen; Fages, Antoine; Hanghøj, Kristian; Albrechtsen, Anders; Khan, Naveed; Schubert, Mikkel; Seguin-Orlando, Andaine; Owens, Ivy J; Felkel, Sabine; Bignon-Lau, Olivier; de Barros Damgaard, Peter; Mittnik, Alissa; Mohaseb, Azadeh F; Davoudi, Hossein; Alquraishi, Saleh; Alfarhan, Ahmed H; Al-Rasheid, Khaled A S; Crubézy, Eric; Benecke, Norbert; Olsen, Sandra; Brown, Dorcas; Anthony, David; Massy, Ken; Pitulko, Vladimir; Kasparov, Aleksei; Brem, Gottfried; Hofreiter, Michael; Mukhtarova, Gulmira; Baimukhanov, Nurbol; Lõugas, Lembi; Onar, Vedat; Stockhammer, Philipp W; Krause, Johannes; Boldgiv, Bazartseren; Undrakhbold, Sainbileg; Erdenebaatar, Diimaajav; Lepetz, Sébastien; Mashkour, Marjan; Ludwig, Arne; Wallner, Barbara; Merz, Victor; Merz, Ilja; Zaibert, Viktor; Willerslev, Eske; Librado, Pablo; Outram, Alan K; Orlando, Ludovic
2018-04-06
The Eneolithic Botai culture of the Central Asian steppes provides the earliest archaeological evidence for horse husbandry, ~5500 years ago, but the exact nature of early horse domestication remains controversial. We generated 42 ancient-horse genomes, including 20 from Botai. Compared to 46 published ancient- and modern-horse genomes, our data indicate that Przewalski's horses are the feral descendants of horses herded at Botai and not truly wild horses. All domestic horses dated from ~4000 years ago to present only show ~2.7% of Botai-related ancestry. This indicates that a massive genomic turnover underpins the expansion of the horse stock that gave rise to modern domesticates, which coincides with large-scale human population expansions during the Early Bronze Age. Copyright © 2018 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.
NASA Astrophysics Data System (ADS)
Rowe, H. D.; Dunbar, R. B.
2004-09-01
A basin-scale hydrologic-energy balance model that integrates modern climatological, hydrological, and hypsographic observations was developed for the modern Lake Titicaca watershed (northern Altiplano, South America) and operated under variable conditions to understand controls on post-glacial changes in lake level. The model simulates changes in five environmental variables (air temperature, cloud fraction, precipitation, relative humidity, and land surface albedo). Relatively small changes in three meteorological variables (mean annual precipitation, temperature, and/or cloud fraction) explain the large mid-Holocene lake-level decrease (˜85 m) inferred from seismic reflection profiling and supported by sediment-based paleoproxies from lake sediments. Climatic controls that shape the present-day Altiplano and the sediment-based record of Holocene lake-level change are combined to interpret model-derived lake-level simulations in terms of changes in the mean state of ENSO and its impact on moisture transport to the Altiplano.
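As a rough illustration of the closed-basin balance such a model encapsulates, the sketch below computes the equilibrium lake area at which catchment runoff offsets net evaporation over the lake surface; all parameter values and names are illustrative assumptions, not those of the published model.

    # Toy closed-basin water balance (illustrative values only).
    def equilibrium_lake_area_km2(precip_m, evap_m, runoff_coeff, catchment_area_km2):
        """Lake area at which runoff input equals net open-water evaporative loss."""
        runoff_km3 = runoff_coeff * precip_m * catchment_area_km2 / 1e3   # km3/yr into the lake
        net_loss_per_km2 = (evap_m - precip_m) / 1e3                      # km3/yr lost per km2 of lake
        return runoff_km3 / net_loss_per_km2

    modern = equilibrium_lake_area_km2(0.75, 1.45, 0.2, 49_000)
    drier = equilibrium_lake_area_km2(0.60, 1.45, 0.2, 49_000)
    print(f"modern-like forcing: {modern:,.0f} km2; 20% drier: {drier:,.0f} km2")

Even a modest reduction in precipitation shrinks the equilibrium lake area (and hence level) substantially, which is the qualitative sensitivity the full hydrologic-energy balance model quantifies.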
The role of soil weathering and hydrology in regulating chemical fluxes from catchments (Invited)
NASA Astrophysics Data System (ADS)
Maher, K.; Chamberlain, C. P.
2010-12-01
Catchment-scale chemical fluxes have been linked to a number of different parameters that describe the conditions at the Earth’s surface, including runoff, temperature, rock type, vegetation, and the rate of tectonic uplift. However, many of the relationships relating chemical denudation to surface processes and conditions, while based on established theoretical principles, are largely empirical and derived solely from modern observations. Thus, an enhanced mechanistic basis for linking global solute fluxes to both surface processes and climate may improve our confidence in extrapolating modern solute fluxes to past and future conditions. One approach is to link observations from detailed soil-based studies with catchment-scale properties. For example, a number of recent studies of chemical weathering at the soil-profile scale have reinforced the importance of hydrologic processes in controlling chemical weathering rates. An analysis of data from granitic soils shows that weathering rates decrease with increasing fluid residence times and decreasing flow rates—over moderate fluid residence times, from 5 days to 10 years, transport-controlled weathering explains the orders of magnitude variation in weathering rates to a better extent than soil age. However, the importance of transport-controlled weathering is difficult to discern at the catchment scale because of the range of flow rates and fluid residence times captured by a single discharge or solute flux measurement. To assess the importance of transport-controlled weathering on catchment scale chemical fluxes, we present a model that links the chemical flux to the extent of reaction between the soil waters and the solids, or the fluid residence time. Different approaches for describing the distribution of fluid residence times within a catchment are then compared with the observed Si fluxes for a limited number of catchments. This model predicts high solute fluxes in regions with high run-off, relief, and long flow paths suggesting that the particular hydrologic setting of a landscape will be the underlying control on the chemical fluxes. As such, we reinterpret the large chemical fluxes that are observed in active mountain belts, like the Himalaya, to be primarily controlled by the long reactive flow paths created by the steep terrain coupled with high amounts of precipitation.
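A minimal sketch of a transport-controlled weathering flux of the kind discussed above: solute concentration approaches equilibrium with increasing fluid residence time, so the catchment flux depends on both runoff and flow-path residence time. The functional form and parameter values are illustrative assumptions, not the authors' model.

    import numpy as np

    def si_flux(runoff_m_yr, residence_time_yr, c_eq_umol_l=350.0, tau_yr=1.0):
        """Si flux (umol m^-2 yr^-1) for a solute relaxing toward equilibrium concentration."""
        conc = c_eq_umol_l * (1.0 - np.exp(-residence_time_yr / tau_yr))  # umol/L
        return runoff_m_yr * conc * 1e3   # 1 m of water over 1 m2 = 1000 L

    for q, t in [(0.1, 10.0), (1.0, 1.0), (3.0, 0.1)]:
        print(f"runoff {q:.1f} m/yr, residence {t:4.1f} yr -> "
              f"Si flux {si_flux(q, t):,.0f} umol m^-2 yr^-1")

Long residence times export near-equilibrium concentrations while high-runoff, short-residence-time settings export dilute but voluminous water; the product of the two is what a single catchment flux measurement integrates.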
Brown, Caleb M; Henderson, Donald M; Vinther, Jakob; Fletcher, Ian; Sistiaga, Ainara; Herrera, Jorsua; Summons, Roger E
2017-08-21
Predator-prey dynamics are an important evolutionary driver of escalating predation mode and efficiency, and commensurate responses of prey [1-3]. Among these strategies, camouflage is important for visual concealment, with countershading the most universally observed [4-6]. Extant terrestrial herbivores free of significant predation pressure, due to large size or isolation, do not exhibit countershading. Modern predator-prey dynamics may not be directly applicable to those of the Mesozoic due to the dominance of very large, visually oriented theropod dinosaurs [7]. Despite thyreophoran dinosaurs' possessing extensive dermal armor, some of the most extreme examples of anti-predator structures [8, 9], little direct evidence of predation on these and other dinosaur megaherbivores has been documented. Here we describe a new, exquisitely three-dimensionally preserved nodosaurid ankylosaur, Borealopelta markmitchelli gen. et sp. nov., from the Early Cretaceous of Alberta, which preserves integumentary structures as organic layers, including continuous fields of epidermal scales and intact horn sheaths capping the body armor. We identify melanin in the organic residues through mass spectroscopic analyses and observe lighter pigmentation of the large parascapular spines, consistent with display, and a pattern of countershading across the body. With an estimated body mass exceeding 1,300 kg, B. markmitchelli was much larger than modern terrestrial mammals that either are countershaded or experience significant predation pressure as adults. Presence of countershading suggests predation pressure strong enough to select for concealment in this megaherbivore despite possession of massive dorsal and lateral armor, illustrating a significant dichotomy between Mesozoic predator-prey dynamics and those of modern terrestrial systems. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
1983-11-15
Concurrent Algorithms", A. Cremers , Dortmund University, West Germany, and T. Hibbard, JPL, Pasadena, CA 64 "An Overview of Signal Representations in...n O f\\ n O P- A -> Problem-oriented specification of concurrent algorithms Armin B. Cremers and Thomas N. Hibbard Preliminary version September...1982 s* Armin B. Cremers Computer Science Department University of Dortmund P.O. Box 50 05 00 D-4600 Dortmund 50 Fed. Rep. Germany
A 65 nm CMOS LNA for Bolometer Application
NASA Astrophysics Data System (ADS)
Huang, Tom Nan; Boon, Chirn Chye; Zhu, Forest Xi; Yi, Xiang; He, Xiaofeng; Feng, Guangyin; Lim, Wei Meng; Liu, Bei
2016-04-01
Modern bolometers generally consist of large-scale arrays of detectors. Implemented in conventional technologies, such bolometer arrays suffer from integrability and productivity issues. Recently, the development of CMOS technologies has presented an opportunity for the massive production of high-performance and highly integrated bolometers. This paper presents a 65-nm CMOS LNA designed for a millimeter-wave bolometer's pre-amplification stage. By properly applying some positive feedback, the noise figure of the proposed LNA is minimized at under 6 dB and the bandwidth is extended to 30 GHz.
Volkov, A V; Kolkutin, V V; Klevno, V A; Shkol'nikov, B V; Kornienko, I V
2008-01-01
Managerial experience is described that was gained during the large-scale work on victim identification following the mass casualties of the Tu-154M and Airbus A310 passenger plane crashes. The authors emphasize the necessity of setting up a specialized agency of constant readiness, meeting modern requirements, for the implementation of a system of measures for personal identification. This agency must incorporate the relevant departments of the Ministries of Health, Defense, and Emergency Situations as well as investigative authorities and other organizations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shea, M.
1995-09-01
The proper isolation of radioactive waste is one of today's most pressing environmental issues. Research is being carried out by many countries around the world in order to answer critical and perplexing questions regarding the safe disposal of radioactive waste. Natural analogue studies are an increasingly important facet of this international research effort. The Pocos de Caldas Project represents a major effort of the international technical and scientific community towards addressing one of modern civilization's most critical environmental issues - radioactive waste isolation.
NASA Astrophysics Data System (ADS)
Chen, Ke-Jung
2014-03-01
Modern cosmological simulations predict that the first generation of stars formed with a mass scale around 100 M⊙ about 300-400 million years after the Big Bang. When the first stars reached the end of their lives, many of them might have died as energetic supernovae (SNe) that could have significantly affected the early Universe by injecting large amounts of energy and metals into the primordial intergalactic medium. In this paper, we review the current models of the first SNe, discussing the relevant background physics, computational methods and the latest results.
Integrated response to the dynamic threat of school violence.
Callaway, David W; Westmoreland, Ted C; Baez, Alejandro A; McKay, Sean A; Raja, Ali S
2010-01-01
A terrorist attack on US schools can no longer be considered a Black Swan event. Mounting evidence suggests that extremist organizations are actively targeting US schools. Equally disturbing are data suggesting that schools, universities, and communities are unprepared for large-scale violence. The Operational Medicine Institute Conference on an Integrated Response to the Modern Urban Terrorist Threat revealed significant variations in the perceived threats and critical response gaps among emergency medical providers, law enforcement personnel, politicians, and security specialists. The participants recommended several steps to address these gaps in preparedness, training, responses, and recovery.
MINC 2.0: A Flexible Format for Multi-Modal Images.
Vincent, Robert D; Neelin, Peter; Khalili-Mahani, Najmeh; Janke, Andrew L; Fonov, Vladimir S; Robbins, Steven M; Baghdadi, Leila; Lerch, Jason; Sled, John G; Adalat, Reza; MacDonald, David; Zijdenbos, Alex P; Collins, D Louis; Evans, Alan C
2016-01-01
It is often useful that an imaging data format can afford rich metadata, be flexible, scale to very large file sizes, support multi-modal data, and have strong inbuilt mechanisms for data provenance. Beginning in 1992, MINC was developed as a system for flexible, self-documenting representation of neuroscientific imaging data with arbitrary orientation and dimensionality. The MINC system incorporates three broad components: a file format specification, a programming library, and a growing set of tools. In the early 2000s the MINC developers created MINC 2.0, which added support for 64-bit file sizes, internal compression, and a number of other modern features. Because of its extensible design, it has been easy to incorporate details of provenance in the header metadata, including an explicit processing history, unique identifiers, and vendor-specific scanner settings. This makes MINC ideal for use in large-scale imaging studies and databases. It also makes it easy to adapt to new scanning sequences and modalities.
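Because MINC 2.0 is built on the HDF5 container format, a generic HDF5 reader can inspect a volume's structure and header metadata. The sketch below is a hypothetical example (the file name is a placeholder and no particular group layout is assumed); the MINC programming library and tools remain the supported interface.

    # Hypothetical inspection of a MINC 2.0 volume via its HDF5 container (requires h5py).
    import h5py

    with h5py.File("scan.mnc", "r") as f:
        def describe(name, obj):
            kind = "dataset" if isinstance(obj, h5py.Dataset) else "group"
            print(f"{kind:7s} {name}  attrs={list(obj.attrs)[:5]}")
        # Walk every group and dataset, exposing image data and provenance attributes.
        f.visititems(describe)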
Redox Flow Batteries, Hydrogen and Distributed Storage.
Dennison, C R; Vrubel, Heron; Amstutz, Véronique; Peljo, Pekka; Toghill, Kathryn E; Girault, Hubert H
2015-01-01
Social, economic, and political pressures are causing a shift in the global energy mix, with a preference toward renewable energy sources. In order to realize widespread implementation of these resources, large-scale storage of renewable energy is needed. Among the proposed energy storage technologies, redox flow batteries offer many unique advantages. The primary limitation of these systems, however, is their limited energy density which necessitates very large installations. In order to enhance the energy storage capacity of these systems, we have developed a unique dual-circuit architecture which enables two levels of energy storage; first in the conventional electrolyte, and then through the formation of hydrogen. Moreover, we have begun a pilot-scale demonstration project to investigate the scalability and technical readiness of this approach. This combination of conventional energy storage and hydrogen production is well aligned with the current trajectory of modern energy and mobility infrastructure. The combination of these two means of energy storage enables the possibility of an energy economy dominated by renewable resources.
Small-scale dynamic confinement gap test
NASA Astrophysics Data System (ADS)
Cook, Malcolm
2011-06-01
Gap tests are routinely used to ascertain the shock sensitiveness of new explosive formulations. The tests are popular since they are easy and relatively cheap to perform. However, with modern insensitive formulations with large critical diameters, large test samples are required. This can make testing and screening of new formulations expensive, since large quantities of test material are required. Thus a new test that uses significantly smaller sample quantities would be very beneficial. In this paper we describe a new small-scale test that has been designed using our CHARM ignition and growth routine in the DYNA2D hydrocode. The new test is a modified gap test and uses detonating nitromethane to provide dynamic confinement (instead of a thick metal case) whilst exposing the sample to a long duration shock wave. The long duration shock wave allows less reactive materials that are below their critical diameter more time to react. We present details on the modelling of the test together with some preliminary experiments to demonstrate the potential of the new test method.
Zhang, Bo; Fu, Yingxue; Huang, Chao; Zheng, Chunli; Wu, Ziyin; Zhang, Wenjuan; Yang, Xiaoyan; Gong, Fukai; Li, Yuerong; Chen, Xiaoyu; Gao, Shuo; Chen, Xuetong; Li, Yan; Lu, Aiping; Wang, Yonghua
2016-02-25
The development of modern omics technology has not significantly improved the efficiency of drug development. Rather, precise and targeted drug discovery remains unsolved. Here a large-scale cross-species molecular network association (CSMNA) approach for targeted drug screening from natural sources is presented. The algorithm integrates molecular network omics data from humans and 267 plants and microbes, establishing the biological relationships between them and extracting evolutionarily convergent chemicals. This technique allows the researcher to assess targeted drugs for specific human diseases based on specific plant or microbe pathways. In a perspective validation, connections between the plant Halliwell-Asada (HA) cycle and the human Nrf2-ARE pathway were verified, and the manner by which the HA cycle molecules act on the human Nrf2-ARE pathway as antioxidants was determined. This shows the potential applicability of this approach in drug discovery. The current method integrates disparate evolutionary species into chemico-biologically coherent circuits, suggesting a new cross-species omics analysis strategy for rational drug development.
Evaluation of fuel preparation systems for lean premixing-prevaporizing combustors
NASA Technical Reports Server (NTRS)
Dodds, W. J.; Ekstedt, E. E.
1985-01-01
A series of experiments was carried out in order to produce design data for a premixing prevaporizing fuel-air mixture preparation system for aircraft gas turbine engine combustors. The fuel-air mixture uniformity of four different system design concepts was evaluated over a range of conditions representing the cruise operation of a modern commercial turbofan engine. Operating conditions including pressure, temperature, fuel-to-air ratio, and velocity, exhibited no clear effect on mixture uniformity of systems using pressure-atomizing fuel nozzles and large-scale mixing devices. However, the performance of systems using atomizing fuel nozzles and large-scale mixing devices was found to be sensitive to operating conditions. Variations in system design variables were also evaluated and correlated. Mixing uniformity was found to improve with system length, pressure drop, and the number of fuel injection points per unit area. A premixing system capable of providing mixing uniformity to within 15 percent over a typical range of cruise operating conditions is demonstrated.
Kinetic description of large-scale low pressure glow discharges
NASA Astrophysics Data System (ADS)
Kortshagen, Uwe; Heil, Brian
1997-10-01
In recent years the so-called "nonlocal approximation" to the solution of the electron Boltzmann equation has attracted considerable attention as an extremely efficient method for the kinetic modeling of low pressure discharges. However, it appears that modern discharges, which are optimized to provide large-scale plasma uniformity, are explicitly designed to work in a regime in which the nonlocal approximation is no longer strictly valid. In this presentation we discuss results of a hybrid model, which is based on the natural division of the electron distribution function into a nonlocal body, determined by elastic collisions only, and a high-energy part which requires a more complete treatment due to the action of inelastic collisions and wall losses of electrons. The method is applied to an inductively coupled low pressure discharge. We discuss the transition from plasma density profiles maximal on the discharge axis to plasma density profiles with off-center maxima, which has been observed in experiments. A positive feedback mechanism involved in this transition is pointed out.
Ingestion of bacterially expressed double-stranded RNA inhibits gene expression in planarians.
Newmark, Phillip A; Reddien, Peter W; Cebrià, Francesc; Sánchez Alvarado, Alejandro
2003-09-30
Freshwater planarian flatworms are capable of regenerating complete organisms from tiny fragments of their bodies; the basis for this regenerative prowess is an experimentally accessible stem cell population that is present in the adult planarian. The study of these organisms, classic experimental models for investigating metazoan regeneration, has been revitalized by the application of modern molecular biological approaches. The identification of thousands of unique planarian ESTs, coupled with large-scale whole-mount in situ hybridization screens, and the ability to inhibit planarian gene expression through double-stranded RNA-mediated genetic interference, provide a wealth of tools for studying the molecular mechanisms that regulate tissue regeneration and stem cell biology in these organisms. Here we show that, as in Caenorhabditis elegans, ingestion of bacterially expressed double-stranded RNA can inhibit gene expression in planarians. This inhibition persists throughout the process of regeneration, allowing phenotypes with disrupted regenerative patterning to be identified. These results pave the way for large-scale screens for genes involved in regenerative processes.
Sreenilayam, Sithara P.; Panarin, Yuri P.; Vij, Jagdish K.; Panov, Vitaly P.; Lehmann, Anne; Poppe, Marco; Prehm, Marko; Tschierske, Carsten
2016-01-01
Liquid crystals (LCs) represent one of the foundations of modern communication and photonic technologies. Present display technologies are based mainly on nematic LCs, which suffer from limited response time for use in active colour sequential displays and limited image grey scale. Herein we report the first observation of a spontaneously formed helix in a polar tilted smectic LC phase (SmC phase) of achiral bent-core (BC) molecules with the axis of helix lying parallel to the layer normal and a pitch much shorter than the optical wavelength. This new phase shows fast (∼30 μs) grey-scale switching due to the deformation of the helix by the electric field. Even more importantly, defect-free alignment is easily achieved for the first time for a BC mesogen, thus providing potential use in large-scale devices with fast linear and thresholdless electro-optical response. PMID:27156514
NASA Astrophysics Data System (ADS)
Sreenilayam, Sithara P.; Panarin, Yuri P.; Vij, Jagdish K.; Panov, Vitaly P.; Lehmann, Anne; Poppe, Marco; Prehm, Marko; Tschierske, Carsten
2016-05-01
Liquid crystals (LCs) represent one of the foundations of modern communication and photonic technologies. Present display technologies are based mainly on nematic LCs, which suffer from limited response time for use in active colour sequential displays and limited image grey scale. Herein we report the first observation of a spontaneously formed helix in a polar tilted smectic LC phase (SmC phase) of achiral bent-core (BC) molecules with the axis of helix lying parallel to the layer normal and a pitch much shorter than the optical wavelength. This new phase shows fast (~30 μs) grey-scale switching due to the deformation of the helix by the electric field. Even more importantly, defect-free alignment is easily achieved for the first time for a BC mesogen, thus providing potential use in large-scale devices with fast linear and thresholdless electro-optical response.
Maximum one-shot dissipated work from Rényi divergences
NASA Astrophysics Data System (ADS)
Yunger Halpern, Nicole; Garner, Andrew J. P.; Dahlsten, Oscar C. O.; Vedral, Vlatko
2018-05-01
Thermodynamics describes large-scale, slowly evolving systems. Two modern approaches generalize thermodynamics: fluctuation theorems, which concern finite-time nonequilibrium processes, and one-shot statistical mechanics, which concerns small scales and finite numbers of trials. Combining these approaches, we calculate a one-shot analog of the average dissipated work defined in fluctuation contexts: the cost of performing a protocol in finite time instead of quasistatically. The average dissipated work has been shown to be proportional to a relative entropy between phase-space densities, to a relative entropy between quantum states, and to a relative entropy between probability distributions over possible values of work. We derive one-shot analogs of all three equations, demonstrating that the order-infinity Rényi divergence is proportional to the maximum possible dissipated work in each case. These one-shot analogs of fluctuation-theorem results contribute to the unification of these two toolkits for small-scale, nonequilibrium statistical physics.
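For reference, a standard form of the order-infinity Rényi divergence between discrete distributions P and Q, together with a schematic statement of the proportionality described above (notation ours; temperature-dependent prefactors omitted):

    W_{\mathrm{diss}}^{\max} \;\propto\; D_{\infty}(P \,\|\, Q),
    \qquad
    D_{\infty}(P \,\|\, Q) \;=\; \log \max_{i:\,p_i > 0} \frac{p_i}{q_i}.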
Maximum one-shot dissipated work from Rényi divergences.
Yunger Halpern, Nicole; Garner, Andrew J P; Dahlsten, Oscar C O; Vedral, Vlatko
2018-05-01
Thermodynamics describes large-scale, slowly evolving systems. Two modern approaches generalize thermodynamics: fluctuation theorems, which concern finite-time nonequilibrium processes, and one-shot statistical mechanics, which concerns small scales and finite numbers of trials. Combining these approaches, we calculate a one-shot analog of the average dissipated work defined in fluctuation contexts: the cost of performing a protocol in finite time instead of quasistatically. The average dissipated work has been shown to be proportional to a relative entropy between phase-space densities, to a relative entropy between quantum states, and to a relative entropy between probability distributions over possible values of work. We derive one-shot analogs of all three equations, demonstrating that the order-infinity Rényi divergence is proportional to the maximum possible dissipated work in each case. These one-shot analogs of fluctuation-theorem results contribute to the unification of these two toolkits for small-scale, nonequilibrium statistical physics.
Do we understand what creates 150-km echoes and gives them their distinct structure?
NASA Astrophysics Data System (ADS)
Oppenheim, M. M.; Kudeki, E.; Salas Reyes, P.; Dimant, Y. S.
2017-12-01
Researchers first discovered 150-km echoes over 50 years ago using the first large VHF radars near the geomagnetic equator. However, the underlying mechanism that creates and modulates them remains largely a mystery. Despite this lack of understanding, the aeronomy community uses them to monitor daytime vertical plasma drifts between 130 and 160 km altitude. In a 2016 paper, Oppenheim and Dimant used simulations to show that photoelectrons can generate the type of echoes seen by the radars, but this theory does not explain any of the detailed structures. This paper will show modern observations of 150-km echoes using simultaneous radar and ionosonde measurements. It will then describe the latest analysis attempting to explain these features using large-scale kinetic simulations of photoelectrons interacting with the ambient ionospheric plasma under a range of conditions.
NASA Technical Reports Server (NTRS)
Klinger, L. F.
1988-01-01
The study of mass extinction events has largely focused on defining an environmental factor or factors that might account for specific patterns of faunal demise. Several hypotheses elaborate on how a given environmental factor might affect fauna directly, but differentially, causing extinction in certain taxa but not others. Yet few studies have considered specific habitat changes that might result from natural vegetation processes or from perturbations of vegetation. The role of large-scale habitat change induced by natural successional change from forest to bog (paludification) is examined and how large perturbations (e.g., volcanism, bolide impacts) might favor increased rates of paludification and consequent mass extinctions is considered. This hypothesis has an advantage over other hypotheses for mass extinctions in that modern day analogs of paludification are common throughout the world, thus allowing for considerable testing.
Addressing Angular Single-Event Effects in the Estimation of On-Orbit Error Rates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, David S.; Swift, Gary M.; Wirthlin, Michael J.
2015-12-01
Our study describes complications introduced by angular direct ionization events into space error rate predictions. In particular, the prevalence of multiple-cell upsets and a breakdown in the application of effective linear energy transfer in modern-scale devices can skew error rates approximated from currently available estimation models. Moreover, this paper highlights the importance of angular testing and proposes a methodology to extend existing error estimation tools to properly consider angular strikes in modern-scale devices. Finally, these techniques are illustrated with test data provided from a modern 28 nm SRAM-based device.
The NASA modern technology rotors program
NASA Technical Reports Server (NTRS)
Watts, M. E.; Cross, J. L.
1986-01-01
Existing data bases regarding helicopters are based on work conducted on 'old-technology' rotor systems. The Modern Technology Rotors (MTR) Program is intended to provide extensive data bases on rotor systems using present and emerging technology. The MTR is concerned with modern, four-bladed rotor systems presently being manufactured or under development. Aspects of the MTR philosophy are considered along with instrumentation, the MTR test program, the BV 360 Rotor, and the UH-60 Black Hawk. The program phases include computer modelling, shake test, model-scale test, minimally instrumented flight test, extensively pressure-instrumented-blade flight test, and full-scale wind tunnel test.
NASA Astrophysics Data System (ADS)
Gerke, Kirill M.; Vasilyev, Roman V.; Khirevich, Siarhei; Collins, Daniel; Karsanina, Marina V.; Sizonenko, Timofey O.; Korost, Dmitry V.; Lamontagne, Sébastien; Mallants, Dirk
2018-05-01
Permeability is one of the fundamental properties of porous media and is required for large-scale Darcian fluid flow and mass transport models. Whilst permeability can be measured directly at a range of scales, there are increasing opportunities to evaluate permeability from pore-scale fluid flow simulations. We introduce the free software Finite-Difference Method Stokes Solver (FDMSS) that solves the Stokes equation using a finite-difference method (FDM) directly on voxelized 3D pore geometries (i.e. without meshing). Based on explicit convergence studies, validation on sphere packings with analytically known permeabilities, and comparison against lattice-Boltzmann and other published FDM studies, we conclude that FDMSS provides a computationally efficient and accurate basis for single-phase pore-scale flow simulations. By implementing an efficient parallelization and code optimization scheme, permeability inferences can now be made from 3D images of up to 10^9 voxels using modern desktop computers. Case studies demonstrate the broad applicability of the FDMSS software for both natural and artificial porous media.
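As a minimal illustration of the final step such pore-scale simulations feed into, the sketch below back-calculates permeability from Darcy's law given a simulated mean (superficial) velocity; the numbers and variable names are illustrative assumptions, not FDMSS output.

    # Darcy's law: k = mu * q * L / dP, with q the superficial (Darcy) velocity.
    def permeability_m2(mean_velocity_m_s, viscosity_pa_s, length_m, pressure_drop_pa):
        return viscosity_pa_s * mean_velocity_m_s * length_m / pressure_drop_pa

    # Illustrative values for a 1 mm domain with a 100 Pa applied pressure drop:
    k = permeability_m2(mean_velocity_m_s=1.0e-4, viscosity_pa_s=1.0e-3,
                        length_m=1.0e-3, pressure_drop_pa=100.0)
    print(f"k = {k:.2e} m^2 (~{k / 9.869e-13:.1f} darcy)")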
Extreme Scale Plasma Turbulence Simulations on Top Supercomputers Worldwide
Tang, William; Wang, Bei; Ethier, Stephane; ...
2016-11-01
The goal of the extreme scale plasma turbulence studies described in this paper is to expedite the delivery of reliable predictions on confinement physics in large magnetic fusion systems by using world-class supercomputers to carry out simulations with unprecedented resolution and temporal duration. This has involved architecture-dependent optimizations of performance scaling and addressing code portability and energy issues, with the metrics for multi-platform comparisons being 'time-to-solution' and 'energy-to-solution'. Realistic results addressing how confinement losses caused by plasma turbulence scale from present-day devices to the much larger $25 billion international ITER fusion facility have been enabled by innovative advances in the GTC-P code including (i) implementation of one-sided communication from MPI 3.0 standard; (ii) creative optimization techniques on Xeon Phi processors; and (iii) development of a novel performance model for the key kernels of the PIC code. Our results show that modeling data movement is sufficient to predict performance on modern supercomputer platforms.
Ayachit, Utkarsh; Bauer, Andrew; Duque, Earl P. N.; ...
2016-11-01
A key trend facing extreme-scale computational science is the widening gap between computational and I/O rates, and the challenge that follows is how to best gain insight from simulation data when it is increasingly impractical to save it to persistent storage for subsequent visual exploration and analysis. One approach to this challenge is centered around the idea of in situ processing, where visualization and analysis processing is performed while data is still resident in memory. Our paper examines several key design and performance issues related to the idea of in situ processing at extreme scale on modern platforms: scalability, overhead, performance measurement and analysis, comparison and contrast with a traditional post hoc approach, and interfacing with simulation codes. We illustrate these principles in practice with studies, conducted on large-scale HPC platforms, that include a miniapplication and multiple science application codes, one of which demonstrates in situ methods in use at greater than 1M-way concurrency.
Ho, Andrew D; Yu, Carol C
2015-06-01
Many statistical analyses benefit from the assumption that unconditional or conditional distributions are continuous and normal. More than 50 years ago in this journal, Lord and Cook chronicled departures from normality in educational tests, and Micceri similarly showed that the normality assumption is met rarely in educational and psychological practice. In this article, the authors extend these previous analyses to state-level educational test score distributions that are an increasingly common target of high-stakes analysis and interpretation. Among 504 scale-score and raw-score distributions from state testing programs from recent years, nonnormal distributions are common and are often associated with particular state programs. The authors explain how scaling procedures from item response theory lead to nonnormal distributions as well as unusual patterns of discreteness. The authors recommend that distributional descriptive statistics be calculated routinely to inform model selection for large-scale test score data, and they illustrate consequences of nonnormality using sensitivity studies that compare baseline results to those from normalized score scales.
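As a minimal illustration of the routine descriptive check recommended above, the sketch below computes skewness and excess kurtosis for a synthetic score distribution with a ceiling effect; the data are simulated, not state testing data.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    # Normal core truncated at a maximum obtainable scale score of 600 (ceiling effect).
    scores = np.minimum(rng.normal(loc=500, scale=50, size=10_000).round(), 600)

    print(f"skewness         = {stats.skew(scores):+.2f}")
    print(f"excess kurtosis  = {stats.kurtosis(scores):+.2f}")  # 0 for a normal distribution
    print(f"share at ceiling = {np.mean(scores == 600):.1%}")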
GPU accelerated FDTD solver and its application in MRI.
Chi, J; Liu, F; Jin, J; Mason, D G; Crozier, S
2010-01-01
The finite difference time domain (FDTD) method is a popular technique for computational electromagnetics (CEM). The large computational power often required, however, has been a limiting factor for its applications. In this paper, we will present a graphics processing unit (GPU)-based parallel FDTD solver and its successful application to the investigation of a novel B1 shimming scheme for high-field magnetic resonance imaging (MRI). The optimized shimming scheme exhibits considerably improved transmit B1 profiles. The GPU implementation dramatically shortened the runtime of FDTD simulation of electromagnetic field compared with its CPU counterpart. The acceleration in runtime has made such investigation possible, and will pave the way for other studies of large-scale computational electromagnetic problems in modern MRI which were previously impractical.
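For orientation, a minimal CPU-only sketch of the 1D FDTD (Yee) update loop that a GPU implementation parallelizes across grid cells; this is a textbook free-space example in normalized units, not the authors' MRI solver.

    import numpy as np

    nz, nsteps = 200, 400
    ez = np.zeros(nz)        # electric field samples
    hy = np.zeros(nz - 1)    # magnetic field, staggered half a cell
    courant = 0.5            # normalized time step (stability requires <= 1 in 1D)

    for n in range(nsteps):
        hy += courant * np.diff(ez)                        # update H from the curl of E
        ez[1:-1] += courant * np.diff(hy)                  # update E from the curl of H
        ez[nz // 2] += np.exp(-((n - 30) / 10.0) ** 2)     # soft Gaussian source
    print(f"peak |Ez| after {nsteps} steps: {np.abs(ez).max():.3f}")

On a GPU, each field update becomes an element-wise kernel over the grid, which is why FDTD maps well onto massively parallel hardware.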
Archaean tectonic systems: A view from igneous rocks
NASA Astrophysics Data System (ADS)
Moyen, Jean-François; Laurent, Oscar
2018-03-01
This work examines the global distribution of Archaean and modern igneous rock compositions, without relying on preconceptions about the link between rock compositions and tectonic sites (in contrast with "geotectonic" diagrams). Rather, Archaean and modern geochemical patterns are interpreted and compared in terms of source and melting conditions. Mafic rocks on the modern Earth show a clear chemical separation between arc and non-arc rocks. This points to the first-order difference between wet (arc) and dry (mid-ocean ridges and hotspots) mantle melting. Dry melts are further separated into depleted (MORB) and enriched (OIB) sources. This three-fold pattern is a clear image of the ridge/subduction/plume system that dominates modern tectonics. In contrast, Archaean mafic and ultramafic rocks are clustered in an intermediate position, between the three main modern types. This suggests that the Archaean mantle had lesser amounts of clearly depleted or enriched portions; that true subductions were rare; and that the distinction between oceanic plateaus and ridges may have been less significant. Modern granitic rocks dominantly belong to two groups: arc-related granitoids, petrologically connected to arc basalts; and collision granitoids, related to felsic sources. In contrast, the Archaean record is dominated by the TTG suite that derives from an alkali-rich mafic source (i.e. altered basalt). The geochemical diversity of the TTG suite points to a great range of melting depths, from ca. 5 to > 20 kbar. This reveals the absence of large sedimentary accumulations, again the paucity of modern-like arc situations, and the importance played by reworking of an earlier basaltic shell, in a range of settings (including some proto-subduction mechanisms). Nonetheless, granitoids in each individual region show a progressive transition towards more modern-looking associations of arc-like and peraluminous granites. Collectively, the geochemical evidence suggests an Archaean Earth with somewhat different tectonic systems. In particular, the familiar distinction between collision, arcs, ridges and hotspots seems to blur in the Archaean. Rather, the large-scale geochemical pattern reveals a long-lived, altered and periodically resurfaced basaltic crust. This protocrust was reworked through a range of processes occurring at various depths, corresponding to a progressive stabilization of burial systems and the establishment of true subductions. A punctuated onset of global plate tectonics is unlikely to have occurred; rather, short-term episodes of proto-subduction in the late Archaean evolved over time into a longer-term, more stable style of plate tectonics as mantle temperature decayed.
Evolution of neuronal signalling: transmitters and receptors.
Hoyle, Charles H V
2011-11-16
Evolution is a dynamic process during which the genome should not be regarded as a static entity. Molecular and morphological information yield insights into the evolution of species and their phylogenetic relationships, and molecular information in particular provides information into the evolution of signalling processes. Many signalling systems have their origin in primitive, even unicellular, organisms. Through time, and as organismal complexity increased, certain molecules were employed as intercellular signal molecules. In the autonomic nervous system the basic unit of chemical transmission is a ligand and its cognate receptor. The general mechanisms underlying evolution of signal molecules and their cognate receptors have their basis in the alteration of the genome. In the past this has occurred in large-scale events, represented by two or more doublings of the whole genome, or large segments of the genome, early in the deuterostome lineage, after the emergence of urochordates and cephalochordates, and before the emergence of vertebrates. These duplications were followed by extensive remodelling involving subsequent small-scale changes, ranging from point mutations to exon duplication. Concurrent with these processes was multiple gene loss so that the modern genome contains roughly the same number of genes as in early deuterostomes despite the large-scale genomic duplications. In this review, the principles that underlie evolution that have led to large and small families of autonomic neurotransmitters and their receptors are discussed, with emphasis on G protein-coupled receptors. Copyright © 2010 Elsevier B.V. All rights reserved.
Anthropogenic and Climatic Influence on Vegetation Fires in Peatland of Insular Southeast Asia
NASA Astrophysics Data System (ADS)
Liew, S.; Miettinen, J.; Salinas Cortijo, S. V.
2011-12-01
Fire is traditionally used as a tool in land clearing by farmers and shifting cultivators in Southeast Asia. However, the small-scale clearing of land is increasingly being replaced by modern large-scale conversion of forests into plantations/agricultural land, usually also by fires. Fires get out of control in periods of extreme drought, especially during El Niño periods, resulting in severe episodes of transboundary air pollution in the form of smoke haze. We use the MODIS active fires product (hotspots) to establish correlations between the temporal and spatial patterns of vegetation fires and climatic variables, land cover change and soil type (peat or non-peat) in the western part of Insular Southeast Asia for a decade from 2001 to 2010. Fire occurrence exhibits a negative correlation with rainfall, and is more severe overall during El Niño periods. However, not all regions are equally affected by El Niño. In Southern Sumatra and Southern Borneo the correlation with El Niño is high. However, fires in some regions such as the peatland in Riau, Jambi and Sarawak do not appear to be influenced by El Niño. These regions are also experiencing rapid conversion of forest to large-scale plantations.
Effects of historical and modern mining on mercury deposition in southeastern Peru.
Beal, Samuel A; Jackson, Brian P; Kelly, Meredith A; Stroup, Justin S; Landis, Joshua D
2013-11-19
Both modern anthropogenic emissions of mercury (Hg), primarily from artisanal and small-scale gold mining (ASGM), and preindustrial anthropogenic emissions from mining are thought to have a large impact on present-day atmospheric Hg deposition. We study the spatial distribution of Hg and its depositional history over the past ∼400 years in sediment cores from lakes located regionally proximal (∼90-150 km) to the largest ASGM in Peru and distal (>400 km) to major preindustrial mining centers. Total Hg concentrations in surface sediments from fourteen lakes are typical of remote regions (10-115 ng g(-1)). Hg fluxes in cores from four lakes demonstrate preindustrial Hg deposition in southeastern Peru was spatially variable and at least an order of magnitude lower than previously reported fluxes in lakes located closer to mining centers. Average modern (A.D. 2000-2011) Hg fluxes in these cores are 3.4-6.9 μg m(-2) a(-1), compared to average preindustrial (A.D. 1800-1850) fluxes of 0.8-2.5 μg m(-2) a(-1). Modern Hg fluxes determined from the four lakes are on average 3.3 (±1.5) times greater than their preindustrial fluxes, similar to those determined in other remote lakes around the world. This agreement suggests that Hg emissions from ASGM are likely not significantly deposited in nearby down-wind regions.
Childlessness patterns in Taiwan.
Poston Dl
1988-06-01
Taiwan is a newly developed and industrialized area and, along with Korea, Brazil, Argentina, and a few other countries, belongs in a special class of recently industrialized areas. Taiwan has been undergoing large-scale modernization since the 1950s, when the Nationalist government first began to implement land reform programs, and today it is one of the showcase newly developed areas of the world. Demographic transition theory shows that fertility is negatively associated with modernization. During the past three decades, fertility in Taiwan has followed this pattern in a dramatic manner. Studies of childlessness conducted in Western countries have also shown that as modernizing influences continue, fertility declines, and childlessness increases as it becomes more and more voluntary. Subregions with the highest levels of modernization and the lowest fertility rates should therefore be characterized by the highest levels of childlessness, particularly among younger women, and vice versa. Given the levels of socioeconomic and demographic development in Taiwan and its subregions circa 1980, as well as their variability among the hsiens and major cities, the author would expect to find higher levels of childlessness in the more developed localities, and lower levels in the less developed subregions. This hypothesis is tested with data from the 1980 Census of Population and Housing: General Report, Taiwan--Fukien Area (Republic of China, 1982) and the 1980 Taiwan--Fukien Demographic Fact Book (Republic of China, 1980).
Effects of historical and modern mining on mercury deposition in southeastern Peru
Beal, Samuel A.; Jackson, Brian P.; Kelly, Meredith A.; Stroup, Justin S.; Landis, Joshua D.
2013-01-01
Both modern anthropogenic emissions of mercury (Hg), primarily from artisanal and small-scale gold mining (ASGM), and preindustrial anthropogenic emissions from mining are thought to have a large impact on present-day atmospheric Hg deposition. We study the spatial distribution of Hg and its depositional history over the past ~400 years in sediment cores from lakes located regionally proximal (~90–150 km) to the largest ASGM in Peru and distal (>400 km) to major preindustrial mining centers. Total Hg concentrations in surface sediments from fourteen lakes are typical of remote regions (10–115 ng g−1). Hg fluxes in cores from four lakes demonstrate preindustrial Hg deposition in southeastern Peru was spatially variable and at least an order of magnitude lower than previously reported fluxes in lakes located closer to mining centers. Average modern (A.D. 2000–2011) Hg fluxes in these cores are 3.4–6.9 μg m−2 a−1, compared to average preindustrial (A.D. 1800–1850) fluxes of 0.8–2.5 μg m−2 a−1. Modern Hg fluxes determined from the four lakes are on average 3.3 (±1.5) times greater than their preindustrial fluxes, similar to those determined in other remote lakes around the world. This agreement suggests that Hg emissions from ASGM are likely not significantly deposited in nearby downwind regions. PMID:24124645
An updated estimate of the body dimensions of US children.
Pagano, Brian T; Parkinson, Matthew B; Reed, Matthew P
2015-01-01
Anthropometric data from children are important for product design and the promulgation of safety standards. The last major detailed study of child anthropometry in the USA was conducted more than 30 years ago. Subsequent demographic changes and the increased prevalence of overweight and obesity render those data increasingly obsolete. A new, large-scale anthropometric survey is needed. As an interim step, a new anthropometric synthesis technique was used to create a virtual population of modern children, each described by 84 anthropometric measures. A subset of these data was validated against limited modern data. Comparisons with data from the 1970s showed significant changes in measures of width and circumference of the torso, arms and legs. Measures of length and measurements of the head, face, hands and feet exhibited little change. The new virtual population provides guidance for a comprehensive child anthropometry survey and could improve safety and accommodation in product design. Practitioner Summary: This research reviews the inadequacies of available sources of US child anthropometry as a result of the rise in the rates of overweight and obesity. A new synthesised database of detailed modern child anthropometry was created and validated. The results quantify changes in US child body dimensions since the 1970s.
Cloudy's Journey from FORTRAN to C, Why and How
NASA Astrophysics Data System (ADS)
Ferland, G. J.
Cloudy is a large-scale plasma simulation code that is widely used across the astronomical community as an aid in the interpretation of spectroscopic data. The cover of the ADAS VI book featured predictions of the code. The FORTRAN 77 source code has always been freely available on the Internet, contributing to its widespread use. The coming of PCs and Linux has fundamentally changed the computing environment. Modern Fortran compilers (F90 and F95) are not freely available. A common-use code must be written in either FORTRAN 77 or C to be Open Source/GNU/Linux friendly. F77 has serious drawbacks - modern language constructs cannot be used, students do not have skills in this language, and it does not contribute to their future employability. It became clear that the code would have to be ported to C to have a viable future. I describe the approach I used to convert Cloudy from FORTRAN 77 with MILSPEC extensions to ANSI/ISO 89 C. Cloudy is now openly available as a C code, and will evolve to C++ as gcc and standard C++ mature. Cloudy looks to a bright future with a modern language.
The Modernization and Associated Restructuring of the National Weather Service: An Overview.
NASA Astrophysics Data System (ADS)
Friday, Elbert W., Jr.
1994-01-01
The scientific understanding of the atmosphere and the ability to forecast large- and small-scale hydrometeorological phenomena have increased dramatically over the last two decades. As a result, the National Oceanic and Atmospheric Administration has set an ambitious goal: to modernize the National Weather Service (NWS) through the deployment of proven observational, information processing, and communications technologies, and to establish an associated cost-effective operational structure. The modernization and associated restructuring of the NWS will assure that the major advances that have been made in our ability to observe and understand the atmosphere are applied to the practical problems of providing atmospheric and hydrologic services to the nation. Implementation and practice of the new science will improve forecasts, provide more reliable detection of and warnings for severe weather and flooding, achieve more uniform hydrometeorological services across the nation, permit a more cost-effective NWS, and increase productivity among NWS employees. The changes proposed by the NWS will allow increased productivity and efficiency for any entity dependent on weather information, including local, state, and federal government agencies; researchers; private-sector meteorologists; private industry; and resource management organizations. This is the first in a series of articles intended to highlight these changes.
NASA Astrophysics Data System (ADS)
Wolf-Grosse, Tobias; Esau, Igor; Reuder, Joachim
2017-06-01
Street-level urban air pollution is a challenging concern for modern urban societies. Pollution dispersion models assume that the concentrations decrease monotonically with increasing wind speed. This convenient assumption breaks down when applied to flows with local recirculations such as those found in topographically complex coastal areas. This study looks at a practically important and sufficiently common case of air pollution in a coastal valley city. Here, the observed concentrations are determined by the interaction between large-scale topographically forced and local-scale breeze-like recirculations. Analysis of a long observational dataset in Bergen, Norway, revealed that the most extreme cases of recurring wintertime air pollution episodes were accompanied by increased large-scale wind speeds above the valley. Contrary to the theoretical assumption and intuitive expectations, the maximum NO2 concentrations were not found for the lowest 10 m ERA-Interim wind speeds but in situations with wind speeds of 3 m s-1. To explain this phenomenon, we investigated empirical relationships between the large-scale forcing and the local wind and air quality parameters. We conducted 16 large-eddy simulation (LES) experiments with the Parallelised Large-Eddy Simulation Model (PALM) for atmospheric and oceanic flows. The LES accounted for the realistic relief and coastal configuration as well as for the large-scale forcing and local surface condition heterogeneity in Bergen. They revealed that emerging local breeze-like circulations strongly enhance the urban ventilation and dispersion of the air pollutants in situations with weak large-scale winds. Slightly stronger large-scale winds, however, can counteract these local recirculations, leading to enhanced surface air stagnation. Furthermore, this study looks at the concrete impact of the relative configuration of warmer water bodies in the city and the major transport corridor. We found that a relatively small local water body acted as a barrier for the horizontal transport of air pollutants from the largest street in the valley and along the valley bottom, transporting them vertically instead and hence diluting them. We found that the stable stratification accumulates the street-level pollution from the transport corridor in shallow air pockets near the surface. The polluted air pockets are transported by the local recirculations to other less polluted areas with only slow dilution. This combination of relatively long distance and complex transport paths together with weak dispersion is not sufficiently resolved in classical air pollution models. The findings have important implications for the air quality predictions over urban areas. Any prediction not resolving these, or similar local dynamic features, might not be able to correctly simulate the dispersion of pollutants in cities.
Taylor, Andrea B; Vinyard, Christopher J
2013-05-01
The jaw-closing muscles are responsible for generating many of the forces and movements associated with feeding. Muscle physiologic cross-sectional area (PCSA) and fiber length are two architectural parameters that heavily influence muscle function. While there have been numerous comparative studies of hominoid and hominin craniodental and mandibular morphology, little is known about hominoid jaw-muscle fiber architecture. We present novel data on masseter and temporalis internal muscle architecture for small- and large-bodied hominoids. Hominoid scaling patterns are evaluated and compared with representative New World (Cebus) and Old World (Macaca) monkeys. Variation in hominoid jaw-muscle fiber architecture is related to both absolute size and allometry. PCSAs scale close to isometry relative to jaw length in anthropoids, but likely with positive allometry in hominoids. Thus, large-bodied apes may be capable of generating both absolutely and relatively greater muscle forces compared with smaller-bodied apes and monkeys. Compared with extant apes, modern humans exhibit a reduction in masseter PCSA relative to condyle-M1 length but retain relatively long fibers, suggesting humans may have sacrificed relative masseter muscle force during chewing without appreciably altering muscle excursion/contraction velocity. Lastly, craniometric estimates of PCSAs underestimate hominoid masseter and temporalis PCSAs by more than 50% in gorillas, and overestimate masseter PCSA by as much as 30% in humans. These findings underscore the difficulty of accurately estimating jaw-muscle fiber architecture from craniometric measures and suggest models of fossil hominin and hominoid bite forces will be improved by incorporating architectural data in estimating jaw-muscle forces.
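For readers unfamiliar with the architectural variables, physiologic cross-sectional area is conventionally estimated from muscle mass, fiber length, and pinnation; the standard formulation, given here only as background and not taken from this particular study, is:

```latex
\mathrm{PCSA} \;=\; \frac{m \,\cos\theta}{\rho \,\ell_{f}}
```

where m is muscle mass, θ the pinnation angle, ρ muscle density, and ℓ_f mean fiber length. Scaling of PCSA against a jaw-length measure L is then typically assessed from the slope of a log-log regression, with isometry expected at a slope of 2 (an area scaling against a length).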
Ward, Dylan J.; Anderson, Robert S.; Haeussler, Peter J.
2012-01-01
Parts of the Alaska Range (Alaska, USA) stand in prominent exception to the “glacial buzzsaw hypothesis,” which postulates that terrain raised above the ELA is rapidly denuded by glaciers. In this paper, we discuss the role of a strong contrast in rock type in the development of this exceptional terrain. Much of the range is developed on pervasively fractured flysch, with local relief of 1000–1500 m, and mean summit elevations that are similar to modern snow line elevations. In contrast, Cretaceous and Tertiary plutons of relatively intact granite support the range's tallest mountains (including Mt. McKinley, or Denali, at 6194 m), with 2500–5000 m of local relief. The high granitic peaks protrude well above modern snow lines and support many large glaciers. We focus on the plutons of the Denali massif and the Kichatna Mountains, to the west. We use field observations, satellite photos, and digital elevation data to demonstrate how exhumation of these plutons affects glacier longitudinal profiles, the glacial drainage network, and the effectiveness of periglacial processes. In strong granite, steep, smooth valley walls are maintained by detachment of rock slabs along sheeting joints. These steep walls act as low-friction surfaces (“Teflon”), efficiently shedding snow. Simple scaling calculations show that this avalanching may greatly enhance the health of the modern glaciers. We conclude that, in places such as Denali, unusual combinations of rapid tectonic uplift and great rock strength have created the highest relief in North America by enhancing glacial erosion in the valleys while preserving the peaks.
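The "simple scaling calculations" mentioned above are not reproduced in the abstract; the sketch below only illustrates the general form such an estimate could take, assuming avalanche-fed accumulation scales with the ratio of contributing wall area to glacier area. All numbers and the shed fraction are hypothetical.

```python
def effective_accumulation(direct_snowfall_m, wall_area_km2, glacier_area_km2,
                           fraction_shed=0.8):
    """Direct snowfall plus snow avalanched onto the glacier from steep walls.

    Assumes a fraction of the snow falling on the contributing walls is shed
    onto the glacier surface below (hypothetical parameterization).
    """
    avalanched = direct_snowfall_m * fraction_shed * (wall_area_km2 / glacier_area_km2)
    return direct_snowfall_m + avalanched

# Hypothetical cirque glacier: 5 km2 of ice fed by 10 km2 of steep granite walls
print(effective_accumulation(direct_snowfall_m=2.0, wall_area_km2=10.0,
                             glacier_area_km2=5.0))   # -> 5.2 m a-1 equivalent
```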
Zanon, Marco; Davis, Basil A. S.; Marquer, Laurent; Brewer, Simon; Kaplan, Jed O.
2018-01-01
Characterization of land cover change in the past is fundamental to understand the evolution and present state of the Earth system, the amount of carbon and nutrient stocks in terrestrial ecosystems, and the role played by land-atmosphere interactions in influencing climate. The estimation of land cover changes using palynology is a mature field, as thousands of sites in Europe have been investigated over the last century. Nonetheless, a quantitative land cover reconstruction at a continental scale has been largely missing. Here, we present a series of maps detailing the evolution of European forest cover during the last 12,000 years. Our reconstructions are based on the Modern Analog Technique (MAT): a calibration dataset is built by coupling modern pollen samples with the corresponding satellite-based forest-cover data. Fossil reconstructions are then performed by assigning to every fossil sample the average forest cover of its closest modern analogs. The occurrence of fossil pollen assemblages with no counterparts in modern vegetation represents a known limit of analog-based methods. To lessen the influence of no-analog situations, pollen taxa were converted into plant functional types prior to running the MAT algorithm. We then interpolate site-specific reconstructions for each timeslice using a four-dimensional gridding procedure to create continuous gridded maps at a continental scale. The performance of the MAT is compared against methodologically independent forest-cover reconstructions produced using the REVEALS method. MAT and REVEALS estimates are in good agreement at a trend level most of the time, yet MAT regularly underestimates the occurrence of densely forested situations, requiring the application of a bias correction procedure. The calibrated MAT-based maps draw a coherent picture of the establishment of forests in Europe in the Early Holocene with the greatest forest-cover fractions reconstructed between ∼8,500 and 6,000 calibrated years BP. This forest maximum is followed by a general decline in all parts of the continent, likely as a result of anthropogenic deforestation. The continuous spatial and temporal nature of our reconstruction, its continental coverage, and gridded format make it suitable for climate, hydrological, and biogeochemical modeling, among other uses. PMID:29568303
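A minimal sketch of the Modern Analog Technique as described above: every fossil sample is compared against a calibration set of modern pollen assemblages with known satellite forest cover, and it is assigned the mean forest cover of its k closest analogs. The squared-chord distance and the value of k used here are illustrative choices, not necessarily those of the study.

```python
import numpy as np

def squared_chord(a, b):
    """Squared-chord distance between two pollen percentage assemblages."""
    return np.sum((np.sqrt(a) - np.sqrt(b)) ** 2)

def mat_forest_cover(fossil_sample, modern_samples, modern_forest_cover, k=5):
    """Assign the average forest cover of the k closest modern analogs."""
    dists = np.array([squared_chord(fossil_sample, m) for m in modern_samples])
    closest = np.argsort(dists)[:k]
    return modern_forest_cover[closest].mean()

# Toy example: 3 pollen taxa (or plant functional types), 6 modern samples
rng = np.random.default_rng(0)
modern = rng.dirichlet(np.ones(3), size=6) * 100          # percentages
cover = rng.uniform(0, 100, size=6)                       # % forest cover
fossil = np.array([60.0, 30.0, 10.0])
print(mat_forest_cover(fossil, modern, cover, k=3))
```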
NASA Astrophysics Data System (ADS)
Cao, X.; Tian, F.; Telford, R.; Ni, J.; Xu, Q.; Chen, F.; Liu, X.; Stebich, M.; Zhao, Y.; Herzschuh, U.
2017-12-01
Pollen-based quantitative reconstruction of past climate variables is a standard palaeoclimatic approach. Despite knowing that the spatial extent of the calibration-set affects the reconstruction result, guidance is lacking as to how to determine a suitable spatial extent of the pollen-climate calibration-set. In this study, past mean annual precipitation (Pann) during the Holocene (since 11.5 cal ka BP) is reconstructed repeatedly for pollen records from Qinghai Lake (36.7°N, 100.5°E; north-east Tibetan Plateau), Gonghai Lake (38.9°N, 112.2°E; north China) and Sihailongwan Lake (42.3°N, 126.6°E; north-east China) using calibration-sets of varying spatial extents extracted from the modern pollen dataset of China and Mongolia (2559 sampling sites and 168 pollen taxa in total). Results indicate that the spatial extent of the calibration-set has a strong impact on model performance, analogue quality and reconstruction diagnostics (absolute value, range, trend, optimum). Generally, these effects are stronger with the modern analogue technique (MAT) than with weighted averaging partial least squares (WA-PLS). With respect to fossil spectra from northern China, the spatial extent of calibration-sets should be restricted to ca. 1000 km in radius because small-scale calibration-sets (<800 km radius) will likely fail to include enough spatial variation in the modern pollen assemblages to reflect the temporal range shifts during the Holocene, while too broad a calibration-set (>1500 km radius) will include taxa with very different pollen-climate relationships. Based on our results we conclude that the optimal calibration-set should 1) cover a reasonably large spatial extent with an even distribution of modern pollen samples; 2) possess good model performance as indicated by cross-validation, high analogue quality, and excellent fit with the target fossil pollen spectra; 3) possess high taxonomic resolution, and 4) respect the modern and past distribution ranges of taxa inferred from palaeo-genetic and macrofossil studies.
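The varying spatial extents described above are obtained by restricting the calibration set to modern sites within a given radius of the fossil site. A minimal sketch of that filtering step, using the haversine great-circle distance, is shown below; the function names and the 1000 km radius are illustrative.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def calibration_subset(fossil_site, modern_sites, radius_km=1000.0):
    """Keep only modern pollen sites within radius_km of the fossil site."""
    flat, flon = fossil_site
    return [s for s in modern_sites
            if haversine_km(flat, flon, s["lat"], s["lon"]) <= radius_km]

# Example: Qinghai Lake (36.7 N, 100.5 E) against two hypothetical modern sites
sites = [{"lat": 38.0, "lon": 102.0}, {"lat": 25.0, "lon": 121.0}]
print(len(calibration_subset((36.7, 100.5), sites)))   # -> 1
```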
Computation and analysis of cavitating flow in Francis-class hydraulic turbines
NASA Astrophysics Data System (ADS)
Leonard, Daniel J.
Hydropower is the most proven renewable energy technology, supplying the world with 16% of its electricity. Conventional hydropower generates a vast majority of that percentage. Although a mature technology, hydroelectric generation shows great promise for expansion through new dams and plants in developing hydro countries. Moreover, in developed hydro countries, such as the United States, installing generating units in existing dams and the modern refurbishment of existing plants can greatly expand generating capabilities with little to no further impact on the environment. In addition, modern computational technology and fluid dynamics expertise have led to substantial improvements in modern turbine design and performance. Cavitation has always presented a problem in hydroturbines, causing performance breakdown, erosion, damage, vibration, and noise. While modern turbines are usually designed to be cavitation-free at their best efficiency point, due to the variable demand of the energy market it is fairly common to operate at off-design conditions. Here, cavitation and its deleterious effects are unavoidable, and hence, cavitation is a limiting factor on the design and operation of these turbines. Multiphase Computational Fluid Dynamics (CFD) has been used in recent years to model cavitating flow for a large range of problems, including turbomachinery. However, CFD of cavitating flow in hydroturbines is still in its infancy. This dissertation presents steady-periodic Reynolds-averaged Navier-Stokes simulations of a cavitating Francis-class hydroturbine at model and prototype scales. Computational results of the reduced-scale model and full-scale prototype, undergoing performance breakdown, are compared with empirical model data and prototype performance estimations based on standard industry scalings from the model data. Mesh convergence of the simulations is also displayed. Comparisons are made between the scales to show that cavitation performance breakdown can occur more abruptly in the model than the prototype, due to lack of Froude similitude between the two. When severe cavitation occurs, clear differences are observed in vapor content between the scales. A stage-by-stage performance decomposition is conducted to analyze the losses within individual components of each scale of the machine. As cavitation becomes more severe, the losses in the draft tube account for an increasing amount of the total losses in the machine. More losses occur in the model draft tube as cavitation formation in the prototype draft tube is prevented by the larger hydrostatic pressure gradient across the machine. Additionally, unsteady Detached Eddy Simulations of the fully-coupled cavitating hydroturbine are performed for both scales. Both mesh and temporal convergence studies are provided. The temporal and spectral content of fluctuations in torque and pressure is monitored and compared between single-phase, cavitating, model, and prototype cases. A runner imbalance induced by the shallow draft tube results in an asymmetric vapor distribution about the runner, leading to more extensive growth and collapse of vapor on any individual blade as it undergoes a revolution. Unique frequency components manifest and persist through the entire machine only when cavitation is present in the hub vortex. Large maximum pressure spikes, which result from vapor collapse, are observed on the blade surfaces in the multiphase simulations, and these may be a potential source of cavitation damage and erosion.
Multiphase CFD is shown to be an accurate and effective technique for simulating and analyzing cavitating flow in Francis-class hydraulic turbines. It is recommended that it be used as an industrial tool to supplement model cavitation experiments for all types of hydraulic turbines. Moreover, multiphase CFD can be equally effective as a research tool, to investigate mechanisms of cavitating hydraulic turbines that are not understood, and to uncover unique new phenomena which are currently unknown.
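As background for the cavitation behavior described above (this is standard fluid-machinery material, not taken from the dissertation itself), the susceptibility of a flow to cavitation is commonly characterized by a cavitation number comparing the local static pressure margin above vapor pressure to a dynamic pressure scale:

```latex
\sigma \;=\; \frac{p_{\mathrm{ref}} - p_{v}}{\tfrac{1}{2}\,\rho\,V^{2}}
```

with cavitation onset and performance breakdown occurring as σ falls below a critical value; in hydroturbine practice a plant (Thoma-type) coefficient, the ratio of net positive suction head to machine head, plays the analogous role. Because gravity enters through the hydrostatic pressure at the runner, matching a cavitation coefficient at model and prototype scale does not by itself guarantee Froude similitude, which is consistent with the scale differences in breakdown behavior reported above.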
Preface to the volume Large Rivers
NASA Astrophysics Data System (ADS)
Latrubesse, Edgardo M.; Abad, Jorge D.
2018-02-01
The study and knowledge of the geomorphology of large rivers has increased significantly in recent years, and the factors behind these advances are multiple. On one hand, modern technologies became more accessible, and their widespread use allowed the collection of data from large rivers as never before. The generalized use of high-tech data collection with geophysical equipment such as acoustic Doppler current profilers (ADCPs) and multibeam echosounders, plus the availability of geospatial and computational tools for morphodynamic, hydrological and hydrosedimentological modeling, has accelerated scientific production on the geomorphology of large rivers at a global scale. Despite these advances, much work remains. Many of the large rivers are in the tropics, and many are still unexplored. The tropics also hold crucial fluvial basins that concentrate a large part of the gross domestic product of large countries, such as the Parana River in Argentina and Brazil, the Ganges-Brahmaputra in India, the Indus River in Pakistan, and the Mekong River in several countries of Southeast Asia. The environmental importance of tropical rivers is also outstanding. They hold the highest biodiversity of fluvial fauna and alluvial vegetation, and many of them, particularly those in Southeast Asia, are among the most hazardous systems for floods in the entire world. Tropical rivers draining mountain chains such as the Himalaya, the Andes and insular Southeast Asia are also among the most heavily sediment-loaded rivers and play a key role both in the storage of sediment at continental scale and in the transfer of sediment from the continents to the ocean at planetary scale (Andermann et al., 2012; Latrubesse and Restrepo, 2014; Milliman and Syvitski, 1992; Milliman and Farnsworth, 2011; Sinha and Friend, 1994).
Hubble's Cosmology: From a Finite Expanding Universe to a Static Endless Universe
NASA Astrophysics Data System (ADS)
Assis, A. K. T.; Neves, M. C. D.; Soares, D. S. L.
2009-12-01
We analyze the views of Edwin Hubble (1889-1953) as regards the large-scale structure of the universe. In 1929 he initially accepted a finite expanding universe in order to explain the redshifts of distant galaxies. Later on he turned to an infinite stationary universe and a new principle of nature in order to explain the same phenomena. Initially, he was impressed by the agreement of his redshift-distance relation with one of the predictions of de Sitter's cosmological model, namely, the so-called "de Sitter effect", the phenomenon of the scattering of material particles, leading to an expanding universe. A number of observational findings, though, made him highly skeptical of such a scenario; they were better accounted for by an infinite static universe. The evidence he found was: (i) the huge values he was getting for the "recession" velocities of the nebulae (1,800 km s-1 in 1929 up to 42,000 km s-1 in 1942, leading to v/c = 1/7), with the redshifts interpreted as velocity-shifts. All other known real velocities of large astronomical bodies are much smaller than these. (ii) The "number effect" test, which is the running of nebulae luminosity with redshift. Hubble found that a static universe is, within the observational uncertainties, slightly favored. The test is equivalent to the modern "Tolman effect" for galaxy surface brightnesses, whose results are still a matter of dispute. (iii) The smallness of the size and the age of the curved expanding universe, implied by the expansion rate that he had determined, and (iv) the fact that a uniform distribution of galaxies on large scales is more easily obtained from galaxy counts when a static and flat model is considered. In an expanding and closed universe, Hubble found that homogeneity was only obtained at the cost of a large curvature. We show, by quoting his works, that Hubble remained cautiously against the big bang until the end of his life, contrary to the statements of many modern authors. In order to account for redshifts in a non-expanding universe, Hubble called for a new principle of nature, like the "tired-light" mechanism proposed by Fritz Zwicky in 1929. On the other hand, he was aware of the theoretical difficulties of such a radical assumption. Hubble's approach to cosmology strongly suggests that he would not agree with the present status of the modern cosmological paradigm, since he was, above all, driven by observations and by the consequences derived from them.
Carbonell, Felix; Iturria-Medina, Yasser; Evans, Alan C
2018-01-01
Protein misfolding refers to a process where proteins become structurally abnormal and lose their specific 3-dimensional spatial configuration. The histopathological presence of misfolded protein (MP) aggregates has been identified as the primary evidence of multiple neurological diseases, including prion diseases, Alzheimer's disease, Parkinson's disease, and Creutzfeldt-Jakob disease. However, the exact mechanisms of MP aggregation and propagation, as well as their impact on the patient's long-term clinical condition, are still not well understood. With this aim, a variety of mathematical models has been proposed to give better insight into the kinetic rate laws that govern the microscopic processes of protein aggregation. Complementarily, another class of large-scale models relies on modern molecular imaging techniques to describe the phenomenological effects of MP propagation over the whole brain. Unfortunately, those neuroimaging-based studies do not take full advantage of the tremendous capabilities offered by the chemical kinetics modeling approach. Indeed, it has been barely acknowledged that the vast majority of large-scale models build on previous mathematical approaches that describe the chemical kinetics of protein replication and propagation. The purpose of the current manuscript is to present a historical review of the development of mathematical models describing both the microscopic processes that occur during MP aggregation and the large-scale events that characterize the progression of neurodegenerative MP-mediated diseases.
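As a concrete illustration of the "kinetic rate law" style of model the review surveys (this is a generic autocatalytic conversion sketch with hypothetical rate constants, not any specific published model), misfolded protein can be treated as templating the conversion of natively folded protein:

```python
# Generic autocatalytic (template-assisted) conversion sketch:
#   dM/dt = -k_conv * M * P            (native M converted on contact with misfolded P)
#   dP/dt =  k_conv * M * P - k_clear * P
# Integrated with a simple forward-Euler step; rate constants are hypothetical.

def simulate(m0=1.0, p0=1e-3, k_conv=1.0, k_clear=0.05, dt=0.01, steps=2000):
    m, p, history = m0, p0, []
    for _ in range(steps):
        conv = k_conv * m * p
        m += dt * (-conv)
        p += dt * (conv - k_clear * p)
        history.append(p)
    return history

trajectory = simulate()
print(f"misfolded fraction at end: {trajectory[-1]:.3f}")  # sigmoidal growth, then slow clearance
```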
Load Balancing Scientific Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pearce, Olga Tkachyshyn
2014-12-01
The largest supercomputers have millions of independent processors, and concurrency levels are rapidly increasing. For ideal efficiency, developers of the simulations that run on these machines must ensure that computational work is evenly balanced among processors. Assigning work evenly is challenging because many large modern parallel codes simulate behavior of physical systems that evolve over time, and their workloads change over time. Furthermore, the cost of imbalanced load increases with scale because most large-scale scientific simulations today use a Single Program Multiple Data (SPMD) parallel programming model, and an increasing number of processors will wait for the slowest one at the synchronization points. To address load imbalance, many large-scale parallel applications use dynamic load balance algorithms to redistribute work evenly. The research objective of this dissertation is to develop methods to decide when and how to load balance the application, and to balance it effectively and affordably. We measure and evaluate the computational load of the application, and develop strategies to decide when and how to correct the imbalance. Depending on the simulation, a fast, local load balance algorithm may be suitable, or a more sophisticated and expensive algorithm may be required. We developed a model for comparison of load balance algorithms for a specific state of the simulation that enables the selection of a balancing algorithm that will minimize overall runtime.
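A minimal sketch of the kind of decision described above: measure per-process load, quantify imbalance, and rebalance only when the projected time saved over the coming steps exceeds the cost of the balancing step itself. The cost model and numbers below are illustrative assumptions, not the dissertation's model.

```python
def imbalance(loads):
    """Load imbalance factor: max over mean (1.0 means perfectly balanced)."""
    return max(loads) / (sum(loads) / len(loads))

def should_rebalance(loads, rebalance_cost, steps_until_next_check):
    """Rebalance if projected savings from evening out the work exceed the cost.

    In an SPMD step every process waits for the slowest one, so time per step is
    proportional to max(loads); after a perfect rebalance it would be mean(loads).
    """
    per_step_saving = max(loads) - sum(loads) / len(loads)
    return per_step_saving * steps_until_next_check > rebalance_cost

loads = [1.0, 1.1, 0.9, 2.4]            # seconds of work per process
print(imbalance(loads))                  # ~1.78
print(should_rebalance(loads, rebalance_cost=5.0, steps_until_next_check=10))  # True
```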
Large ejecta fragments from asteroids. [Abstract only
NASA Technical Reports Server (NTRS)
Asphaug, E.
1994-01-01
The asteroid 4 Vesta, with its unique basaltic crust, remains a key mystery of planetary evolution. A localized olivine feature suggests excavation of subcrustal material in a crater or impact basin comparable in size to the planetary radius (R_Vesta ≈ 280 km). Furthermore, a 'clan' of small asteroids associated with Vesta (by spectral and orbital similarities) may be ejecta from this impact and direct parents of the basaltic achondrites. To escape, these smaller (about 4-7 km) asteroids had to be ejected at speeds greater than the escape velocity, v_esc ≈ 350 m/s. This evidence that large fragments were ejected at high speed from Vesta has not been reconciled with the present understanding of impact physics. Analytical spallation models predict that an impactor capable of ejecting these 'chips off Vesta' would be almost the size of Vesta! Such an impact would lead to the catastrophic disruption of both bodies. A simpler analysis is outlined, based on comparison with cratering on Mars, and it is shown that Vesta could survive an impact capable of ejecting kilometer-scale fragments at sufficient speed. To what extent does Vesta survive the formation of such a large crater? This is best addressed using a hydrocode such as SALE 2D with centroidal gravity to predict velocities subsequent to impact. The fragmentation outcome and velocities subsequent to the impact are described to demonstrate that Vesta survives without large-scale disassembly or overturning of the crust. Vesta and its clan represent a valuable dataset for testing fragmentation hydrocodes such as SALE 2D and SPH 3D at planetary scales. Resolution required to directly model spallation 'chips' on a body 100 times as large is now marginally possible on modern workstations. These boundaries are important in near-surface ejection processes and in large-scale disruption leading to asteroid families and stripped cores.
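The ~350 m/s escape speed quoted above follows directly from Vesta's bulk properties; a quick check (using a commonly cited mass of ~2.6 × 10^20 kg, which is an assumption of this sketch rather than a value from the abstract) is:

```python
from math import sqrt

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_VESTA = 2.6e20       # kg, approximate literature value (assumption)
R_VESTA = 280e3        # m, radius used in the abstract

v_esc = sqrt(2 * G * M_VESTA / R_VESTA)
print(f"escape velocity ~ {v_esc:.0f} m/s")   # ~350 m/s
```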
Why are U.S. nuclear weapon modernization efforts controversial?
NASA Astrophysics Data System (ADS)
Acton, James
2016-03-01
U.S. nuclear weapon modernization programs are focused on extending the lives of existing warheads and developing new delivery vehicles to replace ageing bombers, intercontinental ballistic missiles, and ballistic missile submarines. These efforts are contested and controversial. Some critics argue that they are largely unnecessary, financially wasteful and potentially destabilizing. Other critics posit that they do not go far enough and that nuclear weapons with new military capabilities are required. At its core, this debate centers on three strategic questions. First, what roles should nuclear weapons be assigned? Second, what military capabilities do nuclear weapons need to fulfill these roles? Third, how severe are the unintended escalation risks associated with particular systems? Proponents of scaled-down modernization efforts generally argue for reducing the role of nuclear weapons but also that, even under existing policy, new military capabilities are not required. They also tend to stress the escalation risks of new--and even some existing--capabilities. Proponents of enhanced modernization efforts tend to advocate for a more expansive role for nuclear weapons in national security strategy. They also often argue that nuclear deterrence would be enhanced by lower yield weapons and/or so called bunker busters able to destroy more deeply buried targets. The debate is further fueled by technical disagreements over many aspects of ongoing and proposed modernization efforts. Some of these disagreements--such as the need for warhead life extension programs and their necessary scope--are essentially impossible to resolve at the unclassified level. By contrast, unclassified analysis can help elucidate--though not answer--other questions, such as the potential value of bunker busters.
NASA Astrophysics Data System (ADS)
O'Brien, T. A.; Kashinath, K.; Collins, W.
2015-12-01
Over warm tropical oceans the increase in greenhouse trapping with increasing SST is faster than that of the surface emission, resulting in a decrease in outgoing longwave radiation at the top of the atmosphere (OLR) when SST increases, also known as the super greenhouse effect (SGE). If SGE is directly linked to SST changes, there are profound implications for positive climate feedbacks in the tropics. However, a number of studies in the last 20 years have provided compelling evidence that the OLR-SST relationship is coincidental rather than causal. These studies suggested that the onset of SGE is dominated by the large-scale dynamics, and that the apparent OLR-SST relationships disappear when individual large-scale regimes are considered. We show that these conclusions are contingent on the quality of the datasets used in the analysis, and that modern satellite observations and reanalyses support a strong relationship between SGE and SST. We find that the SGE occurs across all dynamical regimes, suggesting that this may be related primarily to SST rather than large-scale dynamics. We also find that the discontinuity in the relationship between OLR and SST at high SST (29.5 C), i.e. the shutdown of SGE, also occurs across almost all dynamical regimes, suggesting that this behavior may also be strongly linked to SST. Collectively, these results suggest that the SGE may actually be controlled by SST. Work is ongoing to understand the robustness of this new result to other datasets, to understand whether SST is truly the controlling variable, and to understand the mechanism by which OLR could decrease with increasing SST even under strongly subsiding conditions.
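A minimal sketch of the regime-sorting analysis described above: bin grid cells by a large-scale dynamical proxy (for instance, mid-tropospheric vertical velocity) and, within each regime, examine how OLR varies with SST. The variable names, units, and bin edges are placeholders for whichever satellite and reanalysis fields are actually used.

```python
import numpy as np

def regime_sorted_slopes(sst, olr, omega500, regime_edges=(-np.inf, -0.05, 0.05, np.inf)):
    """Slope of OLR vs SST within each dynamical regime (ascent, neutral, descent).

    A negative slope within a regime indicates super-greenhouse behaviour that is
    not simply an artefact of sampling different circulation regimes.
    """
    slopes = {}
    for lo, hi in zip(regime_edges[:-1], regime_edges[1:]):
        mask = (omega500 >= lo) & (omega500 < hi)
        if mask.sum() > 2:
            slopes[(lo, hi)] = np.polyfit(sst[mask], olr[mask], 1)[0]  # W m-2 per K
    return slopes

# Toy synthetic data standing in for collocated SST, OLR and omega500 fields
rng = np.random.default_rng(1)
sst = rng.uniform(26, 31, 500)
omega = rng.normal(0, 0.08, 500)
olr = 290 - 4.0 * (sst - 26) + rng.normal(0, 3, 500)   # built-in SGE-like slope
print(regime_sorted_slopes(sst, olr, omega))
```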
Scalable Performance Measurement and Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gamblin, Todd
2009-01-01
Concurrency levels in large-scale, distributed-memory supercomputers are rising exponentially. Modern machines may contain 100,000 or more microprocessor cores, and the largest of these, IBM's Blue Gene/L, contains over 200,000 cores. Future systems are expected to support millions of concurrent tasks. In this dissertation, we focus on efficient techniques for measuring and analyzing the performance of applications running on very large parallel machines. Tuning the performance of large-scale applications can be a subtle and time-consuming task because application developers must measure and interpret data from many independent processes. While the volume of the raw data scales linearly with the number of tasks in the running system, the number of tasks is growing exponentially, and data for even small systems quickly becomes unmanageable. Transporting performance data from so many processes over a network can perturb application performance and make measurements inaccurate, and storing such data would require a prohibitive amount of space. Moreover, even if it were stored, analyzing the data would be extremely time-consuming. In this dissertation, we present novel methods for reducing performance data volume. The first draws on multi-scale wavelet techniques from signal processing to compress systemwide, time-varying load-balance data. The second uses statistical sampling to select a small subset of running processes to generate low-volume traces. A third approach combines sampling and wavelet compression to stratify performance data adaptively at run-time and to reduce further the cost of sampled tracing. We have integrated these approaches into Libra, a toolset for scalable load-balance analysis. We present Libra and show how it can be used to analyze data from large scientific applications scalably.
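A minimal sketch of the wavelet-compression idea described above, using PyWavelets purely as an illustrative library (the dissertation's Libra toolset has its own implementation): decompose a per-process load time series, discard small coefficients, and keep only the large ones.

```python
import numpy as np
import pywt  # PyWavelets, used here only for illustration

def compress_load_trace(trace, keep_fraction=0.1, wavelet="db4"):
    """Keep only the largest wavelet coefficients of a time-varying load signal."""
    coeffs = pywt.wavedec(trace, wavelet)
    flat = np.concatenate(coeffs)
    threshold = np.quantile(np.abs(flat), 1.0 - keep_fraction)
    kept = [pywt.threshold(c, threshold, mode="hard") for c in coeffs]
    return pywt.waverec(kept, wavelet)

# Synthetic load trace: slow drift plus noise, like per-timestep work on one rank
t = np.linspace(0, 1, 1024)
trace = 1.0 + 0.3 * np.sin(2 * np.pi * 3 * t) \
        + 0.02 * np.random.default_rng(2).normal(size=t.size)
reconstructed = compress_load_trace(trace)
print(np.max(np.abs(reconstructed[: t.size] - trace)))   # small reconstruction error
```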
Emergy Expenditure Among Municipal Wastewater Treatment Systems Across US
The urbanization of the modern community creates large population centers that generate concentrated wastewater. A large expenditure on wastewater treatment is required to make a modern city function without human and environmental health problems. Society relies on syste...
Ordering Unstructured Meshes for Sparse Matrix Computations on Leading Parallel Systems
NASA Technical Reports Server (NTRS)
Oliker, Leonid; Li, Xiaoye; Heber, Gerd; Biswas, Rupak
2000-01-01
The ability of computers to solve hitherto intractable problems and simulate complex processes using mathematical models makes them an indispensable part of modern science and engineering. Computer simulations of large-scale realistic applications usually require solving a set of non-linear partial differential equations (PDEs) over a finite region. For example, one thrust area in the DOE Grand Challenge projects is to design future accelerators such as the Spallation Neutron Source (SNS). Our colleagues at SLAC need to model complex RFQ cavities with large aspect ratios. Unstructured grids are currently used to resolve the small features in a large computational domain; dynamic mesh adaptation will be added in the future for additional efficiency. The PDEs for electromagnetics are discretized by the finite element method (FEM), which leads to a generalized eigenvalue problem Kx = λMx, where K and M are the stiffness and mass matrices, and are very sparse. In a typical cavity model, the number of degrees of freedom is about one million. For such large eigenproblems, direct solution techniques quickly reach the memory limits. Instead, the most widely-used methods are Krylov subspace methods, such as Lanczos or Jacobi-Davidson. In all the Krylov-based algorithms, sparse matrix-vector multiplication (SPMV) must be performed repeatedly. Therefore, the efficiency of SPMV usually determines the eigensolver speed. SPMV is also one of the most heavily used kernels in large-scale numerical simulations.
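Since SPMV efficiency determines the eigensolver speed, here is a minimal sketch of the kernel itself on a compressed sparse row (CSR) matrix; in practice an optimized library routine would be used, and the ordering of the unstructured mesh (the subject of the paper) determines the memory-access pattern of exactly this loop.

```python
import numpy as np

def csr_spmv(values, col_idx, row_ptr, x):
    """y = A @ x for A stored in compressed sparse row (CSR) format."""
    n_rows = len(row_ptr) - 1
    y = np.zeros(n_rows)
    for i in range(n_rows):
        for k in range(row_ptr[i], row_ptr[i + 1]):
            y[i] += values[k] * x[col_idx[k]]
    return y

# 3x3 example:  [[2, 0, 1],
#                [0, 3, 0],
#                [4, 0, 5]]
values  = np.array([2.0, 1.0, 3.0, 4.0, 5.0])
col_idx = np.array([0, 2, 1, 0, 2])
row_ptr = np.array([0, 2, 3, 5])
print(csr_spmv(values, col_idx, row_ptr, np.array([1.0, 1.0, 1.0])))  # [3. 3. 9.]
```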
Enabling Interactive Measurements from Large Coverage Microscopy
Bajcsy, Peter; Vandecreme, Antoine; Amelot, Julien; Chalfoun, Joe; Majurski, Michael; Brady, Mary
2017-01-01
Microscopy could be an important tool for characterizing stem cell products if quantitative measurements could be collected over multiple spatial and temporal scales. Cells change state over time and are several orders of magnitude smaller than cell products, and modern microscopes are already capable of imaging large spatial areas, repeating imaging over time, and acquiring images over several spectra. However, characterizing stem cell products from such large image collections is challenging because of data size, required computations, and the lack of interactive quantitative measurements needed to determine release criteria. We present a measurement web system consisting of available algorithms, extensions to a client-server framework using Deep Zoom, and the configuration know-how to provide the information needed for inspecting the quality of a cell product. The cell and other data sets are accessible via the prototype web-based system at http://isg.nist.gov/deepzoomweb. PMID:28663600
A transparently scalable visualization architecture for exploring the universe.
Fu, Chi-Wing; Hanson, Andrew J
2007-01-01
Modern astronomical instruments produce enormous amounts of three-dimensional data describing the physical Universe. The currently available data sets range from the solar system to nearby stars and portions of the Milky Way Galaxy, including the interstellar medium and some extrasolar planets, and extend out to include galaxies billions of light years away. Because of its gigantic scale and the fact that it is dominated by empty space, modeling and rendering the Universe is very different from modeling and rendering ordinary three-dimensional virtual worlds at human scales. Our purpose is to introduce a comprehensive approach to an architecture solving this visualization problem that encompasses the entire Universe while seeking to be as scale-neutral as possible. One key element is the representation of model-rendering procedures using power scaled coordinates (PSC), along with various PSC-based techniques that we have devised to generalize and optimize the conventional graphics framework to the scale domains of astronomical visualization. Employing this architecture, we have developed an assortment of scale-independent modeling and rendering methods for a large variety of astronomical models, and have demonstrated scale-insensitive interactive visualizations of the physical Universe covering scales ranging from human scale to the Earth, to the solar system, to the Milky Way Galaxy, and to the entire observable Universe.
Hansen, Jeffrey; Jurgens, Bryant; Fram, Miranda S.
2018-01-01
Total dissolved solids (TDS) concentrations in groundwater tapped for beneficial uses (drinking water, irrigation, freshwater industrial) have increased on average by about 100 mg/L over the last 100 years in the San Joaquin Valley, California (SJV). During this period land use in the SJV changed from natural vegetation and dryland agriculture to dominantly irrigated agriculture with growing urban areas. Century-scale salinity trends were evaluated by comparing TDS concentrations and major ion compositions of groundwater from wells sampled in 1910 (Historic) to data from wells sampled in 1993-2015 (Modern). TDS concentrations in subregions of the SJV, the southern (SSJV), western (WSJV), northeastern (NESJV), and southeastern (SESJV) were calculated using a cell-declustering method. TDS concentrations increased in all regions, with the greatest increases found in the SSJV and SESJV. Evaluation of the Modern data from the NESJV and SESJV found higher TDS concentrations in recently recharged (post-1950) groundwater from shallow (< 50 m) wells surrounded predominantly by agricultural land uses, while premodern (pre-1950) groundwater from deeper wells, and recently recharged groundwater from wells surrounded by mainly urban, natural, and mixed land uses had lower TDS concentrations, approaching the TDS concentrations in the Historic groundwater. For the NESJV and SESJV, inverse geochemical modeling with PHREEQC indicated that weathering of primary silicate minerals accounted for the majority of the increase in TDS concentrations, contributing more than nitrate from fertilizers and sulfate from soil amendments combined. Bicarbonate showed the greatest increase among major ions, resulting from enhanced silicate weathering due to recharge of irrigation water enriched in CO2 during the growing season. The results of this study demonstrate that large anthropogenic changes to the hydrologic regime, like massive development of irrigated agriculture in semi-arid areas like the SJV, can cause large changes in groundwater quality on a regional scale.
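A minimal sketch of the cell-declustering step mentioned above: wells are binned into grid cells, each cell contributes the mean of its wells, and the regional value is the mean of the cell means, so that densely sampled areas do not dominate the estimate. The cell size here is an arbitrary illustrative choice, not the one used in the study.

```python
from collections import defaultdict
from statistics import mean

def cell_declustered_mean(wells, cell_size_deg=0.25):
    """Regional mean TDS weighted equally by grid cell rather than by well."""
    cells = defaultdict(list)
    for lat, lon, tds in wells:
        key = (round(lat // cell_size_deg), round(lon // cell_size_deg))
        cells[key].append(tds)
    return mean(mean(v) for v in cells.values())

# Three wells clustered in one cell plus one isolated well
wells = [(36.70, -120.10, 600.0), (36.71, -120.12, 620.0),
         (36.72, -120.11, 580.0), (37.90, -121.40, 300.0)]
print(cell_declustered_mean(wells))   # 450.0, not the well-weighted 525.0
```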
Large-scale-system effectiveness analysis. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patton, A.D.; Ayoub, A.K.; Foster, J.W.
1979-11-01
The objective of the research project has been the investigation and development of methods for calculating system reliability indices that have absolute, measurable significance to consumers. Such indices are a necessary prerequisite to any scheme for system optimization which includes the economic consequences of consumer service interruptions. A further area of investigation has been the joint consideration of generation and transmission in reliability studies. Methods for finding or estimating the probability distributions of some measures of reliability performance have been developed. The application of modern Monte Carlo simulation methods to compute reliability indices in generating systems has been studied.
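A minimal sketch of the Monte Carlo approach referred to above: sample random outage states of the generating units, compare available capacity with load, and count the fraction of samples in which load cannot be met. The unit data and outage rates are hypothetical.

```python
import random

def loss_of_load_probability(units, load_mw, n_samples=100_000, seed=0):
    """Estimate the probability that available generation cannot meet load.

    `units` is a list of (capacity_mw, forced_outage_rate) pairs.
    """
    rng = random.Random(seed)
    shortfalls = 0
    for _ in range(n_samples):
        available = sum(cap for cap, q in units if rng.random() > q)
        if available < load_mw:
            shortfalls += 1
    return shortfalls / n_samples

units = [(200, 0.05)] * 5 + [(100, 0.08)] * 4   # hypothetical generating system
print(loss_of_load_probability(units, load_mw=1100))
```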
Intermediate Palomar Transient Factory: Realtime Image Subtraction Pipeline
Cao, Yi; Nugent, Peter E.; Kasliwal, Mansi M.
2016-09-28
A fast-turnaround pipeline for realtime data reduction plays an essential role in discovering and permitting follow-up observations of young supernovae and fast-evolving transients in modern time-domain surveys. In this paper, we present the realtime image subtraction pipeline in the intermediate Palomar Transient Factory. By using high-performance computing, efficient databases, and machine-learning algorithms, this pipeline manages to reliably deliver transient candidates within 10 minutes of images being taken. Our experience in using high-performance computing resources to process big data in astronomy serves as a trailblazer for dealing with data from large-scale time-domain facilities in the near future.
Mathematical geophysics: A survey of recent developments in seismology and geodynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vlaar, N.J.
1988-01-01
This survey deals with modern methods for the determination of the structure of the Earth and for the analysis and modeling of the dynamic processes in the Earth's interior. Seismology and the three-dimensional structure of the Earth are covered in chapters devoted to waves in the three-dimensional Earth and large-scale inversion, while the discussion of convection and lithospheric processes focuses on geomagnetism, mantle convection, post-glacial rebound, and thermomechanical processes in the lithosphere. The emphasis of the work is theoretical, but the reader will find a discussion of the pertinent observational evidence.
NASA Technical Reports Server (NTRS)
Criswell, D. R. (Editor); Freeman, J. W. (Editor)
1974-01-01
Reviewed are the active mechanisms relating the moon to its environment and the linkage between these mechanisms and their records in the lunar sample and geophysical data. Topics: (1) large scale plasma interactions with the moon and non-magnetic planets; (2) ancient and present day lunar surface magnetic and electric fields; (3) dynamics and evolution of the lunar atmosphere; (4) evolution of the solar plasma; (5) lunar record of solar radiations; (6) non-meteoritic and meteoritic disturbance and transport of lunar surface materials; and (7) future lunar exploration.
Archean Microbial Mat Communities
NASA Astrophysics Data System (ADS)
Tice, Michael M.; Thornton, Daniel C. O.; Pope, Michael C.; Olszewski, Thomas D.; Gong, Jian
2011-05-01
Much of the Archean record of microbial communities consists of fossil mats and stromatolites. Critical physical emergent properties governing the evolution of large-scale (centimeters to meters) topographic relief on the mat landscape are (a) mat surface roughness relative to the laminar sublayer and (b) cohesion. These properties can be estimated for fossil samples under many circumstances. A preliminary analysis of Archean mat cohesion suggests that mats growing in shallow marine environments from throughout this time had cohesions similar to those of modern shallow marine mats. There may have been a significant increase in mat strength at the end of the Archean.
Do the pyramids show continental drift?
Pawley, G S; Abrahamsen, N
1973-03-02
The mystery of the orientation of the Great Pyramids of Giza has remained unexplained for many decades. The general alignment is 4 minutes west of north. It is argued that this is not a builders' error but is caused by movement over the centuries. Modern theories of continental drift do not predict quite such large movements, but other causes of polar wandering give even smaller shifts. Thus, continental drift is the most likely explanation, although somewhat implausible, especially as relevant measurements have been made over a 50-year period, whereas geophysical measurements of sea-floor spreading relate to million-year time scales.
Modernization of Koesters interferometer and high accuracy calibration gauge blocks
NASA Astrophysics Data System (ADS)
França, R. S.; Silva, I. L. M.; Couceiro, I. B.; Torres, M. A. C.; Bessa, M. S.; Costa, P. A.; Oliveira, W., Jr.; Grieneisen, H. P. H.
2016-07-01
The Optical Metrology Division (Diopt) of Inmetro is responsible for maintaining the national reference of the length unit according to International System of Units (SI) definitions. The length unit is realized by interferometric techniques and is disseminated to the dimensional community through calibrations of gauge blocks. Calibration of large gauge blocks from 100 mm to 1000 mm has been performed by Diopt with a Koesters interferometer with reference to spectral lines of a krypton discharge lamp. Replacement of this lamp by frequency stabilized lasers, traceable now to the time and frequency scale, is described and the first results are reported.
Chola, Lumbwe; McGee, Shelley; Tugendhaft, Aviva; Buchmann, Eckhart; Hofman, Karen
2015-01-01
Family planning contributes significantly to the prevention of maternal and child mortality. However, many women still do not use modern contraception and the numbers of unintended pregnancies, abortions and subsequent deaths are high. In this paper, we estimate the service delivery costs of scaling up modern contraception, and the potential impact on maternal, newborn and child survival in South Africa. The Family Planning model in Spectrum was used to project the impact of modern contraception on pregnancies, abortions and births in South Africa (2015-2030). The contraceptive prevalence rate (CPR) was increased annually by 0.68 percentage points. The Lives Saved Tool was used to estimate maternal and child deaths, with coverage of essential maternal and child health interventions increasing by 5% annually. A scenario analysis was done to test impacts when: the change in CPR was 0.1% annually; and intervention coverage increased linearly to 99% in 2030. If CPR increased by 0.68% annually, the number of pregnancies would reduce from 1.3 million in 2014 to one million in 2030. Unintended pregnancies, abortions and births decrease by approximately 20%. Family planning can avert approximately 7,000 newborn and child and 600 maternal deaths. The total annual costs of providing modern contraception in 2030 are estimated to be US$33 million and the cost per user of modern contraception is US$7 per year. The incremental cost per life year gained is US$40 for children and US$1,000 for mothers. Maternal and child mortality remain high in South Africa, and scaling up family planning together with optimal maternal, newborn and child care is crucial. A huge impact can be made on maternal and child mortality, with a minimal investment per user of modern contraception.
Systems biology for understanding and engineering of heterotrophic oleaginous microorganisms.
Park, Beom Gi; Kim, Minsuk; Kim, Joonwon; Yoo, Heewang; Kim, Byung-Gee
2017-01-01
Heterotrophic oleaginous microorganisms continue to draw interest as they can accumulate large amounts of lipids, a promising feedstock for the production of biofuels and oleochemicals. Nutrient limitation, especially nitrogen limitation, is known to effectively trigger lipid production in these microorganisms. With the aim of developing improved strains, the mechanisms behind lipid production have been studied for a long time. Nowadays, system-level understanding of their metabolism and associated metabolic switches is attainable with modern systems biology tools. This work reviews systems biology studies of heterotrophic oleaginous microorganisms, based on (i) top-down, large-scale 'omics' tools and (ii) bottom-up mathematical modeling methods, with an emphasis on further application to metabolic engineering.
NASA Astrophysics Data System (ADS)
Roadman, Jason; Mohseni, Kamran
2009-11-01
Modern technology operating in the atmospheric boundary layer could benefit from more accurate wind tunnel testing. While scaled atmospheric boundary layer tunnels have been well developed, tunnels replicating portions of the turbulence of the atmospheric boundary layer at full scale are a comparatively new concept. Testing at full-scale Reynolds numbers with full-scale turbulence in an "atmospheric wind tunnel" is sought. Many programs could utilize such a tool, including those for Micro Aerial Vehicles (MAVs) and other unmanned aircraft, the wind energy industry, fuel-efficient vehicles, and the study of bird and insect flight. The construction of an active "gust generator" for a new atmospheric tunnel is reviewed and the turbulence it generates is measured utilizing single and cross hot wires. Results from this grid are compared to atmospheric turbulence, and it is shown that various gust strengths can be produced, corresponding to days ranging from calm to quite gusty. An initial test is performed in the atmospheric wind tunnel whereby the effects of various turbulence conditions on transition and separation on the upper surface of a MAV wing are investigated using oil flow visualization.
Introducing Large-Scale Innovation in Schools
NASA Astrophysics Data System (ADS)
Sotiriou, Sofoklis; Riviou, Katherina; Cherouvis, Stephanos; Chelioti, Eleni; Bogner, Franz X.
2016-08-01
Education reform initiatives tend to promise higher effectiveness in classrooms especially when emphasis is given to e-learning and digital resources. Practical changes in classroom realities or school organization, however, are lacking. A major European initiative entitled Open Discovery Space (ODS) examined the challenge of modernizing school education via a large-scale implementation of an open-scale methodology in using technology-supported innovation. The present paper describes this innovation scheme which involved schools and teachers all over Europe, embedded technology-enhanced learning into wider school environments and provided training to teachers. Our implementation scheme consisted of three phases: (1) stimulating interest, (2) incorporating the innovation into school settings and (3) accelerating the implementation of the innovation. The scheme's impact was monitored for a school year using five indicators: leadership and vision building, ICT in the curriculum, development of ICT culture, professional development support, and school resources and infrastructure. Based on about 400 schools, our study produced four results: (1) The growth in digital maturity was substantial, even for previously high scoring schools. This was even more important for indicators such as "vision and leadership" and "professional development." (2) The evolution of networking is presented graphically, showing the gradual growth of connections achieved. (3) These communities became core nodes, involving numerous teachers in sharing educational content and experiences: One out of three registered users (36 %) has shared his/her educational resources in at least one community. (4) Satisfaction scores ranged from 76 % (offer of useful support through teacher academies) to 87 % (good environment to exchange best practices). Initiatives such as ODS add substantial value to schools on a large scale.
Blazing Signature Filter: a library for fast pairwise similarity comparisons
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Joon-Yong; Fujimoto, Grant M.; Wilson, Ryan
Identifying similarities between datasets is a fundamental task in data mining and has become an integral part of modern scientific investigation. Whether the task is to identify co-expressed genes in large-scale expression surveys or to predict combinations of gene knockouts which would elicit a similar phenotype, the underlying computational task is often a multi-dimensional similarity test. As datasets continue to grow, improvements to the efficiency, sensitivity or specificity of such computation will have broad impacts as they allow scientists to more completely explore the wealth of scientific data. A significant practical drawback of large-scale data mining is that the vast majority of pairwise comparisons are unlikely to be relevant, meaning that they do not share a signature of interest. It is therefore essential to identify these unproductive comparisons as rapidly as possible and exclude them from more time-intensive similarity calculations. The Blazing Signature Filter (BSF) is a highly efficient pairwise similarity algorithm which enables extensive data mining within a reasonable amount of time. The algorithm transforms datasets into binary metrics, allowing it to utilize the computationally efficient bit operators and provide a coarse measure of similarity. As a result, the BSF can scale to high dimensionality and rapidly filter unproductive pairwise comparisons. Two bioinformatics applications of the tool are presented to demonstrate the ability to scale to billions of pairwise comparisons and the usefulness of this approach.
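The core idea of the BSF, binarize each profile and use cheap bit-level overlap as a pre-filter before any expensive similarity metric, can be conveyed with a short sketch. The version below is a minimal NumPy illustration under assumed thresholds; the function names and cutoffs are hypothetical and it does not reproduce the packed 64-bit word implementation of the published library.

```python
import numpy as np

def binarize(matrix):
    """Turn a features x samples matrix into 0/1 signatures
    (1 where a value exceeds its column median)."""
    return (matrix > np.median(matrix, axis=0)).astype(np.uint8)

def shared_bits(sig_a, sig_b):
    """Coarse similarity: count of positions set in both signatures."""
    return int(np.count_nonzero(sig_a & sig_b))

def candidate_pairs(signatures, min_shared):
    """Yield only the column pairs whose bit overlap is large enough to be
    worth a full (expensive) similarity calculation."""
    n = signatures.shape[1]
    for i in range(n):
        for j in range(i + 1, n):
            if shared_bits(signatures[:, i], signatures[:, j]) >= min_shared:
                yield i, j

# Usage: pre-filter 200 random 500-feature profiles before a costly correlation step.
profiles = np.random.rand(500, 200)
survivors = list(candidate_pairs(binarize(profiles), min_shared=150))
print(f"{len(survivors)} of {200 * 199 // 2} pairs pass the bit filter")
```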
Ghost reefs: Nautical charts document large spatial scale of coral reef loss over 240 years
McClenachan, Loren; O’Connor, Grace; Neal, Benjamin P.; Pandolfi, John M.; Jackson, Jeremy B. C.
2017-01-01
Massive declines in population abundances of marine animals have been documented over century-long time scales. However, analogous loss of spatial extent of habitat-forming organisms is less well known because georeferenced data are rare over long time scales, particularly in subtidal, tropical marine regions. We use high-resolution historical nautical charts to quantify changes to benthic structure over 240 years in the Florida Keys, finding an overall loss of 52% (SE, 6.4%) of the area of the seafloor occupied by corals. We find a strong spatial dimension to this decline; the spatial extent of coral in Florida Bay and nearshore declined by 87.5% (SE, 7.2%) and 68.8% (SE, 7.5%), respectively, whereas that of offshore areas of coral remained largely intact. These estimates add to finer-scale loss in live coral cover exceeding 90% in some locations in recent decades. The near-complete elimination of the spatial coverage of nearshore coral represents an underappreciated spatial component of the shifting baseline syndrome, with important lessons for other species and ecosystems. That is, modern surveys are typically designed to assess change only within the species’ known, extant range. For species ranging from corals to sea turtles, this approach may overlook spatial loss over longer time frames, resulting in both overly optimistic views of their current conservation status and underestimates of their restoration potential. PMID:28913420
Herculano-Houzel, Suzana; Manger, Paul R.; Kaas, Jon H.
2014-01-01
Enough species have now been subject to systematic quantitative analysis of the relationship between the morphology and cellular composition of their brain that patterns begin to emerge and shed light on the evolutionary path that led to mammalian brain diversity. Based on an analysis of the shared and clade-specific characteristics of 41 modern mammalian species in 6 clades, and in light of the phylogenetic relationships among them, here we propose that ancestral mammal brains were composed and scaled in their cellular composition like modern afrotherian and glire brains: with an addition of neurons that is accompanied by a decrease in neuronal density and very little modification in glial cell density, implying a significant increase in average neuronal cell size in larger brains, and the allocation of approximately 2 neurons in the cerebral cortex and 8 neurons in the cerebellum for every neuron allocated to the rest of brain. We also propose that in some clades the scaling of different brain structures has diverged away from the common ancestral layout through clade-specific (or clade-defining) changes in how average neuronal cell mass relates to numbers of neurons in each structure, and how numbers of neurons are differentially allocated to each structure relative to the number of neurons in the rest of brain. Thus, the evolutionary expansion of mammalian brains has involved both concerted and mosaic patterns of scaling across structures. This is, to our knowledge, the first mechanistic model that explains the generation of brains large and small in mammalian evolution, and it opens up new horizons for seeking the cellular pathways and genes involved in brain evolution. PMID:25157220
When do glaciated landscapes form?
NASA Astrophysics Data System (ADS)
Koppes, M. N.
2015-12-01
Glacial erosion is a fundamental link between climate and the tectonic and surface processes that create topography. Mountain ranges worldwide have undergone large-scale modification due to the erosive action of ice masses, yet the mechanisms that control the timing of this modification and the rate by which ice erodes remain poorly understood. Available data report a wide range of erosion rates from individual ice masses over varying timescales, from the modern to orogenic. Recent numerical modeling efforts have focused on replicating the processes that produce the geomorphic signatures of glacial landscapes. Central to these models is a simple index that relates erosion rate to ice dynamics. To provide a quantitative test of the links between glacial erosion, sliding and ice discharge, we examined explicitly the factors controlling modern glacier erosion rates across climatic regimes, from Patagonia to the Antarctic Peninsula. We find that modern, basin-averaged erosion rates vary by three orders of magnitude, from 1 to >10 mm yr-1 in Patagonia to 0.01 to <0.1 mm yr-1 in the AP, largely as a function of temperature and basal thermal regime. Erosion rates also increase non-linearly with both the sliding speed and the ice flux through the ELA, in accord with theories of glacial erosion. Notably, erosion rates decrease by over two orders of magnitude between temperate and polar glaciers with similar discharge rates. The difference in erosion rates between temperate and colder glaciers of similar shape and size is primarily related to the abundance of meltwater accessing the bed. Since all glaciers worldwide have experienced colder-than-current climatic conditions, the 100-fold decrease in long-term relative to modern erosion rates may in part reflect the temporal averaging of temperate and polar conditions over the lifecycle of these glaciers. Hence, climatic variation, more than the extent of ice cover or tectonic changes, controls the pace at which glaciers shape mountains.
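As a rough illustration of the kind of "simple index" relating erosion rate to ice dynamics used in such models, the sketch below evaluates a generic power-law erosion rule; the coefficient and exponent are placeholder values chosen only to mimic the mm-per-year magnitudes quoted above, not values calibrated in this study.

```python
def erosion_rate(sliding_speed, K=1e-7, exponent=2.0):
    """Basin-averaged erosion rate (m/yr) as a power law of mean basal sliding
    speed (m/yr); K and the exponent are illustrative placeholders."""
    return K * sliding_speed ** exponent

# A fast-sliding temperate glacier versus a slow-sliding polar one.
for u_s in (100.0, 10.0):
    print(f"u_s = {u_s:5.1f} m/yr  ->  {erosion_rate(u_s) * 1e3:.3f} mm/yr")
```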
17 CFR 240.14a-5 - Presentation of information in proxy statement.
Code of Federal Regulations, 2011 CFR
2011-04-01
... roman type at least as large and as legible as 10-point modern type, except that to the extent necessary..., may be in roman type at least as large and as legible as 8-point modern type. All such type shall be...
New materials and structures for photovoltaics
NASA Astrophysics Data System (ADS)
Zunger, Alex; Wagner, S.; Petroff, P. M.
1993-01-01
Despite the fact that over the years crystal chemists have discovered numerous semiconducting substances, and that modern epitaxial growth techniques are able to produce many novel atomic-scale architectures, current electronic and opto-electronic technologies are based on but a handful of ~10 traditional semiconductor core materials. This paper surveys a number of yet-unexploited classes of semiconductors, pointing to the much-needed research in screening, growing, and characterizing promising members of these classes. In light of the unmanageably large number of a-priori possibilities, we emphasize the role that structural chemistry and modern computer-aided design must play in screening potentially important candidates. The basic classes of materials discussed here include nontraditional alloys, such as non-isovalent and heterostructural semiconductors, materials at reduced dimensionality, including superlattices, zeolite-caged nanostructures and organic semiconductors, spontaneously ordered alloys, interstitial semiconductors, filled tetrahedral structures, ordered vacancy compounds, and compounds based on d and f electron elements. A collaborative effort among material predictor, material grower, and material characterizer holds the promise for a successful identification of new and exciting systems.
Acceleration of modern acidification in the South China Sea driven by anthropogenic CO2
Liu, Yi; Peng, Zicheng; Zhou, Renjun; Song, Shaohua; Liu, Weiguo; You, Chen-Feng; Lin, Yen-Po; Yu, Kefu; Wu, Chung-Che; Wei, Gangjian; Xie, Luhua; Burr, George S.; Shen, Chuan-Chou
2014-01-01
Modern acidification by the uptake of anthropogenic CO2 can profoundly affect the physiology of marine organisms and the structure of ocean ecosystems. Centennial-scale global and regional influences of anthropogenic CO2 remain largely unknown due to limited instrumental pH records. Here we present coral boron isotope-inferred pH records for two periods from the South China Sea: AD 1048–1079 and AD 1838–2001. There are no significant pH differences between the first period at the Medieval Warm Period and AD 1830–1870. However, we find anomalous and unprecedented acidification during the 20th century, pacing the observed increase in atmospheric CO2. Moreover, the pH value also varies in phase with inter-decadal changes in Asian Winter Monsoon intensity. As the level of atmospheric CO2 keeps rising, the accompanying global warming, by weakening the winter monsoon intensity, could exacerbate acidification of the South China Sea and threaten this expansive shallow water marine ecosystem. PMID:24888785
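For readers unfamiliar with the proxy, the sketch below evaluates one commonly used form of the boron-isotope pH equation used to turn carbonate δ11B values into seawater pH. The fractionation factor, seawater δ11B and apparent pK*B are typical literature values assumed here for illustration, not the calibration used by the authors.

```python
import math

def ph_from_boron(d11B_carbonate, d11B_seawater=39.61, alpha_B=1.0272, pKb_star=8.60):
    """Seawater pH from the delta-11B of carbonate, assumed to record dissolved borate."""
    numerator = d11B_seawater - d11B_carbonate
    denominator = d11B_seawater - alpha_B * d11B_carbonate - 1000.0 * (alpha_B - 1.0)
    return pKb_star - math.log10(-numerator / denominator)

# Example: a coral delta-11B of 24 per mil under the assumed constants.
print(round(ph_from_boron(24.0), 2))
```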
Bindler, Richard; Renberg, Ingemar; Rydberg, Johan; Andrén, Thomas
2009-07-01
Metal pollution is viewed as a modern problem that began in the 19th century and accelerated through the 20th century; however, in many parts of the globe this view is wrong. Here, we studied past waterborne metal pollution in lake sediments from the Bergslagen region in central Sweden, one of many historically important mining regions in Europe. With a focus on lead (including isotopes), we trace mining impacts from a local scale, through a 120-km-long river system draining into Mälaren, Sweden's third-largest lake, and finally the Baltic Sea. Comparison of sediment and peat records shows that pollution from Swedish mining was largely waterborne and that atmospheric deposition was dominated by long-range transport from other regions. Swedish ore lead is detectable from the 10th century, but the greatest impact occurred during the 16th-18th centuries with improvements occurring over recent centuries, i.e., historical pollution > modern industrial pollution.
3D-Simulation Of Concentration Distributions Inside Large-Scale Circulating Fluidized Bed Combustors
NASA Astrophysics Data System (ADS)
Wischnewski, R.; Ratschow, L.; Hartge, E. U.; Werthe, J.
With increasing size of modern CFB combustors, the lateral mixing of fuels and secondary air becomes increasingly important. Strong concentration gradients, which result from improper lateral mixing, can lead to operational problems, high flue gas emissions and lower boiler efficiencies. A 3D-model for the simulation of local gas and solids concentrations inside industrial-sized CFB boilers has been developed. The model is based on a macroscopic approach and considers all major mechanisms during fuel spreading and subsequent combustion of char and volatiles. Typical characteristics of modern boilers, like staged combustion, a smaller cross-sectional area in the lower section of the combustion chamber and the co-combustion of additional fuels with coal, can be considered. The 252 MWth combustor of Stadtwerke Duisburg AG is used for the validation of the model. A comprehensive picture of the local conditions inside the combustion chamber is achieved by the combination of local gas measurements and the three-dimensional simulation of concentration distributions.
Acceleration of modern acidification in the South China Sea driven by anthropogenic CO2
NASA Astrophysics Data System (ADS)
Liu, Yi; Peng, Zicheng; Zhou, Renjun; Song, Shaohua; Liu, Weiguo; You, Chen-Feng; Lin, Yen-Po; Yu, Kefu; Wu, Chung-Che; Wei, Gangjian; Xie, Luhua; Burr, George S.; Shen, Chuan-Chou
2014-06-01
Modern acidification by the uptake of anthropogenic CO2 can profoundly affect the physiology of marine organisms and the structure of ocean ecosystems. Centennial-scale global and regional influences of anthropogenic CO2 remain largely unknown due to limited instrumental pH records. Here we present coral boron isotope-inferred pH records for two periods from the South China Sea: AD 1048-1079 and AD 1838-2001. There are no significant pH differences between the first period at the Medieval Warm Period and AD 1830-1870. However, we find anomalous and unprecedented acidification during the 20th century, pacing the observed increase in atmospheric CO2. Moreover, the pH value also varies in phase with inter-decadal changes in Asian Winter Monsoon intensity. As the level of atmospheric CO2 keeps rising, the accompanying global warming, by weakening the winter monsoon intensity, could exacerbate acidification of the South China Sea and threaten this expansive shallow water marine ecosystem.
Wang, Jack T H; Schembri, Mark A; Hall, Roy A
2013-01-01
Designing and implementing assessment tasks in large-scale undergraduate science courses is a labor-intensive process subject to increasing scrutiny from students and quality assurance authorities alike. Recent pedagogical research has provided conceptual frameworks for teaching introductory undergraduate microbiology, but has yet to define best-practice assessment guidelines. This study assessed the applicability of Biggs' theory of constructive alignment in designing consistent learning objectives, activities, and assessment items that aligned with the American Society for Microbiology's concept-based microbiology curriculum in MICR2000, an introductory microbiology course offered at the University of Queensland, Australia. By improving the internal consistency in assessment criteria and increasing the number of assessment items explicitly aligned to the course learning objectives, the teaching team was able to efficiently provide adequate feedback on numerous assessment tasks throughout the semester, which contributed to improved student performance and learning gains. When comparing the constructively aligned 2011 offering of MICR2000 with its 2010 counterpart, students obtained higher marks in both coursework assignments and examinations as the semester progressed. Students also valued the additional feedback provided, as student rankings for course feedback provision increased in 2011 and assessment and feedback was identified as a key strength of MICR2000. By designing MICR2000 using constructive alignment and iterative assessment tasks that followed a common set of learning outcomes, the teaching team was able to effectively deliver detailed and timely feedback in a large introductory microbiology course. This study serves as a case study for how constructive alignment can be integrated into modern teaching practices for large-scale courses.
Large-scale annotation of small-molecule libraries using public databases.
Zhou, Yingyao; Zhou, Bin; Chen, Kaisheng; Yan, S Frank; King, Frederick J; Jiang, Shumei; Winzeler, Elizabeth A
2007-01-01
While many large publicly accessible databases provide excellent annotation for biological macromolecules, the same is not true for small chemical compounds. Commercial data sources also fail to encompass an annotation interface for large numbers of compounds and tend to be cost prohibitive for wide use by biomedical researchers. Therefore, using annotation information for the selection of lead compounds from a modern-day high-throughput screening (HTS) campaign presently occurs only on a very limited scale. The recent rapid expansion of the NIH PubChem database provides an opportunity to link existing biological databases with compound catalogs and provides relevant information that potentially could improve the information garnered from large-scale screening efforts. Using the 2.5 million compound collection at the Genomics Institute of the Novartis Research Foundation (GNF) as a model, we determined that approximately 4% of the library contained compounds with potential annotation in such databases as PubChem and the World Drug Index (WDI) as well as related databases such as the Kyoto Encyclopedia of Genes and Genomes (KEGG) and ChemIDplus. Furthermore, the exact structure match analysis showed that 32% of GNF compounds can be linked to third party databases via PubChem. We also showed that annotations such as MeSH (medical subject headings) terms can be applied to in-house HTS databases in identifying signature biological inhibition profiles of interest as well as expediting the assay validation process. The automated annotation of thousands of screening hits in batch is becoming feasible and has the potential to play an essential role in the hit-to-lead decision making process.
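Operationally, the exact-structure-match step reduces to joining an in-house compound table to a public annotation table on a standardized structural identifier. A minimal pandas sketch of that join is shown below; the compound identifiers, keys and annotation terms are made-up placeholders rather than real PubChem or WDI records.

```python
import pandas as pd

# Toy in-memory stand-ins for an in-house screening library and a public
# annotation table; identifiers are placeholders, not real InChIKeys.
library = pd.DataFrame({
    "compound_id": ["CMPD-001", "CMPD-002", "CMPD-003"],
    "inchikey": ["KEY-AAA", "KEY-BBB", "KEY-CCC"],
})
annotations = pd.DataFrame({
    "inchikey": ["KEY-AAA", "KEY-CCC"],
    "mesh_terms": ["enzyme inhibitors", "antimalarials"],
})

# Exact-structure matching reduces to an equi-join on the standardized identifier.
annotated = library.merge(annotations, on="inchikey", how="left")
print(f"{annotated['mesh_terms'].notna().mean():.0%} of the toy library is annotated")
```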
NASA Astrophysics Data System (ADS)
Wei, Zhongwang; Lee, Xuhui; Liu, Zhongfang; Seeboonruang, Uma; Koike, Masahiro; Yoshimura, Kei
2018-04-01
Many paleoclimatic records in Southeast Asia rely on rainfall isotope ratios as proxies for past hydroclimatic variability. However, the physical processes controlling modern rainfall isotopic behavior in the region are poorly constrained. Here, we combined isotopic measurements at six sites across Thailand with an isotope-incorporated atmospheric circulation model (IsoGSM) and the Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model to investigate the factors that govern the variability of precipitation isotope ratios in this region. Results show that rainfall isotope ratios are correlated with both local rainfall amount and regional outgoing longwave radiation, suggesting that rainfall isotope ratios in this region are controlled not only by local rain amount (amount effect) but also by large-scale convection. Because the region is a transition zone between the Indian monsoon and the western North Pacific monsoon, the spatial differences in observed precipitation isotopes among sites are associated with moisture source. These results highlight the importance of regional processes in determining rainfall isotope ratios in the tropics and provide constraints on the interpretation of paleo-precipitation isotope records in the context of regional climate dynamics.
Kang, Dae Y; Kim, Yun-Soung; Ornelas, Gladys; Sinha, Mridu; Naidu, Keerthiga; Coleman, Todd P
2015-09-16
New classes of ultrathin flexible and stretchable devices have changed the way modern electronics are designed to interact with their target systems. Though more and more novel technologies surface and steer the way we think about future electronics, there exists an unmet need in regards to optimizing the fabrication procedures for these devices so that large-scale industrial translation is realistic. This article presents an unconventional approach for facile microfabrication and processing of adhesive-peeled (AP) flexible sensors. By assembling AP sensors on a weakly-adhering substrate in an inverted fashion, we demonstrate a procedure with 50% reduced end-to-end processing time that achieves greater levels of fabrication yield. The methodology is used to demonstrate the fabrication of electrical and mechanical flexible and stretchable AP sensors that are peeled-off their carrier substrates by consumer adhesives. In using this approach, we outline the manner by which adhesion is maintained and buckling is reduced for gold film processing on polydimethylsiloxane substrates. In addition, we demonstrate the compatibility of our methodology with large-scale post-processing using a roll-to-roll approach.
SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications
Kalinin, Alexandr A.; Palanimalai, Selvam; Dinov, Ivo D.
2018-01-01
The modern web is a successful platform for large scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large scale web application development. We present the preliminary design and implementation of an open-source platform for Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, re-usage, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks including raw data input and representation, interactive visualization and analysis. PMID:29630069
SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications.
Kalinin, Alexandr A; Palanimalai, Selvam; Dinov, Ivo D
2017-04-01
The modern web is a successful platform for large scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large scale web application development. We present the preliminary design and implementation of an open-source platform for Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, re-usage, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks including raw data input and representation, interactive visualization and analysis.
NASA Astrophysics Data System (ADS)
Ruffell, Alastair; McKinley, Jennifer
2005-03-01
One hundred years ago Georg Popp became the first scientist to present in court a case where the geological makeup of soils was used to secure a criminal conviction. Subsequently there have been significant advances in the theory and practice of forensic geoscience: many of them following the seminal publication of "Forensic Geology" by Murray and Tedrow [Murray, R., Tedrow, J.C.F. 1975 (republished 1986). Forensic Geology: Earth Sciences and Criminal Investigation. Rutgers University Press, New York, 240 pp.]. Our review places historical development in the modern context of how the allied disciplines of geology (mineralogy, sedimentology, microscopy), geophysics, soil science, microbiology, anthropology and geomorphology have been used as tools to aid forensic (domestic, serious, terrorist and international) crime investigations. The latter half of this paper uses the concept of scales of investigation, from large-scale landforms through to microscopic particles, as a method of categorising the large number of geoscience applications to criminal investigation. Forensic geoscience has traditionally used established non-forensic techniques: 100 years after Popp's seminal work, research into forensic geoscience is beginning to lead, as opposed to follow, other scientific disciplines.
Effectively Transparent Front Contacts for Optoelectronic Devices
Saive, Rebecca; Borsuk, Aleca M.; Emmer, Hal S.; ...
2016-06-10
Effectively transparent front contacts for optoelectronic devices achieve a measured transparency of up to 99.9% and a measured sheet resistance of 4.8 Ω sq-1. These 3D microscale triangular cross-section grid fingers redirect incoming photons efficiently to the active semiconductor area and can replace standard grid fingers as well as transparent conductive oxide layers in optoelectronic devices. Optoelectronic devices such as light emitting diodes, photodiodes, and solar cells play an important and expanding role in modern technology. Photovoltaics is one of the largest optoelectronic industry sectors and an ever-increasing component of the world's rapidly growing renewable carbon-free electricity generation infrastructure. In recent years, the photovoltaics field has dramatically expanded owing to the large-scale manufacture of inexpensive crystalline Si and thin film cells and modules. The current record efficiency (η = 25.6%) Si solar cell utilizes a heterostructure intrinsic thin layer (HIT) design[1] to enable increased open circuit voltage, while more mass-manufacturable solar cell architectures feature front contacts.[2, 3] Thus improved solar cell front contact designs are important for future large-scale photovoltaics with even higher efficiency.
Murphy, Patricia; Kabir, Md Humayun; Srivastava, Tarini; Mason, Michele E.; Dewi, Chitra U.; Lim, Seakcheng; Yang, Andrian; Djordjevic, Djordje; Killingsworth, Murray C.; Ho, Joshua W. K.; Harman, David G.
2018-01-01
Cataracts cause vision loss and blindness by impairing the ability of the ocular lens to focus light onto the retina. Various cataract risk factors have been identified, including drug treatments, age, smoking and diabetes. However, the molecular events responsible for these different forms of cataract are ill-defined, and the advent of modern cataract surgery in the 1960s virtually eliminated access to human lenses for research. Here, we demonstrate large-scale production of light-focusing human micro-lenses from spheroidal masses of human lens epithelial cells purified from differentiating pluripotent stem cells. The purified lens cells and micro-lenses display similar morphology, cellular arrangement, mRNA expression and protein expression to human lens cells and lenses. Exposing the micro-lenses to the emergent cystic fibrosis drug Vx-770 reduces micro-lens transparency and focusing ability. These human micro-lenses provide a powerful and large-scale platform for defining molecular disease mechanisms caused by cataract risk factors, for anti-cataract drug screening and for clinically relevant toxicity assays. PMID:29217756
Murphy, Patricia; Kabir, Md Humayun; Srivastava, Tarini; Mason, Michele E; Dewi, Chitra U; Lim, Seakcheng; Yang, Andrian; Djordjevic, Djordje; Killingsworth, Murray C; Ho, Joshua W K; Harman, David G; O'Connor, Michael D
2018-01-09
Cataracts cause vision loss and blindness by impairing the ability of the ocular lens to focus light onto the retina. Various cataract risk factors have been identified, including drug treatments, age, smoking and diabetes. However, the molecular events responsible for these different forms of cataract are ill-defined, and the advent of modern cataract surgery in the 1960s virtually eliminated access to human lenses for research. Here, we demonstrate large-scale production of light-focusing human micro-lenses from spheroidal masses of human lens epithelial cells purified from differentiating pluripotent stem cells. The purified lens cells and micro-lenses display similar morphology, cellular arrangement, mRNA expression and protein expression to human lens cells and lenses. Exposing the micro-lenses to the emergent cystic fibrosis drug Vx-770 reduces micro-lens transparency and focusing ability. These human micro-lenses provide a powerful and large-scale platform for defining molecular disease mechanisms caused by cataract risk factors, for anti-cataract drug screening and for clinically relevant toxicity assays. © 2018. Published by The Company of Biologists Ltd.
SNAVA-A real-time multi-FPGA multi-model spiking neural network simulation architecture.
Sripad, Athul; Sanchez, Giovanny; Zapata, Mireya; Pirrone, Vito; Dorta, Taho; Cambria, Salvatore; Marti, Albert; Krishnamourthy, Karthikeyan; Madrenas, Jordi
2018-01-01
The Spiking Neural Networks (SNN) for Versatile Applications (SNAVA) simulation platform is a scalable and programmable parallel architecture that supports real-time, large-scale, multi-model SNN computation. This parallel architecture is implemented in modern Field-Programmable Gate Array (FPGA) devices to provide high performance execution and flexibility to support large-scale SNN models. Flexibility is defined in terms of programmability, which allows easy synapse and neuron implementation. This has been achieved by using special-purpose Processing Elements (PEs) for computing SNNs, and by analyzing and customizing the instruction set according to the processing needs to achieve maximum performance with minimum resources. The parallel architecture is interfaced with customized Graphical User Interfaces (GUIs) to configure the SNN's connectivity, to compile the neuron-synapse model and to monitor SNN's activity. Our contribution intends to provide a tool that allows prototyping SNNs faster than on CPU/GPU architectures and significantly more cheaply than fabricating a customized neuromorphic chip. This could be potentially valuable to the computational neuroscience and neuromorphic engineering communities. Copyright © 2017 Elsevier Ltd. All rights reserved.
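To give a flavor of the per-timestep work a processing element performs, the sketch below advances a population of leaky integrate-and-fire neurons by one Euler step; the neuron model and parameters are generic textbook choices and are not drawn from the SNAVA instruction set.

```python
import numpy as np

def lif_step(v, input_current, dt=1.0, tau=20.0,
             v_rest=-65.0, v_thresh=-50.0, v_reset=-70.0):
    """Advance membrane potentials one timestep; return (new_v, spike_mask)."""
    v = v + dt / tau * (-(v - v_rest) + input_current)   # leaky integration
    spikes = v >= v_thresh                                # threshold crossing
    v = np.where(spikes, v_reset, v)                      # reset spiking neurons
    return v, spikes

# Usage: 1,000 neurons driven by noisy input for 100 timesteps.
v = np.full(1000, -65.0)
for _ in range(100):
    v, spikes = lif_step(v, input_current=np.random.uniform(0.0, 30.0, size=1000))
```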
Kang, Dae Y.; Kim, Yun-Soung; Ornelas, Gladys; Sinha, Mridu; Naidu, Keerthiga; Coleman, Todd P.
2015-01-01
New classes of ultrathin flexible and stretchable devices have changed the way modern electronics are designed to interact with their target systems. Though more and more novel technologies surface and steer the way we think about future electronics, there exists an unmet need in regards to optimizing the fabrication procedures for these devices so that large-scale industrial translation is realistic. This article presents an unconventional approach for facile microfabrication and processing of adhesive-peeled (AP) flexible sensors. By assembling AP sensors on a weakly-adhering substrate in an inverted fashion, we demonstrate a procedure with 50% reduced end-to-end processing time that achieves greater levels of fabrication yield. The methodology is used to demonstrate the fabrication of electrical and mechanical flexible and stretchable AP sensors that are peeled-off their carrier substrates by consumer adhesives. In using this approach, we outline the manner by which adhesion is maintained and buckling is reduced for gold film processing on polydimethylsiloxane substrates. In addition, we demonstrate the compatibility of our methodology with large-scale post-processing using a roll-to-roll approach. PMID:26389915
Trends in modern system theory
NASA Technical Reports Server (NTRS)
Athans, M.
1976-01-01
The topics considered are related to linear control system design, adaptive control, failure detection, control under failure, system reliability, and large-scale systems and decentralized control. It is pointed out that the design of a linear feedback control system which regulates a process about a desirable set point or steady-state condition in the presence of disturbances is a very important problem. The linearized dynamics of the process are used for design purposes. The typical linear-quadratic design involving the solution of the optimal control problem of a linear time-invariant system with respect to a quadratic performance criterion is considered along with gain reduction theorems and the multivariable phase margin theorem. The stumbling block in many adaptive design methodologies is associated with the amount of real time computation which is necessary. Attention is also given to the desperate need to develop good theories for large-scale systems, the beginning of a microprocessor revolution, the translation of the Wiener-Hopf theory into the time domain, and advances made in dynamic team theory, dynamic stochastic games, and finite memory stochastic control.
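The linear-quadratic design mentioned above can be sketched in a few lines: solve the continuous algebraic Riccati equation for the linearized plant and form the optimal state-feedback gain. The system matrices below are arbitrary illustrative values, and the sketch assumes SciPy's standard Riccati solver.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0], [-2.0, -0.5]])   # linearized process dynamics
B = np.array([[0.0], [1.0]])               # control input matrix
Q = np.diag([10.0, 1.0])                   # state weighting in the quadratic criterion
R = np.array([[1.0]])                      # control effort weighting

P = solve_continuous_are(A, B, Q, R)       # Riccati solution
K = np.linalg.solve(R, B.T @ P)            # optimal gain: u = -K x
print("LQR gain:", K)
```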
Integrated Data Modeling and Simulation on the Joint Polar Satellite System Program
NASA Technical Reports Server (NTRS)
Roberts, Christopher J.; Boyce, Leslye; Smith, Gary; Li, Angela; Barrett, Larry
2012-01-01
The Joint Polar Satellite System is a modern, large-scale, complex, multi-mission aerospace program, and presents a variety of design, testing and operational challenges due to: (1) System Scope: multi-mission coordination, role, responsibility and accountability challenges stemming from porous/ill-defined system and organizational boundaries (including foreign policy interactions) (2) Degree of Concurrency: design, implementation, integration, verification and operation occurring simultaneously, at multiple scales in the system hierarchy (3) Multi-Decadal Lifecycle: technical obsolesce, reliability and sustainment concerns, including those related to organizational and industrial base. Additionally, these systems tend to become embedded in the broader societal infrastructure, resulting in new system stakeholders with perhaps different preferences (4) Barriers to Effective Communications: process and cultural issues that emerge due to geographic dispersion and as one spans boundaries including gov./contractor, NASA/Other USG, and international relationships.
Preface: MHD wave phenomena in the solar interior and atmosphere
NASA Astrophysics Data System (ADS)
Fedun, Viktor; Srivastava, A. K.
2018-01-01
The Sun is our nearest star, and it produces various plasma wave processes and energetic events. These phenomena strongly influence interplanetary plasma dynamics and contribute to space weather. The understanding of solar atmospheric dynamics requires high-resolution modern observations which, in turn, further advance theoretical models of physical processes in the solar interior and atmosphere. In particular, it is essential to connect the magnetohydrodynamic (MHD) wave processes with small- and large-scale solar phenomena vis-a-vis transport of energy and mass. With the advent of currently available and upcoming high-resolution space (e.g., IRIS, SDO, Hinode, Aditya-L1, Solar-C, Solar Orbiter), and ground-based (e.g., SST, ROSA, NLST, Hi-C, DKIST, EST, COSMO) observations, solar physicists are able to explore exclusive wave processes in various solar magnetic structures at different spatio-temporal scales.
Three Thousand Years of Continuity in the Maternal Lineages of Ancient Sheep (Ovis aries) in Estonia
Rannamäe, Eve; Lõugas, Lembi; Speller, Camilla F.; Valk, Heiki; Maldre, Liina; Wilczyński, Jarosław; Mikhailov, Aleksandr; Saarma, Urmas
2016-01-01
Although sheep (Ovis aries) have been one of the most exploited domestic animals in Estonia since the Late Bronze Age, relatively little is known about their genetic history. Here, we explore temporal changes in Estonian sheep populations and their mitochondrial genetic diversity over the last 3000 years. We target a 558 base pair fragment of the mitochondrial hypervariable region in 115 ancient sheep from 71 sites in Estonia (c. 1200 BC–AD 1900s), 19 ancient samples from Latvia, Russia, Poland and Greece (6800 BC–AD 1700), as well as 44 samples of modern Kihnu native sheep breed. Our analyses revealed: (1) 49 mitochondrial haplotypes, associated with sheep haplogroups A and B; (2) high haplotype diversity in Estonian ancient sheep; (3) continuity in mtDNA haplotypes through time; (4) possible population expansion during the first centuries of the Middle Ages (associated with the establishment of the new power regime related to 13th century crusades); (5) significant difference in genetic diversity between ancient populations and modern native sheep, in agreement with the beginning of large-scale breeding in the 19th century and population decline in local sheep. Overall, our results suggest that in spite of the observed fluctuations in ancient sheep populations, and changes in the natural and historical conditions, the utilisation of local sheep has been constant in the territory of Estonia, displaying matrilineal continuity from the Middle Bronze Age through the Modern Period, and into modern native sheep. PMID:27732668
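A hedged sketch of how haplotype diversity comparisons of this kind are typically computed (Nei's unbiased estimator) is given below; the haplotype labels and counts are invented for illustration and are not the Estonian data.

```python
from collections import Counter

def haplotype_diversity(haplotypes):
    """Nei's unbiased haplotype (gene) diversity: H = n/(n-1) * (1 - sum(p_i^2))."""
    n = len(haplotypes)
    freqs = [count / n for count in Counter(haplotypes).values()]
    return n / (n - 1) * (1.0 - sum(f * f for f in freqs))

# Invented haplotype labels for two hypothetical samples.
ancient = ["A1", "A1", "A2", "B1", "B2", "A3", "A1", "B1"]
modern_breed = ["A1", "A1", "A1", "A1", "A2", "A1", "A1", "A1"]
print(round(haplotype_diversity(ancient), 3),
      round(haplotype_diversity(modern_breed), 3))
```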
NASA Astrophysics Data System (ADS)
Schneider, Christian
2017-04-01
The study analyzes the impact of different farming systems on soil quality and soil degradation in European loess landscapes. The analyses are based on geo-chemical soil properties, landscape metrics and geomorphological indicators. The German Middle Saxonian Loess Region represents loess landscapes whose ecological functions were shaped by land consolidation measures resulting in large-scale high-input farming systems. The Polish Proszowice Plateau is still characterized by a traditional small-scale peasant agriculture. The research areas were analyzed on different scale levels combining GIS, field, and laboratory methods. A digital terrain classification was used to identify representative catchment basins for detailed pedological studies which were focused on soil properties that responded to soil management within several years, like pH-value, total carbon (TC), total nitrogen (TN), inorganic carbon (IC), soil organic carbon (TOC=TC-IC), hot-water extractable carbon (HWC), hot-water extractable nitrogen (HWN), total phosphorus, plant-available phosphorus (P), plant-available potassium (K) and the potential cation exchange capacity (CEC). The study has shown that significant differences in major soil properties can be observed because of different fertilizer inputs and partly because of different cultivation techniques. The traditional system also increases soil heterogeneity. Contrary to expectations, the study has shown that the small-scale peasant farming system resulted in mean soil organic carbon and phosphorus contents similar to those of the industrialized high-input farming system. A further study could include investigations of the effects of soil amendments like herbicides and pesticides on soil degradation.
ParBiBit: Parallel tool for binary biclustering on modern distributed-memory systems
Expósito, Roberto R.
2018-01-01
Biclustering techniques are gaining attention in the analysis of large-scale datasets as they identify two-dimensional submatrices where both rows and columns are correlated. In this work we present ParBiBit, a parallel tool to accelerate the search of interesting biclusters on binary datasets, which are very popular in different fields such as genetics, marketing or text mining. It is based on the state-of-the-art sequential Java tool BiBit, which has been proved accurate by several studies, especially in scenarios that result in many large biclusters. ParBiBit uses the same methodology as BiBit (grouping the binary information into patterns) and provides the same results. Nevertheless, our tool significantly improves performance thanks to an efficient implementation based on C++11 that includes support for threads and MPI processes in order to exploit the compute capabilities of modern distributed-memory systems, which provide several multicore CPU nodes interconnected through a network. Our performance evaluation with 18 representative input datasets on two different eight-node systems shows that our tool is significantly faster than the original BiBit. Source code in C++ and MPI running on Linux systems as well as a reference manual are available at https://sourceforge.net/projects/parbibit/. PMID:29608567
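The pattern-grouping idea behind BiBit and ParBiBit can be conveyed with a small sketch: seed a candidate pattern from the bitwise AND of two rows, then keep it if enough other rows contain it. The version below is a naive Python illustration with assumed size thresholds; the real tools operate on packed bit words and in parallel across MPI processes and threads.

```python
import numpy as np

def bibit_like(binary_matrix, min_cols=2, min_rows=2):
    """Enumerate biclusters as (row indices, column indices) pairs."""
    n_rows = binary_matrix.shape[0]
    biclusters = set()
    for i in range(n_rows):
        for j in range(i + 1, n_rows):
            pattern = binary_matrix[i] & binary_matrix[j]   # seed pattern
            if pattern.sum() < min_cols:
                continue
            rows = [r for r in range(n_rows)
                    if np.array_equal(binary_matrix[r] & pattern, pattern)]
            if len(rows) >= min_rows:
                biclusters.add((tuple(rows), tuple(np.flatnonzero(pattern))))
    return biclusters

# Usage on a small random binary matrix.
data = (np.random.rand(30, 16) > 0.6).astype(np.uint8)
print(len(bibit_like(data)), "biclusters found")
```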
ParBiBit: Parallel tool for binary biclustering on modern distributed-memory systems.
González-Domínguez, Jorge; Expósito, Roberto R
2018-01-01
Biclustering techniques are gaining attention in the analysis of large-scale datasets as they identify two-dimensional submatrices where both rows and columns are correlated. In this work we present ParBiBit, a parallel tool to accelerate the search of interesting biclusters on binary datasets, which are very popular in different fields such as genetics, marketing or text mining. It is based on the state-of-the-art sequential Java tool BiBit, which has been proved accurate by several studies, especially in scenarios that result in many large biclusters. ParBiBit uses the same methodology as BiBit (grouping the binary information into patterns) and provides the same results. Nevertheless, our tool significantly improves performance thanks to an efficient implementation based on C++11 that includes support for threads and MPI processes in order to exploit the compute capabilities of modern distributed-memory systems, which provide several multicore CPU nodes interconnected through a network. Our performance evaluation with 18 representative input datasets on two different eight-node systems shows that our tool is significantly faster than the original BiBit. Source code in C++ and MPI running on Linux systems as well as a reference manual are available at https://sourceforge.net/projects/parbibit/.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, S.Y.; Tepikian, S.
1985-01-01
Nonlinear magnetic forces become more important for particles in the modern large accelerators. These nonlinear elements are introduced either intentionally to control beam dynamics or by uncontrollable random errors. Equations of motion in the nonlinear Hamiltonian are usually non-integrable. Because of the nonlinear part of the Hamiltonian, the tune diagram of accelerators is a jungle. Nonlinear magnet multipoles are important in keeping the accelerator operation point in the safe quarter of the hostile jungle of resonant tunes. Indeed, all the modern accelerator designs have taken advantage of nonlinear mechanics. On the other hand, the effect of the uncontrollable random multipoles should be evaluated carefully. A powerful method of studying the effect of these nonlinear multipoles is a particle tracking calculation, where a group of test particles is traced through these magnetic multipoles in the accelerator for hundreds to millions of turns in order to test the dynamical aperture of the machine. These methods are extremely useful in the design of a large accelerator such as SSC, LEP, HERA and RHIC. These calculations unfortunately take a tremendous amount of computing time. In this review, the method of determining chaotic orbits and applying it to nonlinear problems in accelerator physics is discussed. We then discuss the scaling properties and effect of random sextupoles.
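A toy version of such a tracking calculation is sketched below: a one-turn map built from a linear rotation (the ring's phase advance) followed by a thin sextupole kick, iterated for many turns to see which launch amplitudes survive. The tune, sextupole strength and loss criterion are arbitrary illustrative values, not parameters of any real machine.

```python
import numpy as np

def track(x0, px0, tune=0.205, k2=1.0, turns=10_000, limit=1.0):
    """Return True if the particle stays bounded for the requested number of turns."""
    mu = 2.0 * np.pi * tune
    c, s = np.cos(mu), np.sin(mu)
    x, px = x0, px0
    for _ in range(turns):
        x, px = c * x + s * px, -s * x + c * px   # linear one-turn rotation
        px += k2 * x * x                          # thin sextupole kick
        if abs(x) > limit or abs(px) > limit:     # crude loss criterion
            return False
    return True

# Scan launch amplitudes to estimate the stable region (a crude dynamic aperture).
amplitudes = np.linspace(0.0, 0.6, 25)
aperture = max(a for a in amplitudes if track(a, 0.0))
print(f"approximate dynamic aperture: {aperture:.2f} (arbitrary units)")
```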
The urbanization of the modern community creates large population centers that generate concentrated wastewater. A large expenditure on wastewater treatment is required to make a modern city function without human and environmental health problems. Society relies on syste...
Chapter 1: Biomedical knowledge integration.
Payne, Philip R O
2012-01-01
The modern biomedical research and healthcare delivery domains have seen an unparalleled increase in the rate of innovation and novel technologies over the past several decades. Catalyzed by paradigm-shifting public and private programs focusing upon the formation and delivery of genomic and personalized medicine, the need for high-throughput and integrative approaches to the collection, management, and analysis of heterogeneous data sets has become imperative. This need is particularly pressing in the translational bioinformatics domain, where many fundamental research questions require the integration of large scale, multi-dimensional clinical phenotype and bio-molecular data sets. Modern biomedical informatics theory and practice has demonstrated the distinct benefits associated with the use of knowledge-based systems in such contexts. A knowledge-based system can be defined as an intelligent agent that employs a computationally tractable knowledge base or repository in order to reason upon data in a targeted domain and reproduce expert performance relative to such reasoning operations. The ultimate goal of the design and use of such agents is to increase the reproducibility, scalability, and accessibility of complex reasoning tasks. Examples of the application of knowledge-based systems in biomedicine span a broad spectrum, from the execution of clinical decision support, to epidemiologic surveillance of public data sets for the purposes of detecting emerging infectious diseases, to the discovery of novel hypotheses in large-scale research data sets. In this chapter, we will review the basic theoretical frameworks that define core knowledge types and reasoning operations with particular emphasis on the applicability of such conceptual models within the biomedical domain, and then go on to introduce a number of prototypical data integration requirements and patterns relevant to the conduct of translational bioinformatics that can be addressed via the design and use of knowledge-based systems.
GeauxDock: Accelerating Structure-Based Virtual Screening with Heterogeneous Computing
Fang, Ye; Ding, Yun; Feinstein, Wei P.; Koppelman, David M.; Moreno, Juana; Jarrell, Mark; Ramanujam, J.; Brylinski, Michal
2016-01-01
Computational modeling of drug binding to proteins is an integral component of direct drug design. Particularly, structure-based virtual screening is often used to perform large-scale modeling of putative associations between small organic molecules and their pharmacologically relevant protein targets. Because of a large number of drug candidates to be evaluated, an accurate and fast docking engine is a critical element of virtual screening. Consequently, highly optimized docking codes are of paramount importance for the effectiveness of virtual screening methods. In this communication, we describe the implementation, tuning and performance characteristics of GeauxDock, a recently developed molecular docking program. GeauxDock is built upon the Monte Carlo algorithm and features a novel scoring function combining physics-based energy terms with statistical and knowledge-based potentials. Developed specifically for heterogeneous computing platforms, the current version of GeauxDock can be deployed on modern, multi-core Central Processing Units (CPUs) as well as massively parallel accelerators, Intel Xeon Phi and NVIDIA Graphics Processing Unit (GPU). First, we carried out a thorough performance tuning of the high-level framework and the docking kernel to produce a fast serial code, which was then ported to shared-memory multi-core CPUs yielding a near-ideal scaling. Further, using Xeon Phi gives 1.9× performance improvement over a dual 10-core Xeon CPU, whereas the best GPU accelerator, GeForce GTX 980, achieves a speedup as high as 3.5×. On that account, GeauxDock can take advantage of modern heterogeneous architectures to considerably accelerate structure-based virtual screening applications. GeauxDock is open-sourced and publicly available at www.brylinski.org/geauxdock and https://figshare.com/articles/geauxdock_tar_gz/3205249. PMID:27420300
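The Monte Carlo core of such a docking engine can be reduced to a short Metropolis loop: perturb the ligand pose, re-score it, and accept or reject the move. The sketch below uses a stand-in quadratic "energy" in place of GeauxDock's combined physics- and knowledge-based scoring function; the step size and temperature are arbitrary illustrative values.

```python
import numpy as np

def metropolis_dock(score_fn, pose, steps=5000, step_size=0.5, temperature=1.0):
    """Minimize a docking score with a simple Metropolis Monte Carlo search."""
    rng = np.random.default_rng(0)
    best_pose, best_score = pose.copy(), score_fn(pose)
    current_pose, current_score = best_pose.copy(), best_score
    for _ in range(steps):
        trial = current_pose + rng.normal(scale=step_size, size=pose.shape)
        trial_score = score_fn(trial)
        delta = trial_score - current_score
        if delta < 0 or rng.random() < np.exp(-delta / temperature):  # accept rule
            current_pose, current_score = trial, trial_score
            if current_score < best_score:
                best_pose, best_score = trial.copy(), current_score
    return best_pose, best_score

# Toy "energy": squared distance of a 3-D ligand centroid from a fictitious binding site.
site = np.array([1.0, -2.0, 0.5])
pose, energy = metropolis_dock(lambda p: np.sum((p - site) ** 2), np.zeros(3))
```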
Chapter 1: Biomedical Knowledge Integration
Payne, Philip R. O.
2012-01-01
The modern biomedical research and healthcare delivery domains have seen an unparalleled increase in the rate of innovation and novel technologies over the past several decades. Catalyzed by paradigm-shifting public and private programs focusing upon the formation and delivery of genomic and personalized medicine, the need for high-throughput and integrative approaches to the collection, management, and analysis of heterogeneous data sets has become imperative. This need is particularly pressing in the translational bioinformatics domain, where many fundamental research questions require the integration of large scale, multi-dimensional clinical phenotype and bio-molecular data sets. Modern biomedical informatics theory and practice has demonstrated the distinct benefits associated with the use of knowledge-based systems in such contexts. A knowledge-based system can be defined as an intelligent agent that employs a computationally tractable knowledge base or repository in order to reason upon data in a targeted domain and reproduce expert performance relative to such reasoning operations. The ultimate goal of the design and use of such agents is to increase the reproducibility, scalability, and accessibility of complex reasoning tasks. Examples of the application of knowledge-based systems in biomedicine span a broad spectrum, from the execution of clinical decision support, to epidemiologic surveillance of public data sets for the purposes of detecting emerging infectious diseases, to the discovery of novel hypotheses in large-scale research data sets. In this chapter, we will review the basic theoretical frameworks that define core knowledge types and reasoning operations with particular emphasis on the applicability of such conceptual models within the biomedical domain, and then go on to introduce a number of prototypical data integration requirements and patterns relevant to the conduct of translational bioinformatics that can be addressed via the design and use of knowledge-based systems. PMID:23300416
GeauxDock: Accelerating Structure-Based Virtual Screening with Heterogeneous Computing.
Fang, Ye; Ding, Yun; Feinstein, Wei P; Koppelman, David M; Moreno, Juana; Jarrell, Mark; Ramanujam, J; Brylinski, Michal
2016-01-01
Computational modeling of drug binding to proteins is an integral component of direct drug design. Particularly, structure-based virtual screening is often used to perform large-scale modeling of putative associations between small organic molecules and their pharmacologically relevant protein targets. Because of a large number of drug candidates to be evaluated, an accurate and fast docking engine is a critical element of virtual screening. Consequently, highly optimized docking codes are of paramount importance for the effectiveness of virtual screening methods. In this communication, we describe the implementation, tuning and performance characteristics of GeauxDock, a recently developed molecular docking program. GeauxDock is built upon the Monte Carlo algorithm and features a novel scoring function combining physics-based energy terms with statistical and knowledge-based potentials. Developed specifically for heterogeneous computing platforms, the current version of GeauxDock can be deployed on modern, multi-core Central Processing Units (CPUs) as well as massively parallel accelerators, Intel Xeon Phi and NVIDIA Graphics Processing Unit (GPU). First, we carried out a thorough performance tuning of the high-level framework and the docking kernel to produce a fast serial code, which was then ported to shared-memory multi-core CPUs yielding a near-ideal scaling. Further, using Xeon Phi gives 1.9× performance improvement over a dual 10-core Xeon CPU, whereas the best GPU accelerator, GeForce GTX 980, achieves a speedup as high as 3.5×. On that account, GeauxDock can take advantage of modern heterogeneous architectures to considerably accelerate structure-based virtual screening applications. GeauxDock is open-sourced and publicly available at www.brylinski.org/geauxdock and https://figshare.com/articles/geauxdock_tar_gz/3205249.
Large-Scale and Global Hydrology. Chapter 92
NASA Technical Reports Server (NTRS)
Rodell, Matthew; Beaudoing, Hiroko Kato; Koster, Randal; Peters-Lidard, Christa D.; Famiglietti, James S.; Lakshmi, Venkat
2016-01-01
Powered by the sun, water moves continuously between and through Earth's oceanic, atmospheric, and terrestrial reservoirs. It enables life, shapes Earth's surface, and responds to and influences climate change. Scientists measure various features of the water cycle using a combination of ground, airborne, and space-based observations, and seek to characterize it at multiple scales with the aid of numerical models. Over time our understanding of the water cycle and ability to quantify it have improved, owing to advances in observational capabilities, the extension of the data record, and increases in computing power and storage. Here we present some of the most recent estimates of global and continental ocean basin scale water cycle stocks and fluxes and provide examples of modern numerical modeling systems and reanalyses. Further, we discuss prospects for predicting water cycle variability at seasonal and longer scales, which is complicated by a changing climate and direct human impacts related to water management and agriculture. Changes to the water cycle will be among the most obvious and important facets of climate change, thus it is crucial that we continue to invest in our ability to monitor it.
Model Wind Turbines Tested at Full-Scale Similarity
NASA Astrophysics Data System (ADS)
Miller, M. A.; Kiefer, J.; Westergaard, C.; Hultmark, M.
2016-09-01
The enormous length scales associated with modern wind turbines complicate any efforts to predict their mechanical loads and performance. Both experiments and numerical simulations are constrained by the large Reynolds numbers governing the full-scale aerodynamics. The limited fundamental understanding of Reynolds number effects in combination with the lack of empirical data affects our ability to predict, model, and design improved turbines and wind farms. A new experimental approach is presented, which utilizes a highly pressurized wind tunnel (up to 220 bar). It allows exact matching of the Reynolds number (no matter how it is defined), tip speed ratio, and Mach number on a geometrically similar, small-scale model. The design of a measurement and instrumentation stack to control the turbine and measure the loads in the pressurized environment is discussed. Results are then presented in the form of power coefficients as a function of Reynolds number and tip speed ratio. To account for gearbox power loss, a preliminary study has also been completed to find the gearbox efficiency, and the resulting correction has been applied to the data set.
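A back-of-the-envelope sketch of the similarity argument behind the pressurized tunnel is given below: raising the air density lets a geometrically small rotor reach a full-scale Reynolds number at a matched tip speed ratio, with the power coefficient defined in the usual way. All numbers are illustrative assumptions, not the facility's operating conditions.

```python
import numpy as np

def reynolds(rho, velocity, length, mu=1.8e-5):
    """Reynolds number; air viscosity is nearly pressure-independent,
    so pressurizing the tunnel raises Re through density alone."""
    return rho * velocity * length / mu

def tip_speed_ratio(omega, radius, wind_speed):
    return omega * radius / wind_speed

def power_coefficient(power, rho, radius, wind_speed):
    return power / (0.5 * rho * np.pi * radius**2 * wind_speed**3)

# Full-scale rotor: 120 m diameter, 10 m/s wind, sea-level air density.
re_full = reynolds(rho=1.2, velocity=10.0, length=120.0)
# Model rotor: 0.2 m diameter in air at ~200x density, 30 m/s flow.
re_model = reynolds(rho=240.0, velocity=30.0, length=0.2)
print(f"full-scale Re ~ {re_full:.1e}, model Re ~ {re_model:.1e}")

# Matched operating point and the resulting power coefficient (illustrative numbers).
print(f"lambda = {tip_speed_ratio(omega=1.33, radius=60.0, wind_speed=10.0):.1f}")
print(f"Cp = {power_coefficient(power=3.0e6, rho=1.2, radius=60.0, wind_speed=10.0):.2f}")
```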
From Lattice Boltzmann to hydrodynamics in dissipative relativistic fluids
NASA Astrophysics Data System (ADS)
Gabbana, Alessandro; Mendoza, Miller; Succi, Sauro; Tripiccione, Raffaele
2017-11-01
Relativistic fluid dynamics is currently applied to several fields of modern physics, covering many physical scales, from astrophysics, to atomic scales (e.g. in the study of effective 2D systems such as graphene) and further down to subnuclear scales (e.g. quark-gluon plasmas). This talk focuses on recent progress in the largely debated connection between kinetic transport coefficients and macroscopic hydrodynamic parameters in dissipative relativistic fluid dynamics. We use a new relativistic Lattice Boltzmann method (RLBM), able to handle from ultra-relativistic to almost non-relativistic flows, and obtain strong evidence that the Chapman-Enskog expansion provides the correct pathway from kinetic theory to hydrodynamics. This analysis confirms recently obtained theoretical results, which can be used to obtain accurate calibrations for RLBM methods applied to realistic physics systems in the relativistic regime. Using this calibration methodology, RLBM methods are able to deliver improved physical accuracy in the simulation of the physical systems described above. European Union's Horizon 2020 research and innovation programme under the Marie Sklodowska-Curie Grant Agreement No. 642069.
NASA Astrophysics Data System (ADS)
Fike, D. A.; Jones, D. S.; Fischer, W. W.
2011-12-01
Sulfur isotope ratio data have been used to provide significant insights into global biogeochemical cycling over Earth history. In addition to providing a framework for the construction of global redox budgets, these observations also provide the primary constraints on the advent and environmental importance of particular microbial metabolisms. As the chemostratigraphic record has become better resolved in space and time, however, reports of coeval discordant data are increasingly common - both within and between individual sedimentary basins. If accurate, this variability challenges our understanding of the first order behavior of the 'global' sulfur biogeochemical cycle. Some of this discordance may be due to spatial gradients in important oceanographic parameters; however, we think that a more likely culprit is ongoing microbial metabolic activity (that impacts the isotopic composition recorded by geological samples) during both syndepositional sediment reworking and early diagenetic lithification. Modern studies have recently highlighted the efficacy with which microbial activity during sediment remobilization can dramatically alter isotopic profiles. Further, the magnitude of local, microbially driven variations in S isotopes in modern sediments is sufficiently large that uneven incorporation of these signatures during deposition and lithification can explain much of the observed discordance in chemostratigraphic reconstructions of sulfur cycling. Here we attempt to link spatial variability in the sedimentary rock record with understanding of modern microbial systems operating in marine sediments. To that end we examine chemostratigraphic records of sulfur isotope (δ34S) data spanning the terminal Neoproterozoic to early Paleozoic eras and assess their scales of spatial reproducibility. We can gain insight into interpreting the observed patterns in these records by examining modern (bio)sedimentary environments. This understanding also allows us to reflect on and refine time series isotope ratio data that constrain the behavior of the sulfur cycle over long timescales.
de Jong, Hein C.; Salentijn, Elma M. J.; Dekking, Liesbeth; Bosch, Dirk; Hamer, Rob J.; Gilissen, Ludovicus J. W. J.; van der Meer, Ingrid M.; Smulders, Marinus J. M.
2010-01-01
Gluten proteins from wheat can induce celiac disease (CD) in genetically susceptible individuals. Specific gluten peptides can be presented by antigen-presenting cells to gluten-sensitive T-cell lymphocytes, leading to CD. Over recent decades, a significant increase has been observed in the prevalence of CD. This may partly be attributed to an increase in awareness and to improved diagnostic techniques, but increased wheat and gluten consumption is also considered a major cause. To analyze whether wheat breeding contributed to the increase in the prevalence of CD, we have compared the genetic diversity of gluten proteins for the presence of two CD epitopes (Glia-α9 and Glia-α20) in 36 modern European wheat varieties and in 50 landraces representing the wheat varieties grown up to around a century ago. Glia-α9 is a major (immunodominant) epitope that is recognized by the majority of CD patients. The minor Glia-α20 was included as a technical reference. Overall, the presence of the Glia-α9 epitope was higher in the modern varieties, whereas the presence of the Glia-α20 epitope was lower, as compared to the landraces. This suggests that modern wheat breeding practices may have led to an increased exposure to CD epitopes. On the other hand, some modern varieties and landraces have been identified that have relatively low contents of both epitopes. Such selected lines may serve as a starting point for breeding wheat that introduces ‘low CD toxicity’ as a new breeding trait. Large-scale culture and consumption of such varieties would considerably aid in decreasing the prevalence of CD. PMID:20664999
A Research Framework for Demonstrating Benefits of Advanced Control Room Technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Le Blanc, Katya; Boring, Ronald; Joe, Jeffrey
Control Room modernization is an important part of life extension for the existing light water reactor fleet. None of the 99 currently operating commercial nuclear power plants in the U.S. has completed a full-scale control room modernization to date. A full-scale modernization might, for example, entail replacement of all analog panels with digital workstations. Such modernizations have been undertaken successfully in upgrades in Europe and Asia, but the U.S. has yet to undertake a control room upgrade of this magnitude. Instead, nuclear power plant main control rooms for the existing commercial reactor fleet remain significantly analog, with only limited digital modernizations. Previous research under the U.S. Department of Energy's Light Water Reactor Sustainability Program has helped establish a systematic process for control room upgrades that support the transition to a hybrid control room. While the guidance developed to date helps streamline the process of modernization and reduce costs and uncertainty associated with introducing digital control technologies into an existing control room, these upgrades do not achieve the full potential of newer technologies that might otherwise enhance plant and operator performance. The aim of the control room benefits research presented here is to identify previously overlooked benefits of modernization, identify candidate technologies that may facilitate such benefits, and demonstrate these technologies through human factors research. This report serves as an outline for planned research on the benefits of greater modernization in the main control rooms of nuclear power plants.
Yasunari, Teppei J; Kim, Kyu-Myong; da Silva, Arlindo M; Hayasaki, Masamitsu; Akiyama, Masayuki; Murao, Naoto
2018-04-25
To identify unusual climate conditions and their connection to wildfire-driven air pollution in remote areas, we examine three anomalous large-scale wildfires in May 2003, April 2008, and July 2014 over East Eurasia, as well as how the products of those wildfires reached the city of Sapporo in northern Japan (Hokkaido), significantly affecting its air quality. NASA's MERRA-2 (Modern-Era Retrospective analysis for Research and Applications, Version 2) aerosol reanalysis data closely reproduced the PM2.5 variations in Sapporo for the smoke arrival in July 2014. Results show that all three cases featured unusually early snowmelt in East Eurasia, accompanied by warmer and drier surface conditions in the months leading up to the fires, inducing long-lasting soil dryness and producing climate and environmental conditions conducive to active wildfires. Due to prevailing anomalous synoptic-scale atmospheric motions, smoke from those fires eventually reached the remote area of Hokkaido and worsened the air quality in Sapporo. In future studies, continuous monitoring of the timing of Eurasian snowmelt and of air quality from source regions to remote regions, coupled with analysis of atmospheric and surface conditions, may be essential for more accurately predicting the effects of wildfires on air quality.
NASA Astrophysics Data System (ADS)
Ivkin, N.; Liu, Z.; Yang, L. F.; Kumar, S. S.; Lemson, G.; Neyrinck, M.; Szalay, A. S.; Braverman, V.; Budavari, T.
2018-04-01
Cosmological N-body simulations play a vital role in studying models for the evolution of the Universe. To compare to observations and make scientific inferences, statistical analysis of large simulation datasets, e.g., finding halos and obtaining multi-point correlation functions, is crucial. However, traditional in-memory methods for these tasks do not scale to the prohibitively large datasets produced by modern simulations. Our prior paper (Liu et al., 2015) proposes memory-efficient streaming algorithms that can find the largest halos in a simulation with up to 10^9 particles on a small server or desktop. However, this approach fails when directly scaling to larger datasets. This paper presents a robust streaming tool that leverages state-of-the-art techniques in GPU acceleration, sampling, and parallel I/O to significantly improve performance and scalability. Our rigorous analysis of the sketch parameters improves the previous results from finding the centers of the 10^3 largest halos (Liu et al., 2015) to ∼10^4-10^5, and reveals the trade-offs between memory, running time, and number of halos. Our experiments show that our tool can scale to datasets with up to ∼10^12 particles while using less than an hour of running time on a single Nvidia GTX 1080 GPU.
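As a toy illustration of the streaming idea (not the authors' GPU tool, and with hypothetical data), the sketch below hashes particles into grid cells and tracks approximate cell counts with a count-min sketch, so that the densest cells, i.e. halo-centre candidates, can be found in a single pass with bounded memory.

```python
import numpy as np

# Toy streaming halo-centre finder: particle positions are hashed into grid
# cells, cell counts are kept approximately in a count-min sketch, and a small
# candidate list of the heaviest cells is maintained, so memory stays bounded
# regardless of how many particles stream by.
class CountMinSketch:
    def __init__(self, width=2**20, depth=4, seed=0):
        rng = np.random.default_rng(seed)
        self.width, self.depth = width, depth
        self.salts = rng.integers(1, 2**31 - 1, size=(depth, 1))
        self.table = np.zeros((depth, width), dtype=np.int64)

    def _rows(self, keys):
        return (keys[None, :] * self.salts) % self.width        # (depth, n)

    def add(self, keys, counts):
        rows = self._rows(keys)
        for d in range(self.depth):
            np.add.at(self.table[d], rows[d], counts)

    def estimate(self, keys):
        rows = self._rows(keys)
        return self.table[np.arange(self.depth)[:, None], rows].min(axis=0)

def cell_id(positions, box=1.0, ncell=256):
    """Map particle positions (N, 3) in [0, box) to flat grid-cell indices."""
    idx = np.minimum((positions / box * ncell).astype(np.int64), ncell - 1)
    return (idx[:, 0] * ncell + idx[:, 1]) * ncell + idx[:, 2]

cms, candidates = CountMinSketch(), {}
rng = np.random.default_rng(1)
for chunk in range(5):                         # stand-in for streaming snapshot chunks from disk
    pos = rng.random((200_000, 3))             # hypothetical particle positions
    keys, counts = np.unique(cell_id(pos), return_counts=True)
    cms.add(keys, counts)
    for k, est in zip(keys, cms.estimate(keys)):
        candidates[int(k)] = int(est)
    # prune to a fixed-size candidate list so memory stays bounded
    candidates = dict(sorted(candidates.items(), key=lambda kv: kv[1], reverse=True)[:1000])

top_cells = sorted(candidates.items(), key=lambda kv: kv[1], reverse=True)[:10]
```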
A simple modern correctness condition for a space-based high-performance multiprocessor
NASA Technical Reports Server (NTRS)
Probst, David K.; Li, Hon F.
1992-01-01
A number of U.S. national programs, including space-based detection of ballistic missile launches, envisage putting significant computing power into space. Given sufficient progress in low-power VLSI, multichip-module packaging, and liquid-cooling technologies, we will see the design of high-performance multiprocessors for individual satellites. In very high speed implementations, performance depends critically on tolerating large latencies in interprocessor communication; without latency tolerance, performance is limited by the vastly differing time scales of processor and data-memory modules, including interconnect times. The modern approach to tolerating remote-communication cost in scalable, shared-memory multiprocessors is to use a multithreaded architecture and to alter the semantics of shared memory slightly, at the price of forcing the programmer either to reason about program correctness in a relaxed consistency model or to agree to program in a constrained style. The literature on multiprocessor correctness conditions has become increasingly complex, and sometimes confusing, which may hinder its practical application. We propose a simple modern correctness condition for a high-performance, shared-memory multiprocessor; the correctness condition is based on a simple interface between the multiprocessor architecture and the parallel programming system.
Afshar, Yaser; Sbalzarini, Ivo F.
2016-01-01
Modern fluorescence microscopy modalities, such as light-sheet microscopy, are capable of acquiring large three-dimensional images at high data rates. This creates a bottleneck in the computational processing and analysis of the acquired images, as the rate of acquisition outpaces the speed of processing. Moreover, images can be so large that they do not fit the main memory of a single computer. We address both issues by developing a distributed parallel algorithm for segmentation of large fluorescence microscopy images. The method is based on the versatile Discrete Region Competition algorithm, which has previously proven useful in microscopy image segmentation. The present distributed implementation decomposes the input image into smaller sub-images that are distributed across multiple computers. Using network communication, the computers collectively solve the global segmentation problem. This not only enables segmentation of large images (we test images of up to 10^10 pixels), but also accelerates segmentation to match the time scale of image acquisition. Such acquisition-rate image segmentation is a prerequisite for the smart microscopes of the future and enables online data compression and interactive experiments. PMID:27046144
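A minimal sketch of the decomposition strategy only (simple thresholding stands in for Discrete Region Competition, and all sizes are arbitrary): the image is cut into overlapping tiles, worker processes segment each tile independently, and the results are stitched back together.

```python
import numpy as np
from multiprocessing import Pool

# Toy illustration of the decomposition idea: split a large image into
# overlapping tiles, let worker processes segment each tile independently,
# then stitch the per-tile masks back into a full-size result.
HALO = 8                                   # overlap so objects on tile borders are seen by both tiles

def segment_tile(args):
    """Placeholder per-tile segmentation: simple global thresholding."""
    (row0, col0), tile = args
    mask = tile > tile.mean() + 2 * tile.std()
    return (row0, col0), mask

def segment_distributed(image, tile=512, workers=4):
    jobs = []
    for r in range(0, image.shape[0], tile):
        for c in range(0, image.shape[1], tile):
            r0, c0 = max(r - HALO, 0), max(c - HALO, 0)
            r1 = min(r + tile + HALO, image.shape[0])
            c1 = min(c + tile + HALO, image.shape[1])
            jobs.append(((r0, c0), image[r0:r1, c0:c1]))
    out = np.zeros(image.shape, dtype=bool)
    with Pool(workers) as pool:
        for (r0, c0), mask in pool.map(segment_tile, jobs):
            out[r0:r0 + mask.shape[0], c0:c0 + mask.shape[1]] |= mask
    return out

if __name__ == "__main__":
    img = np.random.poisson(5.0, size=(2048, 2048)).astype(float)   # stand-in image
    labels = segment_distributed(img)
```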
Tertiary evolution of the Shimanto belt (Japan): A large-scale collision in Early Miocene
NASA Astrophysics Data System (ADS)
Raimbourg, Hugues; Famin, Vincent; Palazzin, Giulia; Yamaguchi, Asuka; Augier, Romain
2017-07-01
To decipher the Miocene evolution of the Shimanto belt of southwestern Japan, structural and paleothermal studies were carried out in the western area of Shikoku Island. All units constituting the belt, both in its Cretaceous and Tertiary domains, are on average strongly dipping to the NW or SE, while shortening directions deduced from fault kinematics are consistently oriented NNW-SSE. Peak paleotemperatures estimated with Raman spectra of organic matter increase strongly across the southern, Tertiary portion of the belt, in tandem with the development of a steeply dipping metamorphic cleavage. Near the southern tip of Ashizuri Peninsula, the unconformity between accreted strata and fore-arc basin, present along the whole belt, corresponds to a large paleotemperature gap, supporting the occurrence of a major collision in the Early Miocene. This tectonic event occurred before the magmatic event that affected the whole belt at 15 Ma. The associated shortening was accommodated in two opposite modes, either localized on regional-scale faults such as the Nobeoka Tectonic Line in Kyushu or distributed through the whole belt as in Shikoku. The reappraisal of this collision leads to a reinterpretation of large-scale seismic refraction profiles of the margins, where the unit underlying the modern accretionary prism is now attributed to an older package of deformed and accreted sedimentary units belonging to the Shimanto belt. When integrated into reconstructions of Philippine Sea Plate motion, the collision corresponds to the oblique collision of a paleo Izu-Bonin-Mariana Arc with Japan in the Early Miocene.
Data-driven modeling of surface temperature anomaly and solar activity trends
Friedel, Michael J.
2012-01-01
A novel two-step modeling scheme is used to reconstruct and analyze surface temperature and solar activity data at global, hemispheric, and regional scales. First, the self-organizing map (SOM) technique is used to extend annual modern climate data from the century to the millennial scale. The SOM component planes are used to identify and quantify the strength of nonlinear relations among modern surface temperature anomalies (<150 years), tropical and extratropical teleconnections, and Palmer Drought Severity Indices (0–2000 years). Cross-validation of global sea and land surface temperature anomalies verifies that the SOM is an unbiased estimator with less uncertainty than the magnitude of the anomalies. Second, quantile modeling of the SOM reconstructions reveals trends and periods in surface temperature anomaly and solar activity whose timing agrees with published studies. Temporal features in surface temperature anomalies, such as the Medieval Warm Period, Little Ice Age, and Modern Warming Period, appear at all spatial scales, with magnitudes that increase when moving from ocean to land, from global to regional scales, and from southern to northern regions. Some caveats that apply when interpreting these data are the high-frequency filtering of climate signals based on quantile model selection and increased uncertainty when paleoclimatic data are limited. Even so, all models find the rate and magnitude of Modern Warming Period anomalies to be greater than those during the Medieval Warm Period. Lastly, quantile trends among reconstructed equatorial Pacific temperature profiles support the recent assertion of two primary El Niño Southern Oscillation types. These results demonstrate the efficacy of this alternative modeling approach for reconstructing and interpreting scale-dependent climate variables.
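A minimal, from-scratch self-organizing map illustrating the first step of the scheme (not the authors' implementation; input data are simulated): records are mapped onto a small 2-D grid of prototype vectors whose component planes can then be inspected for nonlinear relations.

```python
import numpy as np

# Minimal self-organizing map (SOM), for illustration only: rows of the input
# matrix (e.g. years of standardized climate variables) are mapped onto a
# small 2-D grid of prototype vectors; planes[:, :, k] is the component plane
# of variable k.
def train_som(data, grid=(8, 8), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    rng = np.random.default_rng(seed)
    n_nodes, n_vars = grid[0] * grid[1], data.shape[1]
    weights = rng.normal(size=(n_nodes, n_vars))
    # fixed 2-D coordinates of each node, used by the neighbourhood function
    coords = np.array([(i, j) for i in range(grid[0]) for j in range(grid[1])], float)
    for epoch in range(epochs):
        lr = lr0 * np.exp(-epoch / epochs)
        sigma = sigma0 * np.exp(-epoch / epochs)
        for x in rng.permutation(data):
            bmu = np.argmin(((weights - x) ** 2).sum(axis=1))     # best-matching unit
            dist2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
            h = np.exp(-dist2 / (2 * sigma ** 2))                 # Gaussian neighbourhood
            weights += lr * h[:, None] * (x - weights)
    return weights.reshape(grid[0], grid[1], n_vars)

# Hypothetical standardized inputs: 150 "years" of 5 climate variables.
data = np.random.default_rng(1).normal(size=(150, 5))
planes = train_som(data)
```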
Knight, Andrew; Watson, Katherine D.
2017-01-01
Simple Summary The identity of Jack the Ripper remains one of the greatest unsolved crime mysteries in history. Jack was notorious both for the brutality of his murders and also for his habit of stealing organs from his victims. His speed and skill in doing so, in conditions of poor light and haste, fueled theories he was a surgeon. However, re-examination of a mortuary sketch from one of his victims has revealed several key aspects that strongly suggest he had no professional surgical training. Instead, the technique used was more consistent with that of a slaughterhouse worker. There were many small-scale slaughterhouses in East London in the 1880s, within which conditions were harsh for animals and workers alike. The brutalizing effects of such work only add to concerns highlighted by modern research that those who commit violence on animals are more likely to target people. Modern slaughterhouses are more humane in some ways but more desensitizing in others, and sociological research has indicated that communities with slaughterhouses are more likely to experience the most violent of crimes. The implications for modern animal slaughtering, and our social reliance on slaughterhouses, are explored. Abstract Hundreds of theories exist concerning the identity of “Jack the Ripper”. His propensity for anatomical dissection with a knife—and in particular the rapid location and removal of specific organs—led some to speculate that he must have been surgically trained. However, re-examination of a mortuary sketch of one of his victims has revealed several aspects of incisional technique highly inconsistent with professional surgical training. Related discrepancies are also apparent in the language used within the only letter from Jack considered to be probably authentic. The techniques he used to dispatch his victims and retrieve their organs were, however, highly consistent with techniques used within the slaughterhouses of the day. East London in the 1880s had a large number of small-scale slaughterhouses, within which conditions for both animals and workers were exceedingly harsh. Modern sociological research has highlighted the clear links between the infliction of violence on animals and that inflicted on humans, as well as increased risks of violent crimes in communities surrounding slaughterhouses. Conditions within modern slaughterhouses are more humane in some ways but more desensitising in others. The implications for modern animal slaughtering, and our social reliance on slaughterhouses, are explored. PMID:28394281
Proteomics wants cRacker: automated standardized data analysis of LC-MS derived proteomic data.
Zauber, Henrik; Schulze, Waltraud X
2012-11-02
The large-scale analysis of thousands of proteins under various experimental conditions or in mutant lines has gained more and more importance in hypothesis-driven scientific research and systems biology in recent years. Quantitative analysis by large-scale proteomics using modern mass spectrometry usually results in long lists of peptide ion intensities. The main interest for most researchers, however, is to draw conclusions at the protein level. Postprocessing and combining the peptide intensities of a proteomic data set requires expert knowledge, and the often repetitive and standardized manual calculations can be time-consuming. The analysis of complex samples can result in very large data sets (lists with several thousand to 100,000 entries of different peptides) that cannot easily be analyzed using standard spreadsheet programs. To improve the speed and consistency of the data analysis of LC-MS derived proteomic data, we developed cRacker. cRacker is an R-based program for automated downstream proteomic data analysis, including data normalization strategies for metabolic labeling and label-free quantitation. In addition, cRacker includes basic statistical analysis, such as clustering of data, or ANOVA and t tests for comparisons between treatments. Results are presented in editable graphic formats and in list files.
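cRacker itself is an R program; the Python sketch below only illustrates the generic postprocessing step it automates, with made-up intensities: median-normalize the sample columns, then roll peptide ion intensities up to protein-level values.

```python
import numpy as np
import pandas as pd

# Generic peptide-to-protein postprocessing sketch (not cRacker itself):
# median-normalize each sample column, then aggregate peptide ion intensities
# to protein-level values for downstream statistics.
peptides = pd.DataFrame({
    "protein": ["P1", "P1", "P2", "P2", "P2"],
    "peptide": ["a", "b", "c", "d", "e"],
    "ctrl":    [1.0e6, 2.0e6, 3.0e5, 4.0e5, 2.5e5],
    "treat":   [1.5e6, 2.4e6, 1.0e5, 1.8e5, 0.9e5],
})

samples = ["ctrl", "treat"]
norm = peptides.copy()
norm[samples] = norm[samples] / norm[samples].median()     # column-wise median normalization
proteins = norm.groupby("protein")[samples].sum()          # peptide -> protein roll-up
proteins["log2_fc"] = np.log2(proteins["treat"] / proteins["ctrl"])
print(proteins)
```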
NASA Astrophysics Data System (ADS)
Gomes, M. L.; Fike, D. A.; Bergmann, K.; Knoll, A. H.
2015-12-01
Sulfur (S) isotope signatures of sedimentary pyrite preserved in marine rocks provide a rich suite of information about changes in biogeochemical cycling associated with the evolution of microbial metabolisms and oxygenation of Earth surface environments. Conventionally, these S isotope records are based on bulk rock measurements. Yet, in modern microbial mat environments, S isotope compositions of sulfide can vary by up to 40‰ over a spatial range of ~ 1 mm. Similar ranges of S isotope variability have been found in Archean pyrite grains using both Secondary Ion Mass Spectrometry and other micro-analytical techniques. These micron-scale patterns have been linked to changes in rates of microbial sulfate reduction and/or sulfide oxidation, isotopic distillation of the sulfate reservoir due to microbial sulfate reduction, and post-depositional alteration. Fine-scale mapping of S isotope compositions of pyrite can thus be used to differentiate primary environmental signals from post-depositional overprinting - improving our understanding of both. Here, we examine micron-scale S isotope patterns of pyrite in microbialites from the Mesoproterozoic-Neoproterozoic Sukhaya Tunguska Formation and Neoproterozoic Draken Formation in order to explore S isotope variability associated with different mat textures and pyrite grain morphologies. A primary goal is to link modern observations of how sulfide spatial isotope distributions reflect active microbial communities present at given depths in the mats to ancient processes driving fine-scale pyrite variability in microbialites. We find large (up to 60‰) S isotope variability within a spatial range of less than 2.5 cm. The micron-scale S isotope measurements converge around the S isotope composition of pyrite extracted from bulk samples of the same microbialites. These micron-scale pyrite S isotope patterns have the potential to reveal important information about ancient biogeochemical cycling in Proterozoic mat environments, with implications for interpreting S isotope signatures from the geological record.
Cascade-based attacks on complex networks
NASA Astrophysics Data System (ADS)
Motter, Adilson E.; Lai, Ying-Cheng
2002-12-01
We live in a modern world supported by large, complex networks. Examples range from financial markets to communication and transportation systems. In many realistic situations the flow of physical quantities in the network, as characterized by the loads on nodes, is important. We show that for such networks, where loads can redistribute among the nodes, intentional attacks can lead to a cascade of overload failures, which can in turn cause the entire network, or a substantial part of it, to collapse. This is relevant for real-world networks that possess a highly heterogeneous distribution of loads, such as the Internet and power grids. We demonstrate that the heterogeneity of these networks makes them particularly vulnerable to attacks, in that a large-scale cascade may be triggered by disabling a single key node. This raises obvious concerns about the security of such systems.
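A minimal load-capacity cascade in the spirit of the model described (illustrative network and parameters, not the paper's exact setup): node load is betweenness centrality, capacity is proportional to the initial load, and removing one key node triggers iterative failures of overloaded nodes.

```python
import networkx as nx

# Toy load-capacity cascade: load = betweenness centrality, capacity =
# (1 + alpha) * initial load; after one node is attacked, loads redistribute
# and any node whose new load exceeds its capacity fails, until no overloads
# remain. Returns the fraction of nodes lost.
def cascade_size(G, attacked, alpha=0.2):
    G = G.copy()
    capacity = {n: (1 + alpha) * load for n, load in nx.betweenness_centrality(G).items()}
    G.remove_node(attacked)
    while True:
        load = nx.betweenness_centrality(G)
        overloaded = [n for n in G if load[n] > capacity[n]]
        if not overloaded:
            break
        G.remove_nodes_from(overloaded)
    return 1 - G.number_of_nodes() / len(capacity)

G = nx.barabasi_albert_graph(300, 2, seed=0)          # heterogeneous (scale-free) test network
hub = max(G.degree, key=lambda kv: kv[1])[0]          # attack the highest-degree node
print(f"cascade removed {cascade_size(G, hub):.0%} of the network")
```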
ERIC Educational Resources Information Center
Sabnani, Haresh B.; Ponterotto, Joseph G.
1992-01-01
Reviews eight instruments specifically conceptualized and developed for use in racial/ethnic minority-focused psychological research: Cultural Mistrust Inventory, African Self-Consciousness Scale, Cross-Cultural Counseling Inventory-Revised, Modern Racism Scale, Value Orientation Scale, Acculturation Rating Scale for Mexican Americans, Racial…
Person, M.; Banerjee, A.; Hofstra, A.; Sweetkind, D.; Gao, Y.
2008-01-01
The Great Basin region in the western United States contains active geothermal systems, large epithermal Au-Ag deposits, and world-class Carlin-type gold deposits. Temperature profiles, fluid inclusion studies, and isotopic evidence suggest that modern and fossil hydrothermal systems associated with gold mineralization share many common features, including the absence of a clear magmatic fluid source, discharge areas restricted to fault zones, and remarkably high temperatures (>200 °C) at shallow depths (200-1500 m). While the plumbing of these systems varies, geochemical and isotopic data collected at the Dixie Valley and Beowawe geothermal systems suggest that fluid circulation along fault zones was relatively deep (>5 km) and composed of relatively unexchanged Pleistocene meteoric water with small (<2.5‰) shifts from the meteoric water line (MWL). Many fossil ore-forming systems were also dominated by meteoric water, but usually exhibit δ18O fluid-rock interactions with larger shifts of 5‰-20‰ from the MWL. Here we present a suite of two-dimensional regional (100 km) and local (40-50 km) scale hydrologic models that we have used to study the plumbing of modern and Tertiary hydrothermal systems of the Great Basin. Geologically and geophysically consistent cross sections were used to generate somewhat idealized hydrogeologic models for these systems that include the most important faults, aquifers, and confining units in their approximate configurations. Multiple constraints were used, including enthalpy, δ18O, silica compositions of fluids and/or rocks, groundwater residence times, fluid inclusion homogenization temperatures, and apatite fission track anomalies. Our results suggest that these hydrothermal systems were driven by natural thermal convection along anisotropic, subvertical faults connected in many cases at depth by permeable aquifers within favorable lithostratigraphic horizons. Those with minimal fluid δ18O shifts are restricted to high-permeability fault zones and relatively small-scale (~5 km), single-pass flow systems (e.g., Beowawe). Those with intermediate to large isotopic shifts (e.g., epithermal and Carlin-type Au) had larger-scale (~15 km) loop convection cells with a greater component of flow through marine sedimentary rocks at lower water/rock ratios and greater endowments of gold. Enthalpy calculations constrain the duration of Carlin-type gold systems to probably <200 k.y. Shallow heat flow gradients and fluid silica concentrations suggest that the duration of the modern Beowawe system is <5 k.y. However, fluid flow at Beowawe during the Quaternary must have been episodic with a net duration of ~200 k.y. to account for the amount of silica in the sinter deposits. In the Carlin trend, fluid circulation extended down into Paleozoic siliciclastic rocks, which afforded more mixing with isotopically enriched, higher-enthalpy fluids. Computed fission track ages along the Carlin trend included the convective effects, and ranged between 91.6 and 35.3 Ma. Older fission track ages occurred in zones of groundwater recharge, and the younger ages occurred in discharge areas. This is largely consistent with fission track ages reported in recent studies. We found that either an amagmatic system with more permeable faults (10^-11 m^2) or a magmatic system with less permeable faults (10^-13 m^2) could account for the published isotopic and thermal data along the Carlin trend systems. Localized high heat flow beneath the Muleshoe fault was needed to match fluid inclusion temperatures at Mule Canyon. However, both magmatic and amagmatic scenarios require the existence of deep, permeable faults to bring hot fluids to the near surface. © 2008 Geological Society of America.
NASA Astrophysics Data System (ADS)
Stegen, Ronald; Gassmann, Matthias
2017-04-01
The use of a broad variety of agrochemicals is essential for modern industrialized agriculture. During the last decades, awareness of the side effects of their use has grown, and with it the requirement to reproduce, understand and predict the behaviour of these agrochemicals in the environment, in order to optimize their use and minimize the side effects. Modern modelling has made great progress in understanding and predicting the behaviour of these chemicals with digital methods. While the behaviour of the applied chemicals is often investigated and modelled, most studies only simulate the parent chemicals, assuming complete annihilation of the substance. However, due to a diversity of chemical, physical and biological processes, the substances are rather transformed into new chemicals, which are themselves transformed until, at the end of the chain, the substance is completely mineralized. During this process, the fate of each transformation product is determined by its own environmental characteristics, and the pathway and results of transformation can differ largely by substance and environmental influences, which can vary between compartments of the same site. Simulating transformation products introduces additional model uncertainties. Thus, the calibration effort increases compared to simulations of the transport and degradation of the primary substance alone. The simulation of the necessary physical processes also requires a lot of calculation time. Because of this, few physically based models offer the possibility to simulate transformation products at all, mostly at the field scale. The few models available for the catchment scale are not optimized for this task, i.e. they are only able to simulate a single parent compound and up to two transformation products. Thus, for simulations of large physico-chemical parameter spaces, the enormous calculation time of the underlying hydrological model diminishes the overall performance. In this study, the structure of the model ZIN-AGRITRA is redesigned for the transport and transformation of an unlimited number of agrochemicals in the soil-water-plant system at the catchment scale. The focus, besides a sound hydrological basis, is on a flexible representation of transformation processes and on optimization for the use of large numbers of different substances. Owing to the new design, a reduction of the calculation time per tested substance is achieved, allowing faster exploration of parameter spaces. Additionally, the new concept allows for the consideration of different transformation processes and products in different environmental compartments. A first test of the calculation time improvements and flexible transformation pathways was performed in a Mediterranean meso-scale catchment, using the insecticide Chlorpyrifos and two of its transformation products, which emerge from different transformation processes, as test substances.
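A minimal sketch of the transformation-chain concept (hypothetical rate constants; the catchment model also resolves transport and compartment-specific pathways that this toy ignores): first-order degradation of a parent compound into two sequential transformation products.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy transformation chain: parent -> TP1 -> TP2 -> mineralized, each step
# first-order. Rate constants and the TP1 formation fraction are hypothetical.
k_parent, k_tp1, k_tp2 = 0.05, 0.02, 0.01     # first-order rate constants, 1/day
f_tp1 = 0.7                                    # fraction of degraded parent forming TP1

def chain(t, y):
    parent, tp1, tp2 = y
    return [-k_parent * parent,
            f_tp1 * k_parent * parent - k_tp1 * tp1,
            k_tp1 * tp1 - k_tp2 * tp2]

sol = solve_ivp(chain, (0, 365), [100.0, 0.0, 0.0], t_eval=np.linspace(0, 365, 366))
parent, tp1, tp2 = sol.y                       # concentrations over one year
```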
NASA Astrophysics Data System (ADS)
Avdeeva, Elena; Averina, Tatiana; Kochetova, Larisa
2018-03-01
Modern urbanization processes occurring on a global scale inevitably lead to an increase in population density in large cities. People's assessments of the quality of life and living standards in megalopolises under conditions of intensive high-rise construction are ambiguous. Using SWOT analysis, the authors distinguish the positive and negative aspects of high-rise construction and highlight the threats to its development as well as its opportunities. The article considers a model of the development of the city's industry and infrastructure that enables determining the optimal volume of production by sectors and branches of the city economy in order to increase its innovative, production and economic potential and business activity.
GEMS Project: A Platform to Investigate Multiple Sclerosis Risk
Xia, Zongqi; White, Charles C.; Owen, Emily K.; Von Korff, Alina; Clarkson, Sarah R.; McCabe, Cristin A.; Cimpean, Maria; Winn, Phoebe A.; Hoesing, Ashley; Steele, Sonya U.; Cortese, Irene C. M.; Chitnis, Tanuja; Weiner, Howard L.; Reich, Daniel S.; Chibnik, Lori B.; De Jager, Philip L.
2015-01-01
The Genes and Environment in Multiple Sclerosis (GEMS) project establishes a platform to investigate the events leading to MS in at-risk individuals. It has recruited 2,632 first-degree relatives from across the USA. Using an integrated genetic and environmental risk score, we identified subjects with twice the MS risk when compared to the average family member, and we report an initial incidence rate in these subjects that is 30 times greater than that of sporadic MS. We discuss the feasibility of large-scale studies of asymptomatic at-risk subjects that leverage modern tools of subject recruitment to execute collaborative projects. PMID:26583565
The Yellow Fever Vaccine: A History
Frierson, J. Gordon
2010-01-01
After failed attempts at producing bacteria-based vaccines, the discovery of a viral agent causing yellow fever and its isolation in monkeys opened new avenues of research. Subsequent advances were the attenuation of the virus in mice and later in tissue culture; the creation of the seed lot system to avoid spontaneous mutations; the ability to produce the vaccine on a large scale in eggs; and the removal of dangerous contaminants. An important person in the story is Max Theiler, who was Professor of Epidemiology and Public Health at Yale from 1964-67, and whose work on virus attenuation created the modern vaccine and earned him the Nobel Prize. PMID:20589188
[Psychometric assessment of a brief Modern Racism Scale].
Campo-Arias, Adalberto; Herazo, Edwin; Oviedo, Heidi C
2016-06-01
Objective To determine the internal consistency of the Modern Racism Scale (MRS) among medical students in Bucaramanga, Colombia. Methods A total of 352 medical students, mean age=20.0 years (SD=1.9), reported their attitudes towards Afro-Colombians; 59.4% were women. Students completed the 10-item version of the MRS. Cronbach alpha and McDonald omega were calculated, and exploratory factor analyses were done to propose a brief version of the MRS. Results The 10-item version showed a Cronbach alpha of 0.48 and a McDonald omega of 0.15. The short version, the Brief Modern Racism Scale (BMRS) (items 1, 4, 5, 7 and 8), presented a Cronbach alpha of 0.64 and a McDonald omega of 0.65. The BMRS showed one salient factor accounting for 41.6% of the total variance. Conclusions A Spanish-language short version of the MRS shows better psychometric performance than the original version. Further study is needed to corroborate these findings or make adjustments for Colombian cultural regions.
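For reference, the standard Cronbach's alpha computation on an item-response matrix looks as follows; the data are simulated, not the study's responses.

```python
import numpy as np

# Cronbach's alpha: alpha = k/(k-1) * (1 - sum of item variances / variance of
# the total score), computed on a respondents-by-items matrix.
def cronbach_alpha(items):
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(352, 1))                           # shared attitude factor
responses = latent + rng.normal(scale=1.5, size=(352, 5))    # 5 simulated Likert-type items
print(f"alpha = {cronbach_alpha(responses):.2f}")
```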
Aquifer Vulnerability Assessment Based on Sequence Stratigraphic and ³⁹Ar Transport Modeling.
Sonnenborg, Torben O; Scharling, Peter B; Hinsby, Klaus; Rasmussen, Erik S; Engesgaard, Peter
2016-03-01
A large-scale groundwater flow and transport model is developed for a deep-seated (100 to 300 m below ground surface) sedimentary aquifer system. The model is based on a three-dimensional (3D) hydrostratigraphic model, building on a sequence stratigraphic approach. The flow model is calibrated against observations of hydraulic head and stream discharge, while the credibility of the transport model is evaluated against measurements of ³⁹Ar from deep wells using alternative parameterizations of dispersivity and effective porosity. The directly simulated 3D mean age distributions and vertical fluxes are used to visualize the two-dimensional (2D)/3D age and flux distribution along transects and at the top plane of individual aquifers. The simulation results are used to assess the vulnerability of the aquifer system, which generally has been assumed to be protected by thick overlying clayey units and has therefore been proposed as a future reservoir for drinking water supply. The results indicate that on a regional scale these deep-seated aquifers are not as protected from modern surface water contamination as expected, because significant leakage to the deeper aquifers occurs. The complex distribution of local and intermediate groundwater flow systems controlled by the distribution of the river network as well as the topographical variation (Tóth 1963) provides the possibility for modern water to be found in even the deepest aquifers. © 2015, National Ground Water Association.
Chola, Lumbwe; McGee, Shelley; Tugendhaft, Aviva; Buchmann, Eckhart; Hofman, Karen
2015-01-01
Introduction Family planning contributes significantly to the prevention of maternal and child mortality. However, many women still do not use modern contraception and the numbers of unintended pregnancies, abortions and subsequent deaths are high. In this paper, we estimate the service delivery costs of scaling up modern contraception, and the potential impact on maternal, newborn and child survival in South Africa. Methods The Family Planning model in Spectrum was used to project the impact of modern contraception on pregnancies, abortions and births in South Africa (2015-2030). The contraceptive prevalence rate (CPR) was increased annually by 0.68 percentage points. The Lives Saved Tool was used to estimate maternal and child deaths, with coverage of essential maternal and child health interventions increasing by 5% annually. A scenario analysis was done to test impacts when: the change in CPR was 0.1% annually; and intervention coverage increased linearly to 99% in 2030. Results If CPR increased by 0.68% annually, the number of pregnancies would reduce from 1.3 million in 2014 to one million in 2030. Unintended pregnancies, abortions and births decrease by approximately 20%. Family planning can avert approximately 7,000 newborn and child and 600 maternal deaths. The total annual costs of providing modern contraception in 2030 are estimated to be US$33 million and the cost per user of modern contraception is US$7 per year. The incremental cost per life year gained is US$40 for children and US$1,000 for mothers. Conclusion Maternal and child mortality remain high in South Africa, and scaling up family planning together with optimal maternal, newborn and child care is crucial. A huge impact can be made on maternal and child mortality, with a minimal investment per user of modern contraception. PMID:26076482
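A back-of-the-envelope projection in the spirit of the scenario described (placeholder population and baseline values, not Spectrum/LiST outputs): linear growth of the contraceptive prevalence rate and the implied users and service cost.

```python
# Simple linear CPR projection with placeholder inputs. The annual CPR step
# and cost per user mirror figures quoted in the abstract; the baseline CPR
# and the count of women of reproductive age are hypothetical.
women_reproductive_age = 15_000_000      # hypothetical women aged 15-49
cpr_2015 = 0.55                          # hypothetical baseline CPR
cpr_step = 0.0068                        # +0.68 percentage points per year (as in the abstract)
cost_per_user = 7.0                      # US$ per user per year (as in the abstract)

for year in range(2015, 2031, 5):
    cpr = cpr_2015 + cpr_step * (year - 2015)
    users = women_reproductive_age * cpr
    print(f"{year}: CPR {cpr:.1%}, users {users/1e6:.1f} M, "
          f"cost US${users * cost_per_user / 1e6:.0f} M")
```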
17 CFR 230.420 - Legibility of prospectus.
Code of Federal Regulations, 2014 CFR
2014-04-01
... data included therein shall be in roman type at least as large and as legible as 10-point modern type... data, including tabular data in notes, and (b) prospectuses deemed to be omitting prospectuses under rule 482 (17 CFR 230.482) may be in roman type at least as large and as legible as 8-point modern type...
17 CFR 230.420 - Legibility of prospectus.
Code of Federal Regulations, 2011 CFR
2011-04-01
... data included therein shall be in roman type at least as large and as legible as 10-point modern type... data, including tabular data in notes, and (b) prospectuses deemed to be omitting prospectuses under rule 482 (17 CFR 230.482) may be in roman type at least as large and as legible as 8-point modern type...
17 CFR 230.420 - Legibility of prospectus.
Code of Federal Regulations, 2013 CFR
2013-04-01
... data included therein shall be in roman type at least as large and as legible as 10-point modern type... data, including tabular data in notes, and (b) prospectuses deemed to be omitting prospectuses under rule 482 (17 CFR 230.482) may be in roman type at least as large and as legible as 8-point modern type...
17 CFR 230.420 - Legibility of prospectus.
Code of Federal Regulations, 2012 CFR
2012-04-01
... data included therein shall be in roman type at least as large and as legible as 10-point modern type... data, including tabular data in notes, and (b) prospectuses deemed to be omitting prospectuses under rule 482 (17 CFR 230.482) may be in roman type at least as large and as legible as 8-point modern type...
17 CFR 230.420 - Legibility of prospectus.
Code of Federal Regulations, 2010 CFR
2010-04-01
... size, type size and font, bold-face type, italics and red ink, by presenting all required information... data included therein shall be in roman type at least as large and as legible as 10-point modern type... rule 482 (17 CFR 230.482) may be in roman type at least as large and as legible as 8-point modern type...
NASA Technical Reports Server (NTRS)
Malcolm, G. N.; Schiff, L. B.
1985-01-01
Two rotary balance apparatuses were developed for testing airplane models in a coning motion. A large-scale apparatus, developed for use in the 12-Foot Pressure Wind Tunnel primarily to permit testing at high Reynolds numbers, was recently used to investigate the aerodynamics of a 0.05-scale model of the F-15 fighter aircraft. Effects of Reynolds number, spin rate parameter, model attitude, presence of a nose boom, and model/sting mounting angle were investigated. A smaller apparatus, which investigates the aerodynamics of bodies of revolution in a coning motion, was used in the 6-by-6 Foot Supersonic Wind Tunnel to investigate the aerodynamic behavior of a simple representation of a modern fighter, the Standard Dynamic Model (SDM). Effects of spin rate parameter and model attitude were investigated. A description of the two rigs and a discussion of some of the results obtained in the respective tests are presented.
NASA Astrophysics Data System (ADS)
Schwaiger, Karl; Haider, Markus; Haemmerle, Martin; Steiner, Peter; Obermaier, Michael-Dario
2016-05-01
Flexible, dispatchable solar thermal electricity plants applying state-of-the-art power cycles have the potential to play a vital role in modern electricity systems and even to participate in the ancillary services market. By replacing molten salt with particles, operating temperatures can be increased and plant efficiencies of over 45% can be reached. In this work the concept for a utility-scale plant using corundum as the storage/heat transfer material is thermodynamically modeled and its key performance data are reported. A novel indirect fluidized-bed particle receiver concept is presented which, profiting from near-blackbody behavior, is able to heat up large particle flows while realizing temperature cycles of over 500 °C. Specialized fluidized-bed steam generators are applied with negligible auxiliary power demand. The performance of the key components is discussed and a rough sketch of the plant is provided.
The dune effect on sand-transporting winds on Mars.
Jackson, Derek W T; Bourke, Mary C; Smyth, Thomas A G
2015-11-05
Wind on Mars is a significant agent of contemporary surface change, yet the absence of in situ meteorological data hampers the understanding of surface-atmospheric interactions. Airflow models at length scales relevant to landform size now enable examination of conditions that might activate even small-scale bedforms (ripples) under certain contemporary wind regimes. Ripples have the potential to be used as modern 'wind vanes' on Mars. Here we use 3D airflow modelling to demonstrate that local dune topography exerts a strong influence on wind speed and direction and that ripple movement likely reflects steered wind direction for certain dune ridge shapes. The poor correlation of dune orientation with effective sand-transporting winds suggests that large dunes may not be mobile under modelled wind scenarios. This work highlights the need to first model winds at high resolution before inferring regional wind patterns from ripple movement or dune orientations on the surface of Mars today.
Biomorphic architectures for autonomous Nanosat designs
NASA Technical Reports Server (NTRS)
Hasslacher, Brosl; Tilden, Mark W.
1995-01-01
Modern space tool design is the science of making a machine both massively complex and, at the same time, extremely robust and dependable. We propose a novel nonlinear control technique that produces capable, self-organizing, micron-scale space machines at low cost and in large numbers by parallel silicon assembly. Experiments using biomorphic architectures (with ideal space attributes) have produced a wide spectrum of survival-oriented machines that are reliably domesticated for work applications in specific environments. In particular, several one-chip satellite prototypes show interesting control properties that can be turned into numerous application-specific machines for autonomous, disposable space tasks. We believe that the real power of these architectures lies in their potential to self-assemble into larger, robust, loosely coupled structures. Assembly takes place at hierarchical space scales, with different attendant properties, allowing for inexpensive solutions to many daunting work tasks. The nature of biomorphic control, design, engineering options, and applications are discussed.
Martini, Roberto; Barthelat, Francois
2016-10-13
Protective systems that are simultaneously hard to puncture and compliant in flexion are desirable but difficult to achieve, because hard materials are usually stiff. However, this conflicting design requirement can be overcome by combining plates of a hard material with a softer substrate, a strategy widely found in natural armors such as fish scales or osteoderms. Man-made segmented armors have a long history, but their systematic implementation in modern protective systems is still hampered by a limited understanding of the mechanics, by the absence of design and optimization guidelines, and by challenges in cost-efficient manufacturing. This study addresses these limitations with a flexible bioinspired armor based on overlapping ceramic scales. The fabrication combines laser engraving with a stretch-and-release method that allows for fine tuning of the size and overlap of the scales and is suitable for large-scale fabrication. Compared to a continuous layer of uniform ceramic, our fish-scale-like armor is not only more flexible, but also more resistant to puncture and more damage tolerant. The proposed armor is also about ten times more puncture resistant than soft elastomers, making it a very attractive alternative to traditional protective equipment.
(Finite) statistical size effects on compressive strength.
Weiss, Jérôme; Girard, Lucas; Gimbert, Florent; Amitrano, David; Vandembroucq, Damien
2014-04-29
The larger structures are, the lower their mechanical strength. Already discussed by Leonardo da Vinci and Edmé Mariotte several centuries ago, size effects on strength remain of crucial importance in modern engineering for the elaboration of safety regulations in structural design or the extrapolation of laboratory results to geophysical field scales. Under tensile loading, statistical size effects are traditionally modeled with a weakest-link approach. One of its prominent results is a prediction of vanishing strength at large scales that can be quantified in the framework of extreme value statistics. Despite a frequent use outside its range of validity, this approach remains the dominant tool in the field of statistical size effects. Here we focus on compressive failure, which concerns a wide range of geophysical and geotechnical situations. We show on historical and recent experimental data that weakest-link predictions are not obeyed. In particular, the mechanical strength saturates at a nonzero value toward large scales. Accounting explicitly for the elastic interactions between defects during the damage process, we build a formal analogy of compressive failure with the depinning transition of an elastic manifold. This critical transition interpretation naturally entails finite-size scaling laws for the mean strength and its associated variability. Theoretical predictions are in remarkable agreement with measurements reported for various materials such as rocks, ice, coal, or concrete. This formalism, which can also be extended to the flowing instability of granular media under multiaxial compression, has important practical consequences for future design rules.
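A minimal Monte Carlo of the weakest-link picture that serves as the tensile baseline here (illustrative Weibull parameters): the strength of a sample of N elements is the minimum of N element strengths, so the mean strength vanishes with size as N^(-1/m), in line with the extreme-value prediction.

```python
import numpy as np
from math import gamma

# Weakest-link Monte Carlo: each sample is a chain of n elements whose
# strengths are Weibull-distributed (modulus m, scale s0); the sample fails at
# its weakest element. The exact mean of the minimum is s0 * n**(-1/m) * Gamma(1 + 1/m).
rng = np.random.default_rng(0)
m, s0 = 6.0, 1.0                                  # illustrative Weibull modulus and scale

for n in [10, 100, 1_000, 10_000]:
    samples = s0 * rng.weibull(m, size=(1_000, n)).min(axis=1)
    predicted = s0 * n ** (-1.0 / m) * gamma(1 + 1.0 / m)
    print(f"N = {n:>6}: simulated mean strength {samples.mean():.3f}, "
          f"weakest-link prediction {predicted:.3f}")
```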
Wang, Jack T. H.; Schembri, Mark A.; Hall, Roy A.
2013-01-01
Designing and implementing assessment tasks in large-scale undergraduate science courses is a labor-intensive process subject to increasing scrutiny from students and quality assurance authorities alike. Recent pedagogical research has provided conceptual frameworks for teaching introductory undergraduate microbiology, but has yet to define best-practice assessment guidelines. This study assessed the applicability of Biggs’ theory of constructive alignment in designing consistent learning objectives, activities, and assessment items that aligned with the American Society for Microbiology’s concept-based microbiology curriculum in MICR2000, an introductory microbiology course offered at the University of Queensland, Australia. By improving the internal consistency in assessment criteria and increasing the number of assessment items explicitly aligned to the course learning objectives, the teaching team was able to efficiently provide adequate feedback on numerous assessment tasks throughout the semester, which contributed to improved student performance and learning gains. When comparing the constructively aligned 2011 offering of MICR2000 with its 2010 counterpart, students obtained higher marks in both coursework assignments and examinations as the semester progressed. Students also valued the additional feedback provided, as student rankings for course feedback provision increased in 2011 and assessment and feedback was identified as a key strength of MICR2000. By designing MICR2000 using constructive alignment and iterative assessment tasks that followed a common set of learning outcomes, the teaching team was able to effectively deliver detailed and timely feedback in a large introductory microbiology course. This study serves as a case study for how constructive alignment can be integrated into modern teaching practices for large-scale courses. PMID:23858350
Composition, Respirable Fraction and Dissolution Rate of 24 Stone Wool MMVF with their Binder.
Wohlleben, Wendel; Waindok, Hubert; Daumann, Björn; Werle, Kai; Drum, Melanie; Egenolf, Heiko
2017-08-07
Man-made vitreous fibres (MMVF) are produced on a large scale for thermal insulation purposes. After extensive studies of fibre effects in the 1980s and 1990s, the composition of MMVF was modified to reduce the fibrotic and carcinogenic potential via reduced biopersistence. However, occupational risks from handling, applying and disposing of modern MMVF may be underestimated, as the conventional regulatory classification - combining composition, in-vivo clearance and effects - seems to be based entirely on MMVF after removal of the binder. Here we report the oxide composition of 23 modern MMVF from Germany, Finland, the UK, Denmark, Russia and China (five different producers) and one pre-1995 MMVF. We find that most of the investigated modern MMVF can be classified as "High-alumina, low-silica wool", but several were on or beyond the borderline to "pre-1995 Rock (Stone) wool". We then used well-established flow-through dissolution testing at pH 4.5 and pH 7.4, with and without binder, at various flow rates, to screen the biosolubility of 14 MMVF over 32 days. At the flow rate and acidic pH of reports that found a dissolution rate of 47 ng/cm²/h for reference biopersistent MMVF21 (without binder), we find rates from 17 to 90 ng/cm²/h for modern MMVF as marketed (with binder). Removing the binder accelerates the dissolution significantly, but not to the level of reference biosoluble MMVF34. We finally simulated handling and disposal of MMVF and measured size fractions in the aerosol. The respirable fraction of modern MMVF is low, but not lower than that of pre-1995 MMVF. The average composition of modern stone wool MMVF is different from historic biopersistent MMVF, but to a lesser extent than expected. The dissolution rates measured by abiotic methods indicate that the binder has a significant influence on dissolution via gel formation. Considering the content of respirable fibres, these findings imply that the risk assessment of modern stone wool may need to be revisited based on in-vivo studies of MMVF as marketed (with binder).
Fan, Long; Hui, Jerome H L; Yu, Zu Guo; Chu, Ka Hou
2014-07-01
Species identification based on short sequences of DNA markers, that is, DNA barcoding, has emerged as an integral part of modern taxonomy. However, software for the analysis of large and multilocus barcoding data sets is scarce. The Basic Local Alignment Search Tool (BLAST) is currently the fastest tool capable of handling large databases (e.g. >5000 sequences), but its accuracy is a concern and has been criticized for its local optimization. However, current more accurate software requires sequence alignment or complex calculations, which are time-consuming when dealing with large data sets during data preprocessing or during the search stage. Therefore, it is imperative to develop a practical program for both accurate and scalable species identification for DNA barcoding. In this context, we present VIP Barcoding: a user-friendly software in graphical user interface for rapid DNA barcoding. It adopts a hybrid, two-stage algorithm. First, an alignment-free composition vector (CV) method is utilized to reduce searching space by screening a reference database. The alignment-based K2P distance nearest-neighbour method is then employed to analyse the smaller data set generated in the first stage. In comparison with other software, we demonstrate that VIP Barcoding has (i) higher accuracy than Blastn and several alignment-free methods and (ii) higher scalability than alignment-based distance methods and character-based methods. These results suggest that this platform is able to deal with both large-scale and multilocus barcoding data with accuracy and can contribute to DNA barcoding for modern taxonomy. VIP Barcoding is free and available at http://msl.sls.cuhk.edu.hk/vipbarcoding/. © 2014 John Wiley & Sons Ltd.
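A toy version of the hybrid two-stage idea (not VIP Barcoding itself): stage one screens references with an alignment-free k-mer composition vector, stage two ranks the shortlist by Kimura two-parameter (K2P) distance. For simplicity the K2P step assumes equal-length, pre-aligned sequences, which the real tool does not require.

```python
import numpy as np
from itertools import product

K = 3
KMERS = ["".join(p) for p in product("ACGT", repeat=K)]

def composition_vector(seq):
    """Alignment-free k-mer frequency vector (stage-1 screen)."""
    counts = {k: 0 for k in KMERS}
    for i in range(len(seq) - K + 1):
        if seq[i:i + K] in counts:
            counts[seq[i:i + K]] += 1
    v = np.array(list(counts.values()), dtype=float)
    return v / v.sum()

def k2p_distance(a, b):
    """Kimura 2-parameter distance between two aligned, equal-length sequences."""
    pairs = [(x, y) for x, y in zip(a, b) if x in "ACGT" and y in "ACGT"]
    n = len(pairs)
    transitions = {("A", "G"), ("G", "A"), ("C", "T"), ("T", "C")}
    p = sum((x, y) in transitions for x, y in pairs) / n
    q = sum(x != y and (x, y) not in transitions for x, y in pairs) / n
    return -0.5 * np.log((1 - 2 * p - q) * np.sqrt(1 - 2 * q))

def identify(query, references, shortlist=2):
    qv = composition_vector(query)
    # stage 1: screen references by Euclidean distance between composition vectors
    screened = sorted(references,
                      key=lambda r: np.linalg.norm(composition_vector(r[1]) - qv))[:shortlist]
    # stage 2: K2P nearest neighbour within the shortlist
    return min(screened, key=lambda r: k2p_distance(query, r[1]))[0]

refs = [("sp_A", "ACGTACGTACGTACGTAAGT"),        # hypothetical reference barcodes
        ("sp_B", "ACGTTCGTACGAACGTAAGC"),
        ("sp_C", "TTGTACCTACGTACGAAAGT")]
print(identify("ACGTACGTACGTACGTAAGC", refs))
```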
Large Eddy Simulation of Engineering Flows: A Bill Reynolds Legacy.
NASA Astrophysics Data System (ADS)
Moin, Parviz
2004-11-01
The term 'large eddy simulation' (LES) was coined by Bill Reynolds thirty years ago, when he and his colleagues pioneered the introduction of LES in the engineering community. Bill's legacy in LES features his insistence on having a proper mathematical definition of the large-scale field independent of the numerical method used, and his vision for using numerical simulation output as data for research in turbulence physics and modeling, just as one would think of using experimental data. However, as an engineer, Bill was predominantly interested in the predictive capability of computational fluid dynamics and in particular LES. In this talk I will present the state of the art in large eddy simulation of complex engineering flows. Most of this technology has been developed in the Department of Energy's ASCI Program at Stanford, which was led by Bill in the last years of his distinguished career. At the core of this technology is a fully implicit, non-dissipative LES code which uses unstructured grids with arbitrary elements. A hybrid Eulerian/Lagrangian approach is used for multi-phase flows, and chemical reactions are introduced through dynamic equations for the mixture fraction and reaction progress variable in conjunction with flamelet tables. The predictive capability of LES is demonstrated in several validation studies in flows with complex physics and complex geometry, including flow in the combustor of a modern aircraft engine. LES in such a complex application is only possible through efficient utilization of modern parallel supercomputers, which was recognized and emphasized by Bill from the beginning. The presentation will include a brief mention of computer science efforts for efficient implementation of LES.
Evolution of a Lowland Karst Landscape; A Mass-Balance Approach
NASA Astrophysics Data System (ADS)
Chamberlin, C.; Heffernan, J. B.; Cohen, M. J.; Quintero, C.; Pain, A.
2016-12-01
Karst landscapes are highly soluble, making biological acid production a major driving factor in their evolution. Big Cypress National Park (BICY) is a low-lying karst landscape in southern Florida displaying a distinctive morphology of isolated depressions likely influenced by biology. The goal of this study is to constrain the timescales of landform development in BICY. This question was addressed through the construction of landscape-scale elemental budgets for both calcium and phosphorus. Precipitation and export fluxes were calculated using available chemistry and hydrology data, and stocks were calculated from a combination of existing data, field measurements, and laboratory chemical analysis. Estimates of expected mass export given no biological acid production, and given acid production equivalent to 100% of GPP, were compared with observed rates. Current standing stocks of phosphorus are dominated by a large soil pool and contain 500 Gg P. Inputs are largely dominated by precipitation, and 8000 years are necessary to accumulate the standing stocks of phosphorus given modern fluxes. Calcium flux is vastly dominated by dissolution of the limestone bedrock, and though some calcium is retained in the soil, most is exported. Using LiDAR-generated estimates of volume loss across the landscape and current export rates, an estimated 15,000 years would be necessary to create the modern landscape. Both of these estimates indicate that the BICY landscape is geologically very young. The different behaviors of these elements (calcium is largely exported, while phosphorus is largely retained) lend additional confidence to estimates of denudation rates of the landscape. These estimates can be reconciled even more closely if calcium redistribution over the landscape is allowed for. They are then compared to the two bounding conditions for biological weathering to indicate the likely level of biological importance to landscape development in this system.
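The arithmetic behind these timescale estimates is a stock divided by a flux; the snippet below reproduces the orders of magnitude with round placeholder numbers, not the study's exact budget.

```python
# Mass-balance timescale: standing stock / net annual flux. The values here
# are round placeholders chosen only to match the orders of magnitude quoted
# in the abstract (~8000 yr for P accumulation, ~15,000 yr for denudation).
p_stock_Gg = 500.0            # phosphorus standing stock (Gg P), mostly in soil
p_input_Gg_per_yr = 0.0625    # hypothetical net atmospheric P input (Gg P / yr)
print(f"P accumulation time ~ {p_stock_Gg / p_input_Gg_per_yr:,.0f} yr")

ca_deficit_Gg = 750_000.0     # hypothetical rock volume loss expressed as Ca (Gg)
ca_export_Gg_per_yr = 50.0    # hypothetical modern dissolved Ca export (Gg / yr)
print(f"Landscape denudation time ~ {ca_deficit_Gg / ca_export_Gg_per_yr:,.0f} yr")
```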
DOE Office of Scientific and Technical Information (OSTI.GOV)
Le Blanc, Katya; Joe, Jeffrey; Rice, Brandon
Control Room modernization is an important part of life extension for the existing light water reactor fleet. None of the 99 currently operating commercial nuclear power plants in the U.S. has completed a full-scale control room modernization to date. A full-scale modernization might, for example, entail replacement of all analog panels with digital workstations. Such modernizations have been undertaken successfully in upgrades in Europe and Asia, but the U.S. has yet to undertake a control room upgrade of this magnitude. Instead, nuclear power plant main control rooms for the existing commercial reactor fleet remain significantly analog, with only limited digital modernizations. Previous research under the U.S. Department of Energy’s Light Water Reactor Sustainability Program has helped establish a systematic process for control room upgrades that support the transition to a hybrid control room. While the guidance developed to date helps streamline the process of modernization and reduce costs and uncertainty associated with introducing digital control technologies into an existing control room, these upgrades do not achieve the full potential of newer technologies that might otherwise enhance plant and operator performance. The aim of the control room benefits research is to identify previously overlooked benefits of modernization, identify candidate technologies that may facilitate such benefits, and demonstrate these technologies through human factors research. This report describes the initial upgrades to the HSSL and outlines the methodology for a pilot test of the HSSL configuration.
Settlement scaling and increasing returns in an ancient society
Ortman, Scott G.; Cabaniss, Andrew H. F.; Sturm, Jennie O.; Bettencourt, Luís M. A.
2015-01-01
A key property of modern cities is increasing returns to scale—the finding that many socioeconomic outputs increase more rapidly than their population size. Recent theoretical work proposes that this phenomenon is the result of general network effects typical of human social networks embedded in space and, thus, is not necessarily limited to modern settlements. We examine the extent to which increasing returns are apparent in archaeological settlement data from the pre-Hispanic Basin of Mexico. We review previous work on the quantitative relationship between population size and average settled area in this society and then present a general analysis of their patterns of monument construction and house sizes. Estimated scaling parameter values and residual statistics support the hypothesis that increasing returns to scale characterized various forms of socioeconomic production available in the archaeological record and are found to be consistent with key expectations from settlement scaling theory. As a consequence, these results provide evidence that the essential processes that lead to increasing returns in contemporary cities may have characterized human settlements throughout history, and demonstrate that increasing returns do not require modern forms of political or economic organization. PMID:26601129
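The scaling relationships discussed here are conventionally written as power laws and estimated in log-log space; a minimal statement of the model in standard settlement-scaling notation (not quoted from the paper) is:

\[
Y(N) = Y_0\,N^{\beta}, \qquad \log Y = \log Y_0 + \beta \log N,
\]

where \(Y\) is an aggregate output (settled area, monument volume, total house area), \(N\) is settlement population, and increasing returns correspond to an estimated exponent \(\beta > 1\), so that per-capita output \(Y/N = Y_0 N^{\beta - 1}\) grows with settlement size.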
Increasing the reliability of ecological models using modern software engineering techniques
Robert M. Scheller; Brian R. Sturtevant; Eric J. Gustafson; Brendan C. Ward; David J. Mladenoff
2009-01-01
Modern software development techniques are largely unknown to ecologists. Typically, ecological models and other software tools are developed for limited research purposes, and additional capabilities are added later, usually in an ad hoc manner. Modern software engineering techniques can substantially increase scientific rigor and confidence in ecological models and...
Computational Chemistry Using Modern Electronic Structure Methods
ERIC Educational Resources Information Center
Bell, Stephen; Dines, Trevor J.; Chowdhry, Babur Z.; Withnall, Robert
2007-01-01
Various modern electronic structure methods are nowadays used to teach computational chemistry to undergraduate students. Such quantum calculations can now be easily performed even for large molecules.
Large floods and climatic change during the Holocene on the Ara River, Central Japan
NASA Astrophysics Data System (ADS)
Grossman, Michael J.
2001-07-01
A reconstruction of part of the Holocene large flood record for the Ara River in central Japan is presented. Maximum intermediate gravel-size dimensions of terrace and modern floodplain gravels were measured along an 18-km reach of the river and were used in tractive force equations to estimate minimum competent flood depths. Results suggest that the magnitudes of large floods on the Ara River have varied in a non-random fashion since the end of the last glacial period. Large floods with greater magnitudes occurred during the warming period of the post-glacial and the warmer early to middle Holocene (to ˜5500 years BP). A shift in the magnitudes of large floods occurred ˜5500-5000 years BP. From this time, during the cooler middle to late Holocene, large floods generally had lower magnitudes. In the modern period, large flood magnitudes are the largest in the data set. As typhoons are the main cause of large floods on the Ara River in the modern record, the variation in large flood magnitudes suggests that the incidence of typhoon visits to central Japan changed as the climate changed during the Holocene. Further, significant dates in the large flood record on the Ara River correspond to significant dates in Europe and the USA.
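A typical tractive-force back-calculation of the kind described here combines the depth-slope product with a Shields-type entrainment threshold (a generic formulation for illustration; the study's exact equations and coefficients may differ):

\[
\tau = \rho g D S, \qquad
\tau_c = \theta_c\,(\rho_s - \rho)\,g\,d_i, \qquad
D_{\min} = \frac{\theta_c\,(\rho_s - \rho)\,d_i}{\rho\,S},
\]

where \(D\) is flow depth, \(S\) the energy slope, \(d_i\) the intermediate grain diameter, \(\rho_s\) and \(\rho\) the sediment and water densities, and \(\theta_c\) the critical Shields parameter; the largest clasts transported by a flood thus set a minimum competent flood depth.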
Large Fluvial Fans and Exploration for Hydrocarbons
NASA Technical Reports Server (NTRS)
Wilkinson, Murray Justin
2005-01-01
A report discusses the geological phenomena known, variously, as modern large (or large modern) fluvial fans or large continental fans, from a perspective of exploring for hydrocarbons. These fans are partial cones of river sediment that spread out to radii of 100 km or more. Heretofore, they have not been much recognized in the geological literature probably because they are difficult to see from the ground. They can, however, be seen in photographs taken by astronauts and on other remotely sensed imagery. Among the topics discussed in the report is the need for research to understand what seems to be an association among fluvial fans, alluvial fans, and hydrocarbon deposits. Included in the report is an abstract that summarizes the global distribution of large modern fluvial fans and a proposal to use that distribution as a guide to understanding paleo-fluvial reservoir systems where oil and gas have formed. Also included is an abstract that summarizes what a continuing mapping project has thus far revealed about the characteristics of large fans that have been found in a variety of geological environments.
Biological Physics major as a means to stimulate an undergraduate physics program
NASA Astrophysics Data System (ADS)
Jaeger, Herbert; Eid, Khalid; Yarrison-Rice, Jan
2013-03-01
In an effort to stress the cross-disciplinary nature of modern physics we added a Biological Physics major. Drawing from coursework in physics, biology, chemistry, mathematics, and related disciplines, it combines a broad curriculum with physical and mathematical rigor in preparation for careers in biophysics, medical physics, and biomedical engineering. Biological Physics offers a new path of studies to a large pool of life science students. We hope to grow our physics majors from 70-80 to more than 100 students and boost our graduation rate from the mid-teens to the mid-twenties. The new major brought about a revision of our sophomore curriculum to make room for modern topics without sidelining fundamentals. As a result, we split our 1-semester long Contemporary Physics course (4 cr hrs) into a year-long sequence Contemporary Physics Foundations and Contemporary Physics Frontiers (both 3 cr hrs). Foundations starts with relativity, then focuses on 4 quantum mechanics topics: wells, spin 1/2, oscillators, and hydrogen. Throughout the course applications are woven in whenever the opportunity arises, e.g. magnetism and NMR with spin 1/2. The following semester Frontiers explores scientific principles and technological advances that make quantum science and resulting technologies different from the large scale. Frontiers covers enabling techniques from atomic, molecular, condensed matter, and particle physics, as well as advances in nanotechnology, quantum optics, and biophysics.
Scaling of Convex Hull Volume to Body Mass in Modern Primates, Non-Primate Mammals and Birds
Brassey, Charlotte A.; Sellers, William I.
2014-01-01
The volumetric method of 'convex hulling' has recently been put forward as a mass prediction technique for fossil vertebrates. Convex hulling involves the calculation of minimum convex hull volumes (vol_CH) from the complete mounted skeletons of modern museum specimens, which are subsequently regressed against body mass (M_b) to derive predictive equations for extinct species. The convex hulling technique has recently been applied to estimate body mass in giant sauropods and fossil ratites; however, the biomechanical signal contained within vol_CH has remained unclear. Specifically, when vol_CH scaling departs from isometry in a group of vertebrates, how might this be interpreted? Here we derive predictive equations for primates, non-primate mammals and birds and compare the scaling behaviour of M_b to vol_CH between groups. We find predictive equations to be characterised by extremely high correlation coefficients (r^2 = 0.97–0.99) and low mean percentage prediction error (11–20%). Results suggest non-primate mammals scale body mass to vol_CH isometrically (b = 0.92, 95% CI = 0.85–1.00, p = 0.08). Birds scale body mass to vol_CH with negative allometry (b = 0.81, 95% CI = 0.70–0.91, p = 0.011) and apparent density (M_b/vol_CH) therefore decreases with mass (r^2 = 0.36, p<0.05). In contrast, primates scale body mass to vol_CH with positive allometry (b = 1.07, 95% CI = 1.01–1.12, p = 0.05) and apparent density therefore increases with size (r^2 = 0.46, p = 0.025). We interpret such departures from isometry in the context of the 'missing mass' of soft tissues that are excluded from the convex hulling process. We conclude that the convex hulling technique can be justifiably applied to the fossil record when a large proportion of the skeleton is preserved. However we emphasise the need for future studies to quantify interspecific variation in the distribution of soft tissues such as muscle, integument and body fat. PMID:24618736
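Predictive equations of this kind are fits in log-log space; the sketch below shows an ordinary least-squares version with purely illustrative numbers (the paper's data, fitting method and coefficients differ):

```python
import numpy as np
from scipy import stats

# Illustrative values only: convex hull volumes (m^3) and body masses (kg)
# for a hypothetical calibration set of mounted skeletons.
vol_ch = np.array([0.012, 0.034, 0.080, 0.210, 0.550])
m_b = np.array([9.5, 28.0, 71.0, 180.0, 470.0])

# Fit log10(M_b) = log10(a) + b * log10(vol_CH).
fit = stats.linregress(np.log10(vol_ch), np.log10(m_b))
a, b = 10 ** fit.intercept, fit.slope

# Back-transform and report the mean percentage prediction error.
pred = a * vol_ch ** b
mppe = np.mean(np.abs(pred - m_b) / m_b) * 100
print(f"M_b = {a:.1f} * vol_CH^{b:.2f}, r^2 = {fit.rvalue**2:.3f}, MPPE = {mppe:.1f}%")
```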
Computing at the speed limit (supercomputers)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernhard, R.
1982-07-01
The author discusses how unheralded efforts in the United States, mainly in universities, have removed major stumbling blocks to building cost-effective superfast computers for scientific and engineering applications within five years. These computers would have sustained speeds of billions of floating-point operations per second (flops), whereas with the fastest machines today the top sustained speed is only 25 million flops, with bursts to 160 megaflops. Cost-effective superfast machines can be built because of advances in very large-scale integration and the special software needed to program the new machines. VLSI greatly reduces the cost per unit of computing power. The development of such computers would come at an opportune time. Although the US leads the world in large-scale computer technology, its supremacy is now threatened, not surprisingly, by the Japanese. Publicized reports indicate that the Japanese government is funding a cooperative effort by commercial computer manufacturers to develop superfast computers-about 1000 times faster than modern supercomputers. The US computer industry, by contrast, has balked at attempting to boost computer power so sharply because of the uncertain market for the machines and the failure of similar projects in the past to show significant results.
Quantifying design trade-offs of beryllium targets on NIF
NASA Astrophysics Data System (ADS)
Yi, S. A.; Zylstra, A. B.; Kline, J. L.; Loomis, E. N.; Kyrala, G. A.; Shah, R. C.; Perry, T. S.; Kanzleiter, R. J.; Batha, S. H.; MacLaren, S. A.; Ralph, J. E.; Masse, L. P.; Salmonson, J. D.; Tipton, R. E.; Callahan, D. A.; Hurricane, O. A.
2017-10-01
An important determinant of target performance is implosion kinetic energy, which scales with the capsule size. The maximum achievable performance for a given laser is thus related to the largest capsule that can be imploded symmetrically, constrained by drive uniformity. A limiting factor for symmetric radiation drive is the ratio of hohlraum to capsule radii, or case-to-capsule ratio (CCR). For a fixed laser energy, a larger hohlraum allows for driving bigger capsules symmetrically at the cost of reduced peak radiation temperature (Tr). Beryllium ablators may thus allow for unique target design trade-offs due to their higher ablation efficiency at lower Tr. By utilizing larger hohlraum sizes than most modern NIF designs, beryllium capsules thus have the potential to operate in unique regions of the target design parameter space. We present design simulations of beryllium targets with a large CCR = 4.3 3.7 . These are scaled surrogates of large hohlraum low Tr beryllium targets, with the goal of quantifying symmetry tunability as a function of CCR. This work performed under the auspices of the U.S. DOE by LANL under contract DE-AC52- 06NA25396, and by LLNL under Contract DE-AC52-07NA27344.
NASA Astrophysics Data System (ADS)
Pollard, David; DeConto, Robert; Gomez, Natalya
2016-04-01
To date, most modeling of the Antarctic Ice Sheet's response to future warming has been calibrated using recent and modern observations. As an alternate approach, we apply a hybrid 3-D ice sheet-shelf model to the last deglacial retreat of Antarctica, making use of geologic data of the last ~20,000 years to test the model against the large-scale variations during this period. The ice model is coupled to a global Earth-sea level model to improve modeling of the bedrock response and to capture ocean-ice gravitational interactions. Following several recent ice-sheet studies, we use Large Ensemble (LE) statistical methods, performing sets of 625 runs from 30,000 years to present with systematically varying model parameters. Objective scores for each run are calculated using modern data and past reconstructed grounding lines, relative sea level records, cosmogenic elevation-age data and uplift rates. The LE results are analyzed to calibrate 4 particularly uncertain model parameters that concern marginal ice processes and interaction with the ocean. LEs are extended into the future with climates following RCP scenarios. An additional scoring criterion tests the model's ability to reproduce estimated sea-level high stands in the warm mid-Pliocene, for which drastic retreat mechanisms of hydrofracturing and ice-cliff failure are needed in the model. The LE analysis provides future sea-level-rise envelopes with well-defined parametric uncertainty bounds. Sensitivities of future LE results to Pliocene sea-level estimates, coupling to the Earth-sea level model, and vertical profiles of Earth properties, will be presented.
Rey-Villamizar, Nicolas; Somasundar, Vinay; Megjhani, Murad; Xu, Yan; Lu, Yanbin; Padmanabhan, Raghav; Trett, Kristen; Shain, William; Roysam, Badri
2014-01-01
In this article, we describe the use of Python for large-scale automated server-based bio-image analysis in FARSIGHT, a free and open-source toolkit of image analysis methods for quantitative studies of complex and dynamic tissue microenvironments imaged by modern optical microscopes, including confocal, multi-spectral, multi-photon, and time-lapse systems. The core FARSIGHT modules for image segmentation, feature extraction, tracking, and machine learning are written in C++, leveraging widely used libraries including ITK, VTK, Boost, and Qt. For solving complex image analysis tasks, these modules must be combined into scripts using Python. As a concrete example, we consider the problem of analyzing 3-D multi-spectral images of brain tissue surrounding implanted neuroprosthetic devices, acquired using high-throughput multi-spectral spinning disk step-and-repeat confocal microscopy. The resulting images typically contain 5 fluorescent channels. Each channel consists of 6000 × 10,000 × 500 voxels with 16 bits/voxel, implying image sizes exceeding 250 GB. These images must be mosaicked, pre-processed to overcome imaging artifacts, and segmented to enable cellular-scale feature extraction. The features are used to identify cell types, and perform large-scale analysis for identifying spatial distributions of specific cell types relative to the device. Python was used to build a server-based script (Dell 910 PowerEdge servers with 4 sockets/server with 10 cores each, 2 threads per core and 1TB of RAM running on Red Hat Enterprise Linux linked to a RAID 5 SAN) capable of routinely handling image datasets at this scale and performing all these processing steps in a collaborative multi-user multi-platform environment. Our Python script enables efficient data storage and movement between computers and storage servers, logs all the processing steps, and performs full multi-threaded execution of all codes, including open and closed-source third party libraries.
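A minimal sketch of the kind of server-side orchestration described here is shown below; module names, file layout and paths are hypothetical, and the real FARSIGHT modules are invoked through their own bindings rather than this simplified command-line pattern:

```python
import logging
import subprocess
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path

logging.basicConfig(filename="pipeline.log", level=logging.INFO)

# Hypothetical stand-ins for compiled C++ processing modules.
STEPS = ["mosaic", "preprocess", "segment", "extract_features"]

def run_channel(image_path: Path) -> Path:
    """Run every processing step on one fluorescent channel, logging each call."""
    for step in STEPS:
        logging.info("%s: %s", step, image_path)
        subprocess.run([step, str(image_path)], check=True)
    return image_path

if __name__ == "__main__":
    channels = sorted(Path("/data/device_123").glob("channel_*.nrrd"))  # hypothetical layout
    with ProcessPoolExecutor(max_workers=10) as pool:  # one worker per core
        for finished in pool.map(run_channel, channels):
            logging.info("finished %s", finished)
```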
Bioremediation at a global scale: from the test tube to planet Earth.
de Lorenzo, Víctor; Marlière, Philippe; Solé, Ricard
2016-09-01
Planet Earth's biosphere has evolved over billions of years as a balanced bio-geological system ultimately sustained by solar power and the large-scale cycling of elements largely run by the global environmental microbiome. Humans have been part of this picture for much of their existence. But the industrial revolution that started in the 19th century and the subsequent advances in medicine, chemistry, agriculture and communications have impacted such balances to an unprecedented degree - and the problem has only been exacerbated in the last 20 years. Human overpopulation and industrial growth, along with unsustainable use of natural resources, have driven many sites, and perhaps the planetary ecosystem as a whole, beyond recovery by spontaneous natural means, even if the immediate causes could be stopped. The most conspicuous indications of such a state of affairs include the massive change in land use, the accelerated increase in the levels of greenhouse gases, the frequent natural disasters associated with climate change and the growing non-recyclable waste (e.g. plastics and recalcitrant chemicals) that we release into the environment. While the whole planet is afflicted at a global scale by chemical pollution and anthropogenic emissions, the ongoing development of systems and synthetic biology, metagenomics, modern chemistry and some key concepts from ecological theory allow us to tackle this phenomenal challenge and propose large-scale interventions aimed at reversing and even improving the situation. This involves (i) identification of key reactions or processes that need to be re-established (or altogether created) for ecosystem reinstallation, (ii) implementation of such reactions in natural or designer hosts able to self-replicate and deliver the corresponding activities when/where needed in a fashion guided by sound ecological modelling, (iii) dispersal of niche-creating agents at a global scale and (iv) containment, monitoring and risk assessment of the whole process. © 2016 The Authors. Microbial Biotechnology published by John Wiley & Sons Ltd and Society for Applied Microbiology.
A Particle Representation Model for the Deformation of Homogeneous Turbulence
NASA Technical Reports Server (NTRS)
Kassinos, S. C.; Reynolds, W. C.
1996-01-01
In simple flows, where the mean deformation rates are mild and the turbulence has time to come to equilibrium with the mean flow, the Reynolds stresses are determined by the applied strain rate. Hence in these flows, it is often adequate to use an eddy-viscosity representation. The modern family of kappa-epsilon models has been very useful in predicting near equilibrium turbulent flows, where the rms deformation rate S is small compared to the reciprocal time scale of the turbulence (epsilon/kappa). In modern engineering applications, turbulence models are quite often required to predict flows with very rapid deformations (large S kappa/epsilon). In these flows, the structure takes some time to respond and eddy viscosity models are inadequate. The response of turbulence to rapid deformations is given by rapid distortion theory (RDT). Under RDT the nonlinear effects due to turbulence-turbulence interactions are neglected in the governing equations, but even when linearized in this fashion, the governing equations are unclosed at the one-point level due to the non-locality of the pressure fluctuations.
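For reference, the eddy-viscosity closure referred to above takes the standard form (written with \(k\) for the turbulent kinetic energy, which the abstract spells out as kappa):

\[
-\overline{u_i' u_j'} = \nu_t\left(\frac{\partial U_i}{\partial x_j} + \frac{\partial U_j}{\partial x_i}\right) - \frac{2}{3}\,k\,\delta_{ij},
\qquad
\nu_t = C_\mu \frac{k^2}{\epsilon},
\]

which is adequate only when the normalized deformation rate \(S k/\epsilon\) is modest; under rapid distortion the stress anisotropy is no longer aligned with the instantaneous strain rate and such a closure breaks down.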
Contraceptive knowledge, attitudes, and practice in Russia during the 1980s.
Popov, A A; Visser, A P; Ketting, E
1993-01-01
In the former Soviet Union, there was a lack of valid and reliable social research on knowledge, attitudes, and practice of contraception. The few available studies have not been published outside the Soviet Union. This article reviews five surveys that were conducted in Moscow and two other cities (Saratov and Tartu) during the period 1976-84. In addition, some data from a large-scale survey conducted in 1990 and covering the entire former Soviet Union are presented. The surveys indicate that the rhythm method, condoms, vaginal douches, and withdrawal were the main contraceptive methods used; only 1 to 3 percent of the women interviewed were using oral contraceptives, and about 10 percent used intrauterine devices. The low prevalence of use of reliable modern methods may explain the high incidence of induced abortion in Russia. The chronic unavailability of reliable contraceptives is one of the main factors of poor family planning. Lack of knowledge and negative opinions about modern contraception also play an important role. Some possibilities for improving the family planning situation in Russia are discussed.
Optimizing high performance computing workflow for protein functional annotation.
Stanberry, Larissa; Rekepalli, Bhanu; Liu, Yuan; Giblock, Paul; Higdon, Roger; Montague, Elizabeth; Broomall, William; Kolker, Natali; Kolker, Eugene
2014-09-10
Functional annotation of newly sequenced genomes is one of the major challenges in modern biology. With modern sequencing technologies, the protein sequence universe is rapidly expanding. Newly sequenced bacterial genomes alone contain over 7.5 million proteins. The rate of data generation has far surpassed that of protein annotation. The volume of protein data makes manual curation infeasible, whereas a high compute cost limits the utility of existing automated approaches. In this work, we present an improved and optimized automated workflow to enable large-scale protein annotation. The workflow uses high performance computing architectures and a low complexity classification algorithm to assign proteins into existing clusters of orthologous groups of proteins. On the basis of the Position-Specific Iterative Basic Local Alignment Search Tool (PSI-BLAST), the algorithm ensures at least 80% specificity and sensitivity of the resulting classifications. The workflow utilizes highly scalable parallel applications for classification and sequence alignment. Using Extreme Science and Engineering Discovery Environment supercomputers, the workflow processed 1,200,000 newly sequenced bacterial proteins. With the rapid expansion of the protein sequence universe, the proposed workflow will enable scientists to annotate big genome data.
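A minimal sketch of the classification step described above, assuming a local NCBI BLAST+ installation, a COG-annotated protein database and a two-column subject-to-COG mapping file (all paths and the mapping format are hypothetical; the production workflow runs massively parallel implementations on supercomputers):

```python
import csv
import subprocess

def assign_cogs(query_fasta: str, cog_db: str, mapping_tsv: str) -> dict:
    """Assign each query protein to the COG of its best-scoring PSI-BLAST hit."""
    hits_path = "psiblast_hits.tsv"
    subprocess.run(
        ["psiblast", "-query", query_fasta, "-db", cog_db,
         "-num_iterations", "3", "-evalue", "1e-5",
         "-outfmt", "6 qseqid sseqid evalue bitscore", "-out", hits_path],
        check=True,
    )
    # Hypothetical mapping: database sequence id -> COG identifier.
    with open(mapping_tsv) as fh:
        subject_to_cog = dict(csv.reader(fh, delimiter="\t"))
    best = {}
    with open(hits_path) as fh:
        for qseqid, sseqid, _evalue, bitscore in csv.reader(fh, delimiter="\t"):
            if qseqid not in best or float(bitscore) > best[qseqid][1]:
                best[qseqid] = (subject_to_cog.get(sseqid, "unassigned"), float(bitscore))
    return {query: cog for query, (cog, _) in best.items()}
```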
NASA Astrophysics Data System (ADS)
Tulley-Cordova, C. L.; Bowen, G. J.
2017-12-01
A significant summertime feature of climate in the southwestern United States (US) is the North American monsoon (NAM), also known as the Mexican monsoon, Arizona monsoon, and the southwestern United States monsoon. NAM is a crucial contributor to total annual precipitation in the Four Corners region of the US. The modern NAM in this region has been poorly characterized using stable isotopes. This study characterizes the spatio-temporal changes of NAM based on stable isotopic results from 40 sites, located within the boundaries of the Navajo Nation, in Arizona, New Mexico, and Utah from 2014 to 2017. Samples were collected monthly at each site from May to October. Examination of temporal trends in precipitation revealed strong monthly and interannual changes; spatial analysis showed weak large-scale relationships across the study area. Analysis of stable isotopes in precipitation, surface, ground, and spring waters can be used to interpret the isotopic differences in the modern hydro-climate of the Navajo Nation and Colorado Plateau to help predict future hydro-climate changes and their implications for future water resources.
Optimizing high performance computing workflow for protein functional annotation
Stanberry, Larissa; Rekepalli, Bhanu; Liu, Yuan; Giblock, Paul; Higdon, Roger; Montague, Elizabeth; Broomall, William; Kolker, Natali; Kolker, Eugene
2014-01-01
Functional annotation of newly sequenced genomes is one of the major challenges in modern biology. With modern sequencing technologies, the protein sequence universe is rapidly expanding. Newly sequenced bacterial genomes alone contain over 7.5 million proteins. The rate of data generation has far surpassed that of protein annotation. The volume of protein data makes manual curation infeasible, whereas a high compute cost limits the utility of existing automated approaches. In this work, we present an improved and optimized automated workflow to enable large-scale protein annotation. The workflow uses high performance computing architectures and a low complexity classification algorithm to assign proteins into existing clusters of orthologous groups of proteins. On the basis of the Position-Specific Iterative Basic Local Alignment Search Tool (PSI-BLAST), the algorithm ensures at least 80% specificity and sensitivity of the resulting classifications. The workflow utilizes highly scalable parallel applications for classification and sequence alignment. Using Extreme Science and Engineering Discovery Environment supercomputers, the workflow processed 1,200,000 newly sequenced bacterial proteins. With the rapid expansion of the protein sequence universe, the proposed workflow will enable scientists to annotate big genome data. PMID:25313296
The deep human prehistory of global tropical forests and its relevance for modern conservation.
Roberts, Patrick; Hunt, Chris; Arroyo-Kalin, Manuel; Evans, Damian; Boivin, Nicole
2017-08-03
Significant human impacts on tropical forests have been considered the preserve of recent societies, linked to large-scale deforestation, extensive and intensive agriculture, resource mining, livestock grazing and urban settlement. Cumulative archaeological evidence now demonstrates, however, that Homo sapiens has actively manipulated tropical forest ecologies for at least 45,000 years. It is clear that these millennia of impacts need to be taken into account when studying and conserving tropical forest ecosystems today. Nevertheless, archaeology has so far provided only limited practical insight into contemporary human-tropical forest interactions. Here, we review significant archaeological evidence for the impacts of past hunter-gatherers, agriculturalists and urban settlements on global tropical forests. We compare the challenges faced, as well as the solutions adopted, by these groups with those confronting present-day societies, which also rely on tropical forests for a variety of ecosystem services. We emphasize archaeology's importance not only in promoting natural and cultural heritage in tropical forests, but also in taking an active role to inform modern conservation and policy-making.
NASA Astrophysics Data System (ADS)
Münzenberg, Gottfried; Geissel, Hans; Litvinov, Yuri A.
2010-04-01
This contribution is based on the combination of the talks: "What can we learn from large-scale mass measurements," "Present and future experiments with stored exotic nuclei at relativistic energies," and "Beta decay of highly-charged ions." Studying the nuclear mass surface gives information on the evolution of nuclear structure such as nuclear shells, the onset of deformation and the drip-lines. Previously, most of the masses far off stability have been obtained from decay data. Modern methods allow direct mass measurements. They are much more sensitive, down to single atoms, access short-lived species and have high accuracy. Large-scale explorations of the nuclear mass surface are ideally performed with the combination of the in-flight FRagment Separator FRS and the Experimental Storage Ring ESR. After a brief historic introduction, selected examples such as the evolution of shell closures far off stability and the proton-neutron interaction will be discussed in the framework of our data. Recently, the experiments have been extended and led to the discovery of new heavy neutron-rich isotopes along with their mass and lifetime measurements. Storage rings applied at relativistic energies are a unique tool to study the radioactive decay of bare or few-electron atomic nuclei. New features observed with the analysis of stored circulating mother and daughter ions, including oscillations in the decay curves of hydrogen-like nuclei, will be addressed. Future experiments with NUSTAR at FAIR will further extend our knowledge to the borderlines of nuclear existence.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cabral, Joana; Department of Psychiatry, University of Oxford, Oxford OX3 7JX; Fernandes, Henrique M.
The neuropathology of schizophrenia remains unclear. Some insight has come from modern neuroimaging techniques, which offer an unparalleled opportunity to explore in vivo the structure and function of the brain. Using functional magnetic resonance imaging, it has been found that the large-scale resting-state functional connectivity (rsFC) in schizophrenia — measured as the temporal correlations of the blood-oxygen-level-dependent (BOLD) signal — exhibit altered network topology, with lower small-world index. The origin of these rsFC alterations and link with the underlying structural connectivity remain unclear. In this work, we used a computational model of spontaneous large-scale brain activity to explore the role of the structural connectivity in the large-scale dynamics of the brain in health and schizophrenia. The structural connectomes from 15 adolescent patients with early-onset schizophrenia and 15 age- and gender-matched controls were built from diffusion tensor imaging data to detect the white matter tracts between 90 brain areas. Brain areas, simulated using a reduced dynamic mean-field model, receive excitatory input from other areas in proportion to the number of fibre tracts between them. The simulated mean field activity was transformed into BOLD signal, and the properties of the simulated functional networks were analyzed. Our results suggest that the functional alterations observed in schizophrenia are not directly linked to alterations in the structural topology. Instead, subtly randomized and less small-world functional networks appear when the brain operates with lower global coupling, which shifts the dynamics from the optimal healthy regime.
NASA Astrophysics Data System (ADS)
Cabral, Joana; Fernandes, Henrique M.; Van Hartevelt, Tim J.; James, Anthony C.; Kringelbach, Morten L.; Deco, Gustavo
2013-12-01
The neuropathology of schizophrenia remains unclear. Some insight has come from modern neuroimaging techniques, which offer an unparalleled opportunity to explore in vivo the structure and function of the brain. Using functional magnetic resonance imaging, it has been found that the large-scale resting-state functional connectivity (rsFC) in schizophrenia — measured as the temporal correlations of the blood-oxygen-level-dependent (BOLD) signal — exhibit altered network topology, with lower small-world index. The origin of these rsFC alterations and link with the underlying structural connectivity remain unclear. In this work, we used a computational model of spontaneous large-scale brain activity to explore the role of the structural connectivity in the large-scale dynamics of the brain in health and schizophrenia. The structural connectomes from 15 adolescent patients with early-onset schizophrenia and 15 age- and gender-matched controls were built from diffusion tensor imaging data to detect the white matter tracts between 90 brain areas. Brain areas, simulated using a reduced dynamic mean-field model, receive excitatory input from other areas in proportion to the number of fibre tracts between them. The simulated mean field activity was transformed into BOLD signal, and the properties of the simulated functional networks were analyzed. Our results suggest that the functional alterations observed in schizophrenia are not directly linked to alterations in the structural topology. Instead, subtly randomized and less small-world functional networks appear when the brain operates with lower global coupling, which shifts the dynamics from the optimal healthy regime.
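A toy illustration of the network measures involved, using simulated BOLD series and an arbitrary threshold (this is not the authors' model, data or parameter choices):

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
bold = rng.standard_normal((90, 240))        # 90 brain areas x 240 simulated BOLD samples

fc = np.corrcoef(bold)                        # resting-state functional connectivity matrix
np.fill_diagonal(fc, 0.0)
threshold = np.percentile(np.abs(fc), 85)     # keep roughly the strongest 15% of links
G = nx.from_numpy_array((np.abs(fc) >= threshold).astype(int))

if nx.is_connected(G):                        # graphs this dense are almost surely connected
    C, L = nx.average_clustering(G), nx.average_shortest_path_length(G)
    R = nx.gnm_random_graph(G.number_of_nodes(), G.number_of_edges(), seed=0)
    C_r, L_r = nx.average_clustering(R), nx.average_shortest_path_length(R)
    sigma = (C / C_r) / (L / L_r)             # small-world index; values well above 1 indicate small-worldness
    print(f"small-world index = {sigma:.2f}")
```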
Thermal runaway of metal nano-tips during intense electron emission
NASA Astrophysics Data System (ADS)
Kyritsakis, A.; Veske, M.; Eimre, K.; Zadin, V.; Djurabekova, F.
2018-06-01
When an electron emitting tip is subjected to very high electric fields, plasma forms even under ultra high vacuum conditions. This phenomenon, known as vacuum arc, causes catastrophic surface modifications and constitutes a major limiting factor not only for modern electron sources, but also for many large-scale applications such as particle accelerators, fusion reactors etc. Although vacuum arcs have been studied thoroughly, the physical mechanisms that lead from intense electron emission to plasma ignition are still unclear. In this article, we give insights into the atomic-scale processes taking place in metal nanotips under intense field emission conditions. We use multi-scale atomistic simulations that concurrently include field-induced forces, electron emission with finite-size and space-charge effects, Nottingham and Joule heating. We find that when a sufficiently high electric field is applied to the tip, the emission-generated heat partially melts it and the field-induced force elongates and sharpens it. This initiates a positive feedback thermal runaway process, which eventually causes evaporation of large fractions of the tip. The reported mechanism can explain the origin of neutral atoms necessary to initiate plasma, a missing key process required to explain the ignition of a vacuum arc. Our simulations provide a quantitative description of the conditions leading to runaway, which shall be valuable for both field emission applications and vacuum arc studies.
NASA Astrophysics Data System (ADS)
Ngan, Henry Y. T.; Yung, Nelson H. C.; Yeh, Anthony G. O.
2015-02-01
This paper presents a comparative study of outlier detection (OD) for large-scale traffic data. Traffic data nowadays are massive in scale and collected every second throughout any modern city. In this research, the traffic flow dynamics were collected from one of the busiest four-armed junctions in Hong Kong over a 31-day sampling period (764,027 vehicles in total). The traffic flow dynamics are expressed in a high-dimensional spatial-temporal (ST) signal format (i.e. 80 cycles) which has a high degree of similarity within the same signal and across different signals in one direction. A total of 19 traffic directions are identified at this junction and many ST signals were collected over the 31-day period (874 signals in total). In order to reduce their dimension, the ST signals first undergo a principal component analysis (PCA) to be represented as (x,y)-coordinates. These PCA (x,y)-coordinates are then assumed to be Gaussian distributed. Under this assumption, the data points are evaluated by (a) a correlation study with three variant coefficients, (b) a one-class support vector machine (SVM) and (c) kernel density estimation (KDE). The correlation study could not give any explicit OD result, while the one-class SVM and KDE provide average DSRs of 59.61% and 95.20%, respectively.
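A compact sketch of the comparison pipeline described here, implemented with scikit-learn on synthetic stand-in data (the actual signals, dimensionality-reduction settings and tuning in the study differ):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KernelDensity
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(1)
signals = rng.standard_normal((874, 80))           # 874 spatial-temporal signals, 80 cycles each

xy = PCA(n_components=2).fit_transform(signals)    # reduce each signal to (x, y)-coordinates

# One-class SVM: points outside the learned boundary are flagged as outliers (-1).
svm_flags = OneClassSVM(nu=0.05, kernel="rbf", gamma="scale").fit_predict(xy)

# Kernel density estimation: flag the lowest-density 5% of points as outliers.
log_density = KernelDensity(bandwidth=0.5).fit(xy).score_samples(xy)
kde_flags = log_density < np.percentile(log_density, 5)

print("SVM outliers:", int((svm_flags == -1).sum()), "| KDE outliers:", int(kde_flags.sum()))
```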
Heterologous laccase production and its role in industrial applications
Pezzella, Cinzia; Giardina, Paola; Faraco, Vincenza; Sannia, Giovanni
2010-01-01
Laccases are blue multicopper oxidases, catalyzing the oxidation of an array of aromatic substrates concomitantly with the reduction of molecular oxygen to water. These enzymes are implicated in a variety of biological activities. Most of the laccases studied thus far are of fungal origin. The large range of substrates oxidized by laccases has raised interest in using them within different industrial fields, such as pulp delignification, textile dye bleaching and bioremediation. Laccases secreted from native sources are usually not suitable for large-scale purposes, mainly due to low production yields and the high cost of preparation/purification procedures. Heterologous expression may provide higher enzyme yields and may permit the production of laccases with desired properties (such as different substrate specificities, or improved stabilities) for industrial applications. This review surveys research on heterologous laccase expression, focusing on the pivotal role played by recombinant systems towards the development of robust tools for greening modern industry. PMID:21327057
WebCIS: large scale deployment of a Web-based clinical information system.
Hripcsak, G; Cimino, J J; Sengupta, S
1999-01-01
WebCIS is a Web-based clinical information system. It sits atop the existing Columbia University clinical information system architecture, which includes a clinical repository, the Medical Entities Dictionary, an HL7 interface engine, and an Arden Syntax based clinical event monitor. WebCIS security features include authentication with secure tokens, authorization maintained in an LDAP server, SSL encryption, permanent audit logs, and application timeouts. WebCIS is currently used by 810 physicians at the Columbia-Presbyterian center of New York Presbyterian Healthcare to review and enter data into the electronic medical record. Current deployment challenges include maintaining adequate database performance despite complex queries, replacing large numbers of computers that cannot run modern Web browsers, and training users who have never logged onto the Web. Although the raised expectations and higher goals have increased deployment costs, the end result is a far more functional, far more available system.
Power feasibility of implantable digital spike sorting circuits for neural prosthetic systems.
Zumsteg, Zachary S; Kemere, Caleb; O'Driscoll, Stephen; Santhanam, Gopal; Ahmed, Rizwan E; Shenoy, Krishna V; Meng, Teresa H
2005-09-01
A new class of neural prosthetic systems aims to assist disabled patients by translating cortical neural activity into control signals for prosthetic devices. Based on the success of proof-of-concept systems in the laboratory, there is now considerable interest in increasing system performance and creating implantable electronics for use in clinical systems. A critical question that impacts system performance and the overall architecture of these systems is whether it is possible to identify the neural source of each action potential (spike sorting) in real-time and with low power. Low power is essential both for power supply considerations and heat dissipation in the brain. In this paper we report that state-of-the-art spike sorting algorithms are not only feasible using modern complementary metal-oxide-semiconductor (CMOS) very-large-scale integration (VLSI) processes, but may represent the best option for extracting large amounts of data in implantable neural prosthetic interfaces.
The future of medical diagnostics: large digitized databases.
Kerr, Wesley T; Lau, Edward P; Owens, Gwen E; Trefler, Aaron
2012-09-01
The electronic health record mandate within the American Recovery and Reinvestment Act of 2009 will have a far-reaching effect on medicine. In this article, we provide an in-depth analysis of how this mandate is expected to stimulate the production of large-scale, digitized databases of patient information. There is evidence to suggest that millions of patients and the National Institutes of Health will fully support the mining of such databases to better understand the process of diagnosing patients. This data mining likely will reaffirm and quantify known risk factors for many diagnoses. This quantification may be leveraged to further develop computer-aided diagnostic tools that weigh risk factors and provide decision support for health care providers. We expect that creation of these databases will stimulate the development of computer-aided diagnostic support tools that will become an integral part of modern medicine.
Using Browser Notebooks to Analyse Big Atmospheric Data-sets in the Cloud
NASA Astrophysics Data System (ADS)
Robinson, N.; Tomlinson, J.; Arribas, A.; Prudden, R.
2016-12-01
We present an account of our experience building an ecosystem for the analysis of big atmospheric data-sets. By using modern technologies we have developed a prototype platform which is scalable and capable of analysing very large atmospheric datasets. We tested different big-data ecosystems such as Hadoop MapReduce, Spark and Dask, in order to find the one best suited for analysis of multidimensional binary data such as NetCDF. We make extensive use of infrastructure-as-code and containerisation to provide a platform which is reusable, and which can scale to accommodate changes in demand. We make this platform readily accessible using browser-based notebooks. As a result, analysts with minimal technology experience can, in tens of lines of Python, make interactive data-visualisation web pages which can analyse very large amounts of data using cutting-edge big-data technology.
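A minimal example of the kind of notebook analysis such a platform targets, using xarray with dask-backed chunks (file paths and variable names are hypothetical):

```python
import xarray as xr

# Lazily open a multi-file NetCDF dataset; dask chunks keep memory use bounded.
ds = xr.open_mfdataset("/data/model_runs/temperature_*.nc",
                       combine="by_coords", chunks={"time": 120})

# A typical reduction: climatological monthly means over the full record.
monthly_mean = ds["air_temperature"].groupby("time.month").mean("time")

# Work is deferred until .compute(), when dask schedules it across the available workers.
print(monthly_mean.compute())
```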
A phylogenetic blueprint for a modern whale.
Gatesy, John; Geisler, Jonathan H; Chang, Joseph; Buell, Carl; Berta, Annalisa; Meredith, Robert W; Springer, Mark S; McGowen, Michael R
2013-02-01
The emergence of Cetacea in the Paleogene represents one of the most profound macroevolutionary transitions within Mammalia. The move from a terrestrial habitat to a committed aquatic lifestyle engendered wholesale changes in anatomy, physiology, and behavior. The results of this remarkable transformation are extant whales that include the largest, biggest brained, fastest swimming, loudest, deepest diving mammals, some of which can detect prey with a sophisticated echolocation system (Odontoceti - toothed whales), and others that batch feed using racks of baleen (Mysticeti - baleen whales). A broad-scale reconstruction of the evolutionary remodeling that culminated in extant cetaceans has not yet been based on integration of genomic and paleontological information. Here, we first place Cetacea relative to extant mammalian diversity, and assess the distribution of support among molecular datasets for relationships within Artiodactyla (even-toed ungulates, including Cetacea). We then merge trees derived from three large concatenations of molecular and fossil data to yield a composite hypothesis that encompasses many critical events in the evolutionary history of Cetacea. By combining diverse evidence, we infer a phylogenetic blueprint that outlines the stepwise evolutionary development of modern whales. This hypothesis represents a starting point for more detailed, comprehensive phylogenetic reconstructions in the future, and also highlights the synergistic interaction between modern (genomic) and traditional (morphological+paleontological) approaches that ultimately must be exploited to provide a rich understanding of evolutionary history across the entire tree of Life. Copyright © 2012 Elsevier Inc. All rights reserved.
Aiello, I.W.; Bekins, B.A.
2010-01-01
The recent discoveries of large, active populations of microbes in the subseafloor of the world's oceans support the impact of the deep biosphere biota on global biogeochemical cycles and raise important questions concerning the functioning of these extreme environments for life. These investigations demonstrated that subseafloor microbes are unevenly distributed and that cell abundances and metabolic activities are often independent from sediment depths, with increased prokaryotic activity at geochemical and/or sedimentary interfaces. In this study we demonstrate that microbial populations vary at the scale of individual beds in the biogenic oozes of a drill site in the eastern equatorial Pacific (Ocean Drilling Program Leg 201, Site 1226). We relate bedding-scale changes in biogenic ooze sediment composition to organic carbon (OC) and microbial cell concentrations using high-resolution color reflectance data as proxy for lithology. Our analyses demonstrate that microbial concentrations are an order of magnitude higher in the more organic-rich diatom oozes than in the nannofossil oozes. The variations mimic small-scale variations in diatom abundance and OC, indicating that the modern distribution of microbial biomass is ultimately controlled by Milankovitch-frequency variations in past oceanographic conditions. © 2010 Geological Society of America.
Mantle convection on modern supercomputers
NASA Astrophysics Data System (ADS)
Weismüller, Jens; Gmeiner, Björn; Mohr, Marcus; Waluga, Christian; Wohlmuth, Barbara; Rüde, Ulrich; Bunge, Hans-Peter
2015-04-01
Mantle convection is the cause for plate tectonics, the formation of mountains and oceans, and the main driving mechanism behind earthquakes. The convection process is modeled by a system of partial differential equations describing the conservation of mass, momentum and energy. Characteristic of mantle flow is the vast disparity of length scales from global to microscopic, turning mantle convection simulations into a challenging application for high-performance computing. As system size and technical complexity of the simulations continue to increase, design and implementation of simulation models for next-generation large-scale architectures demand an interdisciplinary co-design. Here we report on recent advances of the TERRA-NEO project, which is part of the high visibility SPPEXA program, and a joint effort of four research groups in computer sciences, mathematics and geophysical application under the leadership of FAU Erlangen. TERRA-NEO develops algorithms for future HPC infrastructures, focusing on high computational efficiency and resilience in next generation mantle convection models. We present software that can resolve the Earth's mantle with up to 10^12 grid points and scales efficiently to massively parallel hardware with more than 50,000 processors. We use our simulations to explore the dynamic regime of mantle convection, assessing the impact of small-scale processes on global mantle flow.
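The governing system mentioned here is, in its common non-dimensional Boussinesq form (a standard statement, not necessarily the exact TERRA-NEO formulation):

\[
\nabla\cdot\mathbf{u} = 0, \qquad
-\nabla p + \nabla\cdot\!\left[\eta\left(\nabla\mathbf{u} + \nabla\mathbf{u}^{\mathsf{T}}\right)\right] + \mathrm{Ra}\,T\,\hat{\mathbf{e}}_r = \mathbf{0}, \qquad
\frac{\partial T}{\partial t} + \mathbf{u}\cdot\nabla T = \nabla^2 T + H,
\]

where \(\mathbf{u}\) is velocity, \(p\) dynamic pressure, \(\eta\) viscosity, \(T\) temperature, \(H\) internal heating and \(\mathrm{Ra}\) the Rayleigh number; inertia is negligible at mantle viscosities, so the momentum balance reduces to a Stokes problem that must be solved at every time step, which is what drives the demand for extreme-scale solvers.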
Size variation in Middle Pleistocene humans.
Arsuaga, J L; Carretero, J M; Lorenzo, C; Gracia, A; Martínez, I; Bermúdez de Castro, J M; Carbonell, E
1997-08-22
It has been suggested that European Middle Pleistocene humans, Neandertals, and prehistoric modern humans had a greater sexual dimorphism than modern humans. Analysis of body size variation and cranial capacity variation in the large sample from the Sima de los Huesos site in Spain showed instead that the sexual dimorphism is comparable in Middle Pleistocene and modern populations.
Repeated large-scale retreat and advance of Totten Glacier indicated by inland bed erosion.
Aitken, A R A; Roberts, J L; van Ommen, T D; Young, D A; Golledge, N R; Greenbaum, J S; Blankenship, D D; Siegert, M J
2016-05-19
Climate variations cause ice sheets to retreat and advance, raising or lowering sea level by metres to decametres. The basic relationship is unambiguous, but the timing, magnitude and sources of sea-level change remain unclear; in particular, the contribution of the East Antarctic Ice Sheet (EAIS) is ill defined, restricting our appreciation of potential future change. Several lines of evidence suggest possible collapse of the Totten Glacier into interior basins during past warm periods, most notably the Pliocene epoch, causing several metres of sea-level rise. However, the structure and long-term evolution of the ice sheet in this region have been understood insufficiently to constrain past ice-sheet extents. Here we show that deep ice-sheet erosion (enough to expose basement rocks) has occurred in two regions: the head of the Totten Glacier, within 150 kilometres of today's grounding line; and deep within the Sabrina Subglacial Basin, 350-550 kilometres from this grounding line. Our results, based on ICECAP aerogeophysical data, demarcate the marginal zones of two distinct quasi-stable EAIS configurations, corresponding to the 'modern-scale' ice sheet (with a marginal zone near the present ice-sheet margin) and the retreated ice sheet (with the marginal zone located far inland). The transitional region of 200-250 kilometres in width is less eroded, suggesting shorter-lived exposure to eroding conditions during repeated retreat-advance events, which are probably driven by ocean-forced instabilities. Representative ice-sheet models indicate that the global sea-level increase resulting from retreat in this sector can be up to 0.9 metres in the modern-scale configuration, and exceeds 2 metres in the retreated configuration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koperna, George J.; Pashin, Jack; Walsh, Peter
The Commercial Scale Project is a US DOE/NETL funded initiative aimed at enhancing the knowledge base and industry’s ability to geologically store vast quantities of anthropogenic carbon. In support of this goal, a large-scale, stacked reservoir geologic model was developed for Gulf Coast sediments centered on the Citronelle Dome in southwest Alabama, the site of the SECARB Phase III Anthropogenic Test. Characterization of regional geology to construct the model consists of an assessment of the entire stratigraphic continuum at Citronelle Dome, from the surface to the depth of the Donovan oil-bearing formation. This project utilizes all available geologic data, which include: modern geophysical well logs from three new wells drilled for SECARB’s Anthropogenic Test; vintage logs from the Citronelle oilfield wells; porosity and permeability data from whole core and sidewall cores obtained from the injection and observation wells drilled for the Anthropogenic Test; core data obtained from the SECARB Phase II saline aquifer injection test; and regional core data for relevant formations from the Geological Survey of Alabama archives. Cross sections, isopach maps, and structure maps were developed to validate the geometry and architecture of the Citronelle Dome for building the model, and to assure that no major structural defects exist in the area. A synthetic neural network approach was used to predict porosity using the available SP and resistivity log data for the storage reservoir formations. These data are validated and applied to extrapolate porosity over the study area wells, and to interpolate permeability amongst these data points. Geostatistical assessments were conducted over the study area. In addition to geologic characterization of the region, a suite of core analyses was conducted to construct a depositional model and constrain caprock integrity. Petrographic assessment of core was conducted by OSU and analyzed to build a depositional framework for the region and provide modern-day analogues. Caprock stability over several test parameters was measured by UAB to yield comprehensive data on the long-term stability of caprocks. The detailed geologic model of the full earth volume from the surface through the Donovan oil reservoir is incorporated into a state-of-the-art reservoir simulation conducted by the University of Alabama at Birmingham (UAB) to explore optimization of CO2 injection and storage under different characterizations of reservoir flow properties. The application of scaled-up geologic modeling and reservoir simulation provides a proof of concept for large-scale volumetric modeling of CO2 injection and storage in the subsurface.
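A sketch of the porosity-prediction step described above, using a small feed-forward network in scikit-learn (column names and the file layout are hypothetical, and the project's actual neural-network implementation may differ):

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical calibration table: SP and resistivity log readings with core-measured porosity.
logs = pd.read_csv("citronelle_core_calibration.csv")   # assumed columns: SP, RES, POROSITY
X, y = logs[["SP", "RES"]], logs["POROSITY"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0),
)
model.fit(X_train, y_train)
print("held-out R^2:", model.score(X_test, y_test))

# The fitted model can then populate porosity in wells that have only vintage SP/resistivity logs.
```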
Study on data model of large-scale urban and rural integrated cadastre
NASA Astrophysics Data System (ADS)
Peng, Liangyong; Huang, Quanyi; Gao, Dequan
2008-10-01
Urban and Rural Integrated Cadastre (URIC) has been the subject of great interest in modern cadastre management. It is highly desirable to develop a rational data model for establishing an information system for URIC. In this paper, firstly, the old cadastral management mode in China is introduced, its limitations are analyzed, and the concept of URIC and its development course in China are described. Afterwards, based on the requirements of cadastre management in developed regions, the goal of URIC and two key ideas for realizing URIC are proposed. Then, a conceptual management mode is studied and a data model of URIC is designed. At last, based on the raw data of a land use survey at a scale of 1:1000 and a conventional urban cadastral survey at a scale of 1:500 in Jiangyin city, a well-defined information system for URIC was established according to the data model, and uniform management of land use, use rights, and landownership in urban and rural areas was successfully realized. Its feasibility and practicability were well proved.
Tracking the Recent and late Pleistocene Azores front by the distribution of planktic foraminifers
NASA Astrophysics Data System (ADS)
Schiebel, Ralf; Schmuker, Barbara; Alves, Mário; Hemleben, Christoph
2002-11-01
South of the Azores Islands, the population dynamics and sedimentation of planktic foraminifers are significantly influenced by the hydrography of the Azores Front Current System (AFCS). Planktic foraminifers collected from the water column during seasonal cruises across the Azores Front, record the temporal and spatial scale of hydrographic and faunal dynamics within this area. Surface sediment analysis reveals the presence of a large number of pteropod shells indicating preservation of aragonite and, therefore, little alteration of the calcitic foraminiferal tests. Consequently, most of the seasonal and spatial variability of the Azores Front is expected to be recorded by the planktic foraminiferal assemblages present within the surface sediment. In particular, Globorotalia scitula, a subsurface-dwelling species, decreases significantly in abundance to the south of the Azores Front, and shows fine-scale changes at the glacial/interglacial time scale. Enhanced faunal proportions of G. scitula in a sediment core that is located to the south of the modern Azores Current indicate a southward shift of the Azores Front Current System during the glacials and the presence of a transitional water mass at the Azores region.
Seismic cycle feedbacks in a mid-crustal shear zone
NASA Astrophysics Data System (ADS)
Melosh, Benjamin L.; Rowe, Christie D.; Gerbi, Christopher; Smit, Louis; Macey, Paul
2018-07-01
Mid-crustal fault rheology is controlled by alternating brittle and plastic deformation mechanisms, which cause feedback cycles that influence earthquake behavior. Detailed mapping and microstructural observations in the Pofadder Shear Zone (Namibia and South Africa) reveal a lithologically heterogeneous shear zone core with quartz-rich mylonites and ultramylonites, plastically overprinted pseudotachylyte and active shear folds. We present evidence for a positive feedback cycle in which coseismic grain size reduction facilitates active shear folding by enhancing competency contrasts and promoting crystal plastic flow. Shear folding strengthens a portion of a shear zone by limb rotation, focusing deformation and promoting plastic flow or brittle slip in resulting areas of localized high stress. Using quartz paleopiezometry, we estimate strain and slip rates consistent with other studies of exhumed shear zones and modern plate boundary faults, helping establish the Pofadder Shear Zone as an ancient analogue to modern, continental-scale, strike-slip faults. This feedback cycle influences seismicity patterns at the scale of study (tens of meters) and possibly larger scales as well, and contributes to bulk strengthening of the brittle-plastic transition on modern plate boundary faults.
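For context, recrystallized-grain-size paleopiezometry and the flow-law conversion it feeds have the general forms (generic expressions; the specific calibrations and constants used in the study are not reproduced here):

\[
\sigma = B\,d^{-p}, \qquad
\dot{\varepsilon} = A\,\sigma^{n}\exp\!\left(-\frac{Q}{RT}\right),
\]

where \(d\) is the dynamically recrystallized quartz grain size, \(\sigma\) the differential stress, and the flow-law parameters \(A\), \(n\), \(Q\) together with an assumed deformation temperature \(T\) convert stress to strain rate; a slip rate then follows from the strain rate integrated across the width of the actively deforming zone.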
Ecological selectivity of the emerging mass extinction in the oceans.
Payne, Jonathan L; Bush, Andrew M; Heim, Noel A; Knope, Matthew L; McCauley, Douglas J
2016-09-16
To better predict the ecological and evolutionary effects of the emerging biodiversity crisis in the modern oceans, we compared the association between extinction threat and ecological traits in modern marine animals to associations observed during past extinction events using a database of 2497 marine vertebrate and mollusc genera. We find that extinction threat in the modern oceans is strongly associated with large body size, whereas past extinction events were either nonselective or preferentially removed smaller-bodied taxa. Pelagic animals were victimized more than benthic animals during previous mass extinctions but are not preferentially threatened in the modern ocean. The differential importance of large-bodied animals to ecosystem function portends greater future ecological disruption than that caused by similar levels of taxonomic loss in past mass extinction events.
Custom large scale integrated circuits for spaceborne SAR processors
NASA Technical Reports Server (NTRS)
Tyree, V. C.
1978-01-01
The application of modern LSI technology to the development of a time-domain azimuth correlator for SAR processing is discussed. General design requirements for azimuth correlators for missions such as SEASAT-A, Venus orbital imaging radar (VOIR), and shuttle imaging radar (SIR) are summarized. Several azimuth correlator architectures that are suitable for implementation using custom LSI devices are described. Technical factors pertaining to selection of appropriate LSI technologies are discussed, and the maturity of alternative technologies for spacecraft applications is reported in the context of expected space mission launch dates. The preliminary design of a custom LSI time-domain azimuth correlator device (ACD) being developed for use in future SAR processors is detailed.
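To make the correlator's function concrete, the following minimal sketch (not the ACD hardware design described above) shows time-domain azimuth compression as a matched-filter correlation; the chirp parameters and data below are invented placeholders.

```python
import numpy as np

def azimuth_correlate(raw, reference):
    """Correlate each range line's azimuth samples with a reference chirp.

    raw: complex array of shape (n_range, n_azimuth); reference: complex chirp.
    """
    ref_matched = np.conj(reference[::-1])        # matched filter = time-reversed conjugate
    out = np.empty_like(raw)
    for i, line in enumerate(raw):
        out[i] = np.convolve(line, ref_matched, mode="same")
    return out

# Hypothetical linear-FM azimuth reference (the Doppler rate is a placeholder value).
n_az = 256
t = np.linspace(-0.5, 0.5, n_az)
reference = np.exp(1j * np.pi * 40.0 * t**2)
rng = np.random.default_rng(0)
raw = rng.normal(size=(64, n_az)) + 1j * rng.normal(size=(64, n_az))
focused = azimuth_correlate(raw, reference)
```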
Cognitive ergonomics of operational tools
NASA Astrophysics Data System (ADS)
Lüdeke, A.
2012-10-01
Control systems have become increasingly powerful over the past decades. The availability of high data throughput and sophisticated graphical interactions has opened a variety of new possibilities. But has this helped to provide intuitive, easy-to-use applications to simplify the operation of modern large-scale accelerator facilities? We will discuss what makes an application useful to operation and what is necessary to make a tool easy to use. We will show that even the implementation of a small number of simple application design rules can help to create ergonomic operational tools. The author is convinced that such tools do indeed help to achieve higher beam availability and better beam performance at accelerator facilities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duan, Sisi; Nicely, Lucas D; Zhang, Haibin
Modern large-scale networks require the ability to withstand arbitrary failures (i.e., Byzantine failures). Byzantine reliable broadcast algorithms can be used to reliably disseminate information in the presence of Byzantine failures. We design a novel Byzantine reliable broadcast protocol for loosely connected and synchronous networks. While previous such protocols all assume correct senders, our protocol is the first to handle Byzantine senders. To achieve this goal, we have developed new techniques for fault detection and fault tolerance. Our protocol is efficient, and under normal circumstances, no expensive public-key cryptographic operations are used. We implement and evaluate our protocol, demonstrating that our protocol has high throughput and is superior to the existing protocols in uncivil executions.
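The abstract does not spell out the protocol's message rules, so the sketch below shows only the classic Bracha-style quorum logic that Byzantine reliable broadcast protocols build on (an assumption, not the authors' design): with n replicas and at most f Byzantine faults (n >= 3f + 1), a replica sends READY after collecting 2f + 1 ECHOes or f + 1 READYs, and delivers after 2f + 1 READYs.

```python
# Minimal per-replica quorum bookkeeping for Bracha-style reliable broadcast
# (illustrative only; not the protocol proposed in the paper above).
class BrachaState:
    def __init__(self, n, f):
        assert n >= 3 * f + 1
        self.n, self.f = n, f
        self.echoes, self.readys = set(), set()
        self.sent_ready = self.delivered = False

    def on_echo(self, sender):
        self.echoes.add(sender)
        return self._maybe_ready()

    def on_ready(self, sender):
        self.readys.add(sender)
        if not self.delivered and len(self.readys) >= 2 * self.f + 1:
            self.delivered = True                 # deliver after 2f + 1 READYs
        return self._maybe_ready()

    def _maybe_ready(self):
        # Amplification: f + 1 READYs also trigger READY, ensuring totality.
        if not self.sent_ready and (len(self.echoes) >= 2 * self.f + 1
                                    or len(self.readys) >= self.f + 1):
            self.sent_ready = True
            return "READY"
        return None
```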
General Catalogue of Variable Stars: Current Status and New Name-Lists
NASA Astrophysics Data System (ADS)
Samus, N. N.; Kazarovets, E. V.; Kireeva, N. N.; Pastukhova, E. N.; Durlevich, O. V.
2010-12-01
A short history of variable-star catalogs is presented. After the Second World War, the International Astronomical Union asked astronomers of the Soviet Union to become responsible for variable-star catalogs. Currently, the catalog is kept electronically and is a joint project of the Institute of Astronomy (Russian Academy of Sciences) and the Sternberg Astronomical Institute (Moscow University). We review recent trends in the field of variable-star catalogs, discuss problems and new prospects related to modern large-scale automatic photometric sky surveys, outline the subject of discussions on the future of the variable-star catalogs in the profile commissions of the IAU, and call for suggestions from the astronomical community.
Optical activity of helical quantum-dot supercrystals
NASA Astrophysics Data System (ADS)
Baimuratov, A. S.; Tepliakov, N. V.; Gun'ko, Yu. K.; Baranov, A. V.; Federov, A. V.; Rukhlenko, I. D.
2017-01-01
The size of chiral nanoparticles is much smaller than the optical wavelength. As a result, the difference in the interaction of enantiomers with circularly polarized light of different handedness is practically unobservable. Because of this large mismatch in scale, enhancing the enantioselectivity of the optical properties of nanoparticles is a particularly important problem for modern photonics. In this work, we show that ordering achiral nanoparticles into a chiral supercrystal with dimensions comparable to the wavelength of light allows nearly total dissymmetry of optical absorption to be achieved, and we demonstrate this using a helical supercrystal made of semiconductor quantum dots as an example. The proposed approach may find numerous applications in various optical and analytical methods used in biomedicine, chemistry, and pharmacology.
Sleator, Roy D
2011-04-01
The recent rapid expansion in the DNA and protein databases, arising from large-scale genomic and metagenomic sequence projects, has forced significant development in the field of phylogenetics: the study of the evolutionary relatedness of the planet's inhabitants. Advances in phylogenetic analysis have greatly transformed our view of the landscape of evolutionary biology, transcending the view of the tree of life that has shaped evolutionary theory since Darwinian times. Indeed, modern phylogenetic analysis no longer focuses on the restricted Darwinian-Mendelian model of vertical gene transfer, but must also consider the significant degree of lateral gene transfer, which connects and shapes almost all living things. Herein, I review the major tree-building methods, their strengths, weaknesses and future prospects.
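As a toy illustration of one class of tree-building methods such reviews survey (distance-based clustering), the snippet below builds a UPGMA-style tree with SciPy from an invented pairwise-distance matrix; the taxa and distances are placeholders, not data from the review.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage
from scipy.spatial.distance import squareform

taxa = ["taxonA", "taxonB", "taxonC", "taxonD"]          # hypothetical taxa
dist = np.array([[0.00, 0.10, 0.40, 0.45],
                 [0.10, 0.00, 0.42, 0.47],
                 [0.40, 0.42, 0.00, 0.15],
                 [0.45, 0.47, 0.15, 0.00]])

# UPGMA = average-linkage agglomerative clustering on the condensed distances.
tree = linkage(squareform(dist), method="average")
print(tree)   # each row: the two clusters merged, their distance, and the new cluster size
```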
Particle Substructure. A Common Theme of Discovery in this Century
DOE R&D Accomplishments Database
Panofsky, W. K. H.
1984-02-01
Some examples of modern developments in particle physics are given which demonstrate that the fundamental rules of quantum mechanics, applied to all forces in nature as they became understood, have retained their validity. The well-established laws of electricity and magnetism, reformulated in terms of quantum mechanics, have exhibited a truly remarkable numerical agreement between theory and experiment over an enormous range of observation. As experimental techniques have grown from the top of a laboratory bench to the large accelerators of today, the basic components of experimentation have changed vastly in scale but only little in basic function. More important, the motivation of those engaged in this type of experimentation has hardly changed at all.
ERIC Educational Resources Information Center
Hill, Lilian H.
Five books, representing a small selection of possible readings on necessary changes of the human mind, point to a convergence of interest from different fields of study toward the need for modern society to develop the capacity to respond to the complexity of modern life and the newly acquired ability to destroy life on an unprecedented scale.…
Large-scale wind turbine structures
NASA Technical Reports Server (NTRS)
Spera, David A.
1988-01-01
The purpose of this presentation is to show how structural technology was applied in the design of modern wind turbines, which were recently brought to an advanced stage of development as sources of renewable power. Wind turbine structures present many difficult problems because they are relatively slender and flexible; subject to vibration and aeroelastic instabilities; acted upon by loads which are often nondeterministic; operated continuously with little maintenance in all weather; and dominated by life-cycle cost considerations. Progress in horizontal-axis wind turbine (HAWT) development was paced by progress in the understanding of structural loads, modeling of structural dynamic response, and designing of innovative structural response. During the past 15 years a series of large HAWTs was developed. This has culminated in the recent completion of the world's largest operating wind turbine, the 3.2 MW Mod-5B power plant installed on the island of Oahu, Hawaii. Some of the applications of structures technology to wind turbines will be illustrated by referring to the Mod-5B design. First, a video overview will be presented to provide familiarization with the Mod-5B project and the important components of the wind turbine system. Next, the structural requirements for large-scale wind turbines will be discussed, emphasizing the difficult fatigue-life requirements. Finally, the procedures used to design the structure will be presented, including the use of the fracture mechanics approach for determining allowable fatigue stresses.
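As a rough illustration of the fracture-mechanics idea mentioned for fatigue design (not the Mod-5B procedure, and with invented material constants rather than actual design values), the sketch below integrates a Paris-law crack-growth relation, da/dN = C (ΔK)^m with ΔK = Y Δσ √(πa), to estimate the number of cycles for a crack to grow between two assumed sizes.

```python
import numpy as np

# All values are illustrative placeholders, not Mod-5B design data.
C, m, Y = 1.0e-11, 3.0, 1.1          # Paris constants (m/cycle per (MPa*sqrt(m))^m) and geometry factor
delta_sigma = 80.0                    # cyclic stress range, MPa
a0, a_crit = 1.0e-3, 25.0e-3          # initial and critical crack depths, m

a = np.linspace(a0, a_crit, 20000)
delta_K = Y * delta_sigma * np.sqrt(np.pi * a)            # stress-intensity range, MPa*sqrt(m)
da = a[1] - a[0]
cycles = np.sum(1.0 / (C * delta_K**m)) * da              # N = integral of da / (C * dK^m)
print(f"estimated cycles to grow from {a0*1e3:.0f} mm to {a_crit*1e3:.0f} mm: {cycles:.2e}")
```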
Ganchoon, Filipinas; Bugho, Rommel; Calina, Liezel; Dy, Rochelle; Gosney, James
2017-06-09
Physiatrists have provided humanitarian assistance in recent large-scale global natural disasters. Super Typhoon Haiyan, the deadliest and most costly typhoon in modern Philippine history, made landfall on 8 November 2013, resulting in significant humanitarian needs. Philippine Academy of Rehabilitation Medicine (PARM) physiatrists conducted a project of 23 emergency basic relief and medical aid missions in response to Super Typhoon Haiyan from November 2013 to February 2014. The final mission was a medical aid mission to the inland rural community of Burauen, Leyte. Summary data were collected, collated, and tabulated; project and mission evaluation was performed. During the humanitarian assistance project, 31,254 basic relief kits containing a variety of food and non-food items were distributed, and medical services including consultation, treatment, and medicines were provided to 7255 patients. Of the 144 conditions evaluated in the medical aid mission to Burauen, Leyte, 85 (59%) were physical and rehabilitation medicine conditions, comprising musculoskeletal (62 [73%]), neurological (17 [20%]), and dermatological (6 [7%]) diagnoses. Post-mission and project analysis resulted in recommendations and programmatic changes to strengthen response in future disasters. Physiatrists functioned as medical providers, mission team leaders, community advocates, and in other roles. This physiatrist-led humanitarian assistance project met critical basic relief and medical aid needs of persons impacted by Super Typhoon Haiyan, demonstrating the significant roles performed by physiatrists in response to a large-scale natural disaster. The resulting disaster programming changes and recommendations may inform a more effective response by PARM mission teams in the Philippines, as well as by other South-East Asian teams of rehabilitation professionals, to large-scale, regional natural disasters. Implications for rehabilitation: Large-scale natural disasters including tropical cyclones can have a catastrophic impact on the affected population. In response to Super Typhoon Haiyan, physiatrists representing the Philippine Academy of Rehabilitation Medicine conducted a project of 23 emergency basic relief and medical aid missions from November 2013 to February 2014. Project analysis indicates that medical mission teams responding in similar settings may expect to evaluate a significant number of physical medicine and rehabilitation conditions. Medical rehabilitation with participation by rehabilitation professionals, including rehabilitation doctors, is essential to the emergency medical response in large-scale natural disasters.
From dinosaurs to modern bird diversity: extending the time scale of adaptive radiation.
Moen, Daniel; Morlon, Hélène
2014-05-01
What explains why some groups of organisms, like birds, are so species rich? And what explains their extraordinary ecological diversity, ranging from large, flightless birds to small migratory species that fly thousands of kilometers every year? These and similar questions have spurred great interest in adaptive radiation, the diversification of ecological traits in a rapidly speciating group of organisms. Although the initial formulation of modern concepts of adaptive radiation arose from consideration of the fossil record, rigorous attempts to identify adaptive radiation in the fossil record are still uncommon. Moreover, most studies of adaptive radiation concern groups that are less than 50 million years old. Thus, it is unclear how important adaptive radiation is over temporal scales that span much larger portions of the history of life. In this issue, Benson et al. test the idea of a "deep-time" adaptive radiation in dinosaurs, compiling and using one of the most comprehensive phylogenetic and body-size datasets for fossils. Using recent phylogenetic statistical methods, they find that in most clades of dinosaurs there is a strong signal of an "early burst" in body-size evolution, a predicted pattern of adaptive radiation in which rapid trait evolution happens early in a group's history and then slows down. They also find that body-size evolution did not slow down in the lineage leading to birds, hinting at why birds survived to the present day and diversified. This paper represents one of the most convincing attempts at understanding deep-time adaptive radiations.
Modern Paradigm of Star Formation in the Galaxy
NASA Astrophysics Data System (ADS)
Sobolev, A. M.
2017-06-01
The scientific community's understanding of star formation processes in the Galaxy has undergone significant changes in recent years. This is largely due to the development of the observational basis of astronomy in the infrared and submillimeter ranges. Analysis of new observational data obtained in the course of the Herschel project, with the ALMA radio interferometer, and at other modern facilities has significantly advanced our understanding of the structure of star-forming regions and the vicinities of young stellar objects, and has provided comprehensive data on the mass function of proto-stellar objects in a number of star-forming complexes of the Galaxy. Mapping of the complexes in molecular radio lines has made it possible to study their spatial and kinematic structure on scales of tens and hundreds of parsecs. The next breakthrough in this field can be achieved as a result of the planned "Spektr-MM" (Millimetron) project, which implies a significant improvement in angular resolution and sensitivity. The use of sensitive interferometers has allowed investigation of the details of star formation processes at small spatial scales - down to the size of the solar system (with the help of ALMA) and even of the Sun (in the course of the "Spektr-R" = RadioAstron space project). A significant contribution to the study of accretion processes is expected from the "Spektr-UV" project (WSO-UV = "World Space Observatory - Ultraviolet"). Complemented by significant theoretical achievements, the observational data obtained have greatly advanced our understanding of star formation processes.
Workflows for Full Waveform Inversions
NASA Astrophysics Data System (ADS)
Boehm, Christian; Krischer, Lion; Afanasiev, Michael; van Driel, Martin; May, Dave A.; Rietmann, Max; Fichtner, Andreas
2017-04-01
Despite many theoretical advances and the increasing availability of high-performance computing clusters, full seismic waveform inversions still face considerable challenges regarding data and workflow management. While the community has access to solvers which can harness modern heterogeneous computing architectures, the computational bottleneck has fallen to these often manpower-bounded issues that need to be overcome to facilitate further progress. Modern inversions involve huge amounts of data and require a tight integration between numerical PDE solvers, data acquisition and processing systems, nonlinear optimization libraries, and job orchestration frameworks. To this end we created a set of libraries and applications revolving around Salvus (http://salvus.io), a novel software package designed to solve large-scale full waveform inverse problems. This presentation focuses on solving passive source seismic full waveform inversions from local to global scales with Salvus. We discuss (i) design choices for the aforementioned components required for full waveform modeling and inversion, (ii) their implementation in the Salvus framework, and (iii) how it is all tied together by a usable workflow system. We combine state-of-the-art algorithms ranging from high-order finite-element solutions of the wave equation to quasi-Newton optimization algorithms using trust-region methods that can handle inexact derivatives. All is steered by an automated interactive graph-based workflow framework capable of orchestrating all necessary pieces. This naturally facilitates the creation of new Earth models and hopefully sparks new scientific insights. Additionally, and even more importantly, it enhances reproducibility and reliability of the final results.
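As a minimal stand-in for the inversion driver (not Salvus code, and with a trivial linear "forward solver" in place of a wave-equation simulation and adjoint-derived gradient), the following sketch minimizes a least-squares waveform misfit with a quasi-Newton method from SciPy.

```python
import numpy as np
from scipy.optimize import minimize

# Toy misfit minimization: G is a placeholder forward operator; in a real FWI the
# forward problem is a wave simulation and the gradient comes from adjoint modeling.
rng = np.random.default_rng(0)
G = rng.normal(size=(200, 30))
m_true = rng.normal(size=30)
d_obs = G @ m_true                    # synthetic "observed" data

def misfit_and_grad(m):
    r = G @ m - d_obs
    return 0.5 * float(r @ r), G.T @ r   # misfit value and its gradient

result = minimize(misfit_and_grad, np.zeros(30), jac=True, method="L-BFGS-B")
print("converged:", result.success, "final misfit:", result.fun)
```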
NASA Astrophysics Data System (ADS)
Massa, C.; Beilman, D. W.; Nichols, J. E.; Elison Timm, O.
2016-12-01
Holocene peat deposits from the Hawaiian Islands provide a unique opportunity to resolve millennial- to centennial-scale climate variability over the central Pacific region, where data remain scarce. Because both extratropical and tropical modes of climate variability have a strong influence on modern rainfall over the archipelago, hydroclimate proxies from peat would provide valuable information about past Pacific climate changes. The few terrestrial records studied, based on pollen or leaf wax biomarkers, showed evidence for substantial vegetation changes that have been linked to a drying trend over the Holocene. Leaf wax n-alkanes, as well as their stable isotopic compositions (δ13C and δD), are indeed increasingly used to reconstruct past hydroclimate conditions. The interpretation of n-alkanes as biomarkers requires, however, a thorough knowledge of their distribution in modern plants that contribute to sediments, but in Hawaii the modern vegetation is understudied compared to proxy applications. Here we report results from a preliminary investigation of n-alkane distributions in dominant modern plant litter collected at a bog site at the summit of the Waianae mountains on the Island of Oahu. We compared n-alkane distributions among species and plant groups in order to test whether taxa or plant functional types (mosses, ferns, woody plants, and sedges) can be discriminated from their n-alkane profiles. Results showed that general plant groups were difficult to distinguish based on individual n-alkane abundances, chain lengths, or ratios. At the species level, the sedge Machaerina augustifolia was largely dominated by n-C29 (~60%), suggesting some chain lengths could be useful as proxies for identifying the contribution of sedges to sedimentary records. Woody plant average chain length was highly variable but overall was not shorter (indeed slightly longer) than that of other terrestrial plants, as is often assumed. A sedimentary profile from this site shows variation and an overall decrease in n-alkane chain length over the Holocene, but patterns across common modern plants suggest that caution should be exercised when ascribing n-alkane distribution parameters to a specific group of tropical vegetation.
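One common summary of such n-alkane profiles is the abundance-weighted average chain length, ACL = Σ(i·C_i)/Σ C_i; the snippet below computes it for an invented, C29-dominated distribution loosely resembling the sedge profile described above (the abundances are illustrative, not the Waianae data).

```python
import numpy as np

chain_lengths = np.arange(23, 34)        # n-C23 ... n-C33
# Hypothetical relative abundances (sums to 100); n-C29 dominates at ~60%.
abundance = np.array([1, 2, 3, 4, 6, 10, 60, 5, 5, 2, 2], dtype=float)

acl = np.sum(chain_lengths * abundance) / np.sum(abundance)   # abundance-weighted mean chain length
frac_c29 = abundance[chain_lengths == 29][0] / abundance.sum()
print(f"ACL = {acl:.2f}, n-C29 fraction = {frac_c29:.0%}")
```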
Modern developments for ground-based monitoring of fire behavior and effects
Colin C. Hardy; Robert Kremens; Matthew B. Dickinson
2010-01-01
Advances in electronic technology over the last several decades have been staggering. The cost of electronics continues to decrease while system performance increases seemingly without limit. We have applied modern techniques in sensors, electronics, and instrumentation to create a suite of ground-based diagnostics that can be used in laboratory (~1 m2), field scale...
Language and Content in the Modern Foreign Languages Degree: A Students' Perspective
ERIC Educational Resources Information Center
Gieve, Simon; Cunico, Sonia
2012-01-01
This paper reports on a small-scale qualitative study of students' experience of their Modern Foreign Languages (MFL) degrees with particular regard to the relationship between language and content learning. It is framed by the identification in the recent Worton Report on MFL studies in UK higher education and elsewhere of a dualism between…
Parental Modernity in Childrearing and Educational Attitudes and Beliefs.
ERIC Educational Resources Information Center
Schaefer, Earl S.; Edgerton, Marianna
The development and validation of a brief scale of parental modernity in child rearing and educational attitudes and beliefs are reported. Three samples (A, B and C) of mothers and their children varying in number, race, socioeconomic status (SES), risk for educational failure, and, in one sample, age participated in the study. Various measures,…
Seismic monitoring at Cascade Volcanic Centers, 2004?status and recommendations
Moran, Seth C.
2004-01-01
The purpose of this report is to assess the current (May 2004) status of seismic monitoring networks at the 13 major Cascade volcanic centers. Included in this assessment are descriptions of each network, analyses of the ability of each network to detect and to locate seismic activity, identification of specific weaknesses in each network, and a prioritized list of those networks that are most in need of additional seismic stations. At the outset it should be recognized that no Cascade volcanic center currently has an adequate seismic network relative to modern-day networks at Usu Volcano (Japan) or Etna and Stromboli volcanoes (Italy). For a system the size of Three Sisters, for example, a modern-day, cutting-edge seismic network would ideally consist of a minimum of 10 to 12 short-period three-component seismometers (for determining particle motions, reliable S-wave picks, moment tensor inversions, fault-plane solutions, and other important seismic parameters) and 7 to 10 broadband sensors (which, amongst other considerations, enable detection and location of very long period (VLP) and other low-frequency events, moment tensor inversions, and, because of their wide dynamic range, on-scale recording of large-amplitude events). Such a dense, multi-component seismic network would give the ability to, for example, detect in near-real-time earthquake migrations over a distance of ~0.5 km or less, locate tremor sources, determine the nature of a seismic source (that is, pure shear, implosive, explosive), provide on-scale recordings of very small and very large-amplitude seismic signals, and detect localized changes in seismic stress tensor orientations caused by movement of magma bodies. However, given that programmatic resources are currently limited, installation of such networks at this time is unrealistic. Instead, this report focuses on identifying what additional stations are needed to guarantee that anomalous seismicity associated with volcanic unrest will be detected in a timely manner and, in the case of magnitude ≥ 1 earthquakes, reliably located.
Growth of modern branched columnar stromatolites in Lake Joyce, Antarctica.
Mackey, T J; Sumner, D Y; Hawes, I; Jungblut, A D; Andersen, D T
2015-07-01
Modern decimeter-scale columnar stromatolites from Lake Joyce, Antarctica, show a change in branching pattern during a period of lake level rise. Branching patterns correspond to a change in cyanobacterial community composition as preserved in authigenic calcite crystals. The transition in stromatolite morphology is preserved by mineralized layers that contain microfossils and cylindrical molds of cyanobacterial filaments. The molds are composed of two populations with different diameters. Large diameter molds (>2.8 μm) are abundant in calcite forming the oldest stromatolite layers, but are absent from younger layers. In contrast, <2.3 μm diameter molds are common in all stromatolite layers. Loss of large diameter molds corresponds to the transition from smooth-sided stromatolitic columns to branched and irregular columns. Mold diameters are similar to trichome diameters of the four most abundant living cyanobacteria morphotypes in Lake Joyce: Phormidium autumnale morphotypes have trichome diameters >3.5 μm, whereas Leptolyngbya antarctica, L. fragilis, and Pseudanabaena frigida morphotypes have diameters <2.3 μm. P. autumnale morphotypes were only common in mats at <12 m depth. Mats containing abundant P. autumnale morphotypes were smooth, whereas mats with few P. autumnale morphotypes contained small peaks and protruding bundles of filaments, suggesting that the absence of P. autumnale morphotypes allowed small-scale topography to develop on mats. Comparisons of living filaments and mold diameters suggest that P. autumnale morphotypes were present early in stromatolite growth, but disappeared from the community through time. We hypothesize that the mat-smoothing behavior of P. autumnale morphotypes inhibited nucleation of stromatolite branches. When P. autumnale morphotypes were excluded from the community, potentially reflecting a rise in lake level, short-wavelength roughness provided nuclei for stromatolite branches. This growth history provides a conceptual model for initiation of branched stromatolite growth resulting from a change in microbial community composition.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wong, Michael K.; Davidson, Megan
As part of Sandia's nuclear deterrence mission, the B61-12 Life Extension Program (LEP) aims to modernize the aging weapon system. Modernization requires requalification, and Sandia is using high performance computing to perform advanced computational simulations to better understand, evaluate, and verify weapon system performance in conjunction with limited physical testing. The Nose Bomb Subassembly (NBSA) of the B61-12 is responsible for producing a fuzing signal upon ground impact. The fuzing signal is dependent upon electromechanical impact sensors producing valid electrical fuzing signals at impact. Computer-generated models were used to assess the timing between the impact sensor's response to the deceleration of impact and damage to major components and system subassemblies. The modeling and simulation team worked alongside the physical test team to design a large-scale reverse ballistic test to not only assess system performance, but also to validate their computational models. The reverse ballistic test conducted at Sandia's sled test facility sent a rocket sled with a representative target into a stationary B61-12 (NBSA) to characterize the nose crush and functional response of NBSA components. Data obtained from data recorders and high-speed photometrics were integrated with previously generated computer models in order to refine and validate the model's ability to reliably simulate real-world effects. Large-scale tests are impractical to conduct for every single impact scenario. By creating reliable computer models, we can perform simulations that identify trends and produce estimates of outcomes over the entire range of required impact conditions. Sandia's HPCs enable geometric resolution that was unachievable before, allowing for more fidelity and detail, and creating simulations that can provide insight to support evaluation of requirements and performance margins. As computing resources continue to improve, researchers at Sandia are hoping to improve these simulations so they provide increasingly credible analysis of the system response and performance over the full range of conditions.
Technology and human purpose: the problem of solids transport on the earth's surface
NASA Astrophysics Data System (ADS)
Haff, P. K.
2012-05-01
Displacement of mass of limited deformability ("solids") on the Earth's surface is opposed by friction and (the analog of) form resistance - impediments relaxed by rotational motion, self-powering of mass units, and transport infrastructure. These features of solids transport first evolved in the biosphere prior to the emergence of technology, allowing slope-independent, diffusion-like motion of discrete objects as massive as several tons, as illustrated by animal foraging and movement along game trails. However, high-energy-consumption technology powered by fossil fuels required a mechanism that could support advective transport of solids, i.e., long-distance, high-volume, high-speed, unidirectional, slope independent transport across the land surface of materials like coal, containerized fluids, and minerals. Pre-technology nature was able to sustain large-scale, long-distance solids advection only in the limited form of piggybacking on geophysical flows of water (river sediment) and air (dust). The appearance of a generalized mechanism for advection of solids independent of fluid flows and gravity appeared only upon the emergence of human purpose. Purpose enables solids advection by, in effect, enabling a simulated continuous potential gradient, otherwise lacking, between discrete and widely separated fossil-fuel energy sources and sinks. Invoking purpose as a mechanism in solids advection is an example of the need to import anthropic principles and concepts into the language and methodology of modern Earth system dynamics. As part of the emergence of a generalized solids advection mechanism, several additional transport requirements necessary to the function of modern large-scale technological systems were also satisfied. These include spatially accurate delivery of advected payload, targetability to essentially arbitrarily located destinations (such as cities), and independence of structure of advected payload from transport mechanism. The latter property enables the transport of an onboard power supply and delivery of persistent-memory, high-information-content payload, such as technological artifacts ("parts").
End-effects-regime in full scale and lab scale rocket nozzles
NASA Astrophysics Data System (ADS)
Rojo, Raymundo; Tinney, Charles; Baars, Woutijn; Ruf, Joseph
2014-11-01
Modern rockets utilize a thrust-optimized parabolic-contour design for their nozzles because of its high performance and reliability. However, the evolving internal flow structures within these high-area-ratio rocket nozzles during start-up generate powerful vibro-acoustic loads that act on the launch vehicle. Modern rockets must be designed to accommodate these heavy loads or risk catastrophic failure. This study quantifies a particular moment referred to as the "end-effects regime," the largest source of vibro-acoustic loading during start-up [Nave & Coffey, AIAA Paper 1973-1284]. Measurements from full-scale ignitions are compared with aerodynamically scaled representations in a fully anechoic chamber. Laboratory-scale data are then matched with both static and dynamic wall pressure measurements to capture the associated shock structures within the nozzle. The event generated during the "end-effects regime" was successfully reproduced in the lab-scale models and was characterized in terms of its mean, variance, and skewness, as well as the spectral properties of the signal obtained by way of time-frequency analyses.
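A minimal sketch of the signal characterization described above, computing the mean, variance, skewness, and a time-frequency (spectrogram) view with SciPy; the synthetic trace below merely stands in for the measured wall-pressure data, and the sample rate is a placeholder.

```python
import numpy as np
from scipy.signal import spectrogram
from scipy.stats import skew

fs = 50_000.0                                # sample rate, Hz (placeholder)
t = np.arange(0, 1.0, 1.0 / fs)
rng = np.random.default_rng(1)
p_wall = rng.normal(size=t.size) + 0.5 * np.sin(2 * np.pi * 800 * t)   # noise plus a tone

stats = {"mean": p_wall.mean(), "variance": p_wall.var(), "skewness": skew(p_wall)}
f, tau, Sxx = spectrogram(p_wall, fs=fs, nperseg=2048, noverlap=1024)   # time-frequency view
print(stats, "spectrogram shape:", Sxx.shape)
```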
Unusually large tsunamis frequent a currently creeping part of the Aleutian megathrust
Witter, Robert C.; Carver, G.A.; Briggs, Richard; Gelfenbaum, Guy R.; Koehler, R.D.; La Selle, SeanPaul M.; Bender, Adrian M.; Engelhart, S.E.; Hemphill-Haley, E.; Hill, Troy D.
2016-01-01
Current models used to assess earthquake and tsunami hazards are inadequate where creep dominates a subduction megathrust. Here we report geological evidence for large tsunamis, occurring on average every 300–340 years, near the source areas of the 1946 and 1957 Aleutian tsunamis. These areas bookend a postulated seismic gap over 200 km long where modern geodetic measurements indicate that the megathrust is currently creeping. At Sedanka Island, evidence for large tsunamis includes six sand sheets that blanket a lowland facing the Pacific Ocean, rise to 15 m above mean sea level, contain marine diatoms, cap terraces, adjoin evidence for scour, and date from the past 1700 years. The youngest sheet, and modern drift logs found as far as 800 m inland and >18 m elevation, likely record the 1957 tsunami. Modern creep on the megathrust coexists with previously unrecognized tsunami sources along this part of the Aleutian Subduction Zone.
NASA Astrophysics Data System (ADS)
Kotowski, A. J.; Behr, W. M.; Tong, X.; Lavier, L.
2017-12-01
The rheology of the deep subduction interface strongly influences the occurrence, recurrence, and migration of episodic tremor and slow slip (ETS) events. To better understand the environment of deep ETS, we characterize the length scales and types of rheological heterogeneities that decorate the deep interface using an exhumed subduction complex. The Cycladic Blueschist Unit (CBU) on Syros, Greece, records Eocene subduction to 60 km, partial exhumation along the top of the slab, and final exhumation along Miocene detachment faults. The CBU reached 450-580˚C and 14-16 kbar, PT conditions similar to those where ETS occurs in several modern subduction zones. Rheological heterogeneity is preserved in a range of rock types on Syros, with the most prominent type being brittle pods embedded within a viscous matrix. Prograde, blueschist-facies metabasalts show strong deformation fabrics characteristic of viscous flow; cm- to m-scale eclogitic lenses are embedded within them as massive, veined pods, foliated pods rotated with respect to the blueschist fabric, and attenuated, foliation-parallel lenses. Similar relationships are observed in blueschist-facies metasediments interpreted to have deformed during early exhumation. In these rocks, metabasalts form lenses ranging in size from meters to tens of meters and are distributed at the meter scale throughout the metasedimentary matrix. Several of the metamafic lenses, and the matrix rocks immediately adjacent to them, preserve multiple generations of dilational veins and shear fractures filled with quartz and high pressure minerals. These observations suggest that coupled brittle-viscous deformation under high fluid pressures may characterize the subduction interface in the deep tremor source region. To test this further, we modeled the behavior of an elasto-plastic pod in a viscous shear zone under high fluid pressures. Our models show that local stress concentrations around the pod are large enough to generate transient dilational shear at seismic strain rates. Scaling the model up to a typical source area for deep tremor suggests these heterogeneities may yield a seismic moment similar to those calculated for tremor bursts in modern subduction zones.
NASA Astrophysics Data System (ADS)
Pérez, Lara F.; Nielsen, Tove; Knutz, Paul C.; Kuijpers, Antoon; Damm, Volkmar
2018-04-01
The continental shelf of central-east Greenland is shaped by several glacially carved transverse troughs that form the oceanward extension of the major fjord systems. The evolution of these troughs through time, and their relation with the large-scale glaciation of the Northern Hemisphere, is poorly understood. In this study seismostratigraphic analyses have been carried out to determine the morphological and structural development of this important sector of the East Greenland glaciated margin. The age of major stratigraphic discontinuities has been constrained by a direct tie to ODP site 987 drilled in the Greenland Sea basin plain off Scoresby Sund fan system. The areal distribution and internal facies of the identified seismic units reveal the large-scale depositional pattern formed by ice-streams draining a major part of the central-east Greenland ice sheet. Initial sedimentation along the margin was, however, mainly controlled by tectonic processes related to the margin construction, continental uplift, and fluvial processes. From late Miocene to present, progradational and erosional patterns point to repeated glacial advances across the shelf. The evolution of depo-centres suggests that ice sheet advances over the continental shelf have occurred since late Miocene, about 2 Myr earlier than previously assumed. This cross-shelf glaciation is more pronounced during late Miocene and early Pliocene along Blosseville Kyst and around the Pliocene/Pleistocene boundary off Scoresby Sund; indicating a northward migration of the glacial advance. The two main periods of glaciation were separated by a major retreat of the ice sheet to an inland position during middle Pliocene. Mounded-wavy deposits interpreted as current-related deposits suggest the presence of changing along-slope current dynamics in concert with the development of the modern North Atlantic oceanographic pattern.
Discovering Cortical Folding Patterns in Neonatal Cortical Surfaces Using Large-Scale Dataset
Meng, Yu; Li, Gang; Wang, Li; Lin, Weili; Gilmore, John H.
2017-01-01
The cortical folding of the human brain is highly complex and variable across individuals. Mining the major patterns of cortical folding from modern large-scale neuroimaging datasets is of great importance in advancing techniques for neuroimaging analysis and understanding the inter-individual variations of cortical folding and its relationship with cognitive function and disorders. As the primary cortical folding is genetically influenced and has been established at term birth, neonates with the minimal exposure to the complicated postnatal environmental influence are the ideal candidates for understanding the major patterns of cortical folding. In this paper, for the first time, we propose a novel method for discovering the major patterns of cortical folding in a large-scale dataset of neonatal brain MR images (N = 677). In our method, first, cortical folding is characterized by the distribution of sulcal pits, which are the locally deepest points in cortical sulci. Because deep sulcal pits are genetically related, relatively consistent across individuals, and also stable during brain development, they are well suitable for representing and characterizing cortical folding. Then, the similarities between sulcal pit distributions of any two subjects are measured from spatial, geometrical, and topological points of view. Next, these different measurements are adaptively fused together using a similarity network fusion technique, to preserve their common information and also catch their complementary information. Finally, leveraging the fused similarity measurements, a hierarchical affinity propagation algorithm is used to group similar sulcal folding patterns together. The proposed method has been applied to 677 neonatal brains (the largest neonatal dataset to our knowledge) in the central sulcus, superior temporal sulcus, and cingulate sulcus, and revealed multiple distinct and meaningful folding patterns in each region. PMID:28229131
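The sketch below shows only the final grouping step, under the assumption that a fused subject-by-subject similarity matrix is already in hand; it uses scikit-learn's standard affinity propagation rather than the hierarchical variant used in the paper, and the matrix is synthetic placeholder data rather than sulcal-pit measurements.

```python
import numpy as np
from sklearn.cluster import AffinityPropagation

rng = np.random.default_rng(0)
n = 40
# Synthetic fused similarity: two groups of subjects that resemble each other more
# within-group than between-group (stand-in for spatial/geometric/topological fusion).
fused = rng.normal(0.2, 0.05, size=(n, n))
fused[:20, :20] += 0.5
fused[20:, 20:] += 0.5
fused = (fused + fused.T) / 2                        # symmetrize

ap = AffinityPropagation(affinity="precomputed", random_state=0)
labels = ap.fit_predict(fused)                       # cluster subjects by folding similarity
print("number of folding-pattern clusters:", len(set(labels)))
```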
NASA Astrophysics Data System (ADS)
Svenson, Eric Johan
Participants on the Invincible America Assembly in Fairfield, Iowa, and neighboring Maharishi Vedic City, Iowa, practicing the Maharishi Transcendental Meditation (TM) and TM-Sidhi programs in large groups, submitted written experiences that they had had during, and in some cases shortly after, their daily practice of the TM and TM-Sidhi programs. Participants were instructed to include in their written experiences only what they observed and to leave out interpretation and analysis. These experiences were then read by the author and compared with principles and phenomena of modern physics, particularly with quantum theory, astrophysics, quantum cosmology, and string theory, as well as defining characteristics of higher states of consciousness as described by Maharishi Vedic Science. In all cases, particular principles or phenomena of physics and qualities of higher states of consciousness appeared qualitatively quite similar to the content of the given experience. These experiences are presented in an Appendix, in which the corresponding principles and phenomena of physics are also presented. These physics "commentaries" on the experiences were written largely in layman's terms, without equations, and, in nearly every case, with clear reference to the corresponding sections of the experiences to which a given principle appears to relate. An abundance of similarities were apparent between the subjective experiences during meditation and principles of modern physics. A theoretic framework for understanding these rich similarities may begin with Maharishi's theory of higher states of consciousness provided herein. We conclude that the consistency and richness of detail found in these abundant similarities warrants the further pursuit and development of such a framework.
Volcanic Eruptions and Climate
NASA Technical Reports Server (NTRS)
LeGrande, Allegra N.; Anchukaitis, Kevin J.
2015-01-01
Volcanic eruptions represent some of the most climatically important and societally disruptive short-term events in human history. Large eruptions inject ash, dust, sulfurous gases (e.g. SO2, H2S), halogens (e.g. HCl and HBr), and water vapor into the Earth's atmosphere. Sulfurous emissions principally interact with the climate by converting into sulfate aerosols that reduce incoming solar radiation, warming the stratosphere and altering ozone creation, reducing global mean surface temperature, and suppressing the hydrological cycle. In this issue, we focus on the history, processes, and consequences of these large eruptions that inject enough material into the stratosphere to significantly affect the climate system. In terms of the changes wrought on the energy balance of the Earth System, these transient events can temporarily have a radiative forcing magnitude larger than the range of solar, greenhouse gas, and land use variability over the last millennium. In simulations as well as modern and paleoclimate observations, volcanic eruptions cause large inter-annual to decadal-scale changes in climate. Active debates persist concerning their role in longer-term (multi-decadal to centennial) modification of the Earth System, however.
InterProScan 5: genome-scale protein function classification
Jones, Philip; Binns, David; Chang, Hsin-Yu; Fraser, Matthew; Li, Weizhong; McAnulla, Craig; McWilliam, Hamish; Maslen, John; Mitchell, Alex; Nuka, Gift; Pesseat, Sebastien; Quinn, Antony F.; Sangrador-Vegas, Amaia; Scheremetjew, Maxim; Yong, Siew-Yit; Lopez, Rodrigo; Hunter, Sarah
2014-01-01
Motivation: Robust large-scale sequence analysis is a major challenge in modern genomic science, where biologists are frequently trying to characterize many millions of sequences. Here, we describe a new Java-based architecture for the widely used protein function prediction software package InterProScan. Developments include improvements and additions to the outputs of the software and the complete reimplementation of the software framework, resulting in a flexible and stable system that is able to use both multiprocessor machines and/or conventional clusters to achieve scalable distributed data analysis. InterProScan is freely available for download from the EMBL-EBI FTP site and the open source code is hosted at Google Code. Availability and implementation: InterProScan is distributed via FTP at ftp://ftp.ebi.ac.uk/pub/software/unix/iprscan/5/ and the source code is available from http://code.google.com/p/interproscan/. Contact: http://www.ebi.ac.uk/support or interhelp@ebi.ac.uk or mitchell@ebi.ac.uk PMID:24451626
IMAGE EXPLORER: Astronomical Image Analysis on an HTML5-based Web Application
NASA Astrophysics Data System (ADS)
Gopu, A.; Hayashi, S.; Young, M. D.
2014-05-01
Large datasets produced by recent astronomical imagers cause the traditional paradigm for basic visual analysis - typically downloading one's entire image dataset and using desktop clients like DS9, Aladin, etc. - to not scale, despite advances in desktop computing power and storage. This paper describes Image Explorer, a web framework that offers several of the basic visualization and analysis functions commonly provided by tools like DS9, on any HTML5-capable web browser on various platforms. It uses a combination of the modern HTML5 canvas, JavaScript, and several layers of lossless PNG tiles produced from the FITS image data. Astronomers are able to rapidly and simultaneously open up several images on their web browser, adjust the intensity min/max cutoff, its scaling function, and the zoom level, apply color-maps, view position and FITS header information, execute typically used data reduction codes on the corresponding FITS data using the FRIAA framework, and overlay tiles for source catalog objects, etc.
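The tiling service and HTML5 canvas are not reproduced here; the snippet below only sketches the kind of min/max cutoff and scaling-function step such a viewer applies before rendering 8-bit tiles, using a synthetic array in place of FITS pixel data.

```python
import numpy as np

def scale_image(pixels, vmin, vmax, stretch="linear"):
    """Clip to [vmin, vmax], apply a stretch, and quantize to 8-bit tile values."""
    clipped = np.clip(pixels, vmin, vmax)
    norm = (clipped - vmin) / (vmax - vmin)
    if stretch == "sqrt":
        norm = np.sqrt(norm)
    elif stretch == "log":
        norm = np.log1p(1000.0 * norm) / np.log1p(1000.0)
    return (255 * norm).astype(np.uint8)

# Synthetic stand-in for a FITS image; percentiles set the display cutoffs.
image = np.random.default_rng(2).gamma(2.0, size=(512, 512))
tile = scale_image(image, vmin=np.percentile(image, 5),
                   vmax=np.percentile(image, 99), stretch="log")
```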
OWL: A scalable Monte Carlo simulation suite for finite-temperature study of materials
NASA Astrophysics Data System (ADS)
Li, Ying Wai; Yuk, Simuck F.; Cooper, Valentino R.; Eisenbach, Markus; Odbadrakh, Khorgolkhuu
The OWL suite is a simulation package for performing large-scale Monte Carlo simulations. Its object-oriented, modular design enables it to interface with various external packages for energy evaluations. It is therefore applicable to studying the finite-temperature properties of a wide range of systems: from simple classical spin models to materials where the energy is evaluated by ab initio methods. This scheme not only allows for the study of thermodynamic properties based on first-principles statistical mechanics, but also provides a means for massive, multi-level parallelism to fully exploit the capacity of modern heterogeneous computer architectures. We will demonstrate how improved strong and weak scaling is achieved by employing novel, parallel and scalable Monte Carlo algorithms, as well as the applications of OWL to a few selected frontier materials research problems. This research was supported by the Office of Science of the Department of Energy under contract DE-AC05-00OR22725.
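As a toy example at the "simple classical spin model" end of the range described above (not OWL's own algorithms or API), the following sketch runs plain Metropolis Monte Carlo sweeps on a small 2-D Ising lattice.

```python
import numpy as np

def metropolis_sweep(spins, beta, rng):
    """One Metropolis sweep over an L x L Ising lattice with J = 1 and periodic boundaries."""
    L = spins.shape[0]
    for _ in range(L * L):
        i, j = rng.integers(L, size=2)
        nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2.0 * spins[i, j] * nb                 # energy change if this spin flips
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i, j] *= -1
    return spins

rng = np.random.default_rng(0)
spins = rng.choice([-1, 1], size=(16, 16))
for _ in range(200):
    metropolis_sweep(spins, beta=0.5, rng=rng)      # beta = 1/kT, placeholder value
print("magnetization per spin:", spins.mean())
```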
Age constraints on the evolution of the Quetico belt, Superior Province, Ontario
NASA Technical Reports Server (NTRS)
Percival, J. A.; Sullivan, R. W.
1986-01-01
Much attention has been focused on the nature of Archean tectonic processes and the extent to which they were different from modern rigid-plate tectonics. The Archean Superior Province has linear metavolcanic and metasediment-dominated subprovinces of similar scale to Cenozoic island arc-trench systems of the western Pacific, suggesting an origin by accreting arcs. Models of the evolution of metavolcanic belts in parts of the Superior Province suggest an arc setting, but the tectonic environment and evolution of the intervening metasedimentary belts are poorly understood. In addition to explaining the setting giving rise to a linear sedimentary basin, models must account for subsequent shortening and high-temperature, low-pressure metamorphism. Correlation of rock units and events in adjacent metavolcanic and metasedimentary belts is a first step toward understanding large-scale crustal interactions. To this end, zircon geochronology has been applied to metavolcanic belts of the western Superior Province; new age data for the Quetico metasedimentary belt are reported, permitting correlation with the adjacent Wabigoon and Wawa metavolcanic subprovinces.
Direct Growth of Graphene Film on Germanium Substrate
Wang, Gang; Zhang, Miao; Zhu, Yun; Ding, Guqiao; Jiang, Da; Guo, Qinglei; Liu, Su; Xie, Xiaoming; Chu, Paul K.; Di, Zengfeng; Wang, Xi
2013-01-01
Graphene has been predicted to play a role in post-silicon electronics due to its extraordinary carrier mobility. Chemical vapor deposition of graphene on transition metals has been considered a major step towards commercial realization of graphene. However, fabrication based on transition metals involves an inevitable transfer step which can be as complicated as the deposition of graphene itself. By ambient-pressure chemical vapor deposition, we demonstrate large-scale and uniform deposition of high-quality graphene directly on a Ge substrate, which is wafer scale and has been considered to replace conventional Si for the next generation of high-performance metal-oxide-semiconductor field-effect transistors (MOSFETs). The immiscible Ge-C system under equilibrium conditions dictates graphene deposition on Ge via a self-limiting and surface-mediated process rather than a precipitation process as observed for other metals with high carbon solubility. Our technique is compatible with modern microelectronics technology, thus allowing integration with high-volume production of complementary metal-oxide-semiconductors (CMOS). PMID:23955352
Multicategory Composite Least Squares Classifiers
Park, Seo Young; Liu, Yufeng; Liu, Dacheng; Scholl, Paul
2010-01-01
Classification is a very useful statistical tool for information extraction. In particular, multicategory classification is commonly seen in various applications. Although binary classification problems are heavily studied, extensions to the multicategory case are much less so. In view of the increased complexity and volume of modern statistical problems, it is desirable to have multicategory classifiers that are able to handle problems with high dimensions and with a large number of classes. Moreover, it is necessary to have sound theoretical properties for the multicategory classifiers. In the literature, there exist several different versions of simultaneous multicategory Support Vector Machines (SVMs). However, the computation of the SVM can be difficult for large-scale problems, especially for problems with a large number of classes. Furthermore, the SVM cannot produce class probability estimation directly. In this article, we propose a novel efficient multicategory composite least squares classifier (CLS classifier), which utilizes a new composite squared loss function. The proposed CLS classifier has several important merits: efficient computation for problems with a large number of classes, asymptotic consistency, ability to handle high dimensional data, and simple conditional class probability estimation. Our simulated and real examples demonstrate competitive performance of the proposed approach. PMID:21218128
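The abstract does not give the composite squared loss itself, so the sketch below is only a generic one-vs-rest least-squares baseline with softmax-normalized class scores, meant to illustrate how squared-loss fits can yield multicategory decisions and rough probability estimates; it is not the proposed CLS classifier.

```python
import numpy as np

def fit_ls_multiclass(X, y, n_classes, ridge=1e-3):
    """Fit one ridge-regularized least-squares score function per class (+1/-1 coding)."""
    Y = -np.ones((X.shape[0], n_classes))
    Y[np.arange(X.shape[0]), y] = 1.0
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])          # add intercept column
    W = np.linalg.solve(Xb.T @ Xb + ridge * np.eye(Xb.shape[1]), Xb.T @ Y)
    return W

def predict_proba(W, X):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    scores = Xb @ W
    exps = np.exp(scores - scores.max(axis=1, keepdims=True))
    return exps / exps.sum(axis=1, keepdims=True)          # softmax over class scores

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5)); y = rng.integers(0, 4, size=300)
W = fit_ls_multiclass(X, y, n_classes=4)
print(predict_proba(W, X[:3]).round(2))
```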
Capital Architecture: Situating symbolism parallel to architectural methods and technology
NASA Astrophysics Data System (ADS)
Daoud, Bassam
Capital Architecture is a symbol of a nation's global presence and the cultural and social focal point of its inhabitants. Since the advent of High-Modernism in Western cities, and subsequently decolonised capitals, civic architecture no longer seems to be strictly grounded in the philosophy that national buildings shape the legacy of government and the way a nation is regarded through its built environment. Amidst an exceedingly globalized architectural practice and with the growing concern of key heritage foundations over the shortcomings of international modernism in representing its immediate socio-cultural context, the contextualization of public architecture within its sociological, cultural and economic framework in capital cities became the key denominator of this thesis. Civic architecture in capital cities is essential to confront the challenges of symbolizing a nation and demonstrating the legitimacy of the government. In today's dominantly secular Western societies, governmental architecture, especially where the seat of political power lies, is the ultimate form of architectural expression in conveying a sense of identity and underlining a nation's status. Departing from these convictions, this thesis investigates the embodied symbolic power, the representative capacity, and the inherent permanence in contemporary architecture, and in its modes of production. Through a vast study of Modern architectural ideals and heritage -- in parallel to methodologies -- the thesis stimulates the future of large-scale governmental building practices and aims to identify and index the key constituents that may respond to the lack of representation in civic architecture in capital cities.
ERIC Educational Resources Information Center
Blonder, Ron; Sakhnini, Sohair
2017-01-01
The high-school chemistry curriculum is loaded with many important chemical concepts that are taught at the high-school level and it is therefore very difficult to add modern contents to the existing curriculum. However, many studies have underscored the importance of integrating modern chemistry contents such as nanotechnology into a high-school…
Land Change in Russia since 2000
NASA Astrophysics Data System (ADS)
de Beurs, K.; Ioffe, G.; Nefedova, T.
2010-12-01
Agricultural reform has been an important anthropogenic change process shaping landscapes in European Russia since the formal collapse of the Soviet Union at the end of 1991. Widespread land abandonment is perhaps the most evident side effect of the reform, even visible in synoptic imagery. While land abandonment as a result of the collapse of the Soviet Union is relatively well documented, few studies have investigated the unfolding process of abandonment that results from rural population declines. Russia’s population is projected to shrink by a staggering 29% by 2050 and population dynamics are predicted to play a significant role structuring rural landscapes across European Russia. While often treated as a unified whole with respect to agricultural reform, significant regional diversity exists in Russia. Official statistics at the rayon (county) level are typically skewed toward large-scale farming and farm data from important household productions are summarized into regional averages. In addition, data at sub-district level can often only be obtained by visiting rural administrators in person. Large scale official data thus need to be interpreted with caution. Here we present data collected during the summer of 2010 from representative settlements and enterprises in selected counties within the oblasts (states) of Kostroma and Samara. These field data will provide an initial overview of the economic and social state in modern rural western Russia. We will combine the field data with established socio-demographic observations as well as satellite observations at multiple scales to understand the effect of global change and to project future developments.
Heyerdahl, Emily K; Morgan, Penelope; Riser, James P
2008-03-01
Our objective was to infer the climate drivers of regionally synchronous fire years in dry forests of the U.S. northern Rockies in Idaho and western Montana. During our analysis period (1650-1900), we reconstructed fires from 9245 fire scars on 576 trees (mostly ponderosa pine, Pinus ponderosa P. & C. Lawson) at 21 sites and compared them to existing tree-ring reconstructions of climate (temperature and the Palmer Drought Severity Index [PDSI]) and large-scale climate patterns that affect modern spring climate in this region (El Niño Southern Oscillation [ENSO] and the Pacific Decadal Oscillation [PDO]). We identified 32 regional-fire years as those with five or more sites with fire. Fires were remarkably widespread during such years, including one year (1748) in which fires were recorded at 10 sites across what are today seven national forests plus one site on state land. During regional-fire years, spring-summers were significantly warm and summers were significantly warm-dry whereas the opposite conditions prevailed during the 99 years when no fires were recorded at any of our sites (no-fire years). Climate in prior years was not significantly associated with regional- or no-fire years. Years when fire was recorded at only a few of our sites occurred under a broad range of climate conditions, highlighting the fact that the regional climate drivers of fire are most evident when fires are synchronized across a large area. No-fire years tended to occur during La Niña years, which tend to have anomalously deep snowpacks in this region. However, ENSO was not a significant driver of regional-fire years, consistent with the greater influence of La Niña than El Niño conditions on the spring climate of this region. PDO was not a significant driver of past fire, despite being a strong driver of modern spring climate and modern regional-fire years in the northern Rockies.
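A minimal sketch of the compositing logic described above: from a years-by-sites fire-scar matrix, flag years with fire at five or more sites as regional-fire years and years with no recorded fire as no-fire years, then compare a climate index across the two sets. All arrays below are synthetic placeholders, not the Idaho/Montana site network or the tree-ring reconstructions.

```python
import numpy as np

rng = np.random.default_rng(3)
years = np.arange(1650, 1901)
fires = rng.random((years.size, 21)) < 0.12          # placeholder 0/1 fire-scar matrix (21 sites)
pdsi = rng.normal(size=years.size)                   # placeholder summer PDSI reconstruction

sites_burned = fires.sum(axis=1)
regional = sites_burned >= 5                         # years with fire at 5+ sites
no_fire = sites_burned == 0                          # years with no fire at any site
print("regional-fire years (first few):", years[regional][:10].tolist())
print("mean PDSI  regional:", pdsi[regional].mean(), " no-fire:", pdsi[no_fire].mean())
```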
Opinion evolution in different social acquaintance networks.
Chen, Xi; Zhang, Xiao; Wu, Zhan; Wang, Hongwei; Wang, Guohua; Li, Wei
2017-11-01
Social acquaintance networks influenced by social culture and social policy have a great impact on public opinion evolution in daily life. Based on the differences between socio-culture and social policy, three different social acquaintance networks (kinship-priority acquaintance network, independence-priority acquaintance network, and hybrid acquaintance network) incorporating heredity proportion p_h and variation proportion p_v are proposed in this paper. Numerical experiments are conducted to investigate network topology and different phenomena during opinion evolution, using the Deffuant model. We found that in kinship-priority acquaintance networks, similar to the Chinese traditional acquaintance networks, opinions always achieve fragmentation, resulting in the formation of multiple large clusters and many small clusters due to the fact that individuals believe more in their relatives and live in a relatively closed environment. In independence-priority acquaintance networks, similar to Western acquaintance networks, the results are similar to those in the kinship-priority acquaintance network. In hybrid acquaintance networks, similar to the Chinese modern acquaintance networks, only a few clusters are formed indicating that in modern China, opinions are more likely to reach consensus on a large scale. These results are similar to the opinion evolution phenomena in modern society, proving the rationality and applicability of network models combined with social culture and policy. We also found a threshold curve p_v + 2p_h = 2.05 in the results for the final opinion clusters and evolution time. Above the threshold curve, opinions could easily reach consensus. Based on the above experimental results, a culture-policy-driven mechanism for the opinion dynamic is worth promoting in this paper, that is, opinion dynamics can be driven by different social cultures and policies through the influence of heredity and variation in interpersonal relationship networks. This finding is of great significance for predicting opinion evolution under different acquaintance networks and formulating reasonable policies based on cultural characteristics to guide public opinion.
Opinion evolution in different social acquaintance networks
NASA Astrophysics Data System (ADS)
Chen, Xi; Zhang, Xiao; Wu, Zhan; Wang, Hongwei; Wang, Guohua; Li, Wei
2017-11-01
Social acquaintance networks influenced by social culture and social policy have a great impact on public opinion evolution in daily life. Based on the differences between socio-culture and social policy, three different social acquaintance networks (kinship-priority acquaintance network, independence-priority acquaintance network, and hybrid acquaintance network) incorporating heredity proportion p_h and variation proportion p_v are proposed in this paper. Numerical experiments are conducted to investigate network topology and different phenomena during opinion evolution, using the Deffuant model. We found that in kinship-priority acquaintance networks, similar to the Chinese traditional acquaintance networks, opinions always achieve fragmentation, resulting in the formation of multiple large clusters and many small clusters due to the fact that individuals believe more in their relatives and live in a relatively closed environment. In independence-priority acquaintance networks, similar to Western acquaintance networks, the results are similar to those in the kinship-priority acquaintance network. In hybrid acquaintance networks, similar to the Chinese modern acquaintance networks, only a few clusters are formed indicating that in modern China, opinions are more likely to reach consensus on a large scale. These results are similar to the opinion evolution phenomena in modern society, proving the rationality and applicability of network models combined with social culture and policy. We also found a threshold curve p_v + 2p_h = 2.05 in the results for the final opinion clusters and evolution time. Above the threshold curve, opinions could easily reach consensus. Based on the above experimental results, a culture-policy-driven mechanism for the opinion dynamic is worth promoting in this paper, that is, opinion dynamics can be driven by different social cultures and policies through the influence of heredity and variation in interpersonal relationship networks. This finding is of great significance for predicting opinion evolution under different acquaintance networks and formulating reasonable policies based on cultural characteristics to guide public opinion.
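For readers unfamiliar with the opinion model used above, the following is a minimal sketch of the Deffuant bounded-confidence update on a toy acquaintance network. The network construction, the confidence bound epsilon, the convergence parameter mu, and all other names are illustrative assumptions; the paper's kinship/independence/hybrid network generators and its heredity and variation proportions p_h and p_v are not reproduced here.

import random

def deffuant_step(opinions, edges, epsilon=0.3, mu=0.5):
    # Pick a random acquaintance pair; if their opinions differ by less than
    # the confidence bound epsilon, both move toward each other by a factor mu.
    i, j = random.choice(edges)
    if i != j and abs(opinions[i] - opinions[j]) < epsilon:
        shift = mu * (opinions[j] - opinions[i])
        opinions[i] += shift
        opinions[j] -= shift

def simulate(n=200, steps=200000, epsilon=0.3, mu=0.5, seed=1):
    random.seed(seed)
    # Toy acquaintance network: a ring of "kin" ties plus a few random
    # long-range "variation" ties (a stand-in for the paper's generators).
    edges = [(k, (k + 1) % n) for k in range(n)]
    edges += [(random.randrange(n), random.randrange(n)) for _ in range(n // 4)]
    opinions = [random.random() for _ in range(n)]
    for _ in range(steps):
        deffuant_step(opinions, edges, epsilon, mu)
    return opinions

final = simulate()
n_clusters = len({round(x, 2) for x in final})
print("surviving opinion clusters:", n_clusters)

Varying epsilon (or, in the paper's setting, the network parameters) moves the outcome between fragmentation into many clusters and large-scale consensus.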
Perry, Jonathan M G; Cooke, Siobhán B; Runestad Connour, Jacqueline A; Burgess, M Loring; Ruff, Christopher B
2018-02-01
Body mass is an important component of any paleobiological reconstruction. Reliable skeletal dimensions for making estimates are desirable but extant primate reference samples with known body masses are rare. We estimated body mass in a sample of extinct platyrrhines and Fayum anthropoids based on four measurements of the articular surfaces of the humerus and femur. Estimates were based on a large extant reference sample of wild-collected individuals with associated body masses, including previously published and new data from extant platyrrhines, cercopithecoids, and hominoids. In general, scaling of joint dimensions is positively allometric relative to expectations of geometric isometry, but negatively allometric relative to expectations of maintaining equivalent joint surface areas. Body mass prediction equations based on articular breadths are reasonably precise, with %SEEs of 17-25%. The breadth of the distal femoral articulation yields the most reliable estimates of body mass because it scales similarly in all major anthropoid taxa. Other joints scale differently in different taxa; therefore, locomotor style and phylogenetic affinity must be considered when calculating body mass estimates from the proximal femur, proximal humerus, and distal humerus. The body mass prediction equations were applied to 36 Old World and New World fossil anthropoid specimens representing 11 taxa, plus two Haitian specimens of uncertain taxonomic affinity. Among the extinct platyrrhines studied, only Cebupithecia is similar to large, extant platyrrhines in having large humeral (especially distal) joints. Our body mass estimates differ from each other and from published estimates based on teeth in ways that reflect known differences in relative sizes of the joints and teeth. We prefer body mass estimators that are biomechanically linked to weight-bearing, and especially those that are relatively insensitive to differences in locomotor style and phylogenetic history. Whenever possible, extant reference samples should be chosen to match target fossils in joint proportionality. Copyright © 2017 Elsevier Ltd. All rights reserved.
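As a rough illustration of how such prediction equations are typically built and reported, the sketch below fits an ordinary least-squares line in log-log space to a hypothetical reference sample and computes the percent standard error of the estimate (%SEE). The joint-breadth and body-mass values are invented for illustration and are not the reference data used in the study.

import numpy as np

# Hypothetical reference sample: distal femoral articular breadth (mm)
# and associated body mass (kg). Values are illustrative only.
breadth = np.array([18.0, 22.0, 27.0, 33.0, 41.0, 50.0])
mass = np.array([2.1, 3.8, 6.5, 11.0, 19.0, 32.0])

# Ordinary least squares in log10 space: log10(mass) = a + b * log10(breadth).
b, a = np.polyfit(np.log10(breadth), np.log10(mass), 1)

# Percent standard error of the estimate (%SEE), the precision measure
# commonly quoted for body-mass prediction equations.
resid = np.log10(mass) - (a + b * np.log10(breadth))
see = np.sqrt(np.sum(resid ** 2) / (len(mass) - 2))
pct_see = (10 ** see - 1) * 100

def predict_mass(breadth_mm):
    # Back-transform the log-space prediction to kilograms.
    return 10 ** (a + b * np.log10(breadth_mm))

print(f"slope = {b:.2f}, %SEE = {pct_see:.0f}%, mass(30 mm) ~ {predict_mass(30.0):.1f} kg")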
The Holocene Geomagnetic Field: Spikes, Low Field Anomalies, and Asymmetries
NASA Astrophysics Data System (ADS)
Constable, C.
2017-12-01
Our understanding of the Holocene magnetic field is constrained by individual paleomagnetic records of variable quality and resolution, composite regional secular variation curves, and low-resolution global time-varying geomagnetic field models. Although spatial and temporal data coverages have greatly improved in recent years, typical views of millennial-scale secular variation and the underlying physical processes continue to be heavily influenced by more detailed field structure and short-term variability inferred from the historical record and modern observations. Recent models of gyre-driven decay of the geomagnetic dipole on centennial time scales, and studies of the evolution of the South Atlantic Anomaly, provide one prominent example. Since 1840, dipole decay has largely been driven by meridional flux advection, with generally smaller, fairly steady contributions from magnetic diffusion. The decay is dominantly associated with geomagnetic activity in the Southern Hemisphere. In contrast to the present decay, dipole strength generally grew between 1500 and 1000 BC, sustaining high but fluctuating values around 90-100 ZAm² until after 1500 AD. Thus high dipole moments appear to have been present shortly after 1000 BC at the time of the Levantine spikes, which represent extreme variations in regional geomagnetic field strength. It has been speculated that the growth in dipole moment originated from a strong flux patch near the equatorial region at the core-mantle boundary that migrated north and west to augment the dipole strength, suggesting the presence of a large-scale anticyclonic gyre in the northern hemisphere, not totally unlike the southern hemisphere flow that dominates present-day dipole decay. The later brief episodes of high field strength in the Levant may have contributed to prolonged values of high dipole strength until the onset of dipole decay in the late second millennium AD. This could support the concept of a large-scale stable flow configuration for several millennia.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saad, Tony; Sutherland, James C.
To address the coding and software challenges of modern hybrid architectures, we propose an approach to multiphysics code development for high-performance computing. This approach is based on using a Domain Specific Language (DSL) in tandem with a directed acyclic graph (DAG) representation of the problem to be solved that allows runtime algorithm generation. When coupled with a large-scale parallel framework, the result is a portable development framework capable of executing on hybrid platforms and handling the challenges of multiphysics applications. In addition, we share our experience developing a code in such an environment – an effort that spans an interdisciplinary team of engineers and computer scientists.
Research on Agricultural Product Options Pricing Based on Lévy Copula
NASA Astrophysics Data System (ADS)
Qiu, Hong
2017-11-01
China is a large agricultural country, and the healthy development of agriculture is tied to the stability of society as a whole. With the advancement of modern agriculture and the expansion of agricultural scale, farmers' demand for tools to hedge market risks is increasingly urgent. Options trading helps draw farmers into the market, promotes the development of contract (order-based) agriculture, improves agricultural support policy, and furthers the development of the agricultural futures market. Compared with futures, options require a lower margin, which lowers the entry threshold and allows more small and medium investors to participate. This not only makes the futures market more active but also gives many small and medium investors an effective risk-management tool.
Saad, Tony; Sutherland, James C.
2016-05-04
To address the coding and software challenges of modern hybrid architectures, we propose an approach to multiphysics code development for high-performance computing. This approach is based on using a Domain Specific Language (DSL) in tandem with a directed acyclic graph (DAG) representation of the problem to be solved that allows runtime algorithm generation. When coupled with a large-scale parallel framework, the result is a portable development framework capable of executing on hybrid platforms and handling the challenges of multiphysics applications. In addition, we share our experience developing a code in such an environment – an effort that spans an interdisciplinary team of engineers and computer scientists.
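To make the DAG idea concrete, here is a minimal sketch of how a multiphysics update can be expressed as tasks with declared dependencies and executed in dependency order. The task names, fields, and formulas are invented for illustration; this is not the DSL or framework described in the abstract.

from collections import defaultdict, deque

# Each task names the fields it depends on and a function that computes its field.
tasks = {
    "density":     {"deps": [],                         "fn": lambda f: f.update(rho=1.2)},
    "temperature": {"deps": ["density"],                "fn": lambda f: f.update(T=300.0 / f["rho"])},
    "velocity":    {"deps": ["density"],                "fn": lambda f: f.update(u=0.5 * f["rho"])},
    "pressure":    {"deps": ["density", "temperature"], "fn": lambda f: f.update(p=f["rho"] * 287.0 * f["T"])},
}

def execute(tasks):
    # Kahn's algorithm: run a task only after all of its dependencies have run.
    indeg = {name: len(spec["deps"]) for name, spec in tasks.items()}
    users = defaultdict(list)
    for name, spec in tasks.items():
        for dep in spec["deps"]:
            users[dep].append(name)
    ready = deque(name for name, n in indeg.items() if n == 0)
    fields, order = {}, []
    while ready:
        name = ready.popleft()
        tasks[name]["fn"](fields)
        order.append(name)
        for user in users[name]:
            indeg[user] -= 1
            if indeg[user] == 0:
                ready.append(user)
    return order, fields

order, fields = execute(tasks)
print(order)
print(fields)

Because the execution order is derived from the graph at runtime, adding or swapping a physics model only changes the task declarations, which is the property runtime algorithm generation relies on.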
An intelligent load shedding scheme using neural networks and neuro-fuzzy.
Haidar, Ahmed M A; Mohamed, Azah; Al-Dabbagh, Majid; Hussain, Aini; Masoum, Mohammad
2009-12-01
Load shedding is one of the essential requirements for maintaining the security of modern power systems, particularly in competitive energy markets. This paper proposes an intelligent scheme for fast and accurate load shedding that uses neural networks to predict the possible loss of load at an early stage and neuro-fuzzy inference to determine the amount of load to shed in order to avoid a cascading outage. A large-scale electrical power system has been considered to validate the performance of the proposed technique in determining the amount of load to shed. The proposed techniques provide tools for improving the reliability and continuity of the power supply, as confirmed by the results obtained in this research, of which sample results are given in this paper.
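The sketch below illustrates, in deliberately simplified form, the two-stage idea described above: a small feed-forward network estimates the loss of load from post-disturbance measurements, and a coarse rule base (standing in for the neuro-fuzzy stage) maps that estimate to a shed amount. All weights, feature choices, and thresholds are invented for illustration and are not the trained models of the paper.

import numpy as np

# Post-disturbance feature magnitudes: frequency deviation (Hz), rate of
# change of frequency (Hz/s), and voltage dip (p.u.). Illustrative values.
x = np.array([0.6, 0.8, 0.08])

# Tiny feed-forward net with made-up weights standing in for the trained
# loss-of-load predictor; a real scheme would learn these offline.
W1 = np.array([[0.9, 0.4, 1.2],
               [0.3, 1.1, 0.7]])
b1 = np.array([0.1, -0.2])
W2 = np.array([0.8, 0.6])
b2 = 0.05

hidden = np.tanh(W1 @ x + b1)
predicted_loss_mw = 500.0 * max(0.0, float(W2 @ hidden + b2))   # scale to MW

# Coarse rule base mapping the predicted loss to a fraction of load to shed,
# a crude stand-in for the neuro-fuzzy inference stage.
def shed_fraction(loss_mw):
    if loss_mw < 50:
        return 0.00   # small disturbance: no shedding
    if loss_mw < 200:
        return 0.05   # moderate: shed 5% of load
    if loss_mw < 400:
        return 0.15   # large: shed 15%
    return 0.30       # severe: shed 30%

print(f"predicted loss ~ {predicted_loss_mw:.0f} MW, shed {shed_fraction(predicted_loss_mw):.0%} of load")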
Bromelain: an overview of industrial application and purification strategies.
Arshad, Zatul Iffah Mohd; Amid, Azura; Yusof, Faridah; Jaswir, Irwandi; Ahmad, Kausar; Loke, Show Pau
2014-09-01
This review highlights the use of bromelain in various applications with up-to-date literature on the purification of bromelain from pineapple fruit and waste such as peel, core, crown, and leaves. Bromelain, a cysteine protease, has been exploited commercially in many applications in the food, beverage, tenderization, cosmetic, pharmaceutical, and textile industries. Researchers worldwide have been directing their interest to purification strategies by applying conventional and modern approaches, such as manipulating the pH, affinity, hydrophobicity, and temperature conditions in accord with the unique properties of bromelain. The amount of downstream processing will depend on its intended application in industries. The breakthrough of recombinant DNA technology has facilitated the large-scale production and purification of recombinant bromelain for novel applications in the future.
Ostrander, Chadlin M.; Owens, Jeremy D.; Nielsen, Sune G.
2017-01-01
The rates of marine deoxygenation leading to Cretaceous Oceanic Anoxic Events are poorly recognized and constrained. If increases in primary productivity are the primary driver of these episodes, progressive oxygen loss from global waters should predate enhanced carbon burial in underlying sediments—the diagnostic Oceanic Anoxic Event relic. Thallium isotope analysis of organic-rich black shales from Demerara Rise across Oceanic Anoxic Event 2 reveals evidence of expanded sediment-water interface deoxygenation ~43 ± 11 thousand years before the globally recognized carbon cycle perturbation. This evidence for rapid oxygen loss leading to an extreme ancient climatic event has timely implications for the modern ocean, which is already experiencing large-scale deoxygenation. PMID:28808684
Turboprop Model in the 8- by 6-Foot Supersonic Wind Tunnel
1976-08-21
National Aeronautics and Space Administration (NASA) engineer Robert Jeracki prepares a Hamilton Standard SR-1 turboprop model in the test section of the 8- by 6-Foot Supersonic Wind Tunnel at the Lewis Research Center. Lewis researchers were analyzing a series of eight-bladed propellers in their wind tunnels to determine their operating characteristics at speeds up to Mach 0.8. The program, which became the Advanced Turboprop, was part of a NASA-wide Aircraft Energy Efficiency Program which was designed to reduce aircraft fuel costs by 50 percent. The ATP concept was different from the turboprops in use in the 1950s. The modern versions had at least eight blades and were swept back for better performance. After Lewis researchers developed the advanced turboprop theory and established its potential performance capabilities, they commenced an almost decade-long partnership with Hamilton Standard to develop, verify, and improve the concept. A series of 24-inch scale models of the SR-1 with different blade shapes and angles were tested in Lewis’ wind tunnels. A formal program was established in 1978 to examine associated noise levels, aerodynamics, and the drive system. The testing of the large-scale propfan was done on test rigs, in large wind tunnels, and, eventually, on aircraft.
Schweitzer, Peter; Povoroznyuk, Olga; Schiesser, Sigrid
2017-01-01
Public and academic discourses about the Polar regions typically focus on the so-called natural environment. While these discourses and inquiries continue to be relevant, the current article asks how to conceptualize the on-going industrial and infrastructural build-up of the Arctic. Acknowledging that the “built environment” is not an invention of modernity, the article nevertheless focuses on large-scale infrastructural projects of the twentieth century, which marks a watershed of industrial and infrastructural development in the north. Given that the Soviet Union was at the vanguard of these developments, the focus will be on Soviet and Russian large-scale projects. We will be discussing two cases of transportation infrastructure, one of them based on an on-going research project being conducted by the authors along the Baikal–Amur Mainline (BAM) and the other focused on the so-called Northern Sea Route, the marine passage with a long history that has recently been regaining public and academic attention. The concluding section will argue for increased attention to the interactions between humans and the built environment, serving as a kind of programmatic call for more anthropological attention to infrastructure in the Russian north and other polar regions. PMID:29098112
NASA Astrophysics Data System (ADS)
Gabriel, Alice; Pelties, Christian
2014-05-01
In this presentation we will demonstrate the benefits of using modern numerical methods to support physics-based ground motion modeling and research. For this purpose, we utilize SeisSol, an arbitrary high-order derivative Discontinuous Galerkin (ADER-DG) scheme, to solve the spontaneous rupture problem with high-order accuracy in space and time using three-dimensional unstructured tetrahedral meshes. We recently verified the method in various advanced test cases of the 'SCEC/USGS Dynamic Earthquake Rupture Code Verification Exercise' benchmark suite, including branching and dipping fault systems, heterogeneous background stresses, bi-material faults and rate-and-state friction constitutive formulations. Now, we study the dynamic rupture process using 3D meshes of fault systems constructed from geological and geophysical constraints, such as high-resolution topography, 3D velocity models and fault geometries. Our starting point is a large-scale earthquake dynamic rupture scenario based on the 1994 Northridge blind thrust event in Southern California. Starting from this well-documented and extensively studied event, we intend to understand the ground motion, including the relevant high-frequency content, generated by complex fault systems and its variation arising from various physical constraints. For example, our results imply that the Northridge fault geometry favors a pulse-like rupture behavior.
Gravitational Lensing: Einstein's unfinished symphony
NASA Astrophysics Data System (ADS)
Treu, Tommaso; Ellis, Richard S.
2015-01-01
Gravitational lensing - the deflection of light rays by gravitating matter - has become a major tool in the armoury of the modern cosmologist. Proposed nearly a hundred years ago as a key feature of Einstein's theory of general relativity, we trace the historical development since its verification at a solar eclipse in 1919. Einstein was apparently cautious about its practical utility and the subject lay dormant observationally for nearly 60 years. Nonetheless there has been rapid progress over the past twenty years. The technique allows astronomers to chart the distribution of dark matter on large and small scales thereby testing predictions of the standard cosmological model which assumes dark matter comprises a massive weakly-interacting particle. By measuring the distances and tracing the growth of dark matter structure over cosmic time, gravitational lensing also holds great promise in determining whether the dark energy, postulated to explain the accelerated cosmic expansion, is a vacuum energy density or a failure of general relativity on large scales. We illustrate the wide range of applications which harness the power of gravitational lensing, from searches for the earliest galaxies magnified by massive clusters to those for extrasolar planets which temporarily brighten a background star. We summarise the future prospects with dedicated ground and space-based facilities designed to exploit this remarkable physical phenomenon.
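As a reminder of the angular scale behind these applications, the characteristic scale of a lens is its Einstein radius; the textbook point-mass expression (a standard result, not specific to this review) is

\[
\theta_E = \sqrt{\frac{4GM}{c^{2}}\,\frac{D_{ls}}{D_{l}\,D_{s}}},
\]

where M is the lens mass and D_l, D_s and D_{ls} are the angular-diameter distances to the lens, to the source, and from lens to source. Image separations of order 2θ_E and the magnification of background sources both follow from this scale, whether the lens is a galaxy cluster or a planet-hosting star.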
Scalable NIC-based reduction on large-scale clusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moody, A.; Fernández, J. C.; Petrini, F.
2003-01-01
Many parallel algorithms require efficient support for reduction collectives. Over the years, researchers have developed optimal reduction algorithms by taking into account system size, data size, and the complexity of reduction operations. However, all of these algorithms have assumed that the reduction processing takes place on the host CPU. Modern Network Interface Cards (NICs) sport programmable processors with substantial memory and thus introduce a fresh variable into the equation. This raises the following interesting challenge: can we take advantage of modern NICs to implement fast reduction operations? In this paper, we take on this challenge in the context of large-scale clusters. Through experiments on the 960-node, 1920-processor ASCI Linux Cluster (ALC) located at the Lawrence Livermore National Laboratory, we show that NIC-based reductions indeed perform with reduced latency and improved consistency over host-based algorithms for the common case and that these benefits scale as the system grows. In the largest configuration tested (1812 processors), our NIC-based algorithm can sum a single-element vector in 73 microseconds with 32-bit integers and in 118 microseconds with 64-bit floating-point numbers. These results represent an improvement, respectively, of 121% and 39% with respect to the production-level MPI library.
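For context on what a reduction collective does, the following is a minimal sketch of the binomial-tree summation that such algorithms typically implement, simulated sequentially in plain Python over a list of per-rank values; it is not the NIC-resident implementation of the paper.

def tree_reduce(values):
    # Binomial-tree sum across P "ranks": the result reaches rank 0 in
    # ceil(log2(P)) communication rounds instead of P - 1 sequential additions.
    vals = list(values)
    p = len(vals)
    step = 1
    while step < p:
        # In each round, rank r (a multiple of 2*step) absorbs rank r + step.
        for r in range(0, p, 2 * step):
            partner = r + step
            if partner < p:
                vals[r] += vals[partner]
        step *= 2
    return vals[0]          # rank 0 holds the reduced value

assert tree_reduce(range(8)) == sum(range(8))
assert tree_reduce([1.5] * 5) == 7.5

Offloading each round's receive-and-add to the NIC processor, as the paper does, removes the host CPU from the critical path of these rounds.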
The origins of intensive marine fishing in medieval Europe: the English evidence.
Barrett, James H; Locker, Alison M; Roberts, Callum M
2004-12-07
The catastrophic impact of fishing pressure on species such as cod and herring is well documented. However, the antiquity of their intensive exploitation has not been established. Systematic catch statistics are only available for ca.100 years, but large-scale fishing industries existed in medieval Europe and the expansion of cod fishing from the fourteenth century (first in Iceland, then in Newfoundland) played an important role in the European colonization of the Northwest Atlantic. History has demonstrated the scale of these late medieval and post-medieval fisheries, but only archaeology can illuminate earlier practices. Zooarchaeological evidence shows that the clearest changes in marine fishing in England between AD 600 and 1600 occurred rapidly around AD 1000 and involved large increases in catches of herring and cod. Surprisingly, this revolution predated the documented post-medieval expansion of England's sea fisheries and coincided with the Medieval Warm Period--when natural herring and cod productivity was probably low in the North Sea. This counterintuitive discovery can be explained by the concurrent rise of urbanism and human impacts on freshwater ecosystems. The search for 'pristine' baselines regarding marine ecosystems will thus need to employ medieval palaeoecological proxies in addition to recent fisheries data and early modern historical records.
Knoll, Andrew H.; Nowak, Martin A.
2017-01-01
The integration of fossils, phylogeny, and geochronology has resulted in an increasingly well-resolved timetable of evolution. Life appears to have taken root before the earliest known minimally metamorphosed sedimentary rocks were deposited, but for a billion years or more, evolution played out beneath an essentially anoxic atmosphere. Oxygen concentrations in the atmosphere and surface oceans first rose in the Great Oxygenation Event (GOE) 2.4 billion years ago, and a second increase beginning in the later Neoproterozoic Era [Neoproterozoic Oxygenation Event (NOE)] established the redox profile of modern oceans. The GOE facilitated the emergence of eukaryotes, whereas the NOE is associated with large and complex multicellular organisms. Thus, the GOE and NOE are fundamental pacemakers for evolution. On the time scale of Earth’s entire 4 billion–year history, the evolutionary dynamics of the planet’s biosphere appears to be fast, and the pace of evolution is largely determined by physical changes of the planet. However, in Phanerozoic ecosystems, interactions between new functions enabled by the accumulation of characters in a complex regulatory environment and changing biological components of effective environments appear to have an important influence on the timing of evolutionary innovations. On the much shorter time scale of transient environmental perturbations, such as those associated with mass extinctions, rates of genetic accommodation may have been limiting for life. PMID:28560344
The origins of intensive marine fishing in medieval Europe: the English evidence.
Barrett, James H.; Locker, Alison M.; Roberts, Callum M.
2004-01-01
The catastrophic impact of fishing pressure on species such as cod and herring is well documented. However, the antiquity of their intensive exploitation has not been established. Systematic catch statistics are only available for ca.100 years, but large-scale fishing industries existed in medieval Europe and the expansion of cod fishing from the fourteenth century (first in Iceland, then in Newfoundland) played an important role in the European colonization of the Northwest Atlantic. History has demonstrated the scale of these late medieval and post-medieval fisheries, but only archaeology can illuminate earlier practices. Zooarchaeological evidence shows that the clearest changes in marine fishing in England between AD 600 and 1600 occurred rapidly around AD 1000 and involved large increases in catches of herring and cod. Surprisingly, this revolution predated the documented post-medieval expansion of England's sea fisheries and coincided with the Medieval Warm Period--when natural herring and cod productivity was probably low in the North Sea. This counterintuitive discovery can be explained by the concurrent rise of urbanism and human impacts on freshwater ecosystems. The search for 'pristine' baselines regarding marine ecosystems will thus need to employ medieval palaeoecological proxies in addition to recent fisheries data and early modern historical records. PMID:15590590
The Population Tracking Model: A Simple, Scalable Statistical Model for Neural Population Data
O'Donnell, Cian; alves, J. Tiago Gonç; Whiteley, Nick; Portera-Cailliau, Carlos; Sejnowski, Terrence J.
2017-01-01
Our understanding of neural population coding has been limited by a lack of analysis methods to characterize spiking data from large populations. The biggest challenge comes from the fact that the number of possible network activity patterns scales exponentially with the number of neurons recorded (∼2^N for N neurons). Here we introduce a new statistical method for characterizing neural population activity that requires semi-independent fitting of only as many parameters as the square of the number of neurons, requiring drastically smaller data sets and minimal computation time. The model works by matching the population rate (the number of neurons synchronously active) and the probability that each individual neuron fires given the population rate. We found that this model can accurately fit synthetic data from up to 1000 neurons. We also found that the model could rapidly decode visual stimuli from neural population data from macaque primary visual cortex about 65 ms after stimulus onset. Finally, we used the model to estimate the entropy of neural population activity in developing mouse somatosensory cortex and, surprisingly, found that it first increases, and then decreases during development. This statistical model opens new options for interrogating neural population data and can bolster the use of modern large-scale in vivo Ca2+ and voltage imaging tools. PMID:27870612
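A minimal sketch of the two quantities the model matches, estimated from a synthetic binary spike matrix, is given below. The data, the variable names, and the omission of the combinatorial normalization over patterns sharing the same population count are simplifications for illustration, not the authors' implementation.

import numpy as np

rng = np.random.default_rng(0)
# Synthetic binary spike matrix: T time bins x N neurons (1 = spike in bin).
T, N = 5000, 50
spikes = (rng.random((T, N)) < 0.1).astype(int)

# Ingredient 1: distribution of the population count K (neurons active per bin).
K = spikes.sum(axis=1)
p_K = np.bincount(K, minlength=N + 1) / T

# Ingredient 2: probability that neuron i fires given the population count,
# p(x_i = 1 | K = k); rows for unobserved k are left at zero.
p_fire_given_K = np.zeros((N + 1, N))
for k in range(N + 1):
    if np.any(K == k):
        p_fire_given_K[k] = spikes[K == k].mean(axis=0)

def pattern_logprob(x):
    # Log-probability of a binary pattern under the model, ignoring the
    # normalization over all patterns with the same K (omitted for brevity).
    k = int(x.sum())
    p = np.clip(p_fire_given_K[k], 1e-6, 1 - 1e-6)
    return np.log(p_K[k] + 1e-12) + float(np.sum(x * np.log(p) + (1 - x) * np.log(1 - p)))

print(pattern_logprob(spikes[0]))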
Zelt, Colin A.; Haines, Seth; Powers, Michael H.; Sheehan, Jacob; Rohdewald, Siegfried; Link, Curtis; Hayashi, Koichi; Zhao, Don; Zhou, Hua-wei; Burton, Bethany L.; Petersen, Uni K.; Bonal, Nedra D.; Doll, William E.
2013-01-01
Seismic refraction methods are used in environmental and engineering studies to image the shallow subsurface. We present a blind test of inversion and tomographic refraction analysis methods using a synthetic first-arrival-time dataset that was made available to the community in 2010. The data are realistic in terms of the near-surface velocity model, shot-receiver geometry and the data's frequency and added noise. Fourteen estimated models were determined by ten participants using eight different inversion algorithms, with the true model unknown to the participants until it was revealed at a session at the 2011 SAGEEP meeting. The estimated models are generally consistent in terms of their large-scale features, demonstrating the robustness of refraction data inversion in general, and the eight inversion algorithms in particular. When compared to the true model, all of the estimated models contain a smooth expression of its two main features: a large offset in the bedrock and the top of a steeply dipping low-velocity fault zone. The estimated models do not contain a subtle low-velocity zone and other fine-scale features, in accord with conventional wisdom. Together, the results support confidence in the reliability and robustness of modern refraction inversion and tomographic methods.
Information is not good enough: the transformation of health education in France in the late 1970s.
Berlivet, Luc
2008-01-01
The aim of this article is to provide a historical account of the many problems that arose in the making of large-scale health education campaigns in France, from the mid-1970s to the mid-1980s, as well as the solution explored at that time. Fascination for the alleged influence of mass media on human behaviour prompted a government keen on anything "modern" to implement the first ever large-scale health education campaign on the risks of "excessive smoking." However, the hyper-rationalistic approach to communication favoured by the French Committee for Health Education (FCHE), in charge of the campaign, prompted many questions concerning both the impact, and the political implications of their approach. The historical investigation described here is based on the study of various kinds of propaganda material; interviews with health education specialists and senior civil servants; and the systematic exploration of the archives produced by the FCHE. The analysis of the issues raised by this policy, as well as the answers provided by the protagonists themselves, sheds new light on pending questions in health education, such as the need to reconcile political acceptability and effectiveness, and the role that social sciences might play in this process.
Hoyo, Javier Del; Choi, Heejoo; Burge, James H; Kim, Geon-Hee; Kim, Dae Wook
2017-06-20
The control of surface errors as a function of spatial frequency is critical during the fabrication of modern optical systems. A large-scale surface figure error is controlled by a guided removal process, such as computer-controlled optical surfacing. Smaller-scale surface errors are controlled by polishing process parameters. Surface errors of only a few millimeters may degrade the performance of an optical system, causing background noise from scattered light and reducing imaging contrast for large optical systems. Conventionally, the microsurface roughness is often given by the root mean square at a high spatial frequency range, with errors within a 0.5×0.5 mm local surface map with 500×500 pixels. This surface specification is not adequate to fully describe the characteristics for advanced optical systems. The process for controlling and minimizing mid- to high-spatial frequency surface errors with periods of up to ∼2-3 mm was investigated for many optical fabrication conditions using the measured surface power spectral density (PSD) of a finished Zerodur optical surface. Then, the surface PSD was systematically related to various fabrication process parameters, such as the grinding methods, polishing interface materials, and polishing compounds. The retraceable experimental polishing conditions and processes used to produce an optimal optical surface PSD are presented.
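To illustrate what a surface PSD is in this context, the sketch below estimates a one-sided power spectral density from a synthetic 1D height profile using a windowed periodogram. The trace length, sampling interval, ripple period, and amplitudes are invented and do not correspond to the Zerodur measurements in the study.

import numpy as np

# Synthetic height profile: a 4 mm trace sampled every 1 micrometer, with a
# mid-spatial-frequency ripple (0.5 mm period) plus fine-scale roughness.
dx = 1.0e-6                                   # sample spacing, m
x = np.arange(0, 4.0e-3, dx)
rng = np.random.default_rng(0)
height = 5e-9 * np.sin(2 * np.pi * x / 0.5e-3) + 1e-9 * rng.standard_normal(x.size)

# One-sided PSD via a Hann-windowed periodogram; units are m^2 per (cycles/m).
window = np.hanning(height.size)
H = np.fft.rfft(height * window)
freqs = np.fft.rfftfreq(height.size, d=dx)    # spatial frequency, cycles/m
psd = 2.0 * dx * np.abs(H) ** 2 / np.sum(window ** 2)

# The ripple appears as a peak near 1 / 0.5 mm = 2000 cycles/m.
peak = freqs[1:][np.argmax(psd[1:])]
print(f"dominant spatial frequency ~ {peak:.0f} cycles/m")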
Emergence of a turbulent cascade in a quantum gas
NASA Astrophysics Data System (ADS)
Navon, Nir; Gaunt, Alexander L.; Smith, Robert P.; Hadzibabic, Zoran
2016-11-01
A central concept in the modern understanding of turbulence is the existence of cascades of excitations from large to small length scales, or vice versa. This concept was introduced in 1941 by Kolmogorov and Obukhov, and such cascades have since been observed in various systems, including interplanetary plasmas, supernovae, ocean waves and financial markets. Despite much progress, a quantitative understanding of turbulence remains a challenge, owing to the interplay between many length scales that makes theoretical simulations of realistic experimental conditions difficult. Here we observe the emergence of a turbulent cascade in a weakly interacting homogeneous Bose gas—a quantum fluid that can be theoretically described on all relevant length scales. We prepare a Bose-Einstein condensate in an optical box, drive it out of equilibrium with an oscillating force that pumps energy into the system at the largest length scale, study its nonlinear response to the periodic drive, and observe a gradual development of a cascade characterized by an isotropic power-law distribution in momentum space. We numerically model our experiments using the Gross-Pitaevskii equation and find excellent agreement with the measurements. Our experiments establish the uniform Bose gas as a promising new medium for investigating many aspects of turbulence, including the interplay between vortex and wave turbulence, and the relative importance of quantum and classical effects.
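As a schematic of how a cascade exponent is extracted from a measured momentum distribution, the sketch below fits a power law n(k) ∝ k^(-γ) by least squares in log-log space. The synthetic data and the exponent used to generate them are illustrative only and are not the measurements reported in the paper.

import numpy as np

rng = np.random.default_rng(0)
# Synthetic momentum distribution over an inertial range, n(k) ~ k^(-gamma),
# with multiplicative scatter; gamma = 3.5 here is an arbitrary choice.
k = np.logspace(0.2, 1.5, 25)                       # momentum, arbitrary units
n_k = 1.0e3 * k ** (-3.5) * np.exp(0.05 * rng.standard_normal(k.size))

# The slope of log n(k) versus log k gives the power-law exponent.
slope, intercept = np.polyfit(np.log10(k), np.log10(n_k), 1)
print(f"fitted exponent gamma ~ {-slope:.2f}")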
Challenges to Comparative Education: Between Retrospect and Expectation
NASA Astrophysics Data System (ADS)
Mitter, Wolfgang
1997-09-01
This article takes an autobiographical approach, which is applied in order to devise a panorama of Comparative Education, as it has developed during the past four decades. It identifies five "paradigms" which have been dominant in debates, inquiries and projects carried out by comparative educationists in various regions: East-West conflict (fifties and sixties); large-scale educational reforms (seventies); intercultural education in multicultural societies and gender issues (late seventies and eighties); transformation processes and post-modern "revolt" against the predominant theories of modernity (late eighties and nineties); and universalism versus cultural pluralism (nineties). The "changes" of these paradigms have been stimulated by developments and trends in the socioeconomic, political and cultural framework as well as by the continuous debates within the science. The current period seems to be primarily determined by the challenges to Comparative Education to define its response to the antagonism between universalism and cultural pluralism. Comparative studies may provide evidence for "reconciliatory" endeavour. They can, on the one hand, warn universalists of hasty "harmonisations", while, on the other hand, they can tell radical advocates of cultural pluralism that the commitment to one's ethnic, religious or national identity need not and should not close the doors to the search for universal trends and generally acceptable values.
NASA Astrophysics Data System (ADS)
Dutta, Koushik
2016-06-01
Radiocarbon, or 14C, is a radiometric dating method ideally suited for providing a chronological framework in archaeology and geosciences for timescales spanning the last 50,000 years. 14C is easily detectable in most common natural organic materials and has a half-life (5,730±40 years) relevant to these timescales. 14C produced from large-scale detonations of nuclear bombs between the 1950s and the early 1960s can be used for dating modern organic materials formed after the 1950s. Often these studies demand high-resolution chronology to resolve ages within a few decades to less than a few years. Despite developments in modern, high-precision 14C analytical methods, the applicability of 14C in high-resolution chronology is limited by short-term variations in atmospheric 14C in the past. This article reviews the roles of the principal natural drivers (e.g., solar magnetic activity and ocean circulation) and the anthropogenic perturbations (e.g., fossil fuel CO2 and 14C from nuclear and thermonuclear bombs) that are responsible for short-term 14C variations in the environment. Methods and challenges of high-resolution 14C dating are discussed.
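The basic age calculation implied by the half-life quoted above follows directly from exponential decay; a minimal sketch is given below. The helper name and the example fraction are illustrative, and note that conventional radiocarbon ages are, by convention, computed with the Libby half-life of 5568 years rather than the physical value before calibration.

import math

def radiocarbon_age(fraction_modern, half_life=5730.0):
    # Age in years from the measured 14C activity relative to the modern
    # standard: t = (T_half / ln 2) * ln(1 / F).
    return (half_life / math.log(2.0)) * math.log(1.0 / fraction_modern)

# A sample retaining 25% of modern 14C is about two half-lives old.
print(round(radiocarbon_age(0.25)))                     # ~11460 years
print(round(radiocarbon_age(0.25, half_life=5568.0)))   # Libby-convention age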
Northward extent of East Asian monsoon covaries with intensity on orbital and millennial timescales.
Goldsmith, Yonaton; Broecker, Wallace S; Xu, Hai; Polissar, Pratigya J; deMenocal, Peter B; Porat, Naomi; Lan, Jianghu; Cheng, Peng; Zhou, Weijian; An, Zhisheng
2017-02-21
The magnitude, rate, and extent of past and future East Asian monsoon (EAM) rainfall fluctuations remain unresolved. Here, late Pleistocene-Holocene EAM rainfall intensity is reconstructed using a well-dated northeastern China closed-basin lake area record located at the modern northwestern fringe of the EAM. The EAM intensity and northern extent alternated rapidly between wet and dry periods on time scales of centuries. Lake levels were 60 m higher than present during the early and middle Holocene, requiring a twofold increase in annual rainfall, which, based on modern rainfall distribution, requires a ∼400 km northward expansion/migration of the EAM. The lake record is highly correlated with both northern and southern Chinese cave deposit isotope records, supporting rainfall "intensity based" interpretations of these deposits as opposed to an alternative "water vapor sourcing" interpretation. These results indicate that EAM intensity and the northward extent covary on orbital and millennial timescales. The termination of wet conditions at 5.5 ka BP (∼35 m lake drop) triggered a large cultural collapse of Early Neolithic cultures in north China, and possibly promoted the emergence of complex societies of the Late Neolithic.
Global late Quaternary megafauna extinctions linked to humans, not climate change.
Sandom, Christopher; Faurby, Søren; Sandel, Brody; Svenning, Jens-Christian
2014-07-22
The late Quaternary megafauna extinction was a severe global-scale event. Two factors, climate change and modern humans, have received broad support as the primary drivers, but their absolute and relative importance remains controversial. To date, focus has been on the extinction chronology of individual or small groups of species, specific geographical regions or macroscale studies at very coarse geographical and taxonomic resolution, limiting the possibility of adequately testing the proposed hypotheses. We present, to our knowledge, the first global analysis of this extinction based on comprehensive country-level data on the geographical distribution of all large mammal species (more than or equal to 10 kg) that have gone globally or continentally extinct between the beginning of the Last Interglacial at 132,000 years BP and the late Holocene 1000 years BP, testing the relative roles played by glacial-interglacial climate change and humans. We show that the severity of extinction is strongly tied to hominin palaeobiogeography, with at most a weak, Eurasia-specific link to climate change. This first species-level macroscale analysis at relatively high geographical resolution provides strong support for modern humans as the primary driver of the worldwide megafauna losses during the late Quaternary.
Isolation and Purification of Biotechnological Products
NASA Astrophysics Data System (ADS)
Hubbuch, Jürgen; Kula, Maria-Regina
2007-05-01
The production of modern pharma proteins is one of the most rapidly growing fields in biotechnology. The overall development and production is a complex task ranging from strain development and cultivation to the purification and formulation of the drug. Downstream processing, however, still accounts for the major part of production costs. This is mainly due to the high demands on purity, and thus safety, of the final product and results in processes with a sequence of typically more than 10 unit operations. Consequently, even if each process step operated at near-optimal yield, a very significant amount of product would be lost. The majority of unit operations applied in downstream processing have a long history in the field of chemical and process engineering; nevertheless, mathematical descriptions of the respective processes and the economical large-scale production of modern pharmaceutical products are hampered by the complexity of the biological feedstock, especially the high molecular weight and limited stability of proteins. In order to develop new operational steps as well as a successful overall process, a deeper understanding of the thermodynamics and physics behind the applied processes, as well as their implications for the product, is thus a necessary prerequisite.
Global late Quaternary megafauna extinctions linked to humans, not climate change
Sandom, Christopher; Faurby, Søren; Sandel, Brody; Svenning, Jens-Christian
2014-01-01
The late Quaternary megafauna extinction was a severe global-scale event. Two factors, climate change and modern humans, have received broad support as the primary drivers, but their absolute and relative importance remains controversial. To date, focus has been on the extinction chronology of individual or small groups of species, specific geographical regions or macroscale studies at very coarse geographical and taxonomic resolution, limiting the possibility of adequately testing the proposed hypotheses. We present, to our knowledge, the first global analysis of this extinction based on comprehensive country-level data on the geographical distribution of all large mammal species (more than or equal to 10 kg) that have gone globally or continentally extinct between the beginning of the Last Interglacial at 132 000 years BP and the late Holocene 1000 years BP, testing the relative roles played by glacial–interglacial climate change and humans. We show that the severity of extinction is strongly tied to hominin palaeobiogeography, with at most a weak, Eurasia-specific link to climate change. This first species-level macroscale analysis at relatively high geographical resolution provides strong support for modern humans as the primary driver of the worldwide megafauna losses during the late Quaternary. PMID:24898370
Northward extent of East Asian monsoon covaries with intensity on orbital and millennial timescales
Goldsmith, Yonaton; Broecker, Wallace S.; Xu, Hai; Polissar, Pratigya J.; deMenocal, Peter B.; Porat, Naomi; Lan, Jianghu; Cheng, Peng; Zhou, Weijian; An, Zhisheng
2017-01-01
The magnitude, rate, and extent of past and future East Asian monsoon (EAM) rainfall fluctuations remain unresolved. Here, late Pleistocene–Holocene EAM rainfall intensity is reconstructed using a well-dated northeastern China closed-basin lake area record located at the modern northwestern fringe of the EAM. The EAM intensity and northern extent alternated rapidly between wet and dry periods on time scales of centuries. Lake levels were 60 m higher than present during the early and middle Holocene, requiring a twofold increase in annual rainfall, which, based on modern rainfall distribution, requires a ∼400 km northward expansion/migration of the EAM. The lake record is highly correlated with both northern and southern Chinese cave deposit isotope records, supporting rainfall “intensity based” interpretations of these deposits as opposed to an alternative “water vapor sourcing” interpretation. These results indicate that EAM intensity and the northward extent covary on orbital and millennial timescales. The termination of wet conditions at 5.5 ka BP (∼35 m lake drop) triggered a large cultural collapse of Early Neolithic cultures in north China, and possibly promoted the emergence of complex societies of the Late Neolithic. PMID:28167754
Northward extent of East Asian monsoon covaries with intensity on orbital and millennial timescales
NASA Astrophysics Data System (ADS)
Goldsmith, Yonaton; Broecker, Wallace S.; Xu, Hai; Polissar, Pratigya J.; deMenocal, Peter B.; Porat, Naomi; Lan, Jianghu; Cheng, Peng; Zhou, Weijian; An, Zhisheng
2017-02-01
The magnitude, rate, and extent of past and future East Asian monsoon (EAM) rainfall fluctuations remain unresolved. Here, late Pleistocene-Holocene EAM rainfall intensity is reconstructed using a well-dated northeastern China closed-basin lake area record located at the modern northwestern fringe of the EAM. The EAM intensity and northern extent alternated rapidly between wet and dry periods on time scales of centuries. Lake levels were 60 m higher than present during the early and middle Holocene, requiring a twofold increase in annual rainfall, which, based on modern rainfall distribution, requires a ˜400 km northward expansion/migration of the EAM. The lake record is highly correlated with both northern and southern Chinese cave deposit isotope records, supporting rainfall “intensity based” interpretations of these deposits as opposed to an alternative “water vapor sourcing” interpretation. These results indicate that EAM intensity and the northward extent covary on orbital and millennial timescales. The termination of wet conditions at 5.5 ka BP (˜35 m lake drop) triggered a large cultural collapse of Early Neolithic cultures in north China, and possibly promoted the emergence of complex societies of the Late Neolithic.
Horizontal transfer of an adaptive chimeric photoreceptor from bryophytes to ferns
Li, Fay-Wei; Villarreal, Juan Carlos; Kelly, Steven; Rothfels, Carl J.; Melkonian, Michael; Frangedakis, Eftychios; Ruhsam, Markus; Sigel, Erin M.; Der, Joshua P.; Pittermann, Jarmila; Burge, Dylan O.; Pokorny, Lisa; Larsson, Anders; Chen, Tao; Weststrand, Stina; Thomas, Philip; Carpenter, Eric; Zhang, Yong; Tian, Zhijian; Chen, Li; Yan, Zhixiang; Zhu, Ying; Sun, Xiao; Wang, Jun; Stevenson, Dennis W.; Crandall-Stotler, Barbara J.; Shaw, A. Jonathan; Deyholos, Michael K.; Soltis, Douglas E.; Graham, Sean W.; Windham, Michael D.; Langdale, Jane A.; Wong, Gane Ka-Shu; Mathews, Sarah; Pryer, Kathleen M.
2014-01-01
Ferns are well known for their shade-dwelling habits. Their ability to thrive under low-light conditions has been linked to the evolution of a novel chimeric photoreceptor—neochrome—that fuses red-sensing phytochrome and blue-sensing phototropin modules into a single gene, thereby optimizing phototropic responses. Despite being implicated in facilitating the diversification of modern ferns, the origin of neochrome has remained a mystery. We present evidence for neochrome in hornworts (a bryophyte lineage) and demonstrate that ferns acquired neochrome from hornworts via horizontal gene transfer (HGT). Fern neochromes are nested within hornwort neochromes in our large-scale phylogenetic reconstructions of phototropin and phytochrome gene families. Divergence date estimates further support the HGT hypothesis, with fern and hornwort neochromes diverging 179 Mya, long after the split between the two plant lineages (at least 400 Mya). By analyzing the draft genome of the hornwort Anthoceros punctatus, we also discovered a previously unidentified phototropin gene that likely represents the ancestral lineage of the neochrome phototropin module. Thus, a neochrome originating in hornworts was transferred horizontally to ferns, where it may have played a significant role in the diversification of modern ferns. PMID:24733898
Lucas, Peter W; Philip, Swapna M; Al-Qeoud, Dareen; Al-Draihim, Nuha; Saji, Sreeja; van Casteren, Adam
2016-01-01
Mammalian enamel, the contact dental tissue, is something of an enigma. It is almost entirely made of hydroxyapatite, yet exhibits very different mechanical behavior to a homogeneous block of the same mineral. Recent approaches suggest that its hierarchical composite form, similar to other biological hard tissues, leads to a mechanical performance that depends very much on the scale of measurement. The stiffness of the material is predicted to be highest at the nanoscale, being sacrificed to produce a high toughness at the largest scale, that is, at the level of the tooth crown itself. Yet because virtually all this research has been conducted only on human (or sometimes "bovine") enamel, there has been little regard for structural variation of the tissue considered as evolutionary adaptation to diet. What is mammalian enamel optimized for? We suggest that there are competing selective pressures. We suggest that the structural characteristics that optimize enamel to resist large-scale fractures, such as crown failures, are very different to those that resist wear (small-scale fracture). While enamel is always designed for damage tolerance, this may be suboptimal in the enamel of some species, including modern humans (which have been the target of most investigations), in order to counteract wear. The experimental part of this study introduces novel techniques that help to assess resistance at the nanoscale. © 2015 Wiley Periodicals, Inc.
Rogers, Lauren A; Schindler, Daniel E; Lisi, Peter J; Holtgrieve, Gordon W; Leavitt, Peter R; Bunting, Lynda; Finney, Bruce P; Selbie, Daniel T; Chen, Guangjie; Gregory-Eaves, Irene; Lisac, Mark J; Walsh, Patrick B
2013-01-29
Observational data from the past century have highlighted the importance of interdecadal modes of variability in fish population dynamics, but how these patterns of variation fit into a broader temporal and spatial context remains largely unknown. We analyzed time series of stable nitrogen isotopes from the sediments of 20 sockeye salmon nursery lakes across western Alaska to characterize temporal and spatial patterns in salmon abundance over the past ∼500 y. Although some stocks varied on interdecadal time scales (30- to 80-y cycles), centennial-scale variation, undetectable in modern-day catch records and survey data, has dominated salmon population dynamics over the past 500 y. Before 1900, variation in abundance was clearly not synchronous among stocks, and the only temporal signal common to lake sediment records from this region was the onset of commercial fishing in the late 1800s. Thus, historical changes in climate did not synchronize stock dynamics over centennial time scales, emphasizing that ecosystem complexity can produce a diversity of ecological responses to regional climate forcing. Our results show that marine fish populations may alternate between naturally driven periods of high and low abundance over time scales of decades to centuries and suggest that management models that assume time-invariant productivity or carrying capacity parameters may be poor representations of the biological reality in these systems.
Towards Personalized Cardiology: Multi-Scale Modeling of the Failing Heart
Amr, Ali; Neumann, Dominik; Georgescu, Bogdan; Seegerer, Philipp; Kamen, Ali; Haas, Jan; Frese, Karen S.; Irawati, Maria; Wirsz, Emil; King, Vanessa; Buss, Sebastian; Mereles, Derliz; Zitron, Edgar; Keller, Andreas; Katus, Hugo A.; Comaniciu, Dorin; Meder, Benjamin
2015-01-01
Background: Despite modern pharmacotherapy and advanced implantable cardiac devices, overall prognosis and quality of life of HF patients remain poor. This is in part due to insufficient patient stratification and lack of individualized therapy planning, resulting in less effective treatments and a significant number of non-responders. Methods and Results: State-of-the-art clinical phenotyping was acquired, including magnetic resonance imaging (MRI) and biomarker assessment. An individualized, multi-scale model of heart function covering cardiac anatomy, electrophysiology, biomechanics and hemodynamics was estimated using a robust framework. The model was computed on n=46 HF patients, showing for the first time that advanced multi-scale models can be fitted consistently on large cohorts. Novel multi-scale parameters derived from the model of all cases were analyzed and compared against clinical parameters, cardiac imaging, lab tests and survival scores to evaluate the explicative power of the model and its potential for better patient stratification. Model validation was pursued by comparing clinical parameters that were not used in the fitting process against model parameters. Conclusion: This paper illustrates how advanced multi-scale models can complement cardiovascular imaging and how they could be applied in patient care. Based on obtained results, it becomes conceivable that, after thorough validation, such heart failure models could be applied for patient management and therapy planning in the future, as we illustrate in one patient of our cohort who received CRT-D implantation. PMID:26230546
Rogers, Lauren A.; Schindler, Daniel E.; Lisi, Peter J.; Holtgrieve, Gordon W.; Leavitt, Peter R.; Bunting, Lynda; Finney, Bruce P.; Selbie, Daniel T.; Chen, Guangjie; Gregory-Eaves, Irene; Lisac, Mark J.; Walsh, Patrick B.
2013-01-01
Observational data from the past century have highlighted the importance of interdecadal modes of variability in fish population dynamics, but how these patterns of variation fit into a broader temporal and spatial context remains largely unknown. We analyzed time series of stable nitrogen isotopes from the sediments of 20 sockeye salmon nursery lakes across western Alaska to characterize temporal and spatial patterns in salmon abundance over the past ∼500 y. Although some stocks varied on interdecadal time scales (30- to 80-y cycles), centennial-scale variation, undetectable in modern-day catch records and survey data, has dominated salmon population dynamics over the past 500 y. Before 1900, variation in abundance was clearly not synchronous among stocks, and the only temporal signal common to lake sediment records from this region was the onset of commercial fishing in the late 1800s. Thus, historical changes in climate did not synchronize stock dynamics over centennial time scales, emphasizing that ecosystem complexity can produce a diversity of ecological responses to regional climate forcing. Our results show that marine fish populations may alternate between naturally driven periods of high and low abundance over time scales of decades to centuries and suggest that management models that assume time-invariant productivity or carrying capacity parameters may be poor representations of the biological reality in these systems. PMID:23322737
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pask, J E; Sukumar, N; Guney, M
2011-02-28
Over the course of the past two decades, quantum mechanical calculations have emerged as a key component of modern materials research. However, the solution of the required quantum mechanical equations is a formidable task and this has severely limited the range of materials systems which can be investigated by such accurate, quantum mechanical means. The current state of the art for large-scale quantum simulations is the planewave (PW) method, as implemented in now ubiquitous VASP, ABINIT, and QBox codes, among many others. However, since the PW method uses a global Fourier basis, with strictly uniform resolution at all points inmore » space, and in which every basis function overlaps every other at every point, it suffers from substantial inefficiencies in calculations involving atoms with localized states, such as first-row and transition-metal atoms, and requires substantial nonlocal communications in parallel implementations, placing critical limits on scalability. In recent years, real-space methods such as finite-differences (FD) and finite-elements (FE) have been developed to address these deficiencies by reformulating the required quantum mechanical equations in a strictly local representation. However, while addressing both resolution and parallel-communications problems, such local real-space approaches have been plagued by one key disadvantage relative to planewaves: excessive degrees of freedom (grid points, basis functions) needed to achieve the required accuracies. And so, despite critical limitations, the PW method remains the standard today. In this work, we show for the first time that this key remaining disadvantage of real-space methods can in fact be overcome: by building known atomic physics into the solution process using modern partition-of-unity (PU) techniques in finite element analysis. Indeed, our results show order-of-magnitude reductions in basis size relative to state-of-the-art planewave based methods. The method developed here is completely general, applicable to any crystal symmetry and to both metals and insulators alike. We have developed and implemented a full self-consistent Kohn-Sham method, including both total energies and forces for molecular dynamics, and developed a full MPI parallel implementation for large-scale calculations. We have applied the method to the gamut of physical systems, from simple insulating systems with light atoms to complex d- and f-electron systems, requiring large numbers of atomic-orbital enrichments. In every case, the new PU FE method attained the required accuracies with substantially fewer degrees of freedom, typically by an order of magnitude or more, than the current state-of-the-art PW method. Finally, our initial MPI implementation has shown excellent parallel scaling of the most time-critical parts of the code up to 1728 processors, with clear indications of what will be required to achieve comparable scaling for the rest. Having shown that the key remaining disadvantage of real-space methods can in fact be overcome, the work has attracted significant attention: with sixteen invited talks, both domestic and international, so far; two papers published and another in preparation; and three new university and/or national laboratory collaborations, securing external funding to pursue a number of related research directions. 
Having demonstrated the proof of principle, work now centers on the necessary extensions and optimizations required to bring the prototype method and code delivered here to production applications.
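As a rough illustration of the partition-of-unity idea described above (a minimal 1D sketch, not the authors' implementation; the mesh, the exponential orbital shape and all parameters are invented for illustration), the enriched basis is simply the product of standard finite-element hat functions, which sum to one everywhere, with atom-centred orbital-like functions:

```python
import numpy as np

# 1D mesh nodes and the standard piecewise-linear ("hat") finite-element functions.
nodes = np.linspace(-4.0, 4.0, 9)

def hat(i, x):
    """Hat function attached to node i (one-sided at the domain ends)."""
    xi = nodes[i]
    y = np.zeros_like(x)
    if i > 0:                                   # rising ramp from the left neighbour
        xl = nodes[i - 1]
        m = (x >= xl) & (x <= xi)
        y[m] = (x[m] - xl) / (xi - xl)
    if i < len(nodes) - 1:                      # falling ramp to the right neighbour
        xr = nodes[i + 1]
        m = (x >= xi) & (x <= xr)
        y[m] = (xr - x[m]) / (xr - xi)
    return y

x = np.linspace(-4.0, 4.0, 801)
hats = np.array([hat(i, x) for i in range(len(nodes))])
assert np.allclose(hats.sum(axis=0), 1.0)       # partition of unity: sum_i N_i(x) = 1

# Enrichment: multiply each hat by an atom-like orbital centred on its node, so that
# known atomic physics enters the basis while strict locality is preserved.
def orbital(x, centre, alpha=1.5):
    return np.exp(-alpha * np.abs(x - centre))  # crude 1s-like shape, illustrative only

enriched = np.array([hat(i, x) * orbital(x, nodes[i]) for i in range(len(nodes))])
print(hats.shape, enriched.shape)               # the PU-FE basis is the union of both sets
```

Because each enriched function inherits the compact support of its hat, locality (and hence parallel scalability) is retained while far fewer functions are needed than with a uniform-resolution basis.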
NASA Astrophysics Data System (ADS)
Fekete, Tamás
2018-05-01
Structural integrity calculations play a crucial role in designing large-scale pressure vessels. Used in the electric power generation industry, these kinds of vessels undergo extensive safety analyses and certification procedures before deemed feasible for future long-term operation. The calculations are nowadays directed and supported by international standards and guides based on state-of-the-art results of applied research and technical development. However, their ability to predict a vessel's behavior under accidental circumstances after long-term operation is largely limited by the strong dependence of the analysis methodology on empirical models that are correlated to the behavior of structural materials and their changes during material aging. Recently a new scientific engineering paradigm, structural integrity has been developing that is essentially a synergistic collaboration between a number of scientific and engineering disciplines, modeling, experiments and numerics. Although the application of the structural integrity paradigm highly contributed to improving the accuracy of safety evaluations of large-scale pressure vessels, the predictive power of the analysis methodology has not yet improved significantly. This is due to the fact that already existing structural integrity calculation methodologies are based on the widespread and commonly accepted 'traditional' engineering thermal stress approach, which is essentially based on the weakly coupled model of thermomechanics and fracture mechanics. Recently, a research has been initiated in MTA EK with the aim to review and evaluate current methodologies and models applied in structural integrity calculations, including their scope of validity. The research intends to come to a better understanding of the physical problems that are inherently present in the pool of structural integrity problems of reactor pressure vessels, and to ultimately find a theoretical framework that could serve as a well-grounded theoretical foundation for a new modeling framework of structural integrity. This paper presents the first findings of the research project.
Lockley, Martin G; McCrea, Richard T; Buckley, Lisa G; Lim, Jong Deock; Matthews, Neffra A; Breithaupt, Brent H; Houck, Karen J; Gierliński, Gerard D; Surmik, Dawid; Kim, Kyung Soo; Xing, Lida; Kong, Dal Yong; Cart, Ken; Martin, Jason; Hadden, Glade
2016-01-07
Relationships between non-avian theropod dinosaurs and extant and fossil birds are a major focus of current paleobiological research. Despite extensive phylogenetic and morphological support, behavioural evidence is mostly ambiguous and does not usually fossilize. Thus, inferences that dinosaurs, especially theropods, displayed behaviour analogous to modern birds are intriguing but speculative. Here we present extensive and geographically widespread physical evidence of substrate scraping behavior by large theropods considered as compelling evidence of "display arenas" or leks, and consistent with "nest scrape display" behaviour among many extant ground-nesting birds. Large scrapes, up to 2 m in diameter, occur abundantly at several Cretaceous sites in Colorado. They constitute a previously unknown category of large dinosaurian trace fossil, inferred to fill gaps in our understanding of early phases in the breeding cycle of theropods. The trace makers were probably lekking species that were seasonally active at large display arena sites. Such scrapes indicate stereotypical avian behaviour hitherto unknown among Cretaceous theropods, and most likely associated with territorial activity in the breeding season. The scrapes most probably occur near nesting colonies, as yet unknown or no longer preserved in the immediate study areas. Thus, they provide clues to paleoenvironments where such nesting sites occurred.
NASA Astrophysics Data System (ADS)
Lockley, Martin G.; McCrea, Richard T.; Buckley, Lisa G.; Deock Lim, Jong; Matthews, Neffra A.; Breithaupt, Brent H.; Houck, Karen J.; Gierliński, Gerard D.; Surmik, Dawid; Soo Kim, Kyung; Xing, Lida; Yong Kong, Dal; Cart, Ken; Martin, Jason; Hadden, Glade
2016-01-01
Relationships between non-avian theropod dinosaurs and extant and fossil birds are a major focus of current paleobiological research. Despite extensive phylogenetic and morphological support, behavioural evidence is mostly ambiguous and does not usually fossilize. Thus, inferences that dinosaurs, especially theropods, displayed behaviour analogous to modern birds are intriguing but speculative. Here we present extensive and geographically widespread physical evidence of substrate scraping behavior by large theropods considered as compelling evidence of “display arenas” or leks, and consistent with “nest scrape display” behaviour among many extant ground-nesting birds. Large scrapes, up to 2 m in diameter, occur abundantly at several Cretaceous sites in Colorado. They constitute a previously unknown category of large dinosaurian trace fossil, inferred to fill gaps in our understanding of early phases in the breeding cycle of theropods. The trace makers were probably lekking species that were seasonally active at large display arena sites. Such scrapes indicate stereotypical avian behaviour hitherto unknown among Cretaceous theropods, and most likely associated with territorial activity in the breeding season. The scrapes most probably occur near nesting colonies, as yet unknown or no longer preserved in the immediate study areas. Thus, they provide clues to paleoenvironments where such nesting sites occurred.
Lockley, Martin G.; McCrea, Richard T.; Buckley, Lisa G.; Deock Lim, Jong; Matthews, Neffra A.; Breithaupt, Brent H.; Houck, Karen J.; Gierliński, Gerard D.; Surmik, Dawid; Soo Kim, Kyung; Xing, Lida; Yong Kong, Dal; Cart, Ken; Martin, Jason; Hadden, Glade
2016-01-01
Relationships between non-avian theropod dinosaurs and extant and fossil birds are a major focus of current paleobiological research. Despite extensive phylogenetic and morphological support, behavioural evidence is mostly ambiguous and does not usually fossilize. Thus, inferences that dinosaurs, especially theropods, displayed behaviour analogous to modern birds are intriguing but speculative. Here we present extensive and geographically widespread physical evidence of substrate scraping behavior by large theropods considered as compelling evidence of “display arenas” or leks, and consistent with “nest scrape display” behaviour among many extant ground-nesting birds. Large scrapes, up to 2 m in diameter, occur abundantly at several Cretaceous sites in Colorado. They constitute a previously unknown category of large dinosaurian trace fossil, inferred to fill gaps in our understanding of early phases in the breeding cycle of theropods. The trace makers were probably lekking species that were seasonally active at large display arena sites. Such scrapes indicate stereotypical avian behaviour hitherto unknown among Cretaceous theropods, and most likely associated with territorial activity in the breeding season. The scrapes most probably occur near nesting colonies, as yet unknown or no longer preserved in the immediate study areas. Thus, they provide clues to paleoenvironments where such nesting sites occurred. PMID:26741567
Built-In Data-Flow Integration Testing in Large-Scale Component-Based Systems
NASA Astrophysics Data System (ADS)
Piel, Éric; Gonzalez-Sanchez, Alberto; Gross, Hans-Gerhard
Modern large-scale component-based applications and service ecosystems are built following a number of different component models and architectural styles, such as the data-flow architectural style. In this style, each building block receives data from a previous one in the flow and sends output data to other components. This organisation expresses information flows adequately, and also favours decoupling between the components, leading to easier maintenance and quicker evolution of the system. Integration testing is a major means to ensure the quality of large systems. Their size and complexity, together with the fact that they are developed and maintained by several stakeholders, make Built-In Testing (BIT) an attractive approach to manage their integration testing. However, so far no technique has been proposed that combines BIT and data-flow integration testing. We have introduced the notion of a virtual component in order to realize such a combination. It makes it possible to define, using BIT, the behaviour of several components assembled to process a flow of data. Test cases are defined in a way that makes them simple to write and flexible to adapt. We present two implementations of our proposed virtual component integration testing technique, and we extend our previous proposal to detect and handle errors in the definition by the user. The evaluation of the virtual component testing approach suggests that more issues can be detected in systems with data-flows than through other integration testing approaches.
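A minimal sketch of the virtual-component idea, under the assumption that a data-flow component can be modelled as a callable transforming one datum into the next (a Python stand-in, not the authors' framework; the names and the example flow are invented):

```python
from typing import Callable, List

class Component:
    """A data-flow building block: receives a datum, emits a transformed datum."""
    def __init__(self, name: str, fn: Callable):
        self.name, self.fn = name, fn
    def process(self, datum):
        return self.fn(datum)

class VirtualComponent:
    """Groups several components of a flow so their combined behaviour can be
    tested as one unit (built-in, data-flow integration testing)."""
    def __init__(self, components: List[Component]):
        self.components = components
    def process(self, datum):
        for c in self.components:
            datum = c.process(datum)
        return datum
    def built_in_test(self, test_cases):
        failures = []
        for given, expected in test_cases:
            got = self.process(given)
            if got != expected:
                failures.append((given, expected, got))
        return failures

# Example flow: parse -> scale -> format, tested as a single virtual component.
flow = VirtualComponent([
    Component("parse",  lambda s: float(s)),
    Component("scale",  lambda x: x * 10.0),
    Component("format", lambda x: f"{x:.1f}"),
])
assert flow.built_in_test([("1.5", "15.0"), ("0", "0.0")]) == []
```

The point of the pattern is that the built-in test exercises the assembled sub-flow as one unit, rather than each component in isolation.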
Ocean-atmosphere forcing of South American tropical paleoclimate, LGM to present
NASA Astrophysics Data System (ADS)
Baker, P. A.; Fritz, S. C.; Dwyer, G. S.; Rigsby, C. A.; Silva, C. G.; Burns, S. J.
2012-12-01
Because of many recent terrestrial paleoclimatic and marine paleoceanographic records, late Quaternary South American tropical paleoclimate is as well understood as that anywhere in the world. While lessons learned from the recent instrumental record of climate are informative, this record is too short to capture much of the lower frequency variability encountered in the paleoclimate records and much of the observed paleoclimate is without modern analogue. This paleoclimate is known to be regionally variable with significant differences both north and south of the equator and between the western high Andes and eastern lowlands of the Amazon and Nordeste Brazil. Various extrinsic forcing mechanisms affected climate throughout the period, including global concentrations of GHGs, Northern Hemisphere ice sheet forcing, seasonal insolation forcing of the South American summer monsoon (SASM), millennial-scale Atlantic forcing, and Pacific forcing of the large-scale Walker circulation. The magnitude of the climate response to these forcings varied temporally, largely because of the varying amplitude of the forcing itself. For example, during the last glacial, large-amplitude north Atlantic forcing during Heinrich 1 and the LGM itself, led to wet (dry) conditions south (north) of the equator. During the Holocene, Atlantic forcing was lower amplitude, thus seasonal insolation forcing generally predominated with a weaker-than-normal SASM during the early Holocene resulting in dry conditions in the south-western tropics and wet conditions in the eastern lowlands and Nordeste; in the late Holocene seasonal insolation reached a maximum in the southern tropics and climate conditions reversed.
On the structure and stability of magnetic tower jets
Huarte-Espinosa, M.; Frank, A.; Blackman, E. G.; ...
2012-09-05
Modern theoretical models of astrophysical jets combine accretion, rotation, and magnetic fields to launch and collimate supersonic flows from a central source. Near the source, magnetic field strengths must be large enough to collimate the jet requiring that the Poynting flux exceeds the kinetic energy flux. The extent to which the Poynting flux dominates kinetic energy flux at large distances from the engine distinguishes two classes of models. In magneto-centrifugal launch models, magnetic fields dominate only at scales <~ 100 engine radii, after which the jets become hydrodynamically dominated (HD). By contrast, in Poynting flux dominated (PFD) magnetic tower models, the field dominates even out to much larger scales. To compare the large distance propagation differences of these two paradigms, we perform three-dimensional ideal magnetohydrodynamic adaptive mesh refinement simulations of both HD and PFD stellar jets formed via the same energy flux. We also compare how thermal energy losses and rotation of the jet base affect the stability in these jets. For the conditions described, we show that PFD and HD exhibit observationally distinguishable features: PFD jets are lighter, slower, and less stable than HD jets. Here, unlike HD jets, PFD jets develop current-driven instabilities that are exacerbated as cooling and rotation increase, resulting in jets that are clumpier than those in the HD limit. Our PFD jet simulations also resemble the magnetic towers that have been recently created in laboratory astrophysical jet experiments.
Shi, Yulin; Veidenbaum, Alexander V; Nicolau, Alex; Xu, Xiangmin
2015-01-15
Modern neuroscience research demands computing power. Neural circuit mapping studies such as those using laser scanning photostimulation (LSPS) produce large amounts of data and require intensive computation for post hoc processing and analysis. Here we report on the design and implementation of a cost-effective desktop computer system for accelerated experimental data processing with recent GPU computing technology. A new version of Matlab software with GPU enabled functions is used to develop programs that run on Nvidia GPUs to harness their parallel computing power. We evaluated both the central processing unit (CPU) and GPU-enabled computational performance of our system in benchmark testing and practical applications. The experimental results show that the GPU-CPU co-processing of simulated data and actual LSPS experimental data clearly outperformed the multi-core CPU with up to a 22× speedup, depending on computational tasks. Further, we present a comparison of numerical accuracy between GPU and CPU computation to verify the precision of GPU computation. In addition, we show how GPUs can be effectively adapted to improve the performance of commercial image processing software such as Adobe Photoshop. To our best knowledge, this is the first demonstration of GPU application in neural circuit mapping and electrophysiology-based data processing. Together, GPU enabled computation enhances our ability to process large-scale data sets derived from neural circuit mapping studies, allowing for increased processing speeds while retaining data precision. Copyright © 2014 Elsevier B.V. All rights reserved.
Shi, Yulin; Veidenbaum, Alexander V.; Nicolau, Alex; Xu, Xiangmin
2014-01-01
Background Modern neuroscience research demands computing power. Neural circuit mapping studies such as those using laser scanning photostimulation (LSPS) produce large amounts of data and require intensive computation for post-hoc processing and analysis. New Method Here we report on the design and implementation of a cost-effective desktop computer system for accelerated experimental data processing with recent GPU computing technology. A new version of Matlab software with GPU enabled functions is used to develop programs that run on Nvidia GPUs to harness their parallel computing power. Results We evaluated both the central processing unit (CPU) and GPU-enabled computational performance of our system in benchmark testing and practical applications. The experimental results show that the GPU-CPU co-processing of simulated data and actual LSPS experimental data clearly outperformed the multi-core CPU with up to a 22x speedup, depending on computational tasks. Further, we present a comparison of numerical accuracy between GPU and CPU computation to verify the precision of GPU computation. In addition, we show how GPUs can be effectively adapted to improve the performance of commercial image processing software such as Adobe Photoshop. Comparison with Existing Method(s) To our best knowledge, this is the first demonstration of GPU application in neural circuit mapping and electrophysiology-based data processing. Conclusions Together, GPU enabled computation enhances our ability to process large-scale data sets derived from neural circuit mapping studies, allowing for increased processing speeds while retaining data precision. PMID:25277633
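For orientation only, a generic Python sketch of the same GPU-CPU co-processing pattern (this is not the authors' Matlab pipeline; CuPy, the array sizes and the FFT low-pass step are stand-ins chosen for illustration, and the GPU path assumes an NVIDIA card with CuPy installed):

```python
import time
import numpy as np

try:
    import cupy as cp              # GPU path (requires an NVIDIA GPU and CuPy)
    xp = cp
except ImportError:                # CPU fallback so the sketch still runs anywhere
    xp = np

def bandpass_stack(frames):
    """FFT-based low-pass filtering of an image stack, a stand-in for map post-processing."""
    f = xp.fft.fft2(frames, axes=(-2, -1))
    ny, nx = frames.shape[-2:]
    ky = xp.fft.fftfreq(ny)[:, None]
    kx = xp.fft.fftfreq(nx)[None, :]
    mask = (ky**2 + kx**2) < 0.05**2          # crude low-pass mask in Fourier space
    return xp.real(xp.fft.ifft2(f * mask, axes=(-2, -1)))

frames = xp.asarray(np.random.rand(64, 512, 512).astype(np.float32))
t0 = time.perf_counter()
out = bandpass_stack(frames)
if xp is not np:
    cp.cuda.Stream.null.synchronize()         # wait for the GPU before reading the clock
print(f"processed {frames.shape[0]} frames in {time.perf_counter() - t0:.2f} s")
```

Timing the same call with `xp = np` versus `xp = cp` gives the kind of CPU/GPU comparison reported in the study, though the actual speedup depends strongly on the task and hardware.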
Birds of a Feather: Neanderthal Exploitation of Raptors and Corvids
Finlayson, Clive; Brown, Kimberly; Blasco, Ruth; Rosell, Jordi; Negro, Juan José; Finlayson, Geraldine; Sánchez Marco, Antonio; Giles Pacheco, Francisco; Rodríguez Vidal, Joaquín; Carrión, José S.; Fa, Darren A.; Rodríguez Llanes, José M.
2012-01-01
The hypothesis that Neanderthals exploited birds for the use of their feathers or claws as personal ornaments in symbolic behaviour is revolutionary as it assigns unprecedented cognitive abilities to these hominins. This inference, however, is based on modest faunal samples and thus may not represent a regular or systematic behaviour. Here we address this issue by looking for evidence of such behaviour across a large temporal and geographical framework. Our analyses try to answer four main questions: 1) does a Neanderthal to raptor-corvid connection exist at a large scale, thus avoiding associations that might be regarded as local in space or time?; 2) did Middle (associated with Neanderthals) and Upper Palaeolithic (associated with modern humans) sites contain a greater range of these species than Late Pleistocene paleontological sites?; 3) is there a taphonomic association between Neanderthals and corvids-raptors at Middle Palaeolithic sites on Gibraltar, specifically Gorham's, Vanguard and Ibex Caves?; and 4) was the extraction of wing feathers a local phenomenon exclusive to the Neanderthals at these sites or was it a geographically wider phenomenon? We compiled a database of 1699 Pleistocene Palearctic fossil bird sites. We also compiled a taphonomic database from the Middle Palaeolithic assemblages of Gibraltar. We establish a clear, previously unknown and widespread, association between Neanderthals, raptors and corvids. We show that the association involved the direct intervention of Neanderthals on the bones of these birds, which we interpret as evidence of extraction of large flight feathers. The large number of bones, the variety of species processed and the different temporal periods when the behaviour is observed, indicate that this was a systematic, geographically and temporally broad, activity that the Neanderthals undertook. Our results, providing clear evidence that Neanderthal cognitive capacities were comparable to those of Modern Humans, constitute a major advance in the study of human evolution. PMID:23029321
Birds of a feather: Neanderthal exploitation of raptors and corvids.
Finlayson, Clive; Brown, Kimberly; Blasco, Ruth; Rosell, Jordi; Negro, Juan José; Bortolotti, Gary R; Finlayson, Geraldine; Sánchez Marco, Antonio; Giles Pacheco, Francisco; Rodríguez Vidal, Joaquín; Carrión, José S; Fa, Darren A; Rodríguez Llanes, José M
2012-01-01
The hypothesis that Neanderthals exploited birds for the use of their feathers or claws as personal ornaments in symbolic behaviour is revolutionary as it assigns unprecedented cognitive abilities to these hominins. This inference, however, is based on modest faunal samples and thus may not represent a regular or systematic behaviour. Here we address this issue by looking for evidence of such behaviour across a large temporal and geographical framework. Our analyses try to answer four main questions: 1) does a Neanderthal to raptor-corvid connection exist at a large scale, thus avoiding associations that might be regarded as local in space or time?; 2) did Middle (associated with Neanderthals) and Upper Palaeolithic (associated with modern humans) sites contain a greater range of these species than Late Pleistocene paleontological sites?; 3) is there a taphonomic association between Neanderthals and corvids-raptors at Middle Palaeolithic sites on Gibraltar, specifically Gorham's, Vanguard and Ibex Caves?; and 4) was the extraction of wing feathers a local phenomenon exclusive to the Neanderthals at these sites or was it a geographically wider phenomenon? We compiled a database of 1699 Pleistocene Palearctic fossil bird sites. We also compiled a taphonomic database from the Middle Palaeolithic assemblages of Gibraltar. We establish a clear, previously unknown and widespread, association between Neanderthals, raptors and corvids. We show that the association involved the direct intervention of Neanderthals on the bones of these birds, which we interpret as evidence of extraction of large flight feathers. The large number of bones, the variety of species processed and the different temporal periods when the behaviour is observed, indicate that this was a systematic, geographically and temporally broad, activity that the Neanderthals undertook. Our results, providing clear evidence that Neanderthal cognitive capacities were comparable to those of Modern Humans, constitute a major advance in the study of human evolution.
Research on the competitiveness and development strategy of China's modern coal chemical industry
NASA Astrophysics Data System (ADS)
Wang, Q.; Han, Y. J.; Yu, Z. F.
2016-08-01
China's modern coal chemical industry has grown to a considerable scale after more than a decade of development, and remarkable progress has been made in key technologies. But as oil prices collapsed from 2015 onward, the industry's profitability also slumped, prompting heated controversy in China over the necessity of a modern coal chemical industry. This research argues that the modern coal chemical industry plays a positive role in the clean and sustainable exploitation of coal in China. The industry is profitable when the oil price is no lower than US$60/bbl, and outperforms petrochemical routes in cost effectiveness when the price is between US$60/bbl and US$80/bbl. Given low oil prices and the challenges posed by environmental protection and water constraints, we suggest that the state promptly issue a guideline, with adjusted tax policies and encouragement of technological innovation, so that China's modern coal chemical industry can develop soundly and stably.
Atlantic SSTs control regime shifts in forest fire activity of Northern Scandinavia
Drobyshev, Igor; Bergeron, Yves; Vernal, Anne de; Moberg, Anders; Ali, Adam A.; Niklasson, Mats
2016-01-01
Understanding the drivers of boreal forest fire activity is challenging due to the complexity of the interactions driving fire regimes. We analyzed drivers of forest fire activity in Northern Scandinavia (above 60° N) by combining modern and proxy data over the Holocene. The results suggest that the cold climate in northern Scandinavia was generally characterized by dry conditions favourable to periods of regionally increased fire activity. We propose that the cold conditions over the northern North Atlantic, associated with low SSTs, expansion of sea ice cover, and the southward shift in the position of the subpolar gyre, redirect southward the precipitation over Scandinavia associated with the westerlies. These dynamics strengthen high-pressure systems over Scandinavia and result in increased regional fire activity. Our study reveals a previously undocumented teleconnection between large-scale climate and ocean dynamics over the North Atlantic and regional boreal forest fire activity in Northern Scandinavia. Consistency of the pattern observed from annual through millennial scales suggests that a strong link between Atlantic SSTs and fire activity on multiple temporal scales over the entire Holocene is relevant for understanding future fire activity across the European boreal zone. PMID:26940995
Scaling Semantic Graph Databases in Size and Performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morari, Alessandro; Castellana, Vito G.; Villa, Oreste
In this paper we present SGEM, a full software system for accelerating large-scale semantic graph databases on commodity clusters. Unlike current approaches, SGEM addresses semantic graph databases by employing only graph methods at all levels of the stack. On one hand, this allows exploiting the space efficiency of graph data structures and the inherent parallelism of graph algorithms. These features adapt well to the increasing system memory and core counts of modern commodity clusters. On the other hand, however, these systems are optimized for regular computation and batched data transfers, while graph methods usually are irregular and generate fine-grained data accesses with poor spatial and temporal locality. Our framework comprises a SPARQL to data parallel C compiler, a library of parallel graph methods and a custom, multithreaded runtime system. We introduce our stack, motivate its advantages with respect to other solutions and show how we solved the challenges posed by irregular behaviors. We present the results of our software stack on the Berlin SPARQL benchmarks with datasets up to 10 billion triples (a triple corresponds to a graph edge), demonstrating scaling in dataset size and in performance as more nodes are added to the cluster.
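To make the "graph methods at all levels" point concrete, here is a toy Python sketch (not SGEM's SPARQL-to-C pipeline; the data and the query are invented) showing how a SPARQL-style basic graph pattern reduces to local traversals over predicate-indexed edges:

```python
from collections import defaultdict

# A toy in-memory triple store, indexed by predicate.
triples = [
    ("alice", "knows",   "bob"),
    ("bob",   "knows",   "carol"),
    ("alice", "worksAt", "acme"),
    ("bob",   "worksAt", "acme"),
    ("carol", "worksAt", "initech"),
]

by_pred = defaultdict(list)
for s, p, o in triples:
    by_pred[p].append((s, o))

def colleagues_who_know_each_other():
    """Pattern: ?x knows ?y . ?x worksAt ?c . ?y worksAt ?c"""
    works = dict(by_pred["worksAt"])               # subject -> employer (toy: one each)
    for x, y in by_pred["knows"]:                  # walk "knows" edges, join locally
        if x in works and works.get(y) == works[x]:
            yield x, y, works[x]

print(list(colleagues_who_know_each_other()))      # [('alice', 'bob', 'acme')]
```

Each pattern step touches only the edges adjacent to the current bindings, which is why such queries parallelise well but also why the resulting memory accesses are fine-grained and irregular, exactly the tension the abstract describes.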
NASA Astrophysics Data System (ADS)
Chirayath, V.; Instrella, R.
2016-02-01
We present NASA ESTO FluidCam 1 & 2, Visible and NIR Fluid-Lensing-enabled imaging payloads for Unmanned Aerial Vehicles (UAVs). Developed as part of a focused 2014 earth science technology grant, FluidCam 1&2 are Fluid-Lensing-based computational optical imagers designed for automated 3D mapping and remote sensing of underwater coastal targets from airborne platforms. Fluid Lensing has been used to map underwater reefs in 3D in American Samoa and Hamelin Pool, Australia from UAV platforms at sub-cm scale, which has proven a valuable tool in modern marine research for marine biosphere assessment and conservation. We share FluidCam 1&2 instrument validation and testing results as well as preliminary processed data from field campaigns. Petabyte-scale aerial survey efforts using Fluid Lensing to image at-risk reefs demonstrate broad applicability to large-scale automated species identification, morphology studies and reef ecosystem characterization for shallow marine environments and terrestrial biospheres, of crucial importance to improving bathymetry data for physical oceanographic models and understanding climate change's impact on coastal zones, global oxygen production, carbon sequestration.
NASA Astrophysics Data System (ADS)
Chirayath, V.
2015-12-01
We present NASA ESTO FluidCam 1 & 2, Visible and NIR Fluid-Lensing-enabled imaging payloads for Unmanned Aerial Vehicles (UAVs). Developed as part of a focused 2014 earth science technology grant, FluidCam 1&2 are Fluid-Lensing-based computational optical imagers designed for automated 3D mapping and remote sensing of underwater coastal targets from airborne platforms. Fluid Lensing has been used to map underwater reefs in 3D in American Samoa and Hamelin Pool, Australia from UAV platforms at sub-cm scale, which has proven a valuable tool in modern marine research for marine biosphere assessment and conservation. We share FluidCam 1&2 instrument validation and testing results as well as preliminary processed data from field campaigns. Petabyte-scale aerial survey efforts using Fluid Lensing to image at-risk reefs demonstrate broad applicability to large-scale automated species identification, morphology studies and reef ecosystem characterization for shallow marine environments and terrestrial biospheres, of crucial importance to improving bathymetry data for physical oceanographic models and understanding climate change's impact on coastal zones, global oxygen production, carbon sequestration.
Lecocq, Antoine; Kryger, Per; Vejsnæs, Flemming; Bruun Jensen, Annette
2015-01-01
Over the last few decades, a gradual departure from traditional agricultural practices has resulted in alterations to the composition of the countryside and landscapes across Europe. In the face of such changes, monitoring the development and productivity of honey bee colonies from different sites can give valuable insight into the influence of landscape on their productivity and might point towards future directions for modernized beekeeping practices. Using data on honeybee colony weights provided by electronic scales spread across Denmark, we investigated the effect of the immediate landscape on colony productivity. In order to extract meaningful information, data manipulation was necessary prior to analysis to account for different management regimes or scale malfunctions. Once this was carried out, we were able to show that colonies situated in landscapes composed of more than 50% urban areas were significantly more productive than colonies situated in those with more than 50% agricultural areas or those in mixed areas. As well as exploring some of the potential reasons for the observed differences, we discuss the value of weight monitoring of colonies on a large scale. PMID:26147392
Atlantic SSTs control regime shifts in forest fire activity of Northern Scandinavia
NASA Astrophysics Data System (ADS)
Drobyshev, Igor; Bergeron, Yves; Vernal, Anne De; Moberg, Anders; Ali, Adam A.; Niklasson, Mats
2016-03-01
Understanding the drivers of boreal forest fire activity is challenging due to the complexity of the interactions driving fire regimes. We analyzed drivers of forest fire activity in Northern Scandinavia (above 60° N) by combining modern and proxy data over the Holocene. The results suggest that the cold climate in northern Scandinavia was generally characterized by dry conditions favourable to periods of regionally increased fire activity. We propose that the cold conditions over the northern North Atlantic, associated with low SSTs, expansion of sea ice cover, and the southward shift in the position of the subpolar gyre, redirect southward the precipitation over Scandinavia associated with the westerlies. These dynamics strengthen high-pressure systems over Scandinavia and result in increased regional fire activity. Our study reveals a previously undocumented teleconnection between large-scale climate and ocean dynamics over the North Atlantic and regional boreal forest fire activity in Northern Scandinavia. Consistency of the pattern observed from annual through millennial scales suggests that a strong link between Atlantic SSTs and fire activity on multiple temporal scales over the entire Holocene is relevant for understanding future fire activity across the European boreal zone.
Atlantic SSTs control regime shifts in forest fire activity of Northern Scandinavia.
Drobyshev, Igor; Bergeron, Yves; Vernal, Anne de; Moberg, Anders; Ali, Adam A; Niklasson, Mats
2016-03-04
Understanding the drivers of boreal forest fire activity is challenging due to the complexity of the interactions driving fire regimes. We analyzed drivers of forest fire activity in Northern Scandinavia (above 60° N) by combining modern and proxy data over the Holocene. The results suggest that the cold climate in northern Scandinavia was generally characterized by dry conditions favourable to periods of regionally increased fire activity. We propose that the cold conditions over the northern North Atlantic, associated with low SSTs, expansion of sea ice cover, and the southward shift in the position of the subpolar gyre, redirect southward the precipitation over Scandinavia associated with the westerlies. These dynamics strengthen high-pressure systems over Scandinavia and result in increased regional fire activity. Our study reveals a previously undocumented teleconnection between large-scale climate and ocean dynamics over the North Atlantic and regional boreal forest fire activity in Northern Scandinavia. Consistency of the pattern observed from annual through millennial scales suggests that a strong link between Atlantic SSTs and fire activity on multiple temporal scales over the entire Holocene is relevant for understanding future fire activity across the European boreal zone.
The cosmic web and microwave background fossilize the first turbulent combustion
NASA Astrophysics Data System (ADS)
Gibson, Carl H.
2015-09-01
The weblike structure of the cosmic microwave background (CMB) temperature fluctuations is interpreted as a fossil of the first turbulent combustion that drives the big bang [1-3]. Modern turbulence theory [3] requires that inertial vortex forces cause turbulence to always cascade from small scales to large, contrary to the standard turbulence model where the cascade is reversed. Assuming that the universe begins at the Planck length 10⁻³⁵ m and temperature 10³² K, the mechanism of the big bang is a powerful turbulent combustion instability, where turbulence forms at the Kolmogorov scale and mass-energy is extracted by < −10¹¹³ Pa negative stresses from big bang turbulence working against gravity. Prograde accretion of a Planck antiparticle on a spinning particle-antiparticle pair releases 42% of a particle rest mass from the Kerr metric, producing a spinning gas of turbulent Planck particles that cascades to larger scales at smaller temperatures (10⁻²⁷ m, 10²⁷ K) retaining the Planck density 10⁹⁷ kg m⁻³, where quarks form and gluon viscosity fossilizes the turbulence. Viscous stress powers inflation to ~ 10 m and ~ 10¹⁰⁰ kg. The CMB shows signatures of both plasma and big bang turbulence. Direct numerical simulations support the new turbulence theory [6].
Design of a shape-memory alloy actuated macro-scale morphing aircraft mechanism
NASA Astrophysics Data System (ADS)
Manzo, Justin; Garcia, Ephrahim; Wickenheiser, Adam; Horner, Garnett C.
2005-05-01
As more alternative, lightweight actuators have become available, the conventional fixed-wing configuration seen on modern aircraft is under investigation for efficiency on a broad scale. If an aircraft could be designed with multiple functional equilibria of drastically varying aerodynamic parameters, one craft capable of 'morphing' its shape could be used to replace two or three designed with particular intentions. One proposed shape for large-scale (geometry change on the same order of magnitude as wingspan) morphing is the Hyper-Elliptical Cambered Span (HECS) wing, designed at NASA Langley to be implemented on an unmanned aerial vehicle (UAV). Proposed mechanisms to accomplish the spanwise curvature (in the y-z plane of the craft) that allow near-continuous bending of the wing are narrowed to a tendon-based DC motor actuated system, and a shape memory alloy-based (SMA) mechanism. At Cornell, simulations and wind tunnel experiments assess the validity of the HECS wing as a potential shape for a blended-wing body craft with the potential to effectively serve the needs of two conventional UAVs, and analyze the energetics of actuation associated with a morphing maneuver accomplished with both a DC motor and SMA wire.
Subterranean gravity and other deep hole geophysics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stacey, F.D.
1983-01-01
The early history of the determination of the Newtonian gravitational constant, G, was closely linked with the developments of geodesy and gravity surveying. The current search for non-Newtonian effects that may provide an experimental guide to unification theories has led to our retracing some of this history. Modern geophysical techniques and facilities, using especially mines and deep ocean probes, permit absolute measurements of G for distance scales up to a few kilometers. Although the accuracy of the very long range determinations cannot equal that of the best laboratory measurements, they are crucial to assessment of the possibility of a scale dependence of G. Preliminary data give values of G on a scale of 100-1000 m biased about 1% higher than the laboratory value. Possibilities of systematic error compel us to question this apparently significant bias but it provides the incentive for better controlled large scale experiments. Several are in progress or under development. A particular difficulty concerns the measurement of in situ density. Even for hard rock, release from overburden pressure causes microcracks and pores to open. Natural pore closure is effective only with deep burial and for this reason there are advantages in deep instrument placement for several geophysical studies.
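For orientation, mine-based determinations of this kind typically rest on the standard flat-slab relation between the underground gravity difference, the free-air gradient and the local rock density (stated here from general geophysics, not taken from the report itself):

```latex
\[
  \Delta g(z) \;=\; g(z) - g(0) \;\simeq\; \gamma_{\mathrm{FA}}\, z \;-\; 4\pi G \rho\, z
  \qquad\Longrightarrow\qquad
  G \;\simeq\; \frac{\gamma_{\mathrm{FA}}\, z - \Delta g(z)}{4\pi \rho\, z},
\]
```

where z is the depth, γ_FA ≈ 0.3086 mGal/m is the free-air gradient and ρ is the mean density of the overlying rock. The roughly linear sensitivity of the inferred G to errors in ρ is precisely why the in situ density problem noted above matters so much.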
A new paradigm for atomically detailed simulations of kinetics in biophysical systems.
Elber, Ron
2017-01-01
The kinetics of biochemical and biophysical events determine the course of life processes and have attracted considerable interest and research. For example, modeling of biological networks and cellular responses relies on the availability of information on rate coefficients. Atomically detailed simulations hold the promise of supplementing experimental data to obtain a more complete kinetic picture. However, simulations at biological time scales are challenging. Typical computer resources are insufficient to provide the ensemble of trajectories of the length required for straightforward calculations of time scales. In recent years, new technologies have emerged that make atomically detailed simulations of rate coefficients possible. Instead of computing complete trajectories from reactants to products, these approaches launch a large number of short trajectories at different positions. Since the trajectories are short, they are computed trivially in parallel on modern computer architectures. The starting and termination positions of the short trajectories are chosen, following statistical mechanics theory, to enhance efficiency. These trajectories are then analyzed, and the analysis produces accurate estimates of time scales as long as hours. The theory of Milestoning, which exploits the use of short trajectories, is discussed, and several applications are described.
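A heavily simplified toy of the short-trajectory idea, assuming a Markovian milestoning approximation on a 1D double well (an invented pedagogical sketch, not Elber's Milestoning implementation; the potential, parameters and the final linear-system step are all assumptions, and the pure-Python loops are deliberately tiny and slow): short overdamped Langevin trajectories are launched from each milestone, terminated at a neighbouring milestone, and their statistics are assembled into a mean first-passage time.

```python
import numpy as np

rng = np.random.default_rng(0)

def force(x):                       # F = -dU/dx for the double well U(x) = (x^2 - 1)^2
    return -4.0 * x * (x**2 - 1.0)

def run_to_neighbor(x0, left, right, dt=1e-3, kT=0.3, max_steps=200_000):
    """Overdamped Langevin trajectory from x0 until it hits the left or right milestone."""
    x = x0
    noise = np.sqrt(2.0 * kT * dt)
    for step in range(1, max_steps + 1):
        x += force(x) * dt + noise * rng.standard_normal()
        if x <= left:
            return "L", step * dt
        if x >= right:
            return "R", step * dt
    raise RuntimeError("short trajectory did not terminate")

milestones = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])   # last one is the absorbing "product"
n, n_traj = len(milestones), 100
K = np.zeros((n, n))                                  # milestone-to-milestone probabilities
t = np.zeros(n)                                       # mean lifetime at each milestone

for i in range(n - 1):
    times = []
    left = milestones[i - 1] if i > 0 else -np.inf    # nothing lies left of the first milestone
    for _ in range(n_traj):                           # trivially parallel in a real application
        side, tau = run_to_neighbor(milestones[i], left, milestones[i + 1])
        K[i, i - 1 if side == "L" else i + 1] += 1.0
        times.append(tau)
    K[i] /= n_traj
    t[i] = np.mean(times)

# Mean first-passage time to the product: tau_i = t_i + sum_j K_ij tau_j, tau_product = 0.
mfpt = np.linalg.solve(np.eye(n - 1) - K[:n - 1, :n - 1], t[:n - 1])
print("estimated MFPT from each milestone to the product:", mfpt)
```

Because every short trajectory is independent, the sampling loop is the part that would be farmed out across processors in practice; only the small linear-algebra step at the end is serial.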
Lecocq, Antoine; Kryger, Per; Vejsnæs, Flemming; Bruun Jensen, Annette
2015-01-01
Over the last few decades, a gradual departure from traditional agricultural practices has resulted in alterations to the composition of the countryside and landscapes across Europe. In the face of such changes, monitoring the development and productivity of honey bee colonies from different sites can give valuable insight into the influence of landscape on their productivity and might point towards future directions for modernized beekeeping practices. Using data on honeybee colony weights provided by electronic scales spread across Denmark, we investigated the effect of the immediate landscape on colony productivity. In order to extract meaningful information, data manipulation was necessary prior to analysis to account for different management regimes or scale malfunctions. Once this was carried out, we were able to show that colonies situated in landscapes composed of more than 50% urban areas were significantly more productive than colonies situated in those with more than 50% agricultural areas or those in mixed areas. As well as exploring some of the potential reasons for the observed differences, we discuss the value of weight monitoring of colonies on a large scale.
Zilsel's Thesis, Maritime Culture, and Iberian Science in Early Modern Europe.
Leitão, Henrique; Sánchez, Antonio
2017-01-01
Zilsel's thesis on the artisanal origins of modern science remains one of the most original proposals about the emergence of scientific modernity. We propose to inspect the scientific developments in Iberia in the early modern period using Zilsel's ideas as a guideline. Our purpose is to show that his ideas illuminate the situation in Iberia but also that the Iberian case is a remarkable illustration of Zilsel's thesis. Furthermore, we argue that Zilsel's thesis is essentially a sociological explanation that cannot be applied to isolated cases; its use implies global events that involve extended societies over large periods of time.
Womack, James C; Anton, Lucian; Dziedzic, Jacek; Hasnip, Phil J; Probert, Matt I J; Skylaris, Chris-Kriton
2018-03-13
The solution of the Poisson equation is a crucial step in electronic structure calculations, yielding the electrostatic potential, a key component of the quantum mechanical Hamiltonian. In recent decades, theoretical advances and increases in computer performance have made it possible to simulate the electronic structure of extended systems in complex environments. This requires the solution of more complicated variants of the Poisson equation, featuring nonhomogeneous dielectric permittivities, ionic concentrations with nonlinear dependencies, and diverse boundary conditions. The analytic solutions generally used to solve the Poisson equation in vacuum (or with homogeneous permittivity) are not applicable in these circumstances, and numerical methods must be used. In this work, we present DL_MG, a flexible, scalable, and accurate solver library, developed specifically to tackle the challenges of solving the Poisson equation in modern large-scale electronic structure calculations on parallel computers. Our solver is based on the multigrid approach and uses an iterative high-order defect correction method to improve the accuracy of solutions. Using two chemically relevant model systems, we tested the accuracy and computational performance of DL_MG when solving the generalized Poisson and Poisson-Boltzmann equations, demonstrating excellent agreement with analytic solutions and efficient scaling to ∼10⁹ unknowns and hundreds of CPU cores. We also applied DL_MG in actual large-scale electronic structure calculations, using the ONETEP linear-scaling electronic structure package to study a 2615 atom protein-ligand complex with routinely available computational resources. In these calculations, the overall execution time with DL_MG was not significantly greater than the time required for calculations using a conventional FFT-based solver.
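A small sketch of the defect-correction idea mentioned above (a 1D toy under assumed discretisations, not DL_MG's actual algorithm; per the abstract the inexpensive solve there would be a multigrid cycle, whereas here it is a direct sparse solve of a 2nd-order operator): each sweep solves with the cheap low-order operator while driving the residual of the high-order discretisation to zero.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import spsolve

# Model problem: -u'' = f on (0, 1), u(0) = u(1) = 0, exact solution u = sin(pi x).
n = 200
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)
f = np.pi**2 * np.sin(np.pi * x)

# Low-order (2nd-order) operator: cheap to solve, standing in for one multigrid cycle.
A2 = (diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n)) / h**2).tocsc()

# High-order (4th-order) 5-point operator whose accuracy we actually want; the first and
# last rows fall back to the 2nd-order stencil because the wide stencil does not fit there.
A4 = (diags([1.0, -16.0, 30.0, -16.0, 1.0], [-2, -1, 0, 1, 2], shape=(n, n)) / (12.0 * h**2)).tolil()
A4[0, :3] = [2.0 / h**2, -1.0 / h**2, 0.0]
A4[n - 1, n - 3:n] = [0.0, -1.0 / h**2, 2.0 / h**2]
A4 = A4.tocsr()

# Iterative high-order defect correction.
u = np.zeros(n)
sweeps = 0
while sweeps < 30:
    r = f - A4 @ u                                   # residual of the accurate discretisation
    if np.linalg.norm(r) < 1e-10 * np.linalg.norm(f):
        break
    u = u + spsolve(A2, r)                           # correct using the cheap operator
    sweeps += 1

print(f"sweeps: {sweeps}, max error vs exact solution: {np.abs(u - np.sin(np.pi * x)).max():.2e}")
```

The appeal of the approach is that the expensive high-order operator is only ever applied (never inverted), so the cost per sweep stays close to that of the low-order solve.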
Leveraging annotation-based modeling with Jump.
Bergmayr, Alexander; Grossniklaus, Michael; Wimmer, Manuel; Kappel, Gerti
2018-01-01
The capability of UML profiles to serve as an annotation mechanism has been recognized in both research and industry. Today's modeling tools offer profiles specific to platforms, such as Java, as they facilitate model-based engineering approaches. However, considering the large number of possible annotations in Java, manually developing the corresponding profiles would only be achievable by huge development and maintenance efforts. Thus, leveraging annotation-based modeling requires an automated approach capable of generating platform-specific profiles from Java libraries. To address this challenge, we present the fully automated transformation chain realized by Jump, thereby continuing existing mapping efforts between Java and UML with an emphasis on annotations and profiles. The evaluation of Jump shows that it scales for large Java libraries and generates profiles of equal or even improved quality compared to profiles currently used in practice. Furthermore, we demonstrate the practical value of Jump by contributing profiles that facilitate reverse engineering and forward engineering processes for the Java platform by applying it to a modernization scenario.
Performance Studies on Distributed Virtual Screening
Krüger, Jens; de la Garza, Luis; Kohlbacher, Oliver; Nagel, Wolfgang E.
2014-01-01
Virtual high-throughput screening (vHTS) is an invaluable method in modern drug discovery. It permits screening large datasets or databases of chemical structures for those that may bind to a drug target. Virtual screening is typically performed by docking code, which often runs sequentially. Processing of huge vHTS datasets can be parallelized by chunking the data because individual docking runs are independent of each other. The goal of this work is to find an optimal splitting that maximizes the speedup while considering overhead and available cores on Distributed Computing Infrastructures (DCIs). We have conducted thorough performance studies accounting not only for the runtime of the docking itself, but also for structure preparation. Performance studies were conducted via the workflow-enabled science gateway MoSGrid (Molecular Simulation Grid). As input we used benchmark datasets for protein kinases. Our performance studies show that docking workflows can be made to scale almost linearly up to 500 concurrent processes distributed even over large DCIs, thus accelerating vHTS campaigns significantly. PMID:25032219
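The splitting trade-off can be made concrete with a toy cost model (the formula and all numbers are assumptions for illustration, not measurements from the MoSGrid study): each chunk pays a fixed overhead for staging and structure preparation, chunks run independently on the available cores, and the best chunk count balances overhead against parallelism.

```python
import math

def wall_time(n_ligands, n_chunks, cores, t_dock=30.0, overhead=120.0):
    """Estimated wall-clock time for one vHTS campaign.

    t_dock   -- assumed average docking time per ligand (s)
    overhead -- assumed fixed per-chunk cost: staging, structure prep, scheduler wait (s)
    Chunks are independent, so they run in ceil(n_chunks / cores) parallel waves.
    """
    per_chunk = overhead + t_dock * math.ceil(n_ligands / n_chunks)
    waves = math.ceil(n_chunks / cores)
    return waves * per_chunk

n_ligands, cores = 100_000, 500
best = min(range(1, 5001), key=lambda c: wall_time(n_ligands, c, cores))
serial = wall_time(n_ligands, 1, 1)
print(f"best split: {best} chunks, "
      f"speedup ~ {serial / wall_time(n_ligands, best, cores):.0f}x")
```

With these assumed numbers the optimum sits near one chunk per core: finer splitting only multiplies the per-chunk overhead, coarser splitting leaves cores idle.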
NASA Astrophysics Data System (ADS)
Bai, Peng; Jeon, Mi Young; Ren, Limin; Knight, Chris; Deem, Michael W.; Tsapatsis, Michael; Siepmann, J. Ilja
2015-01-01
Zeolites play numerous important roles in modern petroleum refineries and have the potential to advance the production of fuels and chemical feedstocks from renewable resources. The performance of a zeolite as separation medium and catalyst depends on its framework structure. To date, 213 framework types have been synthesized and >330,000 thermodynamically accessible zeolite structures have been predicted. Hence, identification of optimal zeolites for a given application from the large pool of candidate structures is attractive for accelerating the pace of materials discovery. Here we identify, through a large-scale, multi-step computational screening process, promising zeolite structures for two energy-related applications: the purification of ethanol from fermentation broths and the hydroisomerization of alkanes with 18-30 carbon atoms encountered in petroleum refining. These results demonstrate that predictive modelling and data-driven science can now be applied to solve some of the most challenging separation problems involving highly non-ideal mixtures and highly articulated compounds.
Analyzing large-scale spiking neural data with HRLAnalysis™
Thibeault, Corey M.; O'Brien, Michael J.; Srinivasa, Narayan
2014-01-01
The additional capabilities provided by high-performance neural simulation environments and modern computing hardware have allowed for the modeling of increasingly larger spiking neural networks. This is important for exploring more anatomically detailed networks but the corresponding accumulation of data can make analyzing the results of these simulations difficult. This is further compounded by the fact that many existing analysis packages were not developed with large spiking data sets in mind. Presented here is a software suite developed to not only process the increased amount of spike-train data in a reasonable amount of time, but also provide a user-friendly Python interface. We describe the design considerations, implementation and features of the HRLAnalysis™ suite. In addition, performance benchmarks demonstrating the speedup of this design compared to a published Python implementation are also presented. The result is a high-performance analysis toolkit that is not only usable and readily extensible, but also straightforward to interface with existing Python modules. PMID:24634655
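For scale, the kind of bulk spike-train reduction such a toolkit performs can be sketched in a few NumPy lines (a generic illustration on synthetic events, not the HRLAnalysis™ API; the sizes and bin width are arbitrary):

```python
import numpy as np

# Toy spike data in the common (neuron_id, spike_time) event format.
rng = np.random.default_rng(1)
n_neurons, duration = 10_000, 60.0                  # population size, seconds of simulation
n_events = 2_000_000
neuron_ids = rng.integers(0, n_neurons, n_events)
spike_times = rng.uniform(0.0, duration, n_events)

# Per-neuron mean firing rates via a single bincount pass over all events.
rates = np.bincount(neuron_ids, minlength=n_neurons) / duration

# Population activity: spikes per 10 ms bin across the whole network.
bin_width = 0.01
pop_counts, _ = np.histogram(spike_times, bins=np.arange(0.0, duration + bin_width, bin_width))

print(f"mean rate {rates.mean():.2f} Hz, peak population bin {pop_counts.max()} spikes")
```

Vectorised passes like these scale roughly linearly with the number of spike events, which is the property a large-network analysis suite needs.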
The Receiver System for the Ooty Wide Field Array
NASA Astrophysics Data System (ADS)
Subrahmanya, C. R.; Prasad, P.; Girish, B. S.; Somashekar, R.; Manoharan, P. K.; Mittal, A. K.
2017-03-01
The legacy Ooty Radio Telescope (ORT) is being reconfigured as a 264-element synthesis telescope, called the Ooty Wide Field Array (OWFA). Its antenna elements are the contiguous 1.92 m sections of the parabolic cylinder. It will operate in a 38-MHz frequency band centred at 326.5 MHz and will be equipped with a digital receiver including a 264-element spectral correlator with a spectral resolution of 48 kHz. OWFA is designed to retain the benefits of the legacy telescope, namely its equatorial mount, continuous 9-hour tracking ability and large collecting area, while using modern digital techniques to enhance the instantaneous field-of-view by more than an order of magnitude. OWFA has unique advantages for contemporary investigations related to large-scale structure, transient events and space weather watch. In this paper, we describe the RF subsystems, digitizers and fibre optic communication of OWFA and highlight some specific aspects of the system relevant for the observations planned during the initial operation.
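Back-of-the-envelope correlator dimensions implied by the numbers quoted above (my own arithmetic, ignoring any band-edge guard channels and assuming every antenna pair is correlated):

```python
n_elements = 264
bandwidth_hz = 38e6
channel_hz = 48e3

n_channels = round(bandwidth_hz / channel_hz)        # ~792 spectral channels
n_baselines = n_elements * (n_elements - 1) // 2     # 34,716 cross-correlations
print(n_channels, n_baselines)
```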
Network analysis of mesoscale optical recordings to assess regional, functional connectivity.
Lim, Diana H; LeDue, Jeffrey M; Murphy, Timothy H
2015-10-01
With modern optical imaging methods, it is possible to map structural and functional connectivity. Optical imaging studies that aim to describe large-scale neural connectivity often need to handle large and complex datasets. In order to interpret these datasets, new methods for analyzing structural and functional connectivity are being developed. Recently, network analysis, based on graph theory, has been used to describe and quantify brain connectivity in both experimental and clinical studies. We outline how to apply regional, functional network analysis to mesoscale optical imaging using voltage-sensitive-dye imaging and channelrhodopsin-2 stimulation in a mouse model. We include links to sample datasets and an analysis script. The analyses we employ can be applied to other types of fluorescence wide-field imaging, including genetically encoded calcium indicators, to assess network properties. We discuss the benefits and limitations of using network analysis for interpreting optical imaging data and define network properties that may be used to compare across preparations or other manipulations such as animal models of disease.
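As a concrete illustration of regional, functional network analysis of this kind (a generic Python/NetworkX sketch on synthetic ROI time series; the threshold, sizes and metrics are arbitrary choices, not the authors' published pipeline):

```python
import numpy as np
import networkx as nx

# Toy regional time series standing in for mesoscale optical-imaging ROIs.
rng = np.random.default_rng(2)
n_regions, n_frames = 30, 2000
signals = rng.standard_normal((n_regions, n_frames))
signals[:10] += rng.standard_normal(n_frames) * 0.8     # one correlated "module" of 10 regions

# Functional connectivity: correlation matrix, thresholded into a weighted graph.
corr = np.corrcoef(signals)
np.fill_diagonal(corr, 0.0)
adj = np.where(np.abs(corr) > 0.3, np.abs(corr), 0.0)
G = nx.from_numpy_array(adj)

# Network properties of the kind used to compare preparations or disease models.
print("mean degree:       ", np.mean([d for _, d in G.degree()]))
print("clustering coeff:  ", nx.average_clustering(G, weight="weight"))
print("connected components:", nx.number_connected_components(G))
```

The same recipe (time series per region, a connectivity estimate, a thresholded graph, then graph metrics) carries over to other wide-field fluorescence modalities; the main judgment calls are the connectivity measure and the threshold.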
Sternad, M.; Forster, M.; Wilkening, M.
2016-01-01
Silicon-based microelectronics forms a major foundation of our modern society. Small lithium-ion batteries act as the key enablers of its success and have revolutionised the portable electronics used in our everyday lives. While large-scale LIBs are expected to help establish electric vehicles, at the other end of the device-size spectrum chip-integrated Si-based μ-batteries may revolutionise microelectronics once more. In general, Si is regarded as one of the most promising candidates since it offers energy densities ten times higher than conventional anode materials. The use of monocrystalline, wafer-grade Si, however, requires several hurdles to be overcome, since its volume expands greatly during lithiation. Here, we show how 3D-patterned Si wafers, prepared by sophisticated techniques from the semiconductor industry, can be electrochemically activated to overcome these limitations and to leverage their full potential, reflected in stable charge capacities (>1000 mAh g⁻¹) and high Coulomb efficiencies (98.8%). PMID:27531589
Innovative Technological Development of Russian Mining Regions (on Example of Kemerovo Region)
NASA Astrophysics Data System (ADS)
Shavina, Evgeniya; Kalenov, Oleg
2017-11-01
A characteristic trend in the modern development of many countries is the transition to an innovative economy. At present, this is the only way to secure and maintain a high standard of living for the population. Moreover, Russia's innovative development can be achieved through technological progress in its regions. In this regard, it is necessary to assess the innovative potential of a region and identify the most pressing problems that impede the transition to a trajectory of innovative development. The authors outline several main indicators that help to determine the level of innovation and technological development of one of the largest industrial areas of Russia, the Kemerovo region. The special economic role of the Kemerovo region as a large territorial old-industrial complex of Western Siberia requires large-scale work to solve the most acute problems of regional development. It is necessary to address existing problems through the implementation of a system of state regulation aimed at making the innovation component a leading factor in the competitiveness of the regional economy.