NASA Astrophysics Data System (ADS)
Kan, Guangyuan; He, Xiaoyan; Ding, Liuqian; Li, Jiren; Hong, Yang; Zuo, Depeng; Ren, Minglei; Lei, Tianjie; Liang, Ke
2018-01-01
Hydrological model calibration has been an active research topic for decades. The shuffled complex evolution method developed at the University of Arizona (SCE-UA) has proven to be an effective and robust optimization approach. However, its computational efficiency deteriorates significantly as the amount of hydrometeorological data increases. In recent years, the rise of heterogeneous parallel computing has brought hope for accelerating hydrological model calibration. This study proposed a parallel SCE-UA method and applied it to the calibration of a watershed rainfall-runoff model, the Xinanjiang model. The parallel method was implemented on heterogeneous computing systems using OpenMP and CUDA. Performance testing and sensitivity analysis were carried out to verify its correctness and efficiency. Comparison results indicated that the heterogeneous parallel computing-accelerated SCE-UA converged much more quickly than the original serial version and possessed satisfactory accuracy and stability for fast hydrological model calibration.
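The costly step this kind of parallel SCE-UA targets is the independent evaluation of complex members against the objective function. A minimal Python sketch of that pattern (illustrative only; the paper's implementation uses OpenMP and CUDA, and the function names here are assumptions):

```python
from concurrent.futures import ThreadPoolExecutor

def evaluate_population(objective, population, workers=4):
    """Score every parameter vector independently. This embarrassingly
    parallel step is what the parallel SCE-UA offloads to OpenMP/CUDA.
    A thread pool only illustrates the pattern (Python's GIL limits
    CPU-bound gains); the paper uses native parallel code."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # map() preserves input order, so results line up with members
        return list(pool.map(objective, population))
```

The key property is that results keep the population order, so the rest of the shuffled-complex bookkeeping is unchanged whether evaluation is serial or parallel.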
GPU-accelerated depth map generation for X-ray simulations of complex CAD geometries
NASA Astrophysics Data System (ADS)
Grandin, Robert J.; Young, Gavin; Holland, Stephen D.; Krishnamurthy, Adarsh
2018-04-01
Interactive x-ray simulations of complex computer-aided design (CAD) models can provide valuable insights for better interpretation of defect signatures, such as porosity, in x-ray CT images. Generating the depth map along a particular direction for a given CAD geometry is the most compute-intensive step in x-ray simulations. We have developed a GPU-accelerated method for real-time generation of depth maps of complex CAD geometries. We preprocess complex components designed using commercial CAD systems with a custom CAD module and convert them into a fine user-defined surface tessellation. Our CAD module can be used by different simulators and can handle complex geometries, including those that arise from complex castings and composite structures. We then use a parallel algorithm that runs on a graphics processing unit (GPU) to convert the finely tessellated CAD model into a voxelized representation. The voxelized representation enables heterogeneous modeling of the volume enclosed by the CAD model by assigning heterogeneous material properties in specific regions. The depth maps are generated from this voxelized representation with a GPU-accelerated ray-casting algorithm, which enables interactive (> 60 frames-per-second) generation of depth maps of complex CAD geometries. This allows arbitrary rotation and slicing of the CAD model, leading to better interpretation of the x-ray images by the user. In addition, the depth maps can directly aid CT reconstruction algorithms.
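For the special case of axis-aligned rays, a depth map from a voxelized model reduces to counting occupied voxels per column and scaling by the voxel size. A minimal NumPy sketch of that reduction (an assumption-laden toy: the paper's GPU kernel casts rays in arbitrary directions, which requires a full 3-D grid traversal):

```python
import numpy as np

def depth_map(voxels, voxel_size=1.0, axis=2):
    """Material path length traversed by each axis-aligned ray through
    a boolean occupancy grid: occupied-voxel count per column times the
    voxel size. Arbitrary ray directions need a DDA-style traversal
    instead of this simple sum."""
    return voxels.sum(axis=axis) * voxel_size
```

This also shows why voxelization helps with heterogeneous materials: replacing the boolean grid with per-voxel attenuation coefficients turns the same sum into an attenuation line integral.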
Rupture Dynamics and Ground Motion from Earthquakes on Rough Faults in Heterogeneous Media
NASA Astrophysics Data System (ADS)
Bydlon, S. A.; Kozdon, J. E.; Duru, K.; Dunham, E. M.
2013-12-01
Heterogeneities in the material properties of Earth's crust scatter propagating seismic waves. The effects of scattered waves are reflected in the seismic coda and depend on the amplitude of the heterogeneities, their spatial arrangement, and the distance from source to receiver. In the vicinity of the fault, scattered waves influence the rupture process by introducing fluctuations in the stresses driving propagating ruptures. Further variability in the rupture process is introduced by the naturally occurring geometric complexity of fault surfaces and the stress changes that accompany slip on rough surfaces. Our goal is to better understand the origin of complexity in the earthquake source process and to quantify the relative importance of source complexity and scattering along the propagation path in causing incoherence of high-frequency ground motion. Using a 2D high-order finite difference rupture dynamics code, we nucleate ruptures on either flat or rough faults that obey strongly rate-weakening friction laws. These faults are embedded in domains with spatially varying material properties characterized by Von Karman autocorrelation functions and their associated power spectral density functions, with variations in wave speed of approximately 5 to 10%. Flat-fault simulations demonstrate that off-fault material heterogeneity, at least with this particular form and amplitude, has only a minor influence on the rupture process (i.e., fluctuations in slip and rupture velocity). In contrast, rupture histories on rough faults in both homogeneous and heterogeneous media include much larger short-wavelength fluctuations in slip and rupture velocity. We therefore conclude that source complexity is dominantly influenced by fault geometric complexity.
To examine contributions of scattering versus fault geometry on ground motions, we compute spatially averaged root-mean-square (RMS) acceleration values as a function of fault perpendicular distance for a homogeneous medium and several heterogeneous media characterized by different statistical properties. We find that at distances less than ~6 km from the fault, RMS acceleration values from simulations with homogeneous and heterogeneous media are similar, but at greater distances the RMS values associated with heterogeneous media are larger than those associated with homogeneous media. The magnitude of this divergence increases with the amplitude of the heterogeneities. For instance, for a heterogeneous medium with a 10% standard deviation in material property values relative to mean values, RMS accelerations are ~50% larger than for a homogeneous medium at distances greater than 6 km. This finding is attributed to the scattering of coherent pulses into multiple pulses of decreased amplitude that subsequently arrive at later times. In order to understand the robustness of these results, an extension of our dynamic rupture and wave propagation code to 3D is underway.
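The ground-motion metric used above, RMS acceleration, is simple to compute from a synthetic seismogram. A small sketch with hypothetical names (the study additionally averages these values over receivers in fault-perpendicular distance bins, which is not shown here):

```python
import numpy as np

def rms_acceleration(accel):
    """Root-mean-square of a uniformly sampled acceleration time
    series: sqrt(mean(a^2)). Comparing this value across media with
    different heterogeneity amplitudes is how the divergence beyond
    ~6 km from the fault is quantified."""
    accel = np.asarray(accel, dtype=float)
    return float(np.sqrt(np.mean(accel ** 2)))
```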
Repulsive DNA-DNA interactions accelerate viral DNA packaging in phage Phi29.
Keller, Nicholas; delToro, Damian; Grimes, Shelley; Jardine, Paul J; Smith, Douglas E
2014-06-20
We use optical tweezers to study the effect of attractive versus repulsive DNA-DNA interactions on motor-driven viral packaging. Screening of repulsive interactions accelerates packaging, but induction of attractive interactions by spermidine(3+) causes heterogeneous dynamics. Acceleration is observed in a fraction of complexes, but most exhibit slowing and stalling, suggesting that attractive interactions promote nonequilibrium DNA conformations that impede the motor. Thus, repulsive interactions facilitate packaging despite increasing the energy of the theoretical optimum spooled DNA conformation.
Kan, Guangyuan; He, Xiaoyan; Ding, Liuqian; Li, Jiren; Liang, Ke; Hong, Yang
2017-10-01
The shuffled complex evolution optimization developed at the University of Arizona (SCE-UA) has been successfully applied for many years in a wide range of scientific and engineering optimization applications, such as hydrological model parameter calibration. The algorithm possesses good global optimality, convergence stability, and robustness. However, benchmark and real-world applications reveal the poor computational efficiency of the SCE-UA. This research aims at parallelizing and accelerating the SCE-UA method using powerful heterogeneous computing technology. The parallel SCE-UA is implemented on an Intel Xeon multi-core CPU (using OpenMP and OpenCL) and an NVIDIA Tesla many-core GPU (using OpenCL, CUDA, and OpenACC). The serial and parallel SCE-UA were tested on the Griewank benchmark function. Comparison results indicate that the parallel SCE-UA significantly improves computational efficiency compared with the original serial version. The OpenCL implementation obtains the best overall acceleration, though at the cost of the most complex source code. The parallel SCE-UA has bright prospects for real-world applications.
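The Griewank benchmark mentioned above is easy to reproduce. A sketch of the standard d-dimensional form, which has its global minimum of 0 at the origin (the many shallow local minima from the cosine product are what make it a useful optimizer stress test):

```python
import math

def griewank(x):
    """Griewank benchmark:
    f(x) = 1 + sum(x_i^2)/4000 - prod(cos(x_i / sqrt(i))), i = 1..d,
    with global minimum f = 0 at x = 0."""
    s = sum(xi * xi for xi in x) / 4000.0
    p = 1.0
    for i, xi in enumerate(x, start=1):
        p *= math.cos(xi / math.sqrt(i))
    return 1.0 + s - p
```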
Sabne, Amit J.; Sakdhnagool, Putt; Lee, Seyong; ...
2015-07-13
Accelerator-based heterogeneous computing is gaining momentum in the high-performance computing arena. However, the increased complexity of heterogeneous architectures demands more generic, high-level programming models. OpenACC is one such attempt to tackle this problem. Although the abstraction provided by OpenACC offers productivity, it raises questions concerning both functional and performance portability. In this article, the authors propose HeteroIR, a high-level, architecture-independent intermediate representation, to map high-level programming models, such as OpenACC, to heterogeneous architectures. They present a compiler approach that translates OpenACC programs into HeteroIR and accelerator kernels to obtain OpenACC functional portability. They then evaluate the performance portability obtained by OpenACC with their approach on 12 OpenACC programs on Nvidia CUDA, AMD GCN, and Intel Xeon Phi architectures. They study the effects of various compiler optimizations and OpenACC program settings on these architectures to provide insights into the achieved performance portability.
NASA Astrophysics Data System (ADS)
Myre, Joseph M.
Heterogeneous computing systems have recently come to the forefront of the High-Performance Computing (HPC) community's interest. HPC computer systems that incorporate special purpose accelerators, such as Graphics Processing Units (GPUs), are said to be heterogeneous. Large scale heterogeneous computing systems have consistently ranked highly on the Top500 list since the beginning of the heterogeneous computing trend. By using heterogeneous computing systems that consist of both general purpose processors and special-purpose accelerators, the speed and problem size of many simulations could be dramatically increased. Ultimately this results in enhanced simulation capabilities that allow, in some cases for the first time, the execution of parameter space and uncertainty analyses, model optimizations, and other inverse modeling techniques that are critical for scientific discovery and engineering analysis. However, simplifying the usage and optimization of codes for heterogeneous computing systems remains a challenge. This is particularly true for scientists and engineers for whom understanding HPC architectures and undertaking performance analysis may not be primary research objectives. To enable scientists and engineers to remain focused on their primary research objectives, a modular environment for geophysical inversion and run-time autotuning on heterogeneous computing systems is presented. This environment is composed of three major components: 1) CUSH, a framework for reducing the complexity of programming heterogeneous computer systems, 2) geophysical inversion routines which can be used to characterize physical systems, and 3) run-time autotuning routines designed to determine configurations of heterogeneous computing systems in an attempt to maximize the performance of scientific and engineering codes.
Using three case studies, a lattice-Boltzmann method, a non-negative least squares inversion, and a finite-difference fluid flow method, it is shown that this environment provides scientists and engineers with means to reduce the programmatic complexity of their applications, to perform geophysical inversions for characterizing physical systems, and to determine high-performing run-time configurations of heterogeneous computing systems using a run-time autotuner.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zuo, Wangda; McNeil, Andrew; Wetter, Michael
2013-05-23
Building designers are increasingly relying on complex fenestration systems to reduce energy consumed for lighting and HVAC in low-energy buildings. Radiance, a lighting simulation program, has been used to conduct daylighting simulations for complex fenestration systems. Depending on the configuration, a simulation can take hours or even days on a personal computer. This paper describes how to accelerate the matrix multiplication portion of a Radiance three-phase daylight simulation by performing parallel computing on the heterogeneous hardware of a personal computer. The algorithm was optimized and the computational part was implemented in parallel using OpenCL. The speed of the new approach was evaluated using various daylighting simulation cases on a multicore central processing unit and a graphics processing unit. Based on the measurements and analysis of the time usage for the Radiance daylighting simulation, further speedups can be achieved by using fast I/O devices and storing the data in a binary format.
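The three-phase method evaluates illuminance as a chain of matrix products: a view matrix, a transmission matrix, and a daylight matrix applied to a sky vector. That chain is the multiplication being offloaded to OpenCL. A NumPy sketch under that standard formulation (variable names are illustrative, not Radiance's):

```python
import numpy as np

def three_phase(V, T, D, s):
    """Three-phase illuminance: V @ T @ D @ s. Multiplying into the
    sky vector right-to-left keeps every intermediate a vector, which
    avoids forming large matrix-matrix products."""
    return V @ (T @ (D @ s))
```

The right-to-left association is a cheap serial optimization; the paper's OpenCL work parallelizes the multiplications themselves across CPU and GPU cores.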
Dastjerdi, Roya; Montazer, Majid; Shahsavan, Shadi; Böttcher, Horst; Moghadam, M B; Sarsour, Jamal
2013-01-01
This research designed innovative Ag/TiO(2) polysiloxane-shield nano-reactors on PET fabric to develop novel durable bio-photocatalyst purifiers. To create these very fine nano-reactors, oppositely surface-charged nanoparticles of multiple sizes were applied together with a crosslinkable amino-functionalized polysiloxane (XPs) emulsion. Investigation of photocatalytic dye decolorization efficiency revealed a non-heterogeneous mechanism involving accelerated degradation of dye molecules entrapped in the structural polysiloxane-shield nano-reactors. In fact, dye molecules can be adsorbed by both Ag and XPs through electrostatic interactions and/or even by forming a complex with them, especially with silver NPs. The adsorbed dye and the active oxygen species generated by TiO(2) were entrapped by the polysiloxane shelter, and the presence of silver nanoparticles further attracts the negative oxygen species closer to the adsorbed dye molecules. In this way, the dye molecules are in close contact with concentrated active oxygen species inside the created nano-reactors, which provides accelerated degradation of the dye molecules. This non-heterogeneous mechanism was detected on the sample containing all three components. Increasing the concentration of Ag and XPs caused the second step to begin sooner and proceed at an enhanced rate. Furthermore, the treated samples also showed excellent antibacterial activity. Copyright © 2012 Elsevier B.V. All rights reserved.
Accelerating the development of old-growth characteristics in second-growth northern hardwoods
Karin S. Fassnacht; Dustin R. Bronson; Brian J. Palik; Anthony W. D'Amato; Craig Lorimer; Karl J. Martin
2015-01-01
Active management techniques that emulate natural forest disturbance and stand development processes have the potential to enhance species diversity, structural complexity, and spatial heterogeneity in managed forests, helping to meet goals related to biodiversity, ecosystem health, and forest resilience in the face of uncertain future conditions. There are a number of...
Unlocking Proteomic Heterogeneity in Complex Diseases through Visual Analytics
Bhavnani, Suresh K.; Dang, Bryant; Bellala, Gowtham; Divekar, Rohit; Visweswaran, Shyam; Brasier, Allan; Kurosky, Alex
2015-01-01
Despite years of preclinical development, biological interventions designed to treat complex diseases like asthma often fail in phase III clinical trials. These failures suggest that current methods for analyzing biomedical data might be missing critical aspects of biological complexity, for example by assuming that cases and controls come from homogeneous distributions. Here we discuss why and how methods from the rapidly evolving field of visual analytics can help translational teams (consisting of biologists, clinicians, and bioinformaticians) address the challenge of modeling and inferring heterogeneity in the proteomic and phenotypic profiles of patients with complex diseases. Because a primary goal of visual analytics is to amplify the cognitive capacities of humans for detecting patterns in complex data, we begin with an overview of the cognitive foundations of the field. Next, we organize the primary ways in which a specific form of visual analytics, networks, has been used to model and infer biological mechanisms, which helps to identify the properties of networks that are particularly useful for the discovery and analysis of proteomic heterogeneity in complex diseases. We describe one such approach, called subject-protein networks, and demonstrate its application on two proteomic datasets. This demonstration provides insights to help translational teams overcome theoretical, practical, and pedagogical hurdles to the widespread use of subject-protein networks for analyzing molecular heterogeneities, with the translational goals of designing biomarker-based clinical trials and accelerating the development of personalized approaches to medicine. PMID:25684269
CoreTSAR: Core Task-Size Adapting Runtime
Scogland, Thomas R. W.; Feng, Wu-chun; Rountree, Barry; ...
2014-10-27
Heterogeneity continues to increase at all levels of computing, with the rise of accelerators such as GPUs, FPGAs, and other co-processors in everything from desktops to supercomputers. As a consequence, efficiently managing such disparate resources has become increasingly complex. CoreTSAR seeks to reduce this complexity by adaptively worksharing parallel-loop regions across compute resources without requiring any transformation of the code within the loop. Our results show performance improvements of up to three-fold over a current state-of-the-art heterogeneous task scheduler, as well as linear performance scaling from a single GPU to four GPUs for many codes. In addition, CoreTSAR demonstrates a robust ability to adapt to both a variety of workloads and underlying system configurations.
A Scalable Data Access Layer to Manage Structured Heterogeneous Biomedical Data.
Delussu, Giovanni; Lianas, Luca; Frexia, Francesca; Zanetti, Gianluigi
2016-01-01
This work presents a scalable data access layer, called PyEHR, designed to support the implementation of data management systems for secondary use of structured heterogeneous biomedical and clinical data. PyEHR adopts the openEHR's formalisms to guarantee the decoupling of data descriptions from implementation details and exploits structure indexing to accelerate searches. Data persistence is guaranteed by a driver layer with a common driver interface. Interfaces for two NoSQL Database Management Systems are already implemented: MongoDB and Elasticsearch. We evaluated the scalability of PyEHR experimentally through two types of tests, called "Constant Load" and "Constant Number of Records", with queries of increasing complexity on synthetic datasets of ten million records each, containing very complex openEHR archetype structures, distributed on up to ten computing nodes.
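The structure-indexing idea above, grouping records by a structural signature so a query only scans structurally matching groups, can be sketched in a few lines. This is a toy illustration under simplifying assumptions (flat field names as the signature), not PyEHR's actual implementation, which indexes openEHR archetype trees:

```python
class StructureIndex:
    """Group record ids by a structural signature so queries touch only
    groups whose structure contains the queried fields, instead of
    scanning every record."""

    def __init__(self):
        self._groups = {}  # signature tuple -> list of record ids

    @staticmethod
    def signature(record):
        # Toy signature: the sorted tuple of top-level field names.
        return tuple(sorted(record))

    def add(self, record_id, record):
        self._groups.setdefault(self.signature(record), []).append(record_id)

    def candidates(self, query_fields):
        """Ids of records whose structure contains all queried fields."""
        want = set(query_fields)
        out = []
        for sig, ids in self._groups.items():
            if want <= set(sig):
                out.extend(ids)
        return out
```

With millions of records but only hundreds of distinct structures, the structural filter prunes most of the search space before any record content is examined, which is the effect PyEHR exploits.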
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, C.; Yu, G.; Wang, K.
The physical designs of new concept reactors, which have complex structures, various materials, and varied neutron energy spectra, have greatly raised the requirements on calculation methods and the corresponding computing hardware. Along with widely used parallel algorithms, heterogeneous platform architectures have been introduced into numerical computations in reactor physics. Because of its natural parallel characteristics, the CPU-FPGA architecture is often used to accelerate numerical computation. This paper studies the application and features of this kind of heterogeneous platform in numerical calculations of reactor physics through practical examples. The designed neutron diffusion module based on the CPU-FPGA architecture achieves a speedup factor of 11.2, demonstrating that it is feasible to apply this kind of heterogeneous platform to reactor physics. (authors)
Tuning Spatial Profiles of Selection Pressure to Modulate the Evolution of Drug Resistance
NASA Astrophysics Data System (ADS)
De Jong, Maxwell G.; Wood, Kevin B.
2018-06-01
Spatial heterogeneity plays an important role in the evolution of drug resistance. While recent studies have indicated that spatial gradients of selection pressure can accelerate resistance evolution, much less is known about evolution in more complex spatial profiles. Here we use a stochastic toy model of drug resistance to investigate how different spatial profiles of selection pressure impact the time to fixation of a resistant allele. Using mean first passage time calculations, we show that spatial heterogeneity accelerates resistance evolution when the rate of spatial migration is sufficiently large relative to mutation but slows fixation for small migration rates. Interestingly, there exists an intermediate regime—characterized by comparable rates of migration and mutation—in which the rate of fixation can be either accelerated or decelerated depending on the spatial profile, even when spatially averaged selection pressure remains constant. Finally, we demonstrate that optimal tuning of the spatial profile can dramatically slow the spread and fixation of resistant subpopulations, even in the absence of a fitness cost for resistance. Our results may lay the groundwork for optimized, spatially resolved drug dosing strategies for mitigating the effects of drug resistance.
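The mean first-passage time calculation relied on above has a standard discrete-time form: for a Markov chain with transition matrix P, the expected hitting times of a target state solve a small linear system, (I - Q) t = 1, where Q is P restricted to the non-target states. A sketch of that generic calculation (illustrative; the paper's spatial drug-resistance model is more elaborate):

```python
import numpy as np

def mean_first_passage_times(P, target):
    """Expected number of steps to first reach `target` from each state
    of a discrete-time Markov chain with row-stochastic matrix P.
    Solves (I - Q) t = 1 over the non-target states."""
    n = P.shape[0]
    keep = [i for i in range(n) if i != target]
    Q = P[np.ix_(keep, keep)]
    t = np.linalg.solve(np.eye(len(keep)) - Q, np.ones(len(keep)))
    out = np.zeros(n)
    out[keep] = t  # time from the target itself is 0
    return out
```

For a two-state chain that stays put with probability 1/2 and jumps to the target with probability 1/2, the hitting time satisfies t = 1 + t/2, giving t = 2, which the sketch reproduces.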
Runtime and Architecture Support for Efficient Data Exchange in Multi-Accelerator Applications.
Cabezas, Javier; Gelado, Isaac; Stone, John E; Navarro, Nacho; Kirk, David B; Hwu, Wen-Mei
2015-05-01
Heterogeneous parallel computing applications often process large data sets that require multiple GPUs to jointly meet their needs for physical memory capacity and compute throughput. However, the lack of high-level abstractions in previous heterogeneous parallel programming models forces programmers to resort to multiple code versions, complex data copy steps, and synchronization schemes when exchanging data between multiple GPU devices, which results in high software development cost, poor maintainability, and even poor performance. This paper describes the HPE runtime system, and the associated architecture support, which enables a simple, efficient programming interface for exchanging data between multiple GPUs through either interconnects or cross-node network interfaces. The runtime and architecture support presented in this paper can also be used to support other types of accelerators. We show that the simplified programming interface reduces programming complexity. The research presented in this paper started in 2009. It has been implemented and tested extensively in several generations of HPE runtime systems, and it was adopted into the NVIDIA GPU hardware and drivers for CUDA 4.0 and beyond in 2011. The availability of real hardware that supports key HPE features provides a rare opportunity to study the effectiveness of the hardware support by running important benchmarks on the real runtime and hardware. Experimental results show that in an exemplar heterogeneous system, peer DMA, double-buffering, pinned buffers, and software techniques can improve the inter-accelerator data communication bandwidth by 2×. They can also improve the execution speed by 1.6× for a 3D finite difference, 2.5× for a 1D FFT, and 1.6× for merge sort, all measured on real hardware.
The proposed architecture support enables the HPE runtime to transparently deploy these optimizations under simple portable user code, allowing system designers to freely employ devices of different capabilities. We further argue that simple interfaces such as HPE are needed for most applications to benefit from advanced hardware features in practice.
Akbari, Samin; Pirbodaghi, Tohid
2014-09-07
High throughput heterogeneous immunoassays that screen antigen-specific antibody secreting cells are essential to accelerate monoclonal antibody discovery for therapeutic applications. Here, we introduce a heterogeneous single cell immunoassay based on alginate microparticles as permeable cell culture chambers. Using a microfluidic device, we encapsulated single antibody secreting cells in 35-40 μm diameter alginate microbeads. We functionalized the alginate to capture the secreted antibodies inside the microparticles, enabling single cell analysis and preventing the cross-talk between the neighboring encapsulated cells. We demonstrated non-covalent functionalization of alginate microparticles by adding three secondary antibodies to the alginate solution to form high molecular weight complexes that become trapped in the porous nanostructure of alginate and capture the secreted antibodies. We screened anti-TNF-alpha antibody-secreting cells from a mixture of antibody-secreting cells.
Heterogeneous catalyst for the production of ethylidene diacetate from acetic anhydride
Ramprasad, D.; Waller, F.J.
1998-06-16
This invention relates to a process for producing ethylidene diacetate by the reaction of acetic anhydride, acetic acid, hydrogen and carbon monoxide at elevated temperatures and pressures in the presence of an alkyl halide and a heterogeneous, bifunctional catalyst that is stable to hydrogenation and comprises an insoluble polymer having pendant quaternized heteroatoms, some of which heteroatoms are ionically bonded to anionic Group VIII metal complexes, the remainder of the heteroatoms being bonded to iodide. In contrast to prior art processes, no accelerator (promoter) is necessary to achieve the catalytic reaction and the products are easily separated from the catalyst by filtration. The catalyst can be recycled without loss in activity.
NASA Astrophysics Data System (ADS)
Eghtesad, Adnan; Knezevic, Marko
2018-07-01
A corrective smooth particle method (CSPM) within smooth particle hydrodynamics (SPH) is used to study the deformation of an aircraft structure under high-velocity water-ditching impact load. The CSPM-SPH method features a new approach for the prediction of two-way fluid-structure interaction coupling. Results indicate that the implementation is well suited for modeling the deformation of structures under high-velocity impact into water as evident from the predicted stress and strain localizations in the aircraft structure as well as the integrity of the impacted interfaces, which show no artificial particle penetrations. To reduce the simulation time, a heterogeneous particle size distribution over a complex three-dimensional geometry is used. The variable particle size is achieved from a finite element mesh with variable element size and, as a result, variable nodal (i.e., SPH particle) spacing. To further accelerate the simulations, the SPH code is ported to a graphics processing unit using the OpenACC standard. The implementation and simulation results are described and discussed in this paper.
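In SPH, particle interactions are weighted by a compactly supported smoothing kernel whose length h sets the local resolution; letting h vary per particle is how a heterogeneous particle spacing like the one above can be accommodated. A sketch of the standard 3-D cubic-spline kernel (a generic textbook form, not code from the paper's GPU/OpenACC solver):

```python
import math

def cubic_spline_W(r, h):
    """Standard 3-D cubic-spline SPH kernel with smoothing length h.
    Compact support: W = 0 for r > 2h, so each particle interacts only
    with near neighbors, which is what makes SPH parallelize well."""
    q = r / h
    sigma = 1.0 / (math.pi * h ** 3)  # 3-D normalization constant
    if q <= 1.0:
        return sigma * (1.0 - 1.5 * q * q + 0.75 * q ** 3)
    if q <= 2.0:
        return sigma * 0.25 * (2.0 - q) ** 3
    return 0.0
```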
Gorshkov, Anton V; Kirillin, Mikhail Yu
2015-08-01
Over the past two decades, the Monte Carlo technique has become the gold standard for simulating light propagation in turbid media, including biological tissues. Technological solutions provide further advances of this technique. The Intel Xeon Phi coprocessor is a new type of accelerator for highly parallel general-purpose computing, which allows execution of a wide range of applications without substantial code modification. We present a technical approach to porting our previously developed Monte Carlo (MC) code for simulation of light transport in tissues to the Intel Xeon Phi coprocessor. We show that employing the accelerator reduces the computational time of MC simulations and yields a speed-up comparable to that of a GPU. We demonstrate the performance of the developed code for simulation of light transport in the human head and determination of the measurement volume in near-infrared spectroscopy brain sensing.
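The core of such Monte Carlo photon-transport codes is sampling free path lengths from the Beer-Lambert law. A minimal sketch for a purely absorbing slab (an illustrative toy, not the authors' tissue code) is:

```python
import math, random

def transmittance_mc(mu_t, thickness, n_photons=200_000, seed=1):
    # Monte Carlo estimate of ballistic transmission through a purely
    # absorbing slab: free path lengths are drawn from the exponential
    # distribution p(s) = mu_t * exp(-mu_t * s) (Beer-Lambert law).
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(n_photons):
        s = -math.log(1.0 - rng.random()) / mu_t  # sampled free path
        if s > thickness:
            transmitted += 1
    return transmitted / n_photons
```

For an optical depth of mu_t * thickness = 1 the estimate converges to exp(-1) ≈ 0.368; each photon is independent, which is what makes the method embarrassingly parallel on accelerators like the Xeon Phi.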
An Advice Mechanism for Heterogeneous Robot Teams
NASA Astrophysics Data System (ADS)
Daniluk, Steven
The use of reinforcement learning for robot teams has enabled complex tasks to be performed, but at the cost of requiring a large amount of exploration. Exchanging information between robots in the form of advice is one method to accelerate performance improvements. This thesis presents an advice mechanism for robot teams that utilizes advice from heterogeneous advisers via a method guaranteeing convergence to an optimal policy. The presented mechanism can use multiple advisers at each time step and decide when advice should be requested and accepted, such that the use of advice decreases over time. Additionally, collective, collaborative, and cooperative behavioural algorithms are integrated into a robot team architecture to create a new framework that provides fault tolerance and modularity for robot teams.
Heterogeneous catalyst for the production of acetic anhydride from methyl acetate
Ramprasad, D.; Waller, F.J.
1999-04-06
This invention relates to a process for producing acetic anhydride by the reaction of methyl acetate, carbon monoxide, and hydrogen at elevated temperatures and pressures in the presence of an alkyl halide and a heterogeneous, bifunctional catalyst that contains an insoluble polymer having pendant quaternized phosphine groups, some of which phosphine groups are ionically bonded to anionic Group VIII metal complexes, the remainder of the phosphine groups being bonded to iodide. In contrast to prior art processes, no accelerator (promoter) is necessary to achieve the catalytic reaction and the products are easily separated from the catalyst by filtration. The catalyst can be recycled for consecutive runs without loss in activity. Bifunctional catalysts for use in carbonylating dimethyl ether are also provided.
Ramprasad, D.; Waller, F.J.
1998-04-28
This invention relates to a process for producing ethylidene diacetate by the reaction of dimethyl ether, acetic acid, hydrogen and carbon monoxide at elevated temperatures and pressures in the presence of an alkyl halide and a heterogeneous, bifunctional catalyst that is stable to hydrogenation and comprises an insoluble polymer having pendant quaternized heteroatoms, some of which heteroatoms are ionically bonded to anionic Group VIII metal complexes, the remainder of the heteroatoms being bonded to iodide. In contrast to prior art processes, no accelerator (promoter) is necessary to achieve the catalytic reaction and the products are easily separated from the catalyst by filtration. The catalyst can be recycled for 3 consecutive runs without loss in activity.
Accelerating Mathematics Achievement Using Heterogeneous Grouping
ERIC Educational Resources Information Center
Burris, Carol Corbett; Heubert, Jay P.; Levin, Henry M.
2006-01-01
This longitudinal study examined the effects of providing an accelerated mathematics curriculum in heterogeneously grouped middle school classes in a diverse suburban school district. A quasi-experimental cohort design was used to evaluate subsequent completion of advanced high school math courses as well as academic achievement. Results showed…
NASA Astrophysics Data System (ADS)
Xu, Chuanfu; Deng, Xiaogang; Zhang, Lilun; Fang, Jianbin; Wang, Guangxue; Jiang, Yi; Cao, Wei; Che, Yonggang; Wang, Yongxian; Wang, Zhenghua; Liu, Wei; Cheng, Xinghua
2014-12-01
Programming and optimizing complex, real-world CFD codes on current many-core accelerated HPC systems is very challenging, especially when coordinating CPUs and accelerators to fully tap the potential of heterogeneous systems. In this paper, with a tri-level hybrid and heterogeneous programming model using MPI + OpenMP + CUDA, we port and optimize our high-order multi-block structured CFD software HOSTA on the GPU-accelerated TianHe-1A supercomputer. HOSTA adopts two self-developed high-order compact finite difference schemes, WCNS and HDCS, that can simulate flows with complex geometries. We present a dual-level parallelization scheme for efficient multi-block computation on GPUs and perform particular kernel optimizations for high-order CFD schemes. The GPU-only approach achieves a speedup of about 1.3 when comparing one Tesla M2050 GPU with two Xeon X5670 CPUs. To achieve a greater speedup, we let the CPU and GPU collaborate on HOSTA instead of using a naive GPU-only approach. We present a novel scheme to balance the loads between the memory-poor GPU and the memory-rich CPU. Taking CPU and GPU load balance into account, we improve the maximum simulation problem size per TianHe-1A node for HOSTA by 2.3×; meanwhile, the collaborative approach improves performance by around 45% compared to the GPU-only approach. Further, to scale HOSTA on TianHe-1A, we propose a gather/scatter optimization to minimize PCI-e data transfer times for ghost and singularity data of 3D grid blocks, and overlap the collaborative computation and communication as far as possible using advanced CUDA and MPI features. Scalability tests show that HOSTA can achieve a parallel efficiency of above 60% on 1024 TianHe-1A nodes. With our method, we have successfully simulated an EET high-lift airfoil configuration containing 800M cells and China's large civil airplane configuration containing 150M cells.
To the best of our knowledge, these are the largest-scale CPU-GPU collaborative simulations that solve realistic CFD problems with both complex configurations and high-order schemes.
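The CPU-GPU load-balancing idea can be illustrated with a hypothetical greedy split of grid blocks by cell count, where a target GPU share stands in for the measured GPU/CPU throughput ratio. This is a sketch under those assumptions, not the paper's actual scheme:

```python
def partition_blocks(block_cells, gpu_share):
    # Static load split sketch: walk the multi-block grid and assign
    # blocks to the GPU until it holds roughly `gpu_share` of all cells;
    # the remaining blocks go to the CPU.
    total = sum(block_cells)
    gpu, cpu, acc = [], [], 0
    for b, cells in enumerate(block_cells):
        if acc + cells <= gpu_share * total:
            gpu.append(b)
            acc += cells
        else:
            cpu.append(b)
    return gpu, cpu
```

In a real collaborative solver the share would be tuned from observed per-device timings, and memory capacity (the "memory-poor" GPU) would add a second constraint.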
Some experiences and opportunities for big data in translational research.
Chute, Christopher G; Ullman-Cullere, Mollie; Wood, Grant M; Lin, Simon M; He, Min; Pathak, Jyotishman
2013-10-01
Health care has become increasingly information intensive. The advent of genomic data, integrated into patient care, significantly accelerates the complexity and amount of clinical data. Translational research in the present day increasingly embraces new biomedical discovery in this data-intensive world, thus entering the domain of "big data." The Electronic Medical Records and Genomics consortium has taught us many lessons, while simultaneously advances in commodity computing methods enable the academic community to affordably manage and process big data. Although great promise can emerge from the adoption of big data methods and philosophy, the heterogeneity and complexity of clinical data, in particular, pose additional challenges for big data inferencing and clinical application. However, the ultimate comparability and consistency of heterogeneous clinical information sources can be enhanced by existing and emerging data standards, which promise to bring order to clinical data chaos. Meaningful Use data standards in particular have already simplified the task of identifying clinical phenotyping patterns in electronic health records.
Mach wave properties in the presence of source and medium heterogeneity
NASA Astrophysics Data System (ADS)
Vyas, J. C.; Mai, P. M.; Galis, M.; Dunham, Eric M.; Imperatori, W.
2018-06-01
We investigate Mach wave coherence for kinematic supershear ruptures with spatially heterogeneous source parameters, embedded in 3D scattering media. We assess Mach wave coherence considering: 1) source heterogeneities in terms of variations in slip, rise time and rupture speed; 2) small-scale heterogeneities in Earth structure, parameterized from combinations of three correlation lengths and two standard deviations (assuming von Karman power spectral density with fixed Hurst exponent); and 3) joint effects of source and medium heterogeneities. Ground-motion simulations are conducted using a generalized finite-difference method, choosing a parameterization such that the highest resolved frequency is ~5 Hz. We discover that Mach wave coherence is slightly diminished at near-fault distances (< 10 km) due to spatially variable slip and rise time; beyond this distance the Mach wave coherence is more strongly reduced by wavefield scattering due to small-scale heterogeneities in Earth structure. Based on our numerical simulations and theoretical considerations we demonstrate that the standard deviation of medium heterogeneities controls the wavefield scattering, rather than the correlation length. In addition, we find that peak ground accelerations in the case of combined source and medium heterogeneities are consistent with empirical ground motion prediction equations for all distances, suggesting that in nature ground shaking amplitudes for supershear ruptures may not be elevated due to complexities in the rupture process and seismic wave scattering.
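A 1D analogue of such a heterogeneous medium can be built by superposing cosines with random phases whose amplitudes follow a von Karman power spectrum, then rescaling to the target standard deviation. This is an illustrative construction with assumed parameter names; the study itself uses 3D media:

```python
import math, random

def von_karman_medium(n, dx, corr_len, sigma, hurst=0.3, seed=7):
    # 1D random-medium sketch: random-phase cosine modes with amplitudes
    # shaped by a von Karman power spectrum,
    #   P(k) ~ a / (1 + (k*a)^2)^(H + 1/2),
    # rescaled so the field has exactly the requested standard deviation.
    rng = random.Random(seed)
    modes = []
    for m in range(1, n // 2):
        k = 2.0 * math.pi * m / (n * dx)
        amp = math.sqrt(corr_len / (1.0 + (k * corr_len) ** 2) ** (hurst + 0.5))
        modes.append((k, amp, rng.uniform(0.0, 2.0 * math.pi)))
    field = [sum(a * math.cos(k * i * dx + ph) for k, a, ph in modes)
             for i in range(n)]
    mean = sum(field) / n
    var = sum((f - mean) ** 2 for f in field) / n
    scale = sigma / math.sqrt(var)
    return [(f - mean) * scale for f in field]
```

The exact rescaling mirrors the abstract's observation that the standard deviation, not the correlation length, is the quantity that controls scattering strength.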
Frataxin Accelerates [2Fe-2S] Cluster Formation on the Human Fe–S Assembly Complex
Fox, Nicholas G.; Das, Deepika; Chakrabarti, Mrinmoy; Lindahl, Paul A.; Barondeau, David P.
2015-01-01
Iron–sulfur (Fe–S) clusters function as protein cofactors for a wide variety of critical cellular reactions. In human mitochondria, a core Fe–S assembly complex [called SDUF and composed of NFS1, ISD11, ISCU2, and frataxin (FXN) proteins] synthesizes Fe–S clusters from iron, cysteine sulfur, and reducing equivalents and then transfers these intact clusters to target proteins. In vitro assays have relied on reducing the complexity of this complicated Fe–S assembly process by using surrogate electron donor molecules and monitoring simplified reactions. Recent studies have concluded that FXN promotes the synthesis of [4Fe-4S] clusters on the mammalian Fe–S assembly complex. Here the kinetics of Fe–S synthesis reactions were determined using different electron donation systems and by monitoring the products with circular dichroism and absorbance spectroscopies. We discovered that common surrogate electron donor molecules intercepted Fe–S cluster intermediates and formed high-molecular weight species (HMWS). The HMWS are associated with iron, sulfide, and thiol-containing proteins and have properties of a heterogeneous solubilized mineral with spectroscopic properties remarkably reminiscent of those of [4Fe-4S] clusters. In contrast, reactions using physiological reagents revealed that FXN accelerates the formation of [2Fe-2S] clusters rather than [4Fe-4S] clusters as previously reported. In the preceding paper [Fox, N. G., et al. (2015) Biochemistry 54, DOI: 10.1021/bi5014485], [2Fe-2S] intermediates on the SDUF complex were shown to readily transfer to uncomplexed ISCU2 or apo acceptor proteins, depending on the reaction conditions. Our results indicate that FXN accelerates a rate-limiting sulfur transfer step in the synthesis of [2Fe-2S] clusters on the human Fe–S assembly complex. PMID:26016518
Fourier mode analysis of slab-geometry transport iterations in spatially periodic media
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larsen, E; Zika, M
1999-04-01
We describe a Fourier analysis of the diffusion-synthetic acceleration (DSA) and transport-synthetic acceleration (TSA) iteration schemes for a spatially periodic, but otherwise arbitrarily heterogeneous, medium. Both DSA and TSA converge more slowly in a heterogeneous medium than in a homogeneous medium composed of the volume-averaged scattering ratio. In the limit of a homogeneous medium, our heterogeneous analysis contains eigenvalues of multiplicity two at "resonant" wave numbers. In the presence of material heterogeneities, error modes corresponding to these resonant wave numbers are "excited" more than other error modes. For DSA and TSA, the iteration spectral radius may occur at these resonant wave numbers, in which case the material heterogeneities most strongly affect iterative performance.
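The spectral radius that governs the asymptotic convergence of such iteration schemes can also be estimated numerically by power iteration on an iteration matrix. The sketch below is a generic estimator applied to a small made-up matrix, not the paper's Fourier analysis:

```python
def spectral_radius(mat, iters=200):
    # Power-iteration estimate of the spectral radius of an iteration
    # matrix, i.e. the asymptotic per-sweep error reduction factor one
    # would measure for a source-iteration or DSA scheme.
    n = len(mat)
    v = [1.0] * n
    norm = 1.0
    for _ in range(iters):
        w = [sum(mat[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = max(abs(x) for x in w)  # growth factor in the max-norm
        v = [x / norm for x in w]
    return norm
```

For a diagonalizable matrix with a dominant eigenvalue, the max-norm growth factor converges to the magnitude of that eigenvalue; a 2×2 example with eigenvalues 0.6 and 0.3 is checked below.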
Li, Jiang; Liu, Jun-Ling; Liu, He-Yang; Xu, Guang-Yue; Zhang, Jun-Jie; Liu, Jia-Xing; Zhou, Guang-Lin; Li, Qin; Xu, Zhi-Hao; Fu, Yao
2017-04-10
This work provided the first example of selective hydrodeoxygenation of 5-hydroxymethylfurfural (HMF) to 2,5-dimethylfuran (DMF) over heterogeneous Fe catalysts. A catalyst prepared by the pyrolysis of an Fe-phenanthroline complex on activated carbon at 800 °C was demonstrated to be the most active heterogeneous Fe catalyst. Under the optimal reaction conditions, complete conversion of HMF was achieved with 86.2% selectivity to DMF. The reaction pathway was investigated thoroughly, and the hydrogenation of the C=O bond in HMF was demonstrated to be the rate-determining step during the hydrodeoxygenation, which could be accelerated greatly by using alcohol solvents as additional H-donors. The excellent stability of the Fe catalyst, which was probably a result of the well-preserved active species and the pore structure of the Fe catalyst in the presence of H2, was demonstrated in batch and continuous-flow fixed-bed reactors. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Choi, Hyungsuk; Choi, Woohyuk; Quan, Tran Minh; Hildebrand, David G C; Pfister, Hanspeter; Jeong, Won-Ki
2014-12-01
As the size of image data from microscopes and telescopes increases, the need for high-throughput processing and visualization of large volumetric data has become more pressing. At the same time, many-core processors and GPU accelerators are commonplace, making high-performance distributed heterogeneous computing systems affordable. However, effectively utilizing GPU clusters is difficult for novice programmers, and even experienced programmers often fail to fully leverage the computing power of new parallel architectures due to their steep learning curve and programming complexity. In this paper, we propose Vivaldi, a new domain-specific language for volume processing and visualization on distributed heterogeneous computing systems. Vivaldi's Python-like grammar and parallel processing abstractions provide flexible programming tools for non-experts to easily write high-performance parallel computing code. Vivaldi provides commonly used functions and numerical operators for customized visualization and high-throughput image processing applications. We demonstrate the performance and usability of Vivaldi on several examples ranging from volume rendering to image segmentation.
NASA Astrophysics Data System (ADS)
Cavalli, F.; Naimzada, A.; Pecora, N.
2017-10-01
In the present paper, we investigate the dynamics of a model in which the real part of the economy, described within a multiplier-accelerator framework, interacts with a financial market with heterogeneous speculators, in order to study the channels through which the two sectors influence each other. Employing analytical and numerical tools, we investigate stability conditions as well as bifurcations and possible periodic, quasi-periodic, and chaotic dynamics, shedding light on how the degree of market interaction, together with the accelerator parameter and the intervention of the fiscal authority, may affect the business cycle and the course of the financial market. In particular, we show that even if the steady state is locally stable, multistability phenomena can occur, with several complex dynamic structures coexisting with the steady state. Finally, simulations reveal that the proposed model is able to explain several statistical properties and stylized facts observed in real financial markets, including persistent high volatility, fat-tailed return distributions, volatility clustering, and positive autocorrelation of absolute returns.
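The real-sector backbone of such models is the classic Samuelson multiplier-accelerator recursion. A minimal sketch of that recursion alone, without the paper's financial-market coupling or heterogeneous speculators, is:

```python
def samuelson(c, v, g, steps=200, y0=0.0, y1=0.0):
    # Samuelson multiplier-accelerator recursion:
    #   Y_t = G + c*Y_{t-1} + c*v*(Y_{t-1} - Y_{t-2})
    # where c is the marginal propensity to consume, v the accelerator,
    # and G government spending.
    y = [y0, y1]
    for _ in range(steps):
        y.append(g + c * y[-1] + c * v * (y[-1] - y[-2]))
    return y
```

With c = 0.6 and v = 0.5 the characteristic roots are complex with modulus sqrt(c*v) < 1, so output oscillates and spirals into the equilibrium Y* = G/(1 - c); larger accelerator values push the modulus past 1 and the business cycle becomes self-sustaining, which is the regime the paper's coupled model explores.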
Big data analytics for the Future Circular Collider reliability and availability studies
NASA Astrophysics Data System (ADS)
Begy, Volodimir; Apollonio, Andrea; Gutleber, Johannes; Martin-Marquez, Manuel; Niemi, Arto; Penttinen, Jussi-Pekka; Rogova, Elena; Romero-Marin, Antonio; Sollander, Peter
2017-10-01
Responding to the European Strategy for Particle Physics update 2013, the Future Circular Collider study explores scenarios of circular frontier colliders for the post-LHC era. One branch of the study assesses industrial approaches to model and simulate the reliability and availability of the entire particle collider complex based on the continuous monitoring of CERN’s accelerator complex operation. The modelling is based on an in-depth study of the CERN injector chain and LHC, and is carried out as a cooperative effort with the HL-LHC project. The work so far has revealed that a major challenge is obtaining accelerator monitoring and operational data with sufficient quality, to automate the data quality annotation and calculation of reliability distribution functions for systems, subsystems and components where needed. A flexible data management and analytics environment that permits integrating the heterogeneous data sources, the domain-specific data quality management algorithms and the reliability modelling and simulation suite is a key enabler to complete this accelerator operation study. This paper describes the Big Data infrastructure and analytics ecosystem that has been put in operation at CERN, serving as the foundation on which reliability and availability analysis and simulations can be built. This contribution focuses on data infrastructure and data management aspects and presents case studies chosen for its validation.
Asynchronous Replica Exchange Software for Grid and Heterogeneous Computing.
Gallicchio, Emilio; Xia, Junchao; Flynn, William F; Zhang, Baofeng; Samlalsingh, Sade; Mentes, Ahmet; Levy, Ronald M
2015-11-01
Parallel replica exchange sampling is an extended-ensemble technique often used to accelerate the exploration of the conformational ensemble in atomistic molecular simulations of chemical systems. Inter-process communication and coordination requirements have historically discouraged the deployment of replica exchange on distributed and heterogeneous resources. Here we describe the architecture of a software package (named ASyncRE) for performing asynchronous replica exchange molecular simulations on volunteer computing grids and heterogeneous high-performance clusters. The asynchronous replica exchange algorithm on which the software is based avoids centralized synchronization steps and the need for direct communication between remote processes. It allows molecular dynamics threads to progress at different rates and enables parameter exchanges among arbitrary sets of replicas independently of other replicas. ASyncRE is written in Python following a modular design conducive to extensions to various replica exchange schemes and molecular dynamics engines. Applications of the software to the modeling of association equilibria of supramolecular and macromolecular complexes on BOINC campus computational grids and on the CPU/MIC heterogeneous hardware of the XSEDE Stampede supercomputer are illustrated. They show the ability of ASyncRE to utilize large grids of desktop computers running the Windows, MacOS, and/or Linux operating systems as well as collections of high-performance heterogeneous hardware devices.
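At the heart of any replica exchange scheme, synchronous or asynchronous, is the Metropolis test for swapping two replicas' thermodynamic states; in the asynchronous setting any idle pair can attempt it without a global barrier. A minimal temperature-exchange sketch (standard parallel-tempering criterion, not the ASyncRE API):

```python
import math, random

def swap_accept(e_i, e_j, temp_i, temp_j, rng):
    # Standard replica-exchange Metropolis test: accept the swap of two
    # replicas' temperatures with probability
    #   min(1, exp[(beta_i - beta_j) * (E_i - E_j)]).
    beta_i, beta_j = 1.0 / temp_i, 1.0 / temp_j
    delta = (beta_i - beta_j) * (e_i - e_j)
    return delta >= 0.0 or rng.random() < math.exp(delta)
```

Because the test involves only the two replicas' energies and temperatures, exchanges among arbitrary pairs can proceed independently while other replicas keep running, which is what removes the centralized synchronization step.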
Rumor spreading model with noise interference in complex social networks
NASA Astrophysics Data System (ADS)
Zhu, Liang; Wang, Youguo
2017-03-01
In this paper, a modified susceptible-infected-removed (SIR) model is proposed to explore rumor diffusion on complex social networks. We take the variation of connectivity into consideration and treat that variation as noise. Following related literature on virus networks, the noise is described as standard Brownian motion, and stochastic differential equations (SDE) are derived to characterize the dynamics of rumor diffusion on both homogeneous and heterogeneous networks. Theoretical analysis on homogeneous networks is then presented to investigate the solution of the SDE model and the steady state of rumor diffusion. Simulations on both Barabási-Albert (BA) and Watts-Strogatz (WS) networks show that the addition of noise accelerates rumor diffusion and expands the diffusion size; meanwhile, the spreading speed on the BA network is much faster than on the WS network under the same noise intensity. In addition, there exists a rumor diffusion threshold, in the statistical-average sense, on homogeneous networks that is absent on heterogeneous networks. Finally, we find a positive correlation between the peak value of infected individuals and the noise intensity, and a negative correlation between the rumor lifecycle and the noise intensity overall.
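A mean-field version of such a noisy SIR rumor model can be integrated with the Euler-Maruyama method. The sketch below places multiplicative Brownian noise on the contact term of a homogeneous model; the parameter names are illustrative and this is not the paper's network-structured system:

```python
import math, random

def sir_rumor_em(lam, gamma, sigma, i0, dt=0.01, steps=5000, seed=3):
    # Euler-Maruyama integration of a mean-field SIR-type rumor model
    # with multiplicative noise on the spreading term:
    #   dI = (lam*S*I - gamma*I) dt + sigma*S*I dW
    rng = random.Random(seed)
    s, i, r = 1.0 - i0, i0, 0.0
    peak = i
    for _ in range(steps):
        dw = rng.gauss(0.0, math.sqrt(dt))       # Brownian increment
        spread = lam * s * i * dt + sigma * s * i * dw
        recover = gamma * i * dt
        s -= spread
        i += spread - recover
        r += recover
        peak = max(peak, i)
    return s, i, r, peak
```

Tracking the peak infected fraction and the time to die-out lets one reproduce, qualitatively, the reported positive correlation between noise intensity and peak size.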
The Impact of Heterogeneity on Threshold-Limited Social Contagion, and on Crowd Decision-Making
NASA Astrophysics Data System (ADS)
Karampourniotis, Panagiotis Dimitrios
Recent global events and their poor predictability are often attributed to the complexity of world event dynamics. A key factor generating the turbulence is human diversity. Here, we study the impact of the heterogeneity of individuals on opinion formation and the emergence of global biases. In the case of opinion formation, we focus on the heterogeneity of individuals' susceptibility to new ideas. In the case of global biases, we focus on the aggregated heterogeneity of individuals in a country. First, to capture the complex nature of social influencing, we use a simple but classic model of contagion spreading in complex social systems, namely the threshold model. We investigate numerically and analytically the transition in the behavior of threshold-limited cascades in the presence of multiple initiators as the distribution of thresholds is varied between the two extreme cases of identical thresholds and a uniform distribution. We show that individuals' heterogeneity of susceptibility governs the dynamics, resulting in different sizes of initiators needed for consensus. Furthermore, given the impact of heterogeneity on the cascade dynamics, we investigate selection strategies for accelerating consensus. To this end, we introduce two new selection strategies for Influence Maximization. One of them focuses on finding the balance between targeting nodes which have high resistance to adoption versus nodes positioned in central spots in networks. The second strategy focuses on the combination of nodes for reaching consensus, by targeting nodes which increase the group's influence. Our strategies outperform other existing strategies regardless of the susceptibility diversity and network degree assortativity. Finally, we study the aggregated biases of humans in a global setting. The emergence of technology and globalization gives rise to the debate on whether the world moves towards becoming flat, a world where preferential attachment does not govern economic growth.
By studying the data from a global lending platform we discover that geographical proximity and cultural affinity are highly negatively correlated with levels of flatness of the world. Furthermore, we investigate the robustness of the flatness of the world against sudden catastrophic national events such as political disruptions, by removing countries (nodes) or connections (edges) between them.
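The threshold-model dynamics described above can be sketched as a simple cascade that sweeps the network until no node changes state: a node adopts once the fraction of adopters among its neighbors reaches its personal threshold. This is a minimal illustration with a hypothetical adjacency-dict representation:

```python
def threshold_cascade(adj, thresholds, seeds):
    # Linear threshold cascade: starting from the seed set, repeatedly
    # activate any node whose active-neighbor fraction meets its
    # threshold, until a fixed point is reached.
    active = set(seeds)
    changed = True
    while changed:
        changed = False
        for node, nbrs in adj.items():
            if node in active or not nbrs:
                continue
            frac = sum(1 for v in nbrs if v in active) / len(nbrs)
            if frac >= thresholds[node]:
                active.add(node)
                changed = True
    return active
```

On a four-node path with uniform thresholds of 0.5, a single seed converts everyone; raising every threshold to 0.6 stops the cascade at the seed, illustrating how the threshold distribution governs the initiator set needed for consensus.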
Big data analytics as a service infrastructure: challenges, desired properties and solutions
NASA Astrophysics Data System (ADS)
Martín-Márquez, Manuel
2015-12-01
CERN's accelerator complex generates a very large amount of data: a large volume of heterogeneous data is constantly generated from control equipment and monitoring agents, and these data must be stored and analysed. Over the decades, CERN's research and engineering teams have applied different approaches, techniques and technologies for this purpose. This fragmentation has minimised the necessary collaboration and, more importantly, cross-domain data analytics. These two factors are essential to unlock hidden insights and correlations between the underlying processes, which enable better and more efficient daily accelerator operations and more informed decisions. The proposed Big Data Analytics as a Service Infrastructure aims to: (1) integrate the existing developments; (2) centralise and standardise the complex data analytics needs of CERN's research and engineering community; (3) deliver real-time and batch data analytics and information-discovery capabilities; and (4) provide transparent access and Extract, Transform and Load (ETL) mechanisms to the various mission-critical existing data repositories. This paper presents the desired objectives and properties resulting from the analysis of CERN's data analytics requirements; the main challenges, which are technological, collaborative and educational; and potential solutions.
BrainFrame: a node-level heterogeneous accelerator platform for neuron simulations
NASA Astrophysics Data System (ADS)
Smaragdos, Georgios; Chatzikonstantis, Georgios; Kukreja, Rahul; Sidiropoulos, Harry; Rodopoulos, Dimitrios; Sourdis, Ioannis; Al-Ars, Zaid; Kachris, Christoforos; Soudris, Dimitrios; De Zeeuw, Chris I.; Strydis, Christos
2017-12-01
Objective. The advent of high-performance computing (HPC) in recent years has led to its increasing use in brain studies through computational models. The scale and complexity of such models are constantly increasing, leading to challenging computational requirements. Even though modern HPC platforms can often deal with such challenges, the vast diversity of the modeling field does not permit a homogeneous acceleration platform to effectively address the complete array of modeling requirements. Approach. In this paper we propose and build BrainFrame, a heterogeneous acceleration platform that incorporates three distinct acceleration technologies: an Intel Xeon-Phi CPU, an NVidia GP-GPU and a Maxeler Dataflow Engine. The PyNN software framework is also integrated into the platform. As a challenging proof of concept, we analyze the performance of BrainFrame on different experiment instances of a state-of-the-art neuron model, representing the inferior-olivary nucleus using a biophysically-meaningful, extended Hodgkin-Huxley representation. The model instances take into account not only the neuronal-network dimensions but also different network-connectivity densities, which can drastically affect the workload’s performance characteristics. Main results. The combined use of different HPC technologies demonstrates that BrainFrame is better able to cope with the modeling diversity encountered in realistic experiments while at the same time running on significantly lower energy budgets. Our performance analysis clearly shows that the model directly affects performance and all three technologies are required to cope with all the model use cases. Significance. The BrainFrame framework is designed to transparently configure and select the appropriate back-end accelerator technology for use per simulation run. The PyNN integration provides a familiar bridge to the vast number of models already available. 
Additionally, it gives a clear roadmap for extending the platform support beyond the proof of concept, with improved usability and directly useful features to the computational-neuroscience community, paving the way for wider adoption.
Richards-Henderson, Nicole K.; Goldstein, Allen H.; Wilson, Kevin R.
2015-10-27
In this paper we report an unexpectedly large acceleration in the effective heterogeneous OH reaction rate in the presence of NO. This 10–50 fold acceleration originates from free radical chain reactions, propagated by alkoxy radicals that form inside the aerosol by the reaction of NO with peroxy radicals, which do not appear to produce chain terminating products (e.g., alkyl nitrates), unlike gas phase mechanisms. Lastly, a kinetic model, constrained by experiments, suggests that in polluted regions heterogeneous oxidation plays a much more prominent role in the daily chemical evolution of organic aerosol than previously believed.
Data warehousing methods and processing infrastructure for brain recovery research.
Gee, T; Kenny, S; Price, C J; Seghier, M L; Small, S L; Leff, A P; Pacurar, A; Strother, S C
2010-09-01
In order to accelerate translational neuroscience with the goal of improving clinical care it has become important to support rapid accumulation and analysis of large, heterogeneous neuroimaging samples and their metadata from both normal control and patient groups. We propose a multi-centre, multinational approach to accelerate the data mining of large samples and facilitate data-led clinical translation of neuroimaging results in stroke. Such data-driven approaches are likely to have an early impact on clinically relevant brain recovery while we simultaneously pursue the much more challenging model-based approaches that depend on a deep understanding of the complex neural circuitry and physiological processes that support brain function and recovery. We present a brief overview of three (potentially converging) approaches to neuroimaging data warehousing and processing that aim to support these diverse methods for facilitating prediction of cognitive and behavioral recovery after stroke, or other types of brain injury or disease.
Brosh, Robert M; Bellani, Marina; Liu, Yie; Seidman, Michael M
2017-01-01
Fanconi Anemia (FA) is a rare autosomal genetic disorder characterized by progressive bone marrow failure (BMF), endocrine dysfunction, cancer, and other clinical features commonly associated with normal aging. The anemia stems directly from an accelerated decline of the hematopoietic stem cell compartment. Although FA is a complex heterogeneous disease linked to mutations in 19 currently identified genes, there has been much progress in understanding the molecular pathology involved. FA is broadly considered a DNA repair disorder and the FA gene products, together with other DNA repair factors, have been implicated in interstrand cross-link (ICL) repair. However, in addition to the defective DNA damage response, altered epigenetic regulation, and telomere defects, FA is also marked by elevated levels of inflammatory mediators in circulation, a hallmark of accelerated decline not only in other hereditary aging disorders but also in normal aging. In this review, we offer a perspective of FA as a monogenic accelerated aging disorder, citing the latest evidence for the multi-factorial deficiencies underlying its unique clinical and cellular features. Published by Elsevier B.V.
OpenARC: Extensible OpenACC Compiler Framework for Directive-Based Accelerator Programming Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Seyong; Vetter, Jeffrey S
2014-01-01
Directive-based accelerator programming models such as OpenACC have arisen as an alternative solution for programming emerging Scalable Heterogeneous Computing (SHC) platforms. However, the increased complexity of SHC systems incurs several challenges in terms of portability and productivity. This paper presents an open-sourced OpenACC compiler, called OpenARC, which serves as an extensible research framework to address those issues in directive-based accelerator programming. This paper explains important design strategies and key compiler transformation techniques needed to implement the reference OpenACC compiler. Moreover, this paper demonstrates the efficacy of OpenARC as a research framework for directive-based programming study, by proposing and implementing OpenACC extensions in the OpenARC framework to 1) support hybrid programming of unified memory and separate memory and 2) exploit architecture-specific features in an abstract manner. Porting thirteen standard OpenACC programs and three extended OpenACC programs to CUDA GPUs shows that OpenARC performs similarly to a commercial OpenACC compiler, while serving as a high-level research framework.
Analysis of Vehicle-Following Heterogeneity Using Self-Organizing Feature Maps
Cheu, Ruey Long; Guo, Xiucheng; Romo, Alicia
2014-01-01
A self-organizing feature map (SOM) was used to represent vehicle-following and to analyze the heterogeneities in vehicle-following behavior. The SOM was constructed in such a way that the prototype vectors represented vehicle-following stimuli (the follower's velocity, relative velocity, and gap) while the output signals represented the response (the follower's acceleration). Vehicle trajectories collected at a northbound segment of the Interstate 80 Freeway at Emeryville, CA, were used to train the SOM. The trajectory information of two selected pairs of passenger cars was then fed into the trained SOM to identify similar stimuli experienced by the followers. The observed responses, when the stimuli were classified by the SOM into the same category, were compared to discover the interdriver heterogeneity. The acceleration profile of another passenger car was analyzed in the same fashion to observe the intradriver heterogeneity. The distribution of responses derived from data sets of car-following-car and car-following-truck, respectively, was compared to ascertain inter-vehicle-type heterogeneity. PMID:25538767
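A minimal version of the stimulus-to-category mapping above can be sketched as a one-dimensional SOM over (velocity, relative velocity, gap) vectors. The synthetic "closing-in" and "cruising" stimuli below are made-up illustrations, not the Interstate 80 trajectory data, and the training schedule is a simple assumed one.

```python
import math

def train_som(data, n_units, epochs=50, lr0=0.5, radius0=1.0):
    """Train a 1-D self-organizing map whose prototype vectors represent
    vehicle-following stimuli (follower velocity, relative velocity, gap).
    Prototypes are initialized deterministically from the data."""
    step = max(len(data) // n_units, 1)
    protos = [list(data[i * step]) for i in range(n_units)]
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)               # decaying learning rate
        radius = max(radius0 * (1 - t / epochs), 0.3)  # shrinking neighbourhood
        for x in data:
            # best-matching unit (smallest squared distance)
            bmu = min(range(n_units),
                      key=lambda i: sum((a - b) ** 2
                                        for a, b in zip(protos[i], x)))
            for i, p in enumerate(protos):
                h = math.exp(-((i - bmu) ** 2) / (2 * radius ** 2))
                for d in range(len(p)):
                    p[d] += lr * h * (x[d] - p[d])
    return protos

def classify(protos, x):
    """Index of the best-matching prototype for a stimulus vector."""
    return min(range(len(protos)),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(protos[i], x)))

# Two made-up stimulus regimes: closing in at a short gap vs. cruising
# at a long gap (velocity m/s, relative velocity m/s, gap m).
closing = [[28.0, -2.0, 8.0], [29.0, -1.5, 9.0],
           [28.5, -1.8, 8.5], [27.5, -2.2, 7.5]]
cruising = [[25.0, 0.2, 40.0], [24.5, 0.0, 38.0],
            [25.5, 0.3, 42.0], [24.0, 0.1, 39.0]]
protos = train_som(closing + cruising, n_units=4)
```

After training, stimuli from the two regimes map to different prototype units, which is the step the study uses to group "similar stimuli" before comparing the observed acceleration responses across drivers and vehicle types.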
Higher-level phylogeny of paraneopteran insects inferred from mitochondrial genome sequences
Li, Hu; Shao, Renfu; Song, Nan; Song, Fan; Jiang, Pei; Li, Zhihong; Cai, Wanzhi
2015-01-01
Mitochondrial (mt) genome data have been proven to be informative for animal phylogenetic studies but may also suffer from systematic errors, due to the effects of accelerated substitution rate and compositional heterogeneity. We analyzed the mt genomes of 25 insect species from the four paraneopteran orders, aiming to better understand how accelerated substitution rate and compositional heterogeneity affect the inferences of the higher-level phylogeny of this diverse group of hemimetabolous insects. We found substantial heterogeneity in base composition and contrasting rates in nucleotide substitution among these paraneopteran insects, which complicate the inference of higher-level phylogeny. The phylogenies inferred with concatenated sequences of mt genes using maximum likelihood and Bayesian methods and homogeneous models failed to recover Psocodea and Hemiptera as monophyletic groups but grouped, instead, the taxa that had accelerated substitution rates together, including Sternorrhyncha (a suborder of Hemiptera), Thysanoptera, Phthiraptera and Liposcelididae (a family of Psocoptera). Bayesian inference with nucleotide sequences and heterogeneous models (CAT and CAT + GTR), however, recovered Psocodea, Thysanoptera and Hemiptera each as a monophyletic group. Within Psocodea, Liposcelididae is more closely related to Phthiraptera than to other species of Psocoptera. Furthermore, Thysanoptera was recovered as the sister group to Hemiptera. PMID:25704094
Higdon, Roger; Earl, Rachel K.; Stanberry, Larissa; Hudac, Caitlin M.; Montague, Elizabeth; Stewart, Elizabeth; Janko, Imre; Choiniere, John; Broomall, William; Kolker, Natali
2015-01-01
Complex diseases are caused by a combination of genetic and environmental factors, creating a difficult challenge for diagnosis and defining subtypes. This review article describes how distinct disease subtypes can be identified through integration and analysis of clinical and multi-omics data. A broad shift toward molecular subtyping of disease using genetic and omics data has yielded successful results in cancer and other complex diseases. To determine molecular subtypes, patients are first classified by applying clustering methods to different types of omics data; these results are then integrated with clinical data to characterize distinct disease subtypes. An example of this molecular-data-first approach is in research on Autism Spectrum Disorder (ASD), a spectrum of social communication disorders marked by tremendous etiological and phenotypic heterogeneity. In the case of ASD, omics data such as exome sequences and gene and protein expression data are combined with clinical data such as psychometric testing and imaging to enable subtype identification. Novel ASD subtypes, such as a CHD8 subtype, have been proposed using this molecular subtyping approach. Broader use of molecular subtyping in complex disease research is impeded by data heterogeneity, diversity of standards, and ineffective analysis tools. The future of molecular subtyping for ASD and other complex diseases calls for an integrated resource to identify disease mechanisms, classify new patients, and inform effective treatment options. This in turn will empower and accelerate precision medicine and personalized healthcare. PMID:25831060
NASA Astrophysics Data System (ADS)
Thingbijam, Kiran Kumar; Galis, Martin; Vyas, Jagdish; Mai, P. Martin
2017-04-01
We examine the spatial interdependence between kinematic parameters of earthquake rupture, which include slip, rise-time (total duration of slip), acceleration time (time-to-peak slip velocity), peak slip velocity, and rupture velocity. These parameters were inferred from dynamic rupture models obtained by simulating spontaneous rupture on faults with varying degrees of surface roughness. We observe that the correlations between these parameters are better described by non-linear correlations (that is, on a logarithm-logarithm scale) than by linear correlations. Slip and rise-time are positively correlated, while these two parameters do not correlate with acceleration time, peak slip velocity, or rupture velocity. On the other hand, peak slip velocity correlates positively with rupture velocity but negatively with acceleration time. Acceleration time correlates negatively with rupture velocity. However, the observed correlations could be due to the weak heterogeneity of the slip distributions given by the dynamic models. Therefore, the observed correlations may apply only to those parts of the rupture plane with weak slip heterogeneity if earthquake ruptures involve highly heterogeneous slip distributions. Our findings will help to improve pseudo-dynamic rupture generators for efficient broadband ground-motion simulations for seismic hazard studies.
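The distinction between linear and logarithm-logarithm correlation can be illustrated with synthetic data. The power-law relation and noise level below are assumptions for illustration only, not values taken from the dynamic rupture models: two quantities linked by a power law correlate more strongly after log transformation than on linear axes.

```python
import math
import random

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Synthetic fault-point samples obeying an assumed power law
# rise_time ~ slip^0.5 with weak multiplicative noise.
random.seed(42)
slip = [10 ** random.uniform(-1.5, 1.5) for _ in range(500)]   # metres
rise = [s ** 0.5 * 10 ** random.gauss(0, 0.05) for s in slip]  # seconds

r_linear = pearson(slip, rise)
r_loglog = pearson([math.log10(s) for s in slip],
                   [math.log10(t) for t in rise])
```

On the log-log scale the power law becomes a straight line, so `r_loglog` exceeds `r_linear`, mirroring the paper's observation that the parameter interdependencies are better described by non-linear (log-log) correlations.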
Solving global shallow water equations on heterogeneous supercomputers
Fu, Haohuan; Gan, Lin; Yang, Chao; Xue, Wei; Wang, Lanning; Wang, Xinliang; Huang, Xiaomeng; Yang, Guangwen
2017-01-01
The scientific demand for more accurate modeling of the climate system calls for more computing power to support higher resolutions, inclusion of more component models, more complicated physics schemes, and larger ensembles. As the recent improvements in computing power mostly come from the increasing number of nodes in a system and the integration of heterogeneous accelerators, how to scale the computing problems onto more nodes and various kinds of accelerators has become a challenge for model development. This paper describes our efforts on developing a highly scalable framework for performing global atmospheric modeling on heterogeneous supercomputers equipped with various accelerators, such as GPU (Graphic Processing Unit), MIC (Many Integrated Core), and FPGA (Field Programmable Gate Arrays) cards. We propose a generalized partition scheme of the problem domain, so as to keep a balanced utilization of both CPU resources and accelerator resources. With optimizations on both computing and memory access patterns, we manage to achieve around 8 to 20 times speedup when comparing one hybrid GPU or MIC node with one CPU node with 12 cores. Using customized FPGA-based data-flow engines, we see the potential to gain another 5 to 8 times improvement in performance. On heterogeneous supercomputers, such as Tianhe-1A and Tianhe-2, our framework is capable of achieving nearly ideal linear scaling efficiency, and sustained double-precision performances of 581 Tflops on Tianhe-1A (using 3750 nodes) and 3.74 Pflops on Tianhe-2 (using 8644 nodes). Our study also provides an evaluation of the programming paradigms of the various accelerator architectures (GPU, MIC, FPGA) for performing global atmospheric simulation, to form a picture of both the potential performance benefits and the programming efforts involved. PMID:28282428
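The balanced-utilization idea behind the generalized partition scheme can be sketched as a speed-weighted split of the problem domain. The 1024-column domain and the relative device speeds below are hypothetical numbers, not measurements from Tianhe-1A or Tianhe-2; the point is only that each device gets work in proportion to its throughput.

```python
def partition_columns(n_cols, device_speeds):
    """Split the columns of the problem domain into contiguous chunks
    whose sizes are proportional to each device's measured speed."""
    total = sum(device_speeds)
    bounds, start = [], 0
    for i, speed in enumerate(device_speeds):
        if i == len(device_speeds) - 1:
            # Last device takes the remainder so every column is assigned.
            size = n_cols - start
        else:
            size = round(n_cols * speed / total)
        bounds.append((start, start + size))
        start += size
    return bounds

# Hypothetical hybrid node: one 12-core CPU plus one accelerator
# measured to be about 9x faster on this workload.
chunks = partition_columns(1024, [1.0, 9.0])
```

With these assumed speeds the CPU receives roughly a tenth of the columns and the accelerator the rest, so both finish a time step at about the same moment instead of the CPU idling while the accelerator works.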
Wang, Chuangqi; Choi, Hee June; Kim, Sung-Jin; Desai, Aesha; Lee, Namgyu; Kim, Dohoon; Bae, Yongho; Lee, Kwonmoo
2018-04-27
Cell protrusion is morphodynamically heterogeneous at the subcellular level. However, the mechanism of cell protrusion has been understood based on the ensemble average of actin regulator dynamics. Here, we establish a computational framework called HACKS (deconvolution of heterogeneous activity in coordination of cytoskeleton at the subcellular level) to deconvolve the subcellular heterogeneity of lamellipodial protrusion from live cell imaging. HACKS identifies distinct subcellular protrusion phenotypes based on machine-learning algorithms and reveals their underlying actin regulator dynamics at the leading edge. Using our method, we discover "accelerating protrusion", which is driven by the temporally ordered coordination of Arp2/3 and VASP activities. We validate our finding by pharmacological perturbations and further identify the fine regulation of Arp2/3 and VASP recruitment associated with accelerating protrusion. Our study suggests HACKS can identify specific subcellular protrusion phenotypes susceptible to pharmacological perturbation and reveal how actin regulator dynamics are changed by the perturbation.
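The phenotype-identification step can be caricatured as clustering of per-window edge-velocity time series. The plain k-means below and the two toy profile shapes ("accelerating" versus steady) are illustrative stand-ins for the machine-learning pipeline in HACKS, not its actual algorithms or data.

```python
def kmeans(series, k, iters=20):
    """Plain k-means on fixed-length time series, with the first k
    series used as deterministic initial centroids."""
    centroids = [list(s) for s in series[:k]]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for s in series:
            j = min(range(k),
                    key=lambda i: sum((a - b) ** 2
                                      for a, b in zip(centroids[i], s)))
            clusters[j].append(s)
        centroids = [[sum(v) / len(c) for v in zip(*c)] if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids

def assign(centroids, s):
    """Cluster index of a single time series."""
    return min(range(len(centroids)),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(centroids[i], s)))

# Toy protrusion-velocity profiles: steadily increasing ("accelerating")
# versus roughly constant windows.
accelerating = [[0.5, 1.0, 2.0, 3.5, 5.0], [0.4, 1.1, 2.2, 3.4, 5.2]]
steady = [[2.0, 2.1, 1.9, 2.0, 2.1], [1.8, 2.0, 2.2, 1.9, 2.0]]
centroids = kmeans([accelerating[0], steady[0],
                    accelerating[1], steady[1]], k=2)
```

Once such clusters are separated, each subcellular phenotype can be examined for its own actin-regulator dynamics, which is the deconvolution-then-characterize logic the abstract describes.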
Improving breeding efficiency in potato using molecular and quantitative genetics.
Slater, Anthony T; Cogan, Noel O I; Hayes, Benjamin J; Schultz, Lee; Dale, M Finlay B; Bryan, Glenn J; Forster, John W
2014-11-01
Potatoes are highly heterozygous and the conventional breeding of superior germplasm is challenging, but use of a combination of MAS and EBVs can accelerate genetic gain. Cultivated potatoes are highly heterozygous due to their outbreeding nature, and suffer acute inbreeding depression. Modern potato cultivars also exhibit tetrasomic inheritance. Due to this genetic heterogeneity, the large number of target traits and the specific requirements of commercial cultivars, potato breeding is challenging. A conventional breeding strategy applies phenotypic recurrent selection over a number of generations, a process which can take over 10 years. Recently, major advances in genetics and molecular biology have provided breeders with molecular tools to accelerate gains for some traits. Marker-assisted selection (MAS) can be effectively used for the identification of major genes and quantitative trait loci that exhibit large effects. There are also a number of complex traits of interest, such as yield, that are influenced by a large number of genes of individual small effect where MAS will be difficult to deploy. Progeny testing and the use of pedigree in the analysis can provide effective identification of the superior genetic factors that underpin these complex traits. Recently, it has been shown that estimated breeding values (EBVs) can be developed for complex potato traits. Using a combination of MAS and EBVs for simple and complex traits can lead to a significant reduction in the length of the breeding cycle for the identification of superior germplasm.
Yang, Liulin; Li, Yun; Wei, Zhi; Chang, Xiao
2018-06-01
Neuroblastoma is a highly complex and heterogeneous cancer in children. Acquired genomic alterations including MYCN amplification, 1p deletion and 11q deletion are important risk factors and biomarkers in neuroblastoma. Here, we performed a co-expression-based gene network analysis to study the intrinsic association between specific genomic changes and transcriptome organization. We identified multiple gene coexpression modules which are recurrent in two independent datasets and associated with functional pathways including nervous system development, cell cycle, immune system process and extracellular matrix/space. Our results also indicated that modules involved in nervous system development and cell cycle are highly associated with MYCN amplification and 1p deletion, while modules responding to immune system process are associated with MYCN amplification only. In summary, this integrated analysis provides novel insights into molecular heterogeneity and pathogenesis of neuroblastoma. This article is part of a Special Issue entitled: Accelerating Precision Medicine through Genetic and Genomic Big Data Analysis edited by Yudong Cai & Tao Huang. Copyright © 2017. Published by Elsevier B.V.
From 2D to 3D Supervised Segmentation and Classification for Cultural Heritage Applications
NASA Astrophysics Data System (ADS)
Grilli, E.; Dininno, D.; Petrucci, G.; Remondino, F.
2018-05-01
The digital management of architectural heritage information is still a complex problem, as a heritage object requires an integrated representation of various types of information in order to develop appropriate restoration or conservation strategies. Currently, there is extensive research focused on automatic procedures for the segmentation and classification of 3D point clouds or meshes, which can accelerate the study of a monument and integrate it with heterogeneous information and attributes useful for characterizing and describing the surveyed object. The aim of this study is to propose an optimal, repeatable and reliable procedure to manage various types of 3D surveying data and associate them with such heterogeneous information and attributes. In particular, this paper presents an approach for classifying 3D heritage models, starting from the segmentation of their textures based on supervised machine learning methods. Experimental results on three different case studies demonstrate that the proposed approach is effective and has considerable further potential.
Identification of Nanoparticle Prototypes and Archetypes.
Fernandez, Michael; Barnard, Amanda S
2015-12-22
High-throughput (HT) computational characterization of nanomaterials is poised to accelerate novel material breakthroughs. The number of possible nanomaterials is increasing exponentially along with their complexity, and so statistical and information technology will play a fundamental role in rationalizing nanomaterials HT data. We demonstrate that multivariate statistical analysis of heterogeneous ensembles can identify the truly significant nanoparticles and their most relevant properties. Virtual samples of diamond nanoparticles and graphene nanoflakes are characterized using clustering and archetypal analysis, where we find that saturated particles are defined by their geometry, while nonsaturated nanoparticles are defined by their carbon chemistry. At the convex hull of the nanostructure spaces, a combination of complex archetypes can efficiently describe a large number of members of the ensembles, whereas the regular shapes that are typically assumed to be representative can only describe a small set of the most regular morphologies. This approach provides a route toward the characterization of computationally intractable virtual nanomaterial spaces, which can aid nanomaterials discovery in the foreseen big data scenario.
Fast methods of fungal and bacterial identification. MALDI-TOF mass spectrometry, chromogenic media.
Siller-Ruiz, María; Hernández-Egido, Sara; Sánchez-Juanes, Fernando; González-Buitrago, José Manuel; Muñoz-Bellido, Juan Luis
2017-05-01
MALDI-TOF mass spectrometry is now a routine resource in clinical microbiology, because of its speed and reliability in the identification of microorganisms. Its performance in the identification of bacteria and yeasts is well established. The identification of mycobacteria and moulds is more complex, due to the heterogeneity of spectra within each species; the methodology is somewhat more involved, and expanding the size of species libraries and the number of spectra per species will be crucial to achieving greater efficiency. Direct identification from blood cultures has been implemented, since its contribution to the management of severe patients is evident, but its application to other samples is more complex. Chromogenic media have also contributed to rapid diagnosis of both bacteria and yeasts, since they accelerate the diagnosis, facilitate the detection of mixed cultures and allow rapid detection of resistant species. Copyright © 2017 Elsevier España, S.L.U. and Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.
Low-complexity transcoding algorithm from H.264/AVC to SVC using data mining
NASA Astrophysics Data System (ADS)
Garrido-Cantos, Rosario; De Cock, Jan; Martínez, Jose Luis; Van Leuven, Sebastian; Cuenca, Pedro; Garrido, Antonio
2013-12-01
Nowadays, networks and terminals with diverse bandwidth characteristics and capabilities coexist. To ensure a good quality of experience, this diverse environment demands adaptability of the video stream. In general, video content is compressed to save storage capacity and to reduce the bandwidth required for transmission. If these compressed video streams were encoded using scalable video coding schemes, they would be able to adapt to those heterogeneous networks and a wide range of terminals. Since the majority of multimedia content is compressed using H.264/AVC, however, it cannot benefit from that scalability. This paper proposes a low-complexity algorithm to convert an H.264/AVC bitstream without scalability to scalable bitstreams with temporal scalability, in the baseline and main profiles, by accelerating the mode decision task of the scalable video coding encoding stage using machine learning tools. The results show that when our technique is applied, the complexity is reduced by 87% while coding efficiency is maintained.
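The machine-learning shortcut can be illustrated with a hand-written stand-in for a trained decision tree: instead of an exhaustive rate-distortion search over SVC modes, features already available from the decoded H.264/AVC macroblock pick a mode directly. The feature names, thresholds, and mode labels below are invented for illustration and are not the rules learned in the paper.

```python
def predict_svc_mode(h264_mode, residual_energy, mv_magnitude):
    """Hypothetical decision rules, of the kind a trained tree might learn,
    mapping features of the decoded H.264/AVC macroblock straight to an
    SVC coding mode, skipping the exhaustive mode evaluation."""
    if h264_mode == "SKIP" and residual_energy < 50.0:
        return "SKIP"                 # low-cost block: reuse the skip decision
    if mv_magnitude < 1.0 and residual_energy < 200.0:
        return "INTER_16x16"          # slow, smooth motion: large partition
    if residual_energy < 800.0:
        return "INTER_8x8"            # detailed motion: finer partition
    return "INTRA"                    # prediction failed: fall back to intra
```

Because each macroblock is resolved with a few comparisons rather than trial encodings of every candidate mode, the encoder-side complexity drops sharply, which is the source of the reported 87% reduction.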
Effective correlator for RadioAstron project
NASA Astrophysics Data System (ADS)
Sergeev, Sergey
This paper presents the implementation of a software FX correlator for Very Long Baseline Interferometry, adapted for the RadioAstron project. The correlator is implemented on heterogeneous computing systems using graphics accelerators, and it is shown that graphics hardware is highly efficient for the interferometry task. The host processor of the heterogeneous system forms the data streams for the graphics accelerators, whose number corresponds to the number of frequency channels; for the RadioAstron project there are seven such channels. Each accelerator computes the correlation matrix for all baselines of a single frequency channel: the input data are converted to floating-point format, corrected with the corresponding delay function, and the entire correlation matrix is computed simultaneously using a sliding Fourier transform. Because the problem maps well onto the graphics-accelerator architecture, a single processor of the Kepler platform achieves the performance of a four-node Intel computing cluster on this task. The approach scales successfully not only to a large number of graphics accelerators but also to a large number of nodes with multiple accelerators.
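The FX scheme for one baseline and one frequency channel can be sketched as follows: Fourier-transform each station's stream, multiply one spectrum by the conjugate of the other, and inverse-transform to read off the correlation as a function of lag. A naive O(N²) DFT stands in for the GPU FFT, and the 16-sample single-spike streams are illustrative, not RadioAstron data.

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform (stand-in for a GPU FFT)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                for n in range(N)) for k in range(N)]

def idft(X):
    """Naive inverse discrete Fourier transform."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N)
                for k in range(N)) / N for n in range(N)]

def fx_correlate(x1, x2):
    """FX correlation of one baseline: transform both streams, form the
    cross-spectrum X1 * conj(X2), and invert to get correlation vs. lag."""
    X1, X2 = dft(x1), dft(x2)
    cross = [a * b.conjugate() for a, b in zip(X1, X2)]
    return [c.real for c in idft(cross)]

# One stream is a circularly delayed copy of the other (delay = 3 samples),
# so the correlation peak should appear at lag 3.
signal = [0.0] * 16
signal[5] = 1.0
delayed = signal[-3:] + signal[:-3]
corr = fx_correlate(delayed, signal)
peak_lag = max(range(len(corr)), key=lambda k: corr[k])
```

In the full correlator this cross-multiplication runs for every pair of stations simultaneously, which is why the whole correlation matrix for one frequency channel fits naturally on a single accelerator.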
Sulfur Dioxide Accelerates the Heterogeneous Oxidation Rate of Organic Aerosol by Hydroxyl Radicals
Richards-Henderson, Nicole K.; Goldstein, Allen H.; Wilson, Kevin R.
2016-03-08
There remains considerable uncertainty in how anthropogenic gas phase emissions alter the oxidative aging of organic aerosols in the troposphere. Here we observe a 10-20 fold acceleration in the effective heterogeneous OH oxidation rate of organic aerosol in the presence of SO2. This acceleration originates from radical chain reactions propagated by alkoxy radicals, which are formed efficiently inside the particle by the reaction of peroxy radicals with SO2. As the OH approaches atmospheric concentrations, the radical chain length increases, transforming the aerosol at rates predicted to be up to 10 times the OH-aerosol collision frequency. Model predictions, constrained by experiments over orders-of-magnitude changes in [OH] and [SO2], suggest that in polluted regions ([SO2] ≥ 40 ppb) the heterogeneous processing of organic aerosols by OH occurs on similar time scales as analogous gas-phase oxidation reactions. These results provide evidence for a previously unidentified mechanism by which organic aerosol oxidation is enhanced by anthropogenic gas phase emissions.
NASA Astrophysics Data System (ADS)
Doulgerakis, Matthaios; Eggebrecht, Adam; Wojtkiewicz, Stanislaw; Culver, Joseph; Dehghani, Hamid
2017-12-01
Parameter recovery in diffuse optical tomography is computationally expensive, especially for large and complex volumes, as in the case of human brain functional imaging. The modeling of light propagation, also known as the forward problem, is the computational bottleneck of the recovery algorithm, and the lack of a real-time solution is impeding practical and clinical applications. The objective of this work is the acceleration of the forward model, within a diffusion approximation-based finite-element modeling framework, employing parallelization to expedite the calculation of light propagation in realistic adult head models. The proposed methodology is applicable to modeling both continuous-wave and frequency-domain systems, with the results demonstrating a 10-fold speed increase when GPU architectures are available, while maintaining high accuracy. It is shown that, for a very high-resolution finite-element model of the adult human head with ~600,000 nodes, consisting of heterogeneous layers, light propagation can be calculated at ~0.25 s per excitation source.
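A standard way to keep the per-source cost of such an FEM forward model low is to factorize the sparse system matrix once and reuse the factorization for every excitation source. A toy sketch of that pattern with SciPy; the matrix here is a stand-in 1D diffusion stencil, not an actual head mesh:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import factorized

n = 200                                    # toy 1D "mesh" (a real head model has ~600,000 nodes)
A = sp.diags([-1.0, 2.2, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
solve = factorized(A)                      # sparse LU computed once

sources = []
for i in (10, 100, 190):                   # three excitation-source right-hand sides
    b = np.zeros(n)
    b[i] = 1.0
    sources.append(b)
fluence = np.stack([solve(b) for b in sources])   # cheap triangular solves per source
```

The expensive factorization is amortized over all sources, so each additional excitation costs only a pair of triangular solves.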
The Big Role of Small RNAs in Anxiety and Stress-Related Disorders.
Malan-Müller, S; Hemmings, S M J
2017-01-01
In the study of complex, heterogeneous disorders, such as anxiety and stress-related disorders, epigenetic factors provide an additional level of heritable complexity. MicroRNAs (miRNAs) are a class of small, noncoding RNAs that function as epigenetic modulators of gene expression by binding to target messenger RNAs (mRNAs) and subsequently blocking translation or accelerating their degradation. In light of their abundance in the central nervous system (CNS) and their involvement in synaptic plasticity and neuronal differentiation, miRNAs represent an exciting frontier to be explored in the etiology and treatment of anxiety and stress-related disorders. This chapter will present a thorough review of miRNAs, their functions, and mRNA targets in the CNS, focusing on their role in anxiety and stress-related disorders as described by studies performed in animals and human subjects. © 2017 Elsevier Inc. All rights reserved.
Induced spatial heterogeneity in forest canopies: responses of small mammals.
A.B. Carey
2001-01-01
We hypothesized that creating a mosaic of interspersed patches of different densities of canopy trees in a second-growth Douglas-fir (Pseudotsuga menziesii) forest would accelerate development of biocomplexity (diversity in ecosystem structure, composition, and processes) by promoting spatial heterogeneity in understory, midstory, and canopy...
NASA Astrophysics Data System (ADS)
Hayashi, Akihiro; Wada, Yasutaka; Watanabe, Takeshi; Sekiguchi, Takeshi; Mase, Masayoshi; Shirako, Jun; Kimura, Keiji; Kasahara, Hironori
Heterogeneous multicores have been attracting much attention as a way to attain high performance while keeping power consumption low across a wide range of application areas. However, heterogeneous multicores make programming very difficult, and the long application development period lowers product competitiveness. To overcome this situation, this paper proposes a compilation framework that bridges the gap between programmers and heterogeneous multicores. In particular, it describes a compilation framework based on the OSCAR compiler. The framework realizes coarse-grain task parallel processing, data transfer using a DMA controller, and power-reduction control from user programs with DVFS and clock gating on various heterogeneous multicores from different vendors. The paper also evaluates processing performance and power reduction with the proposed framework on a newly developed 15-core heterogeneous multicore chip named RP-X, integrating 8 general-purpose processor cores and 3 types of accelerator cores, developed by Renesas Electronics, Hitachi, Tokyo Institute of Technology, and Waseda University. The framework attains speedups of up to 32x for an optical-flow program using eight general-purpose processor cores and four DRP (Dynamically Reconfigurable Processor) accelerator cores against sequential execution on a single processor core, and 80% power reduction for real-time AAC encoding.
Complex contagions with timers
NASA Astrophysics Data System (ADS)
Oh, Se-Wook; Porter, Mason A.
2018-03-01
There has been a great deal of effort to try to model social influence—including the spread of behavior, norms, and ideas—on networks. Most models of social influence tend to assume that individuals react to changes in the states of their neighbors without any time delay, but this is often not true in social contexts, where (for various reasons) different agents can have different response times. To examine such situations, we introduce the idea of a timer into threshold models of social influence. The presence of timers on nodes delays adoptions—i.e., changes of state—by the agents, which in turn delays the adoptions of their neighbors. With a homogeneously-distributed timer, in which all nodes have the same amount of delay, the adoption order of nodes remains the same. However, heterogeneously-distributed timers can change the adoption order of nodes and hence the "adoption paths" through which state changes spread in a network. Using a threshold model of social contagions, we illustrate that heterogeneous timers can either accelerate or decelerate the spread of adoptions compared to an analogous situation with homogeneous timers, and we investigate the relationship of such acceleration or deceleration with respect to the timer distribution and network structure. We derive an analytical approximation for the temporal evolution of the fraction of adopters by modifying a pair approximation for the Watts threshold model, and we find good agreement with numerical simulations. We also examine our new timer model on networks constructed from empirical data.
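A minimal sketch of a threshold ("Watts-type") cascade with per-node adoption timers can illustrate the central observation above: a homogeneous timer delays adoptions without changing their order. The toy path graph, thresholds, timers, and update order below are illustrative, not the paper's exact model:

```python
def timer_cascade(adj, theta, timer, seeds, max_steps=50):
    # Threshold cascade in which a node, once its adopted-neighbor fraction
    # reaches theta[v], waits timer[v] steps before adopting. Returns adoption times.
    adopted, countdown = set(seeds), {}
    times = {v: 0 for v in seeds}
    for t in range(1, max_steps + 1):
        for v in adj:                        # start timers for newly eligible nodes
            if v not in adopted and v not in countdown:
                frac = sum(u in adopted for u in adj[v]) / len(adj[v])
                if frac >= theta[v]:
                    countdown[v] = timer[v]
        for v in list(countdown):            # tick timers; adopt on expiry
            if countdown[v] == 0:
                adopted.add(v)
                times[v] = t
                del countdown[v]
            else:
                countdown[v] -= 1
    return times

path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}   # 4-node path graph, seeded at node 0
theta = {v: 0.5 for v in path}
no_delay = timer_cascade(path, theta, {v: 0 for v in path}, {0})
delayed = timer_cascade(path, theta, {v: 1 for v in path}, {0})
```

With the homogeneous timer the adoption order along the path is unchanged while every adoption time is stretched, matching the qualitative claim in the abstract; heterogeneous timer dictionaries can be substituted to explore order changes.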
NASA Astrophysics Data System (ADS)
Shi, X.
2015-12-01
As NSF indicated: "Theory and experimentation have for centuries been regarded as two fundamental pillars of science. It is now widely recognized that computational and data-enabled science forms a critical third pillar." Geocomputation is the third pillar of GIScience and the geosciences. With the exponential growth of geodata, the challenge of scalable, high-performance computing for big-data analytics becomes urgent, because many research activities are constrained by software and tools that cannot complete the computation process at all. Heterogeneous geodata integration and analytics obviously magnify the complexity and the operational time frame. Many large-scale geospatial problems may not be processable at all if the computer system does not have sufficient memory or computational power. Emerging computer architectures, such as Intel's Many Integrated Core (MIC) architecture and the Graphics Processing Unit (GPU), together with advanced computing technologies, provide promising solutions that employ massive parallelism and hardware resources to achieve scalability and high performance for data-intensive computing over large spatiotemporal and social-media data. Exploring novel algorithms and deploying solutions in massively parallel computing environments to achieve scalable data processing and analytics over large-scale, complex, and heterogeneous geodata with consistent quality and high performance has been the central theme of our research team in the Department of Geosciences at the University of Arkansas (UARK). New multi-core architectures combined with application accelerators hold the promise of achieving scalability and high performance by exploiting task- and data-level parallelism that is not supported by conventional computing systems.
Such a parallel or distributed computing environment is particularly suitable for large-scale geocomputation over big data, as demonstrated by our prior work, while the potential of such advanced infrastructure remains largely unexplored in this domain. In this presentation, our prior and ongoing initiatives are summarized to exemplify how we exploit multicore CPUs, GPUs, and MICs, as well as clusters of CPUs, GPUs, and MICs, to accelerate geocomputation in different applications.
2015-01-01
The lateral heterogeneity of cellular membranes plays an important role in many biological functions such as signaling and regulating membrane proteins. This heterogeneity can result from preferential interactions between membrane components or interactions with membrane proteins. One major difficulty in molecular dynamics simulations aimed at studying the membrane heterogeneity is that lipids diffuse slowly and collectively in bilayers, and therefore, it is difficult to reach equilibrium in lateral organization in bilayer mixtures. Here, we propose the use of the replica exchange with solute tempering (REST) approach to accelerate lateral relaxation in heterogeneous bilayers. REST is based on the replica exchange method but tempers only the solute, leaving the temperature of the solvent fixed. Since the number of replicas in REST scales approximately only with the degrees of freedom in the solute, REST enables us to enhance the configuration sampling of lipid bilayers with fewer replicas, in comparison with the temperature replica exchange molecular dynamics simulation (T-REMD) where the number of replicas scales with the degrees of freedom of the entire system. We apply the REST method to a cholesterol and 1,2-dipalmitoyl-sn-glycero-3-phosphocholine (DPPC) bilayer mixture and find that the lateral distribution functions of all molecular pair types converge much faster than in the standard MD simulation. The relative diffusion rate between molecules in REST is, on average, an order of magnitude faster than in the standard MD simulation. Although REST was initially proposed to study protein folding and its efficiency in protein folding is still under debate, we find a unique application of REST to accelerate lateral equilibration in mixed lipid membranes and suggest a promising way to probe membrane lateral heterogeneity through molecular dynamics simulation. PMID:25328493
Huang, Kun; García, Angel E
2014-10-14
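The replica-swap step underlying both T-REMD and REST uses the standard Metropolis acceptance criterion; in REST the effective inverse temperatures apply only to solute-coupled energy terms. A generic sketch of the swap test, with illustrative (made-up) energies and temperatures:

```python
import math
import random

def accept_swap(beta_i, beta_j, E_i, E_j, rng=random):
    # Metropolis criterion for exchanging configurations between two replicas:
    # accept with probability min(1, exp[(beta_i - beta_j) * (E_i - E_j)])
    delta = (beta_i - beta_j) * (E_i - E_j)
    return delta >= 0 or rng.random() < math.exp(delta)

rng = random.Random(0)
# For an unfavorable swap (delta = -1 here), the empirical acceptance
# rate should approach exp(delta)
rate = sum(accept_swap(1.0, 0.5, 2.0, 4.0, rng) for _ in range(20000)) / 20000
```

Favorable swaps (the colder replica holding the higher energy) are always accepted; unfavorable ones are accepted with the Boltzmann-weighted probability, which preserves detailed balance across the replica ladder.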
Jiang, Yuyi; Shao, Zhiqing; Guo, Yi
2014-01-01
A complex computing problem can be solved efficiently on a system with multiple computing nodes by dividing its implementation code into several parallel processing modules or tasks that can be formulated as directed acyclic graph (DAG) problems. The DAG jobs may be mapped to and scheduled on the computing nodes to minimize the total execution time. Searching for an optimal DAG scheduling solution is NP-complete. This paper proposes a tuple molecular structure-based chemical reaction optimization (TMSCRO) method for DAG scheduling on heterogeneous computing systems, based on a recently proposed metaheuristic, chemical reaction optimization (CRO). Compared with other CRO-based algorithms for DAG scheduling, the design of the tuple reaction molecular structure and the four elementary reaction operators of TMSCRO is more reasonable. TMSCRO also applies the concepts of constrained critical paths (CCPs), the constrained-critical-path directed acyclic graph (CCPDAG), and the super molecule to accelerate convergence. We have also conducted simulation experiments to verify the effectiveness and efficiency of TMSCRO on a large set of randomly generated graphs and on graphs for real-world problems. PMID:25143977
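TMSCRO itself is a metaheuristic, but the underlying objective can be seen with a simple greedy earliest-finish-time list scheduler over heterogeneous nodes. This is a baseline sketch, not the paper's algorithm; the toy diamond DAG and node speeds are illustrative:

```python
def list_schedule(order, preds, cost, speed):
    # Greedily place tasks (given in topological order) on whichever
    # heterogeneous node finishes each task earliest.
    finish, busy_until, placement = {}, {p: 0.0 for p in speed}, {}
    for t in order:
        ready = max((finish[d] for d in preds.get(t, ())), default=0.0)
        best = min(speed, key=lambda p: max(busy_until[p], ready) + cost[t] / speed[p])
        start = max(busy_until[best], ready)
        finish[t] = start + cost[t] / speed[best]
        busy_until[best] = finish[t]
        placement[t] = best
    return placement, finish

speed = {"fast": 2.0, "slow": 1.0}                  # heterogeneous node speeds
preds = {"B": ["A"], "C": ["A"], "D": ["B", "C"]}   # diamond-shaped DAG
cost = {"A": 2.0, "B": 2.0, "C": 1.0, "D": 2.0}     # work units per task
placement, finish = list_schedule(["A", "B", "C", "D"], preds, cost, speed)
```

Because the fast node is busy with B, the cheap task C lands on the slow node; metaheuristics such as TMSCRO search over exactly these mapping decisions to shrink the makespan further.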
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, Katherine J; Johnson, Seth R; Prokopenko, Andrey V
'ForTrilinos' is related to The Trilinos Project, which contains a large and growing collection of solver capabilities that can utilize next-generation platforms, in particular scalable multicore, manycore, accelerator, and heterogeneous systems. Trilinos is primarily written in C++, including its user interfaces. While C++ is advantageous for gaining access to the latest programming environments, it limits Trilinos usage from Fortran. Several ad hoc translation interfaces exist to enable Fortran usage of Trilinos, but none of these interfaces is general-purpose or written for reusable and sustainable external use. 'ForTrilinos' provides a seamless pathway for large and complex Fortran-based codes to access Trilinos without C/C++ interface code. This access includes Fortran versions of Kokkos abstractions for code execution and data management.
Seismic signal processing on heterogeneous supercomputers
NASA Astrophysics Data System (ADS)
Gokhberg, Alexey; Ermert, Laura; Fichtner, Andreas
2015-04-01
The processing of seismic signals - including the correlation of massive ambient noise data sets - represents an important part of a wide range of seismological applications. It is characterized by large data volumes as well as high computational input/output intensity. Development of efficient approaches towards seismic signal processing on emerging high performance computing systems is therefore essential. Heterogeneous supercomputing systems introduced in the recent years provide numerous computing nodes interconnected via high throughput networks, every node containing a mix of processing elements of different architectures, like several sequential processor cores and one or a few graphical processing units (GPU) serving as accelerators. A typical representative of such computing systems is "Piz Daint", a supercomputer of the Cray XC 30 family operated by the Swiss National Supercomputing Center (CSCS), which we used in this research. Heterogeneous supercomputers provide an opportunity for manifold application performance increase and are more energy-efficient, however they have much higher hardware complexity and are therefore much more difficult to program. The programming effort may be substantially reduced by the introduction of modular libraries of software components that can be reused for a wide class of seismology applications. The ultimate goal of this research is design of a prototype for such library suitable for implementing various seismic signal processing applications on heterogeneous systems. As a representative use case we have chosen an ambient noise correlation application. Ambient noise interferometry has developed into one of the most powerful tools to image and monitor the Earth's interior. Future applications will require the extraction of increasingly small details from noise recordings. To meet this demand, more advanced correlation techniques combined with very large data volumes are needed. 
This poses new computational problems that require dedicated HPC solutions. The chosen application uses a wide range of common signal-processing methods, including various IIR filter designs, amplitude and phase correlation, computation of the analytic signal, and discrete Fourier transforms. Furthermore, various processing methods specific to seismology, such as rotation of seismic traces, are used. Efficient implementation of all these methods on GPU-accelerated systems presents several challenges. In particular, it requires a careful distribution of work between the sequential processors and the accelerators. Furthermore, since the application is designed to process very large volumes of data, special attention had to be paid to the efficient use of the available memory and networking hardware in order to reduce the intensity of data input and output. In our contribution we explain the software architecture as well as the principal engineering decisions used to address these challenges. We also describe the programming model based on C++ and CUDA that we used to develop the software. Finally, we demonstrate the performance improvements achieved by using the heterogeneous computing architecture. This work was supported by a grant from the Swiss National Supercomputing Centre (CSCS) under project ID d26.
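The core per-trace operations named above (bandpass filtering, the analytic signal, cross-correlation) map directly onto standard signal-processing primitives. A small CPU-only sketch with SciPy, using a synthetic pulse rather than real noise recordings; the sampling rate, band, and lag are illustrative:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert, correlate, correlation_lags

fs = 100.0                                   # Hz
t = np.arange(0, 10, 1 / fs)
trace = np.exp(-((t - 3.0) ** 2) / 0.2)      # synthetic "arrival" at t = 3 s

sos = butter(4, [0.5, 2.0], btype="bandpass", fs=fs, output="sos")
filt = sosfiltfilt(sos, trace)               # zero-phase bandpass filter
envelope = np.abs(hilbert(filt))             # analytic-signal envelope

shift = 25                                   # samples; emulate a delayed second trace
a, b = filt[:-shift], filt[shift:]           # b[n] == a[n + shift]
cc = correlate(b, a, mode="full")
lag = correlation_lags(b.size, a.size, mode="full")[np.argmax(cc)]
```

On a GPU these same kernels would be batched over many station pairs; the correlation-peak lag recovered here is the quantity ambient-noise interferometry extracts at scale.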
Accelerated deflation promotes homogeneous airspace liquid distribution in the edematous lung.
Wu, You; Nguyen, Tam L; Perlman, Carrie E
2017-04-01
Edematous lungs contain regions with heterogeneous alveolar flooding. Liquid is trapped in flooded alveoli by a pressure barrier (higher liquid pressure at the border than in the center of flooded alveoli) that is proportional to surface tension, T. Stress is concentrated between aerated and flooded alveoli, to a degree proportional to T. Mechanical ventilation, by cyclically increasing T, injuriously exacerbates stress concentrations. Overcoming the pressure barrier to redistribute liquid more homogeneously between alveoli should reduce stress concentration prevalence and ventilation injury. In isolated rat lungs, we test whether accelerated deflation can overcome the pressure barrier and catapult liquid out of flooded alveoli. We generate a local edema model with normal T by microinfusing liquid into surface alveoli. We generate a global edema model with high T by establishing hydrostatic edema, which does not alter T, and then gently ventilating the edematous lungs, which increases T at 15 cmH2O transpulmonary pressure by 52%. Thus ventilation of globally edematous lungs increases T, which should increase stress concentrations and, with positive feedback, cause escalating ventilation injury. In the local model, when the pressure barrier is moderate, accelerated deflation causes liquid to escape from flooded alveoli and redistribute more equitably. Flooding heterogeneity tends to decrease. In the global model, accelerated deflation causes liquid escape, but, because of elevated T, the liquid jumps to nearby, aerated alveoli. Flooding heterogeneity is unaltered. In pulmonary edema with normal T, early ventilation with accelerated deflation might reduce the positive feedback mechanism through which ventilation injury increases over time. NEW & NOTEWORTHY We introduce, in the isolated rat lung, a new model of pulmonary edema with elevated surface tension. We first generate hydrostatic edema and then ventilate gently to increase surface tension.
We investigate the mechanical mechanisms through which 1) ventilation injures edematous lungs and 2) ventilation with accelerated deflation might lessen ventilation injury. Copyright © 2017 the American Physiological Society.
Analysis of the landscape complexity and heterogeneity of the Pantanal wetland.
Miranda, C S; Gamarra, R M; Mioto, C L; Silva, N M; Conceição Filho, A P; Pott, A
2018-05-01
This is the first report on analysis of habitat complexity and heterogeneity of the Pantanal wetland. The Pantanal encompasses a peculiar mosaic of environments, being important to evaluate and monitor this area concerning conservation of biodiversity. Our objective was to indirectly measure the habitat complexity and heterogeneity of the mosaic forming the sub-regions of the Pantanal, by means of remote sensing. We obtained free images of Normalized Difference Vegetation Index (NDVI) from the sensor MODIS and calculated the mean value (complexity) and standard deviation (heterogeneity) for each sub-region in the years 2000, 2008 and 2015. The sub-regions of Poconé, Canoeira, Paraguai and Aquidauana presented the highest values of complexity (mean NDVI), between 0.69 and 0.64 in the evaluated years. The highest horizontal heterogeneity (NDVI standard deviation) was observed in the sub-region of Tuiuiú, with values of 0.19 in the years 2000 and 2015, and 0.21 in the year 2008. We concluded that the use of NDVI to estimate landscape parameters is an efficient tool for assessment and monitoring of the complexity and heterogeneity of the Pantanal habitats, applicable in other regions.
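The two landscape metrics used here reduce to the mean (complexity) and standard deviation (heterogeneity) of the NDVI pixels within each sub-region. A minimal sketch with synthetic NDVI values, not MODIS data:

```python
import numpy as np

def landscape_metrics(ndvi, region_mask):
    # Complexity = mean NDVI; heterogeneity = standard deviation of NDVI
    # over the pixels belonging to one sub-region.
    pixels = ndvi[region_mask]
    return pixels.mean(), pixels.std()

rng = np.random.default_rng(0)
ndvi = rng.uniform(0.2, 0.8, size=(100, 100))     # synthetic NDVI scene
mask = np.zeros((100, 100), dtype=bool)
mask[:50, :50] = True                             # one "sub-region" footprint
complexity, heterogeneity = landscape_metrics(ndvi, mask)
```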
Catalytic performance of heterogeneous Rh/C3N4 for the carbonylation of methanol
NASA Astrophysics Data System (ADS)
Budiman, Anatta Wahyu; Choi, Myoung Jae; Nur, Adrian
2018-02-01
Excess water in the homogeneous methanol-carbonylation system can increase the amount of by-products formed through the water-gas shift reaction and can accelerate corrosion of equipment. Many researchers have tried to decrease the water content of the carbonylation system by using lithium and iodide promoters, which yield only moderate catalytic activity at a water content of 2 wt%. A heterogenized catalyst offers several distinct advantages: it enables an increased catalyst concentration in the reaction mixture, which is directly proportional to the acetic acid production rate, without the addition of an alkali iodide salt promoter, and it also reduces by-product formation. This study aims to produce a novel catalyst (Rh/C3N4) with high selectivity for acetic acid at relatively low water and halide contents. The novel catalyst achieves high conversion and selectivity for acetic acid as a result of the strong ionic bonding between melamine and the rhodium complex species, which is promoted by the presence of methyl iodide. CO2 in the feed gas significantly decreases the catalytic activity of Rh-melamine because of its inert character. Kinetic tests were fitted with a first-order kinetic equation and revealed that the reaction route of methanol carbonylation in this system proceeds through methyl acetate.
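A first-order rate constant is conventionally extracted from the slope of ln(concentration) versus time. A generic sketch of that fit with synthetic data, not the study's measurements:

```python
import numpy as np

k_true = 0.15                           # 1/min, illustrative rate constant
t = np.linspace(0, 20, 11)              # min
conc = 1.0 * np.exp(-k_true * t)        # synthetic first-order decay, C0 = 1

# First-order kinetics: ln C = ln C0 - k t, so a linear fit gives -k as the slope
slope, intercept = np.polyfit(t, np.log(conc), 1)
k_fit = -slope
```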
NASA Astrophysics Data System (ADS)
McClure, J. E.; Prins, J. F.; Miller, C. T.
2014-07-01
Multiphase flow implementations of the lattice Boltzmann method (LBM) are widely applied to the study of porous medium systems. In this work, we construct a new variant of the popular "color" LBM for two-phase flow in which a three-dimensional, 19-velocity (D3Q19) lattice is used to compute the momentum transport solution while a three-dimensional, seven-velocity (D3Q7) lattice is used to compute the mass transport solution. Based on this formulation, we implement a novel heterogeneous GPU-accelerated algorithm in which the mass transport solution is computed by multiple shared-memory CPU cores programmed using OpenMP while a concurrent solution of the momentum transport is performed using a GPU. The heterogeneous solution is demonstrated to provide a speedup of 2.6× compared to the multi-core CPU solution and 1.8× compared to the GPU solution, due to concurrent utilization of both CPU and GPU bandwidths. Furthermore, we verify that the proposed formulation provides an accurate physical representation of multiphase flow processes and demonstrate that the approach can be applied to perform heterogeneous simulations of two-phase flow in porous media using a typical GPU-accelerated workstation.
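The D3Q7 lattice used for the mass-transport solution has one rest velocity and six axis-aligned velocities. Below is a serial NumPy sketch of a pure-diffusion D3Q7 stream-and-collide update; the weights, relaxation time, and grid size are illustrative, and the paper's solver additionally couples this to the D3Q19 momentum lattice and runs the two concurrently on CPU and GPU:

```python
import numpy as np

# D3Q7: rest velocity plus six axis-aligned lattice velocities
c = [(0, 0, 0), (1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
w = np.array([0.25] + [0.125] * 6)          # common D3Q7 weights
tau = 1.0                                   # BGK relaxation time

N = 16
f = np.zeros((7, N, N, N))
C = np.zeros((N, N, N))
C[N // 2, N // 2, N // 2] = 1.0             # point release of tracer mass
f[:] = w[:, None, None, None] * C           # start at zero-velocity equilibrium

for _ in range(20):
    C = f.sum(axis=0)                       # zeroth moment: local concentration
    feq = w[:, None, None, None] * C        # equilibrium for pure diffusion (u = 0)
    f -= (f - feq) / tau                    # BGK collision
    for i in range(7):                      # streaming step, periodic boundaries
        f[i] = np.roll(f[i], shift=c[i], axis=(0, 1, 2))
C = f.sum(axis=0)
```

Collision and streaming both conserve mass exactly, so the released tracer spreads diffusively while its total stays fixed; this independence of the mass lattice from the momentum lattice is what allows the two to be updated on different devices.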
Characterizing heterogeneous cellular responses to perturbations.
Slack, Michael D; Martinez, Elisabeth D; Wu, Lani F; Altschuler, Steven J
2008-12-09
Cellular populations have been widely observed to respond heterogeneously to perturbation. However, interpreting the observed heterogeneity is an extremely challenging problem because of the complexity of possible cellular phenotypes, the large dimension of potential perturbations, and the lack of methods for separating meaningful biological information from noise. Here, we develop an image-based approach to characterize cellular phenotypes based on patterns of signaling marker colocalization. Heterogeneous cellular populations are characterized as mixtures of phenotypically distinct subpopulations, and responses to perturbations are summarized succinctly as probabilistic redistributions of these mixtures. We apply our method to characterize the heterogeneous responses of cancer cells to a panel of drugs. We find that cells treated with drugs of (dis-)similar mechanism exhibit (dis-)similar patterns of heterogeneity. Despite the observed phenotypic diversity of cells observed within our data, low-complexity models of heterogeneity were sufficient to distinguish most classes of drug mechanism. Our approach offers a computational framework for assessing the complexity of cellular heterogeneity, investigating the degree to which perturbations induce redistributions of a limited, but nontrivial, repertoire of underlying states and revealing functional significance contained within distinct patterns of heterogeneous responses.
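The summary proposed here, in which each condition is described as a mixture over a fixed set of phenotype subpopulations, can be sketched with a nearest-centroid assignment. The reference centroids and cell feature vectors below are made up for illustration; the paper's actual pipeline derives subpopulations from imaging data:

```python
import numpy as np

def subpop_profile(cells, centroids):
    # Assign each cell's feature vector to the nearest reference
    # subpopulation centroid and return the mixture proportions.
    d = np.linalg.norm(cells[:, None, :] - centroids[None, :, :], axis=2)
    labels = d.argmin(axis=1)
    return np.bincount(labels, minlength=len(centroids)) / len(cells)

centroids = np.array([[0.0, 0.0], [10.0, 10.0]])      # two reference phenotypes
control = np.array([[0.1, 0.2], [0.3, 0.1], [9.8, 10.1], [10.2, 9.9]])
treated = np.array([[0.2, 0.1], [9.9, 10.0], [10.1, 9.8], [9.7, 10.2]])
# A perturbation is summarized as the redistribution between subpopulations
shift = np.abs(subpop_profile(treated, centroids) - subpop_profile(control, centroids)).sum()
```

Comparing such profile shifts across drugs is a simple stand-in for the paper's observation that drugs of similar mechanism induce similar redistributions.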
Cooperative growth phenomena in silicon/germanium low-temperature epitaxy
NASA Astrophysics Data System (ADS)
Meyerson, Bernard S.; Uram, Kevin J.; LeGoues, Francoise K.
1988-12-01
A series of Si:Ge alloys and structures has been prepared by ultrahigh-vacuum chemical vapor deposition. Alloys of composition 0≤Ge/Si≤0.20 are readily deposited at T=550 °C. Commensurate, defect-free strained layers are deposited up to a critical thickness, whereupon the accumulated stress in the films is accommodated by the formation of dislocation networks in the substrate wafers. A cooperative growth phenomenon is observed where the addition of 10% germane to the gaseous deposition source accelerates silane's heterogeneous reaction rate by a factor of 25. A model is proposed where Ge acts as a desorption center for mobile hydrogen adatoms on the Si[100] surface, accelerating heterogeneous silane pyrolysis by the enhanced availability of chemisorption sites.
A FFT-based formulation for discrete dislocation dynamics in heterogeneous media
NASA Astrophysics Data System (ADS)
Bertin, N.; Capolungo, L.
2018-02-01
In this paper, an extension of the DDD-FFT approach presented in [1] is developed for heterogeneous elasticity. For such a purpose, an iterative spectral formulation in which convolutions are calculated in the Fourier space is developed to solve for the mechanical state associated with the discrete eigenstrain-based microstructural representation. With this, the heterogeneous DDD-FFT approach is capable of treating anisotropic and heterogeneous elasticity in a computationally efficient manner. In addition, a GPU implementation is presented to allow for further acceleration. As a first example, the approach is used to investigate the interaction between dislocations and second-phase particles, thereby demonstrating its ability to inherently incorporate image forces arising from elastic inhomogeneities.
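The central identity behind such spectral formulations (a periodic convolution becomes a pointwise product in Fourier space) can be demonstrated in a toy form. This is not the authors' DDD-FFT code; it uses a naive pure-Python DFT purely to keep the sketch self-contained.

```python
import cmath

# Toy illustration of the idea behind FFT-based mechanics solvers: a periodic
# (circular) convolution evaluated as a pointwise product in Fourier space.

def dft(x, sign=-1):
    n = len(x)
    return [sum(x[k] * cmath.exp(sign * 2j * cmath.pi * j * k / n)
                for k in range(n)) for j in range(n)]

def circular_convolution_fourier(f, g):
    n = len(f)
    F, G = dft(f), dft(g)
    # Pointwise product in Fourier space, then inverse transform (1/n scaling).
    H = [a * b for a, b in zip(F, G)]
    return [v.real / n for v in dft(H, sign=+1)]

def circular_convolution_direct(f, g):
    n = len(f)
    return [sum(f[k] * g[(j - k) % n] for k in range(n)) for j in range(n)]

f = [1.0, 2.0, 0.0, -1.0]
g = [0.5, 0.0, 0.5, 0.0]
spectral = circular_convolution_fourier(f, g)
direct = circular_convolution_direct(f, g)
print(all(abs(a - b) < 1e-9 for a, b in zip(spectral, direct)))  # → True
```

In a production solver the naive DFT would of course be replaced by an FFT, which is what makes the O(n log n) cost per iteration, and hence the GPU acceleration reported above, worthwhile.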
Towards functional antibody-based vaccines to prevent pre-erythrocytic malaria infection.
Sack, Brandon; Kappe, Stefan H I; Sather, D Noah
2017-05-01
An effective malaria vaccine would be considered a milestone of modern medicine, yet has so far eluded research and development efforts. This can be attributed to the extreme complexity of malaria parasites, which present a multi-stage life cycle, high genomic complexity, and sophisticated immune evasion measures, particularly antigenic variation during pathogenic blood-stage infection. However, the pre-erythrocytic (PE) early infection forms of the parasite exhibit relatively invariant proteomes, and are attractive vaccine targets as they offer multiple points of immune system attack. Areas covered: We cover the current state of and roadblocks to the development of an effective, antibody-based PE vaccine, including current vaccine candidates, limited biological knowledge, genetic heterogeneity, parasite complexity, and suboptimal preclinical models, as well as the power of early stage clinical models. Expert commentary: PE vaccines will need to elicit broad and durable immunity to prevent infection. This could be achievable if recent innovations in studying the parasites' infection biology, rational vaccine selection and design, and adjuvant formulation are combined in a synergistic and multipronged approach. Improved preclinical assays as well as the iterative testing of vaccine candidates in controlled human malaria infection trials will further accelerate this effort.
Multiscale high-order/low-order (HOLO) algorithms and applications
NASA Astrophysics Data System (ADS)
Chacón, L.; Chen, G.; Knoll, D. A.; Newman, C.; Park, H.; Taitano, W.; Willert, J. A.; Womeldorff, G.
2017-02-01
We review the state of the art in the formulation, implementation, and performance of so-called high-order/low-order (HOLO) algorithms for challenging multiscale problems. HOLO algorithms attempt to couple one or several high-complexity physical models (the high-order model, HO) with low-complexity ones (the low-order model, LO). The primary goal of HOLO algorithms is to achieve nonlinear convergence between HO and LO components while minimizing memory footprint and managing the computational complexity in a practical manner. Key to the HOLO approach is the use of the LO representations to address temporal stiffness, effectively accelerating the convergence of the HO/LO coupled system. The HOLO approach is broadly underpinned by the concept of nonlinear elimination, which enables segregation of the HO and LO components in ways that can effectively use heterogeneous architectures. The accuracy and efficiency benefits of HOLO algorithms are demonstrated with specific applications to radiation transport, gas dynamics, plasmas (both Eulerian and Lagrangian formulations), and ocean modeling. Across this broad application spectrum, HOLO algorithms achieve significant accuracy improvements at a fraction of the cost compared to conventional approaches. It follows that HOLO algorithms hold significant potential for high-fidelity system scale multiscale simulations leveraging exascale computing.
GeauxDock: Accelerating Structure-Based Virtual Screening with Heterogeneous Computing
Fang, Ye; Ding, Yun; Feinstein, Wei P.; Koppelman, David M.; Moreno, Juana; Jarrell, Mark; Ramanujam, J.; Brylinski, Michal
2016-01-01
Computational modeling of drug binding to proteins is an integral component of direct drug design. Particularly, structure-based virtual screening is often used to perform large-scale modeling of putative associations between small organic molecules and their pharmacologically relevant protein targets. Because of a large number of drug candidates to be evaluated, an accurate and fast docking engine is a critical element of virtual screening. Consequently, highly optimized docking codes are of paramount importance for the effectiveness of virtual screening methods. In this communication, we describe the implementation, tuning and performance characteristics of GeauxDock, a recently developed molecular docking program. GeauxDock is built upon the Monte Carlo algorithm and features a novel scoring function combining physics-based energy terms with statistical and knowledge-based potentials. Developed specifically for heterogeneous computing platforms, the current version of GeauxDock can be deployed on modern, multi-core Central Processing Units (CPUs) as well as massively parallel accelerators, Intel Xeon Phi and NVIDIA Graphics Processing Unit (GPU). First, we carried out a thorough performance tuning of the high-level framework and the docking kernel to produce a fast serial code, which was then ported to shared-memory multi-core CPUs yielding a near-ideal scaling. Further, using Xeon Phi gives 1.9× performance improvement over a dual 10-core Xeon CPU, whereas the best GPU accelerator, GeForce GTX 980, achieves a speedup as high as 3.5×. On that account, GeauxDock can take advantage of modern heterogeneous architectures to considerably accelerate structure-based virtual screening applications. GeauxDock is open-sourced and publicly available at www.brylinski.org/geauxdock and https://figshare.com/articles/geauxdock_tar_gz/3205249. PMID:27420300
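GeauxDock is built on a Monte Carlo algorithm. The generic Metropolis skeleton that such a docking engine is built around can be sketched as follows; this is not GeauxDock's actual kernel or scoring function, and the quadratic "energy" is a stand-in for a real docking score.

```python
import math
import random

# Generic Metropolis Monte Carlo skeleton of the kind a docking engine is
# built around (not GeauxDock's kernel): perturb a pose, score it, and
# accept or reject the move by the Metropolis criterion.

def energy(pose):
    # Stand-in score: a simple quadratic well with its minimum at the origin.
    return sum(x * x for x in pose)

def metropolis_search(pose, beta=2.0, step=0.5, n_steps=5000, seed=7):
    rng = random.Random(seed)
    e = energy(pose)
    best, best_e = list(pose), e
    for _ in range(n_steps):
        trial = [x + rng.uniform(-step, step) for x in pose]
        e_trial = energy(trial)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if e_trial <= e or rng.random() < math.exp(-beta * (e_trial - e)):
            pose, e = trial, e_trial
            if e < best_e:
                best, best_e = list(pose), e
    return best, best_e

start = [3.0, -2.0, 1.5]
best_pose, best_score = metropolis_search(start)
print(best_score < energy(start))  # → True
```

Because each Monte Carlo trajectory is independent, many such searches can run in parallel, which is what makes the algorithm a good fit for the CPU/Xeon Phi/GPU platforms discussed above.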
Aerosol simulation including chemical and nuclear reactions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marwil, E.S.; Lemmon, E.C.
1985-01-01
The numerical simulation of aerosol transport, including the effects of chemical and nuclear reactions, presents a challenging dynamic accounting problem. Particles of different sizes agglomerate and settle out due to various mechanisms, such as diffusion, diffusiophoresis, thermophoresis, gravitational settling, turbulent acceleration, and centrifugal acceleration. Particles also change size due to the condensation and evaporation of materials on the particle. Heterogeneous chemical reactions occur at the interface between a particle and the suspending medium, or a surface and the gas in the aerosol. Homogeneous chemical reactions occur within the aerosol suspending medium, within a particle, and on a surface. These reactions may include a phase change. Nuclear reactions occur in all locations. These spontaneous transmutations from one element form to another occur at greatly varying rates and may result in phase or chemical changes which complicate the accounting process. This paper presents an approach for inclusion of these effects on the transport of aerosols. The accounting system is very complex and results in a large set of stiff ordinary differential equations (ODEs). The techniques for numerical solution of these ODEs require special attention to achieve their solution in an efficient and affordable manner.
The big data-big model (BDBM) challenges in ecological research
NASA Astrophysics Data System (ADS)
Luo, Y.
2015-12-01
The field of ecology has become a big-data science in the past decades due to the development of new sensors used in numerous studies in the ecological community. Many sensor networks have been established to collect data. For example, satellites such as Terra and OCO-2 have collected data relevant to the global carbon cycle. Thousands of field manipulative experiments have been conducted to examine feedback of the terrestrial carbon cycle to global changes. Networks of observations, such as FLUXNET, have measured land processes. In particular, the implementation of the National Ecological Observatory Network (NEON), which is designed to network different kinds of sensors at many locations over the nation, will generate large volumes of ecological data every day. The raw data from those sensor networks offer an unprecedented opportunity for accelerating advances in our knowledge of ecological processes, educating teachers and students, supporting decision-making, testing ecological theory, and forecasting changes in ecosystem services. Currently, ecologists do not have the infrastructure in place to synthesize massive yet heterogeneous data into resources for decision support. It is urgent to develop an ecological forecasting system that can make the best use of multiple sources of data to assess long-term biosphere change and anticipate future states of ecosystem services at regional and continental scales. Forecasting relies on big models that describe major processes that underlie complex system dynamics. Ecological system models, despite great simplification of the real systems, are still complex in order to address real-world problems. For example, the Community Land Model (CLM) incorporates thousands of processes related to energy balance, hydrology, and biogeochemistry. Integration of massive data from multiple big data sources with complex models has to tackle Big Data-Big Model (BDBM) challenges.
Those challenges include interoperability of multiple, heterogeneous data sets; intractability of structural complexity of big models; equifinality of model structure selection and parameter estimation; and computational demand of global optimization with Big Models.
Kyung, Daeseung; Lim, Hyung-Kyu; Kim, Hyungjun; Lee, Woojin
2015-01-20
In this study, we investigated experimentally and computationally the effect of organo-mineral complexes on the nucleation kinetics of CO2 hydrate. These complexes formed via adsorption of zwitter-ionic glycine (Gly-zw) onto the surface of sodium montmorillonite (Na-MMT). The electrostatic attraction between the −NH3(+) group of Gly-zw, and the negatively charged Na-MMT surface, provides the thermodynamic driving force for the organo-mineral complexation. We suggest that the complexation of Gly-zw on the Na-MMT surface accelerates CO2 hydrate nucleation kinetics by increasing the mineral–water interfacial area (thus increasing the number of effective hydrate-nucleation sites), and also by suppressing the thermal fluctuation of solvated Na(+) (a well-known hydrate formation inhibitor) in the vicinity of the mineral surface by coordinating with the −COO(–) groups of Gly-zw. We further confirmed that the local density of hydrate-forming molecules (i.e., reactants of CO2 and water) at the mineral surface (regardless of the presence of Gly-zw) becomes greater than that of bulk phase. This is expected to promote the hydrate nucleation kinetics at the surface. Our study sheds new light on CO2 hydrate nucleation kinetics in heterogeneous marine environments, and could provide knowledge fundamental to successful CO2 sequestration under seabed sediments.
Lemesle, B; Planton, M; Pagès, B; Pariente, J
Temporal lobe epilepsy (TLE) is a type of epilepsy that often has a negative impact on patients' memory. Despite the importance of patients' complaints in this regard, the difficulties described by these patients are often not easy to demonstrate through a standard neuropsychological assessment. Accelerated long-term forgetting and autobiographical memory disorders are the two main memory impairments reported in the literature in patients with TLE. However, the methods used by different authors to evaluate long-term memory and autobiographical memory are heterogeneous. This heterogeneity can lead to differences in the observed results as well as how they are interpreted. Yet, despite the methodological differences, objectification of such memory deficits appears to be both specific and robust within this patient population. Analysis of the literature shows that accelerated long-term forgetting and autobiographical memory disorders share the same clinical characteristics. This leads to the assumption that they are, in fact, only one entity and that their evaluation may be done through a single procedure. Our proposal is to place this evaluation within the context of memory consolidation disorders. With such a perspective, evaluation of accelerated forgetting in autobiographical memory should consist of identifying a disorder in the formation and/or recovery of new memory traces. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
Stochastic Analysis and Design of Heterogeneous Microstructural Materials System
NASA Astrophysics Data System (ADS)
Xu, Hongyi
Advanced materials system refers to new materials that are comprised of multiple traditional constituents but complex microstructure morphologies, which lead to superior properties over the conventional materials. To accelerate the development of new advanced materials systems, the objective of this dissertation is to develop a computational design framework and the associated techniques for design automation of microstructural materials systems, with an emphasis on addressing the uncertainties associated with the heterogeneity of microstructural materials. Five key research tasks are identified: design representation, design evaluation, design synthesis, material informatics and uncertainty quantification. Design representation of microstructure includes statistical characterization and stochastic reconstruction. This dissertation develops a new descriptor-based methodology, which characterizes 2D microstructures using descriptors of composition, dispersion and geometry. Statistics of 3D descriptors are predicted based on 2D information to enable 2D-to-3D reconstruction. An efficient sequential reconstruction algorithm is developed to reconstruct statistically equivalent random 3D digital microstructures. In design evaluation, a stochastic decomposition and reassembly strategy is developed to deal with the high computational costs and uncertainties induced by material heterogeneity. The properties of Representative Volume Elements (RVEs) are predicted by stochastically reassembling Statistical Volume Elements (SVEs) with stochastic properties into a coarse representation of the RVE. In design synthesis, a new descriptor-based design framework is developed, which integrates computational methods of microstructure characterization and reconstruction, sensitivity analysis, Design of Experiments (DOE), metamodeling, and optimization to enable parametric optimization of the microstructure for achieving the desired material properties.
Material informatics is studied to efficiently reduce the dimension of microstructure design space. This dissertation develops a machine learning-based methodology to identify the key microstructure descriptors that highly impact properties of interest. In uncertainty quantification, a comparative study on data-driven random process models is conducted to provide guidance for choosing the most accurate model in statistical uncertainty quantification. Two new goodness-of-fit metrics are developed to provide quantitative measurements of random process models' accuracy. The benefits of the proposed methods are demonstrated by the example of designing the microstructure of polymer nanocomposites. This dissertation provides material-generic, intelligent modeling/design methodologies and techniques to accelerate the process of analyzing and designing new microstructural materials system.
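Two of the descriptor types named above (composition and dispersion) can be illustrated on a binary 2D grid: the volume fraction and a two-point probability S2(r) along one axis with periodic wraparound. This is only a toy illustration, not the dissertation's actual descriptor set.

```python
# Toy versions of two microstructure descriptors: composition (volume
# fraction) and a dispersion-type descriptor (the two-point probability
# S2(r) along the horizontal axis, with periodic wraparound).

def volume_fraction(grid):
    cells = [v for row in grid for v in row]
    return sum(cells) / len(cells)

def two_point_probability(grid, r):
    """P(both ends of a horizontal segment of length r lie in phase 1)."""
    rows, cols = len(grid), len(grid[0])
    hits = sum(grid[i][j] * grid[i][(j + r) % cols]
               for i in range(rows) for j in range(cols))
    return hits / (rows * cols)

grid = [
    [1, 1, 0, 0],
    [1, 1, 0, 0],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
vf = volume_fraction(grid)
s2 = [two_point_probability(grid, r) for r in range(3)]
print(vf, s2)  # → 0.5 [0.5, 0.25, 0.0]
```

Matching such statistics between a reconstructed digital microstructure and the characterized sample is the basic acceptance criterion in descriptor-based reconstruction.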
An Experimental Investigation of Incompressible Richtmyer-Meshkov Instability
NASA Technical Reports Server (NTRS)
Jacobs, J. W.; Niederhaus, C. E.
2002-01-01
Richtmyer-Meshkov (RM) instability occurs when two different density fluids are impulsively accelerated in the direction normal to their nearly planar interface. The instability causes small perturbations on the interface to grow and eventually become a turbulent flow. It is closely related to Rayleigh-Taylor instability, which is the instability of a planar interface undergoing constant acceleration, such as caused by the suspension of a heavy fluid over a lighter one in the earth's gravitational field. Like the well-known Kelvin-Helmholtz instability, RM instability is a fundamental hydrodynamic instability which exhibits many of the nonlinear complexities that transform simple initial conditions into a complex turbulent flow. Furthermore, the simplicity of RM instability (in that it requires very few defining parameters), and the fact that it can be generated in a closed container, makes it an excellent test bed to study nonlinear stability theory as well as turbulent transport in a heterogeneous system. However, the fact that RM instability involves fluids of unequal densities which experience negligible gravitational force, except during the impulsive acceleration, requires RM instability experiments to be carried out under conditions of microgravity. This experimental study investigates the instability of an interface between incompressible, miscible liquids with an initial sinusoidal perturbation. The impulsive acceleration is generated by bouncing a rectangular tank containing two different density liquids off a retractable vertical spring. The initial perturbation is produced prior to release by oscillating the tank in the horizontal direction to produce a standing wave. The instability evolves in microgravity as the tank travels up and then down the vertical rails of a drop tower until hitting a shock absorber at the bottom. Planar Laser Induced Fluorescence (PLIF) is employed to visualize the flow. 
PLIF images are captured by a video camera that travels with the tank. Figure 1 is a sequence of images showing the development of the instability from the initial sinusoidal disturbance far into the nonlinear regime, which is characterized by the appearance of mushroom structures resulting from the coalescence of baroclinic vorticity produced by the impulsive acceleration. At later times in this sequence, the vortex cores are observed to become unstable, showing the beginnings of the transition to turbulence in this flow. The amplitude of the growing disturbance after the impulsive acceleration is measured and found to agree well with theoretical predictions. The effects of Reynolds number (based on circulation) on the development of the vortices and the transition to turbulence are also determined.
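The standard theoretical prediction such amplitude measurements are compared against is Richtmyer's impulsive model, da/dt = k A Δu a0, with wavenumber k, Atwood number A = (ρ2 − ρ1)/(ρ2 + ρ1), velocity jump Δu, and pre-impulse amplitude a0. A one-line evaluation follows; the parameter values are illustrative, not the experiment's.

```python
import math

# Richtmyer's impulsive model for the initial RM growth rate
# (illustrative parameter values, not the experiment's):
#   da/dt = k * A * du * a0,  A = (rho2 - rho1) / (rho2 + rho1)

def atwood(rho1, rho2):
    return (rho2 - rho1) / (rho2 + rho1)

def richtmyer_growth_rate(wavelength, rho1, rho2, delta_u, a0):
    k = 2.0 * math.pi / wavelength          # perturbation wavenumber [1/m]
    return k * atwood(rho1, rho2) * delta_u * a0

rate = richtmyer_growth_rate(wavelength=0.1, rho1=1000.0, rho2=1100.0,
                             delta_u=1.0, a0=0.002)
print(round(rate, 5))  # → 0.00598
```

The linear model holds only while the amplitude stays small relative to the wavelength; the mushroom structures described above mark the nonlinear regime where it breaks down.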
Lorenc, Theo; Felix, Lambert; Petticrew, Mark; Melendez-Torres, G J; Thomas, James; Thomas, Sian; O'Mara-Eves, Alison; Richardson, Michelle
2016-11-16
Complex or heterogeneous data pose challenges for systematic review and meta-analysis. In recent years, a number of new methods have been developed to meet these challenges. This qualitative interview study aimed to understand researchers' understanding of complexity and heterogeneity and the factors which may influence the choices researchers make in synthesising complex data. We conducted interviews with a purposive sample of researchers (N = 19) working in systematic review or meta-analysis across a range of disciplines. We analysed data thematically using a framework approach. Participants reported using a broader range of methods and data types in complex reviews than in traditional reviews. A range of techniques are used to explore heterogeneity, but there is some debate about their validity, particularly when applied post hoc. Technical considerations of how to synthesise complex evidence cannot be isolated from questions of the goals and contexts of research. However, decisions about how to analyse data appear to be made in a largely informal way, drawing on tacit expertise, and their relation to these broader questions remains unclear.
Mahapatra, Mausumi; Burkholder, Luke; Garvey, Michael; ...
2016-08-04
Unmodified racemic sites on heterogeneous chiral catalysts reduce their overall enantioselectivity, but this effect is mitigated in the Orito reaction (methyl pyruvate (MP) hydrogenation to methyl lactate) by an increased hydrogenation reactivity. Here, this effect is explored on an R-1-(1-naphthyl)ethylamine (NEA)-modified Pd(111) model catalyst, where temperature-programmed desorption experiments reveal that NEA accelerates the rates of both MP hydrogenation and H/D exchange. NEA+MP docking complexes are imaged using scanning tunneling microscopy supplemented by density functional theory calculations to allow the most stable docking complexes to be identified. The results show that diastereomeric interactions between NEA and MP occur predominantly by binding of the C=C of the enol tautomer of MP to the surface, while simultaneously optimizing C=O···H2N hydrogen-bonding interactions. In conclusion, the combination of chiral-NEA-driven diastereomeric docking with a tautomeric preference enhances the hydrogenation activity, since C=C bonds hydrogenate more easily than C=O bonds, thus providing a rationale for the catalytic observations.
NASA Astrophysics Data System (ADS)
Glæsner, Nadia; Leue, Marin; Magid, Jacob; Gerke, Horst H.
2016-04-01
Understanding the heterogeneous nature of soil, i.e. the properties and processes occurring at local scales, is essential for best managing our soil resources for agricultural production. Examination of intact soil structures, in order to obtain an increased understanding of how soil systems operate from small to large scales, represents a large gap within soil science research. Dissolved chemicals, nutrients and particles are transported through the disturbed plow layer of agricultural soil, after which flow through the lower soil layers occurs preferentially via macropores. Rapid movement of water through macropores limits the contact between the preferentially moving water and the surrounding soil matrix; contact and exchange of solutes in the water are therefore largely restricted to the surface area of the macropores. Surfaces coated with organo-mineral complexes control the sorption and exchange properties of solutes, as well as the availability of essential nutrients to plant roots and to the preferentially flowing water. DRIFT (Diffuse Reflectance Infrared Fourier Transform) mapping has been developed to examine the composition of organic matter coating macropores. In this study, macropore surfaces from a long-term field experiment on waste application to agricultural soil (CRUCIAL, close to Copenhagen, Denmark) will be examined for organic matter composition using DRIFT. Parcels with 5 treatments (accelerated household waste, accelerated sewage sludge, accelerated cattle manure, NPK, and unfertilized) will be examined in order to study whether agricultural management has an impact on the organic matter composition of intact structures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruebel, Oliver
2009-11-20
Knowledge discovery from large and complex collections of today's scientific datasets is a challenging task. With the ability to measure and simulate more processes at increasingly finer spatial and temporal scales, the increasing number of data dimensions and data objects is presenting tremendous challenges for data analysis and effective data exploration methods and tools. Researchers are overwhelmed with data, and standard tools are often insufficient to enable effective data analysis and knowledge discovery. The main objective of this thesis is to provide important new capabilities to accelerate scientific knowledge discovery from large, complex, and multivariate scientific data. The research covered in this thesis addresses these scientific challenges using a combination of scientific visualization, information visualization, automated data analysis, and other enabling technologies, such as efficient data management. The effectiveness of the proposed analysis methods is demonstrated via applications in two distinct scientific research fields, namely developmental biology and high-energy physics. Advances in microscopy, image analysis, and embryo registration enable for the first time measurement of gene expression at cellular resolution for entire organisms. Analysis of high-dimensional spatial gene expression datasets is a challenging task. By integrating data clustering and visualization, analysis of complex, time-varying, spatial gene expression patterns and their formation becomes possible. The analysis framework and the visualization have been integrated with MATLAB, making advanced analysis tools accessible to biologists and enabling bioinformatics researchers to directly integrate their analyses with the visualization. Laser wakefield particle accelerators (LWFAs) promise to be a new compact source of high-energy particles and radiation, with wide applications ranging from medicine to physics.
To gain insight into the complex physical processes of particle acceleration, physicists model LWFAs computationally. The datasets produced by LWFA simulations are (i) extremely large, (ii) of varying spatial and temporal resolution, (iii) heterogeneous, and (iv) high-dimensional, making analysis and knowledge discovery from complex LWFA simulation data a challenging task. To address these challenges, this thesis describes the integration of the visualization system VisIt and the state-of-the-art index/query system FastBit, enabling interactive visual exploration of extremely large three-dimensional particle datasets. Researchers are especially interested in beams of high-energy particles formed during the course of a simulation. This thesis describes novel methods for automatic detection and analysis of particle beams, enabling a more accurate and efficient data analysis process. By integrating these automated analysis methods with visualization, this research enables more accurate, efficient, and effective analysis of LWFA simulation data than previously possible.
NASA Astrophysics Data System (ADS)
Yu, Leiming; Nina-Paravecino, Fanny; Kaeli, David; Fang, Qianqian
2018-01-01
We present a highly scalable Monte Carlo (MC) three-dimensional photon transport simulation platform designed for heterogeneous computing systems. Through the development of a massively parallel MC algorithm using the Open Computing Language framework, this research extends our existing graphics processing unit (GPU)-accelerated MC technique to a highly scalable vendor-independent heterogeneous computing environment, achieving significantly improved performance and software portability. A number of parallel computing techniques are investigated to achieve portable performance over a wide range of computing hardware. Furthermore, multiple thread-level and device-level load-balancing strategies are developed to obtain efficient simulations using multiple central processing units and GPUs.
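At the heart of such MC photon transport codes is a simple random walk: sample an exponential free path, move the photon, attenuate its weight, and scatter into a new direction. The paper's implementation is a massively parallel OpenCL code; the sketch below is only the serial idea, with illustrative optical parameters and isotropic scattering in an infinite homogeneous medium.

```python
import math
import random

# Serial sketch of the random-walk kernel behind MC photon transport
# (illustrative parameters; the actual code is a parallel OpenCL/GPU
# implementation with realistic phase functions and boundaries).

def simulate_photon(mu_t=10.0, albedo=0.9, max_scatter=1000, rng=None):
    rng = rng or random.Random()
    x = y = z = 0.0
    ux, uy, uz = 0.0, 0.0, 1.0   # launched along +z
    weight = 1.0
    for _ in range(max_scatter):
        step = -math.log(rng.random() or 1e-12) / mu_t  # exponential free path
        x, y, z = x + ux * step, y + uy * step, z + uz * step
        weight *= albedo                                # absorption at collision
        if weight < 1e-4:                               # terminate faint photons
            break
        # Isotropic scattering: new direction uniform on the unit sphere.
        cos_t = 2.0 * rng.random() - 1.0
        sin_t = math.sqrt(1.0 - cos_t * cos_t)
        phi = 2.0 * math.pi * rng.random()
        ux, uy, uz = sin_t * math.cos(phi), sin_t * math.sin(phi), cos_t
    return (x, y, z), weight

rng = random.Random(42)
pos, w = simulate_photon(rng=rng)
print(0.0 < w < 1.0)  # → True
```

Because photons are independent, millions of such walks can be distributed across CPU and GPU devices, which is exactly where the thread-level and device-level load balancing described above comes in.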
NASA Astrophysics Data System (ADS)
Wieder, William R.; Knowles, John F.; Blanken, Peter D.; Swenson, Sean C.; Suding, Katharine N.
2017-04-01
Abiotic factors structure plant community composition and ecosystem function across many different spatial scales. Often, such variation is considered at regional or global scales, but here we ask whether ecosystem-scale simulations can be used to better understand landscape-level variation that might be particularly important in complex terrain, such as high-elevation mountains. We performed ecosystem-scale simulations by using the Community Land Model (CLM) version 4.5 to better understand how the increased length of growing seasons may impact carbon, water, and energy fluxes in an alpine tundra landscape. The model was forced with meteorological data and validated with observations from the Niwot Ridge Long Term Ecological Research Program site. Our results demonstrate that CLM is capable of reproducing the observed carbon, water, and energy fluxes for discrete vegetation patches across this heterogeneous ecosystem. We subsequently accelerated snowmelt and increased spring and summer air temperatures in order to simulate potential effects of climate change in this region. We found that vegetation communities that were characterized by different snow accumulation dynamics showed divergent biogeochemical responses to a longer growing season. Contrary to expectations, wet meadow ecosystems showed the strongest decreases in plant productivity under extended summer scenarios because of disruptions in hydrologic connectivity. These findings illustrate how Earth system models such as CLM can be used to generate testable hypotheses about the shifting nature of energy, water, and nutrient limitations across space and through time in heterogeneous landscapes; these hypotheses may ultimately guide further experimental work and model development.
Classification of physical activities based on body-segments coordination.
Fradet, Laetitia; Marin, Frederic
2016-09-01
Numerous innovations based on connected objects and physical activity (PA) monitoring have been proposed. However, recognition of PAs requires a robust algorithm and methodology. The current study presents an innovative approach for PA recognition. It is based on the heuristic definition of postures and the use of body-segment coordination obtained through external sensors. The first part of this study presents the methodology required to define the set of accelerations that is most appropriate to represent the particular body-segment coordination involved in the chosen PAs (here walking, running, and cycling). For that purpose, subjects of different ages and heterogeneous physical conditions walked, ran, cycled, and performed daily activities at different paces. From the 3D motion capture, vertical and horizontal accelerations of 8 anatomical landmarks representative of the body were computed. Then, the 680 combinations of up to 3 accelerations were compared to identify the set of accelerations most appropriate to discriminate the PAs in terms of body-segment coordination. The discrimination was based on the maximal Hausdorff distance obtained between the different sets of accelerations. The vertical accelerations of both knees demonstrated the best PA discrimination. The second step was the proof of concept, implementing the proposed algorithm to classify the PAs of a new group of subjects. The originality of the proposed algorithm is the possibility to use the subject's specific measures as reference data. With the proposed algorithm, 94% of the trials were correctly classified. In conclusion, our study proposed a flexible and extendable methodology. At the current stage, the algorithm has been shown to be valid for heterogeneous subjects, which suggests that it could be deployed in clinical or health-related applications regardless of the subjects' physical abilities or characteristics. Copyright © 2016 Elsevier Ltd. All rights reserved.
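The nearest-reference classification idea can be sketched as follows (synthetic curves and activity names are illustrative, not the study's data; the paper's selection of the best acceleration set is assumed to have already been done): each trial is assigned to the activity whose subject-specific reference trajectory is closest in the Hausdorff sense.

```python
import numpy as np

def hausdorff(a, b):
    """Symmetric Hausdorff distance between point sets a (n,d) and b (m,d)."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return max(d.min(axis=1).max(), d.min(axis=0).max())

def classify(trial, references):
    """Assign the trial to the activity whose reference curve is nearest."""
    return min(references, key=lambda name: hausdorff(trial, references[name]))

# Illustrative (time, vertical acceleration) curves per activity.
t = np.linspace(0.0, 1.0, 100)
refs = {
    "walk": np.column_stack([t, 1.0 * np.sin(2 * np.pi * t)]),
    "run":  np.column_stack([t, 3.0 * np.sin(2 * np.pi * t)]),
}
trial = np.column_stack([t, 2.8 * np.sin(2 * np.pi * t)])
```

Using the subject's own recordings as `references` mirrors the abstract's point that subject-specific measures serve as reference data.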
Catalytic photodegradation of pharmaceuticals - homogeneous and heterogeneous photocatalysis.
Klementova, S; Kahoun, D; Doubkova, L; Frejlachova, K; Dusakova, M; Zlamal, M
2017-01-18
Photocatalytic degradation of pharmaceuticals (hydrocortisone, estradiol, and verapamil) and personal care product additives (parabens: methyl, ethyl, and propyl derivatives) was investigated in the homogeneous phase (with ferric ions as the catalyst) and on TiO2. Ferric ions in concentrations corresponding to concentrations in natural water bodies were shown to be a significant accelerator of the degradation in homogeneous reaction mixtures. In heterogeneous photocatalytic reactions on TiO2, lower reaction rates, but mineralisation to higher extents, were observed.
RSTensorFlow: GPU Enabled TensorFlow for Deep Learning on Commodity Android Devices
Alzantot, Moustafa; Wang, Yingnan; Ren, Zhengshuang; Srivastava, Mani B.
2018-01-01
Mobile devices have become an essential part of our daily lives. By virtue of both their increasing computing power and the recent progress made in AI, mobile devices have evolved to act as intelligent assistants in many tasks rather than mere tools for making phone calls. However, popular and commonly used tools and frameworks for machine intelligence still lack the ability to make proper use of the available heterogeneous computing resources on mobile devices. In this paper, we study the benefits of utilizing the heterogeneous (CPU and GPU) computing resources available on commodity Android devices while running deep learning models. We leveraged the heterogeneous computing framework RenderScript to accelerate the execution of deep learning models on commodity Android devices. Our system is implemented as an extension to the popular open-source framework TensorFlow. By integrating our acceleration framework tightly into TensorFlow, machine learning engineers can now easily take advantage of the heterogeneous computing resources on mobile devices without the need for any extra tools. We evaluate our system on different Android phone models to study the trade-offs of running different neural network operations on the GPU. We also compare the performance of running different model architectures, such as convolutional and recurrent neural networks, on the CPU only versus using heterogeneous computing resources. Our results show that GPUs on the phones are capable of offering substantial performance gains in matrix multiplication on mobile devices. Therefore, models that involve multiplication of large matrices can run much faster (approx. 3 times faster in our experiments) due to GPU support. PMID:29629431
Armstrong, Jonathan B.; Schindler, Daniel E.; Ruff, Casey P.; Brooks, Gabriel T.; Bentley, Kale E.; Torgersen, Christian E.
2013-01-01
Vertical heterogeneity in the physical characteristics of lakes and oceans is ecologically salient and exploited by a wide range of taxa through diel vertical migration to enhance their growth and survival. Whether analogous behaviors exploit horizontal habitat heterogeneity in streams is largely unknown. We investigated fish movement behavior at daily timescales to explore how individuals integrated across spatial variation in food abundance and water temperature. Juvenile coho salmon made feeding forays into cold habitats with abundant food, and then moved long distances (350–1300 m) to warmer habitats that accelerated their metabolism and increased their assimilative capacity. This behavioral thermoregulation enabled fish to mitigate trade-offs between trophic and thermal resources by exploiting thermal heterogeneity. Fish that exploited thermal heterogeneity grew at substantially faster rates than did individuals that assumed other behaviors. Our results provide empirical support for the importance of thermal diversity in lotic systems, and emphasize the importance of considering interactions between animal behavior and habitat heterogeneity when managing and restoring ecosystems.
NASA Astrophysics Data System (ADS)
Bolshakov, A. E.; Golubev, A. A.; Zenkevich, P. R.; Kats, M. M.; Kolomiets, A. A.
2014-09-01
We report the results of a study into the feasibility of conducting the ELISE and EXL experiments on collisions of nuclei of radioactive fragments with electrons at the Institute for Theoretical and Experimental Physics (ITEP). A scheme for uranium ion acceleration in the ITEP accelerator complex is chosen, and it is shown that uranium ions may be accelerated with an intensity of ~1 × 10^11 ions/s as soon as the complex is modified and a new injector is constructed. The basic parameters of the modified complex are given, and a layout diagram indicating the positions of the target that serves to produce radioactive fragments, the separator, and the storage rings (CR, RESR, NESR, and ER) at the ITEP site is presented.
Karalliedde, Janaka; Gnudi, Luigi
2016-02-01
Diabetes mellitus (DM) is increasingly recognized as a heterogeneous condition. The individualization of care and treatment necessitates an understanding of the individual patient's pathophysiology of DM that underpins their DM classification and clinical presentation. Classical type-2 diabetes mellitus is due to a combination of insulin resistance and an insulin secretory defect. Type-1 diabetes is characterized by a near-absolute deficiency of insulin secretion. More recently, advances in genetics and a better appreciation of the atypical features of DM have resulted in more categories of diabetes. In the context of kidney disease, patients with DM and microalbuminuria are more insulin resistant, and insulin resistance may be a pathway that results in accelerated progression of diabetic kidney disease. This review summarizes the updated classification of DM, including rarer categories and their associated renal manifestations that need to be considered in patients who present with atypical features. The benefits and limitations of the tests utilized to make a diagnosis of DM are discussed. We also review the putative pathways and mechanisms by which insulin resistance drives the progression of diabetic kidney disease. © The Author 2014. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
Multiscale high-order/low-order (HOLO) algorithms and applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chacon, Luis; Chen, Guangye; Knoll, Dana Alan
Here, we review the state of the art in the formulation, implementation, and performance of so-called high-order/low-order (HOLO) algorithms for challenging multiscale problems. HOLO algorithms attempt to couple one or several high-complexity physical models (the high-order model, HO) with low-complexity ones (the low-order model, LO). The primary goal of HOLO algorithms is to achieve nonlinear convergence between HO and LO components while minimizing memory footprint and managing the computational complexity in a practical manner. Key to the HOLO approach is the use of the LO representations to address temporal stiffness, effectively accelerating the convergence of the HO/LO coupled system. The HOLO approach is broadly underpinned by the concept of nonlinear elimination, which enables segregation of the HO and LO components in ways that can effectively use heterogeneous architectures. The accuracy and efficiency benefits of HOLO algorithms are demonstrated with specific applications to radiation transport, gas dynamics, plasmas (both Eulerian and Lagrangian formulations), and ocean modeling. Across this broad application spectrum, HOLO algorithms achieve significant accuracy improvements at a fraction of the cost compared to conventional approaches. It follows that HOLO algorithms hold significant potential for high-fidelity system scale multiscale simulations leveraging exascale computing.
Optimizing structure of complex technical system by heterogeneous vector criterion in interval form
NASA Astrophysics Data System (ADS)
Lysenko, A. V.; Kochegarov, I. I.; Yurkov, N. K.; Grishko, A. K.
2018-05-01
The article examines the methods of development and multi-criteria choice of the preferred structural variant of the complex technical system at the early stages of its life cycle in the absence of sufficient knowledge of parameters and variables for optimizing this structure. The suggested method takes into consideration the various fuzzy input data connected with the heterogeneous quality criteria of the designed system and the parameters set by their variation range. The suggested approach is based on the complex use of methods of interval analysis, fuzzy sets theory, and decision-making theory. As a result, a method for normalizing heterogeneous quality criteria has been developed on the basis of establishing preference relations in the interval form. The method of building preference relations in the interval form on the basis of the vector of heterogeneous quality criteria suggests the use of membership functions instead of coefficients weighting the criteria values. The former show the degree of proximity of the realization of the designed system to the efficient or Pareto-optimal variants. The study analyzes the example of choosing the optimal variant for the complex system using heterogeneous quality criteria.
Quan, Guotao; Gong, Hui; Deng, Yong; Fu, Jianwei; Luo, Qingming
2011-02-01
High-speed fluorescence molecular tomography (FMT) reconstruction for 3-D heterogeneous media is still one of the most challenging problems in diffusive optical fluorescence imaging. In this paper, we propose a fast FMT reconstruction method that is based on Monte Carlo (MC) simulation and accelerated by a cluster of graphics processing units (GPUs). Based on the Message Passing Interface standard, we modified the MC code for fast FMT reconstruction, and different Green's functions representing the flux distribution in media are calculated simultaneously by different GPUs in the cluster. A load-balancing method was also developed to increase the computational efficiency. By applying the Fréchet derivative, a Jacobian matrix is formed to reconstruct the distribution of the fluorochromes using the calculated Green's functions. Phantom experiments have shown that only 10 min are required to get reconstruction results with a cluster of 6 GPUs, rather than 6 h with a cluster of multiple dual-Opteron CPU nodes. Because of the advantages of high accuracy and suitability for 3-D heterogeneous media with refractive-index-mismatched boundaries from the MC simulation, the GPU cluster-accelerated method provides a reliable approach to high-speed reconstruction for FMT imaging.
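The linearized inversion step, solving a Jacobian system for the fluorochrome distribution, can be sketched with a Tikhonov-regularized least-squares solve (a generic sketch: the paper builds the Jacobian from MC-computed Green's functions, which are replaced here by a random stand-in matrix).

```python
import numpy as np

def reconstruct(J, y, alpha=1e-6):
    """Tikhonov-regularized solve of J x ≈ y via the normal equations."""
    n = J.shape[1]
    return np.linalg.solve(J.T @ J + alpha * np.eye(n), J.T @ y)

rng = np.random.default_rng(42)
J = rng.standard_normal((200, 50))   # stand-in for the Green's-function Jacobian
x_true = rng.standard_normal(50)     # unknown fluorochrome distribution
y = J @ x_true                       # simulated boundary measurements
x_hat = reconstruct(J, y)
```

In the GPU-cluster setting described above, the expensive part is computing the rows of J (one forward MC solve per source-detector pair); the solve itself is comparatively cheap.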
Statistically Validated Networks in Bipartite Complex Systems
Tumminello, Michele; Miccichè, Salvatore; Lillo, Fabrizio; Piilo, Jyrki; Mantegna, Rosario N.
2011-01-01
Many complex systems present an intrinsic bipartite structure where elements of one set link to elements of the second set. In these complex systems, such as the system of actors and movies, elements of one set are qualitatively different than elements of the other set. The properties of these complex systems are typically investigated by constructing and analyzing a projected network on one of the two sets (for example the actor network or the movie network). Complex systems are often very heterogeneous in the number of relationships that the elements of one set establish with the elements of the other set, and this heterogeneity makes it very difficult to discriminate links of the projected network that are just reflecting system's heterogeneity from links relevant to unveil the properties of the system. Here we introduce an unsupervised method to statistically validate each link of a projected network against a null hypothesis that takes into account system heterogeneity. We apply the method to a biological, an economic and a social complex system. The method we propose is able to detect network structures which are very informative about the organization and specialization of the investigated systems, and identifies those relationships between elements of the projected network that cannot be explained simply by system heterogeneity. We also show that our method applies to bipartite systems in which different relationships might have different qualitative nature, generating statistically validated networks in which such difference is preserved. PMID:21483858
Toward benchmarking in catalysis science: Best practices, challenges, and opportunities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bligaard, Thomas; Bullock, R. Morris; Campbell, Charles T.
Benchmarking is a community-based and (preferably) community-driven activity involving consensus-based decisions on how to make reproducible, fair, and relevant assessments. In catalysis science, important catalyst performance metrics include activity, selectivity, and the deactivation profile, which enable comparisons between new and standard catalysts. Benchmarking also requires careful documentation, archiving, and sharing of methods and measurements, to ensure that the full value of research data can be realized. Beyond these goals, benchmarking presents unique opportunities to advance and accelerate understanding of complex reaction systems by combining and comparing experimental information from multiple, in situ and operando techniques with theoretical insights derived from calculations characterizing model systems. This Perspective describes the origins and uses of benchmarking and its applications in computational catalysis, heterogeneous catalysis, molecular catalysis, and electrocatalysis. As a result, it also discusses opportunities and challenges for future developments in these fields.
A roadmap towards personalized immunology.
Delhalle, Sylvie; Bode, Sebastian F N; Balling, Rudi; Ollert, Markus; He, Feng Q
2018-01-01
Big data generation and computational processing will enable medicine to evolve from a "one-size-fits-all" approach to precise patient stratification and treatment. Significant achievements using "Omics" data have been made especially in personalized oncology. However, immune cells, relative to tumor cells, show a much higher degree of complexity in heterogeneity, dynamics, memory capability, plasticity and "social" interactions. There is still a long way to go in translating our capability to identify potentially targetable personalized biomarkers into effective personalized therapy in immune-centralized diseases. Here, we discuss the recent advances and successful applications in "Omics" data utilization and network analysis on patients' samples of clinical trials and studies, as well as the major challenges and strategies towards personalized stratification and treatment for infectious or non-communicable inflammatory diseases such as autoimmune diseases or allergies. We provide a roadmap and highlight experimental, clinical, computational analysis, data management, ethical and regulatory issues to accelerate the implementation of personalized immunology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wojick, D E; Warnick, W L; Carroll, B C
With the United States federal government spending billions annually for research and development, ways to increase the productivity of that research can have a significant return on investment. The process by which science knowledge is spread is called diffusion. It is therefore important to better understand and measure the benefits of this diffusion of knowledge. In particular, it is important to understand whether advances in Internet searching can speed up the diffusion of scientific knowledge and accelerate scientific progress despite the fact that the vast majority of scientific information resources continue to be held in deep web databases that many search engines cannot fully access. To address the complexity of the search issue, the term global discovery is used for the act of searching across heterogeneous environments and distant communities. This article discusses these issues and describes research being conducted by the Office of Scientific and Technical Information (OSTI).
Toward Personalized Targeted Therapeutics: An Overview.
Weathers, Shiao-Pei S; Gilbert, Mark R
2017-04-01
In neuro-oncology, there has been a movement towards personalized medicine, or tailoring treatment to the individual patient. Ideally, tumor and patient evaluations would lead to the selection of the best treatment (based on tumor characterization) and the right dosing schedule (based on patient characterization). The recent advances in the molecular analysis of glioblastoma have created optimism that personalized targeted therapy is within reach. Although our understanding of the molecular complexity of glioblastoma has increased over the years, the path to developing effective targeted therapeutic strategies is fraught with many challenges, as described in this review. These challenges include disease heterogeneity, clinical and genomic patient variability, limited number of effective treatments, clinical trial inefficiency, drug delivery, and clinical trial support and accrual. To confront these challenges, it will be imperative to devise innovative and adaptive clinical trials in order to accelerate our efforts in improving the outcomes for our patients who have been in desperate need.
Accelerating Climate Simulations Through Hybrid Computing
NASA Technical Reports Server (NTRS)
Zhou, Shujia; Sinno, Scott; Cruz, Carlos; Purcell, Mark
2009-01-01
Unconventional multi-core processors (e.g., IBM Cell B/E and NVIDIA GPU) have emerged as accelerators in climate simulation. However, climate models typically run on parallel computers with conventional processors (e.g., Intel and AMD) using MPI. Connecting accelerators to this architecture efficiently and easily becomes a critical issue. When using MPI for connection, we identified two challenges: (1) identical MPI implementation is required in both systems, and (2) existing MPI code must be modified to accommodate the accelerators. In response, we have extended and deployed IBM Dynamic Application Virtualization (DAV) in a hybrid computing prototype system (one blade with two Intel quad-core processors, two IBM QS22 Cell blades, connected with Infiniband), allowing for seamlessly offloading compute-intensive functions to remote, heterogeneous accelerators in a scalable, load-balanced manner. Currently, a climate solar radiation model running with multiple MPI processes has been offloaded to multiple Cell blades with approx. 10% network overhead.
Anomalous Transport of High Energy Cosmic Rays in Galactic Superbubbles
NASA Technical Reports Server (NTRS)
Barghouty, Nasser F.
2014-01-01
High-energy cosmic rays may exhibit anomalous transport as they traverse and are accelerated by a collection of supernovae explosions in a galactic superbubble. Signatures of this anomalous transport can show up in the particles' evolution and their spectra. In a continuous-time random-walk (CTRW) model assuming standard diffusive shock acceleration theory (DSA) for each shock encounter, and where the superbubble (an OB star association) is idealized as a heterogeneous region of particle sources and sinks, acceleration and transport in the superbubble can be shown to be sub-diffusive. While the sub-diffusive transport can be attributed to the stochastic nature of the acceleration time according to DSA theory, the spectral break appears to be an artifact of transport in a finite medium. These CTRW simulations point to a new and intriguing phenomenon associated with the statistical nature of collective acceleration of high energy cosmic rays in galactic superbubbles.
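The sub-diffusive signature of a CTRW can be reproduced with a toy simulation (a generic illustration, not the authors' model): heavy-tailed waiting times with P(tau) ~ tau^-(1+alpha), 0 < alpha < 1, make the mean-squared displacement grow as t^alpha instead of t.

```python
import numpy as np

rng = np.random.default_rng(0)

def ctrw_final_positions(n_walkers, t_max, alpha=0.6):
    """Unit +/-1 jumps separated by Pareto waiting times tau = u**(-1/alpha)
    (tail exponent 1+alpha); returns walker positions at time t_max."""
    pos = np.zeros(n_walkers)
    clock = np.zeros(n_walkers)
    active = np.arange(n_walkers)
    while active.size:
        wait = rng.random(active.size) ** (-1.0 / alpha)
        clock[active] += wait
        done = clock[active] > t_max        # next jump would land after t_max
        jump = rng.choice([-1.0, 1.0], active.size)
        pos[active[~done]] += jump[~done]
        active = active[~done]
    return pos

msd_short = np.mean(ctrw_final_positions(2000, 1.0e2) ** 2)
msd_long = np.mean(ctrw_final_positions(2000, 1.0e4) ** 2)
alpha_est = np.log(msd_long / msd_short) / np.log(1.0e2)  # fitted exponent
```

With alpha = 0.6 the estimated exponent comes out well below 1, i.e. sub-diffusive, consistent with the qualitative behavior the abstract describes.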
Phenotypic and genotypic heterogeneity of Lynch syndrome: a complex diagnostic challenge.
Lynch, Henry T; Lanspa, Stephen; Shaw, Trudy; Casey, Murray Joseph; Rendell, Marc; Stacey, Mark; Townley, Theresa; Snyder, Carrie; Hitchins, Megan; Bailey-Wilson, Joan
2018-07-01
Lynch syndrome is the hereditary disorder that most frequently predisposes to colorectal cancer as well as predisposing to a number of extracolonic cancers, most prominently endometrial cancer. It is caused by germline mutations in the mismatch repair genes. Both its phenotype and genotype show marked heterogeneity. This review gives a historical overview of the syndrome, its heterogeneity, its genomic landscape, and its implications for complex diagnosis, genetic counseling and putative implications for immunotherapy.
Predicting sample lifetimes in creep fracture of heterogeneous materials
NASA Astrophysics Data System (ADS)
Koivisto, Juha; Ovaska, Markus; Miksic, Amandine; Laurson, Lasse; Alava, Mikko J.
2016-08-01
Materials flow—under creep or constant loads—and, finally, fail. The prediction of sample lifetimes is an important and highly challenging problem because of the inherently heterogeneous nature of most materials that results in large sample-to-sample lifetime fluctuations, even under the same conditions. We study creep deformation of paper sheets as one heterogeneous material and thus show how to predict lifetimes of individual samples by exploiting the "universal" features in the sample-inherent creep curves, particularly the passage to an accelerating creep rate. Using simulations of a viscoelastic fiber bundle model, we illustrate how deformation localization controls the shape of the creep curve and thus the degree of lifetime predictability.
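One common way to turn the "passage to an accelerating creep rate" into a lifetime prediction is the empirical observation that the minimum of the creep rate occurs at a roughly fixed fraction of the failure time (a sketch of that idea; the ratio 0.5 below is illustrative, not the paper's fitted value).

```python
import numpy as np

def predict_lifetime(t, strain, ratio=0.5):
    """Estimate failure time from the time of minimum creep rate,
    assuming t_min ≈ ratio * t_fail (an empirical rule of thumb)."""
    rate = np.gradient(strain, t)
    t_min = t[np.argmin(rate)]
    return t_min / ratio

# Synthetic creep curve whose rate (t-5)^2 + 1 is minimal at t = 5,
# standing in for a sample that fails at t = 10.
t = np.linspace(0.0, 10.0, 1001)
strain = (t - 5.0) ** 3 / 3.0 + t
```

The prediction can thus be issued while the sample is still alive, as soon as the creep rate has clearly passed its minimum and begun to accelerate.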
Ion acceleration in shell cylinders irradiated by a short intense laser pulse
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andreev, A.; Platonov, K.
The interaction of a short high intensity laser pulse with homogeneous and heterogeneous shell cylinders has been analyzed using particle-in-cell simulations and analytical modeling. We show that the shell cylinder is capable of accelerating and focusing ions in a narrow region. In the case of the shell cylinder, the ion energy exceeds that of a flat target of the same thickness. The constructed model enables the evaluation of the ion energy and the number of ions in the focusing region.
Karan, Chandan Kumar; Bhattacharjee, Manish
2018-04-16
Two new bimetallic iron-alkali metal complexes of an amino acid (serine)-based reduced Schiff base ligand were synthesized and structurally characterized. Their efficacy as catalysts for the chemical fixation of carbon dioxide was explored. The heterogeneous version of the catalytic reaction was developed by the immobilization of these homogeneous bimetallic iron-alkali metal complexes in an anion-exchange resin. The resin-bound complexes can be used as recyclable catalysts for up to six cycles.
Contagion on complex networks with persuasion
NASA Astrophysics Data System (ADS)
Huang, Wei-Min; Zhang, Li-Jie; Xu, Xin-Jian; Fu, Xinchu
2016-03-01
The threshold model has been widely adopted as a classic model for studying contagion processes on social networks. We consider asymmetric individual interactions in social networks and introduce a persuasion mechanism into the threshold model. Specifically, we study a combination of adoption and persuasion in cascading processes on complex networks. It is found that with the introduction of the persuasion mechanism, the system may become more vulnerable to global cascades, and the effects of persuasion tend to be more significant in heterogeneous networks than those in homogeneous networks: a comparison between heterogeneous and homogeneous networks shows that under weak persuasion, heterogeneous networks tend to be more robust against random shocks than homogeneous networks; whereas under strong persuasion, homogeneous networks are more stable. Finally, we study the effects of adoption and persuasion threshold heterogeneity on systemic stability. Though both heterogeneities give rise to global cascades, the adoption heterogeneity has an overwhelmingly stronger impact than the persuasion heterogeneity when the network connectivity is sufficiently dense.
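A minimal sketch of a threshold cascade with a persuasion effect (a simplified stand-in for the model above, not the authors' exact update rule): each adopted neighbour additionally lowers a node's effective adoption threshold by a fixed persuasion strength.

```python
def cascade(adj, thresholds, seeds, persuasion=0.0):
    """Watts-style threshold cascade: a node adopts once the fraction of its
    adopted neighbours reaches its threshold, which is lowered by
    `persuasion` per adopted neighbour (simplified persuasion mechanism)."""
    adopted = set(seeds)
    changed = True
    while changed:
        changed = False
        for v, nbrs in adj.items():
            if v in adopted or not nbrs:
                continue
            k = sum(n in adopted for n in nbrs)
            effective = max(thresholds[v] - persuasion * k, 0.0)
            if k > 0 and k / len(nbrs) >= effective:
                adopted.add(v)
                changed = True
    return adopted

# A path 0-1-2-3 with uniform threshold 0.6, seeded at node 0.
line = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
theta = {v: 0.6 for v in line}
no_persuasion = cascade(line, theta, {0}, persuasion=0.0)
with_persuasion = cascade(line, theta, {0}, persuasion=0.2)
```

Without persuasion the cascade dies immediately (each interior node sees only half of its neighbours adopted, below 0.6), while even weak persuasion lets the adoption sweep the whole path, illustrating the increased vulnerability to global cascades noted in the abstract.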
Contagion on complex networks with persuasion
Huang, Wei-Min; Zhang, Li-Jie; Xu, Xin-Jian; Fu, Xinchu
2016-01-01
The threshold model has been widely adopted as a classic model for studying contagion processes on social networks. We consider asymmetric individual interactions in social networks and introduce a persuasion mechanism into the threshold model. Specifically, we study a combination of adoption and persuasion in cascading processes on complex networks. It is found that with the introduction of the persuasion mechanism, the system may become more vulnerable to global cascades, and the effects of persuasion tend to be more significant in heterogeneous networks than those in homogeneous networks: a comparison between heterogeneous and homogeneous networks shows that under weak persuasion, heterogeneous networks tend to be more robust against random shocks than homogeneous networks; whereas under strong persuasion, homogeneous networks are more stable. Finally, we study the effects of adoption and persuasion threshold heterogeneity on systemic stability. Though both heterogeneities give rise to global cascades, the adoption heterogeneity has an overwhelmingly stronger impact than the persuasion heterogeneity when the network connectivity is sufficiently dense. PMID:27029498
Bogaerts, Thomas; Van Yperen-De Deyne, Andy; Liu, Ying-Ya; Lynen, Frederic; Van Speybroeck, Veronique; Van Der Voort, Pascal
2013-09-21
An enantioselective catalyst, consisting of a chiral Mn(III)salen complex entrapped in the MIL-101 metal organic framework, is reported. For the first time, we assemble a robust MOF-cage around a chiral complex. The heterogeneous catalyst shows the same selectivity as the homogeneous complex and is fully recyclable. Theoretical calculations provide insight into this retention of selectivity.
NASA Astrophysics Data System (ADS)
Manfredi, Sabato
2018-05-01
The pinning/leader control problem consists of designing a leader or pinning controller that guides a complex network to a desired trajectory or target (synchronisation or consensus). For a time-invariant complex network, this includes choosing the pinning controller gain and the number of nodes to pin. Usually, the lower the number of pinned nodes, the larger the pinning gain required to achieve network synchronisation. On the other hand, realistic complex-network scenarios are characterised by switching topologies and time-varying node coupling strengths and link weights, which make the pinning/leader control problem hard to solve. Additionally, the node dynamics can be heterogeneous. In this paper, we derive robust stabilisation conditions for time-varying heterogeneous complex networks with jointly connected topologies whose coupling strengths and link weights are affected by time-varying uncertainties. By employing Lyapunov stability theory and the linear matrix inequality (LMI) technique, we formulate computationally inexpensive stabilisability conditions to design a pinning/leader control gain for robust network synchronisation. The effectiveness of the proposed approach is shown by several design examples applied to a paradigmatic, well-known complex network composed of heterogeneous Chua's circuits.
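As a minimal illustration of pinning control (scalar integrator nodes, fixed topology, no uncertainties, so far simpler than the switching heterogeneous Chua networks treated in the paper), one can simulate diffusive coupling with a single pinned node; all names and parameters are hypothetical:

```python
def pinning_sim(adj, pinned, gain, ref, c=1.0, dt=0.01, steps=8000):
    """Euler integration of x_i' = c * sum_{j~i} (x_j - x_i)
    - gain * (x_i - ref) for pinned nodes: diffusively coupled scalar
    integrators with pinned nodes steering the network to ref."""
    x = {v: float(v) for v in adj}          # heterogeneous initial states
    for _ in range(steps):
        dx = {v: c * sum(x[u] - x[v] for u in adj[v])
                 - (gain * (x[v] - ref) if v in pinned else 0.0)
              for v in adj}
        for v in adj:
            x[v] += dt * dx[v]
    return x
```

Pinning only node 0 of a four-node path steers every state to the target; pinning fewer nodes requires a larger gain for the same convergence rate, the trade-off the abstract mentions.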
NASA Astrophysics Data System (ADS)
Wlodarczyk, Jakub; Kierdaszuk, Borys
2005-08-01
Decays of tyrosine fluorescence in protein-ligand complexes are described by a model with a continuous distribution of fluorescence lifetimes. The resulting analytical power-like decay function provides good fits to highly complex fluorescence kinetics. Moreover, it is a manifestation of the so-called Tsallis q-exponential function, which is suitable for describing systems with long-range interactions, memory effects, and fluctuations of the characteristic fluorescence lifetime. The proposed decay functions were applied to the analysis of tyrosine fluorescence decays in a protein, the enzyme purine nucleoside phosphorylase from E. coli (the product of the deoD gene), free in aqueous solution and in a complex with formycin A (an inhibitor) and orthophosphate (a co-substrate). The power-like function provides new information about enzyme-ligand complex formation based on a physically justified heterogeneity parameter directly related to the lifetime distribution. A measure of the heterogeneity parameter in these enzyme systems is provided by the variance of the fluorescence lifetime distribution. The possible number of deactivation channels and the mean excited-state lifetime can be derived easily without a priori knowledge of the complexity of the studied system. Moreover, the proposed model is simpler than the traditional multi-exponential one and better describes the heterogeneous nature of the studied systems.
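The power-like decay in question is the Tsallis q-exponential; a small evaluator, with the q -> 1 exponential limit made explicit (parameter names are illustrative):

```python
import math

def q_exp_decay(t, tau, q, i0=1.0):
    """Power-like fluorescence decay
    I(t) = I0 * [1 + (q-1) t/tau]^(-1/(q-1)),
    the Tsallis q-exponential; q -> 1 recovers I0 * exp(-t/tau)."""
    if abs(q - 1.0) < 1e-9:
        return i0 * math.exp(-t / tau)
    return i0 * (1.0 + (q - 1.0) * t / tau) ** (-1.0 / (q - 1.0))
```

Roughly, q plays the role of the heterogeneity parameter: the further q lies above 1, the broader the implied lifetime distribution and the heavier the decay's tail.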
The Spallation Neutron Source accelerator system design
NASA Astrophysics Data System (ADS)
Henderson, S.; Abraham, W.; Aleksandrov, A.; Allen, C.; Alonso, J.; Anderson, D.; Arenius, D.; Arthur, T.; Assadi, S.; Ayers, J.; Bach, P.; Badea, V.; Battle, R.; Beebe-Wang, J.; Bergmann, B.; Bernardin, J.; Bhatia, T.; Billen, J.; Birke, T.; Bjorklund, E.; Blaskiewicz, M.; Blind, B.; Blokland, W.; Bookwalter, V.; Borovina, D.; Bowling, S.; Bradley, J.; Brantley, C.; Brennan, J.; Brodowski, J.; Brown, S.; Brown, R.; Bruce, D.; Bultman, N.; Cameron, P.; Campisi, I.; Casagrande, F.; Catalan-Lasheras, N.; Champion, M.; Champion, M.; Chen, Z.; Cheng, D.; Cho, Y.; Christensen, K.; Chu, C.; Cleaves, J.; Connolly, R.; Cote, T.; Cousineau, S.; Crandall, K.; Creel, J.; Crofford, M.; Cull, P.; Cutler, R.; Dabney, R.; Dalesio, L.; Daly, E.; Damm, R.; Danilov, V.; Davino, D.; Davis, K.; Dawson, C.; Day, L.; Deibele, C.; Delayen, J.; DeLong, J.; Demello, A.; DeVan, W.; Digennaro, R.; Dixon, K.; Dodson, G.; Doleans, M.; Doolittle, L.; Doss, J.; Drury, M.; Elliot, T.; Ellis, S.; Error, J.; Fazekas, J.; Fedotov, A.; Feng, P.; Fischer, J.; Fox, W.; Fuja, R.; Funk, W.; Galambos, J.; Ganni, V.; Garnett, R.; Geng, X.; Gentzlinger, R.; Giannella, M.; Gibson, P.; Gillis, R.; Gioia, J.; Gordon, J.; Gough, R.; Greer, J.; Gregory, W.; Gribble, R.; Grice, W.; Gurd, D.; Gurd, P.; Guthrie, A.; Hahn, H.; Hardek, T.; Hardekopf, R.; Harrison, J.; Hatfield, D.; He, P.; Hechler, M.; Heistermann, F.; Helus, S.; Hiatt, T.; Hicks, S.; Hill, J.; Hill, J.; Hoff, L.; Hoff, M.; Hogan, J.; Holding, M.; Holik, P.; Holmes, J.; Holtkamp, N.; Hovater, C.; Howell, M.; Hseuh, H.; Huhn, A.; Hunter, T.; Ilg, T.; Jackson, J.; Jain, A.; Jason, A.; Jeon, D.; Johnson, G.; Jones, A.; Joseph, S.; Justice, A.; Kang, Y.; Kasemir, K.; Keller, R.; Kersevan, R.; Kerstiens, D.; Kesselman, M.; Kim, S.; Kneisel, P.; Kravchuk, L.; Kuneli, T.; Kurennoy, S.; Kustom, R.; Kwon, S.; Ladd, P.; Lambiase, R.; Lee, Y. Y.; Leitner, M.; Leung, K.-N.; Lewis, S.; Liaw, C.; Lionberger, C.; Lo, C. 
C.; Long, C.; Ludewig, H.; Ludvig, J.; Luft, P.; Lynch, M.; Ma, H.; MacGill, R.; Macha, K.; Madre, B.; Mahler, G.; Mahoney, K.; Maines, J.; Mammosser, J.; Mann, T.; Marneris, I.; Marroquin, P.; Martineau, R.; Matsumoto, K.; McCarthy, M.; McChesney, C.; McGahern, W.; McGehee, P.; Meng, W.; Merz, B.; Meyer, R.; Meyer, R.; Miller, B.; Mitchell, R.; Mize, J.; Monroy, M.; Munro, J.; Murdoch, G.; Musson, J.; Nath, S.; Nelson, R.; Nelson, R.; O`Hara, J.; Olsen, D.; Oren, W.; Oshatz, D.; Owens, T.; Pai, C.; Papaphilippou, I.; Patterson, N.; Patterson, J.; Pearson, C.; Pelaia, T.; Pieck, M.; Piller, C.; Plawski, T.; Plum, M.; Pogge, J.; Power, J.; Powers, T.; Preble, J.; Prokop, M.; Pruyn, J.; Purcell, D.; Rank, J.; Raparia, D.; Ratti, A.; Reass, W.; Reece, K.; Rees, D.; Regan, A.; Regis, M.; Reijonen, J.; Rej, D.; Richards, D.; Richied, D.; Rode, C.; Rodriguez, W.; Rodriguez, M.; Rohlev, A.; Rose, C.; Roseberry, T.; Rowton, L.; Roybal, W.; Rust, K.; Salazer, G.; Sandberg, J.; Saunders, J.; Schenkel, T.; Schneider, W.; Schrage, D.; Schubert, J.; Severino, F.; Shafer, R.; Shea, T.; Shishlo, A.; Shoaee, H.; Sibley, C.; Sims, J.; Smee, S.; Smith, J.; Smith, K.; Spitz, R.; Staples, J.; Stein, P.; Stettler, M.; Stirbet, M.; Stockli, M.; Stone, W.; Stout, D.; Stovall, J.; Strelo, W.; Strong, H.; Sundelin, R.; Syversrud, D.; Szajbler, M.; Takeda, H.; Tallerico, P.; Tang, J.; Tanke, E.; Tepikian, S.; Thomae, R.; Thompson, D.; Thomson, D.; Thuot, M.; Treml, C.; Tsoupas, N.; Tuozzolo, J.; Tuzel, W.; Vassioutchenko, A.; Virostek, S.; Wallig, J.; Wanderer, P.; Wang, Y.; Wang, J. G.; Wangler, T.; Warren, D.; Wei, J.; Weiss, D.; Welton, R.; Weng, J.; Weng, W.-T.; Wezensky, M.; White, M.; Whitlatch, T.; Williams, D.; Williams, E.; Wilson, K.; Wiseman, M.; Wood, R.; Wright, P.; Wu, A.; Ybarrolaza, N.; Young, K.; Young, L.; Yourd, R.; Zachoszcz, A.; Zaltsman, A.; Zhang, S.; Zhang, W.; Zhang, Y.; Zhukov, A.
2014-11-01
The Spallation Neutron Source (SNS) was designed and constructed by a collaboration of six U.S. Department of Energy national laboratories. The SNS accelerator system consists of a 1 GeV linear accelerator and an accumulator ring providing 1.4 MW of proton beam power in microsecond-long beam pulses to a liquid mercury target for neutron production. The accelerator complex consists of a front-end negative hydrogen-ion injector system, an 87 MeV drift tube linear accelerator, a 186 MeV side-coupled linear accelerator, a 1 GeV superconducting linear accelerator, a 248-m circumference accumulator ring and associated beam transport lines. The accelerator complex is supported by ~100 high-power RF power systems, a 2 K cryogenic plant, ~400 DC and pulsed power supply systems, ~400 beam diagnostic devices and a distributed control system handling ~100,000 I/O signals. The beam dynamics design of the SNS accelerator is presented, as is the engineering design of the major accelerator subsystems.
NASA Astrophysics Data System (ADS)
Kalenchuk, K. S.; Hutchinson, D.; Diederichs, M. S.
2013-12-01
Downie Slide, one of the world's largest landslides, is a massive, active, composite, extremely slow rockslide located on the west bank of the Revelstoke Reservoir in British Columbia. It is a 1.5 billion m3 rockslide measuring 2400 m along the river valley and 3300 m from toe to headscarp, and is up to 245 m thick. Significant contributions to the field of landslide geomechanics have been made by analyses of spatially and temporally discriminated slope deformations and of how these are controlled by complex geological and geotechnical factors. Downie Slide research demonstrates the importance of delineating massive landslides into morphological regions in order to characterize global slope behaviour and identify localized events, which may or may not influence the overall slope deformation patterns. Massive slope instabilities do not behave as monolithic masses; rather, different landslide zones can display specific landslide processes occurring at variable rates of deformation. The global deformation of Downie Slide is extremely slow; however, localized regions of the slope incur moderate to high rates of movement. The complex deformation processes and composite failure mechanism arise from topography, non-uniform shear surfaces, and heterogeneous rockmass and shear zone strength and stiffness characteristics. Further, analysis of temporal changes in landslide behaviour clearly shows that different regions of the slope respond differently to changing hydrogeological boundary conditions. State-of-the-art methodologies have been developed for the numerical simulation of large landslides; these provide important tools for investigating dynamic landslide systems, accounting for complex three-dimensional geometries, heterogeneous shear zone strength parameters, internal shear zones, the interaction of discrete landslide zones, and piezometric fluctuations.
Numerical models of Downie Slide have been calibrated to reproduce observed slope behaviour, and the calibration process has provided important insight into the key factors controlling massive slope mechanics. Numerical studies have shown that the three-dimensional interpretation of basal slip surface geometry and the spatial heterogeneity in shear zone stiffness are important factors controlling large-scale slope deformation processes. The role of secondary internal shears and the interaction between landslide morphological zones have also been assessed. Further, numerical simulation of changing groundwater conditions has produced reasonable correlation with field observations. Calibrated models are valuable tools for the forward prediction of landslide dynamics, and the calibrated Downie Slide models have been used to investigate how trigger scenarios may accelerate deformations. The ability to reproduce observed behaviour and to forward-test hypothesized changes to boundary conditions has valuable application in the hazard management of massive landslides, enhancing the capacity of decision makers to interpret large amounts of data, respond to rapid changes in a system, and understand complex slope dynamics.
NASA Astrophysics Data System (ADS)
Guzmán, H. A.; Lárraga, M. E.; Alvarez-Icaza, L.; Carvajal, J.
2018-02-01
In this paper, a reliable cellular automata (CA) model is presented, oriented to faithfully reproducing deceleration and acceleration according to realistic driver reactions when vehicles with different deceleration capabilities are considered. The model focuses on describing complex traffic phenomena by coding into its rules the basic mechanisms of driver behavior, vehicle capabilities, and kinetics, while preserving simplicity. In particular, vehicle kinetics is based on uniformly accelerated motion, rather than on the impulsive accelerated motion of most existing CA models. The proposed model analytically calculates three safety-preserving distances to determine the best action a follower vehicle can take under a worst-case scenario, and the prediction analysis guarantees that, under the proper assumptions, collisions between vehicles cannot happen at any future time. Simulation results indicate that all interactions of heterogeneous vehicles (i.e., car-truck, truck-car, car-car and truck-truck) are properly reproduced by the model. In addition, the model overcomes one of the major limitations of CA traffic models: the inability to perform a smooth approach to slower or stopped vehicles. Moreover, the model reproduces most empirical findings, including the backward speed of the downstream front of a traffic jam and the different congested traffic patterns induced by a system with open boundary conditions and an on-ramp. Like most CA models, it uses integer values to make the model run faster, which makes it suitable for real-time traffic simulation of large networks.
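A worst-case safe-gap rule in the spirit described above (the follower reacts for t_r seconds, then both vehicles brake at their own maximum decelerations) might look like this; the formulas and the three-action chooser are a simplified sketch, not the paper's three analytic distances:

```python
def safe_gap(v_f, v_l, b_f, b_l, t_r=1.0):
    """Worst-case gap (m) for a follower at speed v_f (m/s) behind a
    leader at v_l: the follower reacts for t_r seconds, then both brake
    at their maximum decelerations b_f, b_l (m/s^2)."""
    stop_f = v_f * t_r + v_f * v_f / (2.0 * b_f)   # follower stopping distance
    stop_l = v_l * v_l / (2.0 * b_l)               # leader stopping distance
    return max(0.0, stop_f - stop_l)

def choose_action(gap, v_f, v_l, b_f, b_l, a_max=1.0, dt=1.0):
    """Pick the largest acceleration in {a_max, 0, -b_f} whose
    (conservatively estimated, leader assumed not to advance)
    next-step gap stays above the worst-case safe gap."""
    for a in (a_max, 0.0, -b_f):
        v_next = max(0.0, v_f + a * dt)
        if gap - v_next * dt >= safe_gap(v_next, v_l, b_f, b_l):
            return a
    return -b_f
```

With car-like braking for the follower (b_f = 6 m/s^2) and truck-like braking for the leader (b_l = 3 m/s^2), the rule accelerates at large gaps and brakes hard at short ones, giving the smooth, collision-free approach the abstract emphasizes.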
Observations of the Coronal Mass Ejection with a Complex Acceleration Profile
NASA Astrophysics Data System (ADS)
Reva, A. A.; Kirichenko, A. S.; Ulyanov, A. S.; Kuzin, S. V.
2017-12-01
We study the coronal mass ejection (CME) with a complex acceleration profile. The event occurred on 2009 April 23. It had an impulsive acceleration phase, an impulsive deceleration phase, and a second impulsive acceleration phase. During its evolution, the CME showed signatures of different acceleration mechanisms: kink instability, prominence drainage, flare reconnection, and a CME–CME collision. The special feature of the observations is the usage of the TESIS EUV telescope. The instrument could image the solar corona in the Fe 171 Å line up to a distance of 2 R⊙ from the center of the Sun. This allows us to trace the CME up to the LASCO/C2 field of view without losing the CME from sight. The onset of the CME was caused by kink instability. The mass drainage occurred after the kink instability. The mass drainage played only an auxiliary role: it decreased the CME mass, which helped to accelerate the CME. The first impulsive acceleration phase was caused by the flare reconnection. We observed the two-ribbon flare and an increase of the soft X-ray flux during the first impulsive acceleration phase. The impulsive deceleration and the second impulsive acceleration phases were caused by the CME–CME collision. The studied event shows that CMEs are complex phenomena that cannot be explained with only one acceleration mechanism. We should seek a combination of different mechanisms that accelerate CMEs at different stages of their evolution.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Ning
Independent of the methods of nuclear waste disposal, the degradation of packaging materials could lead to mobilization and transport of radionuclides into the geosphere. This process can be significantly accelerated due to the association of radionuclides with the backfill materials or mobile colloids in groundwater. The transport of these colloids is complicated by the inherent coupling of physical and chemical heterogeneities (e.g., pore space geometry, grain size, charge heterogeneity, and surface hydrophobicity) in natural porous media that can exist on the length scale of a few grains. In addition, natural colloids themselves are often heterogeneous in their surface properties (e.g., clay platelets possess opposite charges on the surface and along the rim). Both physical and chemical heterogeneities influence the transport and retention of radionuclides under various groundwater conditions. However, the precise mechanisms by which these coupled heterogeneities influence colloidal transport are largely elusive. This knowledge gap is a major source of uncertainty in developing accurate models to represent the transport process and to predict the distribution of radionuclides in the geosphere.
Monte Carlo N Particle code - Dose distribution of clinical electron beams in inhomogeneous phantoms
Nedaie, H. A.; Mosleh-Shirazi, M. A.; Allahverdi, M.
2013-01-01
Electron dose distributions calculated using the currently available analytical methods can be associated with large uncertainties. The Monte Carlo method is the most accurate method for dose calculation in electron beams. Most clinical electron beam simulation studies have been performed using non-MCNP (Monte Carlo N-Particle) codes. Given the differences between Monte Carlo codes, this work aims to evaluate the accuracy of MCNP4C-simulated electron dose distributions in a homogeneous phantom and around inhomogeneities. Different types of phantoms ranging in complexity were used; namely, a homogeneous water phantom and phantoms made of polymethyl methacrylate slabs containing different-sized, low- and high-density inserts of heterogeneous materials. Electron beams with 8 and 15 MeV nominal energy generated by an Elekta Synergy linear accelerator were investigated. Measurements were performed for a 10 cm × 10 cm applicator at a source-to-surface distance of 100 cm. Individual parts of the beam-defining system were introduced into the simulation one at a time in order to show their effect on depth doses. In contrast to the first scattering foil, the secondary scattering foil, X and Y jaws and applicator provide up to 5% of the dose. A 2%/2 mm agreement between MCNP and measurements was found in the homogeneous phantom; in the presence of heterogeneities the agreement was in the range of 1-3%, generally within 2% of the measurements for both energies in a "complex" phantom. A full-component simulation is necessary in order to obtain a realistic model of the beam. The MCNP4C results agree well with the measured electron dose distributions. PMID:23533162
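The 2%/2 mm criterion quoted above is usually evaluated with a gamma index; a minimal 1-D global-gamma sketch (an assumed simplification of the full 3-D analysis, not the authors' comparison code):

```python
import math

def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose, dd=0.02, dta=2.0):
    """1-D gamma index (global 2%/2 mm by default): for each reference
    point, the minimum over evaluated points of
    sqrt((dr/dta)^2 + (dD/(dd*Dmax))^2); gamma <= 1 means the point
    passes the combined dose-difference / distance-to-agreement test."""
    d_max = max(ref_dose)
    return [min(math.sqrt(((ep - rp) / dta) ** 2 +
                          ((ed - rd) / (dd * d_max)) ** 2)
                for ep, ed in zip(eval_pos, eval_dose))
            for rp, rd in zip(ref_pos, ref_dose)]
```

Identical profiles give gamma = 0 everywhere, and a uniform 1% dose offset gives gamma around 0.5, i.e. a pass under the 2%/2 mm criterion.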
Wang, Lu-Yong; Fasulo, D
2006-01-01
Genome-wide association studies of complex diseases generate massive amounts of single nucleotide polymorphism (SNP) data. Univariate statistical tests (e.g., Fisher's exact test) are used to single out non-associated SNPs. However, disease-susceptible SNPs may have little marginal effect in the population and are unlikely to be retained after univariate tests. Model-based methods, meanwhile, are impractical for large-scale datasets, and genetic heterogeneity makes it harder for traditional methods to identify the genetic causes of diseases. The more recent random forest method provides a robust way of screening SNPs at the scale of thousands. For still larger data, such as Affymetrix Human Mapping 100K GeneChip data, a faster method is required for screening SNPs in whole-genome association analysis in the presence of genetic heterogeneity. We propose a boosting-based method for rapid screening in large-scale analysis of complex traits in the presence of genetic heterogeneity. It provides a relatively fast and fairly good tool for screening and limiting the candidate SNPs for further, more complex computational modeling tasks.
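A boosting-based screen can be sketched with one-SNP decision stumps, ranking SNPs by how much boosting weight they accumulate; this toy AdaBoost is an assumed illustration, not the authors' algorithm:

```python
import math

def boost_screen(X, y, rounds=20):
    """Tiny AdaBoost with one-SNP decision stumps; SNPs are ranked by
    the total stump weight (alpha) they accumulate across rounds.
    X[i][j] in {0, 1} (genotype carrier status), y[i] in {-1, +1}."""
    n, m = len(X), len(X[0])
    w = [1.0 / n] * n                      # sample weights
    scores = [0.0] * m                     # per-SNP importance
    for _ in range(rounds):
        best = None                        # (weighted error, snp, polarity)
        for j in range(m):
            for pol in (1, -1):
                err = sum(wi for wi, xi, yi in zip(w, X, y)
                          if (pol if xi[j] else -pol) != yi)
                if best is None or err < best[0]:
                    best = (err, j, pol)
        err, j, pol = best
        err = min(max(err, 1e-9), 1 - 1e-9)
        alpha = 0.5 * math.log((1 - err) / err)
        scores[j] += alpha
        # re-weight: up-weight the samples this stump misclassified
        for i in range(n):
            pred = pol if X[i][j] else -pol
            w[i] *= math.exp(-alpha * y[i] * pred)
        s = sum(w)
        w = [wi / s for wi in w]
    return scores
```

On a toy dataset where only the first SNP tracks disease status, that SNP dominates the score vector after a handful of rounds, which is the screening signal one would threshold on.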
NASA Astrophysics Data System (ADS)
Nowicki, Waldemar; Gąsowska, Anna; Kirszensztejn, Piotr
2016-05-01
UV-vis spectroscopy measurements confirmed the reaction, in a heterogeneous system, between Pt(II) ions and an ethylenediamine-type ligand, N-(2-aminoethyl)-3-aminopropyltrimethoxysilane, immobilized on the silica surface. The formation of complexes is a consequence of the interaction between the amine groups of the ligand grafted onto SiO2 and platinum ions. A potentiometric titration technique was used to determine the stability constants of the complexes of Pt(II) with the insoluble ligand (SG-L) immobilized on the silica gel. The results show the formation of three surface complexes of the same type (PtHSG-L, Pt(HSG-L)2, PtSG-L) with the SG-L ligand over a wide range of pH for different Debye lengths. The concentration distribution of the complexes in the heterogeneous system is evaluated.
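Given cumulative stability constants, the concentration distribution of the complexes follows from standard speciation algebra; a sketch with invented constants (the actual Pt(II) values are determined in the paper):

```python
def speciation(betas, free_ligand):
    """Fraction of metal present as M, ML, ML2, ... for cumulative
    stability constants beta_1..beta_n and a free-ligand concentration:
    alpha_k = beta_k * [L]^k / (1 + sum_k beta_k * [L]^k)."""
    terms = [1.0] + [b * free_ligand ** (k + 1) for k, b in enumerate(betas)]
    s = sum(terms)
    return [t / s for t in terms]
```

For example, with illustrative beta_1 = 100 and beta_2 = 1000 at [L] = 0.1, the metal splits evenly between the 1:1 and 1:2 complexes, with a small free-metal fraction.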
Epidemic outbreaks in complex heterogeneous networks
NASA Astrophysics Data System (ADS)
Moreno, Y.; Pastor-Satorras, R.; Vespignani, A.
2002-04-01
We present a detailed analytical and numerical study for the spreading of infections with acquired immunity in complex population networks. We show that the large connectivity fluctuations usually found in these networks strengthen considerably the incidence of epidemic outbreaks. Scale-free networks, which are characterized by diverging connectivity fluctuations in the limit of a very large number of nodes, exhibit the lack of an epidemic threshold and always show a finite fraction of infected individuals. This particular weakness, observed also in models without immunity, defines a new epidemiological framework characterized by a highly heterogeneous response of the system to the introduction of infected individuals with different connectivity. The understanding of epidemics in complex networks might deliver new insights into the spread of information and diseases in biological and technological networks that often appear to be characterized by complex heterogeneous architectures.
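For uncorrelated networks, the threshold behaviour described above follows from the standard result lambda_c = ⟨k⟩/⟨k²⟩; a short calculation showing the threshold shrinking as the degree cutoff, and hence the connectivity fluctuations, grow:

```python
def epidemic_threshold(degree_hist):
    """Epidemic threshold lambda_c = <k>/<k^2> for an uncorrelated
    network with degree histogram {degree: weight}; diverging degree
    fluctuations (<k^2> -> infinity) drive the threshold to zero."""
    total = sum(degree_hist.values())
    k1 = sum(k * w for k, w in degree_hist.items()) / total
    k2 = sum(k * k * w for k, w in degree_hist.items()) / total
    return k1 / k2
```

A homogeneous network where every node has degree 4 gives lambda_c = 4/16 = 0.25, while for a p(k) ~ k^-3 scale-free histogram the threshold keeps falling as the maximum degree grows, illustrating the vanishing threshold in the large-network limit.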
Dose and scatter characteristics of a novel cone beam CT system for musculoskeletal extremities
NASA Astrophysics Data System (ADS)
Zbijewski, W.; Sisniega, A.; Vaquero, J. J.; Muhit, A.; Packard, N.; Senn, R.; Yang, D.; Yorkston, J.; Carrino, J. A.; Siewerdsen, J. H.
2012-03-01
A novel cone-beam CT (CBCT) system has been developed with promising capabilities for musculoskeletal imaging (e.g., weight-bearing extremities and combined radiographic / volumetric imaging). The prototype system demonstrates diagnostic-quality imaging performance, while the compact geometry and short scan orbit raise new considerations for scatter management and dose characterization that challenge conventional methods. The compact geometry leads to elevated, heterogeneous x-ray scatter distributions - even for small anatomical sites (e.g., knee or wrist), and the short scan orbit results in a non-uniform dose distribution. These complex dose and scatter distributions were investigated via experimental measurements and GPU-accelerated Monte Carlo (MC) simulation. The combination provided a powerful basis for characterizing dose distributions in patient-specific anatomy, investigating the benefits of an antiscatter grid, and examining distinct contributions of coherent and incoherent scatter in artifact correction. Measurements with a 16 cm CTDI phantom show that the dose from the short-scan orbit (0.09 mGy/mAs at isocenter) varies from 0.16 to 0.05 mGy/mAs at various locations on the periphery (all obtained at 80 kVp). MC estimation agreed with dose measurements within 10-15%. Dose distribution in patient-specific anatomy was computed with MC, confirming such heterogeneity and highlighting the elevated energy deposition in bone (factor of ~5-10) compared to soft-tissue. Scatter-to-primary ratio (SPR) up to ~1.5-2 was evident in some regions of the knee. A 10:1 antiscatter grid was found earlier to result in significant improvement in soft-tissue imaging performance without increase in dose. The results of MC simulations elucidated the mechanism behind scatter reduction in the presence of a grid. 
A ~3-fold reduction in average SPR was found in the MC simulations; however, a linear grid was found to impart additional heterogeneity in the scatter distribution, mainly due to the increased contribution of coherent scatter with greater spatial variation. Scatter correction using MC-generated scatter distributions demonstrated significant improvement in cupping and streak artifacts. Physical experimentation combined with GPU-accelerated MC simulation provided a sophisticated yet practical approach to identifying low-dose acquisition techniques, optimizing scatter correction methods, and evaluating patient-specific dose.
NASA Technical Reports Server (NTRS)
Farhat, Charbel; Rixen, Daniel
1996-01-01
We present an optimal preconditioning algorithm that is equally applicable to the dual (FETI) and primal (Balancing) Schur complement domain decomposition methods, and which successfully addresses the problems of subdomain heterogeneities including the effects of large jumps of coefficients. The proposed preconditioner is derived from energy principles and embeds a new coarsening operator that propagates the error globally and accelerates convergence. The resulting iterative solver is illustrated with the solution of highly heterogeneous elasticity problems.
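As context for the Schur-complement machinery the abstract refers to, here is a minimal sketch of static condensation onto interface unknowns (the primal Schur complement that the Balancing method iterates on); the 1-D Laplacian example and all names are illustrative, not the authors' preconditioner:

```python
def solve(A, b):
    """Dense Gaussian elimination with partial pivoting (small systems)."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def schur_solve(Aii, Aib, Abi, Abb, fi, fb):
    """Solve [[Aii, Aib], [Abi, Abb]] [xi; xb] = [fi; fb] by eliminating
    interior unknowns: S = Abb - Abi Aii^-1 Aib, g = fb - Abi Aii^-1 fi,
    then xb = S^-1 g and back-substitute for xi."""
    ni, nb = len(Aii), len(Abb)
    # columns of Aii^-1 Aib, and Aii^-1 fi
    Y = [solve(Aii, [Aib[r][c] for r in range(ni)]) for c in range(nb)]
    z = solve(Aii, fi)
    S = [[Abb[r][c] - sum(Abi[r][k] * Y[c][k] for k in range(ni))
          for c in range(nb)] for r in range(nb)]
    g = [fb[r] - sum(Abi[r][k] * z[k] for k in range(ni)) for r in range(nb)]
    xb = solve(S, g)
    xi = [z[k] - sum(Y[c][k] * xb[c] for c in range(nb)) for k in range(ni)]
    return xi, xb
```

In FETI/Balancing, the interface system S xb = g is never formed explicitly but solved iteratively, and the preconditioner proposed in the paper addresses exactly the case where coefficient jumps between subdomains make S ill-conditioned.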
Rupture Dynamics and Ground Motion from Earthquakes in Heterogeneous Media
NASA Astrophysics Data System (ADS)
Bydlon, S.; Dunham, E. M.; Kozdon, J. E.
2012-12-01
Heterogeneities in the material properties of Earth's crust scatter propagating seismic waves. The effects of scattered waves are reflected in the seismic coda and depend on the relative strength of the heterogeneities, spatial arrangement, and distance from source to receiver. In the vicinity of the fault, scattered waves influence the rupture process by introducing fluctuations in the stresses driving propagating ruptures. Further variability in the rupture process is introduced by naturally occurring geometric complexity of fault surfaces, and the stress changes that accompany slip on rough surfaces. We have begun a modeling effort to better understand the origin of complexity in the earthquake source process, and to quantify the relative importance of source complexity and scattering along the propagation path in causing incoherence of high frequency ground motion. To do this we extended our two-dimensional high order finite difference rupture dynamics code to accommodate material heterogeneities. We generate synthetic heterogeneous media using Von Karman correlation functions and their associated power spectral density functions. We then nucleate ruptures on either flat or rough faults, which obey strongly rate-weakening friction laws. Preliminary results for flat faults with uniform frictional properties and initial stresses indicate that off-fault material heterogeneity alone can lead to a complex rupture process. Our simulations reveal the excitation of high frequency bursts of waves, which radiate energy away from the propagating rupture. The average rupture velocity is thus reduced relative to its value in simulations employing homogeneous material properties. In the coming months, we aim to more fully explore parameter space by varying the correlation length, Hurst exponent, and amplitude of medium heterogeneities, as well as the statistical properties characterizing fault roughness.
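The spectral synthesis of a Von Karman medium can be sketched in 1-D by superposing cosines whose amplitudes follow the Von Karman power spectrum; the normalisation, exponent convention, and parameter names are illustrative assumptions (the simulations above are 2-D):

```python
import math
import random

def von_karman_medium(n, dx, corr_len, hurst, sigma, seed=0):
    """1-D random-medium perturbation with a Von Karman power spectrum
    P(k) ~ (1 + k^2 a^2)^-(H + 1/2), built by superposing cosines with
    random phases; the field is rescaled to zero mean and rms sigma."""
    rng = random.Random(seed)
    ks = [2.0 * math.pi * m / (n * dx) for m in range(1, n // 2)]
    amps = [(1.0 + (k * corr_len) ** 2) ** (-(hurst + 0.5) / 2.0) for k in ks]
    phases = [rng.uniform(0.0, 2.0 * math.pi) for _ in ks]
    field = [sum(a * math.cos(k * i * dx + p)
                 for k, a, p in zip(ks, amps, phases)) for i in range(n)]
    mean = sum(field) / n
    rms = math.sqrt(sum((v - mean) ** 2 for v in field) / n)
    return [sigma * (v - mean) / rms for v in field]
```

Varying `corr_len` (correlation length), `hurst` (Hurst exponent), and `sigma` (perturbation amplitude) corresponds to the parameter sweep the abstract describes; the resulting field would perturb wave speeds about their background values.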
Fuller, James A; Goldstick, Jason; Bartram, Jamie; Eisenberg, Joseph N S
2016-01-15
Global access to safe drinking water and sanitation has improved dramatically during the Millennium Development Goal (MDG) period. However, there is substantial heterogeneity in progress between countries and inequality within countries. We assessed countries' temporal patterns in access to drinking water and sanitation using publicly available data. We then classified countries using non-linear modeling techniques as having one of the following trajectories: 100% coverage, linear growth, linear decline, no change, saturation, acceleration, deceleration, negative acceleration, or negative deceleration. We further assessed the degree to which temporal profiles follow a sigmoidal pattern and how these patterns might vary within a given country between rural and urban settings. Among countries with more than 10 data points, between 15% and 38% showed a non-linear trajectory, depending on the indicator. Overall, countries' progress followed a sigmoidal trend, but some countries are making better progress and some worse progress than would be expected. We highlight several countries that are not on track to meet the MDG for water or sanitation, but whose access is accelerating, suggesting better performance during the coming years. Conversely, we also highlight several countries that have made sufficient progress to meet the MDG target, but in which access is decelerating. Patterns were heterogeneous and non-linearity was common. Characterization of these heterogeneous patterns will help policy makers allocate resources more effectively. For example, policy makers can identify countries that could make use of additional resources or might be in need of additional institutional capacity development to properly manage resources; this will be essential to meet the forthcoming Sustainable Development Goals.
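A crude version of the trajectory classification (labels taken from the list above, thresholds invented, and far simpler than the paper's fitted non-linear models) can be based on the mean first and second differences of a coverage series:

```python
def classify_trajectory(series, tol=1e-3):
    """Rough trajectory label from mean first and second differences of
    an annual coverage series (fractions in [0, 1]); positive/negative
    mean curvature separates accelerating from decelerating growth."""
    d1 = [b - a for a, b in zip(series, series[1:])]
    d2 = [b - a for a, b in zip(d1, d1[1:])]
    slope = sum(d1) / len(d1)
    curv = sum(d2) / len(d2)
    if all(abs(v - 1.0) < tol for v in series):
        return "100% coverage"
    if abs(slope) < tol and abs(curv) < tol:
        return "no change"
    if abs(curv) < tol:
        return "linear growth" if slope > 0 else "linear decline"
    if slope > 0:
        return "acceleration" if curv > 0 else "deceleration"
    return "negative acceleration" if curv < 0 else "negative deceleration"
```

A sigmoidal series would move through these labels over time: acceleration in early years, deceleration as coverage saturates.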
Quantum computational complexity, Einstein's equations and accelerated expansion of the Universe
NASA Astrophysics Data System (ADS)
Ge, Xian-Hui; Wang, Bin
2018-02-01
We study the relation between quantum computational complexity and general relativity. The quantum computational complexity is proposed to be quantified by the shortest length of geodesic quantum curves. We examine the complexity/volume duality in a geodesic causal ball in the framework of Fermi normal coordinates and derive the full non-linear Einstein equation. Using insights from the complexity/action duality, we argue that the accelerated expansion of the universe could be driven by the quantum complexity, free from coincidence and fine-tuning problems.
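For reference, the two holographic-complexity proposals the abstract leans on are conventionally written as follows (standard forms from the holographic-complexity literature; the paper's geodesic-ball construction refines the CV side):

```latex
% Complexity/volume (CV) and complexity/action (CA) conjectures:
% C_V uses the maximal-volume slice Sigma anchored on the boundary
% time (l is a bulk length scale); C_A uses the on-shell action of
% the Wheeler-DeWitt patch.
C_V \;=\; \max_{\Sigma}\,\frac{V(\Sigma)}{G_N\,\ell},
\qquad
C_A \;=\; \frac{I_{\mathrm{WDW}}}{\pi\hbar}
```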
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schunert, Sebastian; Wang, Yaqi; Gleicher, Frederick
This paper presents a flexible nonlinear diffusion acceleration (NDA) method that discretizes both the S_N transport equation and the diffusion equation using the discontinuous finite element method (DFEM). The method is flexible in that the diffusion equation can be discretized on a coarser mesh, with the only restriction that it be nested within the transport mesh, and the FEM shape function orders of the two equations can differ. The consistency of the transport and diffusion solutions at convergence is defined by a projection operator mapping the transport solution into the diffusion FEM space. The diffusion weak form is based on the modified incomplete interior penalty (MIP) diffusion DFEM discretization, extended by volumetric drift, interior face, and boundary closure terms. In contrast to commonly used coarse mesh finite difference (CMFD) methods, the presented NDA method uses a fully FEM-discretized diffusion equation for acceleration. Suitable projection and prolongation operators arise naturally from the FEM framework. Via Fourier analysis and numerical experiments for a one-group, fixed-source problem, the following properties of the NDA method are established for structured quadrilateral meshes: (1) the method is unconditionally stable and effective in the presence of mild material heterogeneities if the same mesh and identical shape functions, either of the bilinear or biquadratic type, are used; (2) the method remains unconditionally stable in the presence of strong heterogeneities; (3) the method with bilinear elements extends the range of effectiveness and stability by a factor of two compared to CMFD if a coarser diffusion mesh is selected. In addition, the method is tested on the C5G7 multigroup eigenvalue problem using coarse and fine mesh acceleration. Finally, while NDA does not offer an advantage over CMFD for fine mesh acceleration, it reduces the iteration count required for convergence by almost a factor of two in the case of coarse mesh acceleration.
2017-02-21
Feasibility study of a cyclotron complex for hadron therapy
NASA Astrophysics Data System (ADS)
Smirnov, V.; Vorozhtsov, S.
2018-04-01
An accelerator complex for hadron therapy based on a chain of cyclotrons is under development at JINR (Dubna, Russia), and the corresponding conceptual design is under preparation. The complex consists mainly of two superconducting cyclotrons. The first accelerator is a compact cyclotron used as an injector to the main accelerator, which is a six-fold separated-sector machine. The facility is intended for the generation of proton and carbon beams. The H2+ and 12C6+ ions from the corresponding ECR ion sources are accelerated in the injector cyclotron up to an output energy of 70 MeV/u. The H2+ ions are then extracted from the injector by a stripping foil, and the resulting proton beam with an energy of 70 MeV is used for medical purposes. After acceleration in the injector, the carbon beam can either be used directly for therapy or injected into the main cyclotron to reach the final energy of 400 MeV/u. The basic requirements for the project are the following: compliance with medical requirements, compact size, feasible design, and high reliability of all systems of the complex. The advantages of the dual-cyclotron design can help reach these goals. Initial calculations show that this design is technically feasible with acceptable beam dynamics. An accelerator complex of relatively compact size can be a good solution for medical applications. The basic parameters of the facility and a detailed investigation of the magnetic system and beam dynamics are described.
di Virgilio, Agustina; Morales, Juan M; Lambertucci, Sergio A; Shepard, Emily L C; Wilson, Rory P
2018-01-01
Precision Livestock Farming (PLF) is a promising approach to minimizing conflicts between socio-economic activities and landscape conservation. However, its application to extensive systems of livestock production can be challenging. The main difficulties arise because animals graze on large natural pastures where they are exposed to competition with wild herbivores for heterogeneous and scarce resources, predation risk, adverse weather, and complex topography. Considering that 91% of the world's surface devoted to livestock production is composed of extensive systems (i.e., rangelands), our general aim was to develop a PLF methodology that quantifies: (i) detailed behavioural patterns, (ii) feeding rate, and (iii) costs associated with different behaviours and landscape traits. For this, we used Merino sheep in Patagonian rangelands as a case study. To acquire data, we combined an animal-attached multi-sensor tag (tri-axial acceleration, tri-axial magnetometry, temperature sensor and Global Positioning System) with landscape layers from a Geographical Information System. We then used high-accuracy decision trees, dead-reckoning methods and spatial data processing techniques to show how this combination of tools can be used to assess the energy balance, predation risk and competition experienced by livestock through time and space. The combination of methods proposed here is a useful tool for assessing livestock behaviour and the different factors that influence extensive livestock production, such as topography, environmental temperature, predation risk and competition for heterogeneous resources. We were able to quantify feeding rate continuously through time and space with high accuracy and show how it could be used to estimate animal production and the intensity of grazing on the landscape.
We also assessed the effects of resource heterogeneity (inferred through search times), and the potential costs associated with predation risk, competition, thermoregulation and movement on complex topography. The quantification of feeding rate and behavioural costs provided by our approach could be used to estimate energy balance and to predict individual growth, survival and reproduction. Finally, we discussed how the information provided by this combination of methods can be used to develop wildlife-friendly strategies that also maximize animal welfare, quality and environmental sustainability.
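The behavioural metrics above depend on separating the dynamic component of the tri-axial acceleration signal from the static (gravitational) one. A minimal sketch of this standard preprocessing step, using an illustrative window size and synthetic data rather than the authors' actual tag-processing pipeline:

```python
# Sketch (not the authors' code): deriving vectorial dynamic body acceleration
# (VeDBA) from tri-axial accelerometer samples, a common feature for
# classifying livestock behaviour with decision trees. Window size and the
# synthetic signals below are illustrative assumptions.

def running_mean(xs, w):
    """Centred running mean with edge clamping; approximates the static
    (gravity) component of one acceleration axis."""
    n = len(xs)
    out = []
    for i in range(n):
        lo, hi = max(0, i - w // 2), min(n, i + w // 2 + 1)
        out.append(sum(xs[lo:hi]) / (hi - lo))
    return out

def vedba(ax, ay, az, w=5):
    """VeDBA per sample: sqrt(dx^2 + dy^2 + dz^2), where d* is the
    de-trended (dynamic) part of each axis."""
    sx, sy, sz = running_mean(ax, w), running_mean(ay, w), running_mean(az, w)
    return [((x - mx) ** 2 + (y - my) ** 2 + (z - mz) ** 2) ** 0.5
            for x, mx, y, my, z, mz in zip(ax, sx, ay, sy, az, sz)]

# A resting animal (flat signal) yields zero VeDBA; an active one
# (oscillating signal) yields clearly positive values.
rest = vedba([0.0] * 10, [0.0] * 10, [1.0] * 10)
active = vedba([(-1) ** i * 0.5 for i in range(10)], [0.0] * 10, [1.0] * 10)
assert max(rest) == 0.0
assert sum(active) / len(active) > 0.1
```

In a pipeline like the one described, the resulting VeDBA series, together with magnetometry and GPS-based dead-reckoned tracks, would be the inputs to the behaviour classifier.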
Intrafamilial phenotypic heterogeneity of the Poland complex: a case report.
Parano, E; Falsaperla, R; Pavone, V; Toscano, A; Bolan, E A; Trifiletti, R R
1995-08-01
Three cases of familial unilateral gluteal hypoplasia are reported. The index case, in addition to gluteal hypoplasia, also has unilateral pectoral muscle hypoplasia. Another relative has unilateral symbrachydactyly of the distal phalanges of one foot. All four affected individuals in our pedigree were female. We propose that our cases are best classified as part of the Poland complex of anomalies. Our cases emphasize that intrafamilial phenotypic heterogeneity is possible within the Poland complex.
NASA Astrophysics Data System (ADS)
Rodebaugh, Raymond Francis, Jr.
2000-11-01
In this project we applied modifications of the Fermi-Eyges multiple scattering theory in an attempt to achieve the goals of a fast, accurate electron dose calculation algorithm. The dose was first calculated for an "average configuration" based on the patient's anatomy using a modification of the Hogstrom algorithm. It was split into a measured central-axis depth dose component based on the material between the source and the dose calculation point, and an off-axis component based on the physics of multiple Coulomb scattering for the average configuration. The former provided the general depth dose characteristics along the beam fan lines, while the latter provided the effects of collimation. The Gaussian localized heterogeneities theory of Jette provided the lateral redistribution of the electron fluence by heterogeneities. Here we terminated Jette's infinite series of fluence redistribution terms after the second term. Experimental comparison data were collected for 1 cm thick x 1 cm diameter air and aluminum pillboxes using the Varian 2100C linear accelerator at Rush-Presbyterian-St. Luke's Medical Center. For the air pillbox, the algorithm results were in reasonable agreement with measured data at both 9 and 20 MeV. For the aluminum pillbox, there were significant discrepancies between the results of this algorithm and experiment, particularly for the 9 MeV beam. Of course, a one cm thick aluminum heterogeneity is unlikely to be encountered in a clinical situation; the thickness, linear stopping power, and linear scattering power of aluminum are all well above what would normally be encountered. We found that the algorithm is highly sensitive to the choice of the average configuration, an indication that the series of fluence redistribution terms does not converge fast enough to terminate after the second term.
This also makes it difficult to apply the algorithm to cases where there is no a priori means of choosing the best average configuration, or where there is a complex geometry containing both weakly and strongly scattering heterogeneities. There is some hope of decreasing the sensitivity to the average configuration by including portions of the next term of the localized heterogeneities series.
Moving beyond heterogeneity and process complexity: a new vision for watershed hydrology
J. J. McDonnell; M. Sivapalan; K. Vache; S. Dunn; G. Grant; R. Haggerty; C. Hinz; R. Hooper; J. Kirchner; M.L. Roderick; J. Selker; M. Weiler
2007-01-01
Field studies in watershed hydrology continue to characterize and catalogue the enormous heterogeneity and complexity of rainfall runoff processes in more and more watersheds, in different hydroclimatic regimes, and at different scales. Nevertheless, the ability to generalize these findings to ungauged regions remains out of reach. In spite of their apparent physical...
Chen, Ping; Zhang, Linxing; Xue, Zi-Ling; Wu, Yun-Dong; Zhang, Xinhao
2017-06-19
The reactions of early-transition-metal complexes with H2O have been investigated. An understanding of these elementary steps promotes the design of precursors for the preparation of metal oxide materials or supported heterogeneous catalysts. Density functional theory (DFT) calculations have been conducted to investigate two elementary steps of the reactions between tungsten alkylidyne complexes and H2O, i.e., the addition of H2O to the W≡C bond and ligand hydrolysis. Four tungsten alkylidyne complexes, W(≡CSiMe3)(CH2SiMe3)3 (A-1), W(≡CSiMe3)(CH2tBu)3 (B-1), W(≡CtBu)(CH2tBu)3 (C-1), and W(≡CtBu)(OtBu)3 (D-1), have been compared. The DFT studies provide an energy profile of the two competing pathways. An additional H2O molecule can serve as a proton shuttle, accelerating the H2O addition reaction. The effect of atoms at the α and β positions has also been examined. Because the lone-pair electrons of an O atom at the α position can interact with the orbital of the proton, the barrier of the ligand-hydrolysis reaction for D-1 is dramatically reduced. Both the electronic and steric effects of the silyl group at the β position lower the barriers of both the H2O addition and ligand-hydrolysis reactions. These new mechanistic findings may lead to the further development of metal complex precursors.
Emergent dynamics of spatio-temporal chaos in a heterogeneous excitable medium.
Bittihn, Philip; Berg, Sebastian; Parlitz, Ulrich; Luther, Stefan
2017-09-01
Self-organized activation patterns in excitable media such as spiral waves and spatio-temporal chaos underlie dangerous cardiac arrhythmias. While the interaction of single spiral waves with different types of heterogeneity has been studied extensively, the effect of heterogeneity on fully developed spatio-temporal chaos remains poorly understood. We investigate how the complexity and stability properties of spatio-temporal chaos in the Bär-Eiswirth model of excitable media depend on the heterogeneity of the underlying medium. We employ different measures characterizing the chaoticity of the system and find that the spatial arrangement of multiple discrete lower excitability regions has a strong impact on the complexity of the dynamics. Varying the number, shape, and spatial arrangement of the heterogeneities, we observe strong emergent effects ranging from increases in chaoticity to the complete cessation of chaos, contrasting the expectation from the homogeneous behavior. The implications of our findings for the development and treatment of arrhythmias in the heterogeneous cardiac muscle are discussed.
CryoEM and image sorting for flexible protein/DNA complexes.
Villarreal, Seth A; Stewart, Phoebe L
2014-07-01
Intrinsically disordered regions of proteins and conformational flexibility within complexes can be critical for biological function. However, disorder, flexibility, and heterogeneity often hinder structural analyses. CryoEM and single particle image processing techniques offer the possibility of imaging samples with significant flexibility. Division of particle images into more homogenous subsets after data acquisition can help compensate for heterogeneity within the sample. We present the utility of an eigenimage sorting analysis for examining two protein/DNA complexes with significant conformational flexibility and heterogeneity. These complexes are integral to the non-homologous end joining pathway, and are involved in the repair of double strand breaks of DNA. Both complexes include the DNA-dependent protein kinase catalytic subunit (DNA-PKcs) and biotinylated DNA with bound streptavidin, with one complex containing the Ku heterodimer. Initial 3D reconstructions of the two DNA-PKcs complexes resembled a cryoEM structure of uncomplexed DNA-PKcs without additional density clearly attributable to the remaining components. Application of eigenimage sorting allowed division of the DNA-PKcs complex datasets into more homogeneous subsets. This led to visualization of density near the base of the DNA-PKcs that can be attributed to DNA, streptavidin, and Ku. However, comparison of projections of the subset structures with 2D class averages indicated that a significant level of heterogeneity remained within each subset. In summary, image sorting methods allowed visualization of extra density near the base of DNA-PKcs, suggesting that DNA binds in the vicinity of the base of the molecule and potentially to a flexible region of DNA-PKcs. Copyright © 2013 Elsevier Inc. All rights reserved.
Gerards, Mike; Sallevelt, Suzanne C E H; Smeets, Hubert J M
2016-03-01
Leigh syndrome is a progressive neurodegenerative disorder, affecting 1 in 40,000 live births. Most patients present with symptoms between the ages of three and twelve months, but adult-onset Leigh syndrome has also been described. The disease course is characterized by a rapid deterioration of cognitive and motor functions, in most cases resulting in death due to respiratory failure. Despite the high genetic heterogeneity of Leigh syndrome, patients present with identical, symmetrical lesions in the basal ganglia or brainstem on MRI, while additional clinical manifestations and the age of onset vary from case to case. To date, mutations in over 60 genes, both nuclear and mitochondrial DNA encoded, have been shown to cause Leigh syndrome, yet these explain only half of all cases. In most patients, these mutations directly or indirectly affect the activity of the mitochondrial respiratory chain or the pyruvate dehydrogenase complex. Exome sequencing has accelerated the discovery of new genes and pathways involved in Leigh syndrome, providing novel insights into the pathophysiological mechanisms. This is particularly important as no general curative treatment is available for this devastating disorder, although several recent studies imply that early treatment might be beneficial for some patients, depending on the gene or process affected. Timely, gene-based personalized treatment may become an important strategy in rare, genetically heterogeneous disorders like Leigh syndrome, stressing the importance of early genetic diagnosis and identification of new genes/pathways. In this review, we provide a comprehensive overview of the most important clinical manifestations and genes/pathways involved in Leigh syndrome, and discuss the current state of therapeutic interventions in patients. Copyright © 2015 Elsevier Inc. All rights reserved.
Autonomic Management of Application Workflows on Hybrid Computing Infrastructure
Kim, Hyunjoo; el-Khamra, Yaakoub; Rodero, Ivan; ...
2011-01-01
In this paper, we present a programming and runtime framework that enables the autonomic management of complex application workflows on hybrid computing infrastructures. The framework is designed to address system and application heterogeneity and dynamics to ensure that application objectives and constraints are satisfied. The need for such autonomic system and application management is becoming critical as computing infrastructures become increasingly heterogeneous, integrating different classes of resources from high-end HPC systems to commodity clusters and clouds. For example, the framework presented in this paper can be used to provision the appropriate mix of resources based on application requirements and constraints. The framework also monitors the system/application state and adapts the application and/or resources to respond to changing requirements or environment. To demonstrate the operation of the framework and to evaluate its capabilities, we employ a workflow used to characterize an oil reservoir executing on a hybrid infrastructure composed of TeraGrid nodes and Amazon EC2 instances of various types. Specifically, we show how different application objectives such as acceleration, conservation and resilience can be effectively achieved while satisfying deadline and budget constraints, using an appropriate mix of dynamically provisioned resources. Our evaluations also demonstrate that public clouds can be used to complement and reinforce the scheduling and usage of traditional high performance computing infrastructure.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Selkoe, D.J.; Podlisny, M.B.; Joachim, C.L.
1988-10-01
Progressive cerebral deposition of extracellular filaments composed of the β-amyloid protein (βAP) is a constant feature of Alzheimer disease (AD). Since the gene on chromosome 21 encoding the βAP precursor (βAPP) is not known to be altered in AD, transcriptional or posttranslational changes may underlie accelerated βAP deposition. Using two antibodies to the predicted carboxyl terminus of βAPP, the authors have identified the native βAPP in brain and nonneural human tissues as a 110- to 135-kDa protein complex that is insoluble in buffer and found in various membrane-rich subcellular fractions. These proteins are relatively uniformly distributed in adult brain, abundant in fetal brain, and detected in nonneural tissues that contain βAPP mRNA. Similarly sized proteins occur in rat, cow, and monkey brain and in cultured human HL-60 and HeLa cells; the precise patterns in the 110- to 135-kDa range are heterogeneous among various tissues and cell lines. They conclude that the highly conserved βAPP molecule occurs in mammalian tissues as a heterogeneous group of membrane-associated proteins of ≈120 kDa. Detection of the nonamyloidogenic carboxyl terminus within plaques suggests that proteolytic processing of the βAPP into insoluble filaments occurs locally in cortical regions that develop β-amyloid deposits with age.
Yin, Xianyong; Low, Hui Qi; Wang, Ling; Li, Yonghong; Ellinghaus, Eva; Han, Jiali; Estivill, Xavier; Sun, Liangdan; Zuo, Xianbo; Shen, Changbing; Zhu, Caihong; Zhang, Anping; Sanchez, Fabio; Padyukov, Leonid; Catanese, Joseph J; Krueger, Gerald G; Duffin, Kristina Callis; Mucha, Sören; Weichenthal, Michael; Weidinger, Stephan; Lieb, Wolfgang; Foo, Jia Nee; Li, Yi; Sim, Karseng; Liany, Herty; Irwan, Ishak; Teo, Yikying; Theng, Colin T S; Gupta, Rashmi; Bowcock, Anne; De Jager, Philip L; Qureshi, Abrar A; de Bakker, Paul I W; Seielstad, Mark; Liao, Wilson; Ståhle, Mona; Franke, Andre; Zhang, Xuejun; Liu, Jianjun
2015-04-23
Psoriasis is a common inflammatory skin disease with complex genetics and different degrees of prevalence across ethnic populations. Here we present the largest trans-ethnic genome-wide meta-analysis (GWMA) of psoriasis in 15,369 cases and 19,517 controls of Caucasian and Chinese ancestries. We identify four novel associations at LOC144817, COG6, RUNX1 and TP63, as well as three novel secondary associations within IFIH1 and IL12B. Fine-mapping analysis of MHC region demonstrates an important role for all three HLA class I genes and a complex and heterogeneous pattern of HLA associations between Caucasian and Chinese populations. Further, trans-ethnic comparison suggests population-specific effect or allelic heterogeneity for 11 loci. These population-specific effects contribute significantly to the ethnic diversity of psoriasis prevalence. This study not only provides novel biological insights into the involvement of immune and keratinocyte development mechanism, but also demonstrates a complex and heterogeneous genetic architecture of psoriasis susceptibility across ethnic populations.
PMID: 25903422
On the robustness of complex heterogeneous gene expression networks.
Gómez-Gardeñes, Jesús; Moreno, Yamir; Floría, Luis M
2005-04-01
We analyze a continuous gene expression model on the underlying topology of a complex heterogeneous network. Numerical simulations aimed at studying the chaotic and periodic dynamics of the model are performed. The results clearly indicate that there is a region in which the dynamical and structural complexity of the system avoid chaotic attractors. However, contrary to what has been reported for Random Boolean Networks, the chaotic phase cannot be completely suppressed, which has important bearings on network robustness and gene expression modeling.
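To make the setting concrete, here is a hedged sketch of the kind of model the abstract refers to: a continuous (non-Boolean) gene-expression dynamic on a hub-dominated, i.e. degree-heterogeneous, interaction network. The network wiring, coupling form, gain, and step size are illustrative assumptions, not the paper's exact equations.

```python
# Sketch (assumptions, not the paper's model): continuous gene-expression
# dynamics x_i' = -x_i + tanh(g * sum_j W_ij x_j), integrated with explicit
# Euler steps on a small network dominated by a single hub gene.
import math, random

random.seed(1)
N = 20
# Hub ("heterogeneous") topology: gene 0 regulates every other gene,
# plus a few random extra links with random activation/repression signs.
W = [[0.0] * N for _ in range(N)]
for i in range(1, N):
    W[i][0] = random.choice([-1.0, 1.0])
for _ in range(10):
    i, j = random.randrange(N), random.randrange(N)
    if i != j:
        W[i][j] = random.choice([-1.0, 1.0])

def step(x, g=2.0, dt=0.05):
    """One Euler step of the saturating expression dynamics."""
    return [xi + dt * (-xi + math.tanh(g * sum(W[i][j] * x[j] for j in range(N))))
            for i, xi in enumerate(x)]

x = [random.uniform(-0.1, 0.1) for _ in range(N)]
for _ in range(2000):
    x = step(x)

# The saturating nonlinearity keeps trajectories bounded (|x_i| <= 1
# asymptotically), whether the attractor is fixed, periodic, or chaotic.
assert all(abs(v) <= 1.001 for v in x)
```

Studies like the one summarized above would run many such integrations while varying the topology, then classify the resulting attractors as fixed, periodic, or chaotic.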
Quantile Regression with Censored Data
ERIC Educational Resources Information Center
Lin, Guixian
2009-01-01
The Cox proportional hazards model and the accelerated failure time model are frequently used in survival data analysis. They are powerful, yet have limitations due to their model assumptions. Quantile regression offers a semiparametric approach to modeling data with possible heterogeneity. It is particularly powerful for censored responses, where the…
NASA Astrophysics Data System (ADS)
Kolesnikov, V. I.
2017-06-01
The NICA (Nuclotron-based Ion Collider fAcility) project is aimed at the construction at JINR (Dubna) of a modern accelerator complex equipped with three detectors: the MultiPurpose Detector (MPD) and the Spin Physics Detector (SPD) at the NICA collider, as well as the fixed-target experiment BM&N, which will use extracted beams from the Nuclotron accelerator. In this report, an overview of the main physics objectives of the NICA heavy-ion program is given, and recent progress in the NICA construction (both the accelerator complex and the detectors) is described.
Multiscale implementation of infinite-swap replica exchange molecular dynamics.
Yu, Tang-Qing; Lu, Jianfeng; Abrams, Cameron F; Vanden-Eijnden, Eric
2016-10-18
Replica exchange molecular dynamics (REMD) is a popular method to accelerate conformational sampling of complex molecular systems. The idea is to run several replicas of the system in parallel at different temperatures that are swapped periodically. These swaps are typically attempted every few MD steps and accepted or rejected according to a Metropolis-Hastings criterion. This guarantees that the joint distribution of the composite system of replicas is the normalized sum of the symmetrized product of the canonical distributions of these replicas at the different temperatures. Here we propose a different implementation of REMD in which (i) the swaps obey a continuous-time Markov jump process implemented via Gillespie's stochastic simulation algorithm (SSA), which also samples exactly the aforementioned joint distribution and has the advantage of being rejection free, and (ii) this REMD-SSA is combined with the heterogeneous multiscale method to accelerate the rate of the swaps and reach the so-called infinite-swap limit that is known to optimize sampling efficiency. The method is easy to implement and can be trivially parallelized. Here we illustrate its accuracy and efficiency on the examples of alanine dipeptide in vacuum and C-terminal β-hairpin of protein G in explicit solvent. In this latter example, our results indicate that the landscape of the protein is a triple funnel with two folded structures and one misfolded structure that are stabilized by H-bonds.
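The baseline scheme the paper builds on can be sketched in a few lines. The following is the conventional Metropolis-swap version of replica exchange on a toy one-dimensional double-well potential, not the rejection-free SSA variant or the infinite-swap limit proposed in the paper; the potential, temperatures, and step sizes are illustrative.

```python
# Sketch of standard REMD/parallel tempering (Monte Carlo moves stand in for
# MD): two replicas sample V(x) = (x^2 - 1)^2 at different temperatures and
# periodically attempt to exchange configurations.
import math, random

random.seed(0)
V = lambda x: (x * x - 1.0) ** 2      # double well, minima at x = +/-1
betas = [8.0, 1.0]                    # inverse temperatures (cold, hot)
xs = [-1.0, -1.0]                     # both replicas start in the left well

def mc_step(x, beta, width=0.5):
    """One Metropolis move of a single replica."""
    xp = x + random.uniform(-width, width)
    dE = V(xp) - V(x)
    if dE <= 0 or random.random() < math.exp(-beta * dE):
        return xp
    return x

visited_right = False
for sweep in range(5000):
    xs = [mc_step(x, b) for x, b in zip(xs, betas)]
    if sweep % 10 == 0:
        # Swap acceptance min(1, exp[(b1 - b2)(E1 - E2)]) preserves the
        # symmetrized joint distribution of the composite system.
        d = (betas[0] - betas[1]) * (V(xs[0]) - V(xs[1]))
        if d >= 0 or random.random() < math.exp(d):
            xs.reverse()              # exchange configurations
    if xs[0] > 0.5:                   # cold replica reached the right well
        visited_right = True

assert visited_right
```

The cold replica alone would cross the barrier only rarely (acceptance ~exp(-8) per attempt at the barrier top); the exchanges let it inherit barrier crossings from the hot replica, which is the sampling acceleration that the paper's continuous-time, rejection-free swap process then pushes to the infinite-swap limit.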
Evaluation of the OpenCL AES Kernel using the Intel FPGA SDK for OpenCL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jin, Zheming; Yoshii, Kazutomo; Finkel, Hal
The OpenCL standard is an open programming model for accelerating algorithms on heterogeneous computing systems. OpenCL extends the C-based programming language for developing portable codes on different platforms such as CPUs, graphics processing units (GPUs), digital signal processors (DSPs) and field programmable gate arrays (FPGAs). The Intel FPGA SDK for OpenCL is a suite of tools that allows developers to abstract away the complex FPGA-based development flow in favor of a high-level software development flow. Users can focus on the design of hardware-accelerated kernel functions in OpenCL and then direct the tools to generate the low-level FPGA implementations. The approach makes FPGA-based development more accessible to software users as the need for hybrid computing using CPUs and FPGAs increases. It can also significantly reduce the hardware development time, as users can evaluate different ideas in a high-level language without deep FPGA domain knowledge. In this report, we evaluate the performance of the kernel using the Intel FPGA SDK for OpenCL and a Nallatech 385A FPGA board. Compared to the M506 module, the board provides more hardware resources for a larger design exploration space. The kernel performance is measured with the compute kernel throughput, an upper bound on the FPGA throughput. The report presents the experimental results in detail. The appendix lists the kernel source code.
NASA Astrophysics Data System (ADS)
Klise, K. A.; Weissmann, G. S.; McKenna, S. A.; Tidwell, V. C.; Frechette, J. D.; Wawrzyniec, T. F.
2007-12-01
Solute plumes are believed to disperse in a non-Fickian manner due to small-scale heterogeneity and variable velocities that create preferential pathways. In order to accurately predict dispersion in naturally complex geologic media, the connection between heterogeneity and dispersion must be better understood. Since aquifer properties cannot be measured at every location, it is common to simulate small-scale heterogeneity with random field generators based on a two-point covariance (e.g., through use of sequential simulation algorithms). While these random fields can produce preferential flow pathways, it is unknown how well the results simulate solute dispersion through natural heterogeneous media. To evaluate the influence that complex heterogeneity has on dispersion, we utilize high-resolution terrestrial lidar to identify and model lithofacies from outcrop for application in particle tracking solute transport simulations using RWHet. The lidar scan data are used to produce a lab (meter) scale two-dimensional model that captures 2-8 mm scale natural heterogeneity. Numerical simulations utilize various methods to populate the outcrop structure captured by the lidar-based image with reasonable hydraulic conductivity values. The particle tracking simulations result in residence time distributions used to evaluate the nature of dispersion through complex media. Particle tracking simulations through conductivity fields produced from the lidar images are then compared to particle tracking simulations through hydraulic conductivity fields produced from sequential simulation algorithms. Based on this comparison, the study aims to quantify the difference in dispersion when using realistic and simplified representations of aquifer heterogeneity. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
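The particle-tracking idea can be illustrated with a hedged one-dimensional sketch (RWHet itself is far more general; the velocity fields, dispersion coefficient, and domain below are invented for illustration). Each particle is advected by the local velocity and perturbed by a Gaussian step, and the arrival times at the outlet form the residence-time distribution.

```python
# Sketch (illustrative, not RWHet): 1D random-walk particle tracking.
# A heterogeneous velocity field broadens the residence-time distribution
# relative to a homogeneous one.
import random

random.seed(42)
L, dt, D = 100.0, 0.1, 0.01           # domain length, time step, dispersion

def arrival_times(velocity, n_particles=200):
    """Advect + disperse particles from x=0 to x=L; return arrival times."""
    times = []
    for _ in range(n_particles):
        x, t = 0.0, 0.0
        while x < L:
            x += velocity(x) * dt + random.gauss(0.0, (2 * D * dt) ** 0.5)
            x = max(x, 0.0)           # reflect at the inlet
            t += dt
        times.append(t)
    return times

homog = arrival_times(lambda x: 1.0)
# Heterogeneous field: alternating fast and slow 10 m cells (note travel time
# is governed by the harmonic, not arithmetic, mean of the velocities).
heter = arrival_times(lambda x: 1.8 if int(x // 10) % 2 == 0 else 0.2)

def spread(ts):
    """Standard deviation of a residence-time distribution."""
    m = sum(ts) / len(ts)
    return (sum((t - m) ** 2 for t in ts) / len(ts)) ** 0.5

assert spread(heter) > spread(homog)  # heterogeneity broadens the RTD
```

Comparing such RTDs between realizations (here, homogeneous vs. layered) is the same kind of comparison the study makes between lidar-derived and covariance-based conductivity fields, only in 2D and with realistic geology.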
Characterization, Modeling, and Accelerating Emulation of Aircraft Coating Exposure and Degradation
2009-09-30
nucleation sites for conjugated polymer electrodeposition on AA 2024-T3. In particular, the role of secondary phase heterogeneities in the nucleation ...work is mainly contained in the MS Thesis of T. Chen, and was presented publicly on October 14, 2010 at the NACE Eastern Area Conference in
The dual-mode (partition/hole-filling) model of soil organic matter (SOM) as
a heterogeneous polymerlike sorbent of hydrophobic compounds predicts that a
competing solute will accelerate diffusion of the primary solute by blocking the
holes, allowing the principal ...
Calibration of an Unsteady Groundwater Flow Model for a Complex, Strongly Heterogeneous Aquifer
NASA Astrophysics Data System (ADS)
Curtis, Z. K.; Liao, H.; Li, S. G.; Phanikumar, M. S.; Lusch, D.
2016-12-01
Modeling of groundwater systems characterized by complex three-dimensional structure and heterogeneity remains a significant challenge. Most of today's groundwater models are developed based on relatively simple conceptual representations in favor of model calibratability. As more complexities are modeled, e.g., by adding more layers and/or zones, or introducing transient processes, more parameters have to be estimated and issues related to ill-posed groundwater problems and non-unique calibration arise. Here, we explore the use of an alternative conceptual representation for groundwater modeling that is fully three-dimensional and can capture complex 3D heterogeneity (both systematic and "random") without over-parameterizing the aquifer system. In particular, we apply Transition Probability (TP) geostatistics on high resolution borehole data from a water well database to characterize the complex 3D geology. Different aquifer material classes, e.g., `AQ' (aquifer material), `MAQ' (marginal aquifer material), `PCM' (partially confining material), and `CM' (confining material), are simulated, with the hydraulic properties of each material type as tuning parameters during calibration. The TP-based approach is applied to simulate unsteady groundwater flow in a large, complex, and strongly heterogeneous glacial aquifer system in Michigan across multiple spatial and temporal scales. The resulting model is calibrated to observed static water level data over a time span of 50 years. The results show that the TP-based conceptualization enables much more accurate and robust calibration/simulation than that based on conventional deterministic layer/zone based conceptual representations.
Simulating Mass Removal of Groundwater Contaminant Plumes with Complex and Simple Models
NASA Astrophysics Data System (ADS)
Lopez, J.; Guo, Z.; Fogg, G. E.
2016-12-01
Chlorinated solvents used in industrial, commercial, and other applications continue to pose significant threats to human health through contamination of groundwater resources. A recent National Research Council report concludes that it is unlikely that remediation of these complex sites will be achieved in a time frame of 50-100 years under current methods and standards (NRC, 2013). Pump and treat has been a common strategy at many sites to contain and treat groundwater contamination. At these sites, extensive retention of contaminant mass in low-permeability materials (tailing) has been observed after years or decades of pumping. Although transport models can be built that contain enough of the complex, 3D heterogeneity to simulate the tailing and long cleanup times, this is seldom done because of the large data and computational burdens. Hence, useful, reliable models to simulate various cleanup strategies are rare. The purpose of this study is to explore other potential ways to simulate the mass-removal processes with shorter time and less cost but still produce robust results by capturing effects of the heterogeneity and long-term retention of mass. A site containing a trichloroethylene groundwater plume was selected as the study area. The plume is located within alluvial sediments in the Tucson Basin. A fully heterogeneous domain is generated first and MODFLOW is used to simulate the flow field. Contaminant transport is simulated using both MT3D and RWHet for the fully heterogeneous model. Other approaches, including dual-domain mass transfer and heterogeneous chemical reactions, are then used to simulate the mass removal in a less heterogeneous, or homogeneous, domain, and results are compared to those obtained from the complex models. The capability of these simpler models to simulate remediation processes, especially to capture the late-time tailing, is examined.
Rodman R. Linn; Carolyn H. Sieg; Chad M. Hoffman; Judith L. Winterkamp; Joel D. McMillin
2013-01-01
We used a physics-based model, HIGRAD/FIRETEC, to explore changes in within-stand wind behavior and fire propagation associated with three time periods in pinyon-juniper woodlands following a drought-induced bark beetle outbreak and subsequent tree mortality. Pinyon-juniper woodland fuel complexes are highly heterogeneous. Trees often are clumped, with sparse patches...
Heterogeneous processes in CF4/O2 plasmas probed using laser-induced fluorescence of CF2
NASA Astrophysics Data System (ADS)
Hansen, S. G.; Luckman, G.; Nieman, George C.; Colson, Steven D.
1990-09-01
Laser-induced fluorescence of CF2 is used to monitor heterogeneous processes in ≊300 mTorr CF4/O2 plasmas. CF2 is rapidly removed at fluorinated copper and silver surfaces in 13.56-MHz rf discharges as judged by a distinct dip in its spatial distribution. These metals, when employed as etch masks, are known to accelerate plasma etching of silicon, and the present results suggest catalytic dehalogenation of CF2 is involved in this process. In contrast, aluminum and silicon dioxide exhibit negligible reactivity with CF2, which suggests that aluminum masks will not appreciably accelerate silicon etching and that ground state CF2 does not efficiently etch silicon dioxide. Measurement of CF2 decay in a pulsed discharge coupled with direct laser sputtering of metal into the gas phase indicates the interaction between CF2 and the active metals is purely heterogeneous. Aluminum does, however, exhibit homogeneous reactivity with CF2. Redistribution of active metal by plasma sputtering readily occurs; silicon etch rates may also be enhanced by the metal's presence on the silicon surface. Polymers contribute CF2 to the plasma as they etch. The observation of an induction period suggests fluorination of the polymer surface is the first step in its degradation. Polymeric etch masks can therefore depress the silicon etch rate by removal of F atoms, the primary etchants.
Accelerating the discovery of space-time patterns of infectious diseases using parallel computing.
Hohl, Alexander; Delmelle, Eric; Tang, Wenwu; Casas, Irene
2016-11-01
Infectious diseases have complex transmission cycles, and effective public health responses require the ability to monitor outbreaks in a timely manner. Space-time statistics facilitate the discovery of disease dynamics including rate of spread and seasonal cyclic patterns, but are computationally demanding, especially for datasets of increasing size, diversity and availability. High-performance computing reduces the effort required to identify these patterns; however, heterogeneity in the data must be accounted for. We develop an adaptive space-time domain decomposition approach for parallel computation of the space-time kernel density. We apply our methodology to individual reported dengue cases from 2010 to 2011 in the city of Cali, Colombia. The parallel implementation reaches significant speedup compared to its sequential counterpart. Density values are visualized in an interactive 3D environment, which facilitates the identification and communication of uneven space-time distribution of disease events. Our framework has the potential to enhance the timely monitoring of infectious diseases. Copyright © 2016 Elsevier Ltd. All rights reserved.
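The space-time kernel density at the heart of this approach can be sketched serially as follows (the parallel method decomposes the space-time domain across workers). The Epanechnikov kernels, bandwidths, toy case data, and simplified normalization are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Serial space-time kernel density estimate (STKDE) sketch with Epanechnikov
# kernels. Coordinates, bandwidths, and case data are illustrative, and the
# normalization omits kernel-specific constants for clarity.
def stkde(grid_xy, grid_t, cases, hs, ht):
    """Density at each (x, y) grid point and time from (x, y, t) case tuples."""
    out = np.zeros((len(grid_xy), len(grid_t)))
    for cx, cy, ct in cases:
        for i, (gx, gy) in enumerate(grid_xy):
            ds = np.hypot(gx - cx, gy - cy) / hs
            if ds >= 1.0:
                continue                        # outside the spatial bandwidth
            ks = 0.75 * (1.0 - ds ** 2)         # spatial Epanechnikov weight
            for j, gt in enumerate(grid_t):
                dt = abs(gt - ct) / ht
                if dt < 1.0:                    # inside the temporal bandwidth
                    out[i, j] += ks * 0.75 * (1.0 - dt ** 2)
    return out / (len(cases) * hs ** 2 * ht)

cases = [(0.0, 0.0, 1.0), (0.1, 0.1, 1.2), (2.0, 2.0, 5.0)]  # toy (x, y, t) cases
dens = stkde([(0.0, 0.0), (2.0, 2.0)], [1.0, 5.0], cases, hs=0.5, ht=1.0)
print(dens)  # high near case clusters, zero far from them in space-time
```

Because each case contributes only within its bandwidths, the grid can be split into space-time sub-domains and evaluated independently, which is what makes the domain-decomposition parallelization effective.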
OpenCL-based vicinity computation for 3D multiresolution mesh compression
NASA Astrophysics Data System (ADS)
Hachicha, Soumaya; Elkefi, Akram; Ben Amar, Chokri
2017-03-01
3D multiresolution mesh compression systems are still widely addressed in many domains. These systems increasingly require volumetric data to be processed in real time. Therefore, performance is becoming constrained by material resource usage and the overall computational time. In this paper, our contribution lies in computing, in real time, the triangle neighborhoods of 3D progressive meshes for a robust compression algorithm based on the scan-based wavelet transform (WT) technique. The originality of this algorithm is that it computes the WT with minimal memory usage by processing data as they are acquired. However, with large data, this technique is considered poor in terms of computational complexity. For that reason, this work exploits the GPU to accelerate the computation, using OpenCL as a heterogeneous programming language. Experiments demonstrate that, aside from the portability across various platforms and the flexibility guaranteed by the OpenCL-based implementation, this method achieves a speedup factor of 5 compared to the sequential CPU implementation.
Development Of A Parallel Performance Model For The THOR Neutral Particle Transport Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yessayan, Raffi; Azmy, Yousry; Schunert, Sebastian
The THOR neutral particle transport code enables simulation of complex geometries for various problems, from reactor simulations to nuclear non-proliferation. It is undergoing a thorough V&V effort requiring computational efficiency. This has motivated various improvements, including angular parallelization, outer iteration acceleration, and development of peripheral tools. To guide future improvements to the code's efficiency, better characterization of its parallel performance is useful. A parallel performance model (PPM) can be used to evaluate the benefits of modifications and to identify performance bottlenecks. Using INL's Falcon HPC, the PPM development incorporates an evaluation of network communication behavior over heterogeneous links and a functional characterization of the per-cell/angle/group runtime of each major code component. After evaluating several possible sources of variability, this resulted in a communication model and a parallel portion model. The former's accuracy is bounded by the variability of communication on Falcon, while the latter has an error on the order of 1%.
Differential Variance Analysis: a direct method to quantify and visualize dynamic heterogeneities
NASA Astrophysics Data System (ADS)
Pastore, Raffaele; Pesce, Giuseppe; Caggioni, Marco
2017-03-01
Many amorphous materials show spatially heterogeneous dynamics, as different regions of the same system relax at different rates. Such a signature, known as Dynamic Heterogeneity, has been crucial to understand the nature of the jamming transition in simple model systems and is currently considered very promising to characterize more complex fluids of industrial and biological relevance. Unfortunately, measurements of dynamic heterogeneities typically require sophisticated experimental set-ups and are performed by few specialized groups. It is now possible to quantitatively characterize the relaxation process and the emergence of dynamic heterogeneities using a straightforward method, here validated on video microscopy data of hard-sphere colloidal glasses. We call this method Differential Variance Analysis (DVA), since it focuses on the variance of the differential frames, obtained by subtracting images at different time-lags. Moreover, direct visualization of dynamic heterogeneities naturally appears in the differential frames when the time-lag is set to the one corresponding to the maximum dynamic susceptibility. This approach opens the way to effectively characterize and tailor a wide variety of soft materials, from complex formulated products to biological tissues.
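The DVA recipe described above is simple enough to sketch directly: compute differential frames at a set of time lags and track their variance. The synthetic "movie" below (a pixelwise random walk) and the chosen lags are stand-ins for real video-microscopy data, not the authors' dataset.

```python
import numpy as np

# Differential Variance Analysis (DVA) sketch: the relaxation signal is the
# variance of the differential frames I(t + lag) - I(t), pooled over all start
# times t and pixels. A pixelwise random walk stands in for microscopy video.
rng = np.random.default_rng(1)
frames = np.cumsum(rng.normal(size=(50, 32, 32)), axis=0)

def dva(frames, lags):
    """Variance of frame differences as a function of time lag."""
    return np.array([(frames[lag:] - frames[:-lag]).var() for lag in lags])

lags = [1, 2, 5, 10, 20]
signal = dva(frames, lags)
# For a random-walk movie the differential variance grows with the lag,
# mirroring how a real sample decorrelates as the delay increases.
print(signal)
```

In a real measurement, the lag at which this curve saturates sets the structural relaxation time, and inspecting the differential frame at the lag of maximum dynamic susceptibility reveals the heterogeneous regions directly.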
NASA Astrophysics Data System (ADS)
Yin, Xiang-Chu; Yu, Huai-Zhong; Kukshenko, Victor; Xu, Zhao-Yong; Wu, Zhishen; Li, Min; Peng, Keyin; Elizarov, Surgey; Li, Qi
2004-12-01
In order to verify some precursors such as LURR (Load/Unload Response Ratio) and AER (Accelerating Energy Release) before large earthquakes or macro-fracture in heterogeneous brittle media, four acoustic emission experiments involving large rock specimens under tri-axial stress, have been conducted. The specimens were loaded in two ways: monotonous or cycling. The experimental results confirm that LURR and AER are precursors of macro-fracture in brittle media. A new measure called the state vector has been proposed to describe the damage evolution of loaded rock specimens.
Numerical Simulation of the Anomalous Transport of High-Energy Cosmic Rays in Galactic Superbubble
NASA Technical Reports Server (NTRS)
Barghouty, A. F.; Price, E. M.; MeWaldt, R. A.
2013-01-01
A continuous-time random-walk (CTRW) model to simulate the transport and acceleration of high-energy cosmic rays in galactic superbubbles has recently been put forward (Barghouty & Schnee 2012). The new model has been developed to simulate and highlight signatures of anomalous transport on particles' evolution and their spectra in a multi-shock, collective acceleration context. The superbubble is idealized as a heterogeneous region of particle sources and sinks bounded by a random surface. This work concentrates on the effects of the bubble's assumed astrophysical characteristics (e.g., geometry and roughness) on the particles' spectra.
NASA Astrophysics Data System (ADS)
Badalyan, A. M.; Bakhturova, L. F.; Kaichev, V. V.; Polyakov, O. V.; Pchelyakov, O. P.; Smirnov, G. I.
2011-09-01
A new technique for depositing thin nanostructured layers on semiconductor and insulating substrates that is based on heterogeneous gas-phase synthesis from low-dimensional volatile metal complexes is suggested and tried out. Thin nanostructured copper layers are deposited on silicon and quartz substrates from low-dimensional formate complexes using a combined synthesis-mass transport process. It is found that copper in layers thus deposited is largely in a metal state (Cu0) and has the form of closely packed nanograins with a characteristic structure.
2007-06-15
technology prize competitions have been used since the 18th century to spur innovation and advance the development of complex and slowly maturing disruptive ... technologies. The Defense Advanced Research Projects Agency (DARPA) has used advanced technology competitions in 2004 and 2005 to rapidly accelerate the
Minireview: Genetic basis of heterogeneity and severity in sickle cell disease
Habara, Alawi
2016-01-01
Sickle cell disease, a common single gene disorder, has a complex pathophysiology that at its root is initiated by the polymerization of deoxy sickle hemoglobin. Sickle vasoocclusion and hemolytic anemia drive the development of disease complications. In this review, we focus on the genetic modifiers of disease heterogeneity. The phenotypic heterogeneity of disease is only partially explained by genetic variability of fetal hemoglobin gene expression and co-inheritance of α thalassemia. Given the complexity of pathophysiology, many different definitions of severity are possible complicating a full understanding of its genetic foundation. The pathophysiological complexity and the interlocking nature of the biological processes underpinning disease severity are becoming better understood. Nevertheless, useful genetic signatures of severity, regardless of how this is defined, are insufficiently developed to be used for treatment decisions and for counseling. PMID:26936084
Moise, Nicolae; Moya, Ismaël
2004-06-28
We report the first direct decomposition of the fluorescence lifetime heterogeneity during multiphasic fluorescence induction in dark-adapted leaves by multi-frequency phase and modulation fluorometry (PMF). A very fast component, assigned to photosystem I (PSI), was found to be constant in lifetime and yield, whereas the two slow components, which are strongly affected by the closure of the reaction centers by light, were assigned to PSII. Based on a modified "reversible radical pair" kinetic model with three compartments, we showed that a loosely connected pigment complex, which is assumed to be the CP47 complex, plays a specific role with respect to the structure and function of the PSII: (i) it explains the heterogeneity of PSII fluorescence lifetime as a compartmentation of excitation energy in the antenna, (ii) it is the site of a conformational change in the first second of illumination, and (iii) it is involved in the mechanisms of nonphotochemical quenching (NPQ). On the basis of the multi-frequency PMF analysis, we reconciled two apparently antagonistic aspects of chlorophyll a fluorescence in vivo: it is heterogeneous with respect to the kinetic structure (several lifetime components) and homogeneous with respect to average quantities (quasi-linear mean τ-Φ relationship).
Modification of heterogeneous chemistry by complex substrate morphology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henson, B.F.; Buelow, S.J.; Robinson, J.M.
1998-12-31
This is the final report of a one-year, Laboratory Directed Research and Development (LDRD) project at Los Alamos National Laboratory (LANL). Chemistry in many environmental systems is determined at some stage by heterogeneous reaction with a surface. Typically the surface exists as a dispersion or matrix of particulate matter or pores, and a determination of the heterogeneous chemistry of the system must address the extent to which the complexity of the environmental surface affects the reaction rates. Reactions that are of current interest are the series of chlorine nitrate reactions important in polar ozone depletion. The authors have applied surface spectroscopic techniques developed at LANL to address the chemistry of chlorine nitrate reactions on porous nitric and sulfuric acid ice surfaces as a model study of the measurement of complex, heterogeneous reaction rates. The result of the study is an experimental determination of the surface coverage of one adsorbed reagent and a mechanism of reactivity based on the dependence of this coverage on temperature and vapor pressure. The resulting mechanism allows the first comprehensive modeling of chlorine nitrate reaction probability data from several laboratories.
GPU-accelerated Tersoff potentials for massively parallel Molecular Dynamics simulations
NASA Astrophysics Data System (ADS)
Nguyen, Trung Dac
2017-03-01
The Tersoff potential is one of the empirical many-body potentials that has been widely used in simulation studies at atomic scales. Unlike pair-wise potentials, the Tersoff potential involves three-body terms, which require much more arithmetic operations and data dependency. In this contribution, we have implemented the GPU-accelerated version of several variants of the Tersoff potential for LAMMPS, an open-source massively parallel Molecular Dynamics code. Compared to the existing MPI implementation in LAMMPS, the GPU implementation exhibits a better scalability and offers a speedup of 2.2X when run on 1000 compute nodes on the Titan supercomputer. On a single node, the speedup ranges from 2.0 to 8.0 times, depending on the number of atoms per GPU and hardware configurations. The most notable features of our GPU-accelerated version include its design for MPI/accelerator heterogeneous parallelism, its compatibility with other functionalities in LAMMPS, its ability to give deterministic results and to support both NVIDIA CUDA- and OpenCL-enabled accelerators. Our implementation is now part of the GPU package in LAMMPS and accessible for public use.
NASA Astrophysics Data System (ADS)
Yang, Hongyong; Han, Fujun; Zhao, Mei; Zhang, Shuning; Yue, Jun
2017-08-01
Because many networked systems can only be characterized with fractional-order dynamics in complex environments, fractional-order calculus has been studied deeply recently. When diverse individual features are shown in different agents of networked systems, heterogeneous fractional-order dynamics will be used to describe the complex systems. Based on the distinguishing properties of agents, heterogeneous fractional-order multi-agent systems (FOMAS) are presented. With the supposition of multiple leader agents in FOMAS, distributed containment control of FOMAS is studied in directed weighted topologies. By applying Laplace transformation and frequency domain theory of the fractional-order operator, an upper bound of delays is obtained to ensure containment consensus of delayed heterogeneous FOMAS. Consensus results of delayed FOMAS in this paper can be extended to systems with integer-order models. Finally, numerical examples are used to verify our results.
McOmish, Caitlin E; Burrows, Emma L; Hannan, Anthony J
2014-10-01
Psychiatric disorders affect a substantial proportion of the population worldwide. This high prevalence, combined with the chronicity of the disorders and the major social and economic impacts, creates a significant burden. As a result, an important priority is the development of novel and effective interventional strategies for reducing incidence rates and improving outcomes. This review explores the progress that has been made to date in establishing valid animal models of psychiatric disorders, while beginning to unravel the complex factors that may be contributing to the limitations of current methodological approaches. We propose some approaches for optimizing the validity of animal models and developing effective interventions. We use schizophrenia and autism spectrum disorders as examples of disorders for which development of valid preclinical models, and fully effective therapeutics, have proven particularly challenging. However, the conclusions have relevance to various other psychiatric conditions, including depression, anxiety and bipolar disorders. We address the key aspects of construct, face and predictive validity in animal models, incorporating genetic and environmental factors. Our understanding of psychiatric disorders is accelerating exponentially, revealing extraordinary levels of genetic complexity, heterogeneity and pleiotropy. The environmental factors contributing to individual, and multiple, disorders also exhibit breathtaking complexity, requiring systematic analysis to experimentally explore the environmental mediators and modulators which constitute the 'envirome' of each psychiatric disorder. Ultimately, genetic and environmental factors need to be integrated via animal models incorporating the spatiotemporal complexity of gene-environment interactions and experience-dependent plasticity, thus better recapitulating the dynamic nature of brain development, function and dysfunction. © 2014 The British Pharmacological Society.
Socio-Economic Profiles of Selected Ethnic/Visible Minority Groups--1981 Census.
ERIC Educational Resources Information Center
Department of the Secretary of State, Ottawa (Ontario). Multiculturalism Directorate.
In Canada today no single ethnocultural group makes up a majority of the population. Increases in the number of immigrants, especially people from Third World nations, continue to accelerate Canada's ethnocultural heterogeneity. A new descriptive term, "visible minority," is now used to describe persons who are non-white, and distinct…
NASA Astrophysics Data System (ADS)
Christou, Michalis; Christoudias, Theodoros; Morillo, Julián; Alvarez, Damian; Merx, Hendrik
2016-09-01
We examine an alternative approach to heterogeneous cluster-computing in the many-core era for Earth system models, using the European Centre for Medium-Range Weather Forecasts Hamburg (ECHAM)/Modular Earth Submodel System (MESSy) Atmospheric Chemistry (EMAC) model as a pilot application on the Dynamical Exascale Entry Platform (DEEP). A set of autonomous coprocessors interconnected together, called Booster, complements a conventional HPC Cluster and increases its computing performance, offering extra flexibility to expose multiple levels of parallelism and achieve better scalability. The EMAC model atmospheric chemistry code (Module Efficiently Calculating the Chemistry of the Atmosphere (MECCA)) was taskified with an offload mechanism implemented using OmpSs directives. The model was ported to the MareNostrum 3 supercomputer to allow testing with Intel Xeon Phi accelerators on a production-size machine. The changes proposed in this paper are expected to contribute to the eventual adoption of Cluster-Booster division and Many Integrated Core (MIC) accelerated architectures in presently available implementations of Earth system models, towards exploiting the potential of a fully Exascale-capable platform.
Hardware accelerated high performance neutron transport computation based on AGENT methodology
NASA Astrophysics Data System (ADS)
Xiao, Shanjie
The spatial heterogeneity of next-generation Gen-IV nuclear reactor core designs brings challenges to neutron transport analysis. The Arbitrary Geometry Neutron Transport (AGENT) code is a three-dimensional neutron transport analysis code being developed at the Laboratory for Neutronics and Geometry Computation (NEGE) at Purdue University. It can accurately describe spatial heterogeneity in a hierarchical structure through the R-function solid modeler. The previous version of AGENT coupled the 2D transport MOC solver and the 1D diffusion NEM solver to solve the three-dimensional Boltzmann transport equation. In this research, the 2D/1D coupling methodology was expanded to couple two transport solvers, the radial 2D MOC solver and the axial 1D MOC solver, for better accuracy. The expansion was benchmarked against the widely applied C5G7 benchmark models and two fast breeder reactor models, and showed good agreement with the reference Monte Carlo results. In practice, accurate neutron transport analysis for a full reactor core is still time-consuming, which limits its application. Therefore, a second focus of this research is the design of specialized hardware, based on reconfigurable computing techniques, to accelerate AGENT computations. This is the first application of this kind in reactor physics and neutron transport for reactor design. The most time-consuming part of the AGENT algorithm was identified, and the architecture of the AGENT acceleration system was designed based on this analysis. Through parallel computation on the specially designed, highly efficient architecture, the FPGA-based acceleration design achieves high performance at a much lower working frequency than CPUs. Design simulations show that the acceleration system would be able to speed up large-scale AGENT computations by about 20 times.
The high-performance AGENT acceleration system will drastically shorten the computation time for 3D full-core neutron transport analysis, making the AGENT methodology unique and advantageous, and thus extending the applicability of neutron transport analysis in both industrial engineering and academic research.
Unsupervised data mining in nanoscale x-ray spectro-microscopic study of NdFeB magnet
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duan, Xiaoyue; Yang, Feifei; Antono, Erin
2016-09-29
Novel developments in X-ray based spectro-microscopic characterization techniques have increased the rate of acquisition of spatially resolved spectroscopic data by several orders of magnitude over what was possible a few years ago. This accelerated data acquisition, with high spatial resolution at the nanoscale and sensitivity to subtle differences in chemistry and atomic structure, provides a unique opportunity to investigate hierarchically complex and structurally heterogeneous systems found in functional devices and materials systems. However, handling and analyzing the large volume of data generated poses significant challenges. Here we apply an unsupervised data-mining algorithm known as DBSCAN to study a rare-earth-element based permanent magnet material, Nd2Fe14B. We are able to reduce a large spectro-microscopic dataset of over 300,000 spectra to 3, preserving much of the underlying information. Scientists can easily and quickly analyze in detail three characteristic spectra. Our approach can rapidly provide a concise representation of a large and complex dataset to materials scientists and chemists. For instance, it shows that the surface of the common Nd2Fe14B magnet is chemically and structurally very different from the bulk, suggesting a possible surface alteration effect, perhaps due to corrosion, which could affect the material's overall properties.
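The reduction step can be illustrated with a toy version of the workflow: cluster many spectra with DBSCAN and keep one mean spectrum per cluster. The minimal pure-NumPy DBSCAN and the synthetic three-peak spectra below are assumptions for illustration, not the authors' pipeline.

```python
import numpy as np

# Toy DBSCAN workflow: cluster 300 synthetic spectra (three Gaussian peak
# shapes plus noise) and reduce them to one mean spectrum per cluster.
rng = np.random.default_rng(2)
x = np.linspace(0, 1, 50)
templates = [np.exp(-((x - c) ** 2) / 0.005) for c in (0.25, 0.5, 0.75)]
spectra = np.array([templates[i % 3] + rng.normal(0, 0.01, 50) for i in range(300)])

def dbscan(X, eps, min_pts):
    """Minimal DBSCAN over Euclidean distances; returns -1 for noise points."""
    n = len(X)
    dist = np.linalg.norm(X[:, None] - X[None, :], axis=2)
    labels = np.full(n, -1)
    cluster = 0
    for i in range(n):
        if labels[i] != -1:
            continue
        seeds = list(np.flatnonzero(dist[i] < eps))
        if len(seeds) < min_pts:
            continue                       # noise (may be claimed as border later)
        labels[i] = cluster
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster
                nbrs = np.flatnonzero(dist[j] < eps)
                if len(nbrs) >= min_pts:   # core point: keep expanding the cluster
                    seeds.extend(nbrs)
        cluster += 1
    return labels

labels = dbscan(spectra, eps=0.2, min_pts=5)
representatives = [spectra[labels == c].mean(axis=0) for c in range(labels.max() + 1)]
print(len(representatives))   # a few characteristic spectra summarize the dataset
```

Each representative spectrum summarizes one chemically distinct region, which is the sense in which hundreds of thousands of measured spectra can collapse to a handful of characteristic ones.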
Fermilab proton accelerator complex status and improvement plans
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shiltsev, Vladimir
2017-05-30
Fermilab carries out an extensive program of accelerator-based high energy particle physics research at the Intensity Frontier that relies on the operation of 8 GeV and 120 GeV proton beamlines for a number of fixed-target experiments. Routine operation with a world-record 700 kW of average 120 GeV beam power on the neutrino target was achieved in 2017 as the result of the Proton Improvement Plan (PIP) upgrade. There are plans to further increase the power to 900-1000 kW. The next major upgrade of the FNAL accelerator complex, called PIP-II, is under development. It aims at 1.2 MW beam power on target at the start of the LBNF/DUNE experiment in the middle of the next decade and assumes replacement of the existing 40-year-old 400 MeV normal-conducting Linac with a modern 800 MeV superconducting RF linear accelerator. There are several concepts to further double the beam power to >2.4 MW after replacement of the existing 8 GeV Booster synchrotron. In this article we discuss the current performance of the Fermilab proton accelerator complex, the upgrade plans for the next two decades, and the accelerator R&D program to address cost and performance risks for these upgrades.
Accelerating DNA analysis applications on GPU clusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tumeo, Antonino; Villa, Oreste
DNA analysis is an emerging application of high performance bioinformatics. Modern sequencing machines are able to provide, in a few hours, large input streams of data that need to be matched against exponentially growing databases of known fragments. The ability to recognize these patterns effectively and quickly may allow extending the scale and the reach of the investigations performed by biology scientists. Aho-Corasick is an exact, multiple-pattern matching algorithm often at the base of this application. High performance systems are a promising platform to accelerate this algorithm, which is computationally intensive but also inherently parallel. Nowadays, high performance systems also include heterogeneous processing elements, such as Graphics Processing Units (GPUs), to further accelerate parallel algorithms. Unfortunately, the Aho-Corasick algorithm exhibits large performance variability, depending on the size of the input streams, the number of patterns to search for, and the number of matches, and poses significant challenges for current high performance software and hardware implementations. An adequate mapping of the algorithm onto the target architecture, coping with the limits of the underlying hardware, is required to reach the desired high throughputs. Load balancing also plays a crucial role when considering the limited bandwidth among the nodes of these systems. In this paper we present an efficient implementation of the Aho-Corasick algorithm for high performance clusters accelerated with GPUs. We discuss how we partitioned and adapted the algorithm to fit the Tesla C1060 GPU and then present an MPI-based implementation for a heterogeneous high performance cluster. We compare this implementation to MPI and MPI-with-pthreads implementations for a homogeneous cluster of x86 processors, discussing the stability versus the performance and the scaling of the solutions, taking into consideration aspects such as the bandwidth among the different nodes.
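For reference, the Aho-Corasick automaton at the core of this application can be sketched in plain Python. This is a serial reference version for illustration, not the GPU/MPI implementation described in the paper; the DNA patterns are made up.

```python
from collections import deque

def build_aho_corasick(patterns):
    """Build goto/fail/output tables for multi-pattern matching."""
    goto = [{}]          # goto[state][char] -> next state
    fail = [0]           # failure links (longest proper suffix state)
    output = [set()]     # patterns ending at each state
    for pat in patterns:
        s = 0
        for ch in pat:
            if ch not in goto[s]:
                goto.append({}); fail.append(0); output.append(set())
                goto[s][ch] = len(goto) - 1
            s = goto[s][ch]
        output[s].add(pat)
    # BFS from the root to compute failure links and merge outputs.
    q = deque(goto[0].values())
    while q:
        s = q.popleft()
        for ch, t in goto[s].items():
            q.append(t)
            f = fail[s]
            while f and ch not in goto[f]:
                f = fail[f]
            fail[t] = goto[f].get(ch, 0)
            output[t] |= output[fail[t]]
    return goto, fail, output

def search(text, tables):
    """Scan text once, reporting (start_index, pattern) for every match."""
    goto, fail, output = tables
    s, hits = 0, []
    for i, ch in enumerate(text):
        while s and ch not in goto[s]:
            s = fail[s]
        s = goto[s].get(ch, 0)
        for pat in output[s]:
            hits.append((i - len(pat) + 1, pat))
    return hits

tables = build_aho_corasick(["ACGT", "CG", "GTA"])
print(sorted(search("ACGTA", tables)))  # → [(0, 'ACGT'), (1, 'CG'), (2, 'GTA')]
```

The single left-to-right scan with failure links is what makes the algorithm attractive for streaming genomic input: runtime is linear in text length plus the number of matches, independent of the number of patterns.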
NASA Astrophysics Data System (ADS)
Liu, Peng; Ju, Yang; Gao, Feng; Ranjith, Pathegama G.; Zhang, Qianbing
2018-03-01
Understanding and characterization of the three-dimensional (3-D) propagation and distribution of hydrofracturing cracks in heterogeneous rock are key for enhancing the stimulation of low-permeability petroleum reservoirs. In this study, we investigated the propagation and distribution characteristics of hydrofracturing cracks by conducting true triaxial hydrofracturing tests and computed tomography on artificial heterogeneous rock specimens. Silica sand, Portland cement, and aedelforsite were mixed to create artificial heterogeneous rock specimens using the mineral compositions, coarse gravel distribution, and mechanical properties measured from natural heterogeneous glutenite cores. To probe the effects of material heterogeneity on hydrofracturing cracks, artificial homogeneous specimens were created using the identical matrix composition of the heterogeneous rock specimens and then fractured for comparison. The effects of the horizontal geostress ratio on the 3-D growth and distribution of cracks during hydrofracturing were examined. A fractal-based method was proposed to characterize the complexity of fractures and the efficiency of hydrofracturing stimulation of heterogeneous media. The material heterogeneity and horizontal geostress ratio were found to significantly influence the 3-D morphology, growth, and distribution of hydrofracturing cracks. A horizontal geostress ratio of 1.7 appears to be the upper limit for the occurrence of multiple cracks, and higher ratios cause a single crack perpendicular to the minimum horizontal geostress component. The fracturing efficiency is associated with not only the fractured volume but also the complexity of the crack network.
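The paper's exact fractal-based measure is not given here; a common way to quantify the complexity of a crack pattern is the box-counting dimension of a binary image, sketched below on synthetic masks (the estimator details and test shapes are assumptions, not the authors' method).

```python
import numpy as np

def box_count_dimension(mask, sizes=(1, 2, 4, 8, 16)):
    """Estimate the fractal dimension of a binary pattern by box counting:
    count boxes of side s containing any crack pixel, then fit
    log N(s) ~ -D log s."""
    counts = []
    n = mask.shape[0]
    for s in sizes:
        # Partition the image into s x s boxes and count non-empty ones.
        trimmed = mask[: n - n % s, : n - n % s]
        boxes = trimmed.reshape(n // s, s, n // s, s).any(axis=(1, 3))
        counts.append(np.count_nonzero(boxes))
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope

# Sanity checks: a filled square region gives dimension ~2, a line ~1.
square = np.zeros((64, 64), dtype=bool)
square[16:48, 16:48] = True
line = np.zeros((64, 64), dtype=bool)
line[32, :] = True
print(round(box_count_dimension(square), 2), round(box_count_dimension(line), 2))
```

A complex branching crack network would score between these extremes, which is why a fractal dimension can serve as a compact complexity index for comparing stimulation outcomes.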
A weighted U statistic for association analyses considering genetic heterogeneity.
Wei, Changshuai; Elston, Robert C; Lu, Qing
2016-07-20
Converging evidence suggests that common complex diseases with the same or similar clinical manifestations could have different underlying genetic etiologies. While current research interests have shifted toward uncovering rare variants and structural variations predisposing to human diseases, the impact of heterogeneity in genetic studies of complex diseases has been largely overlooked. Most of the existing statistical methods assume the disease under investigation has a homogeneous genetic effect and could, therefore, have low power if the disease undergoes heterogeneous pathophysiological and etiological processes. In this paper, we propose a heterogeneity-weighted U (HWU) method for association analyses considering genetic heterogeneity. HWU can be applied to various types of phenotypes (e.g., binary and continuous) and is computationally efficient for high-dimensional genetic data. Through simulations, we showed the advantage of HWU when the underlying genetic etiology of a disease was heterogeneous, as well as the robustness of HWU against different model assumptions (e.g., phenotype distributions). Using HWU, we conducted a genome-wide analysis of nicotine dependence from the Study of Addiction: Genetics and Environments dataset. The genome-wide analysis of nearly one million genetic markers took 7 h, identifying heterogeneous effects of two new genes (i.e., CYP3A5 and IKBKB) on nicotine dependence. Copyright © 2016 John Wiley & Sons, Ltd.
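A weighted U statistic of the general kind described above pairs a phenotype-similarity kernel with genotype-derived weights and sums over pairs of individuals. The sketch below is a simplified illustration of that idea with a permutation p-value; the weighting scheme, kernel, and simulated data are assumptions, not the published HWU estimator.

```python
import numpy as np

def weighted_u(genotypes, phenotypes):
    """Simplified weighted U: phenotype-similarity kernel weighted by
    genotype similarity, summed over distinct pairs (a sketch of the
    HWU idea, not the published estimator)."""
    g = (genotypes - genotypes.mean(axis=0)) / (genotypes.std(axis=0) + 1e-12)
    w = g @ g.T / g.shape[1]                 # genotype-similarity weights
    y = (phenotypes - phenotypes.mean()) / phenotypes.std()
    k = np.outer(y, y)                       # phenotype-similarity kernel
    mask = ~np.eye(len(y), dtype=bool)       # exclude i == j terms
    return (w * k)[mask].sum()

def permutation_pvalue(genotypes, phenotypes, n_perm=500, seed=0):
    """Assess significance by permuting phenotypes over individuals."""
    rng = np.random.default_rng(seed)
    observed = weighted_u(genotypes, phenotypes)
    null = [weighted_u(genotypes, rng.permutation(phenotypes))
            for _ in range(n_perm)]
    return (1 + sum(u >= observed for u in null)) / (1 + n_perm)

# Simulated association: phenotype driven by the first variant.
rng = np.random.default_rng(1)
geno = rng.integers(0, 3, size=(120, 10)).astype(float)
pheno = 1.5 * geno[:, 0] + rng.standard_normal(120)
p = permutation_pvalue(geno, pheno)
print(p)
```

Because the statistic is a sum over pairwise similarities rather than a single regression coefficient, it can remain powerful when different subgroups carry different risk variants, which is the heterogeneity scenario the method targets.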
Heterogeneously Assembled Metamaterials and Metadevices via 3D Modular Transfer Printing
NASA Astrophysics Data System (ADS)
Lee, Seungwoo; Kang, Byungsoo; Keum, Hohyun; Ahmed, Numair; Rogers, John A.; Ferreira, Placid M.; Kim, Seok; Min, Bumki
2016-06-01
Metamaterials have made the exotic control of the flow of electromagnetic waves possible, which is difficult to achieve with natural materials. In recent years, the emergence of functional metadevices has shown immense potential for the practical realization of highly efficient photonic devices. However, complex and heterogeneous architectures that enable diverse functionalities of metamaterials and metadevices have been challenging to realize because of the limited manufacturing capabilities of conventional fabrication methods. Here, we show that three-dimensional (3D) modular transfer printing can be used to construct diverse metamaterials in complex 3D architectures on universal substrates, which is attractive for achieving on-demand photonic properties. Few repetitive processing steps and rapid constructions are additional advantages of 3D modular transfer printing. Thus, this method provides a fascinating route to generate flexible and stretchable 2D/3D metamaterials and metadevices with heterogeneous material components, complex device architectures, and diverse functionalities.
Heterogeneously Assembled Metamaterials and Metadevices via 3D Modular Transfer Printing.
Lee, Seungwoo; Kang, Byungsoo; Keum, Hohyun; Ahmed, Numair; Rogers, John A; Ferreira, Placid M; Kim, Seok; Min, Bumki
2016-06-10
Metamaterials have made the exotic control of the flow of electromagnetic waves possible, which is difficult to achieve with natural materials. In recent years, the emergence of functional metadevices has shown immense potential for the practical realization of highly efficient photonic devices. However, complex and heterogeneous architectures that enable diverse functionalities of metamaterials and metadevices have been challenging to realize because of the limited manufacturing capabilities of conventional fabrication methods. Here, we show that three-dimensional (3D) modular transfer printing can be used to construct diverse metamaterials in complex 3D architectures on universal substrates, which is attractive for achieving on-demand photonic properties. Few repetitive processing steps and rapid constructions are additional advantages of 3D modular transfer printing. Thus, this method provides a fascinating route to generate flexible and stretchable 2D/3D metamaterials and metadevices with heterogeneous material components, complex device architectures, and diverse functionalities.
Guided and magnetic self-assembly of tunable magnetoceptive gels
NASA Astrophysics Data System (ADS)
Tasoglu, S.; Yu, C. H.; Gungordu, H. I.; Guven, S.; Vural, T.; Demirci, U.
2014-09-01
Self-assembly of components into complex functional patterns at the microscale is common in nature, and is used increasingly in numerous disciplines such as optoelectronics, microfabrication, sensors, tissue engineering and computation. Here, we describe the use of stable radicals to guide the self-assembly of magnetically tunable gels, which we call ‘magnetoceptive’ materials, at the scale of hundreds of microns to a millimeter, each of which can be programmed by shape and composition into heterogeneous complex structures. Using the paramagnetism of free radicals as a driving mechanism, complex heterogeneous structures are built in the magnetic field generated by permanent magnets. The overall magnetic signature of the final structure is erased via the antioxidant vitamin E, subsequent to guided self-assembly. We demonstrate the unique capabilities of radicals and antioxidants in the fabrication of soft systems with heterogeneity in material properties, such as porosity, elastic modulus and mass density; then in bottom-up tissue engineering; and finally, in the levitational and selective assembly of microcomponents.
Heterogeneously Assembled Metamaterials and Metadevices via 3D Modular Transfer Printing
Lee, Seungwoo; Kang, Byungsoo; Keum, Hohyun; Ahmed, Numair; Rogers, John A.; Ferreira, Placid M.; Kim, Seok; Min, Bumki
2016-01-01
Metamaterials have made the exotic control of the flow of electromagnetic waves possible, which is difficult to achieve with natural materials. In recent years, the emergence of functional metadevices has shown immense potential for the practical realization of highly efficient photonic devices. However, complex and heterogeneous architectures that enable diverse functionalities of metamaterials and metadevices have been challenging to realize because of the limited manufacturing capabilities of conventional fabrication methods. Here, we show that three-dimensional (3D) modular transfer printing can be used to construct diverse metamaterials in complex 3D architectures on universal substrates, which is attractive for achieving on-demand photonic properties. Few repetitive processing steps and rapid constructions are additional advantages of 3D modular transfer printing. Thus, this method provides a fascinating route to generate flexible and stretchable 2D/3D metamaterials and metadevices with heterogeneous material components, complex device architectures, and diverse functionalities. PMID:27283594
Guided and magnetic self-assembly of tunable magnetoceptive gels
Tasoglu, S.; Yu, C.H.; Gungordu, H.I.; Guven, S.; Vural, T.; Demirci, U.
2014-01-01
Self-assembly of components into complex functional patterns at the microscale is common in nature, and is used increasingly in numerous disciplines such as optoelectronics, microfabrication, sensors, tissue engineering and computation. Here, we describe the use of stable radicals to guide the self-assembly of magnetically tunable gels, which we call ‘magnetoceptive’ materials, at the scale of hundreds of microns to a millimeter, each of which can be programmed by shape and composition into heterogeneous complex structures. Using the paramagnetism of free radicals as a driving mechanism, complex heterogeneous structures are built in the magnetic field generated by permanent magnets. The overall magnetic signature of the final structure is erased via the antioxidant vitamin E, subsequent to guided self-assembly. We demonstrate the unique capabilities of radicals and antioxidants in the fabrication of soft systems with heterogeneity in material properties, such as porosity, elastic modulus and mass density; then in bottom-up tissue engineering; and finally, in the levitational and selective assembly of microcomponents. PMID:25175148
NASA Astrophysics Data System (ADS)
Xu, Jun; Zhang, Yuanhang; Wang, Wei
2006-12-01
The air quality model CMAQ-MADRID (Community Multiscale Air Quality-Model of Aerosol Dynamics, Reaction, Ionization and Dissolution) was employed to simulate summer O3 formation in Beijing, China, in order to explore the impacts of four heterogeneous reactions on O3 formation in an urban area. The results showed that the impacts were obvious and exhibited the characteristics of a typical response of a VOC-limited regime in the urban area. Of the four heterogeneous reactions considered, the NO2 and HO2 heterogeneous reactions have the most severe impacts on O3 formation. During the O3 formation period, the NO2 heterogeneous reaction increased new radical creation by 30%, raising the atmospheric activity as more NO→NO2 conversion occurred, thus causing O3 to rise. The increase of the O3 peak concentration reached a maximum value of 67 ppb in the urban area. In the morning hours, high NO titration reduced the effect of the photolysis of HONO, which was produced heterogeneously at night in the surface layer. The NO2 heterogeneous reaction in the daytime is likely one of the major reasons for the O3 increase in the Beijing urban area. The HO2 heterogeneous reaction accelerated radical termination, resulting in a decrease of the radical concentration by 44% at most. The O3 peak concentration decreased by a maximum amount of 24 ppb in the urban area. The simulation results were improved when the heterogeneous reactions were included, with the O3 and HONO model results closer to the observations.
Overview of Accelerator Applications in Energy
NASA Astrophysics Data System (ADS)
Garnett, Robert W.; Sheffield, Richard L.
An overview of the application of accelerators and accelerator technology in energy is presented. Applications span a broad range of cost, size, and complexity and include large-scale systems requiring high-power or high-energy accelerators to drive subcritical reactors for energy production or waste transmutation, as well as small-scale industrial systems used to improve oil and gas exploration and production. The enabling accelerator technologies will also be reviewed and future directions discussed.
Li, Wei; Orozco, Ruben; Camargos, Natalia; Liu, Haizhou
2017-04-04
Persulfate (S2O8^2-)-based in situ chemical oxidation (ISCO) has gained increasing attention in recent years due to the generation of the highly reactive and selective sulfate radical (SO4^•-). This study examined the effects of important groundwater chemical parameters, i.e., alkalinity, pH, and chloride, on benzene degradation via heterogeneous persulfate activation by three Fe(III)- and Mn(IV)-containing aquifer minerals: ferrihydrite, goethite, and pyrolusite. A comprehensive kinetic model was established to elucidate the mechanisms of radical generation and mineral surface complexation. Results showed that an increase of alkalinity up to 10 meq/L decreased the rates of persulfate decomposition and benzene degradation, which was associated with the formation of unreactive surface carbonato complexes. An increase in pH generally accelerated persulfate decomposition due to enhanced formation of reactive surface hydroxo complexes. A change in the chloride level up to 5 mM had a negligible effect on the reaction kinetics. Kinetic modeling also suggested that SO4^•- was transformed to hydroxyl radical (HO^•) and carbonate radical (CO3^•-) at higher pH. Furthermore, the yields of two major products of benzene oxidation, i.e., phenol and aldehyde, were positively correlated with the branching ratio of SO4^•- reacting with benzene, but inversely correlated with that of HO^• or CO3^•-, indicating that SO4^•- preferentially oxidized benzene via pathways involving fewer hydroxylation steps compared to HO^• or CO3^•-.
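Decomposition rates like those compared above are commonly extracted by fitting pseudo-first-order kinetics, ln(C/C0) = -k t. A minimal sketch on synthetic concentration-time data (the rate constant, noise level, and sampling times are illustrative, not the study's measurements):

```python
import numpy as np

def fit_first_order_rate(t, c):
    """Fit ln(C/C0) = -k t by least squares; returns the observed
    pseudo-first-order rate constant k (per hour here)."""
    slope, _ = np.polyfit(t, np.log(c / c[0]), 1)
    return -slope

# Synthetic decay, k = 0.12 h^-1, with small multiplicative noise.
t = np.linspace(0, 24, 13)
rng = np.random.default_rng(2)
c = 1.0 * np.exp(-0.12 * t) * (1 + 0.01 * rng.standard_normal(13))
k_obs = fit_first_order_rate(t, c)
half_life = np.log(2) / k_obs
print(round(k_obs, 2))  # → 0.12
```

Repeating such fits across pH or alkalinity levels is how the trends in observed rate constants reported above would typically be quantified.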
Programming and Runtime Support to Blaze FPGA Accelerator Deployment at Datacenter Scale
Huang, Muhuan; Wu, Di; Yu, Cody Hao; Fang, Zhenman; Interlandi, Matteo; Condie, Tyson; Cong, Jason
2017-01-01
With the end of CPU core scaling due to dark silicon limitations, customized accelerators on FPGAs have gained increased attention in modern datacenters due to their lower power, high performance and energy efficiency. Evidenced by Microsoft’s FPGA deployment in its Bing search engine and Intel’s $16.7 billion acquisition of Altera, integrating FPGAs into datacenters is considered one of the most promising approaches to sustain future datacenter growth. However, it is quite challenging for existing big data computing systems—like Apache Spark and Hadoop—to access the performance and energy benefits of FPGA accelerators. In this paper we design and implement Blaze to provide programming and runtime support for enabling easy and efficient deployments of FPGA accelerators in datacenters. In particular, Blaze abstracts FPGA accelerators as a service (FaaS) and provides a set of clean programming APIs for big data processing applications to easily utilize those accelerators. Our Blaze runtime implements an FaaS framework to efficiently share FPGA accelerators among multiple heterogeneous threads on a single node, and extends Hadoop YARN with accelerator-centric scheduling to efficiently share them among multiple computing tasks in the cluster. Experimental results using four representative big data applications demonstrate that Blaze greatly reduces the programming efforts to access FPGA accelerators in systems like Apache Spark and YARN, and improves the system throughput by 1.7× to 3× (and energy efficiency by 1.5× to 2.7×) compared to a conventional CPU-only cluster. PMID:28317049
Programming and Runtime Support to Blaze FPGA Accelerator Deployment at Datacenter Scale.
Huang, Muhuan; Wu, Di; Yu, Cody Hao; Fang, Zhenman; Interlandi, Matteo; Condie, Tyson; Cong, Jason
2016-10-01
With the end of CPU core scaling due to dark silicon limitations, customized accelerators on FPGAs have gained increased attention in modern datacenters due to their lower power, high performance and energy efficiency. Evidenced by Microsoft's FPGA deployment in its Bing search engine and Intel's $16.7 billion acquisition of Altera, integrating FPGAs into datacenters is considered one of the most promising approaches to sustain future datacenter growth. However, it is quite challenging for existing big data computing systems, like Apache Spark and Hadoop, to access the performance and energy benefits of FPGA accelerators. In this paper we design and implement Blaze to provide programming and runtime support for enabling easy and efficient deployments of FPGA accelerators in datacenters. In particular, Blaze abstracts FPGA accelerators as a service (FaaS) and provides a set of clean programming APIs for big data processing applications to easily utilize those accelerators. Our Blaze runtime implements an FaaS framework to efficiently share FPGA accelerators among multiple heterogeneous threads on a single node, and extends Hadoop YARN with accelerator-centric scheduling to efficiently share them among multiple computing tasks in the cluster. Experimental results using four representative big data applications demonstrate that Blaze greatly reduces the programming efforts to access FPGA accelerators in systems like Apache Spark and YARN, and improves the system throughput by 1.7× to 3× (and energy efficiency by 1.5× to 2.7×) compared to a conventional CPU-only cluster.
Heterogeneity: The key to failure forecasting
Vasseur, Jérémie; Wadsworth, Fabian B.; Lavallée, Yan; Bell, Andrew F.; Main, Ian G.; Dingwell, Donald B.
2015-01-01
Elastic waves are generated when brittle materials are subjected to increasing strain. Their number and energy increase non-linearly, ending in a system-sized catastrophic failure event. Accelerating rates of geophysical signals (e.g., seismicity and deformation) preceding large-scale dynamic failure can serve as proxies for damage accumulation in the Failure Forecast Method (FFM). Here we test the hypothesis that the style and mechanisms of deformation, and the accuracy of the FFM, are both tightly controlled by the degree of microstructural heterogeneity of the material under stress. We generate a suite of synthetic samples with variable heterogeneity, controlled by the gas volume fraction. We experimentally demonstrate that the accuracy of failure prediction increases drastically with the degree of material heterogeneity. These results have significant implications in a broad range of material-based disciplines for which failure forecasting is of central importance. In particular, the FFM has been used with only variable success to forecast failure scenarios both in the field (volcanic eruptions and landslides) and in the laboratory (rock and magma failure). Our results show that this variability may be explained, and the reliability and accuracy of forecasts significantly improved, by accounting for material heterogeneity as a first-order control on forecasting power. PMID:26307196
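The Failure Forecast Method referenced above is often applied in its linearized form: for the common power-law exponent p = 2, the inverse of the accelerating precursor rate decreases linearly in time, so its extrapolated zero-crossing estimates the failure time. A minimal sketch on an idealized, noise-free synthetic signal (the numbers are illustrative):

```python
import numpy as np

def forecast_failure_time(t, rate):
    """Linearized FFM: for exponent p = 2, the inverse event rate decays
    linearly in time, and its extrapolated zero-crossing estimates the
    failure time."""
    slope, intercept = np.polyfit(t, 1.0 / rate, 1)
    return -intercept / slope

# Synthetic accelerating precursor signal failing at t_f = 100.
t_f = 100.0
t = np.linspace(0, 90, 50)
rate = 1.0 / (t_f - t)   # rate ~ (t_f - t)^-1, i.e. p = 2 in FFM terms
estimate = forecast_failure_time(t, rate)
print(round(estimate, 1))  # → 100.0
```

Real precursor rates are noisy, and the paper's point is that material heterogeneity governs how well this extrapolation works; in heterogeneous samples the inverse-rate trend is cleaner and the zero-crossing estimate correspondingly more reliable.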
Impact of heterogeneity on groundwater salinization due to coastal pumping
NASA Astrophysics Data System (ADS)
Yu, X.; Michael, H. A.
2017-12-01
Groundwater abstraction causes and accelerates seawater intrusion in many coastal areas. In heterogeneous aquifers, preferential flow paths can lead to fast intrusion, while low-permeability layers can serve as barriers. The extent to which different types of heterogeneous aquifers are vulnerable to pumping-induced seawater intrusion has not been well studied. Here we show that the connectedness of the pumping location and the local boundary condition drives salinization patterns. Salinization patterns in homogeneous aquifers were relatively simple and related only to the hydraulic properties and pumping rate. The salinization rates and patterns in heterogeneous aquifers were much more complicated and related to pumping location, rate and depth, preferential flow path locations, and local boundary conditions. An intrusion classification approach was developed, with three types in homogeneous aquifers and four types in heterogeneous aquifers. After classification, the main factors of salinized areas, intrusion rates and salinization times could be identified. The ranges of these salinization assessment criteria suggested different aspects of groundwater vulnerability in each class. We anticipate the classification approach to be a starting point for more comprehensive groundwater abstraction vulnerability assessment (including consideration of pumping rates, locations and depths, connectivity, preferential flow paths, etc.), which is critical for coastal water resources management.
Heterogeneity: The key to failure forecasting.
Vasseur, Jérémie; Wadsworth, Fabian B; Lavallée, Yan; Bell, Andrew F; Main, Ian G; Dingwell, Donald B
2015-08-26
Elastic waves are generated when brittle materials are subjected to increasing strain. Their number and energy increase non-linearly, ending in a system-sized catastrophic failure event. Accelerating rates of geophysical signals (e.g., seismicity and deformation) preceding large-scale dynamic failure can serve as proxies for damage accumulation in the Failure Forecast Method (FFM). Here we test the hypothesis that the style and mechanisms of deformation, and the accuracy of the FFM, are both tightly controlled by the degree of microstructural heterogeneity of the material under stress. We generate a suite of synthetic samples with variable heterogeneity, controlled by the gas volume fraction. We experimentally demonstrate that the accuracy of failure prediction increases drastically with the degree of material heterogeneity. These results have significant implications in a broad range of material-based disciplines for which failure forecasting is of central importance. In particular, the FFM has been used with only variable success to forecast failure scenarios both in the field (volcanic eruptions and landslides) and in the laboratory (rock and magma failure). Our results show that this variability may be explained, and the reliability and accuracy of forecasts significantly improved, by accounting for material heterogeneity as a first-order control on forecasting power.
Heterogeneity: The key to failure forecasting
NASA Astrophysics Data System (ADS)
Vasseur, Jérémie; Wadsworth, Fabian B.; Lavallée, Yan; Bell, Andrew F.; Main, Ian G.; Dingwell, Donald B.
2015-08-01
Elastic waves are generated when brittle materials are subjected to increasing strain. Their number and energy increase non-linearly, ending in a system-sized catastrophic failure event. Accelerating rates of geophysical signals (e.g., seismicity and deformation) preceding large-scale dynamic failure can serve as proxies for damage accumulation in the Failure Forecast Method (FFM). Here we test the hypothesis that the style and mechanisms of deformation, and the accuracy of the FFM, are both tightly controlled by the degree of microstructural heterogeneity of the material under stress. We generate a suite of synthetic samples with variable heterogeneity, controlled by the gas volume fraction. We experimentally demonstrate that the accuracy of failure prediction increases drastically with the degree of material heterogeneity. These results have significant implications in a broad range of material-based disciplines for which failure forecasting is of central importance. In particular, the FFM has been used with only variable success to forecast failure scenarios both in the field (volcanic eruptions and landslides) and in the laboratory (rock and magma failure). Our results show that this variability may be explained, and the reliability and accuracy of forecasts significantly improved, by accounting for material heterogeneity as a first-order control on forecasting power.
Martin-Ruiz, Carmen; Saretzki, Gabriele; Petrie, Joanne; Ladhoff, Juliane; Jeyapalan, Jessie; Wei, Wenyi; Sedivy, John; von Zglinicki, Thomas
2004-04-23
The replicative life span of human fibroblasts is heterogeneous, with a fraction of cells senescing at every population doubling. To find out whether this heterogeneity is due to premature senescence, i.e. driven by a nontelomeric mechanism, fibroblasts with a senescent phenotype were isolated from growing cultures and clones by flow cytometry. These senescent cells had shorter telomeres than their cycling counterparts at all population doubling levels and both in mass cultures and in individual subclones, indicating heterogeneity in the rate of telomere shortening. Ectopic expression of telomerase stabilized telomere length in the majority of cells and rescued them from early senescence, suggesting a causal role of telomere shortening. Under standard cell culture conditions, there was a minor fraction of cells that showed a senescent phenotype and short telomeres despite active telomerase. This fraction increased under chronic mild oxidative stress, which is known to accelerate telomere shortening. It is possible that even high telomerase activity cannot fully compensate for telomere shortening in all cells. The data show that heterogeneity of the human fibroblast replicative life span can be caused by significant stochastic cell-to-cell variation in telomere shortening.
Liu, Zhiming; Luo, Jiawei
2017-08-01
Associating protein complexes with human inherited diseases is critical for better understanding of biological processes and the functional mechanisms of disease. Many protein complexes have been identified and functionally annotated by computational and purification methods so far; however, the particular roles they play in causing disease have not yet been well determined. In this study, we present a novel method to identify associations between protein complexes and diseases. First, we construct a disease-protein heterogeneous network based on data integration and Laplacian normalization. Second, we apply a random walk with restart on heterogeneous network (RWRH) algorithm to this network to quantify the strength of the association between proteins and the query disease. Third, we sum over the scores of member proteins to obtain a summary score for each candidate protein complex, and then rank all candidate protein complexes according to their scores. With a series of leave-one-out cross-validation experiments, we found that our method not only achieves high performance but also demonstrates robustness with respect to the parameters and the network structure. We test our approach on breast cancer and select the top 20 highly ranked protein complexes; 17 of the selected protein complexes have evidence connecting them to breast cancer. Our proposed method is effective in identifying disease-related protein complexes based on data integration and Laplacian normalization. Copyright © 2017. Published by Elsevier Ltd.
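The random-walk-with-restart scoring step can be illustrated on a toy network (the full method runs on a disease-protein heterogeneous network with Laplacian normalization; the adjacency matrix, node roles, and restart probability below are illustrative assumptions):

```python
import numpy as np

def random_walk_with_restart(adj, seed_idx, restart=0.7, tol=1e-10):
    """Iterate p = (1 - r) * W @ p + r * p0 to convergence, where W is
    the column-normalized adjacency matrix and p0 concentrates all mass
    on the seed (query) node."""
    w = adj / adj.sum(axis=0, keepdims=True)   # column-normalize
    p0 = np.zeros(adj.shape[0])
    p0[seed_idx] = 1.0
    p = p0.copy()
    while True:
        p_next = (1 - restart) * w @ p + restart * p0
        if np.abs(p_next - p).sum() < tol:
            return p_next
        p = p_next

# Toy network: node 0 plays the "query disease"; nodes 1-4 are proteins.
adj = np.array([
    [0, 1, 1, 0, 0],
    [1, 0, 1, 0, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 1, 0, 1],
    [0, 0, 0, 1, 0],
], dtype=float)
scores = random_walk_with_restart(adj, seed_idx=0)
ranking = np.argsort(-scores[1:]) + 1   # proteins ranked by proximity
print(ranking)
```

Summing such per-protein scores over the members of each candidate complex, as the paper describes, then yields the complex-level ranking.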
Jaiswal, Abhishek; Egami, Takeshi; Zhang, Yang
2015-04-01
The phase behavior of multi-component metallic liquids is exceedingly complex because of the convoluted many-body and many-elemental interactions. Herein, we present systematic studies of the dynamic aspects of such a model ternary metallic liquid, Cu40Zr51Al9, using molecular dynamics simulation with the embedded atom method. We observed a dynamical crossover from Arrhenius to super-Arrhenius behavior in the transport properties (diffusion coefficient, relaxation times, and shear viscosity) bordered at Tx ~ 1300 K. Unlike in many molecular and macromolecular liquids, this crossover phenomenon occurs in the equilibrium liquid state well above the melting temperature of the system (Tm ~ 900 K), and the crossover temperature is roughly twice the glass-transition temperature (Tg). Below Tx, we found that the elemental dynamics decoupled and the Stokes-Einstein relation broke down, indicating the onset of heterogeneous, spatially correlated dynamics in the system mediated by dynamic communications among local configurational excitations. To directly characterize and visualize the correlated dynamics, we employed a non-parametric, unsupervised machine learning technique and identified dynamical clusters of atoms with similar atomic mobility. The revealed average dynamical cluster size shows an accelerated increase below Tx and mimics the trend observed in other ensemble-averaged quantities that are commonly used to quantify spatially heterogeneous dynamics, such as the non-Gaussian parameter and the four-point correlation function.
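The Arrhenius/super-Arrhenius distinction above amounts to whether ln D is linear in 1/T. A sketch that fits an activation energy to synthetic high-temperature diffusivities and flags a low-temperature point falling below the Arrhenius extrapolation (all numerical values are illustrative, not the simulation's results):

```python
import numpy as np

KB = 8.617e-5  # Boltzmann constant, eV/K

def fit_arrhenius(temps, diff):
    """Fit ln D = ln D0 - Ea / (kB T); returns (D0, Ea in eV)."""
    slope, intercept = np.polyfit(1.0 / temps, np.log(diff), 1)
    return np.exp(intercept), -slope * KB

# Synthetic diffusivities: Arrhenius at high T, with an extra slowdown
# below the crossover (mimicking the reported behavior qualitatively).
t_high = np.array([1400.0, 1600.0, 1800.0, 2000.0])
d_high = 1e-7 * np.exp(-0.5 / (KB * t_high))
d0, ea = fit_arrhenius(t_high, d_high)

t_low = 1000.0
d_low = 0.2 * d0 * np.exp(-ea / (KB * t_low))   # slower than extrapolated
super_arrhenius = d_low < 0.5 * d0 * np.exp(-ea / (KB * t_low))
print(round(ea, 2), super_arrhenius)  # → 0.5 True
```

The same fit-and-extrapolate logic applies to the relaxation times and shear viscosity mentioned in the abstract, since all three transport properties show the crossover at roughly the same temperature.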
Xu, Maoqi; Chen, Liang
2018-01-01
The individual sample heterogeneity is one of the biggest obstacles in biomarker identification for complex diseases such as cancers. Current statistical models to identify differentially expressed genes between disease and control groups often overlook the substantial human sample heterogeneity. Meanwhile, traditional nonparametric tests lose detailed data information and sacrifice analysis power, although they are distribution-free and robust to heterogeneity. Here, we propose an empirical likelihood ratio test with a mean-variance relationship constraint (ELTSeq) for the differential expression analysis of RNA sequencing (RNA-seq). As a distribution-free nonparametric model, ELTSeq handles individual heterogeneity by estimating an empirical probability for each observation without making any assumption about the read-count distribution. It also incorporates a constraint for the read-count overdispersion, which is widely observed in RNA-seq data. ELTSeq demonstrates a significant improvement over existing methods such as edgeR, DESeq, t-tests, Wilcoxon tests and the classic empirical likelihood-ratio test when handling heterogeneous groups. It will significantly advance the transcriptomics studies of cancers and other complex diseases. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Positive Selection in Rapidly Evolving Plastid–Nuclear Enzyme Complexes
Rockenbach, Kate; Havird, Justin C.; Monroe, J. Grey; Triant, Deborah A.; Taylor, Douglas R.; Sloan, Daniel B.
2016-01-01
Rates of sequence evolution in plastid genomes are generally low, but numerous angiosperm lineages exhibit accelerated evolutionary rates in similar subsets of plastid genes. These genes include clpP1 and accD, which encode components of the caseinolytic protease (CLP) and acetyl-coA carboxylase (ACCase) complexes, respectively. Whether these extreme and repeated accelerations in rates of plastid genome evolution result from adaptive change in proteins (i.e., positive selection) or simply a loss of functional constraint (i.e., relaxed purifying selection) is a source of ongoing controversy. To address this, we have taken advantage of the multiple independent accelerations that have occurred within the genus Silene (Caryophyllaceae) by examining phylogenetic and population genetic variation in the nuclear genes that encode subunits of the CLP and ACCase complexes. We found that, in species with accelerated plastid genome evolution, the nuclear-encoded subunits in the CLP and ACCase complexes are also evolving rapidly, especially those involved in direct physical interactions with plastid-encoded proteins. A massive excess of nonsynonymous substitutions between species relative to levels of intraspecific polymorphism indicated a history of strong positive selection (particularly in CLP genes). Interestingly, however, some species are likely undergoing loss of the native (heteromeric) plastid ACCase and putative functional replacement by a duplicated cytosolic (homomeric) ACCase. Overall, the patterns of molecular evolution in these plastid–nuclear complexes are unusual for anciently conserved enzymes. They instead resemble cases of antagonistic coevolution between pathogens and host immune genes. We discuss a possible role of plastid–nuclear conflict as a novel cause of accelerated evolution. PMID:27707788
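The "excess of nonsynonymous substitutions between species relative to levels of intraspecific polymorphism" described above is the signal behind McDonald-Kreitman-style tests of positive selection. A minimal sketch with hypothetical counts (the numbers are illustrative assumptions, not values from the Silene data):

```python
from scipy.stats import fisher_exact

# McDonald-Kreitman-style 2x2 table: fixed differences between species (D)
# versus polymorphisms within species (P), split into nonsynonymous (n) and
# synonymous (s) classes. Counts are hypothetical.
Dn, Ds = 40, 10     # divergence: nonsynonymous, synonymous
Pn, Ps = 5, 20      # polymorphism: nonsynonymous, synonymous

odds, p = fisher_exact([[Dn, Ds], [Pn, Ps]])
NI = (Pn / Ps) / (Dn / Ds)   # neutrality index: NI < 1 suggests positive selection
print(NI, p)
```

A neutrality index well below 1 together with a significant Fisher exact p-value is the pattern the abstract summarizes as "a history of strong positive selection."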
Duplex Heterogeneous Nucleation Behavior of Precipitates in C-Mn Steel Containing Sn
NASA Astrophysics Data System (ADS)
Sun, Guilin; Tao, Sufen
2018-04-01
The two successive heterogeneous nucleation behaviors of FeSn2-MnS-Al2O3 complex precipitates in ultrahigh Sn-bearing steel were investigated. First, Al2O3 was the nucleation site of the MnS at the end of solidification. Then, FeSn2 nucleated heterogeneously on the MnS particles that nucleated on the Al2O3 particles. The formation sequence of the precipitated phase caused the duplex heterogeneous nucleation to occur consecutively at most twice.
Kirschbaum, Mark A.; Schenk, Christopher J.
2010-01-01
Valley-fill deposits form a significant class of hydrocarbon reservoirs in many basins of the world. Maximizing recovery of fluids from these reservoirs requires an understanding of the scales of fluid-flow heterogeneity present within the valley-fill system. The Upper Cretaceous Dakota Sandstone in the San Rafael Swell, Utah contains well exposed, relatively accessible outcrops that allow a unique view of the external geometry and internal complexity of a set of rocks interpreted to be deposits of an incised valley fill. These units can be traced on outcrop for tens of miles, and individual sandstone bodies are exposed in three dimensions because of modern erosion in side canyons in a semiarid setting and by exhumation of the overlying, easily erodible Mancos Shale. The Dakota consists of two major units: (1) a lower amalgamated sandstone facies dominated by large-scale cross stratification with several individual sandstone bodies ranging in thickness from 8 to 28 feet, ranging in width from 115 to 150 feet, and having lengths as much as 5,000 feet, and (2) an upper facies composed of numerous mud-encased lenticular sandstones, dominated by ripple-scale lamination, in bedsets ranging in thickness from 5 to 12 feet. The lower facies is interpreted to be fluvial, probably of mainly braided stream origin that exhibits multiple incisions amalgamated into a complex sandstone body. The upper facies has lower energy, probably anastomosed channels encased within alluvial and coastal-plain floodplain sediments. The Dakota valley-fill complex has multiple scales of heterogeneity that could affect fluid flow in similar oil and gas subsurface reservoirs. The largest scale heterogeneity is at the formation level, where the valley-fill complex is sealed within overlying and underlying units. Within the valley-fill complex, there are heterogeneities between individual sandstone bodies, and at the smallest scale, internal heterogeneities within the bodies themselves. 
These different scales of fluid-flow compartmentalization present a challenge to hydrocarbon exploration targeting paleovalley deposits, and producing fields containing these types of reservoirs may have significant bypassed pay, especially where well spacing is large.
Simple heterogeneity parametrization for sea surface temperature and chlorophyll
NASA Astrophysics Data System (ADS)
Skákala, Jozef; Smyth, Timothy J.
2016-06-01
Using satellite maps, this paper offers a comprehensive analysis of chlorophyll and SST heterogeneity in the shelf seas around the southwest of the UK. The heterogeneity scaling follows a simple power law and is consequently parametrized by two parameters. It is shown that in most cases these two parameters vary relatively little with time. The paper offers a detailed comparison of field heterogeneity between different regions, and also determines how much of each region's heterogeneity is preserved in the annual median data. The paper explicitly demonstrates how one can use these results to calculate the representative measurement area for in situ networks.
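The two-parameter power-law parametrization described above can be estimated by ordinary least squares in log-log space. A minimal sketch with synthetic values (the scales and heterogeneity measures below are placeholder assumptions, not the paper's satellite data):

```python
import numpy as np

# Hypothetical spatial scales (km) and heterogeneity measures (e.g. variance of
# SST within boxes of that size); illustrative values only.
r = np.array([1.0, 2, 4, 8, 16, 32, 64])
H = 0.05 * r ** 0.7                    # synthetic power law H(r) = A * r**beta

# A two-parameter power law is linear in log-log space:
#   log H = log A + beta * log r,
# so a degree-1 polynomial fit on the logs recovers both parameters.
beta, logA = np.polyfit(np.log(r), np.log(H), 1)
A = np.exp(logA)
print(A, beta)
```

Once (A, beta) are known for a region, the heterogeneity at any intermediate scale follows from the fitted law, which is what makes the two-parameter summary useful for sizing in situ measurement networks.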
Measurement of Coriolis Acceleration with a Smartphone
ERIC Educational Resources Information Center
Shaku, Asif; Kraft, Jakob
2016-01-01
Undergraduate physics laboratories seldom have experiments that measure the Coriolis acceleration. This has traditionally been the case owing to the inherent complexities of making such measurements. Articles on the experimental determination of the Coriolis acceleration are few and far between in the physics literature. However, because modern…
Fast sparse recovery and coherence factor weighting in optoacoustic tomography
NASA Astrophysics Data System (ADS)
He, Hailong; Prakash, Jaya; Buehler, Andreas; Ntziachristos, Vasilis
2017-03-01
Sparse recovery algorithms have shown great potential for reconstructing images from limited-view datasets in optoacoustic tomography, with the disadvantage of being computationally expensive. In this paper, we improve the fast-converging Split Augmented Lagrangian Shrinkage Algorithm (SALSA) based on a least-squares QR (LSQR) formulation to perform accelerated reconstructions. Further, a coherence factor is calculated to weight the final reconstruction result, which can further reduce artifacts arising in limited-view scenarios and acoustically heterogeneous media. Several phantom and biological experiments indicate that the accelerated SALSA method with coherence factor (ASALSA-CF) can provide improved reconstructions and much faster convergence compared to existing sparse recovery methods.
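The LSQR building block underlying the accelerated reconstruction can be sketched as a plain least-squares solve; SALSA adds a sparsity prior on top, and the coherence factor then weights the reconstructed image elementwise. A minimal sketch with a random placeholder forward model (not an optoacoustic system matrix):

```python
import numpy as np
from scipy.sparse import random as sprandom
from scipy.sparse.linalg import lsqr

# Minimal model-based reconstruction core: solve min ||A x - b||_2 for a
# sparse forward model A. Matrix and data are random placeholders.
rng = np.random.default_rng(0)
A = sprandom(200, 100, density=0.05, random_state=0, format="csr")
x_true = rng.standard_normal(100)
b = A @ x_true                          # consistent synthetic measurements

x_rec = lsqr(A, b, atol=1e-10, btol=1e-10, iter_lim=5000)[0]
print(np.linalg.norm(A @ x_rec - b))    # residual of the recovered solution
```

The full ASALSA-CF pipeline described in the abstract wraps such solves in SALSA's split augmented-Lagrangian iterations and multiplies the final image by the per-pixel coherence factor.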
Genome-wide detection of intervals of genetic heterogeneity associated with complex traits
Llinares-López, Felipe; Grimm, Dominik G.; Bodenham, Dean A.; Gieraths, Udo; Sugiyama, Mahito; Rowan, Beth; Borgwardt, Karsten
2015-01-01
Motivation: Genetic heterogeneity, the fact that several sequence variants give rise to the same phenotype, is a phenomenon that is of the utmost interest in the analysis of complex phenotypes. Current approaches for finding regions in the genome that exhibit genetic heterogeneity suffer from at least one of two shortcomings: (i) they require the definition of an exact interval in the genome that is to be tested for genetic heterogeneity, potentially missing intervals of high relevance, or (ii) they suffer from an enormous multiple hypothesis testing problem due to the large number of potential candidate intervals being tested, which results in either many false positives or a lack of power to detect true intervals. Results: Here, we present an approach that overcomes both problems: it allows one to automatically find all contiguous sequences of single nucleotide polymorphisms in the genome that are jointly associated with the phenotype. It also solves both the inherent computational efficiency problem and the statistical problem of multiple hypothesis testing, which are both caused by the huge number of candidate intervals. We demonstrate on Arabidopsis thaliana genome-wide association study data that our approach can discover regions that exhibit genetic heterogeneity and would be missed by single-locus mapping. Conclusions: Our novel approach can contribute to the genome-wide discovery of intervals that are involved in the genetic heterogeneity underlying complex phenotypes. Availability and implementation: The code can be obtained at: http://www.bsse.ethz.ch/mlcb/research/bioinformatics-and-computational-biology/sis.html. Contact: felipe.llinares@bsse.ethz.ch Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26072488
Sarah C. Elmendorf; Gregory H.R. Henry; Robert D. Hollister; Robert G. Björk; Anne D. Bjorkman; Terry V. Callaghan; William Gould; Joel Mercado; and others
2012-01-01
Understanding the sensitivity of tundra vegetation to climate warming is critical to forecasting future biodiversity and vegetation feedbacks to climate. In situ warming experiments accelerate climate change on a small scale to forecast responses of local plant communities. Limitations of this approach include the apparent site-specificity of results and uncertainty...
The Present Status of Siam Photon Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pairsuwan, Weerapong; Ishii, Takehiko; Isoyama, Goro
We report the technical problems encountered in commissioning and improving the performance of the accelerator complex, which consists of a 1 GeV light source storage ring, a 1 GeV booster synchrotron, and a 40 MeV injector linac. Regulation work for an attached beam line with an experimental station for photoemission studies is also described. Beam instability and low injection efficiency are the major issues for the accelerator complex. In the beam line, the work performed comprises the accurate optical alignment of the monochromator system and the modification of the measurement control software supplied by the maker. The results of the work on the accelerator complex will be helpful for the commissioning of machines obtained secondhand and reformed to some extent.
DOT National Transportation Integrated Search
2004-01-01
Accelerated Construction Technology Transfer (ACTT) is a strategic process that uses various innovative techniques, strategies, and technologies to minimize actual construction time, while enhancing quality and safety on today's large, complex multip...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crawford, Daniel
8-Session Symposium on STRUCTURE AND DYNAMICS IN COMPLEX CHEMICAL SYSTEMS: GAINING NEW INSIGHTS THROUGH RECENT ADVANCES IN TIME-RESOLVED SPECTROSCOPIES. The intricacy of most chemical, biochemical, and material processes and their applications are underscored by the complex nature of the environments in which they occur. Substantial challenges for building a global understanding of a heterogeneous system include (1) identifying unique signatures associated with specific structural motifs within the heterogeneous distribution, and (2) resolving the significance of each of the multiple time scales involved in both small- and large-scale nuclear reorganization. This symposium focuses on progress in our understanding of dynamics in complex systems driven by recent innovations in time-resolved spectroscopies and theoretical developments. Such advancement is critical for driving discovery at the molecular level, facilitating new applications. Broad areas of interest include: structural relaxation and the impact of structure on dynamics in liquids, interfaces, biochemical systems, materials, and other heterogeneous environments.
Assessing the groundwater salinization in closed hydrologic basins due to overdraft
NASA Astrophysics Data System (ADS)
Guo, Z.; Pauloo, R.; Fogg, G. E.
2016-12-01
Population growth and the expansion of agriculture, coupled with climate uncertainties, have accelerated groundwater pumping and overdraft in alluvial aquifers worldwide. In many agricultural basins, the low rate of replenishment is far exceeded by the rate of groundwater pumping, which results in substantial water table declines and in effect contributes to the formation of a "closed" basin. In fact, even modest groundwater drawdown that does not produce what is construed as overdraft can result in most of the groundwater discharge occurring as evapotranspiration via irrigation practices, converting the basin to a closed groundwater basin. Moreover, in past decades, extreme weather conditions (e.g., the severe drought in California over the past five years) have resulted in substantially reduced surface water storage. This increases demand for groundwater to supplement low surface water supplies and consequently drives groundwater overdraft and, hence, groundwater salinization. In these newly closed basins, just as in naturally closed basins such as Death Valley and the Great Salt Lake, groundwater salinity must increase not only due to evaporation, but also due to rock-water interactions in the groundwater system and the lack of a natural outlet for the groundwater. In this study, the water balance and salt balance in closed basins of the Central Valley, California are computed. Groundwater degradation under current overdraft conditions is further investigated using simple models developed by upscaling more complex and heterogeneous transport models. The focus of this study is to determine the applicability of these simple models to represent regional transport without explicitly including the large-scale heterogeneity inherent in the more complex models.
Groundwater salinization processes, including salt accumulation caused by evapotranspiration of applied irrigation water and rock-groundwater interactions, are simulated, and the time scales over which groundwater salinity may pose a threat to societies are estimated. Lastly, and most importantly, management strategies to mitigate groundwater salinization are examined.
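The salt balance in a closed basin reduces to a simple box model: salt enters with recharge, evapotranspiration removes only water, and there is no outlet. A minimal sketch with illustrative parameters (placeholder assumptions, not Central Valley values):

```python
# Minimal salt-balance box model for a closed groundwater basin. All numbers
# are illustrative assumptions.
V = 1.0e9        # aquifer water volume, m^3
C = 0.3          # initial salinity, kg/m^3 (~300 mg/L TDS)
M = C * V        # dissolved salt mass, kg
Q_in = 1.0e7     # annual recharge, m^3/yr, at concentration C_in
C_in = 0.5       # recharge salinity, kg/m^3
Q_et = 1.0e7     # annual evapotranspiration of applied water, m^3/yr (salt-free)

for year in range(500):
    M += Q_in * C_in          # salt enters with recharge
    V += Q_in - Q_et          # water balance; here inflow equals ET, so V is steady
    C = M / V                 # no outlet: salt accumulates, concentration climbs

print(C)   # salinity after 500 years, kg/m^3
```

Because nothing removes salt, the concentration grows roughly linearly in time in this regime; the upscaled transport models in the study refine this picture with spatial heterogeneity and rock-water interactions.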
Efficient Execution of Microscopy Image Analysis on CPU, GPU, and MIC Equipped Cluster Systems.
Andrade, G; Ferreira, R; Teodoro, George; Rocha, Leonardo; Saltz, Joel H; Kurc, Tahsin
2014-10-01
High performance computing is experiencing a major paradigm shift with the introduction of accelerators such as graphics processing units (GPUs) and the Intel Xeon Phi (MIC). These processors have made tremendous computing power available at low cost, and are transforming machines into hybrid systems equipped with CPUs and accelerators. Although these systems can deliver a very high peak performance, making full use of their resources in real-world applications is a complex problem. Most current applications deployed to these machines are still executed on a single processor, leaving the other devices underutilized. In this paper we explore a scenario in which applications are composed of hierarchical data-flow tasks which are allocated to nodes of a distributed-memory machine at coarse grain, but each of which may be composed of several finer-grain tasks that can be allocated to different devices within the node. We propose and implement novel performance-aware scheduling techniques that can be used to allocate tasks to devices. We evaluate our techniques using a pathology image analysis application used to investigate brain cancer morphology, and our experimental evaluation shows that the proposed scheduling strategies significantly outperform other efficient scheduling techniques, such as Heterogeneous Earliest Finish Time (HEFT), in cooperative executions using CPUs, GPUs, and MICs. We also show experimentally that our strategies are less sensitive to inaccuracy in the scheduling input data and that the performance gains are maintained as the application scales.
Overload cascading failure on complex networks with heterogeneous load redistribution
NASA Astrophysics Data System (ADS)
Hou, Yueyi; Xing, Xiaoyun; Li, Menghui; Zeng, An; Wang, Yougui
2017-09-01
Many real systems including the Internet, power-grid and financial networks experience rare but large overload cascading failures triggered by small initial shocks. Many models on complex networks have been developed to investigate this phenomenon. Most of these models are based on the load redistribution process and assume that the load on a failed node shifts to nearby nodes in the networks either evenly or according to the load distribution rule before the cascade. Inspired by the fact that real power-grid tends to place the excess load on the nodes with high remaining capacities, we study a heterogeneous load redistribution mechanism in a simplified sandpile model in this paper. We find that weak heterogeneity in load redistribution can effectively mitigate the cascade while strong heterogeneity in load redistribution may even enlarge the size of the final failure. With a parameter θ to control the degree of the redistribution heterogeneity, we identify a rather robust optimal θ∗ = 1. Finally, we find that θ∗ tends to shift to a larger value if the initial sand distribution is homogeneous.
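The redistribution rule studied above can be sketched on a fully mixed population (rather than a network): a failed node's load is shared among surviving nodes in proportion to their remaining capacity raised to a power theta, with theta = 0 giving an even split. All parameters below are illustrative assumptions, not the paper's model settings:

```python
import numpy as np

rng = np.random.default_rng(1)

def cascade_size(theta, n=200, trials=50):
    """Average number of failed nodes after a single overload trigger."""
    failed_total = 0
    for _ in range(trials):
        load = rng.uniform(0.0, 1.0, n)
        cap = load + rng.uniform(0.1, 0.5, n)   # capacity = load + headroom
        alive = np.ones(n, dtype=bool)
        load[rng.integers(n)] += 1.0            # trigger: overload one node
        while True:
            over = alive & (load > cap)
            if not over.any():
                break                           # cascade has stopped
            for i in np.flatnonzero(over):
                alive[i] = False
                rest = alive & (load <= cap)    # surviving, not-overloaded nodes
                if not rest.any():
                    continue                    # nowhere left to shed load
                headroom = cap[rest] - load[rest]
                w = headroom ** theta           # theta = 0: even split;
                load[rest] += load[i] * w / w.sum()  # larger theta: favor headroom
        failed_total += n - alive.sum()
    return failed_total / trials

print(cascade_size(0.0), cascade_size(1.0))
```

Sweeping theta in such a toy model poses the same question the paper studies on networks: whether moderate heterogeneity in redistribution mitigates the cascade while strong heterogeneity enlarges it.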
NASA Astrophysics Data System (ADS)
Chao, Tsi-Chian; Tsai, Yi-Chun; Chen, Shih-Kuan; Wu, Shu-Wei; Tung, Chuan-Jong; Hong, Ji-Hong; Wang, Chun-Chieh; Lee, Chung-Chi
2017-08-01
The purpose of this study was to investigate the density heterogeneity pattern as a factor affecting Bragg peak degradation, including shifts in Bragg peak depth (ZBP), distal range (R80 and R20), and distal fall-off (R80-R20), using the Monte Carlo N-Particle eXtended (MCNPX) code. Density heterogeneities of different patterns with increasing complexity were placed downstream of commissioned proton beams at the Proton and Radiation Therapy Centre of Chang Gung Memorial Hospital, including one 150 MeV wobbling broad beam (10×10 cm2) and one 150 MeV proton pencil beam (cross-plane FWHM = 2.449 cm, in-plane FWHM = 2.256 cm). MCNPX 2.7.0 was used to model the transport and interactions of protons and secondary particles in the density heterogeneity patterns and water using its repeated-structure geometry. Different heterogeneity patterns were inserted into a 21×21×20 cm3 phantom. A mesh tally was used to track the dose distribution as the proton beam passed through the different density heterogeneity patterns. The results show that different heterogeneity patterns do cause different Bragg peak degradations owing to multiple Coulomb scattering (MCS) occurring in the density heterogeneities. A trend of increasing R20 and R80-R20 with increasing geometry complexity was observed, which means that Bragg peak degradation is mainly caused by changes to the proton spectrum owing to MCS in the density heterogeneities. In contrast, R80 did not change considerably with different heterogeneity patterns, indicating that the energy spectrum has only minimal effects on R80. Bragg peak degradation can occur both for a broad proton beam and for a pencil beam, but is less significant for the broad beam.
Leang, Sarom S; Rendell, Alistair P; Gordon, Mark S
2014-03-11
Increasingly, modern computer systems comprise a multicore general-purpose processor augmented with a number of special-purpose devices or accelerators connected via an external interface such as a PCI bus. The NVIDIA Kepler Graphical Processing Unit (GPU) and the Intel Phi are two examples of such accelerators. Accelerators offer peak performances that can be well above those of the host processor. How to exploit this heterogeneous environment for legacy application codes is not, however, straightforward. This paper considers how matrix operations in typical quantum chemical calculations can be migrated to GPU and Phi systems. Double precision general matrix multiply operations are endemic in electronic structure calculations, especially methods that include electron correlation, such as density functional theory, second order perturbation theory, and coupled cluster theory. The use of approaches that automatically determine whether to use the host or an accelerator, based on problem size, is explored, with computations occurring on the accelerator, the host, or both. For data transfers over PCI-e, the GPU provides the best overall performance for data sizes up to 4096 MB, with consistent upload and download rates of 5-5.6 GB/s and 5.4-6.3 GB/s, respectively. The GPU outperforms the Phi for both square and nonsquare matrix multiplications.
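The size-based host-versus-accelerator decision described above can be sketched with a simple cost model that weighs compute savings against PCIe transfer time. The throughput constants below are illustrative assumptions, not measurements from the paper:

```python
# Illustrative dispatch rule: run small DGEMMs on the host, ship large ones to
# the accelerator only when the compute saving outweighs the transfer cost.
PCIE_BPS = 5.0e9           # ~5 GB/s effective PCIe transfer rate (bytes/s)
HOST_FLOPS = 5.0e10        # assumed sustained host DGEMM rate (flop/s)
ACCEL_FLOPS = 5.0e11       # assumed sustained accelerator DGEMM rate (flop/s)

def best_device(m, n, k):
    """Pick 'host' or 'accel' for C(m,n) = A(m,k) @ B(k,n) in double precision."""
    flops = 2.0 * m * n * k
    bytes_moved = 8.0 * (m * k + k * n + m * n)      # A, B down; C back
    t_host = flops / HOST_FLOPS
    t_accel = flops / ACCEL_FLOPS + bytes_moved / PCIE_BPS
    return "host" if t_host <= t_accel else "accel"

print(best_device(64, 64, 64), best_device(4096, 4096, 4096))
```

Because DGEMM work grows as O(mnk) while transfers grow only as O(mk + kn + mn), the accelerator wins once the matrices are large enough, which is the behavior the size-based dispatch exploits.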
γ-Secretase Heterogeneity in the Aph1 Subunit: Relevance for Alzheimer’s Disease
Serneels, Lutgarde; Van Biervliet, Jérôme; Craessaerts, Katleen; Dejaegere, Tim; Horré, Katrien; Van Houtvin, Tine; Esselmann, Hermann; Paul, Sabine; Schäfer, Martin K.; Berezovska, Oksana; Hyman, Bradley T.; Sprangers, Ben; Sciot, Raf; Moons, Lieve; Jucker, Mathias; Yang, Zhixiang; May, Patrick C.; Karran, Eric; Wiltfang, Jens; D’Hooge, Rudi; De Strooper, Bart
2009-01-01
The γ-secretase complex plays a role in Alzheimer’s disease (AD) and cancer progression. The development of clinical useful inhibitors, however, is complicated by the role of the γ-secretase complex in regulated intramembrane proteolysis of Notch and other essential proteins. Different γ-secretase complexes containing different Presenilin or Aph1 protein subunits are present in various tissues. Here we show that these complexes have heterogeneous biochemical and physiological properties. Specific inactivation of the Aph1B γ-secretase in a murine Alzheimer’s disease model led to improvements of Alzheimer’s disease-relevant phenotypic features without any Notch-related side effects. The Aph1B complex contributes to total γ-secretase activity in the human brain, thus specific targeting of Aph1B-containing γ-secretase complexes may be helpful in generating less toxic therapies for Alzheimer’s disease. PMID:19299585
Heterogeneity in stream water temperatures created by local influx of cooler subsurface waters into geomorphically complex stream channels was associated with increased abundance of rainbow trout (Oncorhynchus mykiss) and chinook salmon (O. tshawytscha) in northeastern Oregon. Th...
Takecian, Pedro L.; Oikawa, Marcio K.; Braghetto, Kelly R.; Rocha, Paulo; Lucena, Fred; Kavounis, Katherine; Schlumpf, Karen S.; Acker, Susan; Carneiro-Proietti, Anna B. F.; Sabino, Ester C.; Custer, Brian; Busch, Michael P.; Ferreira, João E.
2013-01-01
Over time, data warehouse (DW) systems have become more difficult to develop because of the growing heterogeneity of data sources. Despite advances in research and technology, DW projects are still too slow for pragmatic results to be generated. Here, we address the following question: how can the complexity of DW development for integration of heterogeneous transactional information systems be reduced? To answer this, we proposed methodological guidelines based on cycles of conceptual modeling and data analysis, to drive construction of a modular DW system. These guidelines were applied to the blood donation domain, successfully reducing the complexity of DW development. PMID:23729945
Simplifying the complexity of resistance heterogeneity in metastasis
Lavi, Orit; Greene, James M.; Levy, Doron; Gottesman, Michael M.
2014-01-01
The main goal of treatment regimens for metastasis is to control growth rates, not eradicate all cancer cells. Mathematical models offer methodologies that incorporate high-throughput data with dynamic effects on net growth. The ideal approach would simplify, but not over-simplify, a complex problem into meaningful and manageable estimators that predict a patient’s response to specific treatments. Here, we explore three fundamental approaches with different assumptions concerning resistance mechanisms, in which the cells are categorized into either discrete compartments or described by a continuous range of resistance levels. We argue in favor of modeling resistance as a continuum and demonstrate how integrating cellular growth rates, density-dependent versus exponential growth, and intratumoral heterogeneity improves predictions concerning the resistance heterogeneity of metastases. PMID:24491979
NASA Astrophysics Data System (ADS)
Loiseau, Jason; Georges, William; Frost, David; Higgins, Andrew
2015-06-01
The incidence angle of a detonation wave is often assumed to weakly influence the terminal velocity of an explosively driven flyer. For explosives heavily loaded with dense additives, this may not be true due to differences in momentum and energy transfer between detonation products, additive particles, and the flyer. For tangential incidence the particles are first accelerated against the flyer via an expansion fan, whereas they are first accelerated by the detonation wave in the normal case. In the current study we evaluate the effect of normal versus tangential incidence on the acceleration of flyers by nitromethane heavily loaded with a variety of additives. Normal detonation was initiated via an explosively driven slapper. Flyer acceleration was measured with photonic Doppler velocimetry (PDV). The influence of wave angle is evaluated by comparing the terminal velocity in the two cases (i.e., normal and grazing) for the heavily loaded mixtures. The decrement in flyer velocity correlated primarily with additive volume fraction and had a weak dependence on additive density or particle size. The Gurney energy of the heterogeneous explosive was observed to increase with flyer mass, presumably due to the timescale over which impinging particles could transfer momentum.
Shi, Chengyu; Guo, Bingqi; Cheng, Chih-Yao; Eng, Tony; Papanikolaou, Nikos
2010-09-21
A low-energy electronic brachytherapy source (EBS), the model S700 Axxent x-ray device developed by Xoft Inc., has been used in high dose rate (HDR) intracavitary accelerated partial breast irradiation (APBI) as an alternative to an Ir-192 source. The prescription dose and delivery schema of the electronic brachytherapy APBI plan are the same as the Ir-192 plan. However, due to its lower mean energy than the Ir-192 source, an EBS plan has dosimetric and biological features different from an Ir-192 source plan. Current brachytherapy treatment planning methods may have large errors in treatment outcome prediction for an EBS plan. Two main factors contribute to the errors: the dosimetric influence of tissue heterogeneities and the enhancement of relative biological effectiveness (RBE) of electronic brachytherapy. This study quantified the effects of these two factors and revisited the plan quality of electronic brachytherapy APBI. The influence of tissue heterogeneities is studied by a Monte Carlo method and heterogeneous 'virtual patient' phantoms created from CT images and structure contours; the effect of RBE enhancement in the treatment outcome was estimated by biologically effective dose (BED) distribution. Ten electronic brachytherapy APBI cases were studied. The results showed that, for electronic brachytherapy cases, tissue heterogeneities and patient boundary effect decreased dose to the target and skin but increased dose to the bones. On average, the target dose coverage PTV V(100) reduced from 95.0% in water phantoms (planned) to only 66.7% in virtual patient phantoms (actual). The actual maximum dose to the ribs is 3.3 times higher than the planned dose; the actual mean dose to the ipsilateral breast and maximum dose to the skin were reduced by 22% and 17%, respectively. 
Combining the effect of tissue heterogeneities and RBE enhancement, BED coverage of the target was 89.9% in virtual patient phantoms with RBE enhancement (actual BED) as compared to 95.2% in water phantoms without RBE enhancement (planned BED). About 10% increase in the source output is required to raise BED PTV V(100) to 95%. As a conclusion, the composite effect of dose reduction in the target due to heterogeneities and RBE enhancement results in a net effect of 5.3% target BED coverage loss for electronic brachytherapy. Therefore, it is suggested that about 10% increase in the source output may be necessary to achieve sufficient target coverage higher than 95%.
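For reference, the biologically effective dose used in such comparisons is conventionally computed from the linear-quadratic model. A minimal statement in standard radiobiology notation (not the paper's own equations), for n fractions of physical dose d per fraction, with the RBE-weighted dose substituted for the physical dose when RBE enhancement is included:

```latex
% Linear-quadratic biologically effective dose (standard form);
% RBE enhancement enters by replacing d with the RBE-weighted dose.
\[
\mathrm{BED} = n\,d\left(1 + \frac{d}{\alpha/\beta}\right),
\qquad
d \;\rightarrow\; \mathrm{RBE}\times d \quad \text{(RBE-weighted dose)}
\]
```

With this substitution, a low-energy source with RBE above 1 raises BED for a given physical dose, which is why the target's BED coverage (89.9%) in the study sits above its physical-dose coverage (66.7%) despite the heterogeneity losses.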
NASA Astrophysics Data System (ADS)
Shi, Chengyu; Guo, Bingqi; Cheng, Chih-Yao; Eng, Tony; Papanikolaou, Nikos
2010-09-01
A low-energy electronic brachytherapy source (EBS), the model S700 Axxent™ x-ray device developed by Xoft Inc., has been used in high dose rate (HDR) intracavitary accelerated partial breast irradiation (APBI) as an alternative to an Ir-192 source. The prescription dose and delivery schema of the electronic brachytherapy APBI plan are the same as the Ir-192 plan. However, due to its lower mean energy than the Ir-192 source, an EBS plan has dosimetric and biological features different from an Ir-192 source plan. Current brachytherapy treatment planning methods may have large errors in treatment outcome prediction for an EBS plan. Two main factors contribute to the errors: the dosimetric influence of tissue heterogeneities and the enhancement of relative biological effectiveness (RBE) of electronic brachytherapy. This study quantified the effects of these two factors and revisited the plan quality of electronic brachytherapy APBI. The influence of tissue heterogeneities is studied by a Monte Carlo method and heterogeneous 'virtual patient' phantoms created from CT images and structure contours; the effect of RBE enhancement in the treatment outcome was estimated by biologically effective dose (BED) distribution. Ten electronic brachytherapy APBI cases were studied. The results showed that, for electronic brachytherapy cases, tissue heterogeneities and patient boundary effect decreased dose to the target and skin but increased dose to the bones. On average, the target dose coverage PTV V100 reduced from 95.0% in water phantoms (planned) to only 66.7% in virtual patient phantoms (actual). The actual maximum dose to the ribs is 3.3 times higher than the planned dose; the actual mean dose to the ipsilateral breast and maximum dose to the skin were reduced by 22% and 17%, respectively. 
Combining the effects of tissue heterogeneities and RBE enhancement, BED coverage of the target was 89.9% in virtual patient phantoms with RBE enhancement (actual BED), as compared to 95.2% in water phantoms without RBE enhancement (planned BED). About a 10% increase in the source output is required to raise the BED PTV V100 to 95%. In conclusion, the composite effect of dose reduction in the target due to heterogeneities and of RBE enhancement results in a net 5.3% loss of target BED coverage for electronic brachytherapy; an increase of about 10% in the source output may therefore be necessary to achieve target coverage higher than 95%.
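The BED metric used in this study is conventionally obtained from the linear-quadratic model. As a sketch of the standard form only (the abstract does not state the α/β value, the fractionation scheme, or exactly how RBE was folded in, and HDR brachytherapy BED models typically add dose-rate and repair corrections not shown here):

```latex
\mathrm{BED} = n\,d\left(1 + \frac{d}{\alpha/\beta}\right)
```

where n is the number of fractions and d the physical dose per fraction; one common way to include RBE enhancement is to replace d by RBE·d before evaluating the expression.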
Muon Acceleration Concepts for NuMAX: "Dual-use" Linac and "Dogbone" RLA
Bogacz, S. A.
2018-02-01
In this paper, we summarize the current state of a concept for muon acceleration aimed at a future Neutrino Factory. The main thrust of these studies was to reduce the overall cost while maintaining performance by exploring the interplay between the complexity of the cooling systems and the acceptance of the accelerator complex. To ensure adequate survival for the short-lived muons, acceleration must occur at high average gradient. The need for large transverse and longitudinal acceptances drives the design of the acceleration system to an initially low RF frequency, e.g., 325 MHz, which is then increased to 650 MHz as the transverse size shrinks with increasing energy. High-gradient normal conducting RF cavities at these frequencies require extremely high peak-power RF sources. Hence superconducting RF (SRF) cavities are chosen. Finally, we consider two cost effective schemes for accelerating muon beams for a stageable Neutrino Factory: exploration of the so-called "dual-use" linac concept, where the same linac structure is used for acceleration of both H - and muons and, alternatively, an SRF-efficient design based on a multi-pass (4.5) "dogbone" RLA, extendable to multi-pass FFAG-like arcs.
Fu, Feng; Nowak, Martin A.; Bonhoeffer, Sebastian
2015-01-01
Acquired resistance is one of the major barriers to successful cancer therapy. The development of resistance is commonly attributed to genetic heterogeneity. However, heterogeneity of drug penetration of the tumor microenvironment, both on the microscopic level within solid tumors and on the macroscopic level across metastases, may also contribute to acquired drug resistance. Here we use mathematical models to investigate the effect of drug heterogeneity on the probability of escape from treatment and the time to resistance. Specifically, we address scenarios with sufficiently potent therapies that suppress growth of all preexisting genetic variants in the compartment with the highest possible drug concentration. To study the joint effect of drug heterogeneity, growth rate, and evolution of resistance, we analyze a multi-type stochastic branching process describing growth of cancer cells in multiple compartments with different drug concentrations and limited migration between compartments. We show that resistance is likely to arise first in the sanctuary compartment with poor drug penetration and from there populate non-sanctuary compartments with high drug concentrations. Moreover, we show that spatial heterogeneity accelerates resistance evolution only below a threshold rate of cell migration; excessively high migration rates instead deter drug resistance. Our results provide new insights into why cancers tend to become resistant quickly, and show that cell migration and the presence of sanctuary sites with little drug exposure are essential to this end. PMID:25789469
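The sanctuary-seeding effect described above can be illustrated with a deliberately simplified deterministic (mean-field) sketch rather than the paper's multi-type stochastic branching process; all parameter values and the function name below are arbitrary illustrations:

```python
def simulate(migration, mutation=1e-4, growth=0.08, kill=0.10, steps=400):
    """Toy two-compartment model of resistance evolution.
    Compartment 0 is a drug sanctuary (net growth of sensitive cells);
    compartment 1 has high drug concentration (sensitive cells are killed).
    Resistant cells grow in both compartments. Returns the final resistant
    population in the high-drug compartment."""
    S = [1000.0, 1000.0]  # drug-sensitive cells per compartment
    R = [0.0, 0.0]        # resistant cells per compartment
    for _ in range(steps):
        # growth/kill plus mutation of sensitive cells to resistance
        S[0] *= (1 + growth - mutation)
        R[0] = R[0] * (1 + growth) + mutation * S[0]
        S[1] *= (1 - kill - mutation)
        R[1] = R[1] * (1 + growth) + mutation * S[1]
        # symmetric migration: net flow from the larger population
        for pop in (S, R):
            flow = migration * (pop[0] - pop[1])
            pop[0] -= flow
            pop[1] += flow
    return R[1]
```

In this toy model, resistance in the high-drug compartment is dominated by migrants from the sanctuary, so `simulate(0.01)` yields a far larger resistant population there than `simulate(0.0)`. The deterrent effect of excessively high migration rates reported in the paper is a property of the full stochastic treatment and is not reproduced by this mean-field sketch.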
NASA Astrophysics Data System (ADS)
Velasquez, Camilo S.; Pimenta, Egnalda P. S.; Lins, Vanessa F. C.
2018-05-01
This work evaluates the corrosion resistance of galvanized steel treated with tricationic phosphate and zirconium conversion coating after painting, by using electrochemical techniques, accelerated and field corrosion tests. A non-uniform and heterogeneous distribution of zirconium on the steel surface was observed due to preferential nucleation of the zirconium on the aluminum-rich sites on the surface of galvanized steel. The long-term anti-corrosion performance in a saline solution was better for the phosphate coating up to 120 days. The coating capacitance registered a higher increase for the zirconium coatings than the phosphate coatings up to 120 days of immersion. This result agrees with the higher porosity of zirconium coating in relation to the phosphate coating. After 3840 h of accelerated corrosion test, and after 1 year of accelerated field test, zirconium-treated samples showed an average scribe delamination length higher than the phosphate-treated samples.
NASA Astrophysics Data System (ADS)
Li, Yubiao; Qian, Gujie; Brown, Paul L.; Gerson, Andrea R.
2017-09-01
Dissolution and oxidation of sulfide minerals play key roles in both acid and metalliferous rock drainage and supergene enrichment. Surface speciation heterogeneity, critical to understanding mechanisms of mineral sulfide dissolution, has to date largely not been considered. To this end, synchrotron scanning photoelectron microscopy (SPEM) was employed to examine freshly fractured and partially dissolved chalcopyrite (CuFeS₂) surfaces (pH 1.0 HClO₄ solution, redox potential 650 mV relative to a standard hydrogen electrode, 75 °C). S²⁻ (bulk), S₂²⁻ and Sₙ²⁻ were found to be present on all samples at varying concentrations. Oxidation was observed to take place heterogeneously at the sub-micron scale. Compared to chalcopyrite partially dissolved for 5 days, extended dissolution to 10 days did not show appreciably enhanced oxidation of surface species; however, surface roughness increased markedly due to the growth/overlap of oxidised sulfur species. On addition of 4 mM iron, both S⁰ and SO₄²⁻ were observed but not SO₃²⁻, indicating that the greater Fe³⁺ activity/concentration promotes heterogeneous sulfur oxidation. On contact of pyrite (FeS₂) with chalcopyrite, significantly greater chalcopyrite surface oxidation was observed than for the other systems examined, with S⁰, SO₃²⁻ and SO₄²⁻ being identified heterogeneously across the surface. It is proposed that chalcopyrite oxidative dissolution is enhanced by increasing its cathodic area, e.g. by contact with pyrite, while increased Fe³⁺ activity/concentration also contributes to increased dissolution rates. The high degree of heterogeneity of these surface products indicates that these surfaces are not passivated by their formation.
These results suggest that chalcopyrite dissolution will be accelerated when in contact with pyrite at solution redox potentials intermediate between the rest potentials of chalcopyrite and pyrite (560 mV and 660 mV, respectively) and/or in iron-rich acidic waters, with resulting enhanced formation of secondary sulfur-containing species and release of copper and iron. This in turn suggests accelerated supergene formation and enhanced metalliferous drainage under these conditions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bogacz, Slawomir Alex
Here, we summarize the current state of a concept for muon acceleration aimed at a future Neutrino Factory. The main thrust of these studies was to reduce the overall cost while maintaining performance by exploring the interplay between the complexity of the cooling systems and the acceptance of the accelerator complex. To ensure adequate survival of the short-lived muons, acceleration must occur at high average gradient. The need for large transverse and longitudinal acceptances drives the design of the acceleration system to an initially low RF frequency, e.g. 325 MHz, which is then increased to 650 MHz as the transverse size shrinks with increasing energy. High-gradient normal conducting RF cavities at these frequencies require extremely high peak-power RF sources. Hence superconducting RF (SRF) cavities are chosen. Here, we considered two cost effective schemes for accelerating muon beams for a stageable Neutrino Factory: exploration of the so-called 'dual-use' linac concept, where the same linac structure is used for acceleration of both H- and muons, and, alternatively, an SRF-efficient design based on a multi-pass (4.5) 'dogbone' RLA, extendable to multi-pass FFAG-like arcs.
Thomsen, Kirsten; Yokota, Takashi; Hasan-Olive, Md Mahdi; Sherazi, Niloofar; Fakouri, Nima Borhan; Desler, Claus; Regnell, Christine Elisabeth; Larsen, Steen; Rasmussen, Lene Juel; Dela, Flemming; Bergersen, Linda Hildegard; Lauritzen, Martin
2018-01-01
Brain aging is accompanied by declining mitochondrial respiration. We hypothesized that mitochondrial morphology and dynamics would reflect this decline. Using hippocampus and frontal cortex of a segmental progeroid mouse model lacking Cockayne syndrome protein B (CSB^m/m) and C57Bl/6 (WT) controls, and comparing young (2-5 months) to middle-aged mice (13-14 months), we found that complex I-linked state 3 respiration (CI) was reduced at middle age in CSB^m/m hippocampus, but not in CSB^m/m cortex or WT brain. In hippocampus of both genotypes, mitochondrial size heterogeneity increased with age. Notably, an inverse correlation between heterogeneity and CI was found in both genotypes, indicating that heterogeneity reflects mitochondrial dysfunction. The ratio between fission and fusion gene expression reflected age-related alterations in mitochondrial morphology but not heterogeneity. Mitochondrial DNA content was lower, and hypoxia-inducible factor 1α mRNA was greater, at both ages in CSB^m/m compared to WT brain. Our findings show that decreased CI and increased mitochondrial size heterogeneity are highly associated and point to declining mitochondrial quality control as an initial event in brain aging. Copyright © 2017 Elsevier Inc. All rights reserved.
Heterogeneous Organo-Catalysis: Sustainable Pathways to Furanics from Biomass
Glucose and fructose are among the most abundant plant-derived materials1 and have been converted into useful building units often used in the drug discovery and polymer architecture.2 Unfortunately, most of these conversions require mineral acids and complex heterogeneous cataly...
Heterogeneous variances in multi-environment yield trials for corn hybrids
USDA-ARS?s Scientific Manuscript database
Recent developments in statistics and computing have enabled much greater levels of complexity in statistical models of multi-environment yield trial data. One particular feature of interest to breeders is simultaneously modeling heterogeneity of variances among environments and cultivars. Our obj...
Improving Design Efficiency for Large-Scale Heterogeneous Circuits
NASA Astrophysics Data System (ADS)
Gregerson, Anthony
Despite increases in logic density, many Big Data applications must still be partitioned across multiple computing devices in order to meet their strict performance requirements. Among the most demanding of these applications is high-energy physics (HEP), which uses complex computing systems consisting of thousands of FPGAs and ASICs to process the sensor data created by experiments at particle accelerators such as the Large Hadron Collider (LHC). Designing such computing systems is challenging due to the scale of the systems, the exceptionally high-throughput and low-latency performance constraints that necessitate application-specific hardware implementations, the requirement that algorithms are efficiently partitioned across many devices, and the possible need to update the implemented algorithms during the lifetime of the system. In this work, we describe our research to develop flexible architectures for implementing such large-scale circuits on FPGAs. In particular, this work is motivated by (but not limited in scope to) high-energy physics algorithms for the Compact Muon Solenoid (CMS) experiment at the LHC. To make efficient use of logic resources in multi-FPGA systems, we introduce Multi-Personality Partitioning, a novel form of the graph partitioning problem, and present partitioning algorithms that can significantly improve resource utilization on heterogeneous devices while also reducing inter-chip connections. To reduce the high communication costs of Big Data applications, we also introduce Information-Aware Partitioning, a partitioning method that analyzes the data content of application-specific circuits, characterizes their entropy, and selects circuit partitions that enable efficient compression of data between chips. We employ our information-aware partitioning method to improve the performance of the hardware validation platform for evaluating new algorithms for the CMS experiment.
Together, these research efforts help to improve the efficiency and decrease the cost of developing the large-scale, heterogeneous circuits needed to enable large-scale applications in high-energy physics and other important areas.
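As a minimal illustration of the partitioning objective described above (balanced parts with few inter-chip connections), here is a toy greedy swap heuristic; it is not the Multi-Personality or Information-Aware algorithm from the dissertation, and the example graph is hypothetical:

```python
def greedy_bipartition(nodes, edges):
    """Split a circuit graph into two equally sized parts while greedily
    reducing the cut (number of edges crossing between parts, i.e.
    inter-chip connections). Pairwise swaps preserve the balance."""
    part = {n: i % 2 for i, n in enumerate(nodes)}  # balanced starting split

    def cut_size():
        return sum(1 for a, b in edges if part[a] != part[b])

    improved = True
    while improved:
        improved = False
        for a in nodes:
            for b in nodes:
                if part[a] == part[b]:
                    continue
                before = cut_size()
                part[a], part[b] = part[b], part[a]  # try swapping a and b
                if cut_size() < before:
                    improved = True                   # keep the better split
                else:
                    part[a], part[b] = part[b], part[a]  # revert
    return part, cut_size()
```

For a four-node "netlist" with edges a-b, c-d, a-c, the heuristic settles on the partition {a, b} | {c, d} with a single cut edge; production partitioners add device heterogeneity, resource constraints, and far better search, which is precisely the gap the dissertation's methods address.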
Pescosolido, Bernice A.; Martin, Jack K.
2016-01-01
Since the beginning of the twenty-first century, research on stigma has continued. Building on conceptual and empirical work, the recent period clarifies new types of stigmas, expansion of measures, identification of new directions, and increasingly complex levels. Standard beliefs have been challenged, the relationship between stigma research and public debates reconsidered, and new scientific foundations for policy and programs suggested. We begin with a summary of the most recent Annual Review articles on stigma, which reminded sociologists of conceptual tools, informed them of developments from academic neighbors, and claimed findings from the early period of “resurgence.” Continued (even accelerated) progress has also revealed a central problem. Terms and measures are often used interchangeably, leading to confusion and decreasing accumulated knowledge. Drawing from this work but focusing on the past 14 years of stigma research (including mental illness, sexual orientation, HIV/AIDS, and race/ethnicity), we provide a theoretical architecture of concepts (e.g., prejudice, experienced/received discrimination), drawn together through a stigma process (i.e., stigmatization), based on four theoretical premises. Many characteristics of the mark (e.g., discredited, concealable) and variants (i.e., stigma types and targets) become the focus of increasingly specific and multidimensional definitions. Drawing from complex and systems science, we propose a stigma complex, a system of interrelated, heterogeneous parts bringing together insights across disciplines to provide a more realistic and complicated sense of the challenge facing research and change efforts. The Framework Integrating Normative Influences on Stigma (FINIS) offers a multilevel approach that can be tailored to stigmatized statuses. 
Finally, we outline challenges for the next phase of stigma research, with the goal of continuing scientific activity that enhances our understanding of stigma and builds the scientific foundation for efforts to reduce intolerance. PMID:26855471
The Effect of Accelerated Mathematics Instruction on Heterogeneous Groups of Sixth Grade Students
ERIC Educational Resources Information Center
Nance, Wendy J.
2013-01-01
The United States currently lags behind globally in the areas of math and science. In order to compete and meet the skills necessary for the future workforce, it has become necessary to seek out instructional strategies that will increase student achievement in those academic areas. With the wide variety of diversity occurring in public schools…
Heterogeneous Oxidation of Laboratory-generated Mixed Composition and Biomass Burning Particles
NASA Astrophysics Data System (ADS)
Lim, C. Y.; Sugrue, R. A.; Hagan, D. H.; Cappa, C. D.; Kroll, J. H.; Browne, E. C.
2016-12-01
Heterogeneous oxidation of organic aerosol (OA) can significantly transform the chemical and physical properties of particulate matter in the atmosphere, leading to changes to the chemical composition of OA and potential volatilization of organic compounds. It has become increasingly apparent that the heterogeneous oxidation kinetics of OA depend on the phase and morphology of the particles. However, most laboratory experiments to date have been performed on single-component, purely organic precursors, which may exhibit fundamentally different behavior than more complex particles in the atmosphere. Here we present laboratory studies of the heterogeneous oxidation of two more complex chemical systems: thin, organic coatings on inorganic seed particles and biomass burning OA. In the first system, squalane (C₃₀H₆₂), a model compound for reduced OA, is coated onto dry ammonium sulfate particles at various thicknesses (10-20 nm) and exposed to hydroxyl radical (OH) in a flow tube reactor. In the second, we use a semi-batch reactor to study the heterogeneous OH-initiated oxidation of biomass burning particles as a part of the 2016 FIREX campaign in Missoula, MT. The resulting changes in chemical composition are monitored with an Aerodyne High Resolution Time-of-flight Aerosol Mass Spectrometer (AMS) and a soot-particle AMS for the non-refractory and refractory systems, respectively. We show that the heterogeneous oxidation kinetics of these multicomponent particles are substantially different from those of single-component particles. The oxidation of organic coatings is rapid, undergoing dramatic changes to carbon oxidation state and losing a significant amount of organic mass after relatively low OH exposures (equivalent to several days of atmospheric processing). In the case of biomass burning particles, the kinetics are complex, with different components (inferred by aerosol mass spectrometry) undergoing oxidation at different rates.
Khodabandeloo, Babak; Melvin, Dyan; Jo, Hongki
2017-01-01
Direct measurements of external forces acting on a structure are infeasible in many cases. The Augmented Kalman Filter (AKF) has several attractive features that can be utilized to solve the inverse problem of identifying applied forces, as it requires the dynamic model and the measured responses of the structure at only a few locations. However, the AKF intrinsically suffers from numerical instabilities when accelerations, which are the most common response measurements in structural dynamics, are the only measured responses. Although displacement measurements can be used to overcome the instability issue, absolute displacement measurements are challenging and expensive for full-scale dynamic structures. In this paper, a reliable model-based data fusion approach to reconstruct dynamic forces applied to structures using heterogeneous structural measurements (i.e., strains and accelerations) in combination with the AKF is investigated. The way of incorporating multi-sensor measurements into the AKF is formulated. The formulation is then implemented and validated through numerical examples considering possible uncertainties in numerical modeling and sensor measurement. A planar truss example was chosen to clearly explain the formulation, while the method and formulation are applicable to other structures as well. PMID:29149088
Cao, Zhipeng; Oh, Sukhoon; Otazo, Ricardo; Sica, Christopher T.; Griswold, Mark A.; Collins, Christopher M.
2014-01-01
Purpose: To introduce a novel compressed sensing reconstruction method to accelerate proton resonance frequency (PRF) shift temperature imaging for MRI-induced radiofrequency (RF) heating evaluation. Methods: A compressed sensing approach that exploits sparsity of the complex difference between post-heating and baseline images is proposed to accelerate PRF temperature mapping. The method exploits the intra- and inter-image correlations to promote sparsity and remove shared aliasing artifacts. Validations were performed on simulations and retrospectively undersampled data acquired in ex vivo and in vivo studies by comparing performance with previously proposed techniques. Results: The proposed complex-difference-constrained compressed sensing reconstruction method improved the reconstruction of smooth and local PRF temperature change images compared to various available reconstruction methods in a simulation study, a retrospective study with heating of a human forearm in vivo, and a retrospective study with heating of a sample of beef ex vivo. Conclusion: Complex-difference-based compressed sensing with utilization of a fully sampled baseline image improves the reconstruction accuracy for accelerated PRF thermometry. It can be used to improve the volumetric coverage and temporal resolution in evaluation of RF heating due to MRI, and may help facilitate and validate temperature-based methods for safety assurance. PMID:24753099
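The central sparsity assumption, that the complex difference between post-heating and baseline images is nonzero only where heating occurred, can be sketched as a single shrinkage (soft-thresholding) step on that difference; this is an illustrative fragment under that assumption, not the paper's full iterative reconstruction:

```python
import numpy as np

def soft_threshold_complex_difference(post, baseline, lam):
    """Shrink the magnitude of (post - baseline), which is sparse because
    only the heated region changes the PRF phase, then add the baseline
    back. One proximal step of the kind used to promote complex-difference
    sparsity in compressed sensing reconstructions."""
    diff = post - baseline
    mag = np.abs(diff)
    # magnitude soft-thresholding, preserving the phase of the difference
    shrunk = np.maximum(mag - lam, 0.0) * np.exp(1j * np.angle(diff))
    return baseline + shrunk
```

Applied to a toy 4-pixel image where only one pixel acquired a large phase change, the step keeps that pixel's (shrunken) difference and zeros out sub-threshold noise, which is why small aliasing artifacts shared between the two images are suppressed.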
NASA Astrophysics Data System (ADS)
Dergunov, Alexander D.; Shabrova, Elena V.; Dobretsov, Gennady E.
2010-03-01
To investigate the influence of lipid unsaturation and neutral lipid on the maturation of high density lipoproteins, discoidal complexes of apoA-I, phosphatidylcholine and cholesteryl ester (CE) were prepared. Saturated dipalmitoylphosphatidylcholine (DPPC), unsaturated palmitoyllinoleoylphosphatidylcholine (PLPC) and palmitoyloleoylphosphatidylcholine (POPC), and the fluorescent probe cholesteryl 1-pyrenedecanoate (CPD), which forms, in a diffusion- and concentration-dependent manner, a short-lived dimer of unexcited and excited molecules (an excimer), were used. The apoA-I/DPPC/CPD complexes were heterogeneous in size, composition and probe location. CPD molecules incorporated more efficiently into larger complexes and accumulated in the central part of the discs. The apoA-I/POPC(PLPC)/CPD complexes were also heterogeneous; however, probe molecules distributed preferentially into smaller complexes and accumulated at the disc periphery. The kinetics of CPD transfer by recombinant cholesteryl ester transfer protein (CETP) to human plasma LDL is well described by a two-exponential decay, the fast component with a shorter transfer time being more populated in PLPC compared to DPPC complexes. The presence of CE molecules in discoidal HDL results in particle heterogeneity. ApoA-I influences CETP activity by modulating the properties of the apolipoprotein-phospholipid interface. This may include accumulation of CE molecules in the boundary lipid in unsaturated phosphatidylcholine and cluster formation in the bulk bilayer in saturated phosphatidylcholine.
Compact RF ion source for industrial electrostatic ion accelerator
NASA Astrophysics Data System (ADS)
Kwon, Hyeok-Jung; Park, Sae-Hoon; Kim, Dae-Il; Cho, Yong-Sub
2016-02-01
The Korea Multi-purpose Accelerator Complex is developing a single-ended electrostatic ion accelerator to irradiate materials with gaseous ions, such as hydrogen and nitrogen, for industrial applications. An ELV-type high-voltage power supply has been selected. Because of the limited space and electrical power, and the need for robust operation, a 200 MHz RF ion source has been developed. In this paper, the accelerator system, the test stand of the ion source, and its test results are described.
Commissioning for the European XFEL facility
NASA Astrophysics Data System (ADS)
Nölle, D.
2017-06-01
The European XFEL is a 4th-generation light source based on the Self-Amplified Spontaneous Emission (SASE) free-electron laser concept. It is currently being commissioned in northern Germany. The core installation is a 17.5 GeV superconducting accelerator driving 3 SASE lines with photon energies from 1 keV to beyond 20 keV, with a maximum of 27,000 pulses per second. The international facility is organized as a limited liability company with shareholders from the contributing countries. DESY has taken over the leadership of the accelerator construction consortium and will be in charge of the operation of the accelerator complex. The facility was set up with contributions from the 11 shareholder countries, either as hardware systems and/or staff, or as cash contributions. The construction is almost complete, and the commissioning phase started at the end of 2015. This contribution will report the status of the accelerator complex, with emphasis on the commissioning of the accelerator and an outlook to the commissioning of the SASE 1 FEL line.
NASA Astrophysics Data System (ADS)
Coffey, G. L.; Savage, H. M.; Polissar, P. J.; Rowe, C. D.
2017-12-01
Faults are generally heterogeneous along-strike, with changes in thickness and structural complexity that should influence coseismic slip. However, observational limitations (e.g. limited outcrop or borehole samples) can obscure this complexity. Here we investigate the heterogeneity of frictional heating determined from biomarker thermal maturity and microstructural observations along a well-exposed fault to understand whether coseismic stress and frictional heating are related to structural complexity. We focus on the Muddy Mountain thrust, Nevada, a Sevier-age structure that has continuous exposure of its fault core and considerable structural variability for up to 50 m, to explore the distribution of earthquake slip and temperature rise along strike. We present new biomarker thermal maturity results that capture the heating history of fault rocks. Biomarkers are organic molecules produced by living organisms and preserved in the rock record. During heating, their structure is altered systematically with increasing time and temperature. Preliminary results show significant variability in thermal maturity along-strike at the Muddy Mountain thrust, suggesting differences in coseismic temperature rise on the meter scale. Temperatures upwards of 500°C were generated in the principal slip zone at some locations, while in others, no significant temperature rise occurred. These results demonstrate that stress or slip heterogeneity occurred along the Muddy Mountain thrust at the meter-scale and considerable along-strike complexity existed, highlighting the importance of careful interpretation of whole-fault behavior from observations at a single point on a fault.
van Rooyen, Jason M; Murat, Jean-Benjamin; Hammoudi, Pierre-Mehdi; Kieffer-Jaquinod, Sylvie; Coute, Yohann; Sharma, Amit; Pelloux, Hervé; Belrhali, Hassan; Hakimi, Mohamed-Ali
2014-01-01
In Toxoplasma gondii, as in other eukaryotes, a subset of the amino-acyl-tRNA synthetases are arranged into an abundant cytoplasmic multi-aminoacyl-tRNA synthetase (MARS) complex. Through a series of genetic pull-down assays, we have identified the enzymes of this complex as: methionyl-, glutaminyl-, glutamyl-, and tyrosyl-tRNA synthetases, and we show that the N-terminal GST-like domain of a partially disordered hybrid scaffold protein, Tg-p43, is sufficient for assembly of the intact complex. Our gel filtration studies revealed significant heterogeneity in the size and composition of isolated MARS complexes. By targeting the tyrosyl-tRNA synthetases subunit, which was found exclusively in the complete 1 MDa complex, we were able to directly visualize MARS particles in the electron microscope. Image analyses of the negative stain data revealed the observed heterogeneity and instability of these complexes to be driven by the intrinsic flexibility of the domain arrangements within the MARS complex. These studies provide unique insights into the assembly of these ubiquitous but poorly understood eukaryotic complexes.
Single-cell sequencing and tumorigenesis: improved understanding of tumor evolution and metastasis.
Ellsworth, Darrell L; Blackburn, Heather L; Shriver, Craig D; Rabizadeh, Shahrooz; Soon-Shiong, Patrick; Ellsworth, Rachel E
2017-12-01
Extensive genomic and transcriptomic heterogeneity in human cancer often negatively impacts treatment efficacy and survival, thus posing a significant ongoing challenge for modern treatment regimens. State-of-the-art DNA- and RNA-sequencing methods now provide high-resolution genomic and gene expression portraits of individual cells, facilitating the study of complex molecular heterogeneity in cancer. Important developments in single-cell sequencing (SCS) technologies over the past 5 years provide numerous advantages over traditional sequencing methods for understanding the complexity of carcinogenesis, but significant hurdles must be overcome before SCS can be clinically useful. In this review, we: (1) highlight current methodologies and recent technological advances for isolating single cells, single-cell whole-genome and whole-transcriptome amplification using minute amounts of nucleic acids, and SCS, (2) summarize research investigating molecular heterogeneity at the genomic and transcriptomic levels and how this heterogeneity affects clonal evolution and metastasis, and (3) discuss the promise for integrating SCS in the clinical care arena for improved patient care.
Netherton, Tucker; Li, Yuting; Nitsch, Paige; Shaitelman, Simona; Balter, Peter; Gao, Song; Klopp, Ann; Muruganandham, Manickam; Court, Laurence
2018-06-01
Using a new linear accelerator with high dose rate (800 MU/min), fast MLC motions (5.0 cm/s), fast gantry rotation (15 s/rotation), and 1 cm wide MLCs, we aimed to quantify the effects of complexity, arc number, and fractionation on interplay for breast and lung treatments under target motion. To study lung interplay, eight VMAT plans (1-6 arcs) and four nine-field sliding-window IMRT plans varying in complexity were created. For the breast plans, four four-field sliding-window IMRT plans were created. Using the Halcyon 1.0 linear accelerator, each plan was delivered five times under sinusoidal breathing motion to a phantom with 20 implanted MOSFET detectors; MOSFET dose (cGy), delivery time, and MU/cGy values were recorded. Maximum and mean dose deviations were calculated from the MOSFET data. The likelihood that at least 19 of the 20 detectors agreed with their expected dose within 5% per fraction was calculated across 10^6 iterations to model dose deviation as a function of fraction number for all plan variants. To put the interplay plans into clinical context, additional IMRT and VMAT plans were created and delivered for the sites of head and neck, prostate, whole brain, breast, pelvis, and lung. Average modulation and interplay effect were compared to those from conventional linear accelerators, as reported in previous studies. The mean beam modulation for plans created for the Halcyon 1.0 linear accelerator was 2.9 MU/cGy (two- to four-field IMRT breast plans), 6.2 MU/cGy (at least five-field IMRT), and 3.6 MU/cGy (four-arc VMAT). To achieve treatment plan objectives, Halcyon 1.0 VMAT plans require more arcs and modulation than VMAT on conventional linear accelerators. Maximum and mean dose deviations increased with increasing plan complexity under tumor motion for breast and lung treatments. For VMAT plans under motion, maximum and mean dose deviations were higher for one arc than for two arcs regardless of plan complexity.
For plan variants with maximum dose deviations greater than 3.7%, the convergence of dose deviation with fraction number was protracted. For treatments on the Halcyon 1.0 linear accelerator, dose deviation converged with fraction number more slowly than reported for conventional linear accelerators. However, if plan complexity is reduced for IMRT and tumor motion is less than ~10 mm, interplay is greatly reduced. To minimize dose deviations across multiple fractions for dynamic targets, we recommend limiting treatment plan complexity and avoiding one-arc VMAT on the Halcyon 1.0 linear accelerator when interplay is a concern. © 2018 American Association of Physicists in Medicine.
Super-resolved Parallel MRI by Spatiotemporal Encoding
Schmidt, Rita; Baishya, Bikash; Ben-Eliezer, Noam; Seginer, Amir; Frydman, Lucio
2016-01-01
Recent studies described an alternative “ultrafast” scanning method based on spatiotemporal encoding (SPEN) principles. SPEN demonstrates numerous potential advantages over EPI-based alternatives, at no additional expense in experimental complexity. An important step that SPEN must still take to provide a competitive acquisition alternative is exploiting parallel imaging algorithms without compromising its proven capabilities. The present work introduces a combination of multi-band frequency-swept pulses that simultaneously encode multiple partial fields-of-view, together with a new algorithm merging super-resolved SPEN image reconstruction with SENSE multiple-receiver methods. The ensuing approach enables one to reduce both the excitation and acquisition times of ultrafast SPEN acquisitions by the customary acceleration factor R, without compromising the ensuing spatial resolution, SAR deposition, or the capability to operate in multi-slice mode. The performance of these new single-shot imaging sequences and their ancillary algorithms was explored on phantoms and human volunteers at 3T. The gains of the parallelized approach were particularly evident when dealing with heterogeneous systems subject to major T2/T2* effects, as is the case in single-scan imaging near tissue/air interfaces. PMID:24120293
Advanced glycation end-products: a review.
Singh, R; Barden, A; Mori, T; Beilin, L
2001-02-01
Advanced glycation end-products are a complex and heterogeneous group of compounds that have been implicated in diabetes-related complications. At present it is not known whether they are the cause or the consequence of the complications observed. We discuss the chemistry of advanced glycation end-product formation and their patho-biochemistry, particularly in relation to the diabetic microvascular complications of retinopathy, neuropathy, and nephropathy, as well as their role in the accelerated vasculopathy observed in diabetes. The concept of carbonyl stress as a cause of advanced glycation end-product toxicity is mentioned. We discuss alterations in the concentrations of advanced glycation end-products in the body, particularly in relation to changes occurring with age, with diabetes, and with its complications such as nephropathy. Problems relating to current methods of advanced glycation end-product detection and measurement are highlighted, including the lack of a universally established method of detection or unit of measurement. Agents used for the treatment of advanced glycation end-product accumulation are reviewed, with an emphasis on the results of the recent phase III trials of aminoguanidine in diabetes-related complications.
NMR study of xenotropic murine leukemia virus-related virus protease in a complex with amprenavir
DOE Office of Scientific and Technical Information (OSTI.GOV)
Furukawa, Ayako; Okamura, Hideyasu; Morishita, Ryo
2012-08-24
Highlights:
- The protease (PR) of xenotropic murine leukemia virus-related virus (XMRV) was successfully synthesized with a cell-free system.
- The interface of XMRV PR with an inhibitor, amprenavir (APV), was identified with NMR.
- Structural heterogeneity is induced in the two PR protomers in the APV:PR = 1:2 complex.
- Structural heterogeneity is transmitted even to regions distant from the interface.
- Long-range transmission of structural change may be utilized for drug discovery.
Abstract: Xenotropic murine leukemia virus-related virus (XMRV) is a virus created through recombination of two murine leukemia proviruses under artificial conditions during the passage of human prostate cancer cells in athymic nude mice. The homodimeric protease (PR) of XMRV plays a critical role in the production of functional viral proteins and is a prerequisite for viral replication. We synthesized XMRV PR using the wheat germ cell-free expression system and carried out structural analysis of XMRV PR in a complex with an inhibitor, amprenavir (APV), by means of NMR. Five different combinatorially 15N-labeled samples were prepared, and backbone resonance assignments were made by applying Otting's method, with which the amino acid types of the [1H, 15N] HSQC resonances were automatically identified using the five samples (Wu et al., 2006). A titration experiment involving APV revealed that one APV molecule binds to one XMRV PR dimer. For many residues, two distinct resonances were observed, which is thought to reflect structural heterogeneity between the two protomers in the APV:XMRV PR = 1:2 complex. PR residues at the interface with APV were identified on the basis of chemical shift perturbation and identification of intermolecular NOEs by filtered NOE experiments.
Interestingly, chemical shift heterogeneity between the two protomers of XMRV PR was observed not only at the interface with APV but also in regions apart from the interface. This indicates that the structural heterogeneity induced by the asymmetry of APV binding to the XMRV PR dimer is transmitted to distant regions. This contrasts with the APV:HIV-1 PR complex, in which the structural heterogeneity is localized at the interface. The long-range transmission of structural change identified for the XMRV PR complex might be utilized for the discovery of a new type of drug.
Wang, M.; Holmes-Davis, R.; Rafinski, Z.; Jedrzejewska, B.; Choi, K. Y.; Zwick, M.; Bupp, C.; Izmailov, A.; Paczkowski, J.; Warner, B.; Koshinsky, H.
2009-01-01
In many settings, molecular testing is needed but unavailable due to complexity and cost. Simple, rapid, and specific DNA detection technologies would provide important alternatives to existing detection methods. Here we report a novel, rapid nucleic acid detection method based on the accelerated photobleaching of the light-sensitive cyanine dye 3,3′-diethylthiacarbocyanine iodide (DiSC2(3) I−) in the presence of a target genomic DNA and a complementary peptide nucleic acid (PNA) probe. On the basis of the UV-vis, circular dichroism, and fluorescence spectra of DiSC2(3) with PNA-DNA oligomer duplexes, and on characterization of a product of photolysis of DiSC2(3) I−, a possible reaction mechanism is proposed. We propose that (1) a novel complex forms between dye, PNA, and DNA; (2) this complex functions as a photosensitizer producing singlet oxygen (1O2); and (3) the 1O2 produced promotes photobleaching of dye molecules in the mixture. Similar cyanine dyes (DiSC3(3), DiSC4(3), DiSC5(3), and DiSCpy(3)) interact with preformed PNA-DNA oligomer duplexes but do not show an equivalent accelerated-photobleaching effect in the presence of PNA and target genomic DNA. Development of molecular diagnostic assays based on this accelerated photobleaching (the smartDNA assay), which results from the novel complex formed between DiSC2(3) and PNA-DNA, is under way. PMID:19231844
NASA Astrophysics Data System (ADS)
Cronkite-Ratcliff, C.; Phelps, G. A.; Boucher, A.
2011-12-01
In many geologic settings, the pathways of groundwater flow are controlled by geologic heterogeneities which have complex geometries. Models of these geologic heterogeneities, and consequently, their effects on the simulated pathways of groundwater flow, are characterized by uncertainty. Multiple-point geostatistics, which uses a training image to represent complex geometric descriptions of geologic heterogeneity, provides a stochastic approach to the analysis of geologic uncertainty. Incorporating multiple-point geostatistics into numerical models provides a way to extend this analysis to the effects of geologic uncertainty on the results of flow simulations. We present two case studies to demonstrate the application of multiple-point geostatistics to numerical flow simulation in complex geologic settings with both static and dynamic conditioning data. Both cases involve the development of a training image from a complex geometric description of the geologic environment. Geologic heterogeneity is modeled stochastically by generating multiple equally-probable realizations, all consistent with the training image. Numerical flow simulation for each stochastic realization provides the basis for analyzing the effects of geologic uncertainty on simulated hydraulic response. The first case study is a hypothetical geologic scenario developed using data from the alluvial deposits in Yucca Flat, Nevada. The SNESIM algorithm is used to stochastically model geologic heterogeneity conditioned to the mapped surface geology as well as vertical drill-hole data. Numerical simulation of groundwater flow and contaminant transport through geologic models produces a distribution of hydraulic responses and contaminant concentration results. From this distribution of results, the probability of exceeding a given contaminant concentration threshold can be used as an indicator of uncertainty about the location of the contaminant plume boundary. 
The second case study considers a characteristic lava-flow aquifer system in Pahute Mesa, Nevada. A 3D training image is developed by using object-based simulation of parametric shapes to represent the key morphologic features of rhyolite lava flows embedded within ash-flow tuffs. In addition to vertical drill-hole data, transient pressure head data from aquifer tests can be used to constrain the stochastic model outcomes. The use of both static and dynamic conditioning data allows the identification of potential geologic structures that control hydraulic response. These case studies demonstrate the flexibility of the multiple-point geostatistics approach for considering multiple types of data and for developing sophisticated models of geologic heterogeneities that can be incorporated into numerical flow simulations.
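The exceedance-probability analysis described in the first case study can be sketched in a few lines. The snippet below is our illustrative reconstruction with synthetic data and hypothetical function names, not code from the study: given an ensemble of equally probable simulated concentration fields, it estimates P(concentration > threshold) cell by cell.

```python
# Hedged sketch of the exceedance-probability step: the ensemble, cell
# layout, and function names are illustrative assumptions.
import random

def exceedance_probability(realizations, threshold):
    """realizations: list of equally probable concentration fields
    (each a list of cell values). Returns per-cell exceedance probability."""
    n_cells = len(realizations[0])
    n_real = len(realizations)
    probs = []
    for c in range(n_cells):
        hits = sum(1 for field in realizations if field[c] > threshold)
        probs.append(hits / n_real)
    return probs

# Synthetic demo: 200 realizations of a 5-cell transect with a decaying plume.
random.seed(0)
ensemble = [[max(0.0, 1.0 - 0.2 * c + random.gauss(0, 0.1)) for c in range(5)]
            for _ in range(200)]
p_exceed = exceedance_probability(ensemble, threshold=0.5)
# Cells with p_exceed near 0 or 1 lie confidently outside/inside the plume;
# intermediate values flag uncertainty about the plume boundary location.
```

Each cell's probability is simply the fraction of realizations in which the simulated concentration exceeds the regulatory threshold, which is the uncertainty indicator the abstract describes for the plume boundary.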
NASA Astrophysics Data System (ADS)
Hagen, Stephen J.; Son, Minjun
2017-02-01
Bacterial pathogens rely on chemical signaling and environmental cues to regulate disease-causing behavior in complex microenvironments. The human pathogen Streptococcus mutans employs a particularly complex signaling and sensing scheme to regulate genetic competence and other virulence behaviors in the oral biofilms it inhabits. Individual S. mutans cells make the decision to enter the competent state by integrating chemical and physical cues received from their microenvironment along with endogenously produced peptide signals. Studies at the single-cell level, using microfluidics to control the extracellular environment, provide physical insight into how the cells process these inputs to generate complex and often heterogeneous outputs. Fine changes in environmental stimuli can dramatically alter the behavior of the competence circuit. Small shifts in pH can switch the quorum sensing response on or off, while peptide-rich media appear to switch the output from a unimodal to a bimodal behavior. Therefore, depending on environmental cues, the quorum sensing circuitry can either synchronize virulence across the population, or initiate and amplify heterogeneity in that behavior. Much of this complex behavior can be understood within the framework of a quorum sensing system that can operate both as an intercellular signaling mechanism and intracellularly as a noisy bimodal switch.
Parihar, Sanjay; Pathan, Soyeb; Jadeja, R N; Patel, Anjali; Gupta, Vivek K
2012-01-16
1-Phenyl-3-methyl-4-toluoyl-5-pyrazolone (the ligand) was synthesized and used to prepare an oxovanadium(IV) complex. The complex was characterized by single-crystal X-ray analysis and various spectroscopic techniques. The single-crystal X-ray analysis shows that the ligands are coordinated in a syn configuration to each other and create a distorted octahedral environment around the metal ion. A heterogeneous catalyst comprising the oxovanadium(IV) complex and hydrous zirconia was synthesized, characterized by various physicochemical techniques, and successfully used for the solvent-free oxidation of styrene. The influence of the reaction parameters (percent loading, molar ratio of substrate to H2O2, amount of catalyst, and reaction time) was studied. The catalyst was reused three times without any significant loss in catalytic activity.
A prospective analysis of the role of cognition in three models of aging and schizophrenia.
Cohen, Carl I; Murante, Tessa
2017-07-02
This study uses longitudinal data from a sample of older adults with schizophrenia spectrum disorder (OAS) to examine the role of cognition in three models of aging and schizophrenia (accelerated aging, paradoxical aging, and heterogeneity of course) and their clinical relevance. The sample consisted of 103 community-dwelling persons aged 55 and over (mean = 61 years) with early-onset schizophrenia. Mean follow-up was 52.5 months (range: 12-116 months); 55% were men; 55% were white. We identified 21 potential predictor variables and used the Dementia Rating Scale (DRS) to assess cognition. There were no significant differences in the DRS between baseline (T1) and follow-up (T2). However, 20%, 22%, and 58% of persons exhibited a >0.5 effect-size increase, a >0.5 effect-size decrease, or no change in their DRS scores, respectively; 19% were rapid decliners (>2.11 points/year lost) and 19% were rapid improvers (>2.11 points/year gained). In multivariable analysis, there were three predictors of higher DRS at T2: DRS at T1, decline in anxiety score, and race (white). The heterogeneity model best characterized the trajectory of cognition in later life. The accelerated aging model did not represent typical cognitive trajectories, since most individuals were stable or improved. The heterogeneous trajectories made it difficult to generalize about cognition's role in the paradoxical aging model. Despite the paucity of predictors, our findings suggest that it may be clinically productive to enlist remediation strategies that target anxiety and cognition, and to direct more attention to non-white OAS. Copyright © 2017. Published by Elsevier B.V.
Impact of Degree Heterogeneity on Attack Vulnerability of Interdependent Networks
NASA Astrophysics Data System (ADS)
Sun, Shiwen; Wu, Yafang; Ma, Yilin; Wang, Li; Gao, Zhongke; Xia, Chengyi
2016-09-01
The study of interdependent networks has become a new research focus in recent years. We focus on one fundamental property of interdependent networks: vulnerability. Previous studies mainly examined the impact of topological properties on interdependent networks under random attacks; the effect of degree heterogeneity on structural vulnerability under intentional attacks, however, remains unexplored. To understand the role of degree distribution, and in particular degree heterogeneity, we construct an interdependent system model consisting of two networks whose extent of degree heterogeneity can be controlled simultaneously by a tuning parameter. We also propose a new quantity that better measures the performance of interdependent networks after attack. Numerical simulation results demonstrate that degree heterogeneity can significantly increase the vulnerability of both single and interdependent networks. Moreover, interdependent links between the two networks make the entire system much more fragile to attacks. Enhancing the coupling strength between networks greatly increases the fragility of both networks against targeted attacks, an effect most evident under max-max assortative coupling. These results can deepen understanding of the structural complexity of real-world complex systems.
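The qualitative effect of degree heterogeneity under a targeted attack can be illustrated with a toy comparison. This is our own sketch, not the paper's interdependent-network model or its tuning-parameter construction: it measures the giant-component fraction left after deleting the highest-degree nodes of a homogeneous ring graph versus a maximally heterogeneous hub-and-spoke graph of the same size.

```python
# Toy targeted-attack experiment (illustrative, not the paper's model).
from collections import deque

def giant_component_fraction(adj, removed):
    """Largest connected component size / original node count, after removal."""
    alive = [v for v in adj if v not in removed]
    seen, best = set(), 0
    for start in alive:
        if start in seen:
            continue
        seen.add(start)
        comp, q = 0, deque([start])
        while q:
            u = q.popleft()
            comp += 1
            for w in adj[u]:
                if w not in removed and w not in seen:
                    seen.add(w)
                    q.append(w)
        best = max(best, comp)
    return best / len(adj)

def targeted_attack(adj, n_remove):
    """Remove the n_remove highest-degree nodes, then measure the damage."""
    by_degree = sorted(adj, key=lambda v: len(adj[v]), reverse=True)
    return giant_component_fraction(adj, set(by_degree[:n_remove]))

n = 100
ring = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}  # homogeneous degrees
hub = {0: list(range(1, n))}                              # one high-degree hub
hub.update({i: [0] for i in range(1, n)})

# Removing a single top-degree node shatters the hub graph (0.01 survives)
# but barely dents the ring (0.99 survives).
```

The same intuition carries over to the paper's setting: the more heterogeneous the degree distribution, the more damage a degree-targeted attack inflicts, and interdependent links compound the effect.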
Pore-scale modeling of hydromechanical coupled mechanics in hydrofracturing process
NASA Astrophysics Data System (ADS)
Chen, Zhiqiang; Wang, Moran
2017-05-01
Hydrofracturing is an important technique in the petroleum industry to stimulate well production, yet the mechanism of induced fracture growth is still not fully understood, which results in some unsatisfactory wells even after hydrofracturing treatments. In this work we establish a more accurate numerical framework for hydromechanical coupling, in which solid deformation and fracturing are modeled by the discrete element method and fluid flow is simulated directly at the pore scale by the lattice Boltzmann method. After validation, hydrofracturing is simulated with attention to the effects of strength heterogeneity on fracture geometry and microfailure mechanism. A modified topological index is proposed to quantify the complexity of the fracture geometry. The results show that strength heterogeneity has a significant influence on hydrofracturing. In heterogeneous samples, fracturing proceeds by crack nucleation around the fracture tip and connection of these cracks to the main fracture, usually accompanied by shear failure. In homogeneous samples, by contrast, fracture growth occurs through continuous expansion of the crack, where tensile failure often dominates. It is this fracturing behavior that makes the fracture geometry in heterogeneous samples much more complex than in homogeneous ones. In addition, higher pore pressure leads to more shear failure events in both heterogeneous and homogeneous samples.
A general mechanism for competitor-induced dissociation of molecular complexes
Paramanathan, Thayaparan; Reeves, Daniel; Friedman, Larry J.; Kondev, Jane; Gelles, Jeff
2014-01-01
The kinetic stability of non-covalent macromolecular complexes controls many biological phenomena. Here we find that physical models of complex dissociation predict that competitor molecules will in general accelerate the breakdown of isolated bimolecular complexes by occluding rapid rebinding of the two binding partners. This prediction is largely independent of molecular details. We confirm the prediction with single-molecule fluorescence experiments on a well-characterized DNA strand dissociation reaction. Contrary to common assumptions, competitor-induced acceleration of dissociation can occur in biologically relevant competitor concentration ranges and does not necessarily imply ternary association of competitor with the bimolecular complex. Thus, occlusion of complex rebinding may play a significant role in a variety of biomolecular processes. The results also show that single-molecule colocalization experiments can accurately measure dissociation rates despite their limited spatiotemporal resolution. PMID:25342513
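The rebinding-occlusion argument can be made concrete with a minimal kinetic sketch. This is a toy model of our own with assumed rates, not the authors' physical model: after a microscopic unbinding event the partners sit in an "encounter" state from which they either rebind or diffuse apart, and a competitor that occupies the vacated site blocks the rebinding pathway.

```python
# Toy rebinding-occlusion kinetics (our illustration with assumed rates).
# k_micro:  microscopic unbinding rate of the intact complex
# k_rebind: rebinding rate from the encounter state
# k_escape: rate of diffusing apart from the encounter state
# occ:      probability the competitor occupies the vacated site

def effective_koff(k_micro, k_rebind, k_escape, occ):
    """Observed dissociation rate when rebinding is occluded with prob occ."""
    p_escape = k_escape / (k_escape + k_rebind * (1.0 - occ))
    return k_micro * p_escape

k_micro, k_rebind, k_escape = 1.0, 99.0, 1.0   # rebinding dominates escape
no_comp = effective_koff(k_micro, k_rebind, k_escape, occ=0.0)    # 0.01
full_comp = effective_koff(k_micro, k_rebind, k_escape, occ=1.0)  # 1.00
# The competitor accelerates net dissociation ~100-fold in this toy case
# without ever contacting the intact complex, i.e. no ternary intermediate.
```

The point of the toy numbers is only the qualitative shape: when rebinding is fast relative to escape, even modest occupancy of the vacated site by competitor substantially raises the observed dissociation rate.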
Improvement Plans of Fermilab’s Proton Accelerator Complex
NASA Astrophysics Data System (ADS)
Shiltsev, Vladimir
2017-09-01
The flagship of Fermilab’s long-term research program is the Deep Underground Neutrino Experiment (DUNE), located at the Sanford Underground Research Facility (SURF) in Lead, South Dakota, which will study neutrino oscillations over a baseline of 1300 km. The neutrinos will be produced in the Long-Baseline Neutrino Facility (LBNF), a proposed new beam line from Fermilab’s Main Injector. The physics goals of DUNE require a proton beam with a power of some 2.4 MW at 120 GeV, roughly four times the current maximum power. Here I discuss the current performance of the Fermilab proton accelerator complex, our plans for construction of an SRF proton linac as a key part of the Proton Improvement Plan-II (PIP-II), the main challenges on the path to multi-MW beam power operation of the Fermilab accelerator complex, and the staged plan to achieve the required performance over the next 15 years.
Autism Spectrum Disorders Associated with Chromosomal Abnormalities
ERIC Educational Resources Information Center
Lo-Castro, Adriana; Benvenuto, Arianna; Galasso, Cinzia; Porfirio, Cristina; Curatolo, Paolo
2010-01-01
Autism spectrum disorders (ASDs) constitute a class of severe neurodevelopmental conditions with complex multifactorial and heterogeneous etiology. Despite high estimates of heritability, genetic causes of ASDs remain elusive, due to a high degree of genetic and phenotypic heterogeneity. So far, several "monogenic" forms of autism have been…
Pros and Cons of the Acceleration Scheme (NF-IDS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bogacz, Alex; Bogacz, Slawomir
The overall goals of the acceleration systems, large-acceptance acceleration to 25 GeV and beam shaping, can be accomplished by various fixed-field accelerators at different stages. They involve three superconducting linacs: a single-pass linear Pre-accelerator followed by a pair of multi-pass Recirculating Linear Accelerators (RLA) and finally a non-scaling FFAG ring. The present baseline acceleration scenario has been optimized to take maximum advantage of the appropriate acceleration scheme at each stage. Pros and cons of the various stages are discussed here in detail. The solenoid-based Pre-accelerator offers very large acceptance and facilitates correction of energy gain across the bunch and significant longitudinal compression through induced synchrotron motion. However, far off-crest acceleration reduces the effective acceleration gradient and adds complexity through the requirement of individual RF phase control for each cavity. Close proximity of strong solenoids and superconducting…
Low Level RF Control for the PIP-II Accelerator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edelen, J. P.; Chase, B. E.; Cullerton, E.
The PIP-II accelerator is a proposed upgrade to the Fermilab accelerator complex that will replace the existing 400 MeV room-temperature LINAC with an 800 MeV superconducting LINAC. Part of this upgrade is a new injection scheme into the Booster that places tight requirements on the LLRF control system for the cavities. In this paper we discuss the challenges of the PIP-II accelerator and the present status of the LLRF system for this project.
Matthes, Jochen; Pery, Tal; Gründemann, Stephan; Buntkowsky, Gerd; Sabo-Etienne, Sylviane; Chaudret, Bruno; Limbach, Hans-Heinrich
2004-07-14
Some transition metal complexes are known to catalyze ortho/para hydrogen conversion, hydrogen isotope scrambling, and hydrogenation reactions in liquid solution. Using the example of Vaska's complex, we present here evidence by NMR that the solvent is not necessary for these reactions to occur. Thus, solid frozen solutions or polycrystalline powdered samples of homogeneous catalysts may become heterogeneous catalysts. Comparative liquid- and solid-state studies provide novel insight into the reaction mechanisms.
Network Coding on Heterogeneous Multi-Core Processors for Wireless Sensor Networks
Kim, Deokho; Park, Karam; Ro, Won W.
2011-01-01
While network coding is well known for its efficiency and usefulness in wireless sensor networks, the excessive costs associated with decoding computation and complexity still hinder its adoption in practice. At the same time, high-performance microprocessors with heterogeneous multi-cores are likely to be used as processing nodes of wireless sensor networks in the near future. Accordingly, this paper introduces an efficient network coding algorithm developed for heterogeneous multi-core processors. The proposed idea is fully tested on one of the currently available heterogeneous multi-core processors, the Cell Broadband Engine. PMID:22164053
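The decoding workload that motivates such acceleration can be sketched as random linear network coding. The sketch below is an illustrative simplification of our own, working over GF(2) so that arithmetic reduces to XOR; practical systems, likely including the paper's, typically use larger fields such as GF(2^8). Each coded packet is a random coefficient vector plus the XOR of the selected source packets, and decoding is Gauss-Jordan elimination.

```python
# Illustrative random linear network coding over GF(2) (not the paper's code).
import random

def encode(sources, rng):
    """One coded packet: random GF(2) coefficient vector + XOR payload."""
    coeffs = [rng.randint(0, 1) for _ in sources]
    if not any(coeffs):                      # avoid the useless all-zero packet
        coeffs[rng.randrange(len(coeffs))] = 1
    payload = 0
    for c, s in zip(coeffs, sources):
        if c:
            payload ^= s
    return coeffs, payload

def decode(packets, k):
    """Gauss-Jordan elimination over GF(2); None if packets are rank-deficient."""
    rows = [(c[:], p) for c, p in packets]
    for col in range(k):
        pivot = next((i for i in range(col, len(rows)) if rows[i][0][col]), None)
        if pivot is None:
            return None
        rows[col], rows[pivot] = rows[pivot], rows[col]
        pc, pp = rows[col]
        for i in range(len(rows)):
            if i != col and rows[i][0][col]:
                c, p = rows[i]
                rows[i] = ([a ^ b for a, b in zip(c, pc)], p ^ pp)
    return [rows[i][1] for i in range(k)]

sources = [0x12, 0x34, 0x56]                 # three source "packets"
rng = random.Random(1)
decoded = None
while decoded is None:                       # retry until the random coded
    packets = [encode(sources, rng) for _ in range(3)]  # packets are full rank
    decoded = decode(packets, 3)
# decoded now equals sources
```

The elimination loop is the compute-heavy kernel; over GF(2^8) with real payload sizes it becomes the matrix-style workload that maps well onto the Cell's SPE cores.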
Real-time text extraction based on the page layout analysis system
NASA Astrophysics Data System (ADS)
Soua, M.; Benchekroun, A.; Kachouri, R.; Akil, M.
2017-05-01
Several approaches have been proposed to extract text from scanned documents. However, text extraction from heterogeneous documents remains a real challenge: the text varies in size, style, and orientation, and the background of document regions can be complex. Recently, we proposed the improved hybrid binarization based on Kmeans method (I-HBK) to extract text suitably from heterogeneous documents. In this method, the Page Layout Analysis (PLA) of the Tesseract OCR engine is used to identify text and image regions, and our hybrid binarization is then applied separately to each kind of region. For image regions, gamma correction is employed before processing; for text regions, binarization is performed directly. A foreground and background color study is then performed to correct inverted region colors. Finally, characters are located in the binarized regions using the PLA algorithm. In this work, we extend the integration of the PLA algorithm within the I-HBK method. In addition, to speed up the text/image separation step, we employ an efficient GPU acceleration. Through the performed experiments, we demonstrate the high F-measure accuracy of the PLA algorithm, reaching 95% on the LRDE dataset. We also compare the sequential and parallel PLA versions: the parallel PLA implementation on a GPU GTX 660 achieves a speedup of 3.7x over the CPU version.
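The flavor of Kmeans-based binarization at the heart of such methods can be sketched as a 2-means clustering of pixel intensities. This is a generic sketch of our own, not the authors' I-HBK implementation, which additionally handles gamma correction, color inversion, and region types:

```python
# Generic 2-means intensity binarization sketch (not the I-HBK code).

def kmeans2_threshold(pixels, iters=50):
    """Cluster grayscale values into two groups; threshold at the midpoint
    of the converged cluster centers."""
    c0, c1 = float(min(pixels)), float(max(pixels))   # init at the extremes
    for _ in range(iters):
        g0 = [p for p in pixels if abs(p - c0) <= abs(p - c1)]
        g1 = [p for p in pixels if abs(p - c0) > abs(p - c1)]
        if not g0 or not g1:
            break
        n0, n1 = sum(g0) / len(g0), sum(g1) / len(g1)
        if (n0, n1) == (c0, c1):                      # converged
            break
        c0, c1 = n0, n1
    return (c0 + c1) / 2.0

def binarize(pixels):
    t = kmeans2_threshold(pixels)
    return [0 if p < t else 255 for p in pixels]

# Dark text (~20) on a light background (~200):
row = [18, 22, 19, 201, 198, 205, 21, 199]
bw = binarize(row)   # text pixels map to 0, background to 255
```

Because each pixel's cluster assignment is independent, this inner loop is exactly the kind of data-parallel work that the paper offloads to the GPU.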
NASA Astrophysics Data System (ADS)
Sapra, Karan; Gupta, Saurabh; Atchley, Scott; Anantharaj, Valentine; Miller, Ross; Vazhkudai, Sudharshan
2016-04-01
Efficient resource utilization is critical for improving the end-to-end computing workflow of scientific applications. Heterogeneous node architectures, such as the GPU-enabled Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), present further challenges. In many HPC applications on Titan, the accelerators are the primary compute engines while the CPUs orchestrate the offloading of work onto the accelerators and move the output back to main memory. In applications that do not exploit GPUs, by contrast, CPU usage is dominant while the GPUs sit idle. We utilized the Heterogeneous Functional Partitioning (HFP) runtime framework, which can optimize usage of resources on a compute node to expedite an application's end-to-end workflow. This approach differs from existing in-situ analysis techniques in that it provides a framework for on-the-fly, on-node analysis by dynamically exploiting under-utilized resources. We have implemented in the Community Earth System Model (CESM) a new concurrent diagnostic processing capability enabled by the HFP framework. Various single-variate statistics, such as means and distributions, are computed in situ by launching HFP tasks on the GPU via the node-local HFP daemon. Since our current configuration of CESM does not use GPU resources heavily, we can move these tasks to the GPU using the HFP framework. Each rank running the atmospheric model in CESM pushes the variables of interest via HFP function calls to the HFP daemon. This node-local daemon is responsible for receiving the data from the main program and launching the designated analytics tasks on the GPU. We have implemented these analytics tasks in C and use OpenACC directives to enable GPU acceleration. This methodology is also advantageous when executing GPU-enabled configurations of CESM, where the CPUs will be idle during portions of the runtime.
Our implementation results demonstrate that it is more efficient to use the HFP framework to offload these tasks to GPUs than to perform them in the main application. With the HFP framework we observe increased resource utilization and overall productivity across the end-to-end workflow.
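The kind of single-variate, in-situ statistic described above can be sketched with a streaming (Welford) estimator that a daemon could update as each simulation step pushes new values, instead of storing the full time series. The class and call pattern below are illustrative assumptions of ours; the abstract does not specify the actual HFP API.

```python
# Streaming mean/variance sketch for in-situ diagnostics (illustrative;
# not the HFP API). Welford's update is numerically stable and needs O(1)
# memory per variable, which suits on-node, on-the-fly analysis.

class StreamingStats:
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def push(self, x):
        """Incorporate one new sample as it arrives from the simulation."""
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    @property
    def variance(self):
        """Sample variance of everything pushed so far."""
        return self.m2 / (self.n - 1) if self.n > 1 else 0.0

stats = StreamingStats()
for value in [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]:
    stats.push(value)
# mean -> 5.0, sample variance -> 32/7
```

In a GPU offload setting, each HFP task would run updates like this over a batch of grid points in parallel (the paper's tasks are written in C with OpenACC); the Python form just shows the per-variable logic.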
Marwan, Ahmed I.; Shabeka, Uladzimir; Dobrinskikh, Evgenia
2018-01-01
In this article, we report an up-to-date summary of tracheal occlusion (TO) as an approach to drive accelerated lung growth and review the different maternal- and fetal-derived local and systemic signals and mechanisms that may play a significant biological role in lung growth and the formation of heterogeneous topological zones following TO. Pulmonary hypoplasia is a condition in which branching morphogenesis and embryonic pulmonary vascular development are globally affected; it is classically seen in congenital diaphragmatic hernia. TO is an innovative approach aimed at driving accelerated lung growth in the most severe forms of diaphragmatic hernia and has been shown to improve neonatal outcomes. Currently, most research on mechanisms of TO-induced lung growth is focused on mechanical forces and is viewed from the perspective of homogeneous changes within the lung. We suggest that the key principle in understanding changes in fetal lungs after TO is taking into account the formation of unique, variable topological zones. Following TO, fetal lungs might temporarily resemble a dynamically changing topological mosaic with varying proliferation rates, dissimilar scales of vasculogenesis, diverse patterns of lung tissue damage, a variable metabolic landscape, and different structures. The reasons for this dynamic mosaic pattern may include distinct degrees of increased hydrostatic pressure in different parts of the lung, dissimilar degrees of tissue stress/damage and responses to this damage, and incomparable patterns of altered lung zones with variable response to systemic maternal and fetal factors, among others. The local interaction between these factors and their accompanying processes, together with the potential role of other systemic factors, might lead to the formation of a common vector of biological response unique to each zone.
The study of the interaction between various networks formed after TO (action of mechanical forces, activation of mucosal mast cells, production and secretion of damage-associated molecular pattern substances, low-grade local pulmonary inflammation, and cardiac contraction-induced periodic agitation of lung tissue, among others) will bring us closer to an appreciation of the biological phenomenon of topological heterogeneity within the fetal lungs. PMID:29376042
Multiscale characterization and mechanical modeling of an Al-Zn-Mg electron beam weld
NASA Astrophysics Data System (ADS)
Puydt, Quentin; Flouriot, Sylvain; Ringeval, Sylvain; Parry, Guillaume; De Geuser, Frédéric; Deschamps, Alexis
Welding of precipitation-hardening alloys results in multi-scale microstructural heterogeneities, from the hardening nano-scale precipitates to the micron-scale solidification structures and the component geometry. This heterogeneity produces a complex mechanical response, with gradients in strength, stress triaxiality, and damage initiation sites.
Simulating dispersal of reintroduced species within heterogeneous landscapes
Robert H. Gardner; Eric J. Gustafson
2004-01-01
This paper describes the development and application of a spatially explicit, individual based model of animal dispersal (J-walk) to determine the relative effects of landscape heterogeneity, prey availability, predation risk, and the energy requirements and behavior of dispersing organisms on dispersal success. Significant unknowns exist for the simulation of complex...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bogacz, Alex
We summarize the current state of a concept for muon acceleration aimed at a future Neutrino Factory and extendable to a Higgs Factory. The main thrust of these studies was to reduce the overall cost while maintaining performance by exploring the interplay between the complexity of the cooling systems and the acceptance of the accelerator complex. To ensure adequate survival of the short-lived muons, acceleration must occur at high average gradient. The need for large transverse and longitudinal acceptances drives the design of the acceleration system to an initially low RF frequency, e.g., 325 MHz, which is then increased to 650 MHz as the transverse size shrinks with increasing energy. High-gradient normal-conducting RF cavities at these frequencies would require extremely high peak-power RF sources; hence superconducting RF (SRF) cavities are chosen. We consider an SRF-efficient design based on a multi-pass (4.5) “dogbone” RLA, extendable to multi-pass FFAG-like arcs.
The challenge of turbulent acceleration of relativistic particles in the intra-cluster medium
NASA Astrophysics Data System (ADS)
Brunetti, Gianfranco
2016-01-01
Acceleration of cosmic-ray electrons (CRe) in the intra-cluster medium (ICM) is probed by radio observations that detect diffuse, megaparsec-scale, synchrotron sources in a fraction of galaxy clusters. Giant radio halos are the most spectacular manifestations of non-thermal activity in the ICM and are currently explained by assuming that turbulence, driven during massive cluster-cluster mergers, reaccelerates CRe to energies of several giga-electron volts. This scenario implies a hierarchy of complex mechanisms in the ICM that drain energy from large scales into electromagnetic fluctuations in the plasma and collisionless mechanisms of particle acceleration at much smaller scales. In this paper we focus on the physics of acceleration by compressible turbulence. The spectrum and damping mechanisms of the electromagnetic fluctuations, and the mean free path (mfp) of the CRe, are the most relevant ingredients that determine the efficiency of acceleration. These ingredients in the ICM are, however, poorly known, and we show that calculations of turbulent acceleration are correspondingly sensitive to these uncertainties. On the other hand, this fact implies that the non-thermal properties of galaxy clusters probe the complex microphysics and the weakly collisional nature of the ICM.
Osada, Naoki; Akashi, Hiroshi
2012-01-01
Accelerated rates of mitochondrial protein evolution have been proposed to reflect Darwinian coadaptation for efficient energy production for mammalian flight and brain activity. However, several features of mammalian mtDNA (absence of recombination, small effective population size, and high mutation rate) promote genome degradation through the accumulation of weakly deleterious mutations. Here, we present evidence for "compensatory" adaptive substitutions in nuclear DNA- (nDNA) encoded mitochondrial proteins to prevent fitness decline in primate mitochondrial protein complexes. We show that high mutation rate and small effective population size, key features of primate mitochondrial genomes, can accelerate compensatory adaptive evolution in nDNA-encoded genes. We combine phylogenetic information and the 3D structure of the cytochrome c oxidase (COX) complex to test for accelerated compensatory changes among interacting sites. Physical interactions among mtDNA- and nDNA-encoded components are critical in COX evolution; amino acids in close physical proximity in the 3D structure show a strong tendency for correlated evolution among lineages. Only nuclear-encoded components of COX show evidence for positive selection and adaptive nDNA-encoded changes tend to follow mtDNA-encoded amino acid changes at nearby sites in the 3D structure. This bias in the temporal order of substitutions supports compensatory weak selection as a major factor in accelerated primate COX evolution.
Molecular Active Sites in Heterogeneous Ir-La/C-Catalyzed Carbonylation of Methanol to Acetates.
Kwak, Ja Hun; Dagle, Robert; Tustin, Gerald C; Zoeller, Joseph R; Allard, Lawrence F; Wang, Yong
2014-02-06
We report that when Ir and La halides are deposited on carbon, exposure to CO spontaneously generates a discrete molecular heterobimetallic structure, containing an Ir-La covalent bond that acts as a highly active, selective, and stable heterogeneous catalyst for the carbonylation of methanol to produce acetic acid. This catalyst exhibits a very high productivity of ∼1.5 mol acetyl/mol Ir·s with >99% selectivity to acetyl (acetic acid and methyl acetate) without detectable loss in activity or selectivity for more than 1 month of continuous operation. The enhanced activity can be mechanistically rationalized by the presence of La within the ligand sphere of the discrete molecular Ir-La heterobimetallic structure, which acts as a Lewis acid to accelerate the normally rate-limiting CO insertion in Ir-catalyzed carbonylation. Similar approaches may provide opportunities for attaining molecular (single site) behavior similar to homogeneous catalysis on heterogeneous surfaces for other industrial applications.
Operationalizing Anticipatory Governance
2011-09-01
...ward to known events. They provide a means to test in the mind, or in a virtual setting, what we might otherwise have to try in reality. ... a process that can be used to correct our strategic myopia and secure America's global place in the 21st century. Acceleration and Complexity: Our era ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Venkata, Manjunath Gorentla; Aderholdt, William F
The pre-exascale systems are expected to have a significant amount of hierarchical and heterogeneous on-node memory, and this trend in system architecture is expected to continue into the exascale era. Along with hierarchical-heterogeneous memory, the system typically has a high-performing network and a compute accelerator. This system architecture is not only effective for running traditional High Performance Computing (HPC) applications (Big-Compute), but also for running data-intensive HPC applications and Big-Data applications. As a consequence, there is a growing desire to have a single system serve the needs of both Big-Compute and Big-Data applications. Though the system architecture supports the convergence of Big-Compute and Big-Data, the programming models and software layer have yet to evolve to support either hierarchical-heterogeneous memory systems or the convergence. We present a programming abstraction to address this problem. The programming abstraction is implemented as a software library and runs on pre-exascale and exascale systems supporting current and emerging system architectures. Using distributed data structures as a central concept, it provides (1) a simple, usable, and portable abstraction for hierarchical-heterogeneous memory and (2) a unified programming abstraction for Big-Compute and Big-Data applications.
High Intensity Proton Accelerator Project in Japan (J-PARC).
Tanaka, Shun-ichi
2005-01-01
The High Intensity Proton Accelerator Project, named J-PARC, was started on 1 April 2001 at the Tokai site of JAERI. The accelerator complex of J-PARC consists of three accelerators: a 400 MeV linac, a 3 GeV rapid-cycling synchrotron and a 50 GeV synchrotron; and four major experimental facilities: the Material and Life Science Facility, the Nuclear and Particle Physics Facility, the Nuclear Transmutation Experiment Facility and the Neutrino Facility. An outline of J-PARC is presented along with the current status of construction.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spentzouris, P. (Fermilab); Cary, J.
The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC-1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors. ComPASS is in the first year of executing its plan to develop the next-generation HPC accelerator modeling tools. ComPASS aims to develop an integrated simulation environment that will utilize existing and new accelerator physics modules with petascale capabilities, by employing modern computing and solver technologies. The ComPASS vision is to deliver to accelerator scientists a virtual accelerator and virtual prototyping modeling environment, with the necessary multiphysics, multiscale capabilities. The plan for this development includes delivering accelerator modeling applications appropriate for each stage of the ComPASS software evolution. Such applications are already being used to address challenging problems in accelerator design and optimization.
The ComPASS organization for software development and applications accounts for the natural domain areas (beam dynamics, electromagnetics, and advanced acceleration), and all areas depend on the enabling technologies activities, such as solvers and component technology, to deliver the desired performance and integrated simulation environment. The ComPASS applications focus on computationally challenging problems important for design or performance optimization at all major HEP, NP, and BES accelerator facilities. With the cost and complexity of particle accelerators rising, the use of computation to optimize their designs and find improved operating regimes becomes essential, potentially leading to significant cost savings with modest investment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patchen, D.G.; Hohn, M.E.; Aminian, K.
1993-04-01
The purpose of this research is to develop techniques to measure and predict heterogeneities in oil reservoirs that are the products of complex deposystems. The unit chosen for study is the Lower Mississippian Big Injun sandstone, a prolific oil producer (nearly 60 fields) in West Virginia. This research effort has been designed and is being implemented as an integrated effort involving stratigraphy, structural geology, petrology, seismic study, petroleum engineering, modeling and geostatistics. Sandstone bodies are being mapped within their regional depositional systems, and then sandstone bodies are being classified in a scheme of relative heterogeneity to determine heterogeneity across depositional systems. Facies changes are being mapped within given reservoirs, and the environments of deposition responsible for each facies are being interpreted to predict the inherent relative heterogeneity of each facies. Structural variations will be correlated both with production, where the availability of production data will permit, and with variations in geologic and engineering parameters that affect production. A reliable seismic model of the Big Injun reservoirs in Granny Creek field is being developed to help interpret physical heterogeneity in that field. Pore types are being described and related to permeability, fluid flow and diagenesis, and petrographic data are being integrated with facies and depositional environments to develop a technique to use diagenesis as a predictive tool in future reservoir development. Another objective in the Big Injun study is to determine the effect of heterogeneity on fluid flow and efficient hydrocarbon recovery in order to improve reservoir management. Graphical methods will be applied to Big Injun production data and new geostatistical methods will be developed to detect regional trends in heterogeneity.
Stahel, R; Bogaerts, J; Ciardiello, F; de Ruysscher, D; Dubsky, P; Ducreux, M; Finn, S; Laurent-Puig, P; Peters, S; Piccart, M; Smit, E; Sotiriou, C; Tejpar, S; Van Cutsem, E; Tabernero, J
2015-02-01
Despite intense efforts, the socioeconomic burden of cancer remains unacceptably high and treatment advances for many common cancers have been limited, suggesting a need for a new approach to drug development. One issue central to this lack of progress is the heterogeneity and genetic complexity of many tumours. This results in considerable variability in therapeutic response and requires knowledge of the molecular profile of the tumour to guide appropriate treatment selection for individual patients. While recent advances in the molecular characterisation of different cancer types have the potential to transform cancer treatment through precision medicine, such an approach presents a major economic challenge for drug development, since novel targeted agents may only be suitable for a small cohort of patients. Identifying the patients who would benefit from individual therapies and recruiting sufficient numbers of patients with particular cancer subtypes into clinical trials is challenging, and will require collaborative efforts from research groups and industry in order to accelerate progress. A number of molecular screening platforms have already been initiated across Europe, and it is hoped that these networks, along with future collaborations, will benefit not only patients but also society through cost reductions as a result of more efficient use of resources. This review discusses how current developments in translational oncology may be applied in clinical practice in the future, assesses current programmes for the molecular characterisation of cancer and describes possible collaborative approaches designed to maximise the benefits of translational science for patients with cancer. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
Analysis of mesoscopic attenuation in gas-hydrate bearing sediments
NASA Astrophysics Data System (ADS)
Rubino, J. G.; Ravazzoli, C. L.; Santos, J. E.
2007-05-01
Several authors have shown that seismic wave attenuation, combined with seismic velocities, constitutes a useful geophysical tool to infer the presence and amounts of gas hydrates lying in the pore space of sediments. However, the loss mechanism associated with the presence of the hydrates is still not fully understood, and most works dealing with this problem focus on macroscopic fluid flow, friction between hydrates and the sediment matrix, and squirt flow. It is well known that an important cause of the attenuation levels observed in seismic data from some sedimentary regions is the mesoscopic loss mechanism, caused by heterogeneities in rock and fluid properties larger than the pore size but much smaller than the wavelengths. In order to analyze this effect in heterogeneous gas-hydrate bearing sediments, we developed a finite-element procedure to obtain the effective complex modulus of a heterogeneous porous material containing gas hydrates in its pore space, using compressibility tests at different oscillatory frequencies in the seismic range. The complex moduli were obtained by solving Biot's equations of motion in the space-frequency domain with appropriate boundary conditions, representing a gedanken laboratory experiment measuring the complex volume change of a representative sample of heterogeneous bulk material. These complex moduli in turn allowed us to obtain the corresponding effective phase velocity and quality factor for each frequency and spatial gas-hydrate distribution. Physical parameters taken from the Mallik 5L-38 Gas Hydrate Research well (Mackenzie Delta, Canada) were used to analyze the mesoscopic effects in realistic hydrated sediments.
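The step from an effective complex modulus to phase velocity and quality factor can be sketched as follows. This is a minimal illustration, not the authors' finite-element procedure: a Zener (standard linear solid) modulus stands in for the FE-derived effective modulus, and all numerical values (M0, relaxation times, density) are assumptions chosen only to place an attenuation peak in the seismic band.

```python
import cmath
import math

def phase_velocity_and_q(M, rho):
    # From an effective complex plane-wave modulus M at one frequency:
    #   complex velocity  v_c = sqrt(M / rho)
    #   phase velocity    v   = 1 / Re(1 / v_c)
    #   quality factor    Q   = Re(M) / Im(M)
    vc = cmath.sqrt(M / rho)
    return 1.0 / (1.0 / vc).real, M.real / M.imag

# Zener (standard linear solid) stand-in for the FE-derived modulus;
# attenuation peaks near f = 1 / (2*pi*sqrt(tau_e*tau_s)).
M0, tau_e, tau_s, rho = 9.0e9, 1.2e-2, 1.0e-2, 2000.0

results = {}
for f in (1.0, 14.5, 100.0):          # Hz, spanning the seismic band
    w = 2 * math.pi * f
    M = M0 * (1 + 1j * w * tau_e) / (1 + 1j * w * tau_s)
    results[f] = phase_velocity_and_q(M, rho)
```

Velocity dispersion (v rising with frequency) together with a Q minimum near the relaxation frequency is the signature of the mesoscopic loss mechanism the abstract describes.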
Hollaus, K; Weiss, B; Magele, Ch; Hutten, H
2004-02-01
The acceleration of the solution of the quasi-static electric field problem, considering anisotropic complex conductivity and discretized by first-order tetrahedral finite elements, is investigated using geometric multigrid.
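As a minimal illustration of the geometric multigrid idea (smooth, restrict the residual, solve the coarse problem, prolong and correct, smooth again), the sketch below implements a two-grid cycle for the 1D Poisson problem. This is a didactic stand-in, not the paper's setting of first-order tetrahedral elements with anisotropic complex conductivity; grid sizes and sweep counts are arbitrary choices.

```python
import numpy as np

def jacobi(u, f, h, sweeps=3, w=2/3):
    # Weighted-Jacobi smoothing for -u'' = f with Dirichlet boundaries.
    for _ in range(sweeps):
        u[1:-1] = (1 - w) * u[1:-1] + w * 0.5 * (u[:-2] + u[2:] + h * h * f[1:-1])
    return u

def residual(u, f, h):
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] - (2 * u[1:-1] - u[:-2] - u[2:]) / (h * h)
    return r

def two_grid(u, f, h):
    u = jacobi(u, f, h)                       # pre-smoothing
    r = residual(u, f, h)
    nc = (u.size + 1) // 2                    # coarse grid: every other point
    rc = np.zeros(nc)                         # full-weighting restriction
    rc[1:-1] = 0.25 * r[1:-2:2] + 0.5 * r[2:-1:2] + 0.25 * r[3::2]
    hc = 2 * h
    A = (np.diag(np.full(nc - 2, 2.0))        # exact coarse solve of -e'' = rc
         - np.diag(np.ones(nc - 3), 1)
         - np.diag(np.ones(nc - 3), -1)) / (hc * hc)
    ec = np.zeros(nc)
    ec[1:-1] = np.linalg.solve(A, rc[1:-1])
    e = np.zeros_like(u)                      # linear-interpolation prolongation
    e[::2] = ec
    e[1::2] = 0.5 * (ec[:-1] + ec[1:])
    return jacobi(u + e, f, h)                # post-smoothing

n = 65                                        # fine grid (nested with coarse)
h = 1.0 / (n - 1)
x = np.linspace(0.0, 1.0, n)
f = np.pi**2 * np.sin(np.pi * x)              # exact solution: sin(pi*x)
u = np.zeros(n)
for _ in range(10):
    u = two_grid(u, f, h)
err = np.max(np.abs(u - np.sin(np.pi * x)))
```

The coarse-grid correction removes the smooth error components that Jacobi sweeps barely touch, which is why the cycle converges in a handful of iterations where plain relaxation would stall.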
Systems heterogeneity: An integrative way to understand cancer heterogeneity.
Wang, Diane Catherine; Wang, Xiangdong
2017-04-01
The concept of systems heterogeneity was first coined and explained in this Special Issue as a new alternative for understanding the importance and complexity of heterogeneity in cancer. Systems heterogeneity can offer a full image of heterogeneity across multi-dimensional functions and multi-omics by integrating gene or protein expression, epigenetics, sequencing, phosphorylation, transcription, pathways, or interactions. The Special Issue starts with the roles of epigenetics in the initiation and development of cancer heterogeneity through the interaction between permanent genetic mutations and dynamic epigenetic alterations. Cell heterogeneity was defined as the difference in biological function and phenotype between cells in the same organ/tissue or in different organs, along with the associated challenges, as exemplified by telocytes. Single-cell heterogeneity has value for identifying diagnostic biomarkers and therapeutic targets, and single-cell systems heterogeneity has clinical potential in oncology. A number of signaling pathways and factors contribute to the development of systems heterogeneity. Proteomic heterogeneity can change the strategy and thinking of drug discovery and development through understanding of the interactions between proteins, or between proteins and drugs, in order to optimize drug efficacy and safety. The association of cancer heterogeneity with cancer cell evolution and metastasis was also overviewed as a new alternative for diagnostic biomarkers and therapeutic targets in clinical application. Copyright © 2016 Elsevier Ltd. All rights reserved.
de Hoogt, Ronald; Estrada, Marta F; Vidic, Suzana; Davies, Emma J; Osswald, Annika; Barbier, Michael; Santo, Vítor E; Gjerde, Kjersti; van Zoggel, Hanneke J A A; Blom, Sami; Dong, Meng; Närhi, Katja; Boghaert, Erwin; Brito, Catarina; Chong, Yolanda; Sommergruber, Wolfgang; van der Kuip, Heiko; van Weerden, Wytske M; Verschuren, Emmy W; Hickman, John; Graeser, Ralph
2017-11-21
Two-dimensional (2D) culture of cancer cells in vitro does not recapitulate the three-dimensional (3D) architecture, heterogeneity and complexity of human tumors. More representative models are required that better reflect key aspects of tumor biology. These are essential for studies of cancer biology and immunology, as well as for target validation and drug discovery. The Innovative Medicines Initiative (IMI) consortium PREDECT (www.predect.eu) characterized in vitro models of three solid tumor types with the goal of capturing elements of tumor complexity and heterogeneity. 2D culture, 3D mono- and stromal co-cultures of increasing complexity, and precision-cut tumor slice models were established. Robust protocols for the generation of these platforms are described. Tissue microarrays were prepared from all the models, permitting immunohistochemical analysis of individual cells and capturing heterogeneity. 3D cultures were also characterized using image analysis. Detailed step-by-step protocols, exemplary datasets from the 2D, 3D, and slice models, and refined analytical methods were established and are presented.
Complex adaptive therapeutic strategy (CATS) for cancer.
Cho, Yong Woo; Kim, Sang Yoon; Kwon, Ick Chan; Kim, In-San
2014-02-10
Tumors begin with a single cell, but as each tumor grows and evolves, it becomes a wide collection of clones that display remarkable heterogeneity in phenotypic features, which has posed a major challenge to current targeted anticancer therapy. Intra- and inter-tumoral heterogeneity is attributable in part to genetic mutations but also to adaptation and evolution of tumors in response to heterogeneity in tumor microenvironments. If tumors are viewed not only as a disease but also as a complex adaptive system (CAS), they should be treated as such, and a more systemic approach is needed. Several tumor therapeutic strategies are discussed here from the view of a tumor as a CAS; collectively they can be called a complex adaptive therapeutic strategy (CATS). The central theme of CATS is based on three intermediate concepts: i) disruption of artifacts, ii) disruption of connections, and iii) reprogramming of cancer-immune dynamics. Each strategy presented here is a piece of the puzzle for CATS. Although each piece by itself may be neither novel nor profound, the assembled puzzle could be a novel and innovative cancer therapeutic strategy. Copyright © 2013 Elsevier B.V. All rights reserved.
Attrition and success rates of accelerated students in nursing courses: a systematic review.
Doggrell, Sheila Anne; Schaffer, Sally
2016-01-01
There is a comprehensive literature on the academic outcomes (attrition and success) of students in traditional/baccalaureate nursing programs, but much less is known about the academic outcomes of students in accelerated nursing programs. The aim of this systematic review is to report on the attrition and success rates (either internal examination or NCLEX-RN) of accelerated students, compared to traditional students. For the systematic review, the databases (PubMed, CINAHL and PsycINFO) and Google Scholar were searched using the search terms 'accelerated' or 'accreditation for prior learning', 'fast-track' or 'top up' and 'nursing' with 'attrition' or 'retention' or 'withdrawal' or 'success' from 1994 to January 2016. All relevant articles were included, regardless of quality. The findings of 19 studies of attrition rates and/or success rates for accelerated students are reported. For international accelerated students, there were only three studies, which are heterogeneous and have major limitations. One of the three studies showed lower attrition rates, and one showed higher success rates, than traditional students. In contrast, another study showed high attrition and low success for international accelerated students. For graduate accelerated students, most of the studies are high quality, and showed that they have rates similar to or better than traditional students. Thus, five of six studies have shown similar or lower attrition rates. Four of these studies of graduate accelerated students, and an additional seven studies of success rates only, have shown similar or better success rates than traditional students. There are only three studies of non-university graduate accelerated students, and these had weaknesses, but were consistent in reporting higher attrition rates than traditional students. The paucity and weakness of the information available make it unclear as to the attrition and/or success of international accelerated students in nursing programs.
The good information available suggests that accelerated programs may be working reasonably well for the graduate students. However, the limited information available for non-university graduate students is weak, but consistent, in suggesting they may struggle in accelerated courses. Further studies are needed to determine the attrition and success rates of accelerated students, particularly for international and non-university graduate students.
NASA Astrophysics Data System (ADS)
Chen, X.; Zachara, J. M.; Vermeul, V. R.; Freshley, M.; Hammond, G. E.
2015-12-01
The behavior of a persistent uranium plume in an extended groundwater-river water (GW-SW) interaction zone at the DOE Hanford site is dominantly controlled by river-stage fluctuations in the adjacent Columbia River. The plume behavior is further complicated by substantial heterogeneity in the physical and geochemical properties of the host aquifer sediments. Multi-scale field and laboratory experiments and reactive transport modeling were integrated to understand the complex plume behavior influenced by hydrologic and geochemical conditions that are highly variable in time and space. In this presentation we (1) describe multiple data sets from field-scale uranium adsorption and desorption experiments performed at our experimental well-field, (2) develop a reactive transport model that incorporates hydrologic and geochemical heterogeneities characterized from multi-scale and multi-type datasets and a surface complexation reaction network based on laboratory studies, and (3) compare the modeling and observation results to provide insights on how to refine the conceptual model and reduce prediction uncertainties. The experimental results revealed significant spatial variability in uranium adsorption/desorption behavior, while modeling demonstrated that ambient hydrologic and geochemical conditions and heterogeneities in sediment physical and chemical properties both contributed to complex plume behavior and its persistence. Our analysis provides important insights into the characterization, understanding, modeling, and remediation of groundwater contaminant plumes influenced by surface water and groundwater interactions.
NASA Astrophysics Data System (ADS)
Carpenter, B. M.; Scuderi, M. M.; Collettini, C.; Marone, C.
2014-12-01
Observations of heterogeneous and complex fault slip are often attributed to the complexity of fault structure and/or spatial heterogeneity of fault frictional behavior. Such complex slip patterns have been observed for earthquakes on normal faults throughout central Italy, where many of the Mw 6 to 7 earthquakes in the Apennines nucleate at depths where the lithology is dominated by carbonate rocks. To explore the relationship between fault structure and heterogeneous frictional properties, we studied the exhumed Monte Maggio Fault, located in the northern Apennines. We collected intact specimens of the fault zone, including the principal slip surface and hanging wall cataclasite, and performed experiments at a normal stress of 10 MPa under saturated conditions. Experiments designed to reactivate slip between the cemented principal slip surface and cataclasite show a 3 MPa stress drop as the fault surface fails, then velocity-neutral frictional behavior and significant frictional healing. Overall, our results suggest that (1) earthquakes may readily nucleate in areas of the fault where the slip surface separates massive limestone and are likely to propagate in areas where fault gouge is in contact with the slip surface; (2) postseismic slip is more likely to occur in areas of the fault where gouge is present; and (3) high rates of frictional healing and low creep relaxation observed between solid fault surfaces could lead to significant aftershocks in areas of low stress drop.
ERIC Educational Resources Information Center
Hendricks, Paige
2016-01-01
The foundation of the United States' educational system is that all students will be educated equally by offering access to knowledge, opportunities, and services, resulting in the creation of positive societal contributors. However, this task is complex and challenging. Heterogeneous student populations, due to increased cultural diversity, do…
Fermilab’s Accelerator Complex: Current Status, Upgrades and Outlook
DOE Office of Scientific and Technical Information (OSTI.GOV)
Convery, M. E.
We report on the status of the Fermilab accelerator complex, including recent performance, upgrades in progress, and plans for the future. Beam delivery to the neutrino experiments surpassed our goals for the past year. The Proton Improvement Plan is well underway with successful 15 Hz beam operation. Beam power of 700 kW to the NOvA experiment was demonstrated and will be routine in the next year. We are also preparing the Muon Campus to commission beam to the g-2 experiment.
A distributed scheduling algorithm for heterogeneous real-time systems
NASA Technical Reports Server (NTRS)
Zeineldine, Osman; El-Toweissy, Mohamed; Mukkamala, Ravi
1991-01-01
Much of the previous work on load balancing and scheduling in distributed environments was concerned with homogeneous systems and homogeneous loads. Several of the results indicated that random policies are as effective as other more complex load allocation policies. Here, the effects of heterogeneity on scheduling algorithms for hard real-time systems are examined. A distributed scheduler designed specifically to handle heterogeneities in both nodes and node traffic is proposed. The performance of the algorithm is measured in terms of the percentage of jobs discarded. While a random task allocation is very sensitive to heterogeneities, the algorithm is shown to be robust to such non-uniformities in system components and load.
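The comparison the abstract describes (random allocation versus a heterogeneity-aware scheduler, scored by the percentage of jobs discarded) can be mimicked with a toy admission-control simulation. Everything below is an illustrative assumption rather than the paper's algorithm: synthetic node speeds, job sizes and deadline slack, and an earliest-predicted-finish policy standing in for the proposed scheduler.

```python
import random

def simulate(policy, speeds, n_jobs=5000, seed=1):
    # Jobs arrive at unit intervals; each has a work amount and a deadline.
    # A job is discarded if its predicted completion time misses the deadline.
    rng = random.Random(seed)
    free_at = [0.0] * len(speeds)        # time at which each node goes idle
    discarded = 0
    for t in range(n_jobs):
        work = rng.uniform(0.5, 1.5) * len(speeds)
        deadline = t + rng.uniform(4.0, 8.0)
        if policy == "random":
            i = rng.randrange(len(speeds))
        else:  # "aware": pick the node with the earliest predicted finish
            i = min(range(len(speeds)),
                    key=lambda j: max(t, free_at[j]) + work / speeds[j])
        finish = max(t, free_at[i]) + work / speeds[i]
        if finish > deadline:
            discarded += 1               # reject: deadline cannot be met
        else:
            free_at[i] = finish
    return discarded / n_jobs

speeds = [4.0, 2.0, 1.0, 0.5]            # heterogeneous node speeds
loss_random = simulate("random", speeds)
loss_aware = simulate("aware", speeds)
```

On heterogeneous speeds the random policy keeps routing work to slow nodes that cannot meet the deadline, so it discards many more jobs than the load-aware policy, which echoes the abstract's point that heterogeneity is where random allocation breaks down.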
Phenotypically heterogeneous populations in spatially heterogeneous environments
NASA Astrophysics Data System (ADS)
Patra, Pintu; Klumpp, Stefan
2014-03-01
The spatial expansion of a population in a nonuniform environment may benefit from phenotypic heterogeneity with interconverting subpopulations using different survival strategies. We analyze the crossing of an antibiotic-containing environment by a bacterial population consisting of rapidly growing normal cells and slow-growing, but antibiotic-tolerant persister cells. The dynamics of crossing is characterized by mean first arrival times and is found to be surprisingly complex. It displays three distinct regimes with different scaling behavior that can be understood based on an analytical approximation. Our results suggest that a phenotypically heterogeneous population has a fitness advantage in nonuniform environments and can spread more rapidly than a homogeneous population.
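A toy agent-based sketch of the mechanism described above: random-walking cells must cross a hostile zone that kills the normal phenotype, while rare switching into a tolerant persister state lets part of the population get through. All rates, lattice sizes, and the kill probability are illustrative assumptions; the paper's model (with growth and its three scaling regimes) is richer than this caricature.

```python
import random

def first_arrival(switching, n_walkers=200, L=60, zone=(20, 40),
                  horizon=20000, seed=7):
    # Walkers start at site 0 as normal cells ('n'); persisters ('p')
    # tolerate the antibiotic zone [20, 40) that kills normal cells.
    rng = random.Random(seed)
    a, b = (0.01, 0.002) if switching else (0.0, 0.0)   # n->p, p->n rates
    walkers = [[0, "n"] for _ in range(n_walkers)]
    for t in range(1, horizon + 1):
        survivors = []
        for pos, phe in walkers:
            # stochastic phenotype switching
            if phe == "n" and rng.random() < a:
                phe = "p"
            elif phe == "p" and rng.random() < b:
                phe = "n"
            # unbiased nearest-neighbour step, reflecting at the left edge
            pos = max(0, pos + rng.choice((-1, 1)))
            if pos >= L - 1:
                return t                  # first walker reached the far side
            # the antibiotic kills normal cells inside the zone
            if phe == "n" and zone[0] <= pos < zone[1] and rng.random() < 0.3:
                continue
            survivors.append([pos, phe])
        walkers = survivors
        if not walkers:
            return None                   # population went extinct
    return None

t_het = first_arrival(switching=True)     # phenotypically heterogeneous
t_hom = first_arrival(switching=False)    # normal cells only
```

The homogeneous population is wiped out at the zone boundary, while the heterogeneous one crosses in finite time, illustrating the claimed fitness advantage in nonuniform environments.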
Measuring the effects of heterogeneity on distributed systems
NASA Technical Reports Server (NTRS)
El-Toweissy, Mohamed; Zeineldine, Osman; Mukkamala, Ravi
1991-01-01
Distributed computer systems in daily use are becoming more and more heterogeneous. Currently, much of the design and analysis of such systems assumes homogeneity, an assumption driven mainly by the resulting simplicity in modeling and analysis. A simulation study is presented which investigated the effects of heterogeneity on scheduling algorithms for hard real-time distributed systems. In contrast to previous results indicating that random scheduling may be as good as a more complex scheduler, this algorithm is shown to be consistently better than a random scheduler. This advantage is most pronounced at high workloads and at high levels of heterogeneity.
Numerical simulation of backward erosion piping in heterogeneous fields
NASA Astrophysics Data System (ADS)
Liang, Yue; Yeh, Tian-Chyi Jim; Wang, Yu-Li; Liu, Mingwei; Wang, Junjie; Hao, Yonghong
2017-04-01
Backward erosion piping (BEP) is one of the major causes of seepage failures in levees. Seepage fields dictate BEP behaviors and are influenced by the heterogeneity of soil properties. To investigate the effects of heterogeneity on seepage failures, we develop a numerical algorithm and conduct simulations to study BEP progression in geologic media with spatially stochastic parameters. Specifically, the void ratio e, the hydraulic conductivity k, and the ratio of the particle contents r of the media are represented as stochastic variables, characterized by their means and variances, spatial correlation structures, and cross correlations. Results of the simulations reveal that heterogeneity accelerates the development of preferential flow paths, which profoundly increase the likelihood of seepage failures. To account for unknown heterogeneity, we define the probability of seepage instability (PI) to evaluate the failure potential of a given site. Using Monte Carlo simulation (MCS), we demonstrate that the PI value is significantly influenced by the mean and variance of ln k and its spatial correlation scales, whereas the other parameters, such as the means and variances of e and r and their cross correlations, have minor impacts. Based on PI analyses, we introduce a risk rating system that classifies the field into regions according to risk level. This rating system is useful for seepage failure prevention and assists decision-making when BEP occurs.
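The MCS-based PI estimate can be illustrated with a deliberately simplified toy model (not the authors' coupled BEP simulator): conductivities are sampled i.i.d. lognormally along a one-dimensional flow path, and a realization counts as unstable when its largest per-cell hydraulic gradient exceeds a critical piping gradient. The cell count, head, and critical gradient below are illustrative assumptions.

```python
import math
import random

def prob_instability(sigma_lnk, mu_lnk=0.0, n_cells=50, head=5.0,
                     i_crit=0.5, trials=2000, seed=7):
    """Fraction of random conductivity fields whose peak local gradient
    exceeds the critical gradient i_crit (a toy PI estimate)."""
    rng = random.Random(seed)
    unstable = 0
    for _ in range(trials):
        # resistance of each unit-length cell is 1/k, with ln k ~ N(mu, sigma)
        resist = [math.exp(-rng.gauss(mu_lnk, sigma_lnk)) for _ in range(n_cells)]
        total = sum(resist)
        i_max = head * max(resist) / total  # largest per-cell hydraulic gradient
        if i_max > i_crit:
            unstable += 1
    return unstable / trials
```

Raising the variance of ln k concentrates the head drop in low-conductivity cells, so PI rises with sigma, mirroring the sensitivity reported in the abstract.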
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Z. M.; Laser Fusion Research Center, CAEP, Mianyang 621900; He, X. T.
A complex target (CT) configuration tailored for generating a high-quality proton bunch with circularly polarized laser pulses at intensities of 10^20-10^21 W/cm^2 is proposed. Two-dimensional particle-in-cell simulations show that both the collimation and the mono-energetic quality of the accelerated proton bunch obtained using a front-shaped thin foil can be greatly enhanced by the backside inhomogeneous plasma layer. The main mechanisms for improving the accelerated protons are identified and discussed: stabilization of the photon cavity, hole-boring supplementary acceleration, and suppression of thermal-electron effects. A theory for tailoring the CT parameters is also presented.
Extraordinary Tools for Extraordinary Science: The Impact of SciDAC on Accelerator Science & Technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ryne, Robert D.
2006-08-10
Particle accelerators are among the most complex and versatile instruments of scientific exploration. They have enabled remarkable scientific discoveries and important technological advances that span all programs within the DOE Office of Science (DOE/SC). The importance of accelerators to the DOE/SC mission is evident from an examination of the DOE document, ''Facilities for the Future of Science: A Twenty-Year Outlook''. Of the 28 facilities listed, 13 involve accelerators. Thanks to SciDAC, a powerful suite of parallel simulation tools has been developed that represent a paradigm shift in computational accelerator science. Simulations that used to take weeks or more now take hours, and simulations that were once thought impossible are now performed routinely. These codes have been applied to many important projects of DOE/SC including existing facilities (the Tevatron complex, the Relativistic Heavy Ion Collider), facilities under construction (the Large Hadron Collider, the Spallation Neutron Source, the Linac Coherent Light Source), and to future facilities (the International Linear Collider, the Rare Isotope Accelerator). The new codes have also been used to explore innovative approaches to charged particle acceleration. These approaches, based on the extremely intense fields that can be present in lasers and plasmas, may one day provide a path to the outermost reaches of the energy frontier. Furthermore, they could lead to compact, high-gradient accelerators that would have huge consequences for US science and technology, industry, and medicine. In this talk I will describe the new accelerator modeling capabilities developed under SciDAC, the essential role of multi-disciplinary collaboration with applied mathematicians, computer scientists, and other IT experts in developing these capabilities, and provide examples of how the codes have been used to support DOE/SC accelerator projects.
Cheng, Guanhui; Huang, Guohe; Dong, Cong; Xu, Ye; Chen, Xiujuan; Chen, Jiapei
2017-03-01
Due to complexities of heterogeneity, hierarchy, discreteness, and interactions in municipal solid waste management (MSWM) systems such as Beijing, China, a series of socio-economic and eco-environmental problems may emerge or worsen and result in irredeemable damages in the following decades. Meanwhile, existing studies, especially those focusing on MSWM in Beijing, could hardly reflect these complexities in system simulations or provide reliable decision support for management practices. Thus, a framework of distributed mixed-integer fuzzy hierarchical programming (DMIFHP) is developed in this study for MSWM under these complexities. Beijing is selected as a representative case. The Beijing MSWM system is comprehensively analyzed in many aspects, such as socio-economic conditions, natural conditions, spatial heterogeneities, treatment facilities, and system complexities, building a solid foundation for system simulation and optimization. Correspondingly, the MSWM system in Beijing is discretized into 235 grids to reflect spatial heterogeneity. A DMIFHP model, which is a nonlinear programming problem, is constructed to parameterize the Beijing MSWM system. To solve it rigorously, a solution algorithm is proposed based on the coupling of fuzzy programming and mixed-integer linear programming. Innovations and advantages of the DMIFHP framework are discussed. The optimal MSWM schemes and mechanism revelations will be discussed in a companion paper owing to length limitations.
Robust mechanobiological behavior emerges in heterogeneous myosin systems.
Egan, Paul F; Moore, Jeffrey R; Ehrlicher, Allen J; Weitz, David A; Schunn, Christian; Cagan, Jonathan; LeDuc, Philip
2017-09-26
Biological complexity presents challenges for understanding natural phenomena and engineering new technologies, particularly in systems with molecular heterogeneity. Such complexity is present in myosin motor protein systems, and computational modeling is essential for determining how collective myosin interactions produce emergent system behavior. We develop a computational approach for altering myosin isoform parameters and their collective organization, and support predictions with in vitro experiments of motility assays with α-actinins as molecular force sensors. The computational approach models variations in single-myosin molecular structure, system organization, and force stimuli to predict system behavior for filament velocity, energy consumption, and robustness. Robustness is the range of forces over which a filament is expected to have continuous velocity, and it depends on the energy used by the myosin system. Myosin systems are shown to have highly nonlinear behavior across force conditions that may be exploited at a systems level by combining slow and fast myosin isoforms heterogeneously. Results suggest some heterogeneous systems have lower energy use near stall conditions and greater energy consumption when unloaded, therefore promoting robustness. These heterogeneous system capabilities are unique in comparison with homogeneous systems and potentially advantageous for high-performance bionanotechnologies. Findings open doors at the intersections of mechanics and biology, particularly for understanding and treating myosin-related diseases and developing approaches for motor-molecule-based technologies.
Yang, Liyou; Chen, Liangfan
1998-03-24
Attractive multi-junction solar cells and single-junction solar cells with excellent conversion efficiency can be produced with a microcrystalline tunnel junction, microcrystalline recombination junction, or one or more microcrystalline doped layers by special plasma deposition processes, which include plasma etching with only hydrogen or other specified etchants to enhance microcrystalline growth, followed by microcrystalline nucleation with a doped hydrogen-diluted feedstock.
A Heterogeneous High-Performance System for Computational and Computer Science
2016-11-15
…a team of research faculty from the departments of computer science and natural science at Bowie State University. The supercomputer is not only to …accelerated HPC systems. The supercomputer is also ideal for the research conducted in the Department of Natural Science, as research faculty work on …
NASA Technical Reports Server (NTRS)
Ghaffarian, Reza; Evans, John W.
2014-01-01
For five decades, the semiconductor industry has distinguished itself by the rapid pace of improvement in the miniaturization of electronics products: Moore's Law. Now scaling has hit a brick wall, forcing a paradigm shift. The industry roadmaps recognize the scaling limitation and project that packaging technologies will meet further miniaturization needs, a.k.a. "More than Moore". This paper presents packaging technology trends and the accelerated reliability testing methods currently being practiced. It then presents industry status on key advanced electronic packages, factors affecting accelerated solder joint reliability of area array packages, and IPC/JEDEC/Mil specifications for characterization of assemblies under accelerated thermal and mechanical loading. Finally, it presents an example demonstrating how accelerated testing and analysis have been effectively employed in the development of complex spacecraft, thereby reducing risk. Quantitative assessments necessarily involve the mathematics of probability and statistics. In addition, accelerated tests need to be designed considering the desired risk posture and schedule for a particular project. Such assessments relieve risks without imposing additional costs and constraints that are not value-added for a particular mission. Furthermore, in the course of development of complex systems, variances and defects will inevitably present themselves and require a decision concerning their disposition, necessitating quantitative assessments. In summary, this paper presents a comprehensive viewpoint, from technology to systems, including the benefits and impact of accelerated testing in offsetting risk.
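Two standard acceleration-factor models used in this kind of testing can be sketched as follows; the activation energy, temperatures, and Coffin-Manson exponent in the example are illustrative assumptions, not values from the paper.

```python
import math

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K

def arrhenius_af(ea_ev, t_use_c, t_stress_c):
    """Arrhenius acceleration factor between use and stress temperatures (deg C)."""
    t_use, t_stress = t_use_c + 273.15, t_stress_c + 273.15
    return math.exp(ea_ev / BOLTZMANN_EV * (1.0 / t_use - 1.0 / t_stress))

def coffin_manson_af(dt_use, dt_stress, exponent=2.0):
    """Coffin-Manson acceleration factor for thermal-cycling fatigue;
    the exponent is an empirical, material-dependent assumption."""
    return (dt_stress / dt_use) ** exponent
```

For instance, `arrhenius_af(0.7, 55, 125)` (0.7 eV activation energy, 55 C use, 125 C stress) gives an acceleration factor of roughly 78, and `coffin_manson_af(40, 100)` gives 6.25; such factors translate stress-test hours into equivalent field lifetimes.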
Bacterial filamentation accelerates colonization of adhesive spots embedded in biopassive surfaces
NASA Astrophysics Data System (ADS)
Möller, Jens; Emge, Philippe; Avalos Vizcarra, Ima; Kollmannsberger, Philip; Vogel, Viola
2013-12-01
Sessile bacteria adhere to engineered surfaces and host tissues and pose a substantial clinical and economical risk when growing into biofilms. Most engineered and biological interfaces are of chemically heterogeneous nature and provide adhesive islands for bacterial attachment and growth. To mimic either defects in a surface coating of biomedical implants or heterogeneities within mucosal layers (Peyer's patches), we embedded micrometre-sized adhesive islands in a poly(ethylene glycol) biopassive background. We show experimentally and computationally that filamentation of Escherichia coli can significantly accelerate the bacterial surface colonization under physiological flow conditions. Filamentation can thus provide an advantage to a bacterial population to bridge non-adhesive distances exceeding 5 μm. Bacterial filamentation, caused by blocking of bacterial division, is common among bacterial species and can be triggered by environmental conditions or antibiotic treatment. While great awareness exists that the build-up of antibiotic resistance serves as intrinsic survival strategy, we show here that antibiotic treatment can actually promote surface colonization by triggering filamentation, which in turn prevents daughter cells from being washed away. Our combined microfabrication and computational approaches provide quantitative insights into mechanisms that enable biofouling of biopassive surfaces with embedded adhesive spots, even for spot distances that are multiples of the bacterial length.
Calculating the True and Observed Rates of Complex Heterogeneous Catalytic Reactions
NASA Astrophysics Data System (ADS)
Avetisov, A. K.; Zyskin, A. G.
2018-06-01
Equations of the theory of steady-state complex reactions are considered in matrix form. A set of stage stationarity equations is given, and an algorithm is described for deriving the canonic set of stationarity equations with appropriate corrections for the existence of fast stages in a mechanism. A formula for calculating the number of key compounds is presented. The applicability of the Gibbs rule to estimating the number of independent compounds in a complex reaction is analyzed. Some matrix equations relating the rates of dependent and key substances are derived. They are used as a basis to determine the general diffusion stoichiometry relationships between temperature, the concentrations of dependent reaction participants, and the concentrations of key reaction participants in a catalyst grain. An algorithm is described for calculating heat and mass transfer in a catalyst grain with respect to arbitrary complex heterogeneous catalytic reactions.
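One matrix computation the abstract alludes to can be sketched under a common convention (the toy steam-reforming mechanism below is an illustrative assumption, not from the paper): the number of independent overall reactions equals the rank of the stage stoichiometric matrix, which exposes linearly dependent stages.

```python
import numpy as np

# Toy steam-reforming mechanism (illustrative).
# Species columns: CH4, H2O, CO, CO2, H2; rows are reaction stages.
S = np.array([
    [-1, -1,  1,  0,  3],   # CH4 + H2O -> CO + 3 H2
    [ 0, -1, -1,  1,  1],   # CO + H2O -> CO2 + H2  (water-gas shift)
    [-1, -2,  0,  1,  4],   # overall reaction: the sum of the two stages above
])

# The third row is linearly dependent on the first two, so only two reactions
# are independent; the rank bounds the number of key compounds to choose.
n_independent = int(np.linalg.matrix_rank(S))
```

Here `n_independent` is 2, so two key substances suffice to express the rates of all dependent participants, in the spirit of the matrix relations described above.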
A ℓ2, 1 norm regularized multi-kernel learning for false positive reduction in Lung nodule CAD.
Cao, Peng; Liu, Xiaoli; Zhang, Jian; Li, Wei; Zhao, Dazhe; Huang, Min; Zaiane, Osmar
2017-03-01
The aim of this paper is to describe a novel algorithm for false positive reduction in lung nodule computer-aided detection (CAD). In this paper, we describe a new CT lung CAD method that aims to detect solid nodules. Specifically, we propose a multi-kernel classifier with an ℓ 2, 1 norm regularizer for heterogeneous feature fusion and selection at the feature-subset level, and design two efficient strategies to optimize the kernel-weight parameters in the non-smooth ℓ 2, 1 regularized multiple kernel learning algorithm. The first optimization algorithm adapts a proximal gradient method for solving the ℓ 2, 1 norm of the kernel weights and uses an accelerated method based on FISTA; the second employs an iterative scheme based on an approximate gradient descent method. The results demonstrate that the FISTA-style accelerated proximal descent method is efficient for the ℓ 2, 1 norm formulation of multiple kernel learning, with a theoretical guarantee on the convergence rate. Moreover, the experimental results demonstrate the effectiveness of the proposed methods in terms of geometric mean (G-mean) and area under the ROC curve (AUC), significantly outperforming the competing methods. The proposed approach exhibits remarkable advantages in both the heterogeneous feature-subset fusion and classification phases. Compared with feature-level and decision-level fusion strategies, the proposed ℓ 2, 1 norm multi-kernel learning algorithm is able to accurately fuse the complementary and heterogeneous feature sets and automatically prune the irrelevant and redundant feature subsets to form a more discriminative feature set, leading to promising classification performance. Moreover, the proposed algorithm consistently outperforms comparable classification approaches in the literature. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
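The core of the first optimization strategy is the proximal operator of the ℓ 2, 1 norm, which is row-wise (group) soft thresholding, combined with the FISTA momentum update. This is a generic sketch, not the authors' exact solver; matrix shapes and the smooth-loss gradient step are omitted.

```python
import numpy as np

def prox_l21(W, tau):
    """Proximal operator of tau * ||W||_{2,1}: row-wise soft thresholding.
    Each row (one kernel / feature group) is shrunk toward zero; rows whose
    l2 norm falls below tau are zeroed out, pruning that group entirely."""
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - tau / np.maximum(norms, 1e-12))
    return W * scale

def fista_momentum(x, x_prev, t):
    """FISTA extrapolation (Beck & Teboulle): returns the point at which the
    next gradient step is taken, plus the updated momentum parameter."""
    t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
    y = x + ((t - 1.0) / t_next) * (x - x_prev)
    return y, t_next
```

A full solver would alternate a gradient step on the smooth loss with `prox_l21`, extrapolating via `fista_momentum` to obtain the accelerated O(1/k^2) convergence rate the abstract refers to.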
Phylogeny, rate variation, and genome size evolution of Pelargonium (Geraniaceae).
Weng, Mao-Lun; Ruhlman, Tracey A; Gibby, Mary; Jansen, Robert K
2012-09-01
The phylogeny of 58 Pelargonium species was estimated using five plastid markers (rbcL, matK, ndhF, rpoC1, trnL-F) and one mitochondrial gene (nad5). The results confirmed the monophyly of three major clades and four subclades within Pelargonium but also indicate the need to revise some sectional classifications. This phylogeny was used to examine karyotype evolution in the genus: plotting chromosome sizes, numbers and 2C-values indicates that genome size is significantly correlated with chromosome size but not number. Accelerated rates of nucleotide substitution have been previously detected in both plastid and mitochondrial genes in Pelargonium, but sparse taxon sampling did not enable identification of the phylogenetic distribution of these elevated rates. Using the multigene phylogeny as a constraint, we investigated lineage- and locus-specific heterogeneity of substitution rates in Pelargonium for an expanded number of taxa and demonstrated that both plastid and mitochondrial genes have had accelerated substitution rates but with markedly disparate patterns. In the plastid, the exons of rpoC1 have significantly accelerated substitution rates compared to its intron and the acceleration was mainly due to nonsynonymous substitutions. In contrast, the mitochondrial gene, nad5, experienced substantial acceleration of synonymous substitution rates in three internal branches of Pelargonium, but this acceleration ceased in all terminal branches. Several lineages also have dN/dS ratios significantly greater than one for rpoC1, indicating that positive selection is acting on this gene, whereas the accelerated synonymous substitutions in the mitochondrial gene are the result of elevated mutation rates. Published by Elsevier Inc.
High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics
Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis
2014-07-28
The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. The authors present Synergia's design principles and its performance on HPC platforms.
Gao, Kai; Chung, Eric T.; Gibson, Richard L.; ...
2015-06-05
The development of reliable methods for upscaling fine-scale models of elastic media has long been an important topic for rock physics and applied seismology. Several effective medium theories have been developed to provide elastic parameters for materials such as finely layered media or randomly oriented or aligned fractures. In such cases, the analytic solutions for upscaled properties can be used for accurate prediction of wave propagation. However, such theories cannot be applied directly to homogenize elastic media with more complex, arbitrary spatial heterogeneity. We therefore propose a numerical homogenization algorithm based on multiscale finite element methods for simulating elastic wave propagation in heterogeneous, anisotropic elastic media. Specifically, our method used multiscale basis functions obtained from a local linear elasticity problem with appropriately defined boundary conditions. Homogenized, effective medium parameters were then computed using these basis functions, and the approach applied a numerical discretization that is similar to the rotated staggered-grid finite difference scheme. Comparisons of the results from our method and from conventional, analytical approaches for finely layered media showed that the homogenization reliably estimated elastic parameters for this simple geometry. Additional tests examined anisotropic models with arbitrary spatial heterogeneity where the average size of the heterogeneities ranged from several centimeters to several meters, and the ratio between the dominant wavelength and the average size of the arbitrary heterogeneities ranged from 10 to 100. Comparisons to finite-difference simulations proved that the numerical homogenization was equally accurate for these complex cases.
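For the finely layered benchmark mentioned above, the classical analytic upscaling is the Backus average. A sketch for isotropic layers follows; the layer properties in the usage example are illustrative.

```python
import numpy as np

def backus_vti(lam, mu, h):
    """Backus (1962) average of thin isotropic layers (Lame parameters lam, mu,
    thickness weights h) -> effective VTI stiffnesses C11, C13, C33, C44, C66."""
    lam, mu, h = (np.asarray(x, dtype=float) for x in (lam, mu, h))
    w = h / h.sum()

    def avg(x):
        return float(np.sum(w * x))   # thickness-weighted layer average

    C33 = 1.0 / avg(1.0 / (lam + 2 * mu))
    C44 = 1.0 / avg(1.0 / mu)
    C66 = avg(mu)
    C13 = C33 * avg(lam / (lam + 2 * mu))
    C11 = avg(4 * mu * (lam + mu) / (lam + 2 * mu)) \
        + C33 * avg(lam / (lam + 2 * mu)) ** 2
    return dict(C11=C11, C13=C13, C33=C33, C44=C44, C66=C66)
```

For identical layers the result reduces to the isotropic stiffnesses (C11 = C33 = lam + 2 mu), while contrasting layers yield C11 > C33, the layering-induced anisotropy against which a numerical homogenization can be checked.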
Overview on Clinical Relevance of Intra-Tumor Heterogeneity.
Stanta, Giorgio; Bonin, Serena
2018-01-01
Today, clinical evaluation of tumor heterogeneity is an emergent issue to improve clinical oncology. In particular, intra-tumor heterogeneity (ITH) is closely related to cancer progression, resistance to therapy, and recurrences. It is interconnected with complex molecular mechanisms including spatial and temporal phenomena, which are often peculiar for every single patient. This review tries to describe all the types of ITH including morphohistological ITH, and at the molecular level clonal ITH derived from genomic instability and nonclonal ITH derived from microenvironment interaction. It is important to consider the different types of ITH as a whole for any patient to investigate on cancer progression, prognosis, and treatment opportunities. From a practical point of view, analytical methods that are widely accessible today, or will be in the near future, are evaluated to investigate the complex pattern of ITH in a reproducible way for a clinical application.
Measurement of Coriolis Acceleration with a Smartphone
NASA Astrophysics Data System (ADS)
Shakur, Asif; Kraft, Jakob
2016-05-01
Undergraduate physics laboratories seldom have experiments that measure the Coriolis acceleration. This has traditionally been the case owing to the inherent complexities of making such measurements, and articles on the experimental determination of the Coriolis acceleration are few and far between in the physics literature. However, because modern smartphones come with a raft of built-in sensors, we have a unique opportunity to determine the Coriolis acceleration experimentally and conveniently, in a pedagogically enlightening environment and at modest cost, by using student-owned smartphones. Here we employ the gyroscope and accelerometer in a smartphone to verify the dependence of the Coriolis acceleration on the angular velocity of a rotating track and the speed of the sliding smartphone.
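The quantity being verified is the Coriolis acceleration in the rotating frame, a_C = -2 omega x v, whose magnitude is 2*omega*v when the slide velocity is perpendicular to the rotation axis. A tiny vector sketch with hypothetical values:

```python
import numpy as np

def coriolis_accel(omega, v):
    """Coriolis acceleration in the rotating frame: a_C = -2 * (omega x v)."""
    return -2.0 * np.cross(omega, v)

# smartphone sliding outward at 0.5 m/s on a track rotating at 3 rad/s about z
omega = np.array([0.0, 0.0, 3.0])   # rad/s, from the phone's gyroscope
v = np.array([0.5, 0.0, 0.0])       # m/s, along the track
a = coriolis_accel(omega, v)        # magnitude 2 * 3 * 0.5 = 3.0 m/s^2
```

The accelerometer reading transverse to the track can be compared against this predicted magnitude as omega and v are varied.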
DOE Office of Scientific and Technical Information (OSTI.GOV)
Staller, G.E.; Hamilton, I.D.; Aker, M.F.
1978-02-01
A single-unit electron beam accelerator was designed, fabricated, and assembled in Sandia's Technical Area V to conduct magnetically insulated transmission experiments. Results of these experiments will be utilized in the future design of larger, more complex accelerators. This design makes optimum use of existing facilities and equipment. When designing new components, possible future applications were considered as well as compatibility with existing facilities and hardware.
Accelerating Cogent Confabulation: An Exploration in the Architecture Design Space
2008-06-01
Linderman, Richard; Renz, Thomas; Wu, Qing
Dates covered: 1-8 June 2008. …spiking neural networks is proposed in reference [8]. Reference [9] investigates the architecture design of a Brain-state-in-a-box model. …
Conduction Cooling of a Niobium SRF Cavity Using a Cryocooler
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feldman, Joshua; Geelhoed, Michael; Dhuley, Ram
Superconducting Radio Frequency (SRF) cavities are the primary choice for accelerating charged particles in high-energy research accelerators. Institutions like Fermilab use SRF cavities because they enable significantly higher gradients and quality factors than normal-conducting RF cavities and DC voltage cavities. To cool the SRF cavities to low temperatures (typically around 2 K), liquid helium refrigerators are used. Producing and maintaining the necessary liquid helium requires large, elaborate cryogenic plants involving dewars, compressors, expansion engines, and recyclers. The cost, complexity, and space required for such plants is part of the reason that industry has not yet adopted SRF-based accelerators. At the Illinois Accelerator Research Center (IARC) at Fermilab, our team seeks to make SRF technology accessible not only to large research accelerators, but to industry as well. If we eliminate the complexity associated with liquid helium plants, SRF-based industrial accelerators may finally become a reality. One way to do this is to eliminate the use of liquid helium baths altogether and develop a brand-new cooling technique for SRF cavities: conduction cooling using a cryocooler. Recent advances in SRF technology have made it possible to operate SRF cavities at 4 K, a temperature easily achievable using commercial cryocoolers. Our IARC team is taking advantage of this technology to cool SRF cavities.
Biotinylated Rh(III) complexes in engineered streptavidin for accelerated asymmetric C-H activation.
Hyster, Todd K; Knörr, Livia; Ward, Thomas R; Rovis, Tomislav
2012-10-26
Enzymes provide an exquisitely tailored chiral environment to foster high catalytic activities and selectivities, but their native structures are optimized for very specific biochemical transformations. Designing a protein to accommodate a non-native transition metal complex can broaden the scope of enzymatic transformations while raising the activity and selectivity of small-molecule catalysis. Here, we report the creation of a bifunctional artificial metalloenzyme in which a glutamic acid or aspartic acid residue engineered into streptavidin acts in concert with a docked biotinylated rhodium(III) complex to enable catalytic asymmetric carbon-hydrogen (C-H) activation. The coupling of benzamides and alkenes to access dihydroisoquinolones proceeds with up to nearly a 100-fold rate acceleration compared with the activity of the isolated rhodium complex and enantiomeric ratios as high as 93:7.
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. Zhou; H. Huang; M. Deo
Log and seismic data indicate that most shale formations have strong heterogeneity. Conventional analytical and semi-analytical fracture models are not sufficient to simulate complex fracture propagation in these highly heterogeneous formations. Without considering the intrinsic heterogeneity, the predicted morphology of a hydraulic fracture may be biased and misleading in optimizing the completion strategy. In this paper, a fully coupled fluid-flow and geomechanics hydraulic-fracture simulator based on the dual-lattice discrete element method (DEM) is used to predict hydraulic fracture propagation in heterogeneous reservoirs. The heterogeneity of the rock is simulated by assigning different material force constants and critical strains to different particles, and it is adjusted by conditioning to the measured data and observed geological features. Based on the proposed model, the effects of heterogeneity at different scales on micromechanical behavior and induced macroscopic fractures are examined. The numerical results show that microcracks are more inclined to form at weaker grain interfaces. A conventional simulator with a homogeneity assumption is not applicable to highly heterogeneous shale formations.
Extrachromosomal oncogene amplification drives tumor evolution and genetic heterogeneity
Turner, Kristen M.; Deshpande, Viraj; Beyter, Doruk; Koga, Tomoyuki; Rusert, Jessica; Lee, Catherine; Li, Bin; Arden, Karen; Ren, Bing; Nathanson, David A.; Kornblum, Harley I.; Taylor, Michael D.; Kaushal, Sharmeela; Cavenee, Webster K.; Wechsler-Reya, Robert; Furnari, Frank B.; Vandenberg, Scott R.; Rao, P. Nagesh; Wahl, Geoffrey M.; Bafna, Vineet; Mischel, Paul S.
2017-01-01
Human cells have twenty-three pairs of chromosomes, but in cancer genes can be amplified within chromosomes or on circular extrachromosomal DNA (ecDNA), whose frequency and functional significance are not well understood. We performed whole-genome sequencing, structural modeling, and cytogenetic analyses of 17 different cancer types, including 2,572 metaphases, and developed ECdetect to conduct unbiased, integrated ecDNA detection and analysis. ecDNA was found in nearly half of human cancers, varying by tumor type, but almost never in normal cells. Driver oncogenes were amplified most commonly on ecDNA, elevating transcript levels. Mathematical modeling predicted that ecDNA amplification elevates oncogene copy number and increases intratumoral heterogeneity more effectively than chromosomal amplification, which we validated by quantitative analyses of cancer samples. These results suggest that ecDNA contributes to accelerated evolution in cancer. PMID:28178237
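The modeling intuition, that acentric ecDNA segregates randomly at division while chromosomal amplicons segregate faithfully, can be sketched with a toy lineage simulation (an assumed minimal model, not the paper's; starting copy number and lineage count are arbitrary):

```python
import random

def divide(copies, extrachromosomal, rng):
    """One cell division. Chromosomal amplicons split 1:1;
    ecDNA (no centromere) segregates binomially between daughters."""
    if not extrachromosomal:
        return copies, copies  # faithful segregation
    doubled = 2 * copies
    to_first = sum(1 for _ in range(doubled) if rng.random() < 0.5)
    return to_first, doubled - to_first

def simulate(generations, extrachromosomal, seed=0, n_lineages=200):
    rng = random.Random(seed)
    cells = [8] * n_lineages  # each lineage starts at 8 amplicon copies
    for _ in range(generations):
        cells = [divide(c, extrachromosomal, rng)[0] for c in cells]
    return cells

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

chrom = simulate(10, extrachromosomal=False)
ec = simulate(10, extrachromosomal=True)
# Random ecDNA segregation generates cell-to-cell copy-number
# heterogeneity, while chromosomal amplification preserves uniformity.
```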
Displacement of organelles in plant gravireceptor cells by vibrational forces and ultrasound.
NASA Astrophysics Data System (ADS)
Kuznetsov, O.; Nechitailo, G.; Kuznetsov, A.
Plant gravity perception can be studied by displacing statoliths inside receptor cells with forces other than gravity. Owing to the mechanical heterogeneity of statocytes, various ponderomotive forces can be used for this purpose. In a plant subjected to non-symmetric vibrations, statoliths experience an inertial force proportional to the difference between their density and that of the cytoplasm and to the instantaneous acceleration of the cell. This force causes cyclic motion of statoliths relative to the cytoplasm and, depending on the profile of the oscillations, can result in a net displacement (due to the complex rheology of the cell interior), similar to sedimentation. This can be described as a "vibrational" ponderomotive force acting on the statoliths. Vertically growing Arabidopsis seedlings subjected to horizontal, sawtooth-shaped oscillations (250 Hz, 1.5 mm amplitude) showed root curvature of 17 ± 2° toward, and shoot curvature of 11 ± 3° against, the stronger acceleration. When the polarity of the oscillations was reversed, the direction of curvature of shoots and roots was also reversed. Control experiments with starchless mutants (TC7) produced no net curvature, indicating that dense starch-filled amyloplasts are needed for the effect. These controls also rule out touch-induced reactions or other side effects as the cause of the curvature. Linum roots curved 25 ± 7°. Ceratodon protonemata subjected to the same oscillations showed plastid displacement and curvature consistent with the pattern observed during graviresponse: the positively gravitropic wwr mutant curved in the direction of plastid displacement, while the wild type curved in the opposite direction. Acoustic ponderomotive forces, which originate from the transfer of sonic-beam momentum to the medium through sound scattering and attenuation in a mechanically heterogeneous system, can also displace statoliths.
Vertical flax seedlings curved away from an ultrasonic source (800 kHz, 0.1 W/cm²), presumably as a reaction to amyloplast displacement by acoustic forces. Besides probing the graviperception mechanism, vibrational and acoustic forces can serve as tools for analyzing the mechanical properties of the cell interior. Practical applications of this technology could include providing directional stimuli to plants in microgravity with low doses of vibrations. Vibrations present on board spacecraft may have vectorial effects on plants and other organisms, and their influence should be assessed.
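The ratchet-like net displacement under asymmetric oscillations can be sketched with a toy model in which the cytoplasm is given an assumed yield threshold: the statolith moves only when the inertial force exceeds it, so the strong short stroke of a sawtooth produces motion while the weak long return does not (all units and numbers arbitrary):

```python
def net_drift_per_cycle(a_fast, a_slow, t_fast, t_slow, yield_force, mobility=1.0):
    """Inertial force on a statolith is proportional to the instantaneous
    cell acceleration; in a yield-stress cytoplasm it moves only when
    the force exceeds the threshold (toy model, arbitrary units)."""
    def displacement(force, dt):
        excess = abs(force) - yield_force
        if excess <= 0:
            return 0.0
        return mobility * excess * dt * (1 if force > 0 else -1)
    # Asymmetric (sawtooth) cycle: strong short stroke, weak long return.
    # Equal impulse (a_fast * t_fast == a_slow * t_slow) means no net force,
    # yet the nonlinearity of the yield threshold rectifies the motion.
    return displacement(a_fast, t_fast) + displacement(-a_slow, t_slow)

d = net_drift_per_cycle(a_fast=10.0, a_slow=1.0, t_fast=0.1, t_slow=1.0, yield_force=2.0)
# Only the strong stroke exceeds the threshold, so the statolith
# ratchets in one direction each cycle (d > 0).
```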
Complex Nonlinear Dynamic System of Oligopolies Price Game with Heterogeneous Players Under Noise
NASA Astrophysics Data System (ADS)
Liu, Feng; Li, Yaguang
A nonlinear price game among four oligopolies with heterogeneous players, who are boundedly rational and adaptive, is built using two different special demand costs. Based on the theory of complex discrete dynamical systems, the stability and existence of the equilibrium points are investigated. The complex dynamic behavior is presented via bifurcation diagrams and Lyapunov exponents, which show the equilibrium state, bifurcation, and chaos as the parameters vary. Since disturbance is ubiquitous in economic systems, this paper focuses on the analysis of the delay feedback control method under noise. Stable dynamics is confirmed to depend mainly on a low price-adjustment speed, and although all four players have only limited opportunities to stabilize the market, the profits of the new adaptive player are found to be higher than those of the boundedly rational incumbents.
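The dependence of stability on the price-adjustment speed can be illustrated with a single-firm gradient-adjustment map (a standard bounded-rationality toy, not the paper's four-player system; demand and cost parameters are assumed):

```python
def price_map(p, alpha, a=10.0, b=1.0, c=2.0):
    """Boundedly rational gradient adjustment: the firm changes its price
    in proportion to the current marginal profit.  Demand is q = a - b*p,
    profit is (p - c)*q, so marginal profit is a - 2*b*p + b*c."""
    marginal_profit = a - 2 * b * p + b * c
    return p + alpha * p * marginal_profit

def iterate(p0, alpha, steps=500):
    p = p0
    for _ in range(steps):
        p = price_map(p, alpha)
    return p

p_star = (10.0 + 1.0 * 2.0) / (2 * 1.0)  # analytic equilibrium, p* = 6
# Low adjustment speed: the map contracts toward p*.  Raising alpha
# eventually violates |1 + alpha*(a - 4*b*p* + b*c)| < 1 and the
# equilibrium destabilizes via period-doubling, as in the bifurcation
# diagrams of the full game.
slow = iterate(3.0, alpha=0.02)
```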
ERIC Educational Resources Information Center
Pelphrey, Kevin A.; Shultz, Sarah; Hudac, Caitlin M.; Vander Wyk, Brent C.
2011-01-01
The expression of autism spectrum disorder (ASD) is highly heterogeneous, owing to the complex interactions between genes, the brain, and behavior throughout development. Here we present a model of ASD that implicates an early and initial failure to develop the specialized functions of one or more of the set of neuroanatomical structures involved…
NASA Astrophysics Data System (ADS)
Osipenko, A.; Krylov, K.
In ophiolite complexes of the Eastern Asian accretionary belts, spatial heterogeneity of geochemical parameters is established for the different components of the ophiolite sequence: restite mantle-derived peridotites, the cumulate layered complex, and the volcanics. This heterogeneity appears both at the regional level (tens to hundreds of km) and at the level of local structures (hundreds of meters to the first tens of km). As a rule, the distinction is observed across a suite of geochemical parameters (concentrations and shapes of REE spectra, PGE distribution, isotope characteristics, Cr-spinel and pyroxene compositions, etc.). Revealed simultaneously in several suprasubduction-type ophiolite belts (Kamuikotan, Philippines, New Guinea, etc.), these spatial variations of geochemical parameters are discrete rather than gradual. Several mechanisms have been offered to explain the compositional heterogeneity of ophiolites: (1) tectonic stacking of various lithospheric fragments; (2) differences in deep processes that produce compositional heterogeneity of rocks from the same lithospheric level; (3) heterogeneity of the upper mantle and/or mantle metasomatism; (4) evolution of the ophiolites (Shervais, 2001) and/or of the center of magma generation (mixing of a continuous series of melt portions separated during different stages of progressive melting of the mantle source (Bazylev et al., 2001)); and (5) preservation, under suprasubduction conditions, of relict blocks of lower lithosphere and upper mantle from the previous stage. The authors consider the regional geochemical heterogeneity and segmentation of suprasubduction (SSZ-type) ophiolites using the example of peridotites from the Eastern Kamchatka ophiolite belt (EKOB), where sublongitudinal zones crossing the main geological structures of the peninsula (including the EKOB) were delineated earlier. For each zone, a set of geochemical attributes is established that is consistent within the zone but distinct from the characteristics of the other zones.
Among the factors causing unequal degrees of partial melting of the peridotites, the main role is played by the geothermal regime and the composition of the fluid phase (above all, the role of aqueous fluid is great). These parameters are in turn controlled by the geodynamic regime of magma generation (characteristics such as the subduction rate and the geometry of the subducted plate) and ultimately determine the ascent rate of the mantle diapir, the depth at which partial melting terminates, the amount of extracted melt, and the shape and capacity of the magma chamber. The local heterogeneity in SSZ ophiolites is considered using the example of the Kamchatka Cape Peninsula complex, the largest ophiolite complex in the EKOB. Isotopic, geochemical, and mineralogical studies have shown that the volumetrically dominant part of this complex consists of suprasubduction-type magmatic rocks (restite, highly depleted harzburgites and the related layered cumulate complex), whereas the peridotites of the harzburgite-lherzolite series and the high-grade metabasites (retrograde eclogites and garnet amphibolites) correspond compositionally to the N-MORB and T-MORB series. The presence in the Kamchatka Cape Peninsula ophiolite of moderately and weakly depleted peridotites of the harzburgite-lherzolite series alongside the highly depleted harzburgites suggests that the Late Mesozoic suprasubduction ophiolites were formed on a peridotite basement of abyssal type. The transformation of this "oceanic" substrate was incomplete, which allowed relict peridotites of lherzolitic type and high-pressure metamorphics to be preserved. This probably reflects a pulsed geodynamic character of suprasubduction ophiolite formation, possibly connected with jumps of the spreading axes under suprasubduction conditions. During the subsequent multistage nappe stacking toward the northeast in Late Cretaceous time, disintegrated fragments of both mantle complexes were tectonically juxtaposed.
In the report, alternative versions of tectonic models for the development of the Eastern Kamchatka ophiolites are also discussed.
NASA Astrophysics Data System (ADS)
Nuh, M. Z.; Nasir, N. F.
2017-08-01
Biodiesel is a fuel comprising mono-alkyl esters of long-chain fatty acids derived from renewable lipid feedstocks such as vegetable oils and animal fats. Biodiesel production is a complex process that requires systematic design and optimization. However, no case study has applied the process system engineering (PSE) elements of superstructure optimization to the batch process, which involves complex problems formulated as mixed-integer nonlinear programs (MINLP). PSE offers a solution for complex engineering systems by providing viable tools and techniques to better manage and comprehend system complexity. This study aims, first, to apply PSE tools to the simulation and optimization of the biodiesel process and to develop mathematical models for the plant components in cases A, B, and C using published kinetic data; second, to determine an economic analysis for biodiesel production, focusing on a heterogeneous catalyst; and finally, to develop the superstructure for biodiesel production using a heterogeneous catalyst. The mathematical models are built from the superstructure, and the resulting mixed-integer nonlinear model is solved and the economics estimated using MATLAB. The optimization, with the objective of minimizing the annual production cost of the batch process, gives 23.2587 million USD for case C. Overall, applying PSE has streamlined the modeling, design, and cost estimation, resolving the complexity of batch production and processing of biodiesel.
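A superstructure optimization of this kind picks one option per processing stage to minimize a (generally nonlinear) cost. A minimal sketch, with a hypothetical three-stage superstructure and entirely illustrative costs (none of these numbers come from the study), uses exhaustive enumeration in place of an MINLP solver, which is feasible only because the discrete space is tiny:

```python
from itertools import product

# Hypothetical superstructure: one discrete choice per processing stage.
# Annualized costs in million USD, illustrative only.
stages = {
    "catalyst":   {"heterogeneous": 4.0, "homogeneous": 5.5},
    "reactor":    {"batch": 6.0, "continuous": 7.0},
    "separation": {"distillation": 8.0, "membrane": 9.5},
}

def annual_cost(choice):
    """Objective of the toy MINLP: sum of stage costs plus an assumed
    nonlinear penalty for a homogeneous catalyst (extra washing steps)."""
    cost = sum(stages[s][c] for s, c in choice.items())
    if choice["catalyst"] == "homogeneous":
        cost += 0.5 * cost ** 0.5  # nonlinear purification penalty
    return cost

names = list(stages)
best = min(
    (dict(zip(names, combo)) for combo in product(*(stages[s] for s in names))),
    key=annual_cost,
)
# With these assumed costs the heterogeneous-catalyst route wins, echoing
# the study's focus on heterogeneous catalysis.
```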
Use of Graph Database for the Integration of Heterogeneous Biological Data.
Yoon, Byoung-Ha; Kim, Seon-Kyu; Kim, Seon-Young
2017-03-01
Understanding complex relationships among heterogeneous biological data is one of the fundamental goals in biology. In most cases, diverse biological data are stored in relational databases, such as MySQL and Oracle, which store data in multiple tables and then infer relationships by multiple-join statements. Recently, a new type of database, called the graph-based database, was developed to natively represent various kinds of complex relationships, and it is widely used among computer science communities and IT industries. Here, we demonstrate the feasibility of using a graph-based database for complex biological relationships by comparing the performance between MySQL and Neo4j, one of the most widely used graph databases. We collected various biological data (protein-protein interaction, drug-target, gene-disease, etc.) from several existing sources, removed duplicate and redundant data, and finally constructed a graph database containing 114,550 nodes and 82,674,321 relationships. When we tested the query execution performance of MySQL versus Neo4j, we found that Neo4j outperformed MySQL in all cases. While Neo4j exhibited a very fast response for various queries, MySQL exhibited latent or unfinished responses for complex queries with multiple-join statements. These results show that using graph-based databases, such as Neo4j, is an efficient way to store complex biological relationships. Moreover, querying a graph database in diverse ways has the potential to reveal novel relationships among heterogeneous biological data.
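The advantage of native relationship storage can be sketched with a toy heterogeneous graph: each hop is a direct adjacency lookup, whereas the relational equivalent needs one table join per hop (entities, relation names, and the Cypher comment below are illustrative, not the authors' schema):

```python
# Tiny heterogeneous biological graph as adjacency lists.  A graph database
# such as Neo4j stores this natively; the equivalent Cypher query would be
# roughly (hypothetical labels):
#   MATCH (d:Drug {name:"drugA"})-[:TARGETS]->()-[:INTERACTS]->()
#         -[:ASSOCIATED_WITH]->(z:Disease) RETURN z
edges = {
    ("drugA", "targets"): ["geneX"],
    ("geneX", "interacts"): ["geneY"],
    ("geneY", "associated_with"): ["diseaseZ"],
}

def neighbors(node, relation):
    return edges.get((node, relation), [])

def drug_to_disease(drug):
    """Three-hop traversal (drug -> target -> interactor -> disease);
    in a relational store this is a three-table join."""
    found = []
    for gene in neighbors(drug, "targets"):
        for partner in neighbors(gene, "interacts"):
            found.extend(neighbors(partner, "associated_with"))
    return found
```

Traversal cost here depends only on the degrees of the visited nodes, which is why graph databases tend to outperform multi-join SQL as path length grows.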
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spentzouris, Panagiotis; /Fermilab; Cary, John
The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modeling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator-simulation paradigm shift from high-fidelity single-physics-process modeling (covered under SciDAC-1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-12
... Administration (FHA): Multifamily Accelerated Processing (MAP)--Lender and Underwriter Eligibility Criteria and....gov . FOR FURTHER INFORMATION CONTACT: Terry W. Clark, Office of Multifamily Development, Office of... qualifications could underwrite loans involving more complex multifamily housing programs and transactions. II...
Acceleration during magnetic reconnection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beresnyak, Andrey; Li, Hui
2015-07-16
The presentation begins with colorful depictions of solar X-ray flares and references to pulsar phenomena. Plasma reconnection is complex: it can be X-point dominated or turbulent, and field lines can break due to either resistivity or non-ideal effects such as electron pressure anisotropy. Electron acceleration is sometimes observed and sometimes not. One way to study this complex problem is to gather many examples of the process (reconnection) and compare them; the other is to simplify and arrive at something robust. Ideal MHD (E = 0) turbulence driven by magnetic energy is assumed, and first-order acceleration is sought. It is found that dissipation in large current sheets (length > 100 ion skin depths) is universal and independent of microscopic resistivity and the mean imposed field; particles are regularly accelerated while experiencing curvature drift in flows driven by magnetic tension. One example of such a flow is spontaneous reconnection. This explains hot electrons with a power-law tail in solar flares, as well as ultrashort time variability in some astrophysical sources.
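How first-order acceleration yields a power-law tail can be sketched with a textbook Fermi-type Monte Carlo (a generic illustration, not the presenter's simulation; the gain and escape probabilities are assumed): each curvature-drift encounter multiplies a particle's energy by a fixed factor, and the particle escapes the sheet with a fixed probability per encounter.

```python
import random

def fermi_energies(n, gain=0.1, escape=0.05, seed=2):
    """Toy first-order acceleration: each encounter multiplies the energy
    by (1 + gain); the particle escapes with probability `escape` per
    encounter.  The survivors form a power-law tail with index
    s = -ln(1 - escape) / ln(1 + gain)."""
    rng = random.Random(seed)
    energies = []
    for _ in range(n):
        e = 1.0
        while rng.random() > escape:
            e *= 1 + gain
        energies.append(e)
    return energies

es = fermi_energies(20000)
# With gain=0.1, escape=0.05 the predicted tail index is ~0.54, so about
# 10^-0.54 ~ 29% of particles should exceed ten times the injection energy.
tail = sum(1 for e in es if e > 10) / len(es)
```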
Li, L; Ng, T B; Song, M; Yuan, F; Liu, Z K; Wang, C L; Jiang, Y; Fu, M; Liu, F
2007-06-01
The antioxidant effects of a polysaccharide-peptide complex (F22) from fruiting bodies of the mushroom Pleurotus abalonus were studied. The activities of superoxide dismutase (SOD), catalase (CAT), and glutathione peroxidase (GPx) in the liver, kidney, and brain of senescence-accelerated mice showed a marked increase after treatment with the polysaccharide-peptide complex. Concurrently, the gene expression levels of SOD, CAT, and GPx, as determined by real-time polymerase chain reaction, were up-regulated in the liver, kidney, and brain, whereas the malondialdehyde (MDA) content in these organs declined. The maximal lifespan of the mice was prolonged.
3D modeling of carbonates petro-acoustic heterogeneities
NASA Astrophysics Data System (ADS)
Baden, Dawin; Guglielmi, Yves; Saracco, Ginette; Marié, Lionel; Viseur, Sophie
2015-04-01
Characterizing carbonate reservoir heterogeneity is a challenging issue for the oil and gas industry, CO2 sequestration, and all kinds of fluid manipulations in natural reservoirs, because heterogeneities significantly affect fluid flow and storage within the reservoir. Although large-scale (> meter) heterogeneities such as petrophysical contrasts between layers are well addressed by computing facies-based models, small-scale (< meter) heterogeneities are often poorly constrained because their spatial arrangement is difficult to predict. In this study, we conducted petro-acoustic measurements on cores of different sizes and diameters (Ø = 1", 1.5", and 5") to evaluate anisotropy and heterogeneity in carbonates at different laboratory scales. Different types of heterogeneities that commonly occur in carbonate reservoir units (e.g., petrographic, diagenetic, and tectonic-related) were sampled. Dry and wet samples were investigated with different ultrasonic apparatus and sensors, allowing acoustic characterization over a bandwidth from 50 to 500 kHz. Comprehensive measurements on each sample allowed statistical analyses of petro-acoustic properties such as attenuation and shear- and longitudinal-wave velocities. The core properties (geological and acoustic facies) were modeled in 3D using photogrammetry and the GOCAD geo-modeler. This method successfully detected and imaged in three dimensions differential diagenetic effects, characterized by decimeter-scale diagenetic horizons in samples assumed to be homogeneous and/or by different diagenetic sequences between shell infills and the packing matrix. We then discuss how small interfaces such as cracks, stylolites, and laminations, which are also imaged, may have guided these differential effects, considering that understanding these processes can serve as an analogue for actual fluid-drainage complexity in deep carbonate reservoirs.
Neural Networks for Modeling and Control of Particle Accelerators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edelen, A. L.; Biedron, S. G.; Chase, B. E.
Particle accelerators are host to myriad nonlinear and complex physical phenomena. They often involve a multitude of interacting systems, are subject to tight performance demands, and should be able to run for extended periods of time with minimal interruptions. Often, traditional control techniques cannot fully meet these requirements. One promising avenue is to introduce machine learning and sophisticated control techniques inspired by artificial intelligence, particularly in light of recent theoretical and practical advances in these fields. Within machine learning and artificial intelligence, neural networks are particularly well-suited to modeling, control, and diagnostic analysis of complex, nonlinear, and time-varying systems, as well as systems with large parameter spaces. Consequently, the use of neural network-based modeling and control techniques could be of significant benefit to particle accelerators. For the same reasons, particle accelerators are also ideal test-beds for these techniques. Moreover, many early attempts to apply neural networks to particle accelerators yielded mixed results due to the relative immaturity of the technology for such tasks. The purpose of this paper is to re-introduce neural networks to the particle accelerator community and report on some work in neural network control that is being conducted as part of a dedicated collaboration between Fermilab and Colorado State University (CSU). We also describe some of the challenges of particle accelerator control, highlight recent advances in neural network techniques, discuss some promising avenues for incorporating neural networks into particle accelerator control systems, and describe a neural network-based control system that is being developed for resonance control of an RF electron gun at the Fermilab Accelerator Science and Technology (FAST) facility, including initial experimental results from a benchmark controller.
Neural Networks for Modeling and Control of Particle Accelerators
NASA Astrophysics Data System (ADS)
Edelen, A. L.; Biedron, S. G.; Chase, B. E.; Edstrom, D.; Milton, S. V.; Stabile, P.
2016-04-01
Particle accelerators are host to myriad nonlinear and complex physical phenomena. They often involve a multitude of interacting systems, are subject to tight performance demands, and should be able to run for extended periods of time with minimal interruptions. Often times, traditional control techniques cannot fully meet these requirements. One promising avenue is to introduce machine learning and sophisticated control techniques inspired by artificial intelligence, particularly in light of recent theoretical and practical advances in these fields. Within machine learning and artificial intelligence, neural networks are particularly well-suited to modeling, control, and diagnostic analysis of complex, nonlinear, and time-varying systems, as well as systems with large parameter spaces. Consequently, the use of neural network-based modeling and control techniques could be of significant benefit to particle accelerators. For the same reasons, particle accelerators are also ideal test-beds for these techniques. Many early attempts to apply neural networks to particle accelerators yielded mixed results due to the relative immaturity of the technology for such tasks. The purpose of this paper is to re-introduce neural networks to the particle accelerator community and report on some work in neural network control that is being conducted as part of a dedicated collaboration between Fermilab and Colorado State University (CSU). We describe some of the challenges of particle accelerator control, highlight recent advances in neural network techniques, discuss some promising avenues for incorporating neural networks into particle accelerator control systems, and describe a neural network-based control system that is being developed for resonance control of an RF electron gun at the Fermilab Accelerator Science and Technology (FAST) facility, including initial experimental results from a benchmark controller.
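The basic ingredient of neural-network modeling for such systems, fitting a small network to a nonlinear input-output response, can be sketched in a few lines (a generic illustration, not the FAST resonance controller; the target curve merely stands in for some nonlinear plant response):

```python
import math, random

rng = random.Random(0)

# Hypothetical plant response: a smooth nonlinear curve standing in for,
# e.g., a cavity response versus a control setting (not real FAST data).
data = [(x / 10.0, math.sin(x / 10.0 * 2.5)) for x in range(-10, 11)]

# One-hidden-layer network trained by plain stochastic gradient descent.
H = 8
w1 = [rng.uniform(-1, 1) for _ in range(H)]
b1 = [0.0] * H
w2 = [rng.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def forward(x):
    hidden = [math.tanh(w1[j] * x + b1[j]) for j in range(H)]
    return sum(w2[j] * hidden[j] for j in range(H)) + b2, hidden

def loss():
    return sum((forward(x)[0] - y) ** 2 for x, y in data) / len(data)

lr = 0.05
loss_before = loss()
for _ in range(2000):
    for x, y in data:
        out, hidden = forward(x)
        err = out - y
        for j in range(H):
            grad_h = err * w2[j] * (1 - hidden[j] ** 2)  # backprop through tanh
            w2[j] -= lr * err * hidden[j]
            b1[j] -= lr * grad_h
            w1[j] -= lr * grad_h * x
        b2 -= lr * err
loss_after = loss()
```

Once such a surrogate model fits the plant, it can be inverted or embedded in a model-predictive loop, which is the pattern the paper describes for accelerator control.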
Innovative HPC architectures for the study of planetary plasma environments
NASA Astrophysics Data System (ADS)
Amaya, Jorge; Wolf, Anna; Lembège, Bertrand; Zitz, Anke; Alvarez, Damian; Lapenta, Giovanni
2016-04-01
DEEP-ER is a European Commission-funded project that develops a new type of high-performance computing architecture. The system is currently used by KU Leuven to study the effects of the solar wind on the global environments of the Earth and Mercury. The new architecture combines the versatility of Intel Xeon computing nodes with the power of the upcoming Intel Xeon Phi accelerators. Contrary to classical heterogeneous HPC architectures, where CPUs and accelerators customarily share the same computing nodes, in the DEEP-ER system the CPU nodes are grouped together (the Cluster) independently from the accelerator nodes (the Booster). The system is equipped with a state-of-the-art interconnection network, highly scalable and fast I/O, and a fail-recovery resiliency system. The final objective of the project is to introduce a scalable design that can be used to create the next generation of exascale supercomputers. The iPic3D code from KU Leuven is being adapted to this new architecture. This particle-in-cell code can now compute the electromagnetic fields on the Cluster while the particles are moved on the Booster side. Using fast and scalable Xeon Phi accelerators in the Booster, we can introduce many more particles per cell than is possible on the current generation of HPC systems, allowing fully kinetic plasmas to be calculated with very low interpolation noise. The system will be used to perform fully kinetic, low-noise, 3D simulations of the interaction of the solar wind with the magnetospheres of the Earth and Mercury. Preliminary simulations have been performed at other HPC centers in order to compare results across systems.
In this presentation we show the complexity of the plasma flow around the planets, including the development of hydrodynamic instabilities at the flanks, the presence of the collisionless shock, the magnetosheath, the magnetopause, reconnection zones, the formation of the plasma sheet and the magnetotail, and the variation of ion and electron plasma flows when crossing these frontiers. The simulations also give access to detailed information about particle dynamics and velocity distributions at locations that can be used for comparison with satellite data.
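The Cluster/Booster division of labor maps onto the two halves of a particle-in-cell step: a field solve over the grid and a particle push against that field. A minimal 1D electrostatic sketch of the two kernels (a toy explicit scheme with arbitrary units and nearest-grid-point gather, not iPic3D's implicit formulation):

```python
def field_solve(charge_density, dx=1.0):
    """Cluster-side kernel: integrate Gauss's law in 1D to get E on the
    grid (toy model, arbitrary units, zero-field left boundary)."""
    e = [0.0]
    for rho in charge_density[:-1]:
        e.append(e[-1] + rho * dx)
    return e

def particle_push(positions, velocities, e_field, dt=0.1, dx=1.0, qm=-1.0):
    """Booster-side kernel: gather E at each particle (nearest grid point)
    and advance velocity then position."""
    new_p, new_v = [], []
    for x, v in zip(positions, velocities):
        cell = min(int(x / dx), len(e_field) - 1)
        v = v + qm * e_field[cell] * dt
        new_p.append(x + v * dt)
        new_v.append(v)
    return new_p, new_v

# With zero charge the field vanishes and particles simply drift.
e0 = field_solve([0.0] * 8)
p, v = particle_push([1.0], [2.0], e0)
```

In the DEEP-ER mapping, the grid arrays live on the Cluster and the (much larger) particle arrays on the Booster, with only grid-sized data exchanged each step, which is what makes very high particle-per-cell counts affordable.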
Smith, Benjamin A; Padrick, Shae B; Doolittle, Lynda K; Daugherty-Clarke, Karen; Corrêa, Ivan R; Xu, Ming-Qun; Goode, Bruce L; Rosen, Michael K; Gelles, Jeff
2013-09-03
During cell locomotion and endocytosis, membrane-tethered WASP proteins stimulate actin filament nucleation by the Arp2/3 complex. This process generates highly branched arrays of filaments that grow toward the membrane to which they are tethered, a conflict that seemingly would restrict filament growth. Using three-color single-molecule imaging in vitro we revealed how the dynamic associations of Arp2/3 complex with mother filament and WASP are temporally coordinated with initiation of daughter filament growth. We found that WASP proteins dissociated from filament-bound Arp2/3 complex prior to new filament growth. Further, mutations that accelerated release of WASP from filament-bound Arp2/3 complex proportionally accelerated branch formation. These data suggest that while WASP promotes formation of pre-nucleation complexes, filament growth cannot occur until it is triggered by WASP release. This provides a mechanism by which membrane-bound WASP proteins can stimulate network growth without restraining it. DOI:http://dx.doi.org/10.7554/eLife.01008.001.
Puetzer, Jennifer L; Bonassar, Lawrence J
2016-07-01
The meniscus is a dense fibrocartilage tissue that withstands the complex loads of the knee via a unique organization of collagen fibers. Attempts to condition engineered menisci with compressive or tensile loading alone have failed to reproduce this complex structure on the microscale or the anatomic scale. Here we show that axial loading of anatomically shaped tissue-engineered meniscus constructs produced spatial distributions of local strain similar to those seen in the meniscus when the knee is loaded at full extension. Such loading drove the formation of tissue with large organized collagen fibers and with levels of mechanical anisotropy and compressive moduli that match native tissue. Loading accelerated the development of native-sized, aligned circumferential and radial collagen fibers. These loading patterns contained both tensile and compressive components that enhanced the major biochemical and functional properties of the meniscus: loading significantly improved glycosaminoglycan (GAG) accumulation by 200-250%, collagen accumulation by 40-55%, the equilibrium modulus by 1000-1800%, and the tensile moduli by 500-1200% (radial and circumferential). Furthermore, this study demonstrates that local changes in the mechanical environment drive heterogeneous tissue development and organization within individual constructs, highlighting the importance of recapitulating native loading environments. Loaded menisci developed cartilage-like tissue with rounded cells, a dense collagen matrix, and increased GAG accumulation in the more compressively loaded horns, and fibrous, collagen-rich tissue in the more tensile-loaded outer two-thirds, similar to native menisci. The loaded constructs reached a level of organization not seen in any previously engineered menisci and show great promise as meniscal replacements.
NASA Astrophysics Data System (ADS)
Stark, Julian; Rothe, Thomas; Kieß, Steffen; Simon, Sven; Kienle, Alwin
2016-04-01
Single cell nuclei were investigated using two-dimensional angularly and spectrally resolved scattering microscopy. We show that even for a qualitative comparison of experimental and theoretical data, the standard Mie model of a homogeneous sphere proves to be insufficient. Hence, an accelerated finite-difference time-domain method using a graphics processor unit and domain decomposition was implemented to analyze the experimental scattering patterns. The measured cell nuclei were modeled as single spheres with randomly distributed spherical inclusions of different size and refractive index representing the nucleoli and clumps of chromatin. Taking into account the nuclear heterogeneity of a large number of inclusions yields a qualitative agreement between experimental and theoretical spectra and illustrates the impact of the nuclear micro- and nanostructure on the scattering patterns.
NASA Astrophysics Data System (ADS)
Sherman, Eilon
2016-06-01
Signal transduction is mediated by heterogeneous and dynamic protein complexes. Such complexes play a critical role in diverse cell functions, with the important example of T cell activation. Biochemical studies of signalling complexes and their imaging by diffraction-limited microscopy have resulted in an intricate network of interactions downstream of the T cell antigen receptor (TCR). However, in spite of their crucial roles in T cell activation, much remains to be learned about these signalling complexes, including their heterogeneous contents and size distribution, their complex arrangements in the plasma membrane (PM), and the molecular requirements for their formation. Here, we review how recent advancements in single-molecule localization microscopy (SMLM) have helped to shed new light on the organization of signalling complexes in single-molecule detail in intact T cells. From these studies emerges a picture in which cells extensively employ hierarchical and dynamic patterns of nano-scale organization to control the local concentration of interacting molecular species. These patterns are suggested to play a critical role in cell decision making. The combination of SMLM with more traditional techniques is expected to continue to contribute critically to our understanding of multimolecular protein complexes and their significance to cell function.
Hoffmann, Max J.; Engelmann, Felix; Matera, Sebastian
2017-01-31
Lattice kinetic Monte Carlo simulations have become a vital tool for predictive-quality atomistic understanding of complex surface chemical reaction kinetics over a wide range of reaction conditions. In order to expand their practical value in terms of giving guidelines for the atomic-level design of catalytic systems, it is very desirable to readily evaluate a sensitivity analysis for a given model. The result of such a sensitivity analysis quantitatively expresses the dependency of the turnover frequency, being the main output variable, on the rate constants entering the model. In the past, the application of sensitivity analysis, such as degree of rate control, has been hampered by the exorbitant computational effort required to accurately sample numerical derivatives of a property obtained from a stochastic simulation method. In this study, we present an efficient and robust three-stage approach that is capable of reliably evaluating the sensitivity measures for stiff microkinetic models, as we demonstrate using CO oxidation on RuO2(110) as a prototypical reaction. In the first step, we utilize the Fisher information matrix for filtering out elementary processes which only yield negligible sensitivity. Then we employ an estimator based on linear response theory for calculating the sensitivity measure for non-critical conditions, which covers the majority of cases. Finally, we adopt a method for sampling coupled finite differences for evaluating the sensitivity measure of lattice-based models. This allows efficient evaluation even in critical regions near a second-order phase transition that are hitherto difficult to control. The combined approach leads to significant computational savings over straightforward numerical derivatives and should aid in accelerating the nanoscale design of heterogeneous catalysts.
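The sensitivity measure discussed here, the degree of rate control X_i = d ln(TOF)/d ln(k_i), can be illustrated with a minimal sketch. This is a hypothetical two-step mean-field toy cycle (adsorption k1, surface reaction k2) with made-up rate constants, not the paper's stochastic lattice kMC setting:

```python
import math

def tof(k1, k2):
    # Steady-state turnover frequency of a two-step cycle
    # A + * -> A* (rate constant k1), A* -> B + * (rate constant k2):
    # coverage theta = k1/(k1 + k2), TOF = k2 * theta.
    return k1 * k2 / (k1 + k2)

def degree_of_rate_control(tof_fn, ks, i, rel=1e-6):
    # X_i = d ln(TOF) / d ln(k_i), via a central finite difference in log space.
    up, dn = list(ks), list(ks)
    up[i] *= 1 + rel
    dn[i] *= 1 - rel
    return (math.log(tof_fn(*up)) - math.log(tof_fn(*dn))) / \
           (math.log(1 + rel) - math.log(1 - rel))

k1, k2 = 1.0, 100.0  # hypothetical rate constants; step 1 is rate-limiting
X1 = degree_of_rate_control(tof, [k1, k2], 0)
X2 = degree_of_rate_control(tof, [k1, k2], 1)
# For this model X1 = k2/(k1+k2) ~ 0.990 and X2 = k1/(k1+k2) ~ 0.010;
# for a single cycle the degrees of rate control sum to 1.
```

In a stochastic simulation the TOF estimate itself is noisy, which is exactly why such plain finite differences become expensive and why the paper's three-stage approach is needed.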
Controls on Seasonal Terminus Positions at Central West Greenland Tidewater Glaciers
NASA Astrophysics Data System (ADS)
Fried, M.; Catania, G. A.; Bartholomaus, T. C.; Stearns, L. A.; Sutherland, D.; Shroyer, E.; Nash, J. D.; Carroll, D.
2016-12-01
Each year, tidewater glaciers in Greenland undergo seasonal terminus position cycles, characterized by wintertime advance and summertime retreat. In many cases, this seasonal cycle is superimposed on long-term terminus retreat. Understanding the mechanisms that control the seasonal cycle, and how such controls differ between glaciers, might elucidate how tidewater glaciers regulate dynamic ice loss on these longer timescales. However, the controls on terminus position are numerous and complex, making it difficult to identify the dominant process controlling terminus position. To address this, we examine satellite-derived terminus position time series for a suite of glaciers in central west Greenland in conjunction with observations of environmental forcings. In particular, we focus on estimated runoff at the glacier grounding line, mélange conditions in the proglacial fjord, and (where possible) in-situ measurements of ocean temperature. We find that seasonal terminus advance and retreat more closely follow the presence or absence of runoff than mélange conditions and, where studied, ocean forcing. At the majority of glaciers studied, localized terminus ablation occurs where runoff-driven submarine melt emerges at the grounding line. This often induces heterogeneous rates of retreat across the glacier front and leads to the formation of local terminus embayments. Calving accelerates in these embayments, allowing local runoff to influence the magnitude and timing of mean seasonal retreat. At glaciers with grounding line depths in excess of 500 m, localized retreat due to submarine melt can be outstripped by large slab-rotation calving events, likely initiated by different forcing mechanisms. Our observations emphasize that across-flow heterogeneities in terminus position are diagnostic of how runoff-induced melt helps control seasonal terminus cycles.
Gong, Weili; Zhang, Huaiqiang; Tian, Li; Liu, Shijia; Wu, Xiuyun; Li, Fuli; Wang, Lushan
2016-07-01
The structure of xylan, which has a 1,4-linked β-xylose backbone with various substituents, is much more heterogeneous and complex than that of cellulose. Because of this, complete degradation of xylan requires a large number of enzymes, including GH10, GH11, and GH3 family xylanases together with auxiliary enzymes. Fluorescence-assisted carbohydrate electrophoresis (FACE) is able to accurately differentiate unsubstituted and substituted xylooligosaccharides (XOS) in the heterogeneous products generated by different xylanases and allows changes in the concentrations of specific XOS to be analyzed quantitatively. Based on a quantitative analysis of XOS profiles over time using FACE, we have demonstrated that GH10 and GH11 family xylanases immediately degrade xylan into sizeable XOS, which are converted into smaller XOS at a much lower rate. The shortest substituted XOS produced by hydrolysis of the substituted xylan backbone by GH10 and GH11 family xylanases were MeGlcA(2)Xyl3 and MeGlcA(2)Xyl4, respectively. The unsubstituted xylan backbone was degraded into xylose, xylobiose, and xylotriose by both GH10 and GH11 family xylanases; the product profiles are not family-specific but, instead, depend on the different subsite binding affinities in the active sites of individual enzymes. Synergistic action between xylanases and β-xylosidase degraded MeGlcA(2)Xyl4 into xylose and MeGlcA(2)Xyl3, but further degradation of MeGlcA(2)Xyl3 required additional enzymes. Synergy between xylanases and β-xylosidase was also found to significantly accelerate the conversion of XOS into xylose. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Xu, Junshan; Zhang, Baohua
2018-03-01
The development of stress heterogeneity in two-phase rocks was investigated via a finite element method at 1000-1200 K and 100 MPa. Two groups of rock models were considered, anorthite-diopside and anorthite-clinopyroxene, with a phase volume ratio of 1:1 in each group and dislocation creep rates differing between the phases by 4-8 orders of magnitude. Our numerical results indicate that the stress inside the model can be several times higher than the differential stress applied to the model, and that stress tends to concentrate in the hard phase, especially near sharp boundaries with the soft phase. Moreover, the large stress gradient in the hard phase and the nearly homogeneous stress in the soft phase will lead to the initiation of localized dynamic recrystallization or fracture. These numerical observations suggest that the rheological contrast between the two phases, rather than other factors (such as grain size, boundary conditions, or mesh density), plays a crucial role in stress heterogeneity, which may eventually accelerate the development of stress heterogeneity in the lower crust. Our study provides new insights into the dynamic processes of grain size reduction in the lower crust, which may cause the transformation from dislocation creep to diffusion creep and enable weakened shear zones to form.
Sun, Caili; Chai, Zongzheng; Liu, Guobin; Xue, Sha
2017-01-01
Analyzing the dynamic patterns of species diversity and the spatial heterogeneity of vegetation in grasslands during secondary succession could help with the maintenance and management of these ecosystems. Here, we evaluated the influence of secondary succession on grassland plant diversity and spatial heterogeneity in abandoned croplands on the Loess Plateau (China) during four phases of recovery: 1-5, 5-10, 10-20, and 20-30 years. The species composition and dominance of the grassland vegetation changed markedly during secondary succession and formed a clear successional series, with the species assemblage dominated by Artemisia capillaris → Heteropappus altaicus → A. sacrorum. The diversity pattern was one of low-high-low, with diversity peaking in the 10-20 year phase, corresponding to a hump-backed model in which maximum diversity occurs at the intermediate stages. A spatially aggregated pattern prevailed throughout the entire period of grassland recovery; this was likely linked to the dispersal properties of herbaceous plants and to high habitat heterogeneity. We conclude that natural succession was conducive to the successful recovery of native vegetation. From a management perspective, native pioneer tree species should be introduced about 20 years after abandoning croplands to accelerate the natural succession of grassland vegetation.
Assembly of tissue engineered blood vessels with spatially-controlled heterogeneities.
Strobel, Hannah A; Hookway, Tracy; Piola, Marco; Fiore, Gianfranco Beniamino; Soncini, Monica; Alsberg, Eben; Rolle, Marsha
2018-05-04
Tissue-engineered human blood vessels may enable in vitro disease modeling and drug screening to accelerate advances in vascular medicine. Existing methods for tissue engineered blood vessel (TEBV) fabrication create homogenous tubes not conducive to modeling the focal pathologies characteristic of vascular disease. We developed a system for generating self-assembled human smooth muscle cell ring-units, which were fused together into TEBVs. The goal of this study was to assess the feasibility of modular assembly and fusion of ring building units to fabricate spatially-controlled, heterogeneous tissue tubes. We first aimed to enhance fusion and reduce total culture time, and determined that reducing ring pre-culture duration improved tube fusion. Next, we incorporated electrospun polymer ring units onto tube ends as reinforced extensions, which allowed us to cannulate tubes after only 7 days of fusion, and culture tubes with luminal flow in a custom bioreactor. To create focal heterogeneities, we incorporated gelatin microspheres into select ring units during self-assembly, and fused these rings between ring units without microspheres. Cells within rings maintained their spatial position within tissue tubes after fusion. This work describes a platform approach for creating modular TEBVs with spatially-defined structural heterogeneities, which may ultimately be applied to mimic focal diseases such as intimal hyperplasia or aneurysm.
Cloud-based mobility management in heterogeneous wireless networks
NASA Astrophysics Data System (ADS)
Kravchuk, Serhii; Minochkin, Dmytro; Omiotek, Zbigniew; Bainazarov, Ulan; Weryńska-Bieniasz, Róża; Iskakova, Aigul
2017-08-01
Mobility management is the key feature that supports the roaming of users between different systems. Handover is the essential aspect in the development of solutions supporting mobility scenarios, and the handover process becomes more complex in a heterogeneous environment than in a homogeneous one. Seamlessness and reduced delay in servicing handover calls, which can lower the handover dropping probability, also require complex algorithms to provide the desired QoS for mobile users. A challenging problem is to increase the scalability and availability of handover decision mechanisms. The aim of the paper is to propose a cloud-based handover-as-a-service concept to cope with the challenges that arise.
NASA Technical Reports Server (NTRS)
Cho, S. Y.; Yetter, R. A.; Dryer, F. L.
1992-01-01
Various chemically reacting flow problems that highlight chemical and physical fundamentals rather than flow geometry are investigated by means of a comprehensive mathematical model incorporating multicomponent molecular diffusion, complex chemistry, and heterogeneous processes, in the interest of obtaining sensitivity-related information. The sensitivity equations were decoupled from those of the model and integrated one time step behind the model equations, and analytical Jacobian matrices were applied to improve the accuracy of the sensitivity coefficients calculated together with the model solutions.
Curvature and temperature of complex networks.
Krioukov, Dmitri; Papadopoulos, Fragkiskos; Vahdat, Amin; Boguñá, Marián
2009-09-01
We show that heterogeneous degree distributions in observed scale-free topologies of complex networks can emerge as a consequence of the exponential expansion of hidden hyperbolic space. Fermi-Dirac statistics provides a physical interpretation of hyperbolic distances as energies of links. The hidden space curvature affects the heterogeneity of the degree distribution, while clustering is a function of temperature. We embed the Internet into the hyperbolic plane and find a remarkable congruency between the embedding and our hyperbolic model. Besides demonstrating that our model is realistic, this embedding may be used for routing with only local information, which holds significant promise for improving the performance of Internet routing.
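The mechanism described here can be sketched as a toy generator: points are placed quasi-uniformly in a hyperbolic disk and linked with the Fermi-Dirac probability p(x) = 1/(1 + exp((x - R)/(2T))), where x is the hyperbolic distance. All parameter values below (n, R, T, alpha) are illustrative choices, not taken from the paper:

```python
import math
import random

def hyperbolic_graph(n=500, R=12.0, T=0.5, alpha=1.0, seed=1):
    """Sample n points quasi-uniformly in a hyperbolic disk of radius R and
    connect each pair with probability 1 / (1 + exp((x - R) / (2*T)))."""
    rng = random.Random(seed)
    # Radial density proportional to sinh(alpha*r): inverse-CDF sampling.
    rs = [math.acosh(1 + (math.cosh(alpha * R) - 1) * rng.random()) / alpha
          for _ in range(n)]
    ths = [2 * math.pi * rng.random() for _ in range(n)]
    edges = []
    for i in range(n):
        for j in range(i + 1, n):
            dth = math.pi - abs(math.pi - abs(ths[i] - ths[j]))  # angular gap
            # Common approximation to the hyperbolic distance between points.
            x = rs[i] + rs[j] + 2 * math.log(max(dth / 2, 1e-12))
            if rng.random() < 1.0 / (1.0 + math.exp((x - R) / (2 * T))):
                edges.append((i, j))
    return edges

edges = hyperbolic_graph()
degree = {}
for i, j in edges:
    degree[i] = degree.get(i, 0) + 1
    degree[j] = degree.get(j, 0) + 1
# Nodes near the disk center (small r) acquire many links, producing a
# heterogeneous, heavy-tailed degree distribution.
```

Lowering the temperature T sharpens the Fermi-Dirac step and raises clustering, mirroring the abstract's statement that clustering is a function of temperature.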
NASA Astrophysics Data System (ADS)
Lucas, Iris; Cotsaftis, Michel; Bertelle, Cyrille
2017-12-01
Multiagent systems (MAS) provide a useful tool for exploring the complex dynamics and behavior of financial markets, and the MAS approach has now been widely implemented and documented in the empirical literature. This paper introduces the implementation of an innovative multi-scale mathematical model for a computational agent-based financial market. The paper develops a method to quantify the degree of self-organization that emerges in the system and shows that the capacity for self-organization is maximized when the agent behaviors are heterogeneous. Numerical results are presented and analyzed, showing how the global market behavior emerges from the interactions of specific individual behaviors.
Improving the Aircraft Design Process Using Web-Based Modeling and Simulation
NASA Technical Reports Server (NTRS)
Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.; Follen, Gregory J. (Technical Monitor)
2000-01-01
Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.
Improving the Aircraft Design Process Using Web-based Modeling and Simulation
NASA Technical Reports Server (NTRS)
Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.
2003-01-01
Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.
The noisy voter model on complex networks.
Carro, Adrián; Toral, Raúl; San Miguel, Maxi
2016-04-20
We propose a new analytical method to study stochastic, binary-state models on complex networks. Moving beyond the usual mean-field theories, this alternative approach is based on the introduction of an annealed approximation for uncorrelated networks, allowing us to treat the network structure as parametric heterogeneity. As an illustration, we study the noisy voter model, a modification of the original voter model that includes random changes of state. The proposed method is able to unfold the dependence of the model not only on the mean degree (the mean-field prediction) but also on more complex averages over the degree distribution. In particular, we find that the degree heterogeneity (the variance of the underlying degree distribution) has a strong influence on the location of the critical point of a noise-induced, finite-size transition occurring in the model, on the local ordering of the system, and on the functional form of its temporal correlations. Finally, we show how this latter point opens the possibility of inferring the degree heterogeneity of the underlying network by observing only the aggregate behavior of the system as a whole, an issue of interest for systems where only macroscopic, population-level variables can be measured.
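The noisy voter dynamics described above can be sketched in a minimal simulation. The substrate network here is a hypothetical Erdős-Rényi graph and the noise rate and sizes are illustrative; the paper itself analyzes the model analytically on general uncorrelated networks:

```python
import random

def noisy_voter(adj, a=0.01, steps=20000, seed=2):
    """Noisy voter model: at each step a random node either adopts a uniformly
    random state (with probability a, the noise) or copies a random neighbour."""
    rng = random.Random(seed)
    n = len(adj)
    state = [rng.randint(0, 1) for _ in range(n)]
    for _ in range(steps):
        i = rng.randrange(n)
        if rng.random() < a:
            state[i] = rng.randint(0, 1)          # random state change (noise)
        elif adj[i]:
            state[i] = state[rng.choice(adj[i])]  # ordinary voter update
    return state

# Hypothetical substrate: an Erdos-Renyi random graph as adjacency lists.
rng = random.Random(0)
n, p = 200, 0.05
adj = [[] for _ in range(n)]
for i in range(n):
    for j in range(i + 1, n):
        if rng.random() < p:
            adj[i].append(j)
            adj[j].append(i)

final = noisy_voter(adj)
m = sum(final) / n  # fraction of nodes in state 1
```

Sweeping the noise rate a while tracking order parameters such as m exhibits the noise-induced, finite-size transition the abstract refers to.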
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sherman, Andrew J
A heterogeneous body having ceramic-rich cermet regions in a more ductile metal matrix. The heterogeneous bodies are formed by thermal spray operations on metal substrates. The thermal spray operations apply heat to a cermet powder and project it onto a solid substrate. The cermet powder is composed of complex composite particles in which a complex ceramic-metallic core particle is coated with a matrix precursor. The cermet regions are generally comprised of complex ceramic-metallic composites that correspond approximately to the core particles. The cermet regions are approximately lenticular in shape, with an average width that is at least approximately twice the average thickness. The cermet regions are embedded within the matrix phase and generally isolated from one another. They have obverse and reverse surfaces. The matrix phase is formed from the matrix precursor coating on the core particles. The amount of heat applied during the formation of the heterogeneous body is controlled so that the core particles soften but do not become so fluid that they disperse throughout the matrix phase. The force of the impact on the surface of the substrate tends to flatten them. The flattened cermet regions tend to be approximately aligned with one another in the body.
Statistical and sampling issues when using multiple particle tracking
NASA Astrophysics Data System (ADS)
Savin, Thierry; Doyle, Patrick S.
2007-08-01
Video microscopy can be used to simultaneously track several microparticles embedded in a complex material. The trajectories are used to extract a sample of displacements at random locations in the material. From this sample, averaged quantities characterizing the dynamics of the probes are calculated to evaluate structural and/or mechanical properties of the assessed material. However, the sampling of measured displacements in heterogeneous systems is singular because the volume of observation with video microscopy is finite. By carefully characterizing the sampling design in the experimental output of the multiple particle tracking technique, we derive estimators for the mean and variance of the probes’ dynamics that are independent of the peculiar statistical characteristics. We expose stringent tests of these estimators using simulated and experimental complex systems with a known heterogeneous structure. Up to a certain fundamental limitation, which we characterize through a material degree of sampling by the embedded probe tracking, these estimators can be applied to quantify the heterogeneity of a material, providing an original and intelligible kind of information on complex fluid properties. More generally, we show that the precise assessment of the statistics in the multiple particle tracking output sample of observations is essential in order to provide accurate unbiased measurements.
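The pooling of displacements from multiple tracked probes can be sketched as follows. This is a simplified illustration on synthetic random-walk trajectories from two hypothetical probe populations (fast and slow, mimicking a heterogeneous medium), not the paper's bias-corrected estimators:

```python
import random

def pooled_sq_displacements(trajs, lag):
    # Pool squared displacements at a given lag time from all trajectories.
    out = []
    for tr in trajs:
        for t in range(len(tr) - lag):
            dx = tr[t + lag][0] - tr[t][0]
            dy = tr[t + lag][1] - tr[t][1]
            out.append(dx * dx + dy * dy)
    return out

def sample_mean_var(xs):
    m = sum(xs) / len(xs)
    v = sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # unbiased sample variance
    return m, v

# Synthetic data: two hypothetical probe populations embedded in a
# heterogeneous medium, modelled as 2D random walks with different step sizes.
rng = random.Random(3)

def walk(step, n=100):
    x = y = 0.0
    tr = [(x, y)]
    for _ in range(n):
        x += rng.gauss(0, step)
        y += rng.gauss(0, step)
        tr.append((x, y))
    return tr

trajs = [walk(1.0) for _ in range(10)] + [walk(0.2) for _ in range(10)]
msd, var = sample_mean_var(pooled_sq_displacements(trajs, lag=5))
# The pooled variance reflects both thermal fluctuations and the medium's
# heterogeneity (the two distinct probe mobilities).
```

As the abstract notes, naive pooled statistics like these are biased by the finite observation volume; separating dynamic spread from structural heterogeneity requires the corrected estimators the paper derives.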
Defect Genome of Cubic Perovskites for Fuel Cell Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balachandran, Janakiraman; Lin, Lianshan; Anchell, Jonathan S.
Heterogeneities such as point defects, inherent to material systems, can profoundly influence material functionalities critical for numerous energy applications. This influence can in principle be identified and quantified through the development of large defect data sets, which we call the defect genome, employing high-throughput ab initio calculations. However, high-throughput screening of material models with point defects dramatically increases the computational complexity and chemical search space, creating major impediments toward developing a defect genome. In this paper, we overcome these impediments by employing computationally tractable ab initio models driven by highly scalable workflows to study the formation and interaction of various point defects (e.g., O vacancies, H interstitials, and Y substitutional dopants) in over 80 cubic perovskites for potential proton-conducting ceramic fuel cell (PCFC) applications. The resulting defect data sets identify several promising perovskite compounds that can exhibit high proton conductivity. Furthermore, the data sets also enable us to identify and explain insightful and novel correlations among defect energies, material identities, and defect-induced local structural distortions. Finally, such defect data sets and the resultant correlations are necessary to build statistical machine learning models, which are required to accelerate the discovery of new materials.
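Defect energies in data sets like this are conventionally computed as formation energies, E_f = E[defect] - E[perfect] - sum_i n_i*mu_i + q*(E_F + E_VBM). A minimal sketch of that bookkeeping follows; all supercell energies and chemical potentials below are hypothetical placeholders, not the paper's ab initio values:

```python
def formation_energy(e_defect, e_perfect, delta_n, mu,
                     charge=0, e_fermi=0.0, e_vbm=0.0):
    """Point-defect formation energy:
    E_f = E[defect] - E[perfect] - sum_i n_i * mu_i + q * (E_F + E_VBM).
    delta_n maps species -> atoms added (negative if removed);
    mu maps species -> chemical potential (eV)."""
    exchange = sum(delta_n[s] * mu[s] for s in delta_n)
    return e_defect - e_perfect - exchange + charge * (e_fermi + e_vbm)

# Hypothetical numbers (eV) for a neutral O vacancy in a perovskite supercell:
ef = formation_energy(e_defect=-100.0, e_perfect=-105.0,
                      delta_n={"O": -1}, mu={"O": -4.0})
# ef = (-100.0) - (-105.0) - ((-1) * (-4.0)) = 1.0 eV
```

Tabulating such values across many hosts, defect types, and chemical-potential limits is the kind of bookkeeping that a defect-genome workflow automates at scale.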
2017-10-10
Costa - Introduction to 2015 Annual Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Costa, James E.
In parallel with Sandia National Laboratories having two major locations (NM and CA), along with a number of smaller facilities across the nation, its scientific, engineering, and computing resources are similarly distributed. As part of Sandia's Institutional Computing Program, CA site-based Sandia computer scientists and engineers have been providing mission and research staff with local CA-resident expertise on computing options while also focusing on two growing high-performance computing research problems. The first is how to increase system resilience to failure as machines grow larger, more complex, and heterogeneous. The second is how to ensure that computer hardware and configurations are optimized for specialized data-analytical mission needs within the overall Sandia computing environment, including the HPC subenvironment. All of these activities support the larger Sandia effort to accelerate the development and integration of high-performance computing into national security missions. Sandia continues to promote national R&D objectives, including the recent Presidential Executive Order establishing the National Strategic Computing Initiative, and to work to ensure that the full range of computing services and capabilities is available for all mission responsibilities, from national security to energy to homeland defense.
Computational Approaches to Drug Repurposing and Pharmacology
Hodos, Rachel A; Kidd, Brian A; Khader, Shameer; Readhead, Ben P; Dudley, Joel T
2016-01-01
Data in the biological, chemical, and clinical domains are accumulating at ever-increasing rates and have the potential to accelerate and inform drug development in new ways. Challenges and opportunities now lie in developing analytic tools to transform these often complex and heterogeneous data into testable hypotheses and actionable insights. This is the aim of computational pharmacology, which uses in silico techniques to better understand and predict how drugs affect biological systems, which can in turn improve clinical use, avoid unwanted side effects, and guide selection and development of better treatments. One exciting application of computational pharmacology is drug repurposing: finding new uses for existing drugs. Already yielding many promising candidates, this strategy has the potential to improve the efficiency of the drug development process and reach patient populations with previously unmet needs, such as those with rare diseases. While current techniques in computational pharmacology and drug repurposing often focus on just a single data modality, such as gene expression or drug-target interactions, we argue that methods such as matrix factorization that can integrate data within and across diverse data types have the potential to improve predictive performance and provide a fuller picture of a drug's pharmacological action. PMID:27080087
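As a concrete illustration of the matrix-factorization idea mentioned in the abstract, here is a minimal sketch (the toy interaction matrix, rank, learning rate, and regularization are all hypothetical choices, not from the paper): known drug-target interactions are factored into low-rank drug and target embeddings, and the reconstructed score for a held-out pair serves as a repurposing prediction.

```python
import numpy as np

def factorize(R, mask, rank=2, lr=0.01, reg=0.1, epochs=2000, seed=0):
    """Sketch of matrix factorization for a drug-target interaction
    matrix R; `mask` marks observed entries. Learns low-rank factors
    U (drugs) and V (targets) by gradient descent; unobserved entries
    of U @ V.T then serve as repurposing scores."""
    rng = np.random.default_rng(seed)
    n_drugs, n_targets = R.shape
    U = 0.1 * rng.standard_normal((n_drugs, rank))
    V = 0.1 * rng.standard_normal((n_targets, rank))
    for _ in range(epochs):
        E = mask * (R - U @ V.T)          # error on observed entries only
        U += lr * (E @ V - reg * U)
        V += lr * (E.T @ U - reg * V)
    return U @ V.T

# Toy binary interaction matrix: rows = drugs, cols = protein targets.
R = np.array([[1., 1., 0., 0.],
              [1., 1., 0., 0.],
              [0., 0., 1., 1.],
              [0., 0., 1., 1.]])
mask = np.ones_like(R)
mask[0, 1] = 0                            # hide one known interaction
scores = factorize(R, mask)
# The hidden drug-target pair should score higher than true negatives.
print(scores[0, 1] > scores[0, 2])
```

In a realistic setting the same factors could be regularized by side information (chemical similarity, gene expression), which is how such methods integrate data across modalities.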
Analysis and application of opinion model with multiple topic interactions.
Xiong, Fei; Liu, Yun; Wang, Liang; Wang, Ximeng
2017-08-01
To reveal heterogeneous behaviors of opinion evolution in different scenarios, we propose an opinion model with topic interactions. Individual opinions and topic features are represented by multidimensional vectors. We measure an agent's action towards a specific topic by the product of the opinion and the topic feature. When pairs of agents interact on a topic, their actions are introduced into opinion updates with bounded confidence. Simulation results show that a transition from a disordered state to a consensus state occurs at a critical point of the tolerance threshold, which depends on the opinion dimension. The critical point increases as the dimension of opinions increases. Multiple topics promote opinion interactions and lead to the formation of macroscopic opinion clusters. In addition, more topics accelerate the evolutionary process and weaken the effect of network topology. We use two sets of large-scale real data to evaluate the model, and the results prove its effectiveness in characterizing a real evolutionary process. Our model achieves high performance in individual action prediction and even outperforms state-of-the-art methods, while having much smaller computational complexity. This paper provides a demonstration of possible practical applications of theoretical opinion dynamics.
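The update rule described above (an action defined as an opinion-topic product, with updates gated by a tolerance threshold) can be sketched as follows; the convergence rate `mu` and the specific vectors are illustrative assumptions, not the paper's parameters.

```python
def action(opinion, topic):
    """An agent's action towards a topic: the product (here, inner
    product) of its opinion vector and the topic feature vector."""
    return sum(o * t for o, t in zip(opinion, topic))

def interact(op_i, op_j, topic, eps, mu=0.5):
    """Bounded-confidence update: opinions move toward each other
    only if the agents' actions on the topic differ by less than
    the tolerance threshold eps."""
    a_i, a_j = action(op_i, topic), action(op_j, topic)
    if abs(a_i - a_j) >= eps:
        return op_i, op_j              # outside tolerance: no change
    new_i = [oi + mu * (oj - oi) for oi, oj in zip(op_i, op_j)]
    new_j = [oj + mu * (oi - oj) for oi, oj in zip(op_i, op_j)]
    return new_i, new_j

topic = [1.0, 0.5]
a, b, c = [0.2, 0.4], [0.3, 0.5], [-0.9, -0.8]
a2, b2 = interact(a, b, topic, eps=0.3)   # similar actions: both move
a3, c3 = interact(a, c, topic, eps=0.3)   # dissimilar actions: unchanged
```

With `mu = 0.5` the two agents inside the tolerance meet at the midpoint of their opinion vectors, while the distant pair is left untouched, which is the mechanism behind the cluster formation the abstract reports.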
TransCut: interactive rendering of translucent cutouts.
Li, Dongping; Sun, Xin; Ren, Zhong; Lin, Stephen; Tong, Yiying; Guo, Baining; Zhou, Kun
2013-03-01
We present TransCut, a technique for interactive rendering of translucent objects undergoing fracturing and cutting operations. As the object is fractured or cut open, the user can directly examine and intuitively understand the complex translucent interior, as well as edit material properties through painting on cross sections and recombining the broken pieces—all with immediate and realistic visual feedback. This new mode of interaction with translucent volumes is made possible with two technical contributions. The first is a novel solver for the diffusion equation (DE) over a tetrahedral mesh that produces high-quality results comparable to the state-of-the-art finite element method (FEM) of Arbree et al. but at substantially higher speeds. This accuracy and efficiency is obtained by computing the discrete divergences of the diffusion equation and constructing the DE matrix using analytic formulas derived for linear finite elements. The second contribution is a multiresolution algorithm to significantly accelerate our DE solver while adapting to the frequent changes in topological structure of dynamic objects. The entire multiresolution DE solver is highly parallel and easily implemented on the GPU. We believe TransCut provides a novel visual effect for heterogeneous translucent objects undergoing fracturing and cutting operations.
Blanken, Tessa F; Deserno, Marie K; Dalege, Jonas; Borsboom, Denny; Blanken, Peter; Kerkhof, Gerard A; Cramer, Angélique O J
2018-04-11
Network theory, as a theoretical and methodological framework, is energizing many research fields, among which clinical psychology and psychiatry. Fundamental to the network theory of psychopathology is the role of specific symptoms and their interactions. Current statistical tools, however, fail to fully capture this constitutional property. We propose community detection tools as a means to evaluate the complex network structure of psychopathology, free from its original boundaries of distinct disorders. Unique to this approach is that symptoms can belong to multiple communities. Using a large community sample and spanning a broad range of symptoms (Symptom Checklist-90-Revised), we identified 18 communities of interconnected symptoms. The differential role of symptoms within and between communities offers a framework to study the clinical concepts of comorbidity, heterogeneity and hallmark symptoms. Symptoms with many and strong connections within a community, defined as stabilizing symptoms, could be thought of as the core of a community, whereas symptoms that belong to multiple communities, defined as communicating symptoms, facilitate the communication between problem areas. We propose that defining symptoms on their stabilizing and/or communicating role within and across communities accelerates our understanding of these clinical phenomena, central to research and treatment of psychopathology.
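The key feature of the approach above, symptoms belonging to multiple communities, can be illustrated with a toy clique-percolation-style routine (k = 3). The symptom names and edges below are invented for illustration, and the merging rule is a simplification of overlapping community detection, not the authors' actual method.

```python
from itertools import combinations

def triangle_communities(edges):
    """Toy overlapping-community detection in the spirit of clique
    percolation (k = 3): triangles sharing an edge are merged into one
    community, and a node may appear in several communities."""
    edge_set = {frozenset(e) for e in edges}
    nodes = {n for e in edges for n in e}
    triangles = [t for t in combinations(sorted(nodes), 3)
                 if all(frozenset(p) in edge_set for p in combinations(t, 2))]
    comms = [set(t) for t in triangles]
    merged = True
    while merged:                      # merge triangles sharing an edge
        merged = False
        for i, j in combinations(range(len(comms)), 2):
            if len(comms[i] & comms[j]) >= 2:
                comms[i] |= comms.pop(j)
                merged = True
                break
    return comms

# Two symptom triangles joined only through "panic": it belongs to both
# communities, playing the "communicating symptom" role described above.
edges = [("worry", "fear"), ("fear", "panic"), ("worry", "panic"),
         ("panic", "insomnia"), ("insomnia", "fatigue"), ("fatigue", "panic")]
comms = triangle_communities(edges)
print(len(comms))                      # 2 overlapping communities
```

Here "worry" and "fear" act as stabilizing symptoms (connections only within one community), while "panic" communicates between the two problem areas.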
Multitemporal spatial pattern analysis of Tulum's tropical coastal landscape
NASA Astrophysics Data System (ADS)
Ramírez-Forero, Sandra Carolina; López-Caloca, Alejandra; Silván-Cárdenas, José Luis
2011-11-01
The tropical coastal landscape of Tulum in Quintana Roo, Mexico has high ecological, economic, social, and cultural value, providing environmental and tourism services at the global, national, regional, and local levels. The landscape of the area is heterogeneous and presents random fragmentation patterns. In recent years, tourist services in the region have increased, promoting an accelerated expansion of hotel, transportation, and recreation infrastructure that alters the complex landscape. It is important to understand the environmental dynamics through temporal changes in spatial patterns and to propose better management of this ecological area to the authorities. This paper addresses a multi-temporal analysis of land cover changes from 1993 to 2000 in Tulum using Thematic Mapper data acquired by Landsat-5. Two independent methodologies were applied to analyze changes in the landscape and to define fragmentation patterns. First, an Iteratively Reweighted Multivariate Alteration Detection (IR-MAD) algorithm was used to detect and localize land cover change/no-change areas. Second, post-classification change detection was evaluated using the Support Vector Machine (SVM) algorithm. Landscape metrics were calculated from the results of IR-MAD and SVM. The analysis of the metrics indicated, among other things, a higher fragmentation pattern along roadways.
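Post-classification comparison, the second methodology mentioned, reduces at its core to tallying per-pixel class transitions between the two dates; a minimal sketch with invented class labels (not the study's actual classes or data):

```python
from collections import Counter

def transition_matrix(before, after):
    """Post-classification change detection sketch: compare per-pixel
    class labels from two dates and tally from->to transition counts.
    Off-diagonal entries are changed pixels (e.g. forest -> built-up)."""
    assert len(before) == len(after)
    return Counter(zip(before, after))

# Hypothetical 1-D strip of classified pixels for 1993 and 2000.
t1993 = ["forest", "forest", "mangrove", "forest", "sand"]
t2000 = ["forest", "builtup", "mangrove", "builtup", "sand"]
changes = transition_matrix(t1993, t2000)
changed = sum(n for (a, b), n in changes.items() if a != b)
print(changed)  # 2 pixels converted, both forest -> builtup
```

Landscape metrics such as patch counts and fragmentation indices are then computed over the resulting change map rather than over raw pixel values.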
Consistent global structures of complex RNA states through multidimensional chemical mapping
Cheng, Clarence Yu; Chou, Fang-Chieh; Kladwang, Wipapat; Tian, Siqi; Cordero, Pablo; Das, Rhiju
2015-01-01
Accelerating discoveries of non-coding RNA (ncRNA) in myriad biological processes pose major challenges to structural and functional analysis. Despite progress in secondary structure modeling, high-throughput methods have generally failed to determine ncRNA tertiary structures, even at the 1-nm resolution that enables visualization of how helices and functional motifs are positioned in three dimensions. We report that integrating a new method called MOHCA-seq (Multiplexed •OH Cleavage Analysis with paired-end sequencing) with mutate-and-map secondary structure inference guides Rosetta 3D modeling to consistent 1-nm accuracy for intricately folded ncRNAs with lengths up to 188 nucleotides, including a blind RNA-puzzle challenge, the lariat-capping ribozyme. This multidimensional chemical mapping (MCM) pipeline resolves unexpected tertiary proximities for cyclic-di-GMP, glycine, and adenosylcobalamin riboswitch aptamers without their ligands and a loose structure for the recently discovered human HoxA9D internal ribosome entry site regulon. MCM offers a sequencing-based route to uncovering ncRNA 3D structure, applicable to functionally important but potentially heterogeneous states. DOI: http://dx.doi.org/10.7554/eLife.07600.001 PMID:26035425
The application of phase contrast X-ray techniques for imaging Li-ion battery electrodes
NASA Astrophysics Data System (ADS)
Eastwood, D. S.; Bradley, R. S.; Tariq, F.; Cooper, S. J.; Taiwo, O. O.; Gelb, J.; Merkle, A.; Brett, D. J. L.; Brandon, N. P.; Withers, P. J.; Lee, P. D.; Shearing, P. R.
2014-04-01
In order to accelerate the commercialization of fuel cells and batteries across a range of applications, an understanding of the mechanisms by which they age and degrade at the microstructural level is required. Here, the most widely commercialized Li-ion batteries, based on porous graphite electrodes that de/intercalate Li+ ions during charge/discharge, are studied by two phase-contrast-enhanced X-ray imaging modes, namely in-line phase contrast and Zernike phase contrast, at the micro (synchrotron) and nano (laboratory X-ray microscope) scales, respectively. The rate of charge cycling depends directly on the nature of the electrode microstructure, which is typically a complex multi-scale 3D geometry with significant microstructural heterogeneities. We have been able to characterise the porosity and tortuosity by micro-CT, as well as the morphology of 5 individual graphite particles by nano-tomography, finding that while their volumes varied significantly, their sphericities were surprisingly similar. The volume-specific surface areas of the individual grains measured by nano-CT are significantly larger than the total volume-specific surface area of the electrode from the micro-CT imaging, which can be attributed to the greater particle surface area visible at higher resolution.
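The particle sphericity compared above can be computed from a particle's volume and surface area; a small sketch using Wadell's standard definition (the radius and cube side below are arbitrary examples, not values from the study):

```python
import math

def sphericity(volume, surface_area):
    """Sphericity of a particle: surface area of a sphere with the
    same volume, divided by the particle's actual surface area
    (Wadell's definition); equals 1 for a perfect sphere."""
    return math.pi ** (1 / 3) * (6 * volume) ** (2 / 3) / surface_area

# A sphere of radius r scores 1; a cube scores ~0.806.
r = 3.0
print(sphericity(4 / 3 * math.pi * r**3, 4 * math.pi * r**2))  # ≈ 1.0
a = 2.0
print(round(sphericity(a**3, 6 * a**2), 3))                    # ≈ 0.806
```

In tomography workflows the volume and surface area would come from the segmented voxel or mesh representation of each grain, which is where the resolution dependence noted in the abstract enters.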
Complex dynamics and empirical evidence (Invited Paper)
NASA Astrophysics Data System (ADS)
Delli Gatti, Domenico; Gaffeo, Edoardo; Giulioni, Gianfranco; Gallegati, Mauro; Kirman, Alan; Palestrini, Antonio; Russo, Alberto
2005-05-01
Standard macroeconomics, based on a reductionist approach centered on the representative agent, is badly equipped to explain the empirical evidence where heterogeneity and industrial dynamics are the rule. In this paper we show that a simple agent-based model of heterogeneous financially fragile agents is able to replicate a large number of scaling type stylized facts with a remarkable degree of statistical precision.
Dong, Yi; Goubert, Guillaume; Groves, Michael N; Lemay, Jean-Christian; Hammer, Bjørk; McBreen, Peter H
2017-05-16
The modification of heterogeneous catalysts through the chemisorption of chiral molecules is a method to create catalytic sites for enantioselective surface reactions. The chiral molecule is called a chiral modifier by analogy to the terms chiral auxiliary or chiral ligand used in homogeneous asymmetric catalysis. While there has been progress in understanding how chirality transfer occurs, the intrinsic difficulties in determining enantioselective reaction mechanisms are compounded by the multisite nature of heterogeneous catalysts and by the challenges facing stereospecific surface analysis. However, molecular descriptions have now emerged that are sufficiently detailed to herald rapid advances in the area. The driving force for the development of heterogeneous enantioselective catalysts stems, at the minimum, from the practical advantages they might offer over their homogeneous counterparts in terms of process scalability and catalyst reusability. The broader rewards from their study lie in the insights gained on factors controlling selectivity in heterogeneous catalysis. Reactions on surfaces to produce a desired enantiomer in high excess are particularly challenging since at room temperature, barrier differences as low as ∼2 kcal/mol between pathways to R and S products are sufficient to yield an enantiomeric ratio (er) of 90:10. Such small energy differences are comparable to weak interadsorbate interaction energies and are much smaller than chemisorption or even most physisorption energies. In this Account, we describe combined experimental and theoretical surface studies of individual diastereomeric complexes formed between chiral modifiers and prochiral reactants on the Pt(111) surface. Our work is inspired by the catalysis literature on the enantioselective hydrogenation of activated ketones on cinchona-modified Pt catalysts. 
Using scanning tunneling microscopy (STM) measurements and density functional theory (DFT) calculations, we probe the structures and relative abundances of non-covalently bonded complexes formed between three representative prochiral molecules and (R)-(+)-1-(1-naphthyl)ethylamine ((R)-NEA). All three prochiral molecules, 2,2,2-trifluoroacetophenone (TFAP), ketopantolactone (KPL), and methyl 3,3,3-trifluoropyruvate (MTFP), are found to form multiple complexation configurations around the ethylamine group of chemisorbed (R)-NEA. The principal intermolecular interaction is NH···O H-bonding. In each case, submolecularly resolved STM images permit the determination of the prochiral ratio (pr), pro-R to pro-S, proper to specific locations around the ethylamine group. The overall pr observed in experiments on large ensembles of KPL-(R)-NEA complexes is close to the er reported in the literature for the hydrogenation of KPL to pantolactone on (R)-NEA-modified Pt catalysts at 1 bar H2. The results of independent DFT and STM studies are merged to determine the geometries of the most abundant complexation configurations. The structures reveal the hierarchy of chemisorption and sometimes multiple H-bonding interactions operating in complexes. In particular, privileged complexes formed by KPL and MTFP reveal the participation of secondary CH···O interactions in stereocontrol. State-specific STM measurements on individual TFAP-(R)-NEA complexes show that complexation states interconvert through processes including prochiral inversion. The state-specific information on structure, prochirality, dynamics, and energy barriers delivered by the combination of DFT and STM provides insight on how to design better chiral modifiers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hohn, M.E.; Patchen, D.G.; Heald, M.
Non-uniform composition and permeability of a reservoir, commonly referred to as reservoir heterogeneity, is recognized as a major factor in the efficient recovery of oil during primary production and enhanced recovery operations. Heterogeneities are present at various scales and are caused by various factors, including folding and faulting, fractures, diagenesis, and depositional environments. Thus, a reservoir consists of a complex flow system, or series of flow systems, dependent on lithology, sandstone genesis, and structural and thermal history. Ultimately, however, fundamental flow units are controlled by the distribution and type of depositional environments. Reservoir heterogeneity is difficult to measure and predict, especially in more complex reservoirs such as fluvial-deltaic sandstones. The Appalachian Oil and Natural Gas Research Consortium (AONGRC), a partnership of the Appalachian basin state geological surveys of Kentucky, Ohio, Pennsylvania, and West Virginia, and West Virginia University, studied the Lower Mississippian Big Injun sandstone in West Virginia. The Big Injun research was multidisciplinary and designed to measure and map heterogeneity in existing fields and undrilled areas. The main goal was to develop an understanding of the reservoir sufficient to predict, in a given reservoir, optimum drilling locations versus high-risk locations for infill, outpost, or deeper-pool tests.
Automation of multi-agent control for complex dynamic systems in heterogeneous computational network
NASA Astrophysics Data System (ADS)
Oparin, Gennady; Feoktistov, Alexander; Bogdanova, Vera; Sidorov, Ivan
2017-01-01
The rapid progress of high-performance computing entails new challenges related to solving large scientific problems for various subject domains in a heterogeneous distributed computing environment (e.g., a network, Grid system, or Cloud infrastructure). Specialists in the field of parallel and distributed computing pay special attention to the scalability of applications for problem solving. Effective management of a scalable application in a heterogeneous distributed computing environment is still a non-trivial issue, and control systems that operate in networks are especially affected by it. We propose a new approach to multi-agent management of scalable applications in a heterogeneous computational network. The fundamentals of our approach are the integrated use of conceptual programming, simulation modeling, network monitoring, multi-agent management, and service-oriented programming. We developed a special framework for automating problem solving. The advantages of the proposed approach are demonstrated on the example of parametric synthesis of a static linear regulator for complex dynamic systems. Benefits of the scalable application for solving this problem include automation of multi-agent control for the systems in parallel mode with various degrees of detail.
Gea, Guerriero; Kjell, Sergeant; Jean-François, Hausman
2013-01-01
Lignin and cellulose represent the two main components of plant secondary walls and the most abundant polymers on Earth. Quantitatively one of the principal products of the phenylpropanoid pathway, lignin confers high mechanical strength and hydrophobicity to plant walls, thus enabling erect growth and high-pressure water transport in the vessels. Lignin is characterized by a high natural heterogeneity in its composition and abundance in plant secondary cell walls, even in the different tissues of the same plant. A typical example is the stem of fibre crops, which shows a lignified core enveloped by a cellulosic, lignin-poor cortex. Despite the great value of fibre crops for humanity, however, still little is known on the mechanisms controlling their cell wall biogenesis, and particularly, what regulates their spatially-defined lignification pattern. Given the chemical complexity and the heterogeneous composition of fibre crops’ secondary walls, only the use of multidisciplinary approaches can convey an integrated picture and provide exhaustive information covering different levels of biological complexity. The present review highlights the importance of combining high throughput -omics approaches to get a complete understanding of the factors regulating the lignification heterogeneity typical of fibre crops. PMID:23708098
A feasibility study of a hypersonic real-gas facility
NASA Technical Reports Server (NTRS)
Gully, J. H.; Driga, M. D.; Weldon, W. F.
1987-01-01
A four month feasibility study of a hypersonic real-gas free flight test facility for NASA Langley Research Center (LARC) was performed. The feasibility of using a high-energy electromagnetic launcher (EML) to accelerate complex models (lifting and nonlifting) in the hypersonic, real-gas facility was examined. Issues addressed include: design and performance of the accelerator; design and performance of the power supply; design and operation of the sabot and payload during acceleration and separation; effects of high current, magnetic fields, temperature, and stress on the sabot and payload; and survivability of payload instrumentation during acceleration, flight, and soft catch.
Overview of Accelerators with Potential Use in Homeland Security
NASA Astrophysics Data System (ADS)
Garnett, Robert W.
Quite a broad range of accelerators have been applied to solving many of the challenging problems related to homeland security and defense. These accelerator systems range from relatively small, simple, and compact, to large and complex, based on the specific application requirements. They have been used or proposed as sources of primary and secondary probe beams for applications such as radiography and to induce specific reactions that are key signatures for detecting conventional explosives or fissile material. A brief overview and description of these accelerator systems, their specifications, and application will be presented. Some recent technology trends will also be discussed.
Banerjee, Rupa; Jayaraj, Gopal Gunanathan; Peter, Joshua Jebakumar; Kumar, Vignesh; Mapa, Koyeli
2016-08-01
DnaK or Hsp70 of Escherichia coli is a master regulator of the bacterial proteostasis network. Allosteric communication between the two functional domains of DnaK, the N-terminal nucleotide-binding domain (NBD) and the C-terminal substrate- or peptide-binding domain (SBD) regulate its activity. X-ray crystallography and NMR studies have provided snapshots of distinct conformations of Hsp70 proteins in various physiological states; however, the conformational heterogeneity and dynamics of allostery-driven Hsp70 activity remains underexplored. In this work, we employed single-molecule Förster resonance energy transfer (sm-FRET) measurements to capture distinct intradomain conformational states of a region within the DnaK-SBD known as the lid. Our data conclusively demonstrate prominent conformational heterogeneity of the DnaK lid in ADP-bound states; in contrast, the ATP-bound open conformations are homogeneous. Interestingly, a nonhydrolysable ATP analogue, AMP-PNP, imparts heterogeneity to the lid conformations mimicking the ADP-bound state. The cochaperone DnaJ confers ADP-like heterogeneous lid conformations to DnaK, although the presence of the cochaperone accelerates the substrate-binding rate by a hitherto unknown mechanism. Irrespective of the presence of DnaJ, binding of a peptide substrate to the DnaK-SBD leads to prominent lid closure. Lid closure is only partial upon binding to molten globule-like authentic cellular substrates, probably to accommodate non-native substrate proteins of varied structures. © 2016 Federation of European Biochemical Societies.
Alexander, Helen K.; Mayer, Stephanie I.; Bonhoeffer, Sebastian
2017-01-01
Abstract Mutation rate is a crucial evolutionary parameter that has typically been treated as a constant in population genetic analyses. However, the propensity to mutate is likely to vary among co-existing individuals within a population, due to genetic polymorphisms, heterogeneous environmental influences, and random physiological fluctuations. We review the evidence for mutation rate heterogeneity and explore its consequences by extending classic population genetic models to allow an arbitrary distribution of mutation rate among individuals, either with or without inheritance. With this general new framework, we rigorously establish the effects of heterogeneity at various evolutionary timescales. In a single generation, variation of mutation rate about the mean increases the probability of producing zero or many simultaneous mutations on a genome. Over multiple generations of mutation and selection, heterogeneity accelerates the appearance of both deleterious and beneficial multi-point mutants. At mutation-selection balance, higher-order mutant frequencies are likewise boosted, while lower-order mutants exhibit subtler effects; nonetheless, population mean fitness is always enhanced. We quantify the dependencies on moments of the mutation rate distribution and selection coefficients, and clarify the role of mutation rate inheritance. While typical methods of estimating mutation rate will recover only the population mean, analyses assuming mutation rate is fixed to this mean could underestimate the potential for multi-locus adaptation, including medically relevant evolution in pathogenic and cancerous populations. We discuss the potential to empirically parameterize mutation rate distributions, which have to date hardly been quantified. PMID:27836985
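The single-generation claim above, that variance in mutation rate inflates the probability of both zero and many simultaneous mutations, can be checked directly with a Poisson mixture; the two-point rate distribution below is an arbitrary illustration, not a distribution from the paper.

```python
import math

def poisson_pmf(k, lam):
    """P(K = k) for a Poisson-distributed mutation count with rate lam."""
    return math.exp(-lam) * lam**k / math.factorial(k)

def mixture_pmf(k, rates_and_weights):
    """Mutation count distribution when the per-generation rate varies
    among individuals: a weighted mixture of Poissons."""
    return sum(w * poisson_pmf(k, lam) for lam, w in rates_and_weights)

# Same mean rate (1.0), but heterogeneous: half at 0.5, half at 1.5.
hetero = [(0.5, 0.5), (1.5, 0.5)]
p0_fixed, p0_het = poisson_pmf(0, 1.0), mixture_pmf(0, hetero)
p3_fixed, p3_het = poisson_pmf(3, 1.0), mixture_pmf(3, hetero)
# Heterogeneity inflates both tails relative to the fixed-rate model:
# more zero-mutation genomes AND more multi-mutation genomes.
print(p0_het > p0_fixed, p3_het > p3_fixed)  # True True
```

This is Jensen's inequality at work: P(0) = e^(-λ) is convex in λ, so averaging over a spread of rates raises it above the fixed-rate value, and the upper tail behaves similarly.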
Single-cell sequencing reveals karyotype heterogeneity in murine and human malignancies.
Bakker, Bjorn; Taudt, Aaron; Belderbos, Mirjam E; Porubsky, David; Spierings, Diana C J; de Jong, Tristan V; Halsema, Nancy; Kazemier, Hinke G; Hoekstra-Wakker, Karina; Bradley, Allan; de Bont, Eveline S J M; van den Berg, Anke; Guryev, Victor; Lansdorp, Peter M; Colomé-Tatché, Maria; Foijer, Floris
2016-05-31
Chromosome instability leads to aneuploidy, a state in which cells have abnormal numbers of chromosomes, and is found in two out of three cancers. In a chromosomally unstable, p53-deficient mouse model with accelerated lymphomagenesis, we previously observed whole-chromosome copy number changes affecting all lymphoma cells. This suggests that chromosome instability is somehow suppressed in the aneuploid lymphomas or that selection for frequently lost/gained chromosomes out-competes the CIN-imposed mis-segregation. To distinguish between these explanations and to examine karyotype dynamics in chromosomally unstable lymphoma, we use a newly developed single-cell whole-genome sequencing (scWGS) platform that provides a complete and unbiased overview of copy number variations (CNV) in individual cells. To analyse these scWGS data, we develop AneuFinder, which allows annotation of copy number changes in a fully automated fashion and quantification of CNV heterogeneity between cells. Single-cell sequencing and AneuFinder analysis reveal high levels of copy number heterogeneity in chromosome-instability-driven murine T-cell lymphoma samples, indicating ongoing chromosome instability. Application of this technology to human B-cell leukaemias reveals different levels of karyotype heterogeneity in these cancers. Our data show that even though aneuploid tumours select for particular and recurring chromosome combinations, single-cell analysis using AneuFinder reveals copy number heterogeneity, suggesting ongoing chromosome instability that other platforms fail to detect. As chromosome instability might drive tumour evolution, karyotype analysis using single-cell sequencing technology could become an essential tool for cancer treatment stratification.
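One simple way to quantify karyotype heterogeneity between cells, in the spirit of (but not identical to) AneuFinder's scoring, is the mean per-bin entropy of copy-number calls across cells; the toy copy-number matrices below are invented for illustration.

```python
import math

def bin_entropy(states):
    """Shannon entropy of the copy-number states observed across cells
    for one genomic bin (0 when all cells agree)."""
    n = len(states)
    counts = {}
    for s in states:
        counts[s] = counts.get(s, 0) + 1
    if len(counts) == 1:
        return 0.0
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def heterogeneity_score(cells):
    """Toy per-sample heterogeneity: mean per-bin entropy of
    copy-number calls across cells (not AneuFinder's exact metric)."""
    bins = list(zip(*cells))
    return sum(bin_entropy(b) for b in bins) / len(bins)

# Rows = cells, columns = genomic bins with integer copy-number calls.
stable   = [[2, 2, 3], [2, 2, 3], [2, 2, 3]]   # clonal aneuploidy
unstable = [[2, 1, 3], [2, 3, 2], [1, 2, 3]]   # ongoing instability
print(heterogeneity_score(stable))             # 0.0
print(heterogeneity_score(unstable) > 0)       # True
```

The stable sample is aneuploid (a gained chromosome in bin 3) yet scores zero because every cell carries the same karyotype, which is exactly the distinction between clonal aneuploidy and ongoing instability that single-cell analysis makes visible.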
Wei, Ran; Yan, Yue-Hong; Harris, AJ; Kang, Jong-Soo; Shen, Hui; Zhang, Xian-Chun
2017-01-01
Abstract The eupolypods II ferns represent a classic case of evolutionary radiation and, simultaneously, exhibit high substitution rate heterogeneity. These factors have been proposed to contribute to the contentious resolutions among clades within this fern group in multilocus phylogenetic studies. We investigated the deep phylogenetic relationships of eupolypod II ferns by sampling all major families and using 40 plastid genomes, or plastomes, of which 33 were newly sequenced with next-generation sequencing technology. We performed model-based analyses to evaluate the diversity of molecular evolutionary rates for these ferns. Our plastome data, with more than 26,000 informative characters, yielded good resolution for deep relationships within eupolypods II and unambiguously clarified the position of Rhachidosoraceae and the monophyly of Athyriaceae. Results of rate heterogeneity analysis revealed approximately 33 significant rate shifts in eupolypod II ferns, with the most heterogeneous rates (both accelerations and decelerations) occurring in two phylogenetically difficult lineages, that is, the Rhachidosoraceae–Aspleniaceae and Athyriaceae clades. These observations support the hypothesis that rate heterogeneity has previously constrained the deep phylogenetic resolution in eupolypods II. According to the plastome data, we propose that 14 chloroplast markers are particularly phylogenetically informative for eupolypods II both at the familial and generic levels. Our study demonstrates the power of a character-rich plastome data set and high-throughput sequencing for resolving the recalcitrant lineages, which have undergone rapid evolutionary radiation and dramatic changes in substitution rates. PMID:28854625
Arenillas, Ana; Rubiera, Fernando; Pis, José J
2002-12-15
Nitrogen oxides are one of the major environmental problems arising from fossil fuel combustion. Coal char is relatively rich in nitrogen, and so it is an important source of nitrogen oxides during coal combustion. However, due to its carbonaceous nature, char can also reduce NO through heterogeneous reduction. The objectives of this work were, on the one hand, to compare NO emissions from coal combustion in two different types of equipment and, on the other hand, to study the influence of char surface chemistry on NO reduction. A series of combustion tests were carried out in two devices of different scale: a thermogravimetric analyzer coupled to a mass spectrometer and an FTIR (TG-MS-FTIR), and a fluidized bed reactor with an on-line battery of analyzers. The TG-MS-FTIR system was also used to perform a specific study of NO heterogeneous reduction reactions using chars with different surface chemistry. According to the results obtained, the TG-MS-FTIR system provides valuable information about NO heterogeneous reduction, and it can indicate trends in the behavior of other combustion equipment (e.g., fluidized bed combustors). It was also found that NO-char interaction depends to a large extent on temperature. In the low-temperature range (<800 degrees C), NO heterogeneous reduction seems to be controlled by the evolution of surface complexes. In the high-temperature range (>800 degrees C), a different mechanism is involved in NO heterogeneous reduction, with the nature of the carbon matrix being a key factor.
Tumor Heterogeneity in Breast Cancer
Turashvili, Gulisa; Brogi, Edi
2017-01-01
Breast cancer is a heterogeneous disease and differs greatly among different patients (intertumor heterogeneity) and even within each individual tumor (intratumor heterogeneity). Clinical and morphologic intertumor heterogeneity is reflected by staging systems and histopathologic classification of breast cancer. Heterogeneity in the expression of established prognostic and predictive biomarkers, hormone receptors, and human epidermal growth factor receptor 2 oncoprotein is the basis for targeted treatment. Molecular classifications are indicators of genetic tumor heterogeneity, which is probed with multigene assays and can lead to improved stratification into low- and high-risk groups for personalized therapy. Intratumor heterogeneity occurs at the morphologic, genomic, transcriptomic, and proteomic levels, creating diagnostic and therapeutic challenges. Understanding the molecular and cellular mechanisms of tumor heterogeneity that are relevant to the development of treatment resistance is a major area of research. Despite the improved knowledge of the complex genetic and phenotypic features underpinning tumor heterogeneity, there has been only limited advancement in diagnostic, prognostic, or predictive strategies for breast cancer. The current guidelines for reporting of biomarkers aim to maximize patient eligibility for targeted therapy, but do not take into account intratumor heterogeneity. The molecular classification of breast cancer is not implemented in routine clinical practice. Additional studies and in-depth analysis are required to understand the clinical significance of rapidly accumulating data. This review highlights inter- and intratumor heterogeneity of breast carcinoma with special emphasis on pathologic findings, and provides insights into the clinical significance of molecular and cellular mechanisms of heterogeneity. PMID:29276709
CMSA: a heterogeneous CPU/GPU computing system for multiple similar RNA/DNA sequence alignment.
Chen, Xi; Wang, Chen; Tang, Shanjiang; Yu, Ce; Zou, Quan
2017-06-24
Multiple sequence alignment (MSA) is a classic and powerful technique for sequence analysis in bioinformatics. With the rapid growth of biological datasets, MSA parallelization becomes necessary to keep its running time at an acceptable level. Although there is a large body of work on MSA problems, existing approaches are either insufficient or contain implicit assumptions that limit their generality. First, the characteristics of users' sequences, including the size of the dataset and the lengths of the sequences, can take arbitrary values and are generally unknown before submission, which previous work unfortunately ignores. Second, the center star strategy is well suited to aligning similar sequences, but its first stage, center sequence selection, is highly time-consuming and requires further optimization. Moreover, on heterogeneous CPU/GPU platforms, prior studies consider MSA parallelization on GPU devices only, leaving the CPUs idle during the computation. Co-run computation, in contrast, can maximize the utilization of the computing resources by running the workload on both CPU and GPU simultaneously. This paper presents CMSA, a robust and efficient MSA system for large-scale datasets on the heterogeneous CPU/GPU platform. It performs and optimizes multiple sequence alignment automatically for users' submitted sequences without any assumptions. CMSA adopts the co-run computation model so that both CPU and GPU devices are fully utilized. Moreover, CMSA proposes an improved center star strategy that reduces the time complexity of its center sequence selection process from O(mn²) to O(mn). The experimental results show that CMSA achieves up to an 11× speedup and outperforms the state-of-the-art software. CMSA focuses on multiple similar RNA/DNA sequence alignment and proposes a novel bitmap-based algorithm to improve the center star strategy.
We conclude that harvesting the high performance of modern GPUs is a promising approach to accelerating multiple sequence alignment, and that adopting the co-run computation model can significantly improve overall system utilization. The source code is available at https://github.com/wangvsa/CMSA .
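The complexity reduction in the center-selection step can be illustrated with a toy sketch. CMSA's actual implementation uses a bitmap-based algorithm on CPU and GPU; the k-mer counting approach below (all function names and parameters are hypothetical, not CMSA's API) only shows how a center sequence can be chosen in a single O(mn) pass over m sequences of length n, instead of O(mn²) all-pairs alignment scoring:

```python
from collections import Counter

def select_center(seqs, k=7):
    """Pick a center sequence by shared k-mer counts.

    Illustrative stand-in for an optimized center-star selection step:
    each sequence is scored in one O(n) pass against a global k-mer
    table, giving O(m*n) overall rather than O(m*n^2) pairwise scoring.
    """
    table = Counter()  # k-mer -> occurrences across all sequences
    for s in seqs:
        table.update(s[i:i + k] for i in range(len(s) - k + 1))

    def score(s):
        # k-mer matches contributed by the *other* sequences
        own = Counter(s[i:i + k] for i in range(len(s) - k + 1))
        return sum((table[km] - c) * c for km, c in own.items())

    return max(seqs, key=score)

seqs = ["ACGTACGTACGT", "ACGTACGAACGT", "TTGTACGTACGA"]
center = select_center(seqs, k=4)
```

The chosen center is then pairwise-aligned against every other sequence, which is the stage the paper distributes across CPU and GPU.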
Kadlecek, Stephen; Hamedani, Hooman; Xu, Yinan; Emami, Kiarash; Xin, Yi; Ishii, Masaru; Rizi, Rahim
2013-10-01
Alveolar oxygen tension (Pao2) is sensitive to the interplay between local ventilation, perfusion, and alveolar-capillary membrane permeability, and thus reflects physiologic heterogeneity of healthy and diseased lung function. Several hyperpolarized helium ((3)He) magnetic resonance imaging (MRI)-based Pao2 mapping techniques have been reported, and considerable effort has gone toward reducing Pao2 measurement error. We present a new Pao2 imaging scheme, using parallel accelerated MRI, which significantly reduces measurement error. The proposed Pao2 mapping scheme was computer-simulated and was tested on both phantoms and five human subjects. Where possible, correspondence between actual local oxygen concentration and derived values was assessed for both bias (deviation from the true mean) and imaging artifact (deviation from the true spatial distribution). Phantom experiments demonstrated a significantly reduced coefficient of variation using the accelerated scheme. Simulation results support this observation and predict that correspondence between the true spatial distribution and the derived map is always superior using the accelerated scheme, although the improvement becomes less significant as the signal-to-noise ratio increases. Paired measurements in the human subjects, comparing accelerated and fully sampled schemes, show a reduced Pao2 distribution width for 41 of 46 slices. In contrast to proton MRI, acceleration of hyperpolarized imaging has no signal-to-noise penalty; its use in Pao2 measurement is therefore always beneficial. Comparison of multiple schemes shows that the benefit arises from a longer time-base during which oxygen-induced depolarization modifies the signal strength. Demonstration of the accelerated technique in human studies shows the feasibility of the method and suggests that measurement error is reduced here as well, particularly at low signal-to-noise levels. Copyright © 2013 AUR. Published by Elsevier Inc. All rights reserved.
Electrostatic Steering Accelerates C3d:CR2 Association.
Mohan, Rohith R; Huber, Gary A; Morikis, Dimitrios
2016-08-25
Electrostatic effects are ubiquitous in protein interactions and are found to be pervasive in the complement system as well. The interaction between complement fragment C3d and complement receptor 2 (CR2) has evolved to become a link between innate and adaptive immunity. Electrostatic interactions have been suggested to be the driving factor for the association of the C3d:CR2 complex. In this study, we investigate the effects of ionic strength and mutagenesis on the association of C3d:CR2 through Brownian dynamics simulations. We demonstrate that the formation of the C3d:CR2 complex is ionic strength-dependent, suggesting the presence of long-range electrostatic steering that accelerates the complex formation. Electrostatic steering occurs through the interaction of an acidic surface patch in C3d and the positively charged CR2 and is supported by the effects of mutations within the acidic patch of C3d that slow or diminish association. Our data are in agreement with previous experimental mutagenesis and binding studies and computational studies. Although the C3d acidic patch may be locally destabilizing because of unfavorable Coulombic interactions of like charges, it contributes to the acceleration of association. Therefore, acceleration of function through electrostatic steering takes precedence to stability. The site of interaction between C3d and CR2 has been the target for delivery of CR2-bound nanoparticle, antibody, and small molecule biomarkers, as well as potential therapeutics. A detailed knowledge of the physicochemical basis of C3d:CR2 association may be necessary to accelerate biomarker and drug discovery efforts.
Frontiers of beam diagnostics in plasma accelerators: Measuring the ultra-fast and ultra-cold
NASA Astrophysics Data System (ADS)
Cianchi, A.; Anania, M. P.; Bisesto, F.; Chiadroni, E.; Curcio, A.; Ferrario, M.; Giribono, A.; Marocchino, A.; Pompili, R.; Scifo, J.; Shpakov, V.; Vaccarezza, C.; Villa, F.; Mostacci, A.; Bacci, A.; Rossi, A. R.; Serafini, L.; Zigler, A.
2018-05-01
Advanced diagnostics are essential tools in the development of plasma-based accelerators. The accurate measurement of the quality of beams at the exit of the plasma channel is crucial to optimize the parameters of the plasma accelerator. 6D electron beam diagnostics will be reviewed with emphasis on emittance measurement, which is particularly complex due to large energy spread and divergence of the emerging beams, and on femtosecond bunch length measurements.
NASA Astrophysics Data System (ADS)
Xiao, Lan-Xi; Zhu, Yuan-Qing; Zhang, Shao-Quan; Liu, Xu; Guo, Yu
1999-11-01
In this paper, the crust medium is treated as a Maxwell medium, and the crust model includes a hard inclusion, a soft inclusion, and a deep-level fault. The stress concentration and its evolution with time are obtained using a three-dimensional finite element method and a differential method. The conclusions are drawn as follows: (1) The average stress concentration and maximum shear stress concentration caused by the heterogeneity of the crust are very high in the hard inclusion and around the deep fault. As time passes, the concentration of average stress in the model gradually becomes uniform, while the concentration of maximum shear stress in the hard inclusion gradually increases. This behavior favors the transfer of shear strain energy from the soft inclusion to the hard inclusion. (2) When the upper mantle beneath the inclusion upheaves at a velocity of 1 cm/a, the changes of average stress concentration with time become complex and the boundary between the hard and soft inclusions becomes indistinct, but the maximum shear stress concentration increases much more rapidly with time in the hard inclusion. This feature promotes the transfer of energy from the soft inclusion to the hard inclusion. (3) The changes of average stress concentration and maximum shear stress concentration with time around the deep-level fault result in further accumulation of maximum shear stress concentration and finally cause the deep-level fault to become unstable and to creep at an accelerated rate along the fault direction. (4) The change of vertical displacement on the surface of the model caused by the accelerated creep of the deep-level fault is similar to that of the observational data before the Xingtai strong earthquake.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weissmann, Gary S
2013-12-06
The objective of this project was to characterize the influence that naturally complex geologic media have on anomalous dispersion and to determine whether the nature of dispersion can be estimated from the underlying heterogeneous media. The UNM portion of this project was to provide detailed representations of aquifer heterogeneity by producing highly resolved models of outcrop analogs to aquifer materials. This project combined outcrop-scale heterogeneity characterization (conducted at the University of New Mexico), laboratory experiments (conducted at Sandia National Laboratory), and numerical simulations (conducted at Sandia National Laboratory and Colorado School of Mines). The study was designed to test whether established dispersion theory accurately predicts the behavior of solute transport through heterogeneous media and to investigate the relationship between heterogeneity and the parameters that populate these models. The dispersion theory tested by this work was based upon the fractional advection-dispersion equation (fADE) model. Unlike most dispersion studies that develop a solute transport model by fitting the solute transport breakthrough curve, this project explored the nature of the heterogeneous media to better understand the connection between the model parameters and the aquifer heterogeneity. We also evaluated methods for simulating the heterogeneity to see whether these approaches (e.g., geostatistical) could reasonably replicate realistic heterogeneity. The UNM portion of this study focused on capturing realistic geologic heterogeneity of aquifer analogs using advanced outcrop mapping methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paudel, M R; Beachey, D J; Sarfehnia, A
Purpose: A new commercial GPU-based Monte Carlo dose calculation algorithm (GPUMCD) developed by the vendor Elekta™ to be used in the Monaco Treatment Planning System (TPS) is capable of modeling dose both for a standard linear accelerator and for an Elekta MRI-linear accelerator (modeling magnetic field effects). We are evaluating this algorithm in two parts: commissioning the algorithm for an Elekta Agility linear accelerator (the focus of this work) and evaluating the algorithm's ability to model magnetic field effects for an MRI-linear accelerator. Methods: A beam model was developed in the Monaco TPS (v.5.09.06) using the commissioned beam data for a 6 MV Agility linac. A heterogeneous phantom representing tumor-in-lung, lung, bone-in-tissue, and prosthetic was designed and built. Dose calculations in Monaco were done using the current clinical algorithm (XVMC) and the new GPUMCD algorithm (1 mm³ voxel size, 0.5% statistical uncertainty) and in the Pinnacle TPS using the collapsed cone convolution (CCC) algorithm. These were compared with the doses measured using an ionization chamber (A1SL) and Gafchromic EBT3 films for 2×2 cm², 5×5 cm², and 10×10 cm² field sizes. Results: The calculated central axis percentage depth doses (PDDs) in homogeneous solid water were within 2% of measurements for XVMC and GPUMCD. For the tumor-in-lung and lung phantoms, doses calculated by all of the algorithms were within the experimental uncertainty of the measurements (±2% in the homogeneous phantom and ±3% for the tumor-in-lung or lung phantoms), except for the 2×2 cm² field size, where only the CCC algorithm differed from film by 5% in the lung region. The analysis for the bone-in-tissue and prosthetic phantoms is ongoing.
Conclusion: The new GPUMCD algorithm calculated dose comparable to both the XVMC algorithm and to measurements in both a homogeneous solid water medium and the heterogeneous phantom representing lung or tumor-in-lung for 2×2 cm² to 10×10 cm² field sizes. Funding support was obtained from Elekta.
JDiffraction: A GPGPU-accelerated JAVA library for numerical propagation of scalar wave fields
NASA Astrophysics Data System (ADS)
Piedrahita-Quintero, Pablo; Trujillo, Carlos; Garcia-Sucerquia, Jorge
2017-05-01
JDiffraction, a GPGPU-accelerated JAVA library for the numerical propagation of scalar wave fields, is presented. The angular spectrum, Fresnel transform, and Fresnel-Bluestein transform are the numerical algorithms implemented in the methods and functions of the library to compute the scalar propagation of a complex wavefield. The functionality of the library is tested by modeling easy-to-forecast numerical experiments and by the numerical reconstruction of a digitally recorded hologram. The performance of JDiffraction is contrasted with a C++ library, showing great competitiveness in the apparently less complex environment of the JAVA language. JDiffraction also includes easy-to-use JAVA methods and functions that take advantage of the computing power of graphics processing units to accelerate processing, reaching up to 74 frames per second on 2048×2048-pixel images.
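For readers unfamiliar with the first algorithm named above: angular spectrum propagation amounts to an FFT, a multiplication by a free-space transfer function, and an inverse FFT. The NumPy sketch below is not JDiffraction's JAVA API; the function name and parameters are illustrative only, assuming a square field sampled at pitch dx:

```python
import numpy as np

def angular_spectrum(field, wavelength, dx, z):
    """Propagate a complex scalar wavefield a distance z (angular spectrum method)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)                   # spatial frequencies (1/m)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2       # propagating if > 0
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))  # axial wavenumber
    H = np.exp(1j * kz * z) * (arg > 0)             # transfer function, evanescent cut
    return np.fft.ifft2(np.fft.fft2(field) * H)

# sanity check: a unit plane wave stays a plane wave (up to a global phase)
u0 = np.ones((256, 256), dtype=complex)
u1 = angular_spectrum(u0, wavelength=633e-9, dx=10e-6, z=0.01)
```

The GPU-accelerated versions differ mainly in where the FFTs run, not in this structure; the Fresnel variants replace H and add chirp factors.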
Complex interactions between diapirs and 4-D subduction driven mantle wedge circulation.
NASA Astrophysics Data System (ADS)
Sylvia, R. T.; Kincaid, C. R.
2015-12-01
Analogue laboratory experiments generate 4-D flow of mantle wedge fluid and capture the evolution of buoyant mesoscale diapirs. The mantle is modeled with viscous glucose syrup with an Arrhenius type temperature dependent viscosity. To characterize diapir evolution we experiment with a variety of fluids injected from multiple point sources. Diapirs interact with kinematically induced flow fields forced by subducting plate motions replicating a range of styles observed in dynamic subduction models (e.g., rollback, steepening, gaps). Data is collected using high definition timelapse photography and quantified using image velocimetry techniques. While many studies assume direct vertical connections between the volcanic arc and the deeper mantle source region, our experiments demonstrate the difficulty of creating near vertical conduits. Results highlight extreme curvature of diapir rise paths. Trench-normal deflection occurs as diapirs are advected downward away from the trench before ascending into wedge apex directed return flow. Trench parallel deflections up to 75% of trench length are seen in all cases, exacerbated by complex geometry and rollback motion. Interdiapir interaction is also important; upwellings with similar trajectory coalesce and rapidly accelerate. Moreover, we observe a new mode of interaction whereby recycled diapir material is drawn down along the slab surface and then initiates rapid fluid migration updip along the slab-wedge interface. Variability in trajectory and residence time leads to complex petrologic inferences. Material from disparate source regions can surface at the same location, mix in the wedge, or become fully entrained in creeping flow adding heterogeneity to the mantle. Active diapirism or any other vertical fluid flux mechanism employing rheological weakening lowers viscosity in the recycling mantle wedge affecting both solid and fluid flow characteristics. 
Many interesting and insightful results have been presented based upon 2-D, steady-state thermal and flow regimes. We reiterate the importance of 4-D time evolution in subduction models. Analogue experiments allow added feedbacks and complexity improving intuition and providing insight for further investigation.
Continuous Heterogeneous Photocatalysis in Serial Micro-Batch Reactors.
Pieber, Bartholomäus; Shalom, Menny; Antonietti, Markus; Seeberger, Peter H; Gilmore, Kerry
2018-01-29
Solid reagents, leaching catalysts, and heterogeneous photocatalysts are commonly employed in batch processes but are ill-suited for continuous-flow chemistry. Heterogeneous catalysts for thermal reactions are typically used in packed-bed reactors, which cannot be penetrated by light and thus are not suitable for photocatalytic reactions involving solids. We demonstrate that serial micro-batch reactors (SMBRs) allow for the continuous utilization of solid materials together with liquids and gases in flow. This technology was utilized to develop selective and efficient fluorination reactions using a modified graphitic carbon nitride heterogeneous catalyst instead of costly homogeneous metal polypyridyl complexes. The merger of this inexpensive, recyclable catalyst and the SMBR approach enables sustainable and scalable photocatalysis. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Ecological systems are generally considered among the most complex because they are characterized by a large number of diverse components, nonlinear interactions, scale multiplicity, and spatial heterogeneity. Hierarchy theory, as well as empirical evidence, suggests that comp...
Highly Efficient Catalytic Cyclic Carbonate Formation by Pyridyl Salicylimines.
Subramanian, Saravanan; Park, Joonho; Byun, Jeehye; Jung, Yousung; Yavuz, Cafer T
2018-03-21
Cyclic carbonates as industrial commodities offer a viable nonredox carbon dioxide fixation, and suitable heterogeneous catalysts are vital for their widespread implementation. Here, we report a highly efficient heterogeneous catalyst for CO 2 addition to epoxides based on a newly identified active catalytic pocket consisting of pyridine, imine, and phenol moieties. The polymeric, metal-free catalyst derived from this active site converts less-reactive styrene oxide under atmospheric pressure in quantitative yield and selectivity to the corresponding carbonate. The catalyst does not need additives, solvents, metals, or co-catalysts, can be reused at least 10 cycles without the loss of activity, and scaled up easily to a kilogram scale. Density functional theory calculations reveal that the nucleophilicity of pyridine base gets stronger due to the conjugated imines and H-bonding from phenol accelerates the reaction forward by stabilizing the intermediate.
Length scales and pinning of interfaces
Tan, Likun
2016-01-01
The pinning of interfaces and free discontinuities by defects and heterogeneities plays an important role in a variety of phenomena, including grain growth, martensitic phase transitions, ferroelectricity, dislocations and fracture. We explore the role of length scale on the pinning of interfaces and show that the width of the interface relative to the length scale of the heterogeneity can have a profound effect on the pinning behaviour, and ultimately on hysteresis. When the heterogeneity is large, the pinning is strong and can lead to stick–slip behaviour as predicted by various models in the literature. However, when the heterogeneity is small, we find that the interface may not be pinned in a significant manner. This shows that a potential route to making materials with low hysteresis is to introduce heterogeneities at a length scale that is small compared with the width of the phase boundary. Finally, the intermediate setting where the length scale of the heterogeneity is comparable to that of the interface width is characterized by complex interactions, thereby giving rise to a non-monotone relationship between the relative heterogeneity size and the critical depinning stress. PMID:27002068
Emmanuel, A; Chohda, E; Botfield, C; Ellul, J
2018-01-01
Introduction Short hospital stays and accelerated discharge within 72 hours following colorectal cancer resections have not been widely achieved. Series reporting on accelerated discharge involve heterogeneous patient populations and exclude important groups. Strict adherence to some discharge requirements may lead to delays in discharge. The aim of this study was to evaluate the safety and feasibility of accelerated discharge within 72 hours of all elective colorectal cancer resections using simple discharge criteria. Methods Elective colorectal cancer resections performed between August 2009 and December 2015 by a single surgeon were reviewed. Perioperative care was based on an enhanced recovery programme. A set of simplified discharge criteria were used. Outcomes including postoperative complications, readmissions and reoperations were compared between patients discharged within 72 hours and those with a longer postoperative stay. Results Overall, 256 colorectal cancer resections (90% laparoscopic) were performed. The mean patient age was 70.8 years. The median length of stay was 3 days. Fifty-eight per cent of all patients and sixty-three per cent of patients undergoing laparoscopic surgery were discharged within 72 hours. Accelerated discharge was not associated with adverse outcomes compared with delayed discharge. Patients discharged within 72 hours had significantly fewer postoperative complications, readmissions and reoperations. Open surgery and stoma formation were associated with discharge after 72 hours but not age, co-morbidities, neoadjuvant chemoradiation or surgical procedure. Conclusions Accelerated discharge within 72 hours of elective colorectal resection for cancer is safely achievable for the majority of patients without compromising short-term outcomes.
NASA Astrophysics Data System (ADS)
Doucet, R.; Olivares, M.; DeBlois, F.; Podgorsak, E. B.; Kawrakow, I.; Seuntjens, J.
2003-08-01
Calculations of dose distributions in heterogeneous phantoms in clinical electron beams, carried out using the fast voxel Monte Carlo (MC) system XVMC and the conventional MC code EGSnrc, were compared with measurements. Irradiations were performed using the 9 MeV and 15 MeV beams from a Varian Clinac-18 accelerator with a 10 × 10 cm² applicator and an SSD of 100 cm. Depth doses were measured with thermoluminescent dosimetry techniques (TLD 700) in phantoms consisting of slabs of Solid Water™ (SW) and bone and slabs of SW and lung tissue-equivalent materials. Lateral profiles in water were measured using an electron diode at different depths behind one and two immersed aluminium rods. The accelerator was modelled using the EGS4/BEAM system and optimized phase-space files were used as input to the EGSnrc and the XVMC calculations. Also, for the XVMC, an experiment-based beam model was used. All measurements were corrected by the EGSnrc-calculated stopping power ratios. Overall, there is excellent agreement between the corrected experimental and the two MC dose distributions. Small remaining discrepancies may be due to the non-equivalence between physical and simulated tissue-equivalent materials and to detector fluence perturbation effect correction factors that were calculated for the 9 MeV beam at selected depths in the heterogeneous phantoms.
Genetically engineered mouse models in oncology research and cancer medicine.
Kersten, Kelly; de Visser, Karin E; van Miltenburg, Martine H; Jonkers, Jos
2017-02-01
Genetically engineered mouse models (GEMMs) have contributed significantly to the field of cancer research. In contrast to cancer cell inoculation models, GEMMs develop de novo tumors in a natural immune-proficient microenvironment. Tumors arising in advanced GEMMs closely mimic the histopathological and molecular features of their human counterparts, display genetic heterogeneity, and are able to spontaneously progress toward metastatic disease. As such, GEMMs are generally superior to cancer cell inoculation models, which show no or limited heterogeneity and are often metastatic from the start. Given that GEMMs capture both tumor cell-intrinsic and cell-extrinsic factors that drive de novo tumor initiation and progression toward metastatic disease, these models are indispensable for preclinical research. GEMMs have successfully been used to validate candidate cancer genes and drug targets, assess therapy efficacy, dissect the impact of the tumor microenvironment, and evaluate mechanisms of drug resistance. In vivo validation of candidate cancer genes and therapeutic targets is further accelerated by recent advances in genetic engineering that enable fast-track generation and fine-tuning of GEMMs to more closely resemble human patients. In addition, aligning preclinical tumor intervention studies in advanced GEMMs with clinical studies in patients is expected to accelerate the development of novel therapeutic strategies and their translation into the clinic. © 2016 The Authors. Published under the terms of the CC BY 4.0 license.
BowMapCL: Burrows-Wheeler Mapping on Multiple Heterogeneous Accelerators.
Nogueira, David; Tomas, Pedro; Roma, Nuno
2016-01-01
The computational demand of exact-search procedures has pressed the exploitation of parallel processing accelerators to reduce the execution time of many applications. However, this often imposes strict restrictions in terms of problem size and implementation effort, mainly due to their possibly distinct architectures. To circumvent this limitation, a new exact-search alignment tool (BowMapCL) based on the Burrows-Wheeler Transform and FM-Index is presented. In contrast to other alternatives, BowMapCL is based on a unified implementation using OpenCL, allowing the exploitation of multiple and possibly different devices (e.g., NVIDIA, AMD/ATI, and Intel GPUs/APUs). Furthermore, to efficiently exploit such heterogeneous architectures, BowMapCL incorporates several techniques to promote its performance and scalability, including multiple buffering, work-queue task distribution, and dynamic load balancing, together with index partitioning, bit-encoding, and sampling. When compared with state-of-the-art tools, the attained results showed that BowMapCL (using a single GPU) is 2× to 7.5× faster than mainstream multi-threaded CPU BWT-based aligners, like Bowtie, BWA, and SOAP2, and up to 4× faster than the best-performing state-of-the-art GPU implementations (namely, SOAP3 and HPG-BWT). When multiple and completely distinct devices are considered, BowMapCL efficiently scales the offered throughput, ensuring a convenient balance of the processing load across the several distinct devices.
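The BWT/FM-Index exact search that BowMapCL parallelizes can be sketched in a few lines. The toy Python below is not BowMapCL's OpenCL code: rank queries are done by naive counting rather than with the sampled occurrence tables and bit-encoded index mentioned in the abstract, but the backward-search loop that counts exact occurrences of a pattern is the standard one:

```python
def fm_index(text):
    """Build a toy FM-index (BWT plus character-count table) for exact search."""
    text += "$"  # unique terminator, lexicographically smallest
    sa = sorted(range(len(text)), key=lambda i: text[i:])   # suffix array (naive)
    bwt = "".join(text[i - 1] for i in sa)                  # char preceding each suffix
    # C[c]: number of characters in text strictly smaller than c
    C = {c: sum(ch < c for ch in text) for c in set(text)}
    return bwt, C

def rank(bwt, c, i):
    """Occurrences of c in bwt[:i] (a real index samples/precomputes this)."""
    return bwt[:i].count(c)

def count_matches(bwt, C, pattern):
    """Backward search: number of exact occurrences of pattern in the text."""
    lo, hi = 0, len(bwt)
    for c in reversed(pattern):
        lo = C.get(c, 0) + rank(bwt, c, lo)
        hi = C.get(c, 0) + rank(bwt, c, hi)
        if lo >= hi:
            return 0
    return hi - lo

bwt, C = fm_index("ACGTACGTACGA")
hits = count_matches(bwt, C, "ACG")  # three occurrences
```

Each pattern is processed by an independent sequence of rank queries, which is why the workload distributes naturally across many GPU threads and across heterogeneous devices.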
NASA Astrophysics Data System (ADS)
D'Urzo, Annalisa; Konijnenberg, Albert; Rossetti, Giulia; Habchi, Johnny; Li, Jinyu; Carloni, Paolo; Sobott, Frank; Longhi, Sonia; Grandori, Rita
2015-03-01
Intrinsically disordered proteins (IDPs) form biologically active complexes that can retain a high degree of conformational disorder, escaping structural characterization by conventional approaches. An example is offered by the complex between the intrinsically disordered NTAIL domain and the phosphoprotein X domain (PXD) from measles virus (MeV). Here, distinct conformers of the complex are detected by electrospray ionization-mass spectrometry (ESI-MS) and ion mobility (IM) techniques, yielding estimates for the solvent-accessible surface area (SASA) in solution and the average collision cross-section (CCS) in the gas phase. Computational modeling of the complex in solution, based on experimental constraints, provides atomic-resolution structural models featuring different levels of compactness. The resulting models indicate high structural heterogeneity. The intermolecular interactions are predominantly hydrophobic, not only in the ordered core of the complex, but also in the dynamic, disordered regions. Electrostatic interactions become involved in the more compact states. This system represents an illustrative example of a hydrophobic complex that could be directly detected in the gas phase by native mass spectrometry. This work represents the first attempt at modeling the entire NTAIL domain bound to PXD at atomic resolution.
Rida, Padmashree C. G.; Cantuaria, Guilherme; Reid, Michelle D.; Kucuk, Omer
2016-01-01
Cancer is truly an iconic disease—a tour de force whose multiple formidable strengths can be attributed to the bewildering heterogeneity that a tumor can manifest both spatially and temporally. A Darwinian evolutionary process is believed to undergird, at least in part, the generation of this heterogeneity that contributes to poor clinical outcomes. Risk assessment in clinical oncology is currently based on a small number of clinicopathologic factors (like stage, histological grade, receptor status, and serum tumor markers) and offers limited accuracy in predicting disease course as evidenced by the prognostic heterogeneity that persists in risk segments produced by present-day models. We posit that this insufficiency stems from the exclusion of key risk contributors from such models, especially the omission of certain factors implicated in generating intratumoral heterogeneity. The extent of centrosome amplification and the mitotic propensity inherent in a tumor are two such vital factors whose contributions to poor prognosis are presently overlooked in risk prognostication. Supernumerary centrosomes occur widely in tumors and are potent drivers of chromosomal instability that fosters intratumoral heterogeneity. The mitotic propensity of a proliferating population of tumor cells reflects the cell cycling kinetics of that population. Since frequent passage through improperly regulated mitotic divisions accelerates production of diverse genotypes, the mitotic propensity inherent in a tumor serves as a powerful beacon of risk. In this review, we highlight how centrosome amplification and error-prone mitoses contribute to poor clinical outcomes and urge the need to develop these cancer-specific traits as much-needed clinically-facile prognostic biomarkers with immense potential value for individualized cancer treatment in the clinic. PMID:26358854
NASA Astrophysics Data System (ADS)
Schilling, Oleg
2016-11-01
Two-, three- and four-equation, single-velocity, multicomponent Reynolds-averaged Navier-Stokes (RANS) models, based on the turbulent kinetic energy dissipation rate or lengthscale, are used to simulate At = 0.5 Rayleigh-Taylor turbulent mixing with constant and complex accelerations. The constant acceleration case is inspired by the Cabot and Cook (2006) DNS, and the complex acceleration cases are inspired by the unstable/stable and unstable/neutral cases simulated using DNS (Livescu, Wei & Petersen 2011) and the unstable/stable/unstable case simulated using ILES (Ramaprabhu, Karkhanis & Lawrie 2013). The four-equation models couple equations for the mass flux a and negative density-specific volume correlation b to the K-ɛ or K-L equations, while the three-equation models use a two-fluid algebraic closure for b. The lengthscale-based models are also applied with no buoyancy production in the L equation to explore the consequences of neglecting this term. Predicted mixing widths, turbulence statistics, fields, and turbulent transport equation budgets are compared among these models to identify similarities and differences in the turbulence production, dissipation and diffusion physics represented by the closures used in these models. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
Shock Wave Propagation in Cementitious Materials at Micro/Meso Scales
2015-08-31
Shock wave response of heterogeneous materials like cement and concrete is greatly influenced by the constituents and their statistical distributions. The microstructure of cement is complex due to the presence of unhydrated water, nano/micro pores, and other...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, W Michael; Kohlmeyer, Axel; Plimpton, Steven J
The use of accelerators such as graphics processing units (GPUs) has become popular in scientific computing applications due to their low cost, impressive floating-point capabilities, high memory bandwidth, and low electrical power requirements. Hybrid high-performance computers, machines with nodes containing more than one type of floating-point processor (e.g. CPU and GPU), are now becoming more prevalent due to these advantages. In this paper, we present a continuation of previous work implementing algorithms for using accelerators into the LAMMPS molecular dynamics software for distributed memory parallel hybrid machines. In our previous work, we focused on acceleration for short-range models with an approach intended to harness the processing power of both the accelerator and (multi-core) CPUs. To augment the existing implementations, we present an efficient implementation of long-range electrostatic force calculation for molecular dynamics. Specifically, we present an implementation of the particle-particle particle-mesh method based on the work by Harvey and De Fabritiis. We present benchmark results on the Keeneland InfiniBand GPU cluster. We provide a performance comparison of the same kernels compiled with both CUDA and OpenCL. We discuss limitations to parallel efficiency and future directions for improving performance on hybrid or heterogeneous computers.
Crystal cryocooling distorts conformational heterogeneity in a model Michaelis complex of DHFR
Keedy, Daniel A.; van den Bedem, Henry; Sivak, David A.; Petsko, Gregory A.; Ringe, Dagmar; Wilson, Mark A.; Fraser, James S.
2014-01-01
Summary Most macromolecular X-ray structures are determined from cryocooled crystals, but it is unclear whether cryocooling distorts functionally relevant flexibility. Here we compare independently acquired pairs of high-resolution datasets of a model Michaelis complex of dihydrofolate reductase (DHFR), collected by separate groups at both room and cryogenic temperatures. These datasets allow us to isolate the differences between experimental procedures and between temperatures. Our analyses of multiconformer models and time-averaged ensembles suggest that cryocooling suppresses and otherwise modifies sidechain and mainchain conformational heterogeneity, quenching dynamic contact networks. Despite some idiosyncratic differences, most changes from room temperature to cryogenic temperature are conserved, and likely reflect temperature-dependent solvent remodeling. Both cryogenic datasets point to additional conformations not evident in the corresponding room-temperature datasets, suggesting that cryocooling does not merely trap pre-existing conformational heterogeneity. Our results demonstrate that crystal cryocooling consistently distorts the energy landscape of DHFR, a paragon for understanding functional protein dynamics. PMID:24882744
Economic networks: Heterogeneity-induced vulnerability and loss of synchronization
NASA Astrophysics Data System (ADS)
Colon, Célian; Ghil, Michael
2017-12-01
Interconnected systems are prone to propagation of disturbances, which can undermine their resilience to external perturbations. Propagation dynamics can clearly be affected by potential time delays in the underlying processes. We investigate how such delays influence the resilience of production networks facing disruption of supply. Interdependencies between economic agents are modeled using systems of Boolean delay equations (BDEs); doing so allows us to introduce heterogeneity in production delays and in inventories. Complex network topologies are considered that reproduce realistic economic features, including a network of networks. Perturbations that would otherwise vanish can, because of delay heterogeneity, amplify and lead to permanent disruptions. This phenomenon is enabled by the interactions between short cyclic structures. A difference in delays between two interacting, and otherwise resilient, structures can in turn lead to loss of synchronization in damage propagation and thus prevent recovery. Finally, this study also shows that BDEs on complex networks can lead to metastable relaxation oscillations, which are damped out in one part of a network while moving on to another part.
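A Boolean delay equation reduces each agent's state to 0/1 with delayed dependencies. The toy discretization below is our own construction, not the authors' model (which uses richer network-of-networks topologies and inventories); it illustrates one ingredient the abstract highlights, namely that on a cyclic supply structure a transient shock can keep recirculating instead of dying out.

```python
def simulate_bde(delays, steps, shock_len=3):
    """Ring of n agents; agent i copies its supplier's state delays[i]
    steps earlier (1 = producing, 0 = disrupted). Agent 0 suffers an
    external supply shock during the first shock_len computed steps."""
    n, max_d = len(delays), max(delays)
    history = [[1] * n for _ in range(max_d)]     # pre-shock history
    for t in range(max_d, max_d + steps):
        state = []
        for i in range(n):
            s = history[t - delays[i]][(i - 1) % n]
            if i == 0 and t - max_d < shock_len:
                s = 0                             # external shock
            state.append(s)
        history.append(state)
    return history
```

With unit delays on a four-node ring, the three-step shock travels around the cycle and re-hits node 0 after it has recovered, so the disruption persists indefinitely as a circulating pulse.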
NASA Astrophysics Data System (ADS)
Gu, En-Guo
In this paper, we formulate a dynamical model of a common fishery resource harvested by multiple agents with heterogeneous strategies: profit maximizers and gradient learners. Special attention is paid to the problem of heterogeneity of strategic behaviors. We mainly study the existence and the local stability of non-negative equilibria for the model through mathematical analysis. We analyze local bifurcations and complex dynamics such as coexisting attractors by numerical simulations. We also study the local and global dynamics of the exclusive gradient learners as a special case of the model. We find that, when the adjustment speed is relatively high, an increasing ratio of gradient learners may destabilize the fixed point and drive the system into complicated dynamics such as quasiperiodic or chaotic attractors. The results reveal that gradient learners with high adjustment speed may ultimately be more harmful to the sustainable use of the fish stock than the profit maximizers.
Genetic algorithm learning in a New Keynesian macroeconomic setup.
Hommes, Cars; Makarewicz, Tomasz; Massaro, Domenico; Smits, Tom
2017-01-01
In order to understand heterogeneous behavior amongst agents, empirical data from Learning-to-Forecast (LtF) experiments can be used to construct learning models. This paper follows up on Assenza et al. (2013) by using a Genetic Algorithms (GA) model to replicate the results from their LtF experiment. In this GA model, individuals optimize an adaptive, a trend following and an anchor coefficient in a population of general prediction heuristics. We replicate experimental treatments in a New-Keynesian environment with increasing complexity and use Monte Carlo simulations to investigate how well the model explains the experimental data. We find that the evolutionary learning model is able to replicate the three different types of behavior observed in the treatments (convergence to steady state, stable oscillations, and dampened oscillations) using one GA model. Heterogeneous behavior can thus be explained by an adaptive, anchor and trend extrapolating component, and the GA model can be used to explain heterogeneous behavior in LtF experiments with different types of complexity.
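A stripped-down version of such a GA model might evolve the adaptive weight and trend coefficient of a first-order prediction heuristic against an observed price series. This is a sketch under our own assumptions: the heuristic form, genetic operators, and parameter values below are illustrative, not those of Hommes et al.

```python
import random

def heuristic(params, prev, prev2, anchor):
    """First-order heuristic: adaptive/anchor mix plus trend extrapolation."""
    alpha, beta = params
    return alpha * prev + (1 - alpha) * anchor + beta * (prev - prev2)

def fitness(params, series, anchor):
    """Negative mean squared one-step-ahead forecast error."""
    errs = [(series[t] - heuristic(params, series[t-1], series[t-2], anchor)) ** 2
            for t in range(2, len(series))]
    return -sum(errs) / len(errs)

def evolve(series, anchor, pop_size=30, generations=60, seed=1):
    """Elitist GA over (alpha, beta) chromosomes with Gaussian mutation."""
    rng = random.Random(seed)
    pop = [(rng.uniform(0, 1), rng.uniform(-1, 1)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: fitness(p, series, anchor), reverse=True)
        elite = pop[: pop_size // 2]              # keep the best half
        children = [(min(1.0, max(0.0, a + rng.gauss(0, 0.1))),
                     b + rng.gauss(0, 0.1))
                    for a, b in rng.choices(elite, k=pop_size - len(elite))]
        pop = elite + children
    return max(pop, key=lambda p: fitness(p, series, anchor))
```

On a series that converges monotonically to a steady state, the evolved population concentrates on a high adaptive weight and a near-zero trend coefficient, mirroring the "convergence" behavioral type.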
Bueno-Orovio, Alfonso; Kay, David; Grau, Vicente; Rodriguez, Blanca; Burrage, Kevin
2014-01-01
Impulse propagation in biological tissues is known to be modulated by structural heterogeneity. In cardiac muscle, improved understanding on how this heterogeneity influences electrical spread is key to advancing our interpretation of dispersion of repolarization. We propose fractional diffusion models as a novel mathematical description of structurally heterogeneous excitable media, as a means of representing the modulation of the total electric field by the secondary electrical sources associated with tissue inhomogeneities. Our results, analysed against in vivo human recordings and experimental data of different animal species, indicate that structural heterogeneity underlies relevant characteristics of cardiac electrical propagation at tissue level. These include conduction effects on action potential (AP) morphology, the shortening of AP duration along the activation pathway and the progressive modulation by premature beats of spatial patterns of dispersion of repolarization. The proposed approach may also have important implications in other research fields involving excitable complex media. PMID:24920109
Heterogeneous Monolithic Integration of Single-Crystal Organic Materials.
Park, Kyung Sun; Baek, Jangmi; Park, Yoonkyung; Lee, Lynn; Hyon, Jinho; Koo Lee, Yong-Eun; Shrestha, Nabeen K; Kang, Youngjong; Sung, Myung Mo
2017-02-01
Manufacturing high-performance organic electronic circuits requires the effective heterogeneous integration of different nanoscale organic materials with uniform morphology and high crystallinity in a desired arrangement. In particular, the development of high-performance organic electronic and optoelectronic devices relies on high-quality single crystals that show optimal intrinsic charge-transport properties and electrical performance. Moreover, the heterogeneous integration of organic materials on a single substrate in a monolithic way is highly demanded for the production of fundamental organic electronic components as well as complex integrated circuits. Here, many of the methods designed to pattern multiple heterogeneous organic materials on a substrate are described, together with the heterogeneous integration of organic single crystals through their crystal growth. Critical issues that have been encountered in the development of high-performance organic integrated electronics are also addressed. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Correlation of Noise Signature to Pulsed Power Events at the HERMES III Accelerator.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lewis, Barbara; Joseph, Nathan Ryan; Salazar, Juan Diego
2016-11-01
The HERMES III accelerator, which is located at Sandia National Laboratories' Tech Area IV, is the largest pulsed gamma X-ray source in the world. The accelerator is made up of 20 inductive cavities that are charged to 1 MV each by complex pulsed power circuitry. The firing time of the machine components ranges between the microsecond and nanosecond timescales. This results in a variety of electromagnetic frequencies when the accelerator fires. Testing was done to identify the HERMES electromagnetic noise signal and to map it to the various accelerator trigger events. This report will show the measurement methods used to capture the noise spectrum produced from the machine and correlate this noise signature with machine events.
ILU industrial electron accelerators for medical-product sterilization and food treatment
NASA Astrophysics Data System (ADS)
Bezuglov, V. V.; Bryazgin, A. A.; Vlasov, A. Yu.; Voronin, L. A.; Panfilov, A. D.; Radchenko, V. M.; Tkachenko, V. O.; Shtarklev, E. A.
2016-12-01
Pulse linear electron accelerators of the ILU type have been developed and produced by the Institute of Nuclear Physics, Siberian Branch, Russian Academy of Sciences, for more than 30 years. Their distinctive features are simplicity of design, convenience in operation, and reliability during long work under conditions of industrial production. ILU accelerators have an energy range of 0.7-10 MeV at an accelerated-beam power of up to 100 kW, and they are optimally suited for use as universal sterilizing complexes. The scientific novelty of these accelerators consists in their capability to work both in the electron-treatment mode of production and in the bremsstrahlung generation mode, which has high penetrating power.
Further Development of the Gyrotron- Powered Pellet Accelerator
NASA Astrophysics Data System (ADS)
Perkins, Francis
2007-11-01
The Gyrotron-Powered Pellet Accelerator provides an enabling technology to efficiently fuel ITER with fast pellets launched from the High Field Side (HFS) separatrix. Pellet experiments have repeatedly found that fueling efficiency is high, consistent with 100%. In contrast, Low Field Side (LFS) launch experiments find efficiencies of 50% or less. This report addresses what experimental program and material choices can be made to retain program momentum. An initial program seeks to establish that our heterogeneous approach to conductivity works, maintaining a conductivity of about 1 mho/m. A demonstration of acceleration can be carried out in a very simple laboratory when the pusher material D2[Be] is replaced by LiH[C], a room-temperature solid with a graphite particle suspension: no cryogenics or hazardous chemicals. The mm-wave mirror will be graphite, the tamper sapphire, and the payload LiD. The payload pellet has a diameter of 3 mm and a mass M = 4.4×10^-4 kg, corresponding to a kinetic energy of 220 J at V = 1000 m/s. A barrel length of 15 cm completes the design specification.
The block adaptive multigrid method applied to the solution of the Euler equations
NASA Technical Reports Server (NTRS)
Pantelelis, Nikos
1993-01-01
In the present study, a scheme capable of solving complex nonlinear systems of equations quickly and robustly is presented. The Block Adaptive Multigrid (BAM) solution method offers multigrid acceleration and adaptive grid refinement based on a prediction of the solution error. The proposed solution method was used with an implicit upwind Euler solver for the solution of complex transonic flows around airfoils. For two test cases, very fast results were obtained (an 18-fold acceleration of the solution) using one fourth of the volumes of a global grid, with the same solution accuracy.
Protein-protein docking on hardware accelerators: comparison of GPU and MIC architectures
2015-01-01
Background Hardware accelerators can provide solutions to computationally complex problems in bioinformatics. However, the effect of acceleration depends on the nature of the application, so selection of an appropriate accelerator requires some consideration. Results In the present study, we compared the effects of acceleration using a graphics processing unit (GPU) and a many integrated core (MIC) coprocessor on the speed of fast Fourier transform (FFT)-based protein-protein docking calculations. The GPU implementation performed the protein-protein docking calculations approximately five times faster than the MIC offload-mode implementation. The MIC native-mode implementation has an advantage in implementation cost; however, its performance was worse for larger protein pairs because of memory limitations. Conclusion The results suggest that the GPU is more suitable than the MIC for accelerating FFT-based protein-protein docking applications. PMID:25707855
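The FFT trick at the heart of such docking codes evaluates all translational overlap scores at once via the correlation theorem: the cross-correlation of two grids is the inverse transform of one spectrum times the conjugate of the other. A 1-D pure-Python sketch follows; production codes work on 3-D voxel grids with vendor FFT libraries, and the names here are ours.

```python
import cmath

def fft(a, invert=False):
    """Radix-2 Cooley-Tukey FFT (len(a) must be a power of two)."""
    n = len(a)
    if n == 1:
        return a[:]
    even = fft(a[0::2], invert)
    odd = fft(a[1::2], invert)
    sign = 1 if invert else -1
    out = [0j] * n
    for k in range(n // 2):
        w = cmath.exp(sign * 2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + w
        out[k + n // 2] = even[k] - w
    return out

def correlation_scores(receptor, ligand):
    """Overlap score for every cyclic translation, via the correlation
    theorem: scores = IFFT(FFT(receptor) * conj(FFT(ligand)))."""
    n = len(receptor)
    R = fft([complex(x) for x in receptor])
    L = fft([complex(x) for x in ligand])
    spec = [r * l.conjugate() for r, l in zip(R, L)]
    return [x.real / n for x in fft(spec, invert=True)]
```

The payoff is complexity: all n translation scores cost O(n log n) instead of the O(n^2) of direct summation, and the transform is exactly the kernel that GPUs and MIC coprocessors accelerate well.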
The Gift of Time: Today's Academic Acceleration Case Study Voices of Experience
ERIC Educational Resources Information Center
Scheibel, Susan Riley
2010-01-01
The purpose of this qualitative case study was to examine today's academic acceleration from the lived experience and perspectives of two young adults whose education was shortened, thereby allowing them the gift of time. Through personal interviews, parent interviews, and physical artifacts, the researcher gained a complex, holistic understanding…
Digital sorting of complex tissues for cell type-specific gene expression profiles.
Zhong, Yi; Wan, Ying-Wooi; Pang, Kaifang; Chow, Lionel M L; Liu, Zhandong
2013-03-07
Cellular heterogeneity is present in almost all gene expression profiles. However, transcriptome analysis of tissue specimens often ignores the cellular heterogeneity present in these samples. Standard deconvolution algorithms require prior knowledge of the cell type frequencies within a tissue or their in vitro expression profiles. Furthermore, these algorithms tend to report biased estimates. Here, we describe a Digital Sorting Algorithm (DSA) for extracting cell-type specific gene expression profiles from mixed tissue samples that is unbiased and does not require prior knowledge of cell type frequencies. The results suggest that DSA is a specific and sensitive algorithm for gene expression profile deconvolution and will be useful in studying individual cell types of complex tissues.
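The linear model underlying such deconvolution treats each gene's mixed expression as a fractions-weighted sum of cell-type profiles. The sketch below recovers profiles by per-gene least squares given known fractions; note this is the easier direction, whereas DSA's contribution is estimating frequencies and profiles without that prior knowledge. The names and structure are illustrative, not the published DSA pipeline.

```python
def solve(a, b):
    """Gauss-Jordan elimination with partial pivoting for a small system."""
    n = len(b)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(n):
            if r != col:
                factor = m[r][col] / m[col][col]
                m[r] = [x - factor * y for x, y in zip(m[r], m[col])]
    return [m[i][n] / m[i][i] for i in range(n)]

def deconvolve_profiles(mixtures, fractions):
    """Per-gene least squares: mixture[s][g] ~= sum_t fractions[s][t] * profile[t][g].
    Solves the normal equations A^T A x = A^T y with A the fractions matrix."""
    n_types = len(fractions[0])
    n_genes = len(mixtures[0])
    per_gene = []
    for g in range(n_genes):
        y = [m[g] for m in mixtures]
        ata = [[sum(f[i] * f[j] for f in fractions) for j in range(n_types)]
               for i in range(n_types)]
        aty = [sum(f[i] * yi for f, yi in zip(fractions, y))
               for i in range(n_types)]
        per_gene.append(solve(ata, aty))
    # reshape to one expression vector per cell type
    return [[x[t] for x in per_gene] for t in range(n_types)]
```

With at least as many mixture samples as cell types (and non-degenerate fractions), the normal equations recover the type-specific profiles exactly in this noise-free setting.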
Inertial Effects on Flow and Transport in Heterogeneous Porous Media.
Nissan, Alon; Berkowitz, Brian
2018-02-02
We investigate the effects of high fluid velocities on flow and tracer transport in heterogeneous porous media. We simulate fluid flow and advective transport through two-dimensional pore-scale matrices with varying structural complexity. As the Reynolds number increases, the flow regime transitions from linear to nonlinear; this behavior is controlled by the medium structure, where higher complexity amplifies inertial effects. The result is, nonintuitively, increased homogenization of the flow field, which, in the context of conservative chemical transport, leads to less anomalous behavior. We quantify the transport patterns via a continuous time random walk, using the spatial distribution of the kinetic energy within the fluid as a characteristic measure.
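In a continuous time random walk, particle motion is a sequence of jumps separated by random waiting times, and heavy-tailed waiting times are what produce anomalous (non-Fickian) transport. A minimal 1-D sketch of that mechanism follows, with our own toy parameters rather than the paper's calibrated model: narrowing the waiting-time distribution (larger tail exponent) stands in for the flow-field homogenization the authors observe.

```python
import random

def ctrw_displacements(n_particles, t_max, beta, p_forward=0.55, seed=0):
    """1-D CTRW: unit jumps with a weak forward bias, Pareto(beta) waiting
    times. Large beta gives narrow waiting times and near-Fickian transport;
    beta < 1 gives heavy tails and strongly anomalous (trapped) transport."""
    rng = random.Random(seed)
    out = []
    for _ in range(n_particles):
        t, x = 0.0, 0
        while True:
            t += rng.paretovariate(beta)          # waiting time >= 1
            if t >= t_max:
                break
            x += 1 if rng.random() < p_forward else -1
        out.append(x)
    return out
```

Comparing tail exponents shows the trapping effect directly: with beta = 0.8 a few enormous waiting times immobilize most particles, so the plume advances far less than in the near-Fickian beta = 3 case.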
Jungle Computing: Distributed Supercomputing Beyond Clusters, Grids, and Clouds
NASA Astrophysics Data System (ADS)
Seinstra, Frank J.; Maassen, Jason; van Nieuwpoort, Rob V.; Drost, Niels; van Kessel, Timo; van Werkhoven, Ben; Urbani, Jacopo; Jacobs, Ceriel; Kielmann, Thilo; Bal, Henri E.
In recent years, the application of high-performance and distributed computing in scientific practice has become increasingly widespread. Among the most widely available platforms to scientists are clusters, grids, and cloud systems. Such infrastructures are currently undergoing revolutionary change due to the integration of many-core technologies, providing orders-of-magnitude speed improvements for selected compute kernels. With high-performance and distributed computing systems thus becoming more heterogeneous and hierarchical, programming complexity is vastly increased. Further complexities arise because the urgent desire for scalability, and issues including data distribution, software heterogeneity, and ad hoc hardware availability, commonly force scientists into the simultaneous use of multiple platforms (e.g., clusters, grids, and clouds used concurrently). A true computing jungle.
Transients in the inhibitory driving of neurons and their postsynaptic consequences.
Segundo, J P; Stiber, M; Altshuler, E; Vibert, J F
1994-09-01
The presynaptic fiber at an inhibitory synapse on a pacemaker neuron was forced to generate transients, defined here as spike trains with a trend, unceasingly accelerating or slowing. Experiments were on isolated crayfish stretch receptor organs. Spike train analyses used tools and notions from conventional point processes and from non-linear dynamics. Pre- and postsynaptic discharges contrasted clearly in terms of rates and interspike intervals. The inhibitory train evolved monotonically and smoothly, following tightly the simple prescribed curves; it was uniform, exhibiting throughout a single and simple discharge form (i.e. interval patterning). The inhibited postsynaptic train alternately accelerated and slowed, not following tightly any simple curve; it was heterogeneous, exhibiting in succession several different and often complex discharge forms, and switching abruptly from one to another. The inhibited trains depended on the inhibitory transient's span, range and average slope. Accordingly, transients separated (though not sharply) into categories with prolonged spans (over 1 s) and slow slopes (around 1/s2) and those with short spans (under 1 s) and fast slopes (around 30/s2). Special transients elicited postsynaptic discharges that reproduced them faithfully, e.g. accelerating with the transient and proportionately; no transient elicited postsynaptic discharges faithful to its mirror image. Crayfish synapses are prototypes, so these findings should be expected in any other junction, as working hypotheses at least. Implications involve the operation of neural networks, including the role of distortions and their compensation, and the underlying mechanisms. Transients have received little attention, most work on synaptic coding concentrating on stationary discharges. Transients are inherent to the changing situations that pervade everyday life, however, and their biological importance is self-evident.
The different discharges encountered during a transient had strong similarities to the stationary forms reported for different pacemaker drivings that are called locking, intermittency, erratic and stammering; they were, in fact, trendy versions of these. Such forms appear with several synaptic drivings in the same order along the presynaptic rate scale; they may constitute basic building blocks for synaptic operation. In terms of non-linear science, it is as if the attractors postulated for stationary drivings remained strongly influential during the transients, though affected by the rate of change.
Simulation of municipal solid waste degradation in aerobic and anaerobic bioreactor landfills.
Patil, Bhagwan Shamrao; C, Agnes Anto; Singh, Devendra Narain
2017-03-01
Municipal solid waste generation is huge in the growing cities of developing nations such as India, owing to rapid industrial and population growth. In addition to various methods for treatment and disposal of municipal solid waste (landfills, composting, bio-methanation, incineration and pyrolysis), aerobic/anaerobic bioreactor landfills are gaining popularity for economical and effective disposal of municipal solid waste. However, the efficiency of municipal solid waste bioreactor landfills primarily depends on the municipal solid waste decomposition rate, which can be accelerated by monitoring moisture content and temperature using frequency domain reflectometry probes and thermocouples, respectively. The present study demonstrates that these physical properties of the heterogeneous municipal solid waste mass can be monitored using these instruments, which facilitates proper scheduling of leachate recirculation to accelerate the decomposition rate of municipal solid waste.
Accelerated Evolution in the Death Galaxy
NASA Astrophysics Data System (ADS)
Austin, Robert; Tung, Chih-Kuan; Gong, Xiu-Quing; Lambert, Guillaume; Liao, David
2010-03-01
We recall 4 main guiding principles of evolution: 1) instability of defections, 2) stress induced non-random mutations, 3) genetic heterogeneity, and 4) fragmented populations. Our previous preliminary experiments have been relatively simple 1-D stress experiments. We are proceeding with 2-D experiments whose design is guided by these principles. Our new experiment we have dubbed the Death Galaxy because of its use of these design principles. The "galaxy" name comes from the fact that the structure is designed as an interconnected array of micro-ecologies; these micro-ecologies are similar to the stars that comprise an astronomical galaxy, and provide the fragmented small populations. A gradient of the antibiotic Cipro is introduced across the galaxy, and we will present results which show how bacterial evolution resulting in resistance to Cipro is accelerated by the physics principles underlying the device.
He, Jie; Yang, Xiaofang; Men, Bin; Wang, Dongsheng
2016-01-01
The heterogeneous Fenton reaction can generate highly reactive hydroxyl radicals (OH) from reactions between recyclable solid catalysts and H2O2 at acidic or even circumneutral pH. Hence, it can effectively oxidize refractory organics in water or soils and has become a promising environmentally friendly treatment technology. Due to the complex reaction system, the mechanism behind heterogeneous Fenton reactions remains unresolved but fascinating, and is crucial for understanding Fenton chemistry and the development and application of efficient heterogeneous Fenton technologies. Iron-based materials usually possess high catalytic activity, low cost, negligible toxicity and easy recovery, and are a superior type of heterogeneous Fenton catalysts. Therefore, this article reviews the fundamental but important interfacial mechanisms of heterogeneous Fenton reactions catalyzed by iron-based materials. OH, hydroperoxyl radicals/superoxide anions (HO2/O2(-)) and high-valent iron are the three main types of reactive oxygen species (ROS), with different oxidation reactivity and selectivity. Based on the mechanisms of ROS generation, the interfacial mechanisms of heterogeneous Fenton systems can be classified as the homogeneous Fenton mechanism induced by surface-leached iron, the heterogeneous catalysis mechanism, and the heterogeneous reaction-induced homogeneous mechanism. Different heterogeneous Fenton systems catalyzed by characteristic iron-based materials are comprehensively reviewed. Finally, related future research directions are also suggested. Copyright © 2015. Published by Elsevier B.V.
Overview of accelerators with potential use in homeland security
Garnett, Robert W.
2015-06-18
Quite a broad range of accelerators have been applied to solving many of the challenging problems related to homeland security and defense. These accelerator systems range from relatively small, simple, and compact, to large and complex, based on the specific application requirements. They have been used or proposed as sources of primary and secondary probe beams for applications such as radiography and to induce specific reactions that are key signatures for detecting conventional explosives or fissile material. A brief overview and description of these accelerator systems, their specifications, and application will be presented. Some recent technology trends will also be discussed.
Smith, Benjamin A; Padrick, Shae B; Doolittle, Lynda K; Daugherty-Clarke, Karen; Corrêa, Ivan R; Xu, Ming-Qun; Goode, Bruce L; Rosen, Michael K; Gelles, Jeff
2013-01-01
During cell locomotion and endocytosis, membrane-tethered WASP proteins stimulate actin filament nucleation by the Arp2/3 complex. This process generates highly branched arrays of filaments that grow toward the membrane to which they are tethered, a conflict that seemingly would restrict filament growth. Using three-color single-molecule imaging in vitro we revealed how the dynamic associations of Arp2/3 complex with mother filament and WASP are temporally coordinated with initiation of daughter filament growth. We found that WASP proteins dissociated from filament-bound Arp2/3 complex prior to new filament growth. Further, mutations that accelerated release of WASP from filament-bound Arp2/3 complex proportionally accelerated branch formation. These data suggest that while WASP promotes formation of pre-nucleation complexes, filament growth cannot occur until it is triggered by WASP release. This provides a mechanism by which membrane-bound WASP proteins can stimulate network growth without restraining it. DOI: http://dx.doi.org/10.7554/eLife.01008.001 PMID:24015360
Caie, Peter D; Harrison, David J
2016-01-01
The field of pathology is rapidly transforming from a semiquantitative and empirical science toward a big data discipline. Large data sets from across multiple omics fields may now be extracted from a patient's tissue sample. Tissue is, however, complex, heterogeneous, and prone to artifact. A reductionist view of tissue and disease progression, which does not take this complexity into account, may lead to single biomarkers failing in clinical trials. The integration of standardized multi-omics big data and the retention of valuable information on spatial heterogeneity are imperative to model complex disease mechanisms. Mathematical modeling through systems pathology approaches is the ideal medium to distill the significant information from these large, multi-parametric, and hierarchical data sets. Systems pathology may also predict the dynamical response of disease progression or response to therapy regimens from a static tissue sample. Next-generation pathology will incorporate big data with systems medicine in order to personalize clinical practice for both prognostic and predictive patient care.
3-D FDTD simulation of shear waves for evaluation of complex modulus imaging.
Orescanin, Marko; Wang, Yue; Insana, Michael
2011-02-01
The Navier equation describing shear wave propagation in 3-D viscoelastic media is solved numerically with a finite-difference time-domain (FDTD) method. Solutions are formed in terms of transverse scatterer velocity waves and then verified via comparison to measured wave fields in heterogeneous hydrogel phantoms. The numerical algorithm is used as a tool to study how wave propagation in heterogeneous viscoelastic media affects complex shear modulus estimation. We used an algebraic Helmholtz inversion (AHI) technique to solve for the complex shear modulus from simulated and experimental velocity data acquired in 2-D and 3-D. Although 3-D velocity estimates are required in general, there are object geometries for which 2-D inversions provide accurate estimates of the material properties. Through simulations and experiments, we explored artifacts generated in elastic and dynamic-viscous shear modulus images related to the shear wavelength and average viscosity.
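The algebraic Helmholtz inversion mentioned above reduces, for time-harmonic shear motion, to a pointwise formula: the Helmholtz equation G∇²v + ρω²v = 0 gives G(ω) = -ρω²v/∇²v. A minimal numpy sketch of that inversion on a synthetic plane shear wave follows; all material values are illustrative, not taken from the paper.

```python
import numpy as np

# AHI sketch: recover complex shear modulus G from a time-harmonic
# transverse velocity field via G = -rho * omega^2 * v / laplacian(v).
rho = 1000.0                  # density, kg/m^3 (illustrative)
omega = 2 * np.pi * 100.0     # 100 Hz vibration (illustrative)
G_true = 3000.0 + 300.0j      # storage + j*loss modulus, Pa (illustrative)
k = omega * np.sqrt(rho / G_true)   # complex wavenumber

dx = 1e-3
x = np.arange(0, 0.1, dx)
X, Y = np.meshgrid(x, x, indexing="ij")
v = np.exp(-1j * k * X)       # decaying plane shear wave along x

# 5-point finite-difference Laplacian on interior points
lap = (v[2:, 1:-1] + v[:-2, 1:-1] + v[1:-1, 2:] + v[1:-1, :-2]
       - 4 * v[1:-1, 1:-1]) / dx**2

G_est = -rho * omega**2 * v[1:-1, 1:-1] / lap
print(np.mean(G_est.real), np.mean(G_est.imag))
```

The small bias of the recovered modulus relative to `G_true` comes from the finite-difference approximation of the Laplacian, one of the wavelength-dependent artifacts the abstract refers to.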
Vermehren-Schmaedick, Anke; Krueger, Wesley; Jacob, Thomas; Ramunno-Johnson, Damien; Balkowiec, Agnieszka; Lidke, Keith A.; Vu, Tania Q.
2014-01-01
Accumulating evidence underscores the importance of ligand-receptor dynamics in shaping cellular signaling. In the nervous system, growth factor-activated Trk receptor trafficking serves to convey biochemical signaling that underlies fundamental neural functions. Focus has been placed on axonal trafficking, but little is known about growth factor-activated Trk dynamics in the neuronal soma, particularly at the molecular scale, due in large part to technical hurdles in observing individual growth factor-Trk complexes for long periods of time inside live cells. Quantum dots (QDs) are intensely fluorescent nanoparticles that have been used to study the dynamics of ligand-receptor complexes at the plasma membrane, but the value of QDs for investigating ligand-receptor intracellular dynamics has not been well exploited. The current study establishes that QD-conjugated brain-derived neurotrophic factor (QD-BDNF) binds to TrkB receptors with high specificity, activates TrkB downstream signaling, and allows single-QD tracking capability for long recording durations deep within the soma of live neurons. QD-BDNF complexes undergo internalization, recycling, and intracellular trafficking in the neuronal soma. These trafficking events exhibit little time-synchrony and diverse heterogeneity in underlying dynamics that include phases of sustained rapid motor transport without pause as well as immobility of surprisingly long-lasting duration (several minutes). Moreover, the trajectories formed by dynamic individual BDNF complexes show no apparent end destination; BDNF complexes can be found meandering over long distances of several microns throughout the expanse of the neuronal soma in a circuitous fashion. The complex, heterogeneous nature of neuronal soma trafficking dynamics contrasts with the reported linear nature of axonal transport and calls for models that surpass our generally limited notions of nuclear-directed transport in the soma.
QD-ligand probes are poised to provide understanding of how the molecular mechanisms underlying intracellular ligand-receptor trafficking shape cell signaling in both healthy and diseased neurological models. PMID:24732948
Accelerated Gaussian mixture model and its application on image segmentation
NASA Astrophysics Data System (ADS)
Zhao, Jianhui; Zhang, Yuanyuan; Ding, Yihua; Long, Chengjiang; Yuan, Zhiyong; Zhang, Dengyi
2013-03-01
The Gaussian mixture model (GMM) has been widely used for image segmentation in recent years due to its superior adaptability and simplicity of implementation. However, the traditional GMM has the disadvantage of high computational complexity. In this paper an accelerated GMM is designed, adopting the following approaches: establishing a lookup table for the Gaussian probability matrix to avoid repetitive probability calculations on all pixels, employing a blocking detection method on each block of pixels to further decrease the complexity, and changing the structure of the lookup table from 3-D to 1-D with a simpler data type to reduce the space requirement. The accelerated GMM is applied to image segmentation with the help of the Otsu method to decide the threshold value automatically. Our algorithm has been tested by segmenting flames and faces from a set of real pictures, and the experimental results prove its efficiency in segmentation precision and computational cost.
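The lookup-table acceleration described above exploits the fact that an 8-bit image has only 256 distinct gray levels, so each component's Gaussian density needs to be evaluated once per gray level rather than once per pixel. A minimal sketch of that idea; the component parameters and the test image are illustrative placeholders, not the paper's fitted model.

```python
import numpy as np

# Precompute a 1-D lookup table mapping gray value -> most likely GMM
# component, then segment the whole image by table indexing.
means = np.array([60.0, 180.0])     # illustrative component means
stds = np.array([20.0, 25.0])       # illustrative component std devs
weights = np.array([0.5, 0.5])      # illustrative mixing weights

levels = np.arange(256)[:, None]    # every possible 8-bit gray value
pdf = weights * np.exp(-0.5 * ((levels - means) / stds) ** 2) / (stds * np.sqrt(2 * np.pi))
table = pdf.argmax(axis=1).astype(np.uint8)   # 256-entry lookup table

rng = np.random.default_rng(0)
image = rng.choice([50, 200], size=(64, 64)).astype(np.uint8)
labels = table[image]               # per-pixel segmentation, no pdf evaluation
print(labels[image == 50].max(), labels[image == 200].min())
```

The cost of building the table is fixed (256 evaluations per component) regardless of image size, which is where the speedup over per-pixel evaluation comes from.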
Isaacman, Gabriel; Chan, Arthur W H; Nah, Theodora; Worton, David R; Ruehl, Chris R; Wilson, Kevin R; Goldstein, Allen H
2012-10-02
Motor oil serves as a useful model system for atmospheric oxidation of hydrocarbon mixtures typical of anthropogenic atmospheric particulate matter, but its complexity often prevents comprehensive chemical speciation. In this work we fully characterize this formerly "unresolved complex mixture" at the molecular level using recently developed soft ionization gas chromatography techniques. Nucleated motor oil particles are oxidized in a flow tube reactor to investigate the relative reaction rates of observed hydrocarbon classes: alkanes, cycloalkanes, bicycloalkanes, tricycloalkanes, and steranes. Oxidation of hydrocarbons in a complex aerosol is found to be efficient, with approximately three-quarters (0.72 ± 0.06) of OH collisions yielding a reaction. Reaction rates of individual hydrocarbons are structurally dependent: compared to normal alkanes, reaction rates increased by 20-50% with branching, while rates decreased ∼20% per nonaromatic ring present. These differences in rates are expected to alter particle composition as a function of oxidation, with depletion of branched and enrichment of cyclic hydrocarbons. Due to this expected shift toward ring-opening reactions, heterogeneous oxidation of the unreacted hydrocarbon mixture is less likely to proceed through fragmentation pathways in more oxidized particles. Based on the observed oxidation-induced changes in composition, isomer-resolved analysis has potential utility for determining the photochemical age of atmospheric particulate matter with respect to heterogeneous oxidation.
Overview of Solar Radio Bursts and their Sources
NASA Astrophysics Data System (ADS)
Golla, Thejappa; MacDowall, Robert J.
2018-06-01
Properties of radio bursts emitted by the Sun at frequencies below tens of MHz are reviewed. In this frequency range, the most prominent radio emissions are those of solar type II, complex type III, and solar type IV radio bursts, probably excited by energetic electron populations accelerated in completely different environments: (1) type II bursts are due to non-relativistic electrons accelerated by CME-driven interplanetary shocks, (2) complex type III bursts are due to near-relativistic electrons accelerated either by the solar flare reconnection process or by SEP shocks, and (3) type IV bursts are due to relativistic electrons trapped in the post-eruption arcades behind CMEs; these relativistic electrons are probably accelerated by continued reconnection processes occurring beneath the CME. These radio bursts, which can serve as natural plasma probes traversing the heliosphere by providing information about various crucial space plasma parameters, are also an ideal instrument for investigating the acceleration mechanisms responsible for high-energy particles. The rich collection of valuable high-quality radio and high-time-resolution in situ wave data from the WAVES experiments on the STEREO A, STEREO B, and WIND spacecraft has provided a unique opportunity to study these different radio phenomena and understand the complex physics behind their excitation. We have developed Monte Carlo simulation techniques to estimate the propagation effects on the observed characteristics of these low-frequency radio bursts. We will present some of the new results and describe how one can use these radio burst observations for space weather studies. We will also describe some of the non-linear plasma processes detected in the source regions of both solar type III and type II radio bursts. The analysis and simulation techniques used in these studies will be of immense use for future space-based radio observations.
Electrostatic Steering Accelerates C3d:CR2 Association
2016-01-01
Electrostatic effects are ubiquitous in protein interactions and are found to be pervasive in the complement system as well. The interaction between complement fragment C3d and complement receptor 2 (CR2) has evolved to become a link between innate and adaptive immunity. Electrostatic interactions have been suggested to be the driving factor for the association of the C3d:CR2 complex. In this study, we investigate the effects of ionic strength and mutagenesis on the association of C3d:CR2 through Brownian dynamics simulations. We demonstrate that the formation of the C3d:CR2 complex is ionic strength-dependent, suggesting the presence of long-range electrostatic steering that accelerates the complex formation. Electrostatic steering occurs through the interaction of an acidic surface patch in C3d and the positively charged CR2 and is supported by the effects of mutations within the acidic patch of C3d that slow or diminish association. Our data are in agreement with previous experimental mutagenesis and binding studies and computational studies. Although the C3d acidic patch may be locally destabilizing because of unfavorable Coulombic interactions of like charges, it contributes to the acceleration of association. Therefore, acceleration of function through electrostatic steering takes precedence to stability. The site of interaction between C3d and CR2 has been the target for delivery of CR2-bound nanoparticle, antibody, and small molecule biomarkers, as well as potential therapeutics. A detailed knowledge of the physicochemical basis of C3d:CR2 association may be necessary to accelerate biomarker and drug discovery efforts. PMID:27092816
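The ionic-strength dependence reported above is the signature of Debye-Hückel screening: Brownian dynamics studies of association typically attenuate Coulomb interactions by exp(-r/λ_D), so raising the salt concentration shortens the screening length λ_D and weakens long-range steering. A short sketch computing λ_D for a 1:1 salt at a few illustrative ionic strengths (not values taken from this study):

```python
import math

# Debye screening length: kappa^2 = 2 * NA * e^2 * I / (eps_r * eps0 * kB * T)
e = 1.602176634e-19        # elementary charge, C
kB = 1.380649e-23          # Boltzmann constant, J/K
NA = 6.02214076e23         # Avogadro constant, 1/mol
eps0 = 8.8541878128e-12    # vacuum permittivity, F/m
eps_r = 78.5               # relative permittivity of water near 298 K
T = 298.15                 # temperature, K

def debye_length_nm(ionic_strength_molar):
    """Screening length in nm for a 1:1 electrolyte of given ionic strength."""
    n = ionic_strength_molar * 1e3 * NA          # mol/L -> ions per m^3
    kappa2 = 2 * e**2 * n / (eps_r * eps0 * kB * T)
    return 1e9 / math.sqrt(kappa2)

for I in (0.05, 0.15, 0.50):
    print(f"I = {I:.2f} M  ->  lambda_D = {debye_length_nm(I):.2f} nm")
```

At physiological ionic strength (~0.15 M) the screening length is under 1 nm, which is why experimentally observed association rates that still respond to salt concentration point to electrostatic steering acting during the final approach of the two proteins.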
Utility of computer simulations in landscape genetics
Bryan K. Epperson; Brad H. McRae; Kim Scribner; Samuel A. Cushman; Michael S. Rosenberg; Marie-Josee Fortin; Patrick M. A. James; Melanie Murphy; Stephanie Manel; Pierre Legendre; Mark R. T. Dale
2010-01-01
Population genetics theory is primarily based on mathematical models in which spatial complexity and temporal variability are largely ignored. In contrast, the field of landscape genetics expressly focuses on how population genetic processes are affected by complex spatial and temporal environmental heterogeneity. It is spatially explicit and relates patterns to...
Complex Instruction: A Model for Reaching Up--and Out
ERIC Educational Resources Information Center
Tomlinson, Carol Ann
2018-01-01
Complex Instruction is a multifaceted instructional model designed to provide highly challenging learning opportunities for students in heterogeneous classrooms. The model provides a rationale for and philosophy of creating equity of access to excellent curriculum and instruction for a broad range of learners, guidance for preparing students for…
A scalable plant-resolving radiative transfer model based on optimized GPU ray tracing
USDA-ARS?s Scientific Manuscript database
A new model for radiative transfer in participating media and its application to complex plant canopies is presented. The goal was to be able to efficiently solve complex canopy-scale radiative transfer problems while also representing sub-plant heterogeneity. In the model, individual leaf surfaces ...
Adaptive Acceleration of Visually Evoked Smooth Eye Movements in Mice
2016-01-01
The optokinetic response (OKR) consists of smooth eye movements following global motion of the visual surround, which suppress image slip on the retina for visual acuity. The effective performance of the OKR is limited to rather slow and low-frequency visual stimuli, although it can be adaptively improved by cerebellum-dependent mechanisms. To better understand circuit mechanisms constraining OKR performance, we monitored how distinct kinematic features of the OKR change over the course of OKR adaptation, and found that eye acceleration at stimulus onset primarily limited OKR performance but could be dramatically potentiated by visual experience. Eye acceleration in the temporal-to-nasal direction depended more on the ipsilateral floccular complex of the cerebellum than did that in the nasal-to-temporal direction. Gaze-holding following the OKR was also modified in parallel with eye-acceleration potentiation. Optogenetic manipulation revealed that synchronous excitation and inhibition of floccular complex Purkinje cells could effectively accelerate eye movements in the nasotemporal and temporonasal directions, respectively. These results collectively delineate multiple motor pathways subserving distinct aspects of the OKR in mice and constrain hypotheses regarding cellular mechanisms of the cerebellum-dependent tuning of movement acceleration. SIGNIFICANCE STATEMENT Although visually evoked smooth eye movements, known as the optokinetic response (OKR), have been studied in various species for decades, circuit mechanisms of oculomotor control and adaptation remain elusive. In the present study, we assessed kinematics of the mouse OKR through the course of adaptation training. Our analyses revealed that eye acceleration at visual-stimulus onset primarily limited the working velocity and frequency range of the OKR, yet could be dramatically potentiated during OKR adaptation.
Potentiation of eye acceleration exhibited different properties between the nasotemporal and temporonasal OKRs, indicating distinct visuomotor circuits underlying the two. Lesions and optogenetic manipulation of the cerebellum provide constraints on neural circuits mediating visually driven eye acceleration and its adaptation. PMID:27335412
Haack, Tobias B; Madignier, Florence; Herzer, Martina; Lamantea, Eleonora; Danhauser, Katharina; Invernizzi, Federica; Koch, Johannes; Freitag, Martin; Drost, Rene; Hillier, Ingo; Haberberger, Birgit; Mayr, Johannes A; Ahting, Uwe; Tiranti, Valeria; Rötig, Agnes; Iuso, Arcangela; Horvath, Rita; Tesarova, Marketa; Baric, Ivo; Uziel, Graziella; Rolinski, Boris; Sperl, Wolfgang; Meitinger, Thomas; Zeviani, Massimo; Freisinger, Peter; Prokisch, Holger
2012-02-01
Mitochondrial complex I deficiency is the most common cause of mitochondrial disease in childhood. Identification of the molecular basis is difficult given the clinical and genetic heterogeneity. Most patients lack a molecular definition in routine diagnostics. A large-scale mutation screen of 75 candidate genes in 152 patients with complex I deficiency was performed by high-resolution melting curve analysis and Sanger sequencing. The causal role of a new disease allele was confirmed by functional complementation assays. The clinical phenotype of patients carrying mutations was documented using a standardised questionnaire. Causative mutations were detected in 16 genes, 15 of which had previously been associated with complex I deficiency: three mitochondrial DNA genes encoding complex I subunits, two mitochondrial tRNA genes and nuclear DNA genes encoding six complex I subunits and four assembly factors. For the first time, a causal mutation is described in NDUFB9, coding for a complex I subunit, resulting in reduction in NDUFB9 protein and both amount and activity of complex I. These features were rescued by expression of wild-type NDUFB9 in patient-derived fibroblasts. Mutant NDUFB9 is a new cause of complex I deficiency. A molecular diagnosis related to complex I deficiency was established in 18% of patients. However, most patients are likely to carry mutations in genes so far not associated with complex I function. The authors conclude that the high degree of genetic heterogeneity in complex I disorders warrants the implementation of unbiased genome-wide strategies for the complete molecular dissection of mitochondrial complex I deficiency.
Development of Cross Section Library and Application Programming Interface (API)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, C. H.; Marin-Lafleche, A.; Smith, M. A.
2014-04-09
The goal of NEAMS neutronics is to develop a high-fidelity deterministic neutron transport code termed PROTEUS for use on all reactor types of interest, but focused primarily on sodium-cooled fast reactors. While PROTEUS-SN has demonstrated good accuracy for homogeneous fast reactor problems and partially heterogeneous fast reactor problems, the simulation results were not satisfactory when applied to fully heterogeneous thermal problems like the Advanced Test Reactor (ATR). This is mainly attributed to the quality of cross section data for heterogeneous geometries, since the conventional cross section generation approach does not work accurately for such irregular and complex geometries. Therefore, one of the NEAMS neutronics tasks since FY12 has been the development of a procedure to generate appropriate cross sections for a heterogeneous geometry core.
2016-01-01
...planning exercises and wargaming. Initial Experimentation: Late in the research, the prototype platform and the various fusion methods came together. This... Chapter Four points to prior research... Uncertainty-Sensitive Heterogeneous Information Fusion... multimethod fusing of complex information... Our research is assessing the threat of terrorism posed by individuals or groups under scrutiny. Broadly, the ultimate objectives, which go well...
Genetic Heterogeneity in Algerian Human Populations
Deba, Tahria; Calafell, Francesc; Benhamamouch, Soraya; Comas, David
2015-01-01
The demographic history of human populations in North Africa has been characterized by complex processes of admixture and isolation that have modeled its current gene pool. Diverse genetic ancestral components with different origins (autochthonous, European, Middle Eastern, and sub-Saharan) and genetic heterogeneity in the region have been described. In this complex genetic landscape, Algeria, the largest country in Africa, has been poorly covered, with most of the studies using a single Algerian sample. In order to evaluate the genetic heterogeneity of Algeria, Y-chromosome, mtDNA and autosomal genome-wide markers have been analyzed in several Berber- and Arab-speaking groups. Our results show that the genetic heterogeneity found in Algeria is not correlated with geography or linguistics, challenging the idea of Berber groups being genetically isolated and Arab groups open to gene flow. In addition, we have found that external sources of gene flow into North Africa have been carried more often by females than males, while the North African autochthonous component is more frequent in paternally transmitted genome regions. Our results highlight the different demographic history revealed by different markers and urge caution when deriving general conclusions from partial genomic information or from single samples as representatives of the total population of a region. PMID:26402429
Investment horizon heterogeneity and wavelet: Overview and further research directions
NASA Astrophysics Data System (ADS)
Chakrabarty, Anindya; De, Anupam; Gunasekaran, Angappa; Dubey, Rameshwar
2015-07-01
Wavelet-based multi-scale analysis of financial time series has lately attracted much attention from both academia and practitioners all around the world. The unceasing metamorphosis of the discipline of finance from its humble beginning as applied economics to the more sophisticated depiction as applied physics and applied psychology has revolutionized the way we perceive the market and its complexities. One such complexity is the presence of heterogeneous-horizon agents in the market. In this context, we have performed a generous review of different aspects of horizon heterogeneity that have been successfully elucidated through the synergy between wavelet theory and finance. The evolution of wavelets has been succinctly delineated to provide the necessary background to readers who are new to this field. The migration of wavelets into finance and their subsequent branching into different sub-divisions have been sketched. The pertinent literature on the impact of horizon heterogeneity on risk, asset pricing, and inter-dependencies of financial time series is explored. The significant contributions are collated and classified in accordance with their purpose and approach so that researchers and practitioners interested in this subject can benefit. Future research possibilities in the direction of "agency cost mitigation" and "synergy between econophysics and behavioral finance in stock market forecasting" are also suggested in the paper.
The application of ANN for zone identification in a complex reservoir
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, A.C.; Molnar, D.; Aminian, K.
1995-12-31
Reservoir characterization plays a critical role in appraising the economic success of reservoir management and development methods. Nearly all reservoirs show some degree of heterogeneity, which invariably impacts production. As a result, the production performance of a complex reservoir cannot be realistically predicted without accurate reservoir description. Characterization of a heterogeneous reservoir is a complex problem. The difficulty stems from the fact that sufficient data to accurately predict the distribution of the formation attributes are not usually available. Generally, geophysical logs are available from a considerable number of wells in the reservoir. Therefore, a methodology for reservoir description and characterization utilizing only well log data represents a significant technical as well as economic advantage. One of the key issues in the description and characterization of heterogeneous formations is the distribution of various zones and their properties. In this study, several artificial neural networks (ANN) were successfully designed and developed for zone identification in a heterogeneous formation from geophysical well logs. Granny Creek Field in West Virginia was selected as the study area in this paper. This field has produced oil from the Big Injun Formation since the early 1900s. Water flooding operations were initiated in the 1970s and are currently still in progress. Well log data on a substantial number of wells in this reservoir were available and were collected. Core analysis results were also available from a few wells. The log data from 3 wells, along with the various zone definitions, were utilized to train the networks for zone recognition. The data from 2 other wells with previously determined zones, based on the core and log data, were then utilized to verify the developed networks' predictions. The results indicated that ANN can be a useful tool for accurately identifying the zones in complex reservoirs.
NASA Astrophysics Data System (ADS)
Deng, Liang; Bai, Hanli; Wang, Fang; Xu, Qingxin
2016-06-01
CPU/GPU computing allows scientists to tremendously accelerate their numerical codes. In this paper, we port and optimize a double-precision alternating direction implicit (ADI) solver for the three-dimensional compressible Navier-Stokes equations from our in-house Computational Fluid Dynamics (CFD) software to a heterogeneous platform. First, we implement a full GPU version of the ADI solver to remove redundant data transfers between CPU and GPU, and then design two fine-grain schemes, namely “one-thread-one-point” and “one-thread-one-line”, to maximize the performance. Second, we present a dual-level parallelization scheme using the CPU/GPU collaborative model to exploit the computational resources of both multi-core CPUs and many-core GPUs within the heterogeneous platform. Finally, considering the fact that memory on a single node becomes inadequate when the simulation size grows, we present a tri-level hybrid programming pattern, MPI-OpenMP-CUDA, that merges fine-grain parallelism using OpenMP and CUDA threads with coarse-grain parallelism using MPI for inter-node communication. We also propose a strategy to overlap computation with communication using the advanced features of CUDA and MPI programming. We obtain a speedup of 6.0 for the ADI solver on one Tesla M2050 GPU in contrast to two Xeon X5670 CPUs. Scalability tests show that our implementation can offer significant performance improvement on the heterogeneous platform.
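The structure that makes schemes like “one-thread-one-line” natural is that each ADI half step decouples into many independent tridiagonal solves along one grid direction. A serial numpy sketch of a Peaceman-Rachford ADI step for the 2-D heat equation illustrates this; it is a stand-in for the paper's compressible Navier-Stokes solver, with illustrative grid size and coefficients.

```python
import numpy as np

def thomas(a, b, c, d):
    """Thomas algorithm for a tridiagonal system (a sub-, b main-, c super-diagonal)."""
    n = len(d)
    cp, dp = np.zeros(n), np.zeros(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.zeros(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def adi_step(u, r):
    """One Peaceman-Rachford step for u_t = alpha*(u_xx + u_yy), r = alpha*dt/(2*dx^2).
    Zero Dirichlet boundaries; each line solve below is independent, hence parallelizable."""
    n = u.shape[0]
    a = np.full(n - 2, -r); b = np.full(n - 2, 1 + 2 * r); c = np.full(n - 2, -r)
    half = u.copy()
    for j in range(1, n - 1):      # half step 1: implicit in x, explicit in y
        rhs = u[1:-1, j] + r * (u[1:-1, j + 1] - 2 * u[1:-1, j] + u[1:-1, j - 1])
        half[1:-1, j] = thomas(a, b, c, rhs)
    out = half.copy()
    for i in range(1, n - 1):      # half step 2: implicit in y, explicit in x
        rhs = half[i, 1:-1] + r * (half[i + 1, 1:-1] - 2 * half[i, 1:-1] + half[i - 1, 1:-1])
        out[i, 1:-1] = thomas(a, b, c, rhs)
    return out

n = 33
u = np.zeros((n, n)); u[n // 2, n // 2] = 1.0   # point heat source
for _ in range(50):
    u = adi_step(u, r=0.5)
print(u.max(), u.sum())
```

Each `for j` / `for i` iteration touches only one grid line, so mapping one GPU thread (or thread block) per line requires no synchronization within a half step, which is why the line-oriented scheme can outperform the point-oriented one for implicit solvers.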
Grajeda, Laura M; Ivanescu, Andrada; Saito, Mayuko; Crainiceanu, Ciprian; Jaganath, Devan; Gilman, Robert H; Crabtree, Jean E; Kelleher, Dermott; Cabrera, Lilia; Cama, Vitaliano; Checkley, William
2016-01-01
Childhood growth is a cornerstone of pediatric research. Statistical models need to consider individual trajectories to adequately describe growth outcomes. Specifically, well-defined longitudinal models are essential to characterize both population and subject-specific growth. Linear mixed-effect models with cubic regression splines can account for the nonlinearity of growth curves and provide reasonable estimators of population and subject-specific growth, velocity and acceleration. We provide a stepwise approach that builds from simple to complex models, and accounts for the intrinsic complexity of the data. We start with standard cubic spline regression models and build up to a model that includes subject-specific random intercepts and slopes and residual autocorrelation. We then compared cubic regression splines with linear piecewise splines, and varied the number and position of knots. Statistical code is provided to ensure reproducibility and improve dissemination of methods. Models are applied to longitudinal height measurements in a cohort of 215 Peruvian children followed from birth until their fourth year of life. Unexplained variability, as measured by the variance of the regression model, was reduced from 7.34 when using ordinary least squares to 0.81 (p < 0.001) when using a linear mixed-effect model with random slopes and a first-order continuous autoregressive error term. There was substantial heterogeneity in both the intercepts (p < 0.001) and slopes (p < 0.001) of the individual growth trajectories. We also identified important serial correlation within the structure of the data (ρ = 0.66; 95 % CI 0.64 to 0.68; p < 0.001), which we modeled with a first-order continuous autoregressive error term as evidenced by the variogram of the residuals and by a lack of association among residuals. The final model provides a parametric linear regression equation for both estimation and prediction of population- and individual-level growth in height.
We show that cubic regression splines are superior to linear regression splines for the case of a small number of knots in both estimation and prediction with the full linear mixed-effect model (AIC 19,352 vs. 19,598, respectively). While the regression parameters are more complex to interpret in the former, we argue that inference for any problem depends more on the estimated curve or differences in curves rather than the coefficients. Moreover, use of cubic regression splines provides biologically meaningful growth velocity and acceleration curves despite increased complexity in coefficient interpretation. Through this stepwise approach, we provide a set of tools to model longitudinal childhood data for non-statisticians using linear mixed-effect models.
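The fixed-effects backbone of such a model can be illustrated with a cubic regression spline in truncated power basis form, fit by least squares; growth velocity then follows from the analytic derivative of the basis. The data, knot positions, and growth curve below are synthetic stand-ins, not the Peruvian cohort.

```python
import numpy as np

# Cubic regression spline via truncated power basis:
# basis = [1, a, a^2, a^3, (a - k)_+^3 for each knot k]
rng = np.random.default_rng(1)
age = np.sort(rng.uniform(0, 4, 200))                             # years (synthetic)
height = 50 + 25 * np.log1p(age) + rng.normal(0, 1, age.size)     # cm (toy growth curve)

knots = np.array([0.5, 1.0, 2.0, 3.0])                            # illustrative knots
X = np.column_stack([np.ones_like(age), age, age**2, age**3] +
                    [np.clip(age - k, 0, None) ** 3 for k in knots])
beta, *_ = np.linalg.lstsq(X, height, rcond=None)
fitted = X @ beta

# growth velocity (cm/year) from the derivative of each basis column
dX = np.column_stack([np.zeros_like(age), np.ones_like(age), 2 * age, 3 * age**2] +
                     [3 * np.clip(age - k, 0, None) ** 2 for k in knots])
velocity = dX @ beta
print(np.abs(height - fitted).mean(), velocity.mean())
```

A second differentiation of the basis would give the acceleration curve; the full mixed-effect model in the paper additionally places random intercepts and slopes on top of this fixed-effects curve and models residual autocorrelation.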
Environmental Integrity of Coating/Metal Interface.
1988-01-01
...AgCl accelerate disbonding by the formation of a weak fluid boundary layer at the coating/metal interface just ahead of electroosmotically produced... pockets of electroosmotically formed electrolyte or swollen regions of the heterogeneous polymer. A time series of micrographs allowed a virtually...
NASA Astrophysics Data System (ADS)
Nagaso, Masaru; Komatitsch, Dimitri; Moysan, Joseph; Lhuillier, Christian
2018-01-01
The ASTRID project, a French fourth-generation sodium-cooled nuclear reactor, is currently under development by the Alternative Energies and Atomic Energy Commission (CEA). In this project, the development of monitoring techniques for a nuclear reactor during operation has been identified as a major issue for enhancing plant safety. Ultrasonic measurement techniques (e.g. thermometry, visualization of internal objects) are regarded as powerful inspection tools for sodium-cooled fast reactors (SFR), including ASTRID, due to the opacity of liquid sodium. Inside a sodium cooling circuit, the medium becomes heterogeneous because of the complex flow state, especially during operation, and the effect of this heterogeneity on acoustic propagation is not negligible. Verification experiments are therefore necessary for the development of component technologies, but experiments using liquid sodium tend to be relatively large-scale. This is why numerical simulation methods are essential, both to precede real experiments and to supplement the limited number of experimental results. Though various numerical methods have been applied to wave propagation in liquid sodium, none has yet been verified for three-dimensional heterogeneity. Moreover, a reactor core is a complex acousto-elastic coupled region, and it has been difficult to simulate such problems with conventional methods. The objective of this study is to address these two points by applying the three-dimensional spectral element method. In this paper, our initial results of a three-dimensional simulation study on a heterogeneous medium (the first point) are shown. To represent the heterogeneity of liquid sodium, a four-dimensional temperature field (three spatial and one temporal dimension) calculated by computational fluid dynamics (CFD) with Large-Eddy Simulation was applied instead of a conventional Gaussian random field.
This three-dimensional numerical experiment allowed us to verify the effects of the heterogeneity of the propagation medium on wave propagation in liquid sodium.
Single-cell heterogeneity in ductal carcinoma in situ of breast.
Gerdes, Michael J; Gökmen-Polar, Yesim; Sui, Yunxia; Pang, Alberto Santamaria; LaPlante, Nicole; Harris, Adrian L; Tan, Puay-Hoon; Ginty, Fiona; Badve, Sunil S
2018-03-01
Heterogeneous patterns of mutations and RNA expression have been well documented in invasive cancers. However, technological challenges have limited the ability to study heterogeneity of protein expression. This is particularly true for pre-invasive lesions such as ductal carcinoma in situ of the breast. Cell-level heterogeneity in ductal carcinoma in situ was analyzed in a single 5 μm tissue section using a multiplexed immunofluorescence analysis of 11 disease-related markers (EGFR, HER2, HER4, S6, pmTOR, CD44v6, SLC7A5 and CD10, CD4, CD8 and CD20, plus pan-cytokeratin, pan-cadherin, DAPI, and Na+K+ATPase for cell segmentation). Expression was quantified at cell level using a single-cell segmentation algorithm. K-means clustering was used to determine co-expression patterns of epithelial cell markers and immune markers. We document for the first time the presence of epithelial cell heterogeneity within ducts, between ducts, and between patients with ductal carcinoma in situ. There was moderate heterogeneity in the distribution of eight clusters within each duct (average Shannon index 0.76; range 0-1.61). Furthermore, within each patient, the average Shannon index across all ducts ranged from 0.33 to 1.02 (s.d. 0.09-0.38). As the distribution of clusters within ducts was uneven, the analysis of eight ducts might be sufficient to represent all the clusters, i.e., within- and between-duct heterogeneity. The pattern of epithelial cell clustering was associated with the presence and type of immune infiltrates, indicating a complex interaction between the epithelial tumor and immune system for each patient. This analysis also provides the first evidence that simultaneous analysis of both the epithelial and immune/stromal components might be necessary to understand the complex milieu in ductal carcinoma in situ lesions.
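The Shannon index used above to quantify within-duct heterogeneity is the entropy of the cluster-label distribution; with eight clusters it ranges from 0 (a single cluster) to ln 8 ≈ 2.08, consistent with the reported 0-1.61 range. A small sketch on made-up label sets:

```python
import numpy as np

def shannon_index(labels):
    """Shannon entropy (natural log) of the cluster-label composition."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float((p * np.log(1.0 / p)).sum())

# Illustrative ducts, not data from the study.
duct_uniform = [0] * 40                                   # one epithelial cluster only
duct_mixed = [0] * 10 + [1] * 10 + [2] * 10 + [3] * 10    # four equal clusters
print(shannon_index(duct_uniform))                        # 0.0: no heterogeneity
print(shannon_index(duct_mixed))                          # ln(4) ~ 1.386
```

Averaging this index over all ducts of a patient gives the per-patient summaries quoted in the abstract (0.33 to 1.02).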
Purification and biochemical heterogeneity of the mammalian SWI-SNF complex.
Wang, W; Côté, J; Xue, Y; Zhou, S; Khavari, P A; Biggar, S R; Muchardt, C; Kalpana, G V; Goff, S P; Yaniv, M; Workman, J L; Crabtree, G R
1996-01-01
We have purified distinct complexes of nine to twelve proteins [referred to as BRG1-associated factors (BAFs)] from several mammalian cell lines using an antibody to the SWI2-SNF2 homolog BRG1. Microsequencing revealed that the 47 kDa BAF is identical to INI1. Previously, INI1 has been shown to interact with and activate human immunodeficiency virus integrase and to be homologous to the yeast SNF5 gene. A group of BAF47-associated proteins was affinity-purified with antibodies against INI1/BAF47 and found to be identical to those co-purified with BRG1, strongly indicating that this group of proteins associates tightly and is likely to be the mammalian equivalent of the yeast SWI-SNF complex. Complexes containing BRG1 can disrupt nucleosomes and facilitate the binding of GAL4-VP16 to a nucleosomal template, similar to the yeast SWI-SNF complex. Purification of the complex from several cell lines demonstrates that it is heterogeneous with respect to subunit composition. The two SWI2-SNF2 homologs, BRG1 and hbrm, were found in separate complexes. Certain cell lines completely lack BRG1 and hbrm, indicating that they are not essential for cell viability and that the mammalian SWI-SNF complex may be tailored to the needs of a differentiated cell type. PMID:8895581
The Student Course Experience among Online, Accelerated, and Traditional Courses
ERIC Educational Resources Information Center
Bielitz, Colleen L.
2016-01-01
The demand by the public for a wider variety of course formats has led to complexity in determining a course's optimal delivery format as many faculty members still believe that online and accelerated courses do not offer students an equivalent experience to traditional face to face instruction. The purpose of this quantitative, comparative study…
Beam property measurement of a 300-kV ion source test stand for a 1-MV electrostatic accelerator
NASA Astrophysics Data System (ADS)
Park, Sae-Hoon; Kim, Dae-Il; Kim, Yu-Seok
2016-09-01
The KOMAC (Korea Multi-purpose Accelerator Complex) has been developing a 300-kV ion source test stand for a 1-MV electrostatic accelerator for industrial purposes. An RF ion source was operated at 200 MHz with its matching circuit. The beam profile and emittance were measured behind an accelerating column to confirm the beam properties of the RF ion source. The beam profile was measured at the end of the accelerating tube and at the beam dump by using a beam profile monitor (BPM) and a wire scanner. An Allison-type emittance scanner was installed behind the BPM to measure the beam density in phase space. The measurement results for the beam profile and emittance are presented in this paper.
Classical-trajectory simulation of accelerating neutral atoms with polarized intense laser pulses
NASA Astrophysics Data System (ADS)
Xia, Q. Z.; Fu, L. B.; Liu, J.
2013-03-01
In the present paper, we perform classical-trajectory Monte Carlo simulations of the complex dynamics of accelerating neutral atoms with linearly or circularly polarized intense laser pulses. Our simulations involve the ion motion as well as the tunneling ionization and scattering dynamics of the valence electron in the combined Coulomb and electromagnetic fields, for both helium (He) and magnesium (Mg). We show that for He atoms, only linearly polarized lasers can effectively accelerate the atoms, while for Mg atoms, both linearly and circularly polarized lasers can successfully accelerate the atoms. The underlying mechanism is discussed and the subcycle dynamics of accelerating trajectories is investigated. We have compared our theoretical results with a recent experiment [Eichmann et al., Nature (London) 461, 1261 (2009)].
Explorative search of distributed bio-data to answer complex biomedical questions
2014-01-01
Background The huge amount of biomedical-molecular data increasingly produced is providing scientists with potentially valuable information. Yet, such data quantity makes it difficult to find and extract the data that are most reliable and most related to the biomedical questions to be answered, which are increasingly complex and often involve many different biomedical-molecular aspects. Such questions can be addressed only by comprehensively searching and exploring different types of data, which frequently are ordered and provided by different data sources. Search Computing has been proposed for the management and integration of ranked results from heterogeneous search services. Here, we present its novel application to the explorative search of distributed biomedical-molecular data and the integration of the search results to answer complex biomedical questions. Results A set of available bioinformatics search services has been modelled and registered in the Search Computing framework, and a Bioinformatics Search Computing application (Bio-SeCo) using such services has been created and made publicly available at http://www.bioinformatics.deib.polimi.it/bio-seco/seco/. It offers an integrated environment which eases search, exploration and ranking-aware combination of heterogeneous data provided by the available registered services, and supplies global results that can support answering complex multi-topic biomedical questions. Conclusions By using Bio-SeCo, scientists can explore the very large and very heterogeneous biomedical-molecular data available. They can easily make different explorative search attempts, inspect the obtained results, select the most appropriate, expand or refine them, and move forward and backward in the construction of a global complex biomedical query on multiple distributed sources that can eventually find the most relevant results. Thus, it provides extremely useful automated support for exploratory integrated bio-search, which is fundamental for Life Science data-driven knowledge discovery. PMID:24564278
Zhang, Weizhe; Bai, Enci; He, Hui; Cheng, Albert M.K.
2015-01-01
Reducing energy consumption is becoming very important in order to preserve battery life and lower overall operational costs for heterogeneous real-time multiprocessor systems. In this paper, we first formulate this as a combinatorial optimization problem. Then, a successful meta-heuristic, called the Shuffled Frog Leaping Algorithm (SFLA), is proposed to reduce the energy consumption. Precocity remission and local-optimum avoidance techniques are proposed to avoid precocity and improve the solution quality. Convergence acceleration significantly reduces the search time. Experimental results show that the SFLA-based energy-aware meta-heuristic uses 30% less energy than the Ant Colony Optimization (ACO) algorithm, and 60% less energy than the Genetic Algorithm (GA). Remarkably, the running time of the SFLA-based meta-heuristic is 20 and 200 times shorter than that of ACO and GA, respectively, for finding the optimal solution. PMID:26110406
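The shuffled-frog mechanics behind SFLA (sort the population, partition it into memeplexes, leap each memeplex's worst frog toward its local best and then the global best, shuffle, repeat) can be sketched as follows. The continuous sphere objective below is a stand-in for the paper's energy-aware scheduling objective, and all parameter values are illustrative:

```python
import random

def sfla(objective, dim, n_frogs=30, n_memeplexes=5, n_iters=50, bounds=(-5.0, 5.0)):
    """Minimal Shuffled Frog Leaping Algorithm for continuous minimization."""
    lo, hi = bounds
    frogs = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_frogs)]
    for _ in range(n_iters):
        frogs.sort(key=objective)                       # shuffle step: best first
        best_global = frogs[0]
        for m in range(n_memeplexes):
            # Interleaved partition: frog i joins memeplex i % n_memeplexes.
            plex = [i for i in range(n_frogs) if i % n_memeplexes == m]
            w = max(plex, key=lambda i: objective(frogs[i]))   # worst in memeplex
            b = min(plex, key=lambda i: objective(frogs[i]))   # best in memeplex
            for guide in (frogs[b], best_global):
                r = random.random()
                cand = [x + r * (g - x) for x, g in zip(frogs[w], guide)]
                if objective(cand) < objective(frogs[w]):
                    frogs[w] = cand
                    break
            else:  # no improvement toward either guide: random replacement
                frogs[w] = [random.uniform(lo, hi) for _ in range(dim)]
    return min(frogs, key=objective)

random.seed(0)
sphere = lambda x: sum(v * v for v in x)
best = sfla(sphere, dim=3)
```

Because only each memeplex's worst frog moves per pass, the global best is never lost, and the shuffle step mixes information between memeplexes.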
Bilateral multifocal Warthin tumours.
Deveer, Mehmet; Sahan, Murat; Sivrioglu, Ali Kemal; Celik, Ozgür Ilhan
2013-05-22
Warthin tumour, also known as papillary cystadenoma lymphomatosum, is the second most frequent benign tumour of the parotid gland after pleomorphic adenoma. A 57-year-old man was referred to our hospital with bilateral buccal masses without pain. He presented with a 1-year history of the condition and stated that growth of the mass has accelerated during the last 6 months. Ultrasonography examination showed two heterogeneous solid masses. Axial contrast-enhanced CT image revealed bilateral heterogeneous solid masses. The masses showed enhancement after contrast administration (95 HU). Fine needle aspiration cytology was recommended for further analysis and typical benign features of Warthin tumour was obtained. Right parotid gland including the masses was resected completely. 5 weeks later superficial parotidectomy was performed to the left parotid gland. Histological examination revealed cystic tumour in the parenchyma of parotid gland, composed of prominent lymphoid stroma and large epithelial cells with oncocytic features covering it consistent with Warthin tumour.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Müller, Florian, E-mail: florian.mueller@sam.math.ethz.ch; Jenny, Patrick, E-mail: jenny@ifd.mavt.ethz.ch; Meyer, Daniel W., E-mail: meyerda@ethz.ch
2013-10-01
Monte Carlo (MC) is a well-known method for quantifying uncertainty arising, for example, in subsurface flow problems. Although robust and easy to implement, MC suffers from slow convergence. Extending MC by means of multigrid techniques yields the multilevel Monte Carlo (MLMC) method. MLMC has proven to greatly accelerate MC for several applications, including stochastic ordinary differential equations in finance, elliptic stochastic partial differential equations, and also hyperbolic problems. In this study, MLMC is combined with a streamline-based solver to assess uncertain two-phase flow and Buckley–Leverett transport in random heterogeneous porous media. The performance of MLMC is compared to MC for a two-dimensional reservoir with a multi-point Gaussian logarithmic permeability field. The influence of the variance and the correlation length of the logarithmic permeability on the MLMC performance is studied.
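The MLMC idea is the telescoping sum E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}]: each correction term is estimated from coupled coarse/fine solves sharing the same random input, so its variance is small and few samples are needed at the expensive fine levels. A minimal sketch, with a hypothetical scalar "solver" standing in for the streamline-based flow solver:

```python
import random

def model(x, level):
    """Toy 'solver' whose discretization bias shrinks like 2**-level.

    Hypothetical stand-in for a coarse/fine two-phase flow solve on the
    same random permeability realization x.
    """
    return x * x + 2.0 ** -level * (1.0 + x)

def mlmc_estimate(n_levels, samples_per_level, seed=42):
    """Multilevel Monte Carlo estimator via the telescoping sum of corrections."""
    rng = random.Random(seed)
    estimate = 0.0
    for level in range(n_levels):
        n = samples_per_level[level]
        acc = 0.0
        for _ in range(n):
            x = rng.gauss(0.0, 1.0)   # shared random input couples the level pair
            fine = model(x, level)
            coarse = model(x, level - 1) if level > 0 else 0.0
            acc += fine - coarse      # low-variance correction term
        estimate += acc / n
    return estimate

# Many cheap coarse samples, few expensive fine ones.
est = mlmc_estimate(3, [4000, 1000, 250])
```

For this toy model the exact finest-level expectation is E[x^2] + 2^-2 = 1.25, and the estimator lands close to it despite only 250 fine-level samples.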
Davis, Kathryn M; Badu-Tawiah, Abraham K
2017-04-01
The exposure of an aqueous-based liquid drop containing amines and graphite particles to plasma generated by a corona discharge results in heterogeneous aerobic dehydrogenation reactions. This green oxidation reaction occurring in ambient air afforded the corresponding quinolines and nitriles from tetrahydroquinolines and primary amines, respectively, at >96% yields in less than 2 min of reaction time. The accelerated dehydrogenation reactions occurred on the surface of a low-energy hydrophobic paper, which served both as a container for holding the reacting liquid drop and as a medium for achieving paper spray ionization of reaction products for subsequent characterization by ambient mass spectrometry. Control experiments indicate superoxide anions (O2•-) are the main reactive species; the presence of graphite particles introduced heterogeneous surface effects and enabled the efficient sampling of the plasma into the grounded analyte droplet solution.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luszczek, Piotr R; Tomov, Stanimire Z; Dongarra, Jack J
We present an efficient and scalable programming model for the development of linear algebra in heterogeneous multi-coprocessor environments. The model incorporates some of the current best design and implementation practices for the heterogeneous acceleration of dense linear algebra (DLA). Examples are given for algorithms that form the basis of linear-system solvers: the LU, QR, and Cholesky factorizations. To generate the extreme level of parallelism needed for the efficient use of coprocessors, the algorithms of interest are redesigned and then split into well-chosen computational tasks. Task execution is scheduled over the computational components of a hybrid system of multi-core CPUs and coprocessors using a light-weight runtime system. The use of lightweight runtime systems keeps scheduling overhead low while enabling the expression of parallelism through otherwise sequential code. This simplifies the development effort and allows the exploration of the unique strengths of the various hardware components.
The effect of heterogeneous defectors on the evolution of public cooperation
NASA Astrophysics Data System (ADS)
Chen, Tong; Hu, Xuezhi; Wang, Yongjie; Wang, Le
2018-06-01
In recent years, more and more private capital has joined the construction of cultural facilities and the organization of cultural activities in China. The organization of cultural activities through a crowd-funding mechanism is a kind of multi-player game. Not all players who donate different amounts of money are real cooperators; some cunning defectors may donate a little money to avoid gossip and punishment. These tricky players can be seen as heterogeneous defectors. We investigate the role of heterogeneous defectors in the cooperative behavior of complex social networks. Numerical results show that heterogeneous defectors can act as a buffer that maintains the public pool when the synergy factor is low in the public goods game (PGG), and that heterogeneous defectors become cooperators relatively easily when the synergy factor is high. To further improve cooperation, punishment of heterogeneous defectors and complete defectors is introduced. We find that when the defectors' loss is equal to or larger than the altruistic cooperators' punishment cost, the mechanism is highly effective. In addition, the role of heterogeneous defectors depends on the relationship between the punishment cost and the defectors' loss.
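The public goods game underlying this model is simple to state: contributions are pooled, multiplied by the synergy factor r, and redistributed equally. A minimal payoff sketch (the donation levels are illustrative, not the paper's parameters):

```python
def pgg_payoffs(contributions, r):
    """Public goods game: each payoff is an equal share of the synergy-multiplied
    pool minus the player's own contribution."""
    n = len(contributions)
    share = r * sum(contributions) / n
    return [share - c for c in contributions]

# 2 cooperators (donate 1.0), 2 heterogeneous defectors (token donation 0.2),
# 1 complete defector (donates nothing), synergy factor r = 3.
payoffs = pgg_payoffs([1.0, 1.0, 0.2, 0.2, 0.0], r=3.0)
# The complete defector earns the most, but the heterogeneous defectors' token
# donations keep the public pool from collapsing when cooperators are scarce.
```
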
One microenvironment does not fit all: heterogeneity beyond cancer cells.
Kim, Ik Sun; Zhang, Xiang H-F
2016-12-01
Human cancers exhibit formidable molecular heterogeneity, to a large extent accounting for the incomplete and transitory efficacy of current anti-cancer therapies. However, neoplastic cells alone do not manifest the disease; they conscript a battery of non-tumor cells to enable and sustain hallmark capabilities of cancer. Escaping immunosurveillance is one such capability. Tumors evolve an immunosuppressive microenvironment to subvert anti-tumor immunity. In this review, we focus on tumor-associated myeloid cells, which constitute an essential part of the immune microenvironment and reciprocally interact with cancer cells to establish malignancy toward metastasis. The diversity and plasticity of these cells constitute another layer of heterogeneity, beyond the heterogeneity of the cancer cells themselves. We envision that the immune microenvironment co-evolves with the genetic heterogeneity of the tumor. Addressing how genetically distinct tumors shape, and are shaped by, unique immune microenvironments will provide an attractive rationale for developing novel immunotherapeutic modalities. Here, we discuss the complex nature of the tumor microenvironment, with an emphasis on the cellular and functional heterogeneity among tumor-associated myeloid cells as well as immune-environment heterogeneity in the context of the full spectrum of human breast cancers.
NASA Astrophysics Data System (ADS)
Ronayne, Michael J.; Gorelick, Steven M.; Zheng, Chunmiao
2010-10-01
We developed a new model of aquifer heterogeneity to analyze data from a single-well injection-withdrawal tracer test conducted at the Macrodispersion Experiment (MADE) site on the Columbus Air Force Base in Mississippi (USA). The physical heterogeneity model is a hybrid that combines 3-D lithofacies to represent submeter scale, highly connected channels within a background matrix based on a correlated multivariate Gaussian hydraulic conductivity field. The modeled aquifer architecture is informed by a variety of field data, including geologic core sampling. Geostatistical properties of this hybrid heterogeneity model are consistent with the statistics of the hydraulic conductivity data set based on extensive borehole flowmeter testing at the MADE site. The representation of detailed, small-scale geologic heterogeneity allows for explicit simulation of local preferential flow and slow advection, processes that explain the complex tracer response from the injection-withdrawal test. Based on the new heterogeneity model, advective-dispersive transport reproduces key characteristics of the observed tracer recovery curve, including a delayed concentration peak and a low-concentration tail. Importantly, our results suggest that intrafacies heterogeneity is responsible for local-scale mass transfer.
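The background-matrix component of this hybrid model, a correlated multivariate Gaussian log-conductivity field, can be sketched with a standard spectral (FFT) method. The grid size, correlation length, and variance below are illustrative, not the MADE-site values:

```python
import numpy as np

def gaussian_logK_field(n, corr_len, sigma=1.0, seed=0):
    """Correlated Gaussian log-conductivity field on an n x n grid via spectral filtering:
    white noise is damped at high spatial frequencies, then normalized to N(0, sigma^2)."""
    rng = np.random.default_rng(seed)
    k = np.fft.fftfreq(n)                          # spatial frequencies (cycles/cell)
    kx, ky = np.meshgrid(k, k, indexing="ij")
    # Gaussian spectral filter: a longer correlation length damps high frequencies harder.
    amp = np.exp(-(corr_len ** 2) * (kx ** 2 + ky ** 2) * (2.0 * np.pi ** 2))
    noise = np.fft.fft2(rng.standard_normal((n, n)))
    field = np.real(np.fft.ifft2(noise * np.sqrt(amp)))
    field = (field - field.mean()) / field.std()   # normalize to zero mean, unit variance
    return sigma * field

logK = gaussian_logK_field(64, corr_len=8.0)
K = np.exp(logK)   # lognormal hydraulic conductivity
```

In the hybrid model described above, connected lithofacies channels would then be superimposed on such a field; this sketch shows only the Gaussian background component.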
How to Recharge a Confined Alluvial Aquifer System
NASA Astrophysics Data System (ADS)
Maples, S.; Fogg, G. E.; Liu, Y.
2016-12-01
Greater water storage capacity is needed to offset future decreases in snowpack water storage in California. Managed aquifer recharge (MAR) in California's Central Valley aquifer system is a promising alternative to new surface reservoir storage because it has the potential both to reduce the overdraft conditions observed in many Central Valley groundwater basins and to offset continued decreases in snowpack storage. MAR to the Central Valley's productive confined-aquifer system remains a challenge because, like most alluvial aquifer systems, it is composed mostly of silt and clay sediments that form nearly ubiquitous, multiple confining layers that inhibit direct recharge of the interconnected sand- and gravel-body networks. Several studies have mapped surficial soil types in the Central Valley that are conducive to MAR, but few have evaluated how subsurface geologic heterogeneity controls recharge to the confined aquifer system. Here, we use a transition-probability Markov-chain geostatistical model conditioned with 1200 well logs to create a physically realistic representation of the subsurface geologic heterogeneity in the American and Cosumnes River watersheds on the east side of the Sacramento Valley, CA, where studies have shown the presence of massive, interconnected, highly permeable gravel deposits that are potentially conducive to considerably higher rates of regional recharge than would be possible over the rest of the landscape. Such localized stratigraphic features capable of supporting accelerated recharge occur throughout the Central Valley but are mostly still undiscovered. A variably saturated, fully integrated groundwater/surface-water code, ParFlow, was used to simulate MAR dynamics in this system. Results show the potential for (1) accelerated, high-volume recharge through interconnected gravels where they outcrop at the land surface, and (2) regional repressurization of the deeper confined aquifer system.
These findings provide insight into the critical role of subsurface heterogeneity on MAR dynamics in alluvial aquifer systems and highlight the potential for MAR in California and elsewhere.
Complex vestibular macular anatomical relationships need a synthetic approach
NASA Technical Reports Server (NTRS)
Ross, M. D.
2001-01-01
Mammalian vestibular maculae are anatomically organized for complex parallel processing of linear acceleration information. Anatomical findings in rat maculae are provided in order to underscore this complexity, which is little understood functionally. This report emphasizes that a synthetic approach is critical to understanding how maculae function and the kind of information they conduct to the brain.
Masunaga, Shin-Ichiro; Ando, Koichi; Uzawa, Akiko; Hirayama, Ryoichi; Furusawa, Yoshiya; Koike, Sachiko; Sakurai, Yoshinori; Nagata, Kenji; Suzuki, Minoru; Kashino, Genro; Kinashi, Yuko; Tanaka, Hiroki; Maruhashi, Akira; Ono, Koji
2008-01-01
To clarify the in vivo radiosensitivity of intratumor quiescent cells to accelerated carbon ion beams and reactor neutron beams. Squamous cell carcinoma VII tumor-bearing mice were continuously given 5-bromo-2'-deoxyuridine to label all intratumor proliferating cells. Next, they received accelerated carbon ion or gamma-ray high-dose-rate (HDR) or reduced-dose-rate (RDR) irradiation. Other tumor-bearing mice received reactor thermal or epithermal neutrons with RDR irradiation. Immediately after HDR and RDR irradiation, or 12 h after HDR irradiation, the response of quiescent cells was assessed in terms of micronucleus frequency using immunofluorescence staining for 5-bromo-2'-deoxyuridine. The response of the total (proliferating plus quiescent) tumor cell population was determined from tumors not treated with 5-bromo-2'-deoxyuridine. The difference in radiosensitivity between the total and quiescent cell populations after gamma-ray irradiation was markedly reduced with reactor neutron beams or accelerated carbon ion beams, especially at greater linear energy transfer (LET) values. The greater repair observed in quiescent cells than in total cells with a delayed assay or a reduced dose rate under gamma-ray irradiation was efficiently inhibited with carbon ion beams, especially at greater LET. With RDR irradiation, the radiosensitivity to accelerated carbon ion beams with a greater LET was almost the same as that to reactor thermal and epithermal neutron beams. In terms of the tumor cell-killing effect as a whole, including quiescent cells, accelerated carbon ion beams, especially those with greater LET values, are very useful for suppressing the dependency on heterogeneity within solid tumors, as well as for depositing the radiation dose precisely.
Criticality and Induction Time of Hot Spots in Detonating Heterogeneous Explosives
NASA Astrophysics Data System (ADS)
Hill, Larry
2017-06-01
Detonation reaction in physically heterogeneous explosives is, to an extent that depends on multiple material attributes, likewise heterogeneous. Like all heterogeneous reactions, heterogeneous detonation reaction begins at nucleation sites, which in this case comprise localized regions of higher-than-average temperature, so-called hot spots. Burning grows at, and then spreads from, these nucleation sites via reactive-thermal (R-T) waves to consume the interstitial material. Not all hot spots are consequential, but only those that are (1) supercritical and (2) sufficiently so as to form R-T waves before being consumed by those already emanating from neighboring sites. I explore aspects of these two effects by deriving simple formulae for hot-spot criticality and the induction time of supercritical hot spots. These results serve to illustrate the non-intuitive, yet mathematically simplifying, effects of the extreme dependence of reaction rate upon temperature. They can play a role in the development of better reactive burn models, for which we seek to homogenize the essentials of heterogeneous detonation reaction without introducing spurious complexity. Work supported by the US Dept. of Energy.
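The "extreme dependence of reaction rate upon temperature" invoked above is conventionally captured by Arrhenius kinetics. As a point of reference (a classical Frank-Kamenetskii-type adiabatic thermal-explosion estimate, not necessarily the formula derived in this work), the induction time of a hot spot at initial temperature T_0 scales as

```latex
t_{\mathrm{ind}} \approx \frac{c_v R T_0^{2}}{Q A E_a}\,\exp\!\left(\frac{E_a}{R T_0}\right)
```

where E_a is the activation energy, A the Arrhenius prefactor, Q the heat of reaction, and c_v the specific heat. The exponential factor means a modestly hotter spot ignites dramatically sooner, which is why only the hottest, supercritical sites can seed R-T waves before being overrun by their neighbors.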
Effective electromagnetic properties of microheterogeneous materials with surface phenomena
NASA Astrophysics Data System (ADS)
Levin, Valery; Markov, Mikhail; Mousatov, Aleksandr; Kazatchenko, Elena; Pervago, Evgeny
2017-10-01
In this paper, we present an approach to calculate the complex dielectric permittivity of a micro-heterogeneous medium composed of non-conductive solid inclusions embedded in a conductive liquid continuous host. To take into account the surface effects, we approximate each inclusion by a layered ellipsoid consisting of a dielectric core and an infinitesimally thin outer shell corresponding to an electrical double layer (EDL). To predict the effective complex dielectric permittivity of materials with a high concentration of inclusions, we have modified the Effective Field Method (EFM) for layered ellipsoidal particles with complex electrical properties. We present the results of complex permittivity calculations for composites with randomly and parallel-oriented ellipsoidal inclusions. To analyze the influence of surface polarization, we have performed modeling over a wide frequency range for different existing physico-chemical models of the EDL. The results show that the tensor of effective complex permittivity of a micro-heterogeneous medium with surface effects has complicated dependences on the component electrical properties, the spatial material texture, and the inclusion shape (ellipsoid aspect ratio) and size. The dispersion of the dielectric permittivity corresponds to the frequency dependence for an individual inclusion of a given size and does not depend on the inclusion concentration.
[Human tolerance to Coriolis acceleration during exertion of different muscle groups].
Aĭzikov, G S; Emel'ianov, M D; Ovechkin, V G
1975-01-01
The effect of an arbitrary loading of different muscle groups (shoulder, back, legs) and motor acts on the tolerance to Coriolis accelerations was investigated in 140 experiments in which 40 test subjects participated. The accelerations were cumulated and simulated by the Bryanov scheme. Muscle tension was accompanied by a less expressed vestibulo-vegetative reaction and shortening of the recovery period after the development of motion sickness symptoms. The greatest changes were observed during the performance of complex motor acts and tension of shoulder muscles. Possible mechanisms of these effects are discussed.
Vibrations At Surfaces During Heterogeneous Catalytic Reactions
NASA Astrophysics Data System (ADS)
Aragno, A.; Basini, Luca; Marchionna, M.; Raffaelli, A.
1989-12-01
FTIR spectroscopies can be used over a wide range of temperature and pressure conditions to investigate the chemistry and physics of heterogeneous catalytic reactions. In this paper we briefly discuss the spectroscopic results obtained during the study of two different reactions: the skeletal isomerization of 1-butene to obtain 2-methylpropene, and the surface aggregation and fragmentation of rhodium carbonyl complexes during thermal treatments in N2, H2, CO, and CH4 atmospheres. In the first case, high-temperature proton transfer reactions are proposed to be responsible for the skeletal isomerization reaction. In the second case, our experiments have shown a partial reversibility of the nucleation processes at the surfaces and revealed a low-temperature reactivity of methane on rhodium carbonyl surface complexes.
Accelerator mass spectrometer with ion selection in high-voltage terminal
NASA Astrophysics Data System (ADS)
Rastigeev, S. A.; Goncharov, A. D.; Klyuev, V. F.; Konstantinov, E. S.; Kutnyakova, L. A.; Parkhomchuk, V. V.; Petrozhitskii, A. V.; Frolov, A. R.
2016-12-01
The folded electrostatic tandem accelerator with ion selection in a high-voltage terminal is the basis of accelerator mass spectrometry (AMS) at the BINP. Additional features of the BINP AMS are a stripper target based on magnesium vapors, which avoids vacuum deterioration, and a time-of-flight telescope with thin films for reliable ion identification. The accelerator complex demonstrates reliable operation at 1 MV with a 50 Hz counting rate of 14C3+ radiocarbon for modern samples (14C/12C ~ 1.2 × 10^-12). The current state of the AMS is described, and the experimental results of radiocarbon concentration measurements in test samples are presented.
Accelerator science in medical physics.
Peach, K; Wilson, P; Jones, B
2011-12-01
The use of cyclotrons and synchrotrons to accelerate charged particles in hospital settings for the purpose of cancer therapy is increasing. Consequently, there is a growing demand from medical physicists, radiographers, physicians and oncologists for articles that explain the basic physical concepts of these technologies. There are unique advantages and disadvantages to all methods of acceleration. Several promising alternative methods of accelerating particles also have to be considered since they will become increasingly available with time; however, there are still many technical problems with these that require solving. This article serves as an introduction to this complex area of physics, and will be of benefit to those engaged in cancer therapy, or who intend to acquire such technologies in the future.
Generalized radially self-accelerating helicon beams.
Vetter, Christian; Eichelkraut, Toni; Ornigotti, Marco; Szameit, Alexander
2014-10-31
We report, in theory and experiment, on a new class of optical beams that are radially self-accelerating and nondiffracting. These beams continuously evolve on spiraling trajectories while maintaining their amplitude and phase distribution in their rotating rest frame. We provide detailed insight into the theoretical origin and characteristics of radial self-acceleration and prove our findings experimentally. As radially self-accelerating beams are nonparaxial and a solution of the full scalar Helmholtz equation, they can be implemented in many linear wave systems beyond optics, from acoustic and elastic waves to surface waves in fluids and soft matter. Our work generalizes the study of classical helicon beams to a complete set of solutions for rotating complex fields.
Hayat, T.; Hussain, Zakir; Alsaedi, A.; Farooq, M.
2016-01-01
This article examines the effects of homogeneous-heterogeneous reactions and Newtonian heating in magnetohydrodynamic (MHD) flow of Powell-Eyring fluid by a stretching cylinder. The nonlinear partial differential equations of momentum, energy and concentration are reduced to nonlinear ordinary differential equations. Convergent solutions of the momentum, energy and reaction equations are developed using the homotopy analysis method (HAM). This method is very efficient for developing series solutions of highly nonlinear differential equations, and it does not depend on any small or large parameter, unlike other methods, i.e., the perturbation method, the δ-perturbation expansion method, etc. More accurate results are obtained as the order of approximation increases. Effects of different parameters on the velocity, temperature and concentration distributions are sketched and discussed. A comparison of the present study with previously published work is also made in the limiting sense. Numerical values of the skin friction coefficient and Nusselt number are also computed and analyzed. It is noticed that the flow accelerates for large values of the Powell-Eyring fluid parameter. Further, the temperature profile decreases and the concentration profile increases as the Powell-Eyring fluid parameter is enhanced. The concentration distribution is a decreasing function of the homogeneous reaction parameter, while the opposite influence of the heterogeneous reaction parameter appears. PMID:27280883
Harnessing the Power of Light to See and Treat Breast Cancer
2015-12-01
complexity: the raw data, the most basic form of data, represents the raw numeric readout obtained from the acquisition hardware. The raw data...has the added advantage of full-field illumination and non-descanned detection, thus lowering the complexity compared to confocal scanning systems...complexity of images that have varying levels of contrast and non-uniform background heterogeneity. In 2004 Matas described a technique for detecting
Fitzgibbon, Jessica; Beck, Martina; Zhou, Ji; Faulkner, Christine; Robatzek, Silke; Oparka, Karl
2013-01-01
Plasmodesmata (PD) form tubular connections that function as intercellular communication channels. They are essential for transporting nutrients and for coordinating development. During cytokinesis, simple PDs are inserted into the developing cell plate, while during wall extension, more complex (branched) forms of PD are laid down. We show that complex PDs are derived from existing simple PDs in a pattern that is accelerated when leaves undergo the sink–source transition. Complex PDs are inserted initially at the three-way junctions between epidermal cells but develop most rapidly in the anisocytic complexes around stomata. For a quantitative analysis of complex PD formation, we established a high-throughput imaging platform and constructed PDQUANT, a custom algorithm that detected cell boundaries and PD numbers in different wall faces. For anticlinal walls, the number of complex PDs increased with increasing cell size, while for periclinal walls, the number of PDs decreased. Complex PD insertion was accelerated by up to threefold in response to salicylic acid treatment and challenges with mannitol. In a single 30-min run, we could derive data for up to 11,000 PDs from 3,000 epidermal cells. This facile approach opens the door to a large-scale analysis of the endogenous and exogenous factors that influence PD formation. PMID:23371949
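The counting step in such a high-throughput pipeline can be sketched as a connected-component count over a thresholded fluorescence image, where each component is one PD punctum. The sketch below is a hypothetical stand-in, not the actual PDQUANT algorithm, and operates on an already-binarized image:

```python
from collections import deque

def count_spots(grid):
    """Count 4-connected components of 1s in a binary 2D grid.

    A toy analogue of counting fluorescent PD punctae in a
    thresholded image region (e.g. one wall face).
    """
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] and not seen[r][c]:
                count += 1                      # new spot found
                queue = deque([(r, c)])         # flood-fill its pixels
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return count

# Three separate punctae in a small synthetic image
image = [
    [1, 1, 0, 0, 1],
    [0, 0, 0, 0, 1],
    [0, 1, 0, 0, 0],
]
```

A production pipeline would precede this with segmentation of cell boundaries and per-wall-face masking; only the final counting idea is illustrated here.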
Effects of Glycine, Water, Ammonia, and Ammonium Bicarbonate on the Oligomerization of Methionine
NASA Astrophysics Data System (ADS)
Huang, Rui; Furukawa, Yoshihiro; Otake, Tsubasa; Kakegawa, Takeshi
2017-06-01
The abiotic oligomerization of amino acids may have created primordial, protein-like biological catalysts on the early Earth. Previous studies have proposed and evaluated the potential of diagenesis for amino acid oligomerization, simulating the formation of peptides that include glycine, alanine, and valine, separately. However, whether such conditions can promote the formation of peptides composed of multiple amino acids remains unclear. Furthermore, the chemistry of pore water in sediments should affect the oligomerization and degradation of amino acids and oligomers, but these effects have not been studied extensively. In this study, we investigated the effects of water, ammonia, ammonium bicarbonate, pH, and glycine on the oligomerization and degradation of methionine under high-pressure (150 MPa) and high-temperature (175 °C) conditions for 96 h. Methionine is more difficult to oligomerize than glycine, and incubation of dry methionine powder formed the methionine dimer. Methionine oligomers as long as trimers, as well as methionylglycine and glycylmethionine, were formed under every condition with these additional compounds. Among the compounds tested, the oligomerization reaction rate was accelerated by the presence of water and by an increase in pH. Ammonia also increased the oligomerization rate but consumed methionine through side reactions, resulting in the rapid degradation of methionine and its peptides. Similarly, glycine accelerated both the oligomerization and the degradation of methionine, producing water, ammonia, and bicarbonate through its own decomposition. With glycine, the heterodimers methionylglycine and glycylmethionine were formed in greater amounts than with any of the other additional compounds, although smaller amounts of these heterodimers also formed under the other conditions.
These results suggest that accelerated reaction rates induced by water and co-existing reactive compounds promote the oligomerization of less reactive amino acids during diagenesis and enhance the formation of peptides composed of multiple amino acids.
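The competition between oligomerization and degradation described above can be illustrated with a toy pseudo-first-order kinetic model. The rate constants below are hypothetical, chosen only to show the qualitative trend (faster oligomerization, e.g. from added water or higher pH, raises peptide yield), and are not fitted to the experimental data:

```python
def dimer_yield(k_olig, k_deg, m0=1.0, dt=0.01, t_end=50.0):
    """Toy kinetics: monomer M is consumed by two competing channels,
    oligomerization (rate k_olig) and degradation (rate k_deg).
    D accumulates the monomer routed into the oligomer channel.
    Integrated with a simple forward-Euler scheme."""
    m, d, t = m0, 0.0, 0.0
    while t < t_end:
        d += k_olig * m * dt            # monomer converted to oligomer
        m -= (k_olig + k_deg) * m * dt  # total monomer consumption
        t += dt
    return d

# Raising the oligomerization rate (hypothetically mimicking added
# water or increased pH) increases the final oligomer yield.
slow = dimer_yield(k_olig=0.05, k_deg=0.10)
fast = dimer_yield(k_olig=0.20, k_deg=0.10)
```

In this simple branching model the asymptotic yield is k_olig / (k_olig + k_deg), which captures the trade-off noted in the abstract: accelerants that also speed degradation (such as ammonia) may not improve the net peptide yield.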